On August 22, 2022, the Federal Register published an Advance Notice of Proposed Rulemaking (ANPR)
by the US Federal Trade Commission (FTC or Commission) to address
what the FTC characterized in its press release as “harmful commercial
surveillance and lax data security.” The FTC’s new
Democratic majority, in a 3-2 party-line vote, agreed to issue the
ANPR, which also covers automated decision-making. The broad scope
of the ANPR—and the Commission’s announcement that
“it is exploring rules to crack down on” harmful
practices—signal that major regulatory changes affecting the
digital economy may be on the horizon. (We provided a brief summary
of the ANPR in our August 15 blog post.)
For months, FTC Chair Lina Khan has indicated that developing
regulations on data security, privacy and automated decision-making
would be a top priority for the Commission. Chair Khan and her fellow
Democratic commissioners, Rebecca Kelly Slaughter and Alvaro Bedoya,
believe the FTC’s current approach of regulation through case-by-case
enforcement does not sufficiently protect consumers. As Chair Khan has
emphasized, adopting regulations would enable the Commission to
fine first-time violators, which it generally cannot do under the
FTC Act;1 and the very existence of rules would
set a baseline for business behavior to prevent injuries that can
be difficult to remediate through case-by-case enforcement.
To help inform its planned rulemaking, the FTC will hold a virtual public forum on September 8, 2022. In
addition, the Commission is soliciting written comments on a wide
range of topics to help guide its formulation of rules. These
topics include companies’ practices in collecting, using and
retaining consumer data and when those practices might involve
sharing, selling or monetizing consumer data unfairly or
deceptively. According to the FTC, even if it ultimately declines
to pursue rulemaking, the “comments will help to sharpen the
Commission’s enforcement work and may inform reform by Congress
or other policymakers.”
Comments must be received on or before October 21,
2022.
Overview and Scope of the ANPR
The ANPR’s definitions of “commercial
surveillance” and “consumers” indicate that the FTC
may be contemplating a sweeping scope for new regulations.
“Commercial surveillance” is broadly defined as “the
collection, aggregation, analysis, retention, transfer, or
monetization of consumer data and the direct derivatives of that
information.” This definition appears to cover virtually any
type of processing of consumer data.
The ANPR’s definition of “consumers” is also
expansive: it includes businesses and workers, “not just
individuals who buy or exchange data for retail goods and
services.” This definition of “consumer” is vastly
broader than the definitions used in existing privacy laws and
almost certainly would increase compliance burdens on
companies.
The breadth of the anticipated rules is further indicated by the
questions on which the FTC seeks comment—95 in
total—many of which are open-ended and have various subparts.
Below we highlight some of those questions to provide insight into
the issues the FTC plans to address.
Data Security
The FTC has an extensive history of disciplining companies for
allegedly lax security under the “unfairness” prong of
the Commission’s statutory authority. In the ANPR, the FTC
seeks comments on whether to codify data security requirements, and
on how granular those requirements should be. Among other things,
the FTC contemplates requiring companies to certify that their
“data practices meet clear security standards.” The
Commission also asks whether it should set these standards or rely
on third parties, potentially including trade associations as well
as standards bodies such as the National Institute of Standards and
Technology or the International Organization for Standardization, to do so.
It also asks whether it should incorporate the data security
requirements imposed under laws such as the Children’s Online Privacy Protection Act,
the EU/UK General
Data Protection Regulation (GDPR), the Gramm-Leach-Bliley Act
(GLBA) Safeguards Rule, or other federal or
state laws as a template for its new
rules.2
The FTC also asks whether it should codify the prohibition on
deceptive claims regarding consumer data security to allow civil
penalties for first-time violations.
Limitations on Allowable Collection, Use and Retention of
Consumer Data
The FTC appears to be considering requirements for data
minimization, purpose limitation, necessity, and proportionality
for the collection, use and retention of consumer data. These
requirements would limit the quantity and types of data that
entities could collect and process based on the intended purpose of
the collection and use. With respect to data minimization and
purpose limitation, the FTC seeks input on the possible effects of
these requirements on algorithmic decision-making and processes, and
it poses questions about how to balance the potential harms and
benefits of such requirements. Similar obligations are familiar to businesses
that have to comply with the GDPR, and those companies should be
able to draw from their experiences to inform their comments on
these questions.
Targeted Advertising
The FTC asks multiple questions about targeted advertising.
Mindful that regulating these practices could “burden
companies, stifle innovation or competition, or chill the
distribution of lawful content,” the Commission seeks input on
the benefits and costs of possible regulations generally and in
specific sectors such as finance, healthcare, search, and social
media. In particular, the FTC asks about the use of alternative
advertising strategies in the event it limits first- or third-party
targeting. Entities involved in targeted advertising should
therefore prepare to justify their practices with evidence
demonstrating the potential harms restrictions would impose on
consumers and companies.
Consent
The FTC requests views on whether it should prohibit certain
commercial surveillance practices “irrespective of whether
consumers consent to them.” In addition, the FTC is interested
in the form, scope and meaningfulness of consumer consent; the
process for giving and withdrawing consent; and how consent could
figure in the determination of whether a commercial surveillance
practice is unfair or deceptive.
Transparency
Presumably to facilitate consumer consent or at least choice
over products and services, the Commission asks whether it should
require companies to describe the types of data they use, how they
collect and process those data, if and how they use automated
decision-making to analyze data, how they use the data to arrive at
a decision, the impacts of their practices, and any risk-mitigation
measures they employ. Relatedly, the FTC seeks input on mandating
regular self-reporting, audits or impact assessments about
businesses’ commercial surveillance practices and whether these
materials should be disclosed publicly.
Automated Decision-Making and Algorithmic Errors
The ANPR may lead to the first specific regulation of artificial
intelligence (AI) and other automated decision-making systems
applicable to the entire US economy. The FTC presents a number of
questions on the reliability of automated decision-making systems.
These questions go far beyond earlier descriptions of the
proceeding as focused on algorithmic discrimination.
For example, the FTC asks whether it should require companies to
take specific steps to prevent algorithmic errors and what those
steps should be. The FTC is also interested in whether it should
require companies to certify that their reliance on automated
decision-making satisfies standards related to accuracy, validity,
reliability, and error (perhaps without regard to risk of harm) and
whether it should forbid or limit the development, design and use
of systems that produce unfair, deceptive or abusive outcomes.
Algorithmic Discrimination
In recent years, the Commission and several of its members have
urged companies to take greater care to avoid algorithmic
discrimination against protected classes of people. For instance,
Commissioner Rebecca Kelly Slaughter has warned that “[a]s an enforcer, I will see
self-testing [for unlawful credit discrimination] as a strong sign
of good-faith efforts at legal compliance, and will see a lack of
self-testing as indifference to alarming credit
disparities.”
The ANPR includes questions about how to address this problem,
including whether the FTC should “bar or somehow limit the
deployment of any system that produces discrimination, irrespective
of the data or processes on which those outcomes are based,”
and, if so, which standards the Commission should use to measure or
evaluate disparate outcomes. These questions suggest the FTC may
adopt a disparate-impact analysis for prohibited discrimination
along the lines of the Equal Employment Opportunity
Commission’s
80-percent rule for prima facie evidence of
discrimination. (This rule provides that it is evidence of an
adverse disparate impact if the selection rate for the protected
class is not at least 80 percent of the rate of the unprotected
class.)
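As a purely illustrative sketch (all applicant counts and group labels below are hypothetical and are not drawn from the ANPR or EEOC guidance), the four-fifths comparison reduces to a simple ratio check, expressed here in Python:

def selection_rate(selected: int, applicants: int) -> float:
    # Share of applicants who were selected.
    return selected / applicants

# Hypothetical outcomes for two groups of applicants.
protected_rate = selection_rate(selected=30, applicants=100)    # 0.30
unprotected_rate = selection_rate(selected=50, applicants=100)  # 0.50

# Under the 80-percent rule, a ratio below 0.8 is treated as prima facie
# evidence of adverse disparate impact (here, 0.30 / 0.50 = 0.60).
ratio = protected_rate / unprotected_rate
print(f"Impact ratio: {ratio:.2f} -> {'flagged' if ratio < 0.8 else 'not flagged'}")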
Notably, as have other federal agencies in this Administration,
the FTC also seeks comment on possible antidiscrimination
protections for “underserved groups” that are not
recognized and protected from discrimination under laws enacted by
Congress.
Biometrics
The FTC requests information on the kinds of biometric
information that companies collect, how such information is
collected, and the purposes of the collection. The
FTC also seeks views on whether and, if so, how it should limit
practices that use facial recognition, fingerprinting or other
biometric technologies. The ANPR refers to states such as Illinois
and Texas that have enacted laws governing the use of biometric
data, and companies could draw from their experience complying with
these laws to respond to the ANPR’s biometrics-related
questions.
Remedies
The FTC seeks comment on the remedies that it should impose for
violations of the new regulations that could arise from the ANPR.
It asks whether the rules should “enumerate specific forms of
relief or damages that are not explicit in the FTC Act,”
including algorithmic disgorgement to prevent companies from
profiting from unlawful practices involving AI and automated
processes.
Despite not having express authority, the FTC recently has used
its broad powers to require disgorgement as part of its settlements
with Everalbum, Inc. and WW International, Inc. Everalbum was required
to delete the facial-recognition models and algorithms it developed
with biometric information from consumers who did not give
affirmative consent to the practice, while WW International had to
delete personal information collected from children without
parental consent and to destroy work product and algorithms derived
from such information.
Other Issues and Areas of Concern
Other topics covered by the ANPR include questions regarding
the:
- extent to which commercial surveillance harms consumers;
- authority of the FTC to promulgate rules and the extent to which
provisions of the FTC Act could be used to regulate certain
practices;
- harm caused to children by commercial surveillance and
child-protective privacy measures; and
- balance between the costs and benefits of promulgating rules
governing commercial surveillance practices and the impacts of
regulations on competition.
Conclusions
Although the ANPR is the initial step in the lengthy rulemaking
process, companies should be mindful of the potential impacts the
eventual regulations could have on their business models and
operations. The numerous and specific questions posed in the ANPR
indicate that the FTC is thinking carefully about how it can set
and enforce clear—but potentially quite granular and
burdensome—requirements for businesses. Clear requirements
could provide more objective, bright-line standards that would help
companies avoid inadvertent violations of the open-ended
prohibitions of unfair and deceptive commercial practices
under Section 5 of the FTC Act. On the other hand, the more
prescriptive and particular the requirements, the greater the risk
that they would hinder innovation in the digital economy and fail to keep pace with
technological developments.
If the FTC decides to move forward with promulgating
regulations, it will first have to review and evaluate the comments
it receives. The ANPR presents an opportunity for companies to
assist the FTC in arriving at proposals that preserve innovation
and at the same time ensure the ethical use of data. Once the FTC
has settled on a set of proposed rules, it may be much harder to
obtain significant modifications. Concerned companies should
therefore consider engaging with the FTC in this early stage by
submitting focused comments on particular questions.
Privacy policy is attracting lots of attention in Washington
this summer. Privacy mavens will be aware that the
“bipartisan, bicameral” American Data Privacy and Protection Act
(ADPPA) has progressed farther in Congress than any previous
privacy legislation of such broad scope. The ADPPA would require
the FTC to adopt rules on many of the topics covered by the ANPR
and constrain the Commission’s discretion in doing so. However,
the bill continues to face significant opposition, so its prospects
are uncertain. If the ADPPA does become law, the FTC would
“reassess the value-add” of the rulemaking effort,
according to Chair Khan. (Commissioners Slaughter, Bedoya, Wilson,
and Phillips generally agree, notwithstanding nuanced differences
in their views.) In the meantime, the FTC is pressing forward, and
businesses would be wise to pay attention.
Footnotes
1. Sections 5(l) and (m)(1) of the FTC Act, 15 U.S.C. § 45(l), (m)(1), prevent the
FTC from seeking civil penalties for unfair or deceptive business
practices unless it already has adopted a rule or a written
decision (other than a consent order) prohibiting the specific
practice.
2. These questions are interesting, particularly given
that COPPA and the GDPR are very high-level in terms of security
requirements (the GDPR requires appropriate technical and
organizational measures, while COPPA requires reasonable procedures
to protect the confidentiality, security and integrity of personal
information).
The content of this article is intended to provide a general
guide to the subject matter. Specialist advice should be sought
about your specific circumstances.