Re: The FTC’s Approach to Consumer Privacy


June 3, 2019


I submitted the following comments to the Federal Trade Commission on May 31, 2019 regarding Docket FTC-2018-0098, "Hearings on Competition and Consumer Protection in the 21st Century: Consumer Privacy." A PDF version, which includes my footnotes, is also available.

Mr. Donald S. Clark
Federal Trade Commission
Office of the Secretary
600 Pennsylvania Avenue, NW
Washington, DC 20580

Re: Hearings on Competition and Consumer Protection in the 21st Century: The FTC’s Approach to Consumer Privacy

Dear Mr. Clark:

On behalf of the Lincoln Network, a nonprofit organization that fosters collaboration between the worlds of government and technology to create a future full of individual liberty and economic opportunity, we respectfully submit these comments regarding the Federal Trade Commission’s approach to consumer privacy. Given the unprecedented public interest in how technology firms collect, use, and share consumer information, the FTC plays an important role in enforcing federal consumer protection laws while preserving innovation and disruption in the technology marketplace. We applaud the agency’s thorough efforts to better understand the privacy opportunities that new tech offers—and the privacy concerns that it raises.

The agency asks how, and to what extent, companies compete on privacy. As consumers grow more familiar with mobile devices, social media platforms, apps, and Internet services, they increasingly care about the privacy of information that third parties learn about them. In March 2018, one of the most high-profile privacy controversies to date came to light when news broke that the data analysis firm Cambridge Analytica had secretly amassed a vast quantity of information about over 50 million Facebook users. Within weeks, public trust in Facebook had fallen 66 percent. In the following months, about 3 million of the social network’s European users left the platform, and the company’s market value fell by over $119 billion. Though far from a fatal blow to Facebook, the Cambridge Analytica scandal underscores how much consumers care about privacy and how willing they are to behave differently when they lose confidence in a company.

Facebook’s privacy mistake spurred a competitive response. Apple CEO Tim Cook, when asked about Cambridge Analytica in March 2018, contrasted his firm’s approach to data privacy with that of Facebook, maintaining that Apple would never have found itself in a similar situation. Cook explained that while Facebook’s business model entails monetizing consumer information for individualized advertising, Apple generates revenue by selling devices and services to consumers, as opposed to selling advertisers access to consumers. Facebook CEO Mark Zuckerberg has defended his company’s approach to consumer information, and the two executives have since been at odds in their public statements regarding privacy.

Although tech companies vary considerably in their approach to privacy, the market is generally moving toward greater transparency about how data is collected and used. Apple, Google, and Facebook each now require third-party apps that collect consumer information on their platforms to maintain a privacy policy and adhere to specific information practices. Nearly all major tech companies have voluntarily agreed to adhere to the Network Advertising Initiative’s code of conduct, which among other things forbids using certain categories of sensitive information for personalized advertising without a consumer’s opt-in consent. Such voluntary codes are generally enforceable by the FTC, given that a company engages in deception when it acts in a manner contrary to its public representations.

The agency also asks about the privacy risks entailed by the use of “big data” in automated decisionmaking. Several sector-specific federal laws address how firms make decisions about how much to charge and with whom they transact in markets including consumer credit, housing, and employment. In general, however, firms enjoy significant leeway in deciding how to price the goods and services they sell. Companies employ numerous forms of differential pricing, also known as price discrimination, which entails charging different prices based on variables such as the quantity a consumer wishes to purchase, the segment to which a consumer belongs, and, in some cases, a consumer’s individualized willingness to pay.

When firms employ algorithms and use big data to make decisions, consumers may enjoy substantial benefits as a result, such as expanded access to credit, more effective health care, and more individualized employment opportunities, as the FTC has noted. To the extent that algorithms enable firms to engage in more effective differential pricing—i.e., charging each consumer a price closer to how much she is actually willing to pay—total welfare is likely to grow, as a 2015 White House report concluded.

As companies experiment with novel approaches to data-driven pricing, some implementations will entail pricing disparities that some consumers may perceive as unfair. For example, in 2012, The Wall Street Journal reported that Staples.com tended to charge lower prices to U.S. users located in close proximity to physical stores owned by Staples’ main competitors in the office supply sector. One implication of this pricing disparity is that the Staples website tended to display higher prices to consumers in lower-income areas, which happen to have fewer stores that compete with Staples than relatively affluent areas do. But differential pricing can also benefit lower-income consumers: the online travel agency Orbitz has reportedly charged higher prices to Mac users than to Windows users, presumably because Mac ownership correlates with higher household income.

Ascertaining a consumer’s willingness to pay is usually quite difficult. Firms cannot be expected to offer a more palatable price to every consumer through differential pricing. But when firms make pricing mistakes, the solution is not necessarily to refrain from algorithm-based pricing. Rather, by harnessing even larger, more nuanced data sets, companies can employ more effective price discrimination—thus increasing output and, with it, overall welfare, at least in most instances.

The agency also asks where privacy interventions should be focused and whether the collection of certain types of data should be banned. The economist Ronald Coase noted that when an economist finds something that he does not understand, he looks for a monopoly explanation. Similarly, when a privacy advocate encounters a form of data collection that he doesn’t understand, he often looks for an unfairness explanation. Given the potential consumer benefits of novel data collection practices, however, the FTC should exercise great restraint before condemning a type of data collection as presumptively unfair.

Congress has authorized the FTC to police unfair business practices only where such a practice is “likely to cause substantial injury to consumers” that they cannot “reasonably avoid” and it is not “outweighed by countervailing benefits to consumers or to competition.” This relatively strict test checks the agency’s authority, making litigation a risky proposition when the FTC brings action against an allegedly unfair practice based on a novel theory of injury. But this constraint on the agency’s unfairness authority is a feature, not a bug.

Given how quickly consumer preferences evolve, and the immense benefits that data collection can deliver, insisting that companies minimize the amount of information they collect about consumers in all circumstances is a mistake. Collecting vastly more data than is necessary may be sufficiently risky that it meets the unfairness threshold in certain instances, such as when a flashlight app for smartphones collects precise geolocation information from user devices. But treating data collection as unfair whenever it exceeds the bare minimum necessary to enable the functionality of an app or service means that many unanticipated benefits of data collection will go unrealized. Individualized advertising is not just a way for firms to monetize apps that collect information; rather, effective advertising based on substantiated claims and tailored to individual preferences can help consumers decide what to buy and make markets work more efficiently by reducing information asymmetries.

As the FTC assesses challenges to consumer privacy, the agency should continue to think about the potentially serious tradeoffs of excessive federal intervention in the marketplace. Locking in the status quo by classifying novel forms of data collection as unfair will hurt consumers and discourage competition. Consumers have shown that they are willing to educate themselves about privacy, and make purchasing decisions accordingly, when the perceived benefit is worth the perceived cost. Robust privacy competition, transparency, and control need not come at the expense of innovation and experimentation with greater information collection, algorithmic decisionmaking, and more effective advertising.

Sincerely,

Ryan Radia
Senior Policy Counsel
Lincoln Network
ryan@joinlincoln.org
