"Notice and choice" is "really a charade," according to Justin Brookman, director of consumer privacy and technology policy at Consumer's Union, so he said consumers need a "different mechanism" for the marketplace, "perhaps icons or a simplified notice," because the marketplace "doesn't work."

Speaking at an Information Technology and Innovation Foundation (ITIF) event on September 25 about European versus American privacy law, Brookman lamented further that there are no "expectations of trust for any connected service or platform" today, highlighting differential pricing driven by algorithms and broad access to data by companies unknown to the consumer. While Brookman loves the Federal Trade Commission (FTC) and it "does a really good job with the tools that it has," he urged Congress to give the agency more resources, power and authority if it is to have any hope of protecting privacy. The agency's authority under Section 5 of the FTC Act to combat deceptive or unfair acts or practices is "necessary, but not sufficient," and, according to Brookman, encourages companies "not to say anything" to avoid charges of deception.

He also contended that consumers “shouldn't be expected to manage" tons of "different privacy settings" for different entities. "I don't think you need unbounded data collection to make cool products," but "that is where we're headed."

Brookman complained that industry trade associations' self-regulatory programs don't work anymore, if they ever did, and that fewer companies even belong to those associations.

Will Rinehart, director of technology and innovation policy at the American Action Forum, argued the opposing position. He urged policymakers to think about the European Union (EU) General Data Protection Regulation (GDPR) "in context" with the EU privacy scene, what has happened since implementation in May, and "what it means for the ecosystem and what it means for choice." While GDPR provides "some sorts of control rights" within "certain types of platforms," there are some things that GDPR "does really poorly." For example, Rinehart said, "data minimization" stifles secondary usage and data sharing. Consumers have the right to export their data, but that is hardly a panacea. Consumers can enforce the delisting of information with the Right To Be Forgotten, but that has spillover effects that are still being explored. Violating any of these GDPR provisions carries a steep fine. "The stick here is pretty big," Rinehart said. As a result, "companies have tried to understand this new set of regulatory risks," and have taken on "clear costs" in compliance or avoidance.

He agreed with Brookman that "ubiquitous and unfettered data collection" is an important concern, but urged policymakers to consider what might be optimal for the marketplace.

There are lots of surveys about consumer attitudes toward data privacy, but when costs and benefits are measured, the data show that consumers believe online services benefit them personally and are satisfied with those benefits. "So how do we solve these problems that we've talked about," Rinehart asked, noting that Cambridge Analytica was a "third-party contract problem" rather than an issue between Facebook and consumers.

Rinehart also agreed with Brookman that a federal law may be a good idea, but said it needs to avoid GDPR's problematic approach; "costs and benefits" to consumers and businesses should be considered in implementing privacy law "writ large." GDPR's onerous fines make the data sector "a more risky investment," Rinehart concluded, and a GDPR-like policy would be "very detrimental."

Brookman immediately countered that the U.S. “should not just copy and paste GDPR and call it a day."

Amie Stepanovich, U.S. policy manager for Access Now, piped up in support of GDPR, contending that Americans are "being left behind" while other countries and jurisdictions are building systems to protect consumers. GDPR didn't really change much from the old EU Data Protection Directive, she insisted; it mostly just added heavy fines.

Daniel Castro, vice president at ITIF, stressed the need to focus on consumer harm, fraud, identity theft, and similar tangible harms, so that policymakers would be "solving a real problem" and not "have a negative impact on innovation" and "burden companies with excessive compliance costs." ITIF, he pointed out, “opposed GDPR for Europe, too, not just the U.S."

Castro felt that a key question was, "Do consumers actually want to pay more for privacy?" When researchers study this with consumers, consumers show "they don't want to pay." Presented with the tradeoff in a practical setting, consumers overwhelmingly accept a privacy invasion if they can save some money in the process, he said.

What is the best way forward: sector-specific fixes or broader approaches?

Stepanovich suggested that America's "sector-specific approach" may have helped in some specific areas, but has not "done the job of protecting consumers' data overall." If the goal is keeping compliance costs down, it isn't accomplishing that either, she said, since companies still must comply with requirements everywhere else.

Taking a more practical approach, Brookman replied, "I'll take what I can get." Some of the sectoral laws have done a good job, he said, but we can't just pass a data security breach notification law and then call it a day. People don't like data aggregation, data brokers, or surveillance across websites and services, and there is a need "to align practices with expectations."

Castro said that "transparency" is "one of the best principles out there," and is more than just a first step.

Brookman countered that transparency hasn't worked well for consumers: "nobody knows what's going on." He referenced privacy-invasive opinion surveys (raising the worry that our industry could be specifically targeted by privacy activists at some point in the future).

America v. California v. Europe

The U.S. "is basically under attack from GDPR" and the California Consumer Privacy Act, Castro insisted, and those jurisdictions will be setting the rules for data everywhere if Congress doesn't pursue a federal U.S. standard. California’s "manipulation question" comes back to concerns about American democracy and "controlling your own destiny," so we need to consider the empowerment of individuals in that context.”

According to Kim Hart, managing editor of Axios, the Cambridge Analytica case involved two levels of intrusion: Facebook sent large amounts of consumer data to Cambridge Analytica in violation of the contract between the two companies, and Facebook also sent data on the app users' friends, which was Facebook's own fault. She posited that a notice of transfer to third parties might be called for in response, instead of "throwaway notice buried in a privacy policy." Even if such notice may be meaningless to the individual consumer, Hart suggested, activists and regulators might be able to use that information on a macro level to regulate the digital space.

Brookman countered Castro’s points, highlighting a key problem with GDPR. “Europeans don’t ever enforce the laws that they write,” he said, which is why the e-Privacy regulation “never changed anything.” Brookman instead said that private sector “profit models need to change to align with consumer demands.”

Does the FTC have the tools necessary to police consumer privacy?

Brookman again stressed that the agency needs a lot more staff and civil penalty authority. He emphasized that harms are bigger than we think. For example, the simple consumer worry about misuse of their data down the road is a harm on its own. “There is a legitimate worry that data collected could be used against you in the future” and “there is always some degree of risk.”

Carl Szabo of NetChoice contended that the FTC's "unfairness" authority "is pretty broad," so he asked whether Section 5 can already address many of Brookman's concerns, and whether it should apply to nonprofits, since they deal with so much data, too.

Brookman countered that "unfairness is incredibly fluffy," and should be replaced with a specific law on privacy. Nonprofits should be included, he agreed. Brookman further lamented that GDPR focuses too much on process and not enough on substance, and "isn't at all what drafters wanted it to be." The Europeans' "coercive consent process" runs counter to the intent of the GDPR, he said, and he would prefer "clear rules about what you can do" instead of a "massive accountability program" with millions of lawyers helping you to comply.

Castro chimed in that GDPR obviously didn't and doesn't solve the problem, since Cambridge Analytica was a Europe-based company.

The FTC's current unfairness authority isn't enough, Stepanovich agreed. As an example, she pointed out that Facebook was already under an FTC consent order when the Cambridge Analytica case happened, and the required audits and assessments apparently failed to detect that the data had never been deleted, let alone prompt Facebook to act against the violations.

Hart asked the speakers if there was a better body than the FTC to enforce privacy law.

While admitting he would “like to see some reforms at the FTC,” Rinehart stressed that the agency is “where you’d want this kind of enforcement.”

Brookman again said that, “If you want the FTC to litigate more cases, they need a lot more resources.”

- See also "Data-Driven Industry Groups Propose New Paradigm for Consumer Privacy Regulation"