“As businesses, you need to be concerned about losing your customers’ trust if the big data analytics on which you rely cannot handle consumers’ private information with sensitivity and respect.”

That was the opening warning from FTC Commissioner Julie Brill at a U.S. Chamber of Commerce conference on October 7 on “The Future of Data-Driven Innovation.” In her remarks, she told attendees that “trust is the belief – but not certainty – that a person, company, or device will do what an individual expects, and not something else.” Meeting and respecting “norms and consumer expectations,” in Brill’s opinion, can be as important as, or more important than, simply following the law. When consumers trust a company, they will use its services without understanding every “jot” of its privacy practices. Violating that trust will ruin the relationship – trust takes a long time to build, but “you can lose it in a minute,” she said. “This requires constant effort as risks to consumer information constantly shift.”

Brill felt that “data security is the biggest challenge to consumer trust in the data-driven economy” and that “there is no privacy without appropriate data security.” Companies must deliver appropriate data security protections if consumers are to enjoy the benefits of social media and the Internet of Things. Brill was concerned, however, that companies are not learning the lessons of past FTC enforcement actions: too many are not encrypting their data and are not following good security practices.

Turning to the now-familiar boogeyman of data brokers, Brill observed that, in the marketing context for which these analytics are intended, they should not be harmful, “but there’s also a clear potential for use of the information to be harmful and discriminatory and to destroy consumer trust.” Reiterating a question from the FTC’s Big Data workshop in September, she asked, “Where does market segmentation end and harmful discrimination begin?”

The Commissioner then suggested steps that companies can take now to help build consumer trust. “Consumers cannot manage all of this on their own.”

  • “Improving security of data that is personally identifiable or linkable to individuals is a must.”
  • “Risk assessments and data minimization are integral to data security practices.”
  • “Make security tools more user-friendly.”
  • “Giving individuals more tools to control the privacy of their information is another must in the data-driven economy.”
  • Companies need to build “better protections under the hood to ensure ethical treatment of consumers.”
  • Adopt privacy and security practices “that will earn consumers’ trust.”
  • Companies dipping into Big Data should deploy dashboards for consumers, and ethics reviews should be built into their policies. They should be “asking whether their analysis of consumer data is taking them into questionable territory.”

After Brill’s speech, Carl Szabo, policy counsel for NetChoice, pointed out that the Commissioner talked about a lot of potential problems from Big Data and asked what the FTC was doing to identify “real harms in the marketplace” instead of talking about “what may or may not happen.”

In response, Brill said she is often told she doesn’t dwell enough on the great benefits of Big Data and spends too much time talking about potential problems, “but that is my job.”

Brill asked, "Am I putting a chill on Big Data? I don't think so… I’m just one person in a chorus of many, many people that are singing a lot about these issues.”

Grappling with an abundance of data
As any marketing researcher knows, companies should not simply let data sit. "It's not good to just have this data... you have to get insights out of it," said Frank Stein, director of the Analytics Solution Center and Software Group at IBM.

"We expect 50 billion connected devices worldwide in the next 10 years," said Emery Simon, counselor at the Business Software Alliance, at the conference. Simon continued that, "data is not a new phenomenon... it has gone from being scarce to being very abundant."

To get a handle on the state of the data economy and related public policy issues, the Chamber released a data-driven innovation report to coincide with the conference.

Policies to make Big Data work
A panel discussion of public policy approaches to Big Data didn’t follow quite the same script as Commissioner Brill: not all of the panelists supported her call for comprehensive data security and privacy legislation.

Marjory Blumenthal, executive director of the President's Council of Advisors on Science and Technology (PCAST) in the Office of Science and Technology Policy (OSTP), warned against baking specific technologies into public policy, because they are likely to become obsolete quickly. “We need a kind of regulatory humility” when considering Big Data, she said, and we “can’t count on any one technology to protect us.” Besides, “technology alone is not likely sufficient” to protect consumer privacy.

David Quinalty, GOP policy director for the Senate Subcommittee on Communications and Technology, discussed current threats to the “cross-border data trade” and his concerns about digital protectionism leading to a global “balkanization” of data.

As Benjamin Wittes and Wells C. Bennett wrote in their section of the Chamber report, privacy is “something of an intellectual rabbit hole, a notion so contested and ill-defined that it often offers little guidance to policymakers concerning the uses of personal information they should encourage, discourage, or forbid. Debates over privacy often descend into an angels-on-the-head-of-a-pin discussion.”

They decided to focus on “databuse,” meaning: “the malicious, reckless, negligent, or unjustified handling, collection, or use of a person’s data in a fashion adverse to that person’s interests and in the absence of that person’s knowing consent.”

“Corporate custodians” of consumer data have “obligations to keep it secure; obligations to be candid and straightforward with users about how their data is being used; obligations not to materially misrepresent their uses of user data; and obligations not to use them in fashions injurious to or materially adverse to the users’ interests without their explicit consent.”

Failures of this kind of “data trusteeship,” said Wittes and Bennett, constitute databuse. Protection against databuse “should lie at the core of the relationship between individuals and the companies to whom they give data in exchange for services.” This, they concluded, is preferable to “broader protections of more expansive, aspirational visions of privacy.”