When seemingly every innocuous device can be connected to the Internet, and to everything else, what must technology developers, consumers, and the survey, opinion and marketing researchers who want to learn from the ubiquitous data flowing in every direction do to ensure consumer privacy and security?
That was the focus of the final panel of speakers at a November 19 Federal Trade Commission (FTC) workshop on the “Internet of Things” in Washington, DC, which used four practical hypothetical scenarios as its jumping-off point. The speakers discussed how novel the Internet of Things really is, how to manage traditional expectations of notice and consent for data collection, use and sharing, and how to maintain privacy and security in such an environment.
Scenario #1: Sue wants to design a system that will control the interconnected devices in her home via her smartphone. She wants her smartphone to be able to lock and unlock the front door, turn off her alarm as she approaches, and control the lights in her bedroom so they turn on before she wakes up. What should she be aware of? What are the costs and benefits? Does it depend on the sensitivity of the data? Can Sue just go out there, put up a shingle, and do this herself?
Dan Caprio, senior strategic advisor at the law firm McKenna Long & Aldridge, referenced two principles to work from: (1) “Think about what we connect to the Internet and why;” and (2) “There’s no such thing as perfect security.”
Michelle Chibba from the Office of the Information and Privacy Commissioner of Ontario reminded the audience that most mobile app developers are small businesses without massive IT departments. Small organizations need to be targeted with basic guidance on data protection, she said.
Caprio pointed out that there’s so much innovation in low-cost security that “reasonable data security doesn’t have to break the bank.” Sue needs to think of it at the conception of her app and “bake it in” to the development. “There are tools and technologies to keep in mind that might not cost an arm and a leg,” he noted.
The FTC tends to wait until you have a lot of customers before it starts to kick the tires on your security and privacy practices, according to Ryan Calo, assistant professor at the University of Washington School of Law. If you get those structures in place early, he said, with a good business model, you’ll be in a good position when you’re big enough to get noticed. The FTC “does a good job on security”: if you misrepresent your company as secure, you’ll be found out and punished for deceptive or unfair practices.
Marc Rogers, principal security researcher at Lookout, Inc., referenced the IP-connected light bulb mentioned earlier in the workshop, warning that the risks extend well beyond someone nearby controlling your light bulb. “Don’t underestimate what might be done with your app,” no matter how innocuous you may think it is.
Scenario #2: Jane wants to start training for a marathon, and she considers buying a new smart device to help her train. The device can: connect to her online calendar to schedule times for runs; calibrate optimal training programs and design running courses; offer discounts on medical insurance; and post progress on her social networks. Terms and conditions on the device don’t mention data sharing details. Does this connected device put Jane on notice that the manufacturer may obtain her information? What if the manufacturer starts offering the app for free in exchange for selling Jane’s data to advertisers?
As a consumer privacy measure, Calo talked about consumer notification the same way Churchill talked about democracy: a bad idea, except for all the others. In a Google Glass era, he lamented, we’re “using Gutenberg-era privacy policies.” While we shouldn’t abandon notice, “we need to drag notice into the 21st century,” and the Internet of Things is a “forcing mechanism” because it doesn’t have screens to display that notice. In this scenario, Jane needs more customization to allow her to understand how her data may be working for or against her.
The “bait and switch” in some arrangements bothers privacy activists, and can also bug consumers, according to Calo. Consumers need to be given the gist of the transaction and offered some basic transparency. Under those circumstances, he felt there wouldn’t be a problem with Jane using the watch.
Of course, this problem is not necessarily unique to the Internet of Things, since transparency has already proven a thorny issue with mobile apps.
Rogers noted that consumers must understand what data is being collected and be given a chance to consider the implications. But Maneesha Mithal, associate director of the FTC’s Division of Privacy and Identity Protection, postulated that “notice and choice” was an outdated privacy concept.
Caprio agreed that the notice and choice model needs some adapting, since business models are rapidly evolving. “What is the problem we’re trying to solve and what do we need to do to solve it? Consumers don’t read privacy policies,” so we should move away from siloed approaches built around collection and think more about how data collection really works, consider what the real-world harms may actually be, and develop practical solutions. Caprio observed that the Fair Information Practice Principles (FIPPs), which guide most privacy regulation, should perhaps be rethought and updated. The FIPPs “grew up” in the 1970s, an era of “centralized databases with a lot of structured data.” A former FTC employee himself, Caprio recounted that 15 years ago, “we measured our progress on the Internet at the FTC by surveying 100 websites.” Now, data is unstructured and decentralized, so industry, civil society, and government entities need to work together and be flexible, but it is important to “get the policy aspects right.”
Mithal asked the panelists if there should be a central preference-setting system for privacy in the Internet of Things. “Would that give huge power and control to whoever controls the consumer interface?”
The potential for abuse, replied Calo, can be seen by asking who built the underlying mechanism, who controls the data flow, and who pays for it all. “Our lodestar should be to empower consumers.” While the idea of a centralized mechanism made him “uncomfortable,” he recognized the need for basic interoperability so that different overarching systems of consumer control could be developed. Otherwise, he said, “when you have standards, how do you get them taken up by everyone?”
Rogers expressed skepticism that such interoperability was within reach, since there is “too much going on from too many directions for the manufacturers to want to cooperate this way,” and on too many closed networks. He suggested that a “standardized approach for privacy, not a central controller,” would make for a much more realistic solution.
Recognizing that data from the Internet of Things could improve our lives, especially when combined with Big Data and marketing research analysis, Mithal asked the panelists if data sharing with third parties, “in anonymous aggregate form,” should be allowed.
Calo wondered if it matters whether the proverbial “they” know who you are: If you’re on a 12-mile run, and an app tells Snickers that device 1234 just completed a 12-mile run, and Snickers can reward you with a coupon email, “is that so horrible? They don’t know you’re Ryan Calo.”
But what if Snickers gets the aggregate data on a thousand runners, asked Mithal in return. She also wanted to know if data minimization should play a role in such a scenario.
Caprio felt that, while minimization is important, there is so much going on, sometimes it is impossible to figure out at the beginning what data needs to be minimized without strangling off the innovation. Data minimization decisions are “not black and white.”
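The “anonymous aggregate form” of sharing Mithal raised, and the data minimization trade-off Caprio described, can be made concrete with a small sketch. The function below is purely illustrative (the field names and the cohort threshold are assumptions, not anything proposed at the workshop): it drops device identifiers entirely before reporting, and withholds statistics for groups too small to hide in, a simple k-anonymity-style floor.

```python
from statistics import mean

def aggregate_report(runs, min_cohort=5):
    """Share only cohort-level statistics with third parties.

    `runs` is a list of (device_id, miles) tuples. Device IDs are
    discarded entirely, and cohorts smaller than `min_cohort` are
    suppressed so no individual runner can be singled out.
    (Hypothetical schema for illustration only.)
    """
    miles = [m for _, m in runs]
    if len(miles) < min_cohort:
        return None  # too few runners to release safely
    return {"runners": len(miles), "avg_miles": round(mean(miles), 1)}
```

One design note: suppressing small cohorts is a blunt instrument, and real deployments often layer on noise addition or differential privacy, but even this minimal floor illustrates that “aggregate” sharing requires an explicit decision about how small is too small.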
Scenario #3: Sue’s system for controlling interconnected devices via the smartphone is extremely successful. One day Sue gets a call from Tom who runs the home security system that is compatible with Sue’s application. Tom tells Sue that the login credentials for his system were compromised and that criminals have posted live video feeds of some of Sue’s customers on the Internet.
You shouldn’t be able to move between customers’ systems easily, said Rogers, and it can be prevented with strong password protections and two-factor authentication. “If security had been baked in at the start…with an adequate security assessment,” such an issue could have been easily avoided, but too often there is a rush to market by people who Rogers believes don’t have appropriate skills, or who simply leave it up to the consumer to set their own security.
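The protections Rogers names, strong password storage and two-factor authentication, are well-trodden ground. As an illustrative sketch only (not Lookout’s or any panelist’s actual implementation), the snippet below pairs salted, slow password hashing (PBKDF2) with RFC 6238 time-based one-time passwords, using nothing beyond the Python standard library:

```python
import base64
import hashlib
import hmac
import os
import struct
import time

def hash_password(password, salt=None):
    """Slow, salted hash so a stolen credential file resists cracking."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Constant-time comparison avoids leaking information via timing."""
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

def totp(secret_b32, at=None, step=30):
    """Six-digit RFC 6238 time-based one-time password (HMAC-SHA1)."""
    counter = int(at if at is not None else time.time()) // step
    mac = hmac.new(base64.b32decode(secret_b32),
                   struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 1_000_000
    return f"{code:06d}"
```

Even this minimal second factor means a leaked password alone no longer opens a customer’s video feed, which is the failure mode in the scenario.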
Calo also weighed in with concerns that privacy and security should have been considered before the business even got started. “The data lifecycle starts at your business plan.”
Scenario #4: One day Sue is approached by a marketing company that wants to buy data about Sue’s customers. Now she wants to change her data sharing policies to share with third parties, contrary to the promises in her original user agreements. The FTC’s position is that material changes to privacy policies require opt-in consent. How could consumers go about giving their consent to that in the Internet of Things?
Changes in the gist of the transaction with “secondary non-beneficial uses” should always “raise alarm bells,” replied Calo. He compared it to the shift in the value proposition of going to the movie theater, where consumers now get ads “before you even hit the previews.”
Mithal asked the panelists if this constituted a “material change” in privacy practices.
According to David Jacobs, consumer protection counsel at the Electronic Privacy Information Center, it could be a material change, depending on the wording of her original terms of service and user agreements. How can Sue gain her users’ consent? He said it would depend on the situation: in an app she could use a just-in-time popup notice, or she could reach out to her users through email, but it would depend on the relationships Sue has with her users.
What should the FTC do?
Mithal asked the panelists, “If you were the FTC, what would you do next?”
Rogers warned that the vastness of the Internet and “how fast it is moving” pose major challenges. The FTC, he said, needs to strike a balance between guiding companies and enforcing, but be “light” in regulating. Returning to his concerns that so many manufacturers and developers lack security expertise, Rogers commented that “a lot of these design problems were solved long ago,” and most people just need to be pointed towards those solutions.
Jacobs said that, in the absence of a federal omnibus privacy law, the FTC needs to step in with vigorous enforcement to fill the gaps. His organization, EPIC, last year asked the FTC to make an enforcement settlement against a marketing research company significantly harsher.
T. Drew Hickerson, assistant general counsel at Happtique, stressed the need to educate consumers and businesses. The FTC, he felt, should partner with industry and other government agencies, since there is “too much volume” to effectively navigate and enforce in the Internet of Things.
Caprio stressed the point that “one size doesn’t fit all” in regulating the Internet of Things. “This is an evolution that really requires a new way of thinking and a flexible framework to adapt to the 21st century.” The FTC needs to look at “technology neutral” regulation, but any move to regulate right now would be “premature.” Whichever country “gets this right will lead the world,” he said, citing U.S. dominance in Internet and data entrepreneurship.
Are new FTC regulations on data privacy and security coming soon for the Internet of Things?
No, said Jessica Rich, director of the Bureau of Consumer Protection at the FTC, as she closed the workshop. She did, however, indicate that the agency would release a report on the subject next year.
Given the rapid evolution discussed at this FTC workshop alone, it makes sense for the FTC to take it slow. Consumers may feel that they have less control over what’s happening with their data, especially given how much is being generated, but they may also gain greater control over their world in the process; the end state is not yet in sight, so it is hard to predict. One can certainly understand the concerns of analysts like Adam Thierer, who worry that the FTC could impose preemptive controls on the Internet of Things through a precautionary-principle approach to regulation and thus kill off all sorts of interesting and useful innovation. At the same time, the worries of security technologist Bruce Schneier resonate: “giving the Internet eyes and ears” could disrupt the power balance for consumers and leave them subject to not just ubiquitous data, but ubiquitous surveillance by government and corporations.
Survey, opinion and marketing researchers can be at the forefront in guiding the Internet of Things and making use of it – to better serve their clients and consumers and bring new and exciting innovation to the world. Keeping respondents in mind becomes extra important in the Internet of Things, since the passive data collection involved turns almost everyone into a respondent in one form or another. It is up to the profession to adapt new methods that can best protect those respondents as the new world evolves.
And the regulators will be watching us all closely.
For the whole MRA series about the workshop on the Internet of Things, see part 1, “The Internet of Things: Connected devices are changing the world for consumers and data users,” part 2, “Trust and context in a connected world: what can marketing research tell us?,” part 3, “Vint Cerf and the Internet of Things: ‘Privacy may be an anomaly,’” part 4, “Smart Home, Smart Health, Smart Cars: What will inter-connected devices mean for users and data users?” and part 5, “Ubiquitous Data: Privacy and Security in a Connected World.”