In case you didn’t know it, all your home appliances are connected devices. “Some of our engineers view a refrigerator as just a 72-inch computer,” says Michael Beyerle, innovation marketing manager at GE Appliances.
At a November 19 Federal Trade Commission (FTC) workshop on the “Internet of Things” in Washington, DC, discussions drilled down into the impact of the growing connectivity of products and services in people’s homes, their health and fitness, and their automobiles.
According to Jeff Hagins, co-founder and CTO of SmartThings, smartphones are becoming ubiquitous at the same time as we’re seeing an explosion of connected devices, driven by falling costs of manufacturing and connectivity. Manufacturers are pursuing a model where they connect devices to the cloud and give you an app for your mobile device to control them. Do I really want to end up with an app for my oven, refrigerator, thermostat, light bulbs, security system, baby monitor, etc.?
“I literally have three different apps for light bulbs on my phone right now.” From a consumer perspective, Hagins said, that is just silly, and in that respect the Internet of Things currently “fails the consumer.” His company created a platform, targeted initially at the “smart home,” that lets a single app control the home and lets those devices work together.
Hagins continued: “I have 130 connected devices in my home, but most of those devices in and of themselves don’t deliver a lot of value. It is the software on top of them that delivers value.”
Lee Tien, senior staff attorney for the Electronic Frontier Foundation, said he is no “cheerleader” for the smart home, since it “raises serious privacy issues.” He cited Supreme Court Justice Antonin Scalia, who said that, “in the home… all details are intimate.”
Much of the Internet of Things data coming out of a smart home is not useful on its own, but it can be used to make inferences about who you are, how many people live in your home, how often you get visitors and for how long, where in the house you are, and what you’re doing there. Tien’s concerns about the dangers of inference echo arguments made about Big Data at a policy conference in September.
Craig Heffner, vulnerability researcher at Tactical Network Solutions, mused that the Internet of Things is a nice buzzword, but “we already have things on the internet, and we have a lot of them,” largely without any meaningful data security. According to Heffner, “Consumers don’t understand the technology they have,” and that lack of understanding may be an even bigger threat.
Hagins agreed that, “Our things and our data need to be secure” and “we need to own the data that comes from those things.” Consumers are constantly adding contextual data to their Internet of Things data, making it more useful to them but also a bigger privacy and security threat. If you name a cluster of devices “Kim’s room,” someone accessing that data would know which room is Kim’s and when she is in it.
Beyerle pointed to the more mundane interests of most consumers, whose most common wish would be an app that checks to make sure the stove is turned off after they leave the kitchen.
Hagins agreed. “My killer app is one that detects if I left my garage door open when I left in the morning and closes it for me,” but that is a clear security threat if an intruder can use it to control his garage door.
He upped the ante on the confused consumer, noting that it is not just consumers who don’t understand how the Internet of Things works – the folks building it likely don’t understand it either.
Tien and Heffner both returned to the security vulnerabilities that can result from that ignorance. Naive engineers rely on consumers to have already secured their home Wi-Fi networks, on which the engineers’ smart home devices will operate. Heffner worried about unschooled amateurs and small businesses developing Internet of Things devices and systems while trying to limit costs, hiring cheap developers who will “make rookie mistakes” with unsuspecting consumers’ security.
Another workshop panel explored the risks and benefits to consumers from connected health and fitness devices and apps. This can mean everything from connecting more obvious data-collecting devices like home glucose and blood pressure monitors, to less-suspected devices like the mobile app that helps you track and evaluate your daily jog, or the temperature-detecting sensors in a bra that help detect potential breast cancer. It could also mean a monitoring system for at-risk elderly people living on their own. The data they produce is generally not covered by existing U.S. medical data privacy laws like HIPAA.
Scott Peppet, professor at the University of Colorado School of Law, made the case that consumers can’t begin to understand the privacy risks of much of this seemingly innocuous, disparate data or the kinds of inferences that can be drawn from it. For instance, he contended that consumers can be personally identified by the data from their Fitbit app (used for exercise), because no two people have the same gait or stride.
Anand Iyer, president and COO of WellDoc Communications, agreed that consumers can’t grasp the pitfalls of the Internet of Things or “make good decisions about privacy because they have incomplete information.”
As a result, getting informed consent from consumers will be a huge challenge. Stan Crosley, director of the Center for Law, Ethics and Applied Research in Health Information at Indiana University, warned that requiring explicit consent for interconnected health and fitness devices would be “catastrophic” to their development and ultimately hurt consumers. He warned that there is “no practical way” for a consumer to consent “every time a sensor records data about you.”
Crosley also stressed the importance of the burgeoning sea of Internet of Things data in the healthcare context, particularly in building data integrity. Consumers and their healthcare providers, he said, need multiple different applications collecting different data in order to build a more coherent picture.
Jay Radcliffe, senior security analyst at InGuardians, Inc., turned his focus to data security, worrying that consumers don’t realize how insecure their data may be: Just because they are a nobody does not mean that somebody won’t hack their data. The financial industry is making great strides in security, and most industries are demanding more complex passwords, but “doctors need to make their patients aware that these devices are connected” and have their patients think about that in a larger context.
Unfortunately, Peppet found in one of his studies that doctors opposed password protection on an app designed for patients because it would be an impediment to patient compliance. Doctors seek to remove every hurdle to patient compliance with treatment (and apps can help, since patient self-reporting to doctors can be unreliable). If there is a password, he said, “patients might throw the app away.”
The panelists all tended to agree that the FTC could help by focusing on its traditional role, holding companies accountable for the false or embellished statements they make about the connected devices they offer consumers. More specific best practices and guidance on connected healthcare devices might be helpful as well, in collaboration with other relevant agencies (e.g., the Food and Drug Administration).
“At one time, the automobile was the single largest consumer of CPUs in a single device,” said Kenneth Wayne Powell, the general manager and senior executive engineer of electrical systems at Toyota. As cars get more complex, they contain far more computing power. Powell noted that most of what Toyota has done in relation to the Internet of Things is to connect drivers with information they need, usually downloaded from a satellite or terrestrial information system.
“Marketing research has shown we can service their needs” that way, said Powell, but it is one-way, not interactive. The same goes for a car’s event recorder—a standalone device in the car that has to be accessed directly to get the data out of it. None of these things makes for a connected car.
So what is a connected car? OnStar is a favorite example: “an embedded function in the car, with a secure connection to service networks.” More recently, smartphone-based connectivity (Bluetooth or wired) has spawned an app-type environment, featuring music services like Pandora, simple Internet searches or queries, and mp3 players. “Those systems are, by design, separated inside the vehicle,” said Powell, and cannot access the car’s entire data network.
Christopher Wolf from the Future of Privacy Forum extolled some of the benefits of connected cars. A driver in an emergency situation can call emergency responders by touching a single button. Drivers’ cars can alert them to dangerous road situations ahead. Parents can ensure their kids are using the family car safely. Location services on cars can ensure that Good Samaritan calls quickly send first responders to the right place. More mundane consumer benefits exist, too, like mobile apps that turn on your car’s air conditioning before you get in on a hot day or remember where you parked, or in-car systems that point you to the nearest parking spot while you’re driving.
John Nielson of the American Automobile Association (AAA) agreed that these developments have the potential “to simplify our lives” and “keep drivers safe,” but AAA worries that technology-driven distractions are a dangerous problem. Drivers are “looking down at an app instead of looking at the road.”
Wolf countered that integrating those apps with the car could, for instance, let drivers interact on social media safely, without taking their attention away from the road.
What about the security of those connected devices and their data? Tadayoshi Kohno, associate professor of computer science and engineering at the University of Washington, conducts research on the security issues in modern connected cars. He noted that the connections and computers within the car are incredibly valuable for safety. Some cars have traction control sensors on each wheel to detect how fast each wheel is going; if the wheels are out of sync, the car may be in a skid, and the system adjusts accordingly. “There are lots of different definitions for connectivity. But we need to understand the unexpected consequences,” said Kohno, and prepare for them.
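The wheel-speed comparison Kohno describes can be sketched very roughly in code. This is an illustrative toy, not any real traction-control algorithm: the function name and the 15% deviation threshold are assumptions chosen for the example, and production systems use far more sophisticated sensor fusion.

```python
def detect_slip(wheel_speeds, threshold=0.15):
    """Flag each wheel whose speed deviates from the average of the
    other wheels by more than `threshold` (a fraction). A flagged
    wheel suggests the car may be skidding or a wheel is spinning.

    wheel_speeds: list of per-wheel speeds (e.g., in km/h).
    Returns a list of booleans, one per wheel.
    """
    flags = []
    for i, speed in enumerate(wheel_speeds):
        # Compare this wheel against the average of the other wheels.
        others = [s for j, s in enumerate(wheel_speeds) if j != i]
        reference = sum(others) / len(others)
        deviation = abs(speed - reference) / reference if reference else 0.0
        flags.append(deviation > threshold)
    return flags

# One wheel spinning much faster than the rest gets flagged:
print(detect_slip([20.0, 20.1, 19.9, 26.0]))  # [False, False, False, True]
```

The point of the example is only that out-of-sync wheel speeds are detectable by simple comparison; what a real system does with that signal (adjusting braking or engine torque) is the hard part.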
Kohno’s team purchased a pair of vehicles for their research and came up with some scary security scenarios. An attacker who could connect to the car’s internal network could do “a lot of damage.” Attackers could make it impossible for a driver to stop the car. Attackers could even access the car’s network without ever physically touching the car. “I could email you a working mp3 file that, if played in your car, could unlock the doors.”
Nielson pointed out that cars are tremendously complex and are a huge fountain of data, but that most of it has almost no value to a third party. It simply helps the car function, which is why security needs to be handled carefully.
Panelists were then asked about privacy and security surrounding updates to car systems made during a visit to your car dealer’s service department. As Nielson pointed out, “If you’re going in because your check engine light is on, you need to get in” for service. The data the service department is accessing is “mechanical diagnostics, not a record of everywhere you’ve been or how fast you drove. And it all spits out within less than a minute.” There’s not a big risk in that instance, he seemed to think.
More broadly, consent to data use and sharing is a complex challenge in connected cars. Many of the car’s systems don’t have screens. “How can you do notice and choice in that context,” asked Wolf, and “how can a user consent to something at 60 miles an hour?” New ways to ensure consumer consent are needed.
What is unique about the connections involving automobiles rather than other kinds of connections? Powell noted that, “the fact that it is an automobile rolling down the road makes it a riskier environment,” and reiterated the panelists’ worries about the “extraordinary” risks of distracted driving resulting from connected cars.
Lerone D. Banks, a technologist in the Division of Privacy and Identity Protection at the FTC, asked the panelists, “When is there too much technology?”
Nielson responded that “the technology itself is a benefit. The information is a good thing.” But concerns center around how it is used; how and when it is displayed; and with whom, how and when the resulting data gets shared.
Asked what government ought to do, Wolf warned that, “to legislate in advance would stymie innovation.” The FTC has done a good job not giving prescriptive regulations for specific technologies, he commented, and “I would not like to see the mission of the FTC become the granular technology prescriber.”
For the whole MRA series about the workshop on the Internet of Things, see part 1, “The Internet of Things: Connected devices are changing the world for consumers and data users,” part 2, “Trust and context in a connected world: What can marketing research tell us?,” part 3, “Vint Cerf and the Internet of Things: ‘Privacy may be an anomaly,’” part 4, “Smart Home, Smart Health, Smart Cars: What will inter-connected devices mean for users and data users?” and part 5, “Ubiquitous Data: Privacy and Security in a Connected World.”