Should privacy policies be outside the box?
When you pick up an item and bring it home, do you remember to check online to see whether it could spy on your habits? Should you have to?
Prior to the Internet of Things, the idea of hidden cameras and secret surveillance of the mainstream public was rarely found outside a spy story; now, however, it’s becoming business as usual. The Internet of Things (IoT) is here, and it’s only a matter of time before a 24/7 connected device steps into your life. Just look at the data: connected devices have jumped from 15.4 billion in 2015 to 17.68 billion in 2016 and 20.35 billion in 2017. By the time 2020 rolls around, the IoT is estimated to include 30 billion connected devices worldwide; that’s roughly four connected devices for every person on Earth, and the number more than doubles by 2025. The result, behind the scenes, is an ever-increasing stream of big data flowing to remote servers and clouds for processing. The prospect of such a connected world is a double-edged sword: we like the convenience and applicability of smart applications, but there’s a question of how comfortable we are, or would be wise to be, living our lives in a glass fishbowl. When it comes to IoT, all the data collected to make an object ‘smart’ goes somewhere, and it’s not always in hands we’re comfortable with. The question is: are we vigilant enough when we buy products to be aware of the potential intrusion, and how hard should we need to look for the warning signs?
As an industry innovation, the rise of IoT certainly makes sense, both for startups trying new ideas and for existing manufacturers bringing their customers the next generation of product. Smart devices can often do more than their non-intelligent counterparts, providing a better experience in use. Personalizing how health aids operate, for example, can give the recipient a much higher quality of life. Smart devices are seen as fun, and have features many consumers want: we like saving money on our power bills by heating homes less when we’re not around and focusing that heat on the areas where we spend our time, rather than on all rooms indiscriminately. Smart speakers running assistants such as Alexa can turn on the music and call friends from anywhere in the house, while a smart car can advise on where traffic is best avoided, and even save lives by sensing a collision with a living thing faster than the driver can react. Inserting technology to make a device ‘smart’ can improve the product, adding a competitive edge in what can be a constant struggle for consumers’ dollars.
The catch-22, however, is that the only way a device becomes ‘smart’ is through access to lots and lots of data, and that data can pose complications not even its creators are aware of. Consider the Strava heat map disaster: during design, Strava may have felt the data it was aggregating was harmless, since a heat map of runners’ fitness routines isn’t considered as personal as, say, a name, ethnicity or health condition. Yet security researchers proved otherwise: such data could be used not only to expose international military bases, but potentially to unlock the daily living patterns of high-ranking officials, making them far more vulnerable if targeted. An IoT teddy bear hacked in 2017 exposed over two million messages between parents and their children online. A United States ISP warned users accused of piracy that it controls a lot more than the family internet connection, with power over the thermostat too. Worse, at present IoT devices have less than stellar privacy and security records, with protection of data from outside threats or prying insiders often an afterthought in the design.

Plus, it’s so easy to buy an intrusive device without recognizing what you’re actually getting into. Privacy isn’t always the first thing we think about when buying appliances, particularly items like lightbulbs and TVs, which never had privacy implications before. Is the buyer always aware they are purchasing a ‘smart’ device, or are they focused on other aspects, such as price, size, shape or brand preference, without paying attention to the added camera, microphone or sensor? Make no mistake, there’s a stark difference between signing up for a service online and walking into a retail outlet or selecting “add to cart” in an e-shopping store. While the actual privacy implications of social media and online services can be questionable depending on the vendor, at least users are reminded to take their privacy into consideration by agreeing to the online privacy notice, which is often required before account creation or can be viewed in the app store along with other information. When you buy an IoT device it’s different: you invest cash before knowing the full stakes of what you’re agreeing to once everything is plugged in. True, any device that features an app will show you a privacy policy at some point, but after you’ve put fifty, a hundred or more dollars towards a product, are you really likely to return it after you’ve read the privacy notice? Will you even understand the notice, and will the vendor allow a return once the product is opened?
Lest talking about ‘smart devices’ suggests the data invasiveness should be obvious, here’s just a sample of the ‘smart devices’ you could be bringing into your home:
- Lightbulb
- Lock
- Camera
- Speakers
- Water Bottle
- Notebook and Pen
- Television
- Thermometer
- Scale
- Slow cooker
- Child’s stuffed animal
Some legislators agree that privacy implications need to be higher on the company agenda before products are sent to market. In California a new bill is on the table: Senate Bill 327, also known as the “Teddy Bear and Toaster Act”. It seeks to protect individual privacy in two ways: by requiring manufacturers to embed security features into IoT products appropriate to the data collected, and, more crucially for consumer awareness, by requiring notice about the information the device could capture “through the use of words or icons on the device’s packaging, or on the product’s, or on the manufacturer’s Internet Web site,” potentially putting privacy before the point of sale. Another bill on the table, this time before Congress, is S.2289, the Data Breach Prevention and Compensation Act of 2018, which, while not requiring privacy notices as prominent to the consumer, does punish manufacturers who collect data without security safeguards, or who fail to give notice of a privacy breach, to the tune of hefty financial penalties. While Europe’s General Data Protection Regulation doesn’t push for notification at the time of purchase, that may be incoming, or regulators may feel it won’t be needed: the GDPR is already the world’s most robust piece of privacy legislation, with Article 25 explicit that any collection of European Union citizens’ data must have privacy by design and by default built into products and company processes to begin with. In Canada, PIPEDA, which governs the privacy requirements of private companies, is slated for an upgrade that includes data breach notification requirements. However, there’s still a long road to go, particularly as most of the aforementioned new legislation has yet to be approved.
Will privacy warnings ever be required outside the box in Canada or across the United States? Unlikely anytime soon; even if Bill 327 does make it into law, there’s no telling how much leeway in timelines will be provided for businesses to comply. The Privacy Commissioner of Canada is calling for further amendments to Canadian privacy legislation to bring it more in line with the GDPR, notably giving Canadians the right to be forgotten, but existing cases have already demonstrated that even when privacy rulings favour Canadian prosecutors, data sovereignty and international information flows can prevent enforcement. There is, however, a business opportunity for companies that do place privacy as a priority: like the environmentally friendly labels that let consumers purchase more earth-friendly options, marketers of products that take privacy seriously should consider it an advertising asset. As data breaches like Equifax raise customer awareness of how vulnerable their data is, a very real segment of buyers will weigh privacy in the buying decision if its advantages are drawn to their attention outside the box. Or, to look at studies on consumer choice, consider this: if we know 79% of customers will stop using a brand that uses personal data without their knowledge, we can reasonably expect those same customers to be more inclined to switch to products that are clear about how personal data is used. Startups that build privacy seriously into the design of their products can advertise that fact, showing the world not only what their technology is capable of, but how they are actively taking steps in their business to prevent problems in the future.