Function Creep: The Frankenstein of Privacy

Do you worry about how a new technology could invade your privacy in the future? Are you ever concerned that the company you give your data to will use it against you down the road? Have you wondered if it’s time tinfoil hats came back into style? If so, you’re not alone.

While innovators assure us new technology will only improve lives and open doors, it’s hard not to be skeptical when faced with reality. The Internet of Things might mean more convenience with more devices in the home, but it also means more products ready to play tattletale. Consider the news from John Hancock Financial. While fitness trackers are intended to encourage healthier lifestyles, the American insurance giant has found a new purpose for them: in September 2018, the company announced that fitness-tracker data will now be used as part of deciding customer premiums.

Welcome to the dark side of innovation, better known as function creep, the Frankenstein monster of privacy.


What is 'function creep'?

Function creep is what happens when we use technology and systems in ways beyond their original purpose, particularly when the new purpose results in an invasion of privacy. Taking something that exists, old or new, and finding a new use for it is not a bad concept on its own, but it becomes a big problem when that new use is to spy, commit fraud, or cause damage.

For an example of function creep, look no further than inside your own wallet. The driver’s licence started out with a simple role: to show that the holder could legally drive a vehicle. However, as licences became more common, they became an easy form of personal identification. A driver’s licence is easy to carry, and it bears the state or province’s verification of a person’s name and age. It’s also, unfortunately, historically easy to falsify, although licences have evolved over the years to make fraud harder. From teens buying liquor underage to serious identity theft, there’s a lot of trouble a fake card can cause, none of which was intended when licences were first designed.


Why is function creep such a critical concern for privacy?

Typically, when we give our data to a product or service, we know the purpose for which the data is needed. Often the potential uses of our data are laid out in writing, particularly if the data is personal information. This is often required by law: acts and regulations across the globe, including Canada’s PIPEDA, the EU’s GDPR, and Japan’s APPI, require that users be informed of the potential uses of their data before collection. ‘Identification of use’ is a key element of privacy, and a significant part of the contract of customer trust. We’re okay with handing our data over, provided we know what the data is for. That does not mean, however, that we are okay with all of the potential uses of our data.

Consider facial identification technology, such as Apple’s FaceID. Yes, it makes it easy to unlock your phone, but what if it is also used by law enforcement to access your device? What if Apple or a competitor sells its facial scanning technology to a new company that wants to use it to identify and announce a shopper every time they look at a product display in stores? Will individuals still be comfortable with this new use? Is there anything in the design of the facial scanner to prevent this unintended use of the technology?

Function creep is uncomfortable in part because it’s a guessing game of how technology can go bad in the future, played while trying to avoid shunning new tech altogether. We like new systems, achievements and digital problem solving, but we are wary of solutions that end up creating new, potentially more harmful problems than we had before. Worse, as more examples of function creep come to light, it becomes a very real concern for your customers.


Why businesses need to consider the function creep of their products

Sadly, it’s terribly easy for businesses to ignore function creep. After all, how can designers possibly account for every possible use, situation or current event? Make no mistake, however: your business ignores function creep at its own peril. Someone is going to be looking ahead, looking at your products, processes and services, and thinking of the possibilities. Ask yourself: do you want new uses to be imagined by your team, or by concerned customers, malicious parties and government regulators?
Products that become popular for the wrong reasons are rarely good for business, and expecting customers to live in fear is not the answer.

Function creep is a game of ‘what if’, and although the results might never come to fruition, you still need to account for the possibilities. Much like pharmaceutical manufacturers test and research medication to limit side effects, technology, system and data developers need a level of accountability for what they send to market. We may not be able to plan for everything, but we can think through the possibilities and ask what steps in the design might limit malicious use.


What can a business do to avoid function creep with respect to privacy?

For now, there are no definitive standards to make a product or service 100% immune to function creep. As Alfred Nobel could attest with his invention of dynamite, history is full of inventions created for one purpose and used for another. There are, however, things that can be done to lower the risk:

1. Ask the ‘what if’ questions before collecting the data. Two points deserve highlighting: what if we are required to pass the data along to other parties, through a business sale or legal pressure? And what if our product or service develops further and we can collect more, and do more, with the data; what are the brakes that will stop information misuse? Is there a way we can see this information being valuable outside the organization, and is there anything that can be done early to prevent it?

2. Talk with your data scientists. Data science is, at its heart, the profession of asking questions and looking ahead into the future. If you’re fortunate enough to have one on board, a true data scientist brings a valuable skill set to the table. Are there uses of the technology, surveillance or data processing that could harm the original user, even if you hadn’t intended them?

3. Set up a meeting and bring as many different backgrounds as you can to the table. Have your product design weighed in on by individuals with backgrounds in finance, HR, legal compliance, arts and communications. Include those who don’t get technology alongside your champions, with arts and philosophy sitting next to science and machines. Ask the ‘what if’ questions, but don’t talk solutions; those can be covered later, and you don’t want to dismiss valid concerns. Critically, have a team member who is good at moderation keep one set of voices from overshadowing the others, as the silent voice may have identified points you need to hear.

4. Avoid cover-all legal clauses in your privacy notices or policy, and be explicit about data use. Although generic and less direct policies arguably provide more room for product improvements down the road, growing controversies, including the Cambridge Analytica scandal and the Google-Mastercard deal, are making consumers and regulators look more closely at what is really going on. Privacy laws, including the GDPR and Canada’s health information acts, are increasingly cracking down on organizations that are not clear about the uses, and limitations of use, of the data they collect.

5. Think about change early. Spoiler alert: change is going to happen, whether through company growth and expansion by acquisition, new research and development, or new faces joining your team. What would happen to your data if your business were purchased by another company? What if you joined forces with a partner and had the potential to combine data for better products and new uses? How would you inform customers of this new development? Would fresh consent be in order? How would you implement opt-out features to ensure that those not interested in joining new systems are not forced to? (One rough answer is sketched after this list.)

6. Are there controls, both in the system and in business practices, that can limit function creep? Privacy by design, for example, is the ideal of implementing privacy safeguards directly into the core architecture of the product, so that it would be difficult to intrude on privacy even if the product were used by others. A similar philosophy of ‘pushing left’ exists in information security, arguing that best practices and failsafes are most effective when worked in earlier in the product design timeline, as sketched below.
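
To make points 5 and 6 concrete, here is a minimal sketch, in Python, of what a purpose-limitation ‘brake’ built into a product’s core architecture might look like. Everything here is illustrative and assumed, not any real library’s API: records are tagged with the purposes the user consented to at collection time, reads for undeclared purposes fail, and widening the purposes requires fresh, explicit consent.

```python
class PurposeViolation(Exception):
    """Raised when data is read, or repurposed, without the user's consent."""


class PurposeLimitedStore:
    """Toy data store with purpose limitation baked into its architecture."""

    def __init__(self):
        self._values = {}    # key -> stored value
        self._purposes = {}  # key -> purposes consented to at collection

    def collect(self, key, value, purposes):
        # The declared purposes are fixed at collection time.
        self._values[key] = value
        self._purposes[key] = set(purposes)

    def read(self, key, purpose):
        # The brake: no declared purpose, no data.
        if purpose not in self._purposes.get(key, set()):
            raise PurposeViolation(f"'{purpose}' was never consented to for '{key}'")
        return self._values[key]

    def add_purpose(self, key, purpose, user_consented):
        # Change happens (point 5): a new use needs fresh consent, and a
        # user who declines is simply never enrolled in the new use.
        if not user_consented:
            raise PurposeViolation(f"user declined new purpose '{purpose}'")
        self._purposes[key].add(purpose)


store = PurposeLimitedStore()
store.collect("steps", 8421, purposes={"health_coaching"})
print(store.read("steps", purpose="health_coaching"))  # allowed: 8421

try:
    # The insurer scenario: quietly repurposing fitness data for premiums.
    store.read("steps", purpose="premium_pricing")
except PurposeViolation as exc:
    print(exc)  # blocked until the user explicitly opts in
```

A real system would enforce this at the database, API gateway or policy-engine layer rather than in a single class, but the principle is the same: the earlier the purpose check lives in the architecture, the harder it is for function creep to slip in unnoticed.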


Function creep may well be impossible to eradicate: to cut it out entirely would require removing the same imaginative, problem-solving minds that ideas thrive on. However, we do technology a disservice by tossing it aside and assuming nothing can be done, without ever looking at the possibilities. We owe it to future users to pay attention to what we are developing, so that our hands are not later responsible for making a monster.
