The people that can find ~ an information, data and privacy blog
Should Health Technology Require a Medical Licence Before Going to Market?
It’s officially been one of those months. Things are stressful at work, you’re fighting with friends, and you’ve been struggling just to keep your head above water. After a long, trying day, you log into Facebook and post a message that “it’s all over”. You would never actually kill yourself, of course. You know that option is not actually on the table. But times are trying, the drive to continue is a struggle, and you know others in your community who have been there. Maybe you have been struggling with depression, and you know it’s time to talk to your doctor again.
You don’t expect the police at your door later that evening.
Sound like the opening scene of a horror film? Sadly, that scenario is now a reality, thanks in part to Facebook’s suicide risk algorithm. As you're online, posting about illness, mental struggles or general unrest, Facebook is paying attention. And when it comes to an interest in your health, the company is not alone.
When Your Health Is Interesting to Others
It should come as no surprise that many technology firms are collecting mental and physical health data. We’ve known, for example, that insurance companies trawl social media profiles for evidence of fraudulent claims. So in September 2018, when John Hancock Financial informed the market it would extend plans only to customers who opted in to sharing fitness data, it came as little surprise to privacy enthusiasts, who had been warning of the possibility for years.
Conversely, many devices and services use this as a selling point. Fitbit, for example, offers a blend of physical wellness tracking, including exercise, weight, sleep and more. Apple has been steadily slipping into the healthcare race for some time now, teaming up with the Mayo Clinic and Epic Systems for its Health app and HealthKit. Apps abound that offer tracking of physical conditions and mental off-days, inviting us to record as much detail about our ailments as possible. Technology that can keep us healthier and help us live longer promises tremendous benefits for both consumers and investors. As the market rapidly expands, however, we need to pay more attention to what is actually going on.
Technology can aid our health goals, but should it make healthcare decisions?
Not to Be Confused with Clinical Health Information Technology
Let me be immediately clear: I am not in any way intent on criticizing medical technologies, clinical data analysis tools, or applications designed for better health tracking and communication between physician and patient. On the contrary, true medical technology often has substantially better privacy and ethical practices than purely consumer products. This is accounted for in two ways: first, stricter privacy regulations are imposed on technology in the medical field. In the United States, a physician interested in a new tool must confirm its compliance with HIPAA. In Canada, individual provinces have their own health privacy legislation, which includes administrative, technical and physical safeguards for data protection.
Second, as part of this legislation, there is often a direct separation between the medical practitioner and the technology. We see the language of ‘custodian’ for the physician or medical practice, and the terms ‘service provider’, ‘agency’, ‘information manager’ and ‘business associate’ for the company licensing use of its technology. This places a direct emphasis on accountability: the service provider must be accountable to the custodian, and the custodian is responsible for ensuring the service follows applicable health privacy laws. It's a back-check. Technology that does not comply with the privacy requirements won't sell, because clinics and physicians are obligated to see evidence of compliance before they buy.
Not All Health Information Technology Is Equal to the Law
It’s dangerous, however, for consumers to believe their health data is always protected or will be kept confidential. Health privacy laws such as HIPAA often apply only when technology is under contract with health professionals. Many applications that now collect or process health data are not subject to medical codes of confidentiality, although consumer privacy laws may still apply.
To a degree, it makes sense that not all health data require physician oversight. Individuals must have agency over their own lifestyles and habits. This also, however, relaxes compliance: the custodian/provider back-check is no longer in force. Individuals have to be conscious of what data is collected, how it is processed, and to whom it might be given. Depending on the law, individuals may not have as many rights to deny access to their data, or to require explicit consent prior to processing.
When the Doctor’s Office is Full
There’s been a trend over the past decade to turn to technology companies for health advice instead of medical practitioners. According to Pew Research, by 2005 eight out of ten internet users had looked for health information online. Naturally, applications and Internet of Things devices have moved in on this trend. Now, with the right app, your phone is happy to measure your pulse, provide dietary recipes, calm your anxiety, or follow your reproductive health, no doctor’s note required.
There are reasons why individuals are looking to large technology companies for health help rather than a physician. For starters, going to the doctor costs time and money many people don’t have. This is particularly true in the United States, where the government covers very little of the cost of medical care. Finding a physician can also be a problem: depending on the community, doctors aren’t always readily available, and wait lists for specialists can be enormous. With finances and time factoring into medical care, why not let Facebook play doctor? Why shouldn’t Google supplement a medical examiner?
The problem with technology being used in place of a medical practitioner isn’t a lack of data. When it comes to information access, there’s no question that technology giants have enormous databanks of ailments, treatments and case studies. But there’s a difference between making healthy-lifestyle recommendations and making health decisions. Decisions carry responsibility.
Action and Accountability Matter
When we allow technology to act as a personal physician, we leave out a major part of medical practice: accountability. We forget that doctors and nurses go to medical school to learn more than how to heal. They are subject to practice standards, exams, licensing, regulations and re-training. Medical boards and colleges provide oversight and investigation when things go wrong. Physicians with poor habits are subject to malpractice lawsuits and hefty fines. If a healthcare provider makes a bad move, like sharing data inappropriately or overprescribing dangerous pharmaceuticals, they may lose their ability to practice medicine altogether. Under-the-table practices risk jail time. Current laws and regulations ensure that medical professionals are held accountable for their actions and advice.
The same is not true of health information technology. Health startups do not need licenses. You don't need medical board approval to develop or sell health information technology. Artificial intelligence systems don't have to pass medical school exams before practicing. Worse, decisions based on machine learning and hidden algorithmic logic are not subject to a back-check: the system never asks “what if we got it wrong?” and double-checks the data. A string of bad advice can still get the developing company sued, but no one can tell the AI it should care.
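One way to build the missing back-check into an automated system is to gate its decisions on model confidence and route uncertain cases to a licensed human reviewer. The sketch below is purely illustrative; the function name, labels and threshold are assumptions of mine, not part of any real platform's design.

```python
# Illustrative sketch of a human-in-the-loop "back-check" for an
# automated health triage system. All names and thresholds here are
# hypothetical, not taken from any real product.

def triage(prediction: str, confidence: float, threshold: float = 0.9) -> str:
    """Act only on high-confidence predictions; escalate the rest.

    Instead of acting on every model output, the system routes
    low-confidence cases to a licensed professional for review --
    the automated equivalent of asking "what if we got it wrong?"
    """
    if confidence >= threshold:
        return f"auto: {prediction}"
    return "escalate: refer to licensed clinician for review"

# A prediction the model is sure of is acted on automatically;
# an uncertain one is routed to a human reviewer.
print(triage("low risk", 0.97))
print(triage("high risk", 0.62))
```

The design choice here is that the system defaults to escalation: a human sign-off is the normal path, and automation is the exception that must earn its confidence.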
It’s one thing for information technology to collect, process and give individuals access to better data on their health. It is another thing entirely for information technology to give medical advice, establish treatment plans or act as an authority without any accredited oversight.
As Naveed Saleh, MD, wrote: “Diagnosis is an entailed process best practiced by a professional.”
Diagnosis: Kernel Panic, Hit Restart
The growing development of new health information technologies isn’t a problem. Advances in medical technology are nothing short of phenomenal: we can regrow skin, replace lost limbs without sensory deprivation, and correct fatal conditions. Using advanced technology and analytics in health data processing is a natural fit. AI and machine learning algorithms pick up details we can’t. They can run through the math at warp speed. They can help even the most seasoned medical professional narrow down the problem and pick up clues human eyes would have missed. IBM’s famed Watson is already in medical schools, and no doubt is already doing a lot of good.
But health technology isn't perfect, and the decisions made by tech should always require a second opinion: someone who can see things the tech can't, ask questions about information it doesn't have, and has the training to know when a situation should or shouldn't be escalated. Someone who understands the potential impact of their choice, and is ready to establish why the decision made is the right one.
If information technology companies want to help people stay healthy, more power to them. But if they want to make healthcare decisions, oversight from the medical community should be required. They can’t have their cake and eat it too: all of the patents and accolades with none of the regulatory framework or responsibility for diagnosis. If companies like Google or Facebook wish to make decisions based on health data, those decisions should be run past licensed professionals who are required to keep the company accountable. Facebook already has a history of lax privacy practices; the last thing we need is the company also ignoring codes of medical ethics like the Hippocratic Oath.