Slideshow: Fine Print Fears

Thanks to numerous ongoing Facebook scandals making headlines, consumers are finally—maybe—starting to grasp what healthcare privacy experts have understood for some time: most people aren’t reading the fine print of privacy authorization forms. The point is driven home by the rapid mainstreaming of artificial intelligence products in both the consumer and healthcare markets, and by the popularity of mobile and wearable health-tracking devices—two industries that commonly track personal data in ways that might creep out the average person.

HIPAA grants providers and covered entities broad leeway to determine how they use patient information once the patient signs the required Notice of Privacy Practices (NPP) acknowledgment. The medical research community would have a much harder time studying new treatments and developing life-saving medications without data from the very patients it’s trying to help. However, even patients who do their due diligence during treatment may not realize that the social networks they participate in collect their healthcare data too. Pharmaceutical firms, laboratory vendors, employee wellness programs, artificial intelligence vendors, technology companies (Apple, Facebook, Uber, Google, Amazon), and other healthcare stakeholders are paying top dollar for patient data wherever it can be found—and they aren’t necessarily beholden to HIPAA regulations.

Additionally, anonymizing healthcare data, which HIPAA-covered entities and business associates must do before sharing it, has become increasingly hard to do well, experts say.

The following slideshow highlights recent examples of the thorny intersection of patient privacy, technology, and social media.

In the Washington Post article “Who owns your medical data? Most likely not you,” contributor Steven Petrow grapples with how he learned that New York City’s Memorial Sloan Kettering Cancer Center may have used data from him, his mother, and his sister during and after they sought cancer treatment there. The cancer center had reportedly given an artificial intelligence startup the ability to analyze its collection of 25 million patient tissue slides. Petrow doesn’t remember giving Sloan Kettering permission to use his own tissue when he consented to treatment—although, in retrospect, he was happy to. He acknowledged, however, that the decision to essentially donate part of himself to science is an enormous one to make just after learning of a terrifying cancer diagnosis. Upon further research, Petrow learned much more about the vast patient data industry with help from former Office of the National Coordinator for Health IT official Jodi Daniel. She told Petrow that in 49 of 50 states, patients have no ownership rights over their information. Daniel played a critical role in authoring HIPAA and acknowledges that it was written before “we were thinking about the value and use of data the way we do today.” Petrow concludes: “What I know is this: The next time I have surgery, I will ask for the consent form well ahead of time and take the time to read and understand it, in consultation with my doctor. I will strike out provisions I don’t agree to, knowing I can’t be denied treatment on that basis.”

In another reminder that, per the law, patients don’t own their own data, an investigative report from ProPublica detailed how insurance companies and physicians receive highly detailed information about the sleep quality of their patients without the patients’ full knowledge. While continuous positive airway pressure (CPAP) machines have been used to treat sleep apnea for more than 20 years, patient compliance with the devices was once monitored only by a physician via a removable computer chip. New machines, however, are equipped with modems that wirelessly transmit sleep data directly to the patient’s insurance company, physician, and usually a respiratory therapist. As a result, the insurance company can use a patient’s lack of compliance against them: if a patient doesn’t use the CPAP for a predetermined number of hours per night, the insurer can deny coverage of the machine and its necessary accessories. Reporter Eric Umansky learned this about his own machine while working on the article. “You view it as a device that is yours and is serving you,” Umansky told ProPublica. “And suddenly you realize it is a surveillance device being used by your health insurance company to limit your access to health care.”

When Facebook founder Mark Zuckerberg testified to Congress over the Cambridge Analytica scandal, many users learned for the first time how Facebook aggregates users’ information and sells it to thousands of third parties for marketing and research purposes. Unsurprisingly, around the time of Zuckerberg’s testimony, CNBC and The Verge reported on a pending (and eventually halted) Facebook partnership with numerous major hospitals to share anonymized patient data. Facebook reportedly planned to match the hospitals’ data to Facebook user profiles in order to, ostensibly, provide more comprehensive care. The project, code-named “Building 8,” sought to combine what a health system knows about its patients (such as: “Person has heart disease, is age 50, takes two medications, and made three trips to the hospital this year”) with what Facebook knows (such as: “User is age 50, married with three kids, English isn’t a primary language, actively engages with the community by sending a lot of messages”), CNBC reported. Court challenges to Facebook’s health data gathering have so far been unsuccessful. According to the Washington Post, an individual with metastatic cancer sued Facebook, saying the company violated his privacy by collecting data on his participation in cancer-related websites outside of Facebook; his case was dismissed.

Currently, companies that use patient data acquired under business associate agreements skirt regulations on repurposing it by “anonymizing” or “deidentifying” it—removing names, addresses, and other direct identifiers. But new studies are finding that it is increasingly easy to reidentify supposedly deidentified individuals by cross-referencing data publicly available on Facebook, Twitter, newspaper articles, police reports, and even movie reviews. In one study of this practice, a scientist collected from hospitals lists of injuries, accidents, and conditions treated during a specific time period. Even though the individuals treated weren’t named, the scientist was able to match them to accident reports in newspapers from the same period, according to Tech Science. In research published recently in JAMA Network Open, scientists successfully reidentified 4,720 adults and 2,427 children in a data set from the National Health and Nutrition Examination Survey by cross-referencing it, using machine learning, with wearable device data.
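To make the mechanics concrete, the short Python sketch below illustrates how such a “linkage attack” works in principle. Every record, name, and field value here is invented for illustration; the point is only that a deidentified record becomes identifiable the moment a public data set shares enough quasi-identifiers (here, ZIP code, age, sex, and date) with it.

```python
# A minimal sketch of a linkage attack: reidentifying "deidentified"
# records by joining them to public data on shared quasi-identifiers.
# All records and field names below are hypothetical.

# Deidentified hospital discharge records: names removed, but
# quasi-identifiers (ZIP code, age, sex, admission date) remain.
deidentified = [
    {"zip": "60622", "age": 34, "sex": "M", "admitted": "2018-07-14", "dx": "femur fracture"},
    {"zip": "60614", "age": 52, "sex": "F", "admitted": "2018-07-15", "dx": "concussion"},
]

# Public records (e.g., newspaper accident reports) that do carry names.
public_reports = [
    {"name": "J. Doe", "zip": "60622", "age": 34, "sex": "M", "date": "2018-07-14", "event": "cycling accident"},
]

def link(records, reports):
    """Match deidentified records to named reports on quasi-identifiers."""
    keyed = {(r["zip"], r["age"], r["sex"], r["date"]): r["name"] for r in reports}
    for rec in records:
        key = (rec["zip"], rec["age"], rec["sex"], rec["admitted"])
        if key in keyed:
            yield keyed[key], rec["dx"]

for name, diagnosis in link(deidentified, public_reports):
    print(f"{name} likely matches the record with diagnosis: {diagnosis}")
```

The studies described above apply essentially this same join at much larger scale, tolerating fuzzier matches; the more quasi-identifiers a “deidentified” data set retains, the fewer people share any given combination of them.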

At some point, every consumer who’s ever made an online purchase or searched Facebook for disease-specific support groups has felt spied on. For some, the psychological toll can be huge. Washington Post video editor Gillian Brockell wrote a viral tweet, and a subsequent article, in December 2018 about the gut-wrenching experience of continuing to see dozens of online ads for pregnancy-related products after her baby was stillborn. “Please, Tech Companies, I implore you: If your algorithms are smart enough to realize that I was pregnant, or that I’ve given birth, then surely they can be smart enough to realize that my baby died, and advertise to me accordingly—or maybe, just maybe, not at all.” Brockell readily admits that the price of doing business through Amazon baby registries and hashtagging baby showers on Instagram is acceptance of being inundated with countless product advertisements. “We never asked for the pregnancy or parenting ads to be turned on; these tech companies triggered that on their own, based on information we shared. So what I’m asking is that there be similar triggers to turn this stuff off on its own, based on information we’ve shared,” she writes.

A recent Pew survey found that 74 percent of Americans didn’t realize that Facebook maintains lists of its users’ behaviors, interests, activities, locations, and other traits that it sells to advertisers. While the company claims it does this to market more accurately to users’ interests, many find it invasive. As Vox.com wryly notes, the results of the Pew survey have been making the rounds on the internet, typically under headlines questioning how Americans could possibly still be unfamiliar with what seems like the most familiar and grating adage: “When a product is free, you’re the product.” (Wired has published instructions for reconfiguring your Facebook privacy settings.) The increased scrutiny of Facebook and other social networks may make users more cautious about what they post, or an external force such as the European Union’s General Data Protection Regulation (GDPR) could reach American shores and push companies toward more transparency. The GDPR makes it easier for users of major tech companies and social networks to find out how those companies are using their data, though that process has been known to have some kinks.

Mary Butler is the associate editor at Journal of AHIMA.

Syndicated from http://journal.ahima.org/2019/02/01/slideshow-fine-print-fears/