Improvements in the sophistication of artificial intelligence (AI) are threatening health data privacy by making it possible to identify individuals from supposedly anonymised data.

This is according to research by engineers at the University of California, Berkeley, published in the journal JAMA Network Open, which suggests that current US regulations governing the handling of health data are no longer fit for purpose.

The researchers mined two years of health data from more than 15,000 US residents. They found that by using AI, it was possible to identify specific individuals.

This was achieved by identifying the daily patterns in step data, which is collected by a host of devices including wrist-worn fitness trackers, smartwatches and conventional smartphones, and correlating these patterns with key demographic data.
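
The article does not spell out the matching procedure, but the underlying linkage idea can be sketched with a toy example. The snippet below is a minimal illustration using simulated data, not the study's actual method: it links each "anonymised" record to the named record from another source whose daily step pattern is closest, after filtering on coarse demographics. All names, sample sizes and noise levels here are invented for illustration.

```python
import numpy as np

# Hypothetical illustration of re-identification by record linkage.
# Two simulated datasets are assumed:
#   known: records from another source (e.g. a fitness app) that carry names,
#          with daily step-count vectors and coarse demographics.
#   anon:  an "anonymised" copy of the same people, names stripped and a
#          little measurement noise added to the step counts.
# Neither the data nor the matching rule comes from the study itself.

rng = np.random.default_rng(0)

def make_person(name):
    """Simulate 14 days of step counts plus simple demographics."""
    base = rng.integers(3_000, 12_000)           # habitual activity level
    steps = base + rng.normal(0, 800, size=14)   # day-to-day variation
    return {
        "name": name,
        "steps": steps,
        "age_band": rng.choice(["18-34", "35-54", "55+"]),
        "sex": rng.choice(["F", "M"]),
    }

known = [make_person(f"person_{i}") for i in range(200)]

anon = []
for p in known:
    record = dict(p)
    record.pop("name")                                      # strip the identifier
    record["steps"] = p["steps"] + rng.normal(0, 200, 14)   # measurement noise
    anon.append(record)

def reidentify(record, candidates):
    """Link an anonymised record to the closest named record whose
    demographics match, using Euclidean distance between step patterns."""
    best_name, best_dist = None, np.inf
    for cand in candidates:
        if (cand["age_band"], cand["sex"]) != (record["age_band"], record["sex"]):
            continue
        dist = np.linalg.norm(record["steps"] - cand["steps"])
        if dist < best_dist:
            best_name, best_dist = cand["name"], dist
    return best_name

correct = sum(reidentify(a, k["name"] and a, known) == k["name"] if False else
              reidentify(a, known) == k["name"] for a, k in zip(anon, known))
print(f"re-identified {correct} of {len(anon)} records")
```

Even this naive nearest-neighbour matching recovers most identities in the toy data, because day-to-day activity patterns are highly distinctive between people; the study's point is that stripping names alone does not prevent this kind of linkage.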

“We wanted to use NHANES (the National Health and Nutrition Examination Survey) to look at privacy questions because this data is representative of the diverse population in the US,” explained UC Berkeley engineer Anil Aswani, who led the study.

“The results point out a major problem. If you strip all the identifying information, it doesn’t protect you as much as you’d think. Someone else can come back and put it all back together if they have the right kind of information.”

Data sharing and the risks to health data privacy

While in the vast majority of cases such re-identification would be unlikely to happen in practice, Aswani does describe a scenario in which it could be used to infringe on an individual's privacy.

“In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two,” he explained.

“Now they would have health care data that’s matched to names, and they could either start selling advertising based on that or they could sell the data to others.”

The problem, therefore, is not how devices capture step data, but how the resulting information is handled and even sold on.

“I’m not saying we should abandon these devices. But we need to be very careful about how we are using this data,” said Aswani.

“We need to protect the information. If we can do that, it’s a net positive.”

Regulatory change needed to protect health data privacy

Aswani argues that the threat certain uses of AI pose to health data privacy means that the current US HIPAA (Health Insurance Portability and Accountability Act) regulations, which govern the privacy of health information, are no longer adequate.

“HIPAA regulations make your health care private, but they don’t cover as much as you think,” Aswani said.

“Many groups, like tech companies, are not covered by HIPAA, and only very specific pieces of information are not allowed to be shared by current HIPAA rules. There are companies buying health data. It’s supposed to be anonymous data, but their whole business model is to find a way to attach names to this data and sell it.”

Aswani argues that the risks will only increase, particularly as companies are ever-more keen to find novel uses for personal data.

“Ideally, what I’d like to see from this are new regulations or rules that protect health data,” he said.

“But there is actually a big push to even weaken the regulations right now. For instance, the rule-making group for HIPAA has requested comments on increasing data sharing.

“The risk is that if people are not aware of what’s happening, the rules we have will be weakened. And the fact is the risks of us losing control of our privacy when it comes to healthcare are actually increasing and not decreasing.”