In September, ProPublica vetted and publicized selected findings of an international study by the German company Greenbone Networks, which found widespread and serious deficiencies all over the world in the protection of patient medical images, like MRIs, CAT scans, and mammograms, and of related patient data (identifiers like name, date of birth, and Social Security number). Very few countries got a clean bill of health, and the US was found to be particularly lax.
Greenbone has since conducted a second review. As a result of the original report, 11 countries, including Germany, the UK, Thailand, and Venezuela, took all Picture Archiving and Communication Systems (PACS) servers offline. However, the US, which Greenbone had classified as “ugly” in its “good, bad, and ugly” typology, got even worse. In the US, Greenbone identified 786 million exposed medical images, a 60% increase over its original survey, completed in September. Not only does the data include patient identifiers, but it can also include the reason for the test, ID cards, and, for members of the armed forces, military personnel IDs. This growth occurred despite corrective action by some of the companies called out in the ProPublica report. From the latest report, which we have embedded at the end of the post:
For the most part, this data isn’t merely hackable, it’s unprotected.
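“Unprotected” here means reachable from the open internet without credentials: the scans behind reports like Greenbone’s start by finding hosts that accept connections on the standard DICOM service ports, then querying them anonymously. Below is a minimal sketch of that first reachability step in plain Python (this is an illustration of the technique, not Greenbone’s actual tooling; an open port alone does not prove data is exposed, and a real audit would follow up with an unauthenticated DICOM query, which requires a DICOM library):

```python
import socket

# Standard DICOM service ports: 104 (well-known) and 11112 (IANA-registered).
DICOM_PORTS = (104, 11112)

def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, timeouts, unreachable hosts
        return False

def reachable_dicom_ports(host: str) -> list:
    """List the standard DICOM ports on which the host accepts TCP connections."""
    return [p for p in DICOM_PORTS if port_is_open(host, p)]
```

The point of the sketch is how low the bar is: nothing here involves exploiting anything, just connecting to a service that was never meant to face the internet.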
This situation is bad for patients. First is the exposure to identity theft, both for financial fraud (getting credit in your name) and medical fraud (getting medical services in your name and leaving you and your insurer with the bill). Second is the possibility of vindictive use of sensitive health information. From the ProPublica story:
“Medical records are one of the most important areas for privacy because they’re so sensitive. Medical knowledge can be used against you in malicious ways: to shame people, to blackmail people,” said Cooper Quintin, a security researcher and senior staff technologist with the Electronic Frontier Foundation, a digital-rights group.
“This is so utterly irresponsible,” he said.
And at least one US official is up in arms. From an early November press release by Senator Mark Warner:
U.S. Sen. Mark R. Warner (D-VA), Vice Chairman of the Senate Intelligence Committee and co-founder of the Senate Cybersecurity Caucus, today raised concern with the U.S. Department of Health and Human Services (HHS)’s failure to act, following a mass exposure of sensitive medical images and information by health organizations. In a letter to the HHS Director of the Office for Civil Rights, Sen. Warner identified this exposure as damaging to individual and national security, as this kind of information can be used to target individuals and to spread malware across organizations.
“I am alarmed that this is happening and that your organization, with its responsibility to protect the sensitive personal medical information of the American people, has done nothing about it,” wrote Sen. Warner. “As your agency aggressively pushes to permit a wider range of parties (including those not covered by HIPAA) to have access to the sensitive health information of American patients without traditional privacy protections attaching to that information, HHS’s inattention to this particular incident becomes even more troubling.”
“These reports indicate egregious privacy violations and represent a serious national security issue — the files may be altered, extracted, or used to spread malware across an organization,” he continued. “In their current unencrypted state, CT, MRI and other diagnostic scans on the internet could be downloaded, injected with malicious code, and re-uploaded into the medical organization’s system and, if capable of propagating, potentially spread laterally across the organization. Earlier this year, researchers demonstrated that a design flaw in the DICOM protocol could easily allow an adversary to insert malicious code into an image file like a CT scan, without being detected.”
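The DICOM flaw the letter refers to (tracked as CVE-2019-11687) comes from the file format itself: a DICOM Part 10 file begins with a 128-byte preamble whose contents the standard leaves unconstrained, followed by the 4-byte magic “DICM”. An attacker can place a Windows executable header in the preamble, producing a file that is simultaneously a valid scan and a valid program. A hedged triage sketch, assuming only the documented file layout and no particular vendor’s tooling:

```python
from pathlib import Path

def inspect_preamble(path: str) -> dict:
    """Examine a DICOM Part 10 file's 128-byte preamble and 4-byte magic.

    The DICOM standard leaves the preamble unconstrained, which is what
    permits executable/DICOM polyglot files (CVE-2019-11687). This flags
    any non-zero preamble, and in particular an "MZ" Windows-executable
    signature at offset 0.
    """
    data = Path(path).read_bytes()
    preamble, magic = data[:128], data[128:132]
    return {
        "is_dicom": magic == b"DICM",             # Part 10 magic after the preamble
        "preamble_all_zero": preamble == bytes(128),
        "looks_like_pe_polyglot": preamble[:2] == b"MZ",
    }
```

This is a heuristic, not a verdict: the preamble was deliberately left open so legacy dual-format files (e.g., TIFF/DICOM) could exist, so a non-zero preamble warrants inspection rather than automatic deletion.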
Warner also complained that HHS was giving its Good Housekeeping Seal of Approval to health care providers despite glaring security holes:
In his letter to Director Roger Severino, Sen. Warner also raised alarm with the fact that TridentUSA Health Services successfully completed an HHS Health Insurance Portability and Accountability Act (HIPAA) Security Rule compliance audit in March 2019, while patient images were actively accessible online.
I have always assumed that medical providers are on par with candy stores at data security and this report confirms my prejudices. The ProPublica story described why medical images are so commonly out in the open:
Oleg Pianykh, the director of medical analytics at Massachusetts General Hospital’s radiology department, said medical imaging software has traditionally been written with the assumption that patients’ data would be secured by the customer’s computer security systems.
But as those networks at hospitals and medical centers became more complex and connected to the internet, the responsibility for security shifted to network administrators who assumed safeguards were in place. “Suddenly, medical security has become a do-it-yourself project,” Pianykh wrote in a 2016 research paper he published in a medical journal…
The passage of HIPAA required patient information to be protected from unauthorized access. Three years later, the medical imaging industry published its first security standards.
Our reporting indicated that large hospital chains and academic medical centers did put security protections in place. Most of the cases of unprotected data we found involved independent radiologists, medical imaging centers or archiving services.
Meeting minutes from 2017 show that a working group on security learned of Pianykh’s findings and suggested meeting with him to discuss them further. That “action item” was listed for several months, but Pianykh said he never was contacted. The medical imaging alliance told ProPublica last week that the group did not meet with Pianykh because the concerns that they had were sufficiently addressed in his article. They said the committee concluded its security standards were not flawed.
Pianykh said that misses the point. It’s not a lack of standards; it’s that medical device makers don’t follow them. “Medical-data security has never been soundly built into the clinical data or devices, and is still largely theoretical and does not exist in practice,” Pianykh wrote in 2016.
Dirk Schrader, cyber resilience architect at Greenbone Networks said: “Whilst some countries have taken swift action to address the situation and have removed all accessible data from the internet, the problem of unprotected PACS systems across the globe only seems to be getting worse. In the US especially, sensitive patient information appears to be free-flowing and is a data privacy disaster waiting to happen.
“When we carried out this second review, we didn’t expect to see more data than before and certainly not to have continued access to the ones we had already identified. There certainly is some hope in the fact that a number of countries have managed to get their systems off the internet so quickly, but there is much more work to be done.”
My tiny sample may not be representative, but my experience also suggests that the software imaging centers use leads them to collect unnecessary but sensitive personal data.
Many years ago, my orthopedist wanted me to get an MRI. He was fussy and had a particular imaging center he liked (he did not have a financial interest in them). When I went to schedule an appointment, the staffer asked for my Social Security number. I refused to give it; I won’t give my SSN to any medical provider because there’s no good reason for them to have it and I assume their security is dreadful. The staffer said she couldn’t book me unless I gave it. I said I was a cash customer, I’d give them a credit card number and they could pre-authorize payment, and in any event, my insurer didn’t use SSNs as an identifier. No dice, so I did not get the MRI, since my MD was also stubborn and would not refer me to another radiology practice.
That brick wall suggested to me that the SSN was set up as a “must fill” field in their software. I’ve never had that issue with any doctor or other medical test service (note I have not tried to get an MRI since then).
Both the ProPublica story and the Greenbone report strongly urged patients to press their medical providers about their data security:
If you have had a medical imaging scan (e.g., X-ray, CT scan, MRI, ultrasound, etc.) ask the health care provider that did the scan — or your doctor — if access to your images requires a login and password. Ask your doctor if their office or the medical imaging provider to which they refer patients conducts a regular security assessment as required by HIPAA.
And from Greenbone:
For patients it is usually difficult to verify the measures taken by the chain of medical service providers they face. What they can do is to be clear about their expectations about data protection and privacy.
• Ask your doctor about their data protection regime, what they do precisely.
• In case you get the generic answer (“We do what is required by law”), demand further details, like how often they verify their IT security and data privacy posture.
• You might not get good or immediate responses, but the same question asked by many again and again will lead to an improvement.
I would also ask about their data retention policies. If you are a one-off patient and the image is not likely to have lasting value (like the chest X-ray I got as part of the Australian visa process, or to rule out a stress or hairline fracture as a cause of pain), I would press to have it deleted in a month or six months, and find out what it would take to have that happen. Even if you don’t get anywhere, it sends a message that patients are concerned about data security.