Over the past two weeks readers have pointed KrebsOnSecurity to no fewer than three different healthcare providers that failed to provide the most basic care to protect their patients’ records online. Only one of the three companies — the subject of today’s story — required users to be logged on in order to view all patient records.
A week ago I heard from Troy Mursch, an IT consultant based in Las Vegas. A big fan of proactive medical testing, Mursch said he’s been getting his various lab results reviewed annually for the past two years with the help of a company based in Frisco, Texas called True Health Diagnostics.
True Health is a privately held health services company specializing in “comprehensive testing for early detection of chronic diseases,” according to the company’s Web site.
The medical reports that True Health produces contain vast amounts of extremely personal information on patients, including indicators of genetic abnormalities as well as markers of potentially current and future diseases.
To demonstrate the flaw, Mursch logged into his account at True Health and right-clicked on the PDF file for his latest health report. He showed how the site would readily cough up someone else’s detailed health records and blood tests if he modified a single digit in the link attached to that PDF record and then refreshed the page.
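For readers curious what this class of flaw (an insecure direct object reference) looks like on the server side, here is a bare-bones sketch in Python; the framework, route and file layout are hypothetical, and this is not True Health’s actual code. The handler checks that someone is logged in, but never asks whether the requested record belongs to that someone:

# Hypothetical sketch of the vulnerable pattern, not True Health's code.
from flask import Flask, abort, send_file, session

app = Flask(__name__)

@app.route("/reports/<int:record_id>.pdf")
def get_report(record_id):
    if "user_id" not in session:   # login is checked...
        abort(401)
    # ...but ownership is not: any logged-in user can pull any report
    # simply by changing the number in the link (e.g. 1000234 -> 1000235).
    return send_file(f"reports/{record_id}.pdf")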
I alerted True Health Diagnostics immediately after verifying the flaw, and they responded by disabling the healthcare records data portal within minutes of our call. Over the weekend, True Health said it discovered and fixed the source of the problem.
“Upon discovering the potential for registered users of our patient portal to access data for individuals other than themselves, we immediately shut down the system in order to resolve any vulnerabilities,” the company said in a statement emailed to this author. “True Health has total confidence that all patient records are fully secure at this time. We regret this situation and any harm it may have caused.”
The statement said True Health CEO Chris Grottenthaler has ordered an immediate investigation to determine which files, if any, were improperly accessed.
“It will be thorough, speedy and transparent,” the statement concludes. “Nothing is more important to us than the trust that doctors and patients put in our company.”
The company says it is still investigating how long this vulnerability may have existed. But Mursch said it appears True Health assigned his healthcare record a number issued as part of a numerical sequence, and that the difference between the record numbers attached to a result he received recently and a set of test results produced two years ago indicates at least two million records may have been exposed in between.
“I would assume all patient records were exposed,” Mursch wrote in an email.
Alex Holden, founder of cybersecurity consultancy Hold Security, said he’s responded to a number of inquiries of late regarding clients who inadvertently published patient data online with little or no authentication needed to view sensitive health records.
Holden said he advises clients to add security components to their links to encrypt any portion of the link that contains data so that it can’t be easily reversed or manipulated. He also tells clients not to use sequential account numbers that can be discovered by simply increasing or decreasing an existing account number by a single digit.
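A minimal sketch of both ideas, using only Python’s standard library, follows; the names are illustrative and this is not Hold Security’s tooling. It uses HMAC signing rather than encryption, which stops tampering even though it does not hide the value: expose a random, non-sequential token in links instead of the database record number, and sign anything that must travel in the link so any alteration can be detected.

# Illustrative sketch of both suggestions; names are hypothetical.
import hashlib
import hmac
import secrets

SERVER_SECRET = b"kept-on-the-server-never-in-a-link"

def new_record_token() -> str:
    """A random, non-sequential identifier to put in links instead of row IDs."""
    return secrets.token_urlsafe(16)   # ~22 URL-safe characters, not guessable

def sign(value: str) -> str:
    """Append an HMAC so an altered link parameter is rejected server-side."""
    mac = hmac.new(SERVER_SECRET, value.encode(), hashlib.sha256).hexdigest()
    return f"{value}.{mac}"

def verify(signed: str) -> str | None:
    """Return the original value only if the signature still matches."""
    value, _, mac = signed.rpartition(".")
    expected = hmac.new(SERVER_SECRET, value.encode(), hashlib.sha256).hexdigest()
    return value if hmac.compare_digest(mac, expected) else None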
“A lot of times the medical records are stored sequentially as PDF files and they all just sit in the same folder that patients can access with a Web browser,” Holden said. “And in many cases they are not even protected by a username and password.”
Finally, Holden said, companies should ensure their servers are always checking to see if the user is logged in before displaying records, and also checking to ensure the user has the right to view the requested record.
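In code terms, that means the hypothetical handler sketched earlier only becomes safe once it performs both checks before the file ever leaves the server; again, the names are illustrative and this is nobody’s production code:

# Sketch of the corrected handler: authentication AND authorization.
from flask import Flask, abort, send_file, session

app = Flask(__name__)

def record_owner(record_id):
    """Hypothetical lookup (e.g. a database query) of which patient owns a report."""
    raise NotImplementedError

@app.route("/reports/<int:record_id>.pdf")
def get_report(record_id):
    user_id = session.get("user_id")
    if user_id is None:                      # is anyone logged in at all?
        abort(401)
    if record_owner(record_id) != user_id:   # does this user own this record?
        abort(403)
    return send_file(f"reports/{record_id}.pdf")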
According to Verizon‘s newly released Data Breach Investigations Report (DBIR), 15 percent of breaches now involve healthcare organizations. The 2017 DBIR found that healthcare organizations were tied with the retail and accommodations sector as the second-largest source of data breach victims last year.
Author’s note: After a year in which virtually every major hotel chain suffered at least one major breach, Verizon’s finding should be a sobering one for everyone concerned about the security of their healthcare records.
As I said at the outset of this story, I’m amazed that companies with so much on the line routinely fail to do basic checks to ensure customers or users can’t see beyond their own data.
Not long after I started this blog in 2010 I encountered a similar vulnerability after being approached by a security company that helps Web sites defend against online attacks. The vendor in this case offered to protect my site for free, but I was ultimately unwilling to give them any kind of control over my site.
However, the vendor did give me an account on their protection portal that was reserved for my site, and after logging in I could see my site’s sample “dashboard” even though it hadn’t been officially set up yet. Then I had a look at the URL in my browser’s address field and noticed that it ended with a long string of numbers. On a whim, I changed the last digit in that long number to something else, refreshed the page and in an instant I was captain of some other site’s dashboard.
Needless to say, I notified them about that flaw and they fixed it quickly at the time, but that was the end of our trial for me.
Stay tuned for another installment in this series on how not to store medical records online.
Good informative article
Sounds like a HIPAA violation to me.
understatement of the millennium… 🙂
I am into Zen man.
HHS should be prosecuting people at this company for improper disclosure of medical records under HIPAA. In addition, all those affected should immediately file breach of privacy lawsuits. Finally, all attending Physicians should face stern discipline for breach of doctor-patient confidentiality.
But I doubt much will happen. At best, they’ll just throw some poor unqualified and underpaid IT immigrant with an H-1B visa under the bus. In the end, it’s the greedy Trial Lawyers that’ll come out ahead.
And offer the obligatory one year of credit monitoring…maybe…
“In the end, it’s the greedy Trial Lawyers that’ll come out ahead.”
They always do!
But the company is in Texas, therefore they can do no wrong! Next case, please. Maybe it’ll be in California and therefore prosecutable.
[/sarcasm]
Drone, you state that “HHS should be prosecuting people” and “all those affected should immediately file breach of privacy lawsuits.”
HIPAA doesn’t have a private right of action, and criminal penalties would require proving intent.
But hey, good luck with that.
Does not HIPAA punish negligence?
HIPAA does not apply to companies that do not take Medicare or Medicaid.
They should still follow the regulations, but they’re not required to.
HIPAA does apply to lab services as long as they transmit billing information electronically. I suspect that OCR will be launching an investigation and we will see a corrective action plan and settlement agreement come out. Unfortunately for affected patients, their only course of action at this point is probably to report a complaint with OCR but they would not see any of that settlement money. Unless of course they can demonstrate damages, they may be able to sue through the State Attorney General.
Who were the other two (2) health web sites?
With the advent of the General Data Protection Regulation (GDPR) in May 2018, if these companies process medical data for EU citizens they will be in scope, even though they have no physical EU presence.
The maximum fine for a breach is 4% of GLOBAL turnover, and whilst it’s considered unlikely that this level would be reached for most breaches, the regulators have indicated that it’s possible in cases demonstrating a flagrant disregard for the privacy rights of EU citizens. I’d suggest that failing to require any authentication to access medical records would easily cross that threshold.
We also know they are itching to try out their new powers!!
USA Big Lawyers will not allow real action to take place. Look at Anthem’s 18m records disclosed… nothing real happened.
Funny stuff.
Brian – The same type of vulnerability exists for PEO companies, which also maintain employees’ sensitive personal data (such as W-2s) and, at least in my cursory investigation, often use the same out-of-the-box login applications which do not offer acceptable login security unless configured correctly. Maybe you could do a piece on that next….
The last PEO company I worked with inadvertently exposed the PDF pay stubs of every employee to me. I told them about it.
Lucky for them, I had no axe to grind. Otherwise I could have started a company revolt by telling my co-workers who was getting paid more or less than they were. There were some surprises. I can attest to the fact that women get paid less than men for the same work. I’m a guy, but I felt bad for the women I worked with who were getting paid less for the same work.
Great article, Brian! (As always).
Something doesn’t seem right about how this vulnerability was revealed. Did Troy Mursch try contacting True Health before disclosing this vulnerability to the press (e.g. you), and if he didn’t, why? Per the article, True Health seems to take compromised patient data rather seriously, so I’m struggling to believe Troy would’ve been brushed off by True Health if he approached them first. Responsible disclosure dictates that the company gets first notice of a flaw and, unless the company is tardy about fixing it, everyone else (including the press) gets the details after it’s fixed.
I’m not saying you did the wrong thing–IMO, you disclosed and reported on the vuln responsibly. Unless I’m missing something, though, your source seemed to go about disclosure the wrong way. Is that something you can clarify?
Why did the researcher go about it the wrong way? A lot of people aren’t exactly jumping at the opportunity to be mistreated, disrespected and lied to by companies, which is what often happens when people report security flaws. The affected organization gets cagey, starts circling the lawyers, making threatening statements, etc.
Your statement confuses me because it is an accepted method these days to notify via a third party for all those reasons, whether that third party be a security firm, a vulnerability research company, or a journalist. It’s not clear to me why it matters who reports it, as long as it’s done without unnecessarily endangering users and the issue gets fixed quickly.
Professor Plum asked a simple and intelligent question which was answered defensively. You are facilitating and being a mouthpiece for bug hunters who then try to extort money from companies in the form of a bug bounty. Then, as a threat and in desperation, they will go public with it unless they are compensated by (read: extort money from) the company. Everybody in this chain of events should be ashamed, you included, for enabling this to happen so you can make a few bucks off your blog and lame adverts. From what I understand you made it to about a sysadmin level and are trying to rationalize HIPAA, responsible disclosure and healthcare legal in your response… sad!
Extortion? Really? Mr. Mursch reached out to me because he did not know how to proceed. Vulnerability notification can be tricky to navigate, and I applaud him for his discretion in asking for advice. I offered to alert the company because he seemed reticent about doing it himself (something about how the company’s data was acquired from a previous company gave him pause, I think; see: https://www.wsj.com/articles/health-diagnostic-laboratory-patients-face-bills-years-after-blood-work-1463428248).
As far as I know, the first the company heard about it was from me, and to my knowledge Mr. Mursch didn’t have any direct contact with the company before they fixed this problem, so your conspiracy theory is pretty weak.
yes, extortion, self-promotion, narcissism!
You didn’t really answer the question, but let me help you here. You have another “installment,” meaning you’re basically going to DOX another company that may have similar issues? The vulnerability may be there, but if there was no threat, it’s not a problem yet, until you expose it. Then you have guys like you and your “Bug Hunters” who come along and, because a company doesn’t capitulate, you write a “blog” post about it as if you are a journalist or have an “exposé.” You and this blog are absolutely ridiculous! Security is much more than what you write about, because you borderline become part of the problem!
Thanks for the reply, Brian. I worded my initial comment too strongly; there are definitely times third parties are needed in responsible disclosure, and I understand it’s how vulnerability disclosure is often done. I was just impressed at how well True Health seemed to address the situation (they accepted responsibility, promptly disabled the system, etc.) and I thought it was a pity that they didn’t have a chance to address the issue in a more private setting.
Prof. Plum wrote:
> True Health seems to take compromised patient data rather seriously
With respect, this is untrue. If that company cared about patient confidentiality, they would have done some white-hat intrusion scans on their patient portal, and the guessability of patient record numbers would have jumped out at them.
The CEO takes it seriously, sure. I suspect they’re covered by the HIPAA / ARRA patient confidentiality laws. Those laws pierce the corporate veil and render individual employees personally liable for breaches.
This kind of vulnerability has been well known for at least a decade.
Ollie Jones,
That’s why I said “compromised” patient data; I was discussing their incident response, not their incident prevention. This vulnerability, to put it nicely, was pathetic.
See my response to Brian above.
The CEO takes the potential PR backlash and momentary consequences seriously. Clearly no one at the firm takes security seriously or even knows anything about it.
From the dev to the QA to the program manager to anyone working there who is “dogfooding” or does a five-minute hallway test, vulnerabilities like this are obvious, simple to identify, and even simpler to avoid as a developer. I mean, how negligent is it for a developer not to realize there are no authorization checks being performed on the GET for these PDFs? It’s extraordinarily obvious, because either (one) you have a conditional at the very top of the function being routed to, or (two) you have a decorator above the route itself indicating the endpoint is authenticated. In either case we’re talking about extremely easy code to comprehend, and code that stands out pretty easily. It’s not like a bug buried in a function full of complicated logic; I’m talking about something like [Authenticated] sitting right above the GetPdf route name, or “if (user.isAuthenticated)” as the first statement within the function that is routed to.
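For what it’s worth, here is roughly what that one-line guard looks like in a decorator-based framework; a hypothetical Python sketch, not this vendor’s actual stack or code:

# Hypothetical sketch: a one-line guard above a route, the kind of check that was missing.
from functools import wraps
from flask import abort, session

def login_required(view):
    @wraps(view)
    def wrapped(*args, **kwargs):
        if "user_id" not in session:   # reject anonymous requests outright
            abort(401)
        return view(*args, **kwargs)
    return wrapped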
I’m not shocked to see this level of negligence but to describe it as anything else (like they care but don’t know better) is abhorrent and dishonest.
If you are a HIPAA entity and you have a breach, you’re required to report it to CMS — the Centers for Medicare and Medicaid Services.
If more than 500 patients’ data was leaked, you get your name in lights here.
https://ocrportal.hhs.gov/ocr/breach/breach_report.jsf
It’s a bit frightening.
I don’t agree these companies have “so much on the line.” What is the financial incentive for them to care at all? There’s not much evidence of lost business due to lackadaisical security in healthcare, and government regulators certainly aren’t pressing the point. Until there is a real financial incentive nothing is going to change.
Ask what D. Trump’s opinion is?
“What is the financial incentive for them to care at all?” The OCR imposes fines for many breaches — probably not enough, but they have limited resources to investigate. See the link to the wall of shame someone posted above.
Those resources to investigate will likely become less under this administration.
Those folks in Collin County (where Frisco is located) seem to make a habit of playing fast and loose where other people’s information and money are concerned, with the current Texas Attorney General (Ken Paxton) a home-grown poster boy who’s been fighting his indictment on multiple federal and state securities fraud charges since he was elected to office.
But at least “JimV” had an opportunity to publicly parade the Leftist insanity that seems increasingly prevalent these days (re: the personal attacks on “poster boy” Ken Paxton) while COMPLETELY dodging the entire content of Krebs’ post.
You clearly haven’t followed the deeply-rooted corruption which emanates from Collin County for decades, as I have.
I noticed the same type of issue when I registered my daughter for a youth conference at a church a couple of years back. There was a number attached to the URL confirming my daughter’s registration, and I could change it up or down and refresh the page. One by one, I could see the name of every student registered and his or her home address and phone number. I notified the church about what I could see, and they replied they would look into it, but it hadn’t changed when the conference came around the next year.
This is vaguely reminiscent of the observation that changing the string in the last component of most shortened urls (like tinyurl and its descendants) will often take you to a random site which is not one you are supposed to be allowed to see.
I wouldn’t assume you’re not supposed to be able to see those sites. Keep in mind how those utilities work: you give them a valid URL, they produce a short hash for it and return it to you, and when a user clicks that short URL it redirects them to the destination you entered at creation.
If, for example, I did that for any of my corporate resources it wouldn’t matter: you’d get the login screen, and it’s not a big deal that you got to the site in the first place, because it’s on the open internet and any of the billion-plus users in the world could. TinyURL does not attempt or promise to offer any security; it’s a URL shortener. It’s the destination that’s responsible for security, and assuming “you’re not supposed to be able to see” their site is probably wrong in the vast majority of cases. Also remember the old adage “obscurity is not security” – TinyURL is aware of this; they don’t claim there is any security there.
I’ve been in the information security field for over 20 years, and I’ve seen these issues time and time again. The real point is that when someone places information on the web, they should conduct risk assessments and remediate findings. URL modification attacks date back to the 90’s and a vulnerability scanner would find the issue. There are standard methods for remediation.
For those looking for the HIPAA requirements, see 45 CFR 164.308(a)(1)(ii) on risk analysis and risk management. I know Texas also has additional laws protecting health information, so there might be a state-level issue.
I applaud the vendor’s public reaction. Hopefully they have logs that will show if this attack has occurred before, and who was impacted. As for all the comments about lawsuits, let the investigation complete before people talk about suing.
Also, before people go too far with the thought that the vendor doesn’t care: most of us care. The issue with information security is that many companies don’t know what to do. Getting their records online is a great feat, and one they probably celebrated. Now that good deed is causing them to face fines and possibly years of federal oversight. This will certainly raise their cost of doing business and, consequently, the cost of healthcare.
So, all documents on the website can be called up just from the code in the link?
Sigh.
This is a KNOWN problem! It was known over 10 years ago! This is the kind of design that shows up on “The Daily WTF”, which highlights the worst software designs in the world:
http://thedailywtf.com/articles/Oklahoma-Leaks-Tens-of-Thousands-of-Social-Security-Numbers,-Other-Sensitive-Data
Hey, not only do you get to see who’s on the sex offender registry, you get to see *everyone* who’s in the state prison records, including the guards and employees! And their Social Security numbers.
You’d think that people would avoid copying something that was already a famously bad idea in 2008. And if you’ve never heard of Little Bobby Tables, you really shouldn’t be working on important systems.
After reading articles such as the one about the dental researcher who faced prosecution after discovering patient data on a publicly available FTP server, AFTER responsible disclosure to the software company, I too will continue to use a third party to initiate contact with a company.
https://www.dailydot.com/layer8/justin-shafer-fbi-raid/
But wait, there’s more: http://www.smh.com.au/technology/security/super-bad-first-state-set-police-on-man-who-showed-them-how--770000-accounts-could-be-ripped-off-20111018-1lvx1.html. Responses like that lead people either not to report flaws at all (safe for them, unsafe for everyone else), or to use a method where the reporter doesn’t have to risk the usual over-the-top threatening response, like going through the media or a security company or similar.
“According to Verizon’s newly released Data Breach Investigations Report (DBIR), 15 percent of breaches now involve healthcare organizations.”
What are the other 85% ?
Great article!
We’re so shocked! Gaining extremely personal information on patients by changing a single digit in the link… That’s a shame.
This bit made me laugh:
“Finally, Holden said, companies should ensure their servers are always checking to see if the user is logged in before displaying records, and also checking to ensure the user has the right to view the requested record.”
Ya think…? That statement seems dryly sarcastic, like recommending that professional carpenters don’t try to drive nails with the claw end of a hammer. That’s pretty much web app development 101 and, depending on the system/language being used and coding style, can be accomplished in about 4 lines:
if($logged_in_user_id == $record_user_id)
{ [output code goes here] }
else
{ [error message or other error-handling code goes here] }
(And that’s the quick-and-dirty, inline version – with most non-trivial applications, any developer who knows what they’re doing would probably put that into a function or at least an include)
If Holden is at least moderately competent as a security consultant, I assume he immediately follows that statement by saying “But if you don’t already know this, then you’re not even remotely qualified to be developing any public-facing web application.”
In case you all haven’t figured out a potentially large implication here, I’ll put it in plain terms: many of these companies hire large outsource firms to develop software of this type because the in-house staff don’t have that skill, and the outsource firm has done it many times. The problem lies in the fact that the outsource firm saves money by hiring low-quality developers who are not familiar with risks and threats such as this. The result is not just one bad website, but literally hundreds, as these outsource companies go around reselling much of their code to other similar customers. I have personally seen this, where a very large, presumably reputable firm created garbage code and sold it to dozens or hundreds of customers. However, maybe only one customer discovered the flaw, while dozens or hundreds of others are just sitting there waiting to be hacked. Someone needs to figure out who developed the code, and see if they developed similar code for others that might be flawed.
I don’t do this regularly. Thank you for your article!
It is exceptionally composed and makes some excellent points.
Subscribed.