Computer and software giant HP is in the process of notifying customers about a seemingly harmless security incident from 2010 that nevertheless could prove expensive for the company to fix and could present unique support problems for users of its older products.
Earlier this week, HP quietly produced several client advisories stating that on Oct. 21, 2014 it plans to revoke a digital certificate the company previously used to cryptographically sign software components that ship with many of its older products. HP said it was taking this step out of an abundance of caution because it discovered that the certificate had mistakenly been used to sign malicious software way back in May 2010.
Code-signing is a practice intended to give computer users and network administrators additional confidence about the integrity and security of a file or program. Consequently, private digital certificates that major software vendors use to sign code are highly prized by attackers, because they allow those attackers to better disguise malware as legitimate software.
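For readers unfamiliar with the mechanics, code signing boils down to hash-then-sign: the vendor hashes the file and signs the hash with a private key, and anyone with the public key can verify it. The following is a toy sketch only — it uses tiny textbook RSA parameters purely for illustration, whereas real code signing (e.g. Authenticode) uses 2048-bit+ keys, padding schemes, and certificate chains:

```python
import hashlib

# Toy RSA parameters -- insecure, for illustration only.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (kept secret)

def sign(data: bytes) -> int:
    # Hash the file, then "sign" the (truncated) hash with the private key.
    h = int.from_bytes(hashlib.sha256(data).digest(), "big") % n
    return pow(h, d, n)

def verify(data: bytes, sig: int) -> bool:
    # Anyone holding the public key (n, e) can check the signature.
    h = int.from_bytes(hashlib.sha256(data).digest(), "big") % n
    return pow(sig, e, n) == h

pkg = b"example driver installer contents"
s = sign(pkg)
print(verify(pkg, s))                 # True: signature matches
print(verify(pkg + b" tampered", s))  # False: any change breaks it
```

This is also why a stolen (or misused) signing key is so valuable to attackers: the verification step cannot distinguish a legitimately signed file from malware signed with the same key.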
For example, the infamous Stuxnet malware — apparently created as a state-sponsored project to delay Iran’s nuclear ambitions — contained several components that were digitally signed with certificates that had been stolen from well-known companies. In previous cases where a company’s private digital certificates have been used to sign malware, the incidents were preceded by highly targeted attacks aimed at stealing the certificates. In Feb. 2013, whitelisting software provider Bit9 discovered that digital certificates stolen from a developer’s system had been used to sign malware that was sent to several customers who used the company’s software.
But according to HP’s Global Chief Information Security Officer Brett Wahlin, nothing quite so sexy or dramatic was involved in HP’s decision to revoke this particular certificate. Wahlin said HP was recently alerted by Symantec about a curious, four-year-old trojan horse program that appeared to have been signed with one of HP’s private certificates and found on a server outside of HP’s network. Further investigation traced the problem back to a malware infection on an HP developer’s computer.
HP investigators believe the trojan on the developer’s PC renamed itself to mimic one of the file names the company typically uses in its software testing, and that the malicious file was inadvertently included in a software package that was later signed with the company’s digital certificate. The company believes the malware got off of HP’s internal network because it contained a mechanism designed to transfer a copy of the file back to its point of origin.
Wahlin stressed that the software package in question was never included in software that was shipped to customers or put into production. Further, he said, there is no evidence that any of HP’s private certs were stolen.
“When people hear this, many will automatically assume we had some sort of compromise within our code signing infrastructure, and that is not the case,” he said. “We can show that we’ve never had a breach on our [certificate authority] and that our code-signing infrastructure is 100 percent intact.”
Even if the security concerns from this incident are minimal, the revocation of this certificate is likely to create support issues for some customers. The certificate in question expired several years ago, and so it cannot be used to digitally sign new files. But according to HP, it was used to sign a huge swath of HP software — including crucial hardware and software drivers, and other components that interact in fundamental ways with the Microsoft Windows operating system.
Thus, revoking the certificate means that HP must re-sign software that is already in use. Wahlin said most customers impacted by this change will merely encounter warnings from Windows if they try to reinstall certain drivers from original installation media, for example. But a key unknown at this point is how this move will affect HP computers that have built-in “recovery partitions” — small sections at the beginning of the computer’s hard drive that can be used to restore the system to its original, factory-shipped software configuration.
“The interesting thing that pops up here — and even Microsoft doesn’t know the answer to this — is what happens to systems with the restore partition, if they need to be restored,” Wahlin said. “Our PC group is working through trying to create solutions to help customers if that actually becomes a real-world scenario, but in the end that’s something we can’t test in a lab environment until that certificate is officially revoked by Verisign on October 21.”
“what happens to systems with the restore partition, if they need to be restored … that’s something we can’t test in a lab environment”
Chances are nothing will happen, but it can be set up in a lab. Many systems fail to check the revocation status of a previously valid cert, including Android’s Chrome browser. Until Heartbleed, people asked why a valid signed cert would ever be revoked. HAHA
No doubt it can be set up in a lab. Any claim to the contrary is bs.
Translate this to “it would take effort and time and money to test it in a lab, so we haven’t done that” (or perhaps to “when we tested it in the lab it was so bad we don’t want to talk about it”?).
Or it could be taken as “does anyone have some old systems we can borrow for a couple days?”
PC makers don’t hold onto old stock indefinitely, they may very well not even have the PCs available to test with.
Think of all the tens (even hundreds) of thousands of different PC models that HP produces every year, they’d end up needing a huge warehouse to hold onto a single one of each after a couple years.
Having worked at a major computer manufacturer myself I’d be very surprised if they did not have a bunch of old systems around. For one thing, they need all their supported products for reproducing and diagnosing problems so somewhere there’s a lab with systems for testing. In fact this is exactly the testing their support lab will need to do.
For another thing, I doubt they refresh the desktop configurations for all their employees every time there’s a new product introduction. More likely they are on a multi year refresh cycle and still have lots of old units in use.
Holy smokes, I saw this on a rebuild engagement several years ago! We reloaded several times from media supplied by the customer and saw the cert error. Ended up using new media, but the drivers kept failing. Wonder if this was the issue.
Please plan a tour stop in Denver. The Denver/Boulder area has a large workforce in the IT, security, finance, and energy businesses, in addition to many others who value your chutzpah and knowledge.
(wonderful skiing here)
Wait a minute…
> …and that the malicious file was inadvertently included in a
> software package that was later signed with the company’s digital
> certificate.
A developer workstation was infected by malware. That developer inadvertently included a copy of that malware into a piece of software. It passed all HP’s QA testing and HP shipped it. By now it’s been out there in the wild for how many years? And HP signed it so the rest of the world who trusted HP thought it was clean? So now, millions of HP computers apparently have a copy of this malware? Where’s the outcry? Does anyone sense a quality control problem here? What software was compromised?
I feel a class action lawsuit coming on.
“… HP was recently alerted by Symantec about a curious, four-year-old trojan horse …”
Seriously? It took ’em 4 years to identify this? Wow! Talk about incompetence all around.
“Seriously? It took ’em 4 years to identify this? Wow! Talk about incompetence all around.”
Incompetence includes underestimating the adversary as you are doing.
Consider the possibility that the trojan was a single use exploit for a highly targeted attack, with best-in-class C2 by a nation-state actor, and the exploit to get it signed was similarly a single-use compromise that did its best to erase the footprints on the way out. How long would detection of that trojan take?
Add in the time to analyze, determine what happened, take care of cleaning up as appropriate before disclosure. How quickly would you handle those?
There’s a lot not being disclosed about this, but it’s not necessarily incompetence. It might just be that the adversary is more competent than you should know.
Indeed, when you stop and think about how it works, it’s amazing that antivirus even works at all. It’s based on viruses being released in the wild, being isolated by random people in the field, submitted to AV companies for analysis, then definitions are created to catch identical files in the field. And those are just the simple infections… some botnets include (or quickly grow to include) dozens of components, each of which plays their part in the infection.
AV is always behind the game. And all it takes is someone specifically targeting you with a unique piece of malware to literally never get caught by AV, because it was never caught in the wild for AV companies to analyze.
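The signature model the comment above describes can be sketched in a few lines. This is a deliberately simplified stand-in (real AV engines also use heuristics and behavioral analysis), but it shows why a unique, targeted sample sails past hash-based detection:

```python
import hashlib

# A hash-based "signature database" -- the simplest form of AV detection:
# hashes of malware samples already caught and analyzed in the wild.
known_bad = {
    hashlib.sha256(b"known-trojan-sample").hexdigest(),
}

def scan(file_bytes: bytes) -> bool:
    """Return True if the file matches a known-malware signature."""
    return hashlib.sha256(file_bytes).hexdigest() in known_bad

print(scan(b"known-trojan-sample"))   # True:  seen before, detected
print(scan(b"known-trojan-sample!"))  # False: one byte changed, missed
```

A single changed byte produces a completely different hash, so a one-off sample built for a single target never matches any signature until someone captures and submits it.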
It’s why I keep telling kids to never, ever download software from internet sharing sites. You really don’t know WHO was the last one to modify the files, and then you run them with full administrative privileges, so they can do absolutely anything to your system.
Yeah, something is not right here. Typically, coders require elevated privileges. If they had a trojan on their box, and IF they had some sort of malware protection on that box, it should have alerted them to the issue long before.
They didn’t say it was a rootkit, stealthy malware, or other persistent threat, so one has to assume — if they are telling the truth — that this should have been detected by HP a long time ago if they were following good security measures.
Things change over the years, and thus, this issue probably was eventually corrected when the programmer got a new workstation, updated software or migrated/had the machine migrated.
A private key can be used to impersonate the company or gain access to confidential data that it may protect. One does not know the true role of that cert. HP is intertwined with a lot of other companies and agencies that work together on projects. I wonder if this cert could have been used to give an outsider a peek into the confidential handshakes going on between these companies.
Four years in IT is a LONG time. A lot of things change. As much as HP tries to become one of the leaders in the smashing of malware, and has its best front-line defenses up, an outside security company had to tell them they are jacked up — which tells me they aren’t practicing what they preach inside their own walls. It only takes one weak link for something like this to happen. Now you can see why IA/CND people stock up on Tums and Aleve at work, and on something to aid in sleeping at night.
The coder may have had elevated privileges and therefore disabled any anti-malware/anti-virus/security software on his workstation. We coders are tricky like that.
Obvious IA Eng does IA not CND… 🙂
“…and IF they had some sort of malware protection on that box, it should have alerted them to the issue long before. ”
Not if the malicious code were previously tested to ensure it would not be detected.
This is not the doing of a script kiddie; it’s much more likely to be nation-state actors. Think in terms of a resourceful and determined adversary whose resources are based on the world’s largest economy (China, per http://www.businessinsider.com/china-overtakes-us-as-worlds-largest-economy-2014-10).
“They didn’t say it was a rootkit, stealthy malware or other persistent threat, so one has to assume – if they are telling the truth – that this should have been detected by HP long time ago if they were following good security measures.”
Just because they didn’t say it was APT does not mean it wasn’t. There is no reason to believe they are telling the entire truth, especially when it would be dumb for them to do so. It’s even dumber to assume that this should have been detected. That underestimates both HP and the adversary, based on scant information looking on from afar.
The irony here is that someone working for Symantec is likely to have been using Symantec products to protect his system, and no malware author is going to release his product into the wild until he’s implemented something to hide from Symantec.
Being a huge public company focused on security products means your products are well known to the very people you’re attempting to catch.
“Even if the security concerns from this incident are minimal,…”
But they aren’t minimal. Symantec told HP it was “found on a server outside of HP’s network.”
HP’s CISO “stressed that the software package in question was never included in software that was shipped to customers or put into production.”
That means this signed malware was EXFILTRATED FROM HP AND DEPLOYED TO A VICTIM by the responsible adversary – four years before it was uncovered.
BTW, happy Cybersecurity Awareness Month, everyone!
I’ve googled around but don’t see any alerts– is the average user of an HP/Compaq branded computer at risk here?
Beginning about five years ago, I was repairing infections coming from the automatic updater HP installs with printers (HP Update). I had even isolated what I thought was the infected source. Trying to open it with a hex editor in Linux shut the machine down. VirusTotal found nothing, and so on.
I eventually gave up trying to find someone that knew anything about it and have seen fewer of the infections over the past two years.
I’ve also found infections riding in on an updater that nVidia installs. Now I routinely uninstall ANY automatic updater.
But I never could understand how they first detect the update connection so they can piggyback on it. (Assuming that the actual update was not infected — if it were, then the viral code would be inside HP, being distributed by the updater.)
But I remembered that some years ago, Cisco suffered a breach giving up their source code. Could the hackers be hacking routers to watch for the HP update connections? Then using some redirect and this cert, just shove the malware into the machine?
And speaking of rootkits: we NEVER repair a computer from the recovery partition. We’ve found that about 40% of the rootkit infections we see have infected the MBR. We then zero-fill the HD and install a clean OS — not from the manufacturer’s recovery disks.
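The MBR lives in the first 512-byte sector of the disk: 446 bytes of boot code, a 64-byte partition table, and a two-byte 0x55AA boot signature. A minimal sketch of triaging a suspect sector (the `/dev/sda` path is illustrative; in practice you would read a forensic image, not the live disk):

```python
import hashlib

def mbr_summary(first_sector: bytes) -> dict:
    """Summarize a 512-byte MBR sector for comparison against a known-clean image."""
    assert len(first_sector) == 512
    return {
        # Offset 510-511 must hold the 0x55AA boot signature.
        "valid_boot_signature": first_sector[510:512] == b"\x55\xaa",
        # Hash of the boot-code region (bytes 0-445); MBR rootkits
        # overwrite this, so the hash will differ from a clean baseline.
        "boot_code_sha256": hashlib.sha256(first_sector[:446]).hexdigest(),
    }

# Usage against a raw disk image (reading /dev/sda directly needs root):
# with open("disk.img", "rb") as img:
#     print(mbr_summary(img.read(512)))
clean = bytes(446) + bytes(64) + b"\x55\xaa"
print(mbr_summary(clean)["valid_boot_signature"])  # True
```

Comparing the boot-code hash against a baseline from clean install media is a quick first check, though a sophisticated bootkit can still hide from a live system — hence the zero-fill approach above.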
There is an aspect to this that I don’t see covered (I could have missed it, too). Revoking a certificate is only useful if your systems are configured to enforce said revocation, or even check the CRL at all. I believe it is the State value under HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\WinTrust\Trust Providers\Software Publishing that controls this. So, if it’s not configured correctly, the revocation could be useless. Hopefully the nearly useless AV products people have will actually create a signature for it.
There’s a detail here that looks worrying.
“according to HP, it was used to sign a huge swath of HP software — including crucial hardware and software drivers, and other components that interact in fundamental ways with the Microsoft Windows operating system.”
On 64-bit versions of Windows, device drivers must be signed, and the signatures are checked at each boot time. I don’t know if certificate revocations are applied to those checks, but if they are, machines with device drivers signed with this certificate are going to start complaining about their drivers in ways that the typical user won’t understand at all.
The PDFs linked above don’t obviously mention device drivers, but I’m not very familiar with HP’s product range, having refrained from buying stuff from them in recent years.
The part that concerns me is that a regular developer was able to code sign an arbitrary binary with a publicly-trusted certificate.
It’s unclear whether the certificate was installed on the developer’s workstation, or whether the binary was uploaded to some internal system that does the signing. Either way, it’s inherently unsafe for a professional software company to allow arbitrary binary files to be signed.
A better solution here would be to have a dedicated build machine that operated in a trusted environment with limited access. That machine would grab the code from source control, build it, sign the binaries, then package the code for testing or release. It doesn’t protect from any issues where the malicious code makes it into source control, but it does reduce the attack surface and improve traceability.
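The gate described above can be sketched as a signing service that refuses anything it didn’t build itself from a recorded source revision. This is a conceptual sketch: the key, names, and revision log are placeholders, HMAC stands in for the asymmetric certificate signing a real pipeline would do, and production setups keep the key in an HSM:

```python
import hashlib
import hmac

# Placeholder key -- in reality this lives in an HSM on the build machine,
# never on a developer workstation.
SIGNING_KEY = b"held-only-on-the-build-machine"

# Maps sha256(artifact) -> source revision it was built from.
built_artifacts: dict[str, str] = {}

def record_build(artifact: bytes, revision: str) -> None:
    """Called by the trusted build step after compiling from source control."""
    built_artifacts[hashlib.sha256(artifact).hexdigest()] = revision

def sign_artifact(artifact: bytes) -> bytes:
    """Sign only artifacts this machine built; reject arbitrary binaries."""
    digest = hashlib.sha256(artifact).hexdigest()
    if digest not in built_artifacts:
        raise PermissionError("refusing to sign an artifact we didn't build")
    return hmac.new(SIGNING_KEY, artifact, hashlib.sha256).digest()

record_build(b"driver.sys built from rev abc123", "abc123")
sig = sign_artifact(b"driver.sys built from rev abc123")  # signed OK
try:
    sign_artifact(b"malware renamed to look like a test file")
except PermissionError as exc:
    print(exc)  # refusing to sign an artifact we didn't build
```

As the comment notes, this doesn’t stop malicious code that makes it into source control, but a stray binary on a developer’s PC — like the trojan in this incident — never reaches the signing step, and every signature traces back to a revision.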
Excellent post by MSMVP Susan Bradley on browser setups – Chrome, Firefox, and/or IE:
> – http://windowssecrets.com/top-story/protecting-yourself-from-poodle-attacks/
Oct 22, 2014