“We must care as much about securing our systems as we care about running them if we are to make the necessary revolutionary change.” -CIA’s Wikileaks Task Force.
So ends a key section of a report the U.S. Central Intelligence Agency produced in the wake of a mammoth data breach in 2016 that led to Wikileaks publishing thousands of classified documents stolen from the agency’s offensive cyber operations division. The analysis highlights a shocking series of security failures at one of the world’s most secretive entities, but unfortunately the underlying weaknesses that gave rise to the breach are all too common in organizations today.
The CIA produced the report in October 2017, roughly seven months after Wikileaks began publishing Vault 7 — reams of classified data detailing the CIA’s capabilities to perform electronic surveillance and cyber warfare. But the report’s contents remained shrouded from public view until earlier this week, when heavily redacted portions of it were included in a letter by Sen. Ron Wyden (D-Ore.) to the Director of National Intelligence.
The CIA acknowledged its security processes were so “woefully lax” that the agency probably would never have known about the data theft had Wikileaks not published the stolen documents online. What kind of security failures created an environment that allegedly allowed a former CIA employee to exfiltrate so much sensitive data? Here are a few, in no particular order:
- Failing to rapidly detect security incidents.
- Failing to act on warning signs about potentially risky employees.
- Moving too slowly to enact key security safeguards.
- A lack of user activity monitoring or robust server audit capability.
- No effective removable media controls.
- No single person empowered to ensure IT systems are built and maintained securely throughout their lifecycle.
- Historical data available to all users indefinitely.
Substitute the phrase “cyber weapons” with “productivity” or just “IT systems” in the CIA’s report and you might be reading the post-mortem produced by a security firm hired to help a company recover from a highly damaging data breach.
DIVIDED WE STAND, UNITED WE FALL
A key phrase in the CIA’s report references deficiencies in “compartmentalizing” cybersecurity risk. At a high level (not necessarily specific to the CIA), compartmentalizing IT environments involves important concepts such as:
- Segmenting one’s network so that malware infections or breaches in one part of the network can’t spill over into other areas.
- Not allowing multiple users to share administrative-level passwords.
- Developing baselines for user and network activity so that deviations from the norm stand out more prominently.
- Continuously inventorying, auditing, logging and monitoring all devices and user accounts connected to the organization’s IT network.
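To make the "baseline" idea above concrete, here is a minimal sketch (the names and numbers are hypothetical, not drawn from the report) of flagging activity that deviates sharply from a user's historical norm:

```python
import statistics

def is_anomalous(history, today, threshold=3.0):
    """Return True when today's count deviates from the historical
    mean by more than `threshold` standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid divide-by-zero on flat history
    return abs(today - mean) / stdev > threshold

# A user who normally generates ~50 events/day suddenly generates 500:
history = [52, 48, 55, 49, 51]
print(is_anomalous(history, 500))  # True  -- worth an analyst's attention
print(is_anomalous(history, 54))   # False -- within the normal band
```

A real system would baseline many signals (logins, data volume, working hours) per user and per population, but the principle is the same: you cannot spot a deviation until you have measured the norm.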
“The Agency for years has developed and operated IT mission systems outside the purview and governance of enterprise IT, citing the need for mission functionality and speed,” the CIA observed. “While often fulfilling a valid purpose, this ‘shadow IT’ exemplifies a broader cultural issue that separates enterprise IT from mission IT, has allowed mission system owners to determine how or if they will police themselves.”
All organizations experience intrusions, security failures and oversights of key weaknesses. In large enough enterprises, these failures likely happen multiple times each day. But by far the biggest factor that allows small intrusions to morph into a full-on data breach is a lack of ability to quickly detect and respond to security incidents.
Also, because employees tend to be the most abundant security weakness in any organization, instituting some kind of continuing security awareness training for all employees is a good idea. Some security experts I know and respect dismiss security awareness programs as a waste of time and money, observing that no matter how much training a company does, there will always be some percentage of users who will click on anything.
That may or may not be accurate, but even if it is, at least the organization then has a much better idea which employees probably need more granular security controls (i.e. more compartmentalizing) to keep them from becoming a serious security liability.
Sen. Wyden’s letter (PDF), first reported on by The Washington Post, is worth reading because it points to a series of continuing security weaknesses at the CIA, many of which have already been addressed by other federal agencies, including multi-factor authentication for domain names and access to classified/sensitive systems, and anti-spam protections like DMARC.
Is there any hope that a government-run network will ever be anywhere near as secure as even a lowly non-profit charity organization?
You will never know about the ones that are well secured.
Non-profits have much worse security. But they are lower value targets, so you just won’t hear about breaches as much.
Luckily such organizations give spyware to the police who routinely fail yearly auditing, and don’t even follow their own internal procedures and protocol anyway. So high profile people such as politicians, judges or lawyers need not worry about any little brats who can’t keep their mouths shut, and everyone can keep on making money while pretending that their little ones are in the safe hands of those police.
The little blighters reuse their passwords and don’t enable MFA, making it their fault. They should stop looking so enticing to important people in the first place, should have learned something about situational awareness from all that time spent in front of the television, or at least picked up the basics of self-defense.
“Some security experts I know and respect dismiss security awareness programs as a waste of time and money, observing that no matter how much training a company does, there will always be some percentage of users who will click on anything.”
Which means there needs to be a highly trained team or automated system that pre-screens email for malicious links, disables them and only then passes the emails on.
@Sam, security awareness programs have their place. But too many organizations feel that a series of video tutorials and posters are sufficient to “build awareness.”
Trouble is, even if “mandated,” plenty of employees fast forward through these “programs du jour” or never even try to view them. (“What are you going to do if I don’t view them, fire me?”)
What’s more important is the need to keep employees’ vigilance top-of-mind over an extended period of time. And too often, employees just don’t grasp how critical it all is — until it’s too late.
Some awareness programs are clearly more effective than others, but the reality is this: true cyber security requires much more. Training, improved IT system resources, pen testing to identify vulnerabilities, and acting to address them, are just a few of the steps needed. We can all add to the list.
But that (gasp) costs money… and too many leaders just won’t do the right thing. Until reports like this come out.
So, Sam, what you recommend is fine, as long as it’s just one part of a multi-faceted, ongoing effort. And if more companies, agencies and other organizations did all that correctly, Brian would have a lot less to write about.
Yes, I agree, but I was talking about one small but important and highly abused area – email phishing. Isn’t there software that can effectively scrub emails of any types of links, buttons and documents? The cost of this or having a team to pre-screen all incoming emails would certainly be offset by the cost of ransomware. It wouldn’t be complicated to implement.
One small step at a time.
…there’s plenty of phish generators – even metasploit has one…
…getting people to change, priceless…
…some people will never change…
Yes, Microsoft’s “Safe Links” feature in Advanced Threat Protection is already available in O365/MS365.
Many government systems, including the military, already do this. Links are “defanged” so they aren’t clickable.
But guess what? 99.999% of links in emails ARE legit. And there is still a need to use these links.
So what happens? You train users to copy and paste. And that is what they do. They still won’t take the time to carefully review every link, because now you’ve wasted their time, and they will try to copy/paste even faster.
User training is useful, but as those “experts” rightly observe… it can only help up to a point. Phishing training is good, but still of limited value.
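The link “defanging” mentioned in this thread can be as simple as a text rewrite. A minimal sketch in Python (the regexes are illustrative, not a production mail filter):

```python
import re

def defang(text):
    """Rewrite URLs so they are no longer clickable:
    http(s):// becomes hxxp(s)://, and dots in the host become [.]"""
    text = re.sub(r'http(s?)://', r'hxxp\1://', text)
    return re.sub(
        r'hxxps?://[^\s/]+',                       # match scheme + host only
        lambda m: m.group(0).replace('.', '[.]'),  # neutralize the dots
        text,
    )

print(defang("Click https://evil.example.com/login now"))
# Click hxxps://evil[.]example[.]com/login now
```

As the comments note, the trade-off is real: defanging every link also breaks the overwhelming majority of legitimate ones, which is why users end up copy-pasting in a hurry.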
Anyone remember PLAIN TEXT EMAIL?
Or what about URLs scrubbed from the email body?
= 98% less risk
Yeah…. also less productive.
Security gets a bad name when people implement draconian measures. Then people stop listening to security when we make a good/reasonable policy.
In my experience, the best security awareness program is to engage with your employee population rather than solely relying on impersonal videos and posters. And, the most powerful security awareness is to follow-up on security events with your employees. The fact that someone is “minding the store” and questioning what is going on reinforces that the organization cares, and makes an employee double-think any thought to engage in mischief.
Great points, Brian, and to that I would suggest the best programs take it beyond just the corporate view and show the “what’s in it for me.” Great programs show staff how to use security awareness training (spotting phishing, etc.) to the employee’s personal benefit (how to defend against identity theft, how to spot signs of cyberbullying in your kids, etc.).
The issue is, as web threats become more advanced, host portals like O365 have both legitimate and bogus links. You test the link, it’s trusted, but you download the payload and have a new threat inside.
I had a coworker in the IT department who would click on anything. After he got the entire place ransomwared, I had the pleasure of restoring from backup.
I quit the place after that incident because he’s still there.
If a single person could damage the whole organization… then you don’t have proper compartmentalization. Even IT folks, especially IT folks.
Do IT workers only have one set of credentials? Or do they perform IT admin functions with an elevated account?
Rule of thumb: the account used for email should NOT have any special privileges. Even the computer used for day-to-day operations and tasks like checking email should not be privileged in any way above a basic employee’s.
Wikileaks puts a “bounty” for information. There are communist, homosexuals in the CIA that hate America.
There are people in many organizations (foreign and domestic) containing people of all creeds, orientations, races, and nationalities that hate America.
So what’s new?
Has your comment actually been insightful or of help to anyone?
To your question — I would answer a loud and resounding, “NO!”
Given David’s focus, dcm, we may need to put a finer point on it:
There are Republican heterosexuals in the CIA that hate America.
Focusing only on people you foolishly “expect” to do certain things — such as to exfiltrate classified data — is one clear path to a security disaster. Just ask the CIA!
Perhaps the goal should be for America — AND individual Americans, David — to stop being SO easy to hate. To stop using EVERYTHING as an excuse to make ignorantly hateful comments would be a great start.
Some spies are devout Catholics
Some had fathers that also worked for the CIA or other government positions:
In hindsight it is easy to see how the government’s internal controls were and are severely lacking in vetting regular straight religious patriotic altar boys.
I was wondering if anyone was going to bring up Robert Hanssen. I remember reading the list of reasons the FBI didn’t catch him, and it looked remarkably similar to the list above. The information I have seen describes his wife as a devout Catholic; Robert was definitely Catholic but, from what I read, not very devout, and that is not related to his spycraft anyway.
I think it is the toll of a bell: history is prone to repeat itself. Clifford Stoll became one of the most sought-after security lecturers of his era over a minuscule accounting discrepancy of less than a dollar in mainframe billing at Lawrence Berkeley Lab. He was an astronomer helping out to earn money, and he was tenacious and followed the trail to the hacker.
The more we learn, the less we seem to remember. Or maybe that is just my age showing.
“When Security Takes a Backseat to Productivity”
In my 25+ years of cybersecurity work, my observed baseline of “problematic” users is between 2 and 4% of the employees. No amount of awareness training will change their behavior, and a smaller percentage of these will sometimes be actively malicious. Size of the organization doesn’t matter, there are always a few bad apples – but they should not be the reason to eschew awareness programs. Nothing we do is 100%, and if I can get user behavior to 98%, I’d take it.
Pompeo was new as CIA Director in March of 2017 and so it could be said that he inherited the lax security of the Brennan years.
LOL, good security value add response there Bart! smdh
Slowly we are starting to consider security when developing systems. The push is finally coming from management, as it should. However, they are kicking and screaming all the way, as security takes time and money from their pet projects that used to take days but will now take weeks while the software developers cross the t’s and dot the i’s.
Developing secure systems is the right thing to do. It has always been the right thing to do. But cutting corners saves time and money. Cutting corners enables hiring of lower skilled and cheaper developers.
Cutting corners is not unique to software development. Ask OSHA.
As a software developer, I am so happy that we are finally able to provide secure systems with management’s blessing. Not because they believe in what we are doing, but because they are in fear of being blamed or held legally liable if something goes wrong.
Baby steps in the right direction but we are only 5% of the way to the goal of having secure systems everywhere.
Is the CIA more of an asset or a burden to the US?
If a department is more concerned with “offensive cyber operations” than with security for the nation, what good is that? Going to start another war? We don’t need that. Spying to understand the world, sure; but creating puppet governments run by the most corrupt idiots (how many times now?) is NOT good for the US, and it constantly backfires and gets people killed. For what? Missions with no real-world function other than abusing people and securing mineral rights for corporations? Closing the CIA would save the US a lot of money and embarrassment.
Let the FBI take over the “worthwhile and respectable” task the CIA does, and drop the criminal behavior.
That’s not the CIA’s job; it’s the FBI’s, which could do with being split into a federal police force and something like MI5, but aim to have < 20% former cops in there.
We have implemented a phishing program where emails with enticing links are sent out to the company. Users who click on these links are automatically required to take several minutes of training. These users are also identified to the security team.
It has been very effective, to the point that my staff gets 2-4 requests per day to inspect a suspect email.
The program we use is rather inexpensive, but well worth it!
When organizations are having financial issues, often one of the first things they do is cut the IT budget. Most senior managers have little or no idea what the IT department is actually doing to maintain proper IT security. When this happens, it almost always has a very negative impact on network security, end-user training, patching and monitoring.
I have twice seen organizations shortchange IT security, and in both cases the financial losses were 10 times higher than the savings.
The CIA clueless, reckless AND feckless…?
Shocked, I am.
Now and then some malicious email will get through, especially if it is of the phishing variety and initially not overtly malicious. This morning I found a Yahoo phishing email in my Yahoo mailbox! I also read on Bleeping Computer about an ongoing Outlook phishing campaign that is bypassing spam filters because it is not sent in large numbers and it comes from a valid email account. Technologies like SPF, DKIM, and DMARC only help verify that an email was sent from the domain it claims to originate from. There was also a valid Comodo certificate involved.
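For reference, SPF and DMARC policies are published as DNS TXT records. A minimal illustrative pair for a hypothetical example.com (the addresses and policy values here are placeholders, not recommendations for any real domain) might look like:

```
example.com.         IN TXT "v=spf1 mx include:_spf.example.com -all"
_dmarc.example.com.  IN TXT "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```

A receiver honoring DMARC would quarantine mail that fails SPF/DKIM alignment for this domain, but, as the comment says, that says nothing about whether an authenticated message is malicious.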
I wonder if the phishing email I received is a derivative of the BOA one. Anyway, the upshot is that a user will often spot something if he/she is well-trained, conscientious, and retrained often as reinforcement.
The report is not “redacted”, the report is censored. Washington is great at choosing new terms for old crimes. Remember “extraordinary rendition”? What is “extrajudicial?” What did they call torture?
If they choose the words, they determine how the issue is conceived and thought about, and so they win.
According to the OED, the definition of censor is:
“An official in some countries whose duty it is to inspect all books, journals, dramatic pieces, etc., before publication, to secure that they shall contain nothing immoral, heretical, or offensive to the government”
In other words, censorship concerns morality, not national security. Removing words that might compromise security is not censorship.
As for inventing new words, the OED dates “redact” to 1475.
When you use “national security” as a blanket statement and rubber stamp for anything they think is immoral, it is indeed censorship. National security is a codeword for coverup. Deny it all you want; we all know it’s true.
Considering the unredacted content is plenty embarrassing, it is very reasonable to assert that the redacted parts do indeed contain classified information.
Censorship is a ridiculous claim to make.
Remember, this is the CIA’s report. Is the CIA making a claim for 1st amendment free speech here? Is the CIA calling it censorship?
If you had your medical records leaked and wanted a few key terms, names, dates, etc. kept private in the report that would go to the media, would that be “censorship”? Or would you want that “redacted”?
What about bank statements? Adoption papers?
Awesome breakdown and insight into what went on with this and what the final internal summary was. Provides some great insights for any organization on some top goals for securing their own ‘digital stuff’.
Sounds like the average HIPAA or FDIC data-bearing network I see in the field. The government has standards; I’ve never seen them punitively enforced. Not once in my career, ever.
We need a law saying that if you handle secured data and get caught with out-of-support systems, it means an instant injunction on business until it’s resolved. You have to hit managers, doctors, and bankers in the pocket hard or they’ll always treat systems as an abstraction and ignore them.
If you aren’t paying somebody to watch the packets, then nobody is.
It’s not just the punishment; you need a discovery process for substantive audits. Only 50 hospitals were audited in 2015, out of hundreds.
I have hopes that the CCPA, with its class-action standing for privacy breaches, will provide the level of pressure needed to find those bad actors and discourage the lax security practices driven by bottom-line budgets.
Sometimes, I think we should just go back to pen and paper, and do away with e-mails altogether.
…shared root passwords among “trusted” admins are normal practice…
…not logging activity is not…
…the issue is that managing technical people is a contact sport, and most managers are not up to it…
Familiarity and complacency sometimes lead to carelessness and contempt for rules and procedures, which can lead to loss, sometimes of life. It was true on the flight deck of a carrier (the complacent could end life as a puff of smoke out of an afterburner), and it is true in the world of IT.
You make two very important points on how we can change from old security practices to new ones. First, developing a baseline norm: many companies have yet to adopt machine learning security tools that help sift the unimportant data from the important and surface critical events to security analysts more quickly. The second is security awareness training, but more importantly, establishing trust between staff and security. Developing this trust bond will only improve the effectiveness of security awareness training.
I have certainly seen cases where security took a back seat to convenience. When I point out the risks, I get at best bemused smiles from the non-IT staff. All you can do is educate/train the non-IT staff, test the training, point out the risks, and send them horror stories. Like this one.