May 29, 2019

Would your average Internet user be any more vigilant against phishing scams if they faced the real possibility of losing their job after falling for one too many of these emails? Recently, I met someone at a conference who said his employer had in fact terminated employees for such repeated infractions. As this was the first time I’d ever heard of an organization actually doing this, I asked some phishing experts what they thought (spoiler alert: they’re not fans of this particular teaching approach).

John LaCour is founder and chief technology officer of PhishLabs, a Charleston, S.C.-based firm that helps companies educate and test employees on how not to fall for phishing scams. The company’s training courses offer customers a way to track how many employees open the phishing email tests and how many fall for the lure.

LaCour says enacting punitive measures for employees who repeatedly fall for phishing tests is counterproductive.

“We’ve heard from some of our clients in the financial industry that have similar programs where there are real consequences when people fail the tests, but it’s pretty rare across all types of businesses to have a policy that extreme,” LaCour said.

“There are a lot of things that organizations can do that aren’t as draconian and still have the desired effect of making security posture stronger,” he said. “We’ve seen companies require classroom training on the first failure, to a manager having to sit through it with you the second time, to revoking network access in some cases.”

LaCour said one of the most common mistakes he sees is companies that purchase a tool to launch simulated phishing campaigns just to play “gotcha” with employees.

“It really demotivates people, and it doesn’t really teach them anything about how to be more diligent about phishing attacks,” he said. “Each phishing simulation program needs to be accompanied by a robust training program, where you teach employees what to do when they see something phishy. Otherwise, it just creates resentment among employees.”

Rohyt Belani, CEO of Leesburg, Va.-based security firm Cofense (formerly PhishMe), said anti-phishing education campaigns that employ strongly negative consequences for employees who repeatedly fall for phishing tests usually create tension and distrust between employees and the company’s security team.

“It can create an environment of animosity for the security team because they suddenly become viewed as working for Human Resources instead of trying to improve security,” Belani said. “Threatening people usually backfires, and they end up becoming more defiant and uncooperative.”

Cofense provides a phish reporting system and encourages customers to have their employees flag suspected phishing attacks (and tests), and Belani said those employee reports can often stymie real phishing attacks.

“So what happens a lot of times is a person may click on a link in a real phishing email, and three seconds later realize, ‘Oops, I shouldn’t have clicked, let me report it anyway’,” Belani said. “But if that person knew there was a punitive angle to doing so, they’re more likely not to report it and to say, ‘You know what, I didn’t do it. Where’s the proof I clicked on the link?’”

LaCour says PhishLabs encourages clients to use positive reinforcement in their employee training campaigns.

“Recognition — where employees and departments that do especially well are acknowledged — is very common,” LaCour said. “We also see things like small gifts or other things that companies would typically use to reward employees, such as gift cards or small bonuses for specific departments or people.”

LaCour said his offices make a game out of it.

“We make it competitive where we post the scores of each department and the lowest scoring department has to buy lunch for the rest of the department,” he said. “It teaches people there are real consequences and that we all need to be diligent when it comes to phishing.”

What about you, dear readers? Does your employer do phishing awareness training and testing? What incentives or disincentives are tied to those programs? Sound off in the comments below.


156 thoughts on “Should Failing Phish Tests Be a Fireable Offense?”

  1. Fox

    So why isn’t the security team fired that let the phish email get to the employee who clicked on it?? Oh yeah, that’s right: because even million-dollar systems like O365 or Mimecast filtering still let some phish emails through, because they look incredibly similar to real emails. Example: one of my users got an email from a vendor we communicate with regularly, and the link was to Dropbox for a proposal we were expecting. When they opened the proposal off of Dropbox, it was a Word doc with a link in it that sent the user to a look-alike OneDrive site where they were supposed to sign in with their email address and password to receive the file. Yikes!!

    1. Richard S

      Saw one recently from an auto dealer we had just purchased a vehicle from and communicated with via email; they sent us a follow-up email with a copy of our paperwork.

      It was not paperwork. Someone had breached his email and elaborately tried to phish his contacts. I don’t know the outcome, but I suspect it was a very effective attack.

    2. SkunkWerks

      “So why isn’t the security team fired that let the phish email get to the employee who clicked on it??”

      Well, I mean, one reason is that algorithmic filtering and the tricks used to slide phishing emails past it are an arms race that never really ends and never has a clear victor.

      Another, maybe more subtle, reason is that, the above being the case (and assuming you actually could stop it completely), all you’d really be doing is denying your employees a much-needed life lesson.

      Because I’m pretty sure these people aren’t just getting phishing emails at their jobs.

      I work for a mid-size company, so the training and awareness work I do often brings that concept in: I’m not just teaching you things you can use to help protect the company.

      This is just as useful in your own life.

      1. Matt

        As a consultant who does a few hundred tests every year for thousands of people, I don’t want to do business with a client who terminates employees over them, and I would end the relationship with any client if I discovered one of my tests had resulted in a termination.

    3. BionicSecurityEngineer

      @Fox – You’re right and wrong. It’s a double-edged sword. The “serious” security pros don’t let users onto O365 or other “cloud service providers” without implementing MFA protection, because when it comes to phishing, no e-mail security solution is 100%. You must anticipate and solve for the “layer 8” problems, and using MFA spoils the phisherman’s catch. And that’s just 1/3 of the battle. The other major problems are malware and social engineering without links or attachments. For those, I routinely recommend an email security product that heavily leverages reputation and machine learning to filter out the lame e-mails that continue to stab users in the back. Malware can usually be defeated with heavy attachment filtering (sketched below); remember back in the day when double extensions were the back door past filters? Yeah… don’t let anything through that is considered an executable.

      I feel for companies that elect not to do the proper diligence and planning before adopting a cloud service provider, and most of the time it’s due to the added expense; however, that thinking must change. Until companies realize the risk they incur by providing e-mail (on-prem or CSP), the phishing, malware, and social engineering campaigns will continue to succeed.
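
      A minimal Python sketch of that attachment-filtering idea, assuming a small extension blocklist of my own choosing rather than any particular product’s list:

        # Toy attachment filter: reject executable extensions and the classic
        # double-extension trick (e.g. "invoice.pdf.exe" or "notes.exe.txt").
        BLOCKED_EXTENSIONS = {"exe", "scr", "js", "vbs", "bat", "cmd", "com", "ps1", "msi", "jar"}

        def attachment_allowed(filename: str) -> bool:
            parts = filename.lower().split(".")
            if len(parts) < 2:
                return True  # no extension at all
            if parts[-1] in BLOCKED_EXTENSIONS:
                return False  # plainly executable
            # Reject anything hiding an executable extension behind a
            # harmless-looking suffix.
            return not any(p in BLOCKED_EXTENSIONS for p in parts[1:-1])

        for name in ("report.pdf", "invoice.pdf.exe", "notes.exe.txt"):
            print(name, "->", "allow" if attachment_allowed(name) else "block")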

      1. senior sysadmin

        This hubris is what gets us in hot water. MFA cannot do anything to block phishing. Once the ransomware executable is on the operating system, there’s not much we can do to stop it from encrypting the drive without a feat of superhuman strength. Serious IT professionals do what is reasonable and what is directed by their business. They collaborate. This nonsense will bury any IT professional.

        1. non-senior sys..., non-BionicSecurity, non-engineer, aka acorn

          ?? “Once the ransomware executable is on the operating system…there’s not much we can do to stop it from encrypting the drive without a feat of superhuman strength.”

          From a checklist: “Application control… block unwanted applications from running at the endpoint.” Straight-up executables, questionable malware, and malware apps shouldn’t be allowed to run at all.

          Also known as application whitelisting, as Joe says:
          https://krebsonsecurity.com/2019/05/should-failing-phish-tests-be-a-fireable-offense/comment-page-1/#comment-490395

          1. acorn

            But if an IT professional is “directed… by their business” to allow ransomware to run, and that IT professional knows it’s not good practice, then indeed that IT professional gets “bur[ied].”

    4. JasonR

      If the security team can bounce emails (greylist filtering, sketched below) and introduce delays so that filters have time to learn from their vendors (say, 4-8 hours for unknown sources and 2-4 hours for known sources), then perhaps the security team’s vendor should be fired. Vet all attachments through a secops team that looks at context in a sandbox environment and watches for phone-homes or other abnormal conduct via application whitelisting.

      But that’s not how it works. Secure, timely, and easy to use, pick two.
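
      For reference, a toy Python sketch of the greylisting idea above (my own illustration, reusing the example delays from this comment, not anyone’s actual configuration):

        # Toy greylisting: temporarily reject mail from (client IP, sender,
        # recipient) triplets we have not seen before, so filters have time
        # to catch up before the message is accepted on retry.
        import time

        DELAYS = {"unknown": 4 * 3600, "known": 2 * 3600}  # seconds
        KNOWN_SOURCES = {"203.0.113.10"}  # e.g. a regular vendor's mail server
        first_seen: dict[tuple[str, str, str], float] = {}

        def smtp_decision(client_ip: str, mail_from: str, rcpt_to: str) -> str:
            triplet = (client_ip, mail_from, rcpt_to)
            now = time.time()
            if triplet not in first_seen:
                first_seen[triplet] = now
                return "450 4.7.1 Greylisted, try again later"
            delay = DELAYS["known" if client_ip in KNOWN_SOURCES else "unknown"]
            if now - first_seen[triplet] < delay:
                return "450 4.7.1 Greylisted, try again later"
            return "250 OK"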

  2. Hooked on Phishing

    I thought I was being phished a few months ago when I received an email inviting me to answer a survey about a company I had never heard of or dealt with before, but I could see it was legit on a Google search.

    I reported the phishing attempt to corporate security, as trained.

    A few days later they said it was nothing to worry about but I should unsubscribe from their emails on the handy-dandy link in the email.

    But what if, I asked, the unsubscribe link was malevolent in some way?

    Still waiting for an answer.

    1. vb

      “Never unsubscribe from anything that you did not subscribe to.”

      Everyone should know that by intuition. Any company that would add you to their mailing list without your opt-in does not deserve an unsubscribe reply. Also, there is no way to know for certain that the email even came from the legit company. At best, it’s spam. At worst, it’s phishing. Either way, just delete it.

      1. TB

        At my place we tell people to set up a filtering rule on their mailbox instead of unsubscribing, for that exact reason. The result is the same for the end user, but much safer.
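
        A rough Python sketch of what such a rule amounts to, written against a generic IMAP server (the folder name, addresses, and credentials are placeholders): move anything from the unwanted sender into a junk folder without ever touching the unsubscribe link.

          import imaplib

          UNWANTED_SENDER = "newsletter@example.com"  # placeholder address
          JUNK_FOLDER = "Junk"

          with imaplib.IMAP4_SSL("imap.example.com") as imap:
              imap.login("user@example.com", "app-password")  # placeholder credentials
              imap.select("INBOX")
              # Find every message from the unwanted sender...
              status, data = imap.search(None, "FROM", f'"{UNWANTED_SENDER}"')
              if status == "OK" and data[0]:
                  for num in data[0].split():
                      # ...copy it to the junk folder and mark the original deleted.
                      imap.copy(num, JUNK_FOLDER)
                      imap.store(num, "+FLAGS", "\\Deleted")
                  imap.expunge()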

  3. Marti

    At my last employer we did have progressive counseling, up to and including firing, for clicking the phishing campaign emails. I can tell you that after a written warning, people were pretty cautious.

    I personally see nothing wrong with it, would you want to work next to a dummy that can’t keep his paws off potentially malicious links?

  4. SkunkWerks

    “But what if, I asked, the unsubscribe link was malevolent in some way?”

    Hovering over links to see where they go helps.

    But really what irks me more about the response of your security people is that they’re actually lending credence to the idea that Marketers have some Code of Honor that says they ~must~ cease all contact with you if you just click that link.

    If you have no clue how you got opted-in to a particular marketing email, why would you have any expectation that opting-out is actually an option?

    IMO, if this concept (unsubscribe links) hasn’t been over 90% disingenuous from inception, then I’ll eat my hat.

    Don’t wear one, but I’ll buy one just to eat it for the occasion.

    1. PJJ

      “Hovering over links to see where they go helps.”

      Yes, until the company starts using a product that obfuscates all external links and addresses them by forcing them through the firewall.

      1. Bob Tables

        Ah yes, gotta love “clicktime protection” or whatever it’s called. It doesn’t 100% guarantee the link isn’t malicious, only that it will block a request if it *knows* the link is malicious. I don’t know, maybe a solution that expanded links out to the actual site you’re requesting, and turned them back into plain old unclickable text, so that somebody has to really think hard about going there? Scratch that… I think that would be too easy. I don’t think there’s a way to win here.
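
        That “expand links out” idea can be sketched in a few lines of Python (a toy of my own, not any vendor’s clicktime product): parse the HTML body, compare each link’s visible text to its real destination, and print the true target as plain, unclickable text.

          from html.parser import HTMLParser
          from urllib.parse import urlparse

          class LinkExpander(HTMLParser):
              """Collect each link's visible text alongside its real destination."""
              def __init__(self):
                  super().__init__()
                  self.current_href = None
                  self.findings = []

              def handle_starttag(self, tag, attrs):
                  if tag == "a":
                      self.current_href = dict(attrs).get("href", "")

              def handle_data(self, data):
                  if self.current_href is not None and data.strip():
                      shown = data.strip()
                      real_host = urlparse(self.current_href).netloc
                      # Flag link text that looks like a URL but points at a different host.
                      if shown.startswith("http") and real_host not in shown:
                          self.findings.append(f"MISMATCH: shows '{shown}' but goes to '{real_host}'")
                      else:
                          self.findings.append(f"'{shown}' -> {self.current_href}")

              def handle_endtag(self, tag):
                  if tag == "a":
                      self.current_href = None

          body = '<p>Your file: <a href="http://0nedrive-files.example.com/doc">https://onedrive.live.com</a></p>'
          parser = LinkExpander()
          parser.feed(body)
          print("\n".join(parser.findings))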

  5. JimV

    Institution of a personal (much less corporate) mantra of “Don’t Be Stupid” is a lot more difficult than it should be.

  6. MG

    I don’t believe (or see from the comments) that systematically terminating offenders is a reasonable approach. I do, however, firmly believe that termination/reassignment is well within reason for those who have demonstrated an inability to learn and continue to fail in their responsibility to be vigilant against threats. The fact is, many employees lack accountability. I see this all the time with abuse of company phones, where someone travels internationally and racks up a ridiculously large bill. If it were their own phone they certainly would be more cautious.

  7. Craig

    In my experience, managers are the ones who fall for the Trojans and Phishing schemes the most. And the higher up the ladder they are, the more likely they are to fall for these things.

    1. JCitizen

      At my last contract job, they were under HIPAA rules, so the organization literally made the CIO the boss and gave upper IT management the power to recommend firing or more training. It was the most locked down organization I’ve ever worked at, and we did the job right! What few incursions happened were detected and stopped within a few hours or less.

      This is why I shake my head at all the stories Brian brings us, and I just have to say to myself, “What a bunch of amateurs!!”

    1. MG

      That article is flawed…. certainly employees in both operations and finance must be held accountable and face consequences if they fail to comply with the requisite safety rules and controls for their areas.

  8. Jf

    The old Pink Panther detective movies had a running gag in which the detective’s assistant Kato would attack him at any moment to keep his skills sharp; this is phishing testing today.

    Phishing tests are seen as a sad-to-hysterical abuse of power by staffers where I’m at. Where else in industry does random testing with career consequences happen, other than drug testing? Drug testing has many HR guardrails that are lacking in most IT policy. The most obvious is independent evaluation of results.

    If anything, put it into a “public” KPI for staffers and shareholders, to show whether the time, energy, and expense IT spends against phishing pays off, versus the contrary.

  9. John Nelson

    Phishing simulations serve two purposes.
    The most important of those is training. Regular simulated phishing campaigns should be aimed at _increasing_ user awareness. Encourage users to share when they sight something sketchy. We do that by rewarding users with a token bonus and a public atta-boy when they spot the bait.
    The other purpose is to gauge the effectiveness of your user awareness campaign. Once you’ve got everyone engaged, there will likely be some who, for a variety of reasons, can’t seem to spot the email with the hook in it. Security should provide remedial training and whatever hand-holding is necessary to achieve the target level of awareness.
    Yes, there will be chronic offenders. As much as possible, hand off the “stick” chores to managers and HR. It is the employee’s manager who is responsible for the employee’s performance. Let them do what they need to do to change behavior. At some point, that means HR may get involved. Meanwhile, Security has been doing nothing but trying to help.

  10. Readership1

    vb and a few others got it right.

    There’s no reason to fire an end-user for the failure of a system. One person’s mistake shouldn’t have the potential to blow up the office. If it does, there’s a larger systemic problem at that workplace.

    1. James

      No, not true. The requirement to communicate and share data across an organization means there is always the possibility that a single user can do a lot of damage, even if the best systems are in place to prevent and mitigate it.

      Even if you did backups every few minutes, you stand to lose a ton of work, and the larger the organization, the larger the loss.

      Not to mention if the attacker is more malicious and patient. They can wait and slowly pick at your users and network for months or years without anyone knowing because the initial user never reported that they gave out their credentials. Then one day, bam! It all blows up.

  11. EJ

    We offer additional training when associates continue to click, and bring in management to help explain how serious this could be. What do you do when that still doesn’t work? We have a couple who continue to click on our monthly phishing test even though they know all the warnings. Unfortunately, the only common denominator is age. Are they too trusting? It reminds me of the overwhelming amount of financial elder abuse, and how those victims also know the warning signs yet still fall for the scams.

  12. Gerard

    Firing employees is like blaming the store owner for their robbery. It is short-sighted and does nothing to address the problem; it only gives cyber criminals more ammunition and another incentive to attack. It’s not a good work culture, either.

    Companies NEED to take security awareness education more seriously, and take the time to do it right. Phishing is not complicated: with the right email security gateway (e.g., Proofpoint) and proper (easy) training, your staff will at least know how to spot a phish and where to send it so that the security teams can deal with it. Your employees are your agents; they are targeted; you need to arm them for what is, in effect, a war.

    FYI, best security awareness platform, Beauceron Security
    https://www.beauceronsecurity.com/

    1. DH

      — Firing employees is like blaming the store owner for their robbery.

      Sorry, I think everyone misses the point.
      Firing an employee for failing phishing tests (plural; it should take several) is like firing a store employee for forgetting to lock up at the end of the day… repeatedly.
      The phishing test should be more easily detectable than actual phishing attempts. It is like requiring someone to take a driving test to operate a vehicle. Cover the basics that were taught.

      There are employees who just don’t care (e.g. won’t lock up the store, isn’t at the counter when customers are browsing the shop, doesn’t clean up messes on the floor which customers can slip on). Those are the people who should be fired. I put people who repeatedly put their company at risk within that group.

  13. Helo Guy

    I work in a healthcare organization; we have taken this topic very seriously for many years. We use a training/evaluation supplier both to help with exposing staff to simulated phish and to evaluate submitted suspicious items. We include training for all staff and work to help staff who struggle with properly reporting phish. The focus is on motivating staff, not punishing them.

    In 2015 the simulations showed only 33% of staff reported the dubious items, and almost as large a fraction fell for them.
    In 2016 the recognition/reporting climbed to 44% and only 14% were susceptible.
    In 2017 the recognition/reporting climbed to 52% and only 8% were susceptible.
    In 2018 the recognition/reporting climbed to 58%; the susceptibility rate is leveling off.

    Because so many staff report phish, we get prompt detection of the ones that slip through our automated defenses. During the first quarter of 2019, we detected an average of 42.9 attacks per week. For larger attacks (where 11 or more items were reported), the median time from the first item being received to it being reported by staff was 7 minutes. This value has been below 10 minutes in 7 of the last 8 quarters.

    We continue the efforts in earnest. Noteworthy is a 1 minute training animation video for Phish, part of a THINK series for Information Security:
    >> Phishing emails are a real threat
    So we need your help
    Thousands of harmful emails attack us daily
    Filters catch most of these Phish
    But some still manage to get through the net
    This is where YOU come in
    Did you know?
    If you delete a malicious email
    It will be removed from your inbox
    But the threat to (our organization) still exists
    But if you report the malicious email
    You help (our organization) defend against future threats
    THINK, your colleagues depend on you
    THINK, patients depend on you
    THINK before you click. Office of Information Security <<

    Happy hunting!

    1. James

      Good stuff Guy! I like that poem… and the puns.

  14. M

    We have escalating punishments with each click on a test or real phish, with a way to track them. For the first click, it’s documented and a training email is sent. On the 2nd, another training email explains that the next time they click, their manager will be involved. The 3rd brings a meeting with the employee, their manager, and an InfoSec person. On the 4th, their email is disabled, with all of their email going to their manager. The 5th means potential termination. The biggest thing about our policy is that if someone clicks a link and then determines it was not legitimate, as long as they report it right away, it does not count toward this tally. We do not want to punish someone for letting us know about a potential problem. Of course, if a breach were to occur, which none has so far, I suspect the level of consequence could escalate. To me, all this seems pretty lenient. One click on a malicious link in an email has the potential to compromise the entire network. The excuse that a phish looked real will not prevent the company from losing loads of money or customers, or even going out of business.
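
    A toy Python sketch of how such a tally might be tracked (hypothetical names, not this commenter’s actual tooling), including the rule that a prompt self-report does not add a strike:

      from dataclasses import dataclass

      CONSEQUENCES = {
          1: "documented; training email sent",
          2: "training email warning that the manager is next",
          3: "meeting with employee, manager, and InfoSec",
          4: "email disabled; messages routed to manager",
          5: "potential termination",
      }

      @dataclass
      class Employee:
          name: str
          strikes: int = 0

      def record_click(emp: Employee, self_reported: bool) -> str:
          """Register a phish click; prompt self-reports don't add a strike."""
          if self_reported:
              return f"{emp.name}: self-reported, no strike recorded"
          emp.strikes += 1
          return f"{emp.name}: strike {emp.strikes} -> {CONSEQUENCES[min(emp.strikes, 5)]}"

      alice = Employee("alice")
      print(record_click(alice, self_reported=True))
      print(record_click(alice, self_reported=False))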

  15. JW

    The culture of the organization – and how the organization backs up its culture – makes the biggest impact to the decision of whether or not to fire folks that click phishing emails.

    At its core, cybersecurity is a profoundly human experience. The vast majority of cybersecurity issues are due to human error at some point, many of which are basic. But the premise to solving these issues should be pretty simple:

    Reward the best-case scenario and penalize the worst-case scenario.

    The best-case scenario isn’t simply deleting or ignoring a phishing email; it’s reporting it upstream properly. People (users) want to know that what you’re asking them to do (report phish) matters, so make sure you recognize their efforts. It doesn’t have to be constant grand gestures; it could be as simple as sending out an email about who submitted phish properly last month (or the top 10 users who submitted the most phish, whatever works). Simply recognizing the effort goes a long way; without recognition, motivation will wane.

    The worst-case scenario isn’t necessarily clicking an email; it’s when an email is clicked and the user lies about it or covers up the incident.

    There HAS to be accountability for testing your people, though; without it, these “tests” become a joke.

    One of the best ways I’ve seen is to create a tiered system for users clicking phish or failing SE tests. One get-out-of-jail-free card is given, provided that the failure is reported ASAP. However, if a user clicks one additional email during a specific timeframe, the user must undergo additional training. If another click is made by the same user, they must sit down and explain to their supervisor or a member of senior management why they have clicked numerous times. If there’s yet another click, that individual must sit down with a member of the C-suite and explain themselves further.

    There are numerous other options, including removing email or Internet access, stripping email of any HTML or rich-text, etc.

    But at the end of the day, if repeated failures and training make it clear that the individual has either no regard for security or any comprehension that his or her actions put the organization at high risk, it’s not likely in the best interest of the organization to continue to employ that individual.

    Should failing phishing tests be a fireable offense? After one or maybe a couple of failures? That’s likely too drastic. After repeated failures? It should certainly be an option on the table, based on the organization’s culture.

  16. PHP

    Phishing is taking place everywhere. Training is the only way to stop it.
    Whenever people call or write me to tell me they did something stupid, I show them how they could have spotted the e-mail as a fake.

    I tell them that if they did not enter the password, then they are safe for now. If they did enter their password, I help them change it.

    And one important thing people forget: always customize your ADFS login page, and tell people that they are not allowed to enter their password unless the page looks like that. It means phishing must be targeted specifically at each organization, so 90%+ of phishing attempts become worthless.

  17. ASB

    It really boils down to what “repeatedly” means.

    If an organization is firing people for failing 2 or 3 phishing tests, then there are going to be a lot of negative consequences for the organization.

    If, on the other hand, an organization fires people who have failed 15 or more phishing tests, I don’t think the answer is as clear-cut.

    And I’m not recommending that people get fired for that, but would you want your finance department falling for phishing scams on a regular basis?

    1. SkunkWerks

      Another thing not mentioned/addressed here (which I find odd): What sort of position/access to information does the person hold?

      While I’m not sure about the efficacy of this idea when used on the rank and file, who in any thoughtfully designed environment should have fairly limited access to information and sensitive areas, I get a lot more supportive of it when those people are “whales” (in the parlance of phishers).

  18. GB

    My employer (a large financial institution) has used phishing simulations in conjunction with training for several years. Unfortunately, they’ve recently decided to go down the “punishment” path, with escalating consequences for repeat mis-clickers: mandatory training after each infraction, and if you get to five missteps your file is forwarded to HR for review. Of course, the announcement they sent out doesn’t define the time period the five infractions must cover (a year? A decade? The course of your entire career?), so it’s raising questions as well as hackles.

    1. badbanana

      What your employer is doing is correct.

      Such repeat-offender employees, despite the training provided, are a burden to the company.

      If this is pulling the company down, time- and money-wise, wouldn’t you do the same?

  19. Mark

    The article does not have complete information. What if it took, say, 5 phishing failures in a row to get fired? What if one failure was OK, the 2nd meant training, the 3rd a manager chat, the 4th 3 days without pay, and the 5th dismissal? Think about it: if phishing is similar to, say, harassment or unsafe behavior, how many occurrences would you tolerate in your office before firing the person? All of these put the organization at risk.

  20. Cary

    My company does fully managed ethical phishing simulations for organizations in Canada. Due to our culture here, it was imperative that our program be built on a “Helping Hand Approach,” because here in Canada a “Heavy Hand Approach” will never work, for many reasons.
    A program MUST respect the employees first and foremost; even something as simple as diverting an employee to a 5-minute video when they expected to spend 1 minute resetting their password (in a password-reset phishing simulation) can cause an uproar.
    I have seen too many programs here in Canada take completely the wrong approach, which remains good news for me.
    A little hint for those wanting to do this here: respect the employees first and foremost, and help them realize that these are skills they need in today’s online world, both at work and at home in their personal lives. That way they know there is something in it for them, and they will be far more likely to learn.
