Would your average Internet user be any more vigilant against phishing scams if he or she faced the real possibility of losing their job after falling for one too many of these emails? Recently, I met someone at a conference who said his employer had in fact terminated employees for such repeated infractions. As this was the first time I’d ever heard of an organization actually doing this, I asked some phishing experts what they thought (spoiler alert: they’re not fans of this particular teaching approach).
John LaCour is founder and chief technology officer of PhishLabs, a Charleston, S.C. based firm that helps companies educate and test employees on how not to fall for phishing scams. The company’s training courses offer customers a way to track how many employees open the phishing email tests and how many fall for the lure.
LaCour says enacting punitive measures for employees who repeatedly fall for phishing tests is counterproductive.
“We’ve heard from some of our clients in the financial industry that have similar programs where there are real consequences when people fail the tests, but it’s pretty rare across all types of businesses to have a policy that extreme,” LaCour said.
“There are a lot of things that organizations can do that aren’t as draconian and still have the desired effect of making security posture stronger,” he said. “We’ve seen companies require classroom training on the first failure, to a manager has to sit through it with you on the second time, to revoking network access in some cases.”
LaCour said one of the most common mistakes he sees is companies that purchase a tool to launch simulated phishing campaigns just to play “gotcha” with employees.
“It really demotivates people, and it doesn’t really teach them anything about how to be more diligent about phishing attacks,” he said. “Each phishing simulation program needs to be accompanied by a robust training program, where you teach employees what to do when they see something phishy. Otherwise, it just creates resentment among employees.”
Rohyt Belani, CEO of Leesburg, Va.-based security firm Cofense (formerly PhishMe), said anti-phishing education campaigns that employ strongly negative consequences for employees who repeatedly fall for phishing tests usually create tension and distrust between employees and the company’s security team.
“It can create an environment of animosity for the security team because they suddenly become viewed as working for Human Resources instead of trying to improve security,” Belani said. “Threatening people usually backfires, and they end up becoming more defiant and uncooperative.”
Cofense provides a phish reporting system and encourages customers to have their employees flag suspected phishing attacks (and tests), and Belani said those employee reports can often stymie real phishing attacks.
“So what happens a lot of times is a person may click on a link in a real phishing email, and three seconds later realize, ‘Oops, I shouldn’t have clicked, let me report it anyway’,” Belani said. “But if that person knew there was a punitive angle to doing so, they’re more likely not to report it and to say, ‘You know what, I didn’t do it. Where’s the proof I clicked on the link?'”
LaCour says PhishLabs encourages clients to use positive reinforcement in their employee training campaigns.
“Recognition — where employees and departments that do especially well are acknowledged — is very common,” LaCour said. “We also see things like small gifts or other things that companies would typically use to reward employees, such as gift cards or small bonuses for specific departments or people.”
LaCour said his offices make a game out of it.
“We make it competitive where we post the scores of each department and the lowest scoring department has to buy lunch for the rest of the department,” he said. “It teaches people there are real consequences and that we all need to be diligent when it comes to phishing.”
What about you, dear readers? Does your employer do phishing awareness training and testing? What incentives or disincentives are tied to those programs? Sound off in the comments below.
Absolutely those employees should be fired. It is much like an employee forgetting/overlooking physical security like locking the front door at the end of the day, or not closing the cash register. Do it once, reprimand, twice, goodbye.
That is a false equivalency. Physical locks are one thing… and probably analogous to leaving a laptop in a car and getting it stolen.
What is “falling for phishing”? Nothing so obvious as leaving a cash register open.
The very nature of phishing is that it appears as benign and natural as normal business.
Most jobs require interacting with other people through email, and yes, often opening attachments from people you don’t know. Public facing employees HAVE TO trust to some extent to do their jobs.
How about the security office lock down endpoints better so that these attachments aren’t so effective? Should they keep their jobs if an endpoint is compromised?
The big problem with punishing people for falling for a phish is that you lose out on the potential for your employees to call you with “I think I might have clicked on something I shouldn’t have,” which, depending on your environment, could alert you to something you might not have seen otherwise. I feel it’s better to go with training as the “punishment” and encourage people to look out for phishing e-mails. Have a “catch of the week” type thing, where you give movie tickets or something for the sneakiest phish submitted to the inbox that week, or whatever.
I would argue that phishing is more like walking into the wrong location believing it was the correct one. Closer to what you describe would be leaving your PC unlocked while you are away from it, which I would argue does warrant punitive action.
At a small company in my past, leaving a computer unlocked resulted in the offender originating decidedly off-color emails. I am not advocating the approach, but the mistake rarely happened twice.
In the UUNet NOC, an unlocked station would result in X environment antics. The user would lose control of his Solaris graphics environment until the person who swooped in felt merciful.
If the user was lucky xsnow, xcockroach, or all their websites becoming hamsterdance is all that would happen.
A night of hamsterdance made the likelihood of repeat offenses low 🙂
Anyone can be phished – ANYONE. All it takes is a bad day at work, a fight with a spouse, or many deadlines coming due. Anything that would distract a normally savvy employee from his/her usual diligence. The trick is to realize it when it happens and report it, or at least learn from the mistake.
Along with phishing tests, there should be network access and segmentation tests. Let a test phish hit the network. If the employee who fell for the test phish has access to every PC and server in the company, the network admin should have to undergo network training.
The problem is not just employees falling for phishing; it’s that network admins allow far too much resulting damage. If the damage is minimized, there should be no need to fire an employee for failing a phishing test.
If, after several failures, low-level employees still have access to mission-critical infrastructure, the network admin should be fired.
Personally, I think that anyone can fall for a well crafted phish, it’s just a matter of time or being tired, or being busy, or having a momentary lapse.
That’s certainly part of the problem. At the end of the day, security needs to be holistic, so focusing on one myopic area of concern doesn’t reduce the threat that much. Security needs to be a combination of network access control, microsegmentation, user security awareness, secure SDLC, role-based access and a heap of other things, as most of us already know.
We need to do a better job of combining people, process and technology. Where I see failure is where these 3 pillars are imbalanced with draconian processes or panacea security technologies. If you examine the recent breaches, it’s usually the simple mistakes that caused them, like continued use of end of life systems, slow lax patching cycles, or slack processes that you could drive a truck through. Sometimes the basics are forgotten in favor of the shiny new tool. Let’s not forget good housekeeping!
As a security and network admin myself, separation of privilege is what we try to enforce, but we don’t always have the last say. You also have to balance a person’s ability to do their job. If you followed MS security best practices to a tee, everyone in the environment would have multiple privileged access workstations with separate credentials for each job/system, which may be the appropriate thing to do in certain high-security verticals with unlimited budgets. In reality, the C-levels are going to be the ones making these decisions; all we can do is present our argument/evidence for tighter security.
I see both sides. Damage to a company from a phishing attack can be massive, and it can happen all too easily when someone misses the red flags. However, most phishing attacks these days are getting so clever that you’d have to analyze the source closely, and they could fool even the pros. So it would have to be judged case by case.
Firing is punitive and punishment does not teach anything but how to avoid the punishment. Such avoidance is usually achieved by methods that are covert and undesirable.
Discipline requires training and takes a lot more effort than punishment. If an employee fails a phishing test, then the employee needs to be trained. If the employee cannot be trained to pass a phishing test, then the employee may need to be given a position with reduced network privileges. If such a position is not available, then termination may be the only option, but that is very different from simply punishing someone for failing a test.
I agree. There should be steps that actually provide better security, and removing the person isn’t the way.
Training on 1st offence, 2nd offence reduces network privilege… and if network privilege is essential to do the job (as it is many times)… then put their butt on linux… let them use a system with libreoffice with no macros, plugins, etc. Locked down browser. Or maybe even a thin client that resets every day.
It teaches them about how the convenience of using Microsoft Office and Adobe Acrobat are a trade-off with security.
I somewhat agree to your approach, but you have to draw the line of when to enforce stronger actions somewhere. If you look at it as a loss event, i.e., “Bobby Smith fell for a phishing campaign and gave up his credentials, we reset his password and no loss was observed”… this case definitely would require training. However, in a second scenario, “Bobby Smith fell for a phishing campaign and ransomware was installed on his device which spread to 100 other machines and caused a loss of $1 million. This is the 4th time this year that Bobby Smith fell for phishing”… clearly, that user isn’t getting it and they’re causing a loss for the company. At that point, termination should be at least considered.
Threatening termination is counter-productive. You should assume your company *is* going to get successfully phished. The goal should be to minimize the impact of that phishing.
If Bobby knows he’ll be fired the next time he gets phished, he’s not gonna report it when he realizes he’s messed up, and then attack will be able to continue. Train him, sure. But encourage him to report such mistakes ASAP to minimize the damage.
Jon Marcus, I fully understand where you’re coming from and I can’t say that I disagree. However, I have an issue if the same repeat offender keeps causing loss for the company by being phished. I think you’re assuming the case that I laid out happens frequently. Can the same user get phished one or two times? Absolutely. Of those times, how many caused a loss? Probably none. My example is more hypothetical, i.e., the same user causes loss multiple times by being phished. At that point, this is negligence and a failure on the user’s part. If you fail at your job, the company has every right to terminate, especially if that user cost the company millions of dollars.
The scenario you laid out is precisely what is wrong with many organizations. Using phishing metrics to make people into scapegoats.
When you say the user is the cause of the loss because of phishing, it sounds like the legal department trying to pass the buck and sweep the real problem under the rug. It is a CYA mentality that makes employees into victims.
Yes, some blame is on the user… but the bulk of the responsibility is on the security team. Why is ransomware allowed to run? Is the machine not patched? Was that “user” logged in as a local “admin” for a good reason? How the hell did it spread to 100 systems? Still on SMBv1?
Users can make mistakes, and with phishing, the attackers are simply smarter than users. So falling for phishing is NOT ALWAYS A MISTAKE! Many times, all the training in the world wouldn’t get a user to spot a sophisticated phish.
Unless they work on the security or fraud teams, email header investigation isn’t their job, and having to forward a lot of emails and wait for a determination actually keeps them from doing their job.
The cyber criminal who sent the phish is the one to blame!
Blame and Responsibility aren’t synonymous.
Phishing emails are a very serious threat, but that doesn’t necessarily mean serious consequences.
Training and deployment of managers as key members of the security team, along with training for each staff member, is critical. A lot of people don’t see each person in their office as a key member of team security who should be trained in everything they know. That is a key tenet of office management. Let responsibility lie on the shoulders of the staff, and show that responsibility matters by having managers conduct the training and be included in it.
Obviously, some managers and staff don’t care and get lazy, or there is just a lack of knowledge/leadership from the top down on this issue, but CISOs and CTOs are getting more aware. Unfortunately this ties in with the deskilling and outsourcing of I.T., which cuts off the head of the defence for each company. A truly regrettable act. Maybe western countries need to promote and reward IT professionals more?
Not that I don’t get paid a truckload, hahaha
In too many cases, companies do fire-hose training: they require employees to sit through CBT or even live classroom training and then go back to their jobs and hopefully remember what they just (maybe) learned. Supplement it with a few posters and they think it’s enough.
It’s not — but threatening employees with losing their jobs is hardly a “motivator” to do better. (And as others have already pointed out, who wants to work in that kind of punitive environment?)
OTOH, there are some companies whose corporate communications groups, working together with IT, have developed innovative, ongoing campaigns that keep employees’ awareness top-of-mind. They use humor and creative marketing techniques and have found that employees’ cyber awareness and retention is notably higher than with “basic training” alone.
There’s no one “perfect” way, but this one seems to be more effective as a long-term approach.
My wife works at a financial services company and the first time you click on a phishing email you get an IT training session with your manager, second time you get written up, third time you get fired.
Interesting. You can tell a real phishing email from an RFI? Or from a customer, just from the header? Interesting. No one has thought of other methodologies for getting the emails needed to the proper people safely? That’s more interesting. No one’s sandboxing the systems, just blaming the person whose job is receiving and answering the emails. Sounds right. How about firing the IT guy or gal? After all, no email may mean no revenue for the boss.
Formerly, as an IT manager in a small hospital, we did some pretty basic phishing testing and training. I ALWAYS encouraged employees to contact me ASAP under any of a variety of scenarios. I would not (and the hospital did not) make opening suspect attachments or clicking links per se a disciplinary offense.
If someone persists in that behavior, it will be documented over time and that documentation can be provided to HR and to that employee’s supervisor. We never had to do that.
As part of a layered security approach, we did ongoing security awareness training (extra, for any who actually fell for a phishing email) as well as phishing testing and training.
We were able to contain without trouble any and all issues reported. These included two different ransomware attacks, which were reported within minutes, before the entire endpoint drives were encrypted and without spread to the network. The two PCs were restored to initial OEM state, updated, and put back into use. No breach of our EHR data.
We also saw close to a dozen scareware attacks, easily identified and cleaned out. At least two of those attacks, by the way, came NOT from phishing email but from drive-by download from an actual, real state-agency server, which was accessed often by employees. (Same state agency, different servers for the two incidents.)
That is something you cannot hold our (your) employees accountable for–and you best hope they don’t attempt to deal with the alarms and warnings themselves, and do call you in immediately. That takes training. And relationship building, between IT and each and every department, and all the managers.
Firing isn’t “punishment” as some are suggesting… it’s simply cutting off a person who consistently shows that they can’t do their job… which obviously includes protecting whatever company assets they have access to in a reasonable fashion!!! It should only be done on a case by case basis and never in an automated fashion, but only when competency is obviously very low, and education seems ineffective at bringing it up to a satisfactory level. Remember you have to re-hire someone else to fill the vacancy which is very expensive, so it should never be done lightly (let alone the reasons they give in the original article). It’s all a cost/benefit analysis thing to decide when to fire someone. It should be extremely rare that this happens though, if you’re doing your job hiring competent people in the first place!!!
Bill, that is patently doublespeak! Your opening statement is false, and then you argue in its defense with a simple fact that is true. Unfortunately, your logic is much too simple.
Just to begin with, the person fired may have something to tell you about what punishment is and is not.
Just try to comprehend what it is like to work in a department where the water cooler gossip always includes another coworker fired for something they didn’t mean to do, so watch out or you’ll be next!
I believe you are right, in the sense that I should have been less absolute in my wording, and more explanatory that I was trying to describe my opinion about an ideal narrowly defined scenario…. not necessarily the messy realities we live in.
In an ideal scenario, in my humble opinion, people should be fired if they can’t do their jobs well, and can’t learn to do so either. Really it was a mistake to hire them in the first place, at least, for that job. There are other jobs better suited to their skill set.
Also, in an ideal scenario, in my opinion, when people are living under a horrible work management situation where everyone’s walking on eggshells waiting for the next terrible unjust thing to happen… then everyone should dust off their resumes and look for new jobs. I have lived in this situation before, almost the entire company quit (about 10 people) in a matter of 2 months, once they’d had enough…
In my opinion (the only one I have), the difference between these two “ideal scenarios” and reality, is that reality isn’t just one narrow issue at a time… it’s a mixture of many issues all pulling in different ways… One way to make some sense of such complex realities, is to pull them apart into separate topics, and discuss each one in isolation… Is that overly simplistic? Of course it is! It’s just one part of a full analysis of a given unique complex situation though.
I’ve never seen a person who was completely resistant to any training and yet helpful when given a real job. But I can imagine an IT department doing things that clearly look like phishing on a regular basis, while sending things that look reasonable as phishing tests. One thing that opened my eyes was the phishing quiz you can take at google.com: on my Android smartphone, Firefox often didn’t show me enough information to identify the red flags.
At my current workplace we have an IT department you can call at any time if you want to know whether something is phishing, without being afraid of being shouted at. I believe this is one of the most important anti-phishing measures: a second pair of eyes you can call before clicking on a link in an email.
I’ve personally witnessed HR intervene and prevent punitive action being taken because the end-user felt harassed that they kept getting caught compromising security and leaving their workstation unlocked and unattended. This forced the degradation of normal precautions because of feelings.
Example:
* end-user complained IT permissions prevented installing additional required external software. HR instructed IT to provide local admin rights; proceeded to install malware-laden pirated software directly afterwards. Handled the restore, yet still instructed to provide local admin.
* end-user phished outside of a pentest twice with local admin rights.
* end-user saved all passwords in text files or in the browser.
* end-user caught on a daily basis by IT leaving PC unlocked. end-user used local admin rights to disable the power management options for screen lock. When confronted, showed no sense of responsibility.
* all credentials for provided access need to be changed for all systems multiple times due to this end-user.
* end-user admits they barely know how to ‘work their macbook let alone windows’.
Whoever that was in HR would be looking for a new job the next day where I am. We don’t tolerate that kind of intervention by HR.
I agree with Mike, that HR person should be fired. NO ONE should have admin rights, not even IT on their regular accounts. That, and no non-IT person should ever be allowed to install their own software, period. If they need it done, IT can do it. If that person’s PC got infected with malware after being given admin rights, whoever approved those rights should be written up and potentially fired, depending on the severity of the infection.
What a load of BS…!!! We expect users to pick up every phish (I’d love to see some honest stats on how many InfoSec and IT staff are caught out) when we aren’t prepared to train them, assist them, or build security around them. We need to use people, not alienate them. We have to remember security is not their primary concern. People are our strongest asset, as it’s been proved time and time again that technology cannot stop compromises…
The end justifies the means. If the company loses data, the loss has to be borne by the employees of that company along with their CEO. That said, there are other factors to consider before firing the offender: how much and in what manner awareness training was imparted; whether this is a first-time or repeat behaviour; correlation with other user behaviours (visiting risky websites, time spent on non-professional activity on the internet, etc.); and feedback from the last training. And most important, what role IT has played in providing a safe and secure environment for the employee.
One thing that seems effective to me and avoids some of the issues mentioned in the article is a notice added to all emails originating outside the company in red, “This email likely came from outside of [company name].”
It seems that being overly draconian will be counterproductive. Making a game of it, as suggested, makes a great deal of sense.
I have fiddled with computers since I was in high school years ago and although I have never clicked a phishing email there was a time or two I came close.
Can do that as a client side rule in Outlook. It helps.
I’ve seen some organizations with email gateways that add [EXTERNAL] to the subject line. That helps somewhat too.
Of course some places deal a lot with external incoming email all the time, and have to trust the sender, because that’s the business.
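The external-banner idea several commenters describe is normally done at the mail gateway (e.g. an Exchange transport rule), but the logic itself is simple. Here is a minimal, hedged sketch in Python; the domain `example.com` and the demo addresses are placeholder assumptions, not anything from the article:

```python
# Sketch: prepend an [EXTERNAL] tag to mail whose From: address is outside
# the company. "example.com" is a hypothetical internal domain; a real
# deployment would apply this at the gateway, not per-message in Python.
from email.message import EmailMessage
from email.utils import parseaddr

INTERNAL_DOMAIN = "example.com"  # placeholder assumption

def tag_external(msg: EmailMessage) -> EmailMessage:
    """Prepend [EXTERNAL] to the subject when From: is not an internal address."""
    _, addr = parseaddr(msg.get("From", ""))
    domain = addr.rpartition("@")[2].lower()
    subject = msg.get("Subject", "")
    if domain != INTERNAL_DOMAIN and not subject.startswith("[EXTERNAL]"):
        del msg["Subject"]  # email headers must be deleted before re-setting
        msg["Subject"] = f"[EXTERNAL] {subject}"
    return msg

demo = EmailMessage()
demo["From"] = "Alice <alice@supplier-portal.net>"  # hypothetical sender
demo["Subject"] = "Invoice attached"
print(tag_external(demo)["Subject"])  # [EXTERNAL] Invoice attached
```

The `startswith` guard matters in practice: without it, reply chains that cross the gateway repeatedly accumulate stacked `[EXTERNAL] [EXTERNAL]` tags.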
Paul, you wanted some real-world stats on IT and InfoSec staff being phished. Well, I’ve been doing IT/InfoSec for over 33 years, self-employed, as well as the past 19+ years at a large FI. I can assure you that IT/IS staff fall for these tests (and real attacks) all the time, but due to the nature of who they are, and often their status, they are rarely reported or required to take the same additional training, etc. That is, it’s swept under the rug, so to speak. I can personally say I have fallen for my own phishing test before. The very nature of the phish is that it’s not just about the content presented, but also about timing. I don’t care who you are or how careful you are: if the timing is right, you can and will take the bait. We are about to reach 1100+ employees, and as CISO, my method is very simple:
Training – Training – Training
Phish Test (learn from results and adjust training)
Training – Training – Training
(repeat)
Always remember, everyone learns differently; therefore training must be ongoing, but provided in a variety of formats to reach everyone. Talk with staff after training, get feedback, and adjust the training based on that feedback.
We do phishing training but, as far as I know, there’s no punitive measures for failure, rather just more training.
I’d argue that, in some companies, the problem is cultural. An example would be a C-level admin getting an email to wire money (very common) and the admin immediately responding to the request. Why? Because they’ve been ‘trained’ to respond quickly. They won’t ask questions because they’ll get screamed at if they don’t get the job done ‘right now’. Are they to blame for doing their job? It’s a bit of a grey area.
If the phisher is good at their job, or just a very precise spear-phish, they have a much higher chance of success. I think punishment needs to be examined on a case-by-case basis and, maybe, a cultural shift for those companies is in order.
I have seen a lot of comments about “training”, but not any about having “policies” in place, both for processes and training.
I would like to see policies that require an employee to scan all attachments (including VirusTotal[dot]com). I would like to see all obfuscated (or even ALL) URLs scanned by VirusTotal or similar.
I would REALLY like to see mail servers do that automatically and flag anything that fails to admins before delivering the email to the recipient.
Perhaps the hiring process should include testing of tech savvy skills before granting employment (that could be a whole can of worms there).
It would be great if we can design an efficient email system that segregates the email totally from the main network. DMZs and segmentation aren’t enough.
The problem goes beyond phishing. Teaching people when to click and when not to click is a much bigger issue. Think of cross-site request forgery. Think of researching something for work and accidentally hitting a bad URL that downloads a malicious payload. Where does it stop? Firing may not be the answer. Does the C-suite get treated the same when they click on a no-no?
“policies that require an employee”
Unless there is an “easy button” to do all the scanning, this is still way too technical to expect by policy.
“like to see mail servers do that automatically”
Yes, there are comments that mention this. Several, but hard to find amidst the many others.
There are several good email gateways that will sandbox attachments, run through AV engines, and reputation checks on URLs. Some even click through links like a spider to check for redirects (this backfires as some links contain identifying info in the URL).
Email segmentation? Not really a thing, because email messages and attachments do ultimately need to get to the user’s endpoint system.
Maybe a thin client that virtualizes the email client and isn’t persistent… but that is a networking nightmare.
Defense in depth means that defenders must always assume an attacker has already gotten phishing clicks. There are so many other security gaps that can be fixed to ensure no actual damage or loss.
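The automatic URL-checking idea raised above (scanning links via VirusTotal before delivery) hinges on one documented detail: the VirusTotal v3 API identifies a URL by its unpadded URL-safe base64 encoding. Here is a hedged sketch of the client-side half; the regex and sample body are illustrative, and the actual lookup (omitted) would be an authenticated GET to the `/api/v3/urls/{id}` endpoint with an API key:

```python
# Sketch: pull URLs out of a message body and compute the VirusTotal v3
# lookup identifier for each. The identifier scheme (unpadded urlsafe
# base64 of the URL) is from the VT v3 API docs; the network call itself
# is deliberately not shown, since it needs an API key.
import base64
import re

URL_RE = re.compile(r"https?://[^\s<>\"']+")  # simple illustrative pattern

def vt_url_id(url: str) -> str:
    """VirusTotal v3 URL identifier: unpadded URL-safe base64 of the URL."""
    return base64.urlsafe_b64encode(url.encode()).decode().rstrip("=")

def extract_url_ids(body: str) -> dict:
    """Map each URL found in the body to its VirusTotal lookup id."""
    return {u: vt_url_id(u) for u in URL_RE.findall(body)}

body = "Please verify your account at http://example.com/login today."
for url, vid in extract_url_ids(body).items():
    print(url, "->", vid)
```

A gateway doing this for every inbound message would also need to handle API rate limits and the caveat mentioned above: actively fetching links can trigger one-time-use or user-identifying URLs.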
I agree that punishment for failing the test is counterproductive and creates resentment. I would make an exception for that for IT admins. If a person with IT experience and high levels of IT access falls for a phishing test repeatedly, I think it should be punishable even with termination.
Many say that punishment gets in the way of education and I agree that it does, BUT at some point an organization has to go from investing in an employee to protecting itself against someone who cannot or will not do the right thing. What would you do if an employee was chronically absent due to taking care of an ill family member? We would be less than human if we did not commiserate with the situation but allowing it to continue would be detrimental to the organization and others in it.
We’ve had a phishing program for over 2 years. After a while, you start to notice that there are a handful of people who, despite the repeated training, continue to click on every phishing message. What do you do with a person who fails monthly phishing tests 23 times in two years? The question in our organization was “what does due care require?” As managers who know this information about certain employees, what should they do as responsible parties? Is it ok to leave these people in positions of financial trust? Or, based on this information, must they be moved to a less sensitive position? What if they continue to fail? At some point, is this not just another part of the job they are required to do and if they continue to fail, shouldn’t it be treated like any other aspect of their job they continue to fail at?
Our answer was yes. To that end, they are given additional training if they fail. We provide additional resources in the form of a diagram that must be posted in their workspace that describes how to evaluate phishing messages. If they fail again, they and their manager have to go to HR for negative counseling and a reprimand is placed in their personnel record. If they fail again within six months (this means 4 failures in 6 months), then they are terminated.
This is an activity that can be learned and most people do. More than most people, nearly everyone. For those that don’t, they represent a serious hazard to your company. That risk needs to be addressed. Sure, anyone can click on the odd phishing message, this isn’t about that. Perfection is not required. For those people who routinely fail, the answer may be termination.
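The escalation ladder this commenter describes (training, then a posted reference diagram, then HR counseling with a reprimand on record, then termination for repeated failure within six months) can be sketched as a small policy function. The source is ambiguous about whether termination follows the third or fourth failure, so this is one reading, and the action strings are illustrative:

```python
# Sketch of the escalation ladder described above. Thresholds follow one
# reading of the commenter's policy; action names are illustrative.
def escalation_action(failures: int, all_within_six_months: bool) -> str:
    """Map a running phishing-test failure count to the policy's response."""
    if failures <= 0:
        return "no action"
    if failures == 1:
        return "additional training and posted reference diagram"
    if failures == 2:
        return "HR counseling with manager; reprimand in personnel record"
    if all_within_six_months:
        return "termination"
    return "repeat HR counseling; six-month clock restarts"

print(escalation_action(1, False))  # additional training and posted reference diagram
print(escalation_action(3, True))   # termination
```

Encoding the ladder explicitly, even in pseudocode like this, forces the “what does due care require?” question the commenter raises to be answered consistently rather than manager by manager.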
I have mixed feelings about this. On the one hand, I think we need to get over the idea that we can fully protect our business from phishing attacks; anybody can fall for a well-crafted phishing attack, and I’ve worked with a lot of businesses that lack a control environment to prevent mass damage once an email account is compromised. On the other hand, I’ve also worked with businesses where top-level employees with massively elevated access give out their VPN and admin credentials basically any time they’re asked. I have no clue whether those individuals are untrainable or just not motivated to care, but regardless, as a corporate liability matter, I think it’s reckless to the point of creating potential legal liability to continue trusting those people with sensitive personal or contractually protected information. It should be a condition of performing your job duties that you understand phishing at least well enough not to fall for the same phishing message on two consecutive days and give away admin access to highly sensitive systems (no, that’s not an exaggeration).
I am a regional sysadmin at a small site that is part of a larger corporation. The company has sent out required CBT links in the past dealing with the issue. However, one of our local employees got caught by a phishing email and clicked on it. I had to re-image her machine and run anti-virus scans in our office. To reinforce the CBT, I asked our local manager if I could hold a site meeting, and he agreed. I made up a presentation and pointed out the usual places to look before clicking on links. After that meeting, I started to have users come to me with emails that they thought were suspicious. Most were OK, but they actually caught some phish. In our case, this local face-to-face training, reinforcing the CBT classes, has really made a difference. I would always stress training first. Some of these phishers are getting very good at their trade, and a strict punishment regime is not going to work. Now, if we started to see the same person becoming a problem over and over, then escalating the issue with that person’s manager might be in order. However, most employees want to do the right thing, and this hands-on approach to training really worked out for us.
I don’t agree with most here. This needs to be looked at as an organization/team. Stop with the us-versus-them. Security/technology staff are usually the ones calling for people to be fired; end users are usually the ones who think there should be no consequences for their actions.
Negative consequences, whether imposed as punishment or simply the natural result of a decision, are a part of all decisions. It is not right to shield people from that, nor is it helpful to the person or the organization. You make a mistake? Fine. But if you fail to do what is necessary to update your skills to keep company data safe, you need to be fired.
This should not be screamed from the rooftop, but should be a known result for repeated offenders or people who don’t show they care about what they did. It is a judgment call. No way to generalize it.
Positive reinforcement is awesome, and it has its rightful place, but it is not the be-all and end-all of behavior training. As a company, the organization needs to evaluate itself. Has it done everything it could to help its employees succeed at not falling for this stuff? Has it put in enough hardware and software to stop an attack in case there is a mistake? Has it been training users consistently and appropriately? Has it created a security-conscious culture, or just assumed people should know?
Help encourage (positive) responsibility, not a victim mentality.
I will put it this way. Be as positive as possible and negative as necessary.
Only if these companies are willing to and actually do fire executives for the same offenses. But we know that doesn’t happen so no, it shouldn’t be a fireable offense.
Anyone can be tricked or fooled or taken advantage of. I think we should be on the side of making users aware, giving them the best advice we can on indicators, and work doubly hard at the mail gateway, monitoring and endpoint to assist. At the end of the day we want to partner with the end users who are the victims and make sure they trust us enough to come to us for help.
Termination is a valid response to repeated failures. That really is not draconian, it is logical. If an individual repeatedly fails testing and remedial training, then they have to lose email. If they cannot work without email, then termination is the only logical choice. The stakes are much too high and the “industry” sales experts are flat wrong to think that is too punitive. If you cannot operate at a level that business requires today, you have to change professions. Lastly, this is not common at all. Companies work hard to help employees identify phishing emails or to seek assistance if they are not sure. It is a total last resort. But, it also sends a message. Harsh but true.
Cigna Health Ins/ Cigna Group does this.
We recently implemented a phishing program where I work. We have a VAST difference in computing skills, from hardly being able to log onto a computer, to very advanced, so that makes a program harder to design.
As far as accountability goes, I sat with the HR manager for an hour hashing through different scenarios. We came up with a 5-step failure scenario. At the 4th and 5th failure, the employee is brought into the HR department to talk about the mistake, and how that conversation goes determines the real penalty. We have some people here who are flippant about the whole program and treat it as a joke. Those people would probably just dismiss it in a discussion with HR, so in our opinion, it is just fine to fire them. It’s the ones who struggle with identification, and possibly computers in general, that we have to spend more time training.
One thing that is often overlooked: if you produce loads of corporate emails with clickable links that lead to sites requiring your password, it is easy to fall for phishing, at least if you are in a hurry. If instead you can tell people, “If it contains a link, it is phishing,” the training becomes simple.
I’m not confident that any anti-phishing awareness campaign that hinges on an absolutism like “must not contain a link!” will be very effective in the long view.
In fact, I can think of more than a few ways those kinds of platitudes are likely to blow up in your face later on.
Who here recalls “Look for the Lock!”/”Make sure it’s HTTPS!”/”Make sure it’s secure SSL!”?
And then the bad guys started using SSL-enabled sites, because those had become ubiquitous over time, of course.
Now all that advice amounts to is a false sense of security, and for most people those old habits die hard.