(UPDATE – Hord Tipton, executive director of (ISC)2, recently posted on the biggest data breaches of the past year. His analysis confirms that "…humans are still at the heart of great security successes – and, unfortunately, great security breaches…The lesson we learn from this year's breaches is that most of them were avoidable – even preventable – if humans had exercised best practices at the proper times.")
What was the nature of the attack on the security company RSA that they described as “extremely sophisticated” and an “advanced persistent threat”? Simply put, it was a fairly ordinary phishing email that was sent to RSA employees that contained an Excel spreadsheet with an embedded Adobe Flash exploit. At least one employee opened the attachment. The exploit allowed the attacker to install a backdoor and subsequently gain access to the information they were after.
I wrote about this here, discussing how password tokens (like RSA's) were just one factor in one layer of a multi-factor, multi-layered security process. At the time that post was written (shortly after the attack became public) we weren't sure about either the nature of the attack or exactly what was taken, but at this point it is pretty clear that the real weakness exploited at RSA is still out there, and it can't be fixed by a patch or an update. In fact, according to recent IT audits, this particular vulnerability is still present at most financial institutions…the employee. Or, more specifically, the under-trained-and-tested employee.
How do you address this threat? Sure, regular critical patch updates and anti-virus/anti-malware software are important, but the only way to mitigate the employee risk is through repeated testing and training. As far back as 2004, the FFIEC recognized social engineering as a threat, stating in their Operations booklet:
Social engineering is a growing concern for all personnel, and in some organizations personnel may be easy targets for hackers trying to obtain information through trickery or deception.
And as recently as this year, social engineering was addressed again in the FFIEC Internet Authentication guidance:
Social engineering involves an attacker obtaining authenticators by simply asking for them. For instance, the attacker may masquerade as a legitimate user who needs a password reset or as a contractor who must have immediate access to correct a system performance problem. By using persuasion, being aggressive, or using other interpersonal skills, the attackers encourage a legitimate user or other authorized person to give them authentication credentials. Controls against these attacks involve strong identification policies and employee training.
Most financial institutions already include some form of social engineering testing in their IT controls audits, typically as part of a penetration (pen) test. Auditors assessing social engineering control effectiveness will use various techniques to entice an employee to enter their network authentication credentials. Posing as a customer, an employee, or a trusted vendor, or even going to the extreme of constructing a website with the same look and feel as the institution's actual website, auditors have been extremely effective in getting employees to disclose information. In fact, the vast majority of the social engineering tests I've seen resulted in at least one employee disclosing confidential information, and in many cases 50% or more of employees handed over information. Although I believe this number is slowly declining, if the RSA breach taught us anything it was that all it takes is one set of disclosed credentials from one employee to compromise the organization.
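To make the two metrics in that last point concrete, here is a minimal sketch of how a tester might tally the results of a phishing-style social engineering test round. The data records, field names, and function names are all invented for illustration; they are not part of any real audit tool:

```python
# Hypothetical results from one social engineering test round.
# Each record: an employee ID and whether they disclosed credentials.
results = [
    {"employee": "e001", "disclosed": True},
    {"employee": "e002", "disclosed": False},
    {"employee": "e003", "disclosed": True},
    {"employee": "e004", "disclosed": False},
]

def disclosure_rate(records):
    """Fraction of tested employees who disclosed credentials."""
    if not records:
        return 0.0
    return sum(1 for r in records if r["disclosed"]) / len(records)

# The headline percentage auditors report (here, 2 of 4 employees)...
rate = disclosure_rate(results)

# ...but the RSA lesson is that the binary question matters more:
# a single disclosure is enough to compromise the organization.
compromised = any(r["disclosed"] for r in results)
```

The distinction between `rate` and `compromised` is the point: a falling disclosure percentage is encouraging, but any nonzero count is still a failing grade.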
So if neither social engineering nor the need for training and testing is a new concept to financial institutions, then why is this such a persistent problem? After all, most institutions have been conducting information security training for years. In fact, as part of their IT examinations, examiners have been required to "…review security guidance and training provided to ensure awareness among employees and contractors, including annual certification that personnel understand their responsibilities."
I think a big part of the challenge is that financial institution employees are specifically hired for their customer service skills: their willingness to help each other and the customer. These are exactly the personality traits you want in a customer-facing employee, but this helpful attitude is also exactly why financial institution employees are notoriously difficult to train on information security. (An excellent summary of this is found in a technical overview paper published by Predictive Index.) The same personality traits that make employees want to help are also correlated with a general lack of suspicion, and a little suspicion can be more useful in preventing social engineering attacks than all the formal training in the world.
Suspicion can't be taught, but adherence to policies and procedures can. And fortunately, one personality trait that is correlated with a helpful attitude is a willingness and ability to follow the rules. Perhaps the answer is to spend more training time making sure your employees know what is expected of them (as defined in your policies) and how they are expected to respond to requests for information, and less time discussing why (i.e., the current threat environment). Make sure you include social engineering testing as part of your annual IT audits, because this is the only way to measure the success of your training efforts. And if the testing results indicate that more training is necessary, repeat training and testing not just annually but as frequently as you have to until the test response rate is zero. Also, use the news of recent cyber-incidents as an opportunity to stage "what would you do in this circumstance" training exercises with your employees. In the end, this is one risk you'll never completely eliminate…the best you can hope for is that you don't become a training exercise for someone else!
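The "repeat training and testing until the response rate is zero" cycle above can be sketched as a simple loop over test rounds. This is purely illustrative; the per-round failure counts are invented, standing in for the results of real social engineering tests run between training sessions:

```python
# Number of employees who disclosed information in each successive
# test round (hypothetical data). Training happens between rounds.
rounds = [3, 1, 0]

def rounds_until_clean(failure_counts):
    """Count test/training cycles needed to reach a zero-disclosure round.

    Returns the 1-based cycle number of the first clean round, or None
    if every recorded round still had at least one disclosure (meaning:
    keep training and retesting).
    """
    for i, failures in enumerate(failure_counts, start=1):
        if failures == 0:
            return i  # clean round: training can move to a maintenance schedule
    return None

cycles = rounds_until_clean(rounds)
```

Here the institution needed three test/training cycles before a round came back clean; a `None` result is the signal to schedule another round rather than wait for the next annual audit.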