Author: Tom Hinkel

As author of the Compliance Guru website, Hinkel shares easy-to-digest information security tidbits with financial institutions across the country. With almost twenty years’ experience, Hinkel’s areas of expertise span the entire spectrum of information technology. He is also the VP of Compliance Services at Safe Systems, a community banking tech company, where he ensures that their services incorporate the appropriate financial industry regulations and best practices.
20 Sep 2011

FDIC Sues Bank Directors (again)

On June 19, 2009 Cooperative Bank in Wilmington, NC was closed by the North Carolina Commissioner of Banks and the FDIC.  Federal banking regulators are now suing Cooperative Bank’s chairman and eight members of the board of directors for more than $145 million for negligence and breaches of fiduciary duty.  One of the FDIC’s assertions in the suit is the claim that the “…bank materially deviated from its approved business plan”, and that it did not adequately control the risks.  But this is not the only instance; it’s merely the latest.

If you are a bank director or officer, and your bank fails, there is a 1 in 4 chance that the FDIC will sue you.  In fact, as of September 13, 2011, the FDIC has authorized suits in connection with 32 failed institutions against 294 individuals, with damage claims of at least $7.2 billion.  And it’s not just officers and directors being targeted…attorneys, accountants, appraisers, brokers and other professionals working on behalf of the bank can be held liable as well.  More importantly, the pace is increasing rapidly.  From 1986 through 2010 there were a total of 109 defendants named in lawsuits, but in just the first 8 months of 2011, 185 have been named.

The FDIC regulations defining officer and director obligations are explained here, and the key concept is something called the “duties of loyalty and care.”

 The duty of loyalty requires directors and officers to administer the affairs of the bank with candor, personal honesty and integrity. They are prohibited from advancing their own personal or business interests, or those of others, at the expense of the bank.

 The duty of care requires directors and officers to act as prudent and diligent business persons in conducting the affairs of the bank.

But the guidance states that the FDIC will not bring civil suit if it finds that they’ve…

  1. “…made reasonable business judgments…
  2. …on a fully informed basis, and…
  3. …after proper deliberation.”

If you are an officer or director, preventing a lawsuit in the first place is far preferable to having to defend yourself after being named, and prevention is entirely predicated on being able to demonstrate that you’ve properly exercised your duties.  Exercising your duties means making reasonable business decisions after proper deliberation.  The key to proper deliberation is that you be fully informed, and that requires accurate, timely and relevant information.  Not just data, but actionable information.

I’ve written before about how technology (specifically automation) can enable and/or enhance your compliance efforts, particularly in the effort to extract useful information from mountains of data.  I’ve also discussed how management committees like the IT committee and the audit committee can provide both a forum for the exchange of information, and documentation that the exchange took place.  And don’t underestimate the value of having outside expertise on those committees.  Not only can it add a different perspective, it can also help document that you are making an effort to be “fully informed” and that you are “properly deliberating”.

Now here is a question to ponder…if the regulators are found to have been at least partially liable for the failure of an institution, can they be named as a party to the lawsuit?   In my next post I’ll take a look at some recent Material Loss Reviews, and examine the regulator mandate of “Prompt Corrective Action”.  In the meantime, what do you think…can the FDIC be both a plaintiff and a defendant?

14 Sep 2011

The current single biggest security threat to financial institutions – UPDATE

(UPDATE – Hord Tipton, executive director of (ISC)2, posted recently on the biggest data breaches of the past year.  His analysis confirms that “…humans are still at the heart of great security successes – and, unfortunately, great security breaches…The lesson we learn from this year’s breaches is that most of them were avoidable – even preventable – if humans had exercised best practices at the proper times.”)

What was the nature of the attack on the security company RSA that they described as “extremely sophisticated”  and an “advanced persistent threat”?  Simply put, it was a fairly ordinary phishing email that was sent to RSA employees that contained an Excel spreadsheet with an embedded Adobe Flash exploit.  At least one employee opened the attachment.  The exploit allowed the attacker to install a backdoor and subsequently gain access to the information they were after.

I wrote about this here, discussing how password tokens (like RSA’s SecurID) were just one factor in one layer of a multi-factor, multi-layered security process.  At the time the post was written (shortly after the attack became public) we weren’t sure about either the nature of the attack, or exactly what was taken, but at this point it is pretty clear that the real weakness that was exploited at RSA is still out there, and it can’t be fixed by a patch or an update.  In fact, according to recent IT audits, this particular vulnerability is still present at most financial institutions…the employee.  Or more specifically, the under-trained-and-tested employee.

How do you address this threat?  Sure, regular critical patch updates and Anti-virus/Anti-malware software are important, but the only way to mitigate the employee risk is through repeated testing and training.  As far back as 2004 the FFIEC recognized social engineering as a threat, stating in their Operations booklet:

Social engineering is a growing concern for all personnel, and in some organizations personnel may be easy targets for hackers trying to obtain information through trickery or deception.

And social engineering is mentioned again as recently as this year, in the FFIEC Internet Authentication guidance:

Social engineering involves an attacker obtaining authenticators by simply asking for them. For instance, the attacker may masquerade as a legitimate user who needs a password reset or as a contractor who must have immediate access to correct a system performance problem. By using persuasion, being aggressive, or using other interpersonal skills, the attackers encourage a legitimate user or other authorized person to give them authentication credentials. Controls against these attacks involve strong identification policies and employee training.

Most financial institutions already include some form of social engineering testing in their IT controls audits, typically as part of a penetration, or “pen”, test.  Auditors assessing social engineering control effectiveness will use various techniques to entice an employee to enter their network authentication credentials.  Posing as a customer, an employee, or a trusted vendor, or even going to the extreme of constructing a website with the same look and feel as the institution’s actual website, auditors have been extremely effective in getting employees to disclose information.  In fact, of all the social engineering tests I’ve seen, the vast majority resulted in at least one employee disclosing confidential information, and in many cases 50% or more of employees handed over information.  Although I believe this number is slowly declining, if the RSA breach taught us anything it was that all it takes is one set of disclosed credentials from one employee to compromise the organization.

So if neither social engineering nor the need for training and testing is a new concept to financial institutions, why is this such a persistent problem?  After all, most institutions have been conducting information security training for years.  In fact, as part of their IT examinations, examiners have been required to “…review security guidance and training provided to ensure awareness among employees and contractors, including annual certification that personnel understand their responsibilities.”

I think a big part of the challenge is that financial institution employees are specifically hired for their customer service skills: their willingness to help coworkers and customers.  These are exactly the personality traits that you want in a customer-facing employee.  But this helpful attitude is also exactly why financial institution employees are notoriously difficult to train on information security.  (An excellent summary of this is found in a technical overview paper published by Predictive Index.)  The same personality traits that make employees want to help are also correlated with a general lack of suspicion.  And a little suspicion can be more useful in preventing social engineering attacks than all the formal training in the world.

Suspicion can’t be taught, but adherence to policies and procedures can.  And fortunately, one personality trait that is correlated with a helpful attitude is a willingness and ability to follow the rules.  Perhaps the answer is to spend more training time making sure your employees know what is expected of them (as defined in your policies) and how they are expected to respond to requests for information, and less time discussing why (i.e. the current threat environment).  Make sure you include social engineering testing as part of your annual IT audits, because this is the only way to measure the success of your training efforts.  And if the testing results indicate that more training is necessary, repeat training and testing not just annually but as frequently as you have to, until the test response rate is zero.  Also, use the news of recent cyber-incidents as an opportunity to stage “what would you do in this circumstance” training exercises with your employees.  In the end this is one risk you’ll never completely eliminate…the best you can hope for is that you don’t become a training exercise for someone else!

08 Sep 2011

Exam preparation – less equals more?

One of the more surprising findings from my recent examination experience survey (thanks again to all who participated!) is that there doesn’t seem to be a direct relationship between the amount of time spent preparing and examination results. I’ll elaborate in a moment, but first, here are the final survey demographics:

  • There were 80 total respondents
  • FDIC was the most prominent regulator (80%), but institutions representing all the other PFRs (OTS, OCC, Federal Reserve and NCUA) also responded.
  • Institutions from 20 different states responded, resulting in a pretty good geographic cross-section.
  • The majority of respondents were under $500M, but we also got a handful >$1B.
  • 25% were de novo institutions (less than 5 years old).

So what we found was that most institutions spent quite a bit of time preparing for their last IT examination.  57% of you spent more than 5 hours, but interestingly enough, it really didn’t translate into better results.  Although 73% of those felt they were very prepared for the exam, less than half felt that the exam went pretty much as expected, with 9% describing their last examination as a “nightmare”!  By contrast, only 5% of those who spent less than 5 hours preparing felt the same way.  But perhaps the most significant statistic is the average IT composite score.  Those who spent more than 5 hours preparing averaged a score of 1.85, as opposed to 1.76 for those who spent less than 5 hours.

So is the conclusion that, as far as preparation goes, less equals more?  I think a better way to interpret the data is that it’s better to work smarter than harder.  Consider this: those of you who used an outside consultant to assist with the pre-examination questionnaire seemed to have a much more favorable experience overall.  90% of you felt that the examination experience was either not bad, or pretty much as expected.  But more significantly, those who used outside help also got better IT composite scores, averaging a 1.69 versus 1.82 for all respondents!

31 Aug 2011

Online Transactions – Defining “Normal”

I’ve gotten several inquiries about this since I last posted so I thought I’d better address it.  The new FFIEC authentication guidance requires you to conduct periodic risk assessments, and to apply layered controls appropriate to the level of risk.  Transactions like ACH origination and interbank transfers involve a generally higher level of risk to the institution and the customer, and as such require additional controls.  But here’s the catch…given the exact same product with the exact same capabilities one customer’s normal activity is another customer’s abnormal.  So defining normal is critical to identifying your abnormal, or “high-risk”, customers.

Most Internet banking software has built-in transaction monitoring or anomaly detection capabilities, and vendors whose products don’t are scrambling to add them in the wake of the guidance.  As the guidance states:

“Based upon the incidents the Agencies have reviewed, manual or automated transaction monitoring or anomaly detection and response could have prevented many of the frauds since the ACH/wire transfers being originated by the fraudsters were anomalous when compared with the customer’s established patterns of behavior.”

So automated anomaly detection systems can be a very effective preventive, detective and responsive control.  But I think there is a very real risk that a purely automated system may not be enough, and may even make the situation worse in some cases.  For one thing, any viable risk management solution must strike a balance between security and usability.  A highly secure automated anomaly detection and prevention system may be so tightly tuned that it becomes a nuisance to the customer or a burden to the institution.  Customers are already reluctant to accept any constraints on usability, even when those constraints are presented as being in their best interest.  And if your requirements are just a little more burdensome than your competitor’s, you risk losing the customer to them.  It’s an interesting paradox…you implement additional controls to protect customers, and lose them to a (potentially) less secure competitor!

But another way a purely automated solution may not achieve the desired result is that it may actually lull the institution into a false sense of security.  I’ve already heard this in my discussions with our customers…”My vendor says they will fully comply with the new guidance, so I’m counting on them.”  And indeed the vendors are all saying “Don’t worry, we’ve got this…”.  But do they?  In at least one incident, transaction monitoring did not stop an account take-over because according to the automated systems the fraudulent transactions were all within the range of “normal”.

So what more should you do?  One thing is to make sure that you don’t rely solely on your vendor to define “normal”.  Just as with information security, you can, and (because of your reliance on the vendor) should, outsource many of the risk management controls.  But since you cannot outsource the responsibility for transaction security, you must take an active role with your vendor by sharing responsibility for monitoring.  One way to do this is to participate in setting the alert triggers.  For example, a high number of account inquiries may trigger an automated anomaly alert, but really doesn’t carry a high risk of loss.  (However, inquiries could be indicative of the early stages of an account takeover, so they shouldn’t be completely ignored either.)  On the other hand, a slight increase in interbank transfers may not trigger an alert, but could carry a potentially large loss.  Rank the capabilities of each product by risk of loss, and work with your vendor to set anomaly alerts accordingly.
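To make the idea concrete, here is a minimal sketch of ranking product capabilities by risk of loss and tightening the anomaly trigger accordingly.  The capability names, dollar figures, and the interpolation formula are all hypothetical illustrations, not any vendor’s actual system or API:

```python
# Hypothetical sketch: rank online-banking capabilities by estimated risk of
# loss, then derive anomaly-alert thresholds so that the riskiest
# capabilities get the tightest triggers. All names and figures are
# illustrative only.

CAPABILITIES = {
    # capability: rough estimated loss per fraudulent event, in dollars
    "account_inquiry":    100,      # low direct loss, but a possible takeover precursor
    "bill_pay":           5_000,
    "interbank_transfer": 50_000,
    "ach_origination":    250_000,
}

def alert_threshold(risk_of_loss, loose=3.0, tight=1.0):
    """Interpolate between a loose and a tight alert trigger (expressed as
    standard deviations from the customer's 'normal' activity): the higher
    the potential loss, the sooner the alert fires."""
    riskiest = max(CAPABILITIES.values())
    frac = risk_of_loss / riskiest
    return loose - (loose - tight) * frac

# Riskiest capabilities first, each with its derived trigger.
for cap, risk in sorted(CAPABILITIES.items(), key=lambda kv: -kv[1]):
    print(f"{cap:20s} risk=${risk:>9,}  alert at {alert_threshold(risk):.2f} std devs")
```

Note how this mirrors the paragraph above: account inquiries get a loose trigger (high volume, low direct loss), while ACH origination alerts on even small deviations from the customer’s established pattern.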

Once you’ve established “normal” ranges for your products by capability, and set the anomaly triggers, your vendor should be able to generate reports for you showing deviations from normal for each product.  The next step is to separately assess each customer that falls outside those normal ranges.  Anomaly triggers for these customers should necessarily be set more tightly, and your vendor should be able to provide deviation reports for those as well.  By regularly reviewing these reports you are demonstrating a shared security responsibility approach, and most of all, demonstrating an understanding of both the letter and spirit of the guidance.

Remember, although your vendor can help, “normal” transaction frequency and dollar amounts must be defined by you based on your understanding of the nature and scope of your on-line banking activities.

22 Aug 2011

Risk Assessing Internet Banking – Two Different Approaches

One of the big “must do” take-aways from the updated FFIEC Authentication Guidance was the requirement for all institutions to conduct risk assessments.  Not just prior to implementing electronic banking services, but periodically throughout the relationship if certain factors change, such as:

  • changes in the internal and external threat environment, including those discussed in the Appendix to this Supplement;
  • changes in the customer base adopting electronic banking;
  • changes in the customer functionality offered through electronic banking;
  • and actual incidents of security breaches, identity theft, or fraud experienced by the institution or industry.

The guidance also mandates annual re-assessments even if none of these factors change, but given the increasingly hostile on-line environment it’s really a question of ‘when’ actual incidents will occur, not ‘if’.  That being the case, if you only update your risk assessment annually, the regulators could reasonably take the position that you’re not doing it often enough.

So risk assessments must occur “routinely”, but what is the best way to approach them?  Although the guidance does not specify a particular approach, it might be instructive to look at what the FFIEC has to say about Information Security and Disaster Recovery, both of which require (separate) risk assessments.  In both cases the FFIEC encourages you to approach the task by analyzing the probability and impact of the threat, not the nature of the threat.  This makes perfect sense.  By shifting the focus of your risk assessment off of the moving target of the constantly changing threat environment, and on to strengthening the overall security of your Internet-based services1, you can build a secure transaction environment that will scale and evolve as you grow.  Here is the critical difference between the two approaches: if you take a “nature-of-the-threat” approach, you must list every possible specific threat, both existing and reasonably anticipated2.  That approach doesn’t work very well for disaster recovery or information security risk assessments, and in my opinion it is not the best approach for Internet banking either.

Although it is certainly not the only way to do the risk assessment, I would recommend a 2-step approach that addresses most, if not all, of the updated FFIEC guidelines.  Step 1 is to assess the overall risk of your products by listing the capabilities and controls for each one.  As part of that step you would determine how many customers use the product, and how many of those you consider to be “high-risk”, as defined by high transaction frequency and high dollar amounts.  In Step 2 you list the high-risk customers you identified in Step 1 separately, along with the associated controls you plan to implement for each one.
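The two steps might be organized along these lines.  This is only a sketch of the structure; the product name, the “normal” limits, and the control names are hypothetical placeholders that each institution would define for itself:

```python
# Hypothetical sketch of the 2-step Internet banking risk assessment
# described above. Product names, limits, and controls are illustrative.

from dataclasses import dataclass, field

@dataclass
class Product:
    name: str
    capabilities: list
    controls: list                 # baseline controls for all customers
    high_risk_customers: list = field(default_factory=list)

def step1_assess(product, activity, freq_limit, dollar_limit):
    """Step 1: for one product, flag customers whose transaction frequency
    or dollar volume exceeds the institution-defined 'normal' range."""
    for customer, (freq, dollars) in activity.items():
        if freq > freq_limit or dollars > dollar_limit:
            product.high_risk_customers.append(customer)

def step2_controls(product, extra_controls):
    """Step 2: list each high-risk customer separately, with the additional
    controls planned on top of the product's baseline controls."""
    return {c: product.controls + extra_controls
            for c in product.high_risk_customers}

# Example: one product, two customers, monthly (frequency, dollars) activity.
ach = Product("ACH origination", ["originate ACH files"], ["dual control"])
activity = {"Acme Co": (40, 500_000), "Smith LLC": (2, 1_000)}
step1_assess(ach, activity, freq_limit=20, dollar_limit=100_000)
plan = step2_controls(ach, ["out-of-band verification", "daily review"])
print(plan)   # only Acme Co exceeds the normal ranges
```

The point of the structure is that Step 2’s customer-level control plan falls directly out of Step 1’s product-level assessment, which is the documentation trail examiners will want to see.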

Again, there is no one single way to do this correctly.  Whatever you do should be consistent with the size and complexity of your institution, and the nature and scope of your Internet banking operations.  Good luck!

 

1 Although other regulations and guidelines address financial institutions’ responsibilities to protect customer information and prevent identity theft, this guidance specifically addresses Internet authentication, and should be the primary focus of this risk assessment.

2 You must still re-assess if either you or the industry experience any actual incidents, but instead of adding a new threat to your risk assessment, you simply determine if your existing control environment is sufficient to address the impact of the threat. In other words, you re-assess for the impact, not the nature of the threat.

09 Aug 2011

Examination Experience Survey – preliminary results

Although the survey is still open, I wanted to discuss one particular trend that I find interesting.  (If you’ve already participated, thank you!  Please pass the link on to a colleague at another institution.  If you haven’t had a chance to fill it out, please do so.  The survey will remain open until 8/19).

One of the questions is “During your last examination, did you challenge any of the findings with the examiner?”  So far, 41% of you have challenged findings…

 

…and of those that did, almost 70% were successful getting the finding removed or modified in the final exit report…

I was surprised by a couple of things.  First, that so many of you actually challenged the examiners.  I think this is a direct result of proper examination preparation.  Fully 85% of you felt that your examination experience was either “pretty much as expected” or “a few curve balls, but not bad overall”.  This makes perfect sense…proper preparation leads to fewer findings, which leads to confidence that you’re doing the right things, and that makes it easier to stand up for what you are doing even though it may differ slightly from examiner expectations.  The key is in understanding the root cause of the examiner finding.

So I was also surprised that the number of successful challenges wasn’t even higher.  Even if your procedures differ from expectations, if you can demonstrate that you are still effectively addressing the root cause, you will usually have success getting the finding removed or modified in the final report.  This next statistic may be telling in that regard…even though 73% of you used an outside consultant to assist with exam questionnaire preparation, only 41% used a consultant to assist with post-exam responses.

Again, the survey will remain open until 8/19, and I’ll be posting additional findings shortly thereafter.  Stay tuned!