Category: From the Field

06 Oct 2011

Material Loss Reviews: Does responsibility = liability?

I asked in my previous post whether regulators should share any of the blame when institutions fail, and if so, whether they should shoulder any of the liability. The thought occurred to me as I was reviewing some recent Material Loss Reviews.

A Material Loss Review (MLR) is a post-mortem written by the Office of Inspector General of the federal regulator with oversight responsibility whenever an institution fails and the loss to the deposit insurance fund is considered to be “material”. (The threshold for determining whether a loss is material was recently increased by the Dodd-Frank Act from $25 million to $200 million, so we are likely to see fewer of these MLRs going forward.) All MLRs have a similar structure. There is an executive summary up front, followed by a breakdown of capital and assets by type and concentration. But there is also a section that analyzes the regulator’s supervision of the financial institution, and I noticed a recurring theme in this section:

  • …(regulator) failed to adequately assess or timely identify key risks…until it was too late.
  • …(regulator) did not timely communicate key risks…
  • …Regulator should have taken “a more conservative supervisory approach”, and used “forward-looking supervision”.
  • …examiners identified key weaknesses…but…they did not act on opportunities to take earlier and more forceful supervisory action.
  • …serious lapse in (regulator’s) supervision
  • …(regulator), in its supervision…did not identify problems with the thrift
  • …(regulator) should have acted more forcefully and sooner to address the unsafe and unsound practices
  • …(regulator) did not fully comply with supervisory guidance

There were also many references to the responsibilities of the Board, which I addressed here, but in almost every case the regulator was found at least partially responsible for the failure of the institution.

Here is where you can find the reports for each regulator:

I encourage you to take a look at these and draw your own conclusions on the issues of responsibility and liability. But clearly there are lessons to be learned from any failure, and one I think we should all take from this is that regulators will be pressured to be much more critical going forward (i.e., quicker to apply “Prompt Corrective Action”). After all, no one likes to be called out for doing a bad job.

One other part I found interesting (in the sense that it perfectly fits the narrative) is where each review lists all examination CAMELS ratings in the periods immediately prior to the failure. What struck me was how many institutions scored 1s and 2s just prior to the failure, and then dropped immediately to 4s and 5s in a single examination cycle. Again, the lesson is that there will be tremendous downward pressure on CAMELS scores. And don’t think that just because you are healthy you’re immune from the additional scrutiny. As one MLR stated, “…a forceful supervisory response is warranted, even in the presence of strong financial performance.”

08 Sep 2011

Exam preparation – less equals more?

One of the more surprising findings from my recent examination experience survey (thanks again to all who participated!) is that there doesn’t seem to be a direct relationship between the amount of time spent preparing and examination results. I’ll elaborate in a moment, but first, here are the final survey demographics:

  • There were 80 total respondents
  • FDIC was the most prominent regulator (80%), but institutions representing all the other PFRs (OTS, OCC, Federal Reserve, and NCUA) also responded.
  • Institutions from 20 different states responded, resulting in a pretty good geographic cross-section.
  • The majority of respondents were under $500M, but we also got a handful >$1B.
  • 25% were de novo (less than 5 years old).

So what we found was that most institutions spent quite a bit of time preparing for their last IT examination: 57% of you spent more than 5 hours. Interestingly enough, it really didn’t translate into better results. Although 73% of those felt they were very prepared for the exam, less than half felt that the exam went pretty much as expected, and 9% described their last examination as a “nightmare”! By contrast, only 5% of those who spent less than 5 hours preparing felt the same way. But perhaps the most significant statistic is the average IT composite score. Those who spent more than 5 hours preparing averaged a 1.85, as opposed to a 1.76 for those who spent less than 5 hours (remember, composite ratings run from 1 to 5, and lower is better).

So is the conclusion that, as far as preparation goes, less equals more? I think a better way to interpret the data is that it’s better to work smarter, not harder. Consider this: those of you who used an outside consultant to assist with the pre-examination questionnaire seemed to have a much more favorable experience overall. 90% of you felt that the examination experience was either not bad or pretty much as expected. But more significantly, those who used outside help also got better IT composite scores, averaging a 1.69 versus 1.82 for all respondents!
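
For the curious, here is a minimal sketch (in Python) of the kind of segmentation behind those group averages. The response records below are hypothetical placeholders, purely for illustration; none of this is the actual survey data.

    # Segment survey responses on the 5-hour preparation threshold and
    # compare average IT composite scores (1 = best, 5 = worst).
    # NOTE: these records are hypothetical, for illustration only.
    from statistics import mean

    responses = [
        {"prep_hours": 8,  "composite": 2},
        {"prep_hours": 2,  "composite": 1},
        {"prep_hours": 10, "composite": 2},
        {"prep_hours": 4,  "composite": 2},
        {"prep_hours": 6,  "composite": 1},
    ]

    # Split respondents on the 5-hour threshold used in the post
    more_prep = [r["composite"] for r in responses if r["prep_hours"] > 5]
    less_prep = [r["composite"] for r in responses if r["prep_hours"] <= 5]

    print(f"Avg composite, more than 5 hours prep: {mean(more_prep):.2f}")
    print(f"Avg composite, 5 hours or less prep:   {mean(less_prep):.2f}")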

09 Aug 2011

Examination Experience Survey – preliminary results

Although the survey is still open, I wanted to discuss one particular trend that I find interesting. (If you’ve already participated, thank you! Please pass the link on to a colleague at another institution. If you haven’t had a chance to fill it out, please do so. The survey will remain open until 8/19.)

One of the questions is “During your last examination, did you challenge any of the findings with the examiner?”  So far, 41% of you have challenged findings…


…and of those who did, almost 70% were successful in getting the finding removed or modified in the final exit report…

I was surprised by a couple of things. First, that so many of you actually challenged the examiners. I think this is a direct result of proper examination preparation. Fully 85% of you felt that your examination experience was either “pretty much as expected” or “a few curve balls, but not bad overall.” This makes perfect sense…proper preparation leads to fewer findings, which leads to confidence that you’re doing the right things, and that makes it easier to stand up for what you are doing even when it differs slightly from examiner expectations. The key is understanding the root cause of the examiner finding.

Second, I was surprised that the number of successful challenges wasn’t even higher. Even if your procedures differ from expectations, if you can demonstrate that you are still effectively addressing the root cause, you will usually have success getting the finding removed or modified in the final report. This next statistic may be telling in that regard: even though 73% of you used an outside consultant to assist with exam questionnaire preparation, only 41% used a consultant to assist with post-exam responses.

Again, the survey will remain open until 8/19, and I’ll be posting additional findings shortly thereafter.  Stay tuned!


21 Jul 2011

BCP plans continue to draw criticism

In a recent FDIC IT examination, the examiner made the following criticism of the institution’s DR/BCP:

“Business continuity planning should focus on all critical business functions that need to be recovered to resume operations. Continuity planning for technology alone should no longer be the primary focus of a BCP, but rather viewed as one critical aspect of the enterprise-wide process. The review of each critical business function should include the technology that supports it.” (bold is mine)

This is not the first time we’ve seen this finding, nor is it a new direction for regulators; it follows directly from the 2008 FFIEC Business Continuity Planning Handbook, which states:

“The business continuity planning process involves the recovery, resumption, and maintenance of the entire business, not just the technology component. While the restoration of IT systems and electronic data is important, recovery of these systems and data will not always be enough to restore business operations.”

I still see way too many DR plans that focus on the recovery of technology instead of the recovery of the critical processes the technology supports. Sure, technology is an interdependency of nearly every function you provide, but it must not be the primary focus of your recovery effort. Focus instead on recovering the entire process (teller, CSR, lending, funds management, etc.) by recognizing that each process is nothing more than the sum of its interdependencies. For example, what does it take to deliver typical teller functionality?

  • A physical facility for customers to visit
  • A trained teller
  • A functional application, consisting of:
    • A workstation
    • A printer
    • A database, requiring:
      • LAN connectivity
      • WAN (core) connectivity, requiring:
        • Core functionality
      • A server, requiring:
        • Access rights
      • etc.
    • etc.
  • etc.

As you can see, technology certainly plays a very important role, but it is not the only critical aspect of the process. All sub-components must work, and work together, for the overall process to work. Mapping out your processes through a workflow analysis is an excellent way to get your arms around all of the interdependencies, and even a simple model like the sketch below can make the point concrete.
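
Here is a minimal sketch, in Python, of the teller process modeled as a dependency tree. The component names mirror the list above; the class itself and the availability flags are hypothetical, just to illustrate that a process is only as recoverable as the weakest of its interdependencies.

    # Model a business process as a tree of interdependencies.
    # A process is recoverable only if every sub-component is available.
    # (Hypothetical model for illustration; not from any FFIEC guidance.)
    from dataclasses import dataclass, field

    @dataclass
    class Component:
        name: str
        available: bool = True
        dependencies: list = field(default_factory=list)

        def recoverable(self) -> bool:
            # The process is the sum of its interdependencies:
            # this node and every node beneath it must be available.
            return self.available and all(d.recoverable() for d in self.dependencies)

    # Build the teller function from the interdependencies listed above
    core     = Component("Core functionality")
    wan      = Component("WAN (core) connectivity", dependencies=[core])
    database = Component("Database", dependencies=[
        Component("LAN connectivity"),
        wan,
        Component("Server", dependencies=[Component("Access rights")]),
    ])
    teller = Component("Teller function", dependencies=[
        Component("Physical facility"),
        Component("Trained teller"),
        Component("Teller application", dependencies=[
            Component("Workstation"),
            Component("Printer"),
            database,
        ]),
    ])

    print(teller.recoverable())   # True: everything is up
    core.available = False        # simulate losing core connectivity
    print(teller.recoverable())   # False: the whole teller process fails

Losing any single leaf (core connectivity in this example) takes down the entire teller function, which is exactly why a technology-only recovery plan misses the point.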

So next time you perform the annual review of your BCP (and you do review your plan annually, right?), make sure your IT department isn’t the only one in the room!

27 Jun 2011

Audits vs. Examinations

As I speak with those in financial institutions responsible for responding to audit and examination requests, I find there is considerable confusion over the differences between the two. And some of this confusion is understandable…there is certainly some overlap between them, but there are also considerable differences in the nature and scope of each. It may sometimes seem as if you are being asked to comply with two completely different standards. How often has the auditor cited findings on something you’ve never been asked about during an examination? And how often has an examiner thrown you a curve ball seemingly out of left field?

In a perfect world, shouldn’t the audit be nothing more than preparation for the examination? The scope of the audit should be no more and no less than what you need to get past the examination. Any more and you feel as though you’ve wasted resources (time and money); any less and you haven’t gotten your money’s worth, right? Well…actually, no. While the two have the same broad goal of assessing alignment with a set of standards, the audit will often use a broader set of industry standards and best practices, because the FFIEC guidance is so general and non-prescriptive. For example, take one of the questions in the FDIC Information Technology Officer’s Pre-Examination Questionnaire:

“Do you have a written information security program designed to manage and control risk (Y/N)?”

Of course the correct answer is “Y”, but since the FDIC doesn’t provide an information security program template, how do you know that your program will be acceptable to the regulators? You know because your IT auditor has examined your InfoSec program and compared what you have done to existing IT best practices and standards, such as COBIT, ITIL, ISO 27001, SAS 94, NIST, and perhaps others. While this doesn’t guarantee that your institution won’t have examination findings, it will reduce both their probability and their severity. This point is critical to understanding the difference between an audit and an examination: an audit will identify, and allow you to correct, the root cause of potential examination findings prior to the examination. So using the example above, even if the examiner has findings related to your information security program, they will be related to how you addressed the root cause, not whether you addressed it. (I’m defining root cause as anything found in the Examination Procedures.) In fact, the FFIEC recognizes the dynamic between the IT audit and examination processes this way:

“An effective IT audit function may also reduce the time examiners spend reviewing areas of the institution during examinations.”

And reduced time (usually) equals fewer curve balls, and a less stressful examination experience!

07 Jun 2011

SAR Filings – Computer Intrusion vs. Identity Theft

The Financial Crimes Enforcement Network (FinCEN) publishes a statistical summary and review of all suspicious activity report (SAR) filings a couple of times per year. The latest one was just released in May, covering the 10-year period from 1/1/2001 through 12/31/2010. I thought it might be interesting to see how the category of Computer Intrusion (Part III, item 35 f) compared with Identity Theft (Part III, item 35 u) during that period:

As expected, reported incidents of identity theft increased sharply and remain relatively high today…no surprises there. What did surprise me, though, is how few computer intrusions have been reported over the past 5 years. (The initial blip in 2003 was due to the fact that when the Computer Intrusion category was added in 2000, it initially defined an intrusion as “gaining access or attempting to gain access to a computer system of a financial institution”. That meant that each time the firewall blocked an attempt, it had to be reported. This proved to be extremely labor-intensive, and the wording was changed in 2003 to define an intrusion as actually gaining access to the system.)

I suppose the lesson here* is that financial institutions are doing a far better job securing their networks than they are securing their customer data, which leads me to the conclusion that the vast majority of identity theft must be occurring outside the protected perimeter of the institutions’ networks. Remember, you must protect your data at every stage of its existence: during processing, in transit, and in storage, regardless of its physical or electronic nature.

By the way, the newest SAR form for depository institutions is here. It was just updated in March, and institutions must use it to replace the older form (dated July 2003) by September 30th of this year. I compared the two forms side by side to see what the differences were, but couldn’t find a single change besides the date, so I’m not sure why the new form is required. But it is.

*Of course, another possibility is that computer intrusions are simply being under-reported, but since most financial institutions are subjected to regular audits, penetration tests, and examinations, I believe the low incidence is probably accurate.