Tag: examination findings

18 Oct 2012

“2 is the new 1”…or is it? (with poll)

UPDATED – October 2012 – Two institutions in the past ten days have told me that they were assigned a CAMELS score of “1” in their latest examination.  One institution regained its 1 after slipping to a 2 in the last exam cycle, and the other earned a 1 for the first time.  The FDIC is the primary federal regulator for both institutions.  What is your experience?  (Original post below the polls)

[poll id="4"]

And while we’re asking for your input…

[poll id="5"]

During a recent panel discussion at our annual user conference, a banker quoted an examiner from their last examination.  The institution had slipped from a CAMELS 1 rating to a 2, and when they discussed the reasoning with the Examiner in Charge, they were told they should be satisfied with the 2, because “2 is the new 1”.

Just three years ago Tony Plath, a finance professor at the University of North Carolina at Charlotte, said that (at least for large banks) a CAMELS score of anything other than a “1” was cause for concern.  These days it almost seems that examiners are digging for anything they can find to justify NOT assigning the highest rating.  Indeed, I had a recent conversation with an FDIC examiner who said (off the record), “if we find anything at all to document during our examination, that is enough to disqualify them from a ‘1’ rating”.

Unlike the comparatively significant difference between a “2” and a “3”, the differences between a “1”, defined as “Sound in every respect”, and a “2”, defined as “Fundamentally sound”, are extremely subtle, and there is no clear line of demarcation between them.  Often it comes down to examiner opinion.

So pick your battles and push back where you can, but understand that although you should be familiar with the criteria for a “1” rating, and strive to achieve it, you should be quite satisfied with a “2”…at least for now.


23 May 2012

Patch deployment – now or later? (with interactive poll!)

We recently saw an examination finding recommending that “Critical Patches be deployed within 24 hours of notice (of patch release)”.  This would seem to contradict the FFIEC guidance in the Information Security Handbook, which states that the institution should:

“Apply the patch to an isolated test system and verify that the patch…

(1) is compatible with other software used on systems to which the patch will be applied,

(2) does not alter the system’s security posture in unexpected ways, such as altering log settings, and

(3) corrects the pertinent vulnerability.”

If this testing process is followed correctly, it is highly unlikely that it can be completed within 24 hours of patch release.  The rationale behind immediate deployment is that the risk of “zero-day exploits” is greater than the risk of installing an untested patch that may cause problems with your existing applications.  So the poll question is:
[poll id="2"]
Regardless of your approach, you’ll have to document the risk and how you plan to mitigate it.  With a “test first” approach you might increase end-user training and emphasize other controls such as firewall firmware, IPS/IDS, and anti-virus/anti-malware.  If you take a “patch first” approach, you may want to leave one un-patched machine in each critical department to allow at least minimal functionality in case something goes wrong.  You should also test the “roll-back” capabilities of the particular patch prior to full deployment.
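For institutions that automate any part of their patch tracking, the test-then-deploy criteria above can be reduced to a simple go/no-go checklist.  The sketch below is purely illustrative (the class, field names, and function are hypothetical, not part of any real patch-management tool); it models the three FFIEC verification steps plus the roll-back test as pass/fail gates.

```python
# Hypothetical sketch of a "test first" patch approval gate, loosely modeled
# on the three FFIEC verification steps quoted above plus a roll-back test.
# All names and fields are illustrative -- not a real patch-management API.

from dataclasses import dataclass


@dataclass
class PatchTestResult:
    patch_id: str
    compatible_with_existing_software: bool  # step (1): compatibility
    security_posture_unchanged: bool         # step (2): e.g. log settings intact
    vulnerability_corrected: bool            # step (3): fixes the vulnerability
    rollback_verified: bool                  # roll-back tested before deployment


def approve_for_deployment(result: PatchTestResult) -> tuple[bool, list[str]]:
    """Return (approved, failed_checks) for a patch tested on an isolated system."""
    failures = []
    if not result.compatible_with_existing_software:
        failures.append("incompatible with existing software")
    if not result.security_posture_unchanged:
        failures.append("security posture altered unexpectedly")
    if not result.vulnerability_corrected:
        failures.append("pertinent vulnerability not corrected")
    if not result.rollback_verified:
        failures.append("roll-back capability not verified")
    return (len(failures) == 0, failures)


# A patch that passes every check is approved for full deployment.
ok, issues = approve_for_deployment(
    PatchTestResult("example-patch-001", True, True, True, True)
)
```

The point of the failure list is documentation: whatever the outcome, you have a written record of which criterion blocked (or cleared) the patch, which is exactly what you will need for the Board report and for any examiner discussion.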

I’ll be watching to see if this finding appears in other examinations, and also to see if the guidance is updated online.  Until then, because of the criticality of your applications and the required up-time of your processes, I believe a “test first” approach that adheres to the guidance is the most prudent course…for now.  However you manage it, be prepared to explain the why and how to the Board and senior management.  Not only are the results expected to be included in your annual Board report, documenting your reasoning may also help explain repeat findings in future examinations if your current approach differs from examiner expectations.

09 Aug 2011

Examination Experience Survey – preliminary results

Although the survey is still open, I wanted to discuss one particular trend that I find interesting.  (If you’ve already participated, thank you!  Please pass the link on to a colleague at another institution.  If you haven’t had a chance to fill it out, please do so.  The survey will remain open until 8/19).

One of the questions is “During your last examination, did you challenge any of the findings with the examiner?”  So far, 41% of you have challenged findings…


…and of those that did, almost 70% were successful in getting the finding removed or modified in the final exit report…

I was surprised by a couple of things.  First, that so many of you actually challenged the examiners.  I think this is a direct result of proper examination preparation.  Fully 85% of you felt that your examination experience was either “pretty much as expected” or “a few curve balls, but not bad overall”.  This makes perfect sense…proper preparation leads to fewer findings, which leads to confidence that you’re doing the right things, and that makes it easier to stand up for what you are doing even though it may differ slightly from examiner expectations.  The key is understanding the root cause of the examiner finding.

So I was also surprised that the number of successful challenges wasn’t even higher.  Even if your procedures differ from expectations, if you can demonstrate that you are still effectively addressing the root cause, you will usually have success getting the finding removed or modified in the final report.  This next statistic may be telling in that regard…even though 73% of you used an outside consultant to assist with exam questionnaire preparation, only 41% used a consultant to assist with post-exam responses.

Again, the survey will remain open until 8/19, and I’ll be posting additional findings shortly thereafter.  Stay tuned!