Category: Hot Topics

27 Sep 2016

FFIEC Rewrites the Information Security IT Examination Handbook

In the first update in over 10 years, the FFIEC just completely rewrote the definitive guidance on their expectations for managing information systems in financial institutions.  This was widely expected, as the IT world has changed considerably since 2006.

There is much to unpack in this new handbook, starting with what appears to be a new approach to managing information security risk. The original 2006 handbook put the risk assessment process up front, essentially conflating risk assessment with risk management.  But as I first mentioned almost 6 years ago, the risk assessment is only one step in risk management, and it’s not the first step.  Before risk can be assessed you must identify the assets to be protected and the threats and vulnerabilities to those assets.  Only then can you conduct a risk assessment.  The new guidance uses a more traditional approach to risk management, correctly placing risk assessment in the second slot:

  1. Risk Identification
  2. Risk Measurement (aka risk assessment)
  3. Risk Mitigation, and
  4. Risk Monitoring and Reporting

This is a good change, and it is also identical to the risk management structure in the 2015 Management Handbook.  It's also very consistent with the 4-phase process specified in the 2015 Business Continuity Handbook:

  1. Business Impact Analysis
  2. Risk Assessment
  3. Risk Management, and
  4. Risk Monitoring and Testing

Beyond that, here are a few additional observations (in no particular order):

More from Less:

  • The new handbook is about 30% shorter, at 98 pages compared with 138 in the 2006 handbook.


  • The new guidance contains 412 references to the word “should”, as opposed to 341 references previously.  This is significant, because compliance folks know that every occurrence of the word “should” in the guidance generally translates to the word “will” in your policies and procedures.  So the handbook is roughly 30% shorter, but increases regulator expectations by more than 20%!
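
For reference, the percentage changes implied by those counts work out like this (a quick sketch; the page and “should” counts are the ones cited above):

```python
# Comparing the 2006 and 2016 Information Security handbooks,
# using the page counts and "should" counts cited above.
pages_2006, pages_2016 = 138, 98
shoulds_2006, shoulds_2016 = 341, 412

reduction = (pages_2006 - pages_2016) / pages_2006     # pages dropped
growth = (shoulds_2016 - shoulds_2006) / shoulds_2006  # "shoulds" grew

print(f"{reduction:.0%} shorter, {growth:.0%} more shoulds")
# prints: 29% shorter, 21% more shoulds
```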

Cyber Focus:

  • “…because of the frequency and severity of cyber attacks, the institution should place an increasing focus on cybersecurity controls, a key component of information security.”  References to cybersecurity are scattered throughout the new handbook, which also devotes an entire section to the topic.

Assess Yourself:

  • There are 17 separate references to “self-assessments”, increasing the importance of utilizing internal assessments to gauge the effectiveness of your risk management and control processes.

Take Your Own Medicine:

  • Technology Service Providers to financial institutions will be held to the same set of standards:
    • “Examiners should also use this booklet to evaluate the performance by third-party service providers, including technology service providers, of services on behalf of financial institutions.”

The Ripple Effect:

  • The impact of this guidance will likely be quite significant, and will be felt across all IT areas.  For example, the Control Maturity section of the Cybersecurity Assessment Tool contains 98 references and hyperlinks to specific pages in the 2006 Handbook.  All of these are now invalid.  I’m sure we can expect an updated assessment tool from the FFIEC at some point in the not-too-distant future.  (Which will also necessitate changes to certain online tools!)
  • The new FDIC IT Risk Examination procedures (InTREx) also contains several references to the IT Handbook, although they are not specific to any particular page.

Regarding InTREx, I was actually hoping that the new IT Handbook and the new FDIC exam procedures would be more closely coordinated, but perhaps that’s too much to ask at this point.  In any case, the similarity between the 3 recently released Handbooks indicates increased standardization, and I think that is a good thing.  We will continue to dissect this document and report observations as we find them.  In the meantime, don’t hesitate to reach out with your own observations.

12 Jul 2016

FDIC Updates IT Examination Procedures

Starting immediately, all FDIC-examined institutions will be subjected to new IT examination procedures, the first major overhaul since December 2007.  The new format is dubbed the InTREx program (Information Technology Risk Examination), and is designed to be a bit simpler in the pre-examination phase.  In fact, the InTREx has only 26 questions vs. 59 for the 12/07 version.  But what the new version gives up in the pre-exam phase, it more than makes up for in the actual on-site examination portion.  I believe most institutions should prepare for a much more thorough (i.e. time-consuming) examination experience going forward.

The InTREx is based on the URSIT methodology (Uniform Rating System for Information Technology), which dates back to 1999.  URSIT consists of four main components used to assess the overall performance of IT management within an organization: Audit, Management, Development and Acquisition, and Support and Delivery (AMDS).  Additionally, InTREx adds an Expanded Analysis section for both Management and Support and Delivery.

First, the similarities:  Both the old and new model share a pre-exam and an on-site phase.  The pre-exam phase consists of the questionnaire, which is designed to help the examiner “scope” the examination (see #2 below), and determine exactly what documentation they will require from you (some of which they will request ahead of time, some will be requested on-site).

Once on-site, the differences between the old and new are more apparent.  The new exam procedures require examiners to “review” (47 instances) and “evaluate” (54 instances) your documentation, and “determine” (30 instances) whether it is sufficient to prove that you’re doing what you say you will do.  The examiner uses the “Core Analysis Procedures” to assess each “Decision Factor” as Strong, Satisfactory, Less than satisfactory, Deficient, or Critically deficient.  Examiners then assign a rating of 1 to 5 to each AMDS component, followed by an overall composite score.  All component ratings and scores, along with examination findings and recommendations, will appear in the final report.
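
A rough sketch of that rating structure (the assessment labels and AMDS component names come from the guidance; the roll-up logic here is purely illustrative, since examiners apply judgment, not a formula):

```python
# Illustrative model of the URSIT rating roll-up described above.
# Labels come from the guidance; the averaging is NOT the FDIC's actual
# method, just a stand-in for how component ratings feed a composite.
ASSESSMENTS = [
    "Strong", "Satisfactory", "Less than satisfactory",
    "Deficient", "Critically deficient",
]

# Hypothetical 1 (best) to 5 (worst) ratings for each AMDS component.
component_ratings = {
    "Audit": 2,
    "Management": 2,
    "Development and Acquisition": 1,
    "Support and Delivery": 3,
}

# Naive composite: the rounded average of the component ratings.
composite = round(sum(component_ratings.values()) / len(component_ratings))
print(f"Composite rating: {composite}")  # prints: Composite rating: 2
```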

Here is how the pre-exam and on-site phases break down in terms of type and volume of information requested:

  • The pre-exam phase is divided into 6 sections, with a total of 26 questions (most of which have an “If Yes…” portion, very similar to the 12/07 version):
    • Core Processing: 4
    • Network: 6
    • Online Banking: 4
    • Development and Programming: 1
    • Software and Services: 2
    • Other: 9
  • The on-site exam phase is where the new examiner procedures are defined, and is divided into the AMDS sections, plus the 2 Expanded Analysis sections.  Each section has Core Analysis Decision Factors and Core Analysis Procedures sub-sections:

    Component                                  Decision Factors   Procedures
    Audit                                      10                 8
    Management                                 8                  16
    Development and Acquisition                6                  9
    Support and Delivery                       8                  26
    Management: Expanded Analysis              6                  7
    Support and Delivery: Expanded Analysis    7                  8
    GLBA Information Security Standards*       1*                 0
    Cybersecurity*                             1*                 0

* These components are not assessed separately, but are scattered throughout the program.

  1. This is a much more granular process, requiring a deeper analysis by the examiner, which in turn puts a greater burden on the bank.  Proper documentation will often make the difference between a “satisfactory” and a “less than satisfactory” assessment.  If you prepared for previous exams by not just answering “Yes” or “No” to the pre-exam questions, but identifying all supporting documentation whether or not it was asked for, you should be fine with the InTREx.  If you were used to answering “Yes” or “No” with little or no examiner follow-up, download the InTREx now and focus on all the items in the Core Analysis Procedures sections.  Pay particular attention to the 34 “Control Test” items marked with the FDIC Control Test icon, and make sure you can get your hands on those items.  Again, being able to provide the documentation may make all the difference in your final exam score.
  2. The pre-exam portion of the questionnaire should (in theory) allow the exam to scale to the size and complexity of the institution.  We’ll have to wait and see if that actually occurs, but let’s hope so.  We’ve heard from far too many smaller institutions that said they felt their examiner treated them as if they were much larger.
  3. There is quite a bit of overlap between the elements in the InTREx and the Declarative Statements in the Cybersecurity Assessment Tool.  That should mean that actions taken to strengthen your cybersecurity control maturity will also strengthen your overall IT controls.  Also, cybersecurity elements are now permanently baked in to the IT examination process, not a separate assessment.  This is consistent with what I’ve been saying all along: that cybersecurity is simply a subset of information security.  However, as with the Control Tests, you should make sure you have documentation available for all items marked with the FDIC Cyber icon.
  4. Hopefully, one potentially positive outcome from all this will be a more consistent examination experience.  Inconsistent examination results have been a source of concern for many institutions recently.  This new process should address that by removing some of the subjectivity that results when different examiners interpret the same guidance differently, and also by clarifying precisely what resources they need to “review”, and “evaluate” in order to “determine” the level of compliance achieved. 
  5. It is uncertain how quickly the new format will be adopted by regulators, but I’m guessing it will be pretty quickly.  Since they will send the questionnaire 90 days prior to your scheduled exam, we should expect the new methodology to be implemented for exams conducted in Q4 2016 at the soonest.
  6. It’s also unclear whether the other (non-FDIC) regulators will adopt this format.  However, the FDIC insures deposits at all insured institutions (including those for which it is not the primary federal supervisor), so a single standard would make sense.  In any case, even non-FDIC institutions would be wise to familiarize themselves with this guidance.

Shoot me an email if and when you get the new InTREx, and let me know your experiences with it.  I’ll update the post periodically.

20 Apr 2016

FDIC Targets Board Responsibilities

“A topic is at times of such significant interest to bankers and examiners that it warrants a special issue…”  Whenever something from a regulatory body begins this way all bankers should take notice, and the latest Special Corporate Governance Edition from the FDIC is no exception.  In fact the Guru did a little research and the last time the FDIC released a Special Edition of its Supervisory Insights was the Foreclosure Edition in 2011, which was a post-mortem on the banking crisis.

So all bankers would be well advised to review this latest publication, but particularly community bankers.  In fact the full title is:  A Community Bank Director’s Guide to Corporate Governance: 21st Century Reflections on the FDIC Pocket Guide for Directors.  The emphasis on community banks and bankers is intentional, and the release states right up front that:

“Community banks play a vital role in the nation’s economy and local communities, and a bank’s management – including its directors and senior management – is perhaps the single most important element in the successful operation of a bank.”

[pullquote]…a director’s responsibility…necessitates using independent judgment and providing a credible challenge (to management).[/pullquote]

Although the FDIC states that this does not constitute new guidance (the original Pocket Guide was issued almost 30 years ago, but the basics haven’t really changed), the fact that they chose this topic and this time to release a special issue indicates that this is almost certainly going to be an area of increased focus for examiners going forward.

If there is one common theme that resonates from this issue it is that directors are expected to play a more active role in the day-to-day affairs of their institutions, and NOT be simply a “rubber stamp” for management.  This sums it up pretty well:

“…a director’s responsibility to oversee the conduct of the bank’s business necessitates using independent judgment and providing a credible challenge.  This entails engaging in robust discussions with senior management and perhaps challenging recommendations at times, rather than simply deferring to their decisions.”

I’ve talked about this concept of “credible challenge” before, which also appears several times in the recent FFIEC Management Handbook, and is defined as “being actively engaged, asking thoughtful questions, and exercising independent judgment.”  In order to do that, directors need access to accurate, timely and relevant information.  Board reports, once very high-level, should now include sufficient detail to allow members to comprehend (and if necessary, challenge) management decisions.


Make sure your IT management systems and processes are capable of producing these Board-level summary reports, then get them in front of the Board and in the Board minutes.  And be prepared for 2 things going forward: first, examiners WILL ask for these Board minutes and expect to see evidence of more engagement.  And second, expect Board meetings to become a lot more spirited!

04 Mar 2016

FDIC Expands Criteria for 18 Month Exam Cycle

The FDIC released FIL-17-2016 today, which will increase the examination cycle for community banks meeting certain criteria from 12 months to 18 months, thereby potentially reducing the frequency of one of the most intrusive events in a banker’s life.

The criteria are as follows:

  • Must have less than $1 billion in assets
  • Must have a CAMELS composite rating of “1” or “2”
  • Must be well-capitalized
  • Must be well-managed
  • Must not have undergone any change in control during the previous 12 months
  • Must not be under an enforcement order or proceeding.

The 18 month examination cycle was previously not available to any community bank smaller than $500 million in assets, but now any bank smaller than $1 billion will qualify, provided it meets the other criteria.

This is good news for already overly-burdened and otherwise healthy institutions, but what concerns me is the definition of “well-managed”.  All of the other criteria are objective, and pretty easy to define and establish.  But how will the regulators define well-managed?  For example, if the institution had a single, non-material, repeat finding in their last exam, could that reflect poorly on management?  After all, responsiveness to recommendations from auditors and supervisory authorities is one of the elements that make up the CAMELS management component.

And is it even possible for an institution to rate a composite score of “1” or “2” if it is not well-managed? Here is an extract from the FDIC Uniform Financial Institutions Rating System (UFIRS) relating to management:

  • Composite 2: Only moderate weaknesses are present and are well within the board of directors’ and management’s capabilities and willingness to correct.
  • Composite 3: Management may lack the ability or willingness to effectively address weaknesses within appropriate time frames.


Based on this, I think it’s highly unlikely that a bank could score a “2” and be poorly managed.

Anyway, time will tell how examiners define well-managed, but this is certainly a step in the right direction and should bring much needed relief to many institutions.

11 Nov 2015

FFIEC Updates (and Greatly Expands) the Management Handbook

This latest update to the IT Examination Handbook series comes 11 years after the original version.  And although IT has changed significantly in the past 11 years, the requirement that financial institutions properly manage the risks of IT has not changed.  This new Handbook contains many changes that will introduce new requirements and new expectations from regulators.  Some of these changes are subtle, others are more significant.  Here is my first take on just a few differences between the original and the new Handbook:


  • The original Handbook contained only a single reference to “cyber”.  The revised Handbook contains 53 references.

IT Management

  • The Board and a steering committee are still responsible for overall IT management, but the guidance now introduces a new obligation for the Board, requiring that they provide a “credible challenge” to management.  Specifically, this means the Board must be “actively engaged, asking thoughtful questions, and exercising independent judgment”.  Simply put, no more “rubber stamps”.  The Board is expected to actually govern, and that means they need access to accurate, timely and relevant information.

The IT Management Structure has changed.  The 2004 Handbook listed the following structure:

  • Board of Directors / Steering Committee
  • Chief Information Officer / Chief Technology Officer
  • IT Line Management
  • Business Unit Management

The Updated Guidance is a bit more granular, and recommends the following structure (Executive Management and the Chief Information Security Officer are new):

  • Board of Directors  / Steering Committee
  • Executive Management
  • Chief Information Officer or Chief Technology Officer
  • Chief Information Security Officer
  • IT Line Management
  • Business Unit Management

“Risk Appetite”

  • The FFIEC Cybersecurity Assessment Tool introduced this new term (addressed here), and the Management Handbook makes 11 additional references.  Institutions should understand this relatively new (for IT, anyway) concept and incorporate it into their strategic planning process.

Managing Technology Service Providers

  • The 2004 guidance contained a separate section on best practices in this area.  The new guidance has removed the section, incorporating references to vendor management best practices throughout the document.  This reflects the reality of the prevalence and importance of outsourcing in today’s financial institutions.

Examination Procedures (Appendix A)

  • The 2004 Handbook had 8 pages containing 9 examination objectives.  The new guidance is almost completely re-written, and has 15 pages containing 13 objectives.  Several of these new objectives deal with internal governance and oversight, and a couple address the enterprise-wide nature of IT management.  All areas have been greatly expanded.  For example, the objective dealing with IT controls and risk mitigation (Objective 12) consists of 18 separate examination elements with 53 discrete items that examiners must check.


In summary, the updated Handbook represents a significant evolution in both the breadth and depth of IT management requirements.  It will set the standard for IT management best practices for both examiners and institutions for some time to come, and should be required reading for all Board members, CEOs, CIOs, ISOs, and network administrators.

16 Jul 2015

FFIEC Releases Cybersecurity Assessment Tool

UPDATE:  Safe Systems just released their Enhanced CyberSecurity Assessment Toolkit (ECAT).  This enhanced version of the FFIEC tool addresses its biggest drawback: the lack of a way to collect, summarize, and report your risk and control maturity levels.

Once risks and controls have been assessed (Step 1 below), institutions will now be better able to identify gaps in their cyber risk program, which is step 2 in the 5 step process.

[Image: the five-step cyber risk cycle]

This long-anticipated release of the Cybersecurity Assessment Tool (CAT) is designed to assist institution management in both assessing their exposure to cyber risk, and then measuring the maturity level of their current cybersecurity controls.  Users must digest 123 pages, and must provide one of five answers to a total of 69 questions in 10 categories or domains.  This is not a trivial undertaking, and will require input from IT, HR, internal audit, and key third-parties.

After completing the assessment, management should gain a pretty good understanding of the gaps, or weaknesses, that may exist in their own cybersecurity program.  As a result, “management can then decide what actions are needed either to affect the inherent risk profile or to achieve a desired state of maturity.”  I found the CAT to be quite innovative in what it does, but it also has some baffling limitations that may pose challenges to many financial institutions, particularly smaller ones.

[pullquote]The CAT is quite innovative in what it does, but it also has some baffling limitations…[/pullquote]

First of all, I was stunned by the specificity of both the risk assessment and the maturity review.  Never before have we seen this level of prescriptiveness and granularity from the FFIEC in how risks and controls should be categorized.  For example, when assessing risk from third-parties:

  • No third-parties with system access = Least Risk
  • 1 – 5 third parties = Minimal Risk
  • 6 – 10 third parties = Moderate Risk
  • 11 – 25 third parties = Significant Risk, and
  • > 25 third parties = Most Risk
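
Since the thresholds are explicit, the mapping can be expressed as a simple lookup (a sketch using the tiers listed above; the function name is mine, not the FFIEC's):

```python
def third_party_risk(count: int) -> str:
    """Map the number of third parties with system access to the
    CAT inherent risk level, using the thresholds listed above."""
    if count == 0:
        return "Least"
    if count <= 5:
        return "Minimal"
    if count <= 10:
        return "Moderate"
    if count <= 25:
        return "Significant"
    return "Most"

print(third_party_risk(7))  # prints: Moderate
```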

Again, this is quite different from what the FFIEC has done previously.  If Information Security guidance used the same level of detail, we might see specific risk levels for passwords of a certain length and complexity, and correspondingly higher control maturity levels for longer, more complex passwords.  Of course we don’t see that in current guidance; what we get is a generic “controls must be appropriate to the size and complexity of the institution, and the nature and scope of its operations”, leaving the interpretation of exactly what that means to the institution, and to the examiner.

I see this new approach as a very good thing, because it removes much of the ambiguity from the process.  As someone who has seen institutions of all sizes and complexities struggle with how to interpret and implement guidance, I hope this represents a new approach to all risk assessments and control reviews going forward.  Imagine having a single worksheet for both institutions and examiners.  Both sides agree that a certain number of network devices, or third-parties, or wire transfers, or RDC customers, or physical locations, constitutes a very specific level of risk.  Control maturity is assessed based on implementation of specific discrete elements.  Removing the uncertainty and guesswork from examinations would be a VERY good thing for over-burdened institutions and examiners alike.


So all that is good, but there are two big challenges with the tool: collection, and interpretation and analysis.  The first is the most significant, and makes the tool almost useless in its current format.  Institutions are expected to collect, and then interpret and analyze, the data, but the tool doesn’t have the capability to do that, because all the documents provided by the FFIEC are in PDF format.

If you can figure out how to record your responses, that still leaves the institution responsible for conducting a “gap” analysis.  In other words, now that you know where your biggest risks reside, how do you then align your controls with those risks?
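
Until the FFIEC provides something better, one workaround is to record responses in a simple structure of your own and roll them up per domain.  A minimal sketch (the domain and statement labels here are placeholders, not the CAT's actual wording):

```python
# Record CAT-style responses outside the PDF and summarize per domain.
# Domain/statement labels below are placeholders, not FFIEC wording.
from collections import Counter

responses = [
    ("Domain 1", "Statement 1", "Yes"),
    ("Domain 1", "Statement 2", "No"),
    ("Domain 2", "Statement 1", "Yes"),
]

# Tally Yes/No answers per domain: the kind of roll-up the
# PDF-only tool cannot produce on its own.
summary = {}
for domain, _statement, answer in responses:
    summary.setdefault(domain, Counter())[answer] += 1

for domain, counts in summary.items():
    print(domain, dict(counts))
```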

One more challenge: regulators expect you to communicate the results to the Board and senior management, which is not really possible without some way of collecting and summarizing the data.  They also recommend an independent evaluation of the results from your auditors, which also requires summary data.

In summary, there is a lot to like in this guidance, and I hope we see more of this level of specificity going forward.  But the tool should have the ability to (at a minimum) record responses, and ideally summarize the data at various points in time.  In addition, most institutions will likely need assistance summarizing and analyzing the results, and addressing the gaps.  But at least we now have a framework, and that is a good start.