Tag: FFIEC

05 Apr 2012

5 “random” facts

Fact 1 – According to the U.S. Bureau of Labor Statistics, the increasing complexity of financial regulations will spur employment growth of financial examiners.  In fact, it is expected to experience the third-largest growth of all career paths through 2018.
Fact 2 – According to Rep. Shelley Moore Capito (R-W.Va.), author of H.R. 3461, “The Dodd-Frank Act has added so many new regulations to financial institutions, it has helped boost a 31% projected growth in job opportunities for Compliance Officers.”

Fact 3 – Speaking of H.R. 3461…It is also called the Financial Institution Examination Fairness and Reform Act, and aims to provide “more transparent, timely and fair examinations” by reducing the disconnect between exam results and their regulating agencies.  It now has 154 co-sponsors.

Fact 4 – A related bill (S. 2160) has just been introduced in the Senate.

Fact 5 – The provision in both bills that is getting the greatest push-back from regulators is the one that grants a financial institution the right to appeal an examination finding to an ombudsman at the FFIEC, not the regulator that made the finding.

I’ll let you connect the dots of these “random” facts.

28 Mar 2012

CFPB Examinations Are Coming – UPDATE 2

UPDATE 2 – June 2012:  Memorandum of Understanding issued on CFPB examinations

Examinations are coming, but hopefully they won’t impose too much of an additional burden on you.  At least that is the intent of an MOU recently signed between the CFPB and the other Federal regulators (Federal Reserve, NCUA, FDIC and OCC).  The MOU provides for information sharing among the agencies in order to minimize unnecessary duplication of examination efforts, and provides guidelines for “Simultaneous and Coordinated Examinations” between the agencies.  So expect additional visitors during future examinations, but if they truly expect to achieve the stated objective to “minimize unnecessary regulatory burden on Covered Institutions” they could start by doing away with CFPB examinations entirely.

UPDATE 1  –  May 2012:  Ramping Up…

Coming soon to your financial institution –

Dear Board of Directors:

Pursuant to the authority of the Dodd-Frank Wall Street Reform and Consumer Protection Act, the Consumer Financial Protection Bureau (CFPB) performed a risk-focused examination of your institution.  The examination began on April 1, 2012.  The following report summarizes the findings of our examination.

Any matters of criticism, violations of laws or regulations, and other matters of concern identified within this Examination Report require the Board of Directors’ and management’s prompt attention and corrective action….

Although by law the CFPB will only examine large depository institutions (assets greater than $10B) individually, Section 1026 extends coverage to smaller institutions on a sampling basis.  This means all institutions can eventually expect a visit from CFPB examiners (either with or without your primary federal regulator) at some point in the future.  And it is my opinion that the influence of the CFPB will continue to expand to all financial institutions regardless of size.  Consider the following:

  1. The CFPB is now one of the agencies comprising the inter-agency council of the FFIEC (replacing the OTS).  This means that CFPB will have input into all FFIEC guidance going forward.
  2. The head of the CFPB sits on the FDIC Board of Directors.
  3. So far, 19 (Regs. B – P, V, X, Z & DD) out of the total of 39 Regulations have been turned over to CFPB for enforcement.  (I wonder if including Reg E will affect all electronic funds transfers, or only those initiated by non-business customers?  I find it hard to believe that there would be 2 sets of standards.)

So they are coming, but believe it or not there is good news.  Not only are they telling you what they are looking for ahead of time, they are also giving you lots of helpful templates to fill out in preparation.  True, the templates are for their examiners, but there is no reason why you can’t use them too.  Particularly helpful is the Consumer Risk Assessment Template, which CFPB examiners will use to determine inherent risk; that risk is then reduced by the appropriate controls to arrive at the overall risk (also called residual risk).  This table represents the summary of the consumer risk assessment process:

Notice that if the inherent risk is high, the residual risk can be no lower than moderate, regardless of the strength of the controls.  I think this is significant because of the potential implications for all risk assessments going forward.  Remember, CFPB now has a seat at the FFIEC (and FDIC) table.
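The floor-at-moderate rule can be sketched as a simple lookup.  To be clear, this is my own illustration of the logic described above, not the CFPB’s actual template; the three-level rating scale and the control-strength labels are assumptions:

```python
# Hypothetical sketch of the residual-risk calculation described above.
# The rating scales are assumed; only the "high inherent risk can be
# reduced no lower than moderate" rule comes from the template.

LEVELS = ["low", "moderate", "high"]

def residual_risk(inherent: str, control_strength: str) -> str:
    """Reduce inherent risk by control strength, flooring at
    'moderate' whenever inherent risk is 'high'."""
    reduction = {"weak": 0, "adequate": 1, "strong": 2}[control_strength]
    level = max(LEVELS.index(inherent) - reduction, 0)
    if inherent == "high":
        # strong controls cannot take a high inherent risk below moderate
        level = max(level, LEVELS.index("moderate"))
    return LEVELS[level]
```

Note that with this logic the assessment is purely mechanical: given the same inherent risk and control ratings, every examiner arrives at the same residual risk.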

But consider this…could we be looking at a fundamental change in how all risk assessments are conducted, and examined, in the future?  One single standardized risk assessment template for all risks?  Inherent risk levels are pre-defined, and control strength is pre-determined, making residual risk a purely objective calculation.  The complete lack of subjectivity means that all examiners evaluate all institutions against the exact same set of standards.  No exit meeting surprises, no unexpected CAMELS score downgrades, no spending hours and hours preparing for one area of compliance, only to have the examiners focus on something else.

So could the influence of the CFPB be a smoother, more predictable examination experience overall?  Or am I dreaming?

12 Mar 2012

Risk Managing BYOD (bring your own device)

Thanks in part to social media, users today often don’t differentiate between work and non-work activities, and they certainly don’t want to have to carry multiple work/non-work devices to keep them connected.  As a result, new multi-purpose mobile devices are constantly being added to your secure financial institution network…and often in violation of your policies.

Most institutions have an IT Acquisition Policy, or something similar, that defines exactly how (and why) new technology is requested, acquired, implemented and maintained.  The scope of the policy extends to all personnel who are approved to use network resources within the institution, and the IT Committee (or equivalent) is usually tasked with making the final purchasing decision.  And although older policies may use language like “microcomputers” and “PC’s”, the policy effectively covers all network-connected devices, including the new generation of devices like smartphones and iPads.  And managing risk always begins with the acquisition policy…before the devices are acquired.

Your policy may differ in the specific language, but it should contain the following basic elements required of all requests for new technology:

  • Description of the specific hardware and software requested, along with an estimate of costs (note what type of vendor support is available).
  • Description of the intended use or need for the item(s).
  • Description of the cost vs. benefits of acquiring the requested item(s).
  • Analysis of information security ramifications of requested item(s).
  • Time frame required for purchase.

Most of these are pretty straightforward to answer, but what about the fourth bullet?  Are you able to apply the same level of information security standards to these multifunctional devices as you are to your PC’s and laptops?  Or does convenience trump security?  This is where the provisions of your information security policy take over.

The usefulness of these always-on mobile devices is undeniable, and they have really bent the cost/benefit curve, but they have also re-drawn the security profile in many cases.  The old adage is that a chain is only as strong as its weakest link, and in today’s IT infrastructure environment these devices are often the weak links in the security chain.  So while your users have their feet on the accelerator of new technology adoption, the ISO (and the committee managing information security) needs to have both feet firmly on the brake unless they are willing to declare these devices as an exception to their security policy…which is definitely not recommended.

So how can you effectively manage these devices within the provisions your existing information security program, without compromising your overall security profile?  It might be worth reviewing what the FFIEC has to say about security strategy:

Security strategies include prevention, detection, and response, and all three are needed for a comprehensive and robust security framework. Typically, security strategies focus most resources on prevention. Prevention addresses the likelihood of harm. Detection and response are generally used to limit damage once a security breach has occurred. Weaknesses in prevention may be offset by strengths in detection and response.

Regulators expect you to treat all network devices the same, and clearly preventive risk management controls are preferred, but the fact is that many of the same well-established tools and techniques that are used for servers, PC’s and laptops are either not available, or not as mature, in the smartphone/iPad world.  Traditional tools such as patch management, anti-virus and remote access event log monitoring, and techniques such as multi-factor authentication and least permissions, are difficult if not impossible to apply to these devices.  However, there are still preventive controls you can, and should, implement.

First of all, only deploy remote devices to approved users (as required by your remote access policy), and require connectivity via approved, secure connections (i.e. 3G/4G, SSL, secure WiFi, etc.).  Require both power-on and wake pass codes.  Require approval for all applications and utilize some form of patch management (manual or automated) for the operating system and the applications.  Encrypt all sensitive data in storage, and utilize anti-virus/ anti-spyware if available.

Because of the limited availability of preventive controls, maintaining your information security profile will likely rest on your compensating detective and corrective controls.  Controls are somewhat limited in these areas as well, but include maintaining an up-to-date device inventory, and having tracking and remote wipe capabilities to limit damage if a security breach does occur.

But there is one more very important preventive control you can use, and this one is widely available, mature, and highly effective…employee training.  Require your remote device users to undergo initial, and periodic, training on what your policies say they are (and aren’t) allowed to do with their devices.  You should still conduct testing of the remote wipe capability, and spot check for unencrypted data and unauthorized applications, but most of all train (and retrain) your employees.
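If you keep that device inventory in electronic form, the spot checks become almost mechanical.  Here is a minimal sketch of a compliance gap check against the controls discussed above; the field names and inventory format are my own invention, not from any real MDM product:

```python
# Hypothetical BYOD spot-check against the preventive/corrective
# controls discussed above. Field names are illustrative assumptions.

REQUIRED_CONTROLS = {
    "approved_user": True,        # deployed only to approved users
    "passcode_enabled": True,     # power-on and wake pass codes
    "storage_encrypted": True,    # sensitive data encrypted at rest
    "remote_wipe_enabled": True,  # corrective control if device is lost
}

def compliance_gaps(device: dict) -> list[str]:
    """Return the names of required controls this device fails."""
    return [name for name, required in REQUIRED_CONTROLS.items()
            if device.get(name) != required]
```

A device record that fails any check gets flagged for follow-up; everything else is a training (and retraining) conversation with the user.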

02 Mar 2012

“Data-flow diagrams”

This request was seen in a recent State examiner’s pre-examination questionnaire, and although I usually like to see a request a couple of times from different examiners before identifying it as a legitimate trend, this one could prove so potentially problematic that I thought I needed to get ahead of it.

Before we go much further, it’s important to distinguish between “data-flow diagrams” and the “work-flow analysis”, which is a requirement of the business continuity planning process.  They are completely separate endeavors designed to address two very different objectives.  The BCP “work-flow analysis” is designed to identify the interdependencies between critical processes in order to determine the order of recovery for the processes.  The “data-flow diagram” is designed to:

Supplement (management’s) understanding of information flow within and between network segments as well as across the institution’s perimeter to external parties.

It’s important to note here that what the examiner asked for actually wasn’t unreasonable, in fact it appears word-for-word in the FFIEC AIO Handbook:

Common types of documentation maintained to represent the entity’s IT and business environments are network diagrams, data flow diagrams, business process flow diagrams, and business process narratives. Management should document and maintain accurate representations of the current IT and business environments and should employ processes to update representations after the implementation of significant changes.

And although this particular examiner quoted from the Operations Handbook, the term “data flow” (in combination with “maps”, “charts” and “analysis”) actually appears 15 times in 5 different Handbooks: Development and Acquisition, Information Security, Operations, Retail Payment Systems, and Wholesale Payment Systems.

So this concept is certainly not unheard of, but previously this “understanding of information flow” objective was achieved via a network topology map, or schematic.  Sufficiently detailed, a network schematic will identify all internal and external connectivity, types of data circuits and bandwidth, routers, switches and servers.  Some may even include workstations and printers.  In the past this diagram, in combination with a hardware and software inventory, was always sufficient to document management’s understanding of information flow to examiners.  But in this particular case the examiner highlighted (in bold) this section of the guidance (and this was the most troublesome to me):

Data flow diagrams should identify:

  • Data sets and subsets shared between systems;
  • Applications sharing data; and
  • Classification of data (public, private, confidential, or other) being transmitted.

…and…

Data flow diagrams are also useful for identifying the volume and type of data stored on various media.  In addition, the diagrams should identify and differentiate between data in electronic format, and in other media, such as hard copy or optical images.

Data classification?  Differentiation between electronic, printed and optical data?  This seems to go way beyond what the typical network schematic is designed to do, way beyond what examiners have asked for in the past, and even possibly beyond the ability of most institutions to produce, at least not without significant effort.  Of course using the excuse of “unreasonable resource requirements” will usually not fly with examiners, so what is the proper response to a request of this nature?
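For what it’s worth, if you did decide to produce something like this, a structured inventory of flows might be more maintainable than an actual diagram, since it captures the same elements the guidance lists (data sets shared between systems, classification, and media type).  A minimal sketch, with invented field names and sample values:

```python
# Illustrative sketch of a data-flow inventory capturing the elements
# in the quoted guidance. Systems, data sets and values are made up.
from dataclasses import dataclass

@dataclass
class DataFlow:
    source_system: str
    destination_system: str
    data_set: str
    classification: str  # public, private, confidential, or other
    medium: str          # electronic, hard copy, or optical image

flows = [
    DataFlow("core banking", "online banking", "account balances",
             "confidential", "electronic"),
    DataFlow("loan operations", "offsite storage", "signed loan files",
             "confidential", "hard copy"),
]

# Example query: which confidential data sets leave their source system?
confidential = [f.data_set for f in flows if f.classification == "confidential"]
```

Whether that level of effort is warranted for a serviced institution is, of course, exactly the question at issue.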

Fortunately there may be a loophole here, at least for serviced institutions, and it’s found in the “size and complexity” qualification.  The guidance initially states:

Effective IT operations management requires knowledge and understanding of the institution’s IT environment.

This is the underlying requirement, and the core issue to be addressed.  It then goes on to state that documentation of management’s “knowledge and understanding” be “commensurate with the complexity of the institution’s technology operations”.  And depending on size and complexity, this may include “data-flow diagrams”.  So the examiner is effectively saying in this case that they feel a “data-flow diagram” is the most appropriate way for the management of this outsourced institution to document adequate knowledge and understanding of their technology operations.  I suggested that the institution respectfully disagree, and state:

Our management believes, based on our size and complexity as a serviced institution, that an updated detailed schematic, and current hardware and software inventories, adequately demonstrate sufficient knowledge and understanding of our technology operations.

This directly addresses the core issue and I’m pretty sure the examiner will agree, but I’ll let you know.  In any case it’s worth pushing back on this because of the potentially enormous resource requirement that it would take to comply with the request, both now and going forward.

Now here is the real question…should you require the same level of documentation (i.e. data classification and data type differentiation) from your core vendor?  And if so, are you even likely to get it from them?

22 Feb 2012

The single most important vendor management control

Pop quiz…according to the FFIEC Handbook on Outsourcing Technology Services:

“The ________ is the single most important control in the outsourcing process”:

  1. Initial due diligence process
  2. Review of third-party audit reports
  3. Contract
  4. Risk Assessment
  5. Vendor’s financial stability

I’ve written before about the importance of the third-party review in the ongoing vendor management process (and how that changes with the phase-out of the SAS 70), and of course vendor financial stability can speak to the ability to continue to provide services.  And a vendor risk assessment has been an essential component of your Information Security Program since GLBA.  All of these are important, but according to the FFIEC:

“The contract is the legally binding document that defines all aspects of the servicing relationship. A written contract should be present in all servicing relationships.  This includes instances where the service provider is affiliated with the institution. When contracting with an affiliate, the institution should ensure the costs and quality of services provided are commensurate with those of a non-affiliated provider. The contract is the single most important control in the outsourcing process.”

I’ve already predicted that vendor management would become an area of increased regulator scrutiny in the coming year, and one of the first things they will be looking at is the contracts.  Actually, regulators asking to see vendor contracts is nothing new, but lately some have been examining these contracts for specific elements.  So here are the 18 elements that should be included (or at least considered) in all critical vendor contracts:

  • Scope of Service
  • Performance Standards
  • Security and Confidentiality
  • Controls
  • Audit
  • Reports
  • Business Resumption and Contingency Plans
  • Sub-contracting and Multiple Service Provider Relationships
  • Cost
  • Ownership and licensing
  • Duration
  • Dispute Resolution
  • Indemnification
  • Limitation of Liability
  • Termination
  • Assignment
  • Foreign-based
  • Regulatory Compliance

Not all contracts, even with critical vendors, will contain all of these elements.   But each of them represents an important factor for consideration as you negotiate (or re-negotiate), and your vendor risk assessment should ultimately determine exactly which elements you’ll require.

“The time and resources devoted to managing outsourcing relationships should be based on the risk the relationship presents to the institution.”

And unlike some risk assessments, where smaller financial institutions may have enjoyed less scrutiny because of their size and complexity…

“…smaller and less complex institutions may have less flexibility than larger institutions in negotiating for services that meet their specific needs and in monitoring their service providers.”

So smaller institutions may actually be in for a more rigorous vendor management review than larger institutions!

I’ve created a check-list of all contract considerations listed above, along with a brief description of each one.  I hope you’ll find it useful.  You can download it here.

06 Feb 2012

NIST releases new Cloud Computing Guidelines

Although not specific to the financial industry, the new guidelines provide a comprehensive overview of the privacy and security challenges of this increasingly popular computing model.  It’s worth a look by both financial institutions considering cloud-based services and service providers, because NIST guidelines often wind up as the basis for new or updated regulatory guidance.

They start by defining the concept of cloud computing as characterized by the “…displacement of data and services from inside to outside the organization” and by correctly observing that “…many of the features that make cloud computing attractive, however, can also be at odds with traditional security models and controls.”   This pretty accurately summarizes the challenges faced by financial institutions as they consider, and try to manage, the risks of cloud computing…data and services are out of their direct control, but risks of privacy, security, confidentiality, data integrity and availability must be controlled.

NIST offers the following guidelines for overseeing cloud services and service providers:

  • Carefully plan the security and privacy aspects of cloud computing solutions before engaging them.
  • Understand the public cloud computing environment offered by the cloud provider.
  • Ensure that a cloud computing solution satisfies organizational security and privacy requirements.
  • Ensure that the client-side computing environment meets organizational security and privacy requirements for cloud computing.
  • Maintain accountability over the privacy and security of data and applications implemented and deployed in public cloud computing environments.

For financial institutions, all these guidelines should be addressed in your existing policies.  The privacy and security elements are mandated by GLBA, and should already be present in your information security program.  One of the required, but often overlooked, elements of your vendor management program is the requirement to strategically justify your decision to engage a cloud services provider, and periodically review and reaffirm that decision.  Understanding the cloud provider environment is indeed a challenge for financial institutions, and I have already addressed this, and some possible solutions, here.  I’ve also discussed why increased adoption of cloud-based services will likely make vendor management a topic of increased regulatory scrutiny in 2012 here.  Additionally I think that the new SOC 2 report will directly address many of the concerns facing institutions employing cloud-based services.

As for the FFIEC, I was surprised to see that a search of the word “cloud” on the IT Examination InfoBase turned up not one single mention.  The Handbooks are getting a bit dated…perhaps, given the importance of managing outsourced relationships, plus the increased challenges of cloud computing, they should address this next?  Or do you think the existing guidance on managing outsourced technology and vendors is sufficiently broad?