Tag: compliance

14 Jan 2021
Looking Ahead to 2021

A Look Back at 2020 and a Look Ahead to 2021: A Regulatory Compliance Update

From SafeSystems.com/Safe-Systems-Blog

Safe Systems recently published a two-part regulatory compliance blog series that looked back at 2020 and ahead to 2021. In Part 1, we explored how regulations related to the pandemic dominated the compliance landscape early in 2020, forcing financial institutions to adjust their procedures and practices on the fly. In Part 2, we summarized the regulatory focus on cybersecurity (particularly ransomware) and looked ahead to 2021.

20 Oct 2020

Compliance Quick Bites – Tests vs. Exercises, and the Resiliency Factor

One of several changes introduced in the 2019 FFIEC BCM Examination Handbook is a subtle but important differentiation between a BCMP “test” and an “exercise”. I discussed some of the more material changes here, but we’re starting to see examiner scrutiny of not just whether you’re testing, but exactly what and how.

According to the Handbook:

  • “An exercise is a task or activity involving people and processes that is designed to validate one or more aspects of the BCP or related procedures.”
  • “A test is a type of exercise intended to verify the quality, performance, or reliability of system resilience in an operational environment.”

Essentially, “…the distinction between the two is that exercises address people, processes, and systems whereas tests address specific aspects of a system.” Simply put, think of an exercise as a scenario-based simulation of your written process recovery procedures (a table-top exercise, for example), and a test as validation of the interdependencies of those processes, such as data restoration or circuit fail-over.

The new guidance makes it clear that you must have a comprehensive program that includes both exercises and tests, and that the primary objective should be to validate the effectiveness of your entire business continuity program. In the past, most FIs conducted an annual table-top or structured walk-through test, and that was enough to validate their plan. This new differentiation now seems to require multiple methods of validating your recovery capabilities. Given the close integration between the various internal and external interdependencies of your recovery procedures, this makes perfect sense.

An additional consideration in preparing for future testing is the increased focus on resiliency, defined as any proactive measures you’ve already implemented to mitigate disruptive events and enhance your recovery capabilities. The term “resiliency” is used 126 times in the new Handbook, and you can bet that examiners will be looking for you to validate your ability to withstand, as well as recover from, disruptive events in your testing exercises. Resilience measures can include fire suppression, auxiliary power, server virtualization and replication, hot-site facilities, alternate providers, succession planning, etc.

One way of incorporating resilience capabilities into future testing is to evaluate the impact of a disruptive event after considering your internal and external process interdependencies and accounting for any existing resilience measures. For example, let’s say your lending operations require 3 external providers and 6 internal assets, including IT infrastructure, scanned documents, paper documents, and key employees. List any resilience capabilities you already have in place, such as recovery testing results from your third parties, data replication and restoration, and cross-training for key employees, then evaluate what the true impact of the disruptive event would be in that context.
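
To make that concrete, here’s a minimal sketch (in Python) of what a resilience-adjusted impact assessment might look like. Every name, weight, and score below is a hypothetical illustration, not a methodology prescribed by the Handbook:

```python
# A minimal sketch of resilience-adjusted impact scoring. All names,
# impact weights, and resilience scores are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Dependency:
    name: str
    kind: str          # "internal" or "external"
    impact: int        # raw impact of losing this dependency (1-5)
    resilience: float  # 0.0 (no mitigation) to 1.0 (fully mitigated)

# Lending operations: 3 external providers and 6 internal assets,
# as in the example above.
lending_deps = [
    Dependency("Loan origination vendor", "external", 5, 0.6),  # tested DR
    Dependency("Credit bureau",           "external", 4, 0.3),
    Dependency("Document imaging vendor", "external", 3, 0.5),
    Dependency("IT infrastructure",       "internal", 5, 0.8),  # replication
    Dependency("Scanned documents",       "internal", 4, 0.7),  # restorable
    Dependency("Paper documents",         "internal", 3, 0.2),
    Dependency("Key employees",           "internal", 5, 0.5),  # cross-trained
    Dependency("Core banking interface",  "internal", 5, 0.8),
    Dependency("Branch network",          "internal", 2, 0.4),
]

def residual_impact(dep: Dependency) -> float:
    """Impact that remains after existing resilience measures."""
    return dep.impact * (1.0 - dep.resilience)

# Rank dependencies by what would actually hurt in a disruption,
# so testing and exercises focus where resilience is thinnest.
for dep in sorted(lending_deps, key=residual_impact, reverse=True):
    print(f"{dep.name:28s} raw={dep.impact} residual={residual_impact(dep):.1f}")
```

The point of a sketch like this isn’t the math; it’s forcing yourself to enumerate every interdependency and honestly account for the resilience measures you already have before deciding what to test.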

In summary, conducting both testing and exercises gives all stakeholders a high level of assurance that you’ve thoroughly identified and evaluated all internal and external process interdependencies, built resilience into each component, and can successfully restore critical business functions within recovery time objectives.

30 Sep 2020
Ask the Guru – Can We Apply Similar Controls to Satisfy Both GLBA and GDPR

Can We Apply Similar Controls to Satisfy Both GLBA and GDPR?

Hey Guru!

Are the Gramm–Leach–Bliley Act (GLBA) and the General Data Protection Regulation (GDPR) similar enough to apply the same or equivalent set of layered controls? My understanding is that GDPR has placed a higher premium on the protection of a narrower definition of data. So, my question is more about whether FFIEC requirements for the protection of data extend equally to both confidential PII and the narrower data type called out by GDPR.


Hi Steve, and thanks for the question! Comparing the Gramm–Leach–Bliley Act (GLBA) and the General Data Protection Regulation (GDPR) is instructive, as they both address the same challenge: privacy and security. Specifically, protecting information shared between a customer and a service provider. GLBA is specific to financial institutions, while GDPR defines a “data processor” as any third party that processes personal data. However, they both have very similar definitions of the protected data. GDPR uses the term “personal data” to mean any information that relates to an individual who can be directly or indirectly identified, and GLBA uses the term non-public personal information (NPI) to describe the same type of data.

To answer the question of whether the two are similar enough to apply the same or similar set of layered controls, my short answer is yes: since layered controls are a risk-mitigation best practice, they apply equally to both.

Here’s a bit more. The most important distinction between GLBA and GDPR is that GLBA has two sections: 501(a) and 501(b). The former establishes the right to privacy and the obligation that financial institutions protect the security and confidentiality of customer NPI. 501(b) empowers the regulators to require FIs to establish safeguards to protect against any threats to NPI. Simply put, 501(a) is the “what”, and 501(b) is the “how”. Of course, the “how” has given us the 12 FFIEC IT Examination Handbooks, cybersecurity regulations, penetration tests, the IT audit, and lots of other stuff with no end in sight.

By contrast, GDPR is more focused on the “what” (what a third party can and can’t do with customer data, as well as what the customer can control, e.g., the right to have their data deleted) and much less on the “how” it is supposed to be done.

My understanding is that the scope of GLBA (and all the information security standards based thereon) is strictly limited to customer NPI; it does not extend to confidential data or PII generally. One distinguishing factor between NPI and PII is that in US regulations NPI always refers to the “customer”, and PII always refers to the “consumer”. (Frankly, there isn’t really any difference between data obtained from a customer or a consumer by a financial institution in the process of pursuing or maintaining a business relationship.) We have always taken the position that for the purposes of data classification, NPI and confidential (PII) data share the same level of sensitivity, but guidance is only concerned with customer NPI. GDPR does not make that distinction.

In my opinion, our federal regulations will move towards merging NPI and PII, and in fact some states are already there. So, although it’s not strictly a requirement to protect anything other than NPI, it’s certainly a best practice, and combining both NPI and PII / confidential data in the same data sensitivity classification will do that.
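
To illustrate what that looks like in practice, here’s a minimal sketch of a data sensitivity scheme that places customer NPI and consumer PII in the same classification tier. The categories, tiers, and controls are hypothetical examples, not a regulatory requirement:

```python
# A minimal sketch of a data classification scheme where NPI and PII
# share the same sensitivity tier. All categories are hypothetical.

from enum import Enum

class Sensitivity(Enum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3   # NPI and PII share this tier

CLASSIFICATION = {
    "customer_npi": Sensitivity.CONFIDENTIAL,   # GLBA-covered data
    "consumer_pii": Sensitivity.CONFIDENTIAL,   # same controls applied
    "marketing_materials": Sensitivity.PUBLIC,
    "internal_memos": Sensitivity.INTERNAL,
}

def required_controls(data_type: str) -> list:
    """Return the layered controls implied by a data type's tier."""
    tier = CLASSIFICATION[data_type]
    if tier is Sensitivity.CONFIDENTIAL:
        return ["encryption at rest", "access logging", "least privilege"]
    if tier is Sensitivity.INTERNAL:
        return ["access control"]
    return []

# Classifying NPI and PII together means they inherit identical controls.
assert required_controls("customer_npi") == required_controls("consumer_pii")
```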

One last thought about enforcement: so far, we have not heard of US regulators checking US-based FIs for GDPR compliance, but our community-based financial institutions have very little EU exposure, so your experience may be different.

11 May 2011

Using Technology to Drive Compliance

In the past year to year and a half, nearly all of the IT examination findings I’ve seen have been in the broad category of “documentation”, or more specifically, the lack thereof. In other words, policies and procedures were satisfactory, but documentation was either non-existent or insufficient to demonstrate that actual practices followed policy and procedure.

To visualize this, consider that the compliance process consists of three overlapping areas of endeavor: written policies, written procedures, and actual (documented) practices.


Written policies begin the process, and they must always have regulatory guidance as their target. Policies should track guidance precisely; if guidance states that you should or must do something, your policies should state that you do, or you will.

If policies are “what” you do, written procedures are the “how”. And just as policies align with guidance, procedures should flow logically from, and align with, your policies. For example, your information security policy states (among other things) that you will protect the privacy and security of customer information. Your procedures contain the detailed steps (or controls) that you will take to prevent, detect, and correct unauthorized access to, or use of, customer information: controls like securing the perimeter of your network, updating server and workstation patches, installing and updating anti-virus software, etc.

So you have the “what” and the “how”, but as I mentioned previously, the vast majority of audit and examination findings in the past couple of years were due to deficiencies in the third area: actual (documented) practices. And this is where technology can be of tremendous assistance.


Auditors and examiners much prefer automated systems over manual ones. Automated systems don’t forget, get too busy, or take vacations or sick days. They aren’t subject to human error or inconsistency. In fact, some processes, like firewall logging, normalization, and analysis, are virtually impossible to implement manually because of the sheer volume of data these devices generate.* And while other areas, like patch management and anti-virus updates, are possible to implement manually, auditors much prefer automated processes because they ensure policies are applied in a consistent and timely manner.
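
To see why, here’s a minimal sketch of what automated log normalization looks like: parsing raw firewall deny lines into structured records you can actually analyze. The log format, field names, and addresses below are hypothetical; real devices vary widely:

```python
# A minimal sketch of automated firewall log normalization.
# The log format and sample lines are hypothetical illustrations.

import re
from collections import Counter

LOG_PATTERN = re.compile(
    r"(?P<ts>\S+ \S+) DENY (?P<src>[\d.]+):(?P<sport>\d+) -> "
    r"(?P<dst>[\d.]+):(?P<dport>\d+)"
)

raw_lines = [
    "2011-05-11 09:14:02 DENY 203.0.113.7:51234 -> 192.0.2.10:22",
    "2011-05-11 09:14:05 DENY 203.0.113.7:51240 -> 192.0.2.10:23",
    "2011-05-11 09:15:11 DENY 198.51.100.4:4431 -> 192.0.2.10:445",
]

# Normalize each raw line into named fields, then aggregate.
denies_by_source = Counter()
for line in raw_lines:
    m = LOG_PATTERN.match(line)
    if m:
        denies_by_source[m.group("src")] += 1

# Surface repeat offenders -- the kind of pattern no human reviewer
# could reliably spot across millions of log lines per day.
for src, count in denies_by_source.most_common():
    print(f"{src}: {count} denied connections")
```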

But perhaps the biggest boost technology gives your compliance efforts is in the area of reporting, and specifically, automated reporting. In today’s compliance environment, if you can’t prove you’re following your procedures, the expectation from the examiners is that you aren’t. This is the one area that has evolved more than any other in the past couple of years. Automated reporting provides that documentation without human intervention, easing the burden on the network administrator. Auditors (internal and external) and examiners also like automated reporting because they have higher confidence in the integrity of the data. And the IT Steering Committee likes it because it is much easier to review and approve reports prepared and presented in a standardized format.
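
As a simple illustration, here’s a minimal sketch of turning raw patch-status data into the kind of standardized summary a committee can review. The hosts, fields, and threshold are hypothetical:

```python
# A minimal sketch of automated compliance reporting from patch data.
# All hosts and counts here are hypothetical examples.

from datetime import date

patch_status = [
    {"host": "DC01",  "missing_patches": 0},
    {"host": "FS01",  "missing_patches": 2},
    {"host": "WKS17", "missing_patches": 5},
]

def patch_report(rows, threshold=0):
    """Render a standardized, reviewable patch-compliance summary."""
    compliant = [r for r in rows if r["missing_patches"] <= threshold]
    lines = [
        f"Patch Compliance Report - {date.today():%d %b %Y}",
        f"Compliant: {len(compliant)}/{len(rows)} hosts",
        "",
    ]
    for r in rows:
        status = "OK" if r["missing_patches"] <= threshold else "REVIEW"
        lines.append(f"  {r['host']:6s} missing={r['missing_patches']}  {status}")
    return "\n".join(lines)

print(patch_report(patch_status))
```

Because the report is generated the same way every time, reviewers can compare one period to the next without wondering whether the numbers were assembled differently.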

So in summary, technology enables automation, and automation enhances compliance.  And along the way everyone from the Board of Directors, to management committees, to the network administrator, benefits from it.

 

* The FDIC IT Officer’s Pre-Examination Questionnaire acknowledges the difficulty of managing logs manually when it asks:

“Do you have a formal intrusion detection program, other than basic logging (emphasis mine), for monitoring host and/or network activity?”