In this special vlog post, Tom Hinkel weighs in on a proposed NCUA notification requirement for cyber incidents.
Tag: information security
FFIEC Cancels E-Banking Handbook
On May 13, 2022, the FFIEC very quietly rescinded the FFIEC Information Technology Examination Handbook (IT Handbook) booklet entitled E-Banking. The original booklet was released in 2003 and was accompanied by a flurry of activity as financial institutions scrambled to come up with a separate E-Banking policy and risk assessment. In effect, the FFIEC is now declaring (admitting?) that these are no longer necessary because all the basic risk management principles that apply to E-Banking are already addressed elsewhere: operational risk in the Business Continuity Management Handbook, information security risk in the Information Security Handbook, cyber risk in the Cybersecurity Assessment Tool, and third-party risk in several other booklets and guidance documents.
We agree with this approach, and have long held that separately addressing each new emerging or evolving technology was cumbersome, duplicative, and unnecessary. In our opinion, shifting the focus of the handbooks to basic risk management principles and best practices that can apply to all business processes makes more sense and is long overdue. Could the Wholesale and Retail Payment Systems handbooks be phased out next? How about the Cybersecurity Assessment Tool? Since cybersecurity is simply a subset of information security more broadly, could we see a phase-out of a separate cyber assessment? Or even better, could we see the Information Security Handbook include a standardized risks and controls questionnaire that includes cyber?
Admittedly this is only one less policy and one less risk assessment, but we’ll be watching this trend with great interest. Anything that can help ease the burden on overworked compliance folks is a welcome change!
Have There Been Any Official Board Reporting Updates to the FFIEC InfoSec Handbook since 2016?
Hey Guru!
Do you have any additional blogs about FDIC changing the annual IT report to the board? I saw the article from 2012 and was wondering if there are any updates to that. Has the FFIEC updated its Information Security IT Handbook after 2016 in regard to this subject?
Thank you,
Lynn
Hi Lynn, and thanks for the question! We haven’t seen any official board reporting updates from regulators since the 2016 revision to the FFIEC InfoSec Handbook; most of what we’ve heard on this topic lately is anecdotal (e.g., feedback from recent IT audits and examinations). The popular consensus is that the volume of information expected to be communicated to the board has greatly increased. We believe that’s because of the relatively recent requirement for the board to provide a “credible challenge” to management, which requires more information on all aspects of information security. Combine that with the hyper-focus on cybersecurity and the “the buck stops with the board” mentality, and it’s almost impossible to imagine over-informing the board.
A bit of background on board reporting… the Examination Procedures section (Appendix A) of the 2016 FFIEC Information Security IT Handbook instructs examiners to:
Determine whether the board approves a written information security program and receives a report on the effectiveness of the information security program at least annually. Determine whether the report to the board describes the overall status of the information security program and discusses material matters related to the program such as the following:
- Risk assessment process, including threat identification and assessment.
- Risk management and control decisions.
- Service provider arrangements.
- Results of security operations activities and summaries of assurance reports.
- Security breaches or violations and management’s responses.
- Recommendations for changes or updates to the information security program.
We feel this is a decent framework, assuming sufficient detail is added to each item and the reporting is presented in a way the board is most likely to understand. Because every board is unique, that often means dialing the level of detail up or down to match your board’s familiarity with the subject matter.
We also recommend folks add a “Strategic IT Planning” section to the report, with updates on all significant IT initiatives, including how each of those initiatives aligns with enterprise-wide strategic goals and objectives.
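If your team tracks report completeness in a spreadsheet or script, a simple checklist can help confirm that nothing from the Handbook list above (or the Strategic IT Planning section we suggest) gets dropped from the annual report. Below is a minimal sketch in Python; the section names and the missing_sections helper are our own illustrative choices, not anything prescribed by the FFIEC.

```python
# Minimal sketch (hypothetical): confirm the annual board report covers the
# items from the 2016 FFIEC Information Security Handbook, Appendix A, plus
# the "Strategic IT Planning" section we recommend adding.

REQUIRED_SECTIONS = [
    "risk assessment process",
    "risk management and control decisions",
    "service provider arrangements",
    "security operations results and assurance report summaries",
    "security breaches or violations and management responses",
    "recommendations for changes or updates to the program",
    "strategic it planning",  # our recommended addition, not in the Handbook list
]

def missing_sections(report_sections):
    """Return required sections not covered in the draft report outline."""
    covered = {s.strip().lower() for s in report_sections}
    return [s for s in REQUIRED_SECTIONS if s not in covered]

# Example usage with a partial draft outline:
draft = ["Risk management and control decisions", "Service provider arrangements"]
for gap in missing_sections(draft):
    print("Missing section:", gap)
```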
You may also want to check out Appendix A, Objective 2 of the Management Handbook. Again, nothing new, but it does help define the broad scope of Board oversight from the examiner’s perspective. Remember, for every item listed in #2 of Objective 2, there must be one or more associated reports supporting the activity, and both the activity and the supporting documentation should be part of the board minutes:
Review the minutes of the board of directors and relevant committee meetings for evidence of board support and supervision of IT activities.
Wherever there is a lack of prescriptive guidance, or there is room for interpretation in the guidance, risk managers must choose the path of least risk. For us, although the official guidance hasn’t changed recently, it’s much less risky to over-report information security activities to the Board than it is to under-report. To date, we’ve never had an examiner criticize one of our customers for over-reporting!
UPDATE – New Proposed Cyber Incident Notification Rules Finalized
Last updated March 30, 2022.
Currently, financial institutions are required to report a cyber event to their primary federal regulator under very specific circumstances. This requirement dates back to GLBA, Appendix B to Part 364, which states that FI incident response plans (IRPs) should contain procedures for: “Notifying its primary Federal regulator as soon as possible when the institution becomes aware of an incident involving unauthorized access to or use of sensitive customer information…”. Customer notification guidance is very similar: institutions should provide notice to their customers as soon as possible “If the institution determines that misuse of its information about a customer has occurred or is reasonably possible.” (It’s important to note here that a strict interpretation of “…access to or use of…” would generally not include a denial-of-access (DDoS) type of attack, or a ransomware attack that locks files in place. We strongly suggest modifying the definition of “misuse” in your incident response plan to say “…access to, denial of access to, or unauthorized use of…”.) However, with the issuance of the final rule (officially called “Computer-Security Incident Notification Requirements for Banking Organizations and Their Bank Service Providers”), institutions now have additional considerations that will require changes to their policies and procedures.
Background
Late in 2020, the FDIC issued a joint press release with the OCC and the Federal Reserve announcing the proposed changes. As is the case for all new regulations, they were first published in the Federal Register, which started the clock on a 90-day comment period that ended on April 12, 2021. (We took an early look at this back in July.)
The new rule was approved in November 2021 by the OCC, Federal Reserve, and FDIC collectively, with an effective date of April 1, 2022, and a compliance date of May 1, 2022. Simply put, it will require “…a banking organization to provide its primary federal regulator with prompt notification of any ‘computer-security incident’ that rises to the level of a ‘notification incident.’”
To fully understand the requirements and new expectations of this rule, there are actually three terms we need to understand: a computer-security incident, a notification incident, and “materiality.”
Keys to Understanding the New Rule
A computer-security incident could be anything from a non-malicious hardware or software failure or the unintentional actions of an employee, to something malicious and possibly criminal in nature. The new rule defines computer security incidents as those that result in actual or potential harm to the confidentiality, integrity, or availability of an information system or the information the system processes, stores, or transmits.
A notification incident is defined as a significant computer-security incident that has materially disrupted or degraded, or is reasonably likely to materially disrupt or degrade, a banking organization’s:
- Ability to carry out banking operations, activities, or processes, or deliver banking products and services to a material portion of its customer base, in the ordinary course of business;
- Business line(s), including associated operations, services, functions, and support, that upon failure would result in a material loss of revenue, profit, or franchise value; or
- Operations, including associated services, functions and support, as applicable, the failure or discontinuance of which would pose a threat to the financial stability of the United States.
The third term that needs to be understood is “materiality.” This term is used 97 times in the full 80-page press release, so it is clearly something the regulators expect you to understand and establish; for example, what is a “material portion of your customer base,” a “material loss of revenue,” or a “material disruption” of your operations? Unfortunately, the regulation does not provide a universal definition of materiality beyond agreeing that it should be evaluated on an enterprise-wide basis. Essentially, each banking organization should evaluate whether the impact is material to the organization as a whole. This suggests that materiality thresholds need to be defined ahead of time, perhaps as a function of establishing Board-approved risk appetite levels, or perhaps tied to the business impact analysis. Future clarification may be needed on the best approach to establishing materiality in your organization. But since the term is the centerpiece of the rule, and the 36-hour notification clock doesn’t start until a materiality determination has been made, we can definitely expect materiality to be part of the discussion in the event of regulator scrutiny in this area.
Any event that meets the criteria of a notification incident requires regulator notification “as soon as possible,” and no later than 36 hours after you’ve determined that a notification incident has occurred. It’s important to understand that the 36-hour clock does not start until the incident has been classified as a notification incident, which only happens after you’ve determined you’ve experienced a computer-security incident.
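To make the sequencing concrete, here is a minimal sketch in Python of the decision flow described above: first determine whether a computer-security incident has occurred, then apply your own materiality criteria to decide whether it rises to a notification incident, and only then start the 36-hour clock. The threshold values and function names are hypothetical placeholders; each institution must define its own materiality criteria.

```python
from datetime import datetime, timedelta

# Hypothetical materiality thresholds -- each institution must define its own,
# ideally tied to board-approved risk appetite levels or the business impact analysis.
MATERIAL_CUSTOMER_PORTION = 0.25   # portion of the customer base affected
MATERIAL_REVENUE_LOSS = 500_000    # dollars of lost revenue/profit/franchise value

def is_computer_security_incident(harms_confidentiality_integrity_or_availability):
    """Actual or potential harm to the CIA of an information system or its data."""
    return harms_confidentiality_integrity_or_availability

def is_notification_incident(customer_portion_affected,
                             estimated_loss,
                             threatens_us_financial_stability):
    """Roughly mirror the rule's three prongs using our own materiality thresholds."""
    return (customer_portion_affected >= MATERIAL_CUSTOMER_PORTION
            or estimated_loss >= MATERIAL_REVENUE_LOSS
            or threatens_us_financial_stability)

def notification_deadline(determination_time):
    """The 36-hour clock starts only when the notification determination is made."""
    return determination_time + timedelta(hours=36)

# Example: an availability outage affecting 30% of customers.
if is_computer_security_incident(True) and is_notification_incident(0.30, 100_000, False):
    print("Notify your primary federal regulator by",
          notification_deadline(datetime.now()))
```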
The Safe Systems Compliance Team has created a detailed decisioning flowchart to assist with your understanding of this new rule. Click here for a copy of the flowchart.
Notification can be provided to the “…appropriate agency supervisory office, or other designated point of contact, through email, telephone, or other similar method that the agency may prescribe.” No specific information is required in the notification other than that a notification incident has occurred. The final rule also does not prescribe any specific form or template that must be used, and there are no recordkeeping requirements beyond what may be in place if a Suspicious Activity Report (SAR) is filed in connection with the incident. The agencies have all issued additional “point-of-contact” guidance:
For FDIC institutions:
Notification can be made to your case manager (your primary contact for all supervisory-related matters), to any member of an FDIC examination team if the event occurs during an examination, or, if your primary contact is unavailable, by email to the FDIC at incident@fdic.gov.
For OCC Institutions:
Notification may be made by emailing or calling the OCC supervisory office. Communication may also be made via the BankNet website, or by contacting the BankNet Help Desk via email (BankNet@occ.treas.gov) or phone (800) 641-5925.
For Federal Reserve Institutions:
Notification may be made by communicating with any of the Federal Reserve supervisory contacts or the central point of contact at the Board, either by email to incident@frb.gov or by telephone to (866) 364-0096.
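If your incident response plan (or the tooling that generates it) stores regulator contact details, a simple lookup table like the sketch below, built from the agency points of contact listed above, can help responders find the right channel quickly. The structure and function name are hypothetical; the email addresses and phone numbers are the ones published above.

```python
# Hypothetical IRP lookup table built from the agency points of contact above.
REGULATOR_CONTACTS = {
    "FDIC": {
        "primary": "Case manager, or any member of the examination team if on-site",
        "email": "incident@fdic.gov",
    },
    "OCC": {
        "primary": "OCC supervisory office, or the BankNet website / Help Desk",
        "email": "BankNet@occ.treas.gov",
        "phone": "(800) 641-5925",
    },
    "Federal Reserve": {
        "primary": "Federal Reserve supervisory contacts or the Board's central point of contact",
        "email": "incident@frb.gov",
        "phone": "(866) 364-0096",
    },
}

def contact_for(primary_federal_regulator):
    """Return the notification channels for the institution's primary federal regulator."""
    return REGULATOR_CONTACTS[primary_federal_regulator]

print(contact_for("FDIC")["email"])  # incident@fdic.gov
```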
One final note: we’ve received indications that at least some state banking regulators will require concurrent notification of any incident that rises to the level of a notification incident. Check with your state regulators about whether (and how) they plan to coordinate with this new rule.
Third-party Notification Rules
In addition to FI notification changes, there are also new expectations for third-party service providers, like core providers and significant technology service providers (as defined in the Bank Service Company Act, or BSCA). Basically, the rule will require a service provider to “…notify at least one bank-designated point of contact at affected banking organization customers immediately after experiencing a computer-security incident that it believes in good faith could disrupt, degrade, or impair services provided subject to the BSCA for four or more hours.”
Furthermore, if you are notified by a third party that an event has occurred, and the event has resulted, or is likely to result, in your customers being unable to access their accounts (i.e., it rises to the level of a notification incident), you would also be required to report to your regulator. However, it’s important to note here that not all third-party notification incidents will also be considered bank regulator notification incidents. It is also significant that the agencies will most likely not cite your organization if a bank service provider fails to comply with its notification requirement, so you will likely not be faulted if a third party fails to notify you.
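From the service provider’s side of the rule, the trigger is a good-faith belief that covered services could be disrupted, degraded, or impaired for four or more hours. Here is a minimal sketch of that check in Python; the function and parameter names are hypothetical.

```python
# Hypothetical sketch of the bank service provider's side of the rule: notify the
# bank-designated point(s) of contact if the provider believes in good faith that
# covered services could be disrupted, degraded, or impaired for 4 or more hours.
OUTAGE_NOTIFICATION_THRESHOLD_HOURS = 4

def provider_must_notify(estimated_disruption_hours, affects_covered_services):
    """Good-faith estimate of disruption to services covered by the BSCA."""
    return affects_covered_services and (
        estimated_disruption_hours >= OUTAGE_NOTIFICATION_THRESHOLD_HOURS)

print(provider_must_notify(6.0, True))   # True  -- a 6-hour core outage triggers notification
print(provider_must_notify(2.0, True))   # False -- below the 4-hour threshold
```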
Next Steps
There will undoubtedly be clarification on the specifics of rule implementation as we digest feedback from regulatory reviews next year, and we’ll keep you posted as we know more. In the meantime, aside from having internal discussions about what constitutes “materiality” in your organization, the new rules will likely also require some modifications to your Incident Response Plan (IRP), and possibly to key vendor contracts. For FDIC institutions, the “as soon as possible” regulator notification provisions of FIL-27-2005 already in your IRP will have to be amended. For all critical vendors, ensure that contracts contain verbiage committing them to the four-hour outage notification criterion, and that you’ve identified a contact person or persons within your organization to receive the alert.
New Proposed Cyber Incident Notification Rules
We first wrote about incident notification over ten years ago, and based on feedback from our cyber testing experience, financial institutions are still struggling with the issue of whether or not to notify their customers and primary regulators. The conversation often comes down to “do we have to notify?” Some institutions may choose to notify out of an abundance of caution, but most won’t unless it’s absolutely required, as regulator notification opens the door to additional examiner scrutiny, and customer notification may result in increased reputation risk. To confuse the issue a bit more, notification requirements are currently defined differently for a regulator than for a customer. And all of this is about to change!
Notification Rules Background
Financial institutions are currently required to report an event to their primary federal regulator under very specific circumstances. This requirement dates back to GLBA, Appendix B to Part 364 and states that FI incident response plans (IRPs) should contain procedures for: “Notifying its primary Federal regulator as soon as possible when the institution becomes aware of an incident involving unauthorized access to or use of sensitive customer information…”
Customer notification guidance is very similar. Institutions should provide notice to their customers as soon as possible: “If the institution determines that misuse of its information about a customer has occurred or is reasonably possible.” (It’s important to note here that a strict interpretation of “…access to or use of…” would generally not include a denial of access (DDoS) type of attack or a ransomware attack that locks files in place. We suggest modifying the language of “misuse” to “…access to, denial of access to, or use of…”.)
Announcement of New Proposed Notification Rules
Late last year the FDIC issued a joint press release with the OCC and the Federal Reserve announcing the proposed changes. The working title is a mouthful: Computer-Security Incident Notification Requirements for Banking Organizations and Their Bank Service Providers. As is the case for all new regulations, the proposed notification rules were first published in the Federal Register, which started the clock on a 90-day comment period that ended on April 12 of this year. When (or if) the rules become law will depend on how long it takes regulators to compile, digest, and reconcile the comments received, which can take as long as six months to a year from the end of the comment period.
3 Key Terms of the New Regulator Notification Rule
One of the new rules “…would require a banking organization to provide its primary federal regulator with prompt notification of any computer-security incident that rises to the level of a notification incident.” There are actually three terms we need to understand here: a computer security incident, a significant security incident, and a notification incident.
A computer security incident could be anything from a non-malicious hardware or software failure or the unintentional actions of an employee to something malicious and possibly criminal in nature. Computer security incidents are those that:
- Result in actual or potential harm to the confidentiality, integrity, or availability of an information system or the information the system processes, stores, or transmits; or
- Constitute a violation or imminent threat of violation of security policies, security procedures, or acceptable use policies.
In addition to the GLBA NPI guidance, banking organizations are already required to report certain instances of disruptive cyber-events and cyber-crimes through the filing of Suspicious Activity Reports (SARs) within 30 days, but no regulator notification is required unless these criteria are met. Even so, if notification is provided, the concern is that the 30-day window may not be timely enough to prevent other events.
This new rule would define a significant computer security incident as one that meets any of these criteria:
- Could jeopardize the viability of the operations of an individual banking organization
- Result in customers being unable to access their deposit and other accounts
- Impact the stability of the financial sector
The proposed rule refers to these significant computer security incidents as notification incidents — the two terms are synonymous, so any event that meets the above criteria would require regulator notification “as soon as possible”, and no later than 36 hours after you’ve determined that a notification event has occurred.
We’ll see what the final rules look like, but at the moment there are no proposed changes to the customer notification requirements.
New Third-Party Expectations
In addition to FI notification changes, there will also be new expectations for third-party service providers, like core providers and significant technology service providers (as defined in the BSCA). Because these vendors “…also are vulnerable to cyber threats, which have the potential to disrupt, degrade, or impair the provision of banking services to their banking organization customers,” the proposed rule would require a service provider to “…notify at least two individuals at affected banking organization customers immediately after experiencing a computer-security incident that it believes in good faith could disrupt, degrade, or impair services provided subject to the BSCA for four or more hours.” Presumably, if you are notified by a third party that an event has occurred, and the event has resulted, or is likely to result, in your customers being unable to access their accounts, you would also be required to report to your regulator.
Judging by the submitted comments, there are still many questions to be answered and terms to be clarified, but with cybersecurity dominating the news recently we can definitely count on regulatory changes to the “do we have to notify?” discussion coming fairly soon.
A Look Back at 2020 and a Look Ahead to 2021: A Regulatory Compliance Update
From SafeSystems.com/Safe-Systems-Blog
Safe Systems recently published a two-part regulatory compliance blog series that looked back at 2020 and ahead to 2021. In Part 1, we explored how pandemic-related regulations dominated the compliance landscape early in 2020, forcing financial institutions to adjust their procedures and practices on the fly. In Part 2, we summarized the regulatory focus on cybersecurity (particularly ransomware) and looked ahead to 2021.