This request appeared in a recent State examiner's pre-examination questionnaire. Although I usually like to see a request a couple of times from different examiners before identifying it as a legitimate trend, this one could prove so potentially problematic that I thought I needed to get ahead of it.
Before we go much further, it’s important to distinguish between the “data-flow diagram” and the “work-flow analysis”, which is a requirement of the business continuity planning process. They are completely separate endeavors designed to address two very different objectives. The BCP “work-flow analysis” is designed to identify the interdependencies between critical processes in order to determine their order of recovery. The “data-flow diagram” is designed to:
Supplement (management’s) understanding of information flow within and between network segments as well as across the institution’s perimeter to external parties.
It’s important to note here that what the examiner asked for actually wasn’t unreasonable; in fact, it appears word-for-word in the FFIEC Operations Handbook:
Management should also develop data flow diagrams to supplement its understanding of information flow within and between network segments as well as across the institution’s perimeter to external parties.
And although this particular examiner quoted from the Operations Handbook, the term “data flow” (in combination with “maps”, “charts”, and “analysis”) actually appears 15 times in 5 different Handbooks: Development and Acquisition, Information Security, Operations, Retail Payment Systems, and Wholesale Payment Systems.
So this concept is certainly not unheard of, but previously this “understanding of information flow” objective was achieved via a network topology map, or schematic. Sufficiently detailed, a network schematic will identify all internal and external connectivity, types of data circuits and bandwidth, routers, switches and servers. Some may even include workstations and printers. In the past this diagram, in combination with a hardware and software inventory, was always sufficient to document management’s understanding of information flow to examiners. But in this particular case the examiner highlighted (in bold) this section of the guidance (and this was the most troublesome to me):
Data flow diagrams should identify:
- Data sets and subsets shared between systems;
- Applications sharing data; and
- Classification of data (public, private, confidential, or other) being transmitted.
Data flow diagrams are also useful for identifying the volume and type of data stored on various media. In addition, the diagrams should identify and differentiate between data in electronic format, and in other media, such as hard copy or optical images.
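To make the guidance’s requirements concrete, here is a minimal sketch of what a data-flow inventory backing such a diagram might look like. This is purely illustrative: the system names, data sets, and records are assumptions, not anything from the guidance, and a real institution would likely maintain this in a diagramming or GRC tool rather than code.

```python
# Hypothetical sketch of a data-flow inventory capturing the elements the
# FFIEC guidance lists: data sets shared between systems, the applications
# sharing them, the data's classification, and the media/format.
# All system names and entries below are illustrative assumptions.

from dataclasses import dataclass


@dataclass(frozen=True)
class DataFlow:
    source: str          # application sending the data
    destination: str     # application receiving the data
    data_set: str        # data set or subset being shared
    classification: str  # public, private, confidential, or other
    media: str           # electronic, hard copy, optical image, etc.


# Illustrative entries for a small serviced institution (assumed names).
flows = [
    DataFlow("core_banking", "online_banking", "customer_accounts",
             "confidential", "electronic"),
    DataFlow("core_banking", "statement_printer", "monthly_statements",
             "confidential", "hard copy"),
    DataFlow("website_cms", "public_web", "rate_sheets",
             "public", "electronic"),
]


def confidential_flows(inventory):
    """Return flows carrying confidential data -- the ones an examiner
    would expect to see highlighted on a data-flow diagram."""
    return [f for f in inventory if f.classification == "confidential"]


for f in confidential_flows(flows):
    print(f"{f.source} -> {f.destination}: {f.data_set} ({f.media})")
```

Note that the `media` field is what distinguishes this from a network schematic: it lets the same inventory differentiate electronic data from hard copy or optical images, which is exactly the differentiation the guidance calls for.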
Data classification? Differentiation between electronic, printed, and optical data? This seems to go way beyond what the typical network schematic is designed to do, way beyond what examiners have asked for in the past, and possibly even beyond the ability of most institutions to produce, at least without significant effort. Of course, the excuse of “unreasonable resource requirements” will usually not fly with examiners, so what is the proper response to a request of this nature?
Fortunately there may be a loophole here, at least for serviced institutions, and it’s found in the “size and complexity” qualifier. The guidance initially states:
“Effective IT operations management requires knowledge and understanding of the institution’s IT environment.”
This is the underlying requirement, and the core issue to be addressed. The guidance goes on to state that documentation of management’s “knowledge and understanding” should be “commensurate with the complexity of the institution’s technology operations”, and that, depending on size and complexity, this may include “data-flow diagrams”. So the examiner is effectively saying that they feel a “data-flow diagram” is the most appropriate way for the management of this outsourced institution to document adequate knowledge and understanding of its technology operations. I suggested that the institution respectfully disagree, and state:
“Our management believes, based on our size and complexity as a serviced institution, that an updated detailed schematic, and current hardware and software inventories, adequately demonstrates sufficient knowledge and understanding of our technology operations”.
This directly addresses the core issue, and I’m pretty sure the examiner will agree, but I’ll let you know. In any case, it’s worth pushing back because of the potentially enormous resource requirement it would take to comply with the request, both now and going forward.
Now here is the real question: should you require the same level of documentation (i.e., data classification and data type differentiation) from your core vendor? And if so, are you even likely to get it from them?