Tag: data classification

17 Sep 2013

Data Classification and the Cloud

UPDATE – In response to the reluctance of financial institutions to adopt cloud storage, vendors such as Microsoft and HP have announced that they are building “hybrid” clouds.  These new models are designed to allow institutions to store and process certain data in the cloud while a portion of the processing or storage is done locally, on premises.  For example, the application may reside in the cloud, but the customer data is stored locally.  This may make the decision easier, but it only makes classification of data more important, as the decision to utilize a “hybrid” cloud must be justified by your assessment of the privacy and criticality of the data.

I get “should-we-or-shouldn’t-we” questions about the Cloud all the time, and because of the high standards for financial institution data protection, I always advise caution.  In fact, I recently outlined 7 cloud deal-breakers for financial institutions.  But could financial institutions still justify using a cloud vendor even if they don’t seem to meet all of the regulatory requirements?  Yes…if you’ve first classified your data.

The concept of “data classification” is not new; it’s mentioned several times in the FFIEC Information Security Handbook:

“Institutions may* establish an information data classification program to identify and rank data, systems, and applications in order of importance. Classifying data allows the institution to ensure consistent protection of information and other critical data throughout the system.”

“Data classification is the identification and organization of information according to its criticality and sensitivity. The classification is linked to a protection profile. A protection profile is a description of the protections that should be afforded to data in each classification.”

The term is also mentioned several times in the FFIEC Operations Handbook:

“As part of the information security program, management should* implement an information classification strategy appropriate to the complexity of its systems. Generally, financial institutions should classify information according to its sensitivity and implement controls based on the classifications. IT operations staff should know the information classification policy and handle information according to its classification.”

But the most relevant reference for financial institutions looking for guidance about moving data to the Cloud is a single mention in the FFIEC Outsourcing Technology Services Handbook, Tier 1 Examination Procedures section:

“If the institution engages in cloud processing, determine that inherent risks have been comprehensively evaluated, control mechanisms have been clearly identified, and that residual risks are at acceptable levels. Ensure that…(t)he types of data in the cloud have been identified (social security numbers, account numbers, IP addresses, etc.) and have established appropriate data classifications based on the financial institution’s policies.”

So although data classification is a best practice even before you move to the cloud, the truth is that most institutions aren’t doing it (more on that in a moment).  However, examiners are expected to ensure (i.e. to verify) that you’ve properly classified your data afterwards…and that regardless of where data is located, you’ve protected it consistent with your existing policies.  (To date I have not seen widespread indications that examiners are asking for data classification, but I expect that as cloud utilization increases, they will.  After all, it is required in their examination procedures.)

Most institutions don’t bother to classify data that is processed and stored internally because they treat all data the same, i.e. they have a single protection profile that treats all data at the highest level of sensitivity.  And indeed the guidance states that:

“Systems that store or transmit data of different sensitivities should be classified as if all data were at the highest sensitivity.”

But once that data leaves your protected infrastructure everything changes…and nothing changes.  Your policies still require (and regulators still expect) complete data security, privacy, availability, etc., but since your level of control drops considerably, so should your level of confidence.  And you likely have sensitive data combined with non-sensitive, critical combined with non-critical.  This would suggest that unless the cloud vendor meets the highest standard for your most critical data, they can’t be approved for any data.  Unless…

  1. You’ve clearly defined data sensitivity and criticality categories, and…
  2. You’re able to segregate one data group from another, and…
  3. You’ve established and applied appropriate protection profiles to each one.
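Expressed programmatically, the pairing of classification categories with protection profiles might look like the following minimal Python sketch. The category labels, control names, and thresholds here are illustrative assumptions, not anything prescribed by the FFIEC guidance:

```python
# Illustrative only: the labels ("High"/"Medium"/"Low") and the controls in
# each profile are hypothetical examples, not FFIEC requirements.
PROTECTION_PROFILES = {
    "High":   {"encryption_at_rest": True,  "cloud_permitted": False, "retention_years": 7},
    "Medium": {"encryption_at_rest": True,  "cloud_permitted": True,  "retention_years": 5},
    "Low":    {"encryption_at_rest": False, "cloud_permitted": True,  "retention_years": 1},
}

def profile_for(classification: str) -> dict:
    """Look up the protection profile tied to a data classification."""
    return PROTECTION_PROFILES[classification]

def cloud_eligible(classification: str) -> bool:
    """A data group may move to a cloud vendor only if its profile allows it."""
    return profile_for(classification)["cloud_permitted"]
```

Under this (assumed) policy, `cloud_eligible("Medium")` is true while `cloud_eligible("High")` is false, which is exactly the point: segregated, classified data lets you approve a vendor for some categories without approving them for all.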

Classification categories are generally defined in terms of criticality and sensitivity, but the guidance is not prescriptive on how you should label each category.  I’ve seen “High”, “Medium”, and “Low”, as well as “Tier 1”, “Tier 2” and “Tier 3”, and even a scale of 1 to 5…whatever works best for your organization is fine.  Once that is complete, the biggest challenge is making sure you don’t mix data classifications.  This is easier for data like financials or Board reports, but particularly challenging for data like email, which could contain anything from customer information to yesterday’s lunch plans.  Remember, if any part of the data is highly sensitive or critical, all of it must be treated as such.
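That “highest classification wins” rule for mixed data can be sketched as a simple aggregation (again, the labels and their ordering are assumed for illustration):

```python
# Ordering from least to most sensitive; labels are illustrative assumptions.
SENSITIVITY_ORDER = ["Low", "Medium", "High"]

def effective_classification(item_classifications):
    """A container mixing several classifications (a mailbox, a file share,
    a database) must be handled at the highest classification it contains."""
    return max(item_classifications, key=SENSITIVITY_ORDER.index)

# A mailbox holding mostly routine mail plus one customer record
# must be treated as "High" in its entirety.
mailbox = ["Low", "Low", "Medium", "High"]
```

Here `effective_classification(mailbox)` comes back as `"High"`, which is why unclassified, unsegregated email ends up dragging an entire system to the top tier.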

So back to my original question…can you justify utilizing the cloud even if the vendor is less than fully compliant?  Yes, if data is properly classified and segregated, and if cloud vendors are selected based on their ability to adhere to your policies (or protection profiles) for each category of data.

*In “FFIEC-speak”, ‘may’ means ‘should’, and ‘should’ means ‘must’.

10 Jul 2012

FFIEC issues Cloud Computing Guidance

Actually, the document is classified as “for informational purposes only”, which is to say that it is not a change or update to any specific Handbook and presumably does not carry the weight of regulatory guidance.  However, it is worth a read by all financial institutions outsourcing services because it provides reinforcement for, and references to, all applicable guidance and best practices surrounding cloud computing.

It is a fairly short document (4 pages) and again does not represent a new approach, but rather reinforces the fact that managing cloud providers is really just a best practices exercise in vendor management.  It makes repeated reference to the existing guidance found in the Information Security and Outsourcing Technology Services Handbooks.  It also introduces a completely new section of the InfoBase called Reference Materials.

The very first statement in the document pretty well sums it up:

“The (FFIEC) Agencies consider cloud computing to be another form of outsourcing with the same basic risk characteristics and risk management requirements as traditional forms of outsourcing.”

It then proceeds to describe basic vendor management best practices such as information security and business continuity, but one big take-away for me was the reference to data classification.  This is not the first time we’ve seen this term; I wrote about examiners asking for it here, and the Information Security Handbook says that:

“Institutions may establish an information data classification program to identify and rank data, systems, and applications in order of importance.”

But when all your sensitive data is stored, transmitted, and processed in a controlled environment (i.e. between you and your core provider), a simple schematic will usually suffice to document data flow.  There is no need to classify and segregate data; all data is treated equally regardless of sensitivity.  However, once that data enters the cloud you lose that control.  What path did the data take to get to the cloud provider?  Where exactly is the data stored?  Who else has access to the data?  And what about traditional issues such as recoverability and data retention and destruction?

Another important point made in the document, and one that doesn’t appear in any other guidance, is that because of the unique legal and regulatory challenges faced by financial institutions, the cloud vendor should be familiar with the financial industry.  They even suggest that if the vendor is not keeping up with regulatory changes (either because they are unwilling or unable to do so) you may determine on that basis that you cannot employ that vendor.

The document concludes by stating that:

“The fundamentals of risk and risk management defined in the IT Handbook apply to cloud computing as they do to other forms of outsourcing. Cloud computing may require more robust controls due to the nature of the service.”

And…

“Vendor management, information security, audits, legal and regulatory compliance, and business continuity planning are key elements of sound risk management and risk mitigation controls for cloud computing.”

…as they are for all outsourced relationships!