Tag: Audit

03 Dec 2013

Ask the Guru: The IT Audit “Scope”

Hey Guru
Our examiner is asking about the “scope” of our IT audits. What is she referring to, and how do we define a reasonable scope?


Audit results are one of the first things examiners want to see, and the “scope” of those audits is very important to them.  In fact, the term is used 74 times in the FFIEC Audit Handbook!  Scope generally refers to the depth and breadth of the audit, which is in turn determined by the objectives, i.e. what the audit is designed to accomplish.  The two broad objectives for any audit are control adequacy and control effectiveness*.  Control adequacy means that the controls you have in place (policies, procedures and practices) address all reasonably identifiable risks.  These audits are sometimes referred to as policy audits, or ITGC (IT general controls) audits.  Although the standards used for these audits may differ (more on that later), their scope should ultimately address the requirements outlined in the 11 IT Examination Handbooks.

Once control adequacy is established, the next thing the examiners want to know is “OK, the controls exist, but do they work?”, i.e. are they effective?  Are they indeed controlling the risks the way they were designed to?  Those types of audits are more commonly (and accurately) described as tests or assessments, usually referred to as penetration (PEN) tests or vulnerability assessments (VA).  They may be internal, external, or (preferably) both.

The audits must be conducted in that order: you must first establish adequacy before you test for effectiveness.  It really doesn’t make sense to test controls that don’t directly address your risks.  In fact, although an auditor will sometimes combine the two audits into a single engagement, I encourage folks to separate them so that any deficiencies in control adequacy can be corrected prior to the PEN testing.

One more thing to consider is the standard by which the auditor will conduct the audit, sometimes referred to as their “work program”.  These are the guidelines the auditor will use to guide the project and conduct the audit.  While there are several industry-established IT standards out there (COBIT, ITIL, COSO, ISO 27001, SAS 94, NIST, etc.), there is no single accepted standard.  The fact is most auditors use a customized hybrid work program, and the vast majority are perfectly acceptable to the examiners.  However, at some point in your evaluation process with a new auditor, you should ask them why they prefer one standard over another.  Whatever their preference, make sure that somewhere in their scope-of-work document they make reference to the FFIEC examination guidelines.  This assures you that they are familiar with the unique regulatory requirements of financial institutions.

Regarding cost, there are often wide disparities between seemingly similar engagements, and it’s easy to see why.  In order to make a side-by-side comparison you’ll need to know a few things: Is the audit focused on control adequacy or control effectiveness (or both)?  If both, are they willing to break the engagement into two parts?  What audit standard will they be using, and why?  What methods will they use to test your controls: inquiry, or inspection and sampling?  Are vulnerability assessments internal or external (or both)?  What are the certifications of the auditors, and how much experience do they have with financial institutions?  Finally, if the examiners have questions or concerns about the auditor’s methodology, or if examiner findings seem to conflict with audit results, will the auditor work with you to respond to the examiner?

In summary, the scope of the audit is defined as one of the following:

  • To assess and determine the adequacy (or design and operation) of our controls
  • To assess and determine the effectiveness of our controls
  • All of the above

So the examiner will want to know the scope, but it’s to your benefit to understand it too, because examiners will often use the results of your audits to shape, and possibly reduce**, the scope of their examination!

* Some audits will break the first objective into two sections: design (are the controls designed properly) and operation (are they in place and operational).
** FFIEC IT Examination Handbook, Audit, Page 8

Related posts:

05 Apr 2013

The Problem with PEN Tests

This is a true story; the names have been changed to protect the guilty.  Al Akazam (not his real name) is an IT consultant with a solid background in technology who wants to expand his practice into network penetration (PEN) testing.  So he downloaded a copy of Nessus, a powerful, open-source vulnerability scanner…and just like that, Al Akazam was a PEN tester!  Armed with this new tool, Al secured his first client, a financial institution.  The institution was aware of the FFIEC guidance to periodically validate the effectiveness of its security controls through testing, and although Al possessed neither audit credentials nor much experience with financial institutions, he seemed to know what he was talking about, and the institution engaged him.

Al got the institution to allow him to connect to the internal trusted network, where he activated his scanner and sat back to let it do its magic.  An hour or two later the scan was complete, and Al had a couple hundred pages of results, some of which (according to his magic scanning tool) were very severe indeed.  Confident that he had uncovered serious and immediate threats to the network, Al rushed the 200-page report to management, who were understandably very alarmed.  Al completed the engagement secure in his belief that he had performed a valuable service…but in fact he had done just the opposite.  He had done the institution a disservice.  By not evaluating the threats in the context of the institution’s entire security environment, Al misrepresented the actual severity of the threats and unnecessarily alarmed management.

A vulnerability’s true threat impact, its exploitation factor, is best expressed in a formula:

Threat impact = (vulnerability * exploitation probability) – mitigating controls

Al simply took the list of potential vulnerabilities the scanner spit out, and without factoring in the exploitation probability, or factoring out the existing controls, changed the equation to:

Threat = vulnerability

What he should have done was take the threats he found and evaluate them in the context of the institution’s specific environment: what preventive measures were in place, and how effective were they, i.e. how likely was it that the vulnerability would actually be exploited?  And if the preventive measures failed, what detective and corrective measures were in place to minimize the impact?  The question Al should have been asking is not “what does my magic scanner say about the risk?”, but “what is the actual risk?”  Simply put, Al got lazy (more on that later).
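
To make the adjustment concrete, here is a minimal sketch in Python of the calculation Al skipped.  Every number in it is hypothetical, chosen only to illustrate the formula; none of it is output from Nessus or any other tool.

# A minimal, hypothetical sketch of the adjustment Al skipped (all numbers illustrative).
def adjusted_threat(severity, exploit_probability, mitigation_credit):
    """Threat impact = (vulnerability * exploitation probability) - mitigating controls."""
    return max(severity * exploit_probability - mitigation_credit, 0.0)

severity = 10.0               # raw scanner rating on a 1-10 scale
exploit_probability = 0.5     # an attacker would already need a foothold inside the perimeter
mitigation_credit = 1.5       # credit for IPS, segmentation, monitoring and other layered controls

print(adjusted_threat(severity, exploit_probability, mitigation_credit))   # 3.5

Run against the same “10” the scanner reported, the adjusted figure lands in the 3-to-4 range, which is a very different conversation to have with management.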

What else did Al do wrong?

  • He didn’t start with an external scan.  Since the external interface(s) get the most attention from the hackers, they should also get more preventive, detective and corrective resources directed towards them.  A risk-based approach demands that testing always start on the outside and work its way in.
  • The institution gave him privileged access to the internal network, which is not realistic and does not simulate a real attack.  Sure, it’s possible that malware could allow an attacker access, and privilege-elevation exploits can theoretically allow the attacker to gain privileged access, but is it likely?  How many layers of controls would have to fail for that to happen?
  • Again, he got lazy.  He should have gone further in his testing by taking one of the most severe vulnerabilities and trying to exploit it.  Only then would management understand the true risk to the institution, and be able to cost-justify the allocation of resources to address it.
  • He didn’t understand financial institutions.  Bankers understand the concept of “layered security”, and how having multiple controls at various control points reduces the risk that any one failed control will result in an exploit.  The vast majority of today’s financial institution networks are built using a layered security concept, and have been for some time.  Shame on Al for not recognizing that.
  • He presented management with a meaningless report.  Instead of simply regurgitating the scanner severity ratings in the management report, he should have adjusted them for the control environment.  In other words, if the scanner said a particular vulnerability was a 10 on a scale of 1 – 10, but the probability of exploit was only 50% and other overlapping and compensating controls were present, the actual threat might be closer to a 3 or 4.

I’ve seen this scenario several times over the last few years, and in most (but not all) cases when the PEN tester is presented with the flaws in their methodology, they adjust accordingly.  This is important, because a bad PEN test result has a ripple effect…you now have to expend resources to address issues that may not actually need addressing when placed in proper context.  You have to present the report to management, with an explanation of why it’s really not as bad as it looks, and you have to make the report available to your examiner during your next safety and soundness examination.  So for all these reasons, if you are a banker facing a similar situation, push back as hard as you can.  And get outside help from an auditor or IT consultant to help make your case if necessary.

Are you a PEN tester or auditor?  What is your approach to automated scanners and test results?  Do you adjust for the overall control environment?

25 Jan 2012

Bank Directors and Officers targeted in 2011

The final numbers are in for 2011, and it was a record year for Director and Officer (D&O) lawsuits by the FDIC.  In 2011 alone, 264 defendants were named in FDIC lawsuits.  To put that in perspective, that’s more than twice the number sued in the previous two years combined.  Some of the most frequently repeated charges were:

  • “Negligence”
  • “Simple negligence”
  • “Gross negligence”
  • “Engaged in corporate waste”
  • “Recklessness and willful misconduct”
  • “Breach of fiduciary duty”

08 Nov 2011

Access Rights a frequent finding

In reviewing recent audit and examination findings, I find that the issue of access rights and permissions is coming up with increasing regularity.  Making sure that end-users have no more access rights than absolutely necessary to do their job is one of the best information security controls.  According to the FFIEC, formal access rights administration for users consists of four processes:

  •   An enrollment process to add new users to the system;
  •   An authorization process to add, delete, or modify authorized user access to operating systems, applications, directories, files, and specific types of information;
  •   An authentication process to identify the user during subsequent activities; and
  •   A monitoring process to oversee and manage the access rights granted to each user on the system.

One best practice for simplifying the management of access rights is to assign them via group membership based on the employee’s role.  Most financial institution employees are easily categorized by job duties (lending, deposit ops, senior management, IT, etc.).  Job duties logically translate to responsibilities, and network file and folder access flows from that.  When an employee’s job duties change, changing group membership is much easier than changing individual file and folder permissions.
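
As a rough illustration of the idea (the role names, group names, and file shares below are made up for the example, not drawn from any particular institution or product), the role-to-group mapping might be sketched in Python like this:

# Hypothetical sketch: access rights attach to groups, and users attach to roles.
ROLE_GROUPS = {
    "lending":     ["GRP_Lending", "GRP_AllStaff"],
    "deposit_ops": ["GRP_DepositOps", "GRP_AllStaff"],
    "it":          ["GRP_IT", "GRP_AllStaff"],
}
GROUP_SHARES = {
    "GRP_Lending":    ["\\\\fileserver\\loans"],
    "GRP_DepositOps": ["\\\\fileserver\\deposits"],
    "GRP_IT":         ["\\\\fileserver\\it"],
    "GRP_AllStaff":   ["\\\\fileserver\\shared"],
}

def effective_access(role):
    # Folder access flows from role -> groups -> shares; there are no individual grants.
    shares = []
    for group in ROLE_GROUPS.get(role, []):
        shares.extend(GROUP_SHARES.get(group, []))
    return shares

print(effective_access("lending"))   # the loans share plus the all-staff share

When an employee moves from lending to deposit operations, the only change is the role assignment; every file and folder permission follows from it automatically.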

One of the main reasons for audit and exam findings in this area is not necessarily that users and groups aren’t maintained, but that there is a disconnect between access rights on the Core system and the local network (Active Directory).  Unfortunately, until the Core providers implement Active Directory integration, this is a manual, two-step process.
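
Until that integration exists, even a simple periodic reconciliation of the two user lists can surface the gaps.  A rough Python sketch, assuming each system can export a list of active account names (the account names below are invented sample data):

# Hypothetical sketch: compare user lists exported from the Core system and Active Directory.
core_users = {"jsmith", "mjones", "tbrown"}   # active Core banking accounts (sample data)
ad_users   = {"jsmith", "mjones", "rwhite"}   # enabled Active Directory accounts (sample data)

# Accounts present in one system but not the other are the disconnects to explain or correct.
print("In Core but not AD:", sorted(core_users - ad_users))   # ['tbrown']
print("In AD but not Core:", sorted(ad_users - core_users))   # ['rwhite']

Exceptions from a report like this are exactly the kind of adds, changes and deletes the IT Steering Committee should be reviewing at each meeting.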

The key to addressing the FFIEC guidance (and preventing rights gaps in the process) is to manage all four of the above processes at the same time in the IT Steering Committee (or functional equivalent).  Each time the committee meets, it should approve all access rights adds, changes and deletes, and review activity logs.  Periodically review the existing users (by group if possible) to validate the appropriateness of their (and the groups’) access rights on a schedule commensurate with the risk.  Properly restricted regular users may only require semi-annual reviews, but privileged (administrative) accounts should be reviewed for activity and re-approved at each committee meeting.

27 Jun 2011

Audits vs. Examinations

As I speak with those in financial institutions responsible for responding to audit and examination requests, I find that there is considerable confusion over the differences between the two.  And some of this confusion is understandable…there is certainly some overlap between them, but there are also considerable differences in the nature and scope of each one.  It may sometimes seem as if you are being asked to comply with two completely different standards.  How often has the auditor had findings on something you’ve never been asked about during an examination?  And how often has an examiner thrown you a curve ball seemingly out of left field?

In a perfect world, shouldn’t the audit be nothing more than preparation for the examination?  The scope of the audit should be no more and no less than what you need to get past the examination.  Any more and you feel as though you’ve wasted resources (time and money); any less and you haven’t gotten your money’s worth, right?  Well…actually, no.  While the two have the same broad goal of assessing alignment with a set of standards, the audit will often use a broader set of industry standards and best practices.  This is because the FFIEC guidance is so general and non-prescriptive.  For example, take one of the questions in the FDIC Information Technology Officer’s Pre-Examination Questionnaire:

“Do you have a written information security program designed to manage and control risk (Y/N)?”

Of course the correct answer is “Y”, but since the FDIC doesn’t provide an information security program template, how do you know that your program will be acceptable to the regulators?  You know because your IT auditor has examined your InfoSec program and compared what you have done to existing IT best practices and standards, such as COBIT, ITIL, ISO 27001, SAS 94, NIST, and perhaps others.  While this doesn’t guarantee that your institution won’t have examination findings, it will reduce both the probability and the severity of them.  This point is critical to understanding the difference between an audit and an examination: an audit will identify, and allow you to correct, the root cause of potential examination findings prior to the examination.  So using the example above, even if the examiner has findings related to your information security program, they will be related to how you addressed the root cause, not whether you addressed it.  (I’m defining root cause as anything found in the Examination Procedures.)  In fact, the FFIEC recognizes the dynamic between the IT audit and examination process this way:

An effective IT audit function may also reduce the time examiners spend reviewing areas of the institution during examinations.

And reduced time (usually) equals fewer curve balls, and a less stressful examination experience!

11 May 2011

Using Technology to Drive Compliance

In the past year to year and a half, nearly all of the IT examination findings I’ve seen have been in the broad category of “documentation”, or more specifically, the lack thereof.  In other words, policies and procedures were satisfactory, but documentation was either non-existent or insufficient to demonstrate that actual practices followed policy and procedure.

To visualize this, consider that the compliance process consists of three overlapping areas of endeavor:

[Diagram: written policies, written procedures, and actual practices as three overlapping areas]

Written policies begin the process, and they must always have regulatory guidance as their target.  Policies should track guidance precisely; if guidance states that you should or must do something, your policies should state that you do, or you will.

If policies are “what” you do, written procedures are the “how”.  And just as policies align with guidance, procedures should flow logically from, and align with, your policies. For example, your information security policy states (among other things) that you will protect the privacy and security of customer information.  Your procedures contain the detailed steps (or controls) that you will take to prevent, detect and correct unauthorized access to, or use of, customer information.  Controls like securing the perimeter of your network, updating server and workstation patches, installing and updating Anti-virus, etc.

So you have the “what” and the “how”, but as I mentioned previously, the vast majority of audit and examination findings in the past couple of years were due to deficiencies in the third area: actual (documented) practices.  And this is where technology can be of tremendous assistance.

Auditors and examiners much prefer automated systems over manual ones.  Automated systems don’t forget, or get too busy, or take vacations or sick days.  They aren’t subject to human error or inconsistencies.  In fact, some processes, like firewall logging, normalization, and analysis, are virtually impossible to implement manually because of the sheer volume of data generated by these devices.*  And while other areas like patch management and Anti-virus updates are possible to implement manually, auditors much prefer automated processes because they ensure policies are applied in a consistent and timely manner.

But perhaps the biggest boost to your compliance efforts from technology is in the area of reporting, and specifically, automated reporting.  In today’s compliance environment, if you can’t prove you’re following your procedures, the expectation from the examiners is that you aren’t.  This is the one area that has evolved more than any other in the past couple years.  And automated reporting provides that documentation without human intervention, easing the burden on the network administrator.  Auditors (internal and external) and examiners also like automated reporting because they have a higher confidence in the integrity of the data.  And the IT Steering Committee likes it because it is much easier to review and approve reports prepared and presented in a standardized format.
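
As a small example of what that automation can look like (the log format, sample entries, and alert threshold below are invented for illustration and don’t correspond to any particular firewall or reporting product), a short Python script can turn raw log lines into a standardized summary a committee can actually review:

# Hypothetical sketch: summarize raw firewall log lines into a standardized daily report.
from collections import Counter

SAMPLE_LOG = """\
2011-05-10 02:13:44 DENY 203.0.113.7 -> 192.0.2.10:22
2011-05-10 02:13:45 DENY 203.0.113.7 -> 192.0.2.10:23
2011-05-10 08:02:01 ALLOW 192.0.2.25 -> 198.51.100.4:443
2011-05-10 11:47:19 DENY 198.51.100.77 -> 192.0.2.10:3389
"""

ALERT_THRESHOLD = 2   # flag any source with this many denied connections or more

denied_by_source = Counter()
for line in SAMPLE_LOG.splitlines():
    parts = line.split()
    if len(parts) >= 4 and parts[2] == "DENY":
        denied_by_source[parts[3]] += 1

print("Denied connections by source:")
for source, count in denied_by_source.most_common():
    flag = "  <-- review" if count >= ALERT_THRESHOLD else ""
    print(f"  {source}: {count}{flag}")

The same report, produced on the same schedule and in the same format every time, is what makes committee review practical and what gives auditors and examiners confidence in the integrity of the data.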

So in summary, technology enables automation, and automation enhances compliance.  And along the way everyone from the Board of Directors, to management committees, to the network administrator, benefits from it.

 

*  The FDIC IT Officer’s Pre-Examination Questionnaire validates the difficulty of managing logs manually when it asks:

“Do you have a formal intrusion detection program, other than basic logging (emphasis mine), for monitoring host and/or network activity?”