Monday, February 15, 2016

Data Analytics and Compliance, from Tom Fox's Compliance and Ethics Blog


From Tom Fox's Compliance and Ethics Blog, where we were interviewed for a special five-part piece on data analytics and compliance. See the original post for Part 1 at: http://fcpacompliancereport.com/2016/01/9264/ . More to follow.

This week I will begin a five-part series exploring data analysis and how it can be used by the Chief Compliance Officer (CCO) or compliance practitioner to support a best practices compliance program under the Foreign Corrupt Practices Act (FCPA), UK Bribery Act, or other anti-corruption compliance regime. My partner in this exploration is Joe Oringel, a co-founder and Managing Director at Visual Risk IQ, a data analytics services firm, whom I interviewed for this series.
Today, we will focus on the basics of data analysis and how it differs from other forms of data testing, such as sampling and inspection of documents. Next I will consider how to think through the use of data analysis and the COSO Framework. Then I will explore some of the ways Oringel and his team have used data analytics to assist companies in ways that are analogous to FCPA-based compliance programs. Additionally, Oringel and I recorded a three-podcast series in which we explored these issues in an interactive format. If you check out the podcasts, you will be eligible to receive, at no cost, a White Paper covering the complete series and topic.
Being a recovering trial lawyer, I began with the basics: what are data analytics and data analysis? Oringel kept it simple, saying that it is merely using data to answer questions. He noted that such analysis predates computers; Sherlock Holmes became well known for using deductive reasoning to make determinations from data-based evidence. In the 21st-century business world, the best evidence we have as to whether something took place is most often digital. Oringel pointed to a variety of authoritative sources that describe modern data analysis as a process of inspecting, cleansing, transforming, and modeling data with the goals of highlighting useful information and supporting decision-making. In short, data analysis is answering a question with data.
Oringel next pointed to another set of definitions for data analysis, derived from Thomas Davenport, a well-known academic and author who teaches at Babson College. Davenport incorporates the notion of time, categorizing data analytics as answering questions about the past, the present, or the future. Incorporating time into analytics focuses these efforts so you can build repeatable patterns into the questions that should be asked and answered.
Oringel, who has both academic and professional training as an internal auditor, said that external financial auditors, like the Big Four, usually focus on answering the question, "What has happened?" This is really a focus on historical transactions: looking backwards at how transactions were reported. For example, what was recorded in the books and records of the company? How was the transaction recorded? Why was it recorded a certain way?
I next turn to the difference between data analysis and traditional internal auditing or sampling. Oringel believes this is the most significant change in technology in the last 25 to 30 years, due to the advent of the personal computer and the associated spreadsheet and database software that allow auditors to base their conclusions not on a sample of data but on an analysis of the entire population. He said, "In the late 1980s and early 1990s, the predominant technique that internal auditors used was sampling. If an audit was designed to vouch fixed assets, auditors would pick a sample of 25 or more fixed assets; re-compute, or test, the acquisition date and the disposition date; and finally re-compute depreciation by hand. If the fixed assets in our sample were properly recorded, then we looked up on a statistical chart or table and concluded that we were sufficiently confident that all of the fixed assets at the company were properly stated."
He further said, "With today's digital accounting software, every fixed asset can be downloaded and the depreciation re-computed based on the acquisition date, the disposition date, and the depreciation rules for each asset class. If there are any differences in the valuation of any asset, the differences can be found through data analysis. Data analysis allows a company's auditor, whether internal or external, to re-compute or model the financial recording of transactions as they ought to be recorded and, therefore, to have even greater confidence than if they had tested using sampling. By analyzing every asset and related transaction, a company is able to test the entire population and be much more confident in the results. This has obvious implications for any FCPA audit, as there is no materiality standard under the FCPA."
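To make the contrast with sampling concrete, here is a minimal sketch in Python (with pandas) of the kind of full-population recomputation Oringel describes. The asset register, column names, and straight-line depreciation rule are illustrative assumptions, not any particular company's configuration.

    import pandas as pd

    # Illustrative fixed-asset register; in practice this would be downloaded
    # in full from the accounting system. Column names are assumptions.
    assets = pd.DataFrame({
        "asset_id": ["A-001", "A-002", "A-003"],
        "cost": [12000.0, 50000.0, 9000.0],
        "useful_life_years": [5, 10, 3],
        "recorded_annual_depr": [2400.0, 5500.0, 3000.0],
    })

    # Re-compute straight-line depreciation for every asset, not just a sample.
    assets["expected_annual_depr"] = assets["cost"] / assets["useful_life_years"]

    # Flag every asset whose recorded depreciation differs from the recomputation.
    exceptions = assets[
        (assets["recorded_annual_depr"] - assets["expected_annual_depr"]).abs() > 0.01
    ]
    print(exceptions[["asset_id", "recorded_annual_depr", "expected_annual_depr"]])

Because the test runs over the whole population, any exception is a real difference to investigate rather than a statistical inference.
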
Data analytics can transition from a review of historical transactions to a review of current transactions simply by asking similar questions of similar data, but with a change in focus: answering "what is happening now, and what should we do about it?" instead of merely "what has happened?" When your bank or credit card company puts a freeze on your charge card because of suspicious transactions, it is using data analysis as an alerting function. More sophisticated companies use these sorts of data analytics tools and processes as part of their compliance programs for areas like monitoring for improper payments or identifying vendors who may match entities on a Denied Parties list.
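As a hedged illustration of the vendor-screening idea, the sketch below does a simple normalized name match against a denied-parties list. Real screening tools use fuzzy matching and additional identifiers (addresses, tax IDs); the names and columns here are hypothetical.

    import pandas as pd

    vendors = pd.DataFrame({
        "vendor_id": [101, 102, 103],
        "vendor_name": ["Acme Trading LLC", "Globex Corp.", "ACME TRADING, LLC"],
    })
    denied = pd.DataFrame({"denied_name": ["Acme Trading LLC", "Initech Ltd"]})

    def normalize(name: str) -> str:
        # Lower-case and strip punctuation so near-identical names compare equal.
        cleaned = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
        return " ".join(cleaned.split())

    vendors["match_key"] = vendors["vendor_name"].map(normalize)
    denied["match_key"] = denied["denied_name"].map(normalize)

    # Each row here is an alert for a compliance analyst to review, not a verdict.
    alerts = vendors.merge(denied, on="match_key")
    print(alerts[["vendor_id", "vendor_name", "denied_name"]])
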
This use of monitoring as an alerting task is a logical next step for compliance teams, but most are not yet taking it, for any number of reasons. The transition from data analytics as historical analysis to alerting through continual or continuous monitoring can be a challenge, and it is still an emerging best practice. Continual or continuous monitoring establishes these alerts and prompts us to take action based on something that happened only moments ago.
I asked Oringel if he could provide an example along the lines of the Department of Justice (DOJ) and Securities and Exchange Commission (SEC) jointly released FCPA Guidance, which says that the goal of a best practices compliance program should be to Prevent, Detect, and Remedy matters before they become FCPA violations. He translated the FCPA Guidance into "stop, find and fix". He believes the key is the time period from which you are pulling the data: if you are looking at transactions that happened six or nine months ago, then your analytics are serving a reporting function. He gave an example in which a business development person entertained a government official yet did not seek preapproval to do so. Unfortunately, the amount spent was more than was allowed under the company's Gifts and Entertainment Policy for entertaining a foreign official. Now the compliance function needs to fix that policy violation and make sure that it does not happen again.
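A minimal sketch of how such a "find" query might look, assuming hypothetical expense fields; the $100 limit, column names, and values are invented for illustration rather than drawn from any real policy.

    import pandas as pd

    POLICY_LIMIT = 100.0  # assumed per-event limit for entertaining a foreign official

    expenses = pd.DataFrame({
        "report_id": [1, 2, 3],
        "employee": ["bizdev.jones", "sales.smith", "bizdev.jones"],
        "attendee_type": ["government_official", "customer", "government_official"],
        "amount": [250.0, 80.0, 60.0],
        "preapproval_id": [None, None, "GE-114"],
    })

    # Flag government-official entertainment that exceeds the policy limit
    # or was never preapproved; the blog's example trips both conditions.
    violations = expenses[
        (expenses["attendee_type"] == "government_official")
        & ((expenses["amount"] > POLICY_LIMIT) | expenses["preapproval_id"].isna())
    ]
    print(violations)
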
The next frontier for data analytics is a move from alerting to predictive analytics, which is using data analysis to predict what will likely happen in the future. This allows us to move from answering questions about what has happened in the past or present to what will likely happen in the future. While predictive analytics is common in many industries and processes, like Commercial Lending or Insurance, it is not at all common in compliance. Yet.
The "find" capability extends from the past to the present and on to the future, and this may be where the most advanced audit and compliance teams go next. This moves toward an almost prescriptive posture: because you were able to predict, or have an insight, you are able to deliver a risk management solution to that potential situation before it occurs.
Oringel concluded by saying that it is this future orientation, with data analysis as a predictor, that he believes is the next step for the compliance function's use of data. A company can score high-risk employees within a business unit by identifying the salespeople who tend not to respect the organization's T&E policies: those who spend too much on lavish meals or engage in other activities that contradict company policies, such as neglecting mandatory compliance training or simply being routinely late with expense report submissions.
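Here is a simplified sketch of such scoring; the signals and weights are assumptions chosen for illustration, and a real model would be calibrated against past investigation outcomes.

    import pandas as pd

    # Hypothetical per-salesperson compliance signals drawn from T&E,
    # training, and expense-report systems.
    signals = pd.DataFrame({
        "employee": ["lee", "patel", "gomez"],
        "te_policy_violations": [4, 0, 1],
        "late_expense_reports": [6, 1, 0],
        "missed_trainings": [2, 0, 0],
    })

    # Assumed weights; in practice these would be tuned, or replaced by a
    # statistical model trained on historical outcomes.
    weights = {"te_policy_violations": 3.0,
               "late_expense_reports": 1.0,
               "missed_trainings": 2.0}

    signals["risk_score"] = sum(signals[col] * w for col, w in weights.items())
    print(signals.sort_values("risk_score", ascending=False))
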
========================================
Joe Oringel is a Managing Director at Visual Risk IQ, a risk advisory firm established in 2006 to help audit and compliance professionals see and understand their data. The firm has completed more than 100 successful data analytics and transaction monitoring engagements for clients across many industries, including Energy, Higher Education, Healthcare, and Financial Services, most often with a focus on compliance.
Joe has more than twenty-five years of experience in internal auditing, fraud detection, and forensics, including ten years of Big Four assurance and risk advisory services. His corporate roles included information security, compliance and internal auditing responsibilities in highly-regulated industries such as energy, pharmaceuticals, and financial services. He has a BS in Accounting from Louisiana State University, and an MBA from the Wharton School at the University of Pennsylvania.
Joe Oringel can be reached at joe.oringel@visualriskiq.com.
This publication contains general information only and is based on the experiences and research of the author. The author is not, by means of this publication, rendering business, legal, or other professional advice or services. This publication is not a substitute for such legal advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified legal advisor. The author, his affiliates, and related entities shall not be responsible for any loss sustained by any person or entity that relies on this publication. The author gives his permission to link, post, distribute, or reference this article for any lawful purpose, provided attribution is made to the author. The author can be reached at tfox@tfoxlaw.com.


© Thomas R. Fox, 2016

Friday, January 9, 2015

Five tips...#5 - Supplement Necessary Skills with Internal or External Resources

This week we have been blogging about how to succeed with data analytics in areas such as internal audit and compliance. On Monday we introduced the following Body of Knowledge and noted that each of the skills below is often needed for a data analytics project.
  • Project Management
  • Data Acquisition and Manipulation
  • Statistical techniques
  • Visual Reporting techniques
  • Communication
  • Audit and Compliance Domain expertise
  • Change Management and Strategic Thinking
Does this mean that audit teams need a statistician or visual reporting whiz in the department? Not at all. Just as audit teams co-source with supplemental resources, they can also co-source for data analytics. Better still, co-sourcing with internal company resources, in the form of a secondment or guest auditor, is often possible. Reach into IT's Business Intelligence or data warehouse group, and internal audit can find talent with excellent company and data manipulation expertise. Reach into HR or Finance for someone with domain expertise around incentive compensation, and team up on that important Sales commission audit project.

Will these resources have advanced audit or compliance domain expertise? Probably not, but Tom Brady doesn't play running back or wide receiver, yet he makes those players better by fitting the pieces together. Audit and compliance leaders know what questions we want to answer. It's the "how" where we sometimes need help. At Visual Risk IQ, I have the very good fortune to work with an incredibly talented team that is deep in database design, data manipulation, programming, and visualization skills. We work together to make sure that our queries are answering the right business questions, and in turn that those answers are communicated in a way that is precise and easy to understand.

When we first worked in domains where our experience had been limited (e.g., health claims in 2008, FCPA / anti-corruption in 2010, or HR in 2013), we relied heavily on domain expertise from our clients' General Counsel's office or on consultants to our firm, so we could bring the full expertise needed for a project, given the body of knowledge framework above. This technique has worked consistently for us, and it can work for audit and compliance teams too.

Good luck in 2015 with your data analytics projects! Please write or call if you'd like to compare ideas on how to excel in data analytics for audit or compliance. We'd be happy to assist in your success!

Joe Oringel
Managing Director
Visual Risk IQ


Thursday, January 8, 2015

Five tips...#4. Consider metric, outlier, and exception queries

For readers seeing this post as their first of the series, today's is actually the fourth of a five-part blog series developed in response to Internal Auditor magazine's lead article titled "The Year Ahead: 2015". Because so many people make resolutions for the new year, we wanted to help audit and compliance professionals succeed with theirs, especially because we believe there are more than a few whose resolutions include becoming more data-driven in their work through regular use of data analytics.

Yesterday we defined metric, outlier, and exception queries, and provided examples in the context of potential audit projects around expenses such as Accounts Payable, Travel and Entertainment, or Payroll. To review, metric queries are simply lists of transactions that measure values across various dimensions or strata, such as rank or time series. "Top 10 largest transactions" or simply "transactions by day of week" are examples of metric queries. These metric queries are powerful, and they become even more powerful when combined as part of outlier and exception analysis.

One recent Travel and Expense example from our client work was seeing a number of executive assistants in the "Top 10 Travel Spend" reports. Even before we looked at any exception report, it became clear that some of the organization's executives had their assistants complete and submit the executives' expense reports, and that the executives then approved those reports themselves.
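
A sketch of how that condition could be tested as an exception query, with hypothetical column names: flag any report approved by the same person whose expenses it contains.

    import pandas as pd

    reports = pd.DataFrame({
        "report_id": [1, 2, 3],
        "expenses_of": ["cfo.jones", "vp.smith", "mgr.chan"],
        "submitted_by": ["asst.lee", "vp.smith", "mgr.chan"],
        "approved_by": ["cfo.jones", "dir.wong", "dir.wong"],
    })

    # An executive approving a report of his or her own expenses is a
    # breakdown in segregation of duties, regardless of who submitted it.
    self_approved = reports[reports["expenses_of"] == reports["approved_by"]]
    print(self_approved)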

Outlier queries are those that compare a value to other values, like a mean or standard deviation. As an example, saying that today is twenty degrees colder than average, or the coldest day of winter, is more informative than simply saying that the temperature is sixteen degrees. Better still, listing the 10 coldest days together, in relation to the average and standard deviation, is even more informative.
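
The temperature analogy translates directly into a small outlier query. This sketch, on made-up data, ranks days by how far they sit from the average in standard-deviation terms.

    import pandas as pd

    temps = pd.DataFrame({
        "date": pd.date_range("2015-01-01", periods=10, freq="D"),
        "low_temp": [28, 31, 27, 30, 29, 12, 26, 33, 25, 8],
    })

    mean, std = temps["low_temp"].mean(), temps["low_temp"].std()
    temps["vs_average"] = temps["low_temp"] - mean   # degrees colder/warmer than average
    temps["z_score"] = temps["vs_average"] / std     # distance in standard deviations

    # Coldest days shown in relation to the average and standard deviation.
    print(temps.sort_values("low_temp").head(10))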

We recommend diving into exception queries only after metric and outlier queries have been prepared, explored and analyzed. It's common for false positives to be averted through thoughtful review of metric and outlier queries.

How does this compare to your experiences? 

Wednesday, January 7, 2015

Five tips... #3 - Understanding and Exploring Your Data

This week's 3rd tip for advancing with audit and compliance analytics is to "Understand Your Data, and Explore it Fully Before Developing Exception Queries." One common mistake that we see audit and compliance professionals make with data analytics is that they sometimes dive right into searching for transaction exceptions before exploring their data fully. This limits the effectiveness of their analysis, because they are searching for something specific and can overlook other conditions or anomalies in their data. If you've not seen the selective attention (aka Gorilla and Basketball) videos from Daniel Simons, here's a fun link.

Selective attention on exception queries seems to happen due to the strengths of traditional analytics tools like Microsoft Excel and general purpose tools like CaseWare IDEA or ACL. It is less common with Visual Reporting tools like Tableau and Qlikview, in part because these tools are designed to specifically support data exploration and interaction with click and drill-through capabilities. Visual Reporting capabilities are very effective for data exploration, and some rudimentary visual capabilities can be found in Excel, IDEA, and ACL.

During data analytics brainstorming, we categorize analytics queries as Metric Queries, Outlier Queries, and Exception Queries. When prioritizing queries to be built for client assignments, we make sure that there are some of each type, so that sufficient data exploration takes place before we jump into exception queries or begin researching exceptions.

Metric queries are analytics such as "Top 10 Vendors by Vendor Spend," "Top 10 Vendors by Number of Transactions," or "Top 10 Dates of the Year for Requisitions (or Purchase Orders)." Simply summarizing the number and value of transactions by different dimensions (day of week, week of quarter, or UserID) can identify anomalies that should be questioned further. On a recent Payroll Wage and Hour project, we found that people punched in and out much more frequently at some minutes (e.g., 7 or 23 minutes past the hour) than at adjacent minutes (e.g., 8 or 22 minutes past the hour). This pattern called for further inquiry and analysis into whether time rounding was fair and equitable for certain types of workers. Time rounding is in fact a major compliance risk and should be considered by any employer with a significant number of hourly workers. See the Corporate Counsel article for more information.
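
A sketch of the minute-past-the-hour metric query described above, using a hypothetical punch file. Spikes at particular minutes, relative to their neighbors, are what prompted the rounding inquiry.

    import pandas as pd

    punches = pd.DataFrame({
        "employee": ["e1", "e2", "e3", "e4", "e5", "e6"],
        "punch_in": pd.to_datetime([
            "2015-01-05 08:07", "2015-01-05 08:07", "2015-01-05 08:23",
            "2015-01-05 09:07", "2015-01-05 08:52", "2015-01-05 07:07",
        ]),
    })

    # Count punches by minute-past-the-hour; a spike at, say, minute 7
    # versus minute 8 can indicate systematic time rounding.
    by_minute = punches["punch_in"].dt.minute.value_counts().sort_index()
    print(by_minute)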

Outlier queries are comparative analytics like "Largest Invoice to Average Invoice, by Vendor," "Most Expensive Airfare by Distance," or "Most Expensive Travel / Entertainment Event per Person vs. Average Event per Person." These outlier queries are also essential, in that they help identify patterns or relationships that should be investigated further. Digital analysis such as Benford's Law is a well-known audit example of an outlier query, but there are many more techniques that can yield insight beyond Benford's Law alone.
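
For readers new to Benford's Law, the sketch below compares observed leading-digit frequencies to the expected logarithmic distribution. The ten invoice amounts are made up and far too few for a real test, which needs a large population; the code only shows the mechanics.

    import math
    import pandas as pd

    amounts = pd.Series([1250.00, 932.10, 110.25, 4875.00, 1999.99,
                         187.50, 1420.00, 960.00, 133.00, 1780.25])

    # Leading digit of each amount (ignoring leading zeros and the decimal point).
    first_digit = amounts.apply(lambda a: int(str(a).lstrip("0.")[0]))

    observed = first_digit.value_counts(normalize=True).sort_index()
    expected = pd.Series({d: math.log10(1 + 1 / d) for d in range(1, 10)})

    # Large gaps between observed and expected frequencies warrant follow-up.
    comparison = pd.DataFrame({"observed": observed, "expected": expected}).fillna(0.0)
    print(comparison)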

Examples of exception queries are more traditional analytics queries such as those listed below (a sketch of the first check follows the list):
  • List cases where two (or more) invoices have been paid for the same amount to the same vendor
  • List any purchase orders created after their corresponding invoice
  • List any Vendors who share a Tax ID Number, Address, or Phone Number with an Employee
  • List any Vendors who have had transactions posted after being Terminated or made Inactive 
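
As promised, a minimal sketch of the first exception query above, using pandas' duplicated() to flag every invoice that shares a vendor and amount with another; the data and column names are hypothetical.

    import pandas as pd

    invoices = pd.DataFrame({
        "invoice_id": ["INV-1", "INV-2", "INV-3", "INV-4"],
        "vendor_id": [101, 101, 102, 101],
        "amount": [500.00, 500.00, 500.00, 750.00],
    })

    # keep=False marks every member of a duplicate group, not just the repeats.
    dupes = invoices[invoices.duplicated(subset=["vendor_id", "amount"], keep=False)]
    print(dupes.sort_values(["vendor_id", "amount"]))
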
In short, we recommend spending at least an hour, and as much as a day or more, exploring and analyzing your data before beginning any exception queries. A data exploration checklist follows; any additions or other suggestions to this list are welcome.
  • Sort transactions from oldest to newest and from newest to oldest. Any unusual dates or times? Any gaps in date or time stamps? Why?
  • Sort transactions from largest to smallest and smallest to largest. Any unusual negative values?
  • Stratify by various status codes, reason codes, or transaction types. Are all values consistently completed? Any unusual relationships? What do each of the codes and values represent?
  • Stratify by dollar value ranges. Do 20% of the transactions make up 80% of the value? Should they? The Pareto Principle says yes, but your business may vary. 
  • Compute Relative Size Factor (largest to average, and largest to second largest), and sort again. Do any of these RSF values cause you to want to drill into specifics? Consider whole numbers and large numbers. Why or why not? (A sketch of this computation follows the list.)
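
Here is that Relative Size Factor computation sketched in pandas on hypothetical invoice data. A high largest-to-second-largest ratio within a vendor is the classic signal worth drilling into.

    import pandas as pd

    invoices = pd.DataFrame({
        "vendor_id": [101, 101, 101, 102, 102, 102],
        "amount": [9800.0, 420.0, 310.0, 600.0, 580.0, 615.0],
    })

    stats = invoices.groupby("vendor_id")["amount"].agg(
        largest="max",
        average="mean",
        second_largest=lambda s: s.nlargest(2).iloc[-1],
    )
    stats["rsf_to_average"] = stats["largest"] / stats["average"]
    stats["rsf_to_second"] = stats["largest"] / stats["second_largest"]

    # Vendors sorted by how far their largest invoice stands out.
    print(stats.sort_values("rsf_to_second", ascending=False))
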
What has been your most significant "aha" moment when exploring your data? Comments and feedback are welcomed below.

Joe Oringel
Managing Director
Visual Risk IQ
Charlotte NC

Tuesday, January 6, 2015

Five tips for Advancing with Audit Analytics. Tip #2 - Brainstorming

Yesterday we started a multi-part post on the importance of building audit data analytics capabilities, together with some "how-to" tips. Our first tip was that this is as much a people challenge as a technical undertaking. One particular "secret" is that a combination of skills is needed to accomplish these analytics projects, and we see many departments make the mistake of assigning a single individual to carry out a project without sufficient assistance, or at least oversight, from colleagues who have complementary skills.

In our data analytics consulting practice, we use a Body of Knowledge framework to identify the skills needed for a particular project, and then match at least one "expert" with an "apprentice" who is looking to build those same skills. Together our teams bring excellent qualifications in each of these domains, but it is rare that they all arrive in the form of a single consultant. That framework was published here yesterday.

Today's tip is to "Begin with the business objectives in mind, and map from these objectives to available digital data." Too often, we see compliance and audit teams request data and begin to interrogate it before understanding the data fully or taking steps to validate control totals and/or data completeness. A related mistake is to exhaustively test a single data file without considering supplemental data sources that may yield greater insight or answer related business questions. 


A recent example of why to begin with business questions was a Payroll project that we completed for a retail client. Our team was tasked with searching for "off-the-clock" work. If we had focused only on the available data files, we could have answered questions about meal breaks, rest breaks, and overtime, but perhaps missed other hours worked but not paid. By focusing on the business question first, we identified badge data and cash register data that could show whether employees were in the store and ringing sales while off the clock.

As such, the first step in any data analytics project is brainstorming. You can think of it as part of project planning. During this step, teams should identify the business questions that they want to answer with their analytics efforts, and cross-reference these business questions against available reports and digital data. If existing reports fully answer a business question, then a new query may not be needed**. But if a report does not currently exist, then analytics should be considered, and understanding data sources becomes a key next step. During brainstorming, it is very important to understand the number and complexity of the data sources that will be needed, and to focus on a small enough set of business objectives that the number of data sources does not become overwhelming. It is better to have a series of "small win" analytics efforts than a single larger, less successful project.

Tune in tomorrow for tips on how to understand your data better, and how to explore it before building exception queries. In the meanwhile, comments and suggestions are welcome.

Joe Oringel
Managing Director
Visual Risk IQ
Charlotte NC

** Note that some audit and compliance teams may choose to build queries that replicate existing reports, to test the report validity. For important reports, this re-performance can provide comfort or assurance that the original report is working.

Monday, January 5, 2015

Five Tips for Advancing with Audit Analytics

For many people today, Monday, January 5, is the first work day of 2015. We compliance and audit professionals are like many of our co-workers and friends in that we have new goals and ideas that we expect will set this year apart. We want to grow and develop personally and professionally and have even greater career success, inside and even outside of our current roles. But how?

Even more than in previous years, 2015 is shaping up as the year that analytics will be adopted by the audit and compliance profession, at least according to Internal Auditor, the global professional journal for internal auditors. See the article titled "The Year Ahead: 2015."

This article quotes several high-profile Chief Audit Executives (CAEs) on the subject of analytics. Raytheon's Larry Harrington, a frequent keynote speaker for the IIA, says that "you will see greater use of data analytics to increase audit coverage without increasing costs" and that "internal audit will leverage analytics from other lines of defense," such as compliance and risk management. Increased use of analytics will lead to greater value from audit and compliance, as measured by management. But if this were easy, wouldn't we all be doing it already? How should we overcome obstacles such as finding the right people, training, and budgets (as cited by the CAEs in this article)?

Visual Risk IQ has been helping audit and compliance professionals see and understand their data since 2006. We work with all leading audit-specific tools (e.g. CaseWare IDEA, ACL, and newcomer Analyzer, from Arbutus Software), and also with general purpose analytics and visual reporting tools like SQL, Tableau, Oversight, and more. Importantly, we have completed hundreds of engagements for clients across a wide variety of industries.

Our five tips are:

1) Consider skills and experience of the team, not individuals, when planning a data analytics project.
2) Begin with the business objectives in mind, and map from these objectives to available data.
3) Understand your data, and explore it fully before developing exception queries.
4) Consider metric, outlier, and exception queries.
5) Supplement necessary skills with internal or external resources.

We'll be expanding on each of these five tips in blog posts later this week, but here is some information on the first and perhaps most important one. Your people.

1) Consider skills and experience of the Team, not individuals, when planning a data analytics project.

As part of our consulting projects, and for our inward assessment of our own team members, we use an analytics-focused Body of Knowledge framework that has the following seven key components.
  • Project Management
  • Data Acquisition and Manipulation
  • Statistical techniques
  • Visual Reporting techniques
  • Communication
  • Audit and Compliance Domain expertise
  • Change Management and Strategic Thinking
In our experience, data analytics projects succeed when project expectations are matched with the corresponding competencies of team members in these seven areas. It is especially important to note that these body of knowledge components are rarely (if ever?) found at a high level within a single individual, and therefore a team approach is needed to accomplish a successful analytics project.

People who have greater skills at project management or communication of issues may not have the requisite technical experience when it comes to data acquisition and manipulation, or statistical techniques. Similarly, it is common for stronger data specialists to be weaker on audit or compliance domain expertise.

So when planning an audit analytics project, be sure that you've built a team that has each of these key elements in their skill set, and that they have the incentives and team structure to work together and learn from each other's expertise.

Tune in tomorrow for more on how to begin with the business objectives when planning an audit analytics project. In the meanwhile, comments and suggestions are welcome.

Joe Oringel
Managing Director
Visual Risk IQ
Charlotte NC

Wednesday, January 16, 2013

Rutgers WCARS 26 - January 2013, part 1

Back again at Rutgers for the 26th World Continuous Auditing and Reporting Symposium (WCARS), which was delayed from November 2012 due to Hurricane Sandy. Yes, it's been a while since we've updated this blog, but we've become much more active in using Twitter ( @VisualRiskIQ ) to highlight what's new in the world of continuous auditing, continuous controls monitoring, compliance monitoring, and anti-fraud analytics. And though we'll be tweeting from #WCARS26 also, we feel that the Rutgers symposium is too important to cover in only 140 characters.

Presenters and sponsors at the 26th WCARS event include the AICPA and ISACA, all of the Big 4, and software providers such as ACL, CaseWare, CA, CCH, Greenlight, Oversight, and Trintech. Throughout the conference, we expect to see a wonderful mix of software and process case studies, standards reports, and academic papers, all on the subject of Continuous Auditing. The Rutgers definition of CA includes continuous data monitoring (i.e., embedded audit routines), continuous controls monitoring, and continuous risk assessment, and presenters will cover all these topics. An important addition is newly minted PhD Kevin Moffitt, who adds deep expertise in unstructured data analytics and text mining to Rutgers' long history of data analytics on structured data.

Corporate attendees and presenters include advanced users of CA such as Siemens, Verizon, Procter & Gamble, HP, and more. More on the agenda, including selected slides, is available at: http://raw.rutgers.edu/26wcars .

The conference has begun with a nice introduction from Miklos on the status of numerous research and practical CA applications at Rutgers, and is transitioning to Eric Cohen, PwC's XBRL expert. More to come...

Joe Oringel
Harrison, NJ
January 16, 2013