Saturday, November 5, 2011

Another Saturday Morning in Newark NJ

Very energizing sessions this morning, as we heard from a Who's Who of large, multinational firms that have implemented CA and CM solutions. Siemens Financial Services led things off with their "Road to Continuous Assurance." Jason Gross leads a mature CM function there that was born in Internal Audit and has since migrated to the CFO's office. His deck is downloadable at: http://raw.rutgers.edu/23WCARS

Brad Ames from HP followed with another strong presentation on using CA / CCM for assessing both IT controls and Financial Controls. @43Chase and @debreceny observed that strong IT controls help enable strong financial controls. I was focused on their use of dashboards at HP, and have asked for examples. Stay tuned.

Dave Levin of Procter & Gamble followed with a strong session on the use of data-driven risk assessments. They compare the results of Control Self Assessment (CSA) with actual audit results, using outliers and differences between management's assessment (i.e., the CSA) and Internal Audit's evaluation as input into Internal Audit's risk assessment. Dave's session is available for download at this link.

Friday, November 4, 2011

Leveraging Information to Align Risk and Performance - CM, per KPMG

Jim Littley from KPMG is talking about Continuous Monitoring (CM) / Governance Risk & Compliance (GRC) / Business Intelligence (BI), and all of the alphabet soup of technology tools that can be used to improve controls and risk monitoring. He observes that most large organizations have multiple initiatives related to acquiring and implementing tools and technologies for point solutions in this area, but these are siloed and rarely linked together. He sees Internal Audit as a potential value-creator here.

Good points. We see Procurement teams with supply chain analytics, Finance with BI and macro-level analytics, Internal Audit with audit data analytics, and ERM or Risk with survey tools for subjective risk assessment, sometimes all in the same firm. Ideally, macro-level analytics tools like BI should work together with the exception analytic tools in the CM world to provide a single, integrated view of risk.

Jim suggests we think of Continuous Monitoring as the first line of defense, and Continuous Auditing as the second or third line of defense. Using common data sources (i.e. a single source of truth) can lower the cost of acquiring data for each initiative, and improve overall quality.

Slides aren't posted (yet?), but I'll update this post with a link if they are made available.

Opening Rutgers WCARS session - Continuous External Auditing

The opening panel was led by Greg Shields of the Canadian Institute of Chartered Accountants (CICA) and included Deloitte's National Office Partner Tom Criste, Retired Deloitte Partner Trevor Stewart, and PhD Student Paul Byrnes. It was a little disappointing that more signing partners from more accounting firms were not on the panel. Perhaps that would help unlock the code behind the very slow adoption of technology to execute external audits.

Much of the emphasis was on the degree of change that would be needed for the firms to seriously re-engineer their processes. My favorite quote from the session was from Tom Criste, who observed that the great increases in technology have affected how audits are documented, but not how audits are performed. The work programs for Inventory, A/R, Cash, etc., are relatively unchanged even from when he entered the profession decades ago. And because many procedures (e.g., Inventory Observation, confirmations of A/R balances) are required by professional standards, it would be difficult to re-engineer the audit.

Mr. Criste envisions an audit where statisticians and economists could review data and help form the External Auditor's opinion. He suggests that a test audit could be performed in parallel with a traditional external audit, and that the firm could compare results and findings with each other and with the client. But, he asks, who would want to invest that time and energy, even if the second audit were free?

If that's truly the barrier, I'd suggest starting with the users of financial statements. Would MF Global's investors and creditors like to have had any assurance provided on quarterly financial results? Probably so.

I'd advocate beginning with the end in mind, and determining the desired frequency of external audit assurance. More than annual is probably good. Daily is probably way too frequent. (What CEO wants to explain slow mid-month sales to Wall Street analysts?)

If quarterly assurance were desired, how should external audit procedures be changed? Comments welcome!

Thursday, November 3, 2011

Live from Rutgers WCARS - Friends and Family meeting

Most of you reading this blog post have an awareness of, and perhaps even a keen interest in, data analysis and/or continuous auditing, whatever we agree that means. You may not know how long this topic has been discussed and debated.

I'm writing this from the 23rd (!) World Continuous Auditing Symposium at Rutgers Business School in Newark NJ. It's been a semi-annual meeting, so the group began gathering in 1999. All of the Big 4 firms are here, as are the AICPA, software vendors like ACL, Caseware, Oversight, and even CA. For more information on the agenda, see: http://raw.rutgers.edu/23WCARS .

Beginning tomorrow morning, I'll be blogging about the most interesting speakers, topics, and academic papers on the main agenda, so come back often for updates.

Today is the "Friends and Family" meeting, where some of the longer-standing supporters of the Rutgers program are discussing emerging issues. One topic on the agenda is the notion of Audit Data Standards, which would be a common data model for certain business processes like General Ledger, and perhaps subledgers like Supply Chain or Revenue.

The presenters advocate a cloud-based data store into which public companies would load daily, or at least monthly, transactions, and which external auditors (and perhaps internal auditors) would access periodically to perform audit analytics. Glad I'm here - there are a lot of pros and cons to consider with this standardization.

Sunday, September 11, 2011

Remembering 9/11/2001

It was afternoon for me in Ireland, where I was working on a project with Bristol-Myers, on a conference call with our NYC offices in Midtown East. "Joe, we need to reschedule the call, a small single-engine plane has hit the WTC. Everybody is turning to the news to see what's going on." I wish that was what had happened.

My wife was 8 months pregnant with our second child, whom we named Juliana to honor a young daughter and her mother, an Irish national, who both lost their lives on the plane that hit the towers. Juliana McCourt never saw her 5th birthday. Thankfully her uncle made it down 50+ stories of the WTC. Later that day he would learn of his sister's and niece's deaths.

I'll never forget the outpouring of support for NYC and the whole US from the city of Dublin and the whole country of Ireland. It took me nearly a week to return home, yet given the tragedy all around us in NY/NJ, we know we were still so very fortunate.

Where were you? What should we teach our children?

(written from 30000 feet, courtesy of Gogo. Eerie to be flying cross-country today. But very proud of our country's resilience and the feeling of safety as I travel.)

Friday, August 19, 2011

New GTAG on Data Analysis from IIA

The IIA has published a new Global Technology Audit Guide, the 16th in a series. It is available for free download (IIA Members only) or purchase (for non-Members) at the following link.

Noteworthy in this GTAG is the use of a Maturity Model that outlines the progression from Basic Analytics through to Continuous Monitoring. While the model is simpler and less prescriptive than the one we published previously (download the WG&L article featuring Arrowpoint Capital here), we believe it represents important guidance to help audit teams advance from "zero to 60" in the use of data analytics.

What do you think of the new GTAG? We'll provide more thoughts on this guidance next week...


Thursday, August 18, 2011

Emerging IT Issues for Audit & Compliance Leaders

In recent months, I've had the opportunity to re-kindle my professional relationship with Professor Scott Fargason. Scott and I have previously worked together as co-instructors for a variety of training events, both while I was at Deloitte & Touche and more recently as volunteers and paid presenters at various IIA and other training conferences. We find our backgrounds and experiences dovetail quite well - his academic and legal training is top notch, and I bring a practical "here's what works" perspective from my time in both consulting and industry.

We've come together to build a customized one- or two-day CPE seminar titled "Emerging IT Issues for Audit and Compliance Leaders" that we believe offers something quite new, most notably a two-instructor format with total costs similar to courses with only a single instructor.

For more information, please download a two-page flyer on the course and contact either of us for questions about availability.

Thursday, August 11, 2011

Continuous Auditing and Monitoring Bootcamp Scheduled in Houston

Visual Risk IQ will be leading a one-day workshop in Houston on Tuesday 9/27, hosted by the Texas Society of CPAs. The workshop is designed to help you get started on the path to delivering measurable results with continuous monitoring and auditing. This class builds on a highly reviewed program delivered in Atlanta earlier this year.

We will discuss overall methodology, detailed design for constructing a CA program, talk about current technology, and demonstrate how to turn “CA” into “CM” to benefit your entire organization.

Outcomes from the class will include a Company-specific roadmap, customized for your business and stakeholders, to target the risks you identify and benefits you want to achieve.

Attendees will receive up to 7.5 hours of NASBA-compliant CPE and cover the following learning objectives:

1. Business challenges today, and how early detection mitigates greater risks
2. Working definitions of CCM, CM and CA and supporting technologies
3. How to determine where your organization is on the CA/CM maturity model
4. Hurdles to CM/CA implementation and how to overcome them
5. How to dive deeper to determine specific needs in developing the roadmap for CM/CA implementations
6. Proven methods for engaging other stakeholders

For more information, see the following Registration page to download a more detailed program description.


Wednesday, June 15, 2011

Continuous Auditing and Monitoring Bootcamp Scheduled in Atlanta

Visual Risk IQ will be leading a one-day workshop in Atlanta, hosted by the Georgia Society of CPAs. The workshop is designed to help you get started on the path to delivering measurable results with continuous monitoring and auditing.

We will discuss overall methodology, detailed design for constructing a CA program, talk about current technology, and demonstrate how to turn “CA” into “CM” to benefit your entire organization.

Outcomes from the class will include a Company-specific roadmap, customized for your business and stakeholders, to target the risks you identify and benefits you want to achieve.

Attendees will receive up to seven hours of NASBA-compliant CPE and cover the following learning objectives:

1. Business challenges today, and how early detection mitigates greater risks
2. Working definitions of CCM, CM and CA and supporting technologies
3. How to determine where your organization is on the CA/CM maturity model
4. Hurdles to CM/CA implementation and how to overcome them
5. How to dive deeper to determine specific needs in developing the roadmap for CM/CA implementations
6. Proven methods for engaging other stakeholders

For more information, see the following Registration page, or see our Events webpage to download a more detailed description.

Monday, April 4, 2011

An $8 Million Question: Why do auditors test changes to Vendor Master Files?

One of the early audit tests that I was responsible for was to review who had access to change our vendor master file, and to make sure that all those changes were logged, reviewed, and approved. Our audit objective was validity - making sure that all changes to the master file(s) were properly authorized. But even authorized changes to the master file create risk.

Case in point: Conde Nast's $8 million email scam, as reported in this Forbes Magazine blog posting from William Barrett and Janet Novack.

What seems to have happened in the Conde Nast case is that a fraudster sent in a change-of-address / change-of-banking-information request on behalf of a legitimate vendor. But the bank information provided was not the real vendor's; rather, it was for an account set up by the fraudster under a name and address similar to the real vendor's. So properly authorized payments totaling nearly $8 million were misdirected. The fraud was detected when the real vendor called to ask "where's our money?"

A variety of preventive and detective controls came to mind as I read this story. How are changes to address and/or bank information communicated by your suppliers? How are these changes corroborated?

How might data analysis be used to identify mismatches between supplier names and addresses? It seems like a good time to ask at your organization, even if an AP audit is not on the current quarter's schedule.
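As an illustrative sketch only (the field names, threshold, and helper are hypothetical, not from any particular audit tool), one simple analytic is to fuzzy-match the vendor name on each bank-detail change request against the names already on the master file, and flag changes whose name is close to, but not exactly, an existing vendor:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1]; 1.0 means identical after normalization."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def flag_suspect_changes(changes, master, threshold=0.85):
    """Flag bank-detail changes whose vendor name closely resembles,
    but does not exactly match, a name already on the master file."""
    flags = []
    for change in changes:
        for vendor in master:
            score = similarity(change["name"], vendor["name"])
            if threshold <= score < 1.0:
                flags.append((change["name"], vendor["name"], round(score, 2)))
    return flags

master = [{"name": "Acme Industrial Supply"}]
changes = [
    {"name": "Acme Industrial Suply", "field": "bank_account"},  # near-duplicate
    {"name": "Northwind Traders", "field": "bank_account"},      # unrelated name
]
print(flag_suspect_changes(changes, master))
```

In practice you would also compare addresses and bank routing details, and route every flagged change to someone who corroborates it with the vendor via a known-good phone number.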

Joe Oringel
Visual Risk IQ
Charlotte, NC USA


Monday, February 21, 2011

Book Review - A Great Read for Data Analysis Folks

Just finished Malcolm Gladwell's "What the Dog Saw" on a long plane ride this weekend. Like his other books (The Tipping Point, Blink, and Outliers), it is full of great stories and examples for those of us involved in data analysis, including internal auditing and especially continuous auditing.

Gladwell's current book is actually a collection of essays from The New Yorker, but they piece together nicely, so the essays can be read in sequence or by selecting chapters of interest. If you have time to read only one chapter, I'd point you toward "Open Secrets: Enron, Intelligence, and the Perils of Too Much Information."

The book points out that Enron's Special Purpose Entities (SPEs) were entirely transparent, to a fault: each of the more than 3,000 SPEs involved an average of 1,000 pages of filings, and even an executive summary of a single SPE ran 40 single-spaced pages. So the challenge in understanding the financial risks of the SPEs was filtering an insanely large volume of data into a form that was manageable, comprehensible, and actionable.

As you progress on the Continuous Auditing and Continuous Monitoring Maturity Curve, you'll find that your teams are amassing a similarly overwhelming (though hopefully not as large!) set of source data and anomalies to review. How do you see the source data? How do you see the exceptions? How do you decide which ones to act on?

Most data analysis efforts that we have worked on have a goal of identifying individual exceptions (rows, in database speak). Say an AP vendor shares an address or tax ID number with an employee, or a sales invoice carried a discount in excess of a contract maximum. To act, we send an email to someone, maybe with a spreadsheet attached, to research and resolve the exception row.

But let's learn from Enron's SPEs. If we send 3,000 emails, how will we manage the follow-up? Can we use color and graphs to measure the magnitude of the exceptions in total? How should we identify transactions that are acceptable one by one (example: a $9,500 requisition from a manager with $10,000 signing authority) but unacceptable as a larger series (say, ten $9,500 requisitions from that same manager, all within the same week)?
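The split-requisition question can be answered with a simple aggregation rather than a row-by-row test. As a minimal sketch (the field names, limits, and grouping by ISO week are illustrative assumptions, not a prescribed design), group requisitions by requester and week, and flag any group whose total exceeds the requester's signing authority even though every individual item is under the limit:

```python
from collections import defaultdict
from datetime import date, timedelta

def flag_split_requisitions(requisitions, limits):
    """Group requisitions by (requester, ISO week) and flag any group whose
    total exceeds the requester's single-transaction signing authority,
    even though each individual requisition is within the limit."""
    groups = defaultdict(list)
    for r in requisitions:
        week = tuple(r["date"].isocalendar()[:2])  # (ISO year, week number)
        groups[(r["requester"], week)].append(r["amount"])
    flags = []
    for (requester, week), amounts in groups.items():
        limit = limits[requester]
        if max(amounts) <= limit and sum(amounts) > limit:
            flags.append((requester, week, sum(amounts)))
    return flags

# Five $9,500 requisitions from one manager, Monday through Friday
limits = {"mgr01": 10_000}
reqs = [{"requester": "mgr01", "amount": 9_500,
         "date": date(2011, 2, 14) + timedelta(days=i)} for i in range(5)]
print(flag_split_requisitions(reqs, limits))
```

No single row fails the authority test, but the weekly aggregate does, which is exactly the pattern a row-at-a-time exception report would miss.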

Monday, January 31, 2011

How to update the IIA GTAG for what's new in Continuous Auditing?

Today's blog seeks to assimilate some of what we heard at the IIA International Conference last summer in Atlanta, in advance of this week's IIA Working Group discussions regarding Global Technology Audit Guide (GTAG) #3 on Continuous Auditing. The purpose of the Working Group discussions is to identify areas of the GTAG that require updating, so we are pleased to be able to participate with such an esteemed group of colleagues.

I wrote last summer about the IIA International Conference, and how Data Analysis and Continuous Auditing were discussed in three of the more interesting presentations at that conference. Those presenters were Dan Kneer, Steve Biskie (ACL Services), and Robert Mainardi, and each spoke on some combination of Data Analysis, Continuous Auditing, and Continuous Monitoring. Though they used many of the same words and terms, their perspectives often seemed quite different.

Is Continuous Auditing about audit project selection and Risk Assessment? Yes. So techniques such as regression analysis, ratio analysis, and other analysis of aggregate data should be considered in any GTAG update. But Continuous Auditing is also about more frequent updates of subjective data, like control self-assessments, surveys, and call program activities. Similarly, Continuous Auditing can and should include data analysis of transaction details. Greater frequency of data analysis, including Visual Reporting, also aids greatly in Risk and Control Assessment activity. Perhaps text analytics and analysis of unstructured data belong too.
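To make the aggregate-data idea concrete, here is a minimal ratio-analysis sketch (the figures, the metric, and the z-score threshold are all hypothetical, chosen purely for illustration): compute a ratio per period and flag periods that deviate from the mean by more than a chosen number of standard deviations.

```python
import statistics

def flag_ratio_anomalies(ratios, z_threshold=1.5):
    """Flag periods whose ratio deviates from the mean by more than
    z_threshold sample standard deviations -- a simple aggregate test
    for directing audit attention, not a conclusion in itself."""
    mean = statistics.mean(ratios.values())
    stdev = statistics.stdev(ratios.values())
    return [period for period, r in ratios.items()
            if abs(r - mean) > z_threshold * stdev]

# Monthly gross-margin ratios (illustrative figures)
margins = {"Jan": 0.41, "Feb": 0.40, "Mar": 0.42, "Apr": 0.41,
           "May": 0.39, "Jun": 0.28}
print(flag_ratio_anomalies(margins))
```

A flagged period is an input to risk assessment and project selection, not an exception to be resolved; the follow-up is a drill-down into that period's transaction details.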

Looking forward to this week's Working Group, so we can find some distinctive words to distinguish the various types of Continuous Auditing activities for inclusion in any updates to the GTAG.


Joe Oringel
Visual Risk IQ
Charlotte, NC USA