The 2026 Inflection Point
Your board is under pressure. Competitors are integrating artificial intelligence (AI). Investors are asking about it. Market analysts are watching for it. The fear of falling behind is palpable in quarterly earnings calls and investor relations conversations. "When will we adopt AI?" has become an inevitable question in boardrooms across every industry.
Internal audit isn't immune to this pressure. Your audit committee chair has likely already asked: when will audit leverage AI? How will we use machine learning? What's our AI roadmap?
The urgency is real. But here's what most audit leaders misunderstand: you cannot credibly use AI for data analysis without first building a foundation in data analytics.
The first instinct many audit leaders have with AI is to increase productivity: using it to draft documents faster, summarize findings, streamline administrative work. That's useful, but it’s only part of the story. If you only use AI to do the same compliance work faster, you've accelerated without changing your value or relevance. The real power lies in analysis – the ability to understand why something happened and what it means for the business, and to drive organizational change based on that insight. That requires structured data, clean processes, and proven analytics capability. You cannot skip to AI without mastering analytics first.
The reality is stark: if you don't have a reputation built on data analytics success today, you will not be credible with AI tomorrow. That's why going from zero to one in analytics isn't just another initiative on your roadmap. It's the foundation for everything that comes next. The board wants AI. The market demands it. But the path to getting there runs through data analytics – and that journey starts now.
This article is the first in a six-part series designed to help you navigate that journey – not as a technology project, but as a leadership transformation. We'll cover launching a program that sticks, moving from episodic to continuous auditing, knowing when to hand analytics off to the business, and ultimately positioning internal audit as a strategic enabler. But we start here, at the beginning: zero to one.
The Uncomfortable Truth: If You're Still Sampling in 2026, You're Behind
It's early morning in February 2026, and your audit committee chair leans forward with a question that feels both inevitable and urgent. "Your team reviewed ten thousand expense reports last year using sample testing. How confident are you that you caught everything?" You pause. The honest answer is uncomfortable: probably not very.
That moment – when the limitations of manual, sample-based auditing become impossible to ignore – is where most audit leaders find themselves right now. Let's say it plainly: if you're still auditing travel and expense or procure-to-pay processes using random samples and manual review, you're operating in a way that no longer serves your organization or your team. This isn't judgment. It's math.
When you sample ten percent of a population, you're betting on probability. A Big 4 study on controls testing effectiveness found that random sampling and traditional audit methods frequently fail to identify concentrated risk areas: policy violations and anomalous transactions cluster in specific transaction types, locations, or user populations that sample-based approaches systematically miss. The false sense of assurance sampling provides is more dangerous than acknowledging the gap.
Worse, manual testing forces a no-win trade-off: either you broaden your scope and exhaust your people, or you stay narrow and pretend you've covered the risk landscape. Neither serves your organization well. You end up reactive, defensive, and irrelevant to the conversations that actually matter.
Consider the downstream ripple. While your best people are manually matching invoices to purchase orders, your organization is dealing with cyber risk, third-party vulnerabilities, and AI governance. You cannot do both well. The analytics alternative is simpler: test entire populations, not samples. Identify patterns at scale. Free your people to think.
Why Analytics Programs Fail: The Cascade of Consequences
Before we talk about how to succeed, we need to name why most don't.
The pattern is predictable. A CAE makes a compelling case for analytics tools. Leadership approves. Then reality hits. Training happens, but time to operationalize the analytics within actual audits? That's not budgeted. Data access, which was never a conversation before? Now it's a surprise. Your team, already allocated at 80% to SOX testing and compliance work, has no oxygen left for something new.
Here's what happens next: the audit that was supposed to take 500 hours doubles. Nothing starts on time. Your downstream projects get pushed. Your team is burned out trying to wrap up one audit while starting another. You're on the hot seat with your audit committee because you've only delivered two of five planned projects this quarter. Your audit leaders are defensive with business stakeholders.
But there's a subtler consequence: you damage your credibility. When findings take months to investigate, when you send a list of 50 exceptions from six months ago to a finance team closing the books next week, when audit feels like a check-box exercise rather than a strategic partner – you're reinforcing the "police auditor" reputation that's hard to shake.
Most critically, your team loses faith in the initiative. When auditors experience an analytics project that derailed their calendar, created chaos, and produced minimal value, they internalize a lesson: analytics are nice-to-have, not essential. Nobody wants to volunteer for the next one. Your best people start updating their LinkedIn profiles. The program stalls for years.
Challenge One: The Data Isolation Problem
Most organizations don't intentionally create fragmented data environments. They inherit them. A legacy on-prem ERP coexists with newer cloud systems. Finance maintains their own data warehouse. Supply chain runs parallel transaction systems. Cloud migration projects introduce second ERPs before old ones are fully retired. Sales operations have their data lake. It's the accumulation of decisions made by different teams at different times – and now you're managing the complexity. Nowhere in your audit plan did you account for the conversations required to navigate this landscape and secure read-only access to what you actually need.
Here's the revealing part: internal audit often doesn't have data access simply because no one asked. It's not malice. It's inertia. IT has other priorities. No one approached them about audit's needs. So the default is no.
But here's what changes that: when audit educates IT about requirements and explains that read-only access is appropriate for their independence mandate, the answer often changes. Many organizations will say yes immediately – they just needed to hear the business case. The problem is compounded by audit's historical reputation. Business departments sometimes protect data because they remember the last time audit came in and said "you're doing everything wrong." They're not guarding data from IT. They're guarding it from audit's judgment.
Yet timing matters enormously. If you ask for data in November when your audit is in January, you've already lost. Tom O'Reilly of the Internal Audit Collective recommends a critical shift in timing: meet with data governance teams at the very start of the planning cycle – before audit plans are finalized – to map data needs across the full year. This can be the difference between having data ready and having audits slip by weeks.
The data team also benefits from early engagement. When audit gives them visibility into the audit plan, explains what's needed, and respects their time, they can actually do better work. They'll research what audit is trying to accomplish and often come back with ideas audit didn't think to ask for. They'll suggest more efficient extracts. They'll flag data quality issues early. The relationship becomes a partnership instead of a transaction.
Challenge Two: Finding Time Within Your Audit
Even with data in hand, your team faces a real constraint: there's no built-in time for analytics in a traditional audit cycle. Learning the tool, scoping analytics, defining thresholds, troubleshooting data issues – all take time. Yet your audit timeline hasn't changed. Your people are already allocated. Most teams try to squeeze this into nights and weekends, which leads to burnout and signals that analytics are optional, not essential.
Challenge Three: False Positives and Poor Communication
You run your first analytics test and surface 200 exceptions. Your team spends time investigating noise. Your business owners receive a list of historical findings they can't act on. Finance doesn't care about exceptions from six months ago if they've already closed the books. The message becomes: audit creates work, not value.
Building Your Business Case (Without Making It Complicated)
Here's what many CAEs get wrong: they overthink the business case.
You don't need an elaborate ROI model or a 40-page deck. You need to show up and make a simple pitch. Put simply, if you're a chief audit executive and you don't know how to make a business case for data analytics, you may not be in the right seat. That's not harsh. It's clarity. You've already made business cases for audit budgets, expanded headcount, new audit areas. This is the same skill.
Your business case should answer three questions:
- What problem does this solve?
We're currently testing samples in high-volume areas and missing material exceptions. We're spending 80% of our time on compliance testing. We're not positioned to support the strategic risks our board cares about.
- What's the alternative?
Stay where we are. Or invest in a single focused analytics test this year, prove the model, and expand next year.
- What does success look like?
For your first win, success isn't "we found fifty exceptions." Success is "we tested 100% of a population in the same time it used to take us to test 10%, we identified a pattern the business can fix going forward, and now we have a reference case to build on."
That's it. Pitch it. Most leadership teams approve analytics initiatives. The issue isn't approval. It's execution.
The Minimum Viable Approach: How to Actually Start
Breaking the cycle requires embracing constraint, not fighting it. You don't start with enterprise-wide analytics. You start small, prove value, and expand.
Step One: Pick One High-Impact Audit (Scope)
Choose an audit that is high-volume (enough transactions to make sampling visibly insufficient), high-risk (real consequences for misstatement), and data-accessible. Travel and expense or procure-to-pay are classics. Don't pick the most complex process. Pick the one where analytics can prove immediate value.
For geographically distributed organizations, start smaller still: if you operate in 20-30 geographies, don't go global. Start with 1-10 where you can control variables and prove the model before rolling out. Different maturity levels create noise. Prove value first, then scale.
For a real-world example, consider a firm with ten thousand expense reports annually. Instead of testing all ten thousand manually or sampling a thousand, they focused on the top 20% of spenders – roughly 800 transactions per quarter. The population was still material but manageable. The risk was clear. The data was accessible. And when they found a pattern of policy violations concentrated in one region, it was actionable and relevant to finance.
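The scoping step itself is mechanically simple. A minimal sketch in Python, assuming a flat expense-report extract; the column names and figures are illustrative, not a specific ERP schema:

```python
import pandas as pd

# Hypothetical expense-report extract (column names are assumptions).
reports = pd.DataFrame({
    "employee_id": ["E1", "E1", "E2", "E3", "E3", "E3", "E4"],
    "amount":      [1200,  300,  150,  900, 2500,  400,   75],
})

# Rank employees by total spend, descending.
spend = reports.groupby("employee_id")["amount"].sum().sort_values(ascending=False)

# Keep the top 20% of spenders (at least one).
top_n = max(1, int(len(spend) * 0.20))
top_spenders = spend.head(top_n).index

# Scope the test population to those spenders' transactions.
population = reports[reports["employee_id"].isin(top_spenders)]
```

The point is not the code; it's that the scope decision (top 20% of spenders, not 20% of transactions) is explicit, repeatable, and defensible to the audit committee.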
Step Two: Get Data Access Now – Before the Audit Starts
This is the critical move. In your 2026 audit plan, for every analytics audit, identify the data source and schedule the conversation early. Don't wait until fieldwork begins. Reach out to the IT lead or data steward in August for Q1 audits. Explain what you're trying to accomplish. Acknowledge that pulling data takes time and resources. Share your timeline and ask what they need from you to prioritize your request.
Build this relationship. You're not order-taking. You're partnering. When the data team delivers, they're part of your success. When they have constraints, loop them in early because you've treated them as partners, not vendors. And when they come back with ideas – ways to structure the data better, related datasets that could help, data quality issues worth knowing about – listen. They're thinking critically about your problem now.
Step Three: Define Success Before You Test
Before you build the query, sit with your audit team and the business owner. Define exception types that matter: duplicate payments, approval violations, policy breaches. Establish thresholds. Decide which findings warrant investigation, which warrant a conversation, which warrant an audit adjustment.
This prevents false positives. When you find something, you already know if it's real. Your team doesn't chase noise. Your business partner doesn't receive a report full of exceptions that don't actually matter.
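Those agreements are worth codifying so triage is explicit rather than ad hoc. A minimal Python sketch; the categories and dollar cutoffs are illustrative assumptions that would come out of the conversation with the business owner, not fixed standards:

```python
# Hypothetical thresholds agreed with the business owner before testing.
THRESHOLDS = {
    "investigate": 10_000,  # at or above this, a finding warrants investigation
    "discuss": 1_000,       # at or above this, a conversation with the process owner
}

def triage(amount: float) -> str:
    """Classify an exception per the pre-agreed thresholds."""
    if amount >= THRESHOLDS["investigate"]:
        return "investigate"
    if amount >= THRESHOLDS["discuss"]:
        return "discuss"
    return "note only"
```

Writing the rules down before the first run is what keeps the exception report short and the conversation with the business forward-looking.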
Step Four: Run One Test, Not Ten
Pick one clear, bounded test. Maybe it's identifying duplicate payments. Maybe it's flagging invoices that bypass approval workflows. Run it. Find exceptions. Validate. Report. The whole cycle should take weeks, not months. You're proving analytics work, not solving the entire audit universe.
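A duplicate-payments test really can be this bounded. A minimal sketch, assuming a flat accounts-payable extract with hypothetical field names; a real test would also consider near-duplicates (transposed digits, slightly different amounts), but the exact-match version is the right first pass:

```python
import pandas as pd

# Hypothetical AP extract (field names are assumptions for illustration).
invoices = pd.DataFrame({
    "vendor_id":  ["V1",   "V1",   "V2",   "V3",   "V3"],
    "invoice_no": ["1001", "1001", "2001", "3001", "3002"],
    "amount":     [500.0,  500.0,  750.0,  120.0,  120.0],
})

# Flag potential duplicate payments: same vendor, invoice number, and amount.
# keep=False marks every row in a duplicated group, not just the repeats.
dupes = invoices[invoices.duplicated(
    subset=["vendor_id", "invoice_no", "amount"], keep=False
)]
```

Note that this runs against the entire population, not a sample; that is the whole argument of the first test condensed into one query.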
The First Win: How to Communicate It and Build Internal Momentum
Here's where most teams stumble: they surface findings and send them up the line like a checklist.
Instead, send insights. Find your top 3-4 discoveries and frame them for forward-looking value. For finance, that might mean: "Here's a pattern in your month-end close that could save time next month" rather than "Here are 50 exceptions you need to investigate from December 2024."
The message shift is subtle but powerful: audit is not asking the business to justify past work. Audit is helping them improve future work. That changes the conversation entirely. The business owner goes from defensive posture to interested listener.
And here's something that's rarely discussed: your first win builds internal momentum through peer influence. When your audit team members complete their first analytics project, they talk about it. They tell colleagues what they learned. When peers hear that analytics made an audit faster or surfaced a meaningful insight, they want to try it themselves. This informal "water cooler" reputation is how analytics adoption spreads. It's not a top-down mandate. It's peer-driven interest.
What Not to Start With
Do not start with a tool that only your team uses. When analytics capability lives with a single subject-matter expert, organizational knowledge becomes fragile. That person becomes irreplaceable, and when they move on, the program often stalls. Instead, choose platforms that allow knowledge sharing and lower barriers to contribution across the team. Choose tools that align with how your organization thinks about data. If your company uses Python or SQL, follow that. Your analytics should look like the organization's analytics, not like an audit island.
Do not stop at the tool purchase. Buying a platform is fine. It's the starting line, not the finish line. The mistake most CAEs make is treating the purchase as the end of their work – handing it off to the team and expecting adoption to follow. It doesn't work that way. Analytics is not a technology problem. It's a leadership problem.
The CAE has to actively lead what happens after the tool arrives: making daily choices about priorities, defining when analytics should be used and when they shouldn't, guiding how findings are translated into insights rather than just exception lists, removing barriers your team encounters, celebrating progress, and course-correcting when adoption stalls. You need to be visibly invested in the deployment, the early projects, and the strategic choices about how analytics serve your audit mission. If you delegate it to your team after purchase, it fails. If you lead it from start through execution, it succeeds.
Why This Is a Leadership Decision
Once you've proven value with your first test, you face the real challenge: making it stick and scale.
It means pushing back when your team defaults to manual testing. It means saying no to audit requests that don't align with your analytics strategy. It means celebrating the people learning these skills and creating space for them to grow, even if it means temporarily slower delivery on other fronts.
Most importantly, it means reframing what audit delivers. Analytics isn't about finding more exceptions. It's about turning findings into insights and insights into action. It's about helping the business understand what's happening and why. It's about being a partner in improvement, not a policeman.
Here's what's empowering: you've already done this. All auditors do. You spend your career writing recommendations and influencing their acceptance. You oversee remediation. You manage change. Don't change who you are. Lean into who you are. You're a change agent.
With analytics, you're just facilitating this at a bigger scale. Implement change incrementally, but monitor it aggressively. Maintain a plan. Get continuous feedback. Communicate positively about progress. Address obstacles swiftly. That's leadership. That's also exactly what you've been training for your whole career.
The Road Ahead
You're at the threshold. You understand the potential. You see the limitations. You're ready. But you also see the obstacles.
Start with zero to one. Pick your single audit. Build the data relationship. Define success. Run one clean test. Prove the value.
The momentum will compound. Your team will see what's possible. Your organization will trust audit data more. And you'll have created the foundation for the strategic, high-impact audit function you've always envisioned.
In the next article in this series, we'll explore how to move from episodic analytics to continuous auditing – turning one-off tests into repeatable routines that watch your highest-risk processes in real time. We'll discuss which processes warrant continuous testing, what governance looks like, and how to avoid creating busywork for your team.
The moment is now. Not because the pressure is new, but because your response to it defines whether internal audit becomes more strategic or slowly fades from boardroom conversations. Start small. Start now. Make it stick.
Nikki is a freelance writer, editor, proofreader, and general word-nerd. Nikki has a 20+ year career background in internal audit, risk, and fraud, and now applies that knowledge in her writing and editorial work, rather than in daily practice. She holds the Certified Internal Auditor (CIA), Certification in Risk Management Assurance (CRMA), and Certified Fraud Examiner (CFE) designations. She is also an active member of both the Institute of Internal Auditors (IIA) and the Association of Certified Fraud Examiners (ACFE).
