Categories
FCPA Compliance Report

FCPA Compliance Report – Navigating Compliance in 2026: Trends and Transformations

Welcome to the award-winning FCPA Compliance Report, the longest-running podcast in compliance. In this episode, we replay a recent webinar Tom Fox participated in, hosted by EQS. The panel moderator was Steph Holmes, and the panelists were Tom Fox, Mary Shirley, and Matt Kelly.

The session focuses on six key 2026 trends for ethics and compliance programs:

(1) AI moving from experimentation to operational use, emphasizing deliberate scaling, human-in-the-loop oversight, governance frameworks, monitoring, and managing “shadow AI,” with practical use cases such as policy chatbots, gift/travel/entertainment reviews, and AI-enabled third-party risk lifecycle management;

(2) enforcement “volatility” and unpredictable regulatory signals, with emphasis on returning to fundamentals such as documenting program inputs and outcomes, and noting continued activity, including record FCA resolutions and a DOJ whistleblower program award leading to a rapid antitrust settlement;

(3) shifting employer–employee dynamics, including Gartner survey findings that 40% of employees would intentionally miss a compliance requirement to harm their organization, discussion of trust, employee sentiment, multi-generational communication differences, and the need to partner with HR while staying within organizational lanes;

(4) heightened third-party and supply chain risk expectations, including cybersecurity, tariffs/tariff evasion, export controls, and the need to unify siloed risk views into a holistic third-party risk assessment;

(5) anticipated increases in whistleblowing and investigation demands amid volatility, highlighting the importance of preventing retaliation, keeping reporters feeling heard through responsive communications, triage protocols, and anonymized case examples to build trust; and

(6) measuring program effectiveness through a shift from outputs to outcomes, including reviewing KPIs and key risk indicators, peer review of investigations, hotline “mystery shopping,” and gap analyses against the DOJ’s ECCP and compliance program hallmarks, with special emphasis on third-party documentation and ongoing monitoring.

Resources:

Mary Shirley on LinkedIn

Steph Holmes on LinkedIn

Matt Kelly at Radical Compliance

EQS

Tom Fox

Instagram

Facebook

YouTube

Twitter

LinkedIn

Returning to Venezuela on Amazon.com

Categories
Blog

2026 Ethics & Compliance Trends in a Year of Volatility

Ethics and Compliance programs are entering 2026 under pressure from every direction at once. Enforcement signals are uneven and often contradictory. Regulatory expectations are evolving without clear glide paths. Boards are demanding proof of effectiveness, not just activity. Meanwhile, inside organizations, trust is fragile, employee engagement is strained, and ethical risk is increasingly driven by stress, uncertainty, and disengagement rather than overt malice.

I recently participated in the EQS-sponsored webinar, 2026 Ethics and Compliance Trends for Ethics and Compliance Programs: From Insights to Action. This webinar clearly framed the moment: the challenge is no longer simply identifying risk categories. The challenge is operating a compliance program that remains credible, defensible, and effective amid volatility. For compliance leaders, this is not a year for hype, shortcuts, or silver bullets. It is a year for disciplined execution.

This article distills the core themes emerging for 2026 that we explored in the webinar and explains why they demand a shift in how compliance programs are designed, governed, and measured. My co-panelists were Mary Shirley and Matt Kelly. Steph Holmes hosted us.

AI in Compliance: From Experimentation to Operational Reality

By 2026, artificial intelligence in compliance is no longer optional or novel. Most large organizations have already deployed AI in some form, such as intake triage, classification, translation, summarization, or search. What has changed is the expectation. Boards and executives now want results. This is where many programs will struggle.

AI works best today in structured, repeatable tasks. It can accelerate intake, reduce manual review, and surface patterns that humans might miss. But AI does not eliminate work; it rearranges it. Review, exception handling, governance, and oversight do not disappear. In many cases, they expand.

The real risk in 2026 is not AI itself. It is scaling too quickly without ownership, governance, or boundaries. Compliance teams that attempt to automate judgment-intensive decisions, such as investigations, escalations, or remediations, invite defensibility problems they cannot explain to regulators or boards. Successful programs will treat AI as an operational tool, not a strategic shortcut, and will clearly define where human judgment remains non-negotiable.

Regulatory Volatility, Not Regulatory Retreat

One of the most dangerous misreads in compliance today is the belief that shifting enforcement signals equals reduced risk. The reality is closer to the opposite. As the webinar materials emphasize, enforcement risk in 2026 is not disappearing; it is fragmenting. Political cycles, regional differences, and sector-specific priorities create uneven pressure, but exposure remains real. Whistleblower incentives continue to drive cases regardless of rhetoric. Cross-border cooperation persists even when domestic messaging softens.

The compliance mistake in volatile periods is overcorrection. Programs that scale back controls, staffing, or oversight in response to perceived deregulation weaken their defensibility. When enforcement inevitably resurfaces, documentation gaps and inconsistent standards become liabilities. The strongest programs in 2026 will not chase enforcement headlines. They will document risk assessments, decision rationales, and consistency of approach, building programs designed to withstand cycles, not react to them.

Employee Dynamics and the Rise of Ethical Drift

The most underappreciated risk heading into 2026 is internal. Employer–employee dynamics are shifting in ways that directly affect ethics and compliance. AI deployment, cost pressure, and political uncertainty are changing how employees perceive fairness, security, and leverage. According to research highlighted in the webinar, 40% of employees admit they would intentionally miss a compliance requirement to cause harm to their organization. That is not a culture problem waiting to happen. It is a present-tense compliance risk.

Ethical drift rarely announces itself through clear violations. It shows up as disengagement, silence, delayed reporting, rationalization, and erosion of trust. In this environment, compliance programs that rely solely on policies, training completion rates, or hotline volume are flying blind. In 2026, employee sentiment must be treated as a leading risk indicator, not a soft signal. Compliance teams must work more closely with HR and leadership to monitor stress points, manager behavior, and organizational pressure that create conditions for misconduct before it materializes.

Third-Party Risk as a Systemic Exposure

Third-party risk has outgrown its traditional boundaries. Vendors, distributors, technology partners, and AI service providers are now embedded across critical operations. When they fail, the failure rarely stays isolated. The webinar makes this point clearly: most third-party incidents expose internal governance gaps, not just vendor misconduct. Weak onboarding, poor segmentation, outdated contracts, and checklist-based monitoring all surface when something goes wrong.

In 2026, the compliance challenge is not perfect visibility. It is defensible prioritization. Not every third party requires the same level of scrutiny. Continuous monitoring and signal-based oversight are more effective than periodic reviews, which can provide a false sense of security. Compliance leaders should focus on materiality, lifecycle management, and resilience. The question regulators will ask is not whether every risk was identified, but whether the organization made reasonable, documented decisions based on the information available at the time.

Whistleblowing Surges Are Predictable and Test Credibility

Whistleblowing activity reliably increases during periods of economic stress, social disruption, and organizational change. 2026 will be no exception.

What matters is not volume alone. High reporting can reflect trust or fear. Employees use speak-up channels to test fairness, responsiveness, and safety. Programs designed only for steady-state conditions often buckle under surge conditions. The webinar emphasizes that timeliness, communication, and consistency matter more than outcomes in building trust. Mishandled cases during high-scrutiny periods carry amplified reputational and cultural risk. Retaliation concerns rise, and credibility erodes quickly if employees feel ignored or dismissed.

Compliance teams should plan for reporting spikes the same way they plan for crisis response. Capacity, triage protocols, communication standards, and leadership alignment must be stress-tested before volume hits.

Measuring What Matters: From Activity to Effectiveness

By 2026, boards and regulators are asking a harder question: Does the compliance program actually work? Activity-based reporting (training delivered, policies updated, cases closed) is no longer sufficient. The expectation is outcomes. Are risks changing? Why? Where should resources move next? Data and analytics are essential, but only if they inform decisions. Overly complex dashboards and vanity metrics dilute clarity. The most effective programs use data to prioritize interventions, allocate resources, and identify emerging risk, not just to justify headcount.

Importantly, credible programs are willing to admit when initiatives fail. A compliance function that can point to lessons learned and course corrections demonstrates maturity. One that reports only success is unlikely to be testing itself hard enough.

Conclusion: 2026 Is a Year for Disciplined Compliance Leadership

The defining feature of 2026 will not be a single regulation, technology, or enforcement action. It will be volatility, both external and internal. In that environment, compliance programs cannot rely on legacy assumptions. AI must be governed, not glamorized. Enforcement signals must be contextualized, not chased. Employee disengagement must be monitored as a risk. Third-party exposure must be prioritized defensibly. Speak-up systems must be resilient. Metrics must drive action.

The compliance leaders who succeed in 2026 will be those who move from insight to action, building programs that are steady when everything else is not.

Categories
Innovation in Compliance

Innovation in Compliance – Steph Holmes on Blending AI and Human Oversight for Effective Compliance

Innovation spans many areas, and compliance professionals need not only to be ready for it but also to embrace it. Join Tom Fox, the Voice of Compliance, as he visits with top innovative minds, thinkers, and creators in the award-winning Innovation in Compliance podcast. In this episode, host Tom welcomes Steph Holmes, long-time friend and Director of Ethics and Compliance Strategy at the EQS Group, who looks at the current intersection of AI and compliance.

Steph Holmes and EQS are both at the forefront of integrating artificial intelligence (AI) into compliance programs to enhance their efficiency and effectiveness. With a focus on practical applications, Holmes views AI as a crucial tool for expanding resources, especially as organizations face increasing regulatory changes and economic pressures. She advocates for the responsible, sustainable, and explainable adoption of AI, emphasizing that compliance professionals should embrace it rather than fear it. Holmes discusses the importance of blending AI capabilities with human oversight to ensure compliance tasks are managed accurately and risks are mitigated effectively.

Key highlights:

  • Digitizing Compliance: AI Tools and Programs
  • Navigating Compliance Challenges with Human Judgment
  • Enhancing AI Reliability Through Human Oversight
  • Enhancing Compliance through Responsible AI Implementation
  • Implementing AI Pilot Programs in Compliance Workflows

Resources:

Steph Holmes on LinkedIn

EQS Group LinkedIn

Where in the Loop: Corporate Compliance Insights

EQS Website

EQS Benchmark Report: AI Performance in Compliance & Ethics

Innovation in Compliance was recently ranked 4th among Risk Management podcasts by 1,000,000 Podcasts.

Categories
Compliance and AI

Compliance and AI: Steph Holmes on the Intersection of AI and Compliance

What is the intersection of AI and compliance? What about machine learning? Are you using ChatGPT? These questions are just three of the many we will explore in this cutting-edge podcast series, Compliance and AI, hosted by Tom Fox, the award-winning Voice of Compliance. Today, Tom looks at the current intersection of AI and compliance with Steph Holmes, a long-time friend and Director, Ethics and Compliance Strategy at the EQS Group.

They discuss the evolving role of AI in corporate compliance, emphasizing its key role in modernizing compliance programs. Steph elaborates on the importance of evidence-based assessments of AI capabilities, the impact of AI on operational efficiency, and the need for human oversight in AI processes. She highlights EQS’s comprehensive AI performance test, which evaluated various AI models against multiple compliance tasks. The discussion also covers practical steps for compliance professionals to begin their AI adoption journey, as well as the necessity of continuous monitoring and risk-based evaluation to ensure effective AI deployment.

Key highlights:

  • Steph Holmes’ Role at EQS Group
  • AI in Compliance: Current Landscape
  • AI Performance Test Report
  • The Messy Middle of Compliance and AI
  • Human Oversight in AI Implementation

Resources:

Steph Holmes on LinkedIn

EQS Group LinkedIn

Where in the Loop: Corporate Compliance Insights

EQS Website

EQS Benchmark Report: AI Performance in Compliance & Ethics

Tom Fox

Instagram

Facebook

YouTube

Twitter

LinkedIn

Categories
Everything Compliance

Everything Compliance: Episode 161, The Tribute Adam Turteltaub Edition

Welcome to this edition of the award-winning Everything Compliance. In this episode, we have the complete quintet of Matt Kelly, Jonathan Armstrong, Karen Woody, and Karen Moore, with Tom Fox, the Compliance Evangelist, sitting in as host.

  1. Matt Kelly looks at the recent EQS report assessing AI models for compliance. He shouts out to Adam Turteltaub, who recently left the SCCE after 17 years.
  2. Jonathan Armstrong reviews AI risk relating to professional advice. He shouts out to Adam Turteltaub.
  3. Karen Moore delves into the recent EU parliamentary rejections of rolling back sustainability reporting. She shouts out to Accountancy Europe and mothers everywhere.
  4. Karen Woody looks at the recent Delaware Court of Chancery decision in the case of Brewer v. Turner and its impact on Caremark Doctrine claims. She shouts out to all those returning to work at the office.
  5. Tom Fox shouts out to Adam Turteltaub and Sean Connery.

The members of Everything Compliance are Matt Kelly, Jonathan Armstrong, Karen Woody, and Karen Moore.

The host, producer, and sometimes panelist of Everything Compliance is Tom Fox, the Voice of Compliance. He can be reached at tfox@tfoxlaw.com. The award-winning Everything Compliance is a part of the Compliance Podcast Network.