AI Today in 5

AI Today in 5: February 17, 2026, The Measurable Gains Edition

Welcome to AI Today in 5, the newest addition to the Compliance Podcast Network. Each day, Tom Fox brings you five stories about AI, drawn from the business world, compliance, ethics, risk management, leadership, or general interest, to start your day. Sit back, enjoy a cup of morning coffee, and listen in to AI Today in 5, all from the Compliance Podcast Network.

Top AI stories include:

  1. Measurable gains are now being achieved with AI. (FT)
  2. The hidden cost of poor compliance reconciliation. (FinTechGlobal)
  3. AI at Kraken Compliance. (Kraken Blog)
  4. Is a memory chip crisis coming? (Bloomberg)
  5. AI worries erase $1tn from Big Tech values. (PYMNTS)

For more information on the use of AI in compliance programs, see my new book, Upping Your Game, available for purchase on Amazon.com.

AI Today in 5

AI Today in 5: February 16, 2026, The Doom Loop Edition


Top AI stories include:

  1. Staying ahead of AI regs in housing. (HousingWire)
  2. UN sets up panel on AI impact. (YahooNews)
  3. KPMG examines PE and AI. (CrowdFundInsider)
  4. Continuous learning to scale healthcare. (FilMoGaz)
  5. Everything stock AI touches in ‘Doom Loop’? (Bloomberg)


FCPA Compliance Report

FCPA Compliance Report – Navigating Compliance in 2026: Trends and Transformations

Welcome to the award-winning FCPA Compliance Report, the longest-running podcast in compliance. In this episode, we replay a recent webinar Tom Fox participated in, hosted by EQS. The panel moderator was Steph Holmes, and the panelists were Tom Fox, Mary Shirley, and Matt Kelly.

The session focuses on six key 2026 trends for ethics and compliance programs:

(1) AI moving from experimentation to operational use, emphasizing deliberate scaling, human-in-the-loop oversight, governance frameworks, monitoring, and managing “shadow AI,” with practical use cases such as policy chatbots, gift/travel/entertainment reviews, and AI-enabled third-party risk lifecycle management;

(2) enforcement “volatility” and unpredictable regulatory signals, with emphasis on returning to fundamentals such as documenting program inputs and outcomes, and noting continued activity, including record FCA resolutions and a DOJ whistleblower program award leading to a rapid antitrust settlement;

(3) shifting employer–employee dynamics, including Gartner survey findings that 40% of employees would intentionally miss a compliance requirement to harm their organization, discussion of trust, employee sentiment, multi-generational communication differences, and the need to partner with HR while staying within organizational lanes;

(4) heightened third-party and supply chain risk expectations, including cybersecurity, tariffs/tariff evasion, export controls, and the need to unify siloed risk views into a holistic third-party risk assessment;

(5) anticipated increases in whistleblowing and investigation demands amid volatility, highlighting the importance of preventing retaliation, keeping reporters feeling heard through responsive communications, triage protocols, and anonymized case examples to build trust; and

(6) measuring program effectiveness through a shift from outputs to outcomes, including reviewing KPIs and key risk indicators, peer review of investigations, hotline “mystery shopping,” and gap analyses against the DOJ’s ECCP and compliance program hallmarks, with special emphasis on third-party documentation and ongoing monitoring.

Resources:

Mary Shirley on LinkedIn

Steph Holmes on LinkedIn

Matt Kelly at Radical Compliance

EQS

Tom Fox

Instagram

Facebook

YouTube

Twitter

LinkedIn

Returning to Venezuela on Amazon.com

Blog

2026 Ethics & Compliance Trends in a Year of Volatility

Ethics and Compliance programs are entering 2026 under pressure from every direction at once. Enforcement signals are uneven and often contradictory. Regulatory expectations are evolving without clear glide paths. Boards are demanding proof of effectiveness, not just activity. Meanwhile, inside organizations, trust is fragile, employee engagement is strained, and ethical risk is increasingly driven by stress, uncertainty, and disengagement rather than overt malice.

I recently participated in the EQS-sponsored webinar, 2026 Ethics and Compliance Trends for Ethics and Compliance Programs: From Insights to Action. This webinar clearly framed the moment: the challenge is no longer simply identifying risk categories. The challenge is operating a compliance program that remains credible, defensible, and effective amid volatility. For compliance leaders, this is not a year for hype, shortcuts, or silver bullets. It is a year for disciplined execution.

This article distills the core themes emerging for 2026 that we explored in the webinar and explains why they demand a shift in how compliance programs are designed, governed, and measured. My co-panelists were Mary Shirley and Matt Kelly. Steph Holmes hosted us.

AI in Compliance: From Experimentation to Operational Reality

By 2026, artificial intelligence in compliance is no longer optional or novel. Most large organizations have already deployed AI in some form, such as intake triage, classification, translation, summarization, or search. What has changed is the expectation. Boards and executives now want results. This is where many programs will struggle.

AI works best today in structured, repeatable tasks. It can accelerate intake, reduce manual review, and surface patterns that humans might miss. But AI does not eliminate work; it rearranges it. Review, exception handling, governance, and oversight do not disappear. In many cases, they expand.

The real risk in 2026 is not AI itself. It is scaling too quickly without ownership, governance, or boundaries. Compliance teams that attempt to automate judgment-intensive decisions, such as investigations, escalations, or remediations, invite defensibility problems they cannot explain to regulators or boards. Successful programs will treat AI as an operational tool, not a strategic shortcut, and will clearly define where human judgment remains non-negotiable.

Regulatory Volatility, Not Regulatory Retreat

One of the most dangerous misreads in compliance today is the belief that shifting enforcement signals equals reduced risk. The reality is closer to the opposite. As the webinar materials emphasize, enforcement risk in 2026 is not disappearing; it is fragmenting. Political cycles, regional differences, and sector-specific priorities create uneven pressure, but exposure remains real. Whistleblower incentives continue to drive cases regardless of rhetoric. Cross-border cooperation persists even when domestic messaging softens.

The compliance mistake in volatile periods is overcorrection. Programs that scale back controls, staffing, or oversight in response to perceived deregulation weaken their defensibility. When enforcement inevitably resurfaces, documentation gaps and inconsistent standards become liabilities. The strongest programs in 2026 will not chase enforcement headlines. They will document risk assessments, decision rationales, and consistency of approach, building programs designed to withstand cycles, not react to them.

Employee Dynamics and the Rise of Ethical Drift

The most underappreciated risk heading into 2026 is internal. Employer–employee dynamics are shifting in ways that directly affect ethics and compliance. AI deployment, cost pressure, and political uncertainty are changing how employees perceive fairness, security, and leverage. According to research highlighted in the webinar, 40% of employees admit they would intentionally miss a compliance requirement to cause harm to their organization. That is not a culture problem waiting to happen. It is a present-tense compliance risk.

Ethical drift rarely announces itself through clear violations. It shows up as disengagement, silence, delayed reporting, rationalization, and erosion of trust. In this environment, compliance programs that rely solely on policies, training completion rates, or hotline volume are flying blind. In 2026, employee sentiment must be treated as a leading risk indicator, not a soft signal. Compliance teams must work more closely with HR and leadership to monitor stress points, manager behavior, and organizational pressure that create conditions for misconduct before it materializes.

Third-Party Risk as a Systemic Exposure

Third-party risk has outgrown its traditional boundaries. Vendors, distributors, technology partners, and AI service providers are now embedded across critical operations. When they fail, the failure rarely stays isolated. The webinar makes this point clearly: most third-party incidents expose internal governance gaps, not just vendor misconduct. Weak onboarding, poor segmentation, outdated contracts, and checklist-based monitoring all surface when something goes wrong.

In 2026, the compliance challenge is not perfect visibility. It is defensible prioritization. Not every third party requires the same level of scrutiny. Continuous monitoring and signal-based oversight are more effective than periodic reviews, which can provide a false sense of security. Compliance leaders should focus on materiality, lifecycle management, and resilience. The question regulators will ask is not whether every risk was identified, but whether the organization made reasonable, documented decisions based on the information available at the time.

Whistleblowing Surges Are Predictable and Test Credibility

Whistleblowing activity reliably increases during periods of economic stress, social disruption, and organizational change. 2026 will be no exception.

What matters is not volume alone. High reporting can reflect trust or fear. Employees use speak-up channels to test fairness, responsiveness, and safety. Programs designed only for steady-state conditions often buckle under surge conditions. The webinar emphasizes that timeliness, communication, and consistency matter more than outcomes in building trust. Mishandled cases during high-scrutiny periods carry amplified reputational and cultural risk. Retaliation concerns rise, and credibility erodes quickly if employees feel ignored or dismissed.

Compliance teams should plan for reporting spikes the same way they plan for crisis response. Capacity, triage protocols, communication standards, and leadership alignment must be stress-tested before volume hits.

Measuring What Matters: From Activity to Effectiveness

By 2026, boards and regulators are asking a harder question: Does the compliance program actually work? Activity-based reporting (training delivered, policies updated, cases closed) is no longer sufficient. The expectation is outcomes. Are risks changing? Why? Where should resources move next? Data and analytics are essential, but only if they inform decisions. Overly complex dashboards and vanity metrics dilute clarity. The most effective programs use data to prioritize interventions, allocate resources, and identify emerging risk, not just to justify headcount.

Importantly, credible programs are willing to admit when initiatives fail. A compliance function that can point to lessons learned and course corrections demonstrates maturity. One that reports only success is unlikely to be testing itself hard enough.

Conclusion: 2026 Is a Year for Disciplined Compliance Leadership

The defining feature of 2026 will not be a single regulation, technology, or enforcement action. It will be volatility, both external and internal. In that environment, compliance programs cannot rely on legacy assumptions. AI must be governed, not glamorized. Enforcement signals must be contextualized, not chased. Employee disengagement must be monitored as a risk. Third-party exposure must be prioritized defensibly. Speak-up systems must be resilient. Metrics must drive action.

The compliance leaders who succeed in 2026 will be those who move from insight to action, building programs that are steady when everything else is not.

AI Today in 5

AI Today in 5: February 12, 2026, The AI to the Moon Edition


Top AI stories include:

  1. Putting AI into your compliance workflow. (Valley Courier)
  2. GenAI and compliance. (FinTechGlobal)
  3. Musk wants to put an AI factory on the Moon. (NYT)
  4. OpenAI disbands safety teams. (TechCrunch)
  5. Is the US ready for what AI will do for jobs? (The Atlantic)


Great Women in Compliance

Great Women in Compliance: Why Decision Rubrics Matter in the Age of AI with Hemma Lomax and Shalini Rajoo

In this conversation, GWIC host Dr. Hemma R. Lomax and Shalini Rajoo explore the critical role of decision rubrics in governance, accountability, and trust, especially in the context of AI. Shalini shares her journey from law to compliance, emphasizing the importance of understanding systems and the impact of leadership on decision-making processes. They discuss how transparency and clarity in decision-making can build trust within organizations and the necessity of responsible AI governance. Practical tips for improving decision quality are also provided, highlighting the importance of self-awareness and critical thinking in leadership.

Takeaways:

  • The biggest risk in governance is unclear decisions.
  • AI amplifies existing clarity or confusion in decision-making.
  • Systems and rules reflect the identities of their architects.
  • Everyone has an impact on those around them every day.
  • Leadership is about improving the people around you.
  • It’s not just about rules; it’s about how people behave.
  • Decision rubrics provide consistency and predictability in outcomes.
  • Transparency in decision-making processes builds trust.
  • Slowing down to ask questions can lead to better decision-making.
  • Writing down the reasons for decisions brings clarity and accountability.

Sound bites:

“Systems and rules are not inherently neutral.”

“Transparency in decision making builds trust.”

“Slow is smooth, and smooth is fast.”

Chapters:

00:00 Introduction to Decision Rubrics and Governance

02:55 Shalini’s Journey: From Law to Governance

06:09 The Impact of Systems on Leadership and Accountability

09:09 Transitioning to Compliance and Ethics

11:49 Understanding Decision Rubrics in Compliance

15:06 The Role of Leadership in Decision Making

18:03 Designing Conditions for Effective Decision Making

20:47 The Importance of Transparency in Decision Processes

24:09 Decision Rubrics: Building Trust in Organizations

26:49 AI and Governance: Leadership Infrastructure Failures

29:47 Responsible AI: The Role of Ethics and Compliance

32:55 Practical Tips for Improving Decision Quality

36:00 Conclusion: The Future of Decision Making in AI

Guest Biography:

Shalini Rajoo is the Founder and Principal Consultant of Shalini Rajoo Advisory, LLC, where she partners with organizations to design governance, compliance, and decision-making systems that are resilient, trustworthy, and aligned to real operational pressures. Across more than two decades in law, compliance, HR, and organizational leadership, Shalini has helped companies and leaders move beyond check-the-box frameworks to build structures that embed accountability, clarity, and performance into everyday decisions.

She began her career in South Africa, first as a public prosecutor and then leading regulatory work with the Department of Trade and Industry, collaborating with legislative and executive stakeholders on corporate, competition, and consumer law. After relocating to the U.S., Shalini practiced commercial litigation. She later served as Director of Global Business Conduct for a Fortune 500 company, where she redesigned ethics and compliance systems, led global risk assessments, and championed psychological safety and integrity-based practices.

Today, Shalini’s work centers on helping leaders clarify decision rights, governance architectures, and accountability pathways — especially as organizations adopt AI and automation. She recently spoke at the Opal Group’s Corporate Governance & Ethics in the Age of AI conference, where she reframed AI governance as a leadership-infrastructure challenge rather than a purely technical or compliance one.

AI Today in 5

AI Today in 5: February 10, 2026, The AI Redefining GRC Edition


Top AI stories include:

  1. How AI is redefining GRC. (GulfNews)
  2. AI-assisted workforce leave compliance program. (USAToday)
  3. How to integrate AI into your compliance workflows. (AOL)
  4. How AI can speed compliance research. (FedScoop)
  5. Data sovereignty for AI compliance. (TechTarget)


Blog

From Enforcement-Driven to Purpose-Driven Compliance

For more than two decades, corporate compliance programs have been built around one central organizing principle: enforcement. Where regulators go, compliance resources follow. When the Department of Justice prioritizes anticorruption, companies invest in FCPA controls. When regulators turn to privacy, cybersecurity, or AML, compliance budgets pivot accordingly. This enforcement-driven approach has shaped the modern compliance profession.

Yet, as Veronica Root Martinez persuasively argues in her recent working paper, Purpose-Driven Compliance, this dominant model may be fundamentally flawed, certainly in the era of Trump. Despite unprecedented investments in compliance infrastructure, corporate misconduct persists. Repeat offenders remain common. Penalties have grown larger, but behavior has not meaningfully improved. For compliance professionals, this raises an uncomfortable question: are we optimizing for the wrong objective?

Martinez’s answer is both challenging and clarifying. Compliance programs should not be primarily designed to satisfy enforcement authorities or to maximize mitigation credit after failure. Instead, they should be anchored in the organization’s own purpose, business risks, and ethical standards. In short, it is time to move from enforcement-driven compliance to purpose-driven compliance.

The Limits of Enforcement-Driven Compliance

The enforcement-driven model rests on two assumptions. First, that enforcement priorities reflect a company’s most significant risks. Second, that imperfect compliance is inevitable and acceptable so long as the organization can demonstrate good-faith efforts. Martinez brings both under scrutiny.

Regulatory priorities often lag behind real business risks. Enforcement agencies focus on certain categories of misconduct because they are visible, politically salient, or historically entrenched. But the risks that most threaten an organization’s mission may lie elsewhere. Martinez highlights how firms can become over-invested in compliance areas that attract enforcement attention while under-investing in mission-critical risks to their operations.

The second assumption, that some level of misconduct is acceptable, is even more troubling. Behavioral ethics research suggests that tolerating small violations creates conditions for larger ones. When leaders frame misconduct as statistically insignificant or “within expectations,” they risk normalizing behavior that undermines culture, trust, and ultimately performance. Wells Fargo’s infamous “1% problem” illustrates this danger. Senior leadership took comfort in the idea that only a small fraction of employees were engaging in misconduct, failing to appreciate that those numbers reflected only the misconduct that had been detected.

An enforcement-driven mindset encourages this type of thinking. If the yardstick is whether the organization gets sanctioned, then low detection rates look like success. But if the question is whether the organization is living up to its own purpose and values, the same data tell a very different story. This is not the broken windows theory of enforcement; it is a quieter failure, in which tolerated small violations set the conditions for larger ones.

The Cost of Treating Compliance as a Cost of Doing Business

Another weakness of enforcement-driven compliance is that it can turn sanctions into a predictable line item. As firms grow larger and penalties are discounted through cooperation credit, fines risk being internalized as a cost of doing business. Empirical work cited by Martinez suggests that large, repeat offenders often pay penalties that are small relative to their assets and revenues. In that environment, enforcement loses much of its deterrent effect.

For compliance professionals, this dynamic creates a structural tension. Programs may be technically “effective” under DOJ guidance while still failing to prevent misconduct that harms customers, employees, and communities. The distinction between standards of review and standards of conduct becomes critical. Meeting the government’s expectations for leniency is not the same as meeting the organization’s ethical obligations to itself and its stakeholders.

What Is Purpose-Driven Compliance?

Purpose-driven compliance begins with a simple but powerful shift in perspective. Instead of asking, “What does the regulator expect?” the organization asks, “What risks threaten our ability to achieve our purpose and what standards of conduct are required to address them?” Martinez defines purpose-driven compliance as programs directed by three elements: the firm’s purpose, the inherent risks associated with pursuing that purpose, and the ethical standards the organization sets for itself. This approach does not reject enforcement frameworks; rather, it treats them as a floor, not a ceiling.

In practical terms, purpose-driven compliance requires leadership to articulate why the organization exists and how misconduct undermines that mission. For a bank, this may mean focusing on customer trust and market integrity. For a pharmaceutical company, it may mean prioritizing patient safety and scientific integrity. For a university, it may mean safeguarding academic freedom and institutional trust. For a summer camp, it may mean protecting campers from flash floods and other natural hazards.

Once the purpose is clearly defined, compliance risk assessments become more meaningful. Risks are evaluated not only by enforcement exposure but by their potential to compromise the organization’s core objectives. This reframing helps compliance leaders resist the temptation to chase regulatory trends at the expense of mission-critical risks.

Moving Beyond Mitigation to Aspirational Standards

A key insight in Martinez’s work is that firms often confuse mitigation with excellence. Compliance programs are designed to minimize penalties rather than to maximize ethical performance. Purpose-driven compliance challenges that mindset by encouraging organizations to adopt high, ethical, and aspirational standards of conduct.

This does not mean pursuing perfection through draconian controls or internal criminalization. Martinez rightly warns against overdeterrence and strict liability regimes that incentivize concealment rather than transparency. Instead, purpose-driven compliance emphasizes ethical framing, employee voice, and organizational learning. Compliance should never be Dr. No, sitting in the Department of Business Non-Development.

The examples of Wells Fargo and Novartis are instructive. Both organizations suffered repeated compliance failures under enforcement-driven regimes. Their subsequent reforms went beyond addressing the specific violations that triggered enforcement. They re-examined culture, leadership incentives, and ethical expectations. In Novartis’s case, tying bonuses to ethical performance and co-creating a new code of ethics signaled a shift from box-checking to values anchored in purpose.

Why Purpose-Driven Compliance Matters for the Modern CCO

For today’s chief compliance officer, Martinez believes purpose-driven compliance offers three critical benefits.

First, it creates durability. Enforcement priorities shift with administrations. Indeed, the current Administration has signaled a cutback in white-collar enforcement by offering what amount to get-out-of-jail-free cards to companies that self-disclose early. Far from diminishing compliance, this underscores its importance: only a well-functioning program can detect issues early enough to self-disclose. A compliance program anchored solely in regulatory expectations will always be reactive. Purpose-driven programs are more stable because they are tied to the organization’s identity rather than external politics.

Second, it improves the quality of compliance metrics. Measuring effectiveness against internal standards allows organizations to ask harder questions about culture, decision-making, and root causes. Not every initiative will succeed, but a willingness to acknowledge failure is itself a sign of program maturity.

Third, it enhances credibility with boards and senior leadership. When compliance is framed as a strategic partner in achieving the organization’s mission, rather than as a defensive function, it earns a more meaningful seat at the table.

Conclusion

Compliance has never been more sophisticated, expensive, or visible. Yet sophistication alone does not guarantee effectiveness. Martinez’s Purpose-Driven Compliance challenges compliance professionals to rethink the foundations of their programs. Enforcement-driven compliance has taken us far, but it cannot take us far enough.

The next evolution of compliance requires organizations to define their own standards of conduct, grounded in purpose, risk, and ethics. That shift is not easy. It requires courage from compliance leaders and commitment from boards and executives. But if compliance is truly about preventing harm and sustaining trust, purpose-driven compliance is not optional. It is essential.

FCPA Compliance Report

FCPA Compliance Report – FCPA Enforcement Shifts: Volatility and Uncertainty

Welcome to the award-winning FCPA Compliance Report, the longest-running podcast in compliance. In this episode, host Tom Fox welcomes Anik Shah, Director & Senior Legal Counsel at Sandisk, for an insightful discussion of the pivotal FCPA enforcement shifts of 2025 and their implications for 2026.

Anik Shah, a preeminent authority on FCPA and anti-corruption enforcement, offers a strategic perspective on how the compliance landscape evolved in 2025. Given the uncertainties following an executive order and the dismissal of high-profile cases, Shah underscores the necessity for companies to maintain robust anti-bribery and anti-corruption controls, especially with potential reprioritization by the Department of Justice. He advocates a proactive approach to risk management, emphasizing third-party risk management and comprehensive training to anticipate and mitigate potential FCPA issues. As enforcement focus shifts toward cartel and transnational criminal organization activities, Shah advises companies to integrate anti-money laundering processes into their compliance strategies to align with global anti-corruption efforts.

Key highlights:

  • 2025 FCPA Enforcement Shifts and Uncertainty
  • Voluntary Self-Disclosure Policy Revolution in 2025
  • Cartel Risk Mitigation through Compliance Integration
  • Central Asia Construction Projects: Anti-Corruption Measures
  • Proactive Measures: Fostering Anti-Corruption Compliance Awareness

Resources:

Anik Shah on LinkedIn

Sandisk


AI Today in 5

AI Today in 5: February 9, 2026, The AI Agents Doing Compliance Edition


Top AI stories include:

  1. What to do when AI is forced on compliance. (CW)
  2. Napier AI/AML report is out. (FinTechGlobal)
  3. AI and the accountability gap. (FinTechGlobal)
  4. Where AI is tearing through corporate America. (WSJ)
  5. Goldman is letting AI Agents do compliance. (PYMNTS)
