Categories
Daily Compliance News

Daily Compliance News: March 23, 2026, The All FT Edition

Welcome to the Daily Compliance News. Each day, Tom Fox, the Voice of Compliance, brings you compliance-related stories to start your day. Sit back, enjoy a cup of morning coffee, and listen in to the Daily Compliance News, all from the Compliance Podcast Network. Each day, we consider four stories from the business world, compliance, ethics, risk management, leadership, or general interest for the compliance professional.

Top stories include:

  • Is China more stable for companies than the US? (FT)
  • JPMorgan to monitor junior bankers’ hours. (FT)
  • Collapsed UK mortgage lender given the all-clear by the FCA in 2024. (FT)
  • AI is reshaping the business of law. (FT)
Categories
Blog

The Business Case for Compliance: What Ethisphere’s Ethics Premium Means for CCOs and Boards

For years, compliance professionals have argued that ethics matters because it is the right thing to do. That remains true. But the latest data from Ethisphere adds an equally important point for boards, CEOs, and investors: ethics also matters because it is good business.

That is the message from Ethisphere’s “Inside the Ethics Premium,” which examines the performance of the publicly listed 2026 World’s Most Ethical Companies honoree cohort against the Solactive GBS Global Markets All Cap USD Index over five years. Ethisphere is careful to note that this is correlation, not causation. Yet the signal is powerful. Companies recognized for leadership in ethics, compliance, culture, and governance delivered stronger performance and greater resilience than the broader benchmark over a full market cycle.

For the Chief Compliance Officer, this is more than a nice talking point. It is a boardroom argument. It is a management argument. And it is a strategic argument.

The headline number is straightforward: Ethisphere found a five-year ethics premium of +8.2 percentage points from January 1, 2021, through December 31, 2025. But in some ways, the more important story sits beneath that top-line result. The honoree cohort did not simply outperform. It also proved more resilient. Ethisphere reports that these companies experienced a 7.1% smaller maximum drawdown, returned to prior highs 10.1% faster, and spent 14.4% less time below their prior peak than the broader market benchmark. That is the business case for compliance in a nutshell.

When markets are rising, companies want to participate in the upside. When markets are falling, boards want to know two things: how much value was lost and how quickly the company can recover. Ethisphere’s data suggests that ethics-leading organizations can do both. The cohort captured 104% of the market’s upside while experiencing only 97% of the benchmark’s downside in down months. In other words, strong ethics and compliance programs do not merely help companies avoid disaster. They may also position them to compete more effectively across a full economic cycle.
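
For readers who want to see how these resilience metrics work mechanically, the sketch below computes maximum drawdown and upside/downside capture from a series of monthly returns. The numbers are invented for illustration, not Ethisphere's data, and the functions are standard textbook formulations rather than Ethisphere's exact methodology.

```python
# Illustrative sketch: how max drawdown and up/down capture are typically
# computed from monthly returns. All data below is made up for demonstration.

def max_drawdown(returns):
    """Largest peak-to-trough decline of the cumulative return series."""
    value, peak, worst = 1.0, 1.0, 0.0
    for r in returns:
        value *= 1.0 + r
        peak = max(peak, value)
        worst = max(worst, (peak - value) / peak)
    return worst

def capture_ratio(portfolio, benchmark, up=True):
    """Average portfolio return divided by average benchmark return,
    taken only over months where the benchmark was up (or down)."""
    months = [(p, b) for p, b in zip(portfolio, benchmark)
              if (b > 0) == up and b != 0]
    avg_p = sum(p for p, _ in months) / len(months)
    avg_b = sum(b for _, b in months) / len(months)
    return avg_p / avg_b

cohort    = [0.03, -0.02, 0.04, -0.01, 0.05, -0.03]
benchmark = [0.025, -0.025, 0.035, -0.015, 0.045, -0.035]

print(f"Cohort max drawdown: {max_drawdown(cohort):.1%}")
print(f"Upside capture:      {capture_ratio(cohort, benchmark, up=True):.0%}")
print(f"Downside capture:    {capture_ratio(cohort, benchmark, up=False):.0%}")
```

The intuition matches the article's framing: a capture ratio above 100% in up months and below 100% in down months is precisely the "participate in the upside, cushion the downside" profile Ethisphere describes.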

Why might that be? Ethisphere offers a compelling explanation. Strong programs reduce surprises, strengthen decision-making, protect trust, and safeguard intangible assets, all of which support durable performance. That last point is critical. In modern business, enterprise value is increasingly tied to intangibles: trust, culture, reputation, confidential information, and the ability to operate through disruption. Ethisphere cites Ocean Tomo’s estimate that, by the end of 2025, intangible assets will account for approximately 92% of the S&P 500’s market capitalization. That should get every board’s attention.

If intangible assets drive enterprise value, then the systems that protect those assets are no longer peripheral. They are central. Compliance, ethics, culture, reporting channels, investigations, training, third-party risk management, and managerial accountability become part of the company’s value preservation and value creation architecture. Put differently, stock performance is the outcome. The operating system is what management can control.

This is where the Ethisphere report is especially useful for compliance professionals. It does not stop at market outcomes. It also points to the kinds of practices that characterize best-in-class programs. The evaluation for the World’s Most Ethical Companies is built on the Ethics Quotient, a 240-plus-question assessment covering governance, program structure, written standards, training and communication, risk assessment and detection, enforcement and incentives, culture measurement, third-party risk management, and impact assessment and reporting. Those are not abstract ideals. They are operational disciplines. Consider three proof points from the report.

  1. 75% of honorees share investigation and discipline statistics with all employees, which Ethisphere says is a 6 percentage-point gain over the last three years. That is a powerful indicator of transparency. It tells employees that reports are taken seriously, issues are addressed, and misconduct has consequences. In the compliance world, trust in the system is everything. Employees speak up when they believe the organization will listen and act.
  2. Ethisphere notes that a majority of honorees are using more adaptive online training techniques, such as test-out, test-up, or progressive course difficulty. That is important because it reflects a maturity of approach. Training is not treated as a check-the-box exercise. It is treated as an engagement tool, designed to capture attention and improve retention. Effective compliance training should respect the workforce, meet people where they are, and be more relevant. The best programs understand this.
  3. Nearly every honoree equips managers with toolkits, talk tracks, and resources to discuss ethical dilemmas with their teams, and 51% require managers to do so. That may be the most practical lesson of all. Culture does not live in the code of conduct. Culture lives in the daily conversations between managers and employees. If you want an ethical culture, you need ethical middle management. You need managers who can translate corporate values into operational guidance at the point of decision.

There is another point in the Ethisphere data that boards should not miss: this outperformance is not a one-off event or a lucky stretch. Ethisphere found a positive excess return in 65% of rolling 12-month windows, or 31 of 48 periods, over the last five years. Even more striking, Ethisphere says that every year since it began calculating the Ethics Premium, the honoree cohort has outperformed its peer group. That kind of consistency matters because it suggests durability. It suggests that ethics and compliance excellence may be part of a repeatable enterprise capability.
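
The rolling-window consistency check described above can be sketched in a few lines. The series here are synthetic placeholders, not Ethisphere's underlying data; the point is only to show how a "positive excess return in X of Y windows" figure is computed.

```python
# Illustrative sketch of a rolling-window "hit rate": the share of
# 12-month windows in which one return series outperforms another.
# The return series below are synthetic, not Ethisphere's.

def rolling_excess_hit_rate(portfolio, benchmark, window=12):
    """Count rolling windows with positive cumulative excess return."""
    def cumulative(returns):
        total = 1.0
        for r in returns:
            total *= 1.0 + r
        return total - 1.0

    n = len(portfolio) - window + 1
    wins = sum(
        cumulative(portfolio[i:i + window]) > cumulative(benchmark[i:i + window])
        for i in range(n)
    )
    return wins, n

# A 59-month series yields 48 rolling 12-month windows, the same window
# count Ethisphere reports. Here the cohort beats the benchmark every
# month, so every window shows a positive excess return.
cohort    = [0.012] * 59
benchmark = [0.010] * 59
wins, n = rolling_excess_hit_rate(cohort, benchmark)
print(f"{wins} of {n} windows positive ({wins / n:.0%})")

# Ethisphere's reported 31 of 48 works out to roughly 65%:
print(f"{31 / 48:.0%}")
```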

What should compliance professionals do with this information? They should use it. Use it with the board to reframe compliance from overhead to strategic infrastructure. Use it with the CEO and CFO to show that ethics is tied to resilience, recovery, and enterprise value. Use it in budget discussions to explain why investments in reporting systems, investigations, manager enablement, and training are not soft spending. They are hard-edged business investments.

The lesson from Ethisphere is not that every ethical company will outperform, nor that every compliance investment leads directly to share price appreciation. Ethisphere expressly warns against that simplistic conclusion by emphasizing correlation rather than causation. But the lesson is still profound. Companies with stronger ethics, compliance, culture, and governance systems appear better positioned to protect trust, reduce disruption, and recover faster when stress hits.

That is what boards care about. That is what shareholders care about. And that is why the business case for compliance has never been stronger. For the compliance professional, the takeaway is clear: do not undersell your function. Ethics is not merely a guardrail. It is a performance advantage.

Categories
Sunday Book Review

Sunday Book Review: March 22, 2026, The University of Chicago Press Edition

In the Sunday Book Review, Tom Fox considers books that would interest compliance professionals, business executives, or anyone curious. It could be books about business, compliance, history, leadership, current events, or anything else that might interest Tom. In this episode, we look at four top books released in March by the University of Chicago Press.

  1. Bicentennial by Marc Stein
  2. The Invention of Infinite Growth by Christopher F. Jones
  3. The Means of Prediction by Maximilian Kasy
  4. Against Innocence by Miriam Ticktin
Categories
AI in Healthcare

AI in Healthcare: Five Healthcare AI Stories You Need to Know This Week – March 20, 2026

Welcome to AI in Healthcare in 5 Stories. This podcast is a weekly briefing on the five most important AI developments shaping healthcare, medicine, and life sciences. Each week, Tom Fox breaks down the latest stories on clinical innovation, regulation, privacy, compliance, patient safety, and operational transformation through a practical, business-focused lens. Designed for healthcare compliance professionals, executives, legal teams, clinicians, and industry leaders, the podcast moves beyond headlines to explain what each development means in the real world.

The top five stories for the week ending March 20, 2026, include:

  1. Does healthcare need specialized AI? (Harvard Business Review)
  2. AI opens a new front in the hospitals v. insurers battle. (Reuters)
  3. Where AI can make the biggest impact in healthcare. (Healthcare IT News)
  4. Why healthcare institutions are struggling to implement AI effectively. (Forbes)
  5. Is pharma ready for Agentic AI? (PharmaPhorum)

For more information on the use of AI in Compliance programs, my new book, Upping Your Game, is available. You can purchase a copy of the book on Amazon.com.

Categories
AI in Financial Services in 5 Stories

AI in Financial Services in 5 Stories – Week Ending March 20, 2026

Welcome to AI in Financial Services in 5 Stories, a practical weekly roundup of the five most important AI developments affecting banking, insurance, payments, asset management, and fintech. Each Friday, Tom Fox will break down the top stories that matter most through the lenses of compliance, risk management, governance, and business strategy. Designed for compliance professionals, executives, legal teams, and financial services leaders, it goes beyond headlines to explain why each development matters in a highly regulated industry. The result is a concise weekly briefing that helps listeners stay current on AI innovation while asking sharper questions about oversight, accountability, and trust.

This week’s stories include:

  1. How AI is changing fintech. (Intuit)
  2. GSA AI clause. (Holland & Knight)
  3. Leading through AI transformation in FinTech. (Forbes)
  4. Mastercard unveils AI engine. (FinTechMagazine)
  5. FCA demands explainable decisions. (FinTechGlobal)

For more information on the use of AI in Compliance programs, my new book, Upping Your Game, is available. You can purchase a copy of the book on Amazon.com.

Categories
AI Today in 5

AI Today in 5: March 20, 2026, The AI Changing Compliance Edition

Welcome to AI Today in 5, the newest addition to the Compliance Podcast Network. Each day, Tom Fox will bring you five stories about AI to start your day. Sit back, enjoy a cup of morning coffee, and listen in to AI Today in 5, all from the Compliance Podcast Network. Each day, we consider five stories from the business world, compliance, ethics, risk management, leadership, or general interest about AI.

Top AI stories include:

  1. Has AI changed the rules of compliance? (Forbes)
  2. How AI and deep fakes are reshaping identity fraud. (FinTechGlobal)
  3. How AI is changing product compliance. (SupplySidesJ)
  4. World Bank to focus on AI-resilient job creation. (Bloomberg)
  5. How AI is changing fintech. (Intuit)

For more information on the use of AI in Compliance programs, my new book, Upping Your Game, is available. You can purchase a copy of the book on Amazon.com.

Categories
Daily Compliance News

Daily Compliance News: March 20, 2026, The Flight Corridor Risk Edition

Welcome to the Daily Compliance News. Each day, Tom Fox, the Voice of Compliance, brings you compliance-related stories to start your day. Sit back, enjoy a cup of morning coffee, and listen in to the Daily Compliance News, all from the Compliance Podcast Network. Each day, we consider four stories from the business world, compliance, ethics, risk management, leadership, or general interest for the compliance professional.

Top stories include:

  • Why did the lead investigator not testify in the FirstEnergy trial? (Cleveland.com)
  • The Nigerian ABC commission pays money back to NNPC. (Business Insider Africa)
  • Flight corridors and risk management. (NYT)
  • COI, corruption, and more in the Paramount deal. (WSJ)
Categories
Blog

AI Governance and Fiduciary Duty: Board Oversight of AI As Core Governance

There was a time when boards could treat AI as a management-side innovation issue, something for the technology team, the innovation committee, or perhaps an occasional strategy offsite. That time is ending. For every compliance professional, AI stops being a technology story and becomes a governance story. And once it becomes a governance story, boards need to pay attention through the lens they know best: fiduciary duty.

The issue is not whether every director needs to become an engineer. They do not. The issue is whether the board is exercising appropriate oversight over a capability that can materially affect legal exposure, operational resilience, internal controls, reputation, and enterprise value. Under that lens, ignoring AI oversight begins to look less like prudence and more like a governance gap.

The Board Question Is No Longer “Do We Use AI?”

Too many board discussions still start in the wrong place. A director asks, “Are we using AI?” Management says yes, in a handful of pilots. Another director asks whether there is a policy. Legal says yes, one is being drafted. Everyone nods, reassured that the matter is under control. That is not oversight. That is atmospherics.

The real board questions are different. Where is AI being used? What decisions does it influence? What data does it rely on? Who owns it? How is risk assessed? What controls are in place? What gets reported upward when something changes or goes wrong?

COSO’s GenAI guidance is quite direct on this point. It states that the board of directors must have visibility into GenAI use and associated risks, including regular reporting on adoption, key risk indicators, incidents, and material changes to high-impact use cases. It also says oversight bodies should have the capacity to challenge assumptions, request independent validation, and direct corrective action.

Fiduciary Duty Means Oversight, Not Technical Mastery

The fiduciary duty standard is more practical and more familiar. Directors are expected to exercise informed oversight over material risk. If AI is shaping material processes, material decisions, or material exposures, then the board should ask how management governs it and what evidence supports that confidence.

This is where compliance can be a true translator. We understand how to connect abstract governance expectations to operational proof. We know the difference between having a policy and having a control. We know that a dashboard without escalation is theater. We know that a pilot without documentation is an anecdote. And we know that “the business owns it” is not enough unless ownership is defined, trained, monitored, and accountable.

COSO again gives a helpful framework. It emphasizes clear ownership of each GenAI tool, platform, or capability, with defined authority, escalation paths, and documented scope of use. It further stresses that assigning ownership without the capability to deliver invites failure, and that accountability should be tied not only to adoption but also to accuracy, safety, compliance, and adherence to controls. Boards do not need to run AI. But they do need assurance that someone competent owns it and that the ownership model is real.

Why AI Oversight Is Different from Ordinary IT Oversight

Some directors may be tempted to ask whether this is simply another version of cybersecurity or digital-transformation oversight. There is overlap, certainly, but AI presents a different governance profile. COSO notes several characteristics that distinguish GenAI. It is dynamic: models, prompts, and retrieval data can change frequently, requiring continuous risk assessment, change control, and monitoring. It is easily scalable, meaning it can amplify errors and bias as readily as it can amplify efficiency. It has a low barrier to entry, which increases the risk of shadow AI and ungoverned adoption. And critically, it can be confidently wrong.

That last point is especially important for boards. A broken machine usually signals that it is broken. AI often does the opposite. It produces polished, persuasive, and highly plausible output even when it is materially mistaken. That means traditional management confidence can be a weak proxy for actual reliability. Boards, therefore, need a different kind of assurance model, one that asks not only whether the system is in place, but whether the organization can validate outputs, explain limitations, monitor drift, and intervene when use cases expand beyond what was originally approved.

The Governance Gap Boards Must Avoid

Here is where the fiduciary-duty lens becomes especially useful. The governance failure in the AI era is unlikely to be that a board has never heard the term “AI.” Every board in America has heard it. The failure is more likely to be subtler and therefore more dangerous: the board heard about AI in broad strategic terms but never built a repeatable oversight mechanism around it.

That is the governance gap.

It shows up when management reports adoption but not risk classification.

It shows up when directors hear about productivity gains but not control failures.

It shows up when there is an AI policy but no inventory of use cases.

It shows up when there is enthusiasm about innovation but no discussion of third-party dependencies, data quality, escalation paths, or human review.

It shows up when incidents are handled ad hoc rather than through a defined reporting structure.

COSO warns that rapid iteration can outpace existing processes, and that prompts, thresholds, and retrieval connectors are critical configuration elements that require the same rigor as other controlled system settings. It also highlights third-party and vendor risk, noting that outsourced GenAI capabilities can limit visibility into training data, model updates, data handling, and underlying controls.

In other words, the board should not assume AI risk is contained simply because a vendor is involved or because the tool sits inside a familiar enterprise platform. That should sharpen the oversight question.

What Good Board Oversight Looks Like

The good news is that effective AI oversight is not mystical. It looks a great deal like good oversight in other high-risk areas. It is structured, periodic, evidence-based, and tied to accountability. At a minimum, boards should expect management to provide five things.

  1. An inventory of material AI use cases, categorized by risk and business impact.
  2. A governance structure that identifies owners, review forums, escalation paths, and the role of compliance, legal, risk, audit, and technology.
  3. Clear policies and boundaries around acceptable use, prohibited data, high-impact decisions, and when human review is mandatory.
  4. Meaningful reporting. Not just adoption statistics, but risk indicators, incidents, model or vendor changes, validation results, and material control exceptions.
  5. A remediation and monitoring process that reflects the dynamic nature of AI.
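
To make the first expectation concrete, here is a minimal, hypothetical sketch of what one record in a risk-classified AI use-case inventory might capture. Every name, field, and risk tier below is an invented illustration, not a schema drawn from COSO or any regulatory standard; adapt it to your own framework.

```python
# Hypothetical sketch of an AI use-case inventory entry. Field names
# and risk tiers are illustrative only; adapt to your own framework.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AIUseCase:
    name: str
    owner: str                       # an accountable individual, not just a team
    risk_tier: str                   # e.g., "high", "medium", "low"
    decisions_influenced: list = field(default_factory=list)
    data_sources: list = field(default_factory=list)
    human_review_required: bool = True
    third_party_vendor: Optional[str] = None
    last_validated: Optional[str] = None   # date of last output validation

inventory = [
    AIUseCase(
        name="Customer complaint triage",
        owner="J. Smith, Compliance Ops",
        risk_tier="high",
        decisions_influenced=["escalation routing"],
        data_sources=["CRM tickets"],
        third_party_vendor="ExampleVendor AI",
        last_validated="2026-02-15",
    ),
]

# A simple oversight check: flag high-risk use cases with no recorded validation.
flagged = [u.name for u in inventory
           if u.risk_tier == "high" and u.last_validated is None]
print(f"{len(inventory)} use case(s) tracked; {len(flagged)} high-risk entries unvalidated")
```

Even a lightweight structure like this forces the questions boards should be asking: who owns each use case, what decisions it touches, and when its outputs were last validated.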

That is consistent with COSO’s broader framework, which stresses alignment with organizational goals and risk appetite, the use of relevant information, internal communication, ongoing evaluations, and the communication of deficiencies. This is where I would encourage boards to think less in terms of “AI briefings” and more in terms of “AI oversight cadence.” A one-time presentation is not governance. A recurring structure is.

The Board Does Not Need More Hype. It Needs Evidence.

One risk in the current market is that AI discussions are still drenched in promotional language. Faster. Smarter. More innovative. Transformational. Useful words, but not enough for a board discharging fiduciary obligations.

Boards need evidence. This is where the compliance function can shine. Compliance professionals know how to convert aspiration into evidence. We know how to build a record showing that oversight is not merely claimed, but exercised.

And make no mistake, documentation matters. Structured communication and clear records are essential for reconstructing decisions, demonstrating accountability, and supporting regulatory or audit review. That principle runs through effective compliance practice generally and becomes even more important in AI governance, where organizations must often explain not only what decision was made, but how the process was overseen.

Five Questions Every Board Should Ask Now

If I were advising a board chair or audit committee chair, I would start with five questions.

  1. What are our highest-risk AI use cases, and who owns each one?
  2. What information does the board receive regularly about AI adoption, incidents, and material changes?
  3. How do we know that management is validating AI outputs rather than simply trusting them?
  4. Where are third-party AI tools embedded in our environment, and what visibility do we have into the risks they pose?
  5. What evidence would we produce tomorrow if a regulator, auditor, or shareholder asked how this board oversees AI?

Those questions do not require the board to become technical. They require the board to become disciplined.

The Bottom Line

AI governance is moving quickly from optional good practice to expected governance hygiene. That is the real message boards need to hear. Under a fiduciary-duty lens, the challenge is straightforward. Directors do not need to be AI developers. But they do need to ensure that management has built a credible system for identifying, governing, monitoring, and escalating AI risk. When AI touches material business processes, board silence is not neutrality. It is exposure.

The companies that get this right will not be the ones that talk most loudly about innovation. They will be the ones whose boards insist on visibility, accountability, evidence, and follow-through. That is not anti-innovation. That is governance doing its job.

Categories
Hill Country Authors

Hill Country Authors Podcast: Paul McGrath on “Left is Right”: Satire, Darker Threats, and Current-Events Inspiration

Welcome to a new season of the award-winning Hill Country Authors Podcast, sponsored by Stoney Creek Publishing. In this podcast, Hill Country resident Tom Fox visits with authors who live in and write about the Texas Hill Country. He opens the new season with returning guest Paul McGrath to discuss McGrath’s novel Left is Right, a sequel to the PEN Craft award-winning Left.

McGrath recounts a 37-year career at Texas newspapers, primarily the Houston Chronicle, plus teaching at Texas A&M and Clear Lake, and his A&M roots with The Battalion. He explains expanding Anton’s story into a multi-book series (with five planned), driven by character attachment and news-inspired plots. McGrath describes the layered “Left” titles, using Ellie to express progressive viewpoints, and empathy as a motivating force for Anton and Ellie, including Ezra’s lingering influence. He notes a darker tone influenced by right-wing militias, human trafficking, and a Texas motorcycle gang, balanced by humor, wordplay, and pop-culture references like a Jon Hamm dream sequence. He outlines the ongoing pursuit by the FBI and alien authorities, followed by a return to alien supervision, credits Stoney Creek Publishing for its support, points listeners to his social platforms, and previews future themes involving Russians and cryptocurrency.

Key highlights:

  • Why Continue with Anton
  • Series Titles and ‘Left’
  • Empathy Driving the Plot
  • Darker Satire and Villains
  • Humor, Wordplay, and Names
  • Pop Culture Cameos
  • Where the Series Goes

Resources:

Paul McGrath on Stoney Creek Publishing

Left is Right on Texas A&M University Press

Social Media

Instagram

X

Threads

Podcast Cover Art

Nancy Huffman Fine Art

Tom Fox

Instagram

Facebook

YouTube

Twitter

LinkedIn

Categories
Daily Compliance News

Daily Compliance News: March 19, 2026, The Corruption in Soccer Edition

Welcome to the Daily Compliance News. Each day, Tom Fox, the Voice of Compliance, brings you compliance-related stories to start your day. Sit back, enjoy a cup of morning coffee, and listen in to the Daily Compliance News, all from the Compliance Podcast Network. Each day, we consider four stories from the business world, compliance, ethics, risk management, leadership, or general interest for the compliance professional.

Top stories include:

  • US relaxes sanctions on PDVSA. (FT)
  • Chin wants the Malaysian ABC agency investigated. (Bloomberg)
  • Hacker breaks into law enforcement tip database. (Reuters)
  • Senegal, stripped of the Africa Cup title, calls for a corruption investigation. (NYT)