Categories
Daily Compliance News

Daily Compliance News: October 2, 2025, The Cook Can Stay Edition

Welcome to the Daily Compliance News. Each day, Tom Fox, the Voice of Compliance, brings you compliance-related stories to start your day. Sit back, enjoy a cup of morning coffee, and listen in to the Daily Compliance News, all from the Compliance Podcast Network. Each day, we consider four stories from the business world, including compliance, ethics, risk management, leadership, or general interest, relevant to the compliance professional.

Top stories include:

  • Meta to mine AI chats to create ads. (FT)
  • World ABC fight lessened by the US withdrawal. (The Conversation)
  • South Africa and Nigeria poised to exit dirty money list. (Bloomberg)
  • Supreme Court says Fed Governor can stay until ruling. (Reuters)
Categories
AI Today in 5

AI Today in 5: October 1, 2025, The HR & IT Edition

Welcome to AI Today in 5, the newest addition to the Compliance Podcast Network. Each day, Tom Fox brings you five stories about AI to start your day. Sit back, enjoy a cup of morning coffee, and listen in to AI Today in 5, all from the Compliance Podcast Network. Each day, we consider five stories from the business world, including compliance, ethics, risk management, leadership, or general interest, related to AI.

Top AI stories include:

For more information on the use of AI in compliance programs, check out my new book, Upping Your Game. You can purchase a copy of the book on Amazon.com.

Categories
AI Today in 5

AI Today in 5: September 30, 2025, The Shrinking Companies Edition

Welcome to AI Today in 5, the newest addition to the Compliance Podcast Network. Each day, Tom Fox brings you five stories about AI to start your day. Sit back, enjoy a cup of morning coffee, and listen in to AI Today in 5, all from the Compliance Podcast Network. Each day, we consider five stories from the business world, including compliance, ethics, risk management, leadership, or general interest, related to AI.

Top AI stories include:

For more information on the use of AI in compliance programs, check out my new book, Upping Your Game. You can purchase a copy of the book on Amazon.com.

Categories
AI Today in 5

AI Today in 5: September 29, 2025, The AI and Blue Collar Jobs Edition

Welcome to AI Today in 5, the newest addition to the Compliance Podcast Network. Each day, Tom Fox brings you five stories about AI to start your day. Sit back, enjoy a cup of morning coffee, and listen in to AI Today in 5, all from the Compliance Podcast Network. Each day, we consider five stories from the business world, including compliance, ethics, risk management, leadership, or general interest, related to AI.

Top AI stories include:

For more information on the use of AI in compliance programs, check out my new book, Upping Your Game. You can purchase a copy of the book on Amazon.com.

Categories
AI Today in 5

AI Today in 5: September 26, 2025, The Of Mice and AI Edition

Welcome to AI Today in 5, the newest addition to the Compliance Podcast Network. Each day, Tom Fox brings you five stories about AI to start your day. Sit back, enjoy a cup of morning coffee, and listen in to AI Today in 5, all from the Compliance Podcast Network. Each day, we consider five stories from the business world, including compliance, ethics, risk management, leadership, or general interest, related to AI.

Top AI stories include:

  • India and Venezuela sign AI pact. (Coingeek)
  • Little difference between the neural networks of mice and AI. (TechXplore)
  • xAI snags the US government. (NYT)
  • 85% of execs expect compliance gains with AI. (PYMNTS)
  • AI could accelerate clinical gains. (MIT News)

For more information on the use of AI in compliance programs, check out my new book, Upping Your Game. You can purchase a copy of the book on Amazon.com.

Categories
2 Gurus Talk Compliance

2 Gurus Talk Compliance – Episode 60 – The Dispatches Edition

What happens when two top compliance commentators get together? They talk compliance, of course. Join Tom Fox and Kristy Grant-Hart in 2 Gurus Talk Compliance as they discuss the latest compliance issues in this week’s episode!

Stories this week include:

  • A former Navy No. 2 was sentenced to 6 years for corruption. (NBC)
  • BCG employees to take Humanitarian Principles training. (FT)
  • DOJ is about to cut loose the Binance monitor. (Bloomberg)
  • Trump calls for the end of quarterly reporting for public companies. (NYT)
  • First AI CCO. (BBC)
  • Dispatches from the SCCE Conference. (Radical Compliance)
  • Trump and Europe Are at Odds Over How to Sanction Russia. (WSJ)
  • What Compliance Leaders Need to Know Ahead of Crucial DOJ Data Security Program Deadline. (Corporate Compliance Insights)
  • The Rush to Return to Office is Stalling. (WSJ)
  • Florida man clings to back of moving UPS truck to avoid deputies after Lowe’s shoplifting attempt: officials. (FOX Orlando 35)

Connect with the Hosts:

Prove Your Worth

Tom: Instagram | Facebook | YouTube | Twitter | LinkedIn

Categories
Blog

Cybersecurity Oversight at the Board Level

Cybersecurity risk is no longer a back-office IT issue. It is a board-level governance priority, a regulatory compliance challenge, and a reputational minefield. From ransomware attacks to regulatory enforcement actions, the stakes have never been higher. An article in the Harvard Law School Forum on Corporate Governance, titled “Risk Management and the Board of Directors,” reviewed the NACD’s 2025 survey, which showed that over three-quarters of boards now discuss the material and financial implications of cyber incidents. While that is progress, awareness alone is not enough.

For compliance professionals, the message is unmistakable: cybersecurity oversight is now a central pillar of governance. In this post, I will explore the evolving regulatory landscape, lessons from enforcement actions, and practical steps compliance teams can take to help boards discharge their responsibilities effectively.

A National Priority with Global Reach

Cybersecurity has moved to the top of national agendas. The Biden Administration’s 2023 National Cybersecurity Strategy set the tone, and the Trump Administration’s 2025 Executive Order reinforced it, emphasizing protections against foreign cyber threats and secure technology practices. But this is not just a U.S. issue. The EU’s GDPR, California’s CCPA, Virginia’s CDPA, and Illinois’s biometric data laws all impose sweeping obligations with high-stakes enforcement. Settlements under Illinois’s biometric privacy law alone have reached into the hundreds of millions.

For compliance professionals, this expanding patchwork of regulation means that cyber oversight cannot be siloed by geography or business unit. Boards must ensure management understands and complies with both domestic and international requirements.

The SEC Steps into the Spotlight

If boards needed any reminder of their cyber responsibilities, the SEC has provided it. In 2023, the SEC finalized disclosure rules requiring companies to report material cyber incidents on Form 8-K within four business days (subject to limited delays approved by the Attorney General). Companies must also disclose in their 10-Ks their processes for identifying and managing cyber risks, the material impacts of prior incidents, and, critically, the board’s role in oversight.

The SEC has coupled disclosure mandates with enforcement actions. From Robinhood in 2025 (failure to implement identity theft protections) to SolarWinds in 2023 (alleged fraud and internal control failures), to Blackbaud’s ransomware misrepresentations and Morgan Stanley’s vendor monitoring failures, the Commission is signaling that cyber lapses are securities law violations. The key takeaway for compliance is that disclosures must be accurate, controls must be effective, and boards must demonstrate active oversight. Anything less may well invite regulatory scrutiny.

DOJ, FTC, and State Regulators Join In

The SEC is not alone. The DOJ has used the False Claims Act to address software vulnerabilities sold to government agencies. The FTC has pursued cases against GoDaddy and other providers for failing to implement adequate protections. The New York Department of Financial Services (NYDFS) has enforced its prescriptive cybersecurity rules since 2019, with actions as recent as August 2025. And globally, regulators like Ireland’s Data Protection Commission have issued blockbuster fines, such as the €530 million penalty against TikTok for unlawful data transfers.

The compliance implication is clear: multi-layered enforcement is now the norm. Cybersecurity and data privacy risks span agencies, jurisdictions, and statutes. Boards must assume that regulators will coordinate, cross-reference, and pursue failures aggressively.

Frameworks That Matter

With enforcement risk high, companies need a structured approach. The National Institute of Standards and Technology (NIST) framework has become the de facto benchmark, with its five core functions: identify, protect, detect, respond, and recover. Both the SEC and FTC endorse it, and boards should expect management to benchmark their programs against it.

At the governance level, the NACD’s Director’s Handbook on Cyber-Risk Oversight and guidance from the Cybersecurity & Infrastructure Security Agency (CISA) provide clear expectations: boards should not manage cyber risk, but they must oversee management’s handling of it.

Lessons from Enforcement Actions

Every enforcement case tells a story, and compliance professionals should use these as teaching tools:

  • Vendor Oversight Matters – Morgan Stanley’s failure to monitor vendors exposed data from 15 million customers. Boards must ensure that vendor cyber risk is integrated into their oversight.
  • Accurate Disclosures Are Non-Negotiable – SolarWinds and Blackbaud faced allegations of misrepresentation around breaches. Boards must verify that management’s cyber disclosures are truthful and complete.
  • Controls Must Be Tested – Robinhood’s identity theft control failures remind us that having policies on paper is not enough. Boards should require evidence that controls work in practice.

Practical Steps for Compliance Professionals

So how can compliance officers help boards meet their obligations in this complex cyber landscape? Four steps stand out:

1. Educate and Engage the Board

Boards need ongoing, tailored education on cyber risks. Compliance should arrange regular briefings from CISOs, external experts, and regulators. This ensures directors can ask informed questions and challenge management effectively.

2. Strengthen Incident Response Preparedness

An incident response plan is only as strong as its execution. Compliance must test plans through tabletop exercises, ensure disclosure obligations are understood, and coordinate with law enforcement and advisors. Boards should be briefed on lessons learned after every drill or real incident.

3. Integrate Cyber Risk into Enterprise Risk Management

Cyber risk cannot be isolated from strategy, finance, and operations. Compliance should help boards see cyber threats as part of enterprise risk management, aligned with business goals and resilience planning.

4. Monitor Third-Party and Supply Chain Risk

Vendors, cloud providers, and contractors are often the weak link. Compliance should implement due diligence, ongoing monitoring, and contract requirements that address cyber obligations. Boards should have visibility into these risks and the company’s mitigation strategies.

Why This Matters for Boards and Compliance

Cybersecurity is not just an IT challenge; it is a governance imperative. Regulators, courts, and investors expect boards to demonstrate active, documented oversight. For compliance professionals, the mandate is to help boards meet that expectation with clarity, structure, and evidence.

The reality is stark: a single breach can devastate a company’s reputation, stock price, and stakeholder trust. But boards that embrace active oversight, guided by compliance professionals, can transform cybersecurity from a vulnerability into a competitive advantage.

Final Thoughts

The cyber landscape is evolving faster than most organizations can keep pace. But boards do not have the luxury of waiting. As recent regulations and enforcement actions demonstrate, oversight failures will be punished, sometimes harshly.

For compliance professionals, this is both a challenge and an opportunity. By educating boards, strengthening incident response, integrating cyber into enterprise risk, and addressing third-party exposures, compliance can elevate its role from policy enforcer to strategic partner.

The bottom line: Cybersecurity oversight is no longer optional. It is the frontline of governance, and compliance professionals are the essential guides helping boards navigate it.

Categories
AI Today in 5

AI Today in 5: September 25, 2025, The Red Lines for AI Edition

Welcome to AI Today in 5, the newest addition to the Compliance Podcast Network. Each day, Tom Fox brings you five stories about AI to start your day. Sit back, enjoy a cup of morning coffee, and listen in to AI Today in 5, all from the Compliance Podcast Network. Each day, we consider five stories from the business world, including compliance, ethics, risk management, leadership, or general interest, related to AI.

Top AI stories include:

For more information on the use of AI in compliance programs, check out my new book, Upping Your Game. You can purchase a copy of the book on Amazon.com.

Categories
Blog

Directors and AI: Do’s, Don’ts, and Compliance Lessons

Artificial intelligence (AI) has rapidly become embedded in the daily workflows of executives, employees, and, increasingly, board directors. From drafting strategy summaries to analyzing industry data, directors are turning to AI chatbots and transcription tools in the same way they once adopted email, spreadsheets, or virtual board portals. However, unlike those earlier technologies, AI presents new risks, and for directors, these risks intersect directly with fiduciary duties and corporate governance obligations.

A recent memorandum by Skadden, Arps, Slate, Meagher & Flom LLP, published through the Harvard Law School Forum on Corporate Governance, outlines practical dos and don’ts for directors using AI in their board roles. The message is clear: while AI offers great promise, directors must use it with caution. For compliance professionals, this guidance provides important lessons not only for boardrooms but also for the governance structures that surround them.

The Temptation of AI in the Boardroom

Boards are expected to absorb massive amounts of information, such as financial results, strategy papers, compliance reports, and cybersecurity dashboards, often under tight timelines. It is easy to see why a director might feed these materials into an AI tool to produce summaries or ask for red flags. Similarly, transcription services appear attractive for documenting complex board meetings and discussions. But here lies the trap: not all AI tools are created equal. Publicly available chatbots often train on user inputs, meaning that confidential board information could be incorporated into the system and potentially regurgitated to other users, including competitors.

Just as directors would never send board books through unsecured email, they should not use AI tools without guardrails.

Key Risks Identified in the Director’s Guide

The Skadden memorandum outlines several risks directors must consider when using AI in their corporate capacities:

  1. Confidentiality and Data Leakage – Uploading sensitive materials into public AI systems risks exposing trade secrets or personal data. Even if the information is deleted from a user’s history, the AI vendor may still retain and train on it.
  2. Discovery and Litigation Risks – AI chats are records. Like emails, they may be discoverable in litigation or regulatory reviews. Regulators could demand access to AI interactions if they involve matters under scrutiny, such as antitrust reviews of mergers and acquisitions (M&A) activity.
  3. Loss of Privilege – Using AI to transcribe board meetings or communications with counsel risks waiving attorney-client privilege. Once third parties have access, privilege may be lost forever.
  4. Accuracy and Hallucinations – AI outputs can be wrong, biased, or outdated. Treating AI results as authoritative without verification exposes directors to poor decision-making and potential breaches of fiduciary duties.
  5. Erosion of Human Judgment – Over-reliance on AI to make HR, strategy, or other critical decisions risks abdicating the duty of care and loyalty. Directors must remain firmly “in the loop”.

Compliance Lessons for Professionals

From these risks, we can distill key lessons for compliance officers advising boards and executives on AI governance.

1. Confidential Information Must Stay Inside the Perimeter

Compliance professionals should establish clear rules: no uploading of board materials, personal data, or trade secrets into public AI tools. Instead, direct the board to company-approved platforms that are vetted for security and configured to prevent training on sensitive inputs. This is not just a best practice; it may also be required to comply with contractual obligations, privacy laws, and internal data-protection policies.

2. Treat AI Chats as Discoverable Records

Boards should assume that anything shared with AI may one day be discoverable by others. Compliance professionals must include AI chats and transcripts in records-retention policies and advise directors to avoid discussing sensitive legal or competitive issues in public AI systems. This lesson mirrors earlier corporate missteps with text messages and messaging apps. AI is the new frontier for discoverability.

3. Preserve Privilege by Avoiding AI for Legal Matters

Directors must not use AI to record privileged discussions with counsel or board meetings, as doing so risks waiving attorney-client privilege. Compliance officers should make this an explicit policy: approved transcription tools may be used for training sessions or customer service calls, but never for board-level deliberations. Losing privilege could cripple a company’s defense in litigation, a point compliance officers should hammer home during board training.

4. Verify Before You Trust

AI has a well-documented tendency to “hallucinate.” Directors must be reminded: AI is not a single source of truth. Compliance programs should emphasize verification. Encourage directors to cross-check AI outputs against trusted sources and ensure management reviews AI-generated analyses before relying on them for decision-making.

5. AI Is a Tool, Not a Decision-Maker

The most important compliance lesson: AI augments but does not replace human judgment. Directors remain bound by duties of care and loyalty. Compliance professionals must make clear that delegating decision-making to AI tools could not only harm the company but also expose directors to personal liability.

Building a Compliance Framework for Board Use of AI

The Skadden guide closes by urging boards to develop clear policies for AI use, including approved tools, acceptable uses, and required disclosures. For compliance officers, this is an opportunity to lead.

Here are key framework elements to consider:

  • Approved Tools List – Maintain a list of AI platforms validated by IT and legal for security and compliance.
  • Acceptable Use Policy – Define when and how directors may use AI (e.g., industry research, summarizing public filings) versus prohibited uses (e.g., uploading board decks, transcribing meetings).
  • Training and Awareness – Provide directors with training on AI risks, including confidentiality, discoverability, and hallucinations.
  • Monitoring and Audit – Periodically review the use of AI by directors to ensure compliance with relevant policies and regulations.
  • Disclosure Requirements – Require directors to disclose if AI tools were used to generate or summarize board-related materials.

Final Thoughts

The “Do’s and Don’ts of Using AI” is a timely reminder: AI governance is not only about company-wide adoption. It also starts at the top, with the board itself. Directors tempted to use AI in their own roles face unique risks. These risks could compromise confidentiality, destroy privilege, or erode fiduciary oversight.

For compliance professionals, this presents an opportunity to serve as both educator and enforcer. Just as compliance led the charge on insider trading policies, conflicts of interest, and anti-bribery training, so too must we lead on AI governance.

The bottom line is that AI can be an extraordinary tool for directors. But without compliance guardrails, it can also be a governance trap. Our role is to ensure the boardroom and the company stay on the right side of that line.

Categories
AI Today in 5

AI Today in 5: September 24, 2025, The AI Literacy Edition

Welcome to AI Today in 5, the newest addition to the Compliance Podcast Network. Each day, Tom Fox brings you five stories about AI to start your day. Sit back, enjoy a cup of morning coffee, and listen in to AI Today in 5, all from the Compliance Podcast Network. Each day, we consider five stories from the business world, including compliance, ethics, risk management, leadership, or general interest, related to AI.

Top AI stories include:

For more information on the use of AI in compliance programs, check out my new book, Upping Your Game. You can purchase a copy of the book on Amazon.com.