Agentic AI, Data Discipline, and Cross-Functional Governance: Compliance Insights for the Modern Era

As compliance professionals, we often inherit the boundaries that IT, Legal, and Security established long before we arrived. But what happens when those lines are out of date? I recently had a wide-ranging conversation with cybersecurity author and educator Robert Meyers, who has spent more than three decades transitioning from “plain IT” to a world where cybersecurity and privacy have become distinct, high-impact disciplines. He explains why the old map no longer matches the terrain. Meyers’ vantage point spans early dial-up remote access fiascos, modern breach response, philosophical differences between U.S. and EU privacy regimes, and the tidal shift that agentic AI is bringing to accountability and data governance.

This blog post distills that conversation for a corporate compliance audience, focusing on practical, board-relevant governance and the day-to-day tactics that make privacy and security work together before, during, and after incidents.

From “IT Does Everything” to “Risk, Roles, and Accountability”

Meyers started in an era when “cybersecurity” did not exist. There was just “IT,” and everyone did everything. That lack of specialization produced preventable harm: misconfigured remote access where a “guest” credential quietly had admin rights, cavalier attitudes toward email and user surveillance (remember when “I read your email” bumper stickers were a thing?), and a culture that treated privacy as a corporate secrecy issue rather than a people-protection mandate. The lesson for compliance? Risk thrives in ambiguity. When roles, ownership, and authority are undefined, controls are merely a facade.

Meyers contrasts the U.S. and EU not as a clash of legal texts but as a philosophical split. In Europe, privacy is government-centric and procedurally channeled through regulators; in the U.S., it is more individual-centric and notification-driven. California’s rules can even exceed the practical strictness of the GDPR in certain respects. For compliance leaders, that means your privacy posture must be designed around intent (i.e., who is protected), governance (i.e., who decides), and operational execution (i.e., who does the work), not just a citation list.

Data Has a Life Cycle—Treat It That Way

One of Meyers’ most pointed critiques is that organizations hoard data without a purpose or end-of-life discipline. If you keep 30 years of email, do not be surprised when eDiscovery asks for all 30. The habit of “keep it all, we might need it” is the enemy of proportional risk. Compliance should drive a business-backed data minimization program with explicit retention schedules tied to legal, operational, and risk rationales, and then audit for enforcement. If the business cannot articulate why it needs a dataset today and in the future, that data is a liability, not an asset.
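To make the idea concrete, here is a minimal sketch of what a machine-readable retention schedule could look like. The dataset names, rationale text, and the seven-year window are hypothetical illustrations, not anything Meyers prescribes; the point is simply that every dataset carries a documented purpose and an explicit end-of-life.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RetentionRule:
    dataset: str          # hypothetical dataset identifier
    purpose: str          # business/legal rationale for holding the data
    retention_days: int   # explicit end-of-life, never "forever"

def is_past_retention(rule: RetentionRule, created: date, today: date) -> bool:
    """Flag records that have outlived their documented purpose."""
    return today - created > timedelta(days=rule.retention_days)

# Illustrative rule: employee email kept seven years for HR/legal-hold support.
email_rule = RetentionRule("employee_email", "HR/legal hold support", 365 * 7)

print(is_past_retention(email_rule, date(2010, 1, 1), date(2024, 1, 1)))  # True
print(is_past_retention(email_rule, date(2023, 1, 1), date(2024, 1, 1)))  # False
```

An auditable rule like this gives Compliance something to enforce and something to test: any record flagged `True` is, by the business’s own stated rationale, a liability.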

Fix the Operating Model: Privacy Is Not a Side Gig for Security

Meyers has observed the exact misalignment play out repeatedly: privacy responsibility is often assigned to Legal or Compliance, but Cybersecurity typically does the work and absorbs the expectations. CISOs are asked to “own” controls for which they lack budgetary authority or policy ownership. Legal “owns” privacy on paper, but it is not integrated into cyber operations. Meyers is clear that the cure is governance, not heroics: establish a cross-functional steering committee (including Legal, Security, Compliance, IT Ops, and the business) with clear charters, shared KPIs, and defined decision rights. Diversity matters here; mix senior leaders with younger employees and varied backgrounds to avoid blind spots. The first agenda item of that committee should be ruthless purpose-alignment: “Why do we have this data? Do we still need it?”

Put Risks on One Page—and Make It Everyone’s Page

While cybersecurity tooling is often automated and technical, Meyers recommends one deceptively simple instrument to unite the disciplines: a shared risk register. GRC teams already live in this world. You should bring Security into it and treat security events, control weaknesses, and privacy exposures as entries that share owners, mitigations, and review cadences. If the CISO, Chief Compliance Officer, and General Counsel are not reading, updating, and arguing over the same risk register, you do not have a single source of truth or a shared sense of urgency.
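A shared risk register need not be elaborate to be effective. The sketch below is a hypothetical minimal structure, assuming invented risk IDs, owners, and severity labels; what matters is that security events, control gaps, and privacy exposures live in one place with a named owner, a mitigation, and a review date everyone can argue over.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RiskEntry:
    risk_id: str
    description: str
    owner: str            # accountable function: CISO, CCO, GC, etc.
    severity: str         # "high" / "medium" / "low"
    mitigation: str
    next_review: date

# Illustrative entries spanning privacy and security concerns.
register = [
    RiskEntry("R-002", "DSPM coverage gap on unstructured file shares",
              "CISO", "medium",
              "Extend scanning to file shares", date(2025, 2, 15)),
    RiskEntry("R-001", "Legacy mailbox retention exceeds policy",
              "Chief Compliance Officer", "high",
              "Enforce retention schedule and purge", date(2025, 3, 1)),
]

# One ordering everyone reads from: severity first, then review date.
ordering = {"high": 0, "medium": 1, "low": 2}
for entry in sorted(register, key=lambda e: (ordering[e.severity], e.next_review)):
    print(entry.risk_id, entry.owner, entry.severity)
```

Whether this lives in a GRC platform or a spreadsheet, the design choice is the same: the CISO, Chief Compliance Officer, and General Counsel all update the same entries, so disagreements surface as edits to a shared record rather than parallel narratives.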

Breach Reality: Precision Beats Blanket Notification

“Assume breach” is not fatalism; it is a sign of professional maturity. Meyers highlights the emergence of data security posture management (DSPM) solutions that not only identify exposures but also determine who actually owns the data that was accessed. That allows for targeted notifications — “these 15 people, not 500,000 customers” — and saves both real money and reputation. For the compliance function, the key point is proportionality: your incident playbook should pair legal thresholds with data lineage and ownership maps, ensuring a fast, accurate, and respectful response to individuals.

Agentic AI: Accountability Without a Face

Agentic AI changes the rules. Agents act without asking, talk to other agents, and traverse systems and data at machine speed. They also obscure accountability because the human “operator” may interact with one agent while three others are making consequential decisions out of view. This breaks the legacy consent and audit paradigms, demanding new guardrails: identity and authorization that can follow agents, granular logging of agent-to-agent interactions, and data lineage that respects privacy scopes. From a compliance lens, agentic AI requires you to rewrite playbooks on consent, purpose limitation, and lawful processing, before deployment, not after the first mishap.
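One of those guardrails, granular logging of agent-to-agent interactions, can be sketched simply. The record shape below is a hypothetical assumption (field names, agent names, and scope strings are invented for illustration): every hop records which agent acted, on whose behalf, and within which data scope, and a shared trace ID links multi-hop chains so a sequence of autonomous decisions can be reconstructed after the fact.

```python
import json
import uuid
from datetime import datetime, timezone
from typing import Optional

def log_agent_call(caller: str, callee: str, on_behalf_of: str,
                   data_scope: str, trace_id: Optional[str] = None) -> dict:
    """Emit one audit record for a single agent-to-agent call."""
    record = {
        "trace_id": trace_id or str(uuid.uuid4()),  # ties multi-hop chains together
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "caller_agent": caller,
        "callee_agent": callee,
        "on_behalf_of": on_behalf_of,   # the human principal, never blank
        "data_scope": data_scope,       # privacy scope the call is authorized for
    }
    print(json.dumps(record))  # in practice: ship to an append-only audit store
    return record

# First hop: a scheduling agent asks an email agent to act for a user.
first = log_agent_call("scheduler", "email-agent", "user:rmeyers", "calendar:read")

# Downstream hop reuses the trace_id so the full chain stays linked.
log_agent_call("email-agent", "crm-agent", "user:rmeyers", "contacts:read",
               trace_id=first["trace_id"])
```

The design point for compliance is that `on_behalf_of` and `data_scope` travel with every hop: an agent three calls deep cannot shed the human principal or the purpose limitation it inherited.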

Storytelling: The Culture Carrier for Security and Privacy

Meyers’ long connection to San Diego Comic-Con may seem far removed from cybersecurity. Yet a security team will often finally “get it” when you swap a nameless attacker for “Lex Luthor” in a tabletop exercise. That is not playing to pop culture; it is cultural engineering. Humans adopt guardrails that they emotionally understand. If your privacy training or AI oversight policy can be told as a story, with villains, flawed heroes, and a clear “why,” you improve retention, reduce resistance, and create connective tissue across silos. Compliance is, at its core, applied storytelling backed by controls.

Robert Meyers traces the evolution from undifferentiated IT to today’s specialized privacy and cybersecurity disciplines, emphasizing how poor role clarity and indiscriminate data retention have caused preventable harm for decades. He frames the U.S.–EU divide as a philosophical one: individual-centric versus regulator-centric approaches. He also urges companies to stop treating privacy as a side project for Security when Legal nominally “owns” it. The solution involves a cross-functional steering committee, a shared risk register, and purpose-driven data lifecycle governance.

Meyers underscores “assume breach” realism and highlights new DSPM tooling that enables precise, owner-level breach notification instead of blanket, costly responses. Looking ahead, agentic AI creates accountability gaps as autonomous agents act and collaborate out of human view, demanding fresh guardrails for identity, consent, lineage, and logging. Finally, Meyers champions storytelling (yes, even Comic-Con-style narratives) to make security and privacy relatable, and advocates for cross-training, with privacy professionals learning security and vice versa, so organizations can speak a single operational language from the boardroom to the SOC.