Trekking Through Compliance – Episode 7 – Compliance Lessons from What Are Little Girls Made Of?

In this episode of Trekking Through Compliance, we consider the episode What Are Little Girls Made Of?, which aired on October 20, 1966, Stardate 2712.4.

We delve into the icy caverns of Exo III in the Star Trek classic “What Are Little Girls Made Of?”, where Dr. Roger Corby has gone far beyond the boundaries of ethical science. His discovery of an ancient technology for creating androids opens a chilling debate on artificial intelligence, identity duplication, and the ethics of replication.

We explore how Corby’s desire to replace flawed humans with perfect androids reflects modern dilemmas surrounding automation, transparency, data integrity, and the compliance risks that arise from technology run amok. As we watch Kirk’s doppelgänger roam the Enterprise, the question becomes clear: when does innovation cross the ethical line?

Episode Summary

After the Enterprise travels to the planet Exo III to investigate the fate of Dr. Roger Corby, two security guards, Matthews and Rayburn, are killed after beaming down. It turns out that Corby, known as the Pasteur of archeological medicine, has discovered the remains of an ancient culture, along with machinery that can create androids.

Corby begins implementing his plan by creating an android duplicate of Kirk to be taken to Minas 5, from where he intends to spread androids throughout the galaxy. However, Corby kills his android servant, Ruk, who has remembered the equation “existence, survival must cancel out programming.” The equation made Ruk realize that the clash between humans and androids that led to his makers’ demise centuries ago was becoming inevitable again, and he turned on Corby. Corby then reveals that he himself is an android. In the end, Corby destroys the remaining android and himself, ridding the universe of Exo III androids for all time.

Key highlights:

1. Transparency and Disclosure—Trust Dies in the Shadows

🖖 Illustrated by: Corby failing to disclose that he is no longer human—and is, in fact, an android. This fundamental breach of transparency is the heart of the compliance risk. Corby’s hidden identity violates the trust of those he engages with. Just as companies hide material facts or fail to disclose conflicts of interest, his omission threatens not only ethical standards but also operational integrity. For compliance professionals, transparency must always be a first principle.

2. Data Privacy and Identity Misuse—The Ethics of Replication

🖖 Illustrated by: The creation of a perfect android duplicate of Captain Kirk. This raises a powerful metaphor for today’s concerns about biometric data and identity cloning. What happens when your digital or physical likeness is copied without consent? Compliance teams must ensure privacy protections are in place for employee, consumer, and partner data, particularly when AI and automation are involved.

3. Risk Assessment and Program Governance—The Fallacy of ‘Perfect Control’

🖖 Illustrated by: Corby’s belief that androids can eliminate human error and thus build a better civilization. Corby’s fatal flaw is the assumption that perfection through programming eliminates the need for oversight. In corporate compliance, this mirrors the belief that strong policies alone prevent misconduct. As Corby and Rok demonstrate, even perfectly programmed systems break down when values clash with situational complexity.

4. Third-Party Risk—The Vendor You Don’t Know Is the One That Destroys You

🖖 Illustrated by: The lethal android Ruk, a legacy remnant of a prior civilization Corby could not fully control. Ruk represents an inherited third-party vendor, technologically capable but poorly understood. This highlights the risk of using legacy systems or foreign vendors without adequate due diligence. Compliance programs must have protocols for onboarding, monitoring, and retiring high-risk third parties.

5. Ethical Limits of Innovation—Because You Can Doesn’t Mean You Should

🖖 Illustrated by: Corby’s vision of a galaxy populated by androids, with human flaws “corrected” by machine logic. Compliance professionals must always ask, what is the ethical boundary of our innovation? Whether it’s in AI, product safety, or marketing tactics, organizations that pursue progress without ethical guardrails are just one bad decision away from crisis. Corby’s demise is a cautionary tale of ambition eclipsing accountability.

Final Starlog Reflections

“What Are Little Girls Made Of?” teaches us that replication without reflection is a road to ruin. Corby wanted control, certainty, and a frictionless future, but he lost sight of the ethical foundation that gives those goals meaning. In a world where technology evolves faster than regulation, compliance professionals must stand as the stewards of ethical innovation.

Resources:

Excruciatingly Detailed Plot Summary by Eric W. Weisstein

MissionLogPodcast.com

Memory Alpha
