
Julie Myers Wood on Navigating the AI Compliance Landscape: Mitigating Risks

I recently had the opportunity to visit with Julie Myers Wood, CEO at Guidepost Solutions. With her extensive background in law and government positions, Julie brings a wealth of knowledge and insights to our discussion on the challenges and considerations of incorporating AI into compliance programs. We took a deep dive into the intersection of compliance and artificial intelligence (AI).

With generative AI coming at us at light speed, there are many things for a compliance professional to think about. Julie began with the first key step: take a high-level perspective, stepping back to reflect on all the ways AI can affect your company. You should ask several questions, including the following. What AI tools is the company using internally to support its operations, and does it fully understand those tools? What is your company selling? Is your company selling tools that incorporate deep learning, generative AI, or other forms of machine learning?

Equally important, what compliance function is each part of your team performing? What compliance tools are being used? Do you have individuals freelancing at your company, trying to reduce their workload with GPT or something else without telling you, and perhaps exposing some of your code? And finally, how are criminals using generative AI to get into your systems? It all comes back to understanding, from a high-level perspective, the various ways AI can affect you.

Next, it is important to ask: do you know all the tools the company is using? Compliance professionals need a comprehensive inventory of the tools in use within the company and must fully understand their capabilities and limitations. This may not be easy, particularly if your organization uses a mix of homegrown tools and tools available for sale on the open market. Your compliance team must understand which tools each part of the company is using, because only then can you fully understand the privacy or other regulatory risks that may be involved.

In this inventory, you also need to understand who owns the software tools. When do the licenses expire? How many seats does your organization have? Who owns the license keys, and is the software approaching end of life? This understanding is crucial for effectively managing compliance and mitigating potential risks. It is also a very good business practice.
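For teams that want to operationalize this kind of inventory, a simple structured record per tool can be a starting point. The sketch below is a minimal, hypothetical example in Python; the field names (business owner, license expiry, seat count, and so on) are assumptions drawn from the questions above, not a prescribed standard or any particular vendor's schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AIToolRecord:
    """One entry in a hypothetical AI tool inventory (illustrative fields only)."""
    name: str                     # tool name as known internally
    business_owner: str           # who in the company owns the tool
    vendor: str                   # vendor, or "internal" for homegrown tools
    license_expiry: date          # when the license expires
    seats: int                    # how many seats the organization holds
    license_key_owner: str        # who controls the license keys
    uses_generative_ai: bool      # does it incorporate generative AI or deep learning?
    data_shared: list[str] = field(default_factory=list)  # data categories sent to the tool
    end_of_life: bool = False     # is the software aging out / no longer supported?

# Hypothetical example entry and a simple review pass over the inventory
inventory = [
    AIToolRecord(
        name="Transcription service",
        business_owner="Operations",
        vendor="ExampleVendor",            # hypothetical vendor name
        license_expiry=date(2025, 1, 31),
        seats=50,
        license_key_owner="IT",
        uses_generative_ai=True,
        data_shared=["customer audio"],
    ),
]

for tool in inventory:
    if tool.uses_generative_ai and tool.data_shared:
        print(f"Review data handling for: {tool.name}")
    if tool.license_expiry < date(2025, 6, 1) or tool.end_of_life:
        print(f"License or lifecycle review needed for: {tool.name}")
```

Even a lightweight record like this makes it easier to answer the ownership, expiry, and risk questions above in one place rather than chasing them down tool by tool.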

Generative AI is rapidly advancing, and compliance professionals must stay informed and proactive in addressing its implications. Julie highlighted the need to be aware of the risks related to generative AI, export compliance, and other potential problems. By staying updated on the latest developments, compliance professionals can adapt to the changing landscape and make informed decisions.

There are potential dangers in integrating AI into businesses, and Julie offered solutions to mitigate them. One key solution involves retraining or supplementing the training of employees. Companies need to educate their workforce on the rules of the road and provide a safe environment for exploring and experimenting with generative AI. Julie pointed to PwC’s billion-dollar investment in AI, including retraining and proprietary platforms, as showcasing the importance of investing in employee development. However, smaller companies may face challenges in investing in generative AI and implementing it effectively.

AI is revolutionizing compliance by enabling effective analysis and interpretation of large amounts of data. Compliance professionals are excited about the potential of AI for predictive analytics and for identifying trends and patterns. However, choosing the right tools for compliance is crucial, as market winners and losers can impact success. A key to success for the compliance team is collaboration between operations and compliance when considering the use of AI.

Clear policies defining what can and cannot be done with AI are essential to protect intellectual property and ensure compliance. But it is not simply policies and procedures; it is targeted and effective training, coupled with ongoing communications. All of this should be aimed at educating employees about the risks and consequences of using AI improperly. Compliance professionals should encourage caution when downloading AI tools from the web and careful review of terms and conditions to avoid unintended consequences.

As compliance professionals, we play a vital role in ensuring the safety and security of our businesses. The integration of AI into compliance programs presents both challenges and opportunities. By understanding the tools, risks, and solutions associated with AI, we can adapt to the changing landscape and make informed decisions.

For the full podcast with Julie Myers Wood, check out Compliance and AI here.
