
Why AI Governance Needs More Than Traditional GRC Tools: A New Approach for Managing AI Risks and Compliance

Published on September 24, 2024

Don't put ClearOPS in the GRC bucket, please. Governance, Risk Management, and Compliance (GRC) tools may be useful to some, but I find them comparable to a CRM: sometimes useful, but requiring far too much work to maintain and manage. It is time someone came up with a way to do the work for you (like we did at ClearOPS). The purpose of this blog post is to explore whether GRC is the round hole that the square peg of AI assessments is going to be jammed into. Spoiler alert: it should not be.

The EU AI Act aims to ensure that AI systems deployed within the European Union are safe, transparent, and respect fundamental rights. Like the GDPR, which went into effect in May 2018, the Act is so comprehensive that businesses in the US and other countries will effectively be forced to comply. The EU AI Act categorizes AI systems by risk level (unacceptable, high, limited, and minimal) and sets stringent requirements for those classified as high-risk.
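
To make the tiers concrete, here is a rough sketch of how they might be modeled. The tier names come from the Act itself, but the example systems and obligations in the comments are my own simplification, not legal guidance.

```python
from enum import Enum

class AIRiskTier(Enum):
    """The four risk tiers defined by the EU AI Act."""
    UNACCEPTABLE = "unacceptable"  # banned outright (e.g., social scoring)
    HIGH = "high"                  # strict requirements (e.g., hiring or credit decisions)
    LIMITED = "limited"            # transparency obligations (e.g., chatbots)
    MINIMAL = "minimal"            # few or no obligations (e.g., spam filters)

# A very rough summary of what each tier implies for a provider.
OBLIGATIONS = {
    AIRiskTier.UNACCEPTABLE: "prohibited from the EU market",
    AIRiskTier.HIGH: "conformity assessment, documentation, human oversight",
    AIRiskTier.LIMITED: "disclose to users that they are interacting with AI",
    AIRiskTier.MINIMAL: "no specific obligations",
}
```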

Governance: I think this is the area where most software tools fall flat. Governance is about people, and if you don't have enough people in your organization with the right skillsets, then a governance tool is a huge waste of your time. When I began setting up an AI ethics program at Clarifai, I started with a committee of folks who were interested in the topic. From there, I was able to determine the right strategy for the company in terms of which people and which skillsets. People determine process, not the other way around. If you are the only person responsible for AI governance, then you don't need a tool. You need help.

Yet, from what I have seen, GRC tools do not foster multi-stakeholder collaboration. There is a sense that "the security team is responsible for it." That makes sense when the tool is set up as such a silo, but governance is not a siloed job. Look for a tool that encourages multiple stakeholders to log in and collaborate across functions.

Risk: I have seen so many tools break down risk into a formula: harm × probability = risk. Bleh. Risk is not a process, and it is not objective. Every factor that goes into a decision about risk is subjective. The only point of distilling it into a mathematical equation is to make it easier to digest at a glance. While we continue to wait for the AI assessments that will ultimately be required by regulation, AI is adding new complexity to the process of determining risk. You can have a GenAI tool writing marketing content that one company deems a high risk while, for another company, it may be an entirely acceptable risk.
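
To show why that formula bothers me, here is a minimal sketch. The numbers are invented, which is exactly the point: two companies can plug different "harm" scores into the same equation for the same tool, and both will look objective.

```python
def risk_score(harm: float, probability: float) -> float:
    """The textbook formula: risk = harm x probability.
    Both inputs are subjective judgments dressed up as numbers."""
    return harm * probability

# Two companies assessing the same GenAI marketing tool:
company_a = risk_score(harm=9.0, probability=0.6)  # 5.4 -> "high risk, do not deploy"
company_b = risk_score(harm=3.0, probability=0.6)  # 1.8 -> "acceptable, ship it"
```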

I think this is also an area where GenAI can help. When I was a General Counsel, I would frequently advise business leaders of the risk they were taking, but ultimately left the decision to them. If the point of GRC is to record these conversations, then having GenAI take my audio, transcribe it to text, and add it to a knowledge base would be useful. Being able to tell it what is covered by legal privilege and what isn't would be extremely useful, too. No GRC tool I know of does that today.
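
If a tool did capture those conversations, a minimal sketch of the record I would want it to keep might look like the following. The field names are hypothetical, not any product's actual schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskAdviceRecord:
    """One recorded risk conversation, ready to drop into a knowledge base."""
    meeting_date: date
    advisor: str                   # e.g., the General Counsel giving the advice
    decision_owner: str            # the business leader who owns the final call
    transcript: str                # text produced by a speech-to-text model
    privileged: bool = False       # flag legal advice so it can be excluded from exports
    tags: list[str] = field(default_factory=list)

record = RiskAdviceRecord(
    meeting_date=date(2024, 9, 24),
    advisor="General Counsel",
    decision_owner="VP of Marketing",
    transcript="(transcribed meeting audio goes here)",
    privileged=True,
    tags=["GenAI", "marketing content"],
)
```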

Compliance: If you are like me, then you think of compliance as policy writing. This is the area I am most excited about, both for applying AI and for evaluating it. At ClearOPS, our reporting GenAI function enables companies to build brand new policies in minutes that are custom to their business and in compliance with regulation. It's really impressive! Most GRC tools I have seen require you to upload a fully complete policy and then help with storing it and making it available to employees. That is still useful, but it is also very 2010.

In addition, GenAI can help evaluate your current policies against new regulations and offer suggestions for improvement or update them accordingly. With so many new AI laws, regulations, and standards emerging, this is a critical feature going forward.
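
As a rough illustration of the idea (not how any particular product implements it), that gap analysis boils down to a prompt that pairs your existing policy with the new regulatory text. The call_llm reference below is a placeholder for whichever model API you use, not a real library function.

```python
def build_gap_analysis_prompt(policy_text: str, regulation_excerpt: str) -> str:
    """Assemble a prompt asking a model to compare an existing policy against new rules."""
    return (
        "You are reviewing an internal policy against a newly published regulation.\n"
        "List each requirement in the regulation that the policy does not address,\n"
        "and suggest specific wording to close each gap.\n\n"
        f"--- POLICY ---\n{policy_text}\n\n"
        f"--- REGULATION ---\n{regulation_excerpt}\n"
    )

# prompt = build_gap_analysis_prompt(current_policy, new_regulation_text)
# suggestions = call_llm(prompt)  # placeholder for your model of choice
```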

And let's face it, the point of policies is to guide people when they don't know what to do. Most policies I have read are confusing at best and pointless at worst. If you think your AI governance needs a policy, then at least let your employees use GenAI to interact with it.

Conclusion

Organizations must consider investing in specialized AI governance tools that are designed to handle the unique challenges posed by AI technologies. Whether you are evaluating AI or using AI, tools that do the work for you are the future. Let's not stay stuck in the GRC past.

Take Control of Your AI Today: Contact Us!

Don't lose control of your proprietary data because you failed to implement governance.