The Need for an AI Codex | Written by Dr Sean Power
If your organisation develops or uses AI, you probably need an AI Codex. New technology usually forces a gap between, on the one hand, regulations and general industry advice and, on the other hand, what your organisation needs to know. Rapid technological change leaves the law and guidelines lagging behind what people are actually doing. And, without proper laws or guidelines, what people actually do can lead to mistakes and omissions that impact an organisation’s reputation and ability to function. Until the law and industry guidelines catch up, a codex can fill that gap. It is a document of guidelines for using a new technology, designed specifically for your organisation.
The gap between regulation and adoption is where we are with AI. It is a relatively new and rapidly changing technology that many industries are quickly adopting. AI regulations are coming (for example, the EU's AI Act in 2024). However, even when they arrive, they are unlikely to capture everything about AI that impacts your organisation, or everything your organisation can do with AI.
So, you need an AI codex.
Why Use an AI Codex?
Using a new and powerful technology like AI offers many kinds of benefits and harms. AI can accelerate your responses to industry and markets, enabling you to stay up to date and competitive. However, it can also provide misleading information ('hallucination') and mismanage data, especially private, confidential, and client data. Thus, however beneficial it is, AI can deliver ethically problematic results and damage your organisation's reputation and relationships with clients, suppliers, and your industry as a whole.
One way to avoid this is to provide some guidance and constraints on the use of AI. This is especially important where there is not enough industry understanding of what AI use means, such as in the early stages of its development and adoption, which is the stage where we find ourselves now.
What is an AI Codex?
A codex is a set of guidelines and rules about how to do something in a particular organisation. It outlines the ethical and legal requirements most relevant to your organisation, best-practice guidelines, and notes on other practices, e.g., why and how certain practices are prohibited or not recommended.
A typical codex contains the following elements:
- An explanation of why the codex is necessary, with explanations and citations (where necessary)
- An explanation of relevant concepts (sometimes in a glossary)
- Guidelines and requirements, with explanations and citations (where necessary)
- Important references, for example, specific regulations and ISO standards, and links related to citations
For AI, a codex might contain the following elements:
- Why an AI codex is necessary: Referring to the current state of AI development and use, concerns over this state, justification of those concerns, and the general approach the organisation is taking to address them, citing sources for further information.
- Explanation of AI concepts: AI ethics, AI safety, trustworthiness, transparency, hallucination, and the types of AI relevant to your organisation, e.g., logistics AI, large language models (LLMs), and conversational AI.
- Specific guidelines: For AI use, acquisition, evaluation, approval, retention, and removal. The guidelines cover AI use in day-to-day work, development, user data management, and recruitment. They reference and cite specific parts of various AI laws, regulations, and general (and industry-specific) guidelines. In addition, each guideline offers a brief explanation of why it should be followed, including the consequences of not following it.
- References: Links to important AI guidance, such as the EU's AI Act, the EC's 'Ethics Guidelines for Trustworthy AI', the 2023 US Executive Order on AI use, US Copyright Office decisions on AI authorship, and the MLA's guidance on citing AI sources.
What Are Its Limits?
However important an AI codex might be, it has limits. As may be obvious, an organisation's AI codex cannot constrain employees' behaviour outside the organisation. Nor can it constrain the behaviour of other organisations when they are not dealing directly with yours. There is no way to monitor such behaviour, and, where it is not illegal, attempting to constrain it is generally ethically dubious as well.
However, your organisation will inevitably share some kind of data with a partner, customer, or supplier. Given your organisation's relationship with other organisations, a codex can constrain how those organisations use the data you share with them. They may very well use that data in an AI in a way that impacts the security and reputation of your organisation. For example, another organisation may use your data when its specialised in-house partner-management AI handles it, when its supplier-sourcing AI evaluates you based on previous correspondence and order fulfilment, or even to train an in-house AI.
Even though you cannot directly control how other organisations use your data, you can provide guidelines for working with this possibility inside your own organisation. For example, you can offer procurement staff guidelines for evaluating suppliers based on their transparency around AI use, and, similarly, provide sales and marketing with guidelines on how potential customers might use your organisation's work in their own AI.
In addition, although you cannot regulate employees’ AI use outside of work, your guidelines can be useful educational sources for the best AI use outside of work.
When Do You Stop Using a Codex?
The main reason to stop using a codex is that it has become redundant: either the subject of the codex is no longer relevant, or regulations, laws, or industry guidelines cover everything in it, whether at an industry, national, or international level.
A codex can cease to be relevant when the use, development, or behaviour of the specific technology changes so much that the codex's requirements and guidelines no longer apply. A codex of best practices for using glue in creating advertising copy is irrelevant to modern advertising agencies. Guidelines for copying and storing floppy disks are irrelevant to most organisations using computers.
Regulations can also supersede your codex. In that case, there are two possibilities, each with quite different issues for your codex. First, the regulation wholly covers a specific guideline of your codex, and so the codex need not cover it. There is no longer a gap for the codex to fill. Second, and more concerningly, your codex guidelines go against current regulations. Thus, you must revise your codex—or, if the whole codex is against regulations, withdraw it entirely.
Similarly, industry guidelines can supersede your codex. Unless the guideline is a requirement of an industry body that your organisation intends to remain part of, this is not such a pressing issue. If a codex guideline is now redundant, a simple reference to the relevant industry guideline in your codex can be sufficient.
In any case, even if the law or industry provides guidelines that your codex covers, the codex can be useful. A codex can cease to be a gap-filler in guidance and become a summary of the most relevant parts of the regulation for your organisation. You can modify it to refer to the regulations or guidelines and still note what is most important for you.
Conclusion
In this article, we described reasons for an AI codex, what to include in it, what its limits are, and why you might stop having one.
The key takeaway is that you should have a codex when there is a gap between a) how you ought to use AI in your organisation and b) AI regulations and explicit industry guidelines. As a final point, this gap is also where your organisation's competitiveness and reputation stand out. Everyone in your industry will know about the regulations and industry guidelines. What your organisation does in the gap is up to you. Your codex enables members of your organisation to work together to turn that into a benefit.
In this context, note that OpenAI, currently one of the leading developers of generative AI, has a product called OpenAI Codex, a conversational AI that returns code when prompted in natural language; it is unrelated to the codices discussed here. https://openai.com/blog/openai-codex