
How to create an AI usage charter in the workplace

Leïla Sayssa
6 November 2025 · 5 minutes read time

Why your organization needs an AI charter right now

Artificial intelligence is being integrated into all professions, often without a formal framework: AI assistants, automation, content generation tools, business co-pilots...
But behind these efficiency gains lie legal, ethical, and compliance risks: data breaches, algorithmic bias, loss of control over decisions, or failure to comply with GDPR and the EU AI Act.

The AI usage charter thus becomes an essential governance tool.
It establishes rules and best practices to secure, structure, and hold accountable the use of AI, in compliance with ethical, regulatory, and compliance requirements.

Objective: to provide a compass for ethical innovation, protect the company, reassure teams, and anticipate the obligations of the AI Act.

1. Preparatory phase: laying the foundations

a) Who and how to start the approach

  • Identify the sponsor, governance (“who carries the charter”), involve internal stakeholders (IT, legal, HR, departments).

  • Conduct a state of play: what AI tools are already in use, what data is involved, what “shadow” practices exist.

Before any drafting, it is essential to identify the AI systems used within the company, including those adopted without official validation ("shadow IT" or "shadow AI"). This involves analyzing data flows, generative tools, automation, and connectors.

Objective: to achieve a clear vision of actual usage and the associated risks.
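As an illustration, a first-pass inventory of this kind can be kept as simple structured data and filtered for unapproved usage. The tool names, fields, and statuses below are hypothetical examples, not a prescribed format:

```python
# Minimal sketch of an AI tool inventory used to surface "shadow AI".
# Tool names, departments, and data categories are hypothetical examples.

inventory = [
    {"tool": "ChatGPT", "department": "Marketing", "approved": True, "data": ["public content"]},
    {"tool": "NotesSummarizerX", "department": "HR", "approved": False, "data": ["personal data"]},
    {"tool": "CodeCopilot", "department": "IT", "approved": True, "data": ["source code"]},
]

# Flag tools that entered the organization without official validation.
shadow_ai = [t for t in inventory if not t["approved"]]

for t in shadow_ai:
    print(f"Shadow AI: {t['tool']} ({t['department']}) processes {', '.join(t['data'])}")
```

Even a spreadsheet with these columns is enough to start; the point is to make unvalidated tools visible before drafting the charter.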

b) Key questions to ask

  • What AI tools are currently employed (chatbots, generative AI, co-pilots, etc.)?

  • What data is being processed? Often: sensitive data, business secrets, personal data, etc.

  • In which use cases do we want to allow AI and in which cases do we want to prohibit it?

  • How does usage align with the organization's values and objectives?

The drafting of an AI charter is a collective approach. Its effectiveness relies on the involvement of stakeholders: management, IT, legal, HR, departments, DPO, and sometimes external partners.

This co-construction allows for defining the values, objectives, and red lines of the organization.

c) Mapping risks & obligations

Each use of AI must be analyzed from several angles:

  • GDPR and data security,

  • ethical risks and algorithmic bias,

  • business and decision-making impact,

  • regulatory compliance (AI Act).

The company must then establish a prioritized risk matrix for each tool or usage.

Objective: to factually assess the risk associated with each AI use case and identify the necessary mitigation measures.
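The prioritized risk matrix described above can be sketched as a simple likelihood × impact scoring per risk angle, with the worst score driving the priority. The use cases, scores, and thresholds here are illustrative assumptions, not values from any regulation:

```python
# Sketch of a prioritized risk matrix: each AI use case is scored per risk
# angle as (likelihood 1-5, impact 1-5); the highest product sets the priority.
# Use cases, scores, and thresholds are hypothetical examples.

use_cases = {
    "customer-support chatbot": {"GDPR": (4, 5), "bias": (3, 3), "AI Act": (3, 4)},
    "internal email drafting":  {"GDPR": (2, 2), "bias": (1, 2), "AI Act": (1, 1)},
}

def priority(scores):
    worst = max(likelihood * impact for likelihood, impact in scores.values())
    if worst >= 15:
        return "high"
    if worst >= 6:
        return "medium"
    return "low"

matrix = {name: priority(scores) for name, scores in use_cases.items()}
print(matrix)  # e.g. {'customer-support chatbot': 'high', 'internal email drafting': 'low'}
```

High-priority use cases are the natural candidates for explicit rules, mitigation measures, or outright prohibition in the charter.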


2. Structuring the charter: essential elements to include

Here are the essential sections identified in best practices:

a) Preamble

  • Objective of the charter, tailored to the context of AI usage within the organization.

b) Scope

  • Who the charter applies to (employees, contractors, interns...); which systems and use cases are covered.

c) Definitions

  • Generative AI, sensitive data, prompt, authorized user, etc.

d) Objectives and scope

The charter frames AI usage and complements existing policies (GDPR, security, ethics).

e) Guiding principles

  • Primacy of the human, transparency, explainability, respect for fundamental rights, fairness, security, accountability, responsible innovation, traceability.

f) Authorized and prohibited uses

Define concrete examples: authorized: "drafting a non-confidential email"; prohibited: "entering customer data in a prompt".

g) Control and protection measures

  • Blocking unapproved tools, access restrictions, audits, monitoring.

h) Training and support

  • Awareness, hands-on training, prompt workshops, etc.

i) Sanctions

  • Disciplinary measures in case of non-compliance.

j) Governance, updates, enforceability

  • Appendix to internal regulations, AI committee, periodic review.

k) Employee commitment

  • Signature, individual commitment, liability clause.

3. Writing the charter: good practices

  • Use clear language, accessible to all (avoid legal jargon).

Illustrate with concrete examples (internal use cases) so that every employee can picture the rules in practice.

  • Tailor the document to the organization's context (size, sector, type of data).

Think in terms of an operational format, not just a theoretical one: provide a short document, appendices, FAQs, pictograms.

  • Involve end users in the review to ensure understanding.


4. Implementation and monitoring: turning the charter into action

  • Internal communication: dissemination, training, e-learning module.

  • Integration: attach to employment contracts, internal regulations.

  • Monitoring and control: audit of AI usage, tracking indicators (e.g., % of approved AI tools, incidents).

  • Updating: regularly review the charter to incorporate new tools, new regulations (AI Act).

  • Governance: set up an AI committee, compliance officer, liaison with the DPO.

  • Awareness sessions for teams, internal FAQs to address practical questions.
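Tracking indicators such as the ones mentioned above (share of approved AI tools, incident counts) can be computed directly from the tool inventory. The data and field names below are illustrative assumptions:

```python
# Sketch of charter follow-up indicators: share of approved AI tools
# and number of reported incidents. All data is hypothetical.

tools = [
    {"name": "ChatGPT", "approved": True},
    {"name": "ImageGenY", "approved": False},
    {"name": "CodeCopilot", "approved": True},
    {"name": "TranscribeZ", "approved": True},
]
incidents = [{"tool": "ImageGenY", "type": "personal data in a prompt"}]

approved_pct = 100 * sum(t["approved"] for t in tools) / len(tools)
print(f"Approved AI tools: {approved_pct:.0f}% | reported incidents: {len(incidents)}")
```

Reporting these figures to the AI committee at each periodic review keeps the charter tied to actual usage rather than intentions.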

A useful charter is never static. Before its publication, it must be reviewed and validated by internal stakeholders, then adjusted according to their feedback.

Objective: to transform the charter into a collective reflex, not just a compliance document.


5. Common mistakes to avoid

  • Copying and pasting a generic charter without internal analysis, or failing to adapt the content to your context (sector, size, type of data) → poor understanding.

  • Not mapping uses or tools before drafting → risk of omission or inadequacy.

  • Neglecting team training and failing to raise awareness among users: then the charter remains a dead letter.

  • Writing a text that is too legalistic, incomprehensible to users.

  • Writing the charter as a mere “compliance document” without business ownership.

  • Forgetting to assess biases and transparency of the AIs used.

  • Not providing sanctions or monitoring → lack of credibility.


Conclusion

The charter is a strategic tool to support innovation, limit risks, and build trust. It is a structured approach combining internal reflection, tailored drafting, training, and governance.

At Dastra, we support companies throughout this process: from auditing AI uses and drafting the charter to training teams and setting up operational follow-up with our AI systems register and more.

