EU AI Act: The Compliance Checklist We Wish Someone Gave Us Six Months Ago
I spent three weeks reading the EU AI Act. All 113 articles. The annexes. The recitals. The guidelines the Commission quietly published in July.
Here's what I learned: the regulation is genuinely well-intentioned. It's also nearly impossible for a small company to figure out on its own.
We're a small team in Amsterdam. We use ChatGPT, HubSpot, GitHub Copilot, and about a dozen other AI tools. When we tried to work out our own compliance obligations, we hit a wall. The official guidance assumes you have a legal department. We have a Notion doc.
So we built a tool to figure it out — first for ourselves, then for anyone in the same position. But before we get to that, here's the checklist we wish someone had handed us when we started.
Wait — does this even apply to my company?
Probably, yes.
If you're a five-person startup using Slack AI and ChatGPT, you're what the regulation calls a "deployer." You didn't build the AI, but you use it in your business. The Act still has requirements for you.
If you actually build AI products — a recommendation engine, a scoring model, anything you sell to others — you're a "provider" and your obligations are heavier.
Either way: if you operate in the EU and use AI tools, you're in scope. That surprised us too.
The four risk levels (without the legal jargon)
- Banned. Social scoring, manipulative AI, mass biometric surveillance. You're not doing this. Move on.
- High risk. This is the expensive one. If AI touches hiring decisions, loan approvals, student grading, or healthcare — you have mandatory documentation, human oversight requirements, and regular audits ahead of you. Tools like HireVue, Workday AI, and LinkedIn Recruiter land here.
- Limited risk. Most of us live here. If you use ChatGPT to help write customer emails, or Zendesk AI handles your support tickets, or HubSpot scores your leads — you need to tell people they're interacting with AI. That's Article 50. It sounds simple. Actually doing it across every touchpoint is not.
- Minimal risk. GitHub Copilot, Notion AI, Grammarly. Internal tools that don't affect anyone outside your team. No requirements. Carry on.
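The four tiers above boil down to a couple of yes/no questions. Here's a rough triage sketch in Python; the categories and examples come from this post, and the function name and parameters are ours, not the Act's (and not legal advice):

```python
def risk_tier(banned_practice: bool,
              affects_individuals: bool,
              user_facing_ai: bool) -> str:
    """Rough triage of one AI tool into the four EU AI Act tiers.

    banned_practice     -- social scoring, manipulative AI, mass biometric surveillance
    affects_individuals -- influences decisions about people (hiring, lending, grading)
    user_facing_ai      -- people outside your team interact with it or read its output
    """
    if banned_practice:
        return "banned"
    if affects_individuals:
        return "high"      # mandatory documentation, human oversight, audits
    if user_facing_ai:
        return "limited"   # Article 50 transparency duty
    return "minimal"       # internal tools; no requirements

# e.g. a support chatbot your customers talk to:
print(risk_tier(False, False, True))  # -> limited
```

The order matters: a hiring tool that also chats with candidates is high-risk, not limited, because the stricter tier wins.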
The 30-day checklist
We did this ourselves. The actual work took about two weeks, mostly because we were figuring it out as we went. The four weeks below are a comfortable pace, not a minimum; with this list in hand, you can move faster.
Week 1: Find your AI. Walk through every department and list every tool that has "AI" in its marketing copy. You'll be surprised. We found 14 tools. We thought we had 5.
Week 2: Sort by risk. For each tool, ask one question: does this tool make or influence decisions about individual people? Hiring, lending, pricing, access to services — if yes, it's probably high-risk. Everything else is likely limited or minimal.
Week 3: Fix the obvious gaps. Add "powered by AI" notices wherever customers interact with chatbots. Check that your AI vendors actually give you the documentation you'd need if a regulator asked. Most don't by default — you have to request it. Assign one person internally (it can be you) as the person responsible for AI. It's not a full-time job, just a named point of contact.
Week 4: Write it down. One document. What AI you use, why, what risk level, what you've done about it. For high-risk tools, you also need usage logs and a Fundamental Rights Impact Assessment — the EU publishes a free template that's actually usable.
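The Week 4 document doesn't need special software. Here's what one register entry might look like, sketched as plain Python data; the field names, tool, and email are our own illustration, not a format the Act prescribes:

```python
# Hypothetical AI register: one entry per tool, kept in a single shared document.
ai_register = [
    {
        "tool": "Zendesk AI",
        "purpose": "drafts and triages customer support replies",
        "risk_tier": "limited",
        "measures": [
            "AI disclosure added to chat widget (Article 50)",
            "vendor documentation requested and filed",
        ],
        "owner": "ops@example.com",  # your named AI point of contact
        "last_reviewed": "2025-06-01",
    },
]

# For high-risk tools, the entry would also reference usage logs and the FRIA.
for entry in ai_register:
    print(f'{entry["tool"]}: {entry["risk_tier"]} risk, owner {entry["owner"]}')
```

A spreadsheet works just as well; the point is that every tool, its risk tier, and what you did about it live in one place you can hand to a regulator.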
That's it. Not painless, but not the six-figure consulting project the compliance industry wants you to think it is.
The fine print (literally)
Yes, the fines are up to €35 million or 7% of global revenue. No, that's not what will happen to your 10-person agency.
Fines are scaled by company size. Micro and small enterprises pay less. The nuclear fines are for companies that deploy banned AI or lie to regulators.
The real risk for small businesses is more mundane and more likely: a rejected job candidate challenges your AI-assisted hiring process. A customer demands to know why an AI denied their application. You can't produce the logs because you never kept them.
That's the scenario that should worry you. Not the €35 million headline.
Why we built ExactAI
When we went through this process ourselves, the hardest part was step one: figuring out which of our tools even mattered under the regulation. The Act is 400+ pages. The guidance documents add another few hundred. Nobody has time for that.
So we turned our internal process into a tool. Four questions, five minutes, no signup. It tells you which of your AI tools are high-risk, which need transparency labels, and what you specifically need to do — article by article.
It's free because we think every small business in the EU should at least know where they stand before August.
Free assessment for EU businesses using AI tools. No signup required.
Run your free assessment →