By Asmaa Gad | 10 min read
If your procurement team uses AI for supplier risk scoring, spend classification, or contract analysis, you need to pay attention. The EU AI Act is not a future regulation. The first prohibitions took effect on February 2, 2025. General-purpose AI obligations kicked in August 2025. And by August 2, 2026, the full set of requirements for high-risk AI systems becomes enforceable.
Fines? Up to 35 million euros or 7% of global annual turnover, whichever is higher. That’s not a typo.
The problem is that most procurement professionals don’t know whether their AI tools qualify as “high-risk” under this regulation. And most vendor sales teams aren’t rushing to explain it either. This guide breaks down what procurement teams actually need to know and do, without the legal jargon.
Why This Matters for Procurement Specifically
AI systems used for employment decisions, creditworthiness assessment, and “essential services” access can qualify as high-risk under the Act. If your AI tools make or influence decisions about suppliers, workforce, or access to procurement processes, classification matters. Procurement teams that buy AI tools are also considered “deployers” with their own compliance obligations.
EU AI Act Timeline: The Dates That Matter
FEBRUARY 2, 2025 (ALREADY IN EFFECT)
Prohibited AI practices banned. AI literacy requirements for organizations took effect. If your company deploys AI, staff must have appropriate training.
AUGUST 2, 2025 (ALREADY IN EFFECT)
Obligations for general-purpose AI model providers (think: ChatGPT, Claude, Gemini). Governance infrastructure and notified bodies became operational.
AUGUST 2, 2026 (UPCOMING: YOUR DEADLINE)
Full obligations for high-risk AI systems take effect. Conformity assessments must be completed, technical documentation finalized, and EU database registration done. This is the big one for procurement teams deploying or buying AI tools.
AUGUST 2, 2027
Final deadline for AI systems that are safety components of products. Also the compliance deadline for general-purpose AI models already on the market before August 2025.
Is Your Procurement AI “High-Risk”?
The EU AI Act classifies AI systems into four risk tiers. Here’s how each applies to procurement:
| Risk Level | What It Means | Procurement Examples |
|---|---|---|
| Unacceptable | Banned entirely | AI that manipulates supplier decisions through subliminal techniques, social scoring systems |
| High-Risk | Permitted with strict requirements | AI used for employment screening in procurement teams, creditworthiness assessments of suppliers, critical infrastructure decisions |
| Limited Risk | Transparency obligations | Chatbots for supplier inquiries, AI-generated content in RFPs, AI-generated images or summaries shared with suppliers |
| Minimal Risk | No specific requirements | Spend categorization tools, AI-assisted email drafting, demand forecasting models |
Important nuance: Many common procurement AI use cases (spend analysis, contract review, sourcing optimization) fall under “minimal risk” and face lighter requirements. The key concern is when AI systems directly influence decisions about people, such as supplier workforce compliance scoring, hiring for procurement roles, or supplier credit/financial assessments.
Your 10-Point Procurement AI Compliance Checklist
Whether you build AI or buy it, here are the steps your team should take before August 2026:
Inventory All AI Systems
List every AI tool your procurement team uses. Include ChatGPT, Copilot, vendor-embedded AI, and any “smart” features in your P2P or ERP system. You can’t comply with rules for tools you don’t know about.
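A lightweight way to start that inventory is a structured record per tool. This is a sketch only; the field names below are illustrative choices, not anything prescribed by the Act:

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    """One row in a procurement AI inventory (illustrative fields)."""
    name: str                       # e.g. "Copilot", "supplier risk scorer"
    vendor: str                     # who provides the tool
    embedded_in: str = ""           # host system, e.g. a P2P or ERP suite
    use_case: str = ""              # what decisions it supports
    touches_people: bool = False    # influences decisions about individuals?
    risk_level: str = "unclassified"

inventory = [
    AISystemRecord("ChatGPT", "OpenAI", use_case="drafting RFP text"),
    AISystemRecord("Supplier credit scorer", "Acme Analytics",
                   use_case="supplier financial assessment",
                   touches_people=True),
]

# Surface the tools that still need a risk classification
unclassified = [r.name for r in inventory if r.risk_level == "unclassified"]
print(unclassified)  # ['ChatGPT', 'Supplier credit scorer']
```

A spreadsheet works just as well; the point is that every tool gets the same fields, so nothing slips through as an unclassified "smart feature."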
Classify Each System by Risk Level
Use the table above as a starting point. For borderline cases, consult your legal team. The European Commission is publishing additional guidelines through 2026 to help with classification.
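A first-pass triage can mirror the risk table above. The keyword lists here are illustrative assumptions, not language from the Act, and anything borderline still goes to legal:

```python
# Rough keyword triage against the four risk tiers -- a screening aid,
# not a legal determination. Keywords are assumptions for illustration.
TIER_KEYWORDS = {
    "unacceptable": ["subliminal", "social scoring"],
    "high-risk": ["employment", "creditworthiness", "critical infrastructure"],
    "limited": ["chatbot", "generated content"],
}

def triage(use_case: str) -> str:
    """Return a provisional risk tier for a described use case."""
    text = use_case.lower()
    for tier, keywords in TIER_KEYWORDS.items():
        if any(k in text for k in keywords):
            return tier
    return "minimal (verify with legal)"

print(triage("creditworthiness assessment of suppliers"))  # high-risk
print(triage("spend categorization")) # minimal (verify with legal)
```

Even a crude screen like this forces the useful question for each tool: does it influence decisions about people?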
Define Your Role: Provider vs. Deployer
If you build custom AI models, you’re a “provider” with heavier obligations. If you buy and use AI tools, you’re a “deployer” with transparency, monitoring, and human oversight duties. Most procurement teams are deployers.
Update Vendor Contracts
Include AI Act compliance clauses in all technology procurement contracts. Require suppliers to provide documentation on AI model usage, data handling practices, and risk classifications. This protects your organization from third-party liability.
Implement Human Oversight Protocols
For high-risk systems, human-in-the-loop is mandatory. Define who reviews AI outputs, how often, and what override procedures exist. Document everything.
Train Your Team on AI Literacy
AI literacy training has been mandatory since February 2025. Everyone who interacts with AI systems must understand what AI can and can't do, how to interpret outputs, and when to escalate. This isn't optional training. It's a legal requirement.

Establish Data Governance
Document what data feeds into your AI systems, where it comes from, how it’s processed, and how long it’s retained. This overlaps with GDPR requirements many teams already follow.
Set Up Monitoring and Logging
High-risk AI systems need continuous monitoring. Track accuracy, bias, performance degradation, and incident logs. Build this into your existing reporting cadence.
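Logging can be as simple as emitting one structured record per AI-influenced decision. This is a minimal sketch, not a compliance guarantee; the field names are assumptions, not prescribed by the Act:

```python
import datetime
import json
import logging

# One structured audit record per AI output -- field names are
# illustrative assumptions, not requirements from the Act.
logger = logging.getLogger("ai_audit")
logging.basicConfig(level=logging.INFO, format="%(message)s")

def log_ai_decision(system: str, input_ref: str, output, human_reviewed: bool):
    """Append a structured audit entry for an AI-produced output."""
    record = {
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "system": system,          # which AI tool produced the output
        "input_ref": input_ref,    # a pointer to the input, not raw personal data
        "output": output,
        "human_reviewed": human_reviewed,
    }
    logger.info(json.dumps(record))
    return record

rec = log_ai_decision("supplier-risk-scorer", "supplier:4711",
                      {"score": 0.82}, human_reviewed=True)
```

Storing a reference to the input rather than the input itself keeps the audit trail aligned with the data-minimization habits GDPR already demands.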
Build Transparency Documentation
For systems that interact with suppliers or stakeholders, be transparent that AI is involved. If a chatbot handles supplier queries, users must know they’re interacting with AI.
Create an Incident Response Plan
What happens when an AI system produces a biased output or makes an error that affects a supplier? Have a documented process for identifying, reporting, and resolving AI-related incidents.
Don’t Wait for August 2026
Compliance is not a one-week project. Start your AI inventory today, update your vendor contracts this quarter, and build AI literacy into your team’s development plan. The organizations that treat compliance as an opportunity to build trust will be the ones winning contracts when competitors are scrambling.
Asmaa Gad is the founder of SupplyChain AI Pro, helping procurement and supply chain professionals master AI tools for real work.
