ConformySV

EU AI Act for International Companies

Complete guide on territorial scope, obligations, and compliance for non-EU AI providers.

Last updated: March 27, 2026

Does the EU AI Act apply to your company?

If you're wondering whether the EU AI Act affects your business outside Europe, the answer is likely yes, at least if you're deploying or selling AI systems to customers or users in the European Union.

Article 2 of the EU AI Act defines its territorial scope: the regulation applies to providers placing AI systems on the EU market or putting them into service in the EU, regardless of where the provider is established, and even to providers and deployers located outside the EU where the system's output is used in the EU. "Placed on the EU market" means the system is made available for use in the Union, whether through direct sales, cloud services, SaaS, mobile apps, or integration into products sold in the EU.

This means US tech companies, Chinese AI providers, Indian SaaS platforms, and any other non-EU entity placing high-risk AI in Europe must comply. The regulation doesn't care about your company's location; it cares about whether your AI system is used in the EU.

Key principle: The EU AI Act is extraterritorial. Your company's headquarters, servers, or incorporation status is irrelevant. What matters is whether your AI system reaches EU users or is deployed by EU customers.

Even limited pilots or beta deployments to EU users can trigger the regulation. If you're testing a high-risk AI system with EU customers, even a small group, you're subject to the AI Act's requirements.
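The scope test described above can be summarized in a short sketch. This is illustrative only, not legal advice; the parameter names are our own shorthand for the Article 2 criteria:

```python
# Illustrative sketch of the Article 2 territorial-scope test.
# Parameter names are our own shorthand, not terms from the regulation.
def ai_act_applies(placed_on_eu_market: bool,
                   put_into_service_in_eu: bool,
                   output_used_in_eu: bool) -> bool:
    """The Act can apply regardless of where the provider is established."""
    return placed_on_eu_market or put_into_service_in_eu or output_used_in_eu

# A US SaaS provider selling to EU customers is in scope:
print(ai_act_applies(placed_on_eu_market=True,
                     put_into_service_in_eu=False,
                     output_used_in_eu=False))  # True
```

Note that the third condition is why even a provider with no EU sales can be caught: if the system's output is used in the Union, the Act can still apply.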

Who counts as a provider, deployer, or importer?

The EU AI Act defines roles based on function, not nationality. Understanding which role(s) your company plays is essential because different obligations apply to each.

Provider (Article 3)

A provider is any entity that develops an AI system, or has one developed, and places it on the EU market or puts it into service under its own name or trademark (Article 3(3)). If your company develops, licenses, or customizes an AI system and makes it available to EU users, even as a cloud service, you're a provider. Providers bear the heaviest compliance burden (Article 16): system classification, technical documentation, risk management, conformity assessment, CE marking (if high-risk), and, for certain categories, notified body involvement.

Deployer (Article 26)

A deployer is an entity that uses an AI system under its own authority in the course of a professional activity (Article 3(4)). If you're an EU enterprise using an AI system provided by a non-EU company, you're a deployer. Deployers have lighter obligations (Article 26): using the system in accordance with the provider's instructions, assigning human oversight, monitoring operation for anomalies, retaining automatically generated logs (if high-risk), and reporting incidents to the provider and authorities. You can read more in our dedicated deployer obligations guide.

Importer (Article 23)

An importer is an entity located or established in the EU that places on the market an AI system bearing the name or trademark of a person established outside the EU (Article 3(6)). If you manufacture AI systems outside the EU and sell them under your brand through an EU-based entity, that entity is the importer. Importers must verify that the provider has carried out the conformity assessment and drawn up the required documentation before placing the system on the market, and must cooperate with authorities (Article 23).

Distributor (Article 24)

A distributor is any entity in the supply chain, other than the provider or importer, that makes an AI system available on the EU market without modifying it (e.g., a reseller or integrator) (Article 3(7)). Distributors must verify that the system carries the required CE marking and must have procedures to handle non-compliance (Article 24).

Key obligations for international companies

If your company qualifies as a provider of high-risk AI systems for the EU market, you face significant obligations. Here are the most critical ones.

1. CE Marking and Conformity Assessment

High-risk AI systems must carry a CE mark and a Declaration of Conformity (DoC) before being placed on the EU market. The provider must demonstrate that the system meets all requirements in Articles 8–15 (risk management, data governance, technical documentation, logging, transparency, human oversight, and accuracy, robustness, and cybersecurity). For certain high-risk categories, notably biometric systems where harmonized standards are not fully applied, you'll need involvement from a notified body: an independent assessment organization designated by EU authorities. Most other Annex III systems follow the internal-control (self-assessment) procedure.

2. Appoint an Authorized Representative in the EU (Article 22)

Providers established outside the EU must appoint, by written mandate, an authorized representative established in the Union to act on their behalf for regulatory matters. The representative can be a legal entity such as a subsidiary, distributor, or specialized compliance firm. It serves as the point of contact for authorities, keeps the technical documentation and Declaration of Conformity at their disposal, and verifies that the conformity assessment has been carried out. This is a strict requirement for high-risk systems: you cannot place high-risk AI on the EU market without it.

3. Technical Documentation in English or National Languages

Your Annex IV technical documentation must be available in a language the competent authorities of the Member States concerned can easily understand; in practice this means English or an official language of the country where the system is deployed. Documentation must cover system design, training data, testing results, risk management, and post-market monitoring procedures. It must be kept for ten years after the system is placed on the market and made available to authorities upon request.

4. Post-Market Monitoring and Incident Reporting

You must continuously monitor your AI system's performance in the real world and document any risks, biases, performance degradation, or safety issues as part of your post-market monitoring system (Article 72). Serious incidents must be reported to the market surveillance authorities of the Member States concerned: within 15 days of becoming aware of the incident as a general rule, within 10 days in the event of a death, and within 2 days for a widespread infringement or an incident affecting critical infrastructure (Article 73). Non-EU providers often struggle with this because they lack direct visibility into how their systems are used across diverse EU markets.
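The Article 73 reporting windows can be made concrete with a small deadline helper. This is a minimal sketch for illustration only; the category names are our own shorthand, and the clock runs from the moment you become aware of the incident:

```python
from datetime import date, timedelta

# Illustrative sketch of the Article 73 reporting windows described above.
# Category names are our own shorthand, not terms from the regulation.
REPORTING_DAYS = {
    "general": 15,                # default serious incident
    "death": 10,                  # incident involving a death
    "widespread_or_critical": 2,  # widespread infringement / critical infrastructure
}

def reporting_deadline(awareness_date: date, category: str = "general") -> date:
    """Latest date to notify authorities after becoming aware of the incident."""
    return awareness_date + timedelta(days=REPORTING_DAYS[category])

print(reporting_deadline(date(2026, 9, 1)))           # 2026-09-16
print(reporting_deadline(date(2026, 9, 1), "death"))  # 2026-09-11
```

In practice the hard part is not computing the date but detecting the incident early; the deadline starts when you become aware of it, which is why real-world monitoring pipelines matter.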

Timeline and enforcement: When compliance becomes mandatory

The EU AI Act entered into force on August 1, 2024, and its general applicability date is August 2, 2026. After that date, any high-risk AI system placed on the EU market must fully comply with all requirements. However, some rules took effect earlier: prohibitions on unacceptable-risk practices have applied since February 2, 2025, and obligations for general-purpose AI models since August 2, 2025. High-risk systems embedded in products covered by Annex I harmonisation legislation benefit from an extended deadline of August 2, 2027.

Fines and penalties for non-compliance

The EU takes AI Act violations seriously. Financial penalties under Article 99 are substantial (in each case, whichever amount is higher; for SMEs, whichever is lower):

  • Up to €35 million or 7% of global annual turnover for engaging in prohibited AI practices (Article 5)
  • Up to €15 million or 3% of turnover for non-compliance with most other obligations, including the high-risk requirements on risk management, data governance, and post-market monitoring
  • Up to €7.5 million or 1% of turnover for supplying incorrect, incomplete, or misleading information to notified bodies or authorities

Additionally, authorities can issue compliance orders, suspend market access, or require mandatory recalls, all of which damage brand reputation and market standing.

Given the scale of penalties, international companies should prioritize compliance now, even though the final deadline is August 2026. Early action reduces risk and demonstrates good faith to regulators.

Who enforces the regulation?

Each EU member state designates a market surveillance authority (typically a national trade, consumer protection, or standards body) responsible for testing AI systems and investigating violations. These authorities can conduct audits, request documentation, and impose fines. At the EU level, the European Commission's AI Office coordinates enforcement, publishes guidance, and directly supervises general-purpose AI models. Non-EU providers are not exempt: authorities actively monitor for non-compliant systems from international providers and take action when violations are found.

Step-by-step: How to achieve compliance from outside the EU

If your company provides high-risk AI systems to the EU, follow these steps to ensure compliance.

  1. Classify your AI system. First, determine whether your system is high-risk under Annex III of the regulation. High-risk categories include systems used for law enforcement, critical infrastructure, employment, education, credit scoring, and biometric identification. Use our free AI Act classification tool or consult a compliance expert to be certain of your category.
  2. Map your obligations. Once you know your system's risk category, identify the specific requirements that apply (Articles 8–15). You must establish a risk management system (Article 9), governance for training and test data (Article 10), a quality management system (Article 17), and human oversight procedures (Article 14).
  3. Appoint an EU-based authorized representative. Select a trustworthy EU legal entity, often a subsidiary, distributor, or specialized compliance firm, to represent your company. Draw up a formal written mandate specifying their responsibilities. The representative's contact details must appear in your Declaration of Conformity and be available to authorities.
  4. Build or audit your systems. If your system already exists, conduct a thorough audit against the Articles 8–15 requirements. If it's in development, integrate compliance throughout the design process. Ensure robust risk management, high-quality training and test data, and traceability. This is the most time-intensive step.
  5. Prepare technical documentation (Annex IV). Compile comprehensive documentation covering system purpose, development process, training data characteristics, testing and validation results, known limitations, performance metrics, and post-market monitoring procedures. Documentation must be in English or a relevant EU language. Our Annex IV documentation tool guides you through this systematically.
  6. Engage a notified body (for certain high-risk categories). Where third-party assessment is required, submit your system and documentation to a notified body for conformity assessment; most other Annex III systems follow the internal-control (self-assessment) procedure. Once the assessment is complete, issue your Declaration of Conformity and apply the CE mark to your system.
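The classification check in step 1 can be sketched as a simple first-pass lookup. This is illustrative only: the area list is an abridged paraphrase of Annex III, and a real classification requires reviewing the full annex and its exemptions with a compliance expert:

```python
# Abridged, illustrative subset of Annex III high-risk areas.
# Not exhaustive and not legal advice; names are our own shorthand.
ANNEX_III_AREAS = {
    "biometric_identification",
    "critical_infrastructure",
    "education",
    "employment",
    "credit_scoring",
    "law_enforcement",
}

def is_high_risk(intended_use: str) -> bool:
    """Rough first-pass check: does the intended use fall in a listed area?"""
    return intended_use in ANNEX_III_AREAS

print(is_high_risk("employment"))   # True
print(is_high_risk("video_games"))  # False
```

A negative result here does not end the analysis; a system can still carry transparency obligations (e.g., chatbots or synthetic-content generators), which is why step 1 should always be confirmed against the regulation itself.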

How EU Compliance AI can help

Compliance is complex, time-consuming, and costly if done wrong. International companies often lack the expertise to navigate the EU AI Act, especially the technical documentation and conformity assessment processes. EU Compliance AI simplifies this:

Our tools and guides support every step: Our free classification tool helps you understand if your system is high-risk. Our Annex IV documentation generator walks you through technical documentation requirements systematically, translating regulations into practical questions. Our guides on risk management, data governance, and deployer obligations (Article 26) provide detailed, actionable guidance. And our timeline tool keeps you aware of upcoming deadlines so you're never caught off-guard.

Start your compliance journey

Classify your AI system today and discover exactly which EU AI Act requirements apply to you: free, no registration required.

Classify your system