Robotics

Machinery Regulation 2023/1230: the cobot framework moves into execution mode

Applicable from 20 January 2027, the new Machinery Regulation replaces Directive 2006/42/EC and interlocks with the EU AI Act. For robotics integrators, CE marking becomes an AI topic.

Swoft sector watch team
Image: industrial workshop with a collaborative cobot and a robot-safety panel

Regulation (EU) 2023/1230 on machinery, adopted in June 2023 and applicable from 20 January 2027, replaces Directive 2006/42/EC, which governed the placing of machinery on the European market for 18 years. It is a substantive change for manufacturers, robotics integrators and industrial users, all the more so because it interacts directly with the EU AI Act on AI-controlled safety functions.

What really changes vs the 2006/42 directive

From directive to regulation

First change, legal but far-reaching: we move from a directive (transposed into national law, hence with variations between member states) to a regulation (directly and uniformly applicable). For an integrator delivering robotic cells in several EU countries, the simplification is real: no more interpretation gaps between BAuA in Germany, INRS in France and INAIL in Italy.

High-risk machinery controlled by AI

The regulation lists in its Annex I the categories of "high-risk machinery" that require third-party conformity assessment by a notified body. A novelty of 2023/1230 is that machinery whose safety functions are provided by AI systems is explicitly included: for example, a collaborative robot (cobot) whose human-presence detection relies on computer vision trained on a dataset.

Articulation with the EU AI Act

When a machine integrates AI providing a safety function, it falls under both Machinery Regulation 2023/1230 AND the EU AI Act, which classifies such AI systems as high-risk safety components of products covered by the Union harmonisation legislation listed in its Annex I. Concretely: dual CE marking, dual documentation, dual post-market surveillance. The Machinery Regulation avoids duplication by pointing to the AI Act for the AI part, but the integrator must produce both files coherently.

The concrete cobot case

An ABB, Universal Robots, KUKA or FANUC cobot deployed in collaborative mode (no safety cage, sharing its workspace with the operator) relies on a critical safety function: human detection. This detection can be ensured by integrated force/torque sensors (the classical, deterministic method) or by cameras plus AI (more recent, potentially more precise, but harder to validate).
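As a sketch of why the AI path is harder to validate, here is a minimal two-channel presence decision in which the deterministic force/torque channel always backs up the vision channel, so the AI alone never gates safety. All names and thresholds are hypothetical, not vendor APIs.

```python
# Illustrative sketch only: a two-channel human-presence decision.
# Thresholds and names are invented for illustration, not taken from
# any standard or vendor documentation.

FORCE_LIMIT_N = 140.0        # hypothetical contact-force trip level
VISION_CONFIDENCE_MIN = 0.5  # hypothetical AI detection threshold

def must_stop(contact_force_n: float, vision_human_score: float) -> bool:
    """Trigger a protective stop if EITHER channel detects a human.

    The deterministic force/torque channel acts as a fallback, so a
    mis-trained vision model cannot silently disable the stop.
    """
    force_trip = contact_force_n >= FORCE_LIMIT_N
    vision_trip = vision_human_score >= VISION_CONFIDENCE_MIN
    return force_trip or vision_trip
```

The OR-combination is the conservative choice: the cell stops whenever either channel fires, so validating the AI channel can focus on false negatives.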

With Regulation 2023/1230 + EU AI Act, an integrator choosing the "cameras + AI" path must produce: (1) the Machinery compliance file with ISO 12100 risk analysis, ISO 10218 (robot safety), ISO/TS 15066 (cobots), (2) the EU AI Act file with training-data governance, model traceability, post-deployment monitoring, incident management. That is a significant documentary volume — easily 200-400 pages for a complex cell.
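The two files above can be managed as a single machine-checkable checklist. A minimal sketch, with illustrative (deliberately non-exhaustive) item lists:

```python
# Hypothetical sketch: the dual compliance file as structured data, so
# missing items can be detected automatically. Item names are
# illustrative, not an official checklist.

MACHINERY_ITEMS = {
    "ISO 12100 risk analysis",
    "ISO 10218 robot safety evidence",
    "ISO/TS 15066 collaborative-operation limits",
}
AI_ACT_ITEMS = {
    "training-data governance record",
    "model version traceability",
    "post-deployment monitoring plan",
    "incident management procedure",
}

def missing_items(dossier: set[str]) -> set[str]:
    """Return every required item the cell's dossier does not yet cover."""
    return (MACHINERY_ITEMS | AI_ACT_ITEMS) - dossier
```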

Three blind spots for integrators

Cybersecurity as a safety requirement

Regulation 2023/1230 now treats cybersecurity as a component of the functional safety of machinery. A cyberattack that takes control of a cobot and pushes it beyond its torque limits compromises human safety. The risk analysis must therefore include cyberattack scenarios (NIS2 and IEC 62443 are the reference frameworks). For integrators used to thinking only about mechanical and electrical safety, this is a new competency to acquire.
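One way to make such a scenario concrete: an independent watchdog between the (potentially compromised) controller and the drive, so that even a hijacked command stream cannot exceed torque limits. A hypothetical sketch; the limit value and names are invented for illustration:

```python
# Illustrative sketch of an independent torque-limit watchdog. In a
# real cell this logic would live in a safety-rated layer, not in
# application Python; names and the limit are hypothetical.

TORQUE_LIMIT_NM = 25.0  # hypothetical per-joint limit from the risk analysis

def filter_command(commanded_nm: float) -> float:
    """Pass a torque command through only if it is within limits;
    otherwise command zero torque (safe stop).

    Because the check is independent of the controller, it also
    contains a compromised command stream (the cyberattack scenario).
    """
    if abs(commanded_nm) > TORQUE_LIMIT_NM:
        # A real implementation would latch a safety-rated stop and
        # log the event for the IEC 62443 incident record.
        return 0.0
    return commanded_nm
```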

Digital instruction notice

The regulation explicitly allows, for the first time, providing the instructions for use in digital form (online PDF, dedicated platform), provided a paper version remains available free of charge on request. Many integrators will use this to switch their documentation to an interactive format (videos, 3D schematics, hotspots). But this requires a long-lived documentation platform, accessible for 10 years after the machine is placed on the market.

AI training-dataset traceability

The EU AI Act demands, for high-risk AI, traceability of training datasets: sources, biases, quality, representativeness. An integrator using a pre-trained model supplied by the cobot manufacturer (ABB, UR, etc.) must obtain this information from the manufacturer and attach it to the conformity assessment file. Most existing supplier contracts contain no such clause.
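What the integrator needs from the manufacturer can be captured as a structured provenance record attached to the conformity file. A hypothetical sketch, not an official AI Act template; all field names are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """Hypothetical training-data provenance record an integrator could
    request from the cobot manufacturer (illustrative fields only)."""
    name: str
    sources: list[str]                 # where the data came from
    collection_period: str             # when it was collected
    known_biases: list[str] = field(default_factory=list)
    representativeness_note: str = ""  # e.g. lighting, shifts, body types covered

    def is_complete(self) -> bool:
        # Minimal completeness check before attaching the record to
        # the conformity assessment file.
        return bool(self.name and self.sources and self.collection_period
                    and self.representativeness_note)
```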

The 2027-2030 scenario

For French robotics integrators, the 2026-2027 programme is: (1) audit cells in service to identify those that will need re-assessment under 2023/1230, (2) restructure technical files to integrate the cybersecurity risk analysis and EU AI Act elements, (3) agree with cobot manufacturers on AI training data and post-market governance, (4) upskill documentation teams (safety engineers familiar with ISO 12100 + IEC 62443 + the EU AI Act).

Beyond 2027, the stakes will be predictive maintenance and over-the-air (OTA) updates of AI models, which raise governance questions not yet resolved: how do you certify that a model update does not degrade safety? The regulation leaves this question to forthcoming delegated acts.

Topics covered

  • Machinery Regulation 2023/1230
  • EU AI Act
  • Cobot
  • ISO 10218
  • ISO/TS 15066
  • OT cybersecurity
Tech translation

How Swoft turns this challenge into software

Industrialising Machinery + EU AI Act compliance for robotics means connecting the technical file, the risk analysis, AI traceability and post-market documentation in a single system. Here is how Swoft equips robotics integrators.

  1. Unified Machinery + EU AI Act technical file

     For each delivered cell, a structured digital file: ISO 12100 risk analysis, ISO 10218 / ISO/TS 15066 conformity, the AI file (datasets, model, monitoring), and the IEC 62443 cybersecurity analysis. Documents are versioned, signed, and archived for 10 years with immediate accessibility.

  2. Tracking substantial modifications and their certification impact

     When a hardware or AI update is planned, the system automatically assesses whether it constitutes a "substantial modification" within the meaning of Regulation 2023/1230, in which case the conformity assessment must be re-initiated. OTA updates of AI models go through a traced validation workflow.

  3. Durable digital instructions and a post-market platform

     Interactive instructions for use (videos, 3D schematics, troubleshooting), accessible via a QR code on the machine. The platform guarantees 10 years of availability; updates to the instructions are versioned and accessible to users with a changelog. Compliant with post-market surveillance obligations.
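The substantial-modification trigger in point 2 above could be sketched as a simple rule. Illustrative only: the actual "substantial modification" test is a legal question under the regulation, and the field names here are hypothetical:

```python
# Toy sketch of a re-assessment trigger, not the regulation's legal
# test. Parameter names are hypothetical.

def needs_reassessment(changes_safety_function: bool,
                       introduces_new_hazard: bool) -> bool:
    """A planned change that affects a safety function or creates a new
    hazard re-opens the conformity assessment; anything else stays in
    the normal change-management (and, for AI models, OTA validation)
    flow."""
    return changes_safety_function or introduces_new_hazard
```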