A plugin for Claude that classifies systems against the AI Act, runs structured DPIAs, assesses vendors, and assembles evidence packs your auditor will accept. Built by a legal engineer for senior in-house counsel preparing for the Aug 2, 2026 high-risk deadline.
claude plugin marketplace add lexbeam-software/eu-ai-governance-plugin
Nothing speculative. Each command mirrors a discrete workflow an in-house compliance team runs today, now structured, cited, and version-controllable.
Classifies a system against Annex III and Article 6. Returns tier, citations, and the minimum obligation set that follows.
Runs a structured Data Protection Impact Assessment with GDPR Art. 35 scaffolding and AI-specific risk prompts.
Reads vendor documentation and produces a gap list against AI Act provider obligations with suggested contract language.
Assembles a signed, timestamped evidence bundle in Article 11 technical documentation style, export-ready.
Reviews an internal AI policy against the Act and flags the clauses that will not survive a regulator's first read.
A snapshot of your organisation's readiness across classifications, DPIAs, vendors, and evidence, with open items flagged.
The output below is what the plugin actually returns. Chips are structured fields; the analysis is model-written against the Act, and every reference is citable.
CV-Screen sits squarely within Annex III, point 4(a), AI systems used for recruitment or selection, specifically to filter applications. Under Art. 6(2) that makes it high-risk regardless of vendor claims. As deployer across three jurisdictions, obligations under Art. 26 attach to your entities. A Fundamental Rights Impact Assessment (Art. 27) is mandatory for public-sector deployers and recommended as good practice for private employers in this use case. A GDPR DPIA (Art. 35) is required given the automated decision-making on applicants.
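The chips in the output above are structured fields, which is what makes results diffable and version-controllable. As a purely hypothetical illustration (the field names below are assumptions for this sketch, not the plugin's documented schema), a classification record in that spirit might look like:

```python
# Hypothetical sketch of a structured classification record.
# Field names are illustrative assumptions, not the plugin's actual schema.
classification = {
    "system": "CV-Screen",
    "tier": "high-risk",
    "legal_basis": ["Annex III, point 4(a)", "Art. 6(2)"],
    "role": "deployer",
    "obligations": [
        {"ref": "Art. 26", "name": "Deployer obligations", "mandatory": True},
        {"ref": "Art. 27", "name": "Fundamental Rights Impact Assessment",
         "mandatory": False},  # mandatory only for public-sector deployers
        {"ref": "GDPR Art. 35", "name": "DPIA", "mandatory": True},
    ],
}

# Plain data like this is easy to cite, diff, and commit alongside the file.
mandatory = [o["ref"] for o in classification["obligations"] if o["mandatory"]]
print(mandatory)  # → ['Art. 26', 'GDPR Art. 35']
```

The point of the sketch is the shape, not the values: each chip maps to a field, and each legal conclusion carries its citation with it.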
This plugin was written against real in-house files, the kind a board asks about on a Friday afternoon. It is open source because compliance work should be inspectable, not proprietary. It is a passion project, not a company.
"I did not want another dashboard. I wanted the Act itself at the cursor, cited, structured, and defensible to the partner on the other side of the table." · Werner Plutat, on why the plugin exists
Requires Claude with plugin support. The plugin is Apache 2.0 licensed and runs locally; nothing is sent to Lexbeam.
This plugin is a tool for licensed professionals. It does not replace legal advice, and it is published under the limitations set out below in full.
This plugin does not render legal services within the meaning of the German Legal Services Act (Rechtsdienstleistungsgesetz). Outputs are informational scaffolding for qualified counsel and must be reviewed before any external use or filing.
Provided as-is under Apache 2.0. The author disclaims liability to the fullest extent permitted under the German Product Liability Act. Users are responsible for validation against the current text of the AI Act and their national implementing measures.
Six commands (/classify-ai-risk, /run-dpia, /assess-ai-vendor, /generate-evidence-pack, /review-ai-policy, /ai-act-status) plus six skills covering classification, compliance, vendor assessment, DPIA, governance documentation, and risk management. DACH-specific guidance (Works Council, BaFin, BSI) included. German language support via --lang de.