Enter your AI deployment details and generate a structured draft Fundamental Rights Impact Assessment covering all six mandatory elements of Article 27(1)(a)-(f) of Regulation (EU) 2024/1689. Free. No data leaves your browser.
This information appears in the document header and establishes which Article 27 obligation category applies to your organisation.
Identify the high-risk AI system. This section establishes what is being deployed and in what context, providing the foundation for the rest of the assessment.
Article 27(1)(a) requires a description of the deployer's processes in which the high-risk AI system will be used, in line with its intended purpose.
Article 27(1)(b) requires a description of the period of time within which, and the frequency with which, the AI system is intended to be used.
Article 27(1)(c) requires identification of the categories of natural persons and groups likely to be affected by the system's use in the specific context.
Article 27(1)(d) requires identification of the specific risks of harm likely to have an impact on those categories of persons or groups, taking into account the information given by the provider under Article 13. Article 27(1)(e) requires a description of the implementation of human oversight measures, according to the instructions for use.
Article 27(1)(f) requires a description of the measures to be taken if those risks materialise, including the arrangements for internal governance and complaint mechanisms.
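Because the generator runs entirely client-side, the six elements above map naturally onto a single in-browser data structure. A minimal TypeScript sketch, assuming a hypothetical shape (the type and field names are illustrative, not the tool's actual schema):

```typescript
// Illustrative draft structure mirroring Article 27(1)(a)-(f).
// All names here are assumptions for this sketch, not the tool's real schema.
interface FriaDraft {
  processes: string;        // (a) deployer's processes using the system
  periodAndFrequency: string; // (b) intended period and frequency of use
  affectedPersons: string[];  // (c) categories of persons and groups affected
  risksOfHarm: string[];      // (d) specific risks to those categories
  humanOversight: string;     // (e) implementation of human oversight measures
  materialisationMeasures: string; // (f) measures if risks materialise, incl. governance and complaints
}

// Assemble a plain-text draft locally; no data is sent anywhere.
function renderDraft(d: FriaDraft): string {
  return [
    `27(1)(a) Processes: ${d.processes}`,
    `27(1)(b) Period and frequency: ${d.periodAndFrequency}`,
    `27(1)(c) Affected persons: ${d.affectedPersons.join("; ")}`,
    `27(1)(d) Risks of harm: ${d.risksOfHarm.join("; ")}`,
    `27(1)(e) Human oversight: ${d.humanOversight}`,
    `27(1)(f) Measures on materialisation: ${d.materialisationMeasures}`,
  ].join("\n");
}
```

Rendering one line per Article 27(1) point keeps each mandatory element visibly accounted for in the output document.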
This draft covers the required structure of Article 27(1)(a)-(f). For deployers who must also carry out a data protection impact assessment under Article 35 GDPR, Article 27(4) provides that where any Article 27 obligation is already met through the DPIA, the FRIA complements that DPIA, so both can be maintained as a single joint document provided all elements of each are present.
Read the full guidance on what deployers must prepare and maintain before 2 August 2026.
Article 27 guide
90-day checklist