As highlighted earlier this year, Team-NB published an updated version of its Position Paper on the "European Artificial Intelligence Act (AI Act)". The paper highlights the challenges that must be addressed in the medical device sector for successful implementation of the AI Act, which entered into force on August 1, 2024.
To watch the video version of this blog, click below. Otherwise, continue reading.
With a significant uptick in the development of AI-based medical device software (MDSW), these challenges will only become more pronounced in the lead-up to August 2, 2027, when the clear set of risk-based rules becomes applicable to high-risk systems already subject to other EU legislation (e.g. MDR / IVDR). If these challenges are not appropriately addressed, they could result in a repeat of the MDR and IVDR transitions, under which manufacturers continue to experience significant delays in certification.
As an economic operator, MedEnvoy recognizes both the potential and the challenges of artificial intelligence (AI) and its growing impact on the regulatory landscape. We remain committed to providing accurate, timely information to help our clients navigate the EU market efficiently and without disruption.
In this article, we take a closer look at three key challenges facing Notified Bodies under the EU AI Act, offering insights into how these may influence regulatory processes moving forward.
Learn more about MedEnvoy’s EU Regulatory Importer service here.
Challenge #1: Notified Body Designation
The designation process under the EU AI Act commenced on August 2, 2025. From that date, two key milestones applied:
- Application of Rules for Notified Bodies: The chapter of the EU AI Act detailing the requirements, responsibilities, and procedures for notified bodies became fully applicable.
- Establishment of Notifying Authorities: Each EU Member State is mandated to have its national “notifying authority” in place. These national bodies will be responsible for assessing, designating, and monitoring the notified bodies within their jurisdiction.
Following this date, conformity assessment bodies (organizations with the requisite expertise and independence) can formally submit their applications to the national notifying authority in the Member State where they are established. The application process will require comprehensive documentation, including an accreditation certificate where applicable, to demonstrate their competence in areas such as AI technologies, risk management, and fundamental rights.
As of the date of this article, several Member States have not met the August 2, 2025, deadline for notification to the European Commission. A primary reason for these delays is that the establishment of new regulatory authorities or the reassignment of responsibilities to existing ones involves national legislative procedures, which vary in duration across the EU. However, several countries have already taken formal steps to appoint their respective authorities. Germany, for example, has designated its Federal Network Agency (Bundesnetzagentur) as a key market surveillance authority. Similarly, Denmark has appointed its Danish Agency for Digital Government (Digitaliseringsstyrelsen) to a significant role.
In the interim, organizations like the International Association of Privacy Professionals (IAPP) and other legal and technology-focused groups are compiling their own directories of the designated or expected authorities based on publicly available information from individual Member States. These resources provide a valuable, though unofficial, overview of the evolving regulatory framework across the EU.
While the AI Act sets the starting implementation date, it does not prescribe a specific duration for the designation process itself. The time it will take for a conformity assessment body to officially become a Notified Body will likely vary, depending on several factors:
- The preparedness of the applicant: The completeness and quality of the application and supporting documentation will significantly influence the assessment timeline.
- The capacity of the national notifying authority: The resources and efficiency of the newly established national authorities will play a crucial role in how quickly they can process applications.
- The complexity of the AI systems the body intends to assess: Bodies seeking designation for a wide or particularly complex range of high-risk AI systems may face a more extended evaluation.
Experts and industry stakeholders have pointed to potential challenges that could impact the speed of Notified Body designation. A primary concern is a potential shortage of organizations with the necessary expertise to audit complex AI systems, which could create a bottleneck in the designation process. This has led to calls for leveraging existing frameworks and expertise, for instance, from the medical device sector, where a similar system of Notified Bodies is already in place. For context, under the MDR, the pre-application phase for a Notified Body can take around 50 working days, with the assessment phase potentially lasting 105 working days or more. While not directly transferable, this provides an indication of the potential duration involved.
Challenge #2: Notified Body Resourcing
The EU AI Act places a heavy burden of responsibility on Notified Bodies, which must possess a rare and diverse blend of expertise to adequately assess the complex and evolving nature of AI. This has given rise to a number of significant resourcing challenges that could create a serious capacity crunch, potentially delaying the market entry of innovative AI systems and undermining the very safety and fundamental rights the Act aims to protect.
The sheer volume of AI systems falling under the category of “high-risk” is expected to create a massive demand for the services of Notified Bodies. Without enough designated Notified Bodies, developers of high-risk AI systems could face significant delays in getting their products to market, stifling innovation and creating a competitive disadvantage for EU-based companies.
At the heart of the resourcing challenge lies the exceptional and multifaceted expertise required to audit high-risk AI. Notified Bodies will need to assemble teams of specialists with a deep understanding of:
- AI Technologies and Data Science: Proficiency in machine learning, neural networks, natural language processing, and other AI techniques is fundamental. Auditors must be able to scrutinize algorithms, data governance practices, and the methodologies used for training, validation, and testing of AI models.
- Cybersecurity: Given the interconnected nature of AI systems, robust cybersecurity knowledge is essential to assess their resilience against adversarial attacks and data breaches.
- Fundamental Rights and Ethics: A significant portion of the EU AI Act is dedicated to protecting fundamental rights. This requires personnel with a strong grounding in data protection (including GDPR), non-discrimination law, and the ethical implications of AI in sensitive areas like employment, law enforcement, and access to essential services.
- Risk Management: The ability to identify, analyze, and evaluate the potential risks posed by AI systems throughout their lifecycle is a core competency for Notified Bodies.
- Sector-Specific Knowledge: High-risk AI applications span a wide array of sectors, from medical devices and critical infrastructure to education and finance. Notified Bodies will need experts with in-depth knowledge of the specific domains in which the AI systems they assess will be deployed.
The combination of these highly specialized and often distinct skill sets within a single organization is a tall order. The demand for AI and data science talent already far outweighs supply, and the EU AI Act will intensify this competition, making it difficult and expensive for prospective Notified Bodies to recruit and retain the necessary personnel.
Becoming a Notified Body under the EU AI Act is a costly and complex undertaking. Organizations will face several significant financial and organizational challenges, including:
- Accreditation and Designation Costs: The process of being assessed and designated by a national notifying authority will itself be resource-intensive.
- Staffing and Training: As highlighted, recruiting and retaining a team of highly qualified experts will be a major expense. Continuous training to keep abreast of technological and regulatory developments will also be necessary.
- Liability and Insurance: The EU AI Act requires Notified Bodies to have appropriate liability insurance, the cost of which is likely to be substantial given the high-stakes nature of their work.
- Robust Quality Management and Cybersecurity Systems: Aspiring Notified Bodies must establish and maintain comprehensive internal systems to ensure the quality and integrity of their assessments and to protect the sensitive data they will handle.
- Data Access and Assessment Complexities: A critical part of a Notified Body’s role will be to assess the data used to train and test AI models. This presents a dual challenge: gaining access to often proprietary and vast datasets while ensuring full compliance with stringent data protection regulations like the GDPR. Navigating the legal and technical complexities of accessing and analyzing this data will require significant resources and expertise.
Challenge #3: Notified Body Conformity Assessment Timeframes
The EU AI Act outlines two primary routes for demonstrating conformity, with the required path depending on the specific type of high-risk AI system:
- Self-Assessment (Internal Control): For some high-risk AI systems, providers will be able to conduct a self-assessment without Notified Body involvement.
- Third-Party Assessment by a Notified Body: A significant portion of high-risk AI systems, particularly those in critical areas such as medical devices, critical infrastructure, and certain safety components of products, will require Notified Body conformity assessment. This process involves a thorough audit of the AI system’s technical documentation, risk management processes, and overall compliance with the EU AI Act’s requirements.
While the EU AI Act itself does not prescribe a specific duration for Notified Body conformity assessment, early indicators and comparisons with similar regulatory frameworks suggest that the timeline could range from a few months for straightforward self-assessments to a lengthy and potentially arduous nine to 24 months, or even longer, for systems requiring a third-party audit by a Notified Body. Such timeframes would significantly delay market entry for manufacturers of high-risk AI systems and, consequently, slow the rate of innovation in such systems, particularly in healthcare.
Learn More About the EU AI Act and More with MedEnvoy
This article provides a detailed overview of the primary challenges facing Notified Bodies under the EU AI Act. MedEnvoy’s regulatory experts can assist manufacturers requiring support with EU regulatory requirements for medical devices. Furthermore, MedEnvoy offers EU authorized representative and EU importer services. Should you need assistance, please reach out by clicking here; for information about our regulatory experts, click here.