We are seeking a proactive and detail-oriented AI Regulatory Engineer to ensure that AI-based clinical features for ultrasound systems comply with evolving global regulatory requirements, particularly FDA regulations for AI-enabled medical devices.
This role is critical as AI has become a highly regulated domain, requiring specialized regulatory strategies, risk management approaches, and lifecycle governance. The AI Regulatory Engineer will work closely with AI engineering, clinical, quality, product management, and regulatory affairs teams to define and execute the correct AI regulatory and submission strategy across the full product lifecycle.
Job Description
Key Responsibilities
AI Regulatory Strategy & Compliance:
Interpret and apply regulatory frameworks and guidance for AI-enabled medical devices, including FDA, EU MDR/IVDR, and global regulations.
Define and drive AI regulatory strategies for clinical ultrasound features, including SaMD and AI-enabled device functions.
Design and execute clinical validation strategy and plans for AI-driven ultrasound products in compliance with regulatory standards.
Support FDA submission pathways (e.g., 510(k), De Novo, PMA) for AI-based features, including AI-specific regulatory positioning.
Risk Management & Safety for AI:
Lead AI-focused risk management activities in accordance with ISO 14971, addressing AI-specific hazards such as bias, robustness, generalization risk, and clinical misuse.
Perform and maintain AI-specific risk analyses (e.g., AI FMEA, algorithm hazard analysis, clinical performance risk).
Ensure traceability between clinical requirements, AI behavior, risk controls, and verification/validation evidence.
Regulatory Documentation & Submissions:
Prepare and maintain AI-related regulatory documentation, including:
AI descriptions and intended use statements
Training and validation dataset descriptions
Performance evaluation and clinical evidence summaries
Algorithm change management and lifecycle documentation
Support creation of FDA-ready AI documentation, including transparency, explainability, and human-factors considerations.
AI Lifecycle & Change Management:
Define regulatory-compliant AI lifecycle strategies, including updates, retraining, and change impact assessment.
Support implementation of Predetermined Change Control Plans (PCCP) or equivalent AI lifecycle strategies.
Ensure alignment between AI development practices and regulatory expectations for locked vs. adaptive algorithms.
Cross-Functional Collaboration:
Collaborate closely with:
AI engineers and data scientists
Clinical and medical affairs teams
Quality, systems, and V&V engineers
Product management and regulatory affairs
Embed regulatory and safety requirements early into AI design, data strategy, and clinical validation plans.
Regulatory Intelligence & Audits:
Monitor evolving AI regulations, FDA guidance, standards, and industry best practices.
Communicate regulatory changes and their impact on AI roadmaps and product strategy.
Support internal audits, design reviews, and external regulatory inspections related to AI and clinical safety.
Provide training and guidance on AI regulatory and safety topics to engineering and product teams.
Requirements:
Bachelor's or Master's degree in Engineering, Computer Science, Biomedical Engineering, or a related field.
Experience working in regulated medical device environments, preferably involving AI-driven clinical features.
Solid understanding of medical device regulatory frameworks, with emphasis on FDA and EU MDR.
Experience with risk management methodologies and safety processes for medical devices.
Strong analytical, documentation, and communication skills.
Ability to work independently and collaboratively in a multidisciplinary environment.
Preferred Qualifications:
Experience with AI-enabled medical devices, SaMD, or clinical decision support systems.
Familiarity with FDA guidance on AI in medical devices, Good AI Pr
The position is intended for women and men alike.