QuantPi Joins NVIDIA Halos AI Systems Inspection Lab Ecosystem to Advance Trustworthy Physical AI
Read announcement

Accelerate certification with reliable technical evidence

Our platform produces the evidence certifiers and regulators require: audit trails, validated test criteria, and confidence intervals for safety cases. Built alongside leading auditing and certification organizations.


Built with certifiers

Aligned with standardization

Pre-validated with regulators

Testing, Inspection & Certification Partnerships

From regulatory requirements to verifiable test criteria

Build your evidence layer for validated compliance processes in regulated use cases such as physical AI, autonomous vehicles, or robotics. QuantPi collaborates with renowned TIC partners to translate complex regulatory requirements into verifiable test criteria.

Audit and Certification

QuantPi’s TIC partners support your full audit readiness and preparation with:

Risk scoping workshops

AI risk management training

Gap analysis towards applicable legislation

AI quality management system setup

Technical documentation for your AI systems

Test Criteria and Methods

The selection and parametrization of technical assessment metrics need to be justified in the context of applicable regulation. We have successfully collaborated with TIC partners on conformity assessments in the context of the following exemplary references:

EU Artificial Intelligence Act


ISO/IEC TR 5469 - Functional Safety and AI Systems

General-Purpose AI Code of Practice


ISO/PAS 8800 - Road Vehicles - Safety and Artificial Intelligence

German General Act on Equal Treatment


ISO/IEC TR 24027 - Bias in AI Systems and AI aided Decision Making

EU Charter of Fundamental Rights


ISO/IEC 25010 - Systems and Software Quality Requirements and Evaluation

Explore how the QuantPi platform is used in real-world assurance cases. Read our white paper with TÜV AI.Lab and StepStone on uncovering bias in AI recruitment with a legally assured methodology to assess a candidate recommender system under European regulation.

Read our white paper
Physical AI Safety

Accelerating Physical AI Safety Certification

QuantPi is a member of the NVIDIA Halos AI Systems Inspection Lab, the industry’s first program accredited by the ANSI National Accreditation Board (ANAB) to provide a unified framework for functional safety, cybersecurity, and AI compliance. NVIDIA Halos is a comprehensive full‑stack safety system for physical AI that unifies safety elements across vehicle and robotics architectures and their underlying AI models.

The QuantPi platform delivers scalable technical validation with statistical confidence across diverse metrics, including robustness, fairness, and performance. These capabilities help streamline the validation process for physical AI deployments, ensuring they meet emerging AI-specific standards such as ISO/PAS 8800, ISO/IEC TR 5469, and ISO/IEC TS 22440.
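To illustrate what "statistical confidence" means for a test metric, here is a minimal, self-contained sketch: a Wilson score interval for a binomial pass rate, a standard way to attach a confidence interval to an observed pass/fail ratio. This is a generic statistical example, not the platform's actual method or API.

```python
import math

def wilson_interval(successes: int, trials: int, z: float = 1.96):
    """Wilson score confidence interval for a pass rate (95% by default)."""
    if trials <= 0:
        raise ValueError("trials must be positive")
    p = successes / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return centre - half, centre + half

# 942 of 1000 test scenarios passed: the point estimate alone (94.2%)
# hides the uncertainty; the interval makes it explicit.
lo, hi = wilson_interval(942, 1000)
```

An evidence package would report the interval (here roughly 92.6% to 95.5%) rather than the bare point estimate, so a certifier can judge whether the lower bound still clears the applicable threshold.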

Market Surveillance

QuantPi is in continuous exchange with regulators

Read more about our work with market surveillance authorities and explore our recent guest article on challenges and approaches to testing AI systems, contributed at the invitation of the German Federal Office for Information Security.

Policy & Standards

Test AI systems according to existing and future standards

QuantPi contributes to multiple standardization committees at the German and European levels to define the future of AI quality. Our contributions focus on reliability requirements for AI testing tools.

ETSI TS 104 008

QuantPi was invited by the German Federal Network Agency to comment on a technical specification that introduces a framework for continuous auditing-based conformity assessment.

Read announcement

AI Verify Foundation

QuantPi is a member of the AI Verify Foundation. We participated in their 2025 global AI assurance pilot and tested an investment research assistant.

Read case study

DIN Membership

QuantPi is an official member of the German Institute for Standardization (DIN). Our team contributes actively to multiple ongoing standardization projects.

AI at DIN

DIN SPEC 92006

QuantPi co-authored a new standard defining requirements for AI testing tools, for example regarding their reliability, traceability, and reproducibility.

Download full document

Certified AI

QuantPi is an associated partner of the German lighthouse project 'Certified AI'. We contribute talks and are part of programme committees for symposia.

Visit project website
Platform Capabilities for Certification of AI

How the QuantPi Platform supports your path to fast certification

We provide the specialized tooling required to satisfy rigorous audit requirements while maintaining speed of development.

Out-of-the-box test coverage

Hundreds of test scenarios aligned with international standards and regulation are available by default.

Customization to your concrete operating domain

Adapt test scenarios for in-context evaluations according to the operational design domain (ODD) of AI systems.

Extension of test coverage

Broaden test coverage with automated data annotation for deeper insights into AI system strengths and failure modes.

Pre-validated metrics and reporting

Assessment metrics and audit traces that are pre-validated with market surveillance authorities and certification bodies.

Transparent documentation

Auto-generated PDF reports covering the entire assessment, or JSON exports with the complete audit traces prioritized by criticality.
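As a rough idea of what a criticality-prioritized JSON export could look like, here is a small sketch. All field names (`test_id`, `metric`, `criticality`, etc.) are illustrative assumptions, not the platform's actual export schema.

```python
import json

# Hypothetical findings from an assessment run; the schema below is
# invented for illustration only.
findings = [
    {"test_id": "fairness/parity-04", "metric": "demographic_parity_gap",
     "value": 0.02, "criticality": "low"},
    {"test_id": "robustness/occlusion-17", "metric": "accuracy_drop",
     "value": 0.31, "criticality": "high"},
    {"test_id": "performance/latency-09", "metric": "p99_latency_ms",
     "value": 480, "criticality": "medium"},
]

# Sort so the most critical findings come first in the export.
order = {"high": 0, "medium": 1, "low": 2}
findings.sort(key=lambda f: order[f["criticality"]])

export = json.dumps({"assessment": "demo-run", "findings": findings}, indent=2)
```

A machine-readable export like this lets an auditor filter or diff findings programmatically, complementing the human-readable PDF report.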


Start working towards
certification of your AI system
