Debugging Machine Learning Models
Debug ML models to understand failures and improve transparency.
Workshop and toolkit for diagnosing machine learning model failures and improving interpretability. Helps teams identify why models underperform, detect potential biases, and document model behavior for compliance. Used by ML engineers, data scientists, and governance teams that need transparency into model decisions.
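One common way to diagnose why a model underperforms, as described above, is slice-based error analysis: comparing error rates across subgroups of the evaluation data to surface hidden failure modes or potential biases. The sketch below is illustrative only; the record fields (`region`, `label`, `prediction`) and the data are hypothetical, not part of any specific toolkit.

```python
# Minimal sketch of slice-based error analysis (hypothetical data and
# field names). An aggregate metric can hide a subgroup failure; per-slice
# error rates make it visible.
from collections import defaultdict

def error_rate_by_slice(records, slice_key):
    """Group evaluation records by a feature value and return the
    error rate of each slice, so underperforming subgroups stand out."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for rec in records:
        key = rec[slice_key]
        totals[key] += 1
        if rec["prediction"] != rec["label"]:
            errors[key] += 1
    return {k: errors[k] / totals[k] for k in totals}

# Hypothetical evaluation records: overall accuracy looks acceptable,
# but errors concentrate in one region.
records = [
    {"region": "north", "label": 1, "prediction": 1},
    {"region": "north", "label": 0, "prediction": 0},
    {"region": "north", "label": 1, "prediction": 1},
    {"region": "south", "label": 1, "prediction": 0},
    {"region": "south", "label": 0, "prediction": 1},
    {"region": "south", "label": 1, "prediction": 1},
]

rates = error_rate_by_slice(records, "region")
print(rates)  # north: 0.0, south: ~0.67
```

A per-slice breakdown like this is also a natural artifact to attach to compliance documentation, since it records where the model was and was not reliable at evaluation time.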
Adjacent tooling.
AI Trust Services (KPMG)
KPMG's trusted AI framework for governance, risk, and compliance.
Aporia
Monitor, test, and safeguard LLMs in production with observability and guardrails.
Dataiku EU AI Act Readiness
Platform helping organizations assess and manage EU AI Act compliance risks.
DataRobot
Real-time AI governance, monitoring, and compliance platform for enterprises.
Earthian AI
Enterprise risk management platform purpose-built for AI systems.
IBM watsonx.governance
Unified AI governance platform for model lifecycle management and compliance tracking.