Uncertainty Quantification

Estimating the confidence of a robot system's predictions or decisions, and distinguishing epistemic uncertainty (reducible, arising from limited data) from aleatoric uncertainty (irreducible randomness in the environment or sensors). Common techniques include Bayesian neural networks, Monte Carlo dropout, deep ensembles, and conformal prediction. An uncertainty-aware robot can defer to a human operator when its confidence is low.
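Deep ensembles, for example, estimate epistemic uncertainty from the disagreement among independently trained models. A minimal sketch of the idea, where the toy "models", the threshold, and all function names are illustrative stand-ins rather than any particular library's API:

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_predict(x, n_members=5):
    """Predictions from each ensemble member for input x.

    Hypothetical stand-in: each 'member' perturbs a ground-truth
    function; in practice these are separately trained networks.
    """
    return np.array([np.sin(x) + rng.normal(0, 0.1) for _ in range(n_members)])

def predict_with_uncertainty(x):
    preds = ensemble_predict(x)
    mean = preds.mean()  # ensemble mean serves as the prediction
    std = preds.std()    # member disagreement proxies epistemic uncertainty
    return mean, std

# Uncertainty-aware decision: ask for help when confidence is low.
CONFIDENCE_THRESHOLD = 0.25  # assumed value; tuning is task-dependent

mean, std = predict_with_uncertainty(1.0)
if std > CONFIDENCE_THRESHOLD:
    print("Low confidence: requesting human assistance")
else:
    print(f"Acting autonomously (prediction={mean:.2f}, std={std:.2f})")
```

The same decision rule applies to the other methods listed above; only the source of the uncertainty estimate changes (e.g. multiple stochastic forward passes for Monte Carlo dropout).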

Robot Learning, Safety