Adversarial Robustness

The ability of a robot's perception or policy system to maintain correct behavior under deliberately crafted adversarial inputs, such as sensor perturbations, manipulated camera images, or physical adversarial patches placed in the environment. Adversarial robustness is a safety concern for autonomous systems deployed in environments where malicious actors may attempt to fool the robot's perception.
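As an illustration, the sketch below shows a minimal fast gradient sign method (FGSM) attack (Goodfellow et al., 2015) in PyTorch: the input image is perturbed in the direction that most increases the classifier's loss. The model, image size, and class count here are placeholders, not any particular robot stack.

```python
import torch
import torch.nn as nn

# Placeholder perception model: any differentiable image classifier works here.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),  # e.g., 10 object classes (assumed for the example)
)
model.eval()

def fgsm_attack(model, image, label, epsilon=0.03):
    """FGSM: take one step of size epsilon in the sign of the loss gradient
    with respect to the input, producing an adversarial image."""
    image = image.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(image), label)
    loss.backward()
    # Perturb each pixel by at most epsilon, then keep values in [0, 1].
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

# Dummy camera frame and label, just to exercise the function.
frame = torch.rand(1, 3, 64, 64)
label = torch.tensor([2])
adv_frame = fgsm_attack(model, frame, label)

# A robust perception system should predict consistently on both inputs.
print(model(frame).argmax(dim=1), model(adv_frame).argmax(dim=1))
```

Defenses such as adversarial training fold perturbed examples like `adv_frame` back into the training set so the model learns to classify them correctly.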

Robot Learning · Safety · Vision
