Conventional robot control relies on cameras or LiDAR, but these sensors are highly vulnerable to environmental factors such as lighting changes, smoke, fog, occlusion, and clutter. It is therefore essential to have alternative sensing modalities when vision sensors are rendered ineffective.
This project performs object classification within a candidate set using a robotic arm in vision-denied environments: tactile exploration first estimates an object's position and size, and acoustic perception then collects impact sounds for classification.
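The fusion of the two modalities can be sketched as follows. This is a minimal illustrative example, not the project's actual implementation: the candidate set, its size and spectral-centroid values, and the equal weighting of the two modalities are all assumptions made here for demonstration.

```python
import numpy as np

# Hypothetical candidate set: each object has an assumed characteristic
# size (cm) and a reference impact-sound spectral centroid (Hz).
CANDIDATES = {
    "mug":    {"size_cm": 9.0,  "centroid_hz": 1800.0},
    "bottle": {"size_cm": 22.0, "centroid_hz": 3200.0},
    "box":    {"size_cm": 15.0, "centroid_hz": 900.0},
}

def spectral_centroid(signal, sample_rate):
    """Spectral centroid of an impact sound (a simple acoustic feature)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(np.sum(freqs * spectrum) / np.sum(spectrum))

def classify(measured_size_cm, impact_signal, sample_rate):
    """Fuse the tactile size estimate with the acoustic feature and
    return the candidate with the lowest combined mismatch."""
    centroid = spectral_centroid(impact_signal, sample_rate)

    def cost(props):
        # Normalized distance in each modality, equally weighted
        # (normalization constants are illustrative assumptions).
        return (abs(props["size_cm"] - measured_size_cm) / 30.0
                + abs(props["centroid_hz"] - centroid) / 4000.0)

    return min(CANDIDATES, key=lambda name: cost(CANDIDATES[name]))
```

For example, a tactile size estimate of 10 cm together with an impact sound whose energy is concentrated near 1800 Hz would be matched to the "mug" candidate under these assumed reference values.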
