Past Projects

Here are some of our group's past projects, to give you an idea of the research we do.

Learning and Manipulation of Articulated Objects

We are investigating how robots can interact with articulated objects such as drawers, doors, cabinets, and boxes. Not all articulations are obvious from visual inspection, such as the cabinet on the left. In these cases, the robot must fuse learned, vision-based priors with kinematic and force-based sensing to discover the articulation during interaction.
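One common way to discover an articulation from interaction is to fit candidate kinematic models to the observed motion and keep the one that fits best. Below is a minimal, hypothetical sketch (illustrative only, not our actual pipeline): given 2D handle positions recorded while pulling, it compares a line fit (prismatic joint, e.g. a drawer) against a circle fit (revolute joint, e.g. a door) by their residuals.

```python
import numpy as np

def fit_line_residual(pts):
    """RMS distance of the points to their best-fit line (PCA)."""
    centered = pts - pts.mean(axis=0)
    _, s, _ = np.linalg.svd(centered, full_matrices=False)
    # smallest singular value captures the out-of-line spread
    return s[-1] / np.sqrt(len(pts))

def fit_circle_residual(pts):
    """RMS distance of the points to an algebraic (Kasa) circle fit."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    b = x**2 + y**2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = sol[0], sol[1]
    r = np.sqrt(max(sol[2] + cx**2 + cy**2, 0.0))
    d = np.hypot(x - cx, y - cy) - r
    return np.sqrt(np.mean(d**2))

def classify_articulation(pts):
    """Pick the joint type whose motion model explains the data better."""
    if fit_line_residual(pts) < fit_circle_residual(pts):
        return "prismatic"
    return "revolute"
```

A real system would also estimate the joint axis and handle noise and partial observability; this only shows the model-selection idea.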


Deep Inertial Estimation

We have investigated the application of deep learning to Inertial Measurement Units (IMUs). We can train neural networks to learn motion models of humans or legged robots from IMU measurements (acceleration and rotation rate) and then infer motion from them. These learned estimates can be fused with classical IMU integration and other odometry sources.
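As a point of reference for what the learned models are fused with, the sketch below shows classical dead reckoning by direct integration of IMU measurements, reduced to a planar (2D) case for brevity. The names are illustrative; a real system runs in 3D and compensates for gravity.

```python
import numpy as np

def integrate_imu_2d(accels, yaw_rates, dt, v0, theta0=0.0):
    """Classical planar dead reckoning: integrate body-frame
    accelerations and yaw rate into a world-frame trajectory.
    (Illustrative; a real system works in 3D and removes gravity.)"""
    theta = theta0
    v = np.asarray(v0, dtype=float).copy()
    p = np.zeros(2)
    traj = [p.copy()]
    for a_body, w in zip(accels, yaw_rates):
        theta += w * dt                      # integrate rotation rate
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])      # body-to-world rotation
        v += R @ np.asarray(a_body) * dt     # integrate acceleration
        p = p + v * dt                       # integrate velocity
        traj.append(p.copy())
    return np.array(traj)
```

Because measurement errors accumulate at every step, raw integration drifts quickly, which is why fusing it with learned motion priors and other odometry sources helps.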

We have also explored the use of deep learning to predict the bias process of IMUs. This can improve visual-inertial odometry in visually challenging situations such as darkness or confined spaces.
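IMU bias is often modeled as a slowly varying stochastic process, for example a first-order Gauss-Markov process; a bias predictor tries to estimate this signal so it can be subtracted before integration. The sketch below uses illustrative parameters (not our actual model) to simulate such a bias on a gyro and show that subtracting a bias estimate removes the heading drift it causes.

```python
import numpy as np

def gauss_markov_bias(n, dt, tau=200.0, sigma=0.005, seed=0):
    """Simulate a first-order Gauss-Markov gyro bias (rad/s),
    a common stochastic model for slowly varying IMU bias.
    (Illustrative parameters, not fit to any particular IMU.)"""
    rng = np.random.default_rng(seed)
    b = np.zeros(n)
    phi = np.exp(-dt / tau)                  # step-to-step correlation
    for k in range(1, n):
        b[k] = phi * b[k - 1] + sigma * np.sqrt(dt) * rng.standard_normal()
    return b

def heading_drift(gyro_meas, bias_estimate, dt):
    """Integrated heading error after subtracting a bias estimate."""
    return np.sum(gyro_meas - bias_estimate) * dt

n, dt = 2000, 0.01
bias = gauss_markov_bias(n, dt)
gyro = bias                                      # true rotation rate is zero
drift_raw = heading_drift(gyro, np.zeros(n), dt)
drift_corrected = heading_drift(gyro, bias, dt)  # perfect bias prediction
```

With a perfect bias estimate the drift vanishes; a learned predictor lands somewhere between the two extremes.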


Haptic Localization

Legged robots have the unique ability to use their legs not only for locomotion but also to sense their environment. This information can be used to localize the robot when vision sensors fail, such as in a dark and dusty mine. We have shown different ways to localize a robot with haptic sensing, including geometric sensing, semantic terrain classification, and learned terrain representations.
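One way to make the terrain-classification variant concrete is a discrete Bayes filter over positions, where the only measurement is the terrain class the feet detect. The sketch below is an illustrative toy, not our actual method: a robot walks along a circular corridor whose terrain map is known, and the filter narrows down where it is from what it feels underfoot.

```python
import numpy as np

def haptic_localize(terrain_map, observations, p_correct=0.9):
    """Discrete Bayes filter over the cells of a circular corridor.
    Each step the robot moves one cell forward (known odometry) and
    senses the terrain class underfoot with some error rate."""
    n = len(terrain_map)
    belief = np.full(n, 1.0 / n)             # start fully uncertain
    for obs in observations:
        belief = np.roll(belief, 1)          # motion update: +1 cell
        likelihood = np.where(terrain_map == obs, p_correct, 1.0 - p_correct)
        belief *= likelihood                 # measurement update
        belief /= belief.sum()               # renormalize
    return belief

# Toy map: 0 = rock, 1 = gravel. The robot starts at cell 2, then
# steps through cells 3, 4, 5, 6 and senses the terrain at each.
corridor = np.array([0, 0, 1, 0, 1, 1, 0, 1])
belief = haptic_localize(corridor, observations=[0, 1, 1, 0])
```

After four steps this observation sequence occurs only once in the map, so the belief peaks at the true cell. Replacing the discrete classes with learned terrain embeddings follows the same filtering pattern.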