EgoLocate. In SIGGRAPH 2023.
Christian Theobalt

Published on May 30, 2023

Paper Abstract:
Human and environment sensing are two important topics in Computer Vision and Graphics. Human motion is often captured by inertial sensors (left), while the environment is mostly reconstructed using cameras (right). We integrate the two techniques in EgoLocate (middle), a system that simultaneously performs human motion capture (mocap), localization, and mapping in real time from sparse body-mounted sensors: 6 inertial measurement units (IMUs) and a monocular phone camera. On one hand, inertial mocap suffers from large translation drift due to the lack of a global positioning signal. EgoLocate leverages image-based simultaneous localization and mapping (SLAM) to localize the human in the reconstructed scene. On the other hand, SLAM often fails when visual features are poor. EgoLocate incorporates inertial mocap to provide a strong prior for the camera motion. Experiments show that localization, a key challenge in both fields, is largely improved by our technique compared with the state of the art in each field.
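
For intuition only, below is a minimal conceptual sketch in Python of how an inertial-mocap prediction of the camera's motion could serve as a prior when visual tracking degrades. The function names, confidence weighting, and linear blend are illustrative assumptions and do not reproduce the EgoLocate formulation.

import numpy as np

# Hypothetical illustration: blend a camera translation predicted from
# body-mounted IMUs (drift-prone but always available) with the estimate
# from monocular SLAM (accurate w.r.t. the map, but unreliable when
# visual features are poor).

def fuse_camera_translation(mocap_t, slam_t, slam_confidence):
    """Weighted blend of the mocap prior and the SLAM estimate.

    slam_confidence in [0, 1]: 1 = strong visual tracking, 0 = no usable features.
    """
    w = float(np.clip(slam_confidence, 0.0, 1.0))
    return w * slam_t + (1.0 - w) * mocap_t

if __name__ == "__main__":
    mocap_t = np.array([1.02, 0.00, 3.10])  # translation predicted by inertial mocap
    slam_t = np.array([1.00, 0.01, 3.05])   # translation estimated by monocular SLAM
    print(fuse_camera_translation(mocap_t, slam_t, slam_confidence=0.8))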


Reference Publication:
X. Yi, Y. Zhou, M. Habermann, V. Golyanik, S. Pan, C. Theobalt, F. Xu. EgoLocate: Real-time Motion Capture, Localization, and Mapping with Sparse Body-mounted Sensors. In SIGGRAPH, 2023.


Project Page:
https://xinyu-yi.github.io/EgoLocate/#
