Real-time Hand Motion Tracking from a Single RGBD Camera

Day - Time: 29 June 2015, 11:00
Place: Area della Ricerca CNR di Pisa - Room: C-29
Speakers
  • Andrea Tagliasacchi (École Polytechnique Fédérale de Lausanne)
Referent

Paolo Cignoni

Abstract

We present a robust method for capturing articulated hand motions in real time using a single depth camera. Our system is based on a real-time registration process that accurately reconstructs hand poses by fitting a 3D articulated hand model to depth images. We register the hand model using depth, silhouette, and temporal information. To effectively map low-quality depth maps to realistic hand poses, we regularize the registration with kinematic and temporal priors, as well as a data-driven prior built from a database of realistic hand poses. We present a principled way of integrating these priors into our registration optimization to enable robust tracking without severely restricting the freedom of motion. A core technical contribution is a new method for computing tracking correspondences that directly models occlusions typical of single-camera setups. To ensure reproducibility of our results and facilitate future research, we fully disclose the source code of our implementation.
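The abstract describes the registration as a fit of an articulated hand model to each depth frame, with a data term (depth and silhouette) regularized by temporal, kinematic, and data-driven pose priors. As a rough illustration only, the sketch below shows how such a weighted energy might be assembled; the function names, weights, and the PCA form of the pose prior are assumptions for illustration and are not taken from the authors' released code.

    # Hypothetical sketch of a weighted registration energy of the kind
    # described in the abstract. All names and term forms are illustrative
    # assumptions, not the authors' implementation.
    import numpy as np

    def depth_term(model_pts, data_pts):
        # Sum of squared distances between model points and their
        # corresponding depth-map points.
        return np.sum((model_pts - data_pts) ** 2)

    def silhouette_term(outside_dist):
        # Penalize model parts that project outside the observed 2D
        # silhouette (outside_dist: per-pixel distances to the silhouette).
        return np.sum(outside_dist ** 2)

    def temporal_prior(theta, theta_prev, theta_prev2):
        # Constant-velocity smoothness on the joint-angle trajectory.
        return np.sum((theta - 2.0 * theta_prev + theta_prev2) ** 2)

    def kinematic_prior(theta, lo, hi):
        # Soft joint-limit penalty: zero inside [lo, hi], quadratic outside.
        v = np.maximum(lo - theta, 0.0) + np.maximum(theta - hi, 0.0)
        return np.sum(v ** 2)

    def pose_prior(theta, mean_pose, basis):
        # Data-driven prior (assumed PCA form): distance of the pose to a
        # low-dimensional subspace learned from realistic hand poses.
        d = theta - mean_pose
        residual = d - basis @ (basis.T @ d)
        return np.sum(residual ** 2)

    def registration_energy(theta, frame, w):
        # Weighted sum of the data terms and the regularizing priors.
        return (w["depth"] * depth_term(frame["model_pts"], frame["data_pts"])
                + w["sil"] * silhouette_term(frame["outside_dist"])
                + w["temp"] * temporal_prior(theta, frame["prev"], frame["prev2"])
                + w["kin"] * kinematic_prior(theta, frame["lo"], frame["hi"])
                + w["pose"] * pose_prior(theta, frame["mean"], frame["basis"]))

In the actual system this energy would be minimized per frame, with correspondences recomputed in an occlusion-aware way as described in the abstract; the sketch only shows how the competing terms could be balanced by scalar weights.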

Biography: Andrea Tagliasacchi is an assistant professor at the University of Victoria (BC, Canada). Before joining UVic, he was a postdoctoral researcher in the Graphics and Geometry Laboratory at EPFL. Andrea obtained his M.Sc. (cum laude, gold medal in ENCS) in digital signal processing from Politecnico di Milano. He completed his Ph.D. in 2013 at Simon Fraser University as an NSERC Alexander Graham Bell scholar. His doctoral research at SFU focused on digital geometry processing (skeletal representations and surface reconstruction). His recent interests include real-time registration and modeling, with applications to augmented reality and human-machine interaction.
