
Temporally Consistent Motion Segmentation from RGB-D Video

Computer Graphics Forum


Peter Bertholet (University of Bern)
Alexandru-Eugen Ichim (EPFL)
Matthias Zwicker (University of Maryland)

Abstract

Temporally consistent motion segmentation from RGB-D videos is challenging because of the limitations of current RGB-D sensors. We formulate segmentation as a motion assignment problem, where a motion is a sequence of rigid transformations through all frames of the input. We capture the quality of each potential assignment by defining an appropriate energy function that accounts for occlusions and a sensor-specific noise model. To make energy minimization tractable, we work with a discrete set of motions instead of the continuous, high-dimensional motion space; this discrete set provides an upper bound for the original energy. We repeatedly minimize our energy and, in each step, extend and refine the motion set to further lower the bound. A quantitative comparison to the current state of the art demonstrates the benefits of our approach in difficult scenarios.
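The alternation described above, assigning points to a discrete set of candidate motions and then refining those motions, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a simple point-to-point residual as the energy (the paper's full energy also models occlusions and sensor noise), uses a least-squares rigid fit (Kabsch algorithm) for the refinement step, and all function names are our own.

```python
import numpy as np

def fit_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def segment(src, dst, motions, iters=10):
    """Alternate discrete motion assignment and motion refinement.

    src, dst : (n, d) corresponding point sets in two frames.
    motions  : list of candidate (R, t) rigid motions.
    Returns per-point labels and the refined motion set.
    """
    labels = np.zeros(len(src), dtype=int)
    for _ in range(iters):
        # Energy of each point under each candidate motion
        # (here: simple point-to-point residual).
        res = np.stack([np.linalg.norm(src @ R.T + t - dst, axis=1)
                        for R, t in motions])
        labels = res.argmin(axis=0)        # discrete assignment step
        # Refine each motion on its assigned points; keep the old
        # motion if too few points support it.
        motions = [fit_rigid(src[labels == k], dst[labels == k])
                   if np.count_nonzero(labels == k) >= 3 else motions[k]
                   for k in range(len(motions))]
    return labels, motions
```

Extending the motion set, as in the paper, would amount to adding new candidate (R, t) pairs between iterations whenever some points remain poorly explained, which can only lower the bound further.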

 

Additional Material

Code and data: coming soon at https://github.com/bertholet/4DMSEG

Paper: coming soon
