University of Surrey


Go With The Flow: Hand Trajectories in 3D via Clustered Scene Flow

Hadfield, S and Bowden, R (2012) Go With The Flow: Hand Trajectories in 3D via Clustered Scene Flow. In: ICIAR 2012, 2012-06-25 - 2012-06-27, Aveiro, Portugal.

Hadfield_ICIAR_2012.pdf - Submitted version (pre-print)
Available under License: See the attached licence file.



Tracking hands and estimating their trajectories is useful in a number of tasks, including sign language recognition and human-computer interaction. Hands are extremely difficult objects to track: their deformability, frequent self-occlusions and motion blur cause appearance variations too great for most standard object trackers to handle robustly. In this paper, the 3D motion field of a scene (known as the Scene Flow, in contrast to Optical Flow, which is its projection onto the image plane) is estimated using a recently proposed algorithm inspired by particle filtering. Unlike previous techniques, this scene flow algorithm does not introduce blurring across discontinuities, making it far more suitable for object segmentation and tracking. Additionally, the algorithm operates several orders of magnitude faster than previous scene flow estimation systems, enabling the use of Scene Flow in real-time and near real-time applications. A novel approach to trajectory estimation is then introduced, based on clustering the estimated scene flow field in both the space and velocity dimensions. This allows estimation of object motions in the true 3D scene, rather than the traditional approach of estimating 2D image-plane motions. By working in the scene space rather than the image plane, the constant-velocity assumption commonly used in the prediction stage of trackers is far more valid, and the resulting motion estimate is richer, providing information on out-of-plane motions. To evaluate the performance of the system, 3D trajectories are estimated on a multi-view sign-language dataset and compared against a traditional high-accuracy 2D system, with excellent results.
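The joint space-and-velocity clustering idea from the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: it groups per-point 3D positions and 3D velocities (the scene flow) in a combined six-dimensional feature space using mean-shift as a stand-in clustering method, and the function name cluster_scene_flow, the scale factors and the bandwidth value are hypothetical choices for illustration only. Each resulting cluster is summarised by a mean position and mean velocity, which is the kind of per-object 3D motion estimate a constant-velocity tracker could consume.

import numpy as np
from sklearn.cluster import MeanShift

def cluster_scene_flow(points, velocities, spatial_scale=1.0,
                       velocity_scale=5.0, bandwidth=0.2):
    """Cluster a scene flow field jointly in space and velocity.

    points     : (N, 3) array of 3D point positions (illustrative units)
    velocities : (N, 3) array of 3D velocities (the scene flow) at those points
    The scale factors weight position against velocity; the values here are
    assumptions, not taken from the paper.
    """
    # Build a 6D feature per point: scaled position and scaled velocity.
    features = np.hstack([points * spatial_scale, velocities * velocity_scale])

    # Mean-shift groups points that are close in both space and motion,
    # so a rigidly moving object (e.g. a hand) tends to form one cluster.
    ms = MeanShift(bandwidth=bandwidth).fit(features)
    labels = ms.labels_

    # Summarise each cluster by its mean 3D position and mean 3D velocity.
    clusters = []
    for k in np.unique(labels):
        mask = labels == k
        clusters.append({
            "centroid": points[mask].mean(axis=0),
            "velocity": velocities[mask].mean(axis=0),
            "size": int(mask.sum()),
        })
    return labels, clusters

if __name__ == "__main__":
    # Toy example: two point groups moving with different velocities.
    rng = np.random.default_rng(0)
    pts = np.vstack([rng.normal(0.0, 0.05, (50, 3)),
                     rng.normal(1.0, 0.05, (50, 3))])
    vel = np.vstack([np.tile([0.1, 0.0, 0.0], (50, 1)),
                     np.tile([0.0, -0.1, 0.0], (50, 1))])
    _, clusters = cluster_scene_flow(pts, vel)
    for c in clusters:
        print(c["size"], c["centroid"], c["velocity"])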

Item Type: Conference or Workshop Item (Conference Paper)
Divisions : Faculty of Engineering and Physical Sciences > Electronic Engineering > Centre for Vision Speech and Signal Processing
Authors : Hadfield, S and Bowden, R
Date : 27 July 2012
Identification Number : 10.1007/978-3-642-31295-3_34
Additional Information : The final publication is available at Springer via https://doi.org/10.1007/978-3-642-31295-3_34
Depositing User : Symplectic Elements
Date Deposited : 18 Nov 2015 10:19
Last Modified : 18 Nov 2015 10:19

