University of Surrey


Texture-Independent Long-Term Tracking Using Virtual Corners

Lebeda, K, Hadfield, SJ and Bowden, R (2015) Texture-Independent Long-Term Tracking Using Virtual Corners. IEEE Transactions on Image Processing, 25 (1). pp. 359-371.

TIP2015.pdf - Accepted version (post-print)
Available under License : See the attached licence file.



Long-term tracking of an object, given only a single instance in an initial frame, remains an open problem. We propose a visual tracking algorithm, robust to many of the difficulties which often occur in real-world scenes. Correspondences of edge-based features are used to overcome the reliance on the texture of the tracked object and to improve invariance to lighting. Furthermore, we address long-term stability, enabling the tracker to recover from drift and to provide redetection following object disappearance or occlusion. The two-module principle is similar to the successful state-of-the-art long-term TLD tracker; however, our approach offers better performance in benchmarks and extends to cases of low-textured objects. This becomes obvious in cases of plain objects with no texture at all, where the edge-based approach proves the most beneficial. We perform several different experiments to validate the proposed method. Firstly, results on short-term sequences show the performance of tracking challenging (low-textured and/or transparent) objects which represent failure cases for competing state-of-the-art approaches. Secondly, long sequences are tracked, including one of almost 30,000 frames, which to our knowledge is the longest tracking sequence reported to date. This tests the redetection and drift-resistance properties of the tracker. Finally, we report results of the proposed tracker on the VOT Challenge 2013 and 2014 datasets as well as on the VTB1.0 benchmark, and we show the relative performance of the tracker compared to its competitors. All the results are comparable to the state-of-the-art on sequences with textured objects and superior on non-textured objects. The new annotated sequences are made publicly available.
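The two-module principle mentioned in the abstract (a short-term tracker paired with a redetection module that recovers the target after occlusion or disappearance) can be illustrated with a toy 1-D sketch. This is a hypothetical simplification for illustration only, not the authors' algorithm: `match_score`, `track_long_term`, and the confidence threshold are invented names, and template matching stands in for the paper's edge-based correspondences.

```python
def match_score(frame, pos, template):
    """Similarity between the template and the patch at `pos` (higher is better)."""
    patch = frame[pos:pos + len(template)]
    if len(patch) < len(template):
        return float("-inf")  # patch runs off the end of the frame
    return -sum(abs(a - b) for a, b in zip(patch, template))

def track_long_term(frames, template, init_pos, radius=2, conf_threshold=-1.0):
    """Return one estimated position per frame; None marks frames where the target is lost."""
    pos, estimates = init_pos, []
    for frame in frames:
        if pos is not None:
            # Short-term module: search locally around the previous estimate.
            candidates = range(max(0, pos - radius), pos + radius + 1)
        else:
            # Redetection module: exhaustively rescan the whole frame.
            candidates = range(len(frame) - len(template) + 1)
        best = max(candidates, key=lambda p: match_score(frame, p, template))
        if match_score(frame, best, template) >= conf_threshold:
            pos = best
        else:
            pos = None  # low confidence: report loss, redetect next frame
        estimates.append(pos)
    return estimates

# Target disappears in frame 3 and reappears far away in frame 4,
# so local search alone would fail and redetection is needed.
template = [9, 9]
frames = [
    [0, 9, 9, 0, 0, 0],   # target at index 1
    [0, 0, 9, 9, 0, 0],   # moves to index 2
    [0, 0, 0, 0, 0, 0],   # occluded: tracker reports loss
    [0, 0, 0, 0, 9, 9],   # reappears at index 4: redetection recovers it
]
print(track_long_term(frames, template, init_pos=1))  # → [1, 2, None, 4]
```

The key design point this toy mirrors is that loss is detected by thresholding a confidence score, which switches the tracker from cheap local search to a global scan until the target is found again.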

Item Type: Article
Divisions : Faculty of Engineering and Physical Sciences > Electronic Engineering > Centre for Vision Speech and Signal Processing
Authors :
Lebeda, K
Hadfield, SJ
Bowden, R
Date : 2 November 2015
DOI : 10.1109/TIP.2015.2497141
Additional Information : Copyright 2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
Depositing User : Symplectic Elements
Date Deposited : 23 Dec 2015 15:08
Last Modified : 31 Oct 2017 17:53



