University of Surrey


Joint Group Feature Selection and Discriminative Filter Learning for Robust Visual Object Tracking

Xu, Tianyang, Feng, Zhenhua, Wu, Xiao-Jun and Kittler, Josef (2019) Joint Group Feature Selection and Discriminative Filter Learning for Robust Visual Object Tracking. In: IEEE/CVF International Conference on Computer Vision (ICCV 2019), 27 Oct - 02 Nov 2019.

Joint Group Feature Selection and Discriminative Filter Learning for Robust Visual Object Tracking.pdf - Accepted Manuscript (1MB)

Abstract

We propose a new Group Feature Selection method for Discriminative Correlation Filters (GFS-DCF) based visual object tracking. The key innovation of the proposed method is to perform group feature selection across both channel and spatial dimensions, thereby pinpointing the structural relevance of multi-channel features to the filtering system. In contrast to the widely used spatial regularisation or feature selection methods, to the best of our knowledge, this is the first time that channel selection has been advocated for DCF-based tracking. We demonstrate that our GFS-DCF method significantly improves the performance of a DCF tracker equipped with deep neural network features. In addition, GFS-DCF enables joint feature selection and filter learning, achieving enhanced discrimination and interpretability of the learned filters.
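The channel selection idea above can be sketched with a group-sparsity proximal step: channels whose overall energy falls below a threshold are pruned entirely. The following is a minimal NumPy illustration of group soft-thresholding under an l2,1 penalty over channels, not the paper's actual optimisation; the filter shapes, the parameter `lam`, and the toy data are assumptions for illustration.

```python
import numpy as np

def group_soft_threshold(W, lam):
    """Channel-wise group soft-thresholding (proximal operator of the
    l2,1 norm over channels). W has shape (H, Wd, C). Channels whose
    l2 norm falls below lam are zeroed, i.e. deselected; the rest are
    shrunk, implementing structured channel selection."""
    out = np.zeros_like(W)
    for c in range(W.shape[-1]):
        norm = np.linalg.norm(W[:, :, c])
        if norm > lam:
            out[:, :, c] = (1.0 - lam / norm) * W[:, :, c]
    return out

# Toy filter: channel 0 carries signal, channel 1 is near-zero noise.
rng = np.random.default_rng(0)
W = np.stack([rng.normal(0, 1.0, (8, 8)),
              rng.normal(0, 0.01, (8, 8))], axis=-1)
W_sel = group_soft_threshold(W, lam=1.0)
print(np.linalg.norm(W_sel[:, :, 0]) > 0)   # informative channel kept
print(np.linalg.norm(W_sel[:, :, 1]) == 0)  # weak channel pruned entirely
```

In practice the same operator extends to the spatial dimension by grouping filter coefficients per spatial location instead of per channel.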

To further improve performance, we adaptively integrate historical information by constraining the filters to vary smoothly across temporal frames, using an efficient low-rank approximation. By design, specific temporal-spatial-channel configurations are learned dynamically during tracking, highlighting the relevant features, mitigating the impact of less discriminative representations, and reducing information redundancy. Experimental results on OTB2013, OTB2015, VOT2017, VOT2018 and TrackingNet demonstrate the merits of GFS-DCF and its superiority over state-of-the-art trackers. The code is publicly available at https://github.com/XU-TIANYANG/GFS-DCF.
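The low-rank temporal constraint can likewise be illustrated by projecting a stack of per-frame filters onto its best rank-r approximation via truncated SVD, which suppresses frame-to-frame noise while preserving slow drift. This is a minimal sketch assuming vectorised filters and synthetic data; it does not reproduce the paper's efficient approximation scheme.

```python
import numpy as np

def low_rank_smooth(filters, rank):
    """Approximate a stack of per-frame filters with a rank-`rank`
    matrix via truncated SVD. `filters` has shape (T, D): T frames,
    D vectorised filter coefficients per frame."""
    U, s, Vt = np.linalg.svd(filters, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

# Synthetic filter trajectory: a fixed base plus slow drift plus noise,
# so the clean signal is (approximately) rank 2.
rng = np.random.default_rng(1)
base = rng.normal(size=(1, 16))
drift = np.linspace(0.0, 0.1, 10)[:, None] * rng.normal(size=(1, 16))
noise = 0.01 * rng.normal(size=(10, 16))
F = base + drift + noise
F_smooth = low_rank_smooth(F, rank=2)  # temporally smoothed filters
```

The truncation keeps the shared structure across frames while discarding per-frame perturbations, which is the effect the temporal smoothness constraint is designed to achieve.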

Item Type: Conference or Workshop Item (Conference Paper)
Divisions : Faculty of Engineering and Physical Sciences > Electronic Engineering
Authors :
Xu, Tianyang (tianyang.xu@surrey.ac.uk)
Feng, Zhenhua (z.feng@surrey.ac.uk)
Wu, Xiao-Jun
Kittler, Josef (J.Kittler@surrey.ac.uk)
Date : 2019
Funders : Engineering and Physical Sciences Research Council (EPSRC)
Grant Title : Programme Grant (FACER2VM)
Copyright Disclaimer : © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Related URLs :
Depositing User : Clive Harris
Date Deposited : 09 Aug 2019 09:43
Last Modified : 27 Oct 2019 02:08
URI: http://epubs.surrey.ac.uk/id/eprint/852386
