University of Surrey


Rectified Wing Loss for Efficient and Robust Facial Landmark Localisation with Convolutional Neural Networks.

Feng, Zhenhua, Kittler, Josef, Awais, Muhammad and Wu, Xiao-Jun (2019) Rectified Wing Loss for Efficient and Robust Facial Landmark Localisation with Convolutional Neural Networks. International Journal of Computer Vision. pp. 1-20.

IJCV_RWing.pdf - Accepted version manuscript (3 MB). Restricted to repository staff only until 18 December 2020.

Abstract

Efficient and robust facial landmark localisation is crucial for the deployment of real-time face analysis systems. This paper presents a new loss function, namely the Rectified Wing (RWing) loss, for regression-based facial landmark localisation with Convolutional Neural Networks (CNNs). We first systematically analyse different loss functions, including L2, L1 and smooth L1. The analysis suggests that the training of a network should pay more attention to small and medium errors. Motivated by this finding, we design a piece-wise loss that amplifies the impact of samples with small-medium errors. In addition, we rectify the loss function for very small errors to mitigate the impact of inaccurate manual annotation. The RWing loss significantly boosts the performance of regression-based CNNs for facial landmarking, especially for lightweight network architectures. To address the under-representation of samples with large pose variations, we propose a simple but effective boosting strategy, referred to as pose-based data balancing. Specifically, we deal with the data imbalance problem by duplicating the minority training samples and perturbing them with random image rotation, bounding box translation and other data augmentation strategies. Finally, the proposed approach is extended to a coarse-to-fine framework for robust and efficient landmark localisation, which also deals effectively with the small sample size problem. Experimental results on several well-known benchmark datasets demonstrate the merits of the RWing loss and the superiority of the proposed method over state-of-the-art approaches.
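The abstract characterises the RWing loss only qualitatively: flat for very small errors (so annotation noise is ignored), amplified for small-medium errors, and well behaved for large ones. The Python sketch below shows one piece-wise loss with these properties, assuming a rectification threshold r, a logarithmic region of width w in the spirit of the authors' earlier Wing loss, and a linear tail; the parameter values are illustrative defaults, not the values tuned in the paper.

import numpy as np

def rwing_loss(errors, r=0.5, w=10.0, epsilon=2.0):
    """Sketch of an RWing-style loss over landmark regression errors.

    The exact parameterisation is in the full paper; this version is merely
    consistent with the abstract: zero below the rectification threshold r,
    logarithmic (amplified) for small-medium errors, linear for large ones.
    """
    x = np.abs(errors)
    # C makes the logarithmic and linear pieces join continuously at |x| = w.
    C = w - w * np.log1p((w - r) / epsilon)
    shifted = np.maximum(x - r, 0.0)      # rectification: errors below r contribute nothing
    loss = np.where(
        x < w,
        w * np.log1p(shifted / epsilon),  # log region amplifies small-medium errors
        x - C,                            # linear region stays robust to outliers
    )
    return loss.mean()

At |x| = w the two pieces meet exactly (w * log1p((w - r) / epsilon) = w - C), and the zero plateau below r stops noisy sub-pixel annotations from dominating the gradient.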
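The pose-based data balancing step can be sketched in the same spirit: bin the training set by head pose and oversample minority bins with perturbed duplicates. The pose_bins key and the augment callable below (random rotation, bounding-box translation, etc.) are assumed interfaces for illustration, not the paper's exact procedure.

import random
from collections import defaultdict

def pose_balanced_oversample(samples, pose_bins, augment):
    """Duplicate samples from under-represented pose bins, perturbing each copy.

    samples:   list of training items
    pose_bins: parallel list of discrete pose-bin indices (e.g. quantised yaw)
    augment:   callable applying a random perturbation (rotation, box shift, ...)
    """
    by_bin = defaultdict(list)
    for sample, b in zip(samples, pose_bins):
        by_bin[b].append(sample)
    target = max(len(items) for items in by_bin.values())  # size of the majority bin
    balanced = []
    for items in by_bin.values():
        balanced.extend(items)
        # Duplicate-and-perturb until this bin matches the majority bin.
        for _ in range(target - len(items)):
            balanced.append(augment(random.choice(items)))
    return balanced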

Item Type: Article
Divisions: Faculty of Engineering and Physical Sciences > Electronic Engineering
Authors:
    Feng, Zhenhua (z.feng@surrey.ac.uk)
    Kittler, Josef (J.Kittler@surrey.ac.uk)
    Awais, Muhammad
    Wu, Xiao-Jun
Date: 17 December 2019
Funders: Engineering and Physical Sciences Research Council (EPSRC)
DOI: 10.1007/s11263-019-01275-0
Grant Title: FACER2VM
Copyright Disclaimer: Copyright © 2019, Springer Nature. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Uncontrolled Keywords: Facial landmark localisation; Deep convolutional neural networks; Rectified Wing Loss; Pose-based data balancing; Coarse-to-fine networks
Depositing User: James Marshall
Date Deposited : 24 Jan 2020 13:55
Last Modified : 27 Apr 2020 11:02
URI: http://epubs.surrey.ac.uk/id/eprint/853396



