University of Surrey


Human Action Recognition From Relative Motion.

Oshin, Olusegun Temitope. (2011) Human Action Recognition From Relative Motion. Doctoral thesis, University of Surrey (United Kingdom).

Available under License Creative Commons Attribution Non-commercial Share Alike.



The aim of this thesis is to develop discriminative and efficient representations of human actions in video for recognition. Human actions can be defined as sets of atomic events which occur in local and global regions of the video. Natural actions vary considerably in execution and capture conditions, so representations that are robust to these variations are desirable. A visual description of an action is often given in terms of the motion of body parts, and/or objects in the scene with which these parts interact. Therefore, this thesis presents approaches to the recognition of actions based solely on motion observed in video. Explicit appearance information is discarded, as the appearance of subjects varies significantly, especially in uncontrolled environments, while motion cues remain consistent. Moreover, psychology experiments using Point Light Displays have shown that detailed properties of human actions and actors can be observed from the dynamics of body movement alone. This motivates the presented approaches.

Motion in video can be summarised using highly informative spatio-temporal interest points. However, the selection of interesting motion regions can be computationally expensive. A novel interest point detector is therefore introduced, which provides a generic and efficient solution. Interest point detection is formulated as a classification problem: given examples of detected interest points, an approach is presented which emulates the functionality of any detector. Simple yet effective binary tests are employed in a naive Bayesian classifier, Randomised Ferns, to categorise local regions as motion or non-motion regions. Results show detections comparable to those of the emulated detectors, achieved in constant time, independent of the complexity of those detectors.
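The Randomised Ferns idea above can be sketched as follows: each fern is a small group of binary pixel-difference tests whose outcomes form a binary code, and class-conditional outcome distributions are combined naive-Bayes style across ferns. This is a minimal illustrative sketch, not the thesis implementation; the patch size, test count, and all names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class RandomisedFerns:
    """Semi-naive Bayesian classifier built from ferns: groups of binary
    voxel-difference tests over a spatio-temporal patch, used here to label
    patches as 'motion' (1) or 'non-motion' (0). Illustrative sketch only."""

    def __init__(self, n_ferns=10, n_tests=8, patch_shape=(8, 8, 4), n_classes=2):
        self.n_ferns, self.n_tests, self.n_classes = n_ferns, n_tests, n_classes
        n_vox = int(np.prod(patch_shape))
        # Each test compares two randomly chosen voxels of the patch.
        self.pairs = rng.integers(0, n_vox, size=(n_ferns, n_tests, 2))
        # Per-fern class-conditional counts over the 2**n_tests outcomes,
        # initialised to 1 (uniform Dirichlet / Laplace prior).
        self.counts = np.ones((n_ferns, n_classes, 2 ** n_tests))

    def _fern_indices(self, patch):
        flat = patch.ravel()
        bits = flat[self.pairs[:, :, 0]] > flat[self.pairs[:, :, 1]]
        # Turn each fern's test outcomes into a single binary code.
        return bits.astype(int) @ (1 << np.arange(self.n_tests))

    def train(self, patch, label):
        idx = self._fern_indices(patch)
        self.counts[np.arange(self.n_ferns), label, idx] += 1

    def predict(self, patch):
        idx = self._fern_indices(patch)
        probs = self.counts / self.counts.sum(axis=2, keepdims=True)
        # Naive-Bayes combination: sum log posteriors over independent ferns.
        log_post = np.log(probs[np.arange(self.n_ferns), :, idx]).sum(axis=0)
        return int(np.argmax(log_post))
```

Because a patch's fern codes are just table lookups, classification cost is constant per patch regardless of how expensive the emulated detector's response function was, which is the property the abstract highlights.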
The spatial and temporal distribution of interest points induced by actions provides discriminative information for action description. For simulated actions performed in simplified settings, characteristic events of actions can be deduced from the global distribution of these points: the distribution indicates the presence or absence of motion at various regions within the overall action region. The Randomised Ferns classifier is therefore extended to encode these distributions. Minimal constraints exist on the execution of natural actions and scene setup, however, and in such settings simply encoding global motion events fails. A Relative Motion Descriptor is introduced, which encodes characteristic low-level motion information and therefore captures detailed properties of action and scene dynamics. The descriptor is computed at local regions across the video, and encodes atomic motion events via the relative distribution of interest point response strengths. The resulting descriptor can be used in conjunction with state-of-the-art classifiers; results show recognition using SVM classifiers. Furthermore, an approach is presented for improving action classification which assumes the presence of inherent modes in the observations, a necessary assumption given the loose constraints placed on actions in natural settings. Automatic Outlier Detection and Mode Finding methods are introduced to determine these modes. A variant of the RANSAC algorithm is employed, with a novel adaptation based on a Boosting-inspired iterative reweighting scheme. These methods simplify the classification boundaries between actions and result in improved recognition performance.
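The descriptor idea above, encoding the relative distribution of interest point response strengths in a local region, can be sketched as: partition a spatio-temporal neighbourhood into cells, sum the detector response per cell, and record which cells respond more strongly than which others. The cell sizes, grid, and pairwise-ordering encoding here are illustrative assumptions, not the thesis definition.

```python
import numpy as np

def relative_motion_descriptor(response, centre, half=(8, 8, 4), grid=(2, 2, 2)):
    """Sketch of a relative-motion style descriptor: around `centre` in a
    spatio-temporal interest-point response volume, sum response strength
    per cell of a small 3D grid, then encode every pairwise 'cell i is
    stronger than cell j' comparison as one bit. Illustrative only."""
    z, y, x = centre
    hz, hy, hx = half
    region = response[z - hz:z + hz, y - hy:y + hy, x - hx:x + hx]
    gz, gy, gx = grid
    # Split the region into gz*gy*gx cells and sum the response in each.
    cells = region.reshape(gz, 2 * hz // gz, gy, 2 * hy // gy, gx, 2 * hx // gx)
    sums = cells.sum(axis=(1, 3, 5)).ravel()
    # Encode the relative ordering of cell strengths, not their magnitudes,
    # which gives some invariance to overall response scale.
    i, j = np.triu_indices(sums.size, k=1)
    return (sums[i] > sums[j]).astype(np.uint8)
```

Descriptors computed at local regions across a video could then be aggregated (for instance into a histogram) and fed to an SVM, in line with the classification results described above.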
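The RANSAC-variant mode finding with Boosting-inspired reweighting might look like the following: repeatedly propose a candidate mode by sampling a data point (biased towards still-unexplained observations), keep the hypothesis with the largest weighted inlier support, then down-weight its inliers so later iterations focus on the remaining modes. The threshold, weighting rule, and function names are assumptions for illustration, not the thesis algorithm.

```python
import numpy as np

def find_modes(points, n_modes=2, thresh=1.0, n_hyp=200, seed=0):
    """Sketch of RANSAC-style mode finding with boosting-inspired
    reweighting over a set of observation vectors. Illustrative only."""
    rng = np.random.default_rng(seed)
    w = np.ones(len(points))          # per-sample weights, boosting-style
    modes = []
    for _ in range(n_modes):
        best_score, best_inliers = -1.0, None
        for _ in range(n_hyp):
            # Sample a candidate mode centre, biased towards high-weight
            # (not-yet-explained) points.
            c = points[rng.choice(len(points), p=w / w.sum())]
            inliers = np.linalg.norm(points - c, axis=1) < thresh
            score = w[inliers].sum()  # weighted consensus, as in RANSAC
            if score > best_score:
                best_score, best_inliers = score, inliers
        modes.append(points[best_inliers].mean(axis=0))
        # Down-weight explained points so the next pass finds another mode.
        w[best_inliers] *= 0.01
    return np.array(modes)
```

Separating such modes before training per-mode classifiers is one way the simpler classification boundaries described above could arise.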

Item Type: Thesis (Doctoral)
Divisions : Theses
Authors : Oshin, Olusegun Temitope.
Date : 2011
Additional Information : Thesis (Ph.D.)--University of Surrey (United Kingdom), 2011.
Depositing User : EPrints Services
Date Deposited : 06 May 2020 14:15
Last Modified : 06 May 2020 14:17



