Towards robust automatic affective classification of images using facial expressions for practical applications

File version

Accepted Manuscript (AM)

Author(s)
Zhang, Ligang
Tjondronegoro, Dian
Chandran, Vinod
Eggink, Jana
Date
2016
Abstract

Affect is an important feature of multimedia content and conveys valuable information for multimedia indexing and retrieval. Most existing studies on affective content analysis are limited to low-level features or mid-level representations, and are generally criticized for their inability to bridge the gap between low-level features and high-level human affective perception. The facial expressions of subjects in images carry important semantic information that can substantially influence human affective perception, but they have seldom been investigated for affective classification of facial images in practical applications. This paper presents an automatic image emotion detector (IED) for affective classification of practical (or non-laboratory) data using facial expressions, where many “real-world” challenges are present, including pose, illumination, and size variations. The proposed method is novel, with a framework designed specifically to overcome these challenges using multi-view versions of face and fiducial point detectors together with a combination of point-based texture and geometry features. Performance comparisons across several key parameters of the relevant algorithms are conducted to identify the optimal settings for high accuracy and fast computation. A comprehensive set of experiments with existing and new datasets shows that the method is robust to pose variations, fast and suitable for large-scale data, and as accurate as the state-of-the-art method on laboratory-based data. The proposed method was also applied to affective classification of images from the British Broadcasting Corporation (BBC) in a task typical of a practical application, providing some valuable insights.
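To make the pipeline described in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it uses a frontal face detector and 68-point landmarks in place of the paper's multi-view detectors, and simple patch statistics plus normalised inter-point distances as stand-ins for the point-based texture and geometry features. The library choices (OpenCV, dlib, scikit-learn) and the landmark model file name are assumptions.

```python
# Sketch of a face-based affective classification pipeline: detect a face,
# locate fiducial points, extract point-based texture and geometry features,
# and classify the expression with an SVM.
# Assumes dlib's "shape_predictor_68_face_landmarks.dat" is available locally.
import cv2
import dlib
import numpy as np
from sklearn.svm import SVC

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def extract_features(image_bgr):
    """Return a texture + geometry feature vector, or None if no face is found."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector(gray, 1)          # upsample once to help with small faces
    if not faces:
        return None
    shape = predictor(gray, faces[0])
    pts = np.array([(p.x, p.y) for p in shape.parts()], dtype=np.float32)

    # Texture: mean/std of small gray patches around each fiducial point
    # (a stand-in for the richer point-based texture descriptors in the paper).
    tex = []
    for x, y in pts.astype(int):
        patch = gray[max(y - 4, 0):y + 5, max(x - 4, 0):x + 5]
        tex.extend([patch.mean(), patch.std()] if patch.size else [0.0, 0.0])

    # Geometry: pairwise distances between fiducial points, normalised by face size.
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(-1))
    geo = dists[np.triu_indices(len(pts), k=1)] / (dists.max() + 1e-6)

    return np.concatenate([np.asarray(tex), geo])

# Usage (hypothetical data): train on labelled feature vectors, then predict
# the affect class of a new image.
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# label = clf.predict([extract_features(cv2.imread("photo.jpg"))])
```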

Journal Title

Multimedia Tools and Applications

Volume

75

Issue

8

Rights Statement

© 2016 Springer. This is an electronic version of an article published in Multimedia Tools and Applications, 75, 4669–4695 (2016). Multimedia Tools and Applications is available online at: http://link.springer.com/ with the open URL of your article.

Subject

Artificial intelligence

Software engineering

Electronics, sensors and digital hardware

Computer vision and multimedia computation

Data management and data science

Distributed computing and systems software

Science & Technology

Computer Science, Software Engineering

Computer Science, Theory & Methods

Citation

Zhang, L; Tjondronegoro, D; Chandran, V; Eggink, J, Towards robust automatic affective classification of images using facial expressions for practical applications, Multimedia Tools and Applications, 2016, 75 (8), pp. 4669-4695
