Gender Classification from Face Images Using Mutual Information and Feature Fusion

Claudio Perez, Juan Tapia, Pablo Estévez, Claudio Held

Research output: Contribution to journal › Article › peer-review

29 Citations (Scopus)

Abstract

In this article we report a new method for gender classification from frontal face images using feature selection based on mutual information and fusion of features extracted from intensity, shape, texture, and three different spatial scales. We compare the results of three mutual-information-based feature selection methods: minimal-redundancy-maximal-relevance (mRMR), normalized mutual information feature selection (NMIFS), and conditional mutual information feature selection (CMIFS). We also show that fusing features extracted by six different methods significantly improves gender classification relative to previously published results, yielding a classification rate of 99.13% on the FERET database.
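As a rough illustration of the mutual-information-based selection criteria named in the abstract, the sketch below implements a greedy mRMR-style selection in Python. The data layout, helper names, discretization, and the number of selected features are assumptions made for illustration; this is not the authors' implementation.

```python
# Minimal sketch of mRMR-style (minimal-redundancy-maximal-relevance) feature
# selection using mutual information. At each step it picks the feature that
# maximizes relevance I(f; y) minus its mean redundancy with already-selected
# features. All names and parameters here are illustrative assumptions.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

def mrmr_select(X, y, n_features):
    """Greedily select n_features columns of X by the mRMR criterion."""
    relevance = mutual_info_classif(X, y)  # I(f_i; y) for each feature column
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_features and remaining:
        scores = []
        for f in remaining:
            if selected:
                # Mean MI between candidate f and already-selected features;
                # features are discretized into 10 bins only to estimate MI.
                redundancy = np.mean([
                    mutual_info_score(
                        np.digitize(X[:, f], np.histogram_bin_edges(X[:, f], 10)),
                        np.digitize(X[:, s], np.histogram_bin_edges(X[:, s], 10)))
                    for s in selected])
            else:
                redundancy = 0.0
            scores.append(relevance[f] - redundancy)
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical usage: X would hold fused intensity/shape/texture features
# extracted at several spatial scales, y the binary gender labels.
# selected_idx = mrmr_select(X, y, n_features=200)
```

NMIFS and CMIFS follow the same greedy scheme but normalize or condition the redundancy term differently; swapping the scoring line is enough to sketch either variant.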

Original language: English
Pages (from-to): 92-119
Number of pages: 28
Journal: International Journal of Optomechatronics
Volume: 6
Issue number: 1
DOIs
Publication status: Published - Jan 2012

Keywords

  • Feature fusion
  • feature selection
  • gender classification
  • mutual information
  • real-time gender classification

ASJC Scopus subject areas

  • Instrumentation
  • Mechanical Engineering
  • Electrical and Electronic Engineering
