Gender classification from periocular NIR images using fusion of CNNs models

Juan Tapia, C. Carlos Aravena

Research output: Conference contribution

6 Citations (Scopus)

Abstract

Gender classification from periocular images is a challenging topic. Previous algorithms have focused primarily on the use of texture features, and not much research has been done on applying Convolutional Neural Networks (CNN) to this task. In this work we trained a small convolutional neural network for each of the left and right eyes and, more importantly, studied the effect of merging those models, comparing it against the model obtained by training a CNN over fused left-right eye images. We show that the network benefits from this model-merging approach and becomes more robust to occlusion and low-resolution degradation, outperforming the results of using a single CNN model for the left and right sets of images. Experiments on a database of near-infrared periocular images show that our CNN model exhibits competitive performance compared to other state-of-the-art methods.
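To make the model-merging idea described in the abstract concrete, the listing below is a minimal, hypothetical PyTorch sketch and not the authors' implementation: two small CNN branches, one per eye, whose features are concatenated before a shared gender classifier. The class names (EyeBranch, MergedGenderCNN), the 120x160 grayscale input size, and the layer widths are illustrative assumptions only.

# Illustrative sketch (not the paper's code): two small per-eye CNN branches
# whose features are merged for binary gender classification.
import torch
import torch.nn as nn

class EyeBranch(nn.Module):
    """Small CNN feature extractor for a single (left or right) NIR eye crop."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # -> (N, 64, 1, 1)
        )

    def forward(self, x):
        return self.features(x).flatten(1)  # -> (N, 64)

class MergedGenderCNN(nn.Module):
    """Concatenates left-eye and right-eye features before a shared classifier head."""
    def __init__(self):
        super().__init__()
        self.left = EyeBranch()
        self.right = EyeBranch()
        self.classifier = nn.Sequential(
            nn.Linear(64 * 2, 64), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(64, 2),      # logits for the two gender classes
        )

    def forward(self, left_eye, right_eye):
        merged = torch.cat([self.left(left_eye), self.right(right_eye)], dim=1)
        return self.classifier(merged)

if __name__ == "__main__":
    model = MergedGenderCNN()
    left = torch.randn(4, 1, 120, 160)   # batch of left-eye NIR crops (grayscale)
    right = torch.randn(4, 1, 120, 160)  # batch of right-eye NIR crops
    print(model(left, right).shape)      # torch.Size([4, 2])

The alternative baseline discussed in the abstract, training a single CNN over fused left-right eye images, would instead concatenate the two crops into one input tensor and feed a single branch.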

Original language: English
Title of host publication: 2018 IEEE 4th International Conference on Identity, Security, and Behavior Analysis, ISBA 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1-6
Number of pages: 6
Volume: 2018-January
ISBN (Electronic): 9781538622483
DOI: 10.1109/ISBA.2018.8311465
Publication status: Published - 9 Mar 2018
Event: 4th IEEE International Conference on Identity, Security, and Behavior Analysis, ISBA 2018 - Singapore, Singapore
Duration: 11 Jan 2018 - 12 Jan 2018

Conference

Conference: 4th IEEE International Conference on Identity, Security, and Behavior Analysis, ISBA 2018
Country: Singapore
City: Singapore
Period: 11/01/18 - 12/01/18

Fingerprint

CNN
Neural Networks (Computer)
Image fusion
neural network
Neural networks
gender
Merging
Databases
Research
Textures
Infrared radiation
Degradation
performance

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Safety, Risk, Reliability and Quality
  • Behavioral Neuroscience
  • Social Sciences (miscellaneous)

Cite this

Tapia, J., & Aravena, C. C. (2018). Gender classification from periocular NIR images using fusion of CNNs models. In 2018 IEEE 4th International Conference on Identity, Security, and Behavior Analysis, ISBA 2018 (Vol. 2018-January, pp. 1-6). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ISBA.2018.8311465
@inproceedings{5553c142e65f4832b800679c23534c9a,
title = "Gender classification from periocular NIR images using fusion of CNNs models",
abstract = "Gender classification from periocular images is a challenging topic. Previous algorithms have focused primarily on the use of texture features, and not much research has been done on applying Convolutional Neural Networks (CNN) to this task. In this work we trained a small convolutional neural network for each of the left and right eyes and, more importantly, studied the effect of merging those models, comparing it against the model obtained by training a CNN over fused left-right eye images. We show that the network benefits from this model-merging approach and becomes more robust to occlusion and low-resolution degradation, outperforming the results of using a single CNN model for the left and right sets of images. Experiments on a database of near-infrared periocular images show that our CNN model exhibits competitive performance compared to other state-of-the-art methods.",
author = "Juan Tapia and Aravena, {C. Carlos}",
year = "2018",
month = "3",
day = "9",
doi = "10.1109/ISBA.2018.8311465",
language = "English",
volume = "2018-January",
pages = "1--6",
booktitle = "2018 IEEE 4th International Conference on Identity, Security, and Behavior Analysis, ISBA 2018",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
address = "United States",
}



TY - GEN

T1 - Gender classification from periocular NIR images using fusion of CNNs models

AU - Tapia, Juan

AU - Aravena, C. Carlos

PY - 2018/3/9

Y1 - 2018/3/9

N2 - Gender classification from periocular images is a challenging topic. Previous algorithms have focused primarily on the use of texture features, and not much research has been done on applying Convolutional Neural Networks (CNN) to this task. In this work we trained a small convolutional neural network for each of the left and right eyes and, more importantly, studied the effect of merging those models, comparing it against the model obtained by training a CNN over fused left-right eye images. We show that the network benefits from this model-merging approach and becomes more robust to occlusion and low-resolution degradation, outperforming the results of using a single CNN model for the left and right sets of images. Experiments on a database of near-infrared periocular images show that our CNN model exhibits competitive performance compared to other state-of-the-art methods.

AB - Gender classification from periocular images is a challenging topic. Previous algorithms have focused primarily on the use of texture features, and not much research has been done on applying Convolutional Neural Networks (CNN) to this task. In this work we trained a small convolutional neural network for each of the left and right eyes and, more importantly, studied the effect of merging those models, comparing it against the model obtained by training a CNN over fused left-right eye images. We show that the network benefits from this model-merging approach and becomes more robust to occlusion and low-resolution degradation, outperforming the results of using a single CNN model for the left and right sets of images. Experiments on a database of near-infrared periocular images show that our CNN model exhibits competitive performance compared to other state-of-the-art methods.

UR - http://www.scopus.com/inward/record.url?scp=85049777358&partnerID=8YFLogxK

U2 - 10.1109/ISBA.2018.8311465

DO - 10.1109/ISBA.2018.8311465

M3 - Conference contribution

VL - 2018-January

SP - 1

EP - 6

BT - 2018 IEEE 4th International Conference on Identity, Security, and Behavior Analysis, ISBA 2018

PB - Institute of Electrical and Electronics Engineers Inc.

ER -
