Balanced Medical Image Classification with Transfer Learning and Convolutional Neural Networks

David Benavente, Gustavo Gatica, Jesús González-Feliu

Research output: Contribution to journal › Article › peer-review

Abstract

This paper proposes a tool for image classification to support medical diagnosis in contexts where computational power is limited and dedicated high-speed computing infrastructure cannot be used, mainly for economic and energy-consumption reasons. The proposed method combines a deep neural network algorithm with medical imaging procedures and is implemented to run efficiently on affordable hardware. The convolutional neural network (CNN) uses VGG16 as its base architecture, applying transfer learning with the parameters obtained in the ImageNet competition; two convolutional blocks and one dense block were added to this architecture. The tool was developed and calibrated on five common lung diseases using 5430 images from two public datasets. A holdout split of 90% for training and 10% for testing was used, and the regularization tools were dropout, early stopping, and L2 regularization. An accuracy (ACC) of 56% and an area under the receiver-operating characteristic curve (ROC-AUC) of 50% were reached in testing, which are suitable for decision support in a resource-constrained environment.
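The architecture described in the abstract can be sketched in Keras as follows. This is a minimal illustration, not the authors' implementation: the filter counts, dense-layer width, dropout rate, and L2 coefficient are assumptions (the abstract does not specify them), and only the overall shape, a frozen ImageNet-pretrained VGG16 base plus two added convolutional blocks and one dense block with dropout and L2 regularization, follows the text.

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers


def build_model(num_classes=5, weights="imagenet"):
    """Frozen VGG16 base + two added conv blocks + one dense block.

    Layer sizes are illustrative guesses; the abstract only names the
    block structure and the regularization tools used.
    """
    base = tf.keras.applications.VGG16(
        weights=weights, include_top=False, input_shape=(224, 224, 3))
    base.trainable = False  # transfer learning: keep ImageNet features fixed

    x = base.output
    # Two added convolutional blocks (filter counts assumed)
    for filters in (512, 512):
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.MaxPooling2D(pool_size=2, padding="same")(x)

    # One added dense block with dropout and L2 regularization
    x = layers.Flatten()(x)
    x = layers.Dense(256, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-4))(x)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)

    model = tf.keras.Model(base.input, outputs)
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auc")])
    return model
```

Early stopping, the third regularization tool named in the abstract, would be supplied at training time, e.g. `model.fit(..., callbacks=[tf.keras.callbacks.EarlyStopping(patience=5, restore_best_weights=True)])`, alongside the 90%/10% train/test holdout split.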

Original language: English
Article number: 115
Journal: Axioms
Volume: 11
Issue number: 3
Publication status: Published - Mar 2022

Keywords

  • Chest X-rays
  • Computer vision
  • Convolutional neural nets
  • Deep learning
  • Image classification
  • Medical imaging
  • Problem solving

ASJC Scopus subject areas

  • Analysis
  • Algebra and Number Theory
  • Mathematical Physics
  • Logic
  • Geometry and Topology
