An Explainable Deep Learning-Based Classification Method for Facial Image Quality Assessment

Research output: Contribution to journal › Article › peer-review

Abstract

Identifying a face captured with a smartphone-based facial image application is challenging owing to factors such as illumination, camera quality, and background variation. Face image quality assessment takes a face image as input and produces some form of “quality” estimate as output. Quality assessment techniques typically rely on deep learning methods to categorize images, but deep learning models are often treated as black boxes, which raises questions about their trustworthiness. Several explainability techniques have gained importance in building this trust: they provide visual evidence of the image regions on which a deep learning model bases its prediction. Here, we developed a technique for the reliable prediction of facial image quality prior to medical analysis and security operations. A combination of gradient-weighted class activation mapping (Grad-CAM) and local interpretable model-agnostic explanations (LIME) was used to explain the model. The approach has been applied to the preselection of facial images for skin feature extraction, which is important in critical medical science applications. We demonstrate that combined explanations provide better visual evidence for the model's decisions, with both the saliency-map and perturbation-based explainability techniques verifying predictions.
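The abstract describes fusing a gradient-based saliency map (Grad-CAM) with a perturbation-based explanation (LIME) so that two independent techniques verify the same prediction. The sketch below illustrates that general idea in plain NumPy: it is not the paper's implementation. The occlusion-style `perturbation_map` stands in for LIME (which uses superpixels and a local surrogate model), the elementwise-product fusion rule is an assumption, and `predict_fn` is a hypothetical scoring function supplied by the caller.

```python
import numpy as np

def grad_cam_map(feature_maps, channel_grads):
    """Grad-CAM-style saliency: gradient-weighted sum of feature maps, ReLU'd.

    feature_maps: (H, W, K) activations of a conv layer.
    channel_grads: (H, W, K) gradients of the class score w.r.t. those activations.
    """
    weights = channel_grads.mean(axis=(0, 1))            # global-average-pooled gradients
    cam = np.maximum((feature_maps * weights).sum(-1), 0)  # weighted sum, then ReLU
    return cam / (cam.max() + 1e-8)                      # normalize to [0, 1]

def perturbation_map(image, predict_fn, patch=4):
    """Occlusion map (a simplified stand-in for LIME): score drop per masked patch."""
    base = predict_fn(image)
    h, w = image.shape[:2]
    heat = np.zeros((h, w))
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            masked = image.copy()
            masked[y:y + patch, x:x + patch] = 0         # occlude one patch
            heat[y:y + patch, x:x + patch] = base - predict_fn(masked)
    heat = np.maximum(heat, 0)
    return heat / (heat.max() + 1e-8)

def combined_explanation(cam, occ):
    """Agreement of the two explanations (assumed fusion rule: elementwise product)."""
    return cam * occ
```

Regions that both maps highlight survive the product, while regions flagged by only one technique are suppressed, which mirrors the paper's point that agreement between a saliency map and a perturbation-based method gives stronger visual evidence. In a real pipeline the Grad-CAM map would first be upsampled to the input resolution before fusion.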

Original language: English
Pages (from-to): 558-573
Number of pages: 16
Journal: Journal of Information Processing Systems
Volume: 20
Issue number: 4
DOIs
State: Published - Aug 2024

Keywords

  • Explainable Deep Learning
  • Face Image Quality Assessment
  • Image Classification
  • MobileNet
  • Transfer Learning
