A two-stage dimensional reduction approach to low-dimensional representation of facial images

Jongmoo Choi, Juneho Yi

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

We present a two-stage dimensional reduction approach to low-dimensional representation of facial images. Low-dimensional representation is particularly important when facial feature data must be stored on low-capacity storage devices. Our approach is composed of two consecutive mappings of the input data: the first mapping is concerned with the best separation of the input data into classes, and the second focuses on preserving the distance relationships between data points as closely as possible before and after the mapping. We claim that if the data are well clustered into classes, features extracted from a topology-preserving map of the data are appropriate for recognition when low-dimensional features are to be used. We present two novel methods: FLD (Fisher's Linear Discriminant) combined with SOFM (Self-Organizing Feature Map), and FLD combined with MDS (Multi-Dimensional Scaling). Experimental results on the Yale, AT&T and FERET facial image databases show that the recognition performance of our methods degrades gracefully when low-dimensional features are used.
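The two-stage pipeline described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes scikit-learn's LinearDiscriminantAnalysis and MDS as stand-ins for the paper's FLD and MDS stages, and the data shape, target dimensions, and nearest-neighbor classifier below are placeholders chosen only for illustration.

```python
# Minimal sketch of a two-stage reduction (hypothetical, not the authors' implementation):
# stage 1 maps images to a class-separating subspace with Fisher's Linear Discriminant,
# stage 2 maps those features to a very low dimension with MDS so that pairwise
# distance relationships are preserved as closely as possible.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.manifold import MDS
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Placeholder data standing in for vectorized face images:
# 120 samples of 10 subjects, each flattened to 1024 pixels.
X = rng.normal(size=(120, 1024))
y = np.repeat(np.arange(10), 12)

# Stage 1: FLD projects onto at most (n_classes - 1) discriminant directions.
fld = LinearDiscriminantAnalysis(n_components=9)
X_fld = fld.fit_transform(X, y)

# Stage 2: MDS embeds the discriminant features into a low-dimensional space
# while trying to keep the distances between points.
mds = MDS(n_components=3, random_state=0)
X_low = mds.fit_transform(X_fld)

# A nearest-neighbor classifier on the 3-D features illustrates recognition
# with a compact representation.
clf = KNeighborsClassifier(n_neighbors=1).fit(X_low, y)
print("Training accuracy on the toy data:", clf.score(X_low, y))
```

Note that classical MDS has no out-of-sample mapping for unseen images; this sketch only conveys the order of the two mappings, not the paper's full recognition protocol or its reported results.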

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Editors: David Zhang, Anil K. Jain
Publisher: Springer Verlag
Pages: 131-138
Number of pages: 8
ISBN (Print): 3540221468, 9783540221463
DOIs
State: Published - 2004

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3072
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349
