Robust Visual Segmentation using RCrR Plane and Mahalanobis Distance

Authors:
Arévalo Casallas, Diego Armando
Castañeda Obando, David Ricardo
Castañeda Fandiño, José Ignacio
Resource type:
Journal article
Publication date:
2014
Institution:
Universidad Antonio Nariño
Repository:
Repositorio UAN
Language:
spa
OAI Identifier:
oai:repositorio.uan.edu.co:123456789/3952
Online access:
http://revistas.uan.edu.co/index.php/ingeuan/article/view/389
http://repositorio.uan.edu.co/handle/123456789/3952
Keywords:
Faded photo correction
gray world assumption
gamma correction
illumination
skin color
segmentation
Euclidean distance
Mahalanobis distance
histogram
Rights
openAccess
License
Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)
Description
Summary: In this paper, a skin detection algorithm robust to illumination changes is proposed. A database of 100 images of people in frontal position showing the face, hands and arms was used: 50 images captured under controlled conditions and 50 under uncontrolled conditions. Five color correction algorithms are evaluated: Simple Correction with Green Channel, Color Channel Compression, Color Channel Expansion, Fixed Reference and Gamma Correction. Four segmentation algorithms are also evaluated: RGB Skin Color, Reference Histogram, Euclidean Distance and Mahalanobis Distance. The proposed algorithm combines the Fixed Reference method with Gamma Correction for color correction and performs skin segmentation on an RCrR color plane, obtained by transforming the images into the RGB and YCbCr color spaces; the final classification uses the Mahalanobis distance. Average sensitivity of 99.36% and specificity of 84.31% were obtained.
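
The pipeline summarized above lends itself to a compact implementation. The following Python sketch, built on NumPy and OpenCV, illustrates the general idea only; it is not the authors' code. The gamma exponent, the use of a 2-D (R, Cr) feature per pixel as a stand-in for the RCrR plane, and the distance threshold are illustrative assumptions, not values taken from the paper.

    import numpy as np
    import cv2

    def gamma_correct(bgr_img, gamma=1.5):
        # Simple gamma correction; the exponent is illustrative, not taken from the paper.
        norm = bgr_img.astype(np.float64) / 255.0
        return (np.power(norm, 1.0 / gamma) * 255.0).astype(np.uint8)

    def rcr_features(bgr_img):
        # Per-pixel (R, Cr) feature: R from the RGB image, Cr from its YCbCr transform.
        r = bgr_img[:, :, 2].astype(np.float64)            # OpenCV stores channels as BGR
        ycrcb = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2YCrCb)
        cr = ycrcb[:, :, 1].astype(np.float64)
        return np.stack([r, cr], axis=-1)                  # shape (H, W, 2)

    def fit_skin_model(skin_samples):
        # Estimate mean vector and inverse covariance from labelled skin pixels, shape (N, 2).
        mean = skin_samples.mean(axis=0)
        inv_cov = np.linalg.inv(np.cov(skin_samples, rowvar=False))
        return mean, inv_cov

    def segment_skin(bgr_img, mean, inv_cov, threshold=3.0):
        # Keep pixels whose Mahalanobis distance to the skin model falls below the threshold.
        feats = rcr_features(bgr_img).reshape(-1, 2)
        diff = feats - mean
        d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)  # squared Mahalanobis distance
        mask = np.sqrt(d2) < threshold
        return mask.reshape(bgr_img.shape[:2]).astype(np.uint8) * 255

In this sketch, an input image would first be corrected with gamma_correct, the skin model would be fitted once on manually labelled skin pixels with fit_skin_model, and segment_skin would then produce a binary skin mask for new images.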