A multi-resolution model for human chromatic and achromatic local-contrast discrimination
Perception, European Conference on Visual Perception (ECVP'04), Volume 33, page 118b - 2004
IF: 1.271. area: PSYCHOLOGY. Quartile: 3.
We have previously shown how a simple (low-level), physiologically plausible model of achromatic local-contrast discrimination predicts human performance for discriminating between pairs of slightly different morphed pictures (Párraga et al, 2000 Current Biology 10 35 - 38). The model does a multi-resolution analysis of the two pictures and detects differences in local contrast in each spatial frequency channel. For the present work, we have developed a dichromatic version of the same basic model, which analyses separately the achromatic and chromatic (red - green) representations of pairs of colour images and simply signals which representation produces the largest contrast difference. This limited version of the model is valid only for foveal detection tasks (given the lack of blue cones in the central fovea), and we expect to develop a full-colour version in the near future. To relate model output values to actual human discrimination thresholds, we calibrated the model against a series of psychophysical experiments where human observers' discrimination thresholds were measured for 49 sequences of slightly different morphed images of fruits (Párraga et al, 2003 Perception 32 Supplement, 168). The model was tested by correlating subjects' detection performance in an experiment which involved detecting coloured targets with predictions of the model.
BibTeX reference
@InProceedings{PTT2004,
  author    = "C. Alejandro P\'arraga and Tom Troscianko and D. J. Tolhurst",
  title     = "A multi-resolution model for human chromatic and achromatic local-contrast discrimination",
  booktitle = "Perception, European Conference on Visual Perception (ECVP'04)",
  volume    = "33",
  pages     = "118b",
  year      = "2004",
  abstract  = "We have previously shown how a simple (low-level), physiologically plausible model of achromatic local-contrast discrimination predicts human performance for discriminating between pairs of slightly different morphed pictures (P\'arraga et al, 2000 Current Biology 10 35 - 38). The model does a multi-resolution analysis of the two pictures and detects differences in local contrast in each spatial frequency channel. For the present work, we have developed a dichromatic version of the same basic model, which analyses separately the achromatic and chromatic (red - green) representations of pairs of colour images and simply signals which representation produces the largest contrast difference. This limited version of the model is valid only for foveal detection tasks (given the lack of blue cones in the central fovea), and we expect to develop a full-colour version in the near future. To relate model output values to actual human discrimination thresholds, we calibrated the model against a series of psychophysical experiments where human observers' discrimination thresholds were measured for 49 sequences of slightly different morphed images of fruits (P\'arraga et al, 2003 Perception 32 Supplement, 168). The model was tested by correlating subjects' detection performance in an experiment which involved detecting coloured targets with predictions of the model.",
  ifactor   = "1.271",
  quartile  = "3",
  area      = "PSYCHOLOGY",
  url       = "http://cat.cvc.uab.es/Public/Publications/2004/PTT2004"
}