Successful comprehension of observed images and scenes depends on our ability to distinguish their features. Human vision identifies scene features through the apparent contrasts they create within their context. Clearly visible contrasts facilitate the recognition of objects in a scene, the identification of their texture, the understanding of their spatial distribution, and judgments of brightness between adjacent and distant areas. Together, these features directly influence people's assessment of overall image quality. We present an image processing tool that creates countershading profiles for an image to enhance the perceived contrast of features that have been degraded with respect to the original. For instance, an image tone mapped with the logarithmic mapping (Figure A) lacks details in the sky with respect to its HDR original: the area around the sun has become almost isoluminant, and much of the contrast near the horizon has been lost. This is automatically restored with adaptive countershading (Figure B), which introduces subtle changes to the appropriate areas of the image in such a way that the look of the particular tone mapping algorithm remains unchanged. The countershading profiles (Figure C) are not disturbing because their strength is adjusted by a visual detection model.
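As a point of reference for the kind of degradation described above, the classic logarithmic operator compresses HDR luminance into display range. The sketch below is a minimal, generic form of that operator (the exact parameterization used for the figure is not specified here); the function name and parameters are illustrative.

```python
import numpy as np

def log_tone_map(hdr, ldr_max=1.0):
    """Compress HDR luminance to display range with a logarithmic curve.

    A minimal, generic logarithmic operator; not necessarily the exact
    variant used to produce Figure A.
    """
    lum_max = hdr.max()
    return ldr_max * np.log1p(hdr) / np.log1p(lum_max)

# Example: compress a synthetic luminance range spanning four orders of magnitude.
hdr = np.logspace(-2, 2, 256)
ldr = log_tone_map(hdr)
```

Because the curve flattens toward high luminances, bright regions such as the sky around the sun end up nearly isoluminant after mapping, which is exactly the loss of local contrast that countershading later restores.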
This project received the 2nd Best Paper Award at the Eurographics 2007 conference. The paper and the slides from the conference presentation are available for download in the reference section.
We address the problem of communicating contrasts in images that have been degraded with respect to their original due to processing with computer graphics algorithms. Such degradation can happen during the tone mapping of high dynamic range images, or while rendering scenes with low-contrast shaders or poor lighting. Inspired by a family of known perceptual illusions, the Craik-O'Brien-Cornsweet effect, we enhance contrasts by modulating brightness at the edges to create countershading profiles. We generalize unsharp masking by coupling it with a multi-resolution local contrast metric to automatically create the countershading profiles from sub-band components, which are individually adjusted for each corrected feature to best enhance contrast with respect to the reference. Additionally, we employ a visual detection model to ensure that our enhancements are not perceived as objectionable halo artifacts. The overall appearance of images remains mostly unchanged and the enhancement is achieved within the available dynamic range. We use our method to post-correct tone-mapped images and to improve images using their depth information.
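The core mechanism, unsharp masking as a source of countershading, can be illustrated on a 1-D luminance signal. The sketch below is deliberately simplified: it uses a single Gaussian scale with a fixed gain, whereas the paper derives per-feature gains from a multi-resolution contrast metric and limits them with a visual detection model. The function name, `sigma`, and `gain` values are illustrative assumptions.

```python
import numpy as np

def countershade_1d(signal, sigma=8.0, gain=0.2):
    """Add a countershading profile to a 1-D luminance signal via unsharp masking.

    Single-scale sketch of the principle only; the paper instead adjusts
    sub-band components per feature and clamps them to stay below the
    threshold at which halos become objectionable.
    """
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    padded = np.pad(signal, radius, mode='edge')     # avoid darkened borders
    blurred = np.convolve(padded, kernel, mode='valid')
    highpass = signal - blurred                      # sub-band around the edges
    return signal + gain * highpass                  # darken/lighten toward each edge

# A step edge whose contrast we want to boost.
edge = np.where(np.arange(200) < 100, 0.4, 0.6)
enhanced = countershade_1d(edge)
```

The high-pass residual is positive just above the edge and negative just below it, so adding it back produces exactly the gradual lightening and darkening toward the edge that the Cornsweet illusion exploits, while flat regions far from the edge are left untouched.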
These sample results were obtained automatically by our algorithm without any parameter tweaking; click on the images to see them at higher resolution. Each row contains a tone-mapped image (left), the result of contrast restoration by adaptive countershading (middle), and a map of countershading profiles (right). The blue countershading profiles darken the image and the red ones lighten it; their intensity corresponds to the profile magnitude. Notice that the contrast restoration preserves the particular style of each tone mapping algorithm. Refer to the paper for further discussion.
A carefully shaped luminance profile at an edge between two areas shifts the perceived brightness of the entire areas and increases the perceived contrast between them. The family of such border profiles, known as the Craik-O'Brien-Cornsweet illusion, is based on the gradual darkening and lightening of areas towards their common edge, called countershading. In the figures above, each gray circle and the background are filled with exactly the same pixel value, but their perceived brightness differs because of the countershading profiles at their borders. The illusion is strong: it persists even for a sequence of consecutive profiles (left), and the illusory brightness holds even when an area is isolated by a white ring (right, center area).
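A 1-D cross-section of such a stimulus is easy to construct: both halves share the same base value, and only decaying ramps meeting at the common edge are added. The sketch below builds this profile with exponential ramps; the function name and the `amplitude` and `tau` parameters are illustrative choices, not values from the paper.

```python
import numpy as np

def cornsweet_profile(n=200, amplitude=0.1, tau=15.0, base=0.5):
    """Build a 1-D Craik-O'Brien-Cornsweet stimulus.

    Both halves share the same base value; only the decaying ramps at the
    common edge differ, yet the left half appears darker than the right.
    Parameters are illustrative, not taken from the paper.
    """
    i = np.arange(n)
    mid = n // 2
    profile = np.full(n, base)
    left = np.exp(-(mid - 1 - i[:mid]) / tau)   # ramp darkening toward the edge
    right = np.exp(-(i[mid:] - mid) / tau)      # ramp lightening away from the edge
    profile[:mid] -= amplitude * left
    profile[mid:] += amplitude * right
    return profile

stim = cornsweet_profile()
```

Away from the edge the profile decays back to the common base value, so a luminance meter would report both halves as identical; the perceived brightness difference comes entirely from the ramps at the border.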
Contrast Restoration by Adaptive Countershading
Grzegorz Krawczyk, Karol Myszkowski, Hans-Peter Seidel
In: Proc. of EUROGRAPHICS '07 (Computer Graphics Forum, vol. 26), 2007. 2nd Best Paper Award.
[bibtex] [paper] [slides]