Abstract

Color-deficient individuals have trouble seeing color contrasts that are readily apparent to individuals with normal color vision. For example, for some color-deficient individuals, red and green apples do not have the striking contrast they have for those with normal color vision, or the abundance of red cherries in a tree is not immediately clear due to a lack of perceived contrast. We present a smartphone app that enables color-deficient users to visualize such problematic color contrasts in order to help them with daily tasks. The user interacts with the app through the touchscreen. As the user traces a path across the touchscreen, the colors in the image change continuously via a transform that enhances contrasts that are weak or imperceptible for the user under native viewing conditions. Specifically, we propose a transform that shears the color data along lines parallel to the dimension corresponding to the affected cone sensitivity of the user. The amount and direction of shear are controlled by the user's finger movement over the touchscreen, allowing the user to visualize these contrasts. Using the GPU, this simple transformation, consisting of a linear shear and a translation, is applied efficiently to each pixel in real time as the position of the user's finger changes. The app can thus aid daily tasks such as distinguishing red from green apples or picking out ripe bananas.
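
The core operation described above is a per-pixel linear shear plus translation in a cone-response (LMS) color space, parameterized by the touch position. The Python/NumPy sketch below illustrates one way such a transform could look; the RGB-to-LMS matrix, the specific shear coefficients, and the names `shear_contrast`, `shear`, `offset`, and `axis` are illustrative assumptions, not the app's actual GPU implementation.

```python
import numpy as np

# Approximate RGB -> LMS conversion (a Hunt-Pointer-Estevez-style matrix
# commonly used in daltonization work; assumed here for illustration).
RGB_TO_LMS = np.array([
    [17.8824,   43.5161,  4.11935],
    [ 3.45565,  27.1554,  3.86714],
    [ 0.0299566, 0.184309, 1.46709],
])
LMS_TO_RGB = np.linalg.inv(RGB_TO_LMS)


def shear_contrast(image_rgb, shear, offset, axis=0):
    """Shear pixel colors along the cone axis `axis` (0=L, 1=M, 2=S).

    `shear` (a pair) and `offset` stand in for the amount and direction
    derived from the user's finger position; how the app maps touch
    coordinates to these values is not specified here.
    """
    h, w, _ = image_rgb.shape
    lms = image_rgb.reshape(-1, 3) @ RGB_TO_LMS.T

    # Shear matrix: the affected cone coordinate is displaced by an amount
    # proportional to the other two coordinates, so each color moves along
    # a line parallel to that cone's axis.
    S = np.eye(3)
    others = [i for i in range(3) if i != axis]
    S[axis, others[0]] = shear[0]
    S[axis, others[1]] = shear[1]

    sheared = lms @ S.T
    sheared[:, axis] += offset  # translation along the same axis

    out = sheared @ LMS_TO_RGB.T
    return np.clip(out, 0.0, 255.0).reshape(h, w, 3)
```

For example, `shear_contrast(img, shear=(0.05, -0.05), offset=0.0, axis=0)` would exaggerate differences along the L (long-wavelength) axis, which a protan user might use to separate red from green; the parameter values here are purely illustrative.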
