Friday, November 15, 2019

Clears That Up

Pictures taken underwater rely on the sunlight that penetrates the water's surface. At great depths that light is essentially gone, but even at relatively shallow, brightly lit depths the water changes the color of the light, absorbing the longer red wavelengths much faster than the greens and blues. All but the very clearest water gives submerged objects a blue-green tint.

Which means the colors we see in underwater photography are not always true: everything carries a blue cast that grows heavier the deeper the water is. The tint can also flatten colors, making objects that actually vary widely in hue look nearly the same.
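
To get a rough feel for why the red end of the spectrum vanishes first, here's a back-of-the-envelope sketch of the exponential (Beer-Lambert) falloff of light with depth. The attenuation coefficients are ballpark guesses for clear ocean water, chosen only for illustration, not figures from the researchers' work:

import math

# Rough Beer-Lambert falloff: I(d) = I0 * exp(-c * d), where c is the water's
# attenuation coefficient for that color. These coefficients are ballpark
# guesses for clear ocean water, purely for illustration.
ATTENUATION_PER_METER = {"red": 0.40, "green": 0.07, "blue": 0.03}

def surviving_fraction(channel, depth_m):
    """Fraction of surface light in a color channel that survives to depth_m."""
    return math.exp(-ATTENUATION_PER_METER[channel] * depth_m)

for depth in (1, 5, 10, 20):
    remaining = {ch: f"{surviving_fraction(ch, depth):.0%}" for ch in ATTENUATION_PER_METER}
    print(f"{depth:>2} m: {remaining}")

Even with these made-up numbers, by around 10 meters the red channel is down to a couple of percent of its surface strength while blue is still mostly intact, which is why everything looks blue-green.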

Enter oceanographer Derya Akkaynak and engineer Tali Treibitz, who developed an algorithm (they call it Sea-thru) that shifts an image's colors to show what the photographed objects would look like in ordinary above-the-surface sunlight. As you can see scrolling through the pictures, the light that filters down through the water can change an item's apparent color considerably from how it would look out of the water. Like a lot of little digital tweaks that exist in the world, it may or may not wind up offering much practical benefit -- but it is pretty cool to look at either way.
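
For a sense of what "shifting the colors back" can mean in the simplest possible terms, here is a crude gray-world white-balance sketch in Python. This is emphatically not Akkaynak and Treibitz's method, which relies on per-pixel distance information and a more careful model of how water degrades an image; it's just a toy illustration of boosting the channels the water has suppressed:

import numpy as np

def naive_underwater_correction(image):
    """Crude color rebalance: scale each channel so its mean matches gray.

    `image` is an H x W x 3 float RGB array in [0, 1]. This is a simple
    gray-world white balance, nothing like the real Sea-thru model.
    """
    means = image.reshape(-1, 3).mean(axis=0)          # average R, G, B
    gains = means.mean() / np.clip(means, 1e-6, None)  # boost the dim channels (usually red)
    corrected = image * gains                          # apply per-channel gain
    return np.clip(corrected, 0.0, 1.0)

# Example with a synthetic "blue-tinted" photo: red heavily attenuated.
rng = np.random.default_rng(0)
photo = rng.random((4, 4, 3)) * np.array([0.2, 0.6, 0.9])
print(naive_underwater_correction(photo).reshape(-1, 3).mean(axis=0))

The real algorithm has to do much more than this, since the blue cast gets stronger with the distance to each object in the frame, not just with the overall depth of the shot.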
