The key to successful underwater vision is getting rid of the water!
This cliché is as old as underwater photography itself, but it’s still true today. Depth, distance, lighting, turbidity, salinity, and pollution all affect visibility and the perception of size, shape, and color of underwater objects. Add the abilities and limitations of the human eye and brain, and this becomes a very challenging environment. However, technological development is rapidly pushing the limits of what we can see and do underwater.
Water vs. air
Water is 800 times denser than air. When light enters water, it interacts with the water molecules and particles, resulting in loss of light intensity, color changes, diffusion, loss of contrast and other effects. If you take an underwater photo of an object one meter away, it will be similar to a photo above water at 800 meters; both will look bluish and lack contrast.
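The intensity loss described above is well approximated by exponential (Beer–Lambert) decay with distance. A minimal sketch, using an illustrative attenuation coefficient (the value below is an assumption for coastal water, not a measured figure from the text):

```python
import math

def surviving_fraction(distance_m, attenuation_per_m):
    """Fraction of light intensity left after travelling `distance_m`
    through water with the given attenuation coefficient (1/m)."""
    return math.exp(-attenuation_per_m * distance_m)

# Roughly 0.2/m is a plausible diffuse attenuation coefficient for
# coastal water (illustrative assumption only).
k = 0.2
print(f"After 1 m:  {surviving_fraction(1, k):.0%} of the light remains")
print(f"After 10 m: {surviving_fraction(10, k):.0%} remains")
```

Even at this modest coefficient, most of the light is gone within a few tens of meters, which is why underwater viewing ranges are so short compared to air.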
Light under water
Sunlight is reflected by the surface of the water, which causes significant changes in visibility and the perception of color underwater. Depending on the waves, light may form patterns or become randomly diffused. The amount of light reflected also depends on the geographical location, the time of day, weather conditions, the season and the condition of the sea.
If you descend more than a couple of meters you will need to bring your own light source. Unfortunately, lamps tend to have a ‘hot spot’, resulting in an image with a very bright center that becomes darker towards the edges. In turbid waters a camera sensor will be almost blinded by the reflections at the center while the edges appear very dark. So the ideal camera should have a sensor with high dynamic range and good low-light sensitivity, reducing the need for high-power illumination.
Or – you could use electronic video enhancement technology like lyynification™ from LYYN®.
Color under water
An important part of vision underwater is being able to distinguish different colors, or more precisely, specific wavelengths of light reflected off objects and picked up by the eye or the camera sensor. Different wavelengths are absorbed differently as the light passes through the water. The shorter the wavelength, the farther it travels before being absorbed. This causes objects to lose their color as you go deeper down or further away.
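The wavelength-dependent absorption can be illustrated with rough per-color absorption coefficients. The values below are illustrative assumptions for clear ocean water, not figures from the text; real coefficients vary with water type:

```python
import math

# Rough absorption coefficients (1/m) for clear ocean water --
# illustrative assumptions only.
ABSORPTION = {
    "red (~700 nm)": 0.5,
    "green (~550 nm)": 0.05,
    "blue (~450 nm)": 0.02,
}

def remaining(depth_m):
    """Fraction of each color band surviving to the given depth."""
    return {color: math.exp(-k * depth_m) for color, k in ABSORPTION.items()}

for color, frac in remaining(10).items():
    print(f"{color}: {frac:.1%} left at 10 m")
```

Red light is essentially gone within about ten meters, while blue penetrates far deeper, which is exactly why deep scenes look monochromatically blue-green.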
Weeds, rocks, animals and man-made objects generally appear to have the same color as the depth or viewing range increases. Objects become distinguishable only by differences in brightness and not color. Contrast becomes the most important factor in visibility, and even very large objects may be undetectable if their brightness is similar to that of the background.
Water depth is not the only factor affecting the filtering of colors. Salinity, turbidity, the size of suspended particles, and pollution all affect the color-filtering properties of water. For instance, plankton absorbs purples and blues. So the presence of plankton would cause blue and purple objects to lose their colors much faster than red and yellow objects.
Human observers can compensate to some degree, partly psychologically, which allows divers to perceive some of the warm colors of the coral reef. But camera sensors have no ability to compensate for the blue/green filtering of sea water. This is why the fantastic colors of the reef appear cold and lifeless when you look at your holiday photos or videos at home.
Turbidity and contrast
Suspended particles in water reflect and scatter light, resulting in diffusion. Sometimes, diffusion is helpful because it sheds light on areas that would otherwise be in shadow. Normally, however, diffusion interferes with vision because the backscattering reduces the contrast between an object and its surroundings.
The loss of contrast resulting from diffusion and loss of color spectrum are the major reasons why vision underwater is so much more restricted than it is on land.
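The effect of backscatter on contrast can be sketched with the standard Michelson contrast measure (my choice of metric, not one named in the text): backscattered "veiling" light raises object and background luminance equally, leaving their difference unchanged but shrinking their relative contrast.

```python
def michelson_contrast(obj_luminance, bg_luminance):
    """Michelson contrast between an object and its background."""
    return abs(obj_luminance - bg_luminance) / (obj_luminance + bg_luminance)

def with_backscatter(obj, bg, veil):
    # Veiling light adds equally to object and background, so the
    # luminance difference is unchanged but the ratio collapses.
    return michelson_contrast(obj + veil, bg + veil)

print(michelson_contrast(80, 20))     # clear water: 0.6
print(with_backscatter(80, 20, 200))  # heavy backscatter: 0.12
```

With enough veiling light, even a bright object fades into the background, matching the observation that large objects can become undetectable.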
Underwater photographers have been battling these problems since the early days. One can use different filters to try to compensate for the loss of a particular wavelength, or the white-balancing feature on video cameras that tries to compensate for the color cast. But all these traditional methods are crude and have severe limitations.
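A typical example of such a crude compensation is gray-world white balancing, which simply rescales each color channel so their means match. This is a generic textbook technique, not a method attributed to any particular camera in the text:

```python
import numpy as np

def gray_world_balance(frame):
    """Naive gray-world white balance: scale each channel so its mean
    matches the overall mean, counteracting a uniform blue/green cast."""
    frame = frame.astype(np.float64)
    channel_means = frame.reshape(-1, 3).mean(axis=0)
    gain = channel_means.mean() / channel_means
    return np.clip(frame * gain, 0, 255).astype(np.uint8)

# A tiny synthetic "bluish" frame: the blue channel dominates.
frame = np.full((4, 4, 3), (40, 90, 160), dtype=np.uint8)
balanced = gray_world_balance(frame)
print(balanced[0, 0])  # channel values pulled toward a common gray
```

The limitation is obvious: one global gain per channel cannot cope with depth-dependent, distance-dependent color loss within a single scene.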
The revolutionary method of lyynification™ takes a different approach. Each video frame is optimized for contrast and color spectrum to make it as “natural” as possible to the human eye. Even the smallest fragments of color and object shape can be extracted from the camera sensor to restore the scene as much as possible. And all this is done in real-time. The result is an image that constantly self-adjusts to the environment, so the diver or ROV pilot can focus on mission objectives.
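One generic building block of per-frame enhancement is a percentile contrast stretch, which remaps the occupied part of the histogram to the full output range. To be clear, this is an illustrative sketch of the general idea, not LYYN's proprietary algorithm:

```python
import numpy as np

def stretch_contrast(frame, low_pct=1, high_pct=99):
    """Per-frame percentile contrast stretch: map the 1st-99th
    percentile of pixel values onto the full 0-255 range.
    A generic technique, not lyynification itself."""
    lo, hi = np.percentile(frame, [low_pct, high_pct])
    stretched = (frame.astype(np.float64) - lo) / max(hi - lo, 1e-6) * 255
    return np.clip(stretched, 0, 255).astype(np.uint8)

# A murky, low-contrast frame: values squeezed into a narrow band.
murky = np.random.default_rng(0).integers(100, 140, (60, 80), dtype=np.uint8)
clear = stretch_contrast(murky)
print(murky.min(), murky.max(), "->", clear.min(), clear.max())
```

Because the percentiles are recomputed for every frame, the output adapts as conditions change, which mirrors the self-adjusting behavior described above, though a production system would need far more sophistication.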