We don’t see the world as it is …

Someone said something like “We don’t see things as they are, we see them as we are.” Who? A search, filtered through my web bubble, ascribed it to Stephen R. Covey, Anaïs Nin, Albert Einstein… The trail is complex enough that the Quote Investigator has an article on it. When you start thinking about the path from the eye to the perception of the world around us, you realize that the world is so complex that the brain’s computations can only create an approximation. A frog sees a world of moving objects in exquisite detail. A butterfly sees the bright patterns in ultraviolet light that flowers have evolved to attract it. We do not. And the camera, that instrument we carry around in our pockets, does not see what we see. So it takes a little work to tweak a camera’s output into what you think you see: dark water, its surface reflecting the clouds above, its transparency letting you see a rotting leaf slowly sinking, its beads on a fresh leaf catching the sun.

The cold war led to the development of CCDs to act as eyes on spy satellites. In the same year, 1976 CE, that they were first used for this purpose, they were also used for astronomical observations. The very next year they were put into the Voyager satellites, our first eyes to travel to other planets. Kodak labs had developed the first CCD camera in 1975, but it wasn’t till 1988 that the first commercial digital camera became available. A digital sensor’s output holds enough information that details can be teased out of what looks at first like a perfectly black image. Even without using raw data, I could recover an image of a gaur (Bos gaurus) that I saw on a dark night in a forest. You see that imposing creature in the photo above. This was an old camera, so there is a lot of noise, but I like it that way. After all, even our eye and brain do not see too well in the dark.
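To make that concrete, here is a minimal sketch of the sort of shadow-lifting involved: a plain gamma correction written in Python with Pillow and NumPy. The file names are placeholders, and a true recovery from raw data would go through a raw converter first; this is just one plausible way to brighten an almost-black JPEG.

```python
# A minimal sketch: lift the shadows of a nearly black photo with a gamma
# correction. File names are placeholders, not the actual photo from the post.
import numpy as np
from PIL import Image

def lift_shadows(path: str, gamma: float = 3.0) -> Image.Image:
    """Raise pixel values to the power 1/gamma; gamma > 1 brightens dark tones most."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0
    lifted = np.power(img, 1.0 / gamma)
    return Image.fromarray((lifted * 255.0).round().astype(np.uint8))

if __name__ == "__main__":
    lift_shadows("dark_night.jpg").save("brightened.jpg")
```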

Colour perception is another whole kettle of fish. The simple RGB colour space that cameras use is a very crude approximation of what our eyes see. Actual human colour perception is still an active research area. So the images that come out of a camera require colour correction. As for the interaction of attention and colour, let’s not even go there. When I looked at this lotus pond inside a forest, my first reaction was to the bright red of the flowers. Only later did I realize that the number of insects on it was enormous. And was the water strider (Gerridae) there for the flower, or its shade? I forgot all about the red. To reproduce a semblance of this attention, I had to tweak the photo.
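This is not the edit I actually describe above, but as a hedged sketch of the idea: one way to mimic that shift of attention is to mute the saturation of everything except the near-red hues, so the flowers stand out while the rest of the pond recedes. Again, the file names are placeholders.

```python
# A sketch of a selective colour tweak: keep the reds saturated, mute the rest.
# Pillow stores hue on a 0-255 scale, so reddish hues sit near both ends of it.
import numpy as np
from PIL import Image

def emphasize_reds(path: str, mute: float = 0.35) -> Image.Image:
    hsv = np.asarray(Image.open(path).convert("HSV"), dtype=np.float64)
    hue, sat, val = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    reddish = (hue < 20) | (hue > 235)        # roughly within 30 degrees of pure red
    sat = np.where(reddish, sat, sat * mute)  # desaturate everything that is not red
    out = np.stack([hue, sat, val], axis=-1).round().astype(np.uint8)
    return Image.fromarray(out, mode="HSV").convert("RGB")

if __name__ == "__main__":
    emphasize_reds("lotus_pond.jpg").save("lotus_pond_tweaked.jpg")
```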

Lens artists will want to see the “originals” too. The step from raw to JPEG is all digital magic, so nothing is really original. For that matter, our eyes are not particularly great instruments, so the brain’s chemical and electrical magic is really needed to build up what we see as the world around us.