I recently had cataract surgery: two weeks ago on my left eye, and yesterday on my right. So for two weeks, I got to walk around comparing colors seen with the treated eye to those seen with the untreated one. Cataracts are known to block short wavelengths to a greater extent than long ones, so I expected things to look bluer with my new lens. To some extent, that was the case, but what I saw was highly dependent on the lighting. With incandescent lights, or even 2700 K LEDs (which are known to have a strong blue component), things didn’t look all that different. Even under direct sunlight, the contrasts were far from dramatic.
In open shade, the differences were larger. When I started looking at different surface colors under bluish lighting, I noticed something that surprised me. I expected that things would look less yellow with my new lens. That was the case. But what I didn’t expect was that some things looked more magenta. Since I didn’t think that my cataract was filtering out much red light, that made me scratch my head. With some experimentation, I finally figured it out. What was producing the increased sensation of both blue and red was the extreme short-wavelength end of the visible spectrum, which excites both the beta (bluish) and rho (nominally reddish, but in reality, far from that) cone cells. When you look at a rainbow, it is these wavelengths that you see as violet light.
The fact that two different spectra can look the same is called metamerism. Metamerism is the result of the way the eye reduces a many-dimensional spectrum to a three-dimensional color. When two colors match for one observer but not for another, that is called observer metameric failure, and that’s what I was seeing, if you consider each of my eyes an observer. If two colors match for an observer but not for a camera, that is called capture metameric failure. I decided to compare each of my eyes to a camera on a particularly striking example that evoked metameric failure between my two eyes.
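That dimensionality reduction can be made concrete with a small numerical sketch. Everything below is invented for illustration — Gaussian stand-ins for cone sensitivities and a sigmoid stand-in for a cataract, not real colorimetric data. The idea: build two different spectra that give identical responses for one set of cone curves (a metameric pair), then show that an observer whose lens attenuates short wavelengths no longer sees a match.

```python
import numpy as np

# Toy illustration of metamerism: the eye projects a many-dimensional
# spectrum onto just three cone responses, so different spectra can
# map to the same color. All curves here are made-up Gaussians, not
# real cone fundamentals.
wl = np.linspace(400, 700, 301)          # wavelength axis, nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

S = np.vstack([
    gauss(445, 25),                      # "beta" (short-wavelength) cone
    gauss(540, 35),                      # "gamma" (medium) cone
    gauss(565, 40),                      # "rho" (long) cone
])

spec1 = gauss(520, 60)                   # a smooth reference spectrum

# Build a "metameric black": a violet spike with its cone-visible
# component projected out, so adding it changes the spectrum but
# leaves all three cone responses unchanged.
bump = gauss(430, 10)
coeffs = np.linalg.solve(S @ S.T, S @ bump)
black = bump - S.T @ coeffs              # S @ black ≈ 0
spec2 = spec1 + black                    # different spectrum, same color

print(np.allclose(S @ spec1, S @ spec2))          # True: a metameric pair

# A second observer whose lens attenuates short wavelengths
# (a crude cataract model) no longer sees the two spectra as equal.
cataract = 1 / (1 + np.exp(-(wl - 470) / 15))     # blocks the blue end
S_cat = S * cataract
print(np.allclose(S_cat @ spec1, S_cat @ spec2))  # False: observer metameric failure
```

The same construction explains capture metameric failure: replace the second set of curves with a camera's channel sensitivities and the match can break in exactly the same way.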
Outside my bedroom is a stained redwood deck lit most times of the day by blue sky. Access is through sliding doors with UV-absorbing glass. If you open a door and look at the deck, part of it is seen directly, and part through two layers of the glass. With my untreated eye, the redwood looked brown both seen directly and through the glass. With my new lens, the deck was brown through the glass but had a distinctly purple tinge when observed directly. This was the result of the glass in the door blocking some short-wavelength light that mattered to my new lens but was mostly blocked by my cataract anyway.
When I aimed a Sony a9 at the deck, this is what I saw:
The part with no glass is on the right. The camera sees the scene close to the way the eye with the cataract did, with some difference between what are, to me, two different shades of brown. With my new lens, the part of the deck on the right is distinctly purplish. That the camera can’t respond in the red channel to the violet part of the spectrum is not a surprise. Very few cameras can do that. That my deck is such a strong reflector of those wavelengths was unexpected.
This behavior is not limited to man-made dyes. I now see it all over the place. Here’s a view of some hills on the other side of the valley I live in:
Look at the areas with the dark vegetation. They are cooler on the left side, which has two layers of glass, but the differences aren’t all that great. With my new lens, the vegetation on the right has a pronounced violet cast.
As I was seeking examples of places where my new lens and old one saw color differently, I became concerned about how I had been editing images. Would all the work I’ve done over the past decade now look wrong? That would be horrifying.
Fortunately, there appears to be no problem. The phosphors used for CRTs and LCD displays don’t seem to have any appreciable amount of really short-wavelength light. The combinations of the illuminants and the pigments I use for prints don’t seem to, either.
Whew!
CarVac says
It seems like the most noticeable metameric failure comes from wavelengths at the edges of the visible spectrum, like near-IR turning foliage brown with some ND filter/camera combinations, or in this case the camera not rendering the violet that you see.
Sheila Murphy says
I had cataract surgery in July on one eye and am waiting for a while to do the next (another story). In the meantime I see “blue” with the left eye and “yellow” with the right eye, but the brain adjusts to where, with both eyes, my personal “white balance” is somewhere in the middle. Pretty funny. Color is so subjective when perceived by a human brain.
AndrewZ says
If you get a didymium filter of some sort (Hoya red intensifier) there’s lots of fun to be had. In that case the filter itself changes colour depending on what light it’s viewed under, not to mention the images on the camera. It removes a portion of the yellow part of the spectrum, so spectral orange-yellow is removed (true for most natural objects), but combined yellow, like the pure yellow of LCD displays and prints, is unaffected. So if you have a real lemon and a picture of one next to it, the real lemon will look orange through the filter while the picture lemon will remain yellow. It intensifies red because most natural red has a yellow component which the yellow-green cone in the eye is sensitive to. By removing the input from the yellow-green cone, your eye saturates the red more. In fact it works so well that some have tried making glasses to help people with red-green colour blindness (at exorbitant prices).
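The notch-filter effect AndrewZ describes can be sketched numerically. This is a toy model with made-up Gaussian spectra and an invented filter shape, not real Hoya or display data: a monochromatic yellow near 580 nm loses most of its energy to the notch, while an additive yellow built from green and red primaries passes almost untouched.

```python
import numpy as np

# Sketch of the didymium notch-filter effect: a band of absorption
# around 580 nm kills spectral yellow but barely touches additive
# (green + red) yellow. All numbers are invented for illustration.
wl = np.linspace(400, 700, 301)          # wavelength axis, nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

spectral_yellow = gauss(580, 8)                    # e.g. light off a real lemon (roughly)
additive_yellow = gauss(540, 12) + gauss(620, 12)  # display-style yellow: green + red

# Didymium-style transmission: passes everything except a band at ~580 nm.
notch = 1 - 0.97 * gauss(580, 15)

def energy(spec):
    return spec.sum()                    # total energy on this wavelength grid

print(energy(spectral_yellow * notch) / energy(spectral_yellow))  # ~0.14: mostly removed
print(energy(additive_yellow * notch) / energy(additive_yellow))  # ~0.91: barely affected
```

This is the same metamerism story as in the post: the two yellows match for the eye but have very different spectra, so a filter acting on one narrow band separates them.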