In the early nineties, while I was attending SPIE, SIGGRAPH, and Society for Information Display conferences, I heard a psychologist give a presentation on display bit depth. He claimed that, under the right conditions, humans could distinguish about 400 shades of gray, and that therefore 8 bits of grayscale resolution, which only allows 255 or 256 shades, depending on how you count, wasn’t enough. At the time, we had in the lab an incredibly expensive Barco display and graphics card that supported more than 8 bits per color plane, so that you’d have the full 256 levels in each color plane regardless of the way the display calibration was done. There was also a Radius card with 10 bits per color plane, but it wasn’t of much use because at the time Photoshop had limited support for greater-than-24-bit color. In those days the connection between the monitor and the graphics card was analog, so the expensive part of a 30-bit graphics setup was the graphics card; just about any monitor would, supposedly, benefit.
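For the curious, the arithmetic behind his claim is easy to check. Here’s a quick sketch (the 400-shade figure is his, taken at face value):

```python
import math

# Shades of gray the presenter claimed a viewer can distinguish
# under the right conditions (his figure, not mine).
distinguishable_shades = 400

# Smallest bit depth b such that 2**b >= 400 levels.
bits_needed = math.ceil(math.log2(distinguishable_shades))

print(2**8)         # 256 -- 8-bit grayscale falls short of 400
print(bits_needed)  # 9 -- so 9 bits is the minimum; 10-bit hardware gives 1024
```

By that reckoning, 8 bits per channel is just barely inadequate, and 10 bits is more than enough.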
I wasn’t convinced. The images at trade shows looked nice, but they always do. The lack of software to take full advantage of the high bit depths made the point moot.
Fast forward to the present. The displays are digital. There are lots of 30-bit graphics cards, and they’re not very expensive. However, the displays cost a bundle. A few years ago, a big 30-bit display was priced somewhere north of $20K. In the last year or so, some have been introduced for around $5K. NEC recently announced a 30-inch, 30-bit display with a color calibrator for half that.
Since I was spending a bundle on the Dell T7500, I said “what the heck,” and ordered the NEC PA301W BK SV (the name flows trippingly off the tongue, does it not?). After a week or so, a big box arrived. I set the monitor in place, and booted the computer.
After about five minutes, I finally found the place in the ATI graphics card software (Graphics>Workstation) to set the bit depth to 30 bits. In the checkbox label, ATI calls it 10 bits. Sounds like the engineers wrote the software and product management wrote the published specs.
There wasn’t much documentation on loading the display drivers in the material that came with the monitor. I tried to load the drivers from the SpectraView disc, but there weren’t any for the newest NEC monitors on it. I found them on another disc, but there was no installation software, so I had to load them manually.
I put the SpectraView disc back into the computer and installed the calibration software. I had a devil of a time getting the diffuser off the spectrophotometer, but I finally removed it without breaking anything. This is my third or fourth Eye-One, and I don’t remember any difficulty before – probably a manufacturing dimension tolerance issue. Calibration failed because of lack of communication with the monitor. I had read in the instructions that the software couldn’t talk to the monitor in the way it wanted with some display adapters, so I plugged in a USB cable from the computer to the monitor. That fixed it. I accepted the default brightness of 140 cd/m², even though I prefer 80 or 100 for good color matching between the monitor and the print. I’ve just become too spoiled by bright monitors, which are great for everything else.
It was time for the moment of truth. I launched Photoshop, created a 1600×600, 16-bit-per-color-plane RGB landscape image, and filled it with a horizontal gradient that started at 255,255,255 on the left and ended at 0,0,0 on the right. It looked pretty good to me. I went over to the old workstation, which has an NEC 2090WQXi (the last-generation, eight-bit-per-color-plane equivalent) and did the same thing. If I stared at the screen long enough, I could convince myself that there was a tiny bit of banding. I call the difference extremely subtle, and suggest that it’s not a sufficient reason for an upgrade for photographers. For graphic artists who work with smooth gradients a lot, it might be a different story.
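You can approximate the same test numerically. This sketch (my own illustration of ideal rounding, not what Photoshop or the display pipeline actually does internally) builds a 1600-pixel horizontal gradient and quantizes it at 8 and 10 bits, counting how many distinct levels survive and how wide the resulting bands are:

```python
WIDTH = 1600  # width of the test image described above

# An ideal horizontal gradient from white (1.0) to black (0.0).
gradient = [1.0 - i / (WIDTH - 1) for i in range(WIDTH)]

def quantize(values, bits):
    """Round each value to the nearest level displayable at this bit depth."""
    levels = 2**bits - 1
    return [round(v * levels) / levels for v in values]

for bits in (8, 10):
    q = quantize(gradient, bits)
    distinct = len(set(q))
    # Width in pixels of the widest run of identical values -- a rough
    # proxy for how visible a band would be.
    widest, run = 1, 1
    for a, b in zip(q, q[1:]):
        run = run + 1 if a == b else 1
        widest = max(widest, run)
    print(f"{bits}-bit: {distinct} distinct levels, widest band {widest} px")
```

At 8 bits, each of the 256 levels is stretched across roughly six or seven pixels, which is where the faint banding comes from; at 10 bits the bands shrink to a pixel or two, below what I could see.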