I’ve been doing the synthetic slit scan images for a couple of months now. That series requires me to make many exposures. A typical run is 30,000 to 40,000 images exposed at one-second intervals over the course of a day. I can’t go over 40,000, for reasons I’ll get to later on. I use the silent shutter on my Sony a7RII to save wear and tear on the mechanical focal-plane shutter. As an aside, the camera does not include silent-shutter shots in the cumulative shot count that it writes in the EXIF metadata of its files.
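For a sense of scale, the back-of-the-envelope arithmetic on those runs looks like this (the one-second interval is the only input; nothing here is camera-specific):

```python
# How long a continuous one-shot-per-second run takes,
# for the low and high ends of a typical session.
for frames in (30_000, 40_000):
    hours = frames / 3600  # one exposure per second
    print(f"{frames:,} frames \u2248 {hours:.1f} hours")
```

So a 40,000-frame run is a bit over eleven hours of uninterrupted shooting.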
I now have well over a million shots on this camera. That’s more than many photographers expose in a lifetime. So this is probably a good place to reflect on how reliable the SD cards have been.
I use 256 GB and 512 GB Lexar Professional and SanDisk Extreme Pro cards, in spite of the fact that 512 GB cards are not officially supported by the camera. I format each card in-camera every time I insert it into the Sony’s SD slot. I read the cards into the striped SSD array that I use for processing the images, via a Lexar Professional Workflow HR2 four-bay minitower whose USB 3 port is connected to an Anker 7-port USB 3 hub. I have had reliability problems with other USB flash readers, but never with this one, in spite of the extreme number of images that I’ve put through it.
I know that conventional wisdom is to use a bunch of small flash cards instead of one enormous one. I’ve never bought into that way of thinking, because I think the time when you are most likely to get a file system failure is when you’re removing a card from the camera or reader. Therefore, in all my photography, I use large cards and try to change them out as infrequently as possible. In the case of the synthetic slit scan images, I didn’t have a choice: because I was capturing long continuous series of images, I needed to use big cards to avoid gaps in the exposure sequences.
There are those who would say I am making a mistake by putting all my eggs in one basket, but I prefer to do exactly that, and then be very careful with the basket.
So, what’s been the result of this risky behavior when used with so many captures? I have lost precisely zero images due to file system issues. There have been glitches, to be sure. When I was using the Vello adapter to control the diaphragm in Nikon E (electromagnetic aperture) lenses, roughly once every three or four thousand shots the lens failed to stop all the way down. I solved that by not using that adapter for this series.
Also, if I try to put more than a bit over 40,000 images on an SD card, the camera stops writing data, complaining on the LCD display that “The image database file is full.” This happens with both the 256 GB and the 512 GB SD cards, although less often with the smaller cards, since they often fill up completely before the image count reaches 40,000 (I use the 18 MP image size with Extra Fine JPEG most of the time, which gets me a continuous frame rate of one shot per second). Although this error prevents the camera from writing any more images to the SD card, it does not damage the images already there.
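The interplay of the two limits is easy to sketch. The 10 MB average file size I use below is an assumption for an 18 MP Extra Fine JPEG, not a measurement, but it shows why the smaller cards tend to run out of space before they hit the database ceiling:

```python
# Which limit binds first: card capacity, or the ~40,000-frame
# image-database ceiling? AVG_FILE_MB is an assumed average size
# for an 18 MP Extra Fine JPEG, not a measured value.
AVG_FILE_MB = 10
DB_LIMIT = 40_000

for card_gb in (256, 512):
    # Cards are labeled in decimal gigabytes, so 1 GB = 1000 MB here.
    frames_by_capacity = card_gb * 1000 // AVG_FILE_MB
    which = "capacity" if frames_by_capacity < DB_LIMIT else "database limit"
    print(f"{card_gb} GB card: ~{frames_by_capacity:,} frames fit; "
          f"the {which} binds first")
```

Under that assumption, a 256 GB card holds roughly 25,600 frames and fills before the database complains, while a 512 GB card could hold about 51,200 and runs into the 40,000-image ceiling instead.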
Before I started this project, I had probably exposed fewer than a million digital captures in my entire career, and I’ve been making digital photographs since 1991 (although in 1991, we used spinning rust disks slung over our shoulders, not flash). Up to that point, I’d never lost an image due to file system failure. Now I’ve more than doubled that count, and my record is still perfect in that regard.
There are two possibilities, and I think both are true.
- I’ve been lucky.
- Treated right, flash memory can be very reliable.