This is the beginning of a series of posts on synthetic slit scan photography (defined below). To see the other posts in this series, scroll down to the bottom of the page and click on the permalinks (below the comments) to navigate to each.
As long-time readers know, I have been a diligent searcher (and, occasionally, a finder) of ways to introduce the dimension of time into my photographs. One way that I’ve done that is with slit scan photographs.
For those of you new to slit scans, the idea is that one dimension of the image (vertical or horizontal) is distance, like a normal photograph, while the other dimension is time, quite unlike a normal photograph but much like the images from finish-line cameras at racetracks. At first, I made slit scan photographs with a real slit scan camera, the Betterlight Super 6K. Then I started to synthesize them, writing computer programs that simulate the results of a slit scan camera from a video file.
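Conceptually, the video version boils down to pulling the same column of pixels out of every frame and stacking those columns side by side. Here's a minimal Python sketch of that idea, assuming OpenCV is available; the file name and slit position are placeholders, not my actual code:

```python
import cv2
import numpy as np

SLIT_X = 960  # fixed column sampled from every frame; placeholder value

cap = cv2.VideoCapture("clip.mp4")
columns = []
while True:
    ok, frame = cap.read()              # frame is height x width x 3 (BGR)
    if not ok:
        break
    columns.append(frame[:, SLIT_X:SLIT_X + 1])
cap.release()

# One column per frame, stacked left to right: the horizontal axis is time.
cv2.imwrite("slitscan_from_video.png", np.concatenate(columns, axis=1))
```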
In the last few days, I’ve embarked on another synthetic slit scan journey. This time, I am making the direction in the photograph that’s not strictly space a blend of space and time. I’m doing this by writing programs to process groups of still images from a Sony a7RII.
Confused? I don’t blame you. The easiest way to see what I’m doing is with a picture:
This image was constructed from 7500 conventional images, using a one-pixel wide column from each image. The time span is from before sunrise until after sunset the day before yesterday. The earliest capture is the leftmost column in the picture above, and the latest is the rightmost. The ISO setting and exposure were constant for all the exposures.
Not going to inspire anyone, is it? I admit that, but I’m hoping that it’s the beginning of something.
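For the curious, the assembly step itself is conceptually simple. Here's a minimal Python sketch of it, assuming the captures have been exported as sequentially numbered JPEGs; the file pattern, slit position, and slit width are placeholders, not the actual workflow:

```python
import glob
from PIL import Image
import numpy as np

SLIT_X = 2000      # horizontal position of the slit in each frame (placeholder)
SLIT_WIDTH = 1     # pixels taken from each capture

columns = []
for path in sorted(glob.glob("frames/frame_*.jpg")):
    with Image.open(path) as im:
        arr = np.asarray(im)                     # height x width x 3
        columns.append(arr[:, SLIT_X:SLIT_X + SLIT_WIDTH])

# Earliest capture becomes the leftmost column, latest the rightmost.
result = np.concatenate(columns, axis=1)
Image.fromarray(result).save("slitscan_from_stills.jpg")
```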
Here’s another one of these, with the captures made over a three-hour period and five-pixel-wide columns from each exposure assembled into the final product:
Everything looks pretty normal, except the clouds. If you analyze what’s going on with them, you’ll figure out that the winds aloft were blowing away from me. The above image was made before I figured out that you have to turn off autoexposure to make this work well.
I have lots more work to do here. Eventually, I’d like to make the distance and time relationships of the axis perpendicular to the synthetic slit completely arbitrary. That’s going to take a lot of programming.
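To give a flavor of what I mean by arbitrary: the output column index could be run through any function that decides which frame and which source column to sample. A toy sketch, with an illustrative linear mapping and placeholder names (and which naively holds every frame in memory, so it's only suited to small test files):

```python
import glob
from PIL import Image
import numpy as np

frames = [np.asarray(Image.open(p))
          for p in sorted(glob.glob("frames/frame_*.jpg"))]
n_frames = len(frames)
height, width = frames[0].shape[:2]

def mapping(x):
    """Example mapping: time advances left to right, space tracks x."""
    t = x * (n_frames - 1) // (width - 1)
    return t, x

# Build the output one column at a time from the (frame, column) pairs.
out = np.zeros_like(frames[0])
for x in range(width):
    t, col = mapping(x)
    out[:, x] = frames[t][:, col]

Image.fromarray(out).save("space_time_blend.jpg")
```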
One of the difficulties with what I’m doing here is the size and number of the files, which translates into long processing times. With 42 MP files and a one-pixel-wide slit, it actually takes longer to process a series of captures than it does to make the exposures in the first place. I will probably be doing my initial work with much smaller files.
Jack Hogan says
Cool Jim, a slit scan time lapse.
Max Berlin says
Leaving the shore takes imagination first, then courage. You have both, Jim. I’m interested to see where this lands.
Lynn Allan says
I’m curious if you have pre-visualized what this will look like for target scenes. “Staccato’ish”-looking urban locations? Storms?
I’m experiencing a “poverty of imagination” on how these will turn out, but I’m looking forward to what you come up with.
Jim says
For urban locations with people and cars, you have to make the captures faster than the a7RII can record single images to its flash card. Video is the way to go here, I think. I haven’t experimented with video except with static slits.
http://blog.kasson.com/?p=10586
For people in cities, I’m thinking you need capture rates of at least 120 fps for 4K, and maybe as high as 1000 fps. Cameras that can do that are pretty dear, although I could experiment at lower resolutions.
I’m playing with clouds this morning.
Part of the fascination of this kind of thing for me is that I don’t really know what I’m going to get.
Jim