I have this new idea that sounds like a lot of work.
Inspiration came from a movie called Nope, which I finally got around to watching yesterday. A cool movie, not an outstanding one, but very entertaining. More comedy than horror. Anyway, what I found interesting is a technique the filmmakers used for the night shots.
Since the film is set in a very rural environment that is completely devoid of light when it’s dark, cinematographer Hoyte van Hoytema was faced with the challenge of getting enough brightness into the night shots.
He decided to shoot the scenes during the day with two cameras at the same time – a normal film camera and a camera that only records infrared. Then certain channels were combined to create a believable image of a night scene.
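I don't know the actual compositing pipeline used on Nope, but the basic idea of mixing an IR channel into a visible-light frame and grading it toward a cool, dark look can be sketched in a few lines. Everything here (the blend weight, the blue tint) is my own toy assumption, not the film's recipe:

```python
import numpy as np

def day_for_night(vis, ir, ir_weight=0.6, night_tint=(0.35, 0.45, 0.9)):
    """Toy day-for-night blend: mix IR luminance into the visible
    channels, then tint the result toward blue and darken it.
    vis: (H, W, 3) floats in [0, 1]; ir: (H, W) floats in [0, 1]."""
    ir3 = ir[..., None]                      # broadcast IR over all 3 channels
    mixed = (1 - ir_weight) * vis + ir_weight * ir3
    night = mixed * np.asarray(night_tint)   # cool, darkened "night" tint
    return np.clip(night, 0.0, 1.0)

# Tiny synthetic example with random "daylight" and "IR" frames
vis = np.random.rand(4, 4, 3)
ir = np.random.rand(4, 4)
out = day_for_night(vis, ir)
```

The point is only that once both frames are registered to each other, the combination itself is trivial arithmetic; the hard part is getting two aligned frames in the first place, which is what the rest of this post is about.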
Incidentally, this is not the first time van Hoytema has used infrared techniques. The moon scenes in Ad Astra were also enhanced with IR to make them more otherworldly.
Anyway, I find the idea of combining two cameras very interesting, because with full-spectrum photography you're often a little limited by having just one sensor. With two sensors, for example, you could do authentic Aerochrome imagery without any problems.
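With a two-sensor setup, the classic Aerochrome false-color mapping (infrared into the red channel, visible red into green, visible green into blue) becomes a simple channel shuffle. A minimal sketch, assuming a registered visible RGB frame from one camera and an IR frame from the other:

```python
import numpy as np

def aerochrome(vis_rgb, ir):
    """Aerochrome-style false color from a two-sensor capture:
    IR -> red, visible red -> green, visible green -> blue.
    vis_rgb: (H, W, 3); ir: (H, W); both floats in [0, 1]."""
    out = np.empty_like(vis_rgb)
    out[..., 0] = ir               # infrared drives the red channel
    out[..., 1] = vis_rgb[..., 0]  # visible red becomes green
    out[..., 2] = vis_rgb[..., 1]  # visible green becomes blue
    return out

# Example: a pure-red visible scene plus a mid-gray IR response
vis = np.zeros((2, 2, 3))
vis[..., 0] = 1.0
ir = np.full((2, 2), 0.5)
false_color = aerochrome(vis, ir)
```

The visible blue channel is simply discarded, which matches how the original film stock worked: three layers sensitized to green, red, and infrared, printed as blue, green, and red.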
You could also do an analogue of this with UV, where two channels would come from visible light and another from UV. Or UV and IR. Sure, there are filters that pass UV and IR, or UV and a range of visible light, but visible light and IR drown out UV to such an extent that the UV influence disappears completely. That wouldn't be the case with two sensors. The bottom line is that it would open up completely new possibilities.
It would be very cool if there were cameras with two or more sensors. They exist for video cameras, but not for DSLRs.
So you have to use two cameras. I see three possibilities. But since I'm at the very beginning, I make no guarantees about any of this: I have no idea whether I'm seeing everything important or overlooking other options.
1. Take a tripod, position the camera, and press the shutter with a hot mirror (or filter 1) attached. Then change the filter (UV-only, for example) and press the shutter again, with the same settings and the same framing. The big problem: the scene changes between exposures. Even minimal cloud or leaf movement will be visible. So this is the worst option.
2. Put two identical cameras with identical lenses together so that the sensors and lenses are as close as possible: camera 1 with filter 1, camera 2 with filter 2. Then synchronize the shutter releases. The cameras should be as small as possible; they could be attached to each other at the bottom. The image from camera 2 would then have to be rotated 180°, which is a small problem. A much bigger one: the parallax effect. No matter how close the lenses are, the two images will never match 100%. This would work acceptably only with lenses of 50 mm or longer and with distant landscape subjects.
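To get a feeling for how bad the parallax actually is, a pinhole-model estimate is enough: the image shift in pixels is roughly the focal length (in pixels) times the lens separation divided by the subject distance. The numbers below (50 mm lens, 5 µm pixel pitch, lenses 10 cm apart) are assumptions I picked for illustration, not measurements of any real rig:

```python
def parallax_px(focal_mm, pixel_pitch_um, baseline_m, distance_m):
    """Approximate image shift (in pixels) between two side-by-side
    cameras for a subject at a given distance (pinhole camera model)."""
    focal_px = focal_mm / (pixel_pitch_um / 1000.0)  # focal length in pixels
    return focal_px * baseline_m / distance_m

# Assumed rig: 50 mm lens, 5 um pixels, lenses 10 cm apart
shift_100m = parallax_px(50, 5, 0.10, 100)    # subject 100 m away
shift_1km = parallax_px(50, 5, 0.10, 1000)    # subject 1 km away
print(shift_100m, shift_1km)
```

With those assumptions, a subject 100 m away is offset by about 10 pixels between the two frames, while at 1 km it drops to about 1 pixel, which is why this option only really works for distant landscapes.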
3. The Nope technique: use a beamsplitter or a hot mirror. I need to do more research on what exactly would be needed, but the idea is a filter or prism that lets part of the light (e.g. visible) pass through to camera 1 and directs another part (e.g. UV) in a different direction toward camera 2. Both cameras would be positioned relative to each other and to the beamsplitter. But there are problems. 1. No idea how easy it is to get such a beamsplitter or how expensive they are. 2. The cameras have to sit on a stable mount so that everything stays fixed and aligned; logistically that would be rather difficult. 3. I suspect the light passing through the prism or filter would be attenuated. With UV, that would be a problem.
Well – variant 2 is my favorite at the moment. It is relatively easy to implement. I will keep you posted.