Image-Based Lighting Using High Dynamic Range Photography
Image-based lighting from HDRI maps is a clever way to recreate things digitally with lighting subtle enough that they feel as if they were photographed under the original conditions.
I’ll write a basic explanation, but for more detail I highly recommend reading through some of Paul Debevec’s work. He is one of those rare geniuses.
The basic idea of IBL, or Image-Based Lighting, is to photograph a mirrored or chrome sphere, called a ‘light probe’, along with your background or clean plate. The chrome ball gives you (and your computer) all the information you need about the lighting conditions of the scene you are placing your virtual object into.
Above I’ve placed an image of the chrome spheres I made to use as my own light probes.
Amazon sells these ‘gazing balls’, which are made of highly polished, mirror-finish steel; link
I asked around, and a friend who used to refinish them at Kerner Optical told me, “We used to use the Plastikote Grey Sandable primer on reference balls at ILM.”
So I painted one side grey, and drilled each ball to accept a rod to make them easier to hold.
The other part of this is HDRI, which stands for ‘high dynamic range imaging’. This is the practice of taking photographs from the same position at varied exposures so they can be combined to give you a wider range of detail than the medium you are shooting on can hold. Think of it this way: generally a ‘good exposure’ is one where you can see the subject well, with balanced lights and darks, but there may be parts that fall into shadow and parts where the light blows out slightly. Those lost details in the lights and shadows are what has fallen outside the dynamic range of the camera. Now bracket this exposure with one exposed for the dark parts: you’ll see more detail in the shadows, but the lights will blow out even further. The other side of the bracket is the image balanced for the lights; in that image the darks fall off into shadow. You can see how combining these gives you far more information about the light in that scene.
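The merging step can be sketched in a few lines of plain Python. This is a toy illustration of the idea, not what any particular HDR tool does internally: each bracket's pixel value is divided by its shutter time to estimate radiance, and the estimates are averaged with a 'hat' weight so clipped shadows and blown-out highlights contribute nothing. All names and the specific weighting function are my own assumptions.

```python
# Toy sketch of merging bracketed exposures into one radiance estimate.
# Assumes pixel values are linear and normalized to [0, 1]; 'exposures'
# holds the shutter time (in seconds) for each bracket.

def hat_weight(z, low=0.05, high=0.95):
    """Trust mid-tones the most; ignore clipped shadows and highlights."""
    if z <= low or z >= high:
        return 0.0
    return 1.0 - abs(2.0 * z - 1.0)

def merge_hdr(pixel_stack, exposures):
    """Weighted average of per-exposure radiance estimates (value / time)."""
    num = den = 0.0
    for z, t in zip(pixel_stack, exposures):
        w = hat_weight(z)
        num += w * (z / t)
        den += w
    return num / den if den else 0.0

# Same scene point shot at 1/4 s, 1 s, and 4 s; the 4 s bracket is
# blown out (0.99), so only the two usable brackets are combined.
radiance = merge_hdr([0.10, 0.40, 0.99], [0.25, 1.0, 4.0])
print(radiance)
```

The key point is the division by shutter time: a pixel reading 0.10 at 1/4 s and one reading 0.40 at 1 s describe the same scene radiance, which is why the brackets can be fused into a single value that no single exposure could hold.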
The software I’ve been using, Reallusion’s iClone, has built-in IBL and makes it really easy, but this process is available in many other applications as well, including 3ds Max, Maya, and Blender; the idea is the same everywhere.
This diagram shows, in essence, how the software handles image-based lighting: it takes the image you created from your light probe and maps its pixels to directions of light emanating from the scene.
I’ve borrowed this image from Reallusion’s article explaining their version of IBL LINK
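That pixel-to-direction mapping can be sketched with basic vector math. This is not Reallusion's implementation, just the standard geometry of a mirror ball: each pixel sees the sphere's surface normal at that point, and reflecting the camera's viewing ray about that normal tells you which direction in the scene the light came from. The coordinate conventions here are my own assumptions.

```python
import math

def probe_direction(u, v):
    """
    Map a pixel on a mirror-ball photo to a world light direction.
    (u, v) run from -1 to 1 across the ball; the camera is assumed
    to look down the -Z axis. Returns None outside the ball's edge.
    """
    r2 = u * u + v * v
    if r2 > 1.0:
        return None  # pixel is off the sphere
    # Surface normal of the sphere at this pixel (unit length).
    n = (u, v, math.sqrt(1.0 - r2))
    # Reflect the viewing ray about the normal: r = view - 2(view.n)n
    view = (0.0, 0.0, -1.0)
    dot = sum(a * b for a, b in zip(view, n))
    return tuple(a - 2.0 * dot * b for a, b in zip(view, n))

# The very center of the ball reflects straight back at the camera,
# so it samples the light directly behind the photographer.
print(probe_direction(0.0, 0.0))
```

This is also why a single photo of a chrome ball captures nearly the whole environment: pixels near the ball's silhouette reflect light arriving from almost directly behind the sphere.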
The example scene:
I’m trying this test with some of the digital stuntmen I created a while ago –
Without IBL, using 3D virtual lights to approximate the look of the background; I did this by hand in a little under an hour:
This was a straight ‘out of the box’ application of lighting based on the chrome sphere:
Here I’m using the image to light the models, but I’ve turned down the strength of the image’s effect and turned up the ambient occlusion:
Now, it isn’t as dramatic as the hand-lit scene, but it’s a lot more subtle and I was able to do it much faster. Perhaps more scientific testing is in order, but I’d wager it is more realistic as well.
My endgame is to replicate lighting conditions digitally, and also to dynamically relight models and live actors on the ‘Holodeck’, my bluescreen stage, using something like this – http://gl.ict.usc.edu/HDRShop/lightgen/