
HDRI - High Dynamic Range Imaging

(Figs. 1, 2) The two images above were rendered using no lights at all. How was this possible? By using images (or light probes) to drive a radiosity calculation. This technique, when coupled with an HDR image, is often called an HDRI render. The term HDRI has been widely misunderstood, and despite the fact that this feature has been in Lightwave since version 6.0, it is still surrounded by mystery. While there is some good documentation on HDR images in general, Lightwave users find it hard to apply to Lightwave itself, as there does not appear to be much documentation yet to help even the experienced user through it. That is specifically why this tutorial exists. Before we jump into all the hype about HDRI, however, we must first cover some of the basics, because HDRI incorporates quite a few principles.

The better you understand how those principles operate, the better your tweaking and final results will be (not to mention troubleshooting!).

Radiosity

First, the concept of radiosity must be understood. Those who already know it can feel free to skip ahead. For those who don't, this is a quick, watered-down explanation (we'll leave an in-depth look at radiosity for an upcoming tutorial). What makes radiosity so enticing for 3D artists is the realism it creates. There are two major elements to radiosity. One is that you get nice, soft shadows. The other is that you get bouncing light. This bouncing effect is what can add huge render times to a scene, but it can be well worth it. To explain further: if we follow rays of light from a source, we see them travel until they hit an object. However, light does not stop as soon as it hits an object (unless that object is a black hole). Some of the light may be absorbed, but the rest is reflected and continues on a new path. Eventually these rays hit other surfaces, where some light is absorbed and some bounces again, and the process continues. Applications such as Lightwave not only give you radiosity, but (as of version 7.5) also allow you to select the number of bounces. Generally this is set to one or two; further bounces usually add very little improvement but drastically increase render times.

No Lights

And now, for something more interesting: you can effectively light your scene without using any lights at all. Well, not in the conventional sense. While radiosity CAN use the normal lights that come with Lightwave, such as area lights, this is not always needed. Any surface with a luminosity value will be counted as its own light source. This means you could even slap an image onto one or more polygons and have it determine how light is emitted; the texture will even control the color of the rays. You may have done something similar in the past. So what is the big deal, you may ask? Well, let me explain something about HDR images. (By the way, there is no such thing as an "HDRI image"; that is the same as saying "HDR image image". The acronym HDRI stands for High Dynamic Range Image.) Most conventional images are 24-bit or 32-bit (with an added alpha channel). These are considered low dynamic range images. They consist of 8 bits per channel: 8 red, 8 green and 8 blue bits make up each pixel and determine its overall look. For those who have forgotten binary mathematics, 2^8 = 256, so each 8-bit channel can hold only 256 possible levels of shade for that color. There is nothing drastically wrong with this; in fact, your common computer output devices work fine with it. A setting of (0, 0, 0) we recognize as pure black, and (255, 255, 255) is pure white, with the steps in between being whole numbers. For normal viewing we can get along fine with this format; in real life, however, there are far more shades than 256 levels.
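To make those numbers concrete, here is a tiny Python snippet (purely illustrative, and nothing to do with Lightwave itself) showing where the 256 levels come from and what an 8-bit channel does to values it cannot represent:

# Each channel of a 24-bit image is a single byte: 2^8 = 256 possible levels.
levels_per_channel = 2 ** 8                   # 256
bits_per_pixel = 3 * 8                        # 8 red + 8 green + 8 blue = 24
total_colors = levels_per_channel ** 3        # 16,777,216 combinations

# Every stored value must be a whole number from 0 to 255, so fractions are
# rounded away and anything brighter than "white" is simply clipped.
def to_8bit(value):
    return max(0, min(255, round(value)))

print(to_8bit(128.7))   # 129 - the fraction is lost
print(to_8bit(300.0))   # 255 - everything brighter than white is clipped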

If you were to look away from the sun one instant and directly into it the next, you would quickly be reminded of this fact. In fact, when rendering, the internal variations are often much greater than the levels a 24-bit image can hold; they are squashed down, so to speak (non-linearized), in order to display in the image viewer. You may have figured out where we are going with this. Yes, an HDR image contains much more information than a 24-bit or 32-bit one. Not only can each channel hold floating-point (fractional) values, it can also hold numbers that greatly exceed 255; a channel value of 942.32 is perfectly legal. So even though your eye cannot see it on screen, the image itself may contain detail in bright areas that you do not detect. Conversely, the dark areas may still hold valuable information that you cannot currently see; if you were to crank up the brightness, it would become evident, as demonstrated below:

LDRI (Normal view)

HDRI (Normal View)

LDRI (32x Brightness)

HDRI (32x Brightness)

LDRI (1/32x Brightness)

HDRI (1/32x Brightness)

LDRI (Motion Blur)

HDRI (Motion Blur)

(Figs. 3 - 10) Viewed on a normal display, there is no apparent difference between the normal image and the HDRI; they appear to be exactly the same image as far as one can tell. When increasing the brightness, the difference may look subtle here, but the HDRI contains more detail in the dark areas that previously appeared only as blackness. The difference is much more dramatic when decreasing the brightness: while the LDRI looks almost pure black, the HDRI still has its hot spots shining through with strong intensity. A similar difference can be seen with motion blur. While the hot spots in the LDRI become diluted, those in the HDRI once again shine through with strong intensity, creating a much more realistic look. These samples all show the power of HDR images. Now that you understand how HDR images behave, we will apply them to rendering. When an HDR image is used as a light source in Lightwave, we get far more control over the lighting of a scene than our limiting 24-bit images allow. For example, if we take an HDR image of a scene and map it onto an inverted sphere placed around our environment, the lighting from that texture will light our scene much more realistically than the 24-bit type would. Of course, this does not mean you HAVE to use HDR images to make something realistic; it is just another tool in your Lightwave arsenal for those special lighting situations. Renders lit by HDR images are not only realistic but also very simple to set up, as no complex lighting rigs are needed. The only major problem with this approach is waiting for the render.
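For those who prefer numbers to pictures, here is a minimal Python/NumPy sketch of what the brightness comparisons above are really doing (the pixel values are invented purely for illustration): the LDR pixel can never record anything brighter than display white, so dividing it by 32 leaves almost nothing, while the HDR pixel's true intensity survives the division and still clips to full white.

import numpy as np

# Display white is 1.0.  An LDR image cannot store anything brighter, so a
# bright highlight is clipped to 1.0.  An HDR image keeps the real intensity,
# which may be far above 1.0.  (These values are made up for illustration.)
ldr_hotspot = np.float32(1.0)     # as bright as an LDR pixel can ever say
hdr_hotspot = np.float32(60.0)    # the highlight's actual relative intensity

def display(value, brightness):
    # Scale the pixel, then clamp it to the 0..1 range a monitor can show.
    return float(np.clip(value * brightness, 0.0, 1.0))

# Normal view: both look identical - pure white.
print(display(ldr_hotspot, 1.0), display(hdr_hotspot, 1.0))      # 1.0   1.0

# 1/32 brightness: the LDR highlight fades to near-black, while the HDR
# highlight still clamps to full white, just like the 1/32x figures above.
print(display(ldr_hotspot, 1/32), display(hdr_hotspot, 1/32))    # ~0.03  1.0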

Light Probes

A quicker method to light your scene is to simply use a light probe in the Background settings of Layout. A light probe is not an object that needs to be loaded; it is just the term for a specific kind of HDR image. You may have seen this kind of image from time to time on the net; they have an interesting spherical look to them. The reason they look spherical is that they have been designed to map AROUND your environment in a specific way. For those interested in how these images are made, they are most often created by photographing the REFLECTION on a mirrored or chrome ball. Yes, that's correct: many people end up buying silver-like balls, putting them on poles, and photographing the reflection on them to obtain a truly surround image (well, almost 100% surround). Often the photographer will shoot from angles 90 degrees apart and then stitch the images together, which prevents the reflection of the camera itself from showing up in the final result. Even when stitching is not applied, a single HDR image is usually composed from many separate photos taken at different exposure settings; about four to eight is a common number. When done, they are all combined into one image, which gives enough definition to reconstruct an accurate dynamic range.
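To give a rough idea of what that combining step involves, here is a simplified Python/NumPy sketch (the weighting scheme and the sample values are assumptions for illustration only, not the exact algorithm used by any particular HDR tool): each exposure is divided by its exposure time to estimate scene radiance, and the estimates are averaged with weights that trust mid-range pixels more than clipped or underexposed ones.

import numpy as np

def merge_exposures(images, exposure_times):
    # Naively merge bracketed LDR exposures into one HDR radiance map.
    # images:          list of float arrays in the range 0..1 (same size)
    # exposure_times:  shutter time of each photo, in seconds
    numerator = np.zeros_like(images[0], dtype=np.float64)
    denominator = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # Trust mid-tones most; a small floor keeps fully clipped or fully
        # black pixels from being thrown away entirely.
        weight = np.clip(1.0 - np.abs(2.0 * img - 1.0), 0.05, 1.0)
        # Dividing by exposure time turns pixel values into (relative)
        # scene radiance estimates.
        numerator += weight * (img / t)
        denominator += weight
    return numerator / denominator

# Three made-up exposures of the same 2x2 patch of sky:
exposures = [np.array([[0.02, 0.10], [0.50, 1.00]]),   # 1/30 s
             np.array([[0.08, 0.40], [1.00, 1.00]]),   # 1/8 s
             np.array([[0.30, 1.00], [1.00, 1.00]])]   # 1/2 s
hdr = merge_exposures(exposures, [1/30, 1/8, 1/2])
print(hdr)   # floating-point radiance values, no longer limited to 0..1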

Light Probes in Lightwave

Now, from theory to practice in Lightwave. Incorporating a light probe is most likely one of the easiest things to do (yes, I am being honest here). Once your scene has been loaded, access the Effects > Background options by pressing CTRL-F5. It is in this panel that you need to install the Image World plug-in: just click on the Add Environment drop-down and select Image World. Now the plug-in is installed, but we still need to add an actual HDR image. Click on the Image World text to highlight it, then click on Edit > Properties; or, to make things simpler, just double-click on the Image World text.

Now, load your image from the Light Probe Image drop-down menu. This is where a lot of people have trouble: if the HDR image is not already present in the list, users are confused as to how to load it. The answer is simple: open the generic Image Editor (CTRL-F4) and load your image from there. This adds the image to the Light Probe Image drop-down, so you can go back and select it. Note: from the Image World properties you can tweak HDRI settings such as brightness and rotation values. Don't worry about messing things up too badly by experimenting; as mentioned, this feature is very simple and quick to learn.

When you render, don't forget to turn down the properties of any normal lights, and make sure that radiosity is turned on. Forgetting this is another common first-time mistake. P.S. Of the two schoolhouse images at the top of this tutorial, one was lit with an HDRI and the other with an LDRI. Can you guess which is which? Can't tell? Well, the blue sky was a regular 24-bit image taken from the texture gallery at Highpoly3d.com, while the reddish sky texture was the HDRI, courtesy of Dan Ablan. This shows that you do not always need a high dynamic range image for a light probe; always adapt to your situation.
