I won't stop until I get the perfect 3D looking clouds. They are the biggest challenge. — Tomas Griger
Realistic planet Earth in Nuke
I have always liked building models of planet Earth in various 3D software tools. Every time I’ve had a chance to try a new software package, my first attempt at rendering something has usually been planet Earth. So quite naturally, when I enrolled in Compozitive Academy, I was curious whether I could produce a superb Hollywood-style planet Earth in Nuke.
The node-based workflow and Nuke’s ability to work with huge files have proven to be a great combination for creating a truly high-quality planet. In this article, I will summarize the basic principles and techniques I used to make realistic images and videos of planet Earth from space. At the beginning I set one important condition: I didn’t want to make a simple projection. I wanted a real, complete model of the planet, one where I could place my camera anywhere in the world, at any angle, and render a realistic image without any extra manual adjustments for each individual shot. While Nuke isn’t a dedicated 3D tool and it wouldn’t make sense to try building a full 3D model of Earth within it, I wanted results as if I had a full-blown 3D model right inside Nuke. I will now describe how I got there, while learning Nuke from scratch at Compozitive Academy, class by class.
The very first step in producing any good image is finding suitable references. Today we have plenty of footage from ISS astronauts, including videos and time-lapses. I gathered a lot of real shots of different places on Earth. What colour is the ocean, and how does light reflect from its surface? What do mountains look like from 400 km up? What is the colour of the desert, snow, fields, forests? What do clouds, fog, atmospheric haze and other atmospheric phenomena look like from space? What does the atmosphere-to-space transition look like? And finally, at night, can we see city lights, and if so, what do they look like? Even though the planet seen from space is not something we encounter daily, and none of us has seen it with our own eyes, our brain is remarkably good at judging whether such an image looks realistic.
Example images from the web used as references.
Last day of the 3D Earth – ▶️ video.
Since I was creating a model of a real planet, our planet, I had to use actual maps of its surface. To begin with, I used the NASA textures available as free downloads in the Blue Marble collection. These cloudless maps of the Earth’s surface have a decent resolution of 84k × 42k pixels. Not great, not terrible. Such resolution is enough to display Earth from ISS altitude (approx. 400 km) quite comfortably in HD, and in 4K only just. I am currently experimenting with producing my own textures from raw satellite data, which are on a whole new level in terms of quality and resolution. But even with the free NASA maps, one can achieve nice, realistic results.
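To see why 84k × 42k is "4K only just", it helps to work out the ground sample distance of such an equirectangular map. This is a rough back-of-the-envelope sketch (my own illustrative numbers, not the author's calculation):

```python
# Ground sample distance of an 84k-wide equirectangular Earth texture
# (illustrative arithmetic, not from the article).
EARTH_CIRCUMFERENCE_KM = 40075.0   # equatorial circumference
TEXTURE_WIDTH_PX = 84000           # Blue Marble-class map width

km_per_texel = EARTH_CIRCUMFERENCE_KM / TEXTURE_WIDTH_PX
print(f"{km_per_texel:.3f} km per texel at the equator")

# A wide view from orbit spanning ~2000 km of ground maps to roughly
# 4K-worth of texels, i.e. about 1:1 with a 4K frame -- no headroom left.
ground_span_km = 2000.0            # assumed visible ground width
texels_across = ground_span_km / km_per_texel
print(f"~{texels_across:.0f} texels across a {ground_span_km:.0f} km view")
```

So for wide establishing shots the texture roughly matches a 4K frame pixel-for-pixel, while tighter framings start to magnify the texels.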
Examples of textures used in this project.
A good CGI Earth is not only a texture on a 3D sphere. It is a mixture of various interconnected textures:
– RGB Surface texture of the Earth itself
– Normal Map of Earth relief, or a Height Map
– Water / land mask for specular reflections
– Bathymetry – map of sea depths – as Height Map, but for oceans
– Map of night city lights
– Map of clouds
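How these layers might come together can be sketched per pixel. This is a hypothetical, simplified shading combination of my own, not the author's actual Nuke node graph; the inputs mirror the texture list above:

```python
# Minimal per-pixel sketch of how the texture layers could combine
# (hypothetical shading model, not the actual node graph).
def shade_pixel(surface_rgb, night_lights_rgb, cloud_alpha, water_mask,
                specular, day_factor):
    """All inputs are in 0..1; day_factor comes from the sun direction."""
    # Specular highlight only where the water mask says 'water'.
    base = [c + water_mask * specular for c in surface_rgb]
    # White clouds composited over the surface.
    lit = [cloud_alpha * 1.0 + (1.0 - cloud_alpha) * c for c in base]
    # Day side shows the lit surface, night side shows city lights.
    return [day_factor * l + (1.0 - day_factor) * n
            for l, n in zip(lit, night_lights_rgb)]

# Example: an ocean pixel at midday under a thin cloud layer.
px = shade_pixel(surface_rgb=[0.05, 0.15, 0.30],
                 night_lights_rgb=[0.0, 0.0, 0.0],
                 cloud_alpha=0.3, water_mask=1.0,
                 specular=0.2, day_factor=1.0)
print([round(c, 3) for c in px])
```

In Nuke, each of these arithmetic steps would correspond to Merge and Grade operations driven by the mask textures.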
Unfortunately, most of the available textures were not good enough for what I needed. Let’s assume that 84k × 42k px is a suitable size for our use; even so, not all textures can be found at such a high resolution. The most important texture, for example, is the one with the clouds. But there is no such thing as a real cloud texture, because clouds are always in motion and no satellite can photograph all of them at a single moment. What NASA offers is a composite of individual clouds stitched together: it is low quality, of insufficient resolution, and not free of errors. It is full of visible clones, long seam lines and obvious pixelation. So I had to redo the whole cloud map in Nuke. I also used AI software to enlarge the existing clouds and achieve a much higher-quality texture. It really made a difference; see the example below.
Test 01 with photo
Test 02 with photo
We then applied this technique to our cloud texture to get detail we never had before, and this was the result.
After all the preparations, I could start working on the planet itself. Because Nuke’s main focus is not 3D, it is best to move into 2D space as soon as possible. To begin with, I pasted each texture onto a sphere and created a ScanlineRender node which played the role of a precomp. In addition to rendering all the textures, some extra precomps had to be rendered as well. I rendered several different Noise passes, which I used to break up homogeneous surfaces like the oceans, or to produce random atmospheric phenomena such as haze in the valleys around mountains, or morning mist over lowlands.
Examples of precomps.
An important precomp is also the one that determines the division between day and night. Here I used a technique in which I illuminated a 3D sphere with a directional red light from one side and a blue light from the other. Using combinations of Shuffle and Merge nodes, I could then display certain things only during the day, others only at night, or grade things like the clouds differently for day and night. Simply by rotating the lights around the 3D sphere, I can change day and night in the final render.
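The red/blue light trick works because each directional light effectively bakes a Lambert dot product into its own channel. A minimal sketch of the underlying math (my own illustration of the idea, not Nuke code):

```python
# Two opposed directional lights bake the day/night terminator into two
# channels: red = day weight, blue = night weight (illustrative sketch).
def day_night_channels(normal, sun_dir):
    """Lambert shading from two opposed directional lights on a sphere."""
    dot = float(sum(n * s for n, s in zip(normal, sun_dir)))
    red = max(0.0, dot)     # lit by the 'sun' (day) light
    blue = max(0.0, -dot)   # lit by the opposite (night) light
    return red, blue

# A point facing the sun vs. a point facing away from it:
print(day_night_channels((1.0, 0.0, 0.0), (1.0, 0.0, 0.0)))   # (1.0, 0.0)
print(day_night_channels((-1.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # (0.0, 1.0)
```

Shuffling the red channel out of this render gives a ready-made day mask, and the blue channel a night mask, with a soft falloff across the terminator.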
Generic script for multiple shots