Week 1 – Nuke 3D & Camera Projection
Now we switch to the 3D side of Nuke. We're introduced to a new set of nodes, as well as the 3D space in which these nodes operate.
When setting up a camera and combining 2D and 3D elements in Nuke, a few key factors determine how closely the virtual camera matches the real camera that captured the footage:
- Focal Length
- Resolution
- Sensor
- Colour space usage
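To see why focal length and sensor size have to be entered together, here is a minimal sketch (plain Python, not Nuke's own API) of the standard pinhole relationship between focal length, sensor (filmback) width, and horizontal field of view. If either value is wrong, the virtual camera's field of view will not match the plate.

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm):
    """Horizontal field of view (degrees) for a pinhole camera model.

    Nuke's Camera node combines focal length with the horizontal
    aperture (sensor width) in the same way, which is why both values
    must match the real camera for the 3D scene to line up.
    """
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Example: a 35 mm lens on a full-frame (36 mm wide) sensor gives a
# wider view than a 50 mm lens on the same sensor.
wide = horizontal_fov(35.0, 36.0)
normal = horizontal_fov(50.0, 36.0)
```

Note that the same focal length on a smaller sensor gives a narrower field of view, which is why the sensor entry in the list above matters just as much as the lens.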
Week 2 – Nuke 3D Tracking
Week 3 – 3D Equaliser
Week 4 – Lenses & Cameras
Over the years, match-moving lenses and cameras have evolved, along with their ability to capture light onto various types of film, which in turn introduced more aspect ratios.
Week 5 – Prep for Assignment 2
Today’s exercise is also early preparation for our second assignment. We went outdoors with specific filming equipment and were assigned roles, taking turns to capture our own footage and to shoot a sequence of images of the area for an accurate HDRI.
The footage is then brought into Nuke for tracking, so that any model built in Maya can be applied to the tracking data. With the HDRI blended from the captured images, the Maya model receives accurate environmental lighting that matches the footage.
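The HDRI blending step works by combining the bracketed exposures into one radiance estimate per pixel. Below is a minimal sketch of that idea in plain Python, assuming a linear camera response and pre-aligned images; real HDRI tools also handle response-curve recovery and alignment. The function name and hat-shaped weighting are illustrative assumptions, not any specific tool's method.

```python
def merge_hdr(exposures):
    """Merge bracketed LDR exposures into HDR radiance values.

    `exposures` is a list of (pixel_values, shutter_time) pairs over the
    same pixels, with pixel values in 0..1 and assumed linear response.
    Each sample's radiance estimate is value / shutter_time, weighted by
    a hat function that distrusts clipped highlights and noisy shadows.
    """
    def weight(v):
        # Peak trust at mid-grey (v = 0.5), near zero at the extremes.
        return max(1e-6, 1.0 - abs(2.0 * v - 1.0))

    n = len(exposures[0][0])
    radiance = []
    for i in range(n):
        num = den = 0.0
        for values, shutter in exposures:
            v = values[i]
            w = weight(v)
            num += w * (v / shutter)
            den += w
        radiance.append(num / den)
    return radiance

# Two brackets of the same scene: the short exposure reads half as
# bright, so both agree on the underlying radiance.
hdr = merge_hdr([([0.5], 1.0), ([0.25], 0.5)])
```

This is why several exposures of the same view are needed: a single photo clips the sky or crushes the shadows, while the merged result keeps usable light information at both ends for the environment lighting.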
Creativity is key, and now that the footage is out of the way, we can start modelling different elements according to the time period. In this case,
Week 6 – 3D Equaliser Freeflow & Nuke
Assignment 1 – 3D Equaliser tracking to Nuke cleanup patch / workflow process
Exporting the camera data, locators & cube to Nuke