Week 1 12/02/25
Nuke isn’t just 2D software but also 3D; it was famously used on James Cameron’s Avatar. By pressing Tab in the Viewer I can switch between the 2D and 3D views.
The two cameras below are 3D nodes.
Render_cam – the camera the final scene is rendered through.
Projection_cam – holds the camera data tracked from the original shot and is used to project imagery.
Rather than bringing in massive files like FBXs with textures, this setup uses cards that imagery is projected onto. In this case I have a door being projected onto a card.
The ScanlineRender node combines multiple 3D assets, such as cameras and cards, and renders them back into a 2D image.
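To make the setup concrete, here’s a minimal sketch of this projection setup in Nuke’s Python API. The file path and node names are placeholders I’ve made up; the node classes are the stock ones.

```python
import nuke

# The imagery to project (placeholder path)
read = nuke.nodes.Read(file='path/to/door_plate.####.exr')

# Camera holding the tracked shot data
projection_cam = nuke.nodes.Camera2(name='Projection_cam')

# Camera the final scene is rendered through
render_cam = nuke.nodes.Camera2(name='Render_cam')

# Project the plate through the projection camera...
project = nuke.nodes.Project3D()     # Project3D2 in newer Nuke versions
project.setInput(0, read)            # img input
project.setInput(1, projection_cam)  # cam input

# ...onto a card placed in 3D space
card = nuke.nodes.Card2()
card.setInput(0, project)

# Scene gathers the 3D objects; ScanlineRender turns them back into 2D
scene = nuke.nodes.Scene()
scene.setInput(0, card)

render = nuke.nodes.ScanlineRender()
render.setInput(1, scene)       # obj/scn input
render.setInput(2, render_cam)  # cam input
```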
I tried putting together multiple cards from a single image. I rotoscoped out parts of the original image, such as the ceiling, the floor, and the left, right and back walls.
After turning these pieces into cards, I carefully placed them to rebuild the original image in 3D, so that the camera can move into the scene. As you can see, the left and right walls were stretched to give this distorted yet real-looking clip.
I will say the only thing that ruins this clip for me is how it breaks apart at the end, with the left wall showing a gap. This was a pain to put together: I kept flipping between the camera view and the 3D view trying to get the walls right.
The Merge nodes link all of the imagery back together.
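As a rough idea of how the cards and merges fit together by script, here’s a sketch; the transform values and node names are made up for illustration, not the ones from my scene.

```python
import nuke

# One card per roto'd piece of the room; transform values are illustrative only
floor = nuke.nodes.Card2(name='Floor_card')
floor['rotate'].setValue([-90, 0, 0])      # lay the card flat
floor['translate'].setValue([0, -0.5, 0])  # drop it below eye level

left_wall = nuke.nodes.Card2(name='LeftWall_card')
left_wall['rotate'].setValue([0, 90, 0])   # turn the card to face inward
left_wall['translate'].setValue([-1, 0, 0])

# In 2D, a Merge comps one stream over another (B = input 0, A = input 1)
merge = nuke.nodes.Merge2(operation='over')
merge.setInput(0, nuke.toNode('Read1'))            # placeholder node name
merge.setInput(1, nuke.toNode('ScanlineRender1'))  # placeholder node name
```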
This was a great exercise for me. I really enjoy using Nuke and learning new nodes and features.
Week 2 19/2/25
3D matchmoving is the process of combining a 3D sequence with a backplate. The matchmove software makes sure the footage isn’t wafting about while the camera turns, locking the 3D elements to the footage.
This work involves a lot of tracking and placing cards in specific spots that work well with the camera.
Scattering points and tracking
The purple rectangle, a Reformat node, is a very important node. Cameras can have a range of lens distortion, and this node adjusts the resolution to make room for correcting it. You can see the bottom of the footage below is curved, which is a result of the reformat stage.
Roto-ing the area the tracker will latch on to, making sure that the track points land on non-moving areas such as buildings and the clear blue sky. Water and pedestrians are two things the trackers need to avoid.
Switching the CameraTracker’s mask to source alpha makes the markers fit inside the roto, as it’s the only shape in the alpha channel.
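Here’s a sketch of that wiring by script (CameraTracker needs NukeX). I’m assuming the mask dropdown is exposed as the ‘mask’ knob with a ‘Source Alpha’ entry, so check the actual knob name in your Nuke build before relying on this.

```python
import nuke

read = nuke.nodes.Read(file='path/to/shot.####.exr')  # placeholder path

# Roto drawn over the static areas (buildings, sky); the shape lands
# in the alpha channel of the stream
roto = nuke.nodes.Roto()
roto.setInput(0, read)

tracker = nuke.nodes.CameraTracker()
tracker.setInput(0, roto)

# Restrict tracking to the roto'd alpha. Assumed knob/value names -
# verify them in the node's properties panel.
tracker['mask'].setValue('Source Alpha')
```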
The only issue I can see with this scatter track is that I think one of the points is tracking a bird. Still, there are so many points scattered that I don’t think it will matter.
I changed the range from 1001 to 1110 because there’s not much point in tracking the entire clip.
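For reference, the equivalent range change by script; this sets the project-level range, assuming that’s what the tracker picks up (the CameraTracker also has its own range settings in its properties).

```python
import nuke

# Limit the working range to the part of the clip worth tracking
nuke.root()['first_frame'].setValue(1001)
nuke.root()['last_frame'].setValue(1110)
```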
I then clicked Track and then Solve, creating lots of different-coloured tracking points.
The AutoTracks tab has options to delete the failed tracking markers. This is incredibly useful for getting rid of the points with large errors as well as the ones with slighter tracking issues.
Then click Update Solve for a cleaner track.
There are still bad points, which need to be removed again on the AutoTracks tab.
Weeks 3–4 26/3/25
Using 3DEqualizer
3DEqualizer looks like confusing software at first; however, I found it to be quite straightforward. It is used for fixing lens distortion as well as working out the spatial layout of a scene so CG elements can be added.
The first thing I did when opening 3DEqualizer was set the layout. Under environments, located at the bottom left of the screen, I set the layout to basic and closed some of the tabs.
To open up a file, click under the camera in the top left.
Click browse and select the video file, while also changing the FPS to the correct setting based on the video and the camera used to film.
Set the correct shortcut keys.
Hold Ctrl and left-click on an area I want to track. A square will appear with a dot in the middle. The dot in the middle is what tracks the pixels in the video, so adjusting the square and the dot is crucial to getting the best tracking result.
When happy with the tracker placement, press ‘G’ to gauge the marker. This stabilizes the tracker; then press ‘T’ to track.
The video in this exercise deliberately had incredibly poor camera movement and no tracking markers. This was so we could practise tracking poor-quality footage.
Week 6 Taking the tracking data from 3DEqualizer into Nuke
Continuing on from last week, we started by exporting the previous week’s work.
First, bake the scene.
Next export to Nuke and change the start frame to 1001. Export the file to cam.
Then export the Nuke LD_3DE4 lens distortion node.
Add the file to the undistort folder.
Select all the Geo objects (the points).
Export as OBJ
Labeled points in the GEO folder
Select cylinder
Export 3D Models as OBJ
End of exporting the files from 3DEqualizer for Nuke
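On the Nuke side, the exported pieces can be brought in like this; a minimal sketch with placeholder paths. The camera comes in as a .nk script, and the points and cylinder come in as OBJs through ReadGeo nodes.

```python
import nuke

# The camera exported from 3DEqualizer is a Nuke script; pasting it
# drops the baked Camera node into the current script
nuke.nodePaste('path/to/cam.nk')  # placeholder path

# The tracked points and the cylinder come in as OBJ files
points_geo = nuke.nodes.ReadGeo2(file='path/to/points.obj')
cylinder_geo = nuke.nodes.ReadGeo2(file='path/to/cylinder.obj')
```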
Assignment 1
Week 8 Survey points
Week 9
Filming
Week 10
3DEqualizer, exporting the trackers and cameras
Open and set up the 3DEqualizer scene.
Open the already tracked project file in the DATA > 3D folder.
After opening the file, go to the camera settings and open up the video.
Adjust settings such as camera type and FPS.
Import the playback buffer compression file.
Export Maya
Change the start frame and save to a sensible location on the PC.
Make sure the starting frame matches up. I changed the starting frame from 1 to 1001, which is the first frame number of this video.
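If footage ever needs shifting on the Nuke side instead, a TimeOffset node does the same job; a small sketch, with the offset value assuming a frame-1 start.

```python
import nuke

# Shift footage that starts at frame 1 so it lines up with a 1001 start
offset = nuke.nodes.TimeOffset()
offset['time_offset'].setValue(1000)  # frame 1 + 1000 = 1001
```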
End of 3DEqualizer
Set up the settings for the scene in Nuke.
Add a BlackOutside node, which puts a black border around the footage.
Download the lens distortion node from the 3DEqualizer website.
Add the lens distortion plugin.
The footage expands after adding the distortion fix.
Add a Reformat node, with the type set to scale and the scale set to 1.1.
Also change the resize type to none. With the Reformat node the video will reshape to its correct scale.
Bring a Write node into the scene to export for Maya.
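Putting the Nuke steps together, here’s a sketch of the whole chain by script. All paths are placeholders, and the lens distortion node’s class name depends on the model chosen in 3DEqualizer; the classic model is assumed here. In practice you’d paste the .nk exported from 3DEqualizer, which has the distortion values baked in.

```python
import nuke

read = nuke.nodes.Read(file='path/to/plate.####.dpx')  # placeholder path

# Black border so the edges stay clean when the distortion warps them
black = nuke.nodes.BlackOutside()
black.setInput(0, read)

# Lens distortion node from the 3DE plugin kit (must be installed);
# the class name assumes the classic distortion model was used
undistort = nuke.nodes.LD_3DE_Classic_LD_Model()
undistort.setInput(0, black)

# Reformat set to scale 1.1 with resize 'none' to give the expanded image room
reformat = nuke.nodes.Reformat()
reformat['type'].setValue('scale')
reformat['scale'].setValue(1.1)
reformat['resize'].setValue('none')
reformat.setInput(0, undistort)

# Write node to render the undistorted plate out for Maya
write = nuke.nodes.Write(file='path/to/undistorted.####.exr')
write.setInput(0, reformat)
```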
End of Nuke part 1
Assignment 1
I had already attempted this assignment; however, the trackers weren’t accurate enough, so I had to redo it. I had also forgotten to add the camera settings correctly. Below are the settings I added based on the camera used.