3D Match Moving

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Assignment 1

Final Result:

  • With locators, cube and patch
  • With cube and patch, no locators

Development

First of all, I save my project.

I import my sequence into 3D Equalizer.

Once the sequence is in my viewport, I go into Playback > Export Buffer Compression File. This caches the footage so it plays back smoothly without having to reload every frame, which will help me later on in my workflow.

I press overwrite.

Next, I set up my Camera Settings.

 


Once all of my tracking is finished, I open the deviation browser and go into Show Point Deviation Curves > All points. So far my overall deviation is 0.49, which isn’t bad but ideally I’d like to get it lower.

 

 

Lastly, I bake the scene by going into Edit > Bake Scene.

 

 

 

 

 

 

 

 

NUKE:

 

I start off by importing my 3DE camera and geometry/locators into NukeX to set up the scene inside Nuke.

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Assignment 2 – Track a shot & add CG elements to it

Assignment 2 Final Result:

Assignment 2 Breakdown

Project Setup

  • First, I launch 3D Equalizer and save a new project file to establish my working environment.

Import Footage

  • I then import the source footage into the project. Once the sequence is visible in the viewport, I go to Playback > Export Buffer Compression File. This caches the sequence so it plays back smoothly without reloading every frame, making the rest of the workflow faster.

Camera and Lens Settings

  • It’s crucial to configure the Camera and Lens Settings to match the real-world specifications of the camera used for the shot. This ensures accurate 3D tracking and scene reconstruction.

For this particular shot, captured with a Blackmagic Pocket Cinema Camera using a 24mm lens, the settings are:

    • Filmback Width: 1.896

    • Focal Length: 24 mm

    • Pixel Aspect Ratio: 1.0
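As a quick sanity check on these settings, the horizontal angle of view can be derived from the filmback width and focal length. This sketch assumes the filmback value is in centimetres (3DE's default unit), so 1.896 corresponds to 18.96 mm:

```python
import math

def horizontal_fov(filmback_width_mm, focal_length_mm):
    """Horizontal angle of view (degrees) from filmback width and focal length."""
    return math.degrees(2 * math.atan(filmback_width_mm / (2 * focal_length_mm)))

# Filmback of 1.896 cm = 18.96 mm, with the 24 mm lens:
fov = horizontal_fov(18.96, 24.0)  # roughly 43 degrees
```

A solved focal length that drifts far from this expected angle of view is usually a sign that the filmback or lens settings don't match the real camera.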

My camera settings.

My lens settings.

Begin Tracking

  • With everything configured, I start the tracking process. My goal is to establish a robust set of stable tracking points that require minimal manual adjustment, ensuring a more efficient and accurate matchmove later in the pipeline.

  • I placed approximately 40 tracking points using CTRL + left-click, focusing on areas that offered good initial track quality to minimize the need for corrections. Wherever possible, I concentrated the points on the ground plane, since that’s where the CG elements will be placed and stability in that region is critical.

All of my final tracking points.

Deviation

  • Once tracking is complete, I review the deviation to assess the stability of the track—lower deviation values indicate a more accurate and reliable result.
  • To start, I run Calc > Calculate All so that everything in my scene shows up in the browser. Then I go into Config > Deviation Browser to open the deviation browser. At first it only shows me a green graph, which is the average, so I go into Deviation Browser > All Points to show all of the points.

Opening the deviation browser.

Average deviation of all my points.

Once all calculations are complete, I select ‘Show All Points’ to display the deviation graph and corresponding values. Currently, I notice a few data points that are spiking, which I need to address, as they are skewing the average deviation upward. I remove all of the points that are spiking (unstable) and replace them with new ones that are tracked better.
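The review logic above can be sketched outside 3DE: given per-point deviations, flag any point that spikes well above the overall average as a candidate for removal and re-tracking. The point names, values, and the 2× threshold here are all hypothetical:

```python
# Hypothetical per-point average deviations in pixels (point id -> deviation).
deviations = {
    "point_11": 0.31, "point_12": 0.28, "point_13": 1.85,  # point_13 spikes
    "point_14": 0.35, "point_15": 0.40,
}

def flag_spiking_points(devs, factor=2.0):
    """Flag points whose deviation exceeds `factor` times the overall average."""
    avg = sum(devs.values()) / len(devs)
    return avg, [p for p, d in devs.items() if d > factor * avg]

avg, spiking = flag_spiking_points(deviations)

# Removing the flagged points (before re-tracking them) lowers the average.
kept = {p: d for p, d in deviations.items() if p not in spiking}
new_avg = sum(kept.values()) / len(kept)
```

One badly tracked point can dominate the average, which is why culling the spikes first gives a much more honest picture of the solve.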

Parameter Adjustment: Focal Length

To further polish my track, I go into the Parameter Adjustment window and remove all parameters except for the Focal Length parameter, which is the one I aim to improve. I then recalculate to apply the adjustment. To do that, I press Clear and then Adjust.


 

Exporting the 3D Equalizer Camera into Maya

Naming the camera export to Maya.

Exporting my camera track into Maya.

Exporting NukeX Lens Distortion Node

Lastly, I export the NukeX Lens Distortion node.

Opening the 3D Equalizer files in Maya

To open my 3D Equalizer scene with the tracked camera in Maya, I go into the folder where I saved my export and select the file I exported. I have to change File Type to All Files so that it shows up, as this is not a regular Maya file.

Once the scene has loaded up in Maya, I go into Panels > Perspective > Exported Camera to look through the lens of the camera.

Looking through the camera lens

Once I’m looking through my camera lens, I go to the Camera Settings > Image Plane Shape and select the UD (undistorted) version created in Nuke as the image plane.

The CG element I’ll be incorporating into this shot is a crab, originally created for the Rigging & Creature FX module. It will feature a simple up-and-down motion to simulate breathing. Since the focus of this module isn’t on animation or performance, I’ve opted for a more subtle and understated approach. My main goal is to ensure the crab aligns accurately with the ground plane and integrates naturally into the scene, with minimal visual deviation.

The crab was previously rigged in Maya using IK handles and skinning techniques, which I developed during a prior assignment. This groundwork made it straightforward to bring the asset into this scene. I imported the rig into my Maya scene and created a ground plane, applying the aiShadowMatte material to it so it could receive shadows realistically. I then aligned everything with the scene, positioned the crab as needed, and scaled it up to fit appropriately.

Finally, I created the simple breathing animation by moving the crab’s main control handles up and down to simulate subtle, rhythmic movement. 
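The up-and-down breathing motion amounts to a sine offset sampled per frame and keyed onto the control. This sketch only generates the curve values; the amplitude, period, and start frame are hypothetical, not the ones used in the actual scene:

```python
import math

def breathing_offset(frame, amplitude=0.2, period=48, start_frame=1001):
    """Vertical offset for a subtle breathing cycle: one breath every `period` frames."""
    t = (frame - start_frame) / period
    return amplitude * math.sin(2 * math.pi * t)

start = breathing_offset(1001)       # rest position at the first frame
half = breathing_offset(1001 + 12)   # quarter period: peak of the inhale
```

In Maya these values would be set as keyframes on the main control's translate Y, giving the rhythmic rise and fall without any hand animation per frame.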

After playing the footage through the scene I quickly realized that the tracking and footage are misaligned due to lens distortion. To correct this, I navigate to the Placement tab under the Image Plane Shape settings. There, I adjust the Size values by entering: =value*1.1. This scales both width and height by 1.1 to account for the lens distortion.

I repeat the same process under a different camera tab: Camera Shape (BlackMagic_Pocket_25mm_1_1Shape1).

In the Film Back section, under Camera Aperture, I again multiply the aperture values by 1.1.

This ensures the camera settings match the distorted footage and maintain consistency throughout the shot.
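The `=value*1.1` trick is just a uniform scale of both dimensions by the overscan factor. A minimal sketch of that arithmetic, with hypothetical aperture values (Maya's film back apertures are in inches):

```python
OVERSCAN = 1.1  # overscan factor compensating for the lens distortion

def scale_for_overscan(width, height, factor=OVERSCAN):
    """Scale image-plane size / camera aperture uniformly, as `=value*1.1` does in Maya."""
    return width * factor, height * factor

# Hypothetical film back aperture values (inches), scaled for the distorted footage:
w, h = scale_for_overscan(0.9449, 0.5315)
```

Scaling the image plane and the camera aperture by the same factor keeps the field of view consistent, so the CG lineup isn't thrown off by the enlarged plate.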

Lastly, before rendering the scene, I want to isolate the crab from the live-action footage by removing its background, ensuring only the crab is visible in the render, which will later be combined with the live action footage in Nuke. To accomplish this, I adjust the Display Mode setting in the ImagePlaneShape node, changing it from RGBA to None. This prevents the background from being rendered, allowing for a cleaner composite in Nuke.

ImagePlaneShape > Display Mode > RGBA to None.

No more background live action image.

Compositing:

 

 

 

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 1 – Nuke 3D & Camera Projection

Nuke 3D 

Nuke’s 3D space allows you to bring in or create 3D geometry, place cameras, add lights, and then render it all back into a 2D image. This is super useful for things like set extensions, camera projections, realistic lighting, and depth compositing.

Even though Nuke is primarily a 2D compositing tool, its 3D environment lets you simulate real-world setups so your 2D elements can interact more realistically with space and light.

Here are the core components we’ve been using in class:

1. 3D Geometry

You can import 3D models (like .obj or .abc files), or create simple shapes inside Nuke using nodes like:

  • Card – a flat plane, great for projections

  • Cube, Sphere, etc. – for basic modeling or placeholders

2. Cameras

The Camera node gives you a viewpoint into the 3D scene. You can:

  • Animate it manually

  • Import it from 3D software

  • Or create it using CameraTracker if you’re matching live footage

3. Lights

Lighting in Nuke works similarly to 3D applications. You can add:

  • PointLight – for localized lighting

  • DirectionalLight – for sun/moon-type setups

  • AmbientLight – for general scene illumination

4. Scene Assembly

All your 3D objects, lights, and cameras go into a Scene node, which acts like a container for your virtual environment.

5. ScanlineRender

This node converts your 3D setup into a 2D image. It’s where everything comes together — geometry, textures, lighting, and shadows — ready to be composited into your shot.

Exercise – Introduction to basic 3D objects in NukeX

I set my project settings to HD.

I press TAB to switch between the 2D and 3D space.

To start off, I bring in 2 geometries: a cube and a cylinder.

I bring in a checkerboard and colour bars node to project them onto my objects.

Colourbars node projected onto my cube.

Then, I bring in the Scene node, which works similarly to the Merge node used in Nuke 2D and allows me to add both of these objects to my scene and view them at the same time.

I bring a camera into my scene and adjust its settings so that it can show both of my objects.

I then set three keyframes: one at the start, one on frame 50 with the camera in the middle of the scene, and the last on frame 100 with the camera moved to the right, so that the end effect is the camera moving from left to right and showcasing the objects.
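That left-to-right move is just the camera's translate X interpolated between the keyframes. A minimal sketch of the interpolation (piecewise-linear here, whereas Nuke's default animation curves are smoothed; the key values are hypothetical):

```python
def lerp_keys(frame, keys):
    """Piecewise-linear interpolation between (frame, value) keyframes."""
    keys = sorted(keys)
    if frame <= keys[0][0]:
        return keys[0][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if frame <= f1:
            return v0 + (v1 - v0) * (frame - f0) / (f1 - f0)
    return keys[-1][1]

# Keyframes: left of the objects at frame 1, centred at 50, right at 100.
cam_x = [(1, -5.0), (50, 0.0), (100, 5.0)]
x_at_25 = lerp_keys(25, cam_x)  # partway through the move, still left of centre
```

Sampling the curve at any frame gives the camera position that frame, which is exactly what Nuke does under the hood when it evaluates the animated knob.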

I follow all of that up with a ScanlineRender node so that the view from my camera can also be viewed in 2D space, not just 3D.

Final view in 2D.

Final view in 3D.

Exercise – Tunnel Projection

Tunnel

I need one camera that constantly projects the image, and another that carries the movement.

We’ve got an image and a Project3D node.

I duplicate the 3D render camera, delete its animation, rename it as the projection camera, and apply it to a cylinder rotated 90 degrees in X with a height of 100.

The projection camera must always remain static.

 

 

Exercise – Prison Cell Projection

To start off, I import 5 pictures of different parts of the prison cell.

Then, to each of them, I add a Copy node whose B pipe I plug into the image; into the A pipe, I plug a Roto node followed by a Blur node to smooth the edges.

In each of the images, I roto out a different section of the prison cell corresponding to the image’s name; so in Back_Wall_Image_01 I roto out the back wall, and so on for all of the images.

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 2 – Nuke 3D Tracking

Parallax Movement

Point Clouds within 3D space

Lens distortion

Exercise 1 –

 

Camera Data

 

I bring in a Copy node, a Roto node, and a CameraTracker node.

 

In the CameraTracker Settings tab, I enable Preview Features and select a range of frames from 1001 to 1110; then, in the CameraTracker tab, I press Solve to solve my trackers over the same range.

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 3 – 3D Equalizer

Today in class we were introduced to tracking an entire scene in 3D Equalizer, and to how the workflow handles a high number of tracking points while accounting for deviation.

 

First of all, I import my footage into 3D Equalizer by going into Cameras > Live Footage > Browse.

It’s important to keep the camera settings within 3DE matching the actual camera that shot the footage.

To speed up my workflow, I go into Playback > Export Buffer Compression File. This caches the data so I can replay the sequence smoothly without it having to load every time.

Once it’s done exporting, I just press “Okay”.

I can now begin tracking my footage. To create a tracking point, I press Ctrl + Left Click on the desired location. Next, I gauge it by pressing G, allowing the software to analyze the point, and then I initiate tracking by pressing T. It’s generally best to start tracking from either the beginning or the end of the sequence, rather than the middle. If a track doesn’t continue through the entire sequence, I press E at the point where it ends to mark the end frame—this prevents the tracking point from lingering unused.

With that knowledge, I go ahead and create 40 tracking points, as that number should be sufficient for this shot.

Once I’m done with tracking all my points, it’s time to calculate the deviation. To do that, I have to go into Config > Add Horizontal Pane > Deviation Browser to open my deviation browser and start adjusting it.

When my deviation browser opens, it looks quite flat and doesn’t tell me anything. This is because I haven’t calculated my points and scene yet.

To fix the flat line deviation browser, I go into Calc > Calculate All Points in order to calculate my points so that they show up in the deviation browser.

Once all calculations are complete, I select ‘Show All Points’ to display the deviation graph and corresponding values. Currently, I notice a few data points that are spiking, which I need to address, as they are skewing the average deviation upward. At this point, the average deviation stands at 0.4224.

I identify that point number 13 is causing the issue. To prevent it from increasing the average deviation, I remove it and replace it with a new point that is better tracked. I repeat the same process for all other points that are peaking in the wrong places and are too shaky until I achieve my desired result.

Once the new point is in place, I go to Calc > Calculate All Points to ensure the change is reflected in the Deviation Browser.

I can now see that the deviation has decreased to 0.2573, which is noticeably lower than the previous value of 0.4224.

To further polish my track, I go into Windows > Parameter Adjustment Window.

In the Parameter Adjustment window, I remove all parameters except for the Focal Length parameter, which is the one I aim to improve. I then recalculate to apply the adjustment. To do that, I press Clear and then Adjust.

 

Once the focal length calculation is complete, the window displays the updated results and a new, even lower deviation of 0.1441. I then click Transfer Parameters to apply the changes and confirm the adjustment.

To transfer the changes to the deviation browser, I go to Calc > Calculate All again.

 

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 4 – Lenses & Cameras

 

Class Exercise:

 

1. Open 3D Equalizer and set the workspace to Basic

2. Import the footage and set the FPS to 25

 

3. Export Buffer Compression File

 

 

4. Set Up Lens Settings

5. Set Up Tracking Points (25)

Tracking Backwards, Forwards, Manually moving tracks, flip direction

 

6. Fixing up the deviation, removing points

 

7. Setting a camera constraint

 

8. Setting the Focal Length to adjust

 

In the Parameter Adjustment window, I remove all of the other parameters, then Focal Length > Clear > Adjust > Transfer Parameters, which lowers my deviation from 3 to 1.

 

Calculate All to apply the changes.

 

9. Setting the Lens Distortion mode to Classic 3D LD

In the Parameter Adjustment window > Clear > Adaptive All > Adjust > Transfer Parameters

 

 

10. Creating constraints based on survey data

 

 

 

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 6 – 3D Equalizer to Nuke/Maya Workflow

 

 

 

 

 

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 8 – Surveys 

Import footage

 

Set up 4 Surveys with the data

 

 

Add points to each survey point at the corresponding places on the footage

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 9 – Filming

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 10 – Nuke Maya Pipeline

Class Exercise:

 

Creating the footage for Maya

3D Equalizer Exports

First of all, I download the project from Blackboard and open it up

Open the project.

Export 3DE to Maya 

First of all, I export the project into Maya by going into 3DE4 > Export Project > Maya.

I set the startframe of the Maya export to 1001 and ensure all settings and directory are right.

Project successfully exported.

Export 3DE to Nuke LD

Secondly, I export the 3DE Lens Distortion into Nuke by going into 3DE4 > File > Export > Export Nuke LD_3DE4_Lens_Distortion Node.

Export settings for my Nuke 3DE LD. I set the startframe to 1001.

Export 3DE to Nuke

Lastly, I export the overall Nuke Scene by going into 3DE4 > Export Project > Nuke

Nuke Merging and Finalizing

My final Nuke Script

Final Result:

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━