Rigging and Creature FX

Week 1

This week we were introduced to the module and the assessments we would need to complete.

Our first assessment is to create a robot character and apply motion capture data to it. For this we will need to:

  • Record motion capture data, which we will do collectively in the studio
  • Create our individual robot characters using ZBrush and Maya, with texturing in Substance Painter
  • Apply the motion capture data to the character that we have created.

The first thing we learnt was the foundation of this module: how to translate motion capture data onto a 3D model in Maya.

Here is a video of me doing this process; I have also broken each stage down into steps and written my process below.

Video

I would like to comment on the reason for the spikes on the forearms of the character in the video when the animation is played through. This is called “spiking” and it happens because some of the vertices in the mesh don’t get bound to the joints during the skinning process, leaving little spikes behind. As this is an automated process, this is one of the problems that can arise. Adjustments can be made to joint placement to try to avoid it, but from my practice I feel it is a mixture of trial and error and getting more used to the system. As can be seen in the render further down the page, I did manage to fix this issue there.

Written step by step

To do this we started by loading the motion capture data into Maya. This is just a standard FBX file, which is very useful as Maya has no problem interpreting it.

There are a couple of interesting things about the file though: one is that the scale is much larger than in normal Maya projects, and secondly the frame rate is set at 100fps. This isn’t a format I am used to working in within Maya, but it makes sense for motion capture.
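Since the mocap runs at 100fps, frame numbers won’t line up with a standard 24fps Maya scene. A quick sketch (plain Python, my own illustration rather than anything from class) of how the same moment in time lands on different frame numbers at different rates:

```python
def convert_frame(frame, src_fps, dst_fps):
    """Map a frame number from one frame rate to another, keeping the time identical."""
    return frame * dst_fps / src_fps

# Two seconds of mocap is frame 200 at 100fps...
print(convert_frame(200, 100, 24))  # -> 48.0 (the same moment is frame 48 at 24fps)
```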

What mocap data looks like straight from the FBX file: the green points are tracking markers and cameras. The size of the default grid shows the scale.

Another attribute of this file is that it holds a lot of data about the motion capture system, for example where the cameras and markers are. This is great for flexibility, and this data can be used and manipulated, but for the purposes of this task we didn’t need it so we simply deleted it.

An example of some of the excess data that is included in the FBX mocap file. All of it could be useful in other circumstances but not for our project.

After this we zeroed the model out, which means making sure it starts in the correct place and can be rigged correctly. To do this we selected the hips, pressed “select hierarchy” to select the branches coming off of them, and set all the rotation values to 0. This laid the character flat, so we then rotated just the hips 90 degrees.
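A rough sketch (plain Python rather than Maya commands, with a made-up three-joint hierarchy) of what “select hierarchy and zero the rotations” is doing: walking from the hips down every branch and resetting each joint’s rotate values.

```python
def zero_rotations(joint_name, skeleton):
    """Recursively zero the rotation of a joint and every joint beneath it."""
    skeleton[joint_name]["rotate"] = [0.0, 0.0, 0.0]
    for child in skeleton[joint_name]["children"]:
        zero_rotations(child, skeleton)

# Tiny example hierarchy: hips -> spine -> head
skeleton = {
    "hips":  {"rotate": [10.0, 0.0, 5.0], "children": ["spine"]},
    "spine": {"rotate": [0.0, 20.0, 0.0], "children": ["head"]},
    "head":  {"rotate": [3.0, 0.0, 0.0],  "children": []},
}
zero_rotations("hips", skeleton)
print(skeleton["spine"]["rotate"])  # -> [0.0, 0.0, 0.0]
```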

Once the motion capture was prepped we turned to a handy tool in Maya called Human IK. Within this window the option to pick for the motion capture is “create character definition”. This defines the movement. From here a skeleton diagram is shown and we simply have to click on a bone in our motion capture, then right click on the corresponding bone in the diagram and click “assign selected bone”. This is basically telling Maya where to place that movement in the skeleton it creates on our model.

The character definition tool in the Human IK window. When all bones have been assigned correctly the skeleton goes green and we can lock it.

Once this has been done correctly and thoroughly enough that Maya has enough data, the movement is locked to the skeleton and we rename this character (as named in the Human IK window). I’ve called mine “vicon” as that was the system we used in class.

That is the end of the motion capture portion. Now we can import the geometry the mocap will be applied to and set about making a skeleton.

For this we select the whole character, reset the Human IK window, and this time choose the “Quick Rig Tool”, which opens up tools to make the skeleton. We use the step-by-step option and add the character and the geometry. For the embed method we use “Polygon Soup” as this is a high-poly mesh. We then click the create button to create guides.

This creates joints in the skeleton automatically. These need some adjusting, particularly around the elbows, but the changes can be made in the viewport and mirrored in the Quick Rig panel.

Moving the automatically created joints to more fitting positions. The mirror control to make sure they’re symmetrical is in the “user adjustment of guides” panel

The result of clicking create in the skeleton generation section (after creating and placing joints)

We then tell Quick Rig to make a skeleton only, rather than a skeleton and a rig. Finally we bind the skeleton to the geometry in a process called skinning. All of these steps are carried out in the same window.

The result from the final step after the previous 2 images, binding the skin to the joint system, under the skinning tab

We rename that character to “skeleton”.

The result is two characters in the Human IK window: one for the motion capture data and the other for the character.

To combine them we look at the top of the Human IK window and, under the Character dropdown, select “skeleton”, then under Source select “vicon”.

This will apply vicon motion data to this skeleton.

This was a really interesting thing for me to learn and it definitely helped me see the range of possibilities for creating content using this system.

Render of skeleton model with motion capture data applied

We also repeated this process with a troll model. This was both to familiarise ourselves with the process through repetition and to see the same mocap data used on two different models. It was really fun to see, considering one is very cartoony; they almost seem like two different animations, which makes me realise the power of motion capture.

Render of troll model with motion capture data applied

This mocap data on the troll brings up problems not found with the skeleton, such as the arms clipping through the body and the shoulders not quite moving correctly. Again, these are just some of the quirks of motion capture: we are trying to translate data from a human with human arm proportions onto a character that is much larger, so it makes sense that clipping would occur. A way to avoid this is to know ahead of time what kind of character the mocap will be transferred to and plan how best to capture a performance for it, for example keeping your arms further apart or further away from the body. There are ways we could’ve adapted this data to overcome the problem, but those are more advanced techniques that we aren’t learning yet.

As for the shoulder tracking, unfortunately that is just down to the sophistication of the suit and the camera set-up we have access to. We don’t have enough sensors or cameras to pick up accurate shoulder movement.

Thinking about the design of my robot character

Towards the end of the lesson we turned our attention to the design of our robot characters, and this is the mood board that I came up with. We talked in class about how we may be able to do some facial animation with blend shapes if we have time. I would therefore like to make a character with a helmet, so that if we do have time I can incorporate the facial rig; if not, the character will still look okay as I can just leave the helmet on/closed.

I like these designs as I feel the mocap will translate well to them. The design is fairly similar to a human character, just with blockier shapes and of course impactful texturing and bold design elements.

Week 2

This week we explored more of how we are going to go about making the model of the robot mech.

We were introduced to ZBrush and given a crash course on how to make a boot. I had used ZBrush before but this was a needed refresher, and I learnt some new tools like the knife tool. The knife allows us to cut away pieces, which helps with creating flat edges. I also used the knife curve brush, which allowed me to create the shape at the top. Additionally, the knife circle was what I used to make the indent on the side.

My design for a boot in ZBrush

Remeshing the model to get good topology and a relatively low poly count

Creating a UV map for the mesh inside of ZBrush

Exporting the FBX

After we had the FBX file we loaded it into Substance Painter for texturing. Again, I have used this program before but only for very basic textures. We didn’t generate another UV map within Substance as we used the one generated in ZBrush.

Exporting textures from Substance Painter

One trick that I did pick up in class was that I can change the output template of the textures to “Arnold AI Standard”. This is really useful as it means the naming conventions will match up when linking them in Maya. It’s a small thing but it makes the process easier.

Model loaded in Maya (the FBX file from ZBrush)

We then imported and linked up the Substance Painter textures, and added an HDRI for lighting.

Model, textured with lighting applied

We then made the rest of the character above the boots so that we could see how the process would come together once we had made all the parts. However, for this beginning exercise we just made the rest of the body out of cubes.

We then repeated the steps we went through last week to apply the mocap data to the new character. It was really motivating to see a character that we had made from scratch have the mocap data applied. Even though it wasn’t polished or close to the final output, it was nice to see it come together, and it gives me reassurance that I have the tools to complete my first assignment.

This is a video of the character following the mocap data

Learning how to make and export parts of my model from ZBrush was really valuable for me as it’s a skill I hope to use in my robot design. I think I will model portions in Maya too, but it will be nice to be able to use both modelling programs in order to get a contrast in the look of the mech’s components.

Week 3 – recording motion capture

This morning we recorded the mocap in the studio.

This shows the floor area that the cameras record and some of the cameras used (attached to the ceiling)

This shows the set up of the cameras on the digital software counterpart. Here you can see the cameras and the floor that has been analysed

The small dots on the screen here are the markers placed on our actor

Week 3 – modelling the robot

After recording the mocap I started to model the character.

I decided to model over the top of a provided character base mesh to make sure the proportions were correct.

Here it can be seen that I used a human base mesh included in ZBrush as reference. Initially I wanted to go for a much more organic, human feel, but that changed to being more blocky and robotic once I rethought the assignment and what it required of me.

Elements like the shoulder pads, the joining of the hips to the suit, and the elbow pads are areas of the model that I’m quite pleased with. I think they elevate it to being more of a mech suit and help the model fit the aesthetic.

 

These screenshots show how I ended up building the helmet. I had a lot of trial and error with this, trying to figure out the best way to go about it. Initially I wanted to use a boolean operation to subtract the head shape from inside a larger head shape, as this would be my workflow in Maya, however I couldn’t figure out how to do this effectively in ZBrush even after spending quite a lot of time researching. In the end I just built a mesh around the head itself that I could then sculpt to look like a helmet.

I’m really pleased with the helmet portion of the model as I think it lends itself well to the workflow of creating blend shapes and adding facial animation to the character.

Here is the model inside Maya. To transfer it I used the same method that we followed in class with the boot, which is catalogued above.

As this screenshot demonstrates, in my first export to Maya I decimated the mesh too much and ended up losing some of the polygons in the fingers. To fix this I re-exported the mesh with a slightly higher poly count.

The model in Maya with a standard surface material added. This was the model ready to have motion capture applied to it.

Week 4 – Applying mocap to the robot

Above are my first motion capture experiments. Upon seeing the work of others in the class, where the designs were a lot more robotic and shape-driven (primarily modelled in Maya rather than ZBrush), I quickly discovered that I had unknowingly gone a different way with my approach to this task; I don’t think I fully understood the brief as intended when I made my model.

Despite this I have carried on with it and tested out the motion capture. Due to the model being more organic there was a worry that the mocap wouldn’t translate right especially with the hips however I was quite pleased with how it translated and think that it still looks effective. There are of course still improvements to be made but as a base I am happy with it.

This is a compilation of all of the mocap footage I requested to be recorded added to my character. I made this so that I could easily look through all the shots and think about which one/ones I want to use for my final submission.

Trying to improve the base model

After this week’s lesson I was influenced to try and update my model to look more mechanical. My first plan was to redesign the hips and add in ball joints for the knees. Although I do think this makes it look more mechanical, I missed the stylistic choices I had made originally.

This is the compromise I came up with. I know I still need to modify the mocap as it is not translating completely well, but I’m pleased with the shortening of the torso as there is less warping now.

Creating and binding the mocap to the model in this most recent iteration

Creating the face and texturing the robot

Starting to sculpt the face

More complete face sculpt

Window where blend shapes are created

Trying to texture the face

Trying a different way to colour the eyes

What Maya looks like when importing blend shapes from ZBrush

Starting to align the head with the motion capture and getting used to texturing

I had trouble with texturing the eyes. I fixed this by importing the model as an OBJ rather than an FBX.

The face placed within the robot model

Texturing the robot suit in substance painter

This is where I realised that the texturing had gone wrong on the shoulder. It turns out it was a problem with the ZBrush model: there was extra mesh being textured that didn’t need to be there. I deleted this in ZBrush and reimported the model.

The model imported into Maya and textured. I tried to be more efficient here by importing only half the textures, as they were mirrored/repeating. However, I forgot that the UV mapping would then also be wrong, which is why the textures look wrong here.

I solved this by importing each texture for each piece separately and this is the updated result

This is the full model textured and combined with blend shapes still working and available

This is a video/render of the robot with the mocap applied. I still need to readjust the motion capture skeleton so that it stops clipping, but I’m glad that I have now succeeded in running through most of the areas of the workflow. I also would like to experiment more with texturing; I am debating going for more of a plastic or painted look rather than the reflective steel.

Week 5 – Learning Pistons

This week we learnt how we can add pistons to our robots. At the moment this isn’t really applicable to my robot due to how I’ve modelled it, however it was still interesting to learn the techniques and be introduced to constraints and how to make mechanical things.

The first part was to look at aim constraints, what they are and how they work. Aim constraints control where an element is looking. They are often used for directing eyes in characters, for example.

This screenshot shows the menus and options that are given to us when setting up constraints. For us we wanted to make sure maintain offset was off so that the geometry didn’t jump.

Maya constrain menus and options

This is the result of two aim constraints between the red and yellow rings

To increase our understanding of the task we emulated an elbow joint with this example piston. To achieve this we parented the elements to the joints and added point constraints

The next task was adding constraints to this model of an elbow joint. This involved using point and aim constraints as well as parenting everything together. To give us more control over the rig we ended up using locators as the points we wanted to constrain; this just makes the rig a bit cleaner and gives us a piece of geometry to tie everything to. It can also help the rig run more smoothly and encounter fewer problems.

Elbow piston extended

Elbow piston retracted

Limit constraints

Something else we learnt during this exercise was that we can limit the amount a transform can be applied. For example with our piston, we know we only want the elbow to be able to rotate a certain amount, else the rig will break. We can input this information into the transform limits to restrict the animator’s control when using the rig, which prevents the rig from breaking.
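A minimal sketch of what a transform limit does: any value the animator sets is clamped into the allowed range before it reaches the joint. The 0–140 degree range below is my own assumption for an elbow, purely for illustration.

```python
def apply_limit(value, minimum, maximum):
    """Clamp an attribute value into its allowed range, like Maya's transform limits."""
    return max(minimum, min(value, maximum))

print(apply_limit(175.0, 0.0, 140.0))  # -> 140.0, the elbow stops at its limit
```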

Setting up the nurbs sphere controller

Finished nurbs sphere controller

We also learnt how to design controllers for the joints and connect them. We created a nurbs sphere and lowered its sections to make a simple shape, then aligned it to the centre of the joint and made a parent constraint to give the controller the same power as the joint.

Additionally, in the drawing overrides panel in the attribute editor you have the option to turn off shading and give the controller a colour, which is important when thinking about the usability and readability of the rig.

The last exercise was making pistons work in an engine, or a more mechanical setting.

Here we focused on parenting and constraints as before. This was quite a quick tutorial and covered a lot of different constraints and techniques, including using locators to aim certain pieces of geometry.

After making the pistons work effectively, our next task was to make the gears move according to the movement of the pistons/crank. To do this we were introduced to a new window, the connection editor. The connection editor allows you to feed attributes of one piece of geometry into another. For this example we took the rotate Z of the crank and ported it to the rotate Z of our first gear.

Connection editor

After this, the next thing we wanted to do was make the second gear move in time with the first. To do this we needed to change both its direction and its speed. For this we used the expression editor, where we typed a small piece of code stating what we wanted to set our value to.

For this example we wrote “Gear16.rotateZ = -Gear32.rotateZ * 32/16”. Here Gear16.rotateZ is the attribute we want to drive; the minus sign reverses the direction of the previous gear; and the factor of 32/16 = 2 is there because this cog has half as many teeth, so it must rotate twice as fast for the teeth to stay meshed.

This resulted in a cog which turned in the opposite direction at twice the speed.
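The gear expression can be sketched in plain Python (my own illustration; the object names and tooth counts are taken from the class example above). A driven gear’s rotation is the driver’s rotation inverted and scaled by the ratio of tooth counts:

```python
def driven_rotation(driver_rotate_z, driver_teeth, driven_teeth):
    """Rotation of a meshed gear: inverted direction, scaled by the tooth ratio."""
    return -driver_rotate_z * driver_teeth / driven_teeth

# Gear32 (32 teeth) drives Gear16 (16 teeth):
print(driven_rotation(90.0, 32, 16))  # -> -180.0: opposite direction, twice as fast
```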

Final result

I did hook up the rest of the cogs, and in preview it works correctly; however, when setting keyframes it seemed to go a bit loopy. The logic works though, and I know I understand the concepts.

Trying new texturing styles

After evaluating my robot model I wanted to do a bit of experimenting with my texturing style, as I thought there was a possibility I would come up with something that suited the character more. I wanted to move away from the stainless steel look, as I feel that suits a robot doing very mechanical movements, which mine isn’t; therefore I thought a softer approach to texturing might work better.

My first idea was to go with a painted and scuffed steel. My hope was that the paint would soften the character and only the highly worn elements (like the elbow hinges) would be super shiny.

Coloured steel texturing in Substance Painter

Coloured steel texturing in Maya

My second idea was to go with a more plastic aesthetic almost as if the character was a real life toy or an action figure. This changes the look quite a lot but I think it is still effective

Coloured plastic texturing in Substance Painter

Coloured plastic texturing in Maya

From here my next steps will be to firmly decide the piece/pieces of mocap I want to use, align the character to it, manoeuvre the mocap to avoid clipping and mistakes, and finally add a face cam so that I can incorporate the blendshapes.

Choosing motion capture footage to include in my sequence

Initial comp

Combining the files in Premiere

Adding in the blend shape keyframe by generating a face cam

Blendshapes with face cam

Final comp

This is my final composition combining the motion capture and the blend shape keyframes

Week 6 – Refining

Found an easy way to fix the tilted foot in the sneaking portion. I can just add an animation layer and rotate the ankle joint down. This does mean that I need to re-render the scene though.

I decided to go ahead with this plan as I thought it would make for a better piece/submission. To do this I added a new animation layer so that I could keyframe the skeleton independently from the motion capture. This means that I can keyframe the feet separately, which allowed me to level the feet out.

This shows the new animation layer and how I rotated the ankle joint of the skeleton to level the foot

Week 7 – Launching creature brief

This week we did a final class critique of our robots and then moved on to discuss the next assignment which is making creatures.

For our assessment we need to model and/or sculpt a creature. We can choose whether we take it into Maya to rig and animate, or whether we sculpt and model it entirely in ZBrush.

Introduction to ZSphere

Today we were introduced to a sculpting technique called ZSpheres, which allows us to quickly create connected spheres, similar to joints, that help us create a base for our creatures. We create joined-together spheres that we can then move and scale to our liking.

My first trial with this system

The mesh once adaptive skinning has been used

We can then use a technique called “adaptive skinning” to turn this chain into a mesh that we can sculpt onto.

Sculpting on the mesh

We then experimented with making a creature with this technique so that we could see uses for this workflow

We then started sculpting an arm using the anatomy references we had.

My proportions are a little off here as I made my ZSpheres at an incorrect proportion initially.

This is an example of some of the references we were given

Starting on assignment 2 sculpting and rigging a creature

When coming up with my idea for a creature I wanted to focus on something where I knew I could follow the process described. When thinking about animals and creatures that would be interesting yet fairly straightforward to create, I landed on making an elephant. Although not the most detailed or complex thing to model, I thought rigging an elephant could be a nice challenge, considering I could potentially make the trunk move and roll up and in.

My hope is that I can get the sculpting done fairly quickly to give me more time for rigging and technical aspects

Some 2D reference images that I gathered

Making the ZSphere shape

Using adaptive skin to turn it into a mesh and beginning to sculpt

Front view of the sculpt

I’m happy with how I am progressing with this and think I am on the right track with the shape and overall look, there is a lot of work to be done but it is a good start.

Week 8 – Corrective blendshapes, modelling legs and working on our creatures

This week we learnt how to use corrective blendshapes in Maya to change and tweak our model shapes.

To do this we modelled a basic cartoon hand in Maya using polygons and extrusion. After making the model we then added a rig to it by adding joints for the shoulder, elbow, wrist, knuckles and fingers.

With our skeleton in place we added a small animation to just the joints on the skeleton. After this was in place we bound the skeleton to the model using Skin > Bind. Doing this bound the mesh to the skeleton therefore the mesh inherited the animation and followed it along.

With the rig working and the animation correct, we then went about adding a blendshape. This is because when we bend the elbow in our animation it can be seen that the elbow mass shrinks. This is due to the vertices on the model being squished and manipulated by the movement of the rig/skeleton.

To resolve this we made a corrective blendshape on the arm. We went into the shape editor menu and added the arm mesh as a blendshape, then selected the geometry we wished to control and clicked ‘add target’. With the strength of our blendshape set to 1, we edited the shape to add more volume to the elbow and bicep. By moving the blendshape slider from 0 to 1 we can view the changes we’re making to the shape.

As a final step we keyframed this blendshape so that at frame 1 it would be set to 0 (and therefore inactive) and when the elbow was bent it would be set to one.

This technique allows us to tweak our models in line with the animation as needed.
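The keyframed weight behaves roughly like the sketch below (plain Python; the frame numbers are my own assumptions, and I’ve used linear interpolation for simplicity, whereas Maya’s default animation curves ease in and out): the weight ramps from 0 at the first key to 1 when the elbow is fully bent.

```python
def blend_weight(frame, key_start, key_end):
    """Linearly interpolate a blendshape weight between two keyframes."""
    if frame <= key_start:
        return 0.0
    if frame >= key_end:
        return 1.0
    return (frame - key_start) / (key_end - key_start)

# Assuming the weight is keyed 0 at frame 1 and 1 at frame 25:
print(blend_weight(13, 1, 25))  # -> 0.5, halfway through the bend
```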

Video showing the blend shape animation bound to the hand mesh via the skeleton/rig

Learning the anatomy of the leg and sculpting it

For the next portion of the lesson we learnt about the different muscles in the leg and how they are shaped, so that we could learn to sculpt one. We looked at the relationship between the quads and the abductors, as well as the soleus muscle that leads into the Achilles tendon.

With that knowledge I began attempting to sculpt the leg. I started with the base model provided by ZBrush, which had a lot of the muscles already there; my job became defining and exaggerating them.

Reference images

 

My Sculpt (on a pre-made base mesh)  

Although I know more could be done to this, I think I have managed to highlight some key areas and with more time could develop this further.

Continuing to work on the elephant sculpt

In the lesson this week it was recommended to me to find 3D reference of an elephant model as well as 2D, so I went ahead and did this. This was valuable advice as it made me think about the shapes and topology of the mesh.

3D reference images

Updated sculpt from the side

Updated sculpt from the front

The main improvements I made today were trying to correct some of the proportions, add more definition and add in the tusks and feet shapes.

Week 9 – Learning basic rigging and IK handles

This week we learnt the basics of rigging, making skeletons and IK handles.

Our first exercise was to make a basic skeleton representing a leg. We then made an IK handle from the hip to the ankle and another from the base of the foot to the tip. Making these IK handles meant that we could move the foot in relation to the hip, yet it would stay flat and able to be placed accurately on the ground.
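Under the hood, an IK handle is solving for the joint angles that reach a target. A toy two-bone solve using the law of cosines (my own sketch, not something from the class) shows the kind of maths the hip-to-ankle handle is doing for us:

```python
import math

def knee_angle(thigh, shin, target_distance):
    """Interior knee angle (degrees) that places the ankle at the given distance from the hip."""
    d = min(target_distance, thigh + shin)  # can't reach further than a straight leg
    cos_knee = (thigh**2 + shin**2 - d**2) / (2 * thigh * shin)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_knee))))

print(knee_angle(1.0, 1.0, 2.0))  # -> 180.0, the leg is fully extended
```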

After making this we added controllers to the rig so that we could easily select the handles without worrying about selecting skeleton joints or geometry. To connect the controllers we parented the joints to the controllers (the yellow nurbs circles).

To advance this small rig we also added some geometry and bound it to the joints via skinning. This shows the complete workflow for a piece of modelled geometry.

Video of the rig with a small animation applied.

Importing the scorpion and rigging it

To practice the creature rigging that we will need to do for our assignment, we exported a basic scorpion model from ZBrush and rigged it.

After we imported the model we made the skeleton for the scorpion and then added IK handles for the legs and tail.

The skeleton of the scorpion/ completed joint chains

When creating the skeleton we used a technique called “live surface”, where we made the scorpion a live surface. This meant that we could place joints accurately within the mesh, rather than creating world-oriented joints that we would have to position manually.

Skeleton isolated

We also only created one side of the legs and then used the mirror joints function to mirror the joints to the other side.

It is also very important to name your joints accordingly so that your scene is readable. Another good organisational practice is to create display layers for each of the components (skeleton, IK handles, controllers and geometry). This means that you can show or hide certain pieces very easily.

Controllers and IK Handles in yellow

Applying a set driven key to the claws

A new rigging technique that we learnt was applying set driven keys. Set driven keys allow you to control certain aspects of a rig in relation to something else. For example, for our project we created a set driven key so that when the scorpion moves forward the claws close. To do this we add a new attribute to the root controller: a slider that goes from 0 to 1. We then navigate to Animation > Key > Set Driven Key. Inside this window we can select the driver for the animation and also the parameter that it will drive. With the animation neutral and the new attribute set to 0, we key this pose; we then move the attribute to 1 (active), modify our animation and key the new change.

The result is that as we move the slider on the new attribute from 0 to 1 our new animation will take effect
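The driven relationship can be sketched as a simple remap (plain Python; the open/closed claw angles are assumptions of mine for illustration): the 0–1 slider is mapped onto the claw’s rotation between the two keyed poses.

```python
def claw_rotation(slider, open_angle=0.0, closed_angle=-35.0):
    """Remap a 0-1 driver attribute onto the claw's keyed rotation range."""
    slider = max(0.0, min(1.0, slider))  # the attribute is limited to 0-1
    return open_angle + slider * (closed_angle - open_angle)

print(claw_rotation(0.5))  # -> -17.5, the claw is half closed
```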

Video of the scorpion moving in a small animation, to prove that the IK Handles work

Video of the animation with the set driven key applied to the claws

Working on the elephant model

Improving the sculpt, adding eyes, improving the tail and the feet

Making a lower poly version so I can transfer it to Maya

Both the low poly and high poly version. Ready to project that texturing and poly paint onto the lower resolution version

Generating the displacement map

Importing into Maya and creating a new material (aiStandardSurface). We use the skin preset as it is closest to our desired output

With colour added it was now time to link the displacement map

Linking the displacement map

Adding subdivision information

The mixed results

Initially I was getting results like this: the elephant is there but is basically transparent.

After more experimentation I did get this result but it still didn’t translate correctly.

To get around this problem I decided to export a lower-poly version that still had quite a lot of detail (100k polys), then imported that into Substance Painter so I could texture it.

Texturing in Substance Painter

Textures imported into Maya and rendered

Texturing in Maya, made slightly brighter

Starting rigging with model textured in Substance Painter

Due to the time pressure of this module I wanted to make a start on rigging this model just to practice

Starting rigging

Animation using the rig

All I did for this was add the skeleton, IK handles and controllers. As I have yet to paint skin weights, the movement is far too intense right now, but this animation does prove that my rigging setup at least works.

Week 10 – Sculpting Torso and continuing elephant

This week we learnt how to sculpt the torso. We started by looking at reference and discussing the different groups of muscles in this part of the model. There are a lot of recognisable shapes here, such as the diamonds in the back and the pecs and ab muscles in the front. We also revisited some muscles in the arm, like the deltoids.

Sculpting

Here you can see my own work on the sculpt and then the results after my lecturer helped me figure it out. Using morph targets allowed me to see both versions, and I really like being able to compare them. Sculpting is still something that I struggle with, but I think my awareness and knowledge are getting better every week. Having this comparison makes me notice which bits are most important to define/highlight.

After doing the sculpt of the torso I asked for help on getting the displacement map to work with the elephant model

My first step was to try and simplify the mesh, so I merged the body with the tail and smoothed it out so that hopefully it would UV unwrap more easily.

Then I repeated the texturing process and made the displacement and polypaint maps again.

I’m not sure what went wrong last week, but I think it was probably something to do with the UV unwrapping: I may have unwrapped the mesh at too low a polycount, rather than at around 100k.

Below is the result of the OBJ and new maps imported and applied in Maya. I’m glad that loading the displacement map worked this time, though I do wonder if the result from Substance Painter is better…

Exporting model, texturing in substance painter and then applying textures to Maya

Model in ZBrush taken down in polycount so that I can UV unwrap and export to Substance Painter/Maya

Textures for main body. Using a paint layer to add details to feet

Texture for the tusks

Elephant in Maya and textured

For the eyes I just used an aiStandardSurface coloured black to make them reflective. I’m pleased the model is now at this stage so that I can continue rigging again.

Week 11 – Learning XGen

This week we learnt how to use XGen to make hair for our models. This was a really interesting process as it was something I was completely new to.

When starting XGen you have to select the mesh you want to use as the surface and then open the XGen window. From here you write out the description and collection names, as well as choosing the kind of hair you want to make. Descriptions are stored inside collections, and the collections are put in an xgen folder in your Maya project. The naming of these is important as they’ll be referenced throughout the whole project file hierarchy.

For ours we chose to create guides on the primitive. We clicked to create the guides and then used the sculpt tool to make them curve.

Once we had two in place, we were able to click between them to add more guides that follow the style of the existing ones, so adding more becomes easy.

This is the result when first generating the hair. We assign a density and add more CVs to the guides to get smoother curves. We also set how much we want the hair to taper.
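Taper just means the strand width shrinks from root to tip. A conceptual sketch of that falloff (this mimics the idea behind XGen's Taper attribute; the exact XGen curve and parameter names may differ, and the widths here are made up):

```python
# Hair width taper along a strand: width shrinks linearly from root to tip.
# Conceptual illustration only; XGen's actual taper curve may differ.
def strand_width(base_width: float, taper: float, t: float) -> float:
    """t runs 0 (root) to 1 (tip); taper=1 tapers to a point, 0 keeps it uniform."""
    return base_width * (1.0 - taper * t)

print(strand_width(0.1, 1.0, 1.0))  # -> 0.0 (full taper: tip vanishes)
```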

This is how XGen looks in the Outliner. The description is a child of the collection, and the guides are within the description.

Something that I didn’t screenshot here is the process of adding the modifiers to the hair. This is where the hair comes to life; we added modifiers for clumping, cut, noise and coil.

This changes the style of the hair dramatically and is where a lot of the personality comes from.

To set up the scene we added lights, turning off Normalize.

For the hair we added an aiStandardHair shader.

This is the scene set up

This is a render of the hair

Continuing work on elephant

Here I worked on adding hair to the tail, as I had learnt XGen this week.

Here I used the Paint Effects tool to add guides for XGen.

These are the guides in XGen.

Here I adjusted the width of the guides in the Attribute Spread Sheet so that they would be easier to see and manipulate in the viewport.

This shows the XGen window when making the hair

Render of the hair

Rigging the elephant

This is a turntable of the skeleton

For the rig I made the leg controllers using a series of IK handles and parenting them to a controller.

The main thing I had to spend a lot of time on here was painting the skin weights of the elephant accurately. This took a lot of time and patience, and it could still be improved.
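Painting weights matters because of how skinning works: each vertex follows a weighted blend of the joints that influence it, so a vertex with weight on the wrong joint deforms with that joint. A minimal 1D sketch of the idea (all numbers are purely illustrative):

```python
# Linear blend skinning, reduced to 1D: a vertex position is a weighted
# mix of its influencing joints' movements. Numbers are illustrative only.
def skin_vertex(rest_pos, joint_deltas, weights):
    """Weights should sum to 1; painting weights redistributes them
    between joints so each vertex follows the right part of the skeleton."""
    assert abs(sum(weights) - 1.0) < 1e-6
    return rest_pos + sum(w * d for w, d in zip(weights, joint_deltas))

# Vertex mostly bound to joint 0 (which moved +2), slightly to joint 1 (static):
print(skin_vertex(5.0, [2.0, 0.0], [0.9, 0.1]))  # -> 6.8
```

Flipping those weights to [0.1, 0.9] would leave the vertex nearly behind when joint 0 moves, which is exactly the kind of error weight painting fixes.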

These are the controls that I have added

  • Main
  • Head tilt (SDK)
  • Trunk roll (SDK)
  • Individual legs
  • Individual ear back and forth (SDK)

SDK = made with set driven key

To make the controls I often used a technique called set driven key, which allows me to drive one attribute with another. As an example, with the head tilt, when I move the controller shape forward, the head tilts down.

I also added limits to a lot of the controller shapes, this means that they will only be able to move within a certain range. This works really well when working with SDK as you can get a range of movement to align to a certain action.

For example, with the trunk movement I mapped the attribute to go from 0 to 5, but the circle control only translates by about 2cm. The limit information means I have disabled the circle’s ability to be moved outside of that range.
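Conceptually, a set driven key plus limits is just a keyed mapping from the driver attribute to the driven one, interpolated between the keys and clamped at the ends. A sketch of that evaluation outside Maya, using the trunk numbers from above (2cm of control travel mapped to a 0–5 roll attribute):

```python
# A set driven key, conceptually: a keyed driver -> driven mapping,
# linearly interpolated here and clamped at the end keys (which is what
# the translate limits on the control enforce in the rig).
def driven_value(driver, keys):
    """keys: list of (driver_value, driven_value) pairs, sorted by driver."""
    lo, hi = keys[0], keys[-1]
    if driver <= lo[0]:
        return lo[1]
    if driver >= hi[0]:
        return hi[1]
    for (d0, v0), (d1, v1) in zip(keys, keys[1:]):
        if d0 <= driver <= d1:
            t = (driver - d0) / (d1 - d0)
            return v0 + t * (v1 - v0)

keys = [(0.0, 0.0), (2.0, 5.0)]   # 2 cm of control travel -> trunk roll 0..5
print(driven_value(1.0, keys))    # halfway along the control -> 2.5
print(driven_value(3.0, keys))    # past the limit, clamped -> 5.0
```

Maya evaluates the same kind of mapping through the animation curve connecting the two attributes, with whatever tangent type the keys use rather than the straight linear blend shown here.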

Trunk roll set to 0

Trunk roll set to 5

Screenshot showing all of the controls

Video showing all of the controls and how they work

Week 12 – Finishing off the assignment

To finish off the assignment I added another slider control for the tail and also created a rendered animation to show the model and rig.

Updated rig screen recording

Screenshot images

Final animation

Elephant placed in a scene

Combining skills learnt from my 3D matchmoving module, I decided to add the elephant into a live action scene.

Week 13 – Retopology

This week we learnt how to retopologise models in Maya. To do this we had a messy model made of a lot of triangles. We exported it to Maya and used a tool called Quad Draw to redraw the new topology over the top. Making the model a live surface meant that we were able to follow the curves of the model accurately. We also used the mirror function so that everything would transfer to the other side.

When doing retopology there are standard practices that we can follow, such as loops for the eyes and mouth and the general shapes of the face. Getting good topology means that the animation will be a lot easier, smoother and more accurate.

This is a screenshot of my retopologised face. Using Conform in Maya allows the new face to mimic the shapes of the existing geometry, but with a better layout.

Here are some screenshots of the new model inside ZBrush showing the shapes of the new geometry.