CGI Foundations

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Assessment 1 – Digital Portfolio Tasks

Weekly and workshop activities are updated at the end of my blog each week.

Assessment 2 – Interior Scene in Unreal Engine

Old West Saloon

For my interior scene assessment, I decided to create an Old West Saloon.

I used assets from the Unreal Marketplace as well as Quixel Bridge that resemble the nature of an old west saloon.

Assessment 3 – Exterior Scene in Unreal Engine – Lake in a forest

For this assignment I decided to create a 60×60 landscape using Landscape Mode, and sculpted my terrain so that it resembles a forest with a few hills behind it.

I made a landscape material using Quixel Bridge textures, with tiling parameters, layer blend nodes, and normal strength nodes.

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 1 – Introduction to the module

What is CGI?

Computer-generated imagery (CGI) encompasses a significant part of what we see as 3D graphics in video games, films, and television. It’s used to create characters, environments, backgrounds, special effects, and even entire animated films. In the world of filmmaking, these computer-generated visuals fall under the domain of the visual effects (VFX) team. 3D computer graphics are a critical component of CGI, enabling the creation of lifelike and fantastical elements in digital media. The process of creating these computer-generated images involves sophisticated software and skilled artists to bring virtual worlds to life.

Modern CGI pipelines often use game engines such as Unreal Engine to produce real-time renders. Here are some of the reasons why:

  • Real-time visualization: Unreal Engine offers real-time visualization, allowing teams to see the results of their work immediately as they make changes. This is a huge advantage over traditional CGI rendering solutions, which can take hours or even days to produce a single image.
  • Enhanced collaboration: Unreal Engine allows teams to collaborate more efficiently. With real-time visualization and a user-friendly interface, multiple artists can work on a project at the same time, speeding up the production process.
  • Improved graphics quality: Unreal Engine utilizes advanced graphics technologies, like physically based rendering, which result in more realistic, higher-quality images. This can make the final product more appealing to clients and customers.
  • Lower costs: Unreal Engine is free to use, with full source code available (though it is not open source in the usual sense, and royalties apply only above a revenue threshold). This can greatly reduce costs for teams compared to the high costs associated with traditional CGI solutions.
  • Versatility: Unreal Engine can be used for a wide range of creative projects, from architectural visualization to gaming, which makes it a versatile solution for many industries.

In conclusion, Unreal Engine provides artists with a powerful tool for creating high-quality 3D content in real time, while also being a cost-effective solution for many individuals and industries. With its growing popularity, it is clear that Unreal Engine is set to become a standard tool for many CGI teams.

Unreal Engine Binds

  • F10 – Dock Windows
  • G – Game View
  • When moving an object, Unreal snaps it to the grid in 10 cm increments by default. To achieve smooth movement, uncheck grid snapping
  • W – move object
  • E – rotate object
  • R – scale object
  • F – focus camera on object
  • Ctrl + D – duplicate object
  • Alt + drag – duplicate object and move it in one direction
  • Shift + drag – lock camera to the object while moving
  • End – snap object to the surface below it
  • Esc – quit game

Workshop Activity:

Create a new Third Person Project called “ObstacleCourse”!
Add the following features:
Several objects that use physics and can be moved around by the player.
Some walls to create a simple maze.
Some ramps and platforms for the player to navigate.

Creating a new third person project called “Obstacle Course”.

Setting up the maze.

Enabling simulating physics on my objects.

Weekly Activity:

Define what an Actor is and provide an example of
a commonly used Actor Type.

In Unreal Engine, an Actor is a basic building block used in creating a game. It can be anything that you place in your game world, such as a character, a tree, or a light source. Essentially, the Actor is the base class for all objects that can be placed or spawned in a level. It is the fundamental unit of the world, and everything that exists in a game level is an Actor or a subclass of an Actor.

Key features of an actor:

  • Transform: Every Actor has a transform, which includes its location, rotation, and scale. This determines where the Actor is in the world, how it’s oriented, and its size.
  • Components: Actors are made up of components, which are modular pieces that define the Actor’s behavior and appearance. For example, a Static Mesh Component defines the visual geometry of the Actor, while a Collision Component defines its collision boundaries.
  • Lifecycle: Actors have a lifecycle managed by the engine. They are created (spawned), updated every frame (tick), and eventually destroyed. You can define custom behavior for different stages of an Actor’s lifecycle.
  • Events: Actors can respond to various events, such as overlapping with another Actor, receiving damage, or being clicked by the player. This makes them highly interactive and responsive to the game environment.
  • Blueprints and C++: Actors can be created and managed using both Unreal Engine’s Blueprint visual scripting system and C++ programming. Blueprints are user-friendly and great for quick prototyping, while C++ offers more control and performance for complex behaviors.

A commonly used Actor Type is the Character. A Character is a type of Actor that is specifically designed to be controlled by a player or AI. It comes with built-in features for movement, animation, and handling inputs, making it easier to create playable characters or NPCs (non-player characters).

Key components of a character:

  • Capsule Component: Defines the collision boundaries for the Character. It’s usually a vertical capsule shape that fits around the Character’s body.
  • Character Movement Component: Handles all the movement logic, including walking, running, jumping, and swimming. It simplifies the process of making the Character move in response to player input or AI commands.
  • Skeletal Mesh Component: Defines the visual appearance of the Character, using a skeleton that can be animated.
  • Animation Blueprint: A specialized Blueprint that controls the animations of the Character based on its state (e.g., idle, walking, jumping).

For example, when you create a player character in a game, you would typically use the Character Actor Type. This allows you to add controls for running, jumping, and interacting with the game environment, all within a framework that handles a lot of the complex details for you.

Define what components are and provide two
examples of components.

In Unreal Engine, components are reusable, modular units of functionality that can be added to actors to provide specific behaviors or properties. Components allow developers to build complex objects by combining smaller, well-defined pieces. Components can handle a variety of tasks, from handling physics interactions to providing visual representations.

Examples of Components in Unreal Engine

  • Static Mesh Component:

A Static Mesh Component is used to represent 3D models in the game world. It allows you to attach a static mesh (a non-animated 3D model) to an actor. Static Mesh Components handle rendering, collision detection, and other properties related to 3D models.

If you are creating a building or a piece of furniture in your game, you would use a Static Mesh Component to attach the model of the building or furniture to your actor.

  • Box Collision Component:

A Box Collision Component defines a rectangular collision volume in the game world. It is used for detecting overlaps and collisions with other objects. Box Collision Components can be used to trigger events when an actor enters or exits the volume or to detect physical collisions.

You might use a Box Collision Component to create a trigger zone that causes a door to open when the player character enters the zone. This component would define the area in which the trigger can be activated.

By using these and other components, developers can build actors with specific behaviors and properties without having to write custom code for each individual aspect, promoting reusability and modular design.
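The composition idea described above can be sketched in plain C++. This is a hypothetical illustration of the pattern, not the real Unreal API; the struct and member names here are invented for the example.

```cpp
#include <memory>
#include <string>
#include <utility>
#include <vector>

// An actor owns a list of modular components, each adding one behaviour
// or property, instead of hard-coding everything into one class.
struct Component {
    virtual ~Component() = default;
    virtual std::string Name() const = 0;
};

struct StaticMeshComponent : Component {
    std::string MeshAsset;  // the visual geometry this component renders
    explicit StaticMeshComponent(std::string m) : MeshAsset(std::move(m)) {}
    std::string Name() const override { return "StaticMesh"; }
};

struct BoxCollisionComponent : Component {  // a trigger/collision volume
    std::string Name() const override { return "BoxCollision"; }
};

struct Actor {
    std::vector<std::unique_ptr<Component>> Components;

    template <typename T, typename... Args>
    T& AddComponent(Args&&... args) {
        Components.push_back(std::make_unique<T>(std::forward<Args>(args)...));
        return static_cast<T&>(*Components.back());
    }
};

// e.g. a door actor: a mesh to draw, plus a box volume that could
// trigger it to open when the player walks in.
Actor MakeDoor() {
    Actor door;
    door.AddComponent<StaticMeshComponent>("SM_Door");
    door.AddComponent<BoxCollisionComponent>();
    return door;
}
```

The point of the design is that new actor types are assembled from existing pieces rather than written from scratch.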

Use references and screenshots/images from Unreal
to support your answer!

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 2 – Materials

Material Inputs

Material inputs control a variety of visual properties like color, reflectivity, texture, and more.

Key material inputs in Unreal Engine:

  • Base Color:

The primary color of the material. It represents the diffuse color that is visible under direct lighting.

Input Type: Vector3 (RGB color).

  • Metallic:

Defines how metallic the surface appears. A value of 0 indicates a non-metallic surface (like wood or plastic), while a value of 1 indicates a fully metallic surface (like gold or aluminum).

Input Type: Scalar (float between 0 and 1).

  • Specular:

Controls the intensity of the specular reflection. This affects how shiny the surface appears under direct light.

Input Type: Scalar (usually between 0 and 1).

  • Roughness:

Defines the roughness of the surface. A value of 0 results in a perfectly smooth, shiny surface, while a value of 1 results in a very rough, matte surface.

Input Type: Scalar (float between 0 and 1).

  • Emissive Color:

The color emitted by the material. This makes the material appear as if it is glowing.

Input Type: Vector3 (RGB color).

  • Opacity:

Determines the transparency of the material. A value of 1 makes the material fully opaque, while a value of 0 makes it fully transparent.

Input Type: Scalar (float between 0 and 1).

  • Normal:

Provides details about the surface normals, which affect how light interacts with the surface. This is often used with normal maps to create the illusion of complex surface detail without adding geometry.

Input Type: Vector3 (XYZ normal direction).

  • World Position Offset:

Allows you to modify the position of vertices in world space. This can be used for effects like waving grass or other vertex displacement effects.

Input Type: Vector3.

  • World Displacement:

Similar to World Position Offset but specifically used with tessellation to displace vertices based on a height map or other input.

Input Type: Vector3.

  • Tessellation Multiplier:

Controls the amount of tessellation applied to the geometry. Higher values increase the number of polygons, which can enhance detail at the cost of performance. (Note: this tessellation pipeline, along with World Displacement, was removed in Unreal Engine 5 in favor of Nanite.)

Input Type: Scalar.

  • Subsurface Color:

Used for subsurface scattering effects where light penetrates a material and scatters inside it, like skin or wax.

Input Type: Vector3 (RGB color).

  • Opacity Mask:

Defines areas of the material that should be fully transparent or fully opaque. It is used with masked blend mode.

Input Type: Scalar (usually between 0 and 1).

  • Refraction:

Simulates the bending of light as it passes through the material, like glass or water.

Input Type: Scalar.

  • Ambient Occlusion:

Darkens crevices and corners where light is naturally occluded. It adds depth and realism to the material.

Input Type: Scalar (usually between 0 and 1).

  • Pixel Depth Offset:

Adjusts the depth of pixels for rendering purposes. This can create effects like parallax or pseudo 3D effects.

Input Type: Scalar.

These inputs can be combined and modified using various nodes in the Material Editor to achieve a wide range of visual effects in Unreal Engine. Each input plays a crucial role in defining the final look and behavior of the material under different lighting conditions.

Macro Textures

Example of a macro texture.

Material Instances:

Material instances in Unreal Engine are simplified versions of a base material. They allow you to customize certain properties without needing to create a new material from scratch.

Difference between a base material vs a material instance

  • Base Material –  the original, fully-featured material that defines all the properties and complex calculations for how the material should look and behave.
  • Material Instance – a variation of the base material that allows you to change specific parameters (like color, texture, or roughness) quickly and easily. These parameters are defined as “instance parameters” in the base material.
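The base-material/instance relationship can be sketched as a fallback lookup: the instance only stores the parameters it overrides and defers to the base material for everything else. This is a hypothetical model for illustration, not Unreal's internal implementation.

```cpp
#include <map>
#include <string>

// The base material defines default values for its instance parameters.
struct BaseMaterial {
    std::map<std::string, float> Defaults;
};

// An instance stores only its overrides and falls back to the parent.
struct MaterialInstance {
    const BaseMaterial* Parent = nullptr;
    std::map<std::string, float> Overrides;

    float GetScalarParameter(const std::string& name) const {
        auto it = Overrides.find(name);
        if (it != Overrides.end()) return it->second;
        return Parent->Defaults.at(name);  // fall back to the base material
    }
};
```

This is why editing an instance is cheap: the expensive shader graph lives in the base material, and each instance is just a small set of parameter values.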

 

 

Emissive denim

Macrotexture

 

Make 8 different materials

Attach screenshots

 

Brick Material

 

Nodes – Grass Material

 

 

Render – Grass Material

 

Nodes – Denim Material

Render – Denim Material

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 3 – Unreal Engine Tools

Landscape Tool

The Landscape tool in Unreal Engine is a powerful feature used to create large, open-world environments like mountains, valleys, and terrains. It lets you build and modify expansive terrains within your game or project.

Key Features

  • Sculpting: Use various brushes to shape the terrain. You can raise, lower, smooth, or flatten areas to create hills, mountains, valleys, and plateaus.
  • Painting: Apply different textures (like grass, dirt, or snow) to your terrain to make it look realistic. You can paint these textures onto the landscape just like you would with a paintbrush.
  • Foliage: Automatically add trees, plants, and rocks to your landscape to make it more detailed and lifelike. The tool can distribute these objects across the terrain based on rules you set.
  • Water: Create lakes, rivers, and oceans. You can adjust their shape, depth, and flow to fit your landscape.

Foliage Tool

The Foliage tool in Unreal Engine is an essential tool for adding vegetation and natural elements to your scenes. It allows you to quickly and efficiently populate your environment with trees, bushes, grass, and other foliage, bringing virtual worlds to life with ease.

Key Features

  • Placement: Easily place trees, bushes, grass, and other foliage onto your terrain with just a few clicks.
  • Variety: Choose from a library of pre-made foliage types, or create your own custom plants to use in your scene.
  • Density Control: Adjust the density and distribution of foliage to achieve the desired look. You can have sparse forests or dense jungles, depending on your needs.
  • Randomization: Automatically vary the size, rotation, and placement of foliage to avoid repetition and make your scene more natural.
  • Performance Optimization: The tool automatically optimizes foliage placement for performance, so you can have lush environments without sacrificing speed.

Landscape Material Nodes

Landscape Material

Workshop Activity

Create a landscape and place foliage around the
scene.

Weekly Activity

Create a simple forest with a “cabin” and take
high resolution screenshots for your portfolio.

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 4 – Lighting & Reflections

Unreal Engine 5’s Lighting System – Lumen

Lumen is a cutting-edge technology introduced in Unreal Engine that revolutionizes how global illumination (GI) is handled in real-time rendering. Global illumination refers to the way light interacts with surfaces in a scene, bouncing off objects and affecting the overall lighting.

Traditionally, achieving high-quality global illumination in real-time engines required pre-baked lighting solutions or expensive ray tracing techniques. However, Lumen brings real-time global illumination to Unreal Engine without the need for precomputation or heavy computational loads.

Here’s how Lumen works:

  • Dynamic Global Illumination: Lumen dynamically calculates global illumination in real-time as the scene changes. This means that lighting updates instantly as objects move, change, or interact with light sources. It provides realistic and accurate lighting without the need for manual tweaks or precomputations.
  • Voxelized Representation: Lumen uses a voxelized representation of the scene to efficiently calculate indirect lighting. Voxelization breaks the scene into small 3D cubes (voxels), allowing for fast and scalable calculations of indirect lighting. This voxelization technique enables Lumen to handle complex scenes with dynamic lighting seamlessly.
  • Temporal Accumulation: Lumen employs temporal accumulation techniques to improve the quality of global illumination over time. By accumulating lighting information from previous frames, Lumen produces smoother and more stable results, reducing flickering or noise commonly associated with real-time global illumination.
  • Adaptive Resolution: Lumen dynamically adjusts the resolution of voxelization based on scene complexity and performance requirements. This adaptive approach ensures that Lumen maintains high-quality global illumination while optimizing performance for various hardware configurations.
  • Integration with Nanite: Lumen seamlessly integrates with Nanite, Unreal Engine’s virtualized geometry system. Nanite allows for rendering massive amounts of geometric detail without sacrificing performance. Lumen complements Nanite by providing realistic global illumination for intricate scenes rendered with Nanite.

Overall, Lumen represents a significant advancement in real-time rendering technology, enabling developers to create visually stunning and immersive experiences with realistic lighting in Unreal Engine without compromising performance or workflow efficiency.

Lighting Types

In Unreal Engine, lighting is like painting a scene with light to make it look realistic or stylized.

Three main styles of lighting:

  • Static Lighting: Think of it as painting a picture. You set up lights in your scene, and Unreal Engine calculates how light interacts with surfaces and objects. Once calculated, this lighting doesn’t change during gameplay. It’s good for scenes where lighting doesn’t need to change much, like interiors.
  • Dynamic Lighting: This is like using a flashlight in a dark room. Dynamic lights can move and change during gameplay, giving a sense of movement and interaction with objects. They’re great for things like character flashlights or moving lights.
  • Global Illumination (GI): GI simulates how light bounces off surfaces and affects the overall brightness and color of a scene. It’s like adding a layer of realism by considering how light interacts with everything. Unreal Engine uses techniques like “Lightmass” to calculate GI in static lighting.

Unreal Engine provides various tools and settings to adjust these lighting types, allowing developers to create diverse and visually appealing environments for their games or applications.

Main types of lights that you can add to your scene:

  • Point Light: This is a light that emits light in all directions from a single point in space. It’s like a light bulb or a campfire. You can adjust its radius to control how far the light reaches.
  • Spotlight: Similar to a flashlight, a spotlight emits light in a cone-shaped direction. You can control the angle of the cone and the falloff to create different lighting effects, like narrow spotlights or wider floodlights.
  • Directional Light: This light simulates light coming from a distant source, like the sun. It emits parallel light rays in a specific direction, illuminating everything in its path. Directional lights are commonly used for outdoor scenes and can simulate realistic daylight.
  • Rect Light: Introduced in newer versions of Unreal Engine, a rectangular light emits light in a rectangular shape, similar to a fluorescent light fixture. It’s useful for simulating area lighting in architectural or interior design scenes.
  • Sky Light: This light captures the color and brightness of the sky and uses it to illuminate the scene. It’s great for creating ambient lighting and can simulate realistic outdoor lighting conditions.
  • Area Lights: Rather than a separate light type, area lighting in Unreal is typically achieved with Rect Lights (or by giving point and spot lights a source radius). Area lights emit light from a defined surface rather than a single point, producing soft, even lighting similar to softboxes in photography studios.

Each type offers different characteristics and effects, allowing you to create diverse and visually appealing lighting setups for your projects.

Reflections

In Unreal Engine, reflections play a crucial role in creating realistic and immersive environments. Reflections simulate how light bounces off surfaces and objects, contributing to the overall visual fidelity of the scene. Here’s an overview of reflections in Unreal Engine:

  • Screen Space Reflections (SSR): SSR is a real-time reflection technique that calculates reflections based on what is visible on the screen. It’s computationally efficient and works well for most scenarios. SSR reflects surfaces that are within the camera’s view and can accurately capture dynamic objects and changes in the environment.
  • Reflection Capture Actors: These actors capture the surrounding scene and generate reflection cubemaps. There are two types:
    • Box Reflection Capture: It captures reflections within a defined box-shaped area and is useful for indoor environments or specific areas.
    • Sphere Reflection Capture: This captures reflections within a spherical area and is typically used for outdoor environments or large open spaces.
  • Planar Reflections: Planar reflections simulate reflections on flat surfaces like water bodies, floors, or mirrors. They use a rendering technique that creates a separate reflection pass from a specified plane’s perspective, providing more accurate reflections for flat surfaces.
  • Ray Traced Reflections: Introduced in newer versions of Unreal Engine, ray-traced reflections use hardware-accelerated ray tracing to compute accurate reflections. This technique offers high-fidelity reflections with realistic lighting and accurate occlusion.
  • Reflection Probes: Reflection probes capture the surrounding scene’s reflection data and provide it to reflective materials within their influence. They are useful for adding localized reflections in specific areas of the scene.
  • Material Reflections: Unreal Engine supports reflection in materials, allowing developers to create custom reflection effects for various surfaces. Materials can control the intensity, roughness, and color of reflections, giving artists fine control over the look of reflective surfaces.

Reflections in Unreal Engine contribute to the overall visual quality of scenes by adding depth, realism, and immersion. By utilizing different reflection techniques and tools, developers can create visually stunning environments that enhance the player’s experience.

Workshop Activity

Create or edit an existing example project to
showcase different types of lighting.

 

Weekly Activity

Continue to develop your forest using lighting
(lanterns with a model and light source).

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 5 – Sequencer, Cameras & Post Processing

The Sequencer

In Unreal Engine, Sequencer is a powerful cinematic tool that allows developers and creators to craft complex and visually stunning cinematic sequences, cutscenes, and animations directly within the engine. It provides a timeline-based interface similar to video editing software, making it intuitive for both artists and designers to create compelling narratives and gameplay experiences.

Sequencer’s key features:

  • Timeline Interface: Sequencer features a timeline interface where users can arrange and edit various cinematic elements, such as cameras, actors, animations, and visual effects. The timeline allows for precise control over timing, animation keyframes, and transitions.
  • Keyframe Animation: Users can animate properties of objects and actors over time using keyframes. This includes movement, rotation, scale, material properties, and more. Keyframe animation in Sequencer is flexible and supports both linear and nonlinear interpolation between keyframes.
  • Cinematic Cameras: Sequencer provides robust camera tools for creating cinematic camera movements and shots. Users can set up multiple cameras, define camera cuts, pans, zooms, and other camera effects to achieve professional-looking cinematography.
  • Track Types: Sequencer supports various track types for controlling different aspects of the scene, including transform tracks for position, rotation, and scale, visibility tracks for showing or hiding objects, event tracks for triggering in-game events, audio tracks for synchronizing sound effects and dialogue, and more.
  • Animation Blending and Layering: Users can blend and layer animations within Sequencer to create complex character animations and interactions. This allows for smooth transitions between different animation states and enhances the realism of character movements.
  • Visual Effects Integration: Sequencer seamlessly integrates with Unreal Engine’s visual effects tools, such as Cascade (particle effects) and Niagara (advanced particle system). Users can animate and control visual effects directly within Sequencer to enhance the cinematic experience.
  • Sequencer Recording: Users can record gameplay sessions or interactions directly into Sequencer, allowing for the creation of gameplay trailers, tutorials, or promotional videos with ease.
  • Rendering and Output: Once the cinematic sequence is complete, users can render it out as a video file or image sequence directly from Sequencer. Unreal Engine’s high-fidelity rendering capabilities ensure that the final output maintains the quality and visual fidelity of the real-time scene.

Keyframes

When it comes to animation and interpolation between keyframes, there are various types of interpolation methods, or “tangents,” that determine how smoothly the transition occurs between keyframes. Here are some of the common types:

  • Auto Tangents: Auto tangents calculate the interpolation automatically based on neighboring keyframes. This method aims to create smooth transitions between keyframes without requiring manual adjustment. It’s a good choice for general animations where consistent motion is desired.
  • Linear Tangents: Linear tangents result in a straight-line interpolation between keyframes. This means that the animation progresses at a constant rate from one keyframe to the next, creating a uniform motion without acceleration or deceleration.
  • Cubic Tangents: Cubic tangents use cubic spline interpolation to smoothly transition between keyframes. They provide more control over the animation curve, allowing for gradual acceleration and deceleration. Cubic tangents are versatile and commonly used for creating natural-looking motion.
  • Break Tangents: Break tangents allow animators to manually adjust the tangent handles of keyframes, breaking the automatic interpolation and giving precise control over the animation curve. This enables fine-tuning of acceleration, deceleration, and overall motion characteristics.
  • Clamped Tangents: Clamped tangents limit the range of motion between keyframes, preventing overshoot or undershoot in the animation. They ensure that the animation stays within specified bounds, which can be useful for maintaining consistency or avoiding unrealistic movements.
  • Stepped Tangents: Stepped tangents create a sudden change between keyframes, resulting in a discrete, step-like motion rather than a smooth transition. This type of interpolation is often used for creating mechanical or robotic movements where abrupt changes are desired.
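Three of the interpolation modes above can be sketched as small functions of the normalised time t in [0, 1] between two keys. The smoothstep curve here stands in for a cubic ease; real cubic tangents additionally expose editable handles per key.

```cpp
// Straight-line interpolation: constant speed from a to b.
float LerpLinear(float a, float b, float t) { return a + (b - a) * t; }

// Stepped: hold the first key's value, then jump at the next key.
float LerpStepped(float a, float b, float t) { return t < 1.f ? a : b; }

// Cubic ease (smoothstep): accelerates out of a and decelerates into b.
float LerpCubic(float a, float b, float t) {
    float s = t * t * (3.f - 2.f * t);
    return a + (b - a) * s;
}
```

All three pass through the same keys at t = 0 and t = 1; they only differ in the shape of the motion between them.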

Cameras

In Unreal Engine, cameras are essential components used to define the viewpoint and perspective from which the player or viewer experiences the game or simulation. They play a crucial role in creating immersive environments and dynamic visual experiences.

 

Main types of cameras in Unreal Engine:

  • Player Camera: In most games, the player’s viewpoint is controlled by a player camera. This camera follows the player character or object, providing the player with a first-person or third-person perspective of the game world. The player camera’s position, orientation, and field of view can be adjusted to customize the player’s viewing experience.
  • Cinematic Cameras: Cinematic cameras are used to create cinematic sequences, cutscenes, and scripted events within the game. These cameras can be controlled manually or scripted to achieve specific camera movements, angles, and shots. Cinematic cameras are essential for storytelling and enhancing the narrative elements of a game.
  • Camera Actors: Camera actors are objects placed within the level to define specific camera viewpoints or perspectives. They can be static or dynamic, depending on the requirements of the scene. Camera actors can have properties such as position, rotation, field of view, and lens settings, allowing for precise control over the camera’s behavior.
  • Camera Components: Camera components are attached to other actors, such as player characters or vehicles, to define their viewpoint in the game world. These components can be configured to follow the parent actor’s movement and rotation, providing a dynamic and immersive perspective for the player.
  • Camera Modes: Unreal Engine offers various camera modes, such as free camera, orbit camera, fly-through camera, and spectator camera. Each camera mode provides a different way to interact with the game world and offers unique perspectives for gameplay, level design, and content creation.
  • Camera Effects: Unreal Engine provides a wide range of camera effects and features to enhance the visual quality of the game. These include depth of field, motion blur, lens flares, camera shakes, and post-processing effects. Camera effects can add realism, depth, and cinematic flair to the game’s visuals.
  • Camera Animations: Cameras can be animated using keyframes and sequences to create dynamic camera movements, transitions, and cinematic sequences. Animating cameras allows developers to create immersive storytelling experiences and cinematic moments within the game.

Overall, cameras in Unreal Engine are versatile tools that enable developers to control the player’s viewpoint, create cinematic experiences, and enhance the visual quality of their games and simulations. By leveraging different camera types, modes, effects, and animations, developers can craft engaging and immersive experiences for players and viewers.

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 6 – Blueprints

 

Widget Blueprints – used for displaying 2D elements

Game Mode Blueprint

 

Variables

 

  • Boolean Variable

The most basic form of data: true or false (1 or 0).

  • Byte Variable

A whole number value between 0 and 255.

  • Integer Variable

A whole number value between -2,147,483,648 and 2,147,483,647.

  • Integer 64 Variable

A whole number value between -9,223,372,036,854,775,808 and 9,223,372,036,854,775,807.

  • Float Variable

A float is a number with a decimal value, such as -74.3845 or 190.22204.
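These ranges map onto fixed-width C++ integer types, so the compiler can confirm them at compile time; a quick sketch:

```cpp
#include <cstdint>
#include <limits>

// Byte = uint8_t, Integer = int32_t, Integer 64 = int64_t.
static_assert(std::numeric_limits<uint8_t>::max() == 255);
static_assert(std::numeric_limits<int32_t>::min() == -2147483647 - 1);
static_assert(std::numeric_limits<int32_t>::max() == 2147483647);
static_assert(std::numeric_limits<int64_t>::min() == -9223372036854775807LL - 1);
static_assert(std::numeric_limits<int64_t>::max() == 9223372036854775807LL);
```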

  • Name Variable

A name is a piece of text to identify something in engine

  • String Variable

A string is a group of alphanumeric characters, such as “Hi Mum! I am on the telly!”

  • Text Variable

Text is information you would display to a user

  • Vector Variable

A vector is a set of three numbers (X, Y, Z). It is used for 3D co-ordinates or RGB colour data, but it is essentially three floats.

Vector 2 – [float, float] (X/Y)

Vector 3 – [float, float, float] (X/Y/Z)

  • Rotator Variable

A rotator is a group of numbers that defines rotation in 3D space (pitch, yaw, and roll).

  • Transform Variable

A transform is a set of data that combines 3D translation (position), rotation, and scale. It is made up of a vector for translation, a rotator for rotation, and a vector for scale.
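How these types compose can be sketched in C++ (invented struct names for illustration, mirroring how Unreal's own types are laid out):

```cpp
// A vector is just three floats: a 3D position, scale, or RGB colour.
struct Vector3 { float X = 0.f, Y = 0.f, Z = 0.f; };

// A rotator is three angles defining an orientation in 3D space.
struct Rotator { float Pitch = 0.f, Yaw = 0.f, Roll = 0.f; };

// A transform bundles translation, rotation, and scale together.
struct Transform {
    Vector3 Translation;
    Rotator Rotation;
    Vector3 Scale {1.f, 1.f, 1.f};  // identity scale by default
};
```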

 

  • Object Variable

Blueprint objects such as lights, actors, static meshes, cameras, and sound cues – anything that exists in your scene.

 

An array is a data structure which can store a fixed-size collection of elements of the same data type, indexed 0, 1, 2, …

Arrays almost always start at 0 (Lua is a notable exception).
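Zero-based indexing in a quick C++ sketch: an array of N elements has valid indices 0 through N-1, so the first element is at index 0.

```cpp
#include <array>
#include <string>

// Three elements, valid indices 0, 1, and 2.
const std::array<std::string, 3> Elements = {"first", "second", "third"};

// at() bounds-checks the index and throws if it is out of range.
std::string ElementAt(std::size_t index) { return Elements.at(index); }
```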

Events are essentially instructions which can be triggered under certain conditions.

You can use pre-existing events or define your own custom events.

Level Reset sends out an execution signal when the level restarts. It is useful when you need something triggered after a level has reloaded.

Actor Begin Overlap is executed when an actor overlaps with another. This is useful for triggering certain logic in particular areas.

Actor End Overlap is the exact opposite: it is executed when an actor leaves the overlap.

Event Hit is executed when the collision component is hit by another.

When using a mouse interface, Cursor Over executes when the cursor is over the actor.

Begin Play is executed when the scene starts; if the actor is spawned later, it is called immediately on spawn.

Event End Play is executed when the actor ceases to exist in the scene.

 

Event Tick is an event which is called every single frame; its Delta Seconds output is essentially the time the last frame took.

You can create your own custom events that you can call from other events or when certain conditions are met.

Some custom events will have additional input nodes: these are called parameters. They are used to pass the information the event needs in order to function.

int Sum(a, b) – returns a + b

You can name your events in the Details panel.

 

NODES

 

The execution line defines when a node's logic is triggered; some nodes do not have one.

A node with no execution line is evaluated when its output is used as a parameter by another node.

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 7 – Advanced Materials & Shaders

 

Workshop Activity

  • Create a simple single layer water material as shown in the slides.
    • Attempt to customise it further, with parameters to create 2 other unique water materials.
      • A parameter to change the colour of the water.
      • A parameter to change the speed of the waves.
    • Explore the Scattering Coefficient through the documentation or the YouTube video provided on a more complicated version.

Weekly Activity

  • Create a simple toon shader post processing material.
    • Refer to the slides on how to achieve this!
    • Remember to set the material domain and blendable location in the details panel of your material!
    • Remember to set your post processing volume to Infinite Extent (Unbound) so it applies to the entire scene!

Shading models control how your material reflects incoming light.

Unlit only outputs emissive for colour, making it perfect for special effects such as fire or any light-emitting object.

Default Lit is the default shading model and likely the one you will use most often.

Subsurface simulates subsurface scattering. It relies on the Subsurface Color input, which defines the colour of the matter just beneath the surface of an object.

Preintegrated Skin is similar to Subsurface but prioritises a lower performance cost, making it suited to skin on characters.

Clear Coat is used to simulate multi-layer materials, such as car paint that has a thin translucent layer over the surface.

Dual Normal Clear Coat is great for carbon fibre and car paints.

Subsurface Profile is very similar in nature to both Subsurface and Preintegrated Skin.

Two Sided Foliage allows light transmission through the surface of a material, like light passing through the leaves of a tree.

Hair enables you to create natural-looking hair that simulates multiple specular highlights.

Cloth is designed to mimic the properties of cloth, including a thin fuzz layer across the surface.

Eye is used to simulate the surface of an eye, giving access to inputs such as the Iris Mask and Iris Distance; it is very technical.

Single Layer Water enables you to achieve the effect of a transparent water surface while using an Opaque blend mode.

Thin Translucent enables you to create glass-like materials.

From Material Expression is an advanced feature that lets you combine multiple shading models into a single material or material instance.

 

Post Processing Materials

 

Week 7 – Animation

 

Concepts:

  • State Machine

State machines are modular systems you can build in Animation Blueprints to define which animations can play and when they are allowed to play.

 

State machines are created within the AnimGraph. To create one, right-click in the AnimGraph, select State Machines > Add New State Machine, and connect it to the Output Pose.

 

All state machines begin with an entry point, which is typically used to define the default state.

To view the internal operation of a state, you can double-click it in the graph or in the My Blueprint panel.

 

 

  • Blend Space

 

Run animation + Walk animation BLEND into each other

 

 

Event Graph nodes are used to process incoming data that in turn drives the pose data in the AnimGraph, such as triggering playback, activating or disabling animation functions, and updating animation data.

 

Using data from the EventGraph, AnimGraph nodes determine the actual frame-by-frame pose to use for the animating object.

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 8 – Particle Effects

The Niagara System

Unreal Engine’s Niagara system is a sophisticated and flexible particle system that allows developers to create complex visual effects. It replaces the older Cascade particle system and provides a more powerful and versatile toolset for generating effects like smoke, fire, magic spells, explosions, and more.

Key features of Niagara:

  • Node-Based Editor: Niagara uses a node-based editor, similar to Unreal Engine’s Blueprint system, which allows for visual scripting of particle behaviors. This intuitive interface lets users create complex effects by connecting nodes that represent different functionalities and data flows.
  • Emitter Types: Niagara supports various emitter types that control how particles are generated, emitted, and behave over time. Common emitter types include:
    • CPU Emitters: These are processed on the CPU and are generally used for effects requiring complex calculations or high particle counts.
    • GPU Emitters: These are processed on the GPU and can handle a much larger number of particles with better performance, ideal for large-scale effects like dense smoke or flocking behaviors.
  • Modules: Modules are reusable components that define specific behaviors or properties for particles. Niagara comes with a library of built-in modules, such as those for controlling velocity, color, size, and lifetime. Users can also create custom modules to extend Niagara’s functionality.
  • Data Interfaces: Data interfaces allow Niagara systems to interact with other parts of Unreal Engine, such as accessing information from textures, meshes, or other actors in the scene. This enables dynamic and context-sensitive effects.
  • Simulation Stages: Niagara allows for multi-stage simulations, meaning you can process particles in stages, applying different behaviors or transformations at each step. This is useful for creating layered and sophisticated effects.
  • Event Handlers: Event handlers in Niagara can trigger actions based on particle events, such as collisions or lifespan completion. This allows for more interactive and responsive effects, such as spawning new particles when existing ones collide with surfaces.
  • Scalability: Niagara is designed with scalability in mind, allowing developers to create effects that perform well across a range of hardware capabilities. This includes features like LOD (Level of Detail) systems, which adjust the complexity of effects based on the performance target.

Practical uses of Niagara:

  • Environmental Effects: Create realistic weather systems like rain, snow, and fog. Niagara can also simulate natural phenomena like waterfalls, dust, and wind.
  • Character Effects: Enhance character interactions with effects like magical spells, damage indicators, footprints, and trailing particles.
  • Explosions and Destruction: Generate complex explosion effects with debris, fire, and smoke. Niagara’s ability to handle large particle counts efficiently makes it ideal for such scenarios.
  • Interactive Elements: Implement interactive particle systems that respond to player actions, such as ripples in water when objects are thrown, or dynamic lighting effects that react to in-game events.
  • Stylized Visuals: Design stylized or abstract effects for unique visual aesthetics, such as glowing energy fields, sci-fi holograms, and fantastical creature effects.

Emitter Groups

  • Emitter Spawn (Begin) – Defines what happens when an emitter is first created on the CPU.
  • Emitter Update (Tick) – Defines what happens on every frame; this is useful when you want particles to spawn continuously.
  • Particle Spawn (Begin) – Called once per particle when it is first born.
  • Particle Update (Tick) – Called per particle on each frame.
  • Event Handler – Used to generate events in one or more emitters that define certain data; other emitters listen for these events to trigger behaviour.
  • Render – Defines the display of the particle (mesh or material, etc.).

Workshop Activity

Let’s create a simple particle system using
Niagara.

Creating a new third person project called ”NiagaraTest”.

Loading into my project.

Creating a new, empty Niagara system called ”NS_Portal”.

I create a new, empty emitter and call it ”Emitter_Particles”.

I go into Emitter Update > Spawn Rate and set the spawn rate to 1000. This spawns 1,000 new particles per second.

I go into the initialize particle section. There, I set the particle lifetime to 3 and the uniform size to 3 as well.

I set the shape primitive to torus, change the large radius to 125 and rotate my particle system 90 degrees.

I add vortex force to make the particles spin.

I add drag to my particles.

At the moment, my particles are at their full size when they disappear, so it looks really rapid and unnatural. To change that, I add scale sprite size and make sure they’re invisible when they appear and disappear, and at their full size in the middle of the animation.

Final Result:

Variation 1 – Blue

Variation 2 – Red

Weekly Activity

Continue with interior and exterior
environments, use this weekly activity as a
development log if have not been documenting it
so far.

 

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 9 – Advanced Blueprints

Workshop Activity

 

Let’s create a simple health system

Weekly Activity

Continue with interior and exterior
environments, use this weekly activity as a
development log if have not been documenting it
so far.

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Week 10 – Animation

Workshop Activity

Let’s create a third person character animation.

 

Weekly Activity

Continue with interior and exterior
environments, use this weekly activity as a
development log if have not been documenting it
so far.

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Assessments

 

Assessment 1 – Exterior

Old West Town

For my exterior scene assessment, I decided to create an old western cowboy town. I used Blender to create my blockout assets, Substance 3D Designer to create my materials & textures and then combined it all in Unreal Engine 5.4.

Reference:

 

Unreal Engine – Breakdown

I create a new, empty level.

I create a landscape.

I create assets for my environment in Blender and then start blocking out my town in Unreal Engine 5.

Breakdown of blocking out the city.


 

Blender – Breakdown