I've applied the water planes material from Epic (the lake one) as well as the automatic landscape material to my drone game prototype. Both the water and the landscape material required some tweaking, but this was easy to do after carefully studying the inner workings of both (which I discussed in my two previous posts). I was having issues getting my game to cook correctly for Windows due to the long names in the automatic material shader assets. To work around that I had to rename a few of the assets as well as the directories. Anyway, the result of what I have so far is shown below.
Drone Mesh and Material
I found the drone mesh on a website that hosts many free 3D meshes. However, I've lost the exact link where I got it from :( I do remember that the mesh was in a non-FBX format, so I had to import it into Blender first just to export it to FBX right afterwards. In the import/export process I lost the piece of the mesh that represented the blades. Hence, rather than trying to model some blades in Blender and then somehow animate them, I decided to make believe my drone is a military prototype that uses jet propulsion...
For the military-looking material, I used a navy camo texture for the base color and the noise texture shown below as my roughness map. This is how I'm achieving the worn look on the surface. Btw, I've made this a fully metallic material in UE4's physically based shading model.
Finally, I generated a physics body for the mesh inside the editor. I used a simple bounding volume, shown above as the purple box wireframe. Nothing fancy. Later, through script, I attached 4 thrusters to this box, and this is how I'm achieving flight.
The blueprint class components
All in all, this blueprint class contains the following components; a screenshot of the components view for this blueprint class is shared below.
This is the mesh described above.
There are two cameras. The first one is attached to a spring arm, which is in turn attached to the mesh, and is set up to be active by default; it provides the third person camera. The second camera sits right at the nose of the drone to provide a first person view and is deactivated by default. Inside the event graph I'm using the 'Select' button release event on my PS3 gamepad to trigger a flip-flop node that switches between the third person and first person cameras.
In the picture below the red arrows represent the positive x-axis for each of the four physics thruster components. Each thruster's strength is set through script and depends on the gamepad's left and right thumbstick axis inputs. More specifically, the left thumbstick y-axis controls the initial strength for each thruster. The right thumbstick controls pitch and roll via its y-axis and x-axis respectively. In order to implement flying controls, the overall strength for each thruster is calculated as follows:
Thrust = overall thrust strength (left thumbstick y-axis) - pitch amount (right thumbstick y-axis) - roll amount (right thumbstick x-axis).
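To make the mixing concrete, here is a minimal Python sketch (not engine code) of how each thruster's strength could be derived from the thumbstick axes. The per-thruster sign layout and the base strength are assumptions on my part for a front-left/front-right/back-left/back-right arrangement:

```python
def thruster_strengths(throttle, pitch, roll, base=1000.0):
    """Return strengths for the four thrusters from the thumbstick axes.

    throttle: left thumbstick y-axis, pitch/roll: right thumbstick y/x axes.
    The sign layout below is a hypothetical quadcopter-style arrangement.
    """
    # Each tuple is (pitch_sign, roll_sign) for one thruster.
    layout = {
        "front_left":  (-1.0, -1.0),
        "front_right": (-1.0, +1.0),
        "back_left":   (+1.0, -1.0),
        "back_right":  (+1.0, +1.0),
    }
    return {
        name: base * (throttle + p_sign * pitch + r_sign * roll)
        for name, (p_sign, r_sign) in layout.items()
    }
```

With a pure throttle input all four thrusters get equal strength; adding a pitch input raises the back pair and lowers the front pair, which is what tilts the drone forward.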
Jet Exhaust Particle System
This is the exact same particle system discussed in my previous post. As pointed out there, the particles' initial speed was made a parameter so that it can be set through script proportionally to the overall strength of each thruster component. Controlling the initial speed of the particles is a proxy for the intensity of the thruster visual effect.
I found a rocket booster sound on www.freesound.org and added it as an audio component to my blueprint. I then make its pitch and volume change proportionally to the left thumbstick y-axis (the one that controls the main thruster strength).
This particle system was designed using three emitters that are briefly discussed below: Combustion emitter, Smoke emitter and Heat emitter.
Combustion is achieved by creating a GPU Sprites emitter with 500 particles that are put through a vector field. This vector field is designed to push the particles in a jet-stream-like motion. Further, the vector field is set up as a local vector field with a constant 'intensity' value and a 'tightness' value of zero, meaning that it only influences the particles' velocity (as opposed to completely taking over their velocity). Therefore, the particles need to have an initial velocity when spawned. The velocity cone module is used to spawn the particles with an initial velocity along a cone with a small aperture angle that always points downwards. The angle of the combustion cone is fixed, but the velocity distribution is set up as a 'Distribution Float Particle Parameter' in order to be able to control this parameter from code or blueprints. Exposing this parameter allows for controlling how strong the overall combustion effect looks and, in this particular case, it's hooked to the strength of the thrusters, which is in turn controlled by pulling the gamepad's left thumbstick up and down.
It is worth mentioning that an alternative way of controlling the overall strength of the combustion effect could have been set up by using a global vector field instead. This means that it is up to the blueprint/code to define and own the vector field, allowing it to modify the vector field's intensity dynamically (instead of modifying the velocity cone's velocity parameter).
A color over life module is used to create the blue-to-red gradient effect. Also, the size over life module allows the sprite to grow to twice its size as it ages.
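For intuition, those two modules boil down to simple interpolations over the particle's normalized age. A hedged Python sketch follows; the endpoint colors and base size are assumptions (Cascade lets you author arbitrary curves):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def color_over_life(age):
    """Fade an RGB color from a hot blue core to red as the particle ages.

    The exact endpoint colors here are illustrative guesses.
    """
    blue = (0.0, 0.2, 1.0)
    red  = (1.0, 0.1, 0.0)
    return tuple(lerp(c0, c1, age) for c0, c1 in zip(blue, red))

def size_over_life(age, base_size=10.0):
    """Grow the sprite to twice its base size over its lifetime."""
    return base_size * (1.0 + age)
```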
Lastly, another key module for achieving this combustion effect is the subimage index module (sub UV). This subUV module allows a sprite to play an animation from a sprite sheet. In this case the explosion sprite atlas shown below was used.
It should be noted that there are two ways to achieve sprite animations via subUVs. One option is to create a custom material that uses time to programmatically sweep the UVs horizontally and vertically across the sprite atlas. Another, simpler approach, which was used here, is to create a custom material and just use a particle subUV expression. This expression automatically performs the horizontal and vertical UV sweeps (Cascade tells it how many sub-images there are horizontally and vertically). All that is left to do when using this expression is to provide the sprite atlas texture. With this approach, the subimage index module needs to be created in order to define how many sub-images are to be used. Moreover, the number of sub-images horizontally and vertically is defined through the subUV section parameters in the 'required' module.
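Under the hood, the subUV machinery is just mapping a sub-image index to a UV rectangle inside the atlas. A rough Python sketch of that mapping (the left-to-right, top-to-bottom sweep order is my assumption):

```python
def subuv_rect(index, h_images, v_images):
    """Map a sub-image index to (u_offset, v_offset, u_size, v_size)
    inside a sprite atlas, sweeping left-to-right then top-to-bottom.

    h_images/v_images correspond to the subUV counts in the
    'required' module.
    """
    col = index % h_images
    row = index // h_images
    u_size = 1.0 / h_images
    v_size = 1.0 / v_images
    return (col * u_size, row * v_size, u_size, v_size)
```

Driving `index` from the particle's normalized age is what produces the flipbook playback.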
Finally, a couple of comments regarding collision. I found that GPU collision only works when the option 'Use Local Space' is disabled in the required module. According to the Unreal forums, this is because "collision on GPU particles calls from Scene Depth and cannot function in local space". However, disabling this local space option causes the particle effect to leave a trail when flying, and I didn't like that. Hence I disabled collisions on this combustion emitter (the Cascade screenshot may not reflect that, though).
Originally the smoke emitter was simply meant to be a simple CPU sprite emitter to simulate spark collisions when the jet exhaust got too close to the ground. This was meant to overcome the limitation of the GPU sprites above not supporting collisions when dealing with local space. However, I couldn't get the sparks to look natural enough, and eventually this emitter evolved into smoke. The particles in this emitter are simply shot down with an initial velocity and made to collide with any static actor in the world. A maximum of two collisions per particle was chosen, along with a damping factor vector that makes the particles (smoke) bounce upward when colliding. To create the sprite material, a smoke alpha texture was used in a translucent material configuration. As usual, the particle color modulates the material's emissive color as well as its opacity. Hence, to create smooth smoke, a color over life was chosen where the smoke is born light gray and dies black. Also, it is born almost translucent, becomes fully opaque halfway through, and dies fully translucent again. Finally, this is the alpha texture used for the smoke sprite.
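The damped bounce described above can be sketched roughly as follows; the ground is simplified to the z = 0 plane and the damping values are made up for illustration, not taken from the actual Cascade setup:

```python
def step_particle(pos, vel, dt, damping=(0.3, 0.6, 0.3), max_collisions=2,
                  collisions=0):
    """Advance one particle by dt; bounce off the z = 0 ground plane.

    On impact the vertical velocity is reflected and each component is
    damped, which makes the smoke 'jump' upward, and at most
    max_collisions bounces are allowed per particle.
    """
    x, y, z = (p + v * dt for p, v in zip(pos, vel))
    vx, vy, vz = vel
    if z < 0.0 and collisions < max_collisions:
        z = 0.0
        vx, vy, vz = vx * damping[0], vy * damping[1], -vz * damping[2]
        collisions += 1
    return (x, y, z), (vx, vy, vz), collisions
```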
The heat effect is mainly achieved by the sprite material. This material is set up as a lit translucent surface, and the main thing about it is its refraction variation. The refraction is driven by a lerp node that interpolates between 1.0 and 1.6 for the refraction index input of the shader. This lerp is in turn driven by a modulation between the particle's color and a panning noise texture. Finally, the normal of this material is driven by two normal maps that are panned and then added together in order to create random motion. With the material set up, the emitter is configured to spawn particles on the surface of a sphere centered around the combustion emitter, with an initial velocity pointing up (heat rises). A drag module helps slow down the particles as they mature and grow in size. Finally, the scale color module allows the emitter to tell the material to increase the modulation factor driving the refraction interpolation.
In order to better showcase the HD landscape (discussed in my previous post), I decided to create a day-night cycle blueprint. The main idea is simple but, as usual, the devil is in the details...
At a very high level, the main steps were the following:
- Create a 'Timeline' node inside the level blueprint.
- Create the different parameter curves inside the 'Timeline' needed to drive all the cyclic items. More specifically, I'm talking about the sun position, the fog amount, skylight intensity, and the time of day.
- In order for the time of day and the sun position to be consistent, I had to drive the sun position from -270 degrees at 00:00 to 90 degrees at 24:00. Moreover, my time scale was 1 hr = 10 seconds.
- The fog parameters were trial and error, but overall I attempted to reduce fog during midday and increase it at night.
- The skylight allowed for nice ambient lighting during daylight, but it had issues at night as it would light up the terrain like crazy. Therefore, I decided to drive the skylight intensity to almost zero during night time.
- Create my own copy of the engine's sky sphere in order to be able to tweak it freely without affecting other levels or projects.
- Cloud amount, cloud speed, and stars brightness can be tweaked in the sky sphere blueprint but only at construction time. Hence, in order to drive those material parameters with the time of day, create another 'Timeline' node inside the sky sphere's BP event graph. Then, create the different curves for cloud amount, cloud speed, and stars brightness inside this timeline node. Afterwards, hook up those outputs to the corresponding parameters of the dynamic material instance. Finally, connect the time of day, from the 'Timeline' node in the level blueprint, to the time input of the Timeline node just created for the sky sphere's BP event graph. This is how the two blueprints communicate.
- Add the Milky Way galaxy at night by grabbing a picture from NASA and importing it as a power-of-two texture into the project. After that, find the section where the stars are being drawn inside the sky sphere's material and linearly interpolate in the Milky Way texture. Getting the UV tiling right was trial and error, but in the end I found that a U tiling of 2 and a V tiling of 1 worked best for me.
- Create Matinee by adding camera groups, a director group and a soundtrack.
- Need to set the directional light as movable.
- Need to register the directional light with the sky sphere in order to use its angle to drive the sun position.
- Need to call the sky sphere's 'Update Material' function every time the sun position is updated.
- It looks nicer with distance field ray traced shadows enabled.
- Reduce the skylight intensity at night in order to avoid glitches where the terrain lights up. I'm still investigating whether this is a glitch with skylights or a problem with how I'm using them.
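The time-of-day bookkeeping from the list above boils down to two small linear mappings: real seconds to game hours, and game hours to sun pitch. A Python sketch (the wrap-around handling is my assumption):

```python
def sun_pitch(hour):
    """Map an hour in [0, 24] to the sun's pitch in degrees,
    going linearly from -270 at 00:00 to +90 at 24:00."""
    return -270.0 + (hour / 24.0) * 360.0

def hour_from_seconds(elapsed):
    """Time scale of 1 game hour per 10 real seconds, so a full
    day-night cycle wraps every 240 seconds."""
    return (elapsed / 10.0) % 24.0
```

Note that midday (hour 12) lands at -90 degrees, i.e. the sun shining straight down, which is what keeps the clock and the sun position consistent.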
To do items
- Explore the distance field shadows parameters better.
- Investigate lighting artifact where there are isolated pockets of the landscape that light up at weird times during night cycle.
- Investigate why the dirt mask applied to camera 2 is not working.
- Improve landscape material. Investigate how to autogenerate landscape materials blends based on terrain features.
- Another reason to improve the landscape material is the grid pattern formed by the UV tiling.
First of all, what is IK? IK stands for Inverse Kinematics and here's a quick definition I found on the web:
In a videogame context, IK can be used to create a smooth animation of a character's limb at a desired position based on an end point in space. This end position in space is also known as the end-effector and it can be procedurally calculated using collision queries. One common example of IK in videogames is for making sure the character's feet conform to the ground topology or stairs, etc.
I think the picture below gives a better intuition on how the end-effector is used in IK to drive the desired animation.
In my example above, I'm using an IK setup to override the punching animation so that the hand bones don't go through the walls. Below I describe some of the essentials of setting up IK in UE4 for my project, as well as some lessons learned. Keep in mind I'm just scratching the surface here.
In the character blueprint graph
- On an event tick, set up a Sphere Trace by Channel (Visibility) and make the trace start at the lower arm socket and end at the hand socket. Create appropriate sockets (if you need to) using Persona. Do this for both the left arm and the right arm. The radius of the trace sphere may need to be tweaked appropriately.
- Break the hit result in order to obtain the collision location in world space and store this in a variable. This result will effectively be our end effector. Do this for both sphere traces on the left and right arms.
- The animation blueprint will make use of an alpha blending value, where 1.0 means fully use the IK result and 0.0 means don't use it at all. On a collision hit, set this alpha value to 1.0; on a non-collision, make it drop back to zero smoothly via a 'FInterp To' node.
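The alpha handling in that last step can be sketched as follows; the stand-in for UE4's 'FInterp To' node is hand-rolled and approximate, not the engine's implementation:

```python
def finterp_to(current, target, dt, speed):
    """Rough stand-in for UE4's FInterpTo: move toward the target at a
    rate proportional to the remaining distance."""
    delta = target - current
    return current + delta * min(dt * speed, 1.0)

def update_ik_alpha(alpha, hit, dt, interp_speed=5.0):
    """Snap the IK alpha to 1.0 on a trace hit; otherwise decay it
    smoothly back toward 0.0. The interp speed is a tunable guess."""
    return 1.0 if hit else finterp_to(alpha, 0.0, dt, interp_speed)
```

The snap-up/smooth-down asymmetry keeps the hand pinned instantly on contact while avoiding a visible pop when the contact ends.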
In the animation blueprint's event graph
- Cast the Pawn Owner to the appropriate character blueprint (the one where you set up the sphere traces) and obtain from it all the relevant data that will be used in the animation blueprint. Namely, for each arm, get the end effector location and the alpha value.
In the animation blueprint's animation graph
Append the following nodes right before the final pose, after the state machine pose and the montage poses have been calculated.
- Add a 'Local to Component' conversion node so that the animation data gets converted from bone local space to component space. This is needed because the 'Two Bone IK' node used to calculate IK takes animation data in component space as input.
- Create two 'Two Bone IK' nodes. One will be the IK solver for the left arm and the other for the right arm.
- A 'Two Bone IK' node takes five inputs: the component pose; the alpha value that you set up in the character blueprint, which decides how much control to give this bone control node; the final bone in the IK chain (i.e., the hand bone); the parent bone that acts as the 'joint target' for the hand (i.e., the lower arm bone); and the end effector location.
- Make sure the end effector location space is set to world space as this is the result of the sphere trace collision that we did in the character's blueprint.
- The joint target input is actually a bit more involved. Not only do we specify the joint bone to use, but we also specify a 'pole' (constraint) for the bone. An illustration of how to set those up in Maya can be found here. The position for the pole is the hardest thing to get right about this whole IK setup, mainly because Persona does not offer a tool for visually adjusting it. For now, this is mostly a trial and error process.
- Before passing the result to the 'Final Animation Pose' node, make sure you convert back to local space using a 'Component To Local' node.
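For intuition on what the 'Two Bone IK' node solves internally, here is a small analytic sketch using the law of cosines on a two-bone chain. This is an illustration only; the engine's solver also handles full 3D transforms and the joint-target pole:

```python
import math

def two_bone_ik(upper_len, lower_len, target_dist):
    """Return (shoulder_angle, elbow_angle) in radians for a planar
    two-bone chain reaching a target at target_dist from the root."""
    # Clamp so the chain neither overstretches nor folds past its limits.
    d = max(min(target_dist, upper_len + lower_len),
            abs(upper_len - lower_len))
    # Law of cosines for the elbow's interior angle.
    cos_elbow = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Angle at the root between the first bone and the line to the target.
    cos_shoulder = (upper_len**2 + d**2 - lower_len**2) / (2 * upper_len * d)
    shoulder = math.acos(max(-1.0, min(1.0, cos_shoulder)))
    return shoulder, elbow
```

With the target exactly at full reach the elbow is straight (interior angle of pi) and the shoulder offset is zero; the pole constraint then decides which way the elbow bends in 3D.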
This weekend I was able to follow along with the intro tutorial to third person games, and I set up a basic character controller blueprint as well as an animation controller for that character. Below I summarize the necessary steps for future reference:
1) First things first, create an animation blueprint targeting the skeleton for the prototype character.
2) Inside the animation blueprint, add the following:
- Create a 1D blendspace using the idle, walk and run animations, controlled by the character's speed.
- Create an animation state machine to go from idle/walk/run - jump - falling - end jump - idle/walk/run.
- Create all the transition rules for the different states, based sometimes on an external isInAir Boolean and sometimes on the time remaining ratio of the current state's animation.
- Add the 1D blendspace as part of the idle/walk/run state, and add the jump start, jump loop, and jump end animations for the jump, falling, and end jump states inside the state machine.
- Add the state machine to the animation graph and make it drive the final pose.
- In the event graph, wire isFalling from the character movement component to the isInAir Boolean that we use to drive state transitions inside the state machine. Similarly, wire the pawn's (the character's) velocity to the speed input driving the 1D blendspace (speed is just the length of the velocity vector).
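The 1D blendspace driven by that speed input is essentially a weighted blend between sample animations placed along the speed axis. A rough Python sketch of the weighting; the sample speeds and names are assumptions, not values from the tutorial:

```python
def blend_weights(speed, samples=((0.0, "idle"), (200.0, "walk"), (600.0, "run"))):
    """Return {animation: weight} for a 1D blendspace at the given speed.

    samples is a sequence of (sample_speed, animation_name) pairs; the
    two samples bracketing the input speed are blended linearly.
    """
    pts = sorted(samples)
    if speed <= pts[0][0]:
        return {pts[0][1]: 1.0}
    if speed >= pts[-1][0]:
        return {pts[-1][1]: 1.0}
    for (s0, a0), (s1, a1) in zip(pts, pts[1:]):
        if s0 <= speed <= s1:
            t = (speed - s0) / (s1 - s0)
            return {a0: 1.0 - t, a1: t}
```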
3) Create a Character blueprint and add the Skeletal Mesh for the character, the animation blueprint created above, a collision capsule, a spring arm and a camera as components inside this Character blueprint. Make sure the camera component is parented to the spring arm component, and if the camera does not snap into the correct place on the spring arm, reset its location to default. In the character movement component, make sure 'Orient Rotation to Movement' is selected and that the camera component's 'Use Pawn Control Rotation' is unselected in return.
4) Create some input bindings/mappings for the engine. It is here that I added the spacebar for the jump action and the W/D keys for the speed axis.
5) In the character blueprint, create an event graph where all the different input axes that were bound/mapped earlier get wired to 'Add Movement Input' nodes. This is achieved by extracting the forward and right vectors out of the control rotation. We also want to wire the mouse input axes to 'Add Controller Yaw Input' and 'Add Controller Pitch Input' respectively. If you want to be able to freely rotate the camera spring arm around the character, make sure to disable 'Use Controller Rotation Yaw' in the default settings of the character blueprint.
6) Create a Game Mode blueprint and set the default pawn class to that of the character blueprint we just set up in the step above.
7) In the world settings, make sure to override the game mode with the game mode blueprint you just created in the step above.
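The forward/right vector math from step 5 can be sketched like this; yaw is in degrees as in UE4, and the 2D simplification (ignoring pitch) is mine:

```python
import math

def movement_input(yaw_deg, forward_axis, right_axis):
    """Combine the forward/back and right/left input axes into one
    world-space movement direction (x, y), derived from the control
    rotation's yaw."""
    yaw = math.radians(yaw_deg)
    forward = (math.cos(yaw), math.sin(yaw))
    # The right vector is the forward vector rotated 90 degrees.
    right = (-math.sin(yaw), math.cos(yaw))
    return (forward[0] * forward_axis + right[0] * right_axis,
            forward[1] * forward_axis + right[1] * right_axis)
```

This is why the character walks relative to where the camera is pointing rather than relative to its own mesh.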
In order to get Windows 8 to think my PS3 controller was an Xbox controller, I followed the following guide.
In order to wire the gamepad into the game, make sure to add the appropriate input mappings/bindings to the engine; they show up as Gamepad Left/Right Y axis and Gamepad Left/Right X axis, for example. Then add the blueprint connections for those bindings and multiply their values by a 'base turn rate' of our own choosing (tweakable) and the time delta between frames (Get World Delta Seconds).
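That multiplication is what keeps the turning speed frame-rate independent. As a tiny sketch (the default rate is an arbitrary choice of mine):

```python
def turn_step(axis_value, delta_seconds, base_turn_rate=45.0):
    """Degrees of yaw to add this frame: axis input scaled by a
    tweakable base turn rate (degrees/second) and the frame delta."""
    return axis_value * base_turn_rate * delta_seconds
```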
I'm a software engineer with a passion for computer graphics.