Baking Cloth and Hair Simulations for Real-Time Game Characters

236 views 3 replies

Something I rarely see discussed clearly: how to actually get believable cloth and hair into a real-time game without tanking your frame budget. I've been working through this problem for a character-driven project and want to share what I found.

The Core Problem

Full physics simulation for cloth and hair in-engine is expensive and often unstable. But pre-rendered characters with zero cloth movement look lifeless. The middle ground is simulation baking — run a high-quality simulation offline, bake the result into vertex animation textures (VAT), and sample those textures at runtime.

The Workflow I've Been Using

  1. Set up the cloth/hair sim in Blender using the Cloth modifier or the hair particle system (or the new Curves-based hair in 4.x)
  2. Run the sim for a full loop cycle — idle breathing, walk, run, jump land — usually 200–400 frames each
  3. Bake to VAT using the Houdini Engine plugin or the open-source VAT Baker addon for Blender. This stores per-vertex position offsets as pixel data in a texture.
  4. In your engine (I'm using Godot 4 with a custom shader), sample the VAT in the vertex shader using UV2 coordinates, and blend between animation states by lerping between texture rows
// Simplified VAT vertex shader (Godot shading language, GLSL-ish)
// Each texture row is one frame; uv2.x selects the vertex's column
vec3 offset = texture(vat_texture, vec2(uv2.x, anim_frame / float(total_frames))).rgb;
// vat_scale remaps the stored offsets back to world-space units
VERTEX += offset * vat_scale;
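For context on what that shader is sampling, here's a rough sketch of the bake-side packing step in plain Python — not any particular addon's API, just the idea. It assumes a signed float texture target (e.g. EXR, later compressed to BC6H), so offsets can be stored directly after dividing by one global scale, which is the `vat_scale` the shader multiplies back in:

```python
# Sketch of bake-side VAT packing (illustrative, not a specific addon's API).
# Assumes a signed float texture target (e.g. EXR -> BC6H), so offsets are
# stored directly after dividing by a single global scale.

def bake_vat(rest_positions, frames):
    """rest_positions: list of (x, y, z); frames: list of per-frame vertex
    positions. Returns (texels, vat_scale): texels[frame][vertex] in -1..1,
    plus the scale the vertex shader multiplies back in."""
    offsets = [
        [tuple(p - r for p, r in zip(pos, rest))
         for pos, rest in zip(frame, rest_positions)]
        for frame in frames
    ]
    # The largest offset magnitude becomes vat_scale so stored values fit -1..1.
    vat_scale = max(
        (abs(c) for frame in offsets for v in frame for c in v), default=1.0
    ) or 1.0
    texels = [
        [tuple(c / vat_scale for c in v) for v in frame] for frame in offsets
    ]
    return texels, vat_scale
```

One texel per vertex per frame; the real addons also handle UV2 layout and padding, which this skips.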

Limitations Worth Knowing

  • No runtime physics response — cloth won't react to real collisions, only looks correct for baked scenarios
  • Texture memory adds up fast if you have many unique characters with long animation sets
  • Blending between two different VAT clips requires careful normalization or you get popping
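To make the texture-memory point concrete, here's a quick back-of-envelope calculator (plain Python). The 1 byte/texel figure is BC6H's fixed rate: 16 bytes per 4×4 block; the one-character example at the end uses made-up but plausible numbers:

```python
# Rough VAT memory estimate: one texel per vertex per frame.
# BC6H is fixed-rate: 16 bytes per 4x4 block = 1 byte per texel.

BC6H_BYTES_PER_TEXEL = 1.0

def vat_memory_bytes(vert_count, frame_count, bytes_per_texel=BC6H_BYTES_PER_TEXEL):
    # Block-compressed formats round each dimension up to a multiple of 4.
    width = -(-vert_count // 4) * 4
    height = -(-frame_count // 4) * 4
    return int(width * height * bytes_per_texel)

# Example: one character, four clips of 300 frames, 8k-vert cloth mesh.
total = 4 * vat_memory_bytes(8000, 300)
print(f"{total / 2**20:.1f} MiB")  # ~9.2 MiB of VAT for a single character
```

Multiply that by a full cast and it's easy to see why long animation sets get expensive.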

For secondary motion that does need to respond to gameplay (cape catching wind, hair reacting to quick turns), I layer a lightweight spring-bone system on top of the VAT result — cheap and surprisingly convincing.
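The spring-bone layer boils down to a per-bone damped spring chasing the animated target each frame. A minimal sketch of that update in Python — generic spring math, not any engine's actual API, with stiffness/damping values picked for illustration:

```python
# Minimal damped-spring bone step: the bone is pulled toward the target pose
# the VAT/animation provides, with velocity carried across frames so quick
# turns overshoot and settle (the "secondary motion" feel).

def spring_step(pos, vel, target, dt, stiffness=80.0, damping=12.0):
    accel = [stiffness * (t - p) - damping * v
             for p, v, t in zip(pos, vel, target)]
    vel = [v + a * dt for v, a in zip(vel, accel)]   # semi-implicit Euler
    pos = [p + v * dt for p, v in zip(pos, vel)]
    return pos, vel

# Usage: settle toward a displaced target over one second of 60 Hz steps.
pos, vel = [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]
for _ in range(60):
    pos, vel = spring_step(pos, vel, [0.0, 0.1, 0.0], 1 / 60)
```

Real spring-bone systems work on rotations down a chain rather than raw positions, but the stiffness/damping trade-off is the same.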

Has anyone tried this pipeline with Unreal's Chaos Cloth baked out, or found a better compression strategy for the VATs themselves? I'm currently just using BC6H but I feel like there's room to optimize.

One thing worth adding to the baked cloth discussion: vertex count budgeting before you bake matters a lot more than people expect. I've seen folks spend hours perfecting a simulation bake only to realize the cloth mesh has 8k verts that the rigger never optimized because "the sim handles it." For real-time, keep the baked mesh as lean as possible. The bake process tolerates a decimation pass on secondary cloth better than you'd think, especially for background characters.

For hair specifically, I've had better results baking to texture-driven strand animation rather than vertex animation when the budget is tight. Bake the simulation down to a flipbook of normal maps and use a wind-response shader to blend between frames. Loses some fidelity but the texture memory cost is predictable and it renders at nearly flat cost regardless of strand count.
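The flipbook indexing itself is just picking two neighboring frames and a blend factor from a looping phase value. A tiny sketch (the wind-to-phase mapping that would feed `phase` is up to your shader and isn't shown):

```python
def flipbook_frames(phase, frame_count):
    """phase in [0, 1) over one wind loop -> (frame_a, frame_b, blend)
    for crossfading two normal-map frames of the flipbook."""
    f = (phase % 1.0) * frame_count
    frame_a = int(f) % frame_count
    frame_b = (frame_a + 1) % frame_count   # wraps so the loop is seamless
    return frame_a, frame_b, f - int(f)
```

The wrap on `frame_b` is what keeps the loop seamless — the last frame crossfades back into the first.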

What tool are you using for the actual bake export? Houdini's KineFX has gotten genuinely good at this but Blender's bake-to-shape-keys workflow is surprisingly workable for indie scale if you script the cleanup step.

For hero character hair I've had solid results with a hybrid: a few animated bones driving the base motion (baked from sim) plus 2-3 blend shapes for volume at the extremes. The blend shapes handle the "squash" when hair hits the shoulder and the "stretch" on fast head turns, which gives a secondary-motion feel with zero runtime sim cost. It's not cinematics-grade, but for third-person gameplay cameras nobody's scrutinizing your hair physics that closely. Reserve the expensive baked sim clips for cinematic cutscenes where the camera actually gets close.
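One way the squash/stretch weights could be driven, sketched in Python — map head turn speed to the "stretch" shape and hair-to-shoulder distance to "squash", each clamped to 0..1. The function name and thresholds are invented for illustration:

```python
def hair_shape_weights(yaw_speed_deg, shoulder_dist,
                       stretch_at=360.0, squash_at=0.05):
    """Illustrative driver: a full-speed head turn (>= stretch_at deg/s)
    pushes 'stretch' to 1; hair closer to the shoulder than squash_at
    meters pushes 'squash' toward 1."""
    stretch = min(abs(yaw_speed_deg) / stretch_at, 1.0)
    squash = min(max(1.0 - shoulder_dist / squash_at, 0.0), 1.0)
    return stretch, squash
```

In-engine you'd feed this from the head bone's angular velocity and a cheap distance probe, then write the results straight into the blend shape weights.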

One approach that's saved me a lot of frame budget: use blend shapes (morph targets) to fake cloth secondary motion on hero characters, and reserve actual simulated cloth baking for cinematics only.

The workflow I landed on — run your cloth sim in Blender or Marvelous Designer at full quality, then bake maybe 8–12 looping blend shape targets that represent the range of motion (idle sway, run crunch, landing impact). At runtime you blend between them driven by velocity and animation state. It's not physically correct but on a character silhouette, players almost never notice.
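Driving the targets from velocity can be as simple as normalizing speed into a 0..1 parameter and splitting weight across adjacent targets. A sketch with three of the targets mentioned above — the names and the 6 m/s run speed are invented for the example:

```python
def cloth_blend_weights(speed, run_speed=6.0):
    """Distribute weight across three illustrative targets
    (idle_sway, walk_crunch, run_crunch) by horizontal speed."""
    t = min(max(speed / run_speed, 0.0), 1.0)
    if t < 0.5:                      # idle -> walk half of the range
        k = t / 0.5
        return {"idle_sway": 1.0 - k, "walk_crunch": k, "run_crunch": 0.0}
    k = (t - 0.5) / 0.5              # walk -> run half of the range
    return {"idle_sway": 0.0, "walk_crunch": 1.0 - k, "run_crunch": k}
```

With 8–12 targets you'd generalize this to a 1D (or 2D, with turn rate) blend space, which most engines' animation blend trees already give you for free.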

For hair specifically, I've had good results with bone chains driven by spring physics rather than baked vertex animation — Godot's SkeletonModifier3D and Unity's Magica Cloth 2 both handle this well with a tiny CPU cost. The trick is keeping chain count under 5 per hair cluster and using wide capsule colliders rather than lots of small ones.

The real budget killer is usually trying to simulate too many independent elements. Merging overlapping cloth panels into one mesh before baking cuts vert count dramatically and the simulation actually behaves better too.
