wrote a quick Blender script to smooth mocap curves with per-bone weighting — sharing it here


Rokoko cleanup has been eating my time lately, specifically wrist rotations and hand data. Even after the built-in stabilization pass in Smartsuit Pro, I was still getting jitter that made hands basically unusable without manual work.

The issue with Blender's built-in smooth operator is that it hits every curve with the same intensity. I kept losing subtle hip and spine detail while the hands still looked bad. So I wrote a script that lets you set a smoothing weight per bone:

import bpy

def smooth_fcurves(obj, bone_weights=None, smooth_factor=0.5, iterations=3):
    # Bail out if nothing is selected or the object has no action.
    if not obj or not obj.animation_data or not obj.animation_data.action:
        return

    action = obj.animation_data.action

    for fcurve in action.fcurves:
        # Only touch pose-bone channels; object-level channels are skipped.
        if 'pose.bones' not in fcurve.data_path:
            continue

        # Bone name is the quoted segment: pose.bones["BoneName"].rotation_...
        try:
            bone_name = fcurve.data_path.split('"')[1]
        except IndexError:
            continue

        weight = bone_weights.get(bone_name, 1.0) if bone_weights else 1.0
        if weight == 0:
            continue

        effective_factor = smooth_factor * weight
        effective_iters = max(1, round(iterations * weight))

        kps = fcurve.keyframe_points
        if len(kps) < 3:
            continue

        for _ in range(effective_iters):
            # Snapshot first so every keyframe in a pass reads pre-pass values.
            values = [kp.co[1] for kp in kps]
            for i in range(1, len(values) - 1):
                kps[i].co[1] = (
                    values[i - 1] * (effective_factor / 2.0)
                    + values[i] * (1.0 - effective_factor)
                    + values[i + 1] * (effective_factor / 2.0)
                )

        fcurve.update()  # recompute handles after editing key coordinates


# per-bone weights: 0.0 = untouched, 1.0 = full smoothing
bone_weights = {
    'mixamorig:Hips':      0.3,
    'mixamorig:Spine':     0.5,
    'mixamorig:Spine1':    0.5,
    'mixamorig:RightHand': 1.0,
    'mixamorig:LeftHand':  1.0,
    'mixamorig:Head':      0.4,
}

obj = bpy.context.active_object
smooth_fcurves(obj, bone_weights=bone_weights, smooth_factor=0.6, iterations=4)

The bone_weights dict is the core of it. 0.0 leaves a bone completely untouched, 1.0 applies the full smooth_factor and iteration count, and in-between values scale both down. It parses the fcurve data_path to extract bone names, so it works with any naming convention as long as it follows the standard pose.bones["BoneName"] format.
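For reference, the bone-name extraction is just a string split on the quote characters. Here's the same logic as a standalone function you can sanity-check outside Blender (the data paths below are made-up examples of what Blender generates):

```python
def extract_bone_name(data_path):
    """Return the bone name from a pose-bone F-Curve data path, or None."""
    if 'pose.bones' not in data_path:
        return None
    try:
        # The bone name is the first quoted segment: pose.bones["Name"]...
        return data_path.split('"')[1]
    except IndexError:
        return None

print(extract_bone_name('pose.bones["mixamorig:LeftHand"].rotation_quaternion'))
# mixamorig:LeftHand
print(extract_bone_name('location'))
# None  (object-level channel, skipped)
```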

A few caveats: this is Laplacian-style weighted-average smoothing, not a Butterworth or Gaussian filter, so it attenuates amplitude rather than frequency bands; sharp peaks on fast movements will flatten if you push iterations high. For body data I stay at 2–3 iterations; for hands I'll go up to 5–6. Duplicate your action before running this: fcurve edits made via Python outside of operators aren't undoable.
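To see what each pass actually does, here's the same [f/2, 1-f, f/2] kernel applied to a plain list outside Blender, with a single jitter spike standing in for one bad wrist frame (synthetic numbers, just an illustration):

```python
def smooth_pass(values, factor):
    """One pass of the [f/2, 1-f, f/2] kernel; endpoints are left fixed."""
    out = list(values)
    for i in range(1, len(values) - 1):
        out[i] = (values[i - 1] * (factor / 2.0)
                  + values[i] * (1.0 - factor)
                  + values[i + 1] * (factor / 2.0))
    return out

# Flat signal with one jitter spike.
vals = [0.0, 0.0, 1.0, 0.0, 0.0]
for _ in range(3):
    vals = smooth_pass(vals, 0.6)
print([round(v, 3) for v in vals])
# [0.0, 0.198, 0.28, 0.198, 0.0]
```

Three passes knock the spike from 1.0 down to 0.28 while spreading a little energy into the neighbors, which is exactly why high iteration counts also flatten legitimate fast motion.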

It's been saving me 30–40 minutes per cleanup session. I keep hearing about people running mocap data through scipy.signal.butter before importing into Blender for proper frequency-domain filtering, and I keep meaning to try it. Anyone actually doing that in their pipeline, or is there a smarter in-Blender approach I'm missing?
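In case it helps the discussion, my rough understanding of the scipy route looks like this on a single raw channel before import. The 6 Hz cutoff and 60 fps rate are illustrative guesses on my part, not tested pipeline values:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(channel, fps=60.0, cutoff_hz=6.0, order=2):
    # Normalized cutoff is a fraction of the Nyquist frequency (fps / 2).
    b, a = butter(order, cutoff_hz / (fps / 2.0))
    # filtfilt runs the filter forward then backward, so the result is
    # zero-phase: fast movements aren't delayed, only attenuated.
    return filtfilt(b, a, channel)

# Synthetic wrist channel: slow 1 Hz motion plus 25 Hz jitter.
t = np.linspace(0.0, 2.0, 120)
clean = np.sin(2 * np.pi * 1.0 * t)
noisy = clean + 0.1 * np.sin(2 * np.pi * 25.0 * t)
filtered = lowpass(noisy)
```

The appeal over iterative averaging is that you pick an actual cutoff frequency instead of eyeballing an iteration count, and the zero-phase pass preserves timing.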
