Been integrating Rokoko Smartgloves finger data into my pipeline and the results are... mixed. Broad hand poses read fine, but individual finger curl data has a constant high-frequency jitter that makes digits look like they're trembling even when the hand is completely still. Subtle in isolation, very obvious once it's on a character.
Things I've tried so far:
- Running the Blender keyframe strip script from AuroraPulse's thread to thin curve density — helps with bloat, doesn't actually fix the underlying noise
- Second pass in MotionBuilder with the per-bone smoothing filter on finger bones, usually around 35–45% — gets it closer but also flattens intentional curl detail on the same pass
- Manual graph editor cleanup on the worst digits — fine for a hero shot, completely unsustainable for a full session's worth of takes
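For anyone curious what the thinning step amounts to, here's a naive sketch of distance-based key decimation (not AuroraPulse's actual script; the `tol` threshold is a made-up parameter). It keeps a key only if dropping it would move the linearly interpolated curve by more than the tolerance, which reduces density but, as noted, does nothing about the noise itself:

```python
def thin_keys(times, values, tol=0.01):
    """Naive keyframe thinning: keep a key only if removing it would
    shift the linearly interpolated curve by more than `tol`.

    A rough stand-in for what density-reduction scripts typically do.
    """
    keep = [0]  # always keep the first key
    for j in range(1, len(times) - 1):
        # predict this key's value from the last kept key and the next key
        t0, v0 = times[keep[-1]], values[keep[-1]]
        t1, v1 = times[j + 1], values[j + 1]
        pred = v0 + (v1 - v0) * (times[j] - t0) / (t1 - t0)
        if abs(values[j] - pred) > tol:
            keep.append(j)
    keep.append(len(times) - 1)  # always keep the last key
    return keep  # indices of keys to retain
```

On a flat curve this collapses everything down to the two endpoint keys, which is exactly the "bloat" reduction described above.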
At this point I'm thinking about just throwing a Butterworth low-pass filter at the raw curve values in Python. Something like scipy.signal.butter + filtfilt with the cutoff tuned per take. Haven't tried it yet but it feels more principled than whatever MotionBuilder's smoothing slider is actually doing under the hood.
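Something like this minimal sketch is what I have in mind (the 6 Hz cutoff and 60 fps are guesses to tune per take; `filtfilt` runs the filter forward and backward, so there's no phase lag dragging the curl behind the performance):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_finger_curve(values, fps=60.0, cutoff_hz=6.0, order=2):
    """Zero-phase Butterworth low-pass on one channel of curve values.

    values    : 1D array of sampled keyframe values for one finger bone
    fps       : capture rate of the take
    cutoff_hz : everything above this is treated as jitter (tune per take)
    """
    nyquist = fps / 2.0
    b, a = butter(order, cutoff_hz / nyquist)  # normalized cutoff in (0, 1)
    return filtfilt(b, a, values)              # forward-backward, no lag
```

One design note: keeping the order low (2) and tuning `cutoff_hz` instead tends to be easier to reason about than cranking the order, and a deliberate finger curl lives well under ~6 Hz while jitter sits much higher.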
I'm also reconsidering whether the right move is to just use the Smartgloves data as a rough pose guide and hand-key the actual finger performance on top. That feels like it defeats the purpose of buying the hardware, but maybe that's just the reality of the price tier.
Has anyone built a filtering step into their glove data pipeline they're actually happy with? And has anyone tried constraining finger bone range with driven keys to limit noise that way — like capping per-digit deviation based on wrist pose? Sounds reasonable in theory but I haven't seen anyone do it in practice.
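To make the driven-key question concrete, the core of it could be prototyped offline before wiring anything into rig drivers. A rough sketch (the wrist-driven `neutral` pose and `max_dev` band are hypothetical values you'd derive from your own rig):

```python
import numpy as np

def clamp_curl(curl, neutral, max_dev):
    """Limit per-frame finger curl to a band around a wrist-pose-driven
    neutral pose -- an offline stand-in for a driven-key setup.

    curl    : per-frame curl angles for one digit (degrees)
    neutral : expected curl given the current wrist pose (degrees)
    max_dev : allowed deviation from neutral before clamping (degrees)
    """
    return np.clip(curl, neutral - max_dev, neutral + max_dev)
```

It wouldn't remove the jitter inside the allowed band, but it would stop the worst outlier frames from reading as a twitch, which might be enough when combined with a low-pass.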