After cleaning up a batch of fight choreography takes last week, I kept running into this: the animation looked fine at first glance, but on render you'd catch a single frame where a wrist or shoulder snapped slightly before settling back into the motion. Tiny. Invisible at 24fps in the viewport, but very obvious in a final render at full quality.
My first approach was a simple delta check: flag any frame where the frame-to-frame rotation difference exceeds a threshold. But that misses single-frame outliers that are small in magnitude yet represent genuine acceleration spikes. They're not large, they're fast: the problem isn't the size of the value change, it's the rate of change of the rate of change.
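To make that concrete, here's a toy example (all numbers invented) comparing a plain delta check against second differences on a slow ramp with a one-frame snap:

```python
# Toy example: a slow ramp with a tiny one-frame snap between indices 3 and 5.
# A plain delta check sees only modest steps; second differences expose it.
def first_diffs(values):
    return [values[i + 1] - values[i] for i in range(len(values) - 1)]

curve = [0.0, 0.5, 1.0, 1.5, 1.92, 2.5, 3.0]

vel = first_diffs(curve)    # steps: roughly 0.5, 0.5, 0.5, 0.42, 0.58, 0.5
accel = first_diffs(vel)    # roughly 0, 0, -0.08, 0.16, -0.08

# Delta check with a 0.7 threshold flags nothing: every step stays small.
print(any(abs(v) > 0.7 for v in vel))          # False
# But the acceleration at the snap towers over the flat background.
print(round(max(abs(a) for a in accel), 2))    # 0.16
```

The velocity never strays far from its baseline, so no per-frame threshold catches it without also flagging normal motion; the acceleration isolates the snap cleanly.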
So I wrote a script that computes the second derivative (acceleration) of each rotation fcurve, finds frames where that value is more than N standard deviations from the mean for that curve, and marks them in the timeline with labels showing which bones are affected.
```python
import bpy
import math
from collections import defaultdict


def get_stats(values):
    """Return (standard deviation, mean) of a list of floats."""
    if len(values) < 2:
        return 0.0, 0.0
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    return math.sqrt(variance), mean


def detect_rotation_spikes(action, threshold_sigma=2.5, target_bones=None):
    """Return {frame: [bone names]} for keyframes whose rotation
    acceleration deviates from the channel mean by > threshold_sigma std."""
    spike_frames = defaultdict(list)
    for fcurve in action.fcurves:
        data_path = fcurve.data_path
        # Only pose-bone rotation channels (euler, quaternion, axis-angle).
        if 'rotation' not in data_path or 'pose.bones["' not in data_path:
            continue
        bone_name = data_path.split('pose.bones["')[1].split('"]')[0]
        if target_bones is not None and bone_name not in target_bones:
            continue
        kps = fcurve.keyframe_points
        if len(kps) < 4:
            continue  # need at least 4 keys for 2 acceleration samples
        frames = [kp.co[0] for kp in kps]
        values = [kp.co[1] for kp in kps]
        # Divide by the frame gap so unevenly spaced keys don't read as spikes.
        vel = [
            (values[i + 1] - values[i]) / max(frames[i + 1] - frames[i], 1e-6)
            for i in range(len(values) - 1)
        ]
        accel = [vel[i + 1] - vel[i] for i in range(len(vel) - 1)]
        std, mean = get_stats(accel)
        if std == 0.0:
            continue
        for i, a in enumerate(accel):
            if abs(a - mean) > threshold_sigma * std:
                # accel[i] spans keys i..i+2; the middle key is the spike.
                spike_frames[int(frames[i + 1])].append(bone_name)
    return spike_frames


def run(threshold_sigma=2.5):
    obj = bpy.context.active_object
    if not obj or not obj.animation_data or not obj.animation_data.action:
        print('No active action found.')
        return
    action = obj.animation_data.action
    spikes = detect_rotation_spikes(action, threshold_sigma)
    if not spikes:
        print('No spikes detected.')
        return
    scene = bpy.context.scene
    # Clear markers from previous runs so they don't pile up.
    for m in [m for m in scene.timeline_markers if m.name.startswith('SPIKE:')]:
        scene.timeline_markers.remove(m)
    for frame, bones in sorted(spikes.items()):
        unique = list(dict.fromkeys(bones))  # dedupe, preserve order
        label = 'SPIKE: ' + ', '.join(unique[:3])
        if len(unique) > 3:
            label += f' +{len(unique) - 3}'
        scene.timeline_markers.new(name=label, frame=frame)
    print(f'Flagged {len(spikes)} frames in action: {action.name}')
    for frame, bones in sorted(spikes.items()):
        bone_str = ', '.join(dict.fromkeys(bones))
        print(f'  Frame {frame:4d} -- {bone_str}')


run(threshold_sigma=2.5)
```
Threshold of 2.5 sigma catches most real artifacts without too much noise on my data. On a 600-frame fight take I typically get 10–20 flagged frames, and maybe 3–4 are actually worth touching. The rest are intentional fast moves that just look statistically unusual relative to the rest of the take.
A few things I know are imperfect:
- Works per-channel, not per-bone. A spike in just the Y rotation channel triggers independently, so intentional fast rotations on a bone that's otherwise slow can false-positive
- Only analyzes keyframe points, not the interpolated curve. If you've already baked and smoothed, it can miss things between explicit keys
- Standard deviation is sensitive to outliers, which is somewhat circular. One big spike inflates std and can mask nearby smaller ones
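For the second limitation, one workaround is to sample the evaluated curve once per frame instead of reading keyframe points; in Blender that's `fcurve.evaluate(frame)`. Here's a rough sketch with the evaluator passed in as a plain callable so the spike logic can run outside Blender (`sample_accel_spikes` and its arguments are my names, not part of the script above):

```python
def sample_accel_spikes(evaluate, frame_start, frame_end, threshold_sigma=2.5):
    """Sample a channel once per frame and flag acceleration outliers.

    `evaluate` is any callable frame -> value; in Blender you'd pass
    fcurve.evaluate so baked/smoothed curves get checked between keys too.
    """
    frames = list(range(frame_start, frame_end + 1))
    values = [evaluate(f) for f in frames]
    vel = [values[i + 1] - values[i] for i in range(len(values) - 1)]
    accel = [vel[i + 1] - vel[i] for i in range(len(vel) - 1)]
    if len(accel) < 2:
        return []
    mean = sum(accel) / len(accel)
    std = (sum((a - mean) ** 2 for a in accel) / len(accel)) ** 0.5
    if std == 0.0:
        return []
    return [frames[i + 1] for i, a in enumerate(accel)
            if abs(a - mean) > threshold_sigma * std]


# Stand-in for fcurve.evaluate: a smooth ramp with a one-frame pop at 12.
spikes = sample_accel_spikes(lambda f: f * 0.1 + (0.5 if f == 12 else 0.0), 1, 24)
print(spikes)  # [12]
```

Since keys are now evenly spaced at one frame apart, there's no need to normalize by the frame gap.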
I tried IQR instead of std but it was too sensitive on sparse takes where there aren't enough keyframes to establish a solid baseline. Anyone found a better threshold approach? Also wondering if analyzing quaternion curves instead of euler would make the velocity math cleaner. Quaternion distances are more geometrically meaningful for rotation but I haven't gone down that path yet.
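One middle ground between raw sigma and IQR that might be worth trying is a median/MAD-based robust z-score: the median absolute deviation barely moves when one huge spike is present, which directly targets the masking problem in the third bullet above. A quick sketch, untested on real takes (the 1.4826 constant just rescales MAD to match the standard deviation under normal data):

```python
def robust_spike_indices(accel, threshold=2.5):
    """Flag accelerations whose MAD-based robust z-score exceeds threshold.

    Median and MAD replace mean/std so a single huge spike doesn't
    inflate the baseline and mask smaller spikes near it.
    """
    if len(accel) < 3:
        return []
    n = len(accel)

    def median(xs):
        s = sorted(xs)
        return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

    med = median(accel)
    mad = median([abs(a - med) for a in accel])
    if mad == 0.0:
        return []
    scale = 1.4826 * mad  # ~std-equivalent for normally distributed data
    return [i for i, a in enumerate(accel) if abs(a - med) > threshold * scale]


# One huge spike plus a smaller one: a plain std gets inflated by the big
# spike and misses the small one; MAD stays anchored to the quiet majority.
accel = [0.0, 0.01, -0.02, 0.01, 5.0, 0.0, -0.01, 0.6, 0.02, -0.01]
print(robust_spike_indices(accel))  # [4, 7]
```

With mean/std on the same data, the 5.0 spike inflates the standard deviation enough that the 0.6 spike at index 7 slips under a 2.5-sigma threshold; the MAD version flags both.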