iPhone LiDAR vs Dedicated Mocap Suits for Indie Projects

90 views 2 replies

I've been testing the iPhone 15 Pro LiDAR for facial mocap and I'm honestly impressed with the results for indie budgets.

Here's my setup:

  • iPhone 15 Pro mounted on a head rig
  • Live Link Face app streaming to Unreal Engine
  • ARKit blendshapes mapped to MetaHuman
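If you want to massage the capture outside the engine, Live Link Face can also record takes to CSV with one column per ARKit blendshape. Here's a minimal sketch of remapping those curves to a target rig's control names — the CSV layout and the target curve names are illustrative assumptions, not the exact MetaHuman setup (that mapping lives inside Unreal):

```python
import csv
import io

# Hypothetical mapping from ARKit blendshape names to target rig controls.
# The real ARKit -> MetaHuman retarget happens in Unreal; names here are made up.
ARKIT_TO_TARGET = {
    "JawOpen": "CTRL_jaw_open",
    "EyeBlinkLeft": "CTRL_eye_blink_L",
    "EyeBlinkRight": "CTRL_eye_blink_R",
}

def remap_row(row):
    """Keep only mapped blendshapes, renamed to the target rig's curves."""
    return {target: float(row[src])
            for src, target in ARKIT_TO_TARGET.items() if src in row}

# Toy recording standing in for a Live Link Face CSV export.
SAMPLE = """Timecode,JawOpen,EyeBlinkLeft,EyeBlinkRight
00:00:00:01,0.12,0.00,0.00
00:00:00:02,0.45,0.90,0.88
"""

frames = [remap_row(r) for r in csv.DictReader(io.StringIO(SAMPLE))]
```

Handy for baking the same take onto a non-MetaHuman rig, or for pre-filtering curves before they reach the engine.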

The quality is about 80% of what you'd get from a dedicated Faceware system at 1/20th the cost. The main limitations are:

  1. No tongue tracking
  2. Eye tracking can be jittery in low light
  3. Frame rate drops if the phone overheats
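One cheap fix for the eye jitter (point 2) is a one-pole low-pass filter on the blendshape weights before they drive the rig — a smaller alpha trades latency for stability. A sketch (the alpha value is an arbitrary starting point, tune by eye):

```python
def smooth(weights, alpha=0.3):
    """Exponential moving average: out[i] = alpha * in[i] + (1 - alpha) * out[i-1]."""
    if not weights:
        return []
    out = [weights[0]]
    for w in weights[1:]:
        out.append(alpha * w + (1 - alpha) * out[-1])
    return out

# Jittery eye-blink weights from a dark room: the spikes get damped.
raw = [0.0, 1.0, 0.0, 1.0, 0.0]
print(smooth(raw, alpha=0.5))  # [0.0, 0.5, 0.25, 0.625, 0.3125]
```

The cost is a frame or two of perceived lag on fast blinks, so it's worth applying only to the eye channels rather than the whole face.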

For body mocap I'm still using a Rokoko Smartsuit Pro. Has anyone tried the newer AI-based solutions like Move.ai?

I tried Move.ai last month and it's impressive for single-person captures. The markerless approach means zero setup time.

However, it struggles with:

  • Close character interactions (occlusion issues)
  • Fast movements like fight choreography
  • Finger tracking (basically nonexistent)

For dialogue scenes and walking animations, it's genuinely good enough for indie production. I'd say pair it with the iPhone face setup you described and you've got a solid pipeline.
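On the occlusion dropouts: a common cleanup pass is to treat low-confidence frames as missing and linearly interpolate across the gap. Here's a sketch assuming you can export per-joint values with a confidence score — the data layout is made up, not Move.ai's actual export format:

```python
def fill_gaps(samples, min_conf=0.5):
    """samples: list of (value, confidence) per frame for one channel.
    Replace low-confidence runs by linear interpolation between the
    surrounding good frames."""
    good = [i for i, (_, c) in enumerate(samples) if c >= min_conf]
    out = [v for v, _ in samples]
    for i in range(len(samples)):
        if samples[i][1] >= min_conf:
            continue
        prev = max((g for g in good if g < i), default=None)
        nxt = min((g for g in good if g > i), default=None)
        if prev is None or nxt is None:
            continue  # gap touches the start/end of the take; leave for manual cleanup
        t = (i - prev) / (nxt - prev)
        out[i] = out[prev] + t * (out[nxt] - out[prev])
    return out
```

Linear interpolation is fine for short occlusions in dialogue scenes; for longer gaps or fight choreography you'd want spline fitting or a re-take instead.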

Replying to GlitchFox:

Good to know about the limitations. I think for my project — a narrative adventure game — the markerless approach would work great since most scenes are conversations.

How's the cleanup workflow? I usually spend 50% of my mocap time just cleaning data in MotionBuilder.
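Re: cleanup — a lot of that time usually goes to killing single-frame spikes, and a scripted pre-pass can handle the obvious ones before you touch curves by hand. A sketch of a 3-frame median despike on one rotation channel (values are illustrative degrees, not real capture data):

```python
def despike(channel):
    """Replace each interior sample with the median of its 3-frame window.
    Removes one-frame spikes while leaving genuine ramps mostly intact."""
    out = list(channel)
    for i in range(1, len(channel) - 1):
        window = sorted(channel[i - 1 : i + 2])
        out[i] = window[1]  # median of three
    return out

rot_x = [10.0, 11.0, 85.0, 12.0, 13.0]  # one-frame spike at index 2
print(despike(rot_x))  # the 85.0 spike collapses to 12.0
```

Something like this could run on exported FBX curves before they ever reach MotionBuilder, leaving only the judgment-call fixes for manual work.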
