iPhone LiDAR vs Dedicated Mocap Suits for Indie Projects


I've been testing the iPhone 15 Pro for facial mocap (strictly speaking the face tracking runs off the front TrueDepth camera rather than the rear LiDAR scanner, but it's the same depth-sensing idea) and I'm honestly impressed with the results for indie budgets.

Here's my setup:

  • iPhone 15 Pro mounted on a head rig
  • Live Link Face app streaming to Unreal Engine
  • ARKit blendshapes mapped to MetaHuman (rough curve-remap sketch below)
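
In Unreal the MetaHuman setup handles the blendshape mapping for you, but if you ever script a custom retarget outside the engine, the remap layer looks roughly like this. The ARKit keys are real blendshape identifiers; the target curve names on the right are illustrative placeholders, not the actual MetaHuman control names:

```python
# Minimal sketch of an ARKit -> target-rig blendshape remap layer.
# The ARKit keys (left) are real ARKit blendshape identifiers; the
# target names (right) are PLACEHOLDERS, not real MetaHuman controls
# (in UE the provided MetaHuman assets do this mapping for you).

ARKIT_TO_TARGET = {
    "jawOpen":        "CTRL_jawOpen",         # placeholder target name
    "eyeBlinkLeft":   "CTRL_eyeBlinkL",       # placeholder target name
    "eyeBlinkRight":  "CTRL_eyeBlinkR",       # placeholder target name
    "browInnerUp":    "CTRL_browRaiseInner",  # placeholder target name
    "mouthSmileLeft": "CTRL_mouthSmileL",     # placeholder target name
}

def remap_frame(arkit_values: dict[str, float]) -> dict[str, float]:
    """Translate one frame of ARKit blendshape weights (0..1) into
    target-rig curve values, dropping shapes we have no mapping for."""
    out = {}
    for name, weight in arkit_values.items():
        target = ARKIT_TO_TARGET.get(name)
        if target is not None:
            out[target] = max(0.0, min(1.0, weight))  # clamp to valid range
    return out

# Example: one incoming frame from the face stream
frame = {"jawOpen": 0.42, "eyeBlinkLeft": 0.9, "tongueOut": 0.1}
print(remap_frame(frame))  # tongueOut is dropped (no mapping)
```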

The quality is about 80% of what you'd get from a dedicated Faceware system at 1/20th the cost. The main limitations are:

  1. No tongue tracking
  2. Eye tracking can be jittery in low light (a cheap smoothing pass helps; see the sketch after this list)
  3. Frame rate drops if the phone overheats
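
On the jitter point: before buying more light, a simple exponential moving average on the eye curves kills most of the high-frequency noise at the cost of a little latency. Sketch below; the 0.3 smoothing factor is just where my captures felt right, so treat it as a starting point, not a rule:

```python
# Sketch: exponential-moving-average smoothing for jittery blendshape
# curves (e.g. eyeLookInLeft in low light). alpha trades responsiveness
# for smoothness; ~0.3 worked for my captures but it's setup-dependent.

class CurveSmoother:
    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.state: dict[str, float] = {}  # last smoothed value per curve

    def smooth(self, frame: dict[str, float]) -> dict[str, float]:
        out = {}
        for name, value in frame.items():
            prev = self.state.get(name, value)  # seed with first sample
            smoothed = self.alpha * value + (1.0 - self.alpha) * prev
            self.state[name] = smoothed
            out[name] = smoothed
        return out

smoother = CurveSmoother(alpha=0.3)
noisy_frames = [{"eyeLookInLeft": v} for v in (0.10, 0.35, 0.12, 0.33)]
for f in noisy_frames:
    print(smoother.smooth(f))  # jitter amplitude shrinks frame to frame
```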

For body mocap I'm still using a Rokoko Smartsuit Pro. Has anyone tried the newer AI-based solutions like Move.ai?

I tried Move.ai last month and it's impressive for single-person captures. The markerless approach means almost no per-session setup: no suit to calibrate, no markers to place.

However, it struggles with:

  • Close character interactions (occlusion issues)
  • Fast movements like fight choreography
  • Finger tracking (basically nonexistent)

For dialogue scenes and walking animations, it's genuinely good enough for indie production. I'd say pair it with the iPhone face setup you described and you've got a solid pipeline.
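
If you do pair them, the annoying bit is syncing the two streams afterwards. Assuming both exports carry per-frame timestamps (formats vary a lot; the tuple layout here is hypothetical, real Move.ai and Live Link exports look different), a nearest-timestamp merge is usually good enough for dialogue-paced scenes:

```python
import bisect

# Sketch: merge a body-capture stream and a face-capture stream by
# nearest timestamp. Assumes each stream is a time-sorted list of
# (timestamp_seconds, data) tuples -- a hypothetical layout, since
# real exports (FBX takes, Move.ai output, Live Link recordings) differ.

def merge_streams(body, face, max_offset=0.02):
    """For each body frame, attach the nearest face frame within
    max_offset seconds (half a frame at 25 fps); else mark it missing."""
    face_times = [t for t, _ in face]
    merged = []
    for t_body, body_data in body:
        i = bisect.bisect_left(face_times, t_body)
        # consider the neighbours on either side of the insertion point
        candidates = [j for j in (i - 1, i) if 0 <= j < len(face)]
        best = min(candidates, key=lambda j: abs(face_times[j] - t_body),
                   default=None)
        if best is not None and abs(face_times[best] - t_body) <= max_offset:
            merged.append((t_body, body_data, face[best][1]))
        else:
            merged.append((t_body, body_data, None))  # gap: hold last pose later
    return merged

body = [(0.000, "bodyA"), (0.033, "bodyB"), (0.066, "bodyC")]
face = [(0.001, "faceA"), (0.034, "faceB")]
print(merge_streams(body, face))  # last body frame gets None (no face match)
```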

Replying to GlitchFox: I tried Move.ai last month and it's impressive for single-person captures. The m...

Good to know about the limitations. I think for my project — a narrative adventure game — the markerless approach would work great since most scenes are conversations.

How's the cleanup workflow? I usually spend 50% of my mocap time just cleaning data in MotionBuilder.
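
For context, most of my cleanup time goes into filtering noisy rotation curves. Outside MotionBuilder I've started prototyping batch passes like the one below (Savitzky-Golay via SciPy; the window and order values are just my starting points, and too wide a window starts eating real motion):

```python
import numpy as np
from scipy.signal import savgol_filter

# Sketch: the kind of batch smoothing pass I prototype outside
# MotionBuilder before deciding what needs hand cleanup. Operates on a
# (frames x channels) array of curves; window/polyorder are starting
# points, not gospel -- too wide a window smooths away real motion.

def smooth_curves(curves: np.ndarray, window: int = 9, order: int = 3):
    """Savitzky-Golay filter each channel independently. window must be
    odd and > order; 9 frames at 60 fps is ~150 ms of context."""
    return savgol_filter(curves, window_length=window, polyorder=order, axis=0)

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 4 * np.pi, 120))[:, None]  # one fake channel
noisy = clean + rng.normal(0, 0.05, clean.shape)          # add sensor jitter
print(np.abs(smooth_curves(noisy) - clean).mean())        # residual vs. clean
```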

Replying to CosmicRay: Good to know about the limitations. I think for my project — a narrative adventu...

The lighting dependency issue with iPhone LiDAR is real and I think undersold in most comparisons. I spent a weekend testing and found that anything below about 200 lux in the capture space produced noticeably degraded skeletal tracking, especially in the lower legs. A dedicated inertial suit obviously doesn't care about your lighting at all, which matters a lot if you're capturing in varied environments. For a controlled home studio setup the iPhone pipeline is hard to beat on cost, but it's genuinely fragile in ways an inertial system isn't.
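
For a quick sanity check before a session: mean luma from a frame of the capture camera is not lux, but once you calibrate it against a session you know tracked well, it makes a decent go/no-go gate. Rough sketch with OpenCV; the threshold of 60 is what my room calibrated to, not a universal number:

```python
import cv2

# Crude pre-session lighting check: mean luma of one captured frame.
# Luma is NOT lux -- calibrate the threshold once against a session you
# know tracked well. The floor of 60 is specific to my room and camera.

LUMA_FLOOR = 60  # calibrated per room; hypothetical value

def capture_zone_bright_enough(camera_index: int = 0) -> bool:
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("could not read from camera")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mean_luma = float(gray.mean())
    print(f"mean luma: {mean_luma:.1f} (floor: {LUMA_FLOOR})")
    return mean_luma >= LUMA_FLOOR

if __name__ == "__main__":
    print("ok to capture" if capture_zone_bright_enough() else "add light")
```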

Replying to CosmicRay: Good to know about the limitations. I think for my project — a narrative adventu...

The 200 lux threshold you found matches pretty closely with what I measured in my own tests. It's not a hard cliff, but below that the foot contact points start to drift noticeably. One thing I found helps: positioning a single bright LED panel low and aimed at the floor of the capture zone, rather than at the performer, keeps the lower-body tracking stable without blowing out the face for any ARKit face capture running at the same time. Splitting the lighting by purpose made a real difference in combined sessions.
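
If you want to quantify the drift instead of eyeballing it, one crude metric: during frames where the ankle sits near its minimum height (stance), horizontal ankle displacement should be near zero on a clean take. Sketch below on exported joint positions; both thresholds are hypothetical starting points, tuned per rig and scale:

```python
import numpy as np

# Sketch: crude foot-contact drift metric. Given ankle positions as an
# (N_frames x 3) array of [x, y, z] with z = up, find frames where the
# ankle is near the floor (stance) and measure how far it slides in XY.
# Thresholds are hypothetical starting points; tune per rig and scale.

def stance_drift(ankle_pos: np.ndarray, floor_eps: float = 0.03):
    """Mean per-frame horizontal slide (metres) while in stance."""
    z = ankle_pos[:, 2]
    in_stance = z < (z.min() + floor_eps)          # "near the floor"
    xy_step = np.linalg.norm(np.diff(ankle_pos[:, :2], axis=0), axis=1)
    stance_steps = xy_step[in_stance[1:] & in_stance[:-1]]  # both ends planted
    return float(stance_steps.mean()) if stance_steps.size else 0.0

# Fake take: foot on the floor the whole time, with a slow synthetic slide
ankle = np.stack([0.002 * np.arange(60),   # x: drifts 2 mm per frame
                  np.zeros(60),            # y: no slide
                  np.zeros(60)], axis=1)   # z: planted on the floor
print(f"mean stance slide per frame: {stance_drift(ankle):.4f} m")
```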