Good writeup. One thing I'd add that burned me early on with MetaHuman Animator: the stabilization pass matters a lot more than Epic's tutorials suggest. Raw iPhone capture, even with MHA, will have high-frequency jitter in the eye and jaw curves that reads fine on a phone screen but looks twitchy on a cinematic close-up.
My current fix is running a light Gaussian smooth on the problematic curves in the curve editor after import, but only on frequencies above roughly 12 Hz; anything slower than that is usually intentional micro-expression you want to keep (rough sketch of the idea below). Takes maybe 10 minutes per shot and makes a real difference. Also worth checking: if your subject wears glasses, MHA struggles badly with the eye region, and you'll want to rebuild those curves by hand or comp them separately.
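For anyone who'd rather do this offline than by hand in the curve editor, here's a minimal sketch of the same idea in plain numpy/scipy: split the curve into a slow component and a jitter residual, then attenuate only the residual. This assumes you've exported the facial curve as evenly sampled values at the capture frame rate; the function name, the 12 Hz cutoff, and the attenuation amount are all illustrative, not any official MHA or Unreal API.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smooth_high_frequency(curve: np.ndarray, fps: float = 60.0,
                          cutoff_hz: float = 12.0,
                          jitter_attenuation: float = 0.8) -> np.ndarray:
    """Attenuate only the jitter above cutoff_hz, preserving slower motion."""
    # Pick a Gaussian sigma (in frames) whose roll-off sits near cutoff_hz.
    # Rough approximation: sigma ~ fps / (2 * pi * cutoff_hz).
    sigma = fps / (2.0 * np.pi * cutoff_hz)
    low_pass = gaussian_filter1d(curve, sigma=sigma)       # slow, intentional motion
    jitter = curve - low_pass                               # high-frequency residual
    return low_pass + (1.0 - jitter_attenuation) * jitter   # keep a little jitter for life

# Example: clean up a jaw-open curve captured at 60 fps (file name is hypothetical).
# jaw_open = np.load("jaw_open_curve.npy")
# cleaned = smooth_high_frequency(jaw_open, fps=60.0, cutoff_hz=12.0)
```

Leaving a small fraction of the jitter in (rather than zeroing it) tends to avoid that dead, over-filtered look on close-ups, but tune the attenuation per shot.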
Live Link Face is just not in the same league for offline cinematic work at this point. The only case I'd still use it is for real-time previsualization where turnaround speed beats quality.
