Most facial animation I've seen this gen, in upcoming titles and in games already out, has been terrible. What I can't wrap my head around is that it's been years and years since Half-Life 2, HL2: Episode 1, and HL2: Episode 2 got facial animation right. That should be the baseline to build up from, yet I consistently see games that can't even reach the level of HL2, a very old game at this point, and it's kind of... sad.
It doesn't speak well of the people doing this work these days when so few manage anything resembling good facial animation, especially the lip syncing. The Source Engine had great lip syncing tech, by the way, that worked across multiple languages because it animated the face, or rather the mouth, to the voice itself instead of relying on hand animation or other means. Basically, it tried to match the lip movement to the sounds actually being spoken. It wasn't perfect, but it worked quite well, and it avoided a lot of the problems that come with dubbing into another language. Ever played a Japanese game dubbed in English where the mouth kept moving long after the English line had ended? That's exactly what the Source Engine's tech let Valve, and other developers, escape in non-English dubs of their Source Engine games.
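To make the idea concrete, here's a rough sketch of what audio-driven lip sync looks like in principle: phonemes are pulled out of the voice track with timestamps, each phoneme is mapped to a viseme (a mouth shape), and those visemes become animation keyframes. Everything below, the names, the mapping table, the keyframe format, is illustrative and assumed, not Valve's actual code; Source's real pipeline (phoneme extraction feeding the model's flex controllers) is far more involved, but the principle is the same.

```
from dataclasses import dataclass

# Hypothetical phoneme-to-viseme table -- a real system uses a much
# larger phoneme set and maps onto the face rig's own controls.
PHONEME_TO_VISEME = {
    "AA": "open",       # as in "father"
    "IY": "wide",       # as in "see"
    "UW": "round",      # as in "food"
    "M":  "closed",     # as in "map"
    "B":  "closed",     # as in "bat"
    "F":  "teeth_lip",  # as in "fish"
    "sil": "rest",      # silence
}

@dataclass
class PhonemeEvent:
    phoneme: str
    start: float  # seconds into the voice clip
    end: float

@dataclass
class VisemeKey:
    viseme: str
    time: float
    weight: float  # blend weight for the mouth shape

def lipsync_keys(events, fade=0.05):
    """Turn timestamped phonemes (taken from the audio itself) into
    viseme keyframes, so the mouth follows the recording."""
    keys = []
    for ev in events:
        shape = PHONEME_TO_VISEME.get(ev.phoneme, "rest")
        # Ramp each shape in at the phoneme start and out at its end,
        # so adjacent shapes cross-fade instead of popping.
        keys.append(VisemeKey(shape, ev.start, 0.0))
        keys.append(VisemeKey(shape, min(ev.start + fade, ev.end), 1.0))
        keys.append(VisemeKey(shape, ev.end, 0.0))
    return keys

# Example: a dubbed line simply produces a different phoneme stream,
# and the mouth animation follows it automatically.
line = [
    PhonemeEvent("M", 0.00, 0.08),
    PhonemeEvent("AA", 0.08, 0.25),
    PhonemeEvent("sil", 0.25, 0.40),
]
for k in lipsync_keys(line):
    print(f"{k.time:5.2f}s  {k.viseme:<9} weight={k.weight}")
```

The whole point of doing it this way is the one above: the animation is derived from the recorded voice, so an English, Japanese, or German dub each drives its own mouth movement without anyone having to re-animate the face by hand.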