So recently, I ran across this guy.
Sensible enough. It rests on a few apparent assumptions, such as that we all have equivalently coded expressions corresponding to our emotions (a big leap), that we all drive worse when we're angry or flustered (a moderate leap), and that the number one reason we might lower our eyelids is fatigue (also moderate).
The concept of determining emotion and mental state by reading body language cues is not a new one; in fact, it is integral to communication. Humans have an astounding 43 muscles in the face, capable of generating myriad expressions that play a pivotal role in establishing rapport with other people. Some of those muscles are moved consciously; some are not. NLP has an entire subfield devoted to microexpressions, tying singular twitches of these muscles to internal mental states, and it is not pseudoscience to think that a computer can be trained to do the same thing. What I wonder about is what controls the experimenters used to establish these foundations.
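To make the leap concrete, here is a toy sketch of what such a system might look like at its simplest: a rule-based mapping from facial "action unit" intensities to coarse state labels. The feature names and thresholds are entirely hypothetical illustrations of the assumptions in question, not any real system's logic.

```python
def classify_state(features: dict) -> str:
    """Map hypothetical facial-muscle intensities (0.0-1.0) to a state label."""
    brow_furrow = features.get("brow_furrow", 0.0)
    lip_corner_pull = features.get("lip_corner_pull", 0.0)
    eyelid_droop = features.get("eyelid_droop", 0.0)

    # "Lowered eyelids means tired" is exactly the kind of encoded
    # assumption at issue: droopy eyes could also mean boredom, glare,
    # or simply someone's resting face.
    if eyelid_droop > 0.6:
        return "tired"
    if brow_furrow > 0.5 and lip_corner_pull < 0.2:
        return "angry"
    if lip_corner_pull > 0.5:
        return "happy"
    return "neutral"

print(classify_state({"eyelid_droop": 0.8}))  # tired
print(classify_state({"brow_furrow": 0.7}))   # angry
```

A real system would replace the thresholds with a trained classifier, but the underlying bet is the same: that these signals encode the same states for everyone, which is the control question raised above.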
It’s ultimately moot, though. I think we can all see where this is going.