14 Sep 13 - @Sunshine_Jones on Twitter
Words are overrated. Cognitive language is a poor emotional yardstick, yet most of the sentiment analysis industry is focused on words. Think of emotions as your car’s spark plugs — small and hidden, but responsible for the combustion that ultimately powers the car. Similarly, emotions summon the words in your prefrontal cortex; we dress them up with cultural filters and social norms and run them through our personalized cognition. By the time the words come out, they are an almost indistinguishable mix in which emotion is only a small and diluted component.
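To make the critique concrete, here is a minimal sketch of what word-focused sentiment analysis typically looks like: a lexicon of words with positive or negative scores, summed over the text. The lexicon and scores here are illustrative assumptions, not any real product's data, but the failure mode is real: the method sees only the words, never the tone they were spoken in.

```python
# Illustrative word-level sentiment scoring (a toy sketch, not a real system).
# The lexicon below is a made-up example for demonstration purposes.
SENTIMENT_LEXICON = {
    "love": 2, "great": 2, "happy": 1,
    "sad": -1, "terrible": -2, "hate": -2,
}

def score_sentiment(text: str) -> int:
    """Sum per-word lexicon scores; ignores prosody, tone, and context."""
    words = text.lower().split()
    return sum(SENTIMENT_LEXICON.get(w.strip(".,!?"), 0) for w in words)

print(score_sentiment("I love this, it's great!"))  # clearly positive text
print(score_sentiment("Oh, great. Just great."))    # sarcasm scores positive too
```

Both sentences score as positive, even though the second is plainly sarcastic to any human listener — the kind of emotional signal carried by intonation, not vocabulary.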
Speaking of 'communications of feelings and attitudes,' the widely quoted formula of nonverbal-communication pioneer Albert Mehrabian, in 'Silent Messages,' suggests that only seven percent of our communicative impact regarding feelings and attitudes comes from verbal language. The bulk is delivered by body language and vocal modulation. Our intonations are literally tuned by our emotions — happiness or sadness, excitement or depression, anger or anxiety. Freed from language, the music of our vocal expression is universal and rings true across races and cultures. And not just for humans: think of the family dog.
Ironically, most sentiment analysis solutions are focused on figuring out that seven percent, with mixed results. One could, of course, turn to an MRI brain scan to crack the mystery of human language. Using MRI, Dr. Sophie Scott at University College London has done just that, showing how the brain takes speech and separates it into words and 'melody.' Her studies suggest words are shunted over to the left temporal lobe for processing, while the melody is channeled to the right side of the brain, a region more stimulated by music. Interesting as it may be, donning a Lady Gaga-like contraption on our heads to identify emotions in everyday conversations would certainly not meet with 'applause.'