Dragon

Speech-to-text converters are coming into their own. But speech isn’t just words and sentences.

The use of emotion recognition might prove challenging as well, he added. Despite the claims that it improves love connections and speeds job interviews, consumers might bristle at the thought of being handled gingerly by a machine because they happen to have a note of frustration in their voices.

“The emotion-recognition aspect is being discussed widely,” Hegebarth said. “But there doesn’t seem to be a really reliable way of detecting emotional states fully, and some callers might not like it. They could find it intrusive.”

So what do they find intrusive? From an informal survey I conducted a while ago, it seems at least a slim majority of people don’t mind giving up information to an artificial system per se, provided they have some assurance that the information won’t cross human hands (cf. Gmail, for instance).

In any case, I don’t think there is the same reaction of intrusion if, for instance, a human listener registers the emotion in your voice and reacts accordingly. In fact, I imagine we expect the human to be able to handle our specific case, emotions and all, when they are talking to us.

It seems to me that what is intrusive about an automated and mechanical response to human emotions is that it makes our emotional response itself seem mechanical and predictable. My tone of anger doesn’t provoke a sympathetic response; it merely places me in the ‘anger’ category, to be dealt with in such and such a way.

In other words, if machines become responsive to our emotions, then even our most emotional responses can still be understood as the behavior of machines.
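To put that worry in concrete terms, here is a minimal sketch of the kind of categorical handling I mean. Everything in it is a toy assumption for illustration: the emotion labels, the classify_emotion heuristic, and the scripted replies are invented, not any real system’s API.

    # Toy sketch of categorical emotion handling: the caller's tone is
    # reduced to a single label, and the label selects a scripted response.

    SCRIPTED_RESPONSES = {
        "anger": "I'm sorry for the trouble. Let me get you to a specialist.",
        "frustration": "I understand this is taking a while. Please bear with me.",
        "neutral": "How can I help you today?",
    }

    def classify_emotion(features: dict) -> str:
        """Hypothetical classifier: maps acoustic features to one coarse label."""
        if features.get("pitch_variance", 0.0) > 0.8:
            return "anger"
        if features.get("speech_rate", 0.0) > 0.7:
            return "frustration"
        return "neutral"

    def respond(features: dict) -> str:
        # Whatever the caller actually feels, the system only ever sees one
        # of the predefined categories, to be dealt with in such and such a way.
        return SCRIPTED_RESPONSES[classify_emotion(features)]

    print(respond({"pitch_variance": 0.9}))  # prints the scripted 'anger' reply

Notice that the caller’s actual state never enters the interaction; only the category does, which is exactly what makes the response feel mechanical rather than sympathetic.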

3 Comments

  1. Certainly this doesn’t change anything at all. If I’m in customer service and someone is distressed or angry, then I deal with them with a given emotional response. This sort of automated emotional accommodation is understood. I think what might bother people is that with machines, since they don’t have emotions, this response would not be a reflection of empathy (sincere or otherwise). It would be patronizing for a machine voice to tell me “I understand your anger.”

  2. Surely it’s equally patronizing to hear that from a human. And of course it begs the question to say that machines don’t feel emotions. When machines are responsive to emotions, and their reactions are more or less standardized and understood in normal interactions (as they are with humans), what sense is left in saying that they don’t ‘have’ emotion?

  3. Um, machines don’t get angry. They don’t know what it feels like to deal with bureaucracy or stupid computers. Hence the machine can’t really empathize or understand on an emotional level. I guess you’d need to work out a standard for emotional responses and then test it to see how well humans fit it; then we could start to say that humans and machines have their standard emotional responses, and then (fingers crossed) we could stop with these laborious emotions altogether, since they really wouldn’t bring anything to the interaction once the standard had been ascertained and applied.
