titorite
Senior Member
Messages: 1,974
http://www.washingtonpost.com/blogs/inn ... ure-lying/
We are nearing a point where our smartphones will be able to recognize a face or voice, in real life or on-screen. And identification is only the most basic of the possibilities. Many app-makers are experimenting with software that goes further, attempting to determine someone’s emotions or honesty from just a few facial cues.
This interpersonal assessment technology promises to make our lives easier. Facial recognition, for instance, could enable immediate, personalized customer service. If a restaurant or retailer can identify me before I walk in the door, it could recognize me as a returning customer and pull up my favorite dishes or products. I would be greeted like an old friend (whether I were one or not).
Similarly, algorithms are now being developed that link thousands of facial cues with human emotions. Our brains do this naturally – we know without asking whether someone is happy or upset based only on their expression. Law enforcement officers and poker players take this a step further, using facial cues to judge someone’s honesty. Technology that augments this natural ability could turn those cues into direct, measurable and verifiable data. As sensors spread from our smartphones to activity trackers and smartwatches from Apple and Samsung, we are measuring more than ever and are not far from tracking our emotions continuously. Software is already in development to interpret people’s emotions and display the results via an app on a device such as Google Glass.
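To make the idea concrete, here is a deliberately tiny sketch of linking facial cues to emotions. The feature names, numeric values and emotion "centroids" are all invented for illustration – real systems learn thousands of cues from labeled data – but the nearest-centroid logic is one simple way such a mapping can work:

```python
import math

# Hypothetical average cue vectors per emotion:
# (brow_raise, lip_corner_pull, eye_openness), each on an invented 0..1 scale.
EMOTION_CENTROIDS = {
    "happy":     (0.2, 0.9, 0.6),
    "sad":       (0.1, 0.1, 0.3),
    "surprised": (0.9, 0.4, 0.9),
}

def classify_emotion(features):
    """Return the emotion whose centroid is nearest to the observed cues."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EMOTION_CENTROIDS, key=lambda e: dist(features, EMOTION_CENTROIDS[e]))

# A strong lip-corner pull with a modest brow raise lands nearest "happy".
print(classify_emotion((0.25, 0.85, 0.55)))  # -> happy
```

A production classifier would replace the hand-picked centroids with parameters fitted to thousands of annotated faces, but the core step – comparing an observed cue vector against learned emotion prototypes – is the same.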
Technology can also analyze the human voice to determine emotion – again, not just matching but surpassing our brain’s abilities. Moodies, an app developed by Beyond Verbal, can detect a speaker’s mood from nothing more than the voice. Call centers worldwide are testing the technology to help operators determine whether callers are upset and likely to take their business to a competitor.
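The call-center use case can be sketched in the same toy spirit. This is not how Moodies works internally – its acoustic features are proprietary – but a minimal heuristic, with invented thresholds, shows the shape of the problem: flag a caller as likely upset when the speech is both loud and unstable in pitch:

```python
def is_caller_upset(mean_energy, pitch_variance):
    """Flag a caller as likely upset when speech is both loud and pitch-unstable.

    Both inputs are assumed normalized to a hypothetical 0..1 scale;
    the thresholds below are invented for illustration only.
    """
    LOUD_THRESHOLD = 0.7
    UNSTABLE_THRESHOLD = 0.5
    return mean_energy > LOUD_THRESHOLD and pitch_variance > UNSTABLE_THRESHOLD

print(is_caller_upset(0.9, 0.8))  # loud, erratic speech -> True
print(is_caller_upset(0.3, 0.2))  # calm speech -> False
```

A real system would extract dozens of acoustic features from the audio and feed them to a trained model rather than two fixed thresholds, but the output is the same kind of signal an operator would see: a simple upset/not-upset flag.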
There are also potentially negative consequences. If anyone can run a person’s image and voice through an app to determine their emotions and veracity, we will have to adjust as a society. Many of our daily interactions are built on small lies: “So happy to see you,” “Of course I remember you,” and “This is the best (food, activity or place).” In other words, society functions more smoothly thanks to little white lies – do we really want to eliminate them?