
5 ways in which robots are already more emotionally intelligent than you

Catch Team | Updated on: 10 February 2017, 1:38 IST

Remember Spike Jonze's movie Her, in which Scarlett Johansson plays an intelligent computer operating system? That reality is already closer than you think. Robots can understand human emotions far better than they could a decade ago, and far better than you thought possible. In fact, at some level, these robots and computers are already showing signs that they can replicate the human emotional quotient (EQ).

They may even have an advantage over us when it comes to EQ because, unlike humans, they lack emotion and therefore find it easier to read the feelings of the other party. Given the pace at which artificial intelligence is advancing, we could see robots replacing humans as bosses in the near future. Here are 5 ways in which computers are already replicating the human emotional quotient.

01
Leadership

Leadership is basically a mechanical process: getting a group of people to do their work and function together in the most efficient way possible. It's not about a leader's looks or attitude, but about the way a leader handles the multitude of people under him/her and harnesses their abilities. Once we can decode and break down what makes someone lead effectively, it becomes easy to automate that process. "If we can do anything in a clear and intelligible way, we can do it by machine," said Norbert Wiener, a pioneer of cybernetics.

Back in 1997, Rosalind Picard coined the term 'affective computing', which she defined as "computing that relates to, arises from, or deliberately influences emotion or other affective phenomena". Nineteen years later, there is a plethora of sensors, devices and other gadgets that are good at interpreting signals from the body - facial expressions, body gestures, speech - and some software can even be taught to learn patterns and use that data in many different ways.
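
To make the idea concrete, here is a toy sketch of how affective-computing software might learn emotion patterns from facial-feature measurements. This is not any vendor's actual code; the feature names, numbers and labels are invented purely for illustration:

```python
# Toy affective-computing sketch: learn to map facial-feature
# measurements to emotion labels. All data is synthetic; a real
# system would extract these features from camera frames.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: [mouth_curvature, eyebrow_raise, eye_openness] (invented features)
features = np.array([
    [0.8, 0.2, 0.5],    # smiling
    [0.7, 0.1, 0.6],
    [-0.6, 0.9, 0.9],   # surprised/alarmed
    [-0.4, 0.8, 1.0],
    [-0.7, -0.5, 0.3],  # frowning
    [-0.8, -0.4, 0.2],
])
labels = ["happy", "happy", "surprised", "surprised", "sad", "sad"]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(features, labels)

# A new face with an upturned mouth and relaxed brows
print(model.predict([[0.75, 0.15, 0.55]]))  # -> ['happy']
```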

Some of the leading banks in America, including Bank of America, have put systems in place to monitor the emotional range of their employees. Managers can then assist those who report 'out of depth' scores and help them take action.

02
Self-driving

Self-driving is the future. If you're one to feel panicky while driving a car, especially in Delhi's traffic, then you'd be happy with a future of self-driving cars obeying traffic rules to a tee. In Silicon Valley and a few other places around the world, autonomous cars are already replacing manned vehicles with the help of AI technology.

A startup called BRAIQ aims to teach the 'autonomous car' to drive the way its passenger would have driven it. By monitoring human bio-signals - facial expressions, eye movement, heart rate and others - the technology can gauge how the passenger is feeling, feed that into the system, and teach the car to respond accordingly.
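
A feedback loop like the one BRAIQ describes can be sketched roughly as follows. The signal names, thresholds and driving-style parameters are invented for illustration and are not BRAIQ's actual system:

```python
# Illustrative sketch of a passenger-comfort feedback loop:
# bio-signals in, driving-style adjustment out. Thresholds and
# weights are invented; a real system would learn them from data.
from dataclasses import dataclass

@dataclass
class BioSignals:
    heart_rate: float      # beats per minute
    gaze_fixation: float   # 0 (wandering) to 1 (locked on the road)
    grip_pressure: float   # 0 (relaxed) to 1 (white-knuckle)

def stress_score(s: BioSignals) -> float:
    """Combine signals into a single 0-1 stress estimate."""
    hr = min(max((s.heart_rate - 60) / 60, 0), 1)  # normalise 60-120 bpm
    return 0.4 * hr + 0.3 * s.gaze_fixation + 0.3 * s.grip_pressure

def adjust_driving(style: dict, s: BioSignals) -> dict:
    """Soften the driving style when the passenger looks stressed."""
    if stress_score(s) > 0.6:
        style["max_accel"] *= 0.8        # gentler acceleration
        style["follow_distance"] *= 1.2  # leave more room
    return style

style = {"max_accel": 2.5, "follow_distance": 30.0}
panicked = BioSignals(heart_rate=110, gaze_fixation=0.9, grip_pressure=0.8)
print(adjust_driving(style, panicked))  # softer settings for a tense passenger
```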


Another company, British startup Five.ai, aims to beat Ford and BMW to market by two years. "Five.ai thinks it will beat the incumbents by using more sophisticated machine-learning that will help a vehicle understand its surroundings without the need to constantly compare its data against ultra-precise, three-dimensional maps created by radar systems, an approach being tested by Ford and Google," writes Joon Ian Wong in Quartz.

03
End-user experience

How do you improve call centers? By automating them, of course. Cogito recently raised $15 million to help call center staff better understand the 'mood' of the customer based on their voice alone. "Cogito combines behavioral science and artificial intelligence to detect a person's emotional state and determine if their mood has changed and how conversations are going - all in real time," reports VentureBeat.
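
To illustrate the real-time idea (this is not Cogito's actual method), a system might track simple voice features for each second of audio and flag when the running mood estimate drifts from its baseline:

```python
# Illustrative real-time mood tracker over voice features.
# Features and weights are invented; a real system would derive
# them from the audio signal itself.
from collections import deque

def mood_score(pitch_var: float, speech_rate: float, volume: float) -> float:
    """Crude 0-1 agitation estimate from per-second voice features."""
    return 0.4 * pitch_var + 0.3 * speech_rate + 0.3 * volume

window = deque(maxlen=5)  # rolling 5-second window
baseline = None

# Simulated per-second features: (pitch variability, speech rate, volume)
stream = [(0.2, 0.3, 0.3), (0.2, 0.3, 0.4), (0.3, 0.4, 0.4),
          (0.7, 0.8, 0.8), (0.8, 0.9, 0.9)]

for second, feats in enumerate(stream, 1):
    window.append(mood_score(*feats))
    avg = sum(window) / len(window)
    if baseline is None:
        baseline = avg
    elif avg - baseline > 0.15:
        print(f"t={second}s: caller agitation rising (score {avg:.2f})")
```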

By doing so, agents will be happier and more engaged in their day-to-day jobs, thereby improving the end-user experience.

Cogito has also worked with the US government's Defense Advanced Research Projects Agency (DARPA) and the Department of Veterans Affairs to assist veterans with post-traumatic stress disorder (PTSD) and patients who have experienced depression - again, by tapping into human emotions.

04
Bots

Bots have limited tools for picking up on emotional cues. Emotibot, a Chinese startup, wants to change that. The company is focused on making sure that personal assistants, virtual customer service agents and chatbots can understand and respond to the emotional state of the person they are dealing with. A service that can respond to your emotions makes for a vastly better experience.

To achieve this, Emotibot uses audio, visual cues (via a camera) and text - sometimes all three at once. According to a TechCrunch report, it detects emotions from visual cues with an accuracy of 95.63%.
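
One simple way a multimodal system could combine its channels is a weighted average of per-modality emotion scores. This is a hypothetical late-fusion sketch, not Emotibot's published method; all numbers are invented:

```python
# Hypothetical late-fusion sketch: each modality produces a
# probability distribution over emotions; weight and combine them.
EMOTIONS = ["happy", "neutral", "angry"]

def fuse(predictions: dict, weights: dict) -> dict:
    """Weighted average of per-modality emotion distributions."""
    total = {e: 0.0 for e in EMOTIONS}
    for modality, dist in predictions.items():
        for e in EMOTIONS:
            total[e] += weights[modality] * dist[e]
    return total

predictions = {
    "visual": {"happy": 0.70, "neutral": 0.20, "angry": 0.10},
    "audio":  {"happy": 0.50, "neutral": 0.30, "angry": 0.20},
    "text":   {"happy": 0.10, "neutral": 0.60, "angry": 0.30},
}
# Visual weighted highest, mirroring its stronger standalone accuracy
weights = {"visual": 0.5, "audio": 0.3, "text": 0.2}

fused = fuse(predictions, weights)
print(max(fused, key=fused.get), fused)  # -> happy, with the combined scores
```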

05
Law

We can already see robots making inroads into the legal profession. Back in February, a Stanford University student developed a robot that interacts with clients to "give them legal counsel and even generates custom appeals".


Andrew Arruda, the CEO and co-founder of ROSS Intelligence, told Tech Insider earlier in the year that he expects a legal robot to soon do much more: "drafting its own documents, building arguments, and comparing and contrasting past cases with the one at hand". Essentially, at the heart of these AI systems are humans interacting with one another. With robots assisting lawyers (since robot lawyers aren't yet allowed to argue cases), that interaction will only grow, tapping into the emotive appeal that is sometimes needed to win a case.


What the movie Her demonstrated is that falling in love with a robot isn't a far-off future, especially with the first sex robots expected sometime in 2017. Forget science fiction: the future is already in front of our eyes, even if it needs a few more years of refinement. Within a decade, this reality could be commonplace.

First published: 4 December 2016, 1:13 IST