Will AI decode our emotions?

    With higher-computation-power devices now available on the market, researchers can use Artificial Intelligence to tackle complex tasks that were previously only theoretical. Among these, decoding human emotion is a particularly challenging problem. I'll share some of the ongoing work in AI to answer the question: will AI decode our emotions and feelings?

    Nine different facial expressions.

    What would it be like if we could read another person's moods and thoughts? We often can't tell how our loved ones are feeling or whether they are suffering. Nearly 800,000 people die by suicide every year, i.e., one person every 40 seconds (data verified from the WHO).

    World Health Organization (WHO / OMS) logo at WHO Headquarters, Geneva, Switzerland.

    The number is increasing day by day. We could prevent suicides if we knew about people's moods and emotions. That is not possible today; however, it may not be impossible in the future, and researchers have already made considerable progress.

    We use Artificial Intelligence algorithms in our daily lives. AI listens to our voices and monitors our body movements and language. Thanks to high-computing devices like mobile phones, AI is now easily within people's reach.

    With recent advancements, the AI industry is projected to be worth 25 billion dollars in the coming years.

    As technology progresses, the world has become more virtual. More personal virtual assistants have been developed, such as Siri, Cortana, and Google Assistant, and people are using them more often.

    Artificial intelligence and virtual assistants.

    People feel more comfortable talking with virtual assistants than with real people. A virtual assistant is programmed to be a friendlier and more attentive listener than a human. For these reasons, virtual assistants are widely used.

    Artificial intelligence decodes the facial expressions of mice

    Many neuroscientists and researchers have used AI/machine learning algorithms to decode lab mice's facial expressions. Building on this advancement, researchers are now investigating the brain neurons responsible for particular emotions, since neurochemicals underlie feelings.

    The research team used videography to study mice in the lab. They fixed the mice's heads to keep them still, provided different sensory stimuli, and filmed the responses for study.

    White mouse in a laboratory.

    The researchers placed both sweet and bitter fluids to evoke pleasure and disgust. The team even gave the mice small but painful electric shocks to induce pain and fear.

    Researchers studied the mice for three years. They knew that mice change their expressions by moving their ears, cheeks, noses, and the areas above their eyes. However, videography alone did not let them correlate the mice's images with their different emotions and feelings.

    Researchers then filmed the mice closely enough to capture even subtle movements. Machine learning algorithms were able to distinguish distinct expressions, and the team could draw some conclusions about the mice's emotions.

    Emotion study on mice.

    When mice experience pleasure, they pull their noses down toward their mouths and pull their ears and jaws forward. When they sense fear, they pull their ears back and bulge out their cheeks.
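As an illustration only, the idea of mapping facial-motion features to expression categories can be sketched as a nearest-centroid classifier. The feature names, prototype values, and labels below are invented for this example; the actual study's pipeline is not described in this article.

```python
# Hypothetical facial-motion features: (ear_forward, nose_down, cheek_bulge).
# The prototype values are invented for illustration; a real study would
# extract such features from video with computer-vision tooling.
PROTOTYPES = {
    "pleasure": (0.8, 0.9, 0.1),   # ears/jaw pulled forward, nose down
    "fear":     (-0.7, 0.0, 0.9),  # ears pulled back, cheeks bulged out
    "neutral":  (0.0, 0.0, 0.0),
}

def classify(features):
    """Return the label whose prototype is closest in Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(PROTOTYPES, key=lambda label: dist(features, PROTOTYPES[label]))

# A frame with ears forward and nose down reads as pleasure.
print(classify((0.7, 0.8, 0.2)))  # pleasure
```

In practice the features would come from tracked facial landmarks across many video frames, and the classifier would be trained rather than hand-set, but the geometric intuition is the same.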

    Connected nerve cells.

    The team then targeted the individual neurons responsible for emotions, using optogenetic techniques to study how different emotions are triggered in mice's brains.

    When researchers stimulated the mice's brains, the mice assumed the corresponding facial expressions. Single neurons in the mice's emotional centers reacted in sync with the facial expressions. This research could lead neuroscientists toward the origin of human emotion.

    Can emotion recognition software be trusted?

    Much emotion recognition software uses facial expressions or facial muscle movements to categorize feelings. But human emotions are hard to classify, as feelings do not depend on facial expression alone.

    Analyzing a face for emotion recognition.

    They also depend on voice and body language: we speak loudly in joy and in a low voice in fear. Companies did not analyze these factors in their marketed software.
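To make the multimodal point concrete, here is a minimal sketch of fusing per-emotion scores from a face model and a voice model with a weighted average. All scores, labels, and the weight are invented for illustration; real systems would obtain these from separately trained models.

```python
# Hypothetical late fusion of two modalities' per-emotion scores.
def fuse(face_scores, voice_scores, face_weight=0.6):
    """Return the emotion with the highest weighted-average score."""
    fused = {
        emotion: face_weight * face_scores[emotion]
                 + (1 - face_weight) * voice_scores[emotion]
        for emotion in face_scores
    }
    return max(fused, key=fused.get)

face = {"joy": 0.7, "fear": 0.2, "neutral": 0.1}   # face model: looks joyful
voice = {"joy": 0.1, "fear": 0.8, "neutral": 0.1}  # voice model: sounds fearful
print(fuse(face, voice))
```

A face-only system would confidently report joy here, while the voice evidence points to fear; the fused verdict depends on how the modalities are weighted, which is exactly why ignoring voice and body language makes such software less trustworthy.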

    Big tech giants are selling products they call "emotion recognition," and many established companies sell emotion analysis tools. Microsoft says its software "recognizes eight core emotional states based on the universal facial expression," a claim that reviewers have disputed.

    Will AI decode our emotions?

    Yes, AI will likely be able to decode and interpret our emotions and feelings in the future. But I doubt that today's "emotion recognition" products are entirely trustworthy, as our emotions can be hidden without moving a facial muscle. Performing research studies like the one done with mice might help achieve the goal of building AI that can decode our emotions and feelings.
