Would having VR experiences make us more empathetic human beings?
The award-winning VR experience – The Tree –
follows the life and death of a tree, letting participants actively direct the experience as they engage with the tree's life. When the tree is finally, hopelessly destroyed by fire, the struggle feels real, offering a glimpse of what it might feel like to inhabit the trunk of a tree.
In pursuit of an empathetic machine, we may need to step back and question what empathy really means: how do we learn, categorise and process emotions?
With advances such as AlphaGo, does deep AI mean that the intelligence we create would include emotions like anger, sadness, happiness, frustration or joy? Go is a game I was somewhat familiar with as a child, and its complexity and range of possibilities are enormous. As in any competition, we often try to suspend our emotions when concentrating at such levels – particularly when you think of sportspeople competing or running a race. Perhaps, as in chess or poker, reading the other side's emotions is just as important as playing the game. How would that work with an AI?
Would an AI process emotions as well? AlphaGo combines advanced tree search with deep neural networks. When humans are defeated by AI, as in the match between AlphaGo and Ke Jie, how do we interpret that win?
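AlphaGo's real system pairs Monte Carlo tree search with deep policy and value networks trained on millions of positions – far beyond a blog snippet – but the tree-search half can be sketched on a toy game. The code below is a minimal, illustrative MCTS for an invented counting game (players alternately add 1 or 2; whoever reaches 10 wins); the game, node structure and constants are all assumptions made here for illustration, and random rollouts stand in for the value network.

```python
import math
import random

# Toy counting game: players alternately add 1 or 2 to a running total;
# whoever reaches exactly TARGET wins. Invented here for illustration.
TARGET = 10

def legal_moves(total):
    return [m for m in (1, 2) if total + m <= TARGET]

class Node:
    def __init__(self, total, parent=None, move=None):
        self.total, self.parent, self.move = total, parent, move
        self.children, self.visits, self.wins = [], 0, 0.0

    def untried_moves(self):
        tried = {c.move for c in self.children}
        return [m for m in legal_moves(self.total) if m not in tried]

def ucb(child, parent_visits, c=1.4):
    # Upper Confidence Bound: exploit the win rate, explore rare moves.
    return child.wins / child.visits + c * math.sqrt(math.log(parent_visits) / child.visits)

def rollout(total):
    # Random playout standing in for AlphaGo's value network.
    # Returns how many further moves it took to reach TARGET.
    depth = 0
    while total < TARGET:
        total += random.choice(legal_moves(total))
        depth += 1
    return depth

def best_move(root_total, iterations=2000):
    root = Node(root_total)
    for _ in range(iterations):
        node = root
        # 1. Selection: walk down fully-expanded nodes by UCB score.
        while not node.untried_moves() and node.children:
            node = max(node.children, key=lambda ch: ucb(ch, node.visits))
        # 2. Expansion: try one untried move, if any remain.
        moves = node.untried_moves()
        if moves:
            m = random.choice(moves)
            child = Node(node.total + m, parent=node, move=m)
            node.children.append(child)
            node = child
        # 3. Simulation: random playout from this position. An even depth
        # means the player who just moved into `node` wins.
        result = 1.0 if rollout(node.total) % 2 == 0 else 0.0
        # 4. Backpropagation, flipping the perspective at each level.
        while node is not None:
            node.visits += 1
            node.wins += result
            result = 1.0 - result
            node = node.parent
    return max(root.children, key=lambda ch: ch.visits).move
```

From a total of 8, the search converges on adding 2 for the immediate win – and notably, it does so with no emotion at all, only statistics over simulated futures.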
What human emotions do we want from AI?
Lovell parks the Turing test and welcomes the Frampton Test, drawn by analogy from the lyrics of rock star Peter Frampton's 1973 song: ‘Do you feel like we do?’
Lovelace 2.0, developed by Georgia Tech's Mark Riedl, differs from the Turing test:
How is your Lovelace 2.0 test different?
In my test, we have a human judge sitting at a computer. They know they’re interacting with an AI, and they give it a task with two components. First, they ask for a creative artefact such as a story, poem or picture. And secondly, they provide a criterion. For example: “Tell me a story about a cat that saves the day”; or “Draw me a picture of a man holding a penguin.”
Lovelace 2.0 is about creation: the AI can create designs, write a poem or story, or produce a painting. Could such creative AI serve a greater purpose in understanding and expressing emotions?
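Riedl's two-component task – a creative artefact plus a criterion – has a simple shape that can be encoded in a few lines. The classes and the keyword check below are illustrative inventions, not Riedl's formalism; in the real test a human judge decides whether the artefact satisfies the constraint.

```python
from dataclasses import dataclass

# A toy encoding of a Lovelace 2.0 task: the judge asks for a creative
# artefact and attaches a criterion. Names here are invented for illustration.
@dataclass
class LovelaceTask:
    artefact_type: str   # e.g. "story", "poem", "picture"
    criterion: str       # e.g. "a cat that saves the day"

def judge(task: LovelaceTask, artefact: str) -> bool:
    # Stand-in for the human judge: only checks that the criterion's key
    # words appear. A real judge evaluates whether the artefact genuinely
    # satisfies the constraint - something far harder to automate.
    keywords = [w for w in task.criterion.lower().split() if len(w) > 3]
    return all(k in artefact.lower() for k in keywords)

task = LovelaceTask("story", "a cat that saves the day")
print(judge(task, "Here is a story: the cat that saves the day was brave."))  # True
```

The gap between this keyword stub and real judgement is exactly the point of the test: satisfying an arbitrary creative criterion is easy to state and hard to fake.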
According to the Harvard Business Review, the emotions that fuel our creativity include positive experiences. How does that work with an AI? If an AI experiences feelings and emotions, is it likely to evolve? Would it want to experience emotions as human beings do? At that point, could we say that AI has reached consciousness?
What if AI could read human emotions and react as humans would – perhaps even more accurately than our fellow human beings?
What if AI could comfort us as friends do, when it sees that we or our loved ones are stressed? Could AI save lives by discussing emotions, say through a chatbot, with everyone from children to adults?
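Even the crudest version of this idea – spotting distress in a message and responding gently – can be sketched in a few lines. The keyword lexicon and canned replies below are purely illustrative assumptions; real systems use trained sentiment models, and nothing here approaches genuine empathy.

```python
# Illustrative sketch only: a keyword-based mood detector with canned,
# comforting replies. The word lists and responses are invented here.
DISTRESS_WORDS = {"stressed", "sad", "anxious", "overwhelmed", "lonely", "scared"}
POSITIVE_WORDS = {"happy", "great", "excited", "relieved", "proud"}

def detect_mood(message: str) -> str:
    words = {w.strip(".,!?").lower() for w in message.split()}
    if words & DISTRESS_WORDS:
        return "distressed"
    if words & POSITIVE_WORDS:
        return "positive"
    return "neutral"

def reply(message: str) -> str:
    mood = detect_mood(message)
    if mood == "distressed":
        return "That sounds really hard. I'm here - do you want to talk about it?"
    if mood == "positive":
        return "That's wonderful to hear! What made today feel good?"
    return "Tell me more about how you're feeling."
```

The sketch makes the question above concrete: a machine can pattern-match our words and return comforting ones, but whether that counts as comfort is precisely what the rest of this piece is asking.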
The questions that arise may be an inverted reflection of society. If, as a developed or developing society, we do not have ‘time’ to reflect, to feel our emotions – good or bad – or to spend time with friends, family or just ourselves sitting in silence, how will AI save us? As we progress to embrace AI, it could become a social experiment: will we use this opportunity to embrace humanity more closely, or distance ourselves further from it?
You might also like:
|Opening our Minds with systems thinking|
|What is consciousness?|