I saw a demo of Mica, a short-haired woman who doesn’t speak but still communicates in warm ways with the viewer. I put the AR glasses on my head and looked through prescription inserts to see the virtual overlays on the real world. I thought it was the best thing Magic Leap showed off.
I walked into a physical room and sat in a chair. Mica was sitting at the table in the same room. She smiled and looked at me. I was struck that she wasn’t just looking at me. She was looking in my eyes. She tilted her head from side to side.
When I noticed how attentive she was, I leaned forward and looked into her eyes. She did the same and held my gaze. I leaned back, and she moved her head back too. She was mimicking some of the movements she saw me make. She didn’t talk, but speech is coming in the future.
On stage today at the Magic Leap L.E.A.P. event, Andrew Rabinovich, head of AI at Magic Leap, and John Monos, head of human-centered AI, talked about how they see digital humans and AI-based virtual avatars becoming real in the coming years on the Magic Leap platform.
These avatars will be like fully embodied Amazon Alexa smart speakers. When you wonder what the song was that you liked so much at the Pink Floyd concert last year, the virtual human, dubbed Aya in Magic Leap’s stage talk, will answer that it was “Another Brick in the Wall.” Aya will get to know you and your emotions. If your friend Erika comes over, Aya will detect that you are happy when you talk with Erika, and she will store that in her AI memory, Rabinovich said.
Late this year, Magic Leap will roll out avatar chat for its users. Over time, the company hopes to release all of the tools and technology that will make digital humans possible. Mica was powered by Unreal Engine 4, said Nick Whiting of Epic Games, the maker of the engine.