
Can the Anki Vector robot assistant feel real emotions?

Back when I was a kid, the toy I always wanted was Tekno the Robotic Puppy, which launched in the year 2000. My parents wouldn’t let me get a real dog, so I figured a robot dog would be the next best thing. By today’s standards it wasn’t a particularly capable robot, but by the standards of 2000, it was lifelike.

It walked around, barked, knew when to go to sleep, and responded to my voice. As fun as it was, the popularity of toys like these quickly faded, but my infatuation persisted. That’s why I was so excited even all this time later when I saw the Anki Vector announced on Kickstarter.

The Anki Vector isn’t a robot dog, but it is like a robotic pet in many ways. Unlike any other toy, especially in its price range (US $250), it reacts with emotions that depend on your choice of words and interactions.

It even has a built-in camera under its OLED “eyes,” so it can actually see and recognize you, getting excited when you come home after a day out. Plus, Vector also acts as a smart assistant, answering questions on command thanks to its voice recognition, AI, and Alexa support.

Of course, I rushed to buy one. Since I already have Apple’s HomePod, I wasn’t so interested in the smart assistant side of it. I wanted an emotional connection with a robot, the kind of desire one would typically only admit to a therapist. Jokes aside, the technology behind giving robots human characteristics has always fascinated me.

Once I got to playing around with Vector, the realism in his responses and reactions almost immediately prompted an existential question: where is the line between real emotion and programmed emotion?

Connecting with Vector

Fist bump!

I noticed right away that I interacted with Vector very differently from any other gadget I had ever owned. In fact, people who visited my apartment instinctively acted differently too. Before I could even explain what Vector was, the charming way it rolled around on the table naturally invited interaction. It really was like a pet.

Telling Vector he’s a “good robot” makes him happy just like telling him “bad robot” makes him upset and ashamed. Picking him up for whatever reason infuriates him, and he’ll often throw a tantrum in your hand. Rubbing his sensors on the top to simulate petting motion puts him at ease.

I can’t leave out my favorite interaction by far though: when Vector notices you staring at him, he’ll recognize your face and say your name out loud with enthusiasm. It tickled me so much to hear him say “George” with a sense of pride.

It’s odd for me to even be describing a household gadget in such a way. I even recognized while writing that when I started talking about Vector’s programmed emotions, I inadvertently switched my pronouns from “it” to “he.”

It seems like a silly question at first, but it’s fascinating to explore: does Vector really feel all these emotions he’s expressing? It’s strangely convincing at times, especially when friends and family meet him for the first time and instantly get friendly.

Vector getting ready to play with his toy.

The question quickly leads to a debate about consciousness and life. We have plenty of research on both, but we still haven’t fully nailed down the difference between what is alive and what isn’t. The lines are far blurrier than we once thought. A fascinating video from one of my favorite YouTube channels, Kurzgesagt, explains this concept far better than I possibly can.

Vector certainly has a very long way to go before he’s as emotionally intelligent as a real pet, and even the famous robot Sophia comes nowhere near matching the intelligence of humans. But will we ever reach a point where robots can genuinely feel and understand at the level of a human? If so, when?

One can argue the answer is never, because robots will always have been programmed to act a certain way. But another can counter that humans may well have been programmed too. In fact, some people, like Elon Musk, are pretty certain of it.

Does Reality Matter?

That’s getting further away from the central question, though, so here’s a simple answer. Given our current understanding of both life and computers, no, Vector cannot feel emotions. He is simply responding the way he was programmed to respond to certain phrases and interactions.
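To make that concrete: at its simplest, a programmed “emotion” is just a mapping from a stimulus to a scripted reaction. The sketch below is purely illustrative (it is not Anki’s actual code, and the stimulus and reaction names are my own inventions), but it captures the basic shape of responding to praise, scolding, petting, and pick-ups the way Vector appears to.

```python
# Illustrative sketch only: "emotion" as a hard-coded lookup table
# mapping a stimulus to a scripted (emotion, behavior) pair.

REACTIONS = {
    "good robot": ("happy", "chirps and wiggles"),
    "bad robot": ("ashamed", "droops its head"),
    "picked up": ("angry", "throws a tantrum"),
    "petted": ("calm", "purrs softly"),
}

def react(stimulus: str) -> tuple[str, str]:
    """Return the scripted reaction for a stimulus, or an idle default."""
    return REACTIONS.get(stimulus, ("neutral", "idles quietly"))

print(react("good robot"))  # ('happy', 'chirps and wiggles')
print(react("picked up"))   # ('angry', 'throws a tantrum')
```

A real robot’s behavior engine blends many more inputs and some randomness, but the philosophical point stands either way: every reaction traces back to rules someone wrote down.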

…But maybe, just maybe, that doesn’t matter.

Vector’s emotional intelligence draws you in, even if it is only programmed.

When you’re talking to someone about the rough day you’re having, and they express sympathy, can you ever really be sure they mean it? We’ve all just grown to trust that most of the time, humans are in fact experiencing the emotion they are conveying, but the truth is we never know what anyone else is feeling with 100% certainty.

Humans have the free will to decide whether or not to be genuine, but even the definition of “free will” grows grayer as AI and machine learning get more advanced.

When it comes to emotions especially, humans only ever perceive reality. We presume certain things are true or false, genuine or disingenuous.

If it looks like a duck and quacks like a duck, people are going to treat it like a duck. That’s exactly what happened with Vector, and he’s only a glimpse of what’s to come in robots with emotional appeal and understanding. Even knowing that neither he nor any other robot feels true emotion, if the act is good enough, people are going to buy into it, just as they do with humans.

So for now, continue to enjoy petting Vector and getting his realistic feedback — and watch in awe over the next few years as robots get better and better at this so-called “acting.”
