Intelligent (and sentient) robots being bullied by people who don’t understand them — or who simply want to watch the robot burn, since after all it has no feelings — is a common trope in AI movies. But in today’s world, robots get bullied all the time, and while they really don’t have feelings, I can’t think of a better reason for a future robot uprising than human-on-AI bullying.
Remember my words when you’re hiding in an unplugged fridge while robots rummage through your apartment looking for things they can turn into fuel.
Look at this guy’s face:
That face belongs (spoiler alert: belonged) to a bot named hitchBOT. He made it all the way across Canada by sitting on the roadside and asking passersby for lifts. His next goal was to reach San Francisco’s Exploratorium, seeing a few sights along the way. Unfortunately, after just two weeks in the US, hitchBOT met his demise at the hands of an unknown attacker in Philadelphia.
Two weeks. That’s all it took before he was found torn apart in a pile of dead leaves.
Sure, hitchBOT might not have feelings, but man it takes a special kind of person to beat anything sporting that face!
…this is why we can’t have nice things.
In case you thought this was a US thing, check out this video of a bunch of kids bullying a robot that was just doing his thing at a mall. The robot was part of a research study examining how people bully robots. The findings?
- Children are assholes
More specifically, the probability of abuse (as estimated by the robot himself) climbed rapidly when more than one child approached him, and even more so when no adult accompanied them. When politely asked to move out of the way, the children refused, opting instead to block his path, cover his eyes, beat him, and throw things at him.
Children abusing robots is such a “thing” that there’s a paper (translated from Japanese) specifically titled “Why do children abuse robots?” and the results are fascinating. Here’s the too-long-didn’t-read (emphasis mine):
> Moreover, we found that the majority of them did not regard the robot as just a machine, but as a human-like entity. Thus, we consider that they regarded the robot as a human-like other, yet engaged in the abuse anyway, citing curiosity or enjoyment as the reason. Furthermore, we found that about half of the children believed the robot was capable of perceiving their abusive behavior. This suggests that these children lacked empathy for the robot (i.e. they knew, but did not empathize).
What you’re saying is these children saw the robot as a human-like entity, believed that it had the ability to perceive their abusive behavior, and still they went along with it?
We’re all fucked.
I, for one, welcome our new robot overlords.