Bad news: Children give in to peer pressure when it comes from robots, so accept your new overlords now

Children will fall for peer pressure from robots, and then what will become of society? (Photo: Oli Scarff/Getty Images)

While adults seem to be able to resist the charms of robots, school-age children fall easily for their robotic wiles.

According to a new study, children will repeat incorrect responses that robots give to a simple visual task, whereas adults, who often fall prey to peer pressure from their human counterparts, resist pressure from the robots.

“Rather than seeing a robot as a machine, children may see it as a social character,” says psychologist Anna-Lisa Vollmer of Bielefeld University in Germany. “This might explain why they succumb to peer pressure [applied] by robots.”

The research built on a well-known study from the 1950s in which adults judged the lengths of lines alongside a group of peers who had been instructed to give obviously wrong answers, insisting that lines of different lengths were actually the same length. Participants in that study, ages 18 to 69, went along with their peers’ incorrect judgments. When the same experiment was run with robots in place of human peers, however, the adults disagreed with the robots.

But the children in the current study, who were 7 to 9 years old, went along with three-quarters of the robots’ incorrect answers.

Why does it matter? Robots seem to hold a particular sway over children, something designers should keep in mind as artificial intelligence makes its way into toys, classrooms, and homes.

The study reads: “In this light, care must be taken when designing the applications and artificial intelligence of these physically embodied machines, particularly because little is known about the long-term impact that exposure to social robots can have on the development of children and vulnerable sections of society. More specifically, problems could originate not only from intentional programming of malicious behavior (e.g., robots that have been designed to deceive) but also from the unintentional presence of biases in artificial systems or the misinterpretation of autonomously gathered data by a learning system itself. For example, if robots recommend products, services, or preferences, will compliance and thus convergence be higher than with more traditional advertising methods?”

That’s just like when my older brother replaced my Teddy Ruxpin cassette tape with a Mötley Crüe tape, and the talking bear encouraged me to go smoke in the boys’ room. So impressionable.
