Facebook accused of targeting 'insecure' children and young people, report says

Facebook has apologised after a 23-page leaked document obtained by The Australian revealed it had reportedly allowed advertisers to target emotionally vulnerable people as young as 14.

According to the news outlet, the document, prepared by two top Australian Facebook executives, describes how algorithms collect data (via posts, pictures, and reactions) on the emotional state of 6.4 million "high schoolers," "tertiary students," and "young Australians and New Zealanders … in the workforce," pinpointing "moments when young people need a confidence boost." 

In other words, the data indicates when young people feel "worthless" or "insecure" and are therefore well-positioned to receive an advertiser's message. 

A spokesperson for the social media giant told The Australian over the weekend that an investigation had been opened: "We have opened an investigation to understand the process failure and improve our oversight. We will undertake disciplinary and other processes as appropriate." 

Additionally, a Facebook spokesperson told Mashable that the document's insights were never used to target ads. 

"Facebook does not offer tools to target people based on their emotional state. The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook," said the spokesperson. "Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight." 

Still, there's no denying that data-mining algorithms such as this one not only exist but, in keeping with the basic principles of production for profit, are in use all the time. 

What makes things worse for Facebook is that the real-time monitoring of young people's emotions described in the document, marked "Confidential: Internal Only" and dated 2017, appears to breach the Australian Code for Advertising & Marketing Communications to Children.

As The Australian points out, the Code defines a child as a person 14 years old or younger, and states that children must "obtain a parent or guardian's express consent prior to engaging in any activity that will result in the collection or disclosure … of personal information." That is, "information that identifies the child or could identify the child." 

Mining Facebook for young people's and children's negative emotions, including "stressed," "defeated," "overwhelmed," and "useless," seems contrary to the ethical standards that the Code's authors, the Australian Association of National Advertisers (AANA), champion. 

The report is the latest example of Facebook's intelligence being used in the service of what some would consider unethical advertising. A ProPublica investigation in 2016 alleged that the platform enabled advertisers to discriminate by race through what Facebook calls its "ethnic affinity" targeting. 

In February, the company announced that it would begin using AI to identify ads for housing, credit, and jobs, and block any that targeted users by race. 

Perhaps news that Facebook is allowing ads to target young Australians based on their low emotional state will result in another "bare minimum" policy change. Either that, or the company may build even more AI tools to try to address the problem. 

The AANA has been contacted for comment. 
