“We’re talking about modelling pages, dancing pages, gymnastics pages for girls. It’s like any girl is up for grabs, really,” she said.
Ms Tankard Reist said it was worse on Instagram than other social networks because it was an image-based medium that is “very, very popular with young girls”, making it a magnet for predators and fantasisers.
The campaign, using the hashtags #WakeUpInstagram and #InstaPimpsGirls, is calling for Instagram’s parent company Facebook to make a number of changes to protect children.
First, to change the settings so that strangers cannot send direct messages to minors.
Second, to fix its algorithm to proactively remove sexualised or sexually graphic comments on minors’ photos.
Finally, to update its reporting system so that people reporting comments can flag that they appear on a minor’s post. Without this context, comments flagged through the reporting tool were often found not to breach community standards.
A Facebook spokesman said it used image-matching software to find child exploitation material and it had removed 754,000 items in the third quarter of 2019.
Instagram also has comment moderation tools that can restrict commenting to people a user follows, automatically block comments containing words or emojis the user has identified as offensive, or turn off comments altogether.
“Any content that endangers or exploits children is unthinkable and has no place on Instagram,” the spokesman said. “We have policies that prohibit this type of content, and we use proactive technology to find and remove content that poses a risk to children.”
But Ms Tankard Reist said the group had easily found hundreds of inappropriate comments: “If that’s the definition of proactive then we’ve got a problem.”
Instagram, like Facebook, requires users to be 13 before they join. As Ms Tankard Reist points out, this is “still underage”.
Australian eSafety Commissioner Julie Inman Grant said she shared Collective Shout’s concerns.
Onus on platforms
“We are placing the responsibility back on the technology platforms themselves so that they are designing, developing and deploying their online services with user safety in mind and protections are built-in up front — an initiative we call Safety by Design,” Ms Inman Grant said.
Michael Salter, associate professor of criminology at the University of NSW, said social networks should keep minors in a “walled garden” similar to the new YouTube Kids for children aged 12 and under.
He said the fundamental flaw in platform design was that adults could directly contact children who were strangers to them.
“For as long as those design features are enabled in platforms, we will always have predatory adults who are targeting children,” Dr Salter said.
“We need to reject industry claims that they cannot keep kids safe on the platforms. We need to reject the industry claim that it is ‘up to parents’ to police children’s behaviour.”
Dr Salter said technology companies were legally responsible for the content they hosted once it was reported to them but they often made it difficult to file a complaint.
Caitlin Fitzsimmons is a senior writer for The Sun-Herald, focusing on social affairs.