Mainstream Social Media Promoting Global Paedophilia, By Mrs Vera West
When I first saw this story about how mainstream social media was permitting accounts devoted to sexually exploiting underage people, mainly young girls, I thought it must be false news, being too incredible. Wouldn’t the police have been following this, one of the crimes of the century, and shut it down? However, the present disclosure comes from private research conducted by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst. One can only guess at the sheer amount of abuse and evil out there in the wilds of the internet, let alone the Dark Web. The problem will be to restrict this evil material without, by a flow-on effect, enabling censorship of legitimate dissenting political views. The system, once it moves, will be more interested in silencing political dissent than in silencing paedophiles; as the Epstein case demonstrated, plenty of the elites are paedophiles themselves.
“Instagram, the popular social-media site owned by Meta … Platforms, helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content, according to investigations by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst.
Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the Journal and the academic researchers found.
Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest. The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as “little slut for you.”
Instagram accounts offering to sell illicit sex material generally don’t publish it openly, instead posting “menus” of content. Certain accounts invite buyers to commission specific acts. Some menus include prices for videos of children harming themselves and “imagery of the minor performing sexual acts with animals,” researchers at the Stanford Internet Observatory found. At the right price, children are available for in-person “meet ups.”
The promotion of underage-sex content violates rules established by Meta as well as federal law.
In response to questions from the Journal, Meta acknowledged problems within its enforcement operations and said it has set up an internal task force to address the issues raised. “Child exploitation is a horrific crime,” the company said, adding, “We’re continuously investigating ways to actively defend against this behavior.”
Meta said it has in the past two years taken down 27 pedophile networks and is planning more removals. Since receiving the Journal queries, the platform said it has blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse. It said it is also working on preventing its systems from recommending that potentially pedophilic adults connect with one another or interact with one another’s content.
Alex Stamos, the head of the Stanford Internet Observatory and Meta’s chief security officer until 2018, said that getting even obvious abuse under control would likely take a sustained effort.
“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” he said, noting that the company has far more effective tools to map its pedophile network than outsiders do. “I hope the company reinvests in human investigators,” he added.
Technical and legal hurdles make determining the full scale of the network hard for anyone outside Meta to measure precisely.
Because the laws around child-sex content are extremely broad, investigating even the open promotion of it on a public platform is legally sensitive.
In its reporting, the Journal consulted with academic experts on online child safety. Stanford’s Internet Observatory, a division of the university’s Cyber Policy Center focused on social-media abuse, produced an independent quantitative analysis of the Instagram features that help users connect and find content.
The Journal also approached UMass’s Rescue Lab, which evaluated how pedophiles on Instagram fit into the larger world of online child exploitation. Using different methods, both entities were able to quickly identify large-scale communities promoting criminal sex abuse.”
https://cyber.fsi.stanford.edu/news/addressing-distribution-illicit-sexual-content-minors-online