Per the Journal, test accounts set up by researchers that viewed a single such account "were immediately hit with 'suggested for you' recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites. Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children." Researchers found that Instagram enabled people to search "explicit hashtags such as #pedowhore and #preteensex" and then connected them to accounts that used the terms to advertise child-sex material for sale, according to the report.

The Journal's report noted that Instagram accounts offering to sell illicit sex material "generally don't publish it openly" and often link to "off-platform content trading sites." The WSJ story also cited data compiled via the network-mapping software Maltego, which found that 112 of those accounts collectively had 22,000 unique followers.

As of the fourth quarter of 2022, Meta's technology removed more than 34 million pieces of child sexual exploitation content from Facebook and Instagram, more than 98% of which was detected before it was reported by users, the company said.

According to the Journal's report, "Technical and legal hurdles make determining the full scale of the network hard for anyone outside Meta to measure precisely." The article cited the Stanford Internet Observatory research team's identification of 405 sellers of what the researchers deemed "self-generated" child-sex material (accounts purportedly run by children themselves) using hashtags associated with underage sex.
A Meta spokesperson said the company is "continuously exploring ways to actively defend against this behavior, and we set up an internal task force to investigate these claims and immediately address them." Meta acknowledged that the company in some cases received reports of child sexual abuse and failed to act on them, citing a software error that prevented the reports from being processed (which Meta said has since been fixed). In addition, "we provided updated guidance to our content reviewers to more easily identify and remove predatory accounts," the Meta rep said.

"We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it," the rep said in a statement. "Predators constantly change their tactics in their pursuit to harm children, and that's why we have strict policies and technology to prevent them from finding or interacting with teens on our apps and hire specialist teams who focus on understanding their evolving behaviors so we can eliminate abusive networks."

Between 20, according to Meta, its policy enforcement teams "dismantled 27 abusive networks" and in January 2023 disabled more than 490,000 accounts for violating child-safety policies.