Meta is finally banning the accounts of these awful criminals

63,000 IG accounts banned after a horrific crime wave with devastating consequences
After the news that Spotify could connect you to the National Suicide Prevention Lifeline if you search for risky content, Meta is also taking matters into its own hands. This time it's not about self-harm, though: Meta is trying to combat a horrific crime that can have devastating consequences.

Financial sextortion: over recent years, there’s been a growing trend of scammers (cybercriminals known as Yahoo Boys) who target people across the internet.

– Meta blog, July 24, 2024

That's why Meta announced the "strategic network disruption" of two sets of accounts in Nigeria affiliated with Yahoo Boys and involved in financial sextortion scams.

Zuck's conglomerate removed approximately 63,000 Instagram accounts in Nigeria that engaged in the horrible scam. This included a coordinated network of around 2,500 accounts linked to a group of about 20 individuals. These scammers primarily targeted adult men in the US, using fake accounts to conceal their identities.

The coordinated network of around 2,500 accounts was identified through new technical signals and in-depth investigations by Meta's expert teams. Most of these accounts had already been detected and disabled by the company's enforcement systems; the investigation led to the removal of the remaining accounts and deepened Meta's understanding of the scammers' techniques, which in turn helped improve automated detection.

The investigation revealed that most of the scammers' attempts were unsuccessful and primarily targeted adults, although some attempts targeted minors. What an appalling thing to do! These accounts were reported to the National Center for Missing and Exploited Children (NCMEC). Relevant information is also shared with other tech companies through the Tech Coalition's Lantern program to enable broader action.

Applying lessons from dismantling terrorist groups and coordinated inauthentic behavior networks, Meta used the identification of this network to uncover more accounts in Nigeria attempting similar sextortion scams, bringing the total to around 63,000 accounts removed.

I'm sure that each of those 63,000 scam accounts has targeted more than one potential victim, so the total number of affected people could be staggering. This is an epidemic that has to be dealt with.

Second, Meta removed approximately 7,200 assets, including 1,300 Facebook accounts, 200 Facebook Pages, and 5,700 Facebook Groups in Nigeria that provided tips for conducting scams. Their activity included selling scripts and guides for scamming and sharing links to collections of photos to use for fake accounts.

Since this disruption, Meta's systems have been identifying and automatically blocking attempts by these groups to re-establish themselves. The new tactics observed have been used to further improve detection of accounts, Groups, and Pages engaged in this kind of activity.

Meta aims to help people recognize and avoid these scams while making it difficult for the criminals to succeed. Teens under 16 (under 18 in certain countries) are defaulted into stricter message settings to prevent them from being messaged by anyone they are not connected to, and Safety Notices encourage them to be cautious.

Meta has developed new signals to identify accounts potentially engaging in sextortion and is taking steps to prevent these accounts from finding and interacting with teens. Additionally, an on-device nudity protection feature in Instagram DMs is being tested, which will blur images detected as containing nudity, encourage caution when sending sensitive images, and direct users to safety tips and resources, including NCMEC’s Take It Down platform.

The Take It Down platform is described as "a step you can take to help remove online nude, partially nude, or sexually explicit photos and videos taken before you were 18".

Take It Down works by assigning a unique digital fingerprint, called a hash value, to nude, partially nude, or sexually explicit images or videos of people under the age of 18. Online platforms can use hash values to detect these images or videos on their services and remove this content. This all happens without the image or video ever leaving your device or anyone viewing it. Only the hash value will be provided to NCMEC.
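To illustrate the general idea of hash matching, here is a minimal sketch in Python. It assumes a plain SHA-256 digest stands in for the fingerprint (the real system most likely uses a purpose-built, re-encoding-resistant hashing scheme), and the KNOWN_HASHES set, file name, and function names are purely hypothetical. The point is simply that the media is fingerprinted locally, only the fingerprint is compared against the shared list, and the image or video itself never leaves the device.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints a platform has received via the hash-sharing
# program. In practice only these values are exchanged, never the media itself.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder value
}

def fingerprint(media_path: Path) -> str:
    """Compute a fingerprint of a local file without uploading it anywhere."""
    digest = hashlib.sha256()
    with media_path.open("rb") as f:
        # Read in chunks so large videos don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_content(media_path: Path) -> bool:
    """True if the file's local fingerprint appears in the shared hash list."""
    return fingerprint(media_path) in KNOWN_HASHES

if __name__ == "__main__":
    print(matches_known_content(Path("example.jpg")))  # hypothetical file name
```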