Meta Platforms Inc., the parent company of Facebook, Instagram, and WhatsApp, has removed 63,000 accounts associated with the notorious “Yahoo Boys” scam group, the company announced in its Q1 2024 Adversarial Threat Report on Wednesday.
The accounts, deleted over the past few weeks, were used for financial sextortion scams and distributing blackmail scripts.
Meta reported that a smaller network of 2,500 accounts, linked to around 20 individuals, primarily targeted adult men in the United States using fake identities.
Meta said it identified and disabled those accounts using a combination of technical signals and in-depth investigations, which also helped strengthen its automated detection systems.
“Financial sextortion is a borderless crime, fueled in recent years by the increased activity of Yahoo Boys, loosely organized cybercriminals operating largely out of Nigeria that specialize in different types of scams,” the report read.
“We have removed around 63,000 accounts in Nigeria attempting to target people with financial sextortion scams, including a coordinated network of around 2,500 accounts.”
“We have also removed a set of Facebook accounts, pages, and groups run by Yahoo Boys—banned under our dangerous organisations and individuals policy—that were attempting to organize, recruit, and train new scammers,” the company explained.
During the investigation, Meta said it found that most scammers’ attempts were unsuccessful, though some had targeted minors, noting that those cases were reported to the National Center for Missing & Exploited Children (NCMEC).
Meta revealed that it also shared information with other tech companies via the Tech Coalition’s Lantern program to help curb these scams across platforms.
Further, the parent company of Facebook said it removed around 7,200 assets in Nigeria, including 1,300 Facebook accounts, 200 pages, and 5,700 groups that were providing scam-related resources.
These assets were found offering scripts and guides for running scams, and sharing links to collections of photos used to create fake accounts, the company said.
Since this disruption, Meta’s systems have been actively blocking attempts by these groups to return, and the company said it is continually improving its detection capabilities.
The company noted that it had also been working closely with law enforcement, supporting investigations and prosecutions by responding to legal requests and alerting authorities to imminent threats.
The social media giant stated that its efforts extended beyond account removal.
“We also fund and support NCMEC and the International Justice Mission to run Project Boost, a program that trains law enforcement agencies around the world in processing and acting on NCMEC reports.”
“We’ve conducted several training sessions so far, including in Nigeria and Côte d’Ivoire, with our most recent session taking place just last month,” the firm revealed.
To protect users, especially teens, Meta disclosed that it has implemented stricter messaging settings for users under 16 (under 18 in certain countries) and displays safety notices to encourage cautious behaviour online.
Last week, Meta was fined $220m by Nigeria’s Federal Competition and Consumer Protection Commission for multiple violations of data protection laws linked to WhatsApp.
The investigation, initiated in May 2021, found that Meta’s privacy policies infringed on users’ rights, including unauthorized data sharing and discriminatory practices.
Meta said it planned to appeal the decision, arguing that it disagreed with both the findings and the penalty imposed.