Meta is testing facial recognition technology to combat celebrity bait scams and improve the account recovery process on Facebook and Instagram. This initiative comes as scammers increasingly use the images of public figures in fraudulent ads, luring users into providing personal information or even sending money.
Detecting these scams can be tricky since many legitimate ads feature well-known personalities. Meta uses machine learning algorithms to scan millions of ads daily for potential violations.
If Meta’s systems flag an ad as potentially deceptive, they compare the face in the ad to the public figure’s profile pictures on Facebook and Instagram. If the comparison confirms that the ad is misusing a celebrity’s likeness in a scam, the ad is blocked immediately.
The company promises strict privacy safeguards, including deleting all facial data after a one-time comparison, whether or not the system identifies a match. Meta’s early tests involving a small group of celebrities have shown positive results, with faster detection of scams.
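Meta has not published how this comparison works internally, but the described flow, a one-time face match followed by immediate deletion of the facial data, can be sketched roughly as below. The `embed_face` helper and the 0.85 similarity threshold are illustrative assumptions, not details Meta has disclosed.

```python
# Illustrative sketch only; not Meta's actual implementation.
# Assumes a hypothetical `embed_face(image)` helper that returns a face
# embedding vector (e.g. from an off-the-shelf face recognition model).
import numpy as np

MATCH_THRESHOLD = 0.85  # assumed similarity cutoff, not a published value

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def review_flagged_ad(ad_image, profile_image, embed_face) -> bool:
    """Compare a flagged ad image against a public figure's profile picture.

    Returns True if the ad should be blocked. The derived facial data is
    discarded right after the one-time comparison, match or no match.
    """
    ad_embedding = embed_face(ad_image)
    profile_embedding = embed_face(profile_image)
    try:
        return cosine_similarity(ad_embedding, profile_embedding) >= MATCH_THRESHOLD
    finally:
        # Mirror the stated privacy safeguard: delete facial data after the check.
        del ad_embedding, profile_embedding
```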
Meta plans to notify more public figures soon and will automatically enrol them in this protection program. However, they can opt out at any time via the Accounts Center.
Scammers can also impersonate celebrities by creating fake accounts to promote fraudulent investment opportunities or solicit personal information in exchange for fake giveaways. Currently, Meta relies on its detection systems and user reports to flag impersonator accounts, but the company is exploring the use of facial recognition here, too.
The company’s potential solution involves comparing the profile pictures of suspicious accounts with those on the official accounts of public figures.
“We currently use detection systems and user reports to help identify potential impersonators. We’re exploring adding another step to help find this kind of fake account faster. This added layer of defence would work by using facial recognition technology to compare profile pictures on the suspicious account to a public figure’s Facebook and Instagram profile pictures,” Meta said.
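Under the same assumptions as the sketch above (a hypothetical `embed_face` helper and an assumed similarity threshold), this extra layer could amount to screening suspicious accounts against a public figure’s official profile pictures, roughly as follows.

```python
# Illustrative sketch only; names and threshold are assumptions.
import numpy as np

def _similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_impersonators(suspicious_accounts, official_photos, embed_face,
                       threshold: float = 0.85) -> list:
    """Return IDs of suspicious accounts whose profile picture matches any of
    a public figure's official profile pictures."""
    official_embeddings = [embed_face(photo) for photo in official_photos]
    flagged = []
    for account_id, profile_photo in suspicious_accounts:
        candidate = embed_face(profile_photo)
        if any(_similarity(candidate, ref) >= threshold
               for ref in official_embeddings):
            flagged.append(account_id)
    return flagged
```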
Another key area Meta is addressing with facial recognition is account recovery. Users who lose access to their Facebook and Instagram accounts due to forgotten passwords or scams often face a frustrating recovery process. Typically, they must submit an official ID or other documentation to verify their identity.
Meta is now testing video selfies as a quicker way to verify identity. Users can upload a video selfie, which the system compares to their profile pictures using facial recognition technology. The video is encrypted and securely stored, and Meta assures that the data will be deleted immediately after the verification process, regardless of the outcome.
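The described recovery check follows the same pattern, matching faces sampled from the video selfie against the account’s existing profile pictures and then discarding the facial data. A minimal sketch, again assuming a hypothetical `embed_face` helper and an illustrative threshold:

```python
# Illustrative sketch of the described video-selfie check; not Meta's code.
import numpy as np

def _similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_video_selfie(selfie_frames, profile_images, embed_face,
                        threshold: float = 0.85) -> bool:
    """Compare face embeddings from sampled video-selfie frames against the
    account's profile pictures for a one-time identity check."""
    frame_embeddings = [embed_face(frame) for frame in selfie_frames]
    profile_embeddings = [embed_face(img) for img in profile_images]
    try:
        return any(
            _similarity(frame, ref) >= threshold
            for frame in frame_embeddings
            for ref in profile_embeddings
        )
    finally:
        # Mirror the stated safeguard: derived facial data is deleted right
        # after verification, whatever the outcome.
        del frame_embeddings, profile_embeddings
```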
Meta claims this method makes identity verification faster than document-based checks and harder for hackers to exploit.
Meta and UK banks recently agreed to combat rising online fraud by sharing financial and transactional intelligence. In July 2024, Meta purged 63,000 Nigerian accounts involved in sextortion. In May, Meta, Match, Coinbase and others united to combat online fraud.
In April, Meta’s Oversight Board took up two celebrity deepfake cases. However, Meta has not always been on the right side of the ethical line. Last month, Meta was caught scraping Australian users’ data, including pictures of children.
In another case, Google and Meta entered into a secret deal to target teenagers on YouTube. In July, news reports showed that Meta allowed drug ads on its platform.