
Meta shuts down over 1,000 ChatGPT scams

Meta’s security experts have identified approximately ten variations of malware masquerading as AI chatbot tools, including ChatGPT

Facebook parent Meta has warned of the widespread circulation of fraudulent ChatGPT malware, which aims to compromise user accounts and take control of business pages.

In its Q1 security report, the company revealed that malware operators and spammers are exploiting popular trends and topics to grab users’ attention, with AI chatbots like ChatGPT, Bing, and Bard being the latest tech craze. Consequently, fake AI chatbot tools, including ChatGPT, are currently in vogue, and scammers are using them to deceive unsuspecting users.

Meta’s security experts have identified approximately ten variations of malware masquerading as AI chatbot tools, including ChatGPT, since March. Some of these appear as web browser extensions and toolbars and are available in official web stores.

“The generative AI space is rapidly evolving and bad actors know it, so we should all be vigilant,” said Guy Rosen, Chief Information Security Officer, Meta.

According to the Meta report, threat actors have developed harmful browser extensions that are available in official web stores. These extensions falsely claim to offer tools related to ChatGPT. To avoid detection from both stores and users, some of these extensions even have functional ChatGPT features alongside the malware.

Meta has detected and blocked the sharing of over 1,000 such malicious URLs across its apps. The company noted that it has also reported the URLs to industry peers at the file-sharing services where the malware was hosted so that they can take appropriate action.

The Facebook owner said it also plans to roll out a new support flow for businesses affected by malware.

“In the months and years ahead, we’ll continue to highlight how these malicious campaigns operate, share threat indicators with our industry peers and roll out new protections to address new tactics,” said Rosen.