AI-powered scams are on the rise, and fast. Between May 2024 and April 2025, reports of generative AI-enabled scams jumped by 456%, according to TRM Labs. Chainalysis also found that 60% of deposits into scam wallets now involve AI tools. That’s a massive jump from just over 20% in 2024.

Generative AI-enabled scams surge from 2022 to 2025 | Source: TRM Labs
What’s behind this explosion? AI now gives scammers speed, scale, and realism. A single fraudster can deploy thousands of phishing bots, create deepfake videos, and impersonate trusted brands, all without writing a single line of code.
And crypto? It’s the perfect target. You operate in a fast-moving, decentralized world with irreversible transactions. That makes it easier for scammers to strike and harder for you to recover.
In this guide, you’ll learn what AI-powered crypto scams are, how they work, and how to stay safe when using trading platforms, including BingX.
What Are AI‑Powered Crypto Scams?
AI-powered crypto scams use artificial intelligence to trick you into giving up your money, private keys, or login info. These scams go beyond old-school phishing. They’re smarter, faster, and more believable than ever.
Traditional crypto scams often rely on manual tactics. Think fake emails with bad grammar or basic giveaway schemes on social media. They’re easy to spot, if you know what to look for.
But AI changes the game. Scammers now use machine learning and generative AI to:
1. Create Realistic and Personalized Content That Feels Human
AI tools create phishing emails and fake messages that sound natural. They mimic human tone, use flawless grammar, and even add personal touches based on your online activity. These messages feel real, not robotic. That makes them much harder to spot than the scammy emails of the past.
Deepfake videos and voice clones take it further. A scammer can now copy someone’s face and voice with scary accuracy. You might watch a video or hear a call and truly believe it’s from a CEO, influencer, or friend.
2. Launch Massive Attacks at Lightning Speed
AI helps scammers move fast. With just a few prompts, they can generate thousands of phishing messages in seconds. Each message can be personalized, localized, and sent across multiple platforms, including email, Telegram, Discord, even SMS.
That scale used to take entire teams. Now, one scammer with the right AI tools can target thousands of users at once.
3. Bypass Traditional Filters and Security Systems
AI scams also beat traditional security filters. Old systems look for spelling errors, bad grammar, or known phishing links. But AI-generated content doesn’t make those mistakes. It adapts quickly. It uses new domains, rotates links, and even inserts invisible characters to fool filters.
Legacy fraud detection systems can’t keep up. That’s why AI-powered scams often slip through the cracks.
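To show how the invisible-character trick works, here’s a minimal Python sketch (an illustration only, not a production filter) that flags zero-width and other non-printing characters hidden inside a message. The example message is made up for demonstration.

```python
import unicodedata

def find_hidden_characters(text: str) -> list[tuple[int, str]]:
    """Return (index, character name) for invisible characters in the text.

    Unicode category "Cf" (format) covers zero-width spaces, joiners, and
    other characters that render as nothing but break up keywords so that
    naive string filters miss them (e.g. "Bi\u200bngX").
    """
    return [
        (i, unicodedata.name(ch, "UNKNOWN"))
        for i, ch in enumerate(text)
        if unicodedata.category(ch) == "Cf"
    ]

# Example: a zero-width space hides inside the brand name.
message = "Verify your Bi\u200bngX account now"
for pos, name in find_hidden_characters(message):
    print(f"Hidden character at index {pos}: {name}")
```

Real spam filters do far more than this, but the idea holds: characters you can’t see can still change what a machine reads.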
These attacks are more convincing because they mimic how people actually speak, write, and behave. They’re also easier to scale. With tools like WormGPT or FraudGPT, a single attacker can launch thousands of scams in minutes.

Inflows and deposits from AI scams are on the rise | Source: Chainalysis
According to Chainalysis, roughly 60% of all deposits into scam wallets now come from operations using AI tools. That’s triple the figure from last year.
AI gives scammers an edge, but you don’t have to be their next victim. Learning how these scams work is the first step in staying safe.
Common Types of AI-Powered Crypto Scams
AI-powered crypto scams come in many forms. Some trick you with fake voices or videos. Others use chatbots to earn your trust or build fake trading apps that look legit.
Let’s explore the most common types and real-world cases that show how dangerous they’ve become.
1. Deepfake Scams

How the funds from a deepfake giveaway scam were moved | Source: TRM Labs
Deepfake scams use AI-generated videos or audio clips to impersonate public figures, influencers, or even executives from your own company. Scammers manipulate facial expressions and voice patterns to make the content seem real. These fake videos often promote fraudulent crypto giveaways or instruct you to send funds to specific wallet addresses.
One of the most alarming cases happened in early 2024. A finance employee at a multinational company in Hong Kong joined a video call with what appeared to be the company’s CFO and senior executives. They instructed him to transfer $25 million. It was a trap. The call was a deepfake, and every face and voice was generated by AI. The employee didn’t know until it was too late.
This same tactic is being used to impersonate tech leaders like Elon Musk. In one scam, deepfake videos of Musk promoted a Bitcoin giveaway. Viewers were told to send BTC to a wallet and get double the amount back. Chainalysis tracked a single wallet that collected millions of dollars during a fake livestream on YouTube.
2. AI-Generated Phishing

Example of an AI-generated phishing website | Source: MailGun
Phishing has evolved with AI. Instead of sloppy grammar and suspicious links, these messages look real, and they feel personal. Scammers use AI to gather public data about you, then craft emails, DMs, or even full websites that match your interests and behavior.
The scam might come through Telegram, Discord, email, or even LinkedIn. You could receive a message that mimics BingX support, urging you to “verify your account” or “claim a reward.” The link leads to a fake page that looks nearly identical to the real thing. Enter your info, and it’s game over.
TRM Labs reported a 456% increase in AI-generated phishing attempts in just one year. These attacks now use large language models (LLMs) to mimic human tone and adapt to different languages. Some scammers even use AI to bypass KYC checks, generate fake credentials, or simulate live chats with “support agents.”
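To make the “check the link before you click” habit concrete, here’s a minimal Python sketch that accepts a URL only if its hostname exactly matches, or is a subdomain of, a domain on an allowlist. The allowlist and the example URLs are assumptions for illustration; confirm official domains through the app or bookmark you already trust.

```python
from urllib.parse import urlparse

# Hypothetical allowlist -- verify official domains yourself before relying on it.
OFFICIAL_DOMAINS = {"bingx.com"}

def is_official_link(url: str) -> bool:
    """Return True only if the URL's hostname is an official domain or a subdomain of one."""
    host = (urlparse(url).hostname or "").lower()
    # Punycode ("xn--") hostnames are often used to disguise lookalike characters.
    if host.startswith("xn--") or ".xn--" in host:
        return False
    return any(host == d or host.endswith("." + d) for d in OFFICIAL_DOMAINS)

# A lookalike domain fails, even though it "contains" the brand name.
print(is_official_link("https://support.bingx.com/help"))   # True
print(is_official_link("https://bingx-rewards.com/claim"))  # False
print(is_official_link("https://xn--bingx-123.com/login"))  # False
```

The point is exact matching: a brand name appearing somewhere inside a URL proves nothing.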
3. Fake AI Trading Platforms & Bots

How MetaMax used AI to create a fake company with fake employees | Source: TRM Labs
Scammers also build entire trading platforms that claim to use AI for automatic profits. These fake tools promise guaranteed returns, “smart” trade execution, or unbeatable success rates. But once you deposit your crypto, it vanishes.
These scams often look legitimate. They feature sleek dashboards, live charts, and testimonials, all powered by AI-generated images and code. Some even run demo trades that fake winning performance. In 2024, sites like MetaMax used AI avatars of fake CEOs to gain trust and draw in unsuspecting users.
In reality, there’s no AI-powered strategy behind these platforms, just a well-designed trap. Once funds enter, you’ll find you can’t withdraw anything. Some users report their wallets getting drained after connecting them to these sites. AI bots also send “signals” on Telegram or Twitter to push you toward risky or nonexistent trades.
4. Voice Cloning & Real-Time Calls

Example of an AI voice cloning scam | Source: FTC
AI voice cloning makes it possible for scammers to sound exactly like someone you know. They can recreate a CEO’s voice, your manager’s, or even a family member’s, then call you with urgent instructions to send crypto or approve a transaction.
This technique was used in the $25 million Hong Kong heist mentioned earlier. The employee wasn’t just tricked by deepfake video; the attackers also cloned voices in real time to seal the deception. Just a few seconds of audio is enough for scammers to recreate someone’s voice with shocking accuracy.
These calls often come during off hours or emergencies. You might hear something like: “Hey, it’s me. Our account is frozen. I need you to send USDT now.” If the voice sounds familiar and the request is urgent, you might not question it, especially if the number appears legit.
5. Pig-Butchering with AI

How a pig-butchering scam works | Source: TrendMicro
“Pig butchering” scams are long cons. They involve building trust over time, maybe weeks or even months. Scammers pretend to be a romantic interest or business partner, often using dating apps, Telegram, or WeChat. Once they gain your confidence, they convince you to invest in a fake crypto platform.
Now, they’re using AI chatbots to scale this strategy. These bots hold natural, flowing conversations. They follow up regularly, answer your questions, and even offer life advice. It’s all scripted, but it feels real.
In 2024, Chainalysis estimated that crypto scam revenue reached at least $9.9 billion globally, with AI-assisted pig-butchering among its fastest-growing segments. Some scammers even use deepfakes for video calls, showing a friendly face that seems human. Victims deposit small amounts, see fake gains, and then invest more, until the site disappears or withdrawals are blocked.
These scams all rely on one thing: your trust. By mimicking real people, platforms, and support teams, AI tools make it harder to tell what’s real and what’s fake. But once you know how these scams work, you’re much better prepared to stop them. Stay alert, and don’t let AI take your crypto.
How to Defend Yourself from AI Scams
AI scams are getting smarter, but you can stay one step ahead. Follow these tips to protect your crypto and peace of mind.
1. Enable 2FA and Use a Hardware Key if Possible: Turn on two-factor authentication (2FA) for your BingX account and wallet apps. Use an app like Google Authenticator or a physical device like YubiKey. It adds a second layer of security, even if your password is compromised. (A short sketch of how these one-time codes are generated and checked appears after this list.)
2. Verify Links and URLs Carefully: Before you click anything, hover over links to check the URL. Make sure it matches the official BingX domain. Bookmark login pages and avoid clicking on links sent via email, Discord, or Telegram.
3. Be Skeptical of Anything That Sounds Too Good: If someone promises “guaranteed profits” or double returns from an AI bot or trading tool, it’s a scam. AI doesn’t guarantee profits, and no one gives away free crypto without a catch.
4. Never Share Seed Phrases or Private Keys: No one from BingX or any legit platform will ever ask for your private keys. If someone does, they’re trying to steal your funds. Don’t enter seed phrases on random websites or send them to anyone claiming to help.
5. Use Official BingX Support Only: Scammers often pose as support agents. Always access BingX support through the official website or app. Don’t trust unsolicited DMs, even if they look legit.
6. Store Long-Term Crypto in a Hardware Wallet: Keep most of your crypto in cold storage. Hardware wallets like Ledger or Trezor keep your private keys offline, safe from remote attacks, even if you fall for a phishing scam online.
7. Stay Informed with BingX Learning Resources: Scams evolve fast. So should your knowledge. Follow BingX Academy for up-to-date security tips, scam alerts, and crypto safety guides. Education is your best defense.
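As promised in tip 1, here’s a minimal sketch of how app-based one-time codes (TOTP) work, using the open-source pyotp library as an example. The secret below is generated on the spot for illustration; in practice it’s created once when you scan the QR code in your authenticator app, and the server keeps a copy.

```python
import pyotp  # pip install pyotp

# One shared secret is created at enrollment (the QR code you scan).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Your authenticator app derives a fresh 6-digit code from the secret
# and the current time, roughly every 30 seconds.
code = totp.now()
print("Current one-time code:", code)

# The server, holding the same secret, verifies the code.
# A stolen password alone is useless without this second factor.
print("Valid right now?", totp.verify(code))
```

Because the code changes every 30 seconds or so and is derived from a secret stored only in your authenticator app and on the server, a phished password alone isn’t enough to log in.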
Conclusion and Key Takeaways
AI-powered crypto scams are evolving fast. But with the right habits, you can avoid becoming a target.
Stay alert to red flags like deepfake videos, AI-generated messages, or fake trading platforms. Protect your assets with tools like 2FA, hardware wallets, and secure URLs. Never share your seed phrase. And always double-check before you click or send anything.
Most of all, keep learning. The more you know, the safer you’ll be. Follow BingX Academy for the latest updates, security tips, and crypto education, so you can trade with confidence in a fast-changing world.