The industry spent years operating on a simple premise: secure the code, and the assets stay safe. That logic no longer holds. As protocols became harder to crack, attackers stopped targeting the blockchain and started targeting the person using it.
Data from the Chainalysis 2026 Crypto Crime Report puts the cost of this shift at $17 billion in stolen funds for 2025. The criminal calculation is straightforward. Why burn resources trying to break a cryptographic key when you can just trick someone into signing a transaction? This rise in social engineering hits just as digital assets move past the experimental phase.
Binance recently crossed the 300 million registered user milestone, signaling that crypto has become a massive global infrastructure. When you operate at the scale of a major nation-state, security ceases to be a feature and becomes a public necessity. Growth cannot outpace protection.
As the user base expands, however, the methods used to exploit that trust have industrialized.
The industrialization of social engineering
The market is undergoing a migration from technical hacks to psychological warfare. The sharp rise in fraud losses is not due to broken blockchains or exchange-level glitches but rather broken trust models. Criminals have industrialized fraud, moving away from lone-wolf operations to organized crime syndicates utilizing phishing-as-a-service kits and AI-driven deception.

The data is stark. Chainalysis reports that impersonation scams grew 1,400% year over year in 2025, and attackers are using generative AI to apply a new kind of pressure. They manufacture deepfakes and scripts polished enough to slip past even experienced users, and it works: AI-enabled scams are now 4.5 times more profitable than the traditional variety.
Victims are also paying a steeper price per incident; the financial damage is growing, not just the frequency of attacks. The average scam payment rose 253% to $2,764, and data from PeckShield supports the trend, showing a 64% year-over-year jump in total scam losses for 2025.
This surge suggests that hardware wallets and exchange protocols have become hard targets, yet the user remains the soft spot. When a user voluntarily signs a malicious transaction because they believe they are talking to customer support or a trusted investment advisor, cryptography cannot save them.
These security failures should be viewed as direct feedback for Web3 designers. Users are consistently falling for sophisticated social engineering, meaning that the interface itself must evolve to recognize and intercept these threats. The industry is facing a feedback loop where organized crime scales its capabilities in lockstep with adoption. This turns fraud into a high-margin global enterprise.
Investing heavily in security and compliance
You can’t fight industrialized fraud with user education alone. Platforms need to build trust by design directly into the interface, catching errors before the money leaves the wallet. Binance has poured resources into this specific layer of defense.
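Binance does not publish the internals of these controls, but the "catch the error before the money leaves the wallet" idea can be illustrated with a toy pre-withdrawal check. Every name and threshold here (the scam-address list, the cooling-off period, the large-transfer limit) is a hypothetical stand-in, not a description of any real exchange's rules:

```python
from dataclasses import dataclass, field

# Hypothetical guardrails: all values are illustrative assumptions.
KNOWN_SCAM_ADDRESSES = {"0xdeadbeef"}   # e.g. fed by threat intelligence
FIRST_SEEN_DELAY_HOURS = 24             # cooling-off period for new payees
LARGE_TRANSFER_THRESHOLD = 10_000.0     # flag unusually large amounts

@dataclass
class Wallet:
    known_payees: set = field(default_factory=set)

def review_transfer(wallet: Wallet, dest: str, amount: float) -> list[str]:
    """Return a list of warnings; an empty list means the UI lets it through."""
    warnings = []
    if dest in KNOWN_SCAM_ADDRESSES:
        warnings.append("destination is on a known-scam list")
    if dest not in wallet.known_payees:
        warnings.append(f"first transfer to this address: hold {FIRST_SEEN_DELAY_HOURS}h")
    if amount >= LARGE_TRANSFER_THRESHOLD:
        warnings.append("amount exceeds the large-transfer threshold")
    return warnings

wallet = Wallet(known_payees={"0xfriend"})
print(review_transfer(wallet, "0xfriend", 50.0))      # no warnings
print(review_transfer(wallet, "0xdeadbeef", 20_000))  # multiple warnings
```

The point of a check like this is placement, not sophistication: it runs before the user signs, which is the only moment cryptography has not yet made the decision irreversible.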
In 2025, Binance reports its risk controls stopped $6.69 billion in potential fraud losses, protecting 5.4 million users. That is more than a statistic: it represents millions of people who kept their assets despite being targeted by sophisticated social engineering. The exchange also reports a 96% drop in direct exposure to illicit funds between 2023 and 2025, a decline suggesting that strict compliance filters succeed in cutting off the liquidity bad actors rely on.
The ecosystem’s defense also requires offline collaboration. According to Binance, the exchange handled more than 71,000 requests from law enforcement in 2025, helping connect on-chain activity to real-world investigations. But the technology battle is just as intense.
As criminals use AI to create better fakes, platforms use AI to spot them. Binance recently secured ISO 42001 certification for its AI management system and currently runs over 100 models dedicated to anti-fraud controls. These systems scan for behavioral anomalies in real-time, catching red flags that human analysts simply can’t see at speed.
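How those 100-plus models work is not public, but the core idea of behavioral anomaly detection can be sketched with a simple statistical baseline: score each new transaction against the user's own history and flag outliers. The z-score approach and the threshold of 3 below are illustrative assumptions, not Binance's method:

```python
import statistics

def anomaly_score(history: list[float], amount: float) -> float:
    """Z-score of a new transaction amount against a user's recent history.
    A minimal stand-in for the behavioral models described above."""
    if len(history) < 2:
        return 0.0  # not enough data to judge
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return 0.0 if amount == mean else float("inf")
    return abs(amount - mean) / stdev

def is_suspicious(history: list[float], amount: float, threshold: float = 3.0) -> bool:
    return anomaly_score(history, amount) > threshold

typical = [25.0, 30.0, 20.0, 35.0, 28.0]
print(is_suspicious(typical, 27.0))    # False: in line with past behavior
print(is_suspicious(typical, 5000.0))  # True: far outside the baseline
```

Production systems replace the single z-score with many learned features (device, timing, counterparty graph), but the principle is the same: the signal is deviation from the user's own baseline, which is exactly what a human analyst cannot compute at exchange scale.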
Binance Chief Data Protection Officer Barry Young noted the impact of these rigorous standards: “Combined with our existing portfolio of security and data privacy certifications, ISO 42001 signals that AI at Binance is developed and operated under a formal governance model that regulators, partners, and users can scrutinize and trust.”
Binance states these investments are the primary driver of retention for its 300 million users. Institutional investors and retail users alike are gravitating toward platforms that can demonstrate resilience against the $17 billion fraud economy. Security is no longer a cost center; it is the product.
The new perimeter: where AI meets user intuition
The rise of a $17 billion scam economy is a wake-up call. The industry has scaled, but so has the criminal element shadowing it. The old mantra of “your keys, your coins” implies a level of individual infallibility that simply does not exist at a mass scale. The future of crypto security relies on intelligence with guardrails.
Exchanges need to sit between the user and the risk. That means using real-time monitoring and AI to step in right before a user makes a costly error. The industry has to close the distance between hard security and human psychology. If users cannot be prevented from falling for psychological tricks, it becomes difficult to safely onboard the next wave of users. The security perimeter isn’t just the software anymore. It is the user’s decision-making process, and protecting it requires a defense as sophisticated as the attacks.
Investing involves risk, and your investment may lose value. Past performance gives no indication of future results. These statements do not constitute and cannot replace investment advice.
VentureBeat newsroom and editorial staff were not involved in the creation of this content.
