
Worldcoin, the biometric identity project led by OpenAI CEO Sam Altman, wants to change how we think about digital identity. It promises financial inclusion for everyone, especially people in underserved parts of the world. The catch? To get on board, you need to scan your eyeball.
The company, now just calling itself “World,” says the iris scan proves you’re a unique human in a sea of bots. In return, you get a “World ID” and some WLD tokens. Sounds futuristic — and maybe even noble. But not everyone’s convinced this is the kind of future we actually want.
A Question of Control
At the core of the criticism is this: can a biometric system that relies on custom-built hardware, encrypted iris scans, and centrally managed infrastructure ever really be decentralized?
Shadi Al-Damati, co-founder of the Holonym Foundation, put it bluntly: “Decentralization isn’t just about the code — it’s about values like personal control and privacy. The way World is structured goes against that.”
Even though the company uses tools like zero-knowledge proofs (ZKPs) and multi-party computation (MPC), which are meant to let it verify that each person is unique without exposing the underlying biometric, users still have to interact with a device it built, the "Orb," and trust that everything is handled securely. For critics like Al-Damati, that's a problem. "If one company controls the hardware, the network, and the rules, is it really decentralized — or just rebranded surveillance?"
World insists that users stay in control. According to their spokesperson, the iris image is encrypted, sent to the user’s phone, and then immediately deleted from the Orb. The data is never stored, they claim — only a hashed version is used, and even that’s processed anonymously. Still, the concerns aren’t going away.
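In rough terms, the flow the company describes means only a one-way digest of the scan is kept for duplicate detection, while the raw capture never persists on the Orb. The sketch below is a simplified illustration of that idea, not World's actual code: the function name, the use of SHA-256, and the stand-in for the raw image are all assumptions for the example (a real system would first derive a stable "iris code," since raw pixels differ between scans, and would encrypt the capture to the user's phone before deleting it).

```python
import hashlib

def process_capture(raw_iris_image: bytes) -> str:
    """Illustrative sketch of a 'hash-only leaves the device' pipeline.

    `raw_iris_image` stands in for the Orb's camera output. This is a
    hypothetical example, not World's implementation.
    """
    # Derive a one-way identifier from the capture. SHA-256 cannot be
    # reversed to recover the image, but identical inputs always produce
    # the same digest, which is what lets a network reject duplicate
    # sign-ups without storing the biometric itself.
    iris_hash = hashlib.sha256(raw_iris_image).hexdigest()

    # In the described flow, the encrypted image is handed to the user's
    # phone and the device's copy is destroyed; here we simply drop the
    # local reference to mirror that step.
    del raw_iris_image

    return iris_hash
```

The key property is that the digest is deterministic (the same capture always matches itself) yet non-invertible, so holding the hash does not mean holding the iris image.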
Déjà Vu from OpenAI?
The criticism isn’t just about tech — it’s also about trust. Al-Damati draws a direct line between World’s data collection and what OpenAI (Altman’s other company) did with internet data to train its models. “They scraped the internet without asking, then called it innovation,” he says. “Now they’re collecting people’s biometrics and calling that progress too.”
It’s not an empty comparison. In recent years, OpenAI has faced lawsuits over unauthorized data collection. In 2023, a class action accused the company of scraping 300 billion words from the internet, including private information belonging to adults and children. In 2024, Canadian media organizations filed another suit, accusing it of using their content without permission to train ChatGPT.
World distances itself from that narrative, claiming it doesn’t store or sell user data, and that its tech is designed for privacy from the ground up.
Incentives or Exploitation?
Another big concern is how World recruits users, especially in developing countries. The company says it provides clear instructions, translated guides, and educational modules. But when you’re offering people money (in the form of WLD tokens) to scan their irises, the ethics get murky.
Critics say the power imbalance is too big. In places where access to banking and jobs is limited, people might give up their most sensitive data without fully understanding what they’re signing up for. “You can’t have real consent when someone’s desperate,” says Al-Damati.
Regulators Aren’t Ignoring This
World’s rollout hasn’t gone unnoticed by regulators. Since launching in 2023, the company has faced investigations in Germany, Kenya, Brazil, Indonesia, and the UK. In some cases, like Kenya and Indonesia, the project was temporarily suspended due to legal concerns around privacy and consent.
The questions they’re asking are tough but fair:
- Do people really know what they’re agreeing to?
- How is their biometric data being protected?
- Can this system actually be called decentralized?
What Happens When Iris Scans Become the Norm?
There’s also a bigger-picture concern. If Worldcoin succeeds, will we wake up one day in a world where you need to scan your eye to access digital services? And what happens to those who refuse?
Al-Damati warns that it could create a digital class divide — where those willing to give up privacy gain access, and those who resist are left out. “It’s not inclusion if the price is your identity,” he says.
World says that users can still use a basic version of its platform without verifying with an Orb. But even that hasn’t fully reassured critics.
The AI Angle: A Real Security Concern
Eowyn McMullen, co-founder of Privado ID, offers another perspective: with the rise of AI bots, fake identities, and misinformation, some kind of digital verification system is inevitable. “The problem is real — but we have to solve it without turning privacy into collateral damage,” she says.
She believes the real challenge is balancing safety, identity, and freedom — all while avoiding centralization. “We need open standards, not black boxes.”
Bottom Line
Worldcoin wants to solve a real problem. But in doing so, it may be creating new ones. Until the privacy, governance, and power questions are seriously addressed, the project will keep facing hard scrutiny — and maybe that’s exactly what’s needed.