Gianni Dell'Aiuto | WBN News Global | October 9, 2025

Do you even know what you’re handling? It’s not just numbers. It’s not “neutral inputs.” It’s people. Their habits, their health, their voices, their secrets. You build your systems as if data were clay in your hands — but every line in your dataset comes from a living human being.

It is true that a company based in the U.S. or Canada might be tempted to say, “I don’t care, those rules don’t apply to me.” But that is no longer the case. Borders do not protect you when your product crosses them in the form of data flows and digital services. And beyond the law, there is ethics. Exploiting people without respect is not innovation — it is arrogance. It does not project strength; it projects weakness. A company that treats humans as raw material for algorithms sends a message of contempt, not leadership.

The first illusion is ownership. Buying or scraping a dataset does not make it yours. The individual who generated that data remains the real owner. GDPR, the AI Act, and every serious legal framework repeat this truth: you never own the data, you only borrow responsibility for it.

The second illusion is consent. Let’s be honest: the consent you rely on is a lie. A click on “accept” is not a free, informed decision. Subscribing to a newsletter is one thing; discovering that you’ve been profiled from head to toe is another. That gap is where lawsuits are born.

And then come the risks. The danger is not just reputational. In Europe, misusing data is a violation of rights written into law. In the U.S., people may wake up to the fact that they are being stripped naked digitally — and they could shut you down with class actions or public backlash. A dataset built on shaky consent is not an asset; it’s a liability.

The final illusion is the metaphor. Stop repeating that data is the new oil. That metaphor is outdated. Data is oil, yes — but it is also gold and platinum, and more valuable than any of them. Why? Because it is not a natural resource. It’s identity. It’s dignity. Extract it without consent, and you’re not innovating — you’re exploiting.

The EU AI Act is not only a constraint. It’s a shield. It protects companies that work with AI, even where compliance is not yet an obligation. It’s insurance against the collapse of trust.

And European laws are not just polite warnings. Luka Inc., the U.S. company behind the Replika chatbot, learned this the hard way. After investigations by the Italian Supervisory Authority, the company was fined €5 million and ordered to redesign its system. Why? Because it processed personal data without a valid legal basis, failed to provide clear information to users, lacked adequate privacy policies, and ignored age verification for minors. In short: no accountability, no transparency, no compliance. And the sanction arrived even though Luka was not based in Europe. That is the reach of the GDPR — and the reality for anyone building AI without respecting fundamental rights.

And here is the only truth worth repeating: you can write the best code in the world, but if the data behind it is stolen, your system is rotten at the core. If your AI cannot survive without exploiting my data, maybe your AI shouldn’t survive at all.

More about your business vulnerabilities in Gianni's latest article: https://www.wbn.digital/what-if-your-best-employee-becomes-your-worst-threat/


Tags:
#AI Ethics, #Data Privacy, #GDPR Compliance, #EU AI Act, #Tech Accountability, #Digital Rights, #Cross Border Regulation

Gianni Dell’Aiuto is an Italian attorney with over 35 years of experience in legal risk management, data protection, and digital ethics. Based in Rome and proudly Tuscan, he advises businesses globally on regulations like the GDPR, AI Act, and NIS2. An author and frequent commentator on legal innovation, he helps companies turn compliance into a competitive edge while promoting digital responsibility. Click here to connect with him.

Sources: European Commission on the AI Act, GDPR Article 5 Principles, Italian Garante Privacy Sanctions Report, Replika Case Study, Wired EU Data Protection Coverage, FTC AI Guidelines, Privacy International Reports

Editor: Wendy S Huffman
