All Targeted Advertising is Malvertising, and We Can't Fix It
"The medium is the message, but the algorithm is the manipulation."
You know that weird moment when you mention wanting new running shoes to your friend, and suddenly every website you visit is plastered with sneaker ads? And you think, "Huh, that's convenient... and also deeply unsettling." We've all made peace with a certain level of this digital omniscience. We joke about our phones "listening to us" (they probably aren't, but the truth is somehow weirder). We've accepted that free services mean we're the product being sold. But here's the uncomfortable reality I've been wrestling with: what if targeted advertising isn't just creepy or invasive—what if it's fundamentally malicious by design?
The Word Nobody Wants to Say
The word "malicious" means intent to harm, and that is exactly what targeted advertising is built to do. I think we need to have an honest conversation about whether targeted advertising, all of it, even the "legitimate" kind, fits that definition. I know this sounds extreme. Comparing a shoe ad to a computer virus? But before you dismiss the idea, consider what's actually happening behind those eerily perfect advertisements.

What Targeted Ads Actually Do
The goal isn't to inform; it's to influence.

Let's be brutally honest about how targeted advertising actually works:
The Data Collection
Modern advertising platforms collect massive amounts of data about us (a sketch of what one collected event can look like follows this list):
- Behavioral data: What we click, how long we hover, our browsing patterns
- Location data: Where we go, when we're there, how often we visit
- Social data: Who we know, what they like, how we interact
- Financial data: What we buy, when we buy it, how much we spend
- Emotional data: Our mood patterns, stress indicators, vulnerability windows
- Health data: Our searches, our concerns, our conditions
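To make those categories less abstract, here's a minimal sketch, in TypeScript, of what a single tracked event might look like once they're stitched together. Every field name here is made up for illustration; this is not any real ad platform's schema.

```typescript
// Hypothetical shape of one behavioral tracking event.
// Every field name is illustrative, not taken from any real platform.
interface TrackingEvent {
  anonymousId: string;   // "anonymous," but stable across sites and sessions
  timestamp: number;     // when it happened (ms since epoch)
  page: { url: string; referrer: string; dwellTimeMs: number };
  interaction: { type: "click" | "hover" | "scroll"; target: string };
  device: { os: string; battery?: number; connection?: string };
  location?: { lat: number; lon: number; accuracyM: number };
  inferred?: {           // the "emotional data" layer: guesses, not facts
    mood?: "stressed" | "bored" | "receptive";
    purchaseIntent?: number;  // a 0..1 score
  };
}

// A single event like this is almost meaningless on its own.
const example: TrackingEvent = {
  anonymousId: "a1b2c3",
  timestamp: Date.now(),
  page: {
    url: "https://example.com/running-shoes",
    referrer: "https://search.example",
    dwellTimeMs: 42_000,
  },
  interaction: { type: "hover", target: "buy-now-button" },
  device: { os: "Android", battery: 0.17, connection: "4g" },
  inferred: { mood: "receptive", purchaseIntent: 0.8 },
};

console.log(JSON.stringify(example, null, 2));
```

One event tells an advertiser almost nothing. Thousands of them per day, joined on that supposedly anonymous identifier, become the profile everything below is built on.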
The Psychological Targeting
This data isn't just collected; it's weaponized. Effective targeted advertising finds the moment when you're most susceptible to buying something you don't need or making a decision you'll regret. It catches you when you're:
- 🎯 Tired (reduced willpower; probably why I bought those 3AM infomercial products)
- 🎯 Anxious (seeking quick solutions to complex problems)
- 🎯 Lonely (vulnerable to social proof: "Everyone else is buying this!")
- 🎯 Bored (looking for stimulation—hello, impulse purchases while waiting for the dentist)
- 🎯 Recently paid (feeling financially flush until tomorrow when you realize utilities are due)
- 🎯 Feeling inadequate (seeking validation through purchases—surely THIS face cream will fix everything)
It's basically emotional aikido: using your own mental state against you. At least a traditional salesperson had to look you in the eye while doing it.
The Inequality Amplifier
Here's what really keeps me up at night: this system doesn't affect everyone equally. The same technology that shows me ads for overpriced headphones might show:
- Someone struggling financially → Predatory loans at devastating interest rates
- People in recovery → Alcohol or gambling advertisements
- Those with eating disorders → Dangerous weight loss supplements
- Elderly users → Investment scams disguised as financial advice
- Young people → Get-rich-quick schemes and cryptocurrency fraud
"But It's Just Advertising"
Traditional advertising was fishing with a net. Modern targeted advertising is having a drone follow each fish, study its behavior, and deliver irresistible bait at the moment of maximum hunger.

I can hear the pushback: advertising has always tried to persuade us. True! But there's a fundamental difference between a billboard that everyone sees and a system that studies your individual psychology to exploit your specific vulnerabilities.
The Old Model: Broadcast Persuasion
- Everyone saw the same message
- Limited personal data collection
- Obvious when you were being advertised to (the TV show stopped, commercials played, you made a sandwich)
- Clear separation between content and ads
The New Model: Psychological Warfare
- ✓ Personalized manipulation tactics
- ✓ Comprehensive surveillance infrastructure
- ✓ Ads disguised as content, recommendations, and "news" (that Instagram post about face cream? Probably an ad. Your friend's authentic product recommendation? Also maybe an ad. Your mom's recipe blog? Somehow, still ads)
- ✓ Behavioral modification at unprecedented scale

Sure, sometimes this means showing me an ad for a book I actually want. But the same mechanism that does that also:
- 🔴 Deliberately targets vulnerable people with harmful products
- 🔴 Creates filter bubbles that radicalize and polarize
- 🔴 Manipulates behavior in ways we don't even notice
- 🔴 Undermines informed choice by design
And yes, people can still make their own decisions. But we've engineered a system specifically designed to undermine informed choice. That's not a side effect—that's the product working as intended.
The Part Where I Wish I Had Better News
Here's where I'd love to offer you a solution. "Here are five tips to protect yourself!" or "Vote for this policy change!" But I don't think we can fix this within the current system, and that reality is what haunts me.

🚫 Individual Solutions Don't Scale
You can use ad blockers. You can delete your cookies. You can quit social media platforms. Good for you, genuinely. But that doesn't stop the system from affecting everyone else. Most people don't have the luxury of digital minimalism. My neighbor who works two jobs to support her kids shouldn't need a computer science degree to avoid being exploited online. The grandfather trying to stay connected with family shouldn't have to become a privacy expert to avoid financial scams.

🚫 Regulation Faces Impossible Odds
The targeted advertising industry is worth hundreds of billions of dollars. The companies that benefit from it are among the most powerful in human history, with:
- More lawyers than most governments
- Lobbying budgets that dwarf public interest groups
- Technical complexity that outpaces regulatory understanding
- Global reach that transcends any single jurisdiction
Most politicians don't truly understand how any of this works. Many of us don't either—this complexity is deliberately maintained.
🚫 "Ethical" Targeted Advertising is an Oxymoron
Some suggest we just need "better" targeted advertising with more oversight and consent. But consider: you can't get meaningful consent for something this complex. Those privacy policies are intentionally unreadable legal documents that change constantly. Even if you could understand them, what's the alternative? Not using email? Not having a phone? Not accessing essential services that have moved online? In 2025, digital participation isn't optional; it's required for basic social and economic participation. The surveillance capitalism genie is out of the bottle, and we're not putting it back in through individual choice or incremental reform.
Living With the Uncomfortable Truth (While Building Something Better)
"The first step toward freedom is calling things by their true names. The second step is figuring out what to do about it."So what do we do if we can't fix this within existing systems? Do we just accept surveillance capitalism as inevitable and move on with our lives while algorithms track our every click? Honestly? Some days that feels tempting—ignorance can be bliss, and fighting against multi-billion-dollar ad-tech feels like throwing pebbles at a tank. But here's what I keep coming back to: we built this system, which means we can build something different.
🤔 But First, Let's Acknowledge What We'd Be Giving Up
Before I propose alternatives, let me be honest about what makes targeted advertising so entrenched: it actually does fund a lot of free content that people value. YouTube tutorials that taught me to fix my sink? Ad-supported. Independent news sites covering stories mainstream media ignores? Ad-supported. That food blog with the perfect banana bread recipe (after scrolling past three life stories)? Ad-supported while your coffee gets cold.

I'm not going to pretend those creators should just work for free. Many of them have rent to pay, families to feed, and student loans to service. Telling them "just stop using ads" without offering alternatives is like telling someone to "just stop being poor": morally superior, practically useless. So any conversation about moving beyond malvertising has to include: how do creators eat?

💡 What Alternatives Actually Exist? (The Honest Assessment)
Let me run through the options, including their real limitations.

Option 1: Subscriptions
The pitch: Pay $5-10/month for ad-free access.
What works:
- Direct relationship between creators and audience
- No surveillance required
- Predictable income for creators
What doesn't:
- Subscription fatigue is real (I cannot afford 47 different $5/month subscriptions)
- Paywall fragmentation makes the internet less accessible
- Many people simply can't afford multiple subscriptions
- Creates information inequality based on wealth
Option 2: Donations/Patreon
The pitch: Support creators you love voluntarily.
What works:
- Maintains free access for those who can't pay
- Direct creator-audience relationship
- Flexible support levels
What doesn't:
- Unpredictable income (makes planning impossible)
- Only ~1-3% of audiences actually donate (the free-rider problem)
- Requires existing audience and personal brand
- Donation fatigue is also a thing
Option 3: Public/Cooperative Funding
The pitch: Publicly funded digital infrastructure and cooperative ownership models.
What works:
- Removes profit motive from surveillance
- Could provide stable funding
- Democratic governance possibilities
What doesn't:
- Requires massive political will (good luck with that)
- Who decides what gets funded? (politicization risks)
- Doesn't exist yet at meaningful scale
- International coordination challenges
Option 4: Computational Contribution (Web Mining)
The pitch: Use spare computational power instead of ads or personal data. (A sketch of what a consent-first version of this flow can look like follows this list.)
What works:
- No personal data collection required (math doesn't need to know who you are)
- Passive contribution (no recurring payments to remember)
- Scales better than donations (more people participate)
- Transparent resource usage vs. hidden data extraction
What doesn't:
- Earnings are modest (pennies per hour, not dollars)
- Requires decent hardware (excludes some users)
- Battery impact on mobile devices
- History of abuse (Coinhive cryptojacking poisoned the well)
- Still earning cryptocurrency (which many people distrust)
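For what it's worth, the consent-first version of this pitch is simple enough to sketch. Here's roughly what that flow could look like in TypeScript; `startMining` and `stopMining` are placeholder stubs invented for illustration (not WebMiner's actual API), and the disclosure text and thresholds are assumptions. The point is the ordering: state the cost in plain language, do nothing until the user says yes, respect the device, and keep the off switch visible.

```typescript
// Sketch of a consent-first mining flow. The mining calls are placeholders;
// the real point is that nothing runs before explicit, informed consent.
interface MiningOptions {
  cpuThrottle: number;     // fraction of one core to use, e.g. 0.25
  pauseOnBattery: boolean; // back off when the device is unplugged
}

// Placeholder stubs standing in for a real miner; they only log.
function startMining(opts: MiningOptions): void {
  console.log("mining started", opts);
}
function stopMining(): void {
  console.log("mining stopped");
}

async function offerComputationalContribution(): Promise<void> {
  // 1. Plain-language disclosure, shown before any work happens.
  const consent = window.confirm(
    "Support this site by contributing spare CPU (about 25% of one core, " +
    "worth only a few cents per hour)? No personal data is collected, " +
    "and you can stop at any time."
  );
  if (!consent) return; // 2. "No" means nothing runs. Ever.

  // 3. Respect the device: skip phones that are low on battery.
  const nav = navigator as any;
  if (typeof nav.getBattery === "function") {
    const battery = await nav.getBattery();
    if (!battery.charging && battery.level < 0.5) return;
  }

  startMining({ cpuThrottle: 0.25, pauseOnBattery: true });

  // 4. A visible, always-available off switch.
  const stopButton = document.createElement("button");
  stopButton.textContent = "Stop contributing";
  stopButton.onclick = () => { stopMining(); stopButton.remove(); };
  document.body.appendChild(stopButton);
}

offerComputationalContribution();
```

Compare that with cryptojacking, where step one was "run silently and hope nobody checks their CPU usage." Same math, opposite ethics.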
🌈 A Realistic Path Forward (With Actual Hope)
Here's what I think is actually possible, not in some utopian future, but in the messy, complicated present:

We Need Multiple Monetization Models Coexisting
Stop looking for the one solution that replaces everything. Different creators, different audiences, and different contexts need different approaches:
- Subscriptions work for some audiences (those with disposable income who love specific creators)
- Donations work for others (audiences who want to support but need flexibility)
- Web mining might work for tech-savvy audiences with decent hardware
- Public funding could support journalism and education
- Cooperatives might work for communities with shared interests
- Yes, even ethical advertising might have a place (contextual, not surveillance-based)
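To make that last bullet concrete: contextual targeting needs nothing about the reader at all. Here's a toy TypeScript sketch of the entire selection logic, with a made-up ad inventory and naive keyword matching, and no cookies or behavioral profile anywhere.

```typescript
// Toy contextual ad selection: match ads to the page, not to the person.
// The inventory and keywords are invented for illustration.
interface Ad {
  id: string;
  keywords: string[];
  copy: string;
}

const inventory: Ad[] = [
  { id: "shoes-01", keywords: ["running", "marathon", "shoes"], copy: "Trail shoes, 20% off" },
  { id: "bread-01", keywords: ["recipe", "baking", "banana"], copy: "Stoneware loaf pans" },
  { id: "tools-01", keywords: ["plumbing", "sink", "repair"], copy: "Basin wrench set" },
];

// Score each ad purely by overlap with the words on the current page.
// No cookies, no identifier, no behavioral history.
function pickContextualAd(pageText: string): Ad | undefined {
  const words = new Set(pageText.toLowerCase().split(/\W+/));
  let best: { ad: Ad; score: number } | undefined;
  for (const ad of inventory) {
    const score = ad.keywords.filter((k) => words.has(k)).length;
    if (score > 0 && (!best || score > best.score)) {
      best = { ad, score };
    }
  }
  return best?.ad;
}

console.log(pickContextualAd("How I fixed my kitchen sink: a plumbing repair story"));
// → the basin wrench ad, chosen from the article text alone
```

It's obviously cruder than behavioral targeting, and that's exactly the tradeoff: a little less "relevance" in exchange for no surveillance.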
Prioritize Transparency Over Perfection
Web mining earns $0.02/hour? Say that. Don't call it "revolutionary" or claim it'll make anyone rich. Subscriptions exclude people without disposable income? Acknowledge that. Don't pretend paywalls are purely about quality. Public funding risks politicization? Admit it. Don't act like cooperative governance is simple.

Every model has tradeoffs. The current targeted advertising model just hides its tradeoffs behind complexity and legal jargon. Any alternative that's honest about limitations is already an improvement.

Start Calling Malvertising What It Is
Stop pretending that a system designed to exploit human psychology for profit is just "advertising" or "personalization" or "relevant content recommendations." It's surveillance capitalism. It's psychological exploitation at scale. And while we figure out alternatives, let's at least be honest about what we're dealing with. When platforms say "we need targeted advertising to provide free services," what they mean is: "Our business model requires comprehensive surveillance and behavioral manipulation, and we've made you dependent enough that you'll accept it." Acknowledging that truth is step one.

Support Experimental Alternatives (Even Imperfect Ones)
When someone tries something different (computational contribution, subscription co-ops, public interest funding, whatever), the response shouldn't be "That's not perfect, back to surveillance ads." Of course it's not perfect. Nothing will be. But experiments teach us what might work, what definitely doesn't, and what we haven't thought of yet. Give weird ideas permission to fail. Some will. But the alternative is accepting that surveillance capitalism won forever, and I don't believe that's true yet.

🎯 What You Can Actually Do (Without Needing a CS Degree)
Okay, practical steps that don't require becoming a digital hermit or overthrowing capitalism by Tuesday:
1. Use Ad Blockers
- Not because you're stealing from creators (you're not)
- Because you're opting out of surveillance and manipulation
2. Support Creators Directly
- If you love specific creators, support them directly when possible
- Subscriptions, donations, purchasing products
- If you genuinely can't afford it, don't feel guilty
- Your attention and word-of-mouth also matter
3. Try Alternative Platforms and Tools
- Signal instead of WhatsApp (E2E encryption, no ads)
- Mastodon instead of Twitter (decentralized, community-owned)
- LibreOffice instead of Google Docs (if that works for you)
- Not perfect, but different tradeoffs
4. Talk About It
- Stop acting like targeted ads are just "how the internet works"
- Have real conversations about what we're trading away
- Support policy changes that limit surveillance
- Vote with your attention and your wallet when you can
5. Keep an Eye on Emerging Alternatives
- Web mining might work for some situations (with ethical consent)
- Public funding models are being experimented with
- Cooperative platforms are emerging
- The answer probably isn't one solution, but many
🤝 Finding Our Way Out Together
Look, I don't have all the answers. I don't even have most of them. What I have is this: a deep conviction that we can do better than mass psychological surveillance as the default business model for human communication and creativity.

To creators worried about losing ad revenue: I see you. Rent is real. I'm not asking you to work for free. I'm asking us collectively to explore alternatives that don't require you to become surveillance brokers to pay your bills.

To people who think I'm being too harsh on advertising: Maybe I am. But when a system is designed from the ground up to exploit human psychology for profit, when it amplifies harm to vulnerable people, when meaningful consent is structurally impossible, I think we should at least question whether "that's just how business works" is good enough.

To those overwhelmed by this whole conversation: Me too, honestly. Most days I'm just trying to live my life without thinking too hard about how many algorithms are studying my behavior. But sometimes, like now, I think it's worth stopping to ask: is this really the best we can do?

💭 The Internet We Actually Want
Imagine for a second: what if the internet wasn't designed to extract data and manipulate behavior? What would it look like if the default was contribution instead of extraction? Maybe it would be messier. Maybe it would be less convenient in some ways. Maybe we'd have to think more carefully about which services we actually value enough to support. But maybe, just maybe, we'd also have:
- Breathing room without constant manipulation
- Privacy as default instead of luxury
- Genuine choice about how we participate
- Digital spaces that feel like communities, not markets
- Content created for humans, not algorithms
The ads will still be there tomorrow. The algorithms will still be learning. But we don't have to pretend it's fine, and we don't have to stop looking for something better.
We built this system. We can build something different. Whether we choose to will determine whether we're remembered as the generation that sold human attention to the highest bidder, or the one that chose to build something better. The technology for alternatives exists. The question is whether we have the collective will to try them.
💡 Curious about alternatives to surveillance-based advertising? Check out our WebMiner project for transparent, consent-first computational contribution. Not a perfect solution, but one more option in building a better digital economy. See the code, understand the tradeoffs, make your own choice.