The Extraction Machine (How Social Media Turned You Into the Product)
Corporate Autopsy
Tuesday, November 25, 2025

Make you happy. Piss you off. Make you emotionally invested. Hit you with an ad. That's the loop. That's the entire business model. And while they extract your attention, your data, and your sanity—they'll ban YOU if something goes wrong on THEIR platform.

"If you're not paying for the product, you are the product." — Origin disputed, but the truth isn't

Let me tell you about my wife's livestreaming account.

She had 40,000 followers. Built it over months of consistent content. Real community. Real engagement. Real people who showed up.

Then a guest came on her livestream and got naked.

She didn't invite it. She didn't want it. She killed the stream immediately. But in the moments before she could react, someone else's actions had made her the victim of a sexual offense, on camera, on her own stream.

The platform's response?

They banned her. Permanently.

Not the person who did it. Her. The victim. The one who built the audience, followed the rules, and had her platform weaponized against her by someone else.

No appeal. No nuance. No human review. Just an algorithm that saw nudity, flagged the account owner, and executed.

40,000 followers. Gone. Because she was assaulted.

That's the machine we're dealing with.


The Loop

Here's how social media actually works:

  1. Make you happy — Show you something that triggers dopamine. A cute video. A win. Validation.

  2. Piss you off — Show you something that triggers outrage. Rage holds attention longer than joy. They know this.

  3. Make you emotionally invested — Get you commenting, arguing, sharing. Engagement is engagement—they don't care if it's positive.

  4. Hit you with an ad — Now that your emotions are activated and your attention is captured, sell you something.

  5. Repeat.
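The loop above is, at its core, a ranking function with ads spliced in. Here's a toy sketch of that logic. Everything in it is an assumption for illustration: the field names, the engagement scores, and the ad cadence are invented, not any platform's actual code.

```python
# Toy sketch of an engagement-optimized feed: rank by predicted emotional pull,
# then interleave ads. All values and field names are illustrative assumptions.

posts = [
    {"id": 1, "emotion": "joy",     "predicted_engagement": 0.4},
    {"id": 2, "emotion": "outrage", "predicted_engagement": 0.9},  # rage ranks highest
    {"id": 3, "emotion": "neutral", "predicted_engagement": 0.1},
]

def build_feed(posts, ad_every=3):
    """Sort content by predicted engagement, then drop in an ad every few slots."""
    ranked = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
    feed = []
    for i, post in enumerate(ranked, start=1):
        feed.append(post)
        if i % ad_every == 0:
            # Attention is now captured; monetize it.
            feed.append({"id": "ad", "emotion": "n/a"})
    return feed

feed = build_feed(posts)
# The outrage post leads the feed; the ad lands once you're emotionally hooked.
```

Note the design consequence: nothing in this function asks whether the content is true, kind, or good for you. The only sort key is predicted engagement.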

That's it. That's the entire business model.

They're not connecting you with friends. They're not building community. They're not spreading information.

They're farming your emotional responses for advertising revenue.

Research from 2025 confirms what we suspected: algorithms deliberately amplify "emotionally charged, out-group hostile content" because it drives engagement. Facebook's angry emoji was weighted five times higher than a regular like—because anger keeps you scrolling.

They eventually reduced that weight. Not because it was wrong. Because they got caught.


What They Actually Know About You

The FTC released a report in September 2024 that should terrify you.

"These surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking."

That's not a privacy advocate. That's the federal government.

Here's what they found:

Instagram shares 79% of your data with third parties. Browsing history. Location. Contacts. Financial information.

Even if you never signed up, they have a profile on you. Your friends uploaded their contacts. They tagged you in photos. The platforms built a shadow profile from the data exhaust of people who know you.
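The shadow-profile mechanism is simple enough to sketch in a few lines. This is a minimal illustration, not any platform's actual pipeline; every name and field below is hypothetical. The point it demonstrates is that the subject never supplies a single byte themselves.

```python
# Minimal sketch of shadow-profile assembly from other users' contact uploads.
# All names, keys, and fields are hypothetical.
from collections import defaultdict

shadow_profiles = defaultdict(lambda: {"seen_in_contacts_of": [], "known_fields": {}})

def ingest_contact_upload(uploader, contacts):
    """Every user who syncs an address book leaks data about people who never signed up."""
    for person in contacts:
        profile = shadow_profiles[person["phone"]]
        profile["seen_in_contacts_of"].append(uploader)
        # Merge whatever fields this particular contact card happens to carry.
        for field, value in person.items():
            profile["known_fields"].setdefault(field, value)

# Two friends sync their phones; "Dana" has no account on the platform.
ingest_contact_upload("alice", [{"phone": "555-0101", "name": "Dana", "email": "dana@example.com"}])
ingest_contact_upload("bob",   [{"phone": "555-0101", "name": "Dana D.", "city": "Portland"}])

dana = shadow_profiles["555-0101"]
# Dana's profile now links two real-world relationships plus a name, email,
# phone number, and city, assembled entirely without Dana's participation.
```

Each additional uploader adds edges and fields. At platform scale, a person who never signed up can end up with a richer profile than one who did.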

Every click, pause, scroll, and share teaches the algorithm about you. It learns what makes you angry, scared, excited, or curious—then feeds you more of those exact triggers.
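That learning step amounts to a running preference update per trigger. Here's a minimal sketch using an exponential moving average; the topic names, learning rate, and signal values are all made up for illustration.

```python
# Toy preference model: one exponential-moving-average score per emotional trigger.
# Learning rate, topics, and signals are illustrative assumptions.

ALPHA = 0.3  # how fast the model chases your most recent reactions

preferences = {"outrage": 0.5, "cute": 0.5, "fear": 0.5}

def observe(topic, engaged):
    """Each click, pause, or share nudges the score for that trigger up or down."""
    signal = 1.0 if engaged else 0.0
    preferences[topic] = (1 - ALPHA) * preferences[topic] + ALPHA * signal

# You linger on three outrage posts and scroll past a cute one:
for _ in range(3):
    observe("outrage", engaged=True)
observe("cute", engaged=False)

# The feed now leads with whatever trigger scores highest.
next_topic = max(preferences, key=preferences.get)
```

Three lingers are enough to tilt the model. That's the "feeds you more of those exact triggers" part: the update is cheap, continuous, and runs on every interaction.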

And here's the part that should really bother you:

Those security question surveys on Facebook? "What was your first car?" "What street did you grow up on?" "What was your childhood pet's name?"

People answer them publicly. For fun. To show off their personality.

Those are your banking security questions. You're handing your account recovery information to data scrapers for engagement bait.

The extraction isn't hidden. You're participating in it.


The Creator Trap

Here's where it gets even darker.

The platforms need content. They need YOU to create it. So they built a system where creators can "monetize"—get paid a fraction of the ad revenue their content generates.

Sounds fair, right?

Except:

The rules change constantly. YouTube's demonetization triggers are a moving target. What's acceptable today is bannable tomorrow.

The enforcement is algorithmic. Patrick Boyle, a financial journalist, had his Epstein investigation demonetized despite a 98.9% approval rating—while deepfake scams using his likeness remained on the platform.

Legitimate content gets suppressed while fraud thrives. Journalism gets flagged. Scams get promoted. The algorithm doesn't understand context—it understands engagement.

One strike can erase years of work. Jill Bearup, a YouTuber with 500,000 subscribers, had her entire channel demonetized for ten days "for no adequately explained reason."

And appeals? Good luck. You're arguing with a system designed to process millions of decisions without human intervention.

The platforms built their empires on creator labor, then made the creators disposable.


The Fake Everything Economy

Let's talk about what's actually happening on these platforms.

$37.7 billion was lost to ad fraud in 2024. Projected to hit $41.4 billion in 2025.

21.3% of all traffic is invalid. Bots. Click farms. Fake engagement.

One in six PPC clicks is fraudulent. If you're running ads, you're paying for ghosts.

40% of mobile app ad impressions are invalid. The engagement metrics brands are paying for don't represent real humans.

Click farms employ thousands of low-paid workers—or increasingly, sophisticated bots—to create fake likes, fake follows, fake reviews, fake everything.

You know those accounts on X that post "EVERYONE FOLLOW EVERYONE"? That's the visible tip. Below the surface are industrial-scale operations manufacturing the illusion of popularity.

Reddit was supposed to be different. Community-driven. Organic. Authentic.

Now? You have to manufacture fake engagement just to get real engagement. The algorithm won't show your content unless it already looks popular. So creators buy upvotes, plant comments, game the system—just to get a fair chance at being seen.

The platforms reward inauthenticity, then punish you for being inauthentic.
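The visibility trap is a feedback loop: distribution is gated on existing engagement, so early signals, even purchased ones, compound. A toy simulation makes the dynamic concrete. The threshold, reach numbers, and click rate are invented; this is a sketch of the rich-get-richer mechanism, not any platform's real thresholds.

```python
# Toy rich-get-richer visibility model. All constants are invented for illustration.

VISIBILITY_THRESHOLD = 10   # posts below this engagement get almost no distribution
BASE_REACH = 5              # impressions served per round to an "invisible" post
BOOSTED_REACH = 100         # impressions once the post "looks popular"
CLICK_RATE = 0.1            # expected engagements per impression

def simulate(initial_engagement, rounds=20):
    """Deterministic expected-value run: reach depends on engagement so far."""
    engagement = float(initial_engagement)
    for _ in range(rounds):
        reach = BOOSTED_REACH if engagement >= VISIBILITY_THRESHOLD else BASE_REACH
        engagement += reach * CLICK_RATE  # expected clicks this round
    return engagement

organic = simulate(initial_engagement=0)    # starts cold, stays throttled
seeded = simulate(initial_engagement=10)    # e.g. ten purchased upvotes at launch
# The seeded post ends up roughly 20x ahead of the identical organic one.
```

Same content, same audience behavior; the only difference is ten fake engagements at minute zero. That gap is why creators feel forced to buy their way past the gate.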


The AI Influencers Are Here

And now we've added another layer of unreality.

AI influencers are projected to be a $1.5 billion market by 2030. Computer-generated personas with millions of followers, promoting products, building "relationships" with fans.

Lil Miquela "cried" on camera about a fictional breakup. Fans called it manipulative. Because it was. An AI was programmed to simulate emotional vulnerability to drive engagement.

72% of Gen Z trust human influencers more than AI ones. But the other 28%? They can't tell the difference. Or don't care.

The line between real and manufactured is dissolving. And the platforms have no incentive to clarify it—because manufactured engagement is still engagement.

You're being influenced by entities that don't exist, to buy things you don't need, with emotions that were engineered in a server farm.


The Real Cost

Here's what this machine actually extracts:

Your attention. Hours per day. Years of your life. Scrolling instead of creating. Consuming instead of connecting.

Your data. Everything you do, say, click, pause on, search for—packaged and sold to whoever pays.

Your mental health. Adolescents who spend more than three hours daily on social media show heightened risk for anxiety and depression. The platforms know this. They don't stop.

Your relationships. The algorithm shows you what triggers you, not what connects you. It optimizes for conflict because conflict is engaging.

Your sense of reality. When 21% of traffic is bots, when AI influencers have millions of followers, when engagement is manufactured at industrial scale—how do you know what's real?

And if something goes wrong—if someone else violates the rules on YOUR platform, on YOUR stream, against YOUR will?

They ban you.

Because you're not the customer. You never were.

You're the product. And products don't get appeals.


What You Can Do

I'm not going to tell you to delete everything and move to a cabin. That's not realistic.

But I am going to tell you to see clearly.

Recognize the loop. When you feel that surge of anger or outrage, ask: did I choose to feel this? Or was it served to me?

Protect your data. Stop answering those personality quizzes. Review your privacy settings. Understand that "private" on these platforms doesn't mean what you think.

Question the metrics. That viral post might be bot-boosted. That influencer might be AI. That engagement might be purchased. The numbers lie.

Build off-platform. Email lists. Your own website. Relationships that don't depend on an algorithm's permission.

Support regulation. The EU is ahead of us. The FTC is trying. These platforms won't fix themselves—they have no incentive to.

And most importantly:

Stop being surprised when the machine acts like a machine.

It's not broken. It's not making mistakes. It's not failing to understand context.

It's doing exactly what it was designed to do: extract value from your attention, your emotions, and your data.

The fact that it occasionally destroys innocent people in the process isn't a bug.

It's a cost of doing business.


My wife rebuilt. Different platform. Smaller audience. More cautious now.

The person who assaulted her on stream? No consequences we ever heard of.

The algorithm that banned her? Still running. Still extracting. Still deciding who gets a voice and who doesn't.

That's the machine.


This is a one-time drop