The election isn’t happening on a map. It’s happening in a feed.

We keep talking about November like it’s a destination, like we just have to get there and endure it. But what’s happening right now, in July, is not buildup. It’s the event. The election is already underway, and it’s being run through a system that was built to optimize attention, not legitimacy.

That system has a name, and it isn’t “the media” in the old sense. It’s the platforms. It’s the recommendation engines. It’s the ranking models and the engagement loops. It’s the human moderators doing triage at industrial scale. It’s the policy teams drafting rules for edge cases that will define the country’s mood.

We can pretend this is still a persuasion contest, where candidates speak and voters decide. But the modern election is also a moderation contest, where platforms decide what gets amplified, what gets dampened, what gets labeled, what gets throttled, and what gets left to run wild because it “doesn’t technically violate” anything.

And if you want to know why July feels like a bad dream that keeps restarting, it’s because the incentives are aligned for the nightmare.


Platforms don’t host democracy. They shape it.

The convenient story is that platforms are neutral pipes. That story is false. The pipes have valves. The valves are tuned to keep you scrolling.

Every platform has two jobs that are in constant conflict:

  1. Keep you engaged.
  2. Keep the place from becoming unusable.

Job one pays the bills. Job two is damage control. We are watching, in real time, what happens when the business model wins.

[Illustration: a circuitry-built megaphone blasts glowing signals as reaction icons and speech bubbles radiate outward over data charts.]
Engagement isn’t feedback. It’s a throttle.

Engagement is not a harmless metric. Engagement is what you get when you turn social interaction into a competitive sport. The system doesn’t reward truth. It rewards velocity. It rewards certainty. It rewards content that triggers instant reaction. Outrage, fear, humiliation, and tribal celebration outperform nuance almost every time.

In that environment, misinformation isn’t an anomaly. It’s a native species.

So when we ask platforms to “stop misinformation,” we are asking them to defeat a creature that evolved inside their own greenhouse.
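
To make the incentive concrete, here is a deliberately oversimplified sketch of engagement-weighted ranking, written in Python. Every name, signal, and weight in it is invented for illustration; no platform publishes its real formula. The shape is the point: every input measures reaction, and nothing measures accuracy.

```python
from dataclasses import dataclass
import math

@dataclass
class Post:
    reactions: int      # likes, angry faces, hearts: all count the same
    comments: int       # arguments are engagement too
    reshares: int       # velocity: how fast it spreads
    age_hours: float    # newer content wins
    is_accurate: bool   # tracked nowhere in the score below

def engagement_score(post: Post) -> float:
    """Hypothetical feed-ranking score: rewards reaction volume and speed.

    Note what is absent: accuracy, context, and downstream harm do not
    appear anywhere in the calculation.
    """
    interaction = (
        1.0 * post.reactions
        + 2.0 * post.comments   # comments weighted higher: conflict keeps people on the page
        + 3.0 * post.reshares   # reshares weighted highest: they recruit new audiences
    )
    freshness = math.exp(-post.age_hours / 24.0)  # decay so the feed always feels urgent
    return interaction * freshness

# A false but enraging post and a careful correction, posted at the same time:
outrage = Post(reactions=9_000, comments=4_000, reshares=6_000, age_hours=3, is_accurate=False)
correction = Post(reactions=400, comments=50, reshares=80, age_hours=3, is_accurate=True)

print(engagement_score(outrage) > engagement_score(correction))  # True: the feed picks the outrage
```

Run it and the enraging falsehood outranks the careful correction, not because anyone chose the lie, but because the lie produces more of what the score counts.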


July 2020: a perfect storm of inputs

This summer is feeding the machine exactly what it wants.

  • A pandemic that produces real uncertainty and constant new information.
  • An economy in shock, with anger looking for a target.
  • Mass protest and counter-protest, with video as the main currency.
  • An election where one side is openly testing the boundaries of legitimacy.
  • A public exhausted enough to accept simple explanations and clean villains.

This is not a normal information environment. And the worse it gets, the more we lean on platforms as our shared reality.

That is the trap: the more chaotic the world becomes, the more we depend on systems that monetize chaos.


“Free speech” is being used as a shield for design choices

The loudest argument you’ll hear this year is that platforms can’t intervene without becoming arbiters of truth. That they must allow speech, even bad speech, because they are the modern town square.

But the town square doesn’t have a ranking algorithm that decides which five conversations the whole city has to hear.

The key design choice isn’t whether a post is allowed to exist. The key design choice is whether it is given distribution. Whether it is placed in front of millions of people. Whether it is recommended to users who were not looking for it.

Platforms already make those choices, constantly. They just prefer to frame them as “the algorithm did it,” as if the algorithm is a natural phenomenon like weather.

It isn’t weather. It’s architecture.

And when the architecture rewards manipulation, you get more manipulators.


The moderation problem is not “what is true.” It’s “what is profitable.”

Let’s say a platform wanted to handle election misinformation responsibly. What would that require?

  • Fast, consistent enforcement across languages and regions.
  • Strong action on repeat offenders.
  • Clear, public standards.
  • Reduced amplification of dubious content.
  • A willingness to take political heat.

Now ask yourself what it would cost.

It would cost engagement. It would cost growth. It would cost time on platform. It would cost ad impressions. It would cost revenue.

That is why the platform response often looks like this:

  • Add a label.
  • Add a link.
  • Announce a policy update.
  • Hope the news cycle moves on.

Labels are not meaningless, but they are often a fig leaf over a distribution engine that remains intact.

A label is what you do when you want to be seen responding without changing the underlying incentives.


The election will be decided by logistics, and lies are targeting the logistics

If you want a preview of the ugliness coming, look at how the conversation is being steered toward the mechanics of voting: mail-in ballots, deadlines, signature rules, polling places, “fraud,” “rigged,” “illegitimate.”

These are not random talking points. They are stress tests. The goal is to create a narrative where any outcome can be contested as fake.

Here’s the part people miss: you don’t have to convince everyone. You only have to convince enough people that the system is untrustworthy for the losing side to feel justified in rejecting the outcome.

In a tightly polarized country, legitimacy is not a default condition. It’s maintenance work. Platforms are now part of that maintenance, whether they admit it or not.

Because the lies are not just about candidates. They’re about the plumbing.


The “both sides” reflex is an algorithm in the human brain

There is a reason institutions struggle to respond to asymmetry. It feels unfair to say one side is doing something worse. It feels partisan. It feels like a trap.

But refusing to describe asymmetry does not make it go away. It just makes your response weaker.

Platforms do this too. They want to be seen as neutral referees, so they enforce policies in ways that look symmetrical, even when the behavior isn’t.

The result is predictable: the more aggressive actor sets the terms. The cautious actor plays defense and calls it “principle.”

If one side is willing to light a fire in the information ecosystem, and the other side is worried about appearing impolite, the ecosystem burns.

Neutrality is not the same as fairness when incentives are being exploited.


Why the “marketplace of ideas” fails online

We like the idea that bad ideas will be defeated by better ideas if we just let them compete. That fantasy depends on a few conditions that do not exist in your feed:

  • People must encounter competing ideas in comparable volume.
  • People must share a baseline set of facts.
  • People must change their minds when presented with evidence.
  • The system must not reward deception.

Online, volume is not neutral. Volume is purchased, coordinated, and manipulated. Shared facts are optional. Evidence competes with identity. And deception is often rewarded because it generates reaction.

The marketplace of ideas becomes a marketplace of attention, and attention has always been easier to steal than to earn.


This is why July feels like we are living inside a feedback loop

If you feel like you are seeing the same argument every day, it’s because you are.

The platforms are not showing you “what is happening.” They are showing you what is performing. And performance on these systems is driven by repetition, simplification, and emotional intensity.

So the same claims return in slightly altered form. The same videos circulate with new captions. The same “just asking questions” posts seed doubt and then back away from responsibility. The same bad-faith actors find the gaps in enforcement like water finding cracks.

Meanwhile, people who are trying to be honest are playing by rules the system does not reward: uncertainty, context, restraint.

We are not losing our minds. We are being trained.


What would a serious platform response look like?

It would look like platforms treating election integrity as a first-order product problem, not a PR risk.

[Illustration: a giant funnel sorts floating content tiles while warning-marked items are diverted onto conveyor belts by robotic arms.]
A feed at scale becomes an assembly line: sort, flag, ship, repeat.

That means:

  • Distribution controls: reduce amplification of content flagged as misleading, not just label it.
  • Repeat offender penalties: escalating consequences, not whack-a-mole on individual posts.
  • Transparent metrics: publish data on how misinformation spreads, what actions were taken, and what worked.
  • Friction by design: slow down sharing for certain categories, especially when content is going viral faster than it can be evaluated (a rough sketch follows this list).
  • Localized expertise: enforcement that recognizes context, not generic policy applied from far away.
  • Ad scrutiny: treat political and issue ads as a known risk surface, because they are.
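
To make that less abstract, here is a hypothetical sketch of what distribution controls and friction by design could look like at share time. The signals, thresholds, and categories are assumptions for illustration, not a description of any real platform’s pipeline.

```python
from dataclasses import dataclass
from enum import Enum

class ShareAction(Enum):
    ALLOW = "allow"                # normal distribution
    DOWNRANK = "downrank"          # allowed to exist, not recommended
    ADD_FRICTION = "add_friction"  # interstitial or delay before resharing
    HOLD_FOR_REVIEW = "hold"       # spreading faster than it can be evaluated

@dataclass
class ContentSignals:
    flagged_misleading: bool       # e.g. a fact-check partner flag (hypothetical signal)
    about_voting_logistics: bool   # mail-in rules, deadlines, polling places
    shares_per_hour: float
    reviewed: bool

# Hypothetical threshold, for illustration only.
VIRAL_SHARES_PER_HOUR = 5_000

def share_decision(signals: ContentSignals) -> ShareAction:
    """Sketch: decide distribution instead of only deciding existence."""
    if signals.flagged_misleading and signals.about_voting_logistics:
        # Lies about the plumbing of the election get the strongest response.
        return ShareAction.HOLD_FOR_REVIEW
    if signals.flagged_misleading:
        # The post can exist; the recommendation engine stops working on its behalf.
        return ShareAction.DOWNRANK
    if signals.shares_per_hour > VIRAL_SHARES_PER_HOUR and not signals.reviewed:
        # Spreading faster than it can be evaluated: slow the loop down.
        return ShareAction.ADD_FRICTION
    return ShareAction.ALLOW
```

Note that none of these branches require deciding what is true. They only decide how much free distribution uncertain content gets while it spreads, which is exactly the design choice that labels leave untouched.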

None of this is impossible. The obstacle is not capability. It is incentive.

Platforms can redesign for less harm. They just cannot pretend it won’t change their numbers.


What should you do, as a person living in the middle of it?

I can’t give you a magic list that makes you immune. But you can do a few things that starve the system of easy fuel:

  • Don’t share first. Read first. If you’re angry, you’re being recruited.
  • Treat viral as suspicious. Virality is not evidence. It’s a signal that the system found a nerve.
  • Look for the logistics lies. Misinformation about how to vote is not “political opinion.” It’s sabotage.
  • Value boring sources. The sources that feel dull often have the discipline you need right now.
  • Remember that screenshots are weapons. A screenshot is context removed on purpose.
  • Don’t argue in the arena the algorithm built. The platform wants conflict. Sometimes the move is refusing to perform.

This isn’t about being above it all. It’s about understanding the environment you’re in.


The uncomfortable truth: democracy is now partly a software problem

We are heading into an election where legitimacy will be contested, and the contest will be waged through the systems that distribute attention.

The platforms won’t decide the winner directly. But they will influence what people believe is possible, what they believe is normal, what they believe is true, and what they believe is worth fighting over.

That is power. Not elected power, not accountable power, but real power all the same.

If we want to make it through November without breaking something we can’t repair, we need to stop treating platform governance as a side issue. It’s part of the election. It’s part of institutional stability. It’s part of public safety.

July is not the calm before the storm. July is the storm practicing.

And the forecast is being written by engagement metrics.

