Technology & Politics

The Pharaoh Who Rewrote a Battle He Nearly Lost — And the Brain That Let Him

In the fifth year of his reign, Ramesses II fought the battle that would become the subject of what may be the most extensively distributed piece of political propaganda in the ancient world. The Battle of Kadesh — fought against the Hittite Empire in 1274 BCE, on the Orontes River in what is now western Syria — was, by any neutral military accounting, a near-disaster that ended in a draw. The Egyptian army was ambushed. Ramesses himself was nearly captured. The campaign achieved none of its territorial objectives.

The inscriptions that Ramesses ordered carved into the walls of six major temples describe something else entirely: a solitary, godlike king, abandoned by his cowardly troops, single-handedly routing an entire Hittite army through divine favor and personal heroism.

The story spread. It was repeated for centuries. Several generations of Egyptians accepted it as history.

Before we congratulate ourselves on having progressed beyond this, it is worth examining why it worked — because the answer has nothing to do with the medium, and everything to do with the mind.

The Hunger for a Flattering Story

Human beings do not process information about power neutrally. We have a strong, well-documented tendency to prefer narratives that confirm existing hierarchies when we feel those hierarchies offer us protection, and to prefer narratives that challenge them when we feel excluded or threatened. Neither preference is primarily about truth. Both are primarily about psychological security.

This is not a modern discovery, though modern psychology has given it precise names: motivated reasoning, identity-protective cognition, the backfire effect. What it means in practice is that the audience for political narrative is never simply asking "Is this true?" It is asking, simultaneously, "Does this make me feel safe?" and "Does this reflect well on people I trust?" and "Does believing this cost me anything socially?"

Ramesses understood this intuitively, as every successful political communicator across five thousand years has understood it. He was not trying to inform his population. He was trying to manage their emotional relationship to power. The temples were not libraries. They were mood regulation at architectural scale.

A Brief Tour of the Technique

The specific forms of political lying shift with available technology, but the underlying structure is remarkably consistent across cultures and millennia.

Augustus transformed a bloody civil war fought entirely for personal supremacy into a narrative of reluctant republican restoration. His official record of his achievements, the Res Gestae, is a masterpiece of selective omission — a document in which the word "enemy" appears only in reference to foreign adversaries, and in which every act of violence is framed as a defensive response to someone else's aggression. Augustus did not claim to be a king. He claimed to have refused the crown. The distinction mattered enormously to Roman psychology, and he knew it.

In 1937, the Soviet press published a photograph of Joseph Stalin walking along the Moscow–Volga Canal with Nikolai Yezhov, the head of the NKVD, at his side. By 1940, Yezhov had been executed, and later printings of the photograph had been retouched to remove him. Stalin's editors did not invent photo manipulation — they industrialized it. The Soviets maintained an entire bureaucratic apparatus dedicated to updating the historical record as political circumstances changed. The technology was new. The impulse was Ramesses.

The pattern extends forward. Every modern government that has managed wartime information — from the British Ministry of Information in World War I to the Pentagon's embedded journalist program in Iraq — has operated on the same fundamental principle: control the emotional frame first, and the factual content will largely take care of itself.

What Is Actually New About Disinformation

The contemporary alarm about disinformation is, in some respects, misdirected — not because the concern is unfounded, but because it tends to focus on the novelty of the technology while underestimating the novelty of the structural change beneath it.

For most of recorded history, the production of official political narrative required resources that only institutions could command. Temples, printing presses, broadcast licenses, newspaper chains — these were expensive. The barrier to entry meant that the volume of competing narratives in any given society was limited. A citizen in 1950 might distrust the government's account of events, but the number of alternative accounts available to them was small, and most of those alternatives required some institutional backing to reach a mass audience.

Algorithmically amplified social media did not change human psychology. It changed the economics of narrative production. For the first time in history, the cost of distributing a political story to millions of people dropped to approximately zero, and the selection mechanism for which stories traveled furthest shifted from editorial gatekeeping to emotional virality. Stories that provoked strong feelings — outrage, fear, tribal solidarity, contempt — spread faster than stories that were merely accurate.

This is genuinely new. Not because humans are now more susceptible to flattering stories about power than Ramesses's subjects were. They are not. But because the ecosystem in which those stories compete has been structurally altered in ways that favor intensity over accuracy at a scale and speed no previous society has had to manage.

The Outrage Is Missing the Point

Most public debate about disinformation focuses on the supply side: who is producing false content, with what resources, toward what ends. This is a reasonable question, but it is not the most important one.

The more important question — the one that Ramesses, Augustus, and every successful political propagandist across the entire span of recorded history implicitly answered correctly — is about the demand side. Why do people want certain stories to be true? What needs does a flattering account of power fulfill? What does it cost someone, socially and psychologically, to reject a narrative that their community has accepted?

These questions are harder to answer, and the answers are less satisfying, because they locate part of the problem inside the audience rather than entirely within the bad actors producing the content. We prefer the version in which we are passive victims of sophisticated manipulation. That version is, of course, more flattering.

Ramesses would recognize it immediately. He would probably commission a temple.

The Medium Is Not the Message

The history of political lying is, at its core, a history of the same transaction repeated across thousands of years: a leader offers a population a story in which power is legitimate, the future is safe, and the community is special — and the population, more often than not, accepts the offer because the alternative is more frightening than the lie.

Understanding this does not make the problem of disinformation less serious. It makes it more serious, because it removes the comfortable fantasy that the solution is primarily technical. You cannot fact-check your way out of a hunger that is not, at its root, about facts.

The Egyptians knew this. So did the Romans, the Soviets, and every political consultant who has ever A/B tested a campaign message. The question for any democracy that wants to survive its own information environment is whether enough of its citizens can learn to ask, when they encounter a story that feels deeply satisfying, not just "Is this true?" but "Why do I want it to be?"
