Practical Tips for Defeating the Woke Mind Virus

December 14, 2022
Michael Taylor

The man who made reusable rockets, electric cars, and brain-computer interfaces happen, and is supposed to be taking us to Mars, has a new overriding concern. The woke ‘mind virus’.

https://twitter.com/elonmusk/status/1602278477234728960?s=20&t=i0ZISrSkqP9G3hz3avQ5lg

I was interested to see such a public reference to ‘mind viruses’, because it’s central to the topic of my book: “Marketing Memetics”. Most people won’t get the reference, and almost certainly won’t know that memetic engineering gives us tools for dealing with the problem. I thought I’d write an explainer post that gives a high-level overview.

What is the Woke Mind Virus?

The term ‘woke’ was originally a left-wing compliment and has now turned into a right-wing insult. It simply means being aware of and actively attentive to important societal facts and issues (especially issues of racial and social justice), which, in and of itself, is a good thing.

The anti-woke movement began as a response to ‘cancel culture’, where social pressure has been used to ostracize perceived violators of left-leaning orthodoxy. The concern from free-speech absolutists like Musk is that mob rule on social media is a form of censorship, disproportionately enforcing leftist political viewpoints and banning valid discussion. This may be a slippery slope towards a dystopian future where independent thought is illegal. Musk laments that wokeness is "at its heart, divisive, exclusionary, and hateful".

Depending on where you fall on the issue, cancel culture can be construed as valid criticism of those who have spoken or acted in an unacceptable manner, or as cynical virtue-signaling to increase one’s social status, at great risk to free society. For the purposes of this article we’ll assume the latter, but the methods we discuss are morally neutral.

The reference to a ‘mind virus’ is likely from Neal Stephenson’s “Snow Crash”, which featured a literal mind virus: a drug, a religion, and a computer virus all in one. “What’s the difference?” asks one of the characters.

In the influential 1992 cyberpunk sci-fi novel – which also gave us Avatars, Google Earth, and the Metaverse – the central premise is that ideas are ‘viruses of the mind’ that can infect our brains and affect our behavior, causing us to act against our self-interest. This idea dates back to evolutionary biologist Richard Dawkins, who coined the term ‘meme’ as an analogy to biological genes. He was describing the way ideas, behaviors, or styles evolve through natural selection.

“Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs, so memes propagate themselves in the meme pool by leaping from brain to brain.”

– Richard Dawkins, The Selfish Gene, 1976

Dawkins' work became the foundation of the field of Memetics. If ideas evolve like genes, perhaps we can direct their evolution to our benefit, like we do in genetics. The study of Memetics is still more ‘pseudo-’ than science, as there is no generally accepted way to measure where a meme begins and ends, which limits progress.

Memetics is missing its Gregor Mendel, the 19th century monk who meticulously crossbred 29,000 pea plants to establish the gene as the unit of heredity. It also lacks a Fisher figure to formalize the mathematics, and has yet to find its version of Watson and Crick, who discovered the underlying structure of genetics in DNA. Perhaps with advancements in Artificial Intelligence, Neuroimaging, and Brain-Computer Interfaces we’ll get there soon.

In the meantime, fringe internet commentators have reportedly used memetic engineering to great effect, claiming victories such as the GameStop short squeeze, the proliferation of Q-Anon, and the election of Donald Trump. We know the U.S. Military has studied Memetics, Taiwan is using memetic engineering to stop disinformation, and we can see evidence of memetic tactics deployed by both sides of the Ukraine-Russia war.

Furthermore, it may provide the solution to Elon’s issue, which he calls one of the biggest threats to the existence of humanity. The tools we’re about to discuss are general purpose and can be adapted to counter any ‘mind virus’, though the morality of what constitutes a ‘toxic’ meme versus a beneficial one is up to you.

“Who controls the memes,
controls the Universe”

– Elon Musk

Memetic Engineering

In Internet discussion forums in the early 90s, Mike Godwin saw debates frequently getting out of hand. Potentially fruitful conversations about difficult topics would escalate until, predictably, someone made a comparison to the Nazis. Claiming the other side’s ideas were ‘Nazi-like’ or saying they were “just like Hitler” ruined any hope for productive dialogue. So he decided to do something about it: in the first noted use of memetic engineering, he created the first ‘counter-meme’.

Godwin’s law stated “as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one”. He seeded this law in discussions to show participants they were “acting as vectors to a particularly silly and offensive meme”. It worked: others started repeating the meme and mutating it, and comparing someone to a Nazi in an online discussion now acts more to weaken your own position than that of your intended target.
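
To make the “approaches one” part concrete, here’s a minimal sketch in Python. The fixed per-message probability is my own simplifying assumption, not part of Godwin’s formulation: if each message has a small independent chance p of containing the comparison, the chance that at least one has appeared after n messages is 1 − (1 − p)^n, which heads towards certainty as the thread grows.

```python
# A toy probability model of Godwin's law (my own illustration, not Godwin's
# formulation): assume each message independently has a small fixed chance p
# of containing a Nazi or Hitler comparison.

def prob_comparison_by(n_messages: int, p: float = 0.01) -> float:
    """Probability that at least one comparison has appeared after n messages."""
    return 1 - (1 - p) ** n_messages

for n in (10, 100, 500, 1000):
    print(f"{n:>4} messages: {prob_comparison_by(n):.3f}")
# The longer the discussion runs, the closer the probability gets to one.
```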

Godwin’s counter-meme spread because it was memorable: it’s easy to recall, and when you’re familiar with the concept, you see it everywhere. Just like viruses die out if they fail to infect enough people, memes only persist if they’re remembered long enough, and shared with others. Memes have different strategies for doing this: ‘catchy’ songs get stuck in your head, funny stories get told at dinner parties, useful information gets written down in books. Quoting Godwin made people feel smart, and helped them win arguments, so it continues to be remembered.

What made it effective was its sense of humor. It’s suggested that humor evolved to help us break free of rigid classifications, by testing boundaries and playing with social norms. Calling someone a Nazi works to shut down discussion because Nazism is taboo. In daring to poke fun at those who misuse that label, Godwin gives us a cathartic release, stealing away the power of the accuser.

The Vaccine for Memes

In studying the spread of misinformation, researchers have found that methods echoing Godwin’s are an effective way to inoculate people against fake news. In medical vaccines, a virus is weakened so it doesn’t make you sick but still triggers antibodies that fight future infections. Meme vaccines work the same way, inoculating people against misinformation. Forewarned is forearmed: being warned that someone will intentionally try to mislead you on a topic, and being presented with facts and arguments that “pre-bunk” the misinformation, has been shown to diminish the effects of malicious memes.

Prevention is better than cure: once someone is exposed, debunking has little effect, and fact-checking spreads more slowly on social media than misinformation does. The same actions you’d take to protect the public from a pandemic are effective against mind viruses too: quarantine those spreading the virus (social media bans), protect the most vulnerable (public education), and treat the infected (mental health support). Be careful with this technique: one of the ‘conspiracy theories’ the study successfully inoculated people against – the “lab leak hypothesis” – later became a more generally accepted theory.

Fighting Smoke with Fire

We’re social animals, and we’re prone to gossip. It helps us enforce social norms and punish indiscretions by ostracizing offenders, as well as signal commitment to the in-group. To our prehistoric ancestors, being expelled from the tribe was as good as a death sentence, and biologically we still react as if that’s at risk of happening. Shame is what keeps us in line, and the scapegoat mechanism provides an outlet for social tensions. When this machinery goes haywire and selects the wrong victim, it can cause real-world harm.

To make good gossip that spreads virally, you need a plausible story that is distorted in three directions. The story must first be leveled: stripped of important details, because we find it easier to believe what we can’t easily verify. The remaining details must then be sharpened (made more specific), or the rumor won’t be memorable enough to pass on. Finally, the rumor must be assimilated into the group: adapted to make sense to those spreading the story.

To dispel a maladapted meme, you need to put this mechanism into reverse. First, realize that it’s no use denying the rumor: contradictory evidence only strengthens beliefs, and everybody suspects there’s no smoke without fire. Instead you must offer a new, better story in return: give them fire in place of smoke. Confess to the ‘real’ story, with enough embarrassing details to make it credible. Catharsis relieves the burden of repressing taboo topics and gives the crowd something new to discuss. Then inject so many details into the new story that it takes all the fun out of retelling: it’ll fizzle out instead of mutating into something new you’ll have to dispel.

The Social Benefits of Bad Beliefs

Whenever you find people believing something unlikely or unusual, there are usually group social dynamics at play. If some group has adopted you as their target, the attacks won’t stop until the group is disbanded or it picks a new target. The first and best strategy is aposematism: signaling that it’ll be unprofitable to attack you. Frogs do this with bright colors advertising their toxins, and honey badgers do it with their aggressive behavior. One strategy on social media is ‘shit posting’ (deliberately being provocative), which tells opposing groups you can take it.

If that doesn’t work, because you’re too juicy a target, you have to hit back at the group itself. We are predominantly preoccupied with how our beliefs will be interpreted by others, rather than with examining whether they’re true. In fact, the more a belief differs from our model of reality, the more important it becomes to group cohesion, because it becomes a costly signal that you’re “one of us”. We likely wouldn’t believe these things in isolation, but maintaining good relations with your spouse, family, tribe, church, military unit, political party, or workplace can easily outweigh other considerations. The key insight is that everyone is part of multiple tribes: if being a member of one tribe starts hurting your status in another, there’s a tipping point at which you’ll start to distance yourself from it.

We’ve already established that providing evidence to counter beliefs is a losing strategy, so do the opposite. People who argue against your belief are not in your tribe, and therefore can be safely ignored. What can’t be ignored is someone agreeing in a way that’s embarrassing, or taking things further than you wanted. They make you look bad by association to the other tribes you’re part of. Every group has a tiny minority of fundamentalists, and by moving towards their position rather than away from it, you force them onto more radical ground. You may succeed in knocking them out of the Overton window (the range of policies politically acceptable to the mainstream population) into a no-man’s land that devastates the group’s retention and recruitment.

Evolutionary Design Thinking

For evolution to arise in a system, you need just three things: replication, variation, and selection. In any system where traits can be inherited (directly or through imitation), where more agents are produced than can survive, and where agents with more favorable traits stand a higher chance of surviving, you’ll get natural selection. Any such system, given enough time, will produce agents that are extremely well optimized for their environment. If we don’t take this into account, it’s easy to accidentally design a system in a way that creates monsters.
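
Those three ingredients are easy to see in a toy simulation. The sketch below is a minimal illustration with an invented fitness measure and parameters (nothing here comes from a real study): agents replicate, copies vary slightly, and only the best-adapted survive each generation.

```python
import random

# A minimal sketch of replication, variation, and selection. Each agent is a
# single number representing how well its trait suits the environment.

def evolve(population, generations=50, mutation=0.1, carrying_capacity=100):
    for _ in range(generations):
        # Replication with variation: every agent copies itself twice,
        # with small random errors in each copy.
        offspring = [parent + random.gauss(0, mutation)
                     for parent in population for _ in range(2)]
        # Selection: more offspring are produced than the environment can
        # support, and only the best-adapted make it into the next generation.
        population = sorted(offspring, reverse=True)[:carrying_capacity]
    return population

population = [random.random() for _ in range(100)]
print("mean trait before:", round(sum(population) / len(population), 2))
population = evolve(population)
print("mean trait after: ", round(sum(population) / len(population), 2))
```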

Say you run a social network and you have a bot problem. Intuitively it may seem to make sense to enact a penalty for anyone who breaks the rules. However, you’ve just unknowingly introduced selection pressure into your system. The barrier to entry gives malicious agents a lucrative puzzle to solve. They’ll explore questions like “Which bots don’t get detected?” and “Under what conditions are the risks of a penalty tolerable?”. Given enough time, a “genius corrupter” will emerge: what Bret Weinstein calls an entity that has evolved to test the limits of a system without getting caught. In extreme cases an agent may corrupt the system itself, hijacking the enforcement mechanism for its own benefit. Rather than solving the bot problem, you’ve only made the bots stronger. You can find the solution by reversing the famous phrase “what doesn’t kill me makes me stronger”. Going nuclear on bad behavior makes it no longer worth it to play the game. The evolutionary process grinds to a halt, because every attempt goes extinct.
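
A small simulation makes that difference concrete. The assumptions are invented for illustration (bot ‘evasion’ as a single number, a fixed reward per surviving bot): a mild penalty just breeds better evaders, while a penalty severe enough to make the expected payoff negative stops the evolutionary process before it starts.

```python
import random

# Toy model of the selection-pressure argument. Each bot has an "evasion"
# skill; its chance of being caught each round is (1 - evasion). Operators
# keep breeding bots only while the expected payoff stays positive.

def run(penalty, reward=1.0, generations=30, size=200, mutation=0.05):
    bots = [random.random() for _ in range(size)]
    for gen in range(generations):
        mean_evasion = sum(bots) / len(bots)
        expected_payoff = reward * mean_evasion - penalty * (1 - mean_evasion)
        if expected_payoff <= 0:
            return gen, None          # not worth playing the game: operators quit
        survivors = [b for b in bots if random.random() < b]  # caught bots removed
        if not survivors:
            return gen, None          # every attempt went extinct
        # Replication with variation: survivors breed back up to full size.
        bots = [min(1, max(0, random.choice(survivors) + random.gauss(0, mutation)))
                for _ in range(size)]
    return generations, round(sum(bots) / len(bots), 2)

print("mild penalty   ->", run(penalty=0.5))  # evasion skill ratchets upward
print("nuclear penalty ->", run(penalty=50))  # a loss from the start, so it never evolves
```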

Another protection against corruption is decentralization. In a centralized system, a bad idea only needs 51% approval to be imposed on everyone: the tyranny of a slim majority. In a decentralized system, bad ideas are unevenly distributed (some groups might be 80% sold on a bad idea, others only 20%), so there will be safe enclaves where the bad idea never reaches a majority. When the inevitable consequences arrive, everyone can jump ship to the groups that got it right. More of the problem space is explored collectively, with everyone making decisions in their individual self-interest, which builds redundancy against bad decisions.
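
Here’s a minimal sketch of that argument with made-up support figures and ten equally sized groups: a single centralized vote imposes the bad idea on everyone, while local votes leave enclaves where it never reaches a majority.

```python
# Invented share of each (equally sized) group that supports the bad idea.
group_support = [0.80, 0.80, 0.75, 0.70, 0.60, 0.55, 0.45, 0.35, 0.25, 0.20]

# Centralized: one vote over the whole population decides for everyone.
overall = sum(group_support) / len(group_support)
verdict = "adopted everywhere" if overall > 0.5 else "rejected everywhere"
print(f"centralized vote ({overall:.0%} overall support): {verdict}")

# Decentralized: each group votes separately; groups below 50% stay unaffected.
adopters = sum(1 for s in group_support if s > 0.5)
enclaves = len(group_support) - adopters
print(f"decentralized votes: adopted in {adopters} groups, {enclaves} safe enclaves remain")
```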

Conclusion

Most people are aware of the term ‘meme’, but unaware of its broader pre-internet definition. Bad actors are already using memetic engineering against us, and studies show that being warned about this ahead of time helps inoculate you against its effects. Whatever you think about Elon Musk and his anti-woke crusade, and whether you’re left- or right-leaning, studying memetics can help level the playing field. If only one side of an argument is using these techniques, we’re unlikely to make the necessary trade-offs to arrive collectively at the right answer. If you’re interested in learning more about memetics, you can read essays from my book at marketingmemetics.com.
