A Whirlwind Tour of LW Rationality: 6 Books in 32 Pages - How To Actually Change Your Mind

(Back to “Map And Territory”)

How To Actually Change Your Mind E: Overly Convenient Excuses

Humility is a complicated virtue, and we should judge it by whether applying it makes us stronger or weaker, and by whether it is an excuse to shrug. To be correctly humble is to take action in anticipation of one’s own errors. (The Proper Use Of Humility)

A package deal fallacy is where you assume things traditionally grouped together must always be so. A false dilemma is presenting only two options where more exist. Justifications for noble lies usually commit one of these two fallacies; it is preferable to seek a third alternative, which may be less convenient. (The Third Alternative)

Human hope is limited and valuable, and the likes of lotteries waste it. (Lotteries: A Waste Of Hope, New Improved Lottery) There is a bias in which extremely tiny chances are treated as though their implications were more than tiny, and as justifying a proclaimed belief in them. There is a tendency to arbitrarily choose to ‘believe’ or not believe a thing, rather than responding to probabilities. (But There’s Still A Chance, Right?)

The fallacy of grey is to regard all imperfection and all uncertainty as equal. Wrongness is relative: some beliefs are wronger than others. (The Fallacy Of Grey) There is a sizeable inferential distance from thinking of knowledge as absolutely true to understanding knowledge as probabilistic. (Absolute Authority) Eliezer says he would be convinced that 2 + 2 = 3 by the same processes that convinced him that 2 + 2 = 4: a combination of physical observation, mental visualization, and social agreement, such as observing that putting two more objects down beside two objects produced three objects. (How To Convince Me That 2+2=3)

Because of how evidence works, a probability of 100% or 0% corresponds to infinite certainty, and requires infinite evidence to correctly attain. As a result it is always incorrect. (Infinite Certainty) 0 and 1 are [in a sense] not probabilities. (0 And 1 Are Not Probabilities)
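
As a rough illustration (not from the essays themselves), probabilities can be rewritten as log-odds, where each finite piece of evidence shifts belief by a finite number of bits; probabilities of 0 or 1 sit at infinite log-odds and so can never be reached. The numbers below are made up purely for illustration.

```python
import math

def log_odds_bits(p):
    """Convert a probability to log-odds in bits; p = 0 or 1 would be -inf/+inf."""
    return math.log2(p / (1 - p))

def update(belief_bits, likelihood_ratio):
    """Bayesian update in log-odds form: each piece of evidence adds a finite weight."""
    return belief_bits + math.log2(likelihood_ratio)

belief = log_odds_bits(0.5)        # start at even odds: 0 bits
for _ in range(10):
    belief = update(belief, 10.0)  # ten successive 10:1 pieces of evidence
print(belief)                      # ~33.2 bits of confidence: large, but finite
# log_odds_bits(1.0) would divide by zero: certainty sits at infinite log-odds,
# which no finite chain of updates like the one above can ever reach.
```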

It is reasonable to care how other humans think, as part of caring about how the future and present look. This is somewhat dangerous, and so must be tempered by a solid commitment to respond to bad thinking only with argument. (Your Rationality Is My Business)

How To Actually Change Your Mind F: Politics and Rationality

Politics is the mind-killer. People cannot think clearly about politics close to them. In politics, arguments are soldiers. When giving examples, it is tempting to use contemporary politics. Avoid this if possible. If you are discussing something innately political, use an example from historic politics with minimal contemporary implications if possible. (Politics Is The Mind-Killer)

Policy debates should not appear one-sided. Actions with many consequences should not be expected to have exclusively positive or negative consequences. If they appear to, this is normally the result of bias. They may legitimately have lopsided costs and benefits. (Policy Debates Should Not Appear One-Sided)

Humans tend to treat debates as a contest between two sides, where any weakness in one side is a gain to the other and vice versa, and whoever wins is correct on everything and whoever loses is wrong on everything. This is correct behaviour for a single, strictly binary question, but an error for any more complicated debate. (The Scales Of Justice, The Notebook Of Rationality)

The fundamental attribution error is a tendency in people to overly attribute the actions of others to innate traits, while overly attributing their own actions to circumstance as opposed to differences in themselves. Most people see themselves as normal. (Correspondence Bias) Even your worst enemies are not innately evil, and usually view themselves as the heroes of their own story. (Are Your Enemies Innately Evil?)

Stupidity causes more random beliefs, not reliably wrong ones, so reversing the beliefs of the foolish does not create correct beliefs; reversed stupidity is not intelligence. Foolish people disagreeing does not mean that you are correct. (Reversed Stupidity Is Not Intelligence)

Authority can be a useful guide to truth before you’ve heard arguments, but is not so after arguments. (Argument Screens Off Authority) The more distant evidence is from the specific question, the weaker it is. You should try to answer questions using direct evidence: hug the query. Otherwise, learning abstract arguments, including arguments about biases, can make you less rather than more accurate. (Hug The Query)

Speakers may manipulate their phrasing to alter what aspects of a situation are noticed. (Rationality And The English Language) Simplifying language interferes with this, and allows you to recognise errors in your own speech. (Human Evil And Muddled Thinking)

How To Actually Change Your Mind G: Against Rationalization

Because humans are irrational to start with, more knowledge can hurt you. Knowledge of biases gives you ammunition to use against arguments, including knowledge of this one. (Knowing About Biases Can Hurt People)

Expect occasional opposing evidence for any imperfectly exact model. You should not look for reasons to reject it, but update incrementally as it suggests. If your model is good, you will see evidence supporting it soon. (Update Yourself Incrementally) You should not decide what direction to change your opinion in by comparing new evidence to old arguments; this double-counts evidence. (One Argument Against An Army)
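
A minimal sketch of incremental updating in odds form, with illustrative numbers of my own choosing: each piece of evidence multiplies the odds by its likelihood ratio exactly once, and re-weighing old arguments against each new piece of evidence is equivalent to listing them twice.

```python
def posterior_odds(prior_odds, likelihood_ratios):
    """Multiply prior odds by each piece of evidence's likelihood ratio exactly once."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# Incremental updating: each item counts once, including the occasional
# opposing item (likelihood ratio below 1) that any imperfect model predicts.
evidence = [3.0, 3.0, 0.5, 3.0]
print(posterior_odds(1.0, evidence))               # 13.5 : 1 in favour

# Double-counting: rehearsing the first two old arguments again when new
# evidence arrives overstates the case for the same belief.
print(posterior_odds(1.0, evidence + [3.0, 3.0]))  # 121.5 : 1, inflated
```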

The sophistication with which you construct arguments does not improve your conclusions; that requires choosing what to argue in a manner that entangles your choice with the truth. (The Bottom Line)

Your reaction to evidence you know has been filtered must take the filtering itself into account. Knowing what is true can require looking at evidence from multiple parties. (What Evidence Filtered Evidence?)

Rationalization is determining your reasoning after your conclusion, and runs in the opposite direction to rationality. (Rationalization) You cannot create a rational argument this way, whatever you cite. (A Rational Argument)

Humans tend to consider only the critiques of their position that they know they can defeat. (Avoiding Your Belief’s Real Weak Points) A motivated skeptic asks if the evidence compels them to believe; a motivated credulist asks if the evidence allows them to believe. Motivated stopping is ceasing the search for opposing evidence earlier when you agree, and motivated continuation is searching longer when you don’t. (Motivated Stopping And Motivated Continuation)

Fake justification is searching for a justification for a belief which is not the one which led you to originally hold it. (Fake Justification) Justifications for rejecting a proposition are often not the person’s true objection, which when dispelled would result in the proposition being accepted. (Is That Your True Rejection?)

Facts about reality are often entangled with each other. (Entangled Truths, Contagious Lies, Of Lies And Black Swan Blowups) Maintaining a false belief often requires other false beliefs, including deception about evidence and rationality themselves. (Dark Side Epistemology)

How To Actually Change Your Mind H: Against Doublethink

In doublethink, you forget then forget you have forgotten. In singlethink, you notice yourself forgetting an uncomfortable thought and recall it. (Singlethink)

If you watch for the risks of doublethink carefully enough to employ it only when useful, you cannot do it at all. If you do not, you will do it where it harms you. Doublethink is therefore either not an option or harmful. (Doublethink (Choosing To Be Biased))

The above on doublethink may not be a dispassionate reporting of the facts; Eliezer admits that he may have been tempted into trying to create a self-fulfilling prophecy. He then says that it may be wise to at least tell yourself that you can’t self-deceive, so that you aren’t tempted to try. (Don’t Believe You’ll Self-Deceive)

It is possible to lead yourself to think you believe something without believing it. Believing that a belief is good can lead you to false belief-in-belief. (No, Really, I’ve Deceived Myself, Belief In Self-Deception) We often do not separate believing a belief from endorsing a belief. Belief-in-belief can create apparently contradictory beliefs. (Moore’s Paradox)

How To Actually Change Your Mind I: Seeing With Fresh Eyes

Anchoring is a behaviour in which we take a figure we’ve recently seen and adjust it to answer questions, making results depend on the initial anchor. A strategy for countering it might be to dwell on an alternative anchor if you notice an initial guess is implausible. (Anchoring And Adjustment)

Priming is an aspect of our brain’s architecture. Concepts related to ideas we’ve recently had in mind are recalled faster. This means that completely irrelevant observations influence estimates and decisions. This is known as contamination. It supports confirmation bias; having an idea in our head makes compatible ideas come to mind more easily, making us more receptive to confirming than disconfirming evidence for our beliefs. (Priming And Contamination)

Some evidence suggests that we tend to initially believe statements, then adjust to reject false ones. Being distracted makes us more likely to believe statements explicitly labeled as false. (Do We Believe Everything We’re Told?)

The hundred-step rule is the principle that because neurons in the human brain are slow, any hypothesised operation can be very parallel but must complete in under a hundred sequential neuron spikes. It is a good guess that human cognition consists mostly of cache lookups.

We incorporate the thoughts of others into this cache, and alone could not regenerate all the ideas we’ve collected in a single lifetime. We tend to incorporate and then repeat or act on cached thoughts without thinking about their source or credibility. (Cached Thoughts)

“Outside the box” thinking is a box of its own, and along with stated efforts at originality and subversive thinking follows predictable patterns; genuine originality requires thinking. (The “Outside The Box” Box) When a topic seems to have nothing to be said, it can mean we do not have any related cached thoughts, and find generating new ones difficult. (Original Seeing)

The events of history would sound extremely strange described to someone prior to them. (Stranger Than History) We tend to treat fiction as history which happened elsewhere. This causes us to favour hypotheses which fit into fun narratives, over other hypotheses that might be likely. (The Logical Fallacy Of Generalization From Fictional Evidence)

A model which connects all things contains the same information as a model that connects none. Information is contained in selectiveness about connections, and the more fine-grained this selectiveness is, the more information is contained. The virtue of narrowness is the definition and use of narrow terms and ideas rather than broad ones. (The Virtue Of Narrowness)

One may sound deep by coherently expressing cached thoughts that the listener hasn’t heard yet. One may be deep by attempting to see for oneself rather than following standard patterns. (How To Seem And Be Deep)

We change our minds less often than we think, and are resistant to doing so. A technique to mitigate this is to hold off on proposing solutions for as long as possible. (We Change Our Minds Less Often Than We Think, Hold Off On Proposing Solutions)

Because of confirmation bias, we should be suspicious of ideas that originally came from sources whose output was not entangled with the truth. However, to disregard other evidence entirely in favour of judging the original source would be the genetic fallacy. (The Genetic Fallacy)

How To Actually Change Your Mind J: Death Spirals and the Cult Attractor

The affect heuristic is the use of subjective impressions of goodness or badness as a heuristic. It causes the manner in which a problem is stated, and irrelevant aspects of a situation, to change the decisions we make. (The Affect Heuristic) The halo effect is this applied to people: our subjective impression of a person in one regard, such as appearance, alters our judgement of them in others. (The Halo Effect)

We overestimate the altruism of those who run less risk compared to those who run more, and attribute less virtue to people who are generous for lesser as well as greater need. (Superhero Bias) We lionize messiahs for whom doing great things is easy over those for whom it is hard. (Mere Messiahs)

We tend to evaluate things against nearby points of comparison. (Evaluability And Cheap Holiday Shopping) When we lack a bounded scale to put our estimates within, we make one up, inconsistently between people. (Unbounded Scales, Huge Jury Awards, And Futurism)

An affective death spiral is a scenario in which a strong positive impression assigned to one idea causes us to improve our impressions of related ideas, which we then treat as confirmation of the original idea in a self-sustaining cycle. (Affective Death Spirals) We can diminish the effect of positive impressions enough to prevent this by splitting big ideas into smaller ones we treat independently, reminding ourselves of the conjunction fallacy and considering each additional claim to be a burdensome detail, and following the suggestions in the Against Rationalization sequence. (Resist The Happy Death Spiral)
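
A tiny illustration of the burdensome-detail point, with made-up numbers: each additional claim can only multiply a story’s probability by a factor of at most one, so an elaborate scenario is never more probable than its central claim alone.

```python
p_core_claim = 0.4                    # probability of the central idea on its own
p_detail_given_core = 0.5             # probability of an appealing extra detail, given the core
p_full_story = p_core_claim * p_detail_given_core

print(p_full_story)                   # 0.2: adding the detail halved the probability
print(p_full_story <= p_core_claim)   # True: every extra claim is a burdensome detail
```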

Considering it morally wrong to criticise an idea accelerates an affective death spiral. (Uncritical Supercriticality) Evaporative cooling of group beliefs is a scenario in which as a group becomes more extreme, moderates leave, and as they are no longer acting as a brake, the group becomes yet more extreme, in a cycle. This is another reason why tolerating dissent is important. (Evaporative Cooling Of Group Beliefs)

A spiral of hate is the mirror image of an affective death spiral, in which a strong negative impression of a thing causes us to believe related negative ideas, which we then treat as strengthening the original impression. You can correspondingly observe it become morally wrong to urge restraint or to object to a criticism. It, too, leads to poor choice of action. (When None Dare Urge Restraint)

Humans, once divided into opposing groups, will naturally form positive and negative stereotypes of the two groups and engage in conflict. (The Robbers Cave Experiment) Every cause has a natural tendency for its supporters to become focused on defending their group, even if they declare ‘rationality’ to be their goal. (Every Cause Wants To Be A Cult)

Beware being primarily a guardian of the truth rather than primarily a seeker of it. (Guardians Of The Truth) The Nazis can be understood as would-be guardians of the gene pool. (Guardians Of The Gene Pool)

There are things we know now which earlier generations could not have known, which means that we should expect elementary errors even in our historic geniuses; this accumulation of knowledge is a defining attribute of scientific disciplines. It feels unfair to count things they could not have known as flaws in their ideas, but nevertheless they are. It is foolish to declare a system of ideas closed to further development. History already provides examples of people who declared themselves devoted to Rationality and fell into exactly that trap. (Guardians Of Ayn Rand)

Two ideas for countering a tendency towards affective death spirals around a group are to prefer using and describing techniques over citing authority, and to deliberately look foolish to reduce the positive affect you give to the techniques you describe, so they are judged on their own merits. (Two Cult Koans)

We tend to conform to the beliefs of those around us, and are especially inclined to avoid being the first dissenter, for social reasons. Being the first dissenter is thus a valuable service. (Asch’s Conformity Experiment) If you do not believe you have any special advantage, it can be correct to treat the majority opinion as more likely to be true, but it remains important to express your concerns; doing so is generally just as socially discouraged as outright disagreement. (On Expressing Your Concerns)

Lonely dissent is often just a role people play within defined patterns. When it is real, it requires bearing the incomprehension of the people around you and discussing ideas that are not so much forbidden as outside the bounds of what is thought about at all. Doing this without a single ally is terrifying. Being different for its own sake is a bias like any other. (Lonely Dissent)

Cults vary from sincere but deluded and expensive groups to those practising “love bombing”, sleep deprivation, induced fatigue, isolation in distant communes, and daily meetings to confess impure thoughts. Lists of cult characteristics include things which also describe other organisations, like political parties and corporations. The true defining aspect is the affective death spiral, which should be fought in any group, and judged independently of how weird the group is in other respects. (Cultish Countercultishness)

How To Actually Change Your Mind K: Letting Go

If we only admit small, local errors, we only make small, local improvements. Big improvements require admitting big errors. Rather than grudgingly admitting the smallest errors possible, be willing to consider that you may have made fundamental mistakes. (The Importance Of Saying “Oops”)

Reinterpreting your mistakes so that you were right ‘deep down’, or morally right, or half-right, forfeits the opportunity to see large errors in the route you are on and adjust. (The Crackpot Offer) Being ready to admit you lost lets you avoid turning small mistakes into bigger ones. (Just Lose Hope Already)

A doubt exists to potentially destroy a particular belief, on the basis of some specific justification. A doubt that fails to either be destroyed or destroy its belief may as well not have existed at all. Wearing doubts as attire does not make you more rational. (The Proper Use Of Doubt)

You can face reality. What is true is already so. Owning up to it doesn’t make it any worse. (You Can Face Reality)

Criticising yourself from a sense of duty leaves you wanting to have investigated, not wanting to investigate; this leads to motivated stopping. There is no substitute for genuine curiosity, so attempt to cultivate it. Conservation of expected evidence means that any process you expect might confirm your belief must also be able to disconfirm it. If you do not expect this, ask whether you are looking only at the strong points of your belief. (The Meditation On Curiosity)
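
A small numeric check of conservation of expected evidence, using arbitrary example numbers: the prior is the probability-weighted average of the possible posteriors, so if observing E would raise your confidence in H, failing to observe E must lower it.

```python
import math

p_H, p_E_given_H, p_E_given_not_H = 0.3, 0.8, 0.2      # arbitrary example numbers

p_E = p_E_given_H * p_H + p_E_given_not_H * (1 - p_H)
p_H_given_E = p_E_given_H * p_H / p_E                   # posterior if E is observed
p_H_given_not_E = (1 - p_E_given_H) * p_H / (1 - p_E)   # posterior if E is not observed

expected_posterior = p_E * p_H_given_E + (1 - p_E) * p_H_given_not_E
print(p_H_given_E > p_H > p_H_given_not_E)     # True: E confirms, absence of E disconfirms
print(math.isclose(expected_posterior, p_H))   # True: the expected posterior equals the prior
```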

The laws governing evidence and belief are not social, but aspects of reality. They are not created by rationalists, but merely guessed at. No one can excuse you from them, any more than they may excuse you from the laws of gravity, regardless of how unfair they are in either case. (No One Can Exempt You From Rationality’s Laws)

When you have a cherished belief, ask yourself what you would do, assuming it were false. Visualise the world in which it is false, without challenging that assumption. Answering this grants you a line of retreat (a calm, tolerable path forward), enabling you to consider the question. (Leave A Line Of Retreat)

When you are invested heavily and emotionally in a long-lived belief which is surrounded by arguments and refutations, it can be desirable to attempt to instigate a real crisis of faith about it, one that could go either way, as it will take more than an ordinary effort to displace if false. (Crisis Of Faith, The Ritual)

(Continue with “The Machine In The Ghost”)