Title Map and Territory (Book 1 of 6 from the Rationality: From AI to Zombies series)
Author Eliezer Yudkowsky
Year Published 2018
Kind of Book Rationality/Critical Thinking
How strongly I recommend it 8/10
My Impressions A collection of essays originally published on the rationalist blog "Less Wrong." These essays tackle different questions, dilemmas, and insights in the world of rationality and critical thinking. They are short and full of fun, silly examples. Reading these short essays makes me feel smarter and pushes me to really examine my thinking and assumptions.
Date Read circa 2020
Practical Takeaways
(consistency bias) View changing your mind as a sign of growth, not as a negative thing
(planning fallacy) When estimating how long something will take, deliberately avoid thinking about the specifics of the task and ask yourself how long similar projects have taken in the past
(planning fallacy) When estimating how long something will take, ask an experienced outsider how long similar projects have taken them. (That answer will seem hideously long. The answer is true. Deal with it.)
If something can be destroyed by truth, destroy it ("That which can be destroyed by the truth should be." -P.C. Hodgell)
Don't ask yourself what to believe, ask yourself what you anticipate will happen
Be not too quick to blame those who misunderstand your perfectly clear sentences, spoken or written. (chances are, your words are more ambiguous than you think. It makes perfect sense to you because you know what you meant)
Pursue truth out of genuine curiosity about what is true, above all else (i.e. above moral or pragmatic reasons)
When talking to someone, make sure your inferential pathway starts with what they already know and accept, and draw your argument from there (i.e. don't assume shared prior knowledge or prior assumptions)
Don't hint that you are a few inferential steps away from the person you're speaking to (or they will think you're condescending)
If students are guessing the answer they think you want to hear, ask them to explain their thought process
Have students show their work (this prevents them from memorizing without really learning or from taking shots in the dark)
When trying to understand something, put it in your own terms. (If you can only use the author's exact terms, you don't understand it)
When you run into something you don't understand, use the word 'magic,' e.g. X magically does Y (this will prevent you from thinking you know something you don't)
If there are 70% red balls and 30% blue balls drawn at random and you have to guess what is coming, choose red every time, not red 70% of the time and blue 30% of the time (see the simulation sketch after this list)
Make predictions based on history, not based on sci-fi and fiction
Try to feel as if everything that you read in a history book happened to you personally (e.g. imagine yourself as a peasant starving during the French Revolution)
For every time you imagine yourself as a ruler in history, imagine yourself as a thousand peasants
Don't stop wanting to know the answer to something you don't know, just because other people already know the answer (e.g. just because other people know how a lightbulb works, it shouldn't diminish your curiosity to find out)
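Why always guessing red beats matching the frequencies: red wins 70% of the time, while matching is right only 0.7*0.7 + 0.3*0.3 = 58% of the time. A minimal Python sketch of my own (not from the book) simulating both strategies:

import random

# Draw 100,000 balls, 70% red / 30% blue, and compare two guessing strategies.
random.seed(0)
N = 100_000
draws = ["red" if random.random() < 0.7 else "blue" for _ in range(N)]

# Strategy 1: always guess the majority color (red).
always_red = sum(d == "red" for d in draws) / N

# Strategy 2: "probability matching" -- guess red 70% of the time.
matching = sum(d == ("red" if random.random() < 0.7 else "blue") for d in draws) / N

print(f"always guess red:     {always_red:.3f}")  # ~0.70
print(f"probability matching: {matching:.3f}")    # ~0.58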
Big Ideas
Looking back on old work and seeing mistakes (or that your mind has changed) is a sign of growth
Humans are not rational,
But
Humans are predictably irrational
Our mind might have adapted through evolution to believe some things that aren't true
Sometimes we feel pressured to believe what others believe in order to get along socially
It is "rational" to use intuition/hunches/and gut feelings when they're appropriate
Things usually take longer in reality than we believe they will take in our worst-case scenarios
People pursue truth for 3 reasons
Moral injunction, i.e. they believe it is inherently noble, important, and worthwhile
Pragmatism, i.e. they want to accomplish some specific real-world goal (e.g. building an airplane)
Curiosity, i.e. they just want to know
The problem with pursuing truth for moral reasons is that what you decide to investigate will also be driven by your ideals
The problem with pursuing truth for curiosity's sake is that you may not pursue useful questions because they don't happen to tickle your aesthetic sense the way more esoteric questions do
The problem with pursuing truth for pragmatic reasons is that sometimes what is useful is not what is "true" (i.e. metaphorical truth)
The reason the Old Testament claims large-scale geopolitical miracles (like Egypt ruling much of Canaan) while the New Testament only claims small-scale miracles (like Jesus curing a man of blindness) is that the Roman Empire had libraries and could document large-scale history
Not getting involved in a conflict between the powerful and the powerless is the same thing as siding with the powerful -Paulo Freire
Not making a decision is a decision
There is a difference between judging something as neutral and withholding judgement because you don't have enough time/care/focus/motivation to make a judgement.
How long a sentence is (or how advanced the vocabulary is) is not indicative of how complex the thing being said is
We believe things instinctively
Disbelief requires conscious effort
If no evidence can prove your theory wrong, then your theory doesn't actually explain anything.
In probability theory, absence of evidence is always evidence of absence (see the Bayes sketch at the end of this section)
The reason scientists have trouble communicating with lay audiences is that they are so many inferential steps removed from them, and they often go back one step when they really need to go back several
Students are never rewarded in school for saying "I don't know"
So
Students grow up thinking it is bad to admit you don't know the answer to something and that it is better to guess or pretend to know the answer
A Simulation Argument doesn't explain anything about the Universe, because you are then left with the question, 'who created the beings running the simulation?'
Something being mysterious is a statement about the observer, not the thing; nothing is inherently mysterious
We remake mistakes from history after everyone from the generation who made them is dead
If someone claims that there is no objective truth, then that claim itself cannot be objectively true
If you can't write a concept in your own terms (and can only copy the author's terms exactly) it means you don't understand it
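On "absence of evidence is evidence of absence": if observing evidence E would raise your probability of hypothesis H, then failing to observe E must lower it. A minimal Bayes sketch in Python, with illustrative numbers of my own (not from the book):

# Prior and likelihoods (illustrative numbers, not from the book).
p_h = 0.5              # P(H): prior probability of the hypothesis
p_e_given_h = 0.8      # P(E|H): evidence is likely if H is true
p_e_given_not_h = 0.3  # P(E|not H)

# Total probability of observing the evidence.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)  # 0.55

# Posterior after seeing E, and after seeing its absence.
p_h_given_e = p_e_given_h * p_h / p_e                  # ~0.727: seeing E raises P(H)
p_h_given_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)  # ~0.222: not seeing E lowers P(H)

print(p_h_given_e, p_h_given_not_e)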
Surprising Facts
During the Roman Empire, sports fans of chariot racing teams would sometimes murder fans of opposing racing teams
When people were asked how much time they thought a task would take, the numbers they gave for their realistic 'best guess' scenario and for their hoped-for 'best case' scenario were the same on average
When people were asked how much time a task would take in a worst-case scenario, on average they still estimated less time than it ended up taking, i.e. things usually take longer in reality than we believe they will take in our worst-case scenarios
The New Testament was written during the Roman Empire
Emergency rooms (in the US) are legally obligated to treat anyone, regardless of ability to pay
Shepherds built counting systems by throwing a pebble into a bucket whenever a sheep left the fold, and taking a pebble out whenever a sheep returned
Unknown Terms
Statistical Bias: Unlike a cognitive bias, this occurs when a statistical model or estimate is systematically unrepresentative of the population being measured. In this case your method of learning about the world is biased, so acquiring more data can even consistently worsen things
Scope insensitivity (Scope neglect): This cognitive bias occurs when the scope of an altruistic action has little effect on our willingness to pay for it. For example, people will pay the same (and sometimes more) when they are told there are 2,000 birds that need help than when they are told there are 200,000
Selective Reporting: a term for how certain stories are more likely to get reported because they are outrageous, controversial, exciting, or potentially lucrative for the news outlets
Cognitive Biases: "those obstacles to truth which are produced...by the shape of our own mental machinery...[they are] distinguishable from errors stemming from damage to an individual human brain, or from absorbed cultural mores; biases arise from machinery that is humanly universal."
Corroborative evidence: evidence that tends to support a proposition that is already supported by some initial evidence, therefore confirming the proposition.
Epistemic Rationality: 1) the type of rationality focused on building accurate maps of reality 2) systematically improving the accuracy of one's beliefs
Instrumental Rationality: 1) the type of rationality focused on systematically achieving one's values 2) it is about steering reality, sending the future where you want it to go 3) winning, in the sense that you are acting rationally to pursue your highest value (not necessarily at another's expense, because your highest value might be that others win)
Probability Theory: 1) the branch of mathematics concerned with probability 2) the set of laws underlying rational belief
Belief in Belief: a term coined by Daniel Dennett. It is the idea that it is sometimes easier to believe that you should believe in something than to actually believe in it.
Evidence: an event entangled, by links of cause and effect, with whatever you want to know about.
Truth Bias: This cognitive bias is the phenomenon whereby people are more likely to correctly judge that a truthful statement is true than that a lie is false, i.e. we have a bias towards believing rather than disbelieving what we see and hear
Illusion of Transparency: This is the tendency to overestimate how clear you are being with your language or the way you are communicating; because you know what you meant, you assume others must as well.
Inferential steps (Inferential distance): This is the gap in prior knowledge between two people about what is being spoken about. If this gap is too large, the receiver will not be able to grasp what the speaker is saying, because they don't have the same prior information or understanding of references to make what is being said intelligible. (e.g. If I say she is an ENFJ with an Electra complex and the person I am speaking to doesn't know what an ENFJ is or what the Electra complex is, they are one inferential step removed from understanding me. If I tried to explain that an Electra complex is a type of neurosis and they didn't know what a neurosis was, then they are two inferential steps away from me. If I tried to explain that a neurosis is a recurring psychological fixation and they did not know what a fixation was, they would be three inferential steps away from me. And on and on)
Guessing the Teacher's Password: when a student tries to guess the answer the teacher wants to hear without any real understanding, just taking shots in the dark.
The Logical Fallacy of Generalization from Fictional Evidence: when someone uses a fictional example (from a movie or book) as real-world evidence (e.g. "we should be worried about AI, just look what happened in Terminator!")
Wu wei: 'non-action'. The concept, in Daoism, of effortlessly achieving one's goals by ceasing to strive and struggle to reach them
Transhumanism: 1) the idea that humans should use technology to radically improve their lives (e.g. curing disease or ending aging) 2) a philosophical movement, the proponents of which advocate and predict the enhancement of the human condition by developing and making widely available sophisticated technologies able to greatly enhance longevity, mood, and cognitive abilities