
Super Thinking: The Big Book of Mental Models

by Gabriel Weinberg

View on Amazon

  • There is a smaller set of mental models, however, that are useful in general day-to-day decision making, problem solving, and truth seeking. These often originate in specific disciplines (physics, economics, etc.), but have metaphorical value well beyond their originating discipline.

  • Critical mass is one of these mental models with wider applicability: ideas can attain critical mass; a party can reach critical mass; a product can achieve critical mass. Unlike hundreds of other concepts from physics, critical mass is broadly useful outside the context of physics.

  • We call these broadly useful mental models super models because applying them regularly gives you a super power: super thinking—the ability to think better about the world—which you can use to your advantage to make better decisions, both personally and professionally.

  • What is elementary, worldly wisdom? Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ’em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form. You’ve got to have models in your head. And you’ve got to array your experience—both vicarious and direct—on this latticework of models.

  • “History doesn’t repeat itself, but it does rhyme.”

  • When you don’t use mental models, strategic thinking is like using addition when multiplication is available to you.

  • For instance, unless you are a physicist, Coriolis force, Lenz’s law, diffraction, and hundreds of other concepts are unlikely to be of everyday use to you, but we contend that critical mass will prove useful. That’s the difference between regular mental models and super models. And this pattern repeats for each of the major disciplines. As Munger said:

  • And the models have to come from multiple disciplines—because all the wisdom of the world is not to be found in one little academic department …. You’ve got to have models across a fair array of disciplines. You may say, “My God, this is already getting way too tough.” But, fortunately, it isn’t that tough—because 80 or 90 important models will carry about 90 percent of the freight in making you a worldly-wise person. And, of those, only a mere handful really carry very heavy freight.

  • When I urge a multidisciplinary approach … I’m really asking you to ignore jurisdictional boundaries. If you want to be a good thinker, you must develop a mind that can jump these boundaries. You don’t have to know it all. Just take in the best big ideas from all these disciplines. And it’s not that hard to do.

  • Reading this book for the first time is like Spider-Man getting his spider bite or the Hulk his radiation dose. After the initial transformation, you must develop your powers through repeated practice.

  • “The best time to plant a tree was twenty years ago. The second best time is now.”

  • The concept of inverse thinking can help you with the challenge of making good decisions. The inverse of being right more is being wrong less. Mental models are a tool set that can help you be wrong less. They are a collection of concepts that help you more effectively navigate our complex world.

  • Let us offer an example from the world of sports. In tennis, an unforced error occurs when a player makes a mistake not because the other player hit an awesome shot, but rather because of their own poor judgment or execution. For example, hitting an easy ball into the net is one kind of unforced error. To be wrong less in tennis, you need to make fewer unforced errors on the court. And to be consistently wrong less in decision making, you consistently need to make fewer unforced errors in your own life.

  • Some things benefit from shocks; they thrive and grow when exposed to volatility, randomness, disorder, and stressors and love adventure, risk, and uncertainty. Yet, in spite of the ubiquity of the phenomenon, there is no word for the exact opposite of fragile. Let us call it antifragile. Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.

  • The central mental model to help you become a chef with your thinking is arguing from first principles. It’s the practical starting point to being wrong less, and it means thinking from the bottom up, using basic building blocks of what you think is true to build sound (and sometimes new) conclusions. First principles are the group of self-evident assumptions that make up the foundation on which your conclusions rest—the ingredients in a recipe or the mathematical axioms that underpin a formula.

  • When arguing from first principles, you are deliberately starting from scratch. You are explicitly avoiding the potential trap of conventional wisdom, which could turn out to be wrong. Even if you end up in agreement with conventional wisdom, by taking the first-principles approach, you will gain a much deeper understanding of the subject at hand.

  • As another example, any startup business idea is built upon a series of principled assumptions: • My team can build our product. • People will want our product. • Our product will generate profit. • We will be able to fend off competitors. • The market is large enough for a long-term business opportunity.

  • My team can build our product. We have the right number and type of engineers; our engineers have the right expertise; our product can be built in a reasonable amount of time; etc. • People will want our product. Our product solves the problem we think it does; our product is simple enough to use; our product has the critical features needed for success; etc. • Our product will generate profit. We can charge more for our product than it costs to make and market it; we have good messaging to market our product; we can sell enough of our product to cover our fixed costs; etc. • We will be able to fend off competitors. We can protect our intellectual property; we are doing something that is difficult to copy; we can build a trusted brand; etc. • The market is large enough for a long-term business opportunity. There are enough people out there who will want to buy our product; the market for our product is growing rapidly; the bigger we get, the more profit we can make; etc.

  • Once you identify the critical assumptions to de-risk, the next step is actually going out and testing these assumptions, proving or disproving them, and then adjusting your strategy appropriately.

  • Unfortunately, people often make the mistake of doing way too much work before testing assumptions in the real world. In computer science this trap is called premature optimization, where you tweak or perfect code or algorithms (optimize) too early (prematurely). If your assumptions turn out to be wrong, you’re going to have to throw out all that work, rendering it ultimately a waste of time.

  • A model to help you test your assumptions is the minimum viable product, or MVP. The MVP is the product you are developing with just enough features, the minimum amount, to be feasibly, or viably, tested by real people.

  • “Everybody has a plan until they get punched in the mouth.”

  • Ockham’s razor helps here. It advises that the simplest explanation is most likely to be true.

  • “Everything should be made as simple as it can be, but not simpler!” In medicine, it’s known by this saying: “When you hear hoofbeats, think of horses, not zebras.”

  • A related trap/trick is nudging. Aldert Vrij presents a compelling example in his book Detecting Lies and Deceit: Participants saw a film of a traffic accident and then answered the question, “About how fast were the cars going when they contacted each other?” Other participants received the same question, except that the verb contacted was replaced by either hit, bumped, collided, or smashed. Even though the participants saw the same film, the wording of the question affected their answers. The speed estimates (in miles per hour) were 31, 34, 38, 39, and 41, respectively.

  • Anchoring isn’t just for numbers. Donald Trump uses this mental model, anchoring others to his extreme positions, so that what seem like compromises are actually agreements in his favor. He wrote about this in his 1987 book Trump: The Art of the Deal: My style of deal-making is quite simple and straightforward. I aim very high, and then I just keep pushing and pushing to get what I’m after. Sometimes I settle for less than I sought, but in most cases I still end up with what I want.

  • Availability bias stems from overreliance on your recent experiences within your frame of reference, at the expense of the big picture. Let’s say you are a manager and you need to write an annual review for your direct report. You are supposed to think critically and objectively about her performance over the entire year. However, it’s easy to be swayed by those really bad or really good contributions over just the past few weeks. Or you might just consider the interactions you have had with her personally, as opposed to getting a more holistic view based on interactions with other colleagues with different frames of reference.

  • Online this model is called the filter bubble, a term coined by author Eli Pariser, who wrote a book on it with the same name.

  • When you put many similar filter bubbles together, you get echo chambers, where the same ideas seem to bounce around the same groups of people, echoing around the collective chambers of these connected filter bubbles. Echo chambers result in increased partisanship, as people have less and less exposure to alternative viewpoints. And because of availability bias, they consistently overestimate the percentage of people who hold the same opinions.

  • What story would they tell? How much would they agree with your story? Authors Douglas Stone, Bruce Patton, and Sheila Heen explore this model in detail in their book Difficult Conversations: “The key is learning to describe the gap—or difference—between your story and the other person’s story. Whatever else you may think and feel, you can at least agree that you and the other person see things differently.”

  • Another tactical model that can help you empathize is the most respectful interpretation, or MRI. In any situation, you can explain a person’s behavior in many ways. MRI asks you to interpret the other parties’ actions in the most respectful way possible. It’s giving people the benefit of the doubt.

  • Another way of giving people the benefit of the doubt for their behavior is called Hanlon’s razor: never attribute to malice that which is adequately explained by carelessness.

  • Another tactical model to help you have greater empathy is the veil of ignorance, put forth by philosopher John Rawls. It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil preventing us from knowing who we are. Rawls refers to this as the “original position.”

  • Speaking of privilege, we (the authors) often say we are lucky to have won the birth lottery. Not only were we not born into slavery, but we were also not born into almost any disadvantaged group. At birth, we were no more deserving of an easier run at life than a child who was born into poverty, or with a disability, or any other type of disadvantage. Yet we are the ones who won this lottery since we do not have these disadvantages.

  • We now know this to be the case—all of our continents were previously grouped together into one supercontinent now called Pangea. However, his theory was met with harsh criticism because Wegener was an outsider—a meteorologist by training instead of a geologist—and because he couldn’t offer an explanation of the mechanism causing continental drift, just the idea that it likely had taken place.

  • After struggling to get his ideas adopted, Semmelweis went crazy, was admitted to an asylum, and died at the age of forty-seven. It took another twenty years after his death for his ideas about antiseptics to start to take hold, following Louis Pasteur’s unquestionable confirmation of germ theory.

  • However, they both noticed obvious and important empirical truths that should have been investigated by other scientists but were reflexively rejected by these scientists because the suggested explanations were not in line with the conventional thinking of the time. Today, this is known as a Semmelweis reflex.

  • There is a reason why many startup companies that disrupt industries are founded by industry outsiders.

  • The reason is that outsiders aren’t rooted in existing paradigms. Their reputations aren’t at stake if they question the status quo. They are by definition “free thinkers” because they are free to think without these constraints.

  • The pernicious effects of confirmation bias and related models can be explained by cognitive dissonance, the stress felt by holding two contradictory, dissonant, beliefs at once. Scientists have actually linked cognitive dissonance to a physical area in the brain that plays a role in helping you avoid aversive outcomes. Instead of dealing with the underlying cause of this stress—the fact that we might actually be wrong—we take the easy way out and rationalize the conflicting information away. It’s a survival instinct!

  • Most people are binary and instant in their judgments; that is, they immediately categorize things as good or bad, true or false, black or white, friend or foe. A truly effective leader, however, needs to be able to see the shades of gray inherent in a situation in order to make wise decisions as to how to proceed. The essence of thinking gray is this: don’t form an opinion about an important matter until you’ve heard all the relevant facts and arguments, or until circumstances force you to form an opinion without recourse to all the facts (which happens occasionally, but much less frequently than one might imagine). F. Scott Fitzgerald once described something similar to thinking gray when he observed that the test of a first-rate mind is the ability to hold two opposing thoughts at the same time while still retaining the ability to function.

  • For example, if you’re an experienced hiker in bear country, you know that you should never stare down a bear, as it will take this as a sign of aggression and may charge you in response. Suppose now you’re hiking in mountain lion country and you come across a lion—what should you do? Your intuition would tell you not to stare it down, but in fact, you should do exactly that. To mountain lions, direct eye contact signals that you aren’t easy prey, and so they will hesitate to attack.

  • In other words, as we explained at the beginning of this chapter, using mental models over time is a slow and steady way to become more antifragile, making you better able to deal with new situations over time. Of course, the better the information you put into your brain, the better your intuition will be.

  • The root cause, by contrast, is what you might call the real reason something happened. People’s explanations for their behavior are no different: anyone can give you a reason for their behavior, but that might not be the real reason they did something. For example, consistent underperformers at work usually have a plausible excuse for each incident, but the real reason is something more fundamental, such as lack of skills, motivation, or effort.

  • In an 1833 essay, “Two Lectures on the Checks to Population,” economist William Lloyd described a similar, but hypothetical, overgrazing scenario, now called the tragedy of the commons.

  • You’ve probably gone out to dinner with friends expecting that you will equally split the check. At dinner, each person is faced with a decision to order an expensive meal or a cheaper one. When dining alone, people often order the cheaper meal. However, when they know that the cost of dinner is shared by the whole group, people tend to opt for the expensive meal. If everyone does this then everyone ends up paying more!
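
A little arithmetic makes the incentive clear. A minimal sketch (the $20 upgrade and the table of four are hypothetical numbers, purely for illustration):

```python
def my_extra_cost(upgrade_price, group_size):
    """With an evenly split check, upgrading your own meal by upgrade_price
    only raises your personal share by a fraction of that amount."""
    return upgrade_price / group_size

# Hypothetical: a $20 pricier meal at a table of four feels like only $5...
print(my_extra_cost(20, 4))      # 5.0
# ...but if all four diners reason the same way, each pays the full $20 more.
print(4 * my_extra_cost(20, 4))  # 20.0
```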

  • Unfortunately, some people cannot be medically immunized, such as infants, people with severe allergies, and those with suppressed immune systems. Through no fault of their own, they face the potentially deadly consequences of the anti-vaccination movement, a literal tragedy of the commons.

  • The Coase theorem holds that instead of limiting the cows, another solution would have been to simply divide the grazing rights to the commons property among the farmers. The farmers could then trade the grazing rights among themselves, creating an efficient marketplace for the use of the commons.

  • Goodhart’s law summarizes the issue: When a measure becomes a target, it ceases to be a good measure. This more common phrasing is from Cambridge anthropologist Marilyn Strathern in her 1997 paper “‘Improving Ratings’: Audit in the British University System.” However, the “law” is named after English economist Charles Goodhart, whose original formulation in a conference paper presented at the Reserve Bank of Australia in 1975 stated: “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”

  • Both describe the same basic phenomenon: When you try to incentivize behavior by setting a measurable target, people focus primarily on achieving that measure, often in ways you didn’t intend. Most importantly, their focus on the measure may not correlate to the behavior you hoped to promote. High-stakes testing culture—be it for school examinations, job interviews, or professional licensing—creates perverse incentives to “teach to the test,” or worse, cheat. In the city of Atlanta in 2011, 178 educators were implicated in a widespread scandal involving correcting student answers on standardized tests, ultimately resulting in eleven convictions and sentences of up to twenty years on racketeering charges.

  • Koenigswald’s discoveries might have been more impressive still but for a tactical error that was realized too late. He had offered locals ten cents for every piece of hominid bone they could come up with, then discovered to his horror that they had been enthusiastically smashing large pieces into small ones to maximize their income.

  • This model gets its name from a situation involving actual cobras. When the British were governing India, they were concerned about the number of these deadly snakes, and so they started offering a monetary reward for every snake brought to them. Initially the policy worked well, and the cobra population decreased. But soon, local entrepreneurs started breeding cobras just to collect the bounties. After the government found out and ended the policy, all the cobras that were being used for breeding were released, increasing the cobra population even further.

  • A related model to watch out for is the hydra effect, named after the Lernaean Hydra, a beast from Greek mythology that grows two heads for each one that is cut off. When you arrest one drug dealer, they are quickly replaced by another who steps in to meet the demand. When you shut down an internet site where people share illegal movies or music, more pop up in its place. Regime change in a country can result in an even worse regime.

  • If you do engage, another trap to watch out for is the observer effect, where there is an effect on something depending on how you observe it, or even who observes it. An everyday example is using a tire pressure gauge. In order to measure the pressure, you must also let out some of the air, reducing the pressure of the tire in the process. Or, when the big boss comes to town, everyone acts on their best behavior and dresses in nicer clothes.

  • The general model for this impact comes from economics and is called path dependence, meaning that the set of decisions, or paths, available to you now is dependent on your past decisions.

  • Another model from economics offers some reprieve from the limitations of path dependence: preserving optionality. The idea is to make choices that preserve future options. Maybe as a business you put some excess profits into a rainy-day fund, or as an employee you dedicate some time to learning new skills that might give you options for future employment. Or, when faced with a decision, maybe you can delay deciding at all (see thinking gray in Chapter 1) and, instead, continue to wait for more information, keeping your options open until you are more certain of a better path to embark upon.

  • You need to find the right balance between preserving optionality and path dependence.

  • Roman writer Seneca said, “The abundance of books is a distraction”—in the first century A.D.!

  • Some decisions are consequential and irreversible or nearly irreversible—one-way doors—and these decisions must be made methodically, carefully, slowly, with great deliberation and consultation. If you walk through and don’t like what you see on the other side, you can’t get back to where you were before …. But most decisions aren’t like that—they are changeable, reversible—they’re two-way doors. If you’ve made a suboptimal [reversible] decision, you don’t have to live with the consequences for that long. You can reopen the door and go back through …. As organizations get larger, there seems to be a tendency to use the heavy-weight [irreversible] decision-making process on most decisions, including many [reversible] decisions. The end result of this is slowness, unthoughtful risk aversion, failure to experiment sufficiently, and consequently diminished invention.

  • In your own life, you can use Hick’s law to remember that decision time is going to increase with the number of choices, and so if you want people to make quick decisions, reduce the number of choices. One way to do this is to give yourself or others a multi-step decision with fewer choices at each step, such as asking what type of restaurant to go to (Italian, Mexican, etc.), and then offering another set of choices within the chosen category.
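
The highlight doesn’t give the formula, but Hick’s law is commonly written as T = a + b · log2(n + 1), where n is the number of choices. A minimal sketch of how predicted decision time grows with menu size (the constants a and b below are made-up placeholders, not measured values):

```python
import math

def hick_decision_time(n_choices, a=0.2, b=0.15):
    """Predicted decision time (seconds) under Hick's law:
    T = a + b * log2(n + 1). The constants a and b are illustrative only."""
    return a + b * math.log2(n_choices + 1)

for n in (2, 4, 8, 16, 32):
    print(f"{n:2d} choices -> ~{hick_decision_time(n):.2f}s")
```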

  • Some extremely productive people, including Steve Jobs and Barack Obama, have tried to combat decision fatigue by reducing the number of everyday decisions, such as what to eat or wear, so that they can reserve their decision-making faculties for more important decisions. Barack Obama chose to wear only blue or gray suits and said of this choice, “I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing.”

  • If you want more variety in your life, one suggestion is to front-load the decisions on your outfits and meals for the week to Sunday. Making these decisions on a usually lower-stress day can free up your decision-making capacity for the workweek. Meal planning and even some meal prep on the weekend can help keep you from making unhealthy choices when you are overwhelmed later in the week.

  • “If you chase two rabbits, both will escape.”

  • The mechanical advantage gained by a lever, also known as leverage, serves as the basis of a mental model applicable across a wide variety of situations. The concept can be useful in any situation where applying force or effort in a particular area can produce outsized results, relative to similar applications of force or effort elsewhere.

  • Fifty years out, at the 5 percent discount rate, the million dollars that year is worth only $87,204 to you today ($1M/1.05^50).
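
As a quick check of that arithmetic, here is the discounting calculation in a few lines; the 5 percent rate and fifty-year horizon come straight from the example:

```python
def present_value(future_amount, annual_rate, years):
    """Discount a future cash amount back to today's dollars: PV = FV / (1 + r)^t."""
    return future_amount / (1 + annual_rate) ** years

# $1,000,000 received fifty years from now, discounted at 5% per year
print(round(present_value(1_000_000, 0.05, 50)))  # 87204
```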

  • This model applies in other contexts as well: waiting until you are sure you are making the perfect decision, until you have crafted the flawless product, and so on. The best time to call something done is much earlier than it usually happens.

  • An instance where sunk costs lead to an escalation of commitment is sometimes called the Concorde fallacy, named after the supersonic jet whose development program was plagued with prohibitive cost overruns and never came close to making a profit. Ask yourself: Is my project like the Concorde? Am I throwing good money after bad when it is better to just walk away?

  • As Benjamin Franklin wrote in The Way to Wealth, “An investment in knowledge pays the best interest.”

  • Architect Christopher Alexander introduced the concept of a design pattern, which is a reusable solution to a design problem. This idea has been adapted to other fields and is especially popular in computer science.

  • The opposite of the well-tested design pattern is the anti-pattern, a seemingly intuitive but actually ineffective “solution” to a common problem that often already has a known, better solution. Most of the mental models in this book are either design patterns or anti-patterns, and learning them can help you avoid common mistakes. Anti-patterns in this chapter include bike-shedding, present bias, and negative returns. You can avoid anti-patterns by explicitly looking for them and then seeking out established design patterns instead.

  • Common examples of black box algorithms include recommendation systems on Netflix or Amazon, matching on online dating sites, and content moderation on social media.

  • Physical tools can also be black boxes. Two sayings, “The skill is built into the tool” and “The craftsmanship is the workbench itself,” suggest that the more sophisticated tools get, the fewer skills are required to operate them. Repairing or programming them is another story, though!

  • Parallel processing is an example of a divide and conquer strategy. If you can break a problem into independent pieces and hand these pieces out to different parties to solve, you can accomplish more, faster. Think of when you delegate parts of a project to different people or departments to work on.
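
A minimal sketch of divide and conquer, here just summing a long list of numbers by splitting it into independent chunks and handing each chunk to a separate worker process (the chunk size and worker count are arbitrary choices for illustration):

```python
from concurrent.futures import ProcessPoolExecutor

def chunk_sum(chunk):
    """Conquer: solve one independent piece of the problem."""
    return sum(chunk)

def parallel_sum(numbers, workers=4):
    """Divide the data into chunks, solve each piece in parallel, then combine."""
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(list(range(1_000_001))))  # 500000500000
```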

  • This rise and fall of the dark-colored peppered moth is a showcase of natural selection, the process that drives biological evolution.

  • If you magically transported successful companies, movies, and books from fifty years ago to the present day, and released them now for the first time, most would not be successful because society has evolved so much since their heyday.

  • Reversing course once a strategy tax is established can be even more costly. In 1988, George H. W. Bush delivered this famous line at the Republican Party’s national convention: “Read my lips: no new taxes.” Later, this commitment caused significant problems for him when he faced a recession as president. Ultimately, Bush decided he had to break his pledge and raise taxes, and it cost him reelection.

  • A model related to the strategy tax is the Shirky principle, named after economics writer Clay Shirky. The Shirky principle states, Institutions will try to preserve the problem to which they are the solution. An illustrative example is TurboTax, the U.S. tax-filing software made by Intuit, which makes filing taxes easier, while Intuit also lobbies against ideas that would make it easier to file taxes directly with the government.

  • Of course, things can and do eventually become unpopular, and there is another mental model to describe the point at which something’s relevance begins to decline. This model is peak, as in peak sexism, peak Facebook. This concept was actually popularized with oil, as peak oil is usually defined as the point in time when the maximum amount of oil is being extracted from Earth. After peak oil, the decline may be a slow one, but it will have begun, with oil production falling each year instead of rising.

  • Consequently, the best organizational culture in many situations is one that is highly adaptable, just as it is recommended for people themselves to be highly adaptable. That is, you likely want to craft an organizational culture that can readily accept new strategies or processes. A culture like this is agile, willing to experiment with new ideas, not tied down to existing processes.

  • An example of a flywheel in everyday life is how it takes a lot of time and practice to become an expert on a topic, but once you are an expert it takes only minimal effort to remain on top of new developments in the field. On a shorter time scale, any personal or professional project can be viewed from the perspective of a flywheel. It is slow when you get started on the project, but once you gain some momentum, it seems easier to make progress.

  • The flywheel model tells you your efforts will have long-term benefits and will compound on top of previous efforts by yourself and others. It’s the tactical way to apply the concepts of momentum and inertia to your advantage.

  • First, from biology, there is homeostasis, which describes a situation in which an organism constantly regulates itself around a specific target, such as body temperature. When you get too cold, you shiver to warm up; when it’s too hot, you sweat to cool off. In both cases, your body is trying to revert to its normal temperature. That’s helpful, but the same effect also prevents change from the status quo when you want it to occur.

  • Potential energy is the stored energy of an object, which has the potential to be released. Center of gravity is the center point in an object or system around which its mass is balanced.

  • A common example of a forcing function is the standing meeting, such as one-on-one meetings with a manager or coach, or a regular team meeting. These are set times, built into the calendar, when you can repeatedly bring up topics that can lead to change.

  • Critical mass as a super model applies to any system in which an accumulation can reach a threshold amount that causes a major change in the system. The point at which the system starts changing dramatically, rapidly gaining momentum, is often referred to as a tipping point. For example, a party needs to reach a critical mass of people before it feels like a party, and the arrival of the final person needed for the party to reach the critical number tips the party into high gear.

  • We had a big snowstorm this year; so much for global warming. • My grandfather lived to his eighties and smoked a pack a day for his whole life, so I don’t believe that smoking causes cancer. • I have heard several news reports about children being harmed. It is so much more dangerous to be a child these days. • I got a runny nose and cough after I took the flu vaccine, and I think it was caused by the vaccine.

  • “Anecdotal thinking comes naturally, science requires training.”

  • Other examples of common proxy metrics include the body mass index (BMI), used to measure obesity, and IQ, used to measure intelligence. Proxy metrics are more prone to criticism because they are indirect measures, and all three of these examples have been criticized significantly.

  • We called this section “The Bell Curve,” however, because the normal distribution is especially useful due to one of the handiest results in all of statistics, called the central limit theorem. This theorem states that when numbers are drawn from the same distribution and then are averaged, this resulting average approximately follows a normal distribution. This is the case even if the numbers originally came from a completely different distribution.
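
A small simulation makes the theorem concrete: draw numbers from a distinctly non-normal distribution (uniform here), average them in groups, and the averages cluster into the familiar bell shape around the true mean. The group size and number of trials below are arbitrary:

```python
import random
import statistics

random.seed(0)

# Average groups of 30 draws from a uniform [0, 1) distribution.
averages = [statistics.mean(random.random() for _ in range(30))
            for _ in range(10_000)]

print(round(statistics.mean(averages), 3))   # ~0.5, the true mean
print(round(statistics.stdev(averages), 3))  # ~0.053, i.e. 0.289 / sqrt(30)
```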

  • Now that you know about Bayes’ theorem, you should also know that there are two schools of thought in statistics, based on different ways to think about probability: Frequentist and Bayesian. Most studies you hear about in the news are based on frequentist statistics, which relies on and requires many observations of an event before it can make reliable statistical determinations. Frequentists view probability as fundamentally tied to the frequency of events.
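
The highlight names Bayes’ theorem without stating it: P(A|B) = P(B|A) · P(A) / P(B). A minimal sketch of using it to update a prior with new evidence; the 1 percent base rate and test accuracy below are hypothetical numbers chosen only to illustrate the calculation:

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test), expanding P(positive) over both cases:
    P(pos) = P(pos | cond) * P(cond) + P(pos | no cond) * P(no cond)."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Hypothetical: a 1% base rate, a test that catches 90% of real cases and
# falsely flags 5% of the rest -> only ~15% of positive results are real.
print(round(bayes_posterior(prior=0.01, sensitivity=0.90,
                            false_positive_rate=0.05), 3))  # 0.154
```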

  • In statistics, a false positive is also known as a type I error and a false negative is also called a type II error.

  • Once you set your false positive rate, you then determine what sample size you need in order to detect a real result with a high enough probability. This value, called the power of the experiment, is typically selected to be an 80 to 90 percent chance of detection, with a corresponding false negative error rate of 10 to 20 percent. (This rate is also denoted by the Greek letter β, beta, which is equal to 100 minus the power.) Researchers say their study is powered at 80 percent.
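
The highlight describes the trade-off but not the calculation. A sketch using the standard normal-approximation formula for a two-group comparison of means, n per group ≈ 2 · (z_alpha + z_power)² · σ² / δ², where δ is the difference you want to detect and σ the standard deviation; the 0.5-point difference and σ = 1.0 below are hypothetical values for illustration:

```python
from statistics import NormalDist

def sample_size_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Approximate n per group needed to detect a difference in means of delta,
    given standard deviation sigma, false positive rate alpha, and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # limits type I (false positive) errors
    z_power = NormalDist().inv_cdf(power)          # limits type II (false negative) errors
    return 2 * (z_alpha + z_power) ** 2 * sigma ** 2 / delta ** 2

# Hypothetical effect: a 0.5-point difference with sigma = 1.0 needs roughly
# 63 participants per group at 5% alpha and 80% power.
print(round(sample_size_per_group(delta=0.5, sigma=1.0)))  # 63
```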

  • Statistical significance should not be confused with scientific, human, or economic significance. Even the most minuscule effects can be detected as statistically significant if the sample size is large enough. For example, with enough people in the sleep study, you could potentially detect a 1 percent difference between the two groups, but is that meaningful to any customers? No.

  • That rate is low, and this problem of failing to replicate previously positive results is aptly named the replication crisis. This final section offers some models to explain how this happens, and how you can nevertheless gain more confidence in a research area.

  • Unfortunately, studies are much, much more likely to be published if they show statistically significant results, which causes publication bias. Studies that fail to find statistically significant results are still scientifically meaningful, but both researchers and publications have a bias against them for a variety of reasons. For example, there are only so many pages in a publication, and given the choice, publications would rather publish studies with significant findings over ones with none. That’s because successful studies are more likely to attract attention from media and other researchers. Additionally, studies showing significant results are more likely to contribute to the careers of the researchers, where publication is often a requirement to advance.

  • Systems thinking describes this act, when you attempt to think about the entire system at once. By thinking about the overall system, you are more likely to understand and account for subtle interactions between components that could otherwise lead to unintended consequences from your decisions. For example, when thinking about making an investment, you might start to appreciate how seemingly unrelated parts of the economy might affect its outcome.

  • In biology, the T cells that help power your immune system, once activated, thereafter require a lower threshold to reactivate. Hysteresis describes how both the metal and the T cells partially remember their states, such that what happened previously can impact what will happen next. Again, this may already seem like a familiar concept, because it is similar to the mental model of path dependence (see Chapter 2), which more generally describes how choices have consequences in terms of limiting what you can do in the future. Hysteresis is one type of path dependence, as applied to systems.

  • Crowdsourcing has been effective across a wide array of situations, from soliciting tips in journalism, to garnering contributions to Wikipedia, to solving the real-world problems of companies and governments. For example, Netflix held a contest in 2009 in which crowdsourced researchers beat Netflix’s own recommendation algorithms.

  • Diversity of opinion: Crowdsourcing works well when it draws on different people’s private information based on their individual knowledge and experiences. • Independence: People need to be able to express their opinions without influence from others, avoiding groupthink. • Aggregation: The entity doing the crowdsourcing needs to be able to combine the diverse opinions in such a way as to arrive at a collective decision.

  • If more people are making yes predictions than no predictions, then the price of the stock rises, and vice versa. By looking at the current prices in the prediction market, you can get a sense of what the market thinks will happen, based on how people are betting (buying shares). Many big companies operate similar prediction markets internally, where employees can predict the outcome of things like sales forecasts and marketing campaigns.

  • Getting into an arms race is not beneficial to anyone involved. There is usually no clear end to the race, as all sides continually eat up resources that could be spent more usefully elsewhere. Think about how much better it would be if the money spent on making campuses luxurious was instead invested in better teaching and other areas that directly impact the quality and accessibility of a college education.

  • Casinos give away a lot of free stuff (reciprocity); they get you to first buy chips with cash (commitment); they try to personalize your experience to your interests (liking); they show you examples of other people who won big (social proof); they constantly present you with offers preying on your fear of missing out (scarcity); and dealers will even give you suboptimal advice (authority). Beware. There is a reason why the house always wins!

  • Fear is a particularly strong influencer, and it has its own named model associated with it, FUD, which stands for fear, uncertainty, and doubt. FUD is commonly used in marketing (“Our competitor’s product is dangerous”), political speeches (“We could suffer dire consequences if this law is passed”), religion (eternal damnation), etc.

  • A Trojan horse can refer to anything that persuades you to lower your defenses by seeming harmless or even attractive, like a gift. It often takes the form of a bait and switch, such as a malicious computer program that poses as an innocuous and enticing download (the bait), but instead does something nefarious, like spying on you (the switch).

  • The Enron and Theranos tactics both exemplify another dark pattern, called a Potemkin village, which is something specifically built to convince people that a situation is better than it actually is. The term is derived from a historically questionable tale of a portable village built to impress Empress Catherine II on her 1787 visit to Crimea.

  • One adage to keep in mind when you find yourself in a guerrilla warfare situation is that generals always fight the last war, meaning that armies by default use strategies, tactics, and technology that worked for them in the past, or in their last war. The problem is that what was most useful for the last war may not be best for the next one, as the British experienced during the American Revolution.

  • IBM famously miscalculated the rise of the personal computer relative to its mainframe business, actually outsourcing its PC operating system to Microsoft. This act was pivotal for Microsoft, propelling it to capture a significant part of the profits of the entire industry for the next thirty years. Microsoft, in turn, became so focused on its Windows operating system that it didn’t adapt it quickly enough to the next wave of operating system needs on the smartphone, ceding most of the profits in the smartphone market to Apple, which is now the most profitable company in history.

  • Instagram had only thirteen employees when Facebook bought it for one billion dollars in 2012; a few years later Facebook bought WhatsApp, with fifty-five employees, for a whopping nineteen billion dollars.

  • Organizations are always on the lookout for 10x individuals because they can be the ingredients of a true dream team. Keeping Joy’s law in mind, however, reminds you that just seeking out 10x people is a trap for two reasons. First, they are extremely rare; not every organization can hire world-class talent, because there just isn’t enough to go around.

  • Whether invading countries or markets, the first wave of troops to see battle are the commandos …. A startup’s biggest advantage is speed, and speed is what commandos live for. They work hard, fast, and cheap, though often with a low level of professionalism, which is okay, too, because professionalism is expensive. Their job is to do lots of damage with surprise and teamwork, establishing a beachhead before the enemy is even aware that they exist …. Grouping offshore as the commandos do their work is the second wave of soldiers, the infantry. These are the people who hit the beach en masse and slog out the early victory, building on the start given them by the commandos …. Because there are so many more of these soldiers and their duties are so varied, they require an infrastructure of rules and procedures for getting things done—all the stuff that commandos hate …. What happens then is that the commandos and the infantry head off in the direction of Berlin or Baghdad, advancing into new territories, performing their same jobs again and again, though each time in a slightly different way. But there is still a need for a military presence in the territory they leave behind, which they have liberated. These third-wave troops hate change. They aren’t troops at all but police. They want to fuel growth not by planning more invasions and landing on more beaches but by adding people and building economies and empires of scale.

  • If you put a commando person in a police role (e.g., project manager, compliance officer, etc.), they will generally rebel and make a mess of everything, whereas if you put a police person in a commando role (e.g., a position involving rapid prototyping, creative deliverables, etc.), they will generally freeze up and stall out.

  • Another mental model that helps you consider people’s strengths is foxes versus hedgehogs, derived from a lyric by the Greek poet Archilochus, translated as The fox knows many things, but the hedgehog knows one big thing. Philosopher Isaiah Berlin applied the metaphor to categorize people based on how they approach the world: hedgehogs, who like to frame things simply around grand visions or philosophies; and foxes, who thrive on complexity and nuance. Hedgehogs are big picture; foxes appreciate the details.

  • In addition, higher roles tend to involve more strategy than tactics. Generally, strategy is the big picture; tactics are the details. Strategy is the long term, defining what ultimate success looks like. Tactics are short term, defining what we’re going to do next to get there. The Peter principle factors in because promoting someone who is great tactically into a strategic role can be problematic if that person is not strong strategically.

  • Apple is known for popularizing a mental model called directly responsible individual, or DRI for short. After every meeting, it is made clear that there is one DRI who is responsible and accountable for the success of each action item. DuckDuckGo similarly assigns a DRI to every company activity—from the smallest task to the largest company objective.

  • The Pygmalion effect is a model that states that higher expectations lead to increased performance, as people try to meet the expectations set for them. (It’s named after the Greek myth of Pygmalion, a sculptor who crafted his ideal spouse, whom Aphrodite then gave life to as Galatea.)

  • In any group setting, it is important to understand the culture, including whether it is one that prefers high-context or low-context communication. A low-context culture is explicit and direct with information, preferring that you be real and tell it like it is. You need a low amount of context to understand low-context communication, because most everything you need to know is clearly expressed.

  • The manager’s schedule is for bosses. It’s embodied in the traditional appointment book, with each day cut into one-hour intervals. You can block off several hours for a single task if you need to, but by default you change what you’re doing every hour …. But there’s another way of using time that’s common among people who make things, such as programmers and writers. They generally prefer to use time in units of half a day at least. You can’t write or program well in units of an hour. That’s barely enough time to get started. When you’re operating on the maker’s schedule, meetings are a disaster. A single meeting can blow a whole afternoon, by breaking it into two pieces each too small to do anything hard in. Plus you have to remember to go to the meeting. That’s no problem for someone on the manager’s schedule. There’s always something coming on the next hour; the only question is what. But when someone on the maker’s schedule has a meeting, they have to think about it.

  • Another model to keep in mind that can quickly erode culture and morale is the mythical man-month. It comes from computer scientist Fred Brooks, who originally presented it in a book with the same name. Man-month, or person-month, is a unit of measurement for how long projects take (e.g., this project will take ten person-months). Brooks declares that this entire way of measurement is flawed, based on a myth that you can simply add more people (person-months) to a project and get it done faster.
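
Part of Brooks’s reasoning (summarized here, not quoted in the highlight) is that adding people adds coordination overhead: the number of pairwise communication channels grows as n(n − 1)/2, so a team of ten has to sustain far more than twice the conversations of a team of five:

```python
def communication_channels(team_size):
    """Pairwise communication channels on a team of n people: n * (n - 1) / 2."""
    return team_size * (team_size - 1) // 2

for n in (3, 5, 10, 20):
    print(f"{n:2d} people -> {communication_channels(n):3d} channels")
```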

  • Price differences like these tend to not last very long, because others notice and pursue the same discrepancies until they no longer exist. It can certainly be profitable to take advantage of these short-term opportunities, but you need to keep finding new ones to continue turning a profit.

  • For a first mover, the difference between success and failure hinges on whether they can also be first to achieve product/market fit. That’s when a product is such a great fit for its market that customers are actively demanding more. This model was also developed by Andy Rachleff, who explained in “Demystifying Venture Capital Economics, Part 3,” “First to market seldom matters. Rather, first to product/market fit is almost always the long-term winner …. Once a company has achieved product/market fit, it is extremely difficult to dislodge it, even with a better or less expensive product.”

  • Way back in Chapter 1 we explained how you want to de-risk an idea by testing your assumptions as cheaply as possible. Customer development is one way to do that, by talking directly to customers or potential customers. As Blank says, “There are no facts inside the building so get the hell outside!” If you can ask the right questions, you can find out whether you have something people really want, signaling product/market fit.

  • Knowing the real job your product does helps you align both product development and marketing around that job. Apple does this exceptionally well. For instance, it introduced the iPod in 2001 amid a slew of MP3-player competitors but chose not to copy any of their marketing lingo, which was focused on technical jargon like gigabytes and codecs. Instead, Steve Jobs famously framed the iPod as “1,000 songs in your pocket,” recognizing that the real job the product was solving was letting you carry your music collection with you.

  • Another clarifying model is what type of customer are you hunting? This model was created by venture capitalist Christoph Janz, in a November 4, 2016, post on his Angel VC blog, to illustrate that you can build large businesses by hunting different size customers, from the really small (flies) to the really big (elephants). Janz notes that to get to $100 million in revenue, a business would need 10 million “flies” paying $10 per year, or 1,000 “elephants” paying $100,000 per year. Believe it or not, there are successful $100 million revenue businesses across the entire spectrum, from those seeking “amoebas” (at $1 per year) to those seeking “whales” (at $10 million per year).

  • If you have no bright spots after some time, it is likely you do need to pivot. It’s like the old phrase “You don’t have to go home, but you can’t stay here.” If you do have some bright spots, you can try to figure out why things are working there and focus on growing out from that base. This is actually a useful strategy for advancing any idea, struggling or otherwise, drawing on the military concept of the beachhead. That’s where a military offense takes and defends a beach so that more of their force can move through the beachhead onto the greater landmass.

  • Once you achieve product/market fit or whatever type of fit you are trying to achieve, it is time to protect your position. Warren Buffett popularized the term moat, making an analogy to the deep ditch of water surrounding a castle to describe how to shield yourself from the competition, thereby creating a sustainable competitive advantage.

  • Protected intellectual property (copyright, patents, trade secrets, etc.) • Specialized skills or business processes that take a long time to develop (for example, Apple’s vertically integrated products and supply chain, which meld design, hardware, and software) • Exclusive access to relationships, data, or cheap materials • A strong, trusted brand built over many years, which customers turn to reflexively • Substantial control of a distribution channel • A team of people uniquely qualified to solve a particular problem • Network effects or other types of flywheels (as described in Chapter 4) • A higher pace of innovation (e.g., a faster OODA loop)

  • These same moat types can apply to your personal place in an organization or field as well. For example: You can have the biggest personal network (exclusive access to relationships). You can build a personal following (strong, trusted brand). You can become the expert in an in-demand area (unique qualifications). You can create a popular blog (substantial control of a distribution channel). Each of these and more can create a moat that protects your place in a competitive landscape.

  • When disruptive technologies like this first emerge, they are usually inferior to the current technologies in the ways that most buyers care about. For decades, digital photography was comparatively expensive and produced lower-quality photographs than film; however, its convenience (in not having to develop pictures) appealed to some buyers and allowed the market to progress. Slowly but surely, the price and performance gap between digital and film closed. Once it crossed the tipping point (see Chapter 4) of being attractive to most consumers, the digital camera market exploded.

  • Kodak wasn’t blind to these developments either. Initially it was even the market leader in digital cameras too, with a 27 percent share in 1999. However, it didn’t invest heavily enough in the technology relative to its competitors, the way Intel did when pivoting to microprocessors. Kodak simply wasn’t paranoid enough.

  • Business strategist Geoffrey Moore named this jump crossing the chasm in a book by the same name. The chasm here refers to the fact that many ideas, companies, and technologies fail to make it from one side to the other. That’s because there is a huge gulf in expectations between early adopters and the early majority, which most things fail to meet. Early adopters like to tinker with things or are a small subset of people who really need something, but to cross the chasm into the early majority, a product has to solve a real ongoing need for a lot more people. And most products just aren’t compelling enough to cross this gulf and truly spread into the mainstream.

  • A related mental model is the cargo cult, as explained by Feynman in his 1974 Caltech commencement speech: In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they’ve arranged to imitate things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas—he’s the controller—and they wait for the airplanes to land. They’re doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn’t work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they’re missing something essential, because the planes don’t land.

  • Second, try writing. Even if you never publish anything, the act of writing clarifies your thinking and makes you aware of holes in your arguments. You can combine writing and finding a partner by participating in an online forum or blog where the complex topics that interest you are discussed and analyzed.

  • Over time, your efforts will expand what Warren Buffett calls your circle of competence. The inside of the circle covers areas where you have knowledge or experience—where you are competent—and in those areas, you can think effectively. In areas outside the circle, you cannot. The most dangerous zone is just outside your circle of competence, where you might think you are competent but you really are not. Buffett wrote in a 1999 shareholder letter: If we have a strength, it is in recognizing when we are operating well within our circle of competence and when we are approaching the perimeter ….

  • In my whole life, I have known no wise people (over a broad subject area) who didn’t read all the time—none, zero. You’d be amazed at how much Warren [Buffett] reads—and how much I read. My children laugh at me. They think I’m a book with a couple of legs sticking out. Since the really big ideas carry 95% of the freight, it wasn’t at all hard for me to pick up all the big ideas in all the disciplines and make them a standard part of my mental routines. Once you have the ideas, of course, they’re no good if you don’t practice. If you don’t practice, you lose it. So I went through life constantly practicing this multidisciplinary approach. Well, I can’t tell you what that’s done for me. It’s made life more fun. It’s made me more constructive. It’s made me more helpful to others. It’s made me enormously rich. You name it. That attitude really helps.