The System 1 vs. System 2 Debate: A False Dilemma
Why decision making isn’t as simple as intuition versus analysis.
A previous version of this post was published on Psychology Today.
Much of the scientific and popular literature on decision making is framed around a dual-process model [1]. This model contrasts the relatively automatic, intuition-driven, and unconscious System 1 with the more effortful, deliberate, and conscious System 2. According to this perspective, fast, intuitive thinking is prone to errors, while slow, analytical thinking is portrayed as the corrective mechanism. Though intuitively appealing, this dichotomy oversimplifies the complexities of decision making and is itself flawed [2].
Melnikoff and Bargh (2018) highlighted several fundamental issues with what they called “the mythical number 2.” They noted that many of the attributes ascribed to System 1 and System 2 fail to align with empirical evidence and argued that dual-process theories are often unfalsifiable. Furthermore, much of the support for these theories reflects “confirmation bias at work” (p. 283). Contrary to the dichotomy’s assumptions, biases, motivated reasoning, and fallacious reasoning influence all decision making—whether intuitive or analytical, unconscious or conscious.
Couchman et al. (2016) [3] provided compelling evidence that further challenges the dual-process perspective. They examined the performance of college students on multiple-choice exams, focusing on the relationship between students’ confidence in their initial responses and the accuracy of those responses [4]. The results revealed that when students had confidence in their initial responses—with confidence being rooted in their expertise—their answers were correct over 85% of the time. By contrast, when students lacked confidence, their accuracy dropped to just over 50% [5]. The takeaway from Couchman et al.’s study is that intuitive responses aren’t inherently flawed—provided they’re grounded in confidence derived from relevant expertise and experience [6].
Findings such as these call into question the claims made by dual-process advocates. The results indicate that intuitive decision making, far from being error-prone, can be highly accurate when supported by appropriate experience and expertise. This challenges the dichotomy’s oversimplified view that intuitive thinking (System 1) is fundamentally unreliable compared to deliberate, analytical thinking (System 2). It essentially creates a false dilemma.
The False Dilemma
Studies grounded in the dual-process model oversimplify the complexity of decision making by pitting System 1 and System 2 against each other. This framing implies that decisions must be attributed to either fast, intuitive thinking or slow, deliberate analysis [7]. Such studies often rely on the (in)correctness of the responses to infer which system was involved, attributing errors to System 1 and correct answers to System 2. This approach assumes a simplistic one-to-one correspondence between correctness and the system used, without examining the actual decision-making process. Consider the following:
A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?
According to dual-process researchers, if your answer to the above was 10 cents, it was because you gave the intuitive response attributed to System 1 (because that answer is incorrect [8]). If your response was 5 cents (the correct answer), then you supposedly engaged your conscious, effortful decision-making capabilities and overrode your initial System 1 response. Thankfully, System 2 swooped in to save the day.
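For readers who want to see why 5 cents is correct, the arithmetic works out as follows: the ball plus a bat costing $1.00 more means ball + (ball + $1.00) = $1.10, so twice the ball's price is $0.10 and the ball costs $0.05. This worked check is an editorial addition, not part of the original problem; a minimal Python sketch:

```python
# Bat-and-ball check: together they cost $1.10, and the bat
# costs $1.00 more than the ball. Work in cents to avoid
# floating-point issues.
TOTAL = 110   # $1.10 in cents
DIFF = 100    # the bat costs 100 cents more than the ball

# TOTAL = ball + (ball + DIFF)  =>  ball = (TOTAL - DIFF) / 2
ball = (TOTAL - DIFF) // 2
bat = ball + DIFF

print(ball, bat)            # 5 105
print(ball + bat == TOTAL)  # True

# The "intuitive" answer of 10 cents fails the constraint:
wrong_ball = 10
print(wrong_ball + (wrong_ball + DIFF))  # 120, not 110
```

Note that the popular answer of 10 cents satisfies only half the problem: it makes the bat $1.10, pushing the total to $1.20.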
But how do we know that if you said 10 cents, it was because you relied on an intuitive response? Couldn’t you have consciously thought about it and concluded that 10 cents was correct? If so, wouldn’t that mean System 2 produced the incorrect answer? Conversely, isn’t it possible that you intuitively knew the correct answer was 5 cents, meaning System 1 was responsible for the success rather than the error?
The only response dual-process theories can offer to these questions essentially amounts to: we just know. If you got the correct answer, it must have been the result of System 2. If you got it wrong, it must have been the result of System 1. Such reasoning is problematic because there’s no evidence—other than differences in correctness—to use as the basis for these inferences.
Thus, the most popular model of decision making in contemporary theory and practice can be summarized as follows:
A heuristic that (over)simplifies a complex process into two competing systems;
A false dichotomy because there’s no evidence that decisions are solely the result of one or the other system;
Grounded in studies that attribute errors to one system and correct answers to the other, thereby reinforcing the existence of these two competing systems.
No Clear Distinction Between Intuitive and Effortful Decision-Making
The dual-process perspective has serious flaws, suggesting a need for a better way to conceptualize how we make decisions. Gigerenzer and Gaissmaier (2011) offer an alternative by defining heuristics as “a strategy that ignores part of the information, with the goal of making decisions more quickly, frugally, and/or accurately than more complex methods." Because people have limits on how much information they can process—known as bounded rationality—decision making inherently involves ignoring some information. Gigerenzer and Gaissmaier concluded that “there is no strict dichotomy between heuristic and nonheuristic, as strategies can ignore more or less information."
Rather than decision accuracy being attributable to one system or the other, it generally depends on the specific strategy used (e.g., pattern recognition, recall, weighting different information) and the amount of information considered (i.e., what is ignored versus what is considered). The complexity of the strategy and the information considered determine the time and cognitive effort required. Effective decision making occurs when we apply a strategy and process enough information to reach a decision that is sufficiently accurate for the situation [9].
Almost every decision produces one or more quick, cognitively frugal intuitive responses. These responses are influenced by both stored knowledge from past experiences and novel information available in the present situation (Pennycook et al., 2015). They rely on salient pieces of information as the basis for initiating possible decisions [10]. If a response (let’s call it Intuitive Response 1 or IR1) emerges quickly or inspires sufficient confidence, we may adopt it. If not, we expand our search for additional information before finalizing the decision. This expanded search may either reinforce confidence in IR1 or lead to an alternative response (e.g., IR2, IR3, or a previously unconsidered option).
To put this into context, let’s consider the following situation. One day, Tom notices his lawn is looking a little bare and decides to buy some grass seed. Though he’s not all that familiar with the various types of seed available, he decides he needs to buy something to put down [11]. At the lawn and garden store, he notices bags of fescue, which is a type of grass seed he’s vaguely familiar with. This recognition triggers an intuitive response to purchase fescue. However, Tom isn’t confident it’s the right choice, so he expands his search. After comparing options, he learns that fescue requires above-average watering, which he wants to avoid. He notices that Kentucky Bluegrass requires only average watering and seems like a better fit. Using the take-the-best heuristic [12], Tom decides to purchase the Kentucky Bluegrass instead [13].
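Tom's decision can be sketched as a take-the-best search: check one cue at a time, in order of importance, and stop at the first cue that separates the options. The sketch below is an editorial illustration; the seed names and attribute values are placeholders, not horticultural facts.

```python
# Toy sketch of the take-the-best heuristic. Cues are checked in
# order of importance; the first cue that discriminates between
# options decides, and all remaining information is ignored.
# (Attribute values are illustrative, not real seed data.)

seeds = {
    "fescue":             {"watering": "above average", "growth": "fast"},
    "kentucky bluegrass": {"watering": "average",       "growth": "fast"},
}

# Tom's most important cue: a lower watering need wins.
WATER_RANK = {"below average": 0, "average": 1, "above average": 2}

def take_the_best(options, cue_rankings):
    """Return the option that wins on the first discriminating cue."""
    winners = list(options)
    for cue, rank in cue_rankings:
        scores = {name: rank[attrs[cue]] for name, attrs in options.items()}
        best = min(scores.values())
        winners = [name for name, s in scores.items() if s == best]
        if len(winners) == 1:   # this cue discriminates -> stop searching
            return winners[0]
    return winners[0]           # ties fall through to an arbitrary pick

choice = take_the_best(seeds, [("watering", WATER_RANK)])
print(choice)  # kentucky bluegrass
```

The point of the sketch is how little information the strategy uses: growth rate is present in the data but never consulted, because the watering cue already discriminates.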
In Tom’s case, only fescue initially came to mind, resulting in one intuitive response. However, if he had also recognized zoysia grass, he would have started with two competing intuitive responses. Given that Tom prioritized watering needs as his primary heuristic criterion, zoysia would have been a better choice because it requires less water than fescue. However, zoysia grows slowly, and if that factor became salient to Tom, he might have expanded his heuristic criteria to include average watering and quick growth. At this stage, Tom would have moved beyond his initial intuitive responses and applied a simplified fast-and-frugal tree heuristic that focused on the updated criteria—again leading him to select Kentucky Bluegrass.
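The expanded, two-criterion version resembles a small fast-and-frugal tree: each node asks one question and can exit immediately with a rejection. Again, this is a toy editorial sketch with made-up attribute values, not real seed data.

```python
# Toy fast-and-frugal tree for Tom's updated criteria.
# Each node asks a single question and can exit immediately.
# (Attribute values are illustrative assumptions.)

seeds = {
    "fescue":             {"watering": "above average", "growth": "fast"},
    "zoysia":             {"watering": "below average", "growth": "slow"},
    "kentucky bluegrass": {"watering": "average",       "growth": "fast"},
}

def acceptable(attrs):
    # Node 1: needs more than average watering? -> reject, stop here
    if attrs["watering"] == "above average":
        return False
    # Node 2: grows slowly? -> reject, stop here
    if attrs["growth"] == "slow":
        return False
    # Passed both nodes -> accept
    return True

survivors = [name for name, attrs in seeds.items() if acceptable(attrs)]
print(survivors)  # ['kentucky bluegrass']
```

Under these assumed values, fescue exits at the first node, zoysia at the second, and only Kentucky Bluegrass survives both questions, matching the outcome described above.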
Dissecting Tom’s decision using the dual-process perspective illustrates its limitations. It might claim that System 1 generated Tom’s initial response, while System 2 stepped in to correct it. But did Tom truly engage in effortful, conscious deliberation? All he did was adjust his search criteria based on relevant factors. Even if some conscious reasoning occurred, it was far from the exhaustive, deliberative process typically associated with System 2. Instead, Tom used heuristics to identify relevant enough criteria and make a decision that met those criteria.
Scenarios like Tom’s expose the flaws in the dual-process perspective, which assumes a rigid, mechanized approach to decision making. In reality, decisions often begin with one or more intuitive responses. If we feel confident enough, we act on them. If not, we expand our information search until we reach an acceptable level of confidence. This process can happen almost instantaneously or take considerable time, unfold incrementally, and involve varying levels of conscious and unconscious thought.
The Implications
So, what does this all mean? Most importantly, it means that we should stop assuming that intuitive decision making is inherently error-prone. As Gary Klein’s work in Naturalistic Decision Making (NDM) has repeatedly demonstrated, expertise and experience can yield quite valid intuitive responses. While terms like “gut instincts” are often used to describe intuition, they misleadingly suggest that intuitive responses lack analytical depth.
In reality, as Prietula and Simon (1989) argued, “intuition grows out of experiences that once called for analytical steps” (p. 122). Similarly, Simon (1993) concluded there is “absolutely no evidence” that “there is analytic thinking and intuitive thinking” (p. 405). These perspectives challenge the notion that intuition and analysis are fundamentally distinct, instead framing intuition as a skill refined through experience and practice. Evidence based on the Recognition-Primed Decision (RPD) model further supports this view, demonstrating that intuitive responses—and how they evolve with additional information—are often the product of sophisticated cognitive processes.
The key to effective intuitive decision making lies in learning to better calibrate confidence in these responses (i.e., developing refined meta-thinking skills) and being willing to expand search strategies when confidence is low or novel information emerges. Similarly, we should stop assuming that conscious, effortful decision making is inherently superior to intuitive decision making. There are contexts where engaging in a highly planful, deliberative decision-making process is appropriate. But the success of these processes still depends on the effectiveness of the heuristics we employ. Even when we aim to be more deliberative, slow, or cognitively intensive, heuristics play an extensive and often unacknowledged role in shaping our decisions. Recognizing this can help us make better decisions, regardless of whether they rely on intuition, conscious analytical processing, or a combination of both.
Footnotes

1. The intimation of a dual-process perspective goes all the way back at least to William James (1890), who postulated two different types of thinking: associative thinking and true reasoning. Wason and Evans (1974) later proposed the existence of separate intuitive and analytic processes. Kahneman later popularized the System 1/System 2 conceptualization, culminating in his book, Thinking, Fast and Slow (2011).

2. The irony of all this is that the dual-process approach is itself a heuristic, which many dual-process theorists primarily associate with intuition, which then implies it is fraught with error.

3. If you don’t have access to the full article, a summary of some of the results can be found as part of a Conversation article from 2015, written by the lead author of the study.

4. While this study employs a task like many traditional dual-process tasks (i.e., multiple-choice questions with a defined correct answer), the fact that participants could be expected to have some knowledge of the material in question differentiates it from more traditional laboratory-based decision-making tasks, such as the one mentioned later. The lack of attention to decision makers’ expertise, which is one part of their frame of reference, is pervasive in decision-making research.

5. This is still higher than chance, given that the study involved multiple-choice questions. But it would not have been high enough for the students to earn a passing grade if all their answers were only this accurate.

6. This is an issue I discussed when focusing on the expertise bubble.

7. They do not generally leave open the possibility that a decision could be based on both.

8. System 1 would also be the alleged culprit if you gave a different erroneous response.

9. Also known as ecological rationality.

10. How much information we rely on will vary. Some aspects of the situation are likely to catch our attention, while others may be less salient. The information that catches our attention is what influences those heuristic responses.

11. At this point, I will tell you I know next to nothing about grass seed. I just needed a good example (don’t ask me how this example came to mind because I have no idea). For anyone who does know grass seed and would say my descriptions are wrong, I will refer you to Scotts, which is where my information came from.

12. Note that what is considered “best” was a value judgment on Tom’s part.

13. Alternatively, he might check out a few other bags and notice that none of them require less watering than Kentucky Bluegrass before deciding to stick with that choice.