Article

On the Irrationality of Being in Two Minds

School of Information Systems, Queensland University of Technology, Brisbane 4000, Australia
*
Author to whom correspondence should be addressed.
Entropy 2020, 22(2), 174; https://doi.org/10.3390/e22020174
Submission received: 14 December 2019 / Revised: 28 January 2020 / Accepted: 31 January 2020 / Published: 4 February 2020
(This article belongs to the Special Issue Quantum Information Revolution: Impact to Foundations)

Abstract

This article presents a general framework that allows irrational decision making to be theoretically investigated and simulated. Rationality in human decision making under uncertainty is normatively prescribed by the axioms of probability theory in order to maximize utility. However, substantial literature from psychology and cognitive science shows that human decisions regularly deviate from these axioms. Bistable probabilities are proposed as a principled and straightforward means for modeling (ir)rational decision making, which occurs when a decision maker is in “two minds”. We show that bistable probabilities can be formalized by positive-operator-valued projections in quantum mechanics. We found that (1) irrational decision making necessarily involves a wider spectrum of causal relationships than rational decision making, (2) the accessible information turns out to be greater in irrational decision making when compared to rational decision making, and (3) irrational decision making is quantum-like because it violates the Bell–Wigner polytope.


1. Introduction

Understanding how humans form decisions is of great interest in a range of areas, and modeling decision making is important to psychology, social science, politics, economics, computer science, and cognitive science. Current models of human decision making rely on Bayesian probabilities, so much so that the term “Bayesian Cognition” has become mainstream [1,2]. Bayesian models account for rational decision making where “rationality” is defined by the laws of probability theory. However, decades of research have found that a whole range of human judgement deviates substantially from what would be normatively correct according to logic or probability theory.
Theories proposed to account for the deviation of human decision makers from rationality include bounded rationality [3], dual process theories [4], models in quantum cognition [5,6,7,8,9], and a growing list of cognitive biases and heuristics [10]. Dual process theories are prominent and widespread within many fields of psychological science [11]. Whilst there are multiple variations of these theories, all have in common the thesis that two processes are employed in human decision making: One fast, intuitive system sometimes prone to error (System 1), and a second slower, more controlled process based on rational thought (System 2). The extent to which either or both of these are employed is dependent on the relative cognitive resources and time that each typically consumes.
System 1 is fast and requires few cognitive resources and little effort [4], so is often considered the default system [12]. System 1 can be termed “irrational” because the intuition, biases, and heuristics that typify this system often do not adhere to the laws of logic or probability theory. In contrast, System 2 involves controlled analytic thought and is considered to be primarily logical and rational. It requires conscious activation and is a significant drain on cognitive resources [4].
Many of the heuristics employed by System 1 can be considered in terms of attribute substitution [10]. Put simply, humans tend to substitute a difficult problem for a more plausible, easier alternative. An example can be seen in the famous illustration of the conjunction fallacy, where participants overestimate the likelihood of Linda to be both a bank teller and a feminist, compared to her being a bank teller [13]. Rather than employing System 2 to conduct a rational estimation of probabilities, participants instead substitute the easier representativeness bias to conclude that Linda must be a feminist and that, therefore, the conjunction is the most likely option.
It is easy to see that the assumed rationality of Bayesian Cognition aligns more closely with System 2. The picture from the literature seems to suggest that human decision makers are often irrational because they employ System 1, even when employing System 2 would result in a rational outcome.
Although rational models of human decision making have become prominent and have achieved much success, there has been an emergence of models based on an alternative probabilistic framework drawn from quantum theory [6,7,8,9]. These quantum models show promise in addressing decision making that would normally be considered irrational [14,15]. This article continues this line of research by proposing a model for irrational decision making based on the notion of a bistable probability. The term “bistable” aims to capture the intuition of two sometimes competing systems involved in decision making and consequently the decision maker’s being caught between two “minds”.
The view taken in this paper is that decisions based on intuition, i.e., made by System 1, can sometimes result in a different outcome than judgements based on rational probability, i.e., made by System 2. When this happens, we deem the decision making to be “irrational”, as it deviates from rational judgement. However, the theory presented in the following sections is agnostic to how Systems 1 and 2 interact. It is the deviation from rational judgement that is the core issue.
We will show that bistable probabilities allow (ir)rational decision making to be systematically investigated. Irrationality raises questions such as the following: (1) How does irrationality affect the probabilistic judgement of causality? (2) Does irrationality affect the amount of information available for decision making? (3) How do irrational decisions relate to rational probabilistic judgements? These questions will be addressed in the following sections.

2. Bistable Parameters and Bistable Projection Operators

In order to model the deviation of irrational decision making from rational decision making, we propose a deformed probability, introducing what we term a “bistable parameter”. The purpose of this parameter is to capture the disagreement between System 1 and System 2. In doing so, the degree of substitution of a simple heuristic-based inference in place of a knowledge-based, classically Bayesian, inference [16] is also accounted for.
Similarly to the noise model proposed by [17,18], bistable probabilities are founded on an event space featuring two probabilities associated with a given decision outcome. The two probabilities relate to System 1 and System 2. For example, consider the scenario depicted in Figure 1, where System 2 is assumed to mediate the decisions of System 1. There are two decision outcomes A and B. System 1 opts for outcome A with probability k. System 2 may intervene and alter the response of System 1 with probability $1-p$, resulting in choice B, or agree with System 1 and remain with choice A with probability p. The final probability for choosing option A is a function of both k and p: $P_k(A) = 1 - p - k + 2kp$. Note that when $k = 1$, System 2 fully determines the final judgement, which is hence considered “rational”. When $0 \le k < 1$, the probability of the final judgement is deformed by a degree of irrationality. When $k = 0$, the final outcome is considered “irrational” because it is the converse of the rational judgement, i.e., $P_{k=1}(A) = p$ vs. $P_{k=0}(A) = 1 - p$. When $k = 0.5$, $P_k(A) = 0.5$ irrespective of System 2’s mediation. This reflects the situation when the decision maker is caught between two minds with no means of resolving the conflict between the two, so the ultimate decision is random. Finally, System 2 “agrees” with System 1 when $P_k(A) = k$. This occurs when $p = 1$, which implies that no mediation is being employed by System 2.
Relating this to an example commonly used in the literature [13], we consider the following question: Is a person, Linda, more likely to be (a) a bank teller, or (b) a bank teller and active in the feminist movement? Without any additional information about Linda (for now, we will consider the question in the absence of the usually accompanying passage of background text), one might assume that the option that is logically more likely is option (a), and one might assign a higher probability to this option. The reason that this is the logical and rational choice is tied to the rule that the probability of a conjunction of two propositions cannot be higher than the probability of either of the individual propositions, $P(A \wedge B) \le P(A)$. One’s intuitive response to this question might be to choose the simpler option, (a), illustrated as a high value for k, for example, 0.8. The parameter p determines the degree to which System 2 allows the intuition of System 1 to determine the final decision. Because the option deemed more probable by System 1 would also be favored by a rationally driven System 2, we assign a high value for p, for example, 0.9. Taking these values of k and p and following our model, we find a probability of 0.74 for the final decision of option (a), that Linda is a bank teller, to be more likely.
Let us now consider the more complete example of the Linda problem, where a passage of text is presented prior to the above question. This text provides additional background information about Linda that tends to accord with a representation of someone who is active in the feminist movement. The question, coupled with the presentation of the background information, provides a classic demonstration of the conjunction fallacy, where the conjunction of two propositions (option b) is judged to be more probable than one of the propositions (option a). This has been attributed to a judgement heuristic labelled representativeness [13], to which System 1 is prone [10]. In our model, we reflect the likely choice of a System-1-driven judgement by assigning a low value for k, for example, 0.3 . To determine a value of p, we can assign a low value based on the premise that the rational thought process employed by System 2 is unlikely to allow the erroneous intuition of System 1 to determine the final decision, and we might assign a value of 0.2 , for example. Using these values for k and p in our model, we find a probability of 0.62 for the final decision of option (a), that Linda is a bank teller, to be more likely.
However, because System 2 is constrained in terms of time and cognitive resources [4], the probability that System 2 will differ from or mediate the intuition of System 1 depends on a range of factors, including time constraints, motivation, and the availability of cognitive resources. If one or a combination of these factors is present in such a way that it would be reasonable to assume that System 2 is less likely to intervene, we can instead choose a higher value of p, for example, 0.7. Using the same value for k, 0.3, we find that it is now option (b), that Linda is a bank teller and active in the feminist movement, that has the higher probability in the final decision (0.58), which is illustrative of the conjunction fallacy.
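The three Linda scenarios can be recomputed in a few lines (an illustrative sketch, not the authors' code; the function name is ours):

```python
# k: probability that System 1 favors option (a); p: probability that
# System 2 lets System 1's choice stand rather than overriding it.

def p_final(k: float, p: float) -> float:
    """Final probability of option (a): P_k(a) = 1 - p - k + 2*k*p."""
    return 1 - p - k + 2 * k * p

# Scenario 1: no background text; both systems favor (a).
print(round(p_final(0.8, 0.9), 2))       # 0.74

# Scenario 2: background text biases System 1 toward (b); System 2 overrides.
print(round(p_final(0.3, 0.2), 2))       # 0.62

# Scenario 3: System 2 rarely intervenes; (b) now wins: P(b) = 1 - P(a).
print(round(1 - p_final(0.3, 0.7), 2))   # 0.58
```

Note that a single function covers all three cases; only the parameter values change between scenarios.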
It should be noted that, although the preceding example frames the model in terms of a process (i.e., System 1 providing an intuition which System 2 may or may not override), this model is not a dynamic one. It is agnostic to the order of Systems 1 and 2 and incorporates a time element only in the selection of a value of p (i.e., in determining the opportunity that System 2 might have to intervene in the above example). For a dynamic model, see [19]. We do not claim that defining irrationality in the preceding way presents a complete picture. In fact, irrationality can be considered in other ways, e.g., in relation to ideas and theories about anti-realism [20] or to theories about biases and perceptions in cognitive science [17,18,21,22,23]. However, by parameterizing irrationality using a bistable parameter, irrationality can be investigated in a systematic way.
Whilst comparisons have been made between a similar noise model [24] and models based on quantum formalism [25], we show that the bistability model developed in the present paper can be encapsulated within a quantum framework. Bistable probabilities can be expressed as normalized positive-operator-valued (POV) measures [26]. A projection in quantum mechanics is defined by using orthogonal states $|\phi_i\rangle$, i.e.,
$$\pi_i = |\phi_i\rangle\langle\phi_i|,$$
with the following conditions:
$$\pi_i \pi_j = \delta_{i,j}\,\pi_i, \qquad \sum_{i=1}^{d} \pi_i = I,$$
in which $\delta_{i,j}$ is the Kronecker delta and d is the dimension of the Hilbert space. However, a general measurement in quantum mechanics is described by means of a POV projection acting on the quantum state defined in the complex Hilbert space [27]. Although orthogonality is not a necessary condition for such projections (which means that the results of two measurements following each other are not the same, i.e., $E_i E_i \neq E_i$), the second condition still holds:
$$\sum_{i=1}^{d} E_i = I.$$
In a binary system, a set of unsharp projections is defined as follows [28]:
$$E_{\pm \hat n} = \frac{1}{2} I_{2\times 2} \pm \frac{\eta}{2}\,\vec\sigma\cdot\hat n, \qquad 0 \le \eta \le 1,$$
in which $I_{2\times2}$ is the two-dimensional identity matrix, $\eta \in [0,1]$ is the so-called noise parameter, the $\sigma$s are the standard Pauli matrices,
$$\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \qquad \sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix},$$
and $\hat n = (\sin\vartheta\cos\varphi, \sin\vartheta\sin\varphi, \cos\vartheta)$ gives the direction of the projections on the Bloch sphere.
Although in quantum cognition, the noise parameter η is attributed to memorylessness and weak interaction [29], we ascribe it to the impact of irrationality on decision making.
We consider a linear map $\eta = 2k - 1$ together with an extended noise interval $\eta \in [-1, 1]$. Hence, we obtain the bistable projection as follows:
$$P_{\pm\hat n} = (1-k)\,I_{2\times2} + (2k-1)\,\pi_{\pm\hat n},$$
in which k is the bistable parameter, defined in the interval $0 \le k \le 1$, and $\pi_{\pm\hat n}$ is a positive-valued projection defined by
$$\pi_{\pm\hat n} = \frac{1}{2}\left(I_{2\times2} \pm \vec\sigma\cdot\hat n\right), \qquad \pi_{+\hat n} = \begin{pmatrix} \cos^2\frac{\vartheta}{2} & e^{-i\varphi}\sin\frac{\vartheta}{2}\cos\frac{\vartheta}{2} \\ e^{i\varphi}\sin\frac{\vartheta}{2}\cos\frac{\vartheta}{2} & \sin^2\frac{\vartheta}{2} \end{pmatrix}.$$
By using an analogy with quantum mechanics, we postulate that the probability of an irrational decision is given by the expectation value of the associated bistable projection, that is,
$$P_\pm(k, \hat n) = \langle\psi|\, P_{\pm\hat n}\, |\psi\rangle,$$
in which the bistable projection $P_{\pm\hat n}$ is defined by Equation (6) and $|\psi\rangle = (\sqrt{p}, \sqrt{1-p})^T$ encodes the probability of a rational decision. Note that in the special case in which the bistable projection (8) is defined in the direction z,
$$P_{+z} = \begin{pmatrix} k & 0 \\ 0 & 1-k \end{pmatrix}, \qquad P_{-z} = \begin{pmatrix} 1-k & 0 \\ 0 & k \end{pmatrix},$$
we can reproduce the output of bistable decision making, i.e.,
$$P_k(+) = \begin{pmatrix} \sqrt{p} & \sqrt{1-p} \end{pmatrix}\begin{pmatrix} k & 0 \\ 0 & 1-k \end{pmatrix}\begin{pmatrix} \sqrt{p} \\ \sqrt{1-p} \end{pmatrix} = 1 - p - k + 2kp,$$
$$P_k(-) = \begin{pmatrix} \sqrt{p} & \sqrt{1-p} \end{pmatrix}\begin{pmatrix} 1-k & 0 \\ 0 & k \end{pmatrix}\begin{pmatrix} \sqrt{p} \\ \sqrt{1-p} \end{pmatrix} = p + k - 2kp.$$
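A direct numerical check (our own sketch, not part of the paper) confirms that the z-direction bistable projections $P_{+z} = \mathrm{diag}(k, 1-k)$ and $P_{-z} = \mathrm{diag}(1-k, k)$, applied to the rational state $|\psi\rangle = (\sqrt{p}, \sqrt{1-p})^T$, reproduce the bistable probabilities:

```python
import math

def expectation(diag, p):
    """<psi| M |psi> for a diagonal 2x2 matrix M = diag(m0, m1),
    with |psi> = (sqrt(p), sqrt(1-p))^T, so (sqrt(p))**2 = p."""
    m0, m1 = diag
    return m0 * p + m1 * (1 - p)

def bistable_probs(k, p):
    p_plus = expectation((k, 1 - k), p)   # P_k(+) = <psi|P_{+z}|psi>
    p_minus = expectation((1 - k, k), p)  # P_k(-) = <psi|P_{-z}|psi>
    return p_plus, p_minus

k, p = 0.3, 0.7
plus, minus = bistable_probs(k, p)
assert math.isclose(plus, 1 - p - k + 2 * k * p)   # matches 1 - p - k + 2kp
assert math.isclose(minus, p + k - 2 * k * p)      # matches p + k - 2kp
assert math.isclose(plus + minus, 1.0)             # the two projections sum to I
```

The last assertion reflects the POV completeness condition $\sum_i E_i = I$ in this binary setting.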

3. Causality

In this section, we address the question of what irrationality means for the probabilistic judgement of causality.

3.1. Inferring Causality

Let us start with Reichenbach’s principle: If two variables Y and Z are found to be statistically dependent, then (i) either Y is part of a cause of Z or Z is part of a cause of Y, as shown in Figure 2a, or (ii) Y and Z have a common cause X, as illustrated in Figure 2b. Consequently, causal independence implies statistical independence, i.e., $P(Y,Z) = P(Y)P(Z)$ and $P(Y,Z|X) = P(Y|X)P(Z|X)$, in which X is the collection of all variables acting as common causes [30]. Therefore, causality can be determined on the basis of probabilities. In fact, the following relation:
$$P_k(Y=i)\,P_k(Z=j) \neq P_k(Y=i, Z=j),$$
in which i, j can be any observable outcomes ±, characterizes a causal relationship, i.e., either Y causes Z or Z causes Y. Conversely, probabilities that are factorizable are non-causal, and are therefore non-deterministic [30]. For example, by using (10) and assuming a rational non-causal relationship between the two variables at $k = 1$ for the outcomes $Y = +$ and $Z = +$, i.e.,
$$P_{k=1}(Y=\pm, Z=\pm) = P_{k=1}(Y=\pm)\,P_{k=1}(Z=\pm),$$
and then considering $0 \le k < 1$, we have
$$1 - P(Y=+,Z=+) - k + 2k\,P(Y=+,Z=+) \;\neq\; \big[1 - P(Y=+) - k + 2k\,P(Y=+)\big]\,\big[1 - P(Z=+) - k + 2k\,P(Z=+)\big].$$
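To see this concretely, apply the bistable deformation $q \mapsto 1 - q - k + 2kq$ to both sides of a factorized joint probability (an illustrative sketch; the function name is ours):

```python
import math

def deform(q, k):
    """Bistable deformation of a rational probability q: 1 - q - k + 2*k*q."""
    return 1 - q - k + 2 * k * q

py, pz = 0.6, 0.7   # rational marginals, chosen independently
pyz = py * pz       # rational joint: factorizes by assumption

# Rational case (k = 1): the deformation is the identity, so
# factorization survives and no causal link is inferred.
assert math.isclose(deform(pyz, 1.0), deform(py, 1.0) * deform(pz, 1.0))

# Irrational case (k < 1): the deformed joint no longer factorizes,
# so the agent may read spurious dependence (hence causality) into the data.
k = 0.6
assert not math.isclose(deform(pyz, k), deform(py, k) * deform(pz, k))
```

This is exactly the inequality above: the deformation is affine rather than multiplicative, so it does not commute with taking products of marginals.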
In other words, the presence of irrationality eliminates the cognitive agent’s ability to recognize independence, and potentially spurious causality is discerned by the agent. The tendency to overestimate relationships between events is seen in many heuristics stemming from System 1 processes, such as the illusion of validity bias [31], spontaneous causal inference [32], and illusions of causation [33]. In addition, this effect is exemplified in classically irrational thought processes, such as superstition and Obsessive Compulsive Disorder, where actions are erroneously causally linked to positive or negative events in an individual’s mind. In addition, a comprehensive empirical study of human causal reasoning found that participants committed violations of the Markov condition, which prescribes when variables are independent of each other [34]. For example, in a common cause network (Figure 2b), the Markov condition entails that variables Y and Z are conditionally independent, i.e., non-causally related, when the value of X is known. However, participants deemed Y and Z to influence each other when they were supposedly independent because of the Markov condition. These violations were present in experimental conditions which were specifically designed to distinguish between the processing of System 1 and System 2. These findings suggest that the distortion is not always due to System 1.
The bistable model can also account for situations where the converse occurs; namely, a rational causal relation (see Equation (12)) is distorted into a non-causal relation. Consider a common effect network where $X \to Z$ and $Y \to Z$ (Figure 2a). If the value of Z is known, then X and Y become conditionally dependent. However, they may be irrationally deemed to be conditionally independent, and hence not causally related. For example, take Ozone → Humidity and Air Pressure → Humidity. If it becomes known that the Humidity is high, then, rationally, Ozone and Air Pressure become conditionally dependent. However, irrationally, one might see these as independent. This distortion also corresponds to violations of the “causal faithfulness condition”, which states that variables that are causally connected are probabilistically dependent [35].

3.2. Causal Strength Criterion

By employing causal strength and its power, we study the impact of irrationality on a final judgement. The causal strength measure given by [36,37,38,39],
$$\Delta P = P(Y|X) - P(Y|\neg X),$$
is independent of $P(X)$; note, however, that the causal strength is low if $P(Y|\neg X)$ is high. Therefore, as another criterion, the power of causal strength $\kappa$ is suggested:
$$\kappa = \frac{\Delta P}{P(\neg Y|\neg X)},$$
in which $\Delta P$ is given by relation (15).
By using relations (15) and (16), for a final judgement with probability $P(X_{out}=+) = 1 - p - k + 2kp$, the causal strength measure and its power are obtained as
$$\Delta P_k = P(X_{out}=+\,|\,k=+) - P(X_{out}=+\,|\,k=-) = k + p - 1, \qquad \kappa = \frac{\Delta P_k}{P(X_{out}=-\,|\,p=-)} = \frac{k + p - 1}{(1-p)(1-k)},$$
in which the causal strength measure and its power clearly depend on the parameter k. In fact, the relations in (17) indicate that increasing irrationality, i.e., lowering k, decreases the causal strength and its power.
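As a numerical sanity check of the causal strength expressions in (17), $\Delta P_k = k + p - 1$ and $\kappa = (k + p - 1)/((1-p)(1-k))$, we can confirm that both shrink as k falls (illustrative Python, not the authors' code):

```python
def delta_p(k, p):
    """Causal strength of the final judgement: Delta P_k = k + p - 1."""
    return k + p - 1

def kappa(k, p):
    """Causal power: (k + p - 1) / ((1 - p)(1 - k)), valid for k, p < 1."""
    return (k + p - 1) / ((1 - p) * (1 - k))

p = 0.8
# Lowering k (more irrationality) lowers the causal strength...
assert delta_p(0.9, p) > delta_p(0.6, p) > delta_p(0.3, p)
# ...and likewise its power.
assert kappa(0.9, p) > kappa(0.6, p) > kappa(0.3, p)
```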
We now consider a situation in which an outcome of variable X causes the outcome of variable Y. When the level of bistability changes, what effect does it have on the cause–effect relationship between $X_{out}$ and $Y_{out}$? In other words, if we rationally assume that there is a cause–effect relationship between two variables, what can we say about the cause–effect relationship between the final outcomes? For simplicity, we assume that the bistable projections of variable X are given by the relations in (9), and the associated bistable projections of variable $Y = \pm$ are taken along the x-axis direction:
$$P_k^Y(+) = \begin{pmatrix} \frac{1}{2} & k-\frac{1}{2} \\ k-\frac{1}{2} & \frac{1}{2} \end{pmatrix}, \qquad P_k^Y(-) = \begin{pmatrix} \frac{1}{2} & \frac{1}{2}-k \\ \frac{1}{2}-k & \frac{1}{2} \end{pmatrix}.$$
Hence, the causal strength measure and its power are given by
$$\Delta P_k = P_k(Y=+\,|\,X=+) - P_k(Y=+\,|\,X=-) = \langle\psi|\,P_k^Y(+)P_k^X(+)\,|\psi\rangle - \langle\psi|\,P_k^Y(+)P_k^X(-)\,|\psi\rangle = \frac{1}{2}\,(2k-1)(2p-1),$$
$$\kappa = \frac{(2k-1)(2p-1)}{\,p + k - 2kp + (1-2k)\sqrt{p(1-p)}\,},$$
in which we assume that a real Hilbert space describes the original probability of the system, i.e., $|\psi\rangle = (\sqrt{p}, \sqrt{1-p})^T$. Again, relations (19) and (20) illustrate that the causal strength decreases as the bistable parameter decreases, that is, as irrationality increases. In addition, we note that if the bistable parameter equals $k = 0.5$, both criteria approach zero, which means that it is not possible to establish a cause–effect relationship between the variables $X_{out}$ and $Y_{out}$.

4. Polytopes of Bistable Probabilities

We now consider a joint decision scenario with two decisions $E_1$ and $E_2$, occurring with probabilities $P_1$ and $P_2$, and with $P_{12}$ denoting their joint probability. The necessary and sufficient conditions for rational values of $P_1$, $P_2$, and $P_{12}$ are the following [40]:
$$0 \le P_i \le 1, \qquad P_{12} \le P_i, \qquad P_{12} \ge 0, \qquad P_1 + P_2 - P_{12} \le 1,$$
where $i = 1, 2$. These relations are the so-called Boole conditions [40]. By considering a three-dimensional space in which $P_1$, $P_2$, and $P_{12}$ are coordinate axes, Boole’s conditions (21) construct a polytope. Each point that fulfills Boole’s conditions is therefore a potential rational choice. The plot in Figure 3a illustrates a geometrical representation of this polytope. Based on this interpretation, in which the polytope indicates the volume of information that can be accessed, we now consider two bistable outputs and their conjunction operator. To obtain the polytope structure, we consider the geometrical structure of the truth table of bistable outputs, which is the same as the truth table of non-bistable outputs. We can then express the probabilities of bistable outputs as convex combinations of the truth-table vertices:
$$\left(P_1^k,\, P_2^k,\, P_{12}^k\right) = \lambda_1 (0,0,0) + \lambda_2 (0,1,0) + \lambda_3 (1,0,0) + \lambda_4 (1,1,1),$$
in which $\lambda_i \ge 0$ for $i = 1, \ldots, 4$ and $\sum_{i=1}^{4} \lambda_i = 1$. The following inequalities describe the polytope of accessible information:
$$(2k-1)\,P_i - P_{12}^k \ge k - 1, \qquad i = 1, 2,$$
and
$$(2k-1)\,(P_1 + P_2) - P_{12}^k \le 2k - 1,$$
while we keep in mind the following classical identity:
$$P(E_1 \vee E_2) = P(E_1) + P(E_2) - P(E_1 \wedge E_2).$$
In addition, in the case where $k = 1/2$, $P_{12}^k$ is independent of the probabilities $P_i$, $i = 1, 2$: $0 \le P_{12}^k \le 1/2$. Plots (a)–(f) in Figure 3 illustrate the polytopes of the outputs for different values of k, namely $k = 1, 0.9, \ldots, 0.5$. The figure indicates that the accessible information increases as the bistable parameter decreases. In other words, increasing irrationality in decision making increases the amount of information accessible to the cognitive agent transacting the decision.
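Membership in the two regions can be checked directly. The sketch below is our own illustration, assuming the polytope inequalities $(2k-1)P_i - P_{12}^k \ge k-1$ and $(2k-1)(P_1+P_2) - P_{12}^k \le 2k-1$; it exhibits a point that violates Boole's conditions yet lies inside the deformed polytope at $k = 0.7$:

```python
def satisfies_boole(p1, p2, p12):
    """Boole's conditions (21) for a rational joint decision."""
    return (0 <= p1 <= 1 and 0 <= p2 <= 1
            and p12 >= 0 and p12 <= min(p1, p2)
            and p1 + p2 - p12 <= 1)

def in_bistable_polytope(p1, p2, p12k, k):
    """Membership in the deformed polytope of bistable outputs."""
    return ((2 * k - 1) * p1 - p12k >= k - 1
            and (2 * k - 1) * p2 - p12k >= k - 1
            and (2 * k - 1) * (p1 + p2) - p12k <= 2 * k - 1)

# At k = 1 the two regions agree on this example...
assert satisfies_boole(0.6, 0.5, 0.3) == in_bistable_polytope(0.6, 0.5, 0.3, 1.0)
# ...but a point violating Boole's conditions (p12 > p2) still lies in the
# k = 0.7 polytope: the irrational agent accesses a strictly larger region.
assert not satisfies_boole(0.6, 0.2, 0.35)
assert in_bistable_polytope(0.6, 0.2, 0.35, 0.7)
```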
We define a new concept, “pure irrational information volume” (PIIV), as the difference between the volume of the polytope of irrational decision making ($0 \le k < 1$) and that of rational decision making ($k = 1$), that is, $\Delta(k) = V(k) - V(k=1)$. This quantifies the extra information that is accessible in irrational decision making. In the case of bistability, the PIIV is given by:
$$\Delta(k) = \frac{1-k}{6} + \frac{1-k}{2}\,\frac{2(2k-1)^2 + 1}{(2k-1)^2 + 1}.$$
In fact, the PIIV $\Delta(k)$ is a criterion by which we can compare the amount of information accessible to System 1 and System 2. Figure 4 plots $\Delta(k)$ as a function of k. The plot illustrates that decreasing the parameter k, i.e., increasing irrationality, causes the accessible information to increase. This increased availability of information to the irrational decision maker could be interpreted in terms of exploration [41], where irrationality may be seen to co-occur with the search for novel information or with an increase in the actions or strategies available to the cognitive agent. To place this into the context of attribute substitution, the PIIV accounts for the wider variety of available heuristics employed by a decision maker who is relying on System 1.
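A quick numerical check of $\Delta(k)$ (an illustrative sketch; the function name is ours): $\Delta(1) = 0$ by construction, and $\Delta(1/2) = 1/3$, which agrees with the volume $1/2$ of the $k = 1/2$ polytope (where $P_1, P_2$ range freely over $[0,1]$ and $0 \le P_{12}^k \le 1/2$) minus the volume $1/6$ of the rational tetrahedron:

```python
import math

def piiv(k):
    """Pure irrational information volume Delta(k), per the expression above."""
    s = (2 * k - 1) ** 2
    return (1 - k) / 6 + (1 - k) / 2 * (2 * s + 1) / (s + 1)

assert math.isclose(piiv(1.0), 0.0)     # rational agent: no extra volume
assert math.isclose(piiv(0.5), 1 / 3)   # k = 1/2: V = 1/2, minus V(k=1) = 1/6

# Delta(k) grows monotonically as k falls from 1 toward 1/2.
vals = [piiv(k / 10) for k in range(10, 4, -1)]  # k = 1.0, 0.9, ..., 0.5
assert all(b > a for a, b in zip(vals, vals[1:]))
```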

5. The Bell–Wigner Polytope of Irrational Decision Making

When studying probability theory at school, toy examples such as tossing coins or pulling colored marbles from a bag are often used. When a red marble is drawn from a bag, it is unquestionably assumed that it already had the property of being red before it was pulled out of the bag. Its pre-existing redness is simply noted when the marble is retrieved, thereby contributing to the relative frequency of red marbles sampled from the bag. When George Boole was developing probability theory in the mid-1850s, he did so by considering what he called the “conditions of possible experience”. He formalized his intuitions into inequalities that the relative frequencies must satisfy. For example, for events $E_1$, $E_2$, and $E_3$ with relative frequencies $p_1$, $p_2$, and $p_3$, where $p_{ij}$ denotes the frequency of the joint event $E_i \wedge E_j$:
$$0 \le p_i \le 1, \qquad i = 1, 2, 3,$$
$$0 \le p_{ij} \le \min\{p_i, p_j\}, \qquad i, j = 1, 2, 3,$$
$$p_i + p_j - p_{ij} \le 1, \qquad i, j = 1, 2, 3,$$
$$p_1 + p_2 + p_3 - p_{12} - p_{13} - p_{23} \le 1,$$
$$p_1 - p_{12} - p_{13} + p_{23} \ge 0,$$
$$p_2 - p_{12} - p_{23} + p_{13} \ge 0,$$
$$p_3 - p_{13} - p_{23} + p_{12} \ge 0.$$
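The facet inequalities above can be packaged into a small checker. The code below is our own illustration; the violating point is chosen by hand to show a set of pairwise frequencies that respects $0 \le p_{ij} \le \min\{p_i, p_j\}$ yet falls outside the Bell–Wigner polytope:

```python
def bell_wigner_violations(p, pj):
    """Return the names of the Bell-Wigner facet values that come out negative.

    p = (p1, p2, p3); pj maps (i, j) to the pairwise frequency pij.
    A nonempty result means the point lies outside the polytope, i.e.,
    no classical (rational) joint distribution can produce these frequencies.
    """
    p1, p2, p3 = p
    p12, p13, p23 = pj[(1, 2)], pj[(1, 3)], pj[(2, 3)]
    facets = {
        "p1 - p12 - p13 + p23": p1 - p12 - p13 + p23,
        "p2 - p12 - p23 + p13": p2 - p12 - p23 + p13,
        "p3 - p13 - p23 + p12": p3 - p13 - p23 + p12,
        "1 - (p1+p2+p3) + p12 + p13 + p23":
            1 - (p1 + p2 + p3) + p12 + p13 + p23,
    }
    return [name for name, value in facets.items() if value < 0]

# A classically realizable point: no facet is violated.
ok = bell_wigner_violations((0.5, 0.5, 0.5),
                            {(1, 2): 0.25, (1, 3): 0.25, (2, 3): 0.25})
assert ok == []

# Pairwise frequencies that individually look legitimate but jointly
# break a facet inequality: a quantum-like point.
viol = bell_wigner_violations((0.5, 0.5, 0.5),
                              {(1, 2): 0.1, (1, 3): 0.4, (2, 3): 0.4})
assert "p3 - p13 - p23 + p12" in viol
```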
Pitowsky [42] uses the preceding inequalities to define the “Bell–Wigner” polytope. Basically, it is the region that satisfies Boole’s conditions of possible experience. Pitowsky [40] shows that quantum systems do not always adhere to these conditions, meaning that “quantumness” can be identified with regions that lie outside the Bell–Wigner polytope [43]. We use this property in the following to examine whether irrational decision making conforms to Boole’s conditions of possible experience or is quantum-like. For this purpose, three bistable parameters $k_1$, $k_2$, and $k_3$ are used to respectively attenuate the probabilities $p_1$, $p_2$, and $p_3$. These parameters were systematically manipulated and the inequalities were tested for violation. Figure 5 depicts six plots for different values of $k_3$, where each plot indicates the value of inequality (31) with respect to different values of $k_1$ and $k_2$. These plots illustrate that the maximum violation of the Bell–Wigner polytope occurs whenever just one probability becomes irrational. In other words, decision making is necessarily quantum-like in the presence of irrationality. A future direction is to examine the hypothesized connection between quantum-like decision making and irrationality by using the QTEST framework [44]. For example, QTEST could be used to estimate how well simulated data fit a Bayesian model. A lack of good fit of a Bayesian model could suggest the presence of a quantum-like model, because Bayesian models derive from standard probability theory and must therefore be bounded by the Bell–Wigner polytope.

6. Conclusions

In this paper, we introduced and studied the mathematical consequences of a bistable probabilistic model which enables degrees of irrationality (that is, disagreement between System 1 and System 2) to be systematically investigated. By means of POV projections, it was shown that the bistable model can be considered part of an overarching quantum formalism. We discussed the implications of the bistable model in terms of the propensity of cognitive agents to spuriously infer causality, the impact of irrationality on causal power, and formalizing the amount of extra information available to the irrational decision maker. Finally, we simulated decision making and demonstrated violations of the Bell–Wigner polytope. Such violations suggest that irrational decision making is quantum-like.

Author Contributions

S.D., L.F., and P.B. equally contributed to this research. Formal analysis, S.D. and P.B.; conceptualization, S.D.; investigation, S.D., L.F., and P.B.; writing—original draft preparation, S.D.; writing—review and editing, S.D., L.F., and P.B.; validation, S.D., L.F., and P.B.; software, S.D.; visualization, L.F.; funding acquisition, P.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Asian Office of Aerospace Research and Development (AOARD) grant: FA2386-17-1-4016.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
POV    positive-operator-valued
PIIV   pure irrational information volume

References

1. Chater, N.; Oaksford, M. The Probabilistic Mind: Prospects for Bayesian Cognitive Science; Oxford University Press: Oxford, UK, 2008.
2. Tauber, S.; Navarro, D.; Perfors, A.; Steyvers, M. Bayesian models of cognition revisited: Setting optimality aside and letting data drive psychological theory. Psychol. Rev. 2017, 124, 410–441.
3. Simon, H.A. Theories of bounded rationality. Decis. Organ. 1972, 1, 161–176.
4. Evans, J.S.B. In two minds: Dual-process accounts of reasoning. Trends Cogn. Sci. 2003, 7, 454–459.
5. Moreira, C.; Fell, L.; Dehdashti, S.; Bruza, P.; Wichert, A. Towards a Quantum-Like Cognitive Architecture for Decision-Making. arXiv 2019, arXiv:1905.05176.
6. Khrennikov, A. Ubiquitous Quantum Structure: From Psychology to Finance; Springer: Heidelberg, Germany, 2010.
7. Pothos, E.; Busemeyer, J. A quantum probability explanation for violations of ‘rational’ decision theory. Proc. R. Soc. B 2009, 276, 2171–2178.
8. Busemeyer, J.R.; Bruza, P.D. Quantum Models of Cognition and Decision; Cambridge University Press: Cambridge, UK, 2012.
9. Bruza, P.D.; Wang, Z.; Busemeyer, J.R. Quantum cognition: A new theoretical approach to psychology. Trends Cogn. Sci. 2015, 19, 383–393.
10. Kahneman, D.; Frederick, S. Representativeness revisited: Attribute substitution in intuitive judgment. In Heuristics and Biases: The Psychology of Intuitive Judgment; Gilovich, T., Griffin, D., Kahneman, D., Eds.; Cambridge University Press: Cambridge, UK, 2002; Chapter 2; pp. 49–81.
11. Gladwin, T.; Figner, B. “Hot” Cognition and Dual Systems: Introduction, Criticisms and Ways Forward. In Neuroeconomics, Judgment and Decision Making; Wilhelms, E., Reyna, V., Eds.; Psychology Press: London, UK, 2015; Chapter 8; pp. 157–180.
12. Evans, J.S.B. On the resolution of conflict in dual process theories of reasoning. Think. Reason. 2007, 13, 321–339.
13. Tversky, A.; Kahneman, D. Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychol. Rev. 1983, 90, 293.
14. Conte, E.; Khrennikov, A.; Todarello, O.; De Robertis, R.; Federici, A.; Zbilut, J. On the possibility that we think in a quantum mechanical manner: An experimental verification of existing quantum interference effects in cognitive anomaly of conjunction fallacy. Chaos Complex. Lett. 2011, 4, 123–136.
15. Busemeyer, J.; Pothos, E.; Franco, R.; Trueblood, J. A quantum theoretical explanation for probability judgment errors. Psychol. Rev. 2011, 118, 193–218.
16. Honda, H.; Matsuka, T.; Ueda, K. Memory-based simple heuristics as attribute substitution: Competitive tests of binary choice inference models. Cogn. Sci. 2017, 41, 1093–1118.
17. Costello, F.; Watts, P. Surprisingly rational: Probability theory plus noise explains biases in judgment. Psychol. Rev. 2014, 121, 463.
18. Costello, F.; Watts, P. People’s conditional probability judgments follow probability theory (plus noise). Cogn. Psychol. 2016, 89, 106–133.
19. Diederich, A.; Trueblood, J.S. A dynamic dual process model of risky decision making. Psychol. Rev. 2018, 125, 270.
20. Hoffman, D.D.; Prakash, C. Objects of consciousness. Front. Psychol. 2014, 5, 577.
21. Costello, F.; Watts, P. Explaining high conjunction fallacy rates: The probability theory plus noise account. J. Behav. Decis. Mak. 2017, 30, 304–321.
22. Costello, F.; Watts, P.; Fisher, C. Surprising rationality in probability judgment: Assessing two competing models. Cognition 2018, 170, 280–297.
23. Costello, F.; Watts, P. The rationality of illusory correlation. Psychol. Rev. 2019, 126, 437.
24. Costello, F.; Watts, P. Invariants in probabilistic reasoning. Cogn. Psychol. 2018, 100, 1–16.
25. Yearsley, J.M.; Trueblood, J.S. A quantum theory account of order effects and conjunction fallacies in political judgments. Psychon. Bull. Rev. 2018, 25, 1517–1525.
26. Busch, P. Unsharp reality and joint measurements for spin observables. Phys. Rev. D 1986, 33, 2253.
  27. Liang, Y.C.; Spekkens, R.W.; Wiseman, H.M. Specker’s parable of the overprotective seer: A road to contextuality, nonlocality and complementarity. Phys. Rep. 2011, 506, 1–39. [Google Scholar] [CrossRef] [Green Version]
  28. Wu, X.; Zhou, T. Diagnosing steerability of a bipartite state with the non-steering threshold. arXiv 2019, arXiv:1904.04829. [Google Scholar]
  29. Trueblood, J.S.; Yearsley, J.M.; Pothos, E.M. A quantum probability framework for human probabilistic inference. J. Exp. Psychol. Gen. 2017, 146, 1307. [Google Scholar] [CrossRef]
  30. Allen, J.M.A.; Barrett, J.; Horsman, D.C.; Lee, C.M.; Spekkens, R.W. Quantum common causes and quantum causal models. Phys. Rev. X 2017, 7, 031021. [Google Scholar] [CrossRef] [Green Version]
  31. Einhorn, H.J.; Hogarth, R.M. Confidence in judgment: Persistence of the illusion of validity. Psychol. Rev. 1978, 85, 395. [Google Scholar] [CrossRef]
  32. Hassin, R.R.; Bargh, J.A.; Uleman, J.S. Spontaneous causal inferences. J. Exp. Soc. Psychol. 2002, 38, 515–522. [Google Scholar] [CrossRef]
  33. Matute, H.; Blanco, F.; Yarritu, I.; Diaz-Lago, M.; Vadillo, M.; Barberia, I. Illusions of causality: How they bias our everyday thinking and how they could be reduced. Front. Psychol. 2015, 6, 888. [Google Scholar] [CrossRef] [Green Version]
  34. Rehder, B. Independence and dependence in humal causal reasoning. Cogn. Psychol. 2014, 72, 54–107. [Google Scholar] [CrossRef]
  35. Weinberger, N. Faithfulness, Coordination and Causal Coincidences. Erkenntnis 2018, 83, 113–133. [Google Scholar] [CrossRef] [Green Version]
  36. Icard, T.F.; Kominsky, J.F.; Knobe, J. Normality and actual causal strength. Cognition 2017, 161, 80–93. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  37. Cheng, P.W.; Novick, L.R. Covariation in natural causal induction. Psychol. Rev. 1992, 99, 365. [Google Scholar] [CrossRef] [PubMed]
  38. Jenkins, H.M.; Ward, W.C. Judgment of contingency between responses and outcomes. Psychol. Monogr. Gen. Appl. 1965, 79, 1. [Google Scholar] [CrossRef] [PubMed]
  39. Griffiths, T.L.; Tenenbaum, J.B. Structure and strength in causal induction. Cogn. Psychol. 2005, 51, 334–384. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  40. Pitowsky, I. George Boole’s ‘conditions of possible experience’ and the quantum puzzle. Br. J. Philos. Sci. 1994, 45, 95–125. [Google Scholar] [CrossRef]
  41. Friston, K.; Schwartenbeck, P.; FitzGerald, T.; Moutoussis, M.; Behrens, T.; Dolan, R.J. The anatomy of choice: active inference and agency. Front. Hum. Neurosci. 2013, 7, 598. [Google Scholar] [CrossRef] [Green Version]
  42. Pitowsky, I. Correlation polytopes: their geometry and complexity. Math. Program. 1991, 50, 395–414. [Google Scholar] [CrossRef]
  43. Vourdas, A. Probabilistic inequalities and measurements in bipartite systems. J. Phys. A Math. Theor. 2019, 52, 085301. [Google Scholar] [CrossRef] [Green Version]
  44. Zwilling, C.; Cavagnaro, D.; Regenwetter, M.; Lim, S.; Fields, B.; Zhang, Y. QTEST 2.1: Quantitative testing of theories of binary choice using Bayesian inference. J. Math. Psychol. 2019, 91, 176–194. [Google Scholar] [CrossRef]
Figure 1. A schematic setup for a bistable model structure.
Figure 2. Alternative causal models based on Reichenbach’s principle.
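Figure 2 contrasts causal models built on Reichenbach's common-cause principle, under which a correlation between two events A and B is explained by a common cause C that screens them off from each other. As a minimal sketch of that principle (the distributions below are illustrative and not taken from the paper):

```python
def common_cause_joint(p_c, p_a_given_c, p_b_given_c):
    """Joint probability P(A, B) induced by a common cause C that
    screens A off from B (Reichenbach's common-cause principle):
        P(A, B) = sum_c P(A | c) * P(B | c) * P(c)
    """
    return sum(pc * pa * pb
               for pc, pa, pb in zip(p_c, p_a_given_c, p_b_given_c))

# A binary common cause producing a positive correlation:
# the marginals are P(A) = P(B) = 0.5, yet
# P(A, B) = 0.5 * 0.81 + 0.5 * 0.01 = 0.41 > P(A) * P(B) = 0.25.
p_ab = common_cause_joint([0.5, 0.5], [0.9, 0.1], [0.9, 0.1])
```

The screening-off condition is exactly what the factorization inside the sum encodes: conditioned on each value of C, A and B are independent, and all of their observed correlation is inherited from C.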
Figure 3. Polytopes for different values of k ∈ {1.0, 0.9, 0.8, 0.7, 0.6, 0.5} are shown in plots (a)–(f), respectively.
Figure 4. Pure irrational information volume (PIIV), i.e., Δ(k), as a function of the bistable parameter k.
Figure 5. Bell–Wigner polytopes for different values of k3 ∈ {1.0, 0.9, …, 0.5} are illustrated in plots (a)–(f). Colors differentiate values of k1, with blue signifying k1 = 0.5 and green signifying k1 = 1. Parameter k2 varies from 0.5 to 1. Bars below the (k1, k2) plane signify negative probabilities.
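Membership in the Bell–Wigner polytope, whose violation Figure 5 illustrates, is a classical condition: three events with given marginals and pairwise joint probabilities admit a joint probability distribution exactly when the standard Bell–Wigner facet inequalities hold. The sketch below checks those inequalities directly; the function name and numerical values are illustrative and not taken from the paper.

```python
def in_bell_wigner_polytope(p1, p2, p3, p12, p13, p23, tol=1e-9):
    """Return True if the marginals p_i and pairwise joints p_ij
    admit a classical joint distribution, i.e., lie inside the
    Bell-Wigner polytope (standard facet inequalities)."""
    conditions = [
        # pairwise joints cannot exceed their marginals
        p12 <= min(p1, p2) + tol,
        p13 <= min(p1, p3) + tol,
        p23 <= min(p2, p3) + tol,
        # inclusion-exclusion: P(A or B) <= 1
        p1 + p2 - p12 <= 1 + tol,
        p1 + p3 - p13 <= 1 + tol,
        p2 + p3 - p23 <= 1 + tol,
        # Bell-Wigner facets
        p1 - p12 - p13 + p23 >= -tol,
        p2 - p12 - p23 + p13 >= -tol,
        p3 - p13 - p23 + p12 >= -tol,
        p1 + p2 + p3 - p12 - p13 - p23 <= 1 + tol,
    ]
    return all(conditions)

# Classically realizable: three independent fair coins.
inside = in_bell_wigner_polytope(0.5, 0.5, 0.5, 0.25, 0.25, 0.25)   # True
# Violation: the last facet gives 1.5 > 1, so no joint distribution exists.
outside = in_bell_wigner_polytope(0.5, 0.5, 0.5, 0.0, 0.0, 0.0)     # False
```

A probability assignment that fails any of these inequalities cannot arise from a single classical sample space, which is the sense in which the paper calls such decision behavior quantum-like.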
Dehdashti, S.; Fell, L.; Bruza, P. On the Irrationality of Being in Two Minds. Entropy 2020, 22, 174. https://doi.org/10.3390/e22020174