Decoherence, appearance, and reality in agential realism

This paper reconsiders the implications of quantum decoherence for Karen Barad's agential realism. In contrast with the recent claim of Thomas Everth and Laura Gurney (2022), it argues that decoherence supports rather than defeats the holist, relational ontology of agential realism. Indeed, decoherence offers an explanation for how a quantum system can remain entangled and superposed in principle while nevertheless appearing classical in practice to a local observer. Decoherence thus shows why the appearance of classicality is no objection to reality fundamentally being as agential realism describes it, in accordance with Barad's repeated insistence that what holds for all practical purposes need not hold in principle. Whether users of agential realism in social theory should be encouraged by this is another matter on which this paper takes no stance. As an ontology, however, agential realism is vindicated.


Introduction
Prompted by the recent claim of Thomas Everth and Laura Gurney (2022) that quantum decoherence is detrimental to Karen Barad's (2007) agential realism, this paper argues that a different interpretation of decoherence - the one which Barad seems to endorse - is more compatible with the relational ontology of agential realism.
Barad's agential realism appeals to quantum mechanics to defend this relational ontology where any division into individuated elements is produced by intra-actions in a fundamental, entangled whole. Barad, however, does not limit the application of agential realism to quantum mechanics but finds it useful for reform in social theory broadly construed. More particularly, agential realism, by its relationality, "calls into question the dualisms of object-subject, knower-known, nature-culture, and word-world" (Barad, 2007, 147) and requires a rethinking of notions such as agency, power, and discourse (Barad, 2007, 26). Agential realism has had its primary influence in feminist new materialism, where its proponents for instance say of Barad that "[s]he shows how quantum physics can inform our thinking about gender, racial, queer and other differences" (de Freitas, 2016, 150).
As Everth and Gurney interpret her, Barad justifies this broad scope of agential realism by arguing that agential realism follows from our most fundamental theory of everything, quantum mechanics (see also Faye & Jaksland, 2021; Jaksland, 2021; Hollin et al., 2017; Holzhey, 2021). Referring to feminist new materialism, Everth and Gurney, however, are "concerned about the uncritical application of Barad's theories in these fields, in particular in relation to the extrapolation of quantum mechanical indeterminacy at the human scale" (2022, 2). Their argument, in a nutshell, is that the process known as decoherence shows how those same quantum effects - entanglement in particular - that Barad uses to justify her relational ontology refute this ontology when they are considered in contexts with many degrees of freedom, like those obtaining in the macroscopic world studied by social theory.
Taking a new look at decoherence, this paper, however, proposes that decoherence is more compatible with agential realism than Everth and Gurney indicate if we take the central commitment of agential realism to be that any division into individuated elements is produced by intra-actions in a fundamental, entangled whole. Unless one adds to decoherence a mechanism that collapses the wave function - a process that Barad (2007, 348) denies takes place at any stage in quantum physics - decoherence only explains why macroscopic objects often appear to have independent and persistent existence for local observers like ourselves. Behind this appearance, the quantum weirdness remains, at least at the fundamental level of the ontology. Ontologically, both superposition and entanglement are pervasive even in systems that have undergone decoherence. This distinction between appearance and reality, I claim, connects with Barad's remark that decoherence often, though not always, allows us to treat macroscopic objects classically, i.e., without paying heed to quantum mechanics, "'for all practical purposes' (but not in principle)" (Barad, 2007, 279, emphasis in original). This paper therefore proposes that we can understand Barad's appeal to the distinction between practice and principle in the context of decoherence as preempting the critique of Everth and Gurney. What matters for agential realism, Barad seems to say, is not how reality appears but how it fundamentally is, and even after decoherence, as this paper will argue, the ontology remains that of an entangled whole despite often not appearing so to local observers.
While this offers some vindication for agential realism from the perspective of (fundamental) ontology, it leaves open whether this move from practice to principle should impress social theory. If local observers can typically treat themselves and the objects they interact with classically, then why should social theory care that this apparent classicality is quantum weirdness in elaborate disguise? Answering this question, however, is not the aim of this paper. Rather, the purpose is only to argue that the ontology of agential realism remains consistent even at macroscopic scale where decoherence has taken place.
What would be challenged by decoherence, as Everth and Gurney convincingly argue, is the claim that two objects always acquire their determinacy through the intra-actions with each other. Everth and Gurney see the latter as a problem for agential realism. While I agree that one can read Barad as being committed to this additional claim, the paper demonstrates how Barad considers cases where elements are not individuated through intra-actions with each other, yet presents these cases as arguments in favor of agential realism. Based on these cases, Barad's understanding of agential realism seems, in other words, to be that two objects do not have to acquire their determinacy through intra-actions with each other as long as they acquire it through intra-actions in the fundamental, entangled whole. While the former is challenged by decoherence, the latter is not, and there are therefore indications that agential realism is more compatible with decoherence than Everth and Gurney argue.
This paper assumes the same extrapolation interpretation of Barad as Everth and Gurney whereby the use of agential realism in social theory is interpreted as warranted by features of quantum mechanics. This interpretation is disputed by others who, for instance, understand Barad to say that quantum mechanics only reveals unnoticed assumptions of our theorizing that we can then better identify in other contexts as well. The extrapolation interpretation will, however, not be defended here since the purpose is to qualify the argument of Everth and Gurney assuming this shared interpretational premise. Indeed, a denial of the extrapolation interpretation is equally problematic for this paper and for Everth and Gurney. Neither the problems for Barad's agential realism identified by Everth and Gurney nor the qualifications of those made here are relevant if one adopts the other interpretation of Barad.

Agential realism
Doing justice to all parts of a framework as complex as agential realism is beyond the scope here. The focus will therefore be on those of its fundamental claims that are, allegedly, challenged by decoherence. Most important among these is the rejection of the "metaphysics of individualism" according to which, Barad explains, "there are individual objects with individually determinate properties, and measurements reveal the preexisting values of particular physical quantities" (2007, 262). Barad defends the rejection of the metaphysics of individualism by appeal to three of the central characteristics of quantum mechanics: superposition, complementarity, and entanglement. Giving a completely interpretation-independent account of these is difficult, and in the following the quantum formalism is presented in the realist voice (also adopted by Barad). See expositions of, for instance, Bohmian mechanics (e.g. Esfeld et al., 2014) or QBism (e.g. Fuchs et al., 2014) for other ways to conceive of these concepts.
Superposition describes the ability of a quantum system to be in a state where some of its properties are indeterminate. Such a property can, for instance, be position, and a quantum system can therefore be in a state where it, in a certain sense, is in many places at once. Velocity, momentum, energy, and rotation are also among the properties that are not always determinate.
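In the standard Dirac notation of textbook quantum mechanics (a notational convenience, not Barad's own idiom), a superposition of two position states can be written as:

```latex
\[
  |\psi\rangle \;=\; \alpha\,|x_1\rangle + \beta\,|x_2\rangle ,
  \qquad |\alpha|^2 + |\beta|^2 = 1 .
\]
```

Here neither $|x_1\rangle$ nor $|x_2\rangle$ is the state of the system; position is indeterminate, and $|\alpha|^2$ and $|\beta|^2$ only give the Born-rule probabilities of finding the system at $x_1$ or $x_2$ upon a position measurement.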
Complementarity is a concept coined by Niels Bohr, from whom Barad draws much inspiration, which describes the circumstance in quantum mechanics that certain properties cannot be measured, and therefore cannot be determinate, at once. An experimental arrangement measuring the position of a quantum particle excludes the possibility of measuring its momentum (and velocity). More precisely, the degree of indeterminacy in momentum scales with Planck's constant over the precision of the position measurement. We might propose to just measure the momentum after the position measurement, but if we do so and then measure position again, we find no correlation between the two position measurements. A measurement forces the system - how is central to what is known as "the measurement problem" - to take a determinate value of the property being measured, but this renders its conjugate or "complementary" property indeterminate, i.e., the system enters an extreme superposition of values of the complementary property.
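The scaling just mentioned is the Heisenberg indeterminacy relation, which in standard notation reads:

```latex
\[
  \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2} ,
  \qquad\text{so that}\qquad
  \Delta p \;\geq\; \frac{\hbar}{2\,\Delta x} .
\]
```

The sharper the position measurement (the smaller $\Delta x$), the larger the spread $\Delta p$, i.e., the more extreme the superposition of momentum values the system enters.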
That complementary properties cannot be determinate at once contradicts the claim of the metaphysics of individualism that objects have "individually determinate properties." Furthermore, the absence of correlation when we measure first position, then momentum, and then position again shows that we can hardly say that "measurements reveal the preexisting values of particular physical quantities." Bohr argues, on these grounds, that the ascription of properties such as position or momentum in quantum mechanics is only meaningful in the context of an apparatus measuring that property and therefore only meaningful one at a time, since only one property in a complementary pair can be measured at once. By appeal to the quantum effect known as entanglement, Barad proposes to imbue this contextualization with ontological significance, whereas Bohr seems to consider it only epistemic and semantic (see Jaksland & Faye, 2021 for a critical discussion of Barad's reading of Bohr, but also Howard, 2021 for a recent interpretation of Bohr that is closer to Barad's).
Entanglement is an effect peculiar to quantum mechanics whereby the state of two or more apparently distinct systems cannot be described by a combination of their respective individual states. The state of the full system is non-separable and cannot strictly be viewed as comprising several distinct subsystems corresponding to, say, individual particles. This non-separability due to entanglement manifests itself, for instance, in the curious effect whereby two particles at great distance can affect each other instantaneously (Aspect et al., 1982). The quantum formalism does not allow operations on only one of the subsystems without affecting the other, and this is manifested through these experiments with particles at great distance. Entangled systems form one whole.
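A standard textbook illustration (not Barad's own example) is the spin singlet state of two particles A and B:

```latex
\[
  |\Psi\rangle \;=\; \frac{1}{\sqrt{2}}
  \bigl( |{\uparrow}\rangle_A |{\downarrow}\rangle_B
       - |{\downarrow}\rangle_A |{\uparrow}\rangle_B \bigr)
  \;\neq\; |\phi\rangle_A \otimes |\chi\rangle_B
  \quad\text{for any states } |\phi\rangle_A , |\chi\rangle_B .
\]
```

No assignment of individual states $|\phi\rangle_A$ and $|\chi\rangle_B$ to the two particles reproduces $|\Psi\rangle$; strictly speaking, only the whole has a state.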
What happens when a measuring apparatus comes into contact with a quantum system, according to the quantum formalism, is precisely that the two become entangled. They can, as a consequence, not be considered separate entities with their own states but instead comprise one whole. Barad proposes to understand complementarity in this light. A property can only be ascribed when a quantum system is in the context of an apparatus that can measure that property because the property belongs to the entangled whole of apparatus and observed system, what Barad calls a "phenomenon." As Barad summarizes it: "phenomena do not merely mark the epistemological inseparability of 'observer' and 'observed'; rather, phenomena are the ontological inseparability of agentially intra-acting 'components.' That is, phenomena are ontologically primitive relations - relations without preexisting relata" (Barad, 2003, 815, emphasis in original).
The very notion of an "'individual' is ontologically and semantically indeterminate" and only specific intra-actions within phenomena can produce the conditions that "resolves the inherent indeterminacy in a way that makes this notion intelligible" (Barad, 2007, 316) and which, in turn, provides for an effective or, in Barad's terms, "agential" cut between observer and observed. This is the final and central component of Barad's rejection of the metaphysics of individualism. Fundamentally, there are no individual objects but only an entangled whole.
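The entangling of apparatus and system described above is often formalized in what is known as the von Neumann pre-measurement scheme (again standard notation, not Barad's):

```latex
\[
  \bigl( \alpha\,|x_1\rangle + \beta\,|x_2\rangle \bigr) \otimes |A_0\rangle
  \;\longrightarrow\;
  \alpha\,|x_1\rangle |A_1\rangle + \beta\,|x_2\rangle |A_2\rangle ,
\]
```

where $|A_0\rangle$ is the apparatus "ready" state and $|A_1\rangle$, $|A_2\rangle$ are its pointer states. After the interaction, neither the object system nor the apparatus has a state of its own; there is only the entangled whole, Barad's "phenomenon."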
As a consequence of this central commitment, agential realism precludes the separate existence of subjects and objects or knowers and known, and the dualisms associated with these notions can therefore not be sustained. Furthermore, the material conditions for the ascription of properties due to complementarity reject, according to Barad, the word-world dualism. A concept like 'position' "from the perspective of quantum physics must be understood as semantically determinate only for a given experimental arrangement" (Barad, 2007, 300). By the same reasoning, there is an inherent materiality to discourse, which in turn influences the post-structuralist understanding of power. Finally, since certain apparatuses render some concepts, for instance 'position', semantically determinate at the expense of others, for instance 'momentum', these apparatuses exercise, Barad (2007, 148) argues, an agency which challenges the classical anthropocentric understanding of this concept. How Barad intends these changes to be implemented in theorizing is beyond the scope here. What is important to notice, however, is that these implications for theorizing are, on this interpretation, driven by Barad's account of the microphysics of quantum mechanics.

Decoherence
This transition from microphysics to the macroscopic contexts of the social world is precisely what Everth and Gurney criticize. Ironically, entanglement, which is otherwise so central to Barad's argument, drives this worry, since "when quantum systems entangle with a quasi-infinite number of others (the environment), something fundamentally new happens" (Everth & Gurney, 2022, 5). This process is known as decoherence, and Everth and Gurney claim that it challenges the indeterminacy that Barad finds inherent in the quantum ontology at all scales. Through decoherence, Everth and Gurney write, "[t]he superpositions of the many possible states of the systems' properties in question are destroyed almost instantaneously" and "the values of the properties that are entangled with the environment become determinate and settle into one of their physically possible states" (2022, 5). Where Barad rejects the metaphysics of individualism, that "there are individual objects with individually determinate properties, and measurements reveal the preexisting values of particular physical quantities" (2007, 262), Everth and Gurney argue that this metaphysics does hold when decoherence has taken place, i.e., "when quantum systems entangle with a quasi-infinite number of others (the environment)," as they put it above. During decoherence "something fundamentally new happens," as they say, and the result is individually existing objects with determinate properties. After decoherence, the ontology is not that of a fundamental, entangled whole where any individuated elements are produced by intra-action within this whole, in explicit contradiction to what agential realism claims.
When Everth and Gurney remark that "[d]ecoherence reduces quantum systems to a mixture of classical states, previously indeterminate properties assume determinate values based on probabilities" (2022, 11, emphasis added), they do recognize one limitation for decoherence: Decoherence does not determine what (classical) state a system will be in. Decoherence can only give probabilities for being in a range of possible states, a probability distribution known as a mixture. Nevertheless, in qualifying that systems thereby "assume determinate values based on probabilities," Everth and Gurney seem to suggest that the mixture merely expresses an uncertainty on the part of the decoherence formalism about which determinate state a given system has actually assumed. This limitation for decoherence is, on this account, epistemic and not ontological in kind. The metaphysics of individualism is not rejected anew just because the precise configuration realizing this metaphysics in any particular case is epistemically uncertain from the perspective of a formalism, or so their argument seems to go.
This understanding of the mixture as an expression of epistemic uncertainty crucially depends on the assumption that the system to which it applies has a determinate state. This assumption, though, seems rather natural since a mixture abides by the laws of classical probability theory. Take, for instance, a double slit experiment. A classical object will go through one slit or the other. A quantum object can, however, be in a superposition of going through one slit or the other, which results in the detection of an interference pattern when the experiment is repeated many times. This interference pattern cannot be explained if the object goes through either one slit or the other. Instead, the interference pattern is typically explained as resulting from the object going through both slits at once and then interfering with itself. The probability for where the superposed object ends up on the detection screen - what manifests itself as the pattern on the detection screen - includes a contribution that is distinctly quantum mechanical and which cannot be viewed as merely expressing an uncertainty about which slit the object went through. If, however, the object is describable as a mixture (in the position basis) before going through the slits, as would for instance be the case after decoherence, then it does not produce an interference pattern. The pattern on the detection screen can here be explained by each object having gone through only one of the two slits. The probability for where each object ends up on the detection screen is in this case consistent with it merely being uncertain which slit the object went through. In general terms, a system describable as a mixture (in some basis) displays none of the quantum weirdness (in that basis) - such as the interference in the double slit experiment - that otherwise questions whether systems have determinate classical states.
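The statistical difference between superposition and proper mixture in the double slit experiment can be made concrete in a toy numerical model. The following sketch (illustrative parameter values, simple path-length phase model, not a realistic optical calculation) compares the two detection-screen distributions:

```python
import numpy as np

# Toy double slit model (illustrative parameters)
x = np.linspace(-5.0, 5.0, 1001)   # positions on the detection screen
k, d, L = 10.0, 1.0, 20.0          # wavenumber, slit separation, screen distance

# Amplitude reaching each screen point from each slit (phase from path length)
r1 = np.sqrt(L ** 2 + (x - d / 2) ** 2)
r2 = np.sqrt(L ** 2 + (x + d / 2) ** 2)
psi1 = np.exp(1j * k * r1) / np.sqrt(2)
psi2 = np.exp(1j * k * r2) / np.sqrt(2)

# Superposition: amplitudes add before squaring -> interference fringes
p_super = np.abs(psi1 + psi2) ** 2

# Proper mixture: probabilities add -> no interference (flat in this toy model)
p_mix = np.abs(psi1) ** 2 + np.abs(psi2) ** 2

# The distinctly quantum contribution: the cross-term 2 Re(psi1* psi2)
interference = p_super - p_mix
```

The oscillating cross-term is the contribution "that is distinctly quantum mechanical": it is present when the object is superposed across the slits and absent when the object is in a proper mixture, in which case the pattern is consistent with each object having gone through one determinate slit.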
Quantum systems with all their weirdness can, however, behave just like mixtures under certain circumstances. When this is so, the system is said to be in an improper mixture to distinguish it from the proper mixtures capturing classical uncertainty (see d'Espagnat, 2003, Chap. 7 for a general discussion; and Zeh, 2007 for a discussion in the context of decoherence). In proper mixtures, the uncertainty about the precise state is due to our limited knowledge about the system, whereas the system in an improper mixture is still in an indeterminate, superposed, and entangled state. Nevertheless, proper and improper mixtures will look the same if one only monitors the degrees of freedom within the system described by the mixture, but improper mixtures are really superposed states with a (possibly intricate) entanglement structure that makes them look classical. For instance, two particles whose position degrees of freedom are maximally entangled (as in the Einstein-Podolsky-Rosen thought experiment) will not produce an interference pattern in a double slit experiment. An observer that only monitors the degrees of freedom of one of the particles will therefore observe it to behave as if it went through a determinate (though unknown) slit, even though it is still ontologically in a superposed and entangled state.
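The local indistinguishability of proper and improper mixtures can be verified directly in the quantum formalism. The following sketch constructs a maximally entangled two-qubit state and traces out one qubit: the remaining qubit is then locally described by exactly the density matrix a classically ignorant observer would write down, even though the global state is pure, superposed, and entangled:

```python
import numpy as np

# Bell state of two qubits: (|01> - |10>) / sqrt(2)
psi = np.zeros(4, dtype=complex)
psi[1], psi[2] = 1 / np.sqrt(2), -1 / np.sqrt(2)

# Global density matrix: a pure, entangled state (purity Tr(rho^2) = 1)
rho = np.outer(psi, psi.conj())

# Partial trace over qubit B: everything a local observer of qubit A can access
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# The proper mixture an observer with mere classical ignorance would assign
rho_classical = 0.5 * np.eye(2)

# rho_A equals rho_classical: the improper mixture is locally indistinguishable
# from the proper one, though globally nothing has "collapsed" - the full state
# remains superposed and entangled.
```

The purity Tr(ρ²) makes the contrast explicit: it is 1 for the pure global state but 1/2 (the minimum for a qubit) for the local improper mixture, confirming that the local qubit cannot be ascribed a pure state of its own.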
This difference between proper and improper mixtures is important for the understanding of decoherence. That a system allegedly assumes a determinate classical state through decoherence, where it gets entangled with the environment, is inferred from its behavior as a mixture, i.e., from its effectively satisfying classical probability theory, but this is precisely the inference that the existence of improper mixtures due to entanglement questions. Importantly, the question is not whether decoherence really takes place, but whether this real physical process gives rise to a proper or improper mixture. The latter can remain undecided because a system in an improper mixture will also effectively satisfy classical probability theory but still be in a superposed state and entangled with the environment. An observer that only monitors the degrees of freedom within the system describable as a mixture, and not those in the environment, will not be able to determine whether the system is in a proper or improper mixture. Such a local observer will not be able to determine whether decoherence gives rise to independent objects with determinate but unknown properties (proper mixture) or whether the system remains in an indeterminate, superposed, and entangled quantum state (improper mixture). Looking at the process of decoherence as described by the quantum formalism, however, this process only gives rise to local subsystems describable as improper mixtures: "we do need to add some interpretive rule to get from the improper ensemble emerging from decoherence to the perception of individual terms alone" (Schlosshauer, 2005, 1297), i.e., to get a proper mixture. Now, when Everth and Gurney say that, in decoherence, "something fundamentally new happens" (2022, 5), they could precisely be understood as adding this interpretive rule, which would also cohere with their remarks that suggest that decoherence gives rise to proper mixtures (though Sect. 6 will consider the possibility that they instead mean by this the emergence of another ontological level). While adding such an interpretive rule that reduces improper mixtures to proper mixtures is consistent (Crull, 2021, Sect. 3), two observations are relevant to make about this move. First, adding this interpretive rule only in the case of decoherence and not in general when quantum systems entangle to produce mixtures would be rather arbitrary. Why should a system in a superposition state that gets entangled with the environment reduce to a proper mixture, i.e., to a determinate but uncertain state, when entangling the system with another quantum system reduces it to an improper mixture, i.e., a system that appears classical but which ontologically is still both superposed and entangled? Thus, this interpretive rule is most coherently applied to all cases where entanglement between systems entails that one or more of them can be described as a mixture. Second, the interpretive rule whereby the final reduction to a proper mixture takes place is equivalent to assuming that the wave function collapses during the decoherence process. Wave function collapse is one of the central outstanding interpretational questions in quantum mechanics, with some arguing that it does take place and others denying this. Barad very explicitly sides with those who deny that a collapse of the wave function ever occurs: "There is no 'collapse' - no additional physical mechanism (beyond that governed by the quantum theory) - that transforms a superposition or entanglement that exists before the measurement into a definite state upon measurement" (Barad, 2007, 345). Thus, adding the additional interpretive rule - the collapse of the wave function - to decoherence to get proper mixtures instead of improper ones simply amounts to disagreeing with Barad over this central interpretational question in quantum mechanics. It is hardly surprising that agential realism becomes problematic if one denies it this central assumption.
Indeed, if collapse does not take place and decoherence therefore only reduces systems to improper mixtures, then the tables are turned. In that case, the system will be entangled with the environment such that it cannot be ascribed a state of its own, and the full state will be superposed in such a way that the properties of subsystems are not determinate. If the system remains in an improper mixture after decoherence, the system has therefore not transitioned to a classical ontology since neither (ontological) superpositions nor entanglement feature in classical ontologies. The classicality due to decoherence can thus be said to be only an appearance that the entangled and superposed fundamental ontology can have to a local observer. This understanding of decoherence seems much more compatible with the claim of agential realism that any division into individuated elements is produced by intra-actions in a fundamental, entangled whole. It is precisely the particular entanglement within the whole that comprises both system and environment that produces the possibility for treating a system as though it is individually existing. Everything is still entangled with everything else, and only the particular features of this entanglement - the intra-actions in this entangled totality, Barad might say - allow local observers to treat certain subsystems as though they are separate objects with determinate properties. In this picture, the metaphysics of individualism, that "there are individual objects with individually determinate properties, and measurements reveal the preexisting values of particular physical quantities," is still rejected despite decoherence having taken place. Even after decoherence, no system can be regarded as an individual object in the sense that a pure state can be ascribed to it. Instead, only an improper mixture can be ascribed where properties remain ontologically indeterminate, and these can therefore not preexist measurement.
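This reading of decoherence - classical appearance for a local observer, persisting superposition and entanglement in the global state - can be illustrated with a standard toy model of environment-induced decoherence (illustrative states and overlap value; a sketch, not a realistic environment model):

```python
import numpy as np

# System qubit in an equal superposition of |0> and |1>.
alpha = beta = 1 / np.sqrt(2)

# Each environment qubit imperfectly "records" the system's state: it ends in
# |e0> when the system is |0> and in |e1> when it is |1>. The overlap
# <e0|e1> = 0.8 is an illustrative value controlling the decoherence rate.
e0 = np.array([1.0, 0.0], dtype=complex)
e1 = np.array([0.8, 0.6], dtype=complex)

def kron_all(vecs):
    """Tensor product of a list of state vectors."""
    out = np.array([1.0 + 0.0j])
    for v in vecs:
        out = np.kron(out, v)
    return out

def system_coherence(n_env):
    """|off-diagonal| of the system's reduced density matrix after the
    system has entangled with n_env environment qubits."""
    E0, E1 = kron_all([e0] * n_env), kron_all([e1] * n_env)
    # Global state alpha |0>|E0> + beta |1>|E1>: pure and entangled throughout
    psi = np.concatenate([alpha * E0, beta * E1])
    d = 2 ** n_env
    rho = np.outer(psi, psi.conj()).reshape(2, d, 2, d)
    rho_sys = np.trace(rho, axis1=1, axis2=3)  # trace out the environment
    return abs(rho_sys[0, 1])

# Coherence decays as 0.5 * 0.8**n with the number of environmental records
coherences = [system_coherence(n) for n in range(10)]
```

The off-diagonal ("coherence") element of the system's reduced density matrix decays exponentially in the number of environmental records, which is why the system rapidly appears as a mixture to any observer who does not monitor the environment; the global state, however, remains a pure superposition entangled with the environment throughout, i.e., an improper mixture.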
Barad's brief mention of decoherence does indeed suggest that she interprets the process as giving rise to improper mixtures (and this accords with her very detailed rejection of the collapse of the wave function): "quantum behavior is difficult to observe because of the difficulty of shielding an object, especially a relatively large object, from interactions with its 'environment,' which continually fluctuates in an erratic fashion in such a way that a superposition is 'randomized' into a mixture 'for all practical purposes' (but not in principle). This randomization process is called 'decoherence'" (Barad, 2007, 279).
Everth and Gurney, considering the same remark, write that "Barad (2007) does not address the fundamental conflict that this juxtaposition of 'all practical purposes' and 'in principle' hints at for her theory about the macroscopic world we live in. Yet this is the central argument that decoherence and quantum Darwinism theory addresses. According to decoherence theory, macroscopic objects acquire classicality in principle and irreversibly" (2022, 15-16).
In my view, however, Barad's "juxtaposition" has a rather clear interpretation as asserting that decoherence, in Barad's view, gives rise to improper mixtures that, though they appear classical to local observers (in practice), are not so from the perspective of the fundamental ontology (in principle). Indeed, this connection between proper and improper mixtures and the principle/practice juxtaposition is explicitly made in a quote from Barad's discussion of decoherence: "If this disruption [due to decoherence] truly destroys quantum coherence, destroys it not just in practice but in principle, then it will be impossible ever to recover an interference signal. On the other hand, if the disruption leads rather to the creation of an entanglement, then the state has become more complex but its fundamental nature has not altered - it is a superposition, not a mixture" (Greenstein & Zajonc, 1997, 209; quoted in Barad, 2007, 349). When Barad claims that decoherence only produces a "mixture 'for all practical purposes'", she clearly opts for the latter understanding suggested by Greenstein and Zajonc whereby the "fundamental nature has not altered," i.e., the ontology remains superposed and entangled. If so, this is furthermore a theme - the difference between proper and improper mixtures - that is theorized throughout Barad's work (see 2007, 265-71, 285, 346-49, 455-57, 465).
When Everth and Gurney write that "macroscopic objects acquire classicality in principle and irreversibly" through decoherence, they can appear to simply disagree with Barad over whether to interpret the mixtures produced by decoherence as proper or improper mixtures (more on the irreversibility in Sect. 5). But Barad's understanding in terms of improper mixtures is perfectly mainstream, shared explicitly by Joos (2009, 156), Schlosshauer (2005, 1279), and Zeh (1997, 4), and even implicitly by Zurek according to Fortin and Lombardi (2016, 237). Nevertheless, this disagreement is, of course, admissible, and making explicit that Barad adopts this particular interpretation of decoherence and that another interpretation is available is a valuable contribution to the understanding of Barad. However, a criticism of Barad's agential realism based on such an interpretational disagreement does not expose an inconsistency within agential realism, as Everth and Gurney seem to suggest, though it would, of course, add to the existing worry that agential realism is not capturing the "ontological issues that quantum physics forces us to confront," as Barad (2007, 24) claims, but rather the implications of one particular interpretation of quantum mechanics (Jaksland, 2021; Pinch, 2011).

Appearance and reality
Importantly, the issue of whether the mixtures produced by decoherence are proper or improper is not about the size at macroscopic length scales of characteristic quantum effects such as interference between superposed states. On both interpretations, these quantum effects will typically be unnoticeable for a local observer, either because they have ceased to exist, as the proper mixture interpretation suggests, or because they effectively vanish, as the improper mixture interpretation suggests. The difference is rather that in the latter case, the classicality is in appearance only. Also, Barad explicitly acknowledges that the "ratio" which governs the size of quantum effects is vanishing in most macroscopic contexts but explains that "the fact that this ratio is not strictly zero is the key point. In other words, the fact that Newtonian mechanics provides good approximations to the exact quantum mechanical solutions for many macroscopic situations is not evidence against the new epistemology or ontology suggested by my elaboration of Bohr's account" (Barad, 2007, 416).
Thus, contrary to what Everth and Gurney say, it is not the case that "[f]or Barad, quantum effects are dominant at every scale" (2022, 6). This is not why Barad claims that agential realism remains relevant even in our theorizing at macroscopic scales (see Jaksland, Forthcoming for an exposition of what Barad's argument might be instead). Rather, Barad states that the precision of the Newtonian approximation - the appearance of classicality - is irrelevant for the questions of reality that she is concerned with. Instead of challenging Barad's agential realism, decoherence could be regarded as an explanation for why the precision of the Newtonian approximation at macroscopic scales does not defeat agential realism. Decoherence explains why an entangled totality can appear classical to a local observer without actually being classical in reality.
Indeed, the mechanism behind decoherence - in technical terms, the dynamical selection of a pointer basis through entanglement with other degrees of freedom - is the same mechanism that Barad describes as intra-actions within phenomena, which give rise to the effective or agential separation of a phenomenon into elements to which properties can then be ascribed. The degrees of freedom in an apparatus measuring, for instance, positions are such that, upon entanglement with the object system, the full entangled system will very quickly evolve into a state where the object system will appear to be in a mixture of position states for an observer who does not monitor the entanglement with the apparatus. This is what Barad describes as a local determinacy within the indeterminate quantum ontology. The intra-actions within phenomena or, we might say, the dynamical evolution of the entangled totality produce conditions - the appearance of the mixture - which allow for the ascription of the property 'position' and the effective separation into object system and measuring apparatus. Barad summarizes this as follows: "[W]ithin a phenomenon, where we have agential separability, the mark on the 'measuring instrument' (e.g., the direction of a pointer) is describable as a mixture (even though it is not strictly speaking a mixture). That is, it appears as a mixture if the degrees of freedom of the instruments are bracketed, which is just what is done in describing the instrument classically" (Barad, 2007, 346). Barad emphasizes again, however, that this "agential cut does not disentangle the phenomenon into independent subsystems; after all, it is their very intra-action (their nonseparability) that makes manifest particular marks on bodies in the first place" (Barad, 2007, 328).
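In schematic terms, this pointer-basis mechanism can be sketched as follows; the notation (system states |s_i⟩, apparatus states |A_i⟩, amplitudes c_i) is standard textbook shorthand for decoherence, not Barad's own:

```latex
% Pre-measurement: a system S in a superposition becomes entangled
% with the apparatus A (initially in the ready state |A_0>):
\Big( \sum_i c_i \, |s_i\rangle \Big) \otimes |A_0\rangle
\;\longrightarrow\;
|\Psi\rangle = \sum_i c_i \, |s_i\rangle \otimes |A_i\rangle .

% An observer who does not monitor the apparatus describes S by the
% reduced density matrix obtained by tracing out A:
\rho_S = \operatorname{Tr}_A |\Psi\rangle\langle\Psi|
       = \sum_{i,j} c_i \, c_j^{*} \, \langle A_j | A_i \rangle \, |s_i\rangle\langle s_j| .

% Because the apparatus (or environment) states rapidly become nearly
% orthogonal, \langle A_j | A_i \rangle \approx \delta_{ij}, the
% off-diagonal (interference) terms are suppressed, leaving an
% improper mixture:
\rho_S \;\approx\; \sum_i |c_i|^{2} \, |s_i\rangle\langle s_i| .
```

The interference terms are never strictly zero, which is why, on the improper mixture interpretation, classicality remains an appearance for a local observer rather than a change in the underlying entangled state.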
Instead, "[w]hat the agential cut does provide is a contingent resolution of the ontological inseparability within the phenomenon and hence the conditions for objective description," but, Barad qualifies, "[s]trictly speaking, there is only a single entity -the phenomenon -and hence the proper objective referent for descriptive terms is the phenomenon." (Barad, 2007, 328). Thus, the same mechanism that decoherence is based on explains, according to Barad, why the ontological inseparability of agential realism is consistent with the appearance of classicality.
The environment is just another measuring apparatus on this account. This follows from Barad's posthumanism that rejects any special status to measuring apparatuses readable by human beings: "it is important to recognize that apparatuses are not merely human-constructed laboratory instruments that tell us how the world is in accordance with our human-based conceptions. Rather, apparatuses are specific material configurations (dynamic reconfigurings) of the world that play a role in the production of phenomena" (Barad, 2007, 335). This production takes place whenever the entanglement structure becomes one where one part of the entangled whole can be described as an improper mixture by ignoring (tracing out) the rest of the degrees of freedom, a process where, "in its causal intra-activity, 'part' of the world becomes determinately bounded and propertied in its emergent intelligibility to another 'part' of the world" (Barad, 2007, 335). This could be a quantum system (the former 'part') and a measuring apparatus in a laboratory (the latter 'part'), but the decoherence of a system through entanglement with the environment is another process that enacts this determinacy where the system becomes describable by an improper mixture.
Everth and Gurney claim that "Barad's (2007) avoidance of a deeper engagement with decoherence theory avoids the implications of the emergence of classical reality for agential realism and for her elevation of quantum phenomena to the macroscopic level" (2022, 16). I argue, to the contrary, that Barad's engagement with decoherence and other similar entanglement-based processes precisely forms the basis for her explanation of why the appearance of classicality at macroscopic length scales is not evidence against reality being deeply quantum.

Independent classicality
Often in Barad's account, the focus is on the determinacy that the entanglement between object system and measuring apparatus provides. Everth and Gurney explain, however, that "[f]or the critique of Barad (2007), it is important to point out that environment-induced decoherence happens for macroscopic objects independent from each other and gives objects a separate, historised and individual classical existence within the environment" (2022, 14). This independent constitution, they then argue, is in tension with "Barad's (2007) view that relata […] are invoked as phenomena in the intra-action between them" (2022, 14, emphasis in original).
I agree that decoherence entails that macroscopic objects are typically constituted as relata within phenomena by their entanglement with the environment and thus not (primarily) by virtue of their entanglement with each other. I am, however, less convinced that this is in tension with agential realism. In my reading, agential realism is only committed to the view that any division into individuated elements is produced by intra-actions in a fundamental, entangled whole. But this view does not imply that an agential individuation of two objects is always produced by their mutual intra-action, only that their individuation is ontologically derivative from the fundamental, entangled whole. Indeed, Barad does consider cases where this individuation -including the possibility for the ascription of properties -is not due to the entanglement between measuring apparatus and object system, but Barad does not consider these a problem for agential realism.
One example is Barad's (2007, 258-65) discussion of spin measurements. A particle with spin 1/2 can spin either one way or the other, typically denoted up and down, along any particular direction (denoted z here to conform with Barad's discussion). Barad describes a setup with a device (which she denotes "SG_z") that will deflect the particle upwards if it has spin up and downwards if it has spin down. If the particle is deflected downwards, it will hit a detector. Thus, if the detector is hit, the particle has spin down, and if the detector is not hit, the particle has spin up. Imagine a particle in a superposed spin state (in the z-direction) going through the measuring device. The agential realist account of the situation is the following: it is not actually meaningful to describe the situation as involving a particle to which the property spin (in the z-direction) applies. Rather, the property of having a spin applies to the phenomenon comprised of the intra-action between the particle and the measuring apparatus. It is this intra-action that constitutes an agential separation between object and agency of observation where the concept spin becomes semantically determinate. In quantum mechanical terms, the particle is in a superposed spin state (in the z-direction), and when going through the measuring device, the particle and measuring device become entangled. The consequence is that the particle plus measuring device is now in an entangled, superposed state of being spin down and hitting the detector and being spin up and not hitting the detector. The entanglement, however, is such that, when tracing out (ignoring) the degrees of freedom of the measuring device, the resulting reduced state of the particle is a mixture of spin states. When the degrees of freedom of the measuring device entangle with the particle through their intra-action, the measuring device dynamically diagonalizes the reduced state of the particle in the spin basis (in the z-direction).
The particle can thereby be treated as an individual whose spin is given by a classical probability distribution. This is what Barad means by saying that the concept of spin has become semantically determinate. She insists that the property is ascribed to the whole phenomenon of particle plus measuring device because the reduced state results from tracing out the degrees of freedom of the measuring device. In Barad's words, "when an SG_z device is in place, the specific material arrangement (not the will of the experimenter) enacts a cut between the 'object of observation' and the 'measuring device' such that the boundaries and properties in question become determinate. In particular, with the SG_z device in place, the notion of spin in the z-direction becomes meaningful, and the value of the corresponding property becomes definite.1 In the absence of such a device, the concept of spin in the z-direction is meaningless, and there is no fact of the matter about the boundaries and properties of the object" (Barad, 2007, 263).
Here, the resolution into individuals -the particle and the measuring device -with determinate properties is the result of their mutual intra-action.
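For the spin case, this can be sketched in standard notation; the labels α, β, |↑⟩, |↓⟩ and the device states |D⟩ are illustrative shorthand, not Barad's own:

```latex
% Particle in a superposed spin state, device in its ready state:
|\psi\rangle = \big( \alpha \, |{\uparrow}\rangle + \beta \, |{\downarrow}\rangle \big) \otimes |D_0\rangle

% After the intra-action, particle and device are entangled:
|\Psi\rangle = \alpha \, |{\uparrow}\rangle |D_{\uparrow}\rangle
             + \beta \, |{\downarrow}\rangle |D_{\downarrow}\rangle

% Tracing out the device (with \langle D_{\uparrow} | D_{\downarrow} \rangle \approx 0)
% leaves the particle in a diagonal, improper mixture of spin states:
\rho_{\mathrm{particle}} = \operatorname{Tr}_D |\Psi\rangle\langle\Psi|
  \;\approx\; |\alpha|^{2} \, |{\uparrow}\rangle\langle{\uparrow}|
            + |\beta|^{2} \, |{\downarrow}\rangle\langle{\downarrow}| .
```

The diagonal form of the reduced state is what licenses treating the particle's spin as governed by a classical probability distribution, even though the total state remains entangled and superposed.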
What is interesting for the question whether it is "Barad's (2007) view that relata […] are invoked as phenomena in the intra-action between them," as Everth and Gurney claim, is the setup Barad considers next. Here, another spin measuring device is placed in the up-stream, i.e., in the path of the upward-deflected particles; it again deflects particles with spin up upwards and particles with spin down downwards (in the z-direction), and those going down will hit a detector. This simple setup involves two intra-actions, one at each measuring device. What will happen, Barad explains, is that "all the particles that emerge through the top output of the first SG_z, and head into the second one emerge through the top of the second device, indicating that all the emerging particles have measured eigenvalues up" (2007, 259). The second measuring device affirms the measurement done by the first device. In quantum mechanical terms, the state after the second intra-action is a superposition of being spin down and hitting the detector and being spin up and not hitting either of the two detectors. In the first intra-action, the reduced state of the particle is a mixture of spin states (in the z-direction). In the second intra-action, the reduced state of the particle is unchanged and therefore remains a mixture. The phenomenon of particle plus the first measuring device constitutes the particle as a relatum and allows for the application of the concept spin already before the particle moves through the second measuring device. The intra-action between the particle and the second measuring device, one might say (though Barad, as discussed below, will ultimately want to put this in a slightly different way), only monitors a particle already constituted as an object to which the concept of spin applies. Barad explains that "even though measurements do not disclose preexisting values, they are not some arbitrary playing around, but rather definite, consistent, and reproducible values are obtained" (2007, 265).
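Schematically, and again in illustrative rather than Barad's notation (particle states |↑⟩, |↓⟩ with amplitudes α, β; device states |D⟩, |D'⟩), the second intra-action adds a further device to the entangled state without changing the particle's reduced state:

```latex
% After the second SG_z device D' entangles with the particle:
|\Psi'\rangle = \alpha \, |{\uparrow}\rangle |D_{\uparrow}\rangle |D'_{\uparrow}\rangle
              + \beta \, |{\downarrow}\rangle |D_{\downarrow}\rangle |D'_{\downarrow}\rangle

% Tracing out both devices yields the same mixture as after the first
% intra-action:
\rho_{\mathrm{particle}} \;\approx\; |\alpha|^{2} \, |{\uparrow}\rangle\langle{\uparrow}|
                                   + |\beta|^{2} \, |{\downarrow}\rangle\langle{\downarrow}| ,
% but the degrees of freedom being traced out - and hence the
% phenomenon - are not the same.
```

The reduced state is reproduced while the entangled totality from which it is derived has changed; this is the formal sense in which the second device can consistently affirm the first measurement.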
This emphasis on consistency and reproducibility between intra-actions appears to be in tension with Barad's remarks elsewhere, which suggest that the resolution into individuals which can be ascribed properties is produced by the intra-action between these same relata.2

Existence is not an individual affair. Individuals do not preexist their interactions; rather, individuals emerge through and as part of their entangled intra-relating (Barad, 2007, ix).
This quote could be read as saying that such independent existence is absent prior to every intra-action. This reading, however, is difficult to square with Barad's account of the spin measurements. A reading that fits better with this account is that each intra-action constitutes a new phenomenon and that, since individuals are derivative from the phenomenon in agential realism, they can therefore be said to emerge anew with each intra-action. This also coheres well with Barad's remark that "the addition of an auxiliary apparatus entails the constitution of a new phenomenon" (2007, 413). Barad justifies this view with reference to one of Schrödinger's remarks about the consequences of entanglement: "When two systems interact, their ψ-functions [...] do not come into interaction but rather they immediately cease to exist and a single one, for the combined system takes their place" (Schrödinger, 1935; quoted in Barad, 2007, 283). In the spin measurements, even though the reduced state of the particle remains the same after intra-acting with, and therefore becoming entangled with, the second measuring device, the degrees of freedom being traced out are not the same. The phenomenon that produces the relata is different. This is why Barad can insist that the particle is constituted anew upon the intra-action with the second measuring device even though the second measuring device merely reproduces the value of spin obtained in the first measurement.
This reconstitution of sameness, as one might call it, could, however, seem to be in conflict with Barad's emphasis on the particularity or specificity of the resolution within intra-action.
Subjects and objects do not preexist as such but are constituted through, within, and as part of particular practices (Barad, 2007, 208).

That is, the agential cut enacts a resolution within the phenomenon of the inherent ontological (and semantic) indeterminacy. In other words, relata do not preexist relations; rather, relata-within-phenomena emerge through specific intra-actions (Barad, 2007, 334).
In emphasizing particularity and specificity, Barad can certainly be read as implying that the resolution of the indeterminacy is very different between different intra-actions. I suggest instead reading 'particular' and 'specific' as part of Barad's (2007, 86-94) questioning of reflection and sameness. On Barad's account, an object is only constituted as an object within a phenomenon. There is no object in itself whose sameness can be traced between phenomena. Every intra-action generates a new phenomenon with its own resolution of relata within that phenomenon. Every history ascribed to these relata which might form the basis for a judgement about sameness is itself part of that specific phenomenon. When Barad emphasizes particularity and specificity, this does not imply change, but rather that relata-within-phenomena derive their individuality from specific phenomena. A particle with the same reduced state can result from many different entangled totalities, but asking in which of these the particle is the same particle is hardly a meaningful question in agential realism. This, I propose, is the sense in which objects are "constituted through […] particular practices," as Barad puts it above.

Emergent ontology
Decoherence explains why we can, in practice, typically treat macroscopic objects as though they have independent existence both before and after our and others' interaction with them, but also why this appearance is, in principle, misleading. Indeed, decoherence reinforces the agential realist ontology with its pervasive entanglement, in which the appearance of separations between objects and the possibility of ascribing properties result from intra-actions within the entangled totality that is ultimately real.
Perhaps, however, Everth and Gurney would not, in the end, disagree with this. When they say that, in decoherence, "something fundamentally new happens," they may be referring not to collapse but merely to the emergence of another robust level of the ontology. Indeed, Everth and Gurney argue that the "pervasive network of constant and extensive intra-actions at the micro-level between quantum systems and the environment is generative of something qualitatively very different at the macroscopic level" (Everth & Gurney, 2022, 14, emphasis in original). One might understand this as the suggestion that decoherence produces a strongly emergent macroscopic level of reality, which is therefore not merely an appearance but has a real ontology of its own; this reading is supported by Everth and Gurney's (2022, 15) claim that this emergent level exercises downward causation. If this emergent macroscopic level of reality is the relevant domain for social theorizing, then the consistency of agential realism as a fundamental ontology is irrelevant for social theorizing. What would matter is that agential realism mischaracterizes the emergent macroscopic level of reality. Barad, however, rejects the proposal that there are multiple ontological levels: "the universe is not broken up into two separate domains (i.e., the microscopic and the macroscopic) identified with different length scales" (Barad, 2007, 85). Speaking directly of the implications of decoherence, Barad repeats this view: "It is not that we live our daily lives in a classical world, rather than a quantum one; the point is that we generally don't notice quantum effects because they are very small (too small to notice without special equipment)" (2007, 279).
As this quote hints, Barad defends this rejection of ontological stratification with the argument that one could, again in principle, still observe these quantum effects, though "one has to know how to identify an entanglement (e.g., where to look for correlations and how to measure them), and generally speaking, this is far from evident" (Barad, 2007, 279). That such recovery of quantum effects is possible is precisely what quantum eraser experiments demonstrate (see Barad, 2007, 310-317). Since the destruction of the interference pattern in a quantum eraser and in decoherence results from the same kind of process, the careful monitoring and destruction of entanglement within the environment could likewise, in principle, reinstate the quantum interference patterns even after an object has decohered (see Greenstein & Zajonc, 1997, 15-16, also quoted above). This is, of course, only so if decoherence, like the quantum eraser, produces improper mixtures; it would not be the case if decoherence reduced the system to a proper mixture (Zeh, 2007, 153). The latter option was, however, questioned in Sect. 3.
Indeed, Franklin and Seifert (2021) argue that the question whether decoherence entails strong emergence is closely connected to the measurement problem. Without collapse, the interpretation favored by Barad, there is no strong emergence, as Alastair Wilson (2022) also argues in the specific context of Everettian quantum mechanics. More generally, Barad is far from alone in rejecting strong emergence in the quantum to classical transition (see, e.g., Bitbol, 2007; Guay & Sartenaer, 2016; Seifert, 2020; Wilson, 2021, Chap. 6). Others disagree (Drossel, 2021; Ellis, 2006; Gambini et al., 2015), but this simply signifies that this is another interpretational disagreement, not a challenge to the consistency of agential realism or to Barad's claim that it remains true at all scales.
What remains true, of course, is that the appearance of classicality that decoherence provides for is very robust, irrespective of whether it provides for a level of ontology of its own. Reinstating the quantum interference is typically practically impossible after decoherence with the environment, especially for macroscopic objects, because the information about, say, the position of the object is so quickly spread in the environment that all the entanglement cannot be destroyed. Perhaps this robustness of the decohered appearance of classicality is ultimately what Everth and Gurney want to highlight. Barad, however, does not seem to disagree. While the appearance of classicality is very robust, this is not an argument against agential realism, according to Barad, or, as she puts it above, "the fact that Newtonian mechanics provides good approximations to the exact quantum mechanical solutions for many macroscopic situations is not evidence against the new epistemology or ontology suggested by my elaboration of Bohr's account" (Barad, 2007, 416). This paper has argued that this claim can be justified if agential realism is meant as a fundamental ontology. If decoherence only produces improper mixtures, then this explains why Barad's relational ontology is consistent with the fact that we typically perceive the world as comprising persistent, individuated objects. Decoherence is, in this sense, not a problem for agential realism but rather central to defending its consistency with this appearance of classicality. What would be challenged by decoherence is a claim that quantum physical effects are large enough at macroscopic scales to be significant. But Barad makes no such claim, as Sect. 4 showed. Her argument that agential realism is relevant for social theorizing must be another one. What that argument might be, and whether it is sound, are important questions, but not questions on which decoherence has any bearing.