Between Physics and Metaphysics — on Determinism, Arrow of Time and Causality

Contemporary physics, with Einstein's two theories (called "relativity", a name that can be interpreted erroneously) and with Heisenberg's principle of indeterminacy (better: "lack of epistemic determinism"), is frequently interpreted as a removal of causality from physics. We argue that this is wrong. There are no indications in physics, either classical or quantum, that physical laws are indeterministic on the ontological level. On the other hand, both classical and quantum physics are, in practice, indeterministic on the epistemic level: there are no means for us to predict the detailed future of the world. Additionally, essentially all physical principles, including the arrow of time and the conservation of energy, could hypothetically be violated (with some exceptions in the world of heavier quarks and, probably, the cosmological arrow of time). However, in contrast to Hume's skepticism, we have no experimental evidence that causality can be removed, or even "suspended", in any case. The text also addresses some didactic issues.


Introduction
The question to what extent we determine our own lives and to what extent they are decided by some external fate weaving our life thread is probably as old as human reflection. It is still vivid: modern cosmology offers the parallel universes of Max Tegmark, which we cannot contact, but in which our twins perform actions parallel to ours.
Since the times of Aristotle, physics has been a starting subject in philosophical discussion: the teleological cause appeared in his Physics. The first great success of this paradigm was Philosophiae Naturalis Principia Mathematica. Newton, applying his own mathematical methodology, explained the free fall described by Galileo. By reasoning on the motion of the Moon around the Earth, he generalized the concept of gravity and applied it to the whole Solar system, including comets and the satellites of Jupiter. These achievements gave physics the splendour of a science that allows one not only to reproduce but also to predict phenomena, with great (perfect?) precision.
Soon came a temptation for physicists: if we knew the initial conditions, we would be able to determine the entire future of the universe: "In fact, if the present state of the universe were entirely similar to the anterior state which produced it, it would give birth in its turn to a similar state; the succession of these states would then be eternal" (Laplace, 1840). Indeed, already in 1801 Gauss managed to calculate the expected position of the first planetoid, Ceres (discovered by chance a few months earlier by an Italian priest, Piazzi), and in 1845 Adams and Le Verrier predicted the existence of Neptune from the small deviations observed in the motion of Uranus. The world seemed to be a perfect clock.
These successes triggered the idea that the motion of the whole universe is fully deterministic. Already before, there was a similar conviction about the immense possibilities of physics, in the words of Archimedes: "give me a point of support and I will move the Earth". Obviously, to move the Earth, the support would have to be massive enough and the lever sufficiently long and stiff. Hence, this statement was, practically (i.e. in the material world), meaningless.
Similarly, Laplace's conviction is meaningless. Even if the equations of motion in classical mechanics are exact (i.e. the mathematics that we invented and/or discovered fits the universe perfectly), solving them, especially in the case of many bodies, presents serious problems. Pluto, until recently the ninth planet, was discovered by pure observation: it is much smaller than Neptune and its orbit is inclined to the ecliptic, so theoretical calculations failed to predict its existence. Even though objects in the asteroid belt are constantly monitored, their huge number presents a serious risk of an unexpected impact on the Earth. Henri Poincaré showed that even for three interacting bodies we cannot perform "exact" calculations; computer-based, approximate, "perturbative" methods must be used.
Here we come to the real problem with Laplace's optimism: in order to know the initial conditions (3D vectors of position and 3D vectors of velocity) of all objects in the universe, we would need to register them. However, there are not enough atoms in the whole universe to do this. Therefore, even if, from the ontological point of view, the fate of the universe could be fully deterministic, it is not available and/or not useful to our knowledge.
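The storage argument can be checked with a back-of-the-envelope calculation. The figures below (about 10^80 atoms in the observable universe, 64 bits per recorded coordinate) are commonly quoted rough estimates of our own choosing, not data from this article:

```python
# Rough estimate: could the universe store its own initial conditions?
# (All figures are order-of-magnitude assumptions, for illustration.)
atoms_in_universe = 1e80     # commonly quoted order of magnitude
coords_per_particle = 6      # 3 position + 3 velocity components
bits_per_coordinate = 64     # one double-precision number each

bits_needed = atoms_in_universe * coords_per_particle * bits_per_coordinate

# Even a utopian memory holding one bit per atom falls short
# by a factor of several hundred.
shortfall = bits_needed / atoms_in_universe
print(shortfall)  # 384.0
```

Whatever precision one assumes, the registry always needs more bits than there are atoms to hold them.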
Half a century ago the term "classical chaos" was invented. It was discovered theoretically that some systems, described by equations more complicated than linear ones (i.e. more complex than Newton's laws), behave in an unpredictable way. However, this is only an illusory chaos: simply, the initial conditions influence the final state much more than in linear systems. In principle, if we knew these conditions with sufficient accuracy, we could predict the behaviour of the system. Nevertheless, this is impossible even in the simple case of a six-sided die, not to mention the thermodynamics of weather forecasts. Again, what is ontologically deterministic is not such epistemically.
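Sensitivity to initial conditions can be illustrated with a minimal numerical sketch. The logistic map below is a standard textbook example of classical chaos, chosen by us; it is not a system discussed in this article:

```python
# The logistic map x -> r*x*(1-x) with r = 4 is fully deterministic,
# yet nearby trajectories diverge exponentially fast.
def logistic_trajectory(x0, r=4.0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)   # a 1e-10 change in the start

early_gap = abs(a[5] - b[5])                                 # still tiny
late_gap = max(abs(x - y) for x, y in zip(a[40:], b[40:]))   # order unity
print(early_gap, late_gap)
```

After a few dozen iterations the two "identical" experiments bear no resemblance to each other, although every step obeyed the same exact equation.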
Finally, still within classical physics, but extended by Einstein beyond Newton's equations, gravity is not as simple as we teach in schools. Even if the equation of the general relativity can be represented mathematically in a beautiful, concise form, G = (8πG/c⁴) T (without the so-called cosmological term; here G is the tensor of space-time curvature, T the tensor of mass-energy, G the gravitational constant and c the velocity of light), in practice these are ten coupled, differential-integral equations. Quoting Michał Heller, Polish priest and cosmologist, the above equation contains some 10,000 terms, out of which we know now five or six: the laws of Newton (the second law of dynamics and the Newtonian gravity), the gravitational time dilatation (Pound-Rebka experiment), the space-frame dragging (Lense-Thirring effect), and the geodetic precession (de Sitter effect). The three latter are extremely small, but measurable (and applied as small corrections in the Global Positioning System). It will take us a few centuries to find explicitly the next terms in Einstein's equation.
Therefore, our limits on predicting the future of the universe on the grounds of classical physics are purely epistemic: few equations can be solved exactly, i.e. analytically, and for the others approximate methods must be used; we cannot gain all the information needed; and even if we gained it, there would be no way to store and process it.
Summing up classical mechanics: nothing indicates that the physical world is not fully deterministic. Newton's and Einstein's equations are mathematically exact, so the future is exactly determined. However, we do not know how to solve these equations analytically and we do not know all the starting parameters, so from the epistemic point of view the classical world is practically indeterministic.

Quantum mechanics
Quantum mechanics, apparently, changed the certainty of the classical world. Heisenberg's principle states that some pairs of quantities, like the position and the velocity, or the energy and the time, are non-commuting, i.e. cannot both be measured with arbitrarily chosen precision. We still lack an exact mathematical limit, see (Erhart et al., 2012; Fujikawa et al., 2013), but it is usually quoted that the indetermination in the measurement of the position, Δx, times the indetermination in the measurement of the momentum, Δp (the momentum being the velocity times the mass of the particle), is not smaller than Planck's constant h divided by 2π: Δx · Δp ≥ h/2π. The reasons that brought Werner Heisenberg to this principle are not clearly explained, even at the university level. He himself recalls attempts, in February 1927, to describe the trajectory of an electron in a fog chamber. Triggered by discussions with Einstein on the observable world, he realized that what is observed is not an intact trajectory but a series of points where the electron collided with gas particles. We merely assume that the electron travelled between these points (with a certain velocity). Therefore, observations of the position and the velocity of the electron are mutually exclusive, i.e. complementary.
The material reason for the principle of indeterminacy is that, in order to measure the position of an electron, a probe particle must be sent to collide with it. A quantum of light, i.e. the photon, is a good probe projectile. In order to determine the electron's position with high accuracy, a photon with a short wavelength should be sent. However, on the basis of the wave-matter dualism, the shorter the wavelength of the photon, the higher its momentum, and the higher the recoil momentum transferred to the electron, i.e. the higher the change in the electron's velocity. In this way one arrives at the Δx · Δp dependence.
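As a numeric illustration of this limit, one can ask how large the velocity indetermination becomes when an electron is confined to atomic dimensions. The scenario (Δx of about one Ångström) is our own illustrative choice, not an example from the article:

```python
# A numeric illustration of the dx * dp >= h/(2*pi) limit discussed
# above, with constants in SI units.
h = 6.626e-34          # Planck's constant, J*s
hbar = h / (2 * 3.141592653589793)
m_e = 9.109e-31        # electron mass, kg

dx = 1e-10             # confine the electron to ~1 Angstrom (atomic size)
dp_min = hbar / dx     # minimal momentum indetermination
dv_min = dp_min / m_e  # corresponding velocity indetermination

print(f"dv >= {dv_min:.2e} m/s")
```

The result is of the order of a million metres per second: for an electron inside an atom, the indeterminacy is not a small correction but comparable to the orbital velocity itself.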
An ontological interpretation of Heisenberg's principle would say: "In the moment of the measurement, the system jumps into one state of the many probable states, without any apparent reason to jump into one or another possible state" (Silva, 2013). Heisenberg, in particular in his discussions with Einstein, maintained that quantum mechanics removed determinism from physics: we cannot predict the path of an electron. However, in our interpretation, Heisenberg's principle is not a kind of philosophical rule governing the universe: it is a precise, experimental limit posed on our knowledge of the micro-world. The position and the velocity of an electron are always "exact": it is we (i.e. physicists) who cannot (using other material particles) measure both the position and the velocity. The indeterminism resulting from Heisenberg's principle is not ontological but epistemic, i.e. it concerns the very nature of our knowledge, not the electron itself.
A similar conclusion regards the so-called matter-wave dualism. Particles of the micro-world behave in some conditions like material points and in other conditions like waves of matter. What is the reason? Why do particles sometimes appear as material points and sometimes as diffused waves? An analogous question is: what does a coin look like? It depends on the side of the coin. The same holds for the dualism: a specific nature appears in a given, projected experiment. When an electron strikes a scintillation pixel on a TV screen, it behaves like a particle; if it goes through a thin sample of silicon in an electron microscope, it behaves like a wave. Again, the dualism is nothing mysterious: physicists know the equations and know how to handle electron beams experimentally (Karwasz, 2005).
Two interesting aspects are related to the dualism. The first one is analogous to Young's interference experiment, in which a beam of light is sent through two narrow slits. On the screen we get not the shadows of the slits but a series of dark and bright fringes. The same pattern can be obtained with electrons, even if we send them one by one. In common-sense terms, it seems that a successive electron "knows" that the previous one chose slit A, so it "decides" to go through slit B. Again, the final pattern on the screen is strictly governed by the equations of quantum mechanics, but we cannot predict the path of any successive electron. Better: we could monitor the path of the electrons, say, by illuminating them with light before the slits. Then the interference pattern disappears: the electrons do not show their wave character. The electrons are the same, but the epistemologies are different.
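The fringe pattern itself follows from adding two amplitudes, one per slit, and squaring the sum. A minimal idealized sketch (point slits, arbitrary units; the parameters are chosen for illustration, not taken from the article):

```python
import math, cmath

# Each electron's detection probability is the squared modulus of
# the sum of two amplitudes, one per slit.
wavelength = 1.0
d = 5.0            # slit separation, same arbitrary units
k = 2 * math.pi / wavelength

def intensity(sin_theta):
    phase = k * d * sin_theta            # path difference d*sin(theta)
    amp = cmath.exp(0j) + cmath.exp(1j * phase)
    return abs(amp) ** 2                 # probability ~ |psi_A + psi_B|^2

center = intensity(0.0)                          # bright central fringe
first_min = intensity(wavelength / (2 * d))      # d*sin(theta) = lambda/2
print(center, first_min)
```

At the centre the two amplitudes add (intensity 4 in these units); at the first minimum they cancel exactly. Adding probabilities instead of amplitudes, as one would after "which-slit" monitoring, erases the fringes.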
The second important experiment related to the dualism is quantum cryptography. Its origin is to be traced back to the so-called Einstein-Podolsky-Rosen paradox. The laws of conservation (of momentum, energy and angular momentum) hold both in classical and in quantum physics. When an atom emits light, it is one of its electrons that falls from a higher to a lower orbit. The electron, when orbiting, possesses angular momentum. During the transition its angular momentum changes: the difference in angular momentum is carried away by the emitted photon.
Einstein, Podolsky and Rosen asked in 1935 what would happen if a pair of photons with opposite angular momenta were emitted from an atom (such photons are called entangled). According to quantum mechanics, the momenta of these photons are not determined until a measurement is performed. The measurement on one of the photons can give a result up or down. At this point, the angular momentum of the other photon is immediately determined, independently of the distance. This would violate locality (i.e. the special relativity), as the "knowledge" about the other photon propagates quicker than light. EPR phenomena were verified experimentally and became the basis of the successful applications of quantum cryptography (Horodecki et al., 2004). Simplifying, two partners exchange information using entangled photons. Beforehand, they exchange the cryptographic key (via normal correspondence). Then the information is sent as a chain of entangled photons. Differently from traditional cryptography, any attempt to decipher the information without the key makes the whole message disappear. Quoting one of the founders of quantum cryptography, Paweł Horodecki: we know how to produce photons, how to make the computer encode the information into photons, how to transmit them and how to decipher the information, but we do not know why the whole process works. Our ability to use the mathematical apparatus exceeds our capacity to understand the phenomena.
Quantum mechanics caused serious discussions on determinism. Some quite influential authors, physicists and philosophers (Barrow, 1999: 356), introduced a concept of "empty time" in which the non-material world would appear. We do not agree with such "lacunas": there is no need to steer single electrons "by hand" in the diffraction experiment; the Schrödinger equation predicts the final result.
Summing up: physicists still struggle in the hope of "understanding" quantum mechanics: they would like to "know" the trajectory of an electron instead of "expecting" the trajectory to be somewhere, see (Laloë, 2001). Quantum mechanics is the first serious case in which we do not understand nature, even if we can operate on it. So nothing excludes that quantum mechanics could be (ontologically) fully deterministic. The serious point is that this determinism is intrinsically not accessible to us. It seems that metaphysics overrides physics (Margenau, 1941).

Special relativity
Time and space were already the main subject of Aristotle's Physics. For him, space was determined by the objects existing in it: free space was the distance between bodies. The modern understanding of space (and time) came with Galileo, who noticed that the description of motion depends on the chosen system of reference: inside a ship moving with constant velocity, all objects seem to be at rest. A next step was the spatial system of reference introduced by Descartes in the form of three perpendicular axes. These axes extend also to negative values: in space, objects can move in three directions, along both positive and negative arrows. For the time axis, the dependences in the negative direction can be drawn graphically but, obviously, are physically not accessible (any more), as they correspond to the past.
With Immanuel Kant, space and time became abstract categories in which the whole real world is immersed: space and time exist also without the (material) world. Such pure space and time vanished with the works of Einstein. The special relativity removed the absolute time and space. With the general relativity, empty space disappeared: the universe contains matter, and the matter defines both time and space. Einstein's general relativity seems to bring the notions of space and time back to Aristotle (Karwasz, 2016).
Of the restricted (special) theory of relativity it is usually said that all observers have their own systems of reference (i.e. space and time) and that events which are contemporary in one system can happen at different moments in another system of reference. These statements are correct, but differences in time do not change the causality principle. This theory of Einstein's has important consequences. The first is that the velocity of light is the highest with which information can be transmitted. This puts intrinsic limits on our knowledge of the world: we cannot know the cosmological limits of the universe, as already stated by Copernicus (De Revolutionibus, Liber I, 6a): "Why then do we still hesitate to grant it the motion appropriate by nature to its form rather than attribute a movement to the entire universe, whose limit is unknown and unknowable?" The second consequence of the special relativity is more optimistic: looking very far into space, we look also deeply into the past, back almost to the beginning of the universe. Unfortunately, for looking "before" the Big Bang we lack information carriers: the third detection of gravitational waves (Abbott et al., 2017) proved that they travel with the velocity of light.
Einstein's theory does not make events ontologically "relative", which could be understood as indeterminism. It is exactly the opposite: it puts a precise order on actions and results: the space-time distance between them is always governed by the velocity of light, c. No event can happen before a signal of light brings the information from its cause, triggering the effect. The space-time distance is now described by four coordinates: three perpendicular space axes, x, y, z, and the time distance (cΔt), expressed in the same units.
The space-time distance is positive for any pair of events linked by the causality relation (i.e. sufficient time has elapsed between the events, given their relative distance in space).
The space-time interval of two events is independent of the observer's motion. "With the sign of (Δs)², nature is telling us about the causal relation between the two events" (Tipler, 2008). Minkowski described the space-time as a particular cone of events that can be connected by the causality relation. In Minkowski's space, every observer has his own cone, with the origin in his "now-and-here", but all cones have identical features: the time is individual for different observers, as already stated by Aristotle in Physics.
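The causal classification of event pairs can be written down directly. A short sketch using the sign convention implied in the text ((Δs)² positive for causally connectable events); the numerical examples are our own:

```python
# Classifying pairs of events by the space-time interval
# (ds)^2 = (c*dt)^2 - dx^2 - dy^2 - dz^2.
C = 299_792_458.0  # velocity of light, m/s

def interval_squared(dt, dx, dy=0.0, dz=0.0):
    return (C * dt) ** 2 - dx ** 2 - dy ** 2 - dz ** 2

def causally_connectable(dt, dx):
    # A light signal can link the two events only if (ds)^2 >= 0.
    return interval_squared(dt, dx) >= 0

# One second apart in time, one metre apart in space: easily causal.
print(causally_connectable(1.0, 1.0))     # True
# One nanosecond apart, one metre apart: light covers only ~0.3 m.
print(causally_connectable(1e-9, 1.0))    # False
```

All observers compute the same sign of (Δs)² for a given pair of events, which is why the causal order is not "relative".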
The restricted relativity brings some "paradoxes": moving objects seem shorter and the time in moving reference systems goes slower. The first experimental evidence of the relativistic stretching of time was the elongation of the lifetime of unstable particles: muons coming from the cosmic radiation and travelling almost at the velocity of light reach the Earth's surface thanks to the relativistic dilatation of their time-frame (Rossi, 1941). On the contrary, a fast-moving stick fitting (hypothetically) into a barn (Tipler & Llewellyn, 2009) still remains solely a Gedanken experiment.
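The muon argument can be quantified with typical textbook values; the rest lifetime of 2.2 μs and the speed v = 0.999c below are our illustrative assumptions, not figures from Rossi's paper:

```python
import math

# Muons from cosmic radiation: without time dilation they would
# decay long before reaching the ground.
C = 3.0e8            # velocity of light, m/s
tau = 2.2e-6         # muon mean lifetime at rest, s
beta = 0.999         # v/c for a fast cosmic-ray muon

gamma = 1.0 / math.sqrt(1.0 - beta ** 2)              # ~22
mean_path_classical = beta * C * tau                  # ~660 m
mean_path_relativistic = gamma * mean_path_classical  # ~15 km

print(f"gamma = {gamma:.1f}: {mean_path_classical:.0f} m "
      f"-> {mean_path_relativistic/1000:.1f} km")
```

Without dilation the mean path is a few hundred metres, far less than the ~15 km depth of the atmosphere where muons are produced; with the factor γ the observed flux at sea level becomes understandable.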
We note formally that nothing is said in the special relativity about the sign of the time t (positive or negative); the only requirement for causality is the positive value of the space-time interval (Δs)².

General relativity
Einstein's restricted relativity was born from considerations on motion with a constant velocity. The question of accelerated motion brought the result that acceleration and gravity are, to some extent, identical. This is the core of the general relativity.
A really unexpected issue, in the very first instance for Einstein himself, was that the universe could not be stable: it must expand continuously in order not to collapse under the omnipresent forces of gravity. Moreover, there must be a mysterious force powering this expansion (Perlmutter, 2003): the dark energy, which together with the dark mass exceeds the whole matter perceptible to us by a factor of 20. We observe only 5% of the existing matter and still have no clear idea of what the dark mass and the dark energy could be. For the moment, scientists speculate, moving within already used ideas, like hidden dimensions, or a new type of elementary particle, "axions", giving explicitly the time arrow (Wilczek, 2016). For the moment, the proposed solutions remain only vague hypotheses. Therefore, through the equation of the general relativity, the very universe is the proof of its beginning in time: the expanding universe is the cosmological arrow of time (López, 2018). However, mathematically speaking, nothing would forbid the universe to collapse (and the cosmological arrow of time to go backwards). Einstein's relativity, especially the general one, brought space and time back to Aristotle's matter: it is the omnipresent matter that defines space and time. No vacuum exists in any place of the universe.
The limit of the light velocity on any transfer of matter and/or information surprised even Einstein. In 1935, he and Nathan Rosen worked on unifying electromagnetism with the general relativity theory (Einstein & Rosen, 1935). They elaborated the idea of a double funnel connecting separate sheets of space-time, thus allowing one to "cheat" the light-velocity limit. These purely hypothetical structures were called "wormholes". The problem is that either such structures are unstable or any material object (light included) gets captured inside. Wormholes are now judged a wrong step done by a genius (Lindley, 2005).
The ideas of travelling faster than light via folding the space-time come back in other forms. A Mexican physicist, Miguel Alcubierre (1994), speculated about another kind of energy, a negative one, which would be needed for such travels. Moreover, a travel to the nearest star would consume negative energy equivalent to the mass of Jupiter. In other words, to travel through the universe we would gradually annihilate it. A really costly solution…

So the whole of physics, classical, quantum and relativistic, puts a precise order on events: all are governed by strict mathematical rules. There are no cracks in the ordered happening of actions and results. In other words, all physics points towards determinism and, as a consequence, towards the causality principle. This determinism is governed by the precise order of the time line, going from the past to the future. And time and space are, following Aristotle and Einstein, intrinsically related to the material world. Therefore, before discussing the philosophical issues of the causality principle, we come back to the question of the negative time arrow.

The principle of causality
The existence of causal relations was questioned by Hume some three centuries ago. As recalled recently by John Polkinghorne (1989), in spite of a continuous debate, "The fact is that our actual knowledge of the causal structure of the physical world is still patchy and incomplete." Inverting the time (or travelling freely in time) would, it is believed, allow one to change the future. Therefore, it would violate determinism.

Time arrow
One of the favourite ways for science-fiction authors to change the present (and in this way to intervene in the causality chain) is to travel in time, or to invert the time flow. Changing now an event that happened in the past would, it seems, change the present and the future.
In the whole of classical physics, including Maxwell's laws of electromagnetism, time and space are mathematical objects, like the numerical axis in mathematics, i.e. continuous and extending from minus infinity to plus infinity. If we watch a film of two colliding balls and the film runs backwards, nobody can notice the difference: instead of going from right to left, the balls go in the opposite direction, but all the laws of mechanics work perfectly.
Also in electrodynamics the time is perfectly symmetric. Maxwell's laws allow one to use (−t) instead of (+t); this changes only the sign of the voltage created. One of the best teachers ever and a Nobel prize winner, Richard Feynman, in his famous "Lectures", when speaking about electromagnetic waves, stated clearly: "Now let us assume that the sign of the time axis is plus". Solving Maxwell's equations with the plus sign, we get "retarded potentials", i.e. an electromagnetic wave travelling in space (in any direction) and into the future. Nothing, in principle, would forbid using "anticipated" (advanced) potentials and making an electromagnetic wave (i.e. a message) travel back in time.
A global time arrow appears in the expanding universe, but this arrow is purely experimental. It appears also in thermodynamics, i.e. in phenomena of heat exchange. It is trivial that when mixing a glass of hot water with a glass of cold water we get tepid water. And it is not possible any more to separate the hot molecules from the cold ones: in fact, they have exchanged their kinetic energies and on average they are tepid. However, this is only the mean energy: there are always some molecules which are somewhat hotter and others somewhat cooler; we do not know which.
James Maxwell invented a microscopic "demon" who could separate the warm molecules back from the cold ones. The demon opens a gate when a warmer molecule approaches it. By closing and opening the gate at the appropriate moments, the liquid (or gas) can be separated into two parts of different temperatures. This would invert the thermodynamic arrow of time. However, there is one problem.
In order to open and close the gate, the demon has to acquire and process information. However, information is physical. "It is stored in physical systems like books and memory sticks, transmitted by physical means - for instance via electrical or optical signals - and processed in physical devices. Therefore, it must obey the laws of physics, in particular the laws of thermodynamics" (Lutz & Ciliberto, 2015). These laws say that processing information consumes energy: the energy gained from the demon's possible work is never higher than the energy he consumes to process the information.
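The cost of the demon's bookkeeping has a precise quantitative form in Landauer's bound: erasing one bit of information dissipates at least k_B·T·ln 2 of energy. A minimal evaluation (the choice of room temperature is ours):

```python
import math

# Landauer's bound: minimum energy dissipated per erased bit.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

e_bit = k_B * T * math.log(2)
print(f"{e_bit:.2e} J per erased bit")
```

About 3·10⁻²¹ J per bit seems negligible, but the demon must process information on every molecule it sorts, and this bound guarantees that his bookkeeping always costs at least as much as the work his sorting could yield.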
Therefore, in thermodynamics one could reverse the arrow of time, but this would consume energy. Again, we need energy if we want to force the laws of physics. And, knowing that we cannot create energy or mass from nothing, such an inversion of time on a global scale is outside our material possibilities.
Further evidence for the time arrow comes from unstable elementary particles. Schrödinger's equation did not include relativity. In 1928, Paul Dirac proposed a relativistic equation to describe the dynamics of the electron. This equation remains valid if the electrical charge of the electron (which is negative) is changed into a positive one. Soon, such positive electrons (positrons) were discovered in the cosmic radiation. They are complementary to electrons: the same mass, spin and magnetic moment, the opposite electric charge, point-like and stable in vacuum; when meeting electrons, they annihilate in nanoseconds (Karwasz et al., 2004).
A question was raised whether a world with inverted electrical charges (positive electrons and negative protons) would be the same as ours. In 1957, an experiment on the radioactive decay of cobalt showed that changing the electrical sign also changes the orientation of the rotation of elementary particles: say, electrons rotate (around their own axis) left and positrons right.
In 1918, a German mathematician, Emmy Noether, stated that the symmetry of space (up-down, left-right) corresponds to the conservation of momentum (i.e. of the velocity as a vector, considering also its direction). Further, the conservation of energy is assured by the symmetry of the time axis, and the conservation of angular momentum by the symmetry of rotations, clockwise and anticlockwise. Now, with the symmetry of rotation broken, physicists challenged nature to check whether a combined symmetry, charge (C) plus parity (P), is conserved. Studies of the decay of another "exotic" particle, the kaon, showed in 1964 that not even the CP symmetry is conserved. Only in 2000 did a series of experiments, see (Geer et al., 2000), show that the combined symmetry CPT is conserved within experimental uncertainties. This means that for kaons the time is asymmetric.
Experiments on quarks of the third generation (heavier and several orders of magnitude less stable than kaons) showed that for these very exotic particles (which, physicists suppose, existed only in the first instants after the Big Bang) the time is even more asymmetric. In detail, for quarks of the second generation (called strange, a component of kaons) a reference number for the breaking of the time symmetry is 0.002; for bottom quarks (much heavier, of the third generation) such a reference number is as much as 0.1 (Barlow, 2014).
If the very early universe (in the first 10⁻¹² s) consisted of heavy quarks, the time arrow was decisively asymmetric. Normal quarks (i.e. the present-day quarks, up and down, that the whole matter is now made of) do not show any measurable time asymmetry.
In conclusion, even if breaking the time symmetry is possible (not forbidden), the whole universe is decisively subject to the positive time arrow (i.e. aging). Note, however, that inverting the time arrow does not say anything about the causality principle. Moreover, the question of sending messages back in time (from the future to the present) needs some additional considerations.

Messages from the future
The mathematical symmetry of space would suggest a mathematical symmetry of time. But the physical (i.e. filled with material particles) time-space could be not fully symmetric, as indicated by the time asymmetry for heavier quarks. Physical theories hardly explain why.
Our hypothesis is that the propagation of signals back in time is possible but highly damped. A physical analogy is a mirror. Why does a mirror (i.e. glass covered on the back side with silver) reflect the light? Because silver reflects light? This is not a sufficient explanation. Silver reflects the light because the propagation of light inside silver is strongly damped. The light enters only the surface layers; it is quickly attenuated and therefore sent back.
So, are signals from the future possible? Even if they are, they are highly damped: they propagate from the near future, and are more intense if the importance of the message is higher. But they are also noisy, so confused that one could not prevent the predicted events from happening. Messages from the future may be possible, but they must resemble watching TV (which we cannot influence). So seeing the future does not break the chain of actions happening in the present: the principle of (ontological) determinism is not violated.

Causality
The main misconception that can emerge from a superficial understanding of Heisenberg's principle is that quantum mechanics introduced casual (contingent) events: that the path of a single electron is not governed by any rule but is completely random. This is not the case: the path of every electron is such that the final picture is the interference fringes. The "behaviour" of a single electron is governed by Schrödinger's equation, which defines the probability of a given path. The sum of these probabilities, via the rules of statistics, always gives what is predicted by the equations of quantum mechanics.
From the very beginning of quantum mechanics (Bohm, 1980; Schroer, 2011), numerous attempts have been under way to find some hidden variables that would determine the path one-to-one; nowadays these attempts fail to gain any wider scientific acceptance. We do not need hidden variables: the position of every electron is well determined at any moment (we could even illuminate it, but then the interference picture disappears); it is only that we cannot know it.
But the concept of causality is more stringent than determinism. Determinism means that two events touch each other in Minkowski space. The search for causality is more abstract: causality means that events form longer chains. These chains can be ramified: not only events directly "touching" one another in Minkowski space are related, but also those which seem to be independent. This is the main difficulty in proving causality.
In analyzing a chain of events, two types of reasoning should be distinguished: 1) an event will have an effect; 2) if we observe an effect, there was certainly a cause. We show this distinction in the diagram below.
David Hume stressed that we observe only separate facts, and that deducing a cause-effect relation connecting them is only our supposition. While Heisenberg doubted determinism, Albert Einstein (1994) questioned even causality, in particular as proposed by Kant: "Hume noticed clearly that some concepts, for example causality, cannot be deduced with logical methods from experimental data. Kant, being strongly convinced that some concepts are indispensable, exactly those which proved such in practice, interpreted them as indispensable premises of any speculation, and distinguished them from concepts of experimental origin. Instead, I am convinced that such a distinction is erroneous… All concepts, also those closest to experience, are from the logical point of view freely chosen conventions, as in fact is the concept of causality, from which this type of problem originates."
The EPR paradox can be considered an argument against quantum mechanics (which predicts entanglement), and/or against causality, and/or against the locality of events. If the second, entangled photon (in a remote lab) inverts its spin when the first one (in the local lab) is detected, then either there is some non-local field governing the whole of space, or the two photons exchanged "information" with a velocity exceeding the speed of light.
The EPR experiment, translated into an experimental procedure via the so-called Bell inequalities, has undergone an exhaustive series of checks starting from the seventies of the last century. The aim of these experiments is to prevent the photons from exchanging "information", for example via the connecting cables of the computers. Up to now, all experiments confirm EPR. Alain Aspect (2015), the founder of this branch of experiments, says that in view of EPR we should renounce local realism.
Does EPR violate causality? Can we influence events in the remote lab? No! As I learned from an Italian pioneer of such "tele-transportation", Francesco De Martini (Boschi et al., 1998), we have information on the state of the photon in the remote lab but

4 Mathematically, the propagation of light is described by a sine wave, that is, by the sum of exp[±i(kx − ωt)]. The attenuation of light is described by exp[±(kx − ωt)]. The difference is the imaginary unit i, which results from a minus sign in the equation of propagation.
1) Known action → expected result?
2) Probable reason? → Observed state

this is completely useless. In order to use this information, i.e. to perform an action in the remote lab, we need to send it to the remote lab. And this can be done only in a classical way (i.e. within special relativity): by telephone, at the velocity c. The causality principle survives the EPR test. Causality also survives the hypothesis of Einstein's (1935) space wormholes: no material particles, including photons, could travel through such wormholes, even if they (i.e. the wormholes) had existed (Fuller & Wheeler, 1962). However, criticism of Hume comes from all over modern science: the conviction of the causality principle is the basis and motivation of any research. "I cannot literally observe the causal relation between a mosquito on my arm and the itch that follows its departure. But my causal inference is based on strong background knowledge", writes Patricia Churchland (2016) in "How Biology Influences Philosophy".
"Proving" causality is a difficult, or more precisely, an impossible task. In other subjects it is enough to show the existence of a single example, say of a unicorn, to know that unicorns exist. Normally it is more difficult to show the non-existence of unicorns, as the whole globe would have to be searched. In the case of causality, one should find an event that occurred without a cause. The question is always the same: was there no cause at all, or are we simply unable to identify it?
To repeat: modern physics, from the time of the EPR paradox, has removed local realism from the principles of science: "entangled" events can happen simultaneously at any distance. But this violation of locality is only apparent: no (material) information can be exchanged with a velocity higher than c. It is hard to understand the mathematical details of the EPR phenomena (or better, of quantum entanglement), but there is no doubt that the term locality needs a modified meaning. Leonard Susskind writes (Susskind & Friedman, 2014): "Some people think so. Einstein railed against the 'spooky action at a distance' (spukhafte Fernwirkung) that he claimed was implied by quantum mechanics. And John Bell became almost a cult figure by proving that quantum mechanics is nonlocal. On the other hand, most theoretical physicists, particularly those who study quantum field theory, which is riddled with entanglement, would claim the opposite: quantum mechanics done correctly ensures locality. The problem, of course, is that the two groups mean different things by locality. Let's begin with the quantum field theorist's understanding of the term. From this point of view, locality has only one meaning: it is impossible to send a signal faster than the speed of light."
Therefore, in spite of all the various attempts, starting from space-time wormholes and including the EPR paradox, modern physics has not removed the principle of causality from nature. Hume's objection to the principle of causality is that it cannot be proved on the basis of experimental evidence. It cannot be deduced from first principles, either.
However, in mathematics we have a third way of proving laws (neither experimental nor deductive): the procedure of mathematical induction. It consists in: i) showing that the statement holds for the first case, n = 1; ii) assuming that it holds for a given n; iii) proving that its validity for n implies its validity for n + 1. There is no need to prove the whole chain of relations for n = 2, 3, etc. separately.
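The three steps above can be written compactly as the standard induction schema for a property P(n) of natural numbers:

```latex
\[
\bigl[\, P(1) \;\wedge\; \forall n\, \bigl( P(n) \Rightarrow P(n+1) \bigr) \,\bigr]
\;\Longrightarrow\; \forall n\, P(n)
\]
```

The base case P(1) and the inductive step P(n) ⇒ P(n+1) together license the conclusion for the entire infinite chain, without verifying each case individually.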
The physical world is surely more complex than simple mathematical relations between natural numbers. So the existence of mathematical induction is not a proof of the causality principle, but only a hint: i) events happen; ii) a successive event is always caused by the preceding one; iii) so, on the basis of mathematical induction, we believe in the whole chain. Obviously, as with predicting the future, the above reasoning is not a proof of causality, but only a possibility.