In line with previous editions, the Time Machine Factory 2019 (TM2019) conference focuses on causality and non-locality in physics, on the emergence of situations in which causality can potentially be violated, and on how this relates to the existence of “time machines”.
The conference has the following aims:
Revive interest in the study of time travel, which poses new mathematical challenges and cannot be ruled out by the known laws of physics;
Study the role of causality in fundamental physics, including General Relativity and Quantum Mechanics;
Explore the mathematical, physical and logical consequences of closed timelike curves and other mechanisms which allow time travel;
Contribute to a comprehensive vision of the underlying issues, as well as exploring potential applications to relativistic and quantum metrology and space-time navigation.
Although violations of chronology might seem to contradict common sense and lead to logical paradoxes, “time machines” are not ruled out by the current laws of physics.
The conference will highlight new horizons in physics and astrophysics associated with the possible existence of a Time Machine. Examining these fundamental issues provides an opportunity to initiate new ideas and investigations which explore the role of causality and the interplay between General Relativity and Quantum Mechanics, two theories that on their own have been extensively verified by experiment but have not yet been successfully combined into a single unified theory.
Exactly one hundred years ago, in May 1919, Arthur Eddington and three colleagues observed the gravitational deflection of light predicted by Albert Einstein during a total solar eclipse. After a historical introduction, I'll discuss the relevance of gravitational lensing as an important tool for present-day astrophysics. In the last part of the talk I will examine in some detail the pictures of the shadow of the black-hole candidate at the centre of M87, which were released to the public on 10 April 2019, just one hundred years after the Eddington expedition.
Black holes in equilibrium are fundamental objects predicted by General Relativity. However, real black holes form, evolve and eventually evaporate, and are thus dynamical. Do they have a well-defined boundary? Where? The usual event horizon is global and teleological, and thus not well defined for dynamical black holes. The concepts of dynamical and trapping horizons, based on closed trapped surfaces, are promising alternatives. I will show, however, the fundamental problems inherent to dynamical or trapping horizons. I will then introduce the concept of the core of a black hole, and discuss the possibility that cores can select a unique horizon.
What would happen if you could enter a black hole? You would travel to the future, coming out of a white hole! In fact, the huge gravitational redshift distinguishes two characteristic times for such a process: that of the infalling observer, which is short, and that of an external observer, which is extremely long. I discuss how such a process is allowed by gluing classical metrics without violating causality. On the other hand, the full process is a characteristically non-perturbative quantum phenomenon, involving the superposition of different geometries. I discuss the conditions for this to happen, including an intriguing realisation in the remnant phase of the black hole.
We study the properties of regular black holes using both test and gravitating scalar fields. The main motivation is to discover features that distinguish them from real black holes. One such characteristic is the regularity of the horizon, which is spoiled by the scalar field in spherically symmetric static cases.
In general relativity theory (GRT) one can construct solutions which are related to real physical objects. The most famous one is the black hole solution. It is now believed that in the centre of many galaxies there is a rotating super-massive black hole, the Kerr black hole. Because there is an axis of rotation, the Kerr solution is a member of the family of axially symmetric solutions of the Einstein equations. A legitimate question is: are there other axially or cylindrically symmetric, asymptotically flat solutions of Einstein's equations with a classical or non-classical matter distribution and with the correct asymptotic behaviour, just like the Kerr solution? Many attempts have been made, such as the Weyl, Papapetrou and van Stockum solutions. None of these attempts results in a physically acceptable solution. Often, these solutions possess closed timelike curves (CTCs). The possible formation of CTCs seems to be an obstinate problem in GRT. At first glance, it seems possible to construct causality-violating solutions in GRT. CTCs suggest the possibility of time travel with its well-known paradoxes. Although most physicists believe that Hawking's chronology protection conjecture holds in our world, it can be alluring to investigate the mathematical arguments underlying the formation of CTCs. There are several spacetimes that can produce CTCs. A famous example is the Tipler cylinder. Most of these spacetimes can easily be characterized as unphysical.
The problems are, however, more deep-seated in the vicinity of a (spinning) cosmic string or in the so-called Gott spacetime. These cosmic string models have gained much attention over the last decades. Two cosmic strings approaching each other with high velocity could produce CTCs. If an advanced civilization managed to make a closed loop around this Gott pair, they would be returned to their own past. However, the CTCs will never arise spontaneously from regular initial conditions through the motion of spinless “cosmons” (“Gott's pair”): there are boundary conditions that have CTCs also at infinity or at an initial configuration. If it were possible to fulfil the CTC condition at $t_0$, then at sufficiently large times the cosmons would have evolved so far apart that the CTCs would disappear. The chronology protection conjecture seems to be saved for the Gott spacetime. There are still some unsatisfactory aspects around spinning cosmic strings. If the cosmic string has a finite dimension, one needs to consider the coupled field equations, i.e., besides the Einstein equations, also the scalar and gauge field equations. It came as a big surprise that there exists a vortex-like solution in GRT comparable with the magnetic flux lines in type II superconductivity. Many of the features of the Nielsen-Olesen vortex solution and of superconductivity survive in the self-gravitating situation. These vortex lines occur as topological defects in an abelian U(1) gauge model, where the gauge field is coupled to a charged scalar field. It can easily be established that the solution must be cylindrically symmetric, hence independent of the z-coordinate, and that the energy per unit length along the z-axis is finite. There are two types, local (gauged) and global cosmic strings. We are mainly interested in local cosmic strings, because in a gauge model strings were formed during a local symmetry breaking and so have a sharp cutoff in energy, implying no long-range interactions. It turns out that spinning cosmic string solutions can cause serious problems when CTCs are formed which are not hidden behind a horizon, as is the case for the Kerr metric. One can "hide" the presence of the spinning string by a suitable coordinate transformation in order to get the right asymptotic behaviour without a residue of the angle deficit. One then obtains a helical structure of time, which is not desirable. Further, it is not easy to match the interior to the vacuum exterior while avoiding violation of the weak energy condition (WEC). Many attempts have been made to find a physically acceptable solution, but all have failed. It is clear that an additional field must be added to compensate for the energy shortfall close to the core of the string: the part of the mass density of a rotating string due to its angular deficit is insufficient. In general one can conclude that there is an urgent need for a satisfying physical interpretation of CTCs in this spacetime.
In my talk I will consider the spinning string in conformal gravity, where the interior consists of a gauged scalar field. Conformal invariance in GRT, taken to be exact at the level of the Lagrangian but spontaneously broken, is a recognised alternative for disclosing the small-distance structure when one tries to describe quantum-gravity problems. Moreover, conformally invariant cosmological models could solve the dark energy/matter problem.
We will write the metric as $g_{\mu\nu}=\omega^2 \tilde{g}_{\mu\nu}$, with $\omega$ a dilaton field treated on an equal footing with the Higgs field, and $\tilde{g}_{\mu\nu}$ the “unphysical” metric. By demanding regularity of the action, no problems emerge when $\omega\to 0$. For the vacuum exterior, exact (Ricci-flat) solutions are found with the correct asymptotic features, which can be matched to the numerical interior solution. For global cosmic strings, the existence of CTCs can be avoided or pushed to infinity by suitable values of the integration constants. These constants can be used to fix the parameters of the cosmic string by the smooth matching of the solutions at the boundary. There seem to be no problems in fulfilling the weak energy condition.
Our result could be a further indication that local conformal invariance, spontaneously broken in the vacuum, is a promising method for studying quantum effects in GR, as has been found in many other studies.
R. J. Slagter and S. Pan, 2016, Found. of Phys., 46, 1075
R. J. Slagter, 2019, Phys. of the Dark Universe, 24, 100282
R. J. Slagter and C. L. Duston, 2019, arXiv:1902.06088 [gr-qc] (submitted to Ann. of Phys.)
In calculations of gravitational collapse to form black holes, trapping horizons (foliated by marginally trapped surfaces) make their first appearance either within the collapsing matter or where it joins on to a vacuum exterior. Those which then move outwards with respect to the matter have been proposed for use in defining black holes, replacing the global concept of an event horizon, which has some serious drawbacks for practical applications. I here present results from a study of the properties of both outgoing and ingoing trapping horizons, assuming strict spherical symmetry throughout. Their causal nature (i.e. whether they are spacelike, timelike or null) is investigated, following two different approaches, one using a geometrical quantity related to expansions of null geodesic congruences, and the other using the horizon velocity measured with respect to the collapsing matter. The models treated are simplified, but do include pressure effects in a meaningful way and we analyze how the horizon evolution depends on the initial conditions of energy density and pressure of the collapse. (NOTE: This work has been published in Classical and Quantum Gravity 34 (2017) no.13, 135012 )
The possibility of closed timelike curves in general relativity opens up the physical possibility of time travel. This talk reviews the different quantum mechanical theories of closed timelike curves, and discusses their various advantages and drawbacks. We will discuss whether it is possible to use closed timelike curves to build a time machine.
We propose a time-of-arrival operator in quantum mechanics by conditioning on a quantum clock. This allows us to bypass some of the problems of previous proposals, and to obtain a Hermitian time of arrival operator whose probability distribution arises from the Born rule and which has a clear physical interpretation. The same procedure can be employed to measure the "time at which some event happens" for arbitrary events (and not just specifically for the arrival time of a particle).
The two-state-vector formalism, the entangled histories and the pseudo-density formalisms are attempts to better understand quantum correlations in time. These formalisms share some similarities, but they are not identical, having subtle differences in their interpretation and manipulation of quantum temporal structures [1, 2]. I will show, for instance, that they treat operators and states on equal footing, leading to the same statistics for all measurements. I will discuss the topic of quantum correlations in time and show how they can be generated and analysed in a consistent way using these formalisms. I will also elaborate on an unconventional behaviour of temporal monogamic structures and quantum histories of evolving multipartite systems which do not exhibit global nonlocal correlations in time but nevertheless can lead to entangled reduced histories characterizing evolution of an arbitrarily chosen subsystem.
[1] M. Nowakowski, E. Cohen, P. Horodecki, Entangled Histories vs. the Two-State-Vector Formalism -Towards a Better Understanding of Quantum Temporal Correlations, Phys. Rev. A 98, 032312 (2018).
[2] M. Nowakowski, Quantum Entanglement in Time, AIP Conference Proceedings 1841, 020007 (2017).
The set of quantum mechanical nonlocal correlations is unique and intriguing in many ways. Characterizing this set is expected to cast light on the fundamental physical principles governing quantum theory, those from which the mathematical structure of the theory arises. Recently, we have shown ([A. Carmi and E. Cohen, Sci. Adv. eaav8370 (2019)] and followup works) that this set may largely be derived from the requirement that uncertainty relations, broadly understood, are local in the sense of being independent of the choices made by other parties. Relativistic independence, as we have named this condition, treats nonlocal correlations and uncertainty relations on an equal footing. Furthermore, it implies that quantum mechanics can be as nonlocal as it is without violating relativistic causality thanks to the existence of intrinsic uncertainty.
The notion of relativistic independence can also be encoded in a new kind of nonlocal hidden variables we term "pseudolocal". We have shown that different kinds of quantum hidden variables would lead to backwards-in-time signalling if their values were known [A. Carmi, E. Cohen, L. Maccone, and H. Nikolic, arXiv:1903.01349].
In this talk we shall briefly present these previous works and then build upon them to show how this view gives rise to causal structures. We will demonstrate how such a causal structure tightens the bounds on the set of nonlocal correlations in any physical theory as the number of experimenters, measuring devices and incorporated statistical moments increases. Finally, we will connect the failure of counterfactual definiteness with time-irreversibility and discuss a sense in which entanglement gives rise to the arrow of time.
The violations of the discrete symmetries of charge conjugation (C), parity inversion (P), and time reversal (T) observed in high-energy physics are clearly fundamental aspects of nature. A new quantum theory [1,2] has been introduced to demonstrate the possibility that these violations have large-scale physical effects. The new theory does not assume any conservation laws or equations of motion. In particular, if T violation is turned off, matter is represented in terms of virtual particles that exist only momentarily. However, with T violation turned on, what was the mathematical structure of a virtual particle now traces out an unbounded world line that satisfies conservation laws and an equation of motion. The theory is then analogous to the 5-dimensional "proper time" formalism introduced by Feynman [3], extended by Nambu [4] in the 1950's, and developed as "parameterized relativistic quantum theories" [5]. The important point here is that time evolution and conservation laws are not built into the new theory, but rather they emerge phenomenologically from T violation. In other words, the new theory proposes that T violation is the origin of dynamics and conservation laws. It has experimentally testable predictions and offers new insight into the quantum nature of time.
The talk will include an analysis of the nature of the T violation from known and expected sources such as mesons, neutrinos, and a Higgs-like scalar field. In appropriate parameter regimes, the commutator of the time-reversed versions of the associated T violating Hamiltonian, $\hat{H}_F$ and $\hat{H}_B$, is found to approach the canonical form $[\hat{H}_F,\hat{H}_B]=i\lambda \hat{1}$ where $\hat{H}_B=\hat{T}\hat{H}_F\hat{T}^{-1}$, $\hat{T}$ is Wigner's time reversal operator, $\hat{1}$ is the identity operator, and $\lambda=\langle i[\hat{H}_F,\hat{H}_B]\rangle$ represents the amount of T violation.
[1] J.A. Vaccaro, Quantum asymmetry between time and space, Proc. R. Soc. A 472, 20150670 (2016).
https://dx.doi.org/10.1098/rspa.2015.0670
[2] J.A. Vaccaro, The quantum theory of time, the block universe, and human experience, Phil. Trans. R. Soc. Lond. A 376, 20170316 (2018). https://dx.doi.org/10.1098/rsta.2017.0316
[3] R.P. Feynman, Mathematical Formulation of the Quantum Theory of Electromagnetic Interaction, Phys. Rev. 80, 440-457 (1950), Appendix A. https://dx.doi.org/10.1103/PhysRev.80.440
[4] Y. Nambu, The Use of the Proper Time in Quantum Electrodynamics I, Prog. Theor. Phys. 5, 82 (1950). https://dx.doi.org/10.1143/ptp/5.1.82
[5] J.R. Fanchi, Review of invariant time formulations of relativistic quantum theories, Found. Phys. 23, 487-548 (1993). https://dx.doi.org/10.1007/BF01883726
The description of time in quantum mechanics, in particular in connection with quantum gravity and cosmology, has always presented significant difficulties. One such description is based on the Page and Wootters (PaW) mechanism, which considers “time” as a quantum degree of freedom [1]. Here we give a complete review of the Page-Wootters quantum time mechanism and present experimental illustrations [2] that describe time as an emergent property of quantum correlations, also giving access to a possible test of the Leggett-Garg inequalities.
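As a minimal sketch of the mechanism (stated here in the standard PaW notation, not in the specific setting of the experiments reviewed in the talk): the clock C and the system S are entangled in a global stationary state $|\Psi\rangle$ satisfying
\[
\big(\hat{H}_C \otimes \hat{1}_S + \hat{1}_C \otimes \hat{H}_S\big)\,|\Psi\rangle = 0 ,
\qquad
|\psi_S(t)\rangle \propto \langle t|_C\,|\Psi\rangle ,
\]
and the conditional state $|\psi_S(t)\rangle$, obtained by conditioning on the clock reading $t$, recovers the ordinary Schrödinger evolution $i\hbar\,\partial_t |\psi_S(t)\rangle = \hat{H}_S |\psi_S(t)\rangle$ even though the global state does not evolve.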
[1] D.N. Page and W.K. Wootters, Phys. Rev. D 27, 2885 (1983).
[2] E. Moreva, M. Gramegna, G. Brida, L. Maccone, M. Genovese, “Quantum time: Experimental multitime correlations”, Physical Review D 96 (10), 102005 (2017)
One says that there is a quantum computational speedup when the computation of the solution of a problem is more efficient quantumly than classically. Let us consider, as an example, the simplest case: Bob, the problem setter, hides a ball in a chest of four drawers. Alice, the problem solver, is to locate it by opening drawers (by querying the oracle: is the ball in that drawer?). While in the classical case Alice may need to open up to three drawers, with the quantum algorithm devised by Grover she only needs to open one.
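For readers who want to see the four-drawer case concretely, the following is a minimal numerical sketch (plain NumPy, our own illustration rather than material from the talk) showing that a single Grover iteration, and hence a single oracle query, locates the ball with certainty when there are four drawers:

```python
import numpy as np

N = 4                     # four drawers, i.e. a two-qubit search space
ball = 2                  # index of the drawer hiding the ball (Bob's secret choice; illustrative)

# Oracle: phase-flip the amplitude of the marked drawer (this is the single "query")
oracle = np.eye(N)
oracle[ball, ball] = -1

# Diffusion operator: inversion about the mean amplitude
s = np.full(N, 1 / np.sqrt(N))            # uniform superposition over the four drawers
diffusion = 2 * np.outer(s, s) - np.eye(N)

# One Grover iteration starting from the uniform superposition
state = diffusion @ (oracle @ s)
probabilities = np.abs(state) ** 2

print(probabilities)      # ~[0, 0, 1, 0]: the marked drawer is found with probability 1
```

For N = 4 the amplitude of the marked drawer is driven exactly to 1 after one iteration, which is why Alice needs to "open" only one drawer quantumly, against up to three classically.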
The usual representation of quantum algorithms is limited to the computation of the solution of the problem. We extend it to the process of setting the problem. Bob, who operates on the quantum register B, randomly selects the problem-setting (the number of the drawer with the ball) by an initial measurement on a (possibly incoherent) superposition of all the possible problem-settings. He could then unitarily change it into a desired setting, but for simplicity we omit this operation. Alice, who operates on the quantum register A (initially in an arbitrary sharp state, standing for a blank blackboard), unitarily computes the corresponding solution and reads it by the final measurement. Since the solution is read with probability one, the process between the initial and final measurement outcomes is reversible – no information is destroyed along it.
We physically represent the fact that the problem-setting selected by Bob must be hidden from Alice (it would tell her the solution of the problem) by relativizing the extended representation to her. In the representation with respect to Alice, the projection of the quantum state associated with the initial measurement is postponed until the end of the unitary part of her problem-solving action. After the initial measurement, the quantum state of register B, to Alice, remains the superposition of all the possible problem settings. It represents her complete ignorance of the problem setting selected by Bob. Alice unitarily changes the tensor product of this superposition and the sharp state of register A into a superposition of tensor products, each a problem setting in B multiplied by the corresponding solution in A. Then, by the final measurement of the solution, she selects the problem setting already selected by Bob.
We represent the reversibility of the process between the initial and final measurement outcomes by time-symmetrizing it. In this kind of process, and in the usual way of seeing it, the information that specifies the initial measurement outcome and consequently the final one (in the present example, both are the number of the drawer with the ball) is all selected by the initial measurement; its outcome (encoding the problem setting selected by Bob in register B) undergoes the time-forward unitary transformation until it becomes the state before the final measurement (encoding the solution in register A). The latter measurement just reads the solution encoded in A, without selecting anything. However, things could also be seen in the time-symmetric way: the initial measurement does not select anything, the initial superposition undergoes the unitary transformation that represents Alice's problem-solving action, and the final measurement performs all the selection. The measurement outcome, which encodes the solution in register A, propagates backwards in time by the Parisian zigzag, via the inverse unitary transformation, until it becomes the outcome of the initial measurement, encoding the problem setting in register B.
However, either way of seeing things introduces a preferred direction of time and is thus not symmetric in time. According to the tenet of the Two-State-Vector Formalism, we assume that the initial and final measurements evenly contribute to determining the process in between, namely to selecting the information that specifies either measurement outcome. The half of the information selected by the initial measurement propagates forward in time; that selected by the final measurement propagates backwards in time according to the Parisian zigzag. Since there are many ways of halving the information, we should take all the corresponding time-symmetrization instances in quantum superposition.
This time-symmetrization procedure leaves the extended representation of the quantum algorithm, which is ordinary in character since no observer is shielded from any measurement outcome, unaltered.
It shows that the representation of the quantum algorithm relativized to Alice is a superposition of (partly overlapping) superpositions, the time-symmetrization instances, each a quantum algorithm by itself. In each instance, Alice remains shielded from the information coming to her from the initial measurement, but not from that coming to her from the final measurement. The computational complexity of the problem to be solved by her is correspondingly reduced. All is as if she knew in advance, before performing the unitary part of her problem-solving action, half of the information that specifies the problem-setting and thus the solution of the problem, and could use this information to reach the solution with fewer oracle queries. This accounts for the quantum computational speedup. The fact that the final measurement non-locally changes the state of register B to Alice at the beginning of the unitary part of the quantum algorithm, from a superposition of all the problem settings to that of a reduced part thereof, is of course a form of temporal nonlocality. It cannot be seen in the usual representation of the quantum algorithm, which, by an application of the principle of locality, replaces the initial measurement by its measurement outcome.
The above accounts for the computational speedup of all the quantum algorithms examined. These comprise the major quantum algorithms and cover both the quadratic and exponential speedups. More generally, given an oracle problem, the number of oracle queries required to solve it in an optimal quantum way is that of a classical algorithm (a Turing machine) endowed with the advanced knowledge of half of the information that specifies the setting and the corresponding solution of the problem.
The fact that, in each time-symmetrization instance, Alice knows in advance half of the solution she will read in the future and uses this information to reach the solution with fewer oracle queries is a half causal loop. Its physical viability is discussed. The fact that there is apparently information going back in time from the final to the initial measurement is compensated for by the fact that one has to take the superposition of all the instances (apparent backward causality is compensated for by the indeterminacy inherent in the very notion of quantum superposition). This superposition – which is again the quantum algorithm relativized to Alice – is an ordinary quantum mechanical superposition, in which apparently no information is sent back in time.
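As a rough back-of-the-envelope check of the query counting stated above (our own illustration, assuming the oracle problem is unstructured search): for a database of $N = 2^n$ items the solution is specified by $n$ bits, so advance knowledge of half of them leaves
\[
2^{\,n/2} = \sqrt{N}
\]
candidate items, and a classical solver therefore needs $O(\sqrt{N})$ oracle queries – the same scaling as Grover's algorithm.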
References
Ekert, A. K. and Jozsa, R.: Quantum Algorithms: Entanglement Enhanced Information Processing arXiv:quant-ph/9803072 (1998)
Dolev, S. and Elitzur, A. C.: Non-sequential behavior of the wave function. arXiv:quant-ph/0102109 v1 (2001)
Castagnoli, G. and Finkelstein, D. R.: Theory of the quantum speedup. Proc. Roy. Soc. A 457, 1799-1807 (2001)
Castagnoli, G.: The quantum correlation between the selection of the problem and that of the solution sheds light on the mechanism of the quantum speed up. Phys. Rev. A 82, 052334-052342 (2010)
Aharonov, Y., Cohen, E., and Elitzur, A. C.: Can a future choice affect a past measurement outcome? Ann. Phys. 355, 258-268 (2015)
Elitzur, A.C., Cohen, E., Okamoto, R. and Takeuchi, S.: Nonlocal position changes of a photon revealed by quantum routers. Sci. Rep. 8, 7730 (2018)
Prof. Stefano Liberati (SISSA) and Prof. Lorenzo Maccone (Univ. Pavia); Moderator: Mariateresa Crosta (INAF-OATo)
The notion of causality, both local and global, is tied inextricably to the Lorentzian character of spacetime. This is embodied by the causal structure poset which, given weak causality constraints, determines the conformal spacetime geometry. This is the starting point for the causal set approach to quantum gravity, where the underlying continuum is replaced by a locally finite partially ordered set. In this talk I will discuss the role played by causality both kinematically and dynamically in quantum gravity, with a focus on the causal set approach.
The well-known theorem of Choquet-Bruhat and Geroch states that for given smooth initial data for the Einstein equations there exists a unique maximal globally hyperbolic development. In particular, the time-evolution of globally hyperbolic solutions is unique. This talk investigates whether the same results hold for quasilinear wave equations defined on a fixed background. We first present an example of a quasilinear wave equation for which unique evolution of smooth globally hyperbolic solutions in fact fails and contrast this case with the Einstein equations. We then proceed by presenting conditions which guarantee unique evolution. This talk is based on joint work with Felicity Eperon and Harvey Reall.
I present a new gravitational collapse singularity theorem which improves on Penrose's: it does not assume predictability (global hyperbolicity), while being compatible with chronology violation (closed timelike curves) and black hole evaporation.
This talk illustrates a model of spacetime with closed timelike curves proposed in a recent paper (D. Fermi and L. Pizzocchero, Class. Quantum Grav. 35 (2018), 165003, 42pp). This spacetime is diffeomorphic to R4 and carries an ad hoc metric; it consists of a flat outer region and of a “time machine”, formed by a toroidal interface and by an inner flat region. The timelike geodesics of this model, representing motions in free fall, can be analyzed qualitatively and computed analytically by quadratures; in this way, it is shown that a freely falling observer can start from the outer Minkowskian region, travel across the time machine and then return to the initial position at an earlier time, as evaluated by an inertial frame for the outer region with a clock fixed at the initial position. With a suitable choice of the initial conditions, the amount of time travelled into the past according to this fixed clock can be made arbitrarily large, while keeping the duration of the trip according to the traveller's clock moderate; quantitative examples are given.
The price for the above features of the model is the violation of the standard energy conditions in the interface of the time machine. Another problem is the tidal forces experienced by the traveller within this interface: as shown by a quantitative analysis, these are non-destructive for a human being only if the size of the machine (and of the interface) is astronomical. A time machine of this size also has a modest interfacial mass-energy density, much smaller (in absolute value) than the density of water; the energy density remains far below the Planck scale even for a machine of size comparable with the human scale, which ensures that the treatment of these objects via classical physics is correct.
General relativity allows for the existence of closed time-like curves, along which a material object could travel back in time and interact with its past self. Previous studies by Thorne and others showed that for any choice of initial conditions, consistent dynamics — even in the presence of closed time-like curves — exist. Moreover and counterintuitively, they showed that the examples with self-interaction lead to an infinite number of consistent dynamics. While in these previous studies only the initial conditions were subject to the experimenter's choice, we allow for arbitrary operations to be performed in local space-time regions. We find that any such dynamics can be realised through reversible interactions. We further find that consistency with local operations is compatible with non-trivial time travel: three parties can interact in such a way as to all be both in the future and in the past of each other, while remaining free to perform arbitrary local operations. Finally, the states described in our framework are uniquely determined.
While the dynamics of black hole evaporation and closed-timelike-curve physics in the presence of quantum fields are to some extent understood in principle, the computations necessary to produce concrete predictions from them are often intractable in practice. Here we show how tensor-network based numerics, which assign a manageably sparse representation to certain quantum states, can be used to perform them. As a first step we compute the Hadamard-regularized stress-energy tensor of a 1+1-D massive Dirac field in various quantum states, demonstrating the Unruh effect in flat and curved spacetime.
I would like to explore a change in the interpretation of time. By thinking of time as a cut, rather than as a lapse, interesting opportunities could arise. In particular, with such an interpretation, quantum gravity theories based on 3+1 spacetime (e.g. Kuchař's, or Ellis' evolving block universe) may open unexpected and fruitful views. Among the many consequences, there would be no possibility of time machines.
The Hartman effect – first discovered by MacColl in 1932 – is the claimed observation that, when a particle tunnels, it arrives at the opposite side of the barrier the moment it encounters the barrier. If this is so, then sufficiently wide barriers and fast particles should produce superluminal effective velocities. However, such superluminal effective velocities have been dismissed as attributable solely to uncertainty in the initial position of the particle. We examine this position, and further investigate this motif for superluminal velocities and the associated backward time travel – the latter expected in the particle's frame of reference when the particle travels superluminally in the barrier frame.
In this talk I shall review the implications of superluminal travel and the means by which it can be achieved in classical General Relativity. We shall then see, in the specific case of superluminal warp drives, how a preemptive form of chronological protection seems to be at work once their dynamics is analysed within quantum field theory in curved spacetime. Finally, we shall discuss the robustness of this chronological protection with respect to the details of the spacetime structure.
In this talk I will discuss the properties of quantum fields in causal set theory, a theory of quantum gravity in which nonlocality emerges as a consequence of discreteness and local Lorentz invariance. In particular I will present some recent results regarding the computation of entanglement entropy in this context and consider some comparisons with other models of quantum spacetime with particular attention to the fate of Lorentz symmetries.
Harnessing the flow of proper time of arbitrary external systems over which we exert little or no control has been a recurring theme in both science and science-fiction. Unfortunately, all relativistic schemes to achieve this effect beyond mere time dilation are utterly unrealistic. In this work, we find that there exist non-relativistic scattering experiments which, if successful, freeze out, speed up or even reverse the free dynamics of any ensemble of quantum systems present in the scattering region. This "time warping" effect is universal, i.e., it is independent of the particular interaction between the scattering particles and the target systems, or the (possibly non-Hermitian) Hamiltonian governing the evolution of the latter. The protocols require careful preparation of the probes which are scattered, and success is heralded by projective measurements of these probes at the conclusion of the experiment. We fully characterize the possible time translations which we can effect on n target systems through a scattering protocol of fixed duration; the core result is that time can be freely distributed between the systems, and reversed at a small cost. For high n, our protocols allow one to quickly send a single system to its far future or past. In this sense, we have devised a time machine for very small stuff.
Although quantum field theory inherits much of the basic structure laid out by the postulates of ordinary quantum mechanics, it is known that the measurement theory cannot go through unscathed. There are examples of idealised measurements in quantum field theory which produce superluminal signalling. These examples indicate that endowing quantum theory with a relativistic spacetime structure restricts the set of admissible quantum operations. There is, as of yet, no characterisation of these operations. To this end, here we proceed to clarify the causality issues which arise in measurements of quantum fields, as well as characterise a class of permissible measurements.
We report on the results of our ongoing work on reducing the energy requirements of classical warp drives. The existing warp drive solutions by Van Den Broeck and Alcubierre assume spherical symmetry. We show that by considering their counterparts of arbitrary shape, one can reduce the energy requirements by orders of magnitude. Further, I will outline a method of constructing more general classes of warp drives. As a demonstration, we have constructed, for the first time, a warp drive solution with a region resembling the ergosphere of Kerr black holes. I will present the properties of such drives and discuss the possibility of applying the Penrose process to them.
WINGS (Women IN GravitieS) aims to associate women involved in studies of Gravity & General Relativity with applications to Relativistic Astrophysics, Cosmology, Mathematics and Quantum Physics with the purpose of promoting research activities carried out in these fields by women.
The idea is to highlight women's contribution in areas of research that appear to be male-oriented, in the hope of overcoming the cliché of a cultural gender bias based on a supposed female disinterest in difficult subjects, i.e. the so-called hard sciences, such as General Relativity and related fields, on both the theoretical and the experimental side.
As a starting point we would like to propose WINGS as a recurring appointment at any conference dedicated to gravity and fundamental physics and, therefore, to plan a scientific session to promote and give visibility to the research performed by women in the aforementioned fields. Intersectionality will be a key consideration in overcoming the various barriers that hinder access to a scientific career, as will the inclusion of different minorities.
Traversable Wormholes are a prediction of General Relativity. After the detection of the gravitational wave signals in 2015, Traversable Wormholes have enjoyed another renaissance, because they can be considered as Black Hole Mimickers.
In this talk we give a pedagogical introduction and present some theoretical aspects at the classical and semiclassical level, namely when the source has a quantum mechanical origin. A brief description of a Self-Sustained Traversable Wormhole, namely a Traversable Wormhole which is sustained by its own quantum fluctuations, is also presented.
We present wormholes based on the Robinson–Trautman class of spacetimes, which generally contains geometries without symmetries. We focus on a model sourced by a ghost scalar field, investigating its asymptotics, stability and other issues. Within the same family of geometries one can construct a thin-shell model which approaches a simple spherically symmetric wormhole in the distant future. The generalization of the second model to higher dimensions provides a possibility of avoiding the energy condition violation.
For the simplest case of the Ellis wormhole (WH), a fluid moving through this metric is considered. For this purpose, the set of linearized equations composed of the Euler and continuity equations is examined. The propagation of sound waves is considered and the corresponding non-trivial analytical and numerical results are obtained.
We will show two fundamental applications of quantum superpositions of spatially separated states of mesoscopic objects (nano- and micro-spheres). Firstly, we are going to show how convenient it may be to prepare and probe such superpositions through a pure ancillary system such as a spin. Next, we are going to show how entanglement between two such interferometers can be generated purely through the Newtonian interaction between the masses, and how this can be probed, at the end of the interferometry, purely by measuring the correlations between spins. We are going to justify why, under the assumption of locality of physical interactions and under a reasonable definition of classicality, the above entanglement signifies the qualitatively quantum nature of gravity. We are also going to discuss how the same spin-induced and spin-probed superpositions will open up the ability to detect low frequency gravitational waves, immune to initial thermal noise, with a meter-scale apparatus.
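A minimal sketch of the mechanism behind the second application above (our own schematic summary, with illustrative notation rather than the authors'): if each mesoscopic mass $m_{1}$, $m_{2}$ is held in a superposition of two locations, the branch in which the masses are separated by a distance $d_{ij}$ accumulates the Newtonian phase
\[
\phi_{ij} = \frac{G\, m_1 m_2\, t}{\hbar\, d_{ij}} ,
\]
and whenever $\phi_{ij}$ cannot be written as a sum of single-particle phases the joint state of the two masses becomes entangled; the ancillary spins that prepare and later recombine the superpositions inherit this entanglement, so it can be read out purely from spin-spin correlations.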
I will first review the timeless Page-Wootters picture of the quantum universe in which there is no overall dynamics, but where the states of quantum fields evolve relative to the quantum states of the underlying space. I will then introduce the concept of superposing different causal orders – a notion that could be naturally motivated within some approaches to quantum gravity – and ask if and how this phenomenon could be incorporated within the Page-Wootters formalism. I hope to finish by speculating about some possible experimental implications.
The current description of SpaceTime follows Quantum Mechanics principles at the smallest scales, while it is commonly associated with General Relativity on cosmological scales. The two opposing perspectives ultimately differ in terms of discrete versus continuum analysis.
Time seems to vanish in the latest formulations of Quantum theories (Loop Quantum Gravity), as a static spin-foam that describes spatially entangled loops and forgets the importance of Memory. On the other hand, Time is described in relativistic terms as a continuum, an existing block, of which we perceive instants that are part of an always existing world-line, like worms in a 4D-pancake with the tail at birth and the head at the last instant of existence.
It seems that our latest descriptions of Time point at everything (in relativity) or nothing (in quantum theory) but still strive to understand entanglement, coherence and, eventually, evolution in Time.
Recently, several efforts have been made to re-establish a more natural perspective in science, able to address not only the description of the SpaceTime fabric, but also deeper philosophical questions concerning Time and its role in the Universe.
An extended review of the importance of Time in elementary physics, as well as in many other disciplines, is given by Lee Smolin in [1].
The ideas presented here follow a similar path, continuing the effort towards a physical description of SpaceTime that better frames quantum and cosmological scales together with natural evolution.
The aim of this contribution is to propose new conjectures on SpaceTime variables and their description, through the concepts of network, entropy and coherent decoding (borrowed from Information Theory and quantum computation) and to offer a possible wider perspective on SpaceTime fabric and evolution.
Given the physically unnatural existence of a Real continuum and the consequent infinities, and in the context of a universe discrete in space and even in time at the Planck scale (as discussed in [2]), a possible interpretation of the AdS/CFT correspondence, of the SpaceTime fabric and of elementary particle behaviour is proposed, starting from the definition of a new reference frame based on an Absolute Time T[k], an imaginary time ict and the relative momenta.
The Absolute Time is described in the AdS bulk of the Maldacena correspondence. It is represented as entangled memory links (between imaginary points on the surface) that develop, highlighting surface correlations in a wavelet decomposition of the local pulse (phase-shift information with respect to the Absolute Time reference pulse).
The memory links are also described as a deep neural network (growing in T[k]) that stores and projects the evolving surface information, as discussed in [3]. The information stored and projected through the Absolute Time is interpreted, in the context of the SpaceTime fabric, as the most efficient quantum computation network, as proposed also in [4], [5] and [6].
Imaginary time, following Hawking's intuition, maps the surface of the AdS/CFT correspondence to a diffusive space distance in a relativistic and flat space, in coherence c with the pulse of T[k]. It defines, at any given k (Now), the full current 3D space, from -∞@T[k-1] to +∞@T[k+1].
In an evolving 4D-SpaceTime, the perceived 3D Space is interpreted as emerging, in each current Now, as the current configuration of SpaceTime information. It is distributed on the Universe Surface of Existence along ict at a given k, giving two probabilistic Real degrees of freedom in each surface bit, with the correlations derived from the memory links entangled along the past instants of the Absolute Time.
The momenta involved represent the phase variations (along both times, with respect to the relative reference of coherence) and develop as in a logarithmic spiral, following a relativistic description of the information space and coherent time on the surface of the bulk.
A mathematical description of the mentioned momenta in relativistic terms is proposed.
Bosons are described as single qubits of information and elementary vibrating strings, flowing with no inertia on the surface defined by the imaginary time.
Matter elementary particles, as Entities in SpaceTime, are described as Networks of imaginary points sharing a common beat (decoding as a coherent-Self in T[k]). They emerge from the entanglement, in the Absolute Time, of surface strings, which reduce their local degrees of freedom to become interconnected in the bulk, pulsing as a single, persisting Self (debated in [7], [8]).
Following the conjectures proposed, and the parallel with information encoded in the entanglement of surface strings in T[k], the Dirac equation is mapped to Shannon Entropy, as a summary of the information content shown on the surface, mathematically expressed as the sum of the information derived over the variations along the past loopy ticks of the Absolute Time.
The geometry proposed is then applied to Dark Matter, interpreted as diffusive wrinkles in the local fabric, and to Black Holes, as coherent Self and quantum networks showing maximum surface entropy at current T[k].
Black holes are described, at the horizon, as one tick away from the coherent Now (as a result of the local computation of the SpaceTime decoding algorithm on both momenta).
The horizon turns out to be too far out of phase with the local current T[k] coherence to be decoded in the surrounding local 3D space (consequently, there is very little chance of receiving any information on the outside).
Further reasoning on the SpaceTime information-compression algorithm and on its relative computational efficiency as a quantum computer is presented in the context of fabric entanglement (as living memory-roots through past events) and of maximum entropy on the surface (as equivalent to Shannon's maximum information compression).
To illustrate possible wider similarities and consequences of the proposed conjectures, the Evolution of Network-Entities and of information is described as cycles of transformation in ict and of newly gained persistence in T[k], with growing global surface entropy and local complexity, levels of abstraction, efficiency in equilibrium and Self-gained emergent properties.
Finally, a possible interpretation of the Origin is proposed, rewinding both times, in the context of an Absolute Time and an imaginary time emerging from a no-boundary-like model (Hartle and Hawking).
Further developments on the proposed conjectures are still required.
A wider mathematical and physical analysis is suggested to extend the comprehension of information in SpaceTime and to evaluate the implications on telecommunication and energy production.
A deeper philosophical understanding is expected.
The full paper is available at [9].
[1] L. Smolin, Time Reborn: From the Crisis in Physics to the Future of the Universe, 2013.
[2] C. Rovelli and M. Christodoulou, “On the possibility of experimental detection of the discreteness of time” arXiv:1812.01542v2, 2018.
[3] X.-L. Qi, “Exact holographic mapping and emergent space-time geometry” arXiv:1309.6282v1, 2013.
[4] L. Zhou and X. Dong, “Geometrization of deep networks for the interpretability of deep learning systems” arXiv:1901.02354v2, 2019.
[5] L. Zhou and X. Dong, “Spacetime as the optimal generative network of quantum states: a roadmap to QM=GR?”arXiv:1804.07908v1, 2018.
[6] P. Caputa and J. M. Magan, “Quantum Computation as Gravity”, Phys. Rev. Lett. 122, 231302, 2019.
[7] G. Jaroszkiewicz and J. Eakins, “Particle decay processes, the quantum Zeno effect and the continuity of time” arXiv:quant-ph/0608248, 2006.
[8] T. Ullrich, D. Kharzeev and Z. Tu, “The EPR paradox and quantum entanglement at sub-nucleonic scales” arXiv:1904.11974, 17 May 2019.
[9] A. Capurso, “Conjectures on SpaceTime”. Available: http://www.tempiodicrono.net/download/Capurso-Conjectures_paper.pdf
I will present a local, fully quantum-field-theory compliant model of the Aharonov-Bohm effect, in which the Aharonov-Bohm phase is gradually and locally acquired. I will explore the theoretical and experimental implications of this model, especially in regard to locality and causality in quantum theory.
Quantum optical systems present several interesting properties that allow them to be used as a tool for visualizing physical phenomena otherwise the subject of theoretical speculation only, such as Bose-Einstein condensates for Hawking radiation [1] or the Page-Wootters model [2-5].
Closed Time-like Curves (CTCs), one of the most striking predictions of general relativity, are notorious for generating paradoxes, such as the grandfather paradox, but these paradoxes can be resolved in a quantum network model [6], where a qubit travels back in time and interacts with its past copy. However, there is a price to pay: the resolution of the causality paradoxes requires breaking quantum theory's linearity. This leads to the possibility of quantum cloning, violation of the uncertainty principle and the solution of NP-complete problems in polynomial time. Interestingly, violations of linearity occur even in an open time-like curve (OTC), when the qubit does not interact with its past copy but is initially entangled with another, chronology-respecting, qubit. The non-linearity is needed here to avoid violating the monogamy of entanglement. To preserve linearity and avoid all other drastic consequences, we discuss how the state of the qubit in the OTC is not a density operator but a pseudo-density operator (PDO) – a recently proposed generalisation of density operators, unifying the description of temporal and spatial quantum correlations. Here I present an experimental simulation of the OTC using polarization-entangled photons, also providing the first full quantum state tomography of the PDO describing the OTC, verifying the violation of the monogamy of entanglement induced by the chronology-violating qubit. At the same time linearity is preserved, since the PDO already contains both the spatial degrees of freedom and the linear temporal quantum evolution. These arguments also offer a possible solution to the black hole entropy problem.
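For orientation, here is a minimal two-event form of the pseudo-density operator construction (our own schematic recap of the general definition, not the specific state reconstructed in the experiment): for a single qubit probed at two times one defines
\[
R = \frac{1}{4}\sum_{i,j=0}^{3} \langle\{\sigma_i,\sigma_j\}\rangle\, \sigma_i \otimes \sigma_j ,
\]
where $\sigma_0$ is the identity, $\sigma_{1,2,3}$ are the Pauli operators, and $\langle\{\sigma_i,\sigma_j\}\rangle$ is the expectation value of the product of the outcomes of measuring $\sigma_i$ at the first time and $\sigma_j$ at the second. $R$ is Hermitian with unit trace but may have negative eigenvalues, and this negativity is what distinguishes temporal from spatial correlations.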
[1] J. Steinhauer et al., Nature Physics 12, 959–965 (2016)
[2] D.N. Page and W.K. Wootters, Phys. Rev. D 27, 2885 (1983); W.K. Wootters, Int. J. Theor. Phys. 23, 701 (1984).
[3] E. Moreva, M. Gramegna, G. Brida, L. Maccone, M. Genovese, Phys. Rev. A 89, 052122 (2014).
[4] V. Giovannetti, S. Lloyd, L. Maccone, Phys. Rev. D 92, 045033 (2015).
[5] E. Moreva, M. Gramegna, G. Brida, L. Maccone, M. Genovese, Phys. Rev. D in press, arXiv:1710.00707
[6] D. Deutsch, Phys. Rev. D 44, 10, 1991.
[7] C. Marletto, V. Vedral, S. Virzì, E. Rebufello, A. Avella, M. Gramegna, I. P. Degiovanni, M. Genovese, in press.
Recently, hypothetical faint effects in interferometers, connected to a non-commutativity of position variables in different directions originating at the Planck scale, have been considered as a possible signature of quantum gravity. In particular, this idea led to the realization of a double 40 m interferometer at Fermilab with state-of-the-art sensitivity in the MHz domain. Although instruments such as optical interferometers are probably the most sensitive devices currently available, their performance is still limited by shot noise if they are operated with classical light. Quantum metrology allows one to overcome these limits by exploiting the quantum properties of light, and therefore represents a promising avenue for enabling new discoveries.
Here we present an experiment of quantum-enhanced correlated interferometry, showing an improved sensitivity with respect to a single interferometer in revealing faint stochastic noise, such as that predicted by some Planck-scale models. Using quantum-enhanced correlation techniques between two Michelson interferometers, we reach a sensitivity of $10^{-17}\,\mathrm{m}/\sqrt{\mathrm{Hz}}$ at 13.5 MHz in a few seconds of integration time, which is 20 times better than that of a single device. Moreover, by injecting bipartite quantum correlated states, we also demonstrate a sub-shot-noise sensitivity in the comparison of the different interferometers' signals. In perspective, the proposed technique could allow one either to reduce the size of the setup to a table-top scale or to further improve the sensitivity of large setups such as the Fermilab facility.
Contrary Inferences for Classical Histories in the Consistent Histories Approach
Non-relativistic quantum theory is one of the most successful theories in the history of science, since it has been verified experimentally in several different situations and with extremely high precision. Despite the fact that its mathematical formalism is universally accepted, its conceptual foundations have always been a subject of scientific dispute. The standard interpretation is that of the Copenhagen school, which has many conceptual and practical problems. One of the most prominent is the distinction between the classical and quantum world, as well as the issue of the quantum-to-classical transition. Closely related to these is also the famous measurement problem.
An alternative interpretation of quantum theory based on the histories approach is the consistent histories theory [1-5]. The space of states consists of all the possible histories of a quantum system, and the aim is to derive probabilities for the realization of a (coarse-grained) history of this system. The probability of a history, which is defined in relation to the other histories belonging to the corresponding partition of the histories space, is assigned only when a condition defined on this coarse-grained-histories set holds. When such a set satisfies the consistency condition, it is called a consistent histories set (CHS). Unfortunately, there are many CHSs which are not mutually compatible. This leads to the existence of contrary inferences, defined as two contradictory propositions that are each implied with probability one [6]. Of course, this issue does not arise in the classical world. The existence of contrary inferences comes from the existence of zero covers [7]; specifically, from covering the full histories space with two (overlapping) sets of zero quantum measure. It is known that, in quantum theory, many interpretational problems arise because of the existence of zero-quantum-measure covers, e.g. the Kochen-Specker theorem and contextuality [8]. The strangeness of contrary inferences is typically justified by proponents of consistent histories by arguing that these appear at small scales (far from the classical domain), where counterintuitive properties are expected to appear. The technical way to avoid such issues is then to focus on and compare propositions belonging to a single CHS, an assumption justified in the microscopic world, but much less so at classical scales.
In this talk, we give an example of two contrary classical propositions in the context of the consistent histories approach. We analyze the arrival time of a (semi-)classical free particle in an infinite square well. By selecting two different partitions of the histories space, we find a zero-quantum-measure cover consisting of two coarse-grained sets. Thus, we end up with contrary inferences for a classical particle. The consequences of this example for histories formulations of quantum theory will be briefly discussed.
References
[1] M. Gell-Mann and J. B. Hartle. Classical equations for quantum systems. Phys. Rev., D 47:3345–3382, 1993.
[2] R. B. Griffiths. Consistent histories and the interpretation of quantum mechanics. J. Statist. Phys., 36:219–272, 1984.
[3] R. Omnes. Logical Reformulation of Quantum Mechanics. 1. Foundations. J. Statist. Phys., 53:893–932, 1988.
[4] R. Omnes. Logical Reformulation of Quantum Mechanics. 2. Interferences and the Einstein-Podolsky-Rosen Experiment. J. Statist. Phys., 53:933–955, 1988.
[5] R. Omnes. Logical Reformulation of Quantum Mechanics. 3. Classical Limit and Irreversibility. J. Statist. Phys., 53:957–975, 1988.
[6] A. Kent. Consistent sets and contrary inferences: Reply to Griffiths and Hartle. Phys. Rev. Lett., 81:1982, 1998.
[7] P. Wallden. Contrary Inferences in Consistent Histories and a Set Selection Criterion. Found.Phys., 44(11):1195–1215, 2014.
[8] S. Surya and P. Wallden. Quantum covers in quantum measure theory, arXiv:0809.1951 [quant-ph], 2008.
What allowed Einstein to transcend Newton's conception of absolute time was his insistence on an operational definition of time in terms of the measurement of a clock. Quantum theory has yet to be liberated from this absolute time, as evidenced by the Schrödinger equation in which time appears as an external classical parameter.
In this talk I will introduce an operational formulation of quantum theory known as the conditional probability interpretation of time (CPI), in which time is defined in terms of an observable on a quantum system functioning as a clock; in some contexts, the CPI is known as the Page and Wootters mechanism. This clock and the system whose dynamics it is tracking do not evolve with respect to any external time. Instead, they are entangled, and as a consequence a relational dynamics emerges between them.
I will present a generalization of the CPI to the case when the clock and system interact [1], which should be expected at some scale when the gravitational interaction between them is taken into account. I will demonstrate how such clock-system interactions result in a time-nonlocal modification to the Schrödinger equation. I will then examine relativistic particles with internal degrees of freedom that constitute a clock which tracks their proper time [2]. By examining the conditional probability associated with two such clocks reading different proper times, I will show that these clocks exhibit both classical and quantum time dilation effects. Moreover, in connection with quantum metrology, it will be seen that the Helstrom-Holevo lower bound requires that these clocks satisfy a time-energy uncertainty relation between the proper time they measure and their rest mass. Finally, I will show how the CPI constitutes one out of a trinity of distinct but equivalent formulations of the same relational quantum dynamics [3].
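Schematically, and restricting to the non-interacting case as a minimal sketch (the interacting generalization of [1] modifies this), the CPI assigns to the event "the system has property $a$ when the clock reads $t$" the conditional probability
\[
\mathrm{Prob}(a \mid t) = \frac{\langle\Psi|\,\big(|t\rangle\langle t|_C \otimes \hat{P}_a\big)\,|\Psi\rangle}{\langle\Psi|\,\big(|t\rangle\langle t|_C \otimes \hat{1}_S\big)\,|\Psi\rangle} ,
\]
evaluated on the global stationary state $|\Psi\rangle$ of clock plus system; conditioning on successive clock readings then reproduces the usual Schrödinger evolution of the system.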
References:
[1] Quantizing time: Interacting clocks and systems
A. R. H. Smith and M. Ahmadi, Quantum 3 160 (2019)
[2] Relativistic quantum clocks observe classical and quantum time dilation A. R. H. Smith and M. Ahmadi, arXiv:1904.12390 (2019)
[3] The trinity of relational quantum dynamics
A. R. H. Smith, M. P. E. Lock, and P. A. Höhn, Forthcoming (2019)
We study quantum gravity induced quantum causal structure in the context of quantum field theories. We argue both conceptually and numerically that when spacetime is treated quantumly, (1) the exact microcausality condition, (2) exact causal boundaries, and (3) the distinction between particles and antiparticles cannot be maintained. These suggest the possibilities of "time travel" and "tunneling out of black holes", but to examine whether such possibilities can be realized, concrete calculations are needed. We present a method to conduct calculations for quantum field theories on quantum spacetime based on the expansion of Feynman diagrams into worldline diagrams. As a first application, we show that quantum causal structure regularizes matter field UV singularities. This result reinforces previous suggestions from analyzing entanglement in the presence of quantum causality.
Closed timelike curves are striking predictions of general relativity allowing for time-travel. They are afflicted by notorious causality issues (e.g. grandfather’s paradox). Quantum models where a qubit travels back in time solve these problems, at the cost of violating quantum theory’s linearity—leading e.g. to universal quantum cloning. Interestingly, linearity is violated even by open timelike curves (OTCs), where the qubit does not interact with its past copy, but is initially entangled with another qubit. Non-linear dynamics is needed to avoid violating entanglement monogamy. Here we propose an alternative approach to OTCs, allowing for monogamy violations. Specifically, we describe the qubit in the OTC via a pseudo-density operator—a unified descriptor of both temporal and spatial correlations. We also simulate the monogamy violation with polarization-entangled photons, providing a pseudo-density operator quantum tomography. Remarkably, our proposal applies to any space-time correlations violating entanglement monogamy, such as those arising in black holes.
In this talk I describe the mathematics required in order to provide a description of the observables for quantum fields on low-regularity spacetimes. The first step involves constructing low-regularity advanced and retarded Green operators as maps between suitable function spaces. In specifying these we need to use graph norms on Sobolev spaces to ensure that the Green operators are well-defined inverses. The causal propagator is then used to define a symplectic form on a topological vector space $V(M)$. A key point is the way in which the causal propagator on a (non-smooth) globally hyperbolic spacetime restricts to the causal propagator on a smaller causally compatible submanifold and therefore induces a symplectic map between the vector spaces. This property enables one to provide a locally covariant description of the quantum fields in terms of the elements of quasi-local $C^*$-algebras on which one may define canonical commutation relations. I end with a brief discussion on the choice of Sobolev micro-local spectrum condition used to single out the physical states in the low-regularity setting.
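As a rough guide to the smooth-case structure that is being generalized here (a standard textbook-style sketch under the usual smooth, globally hyperbolic assumptions, not the low-regularity function spaces of the talk): if $E^{\mathrm{ret}}$ and $E^{\mathrm{adv}}$ denote the retarded and advanced Green operators of the wave operator, the causal propagator is their difference, $E = E^{\mathrm{ret}} - E^{\mathrm{adv}}$ (up to a sign convention), and the symplectic form on equivalence classes of test functions reads
\[
\sigma\big([f],[g]\big) = \int_M f\,(E g)\,\mathrm{dvol}_g , \qquad f,g \in C_0^\infty(M),
\]
with the canonical commutation relations then imposed as $[\hat{\Phi}(f),\hat{\Phi}(g)] = i\,\sigma([f],[g])\,\hat{1}$ on the generators of the quasi-local $C^*$-algebras.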