
Table of contents

Volume 504

2014


EmQM13: Emergent Quantum Mechanics 2013 3–6 October 2013, Vienna, Austria

Accepted papers received: 06 March 2014
Published online: 14 April 2014

Preface

011001

These proceedings comprise the invited lectures of the second international symposium on Emergent Quantum Mechanics (EmQM13), which was held at the premises of the Austrian Academy of Sciences in Vienna, Austria, 3–6 October 2013.

The symposium was held at the "Theatersaal" of the Academy of Sciences, and was devoted to the open exploration of emergent quantum mechanics, a possible "deeper level theory" that interconnects three fields of knowledge: emergence, the quantum, and information. Could a revised image of physical reality emerge from recognizing new links between emergence, the quantum, and information? Could a novel synthesis pave the way towards a 21st century, "superclassical" physics? The symposium provided a forum for discussing (i) important obstacles which need to be overcome as well as (ii) promising developments and research opportunities on the way towards emergent quantum mechanics. Contributions were invited that presented current advances in both standard and unconventional approaches to quantum mechanics.

The EmQM13 symposium was co-organized by Gerhard Grössing (Austrian Institute for Nonlinear Studies (AINS), Vienna) and by Jan Walleczek (Fetzer Franklin Fund, USA, and Phenoscience Laboratories, Berlin). After a very successful first conference on the same topic in 2011, the new partnership between AINS and the Fetzer Franklin Fund in producing the EmQM13 symposium was able to further expand interest in the promise of emergent quantum mechanics.

The symposium consisted of two parts, an opening evening addressing the general public, and the scientific program of the conference proper. The opening evening took place at the Great Ceremonial Hall (Grosser Festsaal) of the Austrian Academy of Sciences, and it presented talks and a panel discussion on "The Future of Quantum Mechanics" with three distinguished speakers: Stephen Adler (Princeton), Gerard 't Hooft (Utrecht) and Masanao Ozawa (Nagoya).

The articles contained in these proceedings represent the talks of the invited speakers as written immediately after the symposium. The volume starts with a contribution by the organizers Jan Walleczek and Gerhard Grössing, essentially explaining why emergent quantum mechanics, and other deterministic approaches to quantum theory, must be considered viable approaches in quantum foundations today. This is followed by the text of Stephen Adler's opening-evening talk, which introduced a general audience to key questions at the current frontiers of quantum mechanics (the contents of his conference talk appear elsewhere). The proceedings then continue with the presentations in their chronological order, i.e. starting with the opening talk of the scientific program by Gerard 't Hooft. While the page count was restricted for all invited speakers, the paper by Jeff Tollaksen was given more space so that it could also represent the contribution of his invited collaborator Yakir Aharonov, who was unable to deliver a separate talk. Note that the talks of all speakers, including the talks of those who could not be represented in this volume (M. Arndt, B. Braverman, C. Brukner, S. Colin, Y. Couder, B. Poirier, A. Steinberg, G. Weihs and H. Wiseman), are freely available on the conference website as video presentations (http://www.emqm13.org).

The organizers wish to express their gratitude to Siegfried Fussy and Herbert Schwabl from AINS for the organizational support. The organizers also wish to thank Bruce Fetzer, President and CEO, John E. Fetzer Memorial Trust, and the Members of the Board of Trustees, for their strong support and for funding this symposium.

We also wish to thank the Austrian Academy of Sciences for allowing the symposium to be held on their premises, and Anton Zeilinger, President of the Austrian Academy of Sciences, for his welcome address. The expertise of the Members of the Scientific Advisory Board of the EmQM13 symposium, Ana Maria Cetto (Mexico), Lajos Diósi (Budapest), Maurice de Gosson (Vienna), Edward Nelson (Princeton), Theo Nieuwenhuizen (Amsterdam) and Helmut Rauch (Vienna), is also gratefully acknowledged.

Finally, it is a pleasure to again thank Sarah Toms and her team at IOP Publishing (Bristol) for their friendly advice and help during the preparation of these proceedings.

Vienna, Pisa, Berlin, February 2014

Gerhard Grössing,

Hans-Thomas Elze,

Johannes Mesa Pascasio,

Jan Walleczek

The front cover image shows two bouncing oil droplets on an oscillating oil surface, as they are employed by Couder, Fort, Bush, and others to show macroscopic analogues of wave-particle complementarity (courtesy of Dan Harris and John Bush, MIT).

011002

All papers published in this volume of Journal of Physics: Conference Series have been peer reviewed through processes administered by the proceedings Editors. Reviews were conducted by expert referees to the professional and scientific standards expected of a proceedings journal published by IOP Publishing.

Papers

Invited lectures

012001

Does "epistemic non-signalling" ensure the peaceful coexistence of special relativity and quantum nonlocality? The possibility of an affirmative answer is of great importance to deterministic approaches to quantum mechanics given recent developments towards generalizations of Bell's theorem. By generalizations of Bell's theorem we here mean efforts that seek to demonstrate the impossibility of any deterministic theories to obey the predictions of Bell's theorem, including not only local hidden-variables theories (LHVTs) but, critically, of nonlocal hidden-variables theories (NHVTs) also, such as de Broglie-Bohm theory. Naturally, in light of the well-established experimental findings from quantum physics, whether or not a deterministic approach to quantum mechanics, including an emergent quantum mechanics, is logically possible, depends on compatibility with the predictions of Bell's theorem. With respect to deterministic NHVTs, recent attempts to generalize Bell's theorem have claimed the impossibility of any such approaches to quantum mechanics. The present work offers arguments showing why such efforts towards generalization may fall short of their stated goal. In particular, we challenge the validity of the use of the non-signalling theorem as a conclusive argument in favor of the existence of free randomness, and therefore reject the use of the non-signalling theorem as an argument against the logical possibility of deterministic approaches. We here offer two distinct counter-arguments in support of the possibility of deterministic NHVTs: one argument exposes the circularity of the reasoning which is employed in recent claims, and a second argument is based on the inconclusive metaphysical status of the non-signalling theorem itself. We proceed by presenting an entirely informal treatment of key physical and metaphysical assumptions, and of their interrelationship, in attempts seeking to generalize Bell's theorem on the basis of an ontic, foundational interpretation of the non-signalling theorem. We here argue that the non-signalling theorem must instead be viewed as an epistemic, operational theorem i.e. one that refers exclusively to what epistemic agents can, or rather cannot, do. That is, we emphasize that the non-signalling theorem is a theorem about the operational inability of epistemic agents to signal information. In other words, as a proper principle, the non-signalling theorem may only be employed as an epistemic, phenomenological, or operational principle. Critically, our argument emphasizes that the non-signalling principle must not be used as an ontic principle about physical reality as such, i.e. as a theorem about the nature of physical reality independently of epistemic agents e.g. human observers. One major reason in favor of our conclusion is that any definition of signalling or of non-signalling invariably requires a reference to epistemic agents, and what these agents can actually measure and report. Otherwise, the non-signalling theorem would equal a general "no-influence" theorem. In conclusion, under the assumption that the non-signalling theorem is epistemic (i.e. "epistemic non-signalling"), the search for deterministic approaches to quantum mechanics, including NHVTs and an emergent quantum mechanics, continues to be a viable research program towards disclosing the foundations of physical reality at its smallest dimensions.
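For orientation, and not taken from the paper itself, the no-signalling condition the abstract refers to is standardly written as an operational statement about measurement statistics: for outcomes a, b and setting choices x, y of two agents,
\[
\sum_b P(a,b\mid x,y) = P(a\mid x)\ \ \text{for all } y, \qquad \sum_a P(a,b\mid x,y) = P(b\mid y)\ \ \text{for all } x,
\]
i.e. the marginal statistics available to one agent do not depend on the other agent's setting choice, which is the "epistemic" reading emphasized in the abstract.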

012002

Public talk at the EmQM13 conference opening event on "The Future of Quantum Mechanics".

012003

Nature's laws in the domain where relativistic effects, gravitational effects and quantum effects are all comparatively strong are far from understood. This domain is called the Planck scale. Conceivably, a theory can be constructed where the quantum nature of phenomena at such scales can be attributed to something fundamentally simpler. However, arguments that quantum mechanics cannot be explained in terms of any classical theory using only classical logic seem to be based on sound mathematical considerations: there can't be physical laws that require "conspiracy". It may therefore be surprising that there are several explicit quantum systems where these considerations apparently do not apply. In the lecture we will show several such counterexamples. These are quantum models that do have a classical origin. The most curious of these models is superstring theory. This theory is often portrayed as underlying the quantum field theory of the subatomic particles, including the "Standard Model". So now the question is asked: how can this model feature "conspiracy", and how bad is that? Is there conspiracy in the vacuum fluctuations?

012004

We discuss the action principle and resulting Hamiltonian equations of motion for a class of integer-valued cellular automata introduced recently [1]. Employing sampling theory, these deterministic finite-difference equations are mapped reversibly onto continuum equations describing a set of bandwidth-limited harmonic oscillators. They represent the Schrödinger equation. However, modifications reflecting the bandwidth limit are incorporated, i.e., the presence of a time (or length) scale. When this discreteness scale is taken to zero, the usual results are obtained. Thus, the linearity of quantum mechanics can be traced to the postulated action principle of such cellular automata, and its conservation laws to discrete ones. The cellular automaton conservation laws are in one-to-one correspondence with those of the related quantum mechanical model, while admissible symmetries are not.
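As an illustrative sketch only (the specific automaton and the sampling-theory mapping are defined in the paper and in Ref. [1]), a symmetric finite-difference equation of the kind such discrete-time models are related to is
\[
i\hbar\,\frac{\psi(t+\Delta t)-\psi(t-\Delta t)}{2\Delta t} = H\,\psi(t),
\]
which reduces to the ordinary Schrödinger equation when the discreteness scale \(\Delta t \to 0\).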

012005

Theoretical physics seems to be in a kind of schizophrenic state. Many phenomena in the observable macroscopic world obey nonlinear evolution equations, whereas the microscopic world is governed by quantum mechanics, a fundamental theory that is supposedly linear. In order to combine these two worlds in a common formalism, at least one of them must sacrifice one of its dogmas. I claim that linearity in quantum mechanics is not as essential as it apparently seems since quantum mechanics can be reformulated in terms of nonlinear Riccati equations. In a first step, it will be shown where complex Riccati equations appear in time-dependent quantum mechanics and how they can be treated and compared with similar space-dependent Riccati equations in supersymmetric quantum mechanics. Furthermore, the time-independent Schrödinger equation can also be rewritten as a complex Riccati equation. Finally, it will be shown that (real and complex) Riccati equations also appear in many other fields of physics, like statistical thermodynamics and cosmology.
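As a reference point (a standard textbook step, not a quotation from the paper), the logarithmic-derivative substitution turns the time-independent Schrödinger equation into a Riccati equation:
\[
\psi''(x) + \frac{2m}{\hbar^{2}}\bigl(E - V(x)\bigr)\psi(x) = 0, \qquad w(x) \equiv \frac{\psi'(x)}{\psi(x)} \;\Longrightarrow\; w'(x) + w(x)^{2} = \frac{2m}{\hbar^{2}}\bigl(V(x) - E\bigr),
\]
which is nonlinear in \(w\) even though the underlying wave equation is linear in \(\psi\).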

012006

By introducing the concepts of "superclassicality" and "relational causality", it is shown here that the velocity field emerging from an n-slit system can be calculated as an average classical velocity field with suitable weightings per channel. No deviation from classical probability theory is necessary in order to arrive at the resulting probability distributions. In addition, we can directly show that when translating the thus obtained expression for said velocity field into a more familiar quantum language, one immediately derives the basic postulate of the de Broglie-Bohm theory, i.e. the guidance equation, and, as a corollary, the exact expression for the quantum mechanical probability density current. Some other direct consequences of this result will be discussed, such as an explanation of Born's rule and Sorkin's first and higher order sum rules, respectively.
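For reference (standard de Broglie-Bohm expressions, stated here as background rather than reproduced from the paper), the guidance equation and the probability current mentioned in the abstract read, for \(\psi = R\,e^{iS/\hbar}\),
\[
\mathbf{v} = \frac{\nabla S}{m}, \qquad \mathbf{j} = \rho\,\frac{\nabla S}{m} = \frac{\hbar}{m}\,\mathrm{Im}\bigl(\psi^{*}\nabla\psi\bigr), \qquad \rho = |\psi|^{2}.
\]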

012007

In previous papers, the quantum behavior of matter has been shown to emerge as a result of its permanent interaction with the random zero-point radiation field. Fundamental results, such as the Schrödinger and the Heisenberg formalism, have been derived within this framework. Further, the theory has been shown to provide the basic QED formulas for the radiative corrections, as well as an explanation for entanglement in bipartite systems.

This paper addresses the problem of spin from the same perspective. The zero-point field is shown to produce a helicoidal motion of the electron, through the torque exerted by the electric field modes of a given circular polarization, which results in an intrinsic angular momentum of value ℏ/2. Associated with it, a magnetic moment with a g-factor of 2 is obtained. This allows us to identify the spin of the electron as a further emergent property, generated by the action of the random zero-point field.
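For context (standard relations, not taken from the paper), the quoted values correspond to
\[
|\mathbf{S}| = \frac{\hbar}{2}, \qquad \boldsymbol{\mu} = -\,g\,\frac{e}{2m_{e}}\,\mathbf{S}, \quad g = 2 \;\Longrightarrow\; |\boldsymbol{\mu}| \approx \frac{e\hbar}{2m_{e}} = \mu_{B},
\]
i.e. a spin angular momentum of ℏ/2 and a magnetic moment of one Bohr magneton.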

012008

The outcome of a single quantum experiment is unpredictable, except in a pure-state limit. The definite process that takes place in the apparatus may either be intrinsically random or be explainable from a deeper theory. While the first scenario is the standard lore, the latter implies that quantum mechanics is emergent. In that case, it is likely that one has to reconsider radiation by accelerated charges as a physical effect, which thus must be compensated by an energy input. Stochastic electrodynamics, for example, asserts that the vacuum energy arises from classical fluctuations with energy 1/2ℏω per mode. In such theories the stability of the hydrogen ground state will arise from energy input from fluctuations and output by radiation, hence due to an energy throughput. That flux of energy constitutes an arrow of time, which we call the "subquantum arrow of time". It is related to the stability of matter and it is more fundamental than, e.g., the thermodynamic and cosmological arrows.
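For reference (a standard stochastic-electrodynamics expression, given as background rather than quoted from the paper), an energy of 1/2 ℏω per mode corresponds to the Lorentz-invariant zero-point spectral energy density
\[
\rho_{0}(\omega) = \frac{\hbar\,\omega^{3}}{2\pi^{2}c^{3}},
\]
which is the fluctuation spectrum that, in such theories, balances the radiative losses of the orbiting electron.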

012009

We review the derivation of quantum theory as an application of entropic methods of inference. The new contribution in this paper is a streamlined derivation of the Schrödinger equation based on a different choice of microstates and constraints.

012010

We follow the idea that particles are topological solitons, characterised by topological quantum numbers determining charge and spin. To describe such particles and their spin we use the three rotational degrees of freedom of local spatial dreibeins. Orbiting particles contribute by internal rotation to the total angular momentum. The mass of these particles is given by their field energy only.

012011

Emergent quantum mechanics seeks a deeper level theory, anticipating that such a theory will provide a clearer picture of the relation between the quantum and classical worlds. In this work we show that the quantum-classical divide is a manifestation of the transition from Newton's absolute time to relativity's path-dependent time. The prior theory in this case is that particles are intrinsic clocks. The emergence of separate classical and quantum behaviour is seen by considering different continuum limits in a single digital clock model. A continuum limit that constructs a continuous worldline provides a simple basis for Minkowski spacetime. An alternative limit in which the clock itself contains boost information leads to the Dirac equation.
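As background for the phrase "path-dependent time" (not an equation from the paper), the proper time accumulated by a particle-clock along its worldline is
\[
\tau = \int \sqrt{1 - \frac{v^{2}(t)}{c^{2}}}\; dt ,
\]
so that different paths between the same events generally register different elapsed times.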

012012

There is theoretical evidence that relativistically invariant quantum dynamics at sufficiently large space-time scales can result from a cooperative process of two inter-correlated non-relativistic stochastic dynamics, operating at different energy scales. We show that the Euclidean transition amplitude for a relativistic particle is identical to the transition probability of a Brownian particle propagating in a granular space. We discuss the issue of the robustness of the special-relativistic quantum mechanics thus obtained under small changes in the granular-space distribution. Experimental implications for early Universe cosmology are also briefly outlined.

012013

After a brief review of stochastic mechanics of nonrelativistic particle systems, the paper discusses in a qualitative way issues concerning the application of stochastic mechanics to relativistic fields and their relation to quantum field theory. It is suggested that gauge theories will be essential to this program.
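For orientation (standard elements of Nelson's stochastic mechanics, included here as background), the particle kinematics is a diffusion with diffusion coefficient \(\hbar/2m\), whose current and osmotic velocities are
\[
dX(t) = b(X,t)\,dt + dW(t), \qquad \mathbf{v} = \frac{\nabla S}{m}, \qquad \mathbf{u} = \frac{\hbar}{2m}\,\nabla \ln \rho ,
\]
where \(\rho = |\psi|^{2}\), \(S\) is the phase of \(\psi = \sqrt{\rho}\,e^{iS/\hbar}\), and the forward drift is \(b = \mathbf{v} + \mathbf{u}\).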

012014

Given the experimental precision in condensed matter physics – positions are measured with errors of less than 0.1 pm, energies to within about 0.1 meV, and temperatures are below 20 mK – it can be inferred that standard quantum mechanics, with its inherent uncertainties, is a model at the end of its natural lifetime. In this presentation I explore the elements of a future deterministic framework based on the synthesis of wave mechanics and density functional theory at the single-electron level.

012015

The entanglement and the violation of Bell and CHSH inequalities in spin polarization correlation experiments (SPCE) are considered to be among the biggest mysteries of Nature and are called quantum nonlocality. In this paper we show once again that this conclusion is based on imprecise terminology and on a lack of understanding of the probabilistic models used in various proofs of the Bell and CHSH theorems. These models are inconsistent with the experimental protocols used in SPCE. This is the only reason why Bell and CHSH inequalities are violated. A probabilistic non-signalling description of SPCE, consistent with quantum predictions, is possible and it depends explicitly on the context of each experiment. It is also deterministic in the sense that the outcome is determined by supplementary local parameters describing both physical signals and measuring instruments. The existence of such a description gives additional arguments that quantum theory is emergent from some more detailed theory respecting causality and local determinism. If quantum theory is emergent then there perhaps exist some fine structures in the time-series of experimental data which were not predicted by quantum theory. In this paper we explain how a systematic search for such fine structures can be done. If such reproducible fine structures were found it would show that quantum theory is not predictably complete, which would be a major discovery.
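For reference (the standard CHSH form, not reproduced from the paper), the inequality at issue bounds the combination of correlations
\[
S = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad |S| \le 2 ,
\]
while quantum mechanics predicts values up to \(2\sqrt{2}\) for the singlet state, which is the violation discussed in the abstract.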

012016

The real part of the weak value of the momentum operator at a post-selected position is discussed, and the meaning of the experimentally determined streamlines in the Toronto experiment of Kocsis et al. is re-examined. We argue against interpreting the energy flow lines as photon trajectories. The possibility of performing an analogous experiment using atoms is proposed in order that a direct comparison can be made with the trajectories calculated by Philippidis, Dewdney and Hiley using the Bohm approach.
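For context (a relation familiar from the weak-measurement literature, not quoted from the paper), the real part of the weak value of momentum with post-selection at position x equals the gradient of the phase of the wave function, i.e. the local (Bohm) momentum:
\[
\mathrm{Re}\, \frac{\langle x|\hat{P}|\psi\rangle}{\langle x|\psi\rangle} = \nabla S(x), \qquad \psi(x) = R(x)\,e^{iS(x)/\hbar}.
\]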

012017

Quantum mechanics with massive particles has become an important tool for fundamental research and applied science, since many previously named "Gedanken" experiments have become feasible. Neutrons are massive particles which couple to gravitational, nuclear and electro-magnetic interactions, and they are sensitive to topological effects as well. Therefore they are proper tools for testing quantum mechanics, where several previously named "hidden" parameters become measurable. Widely separated coherent beams can be produced by means of perfect crystal interferometers and they can be influenced individually. Spinor symmetry, spin superposition and quantum beat effect experiments have been performed and topological phases have been observed. Recent experiments related to the decoherence problem have shown that interference effects can be revived even when the overall interference pattern seems to be incoherent. All retrieval processes involve inherently unavoidable losses which stem partly from the theory itself and partly from an imperfect environment. Related post-selection experiments shed new light on questions of quantum non-locality and support the request for more complete quantum measurements in the future. A more rational explanation of non-locality effects may be obtained when the plane wave components outside the wave packets are included in the discussion. This can also help to discuss entanglement and contextuality effects in a new light. In all quantum experiments more information can be extracted by more complete quantum experiments, which will be important in the future to get a better understanding of quantum physics. An example may be the consideration of the Compton frequency and of proper time effects of matter waves.
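For reference regarding the closing remark (a standard definition, not from the paper), the Compton frequency of a particle of mass m is
\[
\omega_{C} = \frac{m c^{2}}{\hbar},
\]
which for the neutron is of order \(10^{24}\,\mathrm{s^{-1}}\).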

012018

Tests of Bell's theorem rule out local hidden variables theories. But any theorem is only as good as the assumptions that go into it, and one of these assumptions is that the experimenter can freely choose the detector settings. Without this assumption, one enters the realm of superdeterministic hidden variables theories and can no longer use Bell's theorem as a criterion. One can like or not like such superdeterministic hidden variables theories and their inevitable nonlocality; the real question is how one can test them. Here, we propose a possible experiment that could reveal superdeterminism.

012019

We show that by taking into account the randomness of the realization of experimental contexts it is possible to construct a common Kolmogorov space for the data collected in these contexts, although they can be incompatible. We call such a construction "Kolmogorovization" of contextuality. This construction of a common probability space is applied to Bell's inequality. It is well known that its violation is a consequence of collecting statistical data in a few incompatible experiments. In experiments performed in quantum optics, contexts are determined by selections of pairs of angles (θi, θ'j) fixing the orientations of polarization beam splitters. Contrary to common opinion, we show that statistical data corresponding to measurements of polarizations of photons in the singlet state, e.g., in the form of correlations, can be described in the classical probabilistic framework. The crucial point is that in constructing the common probability space one has to take into account not only the randomness of the source (as Bell did), but also the randomness of context-realizations (in particular, realizations of pairs of angles (θi, θ'j)). One may (but need not) say that the randomness of "free will" has to be accounted for.

012020

The Diosi-Penrose model of the quantum-classical boundary postulates gravity-related spontaneous wave function collapse of massive degrees of freedom. The decoherence effects of the collapses are in principle detectable if not masked by the overwhelming environmental decoherence. But the DP (or any other, like GRW, CSL) spontaneous collapses are not detectable themselves; they are merely the redundant formalism of spontaneous decoherence. To let DP collapses become testable physics, we recently extended the DP model and proposed that DP collapses are responsible for the emergence of the Newton gravitational force between massive objects. We identified the collapse rate, possibly of the order of 1/ms, with the rate of emergence of the Newton force. A simple heuristic emergence (delay) time was added to the Newton law of gravity. This non-relativistic delay is in peaceful coexistence with Einstein's relativistic theory of gravitation; at least, no experimental evidence has so far surfaced against it. We derive new predictions of such a 'lazy' Newton law that will enable decisive laboratory tests with available technologies. The simple equation of the 'lazy' Newton law deserves theoretical and experimental studies in itself, independently of the underlying quantum foundational considerations.
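Purely as an illustrative guess at what a delayed ("lazy") Newton law could look like, and not necessarily the equation used in the paper, one might let the Newtonian potential relax towards its instantaneous value on a timescale τ of the order of the collapse time:
\[
\tau\,\frac{\partial \Phi(\mathbf{r},t)}{\partial t} = -\bigl[\Phi(\mathbf{r},t) - \Phi_{\mathrm{Newton}}(\mathbf{r},t)\bigr], \qquad \nabla^{2}\Phi_{\mathrm{Newton}} = 4\pi G\,\rho(\mathbf{r},t).
\]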

012021

In this report we discuss three aspects: 1) Semiclassical gravity theory (SCG): 4 levels of theories describing the interaction of quantum matter with classical gravity. 2) Alternative Quantum Theories: discerning those which are derivable from general relativity (GR) plus quantum field theory (QFT) from those which are not. 3) Gravitational Decoherence: derivation of a master equation and examination of the assumptions which led to the claims of observational possibilities. We list three sets of corresponding problems worthy of pursuit: a) Newton-Schrödinger Equations in relation to SCG; b) Master equation of gravity-induced effects serving as a discriminator of 2); and c) Role of gravity in macroscopic quantum phenomena.
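For reference (the standard form, not reproduced from the paper), the Newton-Schrödinger (Schrödinger-Newton) equation mentioned in item a) is usually written, for a single particle of mass m, as
\[
i\hbar\,\frac{\partial \psi(\mathbf{r},t)}{\partial t} = -\frac{\hbar^{2}}{2m}\nabla^{2}\psi - G m^{2}\!\int\! \frac{|\psi(\mathbf{r}',t)|^{2}}{|\mathbf{r}-\mathbf{r}'|}\,d^{3}r' \;\psi(\mathbf{r},t),
\]
i.e. the wave function sources the very Newtonian potential in which it evolves.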

012022

Theories including a collapse mechanism were proposed several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems and for the reduction associated with measurement processes, as well as for the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow one at least to put precise limits on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing in detail a toy model, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

012023

The basic strategy underlying models of spontaneous wave function collapse (collapse models) is to modify the Schrödinger equation by including nonlinear stochastic terms, which tend to localize wave functions in space in a dynamical manner. These terms have negligible effects on microscopic systems, so their quantum behaviour is practically preserved. On the other hand, since the strength of these new terms scales with the mass of the system, they become dominant at the macroscopic level, making sure that wave functions of macro-objects are always well localized in space. We will review these basic features. By changing the dynamics of quantum systems, collapse models make predictions which differ from standard quantum mechanical predictions. Although the deviations are difficult to detect, we discuss the most relevant scenarios where they could be observed.

012024

The uncertainty relation formulated by Heisenberg in 1927 describes a trade-off between the error of a measurement of one observable and the disturbance caused on another complementary observable so that their product should be no less than a limit set by Planck's constant. In 1980, Braginsky, Vorontsov, and Thorne claimed that this relation leads to a sensitivity limit for gravitational wave detectors. However, in 1988 a model of position measurement was constructed that breaks both this limit and Heisenberg's relation. Here, we discuss the problems as to how we reformulate Heisenberg's relation to be universally valid and how we experimentally quantify the error and the disturbance to refute the old relation and to confirm the new relation.
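For reference (the relations at issue, in their standard form rather than as stated in the paper), with ε(A) the error of an A measurement, η(B) the disturbance it causes on B, and σ the standard deviations in the measured state, the old and the universally valid relations read
\[
\varepsilon(A)\,\eta(B) \ge \tfrac{1}{2}\bigl|\langle[\hat A,\hat B]\rangle\bigr| \quad \text{(Heisenberg, not always valid)},
\]
\[
\varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B) \ge \tfrac{1}{2}\bigl|\langle[\hat A,\hat B]\rangle\bigr| \quad \text{(Ozawa, universally valid)}.
\]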

012025

Neutron interferometry and polarimetry are used for the experimental investigation of quantum mechanical phenomena. Interferometry exhibits clear evidence of quantum contextuality, and polarimetry demonstrates a conflict with a contextual model of quantum mechanics à la Leggett. In these experiments, entanglement is achieved between degrees of freedom in a single particle: spin, path and energy degrees of freedom are manipulated coherently and entangled. Both experiments manifest the fact that quantum contextuality is valid for phenomena with matter waves with high precision. In addition, another experiment is described which deals with the error-disturbance uncertainty relation: we have experimentally tested error-disturbance uncertainty relations, one derived by Heisenberg and the other by Ozawa. Experimental results confirm that Heisenberg's uncertainty relation is often violated and that the quantity bounded by Ozawa's new relation always remains above the limit. Finally, as an example of a counterfactual phenomenon of quantum mechanics, an observation of the so-called quantum Cheshire Cat is carried out using a neutron interferometer. Experimental results suggest that pre- and post-selected neutrons travel through one of the arms of the interferometer while their magnetic moment is located in the other arm.

012026

We discuss a discrete-event simulation approach, which has been shown to give a unified cause-and-effect description of many quantum optics and single-neutron interferometry experiments. The event-based simulation algorithm does not require knowledge of the solution of a wave equation of the whole system, yet reproduces the corresponding statistical distributions by generating detection events one-by-one. It is shown that single-particle interference and entanglement, two important quantum phenomena, emerge via information exchange between individual particles and devices such as beam splitters, polarizers and detectors. We demonstrate this by reproducing the results of several single-neutron interferometry experiments, including one that demonstrates interference and one that demonstrates the violation of a Bell-type inequality. We also present event-based simulation results of a single-neutron experiment designed to test the validity of Ozawa's universally valid error-disturbance relation, an uncertainty relation derived using the theory of general quantum measurements.
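As a rough illustration of the idea of generating detection events one-by-one and recovering a statistical distribution only after many events, here is a minimal Monte Carlo sketch for a single polarizer (Malus-law statistics). This is a simplified stand-in, not the deterministic, adaptive event-based algorithm the authors use, and the function names are ours.

```python
import math
import random

def polarizer_event(photon_angle_deg: float, polarizer_angle_deg: float) -> int:
    """Generate one pass (1) / block (0) detection event at a polarizer."""
    delta = math.radians(photon_angle_deg - polarizer_angle_deg)
    p_pass = math.cos(delta) ** 2  # Malus's law used here as the single-event pass probability
    return 1 if random.random() < p_pass else 0

def run(n_events: int = 100_000, photon_angle: float = 0.0, polarizer_angle: float = 30.0) -> float:
    """Accumulate events one-by-one; the cos^2 law appears only as a statistic over many events."""
    passed = sum(polarizer_event(photon_angle, polarizer_angle) for _ in range(n_events))
    return passed / n_events

if __name__ == "__main__":
    print(run())  # approaches cos^2(30 deg) = 0.75 as n_events grows
```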

012027

In a recent paper we have examined the short-time propagator for the Schrödinger equation of a point source. An accurate expression, modulo Δt², for the propagator showed that it is independent of the quantum potential, implying that the quantum motion is classical for very short times. In this paper we apply these results to the experiment of Itano, Heinzen, Bollinger and Wineland which demonstrates the quantum Zeno effect in beryllium. We show that the transition is inhibited because the applied continuous-wave radiation suppresses the quantum potential necessary for the transition to occur. This shows there is no need to appeal to wave function collapse.
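For background (a textbook statement of the effect studied in the cited experiment, not drawn from the paper), the short-time survival probability and its repeated-measurement limit are
\[
P(\Delta t) \simeq 1 - \frac{(\Delta H)^{2}}{\hbar^{2}}\,\Delta t^{2}, \qquad \bigl[P(T/N)\bigr]^{N} \simeq \Bigl[1 - \frac{(\Delta H)^{2}\,T^{2}}{\hbar^{2}N^{2}}\Bigr]^{N} \xrightarrow{\;N\to\infty\;} 1,
\]
so that sufficiently frequent measurements inhibit the transition.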

012028

Mermin's "shut up and calculate!" somehow summarizes the most widely accepted view on quantum mechanics. This conception has led to a rather constraining way to think and understand the quantum world. Nonetheless, a closer look at the principles and formal body of this theory shows that, beyond longstanding prejudices, there is still room enough for alternative tools. This is the case, for example, of Bohmian mechanics. As it is discussed here, there is nothing contradictory or wrong with this hydrodynamical representation, which enhances the dynamical role of the quantum phase to the detriment (to some extent) of the probability density. The possibility to describe the evolution of quantum systems in terms of trajectories or streamlines is just a direct consequence of the fact that Bohmian mechanics (quantum hydrodynamics) is just a way to recast quantum mechanics in the more general language of the theory of characteristics. Misconceptions concerning Bohmian mechanics typically come from the fact that many times it is taken out of context and considered as an alternative theory to quantum mechanics, which is not the case. On the contrary, an appropriate contextualization shows that Bohmian mechanics constitutes a serious and useful representation of quantum mechanics, at the same level as any other quantum picture, such as Schrödinger's, Heisenberg's, Dirac's, or Feynman's, for instance. To illustrate its versatility, two phenomena will be briefly considered, namely dissipation and light interference.
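For reference (the standard hydrodynamical recasting alluded to in the abstract), writing \(\psi = R\,e^{iS/\hbar}\) turns the Schrödinger equation into a continuity equation and a quantum Hamilton-Jacobi equation with quantum potential Q:
\[
\frac{\partial \rho}{\partial t} + \nabla\!\cdot\!\Bigl(\rho\,\frac{\nabla S}{m}\Bigr) = 0, \qquad \frac{\partial S}{\partial t} + \frac{(\nabla S)^{2}}{2m} + V + Q = 0, \qquad Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}R}{R}, \quad \rho = R^{2}.
\]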

012029

In this article, we will examine new fundamental aspects of "emergence" and "information" using novel approaches to quantum mechanics which originated from the group around Aharonov. The two-state vector formalism provides a complete description of pre- and post-selected quantum systems and has uncovered a host of new quantum phenomena which were previously hidden. The most important feature is that any weak coupling to a pre- and post-selected system is effectively a coupling to a "weak value", which is given by a simple expression depending on the two-state vector. In particular, weak values are the outcomes of so-called "weak measurements", which have recently become a very powerful tool for ultra-sensitive measurements. Using weak values, we will show how to separate a particle from its properties, not unlike the Cheshire cat story: "Well! I've often seen a cat without a grin," thought Alice; "but a grin without a cat! It's the most curious thing I ever saw in all my life!" Next, we address the question whether the physics on different scales "emerges" from quantum mechanics or whether the laws of physics at those scales are fundamental. We show that the classical limit of quantum mechanics is a far more complicated issue; it is in fact dramatically more involved and requires a complete revision of all our intuitions. The revised intuitions can then serve as a guide to finding novel quantum effects. Next we show that novel experimental aspects of contextuality can be demonstrated with weak measurements, and these suggest new restrictions on hidden variable approaches. Next we emphasize that the most important implication of the Aharonov-Bohm effect is the existence of non-local interactions which do not violate causality. Finally, we review some generalizations of quantum mechanics and their implications for "emergence" and "information". First, we review an alternative approach to quantum evolution in which each moment of time is viewed as a new "universe" and time evolution is given by correlations between different moments. Next, we present a new solution to the measurement problem involving future boundary conditions placed on the universe as a whole. Finally, we introduce another fundamental approach to quantum evolution which allows for tremendous richness in the types of allowable Hamiltonians.
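For reference (the standard definition, not quoted from the paper), the weak value of an observable \(\hat A\) for a system pre-selected in \(|\psi\rangle\) and post-selected in \(|\phi\rangle\) is
\[
A_{w} = \frac{\langle \phi|\hat A|\psi\rangle}{\langle \phi|\psi\rangle},
\]
which is what a sufficiently weak coupling to the system effectively reads out.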