Life as the Explanation of the Measurement Problem

This study argues that a biological cell, a dissipative structure, is the smallest agent capable of processing quantum information through its triangulated, holographic sphere of perception, a mechanism that natural evolution has extended to endo- and exosemiosis in multicellular organisms and further to the language of Homo sapiens. Thus, life explains the measurement problem of quantum theory within the framework of the holographic principle, emergent gravity, and emergent dimensionality. Each Planck triangle on a black hole surface corresponds to a qubit in an equal superposition, attaining the known bounds on the products of its energies and orthogonalization interval. Black holes generate entropy variation shells through the solid-angle correspondence. The entropic work introduces bounds on the number of active Planck triangles, dependent on the information capacity of the black hole generator. The velocity and dissipativity bounds, as well as bounds on the theoretical probabilities for active, energy-carrying Planck triangles, are derived. In particular, this study shows that black holes, Turing machines, and viruses cannot assume the role of an observer. The entropy variation shells and black-body objects may hint at solutions to ball lightning and sonoluminescence, two unexplained spherical physical phenomena. “It is also possible that we learned that the principal problem is no longer the fight with the adversities of nature but the difficulty of understanding ourselves if we want to survive” [1].

The conjecture that life explains the measurement problem of quantum theory (QT) was probably first hinted at by Howard Pattee [9], who correctly noted that a measurement must be a record of an event and not the event itself. It is now generally accepted [2,7,10-14] that all information in the universe evolves, decreasing the entropy. Furthermore, observer-independence of observed reality has been experimentally disproven [15-17]. QT deals with quantum information, while the information we are constructed to experience is classical. "We form for ourselves images or symbols of the external objects; the manner in which we form them is such that the logically necessary (denknotwendigen) consequences of the images in thought are invariably the images of materially necessary (naturnotwendigen) consequences of the corresponding objects. (...) Experience shows that (...) such correspondences do in fact exist" [18]. And these correspondences exist despite the ugly duckling mathematical theorem (UDT) [19,20], which asserts that every two objects we perceive are equally similar (or equally dissimilar), however ridiculous that may sound. Satosi Watanabe, who proved this theorem, was so puzzled by his own discovery that he proposed, as a corollary, that "we have to ponderate (give weights to) the predicates so that we can say that in order for two objects to be similar to each other they have to share more important (weighty) predicates" [20]. Indeed, everyone learns to give weights to the predicates, or learns to discern, during toddlerhood. But this empirical observation can by no means be equal to a corollary of a mathematical theorem.
The fact that individual perceptions of nature usually correspond to each other does not imply that any observer-independent or objective reality exists down to each and every measured quantum. For example, a rainbow is a perfect illustration of observer dependence. On the other hand, relativity, or the inconsistency of perceptions of moving observers, is the conclusion of relativity theory. Thus, any consistent objective reality, if it existed, would, according to relativity theory, apply solely to unmoving observers. But such observers cannot exist. Entropic gravity [4] shows that both inertia and gravity are emergent phenomena, whereas the standard classical concepts of position, velocity, acceleration, mass, force, etc., are far from obvious. This invalidates some objectively existing spacetime that would be consistently and objectively real for all observers: time and space have already been deprived of the last trace of objective reality [21] by the very creator of relativity theory.
Christians believe that God is the maker of all things visible and invisible [22,23]. In fact, they believe in something even more weighty, namely that "what is seen was not made out of what was visible" [24]. The visible things are obvious (at least to those with healthy and working eyes), and God cannot be proven or disproven by the scientific method. This notion is subjective, although certain deities, such as the Chronology Protector [25] or the Cosmic Censor [26], have made their homes in physics. But can the invisible things, introduced to philosophy by Saul of Tarsus, be studied? What would those invisible things be, then? Are they invisible because we need a microscope or telescope (or X-rays?) to see them? Obviously not, because then we would see them using a microscope or telescope (or X-rays). Dark matter is invisible by its very definition. But this artificial concept, required to explain galaxy rotation curves, is redundant within the framework of emergent gravity. Parallel universes are unmeasurable. But they are not required to explain anything and fall under Occam's razor. Thus, things visible are measurable, and things invisible must define a boundary between visible and invisible things.
It turns out that QT defines this boundary: unless we see the things modeled by QT, they remain invisible and follow the evolution of unitary operators having complex eigenvalues; when we measure them, they yield measurement outcomes as real eigenvalues of Hermitian operators. Two contradicting mathematical apparatuses to describe the same physical phenomenon?! This is known as the measurement problem, on which no consensus has been achieved in the scientific community so far [27].
On the other hand, the mathematics of complex numbers is fundamentally invisible unless it succumbs to visibility, like Euler's formula (e^{iϕ} = cos ϕ + i sin ϕ, the jewel of physics [28]), Euler's reflection formula, the gamma function, the Mandelbrot set (z_{n+1} = z_n² + c, z, c ∈ C) and Julia sets, the Riemann zeta function, reflection functions of holomorphic omnidimensional convex polytopes inscribed inside an n-ball, n ∈ C [8], the Schrödinger equation, the Dirac equation, and the vast number of other remarkable relations, obvious and simple after discovery, involving the imaginary unit i.
Both QT and the mathematics of complex numbers involve the imaginary unit. Thus, it follows that the invisible things are related to (or, rather, are invisible because of) the imaginary unit. And time is imaginary, albeit not as an imaginary coordinate of complex Minkowski spacetime [29], where the time coordinate is spatialized by being multiplied by the speed of light in vacuum (c), and perceivable as real only in the present (cf. the surface of the n-ball for n ∈ C (5)).
Even though we have known about QT, bringing us the universe of invisible things, at least since 1877, when Ludwig Boltzmann introduced energy quantization and quantum discontinuity [30,31] to classical reality, we are still struggling to somehow reconcile visible things with invisible ones (which is logically impossible), or at least to keep the invisible things (including the unitarity of QT) contained. And the voices of those who oppose this struggle, of those who are convinced that the invisible things cannot be reconciled with the visible ones and cannot be contained (closed) in some boxes made of visible things, like the Schrödinger cat for example, are meek. Research in the field of fundamentally invisible things is fundamentally difficult. It is like wandering astray in the dark. As David Mermin succinctly put it, "Shut up and calculate!". And the question "What is it [such research] good for?" [32] always hangs in the air, even if time and again it proves to bear fruit. Planck's principle haunts the research of invisible things.
However, there seems to be a light at the end of the tunnel. It was shown [33], for example, that an arbitrary collection of real-valued functions of time and at least one conserved quantity depending on these real-valued functions, which is constant with respect to time, allows deriving most of the experimentally confirmed physical theories. The evolution of information [2,10-14] is an example of a function of time, and this conserved quantity is certainly the energy of the universe considered as an isolated system.
In addition, it was shown that an isolated quantum system cannot function as an observer [34,35]. This discovery significantly reduces the cardinality of the set of possible observing entities.
The paper is structured as follows. Section II briefly summarizes the differences between quantum and classical information to show that the concepts of time, distinguishability, and memory are inherent to the latter, while the concept of entropy is inherent to the former. Section III deals with classical and quantum probabilities. Section IV reviews the known entropy formulas relating to quantum and classical information. Section V concerns triangulated holographic spheres, where fluctuating spherical Planck triangles correspond to qubits in equal superpositions of energy states with the vanishing, nondegenerate ground state. In particular, it concerns entropy variation spheres and shells in thermodynamic nonequilibrium, dissipative structures generated by black holes. Section VI provides an overview of the dynamics of entropy variation spheres in terms of Pythagorean velocity and acceleration relations as a function of the Unruh temperature. The concept of a holographic sphere is extended in Section VII to biological cells, quantum information storage devices. Section VIII shows that other agents theoretically capable of performing quantum measurements (Turing machines) or maintaining biological evolution (viruses) are not observers in the sense given in Sections V and VII. Section IX discusses and Section X concludes the findings of this study.

II. INFORMATION
Information can be either quantum or classical. The bit is the smallest possible amount of classical information, and any amount of classical information contains a natural number of bits. A qubit is the basic unit of quantum information. The relative phase factor of the qubit is lost upon its measurement, and the qubit reduces to one bit of classical information. Quantum measurements of isolated quantum states repeated in the same basis provide zero bits of classical information [36].
Classical information is finite. Unlike quantum information [37-39], classical information can be cloned, deleted, and hidden in correlations between the system and the environment. The removal of classical information is associated with a minimum energy dissipation and an increase in entropy given by Landauer's principle [40]. A recording medium (memory) is necessary to make a measurement and record it as classical information, and any record can be encoded in a finite bit string. Memory must be finite by the Bekenstein bound [41]. Classical information must also relate to spatially and temporally distinguishable phenomena above the limits of the Planck length (ℓ_P) and time (t_P), the smallest physically significant length and interval, as well as above the uncertainty principle threshold, a violation of which would imply a violation of the second law of thermodynamics [42]. Finally, classical information is interpreted, bit by bit, by those able to interpret it, including living biological cells, their multicellular conglomerates, eusocial (and antisocial) groups of such cells and conglomerates, and Turing machines.
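As a back-of-the-envelope illustration of the Landauer limit invoked above, the following Python sketch evaluates the minimum dissipation k_B T ln 2 per erased bit; the choice of room temperature is an assumption made only for the example.

```python
# Minimal numeric illustration of Landauer's principle: erasing one bit of
# classical information dissipates at least k_B * T * ln(2) of energy.
# The temperature below is an arbitrary ambient value chosen for the example.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T_room = 300.0      # assumed ambient temperature, K

E_landauer = k_B * T_room * math.log(2)  # minimum dissipation per erased bit, J
print(f"Landauer bound at {T_room} K: {E_landauer:.3e} J per bit")
# ~2.87e-21 J, many orders of magnitude below what present-day hardware dissipates.
```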
The lack of any classical information about a past event makes this event equivalent to an event that has never happened.
Quantum information is infinite [43], and quantum information carriers are indistinguishable. If two quantum particles of the same kind are indistinguishable, their trajectories between two distinct moments at which they were measured are undefined, which leads to Bose-Einstein (symmetric) or Fermi-Dirac (antisymmetric) particle statistics, of which the latter accounts for the great variety of chemical properties of atoms in the universe [44]. Particles' indistinguishability is also a foundation of classical statistical thermodynamics based on the Maxwell-Boltzmann statistic, which underlies the concepts of an ideal gas and Boltzmann entropy.
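To make the contrast between the three particle statistics concrete, here is a minimal Python sketch of the mean occupation numbers for Bose-Einstein, Fermi-Dirac, and Maxwell-Boltzmann statistics; the sampled energies are arbitrary illustrative values, not quantities taken from this paper.

```python
# Mean occupation of a single-particle level of energy eps at temperature T and
# chemical potential mu, with x = (eps - mu)/(k_B * T). Bose-Einstein and
# Fermi-Dirac reflect indistinguishability; Maxwell-Boltzmann is the classical limit.
import math

def bose_einstein(x):      # requires x > 0
    return 1.0 / (math.exp(x) - 1.0)

def fermi_dirac(x):
    return 1.0 / (math.exp(x) + 1.0)

def maxwell_boltzmann(x):
    return math.exp(-x)

for x in (0.5, 1.0, 3.0, 6.0):
    print(f"x={x}: BE={bose_einstein(x):.4f}  FD={fermi_dirac(x):.4f}  MB={maxwell_boltzmann(x):.4f}")
# For large x all three converge: indistinguishability effects fade in the dilute,
# high-temperature (classical) limit.
```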
" [W]hat is seen [classical information] is temporary, but what is unseen [quantum information] is eternal" [22].And that statement gives Saul of Tarsus priority over Ludwig Boltzmann in the research of invisible things.

III. PROBABILITY
Probability is commonly defined as the measure of the likelihood that an event will occur (i.e., that it will be distinguishable from other events of the sample space). It is therefore a circular definition (probability is a synonym of likelihood) requiring an interpretation. There are two competing ones. The ontic (also called objective, scientific, or physical) interpretation assumes some objective physical element of reality (a coin, a die, a roulette wheel, a football team in a given match, etc.) to which probability is associated and either calculated as a relative frequency of occurrence of the event in a long run of previous trials (frequentism) or modeled as a tendency of this element of reality to produce this event occurrence (propensity). The epistemic (also called subjective or evidential) interpretation regards probability as a measure of the degree of belief of an individual assessing the uncertainty of a future event occurrence based on their previous experiences.
The ontic interpretation involves calculations and logical inference and thus may be employed by humans and human-designed algorithms. In principle, the epistemic interpretation requires only memory to store the prior experiences on which an individual's subjective degree of belief is based or estimated. I avoid the word inferred in this context, as there are various theories of reasoning to arrive at this degree of belief: Bayesian probability, Dempster-Shafer theory, and Lotfi Zadeh's possibility theory are just a few examples. In any case, this degree of belief must be based on some classical information recorded earlier, and how it is inferred is a secondary issue. Therefore, humans and other living organisms employ epistemic probability with a memory tuned to gain and retain fitness-relevant information [45], regardless of the actual implementation [46] of this mechanism. Thus, it may be regarded as an equivalent of a survival instinct. Turing machines do not have a survival instinct, not to mention beliefs.
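The following toy Python sketch illustrates the epistemic reading of probability as a memory-based degree of belief, using a plain Bayesian update over a hypothetical record of binary stimuli; the prior, the reliability figures, and the observation sequence are all invented for the illustration.

```python
# Toy illustration of epistemic probability: a degree of belief about a binary
# environmental stimulus, updated from a stored record of past observations.
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability of the hypothesis after one observation."""
    evidence = prior * likelihood_if_true + (1.0 - prior) * likelihood_if_false
    return prior * likelihood_if_true / evidence

belief = 0.5                      # initially indifferent
observations = [1, 1, 0, 1, 1]    # 1: stimulus detected, 0: not detected (hypothetical record)
p_detect_if_present = 0.8         # assumed sensor reliability
p_detect_if_absent = 0.1          # assumed false-positive rate

for obs in observations:
    if obs:
        belief = bayes_update(belief, p_detect_if_present, p_detect_if_absent)
    else:
        belief = bayes_update(belief, 1 - p_detect_if_present, 1 - p_detect_if_absent)
    print(f"after observing {obs}: belief = {belief:.3f}")
# The degree of belief is a function of the stored record, not of any single trial,
# which is the sense in which it is epistemic rather than ontic.
```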
A measurement of a pure quantum state is also associated with a certain probability, calculated using the Born rule as the square of a complex probability amplitude, which is mathematically elegant but brings about the measurement problem that, in turn, demands an interpretation: some (many-worlds interpretation, De Broglie-Bohm interpretation, objective-collapse theories, etc.) argue that this quantum measurement probability is ontic, others (QBism) that it is epistemic. Some (superdeterminists) question the concept of probability itself, arguing that events do not occur but are superdetermined.
Overall, the concept of real nonnegative probability p is only a quarterdeck over the concept of quantum measurement and the complex probability amplitude λ, which admits negative probabilities (e.g., in Wigner distributions).
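A minimal sketch of the Born rule for a single qubit follows: the measurement probabilities are the squared moduli of the complex amplitudes, and the relative phase, while part of the quantum information, leaves no trace in the classical record. The amplitudes are arbitrary example values.

```python
# Born rule for a single qubit: p(outcome) = |amplitude|^2.
import cmath

alpha = 1 / 2 ** 0.5                      # amplitude of |0>
beta = cmath.exp(1j * 0.7) / 2 ** 0.5     # amplitude of |1>, with an arbitrary relative phase

p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(f"p(0) = {p0:.3f}, p(1) = {p1:.3f}, normalization = {p0 + p1:.3f}")
# Changing the phase 0.7 to any other value leaves p(0) and p(1) unchanged:
# the classical record retains one bit and loses the phase information.
```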

IV. ENTROPY
In statistical mechanics, classical entropy is related to the notion of multiplicity (Wahrscheinlichkeit) W ∈ N, the number of microstates corresponding to a particular macrostate of a thermodynamic system of specified energy. It is provided by the Boltzmann entropy formula

S_B = k_B ln(W), (1)

where k_B ≈ 1.38 × 10⁻²³ J/K is the Boltzmann constant. This formula was generalized by Gibbs to distributions of microstates in which the microstates are not equally probable (p_j ≠ 1/W),

S_G = -k_B Σ_j p_j ln(p_j), (2)

which shows that the multiplicity W represents the inverse of probability.
In classical information theory, the Shannon entropy

S_H = -Σ_j p_j log_b(p_j) (3)

quantifies the information gained, on average, while measuring a random variable whose outcomes occur with probabilities p_j. S_H (3) is equal to the average number of questions that need to be asked to acquire the missing information about the measured random variable [47]. Increasing the number of possible outcomes will require more questions to be asked. Also, an increase in the patternlessness [48] of the distribution of outcomes will increase the average number of questions. S_H increases, therefore, in only one direction, towards the equiprobability of the outcomes. The almost identical form of equations (2) and (3) shows that the Gibbs entropy formula (2) is, in fact, a measure of information or uncertainty, although making it dimensionless to transfer the burden of carrying the units of energy to temperature [47] would still be problematic due to the equipartition theorem, which relates the average kinetic energy of a particle not only to the temperature of a system but also to the particle's degrees of freedom (DOFs).
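A short Python sketch of the Shannon entropy (3) in bits follows; the distributions are illustrative only, chosen to show how equiprobability and patternlessness increase the average missing information.

```python
# Shannon entropy in bits (log base 2) for a few illustrative distributions.
import math

def shannon_entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))          # 1.0 bit: one yes/no question suffices
print(shannon_entropy([0.25] * 4))          # 2.0 bits: two questions on average
print(shannon_entropy([0.9, 0.05, 0.05]))   # ~0.57 bit: a patterned, low-surprise source
print(shannon_entropy([1.0]))               # 0.0: a certain event carries no surprise
```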
Finally, the quantum von Neumann entropy formula

S_Q = -Tr(ρ ln ρ) = -Σ_j λ_j ln(λ_j) (4)

extends the Gibbs entropy (2) and the Shannon entropy (3) to define the entropy of a quantum system containing a probabilistic mixture of quantum states described by a density matrix ρ (directly as a logarithm of ρ, or in terms of its eigenvalues λ_j).
Volume integration over the Maxwell-Boltzmann particle statistic introduces the natural base of the logarithm in the Boltzmann (1) and Gibbs (2) entropy formulas, and the indistinguishability of particles is further assumed, as it would otherwise lead to the Gibbs paradox. On the other hand, the base of the logarithm in the Shannon entropy (3) may be freely chosen depending on the unit of information considered. Base 2 is used, for example, if S_H is to be measured in bits.
Thus, the entropy formulas of Gibbs (2), von Neumann (4), and Shannon (3) are functions of probability. For impossible events (p = 0), certain events (p = 1), and pure quantum states (ρ = ρ²), they vanish (we note that 0 · ln(1/0), which occurs in the entropy formulas for impossible and certain events, is undefined; it is only taken by convention as 0 · ln(1/0) ≔ 0). The von Neumann entropy (4) generalizes the notion of entropy to quantum information and is nonvanishing only for impure probabilistic mixtures, reaching the limit of S_Q = S_H (for b = e in S_H (3)), as shown in Fig. 1, if all states of ρ are orthogonal, in which case the density matrix has only diagonal entries. This is the patternless thermal noise of black-body-object radiation, as discussed in the subsequent two sections. In other words, the orthogonal states of a density matrix ρ in S_Q (4) are distinguishable, just as the outcomes of the random variable in S_H. Non-orthogonal states are either partially distinguishable or indistinguishable in the case of a pure state.
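The following numpy sketch compares the von Neumann entropy (4) with the Shannon entropy (3) for two simple cases: a pure state, for which S_Q vanishes, and an equal mixture of orthogonal states, for which S_Q equals S_H (with b = e). The states are chosen ad hoc for the illustration.

```python
# Von Neumann entropy from the eigenvalues of a density matrix, compared with the
# Shannon entropy of the same probabilities (natural logarithm in both).
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

def shannon_entropy_nats(probs):
    probs = np.asarray([p for p in probs if p > 0])
    return float(-np.sum(probs * np.log(probs)))

# Pure state |+> = (|0> + |1>)/sqrt(2): S_Q vanishes.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus.conj())
print(von_neumann_entropy(rho_pure))                  # ~0.0

# Equal mixture of the orthogonal states |0>, |1>: S_Q = S_H = ln 2.
rho_mixed = 0.5 * np.diag([1.0, 1.0])
print(von_neumann_entropy(rho_mixed), shannon_entropy_nats([0.5, 0.5]))  # both ~0.693
```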

V. BLACK HOLES AS GENERATORS OF ENTROPIC VARIATION SPHERES
The idea that the observable DOFs of a system can be described as if they were bits of classical information corresponding to Planck areas ℓ_P² forming a two-dimensional lattice (a holographic screen) was proposed in [3] and is now known as the holographic principle. It has been further researched [4] to demonstrate that gravity and inertia are entropic in nature. This experimentally confirmed theory [51] is now known as entropic (or emergent) gravity and explains why gravity allows action at a distance even when there is no mediating force field. It explains galaxy rotation curves without using dark matter and is decoherence-free [52].
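As a numeric sanity check of the entropic-gravity reasoning cited above [4], the following Python sketch counts the bits on a spherical holographic screen, assigns the screen an equipartition temperature, and recovers Newton's inverse-square force from the entropic force T dS/dx; the masses and radius are arbitrary test values, and the steps follow the standard published argument rather than any formula specific to this paper.

```python
# Entropic-gravity sanity check: N = A*c^3/(G*hbar) bits on a spherical screen,
# equipartition E = (1/2)*N*k_B*T with E = M*c^2, and dS = 2*pi*k_B*m*c*dx/hbar.
# The entropic force T*dS/dx then reproduces Newton's law exactly.
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
hbar = 1.0546e-34    # J s
k_B = 1.381e-23      # J/K

M, m, R = 5.972e24, 70.0, 6.371e6   # e.g. Earth, a 70 kg body, Earth's radius

A = 4 * math.pi * R ** 2
N = A * c ** 3 / (G * hbar)                           # number of bits on the screen
T = 2 * M * c ** 2 / (N * k_B)                        # equipartition temperature of the screen
F_entropic = T * (2 * math.pi * k_B * m * c / hbar)   # F = T * dS/dx
F_newton = G * M * m / R ** 2

print(f"entropic: {F_entropic:.6e} N, Newtonian: {F_newton:.6e} N")
# The two agree identically (up to rounding of the constants).
```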
Further research [2] demonstrated that a holographic screen is a holographic sphere (HS). Interiorless, one-sided black-body objects (BBOs), namely the densest, unsupported [2] black holes (BHs), neutron stars, and the least dense white dwarfs (supported against collapse, as is accepted, owing to the Pauli exclusion principle), emit perfect black-body radiation and thus define HSs in thermodynamic equilibrium. Nonequilibrium HSs, the entropy variation spheres (VSs), can form stable dissipative structures: thermodynamically open systems operating nonlinearly far from thermodynamic equilibrium, having a dynamical régime that is, in some sense, in a reproducible steady state. In this notation (used in this paper for subscripts of physical quantities), HSs include BBOs and VSs, while BH ⊂ BBO. In addition, it was shown [7] that charged BBOs need energy that exceeds their mass-energy equivalence ratios. Imaginary parts of complex energies, defined
by imaginary Planck units and inaccessible to direct observation, store the excess of these energies. However, electrostatics extends the scope of this study. It is related to a complementary physical configuration defined by the second negative fine-structure constant α_2 ≈ −140.1779, which introduces the complementary set of Planck units [7]. We also note that all electrical units can be expressed by means of mass, length, time, and charge units, and the elementary charge e is the same in the perceivable (α) and complementary (α_2) physical configurations, with the former having a lower Planck energy (E_P) and thus setting more favorable conditions for biological evolution to emerge [7]. Furthermore, BHs are fundamentally uncharged, since the parameters of any conceivable BH, in particular a charged (Reissner-Nordström) or charged-rotating (Kerr-Newman) BH, can be arbitrarily altered using Penrose processes [53,54] to extract the electrostatic and/or rotational energy of the BH [55]. Therefore, we shall limit further considerations to uncharged, non-rotating BHs that have the Schwarzschild radius R_BH = 2GM_BH/c², where M_BH is the BH mass and G is the gravitational constant. We note that the n-ball surface area relation (5) reduces to the familiar A(3)_B = 4πr² in three real (spatial) dimensions and one imaginary (time) dimension for n = 3 + 0i (i.e., at the present moment of perception), that its trigonometric member vanishes for the radius r = 1/√π, and that for both conditions A(3)_B = 4. In particular, R = rℓ_P = ℓ_P/√π is the radius of a 4-bit BH, while four bits are one unit of BH entropy [57]. Taking into account imaginary time, equation (5) means that a basketball, for example, that existed 5 minutes before now looks very different from the basketball that will exist 5 minutes after now (due to the antisymmetry of the sine function, which directly introduces the arrow of time), and looks very different from the basketball seen now. An object is spherical only in the present moment of perception. Furthermore, all geometrical objects have bi-valued volumes and surfaces. By choosing complex analysis, we enter bivalence due to its very nature: a square root is bivalued, and this cannot be neglected as nonphysical; bivalence extends real effects (one value), just as quantum theory extends classical physics [8].
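Assuming, as the text does, that one bit corresponds to one Planck area of the horizon, the following short Python sketch illustrates the information capacity N = A/ℓ_P² = πd² and the 4-bit sphere of radius ℓ_P/√π mentioned above; the Planck-length value is the standard CODATA figure.

```python
# Information capacity of a sphere, taking one bit per Planck area of its surface.
import math

l_P = 1.616255e-35   # Planck length, m

def info_capacity(radius_m):
    """Number of Planck-area bits on a sphere of the given radius."""
    area = 4 * math.pi * radius_m ** 2
    return area / l_P ** 2

R_4bit = l_P / math.sqrt(math.pi)
print(info_capacity(R_4bit))          # 4.0 bits: one unit of BH entropy
print(info_capacity(l_P / 2))         # pi bits: the "pi-bit BH" with d_BH = 1
```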
In addition, it was shown [2] that HSs are triangulated. Their interaction with the environment occurs through the binary potential δϕ_k = −c² · {0, 1} associated with individual triangles. The non-positivity of the binary potential is inherited from the entropy variation [2,4,32] that locally decreases the entropy.
The energy-time version of Heisenberg's uncertainty principle (HUP), which should properly be called an uncertainty theorem, as it is proved, and which holds for any pair of conjugate variables, is

δE δt ≥ ħ/2, (6)

where δE represents the standard deviation of energy, δt represents the standard deviation of time, and ħ is the reduced Planck constant. However, there is "no reason inherent in the principles of quantum theory why the energy of a system cannot be measured in as short a time as we please" [58,59]. Thus, if δt = 0, the product on the LHS of (6) is undefined even if δE were infinite, and the meaning of δt is problematic in this version of the HUP (for an insightful discussion cf. ref. [60], pp. 413-415), in particular if we assume an eternalist view of time, according to which all existence in time is equally real. It has also been established [61,62] that

δt⊥ ≥ πħ/(2 δE), (7)

where δt⊥ represents the time (the orthogonalization interval) that any quantum system, expressed as a linear superposition of its energy eigenstates |E_n⟩ (8), needs to evolve from one state to an orthogonal one, (δE)² = ⟨ψ|H²|ψ⟩ − (⟨ψ|H|ψ⟩)² is the variance of the system's energy distribution, and H is the system's Hamiltonian. Furthermore, the Margolus-Levitin theorem (MLT) [63] asserts that

δt⊥ ≥ πħ/(2 E_avg), (9)

where E_avg = ⟨ψ|H|ψ⟩ (10) is the quantum-mechanical average energy (the energy of the ground state is taken to be zero) [63] of any quantum system (8).
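A numeric sketch of the two quantum speed limits just quoted follows: the Mandelstam-Tamm bound (7) and the Margolus-Levitin bound (9). The test energy is arbitrary; for the equal-superposition qubit discussed next, δE = E_avg and the two bounds coincide.

```python
# Quantum speed limits: minimum time to evolve to an orthogonal state.
import math

hbar = 1.0546e-34   # J s

def t_perp_mandelstam_tamm(delta_E):
    return math.pi * hbar / (2 * delta_E)

def t_perp_margolus_levitin(E_avg):
    return math.pi * hbar / (2 * E_avg)

E = 1.602e-19   # 1 eV in joules, an arbitrary test energy
print(f"MT bound: {t_perp_mandelstam_tamm(E):.3e} s")
print(f"ML bound: {t_perp_margolus_levitin(E):.3e} s")
# With deltaE = E_avg both give ~1.03e-15 s; a state can never reach an orthogonal
# state faster than the larger of the two bounds.
```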
The bounds (7) and (9) remain the same, although for any E_avg, δE can be as large as we like [63]. The Levitin-Toffoli Theorem 1 (LTT1) [64] asserts that both bounds (7) and (9) are attained if and only if the state (8) is a pure two-level (binary) state (qubit) in an equal superposition

|ψ_q⟩ = (e^{iϕ_0} |E_0⟩ + e^{iϕ_1} |E_1⟩)/√2, E_0 = 0, (11)
of energy eigenstates, unique up to the degeneracy of the energy level E_1 and arbitrary phase factors ϕ_0 and ϕ_1. Thus, the bounds (7) and (9) are attained by a state for which δE = E_avg = E_1/2. Substituting δE from the relation (7), attained by the qubit (11), into the HUP (6) yields

δt⊥ ≤ π δt, (12)

relating the standard deviation of time to the orthogonalization interval in this case. We conjecture that δt⊥ = ⌊π⌋ δt.
The Levitin-Toffoli Theorem 3 (LTT3) [64] asserts the bound (13), or equivalently (14), where E_max is the maximum energy eigenvalue that E_1 can take in the qubit |ψ_q⟩ (11). The Levitin-Toffoli Theorem 4 (LTT4) [64] asserts the bound (15) (the proof by contradiction of LTT3 is valid also for this bound), which is attained only by the state (11) with E_1 = E_max. On the other hand, both LTT3 and LTT4 assert the relation (16). Furthermore, by LTT4 (15), all three bounds (7), (9), and (15) can be attained only by a state (11) for which E_1 = E_max.
But are there natural quantum systems having a vanishing ground-state energy and only two possible states?

Theorem 1. A BH represents a quantum state attaining the three bounds (7), (9), and (15), that is, the state for which E_max = E_BH and E_avg = δE = E_BH/2.
Proof. We define E_avg ≔ Ê_avg E_P and δE ≔ δÊ E_P, with Ê_avg, δÊ ∈ R. The form of the qubit (11) with the eigenenergy E_0 = 0 dictates the discrete Bernoulli probability distribution, for which Ê_avg = p_1 and δÊ = √(p_1 − p_1²). Ê_avg = δÊ for the probabilities p_1 ∈ {0, 1/2}. p_1 = 0 corresponds to δE = E_avg = 0, which implies E_1 = 0 and is not satisfied by the qubit (11). On the other hand, p_1 = 1/2 is obtained using the Born rule from the probability amplitudes of the qubit (11). This proves δE = E_avg ≠ 0. The average energy (10) of the two states of the qubit (11) is E_avg = E_1/2. The rest energy of any object is given by the mass-energy equivalence. The BH Schwarzschild radius defines the minimum size of this object with respect to its mass and thus its maximum energy

E_max = E_BH = M_BH c², (17)

where M_BH ≔ m_BH m_P, m_BH ∈ R, and m_P is the Planck mass. The temperature of this object with respect to its acceleration a is given by the Unruh temperature

T = a_R T_P/(2π), (18)

where T_P is the Planck temperature, a ≔ a_R a_P, a_R ∈ R, and a_P is the Planck acceleration. In the case of a BH this becomes the Hawking temperature

T_BH = T_P/(2π d_BH), (19)

where now a_R = 1/d_BH is the BH surface gravity. The entropic work [2,7], the product of the entropy and temperature of this object, in the case of a BH is

W_BH = T_BH S_BH = E_BH/2, (20)

where S_BH is the BH entropy [57]. Therefore, δE = E_avg = E_max/2, which completes the proof.
The proof, illustrated in Fig. 2(a), can be readily extended to other BBOs [7], the only two-state quantum systems with vanishing zero-point energy that attain the bounds (7), (9), and (15).
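As a numeric cross-check of the normalization used in the proof, the following Python sketch compares T_BH = T_P/(2π d_BH) with the standard Hawking temperature ħc³/(8πGMk_B); the solar mass is merely a test value.

```python
# Cross-check: Hawking temperature in SI form vs. the Planck-normalized form
# T_BH = T_P / (2*pi*d_BH), with d_BH the BH diameter in Planck lengths.
import math

G = 6.674e-11; c = 2.998e8; hbar = 1.0546e-34; k_B = 1.381e-23
l_P = math.sqrt(hbar * G / c ** 3)
T_P = math.sqrt(hbar * c ** 5 / G) / k_B      # Planck temperature

M = 1.989e30                                  # one solar mass, kg (test value)
R_s = 2 * G * M / c ** 2                      # Schwarzschild radius
d_BH = 2 * R_s / l_P                          # diameter in Planck lengths

T_standard = hbar * c ** 3 / (8 * math.pi * G * M * k_B)
T_planck_form = T_P / (2 * math.pi * d_BH)
print(f"{T_standard:.4e} K vs {T_planck_form:.4e} K")   # both ~6.2e-8 K
```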
We shall first introduce certain definitions related to HSs.
Definition 3. The HS information capacity N_HS ≔ πd_HS² ∈ R is the sum of active, inactive, and fractional triangles on its surface, where D_HS = d_HS ℓ_P, d_HS ∈ R, is the HS diameter.
The case of p_1 = 0 in Theorem 1 corresponds to degenerate BHs that have an information capacity N_BH < 1 and energy stored in the informationless fractional Planck triangle(s).
The FPT energy E_FPT = k_B T_BH/2 (22) is given by the equipartition theorem (EPT) of one DOF. This temperature is the same for a given BH, although it is momentary, as BHs fluctuate [2,7,65].
This form (22) of the EPT [2,4,7] corresponds to the statistical definition of DOFs (i.e., N_BH represents the number of fluctuating [and fractional] triangles that are free to vary). The EPT (22) is rigorously proven only for one DOF and under the assumption that the DOF energy depends quadratically on the generalized coordinate, which holds for a Planck area ℓ_P² and the associated quadratic binary potential δϕ_k.
Theorem 2. One DOF defines one bit corresponding to the FPT.
Proof. The energy of the FPT (22) can be expressed as E_FPT = E_BH/N_BH (23), which equals the BH energy iff N_BH = 1, i.e., for one bit corresponding to the FPT. If it were technologically feasible to probe a single FPT on a BH surface, we would expect this triangle to be inactive or active with the same probabilities as a result of the measurement of the qubit (11) associated with this triangle. We would obtain the same result if we probed many FPTs. This means that BBOs are ergodic systems that define thermodynamic equilibrium: algorithmically random, or patternless, sequences [48] that maximize both the Solomonoff-Kolmogorov-Chaitin complexity and the von Neumann (4) and Shannon (3) entropies. However, this is not true for VSs.

Theorem 3. A BH generates a VS having energy bounded by (25)
and information capacity bounded by (26).

Proof. The energy bounds (25) follow from LTT3 (14) and Theorem 1. Expressing the BH energy (17) by its information capacity N_BH and defining E_1 ≔ m_VS E_P ≤ m_BH E_P produces the inequality (27), which establishes the bounds (26) and yields the relations (28) and (29). In all cases, as shown in Fig. 3, the Planck triangle of the VS is located somewhere on the VS surface, defined by a solid angle that corresponds to the BH Planck triangle and is inversely proportional to the BH information capacity. Similarly to the proof of Theorem 1, this proof can also be extended to other BBOs [7].
Plugging the relation (27) into the Bekenstein bound [41], S_HS = πk_B d_HS m_HS, m_HS ≤ d_HS/4, valid for all HSs and attained by m_BH = d_BH/4, yields N_VS = 2N_BH, which, using the relation (28), corresponds to d_BH/m_VS = 4√2, whereas N_VS = 4N_BH corresponds to d_BH/m_VS = 4. We conjecture that the initial shell defined by the radii within the range R_BH ≤ R_IS < √2 R_BH satisfies the local equilibrium hypothesis. A π-bit BH (d_BH = 1) defines the solid angle (29) Ω = 4 with only one active Planck triangle, and the relation (24) for the π-bit BH yields an improvement on the EPT for an atom in a monoatomic ideal gas in R³. The relation (12) relates the HS orthogonalization interval δt⊥ to the HS time intervals (39) for an FPT. The HS orthogonalization interval δt⊥ can be interpreted as the minimum time that an HS needs to change the locations of its active triangles (each requiring an interval δt). Thus, for BHs, the relation (12) turns into an equality, while for VSs the strict inequality holds, as there are fewer active triangles than in the case of BHs, and thus the orthogonalization interval is shorter. By LTT4 (15) and Theorem 1, the BH orthogonalization interval amounts to t⊥_BH = 4π/d_BH, where δt⊥ ≔ t⊥ t_P, t⊥ ∈ R, t_P is the Planck time, and t⊥_BH is another parameter defining a BH.

Theorem 4. The number of active Planck triangles N_1 on a VS is bounded by (31).

Proof. The BH entropic work (20) is the work done by all active triangles of the BH. Similarly, the BH temperature (19), along with the binary entropy variation δS_VS = k_B N_1/2 [2], yields the VS entropic work (32) that, using the energy bounds (25) and the relation (24), defines the bounds (31), as shown in Fig. 4.
We note that the following considerations hold only for N_BH exceeding the BH unit of entropy [57]. Furthermore, for N_BH < 2, the temperature of the BH (19) exceeds its energy (17). For BHs, N_1 = ⌊N_BH/2⌋ and N_0 = ⌈N_BH/2⌉. Why do we assume that N_1 = ⌊N_BH/2⌋ and not N_1 = ⌈N_BH/2⌉? Note that S_HS/k_B belongs to a group formed by natural and half-natural numbers, including zero. Let us say that N_BH = 5.5. Then S_BH/k_B = 1.375, and S_HS/k_B = ⌊N_BH/2⌋/2 = 1 or S_HS/k_B = ⌈N_BH/2⌉/2 = 1.5. But we can always add some missing information [47] or surprise [13], taken from the fractional triangle(s), to 1 to arrive at 1.375, and we cannot subtract information from 1.5. Furthermore, N_1 = ⌈N_BH/2⌉ would produce an entropic work (32) greater than or equal to the BH entropic work (20).
We also note a discrepancy between even and odd numbers of BH bits, which manifests itself for small BHs. BHs with an even number of bits have N_1 = N_0 = ⌊N_BH/2⌋ active triangles, while BHs with an odd number of bits have fewer active triangles, as ⌊N_BH/2⌋ = N_1 < N_0 = ⌈N_BH/2⌉.

Definition 7. A mass M_VS is a dissipative mass [11] if its velocity satisfies the orbiting condition (33), where V_E is the escape velocity and V_L is the velocity of mass M_VS perpendicular to the orbiting radius R_VS.

Theorem 5. A dissipative mass M_VS associated with a VS having a diameter D_VS has a velocity bounded by (34).
Proof. For a dissipative mass, condition (33) produces [2] the bounds (34) and (35), where R_BH is the Schwarzschild radius of mass M_VS that appears in these relations. The bounds (35), defined in terms of velocity and diameter, correspond to the bounds (26) on the BH information capacity, defined in terms of mass or diameter. The upper bound (35) is attained by the Schwarzschild radius of mass M_VS (R_VS = R_BH). Since V_L² ≥ V_VS² ≥ N_VS, as we exclude v_L⁴ ≤ 1/4. Furthermore, by Theorem 3, d_VS/m_VS = 8. We assume that v_L represents a tangential velocity, as the maximum (radial) recoil velocity after a BH merger is approximately bounded by 10% of the speed of light [66]. The bounds (34) mean that for VSHs the orbital velocity does not exceed 1.199 × 10⁸ m/s, while the escape velocity |V_E| = c defines the Schwarzschild radius of the internal BH generator. In the subsequent section, we shall return to the bounds (34) and to imaginary velocities v ∈ I.

Theorem 6. The theoretical probability p_1 for a triangle on a VSH to be an active Planck triangle satisfies the bounds (36).

Proof. The theoretical probability p_1 ≔ N_1/N_VS, where N_1 is given by the bounds (31) and N_VS is given by the bounds (26). Since x − 1 < ⌊x⌋, then 1/16 − 1/(4N_BH) < ⌊N_BH/4⌋/(4N_BH). For N_BH < 4, the lower bound (36) is negative. This theorem applies to VSHs, as it extends the sample space beyond a specific radius of a VS. Unlike patternless BBOs, active Planck triangles in VSHs can form patterns. As the entropy (Boltzmann, Gibbs, Shannon, von Neumann) of independent systems is additive, a merger of BH_1 and BH_2 produces a BH_C having entropy equal to the sum of the entropies of the merging BHs. Thus, shortly after the Big Bang, a merger of two primordial BHs, each having a Planck-length diameter, the reduced Planck temperature T_P/(2π), and no tangential acceleration a_L, produced a BH having d_BH = ±√2, which represents the minimum BH diameter and allowed the notion of time [2]. A collision of the latter two BHs produced a BH with d_BH = ±2, which has a triangulation that defines only one precise diameter between its poles. And so on. The information capacity N_BH of the BH generators started to increase, and the number of active triangles N_1 increased accordingly. Starting from N_BH = 4 (cf. Fig. 4), the information started to evolve [2,7,10-14]. The first BH generators produced the VSH of a hydrogen atom. Subsequent BHs produced the VSHs of the remaining atoms, organic compounds, polymers, coacervates, DNA, and life.
However, BHs themselves, patternless, interiorless spheres in thermodynamic equilibrium, defined by one real number, cannot be observers.

VI. DYNAMICS OF ENTROPIC VARIATION SPHERES
The previous study [2] introduced the concepts of a disturbing radius δR and a complementary gradient radius R_GS = −δR, the segment δL orthogonal to R_GS and δR, and the second interval δt_R related to the first interval δt_L through integral powers of the imaginary unit i. We shall express these physical quantities in Planck units as δR ≔ r_δ ℓ_P, R_GS ≔ r_GS ℓ_P, δL ≔ l_δ ℓ_P (38) and δt_L ≔ t_L t_P, δt_R ≔ t_R t_P−, (39) where t_P− = √(ħG/(−c)⁵) = i t_P is the Planck time parameterized with the negative speed of light in vacuum and thus imaginary. The bivalued c = ±1/√(μ_0 ε_0) comes from Maxwell's equations in vacuum [7].
The length and time relations (38) and (39) introduce four velocities and four accelerations that can be described as velocity and acceleration matrices (40) and (41). Mutually orthogonal velocities and accelerations are bound to each other by the Pythagorean relations (42) and (43), with c and a_P as hypotenuses, which is given by the Lorentz factor (in the case of velocities) and by Hawking/Unruh radiation expressed in terms of the Planck acceleration (in the case of accelerations) [2]. We note that the relations (38)-(43) are also valid for different sets of natural units of speed c* and acceleration a*, provided that c* = ℓ*/t* and a* = c*/t* (e.g., for c_n and a_Pi [7]). The velocity equation (42) represents two rectangular hyperbolas that have semi-major axes ±t_L and ±t_R, foci ±t_L√2 and ±t_R√2, and eccentricities of ±√2, while the acceleration equation (43) represents a circle with radius t_L² = −t_R², as shown in Fig. 5.
Furthermore, the acceleration equation (43) is an elliptic paraboloid formula if t_L⁴ represents a dependent variable. On the contrary, the velocity equation (42) is a saddle surface formula, with t_L² representing a dependent variable, as shown in Fig. 6. Both surfaces meet at t_L⁴ = t_L² for r_δ = 0. Squaring the velocity equation (42) and substituting it into the acceleration equation (43) as t_L⁴ provides the time-independent relations (45) and (46) between l_δ and r_δ. The relations (45) and (46) are imaginary for l_δ ∈ (−1, 1) \ {0} and r_δ ∈ (−1, 1) \ {0}, as shown in Fig. 7. Furthermore, adding the velocity equation (42) to the acceleration equation (43), we arrive at the tangential relation (47) bounding l_δ with t_{L/R}, whereas subtracting these equations yields the radial relation (48) bounding r_δ with t_{L/R} in a similar way, as shown in Fig. 8. The tangential relation (47) yields imaginary l_δ for t_R ∈ (−1, 1) \ {0}; the radial relation (48) yields imaginary r_δ for t_L ∈ (−1, 1) \ {0}. Some VS and special relativity results are given in Appendix C. The BH temperature (19) can be further expressed in terms of r_δ, r_GS, and t_{L/R} as the relation (49). Using the relation (49), the velocity equation (42) with r_δ² yields the BH velocity equation (50), and similarly the acceleration equation (43) becomes the BH acceleration equation (51). To exclude imaginary values of l_δ in the acceleration equation (51), we demand 1 − 1/d_BH² ≥ 0. This leads to |d_BH| ≥ 1 and the π-bit BH providing only radial acceleration a_R, since the tangential acceleration vanishes for d_BH = 1 and is imaginary for |d_BH| < 1, as shown in Fig. 10.
Equating the equations (50) and (51) with each other yields (for t_L ≠ 0) the d_BH-dependent time relations (52), allowing l_δ and r_δ to also be expressed as functions of d_BH only (53), as shown in Fig. 9. For a_R = 0 (absolute zero), the relations (52) yield t_L² = 1 and t_R² = −1 = i², and the relations (53) yield l_δ² = 1 and r_δ² = 0. However, the Nernst heat theorem asserts that at 0 K, entropy variations vanish (lim_{T→0} δS = 0). Therefore, at T = 0, the disturbing radius δR and the gradient radius R_GS vanish. Furthermore, the relations (52) and (53) have a singularity at acceleration a_R = a and t_R² = 1. Thus, the disturbing and gradient radii δR² = R_GS² = ℓ_P² are well defined, but δL = 0. Using equations (52) and (53), the velocity and acceleration matrices (40) and (41), and their squares, can be further expressed in terms of the BH information capacity. The bounds (34) on v_L and the velocity relation (42) lead to the relation (58). Substituting the fourth powers of the velocities (56) into the bounds (34) and (58) yields the bounds (59). Furthermore, using the bounds (26), the bounds (59) can be expressed in terms of the VS mass M_VS as 3.1415 × 10⁻⁹ ≤ M_VS ≤ 6.6641

VII. BIOLOGICAL CELLS AS ENTROPY VARIATION SPHERES
The oldest physical traces of microorganisms on Earth are reported to date back 3.77 billion years. However, the evolution of information [2,10-14], including the nuclear evolution in stars leading to heavier elements and the organic evolution leading to polymers and coacervates, and finally to life, began at the Big Bang, 13.8 billion years ago. A cell consists of a cytoplasm containing various biomolecules, such as proteins and nucleic acids, enclosed within a lipid bilayer membrane with embedded proteins. The theoretical minimum diameter of a spherical cell has been estimated to be 200 nm, including its membrane [67]. Cells are alive, wherein life is commonly defined as the characteristic distinguishing physical entities that feature signaling and self-preservation (i.e., a survival instinct) from those that do not, either because such features have ceased to exist or because they never existed for a given entity, which is thus classified as inanimate.
Cell signaling, understood as the ability of a cell to perceive and respond to its microenvironment, is the basis of normal cell self-preservation. Therefore, while interacting with the environment, a single biological cell must process classical information through its selectively permeable membrane. Any external stimulus acting on the cell membrane must be measured and classified by the cell, in the context of classical information, with the aim of providing the cell with some evolutionary gain. This classification is inherently imperfect and burdened with error. These errors condition the cell's survival and are the engine of evolution. The better an organism perceives and responds to its environment, the better it is adapted to survive and reproduce.
Therefore, the cell membrane works as a VS, as discussed in Sections V and VI. Biological cells are both classical and quantum from an information-theoretic perspective. Interestingly, cells that do not adhere to other cells or surfaces do not proliferate [68]. A patternless distribution of information on the cell membrane would prevent the cell from spatially locating itself in its environment, thus inhibiting its growth and division.
The mechanism of biosemiotic communication that emerged in a single cell has been transferred in the process of evolution to multicellular organisms, and not only to bypass the limits defined by the cell surface-area-to-volume ratio [12]: Valonia ventricosa (diameter up to 40 mm), one of the largest single-celled organisms, is still 40 times larger than Trichoplax adhaerens, one of the smallest multicellular organisms (diameter approximately 1 mm). It has been transferred to enhance the capabilities of classical information processing. The activation function of the Boolean {0, 1}³ address space (2³ = 8 possibilities) [5] resembles the logistic activation function employed in artificial neural networks. A neuron has 5-7 dendrites on average [69].
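The analogy drawn above can be sketched in a few lines of Python: a unit with three Boolean inputs (the {0,1}³ address space) passed through a logistic activation. The weights and bias are invented for the illustration and are not derived from [5].

```python
# Toy unit: three Boolean inputs (2^3 = 8 addresses) through a logistic activation,
# as in artificial neural networks. Weights and bias are hypothetical.
import itertools
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

weights = [1.5, -0.5, 2.0]   # hypothetical synaptic weights
bias = -1.0                  # hypothetical firing threshold

for inputs in itertools.product((0, 1), repeat=3):
    drive = sum(w * x for w, x in zip(weights, inputs)) + bias
    print(inputs, f"-> activation {logistic(drive):.3f}")
# Each of the 8 Boolean addresses maps to a graded response; thresholding that
# response back to {0, 1} would again yield one bit of classical information.
```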
Semiosis, the production, communication, and interpretation of signs (coding and decoding), occurs within and between organisms [70]; it is called endosemiosis within an organism and exosemiosis between organisms of the same or different species. Eusocial groups of organisms (ants, bees, termites, football teams, etc.) use exosemiosis to achieve evolutionary gain. Certainly, human communication involving the processing of classical information in the form of abstract definitions is a form of exosemiosis. This extends to numerous areas of human relations, sociology, democracy, and politics, to name a few. The impact of fabricated pieces of classical information that allegedly describe a consistent objective reality (fake news) is widespread.
Any form of semiosis must be based on the ability to retrieve and process classical information stored in some memory, which is still not fully understood even in the case of single cells, although clearly the more information a cell wants to store and process, the more energy needs to be spent on it [71]. It has been demonstrated [46], for example, that experience teaches plants to learn faster and forget slower in environments where it matters. Thus, remembering does not require, as is commonly believed, conventional neural networks; the pathways of animal brains and neurons are just one possible and undeniably sophisticated solution, but they are not necessary for learning. Memory has evolved to enable and enhance reproductive fitness [45] and is, in turn, related to the concept of asymmetrical, unidirectional time flow. The ability to measure and react to the environment is a characteristic that all living systems share [71], and no two single living cells are indistinguishable, because the fitness-relevant information they store in their memories must be different. They would not interfere with each other in a double-slit experiment, even though the masses of many cells are still smaller than 2πm_P, so that their Compton wavelengths are greater than ℓ_P, the threshold of distinguishability [2]. Consider, as an example, the processing of quantum information through the VS of a human. Although all sensory information is provided through the VS, let us focus solely on visual perception. Sight is the most valued sense [72], though this may not be universally true across all cultures [73]. Human eyes contain three types of cone cells that respond to light of different wavelengths in overlapping ranges, which has been shown to be evolutionarily sufficient to provide binocular color vision within a bandwidth of about 400 to about 700 nm. As photons in this bandwidth are easily blocked by matter, we can perceive obstacles. (It has not turned out to be sufficient for mantis shrimps, for example. They have eyes capable of independent trinocular vision, provided with 12 to 16 types of photoreceptor cells, and sensitive to polarized light in a wavelength range from far-red to UVB. They need it to detect the short bursts of light emitted as a result of the wave produced by their claws, in a sonoluminescence phenomenon which, although scientifically unexplained, is exploited by nature.)
When a photon of visible light capable of being absorbed by some electron in a cone cell of the eye [74] is emitted by another electron, it travels, according to the Feynman rules of quantum electrodynamics [44], along all possible paths, as shown in Fig. 12(a). But in an inertial frame of reference of a photon (if we assume that one exists), there would be no particular moment of emission distinct from another moment of absorption. The time rate approaches zero for a moving object as it approaches the speed of light (C4), and photons always move at the speed of light. It is as if the emitting electron were adjoined to the absorbing electron, as shown in Fig. 12(b). And this fact supports the framework of emergent dimensionality [2,5-7], as it clearly undermines the notion of some objectively existing, observer-independent spacetime.
The photoisomerization triggered by the photon in the cone cell of the eye leads to signal transduction cascades and may be perceived by the brain as 1 bit of classical information, irrespective of the photon's incoming direction. Many photons will provide more information, allowing the subject to classify the perceived information as an object. Obviously, this does not imply that the bits that fluctuate in the VS have something to do with the velocity of some objectively real object. The stars made to whirl around a person in the center of a planetarium would not pull the person's arms away from his or her body. But this does not invalidate Mach's principle (stating that all bodies in the universe interact), on which relativity is grounded. Gravity and inertial acceleration are generated by entropy gradients acting radially on the VSs.
We have not been bestowed with sight (let alone other senses) to see some consistent objective world as it really is. Visual perception has evolved solely to provide some evolutionary gain. This evolutionary gain locally acts against the second law of thermodynamics, a living organism being a dissipative structure. The features of the perceivable universe, including its dimensionality, which requires a natural number of dimensions [75], should not be expected to be helpful in this process.
In contrast, they should maximize the perceivable informational diversity, allowing a choice between good and bad stimuli. And that is how the universe seems to be set up. For example, four-dimensional spacetime obeys Einstein's equations if and only if the sectional curvature of a given 2-plane (of a VS) always equals that of its orthogonal complement [76]; only for n = 4 does there exist an uncountable family of nondiffeomorphic differentiable structures that are homeomorphic to R^n [77], which is known as exotic R⁴ and allows biological evolution [5]. There are many other issues to examine, but these examples alone indicate that the space we perceive maximizes informational diversity. Further research is required to determine other properties of the perceivable universe.
Data processing by the VS of a biological cell is called a quantum measurement. Nothing collapses and nothing is corrupted during a measurement on the VS; the result is only recorded. The VS defines the Heisenberg cut in von Neumann's chain, and QT, applied to observation, is in blatant contradiction with experience [78]. It must be. Neither causality nor influence nor collapse are good words in the context of quantum measurements [79].

VIII. OTHER OBSERVING AGENTS?
Are there any other agents capable of performing observations and, to this end, provided with memory to record them?
Universal Turing machines (otherwise known as artificial intelligence) are, like biological cells, capable of pattern recognition, and this recognition may or may not be correct (a programmer defines the measure of correctness). But all these machines may unexpectedly fail, not only because they obey the second law of thermodynamics (like living organisms), but also upon receiving a pathological input that will cause them to loop infinitely. This is known as the halting problem and pertains to any Turing-complete model of computation. On the other hand, quantum algorithms processing quantum information are not bounded by the halting problem (cf. Appendix B). Therefore, Turing machines are mere tools, improved versions of simple machines. Living organisms are immune to the halting problem.
Viruses are capable of maintaining biological evolution and thus have properties of dissipative structures. And they obey not only the second law of thermodynamics but also the 2nd law of infodynamics [14]. They also feature the biological phenomena called host tropism, tissue tropism, or cell tropism, which refer to how they preferentially target specific hosts, tissues, or cell types. But in this targeting a virus does not process any classical information; it is constructed to bind to specific cell surface receptors to enter a cell and deliver its genome. In this sense, it is an organic chemical compound capable of damaging a living biological cell, much like gamma radiation or carbon monoxide. Viral evolution occurs only in infected host cells. Viruses are merely complex, indistinguishable organic molecules.

IX. DISCUSSION
Various Wigner's-friend-type experiments illustrate that no single, consistent objective reality exists. Starting from the original Wigner concept [1], through the Deutsch enhancement [80] and the Brukner version [81] involving two friends sharing an entangled state, to the Frauchiger and Renner proposition of an extended Wigner's friend gedanken experiment [82], it gradually became clear that any observer-independent QT framework is wrong. Finally, an experimental realization of the gedanken experiment proposed in [81] confirmed the impossibility of observer-independent facts, violating the associated Bell-type inequality by five standard deviations [16].
As Howard Pattee put it [9], the "physical meaning of a recording process in single cells cannot be analyzed without encountering the measurement problem in quantum mechanics" (by molecules, which "obey the same set of physical laws", he must have meant biological cells). On the other hand, quoting Feynman, "What I cannot create, I do not understand". We are far from creating a single biological cell in an abiogenetic process.
Gödel's incompleteness theorems demonstrate the consistency problems of axiomatic systems based on classical information. QT consistently describes the use of itself [82] in terms of quantum information, but it also consistently undermines the notion of an observer-independent reality built by any observer from bits of classical information on the VS, the consciousness boundary bounding the quantum VSH neural network of the human brain.
We note that the notions of matter, space, locality, etc., have already lost their tangible, classical meaning [43,83]. The classical description has already been ruled out on the microgram mass scale [84], and quasiparticles have been observed in classical systems [85]. Thermodynamics in the complex plane is now a subject of experimental research: Lee-Yang zeros [86,87] and photon-photon thermodynamic processes under negative optical temperature conditions [88] have been experimentally observed. Negative masses of exciton-polaritons have also been measured directly [89]. New phases of matter, such as liquid crystals, chiral Bose-liquid states beyond the framework of symmetry-protected topological phases [90], quantum spin liquids [91], and discrete-time crystals that facilitate the experimental study of novel phases of matter [92], have also been observed. The exotic properties of quantum materials [93] lead to a unified origin of light and electrons [94].
High-dimensional, fractionally dimensional, and complex-dimensional physical phenomena, such as synthetic dimensions [95] and the photonic synthetic frequency dimension [96-98], multiphase fractal media [99,100], and complex geodesic paths in the presence of black hole singularities [101], are also a subject of research. In particular, 2D materials, such as graphene, which is also the subject of active experimental research, are closely related to (2 + i)-dimensional VS surfaces. The topological phases of matter and non-abelian anyons, which occur only in 2D systems, can be used for various quantum information tasks, such as the implementation of a robust quantum memory [102], and open up many interesting questions about mesoscopic transport in electronic systems with a non-zero Berry's phase [103]. It is now possible to explore 2D topological physics above liquid nitrogen temperatures [104]. Artificially induced micro-BHs [105] may, in theory, shed new light on our findings presented in Section V.
The explanation of the measurement problem of QT posed in this study explains, or, as we conjecture, can be further researched to explain, most of the unsolved problems in physics. The cosmic censor [26], the chronology protector [25], and other block-universe concepts become irrelevant. The standard cosmology model needs a complete overhaul [106,107]. The holographic principle and the problem of (including the arrow of) time are related to perception. The fine-tuned universe concept is meaningless, as fine-tuned physical constants are simply the result of our observations induced by exotic R⁴. The cosmological constant and dark matter/energy/fluid/etc. are obsolete within the proposed nonlocal framework. Every particle (electron, proton, quark, etc.) and antiparticle acquires a new meaning within the proposed framework, along with quasiparticles and other emergent phenomena.
We are aware that this study is incomplete, rendering somewhat incomplete our claim that life is the explanation of the measurement problem. But again, research in the field of fundamentally invisible things is fundamentally difficult, and it is often the case that an incomplete [108] theory matures to completeness [109]. In an attempt to achieve a balance between philosophy and engineering, we have commented on future research directions for this possibly new chapter in physics.

X. CONCLUSIONS
Comparing classical and quantum information shows that the former relates to the notion of probability, which is only the tip of the iceberg over the concept of the quantum measurement problem introduced by the latter. On the other hand, comparing the known entropies, the surprise measures, shows that the quantum entropy (4) equals the classical information entropy (3) iff the mixture of states contains solely orthogonal ones, which corresponds to the patternless thermal noise of BBOs' radiation.
Each Planck triangle on a BH surface was shown to correspond to a qubit (11) in an equal superposition of the twofold BH energy and the nondegenerate, vanishing ground state, attaining the known bounds (7), (9), and (15) on the products of energy and the orthogonalization interval δt⊥. Accordingly, each BH is a generator of VSs through the solid-angle correspondence, where 2N_BH ≤ ⌊N_VS⌋ ≤ 4N_BH. The VS entropic work introduced the bounds on the number of active VS Planck triangles, dependent on the information capacity of the generating BH, with only one active triangle below the unit of black hole entropy. The velocity bounds for the mass M_VS associated with a VS, which make the mass dissipative, were derived, along with the theoretical probabilities that a VSH triangle is an active Planck triangle carrying energy. The dual radius relation (38) and the dual real-to-imaginary time relation (39), introduced in the previous work [2], have been studied in terms of Planck units, leading to four different velocities and accelerations acting on the VS, bounded by the Pythagorean relations (42) and (43) and parametrized by the diameter of the BH. The results are consistent with the form of the binary potential of the HS, δϕ_k = −c² · {0, 1}.
The VSs and BBOs may hint at solutions to ball lightning and sonoluminescence, respectively, two unexplained spherical physical phenomena.
Mathematical physics is based on theorems, statements that have been proved. Therefore, it is invulnerable to scientific falsifiability. In Section V, for example, we have used the equipartition theorem, the uncertainty theorem, the Margolus-Levitin theorem [63], the Levitin-Toffoli theorems [64], and Theorems 1-6 to discuss the consequences of the ugly duckling theorem [19,20] and the exotic R⁴ theorem [77].

Appendix B: The Halting Problem

Assume the existence of a computable halt-determining algorithm hda(alg, data) that returns 1 iff alg(data) halts and 0 iff alg(data) loops infinitely (B1), and construct an algorithm test(alg) that loops infinitely iff hda(alg, alg) = 1 and halts iff hda(alg, alg) = 0 (B3). Running test on itself leads to a contradiction: if hda(test, test) resolves that test(test) halts (1 in the 1st condition of (B1)), test(test) will loop infinitely (1st condition of (B3)), and if hda(test, test) resolves that test(test) loops infinitely (0 in the 2nd condition of (B1)), test(test) will halt (2nd condition of (B3)).
The contradiction dismisses the possibility of creating a computable halt-determining algorithm hda that is universal for all (alg, data) tuples.
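For readers who prefer code to prose, here is a sketch of the diagonal construction just described, with Python callables standing in for classically encoded algorithms; the stub hda is hypothetical, since the whole point of the argument is that no correct, total hda can exist.

```python
# Diagonal construction behind the halting problem, sketched with Python callables.
def hda(alg, data):
    """Hypothetical halt-determining algorithm: 1 iff alg(data) halts, 0 otherwise."""
    raise NotImplementedError("no such total, correct procedure exists")

def test(alg):
    """Loops forever iff hda claims alg(alg) halts; halts iff hda claims it loops."""
    if hda(alg, alg) == 1:
        while True:          # 1st condition of (B3): loop infinitely
            pass
    return "halted"          # 2nd condition of (B3)

# Feeding test to itself yields the contradiction: whatever hda(test, test) answers,
# test(test) does the opposite, so hda cannot be both total and correct.
# test(test)  # left commented out; with a real hda it could never behave consistently
```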
However, this proof fails for quantum algorithms processing quantum information (qubits). To prove that, assume first the existence of a computable quantum halt-determining algorithm qhda(qalg, |Ψ_data⟩) that would be able to determine whether any quantum algorithm qalg (quantum algorithms can be encoded classically as finite bit strings) will halt on any finite quantum register |Ψ_data⟩ or not:

qhda(qalg, |Ψ_data⟩) = 1 iff qalg(|Ψ_data⟩) halts, and 0 iff qalg(|Ψ_data⟩) loops infinitely. (B4)

The impossibility of accomplishing this assumption is clear, as all quantum algorithms halt while being measured, so qhda always returns 1. Apart from that, constructing a quantum algorithm analogous to test, accepting |Ψ_data⟩ as input, is also impossible, as (I) |Ψ_data⟩ cannot be encoded classically as a finite bit string to become the first input of qhda, and (II), even if it could be so encoded, producing the copy of |Ψ_data⟩ required for the qtest operation would violate the no-cloning theorem [37]. This shows that computability should not be determined solely by mathematics but also by the physical principles of QT [110], and that it is impossible to represent quantum information processing with a universal classical device [111]. For example, a qubit |ψ⟩ = α|0⟩ + β|1⟩ requires two normalized complex amplitudes, that is, three real numbers, whereas an initialized blank qubit |ψ⟩ = e^{iϕ}|0⟩ requires only one real phase factor ϕ (which is lost upon qubit measurement). However, a physical implementation of a qubit can store the whole qubit information support, including the unobservable phase factor, despite this surjective isometry property of the qubit.
Appendix C: Entropic Variation Spheres and Special Relativity

Using the velocity relation (42), the Lorentz factor is given by (C1), where we assume that v_RR = r_δ/t_R is the observable radial velocity. Thus, the squared length contraction becomes (C2), where the proper length δL_O ≔ l_0 ℓ_P, l_0 ∈ R. Substituting t_L from (C2) into the tangential relation (47) yields (C3), which for natural l_0 yields natural l_δ forming the OEIS sequence A046176: indices of square numbers that are also hexagonal.

Definition 5. An HS number of bits ⌊N_HS⌋ = N_0 + N_1 ∈ N_0 is the sum of its active and inactive Planck triangles. Thus, the HS area covered by fractional triangles is {N_HS} ℓ_P² = (N_HS − ⌊N_HS⌋) ℓ_P² < ℓ_P².

Definition 6. A fluctuating Planck triangle (FPT) is the Planck triangle associated with the BH qubit (11) and has energy corresponding to half of a BH temperature (19).

Figure 3. A black hole as a generator of entropy variation spheres through the solid angle Ω correspondence.

Figure 4. Lower (red) and upper (green) bounds on the number of active VS Planck triangles N_1 as a function of the information capacity of the generating BH. Initial shell bound (blue).


Figure 7. Relations between the disturbing radius r_δ defining the HS and the segment l_δ orthogonal to r_δ.

Figure 8. Radial (blue, cyan) and tangential (red, green) relations r_δ, l_δ of the time t_L on the HS.

Figure 12. Feynman rules of quantum electrodynamics: (a) frame of reference of an observer; (b) inertial frame of reference of a photon.
Definition 1. An active Planck triangle is the spherical Planck triangle that has energy E_1 = ±M_HS c² ≔ ±m_HS E_P, m_HS ∈ R, corresponding to the second energy state E_1 of the qubit (11). The mass M_HS corresponds to the curvature of the active Planck triangle. An HS contains N_1 ∈ N_0 active Planck triangles.

Definition 2. An inactive Planck triangle is the spherical Planck triangle associated with the vanishing ground-state energy E_0 = 0 of the qubit (11). An HS contains N_0 ∈ N_0 inactive Planck triangles.
and t_L is imaginary, and conversely, below which r_δ² < l_δ², and t_L is real. In other words, for a_R ∈ {1/√2, 1}, the time relation (39) is reversed, δt_L ≔ t_L t_P− and δt_R ≔ t_R t_P. Finally, at