
Information loss associated with black holes

Still waiting for Timothy to get back to me.. Perhaps he never will. Found some interesting
sources that parallel recent developments:
http://en.wikipedia.org/wiki/Quantum_spacetime
http://en.wikipedia.org/wiki/Quantum_gravity
http://en.wikipedia.org/wiki/Loop_quantum_gravity
http://en.wikipedia.org/wiki/Discrete_Lorentzian_quantum_gravity
These are conventional research parallels to what i've been pursuing recently..

Still rabidly adhering to Occam.. Believe that will be a 'saving grace'.. Making any sophisticated
assumptions about spacetime goes against this. Whatever Iam space turns out to be, it should be
simple, elegant, and reflective of reality. Have perhaps found some more circumstantial evidence
we're living inside a giant simulation.. Let's examine.

In a physics simulation, we can never designate exact values. This translates to approximations in
location, momentum, and time, tied to the simulation step size and location precision.
'Double precision' is usually the best we can do. Error in simulation has been analyzed
theoretically, so i don't need to reexamine that here. Back to simulating Iam. Experimentally, we
will never be able to resolve detail down to the Planck-length or Planck-time. Basically it's
physically impossible.
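
A quick way to see the double-precision limit concretely: a coordinate of order one metre simply
cannot register a Planck-length nudge, because a double carries only about 16 significant digits.
A minimal Python illustration (the Planck-length figure is the standard approximate value):

    import sys

    # Machine epsilon for a double: ~2.2e-16 relative precision.
    eps = sys.float_info.epsilon
    planck_length = 1.616e-35   # metres, approximate standard value

    x = 1.0
    print(x + planck_length == x)   # True: the Planck-scale displacement is lost
    print(planck_length / eps)      # ~7e-20: far below what a metre-scale double can resolve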

But suppose we are living inside a 'giant simulation'. Suppose the entities running the simulation
are not limited to double-precision values. Whatever the limit of precision is, it's finite. Let's
suppose the limit of precision in length is the Planck-length and the limit of precision in time is
the Planck-time. This equates to a global simulation step size of one Planck-time, with spatial
precision limited to one Planck-length.

How many particles are in our universe? What is the minimum information required to simulate
them? Let's estimate the total number of particles in our universe as about 8 * (1.878*10^81), based
on the total mass of the universe expressed as an equivalent number of protons, multiplied by 8 to
account for WIMPs, protons, electrons, neutrinos, and their corresponding anti-particles. Multiplying
that by the inverse magnitudes of the Planck-time and Planck-length gives roughly 1.3*10^162 bits of
information at any one instant.
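
Writing the arithmetic out explicitly helps. A minimal Python sketch of the heuristic as described
above, using the 1.878*10^81 proton-equivalent count, the factor of 8, and the standard approximate
Planck values; since the intermediate steps behind the quoted 1.3*10^162 aren't spelled out here,
treat the sketch as illustrating the method rather than reproducing that exact figure:

    # Heuristic particle count: proton-equivalent count times 8
    # (WIMP, proton, electron, neutrino, and their anti-particles).
    proton_equivalents = 1.878e81
    particles = 8 * proton_equivalents            # ~1.5e82 particles

    # Standard Planck scales (approximate).
    planck_length = 1.616e-35    # metres
    planck_time   = 5.391e-44    # seconds

    # Scale the particle count by the inverse magnitudes of the Planck-length
    # and Planck-time to get a bit count per instant.
    bits_per_instant = particles * (1.0 / planck_length) * (1.0 / planck_time)
    print(f"{bits_per_instant:.1e}")   # ~1.7e160, the same rough regime as the figure quoted above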

.. Some years ago i investigated information theory. It's abstract (beyond normal mathematical
abstraction) and 'difficult' to comprehend. That's an understatement in any terms - kind of like
trying to understand Gödel's incompleteness theorem. My estimate above is purely conservative;
likely it's much more than that. But imagine a 'state machine' that transitions on each step T from
initial conditions p0. That simulator would be required to have a minimum of 10^162 bits to
represent all particle locations at any one instant. Sounds incredible, but since it's finite, it is
possible.
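
To make the 'state machine' picture concrete, here is a toy sketch - hypothetical names, and a
handful of particles instead of ~10^82 - where the state is a list of positions quantized to the
Planck-length and each transition advances the clock by one Planck-time:

    PLANCK_LENGTH = 1.616e-35   # metres, approximate
    PLANCK_TIME   = 5.391e-44   # seconds, approximate

    def quantize(x):
        """Snap a coordinate to the nearest Planck-length grid point."""
        return round(x / PLANCK_LENGTH) * PLANCK_LENGTH

    def step(positions, velocities):
        """One transition of the machine: advance every particle by one Planck-time."""
        return [quantize(x + v * PLANCK_TIME) for x, v in zip(positions, velocities)]

    # Initial conditions p0: positions near the Planck scale (metres) and velocities (m/s).
    p0 = [0.0, 3.2e-35, 8.0e-35]
    v  = [2.998e8, 0.0, -2.998e8]   # light-speed motion covers ~one Planck-length per tick

    state = [quantize(x) for x in p0]
    for _ in range(3):              # three Planck-time ticks
        state = step(state, v)
    print(state)

The quantize step is just the earlier assumption that spatial precision bottoms out at the
Planck-length.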

Now we see why information theory is related to 'black hole' theory.. Is information conserved?
With singularities, likely not. When a mass is absorbed into a singularity, all information about
the mass is lost. The basis of singularity / black hole theory is that there must be a limit to
'nuclear tension' - that when a neutron star accretes too much mass, it collapses.. But this rests
on an assumption that nuclear 'repulsion' has a limit. i never assumed this in any of my versions
of Iam space. To me, a 'singularity' in space is merely a neutron star with an event horizon. There
is no physical evidence that black holes exist in the sense of 'a different form of matter'. The
event horizon of a black hole is where even light cannot escape - where the escape velocity exceeds
the speed of light. But that does not imply, by itself, that the structure inside a black hole
is any different from that of a neutron star. Black holes may simply be neutron stars with event
horizons. There may not be a collapse of nuclear material.

Regardless of the structure of black holes, we need to determine the information loss associated
with masses falling into them. This relates to the total universal information content and how it
evolves. Again, regardless of black hole structure, information is lost every time a mass is
consumed by one. So if 10^162 bits of information are required at any one instant, there is an
information loss associated with the total number of black holes and the average mass density
surrounding them. Just from this heuristic perspective, we see that black holes determine the
information loss in the universe.

If I represents the total universal information content, B represents the number of black holes /
neutron stars, and rIl represents the average rate of information loss associated with black holes
and neutron stars, then the information content of the universe at any one instant is: I - B(rIl).
This may be a way to 'test the theory'. If we can measure/estimate the three parameters, we may be
able to validate/invalidate the theory.

The initial information content of the universe we can designate I0. So verily, I = I0 - ∫B(rIl) dt
at any one instant. Combining calculus and information theory is intricate but possible. Assuming B
is relatively constant throughout the life of any one particular cosmos, the function representing
instantaneous total information content becomes I = I0 - B∫(rIl) dt. Again, if we can
measure/estimate all four values (I, I0, B, and rIl), we can test the validity of the theory.
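
Once values are plugged in, the constant-B version of the formula is easy to evaluate numerically.
A minimal sketch, with I0 taken from the estimate above and B and rIl as purely hypothetical
placeholder figures, included only to show the bookkeeping:

    # I(t) = I0 - B * (integral of rIl dt); with B and rIl constant the integral is rIl * t.
    I0  = 1.3e162      # initial information content in bits (estimate from above)
    B   = 1.0e20       # number of black holes / neutron stars -- placeholder guess
    rIl = 1.0e40       # average bits lost per object per second -- placeholder guess

    def information_content(t_seconds):
        """Instantaneous information content I(t) under a constant loss rate."""
        return I0 - B * rIl * t_seconds

    age_of_universe = 4.35e17   # seconds, roughly 13.8 billion years
    print(f"{information_content(age_of_universe):.3e} bits")

With these placeholder magnitudes the loss term works out to roughly 4*10^77 bits, negligible next
to I0; whether the real figures are anywhere near comparable to I0 is exactly what measuring or
estimating the parameters would settle.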

A ball-park estimate of the above is: I = 10^162 - B∫(rIl) dt, where I is the current information
content of the universe, B is the total number of black holes and neutron stars, and rIl is the
average rate of information loss associated with black holes / neutron stars.. 'Co-conspirators' at
NPA have requested positive predictions (as opposed to negative predictions such as no Higgs) from
'my theory'. This is the 'best i can do' at the moment..

http://en.wikipedia.org/wiki/Observable_universe
http://en.wikipedia.org/wiki/Information_theory
