
Eventors

By: Doron Shadmi

A part of the history of our universe is written by light.

According to the Special Relativity Theory (SRT), this history has a cone-like shape, because the speed of light in a vacuum has a finite value (299,792,458 m/s), which is considered one of Nature's constants.

Let an eventor be any event that has a light-cone.

An eventor can be a single particle, a cluster of galaxies, or any event in between.

Each eventor has two bodies, which are:

a) The Time-Body, which is the eventor's time-line existence.

b) The Space-Body, which is the eventor's past light-cone.

By SRT, there cannot be any relationship between two eventors if their light-
cones do not meet.

Furthermore, the most up-to-date information about some eventor cannot lie beyond the intersection (marked by red dots) between any two Space-Bodies.

Any information that does not lie in such an intersection is actually isolated by the gap between any two eventors' Space-Bodies:

[Figure: two overlapping Space-Bodies; the overlap holds the shared information, the regions outside it hold the isolated information.]
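
To make the shared/isolated distinction concrete, here is a minimal sketch in Python (my own toy model, not part of the paper). It assumes a 1+1-dimensional space/time with a common "start" at t = 0, and asks whether the past light-cones of two eventors can overlap at all; the function name and the setup are illustrative assumptions.

# Toy illustration: two eventors in 1+1D space/time, each given by its
# coordinates (t, x) with t > 0 measured from a common start at t = 0.
# Their Space-Bodies (past light-cones) can share information only if
# the cones overlap somewhere in 0 <= t' <= min(t1, t2).

C = 299_792_458.0  # speed of light in a vacuum, m/s

def shares_past(t1, x1, t2, x2, c=C):
    """Return True if the past light-cones of the two eventors intersect
    above t' = 0, i.e. if shared information is possible at all."""
    # At time t' the past cone of (t_i, x_i) covers the spatial interval
    # [x_i - c*(t_i - t'), x_i + c*(t_i - t')].  Both intervals are widest
    # at t' = 0, so they overlap somewhere iff they overlap at t' = 0.
    return abs(x1 - x2) <= c * (t1 + t2)

# Two eventors that are one second "old", separated by one light-second:
print(shares_past(1.0, 0.0, 1.0, C))         # True  -> shared info.
print(shares_past(1.0, 0.0, 1.0, 10.0 * C))  # False -> isolated info.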

Gravity can be understood as an anti-expansion phenomenon that slows down the expansion process of our universe:

[Figure: an eventor's cone, labeled "Gravity" and "Rest mass".]

Gravity is the tendency of some eventor to return to its basis along its time-
body.

From this point of view, a black hole is an eventor that has the tendency to
return to the basis of its space/time body, along its time-body.

The time-body of some eventor is actually its most integrated state that has a
constancy property, which is recognized by us as rest mass.

Particles like photons have no rest mass; they exist along the light-cone, which determines the space-body of any eventor that has a rest mass.

Complexity can progress where there is some equilibrium between gravity and expansion along the time-body of some eventor.

For example, our planet is an eventor that has this equilibrium, which
dramatically increases the probability of the existence of complex systems.

These systems have the ability to replicate themselves in such a way that
also includes the existence of slight deviations, called mutations.

Self-replication and mutation are the proper conditions for life-form progression, and together with proper environmental conditions there can be an evolutionary process that leads to the existence of intelligent life forms.

An intelligent life form has the ability to be aware of the connection between its time-body and its space-body.

This connection progresses along a curve that actually describes the self-
aware states of some eventor.

The next pages describe briefly (in a non-technical way) an idea called
Cybernetic-Kernel, which tries to formulate the progression of self-aware
states along the time-body (time-line) of an eventor.

About Life

This paper is meant to contribute something to our understanding of the Life phenomenon.

My point of view is based on a new concept of the Language of Mathematics, where instead of the common Natural numbers I use a new information form called "Organic Number" or ON.

Let redundancy be the case where more than one copy of the same entity can be found.

Let uncertainty be the case where more than one unique name is related to the same entity.
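
As a small illustration (a sketch of my own, not the paper's formalism), the two definitions can be read operationally in Python as follows, where a collection is a list of entities and each entity may carry a set of possible names; the function names are assumptions made here for clarity.

from collections import Counter

def has_redundancy(collection):
    """Redundancy: more than one copy of the same entity can be found."""
    return any(count > 1 for count in Counter(collection).values())

def has_uncertainty(names_per_entity):
    """Uncertainty: more than one unique name is related to an entity."""
    return any(len(names) > 1 for names in names_per_entity.values())

print(has_redundancy(['x', 'x']))               # True  (a multiset like {x,x})
print(has_redundancy(['x', 'y']))               # False
print(has_uncertainty({'bead': {'a', 'b'}}))    # True  (one entity, two names)
print(has_uncertainty({'bead': {'a'}}))         # False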

ONs are based on a logical reasoning called "Complementary Logic", where redundancy and uncertainty are used as fundamental properties.

From a global point of view the fundamental concept is Symmetry, which is measured by its internal redundancy_AND_uncertainty fundamental properties.

The standard set is the case where redundancy_AND_uncertainty cannot be found as fundamental properties, because we deal with broken symmetry, which is recognized as an information form where both cardinal and ordinal values are already well-known.

A multiset (in the case of a finite collection) is the case where the cardinal value is well-known but the ordinal is unknown; therefore multisets' existence is based on less well-known conditions, and they can be understood as more fundamental than the "normal" sets.

But from a global point of view, where symmetry is the fundamental concept, the system is the fading transition between a "pure" multiset (for example: {x,x}) and a "pure" set (for example: {{x},x}), and vice versa. Von Neumann's modeling is no more than the particular case that is based on infinitely many nested levels of the {{x},x} information form, which is actually a non-finite Binary-Tree:
0 = |{ }|
1 = |{{ }}|                                           = {0}
2 = |{{ },{{ }}}|                                     = {0,1}
3 = |{{ },{{ }},{{ },{{ }}}}|                         = {0,1,2}
4 = |{{ },{{ }},{{ },{{ }}},{{ },{{ }},{{ },{{ }}}}}| = {0,1,2,3}
...

[Diagram: each value n is also drawn as a binary tree of F/T branches, where every level pairs the previous branches until a single root is reached.]
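
For readers who want to reproduce this construction, here is a short Python sketch (the helper names are mine); it builds the von Neumann form n = {0, 1, ..., n-1} as nested sets and prints it in brace notation.

def von_neumann(n):
    """Return the von Neumann ordinal n as a nested frozenset:
    0 = { } and k+1 = k U {k}."""
    ordinal = frozenset()
    for _ in range(n):
        ordinal = ordinal | frozenset([ordinal])
    return ordinal

def show(s):
    """Render a nested frozenset in brace notation, smallest members first."""
    return '{' + ','.join(sorted((show(e) for e in s), key=len)) + '}'

for n in range(5):
    print(n, '=', show(von_neumann(n)))
# 0 = {}
# 1 = {{}}
# 2 = {{},{{}}}
# 3 = {{},{{}},{{},{{}}}}
# 4 = {{},{{}},{{},{{}}},{{},{{}},{{},{{}}}}}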

This Binary-Tree stands at the basis of the natural numbers (N) because the information form of a 2-valued excluded-middle logic is based on False_XOR_True:

F T
. .
| | <--(Standard Math logical system fundamental building-block)
|xor|
|

If we envision a collection from this point of view, we immediately realize that the ZF or Peano axiomatic systems are no more than the particular case where redundancy_AND_uncertainty are not used as fundamental properties of these axiomatic systems.

An ON is an ordered information form between a set and a multiset, where its exact place in some collection is determined by its internal symmetrical degree, measured by redundancy_AND_uncertainty.

Let us draw a partial case of ON 4 for better understanding.

If a,b,c,d are used to define uniqueness, then we get:

[Diagram: the forms of ON 4, arranged along two axes, where redundancy decreases from left to right and uncertainty decreases from top to bottom. At one extreme lies the pure multiset {a,a,a,a} = {x,x,x,x}; at the other extreme lies the pure set {a,b,c,d} = {{{{x},x},x},x}, which is the only no-redundancy_no-uncertainty symmetry used by the standard language of Mathematics. Between these extremes lie the intermediate forms {{x,x},x,x}, {{{x},x},x,x}, {{x,x},{x,x}}, {{{x},x},{x,x}}, {{{x},x},{{x},x}}, {{x,x,x},x} and {{{x,x},x},x}.]

Symmetry

Let x be a general notation for a singleton.

When a finite collection of singletons has the same color, it means that all
singletons are identical, or have the maximum symmetrical-degree.

When each singleton has its own unique color, it means that each singleton in the finite collection is unique, or the collection has the minimum symmetrical-degree.
Multiplication can be operated only among identical singletons, whereas addition is operated among unique singletons.

Each natural number is used as some given quantity, and within this given quantity we can order several different sets that have the same quantity of singletons but differ in their symmetrical degrees.

In a more formal way, within the same quantity we can define several
possible states, which exist between a multiset and a "normal" set, where the
complete multiset and the complete "normal" set are included too.

As this example of transformations between multisets and "normal" sets shows, the internal structure of each of the n+1 > 1 ordered forms is constructed by using all previous n >= 1 forms:

1
(+1)                = {x}

2
(1*2)               = {x,x}
((+1)+1)            = {{x},x}

3
(1*3)               = {x,x,x}
((1*2)+1)           = {{x,x},x}
(((+1)+1)+1)        = {{{x},x},x}

4
(1*4)               = {x,x,x,x}         <---- Maximum symmetrical-degree,
((1*2)+1*2)         = {{x,x},x,x}             Minimum information's
(((+1)+1)+1*2)      = {{{x},x},x,x}           clarity-degree
((1*2)+(1*2))       = {{x,x},{x,x}}           (no uniqueness)
(((+1)+1)+(1*2))    = {{{x},x},{x,x}}
(((+1)+1)+((+1)+1)) = {{{x},x},{{x},x}}
((1*3)+1)           = {{x,x,x},x}
(((1*2)+1)+1)       = {{{x,x},x},x}
((((+1)+1)+1)+1)    = {{{{x},x},x},x}   <---- Minimum symmetrical-degree,
                                              Maximum information's
                                              clarity-degree
                                              (uniqueness)
5
...
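
As a small check of the two endpoints of this ordered collection, here is a Python sketch (my own illustration; it builds only the two extreme forms shown above and does not enumerate the intermediate ONs, whose generative rule belongs to the paper itself).

# The two extremes of the ordered collection for a given quantity n:
#   (1*n)                 -> the pure multiset {x,x,...,x}       (maximum symmetry)
#   ((...((+1)+1)...)+1)  -> the fully nested {{{{x},x},x},x}    (minimum symmetry)

def pure_multiset(n):
    """Maximum symmetrical-degree: n identical singletons."""
    return ['x'] * n                 # a plain list stands in for a multiset

def fully_nested(n):
    """Minimum symmetrical-degree: start from {x} and wrap at each step."""
    form = ['x']                     # (+1) = {x}
    for _ in range(n - 1):
        form = [form, 'x']           # (previous + 1) = {previous, x}
    return form

print(pure_multiset(4))   # ['x', 'x', 'x', 'x']           ~ {x,x,x,x}
print(fully_nested(4))    # [[[['x'], 'x'], 'x'], 'x']     ~ {{{{x},x},x},x}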

This ordered collection allows us to use ONs in order to define a time-dependent model called the Cybernetic Kernel.

In my opinion, the Cybernetic Kernel model can be used in order to improve our insights about the transition between organic and inorganic chemistry.

Today we know that there were tiny irregularities in the Big-Bang's space/time fabric; these irregularities are perhaps the fundamental conditions that allowed the existence of galaxies and clusters of galaxies, which have a foam-like shape when observed from a great distance.

This foam-like shape is the result of opposite tendencies of Energy/Matter integration/differentiation fluctuations.

These fluctuations and their dynamic results can be found in any observed
scale of our universe.

From the second law of Thermodynamics we know that there is a global tendency in the observed universe, which actually eliminates the difference between integration and differentiation, until these fluctuations no longer express the clear and highly ordered Energy/Matter phenomena that can be observed in the early and present universe.

If this is the destiny of our universe, then we can ask: how did the original fluctuation, whose Thermodynamic death we observe, come into being?

Another question is: do we correctly interpret the Energy/Matter integration/differentiation fluctuations in the observed universe?

Let us examine a different model of these observed fluctuations:

By using the inflation model of the Big-Bang, we say that the first
fluctuation had a very high correlation, which allowed the very early
universe to “speak” in the same fundamental “language” called the laws of
nature.

If we describe the front of the first fluctuation in terms of information, then this front was characterized by a high symmetrical degree, that is, by a very high redundancy degree of the first information structures, which were created at that very early stage.

But we must not forget that these identical information structures hold an elastic-like "memory" of several different non-linear degrees of space/time curvatures, which are "aspiring" to a singular state from different "points of view".

These different "points of view" of different non-linear degrees of space/time curvatures actually prevent a smooth return of each information structure to the singular state.

The result of this non-smooth return is a diversity of different information
structures that can be observed in our universe.

The Organic Number is the model that describes the progression of the
Cybernetic Kernel along a time-line, as a result of the return of these
information structures to their singular state.

Cybernetic kernels are information structures based on a self-reference property.

There is a direct ratio between the smoothness of a Cybernetic Kernel with a high degree of the self-reference property, and the complexity of the information structures that are based on this Cybernetic Kernel.

There is also a direct ratio between a Cybernetic Kernel and the self-aware states that can be found in non-trivial complex systems like living creatures.

At this stage most of the observed information structures have the tendency to become "Cybernetically-flat" in the long term (which is recognized as entropy), but by this model there is the possibility that in the very long term there will be more information structures that are based on "smooth" Cybernetic Kernels, and life phenomena, which we are a part of, will be the main phenomenon that shapes our universe.

Organic Numbers and Cognition


Let us examine this situation:

On a table there is a finite unknown quantity of identical beads > 1.

We have:

A) To find their sum.

B) To be able to identify each bead.

Limitation: we are not allowed to use our memory after we count a bead.

By trying to find the amount of the beads (representing Locality) without using our memory (representing Non-locality), we find ourselves stuck at 1, so we need an interaction between Non-locality and Locality if we wish to be able to find the beads' sum. By canceling the limitation mentioned above, condition (A) is satisfied and the beads' amount is known; for example, the value 3. Let us now try to identify each bead: they are identical, so we will identify each of them by its location on the table.
But this is an unstable solution, because if someone takes the beads, puts them between his hands, shakes them and puts them back on the table, we lose track of the beads' identities. Each identical bead can be the bead that was identified by us before it was mixed with the other beads.

We shall represent this situation by:

((a , b , c),(a , b , c),(a , b , c))

By notating a bead as, let's say, 'c' we get:

((a , b),(a , b),c)

and by notating a bead as 'b' we get:

(a,b,c)
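
The whole process can be mirrored by a short Python sketch (my own modeling of the situation, not the author's notation): each bead starts as a superposition of all three names, and notating one bead removes that name from the others.

def name_bead(beads, index, name):
    """Fix the name of one bead and remove that name from every other bead."""
    return [{name} if i == index else possible - {name}
            for i, possible in enumerate(beads)]

beads = [{'a', 'b', 'c'}] * 3          # ((a,b,c),(a,b,c),(a,b,c))
beads = name_bead(beads, 2, 'c')       # ((a,b),(a,b),c)
beads = name_bead(beads, 1, 'b')       # (a,b,c)
print(beads)                           # [{'a'}, {'b'}, {'c'}]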

We satisfy condition (B), but through this investigation we define a universe, which is the result of Non-locality\Locality interaction. This result can be used for further mathematical development. Through this simple test we have found that the ZF or Peano axioms "leap" straight to the "end of the story", where cardinal and ordinal values are well-known. As a result, many forms that have many clarity degrees are not researched. Organic Mathematics uses Distinction as a first-order property, and as a result superposition of identities is one of its fundamentals.

Through this simple test we get the insight that any mathematical concept is first of all the result of memory\object (abstract or non-abstract) interactions (it is not totally ideal and not totally real [1]; for further details please see the section "The Ideal and the Real" below). Let us examine what kind of ONs we get if each information form is based on memory\object interactions.
Since each ON is at least an association between our memory and some
object(s), its form is based on interactions between at least two opposite
properties: Non-locality (memory) \ Locality (objects).

By using memory\object interactions as the basis of Organic Numbers, the researcher is basically educated to be aware of himself during research. This fundamental attitude enables us to define and develop the bridging between Ethics and Formal Logic. An example of such a development can be shown by the idea of Cybernetic Kernels:

Cybernetic Kernels (CK)

[Diagram: six Cybernetic Kernels, CK1 to CK6, drawn as curves and ordered by the number of their self-interferences.]

There are 6 different CKs in this diagram, which are ordered by the number of their self-interferences. If we give an "elastic" property to CKs, then CK1 is changed into CK6, and the level of ON5 Cybernetic Efficiency is increased at each step. When the Cybernetic Efficiency is increased, the ONs' redundancy and uncertainty levels are reduced, which enables complexity and self-awareness to be developed. We think that both Ethics and Formal Logic have a common principle, which is: to develop the bridging between the simple and the complex under one comprehensive framework that is aware of its results (it is naturally responsible).

The Complementary Space/Time (CoST) model:

Connectivity or integration is the property that is recognized by us as time or correlation among different entities. A time-line of some universe is the most connected state, where no discrete phenomenon exists and all we have is a smooth connectivity without space (no measured place).

Non-connectivity or differentiation is the property that is recognized by us as space or non-correlation among entities. A space of some universe is the most disconnected state, where no smooth phenomenon exists and all we have is discreteness without time (no timing) or correlation (no measured flow).

Our universe is both time_AND_space, and this complementary relation can be found at any researched level within and without us. A cone and a sphere are two separate models of a universe, where a sphere is a closed universe (it has a "start", a "middle" and an "end" along a time-line) and a cone is an open universe (it has a "start" but no "middle" and no "end" along a time-line).

A time-line in both models is like the "spine" of a universe, relative to which any space/time phenomena change. Space/Time is a complementary fading transition between "pure" time (the time-line) and "pure" space (the surface). In other words, time and space are the polarities of the same phenomenon called a universe, whose history exists as a complementary space/time environment with common "laws of nature" determined by the time-line, which is actually the attractor of a universe.

This time-line can be a single time-line, which is the attractor of a single universe (closed or open):

Also a time-line can be a one branch that belongs to a tree-like attractor that
has a fractal-like property:

"A graphical representation of the multiple inflationary bubbles of Andrei Linde's "self-
reproducing inflationary universe". Linde's theory is one attempt to generate a "world
ensemble," or ensemble of varying universes -- within a larger Universe -- in which the
physical laws and properties may differ from one universe to the next. Changes in
shading represent "mutations" in basic physical properties from "parent" to "offspring"
universes. (Figure after Andrei Linde, "The Self-Reproducing Inflationary Universe, "
Scientific American 271 [November 1994]; p.55.)"

If we produce a cross-section and examine an arbitrary slice of a universe whose space/time fabric is the result of integration/differentiation tendencies between "pure" time and "pure" space, then a natural equilibrium between these "purities" has the shape of an Archimedean-like curve, for example:
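
For concreteness, one standard curve of this kind is the classical Archimedean spiral; the explicit formula below is my own choice of illustration, since the paper does not give one. In polar coordinates,

r(\theta) = a + b\,\theta, \qquad \theta \ge 0,

where successive turns of the curve keep a constant radial spacing of 2\pi b, one way to picture an even equilibrium between the "pure" time at the center and the "pure" space away from it.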

This Archimedean-like curve is considered as the optimal zone, where complex phenomena like life can find stable and rich enough conditions in order to progress to self-aware non-trivial complex systems.

Summary:

This paper expands the wave-particle duality to a whole universe, where a universe is a complementary phenomenon that exists between two opposite properties, which are integration and differentiation.

Integration is understood as gravity and differentiation is understood as expansion.

The most integrated state is understood as the fourth dimension, which is time or the time-line.

The 3 other dimensions are the observed space, which is ordered relatively
to the time-line that is considered as its attractor.

So the history of a universe is the story of space/time complementary associations along the time-line.

Without this time-line, no fundamental conditions can appear as the natural laws of a universe.

With this model we can examine the idea of rich enough conditions in the
space/time fabric, which could explain the origin and progression of life
phenomena along the time-line.

The next part of this research is to develop a new fundamental mathematical language, using the insights coming from Quantum Mechanics, where redundancy and uncertainty are fundamental properties of its axiomatic system.

By doing this, we actually re-examine the whole scientific cosmological research in a new light, where the researcher himself is both observer_AND_participator.

From this point of view any result at any level (and not just at the QM level) is influenced by the researcher, and the researcher has to include this influence as an inseparable part of his results.

By using the word 'result' we mean that by this model, ethical results must also be considered as an organic part of scientific research and development, where 'development' has two bases, which are our technical skills and our ethical skills, combined into one comprehensive scientific method that can help us survive the power of our developed technology along the time-line.

The Ideal and the Real

OM's development is possible because we determine the limits of the researchable by using the weak limit (Emptiness) and the strong limit (Fullness). Cantor distinguished three levels of existence:

1) In the mind of God (the Intellectus Divinum)

2) In the mind of man (in abstracto)

3) In the physical universe (in concreto)

By using Fullness as "that which has no successor" we show that Cantor's in abstracto Transfinite system is not an actual infinity. We also show how Distinction is a first-order property of any collection. These developments are based on a cognitive approach to mathematical science. In "On the Reality of the Continuum" [1] (page 124) we find this sentence:

"From the realist standpoint, numbers and other real things do not need
admitting or legitimating by humans to come into existence."

From the idealist standpoint, numbers and other real things do need admitting or legitimating by humans to come into existence. In both cases the term "real thing" has to be understood. According to the realist, if "real things" are "real" iff they are totally independent of each other, then no collection is a "real thing" (total independence does not enable things to be gathered).
According to the idealist, if "real things" are "real" iff they are totally dependent on each other, then no collection is a "real thing" (total dependence does not enable things to be identified). No collection exists in terms of total dependency (total connectivity) or total independency (total isolation). Since totalities are not researchable on their own, any research cannot avoid the existence of collections, where collections are the only researchable "real things". Actually we find that a researchable realm is both ideal (has relations) and real (has elements).
We have to notice that there is no symmetry in using concepts like the "Realist standpoint" in order to understand "real things", because if the requested result is "real things" then we actually give a privilege to the Realist standpoint over the Idealist standpoint about the requested "real thing". This asymmetry can be avoided by changing the requested results to "researchable things" instead of "real things". In that case the concept of Collection is researchable exactly because it is not totally real and not totally ideal.

Here is the last part of the quote from [1]:

"Furthermore, real objects are always legitimate objects of study in the


sciences, even if they are not fully understood or known."

We agree with this quote because "real objects" are valuable for science
iff they are researchable, or in other words, they are both real and ideal.

Reference

[1] Anne Newstead & James Franklin, "On the Reality of the Continuum", Philosophy 83 (2008), 117-127. http://web.maths.unsw.edu.au/~jim/newsteadcontinuum.pdf

