
temporal relativity, spacelet theory, and coherent processes

sg micheal, 2011/JAN/09

In honor of Richard Feynman, in sincere humility, i hereby found a new branch of science with
three main branchlets: temporal relativity, spacelet theory, and coherent processes. TR is the
theoretical foundation behind spacelet theory. ST is the fully deterministic theory of elementary
particles. Coherent processes is the deterministic analysis of phenomena such as lasing, superfluidity, and coherent structures in turbulent flow. Because the assumptions of these areas are in stark
contrast with those of the Standard Model of particle physics, we cannot use the standard
methodologies of quantum field theory and conventional quantum mechanics. We must
necessarily develop a temporal curvature analogy of quantum field theory. In fact, it is argued the
phenomenal success of quantum electrodynamics is actually due to this conceptual analogy
between the proposed temporal curvature theory and quantum field theory.

Many erroneously declare the 'greatest physicist of all time' to be Albert Einstein. Although his faith in a theory like the one described above and his contribution of relativity are fully recognized, much of convention recognizes Richard Feynman as that person. There are two basic reasons why he deserves this title. One is that Feynman basically 'taught physicists the deep structure of physics' by providing them the tools they needed to formally justify their ideas. The other is that he himself recognized the need for a 'less haphazard' theory of elementary particles. Feynman
was not as vocal about this as Einstein was but still believed we could find a better way of
looking at things. The fact he had the mathematical and conceptual sophistication to develop
modern QFT and QED, and the foresight to recognize their weaknesses, is the reason we honor
him.

The inspiration for this theory comes from the inconsistent modeling approaches between
conventional branches: nuclear chemistry and quantum mechanics. One assumes inherent
stability and the other quite the opposite. This inconsistency within convention was one of the
stimulants. However, the main impetus/motivation for it is the faulty main assumption of the
Standard Model: elementary particles are probability waves that interact via virtual particles.
Admittedly, this is the logical extension of Heisenberg's early matrix formulation. But at that
time, we did not have the level of engineering sophistication capable of 'pointing the way' toward
unification.

In my estimation, it will take about 100 years for this theory to be established because of the conceptual inertia of the physics community. So by 2110, we should be looking back at this
time saying "how could we be so naive?" As stated above, it's not really convention's fault for not
developing this theory previously. We simply didn't have the sophistication in perspective to be
able to model things appropriately. That's forgivable. What's 'unforgivable' (strictly speaking,
nothing is unforgivable) is if we ignore this 'wake up call'.

Many will see me as incapable of establishing a new branch of science. Many will see me as arrogant. However, for close to 40 years i've made humility a lifetime discipline, and for approximately 30 years i've privately studied physics. Even considering these factors, many
would still dismiss me as incapable. But it's always the 'fresh perspective' in science which
'solves the problem'. This particular perspective comes from engineering.
There are two engineering concepts and one relatively new area that contribute: elasticity,
impedance, and wavelet theory. Within engineering, these concepts are well developed and
deserve more attention from the physics community. Admittedly, specifically because of this
deterministic bent and the fact impedance 'smells' something like the historical aether, physicists
have ignored them.. But again, because of the level of sophistication of the models being
discussed, we cannot afford to ignore them anymore.

Models is plural above because we're being inclusive of general relativity. The new branch
briefly described above has its roots in both general and special relativity - as much as the
engineering concepts mentioned above. So it's not as if this 'new tree' of science does not have
foundations / roots / conceptual inspirations .. The path to this tree/moment was quite convoluted
and took me through territory mentioned above: from impedance to special relativity to general
relativity to now. It's not so much that i 'borrowed concepts' as that i began to see a clearer image of this view of elementary particles.. What's the famous quote.. "We see through a glass darkly.." Not exact but you get the idea.. So the main reason Feynman himself could not develop these ideas was because, i believe, he was browbeaten into submission by himself to follow a conventional path. How would we have received his proclamations had he instead followed the path described here? i believe quite derisively, as i have been.. So basically Feynman had 'no choice' - but to support convention.

There is tremendous unspoken pressure to conform in the physics community. Basically, the
tenet is: conform or don't get supported. This is the battle any newcomer must face. Even
Einstein faced great ridicule within the community because of his philosophy in his later years. He was mocked and denigrated.. It's the sad unfortunate truth that even physicists are subject to
normal human frailties.. ;) So much of my previous writings were about the philosophy of
science, scientific method, and Occam's Razor because we have led ourselves astray.

This is my formal statement because if i don't make it, history will not recognize me as saying so.
i must take a stand and make an unequivocal declaration: the Higgs will never be detected (if
something is found, it will not be the Higgs), no evidence for gravitons will ever be detected,
many of the 'forces' will be overturned such as Casimir and weak, and all so-called quantum
effects will be subsumed into various portions of ST or coherent processes. In lieu of any
commendation, i respectfully request we formally develop: temporal curvature theory, TR, ST,
and coherent processes independent of any probabilistic/QFT formulation.

..All the hullabaloo about 2012 may very well be this 'transformation in physics' i'm calling for..
What i'm calling for is a return to determinism and rationality. And quite honestly, a return to
sanity.

Iam space - a 5D bosonless model of spacetime

Iam space is a proposed alternative to Minkowski space. The latter does not permit curved spacetime and is therefore an incomplete basis for comprehensive models of physics in our universe. More specifically, Minkowski space does not allow the theoretical framework of general relativity. Iam space accommodates this, including electromagnetic theory. The formal structure of Iam space and a comprehensive list of implications still need development; however, it is a promising alternative. The initial benefits include: a better fit to reality than Standard Model predictions, an intuitive and accessible framework which connects to engineering more readily, and a more balanced and holistic approach.

There is evidence, from the IT community, that spacetime can be modeled in five dimensions
where the fifth dimension is scale. This is from an information standpoint. But consider that
instead of scale, a measure of curvature is introduced. Further, that index need not be associated
with spatial dimensions as assumed in general relativity. Curvature may exclusively be
associated with time.

Conventional extensions of the Standard Model predict proton decay. No obvious examples of
proton decay have been detected. The Standard Model predicts the existence of the Higgs boson.
None have been detected and the allowable mass range is fast becoming excluded. Conventional Higgsless frameworks are ad hoc at best. They are neither comprehensive nor realistic. It is proposed the Higgs itself is an ad hoc construction of extremely dubious benefit.

Benignly, convention has accommodated theoretical work over the decades - assimilating various
constructs into the Standard Model: Casimir, non-locality, multi-state atoms and nuclei,.. But
these concepts actually detract from a holistic perspective. The concept of virtual-exchange,
arrived at through over-application of reduction, associated with Feynman's QED and path-
integral formulation of QM, is perhaps the worst example.

The development of quark theory is similar in that an incredible amount of theoretical effort has
been expended to explain the veritable zoo of particles detected in collider experiments.
However, none of these particles are actually stable. Exactly four stable particles are known:
proton, electron, neutrino, and photon. These are listed in order of decreasing mass. No conventional mass is ascribed to the photon and only a minority of physicists ascribe any to the neutrino. The few who ascribe any mass to the latter designate it minuscule.

It is proposed the mass pattern, in terms of curvature (curvature is C = EtP/h, normalized c = 1), is: (11/9)*10^-20, (6/9)*10^-24, (1/9)*10^-28, and (-4/9)*10^-32, which correspond to: proton, electron, neutrino, and photon. (Normalized implies: µ0 = 1/ε0, Z0 = µ0, γ = √(1-v^2), lP = tP, E = m, and C/tP = E/h, where lP and tP are the Planck length and Planck time, and mu, epsilon, Z, and gamma are the standard designations of permeability, permittivity, impedance, and relativistic factor.) It is proposed the anti-particles
associated with each still exist in our universe accumulated in anti-galaxies or perhaps in a
parallel anti-universe. That speculation is somewhat irrelevant to this discussion. Iam space does
not depend on it.
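
As a quick numerical check, here is a minimal sketch in Python of the curvature formula C = EtP/h, assuming standard CODATA values (the variable names are mine, purely illustrative). It reproduces the proton coefficient quoted above; the electron value it prints anticipates the correction given later in 'Error in numbers but not concepts'.

    # minimal sketch: temporal curvature C = E*tP/h, with E the rest energy
    h  = 6.62607e-34   # Planck constant, J*s
    tP = 5.39125e-44   # Planck time, s
    c  = 2.99792458e8  # speed of light, m/s

    masses = {"proton": 1.67262e-27, "electron": 9.10938e-31}  # kg

    for name, m in masses.items():
        E = m * c**2       # rest energy, J
        C = E * tP / h     # dimensionless temporal curvature
        print(f"{name}: C = {C:.3e}")

    # proton:   C ~ 1.223e-20, close to (11/9)*10^-20 as quoted above
    # electron: C ~ 6.661e-24, i.e. (2/3)*10^-23 as corrected later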

The coefficients are surprisingly exact. The first two are known. The rest are derived from the
pattern implied between the first two. Again, the values assigned to neutrino and photon are
speculative and based only on two data points. This is the unfortunate reality of the situation. If
humanity had allocated resources toward experimentally investigating neutrino and photon
masses instead of Higgs and others, they would have more hard evidence for them. This is an
indirect consequence of the 'benignness' mentioned above.

Other surprising consequences appear if the pattern is continued. It's further proposed the curvature pattern continues with so-called dark-matter constituents: (-9/9)*10^-28, (-14/9)*10^-24, and (-19/9)*10^-20, where the exponents are arrived at through symmetry about the photon. These are undramatically labeled: dark-neutrino, dark-electron, and dark-proton. Assuming a balanced distribution of particles, the ratio between dark-energy and dark-matter/normal-matter is surprisingly close to conventional calculations: 72/28 vs 74/26. Note that photons have
slight negative curvature in this scenario.

The three paragraphs above are actually later developments/consequences of Iam space which are
not especially integral to the theory. They're an illustration of implications mentioned in the
abstract. Explicit predictions of the theory include: no Higgs nor graviton signatures, nuclear
meta-stable states are controllable, 'multi-state' atoms/molecules are controllable, double-slit
experiments are controllable via slit separation, slit size, and materials involved,.. Further
research is required in at least two areas: theoretical investigation of implications of complex
time including possible non-local effects - and - simulation runs of various media interfaces
varying media, energy range of TEW, and temporal curvature.

A first-order correction to Iam space is (x, y, z, Zit, EtP/h) where the first three items are
Euclidean coordinates, Z represents the impedance of the media at that location, i is the
fundamental complex number, and the last is equivalent to C, temporal curvature at the same
location. While impedance has a tendency to lengthen the period between electromagnetic
events, curvature has a tendency to lengthen the period between mechanical events. This is the
essence of the theory. It's equivalent to Feynman’s statement about understanding non-locality
and QM.
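
For concreteness, here is a minimal sketch in Python of one such 5-component coordinate, treating Zit and C = EtP/h as derived quantities; all names are illustrative assumptions, not part of the theory.

    from dataclasses import dataclass

    T_P = 5.39125e-44   # Planck time, s
    H   = 6.62607e-34   # Planck constant, J*s

    @dataclass
    class IamPoint:
        x: float        # Euclidean coordinates
        y: float
        z: float
        Z: complex      # media wave impedance at this location
        t: float        # local time
        E: float        # local energy density term

        @property
        def zit(self) -> complex:
            return self.Z * 1j * self.t   # fourth component, Z*i*t

        @property
        def C(self) -> float:
            return self.E * T_P / H       # fifth component, temporal curvature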

The three basic assumptions of the Standard Model can be compared against those associated
with Iam space:

1. quantum self-interference is caused by non-locality
2. multi-state atoms/nuclei are exactly that
3. forces are caused by virtual exchange of force-carrying particles

versus:

1. quantum self-interference is caused by extended portions of the standing waves comprising elementary particles
2. multi-state atoms/nuclei are actually different representations (distinct instances) of possible equivalent energy states
3. there are two distinct forces in our universe: electromagnetic and another 'mediated' by temporal curvature

It's possible point one is moot since there's evidence Minkowski space 'causes' non-locality and
Minkowski space is contained in / a subset of Iam space. The latter three points are essentially a
deterministic view of quantum mechanics. The defining characteristic of the Standard Model and
its associated framework is item three: forces are caused by virtual exchange. The defining
characteristic of Iam space is essentially its deterministic counterpoint: there's only one force
called electromagnetism mediated by electromagnetic flux, impeded by 'temporal impedance',
and mechanical events are delayed by temporal curvature. Matter is essentially temporal
curvature. Gravitation is distributed temporal curvature. The strong nuclear force - similar. And,
relativistic/gravitational time dilation is enhanced temporal curvature.

Every event human eyes perceive is a direct consequence of either: temporal curvature or
temporal impedance. Essentially, it's a comprehensive yet simple model of our universe that
explains matter and interactions holistically. There's no need for virtual exchange because all
forces are accounted for.. The seeming omission of 'weak nuclear' is there because it's believed
humans simply don't understand that 'force'. Nuclear decay is a statistical process because
humans cannot know when unstable nuclei formed. The mechanisms of decay are barely
understood. When humans approach these deterministically, from within the Iam framework,
they will understand more fully nuclear decay.

The author is well aware of multi-state energy levels, for example in excited helium. Included are
examples of so-called quantum fluids. It's believed the conventional understanding via
'successful application' of PQM is simply due to statistical analogy of some not yet understood
deterministic coherent process. There is little doubt in the author's mind that if humanity put as
much effort into validating DQM as they have PQM, many insights would be revealed.

One way to think of gravity is as curved space. Another way to think of gravity is as curved time
(only). An object in a circular orbit (around Earth) is following a 'straight line' path (of least
action) through curved space - or - is following a path of same temporal curvature. An object in
free-fall is following a straight-line path to the maximum of spatial curvature - or - is following a
path to the maximum of temporal curvature. Gravity can be analyzed exclusively as a distributed
compression of time. (All trajectories can be treated as a linear combination of those two
orthogonal trajectories. They are fundamentally different in terms of temporal curvature. All
extended objects experience a gradient on different parts of their extension - it’s not just the
‘steepness of the hill’ which pulls them down. In the same way, time is infinitesimally slower on
the ‘low side’ of an object in orbit. Objects move to maximize time-dilation.)

It's interesting to note the author's initial revulsion toward multi-dimensional approaches, such as those associated with string theory, was finally overcome in the discovery of Iam space. Occam's Razor was employed consistently in development/discovery. 5D or 10D, only time will decide..

References:
http://en.wikipedia.org/wiki/Minkowski_space
http://en.wikipedia.org/wiki/General_relativity
http://www.springerlink.com/content/f89m346543l3g071/
http://en.wikipedia.org/wiki/Proton_decay
http://en.wikipedia.org/wiki/Higgs_boson
http://en.wikipedia.org/wiki/Higgsless_model
http://en.wikipedia.org/wiki/Casimir_force
http://en.wikipedia.org/wiki/Non-locality
http://en.wikipedia.org/wiki/Static_forces_and_virtual-particle_exchange
http://en.wikipedia.org/wiki/Quark_theory
http://en.wikipedia.org/wiki/Elementary_particles
http://en.wikipedia.org/wiki/Planck_units
http://en.wikipedia.org/wiki/Anti-matter
http://en.wikipedia.org/wiki/Dark_matter
http://en.wikipedia.org/wiki/Graviton
http://en.wikipedia.org/wiki/Imaginary_time
http://en.wikipedia.org/wiki/Characteristic_impedance
http://en.wikipedia.org/wiki/Standard_model
http://en.wikipedia.org/wiki/Electromagnetism
http://en.wikipedia.org/wiki/Flux
http://en.wikipedia.org/wiki/Weak_force
http://en.wikipedia.org/wiki/Superfluid
http://wikibin.org/articles/gravitation-distributed-temporal-curvature.html
http://en.wikipedia.org/wiki/String_theory
http://en.wikipedia.org/wiki/Occams_razor

Another Possible Curvature Pattern (edited)

[deleted section] Of course, with only two data points (associated with proton and electron), the
curvature pattern could be anything: linear, non-linear, random,.. [deleted section discussing a
symmetric distribution about zero curvature] These could indicate several things. They could be
anti-particles or dark matter. The fact is we're talking about negative temporal curvature which
indicates anti-particles according to Dirac. But, the first element in the second pattern could
correspond to dark-energy (photons with slight negative temporal curvature).

[deleted section] Any dark-matter described above may behave like a massless superfluid (since
their negative temporal curvatures correspond to anti-gravity - a repulsive gravitational force). i
leave it to cosmologists to figure out the 'nitty gritty' of particle distribution patterns. The fact we
see nearly perfectly flat space indicates a balanced pattern about zero curvature.

But of course, that does not acknowledge any stellar/nucleosynthesis processes that disturb
initially balanced distribution patterns. Our ways of measuring dark-matter and energy are just
emerging as parts of science. Many scientists regard cosmology as a very speculative field. So we
have a lot of work to do: formally develop Iam space as Minkowski space has been, derive all
implications of complex time where the coefficient represents complex media impedance, and
formally determine full implications of Iam space. This is the fundamental shift in perspective
Feynman was calling for when he designed QED.

Deleted sections above indicate the danger of embracing any particular particle distribution scenario. Quantum cosmology is a rapidly evolving field, subject to empirical revelations that typically restructure it every time a significant new discovery is made. In
order to make progress in this new branch, we need to focus on Iam space and implications as
described above. Later, we can work on particle distribution patterns.

Error in numbers but not concepts

It had been quite a while since i'd 'played with the numbers' (of curvature etc), so i didn't notice some errors in my numbers and numerical claims. Taking five numbers and their associated uncertainties, i propagated the uncertainties until i arrived at the curvature ratio between electron and proton. In the process, i noticed an error in the electron curvature and an error in my claim about 'exactness'. Please forgive this - it was not intentional and the error actually helped me think more clearly about the concepts. But to clear the air, i will list out my numerical errors.

Idiotically, i couldn't even get the ratio right because i had the electron wrong. The correct
approximate values are: (2/3)10^-23, (11/9)10^-20, and (6/11)10^-3 for electron, proton, and
ratio. That leaves the progression as approximately: (11/9)10^-20, (2/3)10^-23, (4/11)10^-26, and (24/121)10^-29. It's a little sticky to convert back and forth between curvature and mass so the easier approach is simply to use the NIST value for the proton-electron mass ratio, invert it, and apply that to the electron mass. The mass progression from proton to electron to something-1 to something-2 is: 1.673*10^-27 kg, 9.109*10^-31 kg, 4.961*10^-34 kg, and 2.702*10^-37 kg.

Upsettingly, i re-realized the upper limit for neutrino mass proposed by convention is about 2 eV
(electron-volts). This converts to a mass of about 4*10^-36 kg. So.. "I'm frakked" (with the
numbers) as Starbuck loves to say. 8| Even considering my error bounds as i traced them above, i have no leeway to claim rational numbers for coefficients. :( And what do we do with
the conventional claim neutrinos have a maximum mass of 2 eV? i can't just pretend that doesn't
exist..

My intuitive prediction for the pattern would have been something like this: proton, electron,
positive charged 'something-1', and negative charged 'something-2'. Not a neutral particle. So
there's something wrong with the way we measure mass, our theory about neutrinos, or my
thought processes. ;) i know what you're going to say..

All my numerical errors cannot erase the conceptual insights i've arrived at over the years: charge
moment is impeded spin, elementary particles are dual flux-vortices and screw-dislocations in
spacetime, more deeply - they are spherical standing waves of temporal curvature, and all the
associated insights of complex time.. i know there are many who would 'throw up their arms' at
the numerical mistakes i've made but the spirit and motivation of/for Iam space stands.

A billion numerical errors and Iam space are preferable to the Standard Model and virtual
exchange.

What do we call this new beast?

For quite a few months i've labored to accurately label this new branch of physics which
combines ideas from: general relativity, electromagnetism, and special relativity .. We can't keep
calling it "bosonless particle physics" because that only says what it's not. i've posted two
websites on scholarpedia.org and wikiversity.org with that name. Likely the former will be
removed due to scholarpedia restrictions but i hope not: essentially it belongs there because it's a
new branch of science. We can't call it part of quantum cosmology because that area depends on
conventional quantum mechanical techniques which employ constructs in direct conflict with
this approach. It's not quantum 'anything'. We could call it relativistic cosmology or
electromagnetic relativity but there has to be a stipulation: we're using a modified form of
relativity that does not curve space - only time. So strictly speaking, general relativity has to be
reformulated from the temporal curvature perspective, or temporal curvature must be developed
formally as general relativity has, then this must be conceptually combined with complex time
and Euclidean space to formally justify Iam space. This is the 'bottom up' approach. The 'top
down' approach takes Iam space as is, and develops cosmologies without benefit of any
conventional quantum mechanical techniques. Both approaches are formidable because they
require us to become a kind of 'TC Feynman' - inventing tools as we go along..

As suggested previously, we could use a curvature analog of Feynman's path-integral formulation of QM .. (It was argued previously that the reason Feynman's QM/QED is so successful is because it's actually an analog of a yet-to-be-developed TC approach to particle physics.) So "TC
particle physics" or relativistic cosmology - you decide.. Regardless of what we call it, i predict it
will eventually replace QFT and the Standard Model. Please don't ever call me "TC Feynman"
because i haven't earned it, but i sincerely hope one of us does.. sgm

PS: again, CTCED (complex temporal curvature electro-dynamics), Iam physics, TC particle
physics, CTCT (complex temporal curvature theory), relativistic cosmology,.. you decide. But
let's move forward regardless.

Hey Baby, Your Space or Mine? ;)

When i was at Michigan State, i almost dated a conventional physicist. But in addition to her busy personal life, she had conceptual reasons for avoiding me. She adamantly proclaimed helium is a quantum superfluid and cannot be modeled in any deterministic way*. At that time, i had not discovered Iam space so she could not reject me on that account. But if i had, surely she would have added that to her reasons for rejection. The component of Iam space called impedance is the critical factor in why conventional physicists reject my discovery. *Coherent processes is the deterministic way of modeling superfluidity.

There are several different forms/usages of the term impedance. There's electrical impedance, media impedance,.. and even gravitational impedance. But what i'm mostly referring to is the impedance of free space (although i'm including 'wave impedance'). Because of the historical
rejection of the luminiferous aether, conventional physicists refuse to accommodate any theory
which includes any aspect resembling it. Wave impedance is just too much like the luminiferous
aether for conventional physicists to stomach.

But i urge convention to reconsider, for the sake of temporal curvature. It's the proposed summary characteristic of reality which can explain: mass, gravitation, the strong force, time dilation (both gravitational and relativistic), and other relativistic effects. Unfortunately for Iam space, i have associated it with wave impedance. Conceptually, it was Occam's Razor which impelled me to simplify Iam space thus. So it was not my preference which dictated that action; it was expediency, elegance, and simplicity.

To my meager understanding of the subject, Iam space is the minimal construct which will 'do the job' of physics unification. Physicists, both conventional and fringe, have tried for years to achieve that. It refers to the attempt to unify the 'four forces of nature': electromagnetic,
strong nuclear, weak nuclear, and gravitation. But the basic stumbling block of convention has
been the assumption forces work via virtual exchange of bosons - attempting to integrate particle
physics with gravitation. If they had worked it 'the other way around': attempting to
develop/integrate particle physics from a 'general relativity' standpoint, they might have had more
luck.

Luck has nothing to do with Iam space - i claim divine inspiration.. i know, that in and of itself is
reason for rejection.. But isn't divine inspiration better than say: "I invented Micheal space!
Aren't I wonderful!" ;) That way is saturated in ego and i refuse to approach ANYthing in that
manner.. Again, Iam space is a divinely inspired model of spacetime curtailed by Occam's Razor.

In order to lighten the discussion somewhat, i'll copy-paste a satire of it here:


Boxing Match: VX (virtual exchange) vs TC (temporal curvature)

Courtesy of David Chalmers, http://consc.net/chalmers/ , please enjoy the following Monty Python skit about international philosophers:
http://www.youtube.com/watch?v=yiZt79UKUFQ

i must apologize i cannot produce the following skit in that style; i don't have a team of comedians working for me.. :( (Referee = R)

R: WELCOME to the ring! VX, virtual exchange! and TC, temporal curvature!
[crowd goes wild]
R: VX represents convention, is the reigning world champion,
has never been beaten in the history of physics!
[crowd goes wild]
R: in the other corner, please put your hands together for the upstart and newcomer,
temporal curvature!
[crowd boos and hisses]
[R waits for crowd to calm]
R: come forward gentlemen! [the two comply]
R: put your gloves together as a symbol of good faith, let the best man win! [the bell dings]
[VX and TC eye each other - gauging each other warily..]
R: they seem to be examining each other..
[crowd boos and hisses]
R: have some patience good people..
[crowd boos and hisses more loudly]
[under pressure from the crowd, the two boxers start to circle each other - fists ready..]
[the two dance and skip around like positronium]
R: they appear to be mimicking positronium..
[crowd becomes ugly - frothing at the mouth like rabid dogs..]
[under pressure from the crowd, the two begin feigning punches.. toy models of punches..]
R: I don't know what they're doing! but it looks virtually exciting!
[crowd doesn't believe and starts throwing food at the two..]
[under pressure, VX starts throwing real punches at TC]
R: finally! the fight begins!
[crowd screams in insane delight]
[TC blocks deterministically but VX's punches seem to move faster than light..]
R: look at those punches! you can barely see his fists move!
[crowd now frothing in rabid delight]
[VX repeatedly beats TC's body and face with lightning fast punches..]
R: look at those attacks! even from behind!
[crowd swoons in ecstasy]
R: oh! an unusual development!
[TC lowers his guard and slowly sits down in a lotus position..]
R: what is TC doing! the idiot! is he giving up???
[crowd resumes throwing food]
[impervious to crowd, referee, and opponent, TC meditates and Ohms..]
[the Ohms resonate throughout the auditorium silencing everyone but VX]
VX: get up and fight! coward! [VX screams at TC]
TC: Ohm..
[VX resumes his punching at incredible double-lightning speed..]
[his fists are a blur - nothing can be seen but VX's fists hitting his opponent..]
TC: Ohm..
R: what's happening! a new development???
[crowd gasps in amazement..]
R: some kind of temporal 'force field' is coming from TC! look everyone!
[VX is relentless - he increases his punch frequency..]
[now nothing can be heard or seen but a vibrating hum of VX's fists..]
[strangely, immediately surrounding TC is a glow of blue light, within that glow,
VX's fists slow markedly.. VX cannot make contact with TC anymore..]
[VX disappears in a puff of smoke, crowd gasps in astonishment]
R: well it looks like TC is the winner by default! VX has vanished!
[bell triple-dings]
[mixed reaction from crowd]
R: don't worry folks! [starts singing] Ti-i-i-ime is on my side! yes it is!..

References:
http://en.wikipedia.org/wiki/Superfluid
http://en.wikipedia.org/wiki/Coherence_(physics)
http://en.wikipedia.org/wiki/Electrical_impedance
http://en.wikipedia.org/wiki/Wave_impedance
http://en.wikiversity.org/wiki/Gravitational_characteristic_impedance_of_free_space
http://en.wikipedia.org/wiki/Impedance_of_free_space
http://en.wikipedia.org/wiki/Luminiferous_aether
http://wikibin.org/articles/gravitation-distributed-temporal-curvature.html
http://en.wikipedia.org/wiki/Occam's_razor
http://en.wikipedia.org/wiki/Mathematical_elegance
http://www.scholarpedia.org/article/Grand_unification
http://en.wikipedia.org/wiki/Fringe_physics
http://en.wikipedia.org/wiki/Static_forces_and_virtual-particle_exchange
http://en.wikipedia.org/wiki/Boson
http://en.wikipedia.org/wiki/General_relativity
http://en.wikipedia.org/wiki/Spacetime

Public letter to Timothy Clifton

i believe Timothy Clifton will become a recognized conventional authority in the theory of
gravitation and general relativity. He submitted his doctoral dissertation at King's College,
Cambridge University in August '06. The title of his dissertation is Alternative Theories of
Gravity. i'm reading it now. There's one statement that stands out to me: "It seems that GR is
unique not only in satisfying all of the conditions listed above, but also in being the simplest
relativistic metric theory of gravitation that can be conceived of." .. i've written to him personally
but believe some things, like this letter, should be part of public record.

The Natural Philosophy Alliance is a good place/forum for alternative physics ideas but we have
a tendency to get sidetracked and confused (forum conversations) .. The two greatest honors i have there are meeting other open-minded people who are critical of convention and the chance to air mine (criticisms etc). But we have our weaknesses. We have a tendency to push our
individual ideas at the expense of others'. We tend to have unbalanced perspectives.. This is the
disadvantage of 'living on the fringe'.. Personally, i'd rather we got integrated into mainstream
physics, found some kind of support individually, and published in mainstream journals. i'm not
referring to Physics Essays (this journal is somewhat fringe itself). Considering the world
economic situation and hoarding tendencies of human beings, it's unlikely my desire will be realized..

But any of us in NPA deserves a chance to air our ideas to 'the rest of the world' and so i've tried
to diligently maintain my relationship with NowPublic and Scribd .. My scholarpedia essay on
"Bosonless Particle Physics" has been deleted as anticipated .. Articles there are "by invitation
only". The intention is good but .. again i'm unfairly dissed .. Iam space deserves serious attention
from cosmologists and particle physicists.

Previously, i gave a brief history of Iam space but i need to rewrite it in plain English so that laypeople can have a chance of understanding.. It all started when i was studying electromagnetism (for
engineers) at Florida International in Miami. Something clicked in my mind and i was shown*
something difficult to appreciate. i could write the equation here but it's better if i write it out in
English: charge moment is impeded spin. i know, the word 'moment' throws you.. It refers to a
higher order 'something' relating to whatever you're talking about.. Moment typically refers to
inertia - as in moment of inertia. But moment can also refer to charge indicating a measure of it.
So again, a measure of charge is directly related to spin via impedance. This is actually
astounding if we ponder it .. This was the beginning of my path toward Iam space.
*shown as in divinely inspired

Continuing this line of investigation, i found that elementary particles can be modeled by dual
flux vortices and screw-dislocations in space (dual structures with two manifestations). Markus
Lazar has independently investigated this (more formally than i have). Once i had discovered his
work, i realized that anything i did would be ignored - if he also was ignored (since he is part of
conventional research). So i compiled the ideas in a booklet (available at Scribd) called N and
Omega. i did other things for a while..

While i was doing other things, i could not forget physics no matter how hard i tried.. It would
grab me in the shower.. It would wake me in the middle of the night.. At some point, i was
shown other things .. That: space was measurably distorted by elementary particles (you can
calculate the exact distortion). So mass behaves like self-confined energy.. Thinking like this
reveals many things.. It verifies Einstein's famous equation. It verifies the importance of special
relativity.. It forces you to try to understand 'what's really happening' with accelerated particles.
So NPA's auto-rejection of SR and Einstein is unfair.. It's misplaced..

There are no people i more greatly admire in science than Feynman and Riemann. They are like
'gods' (or angels with divine intellect) to me.. Their perspectives and contributions to science are
unrivalled. But.. i question Feynman's perspective and assumptions.. Because of all the 'holes'
and ad hoc methods in the Standard Model, i've been forced to search for more elegant 'solutions
to the problem' (of unification) .. The path above finally led me to Iam space and the centrality of
temporal curvature. Once i realized that 'time can store energy' (like space proposed above), the
most startling revelation of all hit me directly in the face: theoretically, you don't need a separate
force for 'nuclear glue', gravitation, theory for SR,.. All you really need is temporal curvature. If
indeed particles are 'localized time warps', SR is explained by enhancement of that via kinetic
energy; the kinetic energy in a particle exactly equals its relativistic energy; its relativistic energy can be looked at as an amplification of the temporal distortion. Why does 'time slow down' for
speeding craft? Because they're amplifying the temporal distortion in all their particles. Combine
these realizations with the idea that gravity is simply a 'far field' effect of all the particles' (that
make up the gravitating body) temporal distortions - and you get a comprehensive, simple, and
elegant theory.

So with all due respect to Timothy, his statement above is clearly incorrect. GR is not the "simplest relativistic metric theory of gravitation that can be conceived of." (A metric is a fixed measure allowing measurement within a space.) {R4, c, t0} is a metric space since c defines measure on space and t0 on time. (The first 'space' is a mathematical term and the second 'space'
is a physics term.) .. i'm waiting for his reply..

We still need to name this 'new branch of science' .. (No one has written me with any
suggestions.) Maybe .. it just occurred to me .. TR (temporal relativity)? ;)

i need a few encouraging prayers..

Simulating Iam space

(x, y, z, Zt, C) where x, y, z are real numbers representing Euclidean coordinates in space, t is non-negative real (a convenient convention), Z is typically complex representing media wave impedance (for open space, this reduces to Z0i, the impedance of free space times the imaginary unit), and C represents temporal curvature at (x, y, z). C typically takes on very minuscule numerical
values for elementary particles, is dimensionless, but can vary widely depending on local energy
density. Theoretically, it can even be negative. So as long as we keep clear which indices
correspond to spatial dimensions etc, we can reorganize the space for simplicity: R4xC (four real
dimensions plus one complex). This mathematical model is hardly unique but the assumption
that it corresponds to our physical universe is.

Electrons/protons can be thought of as: probability waves, particles, wave-particle dual structures, flux vortices, screw-dislocations in space, and temporal distortions. Considering tunneling and electron orbitals, perhaps it's best in this scenario to view them as 'electromagnetic wavelets'. This view accommodates them to Iam space nicely. So consider an electron/proton 'living' in Iam space = (x, y, z, C, Zt). With c = 1 and C = EtP/h, E represents local energy density, and tP and h are the Planck time and Planck constant. Here's where things get a bit sticky. t represents time, but as we know
time progresses dependent on local energy density (the higher E is generally, the slower t
progresses). So in order to keep things clear, we must designate t=t(E), t is a function of local
energy density. Iam space becomes: (x, y, z, t(E), C(E), Z) where both E and Z are evolving,
dependent on position and time in any particular cosmology. It sounds recursively maddening but is actually feasible to simulate .. Iam space is getting bigger.. The minimal set becomes R5xC.

*tx below needs correction to tx = t0/√(1 - 1.48*10^-27(m/r)) which is conventional gravitational time dilation, where m is the affecting mass and r is the distance from it.

If we designate t0 the time index for flat-space (no curvature), then t(E) has the form *tx = (1+C)t0.
The bold subscript emphasizes the fact time usually progresses differently dependent on location.
In a simulation, this would equate with local dependency on energy density - augmenting global
step size locally. If C is negative, corresponding to anti-gravity or perhaps anti-particles, local
step size decreases relative to global step size. So for simulation purposes, Iam space becomes
(x, y, z, Cx = Ex*tP/h, tx = (1 + Cx)t0, Zx) where again bold subscripts indicate location-specific data.
We still have not defined Ex, local energy density, yet. That may be approached in a cellular way.
With sufficient resolution, our universe may be modeled by a cube of uniform cells. Each cell
has six neighbors. Instantaneous energy content for each cell can/must be tabulated, including
self cell, so that instantaneous local averages can be calculated. So the energy content of seven
cells determines average local energy density. So let's tentatively define local energy density to
be: Sum(neighbor cells, self cell)/7. Iamsim becomes (x, Ex, Ena, tx = (1 + Ena*tP/h)t0, Zx) where we've
changed notation a bit: the first index is a vector indicating cell position in the 3D matrix, next is
cell energy content, next is neighborhood average, next is local step size, and finally last is local
media impedance. Iamsim relates to Iam space described above as 'minimal set' because Ena is
only there for computational convenience. The reason we need to account for local media
impedance is because waves/wavelets are typically impeded in physical movements (cellular
translation in this discussion); there are: boundary effects, group effects (coherence/interference),
and individual effects (momentum, spin, charge) relating to physical translation. Again for the
sake of clarity, impedance and associated translation 'machinery' equate with inclusively: laws of
optics, coherent phenomena (including lasing, superfluidity,..), interference phenomena (wave
cancellation, double-slit phenomena,..), and traditionally 'classic' phenomena (charge interaction,
mechanics,..). This may sound like a 'tall order' for the simulation but i assure: with some
simplifying assumptions, the requirements become computationally approachable.
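
To make the cellular scheme concrete, here is a minimal sketch in Python (using NumPy) of the neighborhood average and local step size described above. The periodic boundaries, grid size, and energy value are illustrative assumptions; the full translation machinery T is not attempted. Note the toy energy is exaggerated: realistic particle energies give curvatures too small to resolve in double-precision floats.

    import numpy as np

    T_P, H, T0 = 5.39125e-44, 6.62607e-34, 1.0   # Planck time, Planck constant, flat-space step

    def neighborhood_average(E):
        """Average energy over each cell and its six face neighbors (7 cells)."""
        total = E.copy()
        for axis in range(3):
            for shift in (+1, -1):
                total += np.roll(E, shift, axis=axis)   # periodic boundaries assumed
        return total / 7.0

    def local_step_size(E):
        """tx = (1 + C)*t0 with C = Ena*tP/h, per cell."""
        return (1.0 + neighborhood_average(E) * T_P / H) * T0

    # toy usage: a 16^3 grid with one (exaggerated) energetic cell
    E = np.zeros((16, 16, 16))
    E[8, 8, 8] = 1.0e6                  # J; unrealistically large, for visibility
    tx = local_step_size(E)
    print(tx.max(), tx.min())           # cells near the 'particle' tick slightly slower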

Each simulation run would correspond to a cosmological instance (one possible universe of
many). We're not particularly interested in local physics unless a singularity arises and 'crashes
the simulation'. These should not be avoided; rather, they should be studied to see what causes
them. A computational issue becomes cell size/number. We must provide sufficient resolution to
allow realistic containment/movement of elementary particles. We must experiment with
different particle distribution scenarios. Total number of particles is a 'good question'.. i suppose
it depends on what is computationally allowable presently. Ideally, there should be enough global
energy to form a neutron star but that's computationally unrealistic presently. There's probably an
optimal balance between cell size and total number of particles but we're concerned with realistic
cosmologies initially .. i suppose the real test of the paradigm will be when someday: we can
simulate star formation, life, and death in the scenario proposed above.

Re-reading this several times impels me to reiterate, temporal curvature can explain: mass,
gravitation, strong force, time dilation (both kinds), and relativistic effects. Impedance and
proper translation rules explain everything else. Creating this simulation faithfully will not only
be exciting and revealing, but 'tests the model' of Iam space. We're testing whether or not R5xC
is a good model for our universe. We're testing the importance of impedance and local curvature.
We're testing the fundamentalness of local temporal curvature.

One method for validating/correcting any translation rules is to ask: does hydrogen form of its own
accord?, what about excited states?, can we stimulate them?, do they properly return to ground
state?, what about excited states of helium?, are they properly modeled?, what about spin-orbit
interactions?, are they evinced?, what about fusion?, testing spontaneous fission?, and of course
testing various coherent/interference phenomena..

It's interesting to note very few parameters are required with Iam space (contrast that with the
Standard Model). If the premise is correct, only two 'fundamental constants' are required:
impedance and t0, the rate time passes in flat space. Everything else is encoded in the dynamics
of particle interaction (based on impedance and temporal curvature) and self-characteristics
(again relating to impedance and temporal curvature).

Theoretically, i'm not up to becoming a 'TC Feynman' but with simulations.. i believe i can
implement the simulation above given proper access to capable equipment.. This essay is also a
request for others interested in physics simulations to attempt their own.

Iamsim vs Iam space

Frequently, i try to take on the role of 'distant observer' in regard to my work on Iam space. i try
to view it as another interested person might. The purpose is to check my own reasoning and
thought processes for relevance, validity, consistency,.. so that i can try to check the model for
the same reasons. For engineering purposes, there are others who have proposed discrete space
(and for other reasons). i have deliberately stayed away from discrete space because i do not
personally believe space is discrete (cellular as in Iamsim). But obviously, it has practical
applications. In linear systems science, many of the physical models are differential-continuous
but we actually simulate them on computers which are discrete. So a large part of linear systems
theory is called discrete (step-wise vs smooth).

As a distant observer looking at Iamsim and its possible correspondence to reality, i would reiterate the above and examine the structure in detail. With appropriate 'translation machinery'
(how one particle/photon moves from one cell to another), assuming impedance is constant,
uniform, and isotropic (does not depend on direction of travel), then the 'minimal set' of Iam
space reduces to R4. The reason for this is that the impedance of space is purely resistive,
approximately 377 ohms. However, the fourth component is not time as we think of it. Time in
this context is 'local time' which progresses at different rates dependent on local energy density.
From the cellular approach, again with proper translation machinery, temporal curvature is
'encoded' in that rate. So formally, Iam space becomes {R4, Z0, t0, T} where the fourth
component of R4 has a very specific meaning as indicated above, Z0 is the impedance of free
space, t0 is the rate at which time passes in flat space (in our universe), and T represents the
(assumed enormously complex) translation machinery for a particle/photon to move from one
position to another. (For those who cannot stomach impedance, it equates with c, the rate photons travel in flat space and the limit rate for particles.)

*tx below needs correction to tx = t0/√(1 - 1.48*10^-27(m/r)) which is conventional gravitational time dilation, where m is the affecting mass and r is the distance from it.
Let's get a little more detailed to examine our assumptions:
Iam = {{x, y, z, tx : x, y, z ∈ R, *tx = (1 + Ena*tP/h)t0, Ena is neighborhood-average energy density, tP Planck time, h Planck's constant, and t0 is the 'flat space time rate'}, c, T defined later}
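
A minimal sketch in Python of the corrected local time rate in the note above (the function name is mine). For what it's worth, the coefficient 1.48*10^-27 m/kg equals 2G/c^2 in SI units, which is why this is conventional gravitational time dilation.

    import math

    def local_time_rate(m, r, t0=1.0):
        """tx = t0 / sqrt(1 - 1.48e-27*(m/r)); m in kg, r in m."""
        k = 1.48e-27        # ~ 2G/c^2, m per kg
        return t0 / math.sqrt(1.0 - k * m / r)

    # toy usage: at Earth's surface (m ~ 5.97e24 kg, r ~ 6.37e6 m)
    print(local_time_rate(5.97e24, 6.37e6))   # ~1 + 7e-10, a tiny slowdown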

The reasoning for using neighborhood average and not individual cell energy content is similar to
my reasoning for space: time is not discrete in our universe - we assume and perceive it
continuous. So naturally even in a cellular spatial approach, we assume temporal effects are
smoothed. This equates with a smoothed boundary for particles (not abrupt and discontinuous). If
indeed particles are Planck/other sized temporal distortions in the 'fabric of time', considering the
requirement for instance that those temporal distortions explain the strong force between nuclei,
'nuclear glue' becomes necessarily a boundary effect .. We see above that more than two
constants are required for our universe: two Planck constants, flat time rate, and speed of energy
propagation. Gravitation in this picture is not a boundary effect - it's more like a residual effect of
the temporal boundary. We're missing one critical component: particles.

Iam = {{x, y, z, tx : x, y, z ∈ R, tx = (1 + Ena*tP/h)t0, Ena, tP, h, t0, c}, T, p} where p is the cosmological particle set. We see that four constants are required for spacetime, two functions relating to local time, one (assumed) incredibly complex position translation structure, and an initial set of particles. Missing from this picture are the initial conditions: the initial energy distribution among particles, and the size and shape of Iam space. Many, many surfaces have been proposed for the shape of our universe.
Above assumes a kind of hyper-cube. For simulation purposes (and theoretical analysis), we
must choose something..

Iam = {{x, y, z, tx : x, y, z ∈ R, tx = (1 + Ena*tP/h)t0, Ena, tP, h, t0, c}, T, p, p0} where the last item
represents the initial energy/momentum configuration of all particles in each cosmological
instance. The 'only' thing to be determined is T, the particle location translation machinery. In a
sense, this corresponds to QFT+QED+QM largely developed by Feynman. But unfortunately,
these tools/constructs/set-of-assumptions/theoretical-base are not amenable to Iam space. For
example, Feynman diagrams are amazingly powerful heuristic tools for understanding particle
dynamics. But they're based on virtual particles. So we cannot use them directly; we must
develop Iam space analogs of Feynman diagrams.
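
As a structural sketch only, the full Iam tuple above might be carried in code like this (Python; every name is an illustrative assumption, and T is deliberately left as a trivial placeholder since it is the open problem):

    from dataclasses import dataclass, field
    from typing import Callable, List, Tuple

    Vec3 = Tuple[float, float, float]

    @dataclass
    class Particle:
        position: Vec3
        momentum: Vec3
        energy: float                  # J

    def identity_T(particles: List[Particle]) -> List[Particle]:
        """Placeholder translation machinery: returns the particle set unchanged."""
        return list(particles)

    @dataclass
    class IamConfig:
        tP: float = 5.39125e-44        # Planck time, s
        h:  float = 6.62607e-34        # Planck constant, J*s
        t0: float = 1.0                # flat-space time rate
        c:  float = 2.99792458e8       # propagation speed, m/s
        particles: List[Particle] = field(default_factory=list)  # p, whose initial state is p0
        T: Callable[[List[Particle]], List[Particle]] = identity_T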

Another approach is thinking of T as a linear transformation on x (a particular particle). But as stated previously, T must include: coherent, interference, and individual phenomena. As is
typical in any new branch of science, a shift of perspective and associated assumptions is really
all that was required.. i have confidence we can reformulate QFT+QED+QM for use in Iam
space .. 100 years was perhaps not lost after all..

Iam = TOE?

i wasted some time fretting about curvature .. 'correcting' my curvature function to conform to
conventional temporal curvature, but conceptually - i was still on target.. Again, gravitation /
strong force is mediated by temporal curvature as 'far field' (residual) and near field boundary
effect. So i really needn't have 'got my panties in a twist' about it.. About five years ago, i ran a
nuclear simulation based on that and electrostatic forces - it seemed to perform well. What i was
curious about at the time was beryllium-8 - why it's unstable .. Other re-modifications of the
theory don't point toward an absolute necessity of complex time. That construct allows non-
locality via Minkowski but i'm sensing, from a modeling perspective, non-locality is not required
(and therefore - complex time is not absolutely required). Non-locality explains self-interference
but it can be explained in other more simplistic ways. Since i'm now viewing particles as
electromagnetic-temporal wavelets, self-interference is not surprising. Wavelets require complex
number and function theory but that can be accommodated in T - not required in the subspace of
Iam representing spacetime. All this may be 'gobble-dee-gook' for many but it amounts to
partitioning Iam space into segments: those that require complex number/function theory and
those that do not.

Also, constructing Iamsim as a discrete space may be 'overkill'. As mentioned before, computer
simulations are necessarily discrete as they are implemented on discrete devices (computers).
And recall that i implied above, near field (nuclear) theory and far field (gravity) theory may be
modeled and simulated by a single function. The fact 'nuclear glue' and gravity are both attractive
forces suggests this. In previous essays, i was simply trying to keep things as simple as possible.
Assuming a single attractive function is not horrendous .. but requires some constants and
assumptions.. Electrostatic forces are modeled by another (complex) function..

i've studied nuclear engineering so i'm familiar with decay and interaction schemes..
Traditionally, there's a set of interaction probabilities associated with a particle's 'cross section' (a
kind of probability of hitting a barn with a shotgun blindfolded and spun around). In fact,
PQMers use the term 'barn' as the unit of interaction cross-section. But this is statistical analysis (which was one of my majors at university) - not physics. True physics is based on understanding underlying principles and valid constructs - not probability.

There was a time when students tried to understand intrinsic spin - trying to 'wrap their head
around the construct'.. Also, the c-frame.. They're both useful concepts to learn about
conventional physics.. But i question the relevance to reality .. If we approach physics / our
universe / reality balanced, heuristically, and with rabid attachment to Occam, we arrive at Iam
space and temporal curvature .. Admittedly, Iam space has 'transformed itself' conceptually
revamping itself to come closer to reality - but that's nothing more than strict adherence to
Occam and the scientific method .. Think of me as just the pencil and hand that writes about the
living Iam space that wants us to discover who She really is..

Forgive the analogy but sometimes it feels that way..


Others have written about Nature with a feminine character.. Our universe is like Earth:
receptive, harboring, nurturing,.. We could not exist without just the right conditions to produce
and cultivate us.. It's almost as if She's leading the way of discovery and Knowing her..

Enough about spirituality; i've written my share about that plenty .. At this moment, i'm more
concerned with cosmologists and particle physicists taking a serious look at Iam space rather
than their religious beliefs .. Several years ago, i spoke with a physicist in Michigan about some
simplistic beginning ideas relating to Iam space .. During the conversation, we determined (at
that stage of simplistic modeling) particles were self-interacting in 'my scheme' .. He was
repulsed by that and did not want to speak further about it .. But renormalization in physics is
exactly the conventional tool they use to 'get over' self-interaction.. How could he complain
about a weakness in my theory that convention already embraced? It was duplicitous..

At this stage in the theory, i don't see the self-interaction problems we saw those many years ago .. At one point in development/discovery, i was convinced the 'Iam framework' was nothing more
than a better way to look at particle physics .. So there was little advantage in trying to force
convention to accept it .. But that was before discovery of temporal curvature .. After that, i'm
convinced it's a better framework - a model closer to our reality..

This is a kind of tribute to one of my 'old best friends', Doug Sweeney, who stated "If it could have been done, it would have been done." (remarking about physics unification and conscious machines). But that seriously neglects the fact that transformations in science typically come from shifts in perspective. With the world economy in such tragic shape, with the Standard Model ready to fall flat on its face, with the predictions i made as a child coming true (that we're entering another ice-age), we're faced with a crucial dilemma: is sam crazy or right on target?

The way things look right now, i would not bet against me .. That's just a recommendation..

http://en.wikipedia.org/wiki/Beryllium-8#Beryllium-8
http://en.wikipedia.org/wiki/Wavelet
http://en.wikipedia.org/wiki/Barn_(unit)
http://en.wikipedia.org/wiki/Spin_(physics)
http://en.wikipedia.org/wiki/Inertial_frame_of_reference

Chemistry and life arise how?

The proper study of atomic orbital theory is rooted in nuclear structure. The geometric configuration of the nucleus, and therefore the electromagnetic field produced by it, determines the electron orbitals surrounding the nucleus. Of course, there are spin-orbit interactions which determine a fair amount of spectroscopy / orbital dynamics / chemistry. But the bulk of electron orbital structure (and therefore chemistry) is determined by nuclear structure.

It does sound strange to a chemist or biologist to hear, but i assure you - there really is no other
way to proceed starting from Iam space. So to study the chart of nuclides and how it arises is the
foundation of chemistry and biology .. Seriously, you can spend an entire lifetime formally
studying that chart. There are basically three types of regions in the chart: stable, marginally
stable, and highly unstable. Some nuclei are marginally stable and endure for millions of years.
Many are highly unstable and decay 'almost immediately' (in human time measuring schemes).
But there are some declared by nuclear chemistry to be unequivocally stable.

This concept, absolute unequivocal stability, is in direct conflict with Standard Model paradigms.
So chemists essentially ignore physicists' adherence to PQM. Chemists know better. For the
purposes of chemistry and life, chemists know that certain nuclei are stable and therefore good
candidates for 'building blocks of life'. If they were nuclearly unstable, chemistry and life could
not manifest in this universe; life requires nuclear stability; stability is in direct conflict with the
Standard Model.

This awareness was actually one of the driving factors for Iam space. How can two main
branches of science be at such odds? It's unfathomable and untenable. Chemistry relies on
quantum principles such as in quantum chemistry - the theory of electron orbitals as accepted by
convention. There are essentially deterministic alternatives; i will not list them here; if you're
truly curious, you will find them.

One of the things that caught my eye/mind in the chart is 'metastable states' .. They're exactly
what they sound like: something nuclearly stable - but not really.. Just like WIMPs in
experimental nuclear physics, metastable states are a growing research area. They're both 'real
science' to me (as opposed to theories/research that depend on virtual bosons/Higgs). So if you
want to do real science, study metastable states and WIMPs over bosons..

WIMPs are good candidates to reinforce the Iam framework. They are not forbidden and may
relate to electron/proton/neutrino masses.. However, we should not base the theory on
detection/non-detection (as the Standard Model did so foolishly on the Higgs - in that sense,
Higgs did me a great favor by inventing it). Iam space is based on the centrality of TC/TR
(temporal curvature / temporal relativity). This basically says nothing about particle schemes.
That's one of the 'great benefits' of Iam space (some would say a detraction, because if you don't
predict particle schemes, how can it be verified?). There is one essential unspoken axiom of
determinism: inherent stability. If there are inherently stable nuclei/particles, how do they arise?
What are the determining factors for stability? If we 'wave a magic wand' ad hoc and simply
declare some particles are stable and some are not - we're no better than PQMers..

This is where discrete space comes in. If space is indeed discrete, that implies a kind of
'containment' on elementary particles. Space determines energy content, energy content
determines properties, properties determine interactions, interactions determine chemistry, and
chemistry determines life. So if indeed space is discrete, that determines whether or not life can
exist here.

The alternative to discrete space is more intuitive: we live in a continuous world/universe. It's
what we assume and perceive.. We don't perceive time as jumping from one moment to the next;
we perceive it continuously. Continuous time and space have been unspoken assumptions from
the beginnings of science. Only those fixated on 'watchmaking' are concerned with discrete time
(watches and computers 'tick' - not time - as we perceive it). If spacetime is continuous, then we
must necessarily define some structure-creating paradigms. Structure must arise 'naturally' from
Iam space - if indeed Iam space is a good model of our universe.

So at this point, we're at a conceptual crossroads: is Iam space fundamentally discrete or
continuous? Which scheme makes the model more consistent/elegant? (We are, after all,
'elegance hunters', are we not?) Even though the concept grates against me like an abrasive bozo
in a bar, i must concede discreteness seems to be more consistent with Iam concepts.. It's
conceptually repulsive because we're raised to think continuously. But it may be preferable.

So we have a chance to do 'real science' yet again: when we determine the true structure of Iam
space, we discover the properties of our universe. Who will miss this opportunity to participate
and who will not? That's the question presently..

Will you?

Information loss associated with black holes

Still waiting for Timothy to get back to me.. Perhaps he never will. Found some interesting
sources that parallel recent developments:
http://en.wikipedia.org/wiki/Quantum_spacetime
http://en.wikipedia.org/wiki/Quantum_gravity
http://en.wikipedia.org/wiki/Loop_quantum_gravity
http://en.wikipedia.org/wiki/Discrete_Lorentzian_quantum_gravity
These are conventional research parallels to what i've been pursuing recently..

Still rabidly adhering to Occam.. Believe that will be a 'saving grace'.. Making any sophisticated
assumptions about spacetime goes against this. Whatever Iam space turns out to be, it should be:
simple, elegant, and reflect reality. Have perhaps found some more circumstantial evidence we're
living inside a giant simulation.. Let's examine.

In a physics simulation, never can we designate exact values. This translates to location,
momentum, and temporal approximations relating to simulation step size / location precision.
'Double precision' is usually the best we can do. Error in simulation has been analyzed
theoretically so i don't need to reexamine that here. Back to simulating Iam. Experimentally, we
will never be able to resolve detail to the Planck-length or Planck-time. Basically it's physically
impossible.

But suppose we are living inside a 'giant simulation'. Suppose the entities running the simulation
are not limited to double-precision values. Whatever the limit of precision is, it's finite. Let's
suppose the limit of precision in length is the Planck-length and the limit of precision in time is
Planck-time. This equates with global simulation step size being Planck-time and space-precision
limited by Planck-length.

How many particles are in our universe? What is the minimum information required to simulate
them? Let's estimate the total number of particles in our universe as about 8 × (1.878×10^81) based
on the total mass of the universe, estimated proton equivalent, and multiplying that by 8 for some
WIMP, proton, electron, neutrino, and corresponding anti-particles. Multiplying that by the
magnitude of Planck-time and length gives 1.3×10^162 bits of information at any one instant.
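
A minimal Python sketch of that arithmetic, reading 'magnitude' as the decimal exponents of the
Planck scales (that reading is an assumption; it lands within an order of magnitude of the figure above):

```python
# Reproducing the essay's heuristic. Assumed reading: 'magnitude' of the Planck
# scales means their decimal exponents (~35 for length, ~44 for time).
N_PROTON_EQUIV = 1.878e81     # estimated proton-equivalent particle count
SPECIES_FACTOR = 8            # WIMP, proton, electron, neutrino + anti-particles

n_particles = SPECIES_FACTOR * N_PROTON_EQUIV     # ~1.5e82 particles

PLANCK_DIGITS = 35 + 44       # Planck length ~1.6e-35 m, Planck time ~5.4e-44 s
bits_per_instant = n_particles * 10**PLANCK_DIGITS
print(f"{bits_per_instant:.1e} bits")             # ~1.5e161, same ballpark as above
```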

.. Some years ago i investigated information theory. It's abstract (beyond normal math
abstraction) and 'difficult' to comprehend. That's an understatement in any terms. Kind of like
trying to understand Gödel's incompleteness theorem. My estimate above is purely conservative.
Likely it's much more than that. But imagine a 'state machine' that transitions with time step T
from initial conditions p0. That simulator would be required to have a minimum of 10^162 bits to
represent all particle locations at any one instant. Sounds incredible, but since it's finite, it's
possible.

Now we see why information theory is related to 'black hole' theory.. Is information conserved?
With singularities, likely not. When a mass is absorbed into a singularity, all information about
the mass is lost. The basis of singularity / black hole theory is that there must be a limit to
'nuclear tension' - when a neutron star collapses with too much mass.. But this equates with an
assumption about nuclear 'repulsion'; there's a limit. i never assumed this in any of my versions
of Iam space. To me, a 'singularity' in space is merely a neutron star with an event horizon. There
is no physical evidence black holes exist in terms of 'a different form of matter'. The event
horizon of a black hole is where even light cannot escape - where the escape velocity exceeds
the speed of light. But that does not imply, by itself, that the structure inside a black hole is any
different than that of a neutron star. Black holes may simply be neutron stars with event horizons.
There may not be a collapse of nuclear material.
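
For concreteness: the horizon radius follows from setting escape velocity equal to c, the
Schwarzschild radius. A minimal sketch (standard textbook values, nothing Iam-specific):

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

def schwarzschild_radius(mass_kg: float) -> float:
    """Radius where escape velocity reaches c: r_s = 2GM/c^2."""
    return 2.0 * G * mass_kg / C**2

# A ~2 solar-mass neutron star: r_s ~ 5.9 km. If its physical radius (~10-12 km)
# ever fell inside r_s, light could no longer escape - an event horizon with or
# without any further collapse of the nuclear material.
print(schwarzschild_radius(2.0 * M_SUN) / 1000.0, "km")
```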

Regardless of the structure of black holes, we need to determine the information loss regarding
masses falling into them. This relates to total universal information content and how it evolves.
Again, regardless of black hole structure, information is lost every time a mass is consumed by
one. So if 10^162 bits of information is required at any one instant, there is an information loss
associated with the total number of black holes and average mass density surrounding them. Just
from this heuristic perspective, we see black holes determine information loss in the universe.

If I represents total universal information content, B represents the number of black holes /
neutron stars, and r_Il represents the average rate of information loss associated with black holes
and neutron stars, then the information content of the universe at any one instant is: I - B·r_Il.
This may be a way to 'test the theory'. If we can measure/estimate the three parameters, we may
be able to validate/invalidate the theory.

The initial information content of the universe we can designate I0. So verily, I = I0 - ∫B·r_Il dt at
any one instant. Combining calculus and information theory is intricate but possible. Assuming B
is relatively constant throughout the life of any one particular cosmos, the function representing
instantaneous total information content becomes I = I0 - B∫r_Il dt. Again, if we can
measure/estimate these values (I0, B, r_Il - and I itself for comparison), we can test the validity
of the theory.

Ball-park estimation of above is: I = 10^162 - B∫r_Il dt, where I is the current information content
of the universe, B is the total number of black holes and neutron stars, and r_Il is the average rate
of information loss associated with black holes / neutron stars .. 'Co-conspirators' at NPA have
requested positive predictions (as opposed to negative predictions such as no Higgs) from 'my
theory'. This is the 'best i can do' at the moment..
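
A toy numerical version of that bookkeeping - every input below is a placeholder, since the
essay's point is precisely that these parameters still need estimating:

```python
import numpy as np

# Toy bookkeeping for I(t) = I0 - B * integral(r_Il dt). Every input is a
# hypothetical placeholder, not a measured value.
I0 = 1e162                              # assumed initial information content, bits
B = 1e19                                # hypothetical black hole / neutron star count
t = np.linspace(0.0, 1.38e10, 1000)     # years since t0 (~age of the universe)
r_Il = np.full_like(t, 1e30)            # hypothetical mean loss rate, bits/object/year

# Trapezoidal integral of the (here constant) loss rate:
cum_loss = B * np.concatenate(([0.0], np.cumsum(0.5 * (r_Il[1:] + r_Il[:-1]) * np.diff(t))))
I = I0 - cum_loss
print(f"I(now) ~ {I[-1]:.3e} bits")     # total loss ~1.4e59 bits: negligible vs 1e162
```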

http://en.wikipedia.org/wiki/Observable_universe
http://en.wikipedia.org/wiki/Information_theory

Frame dragging, a test of GR and TR

http://en.wikipedia.org/wiki/Frame-dragging
http://en.wikipedia.org/wiki/Gravity_Probe_B
http://en.wikipedia.org/wiki/LARES_(satellite)

If general relativity is correct, frame dragging will be detected by LARES. It was not detected by
Gravity Probe B because there was too much noise in the data; in that case, noise overwhelmed
any 'frame dragging signal'. Frame dragging is predicted by GR because GR models gravity as
curved space and because the direction of spin of massive bodies matters: GR says massive
spinning bodies will cause a measurable 'twist' in space near the body.

Some time ago, i had no problem with this because an intermediate model of particles i had
developed was: dual flux-vortex / screw-dislocation structures (in space). Screw dislocations are
essentially 'twists in space' - a possible model of particles, if indeed twisting space is possible.

But the model above was only an intermediate step in Iam space.. The modeling process did not
end there. Familiar readers know a concept called temporal curvature was developed/discovered
which unified special relativity, mass, gravitation, strong nuclear force, and both kinds of time
dilation. Yes, relativistic time dilation is part of SR but special mention is needed here to
emphasize the explanatory power of temporal curvature. Conceptually, TC in the near field is
the 'stuff' which holds nuclei together (what convention calls gluons). TC in the far field is the
residual which equates with gravity. TC is mass because, if the model's correct, the energy in
mass is energy in 'the fabric of time'. TC explains SR because a particle's kinetic energy is its
relativistic energy - which, in this model, is amplified TC. And of course, if the above is true, TC
explains time dilation because time dilation essentially is what TC is.

How did i arrive at TC? By considering the 'expansion of space' (again, an intermediate model of
elementary particles) which is caused by energy in the particle. i had proposed space is only very
slightly elastic - that it takes tremendous pressure to distort it - but that the distortion is
calculable/measurable. The logic is clear but totally depends on the 'fact' space is an elastic
medium. The final result, TC, may be correct, but the intermediate model may not be.

The intermediate model is a particle version of GR and supposes elementary particles are
miniscule twists in space - a very interesting proposal .. but possibly incorrect.

The reason i 'take back' the intermediate step now is because: if temporal curvature is truly
fundamental, if it's the true cause of gravitation, strong force, mass, and SR effects, it operates in
'flat space' and the rest is not required anymore (curved space is no longer needed to explain
gravity or anything else). Curved space would be 'overkill' and would deny the centrality of
temporal curvature. One of the final versions of Iam space includes Euclidean space, which is
totally flat and most certainly not elastic. Euclidean space has no curvature; it corresponds to
R^3 in mathematics.

Drum roll please. So temporal relativity (TR) - the name i gave this 'new branch of physics' -
makes a very specific prediction: no frame dragging. Here we have a definitive test
between GR and TR. GR predicts frame-dragging; TR does not. If LARES unequivocally detects
frame-dragging, i must 'shut up' about TC and walk away, 'tail between legs' as a humiliated dog
might. On the other hand, if LARES unequivocally does NOT detect frame-dragging, that will be
a 'feather in the cap' for me and TR.

Anyone wanna take any bets?

More or less evidence for Lense-Thirring

If you perform an arXiv search on Lense-Thirring, you get about 200 papers directly related to
the subject. Lense-Thirring is the GR concept purporting a 'twist in space' near massive spinning
objects. Contrary to the claims of Gravity Probe B staff, due to noise in the data, a clear signal
confirming the phenomenon has yet to be found. Many related experiments have been proposed
and some are being financially supported, notably the Juno mission. Unfortunately for GR, that
particular mission has been delayed due to NASA budget restrictions. If the probe ever gets to
Jupiter, is not destroyed by mishap (knock on wood), and successfully performs an orbital
injection braking burn, we may obtain definitive data supporting/rejecting GR/TR. For the
uninitiated, TR stands for temporal relativity, a competing theoretical framework wrt GR that
exclusively depends on temporal curvature.

i've looked at six of the arXiv papers, which range in topic. They indicate convention seriously
leans toward accepting GR over any other competing theory of gravitation. This is good - to take
a stand. i've always despised individuals who 'ride the fence' in any way.. Better to be wrong and
make progress than never to take a stand and eternally wallow in indecision.. So in this respect i
support convention's investigation into GR .. However, as implied above and covered in other
essays, i do not support GR directly: it sidesteps Occam's Razor.

As with the Standard Model and virtual exchange, GR is not the simplest theory which explains
reality. i firmly believe we need to take a good hard look at the current assumptions of science
and decide whether current investigations are actually worth the resources allocated. In my
meager estimation, both the assumption frames and several experimental investigation themes
are seriously questionable - purely from the standpoint of the scientific method and Occam's
Razor.

If we religiously adhere to Occam, we're forced to construct alternative theoretical frameworks
relative to the SM and GR both. The basic premise of the scientific method is observation and
hypothesis, a recursive relationship: we observe, we induce, we observe, we refine.. This is the
essence of the scientific method. But unfortunately, the SM started from a flawed original
premise. Elementary particles are not probability waves; there is nothing virtual about reality.
GR makes an analogous erroneous assumption: that space is elastic. This is not the simplest
explanation of gravitation as i've written in other essays. Particles may be viewed as
electromagnetic wavelets and so by their very nature (study wavelet theory) are uncertain. So if
wavelets are a good model of elementary particles, we don't need non-locality and complex time
and any other construct convention seems to prefer to embrace. Similarly, if GR is 'overkill' in
terms of modeling gravitation, we don't need elastic space as part of the model. Temporal
curvature is sufficient and minimal to explain gravitation - it also (by definition) rejects frame-
dragging / Lense-Thirring. So we have definitive tests between competing theories.
Unfortunately for particle physics, i have not devised a conclusive test between the SM and my
more simplistic models of reality. An intermediate model which impelled me toward TR is what
i call 'GR applied to elementary particles' (they're dual flux vortices and mini-screw dislocations
in this model). And via Occam, i've rejected that in favor of TR.

So in a sense, if Lense-Thirring is not found, that's also evidence for the TR model of elementary
particles (they're dual electromagnetic wavelets coupled with temporal distortions). The Lense-
Thirring effect seems to be the core/decisive test between competing theories.

If it exists and is real, we may have to take a step back in the modeling process. We may be
forced to embrace 'GR as applied to elementary particles' as briefly described above. This is
certainly preferable to the house of cards currently evinced by the SM.

We've covered a lot in this brief essay but identify Lense-Thirring / frame-dragging as a critical
definitive test between competing theories: SM vs TR-e.p. and GR vs TR. We've also identified
an intermediate alternative to the SM that, if frame-dragging is unequivocally detected, allows
convention to move toward a more realistic framework following the scientific method and
Occam.

Juno, the angular momentum of Jupiter and the Lense-Thirring effect, Lorenzo Iorio
The Shape of an Accretion Disc in a Misaligned Black Hole Binary, Rebecca G. Martin, J. E.
Pringle and Christopher A. Tout
Evidence for GR rotational frame-dragging in the light from the Sgr A* supermassive black hole,
B. Aschenbach
Recent Attempts to Measure the General Relativistic Lense-Thirring Effect with Natural and
Artificial Bodies in the Solar System, Lorenzo Iorio
About the Lense-Thirring and Thirring Effects, Angelo Loinger and Tiziana Marsico
Phenomenology of the Lense-Thirring effect in the Solar System, Lorenzo Iorio, Herbert I. M.
Lichtenegger, Matteo Luca Ruggiero, Christian Corda

http://en.wikipedia.org/wiki/Juno_(spacecraft)
http://arxiv.org/find/all/1/OR+au:Lense_Thirring+all:+EXACT+Lense_Thirring/0/1/0/all/0/1
http://en.wikipedia.org/wiki/Inductive_reasoning
http://en.wikipedia.org/wiki/Scientific_method
http://en.wikipedia.org/wiki/Occam's_Razor

Exploring planetary structures via Lense-Thirring and more

If general relativity is correct, one of its implications, Lense-Thirring / frame dragging, could be
used to map planetary interiors. The Lense-Thirring effect is basically a twist in space near
massive bodies such as planets, stars, and black holes. If proven, it may also have elementary
particle implications.. An approved NASA mission called Juno will be launched soon and, when
it arrives at Jupiter, one of the mission experiments will map its gravitational field.

The reason Lense-Thirring works is spherical asymmetry. A perfect sphere is spherically
symmetric. Earth (for example) is not. Earth's shape is basically an oblate spheroid (flattened
sphere). The 'opposite' of that is a prolate spheroid (cigar-shaped sphere) .. The theory of
Lense-Thirring predicts an asymmetric gravitational field determined by the differential twist -
further determined by mass-spin distribution. There is more mass spinning / distorting space
around the equator, so the effect should be greatest there. Near the poles, there is very little mass
moving / distorting space - so twist is minimal there. Since space is continuous (like drawing a
line with a pencil without lifting it), any twist in one portion affects nearby portions. So even
though polar effects are smaller than equatorial effects (twisting forces (torque) on space are
uneven), since space is continuous, Lense-Thirring effects are smoothed out around a planetary
body..

Again, the fact Lense-Thirring effects are spherically asymmetric in very exact ways allows us to
confirm/deny the effect - and also - use it to map planetary interiors. Very exciting stuff. It's
almost like using radar (or some other wave) to penetrate deep into a planet's interior to 'see the
layers'. Essentially, every planet has some kind of internal structure - like an onion but with
different kinds of layers and thicknesses. Each layer has its own thickness and density. That
causes a different 'signature' (which can be detected statistically) using Lense-Thirring theory.
Lense-Thirring theory is the 'crystal ball' we use to peer into planetary interiors.

But it has nothing to do with magic - the theory makes very specific exact predictions about
gravitational fields .. Unspoken in most of the literature are two assumptions about space: it must
be somewhat elastic for L-T theory to work and somehow - mass is 'coupled' with space so that it
can exert torque. Very little attention is devoted to these two assumptions. More on that later..

Back to mapping planetary interiors.. Jupiter (for instance) may have around 5 to 10 layers
internally (an educated guess). Each layer/shell has its own thickness and density. Each layer
produces a unique 'L-T signature' detectable by Juno. Those signatures may, depending on the
true set of layers, interfere with each other (constructively or destructively). So we must
necessarily use statistics to determine the likeliest layer scenario that fits the data best once we
get it from Juno. This is an example of how Lense-Thirring may be used to determine internal
structure.
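
A minimal forward-model sketch of that procedure, assuming simple uniform shells: compute the
body's angular momentum from its layers, then the weak-field frame-dragging rate
Omega_LT = 2GJ/(c^2 r^3). The layer radii and densities below are invented placeholders, not
fitted Jupiter values:

```python
import math

G = 6.674e-11   # m^3 kg^-1 s^-2
C = 2.998e8     # m/s

def shell_moment(r_in: float, r_out: float, density: float) -> float:
    """Moment of inertia of a uniform spherical shell: (8*pi/15)*rho*(r_out^5 - r_in^5)."""
    return (8.0 * math.pi / 15.0) * density * (r_out**5 - r_in**5)

def lense_thirring_rate(layers, omega_spin: float, r: float) -> float:
    """Weak-field equatorial frame-dragging rate Omega_LT = 2GJ/(c^2 r^3)
    for a layered body; layers = [(r_in, r_out, density), ...]."""
    J = omega_spin * sum(shell_moment(a, b, rho) for a, b, rho in layers)
    return 2.0 * G * J / (C**2 * r**3)

# Hypothetical three-layer 'Jupiter' (radii in m, densities in kg/m^3 - guesses):
jupiter = [(0.0, 1.0e7, 2.0e4), (1.0e7, 5.0e7, 2.5e3), (5.0e7, 7.0e7, 3.0e2)]
spin = 2.0 * math.pi / (9.925 * 3600.0)    # Jupiter's ~9.9 h rotation period
print(lense_thirring_rate(jupiter, spin, 8.0e7), "rad/s at r = 8e7 m")
```

Different layer scenarios give different J and hence measurably different signatures - that is the
statistical fit described above.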

Okay, back to our assumptions and implications. If GR is correct, L-T is assumed correct. If we
detect the effect to a certain level of confidence, we assume it's a fact. But as stated above, we
have two associated assumptions that are not discussed much: elastic space and matter coupling
with space. The effect cannot manifest without both. Space cannot be twisted unless it's
somewhat elastic. Matter cannot twist space unless it's somehow 'connected' to it. These
unspoken assumptions associated with GR are actually applicable to elementary particles.
During the course of my discovery of TR (temporal relativity), i developed an intermediate
model of elementary particles that are dual structures: electromagnetic flux vortices and screw-
dislocations in space. The math describing both has some intriguing parallels. Markus Lazar
has looked into this. If TR is incorrect and GR is correct, elementary particles must have some
relationship to GR .. Quantum gravity is the 'SM approach to gravity', but rarely has a
conventional physicist tried the other way: to go from GR to particle physics. What we
measure as 'spin' may simply be our observations of screw-dislocations in space; elementary
particles may 'simply be' very small 'twists in space'. This is somewhat new.. Couple that
perspective with another: wavelet theory, and you've basically reconstituted the Standard Model
without all the mumbo-jumbo.

(Wavelet theory has 'built in' uncertainty. In that sense, we don't need any of the SM constructs
that produce uncertainty: quantum foam, complex time, non-locality,.. If indeed elementary
particles are electromagnetic-wavelet / screw-dislocation structures, we don't need to add
uncertainty to the model structure.)
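
That 'built in' uncertainty is checkable numerically: for any wavelet, the RMS time width and
RMS bandwidth obey a Gabor/Heisenberg-like floor, with a Gaussian envelope saturating it. A
minimal sketch (standard signal processing, independent of any Iam claim):

```python
import numpy as np

t = np.linspace(-40.0, 40.0, 2**16)
dt_s = t[1] - t[0]
g = np.exp(-t**2 / 2.0) * np.exp(1j * 5.0 * t)    # Morlet-style wavelet

p_t = np.abs(g)**2
p_t /= p_t.sum() * dt_s
sigma_t = np.sqrt(np.sum(t**2 * p_t) * dt_s)      # RMS time width

G = np.fft.fftshift(np.fft.fft(g))
w = 2.0 * np.pi * np.fft.fftshift(np.fft.fftfreq(t.size, d=dt_s))
dw_s = w[1] - w[0]
p_w = np.abs(G)**2
p_w /= p_w.sum() * dw_s
w0 = np.sum(w * p_w) * dw_s                       # recovers the carrier, ~5
sigma_w = np.sqrt(np.sum((w - w0)**2 * p_w) * dw_s)

print(sigma_t * sigma_w)   # ~0.5: the Gabor/Heisenberg-like floor
```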

The model above also 'solves'/addresses one of the unspoken assumptions: coupling. If matter is
indeed little twists of space, we don't need to ask the question of how space and matter are
connected because they are one and the same (matter is distorted space-lets). Coined a word?
Okay, elementary particles are spacelets. We could even call this ST, spacelet theory. :)

During my path to Iam space i was Sure i was onto something fundamental .. TR seemed the
inevitable destination.. But if GR is correct (over TR), then we must back-pedal a bit. ST seems
the default theory if TR is incorrect .. Isn't this preferable to 10 dimensions (associated with
string theory)?

Inertia and proton/electron mass ratio

There are two kinds of conventional physicists: those that have open minds about the physics of
our universe and those that dogmatically adhere to probability theory. There are many of both
kinds.. So it's unfair of me to accuse the entire physics community of the latter.. This essay is an
appeal to the open-minded kind and the general public who also lean that way.. The other kind
might as well just skip to another essay/story..

i just wrote a letter to NPA, the Natural Philosophy Alliance. It was about one page long. It was
about inertia. It was also about the structure of space/time. If space/time has structure, we may
never know because the size of that structure may be beyond anything we can ever measure or
detect. That may be an unfortunate reality. Of course, that does not prohibit us from trying to guess
the structure or determine which structure best fits observations. Unfortunately, there's a large
group of mathematical physicists who are hell-bent on determining that structure, even at the
price of our sanity. What i mean by that is if they determine the structure of space, if that model
is essentially incorrect but fits Standard Model assumptions, then we become a delusional society
looking at the universe in distorted ways based on our incorrect philosophy of science and nature.
The basis of the Standard Model is that physics is random: forces are based on virtual particle
exchange. Taken to the extreme, even spacetime is filled with random virtual particles. This is
the 'dogmatic view' mentioned above. But this rabid adherence to randomness is historically
based on the Copenhagen perspective which was philosophically anti-deterministic. What that
means is: in the history of physics, there was a group that could not stomach determinism
(because it violated their beliefs about freedom), so they devised a physics that supported their
philosophy. That physics is what i call probabilistic quantum mechanics. PQM and the
philosophy behind it have dominated physics for nearly a century.

i agree that freedom is paramount but we do not need to devise a quantum theory of physics
based on freedom just out of human insecurity. Freedom can be achieved in other more
sophisticated ways.. For instance, chaos theory is the branch of math that investigates seemingly
random behavior. Interestingly, the 'governing equations' for chaos are fully deterministic. So some
level of randomness/freedom can be achieved deterministically. This is actually an astounding
mathematical truth..
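
A minimal sketch of exactly that truth: the logistic map is one line of deterministic arithmetic,
yet two nearly identical starting points diverge completely - 'randomness' from determinism:

```python
# The logistic map x -> r*x*(1-x) at r = 4: fully deterministic, yet chaotic.
def logistic_orbit(x0: float, r: float = 4.0, n: int = 50) -> list:
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.20000000)
b = logistic_orbit(0.20000001)   # starting point perturbed by 1e-8
print(abs(a[-1] - b[-1]))        # order-1 separation after 50 steps:
                                 # sensitive dependence on initial conditions
```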

Another favorite area of mine is turbulence. Turbulence is 'kind of' the opposite of order.
Turbulence is the natural phenomenon which happens when fluid flow exceeds a certain critical
speed (a critical Reynolds number) depending on the fluid and geometry. Laminar (smooth) flow
is the contrasting kind. So we have another
example in nature where randomness can occur completely by itself. We don't really need to
'build in' randomness into the universe out of human insecurity..

There's a relatively new area of engineering called wavelet theory. Surprisingly, there are some
limits in 'perfection' which resemble Heisenberg uncertainty. The real question becomes: does
engineering mimic nature or the other way around? ;) Or have engineers simply discovered some
properties of nature that physicists have purported for decades? This sounds circular and i could
stop right there but engineering has some valuable insights which need expounding.

One thing physics neglects is media impedance - even the impedance of space is ignored and
considered trivial. 'Strangely', it was the impedance of space which impelled me on the path of
discovery about 'alternative views' of nature. So can it be so invalid as physics claims? One of
my first discoveries was that the impedance of space relates charge to spin. i sent the result to a
journal and they said "interesting" but would not publish.. i kept at it .. Over the years i've
developed 'engineering' models of elementary particles which are more realistic and respect their
dual nature. Dual in that they exhibit electromagnetic character and 'mass' character both. Mass is
in quotes because we all know about Einstein's famous equation of mass-energy equivalence. So
we all know that mass and energy are 'interchangeable'. This concept, and the fact elementary
particles have electromagnetic attributes, forced me to try to find some way to 'unite' these two
features.. Why do electrons and protons exhibit these dual characteristics? Could it be because
space has two features which allow them? This is the simplest explanation and i pursued it
doggedly. i reasoned that the only possible way e.p.s could exhibit dual character is because
space allows them. Space must have two unspoken qualities which allow electromagnetic
behavior and 'mass' behavior. 'Amazingly', all we need to do is look toward engineering..
Impedance and elasticity are the two concepts which designate 'what we need' to understand
elementary particles.. Many physicists scoff and walk away at this point in the discussion
because it smacks of determinism and the aether, both of which they rejected years ago.
The 'bizarre' thing is: when we pursue this path to its eventual conceptual end, we arrive at an
elegant theory of spacetime and elementary particles including general relativity / gravitation. So
could it be so wrong? As mentioned above, the main reason physicists cannot stomach this
theory is because it relates to concepts they've rejected as wrong .. In my most recent letter to
NPA, one of the things i unequivocally state is that we need to be able to discuss things
conceptually. If we cannot do that, it indicates we really don't understand. A conceptual
discussion forces you to imagine/visualize concepts. It forces you to relate ideas at a higher level
so that others can understand. Without conceptual understanding, we have no hope of
understanding our universe.
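
For reference, the impedance of space mentioned above is a fixed property of the vacuum in
classical electromagnetics - a minimal sketch computing it (only the standard quantity; the
claimed charge-spin relation is not reproduced here):

```python
import math

MU_0 = 4.0e-7 * math.pi         # vacuum permeability, H/m
EPS_0 = 8.8541878128e-12        # vacuum permittivity, F/m

Z0 = math.sqrt(MU_0 / EPS_0)        # impedance of free space, ~376.73 ohm
c = 1.0 / math.sqrt(MU_0 * EPS_0)   # wave speed follows from the same two constants
print(f"Z0 = {Z0:.2f} ohm, c = {c:.3e} m/s")
```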

After years of study, i surmise the Standard Model is based on virtual exchange. That is the
central concept/assumption of the SM. Alternatively, the theory i've developed/discovered is
called temporal relativity, TR for short. TR is based on one critical assumption: time can store
energy like space. With that one assumption, you can explain quite a few things in nature.. Of
course, i couple that assumption with electromagnetic theory to be comprehensive. But
essentially TR is the simplest theory which explains conceptually: gravitation, strong force,
special relativity, and mass. If it's wrong, we can 'fall back' to a more complex model which is
the elementary particle version of general relativity.

Finally, we can discuss inertia and proton/electron mass ratio. Again, this will be a conceptual
discussion. What causes inertia? What causes the proton/electron mass ratio to be the value we
measure? From the classical perspective, inertia is mass resistance to acceleration. But that says
nothing about what Causes inertia. Is it media impedance? Or perhaps media elasticity? If it was
media impedance, larger objects would be impeded more than smaller objects of the same mass ..
Let me copy-paste one critical part of my letter to NPA:
1. inelastic space/time = 0 energy propagation rate = no particles = dead universe
2. infinitely elastic space/time = infinite e.p.r. = everything happens all at once = dead universe
3. finitely elastic space/time = finite e.p.r. = finite set of e.p.s = possibility for life
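
The triad above is essentially the elastic-medium wave relation v = sqrt(E/rho) read three ways.
A minimal sketch of the analogy (E and rho here stand in for whatever the corresponding
space/time properties turn out to be):

```python
import math

def propagation_rate(elastic_modulus: float, density: float) -> float:
    """Wave speed in a simple elastic medium: v = sqrt(E/rho).
    (A textbook result for longitudinal waves in an elastic rod - used here
    only as an analogy for the space/time 'medium' in the list above.)"""
    return math.sqrt(elastic_modulus / density)

RHO = 1.0                                  # placeholder 'density' of the medium
print(propagation_rate(0.0, RHO))          # 1. inelastic -> v = 0 (dead universe)
print(propagation_rate(math.inf, RHO))     # 2. infinitely elastic -> v = inf
print(propagation_rate(1.0, RHO))          # 3. finitely elastic -> finite v
```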

The reason i put a slash between space and time is because we don't really need spacetime to be
elastic to explain nature. It could be one or the other. That's a simpler (and therefore preferred)
model. So i propose that elasticity is the core feature of our universe that determines whether or
not life can exist here. Now how that relates to proton/electron ratio is something for you to
determine.. There's a Nobel prize waiting for you! Go get it! ;)

(The Poynting vector in engineering indicates power flow in an electromagnetic field. Engineers
view photons as 'self propagating transverse electromagnetic waves' that are polarizable in two
ways: circular and linear. So photons 'self propagate' somehow.. Almost no one addresses how
they do this. Several alternative theorists propose energy propagates because it changes form:
from electromagnetic to mass-equivalent. If TR is correct, mass-equivalent is a temporal
distortion or curvature in time, so photons move because they change form between
electromagnetic energy and temporal energy. Alternatively, if GR is correct, mass-equivalent
is really little 'twists in space' (i call them spacelets), so photons move because their energy
transforms between electromagnetic field and spacelet. (The particle version of GR i call spacelet
theory, ST.) But again, convention dismisses all this because it's not formulated in the language
of QFT (quantum field theory) and because of the contradictory assumptions listed above. The
reason TR/ST cannot be formulated in the language of QFT is because that theory is based on
virtual exchange which is in direct opposition to the primary assumptions of TR/ST. A new
language must be developed analogous to QFT which is another challenge for theorists
embracing TR/ST. So there is much work to be done..)
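
A minimal sketch of that power-flow statement for a plane wave in vacuum, where the
time-averaged flux is |S| = E_rms^2 / Z0 (the solar-constant figure is only an illustration):

```python
import math

Z0 = 376.73    # impedance of free space, ohm (see the earlier sketch)

def poynting_magnitude(e_rms_v_per_m: float) -> float:
    """Time-averaged power flux of a plane wave in vacuum, W/m^2: <S> = E_rms^2 / Z0."""
    return e_rms_v_per_m**2 / Z0

# Illustration: the solar constant (~1361 W/m^2) corresponds to an RMS
# field of about 716 V/m:
print(math.sqrt(1361.0 * Z0))        # ~716 V/m
print(poynting_magnitude(716.0))     # ~1361 W/m^2 recovered
```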

The basic point of this essay is: if we truly understood our universe conceptually, we'd be able to
easily explain inertia and proton/electron mass ratio. The fact i can somewhat explain inertia
indicates this theoretical direction is preferable to SM/QED/QFT. Some might profess all i do is
a lot of 'hand waving' and gesticulation, endlessly propounding determinism and aether. But this
ridicule is unfair and unjust. If we had religiously adhered to Occam's Razor and the scientific
method, we'd have been pursuing TR/ST long ago..

Can we dispense with elastic time?

i'm guessing not - more on that below .. i spent the day doing four things: trying to visualize the
interaction (and structure) of spacelets with flux vortices that may comprise elementary particles,
trying to explain the cycle of plant and animal life to Joe (my nephew), trying to explain the
danger of rabies to Joe and Arthur (my son), and trying to explain the importance of friends
protecting friends to Joe, Arthur, and Poy (their friend). Not easy considering my broken Thai
and the fact they're 6, 3, and 5 respectively. ;) ..i also just spent a few minutes trying to chuck the
concept of temporal elasticity. But considering gravitational and relativistic time dilation, i don't
think we can.. Local time is affected by two things: strong gravity and your speed relative to your
starting point. If you're near strong gravity, time slows down 'because of the gravity'. And if your
speed relative to your starting point gets anywhere near c, the speed of light in a vacuum, time
slows down from that too.. So only considering those two facts, we cannot chuck the concept of
elastic time.
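
Those two facts are just the standard dilation factors - a minimal sketch of both (textbook
GR/SR expressions, nothing TR-specific):

```python
import math

G = 6.674e-11
C = 2.998e8

def gravitational_factor(mass_kg: float, r_m: float) -> float:
    """Clock rate near a mass relative to far away: sqrt(1 - 2GM/(r c^2))."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (r_m * C**2))

def velocity_factor(v_m_per_s: float) -> float:
    """Clock rate of a moving clock relative to its starting point: sqrt(1 - v^2/c^2)."""
    return math.sqrt(1.0 - (v_m_per_s / C)**2)

M_EARTH, R_EARTH = 5.972e24, 6.371e6
print(gravitational_factor(M_EARTH, R_EARTH))   # ~1 - 7e-10 at Earth's surface
print(velocity_factor(0.9 * C))                 # ~0.436 at 0.9c
```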

The reason, however, we cannot model elementary particle mass with temporal curvature alone
is because of two things: Lense-Thirring and the postulate that e.p.s are dual spacelet / e-m
vortex structures. If indeed Lense-Thirring is confirmed and indeed e.p.s are dual structures,
space must have some way to twist: elasticity. So elastic time alone cannot explain these things.

i had approached the structure issue some years ago but did not have access to decent simulation
equipment (fast computers with large capacity) so i 'postponed development' performing other
activities (living 'normal' life).. i'm confident that others can find a suitable 3D wavelet model to
represent spacelets and also the coupling scheme to define the relationship between spacelets and
flux vortices. That part of the theory does not concern me. The thing that concerns me is: how we
derive the proton/electron mass ratio from elasticity/impedance. If we postulate 'elastic modes'
which produce the stable particle distribution in our universe, we're no better than PQMers.
That's about as repulsive to me as trying to use QFT to justify spacelet theory.

The other thing that concerns me is mentioned above: the coupling scheme between spacelet and
vortex. My previous research indicated flux rotation rate and mass-spin are not identical so that
needs simulation and theoretical investigation. Another thing is energy distribution: is energy
dually manifested or does it reciprocate between them? (In the previous paper, it was suggested
energy reciprocates within a photon, but this was not implied for e.p.s.) Reciprocation might explain
tunneling and other features we detect, but presents problems when modeling orbitals
deterministically (orbit size is wrong when using a reciprocating e-m field). So i'm not convinced
elementary particle flux energy reciprocates with spacelet energy. It may be static and dually
manifested. So the coupling scheme between spacelet and e-m vortex needs thorough
investigation.

The two issues above - the coupling scheme and the derivation of the particle distribution from
elasticity/impedance - concern me, but they do not worry me. i'm confident human beings have the brain
power and patience to derive them both. After all, we've just spent about 100 years developing a
model based on a faulty primary hypothesis.. ;)

One last thing mentioned above: please please please do not attempt to use QFT to
justify/develop any part of spacelet theory. QFT is based on virtual particles and that simply has
no place in spacelet theory. Spacelet theory was developed from a deterministic semi-classical
perspective with heavy emphasis on classic electromagnetics. If you study the previous essays on
the subject, you get a feel of how the theory was developed. Please do not pervert the theory by
trying to include it in the Standard Model. Please respect the theory in the spirit it was developed.

Spacelet theory is essentially the application of GR to elementary particles. But it's also the
reaffirmation of engineering values and ethic. Engineers are practical and tend to see things that
way. What's the simplest model we can create to 'do the job'? We're expedient if nothing else. Of
course when required, we can be very sophisticated and subtle. But our main thrust is
expediency. So Occam's Razor is part of the engineering spirit. It's built in. Leave it to an
engineer to 'point the way' to unification.. i believe that was the most likely scenario anyways..
Please do not take this as arrogance - only a shift in perspective..

The human spirit and Higgs boson

Let's say you had a theory of human life that depended on this thing you propose called a 'spirit'.
That spirit somehow could communicate instantaneously to other spirits (perhaps through some
immeasurably small dimension we can call dimension-X). Also, that spirit is physically
undetectable - by any physical means (perhaps it resides in dimension-X?). If i could prove to
you that the spirit cannot exist - even in dimension X, wouldn't you have to admit your theory
about life was wrong?

According to QFT which is the 'language' of QED and QM (quantum field theory, quantum
electrodynamics, and quantum mechanics), virtual bosons can travel faster than light. Not only
that, they have exotic properties that don't obey normal physics. Feynman created the modern
version of QFT because it was the logical extension of PQM (probabilistic quantum mechanics).
But as i've said repeatedly, just because you can extend a theory does not make its fundamental
premise/assumption correct. (The fundamental premise of modern particle theory is that they are
inherently random with random virtual particles causing forces between them.)

But because forces seem to act instantaneously between particles, Feynman gave some 'rather
interesting' attributes to virtual bosons mentioned above .. i cannot prove Higgs cannot exist but
there's mounting evidence it does not. Besides, the fact i've developed a deterministic version of
particle theory (that does not depend on QFT, QED, nor standard QM) basically originating in
accepted general relativity (with some engineering perspective thrown in for good measure;),
shows we don't really need QFT, QED, nor standard QM.

One of my favorite quotes: there's nothing virtual about reality.

Some subtleties of 'spacelet theory' (a recent name i gave the new area of physics - isn't it cute?;)
need work but the basic framework is sound (from an engineering perspective). It proposes
deterministically that elementary particles are not random - they have definite physical structure
analogous to a hurricane around a spinning 'eye of the storm'. The 'hurricane' part of the structure
is analogous to the electromagnetic flux vortex. The 'eye' part corresponds to the 'mass'. (See
previous paper about why i put mass in quotes.)

Particles interact in spacelet theory in two ways: via temporal curvature and via flux.
According to classical electromagnetism (Gauss' law), the total flux through a closed surface
about a particle equals its charge - charge distributed throughout space 'in some 3D pattern'
about the particle. One of the core problems in
physics is modeling that charge distribution. They've sidestepped it by developing modern QM.
They've basically said in a highly sophisticated mathematical structure: "we don't care, cannot
determine, (about) flux density patterns for individual particles, we can only talk about
interaction between particles virtually". In fact, QFT cannot even 'talk about' individual photons
as i have in previous essays.. It goes against the very basis of modern particle physics.
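
A minimal numerical sketch of that flux-equals-charge statement for a point charge: in SI units,
the surface integral of E over any enclosing sphere returns q/eps0 regardless of radius:

```python
import numpy as np

EPS0 = 8.8541878128e-12
Q = 1.602176634e-19            # one elementary charge, C

def flux_through_sphere(q: float, radius: float, n: int = 400) -> float:
    """Numerically integrate E . dA over a sphere around a point charge."""
    theta = np.linspace(0.0, np.pi, n)
    phi = np.linspace(0.0, 2.0 * np.pi, n)
    dth, dph = theta[1] - theta[0], phi[1] - phi[0]
    TH, _ = np.meshgrid(theta, phi, indexing="ij")
    E = q / (4.0 * np.pi * EPS0 * radius**2)   # radial field magnitude at the surface
    dA = radius**2 * np.sin(TH) * dth * dph    # spherical area elements
    return float(np.sum(E * dA))               # E is radial, so E . dA = E*dA

print(flux_through_sphere(Q, 1.0) * EPS0 / Q)      # ~1.0 (up to grid error)
print(flux_through_sphere(Q, 123.0) * EPS0 / Q)    # ~1.0 again: radius-independent
```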

So as i've said previously, there's a Huge clash between assumptions - between spacelet theory
and standard QM. Spacelet theory asserts there's nothing random about particles except the fact
their best model (at present) is a 3D wavelet, which if true, has some fundamental limitations
based on wavelet theory. But this is not the same as saying particles are inherently random, that
spacetime itself is filled with random virtual particles,.. on and on.. Spacelet theory is basically
the fusion between general relativity and engineering perspective which asserts elementary
particles are complex dynamical structures that appear random for three reasons: some parts of
the structure are exceedingly small, the structure itself is dynamical, and there are temporal
considerations. If
indeed temporal relativity has any basis in reality (the theory i've most recently pursued), then
elementary particles have some energy stored in 'the fabric of time', space, and electromagnetic
field.

If that's true, no wonder physicists 'threw up their hands in resignation' - resigning to the 'fact'
particles are probability waves.. It's much easier to do that - but essentially a cop-out. That's just
like me saying "the human spirit may be undetectable, but i know it exists!" .. (Personally, i
believe in the human spirit - if only our collective spirit that asserts: we have a right to improve
ourselves! We have a right to try to prove - we can earn our right to exist!) What better definition
of the human spirit is there? Similarly, i assert my right to perceive reality deterministically (with
some randomness recognized in chaos theory, turbulence, etc) and not be cowed or harassed
into submission by conventional dogma.

i also have a right to develop those ideas without dependency on the Standard Model, its
assumptions, QM, QED, nor QFT. i have a right to propose deterministic explanations of
Casimir, self-interference, and more.. You see, for every "look - there's more evidence for the
Standard Model!", there's actually a reasonable deterministic explanation which convention
refuses to acknowledge (because of PQM's entrenchment in their minds) .. In fact, it's an
obligation (if i cannot offer reasonable explanations to PQM statements, from a deterministic
perspective, that's always a reason convention can simply ignore me). Okay, here goes. Casimir
farce (i mean force .. slip of the tongue): that is misinterpreted and only applies to metallic plates,
was originally developed to explain molecular forces, and would be invalidated if researchers
used insulating plates. If Casimir truly exists, and is based on conventional interpretation, then it
should exhibit with All materials similarly. But you know it does not. Try it. Self-interference..
That's difficult but possible.. Since spacelet theory asserts 'quantum particles' (ooo i just Love
that phrase;) are not truly quantum, they must be something else: dual extended dynamical structures. If so, it's
easy to imagine (and likely simulate) that self-interference can exhibit under proper
circumstances. What else.. inherent randomness, 'zero point energy',.. The list of self-deceptions
seems to be endless.. Oh, superfluidity! My favorite! Recall in a previous essay where i
addressed that.. i anticipate soon there will be a 'new branch of physics' (a new branch from
spacelet theory) called 'coherent processes' which investigates common themes deterministically
which will include: lasing, superfluidity, wave-packet theory,.. even some forms of turbulence.
You see, right now at this very moment, imagine an arc of two curved vectors: on the left is SM
(Standard Model) which is most of the way toward the right. On the right is a very small curved
vector representing ST/GR (spacelet theory / general relativity). These vectors are pushing
against each other representing conceptual forces between SM and GR. With 100 years of SM
domination, GR has almost disappeared.. but through me and ST, GR is pushing back! And
eventually (i have faith) it will win!
