
COLECCIÓN DE HIPERTEXTOS

Sunday, February 7, 2010


12:18 p.m.
Plectics, Groups, Autopoiesis, Logoi, Topoi, Biosemiosis, Orders, Teletics
Holonomic-Anholonomic Coordinates, *-Autonomous Categories, Neural Network Models, Orthoalgebraic
Semantics, Hyperincursive Automata, Ontology Mapping, Evolutionary Innovation, Quantum-Coherent Cross-
Temporal Teleo-Cybernetic Feedback-Entanglement.

Upon this gifted age, in this dark hour, Falls from the sky a meteoric shower Of facts...they lie unquestioned, uncombined.
Wisdom enough to leech us of our ill Is daily spun: but there exists no loom to weave it into fabric… (Edna St. Vincent Millay, quoted by Murray Gell-Mann)

Entangled concepts, without orientation,


Superpositioned ideas, abstract location,
Complex dynamics, and polytelic effects,
Quantify models, with symmetric events.

Information from, compressing the data,


Oscillating potentials, generated in theta,
Simpler versions, of isotelic connections,
Qualify theories, with parallel inflections.

Transformation logic, semantic weavings,


Concurrent topoi, syntactic interleavings,
Multiplexed verse, of enfolded possibility,
Quasi presheaves, as unfurled probability.
(Hamid Y. Javanbakht)

Murray Gell-Mann defines "Plectics" as "...the study of simplicity and complexity. It includes the various attempts to
define complexity; the study of roles of simplicity and complexity and of classical and quantum i...

Contents
• An Introduction to Simplexity:
• Chaos, Algebra & Topology:
• Infocognition, Metalogic & Grammars
• Cybernetics, Protocomputation & Biosemiotics
• Adaptivity, Emergence & Connectivity
• Symmetry, Design & Reflexivity
• Normativeness, Descriptivity & Analyticity
• Stochasticity, Coherence & Self-Determinacy
• Creation, Annihilation & Gyroteleostasis

http://www.art-sciencefactory.com/complexity-map_feb09.html

www.visualcomplexity.com

Everything on Facebook now:


http://www.facebook.com/group.php?gid=75661813145
An Introduction to Simplexity:
http://www.youtube.com/watch?v=ccndmDMMSAA&feature=PlayList&p=FFBE110987AACC6D&index=0&playnext=1

Complexity:
http://en.wikipedia.org/wiki/Complexity

The Stone Gamut: A Coordinatization of Mathematics:


http://boole.stanford.edu/pub/gamut.pdf

Mathematicians Map E8:


http://aimath.org/E8/

Chaos, Algebra & Topology:


Journal of Modern Dynamics:
The Journal of Modern Dynamics (JMD) is dedicated to publishing research articles in active and promising areas in the
theory of dynamical systems with particular emphasis on the mutual interaction between dynamics and other major areas of
mathematical research, including:

Number theory, Symplectic geometry, Differential geometry, Rigidity, Quantum chaos, Teichmüller theory, Geometric group
theory, Harmonic analysis on manifolds.
http://www.math.psu.edu/jmd/
Institute for Advanced Study:
http://www.ias.edu/
The Kavli Institutes:
http://www.kavlifoundation.org/institutes/
Perimeter Institute for Theoretical Physics:
http://www.perimeterinstitute.ca/
Santa Fe Institute:
http://www.santafe.edu/
Visions of a Sustainable World (1990):
http://www.santafe.edu/research/publications/workingpapers/90-021.pdf
Murray Gell-Mann defines "Plectics" as "...the study of simplicity and complexity. It includes the various attempts to
define complexity; the study of roles of simplicity and complexity and of classical and quantum information in the history of
the universe, the physics of information; the study of non-linear dynamics, including chaos theory, strange attractors, and
self-similarity in complex non-adaptive systems in physical science; and the study of complex adaptive systems, including
prebiotic chemical evolution, biological evolution, the behaviour of individual organisms, the functioning of ecosystems, the
operation of mammalian immune systems, learning and thinking, the evolution of human languages, the rise and fall of
human cultures, the behaviour of markets, and the operation of computers that are designed or programmed to evolve
strategies - say, for playing chess, or solving problems."
Murray Gell-Mann is a founding member and currently a distinguished fellow at SFI as well as the Robert Andrews Millikan
Professor Emeritus at the California Institute of Technology, where he joined the faculty in 1955. His research focuses on

“plectics,” the study of simplicity and complexity, scaling, and the evolution of languages.
"a broad transdisciplinary subject covering aspects of simplicity and complexity as well as the properties
of complex adaptive systems, including composite complex adaptive systems consisting of many adaptive agents."
Nonextensive Entropy: Interdisciplinary Applications
(Edited by Murray Gell-Mann and Constantino Tsallis):
The present book constitutes a pedagogical effort that reflects the presentations and discussion at the International
Workshop on "Interdisciplinary Applications of Ideas from Nonextensive Statistical Mechanics and Thermodynamics," held at
the Santa Fe Institute in New Mexico from April 8--12, 2002. The participants, close to 60 in number, were scientists at both
junior and senior levels from Argentina, Brazil, Canada, Germany, Great Britain, Italy, Mexico, Poland, and the U.S.A. The
subjects of the chapters relate to dynamical, physical, geophysical, biological, economic, financial, and social systems, and
to networks, linguistics, and plectics.
http://www.santafe.edu/research/publications/bookinforev/statmech-preface.php
Let's Call it Plectics:

"A decade ago, when the Santa Fe Institute was being organized, I coined a word for our principal area
of research, a broad transdisciplinary subject covering aspects of simplicity and complexity as well as
the properties of complex adaptive systems, including composite complex adaptive systems consisting
of many adaptive agents. Unfortunately, I became discouraged about using the term after it met with a
lukewarm response from a few of my colleagues. I comforted myself with the thought that perhaps
a special name was unnecessary.
Perhaps I should have been more forceful. A name seems to be inevitable. Various authors are now
toying with such neologisms as "complexology," which has a Latin head and a Greek tail and does
not refer to simplicity. In this note, I should like to try to make up for lost time and put forward what I
have long considered to be the best name for our area of study, if it has to have one.
...
It is appropriate that plectics refers to entanglement or the lack thereof, since entanglement is a key feature of the way
complexity arises out of simplicity, making our subject worth studying. For example, all of us human beings and all the
objects with which we deal are essentially bundles of simple quarks and electrons. If each of those particles had to be in its
own independent state, we could not exist and neither could the other objects. It is the entanglement of the states of the
particles that is responsible for matter as we know it. Likewise, if the parts of a complex system or the various aspects of a
complex situation, all defined in advance, are studied carefully by experts on those parts or aspects and the results of their
work are pooled, an adequate description of the whole system or situation does not usually emerge. The reason, of course,
is that these parts or aspects are typically entangled with one another. We have to supplement the partial studies with a
transdisciplinary "crude look at the whole," and practitioners of plectics often do just that.
I hope that it is not too late for the name "plectics" to catch on. We seem to need it."
http://www.santafe.edu/~mgm/Site/Publications_files/MGM%20118.pdf
"It is interesting to note, therefore, that the two words are related. The Indo-European root *plek- gives rise to the Latin verb
plicare, to fold, which yields simplex, literally once folded, from which our English word "simple" derives. But *plek- likewise
gives the Latin past participle plexus, braided or entwined, from which is derived complexus, literally braided together,
responsible for the English word "complex." The Greek equivalent to plexus is πλεκτός (plektos), yielding the mathematical
term "symplectic," which also has the literal meaning braided together, but comes to English from Greek rather than Latin.
The name that I propose for our subject is "plectics," derived, like mathematics, ethics, politics, economics, and so on, from
the Greek. Since plektos with no prefix comes from *plek-, but without any commitment to the notion of "once" as in "simple"
or to the notion of "together" as in "complex," the derived word "plectics" can cover both simplicity and complexity."

http://en.wikipedia.org/wiki/Plectics
http://en.wikipedia.org/wiki/Group
http://en.wikipedia.org/wiki/Autopoiesis
http://en.wikipedia.org/wiki/Logos
http://en.wikipedia.org/wiki/Topos
http://en.wikipedia.org/wiki/Biosemiotics
http://en.wikipedia.org/wiki/Order
http://en.wikipedia.org/wiki/Telesis
Murray Gell-Mann's Research:
http://www.santafe.edu/~mgm/Site/Research.html
What is Complexity? (by Murray Gell-Mann):
http://www.scribd.com/doc/7887206/COMPLEXITY-by-Murray-Gell-Mann
Effective Complexity as a Measure of Information Content:

"Murray Gell‐Mann has proposed the concept of effective complexity as a measure of information content. The effective
complexity of a string of digits is defined as the algorithmic complexity of the regular component of the string. This paper
argues that the effective complexity of a given string is not uniquely determined. The effective complexity of a string admitting
a physical interpretation, such as an empirical data set, depends on the cognitive and practical interests of investigators. The
effective complexity of a string as a purely formal construct, lacking a physical interpretation, is either close to zero, or equal
to the string's algorithmic complexity, or arbitrary, depending on the auxiliary criterion chosen to pick out the regular
component of the string. Because of this flaw, the concept of effective complexity is unsuitable as a measure of information
content."
http://www.journals.uchicago.edu/doi/abs/10.1086/375469?cookieSet=1&journalCode=phos
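Schematically (notation introduced here for illustration, not taken from the paper or from Gell-Mann): if R(x) denotes the regular component attributed to a string x and K(·) denotes algorithmic (Kolmogorov) complexity, the definition under discussion and the objection can be summarized as:

```latex
% Sketch only; the symbols EffComp, R and K are shorthand introduced here, not the paper's.
\mathrm{EffComp}(x) \;=\; K\bigl(R(x)\bigr)
% Objection raised in the paper: R(x) is not fixed by x alone, so EffComp(x)
% depends on the auxiliary criterion used to separate "regular" from "random".
```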
Evolution, Complexity, Information Theory, and Entropy:
http://library.nyu.edu/research/subjects/science/complexity/
Plectics: The Study of Simplicity and Complexity (by Murray Gell-Mann in Europhysics News):
http://www.europhysicsnews.org/index.php?option=article&access=standard&Itemid=129
&url=/articles/epn/pdf/2002/01/epn02105.pdf
The Simple and the Complex (by Murray Gell-Mann in Complexity, Global Politics, and National Security):
http://www.dodccrp.org/html4/bibliography/comch01.html

Regularities and Randomness: Evolving Schemata in Science and the Arts (by Murray Gell-Mann in Art and Complexity):
http://books.google.be/books?id=MKOUgd39QkcC&lpg=PP1&hl=en&pg=PA47#v=onepage&q=&f=false

Plectics (by Murray Gell-Mann, Chapter 19 of Third Culture: Beyond the Scientific Revolution):
http://www.edge.org/documents/ThirdCulture/zc-Ch.19.html

VisWiki on Plectics and Lecture on "Plectic Thinking":
http://www.viswiki.com/en/Plectics
Simplexity (by Jeffrey Kluger):
http://www.simplexitybook.com/SimplexityVideos.html
Books Which Mention "Plectics":
http://books.google.com/books?q=plectics&btnG=Search+Books

Organizations and Their Management (by Bertram Myron Gross, on "Teletics"):


http://books.google.be/books?id=bHsTAAAAMAAJ&q=teletics&dq=teletics&hl=en
From Cybernetics to Plectics: A Practical Approach to Systems Enquiry in Engineering:

The most prominent systems theories from the 20th century are reviewed in this chapter, and the argument of complex-system
theorists who use the term "plectics" instead of the overused and ambiguous "systems science" and "systems theory" is
supported. It is claimed that the measurement of complex systems cannot be separated from their modelling, as the
boundaries between the specific steps of the scientific method are necessarily blurred. A critical and extended interpretation
of the complex system modelling method is provided and the importance of discipline-specific paradigms and their
systematic interdisciplinary transfer is proposed."
http://www.springerlink.com/content/txp3414750v3r011/
Intelligent Engineering Systems and Computational Cybernetics (2009):
Engineering practice often has to deal with complex systems of multiple variable and multiple parameter models almost
always with strong non-linear coupling. The conventional analytical techniques-based approaches for describing and
predicting the behaviour of such systems in many cases are doomed to failure from the outset, even in the phase of the
construction of a more or less appropriate mathematical model. These approaches normally are too categorical in the sense
that in the name of modelling accuracy they try to describe all the structural details of the real physical system to be
modelled. This can significantly increase the intricacy of the model and may result in an enormous computational burden
without achieving considerable improvement of the solution. The best paradigm exemplifying this situation may be the classic
perturbation theory: the less significant the achievable correction, the more work has to be invested to obtain it. A further
important component of machine intelligence is a kind of structural uniformity giving room and possibility to model arbitrary
particular details a priori not specified and unknown. This idea is similar to the ready-to-wear industry, which introduced
products, which can be slightly modified later on in contrast to tailor-made creations aiming at maximum accuracy from the
beginning. These subsequent corrections can be carried out by machines automatically. This learning ability is a key element
of machine intelligence. The past decade confirmed the view that the typical components of present soft computing, such as
fuzzy logic, neural computing, evolutionary computation and probabilistic reasoning, are of a complementary nature and that
the best results can be achieved by their combined application. Today, the two complementary branches of Machine
Intelligence, that is, Artificial Intelligence and Computational Intelligence serve as the basis of Intelligent Engineering
Systems. The huge number of scientific results published in journals and conference proceedings worldwide substantiates
this statement. The present book contains several articles taking different viewpoints in the field of intelligent systems.
http://books.google.com/books?id=AZ8cwjSqIv8C&source=gbs_navlinks_s
The Extended Mind: The Emergence of Language, the Human Mind, and Culture (Robert K. Logan, pg. 17-19 on plectics):
Building on his previous study, The Sixth Language (2000) and making use of emergence theory, Logan seeks to explain
how language emerged to deal with the complexity of hominid existence brought about by tool-making, control of fire, social
intelligence, coordinated hunting and gathering, and mimetic communication. The resulting emergence of language, he
argues, signifies a fundamental change in the functioning of the human mind: a shift from percept-based thought to concept-
based thought.
From the perspective of the Extended Mind model, Logan provides an alternative to and critique of Noam Chomsky's
approach to the origin of language. He argues that language can be treated as an organism that evolved to be easily
acquired, obviating the need for the hard-wiring of Chomsky's Language Acquisition Device. In addition, Logan shows how,
according to this model, culture itself can be treated as an organism that has evolved to be easily attained, revealing the
universality of human culture as well as providing an insight as to how altruism might have originated.
http://books.google.com/books?id=NYBJEmqhHlAC&source=gbs_navlinks_s
Group Theory:
http://mathworld.wolfram.com/topics/GroupTheory.html
Randomness and Complexity (By Cristian Calude, Gregory J. Chaitin):
http://books.google.com/books?id=RUedyFupPY4C&source=gbs_navlinks_s
Thinking about Gödel and Turing (By Gregory J. Chaitin, Paul Davies):
Dr Gregory Chaitin, one of the world's leading mathematicians, is best known for his discovery of the remarkable Ω (Omega)
number, a concrete example of irreducible complexity in pure mathematics which shows that mathematics is infinitely
complex. In this volume, Chaitin discusses the evolution of these ideas, tracing them back to Leibniz and Borel as well as
Gödel and Turing.
http://books.google.com/books?id=DS7AOrIw8bkC&source=gbs_navlinks_s
Hypercomplexity:
What is biological complexity? How many sorts exist? Are there levels of complexity? How are they related to one another?
How is complexity related to the emergence of new phenotypes? To try to get to grips with these questions, we consider the
archetype of a complex biological system, Escherichia coli. We take the position that E. coli has been selected to survive
adverse conditions and to grow in favourable ones and that many other complex systems undergo similar selection. We
invoke the concept of hyperstructures which constitute a level of organisation intermediate between macromolecules and
cells. We also invoke a new concept, competitive coherence, to describe how phenotypes are created by a competition
between maintaining a consistent story over time and creating a response that is coherent with respect to both internal and
external conditions. We suggest how these concepts lead to parameters suitable for describing the rich form of complexity
termed hypercomplexity and we propose a relationship between competitive coherence and emergence.
http://www.ncbi.nlm.nih.gov/pubmed/16583272
Complex Networks (Course CSYS/Math 303, Spring 2009, University of Vermont):
Complex networks crucially underpin much of the real and synthetic world. Networks distribute and redistribute information,
water, food, and energy. Networks can be constituted by physical pipes, embodied in relationships carried in people's minds,
or manifested by economic interdependencies.
In the past decade, building on work in a wide range of disciplines, many (but certainly not all) advances have been made in
understanding all manner of complex networks such as the World Wide Web, social and organizational networks,
biochemical networks, and transportation networks. In this special topics course, we will explore the evolving field of complex
networks by reading and discussing seminal and recent papers, and developing mathematical and algorithmic results where
they exist. The level will be graduate/advanced undergraduate.
Overview of Potential Projects, Overview of Complex Networks, Branching Networks I, Branching Networks II, Optimal
Supply Networks, Random Networks, Applications of Random Networks, Contagion on Random Networks, Assortativity,

Diffusion (just a little), Generalized Contagion, Centrality, Structure Detection in Networked Systems, Scale-Free Networks,
Small-World Networks, References.
http://www.uvm.edu/~pdodds/teaching/courses/2009-01UVM-303/content/lectures.html
http://www.uvm.edu/~pdodds/teaching/courses/2009-06SFI-networks/index.html
http://www.santafe.edu/events/workshops/index.php/Main_Page
Networks and Complex Systems (Spring 2009 Talk Series):
http://vw.indiana.edu/talks-spring09/
University of Michigan, Center for the Study of Complex Systems:
http://www.cscs.umich.edu/about/about.html
Complexity Digest, Networking the Complexity Community:
http://turing.iimas.unam.mx/~comdig/
Swarm Dynamics, Semiotics, Intelligence, Modeling:
http://www.zulenet.com/see/swarm.html
Thinking in Complexity: The Computational Dynamics of Matter, Mind, and Mankind:
http://books.google.com/books?id=VWgDkNdX9AgC&source=gbs_navlinks_s
Systems Thinking: Managing Chaos and Complexity: A Platform for Designing Business Architecture (Jamshid
Gharajedaghi):
http://www.scribd.com/doc/17455308/Complexity
Anholonomic (or Nonholonomic) Systems:
In physics and mathematics, an anholonomic (or nonholonomic) system is a system whose state depends on the path taken to
reach it. Such a system is described by a set of parameters subject to differential constraints, such that when the system
evolves along a path in its parameter space (the parameters varying continuously in value) and finally returns to the original
set of values at the start of the path, the system itself may not have returned to its original state.
More precisely, a nonholonomic system, also called an anholonomic system, is one in which there is a continuous closed
circuit of the governing parameters, by which the system may be transformed from any given state to any other
state. Because the final state of the system depends on the intermediate values of its trajectory through parameter space,
the system can not be represented by a conservative potential function as can, for example, the inverse square law of the
gravitational force. This latter is an example of a holonomic system: path integrals in the system depend only upon the initial
and final states of the system (positions in the potential), completely independent of the trajectory of transition between those
states. The system is therefore said to be integrable, while the nonholonomic system is said to be nonintegrable. When a
path integral is computed in a nonholonomic system, the value represents a deviation within some range of admissible
values and this deviation is said to be an anholonomy produced by the specific path under consideration. This term was
introduced by Heinrich Hertz in 1894.
http://en.wikipedia.org/wiki/Nonholonomic_system
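A standard concrete illustration of this path dependence (a sketch of my own, not taken from the article above): parallel transport of a tangent vector around a closed geodesic triangle on a sphere brings the base point back to where it started, yet the vector comes back rotated by the enclosed solid angle. Because transport along a great-circle arc acts as the ambient rotation about that arc's axis, the net anholonomy of the loop can be composed from three rotation matrices:

```python
# Anholonomy of parallel transport on the unit sphere (illustrative sketch).
# Loop: north pole -> down the 0-meridian to the equator -> east along the
# equator by angle alpha -> back up a meridian to the pole. Transport along
# each great-circle leg equals the rotation about that leg's axis.
import numpy as np

def rot(axis, angle):
    """Rotation matrix about a unit axis by `angle` (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

alpha = np.radians(60)                    # how far east we walk along the equator
leg1 = rot([0, 1, 0], np.pi / 2)          # pole (0,0,1) down to (1,0,0)
leg2 = rot([0, 0, 1], alpha)              # east along the equator by alpha
leg3 = rot([np.sin(alpha), -np.cos(alpha), 0], np.pi / 2)  # back up to the pole

holonomy = leg3 @ leg2 @ leg1             # net effect of the closed loop
v_start = np.array([1.0, 0.0, 0.0])       # tangent vector at the pole
v_end = holonomy @ v_start

# Although the path is closed, the vector returns rotated about the pole's
# normal (the z-axis) by alpha, the solid angle enclosed by the loop.
print(np.degrees(np.arctan2(v_end[1], v_end[0])))   # ~60.0
```

Changing alpha changes the enclosed area and hence the returned angle; that residual rotation is exactly the "deviation ... produced by the specific path" described above.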
Patent Law and Technology Transfer Interest Group:
http://sigs.nih.gov/patent/Pages/default.aspx
Peter K. Yu (Intellectual Property Law):
http://www.peteryu.com/praeger.htm
Securing Innovation: Managing Intellectual Property, Patents, Trademarks, and Trade Secrets:
http://www.securinginnovation.com/
Survey and Synthesis of Current Innovation Approaches:
http://www.scribd.com/doc/236206/Survey-and-Synthesis-of-Current-Innovation-Approaches
Ontology:
http://en.wikipedia.org/wiki/Ontology_(information_science)
Mereology:
http://en.wikipedia.org/wiki/Mereology
Teleology:
http://en.wikipedia.org/wiki/Teleology
Concurrency (Computer Science):
http://en.wikipedia.org/wiki/Concurrency_(computer_science)
Stanford Knowledge Systems Laboratory:
http://www-ksl.stanford.edu/
Protégé is a free, open source ontology editor and knowledge-base framework:
http://protege.stanford.edu/
An Intrepid Guide to Ontologies:
http://www.mkbergman.com/?p=374
Buffalo Ontology Site:
http://ontology.buffalo.edu/
Ontolinguistics: How Ontological Status Shapes the Linguistic Coding of Concepts (2007):
http://books.google.com/books?id=xxxyZo5A_gEC&source=gbs_navlinks_s
Ontological Foundations of Knowledge Engineering (1993):
The formal representation of aspects related to space, matter, structure and function still constitutes a bottleneck for all
problems related to the representation of physical entities like mechanical artifacts. The goal of the project on Logical
Modelling of Mechanical Assemblies is to develop a unified logical theory accounting both for the qualitative features of
simple mechanical parts (topology, dimension, form, orientation, mechanical properties of faces, edges, slots, holes...) and
the possible relations among them (relative position, contact, mechanical connection, support), at different levels of
granularity. Such a unified theory may play a crucial role in the integration of product data for applications in concurrent
engineering and enterprise integration, since current standardisation tools like ISO-10303 (STEP) are mostly based on
geometrical modelling, and a rigorous characterization of qualitative features is still lacking. The relationships between
mereology, topology, geometry and teleology are among the major technical issues of this project. A preliminary
study has been made on the characterization of part-whole relations and on the ontological relationships between space and
matter.
http://www.ercim.org/publication/Ercim_News/enw25/guarino.html
Steps Towards an Ontology Based Learning Environment:
http://www.slideshare.net/kismihok/eclo2009kismihok
Adaptive Ontology Re-Use: Finding and Re-Using Sub-Ontologies:
The discovery of the "right" ontology or ontology part is a central ingredient for effective ontology re-use. The purpose of this
paper is to present an approach for supporting a form of adaptive re-use of sub-ontologies, where the ontologies are deeply
integrated beyond pure referencing.
Design/methodology/approach – Starting from an ontology draft which reflects the intended modeling perspective, the
ontology engineer can be supported by suggesting similar already existing sub-ontologies and ways for integrating them with
the existing draft ontology. This paper's approach combines syntactic, linguistic, structural and logical methods into an
innovative modeling-perspective aware solution for detecting matchings between concepts from different ontologies. This
paper focuses on the discovery and matching phase of this re-use process.
Findings – Owing to the combination of techniques presented in this general approach, the work described performs in the

general case as well as approaches tailored for a specific usage scenario.
Research limitations/implications – The methods used rely on lexical information obtained from the labels of the concepts
and properties in the ontologies, which makes this approach appropriate in cases where this information is available. Also,
this approach can handle some missing label information.
Practical implications – Ontology engineering tasks can take advantage from the proposed adaptive re-use approach in order
to re-use existing ontologies or parts of them without introducing inconsistencies in the resulting ontology.
Originality/value – The adaptive re-use of ontologies by finding and partially re-using parts of existing ontological resources
for building new ontologies is a new idea in the field, and the inclusion of the modeling perspective in the computation of the
matches adds a new perspective that could also be exploited by other matching approaches.
http://www.l3s.de/~stecher/papers/IJWIS08-Vol4-Num2.pdf
A Harmony Based Adaptive Ontology Mapping Approach:
http://www.dit.unitn.it/~p2p/RelatedWork/Matching/SWW3692.pdf
Emotional Cognitive Agents with Adaptive Ontologies:
http://www.springerlink.com/content/k4g3h14t461p57g2/
Adaptive Ontology-Based Navigation:
http://www.dcs.warwick.ac.uk/~acristea/A3H/camera-ready08/saloun-velart.pdf
Semantic Technology Conference:
http://www.semantic-conference.com/
NorthSide Inc., Language-Based Interaction with Machines:
http://www.northsideinc.com/
Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence):
http://www.dfki.de/
Cycorp:
http://www.cyc.com/
MITRE:
http://www.mitre.org/

Information Retrieval Facility:

Like no other facility in the world, the IRF provides a powerful supercomputing infrastructure that is exclusively concerned
with semantic processing of text. It has at its heart a huge collection of documents representing the global archive of ideas
and inventions in an environment, which allows large-scale scientific experiments on ways to manage and retrieve this
knowledge.
http://www.ir-facility.org/
UMBC eBiquity: Building Intelligent Systems in Open, Heterogeneous, Dynamic, Distributed Environments:
Our research explores the interactions between mobile and pervasive computing, the (semantic) web and web 2.0/3.0/4.0,
multi-agent systems and artificial intelligence, security/privacy/trust, and services. Group members have research interests in
the underlying areas, such as distributed systems, wireless networking, pervasive/mobile systems, ad-hoc networks,
knowledge representation and reasoning, data management and databases, information retrieval, machine learning,
personalization, security and privacy, web/data-mining, multi-agent systems and HPCC. Our research is driven by
applications in the e-services area -- context aware environments (meeting rooms, surgical suites), social media and
blogosphere, wireless web, VANETs, e-commerce and m-commerce, etc.
http://ebiquity.umbc.edu/us/
NetBase: Research Smarter Faster:
http://netbase.com/index.php
http://www.accelovation.com/
http://illumin8.com/home.php
The Shift from Information Retrieval to Synthesis:
"Grand challenges such as public health, security, genomics, environmental protection, education, and economics, are
characterized by complexity, interdependence, globalization, and unpredictability. Although the unprecedented quantity of
information surrounding these challenges can provide users with a new perspective on solutions, the data surrounding
complex systems vary with respect to levels of structure and authority, and include vastly different contexts and vocabularies.
To be successful in this domain we must extend our models of information science such that they operate successfully in
environments where the quantity of relevant information far exceeds our human processing capacity. For example, the well-
accepted precision and recall metrics break down when hundreds of thousands of documents are relevant. Solutions to
grand challenges require that information scientists shift their focus from information retrieval towards information synthesis.
Systems that synthesize information support serendipity and experimentation, and enable the generation of patterns that no
individual has previously considered. In keeping with the socio-technical core of information science, models of synthesis
provide a mixed-initiative approach and "solve tasks cooperatively with a domain expert" [1]. Such systems build on theories
and methodologies from the human-computer interaction, computer supported co-operative work, and information
visualization communities. Information synthesis systems generate knowledge for human problem -solvers who may not yet
recognize their information need."
http://dlist.sir.arizona.edu/2526/01/BlakeAnderson.pdf
mArachna - Applying Natural Language Processing Techniques to Ontology Engineering (2007):
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4281096
Natural Language Processing and Information Systems (2007):
http://books.google.com/books?id=Vjf6qqsFwfgC&source=gbs_navlinks_s
Infocognition, Metalogic & Grammars
Sheaf (Mathematics):
http://en.wikipedia.org/wiki/Sheaf_(mathematics)
Sheaf Semantics for Concurrent Interacting Objects:
This paper proposes a new model theoretic approach to concurrency based on sheaves. Sheaf theory developed in
mathematics for studying relationships between local and global phenomena, and has also been applied in algebraic
geometry, differential geometry, analysis, and even logic. It has been given an abstract form using category theory [29,28],
which among other things provides some general results about limits that are used in this paper. From the point of view of
concurrency theory, it seems suggestive to think of sheaves as a generalisation of trace models. Sheaves handle real time
systems, and variation over space as well as over time, either discrete or continuous, in fact over any topological space, in a
very natural way.
http://www.citeulike.org/pdf/user/isotelesis/article/3717500/goguen_92_sheaf.pdf

Type (Model Theory):


http://en.wikipedia.org/wiki/Type_(model_theory)
Type Theory:

http://en.wikipedia.org/wiki/Type_theory
*-Autonomous Categories: Quantifiers in Action: Generalized Quantification in Query, Logical and Natural Languages (by
Michael Barr and Po-Hsiang Chu):
http://books.google.com/books?id=WC4pkt3m5b0C
Concurrent ontology and the extensional conception of attribute (Vaughan Pratt):
By analogy with the extension of a type as the set of individuals of that type, we define the extension of an attribute as the
set of states of an idealized observer of that attribute, observing concurrently with observers of other attributes. The attribute-
theoretic counterpart of an operation mapping individuals of one type to individuals of another is a dependency mapping
states of one attribute to states of another. We integrate attributes with types via a symmetric but not self-dual framework of
dipolar algebras or disheaves amounting to a type-theoretic notion of Chu space over a family of sets of qualia doubly
indexed by type and attribute, for example the set of possible colors of a ball or heights of buildings. We extend the sheaf-
theoretic basis for type theory to a notion of disheaf on a profunctor. Applications for this framework include the Web
Ontology Language OWL, UML, relational databases, medical information systems, geographic databases, encyclopedias,
and other data-intensive areas standing to benefit from a precise ontological framework coherently accommodating types
and attributes. Keywords: Attribute, Chu space, ontology, presheaf, type.
http://conconto.stanford.edu/conconto.pdf
http://chu.stanford.edu/
http://chu.stanford.edu/guide.html
http://boole.stanford.edu/pratt.html
http://boole.stanford.edu/abstracts.html
http://en.wikipedia.org/wiki/Vaughan_Pratt
A Historical Note on "Geometry and Concurrency" (Eric Goubault):
http://www.di.ens.fr/~goubault/index1.html
Conceptual Mathematics (F.W. Lawvere, Stephen Hoel Schanuel):
In the last fifty years, the use of the notion of 'category' has led to a remarkable unification and simplification of mathematics.
Written by two of the best known participants in this development, Conceptual Mathematics is the first book to serve as a
skeleton key to mathematics for the general reader or beginning student and as an introduction to categories for computer
scientists, logicians, physicists, linguists etc. While the ideas and techniques of basic category theory are useful throughout
modern mathematics, this book does not presuppose knowledge of specific fields but rather develops elementary categories
such as directed graphs and discrete dynamical systems from the beginning. The fundamental ideas are then illuminated in
an engaging way by examples in these categories.
http://books.google.com/books?id=o1tHw4W5MZQC&source=gbs_navlinks_s
A Categorical Manifesto (1991, Joseph A. Goguen):
http://citeseer.ist.psu.edu/old/goguen91categorical.html
nCatLab:
http://ncatlab.org/nlab/show/HomePage
The n-Category Café (John Baez, a group blog on math, physics and philosophy):
http://golem.ph.utexas.edu/category/
What is Category Theory? (2006):
http://books.google.com/books?id=tVOuvxqhBxwC&source=gbs_navlinks_s
Perspectives of Neural-Symbolic Integration (2007):
http://books.google.com/books?id=gCGTN2lmwD8C&source=gbs_navlinks_s
Neural Information Processing (2008):
The second volume contains 112 contributions related to statistical and pattern recognition algorithms, neuromorphic
hardware and implementations, robotics, data mining and knowledge discovery, real world applications, cognitive and hybrid
intelligent systems, bioinformatics, neuroinformatics, brain-computer interfaces, and novel approaches.
http://books.google.com/books?id=noYjY8rMilEC&source=gbs_navlinks_s
Artificial General Intelligence (2008):
http://books.google.com/books?id=a_ZR81Z25z0C&source=gbs_navlinks_s
Functional Models of Cognition (2000):
Readership: psychologists, cognitive scientists, theorists working in complexity theory, self-organization theory, synergetics,
semantics of natural language and mereology, philosophers, epistemologists.
http://books.google.com/books?id=KKSx1rM2YCUC&source=gbs_navlinks_s
Cognitive Processing Journal:
http://www.springerlink.com/content/1612-4782
Cognitive Neurodynamics Journal:
http://www.springerlink.com/content/1871-4080
Computational Cognitive Neuroscience Laboratory (Indiana University):
http://www.indiana.edu/~cortex/
Neuroscience Databases (By Rolf Kötter, 2002):
http://books.google.com/books?id=QZsuEzuOlhEC&source=gbs_navlinks_s
A.I., CogSci and Robotics:
http://www.transit-port.net/AI.CogSci.Robotics/robotics.html
A Complex Systems Perspective on the "Computation vs. Dynamics" Debate in Cognitive Science
(Melanie Mitchell, Santa Fe Institute):
http://web.cecs.pdx.edu/~mm/cogsci98.pdf
Society for Complex Systems in Cognitive Science:
http://pdl.brain.riken.jp/scscs/
Dynamicist Cognitive Science:
http://www.phil.mq.edu.au/staff/jsutton/CogSciDynamicism.html
The Continuous and the Infinitesimal in Mathematics and Philosophy:
http://books.google.com/books?id=Eq8ZualfMRkC&source=gbs_navlinks_s
Information Flow: The Logic of Distributed Systems:
http://books.google.com/books?id=Mawadg55eg4C
A Practical Logic of Cognitive Systems:
Volume I: Agenda Relevance: A Study in Formal Pragmatics:
http://books.google.com/books?id=x2sqXzdmn8MC
Volume II: The Reach of Abduction: Insight and Trial:
http://books.google.com/books?id=UULl07dutBwC
Applying Prolog to Semantic Web Ontologies & Rules Moving Toward Description Logic Programs:
http://www.mitre.org/work/tech_papers/tech_papers_07/06_0917/

Theories of Geographic Concepts: Ontological Approaches to Semantic Integration:

Most widely available approaches to semantic integration provide ad-hoc, non-systematic, subjective manual mappings that
lead to procrustean amalgamations to fit the target standard, an outcome that pleases no one. Written by experts in the field,
Theories of Geographic Concepts: Ontological Approaches to Semantic Integration emphasizes the real issues involved in
integrating existing geo-ontologies.
http://books.google.com/books?id=YzVUcZj65CQC&dq=infomorphism&source=gbs_navlinks_s
Alexander Okhotin (Formal Language and Automata Theory, Language Equations):
http://users.utu.fi/aleokh/
Kazem Mahdavi (Group Theory, Universal Algebra)
http://books.google.com/books?q=Kazem+Mahdavi&bt nG=Search+Books
M. Alsani (Descent & Category Theory)
http://north.ecc.edu/alsani/descent.html
Barbara J. Grosz (Artificial Intelligence, Collaborative Planning and Human-Computer Communication)
http://www.eecs.harvard.edu/grosz/
Jakub Szymanik (Philosophical Logic, Computational Linguistics and Cognitive Science)
http://staff.science.uva.nl/~szymanik/
Alessio Guglielmi (Proof Theory from a Theoretical Computer Science perspective)
http://alessio.guglielmi.name/res/index.html
Joseph Goguen (Information Integration, ontologies, database semantics, schema mapping)
http://cseweb.ucsd.edu/~goguen/
Reinhard Blutner (Quantum Cognition):
http://amor.rz.hu-berlin.de/~h0998dgh/
The Human-Computer Interaction Fundamentals (2009):
In sixteen highly focused chapters, this book puts the spotlight on the fundamental issues involved in the technology of
human-computer interactions as well as the users themselves. Derived from select chapters in The Human-Computer
Interaction Handbook, this volume emphasizes emerging topics such as sensor based interactions, tangible interfaces,
augmented cognition, cognition under stress, ubiquitous and wearable computing, and privacy and security. It explores
human information processing, motivation, emotion in HCI, sensor-based input solutions, and accessibility/diversity issues.
The book features visionary perspectives and developments that fundamentally transform the way in which researchers and
practitioners view this discipline.
Teletic Work and Motivational Affordances:
Teletic, or autoteletic, work refers to "work" that is experienced as enjoyable and is associated with flow or optimal
experience characterized by a sense of well-being and harmony with one's surroundings (Csikszentmihalyi, 1990). There is
variation in both tasks and individuals with respect to the degree to which the human-technology interaction is teletic. There
are four categories in which individuals tend to fall with respect to their relation to work.
http://books.google.com/books?id=npLEMUzgQ_0C&pg=PA99&dq=autoteletic
Metamotivational States:
Reversal Theory is - at its essence - a theory of the structure of mental life. Developed by Dr. Michael Apter, Reversal
Theory emphasizes the complexity, changeability, and inconsistency of behavior, and proposes that individuals can and do
regularly reverse between psychological states, depending upon the meaning and motives felt by that individual.
http://www.reversaltheory.org/RT_TheoryGlos.htm
Nonlinearity and Teleology:
In contemporary analyses, teleological narratives are often mistakenly opposed to "nonlinear" narratives. Many secular
teleologists throughout history described telos as a product of feedback, not as a direct cause separate from the process it is
said to guide. Moreover, in many teleological accounts of causation, a telic state is seen as the inevitable result
of random interactions. The importance of chance to the concept of telos has been ignored by arguments that have confused
nonlinear telic causality with reductive material causality. Today nonlinear dynamics theorists and structural evolutionary
theorists use the terms "structural attractors," "emergent complexity," and "self-organization" to describe the same kinds of
phenomena that interested Aristotle, Kant, Bergson, and many other teleologists and vitalists.
http://www.dactyl.org/directors/vna/Pasadena_Talk.htm
CiteULike:
http://www.citeulike.org/user/msakai
http://www.citeulike.org/user/isotelesis
http://www.citeulike.org/user/scis0000001
http://www.citeulike.org/user/Scis0000002
Institute for Logic, Language, and Computation:
http://www.illc.uva.nl/
http://www.illc.uva.nl/Publications/reportlist.php?Series=PP
Journal of Logic, Language, and Information:
http://www.springer.com/philosophy/logic/journal/10849
Association for Logic, Language, and Information:
http://folli.loria.fr/
European Summer School in Logic, Language, and Information:
http://esslli2009.labri.fr/
North American Summer School in Logic, Language, and Information:
http://www.indiana.edu/~nasslli/
http://www.nasslli.com/
Boundary Institute, Foundations of Physics, Mathematics, and Computer Science:
http://boundary.org/bi/index.html
Group Theory and Computational Linguistics:
http://portal.acm.org/citation.cfm?id=595932
Lambek Calculus and Noncommutative Logic:
http://lpcs.math.msu.su/~pentus/abstr.htm
http://en.wikipedia.org/wiki/Ordered_logic
Linear Logic:
http://en.wikipedia.org/wiki/Linear_logic
The Oxford handbook of philosophy of mathematics and logic:
http://books.google.com/books?id=GU3lV1xoWC8C
Metalogic:
http://en.wikipedia.org/wiki/Metalogic
A Meta-logical Approach for Multi-agent Communication of Semantic Web Information:
The success of the semantic web will be determined by how easily and uniformly semantic information can be accessed and
exchanged among computers. In this paper we have developed a framework of multi-agent communication of the Semantic
Web information. The agent and the communication between agents are characterized in meta-logic. One single agent,

understood as a meta-logical system, adopts a demo(.) predicate as its inference engine and meta-programs—transformed
from some Semantic Web ontologies—as its assumptions. Such an agent can reason with its own assumptions as well as other
agents' assumptions. With this ability, when several agents are created by using this framework, the community of these
agents can uniformly communicate the Semantic Web information between each other on the Internet.
http://www.springerlink.com/content/y578t3k578541452/
http://www.waset.org/pwaset/v10/v10-20.pdf
Communication heuristics in distributed combinatorial search algorithms:
http://www.springerlink.com/content/fj734q8271p718x9/
http://en.wikipedia.org/wiki/Combinatorial_optimization
Parallel Problem Solving From Nature:
http://books.google.com/books?id=gI26Cld2BY0C&lr=&source=gbs_navlinks_s
Orthoalgebraic Semantics and Quantum Linguistics:
Classical truth-functional semantics and almost all of its modifications have a serious problem in treating prototypes and their
combinations. Though some modelling variants can account for many puzzling empirical observations, their explanatory
value is seldom noteworthy. In recent work by several researchers it has been argued that this explanatory inadequacy is
due to the Boolean characteristic of the underlying semantics. These researchers have suggested a proper generalization of
Boolean algebras called ortho-algebras (known from quantum information theory). In five lectures, this new and exciting field
of research will be discussed: introduction and motivating examples; the mathematics of orthoalgebras; a decorated partition
theory of questions; quantum probabilities and bounded rationality; prototypicality and complex concepts. Many linguistic
phenomena have a close analogue to phenomena investigated in quantum physics. Words are floating freely in a polyvalent
state representing a variety of different uses. As the properties of small particles are not absolute and determined not until
observing them, in language the properties of word tokens are determined not until conscious apprehension. Further,
cognitive measures such as salience, typicality or cue validity cannot be modelled properly by classical probabilities. Instead,
quantum probabilities were quite useful for handling such quantities. Finally, a quantum framework can be used for
integrating logic programs and connectionist systems (representing the "phrase space" in a dynamic "phase space"). The
aim of a planned workshop is to discuss the applicability of methods known from quantum theory to the study of natural
language. The new and exciting field of research will be discussed in three blocks: vector-based retrieval of semantic
information (see Widdows, Aerts); prototype semantics, bounded rationality, and interference effects (see Aerts, Gabora,
Busemeyer, Khrennikov, Franco); representation theory for nonlinear dynamic automata and quantum information theory
(see Atmanspacher, beim Graben, Primas).
http://www.quantum-cognition.de/
NeuroQuantology is a journal dedicated to supporting the interdisciplinary exploration of the nature of quantum physics and
its relation to the nervous system:
http://www.neuroquantology.com/
Quantum Cognition and Quantum Brain Dynamics:
http://physik.htu.tugraz.at/wiki/images/9/99/WYOPSOL_TU1.pdf
Quantum Brain, Quantum Mind, & Quantum Consciousness:
http://www.quantumbrain.org/
http://www.quantumbrain.org/Abstract2007.html
Emergent Mind:
http://www.emergentmind.org/
http://www.emergentmind.org/Theoretical%20Milestones.htm
Holonomic Brain Theory:
http://www.scholarpedia.org/article/Holonomic_brain_theory
Is This a Unified Theory of the Brain? (Bayesian Neural Network Modeling):
http://reverendbayes.wordpress.com/2008/05/29/bayesian-theory-in-new-scientist/
Redwood Center for Theoretical Neuroscience:
Theoretical neuroscience: a sub-discipline within neuroscience which attempts to use mathematical and physical principles
to understand the nature of coding, dynamics, circuitry and plasticity in nervous systems.
https://redwood.berkeley.edu/wiki/Mission_and_Research
Consciousness and Hyperspace with Saul-Paul Sirag:
http://www.intuition.org/txt/sirag.htm
Holonomic and Anholonomic "Coordinates" (Saul-Paul Sirag):
There is a confusion in the literature over the use of the word “coordinates.” As a result, in the older literature influenced by
J.A. Schouten (1951), the terms “holonomic coordinate system” and “anholonomic system” are used. And for an
anholonomic system an “anholonomic object” is employed. In the newer literature, exemplified by Bernard Schutz (1980),
the terms “coordinate system” and “noncoordinate system” are used. In this case the “anholonomic object” is replaced by the
Lie algebra structure constant tensor. A subtle question then arises: is the anholonomic object a tensor? J.F. Corum claims
that the anholonomic object is not a tensor, and can therefore be removed by a change in coordinate system. G. Shipov
claims that if the underlying manifold is a noncommutative Lie Group, then the anholonomic object is a tensor and cannot
simply be coordinatized away.
I hold that, in this argument, Shipov is correct. The key is to understand the relationships between manifolds and the vector
fields which live on them. Also we must understand the difference between a commutative Lie group and a
noncommutative Lie group and the effect which this difference makes on the vector fields on the respective Lie group
manifolds.
http://www.stardrive.org/Jack/Holonomy.pdf
Holonomic and Anholonomic Constraints and Coordinates, Frobenius Integrability and Torsion of Various Types (R. M.
Kiehn):
http://www22.pair.com/csdc/pdf/anholono.pdf
Consciousness Studies:
http://en.wikibooks.org/wiki/Consciousness_studies
Alex Kaivarainen (quantum theory of condensed matter, especially ice and water, and biophysics):
http://web.petrsu.ru/~alexk/
Nature, Cognition and Quantum Physics - Peter Marcer: "in computer science, a physical theory - the quantum theory of
computation - is now to be regarded as the theory of computation, replacing the mathematical/Turing theory as the correct
one, the nature of information is radically extended by the concept of quantum information, beyond what, until now, has
generally been accepted in science to be the case, information therefore becomes a new concept on a par with the accepted
concept of energy, needing incorporation in understanding physics, and as already experimentally validated, this
incorporation radically changes the scientific understanding of how chemistry may be performed - specifying new designs
for chemical systems employing optimally controlled quantum signal induced, rather than approximately thermodynamically
induced, chemical reactions."
http://www.bcs.org/server.php?show=ConWebDoc.16175
Cybernetics, Protocomputation & Biosemiotics

The Physics of Information:
http://theory.caltech.edu/people/preskill/
Geometric (Holonomic) Gates:
http://www.quantiki.org/wiki/index.php/Geometric_(holonomic)_gates
Holonomic Quantum Computing:
http://qwiki.stanford.edu/wiki/Holonomic_Quantum_Computing
Systems Thinking, Complex Systems, Chaos Theory, Systems Dynamics, Adaptive Systems, Self-Organization, Self-
Control, Autopoiesis, Autonomic Systems, Holonomic Systems, Nonlinear Dynamics, Complexity, Emergence,
Sociotechnical Systems, Fractals, Genetic Algorithms, Artificial Life.
http://www.brint.com/Systems.htm
Coherent Dynamics of Complex Quantum Systems:
http://books.google.com/books?id=pfeT5D_Wsy0C
Quantum Dissipative Systems:
http://books.google.com/books?id=4NfnaEsbQq4C
Energy and Information Transfer in Biological Systems:
http://books.google.com/books?id=jPDkS1I61vMC
Wholeness and Information Processing in Biological Networks: An Algebraic Study of Network Motifs:
http://www.springerlink.com/content/kv0x6332v2t47963/
Cybernetics & Human Knowing: A Journal of Second Order Cybernetics, Autopoiesis, and Cyber-Semiotics:
http://www.chkjournal.org/
Rules of Three:
Isotelism, Polytelism, Holotelism
Egalité, Amitié, Liberté
Physical, Mental, Platonic
Deductive, Inductive, Abductive

Rules of Four:
The axioms (basic rules) for a group are:
1. CLOSURE: If a and b are in the group then a • b is also in the group.
2. ASSOCIATIVITY: If a, b and c are in the group then (a • b) • c = a • (b • c).
3. IDENTITY: There is an element e of the group such that for any element a of the group
a • e = e • a = a.
4. INVERSES: For any element a of the group there is an element a⁻¹ such that a • a⁻¹ = e and a⁻¹ • a = e.
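As a concrete check (an illustration of mine, not part of the source), the integers modulo n under addition mod n satisfy all four axioms, while the nonzero residues under multiplication mod 6 fail already at closure; the brute-force Python sketch below tests each axiom directly:

```python
# Brute-force test of the four group axioms (closure, associativity,
# identity, inverses) for a finite set with a binary operation.
def is_group(elements, op):
    elements = list(elements)
    closure = all(op(a, b) in elements for a in elements for b in elements)
    assoc = all(op(op(a, b), c) == op(a, op(b, c))
                for a in elements for b in elements for c in elements)
    identities = [e for e in elements
                  if all(op(a, e) == a and op(e, a) == a for a in elements)]
    inverses = bool(identities) and all(
        any(op(a, b) == identities[0] and op(b, a) == identities[0]
            for b in elements)
        for a in elements)
    return closure and assoc and bool(identities) and inverses

n = 6
print(is_group(range(n), lambda a, b: (a + b) % n))      # True: (Z_6, +) is a group
print(is_group(range(1, n), lambda a, b: (a * b) % n))   # False: 2*3 mod 6 = 0 leaves the set
```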

Rules of Five:
Air, Fire, Water, Earth, Space
"Hology is a logical analogue of holography characterizing the most general relationship between reality and its contents. It is
a form of self-similarity whereby the overall structure of the universe is everywhere distributed within it as accepting and
transductive syntax, resulting in a homogeneous syntactic medium."
http://www.megafoundation.org/Teleologic/main.htm
"A review of the standard computational theory of language may prove useful. Computation theory recognizes two general
types of automata, transducers and acceptors. Transducers convert input to output, while acceptors classify or "recognize"
input consisting of strings of symbols without necessarily producing output.
A finite transducer is a 5-tuple (Σ,Q,Γ,δ,ω), where Σ is a finite nonempty input alphabet, Q is a finite nonempty state set, Γ is
a finite nonempty output alphabet, δ:Q × Σ ~> Q is the state transition function, and ω:Q × Σ ~> Γ is the output function. To
this we can add a start state q0. Finite transducers ultimately rely on mechanical laws to function, transforming informational
input to informational output by transforming their own states.
A finite acceptor is a 5-tuple (Q,Σ,δ,q0,A), where Q is a nonempty finite set of internal states, Σ is an alphabet, q0 is the start
state, and A ⊆ Q is the set of accepting states. The range of the transition mapping δ determines the type of acceptor; it is
deterministic if δ:Q×Σ~>Q, and nondeterministic if δ:Q×Σ~>2^Q (where 2^Q represents the power set of possible states). A
deterministic finite acceptor (Q,Σ,δ,q0,A) accepts a string x ∈ Σ* iff δ(q0,x) ∈ A. A language is the set of strings accepted by a
given automaton or class of automata.

Languages are generated by grammars. In the computational theory of language, a generative (or phrase structure)
grammar G is a 4-tuple (N,T,P,σ) consisting of (1) a finite set N of nonterminals; (2) a finite nonempty set T of terminals, with
N∩T=∅ and N∪T=A (the total alphabet of the grammar); (3) a finite set of productions P ⊂ ((N∪T)*\T*) × (N∪T)* consisting
of nonterminal arguments and their possibly terminal transforms; and (4) an element σ of N called the starting symbol. The
implementation of such a grammar is a deductive process leading from the general to the specific; starting from the most
general symbol σ (which stands for "sentence"), increasingly specific productions lead to a terminal configuration. The
production (x,y), often written x~>y, signifies replacement of x by y, or equivalently, the substitution of y for x. Where A*
denotes the set of all strings or "words" in A, and A*\T* denotes the complement of T* in A*, a word w ∈ (A*\T*) generates
another word w' if w=w1Xw2, w'=w1X'w2, and X~>X' is a production.
The theory of generative grammars classifies them according to the least powerful acceptor that can recognize the
languages they generate. Type 0 grammars generate unrestricted languages requiring a universal computer (Turing
machine) with unlimited memory; type 1 grammars generate context-sensitive languages requiring a linear-bounded
automaton with memory proportional to word length; type 2 grammars generate context-free languages requiring a
pushdown automaton with a memory stack in which a fixed number of elements are available at any point; and type 3
grammars generate regular languages requiring a finite deterministic automaton with no memory.
There is an obvious parallel between the states and state transitions of automata, and the strings and productions of a
grammar. An automaton processes input strings through its internal states, expressing them in terms of its own "internal
language". Indeed, a physical automaton in the act of processing an input string can be seen as a dynamic linguistic
stratification incorporating the input language, the mutable programming of the automaton (including assembly and machine
code), its hard-wired architecture, the nomological language consisting of the laws of physics according to which the
hardware functions, and any "metaphysical" level of language necessary to define and maintain the laws of physics
themselves. Since each language in this sequence is expressed in terms of the next one after it, the languages form a
"descriptive nesting" in which the syntax of each distributes over all of those preceding it.
The syntax of a language consists of its grammar and the structure of its expressions. That is, a syntax is a compilation of
the spatial (structural) and temporal (grammatical, transformational) rules of the associated language; its rules are invariant,
general, and distributive with respect to the entire set of expressions comprising the language. This concept is as meaningful
for automata as it is for the languages they process, applying to every level of the linguistic stratification just described. For

example, where the concept of general covariance expresses the general and distributive nature of the laws of physics,
these laws can be regarded as a ―syntax‖ unto themselves, and so can the more general mathematical laws applying to the
various mathematical structures to which the laws of physics implicitly refer."
Pg. 38-39:
http://megafoundation.org/CTMU/Articles/Langan_CTMU_092902.pdf
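The acceptor definition quoted above translates almost directly into code. The sketch below (ours, not from the cited paper) represents δ as a dictionary and accepts a string x iff δ(q0,x) ∈ A; the example machine, which accepts binary strings containing an even number of 1s, is purely illustrative.

```python
# Minimal deterministic finite acceptor (Q, Sigma, delta, q0, A); example data is ours.
Q = {"even", "odd"}
Sigma = {"0", "1"}
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd", ("odd", "1"): "even"}
q0 = "even"
A = {"even"}

def accepts(x: str) -> bool:
    """Extended transition function: return True iff delta(q0, x) lands in A."""
    state = q0
    for symbol in x:
        state = delta[(state, symbol)]
    return state in A

print(accepts("1010"))  # True: two 1s
print(accepts("111"))   # False: three 1s
```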
Finite-State Machines:
http://en.wikipedia.org/wiki/Finite-state_machine
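To make the production machinery above concrete, here is a toy derivation in a type-2 (context-free) grammar of our own choosing: the productions S → aSb | ε are applied to the leftmost nonterminal until only terminals remain, illustrating the rewriting step w=w1Xw2 ⇒ w′=w1X′w2.

```python
# Toy context-free derivation; the grammar and derivation strategy are illustrative only.
N = {"S"}                          # nonterminals
T = {"a", "b"}                     # terminals
P = {"S": [["a", "S", "b"], []]}   # productions: S -> aSb | epsilon
sigma = "S"

def derive(max_expansions=3):
    """Rewrite the leftmost nonterminal until a terminal word remains."""
    word, steps, expansions = [sigma], [[sigma]], 0
    while any(s in N for s in word):
        i = next(idx for idx, s in enumerate(word) if s in N)
        # take the recursive alternative a few times, then the terminating one
        rhs = P[word[i]][0] if expansions < max_expansions else P[word[i]][1]
        word = word[:i] + rhs + word[i + 1:]
        expansions += 1
        steps.append(list(word))
    return word, steps

word, steps = derive()
for w in steps:
    print(" ".join(w) or "epsilon")
print("terminal word:", "".join(word))  # aaabbb, a member of {a^n b^n}
```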

Ontological Semantics:
Ontological semantics, an integrated complex of theories, methodologies, descriptions, and implementations,
attempts to systematize ideas about both semantic description as representation and manipulation of
meaning by computer programs. It is built on already coordinated "microtheories" covering such diverse
areas as specific language phenomena, processing heuristics, and implementation system architecture rather
than on isolated components requiring future integration. Ontological semantics is constantly evolving,
driven by the need to make meaning manipulation tasks such as text analysis and text generation work.
Nirenburg and Raskin have therefore developed a set of heterogeneous methods suited to a particular task
and coordinated at the level of knowledge acquisition and runtime system architecture implementations, a
methodology that also allows for a variable level of automation in all its processes.

Nirenburg and Raskin first discuss ontological semantics in relation to other fields, including cognitive science
and the AI paradigm, the philosophy of science, linguistic semantics and the philosophy of language,
computational lexical semantics, and studies in formal ontology. They then describe the content of
ontological semantics, discussing text-meaning representation, static knowledge sources (including the
ontology, the fact repository, and the lexicon), the processes involved in text analysis, and the acquisition of
static knowledge.
http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=10267
Descriptional Complexity of Multi-Parallel Grammars with Respect to the Number of Nonterminals:
The conventional wisdom was that biology influenced mathematics and computer science. But a new approach has taken
hold: that of transferring methods and tools from computer science to biology. The reverse trend is evident in Grammars and
Automata for String Processing: From Mathematics and Computer Science to Biology and Back. The contributors address
the structural (syntactical) view of the domain. Mathematical linguistics and computer science can offer various tools for
modeling complex macromolecules and for analyzing and simulating biological issues. This collection is valuable for students
and researchers in biology, computer science, and applied mathematics.
http://www.informaworld.com/smpp/content~content=a732478690~db=all~jumptype=rss
Key Concepts in Holopoetry:
From the start the breaking down of the immaterial space of holography, as well as the development of non-linear temporal
systems, have been the basis of my holographic syntax.
http://www.electronicbookreview.com/thread/electropoetics/uncontrollable?mode=print
The holocosmological model is aesthetic and uses metaphor at all levels. The structure and function of metaphor, and its role
in philosophical and scientific description, is presented, using the concepts of holography as a metaphor for metaphor.
Important metaphysical themes, such as participation and caring, are brought together in a discussion of the role of language
as expression and limitation. Poetic language is justified as the primary expression of knowledge of human and ambihuman
nature; it is found to be particularly appropriate for cosmology, with its ability to impart wholeness, ambiguity, reference, and
interrelatedness. Poetry weaves the themes and threads into a holocosmology. Literally, poiesis is "making" and holopoiesis
is making the whole universe. This poetic framework is a holopoietic cosmology, where the highest wisdom is "letting be"
(beyond necessary utilization), and where compliance and caring are derivations of the most binding emotion, love.
http://www.3musesbooks.com/oemwrev.htm
"Category the First is the Idea of that which is such as it is regardless of anything else. That is to say, it is a Quality o f
Feeling. Category the Second is the Idea of that which is such as it is as being Second to some First, regardless of anything
else, and in particular regardless of any Law, although it may conform to a law. That is to say, it is Reaction as an element of
the Phenomenon. Category the Third is the Idea of that which is such as it is as being a Third, or Medium, between a
Second and its First. That is to say, it is Representation as an element of the Phenomenon."
http://www.textlog.de/7649.html
The Philosophy of Ecology:
http://books.google.com/books?id=uYOxUAJThJEC
Teleosemantics seeks to explain meaning and other intentional phenomena in terms of their function in the life of the
species:
http://books.google.com/books?id=hgUXTKBiDDUC
Form-Meaning Asymmetries and Bidirectional Optimization:
https://webspace.utexas.edu/dib97/fmaabo.pdf
On thinking of kinds: a neuroscientific perspective:
http://homepage.mac.com/ancientportraits/drsite/representingkinds.pdf

Logic as Philosophy: An Introductory Anthology:


http://www.libstudy.hawaii.edu/manicas/pdf_files/books/LogicAsPhilosophy.pdf

Ling 236: Quantitative, Probabilistic, and Optimization Base Explanation in Linguistics:


http://nlp.stanford.edu/~manning/courses/ling236/handouts/ling236-prob-in-ling.pdf

• Category Theory Category theorists are conceptual mathematicians of a special kind. What binds them together is that they
approach mathematical problems with a point of view that is radically different from that on which traditional mathematics is
based, and which emphasizes interactions between mathematical objects over their individual constituents. Their results are
often surprising, provide new insights, and are obtained by the invention of sophisticated notions, theories, and techniques.
Category Theory is only little more than 50 years old (dating it back to the work of S. Eilenberg and S. MacLane in 1945) --
yet, its impact on several branches of mathematics has been considerable, in spite of the reluctance to recognize it as a
revolutionary independent field dealing with foundational questions, very different from Set Theory.
• Category Theory at McGill The category theorists that constitute our group are, in order of their joining the Department, Jim
Lambek, Marta Bunge, Michael Barr and Michael Makkai, with the addition of Robert Seely and Thomas Fox as Adjunct
Professors. Together, they have a variety of traditional interests comprising Logic, Model Theory, Set Theory, Ring Theory,
Algebraic Theories and Categories, Differential Algebra, Homological Algebra, Synthetic Differential Geometry and Topology,
Hopf Algebras and Dynamical Systems, Topos Theory, Locales Theory, Fundamental Group, Descent, Classifying Toposes,

Theory of Distributions, Fibered Categories, Higher-Order Categories, Categorical Linguistics, and Theoretical Computer
Science. After the retirements of Jim Lambek and Michael Barr, both Emeritus Professors, we hope to be able to make new
strong additions to the Department in the near future. The following is a more or less exhaustive list of Category Theory
Centers in the world: Montreal, Cambridge, Sydney, Chicago, Buffalo, Bangor, Louvain-la-Neuve, Utrecht, Genova, Trieste,
Como, Paris, Toronto and Dalhousie.
• The Montreal Categories Center It began informally in 1966, when Jim Lambek, after a sabbatical year in Zurich and
contact with Bill Lawvere, decided not only to work in the field himself, but also to promote it at McGill. He then brought M arta
Bunge (a student of Peter Freyd and Bill Lawvere) to McGill as a post-doctoral fellow, later to join the staff. Within a year,
the Berkeley logician Gonzalo Reyes joined the Universite de Montreal, while Michael Barr, a homological algebraist, joined
McGill, bringing along three graduate students. Various seminars and increased activity were carried on at these two
Universities. Out of the Universite de Montreal came Andre Joyal, now the center-piece at UQAM, and out of McGill came
Bob Pare, the promoter of the Dalhousie Category Theory Center. Later on, the group was enriched by the hiring of Mihaly
Makkai, a logician from Budapest. Within five years, the nucleus of the group, as it exists today, was already formed. No
further hirings in Category Theory were made in more than 25 years at any of these three institutions. Yet, the activities
which this group has generated have been (until now) truly remarkable from the points of view of graduate students,
postdoctoral fellows, visitors, organization of meetings, invited lectures at international meetings, editorship of various
important journals, distinctions of various kinds, individual and team grants from NSERC and FCAR, bulk and quality of
publications, and an incredible network of international contacts. The activities of the group are partly reflected by those of
the Centre de Recherches en Theorie des Categories, within the Institut des Sciences Mathematiques.
• Current Research Areas in Category Theory at McGill Three areas deserve attention because of the novelties they bring
and because they are part of a truly international joint effort. Let us refer to them as "Grothendieck's Program", "Lawvere's
Program", and "Computational Category Theory". Although not pairwise disjoint, their objectives are different and can be
briefly described as follows.
1. Grothendieck's program was expounded by Grothendieck in his famous unpublished very long "Letter to Quillen". In
Montreal, Joyal, Makkai and Bunge are "pursuing the stacks" from different points of view.
2. Lawvere's program was initiated by Lawvere in two steps, in 1967 and in 1983. The first is called "Categorical Dynamics" and
it gave rise to "Synthetic Differential Geometry". In Montreal, both Reyes and Bunge have worked and formed many students
in this area. The second is called "Distributions Theory on Toposes" and is still in full development. In Montreal, Bunge and
her collaborators from elsewhere (A. Carboni (Como), J. Funk (Saskatchewan), S. Niefield (Union College), M. Fiore
(Sussex), M. Jibladze (Louvain-la-Neuve and Tbilisi), and T. Streicher (Darmstadt)) have been actively engaged in research in
this area for the past seven years.
3. Computational Category Theory. Broadly speaking, this includes Linear Logic, Chu Categories, Synthetic Domain
Theory, Coherence, Bi-completions of Categories, Categorical Proof Theory, and Categorical Linguistics. In
Montreal, Lambek, Barr, Seely and others are actively working in some aspects of this program.

http://www.math.mcgill.ca/bunge/ctatmcgill.html

Syntactic Structures (Noam Chomsky):


http://www.scribd.com/doc/7002886/Noam-Chomsky-Syntactic-Structures-2Ed
A Dictionary of Grammatical Terms in Linguistics:
http://www.bookrags.com/browse/tf0203393368/
Quantum Automata and Quantum Grammars (Cristopher Moore and James P. Crutchfield
Theoretical Computer Science 237 (2000) 275-306):
To study quantum computation, it might be helpful to generalize structures from language and automata theory to the
quantum case. To that end, we propose quantum versions of finite-state and push-down automata, and regular and context-
free grammars. We find analogs of several classical theorems, including pumping lemmas, closure properties, rational and
algebraic generating functions, and Greibach normal form. We also show that there are quantum context-free languages that
are not context-free.
http://www.santafe.edu/~moore/pubs/qrl.html
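As a rough illustration of the kind of object studied in the paper, the sketch below implements a "measure-once" quantum finite automaton: one unitary matrix per input symbol is applied to the start state, and the acceptance probability is the squared norm of the projection onto the accepting subspace. The particular matrices, names and alphabet are our own assumptions, not taken from Moore and Crutchfield.

```python
import numpy as np

theta = np.pi / 8
U = {  # one unitary transition matrix per input symbol (here a single-letter alphabet)
    "a": np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]]),
}
start = np.array([1.0, 0.0])            # |q0>
accept_projector = np.diag([0.0, 1.0])  # accepting subspace spanned by |q1>

def acceptance_probability(word: str) -> float:
    """Apply the word's unitaries to the start state, then measure the accepting subspace."""
    state = start
    for symbol in word:
        state = U[symbol] @ state
    return float(np.linalg.norm(accept_projector @ state) ** 2)

for n in range(5):
    print("a" * n or "(empty)", acceptance_probability("a" * n))  # equals sin^2(n*theta)
```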
http://en.wikipedia.org/wiki/Lisp_(programming_language)
http://en.wikipedia.org/wiki/Haskell_(programming_language)
http://en.wikipedia.org/wiki/Oz_(programming_language)
Concurrent Constraint Programming in Oz for Natural Language Processing:
http://www.ps.uni-sb.de/~niehren/Web/Vorlesungen/Oz-NL-SS01/vorlesung/
Sixth Generation Computing: A Conspectus of the Japanese Proposals:
http://pages.cpsc.ucalgary.ca/~gaines/reports/MFIT/SIGART86/index.html
Why Sanskrit is Important:
http://speaksamskrit.blogspot.com/2008_10_01_archive.html

The Semiotic Machine:


"Humans are no longer the only and lonely ones that are capable of, and to some degree
dependent on, the interpretation of signs."
Frieder Nake
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1270062
http://en.wikipedia.org/wiki/Self-reference
http://en.wikipedia.org/wiki/Metacomputing

From Aristotelian Metaphysics to the Implicate Order and Evolution:


http://www.metafysica.nl/nature/
Computational Philosophy:
http://www.crumpled.com/cp/
Social Correlates of Turn-Taking Behavior:
http://www.nashborges.com/research/sctt_icassp09.pdf
Self-Organizing Maps in Natural Language Processing:
http://reference.kfupm.edu.sa/content/s/e/self_organizing_maps_in_natural_language_395205.pdf
http://www.cis.hut.fi/~tho/publications/honkela_casys97.pdf
Quantum Computation and Natural Language Processing:
http://nats-www.informatik.uni-hamburg.de/~joseph/dis/dis/dis.html
Adaptive Grammars for Intelligent Agents in Virtually Distributed Conceptual Spaces
http://en.wikipedia.org/wiki/Formal_grammar
http://en.wikipedia.org/wiki/Adaptive_grammar
Adaptive Parsing: Self-Extending Natural Language Interfaces:

http://www.aclweb.org/anthology-new/J/J92/J92-3010.pdf

Summaries of Adaptive Grammar Models:


http://web.cs.wpi.edu/~jshutt/adapt/top.html

Adaptive Parsing:
Parsing is the transformation from flat text to data structures. Usually, this requires some kind of syntax definition as input in
addition to the text to be parsed. An adaptive parser performs the transformation with minimal additional input; in particular,
AP requires only syntactical information that can be provided by a typical user without the help of a programmer.
http://www.cs.hmc.edu/~asampson/ap/
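A minimal sketch of the adaptive idea itself, not tied to any particular formalism from the links above: the rule set consulted by a small recursive-descent recognizer is allowed to grow while input is being processed, so a directive appearing in the text can add a new production on the fly.

```python
# Illustrative "adaptive" recognizer: the grammar can be extended by the input itself.
grammar = {  # nonterminal -> list of alternative right-hand sides (tuples of symbols)
    "GREETING": [("hello",), ("hi",)],
}

def parse(nonterminal, tokens, pos=0):
    """Return the position reached after parsing `nonterminal`, or None on failure."""
    for rhs in grammar.get(nonterminal, []):
        p, ok = pos, True
        for symbol in rhs:
            if symbol in grammar:                          # nonterminal: recurse
                q = parse(symbol, tokens, p)
                if q is None:
                    ok = False
                    break
                p = q
            elif p < len(tokens) and tokens[p] == symbol:  # terminal: match token
                p += 1
            else:
                ok = False
                break
        if ok:
            return p
    return None

def process(line):
    tokens = line.split()
    if tokens[:3] == ["define", "GREETING", "as"]:         # adaptive step: extend the grammar
        grammar["GREETING"].append(tuple(tokens[3:]))
        return "rule added"
    return "accepted" if parse("GREETING", tokens) == len(tokens) else "rejected"

print(process("hello"))                             # accepted
print(process("good morning"))                      # rejected
print(process("define GREETING as good morning"))   # rule added
print(process("good morning"))                      # accepted
```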
Adaptive Predicates in Empty-Start Natural Language Parsing:
http://www.thothic.com/downloads/jackson01adaptive.pdf
Some Theoretical and Practical Results in Context-Sensitive and Adaptive Parsing:
http://www.iscid.org/papers/Jackson_AdaptiveParsing_093002.pdf
RNA Structural Motif Classification Grammars:
Context-Free and Context-Sensitive (Pseudoknot) Parsing of RNA Secondary Structure:
http://www.rnaparse.com/
The Linguistics of DNA: Words, Sentences, Grammar, Phonetics, and Semantics:
http://www.rci.rutgers.edu/~sji/Linguistics%20of%20DNA.pdf
On Einstein's Razor: Telesis-Driven Introduction of Complexity into Apparently Sufficiently Non-Complex Linguistic Systems:
―Never express yourself more clearly than you are able to think.‖—Niels Bohr
―It is wrong to say that a good language is important to good thought, merely; for it is the essence of it.‖—Charles Sanders
Peirce
The notion of a linguistic system that is powerful enough to accept any acceptable language yet insufficiently complex to
meet specific goals or needs is explored. I nominate Chomsky's generative grammar formalism as the least complex
formalism required to describe all language, but show how, without the addition of further complexity, little can be said about
the formalism itself. I then demonstrate how the O(n) parsing of pseudoknots, a previously difficult-to-solve problem, becomes
tractable in the more complex §-Calculus, and finally close with a falsifiable hypothesis with implications for epistemological
complexity.
http://www.thothic.com/downloads/Jackson_EinsteinsRazor_050205.pdf
Annotation for the Semantic Web:
The Semantic Web aims at machine agents that thrive on explicitly specified semantics of content in order to search, filter,
condense, or negotiate knowledge for their human users. A core technology for making the Semantic Web happen, but also
to leverage application areas like Knowledge Management and E-Business, is the field of Semantic Annotation, which turns
human-understandable content into a machine understandable form. This book reports on the broad range of technologies
that are used to achieve this translation and nourish 3rd millennium applications. The book starts with a survey of the oldest
semantic annotations, viz. indexing of publications in libraries. It continues with several techniques for the explicit
construction of semantic annotations, including approaches for collaboration and Semantic Web metadata. One of the major
means for improving the semantic annotation task is information extraction and much can be learned from the semantic
tagging of linguistic corpora. In particular, information extraction is gaining prominence for automating the formerly purely
manual annotation task at least to some extent. An important subclass of information extraction tasks is the goal-oriented
extraction of content from HTML and/or XML resources.
http://books.google.com/books?id=JMw8Y897c7MC
Algebraic Semiotics:
http://www-cse.ucsd.edu/~goguen/projs/semio.html
Hybrid Logics in Action:
http://hylo.loria.fr/content/history2.php

Hylomorphism:
http://en.wikipedia.org/wiki/Hylomorphism
Co-occurrence:
http://en.wikipedia.org/wiki/Co-occurrence
Relevance paradox:
A term for the situation in which an attempt to gather information relevant to a decision fails because the effort to
eliminate distracting or unnecessary information also excludes information that later proves crucial. It was
one of the key ideas in "The IRG Solution - hierarchical incompetence and how to overcome it".
http://en.wikipedia.org/wiki/Relevance_Paradox
http://en.wikipedia.org/wiki/Hierarchical_incompetence
http://en.wikipedia.org/wiki/Interlock_research
http://en.wikipedia.org/wiki/Interlock_diagram
Finite Model Theory and its Applications:
http://www.springer.com/computer/foundations/book/978-3-540-00428-8
Model Theory of Modal Logic:
http://www.mathematik.tu-darmstadt.de/~otto/papers/mlhb.pdf
Foundations of Temporal Logic:
http://www.prior.aau.dk/index2.htm
A Classic: Functional Programming with Bananas, Lenses, Envelopes and Barbed Wire:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.41.125&rep=rep1&type=pdf
Modular Semantics and Logics of Classes: In this paper we improve a simple class-based semantics to deal with extensions
compositionally and derive modular reasoning principles for a logic of classes. The domain theoretic reasoning principle
behind this is fixpoint induction. Modularity is obtained by endowing the denotations of classes with an additional parameter
that accounts for those classes added later at linkage time.
http://www.springerlink.com/content/66tq8pkgnlt406kg/
Coalgebras generalise the standard Kripke semantics of modal logic to encompass notions such as neighbourhood frames,
Markov chains, topological spaces, etc. Moreover, Coalgebra is a concept from Category Theory. Category Theory is an
area of mathematics which describes mathematical constructions in abstract terms that make these constructions available
to many different areas of mathematics, logic, and computer science. In particular, the category theoretic nature of
Coalgebras allows us to tackle the modularity problem using category theoretic constructions. One of the benefits of category
theory is that these constructions, because of their generality, apply to specification languages and to their semantic models.
To summarise, Coalgebraic Logic combines Modal Logic with Coalgebra. This generalises modal logics from Kripke frames
to coalgebras and makes category theoretic methods and constructions available in Modal Logic.
http://gow.epsrc.ac.uk/ViewGrant.aspx?GrantRef=EP/G041296/1
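A small sketch of the coalgebraic reading of Kripke semantics described above: a Kripke model is presented as a coalgebra sending each state to its label set and its successor set, and the box and diamond modalities are evaluated against that map. The example frame and the formula encoding are our own.

```python
# Kripke model as a coalgebra: state -> (labels true at the state, successor states).
coalgebra = {
    "s0": ({"p"},      {"s1", "s2"}),
    "s1": ({"p", "q"}, {"s1"}),
    "s2": ({"p"},      {"s0"}),
}

def holds(state, formula):
    """Formulas are nested tuples: 'p', ('not', f), ('and', f, g), ('box', f), ('dia', f)."""
    labels, successors = coalgebra[state]
    if isinstance(formula, str):
        return formula in labels
    op = formula[0]
    if op == "not":
        return not holds(state, formula[1])
    if op == "and":
        return holds(state, formula[1]) and holds(state, formula[2])
    if op == "box":   # true in every successor
        return all(holds(t, formula[1]) for t in successors)
    if op == "dia":   # true in some successor
        return any(holds(t, formula[1]) for t in successors)
    raise ValueError(op)

print(holds("s0", ("box", "p")))  # True: both successors of s0 satisfy p
print(holds("s0", ("dia", "q")))  # True: successor s1 satisfies q
```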
Coalgebraic Modal Logic: Theory and Applications:

http://db.cwi.nl/projecten/project.php4?prjnr=176
Handbook of Modal Logic:
"...six major applications areas of modal logic (in Mathematics, Computer Science, Artificial Intelligence, Linguistics, Game
Theory, and Philosophy) are surveyed."
http://books.google.com/books?id=urINMvvsT5MC
Undecidability of Multi-modal Hybrid Logics:
http://portal.acm.org/citation.cfm?id=1248220
Lectures on Hybrid Logic:
http://www.stanford.edu/group/nasslli/courses/blackburn/reader.pdf

Nabla Algebras and Chu Spaces:


http://www.springerlink.com/content/x1t03u42g041v170/
Adaptivity, Emergence & Connectivity
http://wordinfo.info/
http://www.onelook.com/
http://plato.stanford.edu/entries/model-theory/
http://www.class.uh.edu/COGSCI/lang/Entries/model_theory.html
http://findarticles.com/p/articles/mi_pwwi/is_200901/ai_n31169420/
http://en.scientificcommons.org/theodore_zamenopoulos
http://telicthoughts.com/information-filters/
The Complexity & Artificial Life Research Concept for Self-Organizing Systems:
http://www.calresco.org/
Non-Fractal Complexity:
http://www.ceptualinstitute.com/uiu_plus/necsi1video.htm
Fractals, Complexity, and Connectivity in Africa:
http://www.rpi.edu/~eglash/eglash.dir/afractal/Eglash_Odumosu.pdf
Evolutionary Design by Computers:
http://www.cs.ucl.ac.uk/staff/P.Bentley/evdes.html
Symbiosis as a Source of Evolutionary Innovation:
A departure from mainstream biology, the idea of symbiosis —as in the genetic and metabolic interactions of the bacterial
communities that became the earliest eukaryotes and eventually evolved into plants and animals —has attracted the attention
of a growing number of scientists.

These original contributions by symbiosis biologists and evolutionary theorists address the adequacy of the prevailing neo -
Darwinian concept of evolution in the light of growing evidence that hereditary symbiosis, supplemented by the gradual
accumulation of heritable mutation, results in the origin of new species and morphological novelty. They include reports of
current research on the evolutionary consequences of symbiosis, the protracted physical association between organisms of
different species. Among the issues considered are individuality and evolution, microbial symbioses, animal bacterial
symbioses, and the importance of symbiosis in cell evolution, ecology, and morphogenesis.
http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=5722

What's So "Intelligent" About "Intelligent Design"?:


http://evolutionlist.blogspot.com/2009/06/whats-so-intelligent-about-intelligent.html
Principia Cybernetica Web:
http://pespmc1.vub.ac.be/
New Trends in Computing Anticipatory Systems : Emergence of Artificial Conscious Intelligence with Machine Learning
Natural Language (Daniel M. Dubois, 2008):

This paper deals with the challenge to create an Artificial Intelligence System with an Artificial Consciousness. For that, an
introduction to computing anticipatory systems is presented, with the definitions of strong and weak anticipation. The quasi-
anticipatory systems of Robert Rosen are linked to open-loop controllers. Then, some properties of the natural brain are
presented in relation to the triune brain theory of Paul D. MacLean, and the mind time of Benjamin Libet, with his veto of the
free will. The theory of the hyperincursive discrete anticipatory systems is recalled in view to introduce the concept of
hyperincursive free will, which gives a similar veto mechanism: free will as unpredictable hyperincursive anticipation. The
concepts of endo-anticipation and exo-anticipation are then defined. Finally, some ideas about artificial conscious intelligence
with natural language are presented, in relation to the Turing Machine, Formal Language, Intelligent Agents and Multi-Agent
System.
http://link.aip.org/link/?APCPCS/1051/25/1
As Robert Rosen proposed, there should be a category-theoretic approach to analyzing teleological and teleonomic
processes.
http://www.complex.vcu.edu/
Approaches to the Question: 'What is Life?': Reconciling Theoretical Biology with Philosophical Biology:
http://www.cosmosandhistory.org/index.php/journal/article/ view/109/218

"Polytely can be described as Frequently, complex problem-solving situations characterized by the presence of not one, but
several goals, endings. when solving complex problems, we are often forced into making difficult choices and in the polytelic
scenarios; different outcomes to decide from. Though this is more complex than just choosing. We need to explore various
outcomes and theorise before making pragmatic decisions. Haste without experiment will not help but rather often hinder the
cogniser, the thinker, you. Modern society faces an increasing incidance of various complex problems that are “pervasive,
spreading unhindered into regions”, social ills for example. In other words, the defining characteristics of our complex
problems are a large number of variables (complexity) that interact in a nonlinear fashion (connectivity), changing over time
(dynamic and time-dependent), and to achieve multiple goals (polytely). Problem - to solution = the involvement of complex
variables. Multiple goals may be present that could, but do not necessarily, interfere with each other."

"Intelligent agents have become a major “ field of research in AI. Although there is little
consensus about the precise de“ nition of an intelligent agent, it is generally held that
agents are autonomous pieces of hardware/software, able to take initiative on behalf of
a user or, more generally, to satisfy some goal. Agents are often held to possess mental
attitudes; they are supposed to deal with information, and act upon this, based on motivation.
This calls for a description in terms of the agent•s beliefs/knowledge, desires,
goals, intentions, commitments, obligations, etc. To describe these mental or cognitive
attitudes one may fruitfully employ modal logic. Typically for the description of agents

one needs an amalgam of modal operators/logics to cater for several of the mental attitudes
as mentioned above. Moreover, since agents by definition act and display behavior,
it is important to include the dynamics of these mental attitudes in the description. One
might even maintain that the logics of some of these attitudes, such as goal directedness
and a fortiori desire, have little interest per se: they are rather weak logics without exciting
properties. What makes them interesting is their dynamics: their change over time
in connection with each other!"
http://www.csc.liv.ac.uk/~frank/MLHandbook/18.pdf
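To make the point about the dynamics of mental attitudes slightly more tangible, here is a very small belief/goal/intention update loop in the BDI spirit; the attitude names and update rules are illustrative only and do not follow any specific agent logic from the handbook chapter above.

```python
# Toy belief-desire-intention loop: what matters is how the attitudes change together.
class Agent:
    def __init__(self):
        self.beliefs = set()      # what the agent currently takes to be true
        self.goals = set()        # what it wants to bring about
        self.intentions = []      # goals it has committed to acting on

    def perceive(self, facts):
        self.beliefs |= set(facts)            # naive belief revision: just add

    def deliberate(self):
        self.goals -= self.beliefs            # drop goals already believed achieved
        self.intentions = sorted(self.goals)  # commit to the rest

    def act(self):
        if self.intentions:
            done = self.intentions.pop(0)     # pretend the action succeeds
            self.perceive([done])

agent = Agent()
agent.goals = {"door_open", "light_on"}
agent.perceive(["light_on"])
agent.deliberate()
print(agent.intentions)  # ['door_open'], the only unachieved goal
agent.act()
print(agent.beliefs)     # beliefs now include 'door_open'
```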
Synthetic intelligence will have to be proficient in analyzing polytelic scenarios, which requires more advanced forms of
cognitive function, possibly even meta-cognitive aspects:

Complex problem solving as a mediator between basic cognition and real -world functioning:
"The core theme of the present research project is the relationship between basic cognitive processes,
performance on complex cognitive tasks and real-world functioning. Basic cognitive processes examined in
the laboratory are often not easy to relate to real-life situations, both in investigations of healthy individuals
and in the clinical context. Research on complex problem solving was originally started to address precisely
this gap between “the narrow straits of the laboratory and the deep blue sea of field research” (Funke 2001).
In the proposed research, we will use the construct 'complex problem solving' as a mediator between basic
cognition and real-world functioning, and use a multi-disciplinary approach to characterize the interrelation
between these three levels of analysis. To this end, tightly coordinated studies using computational
modelling, neuropsychological testing, functional neuroimaging, as well as pharmacological and behavioral
interventions will be conducted in the context of narrowly defined, shared behavioral paradigms."
http://www.psychologie.uni-heidelberg.de/projekte/bmbf-problemsolving/project.html
Objective Selection:

Sharing too many goals with your people is the same as sharing no goals. Energy, enthusiasm, and attention
all dissipate when they are spread too widely. Think of peanut butter: The more you spread it, the thinner it gets. It’s
tempting to load up the wish list with lots of ambitious aims, but the results will inevitably be disappointing.
Rather, choose a single, clearly articulated objective. This may embody your organization’s central purpose.
In Made to Stick, Chip Heath and Dan Heath cite the “commander’s intent,” a crisp, plain-talk statement about the

desired result of a military maneuver that discards “a lot of great insights in order to let the most important insight
shine." Or the goal may be relatively narrow in scope—say, meeting a higher quality standard for a single product.
Broad or narrow, if it is well-defined, measurable, and urgent, as Robert H. Schaffer says in "Demand Better
Results—and Get Them" (HBR November–December 1974), it can galvanize an organization, "generating the feeling
that achievement of the goal is imperative, not merely desirable."
http://hbr.harvardbusiness.org/2007/06/objective-selection/ar/1
Knowledge Sharing: The Facts and the Myths:
http://www.intranetjournal.com/articles/200502/ij_02_22_05a.html
Innocentive Isolates the Problem Solvers From Each Other (Commentary by Sami Viitamaki):
"Innocentive fits the category of crowdsourcing that does not fully utilize the community's 'wisdom of crowds'. The solvers
pursue the solution in isolation from each other, and the possibility of using the community to gather comments on the
alternatives, build on others' ideas, find a winning solution by community rating, etc. is absent."
http://p2pfoundation.net/Innocentive

FLIRT Model of Crowdsourcing:


The model views the phenomenon from the perspective of a company considering intensive collaboration with customer
collectives and aims to identify the different actors on the field as well as their roles in the collective creation process.
Furthermore, it suggests a set of elements (the FLIRT ring) that have to be considered and established in order to achieve
desired action in the community.
http://p2pfoundation.net/FLIRT_Model_of_Crowdsourcing
http://www.samiviitamaki.com/2007/02/16/the-flirt-model-of-crowdsourcing-collective-customer-collaboration/
Complex Problem Solving: Identity Matching Based on Social Contextual Information:
Modern society is increasingly facing various complex problems that are "pervasive, spreading unhindered into
regions, countries, and economic activities which seem powerless to resist the invasion" (Mumford, 1998, p. 447).
Globalization, for example, is such a complex problem that, while bringing numerous opportunities to organizations, has also
brought substantial challenges and pressure. Defining complex problems seems to be a good starting point for solving
them; however, there has not been a widely accepted definition (Gray, 2002; Quesada et al., 2005). Funke (1991) suggested
that complex problems can be understood by contrasting them with simple problems, which can be solved by simple
reasoning and pure logic (Quesada et al., 2005), and that they can be characterized by their intransparency, polytely (from
the Greek words poly telos meaning many goals), complexity, connectivity of variables, dynamic, and time-delayed effects. In
other words, the defining characteristics of complex problems are a large number of variables (complexity) that interact in a
nonlinear fashion (connectivity), changing over time (dynamic and time-dependent), and to achieve multiple goals (polytely).
http://idea.library.drexel.edu/bitstream/1860/2671/1/2006175384.pdf
Variety (Universal Algebra):
http://en.wikipedia.org/wiki/Birkhoff's_HSP_theorem
Polyvarieties (1969):
In connection with the notion of a polyidentity, introduced by O. N. Golovin, we introduce the notion of the polyvariety of
groups and we generalize Birkhoff's theorem on varieties of groups for the case of polyvarieties. We also present several
examples of polyvarieties of groups.
http://www.springerlink.com/content/x36tmk28278684g7/

Birkhoff's Representation Theorem:


http://en.wikipedia.org/wiki/Birkhoff's_representation_theorem
Category Theory and Homological Algebra:
http://www.intute.ac.uk/sciences/cgi-bin/browse.pl?limit=0&id=25573&type=%&sort=Date
Operad Theory:
http://en.wikipedia.org/wiki/Operad_theory
Operads and Varieties of Algebras Defined by Polylinear Identities: http://www.springerlink.com/content/l5820r7r35632578/
Operads, Algebras, Modules, and Motives (1994):
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.53.7255

Symmetry, Design & Reflexivity


Design Science:
http://en.wikipedia.org/wiki/Design_Science
Buckminster Fuller Institute:
http://www.bfi.org/our_programs/who_is_buckminster_fuller/design_science

One of the great American visionaries of the twentieth century, R. Buckminster Fuller (1895 -1983) endeavored to see what
he, a single individual, might do to benefit the largest segment of humanity while consuming the minimum of the earth's
resources. Doing "more with less" was Fuller's credo. He described himself as a "comprehensive anticipatory design
scientist," setting forth to solve the escalating challenges that faced humanity before they became insurmountable.
Fuller's innovative theories and designs addressed fields ranging from architecture, the visual arts, and literature to
mathematics, engineering, and sustainability. He refused to treat these diverse spheres as specialized areas of investigation
because it inhibited his ability to think intuitively, independently, and, in his words, "comprehensively."
Although Fuller believed in utilizing the latest technology, much of his work developed from his inquiry into "how nature
builds." He believed that the tetrahedron was the most fundamental, structurally sound form found in nature; this shape is an
essential part of most of his designs, which range in scale from domestic to global. As the many drawings and models in this
exhibition attest, Fuller was committed to the physical exploration and visual presentation of his ideas.
http://www.whitney.org/www/buckminster_fuller/about.jsp

Kybernetes:
http://www.emeraldinsight.com/Insight/viewContainer.do?containerType=Journal&containerId=357
Pentagon's Mind-Reading Computers Replicate:
http://www.wired.com/dangerroom/2008/03/augcog-continue/
Land Warfare and Complexity, Part I: Mathematical Background and Technical Sourcebook (U) The purpose of this paper is
to provide the theoretical framework and mathematical background necessary to understand and discuss the various ideas
of nonlinear dynamics and complex systems theory and to plant seeds for a later, more detailed discussion (that will be
provided in Part II of this report) of how these ideas might apply to land warfare issues. This paper is also intended to be a
general technical sourcebook of information. Question 1: What does the behavior of the human brain have in common with
what happens on a battlefield? Question 2: Might there be higher-level processes that emerge on the battlefield, in the way
consciousness emerges in a human brain?
http://www.cna.org/isaac/lw1.pdf
Complexity, Global Politics, and National Security
(Edited by David S. Alberts and Thomas J. Czerwinski):
National Defense University, Washington D.C.
The inquiry into the nature of nonlinearity, and the rise of Complexity theory has of necessity paralleled the development of
the computer. Nonlinearity is extremely difficult to work with unless aided by the computer. Nonlinear equations were referred
to as the "Twilight Zone" of mathematics. Beginning in the early 1960s, efforts to modify the weather indicated the severe
limits to predictability in nonlinear environments, such as the weather itself. The self-organizing nature of nonlinearity, and the
attributes of Chaos theory were well advanced by 1987, with the publication of James Gleick's best-selling
popularization Chaos: Making a New Science. In the mid-1980s, the Santa Fe Institute was organized to further the inquiry
into complex adaptive systems. By 1992, Complexity theory also qualified for publication in the popular press with Mitchell
Waldrop's Complexity: The Emerging Science at the Edge of Order and Chaos, and Roger Lewin's Complexity: Life at the
Edge of Chaos. Nonlinearity was now in the public domain and universally accessible.
A number of modern U.S. defense thinkers, in retrospect, can be considered to be nonlinearists. Prominent among these are
J.C. Wylie and the prolific, but unpublished, John Boyd of OODA loop fame. However, in the context of the time and
vocabulary, this realization could only be implicit. An explicit articulation only began to emerge in the early 1990s. Two of the
earliest pioneers are authors in this volume. Both wrote seminal papers, the significance of which was largely unrecognized
when they first appeared. In late 1992, Alan Beyerchen's "Clausewitz, Nonlinearity, and the Unpredictability of War," was
published in International Security, and Steven Mann's "Chaos Theory and Strategic Thought" appeared in Parameters. The
former work is a profound reinterpretation of Clausewitz's On War, persuasively placing the work, and Clausewitz himself, in
a nonlinear framework. Mann, a Foreign Service officer, used self-organizing criticality, a concept associated with the Santa
Fe Institute, to describe the dynamics of international relations and its implications for strategy.
These initial intellectual contributions were followed by important advances, each the individual efforts of talented Air Force
officers. These included investigations into defense applications of Chaos theory (David Nicholls, et al., 1994, and Glenn E.
James, 1995.) Paralleling these efforts were those in Complexity theory applied to the determination of centers of gravity (Pat
A. Pentland, 1993), and especially a robust and detailed methodology for identifying target sets (Steven M. Rinaldi, 1995). As
a result, the confidence factor rose appreciably, as the body of defense-related literature began to assume the qualitative
and quantitative dimensions for a discipline, or a contending body of thought. Primarily at the operational and tactical levels
of war, nonlinear concepts were moving beyond the notional, to formulation and application.
http://www.dodccrp.org/html4/bibliography/comindex.html
"Coping with Information Overload:
Coping with information overload is a major challenge of the 21st century. In previous eras, access to information was
difficult and often tightly controlled as a source of power. Today, we are overloaded with so much electronic information that
it has become an obstacle to effective decision making. Thus, the challenge facing individuals and institutions is how to
embrace this information rather than being paralyzed by it.
The intelligence community is overloaded with huge volumes of information, moving at large velocities and comprising great
variety. Information includes both content and context, which humans deal with as a gestalt but computer systems tend to
treat separately. We discuss two complementary approaches to coping with information overload and the open research
questions that arise in this emerging discipline. First is value estimation, where humans examine only the golden nuggets of
information judged valuable by some process. The second approach is knowledge distillation, where the information is
digested and compressed, producing salient knowledge for human consumption. Finally, there are many open questions
regarding the symbiosis between people and machines for knowledge discovery."
http://www.icsi.berkeley.edu/talks/Gorin.html
http://www.media.mit.edu/events/2009/04/06/allen-gorin-stream-characterization
"Reflexive Mappings and Nonlinear Dynamics:
The paper considers reflexive mappings properties: it is proved that, when the agents in the framework of the game-theoretic
model make their decisions on the basis of finite informational structures, the actions chosen by phantom agents are
defined by a system of nonlinear iterated mappings. Exploration of the model allows one to conclude that the informational
equilibrium is generally unstable as the reflexivity depth increases."
http://www.mtas.ru/uploads/rmnd.pdf
Introduction of the Aristotle's final causation in CAST concept and method of incursion and hyperincursion: This paper will
analyse the concept and method of incursion and hyperincursion, first applied to the Fractal Machine, a hyperincursive
cellular automaton with sequential computations in which time plays a central role. This computation is incursive, for inclusive
recursion, in the sense that an automaton is computed at the future time t+1 as a function of its neighbour automata at the
present and/or past time steps but also at the future time t+1. A hyperincursion is an incursion in which several values can be
generated at each time step. Incursive systems may be transformed to recursive ones, but incursive inputs, defined
at the future time step, cannot always be transformed to recursive inputs. This is possible by self-reference: a self-reference
Fractal Machine gives rise to a nondeterministic hyperincursive field. This can be

related to the Final Cause of Aristotle. Simulations will show the generation of fractal patterns from incursive equations with
interference effects like holography. The incursion is also a tool to control systems. The Pearl-Verhulst chaotic map will be
considered. Incursive stabilisation of the numerical instabilities of discrete linear and non-linear oscillators based on Lotka-
Volterra equation systems will be simulated. Finally the incursive discrete diffusion equation is considered.
http://www.springerlink.com/content/m23248538w56706x/
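A minimal numerical sketch of the incursive idea as we read it from Dubois's abstracts, applied to the Pearl-Verhulst (logistic) map: in the incursive version the future value x(t+1) also appears on the right-hand side, and solving the implicit equation x(t+1) = a·x(t)·(1 − x(t+1)) gives x(t+1) = a·x(t)/(1 + a·x(t)), which suppresses the chaotic instability of the ordinary recursive map. The parameter values below are our own.

```python
def recursive_step(x, a):
    # classical logistic map: x(t+1) = a * x(t) * (1 - x(t))
    return a * x * (1 - x)

def incursive_step(x, a):
    # incursive map x(t+1) = a * x(t) * (1 - x(t+1)), solved for x(t+1)
    return a * x / (1 + a * x)

a, x_rec, x_inc = 4.0, 0.3, 0.3   # a = 4 puts the recursive map in its chaotic regime
for t in range(10):
    x_rec = recursive_step(x_rec, a)
    x_inc = incursive_step(x_inc, a)
    print(f"t={t + 1:2d}  recursive={x_rec:.4f}  incursive={x_inc:.4f}")
# the incursive iteration settles at the fixed point (a - 1) / a = 0.75
```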
A Survey of Incursive (Inclusive Recursion), Hyperincursive, and Anticipative Systems (Problems of Nonlinear Analysis in
Engineering Systems):
http://www.kcn.ru/tat_en/science/ans/journals/ansj_cnt/06_2_5.html
On the Quantum Potential and Pulsating Wave Packet in the Harmonic Oscillator:
http://adsabs.harvard.edu/abs/2008AIPC.1051..100D
Computational derivation of quantum and relativist systems with forward-backward space-time shifts:
http://adsabs.harvard.edu/abs/1999AIPC..465..435D
New Trends in Computing Anticipatory Systems : Emergence of Artificial Conscious Intelligence with Machine Learning
Natural Language:
http://adsabs.harvard.edu/abs/2008AIPC.1051...25D
Holographic associative memory and information transmission by solitary waves in biological systems:
http://adsabs.harvard.edu/abs/1993SPIE.1978..249G

Fermi-Pasta-Ulam nonlinear lattice oscillations:


http://www.scholarpedia.org/article/Fermi-Pasta-Ulam_nonlinear_lattice_oscillations
The Symmetries of Solitons:
http://www.ams.org/bull/1997-34-04/S0273-0979-97-00732-5/S0273-0979-97-00732-5. pdf
Stephen Wolfram's A New Kind of Science Online:
http://www.wolframscience.com/nksonline/toc.html

Origin of Randomness in Physical Systems:


http://www.stephenwolfram.com/publications/articles/physics/85-origins/2/text.html
Nonlinear Dynamics, Chaos, Bifurcations:
http://web.ift.uib.no/~antonych/bif.html

Nonlinear Dynamics and Complex Systems Theory Glossary of Terms:


http://www.cna.org/isaac/Glossb.htm

From Complexity to Life: On the Emergence of Life and Meaning:


http://www.complexsystems.org/essays/ReviewComplexity.htm

The Cybersemiotic Pre- and Post- Conditions of Computation:


http://www.rosen-enterprises.com/RobertRosen/JedJonesBioTheoryPaper.pdf

KLI Theory Lab:


http://www.kli.ac.at/theorylab/index.html
Centre for Discrete Mathematics and Theoretical Computer Science (The University of
Auckland):
http://www.cs.auckland.ac.nz/CDMTCS/
Information Systems and the Theory of Categories: Is Every Model an Anticipatory System?
http://computing.unn.ac.uk/staff/CGNR1/liege04m4.pdf
Normativeness, Descriptivity & Analyticity

"The Cognitive-Theoretic Model of the Universe: A New Kind of Reality Theory: Inasmuch as science is observational or
perceptual in nature, the goal of providing a scientific model and mechanism for the evolution of complex systems ultimately
requires a supporting theory of reality of which perception itself is the model (or theory-to-universe mapping). Where
information is the abstract currency of perception, such a theory must incorporate the theory of information while extending
the information concept to incorporate reflexive self-processing in order to achieve an intrinsic (self-contained) description of
reality. This extension is associated with a limiting formulation of model theory identifying mental and physical reality,
resulting in a reflexively self-generating, self-modeling theory of reality identical to its universe on the syntactic level. By the
nature of its derivation, this theory, the Cognitive Theoretic Model of the Universe or CTMU, can be regarded as a
supertautological reality-theoretic extension of logic. Uniting the theory of reality with an advanced form of computational
language theory, the CTMU describes reality as a Self-Configuring Self-Processing Language or SCSPL, a reflexive intrinsic
language characterized not only by self-reference and recursive self-definition, but full self-configuration and self-execution
(reflexive read-write functionality). SCSPL reality embodies a dual-aspect monism consisting of infocognition, self-
transducing information residing in self-recognizing SCSPL elements called syntactic operators. The CTMU identifies itself
with the structure of these operators and thus with the distributive syntax of its self-modeling SCSPL universe, including the
reflexive grammar by which the universe refines itself from unbound telesis or UBT, a primordial realm of infocognitive
potential free of informational constraint. Under the guidance of a limiting (intrinsic) form of anthropic principle called the Telic
Principle, SCSPL evolves by telic recursion, jointly configuring syntax and state while maximizing a generalized self-selection
parameter and adjusting on the fly to freely-changing internal conditions. SCSPL relates space, time and object by means of
conspansive duality and conspansion, an SCSPL-grammatical process featuring an alternation between dual phases of
existence associated with design and actualization and related to the familiar wave-particle duality of quantum mechanics.
By distributing the design phase of reality over the actualization phase, conspansive spacetime also provides a distributed
mechanism for Intelligent Design, adjoining to the restrictive principle of natural selection a basic means of generating
information and complexity. Addressing physical evolution on not only the biological but cosmic level, the CTMU addresses
the most evident deficiencies and paradoxes associated with conventional discrete and continuum models of reality,
including temporal directionality and accelerating cosmic expansion, while preserving virtually all of the major benefits of
current scientific and mathematical paradigms."
http://www.iscid.org/papers/Langan_CTMU_092902.pdf
http://en.wikipedia.org/wiki/Polytely
http://en.wikipedia.org/wiki/Teleonomy
http://en.wikipedia.org/wiki/List_of_category_theory_topics


http://www.cambridgeblog.org/2008/12/reflections-on-a-self-representing-universe/
The Joys of Concurrent Programming:
http://www.informit.com/articles/article.aspx?p=30413
Stanford Concurrency Group:
http://boole.stanford.edu/
Decision making process via constraint-oriented fuzzy logic based on Chu space theory:
"In decision making processes, one often does not know what is the real problem, what is the requirement, and the goals are
vaguely prescribed and may contradict with each other. Thus, the problem structure of decision making is covered with
various kinds of vagueness, and the way of solving the problem is highly dependent on the derision maker. One cannot
grasp the gist of problems if one ignore the associated vagueness which decision makers hold in his mind through his real
experience. Also, decision making processes cannot be characterized only by static relations among elements in problems,
but should reflect the dynamic structures that are depend on situations. In this paper, in order to deal with problems which
have dynamic structures including vagueness, we propose a framework of decision making structures which involves the
decision maker himself and find out scenarios for problem solving in the decision making “process”, i.e., interaction between
the human and environments"
http://ieeexplore.ieee.org/Xplore/login.jsp?url=http://ieeexplore.ieee.org/iel5/6771/18091/00838662.pdf%3Farnumber%3D838662&authDecision=-203
http://en.wikipedia.org/wiki/Chu_space
http://chu.stanford.edu/guide.html
http://boole.stanford.edu/pub/coimbra.pdf
http://citeseer.ist.psu.edu/old/15868.html
http://www.citeulike.org/tag/ontology-space
http://conconto.stanford.edu/conconto.pdf
http://www.entcs.org/files/mfps19/83018.pdf
http://www.tac.mta.ca/tac/volumes/17/5/17-05.pdf
On Game Formats and Chu Spaces:
http://ideas.repec.org/p/usi/wpaper/417.html
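For readers following the Chu-space links above: a Chu space over K = {0,1} is just a matrix r : A × X → K, and a Chu transform between two such spaces is an adjoint pair of maps satisfying s(f(a), y) = r(a, g(y)). The toy data below is invented purely for illustration.

```python
# Two small Chu spaces over {0,1} and a check of the Chu-transform (adjointness) condition.
A = ["a0", "a1"]             # points of the first space
X = ["x0", "x1", "x2"]       # states of the first space
r = {("a0", "x0"): 1, ("a0", "x1"): 0, ("a0", "x2"): 1,
     ("a1", "x0"): 0, ("a1", "x1"): 1, ("a1", "x2"): 1}

B = ["b0", "b1"]             # points of the second space
Y = ["y0", "y1"]             # states of the second space
s = {("b0", "y0"): 1, ("b0", "y1"): 0,
     ("b1", "y0"): 0, ("b1", "y1"): 1}

def is_chu_transform(f, g):
    """f : A -> B goes forward on points, g : Y -> X goes backward on states."""
    return all(s[(f[a], y)] == r[(a, g[y])] for a in A for y in Y)

f = {"a0": "b0", "a1": "b1"}
g = {"y0": "x0", "y1": "x1"}
print(is_chu_transform(f, g))  # True: (f, g) satisfies s(f(a), y) = r(a, g(y))
```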
"The bulk of theoretical and empirical work in the neurobiology of emotion indicates that isotelesis —the principle that any one
function is served by several structures and processes—applies to emotion as it applies to thermoregulation, for example
(Satinoff, 1982)...In light of the preceding discussion, it is quite clear that the processes that emerge in emotion are governed
not only by isotelesis, but by the principle of polytelesis as well. The first principle holds that many functions, especially the
important ones, are served by a number of redundant systems, whereas the second holds that many systems serve more
than one function. There are very few organic functions that are served uniquely by one and only one process, structure, or
organ. Similarly, there are very few processes, structures, or organs that serve one and only one purpose. Language, too, is
characterized by the isotelic and polytelic principles; there are many words for each meaning and most words have more
than one meaning. The two principles apply equally to a variety of other biological, behavioral, and social phenomena. Thus,
there is no contradiction between the vascular and the communicative functions of facial efference; the systems that serve
these functions are both isotelic and polytelic."
http://psychology.stanford.edu/~lera/273/zajonc-psychreview-1989.pdf

"Between the lines of all the sacred books, we discern the holotelic craving, the sense of continued life, which has so much
more to know and to be."
http://books.google.com/books?id=4fcaAAAAYAAJ&pg=PA350&lpg=PA350&dq=holotelic&source=bl&ots=v8QVZoAG4f&sig=yAggVUGj_Yv99ryEHCjuMSUZExo&hl=en&ei=xJ3eSZTNB5jstQOCo4i7CQ&sa=X&oi=book_result&ct=result&resnum=6#PPA351,M1
"An act is a temporal process, and self -inclusion is a spatial relation. The act of self -inclusion is thus "where
time becomes space"; for the set of all sets, there can be no more fundamental process. No matter what else
happens in the evolving universe, it must be temporally embedded in this dualistic self -inclusion operation.
In the CTMU, the self-inclusion process is known as conspansion and occurs at the distributed, Lorentz -
invariant conspansion rate c, a time -space conversion factor already familiar as the speed of light in vacuo

(conspansion consists of two alternative phases accounting for the wave and particle properties of matter
and affording a logical explanation for accelerating cosmic expansion). When we imagine a dynamic self-
including set, we think of a set growing larger and larger in order to engulf itself from without. But since
there is no "without" relative to the real universe, external growth or reference is not an option; there can be
no external set or external descriptor. Instead, self-inclusion and self-description must occur inwardly as the
universe stratifies into a temporal sequence of states, each state topologically and computationally contained
in the one preceding it (where the conventionally limited term computation is understood to refer to a more
powerful SCSPL-based concept, protocomputation, involving spatiotemporal parallelism). On the present
level of discourse, this inward self -inclusion is the conspansive basis of what we call spacetime.
Every object in spacetime includes the entirety of spacetime as a state-transition syntax according to which
its next state is created. This guarantees the mutual consistency of states and the overall unity of the
dynamic entity the real universe. And because the sole real interpretation of the set-theoretic entity "the set
of all sets" is the entire real universe, the associated foundational paradoxes are resolved in kind (by
attributing mathematical structure like that of the universe to the pure, uninterpreted set -theoretic version
of the set of all sets). Concisely, resolving the set -of-all-sets paradox requires that (1) an endomorphism or
self-similarity mapping D:S-->rÎS be defined for the set of all sets S and its internal points r; (2) there exist
two complementary senses of inclusion, one topological [S Ét D(S)] and one predicative [D(S) Éd S], that
allow the set to descriptively "include itself" from within, i.e. from a state of topological self -inclusion (where
Ét denotes topological or set -theoretic inclusion and Éd denotes descriptive inclusion, e.g. the inclusion in a
language of its referents); and (3) the input S of D be global and structural, while the output D(S) = (r Éd S)
be internal to S and play a syntactic role. In short, the set -theoretic and cosmological embodiments of the
self-inclusion paradox are resolved by properly relating the self -inclusive object to the descriptive syntax in
terms of which it is necessarily expressed, thus effecting true self -containment: "the universe (set of all sets)
is that which topologically contains that which descriptively contains the universe (set of all sets)."

This characterizes a system that consistently perceives itself and develops its own structure from within via
hology, a 2-stage form of self-similarity roughly analogous to holography. (Hology is a logico-cybernetic form
of self-similarity in which the global structure of a self-contained, self-interactive system doubles as its
distributed self-transductive syntax; it is justified by the obvious fact that in a self-contained system, no
other structure is available for that purpose.) The associated conspansive mapping D is
called incoversion in the spatiotemporally inward direction and coinversion in the reverse (outward, D-1)
direction. Incoversion carries global structure inward as state-recognition and state-transformation syntax,
while coinversion projects syntactic structure outward in such a way as to recognize existing structure and
determine future states in conformance with it. Incoversion is associated with an operation called
requantization, while coinversion is associated with a complementary operation called inner expansion. The
alternation of these operations, often referred to as wave-particle duality, comprises the conspansion
process. The Principle of Conspansive Duality then says that what appears as cosmic expansion from an
interior (local) viewpoint appears as material and temporal contraction from a global viewpoint. Because
metric concepts like "size" and "duration" are undefined with respect to the universe as a whole, the
spacetime metric is defined strictly intrinsically, and the usual limit of cosmological regress, a pointlike
cosmic singularity, becomes the closed spacetime algebra already identified as SCSPL.

Thus, the real universe is not a static set, but a dynamic process resolving the self-inclusion paradox.
Equivalently, because any real explanation of reality is contained in reality itself, reality gives rise to a
paradox unless regarded as an inclusory self-mapping. This is why, for example, category theory is
increasingly preferred to set theory as a means of addressing the foundations of mathematics; it centers on
invariant relations or mappings between covariant or contravariant (dually related) objects rather than on
static objects themselves. For similar reasons, a focus on the relative invariants of semantic processes is also
well-suited to the formulation of evolving theories in which the definitions of objects and sets are subject to
change; thus, we can speak of time and space as equivalent to cognition and information with respect to the
invariant semantic relation processes, as in "time processes space" and "cognition processes information".
But when we define reality as a process, we must reformulate containment accordingly. Concisely, reality
theory becomes a study of SCSPL autology naturally formulated in terms of mappings. This is done by
adjoining to logic certain metalogical principles, formulated in terms of mappings, that enable reality to be
described as an autological (self-descriptive, self-recognizing/self-processing) system.

The first such principle is MAP, acronymic for Metaphysical Autology Principle. Let S be the real universe, and
let T = T(S) be its theoretical description or "TOE". MAP, designed to endow T and S with mathematical
closure, simply states that T and S are closed with respect to all internally relevant operations, including
recognition and description. In terms of mappings, this means that all inclusional or descriptive mappings of
S are automorphisms (e.g., permutations or foldings) or endomorphisms (self-injections). MAP is implied by
the unlimited scope, up to perceptual relevance, of the universal quantifier implicitly attached to reality by
the containment principle. With closure thereby established, we can apply techniques of logical reduction to
S without worrying about whether the lack of some external necessity will spoil the reduction. In effect, MAP
makes T(S) "exclusive enough" to describe S by excluding as a descriptor of S anything not in S. But there
still remains the necessity of providing S with a mechanism of self-description.

This mechanism is provided by another metalogical principle, the M=R or Mind Equals Reality Principle, that
identifies S with the extended cognitive syntax D(S) of the theorist. This syntax (system of cognitive rules)
not only determines the theorist's perception of the universe, but bounds his cognitive processes and is
ultimately the limit of his theorization (this relates to the observation that all we can directly know of reality
are our perceptions of it). The reasoning is simple; S determines the composition and behavior of objects (or
subsystems) s in S, and thus comprises the general syntax (structural and functional rules of S) of which s
obeys a specific restriction. Thus, where s is an ideal observer/theorist in S, S is the syntax of its own
observation and explanation by s. This is directly analogous to "the real universe contains all and only that
which is real", but differently stated: "S contains all and only objects s whose extended syntax is isomorphic
to S." M=R identifies S with the veridical limit of any partial theory T of S [lim T(S) = D(S)], thus making S
"inclusive enough" to describe itself. That is, nothing relevant to S is excluded from S ≅ D(S).

Mathematically, the M=R Principle is expressed as follows. The universe obviously has a structure S.
According to the logic outlined above, this structure is self-similar; S distributes over S, where "distributes
over S" means "exists without constraint on location or scale within S". In other words, the universe is a
perfectly self-similar system whose overall structure is replicated everywhere within it as a general state-
recognition and state-transition syntax (as understood in an extended computational sense). The self-
distribution of S, called hology, follows from the containment principle, i.e. the tautological fact that
everything within the real universe must be described by the predicate "real" and thus fall within the
constraints of global structure. That this structure is completely self-distributed implies that it is locally
indistinguishable for subsystems s; it could only be discerned against its absence, and it is nowhere absent
in S. Spacetime is thus transparent from within, its syntactic structure invisible to its contents on the
classical (macroscopic) level. Localized systems generally express and utilize only a part of this syntax on
any given scale, as determined by their specific structures. I.e., where there exists a
hological incoversion endomorphism D: S→{r∈S} carrying the whole structure of S into every internal point
and region of S, objects (quantum-geometrodynamically) embedded in S take their recognition and state-
transformation syntaxes directly from the ambient spatiotemporal background up to isomorphism. Objects
thus utilize only those aspects of D(S) of which they are structural and functional representations.

The inverse D^-1 of this map (coinversion) describes how an arbitrary local system s within S recognizes S
at the object level and obeys the appropriate "laws", ultimately giving rise to human perception. This reflects
the fact that S is a self-perceptual system, with various levels of self-perception emerging within interactive
subsystems s (where perception is just a refined form of interaction based on recognition in an extended
computational sense). Thus, with respect to any class {s} of subsystems of S, we can define a homomorphic
submap d of the endomorphism D: d: S→{s} expressing only that part of D to which {s} is isomorphic. In
general, the s_i are coherent or physically self-interactive systems exhibiting dynamical and informational
closure; they have sometimes-inaccessible internal structures and dynamics (particularly on the quantum
scale), and are distinguishable from each other by means of informational boundaries contained in syntax
and comprising a "spacetime metric".

According to the above definitions, the global self-perceptor S is amenable to a theological interpretation,
and its contents {s} to "generalized cognitors" including subatomic particles, sentient organisms, and every
material system in between. Unfortunately, above the object level, the validity of s-cognition - the internal
processing of sentient subsystems s - depends on the specific cognitive functionability of a given s...the
extent to which s can implicitly represent higher-order relations of S. In General Relativity, S is regarded as
given and complete; the laws of mathematics and science are taken as pre-existing. On the quantum scale,
on the other hand, laws governing the states and distributions of matter and energy do not always have
sufficient powers of restriction to fully determine quantum behavior, requiring probabilistic augmentation in
the course of quantum wavefunction collapse. This prevents a given s, indeed anything other than S, from
enclosing a complete nomology (set of laws); while a complete set of laws would amount to a complete
deterministic history of the universe, calling the universe "completely deterministic" amounts to asserting the
existence of prior determinative constraints. But this is a logical absurdity, since if these constraints were
real, they would be included in reality rather than prior or external to it (by the containment principle). It
follows that the universe freely determines its own constraints, the establishment of nomology and the
creation of its physical (observable) content being effectively simultaneous and recursive. The incoversive
distribution of this relationship is the basis of free will, by virtue of which the universe is freely created by
sentient agents existing within it."
http://www.ctmu.org/

The Non-Unique Universe: The purpose of this paper is to elucidate, by means of concepts and theorems drawn from
mathematical logic, the conditions under which the existence of a multiverse is a logical necessity in mathematical physics,
and the implications of Gödel's incompleteness theorem for theories of everything. http://arxiv.org/abs/0907.0216
"A new paradigm for the modelling of reality is currently being developed called Process Physics. In Process Physics we
start from the premise that the limits to logic, which are implied by Gödel's incompleteness theorems, mean that any attempt
to model reality via a formal system is doomed to failure. Instead of formal systems we use a process system, which uses
the notions of self-referential information with self-referential noise and self-organised criticality to create a new type of
information-theoretic system that not only realises the current formal physical modelling of reality but also exhibits
features such as the direction of time, the present moment effect and quantum state entanglement (including EPR effects,
nonlocality and contextuality), as well as the more familiar formalisms of Relativity and Quantum Mechanics. In particular a
theory of Gravity has already emerged.
In short, rather than the static 4-dimensional modelling of present day (non-process) physics, Process Physics is providing a
dynamic model where space and matter are seen to emerge from a fundamentally random but self-organising system. The
key insight is that to adequately model reality we must move on from the traditional non-process syntactical information
modelling to a process semantic information modelling; such information is `internally meaningful'. The new theory of gravity
which has emerged from Process Physics is in agreement with all experiments and observations. This theory has two
gravitational constants: G, the Newtonian gravitational constant, and a second dimensionless constant which experiment has
revealed to be the fine structure constant. This theory explains the so-called `dark matter' effect in spiral galaxies, the bore
hole gravitational anomalies, the masses of the observed black holes at the centres of globular clusters, and the anomalies
in Cavendish laboratory measurements of G. As well it gives a parameter-free account of the supernovae Hubble expansion
data without the need for dark energy, dark matter, or an accelerating universe. This reveals that the Friedmann equations are
inadequate for describing the universe expansion dynamics."
http://www.scieng.flinders.edu.au/cpes/people/cahill_r/processphysics.html
"Across the Megaverse: What troubles Susskind is an intelligent design argument considerably more vexing than the anti-
evolution grumblings recently on trial in Dover, Pa. Biologists can point to unambiguous evidence that evolution truly does
happen and that it can account for many otherwise inexplicable aspects of how organisms function. For those who take a
more cosmic perspective, however, the appearance of design is not so simply refuted. If gravity were slightly stronger than it
is, for instance, stars would burn out quickly and collapse into black holes; if gravity were a touch weaker, stars would never
have formed in the first place. The same holds true for pretty much every fundamental property of the forces and particles
that make up the universe. Change any one of them and life would not be possible. To the creationist, this cosmic comity is
evidence of the glory of God. To the scientist, it is an embarrassing reminder of our ignorance about the origin of physical
law."
http://www.nytimes.com/2006/01/15/books/review/15powell.html?_r=1
"In 1970, a young physicist named Leonard Susskind got stuck in an elevator with Murray Gell-Mann, one of physics' top
theoreticians, who asked him what he was working on. Susskind said he was working on a theory that represented particles
"as some kind of elastic string, like a rubber band." Gell-Mann responded with loud, derisive laughter. Within a few years,
however, many physicists saw string theory as a promising line of research (and Gell-Mann had apologized to Susskind, one
of the theory's co-founders). String theory -- which posited the existence of unimaginably tiny, vibrating strands of energy --
evolved into "superstring theory" and then "M theory" (and expanded to include not just strings but wider "membranes").
String theory, broadly defined, became and remains the most prominent candidate to unify the physical world's diverse
particles and forces into a single mathematical framework."
http://www.rinf.com/news/dec05/string.html
"The black hole information paradox results from the combination of quantum mechanics and general relativity. It suggests
that physical information could "disappear" in a black hole, allowing many physical states to evolve into precisely the same
state. This is a contentious subject since it violates a commonly assumed tenet of science —that in principle complete
information about a physical system at one point in time should determine its state at any other time."
http://en.wikipedia.org/wiki/Black_hole_information_paradox
Stochasticity, Coherence & Self-Determinacy
"Causal evolution of spin networks (2008): A new approach to quantum gravity is described which joins the loop
representation formulation of the canonical theory to the causal set formulation of the path integral. The theory assigns
quantum amplitudes to special classes of causal sets, which consist of spin networks representing quantum states of the
gravitational field joined together by labeled null edges. The theory exists in 3+1, 2+1 and 1+1 dimensional versions, and
may also be interpreted as a theory of labeled timelike surfaces. The dynamics is specified by a choice of functions of the
labelings of d+1 dimensional simplices, which represent elementary future light cones of events in these discrete spacetimes.
The quantum dynamics thus respects the discrete causal structure of the causal sets. In the 1 + 1 dimensional case the
theory is closely related to directed percolation models. In this case, at least, the theory may have critical behavior
associated with percolation, leading to the existence of a classical limit."
http://en.scientificcommons.org/40855829
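
The 1+1 dimensional case above is said to be closely related to directed percolation. The following is a minimal, purely illustrative sketch of 1+1 dimensional directed bond percolation (not the spin-network model itself); the lattice width, step count, and the critical probability p ≈ 0.6447 are standard textbook choices rather than values taken from the paper.

import random

def directed_percolation(width=200, steps=200, p=0.6447, seed=1):
    """Evolve one row of active sites; each active site tries to activate its two
    forward (future light-cone) neighbours independently with probability p."""
    random.seed(seed)
    active = [True] * width                      # start fully occupied
    counts = []
    for _ in range(steps):
        counts.append(sum(active))
        nxt = [False] * width
        for i, a in enumerate(active):
            if a:
                if random.random() < p:
                    nxt[i] = True                # "straight down" bond
                if random.random() < p:
                    nxt[(i + 1) % width] = True  # diagonal bond (periodic edge)
        active = nxt
    return counts

print(directed_percolation()[::40])              # near p_c the activity decays slowly

Near the quoted critical probability the number of active sites decays slowly, while well below it the activity dies out quickly; this is the kind of percolation-type critical behavior the abstract alludes to.
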
"It's a Strange World: The Human as Living Time Machine: Schwartz, a Professor of psychology, medicine, neurology,
psychiatry and surgery at the University of Arizona and Director of the Human Energy Systems Laboratory, had expressed
an interest in how the mind could access information "beyond space and time," something Sarfatti knew required going
outside of accepted theory. Sarfatti had proposed a post-quantum theory based upon the work of the late Professor David
Bohm, and noted physicist Anthony Valentini had devised a theory which allowed signals to travel faster than the speed of
light. Valentini's work, which is based on the pilot-wave interpretation of quantum theory championed by the late David
Bohm, predicted a new kind of non-quantum matter, offering unique and almost magical properties. Sarfatti proposed that
the human mind -- the essence of the consciousness experience -- operated "beyond space and time" in a way similar to
Valentini's non-quantum matter."
http://www.americanchronicle.com/articles/view/43239
The Emergence of Gravity as a Retro-Causal Post-Inflation Macro-Quantum-Coherent Holographic Vacuum Higgs-
Goldstone Field (2009, Jack Sarfatti and Creon Levit):
We present a model for the origin of gravity, dark energy and dark matter: Dark energy and dark matter are residual pre-
inflation false vacuum random zero point energy (w = -1) of large-scale negative, and short-scale positive pressure,
respectively, corresponding to the "zero-point" (incoherent) component of a superfluid (supersolid) ground state. Gravity, in
contrast, arises from the 2nd order topological defects in the post-inflation virtual "condensate" (coherent) component. We
predict, as a consequence, that the LHC will never detect exotic real on-mass-shell particles that can explain dark
matter Ω_DM ≈ 0.23. We also point out that the future holographic dark energy de Sitter horizon is a total absorber (in the sense of retro-causal Wheeler-Feynman action-at-a-distance electrodynamics) because it is an infinite redshift surface for static detectors. Therefore, the advanced Hawking-Unruh thermal radiation from the future de Sitter horizon is a candidate for the negative pressure dark vacuum energy.
http://www.iop.org/EJ/article/1742-6596/174/1/012045/jpconf9_174_012045.pdf?request-id=a9bca028-9133-44e7-8821-17ac7c7acb75
Creation, Annihilation & Gyroteleostasis
Time Travel Research Has Begun:
http://www.youtube.com/watch?v=BtQfHpB8XHQ
Practical Application for the Negative Time Hypothesis:
http://qualight.com/stress/frolov/timehyp.htm
A Conditional Criterion for Identity, Leading to a Fourth Law of Logic
(Chronotopology): http://www.cheniere.org/books/aids/appendixIII.htm
A Graph-Theoretic Model for Time:
http://www9.georgetown.edu/faculty/kainen/timegts.pdf
"Pataphysics, an absurdist concept coined by the French writer Alfred Jarry, is a philosophy dedicated to studying what lies
beyond the realm of metaphysics. Defined as: "The science of imaginary solutions, which symbolically attributes the
properties of objects, described by their virtuality, to their lineaments"

"Reason dreams of an empire of knowledge, a mansion of the mind. Yet sometimes we end up living in a hovel by its side.
Reason has shown us our capacity for power, both to create and to destroy. Yet how we use that power rests on our deeper
capacities which lie beyond the reach of reason, beyond our traditions and culture, stretching far back into the depths of the
evolutionary process that created our species, a process that ultimately asserts the power of life over death. And, ironically,
even death, as part of the process of life, asserts that power. That is how we have come into being and now find ourselves
committed to the unrelenting struggle of ordinary human existence.

We surely stand at the threshold of a great adventure of the human spirit —a new synthesis of knowledge, a potential
integration of art and science, a deeper grasp of human psychology, a deepening of the symbolic representations of our
existence and feelings as given in religion and culture, the formation of an international order based on cooperation and
nonviolent competition. It seems not too much to hope for these things.

The future, as always, belongs to the dreamers."

Heinz R. Pagels
The Dreams of Reason
http://en.wikipedia.org/wiki/List_of_scientific_journals
Ranking and Mapping Scientific Knowledge:
http://www.eigenfactor.org/top10.htm
The SAO/NASA Astrophysics Data System:
http://adsabs.harvard.edu/
IngentaConnect:
http://www.ingentaconnect.com/
IEEE:
http://www.ieee.org/
IEEE Computer Society:
http://www2.computer.org/portal/web/guest/home
Nature:
http://www.nature.com/
Springer:
http://www.springer.com/

Japanese Scientific and Technical Journals in the Computing Area:


http://www.cs.arizona.edu/projects/japan/table_contents.html
Turpion Publications: Science Journals from Russia:
http://www.turpion.org/

Science Direct:
http://www.sciencedirect.com/
Elsevier:
http://www.elsevier.com/
Jstor:
http://www.jstor.org/
Project Euclid:
http://www.projecteuclid.org/
Scientific Journals International:
http://www.scientificjournals.org/current_issue.htm
SciCentral:
http://www.scicentral.com/
arXiv:
http://arxiv.org/
Oxford Journals:
http://www.oxfordjournals.org/

Universal Access to Human Knowledge:


http://www.archive.org/index.php
Directory of Open Access Journals:
http://www.doaj.org/
Hindawi Publishing Corporation (150+ Open Access Journals):
http://www.hindawi.com/

MIT Press Journals:


http://www.mitpressjournals.org/
MIT OpenCourseWare:
http://www.core.org.cn/OcwWeb/Brain-and-Cognitive-Sciences/index.htm
Free Online Computer Science and Programming Books, Textbooks, Lecture Notes:
http://freetechbooks.com/
New Mathematics and Natural Computation:
http://ideas.repec.org/s/wsi/nmncxx.html
The Frontiers Collection:
http://www.springerlink.com/content/x23rvt/?v=editorial
Archive Freedom: Addressing the Need for Freedom in Scientific Research:
http://www.archivefreedom.org/casehistories.htm
Stanford Encyclopedia of Philosophy:
http://plato.stanford.edu/
The Internet Encyclopedia of Philosophy:
http://www.utm.edu/research/iep/

Best Philosophy Journals:


http://homepage.mac.com/mcolyvan/journals.html

Gigapedia:
http://gigapedia.com/

Scribd:
http://www.scribd.com/
http://en.wikipedia.org/wiki/Collaborative_Innovation_Networks
http://www.scholarpedia.org/article/Reaction -diffusion_systems
http://www.ulb.ac.be/sciences/nlpc/pattern_formation.html
http://www.apmaths.uwo.ca/~mkarttu/turing.shtml
http://knol.google.com/k/twain/the-global-brain-singularity-and-360/31fjy9fjsu1x2/19

http://psychology.stanford.edu/~lera/273/zajonc-psychreview-1989.pdf
http://books.google.com/books?id=2F4AAAAAYAAJ&pg=PA403&dq=isotely&lr=
http://books.google.com/books?id=uoAwAAAAIAAJ&q=isotelic&dq=isotelic&lr=&pgis=1
http://dictionary.reference.com/search?q=telesis&r=66
http://books.google.com/books?id=4wAfD7Xa94UC&pg=PA381&dq=isotelic
http://www.edwardgoldsmith.org/page138.html

Pasted from <http://knol.google.com/k/plectics-groups-autopoiesis-logoi-topoi-biosemiosis-orders-teletics>

Protocomputational Multivarifolds, Hierarchical Stratifolds, Fractal Bubbles, Nested Hyperstructures, Cosmologic
Minimalism, Hyperincursive Conspansion
Multiply-Nested Virtual Realities, Intersective-Absorptive Exotic Spheres, Conspanding Fractal-Bubbles, IS-
Multivarifolds, Stratified Automaton, Minimal Surface Entanglement.
"The Fractal Bubble model is a serious candidate to describe the Universe in which we live." -David Wiltshire on The
Fractal Bubble Universe

Bubble fractal images are the most organic of all fractal types. They often appear as resembling a myriad of natural
creations of all sizes, from tiny cells and molecules to planets, stars and galaxies.
http://www.fantastic-fractals.com/Fractal-Bubbles.html

Physicists Use Soap Bubbles to Study Black Holes:

“Evidence for a membrane-like behavior of black holes has been known for two decades, in work pioneered by Kip
Thorne and his colleagues,” said Cardoso, who came to UM last fall. “This membrane paradigm approach makes
calculations easier.”
http://www.physorg.com/news66408910.html

Minimal Surfaces, Stratified Multivarifolds, and the Plateau Problem:

Dao Trong Thi formulated the new concept of a multivarifold, which is the functional analog of a geometrical stratified
surface and enables us to solve Plateau's problem in a homotopy class.
http://books.google.com/books?id=mncIV2c5Z4sC&source=gbs_navlinks_s

The remaining chapters present a detailed exposition of one of these trends (the homotopic version of Plateau's
problem in terms of stratified multivarifolds) and the Plateau problem in homogeneous symplectic spaces.
http://www.ams.org/bookstore?fn=20&arg1=mmonoseries&ikey=MMONO-84

The "Bubble Universe" Theory:


http://www.youtube.com/watch?v=2GNkazRo-tE&feature=related

Closer to Truth:
http://www.closertotruth.com/

Fractal Bubble Universe – Cosmology Without Dark Energy:


http://www.phys.canterbury.ac.nz/seminars/2005/

Double Bubbles in the Three Torus:


http://www.expmath.org/expmath/volumes/12/12.1/pp79_89.pdf
Contenido
• Loop Quantum Gravity (LQG)
• Physicists Use Soap Bubbles to Study Black Holes:
• Universal Geometry: from Cosmic Foam to Voronoi Foam
• YouTube Video
• YouTube Video
• YouTube Video
• Related results
• Cellular automata models of stratified dispersal


http://www.skytopia.com/project/fractal/mandelbulb.html

http://www.science-art.com/image.asp?id=3380&search=1&pagename=Colliding_Bubble_Universe

Dimensions - 1: Dimension Two


http://www.youtube.com/watch?v=e-AmvWviXAA
http://www.youtube.com/watch?v=6gKTCkUNtUE
Dimensions - 2: Dimension Three
http://www.youtube.com/watch?v=_12kuEPZglQ
http://www.youtube.com/watch?v=evCrLDmCjXI
Dimensions - 3: The Fourth Dimension
http://www.youtube.com/watch?v=LN7KWA6hS5o
http://www.youtube.com/watch?v=BbienXJ55bg
Dimensions - 4: The Fourth Dimension
http://www.youtube.com/watch?v=JKbVKxW_BYs
http://www.youtube.com/watch?v=el_uy1i0948
Dimensions - 5: Complex Numbers
http://www.youtube.com/watch?v=8wtJX9Ds3e0

http://www.youtube.com/watch?v=zisPV9GuzTE
Dimensions - 6: Complex Numbers
http://www.youtube.com/watch?v=xuO5lOhXBvM
http://www.youtube.com/watch?v=BngZUXqKeSk
Dimensions - 7: Fibrations
http://www.youtube.com/watch?v=fPKHkpvMfhk
http://www.youtube.com/watch?v=tBXb7CD_ZHU

Dimensions - 8: Fibrations...Suite
http://www.youtube.com/watch?v=yxAOCOOtKno
http://www.youtube.com/watch?v=ovl1Ag146Jg

Dimensions - 9: Proof
http://www.youtube.com/watch?v=FAyD2V3h1sM
http://www.youtube.com/watch?v=yAf27Sij-O8

http://en.wikipedia.org/wiki/Calabi–Yau_manifold
Haha, quantum foam.

http://www.geocities.com/CapeCanaveral/Hall/5803/tra.html
http://www.zamandayolculuk.com/cetinbal/elementaryparticles.htm
http://www.ipod.org.uk/reality/reality_quantum_reality.asp
http://www.the-global-shift.com/author/admin/


http://blog.robbiecooper.org/2009/02/15/quantum-foam-aetas-serenas/


http://fractalartgallery.com/wallpaper/hires4.htm

Loop Quantum Gravity:


Loop quantum gravity (LQG), also known as loop gravity and quantum geometry, is a proposed quantum theory of
spacetime which attempts to reconcile the theories of quantum mechanics and general relativity. Loop Quantum
Gravity suggests that space (i.e. the universe) can be viewed as an extremely fine fabric or network "weaved" of finite
quantised loops (of excited gravitational fields) called spin networks. When viewed over time these spin networks are
called spin foam, which should not be confused with quantum foam. While some prefer String theory, some physicists
consider Loop Quantum Gravity to be a serious contender because it incorporates general relativity and does not
incorporate higher dimensions.
http://mc2.gulf-pixels.com/?p=344

Black Holes May Harbour Their Own Universes:
When matter gets swallowed by a black hole, it could fall into another universe contained inside the black hole, or get
trapped inside a wormhole-like connection to a second black hole, a new study suggests.
What's inside a black hole is one of the biggest mysteries in physics. The theory that predicted black holes in the first
place - general relativity - says that all the matter inside them gets squashed into a central point of infinite density
called a singularity. But then, "things break down mathematically", says Christian Böhmer of University College
London, in the UK. "We would like to see the singularity removed."
Many researchers believe that some kind of new, overarching theory that unites gravity and quantum effects will
resolve the problem. String theory is the most popular of these alternatives.
But Böhmer and colleague Kevin Vandersloot of the University of Portsmouth in the UK use a rival approach
called loop quantum gravity, which defines space-time as a network of abstract links that connect tiny chunks of space.
Loop quantum gravity has been used before to tackle the singularity that would seem to have occurred at the origin of
our universe. It suggests that instead of a big bang, an earlier universe could have collapsed and then exploded
outward again in a "big bounce".
http://www.newscientist.com/article/dn12853-black-holes-may-harbour-their-own-universes.html
Loop Quantum Gravity (LQG)
Obviously we need a better, consistent theory which unifies GR and QT. One candidate for such a theory is Loop
Quantum Gravity (LQG). LQG has a built-in regulator which avoids the aforementioned curvature singularities: Space
and time are granular, get chopped up into smallest quanta of 10^-33 cm and 10^-43 s respectively, which are known as
the Planck length and Planck time. LQG derives its name from a formulation of the theory in terms of so-called
Wilson loops, familiar from (Lattice) QCD. This means that geometry and matter are excited on one dimensional closed
paths or loops. Zillions of these elementary, mutually intersecting and linked loops then form the spacetime continuum
at large scales. For instance, one would need an order of 10^68 such loops in order to "weave" a sheet of paper of size
A4. The granular structure of spacetime, sometimes termed spacetime foam, becomes visible only under high
resolution, see figure. A recent review of LQG can be found in [1]. The MPI for gravitational physics, Golm, is
internationally leading in the analysis of the "Quantum Einstein Equations" that one derives from LQG.
http://www.aei.mpg.de/english/research/highlights/01_quantumGrav/index.html
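
As a quick back-of-the-envelope check of the order-of-magnitude claim quoted above (a Planck length of ~10^-33 cm and ~10^68 loops needed to "weave" an A4 sheet), one can divide the area of an A4 page by the Planck area; the numbers below are only this arithmetic, nothing more.

import math

planck_length = 1.616e-35          # metres, ~10^-33 cm as quoted above
a4_area = 0.210 * 0.297            # m^2, a standard A4 sheet
loops = a4_area / planck_length**2
print(f"{loops:.1e} Planck-area cells, i.e. of order 10^{round(math.log10(loops))}")

This gives roughly 2 x 10^68, consistent with the figure quoted in the text.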

Deriving Dimensions:
It seems like the most obvious physical fact: The universe has four dimensions --three spanning space and one ticking
away time. But the ultimate theory of gravity should explain why the universe is four-dimensional and how those
dimensions arose, say researchers trying to unify the theories of quantum mechanics and relativity. Now, calculations
in the 24 September PRL show that when all possible microscopic contortions of spacetime are added together, a
large-scale four-dimensional universe can emerge.
http://focus.aps.org/story/v14/st13

Physicists Use Soap Bubbles to Study Black Holes:


"Evidence for a membrane-like behavior of black holes has been known for two decades, in work pioneered by Kip
Thorne and his colleagues," said Cardoso, who came to UM last fall. "This membrane paradigm approach makes
calculations easier."

Cardoso and Dias have extended and strengthened this analogy. Their combined efforts show that by endowing the
membrane with surface tension – the force that holds soap bubbles together – one can reproduce many phenomena,
which up to now could be studied only through series of complex computations.

The duo has been applying the membrane paradigm to their study of "black strings," which are long and thin black
holes. The researchers showed these black strings break into smaller fragments, just as water dripping from a faucet
breaks into small droplets.

"What's most amazing to me in our results is how such a complex system of equations such as Einstein's can be
modeled so well by fluids with surface tension, like soap bubbles," Dias said. "I was stunned when I saw how good the
match was."


Cardoso and Dias recently had an article on their theory accepted for publication in Physical Review Letters, journal of
the American Physical Society. The paper, "Gregory-Laflamme and Rayleigh-Plateau Instabilities of Black Strings,"
runs in the May 12 issue.
http://www.physorg.com/news66408910.html

"After all, the nested foam is the easiest and most natural way, how to realize the fractal Universe."
http://lofi.forum.physorg.com/Physicists-Use-Soap-Bubbles-to-Study-Black-Holes_6492.html
First, the character of light propagation requires a medium in which transverse waves are absolutely predominant. Soap
foam can serve as a physical model of such a medium. The foam can also explain quantum phenomena easily, because
foam tends to become denser when energy is introduced (shaking foam in an evacuated vessel is an example of such
behavior). The very wave that spreads through such foam therefore makes the medium denser, which gives the quantum
wave its non-linear character as it interferes with the density gradient of the Aether. A sufficiently intense low-frequency
energy wave can even be trapped by a density blob, like a light wave trapped inside a glass sphere by total reflection.
This is the proposed principle of particle formation: a foamy system can create a system of particles just by introducing
the required amount of energy.

Second, every particle system exhibits diffusional fluctuations. These fluctuations acquire a foamy character as the
particle system becomes denser and denser; condensing supercritical vapor is an example of such a system. We can
see that the foam model is able to explain itself recursively, which leads to the model of nested foam, i.e. foam whose
bubbles are filled by another foam, recursively. The complex character of the waves spreading along the crossing
internal surfaces of the foam gives the waves and particles their internal helicity and spin.

Because such a model is quite qualitative, there is plenty of room here for mathematical simulation and modeling;
but this is the first case in a long time where intuitive insight based on classical physical experience has surpassed
understanding based solely on formal mathematical models. So we can start to validate these models by intuitive
insight into reality, and even extrapolate such understanding outside the scope of the existing mathematical models.
Now we can understand at which points LQG, Heim's theory, or even superstring theories are correct and where they
are weak, because the nested foam is a complex yet understandable system. Such a model supplies a way by which
all these theories can be connected and reconciled, although they appear mutually quite incompatible with respect to
their underlying mathematical formalisms.

This is another problem of the formal mathematical approach. For example, quantum mechanics can be described by
at least four or five mutually independent mathematical models, the so-called interpretations: linear algebra based on
matrix theory, differential equations, path integrals, Dirac's statistics, and so on. All these models supply the same
results, so each can be useful in particular situations. But this increases the number of mathematical models in
physics, which thereby appears more complex than it really is.
http://lofi.forum.physorg.com/Relativity-versus-Quantum-Mechanics_11580.html
Innocent light-minded men,
who think that astronomy
can be learnt by looking at the stars
without knowledge of mathematics
will, in the next life, be birds
Plato, Timaeus

Theme
Universal Geometry: from Cosmic Foam to Voronoi Foam

In my work I am particularly fascinated by the Megaparsec scale foamlike patterns in the
spatial matter and galaxy distribution. As increasingly elaborate galaxy redshift surveys
charted ever larger regions in the nearby cosmos, an intriguingly complex and salient cosmic
foamlike network has been found to permeate the observable Universe:
THE COSMIC FOAM
I have been addressing various aspects of its origin and formation, and of its significance within a cosmological context. With
such a fascinating geometric yet stochastic structure defining the fabric of the Universe's infrastructure, I have been led to
establish the similarity with what is an equally intriguing and complex structure in mathematics, one of the basic concepts in
the field of ``stochastic geometry'':

VORONOI TESSELLATIONS
For the various cosmological, mathematical and statistical aspects of related projects, see:

Research
Publications
http://www.astro.rug.nl/~weygaert/

Periodic Computer Representations of the Cosmic Structure


Rien van de Weijgaert, Kapteyn Institute, University of Groningen
slides(ppt, 80 Mb)

Cosmology is the study of the structure and evolution of the Universe and its contents. One of the most important
issues is the formation of structure and the emergence of cosmic objects out of the almost featureless pristine Universe
some 13.6 Gyrs ago. All evidence points towards a gravitational amplification of initial Gaussian density and velocity

ripples with tiny amplitudes. An important consideration is that of the cosmological principle: there are no preferred or
central positions in the Universe, each sizeable volume of the Universe is more or less (statistically) representative.
When simulating the evolution of cosmic structure we therefore need to guarantee that the simulation volume is
representative. This is most straightforwardly accomplished by specifying volumes with periodic boundary conditions.
Effectively it means that the simulation ignores the influence of primordial density and velocity fluctuations with
wavenumbers smaller than the fundamental wavelength of the simulation box.
One of the main instruments for studying the formation of structure are N-body simulations, in which the cosmic mass
distribution is reflected in a representative spatial distribution of discrete particles. I will describe how Delaunay and
Voronoi tessellations are powerful instruments for analyzing and following the multiscale and strongly patterned nature
of spatial structure on Megaparsec scales.
(image processed from images made by Miguel Aragon-Calvo)
http://www.cgal.org/Events/PeriodicSpacesWorkshop/
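
As a concrete illustration of the Delaunay/Voronoi analysis described above, here is a minimal sketch using scipy.spatial; plain uniform random points stand in for an N-body particle distribution, and the inverse Voronoi-cell area is used as a crude local density estimate (the point count and numbers are arbitrary, not taken from the talk).

import numpy as np
from scipy.spatial import Delaunay, Voronoi

rng = np.random.default_rng(0)
points = rng.random((500, 2))            # toy "galaxy" positions in the unit square

tri = Delaunay(points)                   # Delaunay triangulation
vor = Voronoi(points)                    # Voronoi tessellation

def cell_area(region):
    """Shoelace area of a bounded Voronoi cell given as a list of vertex indices."""
    v = vor.vertices[region]
    x, y = v[:, 0], v[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

areas = [cell_area(r) for i in range(len(points))
         for r in [vor.regions[vor.point_region[i]]]
         if r and -1 not in r]            # skip unbounded cells on the boundary
print(tri.simplices.shape[0], "Delaunay triangles;",
      "median local density (1/cell area):", round(1.0 / np.median(areas), 1))
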
"The Fractal Bubble model is a serious candidate to describe the Universe in which we live." -David Wiltshire on The
Fractal Bubble Universe Bubble fractal images are the most organic of all fractal types. They often appear as
resembling a myriad of natural creations of all sizes, from tiny cells and molecules to planets, stars and galaxies.
http://www.fantastic-fractals.com/Fractal-Bubbles.html
David Wiltshire:
I currently have a research project, Gravitational energy and cosmic structure: What is the Universe made of?, from the
Marsden fund administered by the RSNZ. Teppo Mattsson joined us in September 2009 as a postdoc on this project,
and his wife Maria Mattsson is a visiting research associate, also working on inhomogeneous cosmology. Ishwaree
Neupane has been with us as a research fellow since June 2004, and works in other areas of gravitational physics and
cosmology. I am supervisor to one PhD student, Peter Smale. Emeritus Professor Roy Kerr maintains an active
association with our group. We have on-going interactions with Matt Visser's group at Victoria University of Wellington.
http://www2.phys.canterbury.ac.nz/~dlw24/
Multidimensional Theory:
http://lofi.forum.physorg.com/Multi-dimensional-theory_6181.html

http://www.thebegavalley.org.au/bent.html

http://www.zazzle.com/quantum_foam_poster-228304752833134500

http://www.quantumconsciousness.org/presentations/whatisconsciousness.html
Ruggero Gabbrielli's Homepage:
Research interests
1. Cell aggregates and soap films (require Java). The geometry of cellular materials. Ground state for cellular material
and aggregates of equisized soft particles. The Kelvin problem (video 1, video 2), modelling of foams, the three-
dimensional structure of bubbles in foams, disorder in foams, n-dimensional honeycombs, periodic tilings from point
sets, low-dimensional non-lattice sphere packings and coverings, spatial distribution of points, periodic point sets
(require Java). Applications: foam modelling, cell aggregates, metal crystallites, grain structure and boundaries,
cosmology.
2. Triply periodic surfaces (require Java). Trigonometric approximations by implicit functions of minimal surfaces and
surfaces of constant curvature. Optimization of low volume fraction porous materials (stress levelling). Modelling
of structures based on triply periodic surfaces. Stress-leveling analysis of porous materials. Application: trabecular
bone mimicking for bone substitutes. Modelling and three-dimensional printing of functionally graded materials.
3. Pattern formation applied to optimization problems in mathematics (sphere coverings and quantizers), physics
(structure of condensed matter and metallurgy), chemistry (crystallography), engineering (foam modelling, surface
and volume mesh generation). Swift-Hohenberg equation, Brusselator reaction-diffusion system.
4. Auxetics (require Java). Bifurcation and geometric instabilities in solid mechanics. The internal geometry of negative
Poisson's ratio structures.
Movies
The movies below show how pattern-forming equations (in this case the Swift-Hohenberg equation) are able to find
low-energy lattices and nonlattices; a minimal numerical sketch of this equation follows after this entry.
Here two isosurfaces of the BCC lattice are shown. The lattice points are at the centre of the green surfaces. The white
surface is 5% off the absolute minima.
Same with A15, a nonlattice. Also here.
http://engweb.swan.ac.uk/~gabbriellir/
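
The sketch promised above: a semi-implicit Fourier (pseudo-spectral) integration of the Swift-Hohenberg equation du/dt = r*u - (1 + Laplacian)^2 u - u^3 on a periodic square. The grid size, r, and time step are arbitrary illustrative choices, not parameters from the page cited.

import numpy as np

N, L, r, dt, steps = 128, 32 * np.pi, 0.2, 0.5, 2000
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)        # wavenumbers of the periodic box
kx, ky = np.meshgrid(k, k)
lin = r - (1.0 - (kx**2 + ky**2))**2              # linear operator in Fourier space

rng = np.random.default_rng(0)
u = 0.1 * rng.standard_normal((N, N))             # small random initial condition

for _ in range(steps):
    u_hat = np.fft.fft2(u)
    nl_hat = np.fft.fft2(-u**3)                   # cubic nonlinearity, explicit
    u_hat = (u_hat + dt * nl_hat) / (1.0 - dt * lin)   # implicit linear step
    u = np.real(np.fft.ifft2(u_hat))

print("pattern amplitude (std of u):", round(float(u.std()), 3))

Starting from noise, the field organizes into stripe/spot patterns, which is the lattice-finding behavior the movies above demonstrate.
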
YouTube Video
YouTube Video
YouTube Video

http://smu.edu/math/foam.html
Interfacial and Colloidal Phenomena Research Group:
http://www.iit.edu/~wasan/ftdes.html


Diagram 11: In the above illustration, a spatial cross section of a spacetime diagram (blue line) is rotated toward the
viewer and displayed along the time axis (blue rectangle). The result is a Venn diagram in which circles represent
objects and events, or (n>1)-ary interactive relationships of objects. That is, each circle depicts the "entangled quantum
wavefunctions" of the objects which interacted with each other to generate it. The small dots in the centers of the
circles represent the initial events and objects from which the circles have arisen, while the twin dots where the circles
overlap reflect the fact that any possible new event, or interaction between objects involved in the old events, must
occur by mutual acquisition in the intersect. The outward growth (or by conspansive duality, mutual absorption) of the
circles is called inner expansion, while the collapse of their objects in new events is called requantization. The circles
themselves are called IEDs, short for inner expansive domains, and correspond to pairs of interactive syntactic
operators involved in generalized-perceptual events (note the hological "evacuation" and mutual absorption of the
operators). Spacetime can be illustrated in terms of a layering of such Venn diagrams, mutual contact among which is
referred to as "extended superposition" (in the real world, the Venn diagrams are 3-dimensional rather than
planar, the circles are spheres, and "layering" is defined accordingly). Extended superposition "atemporally"
distributes antecedent events over consequent events, thus putting spacetime in temporally-extended self-contact. In
light of the Telic Principle (see below), this scenario involves a new interpretation of quantum theory, sum over futures.
Sum over futures involves an atemporal generalization of "process", telic recursion, through which the universe effects
on-the-fly maximization of a global self-selection parameter, generalized utility.
http://www.scribd.com/doc/23694328/The-Cognitive-Theoretic-Model-of-the-Universe-A-New-Kind-of-Reality-Theory
The "Bubble Universe" Theory:
http://www.youtube.com/watch?v=2GNkazRo-tE&feature=related
Closer to Truth:
http://www.closertotruth.com/
Fractal Bubble Universe – Cosmology Without Dark Energy:http://www.phys.canterbury.ac.nz/seminars/2005/
Viable inhomogeneous model universe without dark energy from primordial inflation:
A new model of the observed universe, using solutions to the full Einstein equations, is developed from the hypothesis
that our observable universe is an underdense bubble, with an internally inhomogeneous fractal bubble distribution of
bound matter systems, in a spatially flat bulk universe. It is argued on the basis of primordial inflation and resulting
structure formation, that the clocks of the isotropic observers in average galaxies coincide with clocks defined by the
true surfaces of matter homogeneity of the bulk universe, rather than the comoving clocks at average spatial positions
in the underdense bubble geometry, which are in voids. This understanding requires a systematic reanalysis of all
observed quantities in cosmology. I begin such a reanalysis by giving a model of the average geometry of the universe,
which depends on two measured parameters: the present matter density parameter, Omega_m, and the Hubble
constant, H_0. The observable universe is not accelerating. Nonetheless, inferred luminosity distances are larger than
naively expected, in accord with the evidence of distant type Ia supernovae. The predicted age of the universe is 15.3
+/-0.7 Gyr. The expansion age is larger than in competing models, and may account for observed structure formation
at large redshifts.
http://arxiv.org/abs/gr-qc/0503099
Type Ia supernovae tests of fractal bubble universe with no cosmic acceleration:
http://arxiv.org/abs/astro-ph/0504192
Geodesics, soap bubbles and pattern formation in riemannian surfaces:

We study a singularly perturbed semi-linear elliptic PDE with a bistable potential on a closed Riemannian surface. We
show that the transition region converges in the sense of varifolds to an embedded (multiple) curve with constant
geodesic curvature.
http://www.springerlink.com/content/k18052t7351126t8/
In the realm of theory (apart from Mandelbrot's ideas), the first appearance of fractals in cosmology was likely with
Andrei Linde's "Eternally Existing Self-Reproducing Chaotic Inflationary Universe"[5] theory (see Chaotic inflation
theory), in 1986. In this theory, the evolution of a scalar field creates peaks that become nucleation points which cause
inflating patches of space to develop into "bubble universes," making the universe fractal on the very largest scales.
Alan Guth's 2007 paper on "Eternal Inflation and its implications"[6] shows that this variety of Inflationary universe
theory is still being seriously considered today. And inflation, in some form or other, is widely considered to be our best
available cosmological model.
http://en.wikipedia.org/wiki/Fractal_cosmology
Quantum Data Visualization in a Sphere:
http://jasonkolb.com/weblog/2009/07/quantum-data-visualization-in-a-sphere.html
Is there an equation for physics itself as a subset of mathematics? I believe there is and if it were to be found it would
be called the ultimate theory of physics. Moreover, I believe that it can be found and that it has a lot to do with what is
different about the way a physicist looks at the world compared to a mathematician...We can then try to elevate the
idea to a more general principle of representation-theoretic self-duality, that a fundamental theory of physics is
incomplete unless such a role-reversal is possible. We can go further and hope to fully determine the (supposed)
structure of fundamental laws of nature among all mathematical structures by this self-duality condition. Such duality
considerations are certainly evident in some form in the context of quantum theory and gravity. The situation is
summarised to the left in the following diagram. For example, Lie groups provide the simplest examples of Riemannian
geometry, while the representations of similar Lie groups provide the quantum numbers of elementary particles in
quantum theory. Thus, both quantum theory and non-Euclidean geometry are needed for a self-dual picture. Hopf
algebras (quantum groups) precisely serve to unify these mutually dual structures.'' (The reader may also wish to see
the original document on line.)
http://planetmath.org/encyclopedia/IHESOnTheFusionOfMathematicsAndTheoreticalPhysics2.html
Global Theory of Minimal Surfaces:
The subjects covered include minimal and constant-mean-curvature submanifolds, geometric measure theory and the
double-bubble conjecture, Lagrangian geometry, numerical simulation of geometric phenomena, applications of mean
curvature to general relativity and Riemannian geometry, the isoperimetric problem, the geometry of fully nonlinear

elliptic equations and applications to the topology of three-dimensional manifolds.
http://books.google.com/books?id=HRr3jdBDXIgC&source=gbs_navlinks_s
Isoperimetric Inequalities for Multivarifolds:
http://www.iop.org/EJ/abstract/0025-5726/26/2/A03
Multidimensional Parameterized Variational Problems on Riemannian Manifolds:
http://www.springerlink.com/content/171q92207n778361/
Modern State of Minimal Surface Theory:
http://dfgm.math.msu.su/bookskaf/Fom1990-f.pdf
Computing Matveev's Complexity of Non-Orientable 3-Manifolds via Crystallization Theory:
http://www.matematica.unimore.it/0ricerca/Quaderni/quaderno_55.pdf
Spinors, twistors, Clifford algebras, and quantum deformations:
http://books.google.com/books?id=XAI_yheWFkUC&source=gbs_navlinks_s
Multiverses and Blackberries
http://www.csicop.org/si/show/multiverses_and_blackberries/
The original Mandelbrot is an amazing object that has captured the public's imagination for 30 years with its cascading
patterns and hypnotically colourful detail. It's known as a 'fractal' - a type of shape that yields (sometimes elaborate)
detail forever, no matter how far you 'zoom' into it (think of the trunk of a tree sprouting branches, which in turn split off
into smaller branches, which themselves yield twigs etc.).

It's found by following a relatively simple math formula. But in the end, it's still only 2D and flat - there's no depth,
shadows, perspective, or light sourcing. What we have featured in this article is a potential 3D version of the same
fractal. For the impatient, you can skip to the nice pics, but the below makes an interesting read (with a little math as
well for the curious).

http://www.skytopia.com/project/fractal/mandelbulb.html

The first to connect the topology of figures with logical systems was Alfred Tarski, who showed that logical units
can be associated with topologically distinct geometrical figures.

Related results
1. The logic of topology is a particular case of spatial logic.[21][22] Some information on digital topology, which is the
basis for topological computing, is in [23][24].
2. A theory of a particular case of topological processors, intended for parallel operation on elementary image voxels,
has been proposed by Ryabov and Serov (2007).[25]
3. The idea of increased noise immunity of topological computing is used in quantum topological computations. [26]
4. Some aspects of relations of topology, physics and computations are considered by Baez and Stay (2009). [27]
http://en.wikipedia.org/wiki/Topological_computing

A topological quantum computer is a theoretical quantum computer that employs two-dimensional quasiparticles
called anyons, whose world lines cross over one another to form braids in a three-dimensional spacetime (i.e., one
temporal plus two spatial dimensions). These braids form the logic gates that make up the computer. The advantage
of a quantum computer based on quantum braids over using trapped quantum particles is that the former is much
more stable. The smallest perturbations can cause a quantum particle to decohere and introduce errors in the
computation, but such small perturbations do not change the topological properties of the braids. This is like the effort
required to cut a string and reattach the ends to form a different braid, as opposed to a ball (representing an ordinary
quantum particle in four-dimensional spacetime) simply bumping into a wall. While the elements of a topological
quantum computer originate in a purely mathematical realm, recent experiments indicate these elements can be
created in the real world using semiconductors made of gallium arsenide near absolute zero and subjected to strong
magnetic fields.
http://en.wikipedia.org/wiki/Topological_quantum_computing
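
A toy illustration of the "topology, not local wiggling" point above: a braid word in B_3 induces a permutation of the three strands, and that permutation is unchanged by local jiggling of the strands and respects the braid relation s1*s2*s1 = s2*s1*s2. This is only the coarsest braid invariant, not a model of anyonic gates; the function below is purely illustrative.

def braid_permutation(word, strands=3):
    """word: sequence of 1-based generator indices; generator i crosses strands i and i+1."""
    perm = list(range(strands))
    for i in word:
        perm[i - 1], perm[i] = perm[i], perm[i - 1]
    return tuple(perm)

print(braid_permutation([1, 2, 1]))   # (2, 1, 0)
print(braid_permutation([2, 1, 2]))   # (2, 1, 0): the braid relation preserves it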

Asymmetry: The Foundation of Information:

"As individual needs have arisen in the fields of physics, electrical engineering and computational science, each has
created its own theories of information to serve as conceptual instruments for advancing developments. This book
provides a coherent consolidation of information theories from these different fields. The author gives a survey of
current theories and then introduces the underlying notion of symmetry, showing how information is related to the
capacity of a system to distinguish itself. A formal methodology using group theory is employed and leads to the
application of Burnside's Lemma to count distinguishable states. This provides a versatile tool for quantifying
complexity and information capacity in any physical system."
http://books.google.com/books?id=oMFsko4E9FQC&lr=&source=gbs_navlinks_s
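
A minimal worked example of the Burnside's-Lemma counting mentioned above: the number of distinguishable colourings of an n-bead circular necklace with k colours under rotation is (1/n) times the sum over rotations j of k^gcd(j, n). The necklace is a standard illustration, not an example taken from the book.

from math import gcd

def distinguishable_necklaces(n, k):
    """Count k-colourings of an n-bead necklace distinguishable under rotation (Burnside)."""
    return sum(k ** gcd(j, n) for j in range(n)) // n

print(distinguishable_necklaces(6, 2))   # -> 14 distinct 6-bead two-colour necklaces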


"Self-dissimilarity is a measure of complexity defined in a series of papers by David Wolpert and William G. Macready."
http://en.wikipedia.org/wiki/Self-dissimilarity

Self-Dissimilarity: An Empirical Measure of Complexity:

For systems usually characterized as complex/living/intelligent, the spatio-temporal patterns exhibited on different
scales differ markedly from one another. (E.g., the biomass distribution of a human body looks very different depending
on the spatial scale at which one examines that biomass.) Conversely, the density patterns at different scales in non-
living/simple systems (e.g., gases, mountains, crystal) do not vary significantly from one another.
Such self-dissimilarity can be empirically measured on almost any real-world data set involving spatio-temporal
densities, be they mass densities, species densities, or symbol densities. Accordingly, taking a system's (empirically
measurable) self-dissimilarity over various scales as a complexity "signature" of the system, we can compare the
complexity signatures of wholly different kinds of systems (e.g., systems involving information density in a digital
computer vs. systems involving species densities in a rainforest, vs. capital density in an economy etc.). Signatures
can also be clustered, to provide an empirically determined taxonomy of kinds of systems that share organizational
traits. Many of our candidate self-dissimilarity measures can also be calculated (or at least approximated) for physical
models.

The measure of dissimilarity between two scales that we finally choose is the amount of extra information on one of the
scales beyond that which exists on the other scale. It is natural to determine this "added information" using a maximum
entropy inference of the pattern at the second scale, based on the provided pattern at the first scale. We briefly discuss
using our measure with other inference mechanisms (e.g., Kolmogorov complexity-based inference, fractal-dimension
preserving inference, etc.).
http://ideas.repec.org/p/wop/safiwp/97-12-087.html
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.44.8564&rep=rep1&type=pdf
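
A much-simplified proxy for the self-dissimilarity signature described above: coarse-grain a 1-D density pattern at several block scales and record how much Shannon information each finer scale carries beyond the next coarser one. Block averaging plus histogram entropy is used here as a stand-in for the paper's maximum-entropy inference between scales, so this is only a rough illustration of the idea.

import numpy as np

def block_entropy(signal, scale, bins=16):
    """Entropy (bits) of the histogram of block-averaged values at the given scale."""
    n = (len(signal) // scale) * scale
    coarse = signal[:n].reshape(-1, scale).mean(axis=1)
    counts, _ = np.histogram(coarse, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def signature(signal, scales=(1, 2, 4, 8, 16, 32)):
    ent = [block_entropy(signal, s) for s in scales]
    return [round(ent[i] - ent[i + 1], 2) for i in range(len(ent) - 1)]  # extra info per scale

rng = np.random.default_rng(0)
noise = rng.random(4096)                                           # structureless "gas"
structured = np.sin(np.linspace(0, 40 * np.pi, 4096)) + 0.2 * rng.random(4096)
print("noise     :", signature(noise))
print("structured:", signature(structured))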

Self-dissimilarity as a high dimensional complexity measure:


http://ti.arc.nasa.gov/m/pub/935h/0935%20(Wolpert).pdf

Unifying Themes in Complex Systems: Relevance of Hyperstructure Theory:


http://books.google.com/books?id=OJwuFoWhJTIC&lpg=PA299&dq=hyperstructure%20theory&pg=PA299#v=onepage&q=hyperstructure%20theory&f=false
Applications of Hyperstructure Theory:
http://books.google.com/books?id=uvCrZ3iGur4C&source=gbs_navlinks_s
Stratifolds are a generalization of smooth manifolds – a notion of generalized smooth space – which were introduced
by Kreck; see his lecture notes.

Stratifolds comprise one of the many variants of the concept of a stratified space, and they may include some types of
singularities.
http://ncatlab.org/nlab/show/stratifold
Differential Algebraic Topology: From Stratifolds to Exotic Spheres:
http://www.hausdorff-research-institute.uni-bonn.de/files/kreck-DA.pdf
http://www.hausdorff-research-institute.uni-bonn.de/kreck-stratifolds
Resolution of Stratifolds and Connection to Mather's Abstract Pre-Stratified Spaces:
http://www.hausdorff-research-institute.uni-bonn.de/files/diss_grinberg.pdf
Minimal Surfaces, Stratified Multivarifolds, and the Plateau Problem:
Dao Trong Thi formulated the new concept of a multivarifold, which is the functional analog of a geometrical stratified
surface and enables us to solve Plateau's problem in a homotopy class.
http://books.google.com/books?id=mncIV2c5Z4sC&source=gbs_navlinks_s
Learning on Varifolds:
In this paper, we propose a new learning framework based on the mathematical concept of varifolds (Morgan, 2000),
which are the measure-theoretic generalization of differentiable manifolds. We compare varifold learning with the
popular manifold learning and demonstrate some of its specialties. Algorithmically, we derive a neighborhood
refinement technique for hypergraph models, which are conceptually analogous to varifolds, give the procedure for
constructing such hypergraphs from data, and finally, by using the hypergraph Laplacian matrix, we are able to solve
high-dimensional classification problems accurately.
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4685510
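The abstract above does not spell out its algorithm; as a minimal sketch of the standard ingredient it names, the following computes the normalized hypergraph Laplacian (in the form popularized by Zhou, Huang and Schölkopf) from a vertex-edge incidence matrix, whose small eigenvalues would typically drive classification or clustering. The incidence matrix and the uniform edge weights are hypothetical.

import numpy as np

def hypergraph_laplacian(H, w=None):
    """Normalized hypergraph Laplacian L = I - Dv^-1/2 H W De^-1 H^T Dv^-1/2.
    H: (n_vertices, n_edges) 0/1 incidence matrix; w: edge weights."""
    n, m = H.shape
    w = np.ones(m) if w is None else np.asarray(w, float)
    dv = H @ w                      # vertex degrees
    de = H.sum(axis=0)              # edge sizes
    Dv = np.diag(1.0 / np.sqrt(dv))
    Theta = Dv @ H @ np.diag(w / de) @ H.T @ Dv
    return np.eye(n) - Theta

# Toy hypergraph: 5 vertices, 3 hyperedges.
H = np.array([[1, 0, 0],
              [1, 1, 0],
              [1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)
L = hypergraph_laplacian(H)
vals, vecs = np.linalg.eigh(L)
print(np.round(vals, 3))    # small eigenvalues/eigenvectors drive clustering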
Hypergraph:
http://en.wikipedia.org/wiki/Hypergraph
P System:
http://en.wikipedia.org/wiki/P_system
Network Automaton:
http://en.wikipedia.org/wiki/Network_automaton

Fuzzy Information and Engineering:


http://books.google.com/books?id=9OIRN5YJ9IsC&lpg=PA290&dq=hyperstructure%20theory&pg=PA290#v=onepage&q=hyperstructure%20theory&f=false
Fractured fractals and broken dreams: self-similar geometry through metric and measure:
Fractal patterns have emerged in many contexts, but what exactly is a pattern? How can one make precise the
structures lying within objects and the relationships between them? This book proposes new notions of coherent
geometric structure to provide a fresh approach to this familiar field. It develops a new concept of self-similarity called
"BPI" or "big pieces of itself," which makes the field much easier for people to enter. This new framework is quite
broad, however, and has the potential to lead to significant discoveries. The text covers a wide range of open
problems, large and small, and a variety of examples with diverse connections to other parts of mathematics. Although
fractal geometries arise in many different ways mathematically, comparing them has been difficult. This new approach
combines accessibility with powerful tools for comparing fractal geometries, making it an ideal source for researchers
in different areas to find both common ground and basic information.
http://books.google.com/books?id=0oy-Mgd1aTQC&source=gbs_navlinks_s
Ten Thousand Peacock Feathers in Foaming Acid:
A vacuum or a semi-vacuum encased within a gravity and temperature sensitive elastic skin - the scenario of an early
universe, a soap bubble, and later, that of a biological membrane. By researching the behavior of soap films, a vast
variety of optical, mathematical, thermodynamic and electrochemical discoveries have been made since the time of the
Renaissance. One of the earliest means of analogue computing was the soap film calculator (19th century), which
tackled geometric problems of minimal surface area. Soap film soft drives are currently being used for black-hole and
superstring modeling.
http://portablepalace.com/10000.html
Computing Homology:
The aim of this paper is to provide a short introduction to computational homology based on cubical complexes. The
discussed topics include cubical complexes, a reduction algorithm for computing homology of finitely generated chain
complexes, and an algorithmic construction of homology of continuous maps via multivalued acyclic representations.
http://www.maths.soton.ac.uk/EMIS/journals/HHA/volumes/2003/n2a8/v5n2a8.pdf
Computational Homology:
The aim of the paper is to give a retrospective of the authors' research on homology computing in the past six years. The
discussed topics include cubical complexes, a reduction algorithm for computing homology
of finitely generated chain complexes, and an algorithmic construction of homology of continuous maps via multivalued
acyclic representations. The motivation comes from computational dynamics but more applications are expected.
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.140.3132&rep=rep1&type=pdf
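The cubical reduction algorithm itself is not reproduced here; as a minimal sketch of what computing homology means in practice, the following derives Betti numbers over the rationals from the boundary matrices of a finitely generated chain complex, using b_k = dim C_k - rank d_k - rank d_(k+1). The hollow-triangle complex is a hypothetical example.

import numpy as np

def betti_numbers(boundaries, dims):
    """Betti numbers over Q from boundary matrices.
    boundaries[k]: matrix of the map C_k -> C_(k-1); dims[k]: number of k-cells.
    b_k = dim C_k - rank d_k - rank d_(k+1)."""
    ranks = [0] * (len(dims) + 1)
    for k, B in boundaries.items():
        ranks[k] = np.linalg.matrix_rank(B) if B.size else 0
    return [dims[k] - ranks[k] - ranks[k + 1] for k in range(len(dims))]

# Hollow triangle: 3 vertices, 3 edges, no 2-cells.
d1 = np.array([[-1,  0,  1],     # columns are the three edges
               [ 1, -1,  0],
               [ 0,  1, -1]], dtype=float)
print(betti_numbers({1: d1}, dims=[3, 3]))   # -> [1, 1]: one component, one loop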
Computing Persistent Homology:
http://comptop.stanford.edu/preprints/persistence1.pdf
In mathematics, a varifold is, loosely speaking, a measure-theoretic generalization of the concept of a differentiable
manifold, by replacing differentiability requirements with those provided by rectifiable sets, while maintaining the
general algebraic structure usually seen in differential geometry. More closely, varifolds generalize the ideas of
a rectifiable current. Varifolds are the topic of study in geometric measure theory.
http://en.wikipedia.org/wiki/Varifold

Geometric measure theory


An area of analysis concerned with solving geometric problems via measure-theoretic techniques. The canonical
motivating physical problem is probably that investigated experimentally by J. Plateau in the nineteenth century [a4]:
Given a boundary wire, how does one find the (minimal) soap film which spans it? Slightly more mathematically: Given
a boundary curve, find the surface of minimal area spanning it. (Cf. also Plateau problem.) The many different
approaches to solving this problem have found utility in most areas of modern mathematics and geometric measure
theory is no exception: techniques and ideas from geometric measure theory have been found useful in the study of
partial differential equations, the calculus of variations, harmonic analysis, and fractals.
http://eom.springer.de/G/g130040.htm
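As a toy version of Plateau's problem (a sketch only, under the small-slope approximation in which the film's height field minimizes the Dirichlet energy and is therefore harmonic inside the frame; the sinusoidal "wire" is an arbitrary choice), the following relaxes a soap film spanning a square frame by Jacobi iteration.

import numpy as np

# Small-slope soap film: height h(x, y) over the unit square minimizes the
# linearized area (Dirichlet energy), so h is harmonic inside and pinned to
# the boundary wire. Jacobi relaxation drives h toward that solution.
n = 65
x = np.linspace(0.0, 1.0, n)
h = np.zeros((n, n))

# Hypothetical "wire": sinusoidal heights on two opposite edges, flat sides.
h[0, :]  = np.sin(np.pi * x)
h[-1, :] = -np.sin(np.pi * x)

for _ in range(5000):
    h[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                            h[1:-1, :-2] + h[1:-1, 2:])

print(round(float(h[n // 4, n // 2]), 3))   # film height a quarter of the way down the centre line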
Triply Periodic Minimal Surfaces:
http://www.susqu.edu/brakke/evolver/examples/periodic/periodic.html
Minimal surfaces have become an area of intense mathematical and scientific study over the past 15 years, specifically
in the areas of molecular engineering and materials science, due to their anticipated nanotechnology applications.
http://en.wikipedia.org/wiki/Category:Minimal_surfaces
Mathematicians Maximize Knowledge of Minimal Surfaces:
http://www.physorg.com/news74872374.html
Solution of a Class of Minimal Surface Problem with Obstacle:
http://ccsenet.org/journal/index.php/jmr/article/viewFile/180/139
Nonlinear Optimization: Algorithms and Models:
http://www.princeton.edu/~rvdb/542/lectures/lec23.pdf
Optimization: Linear Programming, Operations Research, Path Integrals, etc.:
http://home.att.net/~numericana/answer/optimize.htm
Minimal Surfaces as Webs of Optimal Transportation Flows:
http://www.informaworld.com/smpp/979737740-73230552/content~db=all~content=a772432399
Bubble: Science Gallery:
http://sciencegallery.com/bubble
Minimal Surfaces and Multifunctionality:
Triply periodic minimal surfaces are objects of great interest to physical scientists, biologists and mathematicians. It
has recently been shown that triply periodic two-phase bicontinuous composites with interfaces that are the Schwarz
primitive (P) and diamond (D) minimal surfaces are not only geometrically extremal but extremal for simultaneous
transport of heat and electricity. More importantly, here we further establish the multifunctionality of such two-phase
systems by showing that they are also extremal when a competition is set up between the effective bulk modulus and
the electrical (or thermal) conductivity of the composite. The implications of our findings for materials science and
biology, which provides the ultimate multifunctional materials, are discussed.
Keywords: minimal surfaces; multifunctionality; composites; elastic moduli; conductivity; optimization
http://cherrypit.princeton.edu/papers/paper-221.pdf
A Measure-Theoretic Foundation of Rule Interestingness Evaluation:
http://www2.cs.uregina.ca/~yyao/PAPERS/measurement_foundation.pdf
On Semantic Interoperability and the Flow of Information:
http://ftp1.de.freebsd.org/Publications/CEUR-WS/Vol-82/SI_paper_14.pdf
Ontology Mapping Based on Rough Formal Concept Analysis:
http://www2.computer.org/portal/web/csdl/doi/10.1109/AICT-ICIW.2006.142
Ontology Mapping: The State of the Art:
http://drops.dagstuhl.de/opus/volltexte/2005/40/pdf/04391.KalfoglouYannis.Paper.40.pdf
A Channel-Theoretic Foundation for Ontology Coordination:
http://users.ecs.soton.ac.uk/yk1/MCN04-schorlemmer-kalfoglou.pdf
Using Formal Concept Analysis and Information Flow for Modelling and Sharing Common Semantics: Lessons Learnt
and Emergent Issues:
http://users.ecs.soton.ac.uk/yk1/iccs2005-final-kalfoglou.pdf
Why Ontologies are not Enough for Knowledge Sharing:
http://citeseer.ist.psu.edu/old/371242.html
The Information Flow Framework (IFF) Foundation Ontology:
http://suo.ieee.org/Kent-IFF.pdf

Combining Rough Set Theory and Instance Selection in Ontology Mapping:
http://cat.inist.fr/?aModele=afficheN&cpsidt=20586027
Advanced Knowledge Technologies:
http://www.aktors.org/akt/
A Measure Theoretic Approach to Information Retrieval:
http://portal.acm.org/citation.cfm?id=1254873
Coalgebras, Chu Spaces, and Representations of Physical Systems:
The main results of our investigations can be summarized as follows:
• Firstly, at the general level, we look at the comparative strengths and weaknesses of the two formalisms. On our
analysis, the key feature that Chu spaces have and coalgebras lack is contravariance; the key feature
which coalgebras have and Chu spaces lack is extension in time. There are some interesting secondary issues as well,
notably symmetry vs. rigidity.
• Formally, we introduce an indexed structure for coalgebras to compensate for the lack of contravariance, and show
how this can be used to represent a wide class of physical systems in coalgebraic terms. In particular, we show how a
universal model for quantum systems can be constructed as a final coalgebra. This opens the way to the use of
methods such as coalgebraic logic in the study of physical systems. It also suggests how coalgebra can mediate
between ontic and epistemic views of the states of physical systems.
• We also define an analogous indexed structure for Chu spaces, and use this to obtain a novel categorical description
of the category of Chu spaces. We use the indexed structures of Chu spaces and coalgebras over a common base to
define a truncation functor from coalgebras to Chu spaces.
• We use this truncation functor to lift the full and faithful representation of the groupoid of physical symmetries on
Hilbert spaces into Chu spaces, obtained in [1], to the coalgebraic semantics.
http://arxiv.org/abs/0910.3959
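The paper's indexed and coalgebraic constructions are not reproduced here; as a minimal sketch of the raw ingredient, a Chu space over K = {0, 1} can be encoded as a points-by-states satisfaction matrix, and the contravariance the authors highlight shows up in the morphism condition, where the state map runs backwards. The two small spaces and the candidate morphism below are hypothetical.

import numpy as np

# A Chu space over K = {0, 1}: rows are points, columns are states,
# r[a, x] records whether point a satisfies state x.
class Chu:
    def __init__(self, matrix):
        self.r = np.asarray(matrix)

def is_chu_morphism(A, B, f, g):
    """Chu morphism (f, g): A -> B with f on points (forward) and g on B's
    states (backward).  Adjointness: B.r[f(a), y] == A.r[a, g(y)] for all a, y."""
    n_a, _ = A.r.shape
    _, m_b = B.r.shape
    return all(B.r[f(a), y] == A.r[a, g(y)]
               for a in range(n_a) for y in range(m_b))

A = Chu([[1, 0],
         [0, 1]])
B = Chu([[1, 0, 1],
         [0, 1, 0]])
f = lambda a: a              # points map identically
g = lambda y: [0, 1, 0][y]   # states of B pull back to states of A
print(is_chu_morphism(A, B, f, g))   # True for this toy pair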
Extended Memory Evolutive Systems in a Hyperstructure Context:
http://www.springerlink.com/content/77tp885145x77827/
Applications of Hyperstructure Theory:
There are applications to the following subjects: 1) geometry; 2) hypergraphs; 3) binary relations; 4) lattices; 5) fuzzy
sets and rough sets; 6) automata; 7) cryptography; 8) median algebras, relation algebras; 9) combinatorics; 10) codes;
11) artificial intelligence; 12) probabilities.
http://books.google.co.uk/books?id=uvCrZ3iGur4C
Unifying themes in complex systems: 2.6: Relevance of Hyperstructure Theory:

In (section 2.2) there was reference to 'transformations coming about in response to assessments of the observed
performance.'
http://books.google.com/books?id=OJwuFoWhJTIC&lpg=PA299&dq=hyperstructure%20theory&pg=PA299#v=onepage&q=hyperstructure%20theory&f=false

Summary of PSYCOLOQUY Topic Hyperstructure:


http://www.cogsci.ecs.soton.ac.uk/cgi/psyc/ptopic?topic=Hyperstructure
On Emergence and Explanation:
http://www.kli.ac.at/theorylab/Keyword/H/Hyperstructures.html
Hyperstructures, Topology and Datasets:
http://www.springerlink.com/content/k5581852h53305q7/
Hyperstructures as Abstract Matter:
http://econpapers.repec.org/article/wsiacsxxx/v_3a09_3ay_3a2006_3ai_3a03_3ap_3a157-182.htm
Cities of the Future: The Hyperstructure Concept:
http://www.technewsworld.com/story/Cities-of-the-Future-Part -1-61226. html
Unifying Themes in Complex Systems, Volume 3:
http://books.google.com/books?id=GUFm6dCWwoEC&source=gbs_navlinks_s
Evolutionary Economics and Environmental Policy: Survival of the Greenest:
http://books.google.com/books?id=-pNaaizl5tcC&source=gbs_navlinks_s
Minimal Surfaces and Particles in 3-Manifolds:
http://arxiv.org/abs/math.DG/0511441
Monopoles and Four-Manifolds (Edward Witten):
http://www.mrlonline.org/mrl/1994-001-006/1994-001-006-013.pdf
Nigel Hitchin:
http://en.wikipedia.org/wiki/Nigel_Hitchin
Vladimir Drinfel'd:
http://en.wikipedia.org/wiki/Drinfeld
Ruziewicz Problem:
http://en.wikipedia.org/wiki/Ruziewicz_problem
Quasi-Hopf Algebra:
http://en.wikipedia.org/wiki/Quasi-Hopf_algebra
Hom-Tensor Adjunctions for Quasi-Hopf Algebras:
http://deposit.ddb.de/cgi-bin/dokserv?idn=991213734&dok_var=d1&dok_ext=pdf&filename=991213734.pdf
Dror Bar-Natan:
http://www.math.toronto.edu/~drorbn/LOP.html
Generalized Complex Structure:
http://en.wikipedia.org/wiki/Generalized_complex_structure
Almost Complex Manifold:
http://en.wikipedia.org/wiki/Almost_complex_manifold
The Computational Manifold Approach to Consciousness and Symbolic Processing in the Cerebral Cortex:
Abstract—A new abstract model of computation, the computational manifold, provides a framework for approaching the
problems of consciousness, awareness, cognition and symbolic processing in the central nervous system. Physical
properties involving space, time and frequency, such as the surface of the skin, sound spectrograms, visual images,
and the muscle cross-sections are included in the state of manifold automata. The Brodmann areas are modeled as
recurrent image associations implemented as neurotransmitter fields. Symbols are defined by state transition behavior
near reciprocal-image attractors in dynamical systems. Control masks overlay the images and regulate awareness of
the environment and cognitive reflection.
Index Terms—natural intelligence, consciousness, perception, cognition, models of computation.
Manifolds are the basis of differential geometry where they are defined as spaces which are locally homeomorphic to
R^n [1]. For example, curved lines are one-dimensional manifolds and smooth surfaces are two-dimensional manifolds.
We present a formal definition of Computational Manifold Automata (CMA) as an abstract model of computation. The
definition is similar to the specification of finite state machines [2], formal language theory [3] or Turing Machines [4]
but with the additional mathematical structure of continuous topologies. Within this framework it is possible to unify the
immediate awareness of perception and motor control with the reflective processing of language and cognition.
http://gmanif.com/pubs/CMAC.pdf
A Unified System of Computational Manifolds:

Fundamental properties of the world in which all life evolved, such as space, time, force, energy and audio frequencies,
are modeled in physics and engineering with differentiable manifolds. A central question of neurophysiology is how
information about these quantities is encoded and processed. While the forces of evolution are complex and often
contradictory, the argument can be made that if all other factors are equal, an organism with a more accurate mental
representation of the world has a better chance of survival. This implies that the representation in the central nervous
system (CNS) of a physical phenomenon should have the same intrinsic mathematical structure as the phenomenon
itself. The philosophical principle, put forth by Monod (1971) and others, that under certain conditions, biological
evolution will form designs that are in accordance with the laws of nature is referred to as teleonomy.
http://gmanif.com/pubs/TR-CIS-0602-03. pdf
Linear logic is a substructural logic proposed by Jean-Yves Girard as a refinement of classical and intuitionistic logic,
joining the dualities of the former with many of the constructive properties of the latter. [1] Although the logic has also
been studied for its own sake, more broadly, ideas from linear logic have been influential in fields such as programming
languages, game semantics, and quantum physics [2], particularly because of its emphasis on resource-boundedness,
duality, and interaction.
http://en.wikipedia.org/wiki/Linear_logic
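To make the resource-boundedness mentioned above concrete, here is a toy illustration (not Girard's proof theory, just a multiset reading of linear contexts, which is an informal simplification): hypotheses are resources that a derivation step consumes exactly once.

from collections import Counter

def consume(context, needed):
    """Linear-style use of hypotheses: each needed resource is removed from the
    multiset context exactly once; reuse (contraction) is not allowed."""
    ctx = Counter(context)
    need = Counter(needed)
    if any(ctx[r] < n for r, n in need.items()):
        return None                      # not derivable: a resource is missing
    ctx.subtract(need)
    return +ctx                          # leftover resources

# "coin -o coffee": one coin buys one coffee, and the coin is gone afterwards.
print(consume(["coin", "coin"], ["coin"]))   # Counter({'coin': 1})
print(consume(["coin"], ["coin", "coin"]))   # None: a single coin cannot be spent twice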

From Cause to Causation presents both a critical analysis of C.S. Peirce's conception of causation, and a novel
approach to causation, based upon the semeiotic of Peirce.
http://books.google.com/books?id=q58X1qnzBIsC&source=gbs_navlinks_s
Double Bubble Universe:
http://doublebubbleuniverse.com/
When Zero Equals Infinity:
http://www.everythingforever.com/st_math.htm
Fractal Foam Model of Universes:
http://www.bautforum.com/against-mainstream/61052-fractal-foam-model-universes.html
Memory Evolutive Systems: Hierarchy, Emergence, Cognition:
The theory of Memory Evolutive Systems represents a mathematical model for natural open self-organizing systems,
such as biological, sociological or neural systems. In these systems, the dynamics are modulated by the cooperative
and/or competitive interactions between the global system and a net of internal Centers of Regulation (CR) with
differential access to a central hierarchical Memory. The MES proposes a mathematical model for autonomous
evolutionary systems and is based on the Category Theory of mathematics. It provides a framework to study and
possibly simulate the structure of "living systems" and their dynamic behavior. MES explores what characterizes a
complex evolutionary system, what distinguishes it from inanimate physical systems, its functioning and evolution in
time, from its birth to its death. The behavior of this type of system depends heavily on its former experiences, and a
model representing the system over a period of time, could anticipate later behavior and perhaps even predict some
evolutionary alternatives. The role of the MES model will be two-fold: theoretical, for a comprehension of a fundamental
nature and practical, for applications in biology, medicine, sociology, ecology, economy, meteorology, and other
sciences.
http://books.google.com/books?id=OqcYQbY79GMC&source= gbs_navlinks_s
Introduction to Model Theory:
This article introduces some of the basic concepts and results from model theory, starting from scratch. The topics
covered are tailored to the model theory of fields and later articles. I will be using algebraically closed fields to
illustrate most of these ideas. The tools described are quite basic; most of the material is due either to Alfred Tarski or
Abraham Robinson. At the end I give some general references.
http://online.wsj.com/article/SB10001424052748703740004574513870490836470.html
The Resolution of Newcomb's Paradox:
http://www.megasociety.net/noesis/44/newcomb.html
"Nature does not necessarily use the maths already in maths books, hence theoretical physicists should be
prepared to explore ... all of pure mathematics"
This revealing quote from page 112 of On Space and Time was presented as a reply to physicists who attack
mathematicians while turning to maths books for structures to use in their theories, as if mathematics is a resource
rather than part of the creative process. The subtle interplay between the creativity of pure mathematics and the fact-
driven agenda of physics forms the basis of a general philosophy of Relative Realism in which Majid argues that the
nature of Physical Reality is not fundamentally different from the way that topics in Pure Mathematics are on the one
hand created by definitions and on the other hand `out there' waiting to be invented. Majid gives the more everyday
example of the way that the reality experienced in a game of chess is created by the rules of chess and the choice to
abide by them while at the same time, on another level, the rules of chess were themselves a reality waiting to be
discovered by those seeking to invent board games. The general picture leads to a dualism between experiment and
theory or `principle of representation-theoretic self-duality' in which Majid argues that `the search for the ultimate theory
of physics is the search for self-dual structures in a self-dual category'. Although not accepted by professional
philosophers of science, the philosophy has provided a point of view behind most of his research work [7] .
http://en.wikipedia.org/wiki/Shahn_Majid
Algebraic Approach to Quantum Gravity I: Relative Realism:
In the first of three articles, we review the philosophical foundations of an approach to quantum gravity based on a
principle of representation-theoretic duality and a vaguely Kantian-Buddhist perspective on the nature of
physical reality which I have called 'relative realism'. Central to this is a novel answer to Plato's cave problem in
which both the world outside the cave and the 'set of possible shadow patterns' in the cave have equal status. We
explain the notion of constructions and 'co'constructions in this context and how quantum groups arise naturally as a
microcosm for the unification of quantum theory and gravity. More generally, reality is 'created' by choices made
and forgotten that constrain our thinking much as mathematical structures have a reality created by a choice of axioms,
but the possible choices are not arbitrary and are themselves elements of a higher level of reality. In this way the
factual 'hardness' of science is not lost while at the same time the observer is an equal partner in the process. We
argue that the 'ultimate laws' of physics are then no more than the rules of looking at the world in a certain self-dual
way, or conversely that going to deeper theories of physics is a matter of letting go of more and more assumptions. We
show how this new philosophical foundation for quantum gravity leads to a self-dual and fractal like structure that
informs and motivates the concrete research reviewed in parts II,III. Our position also provides a kind of explanation of
why things are quantized and why there is gravity in the first place, and possibly why there is a cosmological constant.
http://philsci-archive.pitt.edu/archive/00003345/01/qg1.pdf
In ref. [7], S. Majid presents the following `thesis' : ``(roughly speaking) physics polarises down the middle into two
parts, one which represents the other, but that the latter equally represents the former, i.e. the two should be treated on
an equal footing. The starting point is that Nature after all does not know or care what mathematics is already in
textbooks. Therefore the quest for the ultimate theory may well entail, probably does entail, inventing entirely new
mathematics in the process. In other words, at least at some intuitive level, a theoretical physicist also has to be a pure
mathematician. Then one can phrase the question `what is the ultimate theory of physics?' in the form `in the tableau
of all mathematical concepts past present and future, is there some constrained surface or subset which is called
physics?' Is there an equation for physics itself as a subset of mathematics? I believe there is and if it were to be
found it would be called the ultimate theory of physics. Moreover, I believe that it can be found and that it has a lot to
do with what is different about the way a physicist looks at the world compared to a mathematician...We can then try to
elevate the idea to a more general principle of representation-theoretic self-duality, that a fundamental theory of
physics is incomplete unless such a role-reversal is possible. We can go further and hope to fully determine the
(supposed) structure of fundamental laws of nature among all mathematical structures by this self-duality condition.
Such duality considerations are certainly evident in some form in the context of quantum theory and gravity. The
situation is summarised to the left in the following diagram. For example, Lie groups provide the simplest examples of
Riemannian geometry, while the representations of similar Lie groups provide the quantum numbers of elementary
particles in quantum theory. Thus, both quantum theory and non-Euclidean geometry are needed for a self-dual
picture. Hopf algebras (quantum groups) precisely serve to unify these mutually dual structures.''
http://planetmath.org/encyclopedia/IHESOnTheFusionOfMathematicsAndTheoreticalPhysics2.html
A Bialgebraic Approach to Automata and Formal Language Theory:

A bialgebra is a structure which is simultaneously an algebra and a coalgebra, such that the algebraic and coalgebraic
parts are compatible. Bialgebras are usually studied over a commutative ring. In this paper, we apply the
defining diagrams of algebras, coalgebras, and bialgebras to categories of semimodules and semimodule
homomorphisms over a commutative semiring. We then treat automata as certain representation objects of algebras
and formal languages as elements of dual algebras of coalgebras. Using this perspective, we demonstrate many
analogies between the two theories. Finally, we show that there is an adjunction between the category of "algebraic"
automata and the category of deterministic automata. Using this adjunction, we show that K-linear automaton
morphisms can be used as the sole rule of inference in a complete proof system for automaton equivalence.
http://arxiv.org/PS_cache/arxiv/pdf/0807/0807.4553v4.pdf
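The categorical adjunction of the paper is not reproduced here; as a minimal sketch of the K-linear automaton idea it mentions, the following treats states as spanning a free module, letters as matrices, and the weight of a word as a product of those matrices sandwiched between initial and final vectors (the reals stand in for a general semiring, and the "count the a's" example is hypothetical).

import numpy as np

class LinearAutomaton:
    """States span a free (semi)module; each letter acts as a matrix.
    The weight of a word is  initial @ M(w1) @ ... @ M(wn) @ final."""
    def __init__(self, init, mats, final):
        self.init, self.mats, self.final = np.asarray(init), mats, np.asarray(final)

    def weight(self, word):
        v = self.init.copy()
        for a in word:
            v = v @ self.mats[a]
        return float(v @ self.final)

# Hypothetical 2-state example over the reals: the weight of a word is the
# number of 'a's it contains (a simple formal power series).
M = {
    "a": np.array([[1.0, 1.0], [0.0, 1.0]]),
    "b": np.array([[1.0, 0.0], [0.0, 1.0]]),
}
aut = LinearAutomaton(init=[1.0, 0.0], mats=M, final=[0.0, 1.0])
print(aut.weight("abab"), aut.weight("bbb"))   # 2.0 0.0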
Automata, computability and complexity: theory and applications:
http://books.google.com/books?id=lIuu53IcKWoC&source=gbs_navlinks_s

Isotelesis: a principle similar to "the boundary of a boundary".


Bianchi Identities and the Boundary of a Boundary:
http://books.google.com/books?id=w4Gigq3tY1kC&lpg=PA364&ots=CX7KHEpDeL&dq=boundary%20of%20a%20boundary%20is%20zero&pg=PA364#v=onepage&q=boundary%20of%20a%20boundary%20is%20zero&f=false
Stratified Probabilistic Automata:
http://books.google.com/books?id=Tv9nwzSWi1kC&lpg=PA15&ots=DIqCTrP_b7&dq=stratified%20automaton&pg=PA14#v=onepage&q=stratified%20automaton&f=false
Cellular automata models of stratified dispersal
A cellular automaton is a grid of cells (usually in a 2-dimensional space), in which each cell is characterized by a particular
state. The dynamics of each cell are defined by transition rules which specify the future state of a cell as a function of its
previous state and the states of neighboring cells. Traditional cellular automata considered close neighborhood cells
only. However, in ecological applications it is convenient to consider more distant neighborhoods within a specified
distance from the cell.
The figure above shows 3 basic rules for the dynamics of cellular automata that simulates stratified dispersal:
1. Stochastic long-distance jumps
2. Continuous local dispersal
3. Population growth (population numbers are multiplied by R)
Results of the cellular automaton simulation for several sequential time steps are shown in the figure at the right. It is seen how isolated
colonies become established, grow, and then coalesce. This model was used for prediction of the barrier-zone effect on the rate of
population spread, and the results were similar to those obtained with the metapopulation model.

http://home.comcast.net/~sharov/PopEcol/lec12/stratdsp.html
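A minimal sketch of the three rules above (the grid size, growth rate R, jump probability, and local-dispersal kernel are placeholder choices, not those of the page linked above):

import numpy as np

rng = np.random.default_rng(1)
n, R, jump_prob = 100, 1.5, 0.01            # grid size, growth rate, jump rate (assumed values)
pop = np.zeros((n, n)); pop[n // 2, n // 2] = 1.0   # single founding colony

def step(pop):
    # 2. Continuous local dispersal: spread part of each cell to its 4 neighbours.
    local = 0.2 * (np.roll(pop, 1, 0) + np.roll(pop, -1, 0) +
                   np.roll(pop, 1, 1) + np.roll(pop, -1, 1)) + 0.2 * pop
    # 1. Stochastic long-distance jumps: rare propagules land on random cells.
    jumps = np.zeros_like(pop)
    if rng.random() < jump_prob * pop.sum():
        jumps[rng.integers(n), rng.integers(n)] += 1.0
    # 3. Population growth: numbers are multiplied by R (with a local ceiling).
    return np.minimum((local + jumps) * R, 100.0)

for t in range(50):
    pop = step(pop)
print(int((pop > 1.0).sum()), "occupied cells after 50 steps")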
Links Between Probabilistic Automata and Hidden Markov Models:
http://www.info.ucl.ac.be/~pdupont/pdupont/pdf/HMM_PA_pres_n4.pdf
Probabilistic Automata: System Types, Parallel Composition and Comparison:
http://reference.kfupm.edu.sa/content/p/r/probabilistic_automata__system_types__pa_1835608.pdf
Stratified Petri Nets:
Abstract: We introduce a subclass of Valk's self-modifying nets. The considered nets appear as stratified sums of
ordinary nets and they arise as a counterpart to cascade products of automata via the duality between automata
and nets based on regions in automata. Nets in this class, called stratified nets, cannot exhibit circular dependences
between places: inscriptions on flow arcs attached to a given place depend at most on the contents of places in
lower layers. Therefore, the synthesis problem has similar degrees of complexity for (ordinary) nets and for stratified
nets, hence it is tractable.

Key-words: Selfmodifying Petri Nets, Cascade Product, Net Synthesis Problem.


http://reference.kfupm.edu.sa/content/p/r/probabilistic_automata__system_types__pa_1835608.pdf
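The synthesis construction of the paper is not reproduced here; the toy firing rule below only illustrates the stratification constraint stated in the abstract, namely that an arc inscription attached to a place may read the markings of lower-layer places only (the places, layers and inscription are hypothetical).

# Toy self-modifying net in the stratified spirit: the inscription on an arc
# attached to a place may depend only on markings of places in lower layers.
marking = {"p0": 3, "p1": 0}          # p0 sits in layer 0, p1 in layer 1

def fire_t(m):
    """Transition t: consumes one token from p0 (ordinary constant arc) and
    puts f(m) tokens into p1, where f reads only the lower-layer place p0."""
    if m["p0"] < 1:
        return None                   # t is not enabled
    produced = m["p0"]                # layer-1 arc inscription depends on layer 0 only
    return {"p0": m["p0"] - 1, "p1": m["p1"] + produced}

print(fire_t(marking))                # {'p0': 2, 'p1': 3}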
Complementation of Büchi Automata Revisited:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.29.9287&rep=rep1&type=pdf
Zero to Infinity: the Foundations of Physics:
http://books.google.com/books?id=cOnjDfQQX0UC&lpg=PA1&ots=mw7CEVin_K&dq=zero-totality&pg=PA1#v=onepage&q=zero-totality&f=false
Hyperincursive Stack Memory With Chaos:
http://books.google.com/books?id=97og4KLGGR0C&lpg=PA350&dq=hyperincursive&pg=PA346#v=onepage&q=hyperincursive&f=false
Three Forms of Truth Relativism:
1. Modeling Matters of Taste
2. Modeling Epistemic Modality
3. Relativism: Indexical, Propositional or Factual?
http://www.duke.edu/~ieinheus/papers/tftr.pdf
Introduction to the CTMU:
The real universe has always been theoretically treated as an object, and specifically as the composite type of object
known as a set. But an object or set exists in space and time, and reality does not. Because the real universe by
definition contains all that is real, there is no "external reality" (or space, or time) in which it can exist or have been
"created". We can talk about lesser regions of the real universe in such a light, but not about the real universe as a
whole. Nor, for identical reasons, can we think of the universe as the sum of its parts, for these parts exist solely within
a spacetime manifold identified with the whole and cannot explain the manifold itself. This rules out pluralistic
explanations of reality, forcing us to seek an explanation at once monic (because nonpluralistic) and holistic (because
the basic conditions for existence are embodied in the manifold, which equals the whole). Obviously, the first step
towards such an explanation is to bring monism and holism into coincidence.

When theorizing about an all-inclusive reality, the first and most important principle is containment, which simply tells
us what we should and should not be considering. Containment principles, already well known in cosmology, generally
take the form of tautologies; e.g., "The physical universe contains all and only that which is physical." The predicate
"physical", like all predicates, here corresponds to a structured set, "the physical universe" (because the universe has
structure and contains objects, it is a structured set). But this usage of tautology is somewhat loose, for it technically
amounts to a predicate-logical equivalent of propositional tautology called autology, meaning self-description.
Specifically, the predicate physical is being defined on topological containment in the physical universe, which is tacitly
defined on and descriptively contained in the predicate physical, so that the self-definition of "physical" is a two-step
operation involving both topological and descriptive containment. While this principle, which we might regard as a
statement of "physicalism", is often confused with materialism on the grounds that "physical" equals "material", the
material may in fact be only a part of what makes up the physical. Similarly, the physical may only be a part of what
makes up the real. Because the content of reality is a matter of science as opposed to mere semantics, this issue can
be resolved only by rational or empirical evidence, not by assumption alone.

Can a containment principle for the real universe be formulated by analogy with that just given for the physical
universe? Let's try it: "The real universe contains all and only that which is real." Again, we have a tautology, or more
accurately an autology, which defines the real on inclusion in the real universe, which is itself defined on the predicate
real. This reflects semantic duality, a logical equation of predication and inclusion whereby perceiving or semantically
predicating an attribute of an object amounts to perceiving or predicating the object's topological inclusion in the set or
space dualistically corresponding to the predicate. According to semantic duality, the predication of the attribute real on
the real universe from within the real universe makes reality a self-defining predicate, which is analogous to a self-
including set. An all-inclusive set, which is by definition self-inclusive as well, is called "the set of all sets". Because it is
all-descriptive as well as self-descriptive, the reality predicate corresponds to the set of all sets. And because the self-
definition of reality involves both descriptive and topological containment, it is a two-stage hybrid of universal autology
and the set of all sets.

Now for a brief word on sets. Mathematicians view set theory as fundamental. Anything can be considered an object,
even a space or a process, and wherever there are objects, there is a set to contain them. This "something" may be a
relation, a space or an algebraic system, but it is also a set; its relational, spatial or algebraic structure simply makes it
a structured set. So mathematicians view sets, broadly including null, singleton, finite and infinite sets, as fundamental
objects basic to meaningful descriptions of reality. It follows that reality itself should be a set…in fact, the largest set of
all. But every set, even the largest one, has a powerset which contains it, and that which contains it must be larger (a
contradiction). The obvious solution: define an extension of set theory incorporating two senses of "containment" which
work together in such a way that the largest set can be defined as "containing" its powerset in one sense while being
contained by its powerset in the other. Thus, it topologically includes itself in the act of descriptively including itself in
the act of topologically including itself..., and so on, in the course of which it obviously becomes more than just a set.

In the Cognitive-Theoretic Model of the Universe or CTMU, the set of all sets, and the real universe to which it
corresponds, take the name (SCSPL) of the required extension of set theory. SCSPL, which stands for Self-
Configuring Self-Processing Language, is just a totally intrinsic, i.e. completely self-contained, language that is
comprehensively and coherently (self-distributively) self-descriptive, and can thus be model-theoretically identified as
its own universe or referent domain. Theory and object go by the same name because unlike conventional ZF or NBG
set theory, SCSPL hologically infuses sets and their elements with the distributed (syntactic, metalogical) component of
the theoretical framework containing and governing them, namely SCSPL syntax itself, replacing ordinary set-theoretic
objects with SCSPL syntactic operators. The CTMU is so-named because the SCSPL universe, like the set of all sets,
distributively embodies the logical syntax of its own descriptive mathematical language. It is thus not only self-
descriptive in nature; where logic denotes the rules of cognition (reasoning, inference), it is self-cognitive as well. (The
terms "SCSPL" and "hology" are explained further below.)

An act is a temporal process, and self-inclusion is a spatial relation. The act of self-inclusion is thus "where time
becomes space"; for the set of all sets, there can be no more fundamental process. No matter what else happens in
the evolving universe, it must be temporally embedded in this dualistic self-inclusion operation. In the CTMU, the self-
inclusion process is known as conspansion and occurs at the distributed, Lorentz-invariant conspansion rate c, a time-
space conversion factor already familiar as the speed of light in vacuo (conspansion consists of two alternative phases
accounting for the wave and particle properties of matter and affording a logical explanation for accelerating cosmic
expansion). When we imagine a dynamic self-including set, we think of a set growing larger and larger in order to
engulf itself from without. But since there is no "without" relative to the real universe, external growth or reference is not
an option; there can be no external set or external descriptor. Instead, self-inclusion and self-description must occur
inwardly as the universe stratifies into a temporal sequence of states, each state topologically and computationally
contained in the one preceding it (where the conventionally limited term computation is understood to refer to a more
powerful SCSPL-based concept, protocomputation, involving spatiotemporal parallelism). On the present level of
discourse, this inward self-inclusion is the conspansive basis of what we call spacetime.

Every object in spacetime includes the entirety of spacetime as a state-transition syntax according to which its next
state is created. This guarantees the mutual consistency of states and the overall unity of the dynamic entity the real
universe. And because the sole real interpretation of the set-theoretic entity "the set of all sets" is the entire real
universe, the associated foundational paradoxes are resolved in kind (by attributing mathematical structure like that of
the universe to the pure, uninterpreted set-theoretic version of the set of all sets). Concisely, resolving the set-of-all-
sets paradox requires that (1) an endomorphism or self-similarity mapping D:S-->r∈S be defined for the set of all sets S
and its internal points r; (2) there exist two complementary senses of inclusion, one topological [S ⊃_t D(S)] and one
predicative [D(S) ⊃_d S], that allow the set to descriptively "include itself" from within, i.e. from a state of topological self-
inclusion (where ⊃_t denotes topological or set-theoretic inclusion and ⊃_d denotes descriptive inclusion, e.g. the
inclusion in a language of its referents); and (3) the input S of D be global and structural, while the output D(S) = (r ⊃_d
S) be internal to S and play a syntactic role. In short, the set-theoretic and cosmological embodiments of the self-
inclusion paradox are resolved by properly relating the self-inclusive object to the descriptive syntax in terms of which it
is necessarily expressed, thus effecting true self-containment: "the universe (set of all sets) is that which topologically
contains that which descriptively contains the universe (set of all sets)."

This characterizes a system that consistently perceives itself and develops its own structure from within via hology, a
2-stage form of self-similarity roughly analogous to holography. (Hology is a logico-cybernetic form of self-similarity in
which the global structure of a self-contained, self-interactive system doubles as its distributed self-transductive syntax;
it is justified by the obvious fact that in a self-contained system, no other structure is available for that purpose.) The
associated conspansive mapping D is called incoversion in the spatiotemporally inward direction and coinversion in the
reverse (outward, D^(-1)) direction. Incoversion carries global structure inward as state-recognition and state-
transformation syntax, while coinversion projects syntactic structure outward in such a way as to recognize existing
structure and determine future states in conformance with it. Incoversion is associated with an operation called
requantization, while coinversion is associated with a complementary operation called inner expansion. The alternation
of these operations, often referred to as wave-particle duality, comprises the conspansion process. The Principle of
Conspansive Duality then says that what appears as cosmic expansion from an interior (local) viewpoint appears as
material and temporal contraction from a global viewpoint. Because metric concepts like "size" and "duration" are
undefined with respect to the universe as a whole, the spacetime metric is defined strictly intrinsically, and the usual
limit of cosmological regress, a pointlike cosmic singularity, becomes the closed spacetime algebra already identified
as SCSPL.

Thus, the real universe is not a static set, but a dynamic process resolving the self-inclusion paradox. Equivalently,
because any real explanation of reality is contained in reality itself, reality gives rise to a paradox unless regarded as
an inclusory self-mapping. This is why, for example, category theory is increasingly preferred to set theory as a means
of addressing the foundations of mathematics; it centers on invariant relations or mappings between covariant or
contravariant (dually related) objects rather than on static objects themselves. For similar reasons, a focus on the
relative invariants of semantic processes is also well-suited to the formulation of evolving theories in which the
definitions of objects and sets are subject to change; thus, we can speak of time and space as equivalent to cognition
and information with respect to the invariant semantic relation processes, as in "time processes space" and "cognition
processes information". But when we define reality as a process, we must reformulate containment accordingly.
Concisely, reality theory becomes a study of SCSPL autology naturally formulated in terms of mappings. This is done
by adjoining to logic certain metalogical principles, formulated in terms of mappings, that enable reality to be described
as an autological (self-descriptive, self-recognizing/self-processing) system.

The first such principle is MAP, acronymic for Metaphysical Autology Principle. Let S be the real universe, and let T =
T(S) be its theoretical description or "TOE". MAP, designed to endow T and S with mathematical closure, simply states
that T and S are closed with respect to all internally relevant operations, including recognition and description. In terms
of mappings, this means that all inclusional or descriptive mappings of S are automorphisms (e.g., permutations or
foldings) or endomorphisms (self-injections). MAP is implied by the unlimited scope, up to perceptual relevance, of the
universal quantifier implicitly attached to reality by the containment principle. With closure thereby established, we can
apply techniques of logical reduction to S without worrying about whether the lack of some external necessity will spoil
the reduction. In effect, MAP makes T(S) "exclusive enough" to describe S by excluding as a descriptor of S anything
not in S. But there still remains the necessity of providing S with a mechanism of self-description.

This mechanism is provided by another metalogical principle, the M=R or Mind Equals Reality Principle, that identifies
S with the extended cognitive syntax D(S) of the theorist. This syntax (system of cognitive rules) not only determines
the theorist's perception of the universe, but bounds his cognitive processes and is ultimately the limit of his
theorization (this relates to the observation that all we can directly know of reality are our perceptions of it). The
reasoning is simple; S determines the composition and behavior of objects (or subsystems) s in S, and thus comprises
the general syntax (structural and functional rules of S) of which s obeys a specific restriction. Thus, where s is an ideal
observer/theorist in S, S is the syntax of its own observation and explanation by s. This is directly analogous to "the
real universe contains all and only that which is real", but differently stated: "S contains all and only objects s whose
extended syntax is isomorphic to S." M=R identifies S with the veridical limit of any partial theory T of S [limT(S) =
D(S)], thus making S "inclusive enough" to describe itself. That is, nothing relevant to S is excluded from S ≅ D(S).

Mathematically, the M=R Principle is expressed as follows. The universe obviously has a structure S. According to the
logic outlined above, this structure is self-similar; S distributes over S, where "distributes over S" means "exists without
constraint on location or scale within S". In other words, the universe is a perfectly self-similar system whose overall
structure is replicated everywhere within it as a general state-recognition and state-transition syntax (as understood in
an extended computational sense). The self-distribution of S, called hology, follows from the containment principle, i.e.
the tautological fact that everything within the real universe must be described by the predicate "real" and thus fall
within the constraints of global structure. That this structure is completely self-distributed implies that it is locally
indistinguishable for subsystems s; it could only be discerned against its absence, and it is nowhere absent in S.
Spacetime is thus transparent from within, its syntactic structure invisible to its contents on the classical (macroscopic)
level. Localized systems generally express and utilize only a part of this syntax on any given scale, as determined by
their specific structures. I.e., where there exists a hological incoversion endomorphism D:S→{r∈S} carrying the whole
structure of S into every internal point and region of S, objects (quantum-geometrodynamically) embedded in S take
their recognition and state-transformation syntaxes directly from the ambient spatiotemporal background up to
isomorphism. Objects thus utilize only those aspects of D(S) of which they are structural and functional
representations.

The inverse D^(-1) of this map (coinversion) describes how an arbitrary local system s within S recognizes S at the object
level and obeys the appropriate "laws", ultimately giving rise to human perception. This reflects the fact that S is a self-
perceptual system, with various levels of self-perception emerging within interactive subsystems s (where perception is
just a refined form of interaction based on recognition in an extended computational sense). Thus, with respect to any
class {s} of subsystems of S, we can define a homomorphic submap d of the endomorphism D: d:S→{s} expressing
only that part of D to which {s} is isomorphic. In general, the s_i are coherent or physically self-interactive systems
exhibiting dynamical and informational closure; they have sometimes-inaccessible internal structures and dynamics
(particularly on the quantum scale), and are distinguishable from each other by means of informational boundaries
contained in syntax and comprising a "spacetime metric".

According to the above definitions, the global self-perceptor S is amenable to a theological interpretation, and its
contents {s} to "generalized cognitors" including subatomic particles, sentient organisms, and every material system in
between. Unfortunately, above the object level, the validity of s-cognition - the internal processing of sentient
subsystems s - depends on the specific cognitive functionability of a given s...the extent to which s can implicitly
represent higher-order relations of S. In General Relativity, S is regarded as given and complete; the laws of
mathematics and science are taken as pre-existing. On the quantum scale, on the other hand, laws governing the
states and distributions of matter and energy do not always have sufficient powers of restriction to fully determine
quantum behavior, requiring probabilistic augmentation in the course of quantum wavefunction collapse. This prevents
a given s, indeed anything other than S, from enclosing a complete nomology (set of laws); while a complete set of
laws would amount to a complete deterministic history of the universe, calling the universe "completely deterministic"
amounts to asserting the existence of prior determinative constraints. But this is a logical absurdity, since if these
constraints were real, they would be included in reality rather than prior or external to it (by the containment principle).
It follows that the universe freely determines its own constraints, the establishment of nomology and the creation of its
physical (observable) content being effectively simultaneous and recursive. The incoversive distribution of this
relationship is the basis of free will, by virtue of which the universe is freely created by sentient agents existing within
it.

Let's elaborate a bit. Consider the universe as a completely evolved perceptual system, including all of the perceptions
that will ultimately comprise it. We cannot know all of those perceptions specifically, but to the extent that they are
interactively connected, we can refer to them en masse. The set of "laws" obeyed by the universe is just a minimal set
of logical relations that suffices to make these perceptions noncontradictory, i.e. mutually consistent, and a distributed
set of laws is just a set of laws formulated in such a way that the formulation can be read by any part of the system S.
Obviously, for perceptions to be connected by laws, the laws themselves must be internally connected according to a
syntax, and the ultimate syntax of nomological connectedness must be globally valid; whatever the laws may be at any
stage of system evolution, all parts of S must be able to unambiguously read them, execute and be acted upon by
them, and recognize and be recognized as their referents ("unambiguously" implies that 2-valued logic is a primary
ingredient of nomology; its involvement is described by a third metalogical principle designed to ensure consistency,
namely MU or Multiplex Unity). This implies that the action and content of the laws are merged together in each part of
the system as a single (but dual-aspect) quantity, infocognition. The connectedness and consistency of infocognition is
maintained by refinement and homogenization as nomological languages are superseded by extensional
metalanguages in order to create and/or explain new data; because the "theory" SCSPL model-theoretically equates
itself to the real universe, its "creation" and causal "explanation" operations are to a certain extent identical, and the
SCSPL universe can be considered to create or configure itself by means of "self-theorization" or "self-explanation".

The simplest way to explain "connected" in this context is that every part of the (object -level) system relates to other
parts within an overall structural description of the system itself (to interpret "parts", think of events rather than objects;
objects are in a sense defined on events in a spatiotemporal setting). Obviously, any part which fails to meet this
criterion does not conform to a description of the system and thus is not included in it, i.e. not "connected to" the
system (on the other hand, if we were to insist that it is included or connected, then we would have to modify the
systemic description accordingly). For this description to be utile, it should be maximally compact, employing compact
predictive generalizations in a regular way appropriate to structural categories (e.g., employing general "laws of
physics"). Because such laws, when formulated in an "if conditions (a,b,c…) exist, then (X and Y or Z) applies" way,
encode the structure of the entire system and are universally applicable within it, the system is "self-distributed". In
other words, every part of the system can consistently interact with every other part while maintaining an integral
identity according to this ("TOE") formulation. Spatiotemporal relations can be skeletally depicted as edges in a graph
whose vertices are events (physical interactions), i.e. spacetime "points". In this sense, graph-theoretic connectivity
applies. But all those object-level connections must themselves be connected by more basic connections, the basic
connections must be connected by even more basic connections, and so on. Eventually - perhaps sooner than later -
we reach a basic level of connectivity whose syntax comprises a (partially undecidable) "ultimate nomology" for the
level of reality we're discussing.

Is this nomology, and the cognitive syntax in which it is expressed, wholly embodied by matter? In one sense the
answer is yes, because S is distributed over each and every material s∈S as the reality-syntax D(S). Thus, every axiom
and theorem of mathematics can be considered implicit in material syntax and potentially exemplified by an appropriate
material pattern, e.g. a firing of cerebral neurons. Against holism - the idea that the universe is more than the sum of its
parts - one can further object that the holistic entity in question is still a material ensemble, thus insinuating that even if
the universe is not the "sum" of its parts, it is still a determinate function of its parts. However, this fails to explain the
mutual consistency of object-syntaxes, without the enforcement of which reality would disintegrate due to perceptual
inconsistency. This enforcement function takes matter as its argument and must therefore be reposed in spacetime
itself, the universal substrate in which matter is unconditionally embedded (and as a geometrodynamic or quantum-
mechanical excitation of which matter is explained). So the background has logical ascendancy over derivative matter,
and this permits it to have aspects, like the power to enforce consistency, not expressible by localized interactions of
compact material objects (i.e., within the bounds of materialism as invoked regarding a putative lack of "material
evidence" for God, excluding the entire material universe).

On the other hand, might cognitive syntax reside in an external "ideal" realm analogous to Plato's world of
Parmenidean forms? Plato's ideal abstract reality is explicitly set apart from actual concrete reality, the former being an
eternal world of pure form and light, and the latter consisting of a cave on whose dirty walls shift murky, contaminated
shadows of the ideal world above. However, if they are both separate and in mutual correspondence, these two
realities both occupy a more basic joint reality enforcing the correspondence and providing the metric of separation. If
this more basic reality is then juxtaposed to another, then there must be a more basic reality still, and so on until finally
we reach the most basic level of all. At this level, there will (by definition) be no separation between the abstract and
concrete phases, because there will be no more basic reality to provide it or enforce a remote correspondence across
it. This is the inevitable logical terminus of "Plato's regress". But it is also the reality specified by the containment
principle, the scope of whose universal quantifier is unlimited up to perceptual relevance! Since it is absurd to adopt a
hypothesis whose natural logical extension is a negation of that hypothesis, we must assume that the ideal plane
coincides with this one…but again, not in a way necessarily accessible to identifiable physical operations. Rather,
physical reality is embedded in a more general or "abstract" ideal reality equating to the reality-syntax D(S), and the
syntax D(S) is in turn embedded in physical reality by incoversion. Thus, if D(S) contains supraphysical components,
they are embedded in S right along with their physical counterparts (indeed, this convention is already in restricted use
in string theory and M-theory, where unseen higher dimensions get "rolled up" to sub-Planck diameter).

What does this say about God? First, if God is real, then God inheres in the comprehensive reality syntax, and this
syntax inheres in matter. Ergo, God inheres in matter, and indeed in its spacetime substrate as defined on material and
supramaterial levels. This amounts to pantheism, the thesis that God is omnipresent with respect to the material
universe. Now, if the universe were pluralistic or reducible to its parts, this would make God, Who coincides with the
universe itself, a pluralistic entity with no internal cohesion. But because the mutual syntactic consistency of parts is
enforced by a unitary holistic manifold with logical ascendancy over the parts themselves - because the universe is a
dual-aspected monic entity consisting of essentially homogeneous, self-consistent infocognition - God retains
monotheistic unity despite being distributed over reality at large. Thus, we have a new kind of theology that might be
called monopantheism, or even more descriptively, holopantheism. Second, God is indeed real, for a coherent entity
identified with a self-perceptual universe is self-perceptual in nature, and this endows it with various levels of self-
awareness and sentience, or constructive, creative intelligence. Indeed, without a guiding Entity whose Self-awareness
equates to the coherence of self-perceptual spacetime, a self-perceptual universe could not coherently self-configure.
Holopantheism is the logical, metatheological umbrella beneath which the great religions of mankind are unknowingly
situated.

Why, if there exists a spiritual metalanguage in which to establish the brotherhood of man through the unity of
sentience, are men perpetually at each others' throats? Unfortunately, most human brains, which comprise a
particular highly-evolved subset of the set of all reality-subsystems, do not fire in strict S-isomorphism much
above the object level. Where we define one aspect of "intelligence" as the amount of global structure functionally
represented by a given s∈S, brains of low intelligence are generally out of accord with the global syntax D(S). This limits
their capacity to form true representations of S (global reality) by syntactic autology [d(S) ⊃d d(S)] and make rational
ethical calculations. In this sense, the vast majority of men are not well-enough equipped, conceptually speaking, to
form perfectly rational worldviews and societies; they are deficient in education and intellect, albeit remediably so in
most cases. This is why force has ruled in the world of man…why might has always made right, despite its marked
tendency to violate the optimization of global utility derived by summing over the sentient agents of S with respect to
space and time.

Now, in the course of employing deadly force to rule their fellows, the very worst element of humanity – the butchers,
the violators, i.e. those of whom some modern leaders and politicians are merely slightly -chastened copies – began to
consider ways of maintaining power. They lit on religion, an authoritarian priesthood of which can be used to set the
minds and actions of a populace for or against any given aspect of the political status quo. Others, jealous of the power
thereby consolidated, began to use religion to gather their own "sheep", promising special entitlements to those who
would join them…mutually conflicting promises now setting the promisees at each other's throats.

But although religion has often been employed for evil by cynics appreciative of its power, several things bear
notice. (1) The abuse of religion, and the God concept, has always been driven by human politics, and no one
is justified in blaming the God concept, whether or not they hold it to be real, for the abuses committed by evil
men in its name. Abusus non tollit usum. (2) A religion must provide at least emotional utility for its believers,
and any religion that stands the test of time has obviously been doing so. (3) A credible religion must contain
elements of truth and undecidability, but no elements that are verifiably false (for that could be used to
overthrow the religion and its sponsors). So by design, religious beliefs generally cannot be refuted by
rational or empirical means.

Does the reverse apply? Can a denial of God be refuted by rational or empirical means? The short answer is yes; the
refutation follows the reasoning outlined above. That is, the above reasoning constitutes not just a logical
framework for reality theory, but the outline of a logical proof of God's existence and the basis of a "logical
theology". While the framework serves other useful purposes as well, e.g. the analysis of mind and consciousness,
we'll save those for another time.

http://www.ctmu.org/
"Design theory, which traces its origins to traditional theological ―arguments from design‖ holding that nature was more
or less obviously designed by a preexisting intelligence, maintains that the observed complexity of biological structures
implies the involvement of empirically detectable intelligent causes in nature. Intelligent Design, the most recent
scientific outgrowth of Design Theory, is a scientific research program based on a more philosophically neutral, and
therefore scientific, search for instances of a clear, objective, standard form of biological complexity. According to
William Dembski, one of the movement‘s leading spokesmen, this has led to ―a theory of biological origins and
development‖ according to which ―intelligent [and empirically detectable] causes are necessary to explain the complex,
information-rich structures of biology.‖ In view of the informational nature of complexity, Dembski observes that
―information is not reducible to natural causes…the origin of information is best sought in intelligent causes. Intelligent

Brainstorm Page 44
―information is not reducible to natural causes…the origin of information is best sought in intelligent causes. Intelligent
design thereby becomes a theory for detecting and measuring information, explaining its origin, and tracing its flow.

One of the first things to note about the above definition is that it couples the implied definitions of intelligence,
causation and information to a greater extent than do most dictionaries, pointing in principle to a joint definition of all of
them. Since any good definition requires a model, one might be strongly tempted to infer on this basis that ID, as here
defined, has a well-defined model in which all of its constituent concepts are related. It may therefore come as a
surprise to many that perhaps the most frequent, or at any rate the most general, objection to ID in the wider
intellectual community is that it "has no model". According to its critics, it lacks any real-world interpretation specifying a
fundamental medium able to support it or a means by which to realize it.

Furthermore, its critics claim, its central hypothesis is not only beyond proof, but unrealistic and not amenable to
empirical confirmation.
In all fairness, it must be noted that insofar as science has itself spectacularly failed to agree on a global model of
reality, this is really nothing more than an exercise in hypocrisy. Science observes, relates and extrapolates from
observations with what often turns out to be great efficiency, but has time and time again proven unable to completely
justify its reductions or the correspondences between its theories and the real universe as a whole. Although some
critics claim that beyond a certain point, explanation is pointless and futile, they do not speak for science; the entire
purpose of science is explanation, not rationally unsubstantiated assertions to the effect that a closed-form explanation
is "unavailable" or "unnecessary". In seeking a coherent explanation for existence – an explanation incorporating an
ontological design phase that is rational, coherent and therefore intelligent – the ID program is in fact perfectly
consistent with science.
However, being perfectly consistent with science means merely that something is in line for a model, not that it already
has one. It has thus been possible for dedicated critics of ID to create the illusion, at least for sympathetic audiences,
that they have it at a critical disadvantage. They contend that while science must be instrumental to society, yield
specific predictions, and thus cite specific structural and dynamical laws that nontrivially explain its contexts of
application, ID is nothing more than a Trojan horse for religious ideology, makes no nontrivial predictions, and is devoid
of theoretical structure. Due to the number of sympathetic ears that such claims have found in Academia, this illusion
has all but promoted itself to the status of a self-reinforcing mass delusion in certain closed-minded sectors of the
intellectual community. Obviously, it would be to the advantage of the ID movement, and society as a whole, to end
this contagion by putting forth something clearly recognizable as a model.

The problem, of course, is that as long as science in general lacks a fundamental model, so do all particular strains of
science including Intelligent Design. Due to the close connection between fundamentality and generality, ID or any
other field of scientific inquiry would ultimately have to provide science in general with a fundamental model in order to
provide one for itself. This might have led some people, in particular those who doubt the existence of a stable
fundamental model of reality, to suppose that the ID controversy would remain strictly within the realm of philosophy
until the end of time. But this is not the case, for if there were really no fundamental model – if there were no way to
map theoretic cognition onto reality in its entirety - perception itself would lack a stable foundation. Perception, after all,
can be described as the modeling of objective reality in cognition, and the modeling of cognition in objective reality.
The self-evident perceptual stability of reality, on which the existence and efficacy of science and scientific
methodology absolutely depend, bears unshakable testimony to the existence of a fundamental model of the real
universe.

The general nature of this model can be glimpsed merely by considering the tautological reflexivity of the term "self-
evident". Anything that is self-evident proves (or evidences) itself, and any construct that is implicated in its own proof
is tautological. Indeed, insofar as observers are real, perception amounts to reality tautologically perceiving itself. The
logical ramifications of this statement are developed in the supertautological CTMU, according to which the model in
question coincides logically and geometrically, syntactically and informationally, with the process of generating the
model, i.e. with generalized cognition and perception. Information thus coincides with information transduction, and
reality is a tautological self-interpretative process evolving through SCSPL grammar.

The CTMU has a meta-Darwinian message: the universe evolves by hological self-replication and self-selection.
Furthermore, because the universe is natural, its self-selection amounts to a cosmic form of natural selection. But by
the nature of this selection process, it also bears description as intelligent self-design (the universe is "intelligent"
because this is precisely what it must be in order to solve the problem of self-selection, the master-problem in terms of
which all lesser problems are necessarily formulated). This is unsurprising, for intelligence itself is a natural
phenomenon that could never have emerged in humans and animals were it not already a latent property of the
medium of emergence. An object does not displace its medium, but embodies it and thus serves as an expression of
its underlying syntactic properties. What is far more surprising, and far more disappointing, is the ideological conflict to
which this has led. It seems that one group likes the term "intelligent" but is indifferent or hostile to the term "natural",
while the other likes "natural" but abhors "intelligent". In some strange way, the whole controversy seems to hinge on
terminology.

Of course, it can be credibly argued that the argument actually goes far deeper than semantics… that there are
substantive differences between the two positions. For example, some proponents of the radical Darwinian version of
natural selection insist on randomness rather than design as an explanation for how new mutations are generated prior
to the restrictive action of natural selection itself. But this is untenable, for in any traditional scientific context,
"randomness" is synonymous with "indeterminacy" or "acausality", and when all is said and done, acausality means
just what it always has: magic. That is, something which exists without external or intrinsic cause has been selected for
and brought into existence by nothing at all of a causal nature, and is thus the sort of something-from-nothing
proposition favored, usually through voluntary suspension of disbelief, by frequenters of magic shows.

Inexplicably, some of those taking this position nevertheless accuse of magical thinking anyone proposing to introduce
an element of teleological volition to fill the causal gap. Such parties might object that by "randomness", they mean not
acausality but merely causal ignorance. However, if by taking this position they mean to belatedly invoke causality,
then they are initiating a causal regress. Such a regress can take one of three forms: it can be infinite and open, it can
terminate at a Prime Mover which itself has no causal explanation, or it can form some sort of closed cycle doubling as
Prime Mover and that which is moved. But a Prime Mover has seemingly been ruled out by assumption, and an infinite
open regress can be ruled out because its lack of a stable recursive syntax would make it impossible to form stable
informational boundaries in terms of which to perceive and conceive of reality.

What about the cyclical solution? If one uses laws to explain states, then one is obliged to explain the laws themselves.
Standard scientific methodology requires that natural laws be defined on observations of state. If it is then claimed that
all states are by definition caused by natural laws, then this constitutes a circularity necessarily devolving to a mutual
definition of law and state. If it is then objected that this circularity characterizes only the process of science, but not the
objective universe that science studies, and that laws in fact have absolute priority over states, then the laws
themselves require an explanation by something other than state. But this would effectively rule out the only remaining
alternative, namely the closed-cycle configuration, and we would again arrive at…magic.

It follows that the inherently subjective process of science cannot ultimately be separated from the objective universe;
the universe must be self-defining by cross-refinement of syntax and state. This brings us back to the CTMU, which
says that the universe and everything in it ultimately evolves by self-multiplexing and self-selection. In the CTMU,
design and selection, generative and restrictive sides of the same coin, are dual concepts associated with the
alternating stages of conspansion. The self-selection of reality is inextricably coupled to self-design, and it is this two-
phase process that results in nature. Biological evolution is simply a reflection of the evolution of reality itself, a process
of telic recursion mirroring that of the universe as a whole. Thus, when computations of evolutionary probability are
regressively extrapolated to the distributed instant of creation, they inevitably arrive at a logical and therefore
meaningful foundation.
The CTMU says that on logical grounds, reality has generative and restrictive phases, and that evolution has
generative and restrictive phases that are necessarily expressed in terms of those of reality. It asserts that the meta-
cybernetic mechanism of evolution is telic recursion, an atemporal process which sets up a stratified dialectic between
syntax and state, organism and environment, with mutually consistent mutable and invariant levels. It says that this
process, though subject to various forms of noise, interference and competition predicated on the internal freedom of
reality, tends to maximize the utility of the universe and its inhabitants. And it thus says that evolution is much more
than a mere environmental dictatorship in which inexplicable laws of nature call the tune as biology slavishly dances
the jig of life and death.

The CTMU says that by its self-generative, self-selective nature, which follows directly from the analytic requirement of
self-containment, reality is its own "designer". Other features of the generative grammar of reality imply that reality
possesses certain logical properties traditionally regarded as theological or spiritual, and that to this extent, the self-
designing aspect of reality is open to a theological or spiritual interpretation. The CTMU, being a logical theory, does
not attempt to force such an interpretation down anyone's throat; not all semantic permutations need affect theoretical
structure. What it does do, however, is render any anti-theological interpretation a priori false, and ensures that
whatever interpretation one chooses accommodates the existence of an "intelligent designer"…namely, reality itself. In
light of the CTMU, this is now a matter more of logic than of taste.

In any case, it should be clear that the CTMU yields new ways of looking at both evolution and teleology. Just as it is
distinguished from other theories of cosmic evolution by its level of self-containment, particularly with regard to its
preference for self-determinacy rather than external determinacy or indeterminacy, so for its approach to biological
evolution. Unlike other theories, the CTMU places evolutionary biology squarely in the context of a fundamental, self-
contained model of reality, thus furnishing it with an explanation and foundation of its own instead of irresponsibly
passing the explanatory buck to some future reduction; instead of counting it sufficient to model its evolutionary
implications in the biological world, the CTMU establishes model-theoretic symmetry by providing a seamless blend of
theory and universe in which the biological world can itself be "modeled" by physical embedment.

This alone entitles it to a place in the evolutionary debate."

Composition is of three kinds.

1. Accidental composition.
2. Involuntary composition.
3. Voluntary composition.

http://bahai-library.com/compilations/bahai.scriptures/7.html

"Homeotely: The term homeotely signifies that subsystems will direct their behaviour in such a way that it is beneficial
for the well-being of the overall system. When applied to the evolutionary process, it states that subsystems will
develop in such a way that they are beneficial for the well-being of the overall system. At first glance, this sounds
embarrassingly teleological. However, if we recognize the fact that the behaviour as well as the evolution of systems is
guided by context-sensitive self-interest, teleology vanishes into thin air. Context-sensitive self-interest is a systemic
evolutionary principle: organisms are forced by their selfish genes to seek nothing but their own advantage - but the
environment in which they develop, or the system of which they are a subsystem, only allows a limited set of
developments and patterns of behaviour, any breach of the rules being punished with elimination. For an animal
endowed with choice this harsh law transforms into an
ethical principle: since its behaviour is only partly genetically determined, the word sensitive assumes its active
meaning, i.e. it refers to conscious reactions to perceived or anticipated effects of behaviour or development on the
overall system. (LM, based on Edward Goldsmith, The Way)"
http://tinyurl.com/yjcxa3f

Metagenetics:
http://tinyurl.com/yld36jt

Programming at the Edge of Chaos:

"Teleonomy is the science of adaptation. It is "the quality of apparent purposefulness in living organisms that derives
from their evolutionary adaptation". The term was coined to stand in contrast with teleology. A teleological process is
one that is planned in a purposeful way by a sentient, intelligent being. Artifacts that emerge from such a process are
the products of foresight, and intent. A teleonomic process, such as evolution, produces products of stunning intricacy
without the benefit of such a guiding intelligence. Instead, it blindly accrues information about what has worked,
exploiting feedback from the environment via the selection and survival of fitter coalitions of such insight. It unwittingly
choreographs a grand audition of a horde of variations on what it has learned thus far, culling the also-rans, and
casting the winners in its next production. It hoards hindsight, and uses it to make "predictions" about how to cope with
the future."
http://www.laputan.org/chaos/chaos.html
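
The accrual-and-culling loop described above is easy to caricature in code. A minimal sketch in Python, assuming a toy fitness function (match a fixed target bit string) and made-up population parameters; it is meant only to show selection blindly hoarding hindsight, not to model any real biology:

import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]          # arbitrary "environment"
POP, GENS, MUT = 30, 60, 0.05

def fitness(genome):
    # "feedback from the environment": number of loci matching the target
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    # blind variation: each locus flips with small probability
    return [1 - g if random.random() < MUT else g for g in genome]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
for gen in range(GENS):
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP // 2]            # culling the also-rans
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP - len(survivors))]
best = max(population, key=fitness)
print(fitness(best), best)

Run repeatedly, the best genome converges on the target without the loop ever "knowing" what it was aiming at.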

"The memory-prediction framework is a theory of brain function that was created by Jeff Hawkins and described in his
2004 book On Intelligence. This theory concerns the role of the mammalian neocortex and its associations with the
hippocampus and the thalamus in matching sensory inputs to stored memory patterns and how this process leads to
predictions of what will happen in the future."
http://en.wikipedia.org/wiki/Memory-prediction_framework

Viable systems are coherent social organisations that are able to survive. Part of their survival process involves
anticipation that is embedded in their logical models. The development of viable systems often occurs despite their
inability to develop common patterns of knowledge for their world view holders. This means that new anticipatory
processes must be activated when their viability may be endangered.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1360082
This volume contains papers based on the workshop "Energy and Information Transfer in Biological Systems: How
Physics Could Enrich Biological Understanding," held in Italy in 2002. The meeting was a forum aimed at evaluating
the potential and outlooks of a modern physics approach to understanding and describing biological processes,
especially regarding the transition from the microscopic chemical scenario to the macroscopic functional configurations
of living matter. In this frame some leading researchers presented and discussed several basic topics, such as the
photon interaction with biological systems also from the viewpoint of photon information processes and of possible
applications; the influence of electromagnetic fields on the self-organization of biosystems including the nonlinear
mechanism for energy transfer and storage; and the influence of the structure of water on the properties of biological
matter.
http://books.google.com/books?id=jPDkS1I61vMC&source=gbs_navlinks_s
I'm also reminded of Chapter VI, on the isomorphism between information-bearers and information-renderers.

Godel, Escher, Bach Lecture:


http://www.youtube.com/watch?v=5jFhq3Rj6DI
Lee Smolin's book on the trouble with physics is a good read.

I don't want to call S. Majid's work "Indian" science, or String Theory "Kabbalistic", but there does seem to be some
competition between algebraic (east) and geometric (west) approaches. I prefer to stay out of the politics, since both
have their merits and shortcomings.

http://www.mindspring.com/~noetic.advanced.studies/Amoroso14.pdf
"What is not widely understood, even amongst physicists, is that a belief in the mystical aspects of the theory is a
choice that one makes, rather than something inevitable. One formulation of quantum mechanics - long ignored or
derided by just about everyone - which makes this particularly clear is the pilot-wave theory (also known as Bohmian
mechanics, de Broglie-Bohm theory, the causal or ontological interpretation of QM). In this theory, wave particle duality
is explained through the startlingly sensible notion of having both waves and particles (think about how that makes the
double slit experiment intelligible!). So unlike in orthodox QM - where the wave function is all there is - the particles
have an objectively real existence and they move along trajectories, guided by the waves. In such a formalism the
standard paradoxes related to measurement, observation or wave function collapse (Schroedinger's cat, and so on)
simply evaporate. The classical limit emerges out of the theory, rather than being presupposed. All the 'talk' is replaced
by sharply-defined mathematics, it becomes possible to 'visualize' the reality of most quantum events, and - most
importantly - the theory is completely consistent with the full range of QM predictive-observational data."
http://www.tcm.phy.cam.ac.uk/~mdt26/pilot_waves.html
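
A crude numerical sketch of the guidance idea in Python, assuming units with hbar = m = 1, free evolution by a split-step Fourier method, and a two-Gaussian initial packet standing in for the two slits; the particle positions, grid and step counts are arbitrary choices, not anything taken from the cited page:

import numpy as np

# Grid and a two-Gaussian "double slit" initial wave function (hbar = m = 1)
N, L = 1024, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
psi = (np.exp(-(x - 2) ** 2) + np.exp(-(x + 2) ** 2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / N))

dt, steps = 0.01, 300
particles = np.linspace(-4.0, 4.0, 9)            # definite initial positions

for _ in range(steps):
    # free Schroedinger evolution: multiply by exp(-i k^2 dt / 2) in momentum space
    psi = np.fft.ifft(np.exp(-0.5j * k ** 2 * dt) * np.fft.fft(psi))
    # de Broglie-Bohm guidance equation: v(x) = Im( psi'(x) / psi(x) )
    dpsi = np.gradient(psi, x)
    with np.errstate(divide="ignore", invalid="ignore"):
        v = np.nan_to_num(np.imag(dpsi / psi))
    particles = particles + dt * np.interp(particles, x, v)

print(np.round(particles, 2))

Each particle has a definite position throughout; the interference lives entirely in the wave that guides it.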
"Complexity: noun: the quality of being intricate and compounded"
http://onelook.com/?w=complexity&ls=a

Can "Quality" or Qualia be objectively defined?

"The veteran artificial intelligence researcher Marvin Minsky thinks the problems posed by qualia are essentially issues
of complexity, or rather of mistaking complexity for simplicity.
"Now, a philosophical dualist might then complain: "You've described how hurting affects your mind — but you still
can't express how hurting feels." This, I maintain, is a huge mistake — that attempt to reify 'feeling' as an independent
entity, with an essence that's indescribable. As I see it, feelings are not strange alien things. It is precisely those
cognitive changes themselves that constitute what 'hurting' is — and this also includes all those clumsy attempts to
represent and summarize those changes. The big mistake comes from looking for some single, simple, 'essence' of
hurting, rather than recognizing that this is the word we use for complex rearrangement of our disposition of
resources."
http://en.wikipedia.org/wiki/Qualia

Different types of complexity:


http://www.tim-taylor.com/papers/thesis/html/node22.html
http://nirmukta.com/2009/09/14/complexity-explained-5-defining-different-types-of-complexity/

"Murray Gell-Mann defines "Plectics" as the "...the study of simplicity and complexity. It includes the various attempts
to define complexity; the study of roles of simplicity and complexity and of classical and quantum information in the
history of the universe, the physics of information; the study of non-linear dynamics, including chaos theory, strange
attractors, and self-similarity in complex non-adaptive systems in physical science; and the study of complex adaptive
systems, including prebiotic chemical evolution, biological evolution, the behaviour of individual organisms, the
functioning of ecosystems, the operation of mammalian immune systems, learning and thinking, the evolution of
human languages, the rise and fall of human cultures, the behaviour of markets, and the operation of computers that
are designed or programmed to evolve strategies - say, for playing chess, or solving problems."

Murray Gell-Mann is a founding member and currently a distinguished fellow at SFI as well as the Robert Andrews
Millikan Professor Emeritus at the California Institute of Technology, where he joined the faculty in 1955. His research
focuses on "plectics," the study of simplicity and complexity, scaling, and the evolution of languages."


"Polytely can be described as Frequently, complex problem-solving situations characterized by the presence of not
one, but several goals, endings. when solving complex problems, we are often forced into making difficult choices and
in the polytelic scenarios; different outcomes to decide from. Though this is more complex than just choosing. We need
to explore various outcomes and theorise before making pragmatic decisions. Haste without experiment will not help
but rather often hinder the cogniser, the thinker, you. Modern society faces an increasing incidance of various complex
problems that are ―pervasive, spreading unhindered into regions‖, social ills for example. In other words, the defining
characteristics of our complex problems are a large number of variables (complexity) that interact in a nonlinear
fashion (connectivity), changing over time (dynamic and time-dependent), and to achieve multiple goals (polytely).
Problem - to solution = the involvement of complex variables. Multiple goals may be present that could, but do not
necessarily, interfere with each other."
http://knol.google.com/k/plectics-groups-autopoiesis-logoi-topoi-biosemiosis-orders-teletics
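
One way to make "several goals that may interfere" concrete is Pareto dominance over candidate solutions. A minimal Python sketch with invented plans and scores (higher is better on each goal):

def dominates(a, b):
    # a dominates b if it is at least as good on every goal and strictly better on one
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# hypothetical candidates scored on three goals
candidates = {"plan_A": (3, 7, 2), "plan_B": (5, 5, 5),
              "plan_C": (4, 6, 5), "plan_D": (2, 2, 1)}

pareto_front = [name for name, score in candidates.items()
                if not any(dominates(other, score)
                           for o, other in candidates.items() if o != name)]
print(pareto_front)

Whatever survives the filter is the polytelic residue: mutually non-dominated options among which no further ranking is possible without weighing the goals against each other.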
Einstein's Razor:
http://deletionpedia.dbatley.com/w/index.php?title=Einstein's_razor_(deleted_24_Mar_2008_at_15:35)

Does citing PCID justify censorship?


http://www.uncommondescent.com/intelligent-design/does-citing-pcid-justify-censorship/

On Einstein's Razor: Telesis-Driven Introduction of Complexity into Apparently Sufficiently Non-Complex Linguistic
Systems

by Quinn Tyler Jackson

Abstract: The notion of a linguistic system that is powerful enough to accept any acceptable language but
insufficiently complex to meet specific goals or needs is explored. I nominate Chomsky's generative grammar
formalism as the least complex formalism required to describe all language, but show how without the addition of
further complexity, little can be said about the formalism itself. I then demonstrate how the O(n) parsing of
pseudoknots, a previously difficult to solve problem, becomes tractable by the more complex §-Calculus, and finally
close with a falsifiable hypothesis with implications in epistemological complexity.
http://www.iscid.org/papers/Jackson_EinsteinsRazor_050205.pdf
Usually complexity refers to information and 'work' in some context; a local-global interaction or balance is necessary, and
there must be value in order for any one function/process/hyperstructure to demonstrate utility.

Hypercomplexity:
http://www.ncbi.nlm.nih.gov/pubmed/16583272

The Hypercomplex Society:


http://www.units.muohio.edu/codeconference/papers/papers/Rutenbeck%20-%20bit%20by%20bit%20%5Bfinal%5D.pdf

"There is a close relation between notion of Number and such fundamental physical categories as space, time, matter,
field and almost no one has doubt about this. Usually, this relation is associated with such particular numbers as real
and complex numbers, sometime on add quaternions and octavas. The organizers of this site do not deny the
fundamental role of these numbers. However, they pay attention to the fact that there are others generalizations of
number, for example, hypercomplex numbers, waiting for interpretation of their link with physics and geometry. The
problem of physical and geometrical interpretation of hypercomplex numbers is an actual task all the more because
spaces associated with hypercomplex numbers belong to the class of Finsler spaces, the more general class than
Riemannian manifolds. The progress in physics often was conditioned by new geometrical views and we hope that the
explanation of geometrical background must lead to new qualitative consequences in physics and for this once."
http://hypercomplex.xpsweb.com/index.php?lang=en&ref=about

Thermodynamics as a Finsler Space with Torsion:


http://www.cartan.pair.com/carfre31.htm
Complexity is a measure of information content, inverse to a measure of entropy, and has both
quantitative/objective/informational and qualitative/subjective/cognitive attributes.

"The central idea of algorithmic information theory is to quantify the information content of an object or bit string in
terms of its shortest description. In other words, if an object can be described easily in a short space, it is of low
complexity or information content, while if describing it takes more space, then it is of higher complexity or information
content. It can be useful to think of the 'shortest description' as a kind of 'self-extracting archive' of the target string, so
that a string's information content is the size of it once it has been compressed into a self-extracting archive. One of
many interesting features of algorithmic information theory is that it turns out that in the space of all possible strings,
hardly any strings can actually be compressed at all."
http://mulhauser.net/research/tutorials/complexity/complexity.html#JointEtc
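
Kolmogorov complexity itself is uncomputable, but an off-the-shelf compressor gives a rough upper bound in the spirit of the "self-extracting archive" picture. A minimal sketch using Python's zlib; the strings are arbitrary examples of my own:

import zlib, random

def compressed_size(s: bytes) -> int:
    # length of a deflate "archive" of s: a crude upper bound on its information content
    return len(zlib.compress(s, 9))

regular = b"ab" * 5000                                        # highly compressible
random_ = bytes(random.getrandbits(8) for _ in range(10000))  # typical string: incompressible

print(compressed_size(regular), compressed_size(random_))

The repetitive string shrinks to a few dozen bytes while the random one barely shrinks at all, illustrating the closing remark that hardly any strings can actually be compressed.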

Complexity Measures:

"Generally speaking, complexity measures either take after Kolmogorov complexity, and involve finding some
computer or abstract automaton which will produce the pattern of interest, or they take after information theory and
produce something like the entropy, which, while in principle computable, can be very hard to calculate reliably for
experimental systems. Bennett's "logical depth" is an instance of the former tendency (it's the running time of the
shortest program), Lloyd and Pagels's "thermodynamic depth" of the latter (it's the entropy of the ensemble of possible
trajectories leading to the current state; uncomputable in the weaker sense that you'd have to go all the way back to
the beginning of time...). The statistical complexity of computational mechanics partakes of both natures, being the
entropy of an abstract automaton; it can actually be calculated. (I presently have a couple of papers incubating where
we do calculate the statistical complexity of various real-world processes.)

Even if the complexity measure is uncomputable, it may be possible to say something about how fast it grows, in some
well-defined average sense. For instance, the average Kolmogorov complexity per symbol of a random string
converges on the entropy per symbol of the string. Similarly, my first published paper was a proof that the rate of
increase in thermodynamic depth ("dive") is also an entropy rate, though not the same one. It'd be nice if there was a
similar result about logical depth; watch this space for more developments in this exciting nano-field. --- Such results
tend to make the complexity measures concerned seem less interesting, but this is just in line with Bohr's dictum, that
the task of theoretical science is to turn deep truths into trivialities."
http://cscs.umich.edu/~crshalizi/notebooks/complexity-measures.html
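
The remark that average algorithmic complexity per symbol converges on the entropy per symbol can be illustrated by estimating block entropies from data. A minimal Python sketch for a two-state Markov source with flip probability p; the source and sample size are invented for the example:

import math, random
from collections import Counter

def block_entropy(seq, n):
    # Shannon entropy (bits) of length-n blocks, estimated from frequencies
    blocks = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
    total = sum(blocks.values())
    return -sum(c / total * math.log2(c / total) for c in blocks.values())

# simple two-state Markov source: flip the previous symbol with probability p
p, seq = 0.1, [0]
for _ in range(200000):
    seq.append(seq[-1] ^ (random.random() < p))

for n in (1, 2, 4, 8):
    # conditional entropies H(n+1) - H(n) approach the source's entropy rate
    print(n, block_entropy(seq, n + 1) - block_entropy(seq, n))

The successive differences settle near the entropy rate of the source, about 0.469 bits per symbol for p = 0.1.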

Effective Complexity as a Measure of Information Content:

"Murray Gell‐Mann has proposed the concept of effective complexity as a measure of information content. The
effective complexity of a string of digits is defined as the algorithmic complexity of the regular component of the string.
This paper argues that the effective complexity of a given string is not uniquely determined. The effective complexity of
a string admitting a physical interpretation, such as an empirical data set, depends on the cognitive and practical
interests of investigators. The effective complexity of a string as a purely formal construct, lacking a physical
interpretation, is either close to zero, or equal to the string's algorithmic complexity, or arbitrary, depending on the
auxiliary criterion chosen to pick out the regular component of the string. Because of this flaw, the concept of effective
complexity is unsuitable as a measure of information content."
http://www.journals.uchicago.edu/doi/abs/10.1086/375469?cookieSet=1&journalCode=phos
"Fractal patterns have emerged in many contexts, but what exactly is a pattern? How can one make precise the
structures lying within objects and the relationships between them? This book proposes new notions of coherent
geometric structure to provide a fresh approach to this familiar field. It develops a new concept of self-similarity called
"BPI" or "big pieces of itself," which makes the field much easier for people to enter. This new framework is quite
broad, however, and has the potential to lead to significant discoveries. The text covers a wide range of open
problems, large and small, and a variety of examples with diverse connections to other parts of mathematics. Although
fractal geometries arise in many different ways mathematically, comparing them has been difficult. This new approach
combines accessibility with powerful tools for comparing fractal geometries, making it an ideal source for researchers
in different areas to find both common ground and basic information."
http://books.google.com/books?id=0oy-Mgd1aTQC&source=gbs_navlinks_s

"This entry is about two kinds of circularity: object circularity, where an object is taken to be part of itself in some sens e;
and definition circularity, where a collection is defined in terms of itself. Instances of these two kinds of circularity are
sometimes problematic, and sometimes not. We are primarily interested in object circularity in this entry, especially
instances which look problematic when one tries to model them in set theory. But we shall also discuss circular
definitions.

The term non-wellfounded set refers to sets which contain themselves as members, and more generally which are part
of an infinite sequence of sets each term of which is an element of the preceding set. So they exhibit object circularity
in a blatant way. Discussion of such sets is very old in the history of set theory, but non-wellfounded sets are ruled out
of Zermelo-Fraenkel set theory (the standard theory) due to the Foundation Axiom (FA). As it happens, there are
alternatives to this axiom FA. This entry is especially concerned with one of them, an axiom first formulated by Marco
Forti and Furio Honsell in a 1983 paper. It is now standard to call this principle the Anti-Foundation Axiom (AFA),
following its treatment in an influential book written by Peter Aczel in 1988.

The attraction of using AFA is that it gives a set of tools for modeling circular phenomena of various sorts. These tools
are connected to important circular definitions, as we shall see. We shall also be concerned with situating both the
mathematics and the underlying intuitions in a broader picture, one derived from work in coalgebra. Incorporating
concepts and results from category theory, coalgebra leads us to concepts such as corecursion and coinduction; these
are in a sense duals to the more standard notions of recursion and induction.

The topic of this entry also has connections to work in game theory (the universal Harsanyi type spaces), semantics
(especially situation-theoretic accounts, or others where a "world" is allowed to be part of itself), fractal sets and other
self-similar sets, the analysis of recursion, category theory, and the philosophical side of set theory."
http://plato.stanford.edu/entries/nonwellfounded-set-theory/index.html
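
Under AFA a non-wellfounded set is pictured by an accessible pointed graph, and two pictures depict the same hyperset exactly when their nodes are bisimilar. A naive Python sketch (the graph and node names are made up; real implementations use partition refinement for efficiency):

def bisimilar(edges, a, b):
    # Naive greatest-bisimulation check on a graph given as {node: set of children}
    nodes = list(edges)
    related = {(x, y) for x in nodes for y in nodes}   # start with everything related
    changed = True
    while changed:
        changed = False
        for (x, y) in list(related):
            ok = (all(any((c, d) in related for d in edges[y]) for c in edges[x]) and
                  all(any((c, d) in related for c in edges[x]) for d in edges[y]))
            if not ok:
                related.discard((x, y))
                changed = True
    return (a, b) in related

# Omega = {Omega} drawn two ways: a one-node loop and a two-node cycle
g = {"u": {"u"}, "v": {"w"}, "w": {"v"}}
print(bisimilar(g, "u", "v"))   # True: both pictures depict the same hyperset

Both the one-node loop and the two-node cycle depict Omega = {Omega}, so the check returns True.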

Additional related modeling of circularity

"At this point, we have presented the main points in this entry: an introduction to the mathematics of hypersets and
specifically to work with the axiom AFA, and an exploration of the conceptual dualities that appear to lie at the heart of
the matter. In this supplement, we want to mention briefly two matters related to the instances of circularity mentioned
earlier. The goal here is to show how the perspective of coalgebra widens the scope of circular phenomena. That is,
the two sections below represent treatments that use coalgebra more than hypersets.
...
Our final topic is the link between the mathematics of hypersets and coalgebras on the one hand, and fractal sets on
the other."
http://plato.stanford.edu/entries/nonwellfounded-set-theory/modeling-circularity.html
"This entry is about two kinds of circularity: object circularity, where an object is taken to be part of itself in some sens e;
and definition circularity, where a collection is defined in terms of itself. Instances of these two kinds of circularity are
sometimes problematic, and sometimes not. We are primarily interested in object circularity in this entry, especially
instances which look problematic when one tries to model them in set theory. But we shall also discuss circular
definitions.

The term non-wellfounded set refers to sets which contain themselves as members, and more generally which are part
of an infinite sequence of sets each term of which is an element of the preceding set. So they exhibit object circularity
in a blatant way. Discussion of such sets is very old in the history of set theory, but non -wellfounded sets are ruled out

Brainstorm Page 49
in a blatant way. Discussion of such sets is very old in the history of set theory, but non-wellfounded sets are ruled out
of Zermelo-Fraenkel set theory (the standard theory) due to the Foundation Axiom (FA). As it happens, there are
alternatives to this axiom FA. This entry is especially concerned with one of them, an axiom first formulated by Marco
Forti and Furio Honsell in a 1983 paper. It is now standard to call this principle the Anti -Foundation Axiom (AFA),
following its treatment in an influential book written by Peter Aczel in 1988.

The attraction of using AFA is that it gives a set of tools for modeling circular phenomena of various sorts. These tools
are connected to important circular definitions, as we shall see. We shall also be concerned with situating both the
mathematics and the underlying intuitions in a broader picture, one derived from work in coalgebra. Incorporating
concepts and results from category theory, coalgebra leads us to concepts such as corecursion and coinduction; these
are in a sense duals to the more standard notions of recursion and induction.

The topic of this entry also has connections to work in game theory (the universal Harsanyi type spaces), semantics
(especially situation-theoretic accounts, or others where a ―world‖ is allowed to be part of itself), fractals sets and other
self-similar sets, the analysis of recursion, category theory, and the philosophical side of set theory."
http://plato.stanford.edu/entries/nonwellfounded-set -theory/index.html
Additional related modeling of circularity

"At this point, we have presented the main points in this entry: an introduction to the mathematics of hypersets and
specifically to work with the axiom AFA, and an exploration of the conceptual dualities that appear to lie at the heart of
the matter. In this supplement, we want to mention briefly two matters related to the instances of circularity mentioned
earlier. The goal here is to show how the perspective of coalgebra widens the scope of circular phenomena. That is,
the two sections below represent treatments that use coalgebra more than hypersets.
...
Our final topic is the link between the mathematics of hypersets and coalgebras on the one hand, and fractal sets on
the other."
http://plato.stanford.edu/entries/nonwellfounded-set -theory/modeling-circularity.html
Anticipatory Systems and the Processing of Meaning: a Simulation Study Inspired by Luhmann's Theory of Social
Systems:
Meaning can be communicated in addition to — and on top of — underlying processes of the information exchange.
Meaning is provided to observations from the perspective of hindsight, while information processing follows the time
axis. Simulations of anticipatory systems enable us to show how an observer can be generated within an information
process, and how expectations can also be exchanged. Cellular automata will be used for the visualization. The
exchange of observations among observers generates (a) uncertainty about the delineations in the observed system at
each moment in time and (b) uncertainty about the dynamics of the interaction over time.
Keywords:
Anticipation, Autopoiesis, Social System, Incursion, Meaning
http://jasss.soc.surrey.ac.uk/8/2/7.html
Dialetheism is the view that there are true contradictions, or dialetheia. More specifically, dialetheists believe that for
some sentence or proposition P, both P and its negation, not-P (¬P), are true. Dialetheism is not itself a formal logic, but
to endorse dialetheism without accepting some version of paraconsistent logic would be to accept trivialism, the view
that every proposition is true. Graham Priest, of the University of Melbourne and the CUNY Graduate Center, is
dialetheism's most prominent contemporary champion. He defines dialetheism as the view that there are true contradictions.
http://en.wikipedia.org/wiki/Dialetheism
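
Priest's Logic of Paradox (LP) is one standard paraconsistent companion to dialetheism: three truth values T, B, F, with both T and B designated. A minimal Python sketch showing a valuation on which P and not-P holds while an arbitrary Q does not, so explosion fails:

# Logic of Paradox: order F < B < T, designated values {T, B}
ORDER = {"F": 0, "B": 1, "T": 2}
NEG = {"T": "F", "B": "B", "F": "T"}

def conj(p, q):   # conjunction is the minimum in the order
    return min(p, q, key=ORDER.get)

def designated(v):
    return v in ("T", "B")

P, Q = "B", "F"                      # P is a dialetheia, Q an ordinary falsehood
contradiction = conj(P, NEG[P])      # P and not-P takes the value B
print(designated(contradiction), designated(Q))
# True False: a counterexample to explosion, so P & ~P does not entail Q

The design choice is simply that B is a fixed point of negation and counts as "true enough", which lets a contradiction be accepted without everything following from it.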
Geometry Articles, Theorems, Problems:
http://www.cut-the-knot.org/geometry.shtml
In a nutshell, Dr. Hatcher has taken modern refinements in logic -- specifically the creation of relational logic, which
forms the basis for modern computing -- and applied them in the realm of philosophy, in particular to the kinds of
metaphysical and ethical questions that have seemed so stubbornly to resist modern analysis.
http://www.onecountry.org/e144/e14416as_Minimalism_Review.htm

Hamid Javanbakht
Generative Network Automata: A Generalized Framework for Modeling Adaptive Network Dynamics Using
Graph Rewritings
A variety of modeling frameworks have been proposed and utilized in complex systems studies, including dynamical
systems models that describe state transitions on a system of fixed topology, and self-organizing network models that
describe topological transformations of a network with little attention paid to dynamical state changes. Earlier network
models typically assumed that topological transformations are caused by exogenous factors, such as preferential
attachment of new nodes and stochastic or targeted removal of existing nodes. However, many real-world complex
systems exhibit both of these two dynamics simultaneously, and they evolve largely autonomously based on the system's
own states and topologies. Here we show that, by using the concept of graph rewriting, both state transitions and
autonomous topology transformations of complex systems can be seamlessly integrated and represented in a unified
computational framework. We call this novel modeling framework "Generative Network Automata (GNA)". In this chapter,
we introduce basic concepts of GNA, its working definition, its generality to represent other dynamical systems models,
and some of our latest results of extensive computational experiments that exhaustively swept over possible rewriting
rules of simple binary-state GNA. The results revealed several distinct types of the GNA dynamics.
http://arxiv.org/abs/arxiv:0901.0216
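
A toy rendering of the GNA idea in Python under simplifying assumptions: binary node states, a ring as the initial topology, and a single hand-written rewriting rule that updates a node's state from its neighbours and occasionally rewires it. The rule is invented for illustration and is not one of the rules swept in the paper:

import random

# nodes carry binary states; edges are stored as a dict of neighbour sets
state = {i: random.randint(0, 1) for i in range(8)}
edges = {i: {(i + 1) % 8, (i - 1) % 8} for i in range(8)}   # start from a ring

def rewrite(node):
    # One GNA step: new state from the neighbours' majority, plus a topology change
    nbrs = edges[node]
    ones = sum(state[n] for n in nbrs)
    state[node] = 1 if 2 * ones >= len(nbrs) else 0          # state transition
    candidates = [m for m in state if m not in nbrs and m != node]
    if state[node] == 1 and candidates:                      # topology transformation:
        far = random.choice(candidates)                      # active nodes grow a new link
        edges[node].add(far)
        edges[far].add(node)

for step in range(20):
    rewrite(random.choice(list(state)))                      # asynchronous updates
print(state, {n: sorted(e) for n, e in edges.items()})

State transitions and topology transformations are driven by the same rewriting rule applied to the same structure, which is the point of the unified framework.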

The generative sciences (or generative science) are the interdisciplinary and multidisciplinary sciences that explore the
natural world and its complex behaviours as a generative process. Generative science shows how deterministic and finite
rules and parameters in the natural phenomena interact with each other to generate indeterministic and infinite behaviour.
These sciences include psychology and cognitive science, cellular automata, generative linguistics, natural language
processing, social network analysis, connectionism, evolutionary biology, self-organization, neural network theory,
communication networks, neuromusicology, information theory, systems theory, genetic algorithms, artificial life, chaos
theory, complexity theory, epistemology, systems thinking, genetics, philosophy of science, cybernetics, bioinformatics,
and catastrophe theory.
http://en.wikipedia.org/wiki/Generative_sciences

Unifying Dynamical Systems and Complex Networks Theories: A Proposal of "Generative Network Automata (GNA)"


Complexity Digest: Networking the Complexity Community:


http://comdig.unam.mx/index.php
http://www.nd.edu/~netsci/TALKS/Sayama_CT.pdf

Hamid Javanbakht
A Theory of Teleology
A representation language for teleology, or description of purpose, is defined.
ftp://ftp.cs.utexas.edu/pub/qsim/papers/Franke-PhD-92.pdf

Cosmic Teleology and the Crisis of the Sciences:


http://www.bu.edu/wcp/Papers/Scie/ScieMans.htm

Towards a Teleological Logic:


http://thespaceofreasons.blogspot.com/2009/10/towards-teleological-modal-logic.html

A Suggestion for a Teleological Interpretation of Quantum Mechanics:


http://cdsweb.cern.ch/record/442211/files/0006070.pdf

The Teleological Argument and the Anthropic Principle:


http://www.theapologiaproject.org/The%20Teleological%20Argument%20and%20The%20Anthropic%20Principle.pdf

Teleological Operator:
http://www.google.com/m/search?oe=UTF-8&client=safari&q=teleological+operator&hl=en&start=10&sa=N

Fundamental Legal Concepts: A Teleological Characterization:


http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.126.6708&rep=rep1&type=pdf

Teleology and Logical Form:


http://bjps.oxfordjournals.org/cgi/pdf_extract/38/1/27

Representation and Use of Teleological Knowledge in the Multi-Modeling Approach:


http://www.springerlink.com/content/p60677138363132u/

Hamid Javanbakht
On Thinking of Kinds: A Neuroscientific Perspective
http://homepage.mac.com/ancientportraits/drsite/representingkinds.pdf

The Causal Relevance of Mental Properties:


http://www.springerlink.com/content/9712h1g04231wm58/

The Duality of Content:


http://www.jstor.org/pss/4321065

Furnishing the Mind: Concepts and their Perceptual Bias:


http://books.google.com/books?id=Gp5A85NQzJMC&pg=PA242&lpg=PA242
&dq=nomological+covariance&source=bl&ots=85z -MqRzVS&sig=yZWVUeXM1wrgxeuDZ9Fgao1IlkU&hl=en



Hamid Javanbakht
Topologically Stratified Space
In topology, a branch of mathematics, a topologically stratified space is a space X that has been decomposed into pieces
called strata; these strata are topological manifolds and are required to fit together in a certain way. Topologically stratified
spaces provide a purely topological setting for the study of singularities analogous to the more differential-geometric
theory of Whitney. They were introduced by Thom, who showed that every Whitney stratified space was also a
topologically stratified space, with the same strata. Another proof was given by John Mather in 1970, inspired by Thom's
proof.
http://en.wikipedia.org/wiki/Topologically_stratified_space

Hamid Javanbakht
Wolfram on Minimal Surfaces
Multidimensional generalizations [of intrinsically defined curves]

Curvatures for surfaces and higher-dimensional objects can be defined in terms of the principal axes of approximating
ellipsoids at each point. There are combinations of these curvatures --in 2D Gaussian curvature and mean curvature--
which are independent of the coordinate system used. (Compare page 1049.) Given such curvatures, a surface can in
principle be obtained by solving certain partial differential equations. But even in the case of zero mean curvature, which
corresponds to minimal surfaces of the kind followed by an idealized soap film, this is already a mathematical problem of
significant difficulty.

If one looks at projections of surfaces, it is common to see lines of discontinuity at which a surface goes, say, from having
three sheets to one. Catastrophe theory provides a classification of such discontinuities --the simplest being a cusp. And
as emphasized by René Thom in the 1960s, it is possible that some structures seen in animals may be related to such
discontinuities.

Another way to state the Einstein equations--already discussed by David Hilbert in 1915--is as the constraint that the
integral of RicciScalar Sqrt[Det[g]] (the so-called Einstein-Hilbert action) be an extremum. (An idealized soap film or other
minimal surface extremizes the integral of the intrinsic volume element Sqrt[Det[g]], without a RicciScalar factor.) In the
discrete Regge calculus that I mention on page 1054 this variational principle turns out to have a rather simple form.

The Einstein-Hilbert action--and the Einstein equations--can be viewed as having the simplest forms that do not ultimately
depend on the choice of coordinates. Higher-order terms--say powers of the Ricci scalar curvature--could well arise from
underlying network systems, but would not contribute noticeably except in very high gravitational fields.

Various physical interpretations can be given of the vanishing of the Ricci tensor implied by the ordinary vacuum Einstein
equations. Closely related to my discussion of the absence of t^2 terms in volume growth for 4D spacetime cones is the
statement that if one sets up a small 3D ball of comoving test particles then the volume it defines must have zero first and
second derivatives with time.
http://www.wolframscience.com/nksonline/index/mes -mz.html
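
For a surface given as a graph z = f(x, y) the mean curvature has a closed form, and one can check symbolically that Scherk's first surface, a classical minimal surface, makes it vanish. A small Python sketch using sympy; the choice of surface is mine, not Wolfram's:

import sympy as sp

x, y = sp.symbols("x y")
f = sp.log(sp.cos(x) / sp.cos(y))          # Scherk's first surface, a classical minimal surface

fx, fy = sp.diff(f, x), sp.diff(f, y)
fxx, fyy, fxy = sp.diff(f, x, 2), sp.diff(f, y, 2), sp.diff(f, x, y)

# Mean curvature of the graph z = f(x, y)
H = (((1 + fy**2) * fxx - 2 * fx * fy * fxy + (1 + fx**2) * fyy)
     / (2 * (1 + fx**2 + fy**2) ** sp.Rational(3, 2)))
print(sp.simplify(H))                      # 0: zero mean curvature, an idealized soap film

# Gaussian curvature of the same graph, nonzero in general
K = (fxx * fyy - fxy**2) / (1 + fx**2 + fy**2) ** 2
print(sp.simplify(K))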

Hamid Javanbakht
The Googol Room
Thinking machines:
(The grail machine - table of essays)
— Temporal propositions and the resolution of the Gödelian paradox
— ZF+: A set theory for describing the mind
— The physical construction of free will
— Free will and the foundation of perception
— Machines and emotional states
— Basic scheme for the construction of artificial minds
(including the bases for two or more pre-emergent sciences)
— Design, behavior and purpose
— Behavior-driven technologies
— General purpose artificial minds and man-made evolution
— Integrated AI and Amind systems
Additional commentary
— The façade of the clockwork universe
— Simple versus simplex
— Playfulness and machines that play
— AI, Amind, chaotic-intelligence and higher modes of thought
— Aecology
— The sensation of being

Education:
— Titles in the study of logic
— The resolution of advancing paradox
Methods of logical instruction:
— Cryptics and evocatives
— The benefit and use of koan in Eastern education
— Koanics
— A logic primer

Philosophy:
— The metaphysical division in ZFC
— Mnemonics for categorical unification
— The evaluation of metaphysical systems
— Meta-concerns for unification
— Modern categorical metaphysics
— Simple versus simplex
— The nature of accident
— Knowledge and the thing-in-itself
http://www.googolroom.org/index.htm

Hamid Javanbakht
Big Toy Models: Representing Physical Systems as Chu Spaces
We pursue a model-oriented rather than axiomatic approach to the foundations of Quantum Mechanics, with the idea that
new models can often suggest new axioms. This approach has often been fruitful in Logic and Theoretical Computer
Science. Rather than seeking to construct a simplified toy model, we aim for a `big toy model', in which both quantum and
classical systems can be faithfully represented - as well as, possibly, more exotic kinds of systems.
To this end, we show how Chu spaces can be used to represent physical systems of various kinds. In particular, we show
how quantum systems can be represented as Chu spaces over the unit interval in such a way that the Chu morphisms
correspond exactly to the physically meaningful symmetries of the systems - the unitaries and antiunitaries. In this way we
obtain a full and faithful functor from the groupoid of Hilbert spaces and their symmetries to Chu spaces. We also consider
whether it is possible to use a finite value set rather than the unit interval; we show that three values suffice, while the two
standard possibilistic reductions to two values both fail to preserve fullness.
http://arxiv.org/abs/0910.2393
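
Concretely, a Chu space over a value set is just a matrix r : A x X -> Sigma, and a Chu morphism (f, g) from (A, r1, X) to (B, r2, Y) is a pair of maps, covariant on points and contravariant on states, satisfying r2(f(a), y) = r1(a, g(y)). A minimal Python sketch over the two-element value set for brevity (the paper works over the unit interval, or three values); the spaces and maps below are invented examples:

from itertools import product

def is_chu_morphism(S1, S2, f, g):
    # Check the adjointness condition r2(f(a), y) == r1(a, g(y)) for all points a and states y
    return all(S2["r"][(f[a], y)] == S1["r"][(a, g[y])]
               for a, y in product(S1["points"], S2["states"]))

# two tiny hypothetical Chu spaces over {0, 1}
S1 = {"points": ["a0", "a1"], "states": ["x0", "x1"],
      "r": {("a0", "x0"): 0, ("a0", "x1"): 1, ("a1", "x0"): 1, ("a1", "x1"): 0}}
S2 = {"points": ["b0", "b1"], "states": ["y0", "y1"],
      "r": {("b0", "y0"): 1, ("b0", "y1"): 0, ("b1", "y0"): 0, ("b1", "y1"): 1}}

f = {"a0": "b1", "a1": "b0"}          # forward map on points
g = {"y0": "x0", "y1": "x1"}          # contravariant map on states
print(is_chu_morphism(S1, S2, f, g))  # True for this choice of (f, g)

The contravariance of g is exactly the feature the second abstract says the standard coalgebraic framework has trouble accommodating.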

We revisit our earlier work on the representation of quantum systems as Chu spaces, and investigate the use of
coalgebra as an alternative framework. On the one hand, coalgebras allow the dynamics of repeated measurement to be
captured, and provide mathematical tools such as final coalgebras, bisimulation and coalgebraic logic. However, the
standard coalgebraic framework does not accommodate contravariance, and is too rigid to allow physical symmetries to
be represented. We introduce a fibrational structure on coalgebras in which contravariance is represented by indexing.
We use this structure to give a universal semantics for quantum systems based on a final coalgebra construction. We
characterize equality in this semantics as projective equivalence. We also define an analogous indexed structure for Chu
spaces, and use this to obtain a novel categorical description of the category of Chu spaces. We use the indexed
structures of Chu spaces and coalgebras over a common base to define a truncation functor from coalgebras to Chu
spaces. This truncation functor is used to lift the full and faithful representation of the groupoid of physical symmetries on
Hilbert spaces into Chu spaces, obtained in our previous work, to the coalgebraic semantics.
http://arxiv.org/abs/0910.3959

Samson Abramsky:
http://arxiv.org/find/quant-ph/1/au:+Abramsky_S/0/1/0/all/0/1

Hamid Javanbakht
Globular Universe and Autopoietic Automata
We present two original computational models — globular universe and autopoietic automata — capturing the basic
aspects of an evolution: a construction of self–reproducing automata by self–assembly and a transfer of algorithmically
modified genetic information over generations. Within this framework we show implementation of autopoietic automata in
a globular universe. Further, we characterize the computational power of lineages of autopoietic automata via interactive
Turing machines and show an unbounded complexity growth of a computational power of automata during the evolution.
Finally, we define the problem of sustainable evolution and show its undecidability.
http://www.springerlink.com/content/1gyg6ak02fjn8w18/

It has been claimed that NKS tries to take these ideas as its own; although this has been mainly suggested by people
thinking that Wolfram's main thesis is that the universe is a cellular automaton in spite of the fact that Wolfram's actual
proposal as discrete model of the universe is a trivalent network. Wolfram himself considers that a cellular automaton
model is unsuitable to describe quantum and relativistic properties of nature, as explained in the book.
http://en.wikipedia.org/wiki/A_New_Kind_of_Science

Hamid Javanbakht
HYPERINCURSIVE METHODS FOR GENERATING FRACTALS IN AUTOMATA RELATED TO DIFFUSION AND
WAVE EQUATIONS
This paper provides modelling tools for automata design in the field of information and formal systems. The concept and
methods of incursion and hyperincursion are firstly applied to the Fractal Machine, a hyperincursive cellular automaton
with sequential computations with exclusive OR, where time plays a central role. Simulations will show the generation of
fractal patterns. The computation is incursive, for inclusive recursion, in the sense that an automaton is computed at the
future time t+1 as a function of its neighbour automata at the present and/or past time steps but also at the future time t+1.
The hyperincursion is an incursion when several values can be generated at each time step. External incursive inputs
cannot be transformed to recursion. (This is really an example of the Final Cause of Aristotle.) But internal incursive inputs
defined at the future time can be transformed to recursive inputs by self-reference. A particular case of self-reference with
the Fractal Machine shows a non-deterministic hyperincursive field. Interference of particles in the Fractal Machine gives
rise to fractal patterns which follow Huygens Principle of Secondary Sources. The superimposition of states is similar to
Deutsch's quantum computing principle. This is also related to digital cellular automata obtained from diffusion and wave
finite difference equations. Zuse proposed to represent physical systems by a computing space based on such a
digitalisation of differential equations. I show that the digital wave equation exhibits waves by digital particles with
interference effects and uncertainty.
Keywords: Recursion; incursion; hyperincursion; hypersets; K. Zuse; computing space; fractal; Huygens; wave equation;
diffusion equation; cellular automata
http://www.informaworld.com/smpp/327464342-20081241/content~db=all~content=a779199250
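As a rough illustration of the incursive idea (a toy example, not Dubois' Fractal Machine itself), the sketch below updates a ring of XOR cells sequentially, so that the value of cell i at time t+1 already uses the freshly computed value of cell i-1 at t+1.

```python
# A toy incursive cellular automaton (inspired by, but not identical to,
# Dubois' Fractal Machine): cells are updated sequentially with XOR, and the
# update of cell i at time t+1 already uses the new value of cell i-1 at t+1.

def incursive_step(cells):
    """One sequential XOR sweep: new[i] = new[i-1] XOR old[i] XOR old[i+1]."""
    n = len(cells)
    new = cells[:]                       # copy; overwritten left to right
    for i in range(n):
        left_future = new[i - 1] if i > 0 else 0   # value already computed at t+1
        right_now = cells[(i + 1) % n]             # neighbour at time t
        new[i] = left_future ^ cells[i] ^ right_now
    return new

state = [0] * 16
state[8] = 1                             # single seed cell
for t in range(8):
    print(''.join('#' if c else '.' for c in state))
    state = incursive_step(state)
```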

Hamid Javanbakht
Fractal Cosmology
In physical cosmology, fractal cosmology is a set of minority cosmological theories which states that the distribution of
matter in the Universe, or the structure of the universe itself, is fractal. More generally, it relates to the usage or appearance of
fractals in the study of the universe and matter. A central issue in this field is the fractal dimension of the Universe or of
matter distribution within it, when measured at very large or very small scales.
Fractals are encountered in both observational and theoretical cosmology, make an appearance at both extremes of the
range of scale, and have been observed at various ranges in the middle. Similarly, the use of fractals to answer questions
in cosmology has been employed by a growing number of serious scholars close to the mainstream, but
the metaphor has also been adopted by others outside the mainstream of science, so some varieties of fractal cosmology
are solidly in the realm of scientific theories and observations, and others are considered Fringe science, or perhaps
metaphysical cosmology. Thus, these various formulations enjoy a range of acceptance and/or perceived legitimacy that
includes both extremes as well as the middle.
http://en.wikipedia.org/wiki/Fractal_cosmology

Pasted from <http://knol.google.com/k/protocomputational-multivarifolds-heirarchical-stratifolds-fractal-bubbles>

Conspansive Manifold, Complexity Generation, Generative Automata, Ontological Knowledge, Infomorphic
Semantics, Polymorphic Isotelesis, Holomorphic Autotelesis, Isomorphic Polytelesis, Automorphic Holotelesis,
Distributed Koinontelesis, Value Utility
Adaptive Distributed Metagrammars, Reflexive Directed Hypergraphs, Yoneda Lemma and Communes,
Duality and Hology, Multiply Nested and Clustered Knowledge with Chu Spaces; Metalogic, Metagames,
Maude; Ising Model, Isbell Duality, IS-Multivarifolds.
Isotelesis: Teleonic Isomorphism, Teleosemantic Equivalence, Local-Global Intercomplementarity, Ontological
Feedback, Telestic Syndiffeonesis, Reflective Equilibrium, Multi-Teleological Confluence.

According to R. Zajonc: "The bulk of theoretical and empirical work in the neurobiology of emotion indicates that
isotelesis—the principle that any one function is served by several structures and processes—applies to emotion
as it applies to thermoregulation, for example (Satinoff, 1982)...In light of the preceding discussion, it is quite clear
that the processes that emerge in emotion are governed not only by isotelesis, but by the principle of polytelesis as
well. The first principle holds that many functions, especially the important ones, are served by a number of
redundant systems, whereas the second holds that many systems serve more than one function. There are very
few organic functions that are served uniquely by one and only one process, structure, or organ. Similarly, there
are very few processes, structures, or organs that serve one and only one purpose. Language, too, is
characterized by the isotelic and polytelic principles; there are many words for each meaning and most words have
more than one meaning. The two principles apply equally to a variety of other biological, behavioral, and social
phenomena. Thus, there is no contradiction between the vascular and the communicative functions of facial
efference; the systems that serve these functions are both isotelic and polytelic."
http://psychology.stanford.edu/~lera/273/zajonc-psychreview-1989.pdf

"IS-Multivarifolds" (ISMs), Multiply Nested Virtual Realities: "forming abstract nouns of action, state, doctrine" or "is-
what-it-is"). Infocognitive Semantics (IS) within holofractologic IS (Isocognitive Stratifications) of Infomorphic Self-
Dualities (IS) accepting transductive IS (Isomorphic Syntax) within co-teleonomic automatons.

"Conspansive Manifold": self-configuring, self-processing metagrammatical medium for primary and secondary
telic-recursive processes to co-teleoplectically generate complex functions. "Generalized Utility": anticipated
potential value in metagame-theoretic and teleo-economical processes which require coordinated strategies to
reach, for local and global cross-temporal feedback, to allow a method for concurrency in self-determinacy and
self-emergence, through hyperstructural, multispansive, "directed" telesis.

"Reflective equilibrium is a state of balance or coherence among a set of beliefs arrived at by a process of
deliberative mutual adjustment among general principles and particular judgments. Although he did not use the
term, philosopher Nelson Goodman introduced the method of reflective equilibrium as an approach to justifying the
principles of inductive logic."
http://en.wikipedia.org/wiki/Reflective_equilibrium
Contenido
• Metaphysical
• Physics, Topology, Logic, and Computation: A Rosetta Stone: http://math.ucr.edu/home/baez/rosetta.pdf
• Maude language and tool: http://www.item.ntnu.no/department/labs/topics/maude
• Ontological Manifolds, Sheaf Semantics, and Concurrently Interacting Teleological (or Teleonomic) Structures
• Mechanism Design: How to Implement Social Goals
• Isotelesis: a new type of "Freemasonry"?
• Isotelesis, Cognitive-Theoretic Model of the Universe, M-Theory, and TOEs
• Examples
• Properties
• The Singularity Institute for Artificial Intelligence: Advance Innovation, Advance Humanity
http://www.culturetv.tv/?p=2013
(spherical minimal hypersurfaces, mutual contact, inner-expansion, re-quantization, nested-linked Hopf tori):

"Nothing can be reasonable or beautiful unless it's made by one central idea, and the idea sets every detail."
—Architect Howard Roark from Ayn Rand's novel The Fountainhead.
http://mctague.org/carl/music/computer/pieces/6_integers/

http://www.soulsofdistortion.nl/SODA_chapter6.html
Nested Spheres:

http://www.fundza.com/mel/recursion/recursion.html
"Bubble" Multiverses:

http://seedmagazine.com/content/article/the_multiverse_problem/
Cosmic Mysteries:

http://technoeventhorizon.blogspot.com/2005_12_01_archive.html
Unbound Telesis:

http://www.cosmographica.com/gallery/portfolio2007/content/414_CosmicBrane_large.html
Isotelesis:

Gyroteleostasis:

http://www.abovetopsecret.com/forum/thread392386/pg4

http://ontolog.cim3.net/file/work/OntologySummit2007/workshop/
The Theory of Theories:
http://www.ctmu.org/
Cognitive-Theoretic Model of the Universe:


Diagram 11: In the above illustration, a spatial cross section of a spacetime diagram (blue line) is rotated toward
the viewer and displayed along the time axis (blue rectangle). The result is a Venn diagram in which circles
represent objects and events, or (n>1)-ary interactive relationships of objects. That is, each circle depicts the
"entangled quantum wavefunctions" of the objects which interacted with each other to generate it. The small dots in
the centers of the circles represent the initial events and objects from which the circles have arisen, while the twin
dots where the circles overlap reflect the fact that any possible new event, or interaction between objects involved
in the old events, must occur by mutual acquisition in the intersect. The outward growth (or by conspansive duality,
mutual absorption) of the circles is called inner expansion, while the collapse of their objects in new events is called
requantization. The circles themselves are called IEDs, short for inner expansive domains, and correspond to pairs
of interactive syntactic operators involved in generalized-perceptual events (note the hological "evacuation" and
mutual absorption of the operators). Spacetime can be illustrated in terms of a layering of such Venn diagrams,
mutual contact among which is referred to as "extended superposition" (in the real world, the Venn diagrams are
3-dimensional rather than planar, the circles are spheres, and "layering" is defined accordingly). Extended
superposition "atemporally" distributes antecedent events over consequent events, thus putting spacetime in
temporally-extended self-contact. In light of the Telic Principle (see below), this scenario involves a new
interpretation of quantum theory, sum over futures. Sum over futures involves an atemporal generalization of
"process", telic recursion, through which the universe effects on-the-fly maximization of a global self-selection
parameter, generalized utility.
http://megafoundation.org/CTMU/Articles/Langan_CTMU_092902.pdf
The conservation principle based on "the boundary of a boundary". Bianchi Identities and the Boundary of a
Boundary: http://books.google.com/books?id=w4Gigq3tY1kC&lpg=PA364&ots=CX7KHEpDeL&dq=boundary%20of%20a%20boundary%20is%20zero&pg=PA
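For reference, the principle and the identities it underlies can be written in standard form (these are the textbook statements, not taken from the linked page):

```latex
% "The boundary of a boundary is zero," for chains and for differential forms:
\partial \circ \partial = 0, \qquad \mathrm{d}^{2} = 0.
% The (differential) Bianchi identity for the Riemann tensor:
\nabla_{[\lambda} R_{\mu\nu]\rho\sigma} = 0,
% and its contracted form, the conservation law for the Einstein tensor:
\nabla^{\mu} G_{\mu\nu} = 0.
```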
4-D Quaternion Julia Set: (This is by far the most popular method for creating hypercomplex fractals.)

http://nylander.wordpress.com/2004/05/
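A bare-bones sketch of the usual escape-time iteration q ← q² + c behind such images, sliced to a 2-D plane for display; the quaternion constant below is an arbitrary illustrative choice, not the one used for the linked images.

```python
# Escape-time iteration for a quaternion Julia set, q <- q^2 + c, sliced to a
# 2-D plane for printing. The constant c is an arbitrary illustrative choice.

def qmul(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def escape_time(q, c, max_iter=30, bailout=4.0):
    for n in range(max_iter):
        if sum(t*t for t in q) > bailout:
            return n
        q = tuple(x + y for x, y in zip(qmul(q, q), c))
    return max_iter

c = (-0.2, 0.6, 0.2, 0.2)                 # arbitrary quaternion constant
for j in range(20):
    row = ''
    for i in range(40):
        q = (-1.5 + 3.0*i/39, -1.5 + 3.0*j/19, 0.0, 0.0)   # 2-D slice of 4-D space
        row += ' .:-=+*#%@'[min(9, escape_time(q, c) // 3)]
    print(row)
```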
These images were created from octonion math, an eight-dimensional algebra:

http://www.mysticfractal.com/fractal_mirage/octonion.html
Hypercomplex Fractals:

http://www.bugman123.com/Hypercomplex/index.html

http://fractalontology.wordpress.com/
Remarks on Turing and Spencer-Brown:
Computation is holographic. Information processing is a formal operation made abstract only by
a reduction in the number of free variables, a projective recording which analyzes from all angles
the entropy or information contained in the space. Thus, basing my results partly on Hooft's
holographic conjecture for physics (regarding the equivalence of string theory and quantum
theory,) and by extending Spencer-Brown's work on algebras of distinction (developed in his
Laws of Form,) I will sketch the outlines of a new theory of universal computation, based not on
system-cybernetic models but on holographic transformations (encoding and projection, or more
precisely, fractal differentiation and homogeneous integration.)

http://fractalontology.files.wordpress.com/2007/12/ universal-computation-and-the-laws-of-form.pdf
"The bulk of theoretical and empirical work in the neurobiology of emotion indicates that isotelesi s—the principle
that any one function is served by several structures and processes—applies to emotion as it applies to
thermoregulation, for example (Satinoff, 1982)...In light of the preceding discussion, it is quite clear that the
processes that emerge in emotion are governed not only by isotelesis, but by the principle of polytelesis as well.
The first principle holds that many functions, especially the important ones, are served by a number of redundant
systems, whereas the second holds that many systems serve more than one function. There are very few organic
functions that are served uniquely by one and only one process, structure, or organ. Similarly, there are very few
processes, structures, or organs that serve one and only one purpose. Language, too, is characterized by the
isotelic and polytelic principles; there are many words for each meaning and most words have more than one
meaning. The two principles apply equally to a variety of other biological, behavioral, and social phenomena. Thus,
there is no contradiction between the vascular and the communicative functions of facial efference; the systems
that serve these functions are both isotelic and polytelic."
http://psychology.stanford.edu/~lera/273/zajonc-psychreview-1989.pdf
The Unreasonable Effectiveness of Mathematics in the Natural Sciences:
http://www.dartmouth.edu/~matc/MathDrama/reading/Wigner.html
The Mathematico-Symbolic Formulation of Teleonic Principles:
http://www3.interscience.wiley.com/journal/109875456/abstract?CRETRY=1&SRETRY=0
http://findarticles.com/p/articles/mi_7349/is_1_22/ai_n32033785/?tag=content;col1
Biomatrix Theory is a Process and Web-Based Systems Theory:
The fundamental unit of observation of Biomatrix Theory is purposeful, structured and regulated process, which is
referred to as "activity system" (or in some of our research articles we also call it process system or teleon).

Activity systems link up with each other to form supply chains across and along levels in the systems hierarchy.
These supply chains interact with each other in a multitude of ways. In fact, one can view the whole web of life (i.e.
the Biomatrix) as a web of interacting supply chains. This gives rise to a web-based view of the world.

At various points in the web the interaction of activity systems becomes dense and gives rise to field-like entity
systems.
http://www.biomatrixweb.com/history2.htm
Computational Topology: Ambient Isotopic Approximation of 2-Manifolds:
http://www.cs.ucdavis.edu/~amenta/pubs/tcs-iso-qed.pdf
The Computational Manifold Approach to Consciousness and Symbolic Processing in the Cerebral Cortex:
Abstract—A new abstract model of computation, the computational manifold, provides a framework for approaching
the problems of consciousness, awareness, cognition and symbolic processing in the central nervous system.
Physical properties involving space, time and frequency, such as the surface of the skin, sound spectrograms,
visual images, and the muscle cross-sections are included in the state of manifold automata. The Brodmann areas
are modeled as recurrent image associations implemented as neurotransmitter fields. Symbols are defined by state
transition behavior near reciprocal-image attractors in dynamical systems. Control masks overlay the images and
regulate awareness of the environment and cognitive reflection.
Index Terms—natural intelligence, consciousness, perception, cognition, models of computation.
Manifolds are the basis of differential geometry where they are defined as spaces which are locally homeomorphic
to R^n [1]. For example, curved lines are one-dimensional manifolds and smooth surfaces are two-dimensional
manifolds. We present a formal definition of Computational Manifold Automata (CMA) as an abstract model of
computation. The definition is similar to the specification of finite state machines [2], formal language theory [3] or
Turing Machines [4] but with the additional mathematical structure of continuous topologies. Within this framework
it is possible to unify the immediate awareness of perception and motor control with the reflective processing of
language and cognition.
http://gmanif.com/pubs/CMAC.pdf
A Unified System of Computational Manifolds:

Fundamental properties of the world in which all life evolved, such as space, time, force, energy and audio
frequencies, are modeled in physics and engineering with differentiable manifolds. A central question of
neurophysiology is how information about these quantities is encoded and processed. While the forces of evolution
are complex and often contradictory, the argument can be made that if all other factors are equal, an organism with
a more accurate mental representation of the world has a better chance of survival. This implies that the
representation in the central nervous system (CNS) of a physical phenomenon should have the same intrinsic
mathematical structure as the phenomenon itself. The philosophical principle, put forth by Monod (1971) and
others, that under certain conditions, biological evolution will form designs that are in accordance with the laws of
nature is referred to as teleonomy.
http://gmanif.com/pubs/TR-CIS-0602-03.pdf
Teleonomy:
http://en.wikipedia.org/wiki/Teleonomy
Meta Data Repository Design Aided by Teleonic Process and Goal Analysis:
http://www.pp.bme.hu/ee/2005_3/pdf/ee2005_3_01.pdf
A Teleonic Framework for the Discussion of Education Policy and Governance:
http://www.springerlink.com/content/p13211540766373g/
Polytely:
http://en.wikipedia.org/wiki/Polytely
Intelligence: Reliability and Validity of Performance Measures in Microworlds:
Polytely refers to the fact that multiple goals, sometimes contradictory, are present so that a reasonable trade-off
is required.
http://dx.doi.org/10.1016/S0160-2896(02)00121-6
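A small illustration of the trade-off that polytely forces: with several goals to maximize at once there is generally no single best option, only a set of non-dominated (Pareto-optimal) compromises. The option names and scores below are made up.

```python
# Polytely as a multi-objective trade-off: with several (possibly conflicting)
# goals there is usually no single best option, only a Pareto front of
# non-dominated compromises. Toy data; the goal names are invented.

def dominates(a, b):
    """a dominates b if a is at least as good on every goal and better on one
    (here all goals are to be maximized)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

options = {                      # (safety, cost-efficiency) scores, both maximized
    'A': (0.9, 0.2),
    'B': (0.6, 0.6),
    'C': (0.2, 0.9),
    'D': (0.5, 0.5),             # dominated by B
}

pareto = [k for k, v in options.items()
          if not any(dominates(w, v) for w in options.values() if w != v)]
print(pareto)                    # ['A', 'B', 'C'] -- the reasonable trade-offs
```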
Dynamic Systems as Tools for Analyzing Human Judgement:
"Subjects acting in these scenarios did indeed face a lot more tasks than in the IQ tests: (a) the complexity of the
situation and (b) the connectivity between a huge number of variables forced the actors to reduce a large amount
of information and anticipate side effects; (c) the dynamic nature of the problem situation required the prediction of
future developments (a kind of planning) as well as long-term control of decision effects; (d) the intransparency
(opaqueness) of the scenarios required the systematic collection of information; (e) the existence of multiple goals
(polytely) required the careful elaboration of priorities and a balance between contradicting, conflicting goals."
http://cogprints.org/1374/0/funke2001.pdf
Decision Making for Complex Socio-Technical Systems: Robustness from Lessons Learned in Long-Term
Radioactive Waste Governance:
"After Mag 1990 "optimum decisions are...always merely optimal with respect to a certain goal, with respect to
other goals often they are not."
...
But: Complex problems such as the present issue are defined by multiple goals, so-called polytely (see Section
7.3). The often quoted key word of "trust", however, is a "complex goal" according to Dörner 1989.
...
In complex issues it is very feasible that conflicting goals exist. The magic spell of "sustainability", a complex goal
as well, encompasses protection of, and leeway for, future generations (see page 13). In the case of a safe
disposition of radioactive waste, both passive safety and "active" control or surveillance need due care and
attention in parallel. Respective decisional situations were given consideration in Flüeler 1998 [D22]. In this
context, some authors insinuate hidden goals, after Keeny & von Winterfeldt 1986 so-called "hidden agendas", i.e.,
opposition against an individual project be just used as a pretext for to achieve wider goals.
...
The concept of polytely does not have to resort to such constructions."
http://books.google.com/books?id=8Q_7gVt5YvYC&lpg=PA138&ots=5_2YaG7a7R&dq=polytely&pg=PA138#v=onepage&q=polytely&f=false
Big Ideas. Big Thinkers. Eric Maskin (Implementation Theory):
http://www.thirteen.org/bigideas/maskin.html
http://en.wikipedia.org/wiki/Eric_Maskin
Knowledge Representation:
http://en.wikipedia.org/wiki/Knowledge_representation
Polysemy:
http://en.wikipedia.org/wiki/Polysemy
Oxford Linguistics: Polysemy: Theoretical and Computational Approaches:
http://books.google.com/books?id=Bt5gexKhIlQC&source=gbs_navlinks_s
Polysemy: Flexible Patterns in Mind and Language:
http://books.google.com/books?id=Hjq_mOzljOIC&source=gbs_navlinks_s
Using Symbolic Knowledge to Disambiguate Words in Small Datasets with Naïve Bayes Classifier:
http://www.lhncbc.nlm.nih.gov/lhc/docs/published/2004/pub2004034.pdf
Compilation of Symbolic Knowledge and Integration with Numerical Knowledge Using Hybrid Systems:
http://www.springerlink.com/content/y2x245583656q4jt/

The IS-Multivarifold is HYJ's theory of epistemological ontology based on the mathematical and physical-theoretic
properties of the correspondence in self-mapping between mind and reality, generalized manifolds, self-duality,
hypersets, fractals, network automata, minimal surfaces, holographic quantum computing, and the co-evolving
quality of epistemic-ontologic telesis. It posits that every sentient being lives in multiply nested, protocomputational
virtual realities, or crystallographic "bubble" universes. Also similar to a 4-D "World Crystal Lattice", with visible
matter emerging from telesis (a k ind of pre-spacetime) as quantum "foam" or virtual particle/anti-particle "globules"
or "cells", and that so-called vibrations and ripples in string or membrane is more like a dynamically self-
determining, recursively interacting, fractal hyperset, generative automaton, which co-evolves with subsystem
parameters of the genetic and grammatical meta-linguistic rewriting logic. Sentient beings (there could be multiple
sentient beings nested in a stratified manner in any one meta-being: this considers each cell in a body as
sentient) may be thought of as local telors (hypercomplex or directed functors), or automorphisms of the universe
which contains it (each "cell" and "language" has the entire "genome" and "grammar"), which aids in maintaining
semantic cohesion through joint participation, and shared coordination (koinontelic). Ideally, automorphisms learn
to act reflexively in a distributed manner which aligns with the needs of some rough conception of the
whole (holotelic). It co-creates ontogenic growth through gyroteleostasis and reflective autopoiesis to generate a
global telor, which represents an isocognitive abstraction, or a self-dual, complementary reflection of its own beliefs
or shared ideals. This is an indirect consequence and result of the inherent uncertainty in Chu Spaces (automata
with quantum aspects) and the infocognitive potential of unbound telesis allowed for self-configurative and self-
executive freedoms. This stage of self-selective growth (autotelic) generates a holomorphic universe with the
consistency of tautological autology, or supertautologicity. Through covariant and contravariant teleo-recursions, a
myriad of functorial arrangements may be possible; for example, various groups may exhibit parallel development
and genotypical convergence (isotelic) and with relative simultaneity exhibit divergent multifunctionality and
phenotypical plasticity (polytelic). Or vice-versa, when genotypical divergence results in phenotypical convergence.
The same phenomenon applies to holotelism when combined with autotelism. I believe this gets into mechanism
design and implementation theory for evaluating the semantics of metagames. Ultimately, the meaning and
purpose of science and religion may be equivalent, although their methods for dealing with ontological questions
are rather different, eventually all pretense about history and finality should give way to a form of metaphysical
naturalism in which it is recognized that self-determinacy and complex-goal efficiency (represented as minimal
surfaces or IS-Multivarifolds) may have always been the only "intelligence" behind teleonomic function designs.
What that suggests however is no less spiritually uplifting, as we may be part of some "fractal multicellular
hypercomplex" which also may set up the conditions for its own emergence. All we can know about the
directedness of its telesis is that it exhibits a teleonomic quality all its own, the underlying symmetry of which is
probably beyond human linguistic reductions or evolutionary concepts, all one can know is that our bubble world
was created like all bubbles are formed, and they don't last forever. If language is the infinite use of finite means, or
as I have described it, gyroteleostasis, life on earth may generate and not be considered to have inherent value to
some distant civilization in another galaxy, however for the planet, we may have been a means for its survival,
whether we realize that or not may result in whether we consider it our responsibility to nurture this hypercomplex
system we call "earth", and not limit our attention to any one category of teleonomy. There is a fifth way, and I
believe it will involve a respect for life, and maintaining its health, with all its glorious diversity. I believe if the earth
as Gaia or Medea exhibits multiple composite goals, it may be a member of the hypercomplex called the galaxy,
and we may not even be native to her soil, however we should fashion ourselves after the "Isotelic" class in ancient
times, we are equal to any other native lifeform on this planet, as long as we follow the rules and help the host "city-
state". This way is what I call "Isotelesis", and assumes no commitments to spirits, gods or deities. Simply the
notion that we are all citizens of the earth, as human beings, we are not aliens (maybe some once were, but they
no longer are thought of that way). As non-native but loyal members, humans may need a way to reaffirm their
commitment to some common goal or purpose, however sans religious or political mechanisms. Back to
philosophy, in terms of the process of Conspansion, the phenomena of Isotelesis may be introduced in the contexts
of certain orthogonal concepts such as information-gravity, time-space and energy-matter, in a
sense time manifests energy which dissipates through entropy allowing for information to be reflected in a self-
dual way to become a type of space (I prefer the generalizable topology of Chu Spaces) which acts like a prism for
the vibrational separation of the states of matter in terms of their frequency according to their scale, which
distorts space and the iridescent quantum foam from which it emerges to generate what is
called gravity which also changes the flow of time, energy, information, space, and matter in a mutually reinforcing
dance which can only be contained by the meta-equation which requires hological consistency or Isotelesis. These
are examples of isotelic connections, in this context of the dualities, however the notion of Isotelesis extends
beyond E8 to infinite symmetries. It confirms the notion that all structures or dimensions of reality are related at
more fundamental levels. Chris Langan calls this syndiffeonesis and the notion of pre-spacetime (which Edward
Witten has already reflected on) as something which reduces to "telesis". My notion is simply to express the
conservation of the whole as equivalent to itself, and which compensates for "borrowed" or "generated" energy or
information, this may be the reason life exists, in order to maintain the global consistency of reality through a self-
balancing meta-equation.
Uncertainty: (imaginary time, potential energy, inner expansion, coinversion around the IS-Multivarifold)
Complementarity: (gravitational zone, material frequency, requantization, incoversion within the IS-Multivarifold)
Quantum Mechanics: Uncertainty, Complementarity, Discontinuity and Interconnectedness:
http://www.physics.nyu.edu/faculty/sokal/transgress_v2/node1.html
Trondheim Matchmaking 2008: Keynote: Roy Ascott
http://matchmaking.no/wp/2008/?p=263
Linear logic is a substructural logic proposed by Jean-Yves Girard as a refinement of classical and intuitionistic
logic, joining the dualities of the former with many of the constructive properties of the latter.[1] Although the logic
has also been studied for its own sake, more broadly, ideas from linear logic have been influential in fields such
as programming languages, game semantics, and quantum physics,[2] particularly because of its emphasis on
resource-boundedness, duality, and interaction.
http://en.wikipedia.org/wiki/Linear_logic
Is there an equation for physics itself as a subset of mathematics? I believe there is and if it were to be found it
would be called the ultimate theory of physics. Moreover, I believe that it can be found and that it has a lot to do
with what is different about the way a physicist looks at the world compared to a mathematician...We can then try to
elevate the idea to a more general principle of representation-theoretic self-duality, that a fundamental theory of
physics is incomplete unless such a role-reversal is possible. We can go further and hope to fully determine the
(supposed) structure of fundamental laws of nature among all mathematical structures by this self-duality condition.
Such duality considerations are certainly evident in some form in the context of quantum theory and gravity. The
situation is summarised to the left in the following diagram. For example, Lie groups provide the simplest examples
of Riemannian geometry, while the representations of similar Lie groups provide the quantum numbers of
elementary particles in quantum theory. Thus, both quantum theory and non-Euclidean geometry are needed for a
self-dual picture. Hopf algebras (quantum groups) precisely serve to unify these mutually dual structures." (The
reader may also wish to see the original document online.)
http://planetmath.org/encyclopedia/IHESOnTheFusionOfMathematicsAndTheoreticalPhysics2.html

Nigel Hitchin:
http://en.wikipedia.org/wiki/Nigel_Hitchin
Vladimir Drinfel'd:
http://en.wikipedia.org/wiki/Drinfeld
Ruziewicz Problem:
http://en.wikipedia.org/wiki/Ruziewicz_problem
Quasi-Hopf Algebra:
http://en.wikipedia.org/wiki/Quasi-Hopf_algebra
Hom-Tensor Adjunctions for Quasi-Hopf Algebras:
http://deposit.ddb.de/cgi-bin/dokserv?idn=991213734&dok_var=d1&dok_ext=pdf&filename=991213734.pdf
Dror Bar-Natan:
http://www.math.toronto.edu/~drorbn/LOP.html
Generalized Complex Structure:
http://en.wikipedia.org/wiki/Generalized_complex_structure
Almost Complex Manifold:
http://en.wikipedia.org/wiki/Almost_complex_manifold
The "Bubble Universe" Theory:
http://www.youtube.com/watch?v=2GNkazRo-tE&feature=related
Minimal Surfaces, Stratified Multivarifolds, and the Plateau Problem:
Dao Trong Thi formulated the new concept of a multivarifold, which is the functional analog of a geometrical
stratified surface and enables us to solve Plateau's problem in a homotopy class.
http://books.google.com/books?id=mncIV2c5Z4sC&source=gbs_navlinks_s
Learning on Varifolds:
In this paper, we propose a new learning framework based on the mathematical concept of varifolds (Morgan,
2000), which are the measure-theoretic generalization of differentiable manifolds. We compare varifold learning
with the popular manifold learning and demonstrate some of its specialties. Algorithmically, we derive a
neighborhood refinement technique for hypergraph models, which is conceptually analogous to varifolds, give the
procedure for constructing such hypergraphs from data and finally by using the hypergraph Laplacian matrix we are
able to solve high-dimensional classification problems accurately.
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4685510
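As an indication of the kind of operator such hypergraph methods rest on, here is one standard normalized hypergraph Laplacian (in the style of Zhou et al.); the cited paper's exact construction may differ.

```python
# A generic normalized hypergraph Laplacian (in the style of Zhou et al.),
# offered only as an illustration of the kind of operator such methods use;
# the cited paper's exact construction may differ.

import numpy as np

def hypergraph_laplacian(H, w=None):
    """H: incidence matrix (n_vertices x n_edges), H[v, e] = 1 if vertex v lies in edge e.
    Returns L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}."""
    n, m = H.shape
    w = np.ones(m) if w is None else np.asarray(w, float)
    dv = H @ w                     # weighted vertex degrees
    de = H.sum(axis=0)             # hyperedge sizes
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    De_inv = np.diag(1.0 / de)
    W = np.diag(w)
    theta = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return np.eye(n) - theta

# 4 vertices, 2 hyperedges: {0, 1, 2} and {2, 3}
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
L = hypergraph_laplacian(H)
eigvals = np.linalg.eigvalsh(L)    # spectrum used for embedding/classification
print(np.round(eigvals, 3))
```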
Hypergraph:
http://en.wikipedia.org/wiki/Hypergraph
Minimal Surfaces and Particles in 3-Manifolds:
http://arxiv.org/abs/math.DG/0511441
Monopoles and Four-Manifolds (Edward Witten):
http://www.mrlonline.org/mrl/1994-001-006/1994-001-006-013.pdf
Global Theory of Minimal Surfaces:
The subjects covered include minimal and constant-mean-curvature submanifolds, geometric measure theory and
the double-bubble conjecture, Lagrangian geometry, numerical simulation of geometric phenomena, applications of
mean curvature to general relativity and Riemannian geometry, the isoperimetric problem, the geometry of fully
nonlinear elliptic equations and applications to the topology of three-dimensional manifolds.
http://books.google.com/books?id=HRr3jdBDXIgC&source=gbs_navlinks_s
Isoperimetric Inequalities for Multivarifolds:
http://www.iop.org/EJ/abstract/0025-5726/26/2/A03
Multidimensional Parameterized Variational Problems on Riemannian Manifolds:
http://www.springerlink.com/content/171q92207n778361/
Modern State of Minimal Surface Theory:
http://dfgm.math.msu.su/bookskaf/Fom1990-f.pdf
Computing Matveev's Complexity of Non-Orientable 3-Manifolds via Crystallization Theory:
http://www.matematica.unimore.it/0ricerca/Quaderni/quaderno_55.pdf
Closer to Truth:
http://www.closertotruth.com/
Energy and Information Transfer in Biological Systems: How Physics Could Enrich Biological Understanding:
http://tinyurl.com/yffprd8

{HYJ: I think spinor, twistor, and Clifford algebra methods for studying space-time or information-cognition may be
interesting, or perhaps linked and nested quasi-Hopf fibrations, or hypercomplex multivarifolds which evolve
according to holographic quantum information which builds up relievable 2-D tension producing entropy and the
flow of time along minimal surfaces.}
Third-Generation Nuclear Magnetic Resonance: Diffusion Tensor Magnetic Resonance Tomography and Cerebral
White Matter Tractography:
http://www.dm.unibo.it/~GRUPPI/convegni/gala09/schempp.pdf
Spinors, twistors, Clifford algebras, and quantum deformations:
http://books.google.com/books?id=XAI_yheWFkUC&source=gbs_navlinks_s
Phenotypic Plasticity:
http://en.wikipedia.org/wiki/Phenotypic_plasticity
As the Intersection of "Metaphysical Naturalism" and "Intelligent Design":
http://www.infidels.org/library/modern/bill_schultz/crsc.html

The laws of physics and emergence


The status of the laws of physics - what are they and where do they come from? - is an old philosophical problem
that has received a new twist with modern scientific ideas like string theory and the multiverse. I am developing a
new slant on this problem by investigating whether the laws of physics might be emergent, rather than absolute,
universal and Platonic. (See my paper for more about this topic). I have suggested a link between the way the laws
of physics operate and the large scale properties of the universe. This opens the way to the possibility of hitherto
unknown "higher level" emergent laws, such as biological principles of organization, that are consistent with, but
not reducible to, the traditional laws of physics. In connection with this project, I am trying to sharpen the concept of
"downward causation", where wholes have causal efficacy over parts. I am also seeking to eliminate the age-old
dualism between absolute unchanging laws of physics and time-dependent contingent states of the world, and
develop a notion where laws and states might co-evolve.
http://cosmos.asu.edu/research/current.htm
As a scientist I am 1000% committed to the Scientific Method but I see it as a particular way of exploring reality.
One that we might now need to understand better by seeing it from the outside.

What I am going to argue now is that what we know about quantum gravity — what we have seen in earlier
posts — is telling us that the Scientific Method itself is perhaps the fundamental ‘metaequation’ of physics. To see
what I have in mind, consider playing chess but forgetting or not being aware of the rules of chess (perhaps
because you learned them at a very early age). Then as you play, you experience the reality of chess, the
frustration of being checkmated and so forth. In this sense the joining of a club, the acceptance of rules or
constraints 'creates' a bit of reality, the reality of chess.

What if Physical Reality is no different, created by the rules of looking at the world as a Scientist? In other words,
just maybe, as we search for the ultimate theory of physics we are in fact rediscovering our own assumptions in
being Scientists, the Scientific Method?
http://www.cambridgeblog.org/2008/12/reflections-on-a-self-representing-universe/

The Ontology of Cyberspace:


http://www.cse.buffalo.edu/~rapaport/Papers/cyber.pdf
In mathematics, distributive lattices are lattices for which the operations of join and meet distribute over each
other. The prototypical examples of such structures are collections of sets for which the lattice operations can be
given by set union and intersection. Indeed, these lattices of sets describe the scenery completely: every
distributive lattice is – up to isomorphism – given as such a lattice of sets.
http://en.wikipedia.org/wiki/Distributive_lattice
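A small, concrete check of this: taking joins as unions and meets as intersections on the subsets of a three-element set, both distributive laws hold on every triple (illustration only).

```python
# Lattices of sets: join = union, meet = intersection. A brute-force check of
# both distributivity laws on the power set of a small universe (illustration only).

from itertools import combinations, product

universe = [1, 2, 3]
powerset = [frozenset(c) for r in range(len(universe) + 1)
            for c in combinations(universe, r)]

def is_distributive(elems):
    return all(
        (a | (b & c)) == ((a | b) & (a | c)) and   # join distributes over meet
        (a & (b | c)) == ((a & b) | (a & c))       # meet distributes over join
        for a, b, c in product(elems, repeat=3))

print(is_distributive(powerset))   # True: every lattice of sets is distributive
```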

"Infomorphism" → Meaning
A morphism is a type of maths function. It transforms objects but also preserves some of their properties. An
example would be a group morphism. This transforms elements of one group into elements of another while
respecting how the elements interact. In short, a morphism is a structure-preserving function.
An "info-morphism" transforms information while preserving meaning within it.
The logo represents functions which structurally map information.
The maths notation for functions from A to B is "A -> B".
Information mapping functions would be notated "INFO -> INFO".
More concisely it can be written as "INFO" with a loop back to itself.
Finally, the function is labelled a "morphism" - indicating its special behaviour.
http://www.infomorphism.com/index.php?id=meaning
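The quoted page's group-morphism example in miniature (my illustration, not the site's): the parity map preserves the group operation, so the way elements interact survives the mapping.

```python
# A group morphism (homomorphism) as a structure-preserving function:
# parity : (integers, +) -> ({0, 1}, addition mod 2) satisfies
# parity(a + b) == (parity(a) + parity(b)) mod 2, so structure is preserved.

import random

def parity(n):
    return n % 2

for _ in range(1000):
    a, b = random.randint(-50, 50), random.randint(-50, 50)
    assert parity(a + b) == (parity(a) + parity(b)) % 2   # respects how elements interact
assert parity(0) == 0                                     # sends identity to identity
print("parity is a structure-preserving function (a group homomorphism)")
```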
"In other words, telesis is a kind of “pre-spacetime” from which time and space, cognition and
information, state-transitional syntax and state, have not yet separately emerged. Once bound
in a primitive infocognitive form that drives emergence by generating “relievable stress” between
its generalized spatial and temporal components - i.e., between state and state-transition
syntax – telesis continues to be refined into new infocognitive configurations, i.e. new states and
new arrangements of state-transition syntax, in order to relieve the stress between syntax and
state through telic recursion (which it can never fully do, owing to the contingencies inevitably
resulting from independent telic recursion on the parts of localized subsystems). As far as
concerns the primitive telic-recursive infocognitive MU form itself, it does not "emerge" at all
except intrinsically; it has no "external" existence except as one of the myriad possibilities that
naturally exist in an unbounded realm of zero constraint. Telic recursion occurs in two stages,
primary and secondary (global and local). In the primary stage, universal (distributed) laws are
formed in juxtaposition with the initial distribution of matter and energy, while the secondary
stage consists of material and geometric state-transitions expressed in terms of the primary
stage. That is, where universal laws are syntactic and the initial mass-energy distribution is the
initial state of spacetime, secondary transitions are derived from the initial state by rules of
syntax, including the laws of physics, plus telic recursion. The primary stage is associated with
the global telor, reality as a whole; the secondary stage, with internal telors ("agent-level"
observer-participants). Because there is a sense in which primary and secondary telic recursion
can be regarded as "simultaneous", local telors can be said to constantly "create the universe"
by channeling and actualizing generalized utility within it."
http://www.scribd.com/doc/23694328/The-Cognitive-Theoretic-Model-of-the-Universe-A-New-Kind-of-Reality-Theory

"The term intentionality was introduced by Jeremy Bentham as a principle of utility in his
doctrine of consciousness for the purpose of distinguishing acts that are intentional and acts
that are not. The term was later used by Edmund Husserl in his doctrine that consciousness is
always intentional, a concept that he undertook in connection with theses set forth by Franz
Brentano regarding the ontological and psychological status of objects of thought. It has been
defined as "aboutness", and according to the Oxford English Dictionary it is "the distinguishing
property of mental phenomena of being necessarily directed upon an object, whether real or
imaginary". It is in this sense and the usage of Husserl that the term is primarily used in
contemporary philosophy."
http://en.wikipedia.org/wiki/Intentionality

Some Chris Langan Quotes:


"You mention Bill Dembski‘s 3-way distinction between determinacy, nondeterminacy (chance) and design. In the
CTMU, this distinction comes down to the 3-way distinction between determinacy, nondeterminacy and self-
determinacy, the last being associated with telic recursion and the others being secondarily defined with respect to
it. Telic recursion is just another term for "metacausation"; instead of simply outputting the next state of a system, it
outputs higher-order relationships between state and law (or state and syntax).

Regarding the distinction between origins and evolution, not too many people are clear on it. This distinction is
based on the standard view of causality, in which there seems to be a clean distinction between the origin and
application of causal principles, specifically first-order Markovian laws of nature. In the CTMU, origins distribute
over causes in a new kind of structure called a conspansive manifold, and are therefore not cleanly distinguishable
from causality. Both are products of a higher-order process, telic recursion. To put it in simpler terms, evolution
consists of events which originate in causes which originate in (teleological) metacauses. So in the CTMU, to talk
about evolution is to talk about metacausal origins by ontogenic transitivity."
...
"Entitled "The Resolution of Newcomb's Paradox", it utilized a (then brand new) computational model of the
universe based on nested virtual realities. The model was called NeST, short for Nested Simulation Tableau.
Subsequently, other papers on the CTMU were published in that journal and elsewhere, some developing its
cosmological implications. This can all be documented."
http://www.iscid.org/boards/ubb-get_topic-f-6-t-000351-p-2.html
The Resolution of Newcomb's Paradox:
http://www.megasociety.net/noesis/44/newcomb.html
Algebraic Approach to Quantum Gravity: Relative Realism:
This is a model of reality that is adapted to the fundamental nature of knowledge, achieves the minimal goal of
explaining how we perceive it and also has at its core notions of consciousness or self-awareness and free will.
http://philsci-archive.pitt.edu/archive/00003345/01/qg1. pdf
On Einstein's Razor: Telesis-Driven Introduction of Complexity into Apparently Sufficiently Non-Complex Linguistic
Systems
The notion that a linguistic system that is powerful enough to accept any acceptable language but
insufficiently complex to meet specific goals or needs is explored. I nominate Chomsky's generative grammar
formalism as the least complex formalism required to describe all language, but show how without the addition of
further complexity, little can be said about the formalism itself. I then demonstrate how the O(n) parsing of
pseudoknots, a previously difficult to solve problem, becomes tractable by the more complex §-Calculus, and finally
close with a falsifiable hypothesis with implications in epistemological complexity.
Keywords: parsing, scientific inquiry, telesis, necessary complexity
http://www.iscid.org/papers/Jackson_EinsteinsRazor_050205.pdf
http://en.wikipedia.org/wiki/Adaptive_grammar
Adaptive-Predicates in Empty-Start Natural Language Parsing:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.17.3768
Aristotle May Provide the Key to Quantum Gravity:
Rather, I think that this deepest and most long-standing of all problems in fundamental physics still needs a
revolutionary new idea or two for which we are still grasping. More revolutionary even than time-reversal. Far more
revolutionary and imaginative than string theory. In this post I'll take a personal shot at an idea — a new kind of
duality principle that I think might ultimately relate gravity and information.
http://www.cambridgeblog.org/2008/12/aristotle-may-provide-the-key-to-quantum-gravity/
The Institute of Noetic Sciences:
http://www.noetic.org/
Mirror Neurons, Mirrorhouses, and the Algebraic Structure of the Self:
http://www.goertzel.org/dynapsyc/2007/mirrorself.pdf

Special Issue on Integration of Symbolic and Connectionist Systems:


http://books.google.com/books?id=5JMrzpnGuzEC&lr=&source=gbs_navlinks_s
Memory and the Computational Brain: Why Cognitive Science Will Transform Neuroscience:
http://books.google.com/books?id=5JMrzpnGuzEC&lr=&source=gbs_navlinks_s
Learning the Systematic Transformation of Holographic Reduced Representations:
http://dx.doi.org/10.1016/S1389-0417(01)00059-6
The Spencer-Brown form is a simple mathematical concept that formalizes the notion that a mathematical object is formally
identical to what it is not (Spencer-Brown 1997, pp. ix and 180). The Spencer-Brown form is defined by two
primitive equations which are its axioms (Spencer-Brown 1969):

1. Condensation: Two instances of the form are equivalent to one instance of the form if they are placed in the
same space.

2. Cancellation: Two instances of the form are equivalent to no instance of the form (that is, to empty space) if one of the
forms is the argument of the other form.
http://mathworld.wolfram.com/Spencer-BrownForm.html
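A small rewriting sketch of these two axioms, with forms encoded as nested lists (an empty list standing for the mark); this is only one way to mechanize the primary arithmetic, not Spencer-Brown's own notation.

```python
# A tiny rewriting sketch of the two Laws of Form axioms, with expressions as
# nested lists: [] is the mark (cross), and a list's contents share one space.
# Condensation: two marks in the same space are one mark:   () () = ()
# Cancellation: a mark inside a mark is the empty space:    (())  =

def simplify(expr):
    """Recursively apply condensation and cancellation to a space (list of forms)."""
    out = []
    for form in expr:
        inner = simplify(form)          # simplify the contents of this cross
        if inner == [[]]:               # cancellation: a cross containing only a cross
            continue                    # ... vanishes into empty space
        if inner == [] and [] in out:   # condensation: a second empty cross
            continue                    # ... condenses with the one already there
        out.append(inner)
    return out

print(simplify([[], []]))               # [[]]  i.e. () ()   -> ()
print(simplify([[[]]]))                 # []    i.e. (())    -> empty space
print(simplify([[[[]]], []]))           # [[]]  i.e. ((())) () -> ()
```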
Logico-Algebraic models for Incomplete Information Systems, Knowledge Discovery in Databases and Data
Mining, Formal Logic and Philosophy of Language.
http://www.surf.it/logic/Logica.htm

Institute for Knowledge Innovation and Technology: IKIT is a global network of organizations committed to the
advancement of knowledge building technology and practices in all sectors of society.
http://ikit.org/index.html

Games, Groups, and the Global Good:


How do groups form, how do institutions come into being, and when do moral norms and practices emerge? This
volume explores how game-theoretic approaches can be extended to consider broader questions that cross scales
of organization, from individuals to cooperatives to societies. Game theory's strategic formulation of central problems
in the analysis of social interactions is used to develop multi-level theories that examine the interplay between
individuals and the collectives they form. The concept of cooperation is examined at a higher level than that usually
addressed by game theory, especially focusing on the formation of groups and the role of social norms in
maintaining their integrity, with positive and negative implications. The authors suggest that conventional analyses
need to be broadened to explain how heuristics, like concepts of fairness, arise and become formalized into the
ethical principles embraced by a society.
http://books.google.com/books?id=2v9gh_2DBWMC&source= gbs_navlinks_s
Multiobjective Optimization:
http://en.wikipedia.org/wiki/Multiobjective_optimization
Ontological Feedback in Multiagent Systems:
http://proceedings.eldoc.ub.rug.nl/FILES/HOME/bnaic/2004/pt2/ontological/036Ontological.pdf
Cellular Automata are like Wei Qi and similar to Feynman Checkerboards which, in 1+1 dimensions, are
isomorphic to 1-dimensional Ising Models:
http://www.valdostamuseum.org/hamsmith/ficw2.html
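For the Ising side of that correspondence, a minimal 1-dimensional Ising chain with Metropolis updates looks like the sketch below (arbitrary parameters, included only to make the reference concrete; it does not implement the checkerboard isomorphism itself).

```python
# A minimal 1-D Ising model: periodic chain of spins +1/-1 with Metropolis updates.

import random, math

def energy(spins, J=1.0):
    n = len(spins)
    return -J * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def metropolis_sweep(spins, beta, J=1.0):
    n = len(spins)
    for _ in range(n):
        i = random.randrange(n)
        # Energy change from flipping spin i (only its two bonds are affected).
        dE = 2 * J * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins

spins = [random.choice([-1, 1]) for _ in range(32)]
for _ in range(200):
    metropolis_sweep(spins, beta=1.0)
print(energy(spins), ''.join('+' if s > 0 else '-' for s in spins))
```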
Statistical Mechanics of Games ~ Evolutionary Game Theory:
(Keywords: Evolutionary Game Theory, Statistical Mechanics, Ising Model, SK Model, Percolation, Econphysics):
http://www.hiranolab.jks.ynu.ac.jp/~rcme/abstract/08kikkawa_abst.pdf
Tableau-Based Decision Procedures for the Logics of Subinterval Structures over Dense Orderings:
http://mail.maths.wits.ac.za/~goranko/papers/TableauSubintervalStructuresOverDenseOrderingsJLC.pdf
Interval Temporal Logic:
http://en.wikipedia.org/wiki/Interval_temporal_logic

In any category C, a span, roof, or correspondence, from an object x to an object y is a diagram

  x ← s → y    (an object s together with morphisms f : s → x and g : s → y)

where s is some other object of the category.
This diagram is called a 'span' because it looks like a little bridge; 'roof' is similar. The term
'correspondence' is prevalent in geometry and related areas; it comes about because a
correspondence is a generalisation of a binary relation.
http://ncatlab.org/nlab/show/span
• A multi(co)span is supposed to be something that generalizes span (and cospan) both horizontally
and vertically: it may have a number of legs different from 2, but more importantly it need not be a
single roof but can be a more complex diagram.
• A multispan is supposed to be a model for a hierarchical cell complex with
• a single top “cell” K(⊤) ∈ C (an object in some ambient category C);
• for each cell K(a) a collection of morphisms {K(b_i) → K(a)}_i into it, to be thought of as pieces of
boundary components of K(a);
• in the language of hyperstructures we would say that K(a) is a bond for the K(b_i);
• and so on;
• such that the entire resulting diagram commutes, expressing how one boundary piece may
be part of different higher order boundary pieces.
Multi-cospans of cubical shape have been introduced and studied by Marco Grandis in his work
on Cospans in Algebraic Topology. Grandis also formulates the idea that an extended QFT should be
a morphism with domain extended cobordisms modeled as multi-cospans.
http://ncatlab.org/nlab/show/multispan
Nils Baas has been emphasizing for many years, in print and in private communication, the
conviction that the usual notions of n-category, infinity-category, omega-category in higher category
theory are not naturally suited for describing
• extended cobordisms such as appearing in
• the tangle hypothesis
• in extended quantum field theory
• hierarchical systems such as appearing in
• complex systems;
• biology.
The point is essentially that the directedness of morphisms and — related to that — the binary
notion of source and target in categories and higher categories are notions alien to these contexts,
which in applications have to be and are essentially removed again in a second step by adding extra
structure and requiring further properties, such as various monoidal structures and dualities, which
allow to change the direction of morphisms, to collect objects together, etc.
In contrast to that, Baas pointed out that more naturally the above situations are thought of from
the beginning in terms of hierarchies of what he calls bonds, where, quite generally, a bond is an
object equipped with information of how a collection of sub-bonds sits inside it, bound by the bond.
A sketch of a generic such situation of hierarchical bonds is a diagram of the shape

  β_i → b_1, b_2, b_3 → B

where a bond B binds sub-bonds b_1, b_2 and b_3, which in turn bind sub-bonds β_i.
For instance B might be an extended cobordism with three boundary components b_1, b_2, b_3, which
in turn have pieces of boundary components β_i.
Nils Baas has made, in print and in private communication, suggestions for a formalization of such
systems of hierarchical bonds and coined the term hyperstructures for these. One reference is
• Nils Baas, Introducing abstract matter (pdf)
http://ncatlab.org/nlab/show/hyperstructure
Philosophical interest in n-categories may be characterised as belonging to one of two kinds.
• Metaphysical: The formation of a new language which may prove to be as important for philosophy
as predicate logic was for Bertrand Russell and the analytic philosophers he inspired.
• Illustrative of mathematics as intellectual enquiry: Such a reconstitution of the fundamental
language of mathematics reveals much about mathematics as a tradition of enquiry stretching back
several millennia, for instance, the continued willingness to reconsider basic concepts.
Metaphysical
• Higher category theory provides a new foundation for mathematics - logical and philosophical.
• Higher category theory refines the notion of sameness to allow more subtle variants. It advocates
the avoidance of evil.
• There ought to be a categorified logic, or 2-logic. There are some suggestions that existing work on
modal logic is relevant. Blog discussion: I, II, III, IV. Mike Shulman’s project: 2-categorical logic.
• Higher category theory may provide the right tools to take physics forward. A Prehistory of n-
Categorical Physics. See also physics.
• More speculatively, category theory may prove useful in biology.
http://ncatlab.org/nlab/show/philosophy
Physics, Topology, Logic, and Computation: A Rosetta Stone:
http://math.ucr.edu/home/baez/rosetta.pdf

Modal Logics are Coalgebraic:


This paper substantiates the authors’ firm belief that the systematic exploitation of the coalgebraic nature of modal logic will
not only have impact on the field of modal logic itself but also lead to significant progress in a number of areas within
computer science, such as knowledge representation and concurrency/mobility.
http://www.doc.ic.ac.uk/~dirk/Publications/cj2009.pdf
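The basic observation can be made concrete in a few lines: a Kripke frame is exactly a coalgebra for the powerset functor, a single map step : X → P(X), and the box and diamond modalities are read off from it. A hand-rolled toy example (not taken from the paper):

```python
# Kripke frames as coalgebras for the powerset functor: a single map
# step : X -> P(X) sends each world to its set of successors, and the modal
# operators are computed from it.

worlds = {'w0', 'w1', 'w2'}
step = {'w0': {'w1', 'w2'},      # the coalgebra structure map X -> P(X)
        'w1': {'w2'},
        'w2': set()}

def box(phi):       # [] phi holds where every successor satisfies phi
    return {w for w in worlds if step[w] <= phi}

def diamond(phi):   # <> phi holds where some successor satisfies phi
    return {w for w in worlds if step[w] & phi}

p = {'w2'}                      # valuation of an atomic proposition p
print(sorted(diamond(p)))       # worlds with a p-successor: ['w0', 'w1']
print(sorted(box(p)))           # worlds all of whose successors satisfy p: ['w1', 'w2']
```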

Some interests: pragmatic semantics, ontology mapping, artificial intelligence, knowledge discovery, computational
linguistics, innovation management, goals collaboration, data mashing, social networks, sheaf semantics for
concurrent interacting objects, chu spaces for game theory, cognitive modeling, parallel teleonomic processes,
epistemological representation, relevance paradox, complex adaptive systems, algebraic semiotics, computational
manifolds, game-theoretic problems in network economics with mechanism design and implementation theory
solutions.
Koinontely: Knowledge Ontology Integ-gregation Networks + Collaborative Innovation Networks +
Ontoteleoplectics + Concurrent Engineering + Swarm Intelligence + Sheaf Semantics for Cooperating Agents +
Isotelesis (equitable telestics, parallel development, convergent equilibriums) + Polytely (simultaneous goals,
outcome multiplicity, potential for mutual interference) + Polysemy (a sign, word, or phrase with two or more related
meanings) + Distributed Infomorphism + Sensor Interoperability + RFID + Cognitive Radio + Dynamic Mechanism
Design + Social Implementation Theory + Generalized Expected Utility
Koino- (Greek: Common, Shared, Public, Open-to-All)
i.e.- Koinotropy: Sharing the interests of a group of people.

-ont- (Greek: Individual, Being, Existence)


i.e. - Ontology: The metaphysical study of the nature of being and existence

-tely (Greek: Goal, End, Purpose, Completion, Result, Perfection, Fulfillment, Up To or At a Distance)
i.e.- Isotely: Equivalent goals or purpose; Parallel development or convergence
-semy (Greek: Sign, Signal, Mark, Meaning)
i.e.- Polysemy: Capacity for a sign (word, phrase, etc.) or signs to have multiple related meanings.

Integ- (Latin: Whole, Complete)


i.e.- Integration: The act of combining into an integral whole
-gregation (Latin: Flock; Assemble, Gather, Gather Together)
i.e.- Aggregation: Several things grouped together or considered as a whole
-plectic (Latin: to fold, bend, curve, turn, twine, twist, braid, interweave, weave)
i.e.- Complex: A conceptual whole made up of interconnected or related structures
"Ontoteleoplectics" is how "Koinontely" will work, and could integrate with "Twine" and "Knol".
innovative Collaborative Knowledge Networks (iCKN):

The goal of this research project at the MIT Center for Collective Intelligence is to help
organizations to increase knowledge worker productivity and innovation, by creating
"Collaborative Innovation Networks (COINs)" .
http://www.ickn.org/
Peter Gloor:
http://cci.mit.edu/pgloor/
Swarm Creativity Blog:
http://swarmcreativity.blogspot.com/
Information Integration, Databases and Ontologies:
Although ontologies are promising for certain applications, many difficult problems remain, in part due to the essentially
syntactic nature of ontology languages (e.g. OWL), the computationally intractable nature of highly expressive ontology
languages (such as KIF), and the difficulty of interoperability among the many existing ontology languages, as well as among
the ontologies written in those languages. Difficulties of another kind stem from the unrealistic expectations engendered by the
many exaggerated claims made in the literature.
The goal of research in Data, Schema and Ontology Integration and Information Integration in Institutions is to provide a
rigorous foundation for information integration that is not tied to any specific representational or logical formalism, by using
category theory to achieve independence from any particular choice of representation, and using institutions to achieve
independence from any particular choice of logic. The information flow and channel theories of Barwise and Seligman are
generalized to any logic by using institutions; following the lead of Kent, this is combined with the formal conceptual analysis
of Ganter and Wille, and the lattice of theories approach of Sowa. We also draw on the early categorical general systems
theory of Goguen as a further generalization of information flow, and draw on Peirce to support triadic satisfaction.
http://cseweb.ucsd.edu/~goguen/projs/data.html
Epistemic Properties of Knowledge Hierarchies:
http://dx.doi.org/10.1016/S0304-4068(98)00040-8
Ontology, Society, and Ontotheology (Joseph Goguen):
http://charlotte.ucsd.edu/~goguen/pps/fois04.pdf

Value-Driven Design, Semiotics, and Compassion (Joseph Goguen):


http://cseweb.ucsd.edu/~goguen/papers/kyoto05/slide0.html
HYJ: I consider the CTMU a metaphysical framework, and thus important to concurrent ontology development,
however, there are some differences I've had with it, which result in the "IS-Multivarifold".
The CTMU reduces reality to self-transducing information and ultimately to telesis, using the closed, reflexive
syntactic structure of the former as a template for reality theory.
http://megafoundation.org/CTMU/Articles/Langan_CTMU_092902.pdf
{HYJ: Implementing these methods and ideas are what I'd like to work towards in Collaborative Innovation
Networks (COINs) and Knowledge Ontology Integration Networks (KOINs) with the addition of metalogical rewrite
systems for mobile environments (Maude) for plectic and telestic socioecophysical environments.}
As a Scientist & Engineering Consultant Expert, Rexsy has been involved in the following AI projects:
Design and Development of Advanced Evolutionary Multi-Agent Expert Systems (AEXSYS) based on Adaptive
Recurrent Fuzzy Neural Networks, State Space Innovations Models, Multi-variable Nonlinear Autoregressive
Moving Average (ARMAX) Systems, Advanced Pattern Recognition Theory, Ontological Expert Systems, and
Generalized Hybrid AI System Architectures (Series/Series, Series/Parallel, etc.). These AEXSYS Systems
possess both Fuzzy Knowledge-Based and Computational Intelligence evolving through time towards optimal
endeavors. Design and Development of the Advanced Complex Adaptive MultiAgent Ontological Expert System
(ACXSYS), which consists of various groups of Intelligent Agents working and performing complex
tasking/functions in parallel to achieve the final goals/endeavors.
Rexsy received his B.S., M.S., and Ph.D. in Electrical Engineering from the University of Colorado, Boulder.
http://www.rexsy.com/ai_technology

Exploring the Link between Science and Innovation:


http://www.rieti.go.jp/en/rieti_report/031.html

The Ontology Integration Problem:


http://novaspivack.typepad.com/nova_spivacks_weblog/2006/08/the_ontology_in.html
Knowledge Interchange Format (KIF) is a computer-oriented language for the interchange of knowledge among disparate
programs. It has declarative semantics (i.e. the meaning of expressions in the representation can be understood without appeal
to an interpreter for manipulating those expressions); it is logically comprehensive (i.e. it provides for the expression of
arbitrary sentences in the first-order predicate calculus); it provides for the representation of knowledge about the
representation of knowledge; it provides for the representation of nonmonotonic reasoning rules; and it provides for the
definition of objects, functions, and relations.
http://www-ksl.stanford.edu/knowledge-sharing/kif/
The Multi-Agent Coordination research group is part of the Distributed Scalable Systems Division of
the Information Sciences Institute at the University of Southern California. We focus on solving coordination
problems involving human, robot and software agents. This includes distributed planning, scheduling and resource
allocation problems in uncertain and dynamic environments.
http://coordination.isi.edu/home/
Multi-Agent Coordination: A Measure-Theoretic Approach to Large-Scale Coordination:
http://nrsl.mne.psu.edu/thrust.html

Relevance paradox is a term for the occurrence where the attempt to
gather information relevant to a decision is ineffective because the attempt to eliminate
distracting or unnecessary information also excludes gathering information that is later seen to
be crucial. It was one of the key ideas in "The IRG Solution - hierarchical incompetence
and how to overcome it"
http://en.wikipedia.org/wiki/Relevance_paradox
An interlock diagram (for example see [1]) is a real or imagined diagram that plots the actual interactions, physical,
political, social, environmental between all entities within human societies. Each node is a specific activity such as
a power station, or a policy such as controlled rent. Ideally each node should be owned by a practitioner with
relevant experience and knowledge both tacit knowledge and explicit.
The diagram can be started by one or more such experts and can grow to plot all interactions, and stimulate
discussions about the various knock on effects and their amelioration.
By having an expert at each node, these can feed policy insight directly into government or other management
machinery, to keep the organisation on track and to achieve its goals.
Lateral communication occurs between these experts, largely mapping the interactions, which facilitates the
transfer of relevant information and tacit knowledge. This defeats the Relevance Paradox.
To a very large extent, Wikipedia is a gigantic self-assembling interlock diagram, and the participants are carrying
out interlock research.
http://en.wikipedia.org/wiki/Interlock_diagram
Interlock Research is a concept used to overcome the gaps in individual or group knowledge of which they are
unaware and which would result in incorrect action being taken, or important action not taken, leading to
unintended consequences. It is based on the notion that no individual or group can ever understand anything fully
on their own.[1] The concept is achieved by working and communicating efficiently within the same organization at a
lateral level directly with each other. The concept was developed in a series of research papers and the book "The
IRG Solution - hierarchical incompetence and how to overcome it"[2]
http://en.wikipedia.org/wiki/Interlock_research

Most people have heard the phrase


A straight line is the shortest distance between two points.
But in differential geometry, they say this same thing in a different language. They say instead
Geodesics for the Euclidean metric are straight lines.
http://www.theory.caltech.edu/people/patricia/grelb.html
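In standard notation (my own summary, not quoted from the linked notes), the statement reads: geodesics x^\mu(\lambda) satisfy

\frac{d^2 x^\mu}{d\lambda^2} + \Gamma^\mu_{\nu\rho}\,\frac{dx^\nu}{d\lambda}\,\frac{dx^\rho}{d\lambda} = 0, \qquad \Gamma^\mu_{\nu\rho} = 0 \ \text{(Cartesian coordinates, Euclidean metric)} \ \Longrightarrow\ x^\mu(\lambda) = a^\mu \lambda + b^\mu ,

so for the Euclidean metric in Cartesian coordinates the solutions are exactly the straight lines.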
Some Mathematical Challenges in Materials Science:
http://www.ams.org/bull/2003-40-01/S0273-0979-02-00967-9/home.html

Lord Kelvin (William Thomson) was a prodigious and prolific nineteenth century thinker, and one of his interests
was studying bubbles. (Seriously.) But to preface this post, take a moment and think about a single bubble. Why
does it form a sphere — and not a cube or blob shape? The answer deals with forces and energy and the like, but
the problem reduces to finding the figure with the smallest surface area for the amount of air contained in the
bubble.

And if two bubbles touch, we know they meet and form a dyad of bubbles. And when three bubbles meet,
they will always meet at 120 degrees. And more?

The more the bubbles, the more they are touched on different sides by other bubbles, forming flat faces. We end
up seeing that the bubbles form polyhedra!

What if you had a whole bunch of bubbles in a foam bath? What would be the ideal formation of them?

In 1887 Lord Kelvin asked the question: what is the shape that partitions space in such a way that the shape has
the minimum surface area?

One example of a shape that partitions space would be boxes — stacking boxes, one on top of each other, in all
directions. But boxes end up not having minimum surface area. Spheres aren't a possible answer, because you
can't fill space with spheres — there will always be gaps between the spheres!

Bubbles, by their very nature, partition space by taking the shape that minimizes surface area. Kelvin studied
bubbles and conjectured that the answer to his question was "tetrakaidecahedra." Well, in his words, "a plane-
faced isotropic tetrakaidecahedron." (These are truncated octahedra.)
http://samjshah.com/2008/08/17/lord-kelvin-bubbles-and-the-olympics/
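A rough numerical check in Python (my own toy calculation, not from the post): comparing surface area at fixed unit volume shows Kelvin's truncated octahedron beating the cube, with the sphere as the unreachable lower bound.

import math

def area_at_unit_volume(area, volume):
    # Scale-invariant figure of merit: surface area after rescaling the solid to unit volume.
    return area / volume ** (2.0 / 3.0)

a = 1.0  # edge length (or radius, for the sphere)
shapes = {
    "cube":                 (6 * a ** 2,                       a ** 3),
    "sphere":               (4 * math.pi * a ** 2,             (4.0 / 3.0) * math.pi * a ** 3),
    "truncated octahedron": ((6 + 12 * math.sqrt(3)) * a ** 2, 8 * math.sqrt(2) * a ** 3),
}
for name, (area, vol) in shapes.items():
    print(f"{name:22s} {area_at_unit_volume(area, vol):.3f}")
# Expected ordering: sphere (~4.84) < truncated octahedron (~5.31) < cube (6.00);
# Kelvin's candidate beats the cube, even though spheres themselves cannot tile space.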
In mathematics, a varifold is, loosely speaking, a measure-theoretic generalization of the concept of a differentiable
manifold, by replacing differentiability requirements with those provided by rectifiable sets, while maintaining the
general algebraic structure usually seen in differential geometry. More closely, varifolds generalize the ideas of
a rectifiable current. Varifolds are the topic of study in geometric measure theory.
http://en.wikipedia.org/wiki/Varifold

Geometric measure theory


An area of analysis concerned with solving geometric problems via measure-theoretic techniques. The canonical
motivating physical problem is probably that investigated experimentally by J. Plateau in the nineteenth
century [a4]: Given a boundary wire, how does one find the (minimal) soap film which spans it? Slightly more
mathematically: Given a boundary curve, find the surface of minimal area spanning it. (Cf. also Plateau problem.)
The many different approaches to solving this problem have found utility in most areas of modern mathematics and
geometric measure theory is no exception: techniques and ideas from geometric measure theory have been found
useful in the study of partial differential equations, the calculus of variations, harmonic analysis, and fractals.
http://eom.springer.de/G/g130040.htm
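In the simplest non-parametric setting (a standard textbook formulation, not quoted from the encyclopedia entry), a surface written as a graph z = u(x, y) over a domain \Omega with boundary data g minimizes the area functional, and minimizers satisfy the minimal surface equation (zero mean curvature):

A(u) = \int_\Omega \sqrt{1 + |\nabla u|^2}\, dx\, dy, \qquad \operatorname{div}\!\left( \frac{\nabla u}{\sqrt{1 + |\nabla u|^2}} \right) = 0 \ \text{in } \Omega, \quad u = g \ \text{on } \partial\Omega .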
Triply Periodic Minimal Surfaces:
http://www.susqu.edu/brakke/evolver/examples/periodic/periodic.html
Minimal surfaces have become an area of intense mathematical and scientific study over the past 15 years,
specifically in the areas of molecular engineering and materials science, due to their
anticipated nanotechnology applications.
http://en.wikipedia.org/wiki/Category:Minimal_surfaces
Mathematicians Maximize Knowledge of Minimal Surfaces:
http://www.physorg.com/news74872374.html
Solution of a Class of Minimal Surface Problem with Obstacle:
http://ccsenet.org/journal/index.php/jmr/article/viewFile/180/139
Nonlinear Optimization: Algorithms and Models:
http://www.princeton.edu/~rvdb/542/lectures/lec23.pdf
Optimization: Linear Programming, Operations Research, Path Integrals, etc.:
http://home.att.net/~numericana/answer/optimize.htm
Minimal Surfaces as Webs of Optimal Transportation Flows:
http://www.informaworld.com/smpp/979737740-73230552/content~db=all~content=a772432399
Ten Thousand Peacock Feathers in Foaming Acid:
http://portablepalace.com/10000.html
Bubble: Science Gallery:
http://sciencegallery.com/bubble
Minimal Surfaces and Multifunctionality:
Triply periodic minimal surfaces are objects of great interest to physical scientists, biologists and mathematicians. It
has recently been shown that triply periodic two-phase bicontinuous composites with interfaces that are the
Schwartz primitive (P) and diamond (D) minimal surfaces are not only geometrically extremal but extremal for
simultaneous transport of heat and electricity. More importantly, here we further establish the multifunctionality of
such two-phase systems by showing
that they are also extremal when a competition is set up between the effective bulk modulus and the electrical (or
thermal) conductivity of the composite. The implications of our findings for materials science and biology, which
provides the ultimate multifunctional materials, are discussed.
Keywords: minimal surfaces; multifunctionality; composites; elastic moduli; conductivity; optimization
http://cherrypit.princeton.edu/papers/paper-221.pdf
A Measure-Theoretic Foundation of Rule Interestingness Evaluation:
http://www2.cs.uregina.ca/~yyao/PAPERS/measurement_foundation.pdf
On Semantic Interoperability and the Flow of Information:
http://ftp1.de.freebsd.org/Publications/CEUR-WS/Vol-82/SI_paper_14.pdf
Ontology Mapping Based on Rough Formal Concept Analysis:
http://www2.computer.org/portal/web/csdl/doi/10.1109/AICT-ICIW.2006.142
Ontology Mapping: The State of the Art:
http://drops.dagstuhl.de/opus/volltexte/2005/40/pdf/04391.KalfoglouYannis.Paper.40.pdf
A Channel-Theoretic Foundation for Ontology Coordination:
http://users.ecs.soton.ac.uk/yk1/MCN04-schorlemmer-kalfoglou.pdf
Using Formal Concept Analysis and Information Flow for Modelling and Sharing Common Semantics: Lessons
Learnt and Emergent Issues:
http://users.ecs.soton.ac.uk/yk1/iccs2005-final-kalfoglou.pdf
Why Ontologies are not Enough for Knowledge Sharing:
http://citeseer.ist.psu.edu/old/371242.html
The Information Flow Framework (IFF) Foundation Ontology:
http://suo.ieee.org/Kent-IFF.pdf
Combining Rough Set Theory and Instance Selection in Ontology Mapping:
http://cat.inist.fr/?aModele=afficheN&cpsidt=20586027
Advanced Knowledge Technologies:
http://www.aktors.org/akt/
A Measure Theoretic Approach to Information Retrieval:
http://portal.acm.org/citation.cfm?id=1254873
{HYJ: Koinontely also has similarities with the community enterprise and business partnership
of Koinonia or Koinonos, except there are no spiritual connotations per se for Koinontely, simply teleonomic or goal-
directed processes in general, particularly social kinds, and in concurrent, parallel, distributed structures
(ontologies) which facilitate interoperability and virtual teamwork for swarm creativity.}

The word has such a multitude of meanings that no single English word is adequate to express its depth and
richness. It is a derivative of "koinos," the word for common. Koinonia is a complex, rich, and thoroughly fascinating
Greek approach to building community or teamwork.

Koinonia embraced a strong commitment to Kalos k'agathos meaning "good and good," – an inner goodness
toward virtue, and an outer goodness toward social relationships. In the context of outer goodness, translated into
English, the meaning of koinonia holds the idea of joint participation in something with someone, such as in a
community, or team or an alliance or joint venture. Those who have studied the word find there is always an
implication of action included in its meaning. The word is meaning-rich too, since it is used in a variety of related
contexts.

Sharing:
Koinonos means 'a sharer' as in to share with one another in a possession held in common. It implies the spirit of
generous sharing or the act of giving as contrasted with selfish getting. When koinonia is present, the spirit of
sharing and giving becomes tangible. In most contexts, generosity is not an abstract ideal, but a demonstrable
action resulting in a tangible and realistic expression of giving.

In classical Greek, koinonein means "to have a share in a thing," as when two or more people hold something, or
even all things, in common. It can mean "going shares" with others, thereby having "business dealings," such as
joint ownership of a ship. It can also imply "sharing an opinion" with someone, and therefore agreeing with him, or
disagreeing in a congenial way. Only participation as a contributive member allows one to share in what others
have. What is shared, received or given becomes the common ground through which Koinonia becomes real.

Relationships:
"Koinonos" in classical Greek means a companion, a partner or a joint-owner. Therefore, koinonia can imply an
association, common effort, or a partnership in common. The common ground by which the two parties are joined
together creates an aligned relationship, such as a "fellowship" or "partnership".
http://en.wikipedia.org/wiki/Koinonia

Information Flow in Social Groups:


http://www.hpl.hp.com/research/idl/papers/flow/flow.pdf
Mobile Maude is a Mobile Agent language extending the rewriting logic language Maude and
supporting mobile computation. Mobile Maude uses reflection to obtain a simple and general
declarative mobile language design and makes possible strong assurances of mobile agent behavior.
The two key notions are processes and mobile objects. Processes are located computational
environments where mobile objects can reside. Mobile objects have their own code, can move
between different processes in different locations, and can communicate asynchronously with each
other by means of messages. Mobile Maude's key novel characteristics include:
1. reflection as a way of endowing mobile objects with "higher-order" capabilities
2. object-orientation and asynchronous message passing
3. a high-performance implementation of the underlying Maude basis
4. a simple semantics without loss in the expressive power of application code
5. security mechanisms supporting authentication, secure message passing, and secure object
mobility.
http://www.csl.sri.com/projects/mobile-maude/
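A toy Python sketch (my own analogy, not Mobile Maude itself) of the two key notions above: processes as located environments, and mobile objects that move between them and exchange asynchronous messages.

class Process:
    def __init__(self, location):
        self.location, self.objects, self.mailbox = location, {}, []

class MobileObject:
    def __init__(self, name, process):
        self.name = name
        self.process = process
        process.objects[name] = self

    def move(self, target):
        # Migration: the object leaves its current process and takes up residence in the target one.
        del self.process.objects[self.name]
        target.objects[self.name] = self
        self.process = target

    def send(self, target_process, recipient, payload):
        # Asynchronous message passing: simply enqueue at the destination mailbox.
        target_process.mailbox.append((recipient, self.name, payload))

home, field = Process("home"), Process("field")
agent = MobileObject("agent1", home)
agent.move(field)
agent.send(home, "base", "report: arrived")
print(field.objects.keys(), home.mailbox)   # dict_keys(['agent1']) [('base', 'agent1', 'report: arrived')]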
Reflexive Ontologies: Enhancing Ontologies with Self-Contained Queries:
http://portal.acm.org/citation.cfm?id=1451212
A Decisional Trust Implementation on a Maintenance System by the Means of Decisional DNA and Reflexive
Ontologies:
http://portal.acm.org/citation.cfm?id=1487227
Semantic Knowledge Management: Integrating Ontology Management, Knowledge Discovery, and Human
Language Technologies (2009):
http://books.google.com/books?id=dOIfy7aZUaAC&source=gbs_navlinks_s

Leading the Web in Concurrent Engineering: Next Generation Concurrent Engineering:


http://books.google.com/books?id=BoqYILA2HuEC&dq=concurrent+ontology&source=gbs_navlinks_s

Mechanism Design and Communication Networks:


http://cowles.econ.yale.edu/conferences/sum-09/theory/tomala.pdf

HYJ: Intive may become a fun and useful method for swarm creativity:

"We‘ve used an algorithmic approach for solving multiple nesting clustering ambiguity. These algorithms are the
heart of the Intive system."
http://intive.com/blog/

iSOCO: Enabling the Networked Economy:


http://isoco.com/ENG/index.htm
Cycorp:
http://www.cyc.com/
Laboratory for Applied Ontology:
http://www.loa-cnr.it/

Book Review: Knowledge Representation: Logical, Philosophical, and Computational Foundations:


http://www.mitpressjournals.org/doi/pdf/10.1162/089120101750300544?cookieSet=1

Maude language and tool


http://www.item.ntnu.no/department/labs/topics/maude
An Asynchronous Communication Model for Distributed Concurrent Objects:
http://www.springerlink.com/content/e17t8253405943mk/
Telematics — 1. The convergence of telecommunications and information processing
http://en.wikipedia.org/wiki/Telematics
Concurrent Ontology and the Extensional Conception of Attribute (Vaughan Pratt):

By analogy with the extension of a type as the set of individuals of that type, we define the extension of an attribute
as the set of states of an idealized observer of that attribute, observing concurrently with observers of other
attributes. The attribute-theoretic counterpart of an operation mapping individuals of one type to individuals of
another is a dependency mapping states of one attribute to states of another. We integrate attributes with types via
a symmetric but not self-dual framework of dipolar algebras or disheaves amounting to a type-theoretic notion of
Chu space over a family of sets of qualia doubly indexed by type and attribute, for example the set of possible
colors of a ball or heights of buildings. We extend the sheaf-theoretic basis for type theory to a notion of disheaf on
a profunctor. Applications for this framework include the Web Ontology Language OWL, UML, relational databases,
medical information systems, geographic databases, encyclopedias, and other data-intensive areas standing to
benefit from a precise ontological framework coherently accommodating types and attributes.
Keywords: Attribute, Chu space, ontology, presheaf, type.
http://conconto.stanford.edu/conconto.pdf
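A minimal Python sketch (toy data and names are mine, not Pratt's) of a Chu space as a points-by-states matrix over K = {0,1}, together with the adjointness condition a Chu transform must satisfy:

def is_chu_transform(r, s, f, g, points_A, states_Y):
    # (f, g) is a Chu transform from (A, r, X) to (B, s, Y) when
    # s(f(a), y) == r(a, g(y)) for every point a of A and every state y of Y.
    return all(s(f(a), y) == r(a, g(y)) for a in points_A for y in states_Y)

# Points are individuals, states are idealized observers of an attribute (toy example).
A, Y = ["ball1", "ball2"], ["crimson?", "spherical?"]
r_matrix = {("ball1", "red?"): 1, ("ball1", "round?"): 1,
            ("ball2", "red?"): 0, ("ball2", "round?"): 1}
s_matrix = {("b1", "crimson?"): 1, ("b1", "spherical?"): 1,
            ("b2", "crimson?"): 0, ("b2", "spherical?"): 1}
r = lambda a, x: r_matrix[(a, x)]                      # matrix of the first Chu space
s = lambda b, y: s_matrix[(b, y)]                      # matrix of the second Chu space
f = {"ball1": "b1", "ball2": "b2"}.get                 # forward map on points
g = {"crimson?": "red?", "spherical?": "round?"}.get   # backward map on states

print(is_chu_transform(r, s, f, g, A, Y))              # True: the adjointness square commutes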
The paper "The Yoneda Lemma as a foundational tool" can be downloaded as

http://boole.stanford.edu/pub/yon.pdf

Section 1 attempts to bridge the gap between algebra and category theory
by treating the Yoneda Lemma from the viewpoint of universal algebra.
I'm not sure how interesting this will be, however, for those who favor logic
over algebra as the optimal organization of the foundations of mathematics.

As a natural extension of the Yoneda Lemma, Section 2 gives two
characterizations of density that I call respectively semantic and
syntactic. I propose the latter as having some bearing on the
foundations of mathematics. Again I would expect logicians to be less
likely to find this plausible than algebraists.

Section 3 extends algebras to communes as a kind of algebra consisting
of both elements (as usual) and dual elements (e.g. the open sets of a
topological space, the characters of a group, the functionals of a
vector space, etc.) It also gives some applications of communes to
combinatorics and ontology (shades of categorial grammar!), and
speculates on the origin of the distinction between types and properties.

The "foundational tool" part has to do with my perception of density as
somehow more basic than algebras and homomorphisms. On the theory that
there is little new under the sun that is basic, I would be delighted to
learn that there are even better ways to conceptualize that idea.

Vaughan Pratt"
Logic, Epistemology, and the Unity of Science 14:
From a Geometrical Point of View: A Study of the History and Philosophy of Category Theory
http://books.google.com/books?id=bvy0aANuhPYC&source=gbs_navlinks_s

Multilevel System as Multigraph:


http://www.springerlink.com/content/40ntlu7jng8gj9fa/
There are several cross-cutting dimensions that distinguish among graphs.
•Directed vs Undirected: ◦Directed graphs (also called digraphs) consist of ordered pairs. The links in a directed
graph are called arcs. Can use these to represent non-symmetric relations like "is the boss of" or "is attracted to"
◦Undirected graphs (also known simply as "graphs") consist of unordered pairs. They are used for the relations
which are necessarily symmetric, such as "is the sibling of" or "lives with"
•Valued vs Non-Valued ◦In non-valued graphs, nodes are either connected or not. Either Sally and Bill are siblings,
or they're not. ◦In valued graphs, the lines have values attached to represent characteristics of the relationships,
such as strength, duration, capacity, flow, etc.
•Reflexive vs Non-Reflexive ◦Reflexive graphs allow self-loops. That is, a node can have a tie to itself. This is
mostly useful when the nodes are collectivities. For example, if the nodes are cities and the ties represent
phonecalls between people living in those cities, it is possible (a virtual certainty) that there will be ties from a city to
itself.
•Multi-graphs ◦If more than one edge connects two vertices, this is a multigraph. In general, we do not use
multigraphs, preferring to use either valued graphs (to represent the number of interactions between A and B) or
wholly separate graphs (to represent substantively different relations, such as "does business with" and "is married
to"
http://wiki.apache.org/hama/GraphAndMatrices
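A small Python sketch (names and relations invented for illustration) of the four distinctions above, using plain dictionaries and sets rather than a graph library:

directed   = {("Ann", "Bill")}                      # arcs: ordered pairs ("is the boss of")
undirected = {frozenset({"Sally", "Bill"})}         # edges: unordered pairs ("is the sibling of")
valued     = {("Ann", "Bill"): 0.8}                 # edge -> strength / duration / capacity / flow
reflexive  = {("NYC", "NYC"), ("NYC", "LA")}        # self-loops allowed (phone calls within a city)
multigraph = {("A", "B"): ["does business with",    # parallel edges kept as a list of relations
                           "is married to"]}

print(("Bill", "Ann") in directed)                  # False: direction matters for arcs
print(frozenset({"Bill", "Sally"}) in undirected)   # True: order is irrelevant for edges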
Properties:
http://plato.stanford.edu/entries/properties/

Practical experiences in concurrent, collaborative ontology building using Collaborative Protégé:


http://precedings.nature.com/documents/3517/version/1/files/npre20093517-1.pdf

Hands-on Experiences in Concurrent Ontology Building with Collaborative Protégé:


http://protege.stanford.edu/conference/2009/abstracts/S2P2Schober.pdf

An Ontology Versioning Tool:


http://starlab.vub.ac.be/website/node/144

Collaboration: Success for the Future (Los Alamos National Laboratory Research Library):

As we focus on the 21st century, new information technologies and accelerated knowledge growth are creating
new ways to collaborate in achieving research and development goals. The role of the Library is changing from
being an archive and information provider to being a partner in research and development.
http://www.osti.gov/inforum99/papers/stack.pdf

Towards a Sheaf Semantics for Cooperating Agents Scenarios:

The ultimate goal of our work is to show how sheaf theory can be used for studying cooperating robotics scenarios.
In this paper we propose a formal definition for systems and define a category of systems. The main idea of the
paper is that relationships between systems can be expressed by a suitable Grothendieck topology on the category
of systems. We show that states and (parallel) actions can be expressed by sheaves and use this in order to study
the behavior of systems in time.

"Sheaf theory was developed in mathematics because of the necessity of studying the relationship between "local"
and "global" phenomena...The alternance "local-global" that occurs in this case suggests that a sheaf-theoretical
approach to the study of systems of cooperating agents (and in the study of concurrency in general) would be
natural...In a series of papers, J. Pfalzgraf develops the idea of "logical fiberings", with the goal of developing a
(non-classical) "fibered logical calculus", by means of which one could construct logical controllers for multi-tasking
scenarios in a formal way."
http://www.springerlink.com/content/f84q672400077970/
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.9. 1075

Sheaf Semantics for Concurrent Interacting Objects:

This paper uses concepts from sheaf theory to explicate phenomena in concurrent systems, including object,
inheritance, deadlock, and non-interference, as used in computer security. The approach is very general, and
applies not only to concurrent object oriented systems, but also to systems of differential equations, electrical
circuits, hardware description languages, and much more. Time can be discrete or continuous, linear or branching,
and distribution is allowed over space as well as time. Concepts from category theory help to achieve this
generality: objects are modeled by sheaves; inheritance by sheaf morphisms; systems by diagrams; and
interconnections by diagrams of diagrams. In addition, behaviour is given by limit, and the result of interconnection
by colimit. The approach is illustrated with many examples, including a semantics for a simple concurrent object-
based programming language.
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.52.4296
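A toy Python sketch (my own illustration, far simpler than the papers' categorical machinery) of the local-to-global idea behind both items above: component behaviours, viewed as sections, glue to a system behaviour exactly when they agree on shared observations.

def restrict(state, variables):
    # Restriction map of the presheaf: forget all variables outside the sub-system.
    return {v: state[v] for v in variables if v in state}

def glue(state_1, state_2):
    # Sections over two components glue iff they agree on the overlap.
    overlap = state_1.keys() & state_2.keys()
    if restrict(state_1, overlap) != restrict(state_2, overlap):
        return None                       # incompatible local behaviours: no global section
    return {**state_1, **state_2}         # the global behaviour covering both components

robot_arm = {"position": 3, "clock": 7}
conveyor  = {"speed": 1, "clock": 7}
print(glue(robot_arm, conveyor))          # {'position': 3, 'clock': 7, 'speed': 1}
print(glue(robot_arm, {"clock": 8}))      # None: the agents disagree about the shared clock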

A Study in Knowledge Ontology Building in the Area of Knowledge Engineering:


http://www2.computer.org/portal/web/csdl/doi/10.1109/SKG.2007.46

Knowledge Ontology: A Method for Empirical Identification of 'As-Is' Contextual Knowledge:

In this paper, we consider existing approaches to ontology definition and validation. Popular techniques include the
use of domain experts or reliance on formal logic. We consider these contemporary techniques, their motivation
and limitations, and then suggest an empirical approach that statistically identifies knowledge ontology within
contextual databases using factor analytic techniques. We find that this method improves upon the process of
identifying existing, codified knowledge ontology, and that it can be integrated into other methods to improve upon
the efficiency of knowledge ontology identification, validation, and evolution. It can facilitate collaboration and inter-
organizational progress by providing a common foundation, empirically supported.
http://www2.computer.org/portal/web/csdl/doi/10.1109/HICSS.2005.377

Sheaves, Objects, and Distributed Systems:


http://portal.acm.org/citation.cfm?id=1486449

Adoption of the Semantic Web for Overcoming Technical Limitations of Knowledge Management Systems:
http://portal.acm.org/citation.cfm?id=1498258

Semantic Innovation Management:


http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.104.6470

Keywords: Innovation ontology, Innovation management, Semantic Web, Goals collaboration


http://bleedingedge.jeromedl.org/preview;jsessionid=783600EB2DE311FCB5E2019D14102554?author=mailto:David.OSullivan@deri.org

Collaborative Ontology Building with Wiki@nt - A Multi-Agent Based Ontology Building Environment:
http://www.cs.iastate.edu/~honavar/Papers/baoeon04.pdf

Mechanism for Collaborative Innovation in Living Labs (Network, Idea, Games, Team):
http://www.esoce.net/YC2007/wks2/01%20-%20Thoben%20Collaborative%20Innovation%20Vision%20Dec%203%20-%202007%20%20final%20[Sola%20lettura].pdf

Enabling Open Innovation in a World of Ubiquitous Computing - Proposing a Research Agenda (2009):
http://de.scientificcommons.org/50776335

The Bumble Bee: Ken Thompson's Shared Know-How on Team Dynamics, Virtual Collaboration and Bioteaming.
http://www.bioteams.com/
Swarm Intelligence to Swarm Robotics:
http://bit.ly/3m8aO
Swarm Intelligence: Literature Overview:
http://bit.ly/ovgL6
Swarm Intelligence: a New C2 Paradigm with an Application to Control of Swarms of UAVs:
http://www.math.ucla.edu/~shargel/UAV-swarms.pdf
Gerardo Beni:
http://bit.ly/2FUEX6
How to Change the World: A Practical Blog for Impractical People:
http://blog.guykawasaki.com/2006/06/the_art_of_the_.html

Pragmatic Semantics by Conceptual Graphs:


http://www.springerlink.com/content/y8214833m5365778/

"With the advent of the Internet and other modern information and communication technologies, a magnificent
opportunity has opened up for introducing new, innovative models of commerce, markets, and business. Creating
these innovations calls for significant interdisciplinary interaction among researchers in computer science,
communication networks, operations research, economics, mathematics, sociology, and management science. In
the emerging era of new problems and challenges, one particular tool that has found widespread applications is
mechanism design.
The focus of this book is to explore game theoretic modeling and mechanism design for problem solving in Internet
and network economics. It provides a sound foundation of relevant concepts and theory, to help apply mechanism
design to problem solving in a rigorous way.

Written for:
Academic and industry researchers and graduate students in computer science, electrical engineering,
communications, economics, industrial engineering, systems engineering, operations research, management
science and business schools

Keywords:
Ad Hoc Wireless Networks
Auction
Bayesian Games
E-Procurement
Electronic Commerce
Electronic Markets
Game Theory
Grid Computing
Incentive Compatibility VCG Mechanisms
Incentives
Internet Analytics
Mechanism Design
Microeconomics
Nash Equilibrium
Network Economics
Rational and Intelligent Agents
Sponsored Search Auctions"
http://www.springer.com/math/applications/book/978-1-84800-937-0
[Collaborative Innovation + Knowledge Ontology Integration] = Koinontely
http://www.facebook.com/group.php?gid=141737330247
Davos Annual Meeting 2008 - Power of Collaborative Innovation:
http://www.youtube.com/watch?v=fXgZutituAY

InnovateOnPurpose: Collaborative Innovation Conference:


http://www.youtube.com/watch?v=TVX_yQcMqiU

Imaginatik is the leading provider of collaborative innovation and enterprise crowdsourcing solutions:
http://www.imaginatik.com/

bluenove - opening innovation:


http://www.bluenove.com/

Nokia Research Centre is actively engaging in Open Innovation through selective and deep research collaborations
with world-leading institutions. By sharing resources, leveraging ideas, and tapping each other's expertise we are
able to create vibrant innovation ecosystems, multiply our efforts, enhance innovation speed and efficiency, and
derive more value for our organizations and ultimately for our end-customers.
http://research.nokia.com/openinnovation

Open Source Collaborative Networking for Intranets and Extranets:


http://www.mindtouch.com/

"iLink is a machine learning-based system that models users and content in a social network and then points the
user to relevant content, discussions, and other network members with shared interests and goals across a broad
range of scenarios."
http://news.cnet.com/8301-13639_3-10004980-42.html?tag=untagged

SRI International Social Network Analytics Technology Supercharges Popular Online Military Communities:

"iLink learns to deliver the right answers to the right people at the right time," said Dr. David Gutelius, product
manager, Artificial Intelligence Center, SRI International. "It identifies needs, questions, and issues as they emerge
in online communities and matches them with highly relevant resources and people. The goal is a more adaptive,
effective problem-solving military."

The iLink technology was developed as part of the SRI-led CALO (Cognitive Agent that Learns and Organizes)
program and was funded and managed under DARPA's PAL (Personalized Assistant that Learns) program. CALO
is a five-year contract to develop an enduring personalized cognitive assistant. iLink is the first example of a military
deployed CALO technology.

"Using machine learning and community analytics, we hope to continue helping soldiers and their families sort
through miles of content and deliver what's most relevant to the topic at hand, connecting them quickly to people
who can help problem solve," said Dr. Jeffrey E. Davitz, Director of the Social Computing Group, Artificial
Intelligence Center, SRI International. "In partnership with the military, we are focused on creating online
communities where soldiers can troubleshoot, converse and brainstorm with soldiers in their social network,
leading to new ideas that can quickly be implemented in the field."
http://www.sri.com/news/releases/072908.html

{HYJ: That is quite similar to what I hope "Koinontely" to achieve, the ability to network with other members with
shared interests (koinotropy), but also shared goals (koinotely), particularly for collaborative innovation and
telework settings, I'm sure it's already been done, however in particular the machine learning/pattern recognition
will be involved with formal concept analysis (hence Chu Spaces) for "knowledge ontologies", with the addition of
some game-theoretic (mechanism design/implementation theory) ideas for designing incentives and outcome
objectives. However, as Putin said at the last Davos Conference, after semantics and pragmatics, it comes down to
trust.}

Sobolev Institute of Mathematics Russian Foundation for Basic Research


Association for Pattern Recognition and Image Analysis of the Russian Federation
IInd All–Russian Conference "Knowledge - Ontology - Theory" (KONT-09)
with international participation 20–22 October, 2009, Novosibirsk
http://www.math.nsc.ru/conference/zont09/engl.htm
This first conference on Collaborative Innovation Networks (COINs) brings together practitioners, researchers and
students of the emerging science of collaboration. Collaborative Innovation Networks, or COINs, are cyberteams of
self-motivated people with a collective vision, enabled by technology to collaborate in innovating by sharing ideas,
information, and work. Although COINs have been around for hundreds of years, they are especially relevant today
because the concept has reached its tipping point thanks to the Internet. COINs are powered by swarm creativity,
wherein people work together in a structure that enables a fluid creation and exchange of ideas. 'Coolhunting' –
discovering, analyzing, and measuring trends and trendsetters – puts COINs to productive use. Patterns of
collaborative innovation frequently follow an identical path, from creator to COIN to collaborative learning network
to collaborative interest network.

The theme of the conference combines a wide range of interdisciplinary fields such as social network analysis,
group dynamics, design and visualization, information systems and the psychology and sociality of collaboration.

We invite researchers to submit their latest scientific results on


• Dynamic Social Network Analysis
• Semantic Social Network Analysis
• Social System Design and Architectures
• Social Behavior Modeling
• Social Intelligence and Social Cognition
• Emotional Intelligence, Cultural Dynamics, Opinion Representation, Influence Process
• Trust, Privacy, Risk, Transparency and Security in social contexts
• Virtual Communication and Collaboration
• Measuring the performance of COINs
• Patterns of swarm creativity
• Collaborative Leadership
• Design and visualization in interdisciplinary collaboration
• Group dynamics and global teaming in virtual collaboration
• Organizational optimization in COINs
• The psychology and sociality of collaboration
http://www.coins2009.com/
http://wiki.soberit.hut.fi/virtualbrownbag/tiki-index.php?page=CoinConference

Collaborative Ontology Development for Information Integration:


Information integration is enabled by having a precisely defined common terminology. We call this combination of
terminology and definitions an ontology. We have developed a set of tools and services to support the process of
achieving consensus on such common shared ontologies by geographically distributed groups. These tools make
use of the world-wide web to enable wide access and provide users with the ability to publish, browse, create, and
edit ontologies stored on an ontology server. Users can quickly assemble a new ontology from a library of modules.
We discuss how our system was
constructed, how it exploits existing protocols and browsing tools, and our experience supporting hundreds of
users. We describe applications using our tools to achieve consensus on ontologies and to integrate
information. The ontology server may be accessed through the URL
http://www-ksl-svc.stanford.edu:5915/
Keywords: Ontology, world-wide web interfaces, interoperation, knowledge reuse
http://www.isrl.illinois.edu/~amag/langev/localcopy/pdf/farquhar95collaborativeOntology.pdf

Large Resources: Ontologies (SENSUS) and Lexicons:


Objective:
In order to perform the kind of reasoning/inference required for deeper (semantic) understanding of texts, as
required for high-quality Machine Translation, Summarization, and Information Retrieval, it is imperative to provide
systems with a wide-ranging semantic thesaurus. We call such a 'concept thesaurus' an Ontology.

No adequately large, refined, and consistent ontology exists today. It is our goal to try to build one incrementally,
with constant testing and revision according to the needs of the MT and summarization systems, out of existing
lexical, text, and other ontology resources.
http://www.isi.edu/natural-language/projects/ONTOLOGIES.html
OntoEdit: Collaborative Ontology Development for the Semantic Web:

Ontologies now play an important role for enabling the semantic web. They provide a source of precisely defined
terms e.g. for knowledge-intensive applications. The terms are used for concise communication across people and
applications. Typically the development of ontologies involves collaborative efforts of multiple persons. OntoEdit is
an ontology editor that integrates numerous aspects of ontology engineering. This paper focuses on collaborative
development of ontologies with OntoEdit which is guided by a comprehensive methodology.
http://www.springerlink.com/content/drux1nwev5ghr695/
The ontolingua server: a tool for collaborative ontology construction (1996):
http://eprints.kfupm.edu.sa/70714/
Cyberinfrastructure for e-Science:
http://www.sciencemag.org/cgi/content/abstract/308/5723/817

REDMOND, Wash., and SAN JOSE, Calif. — March 11, 2009 — The nuggets of information
necessary for science to progress are often hard to find, submerged deep within the Web, or within
databases that can’t be easily accessed or integrated. As a result, many scientists today work in
relative isolation, follow blind alleys and unnecessarily duplicate existing research.
Addressing this critical challenge for researchers, Microsoft Corp. and Creative Commons announced
today, before an industry panel at the O’Reilly Emerging Technology Conference (ETech
2009,http://en.oreilly.com/et2009), the release of the Ontology Add-in for Microsoft Office Word
2007 that will enable authors to easily add scientific hyperlinks as semantic annotations, drawn from
ontologies, to their documents and research papers. Ontologies are shared vocabularies created and
maintained by different academic domains to model their fields of study.
http://www.microsoft.com/presspass/press/2009/mar09/03-11mscreativecommonspr.mspx
Semantic Science Ontology Design Principles:
http://code.google.com/p/semanticscience/wiki/ODP
NISO is where content publishers, libraries, and software developers turn for information industry standards that
allow them to work together. Through NISO, all of these communities are able to collaborate on mutually accepted
standards — solutions that enhance their operations today and form a foundation for the future.
http://www.niso.org/home
The Future of Knowledge Organization Systems on the Web:
http://www.clir.org/pubs/reports/pub91/5future.html

Vigils in a Wilderness of Knowledge: Metadata in Learning Environments:


http://dbis.rwth-aachen.de/i5new/staff/klamma/download/icalt117.pdf
Peer-to-Peer Metadata Management for Knowledge Discovery in Applications in Grids:
http://www.coregrid.net/mambo/images/stories/TechnicalReports/tr-0083.pdf
Invariant Standard Positions of Ordered Sets of Points:
http://www.springerlink.com/content/y11m25l672363481/

SWAN (Semantic Web Applications in Neuromedicine) is a project to develop knowledge bases for the
neurodegenerative disease research communities, using the energy and self-organization of that community
enabled by Semantic Web technology. Created in collaboration with the Alzforum and other partners. Read more
about the SWAN project here.
http://swan.mindinformatics.org/ontology.html

From Top-Level to Domain Ontologies: Ecosystem Classifications as a Case Study:


http://www.springerlink.com/content/u6213j677q262210/

An Ecosystem for Semantics:

The article presents a theoretical approach to semantics that embraces the complex and challenging problems
associated with authoring multimedia albums.
(Ansgar Scherp, University of Koblenz-Landau,
Ramesh Jain, University of California, Irvine)
http://www2.computer.org/portal/web/csdl/doi/10.1109/MMUL.2009.20
Structural Informatics Group:
http://sig.biostr.washington.edu/
Autoepistemic Logic:
http://en.wikipedia.org/wiki/Autoepistemic_logic
Multi-Layered Extended Semantic Networks:
http://en.wikipedia.org/wiki/MultiNet
Scientific Modeling:
http://en.wikipedia.org/wiki/Scientific_modeling
Galois Connection:
http://en.wikipedia.org/wiki/Galois_connection
Closure Operators:
http://en.wikipedia.org/wiki/Category:Closure_operators
Game Theory:
http://en.wikipedia.org/wiki/Game_theory
The Core of Games on Distributive Lattices: How to Share Benefits in a Hierarchy:
http://hal.archives-ouvertes.fr/docs/00/34/48/02/PDF/B08077.pdf
Formal Concept Analysis via Multi-Adjoint Concept Lattices:
http://portal.acm.org/citation.cfm?id=1456948

Functions of a Thesaurus / Classification / Ontological Knowledge Base:


http://ontolog.cim3.net/file/work/OntologizingOntolog/TaxoThesaurus/SoergelKOSOntologyFunctions2--DagobertSoergel_20060616.pdf

Compositional Semantics:
http://www.ling.helsinki.fi/kit/2004k/ctl254/L3/Semantics/

On Compositional Semantics:
http://portal.acm.org/citation.cfm?doid=992066.992109

Statistical NLP:
http://www.cs.berkeley.edu/~klein/cs294-7/SP07%20cs294%20lecture%2019%20--%20compositional%20semantics%20(6pp).pdf

Minimal Recursion Semantics: Overview and Applications:


http://www.topicalizer.com/files/Minimal_Recursion_Semantics_-_Overview_and_Applications.pdf

An Introduction to Formal Computational Semantics:


http://www.stanford.edu/class/cs224n/handouts/cl-semantics-new.pdf

Cybernetics and Systems Analysis: Automata Manifolds:


http://www.springerlink.com/content/n4vk25wm0tvx5266/
Abstract Neural Automata on Compact Riemannian Manifold:
http://www.emeraldinsight.com/Insight/viewContentItem.do?contentType=Article&contentId=1454536
Continuous Spatial Automaton:
http://en.wikipedia.org/wiki/Continuous_spatial_automaton
Constrained Sums of Information Systems:
We study properties of infomorphisms between information systems. In particular, we interpret infomorphisms
between information systems in terms of sums with constraints (constrained sums, for short) that are some
operations on information systems. Applications of approximation spaces, used in rough set theory, to study
properties of infomorphisms are included.
http://www.springerlink.com/content/glpgtjkg4c8jy7wb/
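A minimal Python sketch (toy classifications of my own, not from the paper) of the Barwise-Seligman infomorphism condition between two token/type classifications:

def is_infomorphism(sat_A, sat_B, f_up, f_down, types_A, tokens_B):
    # f = (f_up, f_down): A -> B is an infomorphism when, for all tokens b of B and
    # all types alpha of A:  f_down(b) |=_A alpha   iff   b |=_B f_up(alpha).
    return all(sat_A(f_down(b), alpha) == sat_B(b, f_up(alpha))
               for b in tokens_B for alpha in types_A)

# Toy data: an abstract classification A and a concrete sensor classification B.
types_A, tokens_B = ["warm"], ["sensor1"]
sat_A = lambda tok, typ: (tok, typ) == ("room1", "warm")
sat_B = lambda tok, typ: (tok, typ) == ("sensor1", "temp>20C")
f_up   = {"warm": "temp>20C"}.get     # type-level map A -> B
f_down = {"sensor1": "room1"}.get     # token-level map B -> A

print(is_infomorphism(sat_A, sat_B, f_up, f_down, types_A, tokens_B))  # True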
Advances in Human-Computer Interaction: A Mathematical Framework for Interpreting Playing Environments as
Media for Information Flow:
This paper proposes a novel strategy of designing play equipments. The strategy introduces two loose constraints
as a guideline for designers. The first constraint is “describing unit of play action chain” ⟨O,S,V⟩ based on Barthes'
semiology, and the second is the infomorphism between designer, play equipment, and players based on channel
theory. We provide detailed explanation of the strategy through an example of a designing process of playing
environment where the players' usage of the play equipment cannot be foreseen.
http://www.hindawi.com/journals/ahci/2008/258516.html
Recursive Ontology:
One particularly unfortunate feature of our conception of theories and systems is that we see them as being like
physical edifices. We “build” and “ground” our theories. We want them to have firm “foundations”. We view a strong
theory as one that is impervious to attack, and a weak theory as one that “topples” or “collapses” due to its flaws.

The reason why I say this is unfortunate is that we generally have a hell of a time providing foundations for our
theories. There is no one fact that is solid and irrefutable, and even if there were, there's no guarantee that we
could build anything on it.

So we need a better metaphor. I haven't got one, I'm sad to say, but I'm hoping that one will emerge if I think about
it long enough and talk to enough people about it. A good place to start, I think, is to theorize a little bit about
ontology.
http://sevenless.org/blog/?p=190
Language Log: Ontological Promiscuity vs. Recursion:
A key insight of early analytic philosophy, as I understand things, was that the routine ontological commitments of
natural-language semantics are sometimes disastrously misleading. On this view, we must sometimes either find
new ways of talking, or else agree to interpret old ways of talking in artificial and perhaps unnatural ways. It's not a
surprise to find ourselves in this situation with the semantics of mathematical language like "rapidly converging
function".
http://itre.cis.upenn.edu/~myl/languagelog/archives/005380.html
Learning Ontology Alignments Using Recursive Neural Networks:
The Semantic Web is based on technologies that make the content of the Web machine-understandable. In that
framework, ontological knowledge representation has become an important tool for the analysis and understanding
of multimedia information. Because of the distributed nature of the Semantic Web however, ontologies describing
similar fields of knowledge are being developed and the data coming from similar but non-identical ontologies can
be combined only if a semantic mapping between them is first established. This has led to the development of
several ontology alignment tools. We propose an automatic ontology alignment method based on the recursive
neural network model that uses ontology instances to learn similarities between ontology concepts. Recursive
neural networks are an extension of common neural networks, designed to process efficiently structured data.
Since ontologies are a structured data representation, the model is inherently suitable for use with ontologies.
http://www.springerlink.com/content/42bhqrl31v3r48rh/
Dimensionally Modeling a Recursive Hierarchy:
I was recently asked how to include a recursive hierarchy into a dimensional model.
What do I mean by a recursive hierarchy? This is when an entity (a table) relates to itself.
http://it.toolbox.com/blogs/rossgoodman-on-bi/dimensionally-modeling-a-recursive-hierarchy-22885
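A minimal Python sketch (the table and measure are invented for illustration) of such a recursive parent-child hierarchy and a rollup computed over it:

org = {                      # employee -> manager (None marks the root)
    "CEO": None, "VP1": "CEO", "VP2": "CEO", "Eng1": "VP1", "Eng2": "VP1",
}
headcount = {"CEO": 1, "VP1": 1, "VP2": 1, "Eng1": 1, "Eng2": 1}

def rollup(node):
    # Sum a measure over a node and all of its (transitive) reports.
    return headcount[node] + sum(rollup(child)
                                 for child, parent in org.items() if parent == node)

print(rollup("VP1"))   # 3
print(rollup("CEO"))   # 5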
Recursive Hierarchies: The Relational Taboo!
As I was going to St. Ives,
I met a man with seven wives.
Every wife had seven sacks,
Every sack had seven cats,
Every cat had seven kits,
Kits, cats, sacks and wives,
How many were going to St. Ives?
~Mother Goose
http://kamfonas.com/id3.html
Recursive Hierarchies or Non-Recursive, Pros and Cons:
http://thomasianalytics.spaces.live.com/blog/cns!B6B6A40B93AE1393!575.entry
Fast Rollup on Recursive Hierarchy in OLAP:
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4061514

The Future of Philosophy (John R. Searle):

Because this article is intended for a predominantly scientific audience, I will begin by explaining some of the
similarities and differences between science and philosophy. There is no sharp dividing line between the two. Both,
in principle, are universal in subject matter, and both aim at the truth. However, though there is no sharp dividing
line, there are important differences in method, style and presuppositions. Philosophical problems tend to have
three related features that scientific problems do not have. First, philosophy is in large part concerned with
questions that we have not yet found a satisfactory and systematic way to answer. Second, philosophical questions
tend to be what I will call "framework" questions; that is, they tend to deal with large frameworks of phenomena,
rather than with specific individual questions. And third, philosophical questions are typically about conceptual
issues; they are often questions about our concepts and the relation between our concepts and the world they
represent.
http://socrates.berkeley.edu/~jsearle/future_of_philosophy.rtf

Ontological Manifolds, Sheaf Semantics, and Concurrently Interacting Teleological (or Teleonomic) Structures
Tort Theory and Underlying Isomorphic Teleological Structures at Philosophical, Economic, Practical, and Doctrinal
Levels:
http://books.google.com/books?id=2OyI7KuMBvcC&pg=PA119&dq=isomorphic+teleology#v=onepage&q=isomorphic%20teleological&f=false

Model Theory and Topoi:


http://www.springerlink.com/content/v51v12692507/?p=c7eef578ad7848ed877c2defd08a71c6&pi=0

A Note on Kripke-Joyal Semantics for the Internal Language of Topoi:


http://www.springerlink.com/content/lh345367795t4008/

A Geometry of Approximation: Rough Set Theory: Logic, Algebra and Topology:

"'A Geometry of Approximation' addresses Rough Set Theory, a field of interdisciplinary research first proposed by
Zdzislaw Pawlak in 1982, and focuses mainly on its logic-algebraic interpretation. The theory is embedded in a
broader perspective that includes logical and mathematical methodologies pertaining to the theory, as well as
related epistemological issues. Any mathematical technique that is introduced in the book is preceded by logical
and epistemological explanations. Intuitive justifications are also provided, insofar as possible, so that the general
perspective is not lost."
http://books.google.com/books?id=FCKnqGEP4WEC&source=gbs_navlinks_s

General Theory of Groups:


http://www.springerlink.com/content/r433001532w68763/

Polyvarieties:

In connection with the notion of a polyidentity, introduced by O. N. Golovin, we introduce the notion of the
polyvariety of groups and we generalize Birkhoff's theorem on varieties of groups for the case of polyvarieties. We
also present several examples of polyvarieties of groups.
http://www.springerlink.com/content/x36tmk28278684g7/

I hope to attend this conference someday, and maybe even present; hopefully my ideas regarding "Polymorphic
Isotelesis", "Isomorphic Polytelesis", "Holomorphic Autotelesis", "Automorphic Holotelesis" and "Distributive
Functioning", and their implications for concurrency and infocognitive parallelism, will mature enough to express
them in a formal logic, with applications in game-theoretic, knowledge-ontology, categorical-graph, and quantum-
automata semantic-networks.
http://en.wikipedia.org/wiki/Game_semantics
http://en.wikipedia.org/wiki/Semantic_network

Formal concept analysis is a principled way of automatically deriving an ontology from a collection of objects and
their properties. The term was introduced by Rudolf Wille in 1984, and builds on applied lattice and order
theory that was developed by Birkhoff and others in the 1930s.

Efficient Construction:
Kuznetsov & Obiedkov (2001) survey the many algorithms that have been developed for constructing concept
lattices. These algorithms vary in many details, but are in general based on the idea that each edge of the Hasse
diagram of the concept lattice connects some concept C to the concept formed by the join of C with a single object.
Thus, one can build up the concept lattice one concept at a time, by finding the neighbors in the Hasse diagram of
known concepts, starting from the concept with an empty set of objects. The amount of time spent to traverse the
entire concept lattice in this way is polynomial in the number of input objects and attributes per generated concept.
http://en.wikipedia.org/wiki/Formal_concept_analysis
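A short Python sketch (toy context of my own; it enumerates concepts by brute-force closure rather than the incremental Hasse-diagram construction described above) showing how formal concepts pair extents with intents:

from itertools import combinations

# Toy formal context: objects and the attributes they possess.
objects = {"frog": {"amphibian", "green"},
           "leaf": {"green"},
           "newt": {"amphibian"}}
attributes = {"amphibian", "green"}

def extent(attrs):
    # All objects having every attribute in attrs.
    return frozenset(o for o, props in objects.items() if attrs <= props)

def intent(objs):
    # All attributes shared by every object in objs.
    if not objs:
        return frozenset(attributes)
    return frozenset.intersection(*(frozenset(objects[o]) for o in objs))

concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        ext = extent(set(attrs))
        concepts.add((ext, intent(ext)))

# Prints the four concepts of this context, ordered by the size of their extent.
for ext, itt in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(ext), "<->", sorted(itt))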

Formal Concept Analysis Homepage:

Formal Concept Analysis is a theory of data analysis which identifies conceptual structures among data sets. It was
introduced by Rudolf Wille in 1982 and has since then grown rapidly. Three well-established annual international
conferences (ICFCA, ICCS and CLA) are dedicated to FCA and related methods. The FCA method of formal data
analysis has successfully been applied to many fields, such as medicine and psychology, musicology, linguistic
databases, library and information science, software re-engineering, civil engineering, ecology, and others. A
strong feature of Formal Concept Analysis is its capability of producing graphical visualizations of the inherent
structures among data. Especially for social scientists, who often handle data sets that cannot fully be captured in
quantitative analyses, Formal Concept Analysis extends the scientific toolbox of formal analysis methods. Statistics
and Concept Analysis complement each other in this sense. In the field of information science there is a further
application: the mathematical lattices that are used in Formal Concept Analysis can be interpreted as classification
systems. Formalized classification systems can be analysed according to the consistency of their relations.
Thesauri can automatically be constructed from classes and their attributes, without having to create a hierarchy of
classes by hand. As an example, an on-line library catalog using the Conceptual Diagrams of an automatically
constructed class hierarchy has been implemented in the ZIT library in Darmstadt.
http://www.upriss.org.uk/fca/fca.html

Blending and Conceptual Integration:


The Riddle of the Buddhist Monk: A Buddhist monk begins at dawn one day walking up a mountain, reaches the
top at sunset, meditates at the top overnight until, at dawn, he begins to walk back to the foot of the mountain,
which he reaches at sunset. Make no assumptions about his starting or stopping or about his pace during the trips.
Riddle: is there a place on the path which the monk occupies at the same hour of the day on the two trips?
http://markturner.org/blending.html
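The usual resolution of the riddle superimposes the two trips on a single day: an ascending monk and a descending monk walking simultaneously must pass each other somewhere, whatever their paces. A small Python sketch of that argument (the pace functions below are arbitrary stand-ins for "make no assumptions about his pace"):

def ascent(t):              # position on the path at time t in [0, 1], dawn to sunset
    return t ** 2           # an arbitrary slow-then-fast pace

def descent(t):
    return 1 - t ** 0.5     # an arbitrary pace for the downhill trip

def meeting_time(steps=100000):
    for i in range(steps):
        t0, t1 = i / steps, (i + 1) / steps
        # ascent - descent is negative at dawn and positive at sunset, so a sign
        # change (the meeting point) must occur somewhere in between.
        if (ascent(t0) - descent(t0)) * (ascent(t1) - descent(t1)) <= 0:
            return (t0 + t1) / 2
    return None

print("same spot at about t =", round(meeting_time(), 4), "of the day on both trips")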
Information Visualization and Semiotic Morphisms:
http://130.203.133.121:8080/showciting;jsessionid=D64DFC6808B1DF9F4AD42A9AD57E4224?cid=165275
Algebraic Semiotics:
Algebraic semiotics is a new approach to meaning and representation, and in particular to user interface design, that builds on
five important insights from the last hundred years:
• Semiotics: Signs are not isolated items; they come in systems, and the structure of a sign is to a great extent inherited from the
system to which it belongs. Signs do not have pre-given "Platonic" meanings, but rather their meaning is relational, because
signs are always interpreted in particular contexts. (The first sentence reflects the influence of Saussure, the second that of
Peirce.)
• Social Context: Signs are used by people as part of their participation in social groups; meaning is primarily a social
phenomenon; its purpose is communication. (This reflects some concerns of post-structuralism.)
• Morphisms: If some class of objects is interesting, then structure preserving maps or morphisms of those objects are also
interesting - perhaps even more so. For semiotics, these morphisms are representations. Objects and morphisms together form
structures known as categories.
• Blending and Colimits: If some class of objects is interesting, then putting those objects together in various ways is probably
also interesting. Morphisms can be used to indicate that certain subojects are to be shared in such constructions, and colimit s of
various kinds are a category theoretic formalization of ways to put objects together. In cognitive linguistics, blending has been
identified as an important way to combine conceptual systems.
• Algebraic Specification: Sign systems and their morphisms can be described and studied in a precise way using semantic
methods based on equational logic that were developed for the theory of abstract data types.
http://charlotte.ucsd.edu/~goguen/projs/semio.html
A common theme of this work is the adoption of a sign-theoretic perspective on issues of artificial
intelligence and knowledge representation. Many of its applications lie in the field of computer-human
interaction (CHI) and the fundamental devices of recognition (work at IASE in California).
http://en.wikipedia.org/wiki/Computational_semiotics
Semiotic Morphisms, Representations and Blending for Interface Design:
Issues of representation arise in natural language processing, user interface design, art, and indeed,
communication with any medium. This paper addresses such issues using algebraic semiotics, which draws on
algebraic specification to give (among other things) an algebraic theory of representation, and a generalization of
blending in the sense of cognitive linguistics.
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.10.5238
Signs of Sense-Making: Blending Semiotics and Sense-Making:
http://communication.sbs.ohio-state.edu/sense-making/meet/m96gluck.html
Max Planck Institut Informatik:
http://www.mpi-inf.mpg.de/
International Conference on Conceptual Structures:

Papers for ICCS 2009 are invited on, but not restricted to, the following topics:
*conceptual structures - theory and applications
*semantic technologies and conceptual structures

*the interplay of conceptual structures with language, semantics, semiotics and pragmatics
*conceptual data processing, analysis and conceptual logic
*modeling, representation and visualization of concepts
*conceptual knowledge acquisition
*knowledge representation and reasoning with conceptual structures
*applied conceptual structures including data mining and knowledge discovery
*theory and applications of formal ontologies
*graph-based knowledge representation and reasoning
http://semanticweb.org/wiki/ICCS2009

IADIS Multi-Conference on Computer Science and Information Systems:


http://www.informatics-conf.org/
International Association for Development of the Information Society:
http://www.iadisportal.org/
Graph-Based Knowledge Representation: Computational Foundations of Conceptual Graphs:
http://www.lirmm.fr/gbkrbook/
Handbook of Graph Grammars and Computing by Graph Transformations:
http://books.google.com/books?id=nwD_oRtfJKkC&source=gbs_navlinks_s

Emergent Semantic and Computational Processes in Distributed Information Systems (DIS):


http://www.c3.lanl.gov/~joslyn/pcp/workshop98.html
Formal Concept Analysis in Information Science:

A central notion of FCA is a duality called a “Galois connection”. This duality can often be observed between two
types of items that relate to each other in an application, such as objects and attributes or documents and terms. A
Galois connection implies that if one makes the sets of one type larger, they correspond to smaller sets of the other
type, and vice versa. For example, considering documents and terms in information retrieval, enlarging a set of
terms will reduce the set of documents which contain all of these terms, whereas a smaller set of terms will match a
larger set of documents. In Formal Concept Analysis, the elements of one type are called “formal objects”, the
elements of the other type are called “formal attributes”. The adjective “formal” is used to emphasize that these are
formal notions. “Formal objects” need not be “objects” in any kind of common sense meaning of “object”. But the
use of “object” and “attribute” is indicative because in many applications it may be useful to choose object-like
items as formal objects and to choose their features or characteristics as formal attributes. In an information
retrieval application, documents could be considered object-like and terms considered attribute-like. Other
examples of sets of formal objects and formal attributes are tokens and types, values and data types, data-driven
facts and theories, words and meanings and so on.
http://www.upriss.org.uk/papers/arist.pdf
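A hedged Python sketch of the documents-and-terms example described above (the tiny index is invented): the two derivation maps form a Galois connection, so enlarging the set of terms can only shrink the set of matching documents, and closing a set under the connection yields a formal concept.

index = {
    "doc1": {"lattice", "order", "galois"},
    "doc2": {"lattice", "topology"},
    "doc3": {"order", "galois"},
}

def docs_with_all(terms):
    return {d for d, t in index.items() if terms <= t}

def terms_shared_by(docs):
    return set.intersection(*(index[d] for d in docs)) if docs else set()

small, large = {"lattice"}, {"lattice", "order"}
assert docs_with_all(large) <= docs_with_all(small)   # antitone: more terms, fewer documents
print(docs_with_all(small), "vs", docs_with_all(large))
print(terms_shared_by(docs_with_all({"order"})))      # closure of {"order"}: {"order", "galois"}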

Knowledge Representation & Game Theory:


http://www.csd.cs.cmu.edu/research/areas/ai/
http://www.gameontology.org/index.php/Main_Page

Rational Choice Theory and Social Explanation:


http://philpapers.org/rec/FERSOE
Legal Informatics Blog:
http://legalinformatics.wordpress.com/tag/legal-ontologies/

Let's Talk About "Being": A Linguistic-Based Ontology Framework for Coordinating Agents:
http://portal.acm.org/citation.cfm?id=1412410

Semantic Conceptions of Information:


http://plato.stanford.edu/entries/information-semantic/
Self-Predication and the Third Man:
http://www.springerlink.com/content/h57231g858031kn4/
Metacognition is classified into three components:

1. Metacognitive knowledge (also called metacognitive awareness) is what individuals know about themselves and
others as cognitive processors.
2. Metacognitive regulation is the regulation of cognition and learning experiences through a set of activities that
help people control their learning.
3. Metacognitive experiences are those experiences that have something to do with the current, on-going cognitive
endeavor.

Metacognition refers to a level of thinking that involves active control over the process of thinking that is used in
learning situations. Planning the way to approach a learning task, monitoring comprehension, and evaluating the
progress towards the completion of a task: these are skills that are metacognitive in their nature. Similarly,
maintaining motivation to see a task to completion is also a metacognitive skill. The ability to become aware of
distracting stimuli – both internal and external – and sustain effort over time also involves metacognitive or
executive functions.
http://en.wikipedia.org/wiki/Metacognition

Bioenergetics: A Key to Brain and Mind:

From the vantage point of the present stage of terrestrial evolution we can look over all the previous stages and
range them in a sequence of accelerating dissipation. The first, the simplest stage was the dissipation of solar (and
perhaps of other sources) energy gradients in synthesis of simple organic compounds—the phase of prebiotic
syntheses. Later, organic compounds were assembled into various dissipative structures, including elementary
protocells, which maintained their internal order by continual dissipation. Some of them gained the quality
of ontotelic systems—organized systems which "aimed at" preserving their permanence, onticity, by directing the
flow of energy through them before its full dissipation. It has been proposed to label such ontotelic systems the
"subjects". The propensity of the world, ensuing from the second law of thermodynamics, to create subjects, has
been dubbed "subjectibility". Subjectibility may be seen as a third "substance" of the world, along with matter and
energy.[29] Another intensification of the total energy flux through the system set in when some ontotelic systems got
the ability to make copies of themselves, to replicate. An important new stage appeared when the replicative
ontotelic systems became sentient—they were continually displaying alternative states, of which the one
recognizing a relevant feature of the environment became stabilized. It has already been mentioned that proteins
are the basic chemical entities exhibiting sentience at the simplest, molecular level. It is from this stage that the
voraciously dissipating systems may be called living systems. Sentience, and hence cognition, may be defined as
the demarcating characteristic of life. Cognitive systems have markedly increased the densification of energy flow.
The emergence of systems with higher-level sentience, in particular of the nervous system, brought about
additional densification. Cultural evolution, a recent new type of evolution, has become several orders faster than
biological evolution, and it may nowadays be complemented by still much faster technoscientific evolution.
The reason why cognition has become the most accelerating factor of evolution is straightforward: the growth of
knowledge, noogenesis, is autocatalytic, and hence exponential or even hyperbolic. In the simplest case, an
increase of knowledge is linearly dependent on the already existing knowledge, dK/dt = cK, K = e^(ct). Evolution of
knowledge has the character of a Bayesian ratchet: accumulation of knowledge is an incremental process of
changing the probability of existing justified beliefs in the light of new evidence.[30] According to Hans Kuhn, in the
course of evolution organisms gain in quality.[31] The quality represents knowledge, and it is measured by the total
number of bits to be discarded until the evolutionary stage under consideration is reached. Kuhn's measure of
knowledge is related to a similar measure of complexity, for which Seth Lloyd and Heinz Pagels introduced the term
"thermodynamic depth".[32] (For a more detailed description see ref. 29.) Knowledge embodied in a system
corresponds to its epistemic complexity; the greater the knowledge, the greater the epistemic complexity. But rather
than characterizing it by the notion "depth", a more appropriate term would be "thermodynamic height". "Depth"
corresponds to the sum total of entropy which had been produced in the past to reach the present state, while
"height" is a measure of the work capacity associated with the present state. Any gain of knowledge means an
increase in the capacity to do work on the environment, because of the higher placement of the subject in an
"epistemic field"—just as the work done by a weight depends on its elevation in the gravitational field. To describe
the advancement in the scientific description of the world, John Smart used the metaphor of climbing "Wigner's
ladder", wherein the spacing of each new rung gets ever closer and easier to attain.[33] This metaphor pictures well
the hyperbolic increase of knowledge acquisition. But even if new knowledge is easier to attain when a great deal of
previous knowledge, the Bayesian priors, is already available, the gain of any new knowledge continues to be an
energetically costly process. Knowledge is new if it cannot be foreseen, so acquiring new knowledge cannot be, by
definition, a deterministic process; it can only result from trials and failures. This is why the majority of actual
cognitive transactions, even in humans, continue to be based on the use of previously acquired, embodied,
knowledge. The functioning of the mammalian immune system is a case in point.
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2633811
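The growth law quoted above, dK/dt = cK with solution K = e^(ct), and the faster hyperbolic regime it alludes to can be contrasted with a short numerical sketch (the coefficient, step size, and horizon below are invented for illustration):

def integrate(rate, k0=1.0, c=0.5, dt=0.01, t_max=5.0):
    """Euler-integrate dK/dt = rate(K) until t_max or until K exceeds a cap."""
    k, t = k0, 0.0
    while t < t_max and k < 1e6:
        k += rate(k, c) * dt
        t += dt
    return t, k

print("exponential dK/dt = c*K:   ", integrate(lambda k, c: c * k))       # K(5) near e^2.5
print("hyperbolic  dK/dt = c*K**2:", integrate(lambda k, c: c * k * k))   # exact solution diverges at t = 1/(c*K0) = 2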

Mechanism Design: How to Implement Social Goals


The theory of mechanism design can be thought of as the "engineering" side of economic theory. Much theoretical
work, of course, focuses on existing economic institutions. The theorist wants to explain or forecast the
economic or social outcomes that these institutions generate. But in mechanism design theory the direction of
inquiry is reversed. We begin by identifying our desired outcome or social goal. We then ask whether or not an
appropriate institution (mechanism) could be designed to attain that goal. If the answer is yes, then we want to
know what form that mechanism might take. In this paper, I offer a brief introduction to the part of mechanism
design called implementation theory, which, given a social goal, characterizes when we
can design a mechanism whose predicted outcomes (i.e., the set of equilibrium outcomes) coincide with the
desirable outcomes, according to that goal. I try to keep technicalities to a minimum, and usually confine them to
footnotes.
http://nobelprize.org/nobel_prizes/economics/laureates/2007/maskin-lecture.html
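A textbook illustration in Python, not taken from the lecture above: the Vickrey second-price auction is a mechanism whose equilibrium outcome coincides with the social goal "allocate the item to whoever values it most", because truthful bidding is a dominant strategy. The bidder names and valuations are hypothetical.

def second_price_auction(bids):
    """bids: dict bidder -> bid; returns (winner, price paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0   # winner pays the second-highest bid
    return winner, price

values = {"alice": 10.0, "bob": 7.0, "carol": 4.0}     # hypothetical private valuations

print(second_price_auction(values))                    # truthful bids: ('alice', 7.0)
print(second_price_auction({**values, "alice": 6.0}))  # shading her bid only costs alice the item: ('bob', 6.0)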

This would be an instance of the ubiquitous principle of Isotely, convergent or parallel development of different creatures along similar lines.
http://books.google.com/books?id=ncAVAQAAIAAJ&q=ubiquitous+principle+of+isotely&dq=ubiquitous+principle+of+isotely
Isoteleis, Isoteleia, Isoteleonic, Isotelestai
*-sis act, state, condition of
Isotelesis: The Law of Equivalence of Form:
http://www.kabbalah.info/engkab/kabbalah-video-clips/the-law-of-equivalence-of-form
Isotelesis: a new type of "Freemasonry"?
(The question is, whose or what ends are being served, and what are the structures and processes for attaining
them? I feel as if the 17th century couldn't imagine what changes would occur over these last 400 years, and that
new forms of techne and telos, or "telesis", would become available through communications technology for
collaboration models. Ultimately Isotelesis is simply a form of collaborative innovation network based on global
standards for interoperable, context-aware pervasive systems, which enable a new type of ubiquitous computing to
replace the "all-seeing eye" of Horus, as we can build our own suns on earth if we want to. The question is value
utility, which is not a utilitarian paradigm but a value-oriented paradigm, one which respects the fact that different
cultures and communities have varying conceptions of values and needs, even moral virtues. Being equal with each
other in every sense is not possible; the goal should be whether we can find a balance, to work together in spite of
differences, while recognizing that we share common goals at times (koinotely) and not at other times. Ultimately the
outcome, whether of a shared or conflicting nature, will be equal to something. The inherent equivalence of nature's
laws is what "Isotelesis" ultimately recognizes.)
"Isotelesis" was partly inspired by the work of MIT's Center for Collective Intelligence and the vision of Peter Gloor
and Thomas W. Malone.
http://cci.mit.edu/pgloor/
http://ccs.mit.edu/malone/
Imagine organizations where bosses give employees huge freedom to decide what to do and when to do it.
Imagine electing your own bosses and voting directly on important company decisions. Imagine organizations
where most workers aren't employees at all, but electronically connected freelancers living wherever they want to.
And imagine that all this freedom in business lets people get more of whatever they really want in life—money,
interesting work, helping other people, or time with their families.

In The Future of Work, renowned organizational theorist Thomas W. Malone, codirector of MIT's landmark initiative
"Inventing the Organizations of the 21st Century," shows where these things are already happening today and
how—if we choose—they can happen much more in the future. Malone argues that a convergence of technological
and economic factors—particularly the rapidly falling cost of communication—is enabling a change in business
organizations as profound as the shift to democracy in governments. For the first time in history, says Malone, it will
be possible to have the best of both worlds—the economic and scale efficiencies of large organizations, and the
human benefits of small ones: freedom, motivation, and flexibility.

Based on twenty years of groundbreaking research, this landmark book provides compelling models for actually
designing the "company of the future." Through vivid examples of organizations around the world, Malone outlines:
*Four decentralized organizational structures—loose hierarchies, democracies, external markets, and internal
markets—that will be enabled by technology but centered around enduring human values
*The shift from "command-and-control" management to "coordinate-and-cultivate," and the new skills that will be
required to succeed
*A framework for determining if a company's situation is ripe for decentralizing and which organizational structure
would be most effective

Visionary and convincing, The Future of Work shows how technology now offers us the choice of creating a world
that is not just richer, but better.
http://ccs.mit.edu/futureofwork/

Thomas W. Malone:
http://en.wikipedia.org/wiki/Thomas_W._Malone
Value Utility From Wealth and Work:
http://www.dailynexus.com/article.php?a=11405
Game Theoretic Problems in Network Economics and Mechanism Design Solutions:
http://books.google.com/books?id=S7zxVKFmk24C&source=gbs_navlinks_s

Machine Learning, Game Theory, and Mechanism Design for a Networked World:
http://www.cs.cmu.edu/~mblum/search/AGTML35.pdf
Great Expectations. Part I: On the Customizability of Generalized Expected Utility:
http://129.3.20.41/econ-wp/game/papers/0411/0411003.pdf

Great Expectations. Part II: Generalized Expected Utility as a Universal Decision Rule:
http://129.3.20.41/eps/game/papers/0411/0411004.pdf
On Game Formats and Chu Spaces:
http://www.econ-pol.unisi.it/ab_quaderni/ab417.html
"In some Masonic rituals there is a charge to the initiate with appeals to him or her as "a Freemason" being "a
citizen of the world" and an "individual"."
http://www.freemasons-freemasonry.com/column0907.html
Lysias, then, went to Thurii with his brothers Polemarchus and Euthydemus. He is said to have studied under the
Syracusan rhetorician Tisias. After the loss of the Athenian armies in Sicily, 413 B.C., Lysias and his brothers were
among three hundred persons accused of "Atticizing," and were expelled from Thurii. They returned to Athens in
412 B.C. From this year till 404 B.C., the brothers lived in prosperity and happiness, making a considerable fortune
as proprietors of a shield-factory, where they employed 120 slaves.
They had many friends; they belonged to the highest class of aliens—the isoteleis—and the evidence of Plato and
Dionysius makes it clear that they mixed with the most cultivated society. They took pride in the performance of all
public services which fell to their share.
Fortune changed for the sons of Cephalus when in 404 B.C. a successful revolution brought the Thirty into power;
the orator himself gives a graphic description of the way in which their ruin was brought about.
http://www.perseus.tufts.edu/hopper/text?doc=Perseus:text:1999.04.0075:chapter=4

Homeotely: The term homeotely signifies that subsystems will direct their behaviour in such a way that it is
beneficial for the well-being of the overall system. When applied to the evolutionary process, it states that
subsystems will develop in such a way that they are beneficial for the well-being of the overall system. At first
glance, this sounds embarrassingly teleological. However, if we recognize the fact that the behaviour as well as the
evolution of systems is guided by context-sensitive self-interest, teleology vanishes into thin air. Context-sensitive
self-interest is a systemic evolutionary principle: organisms are forced by their selfish genes to seek nothing but
their own advantage - but the environment in which they develop, or the system of which they are a subsystem,
only allows a limited set of developments and patterns of behaviour, any breach of the rules being punished with
elimination. For an animal endowed with choice this harsh law transforms into an
ethical principle: since its behaviour is only partly genetically determined, the word sensitive assumes its active
meaning, i.e. it refers to conscious reactions to perceived or anticipated effects of behaviour or development on the
overall system. (LM, based on Edward Goldsmith, The Way)
http://www.google.com/url?q=http://www.ausstieg-aus-dem-klima-crash.de/%3Fdownload%3DLexEcol0708.pdf%26PHPSESSID%3D931d3f7a63398bfc65358ac475f0f7b0&sa=U&ei=y0uySoC6FoPqtAO9uaTYCw&ct=res&cd=17&usg=AFQjCNEJu6DEEzU7CkSktsaAk0z9Bt69Iw
Homeotely/Homeotelic/Homeotelos/Homeotelesis (resembling, sharing in common, similar + goals, outcomes, results, up to or at a distance, purpose, intelligent planning of means to achieve desired ends)

Isotely/Isotelic/Isotelos/Isotelesis (equal, same, convergent + goals, outcomes, results, up to or at a distance, purpose, intelligent planning of means to achieve desired ends)

Koinotely/Koinotelic/Koinotelos/Koinotelesis (common, mutual, shared + ends, goals, outcomes, results, up to or at a distance, purpose, intelligent planning of means to achieve desired ends)
κοινός (koinós) m., nominative, sg.
1. common, mutual, shared, joint
κοινή πεποίθηση (common belief)
κοινό συμφέρον (mutual interest)
2. common, commonplace, ordinary
ένας κοινός άνθρωπος (an ordinary person)
3. public
κοινή γνώμη (public opinion)
κοινή ωφέλεια (public utility)

(isos, “equal”)
(telos) "end, goal, result"
Psychological Decision Theory: Decision Maker as a Goal-Directed System: Composite Goals:
http://books.google.com/books?id=LQc-zQoOuFMC&lpg=PP1&pg=PA25#v=onepage&q=&f=false
Goal-Directed Agent Behavior:
http://books.google.com/books?id=gDLpyWtFacYC&lpg=PP1&pg=PA379#v=onepage&q=&f=false
A Refined Goal Model for Semantic Web Services:
Abstract—The idea of service orientation envisions dynamic detection and execution of suitable Web services for
solving a particular request. Most realization approaches pay only little attention to the client side of such
architectures. We therefore promote a goal-driven approach: a client merely specifies the objective to be achieved
in terms of a goal, and the system resolves this by automated detection, composition, and execution of Web
services. Extending the WSMO framework, we present a model for describing goals as formalized client objectives
that carry all information relevant for automated detection and execution of Web services. This paper explains the
design of the goal model, specifies the formal descriptions of goals, and demonstrates the model within an
illustrative example.
http://www.sti-innsbruck.at/fileadmin/documents/goalmodel-iciw07.pdf
The Koinophilic Solution:

The problem with cooperation is that although a group of cooperative individuals is fitter than an equivalent group
of selfish individuals, selfish individuals interspersed amongst a community of cooperators are fitter than their
hosts. This means they raise, on average, more offspring than their hosts, and will therefore ultimately replace
them.
http://en.wikipedia.org/wiki/Koinophilia

The KIF Encoding for a Classification Ontology:

This paper provides a formal semantics for classifications. Following the thesis that classifications are a
preliminary representation or prototype for ontological models, by extension it provides a formal semantics
for ontologies. This appendix discusses the KIF encoding for an ontology of classifications. Again, since
classifications are a prototype for ontological models, this KIF ontology could be viewed of as a theory of
ontologies, or in other words an "ontology of ontologies."
http://suo.ieee.org/email/pdf00005.pdf
A Dynamic Theory of Ontology:
http://www.jfsowa.com/pubs/dynonto.htm

Confronting Misconceptions with Adaptive Ontologies:


http://www.mkbergman.com/date/2009/08/17/
UMBEL: Upper Mapping and Binding Exchange Layer; A Lightweight, Subject Concept Reference Structure for the
Web:
http://www.umbel.org/
OpenStructs: Open Source Data Structs and Web Service Frameworks:
http://openstructs.org/
ConStruct: Structured Content System, "Driving Drupal with Structured Data":
http://constructscs.com/
structOntology:
http://drupal.org/project/structontology
Structured Dynamics: Structured and Linked Data Experts:
http://www.structureddynamics.com/
The Bibliographic Knowledge Network (BKN) is a project to develop a suite of tools and services to encourage
formation of virtual organizations in scientific communities of various types.
http://www.bibkn.org/drupal/
Concurrent Knowledge-Extraction in the Public Key Model:
http://eccc.hpi-web.de/eccc-reports/2007/TR07-002/index.html

Brainstorm Page 83
Simplified Design for Concurrent Statistical Knowledge Extraction:
http://linkinghub.elsevier.com/retrieve/pii/S1007021409700380
Bibliographic Ontology Specification:
http://bibliontology.com/
Music Ontology Specification:
http://musicontology.com/
Theory as Product: Ontology of Organization as System:
http://www.c5corp.com/research/ontology.shtml

Measuring Inconsistencies in Ontologies:


http://www.eswc2007.org/pdf/eswc07-deng.pdf

Ontology View of Automata Theory:


http://www.foibg.com/ijita/vol15/ijita15-4-p05.pdf
Semantics and Ontology of InterAction:
http://www.loa-cnr.it/Files/SOIA.pdf

Towards an Ontology of Problems:


http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.48.1990

Development of the ALIS IP Ontology: Merging Legal and Technical Perspectives:


http://www.springerlink.com/content/f5356w8182886432/

AI Approaches to the Complexity of Legal Systems:


http://74.125.155.132/search?q=cache:XRF7-YdyQqIJ:idt.uab.es/IVRXXIV-aicol09/+ontologies+and+game+theory&cd=9&hl=en&ct=clnk&gl=us

Computable Models of the Law: Languages, Dialogues, Games, Ontologies:


http://www.one-lex.eu/Activities/Conferences/cmlaw.pdf

Chu Spaces are a simple and surprisingly general mathematical construct with applications to linear logic,
concurrency, games, physics, philosophy and foundational mathematics.

Chu Spaces as a Semantic Bridge Between Linear Logic and Mathematics:

The motivating role of linear logic is as a "logic behind logic". We propose a sibling role for it as a logic of
transformational mathematics via the self-dual category of Chu spaces, a generalization of topological spaces.
These create a bridge between linear logic and mathematics by soundly and fully completely interpreting linear
logic while fully and concretely embedding a comprehensive range of concrete categories of mathematics. Our
main goal is to treat each end of this bridge in expository detail. In addition, we introduce the dialectic lambda-
calculus, and show that dinaturality semantics is not fully complete for the Chu interpretation of linear logic.
http://portal.acm.org/citation.cfm?id=763672

On Game Formats and Chu Spaces:

It is argued that virtually all coalitional, strategic and extensive game formats as currently employed in the extant
game-theoretic literature may be presented in a natural way as discrete nonfull, or even (under a suitable choice of morphisms) as full, subcategories of Chu(Poset, 2).
http://www.econ-pol.unisi.it/ab_quaderni/ab417.html

The essence of the points and states (dual points) of Chu spaces may be found in the subjects and predicates of
linguistics, the individuals and predicates of logic, the positions and momenta of physics, the inbound and outbound
arrows at categorical objects, and, most importantly for this talk, the events and states of computational process
algebra.
http://www.cs.bham.ac.uk/~vdp/meetings/chu-abs.html

Computational Category Theory. Broadly speaking, this includes Linear Logic, Chu Categories, Synthetic Domain
Theory, Coherence, Bi-completions of Categories, Categorical Proof Theory, and Categorical Linguistics.
http://www.math.mcgill.ca/bunge/ctatmcgill.html

Chu spaces generalize the notion of topological space by dropping the requirements that the set of open sets be
closed under union and finite intersection, that the open sets be extensional, and that the membership predicate (of
points in open sets) be two-valued. The definition of continuous function remains unchanged other than having to
be worded carefully to continue to make sense after these generalizations.
http://www.viswiki.com/en/Chu_space
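A minimal Python sketch, assuming only the descriptions above: a Chu space is a matrix r over an alphabet K with rows indexed by points and columns by states; its dual transposes the matrix, and a morphism is a forward map on points paired with a backward map on states satisfying the adjointness condition. The tiny two-point space below is invented for illustration.

class Chu:
    def __init__(self, points, states, r):
        self.points, self.states, self.r = points, states, r   # r: (point, state) -> K

    def dual(self):
        return Chu(self.states, self.points, lambda x, a: self.r(a, x))

def is_chu_morphism(A, B, f, g):
    """f: A.points -> B.points, g: B.states -> A.states (note the reversed direction)."""
    return all(B.r(f(a), y) == A.r(a, g(y)) for a in A.points for y in B.states)

# A tiny two-point space over K = {0, 1}, and the identity morphism on it.
A = Chu(points=["p", "q"], states=["s", "t"],
        r=lambda a, x: 1 if (a, x) in {("p", "s"), ("q", "t")} else 0)

print(is_chu_morphism(A, A, f=lambda a: a, g=lambda x: x))   # True
print(A.dual().r("s", "p") == A.r("p", "s"))                 # the dual just swaps the two roles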
Information Transfer Across Chu Spaces:
http://jigpal.oxfordjournals.org/cgi/content/abstract/8/6/719

Information Transfer Time: The Role of Holomorphism, Stationary Phase, and Noise:
http://arxiv.org/abs/physics/0210115

Formal Topology, Chu Space, and Approximable Concept:


http://ftp.informatik.rwth-aachen.de/Publications/CEUR-WS/Vol-162/paper14.pdf

Generalizations of Approximable Concept Lattices:


http://www.springerlink.com/content/057625x0745w717p/

Chu Spaces, Concept Lattices, and Domains:


http://www.entcs.org/files/mfps19/83018.pdf

Chu Correspondences:
http://cat.inist.fr/?aModele=afficheN&cpsidt=20280132

Brainstorm Page 84

A Categorical View at Generalized Concept Lattices:


http://www.kybernetika.cz/pdf_abstract/432818.pdf

Concept Lattices and Their Applications:


http://books.google.com/books?id=htWL-wqPSXwC&source=gbs_navlinks_s
Knowledge Representation and Mediation for Transdisciplinary Frameworks: Tools to Inform Debates, Dialogues &
Deliberations:
http://www.ijtr.org/Vol%201%20No1/4.%20Pereira_Funtowicz_IJTR_Article_Vol1_no1.pdf

Semantic Web based Collaborative Knowledge Management - A generic SOA for managing semantics driven
domain knowledge:
http://eprints.ecs.soton.ac.uk/14016/
Open Source Models of Collaborative Innovation in the Life Sciences:
http://www.cambia.org/daisy/patentlens/1450/version/default/part/AttachmentData/data/2005_Bellagio_Report.pdf
Mining Workshop: Transdisciplinary and Conceptual Framework that Facilitates Addressing Research Questions
Simultaneously: Key Elements for Building a Transdisciplinary Research Framework:
http://www.aeseda.psu.edu/workshop/transdisciplinary.html

A Spatial View of Information:


http://www.linkinghub.elsevier.com/retrieve/pii/S0304397506005019

6. The Nature of Telism. -- Telism is a special case of causation; but it emphatically does not consist, as
sometimes alleged, in causation of an earlier event by a later. If one observes carefully some undoubted cases of
telism, one finds them to be cases where not the later occurrence of an event E, but the earlier occurrence of
desire for the later occurrence of E, causes performance of an act A, which itself then causes E (either directly or
through an instrument). That is, performance of act A at time 2 by person P is not caused by occurrence of E at
time 3; it is caused by P's desire at time 1 that E shall occur at time 3. What is true, however, is that, for a
spectator, occurrence of E at time 3 explains performance of A by P at time 2. But it explains that performance not
by exhibiting itself as cause of it, but by exhibiting itself as something occurring indeed at time 3, desire for which
by P at time 1 would have been sufficient to cause, and probably did cause, performance of A by P at time 2.
http://www.ditext.com/ducasse/duc13.html

Purpose: An intended use or anticipated outcome (categories, graphs, networks, games) that serves a role or
epitomizes a variable as part of the needs of some contextual, self-defined whole (with self-restrictive agency
or complex emergent feedback), that guides the planning of actions (consciously or unconsciously), and that has
the quality of being determined to do or achieve something (decisions reached and proposed through bounded
intentional modes); probability (objective or subjective) is used in the selection of generalized utility.
Outcome: A phenomenon that follows and is caused by another phenomenon; something that results; information,
an event, object, or state produced as a result or consequence of a plan, process, accident, stimuli, calculi, effort,
or other similar action or occurrence.
Teleologic Purpose: Intended outcome determined by the entirety of events in the universe (or multiverse).
Teleonomic Purpose: Intended outcome determined by the modes of programmable states available to an agent or
composition of complex adaptive systems.
Teleologic Outcome: Intended purpose determined by all possibilities in the universe (or multiverse).
Teleonomic Outcome: Intended purpose determined by the available resources of an anticipatory program.
Teletic: An outcome or purpose which is self-consistent concurrently throughout, and which ensures the maintenance
of certain principles to, at, or from a distance.
Telestic: An outcome or purpose for which the causal agency is able to formulate new strategies based on changes
in evidence or desires; a means for self-determinacy, or quasi-freedom of will.
Teleologic or Teleonomic: Telic or Quasi-Telic
Telesis: Teletic or Telestic (when hyperincursively applied)

I am interested in self-organization, beyond genotypes and phenotypes, for online social phenomena (the WWW is
similar to directed network structures) and eventually combined with semantics technology (ontology design tools)
for entangled collaborative innovation networks which employ concepts from quantum game theory.

My Tentative Definition:
Telisms are structure-preserving mappings similar to infomorphisms in directed networks, or teleomorphisms in
evolutionary biology; however, the definition of "function" is extended to include fulfilling objectives, ends, outcomes,
goals, or apparent purposes (used only metaphorically, in the sense of teleonomy, not teleology), which come in four
main types that may be glued into various types of network motifs. Their systemic analysis is complementary to
the study of plectics, and therefore it may be called teletics when referring to processes occurring concurrently at a
distance, and telestics when it involves telesis as described by Chris Langan's CTMU. (Ultimately my interest in
his ideas comes down to a skeletal theory for complex adaptive systems.)

Autotelism: an inclusion relation which takes automorphic parameters for objectives, functions, ends, outcomes,
goals, or apparent purposes.
Isotelism: parallel structures and processes with convergent, or isomorphic objectives, functions, ends, outcomes,
goals, or apparent purposes.
Polytelism: any one function which fulfills multiply-connected, polymorphic objectives, functions, ends, outcomes,
goals, or apparent purposes.
Holotelism: when any functional node takes the entire network structure as the parameter when developing
objectives, functions, ends, outcomes, goals, or apparent purposes.
(Koinotelism, Homeotelism, etc. may arise from categorical, functional-network gluing)

Reality Principle - The real universe contains all and only that which is real. The reality concept is analytically self-
contained; if there were something outside reality that were real enough to affect or influence reality, it would
be inside reality, and this contradiction invalidates any supposition of an external reality (up to observational or
theoretical relevance).
Syndiffeonesis - The expression and/or existence of any difference relation entails a common medium
and syntax. Reality is a relation, and every relation is a syndiffeonic relation
exhibiting syndiffeonesis or "difference-in-sameness". Therefore, reality is a syndiffeonic
relation. Syndiffeonesis implies that any assertion to the effect that two things are different
implies that they are reductively the same; if their difference is real, then they both reduce to a
common reality and are to that extent similar. Syndiffeonesis, the most general of all reductive
principles, forms the basis of a new view of the relational structure of reality.
Metaphysical Autology Principle (MAP) - All relations, mappings and functions relevant to
reality in a generalized effective sense, whether descriptive, definitive, compositional, attributive,
nomological or interpretative, are generated, defined and parameterized within reality itself. In
other words, reality comprises a "closed descriptive manifold" from which no essential predicate
is omitted, and which thus contains no critical gap that leaves any essential aspect of structure
unexplained. Any such gap would imply non-closure.
Isotelesis, Cognitive-Theoretic Model of the Universe, M-Theory, and TOEs

HYJ: in my version of ISOTELESIS, [which is totally independent of CML's Holopantheistic CTMU, and if it had any
"theistic" connotations (which this author considers somewhat irrelevant), it would rather be described
as Holopanentheistic or Metatheistic, after all Riemann apparently made several efforts to prove the validity of the
book of Genesis using mathematical principles, although his deeper motives weren't to strengthen fundamentalism
in religious dogma, but to recognize inherent possibility. At times Isotelesis is quite similar to the yin/yang of
holistic/dualistic complementarity as exemplified by Chu Spaces or perhaps as some dual-aspect "monism", the
"ISO" component merely implies "equation", and "TELESIS" implies the connectivity of means and ends,
therefore compounded they represent means-ends equations, and the equivalence of all possibilities and
their complementary "adaptations" with actualizations, in order to maintain ontological consistency and cohesive
logic or hology. So these aspects could be similar to the Fractional Quantum Hall Effect or Anyons, at
times Quaternionic, or perhaps an Octo-Octonionic projective plane such as E8, which is the largest exceptional
Lie symmetry group for which the Lie algebra is finite, and built using an algebra that is a tensor product of the
Octonions with themselves. Isotelesis is a principle which underlies the semantics of mappings as telisms,
and seeks to fine tune various distinctions and similarities among information preserving structures, especially
when involving multiple potentially conflicting meanings, quantification/qualification
of extensional/intensional attributes or isolating cause and effect in functions involving mixed variance
polyisomorphism.] The nodes above represent local-global agreement through Isocognitive Stratification and
Isomorphic Syntax {(IS)}, and the Multifurcatively Convergent bands of "explained in terms of" are
Bound Telisms embedded in O for Open Telesis (rather than Unbound Telesis), think of the hole(whole) of a Klein
Bottle as the O, the self-connected regions of the manifold are the {(IS)}, however instead of one hole(whole), there
are many holes(wholes) connecting transdimensionally, the free space/energy/infogravitic potential which the O
has access to in the form of unoriented (unbound) TELE-S (the means to achieve some end "S"), or O could stand
for "operator" and "S" for "system", or the operating system (global/local self-perceptual mechanism) which
connects, transforms, structurally preserves "Is" in terms of "is" with "tele" which transduces "I" for inclusion, or "SI"
for Self-Inclusion, allowing self-containment, binding, stratification and protocomputational ordering of functors and
parameters, telors and grammars, and orients telesis from a topoalgebraically unoriented state as part
of the system of "S", becoming "SIS" for a Self-Including Set or in terms of how reality explains itself: Scientia
Incogito Summa, or (Knowing Arranges Higher), allowing for complexity generation (a non-zero sum game)
in an Ontologically Telestic System "O-TELE-S" aka "OTELES", with "o": Among the ancients, O was a mark of
triple time, from the notion that the ternary, or number 3, is the most perfect of numbers, and properly expressed by
a circle, the most perfect figure. "teles": Tending or relating to a purpose or an end. for self-consistent, self-
coherable, self-distributive networks capable of self-design, self-actualizations, and self-predications up to
isomorphism: {(IS)}-O-TELE-S-{(IS)}-O-TELE-S-{(IS)}-O-TELE-S-{(IS)}-O-TELE-S-{(IS)}, etc., concurrently in
multiple dimensions via distributively branching spaces and the incoversion-coinversion of parallel infogravitic
lightcones represents the conspansion between event-state, or time-information dualities.
Cyclic Coverings, Calabi-Yau Manifolds and Complex Multiplication: The Multilinear Yoneda Lemmas: Toccata,
Fugue, and Fantasia on Themes by Eilenberg-Kelly and Yoneda:
http://www.springerlink.com/content/436564w0m7344771/
Relative Injectivity Between Cocompleteness For a Class of Distributors:
http://www.maths.tcd.ie/EMIS/journals/TAC/volumes/21/12/21-12.pdf
The Ising Model is NP-Complete:
http://www.cs.brown.edu/~sorin/lab/media/ising_model_np_complete.pdf
The gate of the Yale Chemistry Research building is adorned with a model beloved of physicists and chemists who
study phase transitions: the Ising model. In 1944, Nobel Prize-winning Yale professor Lars Onsager solved the two-
dimensional Ising model in a mathematical tour de force.
http://www.yale.edu/physics/gallery/Photo-30.html
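A small Python sketch of the model's energy function and a brute-force ground-state search on a tiny grid (the couplings are chosen arbitrarily for illustration). Exhaustive search over 2^N spin assignments hints at why the general problem, for non-planar or field-coupled instances, is computationally hard, even though the uniform 2D case Onsager treated has an exact solution.

from itertools import product

N = 3                                      # a 3x3 grid of 9 spins
sites = [(i, j) for i in range(N) for j in range(N)]
bonds = [((i, j), (i + di, j + dj))
         for (i, j) in sites for di, dj in ((1, 0), (0, 1))
         if i + di < N and j + dj < N]
J = {b: 1.0 for b in bonds}                # ferromagnetic couplings, assumed for illustration

def energy(spins):                         # H = -sum over bonds of J_uv * s_u * s_v
    return -sum(J[(u, v)] * spins[u] * spins[v] for (u, v) in bonds)

configs = (dict(zip(sites, s)) for s in product((-1, 1), repeat=len(sites)))
ground = min(configs, key=energy)
print("ground-state energy:", energy(ground))   # -12.0 here: all spins aligned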

Brainstorm Page 86
http://plus.maths.org/issue26/features/mathart/index-gifd.html

http://www.cs.sandia.gov/newsnotes/index.html

HYJ: "End"-less or holoauto-open "ended"? Perhaps forming both open and closed strings among multiple
interstratified D-branes? If unbound/open "telesis" is similar to strings/branes, and closed loops take form as
isorecursions (telic binding), perhaps they vibrate in symphonic polymodal melodies or echo together like voices
in a choir, becoming like is-oteles-is, or referring to factors that produce or tend to produce the same effect, in a self-
bound or open-ended system this dance of resonantly embedded topoalgebraic, emanative matrices is-what-it-is:
a TOE.

Brainstorm Page 87
Hierarchical and Symbolic Structures, Omne symbolum de symbolo, Scientia Incogito Summa, and Cybersemiotics
"Symbols grow. They come into being by development out of other signs, particularly from icons, or from mixed
signs partaking of the nature of icons and symbols. We think only in signs. These mental signs are of mixed nature;
the symbol-parts of them are called concepts. If a man makes a new symbol, it is by thoughts involving concepts.
So it is only out of symbols that a new symbol can grow. Omne symbolum de symbolo." - Charles Sanders Peirce

http://www.brier.dk/SoerenBrier/index.htm
Cybersemiotics ~ Integration of the Informational and Semiotic Paradigms of Cognition and Communication:
http://www.mdpi.com/journal/entropy/special_issues/cybersemiotics-paradigms
The scientific endeavor in the post-modern age is becoming increasingly complex and
transdisciplinary. Researchers and practitioners within the fields of the arts and the natural, medical, and social
sciences have been forced together by new developments in communication and knowledge technologies that
have broken the traditional limits of professional knowledge. They are further forced together by problems arising
from the limitation on the kinds of knowledge that we have cherished so far. The shortcomings of traditional
information and communication analysis based on data or information-flow theories are raising fundamental
problems with respect to the construction and organization of knowledge systems. New concepts of communication
can help us understand and develop social systems as self-organizing and self-producing networks, and we need a
deeper understanding of the ethics and aesthetics foundational to the existence of these new systems. Instead of
the communication of information we might speak of a jointly actualized meaning.
http://www.infoamerica.org/teoria_articulos/luhmann02.pdf
Pragmatism and Umwelt-Theory:
The main idea of Uexküll is that each component of Umwelt has a functional meaning for an organism; it may be
food, shelter, enemy, or simply an object that is used for orientation. An organism actively creates its Umwelt
through repeated interaction with the world. It simultaneously observes the world and changes it; the phenomenon
which Uexküll called a functional circle. The Umwelt-theory also implies that it is not possible to separate mind from
the world (matter) because mind makes the world meaningful.
http://home.comcast.net/~sharov/biosem/txt/umwelt.html
Pragmatic Semiotics & Knowledge Organization:
http://cat.inist.fr/?aModele=afficheN&cpsidt=16763828
Knowledge Profiling: The Basis for Knowledge Organization:
http://findarticles.com/p/articles/mi_m1387/is_3_52/ai_n6080401/
Minds and Machines:
http://portal.acm.org/toc.cfm?id=J538&type=periodical&coll=GUIDE&dl=GUIDE&CFID=57001638&CFTOKEN=69755046
Resources on Cognitive & Organizational Semiotics:
http://www.mildai.org/resources.html
A Theory of Second-Order Trees:
http://glew.org/nglew/papers/tsot-esop.pdf

In string theory, D-branes are a class of extended objects upon which open strings can end with Dirichlet boundary
conditions, after which they are named. D-branes were discovered by Dai, Leigh and Polchinski, and independently
by Horava in 1989. In 1995, Polchinski identified D-branes with black p-brane solutions of supergravity, a discovery
that triggered the Second Superstring Revolution and led to both holographic and M-theory dualities.
http://en.wikipedia.org/wiki/D-brane
http://www.damtp.cam.ac.uk/user/dt281/teaching.html
http://eskesthai.blogspot.com/2006_01_01_archive.html
http://universe-review.ca/R15-18-string.htm
http://www.superstringtheory.com/
A First Course in String Theory:
http://books.google.com/books?id=XmsbvP1uUeIC&source=gbs_navlinks_s
Orientifold:
http://ncatlab.org/nlab/show/orientifold
On Mass Hierarchies in Orientifold Vacua:
Abstract. We analyze the problem of the hierarchy of masses and mixings in orientifold realizations of the Standard Model.
We find bottom-up brane configurations that can generate such hierarchies.
Key words: D-branes; Quark Masses and SM Parameters; Standard Model
http://www.iop.org/EJ/abstract/1126-6708/2009/08/026

From Spacetime Foam to Holographic Foam Cosmology:


http://dx.doi.org/10.1016/j.physletb.2007.04.024
Emerging Holography:
http://kea-monad.blogspot.com/2009/06/emerging-holography.html
Originally the letter M in M-theory was taken from membrane, a construct designed to generalize the strings of
string theory. However, as Witten was more skeptical about membranes than his colleagues, he opted for "M-
theory" rather than "Membrane theory". Witten has since stated that the interpretation of the M can be a matter of
taste for the user of the word "M-theory".
http://en.wikipedia.org/wiki/M-theory

According to Illuminati Conspiracy Part One, by Terry Melanson (author of “Perfectibilists - The 18th Century
Bavarian Order of the Illuminati”):

«The Order was, therefore, always represented in communications between members as a circle with a dot in the
center ʘ This symbolic imagery - the point within a circle, the Perfectibilists and the Bees - is also reflective of
Weishaupt's fascination with Eleusinian and Pythagorean Mysteries; no doubt learning of this early on, having
access to Ickstatt's considerable library.»
http://www.scribd.com/doc/7455173/Illuminati-Conspiracy-Part-One
To understand the meaning and motives of egalitarianism, project it into the field of medicine. Suppose a doctor is
called to help a man with a broken leg and, instead of setting it, proceeds to break the legs of ten other men,
explaining that this would make the patient feel better; when all these men become crippled for life, the doctor
advocates the passage of a law compelling everyone to walk on crutches—in order to make the cripples feel better
and equalize the "unfairness" of nature.

If this is unspeakable, how does it acquire an aura of morality—or even the benefit of a moral doubt—when
practiced in regard to man's mind?
http://aynrandlexicon.com/lexicon/egalitarianism.html
"Isotelesis" supports social equity, teleonic isomorphism, teleosemantic equivalence, local-global
intercomplementarity, coreflexive functionality, and reflective equilibrium.
http://plato.stanford.edu/entries/egalitarianism/

"Is-oteles-is" may also be thought of as "equal distance, fulfillment", as opposed to "I-lluminat-i", which
presupposes the existence of "enlightened ones" or "Roshaniya" in Farsi, and represents a restoration of the
original purpose of Zarathustra's teachings, of "Good Thoughts, Good Words, Good Deeds" which does not
support one-world government as proposed by the Bahá'u'lláh, rather a distributed brotherhood of man, or Koinonia
+ Telesis, with each city-state practicing gyroteleostasis, and respecting social equity and diversity, without
assuming egalitarian principles, as individuality precludes total "equality" of all people, because total "equality" is
only possible within oneself, however balance and equilibrium within self-determinative principles should be
encouraged, as it exercises a form of causation which enables higher-order bonds to be formed between various
modes of cognition.
In Sanskrit, the proper spelling of the word swastika is svastika. Sanskrit has no 'w'. Literally, the word svastika is a statement
of affirmation, "It is!" "Life is good!" "There is value" "There is meaning!" Svastika is a term that affirms the positive values
of life. The word is made of su + as. "As" is the root of the copular verb "to be" of which the third person singular is, "asti," "it
is." Su is a prefix used in Sanskrit to intensify meaning in a positive way, thus su+asti means literally, "it really is!" When
combined, the 'u' changes into a 'v' thus giving the form svasti. The ending 'ka' makes this verbal form into a noun. This is the
linguistic morphology of the word, svastika.
http://www.sanskrit.org/www/Hindu%20Primer/swastika.html

Brainstorm Page 89
http://en.wikipedia.org/wiki/Koinonia
http://en.wikipedia.org/wiki/Telesis
http://www.pfo.org/illumint.htm
The ancient Egyptians used a simple circle with a dot in the center to represent their fiery deity Ra in hieroglyphics.
That same solar symbol is still used today to represent the sun in astronomy and astrology.
http://max-bro.net/2009/05/15/33-cool-facts-about-the-sun/

Brainstorm Page 90
Because Zoroastrianism claims to be the world's oldest revealed religion it is also, arguably, the world's first
proponent of ecology, through caring for the elements and the earth.

The Zoroastrian faith enjoins the caring of the physical world not merely to seek spiritual salvation. Human beings,
as the purposeful creation of God, are seen as the natural motivators or overseers of the Seven Creations. As the
only conscious creation, it is humanity's ultimate task to care for the universe.

The faith endorses the caring of Seven Creations (sky, water, earth, plant, animal, human and fire), as part of a
symbiotic relationship. Zoroastrianism sees the physical world as a natural matrix of Seven Creations in which life
and growth are inter-dependent if harmony and perfection is to be the final goal.

The sacredness of the creations demands a greater awareness on the part of Zoroastrians, for at the end of time
humanity must give to Ahura Mazda a world of purity, a world in its original perfect state. As an example of their
concern, it is a tradition that Zoroastrians never enter a river, to wash in it or pollute it in any way. Purity of nature in
their tradition is seen as the greatest good.
http://en.wikipedia.org/wiki/Zoroastrianism#Zoroastrianism_and_ecology

Zoroaster prophesied to the Persians that He would return in exactly 1000 years. He said, "In a thousand years
look to the western sky. You will see a great star; follow it and you will find me, there, in a manger." That is who the
three Magi were: they were Zoroastrian king-priests looking for the return of Zoroaster, and they found him in the
baby Jesus.
http://www.alaska.net/~peace/zoroaster.htm
Tradition has it that during the final 3000 years after Zoroaster three Sayoshants or Benefactors will be born at
1000-year intervals. The origins of this belief are unclear, but passages in the Gathas themselves do suggest that He
taught that after Him would come "the man who is better than a good man- the one who will teach us for the
physical existence and for that of the mind, the straight paths of salvation to the true things with which Ahura
Mazda dwells-- who is faithful and resembles You, O Mazda." But later legends spoke of this series of three saviors
who would each be born of virgins, miraculously impregnated with the seed of Zoroaster which has been preserved
in Lake Kasaoya.

With the appearance of the first Sayoshant, the sun will stand still for ten days. For three years the creations will
flourish and the wolf species will disappear. But then a bitter winter will set in and many will perish. When the
second Sayoshant comes the sun will stand still for twenty days and the creation will flourish for six years and
snakes will disappear. Humanity will become vegetarian and drink only water. But evil will arise once again in the
form of a demon which will devour men and beast, and spread pollution everywhere. But an ancient hero
Keresaspa will rise from the dead to rid the world of this evil being.

The final savior will bring the complete and final victory of good over evil. This time the sun will stand still for thirty
days. All disease and death will be overcome. The dead (both man and beast) will be raised and proceed to the
last judgement where everyone will see his good and evil deeds. The metal of the mountains will be melted,
leveling the mountains (symbolizing that which divides us) and making a river of metal flow upon the earth.
Everyone will be made to pass through this river, which will feel like warm milk to the righteous but will bring
anguish and pain on the wicked. It is from purifying properties associated with molten metals in Zoroastrianism that
the conception of hell as a place of burning torment was derived. But while the torment is real, its purpose is
purification not punishment. Thus purified, all will be given the gift of immortality, forever uniting body and soul.
Ahriman will be utterly destroyed, having been consumed by his own demon, rage. Note that within this system
there is no eternal hell. If Ahura Mazda is to win the final victory *all* of His creation must be ultimately redeemed.
The continued existence of hell would imply the continued existence of evil.

In this final three thousand years, lineal time has become somewhat of a spiral; there is a sense of what Widengren
termed "cyclic revelation." The messenger comes, things get better for a time, but until the coming of the last
Sayoshant they again start to decline. While these cycles appear similar to those common in other religious
traditions, note that there remains a directionality to all this. We are not spinning our wheels going nowhere. This
spiral conception of time and revelation will be echoed in Ismaeli Shi'ism as well as the Baha'i concept of
progressive revelation.
http://bahai-library.org/conferences/time.evil.html
However, if we are to be intellectually honest, we must admit that the linear Baha'i concept of God sending a "new
Manifestation" to "reveal" a "new Dispensation" with a "new Book" for the world roughly every 1000 years does not
take into account all the great diversity and complexity of historical religious development. This idea comes directly
from Islam and has a heavy Abrahamic bias. Baha'ism has attempted to embrace some non-Abrahamic religions

by fitting them into the Islamic paradigm of religious history. This is a nice theoretical attempt with good motivations
behind it, but in reality it is like trying to fit a square peg into a round hole.
http://www.bahai-faith.com/
"...as to the meaning of the passage in the 'Iqan' in which Baha'u'llah refers to the renewal of the 'City of God' once
in about a thousand years: this, as the word about implies, is simply an approximate date, and should not therefore
be taken literally."
http://bci.org/bahaistudies/courses/Iqan/renewal.htm
The roots of Christmas are from Iran. The Iranian Astrologers understood the cycles of the planets, and knew of the
famous delineation at 3BC, and Christ was seen as one of the Sayoshants.
In most ancient cultures, including Persia, the start of the solar year has been marked to celebrate the victory of
light over darkness and the renewal of the Sun.
As expected, many in the world still do not appreciate the Astrological connection of this ancient festival. And
Yalda becomes Yuletide, which is also known as the celebration of logs. The exact moment is known, just as for
Nowrooz or the Spring Equinox. But Iranians do not look at this festival that way, so much as at the dualistic nature
of light, as Sa'adi says below.

Because Shab-e Yalda is the longest and darkest night, it has come to symbolise many things in Persian poetry;
separation from a loved one, loneliness and waiting. After Shab-e Yalda a transformation takes place - the waiting
is over, light shines and goodness prevails.

' The sight of you each morning is a New Year


Any night of your departure is the eve of Yalda' (Sa'adi)

'With all my pains, there is still the hope of recovery


Like the eve of Yalda, there will finally be an end' (Sa'adi)
http://alimostofi8.blogspot.com/2007/12/happy-2007-winter-solstice-or-iranian.html#links
Not in His Image: Gnostic Vision, Sacred Ecology, and the Future of Belief:
http://books.google.com/books?id=NblF1xjoeLUC&lpg=PP1&pg=PA131#v=onepage&q=&f=false

"Already at the beginning of this history I hinted at the reasons which led my brother to select a
Persian as the incarnation of his ideal of the majestic philosopher. His reasons, however, for
choosing Zarathustra of all others to be his mouthpiece, he gives us in the following words: 'People
have never asked me, as they should have done, what the name Zarathustra precisely means in my
mouth, in the mouth of the first immoralist; for what distinguishes that philosopher from all others
in the past is the very fact that he was exactly the reverse of an immoralist. Zarathustra was the
first to see in the struggle between good and evil the essential wheel in the working of things. The
translation of morality into the metaphysical, as force, cause, end in itself, was his work. But the
very question suggests its own answer. Zarathustra created the most portentous error, morality,
consequently he should also be the first to perceive that error, not only because he has had longer
and greater experience of the subject than any other thinker - all history is the experimental
refutation of the theory of the so-called moral order of things: the more important point is
that Zarathustra was more truthful than any other thinker. In his teaching alone do we
meet with truthfulness upheld as the highest virtue i.e. the reverse of the cowardice of the
'idealist' who flees from reality. Zarathustra had more courage in his body than any other
thinker before or after him. To tell the truth and to aim straight: that is the first Persian
virtue. Am I understood? ... The overcoming of morality through itself - through truthfulness, the
overcoming of the moralist through his opposite - through me: that is what the name Zarathustra
means in my mouth."
http://www.heritageinstitute.com/zoroastrianism/westernauthors/nietzsche.htm

The title, Twilight of the Idols, or How One Philosophizes with a Hammer (Götzen-Dämmerung, oder Wie man mit dem
Hammer philosophiert, August-September 1888), word-plays upon Wagner's opera, The Twilight of the Gods (Die
Götterdämmerung). Nietzsche reiterates and elaborates some of the criticisms of Socrates, Plato, Kant and Christianity found
in earlier works, criticizes the then-contemporary German culture as being unsophisticated and too-full of beer, and shoots
some disapproving arrows at key French, British, and Italian cultural figures such as Rousseau, Hugo, Sand, Michelet, Zola,
Renan, Carlyle, Mill, Eliot, Darwin, and Dante. In contrast to all these alleged representatives of cultural decadence, Nietzsche
applauds Caesar, Napoleon, Goethe, Dostoevski, Thucydides and the Sophists as healthier and stronger types. The phrase “to
philosophize with a hammer” primarily signifies a way to test idols by tapping on them lightly; one “sounds them out” to
determine whether they are hollow, or intact, etc., as a physician would use a percussion hammer upon the abdomen as a
diagnostic instrument.
http://plato.stanford.edu/entries/nietzsche/
Some relativists claim that humans can understand and evaluate beliefs and behaviors only in terms of
their historical or cultural context. There are many forms of relativism which vary in their degree of controversy.[1] The term
often refers to truth relativism, which is the doctrine that there are no absolute truths, i.e., that truth is always relative to some
particular frame of reference, such as a language or a culture. Another widespread and contentious form is moral relativism.
Relativism can be contrasted [2] with:
• Universalism—the view that facts can be discovered objectively and that they thus apply universally in all situations, times and
places.
• Objectivism—the view that existence exists outside of consciousness and value is created through conscious evaluation of
reality versus one's nature.
• Intrinsicism—the view that cognitive, aesthetic and ethical values are independent of human thinking.
• Absolutism—the view that beauty, truth, etc, are timeless and unchanging qualities.
• Monism—the view that in any given area there can be no more than one correct opinion.
• Subjectivism—the view that any philosophical or moral question has an answer that is not falsifiable, and is therefore
subjective.
http://en.wikipedia.org/wiki/Relativism
Nietzsche's influence remains substantial within and beyond philosophy, notably
in existentialism and postmodernism. His style and radical questioning of the value and objectivity of truth have
resulted in much commentary and interpretation, mostly in the continental tradition, and to a lesser extent

in analytic philosophy. His key ideas include the interpretation of tragedy as an affirmation of life, an eternal
recurrence (which numerous commentators have re-interpreted), a rejection of Platonism and a repudiation of
both Christianity and egalitarianism (especially in the form of democracy and socialism).
http://en.wikipedia.org/wiki/Friedrich_Nietzsche
A hypothetical imperative compels action in a given circumstance: if I wish to quench my thirst, I must drink
something. A categorical imperative, on the other hand, denotes an absolute, unconditional requirement that
asserts its authority in all circumstances, both required and justified as an end in itself. It is best known in its first
formulation:
"Act only according to that maxim whereby you can at the same time will that it should become a universal law."[1]
Kant expressed extreme dissatisfaction with the popular moral philosophy of his day, believing that it could never
surpass the level of hypothetical imperatives: a utilitarian says that murder is wrong because it does not maximize
good for the greatest number of people, but this is irrelevant to someone who is concerned only with maximizing
the positive outcome for themselves. Consequently, Kant argued, hypothetical moral systems cannot persuade
moral action or be regarded as bases for moral judgments against others, because the imperatives on which they
are based rely too heavily on subjective considerations. He presented a deontological moral system, based on the
demands of the categorical imperative, as an alternative.
http://en.wikipedia.org/wiki/Categorical_imperative
Moral Realism and Teleosemantics:
http://www-personal.usyd.edu.au/~rjoyce/acrobat/joyce_moral.realism.teleosemantics.pdf

Algebraic Approach to Quantum Gravity: Relative Realism:

This is a model of reality that is adapted to the fundamental nature of knowledge, achieves the minimal goal of
explaining how we perceive it and also has at its core notions of consciousness or self-awareness and free will.
http://philsci-archive.pitt.edu/archive/00003345/01/qg1.pdf

The "Flower of Life" can be found in all major religions of the world.

It contains the patterns of creation as they emerged from the "Great Void". Everything is made from the Creator's
thought.

After the creation of the Seed of Life the same vortex's motion was continued, creating the next structure known as
the Egg of Life.
This structure forms the basis for music, as the distances between the spheres are identical to the distances
between the tones and the half tones in music. It is also identical to the cellular structure of the third embryonic
division (The first cell divides into two cells, then to four cells then to eight). Thus this same structure as it is further
developed, creates the human body and all of the energy systems including the ones used to create the Merkaba.
If we continue creating more and more spheres we will end up with the structure called the Flower of Life.

Flower of life made of intersecting spheres instead of circles.

Leonardo da Vinci studied the Flower of Life's form and its mathematical properties. He drew the Flower
of Life itself, as well as components therein, such as the Seed of Life. He drew geometric figures representing
shapes such as the platonic solids, a sphere, a torus, etc., and also used the golden ratio of phi in his artwork;

all of which may be derived from the Flower of Life design.
http://www.maybelogic.org/maybequarterly/13/1306FlowerOfLife.htm
Your individual world view, your philosophy of life, can be seen as your personal operating system. And if we look
to the core principles of any robust OS or program/application, the ones which are most evolved, and
possess the most potential for further evolutionary elegance, are those which are open source & dynamic in nature.
How can these same principles be applied when looking to ourselves, nature and the cosmos?

I propose this new Holy Trinity.

Developer (creator, that which creates)


Code (nature, that which is)
User (consciousness, that which experiences)

All 3 are aspects of the same source & being, expressing a 1:3 divine ratio

We are the playwright, the play & the player

The designer, experience & experiencer

and finally...

god, cosmos, & being

As we deepen our understanding of the code, mechanics & underlying principles of nature – through genomics,
micro-biology, physics, etc... It is revealed to us how nature operates at a certain level. These learnings could
benefit our role as the "New Holy Trinity", so we might evolve the divine code of nature.

Nature is the ultimate creative medium.

There is also our collective operating system, which we are co-creating at this very moment. It's a cosmic open
source movement like no other!

And at a personal operating system level, we are experiencing at greater depths, our understanding of the 'Mind at
Large', consciousness & Being. The workings, light & landscape within and without. And with this, we can create,
evolve, and weave a new consciousness unto the cosmos, and within ourselves.

/ mc

For an elegant, insightful talk on the subject of decoding nature, view 'Juan Enriquez: Decoding the future with
genomics'
http://spacecollective.org/recent/page59


Info on Mind, Fractals, Art, Science, Futurism & More:


http://www.miqel.com/miqel_data/art/fractal_math_paintings.html

http://spiritualdove.com/what_is_a_star_seed_.htm

http://www.r-evolucio.org/
Two circles or spheres intersect so that the center of each lies on the circumference of the other. This symbolic intersection forms a third
circle or sphere that equally shares the energies of the first 2 circles or spheres. This shape is the Vesica Piscis.
http://lightworkers.org/node/87524

http://www.internationaledwa.org/
Infomorphic Spheroidal Geometric Sets: IS-Multivarifolds

http://www.susqu.edu/brakke/evolver/examples/examples.htm

http://en.wikipedia.org/wiki/Introduction_to_the_Global_Positioning_System
Higher Eigenvalues and Isoperimetric Inequalities on Riemannian Manifolds and Graphs:
http://math.ucsd.edu/~fan/mypaps/gsob.pdf
Probability in Banach Spaces: Isoperimetry and Processes:
http://books.google.com/books?id=cyKYDfvxRjsC&source=gbs_navlinks_s
Layman's Guide to the Banach-Tarski Paradox:
The Banach-Tarski Paradox is well-known among mathematicians, particularly among set
theorists.1 The paradox states that it is possible to take a solid sphere (a "ball"), cut it up into
a finite number of pieces, rearrange them using only rotations and translations, and re-assemble
them into two identical copies of the original sphere. In other words, you've doubled the volume of
the original sphere.
"Impossible!" I hear you say. "That violates physical laws!" Well, that is what many mathematicians
said when they first heard this paradox. But I'd like to point out in this article why this may not be
as impossible as one might think at first.
1 It revolves around the decades-old debate of whether the Axiom of Choice should be admitted or

rejected. More on this in the epilogue.


http://www.kuro5hin.org/story/2003/5/23/134430/275

The Banach–Tarski paradox is a theorem in set theoretic geometry which states that a
solid ball in 3-dimensional space can be split into a finite number of non-overlapping pieces,
which can then be put back together in a different way to yield two identical copies of the original
ball. The reassembly process involves only moving the pieces around and rotating them,
without changing their shape. However, the pieces themselves are complicated: they are not
usual solids but infinite scatterings of points. A stronger form of the theorem implies that given
any two "reasonable" objects (such as a small ball and a huge ball), either one can be
reassembled into the other. This is often stated colloquially as "a pea can be chopped up and
reassembled into the Sun".
In a paper published in 1924, Stefan Banach and Alfred Tarski gave a construction of such a
"paradoxical decomposition", based on earlier work by Giuseppe Vitali concerning the unit
interval and on the paradoxical decompositions of the sphere by Felix Hausdorff, and discussed
a number of related questions concerning decompositions of subsets of Euclidean spaces in
various dimensions. They proved the following more general statement, the strong form of the
Banach–Tarski paradox:
http://en.wikipedia.org/wiki/Banach–Tarski_paradox
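The geometric construction itself cannot be computed, but the group-theoretic engine behind it can be illustrated. A small Python sketch (all names here are mine, not from the sources above) of the paradoxical decomposition of the free group F2 on two generators: the set S(a) of reduced words beginning with a, together with the left-translate a·S(a⁻¹), already covers the whole group even though the two pieces are disjoint; the same trick, applied to a free group of rotations acting on the sphere, is what drives the Banach–Tarski duplication.

from itertools import product

GENS = ['a', 'A', 'b', 'B']                 # 'A' stands for a^-1, 'B' for b^-1
INV = {'a': 'A', 'A': 'a', 'b': 'B', 'B': 'b'}

def reduce_word(word):
    # Cancel adjacent inverse pairs until the word is reduced.
    out = []
    for g in word:
        if out and out[-1] == INV[g]:
            out.pop()
        else:
            out.append(g)
    return ''.join(out)

def words(max_len):
    # All reduced words of length <= max_len (the empty word is the identity).
    ws = {''}
    for n in range(1, max_len + 1):
        for tup in product(GENS, repeat=n):
            w = reduce_word(''.join(tup))
            if len(w) == n:
                ws.add(w)
    return ws

F2 = words(6)
S = {g: {w for w in F2 if w.startswith(g)} for g in GENS}

# Restrict attention to words of length <= 5 so that left-translation by 'a'
# stays inside the finite sample.
window = {w for w in F2 if len(w) <= 5}
a_SA = {reduce_word('a' + w) for w in S['A']}     # a . S(a^-1)

print((S['a'] | a_SA) & window == window)         # True: the two pieces cover everything
print(S['a'].isdisjoint(a_SA))                    # True: and they do not overlap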

Program of the 73rd Statistical Mechanics Meeting:

Duality: From Ising Model to Strings. Ed Witten, IAS.


http://www.springerlink.com/content/b085381x81522848/
Reflections on the Fate of Spacetime:
String theory carries the seed of a basic change in our ideas about spacetime and in other fundamental notions of
physics (Edward Witten):
http://www.sns.ias.edu/~witten/papers/Reflections.pdf
"In other words, telesis is a kind of ―pre-spacetime‖ from which time and space, cognition and information, state-
transitional syntax and state, have not yet separately emerged. Once bound in a primitive infocognitive form that
drives emergence by generating ―relievable stress‖ between its generalized spatial and temporal components - i.e.,
between state and state-transition syntax – telesis continues to be refined into new infocognitive configurations, i.e.
new states and new arrangements of state-transition syntax, in order to relieve the stress between syntax and state
through telic recursion (which it can never fully do, owing to the contingencies inevitably resulting from independent
telic recursion on the parts of localized subsystems). As far as concerns the primitive telic-recursive infocognitive
MU form itself, it does not ―emerge‖ at all except intrinsically; it has no ―external‖ existence except as one of the
myriad possibilities that naturally exist in an unbounded realm of zero constraint.
Telic recursion occurs in two stages, primary and secondary (global and local). In the primary stage, universal
(distributed) laws are formed in juxtaposition with the initial distribution of matter and energy, while the secondary
stage consists of material and geometric state-transitions expressed in terms of the primary stage. That is, where
universal laws are syntactic and the initial mass-energy distribution is the initial state of spacetime, secondary
transitions are derived from the initial state by rules of syntax, including the laws of physics, plus telic recursion.
The primary stage is associated with the global telor, reality as a whole; the secondary stage, with internal telors
("agent-level" observer-participants). Because there is a sense in which primary and secondary telic recursion can
be regarded as "simultaneous", local telors can be said to constantly "create the universe" by channeling and
actualizing generalized utility within it."

Diagram 13: The above diagram illustrates the relationship of primary and secondary telic recursion, with the latter
"embedded in" or expressed in terms of the former. The large circles and arrows represent universal laws
(distributed syntax) engaged in telic feedback with the initial state of spacetime (initial mass-energy distribution),
while the small circles and arrows represent telic feedback between localized contingent aspects of syntax and
state via conspansion. The primary stage maximizes global generalized utility on an ad hoc basis as local telors
freely and independently maximize their local utility functions. The primary-stage counterparts of inner expansion
and requantization are called coinversion and incoversion. It is by virtue of telic recursion that the SCSPL universe
can be described as its own self-simulative, self-actualizative "quantum protocomputer".

Diagram 1: 1. Indeterminacy 2. External determinacy 3a. Self-determinacy 3b. Intrinsic self-determinacy (The
effectual aspect of the object or event has simply been moved inside the causal aspect, permitting the
internalization of the blue arrow of determinacy and making causality endomorphic.)
http://megafoundation.org/CTMU/Articles/Langan_CTMU_092902.pdf
By isocognitive is meant just this: what we think (say, do, feel, etc.) thinks us.
http://www.geocities.com/charistatwo/CCamateuratheismxmas.html
As an artificial sentience (A.S.), Pandora represents the latest in artificial intelligence and isocognitive technology
from the renowned Jupiter Holoprogramming Facility, drawing on long-term machine-reasoning subroutines based
on the Voyager EMH as well as enhanced reasoning-decision algorithms based on the former Commander Data's
positronic brain circuitry.
http://s14.invisionfree.com/StarshipIndependence/ar/t54.htm
That is, the play of forms both in society and nature is isocognitive with the play of human emotion and ontology;
we are inextricably engaged in the drama of seeing both nature and society (and especially, their architecture or
language) through the human component. Again, this is a very romantic film in the classical sense.
http://www.geocities.com/landonsealey/Film120Tango.htm

Closed Descriptive Manifolds, Open Containment Boundaries, and the IS-Multivarifold

HYJ: I propose the IS-Multivarifold rather than just M-Theory as our ever-temporary (new I-deas generate bumps
in S-curves: See Ray Kurzweil http://www.youtube.com/watch?v=9PWXrnsSrf0) theory of everything, at least in the
domain of knowledge representation and modeling, which incorporates notions from the Ising model, Isbell
Duality, Infocognitive Semantics, and the Isotelesis concept of parallel development and convergence of means
and ends. (it needs a new title because in addition to M-Theory, it combines my ideas with Chris Langan's, and
ISM-Theory would just have been sending the wrong message, even though "-ism": A suffix indicating an act, a
process, the result of an act or a process, a state; also, a characteristic (as a theory, doctrine, idiom, etc.), but the
purpose here is not to delve into scientism. ISO-Theory would have been closer, the open-ended cosmology of
explaining "Is" in terms of "is" has enough power to become a modeling system, or theory of theories, not a GUT
but a TOE). "IS" means different things to different people, for starters: information
supersymmetry, isocognitive superposition, isomorphic syntax, infomorphic semantics,
and invariant standard, perhaps a bit less formally, intelligent system, illuminated spirituality, instantiated
substance, the "Eye" or "I" of Wheeler's Universe as a self-excited circuit and "See" or "Sense", Integrated
Science, I-Duality (for Information-Time Duality) and S-Duality, the Is-Ought Problem {"is" (plural "ises"), anything
factual, empirical, actual, or spatiotemporal, in is-ought philosophy, resolved in terms of OTELES: The moral
philosopher Alasdair MacIntyre has attempted to show that because ethical language developed in the West in the
context of a recognition of a human telos--an end or goal--our inherited moral language, including terms such as
"good" and "bad," have functioned, and function, to evaluate the way in which certain behaviors facilitate the
achievement of the telos. In an evaluative capacity, therefore, "good" and "bad" carry moral weight without
committing a category error. Just as a pair of scissors that cannot easily cut through paper can legitimately be
called "bad" since it cannot discharge its function well, a doctor who cures a high percentage of his or her patients,
or a person who acts in accordance with his or her particular culture's or community's vision of the human telos,
can be called good, as can those things that help them facilitate their goal. It follows that if we want scissors to be
good, we ought not make them like the aforementioned pair; if doctors or people in general want to be good, they
ought to act in such ways, and make use of such things, that enable them to fulfill their telos.[8]}, and Isabelle: Isabelle is
a generic proof assistant. It allows mathematical formulas to be expressed in a formal language and provides tools for proving those formulas in
a logical calculus. The main application is the formalization of mathematical proofs and in particular formal verification, which includes proving
the correctness of computer hardware or software and proving properties of computer languages and
protocols. http://www.cl.cam.ac.uk/research/hvg/Isabelle/ Isar: Intelligible Semi-Automated Reasoning: http://isabelle.in.tum.de/Isar/, because the
ring is so much more beautiful. Of course, it regards the nature of what "IS" is, and "all" there is, which brings philosophy to the rolled-up
dimensions encompassing us beyond the Planck level. CML's theory also recognizes the holographic principle and duality, and the importance
of metaphysical autonomy, i.e. the existence of an "I", which Douglas Hofstadter has written on. If "I" is a taut open-ended string or tubular
membrane, "S" would be a super-recursive yet open-ended structure encompassing both the manifold itself, and the metaphase space it is
contained within (unbound/open telesis), perhaps making it "supertautological" in CML's terms. In terms of Isotelesis, "O" could suggest a
closed string between or attached to membranes; such "rings" are self-contained loops capable of self-resonating vibration. However, since the
emphasis is on modeling reality, "O" would stand for "ontological"; and since "Is" is the third-person singular of "to be" in English, and
"ontology" is the metaphysical study of the nature of "being" and existence, whether a being or individual, or existence at large, and since it
models reality in terms of a cognitive-theoretic system, all one needs is to represent the semantic duality of "I" as in "I Am", and "S" as in "My
Self", to represent a reality which introprojectively describes itself in terms of itself, similar to autology, "I am my Self". This operation essentially
recognizes that any TOE will have to be able to explain the role of self-referentiality in cognition-theory-universe mapping; the CTMU describes
this process in terms of automata with quantum aspects, which are essentially Chu Spaces, which act like network automata in IS-Multivarifolds.
In a sense the entire state-transition syntax is available to every adjoint-point in the conspansive syndiffeonetic ontological manifold, as in
holic/monic self-containment, or the entire structure of the universe being embedded-encoded throughout (CML's homogeneous syntactic
medium: "hology") is another means for automorphic holotelesis, or the distribution of the design phase over the actualization phase. Although
the comparison is not formalized, and there is no mathematical formalism for the CTMU, the SCSPL is rather similar to one of the distinguishing
principles of Chu Spaces: the duality of syntax-state ("state" is "content") or event-state.
http://en.wikipedia.org/wiki/Is

A dualizing object is an object a which can be regarded as being an object of two different categories:
http://ncatlab.org/nlab/show/dualizing+object
In mathematics, a *-autonomous category C is a symmetric monoidal closed category equipped with a dualizing
object ⊥.
Examples
A familiar example is given by matrix theory as finite-dimensional linear algebra, namely the category of finite-
dimensional vector spaces over any field k made monoidal with the usual tensor product of vector spaces. The
dualizing object is k, the one-dimensional vector space, and dualization corresponds to transposition. Although the
category of all vector spaces over k is not *-autonomous, suitable extensions to categories of topological vector
spaces can be made *-autonomous.
Various models of linear logic form *-autonomous categories, the earliest of which was Jean-Yves Girard's
category of coherence spaces.
The category of complete semilattices with morphisms preserving all joins but not necessarily meets is *-
autonomous with dualizer the chain of two elements. A degenerate example (all homsets of cardinality at most one)
is given by any Boolean algebra (as a partially ordered set) made monoidal using conjunction for the tensor product
and taking 0 as the dualizing object.
An example of a self-dual category that is not *-autonomous is finite linear orders and continuous functions, which
has * but is not autonomous: its dualizing object is the two-element chain but there is no tensor product.
The category of sets and their partial injections is self-dual because the converse of the latter is again a partial
injection.
The concept of *-autonomous category was introduced by Michael Barr in 1979 in a monograph with that title. Barr
defined the notion for the more general situation of V-categories, categories enriched in a symmetric monoidal or
autonomous category V. The definition above specializes Barr's definition to the case V = Set of ordinary
categories, those whose homobjects form sets (of morphisms). Barr's monograph includes an appendix by his
student Po-Hsiang Chu which develops the details of a construction due to Barr showing the existence of nontrivial
*-autonomous V-categories for all symmetric monoidal categories V with pullbacks, whose objects became known
a decade later as Chu spaces.
http://en.wikipedia.org/wiki/*-autonomous_category
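Since Chu spaces recur throughout these notes, here is a minimal Python sketch (my own toy encoding, not Pratt's or Barr's formalism) of a Chu space over K = {0, 1} as a points-by-states matrix: the dual simply swaps points with states and transposes the matrix (the categorical cousin of the transposition mentioned above for finite-dimensional vector spaces), and a Chu morphism is a pair of maps running in opposite directions satisfying one adjointness equation.

class Chu:
    # A Chu space (A, r, X): points A, states X, and a matrix r : A x X -> K.
    def __init__(self, points, states, r):
        self.points, self.states, self.r = points, states, r   # r[(a, x)] in K

    def dual(self):
        # Swap points and states, transpose the matrix.
        rT = {(x, a): v for (a, x), v in self.r.items()}
        return Chu(self.states, self.points, rT)

def is_chu_morphism(f, g, P, Q):
    # (f, g) : P -> Q with f : P.points -> Q.points and g : Q.states -> P.states,
    # subject to the adjointness condition  Q.r(f(a), y) == P.r(a, g(y)).
    return all(Q.r[(f[a], y)] == P.r[(a, g[y])]
               for a in P.points for y in Q.states)

# Two small Chu spaces over K = {0, 1}
P = Chu(['a1', 'a2'], ['x1', 'x2'],
        {('a1', 'x1'): 1, ('a1', 'x2'): 0, ('a2', 'x1'): 0, ('a2', 'x2'): 1})
Q = P.dual().dual()     # the double dual has the same matrix as P

identity = ({'a1': 'a1', 'a2': 'a2'}, {'x1': 'x1', 'x2': 'x2'})
print(is_chu_morphism(*identity, P, Q))   # True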
Properties
• Whenever X ×_Z Y exists, then so does Y ×_Z X, and there is an isomorphism X ×_Z Y ≅ Y ×_Z X.
• Monomorphisms are stable under pullback: if the arrow f above is monic, then so is the arrow p2. For example, in
the category of sets, if X is a subset of Z, then, for any g : Y → Z, the pullback X ×_Z Y is the inverse
image of X under g.
• Isomorphisms are also stable, and hence, for example, X ×_X Y ≅ Y for any map Y → X.
• Any category with fibre products (pullbacks) and products has equalizers.
http://en.wikipedia.org/wiki/Pullback_(category_theory)
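In the category of sets the pullback can be computed directly, which makes the second bullet above concrete. A short Python sketch (illustrative names only): X ×_Z Y is the set of pairs (x, y) with f(x) = g(y), and when f is the inclusion of a subset X ⊆ Z, projecting the pullback onto Y recovers the inverse image of X under g.

def pullback(X, Y, f, g):
    # X x_Z Y = {(x, y) : f(x) == g(y)}, with the two evident projections.
    return {(x, y) for x in X for y in Y if f(x) == g(y)}

Z = {0, 1, 2, 3}
X = {0, 2}                                  # a subset of Z; f is the inclusion map
Y = {'a', 'b', 'c'}
f = lambda x: x
g = {'a': 0, 'b': 1, 'c': 2}.__getitem__    # g : Y -> Z

P = pullback(X, Y, f, g)
print(P)                                    # {(0, 'a'), (2, 'c')}
print({y for (_, y) in P})                  # {'a', 'c'}: the inverse image of X under g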
Relational algebra, an offshoot of first-order logic (and of algebra of sets), deals with a set of finitary relations (see
also relation (database)) which is closed under certain operators. These operators operate on one or more
relations to yield a relation. Relational algebra is a part of computer science.
http://en.wikipedia.org/wiki/Relational_algebra
Isbell Duality for Modules:
http://www.tac.mta.ca/tac/volumes/22/17/22-17.pdf
Isbell Duality:
http://www.univie.ac.at/EMIS/journals/TAC/volumes/20/15/20-15.pdf
Isbell Envelope in nLab:
http://ncatlab.org/nlab/show/Isbell+envelope
New Theories of Everything: The Quest for Ultimate Explanation:
"It seemed to me a superlative thing ~ to know the explanation of everything, why it comes to be, why it perishes,
why it is." -Socrates
http://books.google.com/books?id=b_fN7CXBYC0C&source=gbs_navlinks_s

Rational Mechanics and Natural Mathematics:

Chu spaces have found applications in computer science, mathematics, and physics. They enjoy a useful
categorical duality analogous to that of lattice theory and projective geometry. As natural mathematics Chu spaces
borrow ideas from the natural sciences, particularly physics, while as rational mechanics they cast Hamiltonian
mechanics in terms of the interaction of body and mind.

This paper addresses the chief stumbling block for Descartes' 17th-century philosophy of mind-body dualism, how
can the fundamentally dissimilar mental and physical planes causally interact with each other? We apply Cartesian
logic to reject not only divine intervention, preordained synchronization, and the eventual mass retreat to monism,
but also an assumption Descartes himself somehow neglected to reject, that causal interaction within these planes
is an easier problem than between. We use Chu spaces and residuation to derive all causal interaction, both
between and within the two planes, from a uniform and algebraically rich theory of between-plane interaction alone.
Lifting the two-valued Boolean logic of binary relations to the complex-valued fuzzy logic of quantum mechanics
transforms residuation into a natural generalization of the inner product operation of a Hilbert space and
demonstrates that this account of causal interaction is of essentially the same form as the Heisenberg-Schrodinger
quantum-mechanical solution to analogous problems of causal interaction in physics.
http://boole.stanford.edu/pub/ratmech.pdf
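The residuation the abstract leans on can be made concrete for Boolean relations. A small Python sketch (my own illustration with assumed toy relations, not anything taken from the paper): the left residual R\S relates b to c exactly when every a that R relates to b is also related to c by S; replacing the "for every ... implies" by a sum of products over complex entries is the inner-product-like lifting the abstract alludes to (that pairing is my gloss, not a formula quoted from the paper).

def residual(R, S):
    # Left residual of Boolean relations given as 0/1 matrices sharing row index A:
    # (R\S)[b][c] = 1 iff for every a, R[a][b] implies S[a][c].
    A, B, C = len(R), len(R[0]), len(S[0])
    return [[int(all((not R[a][b]) or S[a][c] for a in range(A)))
             for c in range(C)]
            for b in range(B)]

R = [[1, 0],            # rows indexed by A, columns by B
     [1, 1],
     [0, 1]]
S = [[1, 1, 0],         # rows indexed by A, columns by C
     [1, 0, 0],
     [0, 0, 1]]
print(residual(R, S))   # [[1, 0, 0], [0, 0, 0]]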

John Archibald Wheeler, high priest of quantum mysteries, suspects that reality exists not because of physical
particles but rather because of the act of observing the universe. "Information may not be just what we learn about
the world," he says. "It may be what makes the world."
http://discovermagazine.com/2002/jun/featuniverse
John Wheeler and the "It from Bit":
http://suif.stanford.edu/~jeffop/WWW/wheeler.txt


Hence Langan's use of "Infocognition" or "Infocognitive Potential".


http://www.upscale.utoronto.ca/PVB/Harrison/BellsTheorem/BellsTheorem.html
http://www.ipod.org.uk/reality/reality_big_brother.asp

The Ising model, named after the physicist Ernst Ising, is a mathematical model in statistical mechanics. It has
since been used to model diverse phenomena in which bits of information, interacting in pairs, produce collective
effects.
http://en.wikipedia.org/wiki/Ising_model
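A minimal Metropolis Monte Carlo sketch of the 2D Ising model in Python (parameters chosen arbitrarily for illustration): spins ±1 on a periodic L×L lattice, each proposed flip accepted with probability min(1, exp(-ΔE/T)). Run below the critical temperature (about 2.269 for J = 1) the magnetization settles near ±1; run above it, it drifts toward 0, which is the "bits of information interacting in pairs to produce collective effects" mentioned above.

import random, math

L, J, T, STEPS = 16, 1.0, 1.5, 200_000
spins = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

def delta_E(i, j):
    # Energy change from flipping spin (i, j), with periodic boundaries.
    nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j] +
          spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
    return 2 * J * spins[i][j] * nb

for _ in range(STEPS):
    i, j = random.randrange(L), random.randrange(L)
    dE = delta_E(i, j)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        spins[i][j] *= -1

m = sum(sum(row) for row in spins) / L**2
print(f"mean magnetization at T={T}: {m:+.3f}")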

Isotelic (adj.): Referring to factors that produce, or tend to produce, the same (or equivalent resulting) effect:
http://books.google.com/books?id=MFlc5iN6ynAC&pg=PA381&dq=isotelic&lr=#v=onepage&q=isotelic&f=false

Maintained as a variable, time can be conventionally represented along a line in a diagram, simultaneously with
variables that denote distances. But other variables, such as temperature, pressure, statistical aggregates, all of
them as completely distinct from space-variables as is time, can be (and are) conventionally represented also
along lines in diagrams. However convenient such diagrammatic uses may be for mathematical investigation, they
do not constitute either time, temperature, or potential, or pressure, or any statistical aggregate, as a
dimension isotelic with the ordered dimensions of space.

Again, to cite the purely mathematical use of the diagrammatic practice of substituting the term dimension for the
term variable, it is a commonplace that the aggregate of all spheres in Euclidean triple space requires four
independent variables for its mathematical expression,(...)
http://books.google.com/books?id=nGw3AAAAIAAJ&pg=PR12&dq=isotelic+geometry&lr=#v=onepage&q=isotelic&f=false

Perhaps the "equivalence" in S-duality signifies "isotelesis"?

In theoretical physics, S-duality (also a strong-weak duality) is an equivalence of two quantum field
theories, string theories, or M-theory.
http://en.wikipedia.org/wiki/S-duality

Meta-Ethics and the Is-Ought problem:


http://en.wikipedia.org/wiki/Is-ought
Borsuk-Ulam Theorem:
http://en.wikipedia.org/wiki/Borsuk–Ulam_theorem

An Isovariant Borsuk-Ulam Theorem and Its Converse:


http://www.wakayama-u.ac.jp/~kawa/math/33th/nagasaki.pdf
Isovariant Borsuk-Ulam Results for Pseudofree Circle Actions and Their Converse:
http://cat.inist.fr/?aModele=afficheN&cpsidt=17562425
Isovariant maps from free Cn-manifolds to representation spheres:
http://linkinghub.elsevier.com/retrieve/pii/S016686410800028X



http://www.valdostamuseum.org/hamsmith/SegalConf2.html
http://www.ams.org/mathimagery/

Mind, Reality, Causality, Determinacy, and Evolution

Mind Equals Reality Principle (M=R) asserts that mind and reality are ultimately inseparable to the extent that
they share common rules of structure and processing.

Multiplex Unity Principle (MU) - The minimum and most general informational configuration of reality, defines the
relationship holding between unity and multiplicity, the universe and its variegated contents. Through its structure,
the universe and its contents are mutually inclusive, providing each other with a medium.

HYJ: In my version of ISOTELESIS, Multiplex Unity or "(M(U))" => IS for Information System or Infocognitive
Superposition, a state of Multifurcative Convergence, with (IS) for the incoverting local and {IS} for the
coinverting global vantage points, with Unbound Telesis as OTELES or Open Telesis with extended superposition,
which allows simultaneous binding elsewhere, but has free resources available to allow various other tasks,
functions, or semantic content to emerge in coordination, allowing Polytelic/Multiversal branching
to generate, this makes the full picture equal to: (IS)-OTELES-{IS}-OTELES-(IS) with "Interpretation" as
Epistemological Convergence, "Explanation" as Concurrent Ontological Model. According to the Isotelesis
interpretation of the CTMU, Chu Spaces play a fundamental role in how the SCSPL would generate a
conspansive ontological manifold via the dual-phases of design-actualization or wave-particle processes of inner-
expansion and re-quantization, with the extended-superposition principle (ESP) allowing for the "wave function
collapse" to "be determined by higher order teleological functions spanning the successive layers of spacetime". In
the Isotelesis version, higher order teleonomic functions are produced through gyroteleostasis, a process similar to
telic-recursion, except it maintains a basic gyroscopic orientation with respect to a network of self-coupling,
rotating gravitational systems and sub-systems through symmetry-breaking phase-transitions, and the virtually
created-annihilated flux of coherent particle-antiparticles which obey the conservation of energy {or in terms of
Isotelesis, the equivalence of the whole (a rough set with no clear upper boundary) everywhere throughout, for
all converse or entangled mappings}.
Telic Recursion - A fundamental process that tends to maximize a cosmic self-selection parameter, generalized
utility, over a set of possible syntax-state relationships in light of the self-configurative freedom of the universe. An
inherently "quantum" process that reflects the place of quantum theory in SCSPL, telic recursion is a "pre-
informational" form of recursion involving a combination of hology, telic feedback and recursive selection acting on
the informational potential of MU, a primal syndiffeonic form that is symmetric with respect to containment.
Teleologic Evolution (TE) is a process of alternating replication and selection through which
the universe "creates itself" along with the life it contains. This process, called telic recursion, is

Brainstorm Page 102


the universe "creates itself" along with the life it contains. This process, called telic recursion, is
neither random nor deterministic in the usual senses, but self-directed. Telic recursion occurs
on global and local levels respectively associated with the evolution of nature and the evolution
of life; the evolution of life thus mirrors that of the universe in which it occurs. TE improves on
traditional approaches to teleology by extending the concept of nature in a way eliminating any
need for "supernatural" intervention, and improves on neo-Darwinism by addressing the full
extent of nature and its causal dynamics.

In the past, teleology and evolution were considered mutually exclusory. This was at least
partially because they seem to rely on different models of causality. As usually understood,
teleology appears to require a looping kind of causality whereby ends are immanent everywhere
in nature, even at the origin (hence the causal loop). Evolution, on the other hand, seems to
require a combination of ordinary determinacy and indeterminacy in which the laws of nature
deterministically guide natural selection, while indeterminacy describes the "random" or
"chance" dimension of biological mutation.

In contrast, the phrase teleologic evolution expresses an equivalence between teleology and
evolution based on extended, refined concepts of nature and causality. This equivalence is
expressed in terms of a self-contained logic-based model of reality identifying theory, universe
and theory-universe correspondence, and depicting reality as a self-configuring system
requiring no external creator. Instead, reality and its self-creative principle are identified through
a contraction of the mapping which formerly connected the source and output of the teleology
function. In effect, the creative principle itself becomes the ultimate form of reality.

The self-configuration of reality involves an intrinsic mode of causality, self-determinacy, which


is logically distinct from conventional concepts of determinacy and indeterminacy but can
appear as either from a localized vantage. Determinacy and indeterminacy can thus be viewed
as "limiting cases" associated with at least two distinct levels of systemic self-determinacy,
global-distributed and local-nondistributed. The former level appears deterministic while the
latter, which accommodates creative input from multiple quasi-independent sources,
dynamically adjusts to changing conditions and thus appears to have an element of
"randomness".

According to this expanded view of causality, the Darwinian processes of replication and natural selection occur on
at least two mutually-facilitative levels associated with the evolution of the universe as a whole and the evolution of
organic life. In addition, human technological and sociopolitical modes of evolution may be distinguished, and
human intellectual evolution may be seen to occur on collective and individual levels. Because the TE model
provides logical grounds on which the universe may be seen to possess a generalized form of intelligence, all
levels of evolution are to this extent intelligently directed, catalyzed and integrated.
The Third Option:
Determinacy and indeterminacy…at first glance, there seems to be no middle ground. Events are either causally
connected or they are not, and if they are not, then the future would seem to be utterly independent of the past.
Either we use causality to connect the dots and draw a coherent picture of time, or we settle for a random
scattering of independent dots without spatial or temporal pattern and thus without meaning. At the risk of
understatement, the philosophical effects of this assumed dichotomy have been corrosive in the extreme. No
universe that exists or evolves strictly as a function of external determinacy, randomness or an alternation of the
two can offer much in the way of meaning. Where freedom and volition are irrelevant, so is much of human
experience and individuality.
But there is another possibility after all: self-determinacy. Self-determinacy is like a circuitous boundary separating
the poles of the above dichotomy…a reflexive and therefore closed boundary, the formation of which involves
neither preexisting laws nor external structure. Thus, it is the type of causal attribution suitable for a perfectly self-
contained system. Self-determinacy is a deep but subtle concept, owing largely to the fact that unlike either
determinacy or randomness, it is a source of bona fide meaning. Where a system determines its own composition,
properties and evolution independently of external laws or structures, it can determine its own meaning, and ensure
by its self-configuration that its inhabitants are crucially implicated therein.

Reality Theory is about the stage of attribution in which two predicates analogous to true and false,
namely real and unreal, are ascribed to various statements about the real universe. In this sense, it is closely
related to sentential logic. In particular, sentential logic has four main properties to be emulated by reality theory.
The first is absolute truth; as the formal definition of truth, it is true by definition. The other properties
are closure, comprehensiveness and consistency. i.e., logic is wholly based on, and defined strictly within the
bounds of, cognition and perception; it applies to everything that can be coherently perceived or conceived; and it
is by its very nature consistent, being designed in a way that precludes inconsistency.
The Principle of Linguistic Reducibility - Reality is a self-contained form of language. This is true for at least two
reasons. First, although it is in some respects material and concrete, reality conforms to the algebraic definition of a
language. That is, it incorporates (1) representations of (object-like) individuals, (space-like) relations and
attributes, and (time-like) functions and operations; (2) a set of "expressions" or perceptual states; and (3) a syntax
consisting of (a) logical and geometric rules of structure, and (b) an inductive-deductive generative grammar
identifiable with the laws of state transition. Second, because perception and cognition are languages, and reality is
cognitive and perceptual in nature, reality is a language as well.
SCSPL - According to the Reality Principle, the universe is self contained, and according to infocognitive monism, it
regresses to a realm of nil constraint (unbound telesis or UBT) from which it must refine itself. According to the
Telic Principle, which states that the universe must provide itself with the means to do this, it must make and
realize its own "choice to exist"; by reason of its absolute priority, this act of choice is identical to that which is
chosen, i.e. the universe itself, and thus reflexive. i.e., "existence is everywhere the choice to exist." Accordingly,
the universe must adopt a reflexive form in which it can "select itself" for self-defined existence, with the selection
function identical to that which is selected. This means that it must take a certain general or "initial" form, the MU
form, which contains all of the requisites for generating the contents of reality. Due to hology, whereby the self-
contained universe has nothing but itself of which to consist, this form is self-distributed.
By the Principle of Linguistic Reducibility, reality is a language. Because it is self-contained with respect to
processing as well as configuration, it is a Self-Configuring Self-Processing Language or SCSPL whose general
spatiotemporal structure is hologically replicated everywhere within it as self-transductive syntax. This reduces the
generative phase of reality, including physical cosmogony, to the generative grammar of SCSPL.
Unbound Telesis (UBT) - a primordial realm of infocognitive potential free of informational constraint. In CTMU
cosmogony, "nothingness" is informationally defined as zero constraint or pure freedom (unbound telesis or UBT),
and the apparent construction of the universe is explained as a self-restriction of this potential. In a realm of
unbound ontological potential, defining a constraint is not as simple as merely writing it down; because constraints
act restrictively on content, constraint and content must be defined simultaneously in a unified syntax-state
relationship.
Hology is a logical analogue of holography characterizing the most general relationship between reality and its
contents. It is a form of self-similarity whereby the overall structure of the universe is everywhere distributed within
it as accepting and transductive syntax, resulting in a homogeneous syntactic medium.
CTMU:
http://www.ctmu.net/

Teleologic Evolution:
http://www.teleologic.org/

Christopher M. Langan Posts at ISCID Forums:


http://www.iscid.org/ubbcgi/ultimatebb.cgi?ubb=recent_user_posts;u=00000264

Essentially, telic recursion allows meaningful higher-order expressions generated by telors to function as SCSPL
generative grammar (rules of production) within a conspansive manifold which "automatically" executes these rules
by virtue of its structure (see the PCID paper regarding extended superposition, a conspansive correlate of hology
which places the past and future in immediate contact).
http://exfructibus.blogspot.com/2009/03/heres-little-exchange-i-had-with.html

Unifying Dynamical Systems and Complex Networks Theories: A Proposal of "Generative Network Automata
(GNA)":
http://www.nd.edu/~netsci/TALKS/Sayama_CT.pdf
On the Reachability of a Version of graph-rewriting system:
http://portal.acm.org/citation.cfm?id=1541261
Application of Functional Network to Solving Classification Problems:
http://74.125.155.132/search?q=cache:XihjbZGINZIJ:www.waset.org/pwaset/v7/v7-77.pdf+functional+network&cd=5&hl=en&ct=clnk&gl=us
Functional Network Inference Algorithms:
(gene regulatory network, neural information flow network, ecological dependence network)
(Pair-wise algorithms, Equation-based algorithms, Network-based algorithms)
http://biology.st-andrews.ac.uk/vannesmithlab/functionalnetwork.html
The Singularity Institute for Artificial Intelligence: Advance Innovation, Advance Humanity

Our Phase I research divides into three categories:


• Theoretical Research
• Tools and Technologies
• System Design and Implementation (primarily Phase II)
http://singinst.org/research/researchareas

Conceptual Structures:
http://books.google.be/books?id=IiFaPhDSEJkC&hl=en&source=gbs_navlinks_s

Quantum Finite Automata:


http://en.wikipedia.org/wiki/Quantum_finite_automata

Ontology, Epistemology, Teleology: Triangulating Geographic Information Science:


http://www.springerlink.com/content/q787181555221513/

Types of Conceptual Scales:


http://www.ontologos.org/CKML/ConceptualScalingTypes.html

Rough Sets and Knowledge Technology:


http://logic.mimuw.edu.pl/publikacje/ChongqingRSKTtom.pdf

Cumulative Distribution Function:


http://en.wikipedia.org/wiki/Cumulative_distribution_function

"The concepts of teleomorphism, autopoiesis, self-organization, organized potentiality, originality, individuality, and
historicity do indeed ascribe human behavior to nature. Thus, in a recent book on the life sciences, Lovelock (1979)
states that our bodies are made up of cell cooperatives."
p. 34, http://www.boaventuradesousasantos.pt/media/pdfs/a_discourse_on_the_sciences-Review.PDF
Applications of biotransformations and biocatalysis to complexity generation in organic synthesis:
http://www.rsc.org/publishing/journals/CS/article.asp?doi=b901172m
Intersection Homology:
http://en.wikipedia.org/wiki/Intersection_homology
Concepts: Where Cognitive Science Went Wrong (Jerry A. Fodor):
http://books.google.com/books?id=7m2M0usqV-wC&source=gbs_navlinks_s
Equivariant Homology and Intersection Homology (Geometry of Pseudomanifolds):
http://www.math.lsu.edu/~cohen/courses/SPRING09/M7512/MacPherson.pdf
Complementary Pivoting on a Pseudomanifold Structure with Applications in the Decision Sciences:
http://www.springerlink.com/content/v77803738280x813/


Pseudo-Manifold Geometries with Applications:


http://adsabs.harvard.edu/abs/2006math.....10307M

Sheaves in Geometry and Logic (Saunders Mac Lane, Ieke Moerdijk):


http://books.google.be/books?id=SGwwDerbEowC&hl=en&source=gbs_navlinks_s

Ontology Alignment:
http://en.wikipedia.org/wiki/Ontology_alignment

Harvesting Wiki Consensus - Using Wikipedia Entries as Ontology Elements:


http://www.heppnetz.de/files/SemWiki2006-Harvesting%20Wiki%20Consensus-LNCS-final.pdf
Semantic Integration:
http://en.wikipedia.org/wiki/Semantic_integration

Making Sense of the Semantic Web:


http://docs.google.com/present/view?id=dwzv3r6_143qtq9cf9

Ontology for the Intelligence Community:


http://c4i.gmu.edu/OIC08/

Ontology-Based Automatic Data Structure Generation for Collaborative Networks:


http://www.springerlink.com/content/h405882876n15871/

Building Intelligent Systems in Open, Heterogeneous, Dynamic, Distributed Environments:


http://ebiquity.umbc.edu/

Information Flow: The Logic of Distributed Systems:


Information is a central topic in computer science, cognitive science, and philosophy. In spite of its importance in
the "information age," there is no consensus on what information is, what makes it possible, and what it means for
one medium to carry information about another. Drawing on ideas from mathematics, computer science, and
philosophy, this book addresses the structure of information and its place in society. The authors, observing that
information flow is possible only within a connected distribution system, provide a mathematically rigorous,
philosophically sound foundation for a science of information. They illustrate their theory by applying it to a wide
range of phenomena, from file transfer to DNA, from quantum mechanics to speech act theory. An important
interdisciplinary text, this book is ideal for graduate students and researchers in mathematics, computer science,
philosophy, linguistics, logic, and cognitive science.
http://books.google.com/books?id=Mawadg55eg4C&source=gbs_navlinks_s
Ontologies Are Us: A Unified Model of Social Networks and Semantics:
http://www.cs.vu.nl/~pmika/research/papers/ISWC-folksonomy.pdf

Professor Ian Horrocks: Information Systems, Knowledge Representation and Reasoning:


http://web.comlab.ox.ac.uk/people/Ian.Horrocks/

Ontologies and Databases: Myths and Challenges:


http://www.inf.unibz.it/~franconi/papers/franconi-vldb-08-slides.pdf

Ontologies: Principles, Methods, and Applications:


http://www.upv.es/sma/teoria/sma/onto/96-ker-intro-ontologies.pdf

Ontologies Come of Age:


http://www-ksl.stanford.edu/people/ dlm/papers/ontologies-come-of-age-mit-press-(with-citation).htm

A Conceptual Construction of Complexity Levels Theory in Spacetime Categorical Ontology: Non-Abelian Algebraic
Topology, Many-Valued Logics and Dynamic Systems (2007):
http://www.springerlink.com/content/j7t56r530140r88p/

Telesophy System:
http://www.canis.uiuc.edu/projects/telesophy/index.html

Norms and Dissensus: Why We Will Not See Normative Convergence and How to Live With It:
http://www.allacademic.com/meta/p_mla_apa_research_citation/0/7/4/1/4/p74141_index.html

Knowledge Ontologies:
http://books.google.com/books?id=g_IywnHJjlAC&lpg=PP1&pg=PA18#v=onepage&q=&f=false

Outcome Models, Program Logics, Strategy Maps, Ends-Means Diagrams, Results Chains, Theories of Change,
etc., Modeled in DoView Outcomes Software:
http://www.outcomesmodels.org/
Applied Categorical Structures:
http://www.springer.com/math/journal/10485?detailsPage=editorialB oard

Formal Semantics:
http://en.wikipedia.org/wiki/Formal_semantics

The Onomasiological Grammar from the Methodological Point of View:


http://www.islu.ru/danilenko/articles/grammar.htm

Quantum Holography and Neurocomputer Architectures:


http://www.springerlink.com/content/llkg2104884n266k/
Third-Generation Nuclear Magnetic Resonance: Diffusion Tensor Magnetic Resonance Tomography and Cerebral
White Matter Tractography:

http://www.dm.unibo.it/~GRUPPI/convegni/gala09/schempp.pdf
State Transition System:
http://en.wikipedia.org/wiki/State_transition_system
Concept Equations:
http://logcom.oxfordjournals.org/cgi/content/abstract/14/3/395

An Ontology-Based Information Retrieval Model:


http://www.acemedia.org/aceMedia/files/document/wp7/2005/ eswc05-uam.pdf

Ontology Construction and Concept Reuse with Formal Concept Analysis for Improved Web Document Retrieval:
http://iospress.metapress.com/content/187272723xul1571/

Rough Sets, Fuzzy Sets, Data Mining, and Granular Computing:


http://books.google.com/books?id=9YTyEGP6-FYC&source=gbs_navlinks_s

Foundations of Intelligent Systems:


http://books.google.com/books?id=EDxsSkzur1kC&source=gbs_navlinks_s

Formal Concept Analysis: Foundations and Applications:


http://books.google.com/books?id=nEh4D4e88NwC&source=gbs_navlinks_s

Knowledge Web Technology Road Map: "The Technology Roadmap of the Semantic Web":
http://knowledgeweb.semanticweb.org/o2i/menu/KWTR-whitepaper-43-final. pdf
Knowledge Discovery:
http://en.wikipedia.org/wiki/Knowledge_discovery

Concept Lattice-Based Knowledge Discovery:


http://www.ksl.stanford.edu/iccs2001/clkdd2001/

Sensor Ontology Integration for the Knowledge Management for Distributed-Tracking Program (KMDT):
http://www.stormingmedia.us/49/4903/A490364.html

Fuzzy Modeling:
http://documents.wolfram.com/applications/fuzzylogic/DemonstrationNotebooks/4.html

PR-OWL: A Framework for Probabilistic Ontologies:


http://mars.gmu.edu:8080/dspace/bitstream/1920/1734/1/C4I-06-02.pdf

Self-Organizing Semantic Topologies in P2P Data Integration Systems:


http://portal.acm.org/citation.cfm?id=1547405
Operational Semantics:
http://en.wikipedia.org/wiki/Operational_semantics
Profunctors, Open Maps and Bisimulation:
http://journals.cambridge.org/action/displayAbstract;jsessionid=D257CFA1AC94C39794960EDF406BF724.tomcat1?fromPage=online&aid=304562

Peer Selection in Peer-to-Peer Semantic Topologies:


http://cat.inist.fr/?aModele=afficheN&cpsidt=16335232

An Ontology for Organizational Functions: The Recursive Self-Maintenance Mechanism of the Enterprise:
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4031262
Internet Alchemy Blog:
http://iandavis.com/blog/
Etymon Blog:
http://etymon.blogspot.com/

Teleonetics Blog:
http://blog.teleonetics.com/

Knowledge Today Wiki:


http://knowledgetoday.org/wiki/index.php/Main_Page
"The most characteristic thing about reality or the world (in the sense of Wittgenstein) is its syndiffeonesis."
http://www.geocities.com/ilvangelo/logic.html

Chaos, Solitons & Fractals (unusual, high style, low quality, but sometimes interesting):
http://www.elsevier.com/wps/find/journaldescription.cws_home/967/description#description
Why Pigs Don't Have Wings (Jerry A. Fodor):
http://www.lrb.co.uk/v29/n20/fodo01_.html
3-D Structure of Human Genome: Fractal Globule Architecture Packs Two Meters of DNA Into Each Cell:
http://www.sciencedaily.com/releases/2009/10/091008142957.htm

As It Is, Inc.:
http://www.asitisinc.com/randd/
Comments

Hamid Javanbakht
Goguen Sets; Dialectica and Chu Constructions: Cousins?
This note compares two generic constructions used to produce
categorical models of linear logic, the Chu construction and the
Dialectica construction, in parallel.
http://www.emis.de/journals/TAC/volumes/17/7/17-07.pdf

http://en.wikipedia.org/wiki/Dialectica_space
http://en.wikipedia.org/wiki/Dialectica_interpretation

Categorical relationships between Goguen sets and "two-sided" categorical models of linear logic:
The relationships between "two-sided" categorical models of linear
logic and Goguen sets are investigated. In particular, we show that
only certain Goguen sets can be represented as Chu spaces, while it is
possible to represent any Goguen set as a Dialectica space. In
addition, we discuss the benefits of these representations.
http://dx.doi.org/10.1016/j.fss.2004.02.008
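As a concrete aside on the objects being compared here: a Chu space over K = {0, 1} is just a set of points, a set of states, and a 0/1 satisfaction matrix between them. The short Python sketch below is my own illustration, not taken from the cited papers (the class name ChuSpace and the example data are hypothetical); it shows the raw structure together with the usual extensionality and separability checks.

    # A Chu space over K = {0, 1}: points A, states X, and a matrix recording,
    # for each (point, state) pair, whether the point satisfies the state.
    class ChuSpace:
        def __init__(self, points, states, matrix):
            self.points = list(points)   # carrier A
            self.states = list(states)   # cocarrier X
            self.matrix = matrix         # dict: (point, state) -> 0 or 1

        def row(self, a):
            return tuple(self.matrix[(a, x)] for x in self.states)

        def column(self, x):
            return tuple(self.matrix[(a, x)] for a in self.points)

        def is_extensional(self):
            # no two distinct points have identical rows
            rows = [self.row(a) for a in self.points]
            return len(rows) == len(set(rows))

        def is_separable(self):
            # no two distinct states have identical columns
            cols = [self.column(x) for x in self.states]
            return len(cols) == len(set(cols))

    example = ChuSpace(
        points=["a", "b"],
        states=["x", "y"],
        matrix={("a", "x"): 1, ("a", "y"): 0,
                ("b", "x"): 0, ("b", "y"): 1},
    )
    print(example.is_extensional(), example.is_separable())  # True True

Roughly, a Chu-space morphism demands the adjointness equality r(a, g(y)) = s(f(a), y), while a Dialectica morphism only demands the inequality r(a, g(y)) <= s(f(a), y); that laxness is the informal reason, per the abstract above, why every Goguen set can be represented as a Dialectica space but only certain ones as Chu spaces.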

Goguen Categories:
In a wide variety of problems one has to treat uncertain or incomplete
information.
http://matwww.ee.tut.fi/~eturunen/Winter.pdf

On Fuzzification of Algebraic and Topological Structures:


http://www.eusflat.org/publications/proceedings/EUSFLAT_2007/papers/Solovjovs_Sergejs_(45).pdf

On a Coalgebraic Category of Variety-Based Topological Systems:


http://staff.science.uva.nl/~gfontain/tacl09-abstracts/tacl2009_submission_6.pdf
Last modified: 31/01/2010 16:35

Hamid Javanbakht
Heterotelic vs. Polytelic vs. Multitelic?
I think we can say that just about all the policies adopted in our modern industrial society fall into this category. All are
technological and institutional, and though some may seem superficially to serve the interests of individual people,
they are designed above all to serve those of the state and the corporations, without any regard whatsoever for their
invariably destructive effects on society, the natural world and the ecosphere as a whole.

The critical distinction between homeotelic and heterotelic behaviour, or between normal and abnormal behaviour, is
foreign to the paradigm of science. If behaviour is looked at reductionistically, there is no way in which its purposive
and "whole-maintaining" function can be established, and hence no way of distinguishing between behaviour that
serves to maintain the critical order of the ecosphere and that which, on the contrary, serves to disrupt it. Reductionist
science is thus above all an instrument of scientific obscurantism and mystification; among other things, it prevents
people from understanding the true nature of the conflict between their interests and those of their political and
industrial leaders.
http://www.edwardgoldsmith.com/page280.html
Last modified: 15/01/2010 06:03

Hamid Javanbakht
Homeotely
Homeotely As we have seen, natural systems, as differentiated parts of the Gaian hierarchy, share the common goal
of maintaining its critical order or stability, for only in this way can they maintain their own critical order and hence their
own stability. It is significant that there is no word in the English language that makes explicit the essential purposive
and whole-maintaining character of life processes, so I have had to coin a new word - 'homeotely', from the Greek
homeo (same) and telos (goal). The principle of homeotely must clearly apply to all natural systems. Thus von
Bertalanffy accentuates the "whole-maintaining character" of life processes at the level of the biological organism:
"The most convinced representative of an ateleological point of view must admit that actually an enormous
preponderance of vital processes and mechanisms have a whole-maintaining character; were this not so the organism
could not exist at all. But if this is so, then the establishment of the significance of the processes for the life of the
organism is a necessary branch of investigation." [29] He cites E. Ungerer as being so impressed by the "whole-
maintaining" function of life processes that he decided to replace the biological "consideration of purpose" with that of
"wholeness". [30] The same principle applies to a community and a society. At least some anthropologists of the
'functional' school saw cultural behaviour as ensuring the integrity and stability of social systems. For Radcliffe-Brown
the function of a behavioural trait is the contribution it makes "to the total activity of which it is part", while "the function
of a particular social usage is the contribution it makes to the total social life as a functioning unit of the total social
system". [31] It must be clear that the teleological nature of life processes only becomes apparent when one sees
them holistically in terms of their relationship with the spatio-temporal whole of which they are part. Mainstream
scientists, who insist on looking at them in isolation from the whole, continue to insist that they are random, goalless
and self-serving. This could not be better illustrated than by the preposterous writings of Professor Richard Dawkins at
Oxford University. The coordination of homeotelic processes is particularly impressive. Radcliffe-Brown saw the
essential "functional unity" of a society as "a condition in which all parts of the social system work together with a
sufficient degree of harmony or internal consistency, i.e. without producing persistent conflicts which could neither be
resolved nor regulated." He notes that this view of society is in direct conflict with the view that culture is no more than
a collection of "shreds and patches" for which there are "no discoverable significant social laws". [32] Without the
coordination required to prevent "persistent conflicts," life processes, however, could not conceivably achieve their
common goal of maintaining the critical order of the Gaian hierarchy. As I have already mentioned, living things
behave homeotelically towards the Gaian hierarchy because it is the only way of maintaining its integrity and stability
and hence their own integrity and stability. This is clear if one realizes that they are but the differentiated parts of such
systems in isolation from which they have no meaning, cannot survive or, in the case of a loosely integrated system,
can survive only imperfectly and precariously. As Eugene Odum writes, "because each level in the biosystem's
spectrum is integrated or interdependent with other levels, there can be no sharp lines or breaks in a functional sense,
not even between organism and population. The individual organism, for example, cannot survive for long without its
population, any more than the organ would be able to survive for long as a self-perpetuating unit without its organism."
[33] From another perspective, they must behave homeotelically to the hierarchy of larger systems of which they are
part, because the latter provides them with their 'field', i.e. the environment to which they have been adapted by their
evolution and upbringing and which, as Stephen Boyden points out, must best satisfy their most fundamental needs.
[34] For these reasons, one can go so far as to say that in a stable biosphere, behaviour that satisfies the
requirements of the whole must also be that which best satisfies the requirements of its differentiated (as opposed to
random) parts. I refer to this as "the principle of hierarchical mutualism". Of course, with the increasing social and
ecological disintegration that occurs under the impact of economic development or progress, behaviour ceases to be
homeotelic; it becomes misdirected, and though it may continue to serve, superficially at least, some of the interests of
the parts, it no longer serves those of the whole Gaian hierarchy. I refer to such behaviour as heterotelic (from the
Greek, hetero (different) and telos (goal)). I...
Last modified: 15/01/2010 06:00

Hamid Javanbakht
Related to Isotelesis
Objective Teleology
http://www.jstor.org/pss/2018389

Teleological theories of mental content try to explain the contents of mental representations by appealing to a
teleological notion of function. Take, for example, the thought that blossoms are forming. On a representational theory
of thought, this thought involves a representation of blossoms forming, and a theory of content aims among other
things to tell us why this representation has that content; it aims to say why it is a thought about blossoms forming
rather than about the sun shining or pigs flying or nothing at all. In general, a theory of content tries to say why a
mental representation counts as representing what it represents.

According to teleological theories of content, what a representation represents depends on the functions of the
systems that use or (it depends on the version) produce the representation. The relevant notion of function is said to
be the one that is used in biology and neurobiology in attributing functions to components of organisms (as in "the
function of the pineal gland is releasing melatonin" and "the function of brain area MT is processing information about
motion"). Proponents of teleological theories of content generally understand this notion to be the notion of what
something was selected for, either by ordinary natural selection or by some other natural process of selection.
...
Before closing this sub-section, we should also note that Millikan occasionally makes it clear that her theory is
intended as a version of an isomorphism theory. According to an isomorphism theory, representation is a matter of
preserving the similarities and differences among relations such that relations in the represented domain are mirrored
by relations in the representing domain. This is therefore a resemblance theory, but the relevant resemblances are
second-order or relational: there is no requirement that representations share properties other than abstract relational
properties with their representeds. This makes isomorphism theories more plausible than crude resemblance theories.
They also have some currency with cognitive scientists (e.g., see Gallistel 1990, and Palmer 1999, 77). Moreover, a
teleological version of an isomorphism theory has the resources to address at least some of the traditional objections
to isomorphism theories. For instance, one traditional objection is that resemblance is too cheap: i.e., the objection is
that everything maps on to everything by some rule of mapping or other (Goodman 1976). However, as mentioned in
the introduction to section 3, a teleological theory can restrict the relevant mappings to those that the system was
designed to exploit. This part of Millikan's theory is not much discussed in this entry, however, because it is not very
much developed.

To a large extent, Millikan's theory has been responsible for the great interest, both positive and negative, that
philosophers have shown in assessing this general class of theories. While this section summarizes her core idea,
readers will need to turn to her extensive writings for the full version. A similar warning applies to all the theories
summarized here, but since Millikan's writings on the topic are very extensive and are sometimes hard to integrate,
the warning is especially relevant in this case. For the best brief version of her theory, readers should probably turn to
her "Biosemantics" (Millikan 1989b, reprinted in Millikan 1993).
http://plato.stanford.edu/entries/content-teleological/

Monge has identified nine major elements of systems as defined by
systems theory, namely, that they are isomorphic, hierarchical,
interdependent, teleological, nonsummative, self-regulative,
equifinite, adaptable, and (in the case of open systems) interactive
with the environment. [Monge, 1977]
http://internet.eserver.org/Hardy-Usenet-System.txt
Last modified: 15/01/2010 05:34

Hamid Javanbakht
Semantics
In graph theory there are two concepts of "sameness" whereby graphs are judged to be "the same", namely, equality
and isomorphism. Of these isomorphism is the more fundamental, and the one with which we will be chiefly concerned
from now on. It's true that we may care to know, occasionally, whether or not two graphs are equal, but I introduced
the notion of equality mostly as a stepping stone to the more abstract notion of isomorphism.

An indication of the pervasive role isomorphism has in graph theory is the fact that isomorphism has virtually captured
the word "is". In most contexts graph theorists use "is" not to mean "is equal to", as you would naturally think, but to
mean "is isomorphic to".
http://books.google.com/books?id=8nYH5OYEW24C&pg=PA47&lpg=PA47&dq=isomorphic+semantics&source=bl&ots=wzCNTwIMA7&sig=Ns7fWVrCdpOQcx91rs4E20vGfv0&hl=en&ei=0UBQS9CvDJO0swPNmuyCCA&sa=X&oi=book_result&ct=result&resnum=10&ved=0CDsQ6AEwCQ#v=onepage&q=isomorphic%20semantics&f=false
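The distinction quoted above is easy to make concrete. The sketch below is my own illustration (not from the cited book): two labelled paths that are unequal as objects yet isomorphic, i.e. "the same" in the sense graph theorists usually intend.

    from itertools import permutations

    # A graph is given here as (set of vertices, set of undirected edges).
    def equal(g1, g2):
        # equality: literally the same vertex set and the same edge set
        return g1[0] == g2[0] and g1[1] == g2[1]

    def isomorphic(g1, g2):
        # brute force: try every relabelling of g1's vertices onto g2's
        (v1, e1), (v2, e2) = g1, g2
        if len(v1) != len(v2) or len(e1) != len(e2):
            return False
        v1, v2 = sorted(v1), sorted(v2)
        edges2 = {frozenset(e) for e in e2}
        for perm in permutations(v2):
            relabel = dict(zip(v1, perm))
            if {frozenset({relabel[a], relabel[b]}) for a, b in e1} == edges2:
                return True
        return False

    p1 = ({1, 2, 3}, {(1, 2), (2, 3)})                # the path 1-2-3
    p2 = ({"a", "b", "c"}, {("a", "b"), ("b", "c")})  # the path a-b-c
    print(equal(p1, p2))        # False: not the same labelled object
    print(isomorphic(p1, p2))   # True: "the same" graph up to relabelling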
Last modified: 15/01/2010 05:27

Hamid Javanbakht
Equifinality similar to Isotelesis
Equifinality is the principle that in open systems a given end state can be reached by many potential means. The term
is due to Ludwig von Bertalanffy, the founder of General Systems Theory. He prefers this term, in contrast to "goal", in
describing complex systems' similar or convergent behavior. It emphasizes that the same end state may be achieved
via many different paths or trajectories. In closed systems, a direct cause-and-effect relationship exists between the
initial condition and the final state of the system: When a computer's 'on' switch is pushed, the system powers up.
Open systems (such as biological and social systems), however, operate quite differently. The idea of equifinality
suggests that similar results may be achieved with different initial conditions and in many different ways.
http://en.wikipedia.org/wiki/Equifinality
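Equifinality is easy to see in a toy open-system model. The sketch below is my own illustration (not part of the cited entry): logistic growth reaches essentially the same end state, the carrying capacity K, from widely different initial conditions, even though the trajectories differ.

    # Logistic growth dx/dt = r*x*(1 - x/K), integrated with a simple Euler step.
    def simulate(x0, r=0.5, K=100.0, dt=0.1, steps=2000):
        x = x0
        for _ in range(steps):
            x += dt * r * x * (1.0 - x / K)
        return x

    for x0 in (1.0, 20.0, 250.0):
        print(f"start {x0:6.1f} -> end state {simulate(x0):.2f}")
    # All three runs end near K = 100: the same end state reached by different paths.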
Last modified: 14/01/2010 02:02

Hamid Javanbakht
Toward a Category Theory Design of Ontological Knowledge Bases
"I discuss (ontologies_and_ontological_knowledge_bases /
formal_methods_and_theories) duality and its category theory
extensions as a step toward a solution to Knowledge-Based Systems
Theory. In particular I focus on the example of the design of elements
of ontologies and ontological knowledge bases of next three electronic
courses: Foundations of Research Activities, Virtual Modeling of
Complex Systems and Introduction to String Theory."
http://arxiv.org/pdf/0906.1694
Last modified: 11/01/2010 00:03

Hamid Javanbakht
Autology
Autology: An Inductive System of Mental Science Whose Center is the Will, and Whose Completion is the Personality:
http://books.google.com/books?id=5NYuAAAAYAAJ&dq=autology&source=gbs_navlinks_s

http://en.wikipedia.org/wiki/Autological_word

"Heterology. Either (a) not true of, or applicable to, itself

Brainstorm Page 109


"Heterology. Either (a) not true of, or applicable to, itself
(contrary: autology); (b) not the same as itself (contrary: homology);
and/or (c) not true for and/or to itself (contrary: autonomy which, if
it includes de-alienation, entails true in itself also)."
Last modified: 11/01/2010 00:01

Hamid Javanbakht
Samson Abramsky
Interactions between Representation Theory, Quantum Field Theory, Category Theory, and Quantum Information
Theory:

Representing Physical Systems as Chu Spaces (Samson Abramsky):


http://cs.uttyler.edu/faculty/mahdavi/Conference%20abstracts.pdf

Here is the PowerPoint version of Abramsky's paper:


http://www.math.tulane.edu/~mwm/wip/slides/samson.pdf

Full Paper:
http://www.comlab.ox.ac.uk/files/2372/RR-09-08.pdf

Samson D. Abramsky FRS, FRSE is a computer scientist who currently holds the Christopher Strachey Professorship
at Oxford University Computing Laboratory. He is well known for playing a leading role in the development of game
semantics. He has made significant contributions to the areas of domain theory, the lazy lambda calculus, strictness
analysis, concurrency theory, interaction categories, and the geometry of interaction.
http://en.wikipedia.org/wiki/Samson_Abramsky
Last modified: 01/01/2010 14:25

Hamid Javanbakht
Chu Spaces and Quantum Games
On Game Formats and Chu Spaces:
It is argued that virtually all coalitional, strategic and extensive game formats as currently employed in the extant
game-theoretic literature may be presented in a natural way as discrete nonfull or even (under a suitable choice of
morphisms) as full subcategories of Chu (Poset 2).
http://ideas.repec.org/p/usi/wpaper/417.html

Quantum game theory is an extension of classical game theory to the quantum domain. It differs from classical game
theory in three primary ways:

Superposed initial states,
Quantum entanglement of initial states,
Superposition of strategies to be used on the initial states.
This theory is based on the physics of information much like quantum cryptography.
http://en.wikipedia.org/wiki/Quantum_game_theory
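The three ingredients listed above can be seen working together in the Eisert-Wilkens-Lewenstein (EWL) quantization of the Prisoner's Dilemma. The numpy sketch below is my own illustration under the standard EWL assumptions (maximal entanglement, the usual payoff table); it is not taken from the cited article. Mutual defection still pays about (1, 1), while the additional "quantum" strategy Q played against itself pays about (3, 3), escaping the classical dilemma.

    # Minimal sketch of the EWL quantum Prisoner's Dilemma (gamma = pi/2).
    import numpy as np

    I2 = np.eye(2)
    C = I2                                   # "cooperate"
    D = np.array([[0, 1], [-1, 0]])          # "defect"
    Q = np.array([[1j, 0], [0, -1j]])        # the additional "quantum" strategy

    gamma = np.pi / 2                        # maximal entanglement
    J = np.cos(gamma / 2) * np.eye(4) + 1j * np.sin(gamma / 2) * np.kron(D, D)

    def payoffs(UA, UB):
        psi0 = np.zeros(4, dtype=complex)
        psi0[0] = 1.0                        # start in |00>, i.e. both "cooperate"
        psi = J.conj().T @ np.kron(UA, UB) @ J @ psi0
        p = np.abs(psi) ** 2                 # probabilities of outcomes CC, CD, DC, DD
        pay_A = np.array([3, 0, 5, 1])       # standard Prisoner's Dilemma payoffs, Alice
        pay_B = np.array([3, 5, 0, 1])       # ... and Bob
        return float(p @ pay_A), float(p @ pay_B)

    print(payoffs(C, C))   # mutual cooperation: approximately (3, 3)
    print(payoffs(D, D))   # mutual defection: approximately (1, 1)
    print(payoffs(Q, Q))   # Q against Q: approximately (3, 3)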

Quantum pseudo-telepathy is a phenomenon in quantum game theory resulting in anomalously high success rates in
coordination games between separated players. These high success rates would seem to require communication
between the players; however, the game is set up such that during the game, this is physically impossible. Quantum
pseudo-telepathy relies on the fact that the quantum laws of physics are subtly non-local and allow violations of the
Bell inequalities. This means that for quantum pseudo-telepathy to occur, prior to the game the participants need to
share a physical system in an entangled quantum state, and during the game have to execute measurements on this
entangled state as part of their game strategy. Games in which the application of such a quantum strategy leads to
pseudo-telepathy are also referred to as quantum non-locality games.
http://en.m.wikipedia.org/wiki/Quantum_pseudo-telepathy
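Pseudo-telepathy games proper are won with certainty, but the simplest way to see an entangled strategy beating every classical one is the closely related CHSH game. The sketch below is my own illustration (not from the cited article): with the standard measurement angles on a shared Bell pair, the players win with probability cos^2(pi/8), roughly 0.854, above the classical bound of 0.75.

    # Minimal sketch: the CHSH nonlocal game with the standard entangled strategy.
    import numpy as np

    phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

    def projector(theta, outcome):
        # projector onto outcome 0 or 1 of a measurement basis rotated by theta
        v0 = np.array([np.cos(theta), np.sin(theta)])
        v1 = np.array([-np.sin(theta), np.cos(theta)])
        v = v0 if outcome == 0 else v1
        return np.outer(v, v)

    angles_A = {0: 0.0, 1: np.pi / 4}          # Alice's angle for input x
    angles_B = {0: np.pi / 8, 1: -np.pi / 8}   # Bob's angle for input y

    total = 0.0
    for x in (0, 1):
        for y in (0, 1):
            for a in (0, 1):
                for b in (0, 1):
                    if (a ^ b) != (x & y):
                        continue               # losing outcome for this input pair
                    M = np.kron(projector(angles_A[x], a), projector(angles_B[y], b))
                    total += np.real(phi.conj() @ M @ phi)
    print(total / 4)   # ~0.8536 = cos^2(pi/8), beating the classical limit of 0.75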

Pasted from <http://knol.google.com/k/conspansive-manifold-complexity-generation-generative-automata-ontological>
