
The absent signal

What Technical Analysis is about is revealing the underlying signal nested in the time series. In a
narrow sense (for a specific kind of time series), TA is a form of signal processing. There is, however, a
fundamental difference from classical signal processing: when we process a signal, we know beforehand that
the signal exists, and our goal is to exhibit it free of noise, to clean it so as to enjoy its
meaningful content thoroughly, and possibly to alter it. A contradiction then becomes apparent in TA, since we
assume the existence of such a signal while all the signal processing we perform seems to reveal its absence.

This obviously applies to the fractals-derived tools we may use to analyse the time series, for what
this analysis is telling us is that the time series keeps evading its nature as a fractal: the Hurst exponent
(and therefore the fractal dimension) keeps changing. These perpetual changes mean that the time series
cannot be identified as a fractal, at least not via the methodology we are employing; we have basically failed to
identify (or even to detect) the signal we intended to study. Nevertheless, we ignore this conclusion and go
on with our assumption of an existing signal in order to make a decision to enter, modify or exit a trade.
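This perpetual drift of the estimate is easy to observe for oneself. As a minimal sketch (my own illustration, not any specific TA package), here is a rescaled-range (R/S) estimator of the Hurst exponent; applied to successive windows of a return series, it typically returns a value that keeps changing rather than settling:

```python
import numpy as np

def hurst_rs(x):
    """Estimate the Hurst exponent of a series by the rescaled-range (R/S) method."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_s, log_rs = [], []
    for s in (8, 16, 32, 64, 128, 256):
        if s > n // 2:
            break
        rs_vals = []
        for start in range(0, n - s + 1, s):
            chunk = x[start:start + s]
            z = np.cumsum(chunk - chunk.mean())   # cumulative deviation from the mean
            r = z.max() - z.min()                 # range of the cumulative deviation
            sd = chunk.std()
            if sd > 0:
                rs_vals.append(r / sd)
        log_s.append(np.log(s))
        log_rs.append(np.log(np.mean(rs_vals)))
    # H is the slope of log(R/S) against log(window size)
    h, _ = np.polyfit(log_s, log_rs, 1)
    return h

rng = np.random.default_rng(0)
increments = rng.normal(size=2048)    # white noise: H should be near 0.5
h = hurst_rs(increments)
d = 2 - h                             # fractal dimension of the graph of the series
```

Run on rolling windows of log-returns, the estimated H, and with it the fractal dimension D = 2 − H, keeps being recomputed to different values, which is precisely the instability discussed above. The plain R/S estimator is also known to be biased on short samples, so the exact value matters less than its refusal to settle.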

That the time series is not a fractal is no surprise, since fractals, after all, are mathematical objects that are
not to be found in nature; they are just models that may be helpful to describe natural phenomena. Even the
most classical examples of fractals are not genuine fractals: the coast of Brittany, for instance, is clearly not a
fractal, since its fractalisation stops at best at a molecular level and does not go on infinitely as is the case for
a mathematical fractal. But even in this sense, a financial time series does not qualify as a phenomenon that
can usefully be modelled as a fractal, since its parameters keep changing.
So is there any sense in which the market could be said to be fractal, given that the time series does not seem
to be satisfyingly modelled by a fractal?

In The Blank Swan, Elie Ayache provides a very interesting element of an answer to this question, on page
295:

If it were known what the price of some traded asset would be in the next instant, and different from
the present price, it would immediately become the present price because buyers or sellers would
immediately want to hold the asset at that price. The price function catches up with its virtual before
it becomes actualized. Another, more regressive virtual is therefore needed.

The result of this doubly-composed breaking and differentiating is the infinitely broken line, whose
simplest instance is Brownian motion. It is essentially fractal. There is no scale at which the
differentiating can settle and the function look smooth. Let us call it the line of the market. From it,
the whole world of derivatives unfolds, which will give us, retro-actively, the point of the market.

The point of the market does not just stop at noting that the underlying must follow Brownian motion
(or some other, more complex and jagged trajectory), in order that its next movement may always be
unpredictable. The abyss of differentiation, opening at every point, must not concern the price of the
underlying alone but the price of any 'virtual' derivative that might be written, right there and right
then, in that pit (for there is only one pit), on that underlying. That means that all the potential
coefficients, not only of the 'initial' Brownian motion (or any other more complex process) but of all
the following processes that will complicate that Brownian motion, differentiating it even further,
should themselves never settle but perpetually differentiate. If there were a stage at which the
coefficients settled, the price of the corresponding derivative would become a deterministic function
of the preceding prices and would no longer admit a market, that is to say, an unsettlement, an
unpredictability and a differentiation of its own. [EA10, p.295]

From this extract, we can see that the fractal behaviour of the market is not to be identified with the time
series as a fractal; rather, the fractality applies to the existence of the market as a whole, including the
infinitely many virtual derivatives. At best, therefore, the time series would appear as a truncated version of
the 'fractal of the market'. What this fractality implies cannot be detected at the level of the time series. It may
however be possible to account for this fractality, not by considering the process
of price formation; rather, I believe its implications must be sought at a topological level.

Without presuming what a topological study, which I am not yet equipped mathematically to conduct, would
reveal, I may try to make a little more precise what I expect to find from it. For that, I would use an analogy (like all
analogies, this one is limited and should not be carried too far): it seems to me that the classical
approaches (TA or Mathematical Finance), by assuming the existence of a signal or a stochastic process,
assume that we are exploring an unknown land by following an already built road. We don't know where
the road is going, and we can only detect it by walking it, so we don't even see where it is leading us; we're
just happy to follow it. Unfortunately, the more we walk on the supposed road, the more we are told
that there is no road at all (this is the absence of signal or of process). A topological study should not tell us
anything about a road, but rather it may inform us of the topography of the land we are travelling through
and help us, in some limited way, to choose the direction in which we want to take the next step.
Posted by Jean-Philippe at 4:46 PM
Labels: fractals, Philosophy

Sunday, December 18, 2011


Current research

For a follow-up of my discussion with Elie Ayache about "The Blank Swan" and its consequences, those
interested can look up this forum's thread, from page 22 onwards.
This is the focus of my current research.
Posted by Jean-Philippe at 11:51 AM
Labels: p-adic numbers, Philosophy

Sunday, May 22, 2011


The Art of Speculation

The objective of this post is to establish speculation as an artistic activity. This obviously does not mean that
speculation can only be practiced as an art; just as one can paint, sculpt or write with no artistic ambition, it
is obvious that one can speculate with no artistic concern whatsoever. My point is rather to say that it is
possible to develop an artistic mastery in speculating just as it is possible to develop one in the other arts. In
particular, I wish to investigate how such a mastery is conducive to successful trading.
In order to do that, I shall first consider some traditional features of art and build some analogies with
speculation. That will hopefully open new perspectives for thinking about and practicing speculation.

I-CONSTRAINING THE ART

Let me start with a quote from Baudelaire's Salon of 1859:

Car il est évident que les rhétoriques et les prosodies ne sont pas des tyrannies inventées arbitrairement,
mais une collection de règles réclamées par l'organisation même de l'être spirituel. Et jamais les prosodies et
les rhétoriques n'ont empêché l'originalité de se produire distinctement. Le contraire, à savoir qu'elles ont
aidé l'éclosion de l'originalité, serait infiniment plus vrai. [Salon de 1859 - Le gouvernement de
l'imagination]

Which, in English, gives:

Since it is clear that rhetorics and prosodies are not arbitrarily invented tyrannies but a collection of rules
required by the very organization of the spiritual being. And never did the prosodies and rhetorics prevent
originality from being produced distinctly. On the contrary, to say that they nurtured the occurrence of originality
would be infinitely truer.

Such a remark was inspirational for Raymond Queneau and François Le Lionnais when they founded the
Oulipo in 1960. This movement may be superficially seen as a reaction against the trend of discarding
traditional rules such as versification in poetry, figuration in painting or common practices (tonality,
contrapuntal forms, and so on) in music. In this sense, it would closely relate to Baudelaire's point in reasserting
such rules on the ground that they coincide with "the very organization of the spiritual being". Such is not
the goal of the Oulipo, however; it is, in fact, more in line with the questioning of the traditional rules and
their replacement or improvement, as can be seen in the efforts of Schoenberg in music or in those of
Kandinsky in painting. In my understanding, the Oulipo's project is, first of all, to assert the necessity of
constraints, and secondly, to reflect upon the nature of these constraints.
According to this conception, Art is not to be freed from arbitrary constraints and led to develop from a pure,
constraint-free intuition, as some may have wrongly thought in the 20th century (with experiments such as
certain kinds of stream-of-consciousness or automatic writing); on the contrary, Art needs constraints for
intuition and imagination to be productive. The fundamental problem then becomes one of knowing which
constraints are relevant, or even whether this question is meaningful at all.
In practice, Oulipo artists have often been led astray from their original goals by surrealistic beliefs, but
these early objectives retain, in my view, all their relevance. The presence of mathematicians, such as
François Le Lionnais and Claude Berge, among the founders lends credence to the central concern with
structures that underlies the whole enterprise. Many Oulipo constraints therefore came to be inspired
directly by mathematics. The Oulipo's work may then seem akin to what TA is doing in
relation to speculation, provided that speculation is indeed an art.

II-THE BEAUTILESS ART

When considering art, we routinely turn our attention to beauty, as art is widely defined as the making of
beautiful things. However, there is no beauty to be found in speculation. The speculator does not produce
any masterpiece that can be looked at and admired from an aesthetic perspective; the only judge of the value of
a speculator's action is the profit or the loss he made, and this judgment is as unaesthetic as can be, since it
is based solely on immediate usefulness. Art, from the romantic period onwards, is not considered a means
to material gain, or else it becomes devoid of content and is abased to mere propaganda. Art is believed to be
an end in itself (which was translated by Duchamp as "Art for art's sake"), in that the search for beauty is
not the search for an external object in order to acquire or even unveil it in a mundane sense; it is more about
creating beauty.
Speculation cannot be said to create anything beautiful; it is merely useful for the speculator and for the system
which it brings into existence: the market. However, art may also be said to be useful, just not directly so. I
admit that art is an end in itself (though I contend that speculation is also an end in itself), but its creations have a
purpose and even a cognitive content; their beauty is enlightening, it says some truth, albeit not one as formed
and determinate as a true statement is normally considered to be.
I have written earlier that the market evolves faster than mundane reality, and it is in this difference of
speed that we must look for the reason for the absence of beauty in speculation. I believe that beauty simply
does not have the time to form in the marketplace; speculation can only display its utility, its efficiency. The
enlightenment of the speculative art is at best confined to the mind of the speculator, and even there it is
only present for a fraction of a second, and it leaves no trace whatsoever. The closest thing to the art of the
speculator is the performance of an amnesic improviser with no public.
In that, I think I can say that speculation is an art, or at least that it can be considered as an art in the way a
speculator wishes to approach this activity. The speculator can, and I believe must, be an artist, even though
he will never be recognized as one by any public.

III-THE FALSEHOOD OF TECHNICAL ANALYSIS


Baudelaire, again in the Salon de 1859, wrote this at the end of section 8 Le Paysage:

I would rather return to the dioramas, whose brutal and enormous magic has the power to impose on me a
useful illusion. I would rather go to the theater and feast my eyes on the scenery, in which I find my dearest
dreams artistically expressed and tragically concentrated. These things, because they are false, are infinitely
closer to the truth.
[translation from The Arcades Project, p.536 (Q4a,4), Walter Benjamin, First Harvard University Press,
2003]

Similarly to the dioramas, it is because TA is false that it may bring us closer to the truth of speculating,
which is itself a production of truth (and that is similar to the status of truth in art as well). Many Technical
Analysts are looking for low-lag tools, conceiving of no-lag tools as the ideal they should aim at, but
this is a mistake. The ideal TA tool is not one with no lag at all, since if such a tool existed there would
be no market in the first place; the ideal TA tool is one that has a lag adapted to the given speculator and
particularly to his relation to the conditions of the market (its speed). TA tools therefore are not there to
tell some truth about the market but rather, like the dioramas for Baudelaire, to artistically express and
tragically concentrate the reality of a relationship between the speculator and the market conditions (the
speculator's dream), which, when witnessed by the speculator, will allow him to be attuned to the market.
The point of the Oulipo's constraints is exactly that as well: it is to pull the mind of the artist from the
unconstrained immediacy of nothingness, from the passivity of contemplation, in order to force him to
reconquer this immediacy by displaying his creative power to overcome the constraints. Just as an Oulipo
constraint incites and challenges the artist to create in order to overcome it, TA incites and challenges the speculator
to speculate, also in order to be overcome.
That view, in some way, gives credit to the contrarian philosophy: a speculator speculates against TA.
Speculation, to qualify as the activity I expose here, must always be done at variance with the market and
with what the market is saying; speculation is the counter-proof of TA, and it is through this double negation
of the market (TA negating the market and speculation negating TA) that the speculator becomes the market
(see The Logic of Place: not-not-a = A).
Posted by Jean-Philippe at 4:17 PM
Labels: Art, Philosophy, Technical Analysis

Monday, April 18, 2011


The possibility of cognition

The most fundamental question raised by The Blank Swan may be that of the level of cognition of the
market an individual can acquire, and the usefulness of such cognition if it is possible at all. Such a
matter is obviously paramount to the validity of Technical Analysis. The untotalization of possibilities that Elie
Ayache shows with regard to financial markets seems to invalidate most of the current attempts at thinking
this market in explicit terms, as most, if not all, of these attempts are ultimately based on probability
computations (and therefore on unwarranted, even false, assumptions about the totalization of possible
states). This is indeed the case for Technical Analysis, though I believe that the fractal analysis I have
endeavored to develop in this blog provides for an untotalization by means of an implicit multifractal model,
in which the Hurst exponent keeps being recomputed (I am however starting to think this model still falls short of being
efficient from a theoretical point of view). In this post, I therefore intend to examine, from the standpoint of
such a critique of probability theory, whether some kind of cognition is still possible as to what the market is
going to be.

I-READING A BOOK BEFORE IT IS WRITTEN

The best way to read a book before it is written is to write it, and that is, to some extent, what Elie Ayache is
proposing that we do in "The Turning". There he shows how the market can be dealt with, not by predicting it through
computing probabilities artificially attached to possible states of the world, but rather by writing
contingency, i.e. writing contingent claims. However, the book of the market is not written by any single
individual (or even any single intentional entity), as is clearly said on page 43:

The place of the contingent claim is nobody's place in particular. It falls to no subject to assign a price
to the contingent claim or to reflect it in his mind.

Writing a contingent claim, therefore, does not quite amount to writing the book of the market. It does amount,
however, to protecting one's financial interest from the uncertainty of the market, from its contingency. In this
sense of one's direct financial interest being under the threat of contingency, the writing of contingent claims
indeed appears as the means to mediate contingency. The question which interests me, at the level of
Technical Analysis, is whether we can mediate contingency beyond this direct financial interest, and still do
that in a speculative manner (in the philosophical sense of the term 'speculative'); in other terms, can we
speculate (financially) speculatively?
As for reading the book of the market before it is written, that is obviously not possible, as such a thing would
clearly come down to writing it, and as such, it would make it redundant and therefore destroy it. If the book
of the market were to be written by one subject (or if its writing could be seen as the work of one
subject), it would immediately cease to be a market, as a market can only be a place of exchange, necessarily
supposing the presence of at least two independent subjects.
Nonetheless, speculative knowledge is not perfect knowledge of the phenomenon under inquiry; on the
contrary, speculative knowledge is precisely imperfect, partial, fragmentary, as such knowledge is rooted
in the necessity of contingency, which implies the knowledge that perfect knowledge is illusory (not in an
epistemological sense but in an ontological one).
As a consequence, we will not be able to read the book of the market before it is written; we will not be able
to predict it in a deterministic way, nor will we be able to predict it in a probabilistic way. What we could
endeavor to know, however, is the language of the market, and from knowing its grammar, we may be able to
infer something about the market and its dynamics, just as knowledge of a natural language allows us to
expect a verb after a subject (or the reverse, depending on the language we consider). Such knowledge
may not be enough to diminish the absolute contingency of the market, but it should be sufficient to provide
a basis for a speculative speculation or, as Nishida calls it, an action-like intuition.

II-ACTION-LIKE INTUITION (行為的直観, KOUITEKI CHOKKAN)

Robert Wilkinson presents the concept of Action-like Intuition, which he calls Action-Intuition, in the
following manner:

We must experience the world in order to act on it, and we learn to perceive the world better by acting on
it. Just as he [Nishida] insists that practical reason is more profound than the theoretical, so he insists
that our natural mode of being-in-the-world is not contemplative but active, an aspect of the constant
mutual interaction between individual and the world. The idea that experience is a passive reflection of
the world he regards as entirely false: "intuition, separated from action, is either merely an abstract idea,
or mere illusion" (Intelligibility and the Philosophy of Consciousness, p.208). Action-intuition, like any
other form of action in Nishida's late thought, is a mutual relation of forming and being-formed:
"Action-intuition means our forming of objects, while we are formed by the objects. Action-intuition
means the unity of the opposites of seeing and acting." (ibid, p.191)
[…], the philosophy of pure experience leads Nishida to take a view of concept formation diametrically
opposed to that to be found, for example, in the classic empiricists, according to whom concepts are
arrived at by some process of abstraction based on noting common elements in numerically distinct
perceptions. Concepts are not formed in this way in Nishida's view. We form concepts in the course of
action-intuition: "Conceiving something through action-intuition means: seeing it through formation,
comprehending it through poiesis." (ibid, p.210)
The basic thesis of the philosophy of pure experience is that the world is a construction from such pure
experience, and manifestly such construction has to have some method: action-intuition is the basic
formative operation by means of which this construction is carried out. […]. Cognition has to be
understood as a form of dynamic, reciprocal expression.
[Nishida and Western Philosophy, Robert Wilkinson (2009), pp.120-121]

While Nishida obviously considers these remarks to apply to the whole of reality, and while such a stance
may be argued against, I believe there is not much argument as to the relevance of his remarks when it
comes to the market. Cognition, in this domain, can only be understood as "a form of dynamic, reciprocal
expression", and concept formation, according to Nishida, can only occur within a poietic attitude, that is,
an active one, and not a detached, analytical one. This dimension is well established by Elie Ayache in The
Blank Swan with regard to the writing of contingent claims, and particularly with the logic of inverting
dynamic replication with the view of implying volatility. When it comes to Technical Analysis, what Nishida
is saying also has an interesting consequence, in that it tells us that, in order to grasp the market, we must
grasp the grasping itself. We therefore need a Technical Analysis tool that is essentially self-referential. There
is however a difficulty in understanding this sentence, which lies in the difference of velocity between the
processes in historical reality, which are the ones Nishida is treating, and the processes in the market, which
are the ones interesting us.
The remarkable characteristic of the market is its proximity to the virtual (wherein speed is infinite); a
consequence of this proximity is its very high speed, and its emancipation from causality. This high speed
also accounts for the absence of a subject-object distinction, because such a duality does not have the time to
accrete. We are therefore confined, within the market, to a relatively unfriendly environment when it comes
to scientific investigation (even a probabilistic one). In this context, self-reference itself becomes an ill-
defined notion, since we don't even know to which entity to apply such a self-reference. Of course, we may
say that the market is self-referential, in some sense, but since we don't know what the market is, since we
can't reduce it to a subject or an object, we have no direct way to comprehend such self-referentiality in
order to translate it into a cognition (be it a partial one) of how the market may evolve. This ambiguity is
enough to invalidate a TA tool that would simply be self-referential, since such a tool could only be efficient
if every market actor were to use this specific tool, which is obviously impossible. What we need is a tool
that is self-referential in the way the market (whatever it is) is self-referential; we therefore need a TA tool
that accounts for the very grammar the market is writing itself in.

III-THE GRAMMAR OF THE MARKET

What I call the grammar of the market, extending the analogy made by Elie Ayache between the market and
a book, calls for a little precision here. As said above, the velocity of the virtual is infinite (because the
virtual is not situated in time), and the market inherits some of this velocity more directly than history does; as
such it appears much faster than history and mundane life (this high speed also contaminates real history and
accelerates it in some way, as is particularly visible in recent times). Natural languages also happen in
history, and as such their grammars seem relatively constant to us; nonetheless, natural languages change,
and so do their grammars. We must therefore expect the grammar of the market to change faster than the pace
we are accustomed to with natural grammar.
In order to elucidate what we can know of this grammar (which can only amount to some structure of it, and
therefore to a meta-grammar), we must first look at the market globally, and that leads us to recognize that it
has fractal features. This, in itself, is already a very interesting finding, one from which I have tried to
develop some TA tools, but many unknowns remain, such as the adequate period for calculation, the real
meaning of the fractal dimension, the scope of the probabilistic model (Fractional Brownian Motion) used,
and so on; and the mathematics that sprang from fractal theory seems of relatively limited help in clarifying these
unknowns. The holistic approach of Fractal Theory only provides a very global view of the price dynamics,
and Mandelbrot himself even excluded its possible application either to investing or to trading; in his view,
Fractal Theory only served to invalidate probabilistic and statistical inference from the market.
However, to obtain a model that would be of greater use in building TA tools, we need to start
considering a reductionist approach at some level. Here again, I must insist, it would be absurd to expect
to obtain a precise account of the workings of the market; when I talk of reductionism, it must
be clear that I mean a very partial one, which will inevitably fall short of elucidating the processes of the
market. Reductionism may indeed not be the right word: what I intend to look at is something in
between holism and reductionism. Despite such reservations, I believe there may be something valuable to find
and to make explicit about the market, and that this something may lead to a deeper understanding of the whole
of reality.

IV- FRACTALS AND P-ADIC FIELDS

I said earlier that the fundamental properties I wish to look at are to be found at a topological level. One way
to study such properties is to find a space homeomorphic to the one we wish to investigate, and that is
simpler to manipulate.
When it comes to self-similar fractals, which are typically built by IFS (Iterated Function Systems), it is
known that we can find a map so as to assert the homeomorphism of some self-similar fractals with a
space of p-adic integers. Writing a p-adic integer through its digits a_i in {0, ..., p-1}, such a map sends

x = a_0 + a_1 p + a_2 p^2 + ...   to   φ(x) = Σ_{i≥0} c · a_i / b^(i+1)

for a suitable base b and digit scaling c. From this map, we can obtain the fractal dimension of the constructed self-similar set:

d = log p / log b

For b=3 and p=2 (with digit scaling c=2), we get:

d = log 2 / log 3 ≈ 0.6309

where this homeomorphism is actually mapping the ring of 2-adic integers onto the Cantor set.
Alain M. Robert provides a more detailed discussion of these maps in "A Course in p-adic Analysis" (pp. 8-17).
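As an illustrative sketch (my own, not taken from Robert's book), the digit map can be computed exactly with rationals: each truncated 2-adic integer, given by its binary digits a_0, a_1, ..., is sent to Σ_i 2a_i/3^(i+1), whose base-3 expansion uses only the digits 0 and 2, i.e. a point of the middle-thirds Cantor set:

```python
import math
from fractions import Fraction

def to_cantor(digits, b=3):
    """Send the 2-adic integer with binary digits a_0, a_1, ... (least
    significant first) to sum_i (2 * a_i) / b**(i+1), which lands in the
    middle-thirds Cantor set when b = 3."""
    return sum(Fraction(2 * a, b ** (i + 1)) for i, a in enumerate(digits))

# the truncated 2-adic integer 1 + 0*2 + 1*4 + 1*8
x = to_cantor([1, 0, 1, 1])   # 2/3 + 2/27 + 2/81 = 62/81

# Hausdorff dimension of the image set: log p / log b
dim = math.log(2) / math.log(3)
```

In base 3, 62/81 is 0.2022, confirming that only the digits 0 and 2 appear; and log 2 / log 3 ≈ 0.63 is the classical dimension of the Cantor set.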

Of course the fractals we wish to investigate in Finance are not as simple as those built by IFS, in particular,
the self-similarity is not strictly true. Nonetheless, I think such a direction may lead to some interesting
results. The ideal objective would be to establish a general procedure to find a map between a set of arbitrary
fractal dimension and a subset of the space of p-adic numbers. I believe such a question is still an open one,
and I am not sure of the advancement of research in this area (or even whether there are any), as I am just
starting to look at this question.
The fields of p-adic numbers also present another interesting feature when it comes to accounting for the
process of decision-making at an atomic level. The market is clearly the product of multiple decision-
making processes, each of which is, individually, rooted in a valuation of reality. While we are well
acquainted with the classical absolute value that leads to the intuitive definition of distance (a metric), p-adic
fields are equipped with an ultrametric that satisfies the strong triangle inequality.
Whereas a metric satisfies the following triangle inequality:

d(x, z) ≤ d(x, y) + d(y, z)

an ultrametric satisfies the following, stronger one:

d(x, z) ≤ max(d(x, y), d(y, z))

Such a feature leads to rather counter-intuitive results when we try to visualize them in geometric terms,
such as the following formula, known as "the strongest wins": whenever d(x, y) ≠ d(y, z), the inequality
becomes an equality, d(x, z) = max(d(x, y), d(y, z)).
However, if we think in terms of decision making, we will indeed tend to ignore minor parameters and base
our decision on the one parameter we consider the most relevant. In that, we seem to be closer to an
ultrametric mode of thinking.
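These properties are easy to check numerically. The following sketch (an illustration of the standard definitions, not a trading tool) implements the 2-adic absolute value on the rationals and verifies both the strong triangle inequality and the "strongest wins" rule:

```python
from fractions import Fraction

def vp(n, p):
    """p-adic valuation of a nonzero integer n: the exponent of p in n."""
    v = 0
    while n % p == 0:
        n //= p
        v += 1
    return v

def padic_abs(x, p):
    """p-adic absolute value |x|_p of a nonzero rational x."""
    x = Fraction(x)
    return Fraction(1, p) ** (vp(x.numerator, p) - vp(x.denominator, p))

def d(x, y, p):
    """p-adic distance between two rationals."""
    return padic_abs(x - y, p) if x != y else Fraction(0)

p = 2
# strong triangle inequality: d(x,z) <= max(d(x,y), d(y,z))
for (x, y, z) in [(0, 4, 6), (1, 3, 8), (5, 2, 12)]:
    assert d(x, z, p) <= max(d(x, y, p), d(y, z, p))

# "the strongest wins": when d(x,y) != d(y,z), equality holds
x, y, z = 0, 4, 6   # d(0,4) = |4|_2 = 1/4, d(4,6) = |2|_2 = 1/2, d(0,6) = |6|_2 = 1/2
assert d(x, y, p) != d(y, z, p)
assert d(x, z, p) == max(d(x, y, p), d(y, z, p))
```

Geometrically this means every "triangle" in a p-adic field is isosceles, with the two longest sides equal, which is exactly the counter-intuitive picture referred to above.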

These considerations are still far from exploitable intuitions, and I myself am not very sure whether they will
lead anywhere. Once again, I am only in the process of learning about this problematic, and anybody is
welcome to criticize or comment, either positively or negatively, on such ideas.

Posted by Jean-Philippe at 8:54 PM
Labels: fractals, p-adic numbers, Philosophy, Technical Analysis

Wednesday, March 16, 2011


Situation in Japan

I usually reside in Japan, in Hino, in the west of Tokyo. However, given the evolution of the
situation at the Fukushima nuclear plant, my wife and I decided to fly out of Japan yesterday, and we
are now in Singapore, staying with my wife's family.
It seems to me that the government and TEPCO have very little control over the evolution of the situation in
the reactors at Fukushima, and the risk of a major radioactive leak is therefore very real. For the last few
days, the public has been given contradictory information, often after the facts, and with very few details.
The reality of the risk and the seeming confusion of the authorities are what made us leave Japan, for a
period we hope will be as short as possible. The pressure from our overseas family and friends also became
very strong with every new problem at the nuclear plants.

Anyway, I advise all the people currently residing in the Tokyo area to consider leaving it for a while. There
are several options: leaving the country, for those who have places to stay outside of Japan; moving south, to
the Kansai region or even the Hiroshima area; or simply moving towards the mountains, into the narrowest
possible valleys, where a radioactive cloud is the least likely to find its way. For the coming two
days, Tokyo seems safe from contamination because of the wind direction; it may be the best time to move
out. Once the cloud is there, the only option will be to stay at home, sealed in as hermetically as possible.

All our thoughts are with the Japanese people, and we do hope we will be back in Japan in a few days, when
the situation at the nuclear plants is under control.
Posted by Jean-Philippe at 10:01 AM

Wednesday, March 9, 2011


The Logic of Place

In this post, I will attempt to present succinctly the Logic of Place as created and understood by Nishida
Kitaro, and then to draw some of its implications.

I- NISHIDA'S LOGIC OF BASHO (場所の論理, BASHO NO RONRI)

Nishidas Logic of Basho closely relates to his writing style and therefore to the Japanese language itself, as
is shown by Jacynthe Tremblay in this article ([1]).
From there, we may at once remark that an important feature of this logic is its encompassing nature: a
higher category (more universal) that is encompassing of lower one (more particular) is said to be its basho.
This is illustrated, at the epistemological level, by the following:
In the judgment "red is a color", the copula means at the objective level that the particular
is located in the universal and that the latter becomes the basho of the former.[1], p.257

To the point that basho become embedded in one another:

The fact that in the strict sense, the basho is located in a basho means simply the
consciousness.[1],p.262, note 6

This embedding-in-one-another may even be an embedding-in-itself, in the case of self-awareness:

Self-awareness means that the self sees itself in itself. Seeing without a seer means that the self as
noesis becomes the self as basho, that is to say, the basho itself.[1],p.268

It would be a mistake, however, to understand the basho in terms of Platonic forms, as is clearly stated in
note 12:

form and content are given simultaneously[1],p.262, note 12

The idea of simultaneity is particularly important here: it means that the basho and its content condition each
other in a non-causal manner, which relates to the Buddhist concept of dependent origination.
In fact, the logic of basho is a formalization of the classical Buddhist logic found in Nagarjuna and his
followers, an attempt to overcome its apparently contradictory features, such as the one expressed here in
another article by Jacynthe Tremblay ([2]) (click on the PDF link in the left column to access the full
article):

La logique paradoxale de Nishida présente le néant absolu comme la matrice logique de la
détermination mutuelle des couples opposés. Il permet la néantisation d'un terme en auto-identité
pour qu'apparaisse son contraire, et vice versa. Comprise à partir du point de vue du néant absolu,
l'auto-identité absolument contradictoire de Nishida n'est rien d'autre que la reprise philosophique
rigoureuse de la logique bouddhiste qui énonce que A=A; A=non-A; donc A=A. Le terme auto-identité
(jiko dōitsu) correspond à A=A; les mots absolument contradictoire (zettai mujun teki)
correspondent à A=non-A (ou A n'est pas A mais devient B), c'est-à-dire la néantisation de A par
l'intermédiaire du néant absolu.[2], p.70

Which, in English, roughly gives:

The paradoxical logic of Nishida presents absolute nothingness as the logical matrix of the reciprocal
determination of opposite couples. It allows the nihilation of one term into self-identity so that its opposite
may appear, and vice versa. Understood from the point of view of absolute nothingness, the absolutely
contradictory self-identity of Nishida is nothing more than the rigorous philosophical translation of the
Buddhist logic that states: A=A; A=not-A; therefore A=A. The term self-identity (jiko dōitsu) relates to A=A;
the words absolutely contradictory (zettai mujun teki) relate to A=not-A (where A is not A but becomes B),
that is, the nihilation of A by means of absolute nothingness.

I don't totally agree with the above remark: it is not the absolutely contradictory self-identity that is
the philosophical and rigorous translation of the Buddhist logic, but the whole idea of a basho logic, and this
is where we can clearly see the difference between this logic and classical circular Western logic. A bit
earlier, Jacynthe Tremblay writes the following:

Elle n'est pas basée simplement sur une négation, mais sur une négation de la négation, sur une
négation absolue qui n'est rien d'autre qu'une affirmation absolue.[2], p.68

It [Nagarjuna's Way of the Middle] is not simply based on a negation, but on a negation of the negation, on
an absolute negation that is nothing else than an absolute affirmation.

I think it would be a mistake to interpret this last sentence as being equivalent to the Western logical
proposition: not-not-A=A.
What Nagarjuna says here, and what Nishida tries to clarify with his logic of basho, is that A, via this process
of double negation, is altering itself; it is actually becoming its own basho by overcoming the duality that
conditions its very existence.
If we now denote by a an entity and by A the basho within which this entity is located, we should then write:
not-not-a=A
And possibly A=not-A if A is the basho of absolute nothingness, which obviously leads to the constitution of
the self-aware subject as an absolutely contradictory self-identity.

A last point in this short presentation of Nishida's logic is to notice its affinity with some of the points
Meillassoux makes about the law of non-contradiction and the necessity of contingency. The logic of
basho is ultimately a logic of becoming; its affirmation of absolute contradiction is really one that is meant
to refine the eliminative Hegelian logic, as is correctly pointed out by Robert E. Carter ([2], pp.69-70), in order
to preserve intact the dual tension between contradicting poles; such a move was also made, at about the
same time, by Mao in his interpretation of Marx, inspired then, perhaps subconsciously, by classical Chinese
philosophy.
When Meillassoux asserts the law of non-contradiction as a corollary to the necessity of contingency, he is
stating that becoming is only possible as the result of a tension between contradicting poles; as such he
wrote:

Affirmer qu'un existant peut ne plus exister, affirmer que cette possibilité, de surcroît, est quant à
elle une nécessité ontologique, c'est aussi bien affirmer que l'existence en général de l'existant, au
même titre que l'inexistence en général de l'inexistant sont les deux pôles indestructibles par lesquels
la destructibilité de toute chose peut être pensée.[MEI06], p.102

To affirm that an existent can stop existing, to affirm that such a possibility is, furthermore, an ontological
necessity, is also to affirm that the existence in general of the existent, as well as the inexistence in
general of the inexistent, are the two indestructible poles by which the destructibility of all things may be
thought.

II- NISHIDA'S BASHO AND LEVINAS' ILLEITY

In note 28 of the second article, Jacynthe Tremblay writes the following:

La relation je-tu (watashi to nanji) de Nishida se rapproche beaucoup de la relation Ich-Du (ware to
nanji) de Martin Buber. Nishida est en effet entré en contact avec la pensée de Buber sur cette
question par l'intermédiaire de la théologie dialectique de Gogarten, entre autres.[2], p.73, note 28

The relation I-thou (watashi to nanji) in Nishida closely resembles the relation Ich-Du (ware to
nanji) in Martin Buber. Nishida indeed came into contact with Buber's thought on that question by
means of the dialectical theology of Gogarten, among others.

This is correct to an extent; however, a better comparison would have been, in my view, with Levinas'
thought.
Whereas Buber insists on a mutuality of relation that will eventually elide the difference between the "I"
and "thou" ([3]), Levinas seems keener to preserve the "reality of the difference between the 'I' and
'thou'" ([3]). As such, Levinas is more in tune with the spirit of Nishida's logic, where nihilation is never to
be taken as a cancellation or an overcoming of the difference, but as a keeping intact of the tension of
relation that founds the subject as an absolutely contradictory self-identity (絶対矛盾的自己同一:
the aggregative nature of the Japanese language allows it to reflect the dynamics at play, which would be
destroyed by a reconciliation of opposites).
Buber, therefore, remains within a Hegelian framework (via Feuerbach), and by insisting on reciprocity,
he precludes the field, the basho, within which the dynamics of becoming is to take place.
However, Levinas significantly deviates from Nishida when he introduces a third party into the relation
between I and thou, and this third party is God, whose being-in-the-world is the Illeity:

Illeity lies outside the "thou" and the thematization of objects. A neologism formed with il (he) or ille,
it indicates a way of concerning me without entering into conjunction with me. To be sure, we have to
indicate the element in which this concerning occurs.[3]

At first sight, Illeity may seem a concept equivalent to Nishida's basho. If we, for a minute, overlook the
usage of the word "God" within the context of Judaism, we may assume that, indeed, Illeity just provides
a ground, a location, for the relation to take place, especially in view of the following remark:

God is the absent condition of the encounter with the other.[3]

There is, however, a fundamental difference between Levinas' Illeity and Nishida's basho, one that actually
reflects the difference between the Western monotheistic view of reality and the Eastern atheistic view of it.
The absence of God that structures his illeity is not the absolute nothingness of Nishida's ultimate basho.
Whereas the former posits a totalized world where God (even as the Absent), in his illeity, appears as the
fabric of relationality, "by which an order is signified to me" ([3], note 40), the latter posits the
untotalizable emptiness of absolute nothingness.
Thereby, to posit such a fabric is, for Levinas, to posit an existing and eternal connectedness between I and
thou; Nishida, on the other hand, clearly affirms an impassable chasm between I and thou, a discontinuity
that cannot be amended:

Le lieu où le je et le tu se situent et qui entraîne l'auto-négation de chacun d'entre eux, c'est-à-dire le
basho du néant absolu, est l'intermédiaire où s'unit ce qui ne s'unit absolument pas, l'intermédiaire où
le caractère absolu de chaque élément et l'aspect d'absolue confrontation s'unissent dynamiquement.
Pareille négation ouvre dans le basho du néant absolu un intervalle insondable qui est la condition
même de la subjectivité (shutaisei) du je et du tu. C'est là une manière de se joindre en se coupant,
c'est-à-dire en ne se touchant pas directement. Cela s'associe directement au fait que le soi est le soi
sans être le soi, c'est-à-dire comporte une interruption en lui-même. Nishida mentionne que le je est
le je par le fait de reconnaître la personnalité du tu, et le tu est le tu par le fait de reconnaître la
personnalité du je. Ce qui fait du tu le tu est le je, et ce qui fait du je le je est le tu. Le je et le tu étant
une discontinuité absolue, le je détermine le tu et le tu détermine le je.[2], p.74

The place where the I and the thou are located, and which triggers the self-negation of each of them, that
is, the basho of absolute nothingness, is the intermediary where what is absolutely not united gets united, the
intermediary where the absolute character of each element and the aspect of absolute confrontation get
unified dynamically. Such a negation opens up in the basho of absolute nothingness an inscrutable interval
that is the very condition of the subjectivity (shutaisei) of the I and the thou. This is a way of being united
while being cut off, that is, without being in direct contact with each other. That is directly linked to the fact
that oneself is oneself without being oneself, that there is a gap within oneself. Nishida mentions that the I is
the I by the fact of recognizing the personality of the thou, and the thou is the thou by the fact of recognizing
the personality of the I. That which makes the thou the thou is the I, and that which makes the I the I is the
thou. The I and the thou being an absolute discontinuity, the I determines the thou, and the thou determines
the I.

In conclusion, whereas Levinas' Illeity and Nishida's basho do seem to fulfill a similar formal role in the
constitution of relationality, Levinas' Illeity defines a continuum, an ether within which the meeting is
meant to take place and to successfully be the foundation of ethics; Nishida's basho of absolute nothingness,
on the other hand, defines an absolute disconnectedness, a discontinuum, an empty place within which the
meeting can never settle into anything but a tense and endless dynamics.
Not that Nishida's view, or the way I understand it, cannot be used to ground an ethics, but not as directly as
the way Levinas proposes, nor as Watsuji does ([2], pp.74-79); I shall not, however, discuss this ethical
dimension here.

III- THE BASHO OF THE MARKET


On page 442 of [EA10], Ayache writes:

Matter is said to determine the geometry of space, even to preside over the genesis of space, in
Einstein's general theory of relativity. Now what I am saying is that contingency (the only true and
original matter there really is in a materialistic ontology like Meillassoux's) determines place.

As we have indeed seen above, the logic of basho is really a logic of becoming, and, as such, it is easy to see
that the basho of absolute nothingness is a corollary of the necessity of contingency. It can also be noticed
that price is the result of a meeting of supply and demand, and that this meeting takes place in a market
which is otherwise empty.

It must then be clear that, if we are to follow Nishida's logic of basho and apply it to the dynamics of price,
this dynamics is infinite, discontinuous and untotalizable. Nothing new there, really: the discontinuity of
price time series is an obvious thing, even though continuity is often preferred for modelling, for the sake of
simplicity in getting mathematical results. The infinity goes without saying, and the untotalizability is
already well argued for in [EA10].
However, what the comparison between Nishida and Levinas is telling us is that disconnectedness (I will
stick to this word from now on as, I believe, it is more precise than discontinuity; to justify this choice
would however take me too far here) is due, not to the process of price determination, but to the basho; in
other terms, the disconnectedness of price is a topological property of the market-place and not a feature of
price time series.
This is a very important result, since it invalidates the distinction between discrete and continuous models:
such models only address the price process and do not account for the topology of the market-place, which
is properly the determining factor in this regard.

Once again, such a conclusion is only valid insofar as my understanding of Nishida's point is valuable (i.e.
productive); anyway, this is the direction of my reflection until I know better.

Notes:
[1]: Nishida Kitarō's Language and Structure of Thought in the Logic of Basho - Jacynthe Tremblay
[2]: Néantisation et relationalité chez NISHIDA Kitarō et WATSUJI Tetsurō - Jacynthe Tremblay (click on
the PDF link in the left column to access the full article)
[3]: Levinas and Buber: Transcendence and Society - Damien Casey
Posted by Jean-Philippe at 9:48 PM
Labels: Philosophy

Thursday, February 17, 2011


Guilt and Shame, and their necessities

Shame and guilt are very well-studied concepts, and it is very well known that they make up (along with
other feelings) the very core of the social psyche. It is also well established that some societies give priority
to one over the other in their structures. As such, Japan is widely believed to be the archetype of a shame
society, in that it is contrasted with occidental (Judeo-Christian) civilization, which, in turn, is taken as the
archetype of a guilt society. This approach is, for instance, the one proposed by Takeo Doi in his seminal
work on Japanese psychology: 甘えの構造 (Amae no kōzō, The Anatomy of Dependence).
Obviously, nobody seriously asserts that shame is unknown in the West or guilt unknown in Japan (or in
other Asian countries). It is clear that any non-pathological individual experiences, to various degrees, some
amount of shame and guilt; what is interesting is to consider which feeling comes to dominate within the
psychological structure of a given society, and, when one dominates, whether this domination entails
consequences in the intellectual outlook of that society, for example.
It is to such consequences that Elie Ayache seems to allude when he writes, on page 202 of The Blank
Swan:

Necessity is a later, almost guilty, stage; it is a return, a turning-back to the absolute, a form of
regret.

In this post, I intend to discuss this sentence in some detail, in order, first, to specify the connections
between guilt and necessity, and afterwards to go a little beyond, exploring how shame may enter the
picture.

I shall start by quickly considering the word "almost" in its mathematical sense, as in "almost everywhere";
that is, I read the sentence as saying that necessity is almost nowhere non-guilty, that it is non-guilty only on
a set of measure zero. The question is therefore whether there is room in this set for discourse: can we say
anything about this seemingly guiltless necessity? Is this necessity as shameless as it is guiltless? I will
consider these questions later on.

A substantial difference between guilt and shame is the transparency of guilt: one will always know what
one feels guilty about and why; on the other hand, while one may know what makes one feel ashamed, one
may be at a loss as to the why of such a feeling. So guilt appears as the logical consequence of a given
situation; it seems more consciously motivated than shame, whose obscurity may even feed on itself: one
can be ashamed of feeling ashamed for a futile reason.
Whereas guilt does provide an obvious rationalization, shame looks like a black hole ready to engulf the
whole individual in a whirlpool of despair. Besides, one can hardly feel guilty of feeling guilt; on the
contrary, feeling guilty is often seen as the first step towards redemption and therefore out of guilt.
Furthermore, shame is mythologically older than guilt within Judeo-Christian civilization: Adam and Eve
felt shame as soon as they realized they were naked, just after eating from the Tree of Knowledge, but it is
only after they were cursed by God that a sense of guilt can be conceived of. Guilt therefore hinges on the
original sin as a sin entailing punishment (the sense of guilt being just the internalization of this sequence),
while shame seems to derive from pure self-awareness alone.

I- THE SAFE HAVEN OF GUILT

As seen above, guilt therefore seems a relatively safe feeling. This is not to say that it cannot lead to severe
depression and even suicide. Guilt can indeed be muddied by too strong a belief (sometimes fanatical) about
what a sin is and how evil it is (to the point of being deadly), which obscures all attempts at rationalization
and at redemption via a reasonable sequence of remorseful penitence. But it provides, at least, a favorable
ground for the individual to overcome the first affliction of guilt, by laying out a standard process to escape
it and be forgiven. In this sense, guilt is, most of the time, equivalent to a redeemable debt.
The safety of guilt is then this framework, analogous to an accounting practice, where causality is well
established and determinism is assured. Necessity naturally follows from there. As guilt establishes its
dominion over most social interactions, conditioning individuals' psychology from an early age, thought
admits causality as the natural way of things, and if something is, then surely it must be necessary.

But this is not quite what Elie Ayache is saying in the above passage. His idea, if I understand it properly, is
that thought feels guilt, along with nostalgia, over its self-inflicted banishment from the absolute, and builds
necessity in a way to redeem this guilt or to comply with this nostalgia.
Nonetheless, I contend that it is largely because of the framework of safety and stability that is inherent to
the logic of guilt, and that is encrypted in the Western mind, that thought first grafts itself onto this logic to
expiate its abandonment of Fideism and then redeems itself by acknowledging necessity; a necessity that,
contrary to Meillassoux's ambition to discover an absolute necessity that does not lead back to any
absolutely necessary being ("(...) nous devons découvrir une nécessité absolue qui ne reconduise à
aucun étant absolument nécessaire", p.47, Après la Finitude), has so far (until Meillassoux) been one
founding a dogmatism, in its pre-Kantian form, or one asserting the a priori absoluteness of the principle of
non-contradiction in post-Kantian weak correlationism (to use Meillassoux's vocabulary).
The rejection of all forms of necessity then leads to strong correlationism, but this entails an incapacity to
oppose any form of fanaticism, as it disallows all possibility of a speculative rational discourse about the
absolute, even a refuting one.

II- PARADISE LOST

I published earlier a post on Igor Markevitch's Le Nouvel Age, and briefly commented on it as involving a
reflection on shame. Here, I wish to examine another composition by Markevitch: Le Paradis Perdu
(Oratorio), which can be purchased here, and whose full libretto can be found here.

In my opinion, Christopher Lyndon-Gee totally misses the point of Markevitch's intention in his analysis,
when he writes the following, whose content is perfectly right (except for redemption, which is precisely
never achieved) but whose overall condescension is out of place:

Markevitch's Eve is a self-pitying rag doll; Milton's has dignity and responsibility. Indeed, a Lucifer
who can declare "Quelle proie facile" (what easy prey) or can refer to Eve as "stupide épouvantail"
(foolish scarecrow) does not fall from the pages of Milton, in whose poetic vision the opponents in
this cosmic clash that affects the destiny of the known universe inhabit a higher moral plane.
Redemption is achieved (all too quickly) without effort or confrontation with the terrifying majesty of
God. The Spirit that points the way is a cross between a Victorian sentimental comfort-cushion
and some kind of pantheistic prop of Futurism.

Markevitch's Eve is indeed a self-pitying rag doll, just like Flaubert's Emma Bovary, Woolf's Clarissa
Dalloway or Chekhov's three sisters, just like Bertolt Brecht and Kurt Weill's soldier's wife:

To conclude from there that our "musically gifted, very young composer had merely normal literary
abilities" is not to do justice to Markevitch (nor to Cocteau and Ramuz, who advised him on this matter).
Markevitch didn't want to set Milton's poem to music; he obviously aimed at adapting it to his times, and
that is what he did by taking his characters away from the dogmatic universe of Milton, to immerse them in
the contemporary post-critical correlational circle, where they indeed become "easy prey" and "foolish
scarecrow" for all kinds of fanaticism.

Elsewhere, Lyndon-Gee writes:

Acceptance of guilt is the first building-block of redemption for Milton. Markevitch, on the other
hand, posits redemption through a vague, almost Hollywoodised notion of Love and aspiration
towards the Spirit

In Milton's dogmatic context, which is the natural space of guilt, such an acceptance indeed reflects the
necessity that is articulated by the logic of guilt: sin-guilt-punishment-redemption (equivalent to the logic of
debt-redemption), itself made efficient by the presence of dogma. In the post-critical world of Markevitch,
redemption can only be a vague and dream-like yearning for an impossible absolute; such is the tragedy of
the correlational circle, that it deprives you of redemption while maintaining you in guilt.
But it is obviously in the music itself that all this is most apparent, particularly towards the end of the piece,
when the futuristic machinery enters the scene to uplift mankind to a fideist beatitude of pure spirit. Here
again, a subtle irony pierces through the superficial triumph of the Spirit; an uneasiness is felt, a pretentious
vulgarity, which at times derails cacophonously from the main harmony, contradicting the emphatic
pronouncement of redemption; Markevitch clearly demonstrates a defiance towards modernity, in extreme
contrast with the bombastic optimism of Prokofiev's The Steel Step, for instance.
Markevitch's view of modernity is certainly closer to the one Chaplin depicted in Modern Times:

The pressing question is now whether there is a possibility for a guiltless necessity, a necessity that would
not be accidental, coincidental to guilt, and that would not therefore appear as a mere expedient to fulfill its
logic. This necessity is of course the necessity of contingency, but it remains to be seen whether it is not just
a new trick by guilt to reassert its dominion over human thought.

III-THE HOLLOWING PRINCIPLE OF SHAME

If we now turn our attention to shame, we see that, whereas guilt is a process that aims at redemption, and
therefore at its own cancellation, shame is unredeemable: it can never be expiated, it can't be erased by the
purchase of an indulgence or the endurance of a punishment. Its mark is indelible; it is a hole in the flesh.
The readers of the Blank Swan will already have made the connection with the following passage found on
page 365:

The individual degenerates into an identical individual again; he didn't evolve into a differentiated
organism, a body, a corporation, a company.

This is how Elie Ayache qualifies debt; it could, I believe, apply equally to guilt. By contrast, shame is
differentiating: it alters an individual and forces him into a becoming. Interestingly, in Japanese, there is no
direct translation of "must"; the necessity of acting in a given way is rendered by the verbal suffix:
なければなりません (nakereba narimasen)
which literally translates as:
"If you don't do it, it will not become."
It is then, here, at the linguistic level, not a call to guilt but properly to shame, for preserving the becoming
of a process by oneself accepting to become other. And this leads us to a very similar call from Meillassoux,
on page 96 of his book:
on page 96 of his book:

It is necessary for this to be this and not that or anything else, for this to become that or anything
else.

Hence shame does imply a necessity, a necessity for things to become, that is, the necessity of contingency.

Incidentally, I would suggest that the dynamics of shame is identical to the dynamics at play in the
construction of the Cantor set (and other such constructions), but I shall not dig into this matter further for
now.
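For readers who have not met it, the middle-thirds Cantor construction alluded to above can be made concrete in a few lines: at each step, every remaining interval loses its open middle third, so the set is "hollowed out" without ever losing its endpoints. The following Python sketch is mine, purely illustrative, and not part of the original post:

```python
# Iterative "hollowing" construction of the middle-thirds Cantor set:
# each step replaces every interval by its two outer thirds.

def cantor_step(intervals):
    """Replace each interval (a, b) by its two closed outer thirds."""
    out = []
    for a, b in intervals:
        third = (b - a) / 3
        out.append((a, a + third))       # left third survives
        out.append((b - third, b))       # right third survives
    return out

def cantor(n):
    """Intervals remaining after n hollowing steps, starting from [0, 1]."""
    intervals = [(0.0, 1.0)]
    for _ in range(n):
        intervals = cantor_step(intervals)
    return intervals

if __name__ == "__main__":
    for n in range(4):
        ivs = cantor(n)
        total = sum(b - a for a, b in ivs)
        print(n, len(ivs), total)
```

After n steps, 2^n intervals of total length (2/3)^n remain; in the limit, a set of measure zero, which echoes the "hole in the flesh" image: what remains is indelible yet takes up no room.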

What is clear is that shame proceeds according to a hollowing principle: it does not build relationships
between individuals according to a logic of debt, but according to a logic of basho (場所), of place, which
relates to the works of Nishida Kitaro. Shame makes room, makes place, for the others and the world to
become.

I shall stop here for now and see whether it triggers any discussion; there are certainly many imprecisions,
simplifications and perhaps some mistakes, but this is still a thought in progress. Shame on those who would
not let it become!

Eternally returning to the virtual

On page 244 (Ch.11: The Narrative Adventure) of The Blank Swan, Elie Ayache wrote that "to capture (...)
the singular 'how' of creation as such" will provide for "our perception of the creature as a result of
the act of creation, our perception of it against the backdrop of the virtual it is emanating from, will
allow us to read, in the static and actual and settled creature, a continual and eternal return to the
virtual. As it actually, definitely, 'eternally' exists, the creature eternally returns to the virtual."

Nietzsche's idea of the "Eternal Return" is often borrowed by various thinkers and adapted to their needs,
which is fair indeed, because Nietzsche himself did not close this idea (which he himself borrowed from
various sources: mythologies and poets) into a strict interpretation, leaving ample room for his readers to
comprehend it in their own way.
A few years ago, I myself pondered over this idea, and came to consider it as a return to the materiality of
the body, a necessary and eternal travel from the abstract to the material, where thought (I personally used
the word "regard" in French; I don't know how to translate it into English in a way that conveys its
polysemy, associating the ideas of seeing, considering, thinking, caring, guarding, absorbing, mirroring,
repeating, ...; Nietzsche would perhaps talk of "will to power") eternally returns to the body in order to
reaffirm itself (by again moving away from the body, in order to later return to it) in its freedom, in acting.
It seems to me that this adheres to the sense of the following sentence from Elie Ayache, found on page 234:

"Thought wakes up with the body pushing it from behind, and it is soon to be itself literally pushed
outside of the room."

This eternal return of the "regard" to the body from which it comes is also, I believe, the return of thought
onto the creature from which it emanates. It is ultimately through thought's consideration of the negation of
the body that is its origin that body and thought are articulated into a relationship of eternal return. But the
negation of the body can also be seen as the affirmation of contingency (and of its necessity); it is therefore
an eternal return of the actual to the virtual, which is the perspective described by Elie Ayache.

Furthermore, I think that this ability of ours "to read, in the static and actual and settled creature, a
continual and eternal return to the virtual" is particularly well represented in Chinese art. Calligraphy
painting, for instance, is all about enabling the viewer to enter and to experience the process of painting,
that is, the virtual:

This is the character li (力), which means strength, but there is no depiction of strength here; nothing
substantial differentiates this painting from the printed character 力. What is apparent, however, is the way
this painting has been realised. We can easily imagine ourselves drawing this exact character: the brush
strokes are clearly visible; Wen Shen, the author of this calligraphy, did not do anything to conceal his
technique; on the contrary, he's displaying it. We can imagine the tension of his hand, the speed of his
strokes, even the angle of the brush on the paper. This painting does not represent 力, it represents itself: it
is a painting of the painting.

Similarly, Chinese landscapes do not represent a landscape; they again represent the painting of the
landscape:

Again, Chen Jun here doesn't make any effort to conceal his technique; on the contrary, sharing it is the
whole point of the painting.
This philosophy of art is further developed in the works of Zao Wou Ki, whose explicit ambition is to paint
the Dao, which is nothing else than the virtual.
The stress on the process of actualization that underlies Chinese art obviously has its counterpart in Chinese
philosophy, and most importantly in the Dao De Jing (道德經).
Interestingly, just as virtue and virtual share an etymological connection, the Dao De Jing relates both
concepts quite closely in the first chapter of the De (德, virtue):

上德無為而無以為也 (shang de wu wei er wu yi wei ye) (1)

which translates literally (and according to Henricks, whose book is referred to above) as:

The highest virtue takes no action, yet it has no reason for acting this way.

Or, in other words: The highest virtue remains virtual

This idea is of course pervasive in all traditional Chinese philosophy; it is sometimes referred to as the "wei
wu wei" (為無為) principle, and it can be found to some degree in the teachings of all Daoists, but also in
Confucianism. The De Dao Jing (to use Henricks' order), that is, the Book of the Virtue and the Dao, or, to
use a purely Western vocabulary, the Book of the Virtue and the Virtual, is unique in linking both notions in
such an explicit manner. It indeed seems to say that to be virtuous is to remain in constant proximity to the
virtual, at the edge, one could say, between the virtual and the actual.
It is in this sense that Art has an educative function: it is what forms the thought, or the "regard", to remain
virtuous, by not interrupting the eternal return to the virtual away from (and pushed by) the actual; by
remaining not in the "can be" (The Blank Swan, p.437), which is already too much on the side of the actual
(in an ontological beingness), but rather in the "also can". This expression, which is often found in Singlish
(Singaporean English), obviously comes from the Chinese ye keyi (也可以); the "ye" (也), meaning also,
acts as an untotalizer of the range of possibilities. Indeed, this expression often comes as an answer to a
request which implicitly assumes that the demand was rather out of the range of what is normally requested;
the interlocutor then replies that it can also be done (even though, in practice, the expression is more
pervasive and exceeds this context).
也 is also found in the above verse (1) of the De. Here it is widely believed to be just an emphasis, an
affirmative marker; as it is only found in the MaWangDui version, and not in the more commonly used
version of WangBi, its importance is therefore neglected. However, I believe that it should also be
understood here as an untotalizer, a differentiating operator, that extends the meaning beyond its literalness.
Far from emphasizing, then, I think it may well be a de-emphasizing marker, a prompting to forgetting in the
sense Elie Ayache proposes on page 248:

"Forgetting is the point of inflexion and reversion to the surface that is necessary for writing. By
contrast with the sum total of possibilities, clear and yet confused, forgetting makes the writing thread
obscure yet distinct."

A lot could be said now of these ideas of clarity and obscurity in relation with Chinese thought; I leave it to
the readers to appreciate for themselves, maybe through the works of Francois Jullien, or maybe by looking
also at the works of Zao Wou Ki.
Posted by Jean-Philippe at 3:57 PM
Labels: Art, Philosophy

Sunday, February 6, 2011


Igor Markevitch: Le Nouvel Age

I have been wanting to publish a post on Igor Markevitch's works for a while. Although he is better known as a
conductor, he is one of my favorite composers. I could have chosen more famous pieces of his, such as
Rebus or L'envol d'Icare, and I may discuss some of them in the future, but as it happens, right now, I can only
listen to the CD starting with Le Nouvel Age, of which a one-minute extract of each movement can be heard and
the whole piece can be purchased here.

A presentation of Le Nouvel Age (the new age), the one that can also be found in the CD, can be read here.

To me, one of the most striking features of Markevitch's music is its self-restraint, which contrasts with
the futuristic elation found in Mosolov's "Iron Foundry", for instance:

The programmatic subtext, provided by Markevitch himself in his autobiography and reproduced in the CD
leaflet, reflects this in toning down the pride of a youthful wrath with:

"Présence sous-jacente de la vulgarité." (underlying presence of vulgarity)

This presence seems to instill a lingering sense of shame throughout the piece, and this leads to a fundamental
questioning (the unresolved dominant seventh) that comes to dominate the third movement: Isn't the New
Age the age of shamelessness, and therefore the age of vulgarity?
Posted by Jean-Philippe at 12:54 PM
Labels: Classical Music, Philosophy

Wednesday, January 26, 2011


The Medium of Contingency

The Medium of Contingency (shortly available in volume 22 of Pli) clarifies many points that remained
implicit in The Blank Swan; it particularly makes precise the divergence that exists between Elie Ayache's (EA
in the following) and Quentin Meillassoux's (QM) thought. In that, it does answer my earlier comment,
somehow confirming it; but also, strangely, I came to disagree with that comment to the extent of now holding the
contrary belief that EA's and QM's thought do merge into a seeming compatibility.

I-THOUGHT: FROM DOUBT TO AXIOMATICS

The following passage (on page 2) of The Medium of Contingency seems to assume a certain
conception of thought, or at least of its placement, in order to think speculatively in QM's sense:

If a speculation like Meillassoux's must bring our thought flat against the matter of absolute
contingency, with a flattening of the depth where we would have searched for the reason why things
are what they are and not otherwise and with the flipping of ontology from the side on which things
are to the side on which things can be and if, correlatively, contingency has to be thought
independently of any division of underlying states in which the contingent thing possibly can be
something or other, then the step back from contingency - for only by stepping back from its absolute
strike are we able to make sense of it and unfold the expanse where it can be thought speculatively -
should take place in a direction and through a medium that maintain the absence of reason and the
absence of states.

I am not sure there is any depth to be flattened in the way thought relates to any of its objects, and in the way
thought really is, since I can't conceive of a thought severed from its object (in the way I can conceive of an
objectless desire, for instance). On the contrary, I tend to believe that thought is flat; what does have depth is
its manifestation via language, but thought is largely independent of it, and it is naturally independent of
any division of underlying states, which are just tools it uses to express itself (i.e. its object).
As such, thought can easily criticize these divisions, these states; while it may lose itself in them every now and
then, it always retains the capacity of freeing itself from their influence, of turning against them, of staring at
them in an inquisitive manner. Language can even help thought in its rebellion against language, as any
natural language contains its own meta-level. In that, thought can easily turn itself (and its object) upside
down.
So I am not convinced that a medium that "maintain[s] the absence of reason and the absence of states" is all
that necessary, provided that states are not taken too seriously, too heavily, so as to place us in a world that
"is repelled by gravity" (The Blank Swan, p.152; Après la finitude, p.149). As long as the totalization of states
is not given too much credit, the philosophical debt can easily, and instantly, be repaid.
That being said, such a medium may not be necessary, but it may be useful; more on this
later.

The condition for thought to stay in control of its fate, and thereby be speculative, is doubt, which may just
be the historical root of QM's factuality: just as facticity cannot be said to be factitious (Après la
finitude, p.107), doubt itself cannot be submitted to doubt.
Doubt therefore seems to appear as the psychological form of facticity (see Après la finitude, p.101), or, to go
further, as its subjective face; and, to extrapolate a bit more, one may wish to consider the equivalence
between the necessity of facticity and the necessity of doubt, wherein the former implies the object
(principle of factuality) and the latter, the subject. I will not follow this line of thought now, however, and I
genuinely don't know whether it leads anywhere.

Doubt is nonetheless dated, and often distorted beyond recognition by mundane usage, whereas axiomatics
proposes a modern mathematical formalization of doubt. It may then be through axiomatics and mathematics
(and maybe indeed topology, following Jeff Malpas along with EA, The Medium of Contingency, p.18) that
speculative thought can progress. Mathematics, in its axiomatized form, also presents the great advantage
of being a very flat language, containing its own meta-language and providing an unmatched clarity, to the
extent of being tautological (as it is fully explicit).

II-USEFULNESS OF THE MARKET


While I don't think the market is the only available medium for factual speculation to develop, I wholly
follow EA's analysis of the market as being a genuinely contingent and immanent place. As such, it is
useful not only because of what it has done, but more because of what it promises to do.
Looking at the market not as the medium for factual speculation but as just one of many such media, we
should expect all these media to communicate with each other, to exchange and nourish their respective
speculations.
Clearly, for instance (and some work may already have started on this matter, with which I still have to get
acquainted), the derivatives market should be a fruitful domain for factual speculation on probability
axiomatics, which itself could lead to topological, and therefore ontological, results.
It may also be that a speculative resolution of Hume's problem (as stated by QM in Après la finitude, p.176)
could draw some insights from the material un-totalization of contingent claims by the endless
complexification of exotic contracts.

Furthermore, the market may not be the only such medium, but it appears, in some important regard, to be the
purest, the most devoid of faux-semblants; it will therefore act as a useful reference for the other places of factual
speculation. Not unique but central, it may play the role of a singularity, a repelling or an attracting one; it
does not matter, but it will be instrumental in the development of factual speculation. It may even be
more than that, as I believe it also has a political and ethical relevance which is not foreign to its immanence,
but that is another matter.
Posted by Jean-Philippe at 11:54 PM
Labels: Economics, Options, Philosophy

Tuesday, January 25, 2011


The Blank Swan

I have already published a review of The Blank Swan on Amazon's site; I will not repeat it here. Rather, I
wish to further an analysis (which I outlined in the Amazon review, but in very vague terms) of some of its
theses in relation to those of Quentin Meillassoux in After Finitude: An Essay on the Necessity of
Contingency (the original version of this book being in French: Après la finitude).

I- SOME OF MEILLASSOUX'S POINTS

1. Hume's problem and its probabilistic inference


In After Finitude, Quentin Meillassoux takes on Hume's problem, which investigates the possibility of
rationally grounding the observed stability of natural laws or, in Meillassoux's words, our capacity to
demonstrate the necessity of causal connections. Meillassoux lists three types of answers to this problem: a
metaphysical, a skeptical and a transcendental one.
I am only interested in the latter here; I leave it to interested readers to check on the others in Meillassoux's
book.
The transcendental solution to Hume's problem is typically the one proposed by Kant. Kant's argument is a
reductio ad absurdum: starting from the assumption that there is no necessity in the causal connections, it results,
according to Kant, in the complete destruction of the possibility of representation, as the very categories of
representation would lose all meaning in a world where causal connections would keep changing. From there,
Kant infers that, since we have representation and consciousness of phenomena, causal
connections are therefore necessary.
However, Meillassoux notices that Kant's argument hinges on a probabilistic assumption, the one according
to which, if causal laws could change, they would change often, to the point where all representation would
become impossible. It is this assumption that Meillassoux criticizes, by proposing the concept of
untotalization, inspired by Cantor's work on transfinite numbers.

2. Cantor's transfinite and the concept of untotalization


Cantor's theorem establishes that the power set of any set A (finite and infinite alike) has a cardinality
strictly greater than that of the original set A; in other words, the set of all subsets of A has more elements than A
itself. That led Cantor to introduce transfinite numbers to account for the cardinality of various infinite sets:
aleph-null is then the cardinality of the natural numbers, while aleph-one is the cardinality of the set of all countable
ordinal numbers.
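To make the finite case of Cantor's theorem concrete, here is a small Python illustration (mine, not anything from Cantor, Badiou or Meillassoux): the power set of a set of n elements has 2^n members, always strictly more than n:

```python
from itertools import combinations

def powerset_size(elements):
    # Count every subset of `elements`, of every possible length;
    # Cantor's theorem in the finite case: the count is 2^n > n.
    return sum(1 for r in range(len(elements) + 1)
               for _ in combinations(elements, r))

faces = [1, 2, 3, 4, 5, 6]              # the six faces of a die
assert powerset_size(faces) == 2 ** 6   # 64 subsets
assert powerset_size(faces) > len(faces)
```

The interest of the theorem, of course, lies in its extension to infinite sets, where it generates the hierarchy of transfinite cardinals mentioned above.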
Building on the work of Alain Badiou, who interpreted Cantor's theory in ontological terms, Meillassoux argues
that such a concept of the transfinite invalidates Kant's argument, as probabilities are valid, in their
frequentist interpretation, only insofar as a totalization of the cases is not problematic. In
Meillassoux's words:
"We are completely ignorant of the legitimacy there is in totalizing the possible, as we totalized the faces of
a die. Such an ignorance is sufficient to demonstrate the illegitimacy of extending an argument about
uncertainty outside of a totality given by experience."
The possible, because of Cantor's theorem, may therefore escape, according to Meillassoux, a totalization
compatible with its treatment by probabilistic means, and that is enough to invalidate Kant's argument.
Meillassoux does not, however, provide a positive demonstration of how Cantor's theorem, applied to the
possible, makes probabilities invalid; he simply raises the question and concludes, rightfully in my view and
for the problematic he is looking at, that this is enough to reject Kant's argument.

II- THE BLANK SWAN

Elie Ayache's book is subtitled "the end of probability", and its central thesis is indeed, following the trail
opened up by Meillassoux, to assert that probability theory is unable to account for the reality taking place in
Finance.
It would be presumptuous to summarize here all that there is in The Blank Swan, just as the previous
section can in no way be taken as a summary of Meillassoux's work; I just wish to clarify a few concepts, in
order to point out an ambiguity which, in my opinion, is left unresolved by Ayache.

What Ayache proposes to do, in his book, is to apply Meillassoux's conclusions about the physical world and
our relation to it to the world of derivatives trading. For that, he asserts, in convincing terms, that the
market (of derivatives) is a medium of contingency. Therefore, the market is untotalized in the very same
way that Meillassoux says possibilities in the material world are; and in the market, this untotalization can
be derived from the non-redundancy of derivatives contracts, as indeed, if a contract were redundant, its market
would simply vanish; or in the words of Ayache:

"If there were an established law, then some derivatives would never be exchanged." (p.167)

This is indeed true: if valuation were exact, there would be no room to exchange a contract at variance with
this valuation, and therefore no market. The existence of a market clearly points to the inadequacy of the
valuation process, and therefore to an untotalization of possibilities.
All this is still very much in line with Meillassoux's thesis; however, Ayache goes a step further when
he writes:

"In thinking contingency as absolute with regard to the material world, Meillassoux is thrown into the
exchange. His speculation is untenable in pure thought and the corresponding detachment or
transcendence.
(…)
All I am trying to do is to carve out the space that is adapted to speculative factual thought." (p.190)

Here I see a divergence between Ayache's and Meillassoux's thought. Elsewhere, Ayache wrote:

"Speculation thus recovers its absolute meaning. It surpasses even thought itself." (p.175)

Such a stance seems to assert speculation (which, in Ayache's terms, means the act of inverting the model,
whatever it is, for valuing an option, and engaging in the trading of the derivative at variance with its
replication plan, i.e. the writing of the market) as a process that exceeds thought, that reaches a point
beyond thought. But then, one of the main points of Meillassoux being the re-appropriation of the domain of
the absolute by thought, isn't Ayache positing a new absolute which he again places beyond thought?
Another way to put it is: isn't Ayache falling into a new kind of fideism, which is properly a target of
Meillassoux's work?
By proposing such a radical criticism of probability, positing not the end of some interpretation or
axiomatization of probability but of probability itself, Ayache may have hypostatized a reality beyond
thought, only accessible through speculation. What, then, differentiates speculation from a magical ritual that
one must perform in order to access a higher level of reality? Aren't we driven into a fideistic way of
relating to the world (be it the world of the market), and into giving up any illusion of grasping it with an
analytical apparatus?
That matter doesn't find any treatment in The Blank Swan, and it is, in my opinion, its major defect.
Because of it, an atmosphere of ambiguity lingers over the pages, becoming more and more persistent.

I shall stop here for now, not that I've said it all; I will likely come back to comment on some other
ideas from the book, such as the logic of place, which I wish to analyse further in relation to Nishida Kitaro's
ideas on the logic of basho, for instance, though I still have some study to do before that.
Posted by Jean-Philippe at 4:46 PM
Labels: Economics, Options, Philosophy

Thursday, July 1, 2010


From Spot FX to FX Options (maybe)

Sorry for having been a bit absent these last few weeks; I am considering moving towards FX options
trading, and I have therefore been studying this very different area.

This idea came to me quite accidentally. A few weeks ago, I bought "The Blank Swan" by Elie Ayache, out of
curiosity really; I started reading it, and after about 30 pages, even though I got the gist of his ideas (which
are more philosophical than technical), I nonetheless realized that I might enjoy the book more if I were a bit
more knowledgeable in Options Trading. Until then, I had browsed through some mathematical finance
books, but never went into much of the details.
I therefore did just that and acquired Espen Gaarder Haug's "Derivatives: Models on Models". I am still reading it,
but I have already realised how powerful Options trading can be.

As I see it, traders face two unknowns: the volatility and the direction of the move. So far, I don't think
analytical tools provide a very good prediction in terms of direction, but I tend to think that volatility
can be foreseen in a better way, albeit far from perfectly.
Nonetheless, it is very possible to make money in Spot FX; the uncertainty about direction can indeed be
compensated by a proper money management strategy (which can be found in many trading books, see those
by Van Tharp for instance). But Options trading seems able to do that more efficiently, by hedging the
risk by means of a combination of options, thereby diminishing the exposure to direction while maintaining
a profit potential from the volatility variations. And this is only one aspect of Options Trading, as it appears
to offer a rich range of other approaches to trading.
And last but not least, it also proposes rather stimulating intellectual challenges.
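To illustrate the kind of combination alluded to here, consider a textbook long straddle (sketched in Python; this is a generic example of mine, not a strategy taken from any of the books mentioned): buying a call and a put at the same strike removes the dependence on direction while keeping the exposure to the size of the move.

```python
def straddle_pnl(spot_at_expiry, strike, call_premium, put_premium):
    # Long straddle: buy a call and a put at the same strike.
    # The P&L at expiry depends on the size of the move, not its direction.
    call = max(spot_at_expiry - strike, 0.0)
    put = max(strike - spot_at_expiry, 0.0)
    return call + put - (call_premium + put_premium)

# A 1500-pip move up or down yields (up to rounding) the same P&L:
up = straddle_pnl(1.45, 1.30, 0.02, 0.02)
down = straddle_pnl(1.15, 1.30, 0.02, 0.02)
assert abs(up - down) < 1e-9
```

The cost of this directional neutrality is the two premiums paid: if the spot stays near the strike, the position loses.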

My knowledge of these combinations is still too incomplete for me to detail much more at this stage; I shall
therefore continue to explore this area in the next few weeks, and will confirm whether I choose to trade FX
Options in the future. If so, one must expect some changes in the content of this blog: even though I will
continue to look at technical tools to analyse the "underlying" time-series, I may concentrate as well on
some portfolio strategy issues.
Posted by Jean-Philippe at 11:03 AM
Labels: Options
Monday, May 17, 2010
Variation of the Hurst Exponent

While playing around with various strategies, I came to consider that an interesting way to use the fractal
dimension is to look at its variations rather than its absolute value. Furthermore, such an approach makes
sense from a mathematical point of view: from equation (1) in this post, the variance scales with time as
Var(t) = c·t^(2H(t)); applying the functional power rule of derivation, we can see that:

d(Var)/dt = Var(t)·[2·H'(t)·ln(t) + 2·H(t)/t]

Rearranging it, we get:

H'(t) = (1/(2·ln(t)))·[(1/Var(t))·d(Var)/dt - 2·H(t)/t]

Asymptotically (for t sufficiently high, where the ln(t) term dominates the 1/t term), we can then see that the
sign of the variation of H with time gives us the sign of the variation of the variance over time, and when this
variation is positive, it indicates an increasing volatility and is therefore the best time to enter a trade. It must
be noted that this indication does not say anything about the direction of the trade we should enter, and it
therefore ought to be combined with a directional indicator in order to be fully operational.
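The idea can be sketched as follows (in Python rather than MQL4; the function names are mine, and the estimator below uses a simple variance-scaling method rather than the FGDI, so the numerical values will differ from those of the actual indicator): estimate H on a rolling window, take its first difference, and treat a positive difference as a potential entry signal.

```python
import numpy as np

def hurst_variance(x, scales=(1, 2, 4, 8)):
    # Estimate H from the scaling of the increments' standard deviation:
    # std(x[t+s] - x[t]) ~ s^H under a fractional-Brownian-motion model.
    logs, logstd = [], []
    for s in scales:
        inc = x[s:] - x[:-s]
        logs.append(np.log(s))
        logstd.append(np.log(inc.std()))
    slope, _ = np.polyfit(logs, logstd, 1)
    return slope

def hurst_difference(prices, window=30):
    # Rolling estimate of H over `window` bars, then its first difference;
    # a positive difference flags rising volatility, i.e. a potential entry.
    hs = [hurst_variance(np.asarray(prices[i - window:i]))
          for i in range(window, len(prices) + 1)]
    return np.diff(hs)
```

A value of hurst_difference above 0 then plays the same role as the Hurst_Difference indicator described below.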

Even though most of such variations can be seen by just looking at the FGDI graph, it is just as easy (and
possibly adds some precision) to program a new indicator that displays the variations of H over time; the
script of this indicator can be found here on MQL4.
Below, the indicator Hurst_Difference is displayed in the lower window on a 1hr chart for EUR/USD:

Whenever this indicator displays a value above 0, it indicates a potential entry for a trade.
The parameters of Hurst_Difference.mq4 are:
f_period (integer): This is the period considered for calculating the fractal dimension, default is 30.
type_data (0,1,2,3,4,5 or 6): This is the type of price the indicator will consider (0=CLOSE, 1=OPEN,
2=HIGH, 3=LOW, 4=MEDIAN, 5=TYPICAL, 6=WEIGHTED), default is 0.
Posted by Jean-Philippe at 5:07 PM

Thursday, April 15, 2010


Self-similarity and a measure of it
Following an exchange of emails with a fellow trader, John Last, in which he made some remarks about the
interest of some kind of self-similarity, I came to conceive a new indicator which can be used to detect some
convergence of behaviour between different timescales, in the sense outlined below.
____________________________________________________________________________________

I-General Remarks

Self-similarity is a well-known feature of fractals; it may however be useful to clarify a few things about
this concept.
The classic examples of fractals, such as the Cantor Dust, the Von Koch Curve or the Sierpinski Gasket,
display an obvious self-similarity that is the direct consequence of their recursive mode of construction.
However, when one comes to consider real-life applications of fractals, one should not expect to find such a
perfect self-similarity.

Furthermore, when it comes to random fractals and their applications, which is the case of financial markets,
self-similarity should not be taken as meaning a repetition of the exact same pattern, not even as a repetition
of a pattern close enough to the original to warrant the use of "repetition".
Rather, what is meant in the case of financial price variations by self-similarity is really a "statistical self-
similarity", which is more of a similarity of behavior between different timescales.

What should be compared, therefore, in order to measure the level of self-similarity at a given time, is not
how the price curves at different timescales are "similar" to each other, but rather whether their behaviour,
and particularly their volatility, displays a level of self-similarity across timescales.

II-Dispersion of the Fractal Dimension across various timeframes

To measure this statistical similarity, I will only consider the dispersion of the FGDI of the various
timeframes around the FGDI of the longest timeframe considered.
For instance, consider the timeframe of 1hr, whose FGDI is fgdi(60), and the shorter timeframes of
5mn (fgdi(5)), 15mn (fgdi(15)) and 30mn (fgdi(30)). The dispersion will then be given by:

dispersion = sqrt{ [ (fgdi(5) - fgdi(60))^2 + (fgdi(15) - fgdi(60))^2 + (fgdi(30) - fgdi(60))^2 ] / 3 }    (1)

which is basically the formula of the standard deviation around the value of the longest timeframe.
This calculation seems rather straightforward, except that we must take care of a little technical problem: if,
at the present instant, the value of the FGDI is current for all the timeframes, the value of the FGDI 30 bars ago
in the 5mn timeframe does not correspond to the value of the FGDI 30 bars ago in the 15mn timeframe.
Indeed, the 30th bar back from now on the 5mn TF corresponds to 150mn ago, which on the 15mn
timeframe corresponds to the 10th bar in the past.
Clearly, therefore, to re-establish a correspondence that makes sense in equation (1) above, one must apply a
change of index, whose general equation between a given timeframe (TF) and the reference timeframe
(TFref) has the following form:

newpos = pos × (TFref / TF)

'newpos' is the new value of the index to be considered in the shorter timeframe in relation to the index 'pos'
from the reference timeframe. Notice that in MQL4, the further a bar is in the past, the higher its index; in
order to have the past index available, the main loop should be a decreasing one in terms of index, starting
from as far as necessary in the past and calculating the bars towards the present.

That being said, I am aware that the simplicity of this transformation does not ensure a perfect match
between the different timeframes, and an error of a few bars is still possible in the past; but the complexity of
implementing a full check in order to ensure a perfect match is not warranted, given that it would modify the
final calculation in a negligible manner.
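The dispersion and the change of index described above can be sketched as follows (a Python transcription for illustration only; the actual indicator is MTF_FractalDispersion11.mq4 in MQL4, and the function and variable names here are mine):

```python
import math

def map_index(pos, tf_ref, tf):
    # Change of index: bar `pos` back on the reference timeframe corresponds
    # to bar pos * (tf_ref / tf) back on the shorter timeframe.
    return pos * (tf_ref // tf)

def fractal_dispersion(fgdi, weights, tf_ref):
    # Weighted standard deviation of the FGDI of each (shorter) timeframe
    # around the FGDI of the reference (longest) timeframe.
    num = sum(w * (fgdi[tf] - fgdi[tf_ref]) ** 2 for tf, w in weights.items())
    return math.sqrt(num / sum(weights.values()))

# the 10th bar back on a 15mn reference is the 30th bar back on 5mn
assert map_index(10, 15, 5) == 30
```

When all the FGDI values coincide, the dispersion is 0, which corresponds to perfect statistical self-similarity across the timeframes considered.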

III-Implementation in MT4

I then wrote this indicator as MTF_FractalDispersion11.mq4; the script is available here, and here is what it
looks like on a EUR/USD chart, in orange, in the bottom window:

For clarity purposes, the value of the dispersion is multiplied by 10; a low value is indicative of a high self-
similarity between the different timeframes.
For instance, in the case of a trendy market (FGDI below 1.5), a low dispersion (corresponding to a high self-
similarity) is a positive indicator to enter a trade in the direction of the trend, provided the said trend is in the
same direction in all the timeframes considered.
The available timeframes are 5mn, 15mn, 30mn, 1hr, 4hr and 1 day. Each timeframe can be weighted as
desired (but by an integer value).

Important remark:
Following an excellent remark by John Last, I came to realise that the graphical representation of the Fractal
Dispersion is only aligned temporally with the price graph (and therefore also with the Fractal Dimension
graph) on the reference timeframe (the longest TF selected with a weight above 0). On all the shorter TFs,
this representation will appear contracted towards the right, proportionally to how far in the past we are
looking from the present (rather than the Fractal Dispersion being contracted, it is the price, and therefore
also the Fractal Dimension, that are dilated, taking more values than in the longer TF within the same time
interval), and the movements of the Fractal Dispersion will therefore appear to have taken place later than
they really did (the correct time is the one displayed on the reference TF, i.e. the longest one selected).
The only time at which all the graphs coincide on all TFs is the present. Any analysis of the past should
therefore take this into account.
This effect of contraction/dilatation is particularly well illustrated on the following graph, sent to me by
John:

Here, by comparing the two lower windows (ignore the difference in numerical values; they are due to
another mistake I made in the first version and that I also corrected), we see that the yellow curve
MTF_FracDisp11 is nothing else than the contraction towards the left (with the present as the fixed point) of
the green curve MTF_FracDisp.
Incidentally, on this graph, the correct representation, except for a multiplicative factor of 2, is given by
MTF_FracDisp; this is however exceptional and solely due to a specific setup, the 15mn TF not actually being
the reference TF.
In all cases, MTF_FracDisp11 gives the proper value of the fractal dispersion and coincides with the
timescale only for the graph on the reference TF. This indicator should only be used as such.

Notice that this indicator needs to access FGDI.mq4 on your PC, which should therefore be present and
properly compiled.

The parameters of MTF_FractalDispersion11.mq4 are:


e_period (integer): This is the period considered for calculating the fractal dimension, default is 30.
e_type_data (0,1,2,3,4,5 or 6): This is the type of price the indicator will consider (0=CLOSE, 1=OPEN,
2=HIGH, 3=LOW, 4=MEDIAN, 5=TYPICAL, 6=WEIGHTED), default is 0.
M5w (integer): This is the weight to be applied to the 5mn timeframe, default is 1.
M15w (integer): This is the weight to be applied to the 15mn timeframe, default is 1.
M30w (integer): This is the weight to be applied to the 30mn timeframe, default is 1.
M60w (integer): This is the weight to be applied to the 1hr timeframe, default is 1.
M240w (integer): This is the weight to be applied to the 4hr timeframe, default is 0.
M1440w (integer): This is the weight to be applied to the 1day timeframe, default is 0.

Rescaled Range Analysis

The Rescaled Range Analysis is an interesting statistical tool to detect long-range dependence in a
time-series, and it also provides a method to estimate the Hurst Exponent. I have detailed this
method to some extent on my other blog at this address.
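For readers who want the gist without leaving this page, here is a minimal sketch of the R/S procedure (in Python; this is the textbook method, not the exact script used for the RS_FRASMA below): split the series of increments into chunks at several scales, compute the range of the cumulative deviations rescaled by the standard deviation, and fit the log-log slope.

```python
import numpy as np

def rs_hurst(x, min_chunk=8):
    # Classical rescaled-range (R/S) estimate of the Hurst exponent:
    # for each scale s, average R/S over non-overlapping chunks of the
    # increments series, then fit the slope of log(R/S) against log(s).
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, log_s, log_rs = [], [], []
    s = n // 2
    while s >= min_chunk:
        sizes.append(s)
        s //= 2
    for s in sizes:
        rs_vals = []
        for start in range(0, n - s + 1, s):
            chunk = x[start:start + s]
            dev = chunk - chunk.mean()
            z = np.cumsum(dev)            # cumulative deviations
            r = z.max() - z.min()         # range of the cumulative deviations
            sd = chunk.std()              # rescaling factor
            if sd > 0:
                rs_vals.append(r / sd)
        log_s.append(np.log(s))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_s, log_rs, 1)
    return float(slope)
```

Note that the uncorrected R/S statistic is biased upward on small samples, which is consistent with the remark below about the poor quality of the estimation on the small samples MT4 can handle.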

Having estimated the Hurst Exponent, I was then able to write a Fractalised Moving Average, very much in
the style of the FRASMA, except that this one, called RS_FRASMA, uses the estimation of the Hurst
Exponent coming from a Rescaled Range Analysis.
Unfortunately, this analysis is rather demanding in terms of computing power and time; I was therefore
limited to small samples of values, and even then the processing time is quite long. Furthermore, the result of
the estimation is not very good, and not good enough anyway to be usable in a fractional-bands type
of indicator.
Nevertheless, the RS_FRASMA may still be of some interest, if only in comparison with other MAs, and I
have therefore uploaded a script in MQL4 at this address.

The logic of the RS_FRASMA is similar to the one at work in the FRASMA: an SMA is modified by
multiplying its speed by a factor alpha defined as such:

Where H is the Hurst Exponent.

Here is what it looks like; the red curve is the RS_FRASMA, the yellow one is the FRASMA, and the blue
one is an SMA, all with an unmodified speed of 30:
The parameters of RS_FRASMA are:
period (integer): The size of the sample on which the Rescaled Range Analysis is performed; it must be a
power of 2 (4, 8, 16, 32, 64, 128, ...). The default is 64, and in consideration of the limited computing power of
MT4, I don't advise going higher than 256.
normal_speed (integer): This is the normal speed of the Moving Average before it is modified by the Hurst
Parameter.
PIP_Convertor (integer): The factor necessary to convert the real price to pips, default is 10000 (for
EUR/USD).
type_data (0,1,2,3,4,5 or 6): This is the type of price the indicator will consider (0=CLOSE, 1=OPEN,
2=HIGH, 3=LOW, 4=MEDIAN, 5=TYPICAL, 6=WEIGHTED), default is 0.
Posted by Jean-Philippe at 7:33 AM
Labels: FRASMA, Hurst Exponent, Rescaled Range Analysis, RS_FRASMA

Monday, September 28, 2009


EUR/USD outlook

Technically, the EUR/USD may have reached its top at the 61.8% Fibonacci retracement of the range
1.6040/1.2329 at 1.4842 (even though the key resistance is 1.4867) and could now go for a dip back into the
1.2s (albeit with some resistance on the way).

On the fundamental side, the EUR is again now over-valued. Besides, in a recent report, the OECD wrote:
"The reform of global exchange rate regimes and the dollar reserve currency problem is extremely
important, but is also unlikely to be achieved any time soon." From The Financial Crisis and the
Requirements of Reform - Adrian Blundell-Wignall

The USD is therefore strengthened in its position as a reserve currency in the medium term.
In terms of financial regulation, the G20 has clearly achieved nothing but a bunch of populist tricks that
will have no consequence whatsoever, and, as explained in the OECD report, this should lead to a sluggish
recovery, especially in Europe and the USA.
In this context, the recent rise of the EUR, the upshot of an early enthusiasm, should be short-lived, as the
reality of national deficits, rising unemployment, limited credit and falling consumption sets in.
Posted by Jean-Philippe at 9:06 AM
Labels: Economics, EUR/USD, Fibonacci, FOREX, Fundamentals

Wednesday, July 22, 2009


Some general updates and a comment on FRASMA
Let me apologize for a rather long silence; I have been studying some more fundamental problems that
require me to revamp and improve my knowledge of various mathematics topics.
I shall try to resume posting more frequently whenever I find something interesting, and in any case I
should at least be able to post some more basic material after the summer.

Meanwhile, some people left a few comments on the MQL4 community, particularly on the thread
concerning FRASMA, that may be of interest to those using this moving average.
Posted by Jean-Philippe at 8:03 AM
Labels: FRASMA

Sunday, May 17, 2009


Blogs dynamics

As some of you may have noticed, from now on I will publish all the posts that do not relate directly to
trading or economics on two other blogs: http://thenomadicchronicle.blogspot.com/ for English, and
http://chroniquenomade.blogspot.com/ for French (one will not necessarily be the translation of the other;
the content may differ).
The posts which are directly focusing on mathematics will also be made on another blog:
http://stochasticfractals.wordpress.com/

I however leave all past posts on this blog.


This is not to say that my posts on the other blogs will not relate in any way to trading; I actually believe
that most of them, if not all, do relate to it in some way. My intention in separating them is only for the
purpose of clarity.
Posted by Jean-Philippe at 1:46 AM
Labels: Art, Classical Music, Philosophy, Politics

Friday, May 15, 2009


Internationalization of the Yuan

Today, the Bank of China Chairman Xiao Gang announced the beginning of a scheme to internationalize the
Chinese Yuan. The immediate real effect of this declaration will be relatively mild, as this internationalization
will only concern trading relationships with south-east Asian countries, but we can expect a psychological
effect on the exchange rate of the Yuan, and therefore a trade shorting the USD against the CNY seems possible.

From a more fundamental point of view, the Yuan is still very far from being a reserve currency, but given the
geopolitical situation and the weakening of the US economy, the environment is certainly propitious for China
to undertake such measures towards a strengthening of the Yuan and, basically, an affirmation of its
real weight in the world economy. Clearly, until now, China has been relying on US consumption to boost its
economy; with this consumption currently falling (and still far from bottoming), China would be well
advised to develop its domestic consumption, and that supposes a strengthening of the Yuan.

For more details, see Reuters, insiderNews, etc.


Posted by Jean-Philippe at 11:24 PM
Labels: FOREX, Fundamentals, USD/CNY

Thursday, May 7, 2009


Fractional Bands

Let's consider again equation (1) from yesterday:

s_H = s^(2H)   (1)
We were facing the technical problem of having very small real variations of prices, leading to very small
standard deviations. This can however easily be solved by converting all our values into PIPS. For EUR/USD,
it simply consists of multiplying all the prices by 10000. If we then apply the above equation to PIPS, and
convert the result back to the scale of real prices (by dividing by 10000), we get a proper representation of
bands which, given that they strictly obey the FBM model we are working with, I shall name
Fractional Bands.
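As a sketch of this workflow (assuming, per the Fractal Bands discussion below, that the band half-width is the sample standard deviation raised to the power 2H; the function name and sample quotes are mine):

```python
def fractional_band_width(prices, hurst, pip_convertor=10000):
    """Hypothetical sketch: compute a band half-width as std**(2H),
    working in PIPS so the standard deviation stays above 1, then
    convert the result back to real price units."""
    pips = [p * pip_convertor for p in prices]
    n = len(pips)
    mean = sum(pips) / n
    std_pips = (sum((x - mean) ** 2 for x in pips) / n) ** 0.5
    return std_pips ** (2 * hurst) / pip_convertor

quotes = [1.3500, 1.3510, 1.3492, 1.3505, 1.3520, 1.3498]
w_trend = fractional_band_width(quotes, hurst=0.7)  # trending market
w_side = fractional_band_width(quotes, hurst=0.3)   # side market
```

Because the PIP-converted standard deviation is above 1, a higher H now widens the bands instead of collapsing them.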

Here is a representation of these fractional bands for the 5 mn timeframe of EUR/USD; the red bands are the
Fractal Bands defined as earlier, with the default parameters, and the yellow bands are the Fractional Bands,
with the same default parameters (without α, which we don't need anymore since we are not using equation (2)):

We can also compare the Fractional Bands (in yellow) with the Bollinger Bands (in blue-green), to confirm
what we expect from the above equation:

We indeed see that whenever the Fractal Dimension crosses the 1.5 line (i.e. whenever H crosses the 0.5
mark), the respective bands cross as well. The Fractional Bands are therefore narrower for a side-market and
wider for a trendy market (even wider than the Fractal Bands for a very trendy market).

The script of Fractional Bands can be downloaded from this address.


The parameters for the Fractional Bands are the same as for the Fractal Bands, except that there is no α, and
in addition we have the following parameter:
PIP_Convertor (integer): the factor necessary to convert real price to PIPS, default is 10000 (for
EUR/USD)

As for the strategy, I am not sure whether there is one for this indicator alone; it seems to cross the prices
quite often, especially during a side-market. It may however be combined efficiently with the FGDI and/or
the Fractal Bands.

From Bollinger to Fractal Bands

The Bollinger Bands indicator is well known and interesting, as it provides entry and exit points.
It basically consists of an MA and two bands above and below it. Each band is classically placed at 2 standard
deviations away from the MA. If we assume that price variations follow a normal distribution, this ensures
that 95% of the prices will fall within the bands.
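A minimal sketch of that classical computation (plain Python, population standard deviation over a rolling window; the function name is mine):

```python
import math

def bollinger_bands(prices, period=20, n_std=2.0):
    """Classical Bollinger Bands: a simple moving average with bands
    n_std standard deviations above and below, over a rolling window."""
    upper, middle, lower = [], [], []
    for i in range(period - 1, len(prices)):
        window = prices[i - period + 1:i + 1]
        ma = sum(window) / period
        std = math.sqrt(sum((p - ma) ** 2 for p in window) / period)
        middle.append(ma)
        upper.append(ma + n_std * std)
        lower.append(ma - n_std * std)
    return upper, middle, lower

quotes = [1.30 + 0.001 * ((i * 7) % 5) for i in range(30)]
upper, middle, lower = bollinger_bands(quotes)
```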
_______________________________________________________________________________________

I-Some theoretical points

Keeping this assumption for now, the time series of price variations can be described by a Wiener Brownian
Motion with normal distribution N(0,t). It is interesting to see that the probability of the prices being within
the bands is equal to the probability of the maximum of the price (which we will name M(t)) being within
them, as shown below:

P(M(t) ≤ a) = P(|W(t)| ≤ a)
For more details and the justification of this formula, see my other blog.
We then see:

P(M(t) ≤ 2σ) = P(|W(t)| ≤ 2σ) = 2Φ(2) − 1 ≈ 0.95
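The identity between the running maximum M(t) and |W(t)| (a consequence of the reflection principle) can be checked numerically; here is a quick Monte Carlo sketch, with arbitrary parameters of my own choosing:

```python
import math, random

def max_vs_abs(n_paths=4000, n_steps=400, a=1.0, seed=42):
    """Monte Carlo check that P(M(t) <= a) = P(|W(t)| <= a) for a
    Wiener process W with running maximum M, here over t = 1."""
    rng = random.Random(seed)
    dt = 1.0 / n_steps
    within_max = within_abs = 0
    for _ in range(n_paths):
        w = m = 0.0
        for _ in range(n_steps):
            w += rng.gauss(0.0, math.sqrt(dt))
            if w > m:
                m = w                     # track the running maximum
        if m <= a:
            within_max += 1
        if abs(w) <= a:
            within_abs += 1
    return within_max / n_paths, within_abs / n_paths

p_max, p_abs = max_vs_abs()   # both should be near 2*Phi(1) - 1, about 0.68
```

The two empirical probabilities agree up to Monte Carlo noise and a small discretization bias (the discrete maximum slightly underestimates the continuous one).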
Such probabilities are calculated for the theoretical value of the standard deviation of the WBM; the
Bollinger Bands, however, calculate an empirical value for it, using the well-known formula:

s = √( (1/n) Σ_{i=1..n} (x_i − x̄)² )

Given this practical value and the theoretical one, we can equate the two:

s = √t
And knowing the theoretical standard deviation for a FBM (see there), namely t^H, we get the practical
standard deviation for a FBM (of Hurst parameter H) by substituting t = s²:

s_H = t^H = (s²)^H = s^(2H)   (1)
II-Implementation of Fractal Bands


A straightforward way to implement Fractal Bands seems to be to just take classical Bollinger Bands and
merely widen the bands by raising the standard deviation to the power of 2H. However, if we do that, here
is what we get (the MA is the FRASMAv2, the reference period is 30, and the blue bands are Bollinger
Bands for the same speed):

I don't find this indicator very useful (not useful at all, actually, for me). It seems necessary here to get some
perspective on how we wish to improve on the Bollinger Bands. From my point of view as a day trader, I
find the Bollinger Bands too narrow: the prices hit them too often, especially in a trending market, where I
would like to get a clear signal only when the trend is over. But with Bollinger Bands, most of the trend
occurs outside the bands, prompting me to close the trade much too early and basically inciting me not to
ride the trend.
Applying equation (1), however, we get the counter-productive effect of narrowing the bands when in a
trend, because, in our case of price variations, the standard deviation is much lower than 1 (this may not be
the case for stock exchanges, but it clearly is for FOREX), so raising it to a higher power decreases its value
instead of increasing it.
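A two-line numeric illustration of that narrowing effect (the values are arbitrary but of realistic FOREX magnitude):

```python
# A raw FOREX price standard deviation is typically far below 1, so
# raising it to a power 2H > 1 (trending market) shrinks it:
sigma = 0.0008                  # raw EUR/USD standard deviation (example)
trend = sigma ** (2 * 0.8)      # H = 0.8, trending market
side = sigma ** (2 * 0.4)       # H = 0.4, side market
# trend < sigma < side: the bands tighten exactly when we want them wide
```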
A way out of this quandary is simply to apply the following treatment to the standard deviation from the
Bollinger Bands, instead of the one from (1):

By taking α greater than 1, the higher our H, the wider the bands will be; here is what it leads to (with the
same setup as before, and α=2):

The script fractal_bands.mq4 can be downloaded from this address from the MQL4 site.
The input parameters of the indicator are as follows:
e_period (integer): This is the period considered for calculating the fractal dimension, default is 30.
normal_speed (integer): This is the speed of the SMA before being modified to become the FRASMA,
default is 30.
alpha (real): This is the alpha from equation (2), default is 2.
shift (integer): This is the number of bars the FRASMA is shifted to the right (positive) or to the
left (negative), default is 0.
e_type_data (0,1,2 or 3): This is the type of price the indicator will consider (0=CLOSE, 1=OPEN,
2=HIGH, 3=LOW), default is 0.

III-Strategical considerations

I have started using the Fractal Bands indicator, and I am very happy with it so far. The strategy is quite
straightforward.
I enter a BUY position after the price has rebounded from the lower band (after touching it) and crossed
the FRASMA; my Stop Loss is then set to the level at which the price hit the lower band, and my Take Profit
to the level at which the price hits the higher band.
Symmetrically, I enter a SELL position after the price has fallen from the higher band (after touching it)
and crossed the FRASMA, with the Stop Loss set at the level of the hit of the higher band, and the Take
Profit where the lower band is hit.
It is obviously possible (and even advisable) to make your Stop Loss trail the price changes.
I used this strategy for EUR/USD on a 5 minutes timeframe; using it on other timeframes or on other
instruments may require a different setup. Mine was to set the speed of the FRASMA at 30, and α=2 (in
equation (2) above); it is possible to change these values.
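The entry rules above can be sketched as a small decision function (a simplified illustration, not the actual EA logic; the band-touch flags would be tracked by the caller, and all names are mine):

```python
def entry_signal(touched_lower, touched_upper, prev_close, close, frasma):
    """BUY: the price rebounded from the lower band (touched it earlier,
    tracked by the caller) and has just crossed above the FRASMA.
    SELL: symmetric, from the higher band, crossing below the FRASMA.
    Stop Loss goes at the band-touch level, Take Profit at the
    opposite band."""
    if touched_lower and prev_close <= frasma < close:
        return "BUY"
    if touched_upper and prev_close >= frasma > close:
        return "SELL"
    return None

signal = entry_signal(True, False, 1.3490, 1.3505, 1.3500)
```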
Posted by Jean-Philippe at 5:00 AM

Fraternité

Some interesting remarks about fraternity came to my attention today, and I think they reveal an interesting
difference between a social policy and Socialism, a confusion that many seem to make, in one sense or
another.
First, here is a quote from Charles Péguy's "De Jean Coste", written in 1902 (first the original in French,
followed by my translation):
Le devoir d'arracher les misérables à la misère et le devoir de répartir également les biens ne sont pas du
même ordre : le premier est un devoir d'urgence ; le deuxième est un devoir de convenance ; non seulement
les trois termes de la devise républicaine, liberté, égalité, fraternité, ne sont pas sur le même plan, mais les
deux derniers eux-mêmes, qui sont plus rapprochés entre eux qu'ils ne sont tous deux proches du premier,
présentent plusieurs différences notables ; par la fraternité nous sommes tenus d'arracher à la misère nos
frères les hommes ; c'est un devoir préalable ; au contraire le devoir d'égalité est un devoir beaucoup moins
pressant ; autant il est passionnant, inquiétant de savoir qu'il y a encore des hommes dans la misère, autant
il m'est égal de savoir si, hors de la misère, les hommes ont des morceaux plus ou moins grands de fortune ;
je ne puis parvenir à me passionner pour la question célèbre de savoir à qui reviendra, dans la cité future,
les bouteilles de champagne, les chevaux rares, les châteaux de la vallée de la Loire ; j'espère qu'on
s'arrangera toujours ; pourvu qu'il y ait vraiment une cité, c'est-à-dire pourvu qu'il n'y ait aucun homme qui
soit banni de la cité, tenu en exil dans la misère économique, tenu dans l'exil économique, peu m'importe
que tel ou tel ait telle ou telle situation ; de bien autres problèmes solliciteront sans doute l'attention des
citoyens ; au contraire il suffit qu'un seul homme soit tenu sciemment, ou, ce qui revient au même,
sciemment laissé dans la misère pour que le pacte civique tout entier soit nul ; aussi longtemps qu'il y a un
homme dehors, la porte qui lui est fermée au nez ferme une cité d'injustice et de haine.
De Jean Coste, Charles Péguy, éd. Actes Sud Labor L'Aire, coll. Babel, 1993, p. 55

The duty to lift the destitute out of their misery and the duty to distribute wealth equally are not of the same
order: the former is a pressing duty; the latter is a desirable one; not only are the three terms of the republican
motto, liberty, equality, fraternity, not at the same level, but the last two themselves have several
important differences; by fraternity we are bound to lift our brothers, the men, out of misery; it is a prior duty;
on the contrary, the duty of equality is much less pressing; as passionately disturbed as I am to know
that there are still men in misery, I am just as indifferent to knowing whether, out of misery, men have larger
or smaller shares of wealth; I cannot manage to feel passionate about the famous question of knowing who
will get, in the future society, the bottles of champagne, the rare horses, the castles of the Loire Valley; I hope
we'll always find some arrangement; provided there really is a society, that is, provided nobody is
banned from it, kept in exile in economic misery, kept in an economic exile, never mind that this or that
one is in this or that situation; many other problems will no doubt request the attention of citizens; on the
contrary, it is enough that one man is knowingly kept, or, which amounts to the same, knowingly left in
misery for the whole social contract to be broken; as long as there is one man outside, the door that is shut
in someone's face secures a society of injustice and hatred.

I think it illustrates very well an aspect of our societies on which we can ponder with some profit. Fraternity
is really at the core of humanity and humanism, and its difference from equality is precisely parallel to the
one between a social policy and Socialism.

For the French speakers (and listeners), it is also interesting to listen to today's broadcast of Répliques on
France Culture:
Répliques du 2 Mai 2009: Penser la fraternité
Posted by Jean-Philippe at 12:47 AM
Labels: Art, Economics, Philosophy

Sunday, April 26, 2009


FRASMAv2

This is an updated version of the FRASMA discussed earlier. Its original logic is left untouched; I merely
updated it to take into account the calculation of the fractal dimension after the corrections I made in
FGDI. Also, following a request from a reader, I added a parameter "shift", which simply translates the
FRASMA either to the right (when "shift" is a positive integer) or to the left (when "shift" is a negative
integer).

Here is what the FRASMAv2 with a shift set to 10 looks like:

The script for metatrader of FRASMAv2 can be found here.


Posted by Jean-Philippe at 11:18 PM
Labels: FRASMA

Sunday, April 19, 2009


From D.H. Lawrence to Messiaen

Let me start this post with a quote from Lawrence's "Aaron's Rod", towards the end of the chapter "Florence",
wherein the hero Aaron plays a solo flute piece for the Marchesa, who used to be a dilettante singer
(contralto), but is now (after WW1) in a sort of downbeat mood, and feels nausea when listening to music
(especially orchestral music):
...And there, in the darkness of the big room, he put his flute to his lips, and began to play. It was a clear,
sharp, lilted run-and-fall of notes, not a tune in any sense of the word, and yet a melody, a bright, quick
sound of pure animation, a bright, quick, animate noise, running and pausing. It was like a bird's
singing, in that it had no human emotion or passion or intention or meaning--a ripple and poise of
animate sound. But it was unlike a bird's singing, in that the notes followed clear and single one after the
other, in their subtle gallop. A nightingale is rather like that--a wild sound. To read all the human pathos
into nightingales' singing is nonsense. A wild, savage, non-human lurch and squander of sound,
beautiful, but entirely unaesthetic.

What Aaron was playing was not of his own invention. It was a bit of mediaeval phrasing written for the
pipe and the viol. It made the piano seem a ponderous, nerve-wracking steam-roller of noise, and the
violin, as we know it, a hateful wire-drawn nerve-torturer.

After a little while, when he entered the smaller room again, the Marchesa looked full into his face.

"Good!" she said. "Good!"

And a gleam almost of happiness seemed to light her up. She seemed like one who had been kept in a
horrible enchanted castle--for years and years. Oh, a horrible enchanted castle, with wet walls of
emotions and ponderous chains of feelings and a ghastly atmosphere of must-be.
She felt she had seen through the opening door a crack of sunshine, and thin, pure, light outside air,
outside, beyond this dank and beastly dungeon of feelings and moral necessity. Ugh!--she shuddered
convulsively at what had been. She looked at her little husband.
Chains of necessity all round him: a little jailor. Yet she was fond of him. If only he would throw away the
castle keys. He was a little gnome. What did he clutch the castle-keys so tight for?

Aaron looked at her. He knew that they understood one another, he and she. Without any moral necessity
or any other necessity. Outside--they had got outside the castle of so-called human life. Outside the
horrible, stinking human castle of life. A bit of true, limpid freedom. Just a glimpse.

It is always difficult to discuss such a passage without, somehow, destroying its charm. I will therefore limit
myself to providing a few directions through which its understanding may be deepened (or so it is for me).

First, I'd like to qualify a little the rather harsh judgment about the piano, by referring to composers such as
Satie (one may also relate the mediaeval flavour of what Aaron plays to Satie's world) or Mompou, who
found a voice for it that does not deserve to be called ponderous or nerve-wracking, and to Messiaen, who
seemed to echo Lawrence's comparison with birdsong by composing his "Catalogue d'oiseaux", mostly
written for piano, even though the first piece of this collection can be said to be "Le
merle noir", itself composed primarily for the flute (with piano accompaniment).

On the other hand, the piano indeed has a tendency towards grandiloquence, from which the flute seems
immune. One may think of Japanese music, and of the often central part played by the shakuhachi (bamboo
flute); that may be the best approach to entering the "out-of-life" world (though I disagree with this
characterization) Lawrence is talking about in this passage. The wonderful recording by Lily Laskine and
Jean-Pierre Rampal came readily to mind while I was reading these lines.

But before the "Catalogue d'oiseaux", even before "Le merle noir", there was Messiaen's "Préludes pour
piano", whose first piece is called "La Colombe", already a bird, even if this one is a metaphor for
Messiaen's mother. This piece, at least for me, particularly resonates with Lawrence's point.
Posted by Jean-Philippe at 2:43 AM
Labels: Art, Classical Music, Philosophy
Friday, April 17, 2009
Fractal dimensions...And a Fractal Graph Dimension Indicator

I have already alluded to the possible confusion with regard to what the fractal dimension exactly is, and
even though I always try to clarify the kind of fractal dimension I am considering in a given context, I have
never provided a detailed discussion of this problem. So here it is: in this overview, I am going to discuss
the various definitions of this entity, and give some references which examine their relationships in more
detail. Eventually, I shall provide a new indicator that slightly improves on the previous calculation of the
fractal dimension of a graph.

1) Hausdorff Dimension (or Besicovitch-Hausdorff Dimension).


This is the oldest and most mathematically convenient definition of the fractal dimension of an object, but it
is also extremely difficult to calculate exactly for most objects, especially those that are not exactly self-
similar, which is basically the case for all interesting objects in any applied domain.
We first need to define a measure of an object F as such:

H^s_δ(F) = inf { Σ_i |U_i|^s : {U_i} is a δ-cover of F }   (1)

Where a δ-cover is a countable (or finite) collection of sets of diameter at most δ that covers F.
The s-dimensional Hausdorff measure of F is then defined as:

H^s(F) = lim_{δ→0} H^s_δ(F)

The Hausdorff Dimension is then defined as:

dim_H F = inf { s ≥ 0 : H^s(F) = 0 } = sup { s : H^s(F) = ∞ }
The difficulty in computing this quantity lies in the definition of a δ-cover. The sets of the collection do not
necessarily have a diameter of δ; on the contrary, it will frequently happen that an optimal collection (in the
sense of optimizing equation (1)) has sets with a diameter much smaller than δ, and making explicit the
logic behind such a construction is only possible for extremely simple sets (typically, sets that are explicitly
built through a well-known iterative process). That is obviously not the case for sets found in practice as
models of real phenomena.
This difficulty can be overcome by the Box-counting Dimension, to which I now come. For more details
about the Hausdorff Dimension, see Chapter 2 in [FALC03].

2) Box-counting Dimension (or Kolmogorov Entropy, Entropy Dimension, Capacity Dimension,
Metric Dimension, Logarithmic Density and Information Dimension)

The Box-counting Dimension can be defined simply as:

dim_B F = lim_{δ→0} log N_δ(F) / (−log δ)

Where N_δ(F) can be any of the following (non-exhaustive list):


- The smallest number of closed balls of radius δ that cover F;
- The smallest number of cubes of side δ that cover F;
- The number of δ-mesh cubes that intersect F;
- The smallest number of sets of diameter at most δ that cover F;
- The largest number of disjoint balls of radius δ with centres in F.
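The mesh-cube variant lends itself directly to an estimation procedure; here is a minimal Python sketch (the δ grid and the least-squares fit are my own choices):

```python
import math

def box_counting_dimension(points, deltas):
    """Estimate the Box-counting Dimension of a planar set by counting
    the delta-mesh boxes it intersects, then fitting log N(delta)
    against -log(delta) by least squares."""
    xs, ys = [], []
    for d in deltas:
        boxes = {(math.floor(px / d), math.floor(py / d)) for px, py in points}
        xs.append(-math.log(d))
        ys.append(math.log(len(boxes)))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# a straight segment should come out with dimension close to 1
segment = [(i / 10000.0, i / 10000.0) for i in range(10001)]
dim = box_counting_dimension(segment, [0.1, 0.05, 0.025, 0.0125])
```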

From the definitions of both the Hausdorff and the Box-counting Dimension, it is easy to see intuitively
(from equation (1)) that:

dim_H F ≤ dim_B F

For a formal proof of this and more details about the Box-counting Dimension, see Chapter 3 in [FALC03].
There are some other alternative definitions of the fractal dimension, but so far I have not seen applications
of those to finance, and therefore I will not mention them here; see [FALC03] for a short overview.

3) Fractal Graph Dimension Indicator


I have already referred to the code written by iliko that implemented a calculation of the fractal dimension.
This computation is actually inspired by this article, which provides a method to estimate the Box-counting
Dimension (and not directly the Hausdorff Dimension, as is claimed in the article itself) (see equation (6) in
the article).
I however noticed two slight mistakes in iliko's code:

- At line 199:
Instead of : for( iteration=0; iteration < g_period_minus_1; iteration++ )
It should be : for( iteration=0; iteration <= g_period_minus_1; iteration++ )

- At line 213:
Instead of : fdi=1.0 +(MathLog( length)+ LOG_2 )/MathLog( 2 * e_period );
It should be : fdi=1.0 +(MathLog( length)+ LOG_2 )/MathLog( 2 * g_period_minus_1 );

After this correction, however, there is not much change in the indicator itself.
In addition, I added a calculation of the standard deviation of the fractal dimension so estimated; it is also
given in the article as equations (10) and (11), and it may provide information for a more precise entry
point for a trade.
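For reference, the corrected estimate can be transcribed into Python as follows (an illustrative sketch of the formula, not the indicator code; the normalisation of the window to the unit square follows the estimation method the code implements):

```python
import math, random

def fgdi(prices):
    """Fractal dimension of a price graph following the corrected
    formula above: rescale the window to the unit square, measure the
    path length L, then D = 1 + (ln L + ln 2) / ln(2 * (n - 1))."""
    n = len(prices)
    lo, hi = min(prices), max(prices)
    if hi == lo:
        return 1.0
    dx = 1.0 / (n - 1)                        # horizontal step after rescaling
    ys = [(p - lo) / (hi - lo) for p in prices]
    length = sum(math.sqrt((ys[i + 1] - ys[i]) ** 2 + dx * dx)
                 for i in range(n - 1))
    return 1.0 + (math.log(length) + math.log(2.0)) / math.log(2.0 * (n - 1))

rng = random.Random(0)
d_ramp = fgdi([i * 0.0001 for i in range(31)])      # smooth trend
d_noise = fgdi([rng.random() for _ in range(31)])   # noisy series
```

The smooth ramp scores markedly lower than the noisy series; note that the estimate converges to the true dimension only slowly as the period grows.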
The MQ4 file of the FGDI Indicator can be downloaded from this address in the MQL4 Community forum.

Here is a daily EUR/USD chart representing this new indicator along with the FRASMA, and the original
fractal dimension by iliko (lower window):

Posted by Jean-Philippe at 5:51 AM

FX Scaling Laws

This article by Glattfelder, Dupuis and Olsen, brought to my attention by a reader, proposes an empirical set
of scaling laws that apply to FX markets.
After considering them, with a view to devising an interesting indicator for trading, the problem appears to
be that these laws mostly concern averages taken over 5 years, which is a serious limitation for their
applicability over a short period of time.
Nonetheless, I identified one, law (12), that may be of interest, provided some more work is done:
This law (applied to the total move, *=tm) gives the length of the coastline for a given pair over a year of
activity (250 days) as a percentage, relative to a resolution defined as the directional-change threshold (cf.
chapter 2.3 in the article).
Considering the case without transaction costs (an assumption, I think, justified by the small scale
considered), I then look at Table A19 to find the parameters of the law for the currency pair I am
interested in. In the following, I will consider EUR/USD, which is the pair I trade most often; the law
therefore becomes:

As I am interested in moves around 10 PIPs, I shall then consider a resolution of 0.001 for EUR/USD, so:

This gives me a resolution between 12 and 14 PIPs (for the current value of the EUR/USD), since 0.001 is
a relative threshold (0.1% of the exchange rate).
As a result, I get:

This is the annualised length of the coastline. I am more interested in this length over 15 minutes, so I have
to divide it by 250*24*4, for a result of:

This is equal to about 520 PIPs (taking 1.35 for the EUR/USD) as the length of the coastline over 15 minutes.
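The unit conversion in this paragraph can be laid out explicitly. The annual coastline value below is a hypothetical placeholder (the actual law output is not reproduced here, so I picked a value that lands near the ~520 PIPs figure); everything else comes from the text:

```python
# Fixed numbers from the text: 250 trading days, 24 hours, four
# 15-minute windows per hour, EUR/USD at 1.35, PIP factor 10000.
annual_coastline_pct = 92500.0   # hypothetical annualised coastline, in %
eurusd = 1.35                    # EUR/USD rate assumed in the text
pip_convertor = 10000            # price-to-PIP factor for EUR/USD
windows_per_year = 250 * 24 * 4  # 15-minute windows in a trading year
per_window_pct = annual_coastline_pct / windows_per_year
pips_15mn = per_window_pct / 100.0 * eurusd * pip_convertor
```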

This information is the best I can extract so far from the scaling laws described in the article. It may be used
to determine the width of a channel (volatility), though even for this it needs to be included in further
calculations (which will likely use the Graph Dimension, or the Hurst exponent). I am currently thinking of
ways to do that, and will publish any success I may have with this line of thought in the future.
Posted by Jean-Philippe at 1:16 AM
Labels: Scaling Laws

Friday, March 6, 2009


Is bargaining anti-capitalistic ?

Let me indulge a bit more in some economic ranting while I am still on holiday.

It is easy to verify that bargaining is most popular in places that are less developed in terms of capitalism,
and that the more a country "progresses" in accepting the principles of modern capitalism, the more the
activity of bargaining disappears. It may almost seem like a paradox, but is it really one?
I have come to think of a possible explanation for this phenomenon; whether it accounts for it totally or only
partially can certainly be a matter of debate.

Bargaining is properly a confrontation between one offer and one demand; it is a highly individualistic
process. Even so, the seller can back his side of the exchange by a direct reference to the overall
demand for the specific product, and on this ground he will argue for a higher price than the customer is
ready to pay. On the other hand, the customer can argue that this overall demand is merely virtual, projected,
but ultimately unrealized in the very short term, while his present purchase of the goods means immediate,
actual money for the seller.

That is how it used to be in traditional societies, in those areas where the exchange of goods fell beyond the
reach of the despotic rulers. It seems odd, then, to think that an extension of the domain of free exchange
(Capitalism) has entailed a near disappearance of bargaining.
Bargaining assumes that the price of a commodity is open to debate: it is not a static given of the transaction
but, on the contrary, a dynamic component of it. Opposite to this obviously lies the principle that any given
commodity has a fair ("natural") price. If, nowadays, a customer intends to bargain, the selling person (who
is likely to work for a salary, not even indexed to his selling performance) can simply reply that the displayed
price is already the optimal price, and that there is nothing better to hope for.

One may then say that the near disappearance of bargaining is simply an effect of mass consumption
and the bureaucratization of the modern world, and that it has nothing to do with Capitalism. I believe
Schumpeter[1] might disagree with that with regard to the origins of modern bureaucracy, which he saw as a
manifestation of the rationalization of economic and social life (the latter being largely conditioned by
the former in a capitalist system).
So, even if bargaining could have survived the rational theory of commodity exchange that developed
after Ricardo and evolved into the neo-classical theory, and its widespread acceptance by our societies, it
seems difficult to imagine that it could have survived their multifarious, pervasive effects.

I would therefore say that bargaining is NOT anticapitalistic; I believe it is, on the contrary, the most
genuinely capitalistic activity one can think of: it is the epitome of individual freedom at the level of the
most elementary economic transaction, the freedom of agreeing on a price.
Clearly, this freedom is not denied in the direct sense of fixing the prices of goods by law, as might be
thought of in Marxist-inspired societies, but an indirect influence is just as powerful and much more difficult
to identify. Prices are also fixed in modern Capitalism, by sophisticated economic theories about which
Georg Lukács once said that a statue should be erected to their authors in front of every ministry of
economy in the communist countries, because they are the main contributors to the practice of state
socialism (I think it is Lukács, but if someone wants to correct me and can cite the exact quote, I will be
happy to correct this post accordingly).

What bargaining is clearly incompatible with is the ideology that affirms the existence of an objective
natural price, in a sense not far from the existence of a natural law. It is that ideology that takes away from
the individual negotiation the freedom of fixing the price of an individual transaction.

On a side note, the sociological dimension of bargaining could also be an interesting topic of discussion. I
mean by that the way such an activity exceeds the merely utilitarian aspect of commodity exchange and
may be a strong basis for building or consolidating a network of social human relationships, with diplomacy
and common understanding as a basis. Maybe somebody can point me towards some authors who
investigate these aspects.

References:
[1]: Capitalism, Socialism and Democracy
Posted by Jean-Philippe at 11:59 AM
Labels: Economics, Philosophy

Thursday, March 5, 2009


For a deontological code in Finance

I came across the following article[1], which provides an analysis of the responsibility of Finance and
Economics academia with regard to the current crisis; one of their conclusions is as follows:
A second, more likely explanation, is that they did not consider it their job to warn the public. If that
is the cause of their failure, we believe that it involves a misunderstanding of the role of the economist,
and involves an ethical breakdown. In our view, economists, as with all scientists, have an ethical
responsibility to communicate the limitations of their models and the potential misuses of their
research. Currently, there is no ethical code for professional economic scientists. There should be one.

I certainly agree with that, but for such warnings about the models to be heard in the capitalist world we are
living in, they must be broadcast quite loudly, and even enforced by some sort of regulation. Some
people just don't want to hear certain truths, especially when these are liable to jeopardize their multi-
million bonuses. Let's keep in mind that most financial researchers are funded by these people (directly or
indirectly), and that they are therefore cordially invited to present results that are pleasing to their
benefactors.
If speculators pay for the financial research done in academia, is it such a big surprise to find that this
research tends to show the harmlessness of speculation?

While the overall article is interesting, I'd like to comment a bit on the following:
Of course, considerable progress has been made by moving to more refined models with, e.g., fat-
tailed Levy processes as their driving factors. However, while such models better capture the intrinsic
volatility of markets, their improved performance, taken at face value, might again contribute to
enhancing the control illusion of the naïve user.

The user who thinks that Levy processes may somehow enhance his control is not naïve, he is ignorant of
what a Levy process is all about. A Levy process tells us precisely that we have less control over what's
going on, and in particular, it invalidates the dynamic hedging strategy inspired by the work of Black,
Merton and Scholes. Furthermore, this invalidation is not a matter of opinion, it is a matter of mathematical
correctness: as Haug and Taleb have shown in the previously cited article (Haug and Taleb, November
2007), a Levy distribution entails such a weakening of the Central Limit Theorem that the hypothesis
(finite variance) that makes dynamic hedging possible becomes false.
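To make the finite-variance point concrete, here is a small Python sketch (synthetic, illustrative data only): the running sample variance of Gaussian returns settles near its true value, while that of heavy-tailed Pareto-type returns (a stand-in for a Levy tail exponent below 2, where the theoretical variance is infinite) stays at the mercy of the next extreme draw.

```python
import random

random.seed(42)

def running_variance(samples):
    """Sample variance of each growing prefix of `samples`."""
    out, s, s2 = [], 0.0, 0.0
    for n, x in enumerate(samples, start=1):
        s += x
        s2 += x * x
        out.append(s2 / n - (s / n) ** 2)
    return out

N = 100_000
# Gaussian "returns": finite variance, the Central Limit Theorem applies cleanly.
gaussian = [random.gauss(0.0, 1.0) for _ in range(N)]
# Symmetric Pareto-type "returns" with tail exponent 1.5 < 2:
# the theoretical variance is infinite.
heavy = [random.paretovariate(1.5) * random.choice([-1.0, 1.0]) for _ in range(N)]

g_var = running_variance(gaussian)
h_var = running_variance(heavy)

# The Gaussian estimate settles near the true value 1.0; the heavy-tailed
# estimate keeps jumping whenever a rare extreme draw arrives.
print(round(g_var[-1], 2))
print(round(h_var[10_000], 1), round(h_var[-1], 1))
```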

And last but not least, it would be unfair not to mention the existence of the Truncated Levy Process (TLF),
which seemingly resolves the "inconvenience" of the Levy Process with regard to its infinite variance, and
therefore brings it back within the scope of validity of the Central Limit Theorem, making dynamic hedging
possible again. This is indeed what Andrew Matacz aims at achieving in this article[2].
While I don't question the value of the mathematical parts of the article, I wonder about their applicability
from an investment point of view. There is indeed a profound ethical problem at play here, rooted
in the belief in the possibility of a riskless strategy (which is at the core of dynamic hedging). There can't
be such a strategy, because if there were one, its implementation would invalidate it (the statistical model of a
market is always historical; the market can perfectly well shift from one model to another, it is not causally
determined to stay within the limits of one precise model).
A riskless strategy is potentially the equivalent of the perpetual motion machine in mechanics: to use it may
well lead to its destruction (and also create a speculative bubble in the process).
On the other hand, the study of the TLF is interesting and should be pursued, but it is necessary to separate this
study from the sole motivation of creating an investing edge in the market (again the problem of deontology
and financing creeps back). In this sense, the approach of Cont, Potters and Bouchaud in this article[3] from
May 1997, which displays as its primary concern the fit with existing data, appears more promising.

References:
[1]: The Financial Crisis and the Systemic Failure of Academic Economics
[2]: Financial Modeling and Option Theory with the Truncated Levy Process
[3]: Scaling in stockmarket data: Stable laws and beyond
Posted by Jean-Philippe at 9:55 AM
Labels: Economics, Philosophy, Truncated Levy Process

Wednesday, February 25, 2009


An ongoing discussion

In relation to my last post, about the "flapping butterflies", a discussion is going on between Duc and myself
on his site. It takes place over several posts, so it is a bit difficult to follow, but I mention it in case some of
my readers are interested.

Also, I updated the format of comments here, so that anybody can now post one, even anonymously if one
wishes. I actually did not realize earlier that there were some limitations on this.
Posted by Jean-Philippe at 12:18 PM

Saturday, February 21, 2009


Flapping butterflies don't make hurricanes (A critical view of the 2008-2009 crisis)

Many analysts have provided, are providing, and will for some time continue to provide explanations of the
current crisis, often concluding by sketching some remedies to it, or at least the system that should be
implemented in the future to avert a similar situation. I seldom totally disagree with those explanations, but I
even more rarely totally agree with them, and I almost never share their sketches of a solution.
Ultimately though, I think the core problem is seldom touched at all.

As for the elements commonly incriminated in the crisis, here are a few, in no particular order:
- CDS and their unregulated practice
- SubPrime loans and their securitization
- Expansionary monetary policy of the central banks (primarily the one from USA)
- Intervention of the US government to promote access to home ownership (primarily the Community
Reinvestment Act)
- The carelessness of the Credit Ratings Agencies
- The dogmatic culture in financial mathematics (relying on a Gaussian model) that promoted risky strategies
by presenting them as riskless
- American over-consumerism and over-reliance on credit
...etc.

Depending on the analyst, some of these phenomena will be emphasized, others may simply be ignored or
neglected, but each will be weighted so as to rationalize a judgment that often appears to have preexisted
any fair analysis, and the rhetoric betrays more or less clearly a whole set of prejudices that is not very
difficult to relate to a school of economics.
It is clear to me that all these elements (and many others) have played a role at one time or another in the
unfolding of the crisis; it is however very difficult and hazardous to identify their relative importance.

Rather than contributing to this debate by merely adding my own prejudices and rationalization, I will try
here to bring up a few elements that I have not seen often mentioned (if at all).

1) I saw many who tried to put the key responsibility for the crisis on government intervention, some
defending the point that in the absence of such intervention, crises would simply not develop at all, at least
not to any significant level. This idea is simply false and was demonstrated to be so in 1966 by
Mandelbrot in an article[1], reprinted as chapter E19 in [MAN97]. In the reprint, Mandelbrot includes
the following foreword:
Two terms are found in the title of this reprint, but not of the originals, namely "nonlinear" and
"rational bubble". They express the two main points of this paper in words that were not available to
me in 1966.
The main substantive finding was that rational behavior on the part of the market may lead to "wild"
speculative bubbles (...). The randomness of these bubbles is called "wild" in my present vocabulary,
because they can be extremely large, and their sizes and durations follow a scaling distribution. This
distribution is closely akin to the L-stable distribution introduced in the model of price variation
presented in M 1963b.

In it, Mandelbrot demonstrates how speculative bubbles occur "naturally" in a market. While it is
very possible that some interventions will facilitate bubbles, this mere possibility allows for the opposite
one: that some interventions can also diminish the intensity of bubbles, and even prevent their appearance or
their violent burst.
The prejudice that roots speculative bubbles in government intervention (read as disturbances of an
otherwise well-balanced market) is untenable.

2) One reading of this crisis is that of the failure of dynamic hedging. I can't testify to the importance
of this failure and its relevance in this crisis, but if I am to believe Espen Gaarder Haug and Nassim
Nicholas Taleb in this article[2], and if dynamic hedging was used in any systematic way by the main
financial institutions, there is certainly some kind of responsibility to be found here.
At the root of the popularity of dynamic hedging, there is again the dogma that markets are inherently
Gaussian, and never drift into fat-tailed behavior (where serious bubbles form and burst). This
is obviously a denial of the reality of their nature, a nature that has been largely documented over the last 40
years, and clearly displays chaotic behavior.

3) Another type of analyst, while recognizing that crises do occur in an unhampered
market, will argue that any intervention can only make things worse: human minds simply cannot
understand the full effect of their actions, and in a complex system such as the economy, they had better
abstain from any attempt to act.
I can't help seeing the fundamentally religious mindset behind such a position, in that it hypostasizes the
market into an order beyond human understanding, one that seems to exist in a transcendental realm: from a
mere metaphor, the "invisible hand" suddenly becomes the Logos, the infallible organizing principle.
This rationale, though, hinges on a misunderstanding of the "Butterfly Effect". This famous effect is known
by most, and for most, it is the only thing they know about Chaos Theory (and fractals) and the dynamics
of complex systems; but no butterfly ever created a hurricane. The image is simply that, again, a metaphor,
to say that very slight disturbances may contribute to (rather than create) unforeseen catastrophic effects. It
does not mean that they always do so, or even that human understanding cannot have any control over the
most adverse of these effects. Real complex systems have some level of tolerance, of self-regulation at a
local level, of resilience (to use a fashionable term). We may not control the weather, but we can open an
umbrella so as not to get wet when it rains, and it does not make the rain any heavier.
Human beings are acting, whether in relation to the weather or in relation to the market; there is no such
thing as an unhampered market, because there is no such thing as a market without human actions.
The question is whether we should think through those interventions in a rational manner, from a social
point of view, or whether we should leave each individual to impose themselves in the market on the basis
of their luck, intelligence and birth, and leave the big picture to the care of the "invisible hand" (if one has
faith in its omnipotence with regard to this context, this faith anyway falls beyond rationality).

In conclusion, let me state my opinion, which may be prejudiced, but if so, I welcome any criticism of it.
I see the root of the current crisis in this core belief, of a religious nature, about the market (as self-regulated
by the "invisible hand"), which led many people to ignore what the market really was, because it was
inconvenient for them to acknowledge it (its chaotic nature went against the belief).
The origin of this credo can be found in the Cold War (which provided a propitious intellectual climate for
such a faith to flourish: against the religious socialism of the communist bloc, a religious form of
capitalism was seen as most welcome), and more precisely in the Neo-conservative ideology that succeeded
in fusing several elements of economic thought mostly coming from the Austrian school, Monetarism and
Libertarianism; it further blended these elements with the US Christian movements that spread from (or were
heavily influenced by) Calvinism, Pietism, Methodism and Baptism (cf. Max Weber[3] on the historical
link between Protestant sects and Capitalism).
As a result, a very dogmatic and religious ideology came into play as the official economic philosophy of
American politics (beyond traditional party lines) and even found strong supporters in western Europe (until
recently, Sarkozy and Berlusconi were among them). It found its natural expression in a minimization and
constant undermining of political power (and of the legitimacy of democracy, and therefore of democratic
intervention), to the profit of economic institutions (not submitted to the control of the public in any way)
and capitalist actors, the latter often providing the very people in control of the former; a kind of crony
"democracy" and neocorporatism (closely akin to its fascistic counterpart) developed on this
basis. This phenomenon is well documented, as early as the late 80s, by Habermas in Ecrits Politiques[4]
(sorry, I don't know the English version or even whether there is one).
It is ultimately this ideology that I rank as holding the primary responsibility for the current situation;
and one may still see its influence at work in the ways the crisis is analysed, and recommendations are made
to decrease even further the influence of government in the economic realm.

References:
[1]: Forecasts of Future Prices, Unbiased Markets, and "Martingale" Models
[2]: Why We Have Never Used the Black-Scholes-Merton Option Pricing Formula
[3]: The Protestant Ethic and the Spirit of Capitalism
[4]: Ecrits politiques
Posted by Jean-Philippe at 3:58 PM
Labels: Economics, fractals, Philosophy

The speed of the FRAMA (Part 1)

Earlier, I mentioned the logic behind the FRAMA (Fractal Adaptive Moving Average), and merely referred
to John Ehlers' article. Here I wish to examine and discuss this logic in a bit more detail.

John Ehlers recommends linking the speed of an exponential moving average to the fractal dimension by
making the coefficient α a function of this dimension via the following formula:

α = exp(−4.6 (D − 1))

Let's accept this formula for now, and consider the question of whether to apply this modification to
an exponential moving average (EMA) or to a simple moving average (SMA).
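As a quick numerical check, here is a small Python sketch of Ehlers' coefficient formula, α = exp(−4.6 (D − 1)) (the constant −4.6 comes straight from his article); the values show why a dimension of 2 slows the average about 100-fold:

```python
import math

def ehlers_alpha(D):
    """Ehlers' EMA coefficient as a function of the fractal dimension D."""
    return math.exp(-4.6 * (D - 1.0))

# D = 1.0 (straight trend):  alpha = 1,    the EMA tracks price exactly.
# D = 1.5 (Brownian-like):   alpha ~ 0.1.
# D = 2.0 (wildest noise):   alpha ~ 0.01, the EMA is slowed ~100-fold.
for D in (1.0, 1.5, 2.0):
    print(D, round(ehlers_alpha(D), 4))
```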

The purpose of the EMA is to give more weight to the most recent price variations. This is a fair concern for
the medium or long-term trader; I feel it is however a much less interesting feature for the intraday trader,
who has to cope with a lot of noisy, meaningless fluctuations, and relies on the moving average precisely to
avoid being distracted by this noise.
Besides, if we look at what happens for a high fractal dimension (approaching 2), the coefficient α is going
to be very small (around 0.01, see the FRAMA article referenced earlier), and the EMA will then be slowed
down. But we also know that such a high fractal dimension coincides with the wildest noise, and
therefore very high variations of prices. What is the point of slowing down the EMA on one hand,
while on the other hand it puts a higher weight on the most recent, wildest price variations, thereby
reflecting the wildness?

The two ideas clearly seem to conflict, and the resulting signal appears to be an ambiguous compromise
where the exponential endeavors to speed up the moving average (by emphasizing the most recent
variations) while the fractal dimension endeavors to slow it down.

I therefore prefer, especially as an intraday trader, to fractalise a SMA directly, and thereby get a direct and
readable translation of the information implicit in the fractal dimension. This can easily be achieved by
simply dividing the period of the SMA by the coefficient α.
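A minimal Python sketch of this idea (hypothetical price data; the α formula is Ehlers', the rescaling of the SMA period is the modification just described):

```python
import math

def ehlers_alpha(D):
    """Ehlers' coefficient from the fractal dimension D."""
    return math.exp(-4.6 * (D - 1.0))

def frasma_period(base_period, D):
    """Divide the SMA period by alpha: a high dimension stretches the window."""
    return max(2, int(round(base_period / ehlers_alpha(D))))

def sma(prices, period):
    """Plain simple moving average over the last `period` prices."""
    window = prices[-period:]
    return sum(window) / len(window)

prices = [100.0 + 0.1 * i for i in range(300)]  # hypothetical trending data

print(frasma_period(10, 1.0))  # trending market: the period stays at 10
print(frasma_period(10, 2.0))  # wild noise: the period is stretched ~100-fold
print(sma(prices, frasma_period(10, 1.5)))
```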

Complement following a remark by Cool here:

In reply to Cool's remark, here is a graph representing the FRAMA from Ehlers in yellow, and the same
FRAMA using a more precise calculation of the fractal dimension in red. Both FRAMAs are exponential
MAs with a reference period of 10; their only difference is in the way the fractal dimension, and therefore
the coefficient α, is calculated:

Yellow curve:
The fractal dimension is computed from the following equation:

D = (log(N1 + N2) − log(N3)) / log(2)

where N1 = (HighestPrice − LowestPrice) over the interval from 0 to T, divided by T; N2 = (HighestPrice −
LowestPrice) over the interval from T to 2T, divided by T; and N3 = (HighestPrice − LowestPrice) over the
entire interval from 0 to 2T, divided by 2T; and

α = exp(−4.6 (D − 1))

Red Curve:
The fractal dimension (FDI) is computed from the following equation:

FDI = 1 + (log(Length) + log(2)) / log(2 (n − 1))

where n is the period and Length is the length of the price curve, rescaled to the unit square over the
window of n prices.
Here are the two MT4 listings.

For the original Elhers FRAMA (Yellow Curve):


//+------------------------------------------------------------------+
//| FRAMA.mq4 |
//| Rosh |
//| http://www.alpari-idc.ru/ru/experts/articles/ |
//+------------------------------------------------------------------+
#property copyright "Rosh"
#property link "http://www.alpari-idc.ru/ru/experts/articles/"

#property indicator_chart_window
#property indicator_buffers 1
#property indicator_color1 DarkBlue
//---- input parameters
extern int PeriodFRAMA=10;
extern int PriceType=0;
//PRICE_CLOSE 0
//PRICE_OPEN 1
//PRICE_HIGH 2
//PRICE_LOW 3
//PRICE_MEDIAN 4 , (high+low)/2
//PRICE_TYPICAL 5 , (high+low+close)/3
//PRICE_WEIGHTED 6 , (high+low+close+close)/4

//---- buffers
double ExtMapBuffer1[];
//+------------------------------------------------------------------+
//| Custom indicator initialization function |
//+------------------------------------------------------------------+
int init()
{
//---- indicators
SetIndexStyle(0,DRAW_LINE);
SetIndexBuffer(0,ExtMapBuffer1);
SetIndexEmptyValue(0,0.0);
//----
return(0);
}
//+------------------------------------------------------------------+
//| Custom indicator deinitialization function |
//+------------------------------------------------------------------+
int deinit()
{
//----

//----
return(0);
}
//+------------------------------------------------------------------+
//| |
//+------------------------------------------------------------------+
double Price(int shift)
{
//----
double res;
//----
switch (PriceType)
{
case PRICE_OPEN: res=Open[shift]; break;
case PRICE_HIGH: res=High[shift]; break;
case PRICE_LOW: res=Low[shift]; break;
case PRICE_MEDIAN: res=(High[shift]+Low[shift])/2.0; break;
case PRICE_TYPICAL: res=(High[shift]+Low[shift]+Close[shift])/3.0; break;
case PRICE_WEIGHTED: res=(High[shift]+Low[shift]+2*Close[shift])/4.0; break;
default: res=Close[shift];break;
}
return(res);
}

//+------------------------------------------------------------------+
//| Custom indicator iteration function |
//+------------------------------------------------------------------+
int start()
{
double Hi1,Lo1,Hi2,Lo2,Hi3,Lo3;
double N1,N2,N3,D;
double ALFA;
int limit;
int counted_bars=IndicatorCounted();
if (counted_bars==0) limit=Bars-2*PeriodFRAMA;
if (counted_bars>0) limit=Bars-counted_bars;
limit--;

//----
for (int i=limit;i>=0;i--)
{
Hi1=High[iHighest(Symbol(),0,MODE_HIGH,PeriodFRAMA,i)];
Lo1=Low[iLowest(Symbol(),0,MODE_LOW,PeriodFRAMA,i)];
Hi2=High[iHighest(Symbol(),0,MODE_HIGH,PeriodFRAMA,i+PeriodFRAMA)];
Lo2=Low[iLowest(Symbol(),0,MODE_LOW,PeriodFRAMA,i+PeriodFRAMA)];
Hi3=High[iHighest(Symbol(),0,MODE_HIGH,2*PeriodFRAMA,i)];
Lo3=Low[iLowest(Symbol(),0,MODE_LOW,2*PeriodFRAMA,i)];
N1=(Hi1-Lo1)/PeriodFRAMA;
N2=(Hi2-Lo2)/PeriodFRAMA;
N3=(Hi3-Lo3)/(2.0*PeriodFRAMA);
D=(MathLog(N1+N2)-MathLog(N3))/MathLog(2.0);
ALFA=MathExp(-4.6*(D-1.0));
ExtMapBuffer1[i]=ALFA*Price(i)+(1-ALFA)*ExtMapBuffer1[i+1];
}
//----
return(0);
}
//+------------------------------------------------------------------+

For the FRAMA modified with a different fractal dimension calculation (Red Curve):

//+------------------------------------------------------------------+
//| FRAMA2.mq4 |
//| Copyright 2008, MetaQuotes Software Corp. |
//| http://www.metaquotes.net |
//+------------------------------------------------------------------+
#property copyright "Copyright 2008, MetaQuotes Software Corp."
#property link "http://www.metaquotes.net"

#property indicator_chart_window

#property indicator_color1 Red


#property indicator_width1 2
//************************************************************
// Input parameters
//************************************************************
extern int e_period =10;
extern int normal_speed =10;
extern int e_type_data =PRICE_CLOSE;
//************************************************************
// Constant
//************************************************************
string INDICATOR_NAME="FRAMA2";
string FILENAME ="FRAMA2.mq4";
double LOG_2;
//************************************************************
// Private vars
//************************************************************
double ExtOutputBuffer[];
int g_period_minus_1;
//+-----------------------------------------------------------------------+
//| FUNCTION : init |
//| Initialization function |
//| Check the user input parameters and convert them in appropriate types.|
//+-----------------------------------------------------------------------+
int init()
{
// Check e_period input parameter
if(e_period < 2 )
{
Alert( "[ 10-ERROR " + FILENAME + " ] input parameter \"e_period\" must be >= 2 (" + e_period + ")" );
return( -1 );
}
if(e_type_data < PRICE_CLOSE || e_type_data > PRICE_WEIGHTED )
{
Alert( "[ 20-ERROR " + FILENAME + " ] input parameter \"e_type_data\" unknown (" + e_type_data +
")" );
return( -1 );
}
IndicatorBuffers( 1 );
SetIndexBuffer( 0, ExtOutputBuffer );
SetIndexStyle( 0, DRAW_LINE, STYLE_SOLID, 2 );
SetIndexDrawBegin( 0, 2 * e_period );
g_period_minus_1=e_period - 1;
LOG_2=MathLog( 2.0 );
//----
return( 0 );
}
//+------------------------------------------------------------------+
//| FUNCTION : deinit |
//| Custom indicator deinitialization function |
//+------------------------------------------------------------------+
int deinit()
{
return(0);
}
//+------------------------------------------------------------------+
//| FUNCTION : start |
//| This callback is fired by metatrader for each tick |
//+------------------------------------------------------------------+
int start()
{
int countedBars=IndicatorCounted();
//---- check for possible errors
if(countedBars < 0)
{
return(-1);
}
_computeLastNbBars( Bars - countedBars - 1 );
//----
return( 0 );
}
//+------------------------------------------------------------------+
//| FUNCTION : _computeLastNbBars |
//| This callback is fired by metatrader for each tick |
//| In : - lastBars : these "n" last bars must be repainted |
//+------------------------------------------------------------------+
void _computeLastNbBars( int lastBars )
{
switch( e_type_data )
{
case PRICE_CLOSE : _computeFRAMA( lastBars, Close ); break;
case PRICE_OPEN : _computeFRAMA( lastBars, Open ); break;
case PRICE_HIGH : _computeFRAMA( lastBars, High ); break;
case PRICE_LOW : _computeFRAMA( lastBars, Low ); break;

default :
Alert( "[ 20-ERROR " + FILENAME + " ] the input parameter e_type_data <" + e_type_data + "> is unknown" );
}
}
//+------------------------------------------------------------------+
//| FUNCTION : _computeFRAMA |
//| Compute the fractally modified SMA from input data. |
//| In : |
//| - lastBars : these "n" last bars must be repainted |
//| - inputData : data array on which the computation will be applied |
//| For technical explanations, see my blog: |
//| http://fractalfinance.blogspot.com/ |
//+------------------------------------------------------------------+
void _computeFRAMA( int lastBars, double inputData[] )
{
int pos, iteration;
double diff, priorDiff;
double length;
double priceMax, priceMin;
double fdi,alpha;
int speed;
//----
for( pos=lastBars; pos>=0; pos-- )
{
priceMax=_highest( e_period, pos, inputData );
priceMin=_lowest( e_period, pos, inputData );
length =0.0;
priorDiff=0.0;
//----
for( iteration=0; iteration <= g_period_minus_1; iteration++ )
{
if(( priceMax - priceMin)> 0.0 )
{
diff =(inputData[pos + iteration] - priceMin )/( priceMax - priceMin );
if(iteration > 0 )
{
length+=MathSqrt( MathPow( diff - priorDiff, 2.0)+(1.0/MathPow( e_period, 2.0)) );
}
priorDiff=diff;
}
}
if(length > 0.0 )
{
fdi=1.0 +(MathLog( length)+ LOG_2 )/MathLog( 2 * g_period_minus_1 );
}
else
{
/*
** The FDI algorithm suggests a zero value in this case, but a zero
** FDI would give alpha = exp(4.6) > 1. We fall back instead to the
** Brownian value, 1.5, pending a better estimate.
*/
fdi=1.5;
}

alpha=MathExp(-4.6*(fdi-1)); // Ehlers' recommendation, using the FDI as the fractal dimension
ExtOutputBuffer[pos]=alpha*inputData[pos]+(1-alpha)*ExtOutputBuffer[pos+1]; // smooth the selected price, not only Close
}
}
//+------------------------------------------------------------------+
//| FUNCTION : _highest |
//| Search for the highest value in an array data |
//| In : |
//| - n : find the highest on these n data |
//| - pos : begin to search for from this index |
//| - inputData : data array on which the searching for is done |
//| |
//| Return : the highest value | |
//+------------------------------------------------------------------+
double _highest( int n, int pos, double inputData[] )
{
int length=pos + n;
double highest=0.0;
//----
for( int i=pos; i < length; i++ )
{
if(inputData[i] > highest)highest=inputData[i];
}
return( highest );
}
//+------------------------------------------------------------------+
//| FUNCTION : _lowest |
//| Search for the lowest value in an array data |
//| In : |
//| - n : find the lowest on these n data |
//| - pos : begin to search for from this index |
//| - inputData : data array on which the searching for is done |
//| |
//| Return : the lowest value |
//+------------------------------------------------------------------+
double _lowest( int n, int pos, double inputData[] )
{
int length=pos + n;
double lowest=9999999999.0;
//----
for( int i=pos; i < length; i++ )
{
if(inputData[i] < lowest)lowest=inputData[i];
}
return( lowest );
}
//+------------------------------------------------------------------+
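As a cross-check of the red-curve listing above, here is the same FDI computation restated in Python (a sketch on synthetic data, mirroring the `_computeFRAMA` loop; the flat-window fallback value is a choice made for this sketch):

```python
import math

def fdi(window):
    """Fractal Dimension Index of a price window, as in the red-curve listing:
    prices are rescaled to [0, 1] and the normalised path length is measured."""
    n = len(window)
    hi, lo = max(window), min(window)
    if hi == lo:
        return 1.5  # flat window: fall back to the Brownian value (sketch choice)
    length = 0.0
    prev = None
    for price in window:
        diff = (price - lo) / (hi - lo)
        if prev is not None:
            # one step along the rescaled curve: horizontal 1/n, vertical change
            length += math.sqrt((diff - prev) ** 2 + (1.0 / n) ** 2)
        prev = diff
    return 1.0 + (math.log(length) + math.log(2.0)) / math.log(2 * (n - 1))

line = [float(i) for i in range(16)]        # smooth monotone path: lower FDI
zigzag = [float(i % 2) for i in range(16)]  # alternating path: FDI near 2

print(round(fdi(line), 2), round(fdi(zigzag), 2))
```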

The speed of the FRAMA (Part 2): The FRASMA

Having explained my preference for applying a "fractalisation" to a SMA (rather than to an EMA), I shall
now discuss the exact form of this "fractalisation".
A modification close to the one recommended by Ehlers would be to merely divide the period of the SMA
by the coefficient α, where α is defined as:

α = exp(−4.6 (D − 1))    (E1)

For a dimension D varying between 1 and 2, such a division would indeed be equivalent to a change of
speed in a ratio of 100, the SMA being slowed down 100 times from its initial pace in the extreme case of a
dimension of 2.
This dimension D is a numerical approximation of the Box Dimension, itself an approximation of the
Hausdorff dimension of the graph, which is properly the most mathematically precise fractal dimension.
There is, however, another dimension that can also be seen as a Box Dimension, but of another object
relating to the process under study, and that Mandelbrot called the Trail Dimension [MAN97, pp. 161 & 172].

For a Fractional Brownian Motion, we saw earlier that:

D = 2 − α

where D is what we have called so far the Fractal dimension, and α is the coefficient of the FBM (which is
a different thing from the α of equation (E1)). The latter is actually known as the Hurst-Holder exponent (or
sometimes simply as the Hurst exponent, in memory of the British hydrologist whose studies of the long-
term dependence of the Nile discharges were inspirational to Mandelbrot's work), and is most often
designated by H; I used α in reference to Falconer's book, but H seems more convenient from now on. We
therefore have:

DG = 2 − H

and DG will now be known as the Graph Dimension, while the Trail Dimension will be defined as:

DT = 1/H
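To make the two dimensions concrete, here is a small Python table over a few Hurst exponents, using the graph dimension 2 − H and Mandelbrot's trail dimension 1/H:

```python
def graph_dimension(H):
    """Box dimension of the graph (t, X(t)) of an FBM with Hurst exponent H."""
    return 2.0 - H

def trail_dimension(H):
    """Mandelbrot's trail dimension of the same process."""
    return 1.0 / H

# As H falls toward 0, the graph dimension saturates at 2 while the
# trail dimension grows without bound.
for H in (0.9, 0.5, 0.25, 0.1):
    print(H, graph_dimension(H), round(trail_dimension(H), 2))
```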

_______________________________________________________________________________________

I-Interpretation of the Trail Dimension

It is easy to see that the Trail Dimension varies between 1 and infinity, for the coefficient H varying between
1 and 0. The first question is therefore how a "dimension" growing infinitely should be understood. In
[MAN97], p. 161, Mandelbrot wrote the following explanation:

"First consider a Wiener Brownian motion in the plane. Its coordinates X(t) and Y(t) are independent
Brownian motions. Therefore, if a 1-dimensional Brownian motion X(t) is combined with another
independent 1-dimensional Brownian motion Y(t), the process X(t) becomes "embedded" into a 2-
dimensional Brownian motion {X(t),Y(t)}. The value of the trail dimension:

1/H = 2

is the fractal dimension of the three dimensional graph of coordinates t, X(t) and Y(t), and of the
projected "trail" of coordinates X(t) and Y(t). However, the dimension:

2 − H = 3/2

applies to the projected graphs of coordinates t and X(t), or t and Y(t)."

My understanding of the above passage, in the general case of FBM (H varying between 0 and 1, whereas
for WBM, H = 1/2), is that the Trail dimension must be seen as an approximation of the number of
dimensions in which the "real" process takes place (here it might be interesting to understand the term
"dimension" in a data-mining sense rather than in a strict topological sense: prices are clearly the end-result
of many independent processes, any of them with the potential of being chaotic in their own right), under the
assumption that all the coordinates of the said process can be described as independent Fractional Brownian
motions sharing the same Hurst exponent.

II-Slowing down the MA with the Trail Dimension

It is now possible to conceive of a formula for the speed coefficient, call it γ, using the Trail Dimension. The
purpose of γ is to slow down the MA from a reference speed when the Hurst exponent becomes very small,
and also to accelerate it when this exponent comes close to 1. The reference speed should be taken as the one
used when the price varies in a Gaussian way, that is when H is 1/2. So for such a value of H, we should have
γ = 1. If we then consider the following formula:

γ = DT / 2 = 1 / (2H)    (E2)

For a WBM, we have γ = 1. In addition, for H tending towards 0, γ tends towards infinity, and for H close
to 1, γ = 1/2.
Comparing the coefficient from (E2) (red curve) with the inverse of α from (E1) (black curve) (we take the
inverse in order to get a multiplicative factor rather than a dividing one to apply to the speed of the MA), we
get the following graphs:
Or, for a more detailed view of their behavior below H = 1/2:

Dividing the black curve by 10, in order to have an unchanged speed for the case of a WBM, we get the
following:

For H varying from 0.5 to 0, we see that the factor coming from (E1) varies almost linearly; over the same
variation, however, we know that the randomness increases in a rather non-linear fashion, and a linear
slowing down of the MA does not seem to reflect this properly. From this theoretical point of view, I
therefore prefer the coefficient given by equation (E2) (not to mention that it is much simpler).
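The comparison can be reproduced numerically; in this Python sketch, `factor_e1` is the normalised inverse of Ehlers' α (with D = 2 − H, divided by 10 so that H = 1/2 gives 1), and `factor_e2` is half the trail dimension, 1/(2H), which is my reading of the (E2) coefficient:

```python
import math

def factor_e1(H):
    """Normalised inverse of Ehlers' alpha, with D = 2 - H; ~1 at H = 1/2."""
    D = 2.0 - H
    return (1.0 / math.exp(-4.6 * (D - 1.0))) / 10.0

def factor_e2(H):
    """Half the trail dimension, 1/(2H); exactly 1 at H = 1/2."""
    return 1.0 / (2.0 * H)

# Both factors multiply the period of the SMA; the (E2) version diverges
# as H approaches 0, while the (E1) version stays bounded near 10.
for H in (0.9, 0.5, 0.2, 0.05):
    print(H, round(factor_e1(H), 2), round(factor_e2(H), 2))
```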

III-Implementation of the FRASMA

I programmed the FRASMA (Fractally modified Simple Moving Average) on the MetaTrader platform. You
may freely access and download this indicator, as well as use it on the MetaTrader 4 platform, at this address
of the MQL4 Community.
Please, let me know your findings or any criticisms that can improve this indicator.
Meanwhile, here is a screenshot of three fractally modified MAs: the light blue one is a version of the
FRAMA from Ehlers' paper (modifying an EMA), the yellow one is a modification of a SMA using the
coefficient α from equation (E1), as inspired by Ehlers' paper, and the red one is properly the FRASMA,
using equation (E2).

Below is the fractal Graph Dimension. The period of reference for all original MAs is 20.

IV-Conclusion

My purpose here is not to demonstrate that one indicator is better than another, since the quality of an
indicator is relative to the manner in which one uses it. I believe that one must be intuitively acquainted with
an indicator to use it productively, and it is for this reason that my preference goes to the FRASMA.
While one may rely on direct practice alone to "understand" a given indicator at an intuitive level, I believe
most of us can also profit from a theoretical understanding of it. My goal here is therefore to provide
elements along these lines, for others to develop their own familiarity, and maybe provide me, in return, with
some of their insights and experiences.
Again, it is naive to think that a trader using technical analysis can actually trade without some level of
reliance on his intuition, and it is to totally miss the point of what fractals tell us about the market to
nourish expectations of a deterministic methodology for being successful as a trader; in other words, there is
no Grail to be found in the first place. Nonetheless, understanding the technical tools one is using can
improve one's intuition, and the overall success of one's trading activity.

Monday, January 26, 2009


Tuning up the mind

In The Nature of Risk, Justin Mamis concludes his sixth chapter with the following remark:

Intuition, although seemingly spontaneous, apparently emotional, stems from a form of "information"
that has become built-in from past experience. Discipline means choosing what to do unencumbered
by the fear of making a mistake. Confidence means trusting our intuition that what we "see" is what
we "know." There's no escaping to the external, to the objective, and no standing on the shaky ground
of emotions. So the question becomes, How do we create within ourselves the heroic condition of
confidence wherein risk is not danger but life.[MAM99, p.80]

The condition of confidence, heroic or not, must, to be actual, somehow not deceive too much, and it would
be naive to think that intuition never deceives us. Nonetheless, Mamis is right: intuition is a critical
component in any decision process, not least in a trading context.
Technical analysis is very nice, but if fractal geometry teaches us anything, it is that we cannot foresee the
evolution of complex processes on the basis of objective knowledge alone.
Now, if intuition deceives us, I contend that it is because it has been trained to do so: our whole education
has conditioned us to think in deterministic terms, the analytical mind is praised and rewarded, the hard
sciences simply excluded complex systems from their scope for centuries, and even today these systems are
hardly touched at all in a normal education before a specialized Masters level.
Despite that, the hard sciences, and the mode of thinking they promote, are the foremost influence we are
exposed to during our formal education. We are all members of the church of scientism; those who are not
are likely to be members of churches even more deterministic than this one.
And our intuition follows this fold: even when we lack the information to make a decision (which is almost
always the case), we tend to over-rely on the information we do have and to decide solely on that basis,
extrapolating linearly from partial knowledge, because we are conditioned to rely on linearity. We are
simply unable to recognize, acknowledge and take into account the non-linearity of a process. It is this
ability that must be developed over time, and it is this ability that deserves to be called efficient intuition.

There is one domain of culture that may provide us with a way to build up this intuition, and that is Art.
To each his own; personally, I am most sensitive to music and poetry, and it is therefore along these lines
that I will argue my point, but I believe it can be transposed to other arts.
Adorno's critique of Schoenberg's and Stravinsky's music links them both to the political and philosophical
problematics of their times:
Art, indeed, always happens in a context and relates to it in a very essential way; furthermore, it always
takes on a problematic and resolves it figuratively. When Bartok or Stravinsky rejuvenated classical music
with peasant songs, they were merely reacting to the standardization of the world along Western romanticist
lines. But more than that, they provided us with a solution to the problematic of cosmopolitanism: native
cultures don't have to be erased, they can be consolidated within an evolving culture (civilizations don't
clash, they merge, sorry Mr. Huntington) and contribute to a manifold society (see the cosmopolitanist
philosophy of Kwame Anthony Appiah, for instance). The European empires could have used a bit of insight
from them in the 20s and 30s.

And the same goes for today. Here is a piece by Iannis Xenakis: Metastasis.
The 1st and 3rd movements deal with a relativistic notion of time, one that is a function of energy and mass.
Interestingly, in trading, Mandelbrot defines the concept of trading time, which is also a function of what
can be compared to energy and mass, namely volume. That is actually a reflection of the dependence of the
Hurst-Holder exponent on time [MAN97, pp. 39-40].
The second movement is even more interesting, since it gives a musical translation of the Fibonacci sequence.

Xenakis also wrote pieces dealing with Brownian Motion, the Normal Distribution and Statistical
Mechanics, all very interesting. What they provide us with is an acquaintance that goes beyond the mere
knowledge of well-defined criteria: an intuition that may articulate our decisions in a more efficient way.

About a chart

Here is a recent chart-of-the-day I found quite interesting:


It's interesting in the sense that, in 1929, the stock exchange actually fell much more abruptly over a short
period but, as shown on this graph, on the longer timescale of a year, it actually fell less than in the current
crisis. So it seems to point to the existence of various kinds of volatility: one short-term, the other long-term.

One may propose a few remarks to explain this difference:


- There is much more volume today, implying some kind of inertia in the market.
- Traders, despite all their shortcomings, seem more aware than their predecessors of economic forces and
less liable to panic moves, but more liable to early reactions anticipating the worst.
- The way the authorities have managed this crisis is much better than what was done in 1929, and it seems
to have at least spread the fall over a longer time period, which, in itself, is a very positive achievement.

On the contrary, it may also indicate that, while the digestion of bad news is more gradual, it still has to run
its course fully. And that may be a lesson about the relative powerlessness of existing institutions, which are
basically only reactive in the absence of proper regulation.

On a more technical note, all this indicates that the two volatilities are intimately connected and that,
basically, what is taken from one goes into the other, the final distribution being a result of current
psychologies and existing institutions.
Despite the uncontroversial nature of that remark, it becomes a bold statement if one takes the time to
extend it to all kinds of assets, and in particular to currency pairs. Performing this extension is, however,
nowhere near obvious. We have all heard of the wild variations of currency values in the '29 crisis, but the
monetary system was then very different from what it is today. In particular, there was a gold standard, and
the arbitrage system we have today was not in place. In the absence of a careful analysis, it would be
specious to conclude anything detailed.
However, even without that analysis, and given the current, more "efficient" system of currency pairs, it may
already be possible to conclude that very high volatility in the currency market is to be expected in the
coming months.

The weight of news

The following article from the Federal Reserve addresses the effect news announcements have on some
asset prices (taken in a general sense):
http://www.ny.frb.org/research/current_issues/ci14-6.html

It's a purely statistical approach and therefore lacks any model to really make sense of the data. In
particular, the data sample does not reflect the difference that may exist between a bull-market reaction and
a bear-market reaction.
There are some interesting comments, however, on the types of assets that are the most reactive, on the
indices that elicit the most volatility, and on the timing of the most significant reactions.
Posted by Jean-Philippe at 11:47 AM No comments: Links to this post

Friday, September 5, 2008


Who is selling the EUR?

Today saw some remarkable activity in the EUR rates.


This morning at about 10:15 (CT), there was a sudden drop of about 50 pips on the EUR/USD; then at
13:18, -30 pips in less than a minute; and at 16:12, more than a 100-pip drop in less than 5 minutes. Clearly,
some big accounts are liquidating their EUR reserves.
All this may find an explanation in the remarks from Juncker about the EUR still being over-valued.

During the same time, the DJ dropped by 344 points, so the US stock market does not seem relevant with
regard to the current movements of the USD; stronger forces seem to be at play here.

This certainly invalidates my earlier analysis that linked the future of the USD to the US economy, and
especially to the unfolding of the current crisis. At least for now, the market appears to follow the consensus
of the central banks in working towards a strengthening of the USD against the EUR, or rather towards the
general weakening of the EUR.
As of today, at about 5 PM CT, the EUR has lost about 300 pips against the USD, and about 600 pips
against the JPY. Interestingly, the USD also dropped by about 250 pips against the JPY. Japan seems to be at
the heart of these movements.

The big question is up to which level this weakening will continue. For now, I expect to see it going towards
the 1.3, or even the 1.2 USD mark, if the current logic is respected. At that level, we shall see what the EU
deciders say about the overvaluation of the EUR.

EUR/USD medium term outlook

These last few weeks, I have been holding the belief that, for the next year or so, the EUR/USD will show
high volatility between 1.4 and 1.6.
This week, it dropped from 1.55+ to 1.5. That's an impressive drop, and we start hearing about a possible
big trend reversal, possibly the outset of an upward trend for the USD: the materialization of the strong-USD
policy promised by Paulson and Bernanke, beyond the customary rhetorical value such incantations have.

Until now, I have been considering the risk of being wrong on the upside higher than that of being wrong on
the downside. Should I then reconsider my approach?
Anyway, here are the details of my thinking so far, up for comment:

My first assumption is that the EU economy is still, overall, structurally healthier than its US counterpart;
even though some banks have suffered from the credit mess, the level of the write-downs (and the depth of
their consequences for the overall economy) is still very far from what we saw in the US.

Secondly, I assume that there is a general psychological bias towards the USD, whereby investors' actions
(in EUR/USD particularly) over-react to bad EU news and under-react to bad US news. This bias is actually
justified in view of the market dynamics, where the European markets mostly mimic the US market. What I
mean by bias is that it does not reflect pure fundamentals, but is mostly a psychological attitude in the minds
of investors who have spent most of their lives considering the USD the reference currency, the safe haven
away from the world's uncertainty. This bias, however, is really challenged by the current crisis; it tends to
fade a bit, and may well vanish totally, which is why, until now, I have considered my risk of being wrong
on the upside higher.

Now, assuming that Paulson and Bernanke really mean to walk the talk, can they do it? Clearly, the Fed may
be able to do a few things, especially with the support of the ECB and the BOJ, both of which have a strong
interest in a strong USD to ease the pressure on their respective economies. We can then reasonably
expect a collaboration between the three largest players on this market to push for a strong USD. But is it
enough?

The US economy is expected to deteriorate further. According to Krugman, some bad loans are going to
mature up to 2011, and real estate is expected to continue its drop, being only half-way through according to
some estimations.
In addition, the USA is going to have a new government in less than 6 months, one that will inherit some
serious liabilities from the current one. A new government, elected on the current buzzword of "change", can
hardly be expected to have a tight budgeting policy in its first year, especially in view of a healthcare
reform, necessary expenses on infrastructure, energy policy, etc.
Such spending may seriously strain a strong-USD policy, which I rather see as incompatible with running an
ever-increasing deficit (something from which investors should see the EU, despite some very bad members,
as relatively immune, given the conditions of the Growth and Stability Pact).

So, will the USD pull it off? Are we really seeing the first signs of a complete reversal, or is this just the
swan's last song before a slide towards 1.8 or so, just awaiting the next big write-down?

JP

What to expect: Trend and Volatility

I assume here that the price evolution is modelled by a Fractional Brownian Motion (FBM) of index H
(0 < H < 1), where X(t) represents the price at time t, so that we have the following equality (E1) about the
expectation of dependent price increments (demonstration in [FALC03], pp. 267-268):

E[(X(t+h) - X(t)) (X(t) - X(0))] = ((t+h)^(2H) - t^(2H) - h^(2H)) / 2     (E1)

Clearly the value H = 1/2 plays a very specific role in that equation, since it cancels out its right-side term.
H = 1/2 indeed corresponds to the classical Brownian Motion (Wiener Brownian Motion: WBM), where the
increments over time of the variable X are independent.
This index is directly linked to the Fractal Dimension Df by the relation:

Df = 2 - H

Therefore, when H = 1/2, which happens when Df = 1.5, we have a genuine Random Walk.
When such is not the case, however, we can say:
When such is not the case, however, we can say:

1) Df < 1.5
This case is equivalent to H > 1/2, and we can then expect from the equality (E1) that X(t+h) - X(t) tends to
be of the same sign as X(t) - X(0); therefore, if X(t) has a history of increasing, the next move X(t+h) will
more likely be up, and similarly, if X(t) has a history of decreasing, the next move will tend to be down. In
this case, we are in a trend.

2) Df > 1.5
This case is equivalent to H < 1/2. In this case, X(t+h) - X(t) tends to be of the opposite sign of X(t) - X(0);
therefore, following the same logic as above, we are in a trend reversal period.
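These two regimes can be checked numerically straight from (E1); here is a small sketch in Python (the
values of t, h and H below are arbitrary illustrations):

```python
# Sign of the increment covariance of a Fractional Brownian Motion,
# computed from equality (E1):
#   E[(X(t+h) - X(t)) (X(t) - X(0))] = ((t+h)^(2H) - t^(2H) - h^(2H)) / 2

def increment_covariance(t: float, h: float, hurst: float) -> float:
    """Expected product of the past move X(t)-X(0) and the next move X(t+h)-X(t)."""
    return ((t + h) ** (2 * hurst) - t ** (2 * hurst) - h ** (2 * hurst)) / 2

t, h = 10.0, 1.0
cov_trend   = increment_covariance(t, h, hurst=0.7)  # Df = 1.3 < 1.5: persistence
cov_wbm     = increment_covariance(t, h, hurst=0.5)  # Df = 1.5: Random Walk
cov_reverse = increment_covariance(t, h, hurst=0.3)  # Df = 1.7 > 1.5: reversal

print(cov_trend, cov_wbm, cov_reverse)  # positive, zero, negative
```

The covariance comes out positive for H > 1/2 (moves tend to continue), exactly zero for H = 1/2, and
negative for H < 1/2 (moves tend to reverse), which is all the two cases above assert.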
Comments on some existing fractal-related tools

A few indicators that relate to fractals (or seem to do so) are already easily available on several platforms.

The first, maybe the simplest, is called "Fractals"; when you use it, it draws little arrows, some pointing up,
others pointing down, like this:

This indicator, however, has nothing to do with fractals; it relates to Elliott Wave Theory, as explained here:
http://trading-stocks.netfirms.com/fractals.htm

A derivation of this is called the "fractal channel", which links the little arrows; similarly, it has nothing to
do with fractals.

More relevant, then, is the Fractal Adaptive Moving Average, which relates to Kaufman's AMA but uses
fractal theory to determine the current volatility of the market in order to adjust the speed of the MA. The
idea of the AMA is to slow down the MA when the market is moving sideways, and to speed it up when
there is a trend. To achieve this objective, John Ehlers developed the FRAMA, using the Fractal Dimension
as a direct measurement of volatility; he explains his method in a paper (titled FRAMA) that can be
downloaded from this address: http://www.mesasoftware.com/technicalpapers.htm

On the following graph, I plotted a simple 16-MA (blue), an exponential 16-MA (yellow) and the FRAMA
in red (with a reference period of 16 as well). Below are the fractal dimension used by the FRAMA
(computed from the formula of the above paper), as well as a more sophisticated fractal dimension (to which
I will come back later):
Clearly, during the sideways market (until about 16:45), the FRAMA is somewhat smoother than the two
others, and when the trend resumes, it also reacts faster. We can therefore say that the FRAMA is a good
AMA. It could be better, however: its computation of the fractal dimension is rough to say the least,
oscillating between extreme values (from 2 to below 1) that don't even make sense mathematically. The FDI
plotted in the lowest window displays a more reasonable fractal dimension (the period used to calculate both
is 16). For those interested in this tool, I would therefore advise using the FDI, which might entail modifying
the factor -4.6 in the computation of the coefficient alpha, for which the FRAMA paper recommends:

alpha = exp(-4.6 (Df - 1))
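Ehlers's mapping, alpha = exp(-4.6 (Df - 1)), can be sketched in a few lines of Python; the clamping of
alpha to [0.01, 1] follows common FRAMA implementations, and the sample values are mere illustrations:

```python
import math

def frama_alpha(df: float) -> float:
    """EMA smoothing coefficient derived from the fractal dimension Df.
    Df near 1 (trend) -> alpha near 1: the average follows price closely.
    Df near 2 (noise) -> alpha near 0.01: the average barely moves."""
    alpha = math.exp(-4.6 * (df - 1.0))
    return min(1.0, max(0.01, alpha))  # clamp, as in common FRAMA code

print(frama_alpha(1.0))  # 1.0: fastest, tracks the raw price
print(frama_alpha(1.5))  # about 0.1: comparable to a slow regular EMA
print(frama_alpha(2.0))  # about 0.01: slowest, heavy smoothing
```

Changing the factor -4.6 simply rescales how quickly the smoothing tightens as Df rises, which is why
substituting the FDI's dimension for Ehlers's rough estimate may call for retuning it.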

The fractal dimension Df in the FDI follows Sevcik's length-based estimate:

Df = 1 + (log(L) + log(2)) / log(2 (N - 1))

where N is the number of periods (price valuations) considered and L is the length of the price curve once
rescaled to the unit square. Df provides us with some idea of volatility: when Df gets close to 2, it means
that we have very high volatility; the closer it is to 1, the lower the volatility or, in other terms, the better
defined the trend. But these are very general qualitative comments; the passage to a computable quantity is
trickier. Ehlers assumes that price movements follow a lognormal distribution (which is not the case) and,
on this basis, computes the value of alpha as an exponential. I will, in the near future, share my reflections
on how to get an identified numerical measure of entropy (volatility) from Df.
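For illustration, here is a minimal sketch of such a length-based estimate: rescale an N-point window to the
unit square, measure the curve length L, then take Df = 1 + (log L + log 2) / log(2(N - 1)). This is my own
straightforward reading of that formula, not the exact FDI code shipped with trading platforms:

```python
import math
import random

def fdi(prices) -> float:
    """Fractal dimension of a price window, via the curve length after
    rescaling both axes to [0, 1] (Sevcik-style estimate)."""
    n = len(prices)
    lo, hi = min(prices), max(prices)
    if hi == lo:
        return 1.0  # flat window: treat as a line of dimension 1
    dx = 1.0 / (n - 1)  # abscissa step after rescaling
    ys = [(p - lo) / (hi - lo) for p in prices]
    length = sum(math.hypot(dx, ys[i + 1] - ys[i]) for i in range(n - 1))
    return 1.0 + (math.log(length) + math.log(2)) / math.log(2 * (n - 1))

# A straight trend line yields a low dimension; pure noise pushes it towards 2.
random.seed(1)
trend = [0.5 * i for i in range(128)]
noise = [random.random() for _ in range(128)]
print(fdi(trend), fdi(noise))
```

Unlike the FRAMA's rough estimate, this one stays essentially within [1, 2] by construction on any
non-degenerate window.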

But for now, my point is merely to say that the fractal dimension is an indicator of volatility; it does not
inform us on the direction of the market. To get this direction, many analysts rely on MAs or combinations
of them (such as Ichimoku, Bands, ...); those indicators may be refined using fractal theory, but they then
become hybrid indicators, mixing two diverging conceptions of what price movement is about.

As of now, and as far as I know, the only technical tool fractal theory provides is a measure of volatility, but
volatility in itself may be interesting information for setting up one's stop and position size. It may not be
necessary to use volatility as a mere entry variable into another indicator.
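To make that last point concrete, here is one entirely illustrative way of letting a volatility estimate drive the
stop distance and the position size directly, instead of feeding it into another indicator; the 2x stop multiple
and the 1% risk figure below are arbitrary assumptions of mine:

```python
def stop_and_size(balance: float, risk_fraction: float, volatility: float,
                  stop_multiple: float = 2.0):
    """Derive a stop distance and a position size from a volatility estimate.
    risk_fraction is the fraction of the account lost if the stop is hit."""
    stop_distance = stop_multiple * volatility   # wider stop in a volatile market
    risk_amount = balance * risk_fraction
    size = risk_amount / stop_distance           # smaller position when volatile
    return stop_distance, size

# Example: a 10,000 account risking 1% with a 50-pip volatility estimate.
stop, size = stop_and_size(balance=10_000, risk_fraction=0.01, volatility=0.0050)
print(stop, size)  # 0.01 (a 100-pip stop), 10000.0 units
```

The point of the sketch is the coupling: doubling the volatility estimate doubles the stop distance and halves
the position size, so the amount at risk stays constant.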

From Economics to Fractals

Here is an interesting site that makes the link between economic theory (the "Fundamentals") and the
fractal behaviour of the market:
http://www.debunking-economics.com/
For a more precise link to fractals, and a few other interesting things such as Behavioural Finance (slides 33
and onward) or the Payback Period (slides 70 and onward):
http://www.debunking-economics.com/Lectures/Managerial/ManagerialEconomicslecture11FinanceAlternative.ppt
This site seems very rich, and I have just discovered it, so there may be other documents of interest in there.

Why fractals?

A significant part of TA, if not all of it, is based on averages, and as such it relies heavily on the Gaussian
(or Normal) Distribution, which is the statistical translation of the Random Walk Theory.
Indeed, for averages (and that includes all kinds of Moving Averages, Simple or Exponential) to really be as
meaningful as TA considers them, price variations must actually be described adequately by the Gaussian
Distribution and its counterpart among random processes, Brownian Motion.

It is interesting to note that there is a contradiction inherent in the practice of TA. In his "Technical Analysis
of the Financial Markets", John Murphy wrote (with good reason):
The Random Walk Theory (...) claims that price changes are "serially independent" and that price
history is not a reliable indicator of future price direction. In a nutshell, price movement is random
and unpredictable(...) It also holds that the best market strategy to follow would be a simple "buy and
hold" strategy as opposed to any attempt to "beat the market."

Something I completely agree with. But then, if a technical analyst is to reject this Random Walk view of
price movement, shouldn't he also reject the mathematical ramifications of this assumption, rather than use
them as tools?
In a Gaussian model, the average (the mean) clearly is a good piece of information to consider: it is the
quantity that has the highest probability of being realised, and the closer a value is to the average, the higher
its probability.

The large pool of experimental data we have from financial markets, however, tells us that they don't follow
a Gaussian distribution. They diverge from it in various ways, but a remarkable one is that they are fat-tailed,
which means that the probability for the variable to be far away from the average is actually higher than in
the Gaussian model (i.e. extreme variations are more frequent than the model predicts). And that is
important, because it tends to make our beloved average less useful in terms of prediction, although the
differences are not such that averages lose all usefulness. Still, more precise tools may likely be derived
from a model that better fits real price movement.
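A quick simulation illustrates the point. Below, a Gaussian variable is compared with a fat-tailed one (a
Student-t with 3 degrees of freedom, rescaled to unit variance; the choice of distribution and the 4-sigma
threshold are mine, purely for illustration), counting how often each exceeds four standard deviations:

```python
import math
import random

random.seed(42)
N = 200_000

def student_t3() -> float:
    """Student-t sample with 3 degrees of freedom, rescaled to unit variance
    (the raw t3 has variance 3)."""
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(3))
    return (z / math.sqrt(chi2 / 3)) / math.sqrt(3)

gauss_extremes = sum(abs(random.gauss(0.0, 1.0)) > 4 for _ in range(N))
fat_extremes = sum(abs(student_t3()) > 4 for _ in range(N))
print(gauss_extremes, fat_extremes)  # the fat-tailed count is far larger
```

Both variables have the same variance, yet the fat-tailed one produces 4-sigma "extreme variations" far
more often, which is exactly what makes the Gaussian average less reliable as a predictive summary.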

Another problem with the Gaussian model is that it assumes continuity and evenness of change. Benoit
Mandelbrot in "Fractals and Scaling in Finance" wrote:
In the classical (Gaussian) theory of errors, a large change would typically result from the rare chance
simultaneity of many large contributing causes, each of them individually negligible. In economics,
this inference is indefensible. Typically, the occurrence of a large effect means that one contributing
cause, or at most a few turn out ex-post to be large.

This non-evenness, as well as the discontinuity of price movement (which is obvious given the structure of
the price determination process; the apparent continuity is just an artefact of price representation),
contributes even further to undermining the validity of the information given by averages, and even more so
by Moving Averages.

Mandelbrot again, remarks:


In particular, price continuity is an essential (but seldom mentioned) ingredient for all trading
schemes that prescribed at what point one should buy on a rising market, and sell on a sinking price.
Being discontinuous, actual market prices will often jump over any prescribed level, therefore, such
schemes cannot be implemented.

Then, what are the alternatives to the Gaussian Distribution?


Mandelbrot goes on to discuss a few of them in his above-mentioned book. I won't do that here. The
alternative I wish to discuss on this blog is, as far as I know, the most promising one in terms of modeling
the behaviour of price movement: the option involving the use of fractals. The models developed with
fractals have so far shown a better fit than Gaussian models (as well as other alternatives), and I therefore
hope that they can lead to the development of more efficient TA tools than those existing today.

About technical analysis

Let me start by clarifying how I understand Technical Analysis, its value and its limitations.

I don't believe Technical Analysis alone encompasses all there is to know about trading. Foremost, to me,
TA is a tool that provides the trader with some kind of knowledge, and this, in turn, may help him decide
what trade to take and when to act upon it (when to enter it and when to exit it).
On this site, I want to explore the value of this knowledge, and how to make the best use of it.

Ultimately, though, decision making is a psychological process; TA can only amount to one valuable input
into it, along with knowledge of the relevant fundamentals and some level of intuition.
TA, clearly, is subjective, and as such there certainly is a lot to discuss as to what one can gain from its
practice (and how one can gain it).

One last precision: as a FOREX trader, I don't really care about volumes, which are an important component
when it comes to stocks but have much less relevance in an OTC market.