
ONTONIX COMPLEXITY MANAGEMENT

Complexity Management White Paper

Complexity Management: New Paradigms in Decision-Making, Business Intelligence, Risk Assessment and Crisis Anticipation
Ontonix S.r.l., www.ontonix.com

What is complexity? Can it be measured? Does a technology exist to help manage complexity? Why is it important to manage complexity? These and other questions are addressed and answered in this white paper, which also describes how complexity management can be turned into a beneficial business for every corporation around the globe.

First of all, let's take a closer look at complexity. Science still struggles with the definition of complexity and a widely accepted definition doesn't exist. Many of the popular definitions refer to complexity as a twilight zone between chaos and order. It is often maintained that in this zone Nature is most prolific and that only this region can sustain life. Others claim that phenomena of self-emergence are manifestations of complexity. But such definitions are not practical, since they don't define anything measurable. At Ontonix we maintain that complexity is not a phenomenon on the edge of chaos; it is an attribute of any system, just like energy or momentum. Therefore, it can be measured and managed.

The fitness of a dynamical system is proportional to its complexity. A more complex system possesses a richer spectrum of adaptability and can therefore better reduce risk exposure in an uncertain and changing environment. This is true, for example, for animals in an ecosystem or for corporations in an economy. For this very reason, all things in life tend to states of higher complexity. Ultimately, this translates to higher fitness. However, there is an upper limit of complexity up to which a given system can safely evolve. This too is a fact of life. When this limit is reached, the system is said to be critically complex. Any system, whether it is a product, process, or organization, becomes more unpredictable in its behaviour as it becomes more complex. This makes it more difficult to manage reliably. More difficulty in management means a lower ability to reach goals and objectives; the risk that things can go wrong is therefore higher. As the upper complexity limit is approached, the system starts to lose functionality unless corrective actions are taken. The objective of management is to keep a corporation at a safe distance from its upper complexity limit. Without knowing one's upper complexity limit, it is impossible to speak of sustainable growth and development.

Before getting into more detail, let us clarify an important semantic issue, namely the difference between complex and complicated. Here too there is plenty of confusion. Most dictionaries use both terms without making a clear distinction, stating that a system is complex, or complicated, if made up of many interconnected or interacting parts. In our opinion the situation is a bit different. High complexity implies a pronounced capability to surprise, to behave in an unexpected manner. Novelty is the key to adaptability. A complicated system, such as a mechanical wrist watch, is indeed formed of numerous components (in some cases as many as one thousand) which are linked to each other, but, at the same time, the system is also deterministic in nature. It cannot behave in an uncertain manner. It is therefore easy to manage: in the case of the wrist watch (essentially a single degree-of-freedom system) one knob is sufficient. Very complicated, but with extremely low complexity. The inverse is also true: a system can be composed of very few parts, or agents, and still be highly complex. Imagine, for example, a class of twenty ten-year-old children. Try to manage it!

And this brings us to our definition of complexity. Complexity, according to our philosophy, is a function of three fundamental components:

- Structure. Structure is reflected in the way the components/agents in a system interact and exchange information. Typically, this is represented via a Process Map or a graph.
- Entropy. This is a fundamental quantity which describes the noisiness and uncertainty of the flow of information between the components of the system.
- Coarse-graining. Our ability to observe and describe a given problem is limited by the precision, or resolution, of our tools. Many problems in life are fuzzier and coarser than we think. Consider, for example, a car crash. A crashworthiness rating of one to five stars is provided: Excellent, Good, Acceptable, Marginal and Poor. The highly stochastic nature of a car crash does not allow more precision to be accommodated.

No matter how powerful a computer is used to design a car, we will never be able to predict its behaviour in an accident precisely. In other words, only five fuzzy levels of performance may be determined. The natural coarse-graining of many problems in life is what often turns optimization into a futile and irrelevant exercise. Because all things possess a natural limit of precision, it is important not to insist on digging out information from where there isn't any. There is a natural noise floor in every problem or system. This natural noisiness of many problems is one of the reasons why complexity management should become the basis of decision-making and design, instead of the unnatural optimization which became so popular in the last decade.

As mentioned, complexity is first of all a function of the topology, or structure, of the information flow within a system. Process Maps are ideal for displaying this flow. OntoSpace, our complexity management tool, generates such maps automatically based on the User's data. OntoSpace uses a simple way to display graphs. The nodes, or variables, are first arranged along the diagonal. Inputs are in red while outputs are in blue. Links, which represent relationships between the variables, are drawn either horizontally or vertically, in a way that attempts to reduce the so-called null crossings. The order of placement of the nodes reflects that of the data used to plot the graph. When two variables are related, the corresponding link crossing is marked by a connector. This makes graph reading fast and easy, as it is immediately evident which variables are related to which. Nodes indicated in intense red and blue are called hubs. Hubs are critical for the process, as they concentrate much of the information traffic within the process.

OntoSpace builds maps based on raw User data. The significant relationships between the nodes are established automatically. In other words, the User does not have to specify how the variables of the graph are related; this is done by OntoSpace. When the User reads in a rectangular data table, in which the columns represent the variables, OntoSpace establishes the correspondence between each variable and a node in the graph, as shown below. A proprietary algorithm is used for map construction.
[Figure: the columns of the data table (Variable 1 ... Variable 5) correspond to the nodes of the System Map.]
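Since the map-construction algorithm itself is proprietary, the following Python sketch only illustrates the general idea of deriving links from a rectangular data table, using a simple correlation threshold as a stand-in for the real relationship test; the variable names, data and threshold are invented:

```python
import numpy as np

def process_map(data, names, threshold=0.3):
    """Toy 'Process Map': link two variables when the absolute Pearson
    correlation of their samples exceeds a threshold. Illustrative only;
    OntoSpace uses a proprietary algorithm, not this rule."""
    corr = np.corrcoef(data, rowvar=False)      # columns = variables
    n = len(names)
    return [(names[i], names[j])
            for i in range(n) for j in range(i + 1, n)
            if abs(corr[i, j]) >= threshold]

# Example: 200 weekly measurements of 4 hypothetical business variables.
rng = np.random.default_rng(0)
revenue = rng.normal(100, 10, 200)
profit  = 0.2 * revenue + rng.normal(0, 1, 200)   # driven by revenue
debt    = rng.normal(50, 5, 200)                  # independent
staff   = rng.normal(30, 3, 200)                  # independent
data = np.column_stack([revenue, profit, debt, staff])
print(process_map(data, ["revenue", "profit", "debt", "staff"]))
# expected: [('revenue', 'profit')] -- only the genuine relationship survives
```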

For any given system, OntoSpace establishes the System Map and the corresponding value of complexity. Complexity is computed based on the topology of the map and on the amount of entropy contained in the User's data. In the case of two systems having identical System Maps, the more entropy-rich one will have higher complexity. Because of the way OntoSpace computes complexity, systems with fewer variables (nodes) and fewer links may nevertheless turn out to be more complex than systems with many nodes and links.
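The following self-contained sketch shows a toy scalar with this qualitative behaviour: it rises with both the number of links and the entropy of the linked variables, so identical maps with noisier data score higher. It is not the Ontonix metric, and the fixed coarse-graining grid is an invented illustration:

```python
import numpy as np

GRID = np.linspace(-10, 10, 21)   # fixed coarse-graining of the observable

def shannon_entropy(x, bins=GRID):
    """Shannon entropy (bits) of one variable after coarse-graining."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

def toy_complexity(data, links):
    """Toy scalar: sum, over the links of the map, of the mean entropy of
    the two connected variables (columns of `data`)."""
    H = [shannon_entropy(data[:, i]) for i in range(data.shape[1])]
    return sum((H[i] + H[j]) / 2.0 for i, j in links)

# Same map (links 0-1 and 1-2), two noise levels: the entropy-richer
# data set yields the higher complexity, as stated in the text.
rng = np.random.default_rng(1)
quiet = rng.normal(0.0, 0.1, (500, 3))
noisy = rng.normal(0.0, 5.0, (500, 3))
links = [(0, 1), (1, 2)]
print(toy_complexity(quiet, links), "<", toy_complexity(noisy, links))
```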

But what kind of data does OntoSpace require to generate System Maps and to measure the corresponding complexity? If one wants to analyze a corporation, the data which is normally used to manage and run any kind of business can be used: revenue, gross profit, net earnings, production volume, profit margin, debt, staff turnover, taxes, raw materials costs, Fed/ECB rates, total assets, operating income, average sale, etc. In other words, data which is readily available in any corporation. It is important to note that multiple measurements are necessary, say daily or weekly data. More measurements lead to more reliable and credible results.
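Purely for illustration, such a rectangular data table might look as follows in CSV form, with one row per weekly measurement; the variable names and figures below are invented:

```
revenue, gross_profit, production_volume, debt, staff_turnover
1250000, 310000, 8200, 540000, 0.021
1190000, 295000, 7900, 552000, 0.018
1320000, 334000, 8600, 549000, 0.024
1280000, 321000, 8400, 545000, 0.020
```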


In the case of engineering problems, Monte Carlo Simulations constitute the ideal input to OntoSpace. Tools such as iSight from Engineous Inc. are ideal for the purpose; an interface between iSight and OntoSpace has recently been established. OntoSpace is also an ideal tool for processing the results of @RISK (Palisade Corp.) and Crystal Ball (Decisioneering, Inc.). With the former you can set up Monte Carlo Simulations (MCS) with MS Project, while the latter allows you to do the same with MS Excel. In both cases, the goal of a typical analysis is to manage risk exposure. OntoSpace delivers additional information on the complexity of each project option or asset portfolio, in addition to an invaluable Process Map. The simple process is illustrated below.

[Figure: MCS with MS Project and @RISK feeds OntoSpace, which returns a Process Map and complexity measures.]
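To make the shape of such input concrete, here is a minimal Monte Carlo sketch in Python. The project tasks, distributions and file name are invented; it stands in for an MCS run with MS Project and @RISK only in the sense that it produces the same kind of rectangular runs-by-variables table:

```python
import csv
import numpy as np

# Minimal Monte Carlo sketch of a hypothetical three-task project
# (durations in days, triangular optimistic/likely/pessimistic estimates).
rng = np.random.default_rng(42)
n_runs = 1000
design = rng.triangular(8, 10, 15, n_runs)
build  = rng.triangular(20, 25, 40, n_runs)
test   = rng.triangular(5, 7, 12, n_runs)
total  = design + build + test          # tasks executed in series

with open("mcs_runs.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["design", "build", "test", "total"])
    w.writerows(zip(design, build, test, total))
# Each row is one simulation run, each column one variable -- exactly the
# rectangular table a complexity analysis expects as input.
```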

It is evident, at this point, that OntoSpace may be used to process data originating either from direct measurements (sensors, historical data, etc.) or from computer simulations. Simulation has become very popular in the past few decades and its use has spread to almost all disciplines of science and engineering. However, our world-view is still dominated by determinism. Most simulation is done neglecting uncertainty. This produces unrealistically smooth data which, consequently, may be processed using a plethora of statistical methods. Since many simulation codes are very CPU-demanding, further damage is done by resorting to surrogates which emulate the original solver via an even more simplified function; in other words, models of models. The result is that we are more accustomed to seeing smooth, well-behaved scatter plots than pathological ones. For this very reason, the urge to produce lightweight regression models that can imitate the original solver often shifts the analyst's attention from physics to statistics. Highly bifurcated or cluster-dominated data, including outliers and discontinuities, which cannot be processed using conventional statistics, is not even generated. The problem, therefore, doesn't exist. This placebo effect caused by computer-generated data is no longer present when treating real measurements.

In OntoSpace no use is made of statistical methods. No models are used either; we work directly with the data provided by the User. Our approach is totally different: scatter plots are transformed into images, and image-processing technology is used to analyze them. This allows us to treat any data, no matter how ill-conditioned. We can even treat data which is incomplete, i.e. a different number of samples per variable may be processed.
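A minimal sketch of this scatter-to-image idea, assuming a simple occupancy rasterization (the actual image-processing technology in OntoSpace is proprietary and not described here):

```python
import numpy as np

def scatter_to_image(x, y, resolution=32):
    """Rasterize a scatter plot into a small binary image. Once the plot
    is an image, image-processing techniques apply regardless of how
    clustered, discontinuous or outlier-ridden the underlying data is.
    The two samples may even differ in length: only paired values are used."""
    m = min(len(x), len(y))                 # tolerate incomplete data
    img, _, _ = np.histogram2d(x[:m], y[:m], bins=resolution)
    return (img > 0).astype(np.uint8)       # 1 = occupied pixel, 0 = empty
```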

A fundamental property of our unique complexity metric is that it peaks. While entropy will grow indefinitely, by virtue of the Second Law of Thermodynamics, complexity will reach a peak, which we call critical complexity, after which, if no measures are taken, it will progressively decrease. This feature is extremely important in that the phenomenon of complexity growth and subsequent descent is characteristic of all processes in Nature. Growth and evolution continue, but only up to a certain limit, until, as aging progresses, decay takes over. The concept, although already quite intuitive, is underscored by the fact that, according to our philosophy, the fitness of a system is proportional to its complexity. A great example is that of the human body and its evolution over a typical lifespan. Maximum overall fitness, from a physical and mental perspective, is reached well beyond adolescence and certainly well before one can be considered an elder. Similar logic transpires when one analyzes civilizations, political regimes, markets, economies, or even corporations, which flourish, peak, then decline more or less gracefully.

The peak, critical complexity, is a dangerous place. Once the peak is reached, even small increments in entropy will start to erode the structure of the corresponding Process Map. The system, in other words, starts to lose functionality. Depending on the topology of the Process Map, the phase of decay (aging) is more or less traumatic: a fragile topology loses functionality in a pronounced manner, while a healthy, robust and redundant one degrades gracefully. In the case of biological systems little can be done to prolong life. The Second Law of Thermodynamics is inexorable and removing the effects of aging is not yet possible. In the case of corporations, the situation is different: entropy may be drained, and OntoSpace can help plan and drive the process.
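A toy numerical illustration of this lifecycle, with invented functional forms (entropy grows steadily, structure erodes once entropy passes a tolerance limit, and complexity, which needs both, rises, peaks and then declines):

```python
import numpy as np

S = np.linspace(0.1, 10, 200)        # entropy keeps growing (Second Law)
S_c = 4.0                             # invented tolerance of the structure
structure = 1.0 / (1.0 + np.exp(2.0 * (S - S_c)))   # erodes past S_c
C = S * structure                     # complexity needs both ingredients
print(f"complexity peaks at S = {S[np.argmax(C)]:.2f}, "
      f"below the structural limit S_c = {S_c}")
```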

Once complexity can be measured, can it be rationally managed? The answer is of course positive. With our technology, complexity management not only becomes a proper business, it becomes a new way of doing business. From a practical perspective, complexity management may be used to run a corporation, an organization or a process efficiently by helping in the following ways:

- Select the least complex of multiple available business options. Clearly, whenever possible, any manager will opt for a simpler solution. Why add complexity if you can avoid it?
- Identify and locate points/business units in a corporation where complexity is concentrated. This is where to focus efforts when restructuring or streamlining a business process.
- See how far a corporation is from its corresponding critical complexity. The distance to criticality establishes a new, rational and holistic measure of the global state of health of a business.
- Locate points/business units where the system is potentially vulnerable, where risk exposure is greatest and where failure is most likely. Complexity management opens new avenues in corporate risk management in all its facets (financial risk, operational risk, etc.).
- Understand the robustness of a corporation or a business model from a new, holistic and more rational perspective.

In the first quarter of 2007 Ontonix inaugurated an innovative eCommerce-based service known as Complexity On-line. With Complexity On-line, managing complexity is within the reach of any corporation, of any business around the globe. All it takes is a data sheet in CSV (comma-separated values) format and access to a web browser. Once the data is uploaded, its format is checked and the User is notified that he may proceed with the analysis. Upon successful payment via credit card, the analysis is performed automatically and the User is notified that he may download his report. The service is fully automatic, fast and secure. Data anonymity is guaranteed since only data sheets containing unnamed variables are accepted for processing.

With Complexity On-line one can obtain a snapshot of a corporation's health and manageability in less than an hour. In particular, the following information is delivered in a succinct and easy-to-interpret report:


- An intuitive System Map, illustrating the structure of information flow within the corporation.
- The measure of complexity of the business or organization.
- The corresponding critical complexity, close to which the business becomes difficult to manage and grow.
- An overall health/risk rating.
- A ranking of the variables.
- An indication of where the business is most vulnerable and difficult to manage.
- An indication of where the complexity is concentrated.

On-line complexity analysis may be performed at regular time intervals, for example on a monthly or quarterly basis, enabling one to track the evolution of the complexity of a business over time. The value of the information obtained from complexity monitoring is, evidently, paramount. Various systems may continue to perform correctly and yet become increasingly complex. They may silently reach near-critical complexity and suddenly become almost impossible to manage, with little or no early warning.

Let us examine briefly a few examples of application of our complexity management technology. Since increasing complexity is a pressing issue across the board, the applications span a very broad spectrum, ranging from medicine to engineering, from finance to traffic management and conflict anticipation. Because our technology is based on principles of general validity, there is virtually no limit in terms of its applicability.

Banking - monitoring and rating of customers. In the light of the Basel II agreements, banks are interested in establishing new and advanced rating methods for credit, financial and operational risk. Complexity allows such a mechanism to be established in an innovative fashion. Based on each customer's transactions it is easy to measure the customer's complexity and to compare it with the corresponding critical value. The overall health index of a corporation is proportional to the difference between these two values. Tracking this index in time allows the bank to quickly rank and classify its customers from a global health/risk perspective. Hubs (in red) indicate fundamental or key transactions of each customer.

Banking - monitoring and ranking of subsidiaries and branches. Just as with customers, banks have an interest in establishing a holistic health/performance index for each subsidiary or branch. The difference between the complexity and the corresponding critical value establishes an innovative ranking mechanism; a minimal sketch of such a ranking is given below. Highly critical or close-to-critical subsidiaries are more difficult to manage and therefore require more management resources or restructuring. Subsidiaries close to critical complexity have a limited growth potential.
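The text states only that the health index is proportional to the difference between complexity and its critical value; one plausible normalization, with invented branch data, might look as follows:

```python
def health_index(complexity, critical_complexity):
    """Distance to criticality, normalized to [0, 1]: 1 = far from the
    critical limit (healthy), 0 = critically complex. One plausible
    normalization of the difference described in the text."""
    return max(0.0, (critical_complexity - complexity) / critical_complexity)

# Hypothetical branches: (name, measured complexity, critical complexity)
branches = [("Branch A", 12.4, 21.0),
            ("Branch B", 18.9, 20.5),
            ("Branch C", 7.1, 16.8)]
for name, c, c_crit in sorted(branches,
                              key=lambda b: health_index(b[1], b[2]),
                              reverse=True):
    print(f"{name}: health = {health_index(c, c_crit):.2f}")
# Branch B sits closest to its critical complexity and ranks last:
# limited growth potential, more management attention needed.
```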


Banking - process and operational risk analysis. Operational risk analysis is a fundamental concern of modern banking. Bank processes are highly sophisticated, redundant, time-consuming and costly. It is of paramount importance to establish the points of a given process in which vulnerability and fragility are concentrated. These may be identified with the hubs indicated in red and blue in the process maps which OntoSpace creates automatically based, for example, on Monte Carlo analyses with MS Project and @RISK (Palisade Corp.). In a similar fashion, the nodes in which much of the process complexity is concentrated may be established. Information on process vulnerability and complexity may be used to streamline the process while maintaining its performance and robustness intact.

Corporate Performance Management - estimating the growth potential of corporations. It is known that when a system functions in the proximity of its critical complexity, it is very difficult to manage and grow and, at the same time, quite risky. In such situations the system may develop unexpected modes of behaviour. Based on monthly corporate governance data it is possible to measure the evolution of corporate complexity and therefore to track, in a holistic fashion, the manageability and growth potential of the corporation. Before critical complexity is reached, the management may either decide to restructure (drain entropy) or consider a merger. In the latter case the corresponding critical complexity threshold is elevated, thereby increasing the growth potential. Critical business units (hubs of the business) are indicated in red in the process map.

Medicine - monitoring and ranking of cardiology patients. Patients who undergo heart surgery or transplants are carefully monitored during the operation and in the Intensive Care Unit (ICU). The data which is collected may be used to track the complexity of each patient versus time and to issue early warnings before critical complexity is reached; a minimal sketch of such a warning rule is shown below. When a patient's data is critically complex, the situation is difficult to manage, and intervention in any direction may fix a problem locally but create problems elsewhere (kidneys, liver, lungs, etc.). The very high connectivity of the corresponding process map, combined with high entropy, may be the cause behind the inability to stabilize a patient. Patients with high complexity are globally more difficult to manage.
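The sketch below illustrates one such early-warning rule; the safety margin, the hourly track and the critical value are all invented for illustration, not published Ontonix thresholds:

```python
def early_warning(history, critical, margin=0.15):
    """Return the time indices at which measured complexity enters a
    safety margin below the critical value (the warning rule and margin
    are illustrative assumptions)."""
    return [t for t, c in enumerate(history)
            if c >= (1.0 - margin) * critical]

# Hypothetical ICU complexity track, sampled hourly; critical value 30.
track = [18.2, 19.0, 21.5, 23.9, 25.1, 26.4, 27.8]
print(early_warning(track, critical=30.0))   # -> [5, 6]: alert from hour 5 on
```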


Medicine - monitoring and ranking of dialysis patients. In a very similar fashion to cardiology, complexity can be used to monitor and rank patients who undergo kidney dialysis. Complexity may be used to establish and track over time the overall state of health of a patient. One important step towards a more systems-oriented approach in medicine is to establish new holistic measures of a patient's state of health. Today medicine, just like many branches of science, is fragmented and highly specialized in separate compartments. However, organs interact dynamically in a multitude of ways. One way of capturing this interaction is to monitor complexity and to track global and local entropy, hubs and the evolution of the process maps.

Sociology/politics - conflict anticipation and failing states. A socio-economic system becomes fragile when it functions in proximity of its critical complexity. At critical complexity a system becomes vulnerable and extremely fragile, as often a minute change in any of its parameters may trigger transitions to unwanted and unexpected modes of behaviour. Social unrest, civil war or armed conflict are examples of such modes; terrorist attacks are examples of triggers. Moreover, critically complex countries are ideal terrorist sanctuaries. Process maps, complexities and the corresponding critical complexities may be established for single countries or regions. Those regions in which complexity grows at a high rate require closer monitoring. Data such as the CIA's World Factbook may be used for the purpose.

Air Traffic Management - ATC workload estimation. The safety of air traffic systems is related to the workload necessary to manage the traffic. A given number of air traffic controllers can efficiently and safely manage only a certain maximum amount of traffic. Based on real-time traffic data, complexity may be measured and used to establish and monitor critical traffic density. This information can help to estimate the necessary workload and also to identify critical periods at different airports in a given traffic sector. In proximity of critical complexity, traffic becomes difficult to manage and therefore vulnerable. Knowing beforehand that critical complexity is being approached is invaluable towards guaranteeing traffic safety.


Computer-Aided Design - complexity-based design. All other things being equal, a simpler design is always preferable, and engineering is no exception. Thanks to OntoSpace, complexity may be measured in a rational manner and used in the design process like any other engineering attribute (stress, temperature, mass, reliability, etc.). Computer-Aided Design is one field of modern engineering which may benefit from this new approach to design. It is evident that a simpler component will probably be cheaper and easier to manufacture, assemble, service and replace. The design of sophisticated products, together with the corresponding manufacturing processes, can benefit greatly from including complexity in the design loop.

Engineering - monitoring of nuclear power plants. Modern nuclear power plants are designed with a particular focus on safety. However, these systems are very sophisticated and require a multitude of sensors and monitoring devices in order to guarantee controllability, safety and reliability. In the presence of hundreds or even thousands of interacting components, it is of great interest to establish a holistic measure of the state of health of the plant. Complexity is an ideal candidate for the purpose: it provides a single measure which takes into account the way information is transmitted within the system as well as its entropy content. Since complexity and controllability are intimately linked, the implications of near-critical complexity on safety are obvious.

But probably the most attractive use of our complexity management is in crisis anticipation. In our turbulent times of globalizing economy and society, it is of paramount importance to dispose of a pre-alarm system that can alert managers and decision-makers of an approaching crisis. Time is an extremely precious resource. What triggers a crisis is fragility: in a turbulent environment, fragility is the source of vulnerability, hence of increased risk exposure. But a bubble bursts only when it reaches a certain limit. The point is to know this limit ahead of time. This limit is the aforementioned critical complexity, and we know how to measure it. The distance to criticality is a measure of the state of health of a given system. It is precisely on how the state of health evolves that we have built our capability to anticipate crises and issue early warnings.

An eloquent example of our crisis-anticipation technology in action is the US sub-prime crisis, which became known to the wider public in late 2007. We have analyzed four years of US housing market data and have found an outstanding result. The analysis was based on nearly 60 parameters, such as:

- Newly privately owned housing units started
- Newly privately owned housing units under construction
- Newly privately owned housing units completed
- New single family home sales - US sold
- New single family home sales - months supply at current US sales rate
- Existing home sales - US
- Existing home sales - months supply
- New single family home prices - median US
- New single family home prices - US average (houses actually sold)
- Existing home prices - median US
- Existing home prices - average US
- etc.

We have found that the complexity of the housing market started to increase sharply in early 2006, until it reached a plateau in late 2006.

Until late 2006 the robustness, or state of health, of the market had been steadily increasing. However, immediately after it peaked, the robustness commenced to decline sharply; by mid-2007 the values of mid-2005 had been reached. Setting aside detailed analyses of the sub-prime crisis, the important thing to notice is that over a year before the crisis became known to the wider public, clear indications of increasing tension in the market were present. This information would have been invaluable to analysts, banks and rating agencies had it been available. Clearly, similar analyses can be performed for any market segment, or even for single corporations and organizations. The monitoring of complexity establishes new standards in terms of advanced Corporate Performance Monitoring and Management, as well as in Business Intelligence.
The only way to counter the increase of complexity is to actively manage it, not to passively accept the consequences. The number of companies that offer so-called complexity management has increased in the last decade. Although this is a good sign, none of these companies is able to actually quantify complexity. But if you can't measure it, you can't manage it. Serious science starts when you begin to measure.

Ontonix is situated in Europe and in the United States, and has representatives in Germany, South Africa, Israel, and Singapore.

Ontonix S.r.l.
Via Lega Insurrezionale, 7
22100 Como, ITALY
www.ontonix.com
info@ontonix.com

Copyright 2008, Ontonix s.r.l. All rights reserved. No part of this document may be reproduced in any form without the written consent of Ontonix S.r.l.
