
The iBonD Series – intelligent Business on Demand

Volume 2:
CPM – Corporate Performance Management

White Paper:
Analytical Services in a SOA
Part 1: Vendor Independent White Paper and Reference Architecture
English Version 4.1 – February 2008

Authors:
Dr. Wolfgang Martin, Wolfgang Martin Team, S.A.R.L. Martin, Annecy
Richard Nußdorfer, CSA Consulting GmbH, München

Sponsored by
CSA Consulting GmbH / S.A.R.L. Martin

Preface

The present white paper “CPM – Corporate Performance Management” is the second white paper
in the series “iBonD – intelligent Business on Demand”. It describes the business and technical
architecture of operational, tactical, and strategic CPM. CPM is defined as a model that enables a
business to continuously align business goals and processes and to keep them consistent. CPM
works as a closed-loop model for managing the performance of business processes on the
operational, tactical, and strategic level, i.e. planning, monitoring, and controlling. From a business
point of view, this is one logical model; from a technological point of view, however, rather different
technologies from vendors with completely different roots collide: traditional
business intelligence vendors meet business integration vendors. The convergence happens via
the model of a service-oriented architecture (SOA).
CPM is also called “business performance management - BPM”. These two terms are absolutely
equivalent. We prefer the term CPM, since the abbreviation BPM has multiple meanings, e.g.
business process management and business process modeling. We will therefore always use the
term CPM in this white paper; readers used to the term “business performance management”
should keep this equivalence in mind. We will also use the abbreviation BPM, but in this white
paper, BPM will always mean “business process management”.
Traditional Business Intelligence Vendors
These are vendors who used to act in the market of traditional business intelligence tools and data
warehousing and who have evolved into CPM vendors. First, these vendors addressed tactical and
strategic CPM. Today, they are moving to operational CPM (sometimes called BAM – business
activity monitoring). Furthermore, the first BI solutions and products are coming to market that are
SOA-based and publish and consume web services.
Traditional Business Integration Vendors
Vendors of integration platforms (see Nußdorfer, Martin, 2007) have started to offer their first
operational CPM solutions under the labels BAM and PPM (process performance management).
Their challenge is to put CPM metrics into the classical financial business context. Service-
orientation is the prerequisite for doing so.
Best of Breed Products
In addition to the holistic approaches of the vendors in the two camps, there are specialists offering
best-of-breed tools and technologies. These products are especially interesting when business
integration platforms do not include operational CPM features but provide standardized interfaces
to an open process warehouse for accessing all logged data of all process instances...
Goal of the CPM White Paper
Enterprises developing CPM solutions will have to decide which basic platform to choose for CPM
in the context of iBonD and which additional best-of-breed-products will be required. The focus of
this series of White Papers will be to assist any decisions in the described environment.
Both authors have extensive experience in the IT business, in management functions, as analysts and

CPM White Paper / Dr. W. Martin / R. Nußdorfer 2/15/2008 Page 2



business-oriented project leaders. Both have many years of practical experience in dealing with
strategic deliberations and future developments.
The present CPM white paper is divided into two parts. This general Part 1 describes the concepts
and facilities of CPM as well as its reference architecture. Part 2 describes vendor platforms and
solutions for CPM architectures. To give readers a quick survey of the current market, the authors
have created separate descriptions for selected vendors. The following white papers are already
available:
arcplan, Cubeware, epoq, Informatica, in-factory, Panoratio, SAP, Spotfire
Version 3.0 of this white paper was published in August 2006. Version 4.0 (August 2007) is a
completely reworked and updated version. The authors will be delighted to receive reader
feedback, commentary, criticism - and of course compliments! In Version 4.1 we welcome
Panoratio and StatSoft as new sponsors. This gives us the opportunity to update again and to
extend chapter 8 to take into account the additional mergers and acquisitions that have taken
place since version 4.0.1 in November 2007. Furthermore, we added a new chapter 4.3 on text
analytics and updated chapter 6 at the end.

Munich, February 2008
Richard Nußdorfer
Managing Director, CSA Consulting GmbH

Annecy, February 2008
Dr. Wolfgang Martin
Wolfgang Martin Team

Wolfgang Martin Team/CSA Consulting

The authors’ biographies:

Dr. Wolfgang Martin


Biography
Recently designated one of the top 10 most influential IT consultants in Europe
(by Info Economist magazine), Wolfgang Martin is a leading authority on
Customer Relationship Management (CRM), Business Process, Rules, and
Master Data Management (BPM/BRM/MDM), Business Intelligence (BI),
Corporate Performance Management (CPM), and service oriented
architectures (SOA). He is a founding partner of iBonD Ltd, Ventana
Research Advisor, and Research Advisor at the Institute for Business
Intelligence at the Steinbeis University, Berlin.
After 5½ years with META Group, latterly as Senior Vice President International
Application Delivery Strategies, Mr. Martin established the Wolfgang Martin
Team. Here he continues to focus on technological innovations that drive
business, examining their impact on organization, enterprise culture,
business architecture and business processes.
Mr. Martin is a notable commentator on conference platforms and in TV appearances across Europe.
His analytic skills are sought by many of Europe’s leading companies in consulting engagements.
A frequent contributor of articles for IT journals and trade papers, he is also an editor of technical
literature, such as the Strategic Bulletins on BI, CRM and EAI (www.it-research.net), as well as
“Data Warehousing – Data Mining – OLAP” (Bonn, 1998), and "Jahresgutachten CRM",
(Würzburg, 2002, 2003, 2004, 2005 & 2007).
Prior to META Group, Wolfgang Martin held various management positions with Sybase and Software
AG, responsible for business development, marketing and product marketing. Prior to this, he
became an expert on decision support while with Comshare. His academic work included
Computational Statistics at the Universities of Bonn (Germany) and Paris-Sud (France).
Dr. Martin holds a doctorate (Dr. rer. nat.) in Applied Mathematics from the University of Bonn
(Germany).

Richard Nussdorfer
Biography
Richard Nussdorfer has worked for more than 30 years in the IT-industry
as a software architect and business analyst.
His current focus is on modernization of IT architectures through Business
Integration based on service oriented architectures (SOA) and end-to-
end business processes.
Richard’s technical knowledge has been used extensively for integration
projects, modernizing IT-Architectures, and re-centralizing
Client/Server-Architectures to Web-Architectures.
He has published two e-books: Information-Technology and the EAI-Book. He regularly contributes
articles to IT journals and is asked to speak at numerous congresses and seminars on topics such
as Business Integration, Data Warehouses, and Business Processes.
Richard Nussdorfer’s professional experience started in 1970 at Siemens AG in software
development. He then continued as an expert on databases and project leader for database
projects, nationally and internationally, from London to Moscow and from Stockholm to
Johannesburg.
His professional career continued as manager for Software-Marketing in Munich and Business
Development Manager in South Africa.
From 1990 to 1993 he worked as a consultant for Plenum AG in strategic IT-projects.
In 1994 he founded CSA Consulting GmbH where he works today as Managing Director.
Richard Nussdorfer has a degree in computer science from the Technical University in Vienna
(Austria).



Copyright
CSA Consulting GmbH/Richard Nußdorfer and S.A.R.L. Martin/Dr. Wolfgang Martin authored this
report. All data and information was gathered conscientiously and with the greatest attention to
detail, utilizing scientific methods. However, no guarantee can be made with regard to
completeness and accuracy.
CSA Consulting GmbH and S.A.R.L. Martin disclaim all implied warranties, including without
limitation warranties of merchantability or fitness for a particular purpose. CSA Consulting and
S.A.R.L. Martin shall have no liability for any direct, incidental, special or consequential damages
or lost profits. The information is not intended to be used as the primary basis of investment
decisions.
CSA Consulting GmbH and S.A.R.L. Martin reserve all rights to the content of this study. Data and
information remain the property of CSA Consulting GmbH and S.A.R.L. Martin for purposes of data
privacy. Reproductions, even excerpts, are only permitted with the written consent of CSA
Consulting GmbH and S.A.R.L. Martin.
Copyright © 2004 – 2008 CSA Consulting GmbH, Munich/Germany and S.A.R.L. Martin,
Annecy/France

Disclaimer

Reference herein to any specific commercial products, process, or service by trade name,
trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement,
recommendation, or favoring by CSA Consulting GmbH and S.A.R.L. Martin.


Contents

1 Management Summary 7
2 Metamorphosing Business Intelligence 10
2.1 Pitfalls of Business Intelligence 11
2.2 The New Paradigm of Process-Orientation 12
3 CPM – Strategies, Processes, Men and Metrics 18
3.1 Analytics: Process-Oriented Business Intelligence 18
3.2 The Process Ownership Model 23
4 CPM – Methods and Technologies 26
4.1 CPM Business Components 26
4.2 From Business Intelligence to Business Analytics 28
4.3 Text Analytics 30
4.4 Analytics in a SOA 31
4.5 Analytical Services 33
5 Data Integration 38
5.1 Data Integration Platform 38
5.2 Information Services 39
5.3 Meta and Master Data Management 41
5.4 Data Quality 44
6 Latency matters 47
7 CPM and classical BI: fundamental differences 50
8 Players in the CPM/BI Market 51
9 Summary 55
10 The Sponsors 56


1 Management Summary
In economic downturns, budgets become tighter and tighter, and taking the wrong decisions
today can end in disaster. Identifying potential for profit, rigorously cutting cost, and precisely
calculating where to optimally spend the remaining resources are key issues, and not only for top
management. Geopolitical uncertainties make planning much more difficult, but more important
than ever. New regulations like the Sarbanes-Oxley Act in the US, the International Financial
Reporting Standards (IFRS) in the EC, Basel II for banking, and Solvency II for insurance impact
financial reporting and consolidation. What is the next strategic move to master these challenges?
One answer is Corporate Performance Management (CPM), the topic of this second white paper of
the iBonD (“intelligent Business on Demand”) initiative. iBonD explains what makes up winners in
the markets.
Winners do:

• Focus on customers
• Strip away low value activities
• Decentralize decision making
• Drive for compliance
• Industrialize processes
• Collaborate with suppliers, partners, customers
• Focus on agility and are empowered to follow strategic moves on the spot
• and adopt corporate performance management according to the leitmotiv:
You can only manage what you can measure
To summarize: processes make up the competitiveness of an enterprise. Winning and losing in
the global market depends on the quality and flexibility of business processes. Processes become
the new focus of management (see Nußdorfer, Martin, 2007). Winners in the markets industrialize
their business processes and make them agile. Agility means the power to innovate and to
continuously adapt business models and processes to steadily growing market dynamics. Life
cycles of business processes get shorter and shorter. As a consequence, the speed of change
must keep increasing. Drivers for industrialization are continuous optimization and higher
profitability. Industrialization means automation and standardization. It speeds up processes,
increases throughput, and improves quality.

For today’s enterprises, agility and industrialization are key differentiators making up winners or
losers in the world’s global markets.

Business Process Management (BPM) is the answer, a closed-loop model describing the life cycle
of business processes, from analysis and design via flow and execution to planning, monitoring


and controlling. The task of CPM within BPM is planning, monitoring, and controlling of
processes and their performance.
For CPM and BPM, appropriate IT support through the right infrastructure is essential. A
service-oriented architecture (SOA) is required as the infrastructure for closed-loop management of
business processes. BPM and CPM on a SOA enable automated, standardized, reliable,
audit-proof, and flexible processes across business functions, departments, and even across
enterprises. This cuts cost and boosts revenues. SOA-based processes are independent of
the underlying IT systems and applications. Hence, the business can change processes with the
speed of market dynamics and customer needs. You keep sailing close to the wind. The challenge
is continuous adaptation of strategy and processes to market and customer demands; and
moreover, processes must be "intelligent". Through a SOA, analytics can be embedded into
processes. Analytics is key for planning, monitoring and controlling processes and their
performance. The mission is to identify problems in time so that preventive actions can be taken.
An example from day-to-day life explains how predictive models work: In a department
store, the sales areas are stocked up at the right time, before products are out of stock.
This avoids the situation where a customer wants to buy a product and finds himself
standing in front of empty shelves.
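The restocking example can be sketched as a simple reorder-point rule. This is only an illustrative sketch in Python; the demand rate, replenishment lead time, and safety stock figures are assumptions, not data from this paper:

```python
def reorder_point(daily_demand: float, lead_time_days: float, safety_stock: float) -> float:
    """Stock level at which a replenishment order must be placed so the
    shelf is refilled before it runs empty."""
    return daily_demand * lead_time_days + safety_stock

def needs_restock(on_shelf: float, daily_demand: float,
                  lead_time_days: float, safety_stock: float) -> bool:
    """Predictive check: act *before* the shelf is empty."""
    return on_shelf <= reorder_point(daily_demand, lead_time_days, safety_stock)

# A product selling 40 units/day, 2 days to restock, 20 units safety stock:
# order as soon as on-shelf stock drops to 100 units or below.
print(needs_restock(on_shelf=120, daily_demand=40, lead_time_days=2, safety_stock=20))  # False
print(needs_restock(on_shelf=95, daily_demand=40, lead_time_days=2, safety_stock=20))   # True
```

The point is not the arithmetic but the timing: the check runs while stock is still on the shelf, so the action precedes the problem.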

In a process-oriented enterprise, CPM and BPM must go together. CPM is the business model
that enables a business to continuously align business goals and processes and to keep them
consistent. The concept is metrics-driven management, the methodology is CPM, and the
technology is business analytics.

Corporate performance management is fundamentally different from the traditional business
intelligence approach for decision support, executive information, and reporting. Integrated,
embedded analytics is the next step beyond BI. Traditional BI tools (reporting, ad-hoc
querying, OLAP – online analytical processing, data mining etc.) failed to deliver the right
information to the right location at the right time for the right purpose. Traditional business
intelligence tools did not meet management expectations: results applied to processes and
strategy for turning information into value. Return on investment (ROI) in the old tools was typically
rather low, if measurable at all. Traditional business intelligence tools were difficult to master.
Information remained a privilege in many enterprises. Only a handful of experts (the power users
or business analysts) were in a position to exploit information via the old tools. Management
decisions and actions were based on guesses, much less on facts. Embedding analytics into
processes through a SOA overcomes these problems.
A SOA enables adaptability and flexibility by separating process logic and flow from business and
application logic. It is service-oriented, and it includes a common business vocabulary across all
services. SOA-enabled processes can act, not only react. Events can drive process logic and flow.
Example: Product availability should control product publication in a web shop. This prevents
customers from ordering products that are out of stock and helps the business to retain
customers and keep revenues by offering substitute products in the shop.
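As a sketch of such an event-driven rule, the following assumes hypothetical product identifiers and a substitute mapping; it shows only the decision logic, not a real shop system:

```python
SUBSTITUTES = {"kettle-basic": "kettle-deluxe"}  # hypothetical substitute mapping

def visible_products(catalog, stock):
    """Event-driven publication rule: a product is published in the web shop
    only while it is actually available; out-of-stock items are replaced by
    their substitutes where possible."""
    shown = []
    for product in catalog:
        if stock.get(product, 0) > 0:
            shown.append(product)
        else:
            substitute = SUBSTITUTES.get(product)
            if substitute and stock.get(substitute, 0) > 0:
                shown.append(substitute)
    return shown

stock = {"kettle-basic": 0, "kettle-deluxe": 5, "toaster": 3}
print(visible_products(["kettle-basic", "toaster"], stock))
# → ['kettle-deluxe', 'toaster']
```

In a SOA, the stock-change event would trigger this rule, and the resulting list would drive the presentation service of the shop.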


Corporate performance management is a new approach based on business intelligence for
optimally planning, monitoring and controlling business processes and their performance on the
level of operations, tactics, and strategies. CPM is based on metrics associated with the
processes. CPM starts when processes are designed and engineered: metrics have to be derived
simultaneously and in parallel with the operational process design. Goals have to be metricized.
Achievement of goals has to be continuously monitored. Actions must be taken to control the
performance of processes. CPM is a closed-loop model.

CPM provides clear benefits to an enterprise:

• It is a methodology to link strategy to results.
• It turns data into actionable information.
• It empowers all staff by delivering information not only to power users and business analysts,
but to everybody inside and outside the enterprise (“information democracy”).
• It delivers a high degree of accuracy and consistency of information.
• It provides transparency to management and enhances the bottom line.
• It delivers the right information to the right information consumer at the right location in due time
(this is what “real-time” means).


2 Metamorphosing Business Intelligence


What is Business Intelligence (BI)? What is its exact definition? Gartner Group used this term
as early as 1993, but even today the term is far from universally known in business. Just
50% of all enterprises use BI today. So let us start with a definition of BI that is rather widely
accepted by the market:

Business Intelligence means the capacity to know and to understand, as well as the readiness to
comprehend, in order to exercise this knowledge and understanding for mastering and improving
the business. In somewhat more detail, we define: Business Intelligence is a model consisting of all
strategies, processes, and technologies that create information out of data and derive knowledge
out of information, so that business decisions can be based on facts that launch activities for
controlling business strategies and processes.

The idea behind BI principles and concepts is to base decisions on facts and to make “better
decisions”. BI should give answers to questions like:
• Do you know which of your suppliers is mission critical to your production? Will their failure
bring down your production for hours or even days?
• Do you know what percentage of supplier revenue is due to your spending? Do you get good
terms and conditions from suppliers, using this information?
• Do you know who your most profitable customers are? Are you providing superior services in
order to retain them and are you able to service them, up-sell/cross-sell at appropriate points
when interacting with them?
• Do you know in Q1 that you will miss your sales target in Q4, because your actual volume of
leads is insufficient?
• Do you know what revenue you are actually losing because customers cannot connect to your
call centre due to peak demand?
• Do you know how much business you miss by not fully exploiting cross-sell opportunities in
face-to-face encounters, outlets, and web shops?
Do you know how much money this means for your enterprise? Do you know how to find it, get it
and keep it?
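The lead-volume question above can be made concrete with a simple funnel projection. The conversion rate, deal value, and target figures below are illustrative assumptions, chosen only to show the mechanics:

```python
def projected_revenue(open_leads: int, conversion_rate: float, avg_deal_value: float) -> float:
    """Project revenue from the current lead pipeline. If the projection in Q1
    already falls short of the Q4 target, there is still time to act."""
    return open_leads * conversion_rate * avg_deal_value

q4_target = 1_000_000.0
projection = projected_revenue(open_leads=400, conversion_rate=0.05, avg_deal_value=30_000.0)
print(projection)              # 600000.0
print(projection < q4_target)  # True: with today's lead volume, Q4 will be missed
```

Even this crude model answers the Q1 question: it turns a raw count of leads into a fact that can trigger action three quarters early.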
We have been practising BI for 15 to 20 years now. Did we all get the answers we have been
expecting from BI? Since the early days of BI, the problem has been to use its concepts and
principles in daily operational business and to get answers that matter. How are we doing?


2.1 Pitfalls of Business Intelligence

Up to now, business intelligence enabled decision support in the context of strategic planning and
tactical analysis. The goal of traditional BI was to base decisions on facts. Unfortunately, in many
cases this did not deliver the expected added value and enterprise-wide acceptance. Reports,
indicators, analytical applications and the like: where is the real value? Indeed, BI tools failed to
deliver. It was always difficult to measure the value achieved by business intelligence and the data
warehouse. The reason: information per se does not create any value. Value is created when
information is applied, used, and turned into decisions and actions.
What was wrong with traditional Business Intelligence?
• Business Intelligence was bottom-up and not process-oriented. Line of business people were
not sufficiently involved. Genuine, process-oriented business requirements had not been
addressed at all. BI lacked business-oriented relevance.

• Business Intelligence was just an information access model for decision support (i.e. Bill
Inmon’s “Information Factory“; Inmon, 1996). This means information and the analytical
processes for information exploitation were mashed together. The results were inflexibility and
unnecessary complexity. Any innovation got discarded from the very beginning, and acceptance
decreased drastically.
• Business Intelligence did support decision making to a certain degree, but the feedback
component for closing the loop was missing. Taking actions based on decisions was not part of
the model. Indicators that are not in the context of a process bring only limited value. The real
value of information is only achieved when information is deployed in the context of processes.
Example: As soon as an indicator on the strategic level turns red, the owner of that indicator
has to make decisions for launching tactical and operational actions. Information is
deployed, decisions are based on facts, and a much higher value is achieved than with the
traditional BI model, where feedback is not part of the model.
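This closing of the loop can be sketched as a threshold rule that turns an indicator status into an action. The traffic-light thresholds and the escalation action below are hypothetical:

```python
def indicator_status(value: float, green_from: float, red_below: float) -> str:
    """Map a metric value to a traffic-light status."""
    if value >= green_from:
        return "green"
    if value < red_below:
        return "red"
    return "yellow"

def close_the_loop(value: float, green_from: float, red_below: float, on_red):
    """The feedback component traditional BI lacked: when the indicator turns
    red, an action is launched instead of merely reporting the number."""
    status = indicator_status(value, green_from, red_below)
    if status == "red":
        on_red(value)
    return status

actions = []
status = close_the_loop(0.72, green_from=0.95, red_below=0.80,
                        on_red=lambda v: actions.append(f"escalate to owner (value={v})"))
print(status, actions)  # red ['escalate to owner (value=0.72)']
```

The `on_red` callback is the place where, in a SOA, a tactical or operational service would be invoked rather than a list appended.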
• Operational aspects of Business Intelligence were left out. Traditional BI was based on a data
warehouse as the single point of truth. This architecture excluded BI from use in operational
environments. BI was isolated and limited to tactical and strategic analysis. The potential of
real-time analysis was completely neglected.
• Business Intelligence was retrospective. Focus was on analysis and diagnostics only. The
potential of predictive models for identifying problems and risks in time was ignored.
Example: A midrange manufacturing plant analyses the quality of production at the end of
each shift. This guarantees that problems in production are identified as soon as possible,
so that actions can be taken immediately. This ensures that the identified problems will not
recur in the subsequent shift. Pro-active BI creates significant value to be exploited.


• Business Intelligence tools did not serve the information consumer and the manager sufficiently.
Either information was not accessible (or even hidden and retained), or there was an
information deluge. Again, this lowered the acceptance of BI dramatically.
• Business Intelligence was a tools-centric approach based on proprietary technologies. Each
analytical component played its own role in an isolated environment. Incompatibility and
inconsistency were the consequences, and stove-piped information silos were the results. On
the board level, numbers did not match any more.
Business Intelligence has to be reinvented. The old idea of basing decisions on facts is not bad
at all. What should and could be done?

2.2 The New Paradigm of Process-Orientation

Business Intelligence has to be put into the context of processes for achieving business relevance.

What is the relevance of business processes, and what can be achieved by process-orientation?
To get an answer, let us look back: in the 90s, it was common belief that enterprises could run
exclusively on a single-instance ERP application. Enterprises became application-oriented.
Ideally, all business-relevant data was meant to reside in a single database, and all business
functions were meant to be supported by standard (ERP) functionality. Unfortunately, this ideal
world was never achieved. What lessons have been learned?
• “One size ERP application fits all” does not work. The majority of enterprises run several
heterogeneous instances of ERP plus legacy and other systems. Enterprises have an average
of 50 mission critical OLTP systems.
• IT performance suffers. The huge number of point-to-point interfaces necessary to link
applications drives up costs for implementing new applications. The budget for maintaining
these interfaces killed IT innovation. IT became a legacy.
• Process automation is minimal to non-existent. Data has to be manually re-entered from
application to application. This makes process quality low and results in mistakes, failures and
lost money.
• Process integration is modest to non-existent. Processes end at the boundaries of
applications making collaboration with suppliers, partners, and customers impossible. As a
result, enterprises are sluggish and unable to react to changes in the market. Costs are driven
sky-high.
• Changing your strategy and adapting your business processes to the speed and
dynamics of the markets is impossible. Because business processes are hard-coded in the
applications, if you need to change the business process, you need to change the application
and every other application with which it interacts. In consequence, IT dictates the business,
not strategy. This is not practical. Application-oriented enterprises are not agile and will
ultimately lose to the competition.


• Master data is caught in applications. Each application has its own business vocabulary.
Product or order numbers are defined completely differently from one application to the next.
Collaboration across networks of suppliers requires master data translation. Each time you add
a new supplier, customer, or product you must create new translation tables and/or add the
new item to all the translation tables. This makes changes slow, error-prone, and costly.
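The cost of this point-to-point translation is easy to quantify: with n applications, pairwise translation needs a number of tables that grows quadratically, while a central master data hub needs only one mapping per application. A small sketch (the counts are simple combinatorics, not vendor data):

```python
def pairwise_translation_tables(n_apps: int) -> int:
    """Point-to-point master data translation: every pair of applications
    needs its own translation table, so the count grows quadratically."""
    return n_apps * (n_apps - 1) // 2

def hub_mappings(n_apps: int) -> int:
    """With one central master data hub, each application needs only a single
    mapping to the shared business vocabulary."""
    return n_apps

for n in (5, 10, 50):
    print(n, pairwise_translation_tables(n), hub_mappings(n))
# 5 apps: 10 vs 5; 10 apps: 45 vs 10; 50 apps: 1225 vs 50
```

At the 50 mission-critical OLTP systems the paper cites as typical, the difference is 1225 pairwise tables versus 50 hub mappings.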

• Information management is impossible. Timely access to business information across
application islands becomes a luxury enterprises can’t afford. The price of not having access to
business information is even higher.
How can the traditional enterprise be transformed from an application-centric to a process-centric
model? The answer is Business Process Management (BPM).

BPM is a closed-loop model consisting of three phases (Fig. 1):

Phase 1: Analyzing, planning, modeling, testing, and simulating business processes
Phase 2: Executing business processes by cross-application process flows through a process
engine on a SOA (service-oriented architecture) infrastructure
Phase 3: Planning, monitoring and controlling of processes and the performance of the ensemble
of all business processes

To summarize, BPM means closed-loop management of business processes. It enables
synchronization of execution and exception management with continuous and comprehensive
planning, monitoring, and controlling. This synchronization keeps business processes optimized in
line with real-time events and intelligent planning and forecasting.
Business processes are becoming the common communication platform between business and IT
people. For the first time we can create a genuine dialogue between business and IT. The
benefits of process-orientation are obvious:
• Processes become the common communication platform between business and IT. The
specification of business requirements is now based on a common language jointly understood
and spoken by the two parties, business and IT. Technical design of executable processes and
back-end services providing application logic becomes straightforward when based on a
common business design of processes.
• Processes become independent from applications. Collaboration makes enterprises shift to
end-to-end processes across applications and platforms that are executed by rules-based
process-engines running on an integration hub for application and data, the infrastructure for
business process management and service management. An important point is that we are
now dealing with cross-functional, cross-departmental, and even cross-enterprise processes
that exploit the application logic of the existing application landscape.


[Figure 1 (diagram): “The Process-Oriented Enterprise”. The BPM closed loop on a SOA
infrastructure: Model (analysis, design, test, simulation), Execute (rules-based, application-
independent process engine running collaborative business processes), and Plan, Monitor &
Control (Corporate Performance Management with metrics and business analytics).]

Figure 1: Business Process Management (BPM) is a closed-loop model. Management of business


processes becomes the center point of all entrepreneurial actions and activities. Processes are modeled,
executed, planned, monitored, and controlled independently of the existing application framework. The
infrastructure is a SOA (service oriented architecture). Corporate Performance Management (CPM) is a
second closed-loop model for managing the planning, monitoring and controlling of business processes and
their performance within BPM. This process-orientation is the foundation of an intelligent and agile real-time
enterprise.

• Processes benefit from the advantages of service-orientation. A SOA is business-driven.
The granularity of the process model determines the granularity of the business services managed
in a SOA. Furthermore, the SOA maps technical services from existing back-end applications
to business services. This is 100% protection of the investment in the existing IT architecture. With
service-orientation we take the next step and build on top of the existing IT investments.
• Processes run across the underlying application data models. In order to automate event-
driven processes across functions, departments, and enterprises, commonly-used application
touch-points and data across the enterprise must not only be integrated and synchronized, but
data models must be aggregated into a common information model to support collaboration
processes. This common business vocabulary is the heart of master data management.
Uniquely defined and centrally managed ‘meta’ data provides a common platform for all
business terms and items across different applications and business constituents. This is
essential when defining new products, gaining new customers, or adding suppliers to the
business network. One simple update in the master database propagates changes safely and
automatically to all related systems and services.
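The single-update propagation described here can be sketched as a minimal publish/subscribe mechanism; the subscribing system names are purely illustrative:

```python
class MasterData:
    """One central master record store: a single update is propagated
    automatically to every subscribed system."""
    def __init__(self):
        self.records = {}
        self.subscribers = []

    def subscribe(self, system):
        self.subscribers.append(system)

    def update(self, key, value):
        self.records[key] = value
        for notify in self.subscribers:   # propagate to all related systems
            notify(key, value)

received = []
mdm = MasterData()
mdm.subscribe(lambda k, v: received.append(("ERP", k, v)))
mdm.subscribe(lambda k, v: received.append(("CRM", k, v)))
mdm.update("product/4711", {"name": "Widget", "unit": "piece"})
print(received)
# both the "ERP" and "CRM" subscribers receive the same new product definition
```

The sketch ignores error handling and transactional guarantees, which a real master data service must provide, but it shows why one update suffices.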


• Processes consume and publish services. The shift here is from application-oriented thinking to SOA-enabled processes (Fig. 2). For a specific business process, operational, analytical, collaborative, and information services are composed by a rules-based process engine. As a result, a business process itself becomes a service or a group of services. A certain degree of reusability is achieved by avoiding redundant implementation of functions and data. Redundancy was inherent in the old application-oriented model; service-orientation helps to overcome this problem.
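The idea that a process engine composes services, and that the composed process itself becomes a service again, can be sketched as follows. This is a minimal illustration; the two services and their logic are invented for the example.

```python
# Hypothetical sketch: operational and analytical services composed
# into one business process, which itself becomes a callable service.

def credit_check(order):
    """Operational service (assumed): checks the order amount."""
    order["credit_ok"] = order["amount"] < 10_000
    return order

def risk_score(order):
    """Analytical service (assumed): derives a risk class."""
    order["risk"] = "low" if order["credit_ok"] else "high"
    return order

def compose(*services):
    """Compose services into a process; the result is again a service."""
    def process(payload):
        for service in services:
            payload = service(payload)
        return payload
    return process

order_process = compose(credit_check, risk_score)
result = order_process({"amount": 500})
```

Because `order_process` has the same call shape as its parts, it can in turn be composed into a larger process, which is the reusability argument made above.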

[Figure 2 diagram: “Business Analytics in a SOA” – a SOA as IT, enterprise, and collaboration architecture: a process integration hub (BPM, ESB, meta data management) connects a process portal (presentation and collaboration services), intelligence and performance management, back-end services (ERP, CRM, SCM, PLM, DW, legacy) via data integration (DI), content management, Office, CAD/CAM, and B2B partners (marketplaces, suppliers, partners, dealers, customers, social media). © 2007 S.A.R.L. Martin]

Figure 2: A SOA describes the design of the infrastructure for BPM. The implementation is based on an integration hub supporting the life-cycle management of processes and managing the back-end services, including information services (DI = data integration) and meta/master data services. It also provides the B2B interface. Other business domains like content and knowledge management, office, and CAD/CAM can also be incorporated via the integration hub. Corporate Performance Management acts as the brains of the process-oriented enterprise. It provides the “intelligence” for optimally monitoring and controlling all business processes and their performance. Analytics is embedded into the processes for anticipating problems and risks. The Process Portal acts as the human interface. It supports human interactions through collaboration and presentation services. A Process Portal supports multi-channel communication via the web, PDAs, voice, etc. (ESB – enterprise service bus, ERP – enterprise resource planning, CRM – customer relationship management, SCM – supply chain management, PLM – product life cycle management, DW – data warehouse, B2B – business to business)

• Processes drive the transformation to intelligent real-time enterprises. Business intelligence is gleaned from the metrics associated with each business process. Business metrics are defined by goals and objectives to manage a process and its performance in a measurable and proactive way with information, key performance indicators (KPIs), rules, and predictive models.


Example: Let us assume that term of delivery is a goal of the shipment process. Then we first have to make the goal measurable. As a metric, we could define that 90% of all shipments should be delivered within 2 days. This is a strategic metric. An operational business metric could be a predefined threshold for stock in a dealer warehouse. If stock falls below the threshold, an order is automatically executed. The outcome of this metric on the stock level launches an action: it is a proactive metric that avoids out-of-stock situations.
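The operational stock metric from this example can be expressed as a simple decision rule. This is an illustrative sketch; the threshold and reorder quantity are assumptions.

```python
# Proactive operational metric (sketch): if dealer stock falls below
# a predefined threshold, an order is launched automatically.

REORDER_THRESHOLD = 50  # assumed threshold for the dealer warehouse

def check_stock(product, stock_level, order_fn):
    """Evaluate the metric; its outcome launches an action."""
    if stock_level < REORDER_THRESHOLD:
        # Reorder up to twice the threshold (illustrative policy).
        return order_fn(product, REORDER_THRESHOLD * 2 - stock_level)
    return None  # stock is sufficient; no action

orders = []
check_stock("A-100", 30, lambda p, q: orders.append((p, q)) or (p, q))
```

Here the metric outcome (stock 30 is below 50) automatically launches the ordering action, avoiding the out-of-stock situation before it occurs.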
Metrics can be anticipative, as this example shows. In CPM, we go beyond traditional diagnostics. Based on anticipative metrics, processes gain the power to act proactively and to become “self-healing”: problems and risks are identified at the right time, and decisions and actions are taken to prevent damage. In other words:
We have reinvented Business Intelligence. We have put Business Intelligence into the
context of business processes. Business Intelligence becomes Business Analytics.
Business Analytics is about planning, monitoring, and controlling of business processes and their
performance. This model is called Corporate Performance Management (CPM).

Definition: In a process-oriented enterprise, CPM is the model enabling a business to continuously align business goals and processes and to keep them consistent. CPM means planning, monitoring, and controlling of processes.

Finally, we have to define the infrastructure for BPM and CPM so that we can embed analytics into processes. As we have already seen (Fig. 1), this is done through a SOA (Fig. 2). From the IT point of view, agility and industrialization are two contradictory requirements, but if the infrastructure for managing processes is a service-oriented architecture (SOA), then the two principles are brought together. The reason lies in the nature of a SOA: it is a special architecture for providing “software for change”. This is due to the following principles of a SOA:
A SOA is the infrastructure for BPM and CPM that separates process and business logic.
• SOA is a design model for a special enterprise architecture and a special enterprise software architecture.
• SOA is independent of technology.
• SOA is an evolution of component architectures (the principle of “LEGO” programming).
• SOA services are business-driven. The granularity of the process model determines the granularity of business services.
An architecture following these principles is called service-oriented if the following three principles hold:
Service-Orientation
• Principle 1 – Consistent Result Responsibility. The service provider takes responsibility for
the execution and result of the service. The service consumer takes responsibility for
controlling service execution.


• Principle 2 – Unambiguous Service Level. The execution of each service is clearly agreed in terms of time, costs, and quality. Input and output of services are clearly defined and known to both parties through the Service Level Agreement (SLA).
• Principle 3 – Proactive Event Sharing. The service consumer is informed about every agreed change of status of his work order. The service provider is required to immediately inform the service consumer of any unforeseen events.
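The three principles can be illustrated with a small sketch of a service provider that honors an agreed service level and proactively shares status events with its consumer. All names and the SLA parameter are hypothetical.

```python
# Illustrative sketch of the three principles: the provider owns
# execution and result (Principle 1), the service level is explicit
# (Principle 2), and status changes are proactively shared with the
# consumer (Principle 3).

class ServiceProvider:
    def __init__(self, sla_max_seconds, notify):
        self.sla_max_seconds = sla_max_seconds  # Principle 2: agreed service level
        self.notify = notify                    # Principle 3: event-sharing channel

    def execute(self, work_order):
        self.notify("ACCEPTED", work_order)     # agreed change of status
        # Principle 1: the provider is responsible for execution and result.
        result = {"order": work_order, "status": "done"}
        self.notify("COMPLETED", work_order)    # proactive status event
        return result

events = []
provider = ServiceProvider(
    sla_max_seconds=60,
    notify=lambda status, order: events.append((status, order)),
)
outcome = provider.execute("WO-1")
```

The consumer controls execution only through the contract (the SLA parameter and the event stream), never through the provider's internals, which is the separation the principles demand.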
Such a service orientation provides a flexible framework for standardizing and automating business processes, for bundling regional and global competencies into service offerings, for load balancing of peaks “on demand”, and for provisioning services by third parties via a “software as a service” (SaaS) model. Services provide business and decision logic that traditionally was included in applications.
As a prerequisite for applying these three rules, we need a business vocabulary so that all SOA-based processes use the same notation and specifications. A repository is necessary for uniquely defining all meta and master data. The repository for meta and master data plays a role similar to that of the integration hub within a SOA. Thus, the architecture of the repository should be hub-and-spoke so that all meta and master data can be synchronized and versioned across all back-end systems and services. This is the role of master data management (MDM). In Chapter 5, we will discuss MDM in more detail.
Note: ROI is not provided through a SOA, but through the implemented processes.


3 CPM – Strategies, Processes, Men and Metrics

3.1 Analytics: Process-Oriented Business Intelligence

As we have seen in the previous chapter, CPM puts business intelligence into the context of processes. The obvious consequence is that CPM also puts BI into the context of strategy and men. Today, processes are cross-functional, cross-departmental, and cross-enterprise. They link the suppliers of the suppliers with the customers of the customers. Let us recall the definition of a business process.

A business process is…
a set of activities and tasks carried out by resources
(services rendered by people and machines)
using different kinds of information
(structured & unstructured)
by means of diverse interactions
(predictable & unpredictable)
governed by management policies and principles
(business rules & decision criteria)
with the goal of delivering agreed upon final results
(strategies & goals)

The benefits and advantages of integrated end-to-end processes are obvious:


• Faster and more reliable processes cut costs. Automation improves speed and quality of
processes. The result is higher throughput with fewer resources.
• Integrating processes shortens time-to-market. The ability to respond quickly to new
opportunities, customer needs, market dynamics and problems simply translates into increased
revenue and profitability.
• Safe and reliable processes minimise risk. High process quality means less costly aftershocks
to the bottom line. In addition to savings realised by reductions in post-sales service,
enterprises can benefit from high customer satisfaction and, ultimately, market share. The
ability to anticipate problems, customer needs, and market dynamics makes the intelligent real-
time enterprise a reality.

• Through flexible process management (independent of applications) you maximize business flexibility and agility. By removing the constraints of processes hard-coded into ERP and other standard application packages, your processes will move in line with market dynamics.
• Process-orientation creates transparency and traceability. There is no alternative to compliance
with the regulations of public authorities and the requirements imposed by auditors.
This is why Business Process Management (BPM) is one of the most important challenges for today’s enterprises. BPM and CPM are the latest, process-oriented form of managing an


enterprise: planning, execution, and performance management have always been the three basic categories of all management (“make a plan, execute it, and manage to keep the actual in line with the plan”).
CPM within the BPM model is all about managing the performance of all processes that extend across all functions within a business, and beyond, to all other relationships in business-to-business and business-to-consumer. Metrics-oriented management is the top-down principle of CPM for optimal enterprise management using a closed-loop approach (Fig. 3). Business strategy determines which business processes are to be executed and managed by the enterprise. Business metrics are associated with each business process. Business metrics are defined by goals and objectives to manage a process in a measurable way with information, performance indicators, rules, and predictive models.

[Figure 3 diagram: “CPM: Strategy, Goals, Processes, Metrics” – strategy sets goals for a business process driven by events; the loop Measure → Decide → Act controls the process and its end result at cycle speed. © 2007 S.A.R.L. Martin]

Figure 3: Metrics-oriented management is a top-down model for information-based business management. Measurable goals and objectives are derived from the strategy. Based on strategy, goals, and objectives, business processes and business metrics for efficient process control and continuous optimization are modeled in parallel. Technical implementation of processes and metrics follows the principles of a SOA (service-oriented architecture) using operational and analytical services. Based on monitoring, decisions are taken either manually by humans or automatically by decision engines. Decisions lead to actions for controlling the process and its performance (tactical and operational BPM) as well as to updates of strategy, goals, and objectives (strategic BPM). The loop closes. Synchronizing monitoring, decision making, and action taking with the speed of the business process and business dynamics is key; indeed, this is a foundation of the real-time enterprise.


Embedding analytics in processes requires a new approach to process modeling as well as a new approach to business intelligence. Modeling only process logic and flow, as in the past, is insufficient. We now have to model metrics and responsibilities simultaneously. We have to link strategy and goals to processes, metrics, and people, and to build the closed loop. This is all about governance. Governance means the organization and control of activities and resources in the enterprise, oriented toward responsibility and durable, long-term value creation.

Example: Monitoring and controlling of sales processes. Sales methodologies describe and structure the sales activities across the sales cycle. The sales cycle is typically defined as the time period between the identification of a lead and the payment of the bill according to a signed contract. The methodology describes the various levels of qualification of a lead and the actions to be taken to move a lead from one level to the next. These levels correspond to the different states of the sales process, where the desired end result and final state is the payment of the bill. The number of qualification levels depends on the selected sales methodology (and does not matter in this example). Now let us apply CPM to this sample process. The metrics for monitoring and controlling our sales process are the number of qualified leads per level, the estimated/achieved value of a deal, and the transition rate and transition time from each level to the subsequent level. Based on these metrics, we can now act proactively. Assume that the objective of our sales process is a revenue of x € in six months. We can then estimate the number of leads per level necessary to achieve this goal by evaluating the defined metrics and comparing the result with the actual sales data. If the result shows that we will not achieve our goal, we still have time to counteract preventively by taking actions for controlling the process, e.g., additional lead generation to fill up the sales funnel.
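The proactive evaluation in this example can be sketched as a simple funnel projection: multiply the leads at the top of the funnel by the observed transition rates, compare the projected revenue with the goal, and trigger lead generation if there is a shortfall. All figures below are illustrative assumptions.

```python
# Funnel projection sketch: estimate expected revenue from leads and
# transition rates, then decide whether preventive action is needed.

def expected_revenue(leads_at_top, transition_rates, avg_deal_value):
    """Project revenue by attriting the top-of-funnel lead count
    through the transition rate of each qualification level."""
    expected_deals = leads_at_top
    for rate in transition_rates:  # rate from each level to the next
        expected_deals *= rate
    return expected_deals * avg_deal_value

goal = 500_000  # assumed revenue objective in EUR for the period
projection = expected_revenue(
    leads_at_top=400,
    transition_rates=[0.5, 0.4, 0.5],  # e.g. lead -> qualified -> offer -> close
    avg_deal_value=10_000,
)
action = "generate additional leads" if projection < goal else "on track"
```

With these assumed figures, 400 leads shrink to 40 expected deals worth 400,000 €, so the comparison with the 500,000 € goal triggers the preventive lead-generation action while there is still time to act.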
Metrics-oriented management is based on information management. Information has to be available in “right-time” (often called “real-time”, see Nußdorfer, Martin, 2003) for triggering manual or automated decisions for process control. This corresponds to the “information supply chain” paradigm: supply the right information at the right time to the right location and to the right information consumer to trigger the right decision. So “real-time” means synchronization of information supply with information demand. (Note: “real-time” is a relative term and not necessarily related to clock time.)
Business metrics represent management policies within metrics-oriented management. The idea behind this is obvious:

You can only manage what you can measure.

So, flexibility of changing and updating any metrics is one of the top requirements of the model.
Furthermore, business metrics must be consistent. Metrics specified to control the execution of a
particular group of processes should not contradict other metrics. Indeed, metrics are cross-
functional and cross-process: The performance of a business process may influence and interfere
with the performance of other processes.
For example, delivery time, a supply chain related metric, may influence customer
satisfaction, a customer relationship management metric.


These issues are addressed by business scorecards. A business scorecard aligns all management policies represented by all metrics across the enterprise and presents the aggregated top management policy of the enterprise as well as all details for all employees. Examples of particular business scorecards are Norton/Kaplan’s balanced scorecard or the Six Sigma model. The balanced scorecard, for instance, is a collection of metrics that is not only based on financial parameters, but also uses customer, employee, and shareholder loyalty to provide a view of corporate performance beyond the quarterly results. It represents one particular style of management policies. Despite the wide variety of these metrics, the final goal remains the same: transform data into information and knowledge and maximize its value for the business by closing the loop. We now base planning, monitoring, and controlling of processes on information, facts, and knowledge.
CPM is applied to all business domains, like customer relationship management, supply chain management, human resources, etc.
Example: Financial performance management, like any other analytical solution, is a closed-loop process depicting the information management of financial information. The process stretches from planning, budgeting, and forecasting to performance measurement and auditing via financial metrics, including the statutory legal financial reporting and consolidation requirements. Financial performance management includes profitability analysis and planning as well as simulations and what-if analysis. Decisions are then made based on the financial metrics and analysis and fed back into the planning, budgeting, and forecasting activities: the loop is closed.
As Fig. 3 already implies, CPM takes place on three levels: the operational, tactical, and strategic level (Fig. 4). In the past, business intelligence focused on enabling decision support in the context of strategic planning and tactical analysis. This was done with metrics designed for long-term outlooks. The basic concept was to measure and monitor the achievement of strategic goals, for example customer satisfaction, customer value, term of delivery, supplier value, staff fluctuation, etc. “Long term” here relates to the dynamics of the process: the question is how fast actions can influence the process and significantly change the indicators. This is why there is a tactical level. Achievement of tactical goals can be considered as milestones towards the strategic goals. Actions targeting the achievement of tactical goals typically address a time frame between a few days and several months. Today, process orientation operationalizes business intelligence. Operational processes are to be monitored and controlled in right time (“real time”) via intelligence. Operational CPM is also called “Process Performance Management (PPM)”, and it includes “Business Activity Monitoring (BAM)”.
These ideas stem from control theory. Just as room temperature is monitored and controlled by a closed-loop feedback model, business processes shall be monitored and controlled on the operational level, i.e. in real time. Figure 4 depicts the three levels of CPM. The real-time principles of the information supply chain enable monitoring and controlling even of operational systems. An information supply chain is defined by the principle of the availability of the right information at the right time at the right location for the right purpose. Information is treated as the duty of the information provider. In the data warehouse model, information was treated as the duty of the


information consumer. In CPM, the provider of information is responsible for propagating information. The implementation is done through a publish-and-subscribe communication method.
Example: In a web shop, product availability is a valuable metric for controlling the order process. Product availability is an operational metric. It measures stock via sales and supply transactions. Hence, product availability is synchronized with transactions. When product availability falls below a certain predefined threshold, an alert can be launched. Such an alert could trigger an additional shipment. If shipment is not an option, the product could be blocked in the product catalogue so that customers cannot place any orders for it. This is a proactive action that avoids canceling customer orders. In the end, the frustration of customers due to the unavailability of a product is minimized. Furthermore, the blocked product could be tagged with a note stating when the product will be available again.
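This example, combined with the publish-and-subscribe method just described, could be sketched as follows: the stock system publishes availability, and a catalogue service subscribes and blocks the product when availability falls below a threshold. Topic names and the threshold are assumptions for illustration.

```python
# Minimal publish/subscribe sketch of the web shop example.

subscribers = {}  # topic -> list of callbacks

def subscribe(topic, callback):
    subscribers.setdefault(topic, []).append(callback)

def publish(topic, message):
    # The information provider pushes the metric to all consumers.
    for callback in subscribers.get(topic, []):
        callback(message)

blocked = set()
THRESHOLD = 5  # assumed availability threshold

def catalogue_listener(msg):
    # Proactive action: block the product before orders get cancelled.
    if msg["available"] < THRESHOLD:
        blocked.add(msg["product"])

subscribe("availability", catalogue_listener)
publish("availability", {"product": "P-42", "available": 3})
publish("availability", {"product": "P-43", "available": 12})
```

Note the inversion of responsibility: the catalogue never polls the stock system; the provider of the availability metric propagates it, exactly as the information supply chain paradigm requires.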

[Figure 4 diagram: “CPM – Temporal Layers” – top-down, methodology-driven: strategic CPM (planning, long term) and tactical CPM (analysis, mid term: days, weeks, months); bottom-up, project-driven: operational CPM (BAM, operational actions, short term: same day at least). © 2007 S.A.R.L. Martin]

Figure 4: Corporate Performance Management (CPM) is the process of managing the performance of business processes by applying metrics, deciding on the outcome of the metrics, and launching actions for controlling the performance and/or the process: a closed-loop model. One key issue for all CPM approaches is to put the metrics into a monetary context. This requires process-oriented accounting principles like activity-based management/costing.
CPM spans from operational to strategic CPM, but is addressed by two separate camps of vendors rushing to exploit the new opportunities of a strongly growing analytics market: the business integration vendors providing BAM (business activity monitoring) solutions, and the business intelligence vendors with their traditional focus on tactical and strategic solutions. This confuses business and IT people looking for real solutions to their increasingly complex analytical needs.


This example shows how to monitor and control business processes on the operational level through information. Processes are automated; manual interventions by product managers are minimized. By the way, what does “real-time” mean in this example? Typically, product availability is measured twice a day. This is an empirical rule balancing the cost of measuring against the risk of ignoring product availability when controlling the process.
Operational CPM was first addressed by vendors coming from process engineering and business integration, who added reporting and graphical features for visualizing operational performance indicators. Via activity-based management and costing, these metrics can also be put into a monetary context. Technically, this means having access to financial data in a data warehouse.
Tactical and strategic CPM was first addressed by the vendors of traditional business intelligence, who moved from the data warehouse model and business intelligence tools to analytical applications and closed-loop processing. Today, the two independent approaches to one and the same problem still confuse the market, but they have been converging since 2005.

3.2 The Process Ownership Model

At this point, we have to address the question of how to use information. There are two aspects to be considered. On the one hand, we have to find a solution for the information supply chain paradigm, i.e. who needs what information, when, where, and why? On the other hand, we have to understand the skills and training necessary for exploiting information. Let us start with the question of how easy to use CPM tools should be.
Traditional BI tools lacked ease of use. Users had to have deep knowledge of report generators, ad-hoc querying, OLAP tools, spreadsheets, statistical and data mining tools, etc. This know-how was typically acquired through training and education. Business analysts and power users evolved as a new class of people empowered by all these types of tools. Business departments became dependent on this new type of information-empowered employee. Thus, information became a kind of luxury product that was not available to everybody. This is now changing in the CPM model. Analytics for performance management and analytics for enriching operational processes require that everybody participating in a business process be in a position to consume information without training and education. This is enabled by the CPM methods and technologies, as we will see in the following Chapter 4.
Nevertheless, we will continue to engage business analysts and power users, but their role is changing. Due to the better ease of use of CPM tools, business analysts will be less engaged in providing standard information upon request. They can therefore spend more time on interactive analytics (data exploration) and create more value for the enterprise. In addition, they take on a new task: management of the CPM methods and technologies. Identifying and communicating best practices of analytical scenarios will be their charter. This requires close cooperation and collaboration with the information consumers. If an information consumer is confronted with a new, not yet encountered problem in monitoring and controlling his/her business processes, a new


analytical scenario has to be developed jointly with a business analyst. Once solved, the new scenario can be reused within the CPM framework. The CPM organization learns and gets better the longer it applies the CPM model.
Even if CPM provides an intuitive working environment that needs much less training and education, the problem of data deluge still has to be solved. Who needs what information, where, when, and why? The solution comes with the information supply chain model linking processes, metrics, people, and organization: process orientation comes with a process ownership model. This is part of the governance of the BPM, CPM, and SOA model.
The process ownership model describes which constituents (employees, partners, suppliers, customers, etc.) participate in and are responsible for which processes or activities. This enriches the process model with the roles and organizational units of all people involved in executing and managing processes. In metrics-driven management, the process ownership model also includes the metrics that are necessary to monitor and control the process and its performance. This can be understood as information sharing and filtering. The constituents share data, information, and knowledge in the context of their process-oriented communication and collaboration. All other data is filtered out. Consequently, the result is a top-down security model as a by-product of the process ownership model. Information sharing and filtering is done via information profiles describing the context of collaboration based on the process ownership model. Some call this “information democracy” (Fig. 5): the process ownership model includes the information profile describing and filtering exactly the information that is needed by all constituents based on the context of collaboration.
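Information sharing and filtering via information profiles can be sketched as a simple lookup against a process ownership model. The roles, processes, and metrics below are invented for illustration.

```python
# Sketch: a hypothetical process ownership model maps roles to the
# processes they own; each constituent sees only the metrics of
# those processes - no less and no more.

ownership = {  # role -> owned processes (illustrative)
    "sales_manager": {"order_process"},
    "logistics": {"shipment_process"},
}

metrics = [
    {"process": "order_process", "name": "conversion_rate", "value": 0.31},
    {"process": "shipment_process", "name": "term_of_delivery", "value": 1.8},
]

def information_profile(role):
    """Filter: share exactly the metrics the role needs, nothing more."""
    owned = ownership.get(role, set())
    return [m for m in metrics if m["process"] in owned]

visible = information_profile("sales_manager")
```

Because the filter is derived from process ownership rather than defined per user, the top-down security model described above falls out as a by-product.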

[Figure 5 diagram: “CPM – Governance” – enterprise strategy with goals and objectives at the top; CPM organisation, people, technology, and culture driving the enterprise; process ownership and information profiles link people to processes, metrics, and sensors. © 2007 S.A.R.L. Martin]

Figure 5: Information democracy comes with the information supply chain paradigm. Everybody has access to all the information necessary to execute and manage the processes and their performance, as specified by the process ownership model: no less and no more.


Visualization of information according to information profiles should be done by portlets embedded into portals (see also Fig. 2). Portals have evolved from intranet and extranet solutions to the central point of control for collaboration, providing the P2S (person-to-system) interface. A portal is defined as a system that enables sharing and filtering of data/information, functions/functionality, content/knowledge, and processes. This sharing and filtering is related to the functional role of a collaborative team within the process ownership model. A collaborative team is a group of people representing the various constituents that work together according to the collaborative goals and objectives of the team. In this way, portals support cross-functional, cross-departmental, and cross-enterprise virtual teams. As a special case, a team could also be an individual portal user. To summarize, portals enable information democracy.
A process portal (see Fig. 2) can be understood as an abstraction layer linking and aggregating contents and services as well as reducing the complexity of accessing them. In this sense, the team context defines the collaboration bandwidth, i.e. which data/information, functions/functionality, contents/knowledge, and processes are exposed to the collaborative team together with the appropriate collaborative tools. Each portal user gets a personalized environment that can be further individualized. Indeed, such a personal portal can be understood as an integration technology. But the ultimate integration is done via human interaction: within the team context, a user can execute a message transfer between contents and services. Furthermore, process portals also provide synchronous and asynchronous collaborative tools, e.g., e-mail, co-browsing, chat, blogs, instant messaging, web conferencing, etc. We have described the role of portals and their relationship with BPM and SOA in Martin and Nußdorfer (2006).


4 CPM – Methods and Technologies


As we have already seen, CPM is fundamentally different from traditional BI. The focus of BI was on tools, e.g., OLAP, spreadsheets, reporting, ad-hoc querying, statistical and data mining tools, etc. CPM comes with new methods and technologies. The goal is to empower everybody collaborating in the context of a business process with analytics, without requiring them to become specialists in analytics. This principle applies not only to employees, but also to suppliers, partners, dealers, and even customers. Analytics must become consumable by everybody.

4.1 CPM Business Components

Metrics and Key Performance Metrics – Metrics are used to manage the performance of a process and/or to control a process. They are derived top-down from metricized goals out of strategy and process analysis. Metrics work like sensors along a process flow. The final goal is the proactive identification of risks and problems. Early warnings become possible so that preventive actions can be taken to bring a process instantiation back on track (see page 20, example of monitoring and controlling sales processes).
Metrics consist of indicators and scales. Scales define how to interpret instantiations of indicators and what decisions to take. A key performance metric (KPM) is a compound, cumulated metric. Term of delivery is an example of a KPM: it is cumulated from detailed metrics like the delivery time across all customers within a certain time period. Typically, an employee will have a lot of detailed metrics, but just a few selected KPMs. KPMs should be related to personal goals and match the model of management by objectives. In the end, KPMs could have an impact on certain components of the salary.
In the example of term of delivery as a KPM, a decision maker is responsible for interpreting the KPM, making decisions, and taking actions. In the case of such a human interaction, scales are typically visualized by traffic lights and/or speedometers. Green, yellow, and red lights ease and speed up the interpretation of instantiations of KPMs and metrics. In the web shop example about managing the order process by product availability (p. 22), the interpretation is automated by a decision engine; visualization is not necessary.
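Cumulating a KPM such as term of delivery and mapping it onto a traffic-light scale could look like the following sketch; the delivery target and the scale boundaries are illustrative assumptions.

```python
# KPM sketch: cumulate detailed delivery-time metrics across all
# customers in a period, then map the KPM onto a traffic-light scale.

delivery_days = [1, 2, 2, 3, 1, 2]  # one value per shipment in the period

def term_of_delivery_kpm(days, target=2):
    """Share of shipments delivered within the target time (assumed: 2 days)."""
    return sum(1 for d in days if d <= target) / len(days)

def traffic_light(kpm, green=0.9, yellow=0.8):
    """Scale (assumed boundaries): how to interpret the KPM instantiation."""
    if kpm >= green:
        return "green"
    return "yellow" if kpm >= yellow else "red"

kpm = term_of_delivery_kpm(delivery_days)  # 5 of 6 shipments within 2 days
light = traffic_light(kpm)
```

The indicator (the cumulated share) and the scale (the light boundaries) are deliberately separate functions, mirroring the distinction between indicators and scales made above.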
Business Scorecard – This is a consistent and comprehensive group of metrics for monitoring
and controlling a group of processes or even the total enterprise according to a management
policy. Consistency of metrics is very important, because metrics should not be contradictory and
cause conflicts between collaborative teams working in different contexts. The term business
scorecard was first developed for strategic CPM, but is now used for all levels of CPM. Known
models of business scorecards are the already mentioned balanced scorecard of Kaplan and
Norton (www.bscol.com), Baldridge’s scorecard model (www.quality.nist.gov), and the Six Sigma
model (www.isixsigma.com). It should be noted that the majority of enterprises does not exactly
apply one of these models, but uses its own customized scorecard model that is a derivative of one
of these models.

CPM White Paper / Dr. W. Martin / R. Nußdorfer 2/15/2008 Page 26


Wolfgang Martin Team/CSA Consulting


Figure 6: Example of a Strategy Map of a Balanced Scorecard Model built with Actuate/performancesoft. In
the Strategy Map illustrated here, Input and Process metrics are shown in a cause-and-effect relationship to
their respective Outcome metrics. This pictorial representation of the strategy allows the organisation to
evaluate its effectiveness by tracking key measures relating to each corporate objective.

Strategy Maps – Strategy Maps (Fig. 6) are a visual presentation of a strategy based on the
cause-and-effect relationship of input and process metrics to their respective output metrics. The
well-known and typically used indicators of traditional BI were too much biased by financial data
and did not sufficiently consider investments in people, IT, customer relationships as well as in
supplier and partner networks. This is why the standard planning and reporting systems like profit
and loss and cash flow based on the traditional BI indicators are not applicable for monitoring and
controlling of resources beyond finance. In genuine CPM, we overcome this problem by the
concepts of process-orientation: We use metrics as sensors and cause-effect relationships
between the various goals and objectives within a strategy. Strategy determines the goals and
objectives of value creation by processes. This is depicted by strategy maps, and the business
scorecards provide the translation into decisions and actions for monitoring and controlling
processes in a proactive way. Strategy Maps as well as Business Scorecards are not static: market
dynamics and customer needs drive and change strategy, and strategy maps and business
scorecards change with it.


Business Rules – They represent cross-process decision logic in the context of business
expertise and management policies (refer to the definition of a business process on p. 18).
Modeling of rules is done either top-down by an expert-system type approach or bottom-up by
generating predictive models (e.g., a customer behavior model by a data mining process).
Ultimately, rules can be modeled by a combined top-down, bottom-up approach aligning predictive
models with expert rules. Business rules must be managed centrally and independently of
business processes. The reason is the n : m relationship between rules and processes: a rule can
belong to several processes, and a process can have several rules. When business rules are hard-
coded into the processes, rule maintenance inevitably becomes chaotic after a short time, since
the consistency of the rules is endangered.
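A minimal sketch of central rule management with an n : m binding between rules and processes. The rule and process names below are hypothetical; the point is that one rule, registered once, can be bound to several processes and maintained in one place:

```python
# Central rule registry: rules live independently of the processes using them.
class RuleRegistry:
    def __init__(self):
        self._rules = {}        # rule name -> callable implementing the rule
        self._bindings = {}     # process name -> list of bound rule names

    def register(self, name, rule):
        self._rules[name] = rule

    def bind(self, process, name):
        self._bindings.setdefault(process, []).append(name)

    def evaluate(self, process, fact):
        """Apply every rule bound to the process to the given fact."""
        return {name: self._rules[name](fact)
                for name in self._bindings.get(process, [])}

registry = RuleRegistry()
registry.register("credit_limit_ok",
                  lambda f: f["order_value"] <= f["credit_limit"])
# One rule reused by two processes: changing it once keeps both consistent.
registry.bind("order_entry", "credit_limit_ok")
registry.bind("quotation", "credit_limit_ok")

result = registry.evaluate("order_entry",
                           {"order_value": 900, "credit_limit": 1000})
```

Hard-coding the same check into each process would duplicate the logic and invite exactly the consistency problems described above.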
Alerting – Event-orientation requires alerting services. When an event / alert occurs, the
information describing the event / alert is automatically propagated to all recipients that have
subscribed to receive this information. This is set up in the publish and subscribe communication
method using message / queuing infrastructure. The principle of this communication method is
defined by the information supply chain model. All information that is necessary to process the
event / alert should be available to all recipients in right time for making the right decision and
taking the right action. Again, right time means to synchronize the speed of the process with the
delivery of information via the propagation. If speed is high, and the delta between event / alert and
decision / action becomes small, then a human interaction may be to slow: The decision / action
taking must be automated. Examples for automated decision / action taking can be found on
various web sites where recommendation engines are working. Rules engines are state-of-the-art
technology for automated decision taking (see chapt. 6).
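The publish and subscribe principle behind alerting can be sketched as follows. The topic and event below are invented; a production system would sit on a message / queuing infrastructure rather than in-process callbacks:

```python
from collections import defaultdict

class AlertBroker:
    """Minimal publish/subscribe sketch: recipients subscribe to an alert
    topic, and the broker propagates each event to all subscribers."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subscribers[topic]:
            handler(event)

broker = AlertBroker()
received = []
broker.subscribe("stock_low", received.append)   # e.g. a dashboard widget
broker.subscribe("stock_low", lambda e: None)    # e.g. an automated decision engine
broker.publish("stock_low", {"sku": "A-17", "on_hand": 3})
```

Every subscriber of the topic receives the event at propagation time; whether a human or a decision engine reacts is then a question of the tolerated latency, as discussed above.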
Broadcasting – These are services for delivering personalized messages to millions of recipients
via e-mail, fax, pager, mobile phone, PDA etc. RSS (“really simple syndication”) feeds are becoming
the leading technology for data syndication pushed by the more and more widespread usage of
Web 2.0 concepts. Using exception conditions and recurring schedules as triggers, events can be
automatically created and propagated to processes and people within the enterprise or to any
external community. Content can be personalized to the individual subscriber, preventing
information overload and ensuring that security requirements are strictly enforced.

4.2 From Business Intelligence to Business Analytics

Process-orientation drives the evolution from BI to CPM. The CPM business components require
new BI tools and services, a new architecture for positioning the tools into the context of CPM (Fig.
7) as well as fresh thinking in terms of processes and services. We are making the next step
from Business Intelligence to Business Analytics. Here are the fundamental differences from
traditional BI:
Analytics is process-driven, not data-driven. It links business strategy to processes and people
according to their role in collaborative teams: the use and value of information now goes beyond
the power users and business analysts that in the past were the only people benefiting from


information provided by BI tools. Analytics now empowers all participants of the enterprise value
network: suppliers, partners, and dealers as well as customers. It is targeted at the business rather
than at IT.
Analytics can be predictive. It is aimed at responding to unforeseen events and revealing new
insights and unexpected discoveries. It is not limited to the analysis of historical data pre-
programmed into a warehouse or a cube.
Embedded Analytics – from strategy to operations. A SOA makes it happen. Embedding
analytics into operational processes enables synchronising information delivery with process speed
and interacting with information at the speed of business so that decisions and actions can be
taken in right-time. Through embedded analytics, processes become intelligent and event-driven.

CPM – Reference Architecture

(Figure 7 slide: strategy and metricized goals drive processes & metrics and meta data; embedded
analytics and interactive analytics, ad hoc, workflow-guided, collaborative, adaptive, and dynamic,
are implemented by analytical services on top of data integration.)

Figure 7: Reference architecture for CPM. It enables the comparison of products and offerings of the various
vendors for planning / developing, executing and managing CPM. Key is the coupling of modeling of
processes and metrics as well as the top-down implementation of metrics by analytical services and bottom-
up by interactive analytics (data exploration). Data integration is the foundation for CPM. It provides parallel
and simultaneous access of operational and analytical data via services within the framework of the SOA.

Analytics needs data integration – Traditional BI tools worked exclusively on the data
warehouse. The data warehouse provided the “single point of truth”, i.e. reliable and high quality
information. This prohibited the application of BI to operational environments; BI operated only in
the domains of strategic and tactical analysis, and the potential value creation by real-time analytics
was lost. Analytics, in contrast, has parallel and simultaneous access to operational and analytical data
and information. A data integration platform (“enterprise service data bus”) now becomes the single


point of truth. In a SOA, it links CPM to the integration hub and the data warehouse. The data
integration platform provides information services (Fig. 8), possibly as web services, that can be
composed out of any operational and data warehouse data. The data warehouse becomes a
backend service (Fig. 2) providing, in particular, historical data.
Interactive Analytics – analytical processes and collaboration. Interactive analytics (also
called “Data Exploration”) is an ad hoc, dynamic, easy-to-handle, analytical, collaborative process.
The goal is to provide new analytics, e.g., profiles, rules, scores, and segmentations for better
insight into markets, customers, risks etc. In this sense, it is a bottom-up development environment
for metrics and predictive models. A good example of interactive analytics is the development of
predictive models by data mining. The final predictive model is then implemented in a rules engine
controlling an operational process.
Example. Let us consider the process of credit approval in banking. Standard rules for
checking a customer situation for solvency and credit approval can be rather easily
modeled by a financial consultant. This top-down model can be complemented by a bottom-
up model describing the risk of credit failure. This can be identified by data mining customer
data and providing a risk based customer segmentation. A combination of the expert rules
and the generated predictive model provides the final rules. The process of credit approval
can now be automated, its workflow is controlled by a rules engine, and customers can now
run credit approval as a self service on a web site, for instance.
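The combined top-down / bottom-up approach of this example can be sketched as follows. The thresholds and the risk formula below are purely illustrative assumptions, not a real scoring model:

```python
def expert_rules(applicant):
    """Top-down rules a financial consultant might model
    (hypothetical income and debt thresholds)."""
    return (applicant["income"] >= 30000
            and applicant["existing_debt"] < 0.4 * applicant["income"])

def predicted_default_risk(applicant):
    """Stand-in for the bottom-up predictive model, e.g. a score derived
    from a data-mining-based, risk-oriented customer segmentation.
    The formula here is invented for illustration."""
    return 0.1 if applicant["years_as_customer"] >= 5 else 0.3

def approve_credit(applicant, max_risk=0.2):
    # Final rule set: expert rules AND the generated predictive model.
    return expert_rules(applicant) and predicted_default_risk(applicant) <= max_risk

decision = approve_credit({"income": 45000,
                           "existing_debt": 10000,
                           "years_as_customer": 7})
```

A rules engine executing `approve_credit` as decision logic is what allows the approval workflow to run unattended, e.g. as a customer self service on a web site.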
Other examples can be found in the context of cross-/up-selling and customer attrition. It is
important to note that special knowledge of how this intelligence works is not necessary when
working in the context of intelligent processes. Analytics embeds intelligence into the process and
works as a black box. So analytics, including even sophisticated approaches like data mining, text
mining, and web mining, is made consumable for everybody: not only for some thousands of
specialists, but for millions and more information consumers.
Business Analytics enables intelligent processes. Operational processes can now be enriched by
embedded analytics and can be monitored and controlled “in real-time” (BAM). Service-orientation
eases the embedding of analytics. Traditional BI tools turn into analytical services. In a SOA, CPM
is implemented through analytical services.

4.3 Text Analytics

Text analytics is a new type of analytics1. It combines linguistic methods with search engines, text
mining, data mining, and machine learning algorithms. Text analytics is both a technology and
a process for knowledge discovery and extraction in unstructured data. The first goal of text
analytics is to selectively identify entities (like names, dates, locations, conditions) and their
attributes as well as relationships, concepts, and sentiments between entities. The second goal is

1
See also http://www.intelligententerprise.com/blog/archives/2007/02/defining_text_a.html


to create and to visualize classifications based on the identified structures. As an example, an
outcome of text analytics could be the identification of opinion leaders in social networks.

Text analytics is the extension of business analytics and data mining supporting analytics in
Content Management.
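The first goal, selective entity identification, can be hinted at with a crude pattern-matching sketch. Real text analytics combines linguistic methods and machine learning as described above; the regular expressions and sample sentence below are illustrative stand-ins only:

```python
import re

# Crude entity patterns: a real system would use linguistic analysis,
# not regular expressions. Patterns and entity kinds are invented.
PATTERNS = {
    "date": r"\b\d{4}-\d{2}-\d{2}\b",
    "money": r"\b\d+(?:\.\d{2})?\s?(?:EUR|USD)\b",
    "email": r"\b[\w.]+@[\w.]+\.\w+\b",
}

def extract_entities(text):
    """Selectively identify entities of each kind in unstructured text."""
    return {kind: re.findall(pattern, text)
            for kind, pattern in PATTERNS.items()}

entities = extract_entities(
    "Order of 199.99 EUR placed on 2008-02-15 by jane.doe@example.com")
```

The second goal, classification and visualization, would then build on such extracted structures, e.g. grouping documents by the entities and sentiments they mention.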

Text analytics is pushed by the increasing adoption of Web 2.0 principles. Given the additional
data exhibited in social networks, Web 2.0 enables enterprises to address customer segments with
surgical precision, but it also bears uncontrollable risks: in umpteen thousands of blogs and
forums, people talk and chat in all details about products and enterprises, downright lies included.
Expert forums can quickly and lastingly demystify slogans and claims. Price comparisons
create transparency at cyberspeed. Competitive intelligence is no longer restricted to strategic
competitor observation, but moves to operational observation of brand-new competitor products
even before they come to market. For instance, Nokia was well informed in advance about weak
points of Apple iPhones through expert discussions on certain forums. All of this is due to applied
text analytics.
Text analytics is also successful when it is about the identification and classification of critical
customers. Critical customers could be very helpful in removing product flaws, but could also be
notorious grouches and wiseacres.
Example: BMW actively uses blogs. Experience with bloggers has shown that customers
communicate sometimes more positively about BMW products than BMW’s own slogans
would dare to. (Attention: Sony has once tried to influence bloggers and blogs. When this
was brought to light, damage to Sony’s image was serious.) BMW created “M Power World”,
a social network about sportive driving for the special customer segment of buyers of M
models. Here, customers are invited to exchange ideas with BMW developers and designers.
Customer becomes product developer – this is the fundamental Web 2.0 principle.
BMW applies a Web 2.0 forward strategy: Web 2.0 principles become part of their CRM strategy.
An alternative would be a passive strategy of automated observation of selected blogs and forums
by text analytics, identifying critical situations and mood changes as quickly as possible. This is
very well doable with text analytics, but it turns out to be extremely difficult to launch the right
actions in such cases. You may legally enforce the deletion of blog entries, but in reality, they will pop
up elsewhere. In the world of Web 2.0, the principle of “semper aliquid haeret” (something always
sticks) is inexorable. Here, we are entering virgin soil and have a lot to learn.

4.4 Analytics in a SOA

We have already introduced the concept of a SOA. Cross departmental and cross enterprise
processes can be implemented as composite applications – see also Nußdorfer and Martin (2007)
– orchestrating business services according to the process logic. Business services present and
publish the business logic from existing back-end systems (see also Fig. 2), or have to be
developed and/or acquired, if the necessary business logic has not yet been implemented.


There are five categories of services providing business logic (Fig. 8):

Analytics in a SOA

(Figure 8 slide: a collaborative business process orchestrates information, analytical, rules,
operational, and collaborative services; development services, IT management services, and the
Enterprise Service & Service Data Bus with platform and repository sit below, together with data
integration services, third-party (SaaS), application, and access services, and infrastructure
services; the data sources are external data, the data warehouse, unstructured data, and
operational data.)

Figure 8: Business processes orchestrate services within a SOA. The main idea of service orientation is to
split process and business logic. There are five categories of services providing business logic, information,
analytical, rules, operational, and collaborative services. These categories of services can be considered as
“business services”. They are composed out of “Technical Services” provided by 3rd parties (SaaS – software
as a service), backend applications, and the various types of data sources. Furthermore, we need
development services for both, process logic and business logic, and IT Management Services for
administration, execution and security of services. The Enterprise Service Bus together with the Enterprise
Service Data Bus is a kind of intelligent middleware enabling service and data brokerage. It also includes the
service directory listing and publishing all available services.

• Operational Services. They provide transactional business logic like creating a new customer
or a new account, placing an order etc.

• Collaborative Services. They provide services supporting human interactions and person-to-
person communication, like setting up a meeting, search services, and communication services
such as embedded e-mail, chat, SMS, voice etc.
• Rule Services. Rules define the decision logic. A process typically uses several rules,
whereas a rule can be used in several processes. This is why we have to strictly separate
process and decision logic. In a SOA, rules are considered rule services that are
orchestrated by the process engine like any other service. A rule service can also be
understood as an encapsulation of complex rules. Indeed, a rule service could use another
rule service as a sub-rule.


• Analytical Services. They provide analytical business logic like a threshold for product
availability, a predictive model for customer behavior or customer risk, a forecasting service for
sales, etc.
• Information and Data Services. They provide composite information based on structured and
unstructured, operational and analytical data sources like customer address, customer value,
term of delivery etc. Information and data services also include meta data and master data
services.
In this white paper, we now focus on analytical services (chapt. 4.5) and information and data
services (chapt. 5). Before doing so, let us add another note on rule services. They can also be
used to automate human decision making by using a rules engine as a decision engine. A decision
engine should have scheduling features for following up events with intervening actions. For
instance, if a customer has visited a web site and given a positive response, but did not come back
within a certain amount of time, then the decision engine should be able to detect this “non-event”
and send a trigger, for example to a call center agent, for follow-up.
intelligent interactions with all business constituents. For example, they can enable intelligent real-
time interactions with customers in the web or call center channel. In cross/up-selling, decision
engines execute predictive models reflecting customer behavior. The right customer gets the right
offer in right time. This boosts revenues as various business cases have shown.
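The "non-event" detection described above can be sketched as follows. The field names and the follow-up window are assumptions made for illustration:

```python
from datetime import datetime, timedelta

def detect_non_events(responses, now, follow_up_window=timedelta(days=7)):
    """Sketch of a decision engine's scheduling feature: a customer who
    responded positively but did not return within the window is a
    'non-event' that triggers a follow-up action (e.g. a call center task)."""
    triggers = []
    for customer, visit in responses.items():
        if (visit["positive"] and visit["returned_at"] is None
                and now - visit["visited_at"] > follow_up_window):
            triggers.append({"customer": customer,
                             "action": "call_center_follow_up"})
    return triggers

now = datetime(2008, 2, 15)
responses = {
    "c1": {"positive": True, "visited_at": datetime(2008, 2, 1),
           "returned_at": None},   # overdue: triggers follow-up
    "c2": {"positive": True, "visited_at": datetime(2008, 2, 12),
           "returned_at": None},   # still within the window
}
triggers = detect_non_events(responses, now)
```

The essential point is the scheduling: the engine must notice the absence of an event, something an ordinary event-driven rule cannot do on its own.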

4.5 Analytical Services

In a SOA as infrastructure for BPM, embedded analytics is implemented via analytical services.
We can consider analytical services as the technical components of CPM. Analytical services are
implemented as encapsulated, component-based modules that can, but need not, communicate via
web services. They publish business logic. This includes all kinds of analytical content and
functionality, e.g., customizable and extensible templates for metrics in business scorecards and
analytical tools. It also includes all necessary development services for managing the life cycle of
analytical services (implementing, customizing, maintaining). This is why analytics extends the
traditional data warehouse centric BI. It puts intelligence into the context of strategy, goals and
objectives, processes, and people via metrics and predictive models, and it implements intelligence
through analytical services.
Reporting, Query and Analysis Services – In a SOA, traditional business intelligence tools
functionality for reporting (interactive, production, and financial reporting), querying (ad hoc
queries, OLAP) and analysis (data visualization, data mining, statistical tools) is implemented as
components providing analytical services that can be embedded in any collaborative process. In a
SOA, these services can use any information service for data supply so that these services can
now act on composite data stemming from analytical and operational data sources. Analytics goes
real-time whenever relevant for the business.
Planning and Simulation Services – Planning is a typical cross-departmental process that is best
implemented as a SOA based process. So, planning functionality is implemented as planning and


simulation services, providing full flexibility and adaptability of this process to changing business
scenarios. The advantage of implementing planning through a SOA is obvious: the planning
process can be composed out of any analytical and other services, avoiding the redundancy in
analytical functionality that comes with implementing a planning application in a traditional data
warehouse / business intelligence architecture, and fostering rigorous and audit-proof planning
through a controlled process instead of spreadsheet-based, manually driven planning processes.
(Fig. 9)


Figure 9: Planning and simulation with Cubeware Cockpit: A number of planning approaches can be used,
such as top-down with splashing, bottom-up and counterflow. As well as handling several planning
versions, both forecasts and "what-if" simulations can be set up. Current actual values can be accessed by
the planner in the same report. The illustration shows the simulation of the profit and loss assuming a 4 per
cent raise in salaries in relation to the plan IV values of the previous year, taking account of estimated pay
cost factors over the coming year.

Dashboard Services – A dashboard (Fig. 10) visualizes large volumes of information stemming
from various data sources in a compressed way. Degree of compression and the kind of
visualization depend on the goal and on the user. A dashboard can also be used to implement a
business scorecard. It is typically embedded as a portlet in a portal framework. An information
profile is at the core of a dashboard. It describes which information, functions, knowledge and
processes an information consumer (employee, customer, supplier, partner, dealer etc.) must have
access to according to his / her role. Based on the information profile, the dashboard is
personalized according to the paradigm of the information supply chain: each information
consumer gets exactly what he / she needs to do his / her job. Deployment is either passive, i.e.
the information consumer uses search and navigation services to access his / her metrics and is guided


by an analytical workflow, or active, i.e. in case of escalations, events, or alerts, important
information is sent to the information consumer via special channels, e.g. SMS, instant message,
e-mail etc., for triggering decisions and actions. This enables management by exception.
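The role of the information profile in personalizing a dashboard can be sketched in a few lines. The role names, metric names, and values below are invented for illustration:

```python
# Hypothetical information profiles: which metrics each role may see.
INFORMATION_PROFILES = {
    "sales_manager": ["revenue", "term_of_delivery", "pipeline"],
    "supplier": ["term_of_delivery"],
}

# Hypothetical metric store feeding the dashboard.
ALL_METRICS = {
    "revenue": 1.2e6,
    "term_of_delivery": 4.0,
    "pipeline": 37,
    "salary_costs": 0.4e6,   # in no profile above: never displayed
}

def personalize_dashboard(role):
    """Each information consumer gets exactly what he/she needs,
    according to the information profile of his/her role."""
    profile = INFORMATION_PROFILES.get(role, [])
    return {name: ALL_METRICS[name] for name in profile}

supplier_view = personalize_dashboard("supplier")
```

The same mechanism that prevents information overload also enforces security: a metric absent from the profile simply never reaches the consumer's dashboard.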


Figure 10: Example of a Dashboard created with BOARD. BOARD is a CPM toolkit (see chapt. 8) that offers
a variety of presentation services for visualizing indicators, including gauges, stoplights,
thermometers and cockpits. Other highlights are drill-through and ad hoc queries for more detail
about what the dashboard shows, as well as self-service dashboard creation and customization.

Data Integration – Traditional business intelligence tools worked on the data warehouse, whereas
analytical services work on a data integration platform. In a SOA, the data warehouse becomes a
backend data service, and the data integration platform is part of the enterprise service bus (Fig. 2
and 8). Data integration provides information services for analyzing data, master data, and meta
data, for developing data models, and for preparing and profiling any type of data, as well as ETL
(extraction, transformation, load) services. For more on data integration see chapter 5.
Interactive Analytics – Embedded analytics is complemented by interactive analytics (the data
exploration environment). Metrics are not only derived top-down from strategy, goals, and
processes, but can also be derived bottom-up from data. This is the purpose of interactive
analytics. Up to now, mainly traditional business intelligence tools like data mining, statistical tools,
ad hoc querying, OLAP tools etc. have been used, but now on top of the data integration platform.
Meanwhile, a new generation of analytical tools is coming to market that provides considerably
enhanced data visualization techniques and analytical workflows.


Tools for interactive analytics are used by interdisciplinary teams. A specialist for tools and
methods and a power user representing the future information consumers jointly drive this
analytical process. The necessary data services for supplying the tools are provided by an IT
specialist. The IT specialist ideally is a data architect who knows the enterprise data and data
sources well and who can identify and evaluate external data sources and services for enriching the
internal data. In this sense, interactive analytics is still a special task for especially trained experts
with specialized tools. But the new generation of analytical tools brings additional improvements.
They have added collaborative tools for better team support so that communication and
collaboration between the different parties engaged in the analytical process is sufficiently
enhanced.
Real-Time Analytics – Interactive analytics is a highly interactive, human-driven process. When
the amount of data to be analyzed is huge (e.g., in the order of terabytes), the tools become
the bottleneck, not the interdisciplinary team driving the process. Then real-time analytics can
help. Real-time analytics is based on three different principles that can also be combined. Vendors
are listed in chapter 8.
• Special Database Technologies – Technologies like compression, indexing, vector
processing, memory-based caching etc. can dramatically improve the performance of ad hoc
queries and other business intelligence components / tools. This speeds up the exploration
process through faster responses (from hours to minutes and seconds). Technologies in this
category are rather mature.
• In-Memory Databases – This is one of the more recent developments in database
technology. Here, the entire database is processed in memory, providing even more performance
than the specialized database technologies that still store data physically. In-memory databases
especially benefit from a 64-bit address space.
• Special Algorithms – They are used for reading and processing data. They overcome the
limitations of traditional SQL and OLAP technologies. Many vendors combine these features
with the special database technologies described above.


(Figure 11 slides: arcplan Enterprise 5. Top, “Web Services Communication Interface: external to
arcplan”: external information, here market shares, is made accessible via web services and
integrated into an arcplan Enterprise standard sales report, next to data sources accessed via
ODBC, MDX, and XMLA (RDBMS, OLAP, other). Bottom, “arcplan Analytic Services Provider:
arcplan to external”: arcplan Enterprise reports are published via web services to external desktop
users / workplaces.)

Figure 11: Example for embedding analytical services (here via web services). arcplan Enterprise consumes
web services that are orchestrated and presented by its analytical workflow (top). arcplan Enterprise also
publishes services via web services that can be orchestrated by other services and processes (bottom).


5 Data Integration
Data integration first became an issue when data warehouses had to be filled and refreshed. The
solutions were extraction, transformation and load (ETL) processes. But in times of
process-orientation and CPM, data integration has gained a much higher, even mission-critical
importance, and a much more widespread usage.

5.1 Data Integration Platform

It has been common practice to supply a data warehouse by ETL processes. ETL processes are
either supported by batch and / or message / queuing, depending on whether time is critical for
data supply. This will continue, and this is one task that is still addressed by data integration
platforms. But now we need more. We need information and data services (Fig. 8) enabling the
simultaneous access to data warehouse and operational data via a data integration hub (Fig. 2). In
the past, one tried to solve this time-critical data access problem via an ODS (operational data
store). The ODS approach is not always sufficient, because storing data in an ODS may
already exceed a given time window, and unfortunately, the business logic needed for calculating
more complex metrics may be hidden in the application logic and not available on the data level.
There are two options: low latency and zero latency data integration. So, the key point is first to
determine what latency can be tolerated for a given process. Note that latency is correlated with
cost: the lower the tolerated latency, the higher the cost.
The low latency model is based on a data integration platform that collects all relevant
transactional and analytical data and stores it in a so-called low-latency data mart (LLDM). This
requires integration of the data integration platform with the integration hub (ESB) where the
processes and services across all backend applications are managed. The LLDM is refreshed
either by message queuing or by batch, where the batch is executed in short periodicities
according to the tolerated latency (e.g., hourly etc.). The LLDM can be used for low latency data
propagation. This is a feedback loop for triggering events in operational systems via cross-process
metrics. This coupling with operational systems requires managing the data integration platform
like the ESB: The data integration platform is an operational system. (Fig. 12)
This model is different from an operational data store (ODS), where data from operational
databases is stored via ETL processes. Thus, any transaction logic that is not stored in the
operational databases cannot be mapped to an operational data store. Furthermore, the ETL
process is not synchronized with the transactions, i.e. ODS data is not always in sync with the
state of the transactions. This stresses the need for low latency data marts, especially in the case
of legacy systems.
The zero latency model is also called Enterprise Information Integration (EII). It can be
understood as a logical database access layer spanning all operational databases and the
data warehouse, providing information and data as services. The access is done via XML, and the


EII resolves the data request into various SQL statements accessing the corresponding databases
and transforming the data, so that the requested compound data is published as a service and is
available to the process. Indeed, such an information service could also be implemented as a web
service.
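A minimal sketch of the EII idea, using two in-memory SQLite databases as stand-ins for an operational system and the data warehouse. The schemas, table names, and values are invented for illustration:

```python
import sqlite3

# Stand-in for an operational (OLTP) database.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customer (id INTEGER, name TEXT)")
crm.execute("INSERT INTO customer VALUES (1, 'Acme')")

# Stand-in for the data warehouse.
dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE customer_value (customer_id INTEGER, score REAL)")
dwh.execute("INSERT INTO customer_value VALUES (1, 0.87)")

def customer_info_service(customer_id):
    """Zero-latency information service: one logical request is resolved
    into SQL statements against both back ends at request time, and the
    compound result is published as one service response."""
    name = crm.execute("SELECT name FROM customer WHERE id=?",
                       (customer_id,)).fetchone()[0]
    score = dwh.execute("SELECT score FROM customer_value WHERE customer_id=?",
                        (customer_id,)).fetchone()[0]
    return {"id": customer_id, "name": name, "value_score": score}

info = customer_info_service(1)
```

Because both back ends are queried at request time, the result is transaction-synchronous, which is exactly what distinguishes the zero latency model from the LLDM approach.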

Data Latency

(Figure 12 slide: the low-latency path feeds real-time analytics from a data integration platform
with an LLDM, supplied with data or events from the OLTP systems and propagating data back to
them in real time; the zero-latency path, Enterprise Information Integration (EII), gives real-time
analytics direct access across the OLTP systems.)

Figure 12: Real-time data integration can be implemented as a low latency or a zero latency solution. The
low latency solution works with a low latency data mart (LLDM) that stores transaction-synchronous
information and data for BAM/PPM, delayed by the latency time. The zero latency solution (EII) means
transaction-synchronous access to heterogeneous OLTP data (OLTP = online transaction processing). Real-
time data propagation triggers operational systems with events based on cross-process metrics via a data
integration platform.

5.2 Information Services

We have already met information services as a special category of SOA services (Fig. 8). Note that information services not only make perfect sense in a SOA, but also bring great value to enterprises that are not planning to use a SOA as their IT architecture, but suffer from data fragmentation (data kept in data silos in isolated applications or data marts). Let us start with a more detailed definition of an information service.

An information service is a modular, reusable, well-defined, business-relevant service that enables the access, integration and right-time delivery of structured and unstructured, internal or external data throughout the enterprise and across corporate firewalls. An information service can be a meta data service, a master data service, or a data service.



Figure 13: To overcome data fragmentation, information and data is delivered by information services. Information services comprise six different categories; the architecture is shown in the figure. Universal data access services provide access to any internal or external data source. Data integration services provide any type of mapping, matching, and transformation. Delivery services publish information to any information consumer, internal or external. Meta and master data services provide the common business vocabulary. Infrastructure services take care of authentication and security. Administration services provide the functionality for administrators, business analysts and developers for managing the life cycle of all services.

Given the definition of an information service, the next step is now to look at the needs of
information service consumers to identify the different categories of information services and their
architecture. (Fig. 13)
• Universal Data Access Services. Access services are the basic CRUD services for creating,
reading, updating and deleting data from any backend systems, structured or unstructured,
internal or external. Access services also provide zero and/or low latency access to federated
data. This is sometimes called enterprise information integration (EII).
• Infrastructure Services. Infrastructure services include basic functionality around
authentication, access control, logging, etc.
• Data Integration Services. Integration services move data from source data models to target data models via synchronization, transformation, matching, cleansing, profiling, enrichment, federation, etc.
• Meta and Master Data Services. Their purpose is to manage and use the technical and
business metadata and master data for audit, lineage, and impact analysis purposes.


• Data Delivery Services. They automate and standardize the publication of information to all consumers according to a request/reply model or a publish/subscribe model (data syndication). Delivery mechanisms are bulk and/or single records by batch, real-time messaging, or delta mechanisms based on data changes.
• Administration Services. These are services for the life cycle management of the other
services, i.e. development, management, and monitoring and controlling.
The model of service-orientation provides another advantage. Due to the sub-service principle, composite information services can be built for any purpose. Typical examples include data warehousing, data migration, and data consolidation processes. We will discuss other examples like master data management and data quality management in the following chapters.
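The sub-service principle can be illustrated with a minimal Python sketch: a composite information service is orchestrated from an access, an integration, and a delivery sub-service. The service granularity and all names are purely illustrative, not taken from any product:

```python
# Illustrative sub-services from three of the six categories in Figure 13.

def access_service(source):
    """Universal data access: read records from any backend (here: a list)."""
    return list(source)

def integration_service(records):
    """Data integration: a simple transformation/cleansing step."""
    return [{**r, "name": r["name"].strip().title()} for r in records]

def delivery_service(records, consumer):
    """Delivery: publish the result to an information consumer."""
    consumer.extend(records)
    return len(records)

def composite_customer_feed(source, consumer):
    """Composite information service: access -> integrate -> deliver."""
    return delivery_service(integration_service(access_service(source)), consumer)

raw = [{"id": 1, "name": "  meier "}, {"id": 2, "name": "SCHMIDT"}]
out = []
composite_customer_feed(raw, out)
print(out[0]["name"])  # Meier
```

The same sub-services could be recombined into other composites, e.g. a data migration or consolidation process, which is exactly the reuse argument made above.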

5.3 Meta and Master Data Management

Process-orientation needs meta data management. The meta data layer spans all layers of the SOA. Meta data is key to a consistent data model, including its life cycle management, to a consistent comprehension and communication of the data model, and to data quality, data protection and security. Meta data builds the business vocabulary of the enterprise and even across enterprises. Meta data is organized in three layers:
• Layer 1 – Master Data: This is business-oriented meta data providing the foundation of the business vocabulary. Master data is meta data describing business structures like assets, products and services, and the business constituents (e.g., suppliers, customers, employees, partners). This provides the famous single view on all enterprise structures.
• Layer 2 – Navigational Meta Data: Meta data on navigation (e.g., sources and targets of data,
cross references, time stamps)
• Layer 3 – Administrational Meta Data: Meta data on administration (information profiles
including responsibility, security, monitoring and controlling usage etc.)
Meta and master data provide the single point of truth that was traditionally claimed by the data warehouse. Today, this single point of truth is established through data integration. The business vocabulary plays the central role. It controls both BPM and CPM: processes and metrics need a common and uniquely defined language for modeling and for communication to all business constituents in collaboration contexts. Master data describes the data objects of processes and rules: no processes, metrics, and rules without data.
Master data in a SOA is provided by special information services. Traditionally, master data was application dependent, and it was re-implemented in ever new versions with the creation of each new application.
Example. When master data is fragmented across various application islands, each application tends to develop its own terminology. Lack of consistency leads to chaos. Product and order numbers in one application do not match those in other applications.


Collaboration with suppliers and customers gets more and more expensive. Each time a new customer, a new supplier, or a new product is introduced, a lengthy, cumbersome, and error-prone procedure adds the new items to translation tables, or even new translation tables have to be created. Changes get slow, introduce quality problems, and boost costs. But when master data is provided as an information service, one simple update in the master database propagates changes safely and automatically to all related systems.
The example emphasizes again the problems of application orientation. Applications are like silos: terminology and models end at the borders of an application. Cross-application processes are interrupted; metrics, rules and business vocabulary are inconsistent and redundant. Efforts to integrate and synchronize across application islands get more and more complex and expensive. IT gets stuck in maintenance and becomes a blocker to the business. Agility is not feasible in the traditional world of applications. Process- and service-orientation is the way out. Application-independent processes, metrics, rules, and master data are the prerequisite for agility.
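The contrast between per-application translation tables and a master data service can be sketched as follows. The hub & spoke propagation shown here is a deliberately simplified illustration, not a vendor implementation; all names are invented:

```python
# One update in the master store is propagated automatically to all
# subscribed systems (hub & spoke), instead of maintaining translation
# tables per application island.

class MasterDataService:
    def __init__(self):
        self.master = {}        # the single point of truth (hub)
        self.subscribers = []   # the application islands (spokes)

    def subscribe(self, system):
        self.subscribers.append(system)

    def update(self, key, record):
        self.master[key] = record
        for system in self.subscribers:   # propagate to every spoke
            system[key] = dict(record)

crm, billing = {}, {}
mds = MasterDataService()
mds.subscribe(crm)
mds.subscribe(billing)
mds.update("P-100", {"name": "Widget", "unit": "pcs"})
print(crm["P-100"] == billing["P-100"])  # True
```

A real implementation would add versioning and life cycle management, as demanded later in this chapter, but the propagation principle is the same.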


Figure 14: We derive master data from operational data (of the OLTP – online transaction processing –
systems) and we classify master data into operational and analytical master data. Master data is part of the
business layer of meta data (D = Definition, N = Navigation, A = Administration).

The services repository provides a container of all meta and master data. It plays the role of the
integration hub for the meta data of all back end systems in the BPM model. In a SOA, when
services of back end systems are invoked by an integrated solution representing the integration
logic and the flow of the process, then they must speak to each other in the same language that is
based on the business vocabulary of the repository. A point-to-point communication would again
lead into the chaos of isolated islands. The only solution is to transform the meta data model of each back end system into the central business vocabulary of the repository of the BPM integration hub. Then all back end systems can speak to each other, and adding additional back end systems becomes straightforward, easy, and fast.
A subset of layer 1 of meta data is used to describe master data. Master data describes the
structures of an enterprise (Fig. 14). There is operational and analytical master data. Operational
master data is part of transaction data, where transaction data consists of (operational) master
data and inventory data. The various types of operational master data can be derived from the
basic structure of an enterprise. These are all objects and people that are involved in executing
and managing business processes, i.e. products and business constituents (customer, supplier,
dealer, employees). Analytical master data can be derived from CPM and the process ownership model. It represents the principles of measuring and responsibility: time, space, plan and organizational units (e.g., cost center, cost object).

Transparency and Traceability


Collaborative Processes need a
Common Business Vocabulary.
Supplier Enterprise Customer

The 3 pillars
of MDM:
§ Data Integration
§ Data Profiling
§ Data Quality

Master
Synchronizing Data Versioning

Master Data Management (MDM) – Pre-Requisite for


Transparency and Traceability
15 © 2007 S.A.R.L. Martin

Figure 15: Master data management is about establishing information services for synchronizing and versioning all master data across all backend applications within a SOA. The center of master data management is a repository that manages the common business vocabulary so that all processes can use unique structures and terms. Best-practice architecture for such a repository is a hub & spoke architecture corresponding to the architecture of an ESB. The three pillars of MDM are discussed in chapter 5.4.

Meta data and master data are not static. On the contrary, any merger and acquisition, any market
change, any internal organizational restructuring, any update of a business definition and rule
creates new meta data and master data. But it is absolutely insufficient just to update meta data
and master data and store only the most recent version in the repository or in a database. For enterprise planning and for any comparison between past, present, and future, the availability of the total life cycle of all meta data and master data is a must. This is why meta data management and master data management have to be based on life cycle management. The repository must include the life cycle of all meta and master data (Fig. 15). Today, this is a weak point, sometimes even a gap, in vendor offerings and enterprise architectures. Nevertheless, without meta and master data management, BPM, CPM and SOA initiatives will fail.

5.4 Data Quality

What is the day of the year on which most people have their birthday, according to all birthday data stored in all databases in the world? A nonsense question? Not at all. The result is striking: it is the 11th of November. Why? Well, when a new customer is entered into the customer database, there are mandatory fields and optional fields to be filled in. Input into mandatory fields is checked (in many situations at least), but input into optional fields is typically not checked. Birthday data, unfortunately, is stored in optional fields. So what happens? People are lazy, and the easiest and fastest way to input a birthday is “1-1-1-1”…

Enterprises have introduced ERP systems from SAP and others for many millions of euros. One of the drivers was to become more competitive based on all the stored data about markets and customers. CRM via self-service, coupons, pay-cards, communities, and weblogs are best practices for chasing the budgets of customers. Customer-orientation is the rule. Marketing, sales and service work together, supported by collaborative end-to-end processes. Inbound and outbound campaigns in customer interaction centers and/or in web shops are enriched through business analytics. The demand- and customer-driven supply chain is more and more becoming reality. But as we may have already noticed: data quality is the prerequisite.
Example. In the apparel industry, people are already used to collecting all sales transaction data in a data warehouse. Customer profiles are calculated, and a demand profile per boutique can be derived. According to these demand profiles, merchandise is individually allocated to the boutiques. As a consequence, customers will typically find the products they are looking for in “their” boutique. Customer satisfaction and loyalty increase, and in the end, customer profitability increases. There is another consequence: we also cut costs. If a boutique offers the right products to the right customers, then stock is lowered, and lower stock means lower costs. Savings of 30% to 40% of the cost of stock are achievable.
Data quality is key to being more successful with information. The principle “garbage in – garbage out” is without mercy. When enterprises only notice after building CPM solutions that the data stored in SAP and other backend systems is insufficient for SOA-based processes, it is too late!
Example. A leading European mail-order house had a problem with its birthday data. Birthday data allows the calculation of a customer’s age, an important parameter in customer relationship management. So, what can you do if your birthday data is not reliable? There is a
solution that provides a good estimate of a customer’s age: you look at the customer’s first name. First names follow trends, so customer age can be estimated based on patterns of attributing first names to children. But this is an expensive approach, and it will never achieve full reliability. A much better approach is to build data quality into the operational processes from the beginning.
Building quality into processes from the very beginning is a well-known concept. Indeed, this is the idea of total quality management (TQM) that was applied successfully in manufacturing 20 to 25 years ago. TQM for IT is not only an issue of today: when implementing ERP systems, the principles of TQM for assuring data quality should already have been applied. But data quality is still an issue today. In many enterprises, a data quality director and sponsor on the executive level is the exception. Data quality needs management attention.
Example. Let us assume we want to build a 360° view of the customer. The goal is to know the customer in order to serve him or her optimally according to his or her customer value. 60% to 80% of the cost of customer data integration is caused by infrastructure. Customer data integration means synchronizing and versioning customer data from various sources into one single customer data model. Data stems from various application islands, historical and archived databases, external market data, demographic data, web click-stream data and others. When building the customer data model, you may notice that in one backend system there is a table with customer data that could be linked to a table in another system, providing new and not yet available customer insight. Great, but what if the data field that is to be used as the key is not a mandatory field, but just an optional field? Typically, this is the end of the good idea: will the owner of this application be ready to change the optional data field into a mandatory data field just because you tell him that it would ease your job? An IT question turns into a business issue. Indeed, only the business can decide on these questions that seem to be IT questions at first glance, but have to be tackled and solved in a collaborative approach jointly by business and IT.
Data quality needs management attention, as this example proves. Leading enterprises have data quality directors who report directly to the CIO. The CIO brings data quality to management attention and creates a culture of change. The data quality director coordinates the roles of data stewards and data custodians. Data custodians are located in the lines of business. They have the responsibility for data quality from the business point of view, i.e. the content of master and meta data as well as validity rules for certain transaction data. In process-oriented enterprises, the role of a data custodian can be played by the process owner. Data stewards are associated with the data custodians. Their role is to implement all rules and models within the IT systems. Their skill set should include database administration and/or data administration.

Data quality should be implemented into operational processes by a TQM program. The principle is: prevention is better than cure.

But what if it is already too late for prevention? What can be done to improve data quality in already existing databases? There are two complementary types of tools:


Data Profiling. Data profiling is used to analyze the properties of a given data set and to create a profile. There are three types of analysis:
• Column profiles. Analysis of the content and structure of data attributes helps to identify data quality problems related to data types, values, distributions and variances.
• Dependency profiles for identifying intra-table dependencies. Dependency profiling is related to the normalization of a data source. It provides expected, unexpected and weak functional dependencies as well as potential key attributes.
• Redundancy profiles. Redundancy profiling identifies overlaps between attributes of different relations. This is typically used to identify candidate foreign keys and areas of data redundancy.
Tools for data profiling use methods from descriptive statistics (analysis of distributions, tests for outliers) as well as data mining (cluster analysis and decision trees). Data profiling provides an “as is” analysis and is a valuable tool for estimating and directing further investments in data quality. Data profiling tools identify data quality problems much faster than any manual analysis.
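The birthday example from the beginning of this chapter is exactly the kind of problem column profiling detects: a value distribution whose top value is far more frequent than chance would allow flags a likely lazy-input default. A minimal sketch with invented data:

```python
from collections import Counter

# Minimal column profile: distinct count, most frequent value and its share.
# The birthday values below are made up to mimic the "11-11" effect.

def column_profile(values):
    counts = Counter(values)
    top_value, freq = counts.most_common(1)[0]
    return {"distinct": len(counts),
            "top_value": top_value,
            "top_share": freq / len(values)}

birthdays = ["11-11", "03-07", "11-11", "11-11", "22-01", "11-11", "09-12"]
profile = column_profile(birthdays)
print(profile["top_value"], round(profile["top_share"], 2))  # 11-11 0.57
# A top_share far above 1/distinct flags a likely default value.
```

Real profiling tools compute such distributions for every column automatically, which is why they beat manual analysis so clearly.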
Data Cleansing is based on different methods:
• Parsing. Compound data is decomposed.
• Semantic approach. Data is transformed into standard values and formats according to rules.
• Benchmarking. Internal data sources are compared with external sources for verification.
• Matching. Data of similar content in different fields is identified (match customer information
that is stored in different applications to one and the same customer).
• Removing duplicates. (e.g., address data)
• Consolidating. Create a complete data record out of dispersed information (e.g., create one
customer address record)
• Householding. Detect relationships between data (e.g., identify all persons belonging to one and the same household).
• Enriching. External data may enrich the value of cleansed enterprise data.
Data cleansing tools are based on probabilistic, deterministic and knowledge-based procedures. Probabilistic and deterministic procedures use appropriate algorithms, whereas the knowledge-based approach uses country- and language-specific knowledge bases for composing addresses, names or legal entities.
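Two of the cleansing methods listed above, parsing and matching for duplicate removal, can be sketched as follows. The normalization rules are illustrative and far simpler than real probabilistic or knowledge-based procedures:

```python
import re

# Parsing: decompose compound data; matching: build a normalized key so
# that records differing only in case/whitespace are recognized as one.

def parse_name(compound):
    """Parsing: split 'Lastname, Firstname' into its components."""
    last, first = [p.strip() for p in compound.split(",", 1)]
    return {"first": first, "last": last}

def match_key(record):
    """Matching: case- and whitespace-insensitive duplicate-detection key."""
    return re.sub(r"\s+", "", (record["first"] + record["last"]).lower())

records = [parse_name(s) for s in
           ["Meier, Hans", "meier,  hans", "Schmidt, Eva"]]
deduped = {match_key(r): r for r in records}.values()
print(len(list(deduped)))  # 2
```

Consolidation would then merge the surviving records into one complete golden record per match key.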
Managing data quality standards is again a process. It combines profiling and cleansing activities: both provide information about the status of quality progress. A periodic execution of this process helps to continuously monitor and control enterprise data quality. It should be part of the TQM model. This process, like all processes, should be supported by a BPM tool and run on a SOA. The logic of the TQM model orchestrates the data quality services of profiling and cleansing.
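Such a periodic profiling-and-cleansing loop can be sketched as follows; the validity rule (non-empty name) and the 10% quality threshold are invented for illustration:

```python
# The TQM logic orchestrates two data quality services: a profiling service
# measures a quality indicator, and a cleansing service is invoked whenever
# the indicator violates the threshold.

def profile_service(records):
    """Profiling: fraction of records violating the validity rule."""
    return sum(1 for r in records if not r.strip()) / len(records)

def cleansing_service(records):
    """Cleansing: here, simply remove the invalid records."""
    return [r for r in records if r.strip()]

customer_names = ["Meier", "", "Schmidt", "   ", "Huber"]
defect_rate = profile_service(customer_names)
if defect_rate > 0.10:                 # quality threshold of the TQM program
    customer_names = cleansing_service(customer_names)
print(profile_service(customer_names))  # 0.0
```

Executed periodically, the profile values over time form exactly the quality progress metric mentioned above.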


6 Latency matters
Today, CPM must address the operational, tactical, and strategic aspects in a seamless way. Leading process-oriented businesses use highly automated processes for straight-through processing. Metrics trigger decision engines, and actions are taken in an automated way. Human interactions are (still) needed only in the case of exceptions, escalation management, authorization, entry of triggers (self-service), and when applying collaborative services. Now, when the identification of alerts and exceptions becomes time critical, even human interactions become too slow. This is where latency matters and action time becomes critical (Fig. 16). The action time model shows three critical phases: data latency, analysis latency, and decision latency.

(Figure 16 diagram, after Richard Hackathorn and Colin White: the value of an event decreases over time; the action time until an action is taken decomposes into data latency, addressed by real-time data integration, analysis latency, addressed by BAM, and decision latency, addressed by decision engines.)

Figure 16: In operational CPM (BAM), time may be critical. The action time model decomposes action time into data latency, analysis latency, and decision latency, and it shows by which approaches action time can be minimized.

Data Latency. This is addressed by real-time data integration (Fig. 12). As we have already seen, there are two options: low latency and zero latency data integration. For a discussion, we refer back to chapter 5.1.
Analysis Latency. This is addressed by BAM solutions: analytics must now be available in real time. Since analysis latency depends on the complexity of events, we first discuss the various types of events in order to understand the different kinds of BAM solutions and their constraints on analysis latency. We follow the approach taken by Luckham (2002).
• Simple events. These are events where all data necessary to detect the event is available when the event happens. We have already seen examples like product availability and deliverability. The goal of BAM is to compare such an indicator with a threshold in order to launch control actions. Here, the critical part of latency is mainly data latency, since the analysis latency caused by the calculation of the indicator is rather small in comparison.
But analysis latency becomes an issue when predictive models are used. In many situations, the predictive model cannot be derived in real time: data mining does not work in real time. This is the reason why the modeling of predictive models by data mining processes was strictly separated from the application of predictive models in operational processes. So, the usage of a predictive model is real-time, but not the modeling. The predictive model was built in an off-line mode, with the hope that a model based on the past also maps the present and future. An approach to overcome this problem was to periodically remodel the predictive model at the speed of the supposed changes (e.g., weekly, monthly). New approaches and technologies now make a breakthrough possible: predictive models can be made self-learning by adaptive algorithms. They adapt dynamically to the changing process context. Such an adaptive predictive model is always on-line and maps the present, based on the actual data driving the adaptive algorithm. This presents a low latency solution for analysis latency: adaptive models can be recalculated in fractions of a second. This enables the application of adaptive, dynamic models for intelligent customer interactions in call centers or in web shops.
• Event streams. An event stream is a continuous time sequence of events, where timing corresponds, for example, to the arrival times of events in a BAM tool or to time stamps. Monitoring and controlling of traffic in all kinds of networks is typically based on event streams. Examples can be found in telecommunications, information processing, air traffic, ground traffic etc. For BAM tools, different application domains are to be distinguished.
o Simple pattern recognition. BAM tools for this type of problem are based on time series analysis. The goal is the forecast, i.e. the prediction of the outcome of the next event. Typical examples are sales forecasts as well as forecasts of stock prices or of peaks in consumption.
o Complex pattern recognition. Event streams can be conditional: they can happen at different locations at different times and influence each other. BAM tools are then based on multivariate time series analysis. Examples are concurrent and collaborative processes like sales promotions of several competitors in one and the same market. The task for a BAM tool could be to track the effectiveness of the company’s own marketing activities, to measure the impact of its promotions, and to derive marketing strategies for defense or attack based on the BAM.
o Pattern abstraction. Subsequent events can be detailed events of an event on a higher abstraction level. BAM tools are then used to detect and identify the typically higher-value abstract event by semantic reasoning, based on the evaluation of the single, isolated detailed events. For example, consider the analysis of the buying signals of a customer. Customers send signals about their readiness to buy a certain product, in particular if the
investment exceeds a certain level, like buying a car or a house. A BAM tool should detect buying readiness as soon as possible from the received buying signals, so that sales gain a window of opportunity over competitors.
BAM tools for event streams are based on special fast algorithms (e.g., matching algorithms and
other semantic methods). Besides the rather well known domain of time series analysis, this is still
a young area of development and many solutions are in an experimental stage.
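For simple pattern recognition on an event stream, a BAM tool essentially forecasts the next event and raises an alert on large deviations. The following is a deliberately minimal sketch using a moving-average forecast; window size, threshold, and the event values are illustrative:

```python
from collections import deque

# Minimal stream monitor: forecast the next event value from a moving
# average over the last events, alert when the observed value deviates
# from the forecast by more than a threshold.

class StreamMonitor:
    def __init__(self, window=3, threshold=10.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        forecast = (sum(self.values) / len(self.values)
                    if self.values else value)
        alert = abs(value - forecast) > self.threshold
        self.values.append(value)
        return forecast, alert

monitor = StreamMonitor()
for v in [100, 102, 98, 101, 150]:   # the last event is an outlier
    forecast, alert = monitor.observe(v)
print(alert)  # True
```

Production BAM tools replace the moving average with full time series models, but the observe-forecast-alert loop is the common core.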


Figure 17: Today’s tools for data analysis and data mining like the STATISTICA Data Miner support a variety
of statistical methods for developing data driven decision models and comparing their prognostic capabilities.
After selecting the appropriate predictive model or set of models, these tools automatically apply the rules as
shown in the example of customer scoring.
Decision Latency. Indeed, when time matters, decisions can no longer be taken by humans. Decision taking must then be automated by decision engines. Decision engines are based on rule engines. Rules can be generated bottom-up via predictive models. Such a set of rules can be rather complex. For instance, in e-commerce, intelligent customer interactions use predictive models that are derived from various data sources like real-time and historical surfing properties, buying patterns, buying history, catalogue information, sales strategy, and external conditions like time of day, day of the week, and seasonal information, in order to produce recommendations with high relevance. In many cases, the set of rules is simplified and reduced to one single parameter, a score (Fig. 17). Decision rules and scores are identified by using various data sources and previously detected data structures and patterns.
Rules may also be specified by experts in a top-down approach. This is a certain revival of the old expert systems popular in the late 80s and early 90s. Ultimately, rule engines can be modeled by a combination of predictive models and expert rules. Decision engines have been discussed in detail in Martin (2003-B).
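A decision engine combining a bottom-up score with top-down expert rules can be sketched as follows. The score formula, weights, and rules are invented for illustration and stand in for a real predictive model:

```python
# A score (stand-in for a predictive model reduced to one parameter)
# is combined with an expert rule that overrides the model.

def score(customer):
    """Stand-in for a predictive model reduced to one single score."""
    return 0.6 * customer["recency"] + 0.4 * customer["frequency"]

def decide(customer):
    """Decision engine: expert rule first, then the score threshold."""
    if customer["blocked"]:            # top-down expert rule
        return "no_offer"
    return "premium_offer" if score(customer) > 0.5 else "standard_offer"

print(decide({"recency": 0.9, "frequency": 0.8, "blocked": False}))
# premium_offer
```

The point of the combination is exactly the one made above: the model supplies relevance, while expert rules enforce constraints the model cannot learn.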


7 CPM and classical BI: fundamental differences


CPM has evolved from the old decision support and business intelligence approaches, but today, CPM is a completely different model than classical business intelligence.
• CPM is a top-down model that begins with business strategy. Business Process Management
links process analysis and design with cross-functional and cross-departmental process flows
and CPM. Process performance metrics are created at the same time as the processes.
o Business Intelligence (BI) was bottom-up and not process-oriented.
• CPM is based on an information supply chain model that permanently synchronizes the
provision of information with the need for it.
o Business Intelligence was solely an information-providing model (Bill Inmon
"Information Factory").
• CPM is a closed-loop model that controls and monitors business processes at operational,
tactical and strategic levels.
o Business Intelligence only supported decision making, but not action taking. The
operational aspects of Business Intelligence were not covered by a coherent
approach.
• CPM metrics are forward-looking. Predictive models enable the identification of problems
before they appear. Of course all traditional retrospective metrics remain useful.
o Business Intelligence was retrospective (based on the past). The focus was on
analysis and diagnosis. Potentials of predictive models were not exploited.
• CPM enables transparency by means of the sharing and filtering of information in accordance
with the process ownership model. Everybody gets the information required in the context of
his/her processes.
o Business Intelligence tools did not provide the information consumer with the right information. Either information was not accessible (on occasion even hidden or held back), or there was an absolute flood of data (“information for the masses”). This spoiled acceptance.
• CPM is based on analytical services that are published, consumed, and orchestrated in the
context of a SOA (service oriented architecture).
o Business Intelligence was a tool-related approach, based on proprietary
technologies. This resulted in stove piped information silos.


8 Players in the CPM/BI Market

During recent years, the BI market has shown above-average performance with double-digit growth rates. We believe that the following three statements deliver good insight into and good arguments about the reasons: the extensive rearrangement of the BI market, the evolution of the BI role, methods and tools, and the transition to CPM.
Statement 1: The market for Business Intelligence has disintegrated. For several years already, we have observed increasing merger and acquisition activities in the market. The climax came in 2007, when three mega-acquisitions took place: Oracle/Hyperion, SAP/Business Objects and IBM/Cognos. No really big independent BI vendor exists anymore (the only exception: privately held SAS). In consequence, there is no independent BI market any more; it has been absorbed by the BPM/SOA and ERP II markets, respectively. Indeed, the big four in the BPM/SOA market are all leading BI vendors, and this holds also for the ERP II market, where market leader Infor already gives a good example. This did not happen by surprise; we anticipated this trend already in 2006 (expert opinion in is-report 3/2006). But in the new, extended market, the remaining small and independent BI vendors can very nicely occupy interesting and lucrative niches. The ongoing process and service orientation (including SaaS, the new license model for consuming external services in a SOA) empowers a best-of-breed model for vendor selection more than ever, because integration is no longer a challenge, but a given. As a consequence, the outlook for the smaller players is excellent due to this market move. Additionally, new markets for “intelligence” have spun off the former BI market, for instance content intelligence, customer intelligence, financial intelligence, and competitive intelligence. These emerging markets offer new growth opportunities for new and/or repositioned vendors. Indeed, the disintegration of the traditional BI market does not mean the death of the market, but a restart with many opportunities for all players.
Statement 2: For enterprises, BI is more important than ever. During recent years, BI has
changed and evolved considerably. BI became operational, BI was finally put into the context of
business processes, and BI was extended to a closed-loop model for monitoring and controlling
business processes. In the end, the old paradigm that BI only works on top of a data warehouse
was shown to be too restrictive and insufficient. Operational data sources became equal to
classical data warehouse data, and the data warehouse ceased to be the single point of truth.
This created a new challenge: traditional ETL processes continue to be necessary, but more is
needed to arrive at true information management. Thus the idea of enterprise information
management was born. It also enables the transition from BI to CPM. And today we move even
further: the key role of BI for CPM extends to governance, risk management, and compliance
(GRC). BI enters the board level.
Statement 3: BI has arrived at the C level. Indeed, CPM and GRC are among the key
responsibilities of the board. In the past, BI was sold to IT, and many BI projects suffered from
this poor positioning. The value created by BI was very difficult to show, and sometimes even
denied. Huge data warehouses caused costs, but hardly anybody really used all the existing
data. This is now changing. The value creation of BI in the context of business processes is
beyond dispute. GRC

turns the office of the CFO into the control room of the enterprise. Even the role of the CFO is
changing: the CFO undergoes a metamorphosis into a CPO, the chief performance officer with
full responsibility for GRC.

[Figure 18 (diagram): Taxonomy CPM/BI — layered from top to bottom: Action; Business
Scorecard; Decision Engines (CPM); BAM and LLDM; Data Integration Platform (ETL, Data
Warehouse); Enterprise Service Bus. © 2007 S.A.R.L. Martin]
Figure 18: Action-time based taxonomy of CPM market players (see Fig. 16). (BAM = business
activity monitoring; ETL = extraction, transformation, load; LLDM = low latency data mart; EAI =
enterprise application integration; MQ = message / queuing)

Let us now move to the market players. From the three phases of action time (Fig. 16), we can
derive a taxonomy for classifying the players (vendors) in the market (Fig. 18). Key players in the
different categories are listed below, but we do not claim to provide an exhaustive list. More
details on specific vendors will be published in part 2 of this white paper, where each white paper
will map a vendor's architecture and strategy to the vision and reference architecture developed
in this part 1.
Part 2 – Available white papers (February 2008): arcplan, Cubeware, EPOQ, Informatica,
in-factory, Panoratio, SAP, Spotfire (see www.wolfgang-martin-team.net). A white paper on Lixto
is planned for April 2008.
CPM Toolkits provide the actual state-of-the-art implementation of CPM. The term goes back to
Gartner Group. CPM Toolkits are positioned between CPM Suites and single tools (like
spreadsheets). CPM Toolkits provide CPM-specific functionality as services: they are open and
have standardized interfaces, whereas CPM Suites are built on proprietary technologies and are
tightly integrated. This makes the implementation of CPM Suites difficult: they are not easy to
customize, and it takes rather high effort and cost to integrate a CPM Suite into existing

applications. CPM Toolkits follow the SOA principles. This loose coupling of CPM services has
several advantages. Service orientation simplifies customization. Services can be used like
LEGO building blocks: they are easily invoked by right mouse clicks and are ideal for mash-ups.
End users can build their own composite applications: by mashing up CPM services, the user
creates the analytical workflow and orchestrates the analytical services provided by the CPM
Toolkit. This is why CPM Toolkits address the old requirement that BI tools should enable the
business to work autonomously and to build its own reports and analyses without programming.
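The publish/consume/orchestrate pattern behind CPM Toolkits can be illustrated with a minimal sketch. All names here (the registry, the three toy services, the workflow) are hypothetical illustrations, not the API of any actual CPM product: each analytical capability is published as an independent service under a standardized interface, and a consumer composes services by name into an analytical workflow without depending on any provider's internals.

```python
# Minimal sketch (hypothetical names) of publishing and orchestrating
# loosely coupled analytical services, as in a CPM Toolkit mash-up.
from typing import Callable, Dict, List

# Service registry: services are published under a name and consumed by
# that name alone -- the consumer never touches the provider's internals.
SERVICE_REGISTRY: Dict[str, Callable[[dict], dict]] = {}

def publish(name: str):
    """Register a function as a named analytical service."""
    def decorator(fn: Callable[[dict], dict]):
        SERVICE_REGISTRY[name] = fn
        return fn
    return decorator

@publish("extract")
def extract(ctx: dict) -> dict:
    # Stand-in for an ETL / data-integration service.
    ctx["rows"] = [{"region": "North", "revenue": 120},
                   {"region": "South", "revenue": 80}]
    return ctx

@publish("score")
def score(ctx: dict) -> dict:
    # Stand-in for a predictive/decision service: flag regions below target.
    ctx["alerts"] = [r["region"] for r in ctx["rows"] if r["revenue"] < 100]
    return ctx

@publish("report")
def report(ctx: dict) -> dict:
    # Stand-in for a scorecard/presentation service.
    ctx["summary"] = (f"{len(ctx['alerts'])} region(s) below target: "
                      f"{', '.join(ctx['alerts'])}")
    return ctx

def orchestrate(workflow: List[str], ctx: dict) -> dict:
    """Invoke the named services in order -- the 'analytical workflow'."""
    for name in workflow:
        ctx = SERVICE_REGISTRY[name](ctx)
    return ctx

result = orchestrate(["extract", "score", "report"], {})
print(result["summary"])  # -> 1 region(s) below target: South
```

The point of the sketch is the decoupling: swapping the "score" service for another provider's implementation changes nothing in the workflow definition, which is exactly the best-of-breed flexibility the text attributes to service orientation.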
The following listing of vendors is not supposed to provide a complete view of the market, but it
is quite comprehensive and puts a focus on the German-speaking markets: it includes quite a
number of local players.
Data Warehouse / Data Mart (database technologies – MOLAP, ROLAP, relational & special
technologies)
BOARD, Computer Associates/Cleverpath, IBM, IBM/Cognos/Applix, Infor/MIS AG, instantOLAP,
InterSystems, Kognitio, KxSystems, Microsoft, MicroStrategy, MIK, Netezza, NCR/Teradata,
Oracle, Panoratio, Sand Technology, SAP, SAS, Sybase, Xcelerix
Open Source: The Bee Project, Greenplum, Jedox, mySQL2, Pentaho
Business Intelligence / Business Analytics (Frontend: Suites, Toolkits and Specialized
Tools)
§ The BIG FOUR: IBM, Microsoft, Oracle, SAP
§ Leading world-wide specialists: Actuate, Information Builders, MicroStrategy, SAS, SPSS
§ Challengers: arcplan, aruba Informatik, Aruna, BOARD, CA/Cleverpath, Cubeware, Group 1,
Human IT (InfoZoom), Infor/Extensity, instantOLAP, Menta, MIK, Panorama, Panoratio,
Prevero, QlikTech, Q4bis, SAMAC (IBM iSeries only), Targit, Tibco/Spotfire
Open Source: Actuate/BIRT, The Bee Project, JasperSoft, Pentaho
Business Activity Monitoring/Complex Event Processing (BAM/CEP)
Aleri, AptSoft, Axway, Business Code, Coral8, Coremetrics, EPOQ, Gemstone, HP TSG, IBM, IDS
Scheer, Information Builders, Microsoft, Oracle, Senactive, SL Corporation, Software
AG/WebMethods, SUN/SeeBeyond, Systar, Tibco, Vitria, WebTrends
Business Scorecards
Active Strategy, Actuate/PerformanceSoft, Antares, arcplan, aruba Informatik, BOARD, BOC,
Business CoDe, Coda, Communic, Corporate Planning, Cubeware, Dialog Strategy, hfp, Hologram
BI, Horvath & Partner, Hyperspace, IBM, IBM/Cognos, IDS Scheer, iGrafix, Infor/MIS, macs
software, Microsoft/ProClarity, MIK, Oracle, Panorama, Prevero, Procos, Prodacapo, QPR
Software, SAP/Business Objects, SAS, Stratsys, Targit

2
In January 2008, Sun Microsystems announced its intention to acquire MySQL.


Predictive Models (Data Mining & related vendors)


Anderson Analytics, Angoss, Autonomy/NCorp, Chordiant, Cognos, Coremetrics, EPOQ, Fair
Isaac, IBM, Infor/E.piphany, Insightful (Miner, S-Plus), ISoft, Kana, KXEN, Magnify, Megaputer,
Microsoft, NCR/Teradata, Oracle, Portrait Software, Prudsys, SAP, SAS, SPSS, StatSoft,
thinkAnalytics, Treparel, Unica, Verix, Viscovery
Open Source: Knime, Orange, RapidMiner, Rattle, R-Project, Weka
Text Analytics
Aero Text, Anderson Analytics, Attensity, Basis Technology, Clarabridge, Clear Forest, LexisNexis,
Linguamatics, Nstein, SAP/Business Objects, SAS, SPSS, StatSoft, Teezir, Temis Group, Treparel
Open Source: Gate, RapidMiner
Decision Engines
Chordiant, Corticon, EPOQ, Fair Isaac, Haley, IBM, ILog, Infor/E.piphany Innovations, Kana,
MicroStrategy, Oracle, Portrait Software, Prudsys, SAP, SAS, SPSS, StatSoft, thinkAnalytics,
Tibco, Versata, Viscovery
Financial Performance Management (Budgeting, Planning, Financial Consolidation etc.)
Acorn System, arcplan, BOARD, Coda, Complan & Partner, Corporate Planning, Cubus AG,
Denzhorn, Hologram BI, IBM/Cognos, IDL, Infor/MIS AG, Longview, LucaNet, macs Software,
Microsoft, MIK, Oracle/Hyperion, Orbis AG, Prevero, Procos, Prodacapo, PST, SAP/Business
Objects, SAP/OutlookSoft, SAS, Software4You, Targit, Winterheller
Data Integration – Platforms
Adeptia, Composite Software, ETI, Gemstone, IBM, Informatica, Information Builders, ISoft,
Oracle, Pervasive, Red Hat/MetaMatrix, SAS/DataFlux, SAP/Business Objects, Software AG,
SterlingCommerce, Tibco
Data Integration – Special Tools, Webintegration
Qitera, Kapow Technologies, Lixto, Teezir
ETL
AbInitio, BOARD, Cubeware, ETI, Group 1, IBM, Informatica, Information Builders, ISoft, Kognitio,
Menta, Microsoft, Open Text, Oracle, Pervasive, SAP, SAS, Sybase/Solonde
Open Source: The Bee Project, CloverETL, Enhydra Octopus, KETL, Pentaho/Kettle, Talend
Data Quality
Address Solutions, Datras, ETI, Group1, Harte Henks, Human Inference, IBM, Informatica,
Innovative System, Nokia/Identity Systems, Omikron, Oracle, SAP/Business Objects,
SAS/DataFlux, tekko, Uniserv


9 Summary

We believe that our vision of corporate performance management across operations, tactics,
and strategies will become a standard for continuously evolving metrics-driven management. We
also believe that our reference architecture for analytical service infrastructures will become a
standard for dynamic, enterprise-specific service-oriented architectures (SOA). This white paper
will help in making strategic decisions on strategies and platforms.
CPM is the answer to today's challenges in running a business: you can only manage what you
can measure. This is one of the leitmotivs that will lead enterprises into a successful future.
Munich, February 2008 Annecy, February 2008
E-Mail-Addresses:
Richard.Nussdorfer@csaconsult.de
Wolfgang.Martin@wolfgang-martin-team.net

Literature:
Inmon, W.H., Imhoff, C., and Sousa, R.: Corporate Information Factory, New York, John Wiley & Sons, 1998,
274 pages
Luckham, D.: The Power of Events: An Introduction to Complex Event Processing in Distributed Enterprise
Systems, Boston, Addison Wesley Professional, 2002, 400 pages
Martin, W.: Business Performance Management und Real-Time Enterprise – auf dem Weg zur Information
Democracy, Strategic Bulletin, IT Research, www.it-research.net, Sauerlach bei München, 2003-A, 32 pages
Martin, W.: CRM 2004 – Kundenbeziehungsmanagement im Echtzeitunternehmen, Strategic Bulletin, IT
Research, www.it-research.net, Sauerlach bei München, 2003-B, 32 pages
Martin, W.: BI 2004 – Business Intelligence trifft Business Integration, Strategic Bulletin, IT Research,
www.it-research.net, Sauerlach bei München, 2004, 32 pages
Martin, W.: SOA 2008 – SOA basierendes Geschäftsprozessmanagement, Strategic Bulletin, IT-Verlag für
Informationstechnik GmbH, Sauerlach, 2007, 28 pages
Martin, W., Nußdorfer, R.: Role of Portals in a service oriented architecture (SOA) – “Status and Trend –
Processes and People – Presentation and Collaboration Services“, iBond White Paper Vol. 4, www.soa-
forum.net, Munich, 2006, 33 pages
Nußdorfer, R., Martin, W.: RTE – Real-Time Oriented IT Architecture: All Together Now, Strategic Planning
of IT Architectures, iBonD White Paper Vol. 1, www.soa-forum.net; 2003, Munich, 35 pages
Nußdorfer, R., Martin, W.: BPM – Business Process Management – Änderung des Entwicklungsparadigmas,
Kompendium “Geschäftsprozesse als Lösungen“, iBonD White Paper Vol. 3, www.soa-forum.net; 2007, 43
pages


10 The Sponsors

BOARD International
Founded in 1995, BOARD is a global leader in the BI and CPM (Corporate Performance Management)
toolkit space, offering a unique combination of speed and simplicity. BOARD has enabled over 2,000
companies worldwide to rapidly deploy BI & CPM applications in a single integrated environment,
completely programming-free and in a fraction of the time and cost associated with traditional solutions.
BOARD provides one accurate, corporate view of your information, fully integrated with your CPM
processes, thereby uniquely linking performance to strategic vision at all levels, down to operational
detail. A toolkit approach to Corporate Performance Management (CPM) represents a new, unique and
cost-effective way for companies that want to meet the new challenges and maximize their CPM and BI
implementations.

What is the Toolkit Approach?

[Diagram: an integrated environment to build BI & CPM solutions — application visual modeling,
MOLAP + ROLAP engines, and a 3-tier architecture (web client, master server).]
BOARD meets all business performance management needs, guaranteeing unified access to business
data for a single version of the truth. BOARD allows building sophisticated applications for planning,
budgeting, forecasting, profitability analysis, what-if scenarios, scorecarding & dashboards, and
consolidation, combining Business Intelligence and Corporate Performance Management in a single
integrated product.
For more information: www.board.com


Cubeware
Founded in 1997 in Bavaria, with headquarters in Rosenheim and offices in Berlin, Darmstadt and
Düsseldorf, Cubeware is one of the leading European suppliers of business intelligence solutions. Cubeware
sells both a powerful and comprehensive out-of-the-box solution for analysis, planning and reporting and a
connectivity toolset for data extraction from a wide range of operational systems, including SAP. Cubeware’s
products are platform independent, flexible and scalable, and seamlessly integrated into the worlds of
Microsoft and SAP – as shown by the Microsoft Gold Certified Partner status and the award of SAP®
Certified Integration for the Cubeware Analysis System. The products and services provided by Cubeware
are targeted at the sales, finance, controlling and other specialised departments of both small-to-medium
businesses and major corporations. Product sales and implementation are handled either by Cubeware or
by one of a growing international network of certified Cubeware business partners and resellers. OEM
sales, in the form of integration of Cubeware's BI products and components into the solutions of other BI
and ERP vendors, form a third important revenue pillar.
Worldwide more than 100,000 installations of Cubeware’s analysis, reporting and controlling solutions have
been made to date. Cubeware has been independent and self-financed from day one and employs over 50
BI specialists. Cubeware has gained the confidence of hundreds of renowned customers from many lines of
industry, including for example Abbott, ANZAG, AOK Brandenburg, Bertelsmann Stiftung, Daimler Chrysler,
Danone Austria, Gabor Shoes, Kaufhof Warenhaus, Plaut Salzburg, Saeco, Viessmann and ZWILLING J. A.
HENCKELS.
For more information: www.cubeware.de/eng

EPOQ GmbH
EPOQ develops and offers solutions for the management and dynamic optimization of customer
interactions in process-oriented multi-channel marketing. The modular product suite ready REALTIME
DYNAMIC is based on a unique method for creating dynamic scores. In outbound processes it enables
dynamic customer selection, and in inbound processes it generates customer-oriented product
recommendations in real time. Each customer reaction to a recommendation is fed back to the engine;
this real-time closed loop optimizes outbound and inbound activities through a continuous self-learning
mechanism. The proven benefits are a significant increase in success rates, high flexibility in campaign
design, and greatly improved exploitation of customer potential. For enterprises that rely on high-volume
direct customer interactions, EPOQ offers a considerable competitive advantage.
For more information: www.epoq.de


Informatica – “The Data Integration Company”


Informatica Corporation (NASDAQ: INFA) is a leading provider of data integration software and services,
solving the problem of data fragmentation across disparate systems, helping organisations gain greater
business value from all their information assets. Informatica solution options include Data Quality, Grid
Capabilities, Structured and Unstructured Data support and Real Time data integration. Informatica's open,
platform-neutral software reduces risks and costs, speeds time to results, and scales to handle data
integration projects of any size or complexity.
Data Integration solutions for the enterprise that Informatica customers are using include but are not limited
to Data Migration, Data Consolidation, Data Synchronization, Data Warehousing, Master Data
Management, RFID Initiatives, Sarbanes-Oxley Compliance and SAP Data Migration. With Informatica,
companies can gain greater business value by integrating all their information assets from across the
enterprise. With a proven track record of success, Informatica helps companies and government
organisations of all sizes realise the full business potential of their enterprise data.
More than 2,850 companies worldwide rely on Informatica to reduce the cost and expedite the time to
address data integration needs of any complexity and scale. In Europe Informatica has many leading
enterprises across different vertical industries in its customer base, including companies such as abbey,
AUDI AG, DaimlerChrysler, Deutsche Börse, The Dutch Ministry of Defense, GlaxoSmithKline, ING Direct
UK, La Poste, Mexx, Nestlé and Prudential.
For more information on Informatica please visit www.informatica.com

in-factory
is an independent consultancy company located in Winterthur, Switzerland.
Its core competency is “Enterprise Information Integration”.
For more information please visit www.in-factory.com


Panoratio

Panoratio delivers in-memory Dynamic Data Discovery Solutions—which enable 360º customer-centric
analysis on any computer—through a patented process for rendering large, complex data sets into a
Portable Data Insights™ (.pdi™). The .pdi format is transparently compressed, and ideally suited for
archiving, distribution and syndication of large amounts of data. By joining multiple PDIs together (Brick,
Marketing, Web Analytics, Demographic, etc), companies can realize a complete customer view across all
major touch-points; and can discover interrelationships and patterns that have previously remained “hidden”.
All queries return in fractions of seconds, and complete analysis can occur in a fraction of the historical time.
With Panoratio, there is practically no limit to the complexity of the data which can be analyzed.

Panoratio has over 50 customers in production, including Macy’s, Yahoo!, AVIS, Audi and AOL Deutschland,
and is an IBM® Premier Business Partner. Please visit www.panoratio.com to learn more.

[Diagram: Panoratio Customer-Centered Performance Management — conventionally, fragmented data
sources (sales, marketing, and customer service data marts) feed separate efforts in segmentation,
campaign management, response optimization, churn prevention, and CLV management; with Panoratio,
complaints, e-mails, returns, direct sales, and orders from operational systems are joined into a single
customer view serving the same analytic angles for performance management.]

StatSoft
StatSoft was founded in 1984 in the United States and is today one of the largest global providers of
analytic software. One of StatSoft's flagships, STATISTICA Data Miner, is a modern software tool
offering a comprehensive selection of methods for predictive analytics. It reflects the proven fact that
there is no single best method for every analytical problem. The software also provides an intuitive user
interface which allows even less experienced users to create predictive models step by step with the
help of the implemented data miner recipes. Models can easily be deployed, implemented in other
applications, and run in the background. This way users get quick access to well-founded predictions.
Today the application of data mining methods is no longer limited to traditional areas like sales support,
risk management and customer scoring. Industry has recognized that the availability of a growing
amount of data makes complex analytics very valuable for root cause analysis and process optimization.
In these application areas StatSoft offers specific solutions to optimize production processes. Another
application area is the analysis of unstructured text: STATISTICA Text Miner is a solution that translates
unstructured text data into meaningful, valuable clusters of decision-making "gold."
Despite occasional claims to the contrary, predictive analytics is not a trivial process; it requires an
investment of manpower and know-how. StatSoft supports its customers in establishing and
implementing decision support systems, with services customized to the needs of the users.
Experienced StatSoft consultants make sure that professional and appropriate solutions are created.
With a network of subsidiaries in all major markets on all continents, StatSoft is able to support
internationally operating companies worldwide.
For more information please visit www.statsoft.com

Viscovery Software GmbH


As one of the first data mining companies in Europe, Viscovery (formerly eudaptics software gmbh) is
among the leading vendors of predictive analytics solutions. The Viscovery suite features unique
patented technology for the explorative analysis and statistical modelling of complex data.
Comprehensive workflows support the generation of high-performance predictive models, which can be
integrated in real time and updated automatically.
For many years, Viscovery software has been used by more than 300 customers from different
application areas, such as banking, insurance, telecom, industry, media, and retail, as well as research
organizations and universities. Since September 2007, Viscovery has been a member of the Biomax
group.
For more information please visit www.viscovery.net.
