please contact:
Email: InfosysLabsBriefings@infosys.com
Infosys Limited, 2013
Infosys acknowledges the proprietary rights of the trademarks and product names of the other
companies mentioned in this issue of Infosys Labs Briefings. The information provided in this
document is intended for the sole use of the recipient and for educational purposes only. Infosys
makes no express or implied warranties relating to the information contained in this document or to
any derived results obtained by the recipient from the use of the information in the document. Infosys
further does not guarantee the sequence, timeliness, accuracy or completeness of the information and
will not be liable in any way to the recipient for any delays, inaccuracies, errors in, or omissions of,
any of the information or in the transmission thereof, or for any damages arising therefrom. Opinions
and forecasts constitute our judgment at the time of release and are subject to change without notice.
This document does not contain information provided to us in confidence by our clients.
BIG DATA:
CHALLENGES AND
OPPORTUNITIES
Subu Goparaju
Senior Vice President
and Head of Infosys Labs
At Infosys Labs, we constantly look for opportunities to leverage
technology while creating and implementing innovative business
solutions for our clients. As part of this quest, we develop engineering
methodologies that help Infosys implement these solutions right,
first time and every time.
VOL 11 NO 1
2013
Infosys Labs Briefings
AADITYA PRAKASH is a Senior Systems Engineer with the FNSP unit of Infosys. He can be
reached at Aaditya_Prakash@infosys.com.
ABHISHEK KUMAR SINHA is a Senior Associate Consultant with the FSI business unit of Infosys.
He can be reached at Abhishek_Sinha11@infosys.com.
AJAY SADHU is a Software Engineer with the Big data practice under the Cloud Unit of Infosys.
He can be contacted at ajay_sadhu@infosys.com.
ANIL RADHAKRISHNAN is a Senior Associate Consultant with the FSI business unit of Infosys.
He can be reached at Anil_Radhakrishnan@infosys.com.
BILL PEER is a Principal Technology Architect with the Infosys Labs. He can be reached at
Bill_Peer@Infosys.com.
GAUTHAM VEMUGANTI is a Senior Technology Architect with the Corp PPS unit of Infosys.
He can be contacted at Gautham_Vemuganti@infosys.com.
KIRAN KALMADI is a Lead Consultant with the FSI business unit of Infosys. He can be contacted
at kiran_kalmadi@infosys.com.
MAHESH GUDIPATI is a Project Manager with the FSI business unit of Infosys. He can be reached
at mahesh_gudipati@infosys.com.
NAJU D MOHAN is a Delivery Manager with the RCL business unit of Infosys. She can be contacted
at naju_mohan@infosys.com.
NARAYANAN CHATHANUR is a Senior Technology Architect with the Consulting
and Systems Integration wing of the FSI business unit of Infosys. He can be reached at
Narayanan_CS@Infosys.com.
NAVEEN KUMAR GAJJA is a Technical Architect with the FSI business unit of Infosys. He can be
contacted at Naveen_Gajja@infosys.com.
PERUMAL BABU is a Senior Technology Architect with RCL business unit of Infosys. He can be
reached at perumal_babu@infosys.com.
PRAKASH RAJBHOJ is a Principal Technology Architect with the Consulting and Systems
Integration wing of the Retail, CPG, Logistics and Life Sciences business unit of Infosys. He can be
contacted at PrakashR@Infosys.com.
PRASANNA RAJARAMAN is a Senior Project Manager with RCL business unit of Infosys. He can
be reached at Prasanna_Rajaraman@infosys.com.
SARAVANAN BALARAJ is a Senior Associate Consultant with Infosys Retail & Logistics Consulting
Group. He can be contacted at Saravanan_balaraj@infosys.com.
SHANTHI RAO is a Group Project Manager with the FSI business unit of Infosys. She can be
contacted at Shanthi_Rao@infosys.com.
SUDHEESHCHANDRAN NARAYANAN is a Senior Technology Architect with the Big data practice
under the Cloud Unit of Infosys. He can be reached at sudheeshchandran_n01@infosys.com.
ZHONG LI PhD. is a Principal Architect with the Consulting and System Integration Unit of
Infosys. He can be contacted at zhong_li@infosys.com.
Big data was the watchword of 2012. Even before one could understand
what it really meant, it began getting tossed about in huge doses in almost every
other analyst report. Today, the World Wide Web hosts upwards of 800 million
webpages, each page trying to either educate or build a perspective on the concept
of Big data. Technology enthusiasts believe that Big data is the next big thing
after cloud. Big data is of late being adopted across industries with great fervor.
In this issue we explore what the Big data revolution is and how it will likely help
enterprises reinvent themselves.
As citizens of this digital world we generate more than 200 exabytes of
information each year. This is equivalent to 20 million Libraries of Congress.
According to Intel, each internet minute sees 100,000 tweets, 277,000 Facebook
logins, 204 million email exchanges, and more than 2 million search queries fired.
Looking at the scale at which data is getting churned, it is beyond the scope of
human capability to process it all, and hence there is a need for machine processing
of information. There is no dearth of data for today's enterprises. On the contrary,
they are mired in data, and quite deeply at that. The focus today, therefore,
is on discovery, integration, exploitation and analysis of this overwhelming
information. Big data may be construed as the technological intervention to
undertake this challenge.
Big data systems are expected to help analyze structured and
unstructured data, and hence are drawing huge investments. Analysts have
estimated enterprises will spend more than US$120 billion by 2015 on analysis
systems. The success of Big data technologies depends upon natural language
processing capabilities, statistical analytics, large storage and search technologies.
Big data analytics can help cope with large data volumes, data velocity and
data variety. Enterprises have started leveraging these Big data systems to mine
hidden insights from data. In the first issue of 2013, we bring to you papers
that discuss how Big data analytics can make a significant impact on several
industry verticals like medical, retail and IT, and how enterprises can harness the
value of Big data.
As always, do let us know your feedback about the issue.
Happy Reading,
Yogesh Dandawate
Deputy Editor
yogesh_dandawate@infosys.com
Authors featured in this issue
Infosys Labs Briefings
Advisory Board
Anindya Sircar PhD
Associate Vice President &
Head - IP Cell
Gaurav Rastogi
Vice President,
Head - Learning Services
Kochikar V P PhD
Associate Vice President,
Education & Research Unit
Raj Joshi
Managing Director,
Infosys Consulting Inc.
Ranganath M
Vice President &
Chief Risk Officer
Simon Towers PhD
Associate Vice President and
Head - Center of Innovation for
Tomorrow's Enterprise,
Infosys Labs
Subu Goparaju
Senior Vice President &
Head - Infosys Labs
Big Data: Countering
Tomorrow's Challenges
Opinion: Metadata Management in Big Data
By Gautham Vemuganti
Any enterprise that is in the process of or considering Big data applications deployment
has to address the metadata management problem. The author proposes a metadata
management framework to realize Big data analytics.
Trend: Optimization Model for Improving Supply Chain Visibility
By Saravanan Balaraj
The paper explores the challenges that dot Big data adoption in the supply chain and
proposes a value model for Big data optimization.
Discussion: Retail Industry Moving to Feedback Economy
By Prasanna Rajaraman and Perumal Babu
Big data analysis of customers' preferences can help retailers gain a significant competitive
advantage, suggest the authors.
Perspective: Harness Big Data Value and Empower Customer Experience Transformation
By Zhong Li PhD
Always-on digital customers continuously create more data of various types. Enterprises
are analyzing this heterogeneous data to understand customer behavior, spend and social
media patterns.
Framework: Liquidity Risk Management and Big Data: A New Challenge for Banks
By Abhishek Kumar Sinha
Managing liquidity risk on simple spreadsheets can lead to non-real-time and inappropriate
information that may not be enough for efficient liquidity risk management (LRM). The author
proposes an iterative framework for effective liquidity risk management.
Model: Big Data Medical Engine in the Cloud (BDMEiC): Your New Health Doctor
By Anil Radhakrishnan and Kiran Kalmadi
In this paper the authors describe how Big data analytics can play a significant role in the early
detection and diagnosis of fatal diseases, reduction in health care costs and improvement in the
quality of health care administration.
Approach: Big Data Powered Extreme Content Hub
By Sudheeshchandran Narayanan and Ajay Sadhu
With the arrival of Big Content, the need to extract, enrich, organize and manage
semi-structured and unstructured content and media is increasing. This paper talks about
the need for an Extreme Content Hub to tame the Big data explosion.
Insight: Complex Events Processing: Unburdening Big Data Complexities
By Bill Peer, Prakash Rajbhoj and Narayanan Chathanur
Complex Event Processing along with in-memory data grid technologies can help in pattern
detection, matching, analysis, processing and split-second decision making in Big data
scenarios, opine the authors.
Practitioner's Perspective: Big Data: Testing Approach to Overcome Quality Challenges
By Mahesh Gudipati, Shanthi Rao, Naju D. Mohan and Naveen Kumar Gajja
This paper suggests the need for a robust testing approach to validate Big data systems to
identify possible defects early in the implementation life cycle.
Research: Nature Inspired Visualization of Unstructured Big Data
By Aaditya Prakash
Classical visualization methods are falling short in accurately representing multidimensional
and ever-growing Big data. The author proposes a spider-cobweb visualization technique,
inspired by nature, for visualizing Big data.
Index
A robust testing approach needs to be defined for validating
structured and unstructured data to identify possible
defects early in the implementation life cycle.
Naju D. Mohan
Delivery Manager, RCL Business Unit
Infosys Ltd.
Big Data augmented with Complex Event Processing
capabilities can provide solutions utilizing in-memory
data grids for analyzing trends,
patterns and events in real time.
Bill Peer
Principal Technology Architect
Infosys Labs, Infosys Ltd.
Metadata Management in Big Data
By Gautham Vemuganti
Big data, true to its name, deals with large
volumes of data characterized by volume,
variety and velocity. Any enterprise that is
in the process of or considering a Big data
applications deployment has to address the
metadata management problem. Traditionally,
much of the data that business users use is
structured. This however is changing with the
exponential growth of data or Big data.
Metadata defining this data, however,
is spread across the enterprise in spreadsheets,
databases, applications and even in people's
minds (the so-called tribal knowledge). Most
enterprises do not have a formal metadata
management process in place because of
the misconception that it is an Information
Technology (IT) imperative and it does not have
an impact on the business.
However, the converse is true. It has been
proven that a robust metadata management
process is not only necessary but required for
successful information management. Big data
introduces large volumes of unstructured data
for analysis. This data could be in the form of a
text file or any multimedia file (e.g., audio,
video). To bring this data into the fold of an
information management solution, its metadata
should be correctly defined.
Metadata management solutions
provided by various vendors usually have
a narrow focus. An ETL vendor will capture
metadata for the ETL process. A BI vendor will
provide metadata management capabilities
for their BI solution. The silo-ed nature of
metadata does not provide business users an
opportunity to have a say and actively engage
in metadata management. A good metadata
management solution must provide visibility
across multiple solutions and bring business
users into the fold for a collaborative, active
metadata management process.
METADATA MANAGEMENT CHALLENGES
Metadata, simply defined, is data about data.
In the context of analytics some common
examples of metadata are report definitions,
table definitions, the meaning of a particular master
data entity (sold-to customer, for example),
ETL mappings, and formulas and computations.
The importance of metadata cannot be
overstated. Metadata drives the accuracy of
reports, validates data transformations, ensures
accuracy of calculations and enforces consistent
definition of business terms across multiple
business users.
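By way of illustration, such metadata entries can be modeled as simple records. The sketch below is our own, with illustrative field and entity names such as `sold_to_customer`, not taken from the paper; it captures a master data entity and a report definition that depends on it:

```python
from dataclasses import dataclass, field

@dataclass
class MetadataEntry:
    """One piece of 'data about data' (field names are illustrative)."""
    name: str          # e.g., "sold_to_customer"
    kind: str          # "report", "table", "master_data_entity", "etl_mapping"
    definition: str    # agreed business meaning
    owner: str         # accountable business/IT owner
    lineage: list = field(default_factory=list)  # upstream metadata it depends on

# A master data entity and a report definition that depends on it
customer = MetadataEntry(
    name="sold_to_customer",
    kind="master_data_entity",
    definition="Customer legally invoiced for an order",
    owner="Sales Ops",
)
revenue_report = MetadataEntry(
    name="quarterly_revenue",
    kind="report",
    definition="Sum of invoiced amounts per sold-to customer per quarter",
    owner="Finance",
    lineage=["sold_to_customer", "invoices"],
)
print(revenue_report.lineage)
```

Tracking even this much lineage makes it possible to see which reports break when the definition of a master data entity changes.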
In a typical large enterprise which has
grown by mergers, acquisitions and divestitures,
metadata is scattered across the enterprise in
various forms as noted in the introduction.
In large enterprises, there is wide
acknowledgement that metadata management
is critical, but most of the time there is no
enterprise-level sponsorship of a metadata
management initiative. Even if there is, it is
focused only on one specific project sponsored
by one specific business.
The impact of good metadata
management practices is not consistently
understood across the various levels of the
enterprise. Conversely, the impact of poorly
managed metadata comes to light only after
the fact, i.e., after a certain transformation happens,
a report or a calculation is run, or two divisional
data sources are merged.
Metadata is typically viewed as the
exclusive responsibility of the IT organization
with business having little or no input or say in
its management. The primary reason is that there
are multiple layers of organization between IT
and business. This introduces communication
barriers between IT and business.
Finally, metadata is not viewed as a very
exciting area of opportunity. It is only addressed
as an afterthought.
DIFFERENCES BETWEEN TRADITIONAL
AND BIG DATA ANALYTICS
In traditional analytics implementations,
data is typically stored in a data warehouse.
The data warehouse is modeled using one
of several techniques, developed over time
and is a constantly evolving entity. Analytics
[Figure 1: Data Governance Shift with Big Data Analytics. A single monolithic governance process (people, rules, metrics, process) gives way to multiple governance processes. Source: Infosys Research]
applications developed using the data in a data
warehouse are also long-lived. Data governance
in traditional analytics is a centralized process.
Metadata is managed as part of the data
governance process.
In traditional analytics, data is discovered,
collected, governed, stored and distributed.
Big data introduces large volumes of
unstructured data. This data is highly
dynamic and therefore needs to be ingested
quickly for analysis.
Big data analytics applications,
however, are characterized by short-lived,
quick implementations focused on solving a
specific business problem. The emphasis of
Big data analytics applications is more on
experimentation and speed as opposed to a
long-drawn-out modeling exercise.
The need to experiment and derive
insights quickly using data changes the way
data is governed. In traditional analytics
there is usually one central governance team
focused on governing the way data is used
and distributed in the enterprise. In Big data
analytics, there are multiple governance
processes in play simultaneously, each geared
towards answering a specific business question.
Figure 1 illustrates this.
Most of the metadata management
challenges we referred to in the previous section
alluded to typical enterprise data that is highly
structured. To analyze unstructured data,
additional metadata definitions are necessary.
To illustrate the need to enhance metadata
to support Big data analytics, consider sentiment
analysis using social media conversations as
an example. Say someone posts a message on
Facebook: "I do not like my cell-phone reception.
My wireless carrier promised wide cell coverage
but it is spotty at best. I think I will switch
carriers." To infer the intent of this customer,
the inference engine has to rely on metadata
as well as the supporting domain ontology.
The metadata will define "Wireless Carrier",
"Customer", "Sentiment" and "Intent". The
inference engine will leverage the ontology
dependent on this metadata to infer that this
customer wants to switch cell phone carriers.
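The inference step described above can be caricatured in a few lines of code. This is a toy keyword-based stand-in for the metadata-plus-ontology inference engine the paper describes; the entity names and cue phrases are our own illustrative assumptions:

```python
# Toy stand-in for metadata-driven inference over a social media post.
# Entity names and keyword cues are hypothetical, not the paper's ontology.

POST = ("I do not like my cell-phone reception. My wireless carrier "
        "promised wide cell coverage but it is spotty at best. "
        "I think I will switch carriers.")

# Minimal "metadata": each entity mapped to surface cues for its values
METADATA = {
    "Sentiment": {"negative": ["do not like", "spotty"]},
    "Intent":    {"switch_carrier": ["switch carriers", "switch carrier"]},
}

def infer(post: str) -> dict:
    """Tag a post with every entity value whose cues appear in the text."""
    text = post.lower()
    tags = {}
    for entity, values in METADATA.items():
        for value, cues in values.items():
            if any(cue in text for cue in cues):
                tags[entity] = value
    return tags

print(infer(POST))
```

A real engine would use an ontology and statistical models rather than string matching, but the dependency is the same: without the metadata definitions, there is nothing to tag the post with.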
Big data is not just restricted to text. It
could also contain images, videos and voice
files. Understanding, categorizing and creating
metadata to analyze this kind of non-traditional
content is critical.
It is evident that Big data introduces
additional challenges in metadata management. It
is clear that there is a need for a robust metadata
management process which will govern metadata
with the same rigor as data for enterprises to be
successful with Big data analytics.
To summarize, a metadata management
process specific to Big data should incorporate
the context and intent of data, support non-
traditional sources of data and be robust enough
to handle the velocity of Big data.
ILLUSTRATIVE EXAMPLE
Consider an existing master data management
system in a large enterprise. This master data
system has been developed over time. It
has specific master data entities like product,
customer, vendor, employee, etc. The master data
system is tightly governed, and data is processed
(cleansed, enriched and augmented) before it is
loaded into the master data repository.
This specific enterprise is considering
bringing in social media data for enhanced
customer analytics. This social media data is to be
sourced from multiple sources and incorporated
into the master data management system.
As noted earlier, social media
conversations have context, intent and
sentiment. The context refers to the situation
[Figure 2: Metadata Management Framework for Big Data Analytics. Five layers: metadata discovery, collection, governance, storage and distribution. Source: Infosys Research]
in which a customer was mentioned, the intent
refers to the action that an individual is likely
to take and the sentiment refers to the state of
being of the individual.
For example, say an individual sends a
tweet or starts a Facebook conversation about
a retailer from a football game. The context
would then be a sports venue. If the tweet or
conversation consisted of positive comments
about the retailer then the sentiment would be
determined as positive. If the update consisted
of highlighting a promotion by the retailer then
the intent would be to collaborate or share with
the individual's network.
If such social media updates have to
be incorporated into any solution within the
enterprise then the master data management
solution has to be enhanced with metadata about
"Context", "Sentiment" and "Intent". Static
lookup information will need to be generated
and stored so that an inference engine can
leverage this information to provide inputs for
analysis. This will also necessitate a change in the
back-end. The ETL processes that are responsible
for this master data will now have to incorporate
the social media data as well. Furthermore, the
customer information extracted from these feeds
needs to be standardized before being loaded into
any transaction system.
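The standardization step mentioned above might look like the following sketch, assuming hypothetical field names (`handle`, `name`, `source`) for customer attributes extracted from a social feed:

```python
# Hedged sketch of standardizing customer fields pulled from a social
# feed before any load into a transaction system. The field names and
# normalization rules are illustrative assumptions, not a specified ETL.

def standardize_customer(raw: dict) -> dict:
    """Normalize a customer record extracted from a social media feed."""
    return {
        # strip whitespace and the '@' prefix, lowercase the handle
        "handle": raw.get("handle", "").strip().lstrip("@").lower(),
        # collapse repeated spaces, use title case for the display name
        "name": " ".join(raw.get("name", "").split()).title(),
        # normalize the source system name
        "source": raw.get("source", "unknown").lower(),
    }

record = {"handle": " @JDoe42 ", "name": "jane   DOE", "source": "Twitter"}
print(standardize_customer(record))
```

In practice this logic would sit inside the ETL layer, alongside deduplication and matching against existing master data records.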
FRAMEWORK FOR METADATA
MANAGEMENT IN BIG DATA ANALYTICS
We propose that metadata be managed using
the five components shown in Figure 2.
Metadata Discovery Discovering metadata
is critical in Big data for the reasons of context
and intent noted in the prior section. Social data
is typically sourced from multiple sources. All
these sources will have different formats. Once
metadata for a certain entity is discovered for
one source it needs to be harmonized across all
sources of interest. This process for Big data
will need to be formalized using metadata
governance.
Metadata Collection A metadata collection
mechanism should be implemented. A robust
collection mechanism should aim to minimize
or eliminate metadata silos. Once again, a
technology or a process for metadata collection
should be implemented.
Metadata Governance Metadata creation
and maintenance needs to be governed.
Governance should include resources from
both the business and IT teams. A collaborative
framework between business and IT should
be established to provide this governance.
Appropriate processes (manual or technical)
should be utilized for this purpose. For example,
on-boarding a new Big data source should be
a collaborative effort between business users
and IT. IT will provide the technology to enable
business users to discover metadata.
[Figure 3 appears here: parallel pipelines for data (discovery, collection, governance, storage, distribution) and for metadata, with a collect step feeding each.]
Metadata Storage Multiple models for
enterprise metadata storage exist. The Common
Warehouse Metamodel (CWM) is one example.
A similar model or an extension thereof can be
utilized for this purpose. If no such model fits
the requirements of an enterprise then
suitable custom models can be developed.
Metadata Distribution This is the final
component. Metadata, once stored, will need
to be distributed to consuming applications. A
formal distribution model should be put in
place to enable this distribution. For example,
some applications can integrate directly with
the metadata storage layer while others will
need specialized interfaces to be able to
leverage this metadata.
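The five components above can be sketched together as a minimal pipeline skeleton. All class, function and field names here are our own illustration of the framework, not a prescribed API:

```python
# Minimal skeleton of the five-component framework sketched above.
# Names and data shapes are illustrative assumptions, not a standard.

class MetadataRepository:
    """Storage + distribution: keep entries, hand them to consumers."""
    def __init__(self):
        self._store = {}

    def save(self, name, entry):             # Metadata Storage
        self._store[name] = entry

    def distribute(self, name):              # Metadata Distribution
        return self._store[name]

def discover(source):                        # Metadata Discovery
    """Derive field names from a sample record of a new source."""
    return {"source": source["name"], "fields": sorted(source["sample"])}

def collect(entries):                        # Metadata Collection
    """Harmonize metadata discovered from many sources into one view."""
    fields = sorted({f for e in entries for f in e["fields"]})
    return {"sources": [e["source"] for e in entries], "fields": fields}

def govern(entry, approved_fields):          # Metadata Governance
    """Business/IT review: keep only fields approved by governance."""
    entry["fields"] = [f for f in entry["fields"] if f in approved_fields]
    return entry

# On-boarding two sources: a social feed and an internal CRM
twitter = {"name": "twitter", "sample": {"handle": "@jdoe", "text": "hi"}}
crm     = {"name": "crm", "sample": {"customer_id": 7, "text": "note"}}

repo = MetadataRepository()
harmonized = collect([discover(twitter), discover(crm)])
repo.save("customer_social",
          govern(harmonized, {"handle", "customer_id", "text"}))
print(repo.distribute("customer_social")["fields"])
```

The point of the skeleton is the ordering: discovery and collection happen per source, governance decides what survives, and only governed entries reach storage and distribution.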
We note that in traditional analytics
implementations, a framework similar to the one
we propose exists, but for data.
The metadata management framework
should be implemented alongside a data
management framework to realize Big data analytics.
THE PARADIGM SHIFT
The discussion in this paper brings to light the
importance of metadata and the impact it has
not only on Big data analytics but traditional
analytics as well. We are of the opinion that if
enterprises want to get value out of their data
assets and leverage the Big data tidal wave then
the time is right to shift the paradigm from
data governance to metadata governance and
make data management part of the metadata
governance process.
A framework is as good as how it is
viewed and implemented within the enterprise.
The metadata management framework is
successful only if there is sponsorship for this effort
from the highest levels of management. This
[Figure 3: Equal Importance of Metadata & Data Processing for Big Data Analytics. Source: Infosys Research]
includes both business and IT leadership within
the enterprise. The framework can be viewed as
being very generic. Change is a constant in any
enterprise. The framework can be made flexible
to adapt to the changing needs and requirements
of the business.
All the participants and personas
engaged in the data management function within
an enterprise should participate in the process.
This will promote and foster collaboration
between business and IT. This should be made
sustainable and followed diligently by all the
participants until this framework is used to on-
board not only new data sources but also new
participants in the process.
Metadata and its management is an
oft-ignored area in enterprises, with multiple
consequences. The absence of robust metadata
management processes leads to erroneous results,
project delays and multiple interpretations of
business data entities. These are all avoidable
with a good metadata management framework.
The consequences affect the entire
enterprise either directly or indirectly. From
the lowest-level employee to the senior-most
executive, incorrect or poorly managed
metadata will not only affect operations but also
directly impact the top-line growth and
bottom-line profitability of an enterprise. Big
data is viewed as the most important innovation
that brings tremendous value to enterprises.
Without a proper metadata management
framework, this value might not be realized.
CONCLUSION
Big data has created quite a bit of buzz in the
marketplace. Pioneers like Yahoo and Google
created the foundations of what is today called
Hadoop. There are multiple players in the Big
data market today, developing everything from
technology to manage Big data, to applications
needed to analyze Big data, to companies engaged
in Big data analysis and selling that content.
In the midst of all the innovation in the
Big data space, metadata is often forgotten. It
is important for us to recognize and realize the
importance of metadata management and the
critical impact it has on enterprises.
If enterprises wish to remain competitive,
they have to embark on Big data analytics
initiatives. In this journey, enterprises cannot
afford to ignore the metadata management
problem.
REFERENCES
1. Davenport, T. and Harris, J. (2007), Competing on Analytics: The New Science of Winning, Harvard Business School Press.
2. Jennings, M., What role does metadata management play in enterprise information management (EIM)? Available at http://searchbusinessanalytics.techtarget.com/answer/The-importance-of-metadata-management-in-EIM.
3. Metadata Management Foundation Capabilities Component (2011). Available at http://mike2.openmethodology.org/wiki/Metadata_Management_Foundation_Capabilities_Component.
4. Rogers, D. (2010), Database Management: Metadata is more important than you think. Available at http://www.databasejournal.com/sqletc/article.php/3870756/Database-Management-Metadata-is-more-important-than-you-think.htm.
5. Data Governance Institute (2012), The DGI Data Governance Framework. Available at http://datagovernance.com/fw_the_DGI_data_governance_framework.html.
Optimization Model for Improving
Supply Chain Visibility
By Saravanan Balaraj
In today's competitive lead-or-leave
marketplace, Big data is seen as
both a challenge and an
opportunity. Effective and efficient strategies
to acquire, manage and analyze data lead
to better decision making and competitive
advantage. Unlocking potential business
value out of this diverse and multi-structured
dataset beyond organizational boundaries is a
mammoth task.
We have stepped into an interconnected
and intelligent digital world where convergence
of new technologies is happening fast all
around. In this process the underlying
data set is growing not only in volume but
also in velocity and variety. The resulting data
explosion, created by a combination of mobile
devices, tweets, social media, blogs, sensors and
emails, demands a new kind of data intelligence.
Big data has started creating a lot of buzz
across verticals, and Big data in supply chain is
no different. Supply chain is one of the key focus
areas that have been undergoing transformational
changes in the recent past. Traditional supply
chain applications leverage only transactional
data to solve operational problems and improve
efficiency. Having stepped into the Big data world,
the existing supply chain applications have
become obsolete as they are unable to cope
with tremendously increasing data volumes
cutting across multiple sources, the speed with
which they are generated and the unprecedented
growth in new data forms.
Enterprises are under tremendous pressure
to solve new problems emerging out of new
forms of data. Handling large volumes of data
across multiple sources and deriving value out
of this massive chunk for strategy execution
is the biggest challenge that enterprises are
facing in today's competitive landscape.
Careful analysis and appropriate use of
this data would result in cost reduction and
better operational performance. Competitive
pressures and customers' more-for-less
attitudes have left enterprises with no choice
other than to rethink their supply chain
strategies and create a differentiation.
Enterprises need to adopt appropriate
Big data techniques and technologies and
build suitable models to derive value out
of this unstructured data, and thereby
plan, schedule and route in a cost-effective
manner. This paper explores
the challenges that dot Big data adoption in
the supply chain and proposes a value model for
Big data optimization.
BIG DATA WAVE
International Data Corporation (IDC) has
predicted that the Big data market will grow from
$3.2 billion in 2010 to $16.9 billion by 2015,
at a compound annual growth rate of 40%
[2]. This shows tremendous traction towards
Big data tools, technologies and platforms
among enterprises. A lot of research and
investment is going into how to fully tap
the potential benefits hidden in Big data and
derive financial value out of it. Value derived
out of Big data enables enterprises to achieve
differentiation through cost reduction and efficient
planning, thereby improving process
efficiency.
Big data is an important asset in the supply
chain that enterprises are looking
to capitalize on. They adopt different Big
data analytic tools and technologies to improve
their supply chain, production and customer
engagement processes. The path towards
operational excellence is facilitated through
efficient planning and scheduling of production
and logistic processes.
Though supply chain data is really huge,
it brings about the biggest opportunity for
enterprises to reduce cost and improve their
operational performances. The areas in supply
chain planning where Big data can create an
impact are: demand forecasting, inventory
management, production planning, vendor
management and logistics optimization. Big
data can improve the supply chain planning process
if appropriate business models are identified,
designed, built and then executed. Some of
its key benefits are: short time-to-market,
improved operational excellence, cost reduction
and increased profit margins.
CHALLENGES WITH SUPPLY CHAIN
PLANNING
The success of the supply chain planning
process depends on how closely demands are
forecasted, inventories are managed and
logistics are planned. Supply chain is the
heart of any industry vertical and, if managed
efficiently, drives positive business outcomes and
enables sustainable advantage. With the emergence of
Big data, optimizing supply chain processes
has become more complicated than ever before.
Handling Big data challenges in supply chain
and transforming them into opportunities
is the key to corporate success. The key
challenges are:
Volume - According to a McKinsey
report, the number of RFID tags sold
globally is projected to increase from
12 million in 2011 to 209 billion in
2021 [3]. Along with this, with the phenomenal
increase in the usage of temperature
sensors, QR codes and GPS devices, the
underlying supply chain data generated
has multiplied manifold beyond
expectations. Data flows across
multiple systems and sources and hence
is likely to be error-prone and
incomplete. Handling such huge data
volumes is a challenge.
Velocity - Business has become highly
dynamic and volatile. The changes arising
due to unexpected events must be handled
in a timely manner in order to avoid losing
out in business. Enterprises are finding it
extremely difficult to cope with this
data velocity. Optimal decisions must
be made quickly, and shorter processing
time is the key to successful operational
execution, which is lacking in traditional
data management systems.
Variety - In the supply chain, data has emerged in different forms that do not fit traditional applications and models. Structured data (transactional), unstructured data (social), sensor data (temperature and RFID) and new data types (video, voice and digital images) have made handling such diverse and heterogeneous data sets a nightmare for enterprises.
In today's data explosion in terms of volume, variety and velocity, handling the data alone does not suffice. The key is value creation: analyzing such massive data sets and extracting intelligence for successful strategy execution.
BIG DATA IN DEMAND FORECASTING &
SUPPLY CHAIN PLANNING
Enterprises use forecasting to determine how much of each product type to produce, and when and where to ship it, thereby improving supply chain visibility. An inaccurate forecast has a detrimental effect on the supply chain. Over-forecasting results in inventory pile-ups and locked working capital. Under-forecasting leads to failure in meeting demand, resulting in lost customers and sales. Hence, in today's volatile market of unpredictable shifts in customer demand, improving forecast accuracy is of paramount importance.

Figure 1: Optimization Model for Improving Supply Chain Visibility - I
Source: Infosys Research
Data in supply chain planning has mushroomed in terms of volume, velocity and variety. Tesco, for instance, generates more than 1.5 billion new data items every month. Wal-Mart's warehouse handles some 2.5 petabytes of information, roughly equivalent to half of all the letters delivered by the US Postal Service in 2010. According to a McKinsey Global Institute report [3], leveraging Big data in demand forecasting and supply chain planning could increase profit margin by 2-3% in the Fast Moving Consumer Goods (FMCG) manufacturing value chain. This unearths tremendous opportunity in forecasting and supply chain planning for enterprises to capitalize on the Big data deluge.
MISSING LINKS IN TRADITIONAL
APPROACHES
Enterprises have started realizing the importance of Big data in forecasting and have begun investing in Big data forecasting tools and technologies to improve their supply chain, production and manufacturing planning processes. Traditional forecasting tools are not adequate for handling such data volumes, variety and velocity. Moreover, they miss out on the following key aspects that improve forecast accuracy:
Social Media Data As An Input: Social media is a platform that enables enterprises to collect information about potential and prospective customers. Technological advancements have made tracking customer data easier. Companies can now track every visit a customer makes to their websites, every e-mail exchanged and the comments logged across social media websites. Social media data helps analyze the customer pulse and gain insights for forecasting, planning and scheduling of supply chains and inventories. Buzz in social networks can be used as an input to demand forecasting for numerous benefits. In one such use case, an enterprise can launch a new product to online fans to sense customer acceptance. Based on the response, inventories and the supply chain can be planned to direct stocks to high-buzz locations during the launch phase.
Predict And Respond Approach: Traditional forecasting is done by analyzing historical patterns and considering sales inputs and promotional plans to forecast demand and plan the supply chain. It focuses on what happened, working on a sense-and-respond strategy. 'History repeats itself' is no longer apt in today's competitive marketplace. Enterprises need to focus on what will happen and require a predict-and-respond strategy to stay in business. This calls for models and systems capable of capturing, handling and analyzing huge volumes of real-time data generated from unexpected competitive events, weather patterns, point-of-sale systems and natural disasters (volcanoes, floods, etc.) and converting them into actionable information for plans on production, inventory holdings and supply chain distribution.
Optimized Decisions with Simulations: Traditional decision support systems lack the flexibility to meet changing data requirements. In real-world scenarios, supply chain delivery plans change unexpectedly for various reasons, such as demand changes or revised sales forecasts. The model and system should be able to factor this in and respond quickly to such unplanned events. A decision should be taken only after careful analysis of the unplanned event's impact on other elements of the supply chain. Traditional approaches lack this capability, which necessitates a model for performing what-if analysis on all possible decisions and selecting the optimal one in the Big data context.
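The what-if analysis described above can be sketched as a brute-force search over candidate decisions: evaluate every plan under each demand scenario and pick the one with the lowest expected cost. This is a minimal, hypothetical sketch; the demand scenarios, probabilities and cost figures below are illustrative assumptions, not figures from this article.

```python
# Hypothetical what-if analysis over candidate stocking decisions.
# Scenario demands, probabilities and unit costs are illustrative.
scenarios = {                     # scenario -> (demand, probability)
    "baseline":    (1000, 0.6),
    "promo_spike": (1400, 0.3),
    "weather_dip": (700,  0.1),
}

HOLDING_COST = 2.0                # per unsold unit left in the warehouse
STOCKOUT_COST = 5.0               # per unit of unmet demand

def plan_cost(stock, demand):
    """Cost of carrying `stock` units when `demand` materializes."""
    unsold = max(stock - demand, 0)
    unmet = max(demand - stock, 0)
    return unsold * HOLDING_COST + unmet * STOCKOUT_COST

def expected_cost(stock):
    """Probability-weighted cost of a stocking decision across scenarios."""
    return sum(p * plan_cost(stock, d) for d, p in scenarios.values())

# Evaluate every candidate decision and keep the cheapest (the "what-if" loop).
candidates = range(600, 1601, 100)
best = min(candidates, key=expected_cost)
print(best, round(expected_cost(best), 1))   # 1400 620.0
```

In a real setting the candidate set would come from the planning system and the scenario costs from the enterprise's own data, but the selection logic is the same: simulate each decision, then choose.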
IMPROVING SUPPLY CHAIN VISIBILITY
USING BIG DATA
The supply chain does not lack data; what is missing is a suitable model to convert this huge, diverse raw data into actionable information so that enterprises can make critical business decisions for efficient supply chain planning. A 3-stage optimized value model helps to overcome the challenges posed by Big data in supply chain planning and demand forecasting. It bridges the gaps in traditional Big data approaches and offers a perspective to unlock value from the growing Big data torrent. Designing and building an optimized Big data model for supply chain planning is a complex task, but successful execution leads to significant financial benefits. Let us take a deep dive into each stage of this model and analyze its value-add in the enterprise's supply chain planning process.
Acquire Data: The biggest driver of supply chain planning is data. Acquiring all the relevant data for supply chain planning is the first step in this optimized model. It involves three steps, namely data sourcing, data extraction and cleansing, and data representation, which make the data ready for further analysis.
Data Sourcing - Data is available in different forms across multiple sources, systems and geographies. It contains extensive details of historical demand and other relevant information, which must be sourced for further analysis. In addition to transactional data, the data to be sourced for improving forecast accuracy include:
Product Promotion data - items,
prices, sales
Launch data - items to be ramped up
or down
Inventory data - stock in warehouse
Customer data - purchase history,
social media data
Transportation data - GPS and logistics data.
Enterprises should adopt appropriate
Big data systems that are capable of handling
such huge data volumes, variety and velocity.
Data Extraction and Cleansing - Data sources come in different forms, from structured (transactional data) to unstructured (social media, images, sensor data, etc.), and they are not in analysis-friendly formats. Also, due to the large volume of heterogeneous data, there is a high probability of inconsistencies and data errors during sourcing. The sourced data should be expressed in structured form for supply chain planning. Moreover, analyzing inaccurate and untimely data leads to erroneous, non-optimal results. High-quality, comprehensive data is a valuable asset, and appropriate data cleansing mechanisms should be in place for maintaining the quality of Big data. The choice of Big data tools for data cleansing and enrichment plays a crucial role in supply chain planning.
Data Representation - Database design for such huge data volumes is a herculean task and poses serious performance issues if not executed properly. Data representation plays a key role in Big data analysis. There are numerous ways to store data, and each design has its own set of advantages and drawbacks. Selecting and executing a database design that favors the business objectives reduces the effort of reaping benefits from Big data analysis in supply chain planning.
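The extraction-and-cleansing step above can be sketched in a few lines, assuming pandas is the cleansing tool of choice. The column names, sample records and cleansing rules (drop keyless rows, deduplicate records sourced from two systems, coerce text fields to numbers and dates, discard inconsistent rows) are hypothetical illustrations, not prescriptions from the model.

```python
import pandas as pd

# Hypothetical raw feed: duplicated, mistyped and partly invalid, as data
# sourced from multiple systems typically is.
raw = pd.DataFrame({
    "sku": ["A1", "A1", "B2", "B2", "C3", None],
    "qty": ["10", "10", "-3", "7", "5", "8"],          # sourced as text
    "ts":  ["2013-01-05", "2013-01-05", "2013-01-06",
            "bad-date", "2013-01-07", "2013-01-08"],
})

cleaned = (
    raw.dropna(subset=["sku"])          # records without a key are unusable
       .drop_duplicates()               # same event sourced from two systems
       .assign(qty=lambda d: pd.to_numeric(d["qty"], errors="coerce"),
               ts=lambda d: pd.to_datetime(d["ts"], errors="coerce"))
)
# Drop inconsistent rows: negative quantities and unparseable timestamps.
cleaned = cleaned[(cleaned["qty"] > 0) & cleaned["ts"].notna()]
print(len(cleaned))
```

Of the six raw records, only the two consistent ones survive; at Big data scale the same rules would run inside the chosen cleansing tool rather than in a single DataFrame.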
Analyze Data: The next stage is analyzing the cleansed data and capturing value for forecasting and supply chain planning. There is a plethora of Big data techniques available in the market for forecasting and supply chain planning. The selection of technique depends on the business scenario and enterprise objectives. Incompatible data formats make value creation from Big data a complex task, and this calls for innovation in techniques to unlock business value from the growing Big data torrent. The proposed model adopts an optimization technique to generate insights from this voluminous and diverse Big dataset.
Optimization in Big data analysis - Manufacturers have started synchronizing forecasting with production cycles, so forecast accuracy plays a crucial role in their success. Adopting an optimization technique in Big data analysis creates a new perspective and helps improve the accuracy of demand forecasting and supply chain planning. Analyzing the impact of promotions on one specific product appears to be an easy task. But real-life scenarios comprise a huge army of products, with the factors affecting demand varying for every product and location, making data analysis difficult for traditional techniques.
The optimization technique has several capabilities that make it an ideal choice for data analysis in such scenarios. Firstly, the technique is designed for analyzing and drawing insights from highly complex systems with huge data volumes and multiple constraints and factors to be accounted for. Secondly, supply chain planning has a number of enterprise objectives associated with it, like cost reduction, demand fulfillment, etc. The impact of each of these objective measures on the enterprise's profitability can be easily analyzed using the optimization technique. The flexibility of the optimization technique is another benefit that makes it suitable for Big data analysis, to uncover new data connections and turn them into insights.
The optimization model comprises four components: (i) input - consistent, real-time, quality data that has been sourced, cleansed and integrated becomes the input of the optimization model; (ii) goals - the model should take into consideration all the goals pertaining to forecasting and supply chain planning, like minimizing cost, maximizing demand coverage, maximizing profits, etc.; (iii) constraints - the model should incorporate all the constraints specific to supply chain planning, such as minimum inventory in warehouse, capacity constraints, route constraints, demand coverage constraints, etc.; and (iv) output - results based on the input, goals and constraints defined in the model that can be used for strategy execution. The result can be a demand plan, inventory plan, production plan, logistics plan, etc.
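As an illustration of these four components, consider a toy shipping problem expressed as a linear program: the goal is minimizing shipping cost, the constraints are warehouse capacity and demand coverage, and the output is a logistics plan. This sketch assumes SciPy is available; all cost, capacity and demand figures are made up for the example.

```python
from scipy.optimize import linprog

# INPUT: decision variables x = [w0->s0, w0->s1, w1->s0, w1->s1],
# units shipped from warehouse w to store s.

# GOALS: minimize total shipping cost (per-unit costs, illustrative).
cost = [4, 6, 5, 3]

# CONSTRAINTS: warehouse capacity (<=) and exact demand coverage (==).
A_ub = [[1, 1, 0, 0],      # warehouse 0 ships at most its capacity
        [0, 0, 1, 1]]      # warehouse 1 likewise
b_ub = [80, 70]
A_eq = [[1, 0, 1, 0],      # store 0 demand must be met exactly
        [0, 1, 0, 1]]      # store 1 demand likewise
b_eq = [60, 50]

# OUTPUT: the optimal logistics plan and its cost.
res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")
print(res.x, res.fun)      # ships 60 units w0->s0 and 50 units w1->s1
```

A production model would have thousands of variables and constraints drawn from the acquired data, but the input-goals-constraints-output shape stays exactly as above.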
Choice of Algorithm: One of the key differentiators in supply chain planning is the algorithm used in modeling. Optimization problems have numerous possible solutions and the algorithm should be able to fine-tune itself to reach an optimal solution.

Figure 2: Optimization Model for Improving Supply Chain Visibility - II
Source: Infosys Research
Achieve Business Objective: The final stage in this model is achieving business objectives through demand forecasting and supply chain planning. It involves three steps that facilitate the enterprise's supply chain decisions.
Scenario Management - Business events are difficult to predict and often deviate from their standard paths, resulting in unexpected behaviors and events. This makes planning and optimizing difficult during uncertain times. Scenario management is the approach to overcome such uncertain situations. It facilitates creating business scenarios, comparing multiple scenarios, and analyzing and assessing their impact before making decisions. This capability helps balance conflicting KPIs and arrive at an optimal solution matching business needs.
Multi User Collaboration - An optimization model in a real business case comprises highly complex data sets and models, and requires the support of an army of analysts to determine its effects on enterprise goals. A combination of technical and domain experts is required to obtain optimal results. To achieve near-accurate forecasts and supply chain optimization, the model should support multi-user collaboration so that multiple users can collaboratively produce optimal plans and schedules and re-optimize as and when the business changes. This model builds a collaborative system capable of supporting inputs from multiple users and incorporating them in its decision-making process.
Performance Tracker - Demand forecasting and supply chain planning do not follow a build-model-execute approach; they require significant continuous effort. Frequent changes in the inputs and business rules necessitate monitoring of data, model and algorithm performance. Actual and planned results are to be compared regularly and steps taken to minimize deviations in accuracy. KPIs are to be defined, and dashboards should be constantly monitored for model performance.
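The actual-versus-planned comparison at the heart of the performance tracker can be sketched with a single error KPI. Here mean absolute percentage error (MAPE) stands in for whatever KPI the enterprise defines; the figures and the 10% re-tuning threshold are illustrative assumptions.

```python
# Hypothetical performance-tracker check: compare planned vs. actual demand
# per period and flag when the error KPI drifts past a threshold.
planned = [100, 120, 90, 110]
actual  = [ 95, 140, 88, 100]

def mape(actual, planned):
    """Mean absolute percentage error between actual and planned figures."""
    return sum(abs(a - p) / a for a, p in zip(actual, planned)) / len(actual)

KPI_THRESHOLD = 0.10        # re-tune the model if error exceeds 10%
error = mape(actual, planned)
print(round(error, 3), error > KPI_THRESHOLD)
```

A dashboard would run this continuously over rolling windows; a breach of the threshold is the trigger for revisiting the data, model or algorithm.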
KEY BENEFITS
Enterprises can accrue a lot of benefit by adopting this 3-stage model for Big data analysis. Some of the benefits are detailed below:
Improves Accuracy of Forecast: One of the key objectives of forecasting is profit maximization. This model adopts effective data sourcing, cleansing and integration systems and makes data ready for forecasting. The inclusion of social media data, promotional data, weather predictions and seasonality, in addition to historical demand and sales histories, adds value and improves forecasting accuracy. Moreover, the optimization technique for Big data analysis reduces forecasting errors to a great extent.
Continuous Improvement: The Acquire-Analyze-Achieve model is not hard-wired. It allows the flexibility to fine-tune and supports what-if analysis. Multiple scenarios can be created, compared and simulated to identify the impact of a change on the supply chain and demand forecast prior to making any decisions. It also enables the enterprise to define, track and monitor KPIs from time to time, resulting in continuous process improvement.
Better Inventory Management: Inventory data, along with weather predictions, sales history and seasonality, is taken as input to the model for forecasting and supply chain planning. This approach minimizes incidents of out-of-stock and over-stock across different warehouses. An optimal plan for inventory movement is forecasted and appropriate stocks are maintained at each warehouse to meet upcoming demand. To a great extent this reduces the loss of sales and business due to stock-outs and leads to better inventory management.
Logistics Optimization: Constant sourcing and continuous analysis of transportation data (GPS and other logistics data), used for demand forecasting and supply chain planning through optimization techniques, helps improve distribution management. Moreover, logistics optimization improves fuel efficiency and vehicle routing, resulting in operational excellence and better supply chain visibility.
CONCLUSIONS
As the penetration of information technology in supply chain planning continues, the amount of data that can be captured, stored and analyzed has increased manifold. The challenge is to derive value from these large volumes of data by unlocking financial benefits congruent with the enterprise's business objectives.
Competitive pressures and customers' more-for-less attitude have left enterprises with no option other than reducing the cost of their operational execution. Adopting effective and efficient supply chain planning and optimization techniques to match customer expectations with their offerings is the key to corporate success. To attain operational excellence and sustainable advantage, it is necessary for the enterprise to build innovative models and frameworks leveraging the power of Big data.
The optimized value model on Big data offers a unique way of demand forecasting and supply chain optimization through collaboration, scenario management and performance management. This model of continuous improvement opens the door to big opportunities for the next generation of demand forecasting and supply chain optimization.
REFERENCES
1. IDC Press Release (2012), IDC Releases First Worldwide Big Data Technology and Services Market Forecast, Shows Big Data as the Next Essential Capability and a Foundation for the Intelligent Economy. Available at http://www.idc.com/getdoc.jsp?containerId=prUS23355112.
2. McKinsey Global Institute (2011), Big
data: The next frontier for innovation,
competition, and productivity. Available
at http://www.mckinsey.com/~/media/
McKinsey/dotcom/Insights%20and%20
pubs/MGI/Research/Technology%20
and%20Innovation/Big%20Data/MGI_
big_data_full_report.ashx.
3. Furio, S., Andres, C., Lozano, S., Adenso-Diaz, B. (2009), Mathematical model to optimize land empty container movements. Available at http://www.fundacion.valenciaport.com/Articles/doc/presentations/HMS2009_Paperid_27_Furio.aspx.
4. Stojković, G., Soumis, F., Desrosiers, J., Solomon, M. (2001), An optimization model for a real-time flight scheduling problem. Available at http://www.sciencedirect.com/science/article/pii/S0965856401000398.
5. Beck, M., Moore, T., Plank, J., Swany, M.
(2000), Logistical Networking. Available
at: http://loci.cs.utk.edu/ibp/files/
pdf/LogisticalNetworking.pdf.
6. Lasschuit, W., Thijssen, N. (2004), Supporting supply chain planning and scheduling decisions in the oil and chemical industry, Computers and Chemical Engineering, issue 28, pp. 863-870. Available at http://www.aimms.com/aimms/download/case_studies/shell_elsevier_article.pdf.
Retail Industry
Moving to Feedback Economy
By Prasanna Rajaraman and Perumal Babu
Retail industry is going through a major paradigm shift. The past decade has seen unprecedented churn in the retail industry, virtually changing the landscape. Erstwhile marquee brands from the traditional retailing side have ceded space to start-ups and new business models.
The key driver of this change is a confluence of technological, sociological and customer-behavioral trends creating a strategic inflection point in the retailing ecology. Trends like the emergence of the internet as a major retailing channel, social platforms going mainstream, pervasive retailing and the emergence of the digital customer have presented a major challenge to traditional retailers and retailing models.
On the other hand, these trends have also created opportunities for retailers to better understand customer dynamics. For the first time, retailers have access to an unprecedented amount of publicly available information on customer behavior and trends, voluntarily shared by customers. The more effective retailers can tap into these behavioral and social reservoirs of data to model the purchasing behaviors and trends of their current and prospective customers. Such data can also provide retailers with predictive intelligence which, if leveraged effectively, can create enough mindshare that the sale is completed even before the conscious decision to purchase is taken.
This move to a feedback economy, where retailers can have a 360-degree view of the customer thought process across the selling cycle, is a paradigm shift for the retail industry: from the retailer driving sales to the retailer engaging the customer across the sales and support cycle. Every aspect of retailing, from assortment and allocation planning to marketing, promotions and customer interactions, has to take evolving consumer trends into consideration.
The implication from a business perspective is that retailers have to better understand customer dynamics and align
business processes effectively with these trends. In addition, this implies that cycle times will be shorter and businesses will have to be more tactical in their promotions and offerings. Retailers who can ride this wave will be better able to address demand and command higher margins for their products and services. Failing this, retailers will be left in the low-margin pricing/commodity space.
From an information technology perspective, the key challenge is that the nature of this information, with respect to lifecycle, velocity, heterogeneity of sources and volume, is radically different from what traditional systems handle. Also, there are overarching concerns like data privacy, compliance and regulatory changes that need to be internalized in internal processes. The key is to manage the lifecycle of this Big data, effectively integrate it with organizational systems and derive actionable information.
TOWARDS A FEEDBACK ECONOMY
Customer dynamics refers to the customer-business relationship: the ongoing interchange of information and transactions between customers and organizations that goes beyond the transactional nature of the interaction to look at emotions, intent and desires. Retailers can create significant competitive differentiation by understanding the customer's true intent in a way that also supports the business intent [1, 2, 3, 4].
John Boyd, a colonel and military strategist in the US Air Force, developed the OODA loop (Observe, Orient, Decide and Act), which he used for combat operations. Today's business environment is no different: retailers are battling to get customers into their shops (physical or net-front) and convert their visits to sales, and understanding customer dynamics plays a key role in this effort. The OODA loop explains the crux of the feedback economy.

Figure 1: OODA Loop
Source: Reference [5]
In a feedback economy, there is constant feedback to the system from every phase of its execution. Along with this, the organization observes the external environment, unfolding circumstances and customer interactions. These inputs are analyzed and action is taken based on them. This cycle of adaptation and optimization makes the organization more efficient and effective on an ongoing basis.

Leveraging this feedback loop is pivotal to a proper understanding of customer needs and wants and of evolving trends. In today's environment, this means acquiring data from heterogeneous sources, viz., in-store transaction history, web analytics, etc. This creates a huge volume of data that has to be analyzed to get the required actionable insights.
BIG DATA LIFECYCLE: ACQUIRE-
ANALYZE-ACTIONIZE
The lifecycle of Big data can be visualized as a three-phase approach resulting in continuous optimization. The first step in moving towards a feedback economy is to acquire data. In this case, the retailer should look into macro and micro environment trends and consumer behavior - likes, emotions, etc. Data from electronic channels like blogs, social networking sites and Twitter gives the retailer a humongous amount of data about the consumer. These feeds help the retailer understand consumer dynamics and give more insight into buying patterns.
The key advantage of plugging into these disparate sources is the sheer amount of information one can gather about customers, both individually and in aggregate. On the other hand, Big data is materially different from the data retailers are used to handling. Most of it is unstructured (from blogs, Twitter feeds, etc.) and cannot be directly integrated with traditional analytics tools, leading to challenges in how the data can be assimilated with backend decision-making systems and analyzed.
In the assimilate/analyze phase, the retailer must decide which data is of use and define rules for filtering out the unwanted data. Filtering should be done with utmost care, as there are cases where indirect inferences are possible. The data available to the retailer after the acquisition phase will be in multiple formats and has to be cleaned and harmonized with the backend platforms.
The cleaned-up data is then mined for actionable insights. Actionize is the phase where the insights gathered from the analyze phase are converted into actionable business decisions by the retailer.

The response, i.e., the business outcome, is fed back so that the system can self-tune on an ongoing basis, resulting in a self-adaptive system that leverages Big data and feedback loops to offer business insight more customized than would be traditionally possible. It is imperative to understand that this feedback cycle is an ongoing process and not a one-stop solution for the analytics needs of a retailer.
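The self-tuning loop described above can be sketched with simple exponential smoothing standing in for the retailer's decision system: each observed business outcome is blended back into the running estimate. The smoothing factor and the outcome stream are illustrative assumptions.

```python
# Minimal sketch of a feedback loop: each cycle's observed outcome is fed
# back to adjust the next estimate (simple exponential smoothing).

def feedback_update(estimate, outcome, alpha=0.3):
    """Blend the observed outcome back into the running estimate."""
    return estimate + alpha * (outcome - estimate)

estimate = 100.0                        # initial insight from the analyze phase
for outcome in [120, 118, 125, 123]:    # observed business outcomes, per cycle
    estimate = feedback_update(estimate, outcome)
print(round(estimate, 1))
```

Each pass pulls the estimate toward reality without overreacting to any single outcome, which is exactly the ongoing (not one-stop) character of the feedback cycle.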
ACQUIRE: FOLLOWING CUSTOMER
FOOTPRINTS
To understand the customer, retailers have to leverage every interaction with the customer and tap into every source of customer insight. Traditionally, retailers have relied primarily on in-store customer interactions and associated transaction data, along with specialized campaigns like opinion polls, to gain better insight into customer dynamics. While this interaction looks limited, a recent incident shows how powerfully customer sales history can be leveraged to gain predictive intelligence on customer needs.
The father of a teenage girl called a major North American retailer to complain that the retailer had mailed coupons for child care products addressed to his underage daughter. A few days later, the same father called back and apologized: his daughter was indeed pregnant and he had not been aware of it [6]. Surprisingly, by all indications, only in-store purchase data was mined by the retailer in this scenario to identify the customer need, which in this case was for childcare products.
To exploit the power of the next generation of analytics, retailers must plug into data from non-traditional sources like social sites, Twitter feeds, environmental sensor networks, etc. to gain better insight into customer needs. Most major retailers now have multiple channels: brick-and-mortar store, online store, mobile apps, etc. Each of these touch points not only acts as a sales channel but can also generate data on customer needs and wants. Coupling this information with other repositories like Facebook posts, Twitter feeds (i.e., sentiment analysis) and web analytics, retailers have the opportunity to track customer footprints both in and outside the store and to customize their offerings and interactions with the customer.
Traditionally, retailers have dealt with voluminous data. For example, Wal-Mart logs more than 2.5 petabytes of information about customer transactions every hour, equivalent to 167 times the books in the Library of Congress [7].
However, the nature of Big data is materially different from traditional transaction data and this must be considered during data planning. Further, while data is readily available, the legality and compliance aspects of gathering and using it are additional considerations. Integrating information from multiple sources can generate data beyond what the user originally consented to, potentially resulting in liability for the retailer. Given that most of this information is accessible globally, retailers should ensure compliance with local regulations (EU data/privacy protection regulations, HIPAA for US medical data, etc.) wherever they operate.
ANALYZE - INSIGHTS (LEADS)
TO INNOVATION
Analyst Doug Laney defined data growth challenges and opportunities as three-dimensional, i.e., increasing volume (amount of data), velocity (speed of data in and out) and variety (range of data types and sources) [9]. The key to acquiring Big data is to handle these dimensions while assimilating the aforementioned external sources of data. To understand how Big data analytics can enrich and enhance a typical retail process, allocation planning, let us look at the allocation planning case study of a major North American apparel retailer.
The forecasting engine used for the planning process uses statistical algorithms to determine allocation quantities. Key inputs to the forecasting engine are sales history and the current performance of the store. In addition, adjustments are based on parameters like promotional events (including markdowns), current stock levels and back orders to determine the inventory that needs to be shipped to a particular store.
While this is fairly in line with the industry standard for allocation forecasting, Big data can enrich this process by including additional parameters that can impact demand. For example, a news piece on a town's go-green initiative or no-plastic day can be taken as an additional adjustment parameter for non-green items in that area. Similarly, a weather forecast of a warm front in an area can automatically trigger a reduction of warm-clothing stocks for stores there.
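The enrichment described above can be sketched as multiplicative adjustment factors applied per store to the baseline statistical allocation. The store names, baseline quantities and factor values below are hypothetical; in practice the factors would be derived from the news, weather and social feeds themselves.

```python
# Hypothetical sketch: adjust a baseline allocation forecast with external
# Big data signals. All names and numbers are illustrative.
base_allocation = {"store_12": 400, "store_17": 250}   # from sales history

# External signals mapped to multiplicative adjustments per store.
signals = {
    "store_12": [0.8],        # news: local no-plastic day hits non-green items
    "store_17": [0.7, 1.2],   # warm-front forecast, plus a local promotion
}

def adjusted(base, factors):
    """Apply each signal's adjustment factor to the baseline quantity."""
    for f in factors:
        base *= f
    return round(base)

plan = {s: adjusted(q, signals.get(s, [])) for s, q in base_allocation.items()}
print(plan)
```

Stores with no signals simply keep their baseline, so the statistical engine remains the default and the Big data feeds only perturb it.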
A high-level logical view of a Big data implementation is explained below to further the understanding of how Big data can be assimilated with traditional data sources. The data feeds for the implementation come from structured sources like forums, feedback forms and rating sites, unstructured sources like the social web, and semi-structured data from emails, word documents, etc. Compared to traditional systems this is a veritable data feast, but it is important to diet on such data and use only those feeds that create optimum value. This is done through a synergy of business knowledge and processes specific to the retailer and the industry segment the retailer operates in, and a set of tools specialized in analyzing huge volumes of data at rapid speed. Once data is massaged for downstream systems, big analytics tools are used to analyze it. Based on business needs, real-time or offline data processing/analytics can be used. In real-life scenarios, both approaches are used based on situation and need.
Proper analysis needs data not just from consumer insight sources but also from transactional data history and consumer profiles.
ACTIONIZE - BIG DATA TO BIG IDEAS
This is the key part of the Big data cycle. Even the best data is no substitute for timely action. The technology and functional stack will facilitate the retailer getting proper insight into key customer purchase intent: what, where, why and at what price. By knowing this,
Figure 2: Correlation between Customer Ratings and Sales. Source: Reference [12]
(The figure compares the Best Sellers and Most Wished For lists in Tablet PCs: Kindle Fire and Kindle Fire HD variants and the Samsung Galaxy Tab 2, in 7-inch and 10.1-inch sizes, appear on both lists.)
the retailer can customize the 4Ps (product, pricing, promotions and place) to create enough mindshare from the customer's perspective that sales become inevitable [10].
For example, a cursory look at a random product category (tablets) on an online retailer's site shows the strong correlation between customer ratings and sales: 4 out of the 6 best user-rated products are in the top five in sales, a 60% correlation, even when other parameters like brand, price and release date are not taken into consideration [Fig. 2] [12]. A retailer who knows the customer ratings can offer promotions that tip the balance between a sale and a lost opportunity. While this example may not be the rule, the key to analyzing and actionizing the data is to correlate user feedback data with the concomitant sales.
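The overlap behind this example is straightforward to compute. A small Python sketch follows; the product identifiers are stand-ins, not the actual Amazon listings of Figure 2:

```python
# Sketch of the overlap check in the example above: what fraction of the
# best user-rated products also appear among the best sellers?

def list_overlap(rated, sellers):
    """Fraction of top-rated products that also appear in the best-seller list."""
    shared = set(rated) & set(sellers)
    return len(shared) / len(rated)

top_rated    = ["A", "B", "C", "D", "E", "F"]   # 6 best user-rated products
best_sellers = ["A", "B", "C", "D", "X"]        # top 5 in sales

overlap = list_overlap(top_rated, best_sellers)  # 4 of 6 products are shared
```

This set-overlap measure is the simple version of the comparison; a fuller analysis would also control for brand, price and release date, as the text notes.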
BIG DATA OPPORTUNITIES
The implications of Big data analytics for major retailing processes will span the following areas.
Identifying the Product Mix: The assortment and allocation will need to take into consideration the evolving user trends identified from Big data analytics to ensure the offering matches market needs. Allocation planning especially has to be tactical, with shorter lead times.
Promotions and Pricing: Retailers have to move from generic pricing strategies to customized, user-specific ones.
Communication with Customer: Advertising will move from mass media to personalized communication, and from one-way to two-way communication. Retailers will gain more from viral marketing [13] than from traditional advertising channels.
Compliance: Governmental regulations and compliance requirements are mandatory to avoid liability, as co-mingling data from disparate sources can result in the generation of personal data beyond the scope of the original user's intent. While data is available globally, its use has to comply with the local law of the land and must keep customers' sensibilities in mind.
People, Process and Organizational Dynamics: The move to a feedback economy requires a different organizational mindset and different processes. Decision making will need to be more bottom-up and collaborative. Retailers need to engage customers to ensure the feedback loop is in place. Further, Big data, being cross-functional, needs active participation and coordination between the various departments in the organization; hence managing organizational dynamics is a key consideration.
Better Customer Experience: Organizations can improve the overall customer experience by providing update services and thereby eliminating surprises. For instance, Big data solutions can be used to proactively inform customers of expected shipment delays based on traffic data, climate and other external factors.
BIG DATA ADOPTION STRATEGY
Presented below is a perspective on how to
adopt a Big data solution within the enterprise.
Define Requirements, Scope and Mandate:
Define the mandate and objective in terms of what is required from the Big data solution. A guiding factor in identifying the requirements would be the prioritized list of business strategies. As part of initiation, it is also important to identify the goals and KPIs that justify the usage of Big data.
Key Player: Business
Choosing the Right Data Sources:
Once the requirements and scope are defined, the IT department has to identify the various feeds that would fetch the relevant data. These feeds could be structured, semi-structured or unstructured, and the sources could be internal or external. For internal sources, policies and processes should be defined to enable frictionless flow of data.
Key Players: IT and Business
Choosing the Required Tools and Technologies:
After deciding upon the sources of data that would feed the system, the right tools and technologies should be identified and aligned with business needs. The key areas are capturing the data, tools and rules to clean the data, identifying tools for real-time and offline analytics, and identifying storage and other infrastructure needs.
Key Player: IT
Creating Inferences from Insights:
One of the key factors in a successful Big data implementation is having a pool of talented data analysts who can draw proper inferences from the insights and facilitate the building and definition of new analytic models. These models help in probing the data and understanding the insights.
Key Player: Data Analyst
Strategy to Actionize the Insights:
Business should create processes that take these inferences as inputs to decision making. Stakeholders in decision making should be identified, and actionable inferences have to be communicated to them at the right time. Speed is critical to the success of Big data.
Key Player: Business
Measuring the Business Benefits:
The success of the Big data initiative depends on the value it creates for the organization and its decision-making body. It should also be noted that, unlike other initiatives, Big data initiatives are usually a continuous process in search of the best results, and organizations should be attuned to this to derive the most from them. However, it is important that a goal is set and measured to track the initiative and ensure it moves in the right direction.
Key Players: IT and Business
CONCLUSION
The move to a feedback economy presents an inevitable paradigm shift for the retail industry. Big data, as the enabling technology, will play a key role in this transformation. As ever, business needs will continue to drive technology, process and solution. However, given the criticality of Big data, organizations will need to treat Big data as an existential strategy and make the right investments to ensure they can ride the wave.
REFERENCES
1. Customer dynamics. Available at http://en.wikipedia.org/wiki/Customer_dynamics.
2. Davenport, T. and Harris, J. G. (2007), Competing on Analytics, Harvard Business School Publishing.
3. De Borde, M. (2006), Do Your Organizational Dynamics Determine Your Operational Success?, The O&P Edge.
4. Lemon, K. N., Barnett White, T. and Winer, R. S., Dynamic Customer Relationship Management: Incorporating Future Considerations into the Service Retention Decision, Journal of Marketing.
5. Boyd, J. (September 3, 1976). OODA loop, in Destruction and Creation. Available at http://en.wikipedia.org/wiki/OODA_loop.
6. Doyne, S. (2012), Should Companies Collect Information About You?, NY Times. Available at http://learning.blogs.nytimes.com/2012/02/21/should-companies-collect-information-about-you/.
7. Data, data everywhere (2010), The Economist. Available at http://www.economist.com/node/15557443.
8. IDC Digital Universe (2011). Available at http://chucksblog.emc.com/chucks_blog/2011/06/2011-idc-digital-universe-study-big-data-is-here-now-what.html.
9. Gartner Says Solving Big Data Challenge Involves More Than Just Managing Volumes of Data (2011). Available at http://www.gartner.com/it/page.jsp?id=1731916.
10. Gens, F. (2012), IDC Predictions 2012: Competing for 2020. Available at http://cdn.idc.com/research/Predictions12/Main/downloads/IDCTOP10Predictions2012.pdf.
11. Bhasin, H., 4Ps of marketing. Available at http://www.marketing91.com/marketing-mix-4-ps-marketing/.
12. Amazon US site, tablets category (2012). Available at http://www.amazon.com/gp/top-rated/electronics/3063224011/ref=zg_bs_tab_t_tr?pf_rd_p=1374969722&pf_rd_s=right-8&pf_rd_t=2101&pf_rd_i=list&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=14YWR6HBVR6XAS7WD2GG.
13. Godin, S. (2008), Viral marketing. Available at http://sethgodin.typepad.com/seths_blog/2008/12/what-is-viral-m.html.
14. Wang, R. (2012), Monday's Musings: Beyond The Three V's of Big Data - Viscosity and Virality. Available at http://blog.softwareinsider.org/2012/02/27/mondays-musings-beyond-the-three-vs-of-big-data-viscosity-and-virality/.
Harness Big Data Value and
Empower Customer Experience
Transformation
By Zhong Li, PhD
In today's hyper-competitive experience
economy, communication service providers
(CSPs) recognize that product and price alone
will not differentiate their business and brand.
Since brand loyalty, retention and long-term
profitability are now so closely aligned with
customer experience, the ability to understand
customers, spot changes in their behavior
and adapt quickly to new consumer needs is
fundamental to the success of the consumer-driven Communication Service Industry.
Increasingly sophisticated digital consumers demand more personalized services through the channel of their choice. In fact, the internet, mobile and, particularly, the rise of social media in the past 5 years have empowered consumers more than ever before. There is a growing challenge for CSPs contending with an increasingly scattered relationship with customers, who can now choose from multiple channels to conduct business interactions. Recent industry research indicates that some 90% of today's consumers in the US and Western Europe interact across multiple channels, representing a moving target that makes achieving a full view of the customer that much more challenging.
To compound this trend, always-on digital customers continuously create more data of various types, from many more touch points, with more interaction options. CSPs encounter the Big data phenomenon by accumulating significant amounts of customer-related information such as purchase patterns and activities on the website, from mobile, social media or interactions with the network and call centre.
This Big data phenomenon presents CSPs with challenges along the 3V dimensions (Fig. 1), viz.,
Communication Service Providers need to
leverage the 3M Framework with a holistic
5C process to extract Big Data value (BDV)
Large Volume: Recent industry research shows that the amount of data a CSP has to manage from consumer transactions and interactions has doubled in the past three years, and its growth is accelerating, set to double again in the next two years, with much of it coming from new sources including blogs, social media, internet search and networks [7].
Broad Variety: Data is created in a broad variety of types, forms and formats. It comes from multiple channels such as online, call centre, stores and social media, including Facebook, Twitter and other platforms. It presents itself in a variety of types, comprising structured data from transactions, semi-structured data from call records and unstructured data in multimedia forms from social interactions.
Rapidly Changing Velocity: The always-on digital consumers change the dynamics of data at the speed of light. They equally demand fast responses from CSPs to satisfy their personalized needs in real time.
CSPs of all sizes have learned the hard way that it is very difficult to take full advantage of all the customer interactions in Big data if they do not know what their customers are demanding or what their relative value to the business is. Even some CSPs that do segment their customers with the assistance of a customer relationship management (CRM) system struggle to take complete advantage of that segmentation in developing a real-time value strategy. In the hyper-sophisticated interaction patterns throughout the customer journey, spanning marketing, research, order, service and retention, Big data sheds a shining light that exposes treasured customer intelligence along the aspects of the 4Is, viz., interest, insight, interaction and intelligence.
Interest and Insight: Customers offer their attention out of interest and share their insights. They visit a website, make a call, access a retail store or share a view on social media because they want something from the CSP at that moment: information about a product, or help with a problem. These interactions present an opportunity for the CSP to communicate with a customer who is engaged by choice and ready to share information regarding her personalized wants and needs.
Interaction and Intelligence: It is typically crucial for CSPs to target offerings to particular customer segments based on the intelligence in customer data. The success of these real-time interactions, whether through online, mobile, social media or other channels, depends to a great extent on the CSP's understanding of the customer's wants and needs at the time of the interaction.
Figure 1: Big Data in 3Vs is Accumulated from Multiple Channels. The figure places Value at the centre of Volume, Variety and Velocity, fed by web, mobile, store, call centre and social channels. Source: Infosys Research
Therefore, alongside managing and securing Big data along the 3V dimensions, CSPs face a fundamental challenge: how to explore and harness Big data Value (BDV).
A HOLISTIC 5C PROCESS TO
HARNESS BDV
Rising to the challenges and leveraging the opportunity in Big data, CSPs need to harness BDV with predictive models that provide deeper insight into customer intelligence from the profiles, behaviours and preferences hidden in Big data of vast volume and broad variety, and that deliver a superior personalized experience with fast velocity, in real time, throughout the entire customer journey.
In the past decade, most CSPs have invested a significant amount of effort in the implementation of complex CRM systems to manage customer experience. While those CRM systems bring efficiency, helping CSPs deliver on what to do in managing historical transactions, they lack the crucial capability of defining how to act in time with the most relevant interaction to maximize value for the customer.
CSPs now need to look beyond what CRM has to offer and dive deeper into how to do things right for the customer: capture the customer's subjective sentiment in a particular interaction, derive insight that predicts what the customer demands from the CSP, and trigger proactive action to satisfy her needs, which is more likely to lead to customer delight and, ultimately, revenues.
To do so, CSPs need to execute a holistic 5C process, i.e., collect, converge, correlate, collaborate and control, in extracting BDV (Fig. 2).
The holistic 5C process will help CSPs aggregate the whole interaction with a customer across time and channels, support it with a large volume and broad variety of data including promotion, product, order and services, and align interactions with the customer's preferences. The context of the customer's relationship with the CSP, and the actual and potential value she derives, in particular, determine the likelihood that the consumer will take particular actions based on real-time intelligence. Big data can help the CSP correlate the customer's needs with product, promotion, order and service, and deliver the right offer at the right time, in the appropriate context, that she is most likely to respond to.
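The 5C process can be pictured as five composable stages. The following schematic Python sketch is illustrative only: the stage bodies, event shapes and catalogue are invented placeholders, and a real implementation would sit on the CSP's channel and BSS/OSS systems:

```python
# Schematic 5C pipeline: collect -> converge -> correlate -> collaborate -> control.

def collect(channels):
    """Gather raw interaction events from every channel into one stream."""
    return [e for events in channels.values() for e in events]

def converge(events):
    """Merge events into one per-customer timeline."""
    timeline = {}
    for e in events:
        timeline.setdefault(e["customer"], []).append(e)
    return timeline

def correlate(timeline, catalogue):
    """Match each customer's events against products they may be interested in."""
    return {c: [p for p in catalogue if any(p in e["text"] for e in evts)]
            for c, evts in timeline.items()}

def collaborate(matches):
    """Hand matched offers to the channel teams (stubbed as a list of actions)."""
    return [{"customer": c, "offer": p} for c, ps in matches.items() for p in ps]

def control(actions):
    """Track outcomes so the loop can be tuned (here: just count actions)."""
    return {"actions_sent": len(actions)}

events = collect({"web":  [{"customer": "anna", "text": "looking at tablet"}],
                  "call": [{"customer": "ben", "text": "billing question"}]})
report = control(collaborate(correlate(converge(events), ["tablet", "router"])))
```

The value of the decomposition is that each stage can be scaled or replaced independently while the customer timeline remains the shared unit of work.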
AN OVERARCHING 3M FRAMEWORK
TO EXTRACT BDV
To execute a holistic 5C process for Big data, CSPs need to implement an overarching framework that integrates the various pools of customer-related data residing in the CSP's enterprise systems, creates an actionable customer profile, delivers insight based on that profile during real-time customer interaction events, and effectively matches sales and service resources to take proactive actions, so as to monetize the ultimate value on the fly.
Figure 2: Harness BDV with a Holistic 5C Process. The figure arranges the five stages (collect, converge, correlate, collaborate, control) around customer, product, service, order and promotion data. Source: Infosys Research
The overarching framework needs to incorporate 3M modules, i.e., Model, Monitor and Mobilize.
Model Profile: It models the customer profile based on all the transactions, which helps CSPs gain insight at the individual-customer level. Such a profile requires not only integration of all customer-facing systems and enterprise systems, but also integration with all the customer interactions, such as email, mobile, online and social, in enterprise systems such as OMS, CMS, IMS and ERP, in parallel with the CRM paradigm, to model an actionable customer profile and be able to effectively deploy resources for a distinct customer experience.
Monitor Pattern: It monitors customer interaction events from multiple touch points in real time, dynamically senses and matches patterns of events against the defined policies and set models, and makes suitable recommendations and offers at the right time through an appropriate channel. It enables CSPs to quickly respond to changes in the marketplace (a seasonal change in demand, for example) and bundle offerings that will appeal to a particular customer, across a particular channel, at a particular time.
Mobilize Process: It mobilizes a set of automations that allows customers to enjoy a personalized, engaging journey in real time, spanning outbound and inbound communications, sales, orders, service and help intervention, and fulfils the customer's next immediate demand.
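The Monitor Pattern module, in particular, amounts to matching event sequences against pre-defined policies. A hedged Python sketch follows; the policies, event names and the in-order matching strategy are invented for illustration:

```python
# Illustrative event-pattern monitor: fire a recommendation when a policy's
# event pattern appears, in order, in a customer's interaction stream.

POLICIES = [
    {"pattern": ["viewed_product", "abandoned_cart"],
     "recommendation": "send_discount_offer"},
    {"pattern": ["dropped_call", "dropped_call"],
     "recommendation": "proactive_service_callback"},
]

def monitor(event_stream):
    """Return recommendations whose pattern occurs as a subsequence of the stream."""
    fired = []
    for policy in POLICIES:
        pattern = policy["pattern"]
        i = 0  # position of the next pattern element we are waiting for
        for event in event_stream:
            if event == pattern[i]:
                i += 1
                if i == len(pattern):
                    fired.append(policy["recommendation"])
                    break
    return fired

print(monitor(["viewed_product", "searched", "abandoned_cart"]))
# → ['send_discount_offer']
```

A production engine would run this continuously over streaming events (e.g., on an ESB with complex-event-processing support) rather than over a finished list, but the matching idea is the same.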
The 3M framework needs to be based on an event-driven architecture (EDA) incorporating an Enterprise Service Bus (ESB) and Business Process Management (BPM), and should be application- and technology-agnostic. It needs to interact with multiple channels using events; match patterns of sets of events against pre-defined policies, rules and analytical models; and deliver a set of automations to fulfil a personalized experience that spans the complete customer lifecycle. Furthermore, the 3M framework needs to be supported by key high-level functional components, which include:
Customer Intelligence from Big data: A typical implementation of customer intelligence from Big data is the combination of a data warehouse and real-time customer intelligence analytics. It requires aggregation of customer and product data from the CSP's various data sources in BSS/OSS, leveraging the CSP's existing investments in data models, workflows, decision tables, user interfaces, etc. It also integrates with the key modules in the CSP's enterprise landscape, covering:
Customer Management: A complete customer relationship management solution combines a 360-degree view of the customer with intelligent guidance and seamless back-office integration to increase first-contact resolution and operational efficiency.
Offer Management: CSP-specific specialization and re-use capabilities that define new services, products, bundles, fulfilment processes and dependencies, rapidly capitalize on new market opportunities and improve customer experience.
Order Management: Configurable best practices for creating and maintaining a holistic order journey, which is critical to the success of such product-intensive functions as account opening, quote generation, ordering, contract generation, product fulfilment and service delivery.
Service Management: Case-based work automation and a complete view of each case enable effective management of every case throughout its lifecycle.
Event-Driven Process Automation: A dynamic process automation engine empowered with EDA leverages the context of the interaction to orchestrate the flow of activities, guiding customer service representatives (CSRs) and self-service customers through every step of their inbound and outbound interactions, in particular for Campaign Management and Retention Management.
Campaign Management: Outbound interactions are typically used to target products and services to particular customer segments, based on analysis of customer data, through appropriate channels. It uncovers relevant, timely and actionable consumer and network insights to enable intelligently driven marketing campaigns; to develop, define and refine marketing messages and target customers with a more effective plan; and to meet customers at the touch points of their choosing through optimized display and search results, while generating demand via automated email creation, delivery and results tracking.
Retention Management: Customers offer their attention, either intrusively or non-intrusively, to look for the products and services that meet their needs through the channel of their choice. It dynamically captures consumer data from highly active and relevant outlets such as social media, websites and other social sources, and enables CSPs to quickly respond to customer needs and proactively deliver relevant offers for upgrades and product bundles that take into account each customer's personal preferences.
Experience Personalization: It provides the customer with a personalized, relevant experience, enabled by business process automation that connects people, processes and systems in real time and eliminates product, process and channel silos. It helps CSPs extend predictive targeting beyond basic cross-sells, automate more of their cross-channel strategies, and gain valuable insights from hidden consuming and interaction patterns.
Overall, the 3M framework will empower the BDV solution for the CSP to execute on real-time decisions that align individual needs with business objectives, and to dynamically fulfil the next best action or offer that will increase the value of each personalized interaction.
BDV IN ACTION: CUSTOMER EXPERIENCE OPTIMIZATION
By implementing the proposed BDV solution, CSPs can optimize the customer experience, delivering the right interaction with each customer at the right time so as to build strong relationships, reduce churn and increase customer value to the business.
From the Customer Experience Perspective: It provides the CSP with real-time, end-to-end visibility into all the customer interaction events taking place across multiple channels; by correlating and analyzing these events using a set of business rules, it automatically takes proactive actions that ultimately lead to customer experience optimization. It helps CSPs turn their multi-channel contacts with customers into cohesive, integrated interaction patterns, allowing them to better segment their customers and ultimately take full advantage of that segmentation, delivering personalized experiences that are dynamically tailored to each customer while dramatically improving interaction effectiveness and efficiency.
From the CSP's Perspective: It helps CSPs quickly weed out underperforming campaigns and learn more about their customers and their needs. From retail store to contact centre to web to social media, it helps CSPs deliver a new standard of branded, consistent customer experiences that build deeper, more profitable and lasting relationships. It enables CSPs to maximize productivity by handling customer interactions as quickly as possible in the most profitable channel.
At every point in the customer lifecycle, from marketing campaigns, offers and orders to servicing and retention efforts, BDV helps inform interactions with the customer's preferences, the context of her relationship with the business, and her actual and potential value, enabling CSPs to focus on creating personalized experiences that balance the customer's needs with business value.
Campaign Management: BDV delivers campaigns focused on the customer, with predictive modelling and cost-effective campaign automation that consistently distinguishes the brand and supports personalized communications with prospects and customers.
Offer Management: BDV dynamically generates offers that account for such factors as the current interaction with the customer, the individual's total value across product lines, past interactions and the likelihood of defecting. It helps deliver optimal value and increases the effectiveness of propositions with next-best-action recommendations tailored to the individual customer.
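Next-best-action selection of this kind is often sketched as weighted scoring of candidate offers. The following minimal Python illustration is hypothetical: the factors mirror those the text lists, but the weights, field names and values are invented:

```python
# Hedged sketch of next-best-action selection: score each candidate offer
# against the factors named in the text, then pick the highest scorer.

WEIGHTS = {"relevance": 0.4, "customer_value": 0.3,
           "past_acceptance": 0.2, "churn_risk": 0.1}

def score(offer, customer):
    """Weighted sum of the factors; each factor is assumed pre-scaled to [0, 1]."""
    factors = {
        "relevance": offer["relevance"],          # fit to the current interaction
        "customer_value": customer["value"],      # value across product lines
        "past_acceptance": customer["accept_rate"],  # past interactions
        "churn_risk": customer["churn_risk"],     # likelihood of defecting
    }
    return sum(WEIGHTS[k] * v for k, v in factors.items())

def next_best_action(offers, customer):
    """Pick the single highest-scoring offer for this customer."""
    return max(offers, key=lambda o: score(o, customer))

customer = {"value": 0.8, "accept_rate": 0.5, "churn_risk": 0.2}
offers = [{"name": "upgrade_bundle", "relevance": 0.9},
          {"name": "loyalty_discount", "relevance": 0.4}]
best = next_best_action(offers, customer)
```

In practice the weights would come from a predictive model rather than being hand-set, but the decision step, a per-offer score followed by an argmax, has this shape.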
Order Management: BDV enables unified process automation applicable to multiple product lines, with agile and flexible workflow, rules and process orchestration that account for individual needs in product pricing, configuration, processing, payment scheduling and delivery.
Service Management: BDV empowers customer service representatives to act based on the unique needs and behaviours of each customer, using real-time intelligence combined with holistic customer content and context.
Retention Management: BDV helps CSPs retain more high-value customers with targeted next-best-action dialogues. It consistently turns customer interactions into sales opportunities by automatically prompting customer service representatives to proactively deliver relevant offers that satisfy each customer's unique needs.
CONCLUSION
Today's increasingly sophisticated digital consumers expect CSPs to deliver product, service and interaction experiences designed "just for me, at this moment". To take on the challenge, CSPs need to deliver customer experience optimization powered by BDV in real time.
By implementing an overarching 3M BDV framework to execute a holistic 5C process, new products can be brought to market with faster velocity and with the ability to easily adapt common services to accommodate unique customer and channel needs.
Suffice it to say that BDV will enable CSPs to deliver a customer-focused experience that matches responses to specific individual demands; provide real-time intelligent guidance that streamlines complex interactions; and automate interactions end-to-end. The result is an optimized customer experience that helps CSPs substantially increase customer satisfaction, retention and profitability, and consequently empowers CSPs to evolve into the experience-centric Tomorrow's Enterprise.
REFERENCES
1. IBM Big data solutions deliver insight and relevance for digital media, Solution Brief, June 2012. Available at www-05.ibm.com/fr/events/netezzaDM.../Solutions_Big_Data.pdf.
2. Oracle Big data Premier, Presentation (May 2012). Available at http://premiere.digitalmedianet.com/articles/viewarticle.jsp?id=1962030.
3. SAP HANA for Next-Generation Business Applications and Real-Time Analytics (July 2012). Available at http://www.saphana.com/docs/DOC-1507.
4. SAS