
Content

1 An Overview of Data Warehouse
2 Data Warehouse Architecture
3 Data Modeling for Data Warehouse
4 Overview of Data Cleansing

5 Data Extraction, Transformation, Load


Content [contd]
6 Metadata Management
7 OLAP
8 Data Warehouse Testing


Data Warehouse Concepts

Avinash Kanumuru Diya Jana Debyajit Majumder


An Overview
Understanding what a data warehouse is


What is a Data Warehouse?


Definitions of Data Warehouse

"A data warehouse is a subject-oriented, integrated, nonvolatile, time-variant collection of data in support of management's decisions." - W.H. Inmon

"A data warehouse is a repository of data summarized or aggregated in simplified form from operational systems. End-user oriented data access and reporting tools let users get at the data for decision support." - Babcock

"A data warehouse is a relational database that is a copy of transaction data, specifically structured for query and analysis." - Ralph Kimball

In simple terms: data warehousing is the collection of data from different systems, which helps in business decisions, analysis and reporting.

Data Warehouse def. by WH Inmon


A common way of introducing data warehousing is to refer to the characteristics of a data warehouse as set forth by William Inmon:

Subject Oriented: Data that gives information about a particular subject instead of about a company's ongoing operations.

Integrated: Data that is gathered into the data warehouse from a variety of sources and merged into a coherent whole.

Nonvolatile: Data is stable in a data warehouse. More data is added, but data is never removed. This enables management to gain a consistent picture of the business.

Time Variant: In order to discover trends in business, analysts need large amounts of data. This is very much in contrast to online transaction processing (OLTP) systems, where performance requirements demand that historical data be moved to an archive. All data in the data warehouse is identified with a particular time period.

Data Warehouse Architecture


What makes a Data Warehouse


Components of Warehouse
Source Tables: Real-time, volatile data in relational databases used for transaction processing (OLTP). These can be any relational databases or flat files.

ETL Tools: To extract, cleanse, transform (aggregate, join) and load the data from the sources to the target.

Maintenance and Administration Tools: To authorize and monitor access to the data, set up users, and schedule jobs to run during off-peak periods.

Modeling Tools: Used for data warehouse design for high performance, using dimensional data modeling techniques and mapping the source and target files.

Databases: Target databases and data marts, which are part of the data warehouse. These are structured for analysis and reporting purposes.

End-user tools for analysis and reporting: Used to get reports and analyze the data from the target tables. Different types of querying, data mining and OLAP tools are used for this purpose.

Data Warehouse Architecture


This is a basic design: source files are loaded into a warehouse, and users query the data for different purposes.

This design adds a staging area, where the data is loaded and tested after cleansing and transformation; it is then loaded directly into the target database/warehouse, which is divided into data marts that different users can access for their reporting and analysis purposes.

Data Modeling
Effective way of using a Data Warehouse


Data Modeling

The E-R data model is commonly used in OLTP; in OLAP, the dimensional data model is commonly used.

E-R (Entity-Relationship) Data Model
Entity: An object that can be observed and classified based on its properties and characteristics, such as an employee, book or student.
Relationship: Relates entities to other entities.

Different perspectives of data modeling:
o Conceptual Data Model
o Logical Data Model
o Physical Data Model

Terms used in Dimensional Data Model


To understand dimensional data modeling, let's define some of the terms commonly used in this type of modeling:

Dimension: A category of information. For example, the Time dimension.
Attribute: A unique level within a dimension. For example, Month is an attribute in the Time dimension.
Hierarchy: The specification of levels that represents the relationship between different attributes within a dimension. For example, one possible hierarchy in the Time dimension is Year > Quarter > Month > Day.
Fact Table: A table that contains the measures of interest.
Lookup Table: Provides detailed information about an attribute. For example, the lookup table for the Quarter attribute would include a list of all of the quarters available in the data warehouse.
Surrogate Keys: System-generated keys used in place of natural keys; they help preserve data integrity, support Slowly Changing Dimensions and act as index/primary keys (a small sketch follows).

A dimensional model includes fact tables and lookup tables. Fact tables connect to one or more lookup tables, but fact tables do not have direct relationships to one another. Dimensions and hierarchies are represented by lookup tables. Attributes are the non-key columns in the lookup tables.
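To make the surrogate key / Slowly Changing Dimension idea concrete, here is a minimal Python sketch (not part of the original deck): a Type 2 change to a hypothetical customer dimension expires the current row and adds a new version under a fresh surrogate key. All field names are illustrative assumptions.

```python
from datetime import date

# Illustrative Type-2 slowly-changing-dimension handling; the surrogate key "sk",
# not the business key "customer_id", identifies each historical version of a row.
dim_customer = [
    {"sk": 1, "customer_id": 53, "name": "joe", "city": "sfo",
     "effective_from": date(1997, 1, 1), "current": True},
]

def apply_scd2(dim, incoming, load_date):
    """Expire the current version for the business key and append a new one."""
    current = next((r for r in dim
                    if r["customer_id"] == incoming["customer_id"] and r["current"]),
                   None)
    if current and current["city"] == incoming["city"]:
        return                      # nothing changed, keep the current version
    if current:
        current["current"] = False  # close the old version
    dim.append({
        "sk": max(r["sk"] for r in dim) + 1 if dim else 1,   # next surrogate key
        "customer_id": incoming["customer_id"],
        "name": incoming["name"],
        "city": incoming["city"],
        "effective_from": load_date,
        "current": True,
    })

# The customer moves from sfo to la: a second version appears with sk = 2.
apply_scd2(dim_customer, {"customer_id": 53, "name": "joe", "city": "la"},
           date(1997, 7, 1))
for row in dim_customer:
    print(row)
```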


Star Schema
Dimension Table: product
  prodId  name  price
  p1      bolt  10
  p2      nut   5

Dimension Table: store
  storeId  city
  c1       nyc
  c2       sfo
  c3       la

Fact Table: sale
  orderId  date    custId  prodId  storeId  qty  amt
  o100     1/7/97  53      p1      c1       1    12
  o102     2/7/97  53      p2      c1       2    11
  o105     3/8/97  111     p1      c3       5    50

Dimension Table: customer
  custId  name   address    city
  53      joe    10 main    sfo
  81      fred   12 main    sfo
  111     sally  80 willow  la
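For illustration, the Python sketch below (not from the original deck) loads these sample rows into an in-memory SQLite database and runs a typical star-schema query: join the fact table to its dimension tables and aggregate. Table and column names follow the figure above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension and fact tables as in the star-schema figure.
cur.execute("CREATE TABLE product (prodId TEXT, name TEXT, price INTEGER)")
cur.execute("CREATE TABLE store (storeId TEXT, city TEXT)")
cur.execute("CREATE TABLE sale (orderId TEXT, saleDate TEXT, custId INTEGER, "
            "prodId TEXT, storeId TEXT, qty INTEGER, amt INTEGER)")

cur.executemany("INSERT INTO product VALUES (?, ?, ?)",
                [("p1", "bolt", 10), ("p2", "nut", 5)])
cur.executemany("INSERT INTO store VALUES (?, ?)",
                [("c1", "nyc"), ("c2", "sfo"), ("c3", "la")])
cur.executemany("INSERT INTO sale VALUES (?, ?, ?, ?, ?, ?, ?)",
                [("o100", "1997-07-01", 53, "p1", "c1", 1, 12),
                 ("o102", "1997-07-02", 53, "p2", "c1", 2, 11),
                 ("o105", "1997-08-03", 111, "p1", "c3", 5, 50)])

# Typical star-schema query: join the fact table to its dimensions and aggregate.
query = """
    SELECT s.city, p.name, SUM(f.qty) AS total_qty, SUM(f.amt) AS total_amt
    FROM sale f
    JOIN store s   ON f.storeId = s.storeId
    JOIN product p ON f.prodId  = p.prodId
    GROUP BY s.city, p.name
    ORDER BY s.city, p.name
"""
for row in cur.execute(query):
    print(row)
```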


Snowflake Schema
In a snowflake schema, a dimension is normalized into related lookup tables. Here the store dimension (joined to the fact table of the star schema) is split out into the sType, city and region tables:

store
  storeId  cityId  tId  mgr
  s5       sfo     t1   joe
  s7       sfo     t2   fred
  s9       la      t1   nancy

sType
  tId  size   location
  t1   small  downtown
  t2   large  suburbs

city
  cityId  pop  regId
  sfo     1M   north
  la      5M   south

region
  regId  name
  north  cold region
  south  warm region

The star and snowflake schemas are most commonly found in dimensional data warehouses and data marts, where speed of data retrieval is more important than the efficiency of data manipulation. As such, the tables in these schemas are not heavily normalized and are frequently designed at a level of normalization short of third normal form.

Overview of Data Cleansing


The Need For Data Quality
Difficulty in decision making
Time delays in operation
Organizational mistrust
Data ownership conflicts
Customer attrition
Costs associated with:
  error detection
  error rework
  customer service
  fixing customer problems

Six Steps To Data Quality

1. Understand information flow in the organization
   Identify authoritative data sources
   Interview employees & customers
2. Identify potential problem areas & assess impact
   Data entry points
   Cost of bad data
3. Measure quality of data (a small profiling sketch follows this list)
   Use business rule discovery tools to identify data with inconsistent, missing, incomplete, duplicate or incorrect values
4. Clean & load data
   Use data cleansing tools to clean data at the source
   Load only clean data into the data warehouse
5. Continuous monitoring
   Schedule periodic cleansing of source data
6. Identify areas of improvement
   Identify & correct causes of defects
   Refine data capture mechanisms at source
   Educate users on the importance of DQ
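As an illustration of the "measure quality of data" step, here is a minimal Python sketch (not from the original deck) that profiles a handful of records for missing, duplicate and out-of-range values. The record layout and the age rule are assumptions made for the example.

```python
# Illustrative data-quality profiling before data is loaded into the warehouse.
records = [
    {"cust_id": 1, "email": "a@example.com", "age": 34},
    {"cust_id": 2, "email": None,            "age": 29},
    {"cust_id": 2, "email": "b@example.com", "age": 203},  # duplicate id, bad age
]

def profile(rows):
    """Count rule violations: missing, duplicate and out-of-range values."""
    issues = {"missing_email": 0, "duplicate_id": 0, "age_out_of_range": 0}
    seen_ids = set()
    for r in rows:
        if not r["email"]:
            issues["missing_email"] += 1
        if r["cust_id"] in seen_ids:
            issues["duplicate_id"] += 1
        seen_ids.add(r["cust_id"])
        if not 0 <= r["age"] <= 120:          # assumed business rule
            issues["age_out_of_range"] += 1
    return issues

print(profile(records))
# {'missing_email': 1, 'duplicate_id': 1, 'age_out_of_range': 1}
```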

Data Quality Solution

Customized Programs
  Strengths:
    Address specific needs
    No bulky one-time investment
  Limitations:
    Large numbers of custom programs in different environments are difficult to manage
    Minor alterations demand coding effort

Data Quality Assessment Tools
  Strengths:
    Provide automated assessment
  Limitations:

Data Quality Solution

Business Rule Discovery Tools
  Strengths:
    Detect correlations in data values
    Can detect patterns of behavior that indicate fraud
  Limitations:
    Not all variables can be discovered
    Some discovered rules might not be pertinent
    There may be performance problems with large files or with many fields

Data Reengineering & Cleansing Tools
  Strengths:
    Usually integrated packages with cleansing features as an add-on

Tools In The Market

Business Rule Discovery Tools
  Integrity Data Reengineering Tool from Vality Technology
  Trillium Software System from Harte-Hanks Data Technologies
  Migration Architect from DB Star

Data Reengineering & Cleansing Tools
  Carlton Pureview from Oracle
  ETI-Extract from Evolutionary Technologies
  PowerMart from Informatica Corp
  Sagent Data Mart from Sagent Technology

Data Quality Assessment Tools
  Migration Architect, Evoke Axio from Evoke Software
  Wizrule from Wizsoft

Name & Address Cleansing Tools
  Centrus Suite from Sagent
  I.d.centric from First Logic

Data Extraction, Transformation, Load


ETL Architecture

[Architecture diagram: visitors reach the site through web browsers and the Internet; web server logs and e-commerce transaction data (flat files), external data (demographics, household, webographics, income) and other OLTP systems feed a scheduled extraction into a staging area (RDBMS), where data is cleaned, transformed, matched and merged; scheduled loading then populates the enterprise data warehouse, and a metadata repository spans the process. Stages: data collection, data extraction, data transformation, data loading, data storage & integration.]

ETL Architecture

Data extraction:
  Rummages through a file or database
  Uses some criteria for selection
  Identifies qualified data
  Transports the data over onto another file or database

Data transformation:
  Integrating dissimilar data types
  Changing codes
  Adding a time attribute
  Summarizing data
  Calculating derived values
  Renormalizing data

Data extraction cleanup:
  Restructuring of records or fields
  Removal of operational-only data
  Supply of missing field values
  Data integrity checks
  Data consistency and range checks

Data loading (a minimal end-to-end sketch of these steps follows this list):
  Initial and incremental loading
  Updating of metadata
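The sketch below (Python, not from the original deck) walks through extraction, transformation and loading in miniature: it reads a pretend source extract, derives an amount, stamps a load date and writes the rows into an in-memory SQLite target. The file layout, derived column and table name are assumptions for illustration.

```python
import csv
import io
import sqlite3

# Extract: read a pretend OLTP export, applying a selection criterion.
source = io.StringIO("orderId,qty,price,region\no100,1,12.0,east\no102,2,5.5,west\n")
extracted = [r for r in csv.DictReader(source) if int(r["qty"]) > 0]

# Transform: change types, calculate a derived value, add a time attribute.
transformed = [{
    "orderId": r["orderId"],
    "qty": int(r["qty"]),
    "amount": int(r["qty"]) * float(r["price"]),   # derived value
    "region": r["region"],
    "load_date": "1997-07-01",                     # time attribute (assumed)
} for r in extracted]

# Load: write the rows into the target warehouse table.
dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE fact_sale (orderId TEXT, qty INTEGER, amount REAL, "
           "region TEXT, load_date TEXT)")
dw.executemany("INSERT INTO fact_sale VALUES "
               "(:orderId, :qty, :amount, :region, :load_date)", transformed)
print(dw.execute("SELECT COUNT(*), SUM(amount) FROM fact_sale").fetchone())
```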

Why ETL ?
Companies have valuable data lying around throughout their networks that needs to be moved from one place to another. The data lies in all sorts of heterogeneous systems, and therefore in all sorts of formats. To solve the problem, companies use extract, transform and load (ETL) software.


Major components involved in ETL Processing


Design manager: Lets developers define source-to-target mappings, transformations, process flows and jobs.
Metadata management: Provides a repository to define, document and manage information about the ETL design and runtime processes.
Extract: The process of reading data from a database.
Transform: The process of converting the extracted data.
Load: The process of writing the data into the target database.
Transport services: ETL tools use network and file protocols to move data between source and target systems, and in-memory protocols to move data between ETL run-time components.
Administration and operation: ETL utilities let administrators schedule, run and monitor ETL jobs, log all events, manage errors, recover from failures and reconcile outputs with source systems.

ETL Tools
  Provide a facility to specify a large number of transformation rules with a GUI
  Generate programs to transform data
  Handle multiple data sources
  Handle data redundancy
  Generate metadata as output
  Most tools exploit parallelism by running on multiple low-cost servers in a multi-threaded environment

Metadata Management


What Is Metadata?
Metadata is Information...

That describes the WHAT, WHEN, WHO, WHERE and HOW of the data warehouse
About the data being captured and loaded into the warehouse
Documented in IT tools that improve both business and technical understanding of data and data-related processes (a small catalogue sketch follows)
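As a concrete, simplified illustration (not from the original deck), warehouse metadata can be pictured as a small catalogue recording the what/where/how/when/who of each loaded table; the fields below are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TableMetadata:
    """One catalogue entry: the what / where / how / when / who of a load."""
    name: str              # what was loaded
    source_system: str     # where it came from
    transformation: str    # how it was derived
    loaded_at: datetime    # when it was loaded
    owner: str             # who is responsible

catalogue = [
    TableMetadata(
        name="fact_sale",
        source_system="orders OLTP database",
        transformation="joined to product and store, amounts summed by day",
        loaded_at=datetime(2009, 1, 5, 2, 30),
        owner="DW team",
    ),
]

for entry in catalogue:
    print(f"{entry.name}: from {entry.source_system}, loaded {entry.loaded_at:%Y-%m-%d}")
```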


Importance Of Metadata
Locating information
  How much time is spent looking for information?
  How often is information found?
  What poor decisions were made based on incomplete information?
  How much money was lost or earned as a result?

Interpreting information
  How many times have businesses needed to rework or recall products?
  What impact does it have on the bottom line?
  How many mistakes were due to misinterpretation of existing documentation?
  How much interpretation results from too much metadata?
  How much time is spent trying to determine if any of the metadata is accurate?

Integrating information
  How do the various data perspectives connect together?
  How much time is spent trying to figure that out?
  How much does the inefficiency and lack of metadata affect decision making?

Requirements for DW Metadata Management


Provide a simple catalogue of business metadata descriptions and views
Document/manage metadata descriptions from an integrated development environment
Enable DW users to identify and invoke pre-built queries against the data stores
Design and enhance new data models and schemas for the data warehouse
Capture data transformation rules between the operational and data warehousing databases
Provide change impact analysis and updates across these technologies

Consumers of Metadata
Technical users
  Warehouse administrator
  Application developer
Business users - business metadata
  Meanings
  Definitions
  Business rules
Software tools used in DW life-cycle development
  Metadata requirements for each tool must be identified
  The tool-specific metadata should be analysed for inclusion in the enterprise metadata repository
  Previously captured metadata should be electronically transferred from the enterprise metadata repository to each individual tool

Trends in the Metadata Management Tools


Third-Party Bridging Tools

Oracle Exchange
  Technology of choice for a long list of repository, enterprise and workgroup vendors
Reischmann-Informatik Toolbus
  Features include facilitation of selective bridging of metadata
Ardent Software / Dovetail Software - Interplay
  Hub-and-spoke solution for enabling metadata interoperability
  Ardent focussing on its own engagements, not selling it as an independent product
Informix's Metadata Plug-ins
  Available with Ardent DataStage version 3.6.2, free of cost, for Erwin, Oracle Designer, Sybase PowerDesigner, Brio and MicroStrategy

Trends in the Metadata Management Tools


Metadata Repositories
  IBM, Oracle and Microsoft to offer free or near-free basic repository services
  Enable organisations to reuse metadata across technologies
  Integrate DB design, data transformation and BI tools from different vendors
  Multi-tool vendors taking a bridged or federated rather than integrated approach to sharing metadata
  Both IBM and Oracle have multiple repositories for different lines of products, e.g. one for AD and one for DW, with bridges between them

Trends in the Metadata Management Tools


Metadata Interchange Standards

CDIF (CASE Data Interchange Format)
  Most frequently used interchange standard
  Addresses only a limited subset of metadata artifacts
OMG (Object Management Group) - CWM
  XML addresses context and data meaning, not presentation
  Can enable exchange over the web employing industry standards for storing and sharing programming data
  Will allow sharing of UML and MOF objects between various development tools and repositories
MDC (Metadata Coalition)
  Based on XML/UML standards
  Promoted by Microsoft along with 20 partners including the Object Management Group (OMG), Oracle Carleton Group, CA-PLATINUM Technology (founding member) and Viasoft

OLAP


Agenda
OLAP definition
Distinction between OLTP and OLAP
MDDB concepts
Implementation techniques / architectures
Features
Representative tools

OLAP: On-Line Analytical Processing


OLAP can be defined as a technology which allows users to view aggregate data across measurements (like maturity amount, interest rate, etc.) along with a set of related parameters called dimensions (like product, organization, customer, etc.)
Used interchangeably with BI
A multidimensional view of data is the foundation of OLAP
Users: analysts, decision makers

Distinction between OLTP and OLAP


Source of data
  OLTP: Operational data; OLTPs are the original source of the data
  OLAP: Consolidation data; OLAP data comes from the various OLTP databases

Purpose of data
  OLTP: To control and run fundamental business tasks
  OLAP: Decision support

What the data reveals
  OLTP: A snapshot of ongoing business processes
  OLAP: Multi-dimensional views of various kinds of business activities

Inserts and updates
  OLTP: Short and fast inserts and updates initiated by end users
  OLAP: Periodic long-running batch jobs refresh the data

MDDB Concepts
A multidimensional database is a computer software system designed to allow for efficient and convenient storage and retrieval of data that is intimately related and is stored, viewed and analyzed from different perspectives (dimensions).
A hypercube represents a collection of multidimensional data.
The edges of the cube are called dimensions.
Individual items within each dimension are called members (a small sketch follows).
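A hypercube can be pictured as a mapping from one member per dimension to a measure. The Python sketch below (not from the original deck) builds a tiny model x color x dealer cube with plain dictionaries; the member names anticipate the car-sales example used later, and the sales figures are invented.

```python
from itertools import product

# A 3-dimensional cube: one member from each dimension addresses one cell.
dimensions = {
    "model": ["Mini Van", "Coupe", "Sedan"],
    "color": ["Blue", "Red", "White"],
    "dealer": ["Clyde", "Gleason", "Carr"],
}

# Only the cells we actually have data for; (model, color, dealer) -> volume.
cube = {
    ("Mini Van", "Blue", "Clyde"): 6,
    ("Mini Van", "Red", "Carr"): 1,
    ("Coupe", "White", "Gleason"): 3,
}

all_cells = list(product(*dimensions.values()))
print(len(all_cells))                               # 27 cells in a 3 x 3 x 3 cube
print(cube.get(("Mini Van", "Blue", "Clyde"), 0))   # reading one cell
```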


RDBMS v/s MDDB: Increased Complexity...


Relational DBMS: representing the sales volumes requires one row per MODEL (MINI VAN, SPORTS COUPE, SEDAN) x COLOR (BLUE, RED, WHITE) x DEALER (Clyde, Gleason, Carr) combination plus a VOL. column, i.e. 27 rows x 4 columns = 108 cells.

MDDB: the same sales volumes fit into a single cube whose edges are MODEL, COLOR and DEALERSHIP, i.e. 3 x 3 x 3 = 27 cells.

Benefits of MDDB over RDBMS


Ease of data presentation and navigation
  A great deal of information is gleaned immediately upon direct inspection of the array. The user is able to view data along presorted dimensions, with data arranged in an inherently more organized and accessible fashion than the one offered by a relational table.
Storage space
  Very low space consumption compared to a relational DB.
Performance
  Gives much better performance. A relational DB may give comparable results only through database tuning (indexing, keys, etc.), which may not be possible for ad-hoc queries.
Ease of maintenance
  No overhead, as data is stored in the same way it is viewed. In a relational DB, indexes, sophisticated joins, etc. are used, which require considerable storage and maintenance.

Issues with MDDB

Sparsity (a small sketch quantifying sparsity follows this list)
  Input data in applications is typically sparse
  Sparsity increases with increased dimensions

Data explosion
  Due to sparsity
  Due to summarization

Performance
  Doesn't perform better than an RDBMS at high data volumes (>20-30 GB)
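For illustration (not from the original deck), sparsity and the resulting data explosion can be quantified as the share of empty cells: cell counts grow multiplicatively with each added dimension while the number of real facts grows far more slowly. The dimension sizes and fact count below are assumptions.

```python
dimensions = {
    "model": ["Mini Van", "Coupe", "Sedan"],
    "color": ["Blue", "Red", "White"],
    "dealer": ["Clyde", "Gleason", "Carr"],
    "month": [f"1997-{m:02d}" for m in range(1, 13)],   # adding a fourth dimension
}
populated_cells = 40                                     # assumed number of real facts

total_cells = 1
for members in dimensions.values():
    total_cells *= len(members)      # the cube grows multiplicatively: 3*3*3*12

sparsity = 1 - populated_cells / total_cells
print(total_cells)                   # 324 cells must be represented
print(f"{sparsity:.1%} of the cube is empty")
```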


Issues with MDDB - Sparsity Example

If members of different dimensions do not interact, a blank cell is left behind. For example, building a LAST NAME x EMPLOYEE # cube (with AGE as the measure) from a source table with LAST NAME, EMP# and AGE columns (Smith, Regan, Fox, Weld, Kelly, Link, Kranz, Lucas, Weiss) populates only one cell per employee, since each last name pairs with only one employee number; every other cell in the cube stays empty.

[Figure: the employee/age source table alongside the mostly empty LAST NAME x EMPLOYEE # cube.]

OLAP Features
Calculations applied across dimensions, through hierarchies and/or across members
Trend analysis over sequential time periods and what-if scenarios
Slicing / dicing subsets for on-screen viewing (a small slicing and rotation sketch follows this list)
Rotation to new dimensional comparisons in the viewing area
Drill-down / up along the hierarchy
Reach-through / drill-through to underlying detail data
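To illustrate slicing and rotation in code (not from the original deck), the Python sketch below filters a small cube to one dealer (a slice) and then pivots the result so colors become rows and models become columns (a rotation of the view). The sales figures are invented.

```python
# (model, color, dealer) -> sales volume; figures are illustrative only.
cube = {
    ("Mini Van", "Blue", "Clyde"): 6, ("Mini Van", "Red", "Clyde"): 5,
    ("Coupe", "Blue", "Clyde"): 3,    ("Coupe", "White", "Carr"): 2,
    ("Sedan", "White", "Gleason"): 3,
}

# Slice: fix one member of the dealer dimension.
clyde_slice = {(model, color): vol
               for (model, color, dealer), vol in cube.items() if dealer == "Clyde"}
print(clyde_slice)      # model-by-color view for the Clyde dealership

# Rotate: the same slice viewed color-by-model instead of model-by-color.
rotated = {(color, model): vol for (model, color), vol in clyde_slice.items()}
print(rotated)
```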


Features of OLAP - Rotation

Complex queries and sorts in a relational environment translate to a simple rotation of the cube.

[Figure: the model-by-color sales-volume grid (View #1) rotated 90 degrees into a color-by-model grid (View #2).]

A 2-dimensional array has 2 views.

Features of OLAP - Rotation


[Figure: the model x color x dealership sales-volume cube rotated 90 degrees at a time through its six possible orientations, View #1 to View #6.]

A 3-dimensional array has 6 views.

Features of OLAP - Slicing / Filtering


An MDDB allows the end user to quickly slice in on the exact view of the data required.

[Figure: the sales-volume cube sliced down to the Mini Van and Coupe models, the Carr and Clyde dealerships, and the Normal Blue and Metal Blue colors.]

Features of OLAP - Drill Down / Up

Organization dimension hierarchy:
  REGION: Midwest
  DISTRICT: Chicago, St. Louis, Gary
  DEALERSHIP: Clyde, Gleason, Carr, Levi, Lucas, Bolton

Sales can be viewed at the region, district or dealership level. Moving up and moving down in a hierarchy is referred to as drill-up / roll-up and drill-down (a small roll-up sketch follows).
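The roll-up sketch below (Python, not from the original deck) aggregates dealership-level sales to district and region level. The dealership-to-district mapping and the sales figures are invented for illustration; the slide itself does not state which dealership belongs to which district.

```python
from collections import defaultdict

# Dealership -> (region, district); the assignment to districts is assumed.
hierarchy = {
    "Clyde": ("Midwest", "Chicago"),   "Gleason": ("Midwest", "Chicago"),
    "Carr": ("Midwest", "Chicago"),    "Levi": ("Midwest", "St. Louis"),
    "Lucas": ("Midwest", "St. Louis"), "Bolton": ("Midwest", "Gary"),
}
sales_by_dealer = {"Clyde": 11, "Gleason": 9, "Carr": 7,
                   "Levi": 8, "Lucas": 6, "Bolton": 5}    # invented figures

def roll_up(level):
    """Aggregate dealership-level sales up to 'district' or 'region'."""
    totals = defaultdict(int)
    for dealer, amount in sales_by_dealer.items():
        region, district = hierarchy[dealer]
        totals[district if level == "district" else region] += amount
    return dict(totals)

print(sales_by_dealer)        # most detailed level (drill-down target)
print(roll_up("district"))    # rolled up one level
print(roll_up("region"))      # rolled up to the top of the hierarchy
```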


OLAP Reporting - Drill Down

Inflows (Region, Year)

[Chart: inflows ($M) for the East, West and Central regions, for years 1999 and 2000.]

OLAP Reporting - Drill Down

Inflows (Region, Year - Year 1999)

[Chart: inflows ($M) for the East, West and Central regions across the four quarters of 1999.]

Drill-down from Year to Quarter

OLAP Reporting - Drill Down

Inflows (Region, Year - Year 1999 - 1st Qtr)

[Chart: inflows ($M) for the East, West and Central regions for January, February and March 1999.]

Drill-down from Quarter to Month

Implementation Techniques -OLAP Architectures

MOLAP - Multidimensional OLAP
  Multidimensional databases for the database and application logic layer

ROLAP - Relational OLAP
  Accesses data stored in a relational data warehouse for OLAP analysis; database and application logic are provided as separate layers

HOLAP - Hybrid OLAP
  The OLAP server routes queries first to the MDDB, then to the RDBMS, and the result is processed on the fly in the server

DOLAP - Desktop OLAP
  Personal MDDB server and application on the desktop

MOLAP - MDDB storage

[Diagram: MOLAP storage - an OLAP cube served by an OLAP calculation engine, accessed from web browsers, OLAP tools and OLAP applications.]

MOLAP - Features

Powerful analytical capabilities (e.g. financial, forecasting, statistical)
Aggregation and calculation capabilities
Read/write analytic applications
Specialized data structures for:
  Maximum query performance
  Optimum space utilization

ROLAP - Standard SQL storage

[Diagram: ROLAP storage - an MDDB-to-relational mapping lets the OLAP calculation engine translate requests into SQL against the relational data warehouse, accessed from web browsers, OLAP tools and OLAP applications.]

ROLAP - Features

Three-tier hardware/software architecture:
  GUI on the client; multidimensional processing on a mid-tier server; target database on a database server
  Processing split between the mid-tier and database servers

Ad hoc query capabilities against very large databases
DW integration
Data scalability

(A sketch of how a multidimensional request is translated into SQL follows.)
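In a ROLAP setup the multidimensional request is translated into SQL against the relational warehouse. The Python sketch below (not from the original deck) builds such a statement for a requested set of dimensions and a measure; the table and column naming convention is an assumption made for the example.

```python
def build_rolap_sql(dimensions, measure, fact="fact_sale"):
    """Translate a simple multidimensional request into a GROUP BY statement.

    Assumes each dimension table is named after the dimension and is joined to
    the fact table on "<dimension>Id", e.g. store.storeId = fact_sale.storeId.
    """
    select_cols = ", ".join(f"{d}.name AS {d}" for d in dimensions)
    joins = " ".join(f"JOIN {d} ON {fact}.{d}Id = {d}.{d}Id" for d in dimensions)
    group_by = ", ".join(f"{d}.name" for d in dimensions)
    return (f"SELECT {select_cols}, SUM({fact}.{measure}) AS total "
            f"FROM {fact} {joins} GROUP BY {group_by}")

# Request: total amount by store and product.
print(build_rolap_sql(["store", "product"], "amt"))
```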


HOLAP - Combination of RDBMS and MDDB


[Diagram: HOLAP storage - the OLAP calculation engine combines an OLAP cube with SQL access to the relational data warehouse; any client, web browsers, OLAP tools and OLAP applications connect through it.]

HOLAP - Features

RDBMS used for detailed data stored in large databases
MDDB used for fast, read/write OLAP analysis and calculations
Scalability of the RDBMS combined with MDDB performance
Calculation engine provides full analysis features
Source of the data is transparent to the end user

Architecture Comparison

Definition
  MOLAP: MDDB OLAP = transaction-level data + summary in the MDDB
  ROLAP: Relational OLAP = transaction-level data + summary in the RDBMS
  HOLAP: Hybrid OLAP = ROLAP + summary in the MDDB

Data explosion due to sparsity
  MOLAP: High (may go beyond control; estimation is very important)
  ROLAP: No sparsity
  HOLAP: Sparsity exists only in the MDDB part

Data explosion due to summarization
  MOLAP: With good design, 3-10 times
  ROLAP: To the necessary extent
  HOLAP: To the necessary extent

Query execution speed
  MOLAP: Fast (depends upon the size of the MDDB)
  ROLAP: Slow
  HOLAP: Optimum; if the data is fetched from the RDBMS it behaves like ROLAP, otherwise like MOLAP

Cost
  MOLAP: Medium (MDDB server + large disk space)
  ROLAP: Low (only RDBMS + disk space)
  HOLAP: High (RDBMS + disk space + MDDB server)

Where to apply
  MOLAP: Small transactional data + complex model, where data needs to be viewed/sorted
  ROLAP: Very large transactional data + frequent summary analysis
  HOLAP: Large transactional data + frequent summary analysis

Representative OLAP Tools:

Oracle Express products
Hyperion Essbase
Cognos PowerPlay
Seagate Holos
SAS
MicroStrategy DSS Agent
Informix MetaCube
Brio Query
Business Objects / Web Intelligence

Sample OLAP Applications

Sales analysis
Financial analysis
Profitability analysis
Performance analysis
Risk management
Profiling & segmentation
Scorecard applications
NPA management
Strategic planning
Customer Relationship Management (CRM)

Data Warehouse Testing


Data Warehouse Testing Overview


There is an exponentially increasing cost associated with finding software defects later in the development lifecycle. In data warehousing this is compounded by the additional business cost of using incorrect data to make critical business decisions.

The methodology required for testing a data warehouse is different from that for testing a typical transaction system.

Difference In Testing Data warehouse and Transaction System


Data warehouse testing is different on the following counts:
  User-triggered vs. system-triggered
  Volume of test data
  Possible scenarios / test cases
  Programming for testing challenge

Difference In Testing Data warehouse and Transaction System.


User-triggered vs. system-triggered
  In a data warehouse, most of the testing is system-triggered.
  Most production/source-system testing is the processing of individual transactions, which are driven by some input from the users (application forms, servicing requests).
  There are very few test cycles that cover the system-triggered scenarios (like billing and valuation).

Difference In Testing Data warehouse and Transaction System


Volume of test data
  The test data in a transaction system is a very small sample of the overall production data.
  A data warehouse typically has a large volume of test data, as one tries to cover the maximum possible combinations of dimensions and facts.

Possible scenarios / test cases
  In the case of a data warehouse, the permutations and combinations one can possibly test are virtually unlimited, because the core objective of a data warehouse is to allow all possible views of the data.

Difference In Testing Data warehouse and Transaction System


Programming for testing challenge
  In transaction systems, users/business analysts typically test the output of the system.
  In a data warehouse, most of the data-quality testing and ETL testing is done at the back end by running separate stand-alone scripts. These scripts compare pre-transformation data to post-transformation data (a minimal example follows).
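The stand-alone back-end check described above can be as small as a reconciliation of row counts, key sets and measure totals between the pre-transformation and post-transformation data sets. The Python sketch below (not from the original deck) shows the idea; the column names are assumptions.

```python
# Minimal pre- vs post-transformation reconciliation, the kind of stand-alone
# script a back-end ETL test would run.
source_rows = [{"order_id": "o100", "amt": 12.0},
               {"order_id": "o102", "amt": 11.0},
               {"order_id": "o105", "amt": 50.0}]
loaded_rows = [{"order_id": "o100", "amt": 12.0},
               {"order_id": "o102", "amt": 11.0},
               {"order_id": "o105", "amt": 50.0}]

def reconcile(source, target, key="order_id", measure="amt"):
    """Compare row counts, key sets and measure totals between source and target."""
    return {
        "row_count_matches": len(source) == len(target),
        "key_sets_match": {r[key] for r in source} == {r[key] for r in target},
        "sum_matches": abs(sum(r[measure] for r in source)
                           - sum(r[measure] for r in target)) < 1e-6,
    }

checks = reconcile(source_rows, loaded_rows)
assert all(checks.values()), f"ETL reconciliation failed: {checks}"
print(checks)
```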


Data Warehouse Testing Process


Data warehouse testing is basically divided into two parts:
  'Back-end' testing, where the source-system data is compared to the end-result data in the loaded area
  'Front-end' testing, where the user checks the data by comparing their MIS with the data displayed by the end-user tools, like OLAP

The testing phases consist of:
  Requirements testing
  Unit testing
  Integration testing
  Performance testing
  Acceptance testing

Requirements testing
The main aim of requirements testing is to check the stated requirements for completeness. Requirements can be tested on the following factors:
  Are the requirements complete?
  Are the requirements singular?
  Are the requirements unambiguous?
  Are the requirements developable?
  Are the requirements testable?

Unit Testing
Unit testing for data warehouses is white-box testing. It should check the ETL procedures/mappings/jobs and the reports developed.

Unit testing the ETL procedures:
  Whether the ETLs are accessing and picking up the right data from the right source
  Whether all data transformations are correct according to the business rules, and the data warehouse is correctly populated with the transformed data
  Testing the rejected records that don't fulfil the transformation rules

Unit Testing
Unit Testing the Report data:

Verify report data with the source: data in a data warehouse is stored at an aggregate level compared to the source systems. The QA team should verify the granular data stored in the data warehouse against the available source data.
Field-level data verification: the QA team must understand the linkages for the fields displayed in the report and should trace them back and compare them with the source systems.
Derivation formulae / calculation rules should be verified.

Integration Testing
Integration testing will involve the following:
  Sequence of ETL jobs in a batch
  Initial loading of records into the data warehouse
  Incremental loading of records at a later date to verify the newly inserted or updated data
  Testing the rejected records that don't fulfil the transformation rules
  Error log generation

Performance Testing
Performance testing should check for:
  ETL processes completing within the time window
  Monitoring and measuring of data quality issues
  Refresh times for standard/complex reports

Acceptance testing
Here the system is tested with full functionality and is expected to function as in production. At the end of UAT, the system should be acceptable to the client for use in terms of ETL process integrity and business functionality and reporting.


Questions


Thank You


Data Warehouse Concepts

Avinash Kanumuru Diya Jana Debyajit Majumder

2009 Wipro Ltd - Confidential

Content
1 An Overview of Data Warehouse 2 Data Warehouse Architecture 3 Data Modeling for Data Warehouse 4 Overview of Data Cleansing

5 Data Extraction, Transformation, Load

80

2009 Wipro Ltd - Confidential

Content [contd]
6 Metadata Management 7 OLAP 8 Data Warehouse Testing

81

2009 Wipro Ltd - Confidential

An Overview
Understanding What is a Data Warehouse

82

2009 Wipro Ltd - Confidential

What is Data Warehouse?


Definitions of Data Warehouse A data warehouse is a subject-oriented, integrated, nonvolatile, time-variant collection of data in support of management's decisions. WH Inmon Data Warehouse is a repository of data summarized or aggregated in simplified form from operational systems. End user orientated data access and reporting tools let user get at the data for decision support Babcock A data warehouse is a relational database a copy of transaction data specifically structured for query and analysis Ralph Kimball In simple: Data warehousing is collection of data from different systems, which helps in Business Decisions, Analysis and Reporting.

83

2009 Wipro Ltd - Confidential

Data Warehouse def. by WH Inmon


A common way of introducing data warehousing is to refer to the characteristics of a data warehouse as set forth by William Inmon: Subject Oriented Data that gives information about a particular subject instead of about a company's ongoing operations. Integrated Data that is gathered into the data warehouse from a variety of sources and merged into a coherent whole. Nonvolatile Data is stable in a data warehouse. More data is added but data is never removed. This enables management to gain a consistent picture of the business. Time Variant In order to discover trends in business, analysts need large amounts of data. This is very much in contrast to online transaction processing (OLTP) systems, where performance requirements demand that historical data be moved to an archive. All data in the data warehouse is identified with a particular time period.

84

2009 Wipro Ltd - Confidential

Data Warehouse Architecture


What makes a Data Warehouse

85

2009 Wipro Ltd - Confidential

Data Warehouse Concepts

Avinash Kanumuru Diya Jana Debyajit Majumder

2009 Wipro Ltd - Confidential

Content
1 An Overview of Data Warehouse 2 Data Warehouse Architecture 3 Data Modeling for Data Warehouse 4 Overview of Data Cleansing

5 Data Extraction, Transformation, Load

87

2009 Wipro Ltd - Confidential

Content [contd]
6 Metadata Management 7 OLAP 8 Data Warehouse Testing

88

2009 Wipro Ltd - Confidential

An Overview
Understanding What is a Data Warehouse

89

2009 Wipro Ltd - Confidential

What is Data Warehouse?


Definitions of Data Warehouse A data warehouse is a subject-oriented, integrated, nonvolatile, time-variant collection of data in support of management's decisions. WH Inmon Data Warehouse is a repository of data summarized or aggregated in simplified form from operational systems. End user orientated data access and reporting tools let user get at the data for decision support Babcock A data warehouse is a relational database a copy of transaction data specifically structured for query and analysis Ralph Kimball In simple: Data warehousing is collection of data from different systems, which helps in Business Decisions, Analysis and Reporting.

90

2009 Wipro Ltd - Confidential

Data Warehouse def. by WH Inmon


A common way of introducing data warehousing is to refer to the characteristics of a data warehouse as set forth by William Inmon: Subject Oriented Data that gives information about a particular subject instead of about a company's ongoing operations. Integrated Data that is gathered into the data warehouse from a variety of sources and merged into a coherent whole. Nonvolatile Data is stable in a data warehouse. More data is added but data is never removed. This enables management to gain a consistent picture of the business. Time Variant In order to discover trends in business, analysts need large amounts of data. This is very much in contrast to online transaction processing (OLTP) systems, where performance requirements demand that historical data be moved to an archive. All data in the data warehouse is identified with a particular time period.

91

2009 Wipro Ltd - Confidential

Data Warehouse Architecture


What makes a Data Warehouse

92

2009 Wipro Ltd - Confidential

Data Warehouse Concepts

Avinash Kanumuru Diya Jana Debyajit Majumder

2009 Wipro Ltd - Confidential

Content
1 An Overview of Data Warehouse 2 Data Warehouse Architecture 3 Data Modeling for Data Warehouse 4 Overview of Data Cleansing

5 Data Extraction, Transformation, Load

94

2009 Wipro Ltd - Confidential

Content [contd]
6 Metadata Management 7 OLAP 8 Data Warehouse Testing

95

2009 Wipro Ltd - Confidential

An Overview
Understanding What is a Data Warehouse

96

2009 Wipro Ltd - Confidential

What is Data Warehouse?


Definitions of Data Warehouse A data warehouse is a subject-oriented, integrated, nonvolatile, time-variant collection of data in support of management's decisions. WH Inmon Data Warehouse is a repository of data summarized or aggregated in simplified form from operational systems. End user orientated data access and reporting tools let user get at the data for decision support Babcock A data warehouse is a relational database a copy of transaction data specifically structured for query and analysis Ralph Kimball In simple: Data warehousing is collection of data from different systems, which helps in Business Decisions, Analysis and Reporting.

97

2009 Wipro Ltd - Confidential

Data Warehouse def. by WH Inmon


A common way of introducing data warehousing is to refer to the characteristics of a data warehouse as set forth by William Inmon: Subject Oriented Data that gives information about a particular subject instead of about a company's ongoing operations. Integrated Data that is gathered into the data warehouse from a variety of sources and merged into a coherent whole. Nonvolatile Data is stable in a data warehouse. More data is added but data is never removed. This enables management to gain a consistent picture of the business. Time Variant In order to discover trends in business, analysts need large amounts of data. This is very much in contrast to online transaction processing (OLTP) systems, where performance requirements demand that historical data be moved to an archive. All data in the data warehouse is identified with a particular time period.

98

2009 Wipro Ltd - Confidential

Data Warehouse Architecture


What makes a Data Warehouse

99

2009 Wipro Ltd - Confidential

Components of Warehouse
Source Tables: These are real-time, volatile data in relational databases for transaction processing (OLTP). These can be any relational databases or flat files. ETL Tools: To extract, cleansing, transform (aggregates, joins) and load the data from sources to target. Maintenance and Administration Tools: To authorize and monitor access to the data, set-up users. Scheduling jobs to run on offshore periods. Modeling Tools: Used for data warehouse design for high-performance using dimensional data modeling technique, mapping the source and target files. Databases: Target databases and data marts, which are part of data warehouse. These are structured for analysis and reporting purposes. End-user tools for analysis and reporting: get the reports and analyze the data from target tables. Different types of Querying, Data Mining, OLAP tools are used for this purpose.

100

2009 Wipro Ltd - Confidential

Data Warehouse Architecture


This is a basic design, where there are source files, which are loaded to a warehouse and users query the data for different purposes.

This has a staging area, where the data after cleansing, transforming is loaded and tested here. Later is directly loaded to the target database/warehouse. Which is divided to data marts and can be accessed by different users for their reporting and analyzing purposes.

101

2009 Wipro Ltd - Confidential

Data Modeling
Effective way of using a Data Warehouse

102

2009 Wipro Ltd - Confidential

Data Modeling Commonly E-R Data Model is used in OLTP, In OLAP Dimensional Data Model is used commonly. E-R (Entity-Relationship) Data Model
Entity: Object that can be observed and classified based on its properties and characteristics. Like employee, book, student Relationship: relating entities to other entities.

Different Perceptive of Data Modeling.


o Conceptual Data Model o Logical Data Model o Physical Data Model
103

2009 Wipro Ltd - Confidential

Terms used in Dimensional Data Model


To understand dimensional data modeling, let's define some of the terms commonly used in this type of modeling: Dimension: A category of information. For example, the time dimension. Attribute: A unique level within a dimension. For example, Month is an attribute in the Time Dimension. Hierarchy: The specification of levels that represents relationship between different attributes within a dimension. For example, one possible hierarchy in the Time dimension is Year Quarter Month Day. Fact Table: A table that contains the measures of interest. Lookup Table: It provides the detailed information about the attributes. For example, the lookup table for the Quarter attribute would include a list of all of the quarters available in the data warehouse. Surrogate Keys: To avoid the data integrity, surrogate keys are used. They are helpful for Slow Changing Dimensions and act as index/primary keys.
A dimensional model includes fact tables and lookup tables. Fact tables connect to one or more lookup tables, but fact tables do not have direct relationships to one another. Dimensions and hierarchies are represented by lookup tables. Attributes are the non-key 2009 Wipro Ltd - Confidential columns in the lookup tables.

104

Star Schema
Dimension Table
product prodId p1 p2 name price bolt 10 nut 5

Dimension Table
store storeId c1 c2 c3 city nyc sfo la

Fact Table
sale oderId date o100 1/7/97 o102 2/7/97 105 3/8/97 custId 53 53 111 prodId p1 p2 p1 storeId c1 c1 c3 qty 1 2 5 amt 12 11 50

Dimension Table
customer custId 53 81 111 name joe fred sally address 10 main 12 main 80 willow city sfo sfo la

105

2009 Wipro Ltd - Confidential

Snowflake Schema
Dimension Table Fact Table
store storeId s5 s7 s9 cityId sfo sfo la tId t1 t2 t1 mgr joe fred nancy
sType tId t1 t2 city size small large location downtown suburbs regId north south

Dimension Table
cityId pop sfo 1M la 5M

The star and snowflake schema are most commonly found in dimensional data warehouses and data marts where speed of data retrieval is more important than the efficiency of data manipulations. As such, the tables in these schema are not normalized much, and are frequently designed at a level of normalization short of third normal form.

region regId name north cold region south warm region

106

2009 Wipro Ltd - Confidential

Overview of Data Cleansing

107

2009 Wipro Ltd - Confidential

The Need For Data Quality Difficulty in decision making Time delays in operation Organizational mistrust Data ownership conflicts Customer attrition Costs associated with

108

error detection error rework customer service fixing customer problems


2009 Wipro Ltd - Confidential

Six Steps To Data Quality


Understand Information Flow In Organization
Identify authoritative data sources Interview Employees & Customers

Identify Potential Problem Areas & Asses Impact

Data Entry Points


Cost of bad data

Measure Quality Of Data

Use business rule discovery tools to identify data with inconsistent,

missing, incomplete, duplicate or incorrect values


Use data cleansing tools to clean data at the source Load only clean data into the data warehouse

Clean & Load Data

Continuous Monitoring

Schedule Periodic Cleansing of Source Data

Identify Areas of Improvement

Identify & Correct Cause of Defects Refine data capture mechanisms at source Educate users on importance of DQ
2009 Wipro Ltd - Confidential

109

Data Quality Solution Customized Programs Strengths:


Addresses specific needs No bulky one time investment

Limitations
Tons of Custom programs in different environments are difficult to manage Minor alterations demand coding efforts

Data Quality Assessment tools Strength


Provide automated assessment

Limitation
110
2009 Wipro Ltd - Confidential

Data Quality Solution


Business Rule Discovery tools Strengths
Detect Correlation in data values Can detect Patterns of behavior that indicate fraud

Limitations
Not all variables can be discovered Some discovered rules might not be pertinent There may be performance problems with large files or with many fields.

Data Reengineering & Cleansing tools Strengths


Usually are integrated packages with cleansing features as Add-on
111
2009 Wipro Ltd - Confidential

Tools In The Market Business Rule Discovery Tools


Integrity Data Reengineering Tool from Vality Technology Trillium Software System from Harte -Hanks Data Technologies Migration Architect from DB Star

Data Reengineering & Cleansing Tools


Carlton Pureview from Oracle ETI-Extract from Evolutionary Technologies PowerMart from Informatica Corp Sagent Data Mart from Sagent Technology

Data Quality Assessment Tools


Migration Architect, Evoke Axio from Evoke Software Wizrule from Wizsoft

Name & Address Cleansing Tools


112

Centrus Suite from Sagent I.d.centric from First Logic


2009 Wipro Ltd - Confidential

Data Extraction, Transformation, Load

113

2009 Wipro Ltd - Confidential

ETL Architecture

Visitors

Web Browsers

The Internet

External Data Demographics, Household, Webographics, Income

Staging Area
Web Server Logs & E-comm Transaction Data Flat Files

Meta Data Repository

Scheduled Extraction

RDBMS

Clean Transform Match Merge

Scheduled Loading

Enterprise Data Warehouse

Other OLTP Systems

Data Collection

Data Extraction

Data Transformation

Data Loading

Data Storage & Integration

114

2009 Wipro Ltd - Confidential

ETL Architecture Data Extraction:


Rummages through a file or database Uses some criteria for selection Identifies qualified data and Transports the data over onto another file or database

Data transformation
Integrating dissimilar data types Changing codes Adding a time attribute Summarizing data Calculating derived values Renormalizing data

Data Extraction Cleanup

Data loading
Initial and incremental loading Updation of metadata

115

Restructuring of records or fields Removal of Operational-only data Supply of missing field values Data Integrity checks Data Consistency and Range checks, 2009 Wipro Ltd - Confidential

Why ETL ?
Companies have valuable data lying around throughout their networks that needs to be moved from one place to another. The data lies in all sorts of heterogeneous systems,and therefore in all sorts of formats. To solve the problem, companies use extract, transform and load (ETL) software.
116
2009 Wipro Ltd - Confidential

Major components involved in ETL Processing

117

2009 Wipro Ltd - Confidential

Major components involved in ETL Processing


Design manager Lets developers define source-to-target mappings, transformations, process flows, and jobs Meta data management Provides a repository to define, document, and manage information about the ETL design and runtime processes Extract The process of reading data from a database. Transform The process of converting the extracted data Load The process of writing the data into the target database. Transport services ETL tools use network and file protocols to move data between source and target systems and in-memory protocols to move data between ETL run-time components. Administration and operation ETL utilities let administrators schedule, run, monitor ETL jobs, log all events, manage errors, recover from failures, reconcile outputs with source systems
2009 Wipro Ltd - Confidential

118

ETL Tools Provides facility to specify a large number of transformation rules with a GUI Generate programs to transform data Handle multiple data sources Handle data redundancy Generate metadata as output Most tools exploit parallelism by running on multiple low-cost servers in multi-threaded environment
119
2009 Wipro Ltd - Confidential

Metadata Management

120

2009 Wipro Ltd - Confidential

What Is Metadata?
Metadata is Information...

That describes the WHAT, WHEN, WHO, WHERE, HOW of the data warehouse About the data being captured and loaded into the Warehouse Documented in IT tools that improves both business and technical understanding of data and data-related processes

121

2009 Wipro Ltd - Confidential

Importance Of Metadata
Locating Information Time spent in looking for information. How often information is found? What poor decisions were made based on the incomplete information?

How much money was lost or earned as a result? Interpreting information


How many times have businesses needed to rework or recall products? What impact does it have on the bottom line ? How many mistakes were due to misinterpretation of existing documentation? How much interpretation results form too much metadata? How much time is spent trying to determine if any of the metadata is accurate? Integrating information How various data perspectives connect together? How much time is spent trying to figure out that? How much does the inefficiency and lack of metadata affect decision making

122

2009 Wipro Ltd - Confidential

Requirements for DW Metadata Management


Provide a simple catalogue of business metadata descriptions and views Document/manage metadata descriptions from an integrated development environment Enable DW users to identify and invoke pre-built queries against the data stores Design and enhance new data models and schemas for the data warehouse Capture data transformation rules between the operational and data warehousing databases Provide change impact analysis, and update across these technologies
123
2009 Wipro Ltd - Confidential

Consumers of Metadata
Technical Users Warehouse administrator Application developer Business Users -Business metadata Meanings Definitions Business Rules Software Tools Used in DW life-cycle development Metadata requirements for each tool must be identified The tool-specific metadata should be analysed for inclusion in the enterprise metadata repository Previously captured metadata should be electronically transferred from the enterprise metadata repository to each individual tool

124

2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Third Party Bridging Tools Oracle Exchange
Technology of choice for a long list of repository, enterprise and workgroup vendors

Reischmann-Informatik-Toolbus
Features include facilitation of selective bridging of metadata

Ardent Software/ Dovetail Software -Interplay


Hub and Spoke solution for enabling metadata interoperability Ardent focussing on own engagements, not selling it as independent product

Informix's Metadata Plug-ins


Available with Ardent Datastage version 3.6.2 free of cost for Erwin, Oracle Designer, Sybase Powerdesigner, Brio, Microstrategy
125
2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Metadata Repositories IBM, Oracle and Microsoft to offer free or near-free basic repository services Enable organisations to reuse metadata across technologies Integrate DB design, data transformation and BI tools from different vendors Multi-tool vendors taking a bridged or federated rather than integrated approach to sharing metadata Both IBM and Oracle have multiple repositories for different lines of products e.g., One for AD and one for DW, with bridges between them

126

2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Metadata Interchange Standards CDIF (CASE Data Interchange Format)
Most frequently used interchange standard Addresses only a limited subset of metadata artifacts

OMG (Object Management Group)-CWM


XML-addresses context and data meaning, not presentation Can enable exchange over the web employing industry standards for storing and sharing programming data Will allow sharing of UML and MOF objects b/w various development tools and repositories

MDC (Metadata Coalition)


Based on XML/UML standards Promoted by Microsoft Along With 20 partners including Object Management Group (OMG), Oracle Carleton Group, CA-PLATINUM Technology (Founding Member), Viasoft
127
2009 Wipro Ltd - Confidential

OLAP

128

2009 Wipro Ltd - Confidential

Agenda
OLAP Definition Distinction between OLTP and OLAP

MDDB Concepts
Implementation Techniques Architectures

Features
Representative Tools

12/20/2012

129

129

2009 Wipro Ltd - Confidential

OLAP: On-Line Analytical Processing


OLAP can be defined as a technology which allows the users to view the aggregate data across measurements (like Maturity Amount, Interest Rate etc.) along with a set of related parameters called dimensions (like Product, Organization, Customer, etc.) Used interchangeably with BI Multidimensional view of data is the foundation of OLAP Users :Analysts, Decision makers
12/20/2012 130

130

2009 Wipro Ltd - Confidential

Distinction between OLTP and OLAP


OLTP System Source of data Operational data; OLTPs are the original source of the data To control and run fundamental business tasks A snapshot of ongoing business processes Short and fast inserts and updates initiated by end users
2009 Wipro Ltd - Confidential

OLAP System Consolidation data; OLAP data comes from the various OLTP databases Decision support

Purpose of data

What the data reveals Inserts and Updates


12/20/2012

Multi-dimensional views of various kinds of business activities Periodic long-running batch jobs refresh the 131 data

131

MDDB Concepts
A multidimensional database is a computer software system designed to allow for efficient and convenient storage and retrieval of data that is intimately related and stored, viewed and analyzed from different perspectives (Dimensions). A hypercube represents a collection of multidimensional data. The edges of the cube are called dimensions Individual items within each dimensions are called members

132

2009 Wipro Ltd - Confidential

RDBMS v/s MDDB: Increased Complexity...


Relational DBMS
MODEL MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SEDAN SEDAN SEDAN ... COLOR BLUE BLUE BLUE RED RED RED WHITE WHITE WHITE BLUE BLUE BLUE RED RED RED WHITE WHITE WHITE BLUE BLUE BLUE DEALER Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr VOL. 6 3 2 5 3 1 3 1 4 3 3 3 4 3 6 2 3 5 4 3 2 ...

MDDB

Sales Volumes

M O D E L

Mini Van

Coupe Carr Gleason Clyde Blue Red White

Sedan

DEALERSHIP

COLOR

27 x 4 = 108 cells
133

3 x 3 x 3 = 27 cells

2009 Wipro Ltd - Confidential

Benefits of MDDB over RDBMS


Ease of Data Presentation & Navigation A great deal of information is gleaned immediately upon direct inspection of the array User is able to view data along presorted dimensions with data arranged in an inherently more organized, and accessible fashion than the one offered by the relational table. Storage Space Very low Space Consumption compared to Relational DB Performance Gives much better performance. Relational DB may give comparable results only through database tuning (indexing, keys etc), which may not be possible for ad-hoc queries. Ease of Maintenance No overhead as data is stored in the same way it is viewed. In Relational DB, indexes, sophisticated joins etc. are used which require considerable storage and maintenance
12/20/2012
134
2009 Wipro Ltd - Confidential

134

Issues with MDDB

Sparsity
- Input data in applications are typically sparse -Increases with increased dimensions

Data Explosion
-Due to Sparsity -Due to Summarization

Performance
-Doesnt perform better than RDBMS at high data volumes (>20-30 GB)

12/20/2012
135
2009 Wipro Ltd - Confidential

135

Issues with MDDB - Sparsity Example If dimension members of different dimensions Employee Age do not interact , then blank cell is left behind. LAST NAME EMP# AGE
Smith

SMITH REGAN FOX WELD KELLY LINK KRANZ LUCUS WEISS

M O D E L

01 21 12 Sales Volumes 19 31 63 Miini Van 14 6 5 31 4 54 3 5 27 Coupe 5 03 56 4 3 2 Sedan 41 45 Blue Red White 33 COLOR 41 23 19

21

Regan

19 63 31 27 56 45 41 19
31 41 23 01 14 54 03 12 33

Fox

L A S T N A M E

Weld

Kelly

Link

Kranz

Lucas

Weiss

EMPLOYEE #

12/20/2012
136
2009 Wipro Ltd - Confidential

136

OLAP Features
Calculations applied across dimensions, through hierarchies and/or across members Trend analysis over sequential time periods, What-if scenarios. Slicing / Dicing subsets for on-screen viewing Rotation to new dimensional comparisons in the viewing area Drill-down/up along the hierarchy Reach-through / Drill-through to underlying detail data

12/20/2012
137
2009 Wipro Ltd - Confidential

137

Features of OLAP - Rotation

Complex queries and sorts in the relational environment translate to a simple rotation of the cube.

Example: a Sales Volumes array over MODEL and COLOR can be viewed as MODEL by COLOR (View #1) or rotated 90 degrees to COLOR by MODEL (View #2).

A 2-dimensional array has 2 views.


12/20/2012
138
2009 Wipro Ltd - Confidential

138

Features of OLAP - Rotation

Example: the Sales Volumes cube over MODEL, COLOR and DEALERSHIP can be rotated 90 degrees at a time to give View #1 through View #6 - every ordering of the three dimensions in the viewing area.

A 3-dimensional array has 6 views.


12/20/2012
139
2009 Wipro Ltd - Confidential

139
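As a rough illustration of why a 3-dimensional cube has 6 views, the sketch below (an invented toy, not from the deck) enumerates the axis orderings of a small NumPy cube; each permutation of (MODEL, COLOR, DEALERSHIP) corresponds to one rotated view:

```python
from itertools import permutations
import numpy as np

dims = ("MODEL", "COLOR", "DEALERSHIP")
cube = np.arange(27).reshape(3, 3, 3)   # toy sales-volume cube

# Each permutation of the three axes is one "rotated" view of the same data.
for view_no, order in enumerate(permutations(range(3)), start=1):
    view = np.transpose(cube, order)
    print(f"View #{view_no}: axes {[dims[i] for i in order]}, shape {view.shape}")

# A 3-dimensional array has 3! = 6 views; a 2-dimensional array has 2! = 2.
```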

Features of OLAP - Slicing / Filtering

MDDB allows the end user to quickly slice in on the exact view of the data required.

Example: from the full Sales Volumes cube (MODEL x COLOR x DEALERSHIP), a slice keeps only the Mini Van and Coupe models, the Normal Blue and Metal Blue colors, and the Carr and Clyde dealerships.
12/20/2012
140
2009 Wipro Ltd - Confidential

140
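A minimal sketch of such a slice, reusing the invented NumPy cube from the earlier example (again purely illustrative):

```python
import numpy as np

models = ["Mini Van", "Coupe", "Sedan"]
colors = ["Normal Blue", "Metal Blue", "Red"]
dealers = ["Carr", "Clyde", "Gleason"]
cube = np.arange(27).reshape(3, 3, 3)   # axes: (model, color, dealer)

# Slice: only Mini Van / Coupe, only the two blues, only Carr / Clyde.
sub_cube = cube[np.ix_([0, 1], [0, 1], [0, 1])]
print(sub_cube.shape)   # (2, 2, 2) - the filtered sub-cube
```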

Features of OLAP - Drill Down / Up

ORGANIZATION DIMENSION hierarchy:
- REGION: Midwest
- DISTRICT: Chicago, St. Louis, Gary
- DEALERSHIP: Clyde, Gleason, Carr, Levi, Lucas, Bolton

Sales can be viewed at the Region / District / Dealership level. Moving up and moving down a hierarchy is referred to as drill-up / roll-up and drill-down.

12/20/2012
141
2009 Wipro Ltd - Confidential

141
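The sketch below illustrates drill-down and roll-up with pandas over an invented slice of the organization hierarchy above; the data and column names are assumptions for illustration only:

```python
import pandas as pd

sales = pd.DataFrame({
    "region":     ["Midwest"] * 6,
    "district":   ["Chicago", "Chicago", "Chicago", "St. Louis", "St. Louis", "Gary"],
    "dealership": ["Clyde", "Gleason", "Carr", "Levi", "Lucas", "Bolton"],
    "volume":     [6, 3, 2, 5, 3, 1],
})

# Drill-down: view sales at progressively finer levels of the hierarchy.
print(sales.groupby("region")["volume"].sum())                              # region level
print(sales.groupby(["region", "district"])["volume"].sum())                # district level
print(sales.groupby(["region", "district", "dealership"])["volume"].sum())  # dealership level
# Roll-up is the reverse: re-aggregate the detail back to a coarser level.
```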

OLAP Reporting - Drill Down

Inflows (Region, Year)

[Chart: Inflows ($M) by region - East, West, Central - for years 1999 and 2000]

12/20/2012
142
2009 Wipro Ltd - Confidential

142

OLAP Reporting - Drill Down

Inflows (Region, Year - Year 1999)

[Chart: Inflows ($M) by region - East, West, Central - for each quarter of 1999]

Drill-down from Year to Quarter


12/20/2012
143
2009 Wipro Ltd - Confidential

143

OLAP Reporting - Drill Down

Inflows (Region, Year - Year 1999 - 1st Qtr)

[Chart: Inflows ($M) by region - East, West, Central - for January, February and March 1999]

Drill-down from Quarter to Month

144

2009 Wipro Ltd - Confidential

Implementation Techniques - OLAP Architectures

MOLAP - Multidimensional OLAP
- Multidimensional databases for the database and application logic layer

ROLAP - Relational OLAP
- Access data stored in a relational data warehouse for OLAP analysis; database and application logic provided as separate layers

HOLAP - Hybrid OLAP
- The OLAP server routes queries first to the MDDB, then to the RDBMS, and the result is processed on-the-fly in the server

DOLAP - Desk OLAP
- Personal MDDB server and application on the desktop

12/20/2012
145
2009 Wipro Ltd - Confidential

145

MOLAP - MDDB storage

[Diagram: an OLAP cube with an OLAP calculation engine, accessed from web browsers, OLAP tools and OLAP applications]


12/20/2012
146
2009 Wipro Ltd - Confidential

146

MOLAP - Features

- Powerful analytical capabilities (e.g., financial, forecasting, statistical)
- Aggregation and calculation capabilities
- Read/write analytic applications
- Specialized data structures for maximum query performance and optimum space utilization
12/20/2012
147
2009 Wipro Ltd - Confidential

147

ROLAP - Standard SQL storage

[Diagram: the OLAP calculation engine performs an MDDB-to-relational mapping and issues SQL against the relational data warehouse, serving web browsers, OLAP tools and OLAP applications]
12/20/2012
148
2009 Wipro Ltd - Confidential

148
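As a rough sketch of the ROLAP idea - a multidimensional request translated into SQL against relational storage - the following toy example (invented table and values, not from the deck) aggregates a sales fact by two dimensions:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sale (model TEXT, color TEXT, dealer TEXT, vol INTEGER);
    INSERT INTO sale VALUES
        ('Mini Van', 'Blue', 'Clyde', 6),
        ('Mini Van', 'Red',  'Carr',  2),
        ('Coupe',    'Blue', 'Clyde', 3);
""")

# A "view MODEL by COLOR" request becomes a plain GROUP BY in ROLAP.
for row in con.execute(
        "SELECT model, color, SUM(vol) FROM sale GROUP BY model, color"):
    print(row)
```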

ROLAP - Features

Three-tier hardware/software architecture: GUI on the client; multidimensional processing on a mid-tier server; target database on the database server. Processing is split between the mid-tier and database servers.

- Ad hoc query capabilities against very large databases
- DW integration
- Data scalability

12/20/2012
149
2009 Wipro Ltd - Confidential

149

HOLAP - Combination of RDBMS and MDDB

[Diagram: the OLAP calculation engine serves any client - web browsers, OLAP tools, OLAP applications - answering from the OLAP cube where possible and issuing SQL against the relational data warehouse otherwise]
12/20/2012
150
2009 Wipro Ltd - Confidential

150

HOLAP - Features

- RDBMS used for detailed data stored in large databases
- MDDB used for fast, read/write OLAP analysis and calculations
- Scalability of the RDBMS plus MDDB performance
- Calculation engine provides full analysis features
- Source of the data is transparent to the end user

12/20/2012
151
2009 Wipro Ltd - Confidential

151

Architecture Comparison

Definition
- MOLAP: MDDB OLAP = transaction-level data + summary in the MDDB
- ROLAP: Relational OLAP = transaction-level data + summary in the RDBMS
- HOLAP: Hybrid OLAP = ROLAP + summary in the MDDB

Data explosion due to sparsity
- MOLAP: High (may go beyond control; estimation is very important)
- ROLAP: No sparsity
- HOLAP: Sparsity exists only in the MDDB part

Data explosion due to summarization
- MOLAP: With a good design, 3-10 times
- ROLAP: To the necessary extent
- HOLAP: To the necessary extent

Query execution speed
- MOLAP: Fast (depends upon the size of the MDDB)
- ROLAP: Slow
- HOLAP: Optimum - if the data is fetched from the RDBMS it behaves like ROLAP, otherwise like MOLAP

Cost
- MOLAP: Medium (MDDB server + large disk space cost)
- ROLAP: Low (only RDBMS + disk space cost)
- HOLAP: High (RDBMS + disk space + MDDB server cost)

Where to apply?
- MOLAP: Small transactional data + complex model, where the data needs to be viewed / sorted
- ROLAP: Very large transactional data + frequent summary analysis
- HOLAP: Large transactional data + frequent summary analysis

12/20/2012
152
2009 Wipro Ltd - Confidential

152

Representative OLAP Tools:

- Oracle Express products
- Hyperion Essbase
- Cognos PowerPlay
- Seagate Holos
- SAS
- MicroStrategy DSS Agent
- Informix MetaCube
- Brio Query
- Business Objects / WebIntelligence

12/20/2012
153
2009 Wipro Ltd - Confidential

153

Sample OLAP Applications

- Sales Analysis
- Financial Analysis
- Profitability Analysis
- Performance Analysis
- Risk Management
- Profiling & Segmentation
- Scorecard Applications
- NPA Management
- Strategic Planning
- Customer Relationship Management (CRM)
12/20/2012
154
2009 Wipro Ltd - Confidential

154

Data Warehouse Testing

155

2009 Wipro Ltd - Confidential

Data Warehouse Testing Overview


There is an exponentially increasing cost associated with finding software defects later in the development lifecycle. In data warehousing, this is compounded because of the additional business costs of using incorrect data to make critical business decisions

The methodology required for testing a Data Warehouse is different from testing a typical transaction system

156

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Data warehouse testing is different on the following counts:
- User-triggered vs. system-triggered
- Volume of test data
- Possible scenarios / test cases
- Programming for testing challenge

157

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


User-Triggered vs. System triggered

In a data warehouse, most of the testing is system-triggered. Most production/source system testing covers the processing of individual transactions, which are driven by some input from the users (application form, servicing request, etc.). There are very few test cycles that cover the system-triggered scenarios (like billing or valuation).

158

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Volume of Test Data: The test data in a transaction system is a very small sample of the overall production data. A data warehouse typically has large test data, as one tries to fill up the maximum possible combinations of dimensions and facts.

Possible scenarios / Test Cases: In the case of a data warehouse, the permutations and combinations one can possibly test are virtually unlimited, because the core objective of a data warehouse is to allow all possible views of the data.

159

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Programming for testing challenge: In the case of transaction systems, users/business analysts typically test the output of the system. In the case of a data warehouse, most of the data warehouse data quality testing and ETL testing is done at the back end by running separate stand-alone scripts. These scripts compare pre-transformation data to post-transformation data.

160

2009 Wipro Ltd - Confidential
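A minimal sketch of such a stand-alone back-end comparison script, assuming two illustrative SQLite tables (stg_orders as the pre-transformation staging data and fact_orders as the loaded target) that are not part of the original material:

```python
import sqlite3

con = sqlite3.connect("warehouse.db")   # assumed database file

checks = {
    # Row counts should reconcile between staging and the loaded fact table.
    "row_count": ("SELECT COUNT(*) FROM stg_orders",
                  "SELECT COUNT(*) FROM fact_orders"),
    # A control total on the measure should survive the transformation.
    "amount_total": ("SELECT ROUND(SUM(amount), 2) FROM stg_orders",
                     "SELECT ROUND(SUM(amount), 2) FROM fact_orders"),
}

for name, (pre_sql, post_sql) in checks.items():
    pre = con.execute(pre_sql).fetchone()[0]
    post = con.execute(post_sql).fetchone()[0]
    status = "OK" if pre == post else "MISMATCH"
    print(f"{name}: pre={pre} post={post} {status}")
```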

Data Warehouse Testing Process


Data warehouse testing is basically divided into two parts:
- 'Back-end' testing, where the source system data is compared to the end-result data in the loaded area
- 'Front-end' testing, where the user checks the data by comparing their MIS with the data displayed by the end-user tools like OLAP

Testing phases consist of:
- Requirements testing
- Unit testing
- Integration testing
- Performance testing
- Acceptance testing

161

2009 Wipro Ltd - Confidential

Requirements testing
The main aim of requirements testing is to check the stated requirements for completeness. Requirements can be tested on the following factors:
- Are the requirements complete?
- Are the requirements singular?
- Are the requirements unambiguous?
- Are the requirements developable?
- Are the requirements testable?

162

2009 Wipro Ltd - Confidential

Unit Testing
Unit testing for data warehouses is white-box. It should check the ETL procedures/mappings/jobs and the reports developed.

Unit testing the ETL procedures:
- Whether the ETLs are accessing and picking up the right data from the right source.
- All the data transformations are correct according to the business rules, and the data warehouse is correctly populated with the transformed data.
- Testing that records which don't fulfil the transformation rules are rejected.

163

2009 Wipro Ltd - Confidential
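For instance, a unit test for a single transformation rule might look like the following sketch, where transform_row and its rule (rows with a non-positive amount are rejected) are invented stand-ins for the real mapping logic:

```python
import unittest

def transform_row(row):
    """Toy transformation: reject non-positive amounts, uppercase the currency."""
    if row["amount"] <= 0:
        return None                      # rejected record
    return {"amount": row["amount"], "currency": row["currency"].upper()}

class TestTransformRule(unittest.TestCase):
    def test_valid_row_is_transformed(self):
        self.assertEqual(transform_row({"amount": 10.0, "currency": "usd"}),
                         {"amount": 10.0, "currency": "USD"})

    def test_invalid_row_is_rejected(self):
        self.assertIsNone(transform_row({"amount": -5.0, "currency": "usd"}))

if __name__ == "__main__":
    unittest.main()
```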

Unit Testing
Unit Testing the Report data:

- Verify report data with the source: data present in a data warehouse is stored at an aggregate level compared to the source systems, so the QA team should verify the granular data stored in the data warehouse against the source data available.
- Field-level data verification: the QA team must understand the linkages for the fields displayed in the report, and should trace them back and compare them with the source systems.
- Derivation formulae / calculation rules should be verified.

164

2009 Wipro Ltd - Confidential

Integration Testing
Integration testing will involve following:

- Sequence of ETL jobs in the batch.
- Initial loading of records into the data warehouse.
- Incremental loading of records at a later date, to verify the newly inserted or updated data.
- Testing the rejected records that don't fulfil the transformation rules.
- Error log generation.

165

2009 Wipro Ltd - Confidential

Performance Testing
Performance Testing should check for:
- ETL processes completing within the time window.
- Monitoring and measuring of data quality issues.
- Refresh times for standard/complex reports.

166

2009 Wipro Ltd - Confidential

Acceptance testing
Here the system is tested with full functionality and is expected to function as in production. At the end of UAT, the system should be acceptable to the client for use in terms of ETL process integrity and business functionality and reporting.

167

2009 Wipro Ltd - Confidential

Questions

168

2009 Wipro Ltd - Confidential

Thank You

169

2009 Wipro Ltd - Confidential


Data Modeling
Effective way of using a Data Warehouse

172

2009 Wipro Ltd - Confidential

Data Modeling

The E-R data model is commonly used in OLTP; in OLAP, the dimensional data model is commonly used.

E-R (Entity-Relationship) Data Model
- Entity: an object that can be observed and classified based on its properties and characteristics, like employee, book or student.
- Relationship: relates entities to other entities.

Different perspectives of data modeling:

o Conceptual Data Model
o Logical Data Model
o Physical Data Model
173

2009 Wipro Ltd - Confidential

Terms used in Dimensional Data Model


To understand dimensional data modeling, let's define some of the terms commonly used in this type of modeling:
- Dimension: a category of information. For example, the time dimension.
- Attribute: a unique level within a dimension. For example, Month is an attribute in the Time dimension.
- Hierarchy: the specification of levels that represents the relationship between different attributes within a dimension. For example, one possible hierarchy in the Time dimension is Year > Quarter > Month > Day.
- Fact Table: a table that contains the measures of interest.
- Lookup Table: provides the detailed information about the attributes. For example, the lookup table for the Quarter attribute would include a list of all of the quarters available in the data warehouse.
- Surrogate Keys: used to avoid data integrity issues; they are helpful for slowly changing dimensions and act as index/primary keys.

A dimensional model includes fact tables and lookup tables. Fact tables connect to one or more lookup tables, but fact tables do not have direct relationships to one another. Dimensions and hierarchies are represented by lookup tables. Attributes are the non-key columns in the lookup tables.

2009 Wipro Ltd - Confidential

174

Star Schema

Dimension Table - product
  prodId  name  price
  p1      bolt  10
  p2      nut   5

Dimension Table - store
  storeId  city
  c1       nyc
  c2       sfo
  c3       la

Fact Table - sale
  orderId  date    custId  prodId  storeId  qty  amt
  o100     1/7/97  53      p1      c1       1    12
  o102     2/7/97  53      p2      c1       2    11
  105      3/8/97  111     p1      c3       5    50

Dimension Table - customer
  custId  name   address    city
  53      joe    10 main    sfo
  81      fred   12 main    sfo
  111     sally  80 willow  la

175

2009 Wipro Ltd - Confidential
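To show how such a star schema is typically queried - the fact table joined to its dimension tables and then aggregated - here is a small sketch using SQLite with the toy rows above; it is illustrative only and not part of the original deck:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE product(prodId TEXT PRIMARY KEY, name TEXT, price INTEGER);
    CREATE TABLE store(storeId TEXT PRIMARY KEY, city TEXT);
    CREATE TABLE sale(orderId TEXT, date TEXT, custId INTEGER,
                      prodId TEXT, storeId TEXT, qty INTEGER, amt INTEGER);
    INSERT INTO product VALUES ('p1', 'bolt', 10), ('p2', 'nut', 5);
    INSERT INTO store VALUES ('c1', 'nyc'), ('c2', 'sfo'), ('c3', 'la');
    INSERT INTO sale VALUES ('o100', '1/7/97', 53, 'p1', 'c1', 1, 12),
                            ('o102', '2/7/97', 53, 'p2', 'c1', 2, 11),
                            ('105',  '3/8/97', 111, 'p1', 'c3', 5, 50);
""")

# Typical star-schema query: join the fact table to its dimensions, then aggregate.
query = """
    SELECT p.name, s.city, SUM(f.amt) AS total_amt
    FROM sale f
    JOIN product p ON p.prodId = f.prodId
    JOIN store   s ON s.storeId = f.storeId
    GROUP BY p.name, s.city
"""
for row in con.execute(query):
    print(row)
```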

Snowflake Schema

store (the snowflaked dimension)
  storeId  cityId  tId  mgr
  s5       sfo     t1   joe
  s7       sfo     t2   fred
  s9       la      t1   nancy

sType
  tId  size   location
  t1   small  downtown
  t2   large  suburbs

city
  cityId  pop  regId
  sfo     1M   north
  la      5M   south

region
  regId  name
  north  cold region
  south  warm region

The star and snowflake schemas are most commonly found in dimensional data warehouses and data marts, where speed of data retrieval is more important than the efficiency of data manipulations. As such, the tables in these schemas are not normalized much, and are frequently designed at a level of normalization short of third normal form.

176

2009 Wipro Ltd - Confidential

Overview of Data Cleansing

177

2009 Wipro Ltd - Confidential

The Need For Data Quality

- Difficulty in decision making
- Time delays in operation
- Organizational mistrust
- Data ownership conflicts
- Customer attrition
- Costs associated with error detection, error rework, customer service and fixing customer problems

178


2009 Wipro Ltd - Confidential

Six Steps To Data Quality

1. Understand Information Flow In Organization
   - Identify authoritative data sources
   - Interview employees & customers

2. Identify Potential Problem Areas & Assess Impact
   - Data entry points
   - Cost of bad data

3. Measure Quality Of Data
   - Use business rule discovery tools to identify data with inconsistent, missing, incomplete, duplicate or incorrect values

4. Clean & Load Data
   - Use data cleansing tools to clean data at the source
   - Load only clean data into the data warehouse

5. Continuous Monitoring
   - Schedule periodic cleansing of source data

6. Identify Areas of Improvement
   - Identify & correct the cause of defects
   - Refine data capture mechanisms at the source
   - Educate users on the importance of DQ
2009 Wipro Ltd - Confidential

179

Data Quality Solution

Customized Programs
Strengths:
- Addresses specific needs
- No bulky one-time investment
Limitations:
- Tons of custom programs in different environments are difficult to manage
- Minor alterations demand coding efforts

Data Quality Assessment Tools
Strength:
- Provide automated assessment
Limitation:
180
2009 Wipro Ltd - Confidential

Data Quality Solution

Business Rule Discovery Tools
Strengths:
- Detect correlation in data values
- Can detect patterns of behavior that indicate fraud
Limitations:
- Not all variables can be discovered
- Some discovered rules might not be pertinent
- There may be performance problems with large files or with many fields

Data Reengineering & Cleansing Tools
Strengths:
- Usually are integrated packages with cleansing features as an add-on
181
2009 Wipro Ltd - Confidential

Tools In The Market

Business Rule Discovery Tools
- Integrity Data Reengineering Tool from Vality Technology
- Trillium Software System from Harte-Hanks Data Technologies
- Migration Architect from DB Star

Data Reengineering & Cleansing Tools
- Carlton Pureview from Oracle
- ETI-Extract from Evolutionary Technologies
- PowerMart from Informatica Corp
- Sagent Data Mart from Sagent Technology

Data Quality Assessment Tools
- Migration Architect, Evoke Axio from Evoke Software
- Wizrule from Wizsoft

Name & Address Cleansing Tools
- Centrus Suite from Sagent
- I.d.centric from First Logic

182


2009 Wipro Ltd - Confidential

Data Extraction, Transformation, Load

183

2009 Wipro Ltd - Confidential

ETL Architecture

[Diagram: visitors' web browsers reach the site over the Internet; web server logs and e-commerce transaction data (flat files), external data (demographics, household, webographics, income) and other OLTP systems feed a scheduled extraction into a staging area (RDBMS), where data is cleaned, transformed, matched and merged; scheduled loading then populates the enterprise data warehouse, with a meta data repository documenting the process]

Stages: Data Collection -> Data Extraction -> Data Transformation -> Data Loading -> Data Storage & Integration

184

2009 Wipro Ltd - Confidential

ETL Architecture

Data Extraction:
- Rummages through a file or database
- Uses some criteria for selection
- Identifies qualified data
- Transports the data over onto another file or database

Data Transformation:
- Integrating dissimilar data types
- Changing codes
- Adding a time attribute
- Summarizing data
- Calculating derived values
- Renormalizing data

Data Cleanup:
- Restructuring of records or fields
- Removal of operational-only data
- Supply of missing field values
- Data integrity checks
- Data consistency and range checks

Data Loading:
- Initial and incremental loading
- Updating of metadata

185

2009 Wipro Ltd - Confidential

Why ETL ?
Companies have valuable data lying around throughout their networks that needs to be moved from one place to another. The data lies in all sorts of heterogeneous systems,and therefore in all sorts of formats. To solve the problem, companies use extract, transform and load (ETL) software.
186
2009 Wipro Ltd - Confidential
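As a minimal, purely illustrative sketch of the extract-transform-load flow described here (the file name, columns and cleansing rule are assumptions, not taken from the deck):

```python
import csv
import sqlite3

# Extract: read raw rows from an assumed source flat file.
with open("orders.csv", newline="") as f:
    raw_rows = list(csv.DictReader(f))   # assumed columns: order_id, amount, country

# Transform: cleanse and standardize; set aside rows that break the rules.
clean, rejected = [], []
for row in raw_rows:
    try:
        amount = round(float(row["amount"]), 2)
        clean.append((row["order_id"], amount, row["country"].strip().upper()))
    except (KeyError, ValueError):
        rejected.append(row)

# Load: write the transformed rows into the target table.
con = sqlite3.connect("warehouse.db")
con.execute("CREATE TABLE IF NOT EXISTS fact_orders(order_id TEXT, amount REAL, country TEXT)")
con.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean)
con.commit()
print(f"loaded {len(clean)} rows, rejected {len(rejected)}")
```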

Major components involved in ETL Processing

187

2009 Wipro Ltd - Confidential

Major components involved in ETL Processing

- Design manager: lets developers define source-to-target mappings, transformations, process flows, and jobs
- Meta data management: provides a repository to define, document, and manage information about the ETL design and runtime processes
- Extract: the process of reading data from a database
- Transform: the process of converting the extracted data
- Load: the process of writing the data into the target database
- Transport services: ETL tools use network and file protocols to move data between source and target systems, and in-memory protocols to move data between ETL run-time components
- Administration and operation: ETL utilities let administrators schedule, run and monitor ETL jobs, log all events, manage errors, recover from failures, and reconcile outputs with source systems
2009 Wipro Ltd - Confidential

188

ETL Tools
- Provide a facility to specify a large number of transformation rules with a GUI
- Generate programs to transform data
- Handle multiple data sources
- Handle data redundancy
- Generate metadata as output
- Most tools exploit parallelism by running on multiple low-cost servers in a multi-threaded environment
189
2009 Wipro Ltd - Confidential

Metadata Management

190

2009 Wipro Ltd - Confidential

What Is Metadata?
Metadata is information...

- That describes the WHAT, WHEN, WHO, WHERE and HOW of the data warehouse
- About the data being captured and loaded into the warehouse
- Documented in IT tools, improving both business and technical understanding of data and data-related processes

191

2009 Wipro Ltd - Confidential
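For a concrete (and entirely invented) illustration, the metadata for one warehouse column might be recorded as a simple structure like this:

```python
# Hypothetical metadata entry for a single warehouse column.
column_metadata = {
    "table": "fact_orders",
    "column": "amount",
    "what": "Order amount in USD, rounded to 2 decimals",
    "where_from": "orders.csv extracted from the e-commerce source system",  # lineage
    "how": "cast to REAL and rounded during the ETL transform step",
    "when_loaded": "2009-06-30T02:15:00",
    "who": "nightly_etl_batch",
}

for key, value in column_metadata.items():
    print(f"{key:12} {value}")
```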

Importance Of Metadata

Locating information
- How much time is spent looking for information?
- How often is the information found?
- What poor decisions were made based on incomplete information? How much money was lost or earned as a result?

Interpreting information
- How many times have businesses needed to rework or recall products? What impact does it have on the bottom line?
- How many mistakes were due to misinterpretation of existing documentation?
- How much interpretation results from too much metadata?
- How much time is spent trying to determine if any of the metadata is accurate?

Integrating information
- How do the various data perspectives connect together? How much time is spent trying to figure that out?
- How much does the inefficiency and lack of metadata affect decision making?

192

2009 Wipro Ltd - Confidential

Requirements for DW Metadata Management

- Provide a simple catalogue of business metadata descriptions and views
- Document/manage metadata descriptions from an integrated development environment
- Enable DW users to identify and invoke pre-built queries against the data stores
- Design and enhance new data models and schemas for the data warehouse
- Capture data transformation rules between the operational and data warehousing databases
- Provide change impact analysis, and updates across these technologies
193
2009 Wipro Ltd - Confidential

Consumers of Metadata

Technical users
- Warehouse administrator
- Application developer

Business users - business metadata
- Meanings
- Definitions
- Business rules

Software tools used in DW life-cycle development
- Metadata requirements for each tool must be identified
- The tool-specific metadata should be analysed for inclusion in the enterprise metadata repository
- Previously captured metadata should be electronically transferred from the enterprise metadata repository to each individual tool

194

2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Third Party Bridging Tools

Oracle Exchange
- Technology of choice for a long list of repository, enterprise and workgroup vendors

Reischmann-Informatik Toolbus
- Features include facilitation of selective bridging of metadata

Ardent Software / Dovetail Software - Interplay
- Hub-and-spoke solution for enabling metadata interoperability
- Ardent focussing on its own engagements, not selling it as an independent product

Informix's Metadata Plug-ins
- Available with Ardent Datastage version 3.6.2 free of cost for Erwin, Oracle Designer, Sybase Powerdesigner, Brio, Microstrategy
195
2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Metadata Repositories
- IBM, Oracle and Microsoft to offer free or near-free basic repository services
- Enable organisations to reuse metadata across technologies
- Integrate DB design, data transformation and BI tools from different vendors
- Multi-tool vendors taking a bridged or federated rather than integrated approach to sharing metadata
- Both IBM and Oracle have multiple repositories for different lines of products, e.g., one for AD and one for DW, with bridges between them

196

2009 Wipro Ltd - Confidential


Trends in the Metadata Management Tools


Metadata Interchange Standards

CDIF (CASE Data Interchange Format)
- Most frequently used interchange standard
- Addresses only a limited subset of metadata artifacts

OMG (Object Management Group) - CWM
- XML addresses context and data meaning, not presentation
- Can enable exchange over the web employing industry standards for storing and sharing programming data
- Will allow sharing of UML and MOF objects between various development tools and repositories

MDC (Metadata Coalition)
- Based on XML/UML standards
- Promoted by Microsoft along with 20 partners including Object Management Group (OMG), Oracle Carleton Group, CA-PLATINUM Technology (founding member), Viasoft
233
2009 Wipro Ltd - Confidential




Data Modeling
Effective way of using a Data Warehouse

283

2009 Wipro Ltd - Confidential

Data Modeling

The E-R data model is commonly used in OLTP systems, while the dimensional data model is commonly used in OLAP. E-R (Entity-Relationship) Data Model:
Entity: An object that can be observed and classified based on its properties and characteristics, such as an employee, book or student.
Relationship: Relates entities to other entities.

Different perspectives of data modeling:

o Conceptual Data Model o Logical Data Model o Physical Data Model
284

2009 Wipro Ltd - Confidential

Terms used in Dimensional Data Model


To understand dimensional data modeling, let's define some of the terms commonly used in this type of modeling: Dimension: A category of information, for example the Time dimension. Attribute: A unique level within a dimension; for example, Month is an attribute in the Time dimension. Hierarchy: The specification of levels that represents the relationship between different attributes within a dimension; for example, one possible hierarchy in the Time dimension is Year > Quarter > Month > Day. Fact Table: A table that contains the measures of interest. Lookup Table: Provides the detailed information about the attributes; for example, the lookup table for the Quarter attribute would include a list of all of the quarters available in the data warehouse. Surrogate Keys: Used in place of natural keys to protect data integrity; they are helpful for Slowly Changing Dimensions and act as index/primary keys.
A dimensional model includes fact tables and lookup tables. Fact tables connect to one or more lookup tables, but fact tables do not have direct relationships to one another. Dimensions and hierarchies are represented by lookup tables. Attributes are the non-key columns in the lookup tables.
2009 Wipro Ltd - Confidential
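A minimal sketch of how a surrogate key supports a Slowly Changing Dimension (Type 2), using illustrative table and column names: each change to a customer's city gets a new surrogate key while the natural key (custId) stays the same:

```python
# Type-2 slowly changing dimension sketch: new surrogate key per version.
# Names (dim_customer, custId, city) are illustrative, not from the deck.
dim_customer = []          # the dimension table
next_surrogate_key = [1]   # simple key generator

def upsert_customer(cust_id, city):
    """Close the current version (if the city changed) and insert a new one."""
    current = next((r for r in dim_customer
                    if r["custId"] == cust_id and r["is_current"]), None)
    if current and current["city"] == city:
        return current["customer_sk"]          # no change, reuse surrogate key
    if current:
        current["is_current"] = False          # expire the old version
    row = {"customer_sk": next_surrogate_key[0], "custId": cust_id,
           "city": city, "is_current": True}
    next_surrogate_key[0] += 1
    dim_customer.append(row)
    return row["customer_sk"]

upsert_customer(53, "sfo")
upsert_customer(53, "la")   # customer moved: new surrogate key, old row expired
for row in dim_customer:
    print(row)
```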

285

Star Schema
Dimension Table: product
  prodId  name  price
  p1      bolt  10
  p2      nut   5

Dimension Table: store
  storeId  city
  c1       nyc
  c2       sfo
  c3       la

Fact Table: sale
  orderId  date    custId  prodId  storeId  qty  amt
  o100     1/7/97  53      p1      c1       1    12
  o102     2/7/97  53      p2      c1       2    11
  105      3/8/97  111     p1      c3       5    50

Dimension Table: customer
  custId  name   address    city
  53      joe    10 main    sfo
  81      fred   12 main    sfo
  111     sally  80 willow  la
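A minimal runnable sketch of this star schema, using SQLite purely for illustration; it builds the sample tables shown above and runs the typical query pattern of joining the fact table to its dimension tables:

```python
# Build the sample star schema in SQLite and aggregate sales amount by city.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE product (prodId TEXT PRIMARY KEY, name TEXT, price REAL);
    CREATE TABLE store   (storeId TEXT PRIMARY KEY, city TEXT);
    CREATE TABLE customer(custId INTEGER PRIMARY KEY, name TEXT, address TEXT, city TEXT);
    CREATE TABLE sale    (orderId TEXT, date TEXT, custId INTEGER,
                          prodId TEXT, storeId TEXT, qty INTEGER, amt REAL);
    INSERT INTO product VALUES ('p1','bolt',10), ('p2','nut',5);
    INSERT INTO store   VALUES ('c1','nyc'), ('c2','sfo'), ('c3','la');
    INSERT INTO customer VALUES (53,'joe','10 main','sfo'),
                                (81,'fred','12 main','sfo'),
                                (111,'sally','80 willow','la');
    INSERT INTO sale VALUES ('o100','1/7/97',53,'p1','c1',1,12),
                            ('o102','2/7/97',53,'p2','c1',2,11),
                            ('105','3/8/97',111,'p1','c3',5,50);
""")

# Fact-to-dimension join: total sales quantity and amount per store city and product.
query = """
    SELECT st.city, p.name, SUM(s.qty) AS total_qty, SUM(s.amt) AS total_amt
    FROM sale s
    JOIN store st  ON st.storeId = s.storeId
    JOIN product p ON p.prodId  = s.prodId
    GROUP BY st.city, p.name
"""
for row in conn.execute(query):
    print(row)
```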

286

2009 Wipro Ltd - Confidential

Snowflake Schema
Dimension Table: store
  storeId  cityId  tId  mgr
  s5       sfo     t1   joe
  s7       sfo     t2   fred
  s9       la      t1   nancy

Dimension Table: sType
  tId  size   location
  t1   small  downtown
  t2   large  suburbs

Dimension Table: city
  cityId  pop  regId
  sfo     1M   north
  la      5M   south

Dimension Table: region
  regId  name
  north  cold region
  south  warm region

The star and snowflake schemas are most commonly found in dimensional data warehouses and data marts, where speed of data retrieval is more important than efficiency of data manipulation. As such, the tables in these schemas are not heavily normalized and are frequently designed at a level of normalization short of third normal form.

287

2009 Wipro Ltd - Confidential

Overview of Data Cleansing

288

2009 Wipro Ltd - Confidential

The Need For Data Quality

Difficulty in decision making
Time delays in operation
Organizational mistrust
Data ownership conflicts
Customer attrition
Costs associated with error detection, error rework, customer service and fixing customer problems

289


2009 Wipro Ltd - Confidential

Six Steps To Data Quality

1. Understand the information flow in the organization: identify authoritative data sources; interview employees and customers.
2. Identify potential problem areas and assess impact: data entry points; cost of bad data.
3. Measure the quality of data: use business rule discovery tools to identify data with inconsistent, missing, incomplete, duplicate or incorrect values.
4. Clean and load data: use data cleansing tools to clean data at the source; load only clean data into the data warehouse.
5. Continuous monitoring: schedule periodic cleansing of source data; identify areas of improvement.
6. Identify and correct the cause of defects: refine data capture mechanisms at source; educate users on the importance of DQ.
2009 Wipro Ltd - Confidential

290
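A minimal sketch of the "measure the quality of data" step, profiling a handful of records for missing, duplicate and out-of-range values; the field names and rules are illustrative:

```python
# Simple data-quality profiling: count missing, duplicate and invalid values.
from collections import Counter

records = [
    {"custId": 53,  "city": "sfo", "age": 34},
    {"custId": 81,  "city": "",    "age": 29},    # missing city
    {"custId": 81,  "city": "sfo", "age": 29},    # duplicate custId
    {"custId": 111, "city": "la",  "age": 212},   # age out of range
]

missing_city = sum(1 for r in records if not r["city"])
dupe_ids = [k for k, v in Counter(r["custId"] for r in records).items() if v > 1]
bad_age = [r["custId"] for r in records if not 0 <= r["age"] <= 120]

print("rows:", len(records))
print("missing city:", missing_city)
print("duplicate custIds:", dupe_ids)
print("out-of-range ages for custIds:", bad_age)
```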

Data Quality Solution

Customized Programs
Strengths: Address specific needs; no bulky one-time investment.
Limitations: Large numbers of custom programs in different environments are difficult to manage; minor alterations demand coding effort.

Data Quality Assessment tools
Strengths: Provide automated assessment.
Limitation
291
2009 Wipro Ltd - Confidential

Data Quality Solution


Business Rule Discovery tools
Strengths: Detect correlations in data values; can detect patterns of behaviour that indicate fraud.
Limitations: Not all variables can be discovered; some discovered rules might not be pertinent; there may be performance problems with large files or with many fields.

Data Reengineering & Cleansing tools
Strengths: Usually integrated packages with cleansing features as an add-on.
292
2009 Wipro Ltd - Confidential

Tools In The Market

Business Rule Discovery Tools

Integrity Data Reengineering Tool from Vality Technology; Trillium Software System from Harte-Hanks Data Technologies; Migration Architect from DB Star

Data Reengineering & Cleansing Tools

Carlton Pureview from Oracle; ETI-Extract from Evolutionary Technologies; PowerMart from Informatica Corp; Sagent Data Mart from Sagent Technology

Data Quality Assessment Tools

Migration Architect, Evoke Axio from Evoke Software; Wizrule from Wizsoft

Name & Address Cleansing Tools

Centrus Suite from Sagent; I.d.centric from First Logic

293


2009 Wipro Ltd - Confidential

Data Extraction, Transformation, Load

294

2009 Wipro Ltd - Confidential

ETL Architecture

[Architecture diagram] Data flows from the sources (visitors' web browsers via the Internet producing web server logs and e-commerce transaction data as flat files; external data such as demographics, household, webographics and income; other OLTP systems and RDBMS sources) through scheduled extraction into a staging area, where it is cleaned, transformed, matched and merged, and then through scheduled loading into the enterprise data warehouse, with a metadata repository describing the whole process. The stages are: Data Collection, Data Extraction, Data Transformation, Data Loading, and Data Storage & Integration.

295

2009 Wipro Ltd - Confidential

ETL Architecture

Data Extraction:
Rummages through a file or database, uses some criteria for selection, identifies qualified data, and transports the data over onto another file or database.

Data Transformation (see the sketch below):
Integrating dissimilar data types, changing codes, adding a time attribute, summarizing data, calculating derived values, renormalizing data.

Data Extraction Cleanup:
Restructuring of records or fields, removal of operational-only data, supply of missing field values, data integrity checks, data consistency and range checks.

Data Loading:
Initial and incremental loading, updating of metadata.

296

2009 Wipro Ltd - Confidential
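A minimal sketch of a few of these transformations (changing codes, adding a time attribute, calculating a derived value), using illustrative field names:

```python
# Illustrative ETL transformation step: code mapping, time attribute, derived value.
from datetime import datetime, timezone

GENDER_CODES = {"M": "Male", "F": "Female"}   # changing codes

def transform(record):
    out = dict(record)
    out["gender"] = GENDER_CODES.get(record["gender"], "Unknown")
    out["load_ts"] = datetime.now(timezone.utc).isoformat()   # adding a time attribute
    out["amt"] = record["qty"] * record["unit_price"]         # calculating a derived value
    return out

print(transform({"custId": 53, "gender": "M", "qty": 2, "unit_price": 10}))
```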

Why ETL ?
Companies have valuable data lying around throughout their networks that needs to be moved from one place to another. The data lies in all sorts of heterogeneous systems, and therefore in all sorts of formats. To solve the problem, companies use extract, transform and load (ETL) software.
297
2009 Wipro Ltd - Confidential

Major components involved in ETL Processing

298

2009 Wipro Ltd - Confidential

Major components involved in ETL Processing


Design manager: Lets developers define source-to-target mappings, transformations, process flows and jobs.
Metadata management: Provides a repository to define, document and manage information about the ETL design and runtime processes.
Extract: The process of reading data from a database.
Transform: The process of converting the extracted data.
Load: The process of writing the data into the target database.
Transport services: ETL tools use network and file protocols to move data between source and target systems, and in-memory protocols to move data between ETL run-time components.
Administration and operation: ETL utilities let administrators schedule, run and monitor ETL jobs, log all events, manage errors, recover from failures and reconcile outputs with source systems.
2009 Wipro Ltd - Confidential

299

ETL Tools

Provide a facility to specify a large number of transformation rules with a GUI
Generate programs to transform data
Handle multiple data sources
Handle data redundancy
Generate metadata as output
Most tools exploit parallelism by running on multiple low-cost servers in a multi-threaded environment
300
2009 Wipro Ltd - Confidential

Metadata Management

301

2009 Wipro Ltd - Confidential

What Is Metadata?
Metadata is Information...

That describes the WHAT, WHEN, WHO, WHERE and HOW of the data warehouse
About the data being captured and loaded into the warehouse
Documented in IT tools that improve both business and technical understanding of data and data-related processes
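An illustrative (hypothetical) metadata entry for one warehouse column, showing the kind of what/when/who/where/how detail a metadata repository records:

```python
# Hypothetical metadata record for a single warehouse column.
import json

column_metadata = {
    "table": "fact_sales",
    "column": "amt",
    "business_definition": "Sale amount in USD after discounts",   # WHAT
    "source": "orders.order_amount in the billing OLTP system",    # WHERE it comes from
    "transformation": "qty * unit_price, rounded to 2 decimals",   # HOW it is derived
    "load_job": "daily_sales_load",                                # WHO/WHAT loads it
    "last_loaded": "2009-06-30T02:15:00Z",                         # WHEN it was loaded
    "steward": "Finance data steward",
}
print(json.dumps(column_metadata, indent=2))
```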

302

2009 Wipro Ltd - Confidential

Importance Of Metadata
Locating information: How much time is spent looking for information? How often is the information found? What poor decisions were made based on incomplete information? How much money was lost or earned as a result?

Interpreting information: How many times have businesses needed to rework or recall products? What impact does it have on the bottom line? How many mistakes were due to misinterpretation of existing documentation? How much interpretation results from too much metadata? How much time is spent trying to determine whether any of the metadata is accurate?

Integrating information: How do various data perspectives connect together? How much time is spent trying to figure that out? How much does the inefficiency and lack of metadata affect decision making?

303

2009 Wipro Ltd - Confidential

Requirements for DW Metadata Management


Provide a simple catalogue of business metadata descriptions and views
Document/manage metadata descriptions from an integrated development environment
Enable DW users to identify and invoke pre-built queries against the data stores
Design and enhance new data models and schemas for the data warehouse
Capture data transformation rules between the operational and data warehousing databases
Provide change impact analysis and update across these technologies
304
2009 Wipro Ltd - Confidential

Consumers of Metadata
Technical users: warehouse administrators, application developers.
Business users (business metadata): meanings, definitions, business rules.
Software tools used in DW life-cycle development: metadata requirements for each tool must be identified; the tool-specific metadata should be analysed for inclusion in the enterprise metadata repository; previously captured metadata should be electronically transferred from the enterprise metadata repository to each individual tool.

305

2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Third Party Bridging Tools Oracle Exchange
Technology of choice for a long list of repository, enterprise and workgroup vendors

Reischmann-Informatik-Toolbus
Features include facilitation of selective bridging of metadata

Ardent Software/ Dovetail Software -Interplay


Hub and Spoke solution for enabling metadata interoperability Ardent focussing on own engagements, not selling it as independent product

Informix's Metadata Plug-ins


Available with Ardent Datastage version 3.6.2 free of cost for Erwin, Oracle Designer, Sybase Powerdesigner, Brio, Microstrategy
306
2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Metadata Repositories
IBM, Oracle and Microsoft to offer free or near-free basic repository services
Enable organisations to reuse metadata across technologies
Integrate DB design, data transformation and BI tools from different vendors
Multi-tool vendors taking a bridged or federated rather than integrated approach to sharing metadata
Both IBM and Oracle have multiple repositories for different lines of products, e.g. one for AD and one for DW, with bridges between them

307

2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Metadata Interchange Standards CDIF (CASE Data Interchange Format)
Most frequently used interchange standard Addresses only a limited subset of metadata artifacts

OMG (Object Management Group)-CWM


XML-addresses context and data meaning, not presentation Can enable exchange over the web employing industry standards for storing and sharing programming data Will allow sharing of UML and MOF objects b/w various development tools and repositories

MDC (Metadata Coalition)


Based on XML/UML standards Promoted by Microsoft Along With 20 partners including Object Management Group (OMG), Oracle Carleton Group, CA-PLATINUM Technology (Founding Member), Viasoft
308
2009 Wipro Ltd - Confidential

OLAP

309

2009 Wipro Ltd - Confidential

Agenda
OLAP Definition Distinction between OLTP and OLAP

MDDB Concepts
Implementation Techniques Architectures

Features
Representative Tools

12/20/2012

310

310

2009 Wipro Ltd - Confidential

OLAP: On-Line Analytical Processing


OLAP can be defined as a technology which allows users to view aggregate data across measurements (like Maturity Amount, Interest Rate etc.) along with a set of related parameters called dimensions (like Product, Organization, Customer etc.). The term is often used interchangeably with BI. The multidimensional view of data is the foundation of OLAP. Users: analysts and decision makers.
12/20/2012 311

311

2009 Wipro Ltd - Confidential

Distinction between OLTP and OLAP


Source of data: OLTP System: Operational data; OLTPs are the original source of the data. OLAP System: Consolidation data; OLAP data comes from the various OLTP databases.
Purpose of data: OLTP System: To control and run fundamental business tasks. OLAP System: Decision support.
What the data reveals: OLTP System: A snapshot of ongoing business processes. OLAP System: Multi-dimensional views of various kinds of business activities.
Inserts and updates: OLTP System: Short and fast inserts and updates initiated by end users. OLAP System: Periodic long-running batch jobs refresh the data.

12/20/2012
312

2009 Wipro Ltd - Confidential

MDDB Concepts
A multidimensional database is a computer software system designed to allow for the efficient and convenient storage and retrieval of data that is intimately related and can be stored, viewed and analyzed from different perspectives (dimensions). A hypercube represents a collection of multidimensional data. The edges of the cube are called dimensions. Individual items within each dimension are called members.
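A minimal sketch of the hypercube idea, representing cells as a mapping from (dimension member, ...) coordinates to a measure; the Model/Color/Dealer example mirrors the sales cube used on the following slides:

```python
# A tiny 3-dimensional cube: (model, color, dealer) -> sales volume.
cube = {
    ("Mini Van", "Blue",  "Clyde"): 6,
    ("Mini Van", "Red",   "Clyde"): 5,
    ("Coupe",    "Blue",  "Carr"):  2,
    ("Sedan",    "White", "Carr"):  2,
}

models  = {m for m, _, _ in cube}
colors  = {c for _, c, _ in cube}
dealers = {d for _, _, d in cube}

# Any cell not present is an empty (sparse) cell.
def volume(model, color, dealer):
    return cube.get((model, color, dealer), 0)

print("dimensions:", len(models), "x", len(colors), "x", len(dealers))
print("Mini Van / Blue / Clyde ->", volume("Mini Van", "Blue", "Clyde"))
```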

313

2009 Wipro Ltd - Confidential

RDBMS v/s MDDB: Increased Complexity...


Relational DBMS: the same sales data is held as one row per (MODEL, COLOR, DEALER) combination, e.g.

  MODEL     COLOR  DEALER   VOL.
  MINI VAN  BLUE   Clyde    6
  MINI VAN  BLUE   Gleason  3
  MINI VAN  BLUE   Carr     2
  ...       ...    ...      ...

With 3 models, 3 colors and 3 dealers this gives 27 rows of 4 columns, i.e. 27 x 4 = 108 cells.

MDDB: the same Sales Volumes data is held as a cube whose edges are MODEL (Mini Van, Coupe, Sedan), COLOR (Blue, Red, White) and DEALERSHIP (Clyde, Gleason, Carr), i.e. 3 x 3 x 3 = 27 cells.

314

2009 Wipro Ltd - Confidential

Benefits of MDDB over RDBMS


Ease of data presentation and navigation: A great deal of information is gleaned immediately upon direct inspection of the array. The user is able to view data along presorted dimensions, with data arranged in an inherently more organized and accessible fashion than the one offered by a relational table.
Storage space: Very low space consumption compared to a relational DB.
Performance: Gives much better performance. A relational DB may give comparable results only through database tuning (indexing, keys etc.), which may not be possible for ad-hoc queries.
Ease of maintenance: No overhead, as data is stored in the same way it is viewed. In a relational DB, indexes, sophisticated joins etc. are used, which require considerable storage and maintenance.
12/20/2012
315
2009 Wipro Ltd - Confidential

315

Issues with MDDB

Sparsity
- Input data in applications is typically sparse
- Increases with increased dimensions

Data Explosion
- Due to sparsity
- Due to summarization

Performance
- Doesn't perform better than an RDBMS at high data volumes (>20-30 GB)
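A minimal sketch of measuring sparsity for the cube representation above (illustrative data, reusing the idea of cells keyed by dimension members): density is populated cells divided by the total number of possible cells:

```python
# Measure how sparse a (model, color, dealer) cube is.
cube = {
    ("Mini Van", "Blue", "Clyde"): 6,
    ("Coupe",    "Red",  "Carr"):  4,
}
models  = ["Mini Van", "Coupe", "Sedan"]
colors  = ["Blue", "Red", "White"]
dealers = ["Clyde", "Gleason", "Carr"]

total_cells = len(models) * len(colors) * len(dealers)   # 27 possible cells
populated   = len(cube)                                  # 2 filled cells
print(f"density = {populated}/{total_cells} = {populated / total_cells:.1%}")
print(f"sparsity = {1 - populated / total_cells:.1%}")
```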

12/20/2012
316
2009 Wipro Ltd - Confidential

316

Issues with MDDB - Sparsity Example

If dimension members of different dimensions do not interact, then a blank cell is left behind.

[Figure] An Employee table (LAST NAME, EMP#, AGE) recast as a cube with LAST NAME and EMPLOYEE # as dimensions: each employee number intersects only one last name, so almost every cell is blank, in contrast to the dense Sales Volumes cube (Model x Color).

12/20/2012
317
2009 Wipro Ltd - Confidential

317

OLAP Features
Calculations applied across dimensions, through hierarchies and/or across members
Trend analysis over sequential time periods
What-if scenarios
Slicing / dicing subsets for on-screen viewing
Rotation to new dimensional comparisons in the viewing area
Drill-down / drill-up along the hierarchy
Reach-through / drill-through to underlying detail data
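A minimal sketch of slicing and drill-up style aggregation over the small cube used earlier (plain dictionaries, illustrative only):

```python
# Slice and roll-up over a tiny (model, color, dealer) -> volume cube.
from collections import defaultdict

cube = {
    ("Mini Van", "Blue",  "Clyde"): 6, ("Mini Van", "Blue",  "Gleason"): 3,
    ("Mini Van", "Red",   "Clyde"): 5, ("Coupe",    "Blue",  "Carr"):    2,
    ("Coupe",    "White", "Carr"):  4, ("Sedan",    "White", "Clyde"):   3,
}

# Slice: fix one dimension member (color = Blue).
blue_slice = {(m, d): v for (m, c, d), v in cube.items() if c == "Blue"}
print("Blue slice:", blue_slice)

# Roll-up (drill-up): aggregate away the dealer dimension.
by_model_color = defaultdict(int)
for (model, color, _dealer), volume in cube.items():
    by_model_color[(model, color)] += volume
print("Rolled up to Model x Color:", dict(by_model_color))
```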

12/20/2012
318
2009 Wipro Ltd - Confidential

318

Features of OLAP - Rotation

Complex Queries & Sorts in Relational environment translated to simple rotation.


[Figure] The Sales Volumes array viewed as MODEL (rows) x COLOR (columns) in View #1 is rotated 90 degrees to COLOR (rows) x MODEL (columns) in View #2.

A 2-dimensional array has 2 views.


12/20/2012
319
2009 Wipro Ltd - Confidential

319

Features of OLAP - Rotation


[Figure] The 3-dimensional Sales Volumes cube (MODEL x COLOR x DEALERSHIP) is rotated 90 degrees repeatedly, giving six different two-dimensional presentations (Views #1 to #6).

A 3-dimensional array has 6 views.


12/20/2012
320
2009 Wipro Ltd - Confidential

320

Features of OLAP - Slicing / Filtering


MDDB allows the end user to quickly slice in on the exact view of the data required.

[Figure] The Sales Volumes cube is filtered down to the Mini Van and Coupe models, the Normal Blue and Metal Blue colors, and the Carr and Clyde dealerships.
12/20/2012
321
2009 Wipro Ltd - Confidential

321

Features of OLAP - Drill Down / Up

[Figure] Organization dimension hierarchy: REGION (Midwest) > DISTRICT (Chicago, St. Louis, Gary) > DEALERSHIP (Clyde, Gleason, Carr, Levi, Lucas, Bolton), with sales reported at the region, district or dealership level.

Moving Up and moving down in a hierarchy is referred to as drill-up / roll-up and drill-down

12/20/2012
322
2009 Wipro Ltd - Confidential

322

OLAP Reporting - Drill Down

Inflows ( Region , Year)


[Chart] Inflows ($M) by region (East, West, Central) for the years 1999 and 2000.

12/20/2012
323
2009 Wipro Ltd - Confidential

323

OLAP Reporting - Drill Down

Inflows ( Region , Year - Year 1999)


[Chart] Inflows ($M) by region (East, West, Central) for the four quarters of 1999.

Drill-down from Year to Quarter


12/20/2012
324
2009 Wipro Ltd - Confidential

324

OLAP Reporting - Drill Down

Inflows ( Region , Year - Year 1999 - 1st Qtr)


[Chart] Inflows ($M) by region (East, West, Central) for January, February and March of 1999.

Drill-down from Quarter to Month

325

2009 Wipro Ltd - Confidential

Implementation Techniques -OLAP Architectures

MOLAP - Multidimensional OLAP


Multidimensional Databases for database and application logic layer

ROLAP - Relational OLAP


Access Data stored in relational Data Warehouse for OLAP Analysis. Database and Application logic provided as separate layers

HOLAP - Hybrid OLAP


OLAP Server routes queries first to MDDB, then to RDBMS and result processed on-the-fly in Server

DOLAP - Desktop OLAP


Personal MDDB Server and application on the desktop

12/20/2012
326
2009 Wipro Ltd - Confidential

326

MOLAP - MDDB storage

[Diagram] An OLAP calculation engine works directly on the MDDB OLAP cube; web browsers, OLAP tools and OLAP applications query the engine.


12/20/2012
327
2009 Wipro Ltd - Confidential

327

MOLAP - Features

Powerful analytical capabilities (e.g., financial, forecasting, statistical) Aggregation and calculation capabilities Read/write analytic applications Specialized data structures for
Maximum query performance. Optimum space utilization.
12/20/2012
328
2009 Wipro Ltd - Confidential

328

ROLAP - Standard SQL storage

[Diagram] An MDDB-to-relational mapping layer and OLAP calculation engine generate SQL against the relational data warehouse; web browsers, OLAP tools and OLAP applications query the engine.
12/20/2012
329
2009 Wipro Ltd - Confidential

329

ROLAP - Features Three-tier hardware/software architecture:


GUI on client; multidimensional processing on midtier server; target database on database server Processing split between mid-tier & database servers

Ad hoc query capabilities to very large databases DW integration Data scalability

12/20/2012
330
2009 Wipro Ltd - Confidential

330

HOLAP - Combination of RDBMS and MDDB


[Diagram] The OLAP calculation engine serves any client (web browsers, OLAP tools, OLAP applications) from an MDDB OLAP cube for summaries, and issues SQL against the relational data warehouse for detail.
12/20/2012
331
2009 Wipro Ltd - Confidential

331

HOLAP - Features

RDBMS used for detailed data stored in large databases MDDB used for fast, read/write OLAP analysis and calculations Scalability of RDBMS and MDDB performance Calculation engine provides full analysis features Source of data transparent to end user

12/20/2012
332
2009 Wipro Ltd - Confidential

332

Architecture Comparison

Definition:
  MOLAP: MDDB OLAP = transaction-level data + summary in the MDDB.
  ROLAP: Relational OLAP = transaction-level data + summary in the RDBMS.
  HOLAP: Hybrid OLAP = ROLAP + summary in an MDDB.

Data explosion due to sparsity:
  MOLAP: High (may go beyond control; estimation is very important).
  ROLAP: No sparsity.
  HOLAP: Sparsity exists only in the MDDB part.

Data explosion due to summarization:
  MOLAP: With good design, 3-10 times.
  ROLAP: To the necessary extent.
  HOLAP: To the necessary extent.

Query execution speed:
  MOLAP: Fast (depends upon the size of the MDDB).
  ROLAP: Slow.
  HOLAP: Optimum; if the data is fetched from the RDBMS it behaves like ROLAP, otherwise like MOLAP.

Cost:
  MOLAP: Medium (MDDB server + large disk space cost).
  ROLAP: Low (only RDBMS + disk space cost).
  HOLAP: High (RDBMS + disk space + MDDB server cost).

Where to apply:
  MOLAP: Small transactional data + complex model, where the data needs to be viewed / sorted.
  ROLAP: Very large transactional data + frequent summary analysis.
  HOLAP: Large transactional data + frequent summary analysis.

12/20/2012
333
2009 Wipro Ltd - Confidential

333

Representative OLAP Tools:

Oracle Express products
Hyperion Essbase
Cognos PowerPlay
Seagate Holos
SAS
MicroStrategy DSS Agent
Informix MetaCube
Brio Query
Business Objects / Web Intelligence

12/20/2012
334
2009 Wipro Ltd - Confidential

334

Sample OLAP Applications

Sales Analysis Financial Analysis Profitability Analysis Performance Analysis Risk Management Profiling & Segmentation Scorecard Application NPA Management Strategic Planning Customer Relationship Management (CRM)
12/20/2012
335
2009 Wipro Ltd - Confidential

335



Trends in the Metadata Management Tools


Metadata Interchange Standards CDIF (CASE Data Interchange Format)
Most frequently used interchange standard Addresses only a limited subset of metadata artifacts

OMG (Object Management Group)-CWM


XML-addresses context and data meaning, not presentation Can enable exchange over the web employing industry standards for storing and sharing programming data Will allow sharing of UML and MOF objects b/w various development tools and repositories

MDC (Metadata Coalition)


Based on XML/UML standards Promoted by Microsoft Along With 20 partners including Object Management Group (OMG), Oracle Carleton Group, CA-PLATINUM Technology (Founding Member), Viasoft
351
2009 Wipro Ltd - Confidential

OLAP

352

2009 Wipro Ltd - Confidential

Agenda
OLAP Definition Distinction between OLTP and OLAP

MDDB Concepts
Implementation Techniques Architectures

Features
Representative Tools

12/20/2012

353

353

2009 Wipro Ltd - Confidential

OLAP: On-Line Analytical Processing


OLAP can be defined as a technology which allows the users to view the aggregate data across measurements (like Maturity Amount, Interest Rate etc.) along with a set of related parameters called dimensions (like Product, Organization, Customer, etc.) Used interchangeably with BI Multidimensional view of data is the foundation of OLAP Users :Analysts, Decision makers
12/20/2012 354

354

2009 Wipro Ltd - Confidential

Distinction between OLTP and OLAP


OLTP System Source of data Operational data; OLTPs are the original source of the data To control and run fundamental business tasks A snapshot of ongoing business processes Short and fast inserts and updates initiated by end users
2009 Wipro Ltd - Confidential

OLAP System Consolidation data; OLAP data comes from the various OLTP databases Decision support

Purpose of data

What the data reveals Inserts and Updates


12/20/2012

Multi-dimensional views of various kinds of business activities Periodic long-running batch jobs refresh the 355 data

355

MDDB Concepts
A multidimensional database is a computer software system designed to allow for efficient and convenient storage and retrieval of data that is intimately related and stored, viewed and analyzed from different perspectives (Dimensions). A hypercube represents a collection of multidimensional data. The edges of the cube are called dimensions Individual items within each dimensions are called members

356

2009 Wipro Ltd - Confidential

RDBMS v/s MDDB: Increased Complexity...


Relational DBMS
MODEL MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN MINI VAN SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SPORTS COUPE SEDAN SEDAN SEDAN ... COLOR BLUE BLUE BLUE RED RED RED WHITE WHITE WHITE BLUE BLUE BLUE RED RED RED WHITE WHITE WHITE BLUE BLUE BLUE DEALER Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr Clyde Gleason Carr VOL. 6 3 2 5 3 1 3 1 4 3 3 3 4 3 6 2 3 5 4 3 2 ...

MDDB

Sales Volumes

M O D E L

Mini Van

Coupe Carr Gleason Clyde Blue Red White

Sedan

DEALERSHIP

COLOR

27 x 4 = 108 cells
357

3 x 3 x 3 = 27 cells

2009 Wipro Ltd - Confidential

Benefits of MDDB over RDBMS


Ease of Data Presentation & Navigation A great deal of information is gleaned immediately upon direct inspection of the array User is able to view data along presorted dimensions with data arranged in an inherently more organized, and accessible fashion than the one offered by the relational table. Storage Space Very low Space Consumption compared to Relational DB Performance Gives much better performance. Relational DB may give comparable results only through database tuning (indexing, keys etc), which may not be possible for ad-hoc queries. Ease of Maintenance No overhead as data is stored in the same way it is viewed. In Relational DB, indexes, sophisticated joins etc. are used which require considerable storage and maintenance
12/20/2012
358
2009 Wipro Ltd - Confidential

358

Issues with MDDB

Sparsity
- Input data in applications are typically sparse -Increases with increased dimensions

Data Explosion
-Due to Sparsity -Due to Summarization

Performance
-Doesnt perform better than RDBMS at high data volumes (>20-30 GB)

12/20/2012
359
2009 Wipro Ltd - Confidential

359

Issues with MDDB - Sparsity Example If dimension members of different dimensions Employee Age do not interact , then blank cell is left behind. LAST NAME EMP# AGE
Smith

SMITH REGAN FOX WELD KELLY LINK KRANZ LUCUS WEISS

M O D E L

01 21 12 Sales Volumes 19 31 63 Miini Van 14 6 5 31 4 54 3 5 27 Coupe 5 03 56 4 3 2 Sedan 41 45 Blue Red White 33 COLOR 41 23 19

21

Regan

19 63 31 27 56 45 41 19
31 41 23 01 14 54 03 12 33

Fox

L A S T N A M E

Weld

Kelly

Link

Kranz

Lucas

Weiss

EMPLOYEE #

12/20/2012
360
2009 Wipro Ltd - Confidential

360

OLAP Features
Calculations applied across dimensions, through hierarchies and/or across members Trend analysis over sequential time periods, What-if scenarios. Slicing / Dicing subsets for on-screen viewing Rotation to new dimensional comparisons in the viewing area Drill-down/up along the hierarchy Reach-through / Drill-through to underlying detail data

12/20/2012
361
2009 Wipro Ltd - Confidential

361

Features of OLAP - Rotation

Complex Queries & Sorts in Relational environment translated to simple rotation.


Sales Volumes

M O D E L

Mini Van

6 3 4
Blue

5 5 3
Red

4 5 2
White

Coupe

C O L O R ( ROTATE 90 )
o

Blue

6 5 4

3 5 5
MODEL

4 3 2
Sedan

Red

Sedan

White

Mini Van Coupe

COLOR

View #1

View #2

2 dimensional array has 2 views.


12/20/2012
362
2009 Wipro Ltd - Confidential

362

Features of OLAP - Rotation


Sales Volumes

M O D E L

Mini Van Coupe Carr Gleason Clyde Blue Red White

Sedan

C O L O R

Blue

Red White Sedan Coupe Mini Van Carr Gleason Clyde

C O L O R

Blue

Red White Carr Gleason Clyde Mini Van Coupe Sedan

COLOR

( ROTATE 90 )

MODEL

( ROTATE 90 )

DEALERSHIP

( ROTATE 90 )

DEALERSHIP

DEALERSHIP

MODEL

View #1
D E A L E R S H I P D E A L E R S H I P

View #2

View #3

Carr Gleason Mini Van Coupe Sedan White Red Blue

Carr Gleason Blue Red White

Mini Van

Clyde

Clyde Mini Van Coupe Sedan

M O D E L

Coupe Blue Red White Clyde Gleason Carr

Sedan

COLOR

( ROTATE 90 )

MODEL

( ROTATE 90 )

DEALERSHIP

MODEL

COLOR

COLOR

View #4

View #5

View #6

3 dimensional array has 6 views.


12/20/2012
363
2009 Wipro Ltd - Confidential

363

Features of OLAP - Slicing / Filtering


MDDB allows end user to quickly slice in on exact view of the data required.

Sales Volumes

M O D E L

Mini Van Mini Van

Coupe

Coupe Normal Metal Blue Blue

Carr Clyde

Carr Clyde

Normal Blue

Metal Blue

DEALERSHIP

COLOR
12/20/2012
364
2009 Wipro Ltd - Confidential

364

Features of OLAP - Drill Down / Up

ORGANIZATION DIMENSION
REGION Midwest

DISTRICT

Chicago

St. Louis

Gary

DEALERSHIP

Clyde

Gleason

Carr

Levi

Lucas

Bolton

Sales at region/District/Dealership Level

Moving Up and moving down in a hierarchy is referred to as drill-up / roll-up and drill-down

12/20/2012
365
2009 Wipro Ltd - Confidential

365

OLAP Reporting - Drill Down

Inflows ( Region , Year)


200 150 Inflows 100 ($M) 50 0 Year Year 1999 2000 Years

East West Central

12/20/2012
366
2009 Wipro Ltd - Confidential

366

OLAP Reporting - Drill Down

Inflows ( Region , Year - Year 1999)


90 80 70 60 50 Inflows ( $M) 40 30 20 10 0

East West Central

1st Qtr

2nd Qtr 3rd Qtr Year 1999

4th Qtr

Drill-down from Year to Quarter


12/20/2012
367
2009 Wipro Ltd - Confidential

367

OLAP Reporting - Drill Down

Inflows ( Region , Year - Year 1999 - 1st Qtr)


20 15 Inflows ( $M 10 ) 5

East West Central


January February March Year 1999

Drill-down from Quarter to Month

368

2009 Wipro Ltd - Confidential

Implementation Techniques -OLAP Architectures

MOLAP - Multidimensional OLAP


Multidimensional Databases for database and application logic layer

ROLAP - Relational OLAP


Access Data stored in relational Data Warehouse for OLAP Analysis. Database and Application logic provided as separate layers

HOLAP - Hybrid OLAP


OLAP Server routes queries first to MDDB, then to RDBMS and result processed on-the-fly in Server

DOLAP - Desk OLAP


Personal MDDB Server and application on the desktop

12/20/2012
369
2009 Wipro Ltd - Confidential

369

MOLAP - MDDB storage

OLAP
Cube
OLAP Calculation Engine

Web Browser

OLAP Tools

OLAP Appli cations


12/20/2012
370
2009 Wipro Ltd - Confidential

370

MOLAP - Features

Powerful analytical capabilities (e.g., financial, forecasting, statistical) Aggregation and calculation capabilities Read/write analytic applications Specialized data structures for
Maximum query performance. Optimum space utilization.
12/20/2012
371
2009 Wipro Ltd - Confidential

371

ROLAP - Standard SQL storage

MDDB - Relational Mapping

Relational DW

Web Browser
OLAP Calculation Engine

SQL

OLAP Tools

OLAP Applications
12/20/2012
372
2009 Wipro Ltd - Confidential

372

ROLAP - Features Three-tier hardware/software architecture:


GUI on client; multidimensional processing on midtier server; target database on database server Processing split between mid-tier & database servers

Ad hoc query capabilities to very large databases DW integration Data scalability

12/20/2012
373
2009 Wipro Ltd - Confidential

373

HOLAP - Combination of RDBMS and MDDB


OLAP Cube

Any Client

Relational DW

Web Browser
OLAP Calculation Engine

SQL

OLAP Tools

OLAP Applications
12/20/2012
374
2009 Wipro Ltd - Confidential

374

HOLAP - Features

RDBMS used for detailed data stored in large databases MDDB used for fast, read/write OLAP analysis and calculations Scalability of RDBMS and MDDB performance Calculation engine provides full analysis features Source of data transparent to end user

12/20/2012
375
2009 Wipro Ltd - Confidential

375

Architecture Comparison

MOLAP
Definition

ROLAP

HOLAP
Hybrid OLAP = ROLAP + summary in MDDB Sparsity exists only in MDDB part To the necessary extent

MDDB OLAP = Relational OLAP = Transaction level data + Transaction level data + summary in MDDB summary in RDBMS Good Design 3 10 times High (May go beyond control. Estimation is very important) Fast - (Depends upon the size of the MDDB) No Sparsity To the necessary extent

Data explosion due to Sparsity Data explosion due to Summarization Query Execution Speed

Slow

Optimum - If the data is fetched from RDBMS then its like ROLAP otherwise like MOLAP. High: RDBMS + disk space + MDDB Server cost Large transactional data + frequent summary analysis

Cost

Medium: MDDB Server + large disk space cost

Low: Only RDBMS + disk space cost

Where to apply?

Small transactional Very large transactional data + complex model + data & it needs to be frequent summary viewed / sorted analysis

12/20/2012
376
2009 Wipro Ltd - Confidential

376

Representative OLAP Tools:

Oracle Express Products Hyperion Essbase Cognos -PowerPlay Seagate - Holos SAS

Micro Strategy - DSS Agent Informix MetaCube Brio Query Business Objects / Web Intelligence

12/20/2012
377
2009 Wipro Ltd - Confidential

377

Sample OLAP Applications

Sales Analysis Financial Analysis Profitability Analysis Performance Analysis Risk Management Profiling & Segmentation Scorecard Application NPA Management Strategic Planning Customer Relationship Management (CRM)
12/20/2012
378
2009 Wipro Ltd - Confidential

378

Data Warehouse Testing

379

2009 Wipro Ltd - Confidential

Data Warehouse Testing Overview


There is an exponentially increasing cost associated with finding software defects later in the development lifecycle. In data warehousing, this is compounded because of the additional business costs of using incorrect data to make critical business decisions

The methodology required for testing a Data Warehouse is different from testing a typical transaction system

380

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Data warehouse testing is different on the following counts: User-Triggered vs. System triggered Volume of Test Data Possible scenarios/ Test Cases Programming for testing challenge

381

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System.


User-Triggered vs. System triggered

In data Warehouse, most of the testing is system triggered. Most of the production/Source system testing is the processing of individual transactions, which are driven by some input from the users (Application Form, Servicing Request.). There are very few test cycles, which cover the system-triggered scenarios (Like billing, Valuation.)

382

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Volume of Test Data The test data in a transaction system is a very small sample of the overall production data. Data Warehouse has typically large test data as one does try to fill-up maximum possible combination of dimensions and facts. Possible scenarios/ Test Cases In case of Data Warehouse, the permutations and combinations one can possibly test is virtually unlimited due to the core objective of Data Warehouse is to allow all possible views of data.

383

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Programming for testing challenge In case of transaction systems, users/business analysts typically test the output of the system. In case of data warehouse, most of the 'Data Warehouse data Quality testing' and ETL testing is done at backend by running separate stand-alone scripts. These scripts compare preTransformation to post Transformation of data.

384

2009 Wipro Ltd - Confidential

Data Warehouse Testing Process


Data-Warehouse testing is basically divided into two parts : 'Back-end' testing where the source systems data is compared to the endresult data in Loaded area 'Front-end' testing where the user checks the data by comparing their MIS with the data displayed by the end-user tools like OLAP. Testing phases consists of : Requirements testing Unit testing Integration testing Performance testing Acceptance testing

385

2009 Wipro Ltd - Confidential

Requirements testing
The main aim for doing Requirements testing is to check stated requirements for completeness. Requirements can be tested on following factors. Are the requirements Complete? Are the requirements Singular? Are the requirements Ambiguous? Are the requirements Developable? Are the requirements Testable?

386

2009 Wipro Ltd - Confidential

Unit Testing
Unit testing for data warehouses is WHITEBOX. It should check the ETL procedures/mappings/jobs and the reports developed. Unit testing the ETL procedures: Whether ETLs are accessing and picking up right data from right source.

All the data transformations are correct according to the business rules and data warehouse is correctly populated with the transformed data.
Testing the rejected records that dont fulfil transformation rules.

387

2009 Wipro Ltd - Confidential

Unit Testing
Unit Testing the Report data:

Verify Report data with source: Data present in a data warehouse will be stored at an aggregate level compare to source systems. QA team should verify the granular data stored in data warehouse against the source data available Field level data verification: QA team must understand the linkages for the fields displayed in the report and should trace back and compare that with the source systems Derivation formulae/calculation rules should be verified

388

2009 Wipro Ltd - Confidential

Integration Testing
Integration testing will involve following:

Sequence of ETLs jobs in batch. Initial loading of records on data warehouse. Incremental loading of records at a later date to verify the newly inserted or updated data. Testing the rejected records that dont fulfil transformation rules. Error log generation

389

2009 Wipro Ltd - Confidential

Performance Testing
Performance Testing should check for : ETL processes completing within time window.

Monitoring and measuring the data quality issues.


Refresh times for standard/complex reports.

390

2009 Wipro Ltd - Confidential

Acceptance testing
Here the system is tested with full functionality and is expected to function as in production. At the end of UAT, the system should be acceptable to the client for use in terms of ETL process integrity and business functionality and reporting.

391

2009 Wipro Ltd - Confidential

Questions

392

2009 Wipro Ltd - Confidential

Thank You

393

2009 Wipro Ltd - Confidential

Components of Warehouse
Source Tables: These are real-time, volatile data in relational databases for transaction processing (OLTP). These can be any relational databases or flat files. ETL Tools: To extract, cleansing, transform (aggregates, joins) and load the data from sources to target. Maintenance and Administration Tools: To authorize and monitor access to the data, set-up users. Scheduling jobs to run on offshore periods. Modeling Tools: Used for data warehouse design for high-performance using dimensional data modeling technique, mapping the source and target files. Databases: Target databases and data marts, which are part of data warehouse. These are structured for analysis and reporting purposes. End-user tools for analysis and reporting: get the reports and analyze the data from target tables. Different types of Querying, Data Mining, OLAP tools are used for this purpose.

394

2009 Wipro Ltd - Confidential

Data Warehouse Architecture


This is a basic design, where there are source files, which are loaded to a warehouse and users query the data for different purposes.

This has a staging area, where the data after cleansing, transforming is loaded and tested here. Later is directly loaded to the target database/warehouse. Which is divided to data marts and can be accessed by different users for their reporting and analyzing purposes.

395

2009 Wipro Ltd - Confidential

Data Modeling
Effective way of using a Data Warehouse

396

2009 Wipro Ltd - Confidential

Data Modeling Commonly E-R Data Model is used in OLTP, In OLAP Dimensional Data Model is used commonly. E-R (Entity-Relationship) Data Model
Entity: Object that can be observed and classified based on its properties and characteristics. Like employee, book, student Relationship: relating entities to other entities.

Different Perceptive of Data Modeling.


o Conceptual Data Model o Logical Data Model o Physical Data Model
397

2009 Wipro Ltd - Confidential

Terms used in Dimensional Data Model


To understand dimensional data modeling, let's define some of the terms commonly used in this type of modeling: Dimension: A category of information. For example, the time dimension. Attribute: A unique level within a dimension. For example, Month is an attribute in the Time Dimension. Hierarchy: The specification of levels that represents relationship between different attributes within a dimension. For example, one possible hierarchy in the Time dimension is Year Quarter Month Day. Fact Table: A table that contains the measures of interest. Lookup Table: It provides the detailed information about the attributes. For example, the lookup table for the Quarter attribute would include a list of all of the quarters available in the data warehouse. Surrogate Keys: To avoid the data integrity, surrogate keys are used. They are helpful for Slow Changing Dimensions and act as index/primary keys.
A dimensional model includes fact tables and lookup tables. Fact tables connect to one or more lookup tables, but fact tables do not have direct relationships to one another. Dimensions and hierarchies are represented by lookup tables. Attributes are the non-key 2009 Wipro Ltd - Confidential columns in the lookup tables.

398

Star Schema
Dimension Table
product prodId p1 p2 name price bolt 10 nut 5

Dimension Table
store storeId c1 c2 c3 city nyc sfo la

Fact Table
sale oderId date o100 1/7/97 o102 2/7/97 105 3/8/97 custId 53 53 111 prodId p1 p2 p1 storeId c1 c1 c3 qty 1 2 5 amt 12 11 50

Dimension Table
customer custId 53 81 111 name joe fred sally address 10 main 12 main 80 willow city sfo sfo la

399

2009 Wipro Ltd - Confidential

Snowflake Schema
Dimension Table Fact Table
store storeId s5 s7 s9 cityId sfo sfo la tId t1 t2 t1 mgr joe fred nancy
sType tId t1 t2 city size small large location downtown suburbs regId north south

Dimension Table
cityId pop sfo 1M la 5M

The star and snowflake schema are most commonly found in dimensional data warehouses and data marts where speed of data retrieval is more important than the efficiency of data manipulations. As such, the tables in these schema are not normalized much, and are frequently designed at a level of normalization short of third normal form.

region regId name north cold region south warm region

400

2009 Wipro Ltd - Confidential

Overview of Data Cleansing

401

2009 Wipro Ltd - Confidential

The Need For Data Quality Difficulty in decision making Time delays in operation Organizational mistrust Data ownership conflicts Customer attrition Costs associated with

402

error detection error rework customer service fixing customer problems


2009 Wipro Ltd - Confidential

Six Steps To Data Quality


Understand Information Flow In Organization
Identify authoritative data sources Interview Employees & Customers

Identify Potential Problem Areas & Asses Impact

Data Entry Points


Cost of bad data

Measure Quality Of Data

Use business rule discovery tools to identify data with inconsistent,

missing, incomplete, duplicate or incorrect values


Use data cleansing tools to clean data at the source Load only clean data into the data warehouse

Clean & Load Data

Continuous Monitoring

Schedule Periodic Cleansing of Source Data

Identify Areas of Improvement

Identify & Correct Cause of Defects Refine data capture mechanisms at source Educate users on importance of DQ
2009 Wipro Ltd - Confidential

403

Data Quality Solution Customized Programs Strengths:


Addresses specific needs No bulky one time investment

Limitations
Tons of Custom programs in different environments are difficult to manage Minor alterations demand coding efforts

Data Quality Assessment tools Strength


Provide automated assessment

Limitation
404
2009 Wipro Ltd - Confidential

Data Quality Solution


Business Rule Discovery tools Strengths
Detect Correlation in data values Can detect Patterns of behavior that indicate fraud

Limitations
Not all variables can be discovered Some discovered rules might not be pertinent There may be performance problems with large files or with many fields.

Data Reengineering & Cleansing tools Strengths


Usually are integrated packages with cleansing features as Add-on
405
2009 Wipro Ltd - Confidential

Tools In The Market Business Rule Discovery Tools


Integrity Data Reengineering Tool from Vality Technology Trillium Software System from Harte -Hanks Data Technologies Migration Architect from DB Star

Data Reengineering & Cleansing Tools


Carlton Pureview from Oracle ETI-Extract from Evolutionary Technologies PowerMart from Informatica Corp Sagent Data Mart from Sagent Technology

Data Quality Assessment Tools


Migration Architect, Evoke Axio from Evoke Software Wizrule from Wizsoft

Name & Address Cleansing Tools


406

Centrus Suite from Sagent I.d.centric from First Logic


2009 Wipro Ltd - Confidential

Data Extraction, Transformation, Load

407

2009 Wipro Ltd - Confidential

ETL Architecture

Visitors

Web Browsers

The Internet

External Data Demographics, Household, Webographics, Income

Staging Area
Web Server Logs & E-comm Transaction Data Flat Files

Meta Data Repository

Scheduled Extraction

RDBMS

Clean Transform Match Merge

Scheduled Loading

Enterprise Data Warehouse

Other OLTP Systems

Data Collection

Data Extraction

Data Transformation

Data Loading

Data Storage & Integration

408

2009 Wipro Ltd - Confidential

ETL Architecture Data Extraction:


Rummages through a file or database Uses some criteria for selection Identifies qualified data and Transports the data over onto another file or database

Data transformation
Integrating dissimilar data types Changing codes Adding a time attribute Summarizing data Calculating derived values Renormalizing data

Data Extraction Cleanup

Data loading
Initial and incremental loading Updation of metadata

409

Restructuring of records or fields Removal of Operational-only data Supply of missing field values Data Integrity checks Data Consistency and Range checks, 2009 Wipro Ltd - Confidential

Why ETL ?
Companies have valuable data lying around throughout their networks that needs to be moved from one place to another. The data lies in all sorts of heterogeneous systems,and therefore in all sorts of formats. To solve the problem, companies use extract, transform and load (ETL) software.
410
2009 Wipro Ltd - Confidential

Major components involved in ETL Processing

411

2009 Wipro Ltd - Confidential

Major components involved in ETL Processing


Design manager Lets developers define source-to-target mappings, transformations, process flows, and jobs Meta data management Provides a repository to define, document, and manage information about the ETL design and runtime processes Extract The process of reading data from a database. Transform The process of converting the extracted data Load The process of writing the data into the target database. Transport services ETL tools use network and file protocols to move data between source and target systems and in-memory protocols to move data between ETL run-time components. Administration and operation ETL utilities let administrators schedule, run, monitor ETL jobs, log all events, manage errors, recover from failures, reconcile outputs with source systems
2009 Wipro Ltd - Confidential

412

ETL Tools
- Provide a facility to specify a large number of transformation rules with a GUI
- Generate programs to transform data
- Handle multiple data sources
- Handle data redundancy
- Generate metadata as output
- Most tools exploit parallelism by running on multiple low-cost servers in a multi-threaded environment
413
2009 Wipro Ltd - Confidential

Metadata Management

414

2009 Wipro Ltd - Confidential

What Is Metadata?
Metadata is Information...

- That describes the WHAT, WHEN, WHO, WHERE and HOW of the data warehouse
- About the data being captured and loaded into the warehouse
- Documented in IT tools that improve both business and technical understanding of data and data-related processes
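As a hypothetical example (not taken from the deck) of what such metadata might look like for a single warehouse column, expressed as plain Python data:

column_metadata = {
    "what":  {"table": "SALES_FACT", "column": "amount_usd", "datatype": "DECIMAL(12,2)"},
    "where": {"source_system": "orders_oltp", "source_column": "ORDERS.AMT"},
    "how":   {"transformation": "round(AMT * fx_rate, 2)", "etl_job": "load_sales_fact"},
    "when":  {"load_frequency": "daily", "last_loaded": "2009-06-30"},
    "who":   {"data_owner": "Finance", "data_steward": "dw_admin"},
}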

415

2009 Wipro Ltd - Confidential

Importance Of Metadata
Locating information
- Time spent looking for information
- How often is information found?
- What poor decisions were made based on incomplete information?
- How much money was lost or earned as a result?

Interpreting information
- How many times have businesses needed to rework or recall products? What impact does that have on the bottom line?
- How many mistakes were due to misinterpretation of existing documentation?
- How much misinterpretation results from too much metadata?
- How much time is spent trying to determine if any of the metadata is accurate?

Integrating information
- How do the various data perspectives connect together? How much time is spent trying to figure that out?
- How much does the inefficiency and lack of metadata affect decision making?

416

2009 Wipro Ltd - Confidential

Requirements for DW Metadata Management


- Provide a simple catalogue of business metadata descriptions and views
- Document/manage metadata descriptions from an integrated development environment
- Enable DW users to identify and invoke pre-built queries against the data stores
- Design and enhance new data models and schemas for the data warehouse
- Capture data transformation rules between the operational and data warehousing databases
- Provide change impact analysis and updates across these technologies
417
2009 Wipro Ltd - Confidential

Consumers of Metadata
Technical users
- Warehouse administrator
- Application developer

Business users (business metadata)
- Meanings
- Definitions
- Business rules

Software tools
- Used in DW life-cycle development
- Metadata requirements for each tool must be identified
- The tool-specific metadata should be analysed for inclusion in the enterprise metadata repository
- Previously captured metadata should be electronically transferred from the enterprise metadata repository to each individual tool

418

2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Third Party Bridging Tools

Oracle Exchange
- Technology of choice for a long list of repository, enterprise and workgroup vendors

Reischmann-Informatik Toolbus
- Features include facilitation of selective bridging of metadata

Ardent Software / Dovetail Software - Interplay
- Hub-and-spoke solution for enabling metadata interoperability
- Ardent focusing on its own engagements, not selling it as an independent product

Informix's Metadata Plug-ins
- Available with Ardent DataStage version 3.6.2 free of cost for ERwin, Oracle Designer, Sybase PowerDesigner, Brio, MicroStrategy
419
2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Metadata Repositories
- IBM, Oracle and Microsoft to offer free or near-free basic repository services
- Enable organisations to reuse metadata across technologies
- Integrate DB design, data transformation and BI tools from different vendors
- Multi-tool vendors are taking a bridged or federated, rather than integrated, approach to sharing metadata
- Both IBM and Oracle have multiple repositories for different lines of products, e.g. one for AD and one for DW, with bridges between them

420

2009 Wipro Ltd - Confidential

Trends in the Metadata Management Tools


Metadata Interchange Standards

CDIF (CASE Data Interchange Format)
- Most frequently used interchange standard
- Addresses only a limited subset of metadata artifacts

OMG (Object Management Group) - CWM
- XML addresses context and data meaning, not presentation
- Can enable exchange over the web employing industry standards for storing and sharing programming data
- Will allow sharing of UML and MOF objects between various development tools and repositories

MDC (Metadata Coalition)
- Based on XML/UML standards
- Promoted by Microsoft along with 20 partners, including Object Management Group (OMG), Oracle, Carleton Group, CA-PLATINUM Technology (founding member) and Viasoft
421
2009 Wipro Ltd - Confidential

OLAP

422

2009 Wipro Ltd - Confidential

Agenda
- OLAP Definition
- Distinction between OLTP and OLAP
- MDDB Concepts
- Implementation Techniques / Architectures
- Features
- Representative Tools

423

2009 Wipro Ltd - Confidential

OLAP: On-Line Analytical Processing


- OLAP can be defined as a technology that allows users to view aggregate data across measures (like Maturity Amount, Interest Rate, etc.) along with a set of related parameters called dimensions (like Product, Organization, Customer, etc.)
- Used interchangeably with BI
- A multidimensional view of data is the foundation of OLAP
- Users: analysts, decision makers

424

2009 Wipro Ltd - Confidential

Distinction between OLTP and OLAP


Source of data
- OLTP: Operational data; OLTP systems are the original source of the data
- OLAP: Consolidated data; OLAP data comes from the various OLTP databases

Purpose of data
- OLTP: To control and run fundamental business tasks
- OLAP: Decision support

What the data reveals
- OLTP: A snapshot of ongoing business processes
- OLAP: Multi-dimensional views of various kinds of business activities

Inserts and updates
- OLTP: Short and fast inserts and updates initiated by end users
- OLAP: Periodic long-running batch jobs refresh the data

425

2009 Wipro Ltd - Confidential
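To illustrate the contrast, here are two hypothetical statements (the table and column names are invented): the OLTP system handles one short transaction at a time, while the OLAP query aggregates across dimensions of the warehouse.

# OLTP: short, fast insert initiated by an end user
oltp_insert = """
INSERT INTO orders (order_id, customer_id, product_id, amount, order_date)
VALUES (:order_id, :customer_id, :product_id, :amount, :order_date)
"""

# OLAP: multi-dimensional aggregate view, typically refreshed and queried in batch
olap_query = """
SELECT d.year, p.product_line, r.region, SUM(f.amount) AS total_sales
FROM   sales_fact f
JOIN   date_dim d    ON f.date_key    = d.date_key
JOIN   product_dim p ON f.product_key = p.product_key
JOIN   region_dim r  ON f.region_key  = r.region_key
GROUP BY d.year, p.product_line, r.region
"""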

MDDB Concepts
A multidimensional database is a computer software system designed to allow efficient and convenient storage and retrieval of data that is intimately related and can be stored, viewed and analyzed from different perspectives (dimensions). A hypercube represents a collection of multidimensional data. The edges of the cube are called dimensions; the individual items within each dimension are called members.

426

2009 Wipro Ltd - Confidential

RDBMS v/s MDDB: Increased Complexity...

Figure: the same sales-volume data represented two ways.
- Relational DBMS: a flat table with columns MODEL, COLOR, DEALER and VOL., one row per combination of model (Mini Van, Sports Coupe, Sedan), color (Blue, Red, White) and dealer (Clyde, Gleason, Carr) - 27 rows x 4 columns = 108 cells.
- MDDB: a Sales Volumes cube with the dimensions MODEL, COLOR and DEALERSHIP - 3 x 3 x 3 = 27 cells.

427

2009 Wipro Ltd - Confidential
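A small Python sketch of the hypercube idea, reusing the model/color/dealership example above; it is illustrative only and not part of the deck.

# Dimensions and their members, from the example above
models  = ["Mini Van", "Sports Coupe", "Sedan"]
colors  = ["Blue", "Red", "White"]
dealers = ["Clyde", "Gleason", "Carr"]

# The hypercube: one cell for every (model, color, dealer) member combination
sales_volumes = {(m, c, d): 0 for m in models for c in colors for d in dealers}

sales_volumes[("Mini Van", "Blue", "Clyde")] = 6    # fill one cell

# 3 x 3 x 3 = 27 cells, versus 27 rows x 4 columns = 108 cells in the flat relational table
print(len(sales_volumes))   # 27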

Benefits of MDDB over RDBMS


- Ease of data presentation and navigation: a great deal of information is gleaned immediately upon direct inspection of the array. The user is able to view data along presorted dimensions, with the data arranged in an inherently more organized and accessible fashion than that offered by a relational table.
- Storage space: very low space consumption compared to a relational DB.
- Performance: gives much better performance. A relational DB may give comparable results only through database tuning (indexing, keys, etc.), which may not be possible for ad-hoc queries.
- Ease of maintenance: no overhead, as data is stored the same way it is viewed. In a relational DB, indexes, sophisticated joins, etc. are used, which require considerable storage and maintenance.

428

2009 Wipro Ltd - Confidential

Issues with MDDB

Sparsity
- Input data in applications is typically sparse
- Sparsity increases with increased dimensions

Data Explosion
- Due to sparsity
- Due to summarization

Performance
- Doesn't perform better than an RDBMS at high data volumes (>20-30 GB)

429

2009 Wipro Ltd - Confidential

Issues with MDDB - Sparsity Example

If dimension members of different dimensions do not interact, a blank cell is left behind.

Figure: an Employee Age table (LAST NAME, EMP#, AGE for Smith, Regan, Fox, Weld, Kelly, Link, Kranz, Lucas, Weiss) loaded into a cube with LAST NAME and EMPLOYEE # as dimensions; because each employee number pairs with only one last name, almost every cell of the cube is blank.

430

2009 Wipro Ltd - Confidential

OLAP Features
- Calculations applied across dimensions, through hierarchies and/or across members
- Trend analysis over sequential time periods
- What-if scenarios
- Slicing/dicing subsets for on-screen viewing
- Rotation to new dimensional comparisons in the viewing area
- Drill-down/up along the hierarchy
- Reach-through/drill-through to underlying detail data

431

2009 Wipro Ltd - Confidential
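The following pandas sketch (illustrative only; the deck does not prescribe any particular tool, and the numbers are invented) shows what slicing, rotation and drill-up look like on the sales-volume example used in the next slides.

import pandas as pd

# Toy fact data over the Model / Color / Dealership dimensions
df = pd.DataFrame({
    "model":  ["Mini Van", "Mini Van", "Coupe", "Coupe", "Sedan", "Sedan"],
    "color":  ["Blue", "Red", "Blue", "White", "Red", "White"],
    "dealer": ["Clyde", "Gleason", "Carr", "Clyde", "Gleason", "Carr"],
    "volume": [6, 5, 3, 4, 3, 2],
})

# Slicing: restrict one dimension to a single member (Blue only)
blue_only = df[df["color"] == "Blue"]

# Rotation: change which dimensions appear in the viewing area
view1 = df.pivot_table(index="model", columns="dealer", values="volume", aggfunc="sum")
view2 = df.pivot_table(index="model", columns="color", values="volume", aggfunc="sum")

# Drill-up: aggregate away the Color and Dealership dimensions to per-model totals
per_model = df.groupby("model")["volume"].sum()

print(blue_only, view1, view2, per_model, sep="\n\n")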

Features of OLAP - Rotation

Complex queries and sorts in a relational environment translate to simple rotation.

Figure: the Sales Volumes array shown as MODEL x COLOR (View #1) and, after a 90-degree rotation, as COLOR x MODEL (View #2). A 2-dimensional array has 2 views.

432

2009 Wipro Ltd - Confidential

Features of OLAP - Rotation

Figure: rotating the 3-dimensional Sales Volumes cube (MODEL x COLOR x DEALERSHIP) by 90 degrees at a time produces six distinct two-dimensional views (Views #1 to #6). A 3-dimensional array has 6 views.

433

2009 Wipro Ltd - Confidential

Features of OLAP - Slicing / Filtering

MDDB allows the end user to quickly slice in on the exact view of the data required.

Figure: the Sales Volumes cube sliced down to the Mini Van and Coupe models, the Normal Blue and Metal Blue colors, and the Carr and Clyde dealerships.

434

2009 Wipro Ltd - Confidential

Features of OLAP - Drill Down / Up

Figure: the Organization dimension as a hierarchy - REGION (Midwest), DISTRICT (Chicago, St. Louis, Gary), DEALERSHIP (Clyde, Gleason, Carr, Levi, Lucas, Bolton) - so sales can be viewed at the region, district or dealership level.

Moving up and moving down in a hierarchy is referred to as drill-up/roll-up and drill-down.

435

2009 Wipro Ltd - Confidential
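Continuing the pandas sketch, drill-down along the organization hierarchy above might look like this; the district-to-dealership assignments and the numbers are invented for illustration.

import pandas as pd

# Organization dimension: Region -> District -> Dealership (assignments assumed)
org_dim = pd.DataFrame({
    "dealership": ["Clyde", "Gleason", "Carr", "Levi", "Lucas", "Bolton"],
    "district":   ["Chicago", "Chicago", "St. Louis", "St. Louis", "Gary", "Gary"],
    "region":     ["Midwest"] * 6,
})

sales = pd.DataFrame({
    "dealership": ["Clyde", "Gleason", "Carr", "Levi", "Lucas", "Bolton"],
    "volume":     [13, 9, 11, 7, 5, 8],
})

fact = sales.merge(org_dim, on="dealership")

# Roll-up to region, then drill down to district, then to dealership
print(fact.groupby("region")["volume"].sum())
print(fact.groupby("district")["volume"].sum())
print(fact.groupby("dealership")["volume"].sum())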

OLAP Reporting - Drill Down

Figure: Inflows (Region, Year) - bar chart of Inflows ($M) by region (East, West, Central) for the years 1999 and 2000.

436

2009 Wipro Ltd - Confidential

OLAP Reporting - Drill Down

Figure: Inflows (Region, Year - Year 1999) - drill-down from year to quarter; Inflows ($M) by region (East, West, Central) for the four quarters of 1999.

437

2009 Wipro Ltd - Confidential

OLAP Reporting - Drill Down

Figure: Inflows (Region, Year - Year 1999 - 1st Qtr) - drill-down from quarter to month; Inflows ($M) by region (East, West, Central) for January, February and March of 1999.

438

2009 Wipro Ltd - Confidential

Implementation Techniques - OLAP Architectures

MOLAP - Multidimensional OLAP
- Multidimensional databases provide both the database and the application logic layer

ROLAP - Relational OLAP
- Accesses data stored in a relational data warehouse for OLAP analysis; the database and the application logic are provided as separate layers

HOLAP - Hybrid OLAP
- The OLAP server routes queries first to the MDDB, then to the RDBMS, and the results are processed on the fly in the server

DOLAP - Desktop OLAP
- A personal MDDB server and application on the desktop

439

2009 Wipro Ltd - Confidential

MOLAP - MDDB storage

Figure: an OLAP cube feeds the OLAP calculation engine, which serves web browsers, OLAP tools and OLAP applications.

440

2009 Wipro Ltd - Confidential

MOLAP - Features

- Powerful analytical capabilities (e.g., financial, forecasting, statistical)
- Aggregation and calculation capabilities
- Read/write analytic applications
- Specialized data structures for maximum query performance and optimum space utilization

441

2009 Wipro Ltd - Confidential

ROLAP - Standard SQL storage

Figure: the OLAP calculation engine uses an MDDB-to-relational mapping to issue SQL against the relational data warehouse, and serves web browsers, OLAP tools and OLAP applications.

442

2009 Wipro Ltd - Confidential

ROLAP - Features

Three-tier hardware/software architecture:
- GUI on the client; multidimensional processing on a mid-tier server; target database on a database server
- Processing split between the mid-tier and database servers

Other features:
- Ad hoc query capabilities against very large databases
- DW integration
- Data scalability

443

2009 Wipro Ltd - Confidential
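As a sketch of the mid-tier's job in a ROLAP architecture, the function below turns a multidimensional request into a SQL statement against the warehouse; the function, schema and column names are hypothetical and for illustration only.

def generate_rolap_sql(measures, dimensions, fact_table="sales_fact"):
    """Mid-tier ROLAP step: translate a multidimensional request into SQL."""
    select_list = ", ".join(dimensions + [f"SUM({m}) AS total_{m}" for m in measures])
    group_list = ", ".join(dimensions)
    return f"SELECT {select_list} FROM {fact_table} GROUP BY {group_list}"

print(generate_rolap_sql(["amount"], ["region", "product_line"]))
# SELECT region, product_line, SUM(amount) AS total_amount FROM sales_fact GROUP BY region, product_line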

HOLAP - Combination of RDBMS and MDDB

Figure: an OLAP cube (summary data) and a relational data warehouse (detail data, reached via SQL) both feed the OLAP calculation engine, which serves any client - web browsers, OLAP tools and OLAP applications.

444

2009 Wipro Ltd - Confidential

HOLAP - Features

- RDBMS used for detailed data stored in large databases
- MDDB used for fast, read/write OLAP analysis and calculations
- Combines the scalability of the RDBMS with the performance of the MDDB
- Calculation engine provides full analysis features
- The source of the data is transparent to the end user

445

2009 Wipro Ltd - Confidential

Architecture Comparison

Definition
- MOLAP: MDDB OLAP = transaction-level data + summary in the MDDB
- ROLAP: Relational OLAP = transaction-level data + summary in the RDBMS
- HOLAP: Hybrid OLAP = ROLAP + summary in an MDDB

Data explosion due to sparsity
- MOLAP: High (may go beyond control; estimation is very important)
- ROLAP: No sparsity
- HOLAP: Sparsity exists only in the MDDB part

Data explosion due to summarization
- MOLAP: With good design, 3-10 times
- ROLAP: To the necessary extent
- HOLAP: To the necessary extent

Query execution speed
- MOLAP: Fast (depends upon the size of the MDDB)
- ROLAP: Slow
- HOLAP: Optimum - if the data is fetched from the RDBMS it behaves like ROLAP, otherwise like MOLAP

Cost
- MOLAP: Medium - MDDB server + large disk space cost
- ROLAP: Low - only RDBMS + disk space cost
- HOLAP: High - RDBMS + disk space + MDDB server cost

Where to apply?
- MOLAP: Small transactional data + complex model + frequent summary analysis
- ROLAP: Very large transactional data that needs to be viewed/sorted
- HOLAP: Large transactional data + frequent summary analysis

446

2009 Wipro Ltd - Confidential

Representative OLAP Tools:

- Oracle Express products
- Hyperion Essbase
- Cognos PowerPlay
- Seagate Holos
- SAS
- MicroStrategy DSS Agent
- Informix MetaCube
- Brio Query
- Business Objects / WebIntelligence

447

2009 Wipro Ltd - Confidential

Sample OLAP Applications

- Sales Analysis
- Financial Analysis
- Profitability Analysis
- Performance Analysis
- Risk Management
- Profiling & Segmentation
- Scorecard Applications
- NPA Management
- Strategic Planning
- Customer Relationship Management (CRM)

448

2009 Wipro Ltd - Confidential

Data Warehouse Testing

449

2009 Wipro Ltd - Confidential

Data Warehouse Testing Overview


There is an exponentially increasing cost associated with finding software defects later in the development lifecycle. In data warehousing, this is compounded by the additional business cost of using incorrect data to make critical business decisions.

The methodology required for testing a data warehouse is different from that used for testing a typical transaction system.

450

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Data warehouse testing is different on the following counts:
- User-triggered vs. system-triggered
- Volume of test data
- Possible scenarios / test cases
- Programming for the testing challenge

451

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System.


User-triggered vs. system-triggered

In a data warehouse, most of the testing is system-triggered. Most production/source system testing covers the processing of individual transactions, which are driven by some input from the users (application form, servicing request, etc.). Very few test cycles cover system-triggered scenarios (like billing or valuation).

452

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Volume of test data
- The test data in a transaction system is a very small sample of the overall production data.
- A data warehouse typically has large test data, as one tries to cover the maximum possible combinations of dimensions and facts.

Possible scenarios / test cases
- In a data warehouse, the permutations and combinations one can possibly test are virtually unlimited, because the core objective of a data warehouse is to allow all possible views of the data.

453

2009 Wipro Ltd - Confidential

Difference In Testing Data warehouse and Transaction System


Programming for the testing challenge
- In transaction systems, users/business analysts typically test the output of the system.
- In a data warehouse, most of the data quality testing and ETL testing is done at the back end by running separate stand-alone scripts. These scripts compare pre-transformation data to post-transformation data.

454

2009 Wipro Ltd - Confidential

Data Warehouse Testing Process


Data warehouse testing is basically divided into two parts:
- 'Back-end' testing, where the source system data is compared to the end-result data in the loaded area
- 'Front-end' testing, where the user checks the data by comparing their MIS with the data displayed by the end-user tools, such as OLAP

Testing phases consist of:
- Requirements testing
- Unit testing
- Integration testing
- Performance testing
- Acceptance testing

455

2009 Wipro Ltd - Confidential

Requirements testing
The main aim of requirements testing is to check the stated requirements for completeness. Requirements can be tested on the following factors:
- Are the requirements complete?
- Are the requirements singular?
- Are the requirements unambiguous?
- Are the requirements developable?
- Are the requirements testable?

456

2009 Wipro Ltd - Confidential

Unit Testing
Unit testing for data warehouses is white-box testing. It should check the ETL procedures/mappings/jobs and the reports developed.

Unit testing the ETL procedures:
- Whether the ETLs access and pick up the right data from the right source
- Whether all data transformations are correct according to the business rules, and the data warehouse is correctly populated with the transformed data
- Testing the rejected records that don't fulfil the transformation rules

(A sketch of such a back-end check follows below.)
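A minimal sketch of such a stand-alone back-end check, assuming the hypothetical SQLite source and target tables from the earlier ETL example; a real project would use its own schema and test framework.

import sqlite3

def test_row_counts_and_rules(src_db="source.db", tgt_db="warehouse.db"):
    src = sqlite3.connect(src_db)
    tgt = sqlite3.connect(tgt_db)

    # 1. Completeness: every qualified source row should reach the target
    src_count = src.execute(
        "SELECT COUNT(*) FROM orders WHERE status = 'ACTIVE'").fetchone()[0]
    tgt_count = tgt.execute("SELECT COUNT(*) FROM sales_fact").fetchone()[0]
    assert src_count == tgt_count, f"row count mismatch: {src_count} vs {tgt_count}"

    # 2. Business rule: transformed region codes must belong to the allowed set
    bad = tgt.execute(
        "SELECT COUNT(*) FROM sales_fact "
        "WHERE region NOT IN ('NORTH', 'SOUTH', 'UNKNOWN')").fetchone()[0]
    assert bad == 0, f"{bad} rows violate the region rule"

    # 3. Reconciliation: pre- vs post-transformation totals should match
    src_sum = src.execute(
        "SELECT ROUND(SUM(amount), 2) FROM orders WHERE status = 'ACTIVE'").fetchone()[0]
    tgt_sum = tgt.execute("SELECT ROUND(SUM(amount), 2) FROM sales_fact").fetchone()[0]
    assert src_sum == tgt_sum, f"amount totals differ: {src_sum} vs {tgt_sum}"

if __name__ == "__main__":
    test_row_counts_and_rules()
    print("ETL unit checks passed")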

457

2009 Wipro Ltd - Confidential

Unit Testing
Unit testing the report data:
- Verify report data against the source: data in a data warehouse is usually stored at an aggregate level compared to the source systems; the QA team should verify the granular data stored in the data warehouse against the source data available.
- Field-level data verification: the QA team must understand the linkages for the fields displayed in the report, and should trace them back and compare them with the source systems.
- Derivation formulae / calculation rules should be verified.

458

2009 Wipro Ltd - Confidential

Integration Testing
Integration testing will involve the following:
- Sequence of ETL jobs in a batch
- Initial loading of records into the data warehouse
- Incremental loading of records at a later date, to verify the newly inserted or updated data
- Testing the rejected records that don't fulfil the transformation rules
- Error log generation

(A sketch follows below.)
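An illustrative sketch of an integration-level check, again assuming the hypothetical warehouse.db/sales_fact target from the earlier examples; the job callables here are placeholders for the real ETL jobs run in sequence.

import sqlite3

def run_batch(jobs, conn):
    # Sequence of ETL jobs in a batch: dimensions first, then facts
    for job in jobs:
        job(conn)

def check_incremental_load(conn, before_count, new_rows_expected):
    # Incremental load should add exactly the newly arrived rows
    after_count = conn.execute("SELECT COUNT(*) FROM sales_fact").fetchone()[0]
    added = after_count - before_count
    assert added == new_rows_expected, f"expected {new_rows_expected} new rows, got {added}"

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")

    def load_dims(c):        # placeholder for the real dimension-load job
        pass

    def load_sales_fact(c):  # placeholder for the real fact-load job
        pass

    before = conn.execute("SELECT COUNT(*) FROM sales_fact").fetchone()[0]
    run_batch([load_dims, load_sales_fact], conn)
    check_incremental_load(conn, before, new_rows_expected=0)  # placeholders load nothing
    print("integration checks passed")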

459

2009 Wipro Ltd - Confidential

Performance Testing
Performance testing should check for:
- ETL processes completing within the time window
- Monitoring and measuring of data quality issues
- Refresh times for standard/complex reports

460

2009 Wipro Ltd - Confidential

Acceptance testing
Here the system is tested with full functionality and is expected to function as in production. At the end of UAT, the system should be acceptable to the client for use, in terms of ETL process integrity, business functionality and reporting.

461

2009 Wipro Ltd - Confidential

Questions

462

2009 Wipro Ltd - Confidential

Thank You

463

2009 Wipro Ltd - Confidential
