Training Introduction/Logistics
Lesson Agenda
Trainer introduction:
- Over 10 years of experience with DWH/BI Oracle technologies and building solutions using those technologies
- 6 years of experience with OBIEE (formerly Siebel Analytics): implementations and training
- Worked for global companies (Siebel, Oracle) in consulting practice
A few words about you (participants):
- Name
- Role
- Prior experience with BI (particularly OBIEE) and DWH
The training is aimed mainly at:
- Business Intelligence developers
- DWH/data designers
- BI architects
- Business analysts
Training Audience
Desirable experience/knowledge for participants:
- Some BI/DWH experience
- Knowledge of data modeling principles, ideally Kimball dimensional modeling
- At least basic knowledge of the SQL language
Training Prerequisites
- Gain basic knowledge of the OBIEE10g architecture and its components
- Gain knowledge of the metadata structure (layers)
- Gain basic knowledge of the BI Administration Tool (client software used for designing metadata)
- Learn basic administration routines and best practices
- Gain (through practical labs) the essential knowledge needed to independently build OBIEE10g metadata on top of relational data sources
Training Objectives
Day 1:
- Training Introduction/Logistics
- OBIEE10g Architecture / Repository Introduction
- Physical Layer of a Repository
- Business Model and Mapping Layer of a Repository
- Presentation Layer of a Repository
Day 2:
- Testing and Validating a Repository
- Adding Multiple Logical Table Sources
- Adding Calculations to a Fact
- Creating Dimension Hierarchies and Level-Based Measures
Training Agenda
Day 3:
- Using Aggregates
- Using Repository Variables
- Modeling Time Series Data
- Security Basics
Training Agenda
- Become familiar with the OBIEE10g architecture and its components
- Become familiar with the 3 layers of the OBIEE10g repository (metadata)
- Learn about the BI Administration Tool
Lesson Objectives
[Architecture diagram: client tools (Ad-hoc Analysis, Disconnected Analytics, MS Office Plug-in, Web Services) sit on top of the Oracle BI Server, which provides the enterprise business model and abstraction layer, intelligent caching services, and a multidimensional calculation and integration engine; the Oracle BI Server generates one or more physical SQL queries against the data sources in the Physical layer.]
10
Main OBIEE10g architecture components:
- Clients
- BI Presentation Services
- BI Server
- Data sources
11
Provide access to BI content (requests, dashboards) within the OBIEE10g UI:
- Answers: tool for creating, modifying and viewing requests (a request is a synonym for a report)
- Interactive Dashboards: display Answers requests along with other content on dashboard pages, also allowing interactivity for the exposed requests (filtering, navigation)
12
Takes care of rendering the OBIEE10g web-based UI:
- Answers
- Interactive Dashboards
- Delivers
- Administration
Maintains the catalog for storage of user-created content (requests, dashboards, ...): the Presentation Catalog
Gets the data from the Oracle BI Server and provides it to the clients
13
- Core component of the OBIEE10g architecture (the heart of the system)
- Uses metadata (the repository) during data processing
- Communicates with the underlying data sources
- For relational data sources it generates dynamic SQL SELECT statements to get the data
- Uses native connectivity (OCI for Oracle) or generic ODBC connectivity to connect to data sources
- Provides the results of the queries (possibly enriched by its own calculations) to its clients (the main client is BI Presentation Services)
- Acts as a virtual data source to its surroundings (accessible via an ODBC interface)
OBIEE10g Metadata Modeling Basics 14
- Contains the metadata used by the BI Server
- Physically takes the form of a binary file with the RPD extension
- Maps the particular data sources used for reporting and transforms them into a semantic model (business model)
- The repository is created/modified with a client tool, the BI Administration Tool (available on the Windows platform only)
15
- Contain data/information relevant to business users for analysis/reporting
- The BI Server gets data from the data sources
- The BI Server can work with different data sources:
  - Relational databases (RDBMS), the most common
  - OLAP data sources (Essbase, MSAS, SAP BW, ...)
  - XML files
  - Excel files (on the Windows platform only, mainly for demonstration purposes)
- The BI Server knows the capabilities of the individual relational data sources, so it can generate the most effective SQL statements
- The most suitable data source for BI is a DWH (integrated data with a data model of suitable form, i.e. a Kimball dimensional model)
16
1 - The user submits a request (report)
2 - BI Presentation Services asks BI Server for the data (in the form of logical SQL)
3 - BI Server uses the metadata (repository) to address the correct data source and generates optimized SQL to get the data
4 - BI Server receives the data from the data sources and performs any post-processing needed (internal calculations, internal joins of data sets, ...)
5 - BI Server passes the data to Presentation Services (the result of the logical SQL sent to BI Server)
6 - BI Presentation Services formats the data and sends it to the client (Answers, Dashboards)
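The request flow above can be made concrete by pairing the two SQL forms: the logical SQL of step 2 names only presentation objects, and the physical SQL of step 3 is what the BI Server derives from it using the repository. All object names here are illustrative, loosely based on the SH sample schema used in the labs:

```sql
-- Step 2: logical SQL sent by Presentation Services to the BI Server
SELECT Channels."Channel Name", Sales."Amount Sold"
FROM   "SH";

-- Step 3: physical SQL generated by the BI Server for the Oracle source;
-- joins and GROUP BY come from the repository metadata
SELECT   CH.CHANNEL_DESC, SUM(S.AMOUNT_SOLD)
FROM     SH.CHANNELS CH, SH.SALES S
WHERE    CH.CHANNEL_ID = S.CHANNEL_ID
GROUP BY CH.CHANNEL_DESC;
```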
17
Tool for building BI Server metadata, divided into 3 layers:
- Physical
- Business Model and Mapping (BMM)
- Presentation
18
[Diagram: the business model layer holds dimensions, hierarchies, measures, calculations, aggregation rules and time series; the Physical layer maps the physical data, connections and schemas.]
19
20
- Contains objects representing the physical data sources used for reporting/analysis by the BI Server
- Can map multiple (even disparate) data sources
- The first layer built in the repository
21
[Diagram: the Physical layer contains the actual tables plus extra aliases to prevent circular joins and extra aliases for time series reporting.]
22
23
[Diagram: physical (OLTP) tables are reshaped into dimensions and facts in the business model.]
24
The layer where the physical data model is transformed into a business model (a semantic layer understandable by end/business users)
It takes the form of logical star schemas following the dimensional modeling methodology (R. Kimball)
25
26
BMM layer objects map to data objects in the Physical layer. The mapping is usually not one-to-one:
- Business models may map to multiple data sources
- Logical tables may map to multiple physical tables
- Logical columns may map to multiple physical columns
27
28
- The only metadata layer directly exposed to end users
- Contains objects providing a view of the business model for end users
- Exposes just the objects relevant to users (simplification)
29
[Diagram: a Subject Area contains presentation folders, which in turn contain the presentation columns exposed to users.]
30
Presentation layer objects map to objects in the BMM layer:
- A Subject Area (Presentation Catalog) maps to a business model
- A presentation folder may map to a logical table
- A presentation column maps (strictly 1:1) to a logical column
31
32
Repository files can be opened for editing in offline or online mode
Offline mode:
- The repository file (RPD) is opened from the file system
Online mode:
- A connection is made (via an ODBC driver) to a live BI Server
- You work against the repository loaded into the Oracle BI Server's memory
- Administrators can perform tasks not available in offline mode:
  - Manage scheduled jobs
  - Manage user sessions
  - Manage the query cache
  - Manage clustered servers
  - Stop the Oracle BI Server
33
34
In this lesson, we have learned about:
- The key components of the OBIEE10g architecture
- The three layers of the BI Server repository and their relations
- The BI Administration Tool and its purpose
35
36
- Learn about the objects in the repository Physical layer
- Create the repository Physical layer
Lesson Objectives
37
- Contains objects representing the physical data sources used for reporting/analysis by the BI Server
- Can map multiple (even disparate) data sources
- The first layer built in the repository
38
[Diagram: the Physical layer contains the actual tables plus extra aliases to prevent circular joins and extra aliases for time series reporting.]
39
Represents the highest-level object in the Physical layer
Specifies the data source used by the BI Server for querying (database type, etc.)
Database Object
40
The name does not need to match the physical data source name (such as the Oracle DB instance name)
The database type of the data source (influences the queries generated by the BI Server):
- Relational
- Multidimensional
- File based (XML)
41
You can set the SQL features the BI Server may leverage when querying this data source
OBIEE has a "knowledge base" of default features for registered data source types (such as Oracle DB)
Be careful when overriding the default values (it can considerably influence the generated SQL statements)
You can use "Ask DBMS" to set the DB features while working in online mode
42
Defines the connection of the BI Server to a data source
Can be either generic ODBC or native connectivity (Oracle DB)
Uses the principle of "connection pooling": multiple users share a pool of connections to a data source
Key properties: Connection Pool Name, Connectivity Type (Native/ODBC), Max number of connections, Data source name (ODBC name or connect descriptor), Connection Pooling Enabled
Connection Pool
43
Optional object
A container object for the definitions of tables and columns
For Oracle DB it represents the DB schema owning the imported database object definitions (tables, views); its name is used in the generated SQL: <<schema name>>.<<Db object name>>
Physical Schema
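To make the schema qualification above concrete, tables imported under the SH schema appear fully qualified in the physical SQL the BI Server generates; a sketch (table and column names follow the course's SH sample schema, the exact query shape depends on the request):

```sql
-- Hypothetical physical SQL; the imported schema name
-- qualifies every table reference as <<schema>>.<<object>>
SELECT   SH.PRODUCTS.PROD_CATEGORY,
         SUM(SH.SALES.AMOUNT_SOLD)
FROM     SH.SALES, SH.PRODUCTS
WHERE    SH.SALES.PROD_ID = SH.PRODUCTS.PROD_ID
GROUP BY SH.PRODUCTS.PROD_CATEGORY
```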
44
Corresponds to a table (or view) in a physical data source
Most typically the definition is imported from a database or other data source (although it can also be created manually)
Provides the metadata the BI Server needs to access the tables/views with SQL queries
Physical Table
45
Table Type:
- Physical Table (most common)
- Stored Proc (rarely used in practice)
- Select (glasses icon), also called an opaque view: a SQL SELECT statement specific to the DB source; it is usually better to implement a view at the DB level than to use this opaque view functionality
Other properties: Name, Table Type, Caching Properties
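The database-level alternative recommended above means the same SELECT lives in the database as an ordinary view, which is then imported like any physical table. A sketch (the view name and derived column are invented for the example):

```sql
-- Instead of pasting this SELECT into an opaque view object in the
-- repository, create a database view and import its definition
CREATE OR REPLACE VIEW SH.V_SALES_REVENUE AS
SELECT PROD_ID,
       CUST_ID,
       AMOUNT_SOLD / NULLIF(QUANTITY_SOLD, 0) AS UNIT_REVENUE
FROM   SH.SALES
```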
46
Represents a "pointer" to a physical table definition (created by right-clicking a physical table definition)
Used mainly to avoid circular join definitions in the Physical layer (for example, role-playing dimensions in the Kimball methodology)
The alias structure (columns) is "read-only" and always reflects the structure of the original table definition
Appears with a different icon in the Physical layer
Properties: Alias Name, Source Table
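The role-playing case mentioned above can be sketched in SQL terms: two aliases of the same time dimension table let one query join it twice without creating a circular join. The alias names and the two date foreign keys are illustrative, not part of the SH lab schema:

```sql
-- TIMES_ORDER and TIMES_SHIP would be repository aliases of SH.TIMES;
-- each plays a different date role against the fact table
SELECT   T_ORD.CALENDAR_MONTH_DESC  AS ORDER_MONTH,
         T_SHP.CALENDAR_MONTH_DESC  AS SHIP_MONTH,
         SUM(F.AMOUNT_SOLD)
FROM     SH.SALES F,
         SH.TIMES T_ORD,
         SH.TIMES T_SHP
WHERE    F.ORDER_DATE_ID = T_ORD.TIME_ID
AND      F.SHIP_DATE_ID  = T_SHP.TIME_ID
GROUP BY T_ORD.CALENDAR_MONTH_DESC, T_SHP.CALENDAR_MONTH_DESC
```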
47
An object that corresponds to a column in a physical database
A child object of a physical table
Has a data type (internal to the BI Server, corresponding to the DB data types) and a nullable property
Physical Column
48
Defines relationships between tables
Primary key:
- Uniquely identifies a single row of data
- Consists of a column or set of columns
- Is identified by a key icon
Foreign key:
- Refers to the primary key columns in another table
- Is composed of a column or set of columns
Key Column
49
Represent primary/foreign key relationships between tables in the Physical layer
Usually defined in the Physical Diagram
The join properties define:
- Tables joined
- Cardinality
- Join expression
Physical Diagram Join Properties
Joins
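The join expression in the property list above is simply an equality between the joined columns, as entered in the Physical Foreign Key dialog; for the SH sample data used in the labs it would look roughly like:

```sql
-- Physical join expression between the fact table and the products dimension
SH.SALES.PROD_ID = SH.PRODUCTS.PROD_ID
```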
50
The fictional company "XYZ Selling Tigers" keeps its data in a relational database (Oracle), in a schema SH containing:
- Sales figures (amount, quantity) by day, product, promotion, channel and customer
- Unit price and unit cost by day, product, promotion and channel
Labs Scenario
51
1. Import the physical schema from the data source
2. Select the tables (and columns) to be imported
3. Import keys and joins (optional)
4. Check the import
5. Edit the connection pool properties
6. Define physical keys and joins
52
Using the BI Administration Tool menu: File -> Import -> From Database
53
In the import wizard (next step), select the tables/columns needed for building the physical model in the Physical layer (you can add tables later on)
Filtering tables for import
54
You can import keys, foreign keys and the corresponding joins if they are defined in the data source
This selection is optional; you can define keys and joins "manually" later on (to have better control over the built physical data model)
55
Check whether the correct schema, tables, columns and keys have been imported
You can use the View Data option to check the connection to the data source
56
After import (or during import, when importing for the first time), check or modify the connection pool properties using the Connection Pool properties dialog box:
- Connection Pool Name
- Connectivity Type (Native/ODBC)
- Max number of connections
- Data source name (ODBC name or connect descriptor)
- Connection Pooling Enabled
57
In case keys and joins have not been imported automatically from the data source (either intentionally or because they do not exist in the DB), you can define them in the Admin Tool:
- Define keys (and also joins) using the Physical Table properties dialog box
- Define joins and keys using the Physical Diagram
58
Key Name
59
The Physical Diagram can be opened by selecting tables in the Physical layer and then:
- Right-click the selected tables and choose Physical Diagram
- Or use the Physical Diagram icon in the toolbar
60
Select the New foreign key icon in the toolbar
Drag from the table with cardinality 1 and drop on the table with cardinality N
Define the join properties (name, join expression) in the Physical Foreign Key dialog window
FK Name
Join Expression
61
In this lesson, we have learned:
- About the objects constituting the Physical layer of a repository
- How to create the Physical layer of a repository
62
This lesson is accompanied by the following labs:
- Become familiar with the XYZ Selling Tigers data, which will be used to build the BI solution (metadata)
- Create an empty repository and import tables
- Define keys and joins
63
- Learn about the objects in the repository Business Model and Mapping (BMM) layer
- Create a business model
- Create measures in a business model
Lesson Objectives
64
The layer where the physical data model is transformed into a business model (a semantic layer understandable by end/business users)
It takes the form of logical star schemas following the dimensional modeling methodology (R. Kimball)
Measures and (dimensional) attributes are clearly defined in this layer
65
66
BMM layer objects map to data objects in the Physical layer. The mapping is usually not one-to-one:
- Business models may map to multiple data sources
- Logical tables may map to multiple physical tables
- Logical columns may map to multiple physical columns
- The mapping of logical columns can also include transformations/formulas
67
- Business model
- Logical tables (logical dimensions/logical facts)
- Logical table sources
- Logical columns
- Measures (a special form of logical column)
- Logical primary keys
- Logical joins
68
The top-level object in the BMM layer
Contains the business model definitions (logical star schemas) and the mappings from logical to physical tables
A business model uses terminology familiar to business users (not "cryptic" physical object names)
Business Model
69
A logical table contains logical columns that can be mapped to one or more physical columns from the Physical layer
Represents a logical fact or a logical dimension
Can be created:
- Automatically, by dragging tables from the Physical layer
- Manually, by right-clicking the business model and selecting New Object > Logical Table
Logical Table
70
71
Within a Logical Table Source (LTS), you define the mapping between the logical columns of a logical table and the physical columns of the tables constituting the LTS
The Column Mapping tab of the LTS dialog box is intended for building, viewing or modifying the logical-to-physical column mappings
72
Represents a business element (attribute, measure) of the data
Can map to many columns in the Physical layer
Can be defined by referring to other logical columns in a formula (calculation)
Can be created:
- Automatically, by dragging tables or columns from the Physical layer
- Manually, by right-clicking a logical table and selecting New Object > Logical Column (and then defining the mapping to a physical column within the LTS)
Logical Column
73
Define unique identifiers (logical columns) for logical tables, but from a business point of view (not, for example, "surrogate" generated DWH keys)
Define the lowest level of detail of a logical table
Logical keys are required only for logical dimension tables (for the repository to be valid), not for logical fact tables
74
Defines the cardinality relationship between logical tables (this determines the logical table's role in the model: dimension or fact)
Is required for a valid business model
Helps the Oracle BI Server understand the relationships between the various objects of the business model
Examining logical joins is an integral part of how the Oracle BI Server figures out how to construct the physical queries
There is no "join expression" defined as part of a logical join definition (as opposed to the foreign key join definition in the Physical layer)
Logical Join
75
76
Represents a calculation on a measurable quantity
A special form of logical column, placed in logical fact tables
Has an aggregation rule defined in the metadata
Measure
77
Create a dimensional model to represent the XYZ Selling Tigers business
Logical dimensions:
- Time (Periods)
- Products
- Customers
- Channels
- Promotions
Logical facts:
- SALES: the main fact table, containing total sales measures (amount, quantity) by day, product, promotion, channel and customer
- COSTS: a "supporting" fact table with unit price and unit cost measures by day, product, promotion and channel
78
79
1. Create the business model
2. Create logical tables and columns
3. Define logical joins
4. Modify logical tables and columns
5. Define measures in the logical fact tables
80
Right-click in the BMM layer and select New Business Model (more control over the process)
Or drag selected physical tables into the BMM layer
81
Drag physical table objects from the Physical layer into the business model definition: fast, but less control over the process (it also automatically creates logical columns for all physical columns)
Or right-click the business model object and select New Object > Logical Table (the logical columns need to be defined afterwards)
82
Similar to the definition of foreign key joins in the Physical layer
The Business Model Diagram is used for it, and a complex join is used instead of a foreign key join (with no join expression defined, just cardinality)
83
- Rename tables and columns
- Reorder columns
- Add or delete tables and columns
- Add, delete and modify logical table sources (LTS)
- Cut, copy and paste objects (but with caution)
84
A special form of logical column placed in logical fact tables (with an aggregation rule)
Right-click the (numeric) logical column in the logical fact table and select Properties > Aggregation
85
- Use only complex joins in the BMM layer!
- Rename logical columns to names meaningful to the end user community (business dictionary)
- Use reasonably short names if possible (due to the space available in requests)
- Use unique names for logical columns to avoid ambiguity
Recommended practice
86
87
This lesson is accompanied by the following labs:
- Creating a Business Model
- Creating Simple Measures
88
- Learn about the objects of the Presentation layer of a repository
- Modify the properties of Presentation layer objects
- Create the Presentation layer
Lesson Objectives
89
- The only metadata layer directly exposed to end users
- Contains objects providing a view of the business model for end users
- Exposes just the objects relevant to users (simplification)
Presentation Layer
90
Organizes and simplifies the business model for a set of users
A single presentation catalog must be populated from a single business model (a presentation catalog cannot "span" business models)
But multiple presentation catalogs can reference the same business model (a pretty common scenario)
Presentation Catalog
91
Organize presentation columns into groups that make sense to the users
May contain columns from one or more logical tables
Can be modified independently of logical tables (no strict "mapping" between a logical table and a presentation table)
Can be created:
- Automatically, by dragging logical tables from the BMM layer
- Manually, in the Presentation layer
Presentation Tables
92
Define the columns used to build requests in the OBIEE user interface
Map to logical columns in the BMM layer (strict 1:1 mapping)
Usually derive their names from the corresponding logical column name (best practice)
Can be created:
- Automatically, by dragging logical tables or columns from the BMM layer
- Manually, in the Presentation layer
Presentation Columns
93
Presentation layer objects map to objects in the BMM layer:
- A Subject Area (Presentation Catalog) maps to a business model
- A presentation table may map to a logical table
- A presentation column maps (strictly 1:1) to a logical column
94
Presentation layer objects define the content (Subject Areas) that end users see when querying the data from the data sources
95
Give the appearance of nested folders in the Answers tool:
- In the Description field of the presentation table, put the following characters at the beginning: ->
- Place the presentation table after the table in which you would like to nest it
96
Keep track of any changes to Presentation layer objects
Used in Answers when a renamed Presentation layer object is referenced in an Answers request
Add additional overhead for the BI Presentation Server
Avoid renaming after BI outputs have been created intensively (rather perform the replacement in the Web Catalog)
97
Build the Presentation layer to present users with a customized view of the XYZ Selling Tigers data
98
1. Create a new presentation catalog
2. Rename tables
3. Reorder tables
4. Delete columns
5. Rename columns
6. Reorder columns
99
100
2. Rename tables
101
To reorder presentation tables in a Presentation Catalog, open the Presentation Catalog properties and use the Up and Down buttons, or drag and drop
Recommendation for ordering: place dimension presentation tables at the top and measure (fact) presentation tables at the bottom
3. Reorder tables
102
4. Delete columns
103
Presentation columns use the logical column name by default; this is the recommended approach, "inheriting" logical column names into the Presentation layer to keep the naming of a single logical column consistent across different Presentation Catalogs
If there is a special need to rename a presentation column, use the General tab in the column properties dialog box to deselect the Use Logical Column Name check box and enter a custom name
5. Rename columns
104
6. Reorder columns
105
- Presentation catalogs can map to only one business model
- Multiple presentation catalogs can map to the same business model
- Presentation columns in a single presentation table can come from multiple logical tables in the business model
- Presentation columns are automatically renamed when the corresponding logical column is renamed (with an alias created for the previous name)
- Presentation tables cannot have the same name as presentation catalogs
- Presentation objects can be modified or deleted without affecting the corresponding logical objects
- Be careful with modification/restructuring of presentation objects in case BI "outputs" are already using them
Consideration
106
- Use meaningful names
- Names cannot contain single quotation marks ('); the Administration Tool prevents it (also avoid spaces at the beginning/end)
- Keep presentation object names unique:
  - Naming presentation columns the same as presentation tables can lead to inaccurate results
  - Uniqueness allows SQL statements to be shorter, because qualifiers are unnecessary
- Group objects in meaningful ways (grouping in presentation tables)
- Eliminate unnecessary objects to reduce confusion (KISS principle)
- Use object description fields to convey information to users (displayed as a tooltip in Answers)
- Keep names short to save space on reports
Recommendation/Best Practice
107
108
This lesson is accompanied by the following labs:
- Creating Presentation layer objects
- Modifying Presentation layer objects
109
Lesson Objectives
110
The following steps validate whether the repository is constructed correctly and whether it produces the expected query results:
- Checking the repository for consistency
- Enabling logging
- Loading/deploying the repository
- Checking the repository using the BI Answers front-end tool
- Inspecting the query log
Validating a Repository
111
Validate and test the SH business model before making it available for use by end users
112
A feature of the BI Administration Tool that checks whether a repository meets certain requirements, such as:
- All logical columns are mapped directly or indirectly to one or more physical columns
- All logical dimension tables have a logical key
- All logical tables have a logical join relationship to another logical table
- There are at least two logical tables in the business model: a logical fact table and a logical dimension table (both can map to the same physical table)
- There are no circular logical join relationships
- At least one Presentation Catalog exists for the business model
The check does not guarantee that the business model is constructed correctly
Consistency Check
113
Check consistency for the entire repository or for individual repository objects by using any of the following (you are also asked when saving the RPD):
- File menu
- Tools menu
- BMM layer object right-click menu
Checking Consistency
114
Displays consistency check messages:
- Errors: must be fixed to make the repository consistent
- Warnings: a condition that may or may not be an error
- Best Practices: a condition that does not indicate an inconsistency (disabled by default)
115
The Options tab in the Consistency Check Manager allows you to enable or disable consistency messages
116
Enabling Logging
The BI Server provides a facility for logging query activity at the individual user level (driven by the LOGLEVEL variable, set for individual users)
Logging is intended for quality assurance testing, debugging, and use by Oracle Support
Query logging is normally disabled in production mode (turning logging on causes additional overhead on the OBIEE server infrastructure; the log file can grow extensively)
The query log file is named NQQuery.log and is located in the \OracleBI\server\Log directory
The various levels of query logging enable various degrees of detail to be logged
117
Use the Security Manager to set the logging level for individual users
Or set the LOGLEVEL session variable using an initialization block (session variables are covered in the Using Repository Variables lesson)
118
There are 7 log levels (the higher levels are intended just for communicating issues with Oracle Support)
Logging level 2 allows you to see the generated SQL (sufficient for testing/debugging)
Logging Levels
119
After you build a repository and it is consistent, you need to "point" the BI Server to use this RPD: add/modify an entry in the NQSConfig.ini file
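The NQSConfig.ini entry referred to above sits in the file's REPOSITORY section; a minimal sketch (the logical name and RPD file name are examples, not values from the labs):

```ini
[ REPOSITORY ]
# logical name = RPD file name; DEFAULT marks the default repository
Star = xyz_tigers.rpd, DEFAULT;
```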
120
If the repository was edited in offline mode (the recommended practice), start or restart the BI Server process to load the new/modified repository
If the repository was edited in online mode (just for small changes, not recommended), check in the changes and reload the server metadata in Answers
Loading Repository
121
122
Either by looking directly at the NQQuery.log file on the OBIEE server (located in \OracleBI\server\Log)
Or by going to Manage Sessions on the Administration page of the OBIEE UI
123
124
125
BI Server logical SQL SELECT statements differ from standard SQL in the following ways:
- No join information is required (the SQL is constructed against Presentation layer objects):
  - Join conditions are predefined in the repository
  - Any join conditions supplied in a query are ignored
- If aggregated data is requested in the SELECT statement, a GROUP BY clause is automatically assumed by the server
- The BI Server always issues SELECT DISTINCT to eliminate duplicate rows automatically
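A concrete logical SQL request against the SH subject area might look like the following; note the absence of join conditions and of an explicit GROUP BY. The presentation object names are illustrative (they depend on what was built in the labs):

```sql
-- Logical SQL as sent by Answers to the BI Server;
-- joins and grouping are resolved from the repository metadata
SELECT   Products."Prod Category",
         Sales."Amount Sold"
FROM     "SH"
ORDER BY Products."Prod Category"
```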
126
In this lesson, we have learned:
- How to execute the steps to test and validate a repository
127
This lesson is accompanied by the following labs:
- Testing and validating the new repository (along with deployment)
- Generating an inconsistent repository (business model) and fixing the consistency errors
128
- Describe normalized and de-normalized table structures in database designs
- Add multiple sources to a logical table source (LTS) for a dimension/fact in the business model
- The topic of adding a second logical table source to a logical table will be covered in the Aggregation lesson
Lesson Objectives
129
Normalized table structures (usually transactional systems, OLTP):
- Consist of many tables where the data has been split, or normalized
- Are used for inserts and updates
- Do not work well for queries that perform business data analysis
- Can also occur in a dimensional model (snowflake)
De-normalized table structures (DWH, dimensional model):
- Follow the business model more closely and are easier to understand
- Contain data that may be duplicated in several locations in the database
- Can take the form of a star schema
- Provide better query performance
Table Structures
130
Data may be spread across several physical tables and need to be mapped to a single logical table
Business Challenge
131
Model multiple physical sources for the logical table:
- Add multiple sources/tables to an LTS, where data is not duplicated across tables
- Add a new logical table source, where data is duplicated across tables (aggregated tables, fragmented data, ...)
Business Solution
132
Add the normalized table that stores the geographical attributes for a customer (country, subregion, region) to the Customers dimension in the SH business model
XYZ Selling Tigers Scenario: Adding Additional Table to an existing Logical Table Source (LTS)
133
1. Import additional tables
2. Define keys and joins
3. Identify physical columns for mappings
4. Add sources/tables to a logical table source:
   - Manual method
   - Drag method
5. Rename logical columns
6. Add columns to the Presentation layer
134
135
136
137
There are alternative methods for adding sources to an LTS:
- Manual: use the Properties dialog box of the Logical Table Source (LTS) to map tables and columns
- Drag: drag columns from the Physical layer onto the LTS to automatically map tables and columns
138
139
Use the Properties dialog box of a logical table source to add a new physical table to the source
140
Use the Properties dialog box of a logical table source to map a new logical column to a physical column in the new table within the LTS
141
The CUSTOMERS LTS now maps to two physical tables: CUSTOMERS and COUNTRIES
The logical column Country maps to the COUNTRY_NAME physical column in the COUNTRIES physical table
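With the two tables in one LTS as described above, the BI Server treats CUSTOMERS joined to COUNTRIES as a single logical dimension; the physical SQL it generates includes the join sketched here (column names follow the SH sample schema, the exact query depends on the request):

```sql
-- The single Customers LTS expands into a join of its two tables
SELECT C.CUST_LAST_NAME,
       CO.COUNTRY_NAME
FROM   SH.CUSTOMERS C,
       SH.COUNTRIES CO
WHERE  C.COUNTRY_ID = CO.COUNTRY_ID
```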
142
143
144
145
146
Rename the new logical columns to names that are meaningful to users (not necessary for the manual method)
147
148
In this lesson, we have learned how to:
- Describe normalized and de-normalized table structures in database designs
- Add multiple sources to an LTS for a dimension in the business model
149
This lesson is accompanied by the following labs:
- Importing the normalized table containing additional geographical information for customers into the Physical layer, defining physical keys and joins
- Creating multiple sources (adding a table) to an LTS using the manual method
150
- Describe a calculation measure and its use in a business model
- Create new calculation measures based on logical columns
- Create new calculation measures based on physical columns
- Create new calculation measures by using the Calculation Wizard
Lesson Objectives
151
The physical schema may not have everything required for your analysis
Adding columns to physical tables may require additional integration or ETL work
Deriving a metric in such a scenario may be more efficient than storing the values in the database
Example, Average Order Size: divide Revenue by Number of Orders
Derived Metrics
152
The BI Server provides utilities to create calculation measures in the business model:
- Use the Expression Builder to create new logical columns with a calculation formula
- Use existing logical columns or physical columns as objects in the formula
- Use the Calculation Wizard to create calculation measures based on existing logical columns
- Expose calculation measures to the Presentation layer so that users can formulate business questions in Answers using familiar terminology
OBIEE solution
153
"XYZ Selling Tigers" wants to track a calculated measure, Gross Profit, defined as the difference between Amount Sold and Unit Cost (stored in the COSTS table) multiplied by Quantity Sold:
Gross Profit = Amount Sold - (Unit Cost * Quantity Sold)
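In the Expression Builder, the Gross Profit formula above would be entered against the logical columns roughly as follows; the fully qualified names assume the business model and logical table names built in the labs:

```sql
-- Logical column formula for the Gross Profit measure
"SH"."SALES"."Amount Sold" - ("SH"."COSTS"."Unit Cost" * "SH"."SALES"."Quantity Sold")
```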
154
Calculation measures can be created using 2 different methods: Existing logical columns as objects in a formula Physical columns as objects in a formula You can also use the Calculation Wizard to automate the creation of comparison measures (comparing 2 existing measures/logical columns)
Implementation Methods
155
1. Create a new logical column 2. Specify logical columns as the source 3. Build a formula
156
Right-click the fact table and select New Object > Logical Column
157
158
Open the Expression Builder and build the calculation formula using existing
logical columns (from logical tables)
3. Build a Formula
159
1. Create a new logical column 2. Map the new column 3. Build the formula 4. Specify an aggregation rule
160
Use the Column Mapping tab of LTS dialog box to open the Expression Builder
for the new column
161
162
Click the Aggregation tab and set the aggregation rule for new calculated logical
column
163
1. Open the Calculation Wizard 2. Choose the columns for comparison 3. Select the calculations 4. Confirm the calculation measures 5. New calculation measures are added
164
Right-click a logical column that you want to use in the calculation and select
Calculation Wizard
165
166
Select the calculation types to create (Change, Percent Change, Index, Percent); by default, Change (absolute difference) and Percent Change are selected You can name the calculation measures You can also choose NULL value handling
167
168
New calculation measures are automatically added to the logical fact table
169
170
Using functions
Variables can be built and used in formulas, e.g. Current Month, Last ETL Run Date, etc.
171
Use physical columns for building calculations that require an aggregation rule
(such as sum or average) that is applied after the calculation
select T82.ItemType as c1, sum(T55.UnitOrdd - T55.UnitShpd) as c3
from D1_products T82, D1_Orders2 T55
where T55.ProdKey = T82.ProductKey
group by T82.ItemType
172
Example: To accurately calculate total revenue you would multiply the unit price
by the number of units sold and then sum the totals
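The difference between the two orderings is easy to see in plain Python (illustrative data, not OBIEE syntax): multiplying per row and then summing gives the correct revenue, while aggregating first and then multiplying does not.

```python
rows = [
    {"unit_price": 10.0, "units_sold": 3},
    {"unit_price": 20.0, "units_sold": 1},
]

# Physical-column style: the calculation runs per row, aggregation afterwards.
revenue = sum(r["unit_price"] * r["units_sold"] for r in rows)  # 50.0

# Aggregating first and then multiplying gives a meaningless number here.
wrong = sum(r["unit_price"] for r in rows) * sum(r["units_sold"] for r in rows)  # 120.0
```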
173
Use logical columns for calculation formulas that require an aggregation rule
that is applied before the calculation
select distinct D1.c1 as c1, D1.c2 as c2, D1.c3 as c3, (D1.c2 - D1.c3) as c4
from (
select T82.ItemType as c1, sum(T55.UnitOrdd) as c2, sum(T55.UnitShpd) as c3
from D1_products T82, D1_Orders2 T55
where T55.ProdKey = T82.ProductKey
group by T82.ItemType
) D1
174
175
In this lesson, we have learned : About a calculation measure and its use in a business model How to create new calculation measures based on existing logical columns How to create new calculation measures based on physical columns How to create new calculation measures by using the Calculation Wizard
176
This lesson is accompanied by the following labs: Creating Calculation Measures by Using Logical Columns Creating Calculation Measures by Using Physical Columns
177
Create dimension hierarchies Create level-based measures Create rank and share measures
Lesson Objectives
178
Defines parent-child relationships within a dimension Establishes levels for data groupings and calculations Provides paths for drilldown
Period Dimensional Hierarchy
Dimension Hierarchies
Days
179
Level-based hierarchy: the general hierarchy style, where a dimensional column acts as the parent to a child-level column Unbalanced (or ragged) hierarchy: a hierarchy where the leaves (members with no children) do not necessarily have the same depth Skip-level hierarchy: a hierarchy where some members do not have a value for a particular ancestor level Value-based hierarchy (parent-child hierarchy): uses the same dimension
column for all levels but relies on relational tables (a parent-child relationship table) in the data model to identify the parent-child relationships. In OBIEE10g metadata, only level-based (balanced, not ragged) hierarchies are supported for modeling
Types of Hierarchies
180
Are columns whose values are always calculated (aggregated) at a specific level
of the hierarchy
Period Dimensional Hierarchy Levels Measures
Level-Based Measures
Grand Total
Years Quarters Months Days
181
Share Measures
182
183
In the XYZ Selling Tigers scenario, dimension hierarchies are to be implemented for all logical dimensions : Dim - Channels Dim - Customers Dim - Products Dim - Promotions Dim - Times
184
1. Create a dimension object 2. Add a parent-level object 3. Add child-level objects 4. Determine the number of elements 5. Specify level columns 6. Create level keys 7. Set the preferred drill path 8. Create a level-based measure 9. Create additional level-based measures 10. Create share measures 11. Add measures to the Presentation layer 12. Test share measures
185
Alternative methods are: Right-click the business model object and select New Object > Dimension (recommended) Right-click the logical dimension table and select Create Dimension
186
187
188
Part of the previous step (3. Add Child-Level Objects) There are two methods for determining the number of elements for a dimension level:
Using the result of the "Update Row Counts" operation (usually used only for small amounts of data) Estimate Levels - only accessible when working in online mode
Estimate Levels
189
Specify which columns in the logical dimension table are associated with a
particular level in the dimension hierarchy
190
Level keys: Define the unique identifier for the level Provide context for drilldown (specifies the subset of data to include from
the next level down)
191
Use Preferred Drill Path to specify a drill path outside the normal drill path
defined by the dimension level hierarchy (quite rarely used)
192
Example: Create a level-based measure for the Grand Total level of a particular
dimension (Product) that refers to an existing logical fact column
193
194
195
Create a new logical fact column that calculates the share by dividing the appropriate measure by a total measure (a level-based measure); this is an example of a measure calculated from logical columns
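A minimal Python sketch of the share calculation (illustrative values, not OBIEE syntax): each product's measure divided by the level-based grand total.

```python
sales_by_product = {"Tigers": 40.0, "Lions": 60.0}

# The level-based measure: the same fact aggregated at the Grand Total level.
grand_total = sum(sales_by_product.values())

# Share measure: each measure value divided by the level-based total.
share = {product: amount / grand_total
         for product, amount in sales_by_product.items()}
```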
196
Expose the new measures to the Presentation layer so they can be used by end users
197
198
In this lesson, we have learned : How to create dimension hierarchies How to create level-based measures How to create share measures
199
This lesson is accompanied by the following labs: Creating Dimension Hierarchies Creating Level-Based Measures Creating Share Measures
200
Using Aggregates
Describe aggregate tables and their purpose in dimensional modeling Model aggregate tables Use the Aggregate Persistence Wizard to create aggregates
Lesson Objectives
201
Using Aggregates
Data in fact and dimension sources is stored at the lowest level of detail Data often needs to be rolled up or summarized during analysis (example: users often request total products by region by year) Depending on the amount of data, performing calculations at query time
can be resource intensive and can delay results to the user (which leads to end-user dissatisfaction with the BI application)
Business Challenge
202
Using Aggregates
Create physical tables that store pre-computed aggregates at required levels Use these "aggregate" tables to process user queries Eliminates run-time calculations Delivers faster results to the users
Detail table:
Id     | Office Key | Time Key | Product Key | Dollars
122222 | 1001       | 19980102 | 100000      | 100
133333 | 1001       | 19990105 | 200000      | 200
144444 | 1002       | 19980505 | 10000       | 300
155555 | 1002       | 19980601 | 20000       | 400
166666 | 1005       | 19980101 | 20000       | 600

Summarized table:
Office  | Year | Total Dollars
West    | 1998 | 100000
West    | 1999 | 200000
Central | 1998 | 50000
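The pre-computation step can be sketched in Python (illustrative data; in practice the ETL process builds the aggregate table): detail rows are rolled up once, and summary queries then read the small aggregate instead of scanning the detail.

```python
from collections import defaultdict

detail = [  # (office, year, dollars) detail-level fact rows
    ("West", 1998, 100.0),
    ("West", 1998, 250.0),
    ("West", 1999, 200.0),
    ("Central", 1998, 300.0),
]

# Build the aggregate once, ahead of query time.
aggregate = defaultdict(float)
for office, year, dollars in detail:
    aggregate[(office, year)] += dollars

# "Total dollars for West in 1998" is now a direct lookup, not a scan.
print(aggregate[("West", 1998)])  # 350.0
```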
203
Using Aggregates
Enables queries to use the information stored in aggregate tables automatically BI Server decides which tables provide the fastest answers Metadata must be configured for aggregate navigation
204
Using Aggregates
Aggregated sales fact table columns store precomputed results at a given set of
levels
[Diagram: dimension hierarchies and the levels at which aggregated facts are stored]
Time Hierarchy: Total > Year > Month > Week > Day
Product Hierarchy: Total > Brand > LOB > Type > Detail
Office Hierarchy: Total > Company > Organization > Department
205
Using Aggregates
Model aggregate tables in the same way as other source data Physical layer: Import physical table Create physical joins BMM layer: Add sources to logical tables Specify aggregation content - identifies the level of aggregation in the table so BI Server can determine when to use it for queries Presentation layer: No changes: Aggregate navigation is independent of Presentation layer objects Test the results
New step
Modeling Aggregates
206
Using Aggregates
Uses prebuilt aggregate tables to improve performance Must have matching levels of aggregation for fact and dimensions Simplified scenario: Materialized View CAL_MONTH_SALES_MV aggregates Amount Sold by Month This MV will serve as the LTS for the aggregated fact as well as for the aggregated
dimension (Dim Times)
207
Using Aggregates
1. Import tables 2. Create joins 3. Create fact logical table source and mappings 4. Specify fact aggregation content 5. Specify content for the fact detail source 6. Create dimension logical table source and mappings 7. Specify dimension aggregation content 8. Specify content for the dimension detail source 9. Test results for levels stored in aggregates 10.Test results for data above or below levels
208
Using Aggregates
Import fact and dimension aggregates For the XYZ Selling Tigers scenario, the MV will serve as both the aggregated fact and
dimension (so no join between fact and dimension will be defined at the physical layer; see the next step)
1. Import Tables
209
Using Aggregates
Use the Physical Diagram to create joins between the aggregate fact table and aggregate dimension tables For the XYZ Selling Tigers simplified scenario, the MV will serve as both the aggregated fact and
dimension, thus no join between fact and dimension will be defined at the physical layer
2. Create Joins
210
Using Aggregates
Create new aggregate LTS in the existing logical fact table and map the columns
211
Using Aggregates
Specify the aggregation content of the new fact LTS, so that BI Server knows
what level of data is stored in the aggregate tables (to make an optimal decision about which source to use for a query)
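The decision BI Server makes from the aggregation content can be sketched roughly in Python (hypothetical source names and levels, not BI Server internals): among the sources that still contain enough detail for the query, pick the most summarized one.

```python
# Hierarchy depth per Time level (0 = most summarized); illustrative only.
LEVELS = {"Total": 0, "Year": 1, "Month": 2, "Day": 3}

sources = [
    {"name": "SALES_AGG_MONTH", "level": "Month"},  # aggregate LTS
    {"name": "SALES_DETAIL", "level": "Day"},       # detail LTS
]

def pick_source(query_level):
    # A source is usable if it stores data at or below the requested level.
    usable = [s for s in sources if LEVELS[s["level"]] >= LEVELS[query_level]]
    # Prefer the most summarized usable source: fewest rows to read.
    return min(usable, key=lambda s: LEVELS[s["level"]])["name"]

print(pick_source("Year"))  # SALES_AGG_MONTH
print(pick_source("Day"))   # SALES_DETAIL
```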
212
Using Aggregates
Set the levels of the fact detail LTS to the lowest levels in hierarchies
213
Using Aggregates
Create a new aggregate LTS in the existing logical dimension tables and map the columns For the XYZ Selling Tigers simplified scenario, the MV will also serve as the aggregated
dimension LTS
214
Using Aggregates
Specify the aggregation content of the new dimension LTS, so that BI Server
knows what level of data is stored in the aggregate tables (to make an optimal decision about which source to use for a query)
215
Using Aggregates
Set the level of the dimension detail source to the lowest in the hierarchy
216
Using Aggregates
Run queries and inspect the query log to ensure that the aggregate tables are
accessed as expected
217
Using Aggregates
Run queries with lower granularity and inspect the query log to ensure that the
detailed tables are accessed (and likewise, with higher granularity, that the aggregate tables are used)
218
Using Aggregates
Automates the creation of physical aggregate tables and their corresponding objects in the repository from the Admin Tool Just FYI: I personally don't use this feature (I prefer creating aggregate
tables manually, registering them manually in the metadata, and letting the ETL process take care of aggregate table content freshness) Real demo follows
219
Using Aggregates
If aggregate navigation is not working, the cause might be one of the following: Aggregation content is not specified correctly for one or more sources Aggregate dimension sources are not physically joined to aggregate fact table sources at the same level Dimensional source does not exist at the same level as a fact table source Aggregate dimension sources do not contain a column that maps to the primary key of the dimension hierarchy level The number of elements is not specified correctly for dimension hierarchy
levels
220
Using Aggregates
Using aggregates comes with a price: Additional time is required to build and load these tables (ETL process) Additional storage is necessary Build only the aggregates you need: Look at query patterns and build aggregates to speed up common queries that require summarized results Ensure that enough data is combined to offset the cost of building aggregates Monitor and adjust to account for changing query patterns
Aggregates Considerations
221
Using Aggregates
Summary
In this lesson, we have learned : How to describe aggregate tables and their purpose in dimensional modeling How to model aggregate tables How to use the Aggregate Persistence Wizard to automatically create
aggregates
222
Using Aggregates
Labs - Overview
This lesson is accompanied by the following labs: Using Aggregate Tables (simplified scenario using an MV)
223
Create session variables Create repository variables Create initialization blocks Implement a dynamic repository variable
Lesson Objectives
224
Contain values in memory that are used by the Oracle BI Server during its processing Are created and managed using the Variable Manager feature in the BI Administration Tool Come in the following types: Session variables (system/non-system) Repository variables (static/dynamic)
Variables
225
Variable Manager
226
Variable Types
227
Persist from the time BI Server is started until it is shut down Can be used instead of literals or constants in the Expression Builder in BI Administration Tool (or in Answers as filter conditions as well) 2 types of repository variables: Static Dynamic
Repository Variables
228
Are repository variables whose values are constant (literal) and do not change while BI Server is running Have values that are initialized in the Static Repository Variable dialog box
229
Are repository variables whose values change according to a refresh schedule Values are initialized and refreshed using an initialization block
230
Persist only while a user's session is active Receive values when users establish their sessions 2 types of session variables: System Non-system
Session Variables
231
Predefined session variables reserved for specific purposes Have reserved names, which cannot be used for other kinds of variables Example: USER Refer to BI Server Administration guide for complete list and description
232
Are application-specific variables that are created by the implementation team Example: Capture the user's Region and limit the records the user sees to only
those for that Region
233
Are used to initialize system and non-system session variables, as well as dynamic repository variables Specify SQL to be run to populate one or more variables by accessing data sources Are invoked at BI Server startup and are periodically rerun to refresh values for
dynamic variables according to an established schedule
Initialization Blocks
234
Determine the latest dates contained in the source data and store them in variables
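The mechanism can be sketched with Python and SQLite (illustrative table and variable names): an initialization block runs a query and stores the result in a repository variable, and a scheduler reruns it on the refresh interval.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE times (calendar_date TEXT)")
conn.executemany("INSERT INTO times VALUES (?)",
                 [("2009-01-31",), ("2009-02-28",)])

repository_variables = {}

def run_init_block():
    # Like an initialization block: run SQL and put the result into a variable.
    (latest,) = conn.execute("SELECT MAX(calendar_date) FROM times").fetchone()
    repository_variables["LAST_LOAD_DATE"] = latest

run_init_block()  # at "BI Server startup"; rerun periodically for refresh
print(repository_variables["LAST_LOAD_DATE"])  # 2009-02-28
```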
235
236
237
238
1. Open the Variable Manager 2. Create an initialization block 3. Edit the data source 4. Edit the data target 5. Test the initialization block query 6. Verify the initialization 7. Test in query filter
Implementation Steps
239
240
In the Variable Manager, select Action > New > Repository > Initialization Block
to open the Repository Variable Init Block dialog box
241
Click the Edit Data Source button to navigate to the Repository Variable Init
Block Data Source dialog box
242
Click the Edit Data Target button to navigate to Repository Variable Init Block
Variable Target dialog box
243
Use the Test button in the Repository Variable Init Block Data Source dialog box
to test the initialization block query
244
Check the query log to verify that the variable is initialized and is used properly
on BI Server startup
245
Use the repository variable in a query filter and check the results to verify that
it returns the expected values
246
247
In this lesson, we have learned : How to create session variables How to create repository variables How to create initialization blocks How to implement a dynamic repository variable
248
This lesson is accompanied by the following labs: Using Dynamic Repository Variables as Filters
249
Describe the use of time comparisons for business analysis Implement time comparison measures in the business model using the time
series functions
Lesson Objectives
250
Provide the ability to compare business performance (measures) with prior time periods Allow you to analyze data spanning multiple time periods Example: Compare this year's Revenue with last year's Revenue
Time Comparisons
251
There is no direct way in SQL to do this type of analysis in a single SQL query It requires three separate queries to be sent to the database
252
253
BI Server provides Ago and ToDate functions for time series comparisons: Ago function Calculates aggregated value as of some time period shifted from the current time ToDate function Aggregates a measure attribute from the beginning of a specified time
period to the currently displayed time
254
255
XYZ Selling Tigers wants to implement new measures in the business model to compare Amount Sold performance this month with performance a month ago Show Amount Sold a month ago Show the Month Ago absolute change for Amount Sold Show the Month Ago percentage change for Amount Sold Show the to-date (Year To Date - YTD) value for Amount Sold
256
1. Identify time dimension and chronological keys 2. Create the Ago measure 3. Use existing columns to create additional Ago measures 4. Create the ToDate measure 5. Add new measures to the Presentation layer 6. Test the results in Answers
257
Mark the Time dimension and set chronological keys for all levels in the Time dimension
258
Use the Expression Builder to build the Ago function with the form:
Ago(<measure>,<time level>,<number to shift>)
259
Use the existing Ago measure to build Change and Percent Change measures
(you can use Calculation Wizard to automate creation of comparison measures)
260
Use the Expression Builder to build the ToDate function with the form:
ToDate(<measure>,<time level>)
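The semantics of the two functions can be sketched in Python over monthly data (an illustration of the logic, not the BI Server implementation):

```python
# Amount Sold by (year, month); illustrative values.
sales = {(2009, 1): 100.0, (2009, 2): 120.0, (2009, 3): 90.0}

def ago(measure, year, month, months_back):
    # Ago: the measure value shifted back by N periods along the Time dimension.
    idx = year * 12 + (month - 1) - months_back
    return measure.get((idx // 12, idx % 12 + 1))

def to_date(measure, year, month):
    # ToDate: accumulate from the start of the year through the current month.
    return sum(v for (y, m), v in measure.items() if y == year and m <= month)

print(ago(sales, 2009, 3, 1))   # 120.0 -> "Amount Sold a month ago"
print(to_date(sales, 2009, 3))  # 310.0 -> "Amount Sold YTD"
```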
261
Add new time series measures to the Presentation layer so that users can
include them in query criteria
262
263
In this lesson, we have learned : How to describe the use of time comparisons for business analysis How to implement time comparison measures in the business model using
the time series functions
264
This lesson is accompanied by the following labs: Creating Time Series Comparison Measures
265
Security Basics
Define users and groups Set permissions for users and groups to control access to repository objects Explain group inheritance This training does not cover: Advanced security topics (different authentication methods, row-level security, query limits) Security for the OBIEE front end (permissions on web catalog objects, privileges
for end-user functionality)
Lesson Objectives
266
Security Basics
Only qualified persons should have access rights to data analysis applications Data needs to be protected so that only authorized employees can access sensitive information Employees should automatically see the information that is relevant to their
roles
Business Challenge
267
Security Basics
Provides ability to authenticate users through logon Controls user access to data Secures access control on object and data levels
268
Security Basics
Is a utility in the Administration Tool that displays all the security information for
a repository
Security Manager
269
Security Basics
User accounts can be defined explicitly in: The BI Server repository An external source (such as a database table or an LDAP server; additional work needed, not covered in this training) Users must be authenticated by BI Server for a session to take place.
270
Security Basics
Adding a User to a Repository
271
Security Basics
You can set repository permissions for: Catalogs, Tables or Columns in the Presentation layer Connection pools in the Physical layer Permissions can be undefined, read, or access explicitly denied
272
Security Basics
Is a default, permanent user account in every BI Server repository Cannot be deleted or modified, other than changing the password and setting the logging level Belongs to the Administrators group by default
Administrator Account
273
Security Basics
A group is a set of security attributes Use Security Manager to create groups and then grant membership in them to users or other groups Groups simplify permission/privilege maintenance Administration of group membership can be externalized (not covered in this
training)
274
Security Basics
Is a predefined group with authority to access and modify any object in a repository Administrator user is automatically a member
Administrators Group
275
Security Basics
You can create an unlimited number of groups in a repository Each group can contain: Explicitly granted privileges Implicitly granted privileges through membership in another group
Defined Groups
276
Security Basics
Group Inheritance
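Group inheritance can be sketched as a small Python walk over hypothetical groups: a member's effective privileges are the explicitly granted ones plus everything inherited through parent groups.

```python
# Hypothetical groups: explicit privileges plus parent-group membership.
groups = {
    "Sales":     {"privs": {"read:SalesFacts"}, "parents": []},
    "SalesWest": {"privs": {"read:WestRegion"}, "parents": ["Sales"]},
}

def effective_privs(group):
    # Explicitly granted privileges, plus those implicitly granted via parents.
    privs = set(groups[group]["privs"])
    for parent in groups[group]["parents"]:
        privs |= effective_privs(parent)
    return privs

print(sorted(effective_privs("SalesWest")))
```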
277
Security Basics
Adding a New Group
278
Security Basics
Click the hierarchy icon in the left pane of the Security Manager, and then
expand the tree in the right pane
279
Security Basics
Is the process by which a system verifies (with a user ID and password) that a user has the necessary permissions and authorizations to log on and access data BI Server authenticates each connection request that it receives
Authentication
280
Security Basics
BI Server supports the following authentication types: Operating system External table LDAP Database Internal
Authentication Options
281
Security Basics
BI Server supports Windows Unified Logon If a user is configured on a trusted Windows domain, a BI Server user of the same name does not need to be authenticated by BI Server The user ID in the repository must match the user ID in the trusted Windows domain Rarely used in practice
282
Security Basics
Instead of storing IDs and passwords in a repository, maintain lists of users and passwords in an external database table Use Oracle BI session variables to get the values Could be a potential security risk: the external table stores users' passwords
283
Security Basics
A very common method in OBIEE implementations Instead of storing IDs and passwords in a repository, BI Server passes the user ID and password entered by the user to an LDAP server for authentication Use Oracle BI session variables to get the authentication values Active Directory can be used as an LDAP server
LDAP Authentication
284
Security Basics
Authenticates users through database logons To set up database authentication: Store user IDs (without passwords) in a repository Import database to the repository Specify authentication database in NQSConfig.ini
Database Authentication
285
Security Basics
Maintain lists of users and passwords in the repository using BI Administration Tool Oracle BI Server authenticates against this list unless: Another authentication method has already succeeded Database authentication is specified in NQSConfig.ini User IDs are non-encrypted and non-case-sensitive Passwords are encrypted and case-sensitive Users can access any business model if they have the necessary access privileges Users do not span repositories
Internal Authentication
286
Security Basics
1. Operating system (OS): No logon name Turned on in NQSConfig.ini 2. LDAP or external database table Populates session variables 3. Internal or database
Order of Authentication
287
Security Basics
Summary
In this lesson, we have learned : How to define users and groups How to set permissions for users and groups to control access to repository objects How to explain group inheritance How to identify and describe the authentication methods used by Oracle BI Server
288
Security Basics
Labs - Overview
This lesson is accompanied by the following labs: Creating Users and Groups Setting Permissions for Users and Groups
289