
Autumn 14 Issue 54

Reporting/Charting/Graphing
With New SQL Developer
Features

CBO Parameter Choices


Full Table Scans vs Index
Access

Going Mobile
From Concept to Delivery: The Story of an Applications DBA

OracleScene
Serving the Oracle Community

What's the big
deal about
In-Memory?

www.ukoug.org
An independent publication not affiliated with
Oracle Corporation

This edition's
sponsors:

AUTUMN 14

Welcome to Oracle Scene

Inside this issue


Oracle Scene Editorial Team
Editor: Brendan Tierney
Email: editor@ukoug.org.uk
UKOUG Contact: Brigit Wells
Email: brigit@ukoug.org
Sales: Kerry Stuart
Email: kerry@ukoug.org
Project Manager: Lavinia Mildwater
UKOUG Governance
A full listing of Board members, along with
details of how the user group is governed,
can be found at:
www.ukoug.org/about-us/governance
UKOUG Office
UK Oracle User Group, User Group House,
591-593 Kingston Road, Wimbledon
London, SW20 8SA
Tel: +44 (0)20 8545 9670
Email: info@ukoug.org
Web: www.ukoug.org
Produced and Designed by
Why Creative
Tel: +44 (0)7900 246400
Web: www.whycreative.co.uk
UKOUG Event Photography
Liesbeth Verdegaal
Next Oracle Scene Issue
Issue 55: Publish month: December 2014
Content deadline: 3rd October 2014

10  A First Look at Oracle's In-Memory Database Option  by Peter Scott
21  Oracle SQL Developer Data Modeler  by Heli Helskyaho
28  Data Warehousing Trends & Opportunities for 2014  by Gürcan Orhan
31  Going Mobile: From Concept to Delivery  by Richard Childe

TECHNOLOGY
08  A Cloud Menu for All Tastes  by Dermot O'Kelly
16  Oracle SQL Developer & Reporting: Introduction & What's New for Version 4.0  by Jeff Smith
CBO Choice Between Index & Full Scan: The Good, the Bad & the Ugly Parameters  by Franck Pachot
KPIT-Oracle Partnership: Driving Business Value Leveraging Innovation  by Nikhil Gupta

E-BUSINESS SUITE
Women in IT: Speaker Spotlight  by Debra Lilley

EVENTS FOCUS
13  Focus on: JDE14
Focus on: Tech14
Focus on: Apps14
Why You Should Be Attending UKOUG SIG Events This September

VOLUNTEER FOCUS
Meet a Volunteer: Jo Bates

ADVERTORIALS
38  Insightsoftware.com: One Man's Brave Journey Away From Excel & Hyperion
68  Inoapps: Why Cloud Deployment Makes Sense  by Phil Wilson

REGULAR FEATURES
06  News & Reviews
67  Supplier Guide

ORACLE SCENE DIGITAL
View additional content and get more from your Oracle Scene at: www.ukoug.org
40  Connecting Oracle Business Intelligence to the Oracle BigDataLite VM  by Mark Rittman
44  Reinventing Enterprise Performance Management to Support Sustainable, Innovation-Based Growth  by Gilles Bonelli, Erik Dorr & Sherri Liao
48  Agile Methods & Data Warehousing: How to Deliver Faster  by Kent Graziano
54  Oracle Application Express: Dispelling the Myth  by Simon Greenwood
56  Capturing the Process Truth  by Colin Armitage
58  The Interview All Graduates Need to Read  by Alison Mulligan

More than 13,000 people follow UKOUG. Join them now: @UKOUG

OracleScene UK Oracle User Group Ltd
The views stated in Oracle Scene are the views of the author and not those of the UK Oracle User Group Ltd. We do not make any warranty for the accuracy of any published information and the UK Oracle User Group will assume no responsibility or liability regarding the use of such information. All articles are published on the understanding that copyright remains with the individual authors. The UK Oracle User Group reserves the right, however, to reproduce an article, in whole or in part, in any other user group publication. The reproduction of this publication by any third party, in whole or in part, is strictly prohibited without the express written consent of the UK Oracle User Group. Oracle is a registered trademark of Oracle Corporation and/or its affiliates, used under license. This publication is an independent publication, not affiliated or otherwise associated with Oracle Corporation. The opinions, statements, positions and views stated herein are those of the author(s) or publisher and are not intended to be the opinions, statements, positions, or views of Oracle Corporation.


First word
Welcome to the Autumn edition of Oracle Scene.
At the time of writing this piece Oracle OpenWorld is still a couple of months away, but as you read this it will be just around the corner.
It will be interesting to see what big announcement Oracle will
have this year. We all remember those from last year and it is
only recently that we have been able to get our hands on some
of these product releases.
One of the big announcements from last year's Oracle OpenWorld was the In-Memory option, which was released in July. In this edition we have Peter Scott telling us about his experiences of using the In-Memory option during the beta programme. For developers we have Jeff Smith (Mr. SQL Developer himself) describing how to use the new Reports feature in SQL Developer, and Heli (from Finland) Helskyaho introducing us to Oracle SQL Developer Data Modeler. In addition to these we have some great articles by Gürcan Orhan, Franck Pachot and Richard Childe, along with many user group news items. All of these articles are in the traditional print version of Oracle Scene.
Most of you will be aware that we have two versions of Oracle
Scene. We have the traditional print version and the digital
edition. With the digital edition you get a bumper version of
Oracle Scene with lots of extra articles. In our last print edition we had an article from Mark Rittman on using the Oracle BigDataLite VM. We have a follow-up article in the digital edition on how to connect OBIEE to the BigDataLite VM.

Kent Graziano has an article on Data Model Design, and perhaps we can all learn from his Agile guidelines. Other articles are on APEX, EPM and the business value of Apps, among many more.
Our next edition of Oracle Scene will be out in early December,
just in time for the UKOUG Applications and Technology
conferences. We are always open to article submissions on
topics across the Oracle product set. Our next edition would be
a fantastic way to showcase your knowledge and skills to all the
attendees at these conferences and in the wider community.
If you have been working with any of the new features, or
you have some techniques or approaches you would like to
share with the community, please do submit an article. The
submission deadline is 3rd October. More information on submitting an article can be found online at:
www.ukoug.org/oraclescene
And finally, don't forget to look out for the first ever Oracle Scene Most Read Article awards, which will be presented at Apps14 and Tech14 as part of the UKOUG Speaker Awards. These two new awards provide recognition for our authors based on readership figures from the full digital edition of Oracle Scene. We would like to take this opportunity to thank all our authors, past and present, for their continued support of Oracle Scene.

ABOUT THE EDITOR

Brendan Tierney
Consultant, Oralytics.com
Brendan is an Oracle ACE Director, independent consultant and lectures on
Data Mining and Advanced Databases in DIT in Ireland. Brendan has extensive
experience working in the areas of Analytics, Data Mining, Data Warehousing, Data
Architecture and Database Design for over 20 years. He has worked on projects in
Ireland, UK, Belgium and USA. He started working with the Oracle 5 Database, Forms
2.3 and ReportWriter 1.1, and has worked with all versions since then. Brendan is
the editor of the UKOUG Oracle Scene magazine and is the deputy chair of the OUG
Ireland BI SIG. Brendan is a regular presenter at conferences in Ireland,
UK, Norway, Brazil and USA, including Oracle Open World.
Contact Brendan at: editor@ukoug.org.uk


UKOUG Membership

MEMBER FILES

Why Mark
Recommends UKOUG
Mark Williams works for Elsevier Ltd where he watches over the installation,
drawing parallels between requirements and system provisioning and
generally controls environments and monitors who does what, and where.

Why did you first engage with the user group?
The initial feeling was a need to mingle with the Oracle community at SIGs and the annual conference, and to get a sense of the trends and where we collectively thought the technology was going.
What value do you find through UKOUG?
It enables me to recognise the parallels
between what we were doing and feeling in
the Oracle space, and that experienced by
other Oracle customers. We are not alone.

What has surprised you about UKOUG membership?
There is benefit to be gained for those
prepared to put in the effort. For those
who adopt a back foot approach and are
not willing to engage, the benefit
is questionable.
Is there a specific example where engaging with UKOUG has had a direct beneficial result for you and your company?
Yes. At Apps13, I was able to discuss a real-world deliverable with one of the vendors. Even though we have not yet engaged with said vendor, their business card is still on my desk, whereas others arriving through other means may be consigned to the waste bin.

Your top three reasons for recommending UKOUG membership are?
Value gained from mingling with other Oracle professionals - priceless. Face-to-face participation at events makes connections in a way not achievable online. Group-based pressure from customers is more successful than individual campaigns.

Our maths isn't wrong...

Introduce another department, colleague or customer to UKOUG and earn yourself £50!
The networking opportunities, the invaluable information, the practical advice from Oracle experts at our events: it all adds up to a happier Oracle customer. When they're ready to join, send them to www.ukoug.org/join to complete their membership application and ask them to enter your name and email address.
For each successful membership application, we'll send you a £50 Amazon voucher to say thank you.
www.ukoug.org/join


News & Reviews

UKOUG Speaker Awards

If you've ever attended a UKOUG conference, you will have been asked to complete speaker evaluations, either on paper forms or via the mobile app. We ask all our delegates to complete these for every conference session they attend.
But why do we do this? Well, we do it for several reasons. It makes up a part of our quality control process, so that we can make sure that the papers our volunteers select on agenda planning days are actually delivering what they promised from the delegates' perspective. We also use them to give valuable feedback to the speakers. We inform them of their average score and some constructive comments where necessary. We find that most of the speakers really appreciate, and even look forward to, receiving this feedback because it means they can fine-tune their content and/or delivery skills. We also use your feedback to determine the winners of our UKOUG Speaker Awards. The speakers' scores from the previous year's conferences are run through a snazzy spreadsheet to determine the winners in each speaker category.

Awards are given to Oracle, UK and


overseas speakers in the following
categories: Best New Speaker; Best
Speaker; Best Judged Paper.
Several years ago a group of UKOUG
volunteers created what were then called
the Inspiring Presentation Awards. The
purpose of the awards at their inception
was to provide recognition for speakers
at our annual Technology & E-Business
Suite conference. Times have moved on
and as we now have an Applications
conference too we have decided to
rename the awards to the UKOUG
Speaker Awards. The awards now cover
both the Applications and the Technology
conferences so we have double the
number of awards!
Although the name has changed, the
ethos behind the awards has not. The
Oracle Technology Network has sponsored
the awards every year since they were
launched because the awards are all
about giving back to the community -

promoting people from their very first presentation through to becoming world-renowned speakers.
When someone has won three consecutive awards they are presented with a Lifetime Award. This award enables us to recognise contributions from many of our long-term speakers. Jonathan Lewis was our first long-standing speaker to be presented with this award in 2013. He says:
"Filling in the speaker evaluation forms is the best way the delegates have of showing their appreciation of the speakers and I'm delighted to have received so much encouragement over the years. The Lifetime Award may have removed me from the competition, but this just means the comments, criticisms, and requests that the delegates are free to make on their forms will be even more important to me."
Look out for the 2014 awards being presented at the conferences ahead of the main keynotes. And don't forget to keep filling in those speaker evaluations!

Pictured left: UKOUG President, David Warburton-Broadhurst, presenting Martin Nash, ORAsavon Ltd, with Best Judges Score Award at Tech13
Pictured right: UKOUG Member Advocate, Debra Lilley, presenting Mark Thomas, Hays, with Best New Speaker Award at Apps13

UKOUG Partner of the Year Awards 2014/15

Record Number of Votes

Thank you to everyone who voted in UKOUG's Partner of the Year Awards 2014/2015.
As the awards are wholly decided by the customers, they are particularly special to partners and we know how much they appreciate the time you take to vote.
The winners will be announced at the awards ceremony at Kent House, Knightsbridge on 23rd October. If you would like to join us and your winners for an evening of celebrations, book your place today.
www.ukoug.org/pya

Stop Press: SIGs!

In the lead up to Apps14 & Tech14, we have some fantastic content lined up for you on our Autumn Special Interest Group (SIG) agendas.
Our applications and technology focused SIGs are covering a host of subject areas, from all things mobile to database security, from Fusion Financials to BI in the higher education space... and much more.
You can find SIG listings on p44 or visit www.ukoug.org/events


Top Money-Saving Tips This Conference Season

The Apps14, Tech14 and JDE14 agendas have now been launched and we're experiencing a surge in enquiries from prospective delegates asking whether their organisations are UKOUG members.
Why are they asking? Because they know that being a member is the most cost-effective way to attend, even if they're not planning to attend the whole event. We have membership packages to suit all budgets and many of these come with inclusive conference tickets.
All memberships come with Special Interest Group passes valid for 365 days, which you can start using as soon as your membership is active; check out the calendar at: www.ukoug.org/events

What is each membership's entitlement of conference tickets?
1 ticket = 1 conference
All membership packages entitle you to purchase tickets at member rates.
Member day rate: £325+VAT
Non-member rates: 1 day £600+VAT, 2 days £1,000+VAT, 3 days £1,400+VAT

Member advice
Don't worry if your membership is due before the event: you can still register using your future allocation of tickets; the booking will just remain provisional until the renewal fee is paid.

Membership Level                    Price     Ticket Entitlement
Platinum                            £1,500    2 tickets
Gold                                £955      1 ticket
Silver                              £655      0 tickets
Bronze                              £200      0 tickets
Diamond Partner (invitation only)   £3,500    5 tickets
Platinum Partner                    £3,600    4 tickets
Gold Partner                        £2,600    3 tickets
Silver Partner                      £1,500    2 tickets
Independent Partner                 £200      0 tickets

A handy reward
We know some organisations have many departments that use Oracle products. Did you know that if you recommend membership to them, once their application is received you get a £50 Amazon voucher thank you from us, and can also proudly say that you have helped your company to save money?

Need more days? Consider upgrading
Remember, when it comes to conference places it's frequently cheaper to upgrade or buy another membership than to purchase additional tickets.

Save on your train fares
For 3-9 passengers travelling together, Virgin Trains offer a 20% discount off Advance Fares booked through their website. For more information go to www.virgintrains.co.uk/tickets-offers/group-travel

Help is at hand
Give the membership team a call on +44 (0)20 8545 9670 or contact info@ukoug.org and we'll talk you through your membership or registration options.

Registration FAQs
www.ukoug.org/conferencefaqs

A Huge Thank You

We'd like to say a massive thank you to everyone who completed our membership survey. We had just under 450 responses - a fantastic result!
Your feedback is being used to feed into our planning for 2015 and beyond. From the survey results we're able to see where our members find value, what you want to see us offering that we currently don't and, by asking for feedback from non-members as well, we've been able to get insight into what will help us grow the Oracle community.
Big congratulations to the winners of the £50 Amazon voucher prize draw: Farrukh Ahmed of Certus Solutions, Alex Bett of Student Loans Company and Jeremy Webb of Perfect Wave Solutions.
Look out for future UKOUG and Oracle surveys to ensure your opinions are heard. UKOUG is your user group. And remember, your feedback is welcome at any time, so contact us at info@ukoug.org if you ever want to share any thoughts or ideas with the UKOUG team.

Feature in the next Oracle Scene

We're looking for compelling stories about your experiences with your Oracle technology and applications from a technical and/or functional perspective. Whatever your story is, and whether it's good news or bad news, future plans, innovative use of your applications, or integrations with other solutions, we want to know about it. Send your submissions to: articles@ukoug.org.uk
Article submissions deadline: 3rd Oct 2014
Publish month: Dec 2014, in time for Tech14 & Apps14


Oracle

A Cloud Menu for All Tastes


It's no secret that the cloud makes businesses more agile and flexible. We've moved past the period where the cloud was a glossy new technology and entered an age in which virtually all modern enterprises have embraced cloud or are exploring how best to implement it. The technology continues to gain momentum, with Gartner recently predicting the public cloud services market to grow over 17 per cent each year up to 2018¹.
Dermot O'Kelly, Senior Vice President, Oracle UK, Ireland and Israel Technology
Oracle recently showed the world that we are at the forefront of cloud development. Our financial results from last year revealed that we have become the world's second-largest Software-as-a-Service (SaaS) cloud provider. The success of our SaaS business comes down to the strength of the platform these run on, the Oracle database, and to the ubiquity of the world's most popular programming language, Java.
That being said, our cloud strategy doesn't stop at SaaS. Our focus is on cloud computing as a whole, and on helping businesses gain the flexibility and agility they need to stay ahead of the trends (social engagement, mobile media and unstructured data analysis) guiding the market today. The truth is that business leaders will first determine what pain points they want to address with cloud, and then choose whichever platform and add-on services they need to get the job done.

¹ https://www.gartner.com/doc/2738817/forecast-analysis-public-cloud-services


The cloud promises to help businesses find value across a wide range of functions, from CX to financials to ERP. What sets Oracle apart in this space is that where many vendors only focus on one of these areas, we offer a complete range of solutions for businesses to choose from. Why would a business take on the headache of multiple contracts and disparate systems when it can implement one fully customised set of cloud solutions that ticks all the boxes?

The new product managers

Before cloud, every time a business required an ERP or CRM system, for example, they needed to reinvent the wheel to implement an effective solution tailored to their needs. Oracle's broad set of as-a-service platforms, applications and technologies empowers IT leaders to think more like product managers and develop their own successful cloud strategies. With the world's most complete and elastic set of cloud solutions at their fingertips, businesses can simply pick and choose the platform and services best suited to their needs. Also, because our applications are hosted on the cloud, the process of implementing them is quick, easy and cost-effective.
To add to the wide range of applications we provide, we also offer more innovative IT services hosted on the cloud than any of our competitors. Looking to the future, Oracle will soon offer full infrastructure and platform services that allow businesses to run their own virtual machines in the cloud. Customers will be able to benefit from virtual compute or Database-as-a-Service (DBaaS) to develop new applications in a public or private environment, depending on their preference.

Where the database meets the cloud

The Oracle Database is the world's leading enterprise-grade database, and through the cloud we have now made it easier


to consume than ever for businesses. Traditionally, customers that wanted their database on-premise also had to pay for and maintain a great deal of additional components and management services. Today, we offer a fully loaded Platform-as-a-Service with all the software and systems already installed, which means companies can just plug in and gain access to the data and applications they need.
The same goes for middleware; we offer a multitenant platform option to developers who want to create innovative applications using the world's best database. Ultimately, it comes down to offering businesses more choice than anyone else and making it easy for them to benefit from the IT they need.

Building on our success

Our SaaS business has had its best year ever and, as Larry Ellison said during Oracle's Q4 FY14 earnings announcement, we are "laser-focussed" on becoming the number one SaaS provider in the world. We expect that by the end of this fiscal year every business we work with will be using the public cloud in some form.
That being said, we as a company have never pursued excellence
in only one arena. We understand that every business expects
something different from the cloud, which is why we provide
them with the broadest collection of IT solutions available and
continue to build on our offerings. Many of our customers are
innovators in their fields and will prefer a hybrid solution that
offers the flexibility and cost-savings of the public cloud while
allowing them to differentiate themselves with innovative
services developed in-house.
No matter how businesses want to consume the cloud, Oracle
offers something on the menu for everyone.

ABOUT THE AUTHOR

Dermot O'Kelly
Senior Vice President, Oracle UK, Ireland and Israel Technology
Dermot O'Kelly is Senior Vice President for the Oracle UK, Ireland and Israel region and is responsible for driving Oracle's operations, growth and profitability across these geographies. He also leads the close alignment of Oracle's key accounts, and is the Country Leader for the UK.


Technology

A First Look at Oracle's In-Memory Database Option

One of the hottest topics in Larry Ellison's keynote at last year's Oracle OpenWorld conference was the announcement that in-memory data storage within the Oracle Database was on the way.
Peter Scott
Principal Consultant
Rittman Mead


In June this year Oracle launched the in-memory database option and it is generally available as part of Oracle 12c (12.1.0.2), as an additional cost option to Oracle Enterprise Edition.
So what's the big deal about in-memory (IM)? After all, keeping data in memory as a way of increasing performance has been with us for years: DBAs have pinned tables in memory and configured large data caches, we have been using SSD storage, even old-school RAM disk, and developers have used the SGA Result Cache to cache the results of PL/SQL functions. Some of us have used completely in-memory database technology such as Oracle TimesTen. The big deal with this new in-memory option is that it is easy to set up, and it works without having to make changes to physical storage, application code or any of those other things that are a nightmare to push through change control.
In my opinion, IM storage is the most significant change in how we access data since table partitioning. It may not be the right solution for every use case; for example, pure OLTP may not benefit much, since things such as the SGA buffer cache already boost performance significantly. But for mixed-use OLTP/reporting and pure reporting we may see significant gains in performance. In fact, any use where we need to access and analyse a significant number of rows from a table could well benefit.
Before looking at things in a little more technical detail, let's look at a few of the misconceptions that people may have:
What happens if my server crashes? I'll lose data, won't I?
No more than you might if your pre-in-memory Oracle Database crashed. Underpinning the IM option is conventional table technology. All of the data is stored in tables in the same way as now, and data modifications continue to be logged to redo logs in exactly the same way; nothing changes in how data is written and protected. The difference with IM is that we selectively copy the table content to highly optimised RAM-based structures.
I have a terabyte database; I can't put 1TB of RAM into my database server!
Well, you don't have to! You define how much memory to allocate (at least 100MB). You can specify which tables to populate as IM


objects; you can even specify individual partitions or columns
to load. You store in memory only what makes sense for your
organisation, whether that is your most recent data or the
whole of your historic record. As a memory-saving bonus, the IM tables are compressed, so you may not need anywhere near as much RAM as your disk-based versions of the tables.
Compressed? That means slow updates and inserts, doesn't it?
Well, no: all of the data changes are written to the underlying traditional, disk-based tables in exactly the same way as now.
So there is no change in performance here. The difference is that we also mark an internal structure in the IM component to show that we need to look elsewhere for the content of the changed rows. When enough changes have occurred, a background process automatically rebuilds the compressed structures.
Do I need to change my application to use in-memory?
Not really. The minimum you need to do is switch the feature on at the database level, tell it how much memory to use and selectively alter the tablespaces, individual tables, partitions or even columns that you wish to populate in-memory to use the feature. All of this is DBA activity and does not alter the data model or application code one jot. In time, a DBA may wish to drop some indexes, as they will not be needed for IM queries. In my experience there are two kinds of index: those needed by the application for data integrity and/or DML performance reasons, and indexes needed to support queries. It's the query indexes we may not need, as I will explain later.

It's in the name: the in-memory option means the database is storing data in server memory. But how is this different from the current SGA buffer cache? The fundamental difference is that the buffer cache is row-orientated and therefore transaction-orientated; IM, on the other hand, stores its data as separate columns, with larger tables being divided into multiple column extents. Storing columns separately gives us many advantages:
We only need retrieve the columns of interest; if I access only 3 out of 50 columns of a table in my query, I am moving far fewer bytes of data than I would if I had to read whole rows. In my mind doing less is always the good option;
We need not store all of the columns of a table in memory. Indeed, the Oracle Database may decide not to materialise all of the columns of a table if it thinks it can better use the memory in other ways. We can build on this concept, using our knowledge of the data, to mark columns as ineligible for storage in memory; for example, we may have a table with audit-style columns that are seldom queried in normal operations;
Being column-organised, it is far easier to filter queries, especially on range-style predicates;
On larger tables, those where the column is stored in more than one extent, we can use storage indexes (very similar to those found on Exadata at the storage cell level) to maintain minima and maxima for each column block. So if we know there are no data items of interest in my column extent, I need not access it. Unlike Exadata storage indexes, in-memory storage indexes are linked to each column extent and thus always available;
Column-based storage has an effect very similar to creating single-column indexes on the column. This means that it may be possible to remove traditional indexes that are solely used for query performance. Removing indexes used for query can also have a beneficial effect on OLTP workloads, as the CPU cycles and IO taken to update these query indexes are no longer needed;
IM uses SIMD (single instruction, multiple data) vector processing on the data during retrieval and thus can filter more data per CPU cycle than we could with more traditional fetch-from-disk techniques;
Finally, we are able to compress the data. We have various compression options available to us, each with its trade-offs. The default compression, MEMCOMPRESS FOR QUERY LOW, gives good compression with the advantage of not having to decompress the data before accessing it; more aggressive compression may require more CPU usage to access the data and thus be a bit slower than the default low compression, but it is still significantly faster than reading from disk. How much you can compress your data depends on the nature of your data, but in general columnar compression gives much better compression than row-based compression.
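A quick way to check what has actually been populated (a sketch of mine; the owner filter is illustrative) is the V$IM_SEGMENTS view, which reports the on-disk size, the in-memory size and the compression level of each populated segment:

SELECT owner, segment_name, populate_status,
       bytes AS disk_bytes, inmemory_size, inmemory_compression
FROM   v$im_segments
WHERE  owner = 'HR';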

As you may know, I have been working in the data warehouse/ODS space for most of my career with Oracle technologies. Much of that time I have worked with a performance tuning mantra of "big SGA bad, big PGA good", mainly because my ideal data access paths favoured direct reads, I used bitmap indexing and, probably most significantly, I never ran the same query twice. IM makes me revisit that notion, mainly because IM data is stored in the SGA. Using the SGA is a sensible decision: in-memory data needs to be available to all users, the PGA is private to the session, and the idea of developing additional memory allocation and access routines outside the supported SGA framework is plain silly. To enable IM we need to set the size of the In-Memory Columnar Store Area in the SGA. To do this we need to do two things: allocate enough SGA and carve
out a smaller chunk of SGA for our in-memory use:


ALTER SYSTEM SET SGA_TARGET=10G SCOPE=SPFILE;
ALTER SYSTEM SET INMEMORY_SIZE=8G SCOPE=SPFILE;

Then, as with such changes, restart the instance. Optionally, we could let the database instance control memory management; in this case we set MEMORY_TARGET and not SGA_TARGET. I am a bit old school and prefer to set my own SGA and PGA values. Our next step is to indicate to the database which tablespaces, tables or partitions are eligible for populating in the in-memory area. Allowing existing objects to be replicated into memory is just a matter of an alter table command to set the INMEMORY option.
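For example (a sketch; the table, partition and column names are illustrative, and a bare INMEMORY clause gets the default MEMCOMPRESS FOR QUERY LOW):

ALTER TABLE sales INMEMORY;                                 -- eligible, default compression
ALTER TABLE sales INMEMORY PRIORITY HIGH;                   -- populate without waiting for first access
ALTER TABLE sales INMEMORY NO INMEMORY (audit_notes);       -- keep a seldom-queried column out
ALTER TABLE sales MODIFY PARTITION sales_1998 NO INMEMORY;  -- exclude an old partition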
Before accessing IM data, it has to be copied from conventional (heap-organised) tables into the IM columnar store; this can happen as part of the database start-up (and for non-engineered systems this lengthens database start-up duration) or, as by default, on first access of the table. In this case the query runs against the conventional disk-based table or buffer cache and an asynchronous background process populates the IM area so that the data becomes available for subsequent queries. If you are running on Exadata it is possible to repopulate the in-memory area at instance start-up from a check-pointed copy held on disk or in Flash memory and reduce DB start-up time.
So far, we have discovered that in-memory data is stored in a column store that resides in the SGA, that these IM structures are populated at instance start-up or on first access, and that the data is compressed and optimised for analytic-style queries. We have also mentioned that not all columns of a table need be held in the column store. It should be noted that currently we can only populate IM tables from conventional heap-organised tables; it is not possible to make an in-memory Index Organised Table, which is not really surprising as the IOT structure is more akin to a composite index than a table. The other thing to note is that certain data types can't be put into the column store: LONGs are not supported and nor are out-of-line LOBs. In my


opinion, these data type restrictions will not be a problem in


likely usage scenarios.
One of the major use cases for IM is likely to be as an enabler for analytic reporting over an OLTP system. Historically, this type of mixed-use system has been difficult to build. The tuning requirements for OLTP and DW/DSS workloads tend to conflict, and the lack of aggregates to support reporting leads to high-impact queries consuming resource and slowing the work that the OLTP system is supposed to be doing. Creating
IM copies of our transactional data allows us to run analytical
queries without touching the underlying disk-based tables. The
compression, storage indexes and column-based storage give
us good performance without the need to create additional
indexes to support queries. OLTP systems, by their nature,
have frequently changing data items, so it is reasonable to ask
what happens to the copy of data we populated into the IM
column store. The key thing is that Oracle guarantees that the
in-memory data is transactionally consistent with the data
in the buffer cache. It does this by marking the IM column
store data item as stale and putting a transactional entry
into an in-memory journal table - the IM engine will combine
the results of the columnar store and the journal table logs to
produce an in-memory result without needing to go back to
disk-based storage. Over time more IM column-entries become
stale and the in-memory structure gets rebuilt by one of the IM
background processes.
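Because the disk-based tables remain the source of truth, you can also quantify the benefit for yourself: the INMEMORY_QUERY parameter lets a session opt out of the column store for a baseline timing (a sketch; the query itself is illustrative):

ALTER SESSION SET inmemory_query = DISABLE;  -- optimiser ignores the IM column store
SELECT department_id, SUM(salary) FROM hr.employees GROUP BY department_id;
ALTER SESSION SET inmemory_query = ENABLE;   -- back to the default: eligible scans use IM again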
Building on this idea further, we can now design systems where we combine operational reporting over the OLTP source with a more traditional data warehouse supporting queries over more historic aspects, and more in-depth data analysis and discovery through a tightly coupled Big Data platform. In-memory query is an enabler for new ways to do reporting and is far more than a way to make a few queries run faster. As I said before, it is a gear-shift in our abilities to manage data, much in the same way that partitioning was the great enabler for effective parallel query.

ABOUT THE AUTHOR

Peter Scott
Principal Consultant, Rittman Mead
Peter specialises in Business Intelligence architecture and Data Warehouse performance tuning. For the past few months he has led Rittman Mead's Oracle In-Memory Option evaluation team. Recently Peter became an Oracle ACE Associate in recognition of his blog writing and conference talks in Europe and North America.


Focus on: JDE14

Wherever Your Destination Gets You There
12-13 November 2014 | Heythrop Park Resort | Oxford
#ukoug_jde14 | www.jde14.ukoug.org
Book early and save! Check out our money-saving tips for conference travel & accommodation on P5.

Following the great success of our call for papers, the JD Edwards committee
came together in July for agenda planning day. Working through the submitted
abstracts, the committee had their work cut out to decide which presentations
made it onto the JDE14 agenda.
After a lot of blu-tak/post-it note sticking,
debating, deciding and moving sessions
around, an agenda was finally formed.
With six streams and two days of
valuable content on topics on everything
from the cloud to upgrades this
conference is truly essential for anyone
who uses JD Edwards.

So what's new this year?

Knowledge Bites: Following feedback from previous conferences, we're running Knowledge Bites for the first time at JDE14 to provide delegates with more interactive sessions. These are not designed to be deep-dive workshops but short, punchy sessions with lots of chat and bite-sized amounts of information. If you like the taste you will be told where to go for more!
More user stories: Find out how other companies are working with the JD Edwards product, with real-life stories from companies such as BSkyB, Gondola Group (operators of Pizza Express, Zizzi and ASK Italian) and Mizuno.

Back by popular demand: Lyle Ekdahl, Senior Vice President, JD Edwards Development, with a keynote presentation on JD Edwards leadership and vision, and on how digital technologies are changing the way we live and transforming the way business is done.
Extra conference tickets for Gold and Platinum members: We're giving back more to our JDE end-user members. This year, for one year only, once you have used your membership allocation of tickets, you will be issued with an extra ticket. Full details on registering for JDE14 can be found at www.jde14.ukoug.org/register.
View the agenda at www.jde14.ukoug.org/agenda.

Brand new venue!

This year we have a superb new venue hosting the conference and exhibition. Surrounded by 440 acres of stunning Oxford countryside is the Heythrop Park Resort. Dating back to 1710, this quintessential English country estate seamlessly combines elegance with style.

Convenient to get to, the Heythrop Park Resort is located just 12 miles from Oxford, and its central location makes travelling from both London and Birmingham easy. There are over 500 free car parking spaces on site and direct train lines from Paddington Station, and the venue is within 90 minutes of Birmingham International Airport and London Heathrow.
Why not extend your time away? A must for keen golfers is the venue's onsite golf course: the 7,088-yard, par-72 course weaves throughout the 440-acre Heythrop estate, providing the perfect golfing challenge. Located on the edge of the Cotswolds, with Oxford on the doorstep, Heythrop Park Resort is also perfect for exploring attractions such as Blenheim Palace and the picturesque Cotswold villages of Burford and Chipping Norton.
Book early and save! Check out our money-saving tips for conference & travel on P7. Further details on this year's venue can be found at:
www.jde14.ukoug.org/venue


Working in a World of Shared Experiences
The JD Edwards community: what does this really mean? And what's in it for you?
Gideon Tester, JD Edwards Committee Member

This may seem like a sentimental trip down memory lane, but these questions are discussed at UKOUG JDE committee meetings frequently, so with the annual UKOUG JD Edwards Conference & Exhibition around the corner, we thought we would ask some community members what they think the answers are.
But before we do, let's make some observations about events that happened over a few years which have impacted on where we are today.
Many of us who work with JD Edwards will have been doing so since the early days, before Oracle and PeopleSoft: going back to the 1990s and 2000s, when legacy became ERP, the days of process engineering and user groups. User communities provided a valuable way of sharing experiences and jointly engaging with big ERP vendors.

What's in a name?

First (and probably the most fundamental change) was in 2003, when JD Edwards was acquired by PeopleSoft. What followed was not security about the product so recently implemented, but uncertainty about the direction of an ERP which had seen considerable spend and human capital invested in implementing, upgrading and supporting it.
More uncertainty followed as JD Edwards
became EnterpriseOne under PeopleSoft
and then in January 2005 Oracle acquired
PeopleSoft. This was followed eight


months later with the announcement of
Fusion applications.
Where did that leave JD Edwards? Already reeling from two years of uncertainty, it would be another nine months (April 2006) before Oracle announced indefinite support for JD Edwards, placing JDE firmly as its mid-market offering.

Things began to change. The new version, 8.12, released in 2006, saw people begin upgrading and adopting the web and, in 2009, the major release of Oracle JDE EnterpriseOne v9.0 brought new features, modules, a modern UI and a big push to promote the product. The 2012 release of v9.1, and the published and promoted roadmap going forward, highlight that things are positively turning a corner for JD Edwards and its customers.

Things change

The other observation during this period of uncertainty is that the world was changing. Smartphones became mainstream; Google, social media, tablets, the list goes on... so how does this relate? The answer is that we, as IT professionals, were being asked to adapt, support and deliver using these new technologies. With the uncertainty around JD Edwards and limited ERP activity we had the time to do so, to deliver the new wave of ICT, to integrate and to mobilise.
Today we do so much more but, in our opinion, this means the JDE community is possibly more important today, because shared experiences go further. Today people are upgrading, engaged, and ERP still beats at the heart of most of our organisations.
Change continues, with many companies moving processes out of their JDE solution into a variety of cloud and mobile services. Others are continuing to expand their JDE footprint and invest further in the JDE-centric technologies. The committee and UKOUG recognise this and both work to change and adapt to your needs, but the people who make the real difference are you: the users, implementers, partners, analysts.
We would love to see more people at the conference, to hear more experiences, to ask Oracle new questions and to hear your views first-hand about how UKOUG can deliver value to you in the future.
We asked three community members to tell us about the value they get from engaging in the community, why they would encourage others to get actively involved, why networking is vital to them and their businesses, and how they would like to see the community develop. Here's what they had to say.

Registration is open for JDE14, secure your place at www.jde14.ukoug.org/register


Mike Gibbons
CIO, Aggregate Industries
We've had JDE ERP solutions for nearly 15 years and have used the Oracle database solution for a similar time frame. We're also using Oracle CRM On Demand. We've been actively engaged with the UKOUG JDE community for several years, with our Business Solutions Manager being on the committee. We always have a presence at the annual JD Edwards Conference & Exhibition and send technical staff to the more technically orientated conferences. Peers' experiences of upgrading systems, value derived from new or existing modules and experience of Oracle partners form a crucial part of our research before embarking upon change. I would like the JDE community to become more integrated with UKOUG, as we see more benefit in understanding the Oracle portfolio and their integration capabilities with JDE, but supported with more compelling JDE SIGs throughout the year. Sharing experiences (good and bad) in an honest and trusted network is invaluable, so I would encourage anyone using JDE to get involved.
Martin Gater
IT Engagement Manager, Norgine
I've been working with JD Edwards for 20 years. For the last seven years I've been with Norgine, a leading independent European speciality pharmaceutical company. Norgine are responsible for JD Edwards and other key systems supporting the business. We've a good relationship with Oracle JD Edwards and a good working relationship with our Oracle Account Manager. We've found it valuable to maintain good relations with Oracle's senior management. We've just completed a successful upgrade to 9.1 and the relationship benefited us during this period. We use both UKOUG and Oracle JD Edwards to help guide our tactical decisions towards delivering our IT strategy. We attend the UKOUG JDE Conference & Exhibition and SIGs, deciding which to attend based on the content. Typical content may be specific to our projects, industry legislation, customer experiences and
networking. The JDE community is very valuable. It's the one way we can benchmark ourselves against what other people are doing, and it's good to hear from people who have done things that you want to do, so that you can learn from them. I'd like to encourage more customer engagement in the community. Users are missing out on incredibly valuable experiences and interactions. We need more ongoing user collaboration between the conferences each year, maybe forums for sharing knowledge and resources. UKOUG forms a central repository of usable information and it definitely aids the development of our five-year IT plan.
Paul Barker
Company Business Systems Manager, Sir Robert McAlpine
We implemented JDE 10 years ago and it's continued to be a solid foundation for our commercial systems, used by all employees. As well as being a member of the user group during this time, we have, in the past, also been a member of the JDE steering committee and chaired the Construction SIG. We recognise the benefits of being part of the JDE community, not just to sustain our ERP investment, but also to build a network of contacts and share experience in other areas. It's an essential part of maintaining our system whilst providing insight into complementary products and services as used by others. The last few years have seen the JDE community grow with each conference, demonstrating a vibrancy and interest that reflects a good product. We need to see more user stories on getting more value out of the system and further investments. Embracing a more online, social means of networking and knowledge sharing will widen the appeal of the group.

In summary, like-minded people, sharing experience around


a common interest, will add value to your investment.
UKOUG would like to thank all of those who contributed to
this article.

Technology

Oracle SQL Developer & Reporting
Introduction & What's New for Version 4.0
Oracle SQL
Developer is the
graphical user
interface (GUI) for
Oracle Database
developers,
administrators and
IT support staff.
Jeff Smith
Product Manager
Oracle

With more than 3.5 million active users, it is very well known for making it easy to execute queries, develop stored procedures and manage databases. However, only a small number of our users understand, and take full advantage of, our powerful reporting capabilities. This article will introduce the reporting feature and briefly describe its capabilities. The content was developed using Oracle SQL Developer v4.0.2 and Database 12c, each available for download from the Oracle Technology Network (OTN).
The reporting examples and the SQL code used to build them are all built on top of the Human Resources (HR) demonstration schema, which is available when using the Oracle Database Configuration Assistant (DBCA).

Defining Reporting & Use Cases

FIGURE 1: ORACLE SQL DEVELOPER V4.0 WAS RELEASED IN DECEMBER 2013

There are many commercial reporting solutions available in the marketplace today. Oracle offers Oracle Reports


Services, Oracle BI Publisher and several industry-specific reporting solutions. Oracle SQL Developer's reporting feature is for the database end-user. It allows someone to take one or more SQL statements and render their result sets as grids, charts, graphs, basic HTML, or script output inside of SQL Developer itself. It can also be described as ad hoc reporting for the database user.
So, why would someone care about this feature in SQL Developer? I find that only about 20 percent of our users (from informal polling and educated guessing) take advantage of the reporting feature.
While I am not surprised, I do think many
of you could greatly benefit from using this

feature. Let's examine a few scenarios where it might make your day-to-day business activities a bit more bearable:
You frequently run the same queries to answer business questions
You need to make information available to other database end users
Charts and pretty pictures make consuming your data and recommendations much more compelling

FIGURE 2: REPORTS PANEL IN THE SQL DEVELOPER DESKTOP
FIGURE 3: NEW REPORT DIALOG
Granted, that's only three scenarios, but they are BIG use cases. Maybe you have a Bob in accounting or a Nancy in field operations who contacts you every three months or so to ask that question. Wouldn't it be nice if you could just run the Bob report and have the data all nicely presented, so you could just email it to him right away?
Apart from formatting and standardising the output, you are also codifying the answer to Bob's question. Writing the query by hand each time he asks the question means that at some point you will probably write it a bit differently, and that introduces the opportunity for mistakes.

Ok, enough preaching, let's get on to the technical bits and examples.

The Basic Grid Report

The simplest report consists of a single query, with the results returned in a data grid (see Figure 4). Bob in accounting, just Bob for the rest of this article, wants to know how much the people in each department are costing the company in terms of payroll or salary.
You know this is a simple query, just a sum of salary grouped by department. But instead of running it as a query in a worksheet, you decide to create a report for Bob that you can run on demand.
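The driving SQL can be as simple as the following (my sketch against the HR schema; the column alias is arbitrary):

select department_id, sum(salary) as total_salary
from hr.employees
group by department_id
order by total_salary desc;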

To access the reports in SQL Developer, you can simply click on the Reports panel (see Figure 2). It is open by default in the SQL Developer desktop. If at some point you have closed it, you can restore it by going to the View menu.
SQL Developer ships with many pre-canned reports. These are reports that tell you more about your database. To create a new report, right-click on the User Defined Reports tree node and select New Report.
This will open the Create Report dialog (see Figure 3).
The fields are fairly self-explanatory, but let's define them anyway. The name of the report is what will be shown in the User Defined Reports tree. The style defines how the results will be rendered; in this case, Table means a data grid, as you are used to seeing in a SQL Worksheet when running a query. The description allows you to document the business requirement of the report. The tool tip is shown when you mouse-hover over the report name in the User Defined Reports tree. And finally, the SQL is the code that is executed when you run the report and supplies the data to be displayed.
Here is what that report looks like after it has executed:

FIGURE 4: THE OUTPUT OF A REPORT IN A FLOATING DOCUMENT WINDOW

Note that reports are not hard-coded to run against any particular database or SQL Developer connection. When you double-click on a report in the tree to execute it, the first thing you will see is a connection dialog prompt. This allows you to specify the database and user to be used to execute your report.

Since this is a standard SQL Developer grid, all of the grid features are available here. You can rearrange the columns, use Find to highlight specific values, export the results to PDF or Excel, and much more.
The column headers are defined by the SQL statement itself, so you
can take advantage of column aliases to make the report appear


as you require.

Child Reports

Bob may want to start asking more questions once you give him his salary numbers. For example: who is making so much money in department 80? Or: which department is department 80, again? You can help Bob with these questions by creating a few Child Reports.
Remember: Bob is ALSO a database user. He can run these reports in SQL Developer, or you can export the report to a consumable format such as PDF or HTML.

FIGURE 6: A PARENT-CHILD REPORT. NOTE: YOU CAN HAVE MULTIPLE CHILD REPORTS.

If we edit the Bob in Accounting report, we can add one or more


child reports.

The selected row in the Parent report drives the context of the
Child report data. The SQL supplied to achieve this result is:
select * from hr.employees
where department_id = :DEPARTMENT_ID
order by salary desc, last_name asc

Please note the where clause - the department_id column, which


is used in the Parent report, is referenced in the Child report using
the :UPPERCASE syntax. So when department 100 is selected up
top, the value 100 is passed to the Child report query.

FIGURE 5: CHILD REPORT DEFINITION

When adding the Child Report, the same information will be


required. Name the report, define the report style and provide
the query to drive the data displayed.
The only trick here is that if you want to reference a specific
value in the Parent report, you must use an upper-cased bind
variable in the child report.

Pictures Help Tell The Story

I like to joke that your bosses will ignore your reports until you add pretty pictures to them. Funny or not, some information is just easier to consume when presented visually. If we look at our first report, the sum of salary by department: how much more understandable would that be if it were displayed as a pie chart? Imagine your company has 360 departments and you need to quickly identify the top 5. Let's look at how SQL Developer's Chart-style report can help.

FIGURE 7: SQL DEVELOPER V4.0 NOW OFFERS MORE THAN 50 DIFFERENT CHARTING OPTIONS

Bob's boss wants just the top 5 departments in terms of payroll, and he wants it in a graph.
Instead of choosing a report style of Table, go with Chart. To define the type of chart and just what is to be charted, proceed to the Property item in the Report Tree (see Figure 7).
But before anything is defined, the SQL query must first be tweaked; remember that Bob's boss only wants the top 5 departments by salary to be included in the report.
So the driving query for this report will now be this (note the new FETCH FIRST syntax introduced in Oracle Database 12c):
select DEPARTMENT_ID, SUM(SALARY)
from hr.employees
group by DEPARTMENT_ID
order by sum(salary) desc
fetch first 5 rows only

The previous version of SQL Developer required that charts be


supplied with the Group, Series and Value properties via the
SELECT clause. Version 4.0 allows you to specify these manually
in the report designer.
With the new SQL statement supplied and with the report style
set to Chart, it is time to supply the details for the chart itself
(see Figure 8).
Please forgive me, I know that there is a lot going on in Figure 8. However, the most important things to pay attention to are the two areas highlighted with the red boxes. New for version 4.0, you can now live-preview how your report/chart will render AS you are defining the report! No more setting the properties, clicking Apply and then opening the report to see what would happen.

I find this single enhancement to be a key driving reason to upgrade to version 4.0 if you are already using the reporting feature. So if you want a pie chart instead, or if you are curious as to how this would look in a pie chart, just go to the Property tree node and change it. As you cycle through the charting types, you can see what the new report will look like.
When this screen is opened for the first time, the Group, Series and Value entries will default to null. The report engine will try to guess the values based on the items in the SELECT clause of your SQL statement. Instead, use the Fetch Column Names button and then use the dropdown controls to manually define the columns to be used. For the Series item, you can use static text, which will normally look better than the column definition. I have highlighted each property and its associated chart component; the colouring and red boxes in the screenshot will not be present when you design your report.
Once you have the other chart properties defined (and there are many) you can save and run your report as before. Again, the ability to live-preview the report as you are setting these properties will make it very easy to figure out what each property controls.

A Few More Features

In order to keep this article from growing into a book, I'll just briefly touch on the other capabilities of the reports before I show you how to run a report from the command-line interface.
Prompting the User for Values
If you have a bind or substitution variable in your SELECT, SQL
Developer will prompt you for a value when running the report.
You can also define a default value for these in the report
definition.
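For example (a minimal sketch reusing the HR schema from the earlier examples; the bind variable name is purely illustrative), a report driven by the following query will prompt for :MIN_SALARY each time it is run:

select employee_id, last_name, salary
from hr.employees
where salary >= :MIN_SALARY  -- SQL Developer prompts for this value
order by salary desc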

FIGURE 8: CHART DESIGN, DATA PANEL


Linking Reports
If you want to be able to jump from one report to another
without going back to the Report tree in SQL Developer, use the
Drill Down section in the report editor. This makes it easy to
navigate back and forth through multiple reports.
PDF Rendering
The report editor allows you to define exactly how the report
will be exported to PDF. It also lets you require a password to open the report and choose whether the report will be encrypted. If your query contains pictures stored as BLOBs, those can be rendered in the report!
Sharing Reports
A report, or a collection of reports, can be exported to an XML file.
These reports can then be imported into the Report tree by using
the Open dialog. This makes it easy to move reports between
computers or to share them with other SQL Developer users.
Organising Reports
All user-defined reports are listed together. Once you have more
than a dozen reports, you may want to begin organising them in
Folders. Create a new folder by right-clicking on the User Defined Reports tree item and using the New Folder item. Then cut and
paste reports from the main folder into the new subfolder.

Running The Report From The Command Line

If you have a report you want to be generated as part of an automated build process, then the new command line interface introduced in version 4.0 of SQL Developer is just what you need.
In the SQL Developer installation directory, navigate down to the bin directory. There you will find an sdcli.exe (for Windows) that you can run to invoke the reports feature.
Running the exe without any parameters will return a list of supported features. Running it again with a feature name, such as reports, will return the help and required syntax for running a report. Doing this for reports, e.g. > sdcli.exe reports, will return:

generate -report <path> -db <connection name> -file <output file> [-bind <bname>=<bvalue>]* generates an HTML report

And now for an example:

C:\Users\jdsmith\Desktop\sqldeveloper\4.0\sqldeveloper\sqldeveloper\bin>sdcli64.exe reports generate -report "Demo for 4.0 Take Three" -db HR -file C:\sqldev4.html

Success!
The report (whose name I quoted, as it contains spaces) was executed on the supplied connection, in this case HR, and a what-you-see-is-what-you-get rendered HTML page was generated. In other words, the report is exported to HTML as close as possible to how it is displayed in SQL Developer. PDF reports are only available when generated in SQL Developer, not via the command-line interface at this time.

Summary
Reporting is a key feature of Oracle SQL Developer and has been in the product since its debut in 2006. It can take one or more
queries and make their result sets easier to consume and understand. Reports can be presented in a spreadsheet or grid-like
style or can be rendered as one of 50+ charting types. These reports and/or charts can be previewed during the design process,
making the process more seamless and less iterative. Reports can be run inside of SQL Developer or, new for version 4.0, can be
generated from the command-line.

ABOUT THE AUTHOR

Jeff Smith
Product Manager, Oracle
Jeff is a Product Manager in the Database Development Tools Group at Oracle, and has
been obsessing over saving people clicks and keystrokes for the last decade.
Blog: www.thatjeffsmith.com

@thatjeffsmith


Introduction to Oracle SQL Developer Data Modeler
Oracle SQL Developer Data Modeler (Data Modeler) is a tool for designing and documenting databases. It can be used for Oracle databases as well as Microsoft SQL Server or IBM DB2 databases. The tool is available as a standalone product but it is also integrated into Oracle SQL Developer, so users can decide which way of using the tool suits them best. Installing the tool is very simple: just go to the Oracle pages, download the right version, extract the .zip file and start using it. Data Modeler is free to use and support is provided by Oracle if a customer has a database support contract.

Heli Helskyaho, CEO, Miracle Finland Oy

About Data Modeler

In Data Modeler a single design is called a Design. One design consists of one logical model and, optionally, one or more relational models that are based on that logical model, plus, optionally, one or more physical models that are each based on one relational model. There can also be multi-dimensional models, data type models, process models, domains, business information etc.

In the middle you can find the Start Page. This is very useful, especially when starting to use the tool. The Start Page has links to different kinds of documentation, tutorials, videos, online demonstrations, the OTN forum etc. The Navigator on the right-hand side shows the whole diagram and lets you navigate to the part of the diagram you want. Below the Start Page you can find the Messages Log pane, which shows all activity in the tool.

In Figure 1 you can see what Data Modeler looks like.

Preferences & Design Properties

Before starting to use the tool in production it would be good to define some preferences and design properties. These can be changed afterwards, but they only take effect from the moment of change and, if many people are using the tool, it might be confusing to change them too often. Preferences can be found in the Tools menu and Design Properties in the Browser, by right-clicking the design name and selecting Properties. The difference between them is that Preferences are valid in one installation of Data Modeler whereas Design Properties are only valid in one Design. Both Preferences and Design Properties can be exported and then imported to another computer or another Design.
FIGURE 1: DATA MODELER

On the left-hand side in Figure 1 you can find the Browser. The Browser is a directory of all the objects in a design. You can navigate to any of the objects with the mouse; right-clicking an object shows the actions allowed for that particular object.

Database Design with Data Modeler

The database design starts with designing a Logical Model. This is done on the canvas shown in Figure 2. On the top left there are icons marked with a red circle. These are the elements for designing.


By clicking the element icon needed and then clicking on the canvas, the element will be created. On the canvas you can also right-click on any object to see the actions allowed for that object. In the Logical Model you define entities, attributes and relationships.

When you have created a Physical Model you should define the properties for the physical objects. After that you are ready to generate the DDLs (SQL scripts for creating your database objects). DDLs are created from the File menu, under Export and DDL File. Then just run these DDLs against your database to get the objects created.
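As an illustration of the output (the table and constraint names here are hypothetical, not taken from a real design), the generated DDL for a simple table might look like:

CREATE TABLE departments (
    department_id    NUMBER NOT NULL,
    department_name  VARCHAR2(30) NOT NULL
);

ALTER TABLE departments
    ADD CONSTRAINT departments_pk PRIMARY KEY (department_id);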

Documenting Existing Databases

You can also use Data Modeler for documenting existing databases (Oracle, SQL Server, DB2). You can get the documentation from the data dictionary, from existing DDLs or from another design tool (Oracle Designer, Erwin). Or you can combine these, for instance by bringing some of the descriptions from another design tool and adding them to the information from the data dictionary. These features can be found under the File menu, in the Import functionality.
FIGURE 2: LOGICAL MODELER

The next step is to create a relational model based on the logical model. This can be done by pressing the Engineer to Relational Model icon shown in Figure 3 and following the instructions.

Quality

Data Modeler has good support for improving the quality of your database design. You can use predefined Design Rules or create your own rules and rule sets. You can create Glossaries based on an ER model or an existing Glossary and edit them as much as you want. You can compare models to each other and synchronise a model to a database, and vice versa. As a result of synchronising you can either get your model updated to the same level as the database, or get the alter DDLs to bring your database up to the same level as your model. There are many ways of improving the quality of your designs and databases when using Data Modeler.
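For instance (sticking with the hypothetical table from the earlier DDL sketch), if the model contains a column that the database lacks, the generated alter script might contain something like:

ALTER TABLE departments
    ADD (cost_centre VARCHAR2(10));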

Multi-User Environment & Version Control

FIGURE 3: ENGINEER TO RELATIONAL MODEL

When you are ready with the relational model it is time to create the Physical Model. This is created by right-clicking Physical Model in the Browser and selecting New, as shown in Figure 4. When creating a physical model you must know what technology your database will be (Oracle, SQL Server, DB2) and which version. All the properties for the Physical Model depend on the chosen technology.

In Data Modeler there is integrated support for Subversion, a free-to-use version control tool. The version control integration enables both version control functionality and a multi-user environment. The information in Data Modeler can also be exported to Excel for editing and then imported back into the tool. The export is done with the Search tool (shown in Figure 6) and the import by right-clicking either the Logical or the Relational Model and selecting Update model with previously exported XLS (XLSX) file, as shown in Figure 5. This functionality is very useful for end users who do not want access to Data Modeler but would like to add descriptions.

FIGURE 4: CREATING A PHYSICAL MODEL


FIGURE 5: IMPORTING FROM EXCEL


Reporting

Reports can be found under the File menu. There are several built-in reports in Data Modeler and a user can create more. Reports can be run in different output formats (HTML, PDF, RTF) and they can be based on manageable templates. The scope of a report can be the loaded designs or a separate reporting schema. A user can also print out a diagram by selecting that option from the File menu. Reports can also be managed through the Search option (as shown in Figure 6) by pressing the Report button.

Conclusions
Data Modeler is a good tool for database design and it can also be used to document existing databases. Data Modeler enables a multi-user environment and version control, as well as helping you to improve the quality of your database design. Data Modeler is free to use and easy to install.

ABOUT THE AUTHOR
FIGURE 6: SEARCH AND REPORTING USING SEARCH OPTION

Heli Helskyaho
CEO, Miracle Finland Oy
Heli Helskyaho is the CEO of Miracle Finland Oy and holds a Master's degree in Computer Science from the University of Helsinki. Heli is also an Oracle ACE Director and a frequent speaker at many conferences. Heli has been an Oracle Designer user since 1996 and a Data Modeler user since 2010. Heli's book on designing databases with Oracle SQL Developer Data Modeler will be out next year.

@HeliFromFinland

MEET A VOLUNTEER: JO
Name:
Jo Bates
Job Title: Business Systems Analyst
Company: Daiwa Capital Markets Europe
Tell us about yourself in 50 words
I've been working in the Oracle E-Business Suite space for 24 years and been involved in the UKOUG off and on for much of that time. Most of my career has been around the Financials modules but I've branched out into HR, Purchasing and Inventory, and latterly become more involved in the BI side of things.
What is your main goal in life?
To make a difference.
What are your interests?
Horsemanship, personal development,
coaching, inspiring, educating, solving
problems and enabling folk to achieve
their goals.


When did you first start volunteering with UKOUG?
It was sometime in the mid 1990s.
Why did you want to become a
UKOUG volunteer?
It seemed a great way to be involved in
sharing information and experiences.
What volunteering activities do you undertake?
I am a Co-Chair of the Financials SIG.
What is the best thing about
volunteering?
The opportunity to meet lots of people!
What's most rewarding about volunteering?
The feeling that we've just delivered a good event that the members enjoyed.

What does a day of volunteering involve for you?
Sourcing speakers for the SIG, agreeing an agenda with the rest of the Committee and, on the day of the event, introducing the day and the speakers and generally helping to ensure everyone is ok.

@JoandTillie
https://www.linkedin.com/pub/jo-bates/1/a35/9a9


Registration is open for UKOUG Technology Conference & Exhibition 2014, the largest independent event for the Oracle Technology community in the UK. Over 200 world-class speakers and industry experts will meet in Liverpool on 8th - 10th December 2014 to talk all things Tech.

Register now at www.tech14.ukoug.org/register

Discover, Develop, Deliver with Tech14

A Conference With You In Mind

If you're a DBA, Developer, Architect, System Administrator, Data Analyst, Technical Engineer or work with Oracle Technology in any way, Tech14 is an essential three days of training you cannot afford to miss.
Your organisation's needs and priorities are constantly developing and you need to ensure the technology you use meets these demands. Whether this is ensuring data is secure or developing an app that meets your marketing department's requirements, you need to understand how to get the best out of the technology you currently have and how new or emerging technologies could work for you.
Many organisations are turning to cloud, mobile, social and analytics to find solutions and the Tech14 agenda has been designed so you can get practical knowledge and advice about these key areas. Tech14 has a multiple stream agenda that covers five core practice areas:
Database
Development
Middleware
Big Data, Business Analytics & Data Warehousing
Operating Systems, Network & Engineered Systems, Hardware and Storage


Discover

As Oracle Technology is constantly changing, you need to keep up. Tech14 is the best place to discover what's new in your industry:
Find out the latest news on new releases - remember Tech14 is only a few months after Oracle OpenWorld, so if you're missing out on going to San Francisco, Liverpool is the next best thing!
Hear from early adopters of 12c about their experiences, recommendations and things to avoid so you can successfully manage the transition in your own organisation.
Meet industry experts in sessions and in the exhibition hall where you can also explore products and services that can help you meet your objectives.
Get practical hints and tips that you can take back to your team and turn into results.

Develop

As well as learning what you can do, you need to know how
to do it. The Tech14 agenda covers sessions on developing
solutions in your organisation that lead to results.
Understand how to get the best out of your current systems
and improve efficiency now and in the future.
Use the best practice you learn in sessions to make
performance improvements.
Attend sessions that will help you manage projects.


Deliver

Book early and save! Check out our money saving tips for conference & travel on P7


Delivery brings the agenda full circle. Attending Tech14 will mean that you can return to your business and implement change that leads to success:
Cut overheads and enhance your organisation's future productivity.
Effectively address the specific challenges faced by your organisation.

OakTable World
OakTable is a network of internationally renowned Oracle
scientists who believe in a better way to develop and administer
Oracle Systems. Tech14 and OakTable World will together give
you access to some of the finest technical minds in the industry,
with dedicated streams within the conference and Super
Sunday, plus OakTable speakers across the rest of the agenda.

Find out more about the Tech14 agenda at: www.tech14.ukoug.org/agenda

Oracle UX
Once again you have the opportunity to take part in the Oracle
UX usability feedback sessions where your input influences the
existing and future usability of Oracle products that affect you,
your organisation and the rest of the Oracle community.

A Conference Giving You More

Tech14 is not just a great agenda. There are so many other reasons to come to Liverpool in December.
Super Sunday
Back by popular demand, Super Sunday is an additional day of free content to enhance the value you get from your conference visit. It's held in the same venue on the Sunday before the conference. This year's agenda includes highly technical content from speakers including Jonathan Lewis, Andrejus Baranovskis and Julian Dyke. Together, you'll be tackling topics ranging from ADF myth-busting to what a salted banana means to SQL! Take a look at the agenda at www.supersunday14.ukoug.org. You can register for Super Sunday in two ways:
When you register for Tech14. Just check the Super Sunday options when you see it.
After registering for Tech14 by calling our Registration Team on +44 (0)1462 744 933.
Community Keynotes
Four big names in the Oracle world will be at Tech14 to bring
you up to date with the latest news and developments:
Maria Colgan, Database Community Keynote
Frances Zhao-Perez, Middleware Community Keynote
Duncan Mills, Development Community Keynote
Jack Berkowitz, Business Analytics Community Keynote
Take a look at the Tech14 agenda to see the key topics they will
be discussing at www.tech14.ukoug.org/agenda.
RAC Attack
The RAC Attack ninjas are back at Tech14 to help you build a
fully functional Oracle RAC database on your own laptop using
virtual machines. And the best bit is you can take it home with
you afterwards.

Rover Ticket
Tech14 is co-located with Apps14 this year. If you like the look of the Apps14 agenda and want to attend sessions at both conferences, you can. Simply buy a rover ticket for £150 when you book your conference ticket.
Legendary Socials
Attending Tech14 is not just about the agenda - it's also about meeting up with old friends and making new ones. Share your experiences and learn from each other as you socialise at our official Tech14 socials, partner events and impromptu gatherings.
Exhibition & Sponsorship
If you are a supplier of an Oracle related product or service then
exhibiting or sponsoring at Tech14 will give you access to the
largest collection of qualified prospects in the UK. Contact Kerry
Stuart on kerry@ukoug.org to discuss your marketing objectives
and how we can help you meet them.

Spread The Word

If you're already coming to Tech14 or think other people you know might be interested, tell people about it! Blog about it, share it on social channels, put it in your email footer, and anything else you can think of to spread the word. The more people we get at Tech14 the better your delegate experience will be. Go to www.tech14.ukoug.org/promote for resources that you can use to promote the event.


Women in IT

Debra Lilley, UKOUG Member Advocate

Speaker Spotlight
At last year's conferences we held Women in IT sessions where delegates shared their experiences about working in IT and agreed on 3 key initiatives:

Share: Stories both on our website and through the BCS Computers in Schools initiative
Mentor: Women who want to step up and tell their story
Speak: Encourage women to present at SIGs or at conference

The share initiative is hopefully inspiring our members and further afield with the different roles and case studies being showcased. If you haven't submitted yours please do so at www.ukoug.org/womeninit. A few women have stepped forward and asked for a mentor and we have willing volunteers to match them with.
A recent study (www.theguardian.com/higher-education-network/2013/jul/05/science-women-representation-university-policy) showed that women hold only 13% of STEM jobs in the UK and, whilst this number is far too small, I am pleased to say that UKOUG has just exceeded that percentage in women speakers for our annual Apps and Tech conference & exhibitions*

At Tech14, not only are women speaking, but they are phenomenal women.
Maria Colgan, the queen of the database, is giving the technology database community keynote on In-Memory. Maria is a great advocate of UKOUG and one of the many UK and Irish experts working at Redwood Shores. Nadia Bendjedou, who holds a British passport but lives in Paris, is key to our E-Business Suite agenda in Apps14. Yes, Oracle looks to us for the best talent.

Sometimes I hear that the women who are in technology tend to be outside of the really technical roles, but I dispute this and you need look no further than at our agenda to see why.
From the very depths of the database with Maria, to Robyn Sands speaking about pure SQL, Melanie Caffrey on database archiving and, bringing many of these areas together, the importance of OEM being covered very nicely by Kellyn Pot'Vin. Tammy Bednar speaks on the Database Appliance and Mina Sagha Zadeh on Solaris.
Frances Zhao-Perez will cover both Architecture and Middleware and from the wider development world we have Mary Garvey from the University of Wolverhampton, Iloon Ellen-Wolff on APEX, Lynn Munsinger on ADF and UX, with Lonneke Dikmans and Simone Geib on all kinds of data and process integration, Mia Urman on developing ADF mobile on Oracle Forms, and who can forget our great volunteer Susan Duncan, who will be not only speaking but co-ordinating our mobile development sessions as well. As consumption of IT looks to the Cloud, our own UKOUG director, Fiona Martin, will discuss the opportunities and challenges it brings.

*Names and agenda information were correct at time of print. The agenda is subject to change.


Top row, from left to right: Fiona Martin; Debra Lilley & Jo Bates; Frances Zhao-Perez; Julie Stringfellow
Bottom row, from left to right: Lonneke Dikmans; Mia Urman; Nadia Bendjedou
Bottom right image: Maria Colgan

Between them, these ladies seem to have the whole stack sewn up.

At Apps14, if you were to attend just the community keynotes and then only attend sessions given by women, you wouldn't have time to draw breath, and you'd get a real wealth of end user stories, our favourite.
This year we have a customer showcase on Monday led by Julie Stringfellow, who will tell the Reading Borough Council story on adopting Oracle Financials Cloud. Luci Love from BG Group is presenting on adopting Oracle FCM Cloud integrated with SAP and Nathalie Shawcross from the NHS about Sales Cloud.

Reporting solutions are always popular and Susan Gillis and Sue
Maloney from The University of Oxford tell their story, whilst
Julie Niven shares the Cairn Energy story along with Julie Bowen.
Dr Lizzy Wright will be showcasing her work with Hyperion.
Oracle Analytics is covered by Jo Bates of Daiwa Capital Markets
with their case study and Stephanie Gilbert talks about
migrating Hyperion.
There certainly is no shortage of female expertise in this agenda.
On the Wednesday morning women attending both Tech14 and
Apps14 are invited to come together for another Women in IT
session, where our panel will share their stories and inspire us
all. Everyone is welcome to join us.

Cindy De Smedt joins a colleague to tell us about the Marketing Cloud and Fiona Martin talks about cloud adoption. I will share the design principles and User Experience ethos behind cloud applications and Zoë Read will show us a sneak preview of the next release of Financials Cloud.
Whilst Nadia Bendjedou explains how to run E-Business Suite, Margaret Walsh will give us an E-Business Suite roadmap session and other product updates will come from our favourites Liz Wilson for PeopleSoft and Nicki Payne for Siebel.
More customer case studies follow. Kate Williams talks about their experience of E-Business Suite Collections at the University of Manchester. Hear how Iona Livesey at Bupa uses mobile with PeopleSoft and, continuing with this theme, Susan Duncan joins us fresh from the Tech agenda to look at the adoption side of Mobile in Apps.


Business Intelligence

A few words about the Top 10

Data Warehousing Trends & Opportunities
Gürcan Orhan, Data Integration Architect, Wipro Technologies

Earlier this year, Oracle released a white paper about the Top
10 Data Warehousing Trends & Opportunities for 2014, which I
have used as the basis for this article.

So let's take a look at the Top 10 Data Warehousing Trends and Opportunities for 2014 white paper.

I believe that all enterprise-level companies have their own data warehouses on some platform, producing meaningful data, correlations, dashboards, ad-hoc reports, cubes, etc. from their source systems. As data grows far more than expected, IT leaders and architects need to take more action to meet ever more demanding business needs.

1. The datafication of the enterprise spawns more capable data warehouses

Every organism has individual characteristics that separate it from every other, and so do companies. Each company has its own culture, workflow or business that makes it unique. So there is not a single best practice approach that fits every company. This may sound weird, but this is why adaptation of best practices occurs: to solve an equation with multiple unknown variables, you need to pin down all but one of them.


Most of our data warehouses' source systems collect data from OLTP (On-Line Transaction Processing) applications, mostly ERPs, CRMs or other operational utilities, where data is generally produced by humans. Because values are selected from pre-prepared combo boxes, check boxes or radio buttons, most of this data is structured, except for descriptive fields. But in the new era there are other sources (cloud-generated, machine-generated) that produce much more than these traditional human-generated systems, and this data, such as social media feeds, mobile phones, sensors, terminals, monitoring systems, etc., is mostly unstructured. This huge volume of data should be correlated and unified with the historical, traditional, structured data in order to acquire, organise and analyse it and produce valuable information for decision makers.


The data shown should be reduced and aggregated as the level of the organisation gets higher. A sales representative will probably need a list report, but his/her manager will need a bar chart in order to check many KPIs. Variety will increase, and a sales director will need to see multiple bar and pie charts in order to make a better decision.

2. Physical and logical consolidation reduces costs

Consolidating systems with different activity patterns helps to maximise IT resources and minimise inactive or idle hardware. Traditional OLTP systems generally need to be ready for action during the daytime and, vice versa, data warehouse peaks are generally at night. So it would be worth considering consolidating these systems and letting them run on the same environment at different times of the day; to simplify, using the same server resources for both data warehousing and ERP.
In conclusion, engineered systems are specifically designed to handle consolidation of databases with different aims, reducing system management tasks, energy consumption and data centre space through a perfect combination of database, storage and networking gear.

3. Hadoop optimises data warehousing environments by accelerating data transformation

Transforming data requires analysts to convert unstructured data to semi-structured or structured form in order to perform analysis or use it in pattern recognition or other methods. Hadoop (open source software that enables organisations to process huge amounts of data across inexpensive servers and storage devices) is the key to adding meaning to this huge data and expanding data warehouses if an organisation wants to be competitive.

4. Customer experience (CX) strategies gain real-time insight to improve marketing campaigns

Most of the reports generated by data warehouse systems, as well as their unification with big data, exist to identify the best offer for targeted customers and to segment them, so that marketing professionals can improve loyalty, measure and analyse sentiment, formulate more effective promotions and, surely, gain more revenue.
Unfortunately, traditional data warehouse approaches with classic demographic information about customers cannot always help marketers, sales representatives or customer service personnel understand what customers need, because demand changes from time to time. For instance, if a customer is on vacation abroad, it would be counterproductive to call him to offer a discount that will only be available for 24 hours.

5. Engineered systems become the de facto standard for large-scale information management activities

Engineered systems are preferred by organisations that want to move fast, because these systems provide a simpler, cheaper, pre-installed and more flexible solution for their needs. Instead of buying and installing different equipment from different vendors, it is much more feasible to obtain it all from one vendor, already preconfigured. This equipment can be servers, storage systems, networking equipment, peripherals and management software. Engineered systems are designed to deliver optimum performance, with software and hardware working in harmony to get the best from the database.
Since the system is already designed and configured for the expected output, it accelerates time to market, which is a very important KPI for large-scale organisations.

6. On-demand sandbox analytics environments meet rising demand for rapid prototyping and information discovery

Once you have prepared all the data in your data warehouse, it needs to be seen by the business users. Business intelligence and analytics platforms should handle complex requirements and on-demand computing. Analytics as a service (AaaS) on public or private cloud is being established by future-ready organisations. Being flexible, these versatile sandbox environments can be scaled up or down in order to analyse challenging volumes and velocity of data.

FIGURE 1



FIGURE 2

Multitenant database technology enables organisations to create their own cloud environments with many pluggable databases.
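As a sketch of the idea (the names and file paths are illustrative and assume an Oracle 12c container database), a sandbox pluggable database can be provisioned with a couple of statements:

CREATE PLUGGABLE DATABASE sandbox_pdb
    ADMIN USER sandbox_admin IDENTIFIED BY a_strong_password
    FILE_NAME_CONVERT = ('/pdbseed/', '/sandbox_pdb/');

ALTER PLUGGABLE DATABASE sandbox_pdb OPEN;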

7. Data compression enables high-value analytics

As is well known, the cost of storage is not decreasing at the rate the amount of data is growing; in the meantime IT budgets stay the same, but the business needs more data to make higher-quality analysis. So the amount of data stored on the same size of storage should be increased, and this is why compression makes sense: it enables us to store more data on the same size of disk. As the volume of data rapidly increases, our data warehouses get larger and our systems need to be scalable. The combination of row and columnar compression in today's technology not only saves space, but also improves performance.
In conclusion, with the same IT budget, 10x more data can be stored, with better performance when accessing the data.
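As a minimal sketch of the syntax (the table names are illustrative; hybrid columnar compression requires Exadata or other Oracle engineered storage):

-- hybrid columnar compression for warehouse data
CREATE TABLE sales_history
    COMPRESS FOR QUERY HIGH
    AS SELECT * FROM sales;

-- row compression, available on any platform (applies to newly inserted data)
ALTER TABLE sales_current COMPRESS FOR OLTP;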

8. In-database analytics simplifies data-driven analytics

It is always hard to find patterns and relationships in large volumes of data. But in-database analytics capabilities, which include data-mining algorithms implemented in the database, supported by native SQL functions for basic statistical activities and integration of statistical programming languages (like R), make the dream come true. By eliminating the movement of data from one server to another to make statistical calculations, cycle times are accelerated and total cost of ownership is reduced. Eventually this means less hardware cost, less management and less maintenance as well; analysts can run their jobs on the same server rather than duplicating the data to an analytics warehouse, with no waiting for the data to load onto an analytics server either.
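A minimal sketch of the native SQL side of this, using the familiar HR sample schema:

SELECT department_id,
       MEDIAN(salary)                     AS median_salary,
       CORR(salary, commission_pct)       AS salary_comm_corr,
       REGR_SLOPE(salary, commission_pct) AS salary_comm_slope
FROM   hr.employees
GROUP  BY department_id;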

9. In-memory technologies supercharge data warehouse performance

FIGURE 3

It is true that reading from and writing to memory is much faster than from and to magnetic disks. But with today's technology, important data, which is small compared to the complete data warehouse, can now be moved into RAM to achieve performance improvements and to answer queries, acquire and analyse data quickly. Of course, new database technologies make this come true. As hardware gets cheaper, complete data warehouses can be stored on RAM, flash or disk and can be configured according to the business needs. It is suggested to keep important and frequently accessed data in RAM, to process it instantaneously and provide new analytics opportunities. There are also methods based on heuristic access patterns, where the system can decide whether data is placed in RAM, flash or disk.
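As a sketch of how this looks in Oracle Database 12c (12.1.0.2 onwards, with the Database In-Memory option; the table names are illustrative):

ALTER TABLE sales INMEMORY PRIORITY CRITICAL;  -- populate hot data into the in-memory column store
ALTER TABLE sales_archive NO INMEMORY;         -- leave cold data on flash or disk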

10. Data warehouses become more critical to business operations

In today's conditions, data warehouses not only provide support to decision makers, but are also mission-critical for running the business. Your data warehouse must be more highly available than ever in order to meet this level of criticality, providing continuous access to information even during planned or unplanned outages. Applying software patches, consolidating databases or migrating information to new platforms should be done without compromising user activities. Information security and data security are very important for almost all industries, because the data warehouse is no longer used by just a few back-office analysts. The information architecture must treat security and availability as foundational.

References
White Paper: Oracle Top 10 Data Warehousing Trends and Opportunities for 2014 - www.oracleimg.com/us/dm/dw-top-trends-2014-v13-2075572.pdf
Oracle Data Warehousing Solutions Main Page - www.oracle.com/us/products/database/datawarehousing/overview/index.html

ABOUT THE AUTHOR

Gürcan Orhan
Data Integration Architect, Wipro Technologies
Awarded the Oracle Excellence Award, Technologist of the Year 2011: Enterprise Architect, and an ACE Director for Business Intelligence expertise. Experienced mostly in data warehouse architecture and ETL development, Gürcan has been working with database systems since 1994. He developed his first data warehouse in 2003 with Oracle 6i. He has used almost all well-known DBMS systems, modelling, ETL and BI tools, but is experienced mainly in Oracle Data Integrator as a data integration tool. He's one of the board members of TROUG (Turkish Oracle User Group) and Chairman of its BI & DWH SIG.


Technology

Going Mobile:

From Concept
to Delivery
Throughout this magazine you will no doubt see articles from leading experts in a whole range of technologies, using their years of experience to successfully implement solutions on the Oracle stack. Well, in this article I want to tell a slightly different story: a story of how an applications DBA, rather than someone from a development background, managed to end up successfully designing and building a production on-device mobile application for Lloyd's Register. If you want to learn the lessons, and maybe start building your own mobile applications, then read on!
Richard Childe, Independent Oracle Applications DBA & ADF Developer

Why Mobile?

I've always thought that whilst the IT industry is quick to announce something as the next big thing, the reality is that unless the people who hold the IT purse strings have also decided it's the next big thing, then it's going to count for little. There are plenty of statistics out there to demonstrate the explosion in mobile app usage, such as the research showing that overall app usage grew 115% in 2013. And when you see the number of people at work with smartphones (and hear the complaints that they can't access anything to do with their jobs on them), then it's clear that the demand for mobile is tangible, real, and it is here today.
Everybody wants mobile apps. People on the move want quick access to their work data on their shiny new tablet, not on an anti-virus laden corporate laptop that takes an age to start.
Working at Lloyd's Register (LR), I'd already dipped my toes in the water with Oracle ADF by re-developing a few of LR's public-facing web pages using Oracle ADF 11g. I'd learnt the framework mostly by reading Grant Ronald's The Quick Start Guide to Fusion Development, and studied a few examples on the web. Some people are a little bit wary of a perceived complexity of ADF, but once you find your way around it, you start to appreciate just how quickly and easily you can get an application up and running.

A Mobile Framework

So with only a little ADF knowledge, my interest was piqued when ADF Mobile hit the market in late 2012. ADF Mobile is a hybrid mobile framework based on Java and HTML5; apps can be written once but deployed to both iOS and Android devices. I decided to take ADF Mobile for a spin and see if I could get a simple piece of data out of a database and onto a phone. Somewhat surprisingly this proved to be very straightforward. The enthusiastic response to this small achievement from Lloyd's Register management was encouraging. This simple demonstration showed that developing a mobile app could be fairly straightforward, did not have to involve a vast army of developers, could sit happily on an existing Oracle infrastructure, and would not entail the spiralling development costs that sometimes happen when a small mobile development company comes unstuck when trying to deal with your IT infrastructure.


FIGURE 1: INITIAL PAGE DISPLAYING OVERALL SNAPSHOT OF SURVEY DATA FOR A CLIENT, PLUS VESSEL LIST AND SURVEY STATUS INDICATOR

FIGURE 2: VESSEL SURVEY CATEGORIES AND STATUS

The Mobile Application

We already had a definite idea of what we wanted for one particular production app: something that would allow clients of the company in the shipping industry to monitor the status of the inspection surveys which are regularly carried out on their vessels. This on-device application would be based on a sub-set of data already exposed through the Lloyd's Register website. However, if the same functionality was to be presented on the phone we had an immediate challenge - I couldn't see how I could realistically employ any sort of on-device caching strategy, as the data volumes to be searched were very large, specific to a particular user, and constantly changing. I took the decision that so long as the mobile app wasn't pulling gigabytes of data over the network, and the back-end was running database queries that took less than a second to complete, then it should be perfectly workable to simply request and deliver data to the mobile application as and when it was needed. For these web service calls, REST seemed to be the way to go due to its well-documented advantages for mobile access.

One immediate advantage of writing this mobile app is that we weren't committing to some horrendously complex application.
Mobile apps, by their nature, tend to be simple and easy to use. When planning an app, it's always worth remembering that the average time a person will spend
FIGURE 3: SIMPLIFIED ARCHITECTURE VIEW

using a mobile app is just over 1 minute, so I decided I probably shouldn't be looking at trying to shovel a re-write of E-Business Suite on to people's phones.

When it came to actually coding the app, I switched to using a Mac, as the iPhone simulator proved to be an essential item in the toolkit, allowing me to quickly view the fruits of my labours without loading it on to an actual phone.
However, now and again I did need to see how it behaved on a physical device, so I deployed my web services into the integrated WebLogic server, and put the laptop on a WiFi network, so that a phone or tablet could talk to it. My app uses a number of backing beans, and using the debugger was a must, as it can talk to the simulator or a physical device and halt the execution at a breakpoint in the usual way. I made extensive use of JDeveloper's built-in HTTP Analyser to test my web service calls, which are basically RESTful web services which hook into an ADF application module to obtain the data from ADF Business Components view objects. Each row being returned from the view object is loaded into a POJO, an array of which is returned from the web service call in REST/XML format. A simplified view of the architecture looks a little like Figure 3.

Branding and look and feel tend to be hot topics when it comes to mobile apps and, luckily enough, LR were undergoing a major re-branding exercise during the development phase, so fortunately the in-house brand management team gave me plenty of ideas and direction on a look and feel. With ADF Mobile it's pretty easy to change the look and feel as I was using skinning, which allows you to create a style-sheet to be applied to all your mobile pages in the same way a website does, rather than burdening each page with styling directives. Determining how the users would navigate their way through the app was initially based on some very simple wire-framing of pages, together with a basic page flow diagram. The app is essentially one ADF task flow: a series of pages linked together with control flows. Mostly the design process evolved slowly over time, and in retrospect I learned the importance of nailing down the whole UX from the start and not trying to deviate from it too much. The initial beta-testing we did with users made me re-think the design a little, and taught me that maybe I didn't get enough user input from the outset.

Lesson learned. Know your audience! Take a look through Oracle's Fusion Applications User Experience Patterns and Guidelines website for some very good guidance on getting this right from the off.

If you're thinking of developing an app with Oracle ADF Mobile, here's what you (and your team, if you're lucky enough to have one) need to arm yourself with:

Key skills:
ADF Mobile - Learn how to build a small app. There are plenty of good videos out there from the Oracle development team to get you started with Mobile.
ADF BC - For database queries, learn how to build a simple application module with a view object to query some data.
Java - Basic syntax and structure. Look for examples on how to code a RESTful web service and access your ADF BC components, and learn how to use the debugger and the HTTP Analyser.
SQL Tuning - Again, if you are going to be running database queries, your queries need to finish pronto. You may get away with a SELECT statement taking 5 seconds to complete in an OLTP system, but not in a mobile app. Learn to read an explain plan (see the sketch after this list), or ask your friendly DBA to do it for you. (Good luck with that). And remember that materialised views are your friend.
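A minimal sketch of the mechanics (the table and bind variable are hypothetical stand-ins for your own query):

EXPLAIN PLAN FOR
    SELECT vessel_name, survey_status
    FROM   surveys
    WHERE  client_id = :client_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);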

When it came to some initial beta testing, we used TestFlight (https://testflightapp.com) to allow us to distribute the app to a selected user base under controlled conditions.


We supplied the download details to our users along with a questionnaire concerning the app's usefulness and usability, the results of which allowed us to make some pertinent changes.
This was made easier by the Agile approach we'd taken with this project, which seems particularly suited to mobile app development. We were able to make regular presentations of the app to the business stakeholders. You don't need the full, functioning, finished application to demonstrate something meaningful to them. And the iPhone simulator, along with some devices handed around during the meetings, made an ideal way to present the work I'd done so far.
When the development work was complete, it was time to add security. That meant securing the web services with Oracle Web Services Manager (OWSM), and SSL-enabling the web service traffic. We placed a load balancer in front of our WebLogic servers and terminated SSL at that point, which allows us to spread the load across multiple servers, and provides for a much simpler URL naming scheme and SSL certificate setup to facilitate any future applications sited in the same infrastructure.

Deploying on the Apple iStore and Google Play

Once I'd put the supporting production database and application server infrastructure in place, it was time to make the app available on Apple's iStore and Google's Play. You'll see a lot of bad press on the web around how picky Apple can be with allowing apps to go on the iStore but, actually, we found the process to be fairly stress free. The ADF Mobile framework meets all the stipulated technical requirements, so unless you've shoe-horned something into your app that the Apple reviewers aren't happy with, you should find the submission process goes smoothly. Make sure your accompanying documentation is up to scratch, and make sure the description of the app accurately reflects what it does.

Looking to the future, there's a number of possibilities to take this further.
A surveyor carrying out a vessel survey may find it useful to have the vessel's current GPS location available. Using the device's camera to capture images of damaged or worn components and upload them to a database may also be a handy capability. All of these are capabilities the framework already supports, and I don't think it's an exaggeration to say that a well-written mobile app has the potential to make life much easier for people out in the field and change the way they work. The possibilities are huge. Take a look around your workplace and have a think about how mobility can improve existing processes.
The tools you will need:
An Apple Mac or MacBook - worth it for the iPhone simulator alone. All the Android emulators I tried were as much use as a chocolate teapot.
Oracle JDeveloper & lots of coffee.

General Tips:
Speak to your users and ask them exactly what they want out of the app. Make sure they'll want to download it and use it on a regular basis. Keep them engaged during the development process to make sure everything is heading in the right direction. Make the app intuitive to use, and easy to navigate. It shouldn't require any sort of user manual. All this might seem obvious but it's important not to lose sight of it. Good luck!

Sources:
Mobile-App Use Increased 115% in 2013 [http://mashable.com/2014/01/14/mobile-app-use-2013]
Oracle Fusion Applications User Experience Patterns and Guidelines [http://www.oracle.com/webfolder/ux/applications/fusiongps/mobile/index.htm]
Know Your Users Guideline [http://www.oracle.com/webfolder/ux/applications/fusiongps/mobile/index.htm]
Study: Average App Session Lasts About 1 Minute [http://readwrite.com/2012/01/17/study_average_app_session_lasts_about_1_minute#awesm=~ovcbifLYharQyn]

ABOUT THE AUTHOR
Richard Childe
Independent Oracle Applications DBA & ADF Developer
An independent Oracle Applications DBA and ADF Developer who has worked with Oracle technologies for over 20 years, for a variety of companies including AT&T, AHL Trading & Lloyd's Register. To learn more about developing a mobile app with your existing Oracle stack, contact me through LinkedIn or talk to Nymad at www.nymad.co.uk (info@nymad.co.uk).


Focus on: Apps14


caotinos ns
ApApplipcaliti
ati
vaotinon
InnInonvo

8-10 DECEMBER 2014 | ACC LIVERPOOL

Connecting
Communities


The Apps14 agenda is bigger and better than ever before:
Over 200 sessions
Over 10 tracks per day
Packed days of content
First time speakers
Community keynotes

Apps14 has quickly become a success


story by offering the community true deep
dive content and training opportunities
for various product modules. Attendees
of the 2014 conference will experience
sessions on Oracle Applications areas such
as E-Business Suite, Hyperion & EPM, CRM,
Business Analytics, PeopleSoft, Applications
Innovation and Business & Strategy.

Technical Users: Tune your skills while gathering best practices and fixes from others like you. You also have the opportunity to interact directly with Oracle and the people who developed your product.



Virtual streams covering the hot Oracle topics: Cloud, Social, Mobile, Analytics.


What can you expect at Apps14?

C-level & Management: Attend our Business & Strategy stream to find out about future developments and understand how the future of applications influences your company's strategic direction.

There's more
The exhibition will showcase the leading Oracle Applications solutions, representing vendors from all over the world. Plan to visit the exhibitor booths, network with your peers and stick around for the evening social on Tuesday.

Oracle E-Business Suite (Mon, Tue, Wed)
Concentrate on what's important and how the applications community can work together across all brands, with sessions on general Oracle E-Business Suite, Oracle HCM, Oracle Projects, and Oracle E-Business Suite CRM to name a few topic areas.

Functional Users: Choose from the products and streams most relevant to you, to improve your day-to-day processes and increase efficiency through peer-to-peer networking, lectures, roundtables and more, specifically designed for users like you.

Cool Britannia will celebrate all that is great about Great Britain. Imagine all your favourite hits from the Beatles, wrapped up in our themed networking area. Meet up with old friends and make new ones.
Agenda highlights include Willow Table - your opportunity to put your questions about any functional aspect of Oracle's E-Business Suite applications to a panel of experienced applications users and consultants, including Cliff Godwin, Oracle E-Business Suite General Manager, who will also be delivering the Oracle E-Business Suite Community Keynote.

With over 35 tracks and more than 200 sessions at your fingertips, there is something for everyone in your company.




The 2nd annual conference is shaping up to be the most impressive event yet, with over 150 speakers taking the stage on 8th-10th December at the ACC Liverpool.

Connecting Experiences

View the full agenda and register your place today: www.apps14.ukoug.org


Hyperion & EPM (Tue, Wed)
From roadmaps, case studies, tips and tricks to Cloud-based applications, mobile solutions and data integration. This stream has been designed to make sure that Apps14 delegates will get the best from their Oracle Hyperion investment and will give delegates the opportunity to get feedback on OpenWorld.
Agenda highlights include roadmaps and future directions from key Oracle speakers like Rich Wilkie; customer case studies; and interactive hands-on workshops from PureApps, Keyteach and more.
CRM (Mon)
These streams will cover the many aspects of CRM, including Oracle CRM, CRM OnDemand, On-Premise, Cloud and Fusion CRM/CX. Join us to hear about the Siebel and CRM OnDemand roadmap, social and mobile.
CRM delegates can expect the same as they have enjoyed from the annual CRM conferences in recent years, with the added benefit of being part of a single, larger applications event. The steering committee has designed the format to be a conference within a conference to maintain the vibrant community we have established.

Agenda highlights include a Community Keynote update on CRM and, for Siebel customers, the many faces of CRM. Attend to hear more about the innovative solutions available to maximise your Siebel ROI, enhance customer experience and improve overall business performance and satisfaction.
PeopleSoft (Tue, Wed)
Four streams over two days covering
Human Resources, Financials, Global
Payroll and Technology. Speakers from
Oracle, end users and suppliers will
focus on product development, real
world experiences and innovative ideas.
Come along to understand the latest
information on PeopleSoft 9.2, share
experiences, and find hints and tips to
address issues encountered in the
work environment.

Business Analytics (Mon, Tue, Wed)


There's a dedicated stream for Business Analytics across three days, where delegates
explore the many aspects of BI, through
E-Business Suite, PeopleSoft, JDE, Hyperion,
CRM and a mixed view of multiple data
sources. Most sessions include real-life
case studies from organisations sharing
the benefits of their experience of having
integrated BI into their Oracle applications.
Plus thought-provoking sessions on
advanced topics like Big Data, real-time
analytics and data visualisation that show
how BI can enhance operational systems
and decision-making beyond traditional
structured reporting.

Applications Innovation (Mon, Wed)
This stream is all about what can add value to your organisation: this may be an upgrade to allow you to make use of the latest enhancements, co-existence or migration to Cloud (Fusion) Applications, expanding your portfolio with new acquisitions such as Endeca, or taking the AppAdvantage and using the Fusion Middleware stack.
Agenda highlights include John Webb, Vice President PeopleSoft Product Strategy, delivering an update on the Oracle PeopleSoft product line and leading an open discussion on Oracle PeopleSoft; Best Practices in PeopleSoft 9.2 Upgrade; and an upgrade roundtable and Q&A bringing users together to discuss their upgrade challenges.
Our Community Keynote is Jeremy Ashley, Vice President of the Oracle Applications User Experience group, and is sure to be as entertaining and insightful as ever.

Book early and save!
Check out our money saving tips for conference & travel on P7.


E-Business Suite

KPIT - Oracle Partnership


In today's rapidly globalising world, it is imperative for shippers to have robust inter-continental supply chains. To sustain, grow and remain differentiated in the competitive marketplace, manufacturers need to identify modes that are safest, cost effective, and reliable.

Nikhil Gupta
Solution Architect
KPIT

Driving Business Value Leveraging Innovation
Container transportation has therefore emerged as the most preferred option and is undoubtedly the largest mode of transportation of goods across the globe. As containers cover millions of miles transporting raw materials and finished goods, delivering anything from processed food to medicines to biomedical devices to automobiles, there is a constant need to manage their timelines, inter-country regulations and security. Managing containers across the supply chain is therefore a challenge that manufacturers and shippers face, as small errors here could result in high detention costs, rising inventories, and rejection of cargo by the buyers. The losses these might accumulate into are significant for the manufacturers, exporters, 3PLs and practically every stakeholder in a supply chain.

Key Challenges In Container Management

Two factors that are key to a successful business setup are on-time delivery of goods and adherence to promised SLAs. These principles apply to container management too. Shipping lines rent containers to shippers to transport their goods. However, shipping lines provide a fixed time for these containers to be handed back to them. Owing to supply chain inefficiencies and lack of visibility these deadlines are difficult to meet and, as a result, liners charge per-day detention for such containers. A result of this delay is significant additional cost for the exporter.
Other challenges in effective container management are:

Cargo tracking and visibility – The exporter or the buyer does not have real-time visibility of the status of the cargo once it leaves the shipper's premises. The only sources of information are the various intermediaries involved in the transit. This lack of visibility leads to transit delays and, if the goods transported are perishable items, the supplier incurs significant losses if the goods are spoiled and returned by the buyer.

Vessel cut-offs – More often than not, vessel schedules are updated at the last moment. If the trader is not aware of the revised vessel schedules, the container can miss the cut-off loading of a vessel and be detained. The port levies demurrage charges in such scenarios.

Selecting the correct containers – To avoid detention, all containers need to be monitored for efficient utilisation. Owing to the non-visibility of the entire inventory of available containers at different depots, containers that are approaching the end of their free period, or are already in the detention period, are often not used for transportation. This leads to further cost increases.

An Ideal Solution

The ingredients of a perfect container management solution cannot always be predicted, owing to the dynamic variations in the dependent parameters. However, a rough definition of a perfect container management solution would be: a solution built on robust logic, flexible enough to be moulded and customised to suit all the requirements of a specific export house, manufacturer, LSP or any other beneficiary across the value chain.

There are container management solutions available in the market; however, there isn't any solution that addresses every aspect of container management. A solution which provides end-to-end visibility of the container and/or cargo, and which can be integrated with shipping lines for arranging online bookings, schedules, vessel planning, receiving event updates and more, is the need of the hour.

Another important requirement that such a solution must fulfil is seamless integration with the ERP system of the shipper. This integration can provide complete visibility of all the available containers at all the shipper's container depots, along with the transit status of those containers. This integration, combined with real-time tracking ability, can be termed an ideal solution that would help a shipper not only plan but also execute his activities more effectively and efficiently.

Leveraging Oracle Solutions To Provide End-To-End Logistics Support

Few solutions exist in the market that address some of the problem areas in container management while also covering the broader set of allied issues. In view of this pressing need, KPIT has developed an integrated container management solution using Oracle Transportation Management (OTM) as the foundation.

This solution was deployed recently at one of our customers with great success. The customer uses a large number of containers, refrigerated and dry, for exporting various food items, and was facing the problem of rising container inventory at their inland container depot, incurring significant losses owing to detention charges.

The container management solution leveraged OTM capabilities to support all aspects of planning, execution and freight payment for both shippers and logistics service providers, and integrated it with KPIT OnTrack™ to provide end-to-end visibility of all available containers at different container depots and factory locations across the country. This facilitated easy tracking of existing containers in transit and allocation of the correct containers for stuffing new cargo. Furthermore, OTM was customised to incorporate an algorithm which helped with the assignment of the best container based on various factors such as cargo type, cargo requirement, container condition and container free period.
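As a rough illustration of the kind of rule involved – a hypothetical sketch, not KPIT's actual implementation – such an assignment could be expressed as a ranking query over an illustrative container inventory table:

-- Hypothetical sketch: pick the best available container for a new
-- shipment, preferring containers whose rent-free period expires
-- soonest, so they are used before detention charges begin.
SELECT container_id
FROM   available_containers
WHERE  container_type   = :cargo_type       -- e.g. refrigerated or dry
AND    condition_status = 'SERVICEABLE'
AND    depot_id         = :loading_depot
ORDER  BY free_period_end_date              -- closest to detention first
FETCH  FIRST 1 ROW ONLY;                    -- Oracle 12c row-limiting syntax

In practice the real rule weighs several of the factors mentioned above at once, but the ranking idea is the same.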

The best part about the Oracle OTM solution is its flexibility and its integration capability with an organisation's existing systems and with other important information portals.

This flexibility was the key attribute in the successful deployment and execution of this container management solution.
This solution based on Oracle OTM is integrated with the shipping lines to receive vessel schedules, book container requests and receive event updates. It is also integrated with the customer's existing ERP to handle the complete process, from procurement to delivery, and is equipped with a user-friendly dashboard that provides all details of container transit on a single screen. The solution also provides various reports to track container history and the events that happen throughout a container's lifecycle.

Delivering The Benefits

The container management solution designed and implemented by KPIT has been extremely beneficial to our customer. Within just two weeks of implementation, the solution helped the customer substantially reduce detention costs.

The customer has also been able to increase productivity by leveraging the container tracking module of our solution.

Our solution includes an integrated platform for receiving vessel schedules from various liners, resulting in almost no missed vessels, and the real-time visibility of the cargo/container ensures that there are no transit delays.

KPIT-Oracle Partnership

KPIT, in partnership with Oracle, provides best-of-breed supply chain management and execution solutions and standalone applications. Our solutions help our customers to reduce operational costs, build seamless processes and ensure high returns on investment.

ABOUT THE AUTHOR

Nikhil Gupta
Solution Architect, KPIT
Nikhil Gupta is a Solution Architect with KPIT's Value Chain Execution (VCE) Practice. He has been working in delivery and pre-sales for OTM & GTM for 6 years. Nikhil has strong experience as an Application Lead and Solution Architect on complex greenfield implementations in India, the Middle East and the USA. Nikhil is a Computer Science Engineer and MBA Gold Medallist from the Centre for Development of Advanced Computing, India. He has demonstrated innovation in various dimensions of his work and was awarded a national-level prize for the most innovative project idea presentation.


Advertorial: insightsoftware.com

Planning

One Man's Brave Journey Away From Excel and Hyperion
An article looking at the typical path taken by JD Edwards users
before finding a purpose-built planning solution

Businesses are moving faster than ever before. So it's no surprise to hear that companies are screaming out for the ability to react quickly to changing market conditions. But unfortunately, budgeting and planning over JD Edwards data is never fast and simple, leaving Finance and business users tearing their hair out. Even with business intelligence solutions, companies are still at the mercy of uncontrolled spreadsheets and disparate systems.

The main character and his story in our article are based on real experiences that our customers tell us about time and time again.
Spreadsheet Madness
Ian, Financial Controller at a large investment bank, was tired of the lengthy and repetitive processes involved in the JDE planning cycle. He had been in the profession for 15 years and couldn't believe that although his company was progressing in so many areas, the crucial planning process remained archaic at best and wholly inadequate at worst. By the time one cycle had ended, he was preparing for the next one. The average planning cycle looked like this:


- Ian would send out spreadsheets to each budget owner
- Budget owners would often send these spreadsheets to various people for their input and approval
- Spreadsheets were emailed back and forth, creating multiple versions
- Ian would get the spreadsheets back in dribs and drabs
- Some budget owners were asked to correct errors, causing further delays
- Numbers were uploaded to JDE through a tedious process
- By the time the budget was finished, it was halfway through the year

Hyperion – Is it for Everyone?

Ian started to look for a planning solution that would eliminate the need to rely so heavily on static spreadsheets and speed up the planning cycle. He spoke to some people who recommended Oracle Hyperion. It made sense; he already had an Oracle ERP system. What could go wrong?

As part of the evaluation, he used an independent Bloor report to look into the Total Cost of Ownership (TCO) of Hyperion. He researched everything from the licences needed, databases, ETL tools, hardware, consultancy, training costs and maintenance costs, to the effort that would be required for future upgrades.

Ian compared Hyperion to InsightUnlimited Planning, a solution which is tightly integrated with JDE, allowing him to manage the usage of uncontrolled spreadsheets and enable enterprise-wide collaboration and transparency. Although Hyperion was expensive and resource-intensive, the CFO was a big fan of Oracle products, so the approval process was quick and straightforward. The company purchased Hyperion and the project team started the implementation. Ian had a large budget to play with, so he wasn't too concerned by the initial cost of the system and the consultants' fees. But a couple of months down the line, Ian started to worry about the project's slow progress and that he had made the wrong decision. When the system went live 6 months later, his concerns were confirmed when he observed that:

- Data was extracted from JDE, breaking the link to the ERP system
- Static data was analysed in a data warehouse
- Adjustments to Hyperion were being made in Excel, not in JDE
- The company was still heavily reliant on spreadsheets
- Multiple people were required to support Hyperion
- Speed and performance were surprisingly slow
- The budget data in Hyperion was never sent back to JDE, so a simple AA vs BA report couldn't be done in his GL (see the illustrative query below)!
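For readers less familiar with JDE, an "AA vs BA" (actual vs budget ledger) comparison boils down to a query along these lines against the F0902 account balances table. This is a deliberately simplified, hypothetical sketch rather than a production JDE report; real JDE reporting also handles data dictionary decimals and all fourteen period buckets:

-- Simplified, hypothetical actual-vs-budget comparison in the JDE GL.
-- GBLT is the ledger type ('AA' = actuals, 'BA' = budget); only the
-- first three period buckets are summed here, for brevity.
SELECT act.gbaid                              AS account_id,
       act.gban01 + act.gban02 + act.gban03  AS actual_amount,
       bud.gban01 + bud.gban02 + bud.gban03  AS budget_amount
FROM   f0902 act
JOIN   f0902 bud
  ON   bud.gbaid = act.gbaid
 AND   bud.gbfy  = act.gbfy
WHERE  act.gblt = 'AA'
AND    bud.gblt = 'BA';

The point of Ian's complaint is that this only works if the budget figures actually land back in the BA ledger – which, with his Hyperion setup, they never did.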

InsightUnlimited Planning – Does What it Says on the Tin

Fed up and disillusioned after completing two planning cycles using Hyperion, Ian attended the JDE User Conference in November 2013 in search of a replacement solution. Ian had already been burnt once and was determined not to invest in another solution that wouldn't meet the company's planning requirements. Ian revisited InsightUnlimited Planning, a real-time solution he had previously dismissed because it wasn't Oracle. He discovered that JDE users can plan over any JDE or external ERP data, including Payroll Forecasting, Supply and Demand, Capital Expenditure, Cash Flow Projections, Sales Forecasting, Project Forecasting, and Depreciation Planning.
The solution promised:
- Implementation in weeks, not months
- Zero additional infrastructure requirements
- To unify planning processes with JDE
- Elimination of rogue spreadsheets and other disparate systems
- A single source of the truth

Earlier this year Ian purchased InsightUnlimited Planning. The solution was implemented in 3 weeks, just in time for year-end and the start of the next planning cycle. The cycle took 4 weeks to complete this year, instead of the 6 months that Ian and the business had struggled with in previous years. Now that the data is no longer removed from JDE, IT no longer needs to support and manage a data warehouse, and the business can react to changes and identify opportunities in real time. Ian is now seeing how the solution can benefit other planning processes across the business, starting with Cash Flow Projections.
Integrated Planning for JDE Master Class
Like Ian, do you find yourself suffering with a spreadsheet-driven planning process? Do you dream of finding a planning solution that has a direct link to JDE, can be installed within a few weeks and has a low TCO? If so, attend the Integrated Planning Master Class session, presented by John Brooks from InsightSoftware.com at the JDE User Conference on Wednesday 12th November at 2:05pm.
Don't do an Ian! Get it right the first time around.

Part 2 of a 3-Part Series

Connecting Oracle Business Intelligence to the Oracle BigDataLite VM

In the final article in the series, we'll look at the Oracle NoSQL Database on the BigDataLite VM, and see how OBIEE and ODI leverage Hive to access and report on the clickstream data it contains.

In the first article in this series we looked at the new Oracle BigDataLite VirtualBox virtual machine and some of the Hadoop-based technologies it contained, such as Apache Hive, the Hadoop Distributed Filesystem (HDFS) and MapReduce.

We also took a high-level look at how tools such as Oracle Business Intelligence and Oracle Data Integrator could make use of the Hadoop technologies within the BigDataLite VM, using Oracle Data Integrator to load, and Oracle Business Intelligence to report on, data within Hadoop. In this second article in the series we'll look in more detail at Hive and how you can connect to it using Oracle Business Intelligence, so that you can create dashboards and reports against the data that it contains.

Mark Rittman
Co-founder
Rittman Mead

How Oracle Business Intelligence Interfaces with Hadoop

Oracle Business Intelligence generally expects its data sources to be tables and columns sourced from a relational database, such as Oracle Database or MySQL. Data within a big data system such as Hadoop by definition generally isn't arranged neatly in tables and columns, so how do we make Hadoop data suitable for reporting against using Oracle Business Intelligence?

The answer is in a Hadoop technology called Apache Hive. Hive is an open-source technology that sits on top of Hadoop, and provides a SQL access layer over the data within it. When that data is already neatly stored in fixed-width or delimited files then Hive works in a similar way to Oracle Database external tables, turning the fields in the file into columns in a table, but Hive can also map its tables and columns onto semi-structured data sources such as JSON documents or NoSQL databases using SerDes, or Serializer-Deserializers, Java utilities that translate one data format into another. This ability to transform the varied types of data in a Hadoop system into more familiar tables and columns is one of the main strengths of Apache Hive, and it's the underlying technology that allows Oracle Business Intelligence to report against big data sources and Hadoop.
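To make the external-table analogy concrete, here is a minimal HiveQL sketch; the HDFS path and column names are invented for this example and are not part of the BigDataLite demo schema:

-- Illustrative HiveQL: expose a pipe-delimited file in HDFS as a table.
CREATE EXTERNAL TABLE flight_delays (
  flight_date STRING,
  carrier     STRING,
  origin      STRING,
  dest        STRING,
  dep_delay   INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/user/oracle/flight_delays';

As with an Oracle external table, no data is moved by this statement; Hive simply applies the column definitions to the file at query time.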

Preparing your Hadoop Data for BI Reporting

Just as all data within Oracle is typically held in an Oracle Database, most data within a Hadoop system is typically held in files; specifically, files stored within Hadoop's own clustered filesystem, called the Hadoop Distributed File System, or HDFS. If you're familiar with the Linux or Unix filesystem you'll feel at home within HDFS, which uses similar commands to create directories, list files, set file and directory ownership and so on. HDFS has its own command-shell that you can access from within the BigDataLite VM, but something that's even more convenient, especially for first-time users of Hadoop, is a web-based developer interface that's included with the BigDataLite VM, called Hue. You can use Hue to upload files into HDFS and then create Hive tables over them, making it easy to get some data into the BigDataLite VM Hadoop cluster ready for reporting on using Oracle Business Intelligence.

For example, to upload a file of transactional data into the BigDataLite VM using Hue, for reporting on later using Oracle Business Intelligence, follow these steps:

1. Using either the Firefox web browser included in the BigDataLite VM, or your own desktop web browser if you've access to the VM over a network, navigate to http://bigdatalite:8888 and log in using the credentials oracle/welcome1.

2. Using the menu at the top of the Hue application, select Data Browsers > Metastore Tables. The "metastore" in the menu item name refers to the Hive Metastore, the equivalent to the Oracle Database data dictionary; you can see three Hive tables already listed in the centre of the page, the ones that Oracle ship with the BigDataLite VM for demonstration purposes.

3. Under the Actions menu on the left-hand side of the page, click on the "Create a new table from a file" link. This will bring up a wizard that allows you to upload a data file from your local filesystem to the HDFS filesystem, and then define the column mapping that translates your file data into tables and columns, as shown in Figure 1. Note that by default the file uploader in Hue uses the HDFS filesystem for the file picker, but the "Upload a File" button presents you with the local filesystem for the web browser you're using.

FIGURE 1: UPLOADING A FILE AND CREATING A HIVE TABLE USING HUE
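Under the covers, the wizard is simply generating HiveQL on your behalf; a rough, hypothetical equivalent for one of the smaller files would be something like:

-- Approximately what the Hue wizard does behind the scenes (illustrative
-- table, column and path names only).
CREATE TABLE carriers (
  carrier_code STRING,
  carrier_name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|';

-- Move the uploaded file from its HDFS staging location into the table.
LOAD DATA INPATH '/user/oracle/uploads/carriers.txt'
INTO TABLE carriers;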

Using this wizard, I first upload a file of flight delay transaction data in pipe-delimited form, and then add another three files containing details on flight origins, flight destinations and carriers (airlines). If you want to try this yourself, you can download the same files from Dropbox, here: https://dl.dropboxusercontent.com/u/304565/airlines_data.zip. Under the covers, the Hive tables created by this wizard will store their data in files within the HDFS filesystem, with each block of data replicated three times over the Hadoop cluster to provide resilience in the event of a cluster node failure. When HiveQL queries are run against these Hive tables, the Hive query engine generates Java MapReduce jobs to retrieve and filter the data, breaking the task into several parts and automatically running the task in parallel on the cluster. Checking Hue one last time before I move over to Oracle Business Intelligence, I can see the four Hive tables I'll be reporting on listed alongside the demo tables provided by Oracle with the BigDataLite VM, as shown in Figure 2.
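You can watch this MapReduce translation happen from the Hive query editor in Hue; even a simple aggregate such as the following (column names again from my illustrative schema above) is compiled into one or more MapReduce jobs before any rows come back:

-- A simple HiveQL aggregate; Hive compiles this into MapReduce jobs
-- that run in parallel across the Hadoop cluster.
SELECT carrier,
       AVG(dep_delay) AS avg_departure_delay
FROM   flight_delays
GROUP  BY carrier
ORDER  BY avg_departure_delay DESC;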

FIGURE 2: VIEWING THE FINAL LIST OF HIVE TABLES


FIGURE 3: CREATING THE ODBC CONNECTION TO APACHE HIVE


Connecting Oracle Business Intelligence to Hadoop

Oracle Business Intelligence can connect to Apache Hive data sources using ODBC drivers supplied either by Oracle, or using drivers you can download from vendors such as Cloudera (who make the Hadoop software in the BigDataLite VM and in the Oracle Big Data Appliance) or DataDirect. At the time of writing, the Hive drivers supplied by Oracle as part of the 11.1.1.7 installation of Oracle Business Intelligence Enterprise Edition are for HiveServer1, whilst the version of Cloudera Distribution including Hadoop (CDH) that comes with BigDataLite needs the updated HiveServer2 drivers; therefore, you are probably best downloading the latest drivers from the Cloudera website (http://www.cloudera.com/content/support/en/downloads/connectors/hive/hive-odbc-v2-5-10.html) and using those. Note that if you are using a Linux installation of Oracle Business Intelligence for the server elements, you'll need to download separate Windows ODBC drivers for the BI Administration tool environment, along with the Linux ones (note there are no drivers for Solaris or any other Unix-like platforms, and Oracle only officially supports Hadoop access on Linux).

Before you can connect Oracle Business Intelligence to the BigDataLite environment, you'll need to create ODBC connections through to it, from the Windows environment that your Oracle BI Administration tool is running on, and on your Linux server if that's where you run the server components for Oracle Business Intelligence. In this example we'll be running all of the Oracle Business Intelligence platform from the one Windows Server 2008 64-bit environment, so that we only have to install the ODBC drivers once (note that for production environments you should run the server parts of Oracle Business Intelligence from a Linux 64-bit environment, and you'll need to download and install separate Linux ODBC drivers for Apache Hive from the Cloudera website as well).

To create a Windows ODBC data source for the BI Administration tool, follow these steps:

1. Follow the installation instructions that come with your Hive ODBC drivers so that they are available in your Windows client environment and the Linux server environment as necessary.

2. Using a supported 64-bit Windows environment, from the Start menu select Administrative Tools > Data Sources (ODBC) to bring up the ODBC Data Source Administrator dialog.

3. Click on the System DSN tab, and press Add... to add a new system DSN to the list displayed. When prompted, select Cloudera ODBC Driver for Apache Hive as the driver type and press Finish.

4. At the Cloudera ODBC Driver for Apache Hive DSN Setup dialog, enter the following values to set up the connection, as shown in Figure 3:

Data Source Name: BigDataLite Hive
Description: Connection through to BigDataLite HiveServer2
Host: 172.16.77.187 (replace with your own IP address; see the note below on how to obtain this)
Port: 10000
Database: default
Hive Server Type: Hive Server 2
Mechanism: User Name
User Name: oracle

Note that to obtain the IP address of your BigDataLite Hadoop environment, right-click on the BigDataLite VM desktop and select Open in Terminal; when the terminal session is displayed, type in ifconfig and press Enter to display the VM IP address. Both your BigDataLite VM and the Windows Oracle Business Intelligence environment will need to be on the same network for connectivity to work, i.e. both using NAT networking if you're using two VMs on your desktop or laptop, or both on the same office network if you're connecting to BigDataLite using an installation of Oracle Business Intelligence elsewhere on your company network.

5. Once done, press Test to check that connectivity works and, when this is confirmed, press OK and then OK again to close the dialogs.

Creating the Repository and Creating a Report

Now that you've set up your Hive tables within the BigDataLite Hadoop environment and created an ODBC connection through from your Windows install of Oracle Business Intelligence to them, you can use the BI Administration tool to create a new repository and import the Hive table metadata into it; then, once the repository is created, you can run your first report against the Hadoop data source. To do so, follow this final set of steps from your Windows environment:

1. Using the Windows Start menu, select Start > Oracle Business Intelligence > BI Administration. When the BI Administration tool starts up, either create a new repository or open an existing online repository for editing.

2. If you are creating a new repository, enter the name and other details for the repository and, once you've done so, the Create New Repository > Select Data Source dialog should display. Alternatively, if you're using an existing online repository, select File > Import Metadata to display the same dialog; when it's displayed, select the Apache Hive ODBC data source you created in the previous steps and enter oracle as the user name (leave the password blank). Press Next to proceed to the next dialog page.

3. At the Select Metadata Type page, leave the checkboxes with their default selection and then press Next to proceed.

4. At the Import Metadata > Select Metadata Objects page, expand the Hive folder on the left-hand side and then expand the default database entry underneath it. Then, ctrl-click on the four Hive tables you created earlier and press the "Import selected" shuttle button to add them to the right-hand side of the screen; then press Finish to complete the table selection and return to the main BI Administration application window.

5. On the right-hand Physical panel in the BI Administration screen, your Hive database and tables should now be displayed. Double-click on the Physical Database entry for your Hive database and select the General tab; change the Database Type selection to Apache Hadoop, as shown in Figure 4, but leave the connection pool call interface set to ODBC 2.0 if you're prompted to change it straight afterwards. When done, press OK and then OK again to close the dialogs and return to the BI Administration main window.

FIGURE 4: CREATING THE INITIAL REPOSITORY ENTRIES

6. Once done, save your repository and check consistency to make sure there are no unexpected errors. To confirm that your repository can connect successfully to the BigDataLite VM and the Hadoop environment, either right-click on one of the tables and select View Data, or right-click and select Update Row Count, and then check that either table rows are returned or the table row count is updated as expected.

At this point, you can treat your Apache Hive Hadoop data source the same as any other relational data source you've worked with before. Create keys on the carrier, dest and origin tables and then define foreign key links between the flight delays fact table and these dimension source tables, and then create a simple business model and presentation subject area to report on these tables.

Finally, either save your updated repository, if you've opened an existing one online, or upload your repository to the server environment using Fusion Middleware Control if you created it offline. You should then be able to create reports and dashboards against data in the BigDataLite VM, with Hive in the background converting the HiveQL queries the Oracle Business Intelligence BI Server sends to it into MapReduce jobs running on the Hadoop cluster.

Summary
In the first article in this three-part series, we looked at the new BigDataLite VirtualBox virtual machine, and went through an overview of the Hadoop and Oracle software it contained and how it connects to tools such as Oracle Business Intelligence and Oracle Data Integrator. In this second article in the series, we looked in detail at how to create Hive tables within the BigDataLite Hadoop environment and then connect Oracle Business Intelligence to it in order to report on the data those Hive tables contain. As you will have seen from the examples in this article, once you define Hive tables over structured and semi-structured file data in your Hadoop environment, tools such as Oracle Business Intelligence can then work with them, because Hive presents the data in the familiar tables and columns those tools expect.
But what about data contained in NoSQL databases running on Hadoop, such as Apache HBase and Oracle NoSQL Database? In the final article in this series we'll look at how this becomes possible, using new capabilities in Oracle Data Integrator 12c and SerDes within Apache Hive.

ABOUT THE AUTHOR
Mark Rittman
Co-founder, Rittman Mead
Mark Rittman is an Oracle ACE Director and co-founder of Rittman Mead, a UKOUG Partner Member specialising in business intelligence, analytics and data warehousing solutions. Mark is author of the Oracle Press book Oracle Business Intelligence Developer's Guide, was past Chair of the BIRT SIG, and is a past editor of Oracle Scene.
Blog: www.rittmanmead.com/blog

@markrittman

Hyperion & EPM


Reinventing Enterprise Performance Management to Support Sustainable, Innovation-Based Growth
For 2014, The Hackett Group's Key Issues research finds that companies are focusing on innovation as a core strategy to deliver growth. Financial planning and analysis organisations are profoundly impacted by this strategic orientation, as well as by the main business drivers behind it (global competition, volatility and the information revolution).
Gilles Bonelli, The Hackett Group

To rise to the challenge of supporting innovation-based growth, FP&A organisations must pursue a broad transformation agenda covering:
- Integration of EPM processes and development of business partnerships
- Improvement of core processes to recalibrate FP&A's value proposition
- Development of business intelligence information delivery capabilities
While owned by FP&A, this agenda is aimed at elevating the company's broader EPM capability level, and thus extends beyond the functional boundaries of FP&A.

Strategic Priority in 2014: Sustainable, Innovation-Based Growth

The Hackett Group's EPM Key Issues research is based on a study conducted in late 2013. Study participants included executives from over 150 large companies globally. The study covered their business strategies, revenue and budget expectations, as well as key initiatives for 2014.

The study found that the business environment continues to be characterised by high levels of volatility and risk. Volatility of demand has been a recurring theme since the global financial
crisis and remains the number-one driver of business strategy. However, other types of risk and instability are becoming more prominent. These are related to competition, regulation and talent. Furthermore, the lingering risks of the crisis are still being factored into business strategies by 62% of companies participating in The Hackett Group's 2014 Key Issues Study; so is supply volatility (for example, commodity prices), included by 70% of companies.

Despite persistent uncertainty about business conditions, most companies have reverted to a focus on creating value for shareholders through revenue growth and margin improvement. Over half of study participants cite revenue growth as their top financial objective; 25% indicate margin improvement is their main goal. Just 4% of companies are actually targeting the acceleration of their historical revenue growth rate. This very low percentage indicates a great amount of lingering caution in business.

This conservative outlook is reflected in the business strategies that will be deployed to realise financial objectives in 2014. Sustainable, innovation-based strategies are top-ranked, but only for the purpose of maintaining historical growth rates. Traditionally, innovation was usually associated with a growth-acceleration strategy. Today, it is a prerequisite to simply maintaining growth rates, staying competitive and ultimately remaining commercially viable.

For many, this will require changes touching all aspects of operations. The most important among these are adopting technology, unlocking the value of information, and realigning talent. Companies also continue to look for growth in other countries, as reflected in the high priority given to expanding the customer base. Globalising the business brings its own challenges. Not surprisingly, then, finding or developing the right talent remains a fundamental issue.

EPM & BI: Critical to Achieve Sustainable, Innovation-Based Growth

EPM and BI are critically important organisational competencies to develop and execute an innovation-based growth agenda. While both competencies extend beyond the boundaries of the FP&A organisation, in our research we focused just on the implications for FP&A organisations of the need to support the enterprise's innovation-based growth agenda. For many, this will demand the reinvention of their own service offering and decision-support capability.

The FP&A function plays an important role in providing this support throughout the entire innovation life cycle. Innovation-related business decisions are fraught with far more uncertainty than traditional capital investments and resource allocation. Hence, FP&A needs to innovate itself in order to support innovation in the core business.
Our research shows that to improve the organisational EPM competency level, FP&A needs a broad transformation agenda, which can be clustered into the following three themes (Fig. 2):

1. Integrate EPM processes and develop business partnerships: Financial planning and analysis is at the core of the enterprise EPM capability, which extends beyond finance. The maturity of EPM as a decision-making competency is a function of integration between planning domains (strategic, operational and financial) and the maturity of partnerships between FP&A and the business.

2. Improve core processes to recalibrate FP&A's value proposition: FP&A is the custodian and owner of the organisation's core financial management control cycle: financial planning and budgeting, forecasting and performance reporting, supported by analytics. Continuous improvement in this cycle is needed to improve service levels and drive out cost. Efficiency gains are also necessary to free up resources to move up the value chain with a flat or declining cost base. Further, providing better value to the enterprise will rely on FP&A groups' ability to better integrate and garner business operational knowledge to take their insights to the next level.

3. Develop BI information delivery capability: As the preeminent value-added information provider to the organisation, FP&A is at the center of the BI revolution. Often a driver of and major contributor to kick-starting initiatives in these areas, FP&A is well positioned to help bring a focus on what matters in analytics, as well as bridge the gap between financial analytics and those in other business domains (e.g. sales, marketing, operations). Transforming and innovating the way information is delivered will become a critical capability for FP&A.

The majority of companies are pursuing changes in cost structure, led by reduction in SG&A cost. As a result, few organisations will find additional budget for net-new investment in FP&A, so any reinvention will need to be self-funded. Staffing levels across the finance function are expected to decrease in 2014, while operating budgets will increase slightly (Fig. 1). FP&A budgets and staffing levels will closely track these projections. Yet FP&A organisations are expected to meet higher demand levels. Our model assumes that demand is proportional to the revenue base supported by the FP&A organisation (which is projected to grow at 6-7%). This increase in demand, plus the anticipated budget cutback of close to 1%, translates into a need for productivity gains in the range of 7-8%.

FIGURE 1: ANTICIPATED CHANGES IN FINANCE BUDGET AND STAFFING 2013-2014

An Agenda for Reinvention of FP&A

The need for transformational change in FP&A to support the evolving requirements of the business is nothing new. Ever since the financial crisis, companies have generally been operating in a structurally higher-risk (and thus less predictable) business environment. Meanwhile, the transition from the industrial-age economy to the information-age economy is accelerating, creating its own set of challenges for EPM. These trends all have major implications for financial planning, forecasting, performance reporting and analysis processes.

With innovation now prominent on the enterprise agenda, FP&A has gained yet another reason to rethink its value proposition. An innovation-based enterprise agenda requires rigorous decision support based on operational and financial modeling.

FIGURE 2: EPM PRIORITIES IN 2014


In the remainder of this report, we explore each theme and its associated initiatives in more detail.

Integrate EPM Processes & Develop Business Partnerships

While financial planning and budgeting, owned by FP&A, have traditionally been at the heart of the organisation's performance management cycle, EPM is inherently an enterprise-wide capability and includes the strategic, operational and financial planning domains. In part, integration of these domains revolves around formal process and data integration and alignment of planning calendars. Various EPM performance studies conducted by The Hackett Group since 2011 have all found a strong correlation between EPM performance and process and data integration, consistently showing far higher planning integration at top-performing organisations.

However, without effective collaboration among stakeholders, successful integration of operational, finance and strategic planning will be elusive. Many organisations focus either on the (structured) process and information aspects of integration, or on the (unstructured) partnership aspects. The most successful organisations focus on both, and maturity evolves in parallel; as FP&A business partners benefit from better information access, analytical capability and more mature processes, they can add more value to the partnership.

Improve Core Processes to Recalibrate FP&A's Value Proposition

Improving core FP&A processes is a prerequisite for moving up the value curve in decision-support services. In the absence of additional funding to develop new capabilities, the reinvention of FP&A needs to be self-financed. Only efficiency improvements beyond the rate needed to meet additional growth-driven demand can free up the necessary resources.

Many FP&A organisations are redesigning parts of their core financial management control cycle (financial planning and budgeting, forecasting and performance reporting), motivated in part by widespread frustration over the ineffectiveness and inefficiency of the traditional annual financial budgeting process. While some organisations aspire to eliminate the financial budgeting process altogether, very few have actually achieved this. Most are trying to ease the pain of annual budgeting simply through process improvement, technology enablement and complexity reduction.

The financial forecasting process is often hampered by the same complexity and excess detail that burden budgeting. Hackett research shows that effective forecasting is driver-based, and that failure to reduce complexity is the main reason many rolling forecast implementations do not achieve their objectives. Our 2014 Key Issues Study strongly confirms finance organisations' emphasis on improving core FP&A processes. Improving the efficiency of annual budgeting is the highest-ranked performance improvement initiative among finance functions, with 18% of companies considering this their top priority; another 55% have a major initiative planned for 2014.

The third core FP&A process, business performance reporting, is often resource-intensive, inefficient and ineffective. As with budgeting and forecasting, complexity is performance reporting's worst enemy. World-class organisations cover most of their reporting needs through standard reporting, use well-defined, standard reporting packs, and have a rigorous governance process to prevent unnecessary complexity from creeping in. They also use self-service reporting portals and scorecards far more extensively than the peer group to improve reporting efficiency and timeliness. (The need to improve business performance reporting is aligned with both the improve core FP&A processes theme and the develop information delivery theme discussed in the next section.)

Develop BI & Information Delivery Capability

After decades of investment in transactional backbone systems, and the digitisation of content (documents, multimedia) on a massive scale, companies are accumulating data at an unprecedented pace. Add in universal connectivity, access to virtually unlimited amounts of information through the Internet and an explosion of information access devices and technologies, and it is clear that a perfect information storm is brewing. For years, Hackett studies have found that BI and analytics rank among the top three most important technology investments, and are often the number-one priority. This year's Key Issues Study findings were no different (Fig. 3).

FIGURE 3: BI, ANALYTICS AND DATA MANAGEMENT INVESTMENT PRIORITY

FP&A's primary role is to improve business decision making by providing value-added information (including analytics) services to the business. The vast majority of business decisions have a financial dimension, which is the primary area of focus and expertise of FP&A organisations. This puts FP&A at the center of the BI analytics revolution. First, FP&A is the custodian (and in many cases, the owner) of purely financial information and analytics. Second, it plays an important role in developing the financial dimension of analytical models in other areas of the business. Third, it is responsible for ensuring consistent financial planning assumptions in the financial and operational analytical models and tools used throughout the organisation. And finally, it is responsible for governance of much of the financial master data, KPIs, data definitions and metadata underlying all of these models and tools.

Given financial planners' and analysts' central role in BI/analytics, it should come as no surprise that in many organisations these staff are aligned with BI/analytics Centers of Excellence. They may be the de facto functional owners of BI and analytics data models and toolsets, and help elevate the general BI/analytics competence level throughout the organisation.

Strategic Implications

The demand emanating from changes in the external business environment, business strategies focusing on innovation, and the information revolution puts tremendous pressure on FP&A organisations for reinvention. Those that remain stuck in a pattern of facilitating the annual financial budgeting cycle, periodic forecasts and financial variance reporting will rapidly become viewed as adding no value, and thus strictly as an overhead cost.

To avoid this fate, FP&A organisations must pursue a very broad transformation agenda, simultaneously making core FP&A processes more efficient, developing new BI and analytics capabilities, integrating financial planning processes with the operational and strategic planning domains, and building partnerships with the business.

The FP&A transformation roadmap must be the outcome of a deliberate design process, based on current-state performance and capability assessment and gap analysis.

Fortunately, Hackett research indicates most FP&A organisations recognise the challenge and are answering the call, as reflected in their aggressive transformation agendas for 2014 and beyond.

ABOUT THE AUTHORS

Gilles Bonelli
Practice Leader, The Hackett Group
Gilles Bonelli is Practice Leader, Enterprise Performance Management & Business
Intelligence Executive Advisory Program, Europe, for The Hackett Group.
Article written in conjunction with Erik Dorr & Sherri Liao


Business Intelligence


Agile Methods & Data Warehousing: How to Deliver Faster


Most people will agree that data warehousing and business intelligence projects take too long to deliver tangible results. Often, by the time a solution is in place, the business needs have changed.

Kent Graziano
Owner
Data Warrior LLC

With all the talk about Agile development methods and Extreme Programming, the question arises as to how these approaches can be used to deliver data warehouse and business intelligence projects faster. This article will look at the principles behind the Agile Manifesto and see how they might be applied in the context of a data warehouse project. The goal is to determine a method or methods to get a more rapid (2-4 week) delivery of portions of an enterprise data warehouse architecture.

The Agile Manifesto

This is what started the Agile movement. It is a high-level statement of an approach to software development, authored by a number of people in 2001. Quoting from http://agilemanifesto.org:

Manifesto for Agile Software Development

We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.

Kent Beck, Mike Beedle, Arie van Bennekum, Alistair Cockburn, Ward Cunningham, Martin Fowler, James Grenning, Jim Highsmith, Andrew Hunt, Ron Jeffries, Jon Kern, Brian Marick, Robert C. Martin, Steve Mellor, Ken Schwaber, Jeff Sutherland, Dave Thomas

© 2001, the above authors. This declaration may be freely copied in any form, but only in its entirety through this notice.


Applying the Principles Behind the Agile Manifesto

Along with the Manifesto, the authors provided a list of 12 specific principles to be followed that would define an agile process. Here is the list of the principles and a discussion of each in the context of a data warehouse project.

1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
While every project has the goal to satisfy the customer, data warehouses have rarely been able to do it quickly. One of the first questions to address in this area is what we mean by "valuable software" and "customer". In a data warehouse sense, I equate the former to a business intelligence interface (i.e. reports, dashboards, etc.) and the customer to the end user or knowledge worker.
If that is the case, then we cannot really apply this principle to a data warehouse project until after the data warehouse has been designed, developed and put into production. In order to gain some benefit from the Agile approach, I have proposed that in the case of data warehouse projects we define these terms more broadly. Specifically, the customer can be anyone from the knowledge worker to the BI programmer. In this context, if the ETL programmer puts into production a piece of code that populates some tables that the BI programmer needs to produce some reports, that is valuable software. The goal then would be to continuously deliver to the BI programmer populated tables that in turn can be used to produce valuable reports for the knowledge worker. With tools like TOAD and Oracle SQL Developer (and Excel), with a populated table or two we can show the users what the data looks like.

Seeing the data sooner rather than later, most would agree, is valuable to the business.

2. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
Typically, data warehouse projects hate changing requirements because they usually lead to major data model changes and a large number of ETL programming changes. To be more agile, data warehouse projects need to be architected with this principle in mind. That is, they need to be flexible and adaptable. Using a normalised model as a base often helps, as does using code generators (like Oracle Warehouse Builder) to produce the ETL code. This requires that the data warehouse team be staffed with experienced professionals who have built flexible architectures in the past and who have experience with the appropriate tools. It also helps if there is an existing enterprise data model. The team must also have access to subject matter experts for the subject areas that need to be designed.

Another, perhaps better, option is to use an agile data engineering method, such as Data Vault, to build your data warehouse (more on that later).
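For readers new to Data Vault, its core pattern separates an unchanging business key (a hub) from the attributes that change over time (a satellite). A minimal sketch, with illustrative names only:

-- Minimal Data Vault sketch: the hub holds the business key, the
-- satellite holds the descriptive attributes and their change history.
CREATE TABLE hub_customer (
  customer_hkey  NUMBER        PRIMARY KEY,   -- surrogate key
  customer_bk    VARCHAR2(30)  NOT NULL,      -- business key
  load_dts       DATE          NOT NULL,
  record_source  VARCHAR2(30)  NOT NULL
);

CREATE TABLE sat_customer_details (
  customer_hkey  NUMBER        NOT NULL REFERENCES hub_customer,
  load_dts       DATE          NOT NULL,
  customer_name  VARCHAR2(100),
  credit_limit   NUMBER,
  record_source  VARCHAR2(30)  NOT NULL,
  PRIMARY KEY (customer_hkey, load_dts)
);

Because new requirements typically arrive as new satellites (or new links between hubs), existing tables and their ETL are left untouched, which is what makes the model adaptable to change.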

3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference for the shorter timescale.
Obviously a principle we can all agree on. Everyone wants the 30-90 day deliverable, max! This is achievable in a data warehouse project provided you have good scope control and approach the effort one subject area at a time. This of course begs the question "What is a subject area?" Regardless of that, we can all agree that we should not try to do the entire EDW all at once.
Can we get to a point where we deliver new functionality in a couple of weeks? That will be addressed specifically later in this article.
4. Business people and developers must work together daily throughout the project.
A great principle, but can we get it to happen in the data warehouse world? We must, at some level. Without interaction with the business users, a data warehouse project is doomed to failure. In the initial stages of analysis and design, you need access to these folks at least weekly (for interviews, design and requirements reviews, etc.). Daily interaction may be required at some times but not at others. As stated earlier, where the Agile approach seems to really apply in data warehousing is when developing the BI reports. At this point, if the data is in place, then daily interaction will ensure better reports, faster. Another place where it applies nicely is in the early definition of subject areas and the data model.
As always, whether this interaction is achievable may be more dependent on political factors and management priorities than on the desire of the development team to be agile. Without consistent and valuable feedback from a business partner, it does not matter how fast we deliver, as we will deliver the wrong thing.
5. Build projects around motivated individuals. Give them the environment and support they need and trust them to get the job done.
Again, this is true for every project, data warehouse or otherwise. If you can find people who really want to be part of the project, they will move mountains to be successful. It also helps if they are eminently qualified to do the work. If not, you need to get them training ASAP, then turn them loose. After the training they may need access to ongoing support or expert mentoring to get up to speed faster. Of course, if you have a shortage of motivated, qualified staff, the chance of using an Agile approach successfully is pretty much zero (I can tell you stories about that!).
Once you have these people you must keep them motivated. On a large data warehouse project which really will last years (with many intermittent deliverables), this means more than throwing pizza and Mountain Dew into their cubes a few times a month. If we can keep the units of work (i.e. deliverables) smaller, they can experience successful deployments more frequently. That is a great motivator. Everyone likes to be told "good job" on a regular basis. It also builds a culture of success.

The way to blow this is to try to deliver the entire data warehouse solution all at once – then it is a do-or-die outcome.
6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.

Developers do not like to write documentation. They like to code. The pain of communication can be minimised by quick face-to-face meetings on a regular basis. This principle leads to concepts like a daily team huddle (more about this later), which can certainly be used on a data warehouse project.

The more complex the project, the more important it is that everyone be on the same page – and preferably on the same floor of the building too.
7. Working software is the primary measure of progress.
The question here is: what constitutes working software on a data warehouse project? Does one ETL routine count? I propose that anything we build which gets tested and moved into production should be considered under this principle. Hence, moving a few tables and the associated ETL code into production counts as progress (and a success). Likewise, a new BI report also meets this criterion.
8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
In other words, we want to avoid burnout and overtime. This is about building a software development architecture, process and culture that make people want to be part of the team and stay motivated (see Principle #5). Given how long data warehouse projects tend to last, this is a principle worth pursuing. If you are building a true enterprise data warehouse with a Corporate Information Factory or Data Vault architecture, then the project has no end, so it is paramount that this principle be applied. This requires very good planning and scope control, so there are no all-nighters to get code into production. Smaller deliverables, delivered more frequently, should help in this area. One goal, then, is to create the smallest valuable unit of work possible in order to keep things moving (which also keeps people motivated).

9. Continuous attention to technical excellence and good design enhances agility.
This is certainly true for a data warehouse project. A bad design or architecture will kill the project sooner or later. Either you will not be able to adapt and expand the scope of the warehouse in the future, or you will find reports and data marts that you cannot easily produce. If it takes a huge effort to make a change, your project is not agile.
The best approach I have found for this is to use the Data Vault modeling method and to hold frequent design review sessions with the internal team critiquing each other. After a time, patterns of good design and bad design start to emerge and become obvious to the team. The overall competence of the team is thereby increased and, as a result, the time needed for design reviews decreases.
10. Simplicity – the art of maximising the amount of work not done – is essential.
Or, stated another way: KISS (Keep It Simple, Stupid!). Data warehouses are complex enough without adding work by trying to do something really slick that takes thousands of lines of code or complex ODI transformations to achieve.

One of the best ways to achieve this principle is to use code generators like those provided by Oracle SQL Developer Data Modeler, Oracle Warehouse Builder or Oracle Data Integrator.

While the code generated may be complex, there are no syntax errors to be fixed. In addition, the tools give you visual diagrams that can be used to interact with the business user, rather than trying to review DDL or PL/SQL with them. Often a change in requirements or design can be rapidly deployed by modifying a diagram, then regenerating the code and executing it.
11. The best architectures, requirements, and designs emerge from self-organising teams.
Put a bunch of smart people in a room together and they will generally figure out a good solution to the problem at hand. Don't label them with specific titles. In data warehousing you definitely need people with architecture, data modeling, DBA and ETL programming skills, but that does not have to be their entire job. Allowing them to interact and help each other not only builds team morale but also ensures you have cross-trained staff. If you are going to follow an Agile approach, you cannot afford to have only one person on the team with a specific skill set, otherwise you will experience delays when they are out sick or on vacation. A team with the latitude to self-organise will be much more productive, once they figure out how to work together.
12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.
In SCRUM, this is referred to as a Retrospective, held at the end of each Sprint. At Denver Public Schools (DPS), before I learned about SCRUM, we used a decision model promulgated by the then CIO, Dr. Ed Freeman (see Figure 1). This approach started with "debate mode", where we discussed possible solutions to a problem (such as: what is our code promotion process?). After we reached consensus we moved forward, but set a checkpoint in the future to revisit the solution and see if it had worked as planned or needed to be adjusted. Over a period of time, processes and procedures became much more effective and efficient, and everyone had a stake in the solution. Our team used this to continuously refine the process of moving changes to the data warehouse from development to quality assurance and then to production. The result was an over 99% success rate on our code promotions.

FIGURE 1: DECISION MODEL

Agile Concepts & Methods

This section will examine a few Agile ideas and how they might apply to data warehousing.

Team Huddles
In SCRUM this is also referred to as the Morning Roll Call. It is a mandatory all-hands project status meeting where outstanding tasks are listed and assigned. It is also where team members report completed items and ask for help on in-progress items when they need it. It is limited to no more than 15 minutes so people can get back to work. Detailed discussions must be taken offline.
Team members must answer three questions:
1. What did I work on yesterday?
2. What will I work on today?
3. Do I have any blockers preventing me from making progress?
This concept can definitely be applied to data warehouse projects. We did it at DPS and on one project at Hewlett-Packard (HP), and are now using it at McKesson Specialty Health. It was very successful in helping us all to keep a handle on a very complex project and it keeps everyone in sync. It provides a daily forum for positive feedback to the team and helps keep the team supervisor (or project manager) in the loop. It has helped foster a very positive team environment and helps support Principle #11, self-organising teams.

Extreme Programming (XP)
This Agile method is based around the idea of the programmer working directly with the end user to develop an application or interface. In the data warehouse world, this approach is most likely applicable in developing a business intelligence portal or report. Assuming the data marts are already in place (virtual or otherwise), a programmer using something like Oracle BI, Business Objects or even an Excel pivot table could work one-on-one with a user to define and then develop the reports in a very rapid manner. This of course implies that no new data requirements emerge during the report development (any that do are added to the project backlog).

Pair Programming
This concept is generally associated with the XP method. This approach entails programmers working side-by-side at one terminal. One person types while the other reviews. This actually helps them catch programming errors on the spot. Again, we have found that this technique can definitely be used on a data warehousing project. At DPS, we used it with Oracle Warehouse Builder when building and deploying ETL mappings and workflow processes, as well as when we input PL/SQL code into Oracle Designer. Most often we do not use it in the initial development phases, but do use it when debugging in QA.

Two pairs of eyes and two brains are definitely more efficient than one in many situations.

Occasionally we even do Pair Data Modeling, with one person drawing the picture and adding attributes, while the other person provides input to the design as well as reviewing what has just been typed on the screen.

The two week iteration – can we do it?
This seems to be the goal most people think of when they talk about the Agile approach. Can this be achieved on a data warehouse project? In part, that depends on what you define as a deliverable and what your acceptance criteria are. To get to quicker deliverables we must first think in smaller increments of work. A deliverable (see the sketch after this list for a concrete example) could be:
1. A stage table;
2. A fact table for a star schema;
3. A dimension table;
4. A complete star (fact and all dimensions);
5. One piece of ETL code that populates a fact table;
6. A function needed by the ETL code; or
7. A new report or query.

Even if the original intent of Agile did


not consider database and ETL type
development efforts, can we not apply
the principles anyway? The intent of
this article is to propose some new
best practices for data warehouse
development. So my proposition is that
we can apply the concepts and principles
of Agile as a means of organising our
work efforts and our teams to be more
efficient and deliver something sooner
rather than later. Is that not a good thing?
Remember that Principle #3 indicates
delivery in a couple of weeks to a couple
of months, so a two week interaction,
while desirable, is not mandatory to be
considered Agile. Perhaps we (the Oracle
and data warehouse community) need
to be broader in our thinking and our
interpretation of Agile methods. Why
cant we use Agile methods and concepts
to deliver a database structure quickly? Do
we have to follow the letter of the law to
reap benefit from Agile thinking?
I think not.
How about another perspective - who is
the customer/user? In the DW/BI world
my user, as a builder of data warehouse
structures (ODS, data marts, etc), is
really the BI programmer. His user is the
knowledge worker. He cannot deliver
anything useful until he has a structure
with data in it. So then delivery of working
ETL code that populates that structure
does put something of value in the hands
of my user. Should not my success as an
ETL developer be measured by my ability
to correctly populate tables? In turn can
the success of a data warehouse designer
be measured in terms of correctly
designed tables?
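As a concrete flavour of deliverable #5 in the list above, here is a minimal sketch of a fact-table load. All table and column names are hypothetical, not taken from any project mentioned in this article:

-- One small, independently deliverable piece of ETL:
-- load staged sales rows into a fact table by resolving
-- the dimension keys (all object names are illustrative).
insert /*+ append */ into sales_fact (date_key, product_key, store_key, amount)
select d.date_key,
       p.product_key,
       st.store_key,
       s.amount
from   stage_sales s
join   date_dim    d  on d.calendar_date = s.sale_date
join   product_dim p  on p.product_code  = s.product_code
join   store_dim   st on st.store_code   = s.store_code;
commit;

Something this small can be specified, built, reviewed and promoted within a single iteration, which is the point of redefining what a deliverable is.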
Oracle CASE Method Fast-track
An Agile approach has been right in front
of our faces for over 20 years in the Oracle
world. The Oracle CASE Method Fast-track
is a rapid application development (RAD)
approach, proposed and developed by Dai
Clegg and Richard Barker in 1994 (CASE
Method Fast-Track: A RAD Approach). This
was really the first agile methodology
we saw in the Oracle world. In fact, it
has been around since before the Agile
movement took off. According to the
authors, the development of Oracle*CASE
(later called Oracle Designer) with its
transformers and code generators made it
a feasible approach.
In summary, this method uses a
combination of tight scope control, direct
and continuous interaction with the users,
time boxing, and iterative prototyping and
builds (using the Designer code generators)
to deliver a system, or sub-system, in a few
months. According to Richard Barker, "The
aim is to build an adequate usable system
quickly, not a perfect system too late for
the business." Sounds like an Agile method
to me.
One caution from Mr. Barker: "It requires
small teams of practitioners of better
than average expertise, each of whom is
capable of covering many aspects of the
work, and close cooperation with the
right users."
There's the rub: this method will not work
with inexperienced developers. Nor will it
work without access to the right users.
Again all this seems to be in keeping with
the principles of the Agile Manifesto.
So how does this apply to a data
warehouse or BI project? As stated
earlier, we used Oracle Designer (and
now SQL Developer Data Modeler) to
design and build our data warehouse
structures and to generate custom PL/SQL
transformation code. The same concepts
apply when using Oracle Warehouse
Builder, ODI, or Informatica to generate
ETL and process flow code. The same concepts
apply when using the Web PL/SQL generator
in Designer or APEX: it is even possible to
quickly prototype a web-based query screen
to let the user validate the data loaded in the
warehouse in advance of any BI reports being
developed. In the case of an ODS, an APEX
module might be the report they actually need.
Even if your team is not at the level
Mr. Barker said you need to use RAD (or

Agile) effectively, all is not lost. In our case,
we did not start with a team of better
than average expertise. Most of the
DPS team had never used Designer and
they were all new to data warehousing.
Because of this we did not attempt an
Agile approach until almost two years into
the project.
Through training, mentoring and just
doing the work, the team got to a level of
competence where we could use an Agile
approach effectively.

Data Vault

There is a method of data modeling


specifically designed for building the
persistent, historical foundation layer of
an enterprise data warehouse called Data
Vault (see www.learndatavault.com). This
is a very flexible and extensible modeling
technique that makes it possible to change
and extend the data warehouse very
rapidly in small chunks. Hence you can
have faster, but smaller, deliverables if the
project is planned correctly. Data Vault
modeling has been used around the world
for over a decade and has proven to be well
suited to agile development efforts. In fact,
the 2.0 version of the Data Vault System
now includes recommendations to use a
modified SCRUM project approach. I have
used this technique for nearly 12 years
with projects at Denver Public Schools,
the MD Anderson Cancer Center and
McKesson Specialty Health. It helped me
to build the historical portion and stage
areas of the architecture in a much more
incremental manner.
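The article does not show a model, but to give a flavour of why the technique extends so easily, here is a minimal sketch (hypothetical names; a real model would also include link tables between hubs). Business keys live in hub tables and descriptive history in satellite tables:

-- Hub: one row per business key, nothing else.
create table customer_hub (
  customer_hkey  varchar2(32)  primary key,  -- surrogate key for the business key
  customer_bk    varchar2(30)  not null,     -- the business key itself
  load_dts       date          not null,     -- when the key was first seen
  record_source  varchar2(30)  not null      -- which system it came from
);

-- Satellite: descriptive attributes, one row per change (full history).
create table customer_sat (
  customer_hkey  varchar2(32)  not null references customer_hub,
  load_dts       date          not null,
  name           varchar2(100),
  address        varchar2(200),
  record_source  varchar2(30)  not null,
  constraint customer_sat_pk primary key (customer_hkey, load_dts)
);

Adding a new source or new attributes means adding a new satellite alongside the existing ones, so the existing tables, and the ETL that loads them, are left untouched. That is what makes the small, frequent deliverables discussed above practical.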

Conclusion
While most Agile methods did
not really have database and data
warehouse projects in mind, I think
it is clear that we can benefit a great
deal from the concepts and principles
embodied by these approaches.
In some cases (team huddles, pair
programming) we can directly
adopt their techniques. With the
ever increasing rate of change in the
technology and business world, and
with ever decreasing resources, we
owe it to our users, customers and
employers to examine every possible
means of becoming more effective
and efficient in what we do.
Can we achieve 2-4 week delivery
of data warehouse components? In
the case of BI reports that should
be no problem. If we cannot develop
a single report in that time, then
there is a serious problem with our
warehouse design. As for delivering
the design, tables and the populated
databases in the data warehouse or
data mart, I think that it is possible to
improve the rate of delivery of useful
objects by adopting some of these
approaches and by redefining what
a deliverable is. We must be open
minded and think a little differently
about what we are doing. Hopefully
this paper has given you some new
ways to look at your approach to big
data warehouse projects that will
help you achieve your goals.

ABOUT THE AUTHOR

Kent Graziano
Owner, Data Warrior
Kent Graziano is the owner of Data Warrior LLC in The Woodlands, Texas and a
lifetime member of Rocky Mountain Oracle Users Group (RMOUG) and ODTUG.
He is a certified Data Vault Master (DVDM) and Data Vault 2.0 Architect, Oracle
ACE Director, and expert data modeler and architect with more than 30 years of
experience, including over 25 years working using Oracle (since version 5), Oracle
tools, and two decades doing data warehousing. Kent has written numerous
articles, authored one Kindle book, co-authored four books, and has done many
presentations, nationally and internationally. He was recently voted the #2 best
presenter at OUGF14 in Helsinki, Finland.
Kent was the recipient of the 1999 Chris Wooldridge Award (from IOUG) for
outstanding contributions to the Oracle user community. In 2003 he was presented
with The Doug Faughnan Award for his dedicated service and outstanding
contributions to RMOUG. In 2007, he was the recipient of the ODTUG Volunteer
Award. He is a co-author of The Data Model Resource Book, Oracle Designer: A
Template for Developing an Enterprise Standards Document, and Super Charge Your
Data Warehouse (available on Amazon.com).

@KentGraziano
THE RIGHT PEOPLE TO SUPPORT YOUR JD EDWARDS TECHNOLOGY

Great accelerated implementation myth? Ask for the proof, ask out loud.

BMS offers the complete range of services to license, install, implement and maintain your
JD Edwards system, EnterpriseOne or World, from the earliest versions to the most recent
EnterpriseOne 9.1 and World A9.3, covering Supply Chain Management, Customer Relationship
Management, Human Resource Management, Financial Resource Management and Manufacturing
Resource Planning. BMS offers you complete satisfaction by providing highly skilled and
experienced professionals at all times, and always at competitive rates.
BMS also provides additional complementary products for JD Edwards, including U.K. Payroll,
Approval Express and Construction Industry CIS, offering renowned support and expertise in the
following areas:

BMS Services: Accelerated Implementations, Flexible Support Services, Upgrades, Consultancy,
Managed Services & Hosting, CNC Services, Project Management, Training.
BMS Products: ERP Solutions, Approval Express, Integrated Payroll Management, Human Capital
Management Solutions, Document Management.

Achieving the highest award for JD Edwards services 4 years running (2010/2011, 2011/2012,
2012/2013, 2013/2014). Oracle Platinum Partner.
For more information please call +44 (0)1527 851 350 or visit www.beoleymill.co.uk
Technology
OracleScene additional digital content

Oracle Application Express (APEX): Dispelling the Myth

Simon Greenwood, Oracle Development Director, Explorer (UK)

I sometimes hear the myth that Oracle Application Express (APEX) is only suitable for small projects and
departmental systems. This could not be further from the truth; Oracle Application Express is an exceptional,
mature and very scalable development platform. I have been an Oracle Development Consultant, a Project
Manager and now a Development Director. I have been associated with this product for nearly 10 years,
working on many client solutions, varying from small systems, to systems deployed nationally throughout the
UK, and also dashboard-style systems with multi-language capability. APEX has consistently proved that
it has the flexibility to adapt to the challenges thrown at it. However, as with all development tools, if you
base your opinion on just using very high level features and wizards, or on viewing simple
demonstrations, then you are supporting the myth.
Oracle Application Express is an Oracle Database feature and
Oracle's primary tool for web applications, using SQL and PL/SQL.
Individuals and teams can build very secure and scalable
applications in timescales just not possible a few years ago.
The architecture is simple but the product is feature rich. The
deployment, import and export of applications are very
straightforward, and APEX can scale in line with your Oracle Database.
The learning curve with APEX is not steep, and if you have an
Oracle Database development background using PL/SQL then
you will progress very quickly.

So how scalable is APEX? The simple answer is very scalable; a
good example is Oracle's Database Cloud Service
(https://cloud.oracle.com/database), which includes APEX as the
development environment. One reason that APEX can scale as well as it
does is that it does not need a dedicated database session per
user, only a database session with which to process a request from
a user. Oracle also provides an instance that you can connect to
for evaluation purposes: apex.oracle.com. Joel Kallman, Oracle
Software Director, provided me with some recent statistics
which highlight the volume of activity this single APEX instance
is handling.

Statistics over 7 days on apex.oracle.com (6th May 2014):
Total Page Views: 4,875,173
Distinct Applications Used: 5,842
Distinct Users: 9,048
Total Number of Workspaces: 20,974
Total Number of Applications: 77,478
New Workspaces Approved: 904

Joel also provided me with some very interesting information
regarding Oracle's internal instance of Oracle Application
Express. This instance is hosted inside Oracle for anyone in the
company to build applications, requiring nothing but a browser.
It's used by virtually every line of business in the company (e.g.
EMEA HR, Database QA, Fusion Applications Development,
Marketing, North American Sales, India Development Centre
Facilities, and Manufacturing & Distribution). In comparison to
the evaluation instance available over the web, this instance is
for real applications that the business depends upon:

Total Page Views: 2,389,593
Distinct Applications Used: 2,023
Distinct Users: 18,203
Total Number of Workspaces: 2,759
Total Number of Applications: 4,592
(Statistics 6th May 2014)

Oracle also has an internal APEX application called Aria People,
which is basically an employee directory. This application is
used by virtually every employee within Oracle and averages
1.4M-1.5M page views per day. On one specific day (18-MAR-2014)
there were 3,132,573 page views from 45,767 distinct IP
addresses. The median page rendering time was 0.03 seconds.
In this same application, on 11-MAR-2014 there were 171,156
page views in a single hour, from 6,254 distinct IP addresses.
That averages out to 47.5 page views per second.
Besides APEX being the development environment within
Oracle Database Cloud Service, products such as Audit Vault and
Database Firewall and 12c Multitenant Self Service provisioning
all use APEX.
The overhead associated with the APEX engine is fairly static
(measured in hundredths of a second). However, if you write a
SQL query that takes 60 seconds to execute and you place that
query in a report in an APEX application, you can expect the
execution of the page to take 60 seconds as well. The key to writing
great performing APEX applications is to write efficient SQL.
From a security perspective, APEX applications can suffer
from the same class of application security issues as other
web applications based on technologies such as PHP, ASP.NET
and Java. Cross Site Scripting tends to be the most common
vulnerability, along with SQL injection; these issues are not
specific to APEX and are resolved through good coding practice.
Other potential vulnerabilities, like Access Control and Item
Protection within APEX itself, can be swiftly rectified by
defining and enforcing APEX build standards within your
organisation. I have found APEX applications to be no less secure
than Java or .NET applications, but usually easier to secure
because the architecture is so much less complicated.
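To illustrate the "good coding practice" point for SQL injection, here is a minimal sketch; emp and P1_NAME are hypothetical names, not taken from any application described here. The rule is to never concatenate item values into dynamic SQL, and to pass them as binds instead:

-- Illustrative only. Vulnerable pattern: concatenating an item value
-- into dynamic SQL:
--   l_sql := 'select count(*) from emp where ename = ''' || v('P1_NAME') || '''';
-- Safe pattern: pass the value as a bind variable:
declare
  l_count pls_integer;
begin
  execute immediate
    'select count(*) from emp where ename = :b1'
    into l_count
    using v('P1_NAME');   -- APEX v() function returns the session item value
  htp.p('Matches: ' || l_count);
end;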
From my experience of working with a number of different
customers, APEX is flexible enough to adapt to the customer's
existing way of working, rather than the tool itself dictating
how the customer should work. Admittedly, this is not always
obvious to those new to Application Express, and it relies on
experienced and skilled APEX developers to get buy-in from the
business to adopt APEX as a strategic and tactical development
tool. With APEX I have always found a way to accommodate
the client's preferred design/build methodology, security,
audit and authentication requirements, and also to incorporate
existing corporate branding standards. Make no mistake, APEX
is powerful and should be seriously considered and evaluated
by those who recognise the value in low cost development and
rapid deployment.
APEX can also be used successfully with a number of different
project methodologies and approaches - not just a traditional
software development lifecycle. Because APEX is so tightly
integrated with the Database, the complexity of the application
framework, including session management and security, are
handled automatically, allowing the developer to focus on
delivering application functionality relevant to their business
requirements. This means that applications can be delivered in
a number of days rather than weeks. Because of this, APEX is
perfect for methodologies that emphasise rapid and iterative
prototyping such as Rapid Application Development (RAD),
as well as those that place high value in the quick delivery of
usable software such as Agile. One particular client application
needed to maintain the full development lifecycle approach
but also needed to switch to a more RAD approach, utilising
techniques such as prototyping to further enhance the
application to meet business demands. Using APEX gave us the
ability to adapt the methodology, as the application matured,
to align with client budgets and timescales. The application in
question has been live for over 7 years and has been through
three major releases to introduce additional benefit to the
business, proving how scalable it can be. In another situation,
where APEX had to prove it was the right choice for a large
UK-wide deployed project, confidence had to be gained at board
level first. We were able to quickly prototype key features to
demonstrate suitability for the business requirements, which
instilled confidence in the board to progress. These prototypes
were then used as the foundations for the ongoing project,
meaning no effort was wasted.

The next version of APEX introduces more productivity gains
with a wealth of new features and, going forward, more
connectivity options through Oracle REST Data Services, a
responsive theme and enhancements to the mobile UI. Oracle
has consistently released new versions of Application Express
since its inception in 2004 and this commitment has been great
for client confidence in the product. Application Express is here
to stay and adoption is only going to increase due to an ever
growing profile and successful use cases. Application Express
is part of the Database and Oracle will always protect the
Database.
The APEX community is growing at a rapid rate and, thanks to a
number of dedicated individuals who put the time and effort into
writing blogs, answering forum questions and event speaking,
the product's popularity is going from strength to strength. The
Oracle development team responsible for Application Express are
the most visible and receptive product group I have experienced
throughout my own journey with Oracle technology. The formula
is right, the product just keeps getting better and it enables me
to confidently deliver projects of varying degrees of complexity
time and time again without failure.
It always surprises me when key Database features never
actually get used, or at least evaluated, especially when
a more efficient and easier method could be introduced for
Database and Application development projects. So, why
not extend your Database development skills utilising what
you already own (APEX) with code (PL/SQL) you have already
written? It would be a shame if Oracle Application Express went
under your radar.

ABOUT THE AUTHOR

Simon Greenwood
Oracle Development Director, Explorer (UK)
Simon Greenwood is the Development Director at Explorer, Oracle's 2014 Database
Partner of the Year. Simon has a long history with Oracle Development tools such
as Forms and PL/SQL, and since 2005 he has taken a leading role in promoting
Application Express to Oracle customers. Simon is the deputy chair of the UKOUG
APEX SIG and a member of the APEX advisory board. Explorer is an Oracle Platinum
Partner focused on developing bespoke applications, consultancy and training using
Oracle Application Express. Simon's team of well-respected and highly regarded
APEX developers are highly skilled in converting business problems into
functional and low cost bespoke systems. www.explorer-development.uk.com

@Explorerukltd


Business & Strategy
OracleScene additional digital content

Capturing the Process Truth

With the E-Business Suite upgrade deadline looming, many companies are either grappling
with the challenges of making the transition or dealing with the aftermath of the process.
The need to optimise processes and software is more pressing than ever, but if business
processes are not documented in glorious, gory detail, organisations could not only find
themselves floundering at the first upgrade hurdle but, more importantly, they could be
leaving themselves open to significant risk. Colin Armitage, CEO of Original Software,
considers the case for Business Process Capture (BPC) as a solution.
Colin Armitage, CEO, Original Software

Application & Process Challenges

Organisations face myriad application and process challenges
on a day-to-day basis, especially when they are dealing with
specific business situations such as a merger or acquisition,
or an essential technology upgrade such as Oracle E-Business
Suite R12. Companies having to respond to these challenges
are acutely aware of the need for a deep understanding of how
software and business processes operate. In reality, however,
they are finding that there is rarely a central repository of process
information where they can gain access to precise details. There's
a grim realisation that process knowledge typically resides with
just a few people: the "super users" in the case of software, or
subject matter experts (SMEs) for business processes.
This is where Business Process Capture (BPC), a subset of
Business Process Management (BPM), comes in. It's a way to
automatically document an organisation's business processes
while its staff perform them in their software applications.
It might seem like yet another acronym, but in complex IT
environments BPC could be the key to helping organisations
get their processes in good shape, and it's far less costly and
time-consuming than asking business analysts to do it.


BPC Health Check

As challenging as the application and process landscape is, BPC
can start to unravel any complexity and help to ease any issues
organisations might have. Any problems can be mitigated and
any previously undetected areas of weakness can be resolved.
These are the keys to a successful business process capture
exercise, which will help organisations to understand how well
they are functioning:
Defining the scope: It sounds obvious, but if you don't know
exactly what you want to capture, the people doing the work
can't give you what you want. For example, for financials it
would not be detailed enough to say "Accounts Receivable".
You would need to be more specific, such as breaking that
down into Managing Revenue, Managing Documents and
Performing Transactions. This can then be broken down
further by the transactions required, for example "enter invoice
with rules", "place an item in dispute" etc. By naming each
transaction or business process that needs to be recorded, it
will be much easier to scope out the effort that is required.

Securing resources: Who will perform the capture and how
long will it take them? A thorough assessment of the skills
and man-hours required will enable you to request from the
business the right individuals to capture each of the
business processes.
Managing and tracking: If 100 people from 12 offices are
performing the capture then you're going to need a central
platform for allocating tasks and tracking progress. Email and
phone will be too unwieldy.
Process harmonisation: Remember that teams in two offices
may perform the business process in different ways. Once
identified, the discrepancies can be analysed and a best
practice approach to that business process established. This
can help organisations avoid duplication.
Beware sensitive data: Ensure that any sensitive data is
logged so it can be masked out or changed before the
business process is recorded.

Safeguarding Business Process Knowledge

The steps for preparing for Business Process Capture are very
straightforward, but organisations have to make sure they
approach it in a planned, thorough and cohesive way.
Recent research conducted by Original Software in partnership
with UKOUG (Oracle Scene, Spring '14) revealed that a significant
number of companies are still in the process of upgrading from
E-Business Suite 11i. BPC can significantly ease the upgrade
process by helping organisations to be more strategic about
the business processes that underpin them and by giving them
valuable information about how their organisations work.
Understanding the interdependencies between processes
and applications is core to this. Carefully approaching BPC can
help improve efficiency, boost effective operational running
and allow teams to share valuable process knowledge. This is
relevant not only during the upgrade process but also when
moving forward post-upgrade, to ensure that the captured
process knowledge doesn't fall through the gaps but is
safeguarded for the future. For further information please see
www.origsoft.com/oracle.

ABOUT THE AUTHOR

Colin Armitage
CEO, Original Software
Colin Armitage is CEO of Original Software, a software testing solutions company
serving more than 400 organisations, from multinational corporations to small
development shops, operating in more than 30 countries.

You've Seen Them in Print, Now Meet Them in Person

Almost 50% of the writers that have penned articles for us in
the last year will be speaking at Apps14, Tech14 or even both!
Come along and see these experts bringing their content to life and you could even meet them in
person. Many of our speakers hang around after their sessions to meet with delegates and answer
questions. So if you've enjoyed their articles, why not register and get access to a huge amount of
equally valuable content.

You can register via these links:
www.apps14.ukoug.org/register
www.tech14.ukoug.org/register

PLUS, if you want access to both conferences you can add on a Rover Ticket during
registration for just £150!

OracleScene additional digital content

The Interview All Graduates Need to Read
Regardless of which sector you have qualified in as a graduate, it is generally true that as
you enter the world of work you feel pretty invincible. The complications, frustrations and
hurdles that we all face in our working lives have yet to really have an impact on your state
of mind and this means you are keen to take on challenges of all kinds.
Being confident in your own strengths
is a major part of this, but have you ever
thought of the value that identifying
weaknesses at this stage might
potentially have?
Identifying how you can improve your
desirability to potential employers could
be the key to placing you ahead of the
competition. Alison Mulligan, Head of
BI & Key Accounts, Maximus IT, spoke
to Noel Gorvett, Managing Director of
Amosca about this exact topic and the
results were notably intriguing.
AM: How do AMOSCA, as a business, view
hiring graduates and trainees, and which
skills do you feel these types of
candidates are typically lacking?
NG: The view of graduates/trainees
coming into AMOSCA is slightly
different from those we see with other
consultancies, where the assumption is
that they will automatically be on a track
to a specified end result.
For us, they are lacking a focus on what the
industry has to offer, and thus a compass
on their worth and what they can bring
to it. This might be from the EPM/ERP
industry actually lacking a solid message or
profile. This means that accounting
or IT graduates incorrectly assume that
they will not fit the brief, or that they lack
the skills to add value.
AM: What advice would you give young
people entering the technology, ERP or
EPM sector in terms of deciding in which
direction they should aim their career?
NG: Each person would have a different
entry channel, but key to them enjoying
the opportunity and being successful
at it is to have an open mind, do not be
afraid to ask questions, and to expect
things to change quickly. The rate at
which the software vendors are changing
the technologies involved, aligned to
the changes in business requirements
(regulatory reporting; integrated reporting;
sustainability reporting etc) will mean
that the core skills of understanding the
underlying business fundamentals are
more important than the transitional
phase of a project or data structure.
So, I would recommend that you be sure
to look beyond the software, vendor or
project most businesses have multiples
of these three elements, and it is the
person who can look at the business from
both the current and future perspectives
that will bring a value proposition worth
investing in.
AM: If you were entering the workforce
now and looking to improve your
prospects of building a career in the EPM
or ERP sector, which skills would you
choose to learn?
NG: First and foremost, work on your
people skills. Know how to read the parties
involved and ensure that communication
stays consistent. Many consultancy clients
these days have multiple layers of people
who are involved in programmes; ranging

from lifers (20+ years with the company),


politicians, managers, etc, with conflicting
views, expectations and budgets. Then
there are external parties (e.g. audit
firms, software vendors, independent
consultants, specialists) all voicing their
opinions, and many more. So its really
important to understand who, when and
how to interact, be heard, and add value.
System skills are also important, so keep
your technical skills fresh. Always know
what the market is doing, the changes
new versions will introduce, and where
best they apply to the environment.
Understand your core skills (linking the
two above): it is imperative that everything
is backed up by simple life basics. It should
be standard that you are able to document
anything and ensure that things like
meeting notes, change management and
system updates are recorded. It is also
essential to ensure that whatever you are
doing, or being asked to do, meets a moral
duty of honesty, integrity and passion. If it
does not feel, look or sound right, question
it: this could save a lot of time, money
and effort.
AM: We frequently hear that graduates
and apprentices will be lacking in
real-world business experience. What would
you suggest they do to counter this lack
of knowledge?
NG: Spend time looking at how others do
it. Shadowing others and discussing why
they use specific methods or approaches is


invaluable. At AMOSCA, we like to do this
via the support desk. This ensures that a
lot of the skills and knowledge are gained
in a non-pressure environment, where the
experience is mentored through hearing
and seeing, as well as hands on, without
the risk of injury.
AM: What is the best advice you could
give to a graduate about how to build an
effective network, for now and the future?
NG: Dont be greedy. It is a fine line
between taking and giving, and if the
balance is right, people will share. There are
a lot of places where knowledge is power,
so being seen to not be a threat to others
will ensure a learning opportunity instead
of just getting kept busy!

Align to this loyalty. A lot of people are very
quick to see bright lights and money, but do
not forget the fundamentals: whoever
gambled on giving you the opportunity
will not be best pleased if you change your
mind at the earliest opportunity.
Both of these could be factored financially,
and the modern trend for making a fast
buck is unsustainable. Remember what is
ahead, and ensure that there is a balance
between age, experience, skills and income.

AM: What advice would you give a
graduate on how to make themselves
stand out in the job application or
interview process?

NG: Know what and why they are in the
process. Having done some research on
all aspects of the industry and being able to
talk about it, including asking probing
questions, is an assured differentiator. Do
not feel afraid to contribute, and ensure
that there is a defined programme to vary
the aspects of experience in multiple areas;
this will give both parties the best
opportunity to see the talent available.

AM: Are there any courses you would
recommend to anyone looking to work in
the EPM sector?

NG: There are business and technology
courses available for everything, but they
are meaningless without direction or focus.
The best thing you can do is research; the
internet has a wealth of content, so
get reading.
If on a finance track, CIMA or ACCA with
a systems angle will always serve well;
if from a technical track, look for something
that gives a view of the end result, not just
a programmer/developer role.

AM: Any other comments?

NG: The one thing that I have in mind is
the variety this offers. We are a consulting
business, so things like a nomadic lifestyle
and long hours are possible. Also, we favour
the team collaborative approach, not the
independent consultant, and my belief
is a graduate should be on the inside to
maximise the benefits.

Staying Open to All Possibilities

One of the most important aspects that
Noel Gorvett identified for candidates
entering the Technology, ERP or EPM sector
was an open mind. He elaborated by
suggesting that the person who can look
at the business "from both the current
and future perspectives" will bring "a value
proposition worth investing in".
People skills, system skills and core skills
were the areas which he felt IT graduates
and trainees should focus on most if they
want to contribute more to their overall
working environment.
His dissection of the important role which
people skills can play was particularly
intriguing, as he identified that knowing
when and how to "interact, be heard, and
ADD VALUE" can define how an IT
professional will be viewed in their role.
In assessing how graduates can make up
for a general lack of real-world experience,
Noel Gorvett's advice was simple but
clear! "Spend time looking at how others
do it" was the succinct message to
graduates who may have a strong grasp
on the theory but have yet to get their
teeth into the practice.
He also advocated strongly the idea that
IT graduates should know precisely what
and why they are in the process if they
want to stand out in their position. It can
be easy for an IT graduate to fade into
the background if they are surrounded by
individuals holding similar qualifications,
and the competition is fierce!

A Clear Idea of Your Place in the Company

However, a graduate who has a strong
idea of their role and where they fit into
the wider picture is likely to do the basics
competently as well as identifying how
they might be able to progress.
This final point seems to be indicative of
Noel Gorvett's wider message. Carrying
out the basics well, taking your lead from
more experienced colleagues and working
on general people and communication
skills could be the key to a graduate
workforce improving collectively.
Putting a focus on these areas might be
the difference between your company
standing still or standing out from the
crowd!
As part of UKOUG's exciting new Next
Generation initiative we'll be getting
the views of leading IT professionals on
which skills modern IT workforces need
to thrive and stay one step ahead of the
competition, so students can identify how
they can set themselves apart from the
crowd when interviewing for positions
upon graduating.

ABOUT THE INTERVIEWEE

Noel Gorvett
Director, AMOSCA
Noel Gorvett is Director at AMOSCA and has been with the company for 8 years.
AMOSCA is an award-winning (UKOUG Partner of the Year) Oracle Gold Partner
specialising in Hyperion implementations, providing solutions, services, software
and support, and adding value for businesses by providing business solutions
covering process, integration and best practice.

ENRICH Your Oracle Applications

APPLICATION SERVICES
AMS: Management distracted by IT? Resource constraints? High risk of failures?
Employee growth and turnover issues?
Implement & Upgrade: Stuck on old technology? Slow to deploy? Too costly to upgrade
and update? Unfeasible project cycles?
Innovation: Has technical innovation plateaued? Investment benefits not fully delivered?
Poor user adoption? Searching for the next big thing?
Expertise: Constant need for Oracle expertise? Business processes need improvement?
Resource constraints? High risk of failures?

PROCUREMENT
Sourcing: Could you source more of your spend? Can you exploit sourcing opportunities?
What about cutting costs with online collaboration? Can you create immediate and long
term savings and achieve rapid ROI through open competition?
Spend Visibility: Poor visibility of spend? Do you know who, what, when, why and how?
Where can you improve spend management? Can you aggregate to model rationalisation
and where you can leverage? Do you have a common taxonomy across accounts and
procurement?
Easy iProc: Is there a lack of end user adoption of iProcurement? Do you suffer from a
lack of compliance (ad hoc requests not coded)? What about inefficient user productivity
(too many click-throughs)?
Integrated Source to Pay: Does this represent your procurement process? Do you have
full integration to supplier networks and spend visibility? Oracle Applications plus Flexible
Deployment plus Procurement Expertise equals Integrated Procurement Excellence.

WHY ENRICH?
Enrich can solve all of these problems for you, whether via Enrich Cloud, Oracle Cloud or
On-Premise.
Enrich offers an entire suite of Procurement Solutions and Procurement Concierge
Services to help our clients succeed; from spend analytics, opportunity assessments
and sourcing, to contract management, catalogue management, iProcurement, tail spend
management and working capital initiatives.
We offer a range of deployment options to suit our clients' specific needs, including Cloud,
On-Premise or Hybrid, and provide a one-stop shop for implementation and/or managed
services to help clients squeeze the maximum value from their Oracle solution investment.
Enrich has over 250 Oracle EBS and Fusion Procurement domain experts. We pride ourselves
on the number of subject matter experts within Oracle Applications as well as our
procurement business practitioners, who support leading customers such as Carillion, JLP,
BT and NBTY, to name but a few.
We are Procurement! We are Oracle Apps! We are Cloud!

Want to find out more?
Email: hello@enrich.com  Tel: Americas +1 (888) 778 1402 - EMEA +44 (0)20 3574 4720
Website: www.enrich.com

Focus on: SIGs
OracleScene additional digital content
Why You Should Be Attending UKOUG SIG Events This September

It's September, which can only mean one thing in the UK...
the return of UKOUG Special Interest Group (SIG) meetings after
their summer break!
If you haven't attended a SIG event
before, or if you haven't attended one
in a while, allow one of our members,
Phil Wilkins of Specsavers, to take you
through his reasons for attending and
why he thinks more members should
make the most of their inclusive SIG
places and brush up on their Oracle
knowledge.

Phil on UKOUG SIG meetings...

I am fortunate enough to have an
employer who promotes the idea of
community participation, both internally
and also with communities relating to our
technology vendors such as Oracle.
The original motivation for UKOUG
membership was that membership
effectively paid for attendance at the big
annual conferences, given that the
chance of attending Oracle OpenWorld
was a lot less likely.
In addition to the conference
opportunity, part of our membership
is the opportunity to participate in
Special Interest Group sessions. There
are SIGs covering different aspects of
Oracle's portfolio, from middleware
and development technologies (my
specialisms) through to Supply Chain
and JD Edwards, and obviously database
tech. I have to admit I didn't have great
expectations when I attended my
first SIG, but actually the first SIG and
subsequent ones I have attended have
been gold mines of useful information.
The sessions cover a range of topics and
the presentations come from customers
and partners, as well as Oracle, and are
typically very conversational, so as a
result you pick up insight into a lot of
practical aspects, not just theory as you'd
commonly get in, say, a training session.
Oracle support UKOUG SIGs by having
representation at them, which means
there are potential opportunities to
pick their brains: 15 minutes of free
consultancy over coffee (something
that doesn't come often with Oracle ;-) ).
Not to mention time given in the day
to chew the fat with partners and other
customers. For example, at my second SIG
session I ended up discussing experiences
of working with Packt Publishing with an
Oracle Partner (not necessarily directly
related, but interesting to see what the
experience was like from an author's
perspective).
I know from talking with colleagues
where I work that those who have
attended SIGs have come away feeling
that it was a day well used (and they have
also encouraged others to participate).
It would also seem that many people
who attend do so on a regular basis,
suggesting they get a lot out of the
sessions (all lending towards a bit of a
community spirit as well).
Based on my experiences, and those
shared with me, I would strongly
recommend finding an excuse (or making
the time, as it is for me) to get out of
the office and take advantage of your
membership (or even joining UKOUG).
Justify it as cheap training if need be,
but get yourself along to one of Oracle's
offices (who lend their facilities to
support the user group) in London,
Reading or Solihull. I'm sure you'll find it
will be very worthwhile, even if the travel
is a bit of a bind.
I would also like to take the time to thank
UKOUG volunteers like Simon Haslam at
Veriton who put their time and effort into
organising their particular SIG sessions.

Take Phil's advice and have a look at our
events calendar to book a SIG place now:
www.ukoug.org/events


Technology

CBO Choice Between Index & Full Scan:
The Good, the Bad & the Ugly Parameters
Usually, the conclusion comes at the end. But here I will state my goal clearly up front: I wish
never to see the optimizer_index_cost_adj parameter again, especially when going to 12c,
where Adaptive Joins can be completely fooled because of it. Choosing between index access
and full table scan is a key point when optimising a query, and historically the CBO came
with several ways to influence that choice. But on some systems the workarounds have
accumulated, one on top of the other, completely biasing the CBO estimations. And we see
nested loops on huge numbers of rows because of those wrong estimations.
Franck Pachot, Senior Consultant, dbi services

Full Table Scan vs Index Access

Full table scan is easy to cost. You know where the table is stored
(the allocated segment up to the high water mark) so you just
scan the segment blocks in order to find the information you are
looking for. The effort does not depend on the volume of data
that you want to retrieve, but only on the size of the table. Note
that the size is the allocated size - you may have a lot of blocks to
read even if the table is empty, just because you don't know that
it is empty before you have reached the high water mark.
The good thing about a Full Table Scan is that the time it takes
is always the same. And because blocks are grouped in extents,
where they are stored contiguously, reading them from disk
is efficient because we can read multiple blocks at a time. It's
even better with direct-path and smart scan, or with the
In-Memory option.
The bad thing is that reading all data is not optimal when you
want to retrieve only a small part of information.
This is why we build indexes. You search the entry in the index
and then go to the table, accessing only the blocks that may
have relevant rows for your predicates. The good thing is that
you do not depend on the size of your table, but only on the size
of your result. The bad thing comes when you underestimate the

number of lookups you have to do to the table. Because in that


case it may be much more efficient to full scan the whole table
and avoid all those loops.
So the question is: do you prefer to read more information
than required, but with very quick reads, or to read only what
you need, but with less efficient reads? People often ask for the
threshold where an index access becomes less efficient than a full
table scan. 15 years ago people were talking about 15% or 20%.
Since then the rule of thumb has decreased. Not because the
behaviour has changed; I think it's just because tables
became bigger. Index access efficiency is not related to the table
size, but only to the resulting rows. So those rules of thumb are
all wrong. In fact there are three cases:
You need a few rows, and you accept that the time is
proportional to the result: then go with the index.
You need most of the rows, and you accept that the time is
proportional to the whole data set: then full scan.
You're in between: then neither is ok. Ideally, you need to
change your data model to fit one of the previous cases.
But in the meantime, the optimizer has to find the least
expensive access path.

Of course there are several variations where a Full Table Scan


is not so bad even if you need only a small part of the rows
(parallel query, Exadata SmartScan). And there are other cases
where index access is not that bad even to get lots of rows
(covering index, well clustered index, prefetching/batching,
cache, SSD). But now let see how the optimizer is doing
the choice.

Cost Based Optimizer

At the beginning, things were easy. If you could use an index,
then you used it. If you couldn't, then full scan. Either you wanted
to read everything, and you full scanned (and joined with sort
merge join), or you wanted to retrieve only part of it and you
accessed via index (and did a nested loop join). This sounds too
simple, but it's amazing how many application developers are
nostalgic for that RBO time. For small transactions it was fine.
Remember, it was a time when there were no BI reporting tools,
when you didn't have those 4-page queries joining 20 tables
generated by modern ORMs, and tables were not so big. And if
you had to optimise, denormalisation was the way: break your
data model for performance, in order to avoid joins.
Then came a very efficient join, the Hash Join, which was very
nice for joining a big table with some lookup tables, even large
ones. And at the same time came the Cost Based Optimizer. And
people didn't understand why Oracle didn't support the brand
new Hash Join with the old stable RBO. But the reason was simply
that it's impossible to do. How can you choose between joining
with Nested Loop index access or with Hash Join full table scan?
There is no rule for that. It depends on the size. So you need
statistics. And you need the CBO.

Multiblock Read

Ok, you changed your optimizer mode to CBO. You were now able
to do Hash Joins. You did not fear Full Table Scans anymore.
What is the great power of full scans? You can read several
blocks at once. The db_file_multiblock_read_count parameter
controls that number of blocks. And because the maximum I/O
size at that time on most platforms was 64k, and the default
block size is 8k, the default value for db_file_multiblock_read_count
was 8 blocks.
I'll illustrate the optimizer behaviour with a simple join between
a 500 row table and a 100,000 row table, forcing the join with
hints in order to show how the Nested Loop Join and Hash Join
costs are evaluated.
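The article does not include its test DDL, but a minimal sketch of a setup with the same shape (assumed column names, with row counts matching the plans below) might look like this:

-- Hypothetical setup: a small driving table A and a 100,000 row table B
-- with an index I on the join column (names match the plans below,
-- everything else is an assumption).
create table a as
  select rownum id, rpad('x',100) pad
  from dual connect by level <= 500;

create table b as
  select mod(rownum,500) id, rpad('x',100) pad
  from dual connect by level <= 100000;

create index i on b(id);

exec dbms_stats.gather_table_stats(user,'A')
exec dbms_stats.gather_table_stats(user,'B')

-- Forcing each join method with hints:
select /*+ leading(a) use_nl(b) index(b) */ count(b.pad)
from a join b on b.id = a.id;

select /*+ leading(a) use_hash(b) full(b) */ count(b.pad)
from a join b on b.id = a.id;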

On my example, when we execute it, the nested loop is 3 times
faster. Only when the first table reaches 1,500 rows does the
nested loop response time exceed the hash join's. Nested Loop is
the plan I want the optimizer to choose for that query. Now,
imagine I had that query 15 years ago. We will see how that
query's execution plan evolves with the versions of the CBO.
So I set the optimizer to the 8i version and db_file_multiblock_read_count
to the value it had at that time, 8 blocks:

alter session set optimizer_features_enable=8.1.7;
alter session set db_file_multiblock_read_count=8;

And here are the explain plans for both join methods.

Nested Loop in 8i with db_file_multiblock_read_count=8:

---------------------------------------------------------------------
| Id  | Operation                     | Name | Rows | Bytes | Cost |
---------------------------------------------------------------------
|   0 | SELECT STATEMENT              |      |  500 |  9000 | 1501 |
|   1 |  NESTED LOOPS                 |      |  500 |  9000 | 1501 |
|   2 |   TABLE ACCESS FULL           | A    |  500 |  4000 |    1 |
|   3 |   TABLE ACCESS BY INDEX ROWID | B    |    1 |    10 |    3 |
|*  4 |    INDEX RANGE SCAN           | I    |    1 |       |    2 |
---------------------------------------------------------------------

Hash Join in 8i with db_file_multiblock_read_count=8:

-----------------------------------------------------------
| Id  | Operation           | Name | Rows | Bytes | Cost |
-----------------------------------------------------------
|   0 | SELECT STATEMENT    |      |  500 |  9000 | 3751 |
|*  1 |  HASH JOIN          |      |  500 |  9000 | 3751 |
|   2 |   TABLE ACCESS FULL | A    |  500 |  4000 |    1 |
|   3 |   TABLE ACCESS FULL | B    | 100K |  976K | 3749 |
-----------------------------------------------------------

Clearly the nested loop is estimated to be cheaper. This is the
CBO default behaviour up to 9.2.
How is the cost calculated? The cost estimates the number of
I/O calls that have to be done.
Nested Loops has to do 500 index accesses, and each of them has
to read 2 index blocks and 1 table block. This is the cost=1500.
Hash Join has to full scan the whole table: 30,000 blocks
under the High Water Mark (we can see it in USER_TABLES.BLOCKS).
Because we read 8 blocks at a time, the cost that
estimates the number of I/O calls is 30000/8=3750.
But then, at the time of 8i to 9i, systems became able to do
larger I/Os. The maximum I/O size reached 1MB. And in order
to be able to do those large I/Os, we raised
db_file_multiblock_read_count to 128 (with db_block_size=8k):

alter session set db_file_multiblock_read_count=128;

Let's see how the CBO estimates each join now.

Nested Loop in 8i with db_file_multiblock_read_count=128:

---------------------------------------------------------------------
| Id  | Operation                     | Name | Rows | Bytes | Cost |
---------------------------------------------------------------------
|   0 | SELECT STATEMENT              |      |  500 |  9000 | 1501 |
|   1 |  NESTED LOOPS                 |      |  500 |  9000 | 1501 |
|   2 |   TABLE ACCESS FULL           | A    |  500 |  4000 |    1 |
|   3 |   TABLE ACCESS BY INDEX ROWID | B    |    1 |    10 |    3 |
|*  4 |    INDEX RANGE SCAN           | I    |    1 |       |    2 |
---------------------------------------------------------------------

Hash Join in 8i with db_file_multiblock_read_count=128:

-----------------------------------------------------------
| Id  | Operation           | Name | Rows | Bytes | Cost |
-----------------------------------------------------------
|   0 | SELECT STATEMENT    |      |  500 |  9000 |  607 |
|*  1 |  HASH JOIN          |      |  500 |  9000 |  607 |
|   2 |   TABLE ACCESS FULL | A    |  500 |  4000 |    1 |
|   3 |   TABLE ACCESS FULL | B    | 100K |  976K |  605 |
-----------------------------------------------------------

And now I have a problem. Hash Join looks cheaper. Cheaper in
number of I/O calls, that's right. But it's not cheaper in time.
It's true that doing fewer I/O calls is better, because latency is
an important part of the disk service time. But we still have the
same amount of data to transfer. Reading 1MB in one I/O call is
better than reading it in 16 smaller I/O calls, but we cannot cost
that 1MB I/O the same as one 8k I/O. This is the limit of costing
the I/O calls: we now have to cost the time it takes. But that
came with the next version (9i introduced cpu costing).
This is what happened at that 8i time. We were able to do
larger I/Os, but a lot of execution plans switched to Hash Join
when it was not the right choice. We didn't want to lower
db_file_multiblock_read_count and had no way to let the
optimizer evaluate the cost as an estimated time.
So came a freaky parameter to influence the optimizer...

Cost Adjustment

This is optimizer_index_cost_adj, which defaults to 100 (no
adjustment) but can range from 0 to 10000. Let's see what it does.
The weird idea was: because Full Table Scan is under-estimated,
let's under-estimate Index Access cost as well!

alter session set optimizer_index_cost_adj=20;

The Hash Join cost is the same as before (the under-evaluated
cost=607) but now the Nested Loop is cheaper:

---------------------------------------------------------------------
| Id  | Operation                     | Name | Rows | Bytes | Cost |
---------------------------------------------------------------------
|   0 | SELECT STATEMENT              |      |  500 |  9000 |  301 |
|   1 |  NESTED LOOPS                 |      |  500 |  9000 |  301 |
|   2 |   TABLE ACCESS FULL           | A    |  500 |  4000 |    1 |
|   3 |   TABLE ACCESS BY INDEX ROWID | B    |    1 |    10 |    1 |
|*  4 |    INDEX RANGE SCAN           | I    |    1 |       |    1 |
---------------------------------------------------------------------

The arithmetic is simple: we told the optimizer to under-evaluate
index access at 20% of the calculated value: 300 instead of 1500.
Nostalgics of RBO were happy: they had a means to always favour
indexes, even with the CBO. But this is a short-term satisfaction
only, because now the cost is false in all cases.
Why set optimizer_index_cost_adj to 20%? It's an arbitrary
way to lower the cost of index access by as much as the cost of full
table scan has been wrongly lowered. The goal is to compensate for
the ratio between multiblock read and single block read disk
service times. Of course, in hindsight, that was not a good approach.
More and more decisions are based on the optimizer estimations,
and faking them with an arbitrary value is not a good solution.

System Statistics

So the right approach is to change the meaning of the cost.
Estimating the number of I/O calls was fine when all I/O sizes
were in the same ballpark. But now not all I/Os are equal and
we need to differentiate single block and multiblock I/O. We
need to estimate the time. The cost will now be the estimated
time, even if, for consistency with previous versions, it is
expressed not in seconds but in the number of equivalent single
block reads that take the same time.
In addition to that, the optimizer also tries to estimate the time
spent on CPU. This is why it is called "cpu costing", even if the
major difference is in the costing of multiblock I/O.
In order to do that, system statistics were introduced: we
can calibrate the time it takes to do a single block I/O and a
multiblock I/O. That was introduced in 9i but not widely used.
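As a sketch of how those statistics can be inspected and gathered (the 60 minute window is an arbitrary example):

-- Viewing the current system statistics:
select pname, pval1 from sys.aux_stats$ where sname = 'SYSSTATS_MAIN';

-- Gathering workload system statistics over a representative window,
-- here 60 minutes (one of the documented dbms_stats gathering modes):
exec dbms_stats.gather_system_stats(gathering_mode => 'INTERVAL', interval => 60)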

Cost Adjustment

And calibration can also calculate a multiblock read count


measured during a workload, or have the default value of 8
when db_file_multiblock_read_count if not explicitly set.

This is optimizer_index_cost_adj that defaults to 100 (no


adjustment) but can change from 0 to 10000.
Lets see what it does:

The idea is then not to set db_file_multiblock_read_count.


The maximum I/O size will be used at execution time but the
optimizer uses a more realistic value, either the default (which
is 8) or the value measured during a workload statistics
gathering. But what we often see in the real life is that the
values that have been set once do remain for years even when
not accurate anymore.

The weird idea was: because Full Table Scan is under-estimated,


lets under-estimate Index Access cost as well!

alter session set optimizer_index_cost_adj=20;

The Hash Join cost is the same as before (the under-evaluated


cost=607) but now the Nested Loop is cheaper:
---------------------------------------------------------------------
| Id  | Operation                    | Name | Rows  | Bytes | Cost |
---------------------------------------------------------------------
|   0 | SELECT STATEMENT             |      |   500 |  9000 |  301 |
|   1 |  NESTED LOOPS                |      |   500 |  9000 |  301 |
|   2 |   TABLE ACCESS FULL          | A    |   500 |  4000 |    1 |
|   3 |   TABLE ACCESS BY INDEX ROWID| B    |     1 |    10 |    1 |
|*  4 |    INDEX RANGE SCAN          | I    |     1 |       |    1 |
---------------------------------------------------------------------

The arithmetic is simple: we told the optimizer to under-evaluate index access at 20% of the calculated value, so 300 instead of 1500. Those nostalgic for RBO were happy: they had a means to always favour indexes, even in CBO. But this is a short-term satisfaction only, because now the cost is wrong in all cases.

Why set optimizer_index_cost_adj to 20%? It's an arbitrary way to lower the cost of index access by as much as the cost of full table scan was wrong. The goal is to compensate for the ratio between multiblock read and single block read disk service times. Of course, in hindsight, that was not a good approach. More and more decisions are based on the optimizer estimations, and faking them with an arbitrary value is not a good solution.

System Statistics

So the right approach is to change the meaning of the cost. Estimating the number of I/O calls was fine when all I/O sizes were in the same ballpark. But now not all I/O are equal, and we need to differentiate single block and multiblock I/O. We need to estimate the time. The cost will now be the estimated time, even if, for consistency with previous versions, it is not expressed in seconds but in the number of equivalent single block reads that take the same time.

In addition to that, the optimizer also tries to estimate the time spent on CPU. This is why it is called cpu costing, even if the major difference is in the costing of multiblock I/O. In order to do that, system statistics were introduced: we can calibrate the time it takes to do a single block I/O and a multiblock I/O. That was introduced in 9i but not widely used. The calibration can also measure the multiblock read count actually achieved during a workload, or use the default value of 8 when db_file_multiblock_read_count is not explicitly set.

The idea is then not to set db_file_multiblock_read_count at all. The maximum I/O size will be used at execution time, but the optimizer uses a more realistic value: either the default (which is 8) or the value measured during workload statistics gathering. But what we often see in real life is that values that have been set once remain for years, even when they are no longer accurate.
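For completeness, here is a minimal sketch of gathering workload statistics with the documented dbms_stats calls (run it only over a period that really represents your workload):

-- Start the capture, let a representative workload run, then stop it
exec dbms_stats.gather_system_stats('start')
-- ... representative workload runs here ...
exec dbms_stats.gather_system_stats('stop')

-- Alternatively, capture a fixed window (here 60 minutes) in one call
exec dbms_stats.gather_system_stats('interval', interval => 60)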


In 10g, cpu costing became the default, and it uses default values if we didn't gather system statistics: a 10 millisecond seek time, a 4KB/millisecond transfer rate, and a default multiblock estimation of 8 blocks per I/O call. So reading an 8KB block takes 10+2=12 milliseconds and reading 8 blocks takes 10+16=26 milliseconds. This is how the choice between index access and table full scan can be evaluated efficiently.
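As a quick sanity check, that arithmetic can be done in SQL; nothing is hidden here, it is just the formula above applied to an 8KB block:

-- Noworkload defaults: IOSEEKTIM = 10 ms, IOTFRSPEED = 4096 bytes/ms, MBRC = 8
select 10 + 8192/4096     as sreadtim_ms,  -- 12 ms for one 8KB block
       10 + 8*8192/4096   as mreadtim_ms   -- 26 ms for an 8-block read
from dual;

Per block, the multiblock read comes out at 26/8 = 3.25 milliseconds against 12 milliseconds for a single block read, a ratio of about 27%: close to the arbitrary 20% adjustment seen earlier, but derived from measurable quantities instead of guessed.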
alter session set optimizer_features_enable=10.2.0.5;

I've reset optimizer_index_cost_adj so that Nested Loop has the correct cost:


-------------------------------------------------------------------------------------
| Id  | Operation                    | Name | Rows  | Bytes | Cost (%CPU)| Time     |
-------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT             |      |   500 |  9000 |  1503   (1)| 00:00:19 |
|   1 |  NESTED LOOPS                |      |   500 |  9000 |  1503   (1)| 00:00:19 |
|   2 |   TABLE ACCESS FULL          | A    |   500 |  4000 |     2   (0)| 00:00:01 |
|   3 |   TABLE ACCESS BY INDEX ROWID| B    |     1 |    10 |     3   (0)| 00:00:01 |
|*  4 |    INDEX RANGE SCAN          | I    |     1 |       |     2   (0)| 00:00:01 |
-------------------------------------------------------------------------------------

You can see that an estimated time now appears. Cost is now time: here it is estimated at about 1500 single block reads. And Hash Join now uses system statistics (I've reset db_file_multiblock_read_count as well):


---------------------------------------------------------------------------
| Id  | Operation          | Name | Rows  | Bytes | Cost (%CPU)| Time     |
---------------------------------------------------------------------------
|   0 | SELECT STATEMENT   |      |   500 |  9000 |  4460   (1)| 00:00:54 |
|*  1 |  HASH JOIN         |      |   500 |  9000 |  4460   (1)| 00:00:54 |
|   2 |   TABLE ACCESS FULL| A    |   500 |  4000 |     2   (0)| 00:00:01 |
|   3 |   TABLE ACCESS FULL| B    |   100K|   976K|  4457   (1)| 00:00:54 |
---------------------------------------------------------------------------

And even if we do fewer I/O calls, Hash Join is estimated to take longer. On multiblock reads, the transfer time is an important part of the response time, and this is what was not taken into account before system statistics.

So we now have the right configuration for the optimizer:
- db_file_multiblock_read_count not set
- optimizer_index_cost_adj not set
- accurate system statistics
This is the right configuration for all versions since 9i.

If you didn't gather workload system statistics (which is the right choice if you're not sure that your workload is representative), you won't see SREADTIM and MREADTIM in sys.aux_stats$, but you can calculate them from IOSEEKTIM and IOTFRSPEED:
- MBRC, when not gathered, is 8 when db_file_multiblock_read_count is not set (which is the right approach)
- SREADTIM, when not gathered, is calculated as IOSEEKTIM + db_block_size / IOTFRSPEED
- MREADTIM as IOSEEKTIM + db_block_size * MBRC / IOTFRSPEED
Once you have validated that those values are accurate, you can stop faking the optimizer with arbitrary cost adjustments.

Unfortunately, a lot of sites moved to cpu costing when upgrading to 10g but still keep some mystic value for optimizer_index_cost_adj. Thus they have a lot of inefficient reporting queries doing nested loops over large numbers of rows. This takes a lot of CPU, and the response time increases as the volume increases. And people blame the instability of the optimizer, without realising that they are explicitly giving wrong input to the optimizer algorithm. If this is your case, it's time to get rid of it: the problem it originally addressed empirically is now solved statistically. Check your system statistics and, if SREADTIM and MREADTIM look good (check sys.aux_stats$), reset optimizer_index_cost_adj and db_file_multiblock_read_count to their default values.
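A minimal sketch of the check and the cleanup (sys.aux_stats$ and the statistic names are the documented ones; the reset syntax assumes an spfile and takes effect at the next instance restart):

-- What the optimizer currently knows about your I/O
select pname, pval1 from sys.aux_stats$ where sname = 'SYSSTATS_MAIN';

-- If the values look right, remove the legacy overrides
alter system reset optimizer_index_cost_adj scope=spfile sid='*';
alter system reset db_file_multiblock_read_count scope=spfile sid='*';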

12c Adaptive Joins

We will be upgrading to 12c soon, and we will benefit from a very nice optimizer feature that intelligently chooses between Hash Join and Nested Loop at execution time. This is a great improvement when the estimated cardinality is not accurate, because the choice is made at runtime, from the real cardinality.

But that decision is still based on the cost. At parse time, the optimizer evaluates the inflection point where the cardinality becomes too high for a Nested Loop and it is better to switch to a Hash Join. But if the Nested Loop cost is under-evaluated, then a Nested Loop will be used even for a high cardinality, and that will be bad, consuming CPU to read the same blocks again and again. Below is my adaptive execution plan on 12c:

-------------------------------------------------------------------------------
|   Id  | Operation                     | Name | Starts | E-Rows | Cost (%CPU)|
-------------------------------------------------------------------------------
|     0 | SELECT STATEMENT              |      |      1 |        |  1503 (100)|
|- *  1 |  HASH JOIN                    |      |      1 |    500 |  1503   (1)|
|     2 |   NESTED LOOPS                |      |      1 |        |            |
|     3 |    NESTED LOOPS               |      |      1 |    500 |  1503   (1)|
|-    4 |     STATISTICS COLLECTOR      |      |      1 |        |            |
|     5 |      TABLE ACCESS FULL        | A    |      1 |    500 |     2   (0)|
|  *  6 |     INDEX RANGE SCAN          | I    |    500 |      1 |     2   (0)|
|     7 |    TABLE ACCESS BY INDEX ROWID| B    |    500 |      1 |     3   (0)|
|-    8 |   TABLE ACCESS FULL           | B    |      0 |      1 |     3   (0)|
-------------------------------------------------------------------------------
And from the optimizer trace (gathered with event 10053 or with dbms_sqldiag.dump_trace):

DP: Found point of inflection for NLJ vs. HJ: card = 1432.11

That means that index access is the best approach as long as there are fewer than about 1400 nested loop iterations to do. If there are more, then Hash Join is better. The statistics collector counts the rows at execution time to see whether that inflection point is reached.

But here is what happens if I keep my old cost adjustment inherited from the old 8i times:

alter session set optimizer_index_cost_adj=20;

The inflection point is much higher:

DP: Found point of inflection for NLJ vs. HJ: card = 7156.65

If I have 5000 rows instead of 500, the execution will still do a Nested Loop, and that's obviously bad: it will do 15000 single block reads (two index blocks and one table block per loop). Without a doubt it is far more efficient to read the 30000 blocks of the table with a full table scan doing large I/O calls.
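To reproduce this on your own system, here is a minimal sketch of both tracing approaches (the sql_id below is a placeholder, and the dump_trace parameters follow the commonly published usage of this diagnostic procedure):

-- Option 1: event 10053 traces the next hard parse in this session
alter session set events '10053 trace name context forever, level 1';

-- Option 2: dump the optimizer trace of an already parsed cursor
-- ('abcd1234wxyz0' is a placeholder sql_id)
exec dbms_sqldiag.dump_trace(p_sql_id => 'abcd1234wxyz0', p_child_number => 0, p_component => 'Compiler', p_file_suffix => 'NLJ_HJ')

Then search the generated trace file for 'point of inflection'.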

Conclusion
Using optimizer_features_enable like a time machine, we were able to see how the optimizer has evaluated the cost of index vs. full scan in the past. But there is an issue that is still current: a lot of databases still have those old settings, and a lot of software vendors still recommend them. They finally gave up on RBO because they cannot recommend a desupported feature, but, probably out of fear of change, they still recommend this old cost adjustment setting.
However, the only reason for it disappeared with system statistics, years ago. So it's time to stop faking the CBO. Today the CBO makes really good choices when given good input. Since 10g, the good is System Statistics, the bad is RBO, and the ugly is optimizer_index_cost_adj. If you are on 10g, 11g or even 12c, choose the good and don't mix it with an ugly setting inherited from the past.

ABOUT THE AUTHOR

Franck Pachot
Senior Consultant, dbi services
Franck Pachot is a senior consultant at dbi services in Switzerland. He has 20 years of experience with Oracle databases across all areas, from development, data modelling and performance to administration and training. He shares his knowledge in forums, publications and presentations, and recently became an Oracle Certified Master.

Autumn Special Interest Group Meetings

September
17th UKOUG Business Intelligence & Reporting Tools SIG, London
17th UKOUG Solaris SIG Meeting, London
18th UKOUG Public Sector HCM Customer Forum, Solihull
24th OUG Ireland BI & EPM SIG Meeting, Dublin
24th OUG Ireland HCM SIG Meeting, Dublin
24th OUG Ireland Technology SIG Meeting, Dublin
25th UKOUG Oracle Projects SIG, London

October
TBC UKOUG RAC Cloud Infrastructure & Availability SIG
7th UKOUG Public Sector Applications SIG Meeting, Solihull
9th UKOUG Application Server & Middleware SIG, Reading
14th UKOUG Taleo SIG Meeting, London
15th UKOUG Solaris SIG Meeting, London
21st UKOUG Database Server SIG, Reading
22nd UKOUG Supply Chain & Manufacturing SIG, Solihull
23rd UKOUG HCM SIG, Solihull
23rd UKOUG Partner Forum, London
23rd UKOUG Partner of the Year Awards 2014, London

November
4th UKOUG Public Sector Financials Customer Forum, London
6th UKOUG Apps DBA for OEBS, London
12-13th UKOUG JD Edwards Conference & Exhibition 2014, Oxford
18th UKOUG Application Express SIG Meeting, London
19th UKOUG Solaris SIG Meeting, London
27th UKOUG Public Sector HCM Customer Forum Workshop, Solihull

December
8-10th UKOUG Applications Conference & Exhibition 2014, Liverpool
8-10th UKOUG Technology Conference & Exhibition 2014, Liverpool
9th UKOUG Primavera 2014, Liverpool
17th UKOUG Solaris SIG Meeting, London

Dates correct at time of print.


SUPPLIER GUIDE

A list of suppliers of Oracle-related products and services, to save you the time and energy of searching for them.

Excel4apps: Rescue Your Month End with Live Oracle 11i/R12 Reporting in Excel
www.excel4apps.com | dean.jones@excel4apps.com | +44 (0)800 756 6869

Provanis: We focus exclusively on the staffing of Oracle Applications projects.
t. +44 (0)20 7010 0450 e. jobs@provanis.com w. provanis.com @Provanis

Be seen in Oracle Scene. Want to see your company listed here?
Contact: Kerry Stuart, T +44 (0)20 8545 9686, kerry@ukoug.org

More than 70% of members join UKOUG looking to solve a problem. You could have their solution... Get involved at www.ukoug.org/join

Follow us @UKOUG

Advertorial: Inoapps

Why Cloud Deployment Makes Sense

The significant advantages of cloud software are increasingly clear to many midsize and larger global companies. Indeed, leading European Oracle Platinum Partner Inoapps was so impressed that it deployed Oracle Financials Cloud itself, gaining a unique insight as both a user and a consultant. Here Phil Wilson, Senior Vice President of Inoapps, explains why moving to the cloud makes perfect sense.

Phil Wilson, VP Business Development and Alliances, Inoapps


Today, CxOs have the daunting task of selecting the right IT to support business growth whilst seeking to reduce back-office costs. Naturally, cloud computing now plays a significant role in re-balancing a company's technology investment, as it enables management to fundamentally alter how their organisation utilises and accesses the technology they need.

Most importantly, the cloud model is enabling organisations to release and redeploy significant budgetary expenditure to other business-critical areas such as product development, mobile working or creating customer-facing growth opportunities. Gone are the days where businesses had to fund teams of IT staff reviewing, configuring, testing, deploying and upgrading the myriad pieces of technology that they had bought.

Oracle has led the charge here, recognising how effectively the cloud can simplify IT, and thereby eliminate the limitations of complex and costly back-office infrastructures. As President Mark Hurd said recently: "We think our customers deserve better. The IT industry should assume the cost of, and responsibility for, making products that work together, so that our customers don't have to do it themselves."

This refreshingly clear vision is well received. Empowered business leaders recognise that modern SaaS applications are explicitly designed to exploit the latest IT innovations and that Oracle meets the strategic need for a primary cloud provider; one that is capable of delivering streamlined systems across all core business processes.

As moving to the cloud is such a fundamentally different approach, it's important to find the right implementation partner, one that can fully exploit the technology's potential. Inoapps has a clear advantage here as not only can we support the whole stack, but we also possess an unrivalled level of experience. This is no idle claim: we were so impressed with the technology that we became one of Europe's first Oracle ERP Cloud implementations.

In all, there is a three-phase process. Using our rapid implementation methodology, the core financial modules were up and running in just two months, enhanced financial reporting and analytics in only another four weeks, and Project Controls in a further eight. We were impressed by the system's usability, functionality, unequalled levels of integration and also its world-class security.

Once deployed, a wide range of major benefits became clear. We now have an exceptionally scalable platform, capable of supporting Inoapps' growth, acquisition strategy and internationalisation. It has also enabled us to empower mobile working and, with the system's excellent social app support, it significantly improves collaboration. We're not stopping there though: looking forward, we're further expanding the footprint with Oracle EPM Cloud and Oracle HCM Cloud.

So by utilising Oracle Cloud ourselves, we gained real customer insight into the system's significant deliverables. Consequently, Inoapps can now advocate its merits from both user and consultant perspectives. We aim to pass this unique insight on to other organisations seeking to modernise their IT right across the enterprise. This isn't just limited to financials, HR and CRM; it could well be areas such as strategic financial planning, supply chain development or customer services, so any organisations who think we could help them should contact us.

ABOUT THE AUTHOR

Phil Wilson
VP Business Development & Alliances, Inoapps
Inoapps' Phil Wilson has over 20 years' experience in the business and IT arenas, working both at end-client organisations and with implementers. With extensive knowledge of implementing packages covering manufacturing, logistics, financials and data warehousing, Phil has been responsible for the project management and delivery of a large number of these. He has developed skills in taking business systems and applying them effectively to support business processes, driving adoption and efficiency, particularly through the development of leading-edge implementation tools that enable successful, low-risk, low-cost and rapid installations. Follow Phil Wilson on Twitter @PhilipDWilson


InFusion /ɪnˈfjuːʒ(ə)n/ NOUN

1. Introduction of something needed: the addition of a new or necessary quality or element to something
2. The perfect blend of Inoapps and Oracle Fusion ERP Cloud
3. Enables you to fast-track Oracle Fusion ERP Cloud into your business for a rapid ROI

Get Oracle Fusion ERP Cloud in just 8 weeks

Achieve growth objectives with a system that grows with you safely and securely
Ensure consistent processes to standardise, automate and simplify your enterprise around the globe
Reduce costs with a subscription-based, flexible, Software as a Service (SaaS) deployment model
Modernise your financial back-office with the latest in analytic, mobile and social technologies

Find out more:
w: www.inoapps-fusion.com e: info@inoapps.com t: 0870 112 2000

CONSULT I LICENCE I HOST I SUPPORT I DEVELOP I EXTEND
