IMPROVING CUSTOMER SATISFACTION AND OPERATIONAL


EFFECTIVENESS WITH THE USE OF AN ICT SERVICE
MANAGEMENT BEST PRACTICE FRAMEWORK: ACTION RESEARCH
IN THE SHARED SERVICE CENTRE

Note: This is a slightly revised version of the dissertation with some mistakes
and oversights corrected. Some new considerations are included as footnotes
on the relevant pages.

Johannes Hendrik Botha

Dissertation submitted to Oxford Brookes University in partial fulfilment of the requirements for
the degree of Master of Business Administration

Supervisor: Dr. BC Potgieter


(Waikato Institute of Technology)

Academic Dean: Dr. Charlene Lew


(Damelin International College of Postgraduate Business Sciences)

March 2004

TABLE OF CONTENTS

ACKNOWLEDGEMENTS AND THANKS

CHAPTER 1  INTRODUCTION
1.1  Background  9
1.2  Problem Statement  10
1.3  Purpose, Objectives and Key Constructs  11
1.4  Overview of Dissertation  12

CHAPTER 2  LITERATURE REVIEW  14
2.1  Customer Satisfaction and Quality  15
2.1.1  Customer Satisfaction Overview  15
2.1.2  Quality Overview  16
2.1.3  Assessing Customer Satisfaction  18
2.2  ICT Service Management Best Practice  19
2.2.1  Best Practice  19
2.2.2  Service Management  20
2.2.3  ICT Service Management Best Practice  20
2.2.4  ITIL Background and Concepts  21
2.2.5  Quality Best Practice  24
2.2.6  Managing ICT Organisations Measurement and Control  27
2.2.7  Control Objectives for Information and Related Technologies (CobiT™)  30
2.3  Research Framework  33
2.4  Summary of Literature Reviewed  36

CHAPTER 3  RESEARCH METHODOLOGY  40
3.1  Introduction  40
3.2  Nature of Research  41
3.2.1  The nature of data collected  41
3.2.2  Research approach  44
3.3  Research Design  45
3.3.1  Unfolding research project  46
3.3.2  Planning  46
3.4  Characteristics of Sample  47
3.5  Gathering Data  48
3.6  Design of Questionnaires  48
3.6.1  ICT Service Management Process Maturity Measurement  48
3.6.2  Customer Satisfaction Survey  51
3.6.3  Call Statistics  52
3.7  Data analysis  52

CHAPTER 4  RESULTS  53
4.1  Overview of Results  53
4.2  ICT Service Management Maturity and Best Practice Conformance Assessments  54
4.2.1  Assessment Data  54
4.2.2  Conclusion - Maturity and Best Practice Conformance Assessments  56
4.3  Customer Satisfaction Survey  56
4.3.1  Results of Surveys  56
4.3.2  Validity of Responses  56
4.3.3  Initial comparison  59
4.3.4  Adjustment of Survey results  60
4.3.5  Anomalies  62
4.3.6  SERVQUAL - Factors Compared  63
4.3.7  Conclusion - Satisfaction Surveys  64
4.4  Call Statistics  64
4.4.1  Discussion of Data  64
4.4.2  Conclusion - Call Statistics  66
4.5  Management Interviews  67
4.6  Conclusion on Results  69

CHAPTER 5  FINDINGS AND RECOMMENDATIONS  70
5.1  The nature of data collected  70
5.2  Research Framework  72
5.4  General Observations  77
5.5  Conclusion - Research Results  78
5.6  The Operational Context  80
5.6.1  Objectives and Goals  80
5.7  Context for implementing Service Management Best Practice  81
5.8  Specific Recommendations  84
5.8.1  Implementing a Quality Management System  85
5.8.2  Effectively Control and Manage Objectives  86
5.8.3  Matrix Structure  87
5.8.4  Service Culture  88
5.8.5  Managing Change  88
5.9  Conclusion  89

CHAPTER 6  THEORY AND PRACTICE  92
6.1  Critique on Literature and Theory  92
6.2  Literature vs. the Research Environment  96
6.3  Research Framework  97
6.3.1  Applicability of the Framework  97
6.3.2  Future use of the Framework  98
6.4  Conclusions re the purpose of the research  98
6.4.1  Is there a direct link between Customer Satisfaction and Service Management Best Practice?  98
6.4.2  Is Customer Satisfaction an indication of effective service Provision?  99
6.4.3  The Operational context best suited for the implementation of ITIL  99
6.5  Models devised  100
6.6  Future Research  101
6.7  The value of this dissertation  102

Bibliography  103
Appendix-A: Interviews with Executive Management  110
Appendix-C: Other Interviews  122
Appendix-D: Customer Satisfaction Surveys  124
Appendix-E: Call Statistics  128
Appendix-F: Research Synopsis  130
Appendix-G: The Content of Service Support and Delivery disciplines in ITIL  136
Appendix-H: Balanced Scorecard and CobiT™  138
Appendix-I: Potgieter's IT/IS Practices Framework  140
Appendix-J: Action Research, Research Roadmap used  142
Appendix-K: The Service Capability Maturity Model  146
Appendix-L: The Service Quality Gap Model  147
Appendix-M: Correlation between Primary Data  148
Appendix-N: Balanced Scorecard Perspectives and what they should mean to Technology Services  149

TABLES AND FIGURES

Figure-1. Managing Customer Expectations  15
Figure-2. Adapted view of American Consumer Satisfaction Index Model  17
Figure-3: Information Technology Infrastructure Library (ITIL)  23
Figure-4: Quality Spanning the whole Organisation  25
Figure-5: The Balanced Scorecard (Kaplan and Norton, 1996)  28
Figure-6: Balanced Scorecard cause-and-effect relationships  29
Figure-7: Balanced Scorecard, Measured Outcomes and Performance  32
Figure-8: Cascading Balanced Scorecard  32
Figure-9: The Gaps Model for Service Quality  33
Figure-10: Research Framework  35
Figure-11: The Internal Business Process Perspective vs. ICT Service and Quality Management  38
Figure-12: The Nature of Primary Data Collected  41
Figure-13: Potgieter's Quality Systems Practice Framework  43
Figure-14: Process Maturity Framework for IT Service Management  49
Figure-15: Levels of Process Maturity  50
Figure-16: The Research Framework as related to Section-4  53
Figure-17: OGC Process Maturity Assessments compared  54
Figure-18: Best Practice Conformance Assessments compared  55
Figure-19: Percentage of respondents who have logged calls  57
Figure-20: Percentage of the population who responded to the surveys  58
Figure-21: Median Response to Customer Satisfaction Surveys  59
Figure-22: Did Service improve during the last Quarter Surveyed?  60
Figure-23: Off-set values applied to Survey-3  61
Figure-24. Customer Satisfaction Surveys Compared  61
Figure-25: Survey Questions related to SERVQUAL Factors  63
Figure-26: SERVQUAL factors by Customer Satisfaction Survey  64
Figure-27: Summary Statistics from the Service Desk Database and Incident Manager  65
Figure-28: Average number of calls logged by user per quarter  67
Figure-29: Research data related to Potgieter's 4 paradigms  71
Figure-30: Research Framework vs. Research Findings  72
Figure-31: Data sets used in final assessments  74
Figure-32: Correlation Coefficient calculated for Assessment and Survey results  75
Figure-33: Coefficient of Determination (R²) of Aggregate Assessments  76
Figure-34: Goals and Objectives relating to this Dissertation  81
Figure-35: Operating and Service Delivery Strategy for Technology Services  83
Figure-36: ICT Service and Quality Management and the Business Environment  84
Figure-37: Quality Management Process vs. ITIL  86
Figure-38: The Process Integration Model - An Overview of Recommendations  89
Figure-39: The Hourglass Model of Customer Satisfaction  93
Figure-40: Potgieter's New IT/IS Practices Framework  140
Figure-41: Potgieter's Quality Systems Practice Framework  141
Figure-42: The Gaps Model for Service Quality (Niessink, 2001)  147

Table-1. Sample of Customer Satisfaction Survey Questions and Measurement Scale  51
Table-2: Summarised responses of Managers regarding Service Quality, Best Practice etc.  68
Table-3: Assessment for Service Level Management  112
Table-4: Prerequisites for PMF Levels to equal 3.5  114
Table-5: Process Maturity and Best Practice Conformance Assessment - Service Level Management & Service Desk  118
Table-6: Process Maturity and Best Practice Conformance Assessment - Incident & Problem Management  119
Table-7: Process Maturity and Best Practice Conformance Assessment - Change & Configuration Management  120
Table-8: Process Maturity and Best Practice Conformance Assessment - Release Management  121
Table-9: The Customer Satisfaction Survey used, Qualitative Responses (Ordinal Data)  126
Table-10: Customer Satisfaction Surveys 1, 2, 3 and 3 for Managers  127
Table-11: Summary Statistics from the Service Desk Database and Incident Manager  128
Table-12. Call Statistics from the Service Desk Database  129
Table-13: The unfolding project - Plan, Act, Observe and Reflect  142
Table-14: Correlation Coefficient (R) calculated for all data sets collected  148

© 2003/2004 J.H. Botha. All rights reserved.

Making use of the models and frameworks devised by the author is permitted provided that the author is acknowledged using the following reference: Botha JH (2004). Improving Customer Satisfaction and Operational Effectiveness with the use of an ICT Service Management Best-Practice framework: Action Research in the Shared Services Centre. Dissertation for the degree of Master of Business Administration, Oxford Brookes University / Damelin International College of Postgraduate Business Sciences, Johannesburg, South Africa. For other authors, institutions or organisations referenced in this publication, the respective copyright of the original author/s applies.
For more information write to: Johann Botha, P.O. Box 2554, Pinegowrie 2123, South Africa, or e-mail: jhb@jitt.co.za

A note on doing Internet-based literature research
The internet is a wonderful research aid, especially for finding literature for your literature review. Be very careful, though: it is all too easy to copy a piece of text into your draft papers and forget to note the source. In this dissertation I made this oversight. The work of Mr. Jos van Iwaarden was cited in the bibliography of my dissertation, but the citations in the text somehow got lost between draft papers and the 30-odd revisions of the final text. Jos was kind enough to point this oversight out to me and I corrected it in this version of the text; this, however, does not correct the copy of the dissertation filed in the University Library or copies downloaded from my website (www.jitt.co.za) between the time of posting it and Jos's correspondence with me.
I am a firm believer in making intellectual property freely available for use in both academic circles and in business. The originators/owners of IP, though, need to be acknowledged. Although it was not intentional, I wronged Jos by not being diligent enough in my research methods and neglecting to cite his work appropriately. Don't make this mistake: be diligent in recording your sources, and if a train of thought was sparked by something someone else said, please give them the credit they deserve.

ACKNOWLEDGEMENTS AND THANKS

The last five years were eventful, to say the least, and were it not for the support and constant motivation of friends and family, this dissertation would not have been possible.
My appreciation goes to the close friends who gave great support during a challenging time in my career and life - the three Davids, George, Nicki, Mark and Alta, you are rare gems - and to my mentor Francois Baird, who assisted me with my first steps on the path of management: thank you.
To the personnel of Damelin and Oxford Brookes, especially Dr. Charlene Lew, Prof. Zak Nel and my study leader Dr. Christo Potgieter from the Waikato Institute of Technology, for their support, guidance and assistance, and to Yacinda Fourie for reviewing the language and grammar of this dissertation: your effort and input are appreciated.
My appreciation also goes to the management of the Shared Service Centre for giving permission to do the research at the organisation. I am also indebted to my colleagues for their input, support and guidance, and to the Meta Group, QIMS, Foster Melliar, Quninica, AL Indigo and Microsoft South Africa, who assisted and supplied information and advice during the project.
To the individuals to whom I am most indebted, my two sons Adriaan and Jan-Hendrik, thank you for your patience and understanding; and to my wife Erna, the love of my life: without your support, patience and understanding I would not have been able to complete this mammoth task.
This has truly been one of the most enriching experiences of my life - my sincere thanks to everyone who contributed.
I dedicate this dissertation to my father, who passed away before he had the opportunity to complete his master's degree - this one is for you, Dad!
---ooOoo---

CHAPTER 1

INTRODUCTION
1.1 Background
The management of a multi-disciplinary organisation in South Africa, which asked not to be named, identified the need to consolidate non-core services in order to make better use of resources and to ensure that business-units focus on their core competencies and mandate; as a result, the Shared Service Centre was formed. Non-core services offered by Shared Service Centre include financial, procurement, audit, human resource and Information and Communication Technology (ICT) services.
The mission of Shared Service Centre is to improve quality of service by providing enterprise-wide, cost-effective transversal services. Shared Service Centre intends to deliver a world-class reference site with best-of-breed practices, procedures and systems across the organisation (Budget, 2002).
The ICT department (Technology Services) of Shared Service Centre was established in 2000 and actively started providing services in June 2002 to users in Shared Service Centre itself; this was considered the first phase of service provision to the broader organisation. Further phases will focus on the provision of services to other business-units from July 2003 onwards.
The mission of Technology Services is to enable and transform the organisation through the use of ICT. This mission translates into a set of strategic objectives that are best summarised by the following theme:
"Perform while you transform" (Organisation's ICT plan: 2002)
This theme is evident in all objectives, ensuring that the transition to an e-business-enabled infrastructure happens concurrently with daily business, thus minimising disruption. It is an incremental process with continuous value-add as the implementations grow and expand.
ICT in the organisation was decentralised and managed in a distributed fashion by business-units. Decentralised departmental technology units have not functioned effectively due to a lack of funds and skilled resources.
A very diverse environment exists within the organisation. The problem now is to ensure that the ICT environment supports the vision of an e-business-enabled enterprise. This new environment requires information sharing and collaborative service provision across departmental boundaries, suppliers and customers. It further requires the alignment of all ICT functions in all business-units. Whilst this does not necessarily mean standardisation of a product, it does require the collective development of common objectives and standards, systems, procedures and governance. The business and operational plans for ICT services should address Business and Technology Alignment, Human Capital Management, Architecture and Infrastructure Planning, Operational and Service Management, and Governance and Control.
To ensure that these objectives are met, Technology Services decided to adopt the de facto standard for ICT Service Management Best Practice, the Information Technology Infrastructure Library (commonly called ITIL), as its framework.
ITIL is "a comprehensive and coherent code of practice to help organisations provide efficient and cost-effective ICT services" (OGC(3): 2002). ITIL was developed by the Office of Government Commerce (OGC), a British government Executive Agency tasked with developing Best Practice guidelines on the use of ICT in service management and operations, in conjunction with leading industry players around the world who validated these best practice guidelines. ITIL is a framework of Best Practice and not a methodology; it supports an ongoing process of service improvement and provides a reference rather than a dictate.

1.2 Problem Statement
The benefits of implementing ITIL are many, according to the Office of Government Commerce, including improved quality of service and customer satisfaction (OGC(1): 2002).
When the Office of Government Commerce was approached by the researcher to enquire about the basis of these claims, especially the claim that ITIL contributes to Customer Satisfaction, a Best Practice development consultant of the Office of Government Commerce responded as follows (Burrel, 2003):
"The claims that the implementation of ITIL can result in increased Customer Satisfaction are based on reports we have received from organisations which have employed the best practices for
areas such as the Service Desk and which have found from their internal surveys that Customer Satisfaction has increased. We are unable to provide statistical evidence to support this."
Mr. Burrel pointed the researcher to the IT Service Management Forum, a body of Service Management professionals, since, as he put it, "they have more contact with users of ITIL than us". The response of the IT Service Management Forum took a similar line; the Forum did, however, conduct a survey in the last quarter of 2001 in which a hundred FTSE500 companies and UK government agencies took part. This survey showed that 97% of respondents claimed to have derived benefits from using ITIL, and 69% claimed to have derived measurable benefits from using ITIL; what these benefits were was not specified (itSMF, 2001).
It is clear to the researcher that no theoretical premise exists delineating the benefits of ICT Service Management Best Practice (ITIL) and proving that it contributes to increased customer satisfaction or operational effectiveness. This dissertation should thus prove to be of importance to the field of Information Technology Service Management and the field of Service Management as a whole.

1.3 Purpose, Objectives and Key Constructs
The purpose of this dissertation is to determine whether a relationship exists between the use of ICT Service Management Best Practice and Customer Satisfaction, and whether a measurable improvement in Service Quality was achieved as a result of implementing/following Best Practice.
Although the Office of Government Commerce claims that the use of ITIL results in increased Customer Satisfaction, this hypothesis has to date been inferred rather than proved. Proving or disproving this claim will thus substantially contribute to the ICT Service Management field of study.
The objective of this dissertation is to ascertain (1) whether there is a direct correlation between Customer Satisfaction and the use of Service Management Best Practice in the form of ITIL, (2) whether Customer Satisfaction is an indication of effective service provision, and (3) the operational context best suited to implementing ITIL. Whether the objectives of this dissertation were achieved will be reviewed in Section 6.4.

Before addressing the research project, some key questions need to be addressed. Firstly, what is a customer? Some authors draw a distinction between users and customers (Wood et al, 2001); in this dissertation these terms will be used interchangeably. A customer is thus any organisation, group or user using a service, and the term will be treated as such.
Secondly, what is Customer Satisfaction? Customer Satisfaction is a subjective measure of how customers perceive or experience the services provided by a service provider.
Thirdly, what is ICT Service Management Best Practice? Best Practice is a very elusive concept. ITIL as a Best Practice framework is a set of "suggestions" that the contributors to ITIL regarded as important elements in managing ICT environments; no one organisation can claim total conformance to ICT Best Practice.
Lastly, the role of the researcher was that of Interim Service Level Manager and Service Management consultant at the research site. A key consideration for the researcher was to ensure that business-focused and "valued" services were provided to customers. Service Level Management is a key component of Service Management and acts as an interface between the business and the ICT Service Provider.

1.4 Overview of Dissertation
This dissertation is divided into six sections, each dealing with issues to be addressed to satisfy the aims and objectives of the dissertation. These sections are:
1. An introductory chapter, giving background to the research organisation, the problem statement, some key constructs, the purpose and objectives of the research and, finally, this overview of the dissertation.
2. The second chapter reviews literature of relevance to the research. Key issues addressed are: Customer Satisfaction and its relationship to Quality; what ICT Service Management Best Practice is and the components necessary for successful implementation; the research framework used; and finally a summary of the chapter.
3. The third chapter has a short introduction followed by sections dealing with the nature of the research, the research design, sample characteristics, the gathering of data, and the design of the questionnaires and instruments used. It concludes with data analysis.

4. Chapter four deals with the research results, followed by a short conclusion on the results obtained. The three major instruments used deal with Process Maturity and Conformance to Best Practice, Customer Satisfaction and Call Statistics.
5. Chapter five critically examines the research and makes recommendations: it looks at the nature of the information and data collected and compares primary data to see if correlations exist between the data-sets, followed by a conclusion on the analysis of the data and findings. Next, the operational context and the context for effective implementation of Service Management at the research site are discussed, followed by recommendations and a conclusion.
6. The last chapter deals with the relationship between the research conducted and the theory
discussed and used.

CHAPTER 2
LITERATURE REVIEW
Although this dissertation focuses on the field of ICT Service Management, the social impact of actions and decisions made by managers in the organisation needs to be considered. The dissertation evaluates, amongst other things, the social impact (Customer Satisfaction) of the implementation of IT Service Management Best Practice. It also ascertains whether IT Service Management Best Practice contributes to organisational effectiveness and efficiency by measuring and comparing three independent but interrelated measures (an illustrative comparison sketch follows the list):
1. Service Management Process Maturity and Best Practice Conformance (an objective measure using an assessment tool of the Office of Government Commerce/itSMF).
2. Service Quality (an objective measure using system statistics and the number of users and support personnel, supplemented by interactions with staff and managers).
3. Customer Satisfaction (a subjective measure based on customer satisfaction surveys).
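The sketch below is illustrative only: it uses hypothetical quarterly scores, not the research data, to show how paired maturity and satisfaction measures of this kind could be compared using a correlation coefficient (R) and coefficient of determination (R²), the statistics reported later in this dissertation.

```python
# Illustrative only: hypothetical quarterly scores, not the dissertation's data.
from statistics import correlation, mean  # correlation() requires Python 3.10+

maturity = [1.8, 2.1, 2.4, 2.6]       # aggregate process maturity / conformance score per round
satisfaction = [3.1, 3.3, 3.4, 3.8]   # median customer satisfaction survey score per round

r = correlation(maturity, satisfaction)  # Pearson's R
print(f"mean maturity: {mean(maturity):.2f}")
print(f"mean satisfaction: {mean(satisfaction):.2f}")
print(f"correlation coefficient R: {r:.2f}")
print(f"coefficient of determination R^2: {r * r:.2f}")
```

With only a handful of assessment rounds such a coefficient is indicative at best; it is shown here purely to make the comparison concrete.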

To understand the context of the research conducted, some key questions need to be answered:
- What is Customer Satisfaction, and why does it seem to be synonymous with customers' perceptions of Quality?
- What is Best Practice, in particular ICT Service Management Best Practice?
- Why is Customer Satisfaction important?
- What is the relationship between theory and practice in the area of Service Management Best Practice, Quality Management and business performance?
- What type of environment ensures the best results for the implementation of Best Practice?

Note:
1. Although some authors treat Information Technology (or the newer term that has replaced it, Information and Communication Technology) and Information Systems differently, the researcher will use the term ICT to cover both.
2. Users and Customers are not strictly the same; Customer Satisfaction will be seen as one measure, as long as this does not affect the context discussed.

3. This dissertation focuses on the daily activities of the organisation, although daily
involvement in these activities by the researcher is not a pre-requisite.

2.1 Customer Satisfaction and Quality


2.1.1 Customer Satisfaction Overview
In many organisations service quality is arbitrary; this results in service quality being judged purely subjectively, based on short-term criteria, which explains why customers can be satisfied with a service one day and very dissatisfied with it the next.
It is clear that the issues of quality and perception are closely linked. Wood et al (2000) (Figure-1) explain that customers usually perceive service levels to be worse than they actually are and, as service levels improve, this perception lags behind (the credibility gap). Service levels, however, cannot always continue to improve; at some stage they will reach a plateau where customers will, for a short period, still perceive the service to be improving (over-expectation) and will eventually become dissatisfied because no further improvement occurs, resulting once again in the perception that service levels are worse than they actually are. The only way to close the credibility gap is to develop a trust relationship with the customer. Customer satisfaction = Expectations - Perceptions.

[Figure: "Managing Expectations" - a chart of Quality of Service (0-100) against Time, comparing actual and perceived service levels and showing the over-expectation region.]

Figure-1. Managing Customer Expectations



Communicating what can be expected and delivering what was promised will thus result in satisfied customers (Wood et al, 2000).
Generally accepted definitions of Quality include (Sacks, 2002, p3):
- "Quality means a predictable degree of uniformity and dependability at low cost, with a quality suited to the market." (Deming, 1986)
- "The extent of discrepancy between customers' expectations or desires and their perceptions of the service." (Zeithaml et al, 1988)
- "The totality of features and characteristics of a product or service that bear on the ability to satisfy stated or implied needs." (ISO9001, 2000)
- "A comprehensive customer-focused system that will improve the quality of products and services. It is a way of managing the organisation at all levels, from top management to front-line, to achieve Customer Satisfaction by involving all employees and continuously improving the work processes of the organisation." (The Federal Quality Institute, quoted in Lewis, 1991)

It is interesting to note that Zeithaml's definition of Quality and Wood's definition of Customer Satisfaction are virtually identical. In fact, during the research period the researcher found that Customer Satisfaction and Service Quality are in many instances synonymous. This observation is substantiated by Dale (1999), who observed that satisfied customers are a product of a quality service. Van Iwaarden (2002) came to the same conclusion in his recent study on user perceptions of web-site quality, where this link was aptly drawn (especially within an ICT setting). The issue of quality thus features very prominently when considering Customer Satisfaction.

2.1.2 Quality Overview


Quality, like customer satisfaction, is a subjective measure, and expectations need to be clearly defined. Total Quality Management (TQM) is defined in ISO9000:2000 (ISO, 2000 in Van Iwaarden, 2002) as "a management approach that tries to achieve and sustain long-term organisational success by encouraging employee feedback and participation, satisfying customer needs and expectations, respecting social values and beliefs and obeying governmental statutes
and regulations"; thus, by implication, if an organisation is not sensitive to customer needs and expectations, its customers will be dissatisfied.
Cox and Dale (2001) state that quality is the key element in business achievement and that without attention to quality the organisation will fail to deliver the appropriate service levels, resulting in dissatisfied customers.
The question now remains: what constitutes bad quality? In essence, "bad" quality (customer dissatisfaction) is when what is expected does not match what is delivered (see Wood's expectation gap in Figure-1). Whether a customer is satisfied or not depends partly on the customer's expectations and partly on previous experiences¹ with the organisation (Zeithaml et al, 1990; Wood et al, 2000). Customers assess quality by comparing the service delivered to the service expected (Berry and Parasuraman, 1991). When the organisation delivers what is expected, or exceeds that expectation, customers are satisfied.
The American Customer Satisfaction Index Model aptly summarises the relationship between Expectations, Quality, Perceived Value and Customer Satisfaction (Figure-2).

[Figure: diagram in which Customer Expectation and Perceived Quality feed into Perceived Value, which in turn drives Customer Satisfaction. Based on the American Customer Satisfaction Index Model.]

Figure-2. Adapted view of American Consumer Satisfaction Index Model

¹ It is interesting to note that "over-delivery" on what was promised to the customer, in terms of the quality parameters defined in the Service Level Agreement (SLA) with the customer, frequently leads to dissatisfaction if these "value adds" are no longer delivered. The customer came to expect the "added value" as part of the quality parameters, even though it was never intended to be part of the quality parameters defined when the SLAs were drawn up or subsequently. This statement is based on observation and I have no academic references to substantiate it other than Wood's and Zeithaml's comments and reading between the lines.

Irrespective of which definition is preferred (as defined in 2.1.2), quality cannot be achieved without consideration of, and a focus on, the customer. It is perhaps for this reason that quality management has become widely used and recognised (Wilson, Pitman and Trahn, 2000).
Kaplan and Norton (1996, p.87) made the following comment on quality: "Quality was a critical competitive dimension in the 1980s and remains important to this day. By the mid 1990s, however, quality has shifted from a strategic advantage to a competitive necessity. It has become a hygiene factor; customers take for granted that their suppliers will execute according to product and service specifications. Nevertheless excellent quality may still offer opportunities for companies to distinguish themselves from their competitors. In this case, customer-perceived quality measures would be highly appropriate to include in the Balanced Scorecard perspective." (emphasis added)
Although the principles of Total Quality Management as espoused by Deming (1986) and others had been enthusiastically adopted and built into a number of programs, the Quality Audit period saw the formalisation of "quality" into documentation and terminology, and during this period the development of performance indicators was consequently a high priority (Wilson et al, 2000). This statement of Wilson and those of Kaplan and Norton (1996) raise further questions: (1) how does Quality relate to performance indicators, particularly the Balanced Scorecard, and (2) how does quality (Customer Satisfaction) relate to the performance of the business as a whole?

2.1.3 Assessing Customer Satisfaction
Comparisons of customer expectations and their perceptions of actual performance can be made using the SERVQUAL scale of Zeithaml et al (1990). This scale was developed for the service sector and thus fits the environmental context of this dissertation well. The scale has five generic dimensions (quality factors), adapted from Zeithaml et al (1990) and listed below; a brief scoring sketch follows the list.
1. Tangibles: the appearance of physical facilities, equipment, personnel and communication materials.
2. Reliability: the ability to perform the proposed service dependably and accurately.
3. Responsiveness: the willingness to help customers and provide prompt service.
4. Assurance: the knowledge and courtesy of employees and their ability to convey trust and confidence (including competence, courtesy, credibility and security).
5. Empathy: the provision of caring and individualised attention to customers (including access, communication and understanding the customer).
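To make the use of these dimensions concrete, the sketch below shows one way survey responses could be grouped by SERVQUAL dimension and an expectation-versus-perception gap computed per dimension. The question-to-dimension mapping and the ratings are hypothetical (the dissertation's own mapping is given in Figure-25), and the gap here is calculated as perception minus expectation, the common SERVQUAL convention, with negative values indicating dissatisfaction.

```python
# Hypothetical SERVQUAL-style gap scoring; the mapping and ratings are illustrative only.
from statistics import mean

# Paired (expectation, perception) ratings per dimension, e.g. on a 1-5 Likert scale.
responses = {
    "reliability":    [(5, 4), (5, 3)],
    "responsiveness": [(4, 4), (5, 4)],
    "assurance":      [(4, 3)],
    "empathy":        [(4, 4)],
    "tangibles":      [(3, 3)],
}

for dimension, pairs in responses.items():
    expectation = mean(e for e, _ in pairs)
    perception = mean(p for _, p in pairs)
    gap = perception - expectation  # negative: perceptions fall short of expectations
    print(f"{dimension:<14} E={expectation:.1f}  P={perception:.1f}  gap={gap:+.1f}")
```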
Although there is some criticism on the long term results of the SERVQUAL scale (Lam & Woo,
1997) and the general applicability of the five dimensions (Crosby & LeMay, 1998),
SERVQUAL is a widely used instrument in the business and academic sector alike (Van
Iwaarden, 2002).
Zeithaml et al. (1990) found that customers judge service quality by the same set of criteria,
regardless of the type of service evaluated (Llosa et al, 1998).
The most important factors are reliability, then responsiveness, then assurance followed by
empathy and the least important factor is tangibles (Zeithaml et al, 1990), proving how fickle the
issue of Customer Satisfaction is.

2.2 ICT Service Management Best Practice


"Constantly high service quality can only be obtained when it is backed up by good internal organisation, systems, processes and procedures" (Kerklaan in Mastenbroek, 1991, p.51).

2.2.1 Best Practice
Best Practice and quality are also often used synonymously and, although there are similarities, Best Practice has engendered its own definitions. The Australian Best Practice Demonstration Program defines Best Practice as:
"the pursuit of world class performance. It is the way in which the most successful organisations manage and organise their operations. It is a moving target. As the leading organisations continue to improve, the best practice goalposts are constantly moving. The concept of continuous improvement is integral to the achievement of best practice." (ABPDP, 1994, p.3)

Page 19 of 152

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

N
y

2.2.2 Service Management


Service Management is the ability to conceive and design service packages and service delivery systems that fulfil client needs, and to manage the growth and daily activities of the service organisation as effectively and efficiently as possible; it is the ability to execute service marketing and service operations activities simultaneously whilst providing the service. The general objectives of Service Management are the maximisation of profit, customer service and quality, making use of the limited resources available to the organisation (Collier, 1987). Service Management should thus deliver a quality service that satisfies the customer's needs or expectations, within the organisation's financial means.
Service Management relies on the existence of a Service Culture in the organisation. A Service Culture is an organisational culture which emphasises satisfying customer requirements (Fry, 1989), and the existence of a service culture is seen as a prerequisite for delivering a quality service (Mastenbroek, 1991).

2.2.3 ICT Service Management Best Practice
The dilemma of the researcher is that very little material, especially academic material, exists on the topic of ICT Service Management Best Practice. The ITIL framework seems to be the de facto standard for ICT Service Management Best Practice, as is evident from the sheer number of ICT vendors and users that have adopted the framework. ITIL forms the basis of the de jure standard for ICT Service Management Best Practice, BS15000² (http://www.itsm.org.uk). Section 2.2.4 outlines the ITIL framework as the basis of ICT Service Management Best Practice, although not all authors agree that ITIL encompasses all aspects of ICT Service Management Best Practice.
Thiadens (2002) comments that three phases are visible in its development: the 1980s, Structuring of Organisations (ITIL); the 1990s, Provision of Functions; and 2000 and beyond, Extension and Growth. Thiadens comments that the accent in the field of Service Management moved from organising internal services to directing services towards improved performance.

² The BS15000 framework was recently accepted by the International Standards Organization as an international standard for ICT Service Management, now called ISO/IEC20000-1:2005 and 20000-2:2005; see http://www.iso.ch/iso/en/CombinedQueryResult.CombinedQueryResult?queryString=20000


Although the researcher agrees with the content of Thiadens' statement, he does not agree that it is a time-bound phenomenon; it is rather a sequential progression based on organisational maturity. One first has to ensure that a firm foundation exists (structuring of the organisation) before the organisation can focus on extension and growth, and ITIL does this.
2.2.4 ITIL Background and Concepts
The Information Technology Infrastructure Library (ITIL) is a set of guidelines developed by the UK's Office of Government Commerce (OGC). The framework, documented in a set of books, describes an integrated, process-based, Best Practice framework for managing ICT services. To date, ITIL is the only comprehensive, non-proprietary, publicly available guideline for ICT Service Management (Pink Elephant, 2002).
The Office of Government Commerce did not write the library; it was rather a collaborative effort of many leading organisations, including users and vendors. Editorial boards consisting of industry experts determined the scope of the library; books were written by a number of organisations and quality assured by a number of others. The Office of Government Commerce performed an editorial function, examined the processes presented and ensured that processes matched any requirements of ISO9001 (note the emphasis on quality).
ITIL was conceived in the late 1980s and, although its original intent was to serve as a platform to improve ICT Service Management in UK central government, it is relevant to all organisations: public or private sector, large or small, centralised or distributed (Pink Elephant, 2002). The ITIL library underwent a major revision in 2000³ to ensure that the framework remains current and applicable to the present environment.
Thus ITIL is:
1. Non-proprietary, because the end results are not based on a single person's or organisation's view of a particular process.
2. Best Practice, because the library represents the experience of many ICT professionals and industry participants.
3. Written to de jure quality standards (ISO9001:2000 and BS15000 aware⁴).
4. Public Domain, as the copyright for ITIL is held by the British Crown.

³ The third major revision of ITIL is currently in process; see http://www.itsmf.com/itil3refresh.asp
⁴ Now also ISO/IEC20000-1:2005 and 20000-2:2005.


An ICT service is provided by a service provider and comprises the infrastructure that enables the service provided. "Infrastructure" is used to collectively describe hardware, software, processes, procedures, work methods and instructions, communication equipment, documentation and the required support skills. These components need to be collectively managed, hence the term ICT Service Management.
The first ITIL books were compiled by 1995 and, although the library consists of 39 titles, 10 are most widely used. These 10 books were combined into two volumes during the 2000 revision of ITIL, namely Service Support and Service Delivery (Figure-3). Other sets of books are concerned with Operations, Security and Application Management and Implementation Guidelines (see Appendix-G for more detail on Service Support and Service Delivery)⁵.

⁵ This study focused primarily on the implementation of the Service Support processes and one Service Delivery process, namely Service Level Management. ITIL is frequently equated with the IT Service Management processes (Service Delivery and Service Support); this view is NOT CORRECT. We would be doing the library a disservice to view the ITSM processes as an all-encompassing management framework; these processes encompass the management of services in an operational environment (mostly). It is of vital importance to realise that ITIL is more than Service Delivery and Service Support, and it is advised that one has a thorough understanding of the complete library and the additional processes defined in the other books. The OGC and the itSMF are partly to blame for this state of affairs, as training and certification programs focused (until recently) entirely on the two Service Management books and not the others.


Figure-3: Information Technology Infrastructure Library (ITIL).
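For orientation, the sketch below summarises the two Service Management volumes named above as a simple mapping from volume to disciplines. The grouping reflects the commonly cited ITIL (version 2) Service Support and Service Delivery disciplines, which are covered in Appendix-G; it is a summary aid for the reader, not part of the library itself.

```python
# Summary sketch: the ITIL v2 Service Management volumes and their disciplines.
itil_service_management = {
    "Service Support": [
        "Service Desk (function)",
        "Incident Management",
        "Problem Management",
        "Configuration Management",
        "Change Management",
        "Release Management",
    ],
    "Service Delivery": [
        "Service Level Management",
        "Financial Management for IT Services",
        "Capacity Management",
        "Availability Management",
        "IT Service Continuity Management",
    ],
}

for volume, disciplines in itil_service_management.items():
    print(volume)
    for discipline in disciplines:
        print(f"  - {discipline}")
```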

Each of the disciplines outlined in the Service Support and Service Delivery books addresses the effective management of, and the services provided by, the ICT enterprise. The disciplines also address important issues to ensure successful implementation:
1. People: the quantity and quality of experience and knowledge.
2. Process: ICT- and organisation-specific practices, procedures and guidelines, and the level of complexity or sophistication needed.
3. Technology: the total picture of technology and data in the organisation, physically and logically.

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

c u-tr a c k

N
y

4. Organisation: internal and external business factors that may affect ICT, and how ICT and the organisation interface (including issues of culture, direction, strategy etc.).
5. Integration: how ICT integrates with the business model, what services are needed or provided, and how ICT can contribute to achieving business objectives.
2.2.5 Quality Best Practice
Garvin (1988) categorises the various definitions of quality into five approaches; the approach adopted by this researcher is the product-based approach, which views quality as a measurable set of characteristics that are required to satisfy the customer. In short, quality is consistent conformance to customer expectations. Slack (2001, p.556) puts it this way: "Quality needs to be understood from a customer's point of view, because to the customer, the quality of a product or service is whatever he or she perceives it to be."
If the product or service experience was better than expected, the customer is satisfied and quality is perceived to be high; if it was less than expected, the customer is dissatisfied and quality is perceived to be low; and if it met expectations, quality is seen as acceptable. Improving quality is thus a function of addressing gaps in performance.
In some situations, customers may be unable to judge the technical/operational specification of the service or product. They may then use "surrogate measures" as a basis for their perception of quality (Haywood-Farmer et al, 1991). This situation is not ideal, as such surrogate measures do not necessarily relate to the service actually delivered.
The aim is to ensure that expectations are clearly defined by setting measurable deliverables and ensuring these are met. This can be achieved by (adapted from Haywood-Farmer et al, 1991; a brief sketch of the control step follows the list):
- Defining the quality characteristics of the product or service.
- Deciding how to measure each quality characteristic.
- Setting pre-defined standards for each quality characteristic.
- Controlling quality against the set standards.
- Finding the causes of poor quality and taking corrective action.
- Continually making improvements.
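As a simple illustration of the measure-standard-control steps above, the sketch below compares hypothetical measured values for a few service quality characteristics against pre-defined standards and flags those needing corrective action. The characteristic names, targets and measurements are assumptions for illustration, not values from the research site.

```python
# Illustrative control step: compare measured quality characteristics with pre-defined standards.
# All characteristic names and numbers below are hypothetical.

standards = {
    # characteristic: (target, higher_is_better)
    "first_call_resolution_pct":  (80.0, True),
    "mean_time_to_restore_hrs":   (4.0, False),
    "customer_satisfaction_1to5": (4.0, True),
}

measured = {
    "first_call_resolution_pct":  72.5,
    "mean_time_to_restore_hrs":   5.2,
    "customer_satisfaction_1to5": 4.1,
}

for name, (target, higher_is_better) in standards.items():
    value = measured[name]
    meets = value >= target if higher_is_better else value <= target
    status = "meets standard" if meets else "corrective action needed"
    print(f"{name}: measured {value} vs target {target} -> {status}")
```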


Quality characteristics need to be matched to the product or service but generally fall into the following categories (adapted from Slack, 2001):
- Functionality: how well the product or service does its job.
- Appearance: the aesthetic elements of the product or service (yes, services can contain aesthetic appeal).
- Reliability: mean time between failures, mean time between system incidents, and consistency and performance over a period of time.
- Durability: the useful life of the product or service; does the product or service keep up with trends and advancements in technology?
- Recovery: ease of repair and restoration of service.
- Contact: inter-personal contact between service staff and the customer, influenced by the level of skill, product/service knowledge, and people and conflict-resolution skills.
- Cost: seen against the value added for the customer; cost vs. benefit.

Once quality standards are identified, operational staff need to check the quality of delivery, and quality standards need to be revised frequently as customer needs change over time.
Effective quality improvement needs to go beyond what is described above; the quality philosophy needs to span the whole of the organisation. Total Quality Management (TQM) can be viewed as a logical progression of quality-related practices, as seen in Figure-4 (adapted from Slack, 2001).

Figure-4: Quality Spanning the whole Organisation


TQM is a well-thought-through philosophy of how to approach quality management; it is a way in which organisations need to think and work, and it lays particular emphasis on (adapted from Slack, 2001):
- Meeting the needs and expectations of customers.
- Spanning the whole organisation and involving all employees.
- Evaluating all costs relating to quality, especially the cost of failure to deliver against set standards.
- Designing products and services to include a quality perspective.
- Ensuring systems and procedures support quality improvement.
- Treating quality improvement as a continuous activity.

TQM is sometimes referred to as "quality at source". This notion stresses the impact that every staff member has on quality: every person is responsible for quality. For an organisation to be truly effective, each department, each activity, each person and each level must work together, because every person and activity affects, and in turn is affected by, others (Muhlemann et al, 1992).
The quality system should cover all facets of an organisation's operations, from identifying and meeting the needs and requirements of customers, through design, planning, purchasing, manufacturing, packaging, storage, delivery and service, together with all relevant activities carried out within these functions. It deals with the organisation, responsibilities, procedures and processes. Simply put, a quality system is "Management Best Practice" (Dale, 1999).
The documentation used in a quality system can be defined at four levels (Slack, 2001); the last three are of particular importance to Service Management Best Practice:
Level 1 - Company Quality Manual: the fundamental document, providing the quality management policy and quality system.
Level 2 - Procedures Manual: describes the system's functions, structure and the responsibilities of each department, function or process owner.
Level 3 - Work Instructions: specifications and detailed methods for performing work activities.


Level 4 - Database: contains the above plus all other reference documents such as forms, standards, diagrams, reference material etc.
ISO 9000 is a set of worldwide standards that establish requirements for organisations' quality management systems. ISO 9001:2000 deals with a quality model for quality assurance in design, development, production, installation and servicing. The purpose of the ISO standards is to provide assurance to customers that services and products are delivered or produced in such a way that they meet customers' defined requirements. It is a well-defined Quality Management System that applies to all types of organisations and provides a platform for Total Quality Management⁶.

2.2.6 Managing ICT Organisations Measurement and Control

The underlying logic of the research done by Vos et al (1997) in the International Service Study is that the adoption of Best Practice has a direct link to the attainment of high service performance, which in turn leads to superior business performance (Markland, 1999). The study's research model draws on established models of practice and performance in service organisations. This includes the Service Profit Chain (Heskett et al, 1994), which points out that there are strong links between Customer Satisfaction and loyalty and business performance, productivity and service value. The study also has a substantial focus on Quality, thus assuring the researcher that the concepts discussed above are of relevance for this study.
The researcher consequently sought a Best Practice framework that encompasses Quality, Customer Satisfaction and a solid framework for Management Best Practice, which led to the Balanced Scorecard. The Balanced Scorecard extends the key concepts of TQM, including customer-defined Quality, continuous improvement, employee empowerment and, primarily, a measurement-based management feedback system that is strongly aligned to the organisation's strategy (The Balanced Scorecard Institute, 2003).

⁶ ISO 9001:2000 certification does not imply that Total Quality Management is entrenched in the organisation; it merely states that a quality system is in place, is maintained and that standards are adhered to. TQM, on the other hand, is more of a transformational process that changes the "heart" of the organisation; it entrenches the quality philosophy in everything the organisation and its people do.

The Balanced Scorecard provides managers with the instruments needed to navigate towards future competitive success. Organisations are competing in complex environments where an accurate understanding of goals, and of the methods for attaining these goals, is vital. The Balanced Scorecard translates the organisation's mission and strategy into a comprehensive set of performance measures that provide a framework for strategic measurement and management.

Figure-5: The Balanced Scorecard (Kaplan and Norton, 1996). [The figure places Vision & Strategy at the centre of four perspectives, each with its own KPIs and measures: Financial (generic measures include ROI, ROCE and EVA), Customer (customer satisfaction and retention, market and account share), Internal Process (quality, response time and product development) and Learn & Innovate (employee satisfaction, systems and knowledge-base availability).]


Although the Balanced Scorecard retains an emphasis on achieving financial objectives, it also
includes performance drivers for non-financial objectives. The Scorecard measures organisational
performance across four balanced perspectives: financial, customer, internal business process and
learning, growth and innovation (Figure-5). The Balanced Scorecard enables companies to track
financial results while simultaneously monitoring progress in building capabilities and acquiring
the intangible assets they need for future growth (Kaplan and Norton, 1996).
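To make the four-perspective structure concrete, the following minimal Python sketch represents a scorecard as perspectives holding measures with targets. It is illustrative only: the perspective names and generic measures follow Figure-5, while the specific targets, actual values and the simple "on track" rule are hypothetical.

from dataclasses import dataclass

@dataclass
class Measure:
    name: str
    target: float   # hypothetical target value
    actual: float   # hypothetical measured value

    def on_track(self) -> bool:
        # Simplistic rule for illustration: actual meets or exceeds target.
        return self.actual >= self.target

@dataclass
class Perspective:
    name: str
    measures: list

# The four perspectives and example generic measures follow Figure-5;
# the target and actual figures below are invented for illustration.
scorecard = [
    Perspective("Financial", [Measure("ROCE (%)", 12.0, 13.1)]),
    Perspective("Customer", [Measure("Customer retention (%)", 90.0, 87.5)]),
    Perspective("Internal Process", [Measure("On-time delivery (%)", 95.0, 96.2)]),
    Perspective("Learn & Innovate", [Measure("Employee satisfaction (out of 5)", 4.0, 3.8)]),
]

for perspective in scorecard:
    for m in perspective.measures:
        status = "on track" if m.on_track() else "needs attention"
        print(f"{perspective.name:17} {m.name:32} {status}")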
Kaplan and Norton (1996) state that the Information Age environment, for both manufacturing and service organisations, requires new capabilities for competitive success, and that the ability of a company to mobilise and exploit its intangible and invisible assets has become far more decisive than investing in and managing physical, tangible assets. Intangible assets enable an organisation to:
Develop customer relationships, retaining the loyalty of existing customer segments and enabling market areas to be served effectively and efficiently.
Introduce and produce innovative, high-quality products and services desired by targeted customers, at low cost and with short lead times.

Mobilise employee skills and motivation for continuous improvements in process capabilities, quality and response times, and
Deploy information technology, databases and systems.

Financial measures largely look at past events. These measures are, however, inadequate for organisations that need to survive in today's information age. Companies must create future value through investment in customers, suppliers, processes, technology and innovation, according to Kaplan and Norton (1996). The Balanced Scorecard is a tool to enable organisations to achieve their long-term strategies.

Figure-6: Balanced Scorecard cause-and-effect relationships (Kaplan and Norton). [The figure shows a causal chain running through the four perspectives: Employee Skills (Learning & Growth) drive Process Quality and Process Cycle Time (Internal Business Process), which drive On-time Delivery and Customer Loyalty (Customer), which in turn drive ROCE (Financial).]


Any organisational strategy is a set of hypotheses about cause and effect. Kaplan and Norton (1996) state that the measurement system (Balanced Scorecard) should make the relationships among objectives in the various perspectives explicit so that they can be managed and validated. The chain of cause and effect should pervade all four perspectives of the Balanced Scorecard (Figure-6). 'Effective financial management must address risk as well as return. Objectives relating to growth, profitability, and cash flow emphasise improving returns from investment. But businesses should balance expected returns with management and controlling risk. Thus, businesses include an objective in their financial perspective that addresses the risk dimension of their strategy' (Kaplan and Norton, 1996, p.50). This comment leads to the discussion of how the Balanced Scorecard fits into the ICT organisation.
2.2.7 Control Objectives for Information and Related Technologies (CobiT™)


Control Objectives are defined as 'the Policies, Procedures, Practices and Organisational Structures, Designed to Provide Reasonable Assurance that Business Objectives will be Achieved and that Undesired Events will be Prevented or Detected and Corrected' (CobiT™ Management Guidelines, 2000, p.3). In essence, it is a management best practice that aims to ensure that effort leads to the achievement of business objectives.
ICT Governance is an integral part of the success of corporate governance, assuring efficient, effective and measurable improvements. Governance provides the structure that links processes, resources and information to organisational strategies and objectives. It further integrates and institutionalises the Best Practices of planning and organising, acquiring and implementing, delivering and supporting, and monitoring ICT service provision performance, ensuring that the organisation's information technology supports its business objectives - thus enabling the organisation to take full advantage of its information, maximise the benefits of this information and capitalise on opportunities.
Control Objectives for Information and related Technologies (CobiT™), now in its 3rd edition, 'helps meet the multiple needs by bridging the gaps between business risks, control needs and technical issues. It provides Best Practices across domains and process frameworks and presents activities in a manageable and logical function' (IT Governance Institute, 2000, p.2).
CobiT™ was sponsored by the Information Systems Audit and Control Foundation and provides a comprehensive checklist for business process owners. From the business perspective, CobiT™ recognises 34 high-level control objectives for ICT processes. These processes are grouped into four domains: planning and organisation, acquisition and implementation, delivery and support, and monitoring. By addressing these 34 high-level control objectives, the business process owner can be assured that there is an adequate control system for the ICT environment. Of particular relevance to the planning and implementation of Service Management Best Practice are the control objectives associated with delivery and support (OGC(2), 2002), including:

Defined Service levels

Management of performance and capacity

Ensuring continuous service

Ensuring systems security

Identifying and attributing costs

Assisting and advising IT Customers

Managing configurations

Management of problems and incidents

Managing facilities and operations

The guidelines set out in CobiT™ are action-oriented and generic, and provide management direction for getting the organisation's information-related processes under control, monitoring the achievement of organisational goals and monitoring performance within each ICT process.
CobiT™ makes use of the Balanced Scorecard and three key indicators, namely:

Critical Success Factors, which define the most important management-oriented implementation guidelines to achieve control over and within the IT processes,
Key Goal Indicators, which define measures that tell management whether an IT process has achieved its business requirements, and
Key Performance Indicators, which are lead indicators that define measures of how well the IT process is performing in enabling the goal to be reached.

ICT Governance and its processes ensure the delivery of information to the organisation that addresses the required Information Criteria (effectiveness, efficiency, integrity etc.) and measures Key Goal Indicators (the organisational Balanced Scorecard). This is enabled by creating and maintaining a system of processes and controls appropriate for the organisation that directs and monitors the business 'value' of technology; it considers Critical Success Factors (Best Practice) that leverage Resources (people, facilities, applications etc.) and is measured by Key Performance Indicators (the process Scorecard). Appendix-H provides more detail on the KGIs and KPIs. CobiT™/Balanced Scorecard thus aims to ensure that the factors contributing to organisational success are measured, supporting organisational, functional and personal objectives.

Figure-7: Balanced Scorecard, Measured Outcomes and Performance (source: CobiT Management Guidelines). [The figure relates business alignment (CSFs), process goals (KGIs) and measured performance (KPIs) to goals, enablers and measured outcomes, linking the Balanced Business Scorecard to the operational (ICT) scorecard.]


Figure-7 relates different performance measures to the context in which they are used. Critical Success Factors (CSFs) are used to measure the effectiveness of business alignment: key issues that must 'go right' for the organisation to flourish (Robson, 1997); Key Goal Indicators (KGIs) represent functional goals (activities to be performed and measured after the fact); and Key Performance Indicators (KPIs) are indicators which measure how well enabling activities are done. For the Balanced Scorecard to be successful as a control mechanism, performance measures need to cascade through the organisation (Figure-8).

Figure-8: Cascading Balanced Scorecard. [The figure shows strategic goals and a BSC at the management/executive (organisation) level, tactical goals and a BSC at the manager/team-leader (work-stream/function) level and operational goals and a BSC at the practitioner (function) level, each level linked to the next by enablers (measured performance).]
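To illustrate the layering of measures described above, the sketch below treats KPIs as measures of enabling activities and a KGI as achieved only when all of its enabling KPIs meet their targets. This is a simplified, hypothetical illustration: the process name, KPI names, targets and values are invented, and the real CobiT guidance defines these relationships in far more detail.

# Minimal sketch of the CSF/KGI/KPI relationship: KPIs measure how well
# enabling activities are performed, and a KGI records whether a process
# goal has been reached. All names, thresholds and values are hypothetical.

# (kpi name, target, actual, higher_is_better)
incident_management_kpis = [
    ("Incidents resolved within SLA (%)", 85.0, 88.0, True),
    ("Average resolution time (hours)",   12.0,  9.5, False),
]

def kpi_met(target, actual, higher_is_better):
    return actual >= target if higher_is_better else actual <= target

# The KGI is treated as achieved only when every enabling KPI meets its
# target - a simplified stand-in for the scorecard's real roll-up rules.
kgi_achieved = all(kpi_met(t, a, hib) for _, t, a, hib in incident_management_kpis)

for name, target, actual, hib in incident_management_kpis:
    print(f"KPI {name}: target {target}, actual {actual}, met={kpi_met(target, actual, hib)}")
print("KGI 'incidents resolved according to agreed service levels' achieved:", kgi_achieved)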

A word of warning: if the management scheme (Balanced Scorecard) is perceived as a control mechanism over employees, it may well lower levels of trust between management and employees. It may then not achieve the desired results and produce less than optimal, or even lower, performance (Handy, 1999). Appendix-H outlines some key issues to avoid when implementing the Balanced Scorecard.
2.3 Research Framework
This dissertation is about evaluating the components that contribute to Service Quality. Although specific components are focused on - Customer Satisfaction as a consequence of Service Quality, and Best Practice as a mechanism to improve Service Quality - the researcher needed to devise a framework to guide research activities, determine whether a correlation between data from these dimensions exists, and do so within an academically sound model.

Figure-9: The Gaps Model for Service Quality.

For guidance the researcher turned to the Gaps Model of Service Quality (Niessink, 2001). Niessink's model corresponds to a number of other models used in the fields of marketing (Doyle, 1998) and operations management (Slack et al, 2001). Niessink adapted existing models to deliver service quality in an Information Technology environment (Figure-9, also see Appendix-L) and drew heavily on the work of Parasuraman et al. (1985), thus assuring this
researcher of the appropriateness of using this model as a basis, especially in the light of the discussions in Section-2.1.2, 3 and 4 above.
The Gap Model identifies five gaps and was adapted to focus on the objectives of this dissertation. Some components were combined (External Communication and Service Design), as both fall within the scope of the chosen ICT Service Management Best Practice framework (ITIL). Perceptions of Expected Service was removed, as it also forms part of Service Level Management as defined in ITIL. Perceived Service remains, as this relates to Customer Satisfaction, and Service Delivery was also retained. Service Delivery measurements were very difficult, for the following reasons:
A measure had to be used that was in place from the inception of the research project.
The purpose of following Best Practice is to ensure that measurements are put in place - mainly because these normally do not exist to compare the organisation against.
Only two service delivery measurements existed at the inception of the project; although they may not necessarily reflect total service delivery, they can be used as indicative measures. These measures are Incident Resolution Time and calls logged per user, per period.

The Research Framework, as outlined in Figure-10, was used as the basis for the research conducted. Customer perceptions were measured using customer surveys based on SERVQUAL - this relates to the customers' view of Quality of Service (Comparison-A). The ICT Service Management Process Maturity and Best Practice Conformance Assessments were used to measure Service Design, Standards, Process Maturity, Conformance to Best Practice and the effectiveness of External Communications (Comparison-B). Both Comparisons A and B compared and analysed the results of three assessments.
Call Statistics relating to time to repair were obtained from the Service Desk tool; together with information on the number of users and technical support staff, these formed the basis for measuring the actual improvement of Service Delivery (Comparison-F).

Figure-10: Research Framework


The results of each of the above analyses were compared with the others to establish whether there was a correlation between results (A to B = C, A to F = D and B to F = E).
Results were then analysed in conjunction with interviews and observations by this researcher regarding the behaviour of service staff, the environment, customers and observed symptoms/phenomena.
In Comparisons A, B and F, a baseline was established at the inception of the research project against which future results and observations were compared, gauging improvement (or not). To ascertain whether Best Practice does or does not contribute to increased Customer Satisfaction, the researcher attempted to find a correlation between results, thus addressing the 'gaps' that exist over time and assessing the effect of the adoption and implementation of ITIL. It should be noted that the researcher was more interested in how the measurements change over time and how they relate to each other - closing the gaps was of secondary concern.
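As an indication of how Comparisons C, D and E were approached, the sketch below computes a simple Pearson correlation between two series of measurements taken at the three assessment points. The average maturity scores are derived from the data later shown in Figure-17; the satisfaction scores are hypothetical placeholders, and with only three observation points any coefficient is indicative rather than statistically conclusive.

import math

def pearson(xs, ys):
    # Plain Pearson correlation coefficient between two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Average OGC maturity score across the seven processes at each assessment
# (derived from Figure-17), paired with hypothetical average survey scores
# (lower = better on the survey scale).
avg_maturity     = [0.14, 0.86, 1.66]
avg_satisfaction = [3.4, 2.9, 2.6]

# A negative coefficient would suggest that rising maturity goes hand in
# hand with falling (i.e. improving) satisfaction scores.
print(round(pearson(avg_maturity, avg_satisfaction), 3))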

2.4 Summary of Literature Reviewed.


At the outset of the Literature Review, key questions were posed; the literature reviewed was chosen to assist the researcher in shedding some light on these questions. The resulting review of literature posed additional questions. These, together with the original questions posed, form the basis of this summary of the reviewed literature.
Q: What is Customer Satisfaction and why is Customer Satisfaction synonymous with customers' perception of Quality?


A: Customer Satisfaction is a highly subjective measure - in essence, a customer is satisfied when what is delivered is better than or the same as what the customer expects. Customer Satisfaction is synonymous with a customer's view of Quality (Dale, 1999; Wood et al, 2000; van Iwaarden, 2002; Zeithaml et al, 1988).
Whether a customer is satisfied or not depends partly on the customer's expectations and partly on previous experiences with the organisation (Zeithaml et al, 1990; Slack, 2001; Wood et al, 2000). Customers assess Quality by comparing the service delivered to the service expected (Berry & Parasuraman, 1991; ACSI, 2002). When the organisation thus delivers what is expected, or exceeds that expectation, the customer is satisfied.
The American Customer Satisfaction Index Model aptly summarised the relationship between Customer Expectations, Quality, Perceived Value and Customer Satisfaction, stating that the first three factors all lead to Customer Satisfaction. Quality is thus of vital importance to ensure Customer Satisfaction and will play a major role in this dissertation.
Q: What is Best Practice - in particular, what is ICT Service Management Best Practice?
A: Best Practice is defined as: 'the pursuit of world class performance. It is the way in which the most successful organisations manage and organise their operations. It is a moving target. As the leading organisations continue to improve, the "best practice" goalposts are constantly moving. The concept of continuous improvement is integral to the achievement of Best Practice.' (ABPDP, 1994, p.3)
Very little information exists on the topic of ICT Service Management Best Practice. The ITIL framework seems to be the de facto standard for ICT Service Management Best
Practice, as is evident from the sheer number of vendors and users that have adopted the framework. ITIL supports the ISO quality approach.
It should, however, be noted that ITIL is a Best Practice framework focusing on processes; other Best Practice frameworks are thus necessary to ensure ITIL is properly implemented. The two most obvious are ISO 9001:2000⁷ and CobiT™⁸, which utilises the Balanced Scorecard.
Q: Why is Customer Satisfaction/Quality of importance to the organisation and the
Information and Communications Technology provider?
A: Cox and Dale (2001) state that Quality is the key element in business achievement and that, without attention to Quality, the organisation will fail to deliver the appropriate service levels, resulting in dissatisfied customers. Wilson (2000) postulates that formal quality management programmes provide concrete evidence of an organisation's commitment to client service, continuous improvement and the provision of quality service, thus satisfying customers.
Q: What is the relationship between theory and practice in the area of ICT Service
Management Best Practice, Quality Management, performance indicators and the use
of competency standards?
A: 'Constantly high service quality can only be obtained when it is backed up by good internal organisation, systems, processes and procedures' (Kerklaan in Mastenbroek, 1991, p.51). It is thus clear that a framework is needed that encapsulates best practices, consolidating them into one executable programme. Figure-11 offers such a framework; it includes the concepts necessary to ensure that the daily activities of all staff contribute to the strategic objectives of the organisation in a manageable fashion.

⁷ A quality system and process to manage processes.
⁸ Well-defined KPIs to use as success indicators for the effective implementation of the selected ITIL processes.

Figure-11: The Internal Business Process Perspective vs. ICT Service and Quality Management (adapted from Kaplan and Norton, 1997). [The figure shows the internal business process chain - identify the market and the customer need, create and build the product/service offering (the innovation process), deliver the product/service (the operations process) and service the customer until the customer need is satisfied (the service process) - framed above by business strategy and the management of business risk (KGIs and CSFs) and below by managing and monitoring service delivery and Quality through ICT Service Management and TQM (KPIs). Programme/project management, ICT architecture, application development and business analysis, and ICT Service Management and operations support these processes.]

Q: What type of environment ensures best results for the implementation of Best Practice?
A: Best Practice is reliant on the existence of a Service Culture - a Service Culture is an organisational culture which emphasises satisfying customer requirements (Fry, 1989) - and the existence of a service culture is seen as a prerequisite for delivering quality service (Mastenbroek, 1991).
Q: How does Quality relate to performance indicators, more particularly to the Balanced
Scorecard?
A: A Quality Management System ensures that performance indicators are documented, understood and adhered to; it thus provides the framework in which organisational goals can be related to daily activities and ensures adherence to these standards, processes, procedures and work instructions (Sacks, 2003).
Q: How does Quality relate to the performance of the business as a whole?
A: Kaplan and Norton (1996) commented that: 'Quality was a critical competitive dimension in the 1980s and remains important to this day. By the mid-1990s, however, quality has shifted from a strategic advantage to a competitive necessity. It has become a hygiene factor; customers take for granted that their suppliers will execute according to product and service specifications' (Kaplan and Norton, 1996, p.87).
Q: What framework can be used for the research in this dissertation?

A: A logical framework was needed to compare results from Customer Surveys and from Process Maturity and Best Practice Assessments, firstly over the research period and secondly with each other and with information collected from call statistics, to ascertain whether the use of ICT Best Practice contributed towards Customer Satisfaction and whether it contributes to better service delivery/Quality.
Niessink's Gaps Model of Service Quality (2001) was used as the basis for the Research Framework that met these objectives.

CHAPTER 3
RESEARCH METHODOLOGY

3.1 Introduction
Measuring service improvement would have been a simple task, were it not for the involvement
of people. This necessitated broadening the scope of the dissertation to include the perception of
the quality of service offered by Technology Services to the customer.
The researcher understands that the world is socially constructed (subjective). The researcher
was part of the object and subject observed and understands that science is driven by human
interest. Research thus followed a phenomenological approach as both subjective and objective
data needed to be evaluated and interpreted. The researcher realised that for this dissertation to
be meaningful, the aim must be to understand what is happening, have a holistic view of the
situation and interpret data from this viewpoint, thus developing ideas through induction from the data at his disposal.
The very nature of what the researcher aimed to understand necessitated the use of multiple methods to collect primary data, both objective and subjective, in order to understand the environment and to support (or not) the hypotheses that (1) there is a direct correlation between the level of Customer Satisfaction with services provided by Technology Services and the use of ICT Best Practice in the form of the ITIL framework, and (2) Customer Satisfaction is an indication of effective service delivery in the organisation.
As the magnitude of the whole Shared Service Centre project was far too ambitious to cover, this dissertation focused on the first phase of the project (the establishment of the Shared Service Centre function). This phase also provided a more stable environment in which research could be conducted, as fewer variables were present in the operating environment during this phase, although it could not be said that all things were always equal.
For data to be meaningful under the phenomenological paradigm, information, data and observations needed to be gathered over time. The collection of data was done over a period of eleven months - during this time pre-, mid- and post-evaluations of customer perceptions
(subjective) were made, Call Statistics were collected and assessments were conducted to
determine the process maturity of Technology Services.
Primary data was collected in the form of e-mail-based questionnaires focusing on customers, structured interviews with senior and line-function managers, assessments, surveys and system-generated statistics.

3.2 Nature of Research


3.2.1 The nature of data collected
During the research period, data collected could either be quantitative or qualitative, objective or
subjective. Figure-12 outlines the relationship of primary data collected.

Figure-12: The Nature of Primary Data Collected


Research was not conducted in a stable environment; a pre-condition existed, new concepts,
practices and procedures were introduced and resulted in specific outcomes. The nature of this
dissertation revolved around change, its impact and its outcomes. Apart from objective
measurement of Service Management Process Maturity, conformance to Best Practice and

system statistics, the 'subjective' measurement of Customer Satisfaction and the general views of management and staff needed to be considered. Regardless of the content of statistical observations, and regardless of management's, staff's and the researcher's views, customers either felt satisfied with Service Quality or they did not.
Obtaining information from 'objective' observation and the 'subjective' experience of the ICT services offered was thus necessary and would benefit the field of ICT Service Management in general. These two seemingly opposing or contradictory sources of information caused the researcher difficulty - how does one consolidate or integrate these two opposing views? An academically accepted framework had to be explored as a guide for the researcher's research approach.
Burrell and Morgan (1985) concluded that different approaches to social theory make assumptions about the nature of society and that these assumptions yield contrasting dimensions. The assumptions about the nature of the social sciences cater for subjective as well as objective views, order as well as change. Considering this principle, they presented a framework for the analysis of social theories as four quadrants of a matrix, each representing a certain paradigm. Hirschheim and Klein (1989) applied the Burrell/Morgan framework to Information Systems Development and, whereas Burrell and Morgan considered the paradigms to be mutually exclusive, Hirschheim and Klein (1989) suggest that the influence from one paradigm is merely dominant and that Information Systems development approaches are influenced by assumptions from more than one paradigm, thus concluding that some elements may be more dominant than others but all are nonetheless present.

Figure-13: Potgieter's Quality Systems Practice Framework

Potgieter (1997) applied the Hirschheim/Klein model to the field of Service Management. He further states that the four quadrants of his model, as is the case with Hirschheim and Klein's (1989), are not regarded as mutually exclusive and can be studied simultaneously.
Potgieter (1997) adapted his IT/IS Practices framework into what he termed the Quality Systems Practice Framework (Figure-13). Four descriptive quadrants (paradigms) were used to describe ICT Service Management and Quality practices, putting research results in context. It should be noted that the fundamental assumptions of each of these paradigms were based on different (objectivity versus subjectivity and order versus change) information sources.
Potgieter (1997) further suggests that dynamic, selective use of a practice for application may be made consciously by the Information Technology Systems practitioner. This choice may be made with a purpose in mind, derived from the need to focus on either ICT systems or human systems, order or change.
The above framework was used to observe situations that arose during the research period, in an attempt to understand the underlying assumptions on which each data set depended, in the context of the environment; it also lends itself to a typical phenomenological research approach.

3.2.2 Research approach


The appropriateness of a research methodology is vital when considering any research topic. It is important to ensure that the research is useful and that conclusions are valid. The researcher should make sure that the research has not only content validity but also construct validity (Luthans, 1989).
The position of the researcher also needs to be remembered (interim manager and consultant).
The researcher assisted the organisation with the implementation of Best Practice and as such
did not stand independent from the change in the organisation, but rather facilitated change.
The aim of this dissertation was to ascertain whether measurable change in the perception of customers transpired and to establish whether these changes were a result of Best Practice implementation, as claimed by the Office of Government Commerce.
The methodology chosen for this research dissertation therefore took cognisance of the above facts. Two research methodologies were considered: (1) the Single Case Study (Jankowicz, 1995) and (2) a subset of the first, Action Research (Galliers, 1992).
The case study method allows the researcher to explore issues in both the present and the past, as they affect a relatively complete organisational unit, and is applicable if a theory is explored which specifies a particular set of outcomes (Jankowicz, 1995). Case studies further assist researchers to develop and refine concepts and frames of reference while studying the phenomena; the researcher can thus let the subject unfold naturally, helping the researcher to understand the dynamics of the subject studied (Galliers, 1992).
Traditional case studies do, however, have some limitations. For example, they limit the ability of the researcher to experiment or intervene in the unfolding subject matter.
Action research, on the other hand, allows the researcher to take part in the unfolding subject matter (which the researcher, in terms of his relationship with Technology Services, was required to do) (Galliers, 1992). The only viable approach was thus Action Research.
Action Research interventions are based on the formulation of a perceived need by a group of stakeholders as a result of observation of a specific situation. Action Science can be summarised as 'publicly testing knowledge claims in accordance with an explicit set of rules (set by the researcher), but they adapt these to the context and extend them to include all normative,

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

N
y

interpretive, and empirical claims that actors necessarily make as they try to understand the world in order to act' (Argyris et al, 1985).
Action research follows the following process (Zuber-Skerritt, 1991, p.103)⁹:
Plan (problem analysis and strategic plan)
Act (implement the plan)
Observe (evaluate actions)
Reflect (on the results of the evaluation and the process as a whole)

Because of the limited time span available to do research at the organisation, Observe and Reflect were in most instances part of a single interaction: the researcher and his colleagues observed actions, shared these observations, evaluated them and reflected on them in a single action (normally as part of the daily operations meeting). It should be noted that Action Research projects differ substantially in their characteristics - this depends on a number of factors, including the situation of the researcher in the organisation and his relationship to the research theme - and it is thus not a prescriptive research methodology (Dick, 2002).
Dick (2002) further comments that action research should be regarded as a methodology which is intended to have both action outcomes and research outcomes - in some cases the research component mostly takes the form of understanding on the part of those involved - though the action of participants is primary. This was the primary mode of research conducted in this dissertation.
3.3 Research Design
Although any form of research needs to be planned and designed, action research cannot follow a rigid process, otherwise it loses its usefulness as the research cannot 'unfold' (Galliers, 1992) and the danger then exists that no action will be taken.
The design of the data collection process can mainly be divided into three categories: customer surveys, management surveys and interviews, and system statistics. As the research project unfolded, activities were identified as Plan, Act, Observe/Evaluate and Reflect.

⁹ It is interesting to note the similarities between the Action Research approach and the Deming Quality Cycle (Plan, Do, Check and Act) - Deming was first, so can one safely assume that the Action Research approach borrowed its approach from Deming?

This section will not cover a detailed design of the research undertaken, as this is not practical. The general idea of the research will rather be shared, and the rest of this section will deal with the 'unfolding' research project.
The aim of the research was to understand the context or environment in which the research took place, establish the maturity of the organisation in terms of implementing Service Management Best Practice, establish the perception of customers regarding service Quality and, lastly, correlate this view with the actual improvement in terms of service and organisational maturity, thus aiming to test the hypotheses that:
1. There is a direct correlation between the level of Customer Satisfaction with services provided by Technology Services and the use of Service Management Best Practice in the form of the ITIL framework, and
2. Customer Satisfaction is an indication of effective service delivery.
3.3.1 Unfolding research project
An important focus of applying Service Management Best Practice is to enable the ICT function to be a more effective service provider and to ensure that business requirements are met. Although managing the ICT infrastructure itself is a necessary component of Service Management, it was not the primary focus of this dissertation - instead the focus was on aligning ICT service provision with the organisational requirements, needs and strategies. This meant that the traditional paradigm of service provision needed to change to one that was process-oriented, proactive and truly permeated all aspects of the business (Leopoldi, 2002). The new paradigm needed to be one of a service- and customer-driven ICT function, following the set perspectives of people, process, technology, organisation and integration.
3.3.2 Planning
With this paradigm in mind, several objectives needed to be addressed:
Business objectives
Service Level objectives
Technology and infrastructure
Best Practice objectives.

The first step in implementing Service Management Best Practice was to understand the initial environment - a baseline was thus needed. Based on the information at hand, the desired future state of ICT then needed to be determined and a roadmap developed. Appendix-J outlines the 'unfolding' research project in greater detail.
3.4 Characteristics of Sample
This dissertation related to the Shared Service Centre, and the data collected and analysed were confined to this environment. Customers surveyed to determine the level of satisfaction ranged from 16.2% to 27.9% of the total user population. Managers surveyed represented 21.8% of all middle and senior managers. Satisfaction Surveys were sent to all Managers and Users; the sample collected was based on the individual's choice to participate in the survey. The sample size, though, is substantial and can be regarded as representative of the general views of Managers and Users.
Process Maturity and Best Practice Conformance Assessments were done with process owners and relied on the honesty of respondents - as the researcher was involved in the operations of Technology Services, the researcher from time to time had to challenge the views of respondents and ask for concrete proof of their responses. The Process Maturity and Best Practice Conformance Assessments can thus also be regarded as representative of the true state of affairs at the Shared Service Centre.
The call statistics used represented all the calls logged and responded to during the research period - they were thus 100% representative.
Survey and Assessment information was supplemented by structured interviews with Senior Managers - three of the seven Executive Managers were interviewed. The views of these managers were seen as representative of the views of the Executive Management team, as they represent nearly half of the team.
Other interviews with middle managers were done as needed. These interviews mostly took the
form of formal interviews but sometimes were one-on-one conversations about a specific issue or
problem.
Although the study was confined to Shared Service Centre, most managers in the organisation
came from the private sector. They commented that Shared Service Centre is in no way unique
and that problems and challenges experienced, in their view, were similar to previous
experiences. The results of this study should thus be meaningful to all organisations.
3.5 Gathering Data
Customer Satisfaction Surveys were sent to all Managers and Users via e-mail; the sample collected was based on the individual's choice to participate. Respondents answered the survey questions on-line and submitted them to a central database (Appendix-D).
Process Maturity and Best Practice Conformance Assessments were done by interviewing process owners (Appendix-B). Call statistics were collected from the Service Desk tool's database and represented all the calls logged and responded to during the research period (Appendix-E).
Structured interviews with three of the seven Executive Managers were held (Appendix-A).
Other interviews with managers were done as needed (Appendix-C).
3.6 Design of Questionnaires
3.6.1 ICT Service Management Process Maturity Measurement
The Office of Government Commerce developed a tool¹⁰ to measure an organisation's Information Technology Service Management maturity, based on the Capability Maturity Model (CMM) of Carnegie Mellon University, the Service Capability Maturity Model of the Vrije Universiteit and the work done in 1970 at Harvard Business School and IBM. The Process Maturity Framework, although based on the concepts of the Capability Maturity Model, does not have exactly the same underlying structure and draws heavily on the IBM/Harvard model (OGC(2), 2002). The tool proves helpful not only in obtaining a baseline of the organisation's Service Management process maturity, but also in assisting the organisation to identify areas needing attention. For more information on the Service Capability Maturity Model see Appendix-L.

¹⁰ This statement is not entirely correct - the OGC developed a model based on CMM and the itSMF developed the toolset; see http://www.itsmf.com/bestpractice/selfassessment.asp

Figure-14: Process Maturity Framework for IT Service Management


The first survey was conducted in December 2002. The organisation, however, had been subjected to a similar survey (Microsoft) that was based on the Office of Government Commerce model. The Microsoft assessment was done in January 2002. To be able to have an understanding of where the organisation 'came from', the Microsoft assessment was converted back to the original Office of Government Commerce format. This was a very simple task, as about 90% of the survey questions were identical; for the remaining 10% the researcher depended on general observations made when he joined the project (June 2002) and on interviews with technical managers.
It should further be noted that only seven of the eleven ITIL processes/functions were
implemented at the time of the last survey and only five in June 2002.
Figure-14 illustrates how the Process Maturity Framework progressively measures maturity. Performing activities associated with 'higher' levels does not mean that a higher level of maturity is achieved - complying with the set prerequisites for each level is the major determining factor qualifying the organisation to move to a higher level of maturity. Figure-15 outlines the issues that need to be addressed to reach a certain level of maturity.
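To make the prerequisite-gating rule concrete, the following minimal sketch caps an organisation's maturity at the highest consecutive level whose prerequisites are met. It is a simplification: the actual OGC/itSMF tool works through weighted questions per level and yields the fractional scores reported later, which this sketch does not reproduce.

# Simplified illustration of the gating rule: an organisation only qualifies
# for a maturity level if the prerequisites of that level and of every level
# below it are met, regardless of activities performed at higher levels.
def maturity_level(prerequisites_met):
    """prerequisites_met: dict mapping level (1..5) to True/False."""
    level = 0
    for lvl in sorted(prerequisites_met):
        if prerequisites_met[lvl]:
            level = lvl
        else:
            break  # a failed prerequisite caps the maturity here
    return level

# Hypothetical example: level-3 activities may be in place, but the level-2
# prerequisites are not met, so the organisation remains at level 1.
print(maturity_level({1: True, 2: False, 3: True, 4: False, 5: False}))  # -> 1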

Figure-15: Levels of Process Maturity


Appendix-B details the questionnaires used and the results of the assessments undertaken.
The assessment was done by interviewing process owners; responses were reviewed by a manager who does not own the process, discrepancies were then discussed with the process owner, and adjustments were made to the original response where appropriate (if the process owner could not disprove the objections).
The processes assessed were:

Service Level Management (Service Delivery)

Service Desk (Service Support)

Incident Management (Service Support)

Problem Management (Service Support)

Change Management (Service Support)

Configuration Management (Service Support)

Release Management (Service Support)

3.6.2 Customer Satisfaction Survey


Berry, Parasuraman and Zeithaml's (1990) SERVQUAL scale was used to design the Customer Satisfaction Surveys. The organisation regarded some issues as more important than others and also believed that surveys should be short and concise. The survey design considered these factors and focused on metrics that would, in the view of the organisation, prove helpful. The survey was a set of simple multiple-choice questions posed to customers; Table-1 gives an example of a question posed, and the full set of survey questions can be seen in Appendix-D.
Quantitative data are much easier to work with but less user-friendly; although the Customer Satisfaction Survey was in essence a quantitative measurement device, qualitative statements were associated with the rating scale (Table-1).

Table-1. Sample of Customer Satisfaction Survey Questions and Measurement Scale
Question posed: How well ICT services or products offered are communicated to users?
Numeric rating and corresponding qualitative response: 5 = Very Bad, 4 = Bad, 3 = Fair, 2 = Well, 1 = Excellent.

3.6.2.1 Intent of Survey


Technology Services needed a mechanism to measure whether services offered to customers, met
expectations. The measurement of the success of services needed to be broad enough to ascertain
if and how Technology Services failed to deliver in terms of customer expectations and to see
how and in which area Technology Services needed to focus service improvement efforts.
3.6.2.2 Methodology and Instrument
The instrument used to conduct Customer Satisfaction surveys was an automated system making use of the corporate e-mail system. The system used SERVQUAL as a basis and focused on the following SERVQUAL factors:
Tangibles (Questions 1 & 2)
Reliability and Responsiveness (Questions 3, 4 & 8)
Assurance (Questions 5 & 7)
Empathy (Question 6)
For each question, five predetermined responses were made available, ranging from excellent to very poor. Table-1 gives a sample of the questionnaire - a numeric equivalent was assigned to each possible answer, thus allowing the researcher to deal with qualitative data as quantitative data, making analysis easier. In most instances the use of qualitative information increases responsiveness: it is possible to work in natural language, which is easier for informants (Dick, 2000).
Appendix-D outlines the questions posed, the possible responses customers could choose and the reasons why questions were posed. It should be noted that a lower numeric value equals a better result.
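To illustrate how the qualitative responses were treated as quantitative data, the minimal sketch below applies the numeric scale from Table-1 and the question-to-factor mapping from Section 3.6.2.2 to a set of invented responses and averages the scores per SERVQUAL factor.

# Numeric equivalents from Table-1: a lower value equals a better result.
RATING = {"Excellent": 1, "Well": 2, "Fair": 3, "Bad": 4, "Very Bad": 5}

# Question-to-factor mapping from Section 3.6.2.2.
FACTORS = {
    "Tangibles": [1, 2],
    "Reliability and Responsiveness": [3, 4, 8],
    "Assurance": [5, 7],
    "Empathy": [6],
}

# Hypothetical survey responses: each respondent answers questions 1..8.
responses = [
    {1: "Fair", 2: "Well", 3: "Bad", 4: "Fair", 5: "Well", 6: "Fair", 7: "Well", 8: "Bad"},
    {1: "Well", 2: "Excellent", 3: "Fair", 4: "Well", 5: "Fair", 6: "Well", 7: "Fair", 8: "Fair"},
]

for factor, questions in FACTORS.items():
    scores = [RATING[r[q]] for r in responses for q in questions]
    print(f"{factor:32} average score: {sum(scores) / len(scores):.2f}")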
3.6.3 Call Statistics
The Service Desk tool did not provide very useful information pertaining to ICT service improvement. Two measures could be used: (1) time to resolve an incident and (2) reduction in calls logged per user. This information was drawn from the Service Desk software's database; additional information on the number of users and the number of technical resources responding to calls at the time of any one survey was also used (Appendix-E).
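The two indicative measures can be sketched as follows; the call records, timestamps and user count below are hypothetical and merely mimic the kind of data drawn from the Service Desk database.

from datetime import datetime

# Hypothetical call records: (logged, resolved) timestamps.
calls = [
    (datetime(2003, 1, 6, 9, 0),  datetime(2003, 1, 6, 15, 30)),
    (datetime(2003, 1, 7, 11, 0), datetime(2003, 1, 8, 10, 0)),
    (datetime(2003, 1, 9, 8, 30), datetime(2003, 1, 9, 9, 45)),
]
number_of_users = 463  # user population for the period (hypothetical here)

# Measure 1: average time to resolve an incident, in hours.
hours = [(resolved - logged).total_seconds() / 3600 for logged, resolved in calls]
avg_resolution_hours = sum(hours) / len(hours)

# Measure 2: calls logged per user for the period.
calls_per_user = len(calls) / number_of_users

print(f"Average incident resolution time: {avg_resolution_hours:.1f} hours")
print(f"Calls logged per user (period):   {calls_per_user:.3f}")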

3.7 Data analysis


Data analysis followed the approach set out in the Research Framework. Data collected were tabulated and analysed in Microsoft Excel - more detail on the approach can be seen in Sections 4 and 5.

CHAPTER 4
RESULTS

4.1 Overview of Results


This section will address the three individual sets of results obtained during the research period and the results of the Management Interviews. The comparison of these three data sets will be dealt with in Section-5. Management Interviews were used to contextualise the results in this section and the findings in Section-5. The highlighted blocks in Figure-16 outline the three sets of results that will be dealt with in this section - Management Interviews form part of the context of the research.

Figure-16: The Research Framework as related to Section-4

4.2 ICT Service Management Maturity and Best Practice Conformance Assessments
4.2.1 Assessment Data
The results of the maturity assessment are summarised in Figure-17. Of the six processes and one function implemented, six showed improvement over the research period, of which two improved substantially (Service Level Management and Configuration Management). These were regarded as key processes around which all the others revolved - hence they improved more significantly than the others. The Service Desk function showed no improvement in terms of Process Maturity: the Service Desk does not belong to Technology Services but rather to the Customer Relationship Management function in the Shared Service Centre, the understanding of ITIL and priorities differed, and staff employed for this function were not technically competent, as they acted as call-loggers rather than providing first-line support. None of the prerequisites for an improved maturity level was thus met by the Service Desk.
Process                     OGC/CMM (Ass 1 - Jul 02)   OGC/CMM (Ass 2 - Jan 03)   OGC/CMM (Ass 3 - April 03)
Service Level Management    0.0                        1.0                        3.0
Service Desk                0.0                        0.0                        0.0
Incident Management         1.0                        1.0                        2.0
Problem Management          0.0                        1.0                        1.6
Change Management           0.0                        1.0                        1.5
Configuration Management    0.0                        1.0                        2.5
Release Management          0.0                        1.0                        1.0

Figure-17: OGC Process Maturity Assessments compared


In addition to the maturity level assessment, the researcher added another dimension to the
standard assessment. The Process Maturity assessment tool dictated that unless the pre-requisites
for a level are met, the maturity of the organisation will remain at the lower level. This, however, does not reflect the status of other activities that may relate to a higher level of maturity.
To get a clearer understanding of the number of issues that still need to be addressed to become a mature Technology Service Provider, the researcher determined the percentage of process activities that were addressed, regardless of the 'level' in which the 'issue' was assessed, thus determining the level of Best Practice conformance. The results of the revised use of the tool can be seen in Figure-18.
Process                     Activity Score (Ass 1 - Jul 02)   Activity Score (Ass 3 - Jan 03)   Activity Score (Ass 4 - April 03)
Service Level Management    18%                               70%                               82%
Service Desk                15%                               57%                               65%
Incident Management         5%                                65%                               85%
Problem Management          5%                                65%                               75%
Change Management           4%                                49%                               64%
Configuration Management    2%                                67%                               76%
Release Management          4%                                71%                               78%

Figure-18: Best Practice Conformance Assessments compared
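The conformance calculation described above can be sketched as follows: the share of assessed activities addressed is counted across all levels, without the level-prerequisite gating used for the maturity score. The process and activity names below are hypothetical.

# Each process is assessed against a set of activities spread across maturity
# levels; conformance counts how many are addressed, regardless of level.
def conformance(activities):
    """activities: dict mapping activity name to True (addressed) or False."""
    addressed = sum(1 for done in activities.values() if done)
    return 100.0 * addressed / len(activities)

# Hypothetical assessment of a single process.
change_management = {
    "Change requests recorded": True,
    "Changes assessed for impact and risk": True,
    "Change Advisory Board in place": False,
    "Post-implementation reviews held": False,
    "Change schedule published": True,
}

print(f"Best Practice conformance: {conformance(change_management):.0f}%")  # -> 60%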


All processes showed significant improvements. Service Desk activities also showed a significant improvement; this was largely due to the fact that Incident Management 'took up the slack' for the Service Desk by taking over many of its responsibilities. This situation is obviously not ideal, as the Incident Management process as a result loses focus.

4.2.1 Conclusion - Maturity and Best Practice Conformance Assessments


Although Best Practice has been partly implemented in the Shared Service Centre environment and has shown significant improvement over the research period, some key issues still hold back the effectiveness and efficiency that Best Practice can bring to the organisation. The major contributors to the organisation's ability to reach higher levels of maturity are demonstrable management commitment to Best Practice and breaking down the 'silos' that exist within the organisation.

4.3 Customer Satisfaction Survey


4.3.1 Results of Surveys
The results of the Customer Satisfaction surveys are discussed below in the following order: whether
respondents qualify to have an opinion on the quality of service and the resulting improvement;
an initial impression of the results; the adjustments made to the results and the reasons for these
adjustments; a short discussion of the anomalies found; and lastly the conclusions drawn from the
analysis.
4.3.2 Validity of Responses
When doing surveys one has to question whether the respondents qualify to express an opinion;
respondents must have a basis from which to voice an opinion, otherwise the results of the survey
may be questionable or at least be seen as extremely subjective (quality or customer surveys are,
however, always subjective by nature, as perception rather than actual performance is the
subject of the survey or questionnaire).
As seen in Appendix-D, one of the questions posed was whether the respondent had logged a call with
the Service Desk. This is important, as customers who have not done so do not have an informed
view of the following questions:

- Technical Response
- Information Response
- IT Staff Knowledge and Ability
- Time to resolve a reported Incident


And, to a lesser extent, IT staff care and Overall Satisfaction will also be influenced.
The responses to this question showed that a very high proportion of customers had logged calls with
the Service Desk, ranging from a low of 94% (Survey 3) to a high of 96% (Survey 2).
Figure-19 shows the results of the responses to this question.

Figure 19: Percentage of respondents who have logged calls (chart comparing Survey 2, Survey 3 and Survey 3 - SM)


The question was, however, only posed during the second and third surveys, as during the first all
users were new to the Shared Service Centre.
The second question that needs to be posed when attempting to draw conclusions from Customer
Satisfaction surveys is even more important: can the sample be considered representative of
the population?
4.3.2.1 Survey 1 - October 2002
The first survey was sent to 370 employees who had worked for the organisation before they
started to work for the Shared Service Centre; they represented a baseline from which to gauge
service improvement. These 370 employees represented 80% of the users at the start of operations
of the Shared Service Centre.


Of the 370 employees, 103 responded; this represented a sample of 27.8%, more than one
quarter of the employees in this category.

4.3.2.2 Survey 2 - January 2003
The second survey was sent to 463 employees who work for the Shared Service Centre; of these, 129
responded, representing a sample of 27.9%, once again more than one quarter of the
employees.
4.3.2.3 Survey 3 - May 2003
The third survey posed the survey questions to two categories of employees: all users (893) and
managers (96). The managers represented the business interest of the Shared Service Centre and
included executive, senior and middle managers of the organisation.
Of the 893 employees, 145 responded, representing a sample of 16.2% or about one sixth of the
users; of the 96 managers, 21 responded to the survey, representing 21.8% of the managers.
4.3.2.4 Conclusion on Validity
Survey                 Population   Sample (response rate)
Survey 1               370          103 (27.8%)
Survey 2               463          129 (27.9%)
Survey 3               893          145 (16.2%)
Survey 3 (Managers)    96           21 (21.8%)

Figure-20: Percentage of the population who responded to the surveys


The data collected during the Customer Satisfaction Surveys were considered valid for the
following reasons:


1. Between 94% and 96% of respondents had logged calls with the Service Desk, and all respondents are
customers of services provided by Technology Services; the sample thus qualifies to express an
opinion.
2. The percentage of the population sampled was considered significant enough: the goal
was to sample 15% of the population, and all surveys yielded more. Figure-20 outlines the
detail with regard to the responses to the surveys conducted.
4.3.3 Initial comparison
A significant improvement in service quality occurred between the first and the second survey as
a result of the efforts of Technology Services; the only exception to this was product-service fit. It
seems that Technology Services did not fully understand the needs of the customers and thus did
not add value by improving the way customers work.
The third survey was the first in which managers were surveyed separately. Managers
seemed to have a less positive view of the services offered. The normal survey also did not show
significant improvement; in fact, in one instance (technical response time) service seemed to
deteriorate.
(lower value = better result)
Question                   Median (S1)   Median (S2)   Median (S3)   Median (S3M)
Communication              3.0           2.0           2.0           3.0
Product/Service Fit        3.0           3.0           3.0           3.0
Technical Response         3.0           2.0           3.0           3.0
Information Response       4.0           2.0           2.0           2.0
IT Staff Knowledge         3.0           2.0           2.0           2.0
IT Caring for Customers    3.0           2.0           2.0           2.0
IT Staff Ability           3.0           2.0           2.0           2.0
Time to resolve            3.0           2.0           2.0           2.0
Overall Satisfaction       3.0           2.0           2.0           3.0

Figure-21: Median Response to Customer Satisfaction Surveys


Figure-21 shows the median response to the surveys and seems to indicate that the results for the
last survey stayed the same. The median of the survey results was chosen because ordinal data was
used. Although not strictly correct, ordinal data is sometimes treated as interval/ratio data; the
researcher shied away from this approach where possible (Blanchard et al, 1999).

4.3.4 Adjustment of Survey results


The stagnant results of the survey were expected, as the researcher did not regard the level of
improvement as significant enough to move from 'good' to 'excellent' (Appendix-D). Three
additional questions were posed to gauge marginal improvement, and these three questions were
related to other questions to measure whether any improvement had occurred. In hindsight, the researcher
would rather have chosen a scale of 1 to 10, as the 1-to-5 scale showed its limitations during the
third survey. Figure-22 outlines the responses to the three additional questions posed and how they
relate to the other questions posed.

Question group                       Worse   Same   Improved
Service Response (Q 3, 5 & 7)        1%      38%    61%
Communication (Q 1 & 4)              2%      37%    61%
Product & Services (Q 2 & 6)         2%      34%    64%

Figure-22: Did Service improve during the last Quarter Surveyed?


It is clear that the majority of respondents did regard service as having improved, and about one third
regarded service levels as the same (Figure-22); this third is ignored as it does not
influence the result. It is rather the responses that regarded service as improved or deteriorated that
were regarded as important when the off-set formula below was applied to the data:
Real Result = Result obtained - (% that said service improved - % that said service deteriorated)
The off-set was subtracted from the results of Survey-3, as a lower score represents a better
response; Figure-23 shows the results of applying the off-set formula.
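As a worked illustration of the off-set formula, the short sketch below applies it to the Communication item, with the values hard-coded from Figures 21 and 22.

```python
# Worked illustration of the off-set formula for the Communication item:
# Survey-3 median = 2.0 (Figure-21); 61% of respondents said service improved
# and 2% said it deteriorated (Figure-22).
median_s3 = 2.0
pct_improved = 0.61
pct_deteriorated = 0.02

offset = pct_improved - pct_deteriorated      # 0.59
real_result = median_s3 - offset              # lower value = better result
print(round(real_result, 1))                  # 1.4, as reported in Figure-23
```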


(lower value = better result)
Question                   Median (S1)   Median (S2)   Off-set Median (S3)
Communication              3.0           2.0           1.4
Product/Service Fit        3.0           3.0           2.4
Technical Response         3.0           2.0           2.4
Information Response       4.0           2.0           1.4
IT Staff Knowledge         3.0           2.0           1.4
IT Caring for Customers    3.0           2.0           1.4
IT Staff Ability           3.0           2.0           1.4
Time to resolve            3.0           2.0           2.0
Overall Satisfaction       3.0           2.0           1.4

Figure-23: Off-set values applied to Survey-3


The researcher regarded the adjusted results as closer to the truth, backed up by the results of the
additional questions posed in Survey-3 (Figure-22). It should be noted that the results of
Question-8 (Appendix-D) were not adjusted, as the response to this question was factual (time to
resolve an Incident) and thus not subject to a respondent's view or open to interpretation.

Final Customer Satisfaction Results
(lower value = better result)
Question                   S-1 Median   S-1 Std. Dev.   S-2 Median   S-2 Std. Dev.   S-3 Median   S-3 Std. Dev.
Communication              3.0          1.1             2.0          0.8             1.5          0.9
Product/Service Fit        3.0          0.9             3.0          0.8             2.5          0.9
Technical Response         3.0          1.1             2.0          1.0             2.5          1.1
Information Response       4.0          1.2             2.0          0.8             1.5          0.9
Staff Knowledge            3.0          1.0             2.0          0.7             1.5          0.7
Caring for Customers       3.0          1.0             2.0          0.7             1.5          0.8
Staff Ability              3.0          0.9             2.0          0.7             1.5          0.8
Time to resolve            3.0          1.3             2.0          0.9             2.0          1.2
Overall Satisfaction       3.0          1.1             2.0          0.9             1.5          0.8

Figure-24: Customer Satisfaction Surveys Compared


As explained above, the 1-to-5 scale used proved inflexible and unable to show marginal
improvement (the reason for the three additional questions and the off-set). The best means the
researcher could find to solve this problem was to adapt the ordinal scale to use 0.5 increments
instead of 1 (the same as converting the 1-to-5 scale to a 1-to-10 scale).
The off-set values calculated for Survey-3 were no longer ordinal, which posed a new
dilemma. To overcome this problem, off-set values were rounded to the nearest half instead of
using the absolute value calculated; this allowed the researcher to represent marginal improvements as
ordinal data (Figure-24).
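A minimal sketch of the half-point rounding described above (illustrative only):

```python
# Round an off-set adjusted value to the nearest 0.5 so that it can still be
# treated as a point on the halved ordinal scale.
def round_to_half(x):
    return round(x * 2) / 2

print(round_to_half(1.4))   # 1.5 (e.g. Communication: 1.4 in Figure-23 becomes 1.5 in Figure-24)
print(round_to_half(2.4))   # 2.5 (e.g. Product/Service Fit)
```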
Over the research period customers' perception of service quality and their satisfaction with
service improved constantly and significantly. There are, however, two exceptions that will be
discussed below. Manager satisfaction was tested for the first time in Survey-3, so no
comparative data existed; compared to customer satisfaction it is less positive, but
not substantially so. A number of reasons may exist for this difference; the main factor may be
that managers hear about all the problems in their area of responsibility, so their judgement is
based on their knowledge of the collective service provision to their area of responsibility, whereas
users' perceptions are largely their own.
4.3.5 Anomalies
In Figure-24 two anomalies became apparent, namely:

- Technical Response (Survey-3 worse than Survey-2)
- Time to Resolve (Survey-3 the same as Survey-2)

It was the original view of the researcher that these anomalies were caused by the fact that the
user count grew substantially over the research period while the number of support people did not.
This assumption, however, proved not to be true, as will be seen in the section dealing with call statistics;
a more plausible explanation is that customers are starting to hold over-expectations
(Figure-1) of the Technology Services division.
The second anomaly is based on factual information: the time to resolve an incident did not
improve. This is substantiated by the analysis of call statistics and will be dealt with in the
section on Call Statistics.


The two anomalies were not seen as significant indicators that customers are dissatisfied with
the service provided, or as indicative of a trend, as the results of all three additional questions posed
clearly indicate that customers have seen an improvement in the service offered (Figure-22).

4.3.6 SERVQUAL - Factors Compared


Of even more relevance is the comparison of the SERVQUAL factors, as these more accurately
reflect the views of customers. Figure-25 shows the relationship of questions posed to
SERVQUAL factors.

Figure-25: Survey Questions related to SERVQUAL Factors

Results from the Customer Satisfaction Surveys were aggregated to get a view of the customers'
response to the SERVQUAL factors, making use of the mapping outlined in Figure-25 (see Figure-26).
From Figure-26 it is clear that a definite trend is emerging, showing constant improvement in
customers' perceptions of service quality. What is concerning, though, is that an important
SERVQUAL factor, Responsiveness, seems to be stagnating. A number of factors may have
contributed:
1. All things were not always equal in the research environment: during the last quarter
surveyed, new applications were introduced to the research environment (SAP, Live-Link
etc.), all with their own unique teething problems and requiring that support resources
be diverted from normal daily operations to address these issues.


2. Demands from external departments on the technical resource pool were mounting, as
problems on the provincial Wide Area Network diverted further resources.
3. The transformation of other departments was planned during this quarter, diverting even
more resources to site-surveys and planning activities.

Figure-26: SERVQUAL factors by Customer Satisfaction Survey (chart comparing the median ratings for the five SERVQUAL factors - Tangibles, Responsiveness, Assurance, Empathy and Reliability - across Survey 1, Survey 2, Survey 3 and Survey 3 Managers; lower value = better result)


4.3.7 Conclusion - Satisfaction Surveys
The results of the three Customer Satisfaction surveys - all things considered - seem to clearly
indicate a significant improvement in satisfaction levels over the research period. The sample of
respondents for all the surveys was regarded as representative and valid. The results obtained
can thus be used to evaluate whether or not ICT Service Management Best Practice contributes
to Customer Satisfaction.

4.4 Call Statistics


4.4.1 Discussion of Data
Data collected from the Service Desk database proved very limited. The data that could be collected
were the calls logged and resolved per period and the time it took to resolve a call.


This information on its own did not shed much light on service improvement, as the research site,
although relatively stable over the research period, was subject to a number of changes. The
number of users to support increased from 370 to 893, while the technical support staff complement
did not change substantially.
Call volumes stayed relatively stable over the period, thus off-setting the rapid growth in
users. It would most probably not have been possible for support staff to cope with such a rapid
increase in users to support without good processes and procedures (conjecture, thus
inferring that Best Practice must have played a role in the technical support staff's ability to cope
with the increased number of users to support).

Quarter                                  Oct/Dec-02 (Q1)   Jan/Mar-03 (Q2)   Apr/Jun-03 (Q3)
Average calls per day                    52.1              53.7              48.9
Average call-closure time (days)         3.7               3.7               4.3
Median call-closure time (days)          1.0               1.0               1.0
Number of technicians (Total)            15                22                25
Number of technicians (Effective)        15                15                15
Number of work-days                      62                63                65
Calls per Technician/Day (Effective)     3.5               3.6               3.3

Figure-27: Summary Statistics from the Service Desk Database and Incident Manager.
Figure-27 outlines the core information collected from the Service Desk Database and other
sources in an attempt to form a meaningful view of the call statistics for the research period. The detail
of the data collected can be seen in Appendix-E.
Call volumes decreased slightly over the research period, with 49 calls being logged per day by
the end of the third quarter as opposed to 52 calls per day at the start, in spite of the user count growing
nearly threefold over the research period. The dedicated technical resource available to resolve calls
stayed the same (Total vs. Effective), although the total number of support staff nearly doubled
during the research period; the additional technical resources were diverted towards other projects.
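The derived ratios in Figure-27, and the calls-per-user figures in Figure-28 further on, can be reproduced from the raw counts; the sketch below simply hard-codes the quarterly values reported in those figures.

```python
# Reproduce the derived ratios in Figures 27 and 28 from the raw quarterly counts.
quarters = {
    "Oct/Dec-02": {"calls": 3230, "users": 370, "work_days": 62, "effective_techs": 15},
    "Jan/Mar-03": {"calls": 3384, "users": 463, "work_days": 63, "effective_techs": 15},
    "Apr/Jun-03": {"calls": 3180, "users": 893, "work_days": 65, "effective_techs": 15},
}

for name, q in quarters.items():
    calls_per_day = q["calls"] / q["work_days"]
    calls_per_tech = calls_per_day / q["effective_techs"]
    calls_per_user = q["calls"] / q["users"]
    print(f"{name}: {calls_per_day:.1f} calls/day, "
          f"{calls_per_tech:.1f} per technician/day, {calls_per_user:.1f} per user")
```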

The number of dedicated support personnel, compared to other similar environments, does
however still seem appropriate.
Although most (median) calls were resolved in less than two days and this stayed constant for the
research period, the fact that the average time to resolve increased does not bode well
(from 3.7 days to 4.3 days; see Figure-27). The following reasons may have contributed to the
increase, as the database used proved very limited when trying to discriminate between different
types of calls:
1. Requests for Change are logged as calls. A request can be a standard change (a change
that can be effected easily and quickly, thus similar to a support call), the setting up of new
users (a simple but time-consuming task) or a request to implement a new
application (many of these were made during the research period). The last category is
very time-consuming: in most cases it involves getting approval for spending (software,
or hardware to host systems), testing systems for interoperability in the laboratory,
piloting the systems and finally deploying them, and can take up to three months.
Extreme values like these may explain the big difference between the median and mean time
to resolve calls and the large standard deviation (11.8 days in December 2002, 9.8 days in
January 2003 and 8.5 days in June 2003); a short illustration of this effect follows after this list.
2. New projects started in earnest in February 2003; the implementation and testing of these
systems must have had an influence on the operational environment as systems, no matter
how well tested, always have teething problems, thus diverting resources to ensure stable
implementations.
3. Planning for the next migration phase (bringing 12 additional buildings on-line) also
started in February 2003; resources were diverted to assist with planning and with auditing
these sites to facilitate the planning.
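The skewing effect described in point 1 can be illustrated as follows; the durations below are invented for the illustration and are not the actual call data.

```python
# Illustration of how a few long-running change requests inflate the mean
# time-to-resolve and the standard deviation while the median stays low.
# (Made-up durations, in days; not the actual call data.)
import statistics

durations = [1, 1, 1, 2, 2, 2, 3, 3, 4, 90]        # one three-month change request
print(statistics.median(durations))                 # 2.0
print(round(statistics.mean(durations), 1))         # 10.9
print(round(statistics.pstdev(durations), 1))       # 26.4 - far larger than the median
```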
4.4.2 Conclusion - Call Statistics
The biggest suspected culprit influencing the call statistics was most probably Requests for
Change, as all the possible reasons mentioned above fall within this category. The researcher
suspects that, were he able to exclude change requests (not feasible, as that would mean reviewing
15,000 calls manually), the picture would have looked substantially different.


The only concrete evidence of improved service levels was that the number of calls per user
decreased substantially (to less than half), from 8.7 to 3.6, and not, as originally thought, a decrease
in the time to resolve calls (Figure-28).
Quarter                         Oct/Dec-02 (Q1)   Jan/Mar-03 (Q2)   Apr/Jun-03 (Q3)
Number of users supported       370               463               893
Number of calls                 3,230             3,384             3,180
Average calls logged per user   8.7               7.3               3.6

Figure-28: Average number of calls logged by user per quarter


It could thus be said that service levels improved substantially during the research period,
although not in a fashion readily recognisable by customers, as tangibles (average calls logged per
user per quarter) are not viewed as being as important as the SERVQUAL factors of responsiveness and
reliability (the ability and time to resolve a call).

4.5 Management Interviews


Interviews with managers were largely seen as a mechanism to broaden the scope of the data
collected, to substantiate or refute the research data collected, and to give an environmental context
in which information and data can be analysed and interpreted.
The interviews largely focused on issues that may be considered contributors to, or evidence of,
service improvement. Table-2 summarises the results of the interviews with executive and senior
managers, and Appendices A and C provide more detail.


Interviews with managers largely substantiated the data collected from the three primary collection
sources. Managers further raised similar concerns that need to be addressed to ensure that the
implementation of ICT Service Management Best Practice works optimally and that service
improves. No seemingly contradictory statements were made with regard to the above issues.
Table-2: Summarised responses of Managers regarding Service Quality, Best Practice etc.

Positive factors or improvements made since the inception of the research project:
- Processes and procedures implemented contribute to more effective and efficient service provision.
- Noticeable improvement in service provision since October 2002.
- Improved communication with customers and users.
- Competent technical support staff with reasonable people skills.
- Customers are generally satisfied with service provision.
- Improved response times.
- Technology Services is less reactive and starting to act proactively.
- Processes make the environment easier to manage.
- Improved management of customers and suppliers is noticeable.
- Technology Services has a better understanding of customer needs and requirements.

Areas of concern still hampering effective and efficient service provision:
- Although a number of processes are in place, adherence to processes and procedures is weak and work-streams work in islands; consistency seems to be lacking and a more integrated approach is needed.
- General governance is lacking.
- Technology Services needs to improve communication around major outages, planned outages and the impact that these may have on the operations of the Shared Service Centre.
- People do not follow process; organisational change management was not done well.
- Measurement and controls seem to be lacking.
- Turnaround time and responsiveness are still not what customers expect.
- Management does not demonstrably prove that processes are important.
- Training around processes is lacking.


4.6 Conclusion on Results


In conclusion, the results of the surveys and interviews seem to be valid and representative of the
population, and the assessments and call statistics are representative of the operational environment.
Section-5 will address the objectives of this dissertation and draw conclusions from the analysis
made in this section.


CHAPTER 5
FINDINGS AND RECOMMENDATIONS

The data and information collected during the research project seem representative and valid and,
in conjunction with additional information addressed in the Literature Review, were deemed
sufficient to be used with confidence to assess:
1. whether there is a direct correlation between the level of Customer Satisfaction and Service
Quality provided by Technology Services and the use of Service Management Best
Practice in the form of the ITIL framework,
2. whether Customer Satisfaction is an indication of effective ICT service provision in the
organisation, and
3. what operational context best suits the implementation of ICT Best Practice.
Based on the findings and other observations, some general and some specific recommendations
will follow; these need to be seen in the operational context of the organisation and need to be
considered by executive management.
5.1 The nature of data collected

As stated earlier, it is difficult to relate data collected if the source of the data and the type of data
differ. Some data are objective and irrefutable, while others are subjective and rely heavily on
the perceptions of respondents. Research environments cannot be regarded as stable; there is
always a measure of order and a measure of change present in the environment. Potgieter's
Quality Systems Practice Framework (see 3.2.1) is used to assist the researcher to place
information in context and to relate different types of data in a logical and academically valid
fashion.
Potgieter (1997) defines four paradigms (regulating ICT systems, accommodating different service
perceptions, managing different service perceptions and changing ICT systems) and states that
these paradigms are all present in any research environment. He thus concluded that the ICT
environment contains elements of objectivity and subjectivity, order and conflict; some
elements may be more dominant than others, but all are nonetheless present. This model is thus
very useful for the researcher, as he needed to consider both objective/subjective data and
order/change. Appendix-K discusses Potgieter's model in greater detail.
It was helpful to relate the information and data collected to the four dimensions to ascertain the
dominant paradigm in the organisation and to make the most appropriate recommendations based
on this orientation or paradigm. Figure-29 outlines the orientation of the information sources.

Figure 29: Research data related to Potgieter's 4 paradigms. (Quadrant diagram plotting the data sources - Call Statistics (Time to Resolve, Calls Logged per User), the Customer/User Satisfaction surveys, the SERVQUAL factors (Tangibles, Reliability, Responsiveness, Empathy, Assurance) and the Management Interviews - against the Objective versus Subjective Service Quality and Order versus Change/Conflict dimensions.)

The Shared Service Centre operated predominantly in an environment of change, and from Figure-29
it is clear that the data and information collected are a mix of subjective and objective data, with a
slight bias towards objective data (the shaded area in the figure). The paradigms that largely apply are
Changing ICT Systems and Managing Differing Service Perceptions.


5.2 Research Framework


As outlined in Section 2.3, a framework was put together for use during the research project to
satisfy the objectives of this dissertation (Figure-30).

Figure-30: Research Framework vs. Research Findings


Comparisons A, B and F were made in Section-4, Results: comparing the Customer
Satisfaction Survey results (Comparison-A), the improvement in Process Maturity and Best
Practice Conformance (Comparison-B) and comparing the Call Statistics of Time to Resolve and
Calls Logged per User (Comparison-F).
What this section will address is whether there is any correlation between the three data sets
collected. As in Section-4, the highlighted blocks in Figure-30 outline the comparisons and
findings, and the Management Interviews serve to contextualise the findings. The comparisons relate to
the first two objectives of this dissertation:


1. the direct correlation between the level of Customer Satisfaction with services provided
by Technology Services and the use of Service Management Best Practice in the form of
the ITIL Service Delivery and Service Support frameworks, and
2. whether Customer Satisfaction is an indication of effective ICT service provision.

5.3 Comparison of Primary Data


Although graphical representations (like scatter diagrams) are useful in providing (subjective)
assessments of the strength of relationships between two variables, a clear statistical measure
is needed: "Statistically such a connection (relationship) is referred to as correlation and we can
calculate what is known as the coefficient of correlation between X and Y variables. Such a
coefficient takes a value between 0 and 1: zero implies literally no correlation between the two
variables, while one implies a perfect correlation" (Wisniewski, 1997, p.322).
The coefficient can further take negative or positive values; -1 is thus a perfect negative
correlation (inversely proportional) and +1 a perfect positive correlation (proportional). A
negative correlation would thus mean that as one value increased the other would decrease, while
a positive correlation would mean that the two variables compared would both increase or
decrease. Correlations larger than 0.8 or smaller than -0.8 are considered strong correlations,
while those around 0.4 or -0.4 are considered weak correlations (Blanchard et al, 1999).
The correlation coefficient is known as r (or R); two ways of calculating r follow:

r = Σ(x - x̄)(y - ȳ) / √[ Σ(x - x̄)² · Σ(y - ȳ)² ]

or:

r = Cov(X, Y) / (σx · σy)

where -1 ≤ r ≤ 1 and the covariance Cov(X, Y) = (1/n) Σ (xj - x̄)(yj - ȳ), summed over j = 1 to n.

Wisniewski (1997) recommends the use of a spreadsheet or statistical package, as the calculations can
be quite complex; the researcher used Microsoft Excel. From the second formula it is clear that
data sets with a standard deviation of zero cannot be used to calculate the correlation coefficient:
when a standard deviation equals zero (σ = 0), it is mathematically impossible to calculate a
result (division by zero is not allowed).


Appendix-M shows the detailed correlation coefficients calculated between all data sets collected.
For simplicity's sake, aggregated values were calculated for Customer Satisfaction (the 5 SERVQUAL
factors), Process Maturity (ITIL processes implemented) and Best Practice Conformance (ITIL
processes implemented). The aggregate values accurately reflected Customer Satisfaction and
Process Maturity, and fairly accurately represented Best Practice Conformance. These data sets
were used in the analysis to follow, together with two additional data sets, namely Calls logged
per user per quarter and Median time to resolve (note: the data sets are discussed in greater detail in
Section-4). Figure-31 outlines the values of the above data sets.
Data set                            Quarter 1   Quarter 2   Quarter 3
Customer Satisfaction (SERVQUAL)    3.0         2.0         1.5
Best Practice Conformance (x10%)    0.8         6.3         7.5
Process Maturity                    0.0         1.0         1.5
Calls logged per User               8.7         7.3         3.6
Median Time to Resolve              1.0         1.0         1.0

Figure-31: Data sets used in final assessments


Correlation coefficients were then calculated between the aggregates of the above data sets; the
results are shown in Figure-32. Remarkably strong correlations were observed and, in one instance
(Process Maturity versus User Satisfaction), a perfect negative correlation. As the
standard deviation of the Median Time to Resolve a call equals zero (σ = 0), no correlation coefficient
could be calculated for it and it was discarded for comparative purposes.
A strong correlation (0.99) between Process Maturity and Best Practice Conformance was
expected, as the first is reliant on the second. What was, however, surprising for the researcher was
the near perfect negative correlation between Customer Satisfaction and Best Practice
Conformance (-0.99) and the perfect negative correlation between Customer Satisfaction and
Process Maturity (-1). A negative correlation is expected between Customer Satisfaction and Best
Practice Conformance, and between Customer Satisfaction and Process Maturity, as the survey tool used
for determining Customer Satisfaction rated a two as better than a three,
whereas a Best Practice Conformance rating of 80% is better than one of 20% and a Process
Maturity rating of Level-3 is better than one of Level-1. Customer Satisfaction thus improves as
the rating decreases, while Process Maturity and Best Practice Conformance improve as the rating
increases.

Figure-32: Correlation Coefficient calculated for Assessment and Survey results.

Strong correlations were also found between the number of Calls Logged per User per Quarter
and Customer Satisfaction (0.9), Best Practice Conformance (-0.82) and Process Maturity (-0.9).
Both Customer Satisfaction and the number of Calls Logged per User per Quarter improve as the
ratings or data show lower values (hence the positive correlation between them), while Calls Logged
per User per Quarter should show a negative correlation to Best Practice Conformance and Process
Maturity. Figure-32 also outlines whether a positive or a negative correlation supports the view that the
implementation of ICT Best Practice in the form of ITIL contributes to Customer Satisfaction
and Operational Effectiveness.
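The sketch below reproduces two of the reported coefficients (and their R² values used in the next subsection) from the quarterly aggregates in Figure-31. The original calculations were done in Microsoft Excel; this Python version is an equivalent illustration and mirrors the first formula given above.

```python
# Reproduce two of the correlation coefficients reported in Figure-32 from the
# quarterly aggregates in Figure-31 (the original work used Microsoft Excel).
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    num = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    den = sqrt(sum((a - mean_x) ** 2 for a in x) * sum((b - mean_y) ** 2 for b in y))
    return num / den

customer_satisfaction = [3.0, 2.0, 1.5]   # SERVQUAL aggregate (lower = better)
process_maturity      = [0.0, 1.0, 1.5]
calls_per_user        = [8.7, 7.3, 3.6]

r1 = pearson_r(customer_satisfaction, process_maturity)
r2 = pearson_r(customer_satisfaction, calls_per_user)
print(round(r1, 2), round(r1 ** 2, 2))    # -1.0 1.0  (perfect negative correlation)
print(round(r2, 2), round(r2 ** 2, 2))    #  0.9 0.81
```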


The above correlation coefficients are, however, unusually high and may be considered suspect.
One factor that may have contributed is that subjective measures (customer
satisfaction) were converted to an integer response (one to five) for simplicity and ease of analysis;
customers' view of service quality is, rather, expressed on a continuous scale falling
anywhere between one and five, or one and one hundred for that matter. It was already indicated
that the rating scale used proved limiting and should in fact have left customers more freedom to
express their perception of service quality. A similar situation existed with the other assessments
(although that scale included half numbers); the unusually high correlation coefficients would
most probably vary slightly if the rating scales offered more choice.
The second factor that may have contributed towards the high correlation coefficients is that
only three sets of data were collected during the research project; more frequent surveys
would most likely still have shown a strong correlation (but not 1 or -1) and would have looked more
believable.
As stated previously, however, not only should the content validity of research be considered but
also construct validity, drawing on a number of observations and theoretical constructs
(Luthans, 1989). The results of the primary research tools (surveys and assessments) do
seem to be backed up by the secondary tools (observations, interviews etc.); what is thus important
is not the absolute value of the correlation coefficient but rather whether the correlation coefficient is
high or not.

Figure-33: Coefficient of Determination (R2) of Aggregate Assessments.


The square of the correlation coefficient also proves to be a very helpful measure, as the
coefficient of determination (R²) can be expressed as a percentage, making it friendlier
to use for comparative purposes (Blanchard et al, 1999). The coefficient of determination
does not show whether the correlation is positive or negative, as the square of a negative number
is always positive; this is not significant as long as the direction of the relationships is explained, as in
Figure-32. Figure-33 shows the coefficient of determination for the factors rated; shaded cells show a
remarkably strong correlation.
From Figures 32 and 33 it thus follows, for the data collected at the research site, that:
1) 97% of the variation in Customer Satisfaction at the research site is explained by the
improvement in Best Practice Conformance.
2) 100% of the variation in Customer Satisfaction at the research site is explained by the
improvement in Process Maturity.
3) The substantial reduction in the number of calls logged per user per quarter:
a) explains 81% of the variation in Customer Satisfaction, and
b) is itself explained (81%) by the improvement in Process Maturity.

5.4 General Observations


Executive and technical managers are in agreement that the implementation of Best Practice
made a demonstrable difference in the research environment. Executive managers are reasonably
satisfied with service provision, and senior management as a whole are satisfied with service
provision and say service has improved over the research period (Figures-22 and 26).
Managers and the researcher have seen a reasonable improvement in skill levels and in the
understanding that processes contribute towards better performance. Technology Services is
notably more responsive (proactive) towards customers and has a better understanding of
customer needs, although this is still far from adequate. Customer and supplier management improved;
once again, a lot remains to be done.
A number of issues hamper the effectiveness and efficiency of the organisation. Although
processes are in place, adherence to process is weak, work-streams within the division are still
islands, consistency is lacking, measures and controls seem to be lacking, governance is weak in
general and a more integrated approach is needed. The technology organisation can also do much
to improve communication and responsiveness, although these have improved substantially.
To understand the factors that hamper improved Process Maturity, one needs to understand the
Mandatory Conditions that need to be met before the organisation can progress to the next level
of maturity. As is evident from the large gap between Process Maturity and Best Practice
Conformance, it is of little value to concentrate on activities (Conformance) alone, as these activities
cannot be sustained in a standalone fashion.
Mandatory Conditions largely focus on management commitment, adherence to process,
cultural and environmental factors, quality of service and integration between processes. Appendix-G
outlines the mandatory conditions for the processes and function implemented to reach a
maturity level of three and a half (3.5), the medium-term goal of the Service Management Team.
(The team further set an objective of ensuring that all the other processes are implemented and have
reached maturity levels of two (2) by the end of 2004, although this dissertation will not focus
on these processes.)
Reaching these objectives will not be possible without the unwavering and demonstrable commitment
of management towards ITIL and Information Technology Service Management Best Practice
as a whole. Although management states that it is committed to the implementation of ITIL
(corporate objectives), failure to adhere to implemented processes is overlooked, some
individuals are excused from adherence and reward systems do not support this commitment.

5.5 Conclusion - Research Results


Conducting research, especially on a single site, cannot conclusively prove a relationship or result
at another site or organisation. Research, though, is helpful to all who struggle with similar
questions, challenges and problems. This research was limited to the Shared Service Centre and
specifically the first phase of its development; conclusions cannot be drawn as to the state of
Service Provision, Customer Satisfaction, Technical Competence, Best Practice Adherence or

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

c u-tr a c k

N
y

Process Maturity for any other site or for other phases of the development of the Shared Service
Centre.
The research results strongly indicate that (at least at the research site):
1. there is a direct correlation between the level of Customer Satisfaction and the use of ICT
Service Management Best Practice in the form of ITIL, and
2. Customer Satisfaction is an indication of effective ICT service provision in the
organisation.
This research is also an important contribution to the field of ICT Service Management Best
Practice (especially ITIL) as, to the researcher's knowledge, it is the first study of this nature and
the first to demonstrate a positive connection between ITIL and Customer Satisfaction, a connection
previously only inferred.
A number of other observations also follow from this research project, namely:
1. ICT Service Management Best Practice in the form of ITIL tells the organisation 'what
to do'. Service Management Best Practice cannot be viewed in isolation; the question of
broader Best Practice principles needs to be addressed, including:
a. Quality Management,
b. objectives, measurement and control (CobiT™ and the Balanced Scorecard were
addressed in this dissertation, but others may also prove to contribute to the
effectiveness and efficiency of Information Technology organisations),
c. Organisational Change Management, and
d. the cultivation of a Service Culture.
2. As with any implementation, unwavering management commitment is needed for success.
3. Strategic direction and a well-defined strategy are prerequisites for success.
4. Change is a process, and organisations should make sure that it is well managed and that
they do not attempt changes beyond their means.
As a result, the researcher thought it prudent to make the recommendations that follow.


5.6 The Operational Context


It is clear from the research results, the observations and the results of using Potgieter's paradigms
that recommendations regarding structure, management style and organisational issues will be
complex and have to take cognisance of a number of factors.
5.6.1 Objectives and Goals
The stated goals and objectives of Technology Services that relate to this dissertation are shown
in Figure-34. The reader should familiarise himself with these goals and objectives, as the findings at
the research site clearly indicate a disconnect.
The reality is that the picture in the organisation does not conform to these goals and
objectives. The reasons for the misalignment, based on interviews and observations, can be found in
the following:

- Weak general business skills.
- Weak process integration and adherence to process, in short a lack of Quality Management
(confirmed by assessments done by Foster Melliar and Meta Group).
- Wanting to be all things to all men: the organisation is trying to run before it can walk,
thus over-promising and under-delivering.
- Bad people management: the organisational structure is not finalised, there is a lot of
uncertainty amongst especially middle managers and their staff, the 'new' culture in the
organisation is unfamiliar and HR change management is non-existent.
- Decision making is not devolved and lies in the hands of the executives, who are too far
removed and too busy to make timely decisions; they have to get up to date before
meaningful decisions can be made, and this takes time.
- New management initiatives are a huge paradigm shift for most employees and managers
alike.


GOAL 1: TO PROVIDE A FLEXIBLE AND ADAPTABLE ICT INFRASTRUCTURE THAT MEETS THE BUSINESS NEEDS OF THE ORGANISATION
- Objective 2. To transform the IT service management processes into ITIL best practice processes
- Objective 3. To transform the Organisation's technical platform into a secure and optimal environment
- Objective 4. To establish quality assurance mechanisms for IT changes

GOAL 4: TO IMPLEMENT A SOUND DECISION MAKING PROCESS FOR MAJOR ICT INVESTMENTS
- Objective 1. To implement an ICT governance structure and processes within the Organisation and ensure that they are operational

GOAL 6: TO INCREASE AND RETAIN ICT HUMAN CAPITAL CAPACITY IN THE ORGANISATION THROUGH INNOVATIVE PROGRAMMES
- Objective 1. Increase capacity through training of employees on essential technologies required within the Organisation

Figure 34: Goals and Objectives relating to this Dissertation.

5.7 Context for implementing Service Management Best Practice


The emergence of the information era made obsolete some fundamental assumptions of the
industrial era: organisations can no longer gain competitive advantage by deploying new
technology and managing financial assets only. Competitive success requires new capabilities,
namely exploiting and managing intangible assets more than managing tangible assets
(Itami, 1987), thus having to focus on customer relationships, innovation, providing customised
high-quality products and making the best possible use of technology, employee skills,
motivation, reward systems, process capabilities, response times and quality.
Breakthrough performance, however, requires major change, including change to the measurement and
management systems used by the organisation.


The Balanced Scorecard provides management with the instrumentation to navigate to future
competitiveness. While it retains the emphasis on financial objectives, it also includes the long-term
drivers of this objective: the other three perspectives (although the organisation may elect to
add more) of customers, internal process, and innovation, learning and growth.
Kaplan and Norton (1996, p.30) summarised the above concepts when they said that a strategy
is a set of hypotheses about cause and effect; the measurement system should make the
relationships among objectives (and measures) in the various perspectives explicit so that they can
be managed and validated. Thus, a properly constructed Balanced Scorecard should tell the story
of the business unit's strategy. It should identify and make explicit the sequence of hypotheses
about the cause-and-effect relationships between outcome measures (Key Goal Indicators in
CobiT™) and the performance drivers (Key Performance Indicators in CobiT™) of those outcomes. Every
measure selected for the Balanced Scorecard should be an element in the chain of cause-and-effect
relationships that communicates the meaning of the business unit's strategy. Figure-35
below illustrates the Balanced Scorecard perspectives (a fifth perspective, Quality, was added)
in relation to cause-and-effect relationships, first defined by Heskett et al (1994). Heskett's
'service-profit chain' was adapted to align to the major goals and objectives of Technology Services.
Appendix-N discusses each of the now five Balanced Scorecard perspectives in greater detail and
serves as the backdrop to the specific recommendations made in this section.


Figure-35: Operating and Service Delivery Strategy for Technology Services (adapted from Kaplan and Norton, 1997). The diagram arranges a cause-and-effect chain across five perspectives (Financial, Customer, Internal, Learning and Quality), linking elements such as: effective and valued service delivery at optimal cost; improved efficiency thereby reducing cost; improved effectiveness thereby reducing risk; increased customer confidence; customer satisfaction through superior execution; understanding customer needs; offering services that are valued by customers; rapid response to needs (service culture); operational excellence based on Best Practice; quality products, services and processes; increased employee productivity; developing strategic skills; access to strategic information; aligning personal goals to organisational goals; satisfied employees; and internal service quality.


5.8 Specific Recommendations


ICT Service Management must make a difference to the whole organisation, and it must make the
business processes more efficient and effective. ICT does not operate in a vacuum, and neither does
Service Management. Figure-36 shows the relationship of Service and Quality Management to
the broader organisation.

Figure-36: ICT Service and Quality Management and the Business Environment (Source: OGC). The diagram relates four elements:
- People: culture, attitude, beliefs and skills
- Leadership & Management: strategy, steering, direction and integration
- Process: ICT Service Management and Quality Management
- Technology: tools and infrastructure
How the business sees Technology Services achieving its objectives needs to be understood, e.g.
business efficiency, cost reduction in Service Delivery, increased Customer Satisfaction or a more
reliable information technology service to support business-critical services, processes or
deliverables, as well as considering the current quality of information technology services and
possible quality improvement programmes.
Businesses are becoming increasingly aware of the importance of ICT in not only supporting, but
also enabling, the business to achieve its objectives. By implication this means (OGC(2), 2002)
that ICT is seen as an enabler of business change and an integral component of the business
change programme; the focus needs to be on the quality of ICT in terms of reliability, availability,
capacity and security.
Technology Services finds itself in a position where it will have to realise and manage business-enabling technology and services, deliver the quality demanded by the business, and
demonstrate value for money. The technology component within e-business is not only
supporting the primary business processes, but is part of the primary business processes.
Managers and staff need to understand business operations and advise the business on the possibilities and limitations of technology. They need to realise that the organisation must not only accommodate more technology change, but also reduce cycle time, guarantee the quality of service delivery and support while absorbing more technology, and ensure that the quality of delivery and support matches the business use of new technology, all while bringing escalating costs under control.
Many organisations feel that ICT service provision leaves a lot to be desired and that quality levels and work methods are not effective; if Technology Services wants to support the business transformation, it needs to undergo a similar transformation process (OGC(2), 2002).
The support of management during this dramatic period of change is cardinal, as is leading by example: if management does not support the use of best practice openly and demonstrably, or is not fully committed to change and innovation, staff cannot be expected to improve themselves, their processes or their service to Customers (OGC(2), 2002).
5.8.1 Implementing a Quality Management System
Quality Management for ICT services is a systematic way of ensuring that all the activities necessary to design, develop and implement services satisfy the requirements of the organisation; it is the way an organisation plans to manage its operations in order to deliver the quality services specified by its Quality Management System. The Quality Management System defines the organisational structure, responsibilities, policies, procedures, processes, standards and resources required to deliver quality IT services. However, a Quality Management System will only function as intended if management and staff are committed to achieving its objectives (OGC(2), 2002).
The approach recommended is the adoption of ISO 9001:2000; this approach complements ITIL well, as ITIL itself lacks a system that ensures measurement, analysis and improvement. Figure-37 outlines the activities of ITIL and ISO 9001:2000 and their alignment to normal operational processes.


Figure-37: Quality Management Process vs. ITIL.


Deming (1986) highlights, amongst others, that management needs to give attention to breaking down barriers between silos in the organisation and realise that process improvement requires commitment from the top; good leaders motivate people to improve themselves and therefore the image of the organisation. The emphasis should be on continual and constant improvement, supported by an education and self-improvement programme and on-the-job training, while constantly communicating one message: 'Transformation is everyone's job', with the emphasis on teamwork and understanding.
The Quality Management System is a vital tool to ensure conformance to policies, procedures,
standards and continuous improvement.
5.8.2 Effectively Control and Manage Objectives
CobiTTM's use of the Balanced Scorecard makes for a simplified, guided implementation of the Balanced Scorecard; it is well aligned to the ICT environment and integrates well with ITIL. Measurements in CobiTTM, from an operational/infrastructure perspective, are an identical match to the processes when implementing the ITIL and ISO 17799 (Security Best Practice) processes
jointly; it would thus be very easy to use the CobiTTM framework as a guide for implementing the Balanced Scorecard in the Technology Services environment.
IT Control Objectives are in essence a management practice that aims to ensure that effort leads to the achievement of business objectives. From this dissertation it is clear that operational inefficiencies are not the result of a lack of effort or planning, but rather the result of inappropriate alignment of efforts, which in turn resulted from a lack of measurement, analysis and control.
The recommended improvements should thus focus on measuring performance against business objectives, in particular the organisation's focus on value-add to its customers, resulting in Customer Satisfaction (also see Figure-35 and Appendix-N on the Balanced Scorecard perspectives). Efforts should centre on planning and measurement, and on setting processes in place to ensure that non-conformance results in corrective action (Quality Management System).
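To make this concrete, the minimal sketch below shows how scorecard measures could be compared to targets and how any non-conformance could be turned into a corrective-action item. The perspectives, measure names, targets and actuals are hypothetical illustrations only; they are not drawn from the organisation's actual scorecard or from CobiTTM itself.

# A minimal, hypothetical sketch: compare scorecard actuals to targets and
# raise corrective-action items for any non-conformance. All names and
# numbers below are invented for illustration.

scorecard = {
    "Customer":  {"customer satisfaction (%)": (85, 78)},          # (target, actual)
    "Internal":  {"incidents resolved within SLA (%)": (90, 92)},
    "Financial": {"cost per call (Rand)": (120, 135)},             # lower is better
    "Learning":  {"staff trained on ITIL (%)": (80, 80)},
}

LOWER_IS_BETTER = {"cost per call (Rand)"}

def corrective_actions(scorecard):
    """Return (perspective, measure, target, actual) for every measure not met."""
    actions = []
    for perspective, measures in scorecard.items():
        for measure, (target, actual) in measures.items():
            met = actual <= target if measure in LOWER_IS_BETTER else actual >= target
            if not met:
                actions.append((perspective, measure, target, actual))
    return actions

for perspective, measure, target, actual in corrective_actions(scorecard):
    print(f"Corrective action needed - {perspective}: {measure} is {actual} against a target of {target}")

In a Quality Management System the output of such a review would feed a corrective-action register with an owner and due date for each item.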
5.8.3 Matrix Structure
To ensure that measurable objectives could be set and quality objectives could be met, the current
silos in the organisation need to be broken down. Process owners need to measure and achieve
process targets across the organisation. This is currently very difficult as process owners report to
functional owners and other functions do not see the need to follow or conform to the process.
The proposed change is to move the organisation to a matrix structure, ensuring that processes apply to all functions. Kakabadse et al (1988, p.333) contend that 'Matrix structure management is needed when an organisation has developed a somewhat unresponsive and inflexible culture, possibly ... leading to a product structure or divisionalisation, but finds that various specialists need to be incorporated more closely into day-to-day operation of the organisation. Matrix management is a way of generating greater flexibility over task activities and more market responsive attitudes amongst managers and specialists in the organisation' (emphasis added).
Although matrix operation is already evident in the organisation, roles and responsibilities need to be legitimised. For a matrix organisation to work, management needs to show commitment to process owners and ensure that functional owners allow staff members to report to process owners on process-based activities.


With a change in structure, the culture of the organisation also needs to change. The proposed structure is extremely flat; this necessitates ensuring that staff members are empowered and that authority is devolved to the lowest possible level.
5.8.4 Service Culture
Although Customer Satisfaction is a strategic objective of the Shared Service Centre, it is the growing awareness of becoming an 'accountable' business unit that has led to management's recognition of the importance of its role as service provider. In order to achieve true business success, the concept of service should permeate all layers of the organisation.
The term 'culture' is used to refer to the values and beliefs of the organisation, including the way authority is exercised, the way people are rewarded, methods of communication and the extent to which procedures and regulations are enforced. Culture should support the objectives of the organisation, in this instance Customer Satisfaction.
All participants in service provision thus need to understand the demands of Customers and
understand how their actions can influence Customer Satisfaction.
For a service culture to be successful, management must publicly demonstrate that the objective of Customer Satisfaction is important, show support for cultivating a service culture by attending meetings and encouraging staff to take part in meetings and training, and reward contributions to Customer Satisfaction or quality improvement.
What customers can expect needs to be clearly communicated. Service Level Managers also need
to constantly communicate with customers, making sure that customers understand what they can
expect. Customers can only be satisfied if they believe they get what was promised.
5.8.5 Managing Change
Organisational culture comprises the ideas, corporate values, beliefs, practices, expectations about behaviour and attitudes that are shared by employees and management in the organisation. The proposed organisational changes can affect people, and the way they feel can even lead to a split within the organisation. This division can produce vagueness, public resistance, cynicism and even sabotage; where these symptoms are displayed, knowledge, experience and energy remain unused or may be lost forever. However, in times of radical change, emotions could just as easily
be channelled to benefit the change. Feelings have an important impact on the outcome of organisational change.
The implementation of any change is a strategic choice, recognising that the business is skills- and people-dependent. Implementing any form of change will not be easy; without determination, tenacity and patience the change initiative will fail. Managers need to assess two fundamental elements:
1. To what extent are the organisation's policies and practices internally consistent, and
2. To what extent are the organisation's policies and practices externally consistent, i.e. do they lead to the outcomes Best Practice promises?
It is imperative that all employees perceive a need to do things differently, otherwise nothing will happen.
5.9 Conclusion
The 'Process Integration Model' in Figure-38 summarises the recommendations made in this dissertation; the framework of the model is based on the European Foundation for Quality Management's Excellence Model (OGC(2), 2002).

Figure-38: The Process Integration Model - An Overview of Recommendations


In its one year of operation Technology Services has achieved a lot; the basic foundation blocks are in place. Efforts should now focus on consolidation. The recommendations above (measuring outcomes against performance objectives, adapting the organisational structure, developing people, process improvement, quality management, change management and cultivating a service culture) should ensure that the organisation reaches the next level of maturity and its overall objectives.
Technology Services has achieved a lot during the research project: a number of critical issues were addressed, the organisation has matured somewhat and at least seven of the ITIL processes are fairly well entrenched in the organisation. What is now important is to ensure that momentum is maintained and that every effort is made to implement the whole framework and mature all processes in the organisation. Judging from the comments made by managers, and the fact that Process Maturity and Best Practice conformance did not improve as significantly in the third quarter of the research period, the alarm bells should be heeded. Technology managers responsible for ITIL processes consistently quoted weak governance and poor adherence to process as the reason the organisation fails to improve more significantly, and stated that this is a source of great frustration. The fact that communication is lacking and that a silo-based approach to business persists reinforces this observation. Governance needs to ensure that processes are adhered to by all employees of the organisation and that controls and measurements are in place to highlight non-conformance and to ensure that corrective action is taken, failing this, disciplinary action.
As stated in the sub-section dealing with change management, failure to 'unite' all work-streams behind the rallying cry of Service Management Best Practice can quickly lead to losing the ground gained. Implementations in various organisations have failed, in spite of making early gains, because momentum was lost. Lost momentum is not merely slower progress; it is the first sign of ultimate failure. It should also be noted that it is easier to change behaviour by changing process, structure and systems than to change attitudes or corporate culture (Armstrong, 1996).
Now more than ever, momentum with regard to implementing Best Practice needs to be maintained, if not stepped up. Ensuring follow-through is of vital importance if the organisation wants to reap the rewards Best Practice can bring and continue to mature its processes. Proper change management is a prerequisite for success.

The most important factor to ensure success remains that management publicly demonstrates its support for the above initiatives. Hammer (1996, p.22) states that 'a fourth common mistake is to attempt [change] without the requisite leadership. Strong, committed, executive leadership is the absolute sine qua non ... Only a senior executive who deeply believes in [the change process] can actually make it happen ... [Without executive sponsorship] you are dead, but you may not know it!'
The relationship between Service Management and Customer Satisfaction in the Shared Services Centre was demonstrated in this dissertation. Customer Satisfaction is the major objective of the organisation; it is then a logical conclusion that the continued implementation of ITIL, backed up by a Quality Management System, should be the number one priority of the management of Technology Services and the Shared Services Centre. As Michael Hammer puts it, stopping short of achieving the objective set is just not an option.

'The Tao (Way of Management) means inducing the people to have the same aim (objectives, strategy and goals) as the leadership, so that they may share life (successes) and death (failures) without fear of danger (thus learning from all experiences).' Sun Tzu, The Art of War, as translated by Cleary (1988, p.43); this researcher's interpretations in brackets.


CHAPTER 6
THEORY AND PRACTICE

6.1 Critique on Literature and Theory


During the research period the researcher found that Customer Satisfaction and Service Quality are in many instances used synonymously, as are Quality Management and Best Practice (Zeithaml et al., 1988; van Iwaarden, 2002; Slack, 2001; Wood et al., 2000; Kerklaan quoted in Mastenbroek, 1991). This observation is substantiated by Dale (1999), who observed that satisfied customers are a product of a quality service, and by Kerklaan, who states that 'constantly high service quality can only be obtained when it is backed up by good internal organisation, systems, processes and procedures' (Kerklaan in Mastenbroek, 1991, p.51); Collier (1987) summarises this by saying that Service Management's purpose is to ensure the delivery of a quality service that satisfies the customer's needs or expectations within the organisation's financial means.
The general objectives of Service Management, Best Practice and Quality are the maximisation of profit and the delivery of a quality service to customers, making use of the limited resources available to the organisation. By implication we can say:
Best Practice provides the framework for implementing measures to ensure that a quality product or service is delivered that meets the expectations of customers, thereby ensuring that customers are satisfied.
These 'measures' are twofold:
1. Best Practice, which concentrates on the 'what to do' or 'doing the right things', thus focusing on efficiency, and
2. Quality Management, which ensures that it is done and that the operational environment is conducive to getting the 'how to' and 'what to' done, or 'doing things right', thus focusing on effectiveness.
Looking at the texts quoted in this dissertation, both Best Practice and Quality have Customer Satisfaction as their aim. These factors are, however, not the only contributors towards Customer Satisfaction.
Indeed, it is (or should be) the aim of the organisation as a whole to satisfy customers.
It is also clear that satisfying customers is not a simple task, as much of a customer's perception of quality and value is subjective, can be based on past or recent experiences and is thus not objectively justifiable in many instances. The employees of the service provider can therefore largely shape how customers feel about the quality of service and whether they are satisfied. This can be done in a number of ways: shaping behaviour in line with the SERVQUAL factors, and communicating what was delivered and what corrective actions were taken.
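As an illustration of what working with the SERVQUAL factors can mean in measurement terms, the minimal sketch below calculates a gap score (perception minus expectation) for each of the five SERVQUAL dimensions. The dimension names follow Parasuraman, Zeithaml and Berry, but the scores are hypothetical and are not taken from the surveys in this research.

# A minimal, hypothetical SERVQUAL gap calculation (gap = perception - expectation);
# a negative gap indicates that perceived service falls short of expectation.

expectations = {"tangibles": 4.2, "reliability": 4.8, "responsiveness": 4.6,
                "assurance": 4.4, "empathy": 4.1}
perceptions  = {"tangibles": 4.0, "reliability": 4.1, "responsiveness": 3.9,
                "assurance": 4.3, "empathy": 4.0}

gaps = {dim: round(perceptions[dim] - expectations[dim], 2) for dim in expectations}
overall = round(sum(gaps.values()) / len(gaps), 2)

for dim, gap in sorted(gaps.items(), key=lambda item: item[1]):   # worst gap first
    print(f"{dim:15s} {gap:+.2f}")
print(f"{'overall':15s} {overall:+.2f}")

The dimension with the most negative gap is where service behaviour and communication would need the most attention.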
The Hourglass Model (Figure-39) aims to put the issue of Customer Satisfaction in context within the environment in which it is achieved.

Figure-39: The Hourglass Model of Customer Satisfaction


The Hourglass Model thus contends that planning and setting objectives for service delivery in line with the customer's specification and expectation underpin the delivery of a quality service. Focusing on individual needs, relating delivery back to what can be expected or what was agreed, and staff's commitment to serve the customer together form the basis of the customer's perception of the quality and value received, which in turn determines Customer Satisfaction.


Customers' expectations and specifications are derived from their strategy, whereas the organisation's objectives and delivery model are based on the operational environment and its strategy.
Quality Management is a major enabler of reaching the objectives and strategy of the organisation; Cox and Dale (2001) state that quality is the key element in business achievement and that without attention to quality the organisation will fail to deliver the appropriate service levels, resulting in dissatisfied customers.
The commitment of executive managers to Quality and Best Practice is paramount; Wilson et al (2000) regard a formal quality management programme as concrete evidence of an organisation's commitment to client service, continuous improvement and the provision of a quality service.
This commitment should thus actively focus on the pursuit of world-class performance (Best Practice), using principles employed by the most successful organisations to manage and organise their operations. Managers should also realise that Best Practice is a moving target: as the leading organisations continue to move the best-practice 'goalposts', the organisation should not only follow but also constantly strive to improve on its own, thus contributing to the greater body of knowledge of Best Practice for its specific industry and constantly moving forward. Best Practice and Quality are about continuous improvement and are integral to the achievement of Customer Satisfaction and other organisational goals.
The dilemma of the researcher, as stated before, was that very little material, especially academic material, exists on the topic of ICT Service Management Best Practice. The ITIL framework seems to be the de facto standard for ICT Service Management Best Practice, as is evident from the sheer number of vendors and users that have adopted the framework. ITIL is a collaborative effort of industry and the UK government and forms the basis of the de jure standard for ICT Service Management Best Practice, BS15000 (http://www.itsm.org.uk)11.
The researcher concludes that the sheer number of supporters and adopters and industry
contributors towards ITIL, the fact that the framework is living (as proven by the second version
published in 2000, thus complying with the main requisite of Best Practice) and the fact that the
researcher has proved in an operational environment that ITIL contributes towards operational
effectiveness and Customer Satisfaction seem reason enough to accept the framework as valid.
11. Also see http://www.itsmf.com/news/news.asp?NewsID=191


As stated before, some academics have criticised the ITIL framework as too restrictive and as not going beyond infrastructure management. Thiadens (2002) comments that three phases are visible in Service Management development: the 1980s - structuring of organisations (ITIL); the 1990s - provision of functions; and 2000 and beyond - extension and growth. Thiadens further comments that the accent in the field of Service Management has moved from organising internal services to directing services towards improved performance.
The researcher, however, does not agree that the focus has moved, or indeed that it is a time-bound phenomenon; it is rather a sequential progression (growth) based on organisational maturity. One first has to ensure that a firm foundation exists (structuring the organisation) before the organisation can focus on extension and growth. We should also remember that ITIL stands for Information Technology Infrastructure Library, hence the focus on infrastructure; it is not the 'be all and end all' and fits this purpose only. Its use does not preclude organisations from adopting other Best Practice frameworks that deal with the extension and growth phases of Service Management Best Practice (e.g. programme and portfolio management), based on organisational maturity.
The reasons that the researcher introduces CobiTTM and the Balanced Scorecard are twofold:
1. CobiTTM is action-oriented and generic, and provides management direction (getting the information-related processes under control), monitoring of the achievement of organisational goals and monitoring of performance within each process. It makes use of the Balanced Scorecard and focuses on three indicators/factors:
a. Critical Success Factors, which define the management-oriented implementation guidelines to achieve control over and within the ICT processes,
b. Key Goal Indicators, which tell management after the fact whether a process has achieved its business objectives, and
c. Key Performance Indicators, lead indicators that define measures of how well processes perform (enabling the goal to be reached).
CobiTTM thus ensures that a framework exists in which the goals and objectives of the organisation are operationalised and measured. A brief illustration of how these indicators relate follows after the second reason below.


2. The parent organisation has embraced the principles of the Balanced Scorecard; as CobiTTM contextualises the Balanced Scorecard for an ICT environment, it thus seems to be a logical fit.
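To make the distinction between the indicator types concrete, the minimal sketch below models one ICT process with lead measures (Key Performance Indicators, checked while the process runs) and a lagging measure (a Key Goal Indicator, checked after the fact against the business objective). The indicator names, targets and values are hypothetical and are not taken from CobiTTM or from the research site.

# A minimal, hypothetical sketch of lead (KPI) versus lag (KGI) measurement
# for a single ICT process; all names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    target: float
    actual: float

    def met(self) -> bool:
        return self.actual >= self.target

kpis = [  # lead indicators: how well the process is performing
    Indicator("first-call resolution rate (%)", 70, 74),
    Indicator("changes implemented on schedule (%)", 95, 91),
]
kgi = Indicator("customer satisfaction index (%)", 80, 77)  # lag indicator

for kpi in kpis:
    print(f"KPI {kpi.name}: {'on track' if kpi.met() else 'at risk - act now'}")
print(f"KGI {kgi.name}: {'objective achieved' if kgi.met() else 'objective not achieved'}")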
Best Practice is further underpinned by Quality Management, which ensures that performance indicators are documented, understood and adhered to. It provides the framework in which organisational goals can be related to daily activities and ensures adherence to standards, processes, procedures and work instructions (Sacks, 2003). Quality Management thus also plays a major part in ensuring that objectives are set, managed and met.
The successful implementation of Best Practice is further reliant on the existence of a Service
Culture in the organisation. A Service Culture is an organisational culture which emphasises
satisfying customer requirements (Fry, 1989) and the existence of a service culture is seen as a
prerequisite for delivering a quality service (Mastenbroek, 1991).
This dissertation does not cover in detail how to entrench a service culture in an organisation; Section-5 does, however, cover some key aspects, as this was seen to be important to ensuring future success in delivering quality and valued services in the organisation.
The theory covered in this dissertation focuses on two distinct themes:
1. Theory around Customer Satisfaction, Quality and Best Practice (SERVQUAL, Service
Management, ICT Service Management Best Practice, ITIL, Service Quality, etc.) and
2. Measures necessary for successful implementation of Best Practice and Quality that
result in Customer Satisfaction (CobiTTM, Balanced Scorecard, Service Culture, Service
Quality, Management Commitment, Organisational Style and Structure etc.).
More time was spent on the first theme as it relates directly to the aims and objectives of this dissertation. The second theme contextualises the first, thereby setting the scene for Best Practice implementation and Quality Management.
6.2 Literature vs. the Research Environment
The research conducted indicates that a direct relationship could be found between the implementation of ICT Service Management Best Practice, as set out in the ITIL framework, and Customer Satisfaction. The use of the Best Practice framework thus contributed towards
the organisation's success in the first phase of establishing the Shared Service Centre. It is the researcher's view that the ITIL framework will continue to support the organisation's goals during future phases of implementation, consolidation and maturing.
The concepts dealt with in the Literature Review are largely applicable to the research environment. If anything, the review shows that a number of gaps need to be addressed to ensure that the Shared Service Centre can be viewed as a best-of-breed reference site, in line with the stated objective of the Chief Executive and Chief Information Officers.
Most of these issues revolve around the lack of integration between processes and functions in
Technology Services. No amount of effort or action will ensure the success of the division unless
this issue is addressed. For this reason Section-5 deals with the following topics:

- The implementation of a Quality Management System
- Establishing a Service Culture
- Review of the Organisational Structure and Management of the Organisation
- Effective Change Management
- Managing and Controlling Objectives based on Organisational Strategy, in addition to
- Ensuring that all the recommended processes and functions as set out in ITIL are implemented, followed and matured over a period of time.

6.3 Research Framework


The research framework was conceptualised as part of the Synopsis of this dissertation; it was not until later that the researcher realised that much of what was conceptualised regarding the research actually closely resembled existing gap analysis models. This realisation assisted the researcher in contextualising the framework and gave him an additional perspective on the research conducted.
6.3.1 Applicability of the Framework
The Framework was found to be effective in reaching the goals and objectives of this dissertation. The researcher feels that the Framework assisted him in analysing the data collected, establishing whether correlations between data sets exist and contextualising the findings and recommendations; the Framework thus served its purpose.


The main problem was not that the Framework was lacking or that the approach was flawed, but rather a number of other factors which, had they been in place, would have contributed to more meaningful research and research results. They are:
1. The Customer Satisfaction Survey was limited, as it proved unable to reflect smaller improvements in service quality; if a scale of 1 to 10 had been used from the outset, the researcher would not have had to resort to offset values for the last survey (a simple rescaling of the kind involved is sketched below). A scale from 1 to 5 was definitely too restrictive.
2. Measures or methods to measure the actual improvement of service delivery were very limited in the research environment; having a better Service Desk management tool implemented would have proved very helpful.
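The rescaling referred to in point 1 could be as simple as the sketch below, which linearly maps 1-to-5 responses onto a 1-to-10 scale. The mapping and the sample responses are illustrative only and are not the offset adjustment actually applied to the final survey in Chapter 4.

# Hypothetical illustration: linearly rescale 1-5 survey responses to a 1-10
# scale so that smaller shifts in satisfaction become visible.

def rescale(score, old_min=1, old_max=5, new_min=1, new_max=10):
    """Map a score from the old scale onto the new scale."""
    fraction = (score - old_min) / (old_max - old_min)
    return new_min + fraction * (new_max - new_min)

sample_responses = [3, 4, 4, 5, 3]                       # invented 1-5 responses
print([round(rescale(r), 1) for r in sample_responses])  # [5.5, 7.8, 7.8, 10.0, 5.5]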
6.3.2 Future use of the Framework
The Framework, with minor adjustments, can be used for future research projects. Suggested dimensions to be added to the Framework are a Quality dimension and a dimension evaluating internal organisational change and how employees react to changes in the internal environment.
Some work can thus still be done with regard to the practicality of using the Research Framework, and researchers who want to make use of the Framework may want to take cognisance of the above factors.
6.4 Conclusions re the purpose of the research
The objective of this dissertation as stated in Section-1-3, is to ascertain whether (1) there is a
direct correlation between Customer Satisfaction and the use of Service Management Best
Practice in the form of the ITIL, (2) if Customer Satisfaction is an indication of effective service
provision and (3) to establish the operational context best suited to implement ITIL.
6.4.1 Is there a direct link between Customer Satisfaction and Service Management Best
Practice?
Research done at the research site strongly suggests that a direct link exists between Customer Satisfaction and Service Management Best Practice as defined in the ITIL framework. A very high correlation was found between the results of the Customer Satisfaction surveys and the assessments measuring the process maturity of, and Best Practice conformance to, the Service Management processes
(see Section-5.3 and Appendix-M). Although the study was limited to a single research site, the claims made by the Office of Government Commerce as to the contribution the ITIL framework makes to satisfying customers seem to be justified. As indicated later in this chapter, further research is needed to provide solid proof.
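A correlation of this kind can be computed as sketched below on paired measurement points. The three data points are invented and do not reproduce the survey or maturity scores reported in Chapter 4 and Appendix-M; Pearson's r, used here, is only one way the association could be expressed.

# Hypothetical illustration of correlating paired measurement series.
from statistics import correlation  # Pearson's r, available from Python 3.10

maturity = [1.8, 2.4, 2.7]       # mean process maturity per assessment (invented)
satisfaction = [62, 71, 75]      # mean satisfaction score per survey, % (invented)

print(round(correlation(maturity, satisfaction), 3))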
6.4.2 Is Customer Satisfaction an indication of effective Service Provision?
Furthermore, the research at the research site seems to indicate a definite link between a tangible improvement in service delivery (fewer calls logged per customer at the end of the research period) and Customer Satisfaction, as a very high correlation was once again found between the above-mentioned tangible deliverables and the Customer Satisfaction surveys. It should be noted that the user count grew nearly threefold over the research period, and in spite of this increase the median time to resolve incidents and the number of calls logged at the Service Desk remained constant, clearly indicating that the resource pool servicing customers improved its effectiveness substantially, as this pool also remained relatively unchanged (fewer people doing more or better work) (see Section-5.3 and Appendix-M).
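The two operational measures mentioned above (calls logged per user and the median time to resolve incidents) can be derived from Service Desk records as sketched below. The quarterly figures are invented placeholders; the actual values are reported in Section-5.3 and Appendix-M.

# Hypothetical illustration of the two operational measures: calls per user
# and median resolution time. The figures below are invented placeholders.
from statistics import median

quarters = {
    # quarter: (registered users, calls logged, resolution times in hours)
    "Q1": (1000, 1500, [4, 6, 8, 12, 24]),
    "Q3": (2900, 1550, [4, 5, 8, 11, 26]),
}

for quarter, (users, calls, hours) in quarters.items():
    print(f"{quarter}: {calls / users:.2f} calls per user, median resolution {median(hours)} hours")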
6.4.3 The Operational context best suited for the implementation of ITIL.
It was noted during the research period that the ITIL framework is not a prescriptive methodology but rather a set of suggestions. While this provides great flexibility for organisations that plan to use the Best Practice framework, it leaves gaps that need to be filled. The ITIL framework is predominantly a process-based framework; practices that apply to a process-led approach thus apply equally to the framework.
As a result of this observation, a number of factors were considered and suggested to the organisation, namely:

- The implementation of a Quality Management System
- Establishing a Service Culture
- Review of the Organisational Structure and Management of the Organisation
- Effective Change Management
- Managing and Controlling Objectives based on Organisational Strategy, in addition to
- Ensuring that all the recommended processes and functions as set out in ITIL are implemented, followed and matured over a period of time.


Although specific models and methodologies were recommended in Section-5, their appropriateness should be evaluated against the backdrop of any specific organisation that plans to use the ITIL framework. The specific recommendations are not of importance as long as the planned outcomes of the alternative methodologies and models achieve the same aims. The focus should thus be on establishing a customer ethic in the organisation; quality (process, people and technology); sustainability, growth and maturity; effective change control; and ensuring that ICT activities, processes, procedures and work methods align to the strategic objectives of the organisation and that these are measurable and measured.
6.5 Models devised
A number of models were devised during the research period, based on the work of other authors, delineating relationships between their works or contextualising the work done.
The Research Framework (Figure-10) draws on existing gap models; its use differs from other models in that the purpose of the Framework is not to identify gaps but rather to investigate the effect that changes in each element have on the other elements.
The Process Integration Model (Figure-38) uses the EFQM Excellence Model as a basis to contextualise Service and Quality Management Best Practice alongside managing the outcomes of organisational objectives, making use of the Balanced Scorecard and CobiTTM. The model gives managers a cohesive view of the effective use of Best Practice and of ensuring the achievement of expected outcomes in line with organisational goals and objectives.
The Hourglass Model (Figure-39) relates the principles of Service and Quality Management, control objectives and operational issues to Customer Satisfaction, using Quality and Service Management Best Practice and the Balanced Scorecard and relating the outcomes to the cause-and-effect factors of the American Customer Satisfaction Index Model. This provides management with a quick view of what the outcomes of activities should relate to or, if issues regarding Customer Satisfaction exist, where to look for the causes and where to focus attention in order to overcome problems.


6.6 Future Research


To prove conclusively the relationship between Customer Satisfaction and ICT Service Management Best Practice, more research on a much broader scale is necessary. Future research that includes a number of research sites from a number of industry segments, covering both public and private institutions, would prove very helpful and should conclusively establish the relationship between Customer Satisfaction and ICT Service Management Best Practice.
As stated in Section-5, the validity of research based on three sets of data (assessments and
questionnaires) is far from ideal; measurements should continue at the research site over a longer
period of time and more frequently, thus providing a more substantive base for statistical
analysis.
Even though the limited research indicates a definite relationship between Customer Satisfaction and ICT Service Management Best Practice, more definitive research delineating the nature of this relationship is needed. This study merely illustrated that ICT Service Management Best Practice has Satisfied Customers as one of its outcomes; how this outcome was obtained was not illustrated at all. One also needs to consider that not all of the recommended processes/disciplines of the chosen Best Practice framework (ITIL) were implemented during the research project, thus the full effect of using the framework could not be judged. Future research should make sure that these factors are considered and must thus be more substantive, both in terms of scope (more frequent measurement, following all the recommendations and implementing all the ITIL disciplines) and time span (run over a number of years).
This more substantive research should also attempt to relate the contribution each of the disciplines makes towards Customer Satisfaction, or the benefits of implementing each of the disciplines, back to Service Quality, value-add, cost of ownership and the effectiveness and efficiency of the organisation. Some of the managers interviewed regarded some of the processes implemented as more beneficial than others; indeed, although they noted that processes contributed towards value-add, cost of ownership, effectiveness and efficiency, they felt that the relationship with Customer Satisfaction was unsubstantiated or at the very least unclear. This research thus merely illustrated that the framework as a whole contributes towards the satisfaction of customers.


Further research on the use of ICT Service Management Best Practice and Quality Management (especially Total Quality Management), and their relationship to CobiTTM, other Best Practice frameworks and Customer Satisfaction, will also prove helpful to the general body of knowledge of Service Management.
6.7 The value of this dissertation
Internal critics of Best Practice frameworks, and specifically of ITIL, prompted the researcher to investigate at least one of the claims made by the originators of the framework. Customer Satisfaction was chosen as it is the ultimate goal of any organisation; other benefits are thus secondary.
Very little literature currently exists on ICT Service Management Best Practice. Although the researcher could find some material on the issue on the internet (mainly commercial sites of Service Management consultants and very little academic work), the only published material found was the ITIL of the Office of Government Commerce UK. Many of the larger consulting firms have developed their own ICT Service Management Best Practice frameworks, but these are proprietary and not in the public domain. More material was found on Service Management in general, ICT Management in general, Quality Management and Management Best Practice; as a result the researcher had to draw on these bodies of knowledge, relate them to ICT Service Management and build arguments from the ground up in many instances.
Although this dissertation did not focus on the mechanics of ICT Service Management, but rather on the consequences of following one of the Best Practice frameworks, it has established valuable links and relationships to the fields of study mentioned in the previous paragraph. This dissertation should thus make a valuable contribution to the field of ICT Service Management. A lot of scope still exists for future research in this field of study, and fellow students are encouraged to consider the possibilities this field offers.


Bibliography
Papers, Books and other Academic Texts
ABPDP (1994). The Australian Best Practice Demonstration Program. As quoted in
Guidelines for the Application of Best Practice in Australian University Libraries. Department
of Education, Training and Youth Affairs, Australia.
Anderson L (1994). Espoused Theories and Theories-in-use: Bridging the Gap. Master of
Organisational Psychology Thesis, Bond - University of Old Australia.
Argyris et al (1985). Action Science. Jossey-Bass Publishers, San Francisco USA.
Armstrong, M (1996). A handbook of Personnel Management Practice (6th Edition). Kogan
Page, London UK
Berry L and Parasuraman A (1991). Marketing Services - Competing Through Quality. Free
Press, New York USA.
Berry L, Parasuraman A and Zeithaml V (1990). Five Imperatives for Improving Service
Quality - Reprint 3142. MIT Sloan Management Review Summer - 1990. Massachusetts
Institute of Technology, Cambridge USA.
Blanchard P, Brown H and Wilson D (1999). Organisational Decision Making and
Information Systems. Oxford Brooks University, Oxford UK.
Brown S, Gummesson E, Edvardsson B, Gustavsson B (1991). Service Quality:
Multidisciplinary and Multinational Perspectives. Lexington Books, Lexington USA.
Burrell and Morgan (1985). Sociological Paradigms and Organisational Analysis Elements
of Sociology of Corporate Life. Ashgate Publishing Ltd. Hampshire UK
Cleary T (1988). English translation of Sun Tzu's The Art of War. Shambhala, London UK.
Collier DA (1987). Service Management - Operating Decisions. Prentice-Hall, New York
USA. As quoted in Potgieter C (1997). Service Management of the Information Technology
Infrastructure D.Com (Informatics) Thesis. Faculty of Economic and Management Sciences,
University of Pretoria, Pretoria RSA.
Cox J and Dale BG (2001). Service quality and e-commerce: an exploration analysis.
Managing Service Quality, Emerald Group Publishing Ltd., Bradford UK
Crosby L, Pearson M and LeMay S (1998). Empirical determination of shipper requirements
for motor carrier services: SERVQUAL, direct questioning and policy capturing methods.
Journal of Business Logistics, 19 (1), 139-53. As quoted in van Iwaarden J (2002). The Quest

for Quality on the WEB, a study of user perceptions of WEB site quality - Masters Thesis. Dept.
of Marketing and Organisation, Rotterdam School of Economics, Erasmus Universiteit,
Rotterdam Netherlands.
Curtis, G (1998). Business Information Systems analysis, design and practice (p.45),
Addison-Wesley, Essex UK.
Dale BG (1999). Managing Quality (Third Edition). Blackwell Publishers, Oxford UK.
Davies H (1991). Managerial Economics - for business, management and accounting Second Edition. Financial Times -Pitman Publishing, London UK.
Deming EW (1986). Out of the Crisis. Quality, Productivity and Competitive Position.
Cambridge University Press, Cambridge UK.
Doyle P (1998). Marketing Management and Strategy Second Edition. Prentice Hall,
Hertfordshire UK.
Eccles R and Nohria N (1994). Beyond the Hype: Rediscovering the Essence of
Management. Harvard Business School Press, Cambridge USA.
Fry M (1989). Guide to IT Service Culture - Unit 2, Service Level Management. Protocol
Publishing. As quoted in Potgieter C (1997). Service Management of the Information
Technology Infrastructure D.Com (Informatics) Thesis. Faculty of Economic and Management
Sciences, University of Pretoria, Pretoria RSA.
Gabor A (1990). The Man who Discovered Quality. Times Books, New York USA.
Galliers R (1992). Information Systems Research - Issues, Methods and Practical Guidelines. Blackwell Scientific Publications, as quoted in Dick B (2000). A beginner's guide to action research. Available online at www.scu.edu.au/schools/gcm/ar/arp/guide.html.
Garvin DA (1988). Managing Quality. The Free Press, New York USA.
Griffith University Quality program as quoted in: Wilson, Pitman and Trahn (2000).
Guidelines for the Application of Best Practice in Australian University Libraries. Department
of Education, Training and Youth Affairs, Australia.
Shared Service Centre Technology Services (2003). Strategic Plan for 2004/5. Johannesburg
RSA.
Hammer, M (1996). The Reengineering Revolution Handbook (3rd Edition). HarperCollins.
London UK
Handy C (1999). Understanding Organisations. p.285, Penguin Books. London UK.
Hewlett-Packard (1989). The Test of Time. Brochure reprinted from the March-April 1989
issue of Measure Magazine, Hewlett-Packard, USA


Haywood-Framer J and Nollet J (1991). Service Plus: Effective Service Management. Morin.
as quoted in Markland R (1999). International Service Study, Decision Line May 1999. Dept. of
Management Science, College of Business Administration, University of South Carolina,
Columbia USA.
Herzberg F (1966). Work and the Nature of Man. World Publishing, Cleveland USA.
Heskett JL, Sasser WE and Schlesinger LA (1992). The Service Profit Chain. Free Press, New
York USA.
Heskett J et al (1994). Putting the Service-Profit Chain to Work. As quoted in Norton D and Kaplan R (1996). The Balanced Scorecard - Translating Strategy into Action. Harvard Business School Press, Boston USA.
Hirschheim and Klein (1989). Exploring the Intellectual Structures of Information Systems
Development: A Social Action Theory Analysis. In Accounting, Management and Information
Technologies, Borland et al (1996). Vol 6, nr 1/2. Elsevier-Pergamon, St Louis USA.
International Standards Organisation (1999). ISO11620 Standard, ISO, Geneva Switzerland.
International Standards Organisation (2000). ISO 9001:2000 Standard. ISO, Geneva Switzerland.
International Standards Organisation (2003). ISO 9001:2000 Quality Management System - Version 3.0. ISO, Geneva Switzerland.
Itami H (1987). Mobilising Invisible Assets. Harvard University Press, Cambridge USA.
IT Governance Institute (2000). CobiTTM Executive Summary 3rd Edition. The
Information Systems Audit and Control Foundation, Rolling Meadows USA.
IT Governance Institute (2000). CobiTTM Management Guidelines 3rd Edition. The
Information Systems Audit and Control Foundation, Rolling Meadows USA.
Jankowicz AD (1995). Business Research Projects Second Edition. International Thompson
Business Press, London UK.
Kakabadse, Ludlow and Vinnicombe (1988). Working in Organisations. Penguin, London UK.
Kaplan R and Norton D (1996). The Balanced Scorecard Translating Strategy into Action.
Harvard Business Press, Boston USA.
Kerklaan in Mastenbroek WFG (1991). Managing Quality in the Service Sector. Basil
Blackwell Ltd. as quoted in Potgieter C (1997). Service Management of the Information
Technology Infrastructure D.Com (Informatics) Thesis. Faculty of Economic and Management
Sciences - University of Pretoria, Pretoria RSA.


Lam S and Woo K (1997). Measuring Service Quality: A Test-Retest Reliability
Investigation of SERVQUAL. Journal of the Market Research Society. Vol. 39, No. 2, London
UK.
Lewis FL (1991). Introduction to Total Quality Management in the Federal Government.
Federal Quality Institute, Washington DC USA.
Llosa S, Chandon JL and Orsingher C (1998). An Empirical Study of SERVQUAL's
Dimensionality. Service Industries Journal 18(2), EBSCO Industries Inc. Birmingham USA.
Luthans F (1989). Organisational Behaviour Fifth Edition. McGraw-Hill, New York USA.
Markland R, Voss C, Chase R and Roth A (1999). International Service Study, Decision Line
May 1999. Dept. of Management Science, College of Business Administration, University of
South Carolina, Columbia USA.
Mastenbroek WFG (1991). Managing Quality in the Service Sector. Basil Blackwell Ltd. as
quoted in Potgieter C (1997). Service Management of the Information Technology Infrastructure
D.Com (Informatics) Thesis. Faculty of Economic and Management Sciences - University of
Pretoria, Pretoria RSA.
Muhlemann A, Oakland J and Lockyer K (1992). Production and Operations Management,
6th Edition. Financial Times -Pitman Publishing, London UK.
National Institute of Standards and Technology (1991). Application Guidelines: Malcolm
Baldridge National Quality Awards. Gaithersburg USA.
Neissink F (2001). The Vrije Universiteit IT Service Capability Maturity Model Version
2.1. PowerPoint Presentation, Software Engineering Research Centre, Die Vrije Universiteit,
Amsterdam Netherlands.
Niessink and van Vliet (1999). The Vrije Universiteit Service Capability Maturity Model
Technical Report IR-463, Release L2-1.0. Faculty of Sciences, Division of Mathematics and
Computer Science, Die Vrije Universiteit, Amsterdam Netherlands.
Norton D and Kaplan R (1996). The Balanced Scorecard - Translating Strategy into Action.
Harvard Business School Press, Boston USA.
Parasuraman A, Zeithaml V and Berry L (1988). SERVQUAL: A multiple-item scale for measuring customer perceptions of service quality. Journal of Retailing, 64(1), 12-14.
Peters T (1992). Liberation Management. p. 677. Macmillan. London UK.
Pfeffer J (1994). Competitive Advantage through People, unleashing the power of the work
force. Harvard Business School Press, Cambridge USA.

Porter ME (1985). Competitive Advantage - Creating and Sustaining Superior Performance. Free Press, New York USA.
Potgieter BC (1997). Service Management of the Information Technology Infrastructure
D.Com (Informatics) Thesis. Faculty of Economic and Management Sciences, University of
Pretoria, Pretoria RSA.
Organisational Department of Finance (2002). Budget all references removed.
Robson W (1997). Strategic Management & Information Systems (Second Edition). Pearson
Education Ltd., Essex UK.
Shaw B, Thomas J and Brown H (1999). Research Methodology. Oxford Brooks University,
Oxford UK.
Slack N, Chambers S and Johnston R (2001). Operations Management Third Edition.
Pearson Education Limited, Essex UK.
van Iwaarden J (2002). The Quest for Quality on the WEB, a study of user perceptions of
WEB site quality - Masters Thesis. Dept. of Marketing and Organisation, Rotterdam School of
Economics, Erasmus Universiteit, Rotterdam Netherlands.
Wilson, Pitman and Trahn (2000). Guidelines for the Application of Best Practice in
Australian University Libraries. Department of Education, Training and Youth Affairs,
Australia.
Wood et al (2001). IT Service Management Service Delivery and Support. Foster Melliar,
Johannesburg RSA.
Voss CA, Blackmon K, Chase RB, Rose E and Roth AV (1997). Service Competitiveness - an Anglo-US study. Business Strategy Review, 8(1), 7-22.
Zeithaml V, Berry L and Parasuraman A (1988). Communication and Control Processes in
the Delivery of Service Quality. Journal of Marketing, Vol. 52 (02/1988), pp.35-48.
Zeithaml V, Berry L and Parasuraman A (1990). Delivering Quality Service: Balancing
Customer Perceptions and Expectations. The Free Press, New York USA.
Zuber-Skerritt (1990). Action Research of Change and Development. C.A.L.T, Griffith
University, Brisbane, Australia. As quoted in Anderson, L (1994). Espoused Theories and
Theories-in-use: Bridging the Gap. Unpublished Master of Organisational Psychology Thesis,
University of Old. http://www.scu.edu.au/schools/grm/ar/arp/argyris.html. Accessed February
2003.


White Papers
ACSI (2002). The American Customer Satisfaction Index The Voice of the Nations
Consumers - Whitepaper. http://theacsi.org. Visited December 2002.
Dick B (2000). A beginners guide to action research - Whitepaper. Available on-line
www.scu.edu.au/schools/gcm/ar/arp/guide.html.
InterProm USA Corporation (2002). ITIL and Balanced Scorecards White Paper. Houston
USA.
Leopoldi (2002). Policy Based IT Service Management White Paper. RL Consulting,
British Columbia Canada.
Microsoft Corporation (2001). Microsoft Operational Framework White Paper. Microsoft
Corp, Redmond USA.
Pink Elephant (2002). The ITIL Story Version 3. White Paper at http://www.pinkelephant.com - accessed on March 2003.
Sacks D (2002). Quality Today Whitepaper. QI Management Systems. Johannesburg RSA.
Thiadens Th (2002). Towards Customer Focused Management of ICT Service. White Paper at http://ict-management.com.
Thiadens Th (2001). Van ICT beheer naar 'ICT Service Management' [From ICT management to 'ICT Service Management']. White Paper at http://www.ict-management.com.

Electronic Media - WEB Sites, CDs and e-Mail


Burrel G (5 February 2003). Personal communication via e-mail. (Mr. Burrel is a Best
Practice consultant of the Office of Government Commerce UK, the owners of ITIL).
Office of Government Commerce (2002). ITIL - ICT Infrastructure Management CD v2.0. The Stationery Office, Norwich, UK.
Office of Government Commerce (2) (2002). ITIL - Planning to Implement Service Management CD v2.0. The Stationery Office, Norwich, UK. Ref. used in text: OGC(2).
Office of Government Commerce (1a) (2002). ITIL - Service Delivery CD v2.0. The Stationery Office, Norwich, UK. Ref. used in text: OGC(1).
Office of Government Commerce (1b) (2002). ITIL - Service Support CD v2.1. The

Page 108 of 152

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

c u-tr a c k

N
y

Stationary Office, London, UK. Ref. used in text, OGC(1)


http://www.itsm.org.uk. Accessed January, 10 February, 16 June and 9 October 2003, (IT
Service Management Forum - UK Chapter).
http://www.balancedscorecard.org/bscit/intranet/bsc1.html. Accessed 9 October 2003 (the
Balanced Scorecard Institute, Rockville USA).
http://www.itsmf.com/bestpractice/selfassessment.asp. Accessed 23 October 2003, (IT
Service Management Forum - US Chapter).
http://www.ogc.gov.uk. Accessed 7 December 2002, 10 February and 21 June 2003, (Office of
Government Commerce - UK Government).
http://www.scu.edu.au/schools/grm/ar/arp/argyris.html. Accessed 10 February 2003, (the
Action Research Site).
http://www.sei.org. Accessed 22 November 2003. (Carnegie Mellon University, Software
Engineering Institute).

Note:

ITIL is a registered trade-mark of the Office of Government Commerce and the Crown.
CobiT™ is a trademark of the IT Governance Institute.

Appendix A: Interviews with Executive Management


This Appendix gives a sample of an interview with one of the Executive Managers; as the
research data is voluminous, only sample data are included in this dissertation.

Interview with the Chief Financial Officer
1. Do you think that the processes and procedures implemented by Technology Services are
   effective, and contribute to a more efficient ICT environment?
   a. Yes, definitely; processes can be more effective however.

2. Have you noticed a measurable improvement in ICT service provision in your work stream,
   since the inception of ICT services in June 2002?
   a. Yes, response times are definitely better and the services offered more stable.

3. What do you regard as the most meaningful improvements made by Technology Services
   to better ICT service provision since June 2002?
   a. Certainly the virus issue was addressed effectively, e-mail is now stable and
      information security has improved substantially. What I find worrying is that
      turn-around times, although they have improved, are still not what they should be.

4. Do you think formal processes and procedures like change management:
   a. add significant value to service provision in general?
      i. Yes.
   b. would be useful to implement in other work streams in the Shared Service Centre?
      i. Yes, although I think they apply more to IT or other process-driven functions;
         there are definitely lots of those in Finance.

5. What do you consider the most significant failures in terms of ICT service provision by
   Technology Services?
   a. Service request and logged incident resolution time is unacceptable; Technology
      Services needs to focus on reducing this.

6. Do you think communication from Technology Services is effective:
   a. Do you know and understand what services are offered?
      i. Yes.
   b. Do you know what additional services are planned?
      i. Mostly; if it applies to Finance, yes. If it concerns other work streams I do not
         know; knowing what other functions are planning may be helpful though.
   c. Do you know, in time, when things go wrong?
      i. No. We only know once our work is affected by a failure; we would like to know,
         however, then we can plan better.

   d. What would you suggest needs to be done by Technology Services to improve
      communications in your work stream?
      i. Plan maintenance in advance and communicate it well in advance to ensure that
         the work stream is not affected. Make sure an emergency plan is in place with a
         list of all the appropriate people that should know when things go wrong.

7. Do you think Technology Services understands your working environment and your work
   stream's needs/requirements?
   a. No; maybe it would be a good idea to invite GMs from Technology Services to
      internal meetings where problems are discussed.

8. What do you think Technology Services can do to improve the effectiveness and
   efficiency of your work stream?
   a. Yes? The approach of Technology Services still seems disjointed; the following
      issues need attention: participation of Technology Services in work-stream issues,
      and project management; in short, ensuring an integrated approach where
      responsibility is not shifted between parties.

9. Do you think Technology Services staff members are competent to support your
   organisation?
   a. Technically, yes; they lack communication skills however.

10. Do you think Technology Services has knowledgeable resources available to advise you
    how ICT can improve your work stream's effectiveness and efficiency?
    a. Currently we are all operating in crisis management mode; I don't quite know.

11. Do you think Technology Services has the needed problem-solving capability/ability?
    a. A reserved yes; some aspects of delivery are still lacking.

12. How satisfied are you, in general, with the service and support:
    a. currently provided by Technology Services?
       i. On a scale of 1 to 5 I would say 2.8, so in general I am satisfied.
    b. do you think it has improved since June 2002?
       i. Definitely yes, with a few exceptions like BAS.

Appendix-B: ICT Service Management PMF and Best Practice Assessment Results

Outlined below is a sample of one of the questionnaires (Service Level Management) used by the
researcher to determine the Process Maturity and Best Practice Conformance of the processes and
function implemented at the research site (Table-3). The survey tool (questionnaires) was devised
by the Office of Government Commerce, a UK government agency and the owner/custodian of
ITIL. The tool is based on the Capability Maturity Model of Carnegie Mellon University in
the USA and the Vrije Universiteit in the Netherlands.
Table-3: Assessment for Service Level Management
Service Level Management Assessment

M = Mandatory

Level 1 - Pre-requisites
1. Are at least some service level management (SLM) activities established within the organisation, e.g.
service definition, negotiation of SLA's etc?
2. Have you identified the customers for your IT services?
3. Are service attributes identified?
Minimum score to achieve this level: 'Y' for all mandatory ('M') questions + 1 other answer 'Y'
Level 1.5 - Management Intent

4. Has the purpose and benefits of service level management been disseminated within the organisation?
5. Has the appropriate data on which to base service levels been determined?
6. Are there agreed procedures by which Service Level Agreements are negotiated and reviewed?
Minimum score to achieve this level: 'Y' for all mandatory ('M') questions + 1 other answer 'Y'
Level 2 - Process Capability
7. Have responsibilities for service level management activities been assigned?
8. Has a catalogue of existing services been compiled?
9. Are there mechanisms for monitoring and reviewing existing service levels?
10. Does the service catalogue give a clear and accurate picture of all services being provided?
11. Are all customer service requests verified?
12. Do you have a mechanism leading to service improvement?
13. Are services prioritised in the service catalogue?
14. Do you have a mechanism for scheduling service implementations?
15. Are the majority of services covered by SLAs?
16. Have all existing SLAs been reviewed and agreed by customers?
17. Do the majority of SLAs have underpinning contracts and OLAs in place?
18. Are there mechanisms in place to monitor and measure all items in existing SLAs?
19. Are SLAs reviewed on a regular basis?
20. Are the majority of SLAs, OLAs and underpinning contracts current?
Minimum score to achieve this level: 'Y' for all mandatory ('M') questions + 6 other answers 'Y'

Level 2.5 - Internal Integration


21. Do you compare service provision with the agreed service levels?
22. Do you have a mechanism for keeping your service catalogue in line with new/changed services?
23. Do you use service records to provide management and customers with meaningful information on the
quality of service?
Minimum score to achieve this level: 'Y' for all mandatory ('M') questions + 1 other answer 'Y'
Level 3 - Products
24. Are standard service reports produced regularly?
25. Are the services and their components explicitly defined and what is excluded documented in SLA's?
26. Do SLAs have clearly identified key targets for service hours, availability, reliability, support, response
times and change handling?
27. Are service components identified as configuration items (CIs)?
Minimum score to achieve this level: 'Y' for all mandatory ('M') questions + 2 other answers 'Y'
Level 3.5 - Quality Control
28. Are the standards and other quality criteria for SLM documented?
29. Are the personnel responsible for SLM activities suitably trained?
30. Does the organisation set and review either targets or objectives for SLM?
31. Does the organisation use any tools to support SLM?
Minimum score to achieve this level: 'Y' for all mandatory ('M') questions + 1 other answer 'Y'
Level 4 - Management Information
32. Do you provide management with information concerning service targets and actual performance?
33. Do you provide management with information concerning trends in service level breaches?
34. Do you provide management with information concerning standard service offerings?
35. Do you provide management with information concerning number of requests for new/changed services?
36. Do you provide management with information concerning trends in service level request?
37. Are SLA monitoring charts provided to give an overview of how achievements have measured up to
targets? (M)

Minimum score to achieve this level: 'Y' for all mandatory ('M') questions + 1 other answer 'Y'

Level 4.5 - External Integration

38. Does SLM actively involve Availability Management regarding service levels?
39. When negotiating service levels does SLM consult other service delivery and support areas like Capacity
Management, Financial Management, Service Desk and Change Management?
40. Is SLM consulted by Change Management concerning potential impact of changes to agreed service
levels?
41. Does SLM ensure that the service catalogue is integrated and maintained as part of the Configuration
Management database (CMDB)?
42. Does SLM ensure that the incident and problem handling targets included in SLAs are the same as those
in the Service Desk tools?
Minimum score to achieve this level: 'Y' for all mandatory ('M') questions + 2 other answers 'Y'
Level 5 - Customer Interface
43. Do you check with the customer if the activities performed by Service Level Management adequately
support their business needs?

44. Do you check with the customer that they are happy with the services provided?
45. Are you actively monitoring trends in Customer Satisfaction?
46. Are you feeding customer survey information into the service improvement agenda?
47. Are you monitoring the customers' value perception of the services provided to them?
Minimum score to achieve this level: 'Y' for all mandatory ('M') questions
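
To make the scoring rule quoted throughout the questionnaire concrete, the sketch below shows how
a pass for a single maturity level can be computed from a set of 'Y'/'N' answers: every mandatory
question must be answered 'Y' and at least the stated number of the remaining questions must also
be 'Y', with the overall maturity level then taken as the highest level for which it and all lower
levels were passed. This is a minimal illustration of the rule as quoted, not the official OGC
tool; the question numbers, the mandatory flag and the roll-up rule used in the example are
assumptions made for illustration only.

    # Minimal sketch of the self-assessment scoring rule quoted above.
    # A level passes when all mandatory questions are 'Y' and at least
    # `min_other_yes` of the remaining questions are also 'Y'.
    # Question sets, mandatory flags and the roll-up rule are illustrative
    # assumptions, not the official OGC tool.

    def level_passed(answers, mandatory, min_other_yes):
        """answers: dict of question number -> 'Y'/'N' for one level."""
        mandatory_ok = all(answers.get(q) == 'Y' for q in mandatory)
        others_yes = sum(
            1 for q, a in answers.items() if q not in mandatory and a == 'Y'
        )
        return mandatory_ok and others_yes >= min_other_yes

    def overall_maturity(level_results):
        """Roll per-level pass/fail up to the highest level for which that
        level and all lower levels were passed."""
        achieved = 0.0
        for level, passed in sorted(level_results.items()):
            if not passed:
                break
            achieved = level
        return achieved

    # Example: Level 1 'Pre-requisites' with three questions, question 1
    # assumed mandatory and one other 'Y' required.
    answers = {1: 'Y', 2: 'Y', 3: 'N'}
    print(level_passed(answers, mandatory={1}, min_other_yes=1))            # True
    print(overall_maturity({1.0: True, 1.5: True, 2.0: False, 2.5: True}))  # 1.5

The same two functions can be applied to every process in Appendix-B; only the question sets,
mandatory flags and minimum counts differ per level.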

Table-4 outlines the mandatory conditions the organisation should meet, for the implemented
processes and function, to reach its medium-term goal of a Process Maturity of 3.5. A 'Y'
indicates that the condition is fulfilled and an 'N' that it still has to be met.
Table-4: Prerequisites for PMF Levels to equal 3.5
Configuration Management
Level 1: Pre-requisites
Are at least some configuration management activities established within the organisation, e.g. registering Configuration Items
(CIs)?

Have you identified some of the CI attributes, e.g. location, current status, service component relationships?

Level 1.5: Management Intent


Has the purpose and benefits of configuration management been disseminated within the organisation?

Level 2: Process Capability


Have responsibilities for various configuration management activities been assigned?

Have configuration item naming conventions been agreed?

Are there procedures for identifying, controlling, updating, auditing and analysing configuration item information?

Level 2.5: Internal Integration


Have measures been taken to avoid duplication and anomalies with CI records?

Level 3: Products
Are standard reports concerning CI information produced regularly?

Level 3.5: Quality Control


Are the standards and other quality criteria applicable for the registration of CIs made explicit and applied?

Are the personnel responsible for configuration management activities suitably trained?

Service Level Management


Level 1 - Pre-requisites
Are at least some service level management (SLM) activities established within the organisation, e.g. service definition,
negotiation of SLA's etc?

Level 1.5 - Management Intent


Has the purpose and benefits of service level management been disseminated within the organisation?

Level 2 - Process Capability


Have responsibilities for service level management activities been assigned?

Has a catalogue of existing services been compiled?

Are there mechanisms for monitoring and reviewing existing service levels?


Level 3.5: Quality Control


Are the standards and other quality criteria applicable to financial management activities explicit and applied?

Are the personnel responsible for financial management activities suitably trained?

Problem Management
Level 1: Pre-requisites
Are at least some problem management activities established in the organisation, e.g. problem determination, problem analysis,
problem resolution?

Level 1.5: Management Intent


Has the purpose and benefits of problem management been disseminated within the organisation?

Level 2: Process Capability


Have responsibilities for various problem management activities been assigned?

Is there a procedure for analysing significant, recurring and unresolved incidents and identifying underlying problems?

Is there a procedure by which potential problems are classified, in terms of category, urgency, priority and impact and assigned
for investigation?

Level 2.5: Internal Integration


Is the nature of the problem always documented as part of the problem record?

Is Problem Management responsible for the completeness of all problem records?

Level 3: Products
Are standard reports concerning problems produced regularly?

Level 3.5: Quality Control


Are the standards and other quality criteria made explicit and applied to problem management activities?

Are the personnel responsible for problem management activities suitably trained?

Change Management
Level 1: Pre-requisites
Are at least some change management activities established in the organisation, e.g. logging of change requests, change
assessments, change planning, change implementation reviews?

Level 1.5: Management Intent


Have the purpose and benefits of change management been disseminated within the organisation?

Level 2: Process Capability


Have responsibilities for various change management activities been assigned?

Are the procedures for initiating change always adhered to?

Is there a procedure for approving, verifying and scheduling changes?

Level 2.5: Internal Integration


Are all changes initiated through the agreed change management channels, for example a Change Advisory Board?

Are changes planned and prioritised, centrally or by common agreement?

Level 3: Products
Are formal change records maintained?

Is a change schedule of approved changes routinely issued?

Level 3.5: Quality Control


Are the standards and other quality criteria for the documentation of change made explicit and applied?

Are the personnel responsible for change management activities suitably trained?

Page 115 of 152

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

c u-tr a c k

N
y

Service Desk
Level 1 - Pre-requisites
Does a Service Desk exist which manages, co-ordinates and resolves incidents reported by customers?

Level 1.5 - Management Intent


Is the business need for a Service Desk clearly identified and understood?

Are sufficient management commitment, budget provision and resource available for the effective operation of the Service
Desk?

Level 2 - Process Capability


Have the functions of the Service Desk been agreed?

Do Service Desk operators have a procedure or strategy for obtaining the required information from customers whilst call
handling?

Does the Service Desk provide the customer/user with information on service availability, an incident number or reference for
use in follow-up communications, and progress updates on any request being managed by the service team?

Does the Service Desk make an initial assessment of all requests received, attempting to resolve appropriate requests or referring
them to someone who can, based on agreed service levels?

Level 2.5 - Internal Integration


Does the Service Desk provide a single point of contact for all customer queries?

Level 3 - Products
Is a single source of customer / user and supplier details maintained?

Are standard pro-forma's available for capturing customer / user details and identification?

Are the services offered by the Service Desk clearly defined for customers and other parties?

Level 3.5 - Quality Control


Are the standards and other quality criteria applicable for the registration of incidents and for call handling made clear to Service
Desk operators?

Are Service Level Agreements available and understood by Service Desk operators?

Are the personnel responsible for Service Desk activities suitably trained?

Release Management
Level 1: Pre-requisites
Are at least some release management activities established within the organisation, e.g. procedures for the release and
distribution of software?

Level 1.5: Management Intent


Have the purpose and benefits of release management been disseminated within the organisation?

Level 2: Process Capability


Have roles and responsibilities for various release management activities been assigned between operational groups and
development teams?

Are there operational procedures for defining, designing, building and rolling out a release to the organisation?

Are there formal procedures for purchasing, installing, moving and controlling software and hardware associated with a
particular release?

Are there formal procedures available for release acceptance testing?

Are explicit guidelines available on how to manage release configurations and changes to them?

Page 116 of 152

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

c u-tr a c k

N
y

Level 2.5: Internal Integration


Are all CI's within a release traceable, secure and do procedures ensure that only correct, authorised and tested versions are
installed?

Are CI records for a release kept in alignment with physical CI movements for the release?

Level 3: Products
Are there release naming and numbering conventions?

Are plans produced for each Release?

Are back-out plans produced for each Release?

Are test plans, acceptance criteria and test results produced for each Release?

Is there a library containing master copies of all controlled software within the organisation?

Level 3.5: Quality Control


Are the standards and other quality criteria for release management made explicit and applied?

Are the personnel responsible for release management activities suitably trained?

Incident Management
Level 1: Pre-requisites
Are incident records maintained for all reported incidents?

Are incidents currently assessed and classified by the Service Desk prior to referring them to a specialist?

Is there an incident manager responsible for managing and escalating incidents?

Level 1.5: Management Intent


Is the business committed to reducing the impact of incidents by their timely resolution?

Have management commitment, budget and resource been made available for incident management?

Level 2: Process Capability


Is an incident database maintained recording details for all reported incidents?

Are all incidents managed in conformance with the procedures documented in SLAs?

Is there a procedure for classifying incidents, with a detailed set of classification, prioritisation and impact codes?

Is there a procedure for assigning, monitoring and communicating the progress of incidents?

Does incident management provide the Service Desk or Customer/User with progress updates on the status of incidents?

Is there a procedure for the closure of incidents?

Level 2.5: Internal Integration


Does incident management match incidents against the problem and known error database?

Level 3: Products
Are incident records maintained for all reported incidents (including resolution and/or workaround)?

Are requests for changes produced, if necessary, for incident resolution?

Are resolved and closed incident records updated and clearly communicated to the Service Desk, customers and other parties?

Level 3.5: Quality Control


Are the standards and other quality criteria applicable for the registration of incidents and for call handling made clear to the
incident management team?

Are Service Level Agreements available and understood by incident management?

Are the personnel responsible for incident management suitably trained?

Page 117 of 152

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

N
y
bu
to
k
lic

.c

.d o

c u-tr a c k

Table-5: Process Maturity and Best Practice Conformance Assessment - Service Level Management & Service Desk
(For each process the table scores the OGC/PMF levels - Pre-requisites, Management Intent, Process
Capability, Internal Integration, Products, Quality Control, Management Information, External
Integration and Customer Interface - against the maximum possible score and the maturity level
passing score, and records pass/fail, % achieved and % improvement for the 06/2002 derived score*,
12/2002 (not used), 01/2003 and 05/2003 assessments, together with the overall maturity level
achieved at each assessment.)
* Score based on an interview with Shared Service Centre Technology Services Management (06/2002)
and MOF assessment (01/2002).

Table-6: Process Maturity and Best Practice Conformance Assessment - Incident & Problem Management
(Layout as for Table-5: per-level scores, passing scores, pass/fail, % achieved and % improvement
for the 06/2002 derived score, 12/2002 (not used), 01/2003 and 05/2003 assessments, for Incident
Management and Problem Management.)

Table-7: Process Maturity and Best Practice Conformance Assessment - Change & Configuration Management
(Layout as for Table-5: per-level scores, passing scores, pass/fail, % achieved and % improvement
for the 06/2002 derived score, 12/2002 (not used), 01/2003 and 05/2003 assessments, for Change
Management and Configuration Management.)

Table-8: Process Maturity and Best Practice Conformance Assessment - Release Management
(Layout as for Table-5: per-level scores, passing scores, pass/fail, % achieved and % improvement
for the 06/2002 derived score, 12/2002 (not used), 01/2003 and 05/2003 assessments, for Release
Management.)

Appendix-C: Other Interviews


This Appendix gives a sample of an interview with one of the Technical Managers; as the
research data is voluminous, only sample data are included in this dissertation.
Interview with the Change Manager
1. Do you think that the processes and procedures implemented by Technology Services are
   effective, and contribute to a more efficient ICT environment?
   a. Yes, there is a measure of control being introduced.

2. Have you noticed a measurable improvement in ICT service provision in your work stream,
   since the inception of ICT services in June 2002?
   a. Yes; sometimes two steps forward, one step back.

3. What do you regard as the most meaningful improvements made by Technology Services to
   better ICT service provision since June 2002?
   a. Putting processes in place that made managing the environment easier. Now people
      must just follow process.

4. Do you think formal processes and procedures like change management:
   a. add significant value to service provision in general?
      i. Yes.
   b. would be useful to implement in other work streams in the Shared Service Centre?
      i. Yes.

5. What do you consider the most significant failures in terms of ICT service provision by
   Technology Services?
   a. Taking responsibility when things go wrong.

6. Do you think communication from Technology Services is effective?
   No.
   a. Do you know and understand what services are offered?
      i. Yes.
   b. Do you know what additional services are planned?
      i. Most.
   c. Do you know, in time, when things go wrong?
      i. No.
   d. What would you suggest needs to be done to improve communications by Technology
      Services in your work stream?
      i. Let's start by building a team; the lack of internal communication is a major
         cause of Technology Services' failures.

7. Do you think Technology Services understands your working environment and your work
   stream's needs/requirements?
   a. No.

8. What do you think Technology Services can do to improve the effectiveness and
   efficiency of your work stream?
   a. Work as a team; people still operate in silos.

9. Do you think Technology Services staff members are competent to support your
   organisation?
   a. Yes, but no commitment.

10. Do you think Technology Services has knowledgeable resources available to advise you
    how ICT can improve your work stream's effectiveness and efficiency?
    a. No.

11. Do you think Technology Services has the needed problem-solving capability/ability?
    Do you think this has improved or deteriorated since June 2002?
    a. No; it has improved though since 2002.

13. How satisfied are you, in general, with the service and support:
    a. currently provided by Technology Services?
       i. Average.
    b. do you think it has improved since June 2002?
       i. Yes, largely it has; some problems still remain.

Appendix-D: Customer Satisfaction Surveys


A number of questions were put to respondents in the Customer Satisfaction Survey, making use
of SERVQUAL as a basis to determine the level of respondents' satisfaction.
Question-1 evaluated how well the ICT services or products that are available are communicated to
users and customers. Question-2 evaluated if products and services offered by ICT meet the job /
business requirements of users and customers. Question-3 evaluated if the ICT organisation was
considered responsive towards users and customers with regard to technical queries or action
taken when an incident was logged with the service desk. Question-4 evaluated if the ICT
organisation was considered responsive towards users and customers with regard to requests for
information. Question-5 asked users and customers to evaluate the (perceived) knowledge of the
ICT team with regard to the function they fulfil. Question-6 asked users if they feel cared for by
the ICT organisation. Question-7 asked users and customers to evaluate the (perceived) problem-solving
ability of the ICT team with regard to the function they fulfil. Question-8 evaluated the
time it takes for ICT to resolve an incident or problem. Question-9 evaluated the customers'
overall satisfaction. These nine questions formed the basis of the Customer Satisfaction Survey; a
number of additional questions were also posed during the second and third surveys. The policy
of the organisation was to limit the number of questions posed during surveys (less than 20).
There were two primary reasons for this:
1. Long surveys are seen as non-productive.
2. It was found that shorter questionnaires are answered by more people and thus offer a more
   representative sample.
Question-10 evaluated if respondents had logged a call with the Service Desk (posed during the
second and third surveys); this is a check to ensure that respondents have used the support
services of the ICT organisation, and are thus qualified to answer questions 4, 5, 6 and 7.
As the measurement tool was not able to show small improvements (rating scale of 1 to 5, thus
users can only select 'very bad', 'less than average', 'average', 'better than average' and
'excellent'), it was thought prudent to ask three additional questions to distinguish whether
marginal improvements occurred with regard to questions 1 to 7.

Four questions were added to the existing questionnaire: one to determine if the respondent
forms part of the Senior Managers group (this was done to measure the business perspective of
the service offering of the ICT organisation) and three grouping the previous questions into three
categories, namely:
1. Improvement in Service
2. Improvement in Communication
3. Improvement in Product or Service Fit.
The total questionnaire consisted of 13 questions for the first two surveys and 17 questions for
the last survey.
Questions 3, 5 and 7 relate to users' / customers' views with regard to the quality of service
provided - the technical response, knowledge and the ICT organisation's ability to provide these
services. Question 11 was thus posed to determine if a marginal improvement occurred for these
three questions.
Questions 1 and 4 relate to users' / customers' views with regard to the effectiveness of
communication by the ICT organisation. Question 12 was thus posed to determine if a marginal
improvement occurred for these two.
Question 2 deals with the perceived value-add the ICT organisation offers; the question that was
posed aimed to ascertain if products and services offered meet the requirements of the job that the
user must fulfil. Secondary to this, Question 6 asked about the individualised attention given to
users with regard to problems; this question is a good indication of whether the ICT organisation
does enough to ensure that products are used optimally (technical staff who spend a bit of time
with a user who has logged an incident or reported a problem can quickly gauge if the product is
used correctly and thus if the user can use the product optimally; staff are encouraged to fulfil
this 'education' role, time permitting). Question 13 was thus posed to determine if products meet
the user's job requirements and if the ICT organisation improved its products by ensuring optimal
use.
The researcher realised that the views of end users do not necessarily reflect the views of
managers. Managers did not trust the statistics produced during the first two surveys; an
additional question was thus introduced to determine if the respondent is a manager.

Other questions posed dealt with administrative issues that are not relevant to the researcher's
work in this dissertation; they were thus omitted from the dissertation.
Table-9 outlines the questions posed to respondents, the ordinal responses available and the
related rating scale used during the Customer Satisfaction Surveys.

Table-9: The Customer Satisfaction Survey used, Qualitative Responses (Ordinal Data)
For each question the numeric ratings 5 to 1 correspond to the following qualitative responses:

Q1. How well are the ICT services or products offered communicated to users?
    5 = Very Bad | 4 = Bad | 3 = Fair | 2 = Well | 1 = Excellent
Q2. Do the ICT services or products meet job needs / requirements?
    5 = Not at all | 4 = Not most of the time | 3 = Most of the time | 2 = All | 1 = More than Expected
Q3. How satisfied are you with the response by ICT staff towards technical problems?
    5 = Not satisfied at all | 4 = Not satisfied most of the time | 3 = Satisfied most of the time | 2 = Satisfied | 1 = Very Satisfied
Q4. How satisfied are you with the response to requests for information?
    5 = Not satisfied at all | 4 = Not satisfied most of the time | 3 = Satisfied most of the time | 2 = Satisfied | 1 = Very Satisfied
Q5. Do you regard the ICT staff as knowledgeable with regard to the problem/s they attempt to resolve?
    5 = Ignorant | 4 = Not very knowledgeable | 3 = Fairly knowledgeable | 2 = Knowledgeable | 1 = Very Knowledgeable
Q6. How will you rate the individualised attention given to you or your problem by ICT staff?
    5 = Very Bad | 4 = Bad | 3 = Fair | 2 = Well | 1 = Excellent
Q7. How will you rate the ICT team's ability to resolve problems?
    5 = Very Bad | 4 = Bad | 3 = Fair | 2 = Well | 1 = Excellent
Q8. How long on average does it take the ICT team to resolve a problem?
    5 = More than a week | 4 = More than two days but less than a week | 3 = More than a day but less than two days | 2 = Less than a day but more than four hours | 1 = Less than four hours
Q9. What is your overall satisfaction level with the service provided by ICT?
    5 = Not satisfied at all | 4 = Not satisfied most of the time | 3 = Satisfied most of the time | 2 = Satisfied | 1 = Very Satisfied
Listed below is a summary of the data collected during Customer Satisfaction Surveys 1, 2 and 3:

Table-10: Customer Satisfaction Surveys 1, 2, 3 and 3 for Managers
(For each survey question - communication, product/service fit, technical response, information
response, IT staff knowledge, IT caring for customers, IT staff ability, time to resolve, overall
satisfaction, the Service Desk call check and the quarterly improvement questions - the table
reports the number of respondents, median response, mean response and standard deviation for
Survey-1, Survey-2, Survey-3 and the Survey-3 manager sub-group, together with the percentage of
respondents who had logged calls.)

Appendix-E: Call Statistics

The tables below show the call statistics data collected from the Service Desk database and
summary data related to call statistics collected from various other sources, as explained in the
main text. This appendix gives a summation of the Service Desk Call Statistics; as the research
data is voluminous, only samples of the actual Call Statistics data are included.

Table-11: Summary Statistics from the Service Desk Database and Incident Manager.

Measure | Oct to Dec-02 (Quarter 1) | Jan to Mar-03 (Quarter 2) | Apr to Jun-03 (Quarter 3) | Measured in
Man-days to close calls logged this quarter | 11,905.0 | 12,457.0 | 13,713.0 | Man Days
Number of calls for this quarter | 3,230.0 | 3,384.0 | 3,180.0 | Calls
Average calls logged per day | 52.1 | 53.7 | 48.9 | Calls
Average time to close a call for this quarter | 3.7 | 3.7 | 4.3 | Days
Median time to close a call for this quarter | 1.0 | 1.0 | 1.0 | Days
Number of technicians employed this quarter | 15 | 22 (although only 15 dedicated to call handling) | 25 (although only 15 dedicated to call handling) | Technicians
Number of work-days in this quarter | 62 | 63 | 65 | Work Days
Calls per technician this quarter | 215.3 | 153.8 (225.6) | 127.2 (212) | Calls
Calls per Technician / Day, this quarter | 3.5 | 2.4 (3.6) | 2.0 (3.3) | Calls
Technicians needed to close 4 calls per day | 13.0 | 13.4 | 12.2 | Technicians
(Figures in brackets are based on the 15 technicians dedicated to call handling.)
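
The derived rows of Table-11 follow arithmetically from the raw quarterly figures: average time
to close = man-days / calls, average calls per day = calls / work-days, calls per technician =
calls / technicians, and technicians needed = average calls per day / 4. The dissertation does not
spell these formulas out, so the sketch below is an inference from the table; under that
assumption it reproduces the Quarter 1 figures exactly.

    # Minimal sketch of how the derived measures in Table-11 follow from the
    # raw quarterly figures (man-days, calls, technicians and work-days).
    # Quarter 1 (Oct to Dec-02) values are taken from the table above; the
    # formulas are assumptions inferred from the reported numbers.

    def quarter_summary(man_days, calls, technicians, work_days, target_calls_per_day=4):
        avg_calls_per_day = calls / work_days
        return {
            "average time to close a call (days)": round(man_days / calls, 1),
            "average calls logged per day": round(avg_calls_per_day, 1),
            "calls per technician": round(calls / technicians, 1),
            "calls per technician per day": round(calls / technicians / work_days, 1),
            "technicians needed to close 4 calls per day": round(avg_calls_per_day / target_calls_per_day, 1),
        }

    print(quarter_summary(man_days=11905.0, calls=3230.0, technicians=15, work_days=62))
    # {'average time to close a call (days)': 3.7, 'average calls logged per day': 52.1,
    #  'calls per technician': 215.3, 'calls per technician per day': 3.5,
    #  'technicians needed to close 4 calls per day': 13.0}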

Table-12: Call Statistics from the Service Desk Database
(For each month from October 2002 to June 2003 the table lists the number of calls logged and the
median, mean and standard deviation of the time to resolve, in days. Monthly call volumes ranged
from 788 to 1,245, with median resolution times of 0 to 1 days and mean resolution times of 3.2 to
4.9 days.)

Appendix-F: Research Synopsis


The approved research synopsis follows; during the dissertation the synopsis was not strictly
followed.
Title
The role that IT Service Management Best Practice plays in Improving Customer Satisfaction
and Effective Service Provision.
Background
The dawning of the Information Age has placed great importance on Information Systems (IS)
and Information Technology (IT) as an essential tool to do business in the future. Historically,
Information Systems were viewed by many organisations as either a support function or even as a
necessary evil, but not as an essential tool for future business success or as a source of
competitive advantage.
IS has also traditionally functioned without proper management input or skills (a generalisation),
mainly because management did not understand the function and allowed technocrats in IS to
lead them as far as the role and ability of the function was concerned. IS departments' focus was
technology and not the value that could be added to the business. IS traditionally functioned as a
technology silo rather than a supplier of services to the business.
Best practice frameworks were developed to change this approach, changing IS to:
1. A responsive business function, focused on delivering business benefit,
2. Ensuring proper management principles were applied,
3. Focusing on customer care, and
4. Ensuring that measurable objectives are set and performance measured against these
   objectives.
A number of best practice frameworks or methodologies were developed over the last few
decades, and from these a definitive de facto standard emerged, called ITIL. The Central
Computer and Telecommunications Agency (CCTA) devised ITIL, or the 'IT Infrastructure
Library', in the late 1980s. The CCTA was a UK government agency that provided services and
advice to government departments regarding the use and management of IT.
Page 130 of 152

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

c u-tr a c k

N
y

The Office of Government Commerce (OGC) is currently the custodian of the ITIL framework and a
revised framework (ITIL2) was issued in 2001.
Provincial Government, striving to be more responsive to its constituency and to ensure effective
use of resources, looked at various international models to remodel its departments. The decision
was made in 2000 to consolidate non-core departmental functions (IS, Finance & Procurement)
into a single centre of excellence. This initiative was the birth of the Shared Service Centre, a
government organisation that supplies the above services to other departments under the auspices
of the Department of Finance. In designing these consolidated service providers, ITIL2 was
selected as the best practice framework to use in the IS function.
Purpose
The purpose of this dissertation is to ascertain if and how best practice contributes to improved
customer service (services to other departments). Based on the data collected and observations
made, the author plans to support the notion that:
1. best practice contributes to better delivery of service, and
2. unless additional efforts are made to involve the user community by means of reviews,
   feedback and marketing, user perception will not necessarily reflect the improvement in
   service levels.
Literature Review
A number of texts will be used to establish a benchmark of best practice. These texts will serve
as secondary data to (1) define what best practice in the IT service industry is and (2) serve
as a comparative measure to see how IS may or should support Business Strategy/Objectives.
The primary literary sources were selected because they provide academic, practical or popular
references to one of the following topics:

Business trends and the role of IS as a source of Competitive Advantage

IT Service Management (ICT Service Management) frameworks / methodologies

IS Planning Methodologies

IS Maturity Models

Information pertaining to the preparation of Questionnaires and Analysis of Primary Data


collected

Secondary Data

The list of literature selected for use in the dissertation is not definitive, but rather indicative of
the material to be used to compare and test the research with, and as a source of relevant information.
Texts may be added and others discarded if the content proves repetitive or irrelevant. A list of
texts follows:
Curtis G, (1999) Business Information Systems Analysis, Design and Practice. Harlow,
Addison Wesley Longman Ltd.
Hammer M & Stanton S, (1996) The Reengineering Revolution Handbook. London,
HarperCollins Business.
James G, (1996) Business Wisdom of the Electronic Elite. New York, Times Business Books.
Laudon K & Laudon J, (1993) Business Information Systems a Problem Solving Approach
2nd Ed. Fort Worth, The Dryden Press.
Porter M, (1985) Competitive Advantage. New York, Free Press.
Planning Overview and Baseline Version 2. Menlo Park, Price Waterhouse.
Robson W, (1997) Strategic Management and Information Systems 2nd Ed. Harlow, Pearson
Education/Pitman Publishing.
Wisniewski M, (1997) Quantitative Methods for Decision Makers 2nd Ed. Harlow, Pearson
Education/Pitman Publishing.
Foster Melliar (2001) IT Service Management Service Support and Delivery. Johannesburg
Office of Government Commerce (2001) IT Infrastructure Library Revision 2
Research Approach and Activities
The above literature will be used as a source of pertinent issues to be addressed. Texts will be
used as secondary data to support the validity of the research conducted. Measuring the
improvement of service levels would be a very simple task, if it were not for the fact that people
are involved. This necessitated the broadening of the scope of the dissertation to include
perceptions of the level of services offered.
Based on the above, the author believes that the world is socially constructed and subjective, that
the observer is part of what is observed and that science is driven by human interest. A
phenomenological approach will thus be followed in the research and interpretation of data. The
author thus aims to try to understand what is happening and to interpret the data from this viewpoint,

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

c u-tr a c k

N
y

developing ideas through induction from the data at his disposal. This will necessitate that multiple
methods be used to devise different views of the situation.
As the magnitude of the whole Shared Service Centre project (integration of 12 Business Units'
IT resources into a single cohesive unit) is far too large, this dissertation will focus on the first
phase of the project (establishing the Shared Service Centre).
For data to be meaningful, making use of the phenomenological paradigm, information, data and
observations need to be collected over time. The timeline for collection of data is set to a period of
five months; during this time pre-, during- and post-evaluations of customer perceptions will be
made, and data will be collected in real time as to the actual service levels achieved by the Shared
Service Centre IT function.
Primary data will be collected in the form of questionnaires, structured interviews and
system-generated statistics from the Service Desk implemented in the Shared Service Centre.
Management Questionnaire & Survey
The management questionnaire will be conducted at the beginning and the end of the research
period. The purpose of this questionnaire is:
1. To determine the technology maturity of the organisation
2. To determine the business objectives of the organisation and how IT can contribute to
achieving these.
Customer Questionnaire
As questionnaires will focus on perceived levels of service, a meaningful sample needs to be
collected. The author aims to sample about 15% of the phase 1 Shared Service Centre user base
(300 users); the sampling instrument will be an electronic voting system. Data to be collected will be
predominantly quantitative and will be used to identify possible areas of organisational failure and to do
a gap analysis between organisational needs/expectations and current IS services.
System Statistics
System statistics will be collected from the helpdesk system; these will be used to determine whether
quantifiable progress was made during the research period with relation to the following (a
calculation sketch follows this list):
1. Fewer failures and repeated failures
2. Shorter response and resolution times
3. Fewer outages or less down-time
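As an illustration of how such statistics could be derived from the Service Desk records, the minimal sketch below computes mean time to resolve, repeated-failure categories and total down-time; the record layout and field names (logged_at, resolved_at, category, downtime_minutes) are assumptions for the example and not the actual helpdesk schema.

from datetime import datetime
from statistics import mean
from collections import Counter

# Hypothetical call records exported from the Service Desk tool (invented values).
calls = [
    {"id": 1, "category": "email", "logged_at": "2002-10-01 09:00",
     "resolved_at": "2002-10-01 11:30", "downtime_minutes": 0},
    {"id": 2, "category": "network", "logged_at": "2002-10-02 08:15",
     "resolved_at": "2002-10-02 16:45", "downtime_minutes": 120},
    {"id": 3, "category": "email", "logged_at": "2002-10-03 10:00",
     "resolved_at": "2002-10-03 10:40", "downtime_minutes": 0},
]

fmt = "%Y-%m-%d %H:%M"

def hours_to_resolve(call):
    """Elapsed hours between logging and resolving a call."""
    logged = datetime.strptime(call["logged_at"], fmt)
    resolved = datetime.strptime(call["resolved_at"], fmt)
    return (resolved - logged).total_seconds() / 3600

mttr = mean(hours_to_resolve(c) for c in calls)               # item 2: resolution times
repeats = {cat: n for cat, n in Counter(c["category"] for c in calls).items() if n > 1}  # item 1: repeated failures
downtime = sum(c["downtime_minutes"] for c in calls)          # item 3: down-time

print(f"Mean time to resolve: {mttr:.1f} hours")
print(f"Repeated failure categories: {repeats}")
print(f"Total down-time: {downtime} minutes")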
The use of Primary Data


Data collected will be used to meet the following aims in terms of the purpose of the study:
1. Do a gap analysis between pre and post measurements of service levels, using system
statistics and user perceptions (customer questionnaire) - a worked sketch is given at the end of
this subsection.
2. Infer whether any relations exist between any of the data sets, thus trying to find meaning or
meaningful trends that may lead to meaningful deductions.
3. Infer what users perceive as good service and acceptable service levels, comparing these
with what the IS department perceives to be good service.
4. Compare pre and post management surveys to see if meaningful progress was made with
regard to the technology and business maturity of the IT organisation.
5. Analyse the IT value-add to meeting business objectives.
The above analysis of data will be used to test the hypothesis that best practice contributes to
better delivery of service and that, unless additional efforts are made to involve the user
community by means of reviews, feedback and marketing, user perception will not necessarily
reflect the improvement in service levels.
Leading from the research, a number of recommendations will be made as to how the organisation
can (1) further improve service levels and (2) influence users' perception of service levels and
consequently Customer Satisfaction with IS/IT services.
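A minimal sketch of aims 1 and 3 above, assuming the survey scores are held as simple per-dimension averages; all dimension names and values are illustrative only, not actual survey results.

from statistics import mean

# Illustrative pre/post user-perception scores per service dimension (not actual survey data).
pre  = {"Tangibles": 3.1, "Reliability": 2.8, "Responsiveness": 2.9, "Assurance": 3.0, "Empathy": 3.2}
post = {"Tangibles": 3.6, "Reliability": 3.4, "Responsiveness": 3.5, "Assurance": 3.3, "Empathy": 3.4}

# Aim 1: gap analysis between pre and post measurements of perceived service levels.
improvement = {dim: round(post[dim] - pre[dim], 2) for dim in pre}
overall_improvement = round(mean(improvement.values()), 2)

# Aim 3: compare user perceptions with what the IS department perceives to be good service
# (again purely illustrative figures).
is_department_view = {"Tangibles": 3.8, "Reliability": 3.9, "Responsiveness": 3.7,
                      "Assurance": 3.6, "Empathy": 3.5}
perception_gap = {dim: round(is_department_view[dim] - post[dim], 2) for dim in post}

print("Pre/post improvement per dimension:", improvement)
print("Overall improvement:", overall_improvement)
print("IS department vs user perception gap:", perception_gap)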
Research and Dissertation Schedule
An outline schedule was compiled, setting out tasks or milestones that needs to be adhered to and
sub-tasks that makes up the activities that needs to be completed to for these milestones to be
achieved. This schedule follows: Milestone
Milestones:
Complete Review of Literature and selection of Texts to use for Dissertation
Complete Questionnaire and Ops. Manager and CIO Interview Outlines
Complete Interviews and Questionnaires
Complete and hand in Literature Review
Complete Analysis of Data
Complete Research Methodology for review
Complete Data Analysis and Evaluation
Review Findings and Recommendations
First review of Academic Report
First review of Business Report
Second review of Academic Report
Dissertation Hand in - Academic Report
Second review of Business Report
Dissertation Hand in - Business Report
Dissertation Hand in - Final Dissertation

Sub-tasks:
Compile Questionnaires; Submit and review; Format Final Versions
Questionnaire 1; Interview OM 1; Interview CIO 1; System Stats 1; Compile and corporate info.
Questionnaire 2; Interview OM 2; Interview CIO 2; System Stats 2; Compile and corporate info.
Questionnaire 3; Interview OM 3; Interview CIO 3; System Stats 3; Compile and corporate info.
First Draft for review; Second Draft for review; Submit Final Literature Review

Due dates:
31/8/2002; 11/9/2002; 12/9/2002; 15/9/2002; 16/9/2002; 23/9/2002; 23/9/2002; 24/9/2002; 27/9/2002;
20/11/2002; 24/11/2002; 25/11/2002; 26/11/2002; 10/12/2002; 20/1/2003; 23/1/2003; 24/1/2003;
27/1/2003; 30/1/2003; 6/2/2003; 13/2/2003; 20/2/2003; 27/2/2003; 20/3/2003; 6/3/2003; 13/3/2003;
3/5/2003; 17/5/2003; 28/5/2003; 18/6/2003; 9/7/2003; 30/7/2003

Bibliography for Synopsis


Blanchard, Brown and Wilson (1998). Organisational Decision Making and Information Systems, Oxford Brookes, Oxford.
Robson, W (1997). Strategic Management & Information Systems, Prentice Hall, Essex.
Shaw & Thomas (1998). Research Methodology, Oxford Brookes, Oxford.
Wisniewski, M (1997). Quantitative Methods for Decision Makers - 2nd Edition, FT Pitman, Essex.

Appendix-G: The Content of Service Support and Delivery disciplines in ITIL

The Service Support and Service Delivery books cover the Best Practice processes (functions) as
outlined below (OGC(1), 2002) and summarised in the sketch at the end of this appendix:
1. Volume 1 Service Support
a. Service Desk (function). The objective of the service desk is to provide a central
point of contact between users and the ICT department delivering services to the
users.
b. Incident Management (process). The day-to-day process that restores normal
acceptable service with minimal impact to the business.
c. Problem Management (process). The process of diagnosing root causes of
incidents in an effort to proactively eliminate and manage service disruption.
d. Release Management (process). The process of testing, verification and release of
changes to the ICT environment.
e. Change Management (process). Standard methods and procedures for effective
management of all changes in or to the ICT environment.
f. Configuration Management (process). Representing the physical and logical
perspective of the ICT services provided or delivered.
2. Volume 2 Service Delivery
a. Availability Management (process). Optimisation of ICT infrastructure
capabilities, services and support to minimise service outages and provide
sustained levels of service to meet the business requirements.
b. Information Technology Service Continuity Management (process). Processes to
manage an organisation's capability to provide the necessary level of service
following an interruption of service or a major disaster.

c. Capacity Management (process). Processes that enable the organisation to
tactically manage resources and strategically plan future resource requirements.
d. Service Level Management (process). Processes to manage, maintain and improve
the level of service provided to the organisation.
e. Financial Management (process). Processes to manage the cost associated with
providing the organisation with services or resources to meet the business
requirements.
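For convenience, the sketch below restates the disciplines listed in this appendix as a simple Python dictionary, for example as the starting point of a conformance checklist; it adds nothing beyond the list above.

# The ITIL Service Support and Service Delivery disciplines listed in this appendix,
# held as a simple dictionary (e.g. as the starting point for a conformance checklist).
ITIL_DISCIPLINES = {
    "Service Support": {
        "Service Desk": "function",
        "Incident Management": "process",
        "Problem Management": "process",
        "Release Management": "process",
        "Change Management": "process",
        "Configuration Management": "process",
    },
    "Service Delivery": {
        "Availability Management": "process",
        "Information Technology Service Continuity Management": "process",
        "Capacity Management": "process",
        "Service Level Management": "process",
        "Financial Management": "process",
    },
}

# Example use: list all processes (the Service Desk is a function, not a process).
processes = [name for volume in ITIL_DISCIPLINES.values()
             for name, kind in volume.items() if kind == "process"]
print(processes)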

Appendix-H: Balanced Scorecard and CobiT™


KGIs and Information Criteria, according to the Management Guidelines of CobiT TM (IT
Governance Institute, 2000), thus focus on the Organisational or the ICT organisations Balanced
Scorecards, in terms of:

Financial How do stakeholders look at the organisation (i.e. delivery against


budget)?

Customers How do customers see the organisation (i.e. Customer Satisfaction, on


time delivery and service value)?

Internal Process How does the organisation see itself (i.e. process orientation,
effectiveness and efficiency, and quality.)?

Learning and Innovation Can the organisation continue to improve and create value
(i.e. employee knowledge, satisfaction and quality of technical infrastructure).

KPIs, on the other hand, are process-driven performance measures (thus the use of a scorecard
and not a Balanced Scorecard). KPIs are enablers of the goals (KGIs). Typical KPIs, summarised in
the sketch at the end of this appendix, may be:
- Responsiveness of ICT operations, production or applications.
- Increased Quality and Innovation.
- Optimum utilisation of resources and technology.
- Service availability.
- Customer Satisfaction.
- Cost vs. business benefit.
- Staff Productivity.
- Reduced errors and rework.
- Reduction of non-conformance and failed changes reported.
- Benchmark comparisons.
In short, KPIs:
- Are measures of how well the process performs.
- Are lead indicators used to predict future success or failure.
- Are process oriented.

- Focus primarily on the process and learning dimensions of the Balanced Scorecard.
- Are expressed in precise, measurable terms.
- Help to improve the process when measured and acted upon.
- Focus on critical resources that are important for that process.
It should be noted that the Balanced Scorecard should not be used as:
1. A way of imposing targets on people
2. A means of controlling means rather than ends (time at work instead of measured output)
3. A basis for awarding merit increments
The Balanced Scorecard system could be used as a tool for goal setting, and if it is used as such,
it draws on several assumptions:
1. Employees perform better when they know what is expected and how they contribute to the
effectiveness of the organisation.
2. Most employees prefer self-determination at work.
3. Employees can be motivated further by well-timed formal and informal feedback about their
work methods and results.
4. Employees prefer intrinsic (Herzberg, 1966) and extrinsic rewards that are consistent with
performance levels.
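Purely as an illustration of how the four perspectives, KGIs and KPIs described above fit together (and not as part of CobiT or the Balanced Scorecard literature), the sketch below holds them in a simple structure; the example goals and indicators are drawn loosely from the lists above.

from dataclasses import dataclass, field

@dataclass
class Perspective:
    """One Balanced Scorecard perspective with its goals (KGIs) and enabling KPIs."""
    name: str
    kgis: list = field(default_factory=list)   # outcome measures (lag indicators)
    kpis: list = field(default_factory=list)   # performance drivers (lead indicators)

# Example content drawn loosely from the lists above; an actual scorecard would define its own.
scorecard = [
    Perspective("Financial", kgis=["Delivery against budget"], kpis=["Cost vs. business benefit"]),
    Perspective("Customer", kgis=["Customer Satisfaction"], kpis=["Service availability"]),
    Perspective("Internal Process", kgis=["Process effectiveness and efficiency"],
                kpis=["Reduced errors and rework", "Reduction of failed changes reported"]),
    Perspective("Learning and Innovation", kgis=["Employee knowledge and satisfaction"],
                kpis=["Staff Productivity"]),
]

for perspective in scorecard:
    print(f"{perspective.name}: goals={perspective.kgis}, drivers={perspective.kpis}")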

Appendix-I: Potgieter's IT/IS Practices Framework

The figure below illustrates Potgieter's (1997) New IT/IS Practices Framework.

Figure-40: Potgieter's New IT/IS Practices Framework

The horizontal axis of Potgieter's (1997) model is the measure of objectivity or subjectivity.
Potgieter regarded Information Technology or Systems as by nature objective. Potgieter (1997,
chapter 3, p.17) states: "Knowledge about the Information Technology or Systems system is
available to all observers equally, and clear structured rules and statistics are readily available.
Information on the behaviour and perceptions of people is less objective and is not readily
available, and are usually supplied by the individuals themselves. Of particular importance are
the subjective perceptions about service quality and other people in the Service Management
community" (underlining added). Subjective views of individuals (Human Systems) apply to the
right-hand side of the matrix and the objective ICT System data applies to the left-hand side of
the matrix.
The vertical axis of the Potgieter (1997) model represents the degree of change (conflict) or
order present in structures in the environment. These structures defined by Potgieter include
Information Technology or systems and social or organisational structures. The top of the matrix
represents the activities where the status quo is upheld; typical behaviour or activities include
systems integration, order, control and the accommodation of differences.
The bottom of the matrix represents activities that focus on change of structures and the resulting
conflict of interests of different groups. Some of these changes are planned and deliberate while
others are not; they are rather a result of deliberate change or in opposition to deliberate
interventions. The result of these four dimensions/paradigms is four distinct quadrants of a
matrix: Regulating Information Technology or Systems Structures (top left), Changing
Information Technology or Systems Structures (bottom left), Managing Human Differences
(bottom right) and Accommodating Human Differences (top right), as seen in Figure-40.
Potgieter (1997) further defines four paradigms (regulating ICT systems, accommodating
different service perceptions, managing different service perceptions and changing ICT systems)
for his Quality Systems Practice Framework (Figure-41), a derivative of the above, and states that the
paradigms for this framework are all present in any research environment. He thus concluded that
the ICT environment contains elements of objectivity and subjectivity, order and conflict; some
elements may be more dominant than others but all are nonetheless present. This model was
used by the researcher as he needed to consider both objective/subjective data and order/change.

Figure-41: Potgieter's Quality Systems Practice Framework
(Figure axes: Order at the top and Change / Conflict at the bottom; quadrant annotations include
"setting and adhering to standards" and "obtain requirements, create delivery processes,
communicate standards".)
Appendix-J: Action Research, Research Roadmap used

Table-13 below outlines the unfolding research project this dissertation was based on.
Table-13: The unfolding project - Plan, Act, Observe and Reflect (activities listed per month)
Jul 2002: Develop systems and some procedures for: Incident Management, Service Level Management,
Problem Management, Configuration Management, Change Management, Service Desk and Operations
Management. Start to draw up specifications for the Service Desk tool. Familiarise with the design of the
infrastructure and other design documentation. Define roles and responsibilities of the operations team.

Aug 2002: Took over infrastructure from the design team. Train technical resources on the systems and
procedures developed. Implement the Service Desk tool. Train Service Desk operators on the use of the
tool, systems and procedures.

Sep 2002: Plan structured customer interviews. Start offering services to users already in the building -
start migration from the old infrastructure to the new infrastructure. See how flaws in previous designs
can be rectified (operations team - systems and procedures). Conduct structured customer interviews.
See how design flaws can be rectified (standard build by design team). Construct baseline for ICT
Service Management Maturity and Best Practice Conformance Assessments. Identify flaws in systems
and procedures developed. Major flaws in design and standard build by design team - assumptions not
valid. Identify communication breakdown. Plan user feedback through the Service Desk. Implement
user feedback through the Service Desk. No measurements in place. Plan upcoming migration of new
users to the Shared Service Centre infrastructure.

Oct 2002: Change process is not followed. Offer services to users. Plan how to create awareness of the
Change Process. Migrate users to the Shared Service Centre (from other departments and from the old
Corporate Informatics infrastructure existing in the building). Regular operations meetings. Daily
operations meeting. Plan monitoring activities. Start compiling weekly operational feedback for own use
and to give to the business (including mean time to resolve). Still issues regarding system design that
does not meet user/customer requirements. Implement the new Release Management function and
communicate the Change Process. User and customer requirements and needs not well defined. Plan the
new Release Management function.

Nov 2002: Conflicting roles - Incident Management provides support and does new releases; this causes
problems and these functions need to be split. Do 1st Customer Satisfaction survey (baseline survey).
Monitoring does not satisfy customer requirements. Redefine services and address issues of non-alignment
with user/customer requirements. Offer services and support users and customers (Incident Management
and operations). Customer alignment improved but some issues remain - communication seems to be the
problem. Plan increased customer engagement. Do planned new migrations (Release Management).
Process improvement has nearly come to a standstill - everyone too operationally involved. Plan
improvement of the service offering in terms of alignment of customer requirements to services offered.
Actively start to engage the business to improve alignment between services offered and services
required by customers. User resistance to the Change Process. Develop framework for first service
improvement plan. Monitor Customer Satisfaction (on closure of call only). Plan first major Customer
Satisfaction survey. Refine monitoring and feedback. Start to enforce the Change Process.

Dec 2002: Agree on issues to address in order to improve service based on service improvement plan
feedback - look for quick wins. Establish baseline for the service improvement plan based on previous
surveys and interviews; ICT Service Management maturity survey (input to the service improvement
plan) - trial-run results shown in Appendix-B but not used in this dissertation. Customer interface not
well defined. Compile and analyse ICT Service Management maturity survey results.

Jan - Feb 2003: Send out 2nd Customer Satisfaction survey on the last working day of the year, ensuring
users will receive it as soon as they come back from leave. Compile interim feedback. Improve services
as agreed in the service improvement plan. Development team still continues with developments without
involving the operations team - problems and conflict mount. Start to communicate the engagement
model and address issues that cause conflict. Resistance to change in terms of customer engagement;
conflict still high between design teams (MS and SAP) and operations. Agree on issues to address in
order to improve service based on service improvement plan feedback. Compile data gathered during the
Customer Satisfaction survey. Analyse data - definite improvement except in terms of product-service fit,
which stayed the same; this underlines previous assessments that the customer interface is still weak and
these factors need to be considered in future engagement improvement plans. Start planning to migrate
other departments to the new infrastructure - this signals the end of the first two phases of migration and
the end of the research project. 2nd ICT Service Management maturity and Best Practice Assessment
(input to the service improvement plan). Compile and analyse 2nd ICT Service Management maturity
survey results - quick wins will be less attainable for future improvements; interdependency and the lack
of fully implementing ITIL now come to the fore. Engage customers - improved contact and input
obtained. Customer engagement highlights weaknesses in terms of the service offering (as identified
previously - product-service fit). Improve services as agreed in the 2nd iteration of the service
improvement plan. Analyse statistics gathered in previous months in terms of mean time to resolve
incidents - definite improvement, which seems to indicate that systems and procedures are working.
Define customer engagement model, ownership, roles and responsibilities. Plan further improvement of
service levels. Initiate underpinning contracts with key vendors. Plan Customer Engagement Model and
how it should be used.

Mar 2003: Start planning new services to be offered to customers. Discuss policies and procedures with
other role players; these should be supportive of best practice. Meet to discuss issues that may inhibit
transformation of departments to the new Shared Service Centre infrastructure. Plan implementation of
the new Service Desk tool. Evaluate the appropriateness of the new tool.

Apr - May 2003: Planning session with 1st entity to migrate to the new Shared Service Centre
infrastructure. Re-write the service catalogue to reflect new services offered. Audit the current Service
Desk tool. Initiate workshops with all Service Desk users to determine needs of the new system. Plan
improvement of some key procedures. Implement planned changes to key processes. Meeting with the
second department to migrate, to start planning the migration. Finalise requirements for the new Service
Desk tool. Plan 3rd Customer Satisfaction survey. Plan 1st (business) Customer Satisfaction survey.

Jun 2003: Initiate a Quality Management program based on ISO 9001:2000. Plan new organisational
structure, relating every job to KPIs. Conduct 3rd Customer Satisfaction survey. Gap analysis and initiate
new Service Desk and CMDB design. Conduct 1st (business) Customer Satisfaction survey. Do 3rd ICT
Service Management maturity and Best Practice Assessment; results received and compiled.
Page 145 of 152

k
lic
.d o

Improving Customer Satisfaction and Operational Effectiveness with the used of an ICT Service Management Best-Practice framework: Action
Research in the Shared Services Centre by JH Botha.

to

bu
.c

m
o

.d o

lic

to

bu

O
W
!

PD

O
W
!

PD

c u-tr a c k

.c

H
F-XC A N GE

H
F-XC A N GE

c u-tr a c k

N
y

Appendix-K: The Service Capability Maturity Model

The Vrije Universiteit Service Capability Maturity Model is aimed at enabling IT service
providers to assess their capabilities with respect to the provision of IT services, and at giving
IT service providers direction in terms of future improvement with regard to service capability;
it is aimed at different key process areas (Niessink & van Vliet, 1999). In essence the Capability
Maturity Model provides for 5 maturity levels; the OGC has increased the Process Maturity
Framework levels to 10 (although the maximum maturity level is still 5).
Some key issues should be remembered about the Service Capability Maturity Model (Niessink
& van Vliet, 1999). These comments also apply to the Process Maturity Framework (text in
brackets added by the researcher):
- The model focuses on maturity of (and processes in) the service organisation and does
not measure the maturity of individual services, projects or organisational units.
- The model covers the service delivery and support process; it does not cover the
development of new services.
- The model is strictly ordered; in other words, key process areas are assigned to different
maturity levels in such a way that lower-level processes provide a foundation for the
higher-level processes. This means that unless the requisite measures in a maturity level
are satisfied, the organisation remains on that level even if measures in higher process
levels are satisfied (this rule is illustrated in the sketch below).
- The model is minimal and focuses on key processes: firstly, the quality of other
processes is not deemed essential to Information Technology Service (Process) Maturity,
and secondly, the model only states the activities or processes that need to be in place to
reach a certain maturity level; it does not prescribe organisational structure,
implementation models, processes, procedures or work methods.
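A minimal sketch of the strict ordering rule, assuming each maturity level's requirements have already been assessed as satisfied or not; the levels and flags below are invented for the example.

# Illustrative satisfaction of the key process areas required at each maturity level.
# Level 1 is the default starting level; the flags below are invented for the example.
levels_satisfied = {
    2: True,    # e.g. basic Incident, Change and Configuration Management in place
    3: False,   # e.g. service delivery planning requirements not yet met
    4: True,    # satisfied, but it cannot count while level 3 is not
    5: False,
}

def overall_maturity(satisfied, base_level=1):
    """Strict ordering: advance one level at a time and stop at the first unsatisfied level."""
    level = base_level
    for next_level in sorted(satisfied):
        if satisfied[next_level]:
            level = next_level
        else:
            break
    return level

print(overall_maturity(levels_satisfied))   # prints 2, despite level 4 being satisfied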

Appendix-L: The Service Quality Gap Model

Figure-42: The Gaps Model for Service Quality (Niessink, 2001)

The Gap Model identifies five gaps: the gap between customer expectations and perceptions
(managing customer expectations), the gap between services delivered and feedback to the
customer about service performance (predominantly the task of Service Level Management in the
ITIL Framework), the gap between service design, standards, policies and procedures (the main
domain of ICT Service Management Best Practice / ITIL), the gap between service design and
the company's (service provider's) perception of the service quality provided and, lastly, the gap
between the company's (service provider's) perception of the quality of service provided and the
customer's perception of the quality of service provided.
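The Customer Satisfaction surveys in this dissertation use SERVQUAL-style dimensions, and the last of the gaps above is commonly quantified per dimension as perception minus expectation. A minimal sketch with invented scores (not actual survey results):

# Invented expectation and perception scores per SERVQUAL dimension (1-5 scale).
expectation = {"Tangibles": 4.2, "Reliability": 4.6, "Responsiveness": 4.4, "Assurance": 4.3, "Empathy": 4.0}
perception  = {"Tangibles": 3.6, "Reliability": 3.4, "Responsiveness": 3.5, "Assurance": 3.3, "Empathy": 3.4}

# Gap score per dimension: perception minus expectation (negative means expectations are not met).
gap_scores = {dim: round(perception[dim] - expectation[dim], 2) for dim in expectation}
overall_gap = round(sum(gap_scores.values()) / len(gap_scores), 2)

print(gap_scores)
print("Overall (unweighted) gap:", overall_gap)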

Appendix-M: Correlation between Primary Data

Table-14: Correlation Coefficient (R) calculated for all data sets collected.
The table gives the three assessment/survey results (Q-1, Q-2 and Q-3) for each data set, together with
the correlation coefficient (R) between each pair of data sets. The data sets covered are: the Customer
Satisfaction survey factors (Tangible, Responsiveness, Assurance, Empathy, Reliability) and Aggregate
Customer Satisfaction; Best Practice Conformance for Service Level Management, Service Desk,
Incident Management, Problem Management, Change Management, Configuration Management and
Release Management, and Aggregate Best Practice Conformance; Process Maturity for the same
processes and Aggregate Process Maturity; and Calls logged per User.
Note 1: Negative figures are shown in RED.
Note 2: As the Service Desk Maturity statistics are a straight line (thus no standard deviation), the
correlation of Service Desk Maturity with the other sets of data cannot be calculated.
Note 3: R values of 0.8 or higher show a strong correlation and 1.0 a perfect correlation.
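Note 2 follows from the definition of the correlation coefficient: R divides the covariance of two data sets by the product of their standard deviations, so a data set with no variation gives a zero denominator and R is undefined. A minimal sketch with invented three-point data sets (not values from Table-14):

from statistics import mean, stdev

def correlation(x, y):
    """Pearson correlation coefficient; returns None when either data set has no variation."""
    sx, sy = stdev(x), stdev(y)
    if sx == 0 or sy == 0:
        return None   # undefined, as for the Service Desk Maturity scores in Note 2
    mx, my = mean(x), mean(y)
    covariance = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return covariance / (sx * sy)

satisfaction = [1.5, 2.0, 3.0]   # invented Q-1, Q-2, Q-3 results for one data set
maturity     = [0.0, 1.0, 1.5]   # invented results for a second data set
flat_line    = [3.0, 3.0, 3.0]   # a data set with no standard deviation

print(round(correlation(satisfaction, maturity), 2))   # about 0.93: a strong correlation
print(correlation(satisfaction, flat_line))            # None: cannot be calculated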

Appendix-N: Balanced Scorecard Perspectives and what they should mean to Technology Services

The Quality Movement, according to Pfeffer (1994), is fundamentally a way of using language and
ideas to mobilise actions that are often talked about but not frequently implemented. In this
regard the primary task of managerial language, according to Eccles & Nohria (1992), is always
to persuade individuals to put forth their best efforts in collective effort with other men and
women. The argument is thus that the Quality Movement is a powerful way of overcoming some
of the barriers to implementing more effective management, especially as it provides social
support, a powerful language and evidence that it works.
The language of Quality revolves around the Customer and Customer Satisfaction; this is an
important observation, as Drucker (in Gabor, 1990) notes that the purpose of a business is to
create a customer and satisfy that customer. The first of Deming's 14 points is that quality is
defined by the customer; so too do the definitions of Quality by Zeithaml (1988), Lewis (1991),
the International Standards Organisation (2000), Dale (1999) and others centre on the customer.
Frequently the Customer is replaced by some other stakeholder such as shareholders, political
leaders etc. Refocusing on customers assists organisations to break the short-term and narrow
financial focus. Another important word is "Process"; this term orients management away from
results to focus rather on the steps that produce the results, which in turn assists employees to
improve processes through training, redesign or other changes. Every job in the organisation is
part of the process, and only by understanding the role that each job plays in the company's
customer-driven strategy can the process be improved (National Institute of Standards and
Technology, 1991).
Another word is "Teamwork", which is often part of quality improvement programs. Teamwork
orients management to think about implementing self-managing teams and group incentives and
to focus less on individual performance. The problem with an individualistic focus is that most
jobs/performance in interdependent systems are not only a function of the individual's own ability
and motivation but also of the behaviour of co-workers and the system in which the individual
works (Pfeffer, 1994).

Quality improvement programs must be management-led and this may require fundamental
changes in the way an organisation functions (National Institute of Standards and Technology,
1991).
Deming, one of the fathers of quality, fervently believed in the "intrinsic motivation" of mankind,
and that it is management policies that often serve to de-motivate employees. Instead of helping
workers develop their potential, he asserts, management often prevents them from making a
meaningful contribution to the improvement of their jobs, robs them of the self-esteem they need to
foster motivation, and blames them for systemic problems beyond their control (Gabor, 1990).
Deming further viewed the traditional financial mentality as the greatest impediment to quality
management, because it deflects attention from the long-term interests of the organisation and
because traditional financial and accounting measures offer few insights into the future. The
Deming-style manager learned to probe behind the numbers, knowing that numbers do not give
the answers, only the questions to ask.
Quality programs and the quality movement emphasise consistency of purpose (a valuable trait in a
setting in which results may not occur rapidly and patience is often in short supply); this entails
an unequivocal long-term commitment to invest in, and adapt to, the challenging requirements of
the marketplace. It is the antithesis of managing for short-term financial gain (Gabor, 1990).
The Quality System should serve to underpin the whole of the service delivery system: each
process, each activity and each (cause-and-effect) relationship in the service delivery system.
The objectives in the Learning and Growth perspective provide the "infrastructure" to enable the
ambitions of the other perspectives. They are drivers for achieving excellent outcomes for the
other perspectives (Norton and Kaplan, 1996). Managers, however, frequently find it difficult to
sustain or give sufficient attention to this perspective, as short-term financial pressures take
precedence. Adverse consequences of a consistent failure to enhance employee, systems and
organisational capabilities will not show up in the short term, but eventually they will, and often
it is then too late to take corrective action. Three principal categories need to be focused on,
which in turn lead to Increased Employee Productivity; they are:
1. Employee capabilities - developing skills, training, mentoring etc.

2. Information System capabilities - ensuring that the right level of access to strategic and
operational information is available, empowering employees to be more self-sufficient
and to make better decisions.
3. Motivation, empowerment and alignment - aligning Personal Goals to Organisational
Goals.
Employee measures are core to this perspective, as employee retention and employee productivity
are the drivers for the results obtained by the organisation. These are underpinned by employee
satisfaction, which in turn is underpinned by staff competency, the technology infrastructure,
access to knowledge and information, and the climate in which action takes place (bringing us back
to organisational change management and cultivating a service culture, supported by
management and measured and rewarded accordingly).
That the organisation sees employees as a source of competitive advantage, and not just as a
cost that needs to be minimised, has become essential for any organisation's long-term survival
(Pfeffer, 1994).
For the Internal Business Process perspective, managers need to identify the processes that are
most critical to achieving customer and stakeholder objectives. The focus needs to be broader
than existing processes and include both new and old processes, thus focusing on improved
effectiveness and efficiency of the processes used by the organisation.
Critical issues identified for Technology Services are:
1. To understand customer needs
2. To offer services that are valued by the customer
3. Developing a Service Culture - rapid response to customer needs
4. Operational Excellence based on Best Practice
5. Quality products, services and processes
The customer perspective allows organisations to align their core customer outcome measures to
satisfaction, loyalty, retention, acquisition etc. It also enables them to identify and measure
explicit value propositions to each customer or customer segment.

To state that the whole organisation should focus on satisfying customer needs cannot be faulted:
clearly, for an organisation to achieve its financial objectives it must create services that
are valued by customers. Core measurement groups for this perspective may include:
1. Market share
2. Customer acquisition
3. Customer Retention and
4. Profitability per customer
But in the Shared Service Centre environment the focus should be on:
5. Increased Customer Confidence and
6. Customer Satisfaction through Superior Execution.
The financial perspective is generally well understood in the business environment, with measures
like NPBT, ROCE and EVA that are known and understood (a sketch of two of these conventional
measures follows the list below). However, focusing on these measures forces the organisation to
take a short-term view, sometimes at the expense of investments in "soft" areas that will create
long-term competitive advantage. Key focus areas for this perspective may include:
1. Revenue growth and mix
2. Asset utilisation and investment strategy
But in the Shared Service Centre environment the focus should be on:
3. Improving efficiency, thereby reducing cost and improving productivity
4. Improved effectiveness, thereby reducing risk, which in turn should lead to
5. Effective and Valued Service Delivery at Optimal Cost.
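As a reminder of what two of the financial measures named above conventionally denote, the minimal sketch below computes ROCE (operating profit over capital employed) and EVA (net operating profit after tax less a charge for the capital employed); all figures are invented for illustration and are not drawn from the Shared Service Centre.

# Invented figures, purely to illustrate the conventional definitions of ROCE and EVA.
operating_profit = 12_000      # profit before interest and tax (R'000)
capital_employed = 80_000      # total assets less current liabilities (R'000)
nopat = 9_000                  # net operating profit after tax (R'000)
wacc = 0.11                    # weighted average cost of capital

roce = operating_profit / capital_employed      # Return on Capital Employed -> 0.15 (15%)
eva = nopat - wacc * capital_employed           # Economic Value Added -> 200 (R'000)

print(f"ROCE: {roce:.1%}")
print(f"EVA (R'000): {eva:,.0f}")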
