
Ann. Telecommun. (2010) 65:19–29
DOI 10.1007/s12243-009-0143-9

An integrated framework of HoQ and AHP for the QoE improvement of network-based ASP services
Dohoon Kim

Received: 17 November 2008 / Accepted: 24 November 2009 / Published online: 15 January 2010
© Institut TELECOM and Springer-Verlag 2010

Abstract The application service provider (ASP) industry provides an essential infrastructure for Internet-based e-business transactions. To help providers improve the quality of such services, we first introduce the House of Quality (HoQ) framework, which offers an effective way not only to arrange and evaluate the voice of customers (VoC) and the voice of engineers (VoE) but also to combine them, thereby presenting explicit directions for quality of experience (QoE) enhancement. However, there have been few studies on HoQ for developing and improving telecom services. Here, we employ the analytic hierarchy process (AHP) method to evaluate VoC, VoE, and their relationships so that qualitative measurement, which is the weakest point of the traditional HoQ approach, can be replaced by quantitative and interactive estimation. The case study discussed here illustrates the applicability and usefulness of the integrated HoQ/AHP approach to the ASP industry. The proposed integrated framework successfully identifies key functional elements, such as business customization and security/failure management, for reengineering the service delivery process, thereby helping service providers develop better ASP services to improve QoE effectively and efficiently.

Keywords Quality of service (QoS) · Quality of experience (QoE) · Voice of customer (VoC) · Voice of engineer (VoE) · House of quality (HoQ) · Analytic hierarchy process (AHP) · Application service provider (ASP) · IT service quality · Service delivery process reengineering

1 Introduction

In the light of the growing importance of the Internet to present-day society, IT outsourcing and/or IT rental services will shape future e-business transactions by supplying network-based services to firms and individuals. These services enable great flexibility in the utilization of clients' resources. For example, instead of developing and maintaining proprietary business software such as enterprise resource planning (ERP), human resource management (HRM), and customer relationship management (CRM) which incur huge investments and operations costs, firms find that it is more cost-efficient and useful to switch to IT outsourcing. In particular, an application service provider (ASP), which is generally defined as a 3rd-party service organization, deploys, manages, and remotely hosts a software application through centrally located servers in a lease agreement. Thus, it helps firms capture the opportunity of IT outsourcing and plays a key role in transforming business practice through network-based services.

Driven by the potential in IT outsourcing services, the ASP industry in Korea has grown by more than 300% over 7 years since 2001, and the market is around US $310 million as of this writing [22]. Worldwide, prospects for the ASP industry are similarly bright; for example, 20–25% of software will be provided through ASPs by 2010 [9, 28]. In particular, ASP services are perceived to be the most suitable for small and medium enterprises. However, the ongoing global recession is casting a long shadow on this industry. Some predict that only a small number of ASPs will survive in the foreseeable future. As a matter of fact, mergers and acquisitions in this industry have been quite common since the early 2000s, and the industry is believed to be still at the stage of reshaping itself. Given

D. Kim (*) College of Business Administration, Kyung Hee University, Hoegi-dong 1, Dongdaemoon-gu, Seoul 130-701, South Korea e-mail: dyohaan@khu.ac.kr e-mail: dyohaan@soe.ucsc.edu



this competitive landscape, it is worthwhile for service providers to develop differentiated services and enhance QoS. Survival in the cutthroat, competitive environment of the ASP industry requires the attainment and maintenance of a sustainable competitive edge that is based on the voice of customers (VoC).1 Subsequently, although the industry is at the early stage of the industrial life cycle, firms in the industry pay much attention to the enhancement of the service delivery process for quality improvement in order to win market share. It becomes more critical to redesign and improve the service delivery process in order to meet the quality standard that users request (i.e., VoC), thereby enhancing the quality of experience (QoE) of users. Along this line, we need to develop a systematic framework to measure VoC and combine it with the service delivery process.

The Quality Function Deployment (QFD) or House of Quality (HoQ) methodology, which is a basic design tool in quality and process management, originated in 1972 at Mitsubishi's Kobe shipyard and was subsequently developed at Toyota and its suppliers; it has since been applied successfully in many manufacturing and/or service companies worldwide [1, 17, 21, 25, 34]. However, few studies apply the QFD/HoQ approach to IT services and, in particular, to network-based service design and development. The purpose of this study is to introduce and analyze IT outsourcing services through ASPs by utilizing the QFD/HoQ approach, which will be abbreviated as HoQ hereafter.

First, an overview of the ASP industry and services is given in the next section. The core features of the service delivery architecture together with the QoE of ASP services are also provided. Section 3 introduces a quantitative framework to handle both VoC and the voice of engineers (VoE) and identify the correlations between VoC and VoE using the Analytic Hierarchy Process (AHP) method. Through our framework, one can systematically transform QoE on network services into an appropriate set of technical elements in the service operations. Furthermore, the combination of AHP and HoQ allows one to evaluate the service process quantitatively and interactively. Subsequently, the proposed approach helps to enhance both service performance and QoE. In Section 4, we illustrate how the proposed framework is applied to ASP services with a case study of online software rental services in Korea. Based on the results from the case study, it will be shown that the

proposed integrated HoQ/AHP model is a good tool that organizes the quality dimensions (VoC and VoE) and combines them to yield the key functional features in the service delivery process toward QoE improvement. Finally, we conclude the paper with a summary and a discussion of future research directions.

2 ASP industry and services: service delivery architecture and QoE dimensions

2.1 ASPs: a novel approach for IT outsourcing through network-based services

The use of fast Internet connections has grown rapidly over the last few years as more people buy computers and acquire Internet access. Enterprise intelligence through e-transformation is one of the cornerstones of the network-based business era. Furthermore, the intensely competitive landscape of e-business causes firms to focus on their core capabilities and farm out staffing functions, such as IT services. In such circumstances, one option for enhancing the competitive edge is IT outsourcing via ASPs. Thus, the ASP industry provides an essential infrastructure for Internet-based business transactions, thereby accelerating corporate e-transformation.

An ASP is generally defined as a 3rd-party service organization that deploys, manages, and/or remotely hosts a software application through centrally located servers in a lease agreement. ASPs began by providing online application programs, such as ERP, HRM, and CRM solution packages, to corporate clients. The initial clients were small companies or local branches of multinational companies where IT outsourcing was the only option for cost-effectively deploying IT assets due to financial or regional constraints. As seen in these instances, the biggest merit of employing ASPs is that corporate clients do not have to own the applications and assume the responsibilities that are associated with both initial and ongoing support and maintenance. Subsequently, ASPs have become differentiated from extant IT services in that they provide IT resources to multiple corporate clients on a one-to-many basis with standardized service architectures and pricing schemes.

However, there are a number of factors that are frequently cited as either fuelling or dampening the growth of the ASP market [3, 10, 21, 28, 32, 35]. One striking characteristic observed so far is that immaturity of the industry is the most representative challenge in terms of the market factor, for example, the uncertainty as to whether existing and emerging ASPs are winning enough clients to validate an ASP business model for highly sophisticated enterprise applications. However, there is also a pessimistic

1 Since VoC is an equivalent concept to quality of experience (QoE), the two terms are interchangeable. However, according to the convention in quality management, VoC will be used mainly in this article.



view about the future of the ASP industry. As a matter of fact, the industry faces many challenges that should be overcome for ASP business models to survive and prosper. In particular, the success of an ASP depends first on the extent to which its client companies assure themselves of the quality of the IT services outsourced. In addition, there are technical factors such as the service level agreement, security concerns, and remote monitoring systems that should be further developed for building a seamless service delivery process. Accordingly, the technical elements in the service delivery process should be further enhanced to support the proliferation of ASPs. Ultimately, only a few successful ASPs that adapt themselves to the market requirements, such as a high standard of service quality or QoE, will survive.

The competitive landscape is also characterized by the unique nature of the service system market where independent hardware and software resources are combined and reorganized into a new package in alignment with partners over the value chain and even the clients' business processes. These offerings aim to design seamless and proprietary service delivery processes to sharpen the competitive edge of existing ASPs while raising the entry barriers. Furthermore, an analysis of the service delivery process shows how this industry is likely to evolve and yields some basic insights into the strategic directions to pursue, for example, high service quality over the network (mostly the Internet). For this purpose, we need a mechanism to establish and maintain collaboration among varying functional elements. This study will focus on this nature of the service delivery process and demonstrate how to achieve high service quality through the proposed framework. Figure 1 summarizes the service delivery process in the ASP industry.

Fig. 1 The service delivery process for ASP services. [Schematic: information and data flows run between the client business process and the ASP over the network, passing through firewalls and a network adapter. Within the ASP, management functions (usage management, business support, operations support) and application functions (application program, application customization, utility support) are linked through transaction middleware, together with a security/failure manager (authentication, access control, fault management) and the back-end DB and storage.]

2.2 Engineering characteristics of the service delivery architectures of ASPs

Network-based ASP services work in a client-server environment, where the Internet plays a key role in mediating and processing information across various entities. Such distributed computing depends on a new portfolio of technologies and operational approaches, such as open standards and object-oriented design. ASPs are sometimes viewed from the perspective of extensions of network computing. Accordingly, it is not altogether surprising that the most representative system architecture of ASPs can be found in the three-dimensional framework proposed by Sun Microsystems, Inc. We also follow a similar angle and structure the service delivery architectures for network-based ASP service operations in four layers: the management layer, the applications layer, the platform layer, and the infrastructure layer. This description is a modified

version of the generic architecture from Sun Microsystems, Inc. (also refer to [10, 23]).2 Based on the architecture outlined above, we elaborate the technological structure and functional entities so that they can be aligned with the service delivery process shown in Fig. 1. Preliminary studies (for example, [3, 10, 16, 21, 35]) identify key ingredients in the ASP service delivery process. The resulting architecture is a reference model that directs and organizes the analysis, design, and development of the functional elements that constitute the service delivery process. Finally, we break down the basic layered architecture into nine functional elements in Table 1. These elements are selected from various published studies and brainstorming sessions and illuminate essential features and technical factors; they were confirmed by a panel of experts (refer to Section 4.1). The relationships between the layered structure and the functional elements are as follows.

• Management layer: This layer is at the top of the architecture and responsible for the stable operation of the applications and utility programs. The core function of this layer involves operational support, security and failure management, and client support. Operational support deals with transaction monitoring for accounting, billing, and so on. The security and failure management module maintains safe and reliable channels between the service provider and clients. Client support includes on/off-line helpdesks, emergency responses, user database management, etc.

• Application layer: This layer is for data and information processing. Since ASP services also employ client-server computing, scalability and flexibility are required for real-time, interactive applications. The solution comes in the form of the logical decomposition of applications into discrete components that are identified by the service delivery process. This layer can be further decomposed to incorporate the familiar N-tier model in the client-server environment. Features that are included in this layer are application programs and utilities together with their customization for specific business purposes.

• Platform layer: This layer initiates, monitors, and terminates transactions between the provider and clients. It also manages the recovery of interrupted sessions. The platform layer should also recognize and manage sudden interruptions and unavoidable delays that are triggered by higher-priority jobs that need urgent attention. The layer is composed of two sub-categories (functional elements): the server/OS and communication middleware. The former is an essential part of ASP services (which is why an ASP is commonly called a server farm), while the latter builds a link and maintains data transactions between applications and servers.

• Infrastructure layer: Network and data storage are two important infrastructures that support the ASP services at the bottom of the architecture. The network is a vascular system and the main asset as well as a cost center in the service delivery process. Storage management systems, such as network-attached storage and the storage-area network, provide a means for building storage that is external to the computing systems while drastically reducing operational complexity. Through these types of infrastructure, clients can reduce the complexity and effort of data management and maintenance. Since these functional elements have shown rapid growth in recent years, new types of ASP business models based on these assets are emerging, for example, telecom operators [21].

2 The major modification is as follows. We added one layer, the management layer, on top of the application layer in order to explicitly incorporate the service management function. Due to this change, a minor rearrangement of some detailed features in the generic version of Sun Microsystems, Inc. was necessary. This modification was conducted through a literature survey and consultation with experts (refer to Section 4.1). The final outcome led us to the resulting key functional elements.
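For readers who prefer a concrete artifact, the layered decomposition above can be written down directly as a small data structure. The sketch below is only our own illustration (not part of the paper's framework); it groups the nine functional elements of Table 1, using the abbreviations introduced there, by architecture layer.

```python
# Layered ASP service delivery architecture mapped to its nine
# engineering characteristics (ECs), using the abbreviations of Table 1.
ASP_ARCHITECTURE = {
    "management":     ["ClSupp", "SecMgt", "OpSupp"],   # client support, security/failure mgmt, operational support
    "applications":   ["AppPgm", "BusSupp"],            # application programs/utilities, business support (customization)
    "platform":       ["SrvPlat", "ApplMid"],           # server/OS management, application & communication middleware
    "infrastructure": ["NetMgt", "DatMgt"],             # network management, data storage management
}

# Flat list of ECs in the order used later by the HoQ matrix.
ECS = [ec for layer in ASP_ARCHITECTURE.values() for ec in layer]
assert len(ECS) == 9
```

Grouping the ECs by layer in this way also makes layer-level summaries, such as the per-layer averages reported in Section 4.3, a one-line computation.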

Abstracted in Table 1 are the fundamental functional elements, which will be the building blocks of engineering characteristics (ECs) in the HoQ model that is introduced in Section 3.

Table 1 Key ECs in the ASP service architecture

Architecture layer   Engineering characteristics (VoE)
Management           Client support (abbreviated as ClSupp); Security and failure management (SecMgt); Operational support (OpSupp)
Applications         Application programs and utilities (AppPgm); Business support (BusSupp)
Platform             Server and OS management (SrvPlat); Application and communication middleware (ApplMid)
Infrastructure       Network management (NetMgt); Data storage management (DatMgt)

2.3 User requirements regarding QoE of ASP services

QoE cannot be completely captured in a single metric. In particular, the issue of quality evaluation in service industries, such as telecommunication and information, is more complicated than in manufacturing industries since the experience of a service is directly affected by the provider-user interactions, for example, the customer-encounter issue [11, 33]. Therefore, the evaluation of QoE for informational services, as in the ASP industry, should start by analyzing and measuring various aspects of the service characteristics that are observed in the course of the service delivery process. In order to capture the diverse attributes of information systems and combine them in an integrated measure, many studies, e.g., [5, 8, 18, 23, 27, 36], provide fruitful dimensions for service quality. However, the most extensive studies with their resulting QoE models can be found with regard to SERVQUAL [2, 26] and the technology acceptance model (TAM; [6, 7]). A considerable extent of the development of operational definitions and measurements for the quality of IT services has been driven by SERVQUAL and TAM. Since these two models have been used by many researchers in diverse IT-oriented industries and proved to be quite effective as well as robust (for example, refer to [4, 5, 37]), we also employ these models to build a QoE model in terms of VoC. From the SERVQUAL model, we adopt four dimensions [service attributes (SAs) in our terminology] of service-attached quality, namely, responsiveness, reliability, assurance, and empathy.3 Furthermore, we can also explicitly consider other features of the perceived service quality by incorporating TAM, e.g., user-friendliness (convenience or ease-of-use) and usefulness. Through these reference models, the primary dimensions will be elaborated into six SAs as surrogate measures of VoC. The detailed descriptions of the SAs are presented below; we summarize them in Table 2.

Table 2 Key SAs of ASP services

Sources     Service attributes (VoC)
SERVQUAL    Assurance (Assur); Responsiveness (Resp); Reliability (Rel); Empathy (Emp)
TAM         Ease-of-use (EoU); Usefulness (Useful)

• Assurance: Assurance is defined as the client's overall feeling about the accuracy of the received service(s) and other attributes that are associated with the correctness of the service(s). Typical examples of this SA dimension include the trustworthiness and credibility of service outcomes, consistency of results, the provider's understanding of users' requests, etc.

• Responsiveness: The term is generally defined as the willingness to help clients, for example, the time taken to reply to a client's inquiries. In the case of a network, the mean round trip time can be the best metric for this dimension. On the other hand, this term may be broadly defined to measure the ASP's reaction to market change, such as the adoption of new technology. In this study, we locate the definition of responsiveness midway between these extremes.

• Reliability: This term is related to stability, security, robustness, availability, etc. While assurance focuses on the content of the service itself, reliability is related to overall transactions in the service system. In this study, two sub-categories will be employed to define this dimension: stability and availability.

• Empathy: Empathy is the most abstract concept in the SERVQUAL model. This term concerns clients' general sympathy with the service provider. For example, if a staff person at the helpdesk manifests a sincere attitude to clients while resolving their problems, the clients may feel sympathy for the service irrespective of the status of the resolution of their problems. Positive and active support for clients' usage of the service and sincere efforts to understand the peculiar situation concerning clients are another example of empathy in the service delivery process.

• User-friendliness: From the clients' point of view, it is very important that the applications be easy to access and use. We further subdivide this dimension into two sub-categories: convenience and formality. Regarding the ASP service, user-friendliness refers to the simplicity and ease-of-customization of the corresponding software application and utility. In particular, for the latter capability, the service should be well organized, standardized, and modularized so that it can be flexible enough to be rapidly implemented in various environments.

• Usefulness: This concept asks in a straightforward manner how useful and helpful the corresponding service is for enhancing clients' business performance. This SA involves factors that directly affect the purpose of a client's ASP usage. Thus, relatively objective and quantitative metrics, such as productivity increases or cost reductions, can be employed as surrogate measures.

3 The original version of the SERVQUAL model assumes that a service experience consists of multiple dimensions such as reliability, responsiveness, assurance, visibility (or tangibles), and empathy. Thus, based on the SERVQUAL model, the level of QoS in each dimension is defined by the gap score between the customer's expectation and his/her perception. For this reason, SERVQUAL is also called a gap model.

As Fig. 1 shows, the delivery of ASP services cannot be completed by a single operation or entity; rather, it requires a long chain of processes across multiple functional elements and various stages. Furthermore, users' evaluation of QoE is largely determined by key functional elements along the service delivery process, as shown in Table 1. Subsequently, the construction of an effective QoE model entails an effort to link VoC with VoE; in other words, one requires a tool that links the SAs in Table 2 with the ECs in Table 1. For that purpose, we will introduce the most suitable model, called HoQ, in the next section, where we will further elaborate the model by incorporating the AHP method for quantitative evaluation. As a result, an integrated framework of HoQ and AHP will be established. Using the ECs and SAs given in Tables 1 and 2, respectively, we will systematically apply the proposed framework to ASP services for QoE improvement in the case study (Section 4).
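Since the remainder of the paper repeatedly indexes these two sets, a minimal setup sketch may help fix notation. The following Python fragment (our own illustration; the variable names are assumptions, not the paper's) lists the six SAs of Table 2 and the nine ECs of Table 1 and allocates the empty SA-by-EC relationship matrix and SA priority vector that the HoQ/AHP procedure of Sections 3 and 4 will fill in.

```python
import numpy as np

# Six service attributes (VoC) from Table 2 and nine engineering
# characteristics (VoE) from Table 1, using the paper's abbreviations.
SAS = ["Resp", "Rel", "Assur", "Emp", "EoU", "Useful"]
ECS = ["ClSupp", "SecMgt", "OpSupp", "AppPgm", "BusSupp",
       "SrvPlat", "ApplMid", "NetMgt", "DatMgt"]

# Placeholders for the HoQ relationship matrix r_ij (rows: SAs, cols: ECs)
# and the SA priority vector w_i; both are estimated with AHP later.
R = np.zeros((len(SAS), len(ECS)))
w = np.zeros(len(SAS))
```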

3 HoQ, AHP, and their integration for QoE model development

In this section, we review the generic HoQ framework and the AHP method. The basic structure of HoQ is depicted in

Fig. 2. We also introduce the idea of utilizing AHP to enhance the performance of the HoQ framework.

3.1 The HoQ framework

QFD has been developed to emphasize planning and process innovation for meeting users' needs [1, 17, 34]. It employs HoQ, which is the most fundamental matrix in the entire QFD framework. HoQ is basically built upon two principal components: VoC and VoE. VoC and VoE are operationalized and represented by SAs and ECs, respectively (as described in Section 2). Thus, SAs can also be interpreted as a set of surrogate measures of QoE. The ECs include the key concerns of suppliers, for example, techno-engineering factors, operational functions, and functional activities in the service delivery process. The horizontal portion of the matrix contains information that is relative to VoC, while the vertical portion contains technical elements (VoE or ECs) for responding to user inputs. Consequently, HoQ is composed of five building blocks, as shown in Fig. 2.

Fig. 2 House of Quality (HoQ): overall framework and components. [Schematic: user requirements (VoC) expressed as service attributes (SAs) on the left; engineering characteristics (ECs), the voice of engineers, along the top; the relationship matrix (r_ij) in the body; the SA relative importance (w_i) on the right; and the EC priorities (d_j) along the bottom.]

The structure of HoQ makes it possible to organize SAs and ECs and provides a systematic means of relating ECs to SAs as well. HoQ begins with users' needs. In order to completely identify what users want, it is necessary to operationalize those needs in terms of SAs. Generic SAs may be grouped into a higher level of abstraction so that the overall concern of users can be clearly and effectively represented. This part is typically located on the left margin of the HoQ (Fig. 2). After gathering and rearranging the SAs, one prioritizes them in the order that one deems critical for overall user satisfaction or QoE. The prioritization score of each SA is then normalized and registered on one side of the matrix (Fig. 2). Since the SA priorities reflect the influence of the corresponding SAs on QoE and play an important role in determining the key ECs, the accurate computation of the relative importance of the SAs is critical to the success of the HoQ model. Many ideas have been suggested to quantitatively compute the SA ranking; for example, [12–14, 20, 24] provide some methodologies for prioritizing the SAs. This study employs the AHP method (refer to Section 3.2) for this purpose, as in [24].

On the other side of HoQ, we consider the technical requirements called VoE or ECs, which are listed along the top of the matrix (Fig. 2). This portion is related to the means by which the provider responds to the VoC. That is, it abstracts how the provider translates the SAs and responds to them with its assets and resources. Thus, one can list a set of ECs that represent core functions or activities in the production of services. Note that each EC may affect one or more SAs. A set of ECs has to be selected as the most critical factors for improving the QoE as a result of the final analysis of the HoQ framework (refer to Eqs. 1 and 2). Henceforth, we will use the terms VoC and SA, VoE and EC, etc., interchangeably unless confusion may arise.

After determining the SAs and ECs, HoQ continues by establishing relationships between the SAs and ECs (Fig. 2). In a typical HoQ procedure, the level of each relationship is determined through expert consulting. However, a poor investigation may result in distorted outcomes where weak relations are exaggerated or strong relations are underestimated [15, 19, 20]. The accuracy of the relational matrix cannot be overemphasized since it controls the overall performance of the HoQ procedure. Therefore, it is a common practice to form a task force team that comprises personnel from varied fields who will lend their expertise to a thorough examination and rigorous construction of the HoQ. However, the traditional approach cannot avoid the limitation set by qualitative judgement. In order to overcome this limitation, we employ a quantitative approach based on the AHP method (refer to Section 3.2), which is expected to enhance the confidence of the final results.

Last but not least, the primary outcome of the HoQ is stored at the bottom of the matrix (Fig. 2). Let d_j (j = 1, ..., n, where n is the number of ECs) denote the importance of the jth EC, defined so as to reflect the SA priorities in QoE. Equation 1 shows how to calculate the d_j's when there are m SAs and n ECs:

d_j = \sum_{i=1}^{m} w_i \, r_{ij}, \quad \forall j = 1, \ldots, n    (1)

In Eq. 1, w_i and r_ij represent the normalized priority of the ith SA and the strength of the relationship between the ith SA and the jth EC (Fig. 2), respectively. That is, d_j is the average of the cell values in the jth column, weighted by the corresponding SA priorities (Fig. 2).

After the d_j's are computed, it can be seen which particular ECs are of importance so that effort can be concentrated on them for QoE improvement. Lastly, the d_j's can be further normalized (in terms of percentages) using the following equation (Eq. 2) for ease of comparison:

\hat{d}_j = \frac{d_j}{\sum_{j'=1}^{n} d_{j'}} \times 100, \quad \forall j = 1, \ldots, n    (2)
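To illustrate Eqs. 1 and 2 numerically, the short Python sketch below computes EC priorities from a hypothetical SA priority vector and relationship matrix (the values are invented for illustration and do not come from the case study).

```python
import numpy as np

# Hypothetical example with m = 3 SAs and n = 4 ECs (illustrative values only).
w = np.array([0.5, 0.3, 0.2])          # normalized SA priorities w_i (sum to 1)
R = np.array([                          # relationship strengths r_ij (row i: SA i, column j: EC j)
    [0.40, 0.30, 0.20, 0.10],
    [0.10, 0.40, 0.30, 0.20],
    [0.25, 0.25, 0.25, 0.25],
])

d = w @ R                               # Eq. 1: d_j = sum_i w_i * r_ij
d_pct = 100 * d / d.sum()               # Eq. 2: normalize to percentages

print(d_pct)  # the largest values flag the key ECs
```

Because w and each row of R are normalized, the raw d_j already sum to one, so Eq. 2 merely rescales them to percentages, which is how the EC priorities are quoted in Section 4.3 (e.g., 14.4%).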

By prioritizing the ECs, service operators are able to be more responsive to the user needs (i.e., VoC or QoE) that the SAs surrogate. Also, note that the relational matrix in HoQ is one of the most critical parts of Eqs. 1 and 2. This observation indicates that it is essential for a successful application of HoQ to use a sound method for estimating the cell values (r_ij).

3.2 The AHP method for quantifying SA priorities and matrix cells in HoQ

AHP is a decision support tool developed by Saaty [30] for dealing with complex, unstructured, multiple-criteria decisions. AHP can be applied in a wide variety of decision areas. The three key ingredients of AHP are: (1) a description of a complex decision-making problem as a hierarchy; (2) pairwise comparisons to estimate the relative priorities of the various elements on each level of the hierarchy; and (3) an overall evaluation of the decision alternatives that constitute the bottom line of the hierarchy. Once a decision hierarchy is determined, the traditional AHP method proceeds as follows. First, one constructs a series of comparison matrices known as decision matrices, whose elements are measured on a 1–9 scale, as suggested by [30, 31]. For example, the element of the kth decision matrix, d_{ij}^k, represents the degree of importance of the ith alternative in relation to that of the jth alternative in terms of the kth attribute or decision criterion. Then, the eigenvectors of each decision matrix are calculated. In particular, the eigenvector corresponding to the maximum eigenvalue will serve as the relative importance of the alternatives for a specific decision criterion. The kth decision matrix has the form

D^k = \begin{pmatrix} d_{11}^k & d_{12}^k & \cdots & d_{1n}^k \\ d_{21}^k & d_{22}^k & \cdots & d_{2n}^k \\ \vdots & \vdots & \ddots & \vdots \\ d_{n1}^k & d_{n2}^k & \cdots & d_{nn}^k \end{pmatrix}, \quad \text{where } d_{ij}^k = 1/d_{ji}^k    (3)

Lastly, one should examine the consistency of each decision matrix. Let n and \lambda_{\max} denote the number of alternatives [i.e., the number of columns (rows) in the decision matrix] and the principal eigenvalue of the matrix, respectively. The degree of inconsistency in each matrix is ascertained through Eq. 4 [31]:

\text{Inconsistency Ratio (IR)} = \frac{II}{RI} = \frac{(\lambda_{\max} - n)/(n - 1)}{1.98\,(n - 2)/n}    (4)

In Eq. 4, II and RI stand for the Inconsistency Index and the Random Index, respectively. Further revision or discarding should be considered if IR > 10%, which implies that the pattern of the relative-importance scores in the matrix seems to be illogical.

Within the HoQ model, AHP is first applied to the SAs as if there were a single final goal (for example, user satisfaction or QoE), as in the customary AHP procedure. First, multiple pairwise evaluations are carried out, where an evaluation, d_ij, shows an expert's judgment of SA i over SA j. Based on these comparisons, a pairwise-comparison matrix (decision matrix) is constructed for the SAs. Then, an eigenvector is derived from the decision matrix to identify the relative importance of the SAs. Finally, inconsistencies are checked for using Eq. 4. There are many studies on SA priorities that use the AHP method, for example, [14, 24]. However, few studies have been conducted so far for estimating the r_ij's based on the AHP method. This study employs the AHP method to quantify the relationship matrix (Fig. 2), whose cells display the degree of the relationships between the corresponding pairs of SAs and ECs. This procedure is similar to the computation of SA priorities except that one pairwise-comparison matrix will be built for each SA, as in Eq. 3. That is, for each SA that is indexed by k, the kth row of the HoQ matrix is filled with an eigenvector of the corresponding decision matrix for SA k. Furthermore, this decision matrix is derived by pairwise comparisons of the ECs whereby SA k is set up as the artificial goal for comparison. Accordingly, the resulting m eigenvectors complete all the rows in the relational matrix. Here, an eigenvector of the decision matrix for a specific SA, say SA k (typically, the eigenvector of the maximum eigenvalue), represents the degree of relative importance to which the ECs affect SA k. The eigenvector is then normalized and placed in the kth row of the relational matrix. This process is repeated according to the number of SAs (m), which completes the relational matrix.

Even though subjectivity cannot be completely eliminated, the accuracy of the relational matrix greatly influences the integrity and fidelity of the entire HoQ framework. We expect that by quantifying the cell values in a reliable manner, the proposed integrated HoQ/AHP framework successfully finds the true key factors (refer to Fig. 2) and generates appropriate guidelines for reengineering the service delivery process. Furthermore, we can also increase the accuracy of estimation by gathering diverse perspectives and insights and synthesizing them through a series of interactive expert evaluations, as our case study will demonstrate in Section 4.
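As a minimal sketch of the AHP machinery used above (our own illustration, not code from the study), the function below takes one reciprocal pairwise-comparison matrix on Saaty's scale, returns the principal-eigenvector priorities, and evaluates the inconsistency ratio of Eq. 4.

```python
import numpy as np

def ahp_priorities(D):
    """Principal-eigenvector priorities and inconsistency ratio (Eq. 4)
    for a reciprocal pairwise-comparison matrix D (Saaty's 1-9 scale)."""
    n = D.shape[0]
    eigvals, eigvecs = np.linalg.eig(D)
    k = np.argmax(eigvals.real)              # index of the principal eigenvalue lambda_max
    lam_max = eigvals[k].real
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                          # normalized priority vector
    ii = (lam_max - n) / (n - 1)             # Inconsistency Index II
    ri = 1.98 * (n - 2) / n                  # Random Index RI (approximation used in the paper)
    ir = ii / ri if ri > 0 else 0.0          # Inconsistency Ratio IR
    return w, ir

# Hypothetical 3x3 comparison of three SAs (illustrative judgments only).
D = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, ir = ahp_priorities(D)
print(w, ir)  # priorities sum to 1; IR should stay below 0.10 (10%)
```

In the integrated framework, this computation is run once on the SA comparisons (yielding the w_i's) and once per SA over the nine ECs (yielding each row of the relationship matrix).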
4 Application of the integrated HoQ/AHP model to ASP service improvement

In this section, we apply the integrated HoQ/AHP framework to the ASP service and seek a means of improving its QoE. In particular, we determine what operations or functional elements are to be focused on.

4.1 Quantifying the VoC (SAs), VoE (ECs), and their interrelationships: outline of interactive evaluations

Our integrated model of HoQ and AHP cannot succeed without expert knowledge. It is common practice to constitute a panel group that comprises experts who will contribute their expertise to a thorough examination and rigorous construction of the HoQ. This study also employs a series of panel surveys of experts in the area of ASP and IT outsourcing. The expert group was carefully chosen to be well qualified as follows. First, we formed a candidate pool that was composed of professional engineers with more than 5 years of experience in the ASP industry. However, those evaluators were selected not because they were involved in the industry but because they had a similar level of expertise in IT outsourcing operations; the evaluators spanned key software vendors and network-based providers. Then, several rounds of AHP applications with multiple experts further refined the numerical values of the relational matrix, thereby capturing more accurate relationships between SAs and ECs. If each expert can refer to the results of others' evaluations, the overall procedure is interactive and causes the final outcome to be a sort of average opinion.

Fifteen experts were involved in the first-round panel survey and were first asked to further confirm the ECs (VoE) and SAs (VoC) in Tables 1 and 2, respectively. They also evaluated the SA priorities as well as the relationship in each pair of SAs and ECs, i.e., they quantified the matrix cells in the HoQ model. After a short pilot experiment, a full-scale survey was conducted between November 2007 and August 2008 to increase the reliability of the quantified values in the integrated model. We intensively examined the pairwise-comparison matrices to sort out insincere answers. Among the 15 expert candidates, ten were finally selected based on the veracity and integrity of their responses. We also conducted both face-to-face and documentary interviews with these ten professionals. Furthermore, we gave them feedback (with some brief training in the AHP method) and repeated the survey when even one of their IR indices exceeded 10%. Consequently, each matrix in our final outcome exhibits an IR that is far less than 10%, which means the survey outcomes preserve at least internal logical consistency.

Following the norms of AHP analysis,4 we combined the ten different questionnaires according to [29, 31]. That is, the arithmetic means of the ten different eigenvectors for the corresponding rows of the experts' own HoQs were computed.5 This calculation generated one unified eigenvector per row, which completed all the rows in the final HoQ matrix in Fig. 3. The final SA priority vector was constructed in the same way and recorded in the last column of the final matrix (refer to Fig. 3). Through these combined eigenvectors for the SA priorities (the w_i's) and the cells (the r_ij's) of the matrix, it is straightforward (refer to Eq. 2) to calculate the EC priorities, which are the ultimate result of the integrated model of HoQ and AHP.

In sum, our approach in the HoQ/AHP framework is much more rigorous and sophisticated than a typical HoQ approach, where all the numbers are determined by the subjective judgments of a small number of individuals at one time. A series of extensive pairwise comparisons was carried out through the appropriate elicitation of experts' knowledge. The method applied here is a sound alternative to unstructured, conflictive decision analysis. We believe that this quantitative, interactive method will substantially reduce subjectivity and offset possibly extreme evaluations by experts. Accordingly, the proposed framework more accurately measures the numbers in HoQ and increases the confidence level in the final outcome (the EC priorities).

4 For example, it is typically assumed that different experts have the same information on the subject and the same capability for evaluation. If this assumption seems unrealistic, one may assume there is no prior method of distinguishing the evaluative capabilities of different surveyors, which eventually leads to the assignment of the same weight to each surveyor.

5 [29, 31] provide other ways of combining survey outcomes. However, it is asserted in [29] that different methods for combining pairwise-comparison matrices result in hardly any difference.
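The aggregation step of Section 4.1 can also be sketched in a few lines. The fragment below is an illustration under the equal-weight assumption of footnote 4 (the numbers are invented); it combines several experts' normalized eigenvectors by the arithmetic mean, as done for each row of the final HoQ matrix.

```python
import numpy as np

# Priority eigenvectors from three hypothetical experts for the same goal
# (e.g., one row of the HoQ relational matrix); each vector is normalized.
expert_vectors = np.array([
    [0.30, 0.25, 0.20, 0.15, 0.10],
    [0.28, 0.27, 0.18, 0.17, 0.10],
    [0.32, 0.24, 0.22, 0.12, 0.10],
])

# Arithmetic mean across experts (equal weights), renormalized so the
# combined vector still sums to 1.
combined = expert_vectors.mean(axis=0)
combined = combined / combined.sum()
print(combined)
```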

4.2 Step-by-step application of the HoQ/AHP integration model

4.2.1 ECs, SAs, and SA priorities

With regard to ECs, we specified in Table 1 the key operational functions and processes that represent how a service provider responds to users' needs for ASP services. Presented in Table 1 are nine functional elements (ECs) that play a key role in the service delivery process. Similar to the approach taken for ECs, the SAs have been selected from various published studies and resources, based mainly on prior research about SERVQUAL [4, 26, 37] and TAM [6, 7], which clarify the essential factors with regard to QoE. Table 2 shows six SAs.

After determining the SAs and ECs, which are the most fundamental underpinnings of the HoQ/AHP framework, we first prioritize the SAs using the AHP ranking method introduced in Section 3.2. That is, based on the survey responses from ten experts, we gather ten eigenvectors, each of which comes from pairwise comparisons of the SAs. The computation of the arithmetic means of the ten eigenvectors results in a numerical evaluation of the SA priorities, which are recorded in the rightmost column of the HoQ model (refer to Fig. 3). If we read the ranking in the order of importance, Reliability is the most critical SA with about 25% of influence on the overall QoE. Assurance and Usefulness follow with the same influence of 19.7%. The Empathy dimension shows the least influence of about 10%.

Fig. 3 HoQ/AHP analysis for the QoE improvement of ASP services (rows: SAs, with the SA priorities in the rightmost column; bottom row: resulting functional element (FE/EC) priorities)

             ClSupp  SecMgt  OpSup  AppPgm  BusSup  SrvPlat  AppMid  NetMgt  DatMgt  SA Priority
Resp         0.134   0.103   0.104  0.075   0.178   0.083    0.127   0.076   0.120   0.128
Rel          0.148   0.187   0.110  0.081   0.106   0.094    0.113   0.070   0.091   0.247
Assur        0.108   0.170   0.090  0.080   0.087   0.139    0.128   0.064   0.133   0.197
Emp          0.195   0.117   0.106  0.091   0.179   0.043    0.070   0.106   0.091   0.102
EoU          0.130   0.122   0.110  0.101   0.161   0.044    0.074   0.107   0.151   0.129
Useful       0.116   0.106   0.121  0.091   0.195   0.059    0.083   0.084   0.144   0.197
FE Priority  0.135   0.141   0.107  0.086   0.144   0.083    0.102   0.081   0.121

4.2.2 The relational matrix

Based on Eq. 3, the relationship matrix was calculated as shown in Fig. 3. That is, in building relationships between one SA and the set of ECs, a procedure similar to the one used to compute SA priorities was applied here. We also interviewed the expert group again so that they could review the final pattern of the matrix and provide suitable guidance. If a row needed to be modified, some raw survey results were returned to the corresponding experts so that they could revise their pairwise comparisons. The matrix in Fig. 3 has been derived from the final iteration after this survey procedure. Therefore, decisions on the evaluation have been carried out through a series of thorough examinations and multiple double-checks with experts. An example of the strongest relationship is found in the row for the Usefulness SA: the cell for Usefulness and Business Support has a value of 0.195. This implies that the Business Support function has a strong effect on improving the Usefulness SA. The weakest relation resides in the cell for the Assurance SA and the Network Management EC, with a value of 0.064, which means that less than 7% of the improvement in the Assurance SA

can be attributed to the Network Management EC. Note that this weak relation might be completely ignored in the traditional HoQ framework due to its coarse method of scoring. On the other hand, the strong relationship is highly likely to be exaggerated in the traditional approach. Further analysis and comparison will be conducted in Section 4.3.

4.2.3 Prioritizing ECs

For each EC that is indexed by k, the priority can be calculated using Eq. 1, which is basically the inner product of two vectors: one for the column of EC k and the other for the SA priorities. After the column weights are determined, they are normalized (using Eq. 2) so that they can be easily compared with each other for ascertaining the importance of a particular EC. Extra effort should be concentrated on enhancing the functionality of ECs with large \hat{d}_k values, since \hat{d}_k represents the percentage power of influence that the clients indirectly ascribe to EC k. It can also be interpreted as the degree of attention that a service operator must pay to the corresponding EC to develop and maintain a high QoE. Further analysis will be conducted in Section 4.3.

4.3 Analysis and discussion

Since we have explained the detailed step-by-step procedures in the preceding sections, we only summarize the main results here. Figure 3 depicts the results from applying the integrated HoQ/AHP method. Here, we focus on illustrating the use of the HoQ/AHP framework and deriving insights for improving the current ASP service operations. In particular, we expect this approach will help pinpoint the areas of functional improvement that would yield the highest QoE from the perspective of clients. As briefly explained in Section 4.2.3, it is the ECs with the highest priorities that an ASP should put on the front burner. For instance, an ASP with a limited budget may well consider the top-priority ECs first for resource allocation and future investment. In our case, the most important ECs are Business Support and Security Management, which account for 14.4% and



14.1%, respectively, of the total influence on the overall QoE. The former, which belongs to the application layer, plays a key role in customizing applications and utilities in response to user-specific situations; the latter pertains to security and failure management at the management layer. Client Support, Data Management, Operational Support, and Application Middleware follow in that order. On the other hand, the analysis reveals that Server/OS Platform and Application Program are relatively less influential ECs. Their powers of influence fall short of 10%.

In terms of the ASP service architecture, the management layer is the dominant functional group for QoE. The average value of the EC priorities in this group is 0.128, which outweighs that of the second most influential layer, the application layer, whose average priority is 0.115. The least influential group is the platform layer, even though the differences between its average value and those of the other layers (except the management layer) are not significant. Furthermore, one can find that at least one EC in each layer has a weight of more than 10%. This fact implies that all four layers are important for providing good QoE.6

Synthesizing all the above findings, we may conclude that, in terms of both effectiveness and efficiency, ASP service improvement should first focus on the management layer, in particular, the functional elements for security and failure management. In addition, customization of the system and (if possible) the service delivery process in response to clients' requests is another strategic option to pursue in enhancing ASP services. Thus, for an ASP that is about to invest in system upgrades, it would be wise to upgrade these two ECs first. This result may initiate the development of another HoQ, as in the case of a typical QFD procedure [17, 34]. For example, the service operator could examine the security and failure management systems that are either currently used or planned in much greater detail and construct a separate HoQ that is designated for the quality of security so that it can specify a clear roadmap for security enhancement. This process of deploying a series of HoQs continues until the required extent of detail is attained.

A careful examination of the ranking procedure yields some lessons in applying the HoQ/AHP framework. First, take a look at the row for Reliability, the most critical SA (refer to Section 4.2.1). Even though the highest score in the row is for the cell of Security Management, the cell value for Business Support ranks fifth out of the nine elements. This means that we might miss the most influential EC by analyzing only the SA components. Thus, we may generally say that focusing
6 This fact also implies that the architecture suggested in Section 2.2 is well-defined.

solely on SAs may not be sufficient to ascertain the most effective factors; it could fail to guide service operators in terms of what to do for QoE improvement. The reason why Business Support becomes the most critical EC can be found in the structure of the resulting matrix. That is, Business Support is the EC that has substantial effects on more than one SA with high priority. For example, Business Support is the only dominant EC in the row of the second-most important SA, Usefulness (compare the numbers in the row corresponding to Useful in Fig. 3). Again, this analysis implies that, in the context of QoE, we cannot be certain of capturing the underlying keystone factor (either a functional element or a logical correlation) without a balanced approach that simultaneously considers both VoC and VoE.

Lastly, we should highlight the methodological advantage of the proposed framework over a traditional HoQ model. As we briefly mentioned in Section 4.2.2, the latter has a propensity to overestimate a strong relationship between an SA and an EC; on the other hand, it tends to underestimate or even completely ignore a weak relationship. This tendency comes from the convention of measuring the relationship on a coarse scale (for example, categorizing the level of the relationship into four classes, such as strong, medium, weak, and none, and assigning the same weight to all the cells in the same class). However, note that, in the case of Business Support above, this EC with the highest priority ranks only fifth in terms of correlation with Reliability. If this relationship were underestimated or ignored due to coarse scaling in the traditional HoQ method, we could fail to single out Business Support as the most influential EC for QoE improvement. Therefore, one should not neglect small changes in a multi-factor model; rather, one ought to try to pin down quantitative values, as in the integrated HoQ/AHP framework.

5 Concluding remarks

In light of the growing importance of the Internet to contemporary society, IT outsourcing and/or IT rental services will shape future e-business transactions by supplying network-based services to firms and individuals. These services enable great flexibility in the utilization of clients' resources. Although the industry is currently at its early stage of development, much attention is being paid to the enhancement of the service delivery process for quality improvement.

Provided in this paper was an integrated HoQ/AHP framework to analyze clients' demands (VoC and SAs) and essential functional elements (VoE and ECs) for QoE improvement and identify the core ECs upon which to

focus. The proposed framework was supported by a case study on ASP services, whereby we could conclude that the HoQ/AHP model is not only feasible for measuring and organizing VoC and VoE for ASP services but also quite promising for developing a strategic direction for QoE improvement. This result suggests great potential for capturing users' needs and devising an efficient and effective solution for upgrading the system towards user satisfaction. Furthermore, since the proposed method quantitatively deals with VoC and VoE regardless of industry-specific data types, it is flexible enough to be applied to various service industries. In future research, we will further confirm the technical features with various types of service operators not only in the typical ASP industry but also in more general IT outsourcing industries.7 We will also elaborate the proposed framework by measuring the relative degree of relationships between ECs and/or SAs. This sophistication will not cause a major modification to our framework but will build an additional feature atop and on the side of the HoQ model. Thus, those changes will consolidate our contribution here in terms of methodological aspects.

References
1. Akao Y (1990) QFD: integrating customer requirements into product design. Productivity Press, Cambridge
2. Asubonteng P, McCleary KJ, Swan JE (1998) SERVQUAL revisited: a critical review of service quality. J Serv Mark 10(6):62–81
3. Burris AM (2001) Service provider strategy. Prentice Hall, New York
4. Carrillat FA, Jaramillo F, Mulki JP (2007) The validity of the SERVQUAL and SERVPERF scales: a meta-analytic view of 17 years of research across five continents. Int J Serv Ind Manag 18(5):472–490
5. Chen X, Sorenson P (2007) Towards TQM in IT services. Proceedings of the international conference on automated software engineering, Atlanta, Georgia, pp 42–47
6. Davis FD (1989) Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly 13:319–340
7. Davis FD, Bagozzi RP, Warshaw PR (1989) User acceptance of computer technology. Manage Sci 35:982–1003
8. DeLone WH, McLean ER (1992) Information systems success. Inf Syst Res 3:60–95
9. Desisto RP, Holincheck J, Alvarez G, Lheureux BJ (2006) Predicts 2007: Software as a service provides a viable, Gartner Group report (November)
10. Factor A (2002) Analysing application service providers. Sun Microsystems Press, Palo Alto, CA
11. Fitzsimmons J, Fitzsimmons M (2007) Service management: operations, strategy, information technology, 6th edn. McGraw-Hill, New York
12. Franceschini F, Rossetto S (1995) QFD: the problem of comparing technical engineering design requirements. Res Eng Design 7:270–278
13. Franceschini F, Rossetto S (1998) QFD: how to improve its use. TQM 9:491–500
14. Franceschini F, Rupil A (1999) Rating scales and prioritization in QFD. Int J Quality Reliability Manage 16:85–97
15. Fraser NM (1994) Ordinal preference representations. Theor Decis 36:45–67
16. Harney J (2002) Application service providers. Addison-Wesley, New York
17. Hauser JR, Clausing D (1988) The house of quality. Harvard Business Review, pp 63–73 (May)
18. Kettinger JW, Lee CC (1997) Pragmatic perspectives on the measurement of information systems service quality. MIS Quarterly 21:223–240
19. Kim D (under submission) Mathematical analysis on the structure of the HoQ matrix (in Korean)
20. Kim D (2003) QFD and principal component regression analysis (in Korean). Kyung Hee Manage Rev 9:19–30
21. Kim D (2003) An explanatory approach to the ASP industry evolution where IT services move from p-service to e-service. In: Gupta JND, Sharma SK (eds) Creating knowledge based organizations. Idea Group Publishing, Hershey, pp 127–147
22. Korean IT Rental Association (2008) A survey on ASP industry in Korea 2007 (in Korean). National Information Agency, Report No. NIA-II-07097
23. Lee JJ, Ben-Natan R (2002) Integrating service level agreements. Wiley, New York
24. Lu MH, Madu CN, Kuei C, Winokur D (1994) Integrating QFD, AHP and benchmarking in strategic marketing. J Bus Ind Mark 9:41–50
25. Mazur GH (1993) QFD for service industries. Proceedings of the 5th symposium on QFD, Novi, MI, pp 1–17 (June)
26. Parasuraman A, Zeithaml VA, Berry LL (1988) SERVQUAL: a multi-item scale for measuring consumer perception of service quality. J Retail 64:12–40
27. Pitt LF, Watson RT, Kavan CB (1995) Service quality: a measure of information systems effectiveness. MIS Quarterly 19:209–221
28. Pring B (2003) ASP hype cycle: Hype? What hype? Gartner Group report
29. Ramanathan R, Ganesh LS (1994) Group preference aggregation methods employed in AHP. Eur J Oper Res 79:249–264
30. Saaty TL (1995) The analytical network process: planning, priority setting, resource allocation. RWS Publishing, New York
31. Saaty TL, Vargas LG (1980) Hierarchical analysis of behaviour in competition: prediction in chess. Behav Sci 25:180–191
32. Sparrow E (2003) Successful IT outsourcing. Springer, New York
33. Sturm R, Morris W, Jander M (2000) Foundations of service level management. SAMS
34. Sullivan LP (1986) Quality function deployment. Qual Prog 19(6):39–50
35. Toigo JW (2001) The essential guide to application service providers. Prentice-Hall, Englewood Cliffs, NJ
36. Wan HA (2000) Opportunities to enhance a commercial website. Inf Manage 38:15–21
37. Yoon S, Suh H (2004) Ensuring IT consulting SERVQUAL and user satisfaction: a modified measurement tool. Inf Syst Frontiers 6(4):341–351

7 The most recent term for this generalization could be SaaS (Software as a Service). Since it is not easy to define a label such as SaaS, in this paper, we restrict ourselves to a rather classical definition of ASP services.
