Islamic Republic of Afghanistan


Afghanistan Telecommunications Regulatory Authority (ATRA)

Executive Summary

By this Public Consultation, ATRA proposes to revise its existing framework for
measuring and ensuring quality of service (QoS). Specifically, we propose to adopt
regulations that establish key performance indicators (KPIs) as well as a more
efficient and effective process for monitoring QoS. Our revised procedures will rely
on extensive collaboration with the operators to measure and ensure service quality.

We propose to use two strategies to measure the KPIs. The first is based on drive
tests to assess the coverage and quality of communications (voice quality and data
throughputs). Drive tests are not sufficient to give a complete picture of network
performance, however, because they do not measure service quality for all the cells 24
hours a day, 7 days a week. Therefore, in addition to using drive tests, we propose to
require operators to collect and compute Operation and Maintenance Center raw data
for extracting the KPIs. These two sources of measurements will allow ATRA to have
a clear and complete view of each network's KPIs.

The values of the thresholds and KPI targets proposed here are based on best practices
and ITU recommendations (where these have been made). As Afghanistan's situation
differs from that of other countries, particularly Western countries, these particular
values will have to be set during the consultation process with the operators. Target
values will then be established for each type of area (main cities, other cities, villages
and roads) according to its specificities, knowing that these values will be revised
every year in order to reflect the continuously improving quality of service of the
networks.

We specifically seek the views of the operators on the proposed QoS framework, as
well as the views of customers and members of the public, and we have identified a
number of specific issues on which we seek comment in this Consultation. We have
used international best practices in developing the proposals contained in this Public
Notice, but we have sought to adapt them to the unique conditions of the Afghan
marketplace.

1. Public Notice

1.1 Introduction

The Telecommunications Services Regulation Law (Telecom Law) reflects the
commitment of Afghanistan to competition in telecommunications as the best method
to protect the interests of consumers, including in high quality telecommunications
services. In addition, all of the operators' licenses establish service quality standards
and monitoring programs, and the recently issued 3G licenses already held by several
of the operators make explicit that ATRA has authority to adopt rules in this area.

The Afghanistan Telecom Regulatory Authority (ATRA) hereby issues this Public
Notice for the purpose of initiating a public consultation (1) to determine whether,
under current market conditions, ATRA needs to adopt further rules concerning
quality of service for voice and data services and (2) if so, to elaborate a Quality of
Service framework applicable to mobile telecommunications operators' networks in
Afghanistan.

A key role of a telecommunication regulator is to assess and control the Quality of
Service (QoS) delivered by network licensees. To fulfill this task, ATRA is proposing
to set up a framework to facilitate the transparent assessment of delivered QoS on
mobile communications networks, through the definition of all required components
of a QoS framework.

The QoS framework will address all mobile communications networks currently
operational in Afghanistan, including both 2nd and 3rd generation networks. To
achieve this, appropriate Key Performance Indicators (KPIs) and targets are proposed
in this document. In this context, we believe it is important to be mindful of the
unique challenges posed by the Afghan market.

Interested parties are invited to submit comments on the matters discussed in this
public notice not later than [15 January 2013]. Interested parties are requested to
submit their responses electronically to n_rahmanzai@atra.gov.af.


1.2 Background

Provision of high quality communications services is an essential part of a modern
mobile communications market. Recognizing this, virtually all mobile
communications licenses issued internationally contain QoS related targets that
networks must achieve, as a minimum. The mobile communications licenses issued
in Afghanistan are no exception to this.

The measurement and publication of different networks' service quality is also an
essential part of customer choice. By being informed of the relative performance of
different networks, a customer is able to make a more informed choice as to which
network they wish to purchase services from.

In Afghanistan, the Ministry of Communications and Information Technology (MCIT)
has licensed four nationwide GSM/3G wireless providers: Afghan Wireless
Communication Company, Etisalat Afghanistan, MTN Afghanistan and Telecom
Development Company Afghanistan, Ltd (Roshan), as well as one nationwide fixed
line operator (Afghan Telecom), which holds a unified services license that allows it
to provide any fixed line or wireless telecommunications service. In addition, new
WiMax broadband licenses have recently been issued to ISPs (NEDA, IO Global,
ANS). A review of QoS by this consultation will therefore extend to this sector of
wireless communication as well.

AWCC operates under its original GSM Services License, which contains the
following QoS requirements:

- Schedule 2 imposes roll out and coverage requirements
- Schedule 4 of the license establishes a number of performance standards that
AWCC must achieve
- AWCC must conduct drive tests in order to ascertain whether it has complied
with performance targets, with drive test routes to be proposed by the Ministry,
to which AWCC may not unreasonably withhold its agreement.

Etisalat, MTN and Roshan have all recently been awarded Mobile Communications
Services Licenses, which allow them to deploy and operate 3G services. The
following QoS requirements are contained in these licenses:

- Schedule E of the license imposes specific quality standards
- Clause 16.2 provides that the Ministry retains the right to amend or clarify
these standards, particularly but not only with respect to mobile broadband
service.
- Schedule D of the license also imposes roll out and coverage requirements that
the licensee must meet.
It is also understood from ATRA that, at the time of issuing these new service licenses,
the operators agreed for ATRA to initiate a public consultation to revise and improve
QoS implementation.

AfTel's Unified License contains the following relevant QoS conditions:

- Schedule A-2 of the license imposes specific quality standards, although many
of these do not relate to mobile communications
- Clause 14 provides that the Ministry can amend or clarify these standards.
- Clause 16.2 makes clear that ATRA has authority to issue regulations
containing service quality standards, including for mobile broadband service.

Recently, ATRA has become concerned that QoS across the mobile industry may not
be as high as might reasonably be expected. ATRA is also aware that there is concern
across the industry over the methods by which performance against QoS targets is
established.


1.3 Legal Basis

The Telecom Law does not contain provisions that directly discuss Quality of Service.
However, since QoS is a matter of consumer protection, the provisions that might be
applicable are Articles 49 and 50 concerning terms of service. One of Article 50's
provisions allows ATRA to prescribe terms of service in areas that it deems necessary.

However, the operators' licenses do impose service quality obligations, as discussed
above. These licenses clearly and legally establish service quality standards, and
penalties that apply for failure to meet those standards, for current as well as new
mobile operators. The new 2G/3G Mobile Communications Services License also
makes clear that ATRA has authority to issue regulations containing service quality
standards and to check the Quality of Service, including for mobile broadband
service; it also carries the operators' agreement, given at the time of issuance, that
there would be a consultation to improve QoS.


1.4 Regulation of QoS in Afghanistan

The purpose of this consultation document is to present and consult on the key issues
relating to the establishment of a complete QoS framework for mobile
communications in Afghanistan. The issues addressed by this consultation document
include proposals on:

- Update of the QoS KPIs to be measured during networks' QoS evaluation,
- Revised/new thresholds associated with these KPIs,
- Process used for measurement of KPI performance,
- Responsibilities of each party,
- Classification of the areas,
- Number of samples required during the measurements,
- List of information and documentation to be provided by operators.


Page 5 of 33

2. Proposed Approach

It is proposed that ATRA develop a complete framework for QoS for mobile
communications in Afghanistan. This framework will define the QoS KPIs to be
measured, the KPI targets that must be attained, the methods of measurement for the
KPIs and the processes that surround the QoS framework (such as the timescales of
measurements and the reporting processes).

ATRA will periodically revisit this framework, particularly the KPIs and the
thresholds, so that it remains up to date and relevant. ATRA will collect a wide range of
KPIs, but focus on a minimum number that will address key service quality concerns
whilst being sufficient to ensure high network service quality is delivered. The
measurement of these KPIs will be performed as independently as possible from
operators to ensure full confidence in the results.

ATRA could relax some of the constraints if an operator reports and proves that it was
impossible to meet the target values, particularly if one of the following problems
has occurred:
- Military interference problems on one of its network frequencies,
- Problems in transmission links provided by another operator (in this case, the
carrier operator could be fined),
- Security problems.

To account for this, the proposed approach considers the particular constraints
encountered by the network operators in Afghanistan, especially through the zoning
approach (with the strata definitions), where each zone, according to its specific
characteristics, has its own KPI target values.

To define this framework, we therefore need:

- First, to define the list of KPIs and their target values required to evaluate the
network QoS performance. This list will consider two main types of service: voice
services and data services (including SMS), both for 2G and 3G
- Secondly, to define the process of conducting KPI measurement. This will require
cooperation from the operators to obtain the necessary information and to provide
access to the necessary network data (such as OMC raw data), as well as definition
of the tools to be used and the methods of measurement
- Thirdly, to define the areas where KPIs will be evaluated, in terms of the terrain
types (for example, main cities should exhibit better QoS than villages) and the
programme for KPI introduction (3 phases are proposed).

It is proposed that the QoS framework is subsequently revised every year, if required,
according to the improvements observed through the measurement campaigns.


Question 1: Do respondents have any proposal based on other countries'
experience or best practices to improve this framework of KPIs,
targets, measurement processes (including field measurements and
OMC data) and evaluation criteria to evaluate the QoS of their
networks?


Page 6 of 33

The evolution of telecommunications networks and services requires that the way
QoS is evaluated by regulators be regularly revised. In addition, the improvement in
the QoS provided by existing networks is a natural and normal trend that should be
reflected in thresholds that are periodically updated to reflect the maturing
networks' performance.

The costs of QoS evaluation will be borne by ATRA. Any evaluation that has to
be repeated due to lack of data, or due to measured values that do not meet the targets,
will be paid for by the operator concerned.

As we have noted, the 3G licenses issued to Etisalat, MTN and Roshan make clear
that ATRA has authority to adopt regulations concerning quality of service. Although
AWCC and Afghan Telecom's licenses do not contain similar language, we
tentatively conclude that, together with other license provisions and general
regulatory authority under the Telecom Law, ATRA has the legal authority to revise
and enforce such QoS standards even if this necessitates amending existing licenses.
We invite commenters to address this conclusion.

Question 2: Do respondents have any proposal concerning this process and
methodology [to the method of implementing this proposed QoS
framework]?




Page 7 of 33

3. Definitions

QoS is defined in ITU-T Recommendation E.800 as "the collective effect of service
performance which determines the degree of satisfaction of a user of the service".
QoS consists of a set of parameters that pertain to the traffic performance of the
network, but in addition to this, it also includes other parameters defined in the
recommendation, such as:

- Service support performance
- Service operability performance
- Serviceability performance, and
- Service security performance

Regulators evaluate the QoS of the networks through a set of parameters (called Key
Performance Indicators, or KPIs) whose objective is to reflect the quality of
experience perceived by a user of the network service. This evaluation requires the
testing of QoS KPIs through analysis of the capabilities of the network. Test data is
collected using appropriate tools and procedures in a controlled environment.

QoS can be part of the characterization of the Quality of Experience (QoE). This is a
subjective measure of a customer's experiences with a service; it shows how services
are experienced by consumers, in their environment and on their device (whether that
is a mobile phone, computer, smart phone or tablet).

Fundamental performance areas of QoS include:

- Availability: Is the service available in an area?
- Accessibility: Within an area, is the service available to all who request it?
- Retainability: Is the service retained throughout the requested period?
- Reliability: Does the service work at a consistent level?
- Performance: How well do services perform?

Many factors affect the QoS of a mobile communications network. There are standard
metrics of QoS that directly impact the user and that are used by the
telecommunications industry to rate the QoS of a network, including:

- Coverage: the strength of the signal, measured using test equipment to
estimate the size of the cell
- Accessibility (including GoS): determining the ability of the network to
successfully handle calls from mobile-to-fixed networks and from mobile-to-
mobile networks
- Audio quality: monitoring of successful calls for a period of time to establish
the clarity of the communication channel

QoS is also measured from the perspective of an expert (such as a teletraffic engineer).
Such measurements involve assessing the network to see if it delivers the quality that
the network design is required to deliver. Certain tools and methods (for example
protocol analyzers, drive tests, and Operation and Maintenance measurements) are
used in such a measurement.

In the field of data services, mainly provided by packet-switched networks and
computer networks, the term QoS is linked to control mechanisms that can assign
different priorities to different users or data flows, or guarantee a certain level of
performance to a data flow in accordance with requests from an application program.
QoS guarantees are important if network capacity is limited, especially for real-time
streaming multimedia applications such as Voice Over Internet Protocol (VoIP) and
Internet Protocol Television (IP-TV), since these often require a fixed bit rate and are
delay sensitive.

This section is focused on those QoS KPIs that regulators are interested in measuring.
Target values are proposed, based on best practices (such as those established by the
International Telecommunication Union, or successfully used by other regulators),
bearing in mind the particulars of operating a mobile communications network in
Afghanistan. Tools and methodologies that are proposed to be used to measure QoS in
Afghanistan are also discussed.


3.1 QoS KPI Definitions

The following KPIs are the common ones used by regulators to evaluate network QoS
around the world. Note that there can be a very large number of KPIs used to evaluate
QoS and operators will typically examine many more KPIs than presented here in
order to analyse and solve specific issues in their networks.

The following definitions are either definitions of KPIs or definitions required to
understand KPIs. It is proposed that those KPIs defined below will be used as a basis
for regulation of QoS in Afghanistan.

Busy hour: the hour of the day during which the traffic volume carried by the
network is the highest. It represents the sliding 60-minute period during which the
maximum total traffic load occurs in a given 24-hour period.

Call blocking rate (or Grade of Service): the probability that a call does not get
through during the busy hour.

Call drop rate: the probability that a call is interrupted before the end of the
communication, measured in the busy hour. This includes any link degradation
greater than 10 seconds, excluding the case where the mobile terminal is moved out of
the coverage area.

Handover failure rate: the probability, measured during the busy hour, that a
handover fails while the mobile terminal moves from one cell to another.

Voice quality as perceived by the consumer:
- Perfect quality: the communication between both parties proceeded without
any problem, equivalent to a Mean Opinion Score (MOS) ≥ 4.5 (ITU value)
- Average quality: the communication between both parties experienced some
impairments, but these did not impact the ability to understand, equivalent to
4 < MOS < 4.5 (ITU value)
- Bad quality: the communication between both parties is not possible because
of transmission problems, equivalent to MOS < 4.0 (ITU value).

Availability: ability to establish, 24 hours per day, every day of the year, from or to a
mobile terminal located in the concerned coverage area, the following
communications:
- With any PSTN subscriber, national or international
- With any other public telecommunication network, including mobile
networks
- With any other subscriber of the licensee's networks.

Cells availability rate: the availability reached by cells (averaged across all the cells
in a network).

MSC/BSC/RNC network availability: the availability reached by the MSCs/BSCs/
RNCs (averaged across all the MSCs/BSCs/RNCs in the network).

International availability: availability of international links.

Answer Seizure Ratio (ASR): the ratio between the answered calls and the total call
attempts.

SMS access success rate: probability that an SMS is successfully sent.

Received SMS rate: probability that a transmitted message is not blocked by the
network and that its content is not corrupted during the transmission.

MMS access success rate: probability that an MMS is successfully sent.

Received MMS rate: probability that a transmitted message is not blocked by the
network and that its content is not corrupted during the transmission.

Internet connection success rate: probability that the first Internet access attempt
established by a mobile terminal is successful within a given time.

Data transmission throughput: volume of data downloaded during a given period
for a minimum number of sessions.

Internet session maintenance: percentage of time that an internet connection is
maintained without interruption during a certain period.

Data connection establishment duration: the time interval between the connection
command and the display of established connection.

Web service unsuccessful rate: the probability that at least one requested page has
not been correctly downloaded.

Apparent web service throughput: the sum of the downloaded pages' sizes divided by
the sum of the download durations when connected to the operator's web portal.

Page 10 of 33

FTP data service connection failure rate: the number of failures when an FTP
command for a file is sent, divided by the number of attempts, measured using a
terminal connected to a laptop.

Apparent throughput of the FTP service: the transferred file size divided by the
transfer duration for a defined file size, measured using a terminal connected to a
laptop.

Coverage measurement: Three service levels are defined:
- Indoor service level: the received signal level threshold which, if measured
during a drive test, can be said to guarantee indoor service. In practice, this
means that a subscriber located at this point can make and receive calls when
inside a building.
- In-car service level: the received signal level threshold which, if measured
during a drive test, can be said to guarantee in-car service. In practice, this
means that a subscriber located at this point can make and receive calls when
inside a car.
- Outdoor service level: the received signal level threshold which, if
measured during a drive test, can be said to guarantee outdoor service. In
practice, this means that a subscriber located at this point can make and
receive calls when in an open area (i.e., outside buildings and cars).

The foregoing list of KPIs includes, but extends beyond, the KPIs included in
the operators' licenses.
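To illustrate how two of these definitions translate into computation, the sketch below
finds the busy hour as the sliding 60-minute window with the highest carried traffic
and then computes a blocking rate over that window. This is a simplified illustration
only; the per-minute traffic samples and function names are hypothetical and not
taken from any operator's OMC.

```python
# Sketch: busy-hour detection per the sliding 60-minute definition above.
# Inputs are hypothetical per-minute traffic samples for one 24-hour day.

def busy_hour_start(per_minute_traffic: list[float]) -> int:
    """Return the start minute of the 60-minute window with maximum traffic."""
    window = sum(per_minute_traffic[:60])
    best_total, best_start = window, 0
    for start in range(1, len(per_minute_traffic) - 59):
        # Slide the window one minute: add the new sample, drop the oldest.
        window += per_minute_traffic[start + 59] - per_minute_traffic[start - 1]
        if window > best_total:
            best_total, best_start = window, start
    return best_start

def call_blocking_rate(blocked_calls: int, call_attempts: int) -> float:
    """Grade of Service: share of busy-hour call attempts that were blocked."""
    return 100.0 * blocked_calls / call_attempts if call_attempts else 0.0
```

The same window, once located, would also be used for the other busy-hour KPIs,
such as the call drop rate and the handover failure rate.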


Question 3: Do respondents agree that the KPIs defined above should be
measured and used to evaluate the QoS of wireless communications
networks in Afghanistan? If not, what additional or different KPIs
do respondents recommend should be included in mandatory QoS
standards?


3.2 QoS KPI Targets

The following targets are proposed for the KPIs defined earlier. These targets are
based on best practices (such as those established by the International
Telecommunication Union, or those successfully used by other regulators), bearing in
mind the particular constraints of operating a mobile communications network in
Afghanistan.

Call blocking rate: Air interface: 2.0%; Network interface: 0.5%.

Call drop rate: 2% for switch statistics; 3% for drive tests.

Handover failure rate: < 10%.

Voice quality as perceived by the consumer: using the coverage classification
proposed in Section 6:

- Main cities stratum
o Perfect quality rate: >95%
o Average quality rate: <3%
o Bad quality rate: <2%
- Other cities stratum
o Perfect quality rate: >95%
o Average quality rate: <3%
o Bad quality rate: <2%
- Villages stratum
o Perfect quality rate: >95%
o Average quality rate: <3%
o Bad quality rate: <2%
- Roads stratum
o Perfect quality rate: >85%
o Average quality rate: <10%
o Bad quality rate: <2%

Cells availability rate: > 95% (ITU-T G.1000), with a maximum cumulative
service-affecting unavailability per BTS of 72 hours per year in urban areas and
120 hours per year in rural areas.

MSC/BSC/RNC network availability: >99%, with a maximum single outage of
30 minutes per month.

International availability: 99.5%.

Answer Seizure Ratio (ASR): 80%.

SMS access success rate: >99%.

Received SMS rate: >99%.

MMS access success rate: >99%.

Received MMS rate: MMS received uncorrupted within a 10-minute delay: >90%.

Internet connection success rate: >90% within 30 seconds.

Data transmission throughput: rate of 100-kbyte files downloaded within a
maximum of 4 minutes: >80%

Internet session maintenance: rate of Internet connections maintained throughout a
5-minute browsing session: >80%

Mean data connection establishment duration: 30 seconds.

Web service unsuccessful rate: < 10%.

Apparent web service throughput: for 2G: 120 kbps; for 3G: 384 kbps.

FTP data service connection failure rate: < 20%, with a connection considered
successful if established within 1 minute.

Apparent downlink throughput of the FTP service: for 2G: 120 kbps; for 3G: 384
kbps.

Apparent uplink throughput of the FTP service: for 2G: 60 kbps; for 3G: 144
kbps.


2G GSM coverage measurement: Three thresholds for RxLev (Received Signal
Level, the RF energy collected):
- Indoor service level: RxLev > -72 dBm
- In-car service level: RxLev > -87 dBm
- Outdoor service level: RxLev > -92 dBm

3G UMTS coverage measurement: Three thresholds for RSCP (Received Signal
Code Power, the RF energy collected after the decorrelation/descrambling
procedure):
- Indoor service level: RSCP > -80 dBm
- In-car service level: RSCP > -95 dBm
- Outdoor service level: RSCP > -100 dBm

CDMA2000 coverage measurement: Three thresholds for PILOT_STRENGTH
(Pilot Channel Strength, the RF energy collected after the decorrelation/
descrambling procedure):
- Indoor service level: PILOT_STRENGTH > -80 dBm
- In-car service level: PILOT_STRENGTH > -95 dBm
- Outdoor service level: PILOT_STRENGTH > -100 dBm

WiMAX coverage measurement: Three thresholds for RSSI (Received Signal
Strength Indicator):
- Indoor service level: RSSI > -75 dBm
- In-car service level: RSSI > -90 dBm
- Outdoor service level: RSSI > -95 dBm

Question 4: Do respondents agree with the thresholds proposed for the KPIs? If
not, what thresholds do respondents recommend based on other
countries' experience and best practices?


4. Measurement Procedure

This section discusses the measurement entity, methodology, and tools that are
proposed to be used for QoS measurements in Afghanistan.


4.1 Measurement Entity

The QoS measurement process could be conducted by:
- Operators (with the results being transmitted to ATRA)
- ATRA teams, or
- A third party.

There are various advantages and drawbacks to each option, as discussed in Figure 1
below.

Measurement entity | Advantages | Drawbacks
Operators | Minimum workload and costs for ATRA | Difficult to demonstrate independence.
ATRA | Independent and fully under ATRA control | Requires trained and up-to-date staff, tools and means; expensive; disputable by operators.
Third Party | Independent and fully under ATRA control; no need for ATRA to maintain tools and measurement teams. | Choosing a Third Party can be expensive and time consuming.
Figure 1: Relative merits of different QoS measurement entities

ATRA proposes that the third option above (measurement by a Third Party) is
adopted for the initial measurement campaigns. The third party strategy provides
important advantages over the other two options and hence is the strategy most
widely adopted by regulators around the world, including ARCEP (France), Ofcom
(UK), INT (Tunisia), the Infocomm Development Authority (IDA) in Singapore, the
New Zealand Commerce Commission (CC), the Bahrain Telecommunications
Regulatory Authority (TRA), the Egypt National Telecommunications Regulatory
Authority (NTRA), ANRT in Morocco, ARCEP in Burkina Faso, and ARPCE in the
Congo Republic, among others.

The measurement process will be conducted by this independent entity (called the
Third Party) chosen by ATRA for its experience and skills. The measurement team
will include Third Party measurement engineers and ATRA representatives, as
necessary, who will ensure that due process is followed.

After a given period (e.g., 2 to 4 QoS assessment campaigns), ATRA's own team
could undertake this task, once it has fully acquired the experience and skills to
do so.



Question 5: Do respondents agree that a Third Party should undertake the initial
measurement campaigns?


4.2 General Process

The QoS KPI measurement process that is proposed to be used in Afghanistan is as
follows:

a. Kick-off meeting: a meeting will be organized at the start of each
measurement campaign to gather all operators at ATRA's premises. The Third
Party will present the plans for QoS measurement, including the tools,
planning and measurement process. Any remarks or comments from the
operators on this will be considered and addressed.

b. Operators will be informed of the time period during which the campaign will
occur, but not of the specific plans for the measurements (i.e., which specific
routes on which specific days). Operators will be required to organize their
network Operation and Maintenance activities accordingly, to avoid any
disturbance on their networks (such as equipment upgrade) that could affect
the measurement results.

c. Measurements will then be conducted during the chosen period and in the
defined regions (the drive test methodology and measurement periodicity are
described separately in Sections 4.3.2 and 5 below). Operator teams will not be
permitted to accompany or follow the measurement team.

d. Collection of system measurement data (i.e. OMC raw data) will be either
achieved automatically through specific devices by the Third Party or
provided by the operators themselves. The format for system measurement
data provision is defined in Appendix A.

e. During the process, any information required from the operators should be
provided within one week of a request. If information is not supplied within this
period, any corresponding KPIs will be considered as non-compliant.

f. The collected data will be post-processed by the Third Party in order to extract
the QoS KPIs and establish each network's provided QoS.

g. At the end of the measurement mission, a presentation of the results will be
given to ATRA by the Third Party, and then repeated for ATRA in the presence
of each operator individually. Sanctions can then be applied if the provided QoS
does not meet the requirements defined by ATRA.

h. Service quality results for each operator will then be published on ATRA's
website, to facilitate public examination and enable consumer choice.


Question 6: Do respondents have any comments or additional details to add to
this process description?



4.3 Technical Approach and Methodology

Drive tests and collection of system measurement data (i.e., OMC raw data) for QoS
assessment are complementary. For example, drive testing facilitates the assessment
of coverage quality in a way that system measurement data cannot (since no
connection is available in coverage holes and thus no system information is available
for these areas). On the other hand, system measurement data provides continuous
and exhaustive information (i.e., from all the radio sites) 24 hours a day, 7 days a
week, which drive testing is not able to do.

The following sections propose the tools, means and methods used to perform QoS
assessment through both drive testing and collection of system measurement data.

4.3.1 Tools and Means
Drive Test Measurement Tool
Various QoS KPIs will be measured using approved, industry-standard drive test
equipment, such as TEMS. The measurements realized in the field will consist of
drive tests in each city and on the roads between the cities. One or more sets of test
equipment may be used.

The drive test tool will include GPS equipment, to allow individual measurements to
be geographically located.

The test equipment tool will be able to measure the voice quality using the PESQ¹ or
MOS² algorithms.

ATRA will provide details, to the operators, of the drive test tool in use (including
software version and handset type). Its performance can be tested, if required, by a
joint team (ATRA, operators and the Third Party) before commencing any
measurement campaign.

All drive test log files will be made available to the operators and the processing
technique will be detailed if different from that described in this document.

Drive Test Post-Processing Tool
Drive test measurements will be processed in order to extract the relevant KPIs.
Statistics will be developed and presented graphically and/or geographically.

OMC Raw Data
In order to get a proper view and perform a complete analysis of QoS, system
measurement data (collected at the Operation and Maintenance Center level) will also
be used and processed to extract the relevant KPIs. When combined with drive tests,
these will provide a fuller picture of the QoS offered by each network, especially in
terms of blocking rate, accessibility and communication quality.

¹ PESQ (Perceptual Evaluation of Speech Quality) is a mechanism for automated
assessment of the speech quality experienced by the user of a telephone system. It is
standardized as ITU-T Recommendation P.862 (02/01).
² MOS (Mean Opinion Score): see ITU-T Recommendation P.800.

System measurement data will be required from the operators in a format defined by
ATRA (see Appendix A). Once the Third Party has checked the data, operators will
have one week to correct any issues identified, after which the supplied files will be
considered as final and used for processing the KPIs.

System measurement data will be provided by the operators to ATRA, either
manually or automatically through a server connected to the relevant equipment. Data
should be available covering at least the same period as the drive test measurements.

OMC Raw Data Post-Processing Tools
Collected system measurement data will be processed in order to extract the relevant
KPIs. Post-processing tools adapted to each equipment vendor (such as Alcatel-Lucent,
Nokia Siemens Networks, Huawei and Ericsson) will be used for this purpose.

The formulas defining the required KPIs will be provided by the operators in terms of
the relevant system measurement counters for each vendor's equipment. The Third
Party may nevertheless alter these formulas if necessary to homogenize the definitions
of the KPIs across the various vendors' equipment.

Presentation of KPIs will be done graphically and/or geographically.
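As a sketch of how such homogenization might work in practice, the fragment below
applies one common KPI definition to counters whose names differ between vendors.
The vendor labels and all counter names are hypothetical illustrations, not actual
vendor counter names.

```python
# Sketch: one common call drop rate formula applied across
# vendor-specific OMC counter names (all names are hypothetical).

VENDOR_COUNTERS = {
    "vendor_a": {"drops": "TCH_DROP_CNT", "seizures": "TCH_SEIZURE_CNT"},
    "vendor_b": {"drops": "CallDropNum", "seizures": "CallSetupSuccNum"},
}

def call_drop_rate(vendor: str, counters: dict[str, float]) -> float:
    """Dropped calls as a percentage of successful seizures."""
    names = VENDOR_COUNTERS[vendor]
    seizures = counters[names["seizures"]]
    return 100.0 * counters[names["drops"]] / seizures if seizures else 0.0

# Example: the same formula evaluated against two vendors' raw counters.
print(call_drop_rate("vendor_a", {"TCH_DROP_CNT": 12, "TCH_SEIZURE_CNT": 800}))
print(call_drop_rate("vendor_b", {"CallDropNum": 9, "CallSetupSuccNum": 600}))
```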

Vendor Documentation
In order to understand the network architecture and component configuration and
dimensioning, the operators will be required to provide the Third Party with relevant
vendor documentation, on request.


Question 7: Do respondents have any comments regarding these tools, means and
methods as means for measuring the QoS performance of mobile
communications networks?

4.3.2 Drive Test Methods
This section proposes the methods used for the drive tests that will be carried out.
Definition of these methods is essential to obtain the required results in a consistent
and repeatable manner.

There are two types of drive test measurements:

Idle mode: here the mobile is not in any active call state. This mode allows
coverage to be assessed in terms of availability.
Dedicated mode: here the mobile is in an active call state. This mode allows
network and service accessibility to be tested, as well as service
maintainability and quality during use. This mode facilitates the collection of
the highest number of parameters and information on the network from the
user's perspective.


Dedicated mode measurements will be performed during busy hours, whilst idle mode
measurements will be performed at any time during the day.

Drive test routes will not be communicated to the operators. A set of drive test routes
will be defined covering cities, villages and roads. From this list, a sample of drive test
routes will be randomly chosen for each measurement campaign. The choices made
will be designed, in part, to avoid excessive costs and delays when assessing the
coverage of the networks.

For each measured area, the drive test route will cover the area as homogeneously as
possible.

The following conditions will apply to the drive test tool and its configuration:

- Anonymous SIMs will be used to avoid any tracking by operators
- All user terminals will be placed in a standardized rack attached to the front
window of the vehicle
- The same standard of vehicle will be used for all drive tests.
- The user terminals used in the drive test tool for call making shall be 2 Watts
(peak) in the case of GSM, 125 mW (peak) in the case of UMTS, 250 mW
(peak) in the case of CDMA2000 and 200 mW (peak) in the case of
WiMAX. The target user terminals used for receiving on-net test calls from
the drive test tool shall be of the same standard.
- The target user terminals will be configured to automatically answer calls
after 2 rings (without any manual intervention).
- The target user terminals will not be in any data service mode during tests (for
example, receiving email, downloading data or attaching to an IM service)
- The target user terminals will be placed in the same vehicle as the drive test
tool.

For each sub-area or road, and after having installed the drive test tool in the vehicle,
data acquisition software will collect all the required network parameters. The
network parameters will be stored in a specific log file in real time and processed after
the drive test.

Internet browsing and FTP measurements will be performed using a laptop with an
appropriate user terminal or a USB modem. During the measurements, the laptop will
run an automated script which performs a network attach, activates a data session and
downloads a specific file from an FTP server (located within the operator's premises).

If no data is received for a maximum of 30 seconds, the transfer will be interrupted
and the laptop/user terminal will be detached from the network. The test will then be
repeated. If the packet data is received within the specified delay, the laptop/user
terminal will continue its FTP download until the complete file is received. It will
then be detached from the network.
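A minimal sketch of such a script is shown below, assuming a hypothetical server
address, credentials and file path; the 30-second no-data rule is approximated with a
socket timeout, and any interrupted transfer is counted as a failed attempt.

```python
# Sketch of the automated FTP download test described above.
# Host, credentials and file path are hypothetical placeholders.

import ftplib

def ftp_download_test(host: str, user: str, password: str, path: str) -> bool:
    """Return True if the complete file is received, False on any failure."""
    received = bytearray()
    try:
        # The 30 s timeout enforces the "no data received" abort rule.
        with ftplib.FTP(host, timeout=30) as ftp:
            ftp.login(user, password)
            ftp.retrbinary(f"RETR {path}", received.extend)
        return True
    except ftplib.all_errors:
        return False  # interrupted transfer: detach and repeat the test

if __name__ == "__main__":
    ok = ftp_download_test("ftp.example.af", "qos", "secret", "testfile.bin")
    print("download complete" if ok else "download failed")
```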


4.3.3 Measurement Samples
For each area and road, the percentage of samples meeting each service level will be
computed for each operator. Once the signal power level (RxLev, RSCP or
Pilot_Strength) has been extracted from the log files, the statistics will be established
based on the following formulas (RxLev can be replaced by RSCP or Pilot_Strength
with their corresponding thresholds):
% of indoor quality = 100 × (Number of samples with RxLev > -72 dBm) / (Total number of samples in the area)

% of in-car quality = 100 × (Number of samples with RxLev > -87 dBm) / (Total number of samples in the area)

% of outdoor quality = 100 × (Number of samples with RxLev > -92 dBm) / (Total number of samples in the area)

Measurement Number of Samples: a minimum number of samples per area and per
road is proposed to ensure representative measurements. These are detailed in Figure
2 below.

Zone or Road | Service | Number of samples (calls/SMS/sessions)
Main cities stratum | Voice | 500
Main cities stratum | SMS | 200
Main cities stratum | Data | 200
Other cities stratum | Voice | 100
Other cities stratum | SMS | 100
Other cities stratum | Data | 50
Villages stratum | Voice | 50
Villages stratum | SMS | 30
Villages stratum | Data | 20
Roads stratum | Voice | 15 per 50 km
Roads stratum | SMS | 5 per 50 km
Figure 2: Minimum numbers of measurement samples

Question 8: Do respondents agree with these minimum numbers of measurement
samples?


4.3.4 KPIs extracted from OMC raw data
With regard to the statistics based on system measurement data, the only constraint is
the periodicity with which the counters are extracted from the OMC. This periodicity
is detailed in Section 5 below. The OMC KPIs are
presented in Appendix A.

Note: The KPIs that are obtained through drive test measurements can be compared
with the KPIs extracted from OMC raw data. They should match; if they do not,
either an explanation will be provided to justify the difference and the validated KPIs
will be used for assessing the QoS, or new measurements or processing will have to
be conducted.
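As a sketch, the comparison described in this note could automatically flag KPIs
whose drive-test and OMC-derived values diverge beyond a tolerance; the 10%
relative tolerance used below is an assumption for illustration, not a proposed rule.

```python
# Sketch: flag KPIs where drive-test and OMC-derived values diverge by
# more than a relative tolerance (the 10% default is an assumption).

def mismatched_kpis(drive: dict[str, float], omc: dict[str, float],
                    rel_tol: float = 0.10) -> list[str]:
    """Return the names of KPIs needing justification or re-measurement."""
    flagged = []
    for name in drive.keys() & omc.keys():
        reference = max(abs(drive[name]), abs(omc[name]), 1e-9)
        if abs(drive[name] - omc[name]) / reference > rel_tol:
            flagged.append(name)
    return flagged

print(mismatched_kpis({"call_drop_rate": 2.1}, {"call_drop_rate": 3.0}))
# -> ['call_drop_rate']  (0.9/3.0 = 30% divergence)
```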


Question 9: Do respondents agree with the list of data (see Appendix A) to be
provided by the operators or collected automatically by ATRA?





5. Measurement Periodicity

For drive tests, it is proposed that one measurement campaign will occur each quarter
(3 months). Drive testing is costly and time consuming; hence, a once-a-year
periodicity can be considered appropriate for certain areas.


Concerning the system measurements, the proposal is to implement an automated
system which collects the OMC raw data from the networks, processes it and provides
the corresponding KPIs. In this case, the periodicity can be daily.

If any issues with KPIs are identified, it is proposed that a correction period of 1 to 6
months will be granted to allow the relevant operator to fulfill the requirements.
After the correction period, measurements will be performed again to verify the impact
of the corrections. These verification measurements will be at the expense of the
relevant operator. If the verification measurements still identify an issue with KPIs,
then ATRA will consider the application of sanctions.


Question 10: Do respondents agree with these periodicities?



6. Coverage Areas Classification

Concerning the coverage quality, three service levels have been defined earlier:
Indoor, In-Car and Outdoor. For each of the areas and roads, the measurements of
signal level will be distributed per level with post-processing tools. More precisely, if:
- Indoor_Level is the signal level (in dBm) required for indoor coverage,
- Incar_Level is the signal level (in dBm) required for in-car coverage,
- Outdoor_Level is the signal level (in dBm) required for outdoor coverage,

the measurement filtering will enable the percentage of measurements which have a
signal power level greater than Indoor_Level, Incar_Level and Outdoor_Level to be
extracted. These will then be compared to the target values below required for each
stratum and type of coverage (i.e., indoor, incar, outdoor).

Signal power received from each network will be measured using the drive test tools,
means and methods described earlier and compared to the relevant threshold.

It is proposed that the licensees be required to cover a certain percentage of the
population after a defined period. Three phases of coverage extension will be defined
as shown in Figure 3 below.

Phase | Year for 2G coverage | Year for 3G coverage
1 | |
2 | |
3 | |
Figure 3: Coverage requirements, by phase

The country is therefore divided into areas and roads that will be subject to coverage
in phase 1, 2 or 3.

It is proposed that coverage control will be achieved through a stratum approach.
Strata will be defined as follows:

- Main cities stratum: a main city is considered as covered when at least:
o 90% of the measurements indicate a power level higher than Indoor_Level,
o 95% of the measurements indicate a power level higher than Incar_Level,
o 98% of the measurements indicate a power level higher than
Outdoor_Level,
- Other cities stratum: another city is considered as covered when at least:
o 85% of the measurements indicate a power level higher than Indoor_Level,
o 90% of the measurements indicate a power level higher than Incar_Level,
o 95% of the measurements indicate a power level higher than
Outdoor_Level,
- Villages stratum: a village is considered as covered when at least:
o 70% of the measurements indicate a power level higher than Indoor_Level,
o 85% of the measurements indicate a power level higher than Incar_Level,
o 90% of the measurements indicate a power level higher than
Outdoor_Level,
- Roads stratum: a road is considered as covered when at least:
o 75% of the measurements indicate a power level higher than Incar_Level,
o 95% of the measurements indicate a power level higher than
Outdoor_Level.
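To make the decision rule explicit, the sketch below checks the measured percentages
for an area against the targets of its stratum; the stratum keys and the measured values
are illustrative.

```python
# Sketch: is an area "covered" for its stratum? Targets are the minimum
# percentages of measurements above each service level, as listed above.

STRATUM_TARGETS = {
    "main_cities": {"indoor": 90, "in_car": 95, "outdoor": 98},
    "other_cities": {"indoor": 85, "in_car": 90, "outdoor": 95},
    "villages": {"indoor": 70, "in_car": 85, "outdoor": 90},
    "roads": {"in_car": 75, "outdoor": 95},  # no indoor target for roads
}

def is_covered(stratum: str, measured: dict[str, float]) -> bool:
    """measured maps each service level to its % of qualifying samples."""
    return all(measured.get(level, 0.0) >= target
               for level, target in STRATUM_TARGETS[stratum].items())

# Example: a village with 72% indoor, 88% in-car and 93% outdoor passes.
print(is_covered("villages", {"indoor": 72, "in_car": 88, "outdoor": 93}))
```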

The areas associated with each stratum will be defined in Figure 4 below.

Main Cities Stratum
Phase 1
Phase 2
Phase 3

Other Cities Stratum
Phase 1
Phase 2
Phase 3

Villages Stratum
Phase 1
Phase 2
Phase 3

Roads Stratum
Phase 1
Phase 2
Phase 3
Figure 4: Areas associated with each stratum

Note: ATRA may adopt a rule whereby some areas (called "white areas" in France,
for instance) which may appear not economically viable for the operators are covered
by at least one operator and where any Afghan network subscriber can be connected.
This model assumes that national roaming is allowed for these particular cases. We
strongly recommend this network sharing model in order to reduce the investment in
areas where only a very long term return on investment, or even none at all, can be
expected. In a white area, all the operators are responsible for the quality of service
and, if the target values are not met, all operators sharing the network in this area can
be fined at the same time with an equal share of the penalty.
Question 11: Do respondents have suggestions on this classification and
distribution of the coverage areas into strata?

Question 12: Which parts of the country and coverage are subject to security
constraints that do not allow operators to maintain networks and
related QoS? Detail the areas and reasons. See Appendix B for
details on the Drive Test itineraries.

7. Summary of Key Points on which Respondents are invited to
comment

Question 1: Do respondents have any proposal based on other countries'
experience or best practices to improve this framework of KPIs,
targets, measurement processes (including field measurements and
OMC data) and evaluation criteria to evaluate the QoS of their
networks?

Question 2: Do respondents have any proposal concerning this process and
methodology [to the method of implementing this proposed QoS
framework]?

Question 3: Do respondents agree that the KPIs defined above should be
measured and used to evaluate the QoS of wireless communications
networks in Afghanistan? If not, what additional or different KPIs
do respondents recommend should be included in mandatory QoS
standards?

Question 4: Do respondents agree with the thresholds proposed for the KPIs? If
not, what thresholds do respondents recommend based on other
countries' experience and best practices?

Question 5: Do respondents agree that a Third Party should undertake the initial
measurement campaigns?

Question 6: Do respondents have any comments or additional details to add to
this process description?

Question 7: Do respondents have any comments regarding these tools, means and
methods as means for measuring the QoS performance of mobile
communications networks?

Question 8: Do respondents agree with these minimum numbers of measurement
samples?

Question 9: Do respondents agree with the list of data (see Appendix A) to be
provided by the operators or collected automatically by ATRA?

Question 10: Do respondents agree with these periodicities?

Question 11: Do respondents have suggestions on this classification and
distribution of the coverage areas into strata?

Question 12: Which parts of the country and coverage are subject to security
constraints that do not allow operators to maintain networks and
related QoS? Detail the areas and reasons. See Appendix B for
details on the Drive Test itineraries.





Appendix A Information and System Data to be provided by
Operators
Network Information
Each operator is responsible for providing data and information about its network. For
the purposes of ATRA carrying out service quality surveys, the licensee shall provide
to the licensor, on written request, the following:

- Maps showing coverage for services by technology
- An up-to-date list of the locations of base stations (BTS) (indicating the
technologies deployed at each) and transmission nodes, including GPS
coordinates and frequency assignments used.

OMC Raw Data and System Data to be provided
We propose to request the following list of data and information from the operators.
Some items are directly required for KPI computation; others are informative and
only useful in case of deeper investigation into the origin of QoS problems.

These data will be provided on a daily basis, averaged over one hour.

Infrastructure Core Network

Columns: Nodes | Vendor | HW Version | SW Version | Capacity
Rows: HLR, MSC-S, MGW, STP, SGSN, GGSN, SMSC, MMSC, VMS, IN

Infrastructure Radio Network

Columns: Infrastructure | Numbers
Rows: BSCs/RNCs, Site, Cell, TRX/Carrier


HLR
Columns: NE Name | Date | Hour | Number of Subscribers






VLR
Columns: NE Name | Date | Hour | Registered Subscribers | Attached Subscribers





Paging and Location Updates

Columns: MSC/MSC-S | Date | Hour | Number of First Paging Attempts | Number of First Paging Successes | Number of Second Paging Successes | Number of Location Update Attempts | Number of Location Update Successes




SMS

Columns: MSC | Date | Hour | Number of SMS MO Attempts | Number of SMS MO Successes | Number of SMS MT Attempts | Number of SMS MT Successes







Voice TDM

Columns: NE Name | Trunk ROUTE | Date | Hour | Circuits Number | Incoming Calls Number | Outgoing Calls Number | Number of Blocked Incoming Calls | Number of Blocked Outgoing Calls | Incoming Traffic | Outgoing Traffic | Number of Accumulations (NSCAN value in Ericsson MSC)



GGSN

Columns: GGSN | Date | Hour | PDP Capacity | Throughput Capacity in Gbps | Throughput Capacity in PPS | No. of Subs | No. of Active PDP Contexts | PDP Context Activation Requests | PDP Context Activation Accepts | Downlink Octets Sent (OUT) | Uplink Octets Received (IN) | Downlink Packets Sent | Uplink Packets Received | Measurement Interval in Seconds (to determine throughput from the number of octets)



SGSN

Columns: SGSN | Date | Hour | SAU Capacity | PDP Capacity | Throughput Capacity in Gbps | Number of Attached Users | Number of Active PDP Contexts | Number of Attach Requests | Number of Attach Accepts | Number of Paging Requests | Number of Paging Successes | PDP Context Activation Requests | PDP Context Activation Accepts | Downlink Octets Sent (OUT) | Uplink Octets Received (IN) | Measurement Interval in Seconds (to determine throughput from the number of octets)
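Both the GGSN and SGSN tables pair octet counters with a measurement interval so
that throughput can be derived. A minimal sketch of that derivation is shown below;
the counter values are illustrative.

```python
# Sketch: deriving apparent throughput (kbps) from an octet counter and
# the measurement interval requested in the GGSN/SGSN tables.

def throughput_kbps(octets: int, interval_seconds: int) -> float:
    """Convert octets transferred over an interval into kilobits per second."""
    return (octets * 8) / (interval_seconds * 1000)

# Example: 1,500,000 downlink octets over a 60-second interval -> 200.0 kbps.
print(throughput_kbps(1_500_000, 60))
```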


Sites database

Columns: BSC/RNC | Site name | Sector name | Area | TRX/Carrier number | CI | LAC | Longitude | Latitude | Azimuth | Carrier 1 | Carrier 2


CS counters

Columns (part 1): Date | Hour | BSC | Cell | Full Rate Traffic Volume (Erl) | Half Rate Traffic Volume (Erl) | Configured TCHs/Carriers (number) | Available TCHs/Carriers (number) | Configured signaling channels (number) | Available signaling channels (number) | Signaling channel establishment requests (times) | Signaling channel establishment failures due to congestion (times) | Successful signaling channel establishments (times)

Columns (part 2): Date | Hour | BSC | Cell | Signaling channel call drops (times) | Traffic channel assignment requests (times) | Failed traffic channel assignments due to congestion (times) | Successful traffic channel assignments (times) | Traffic channel call drops (times) | Outgoing internal inter-cell handover requests (times) | Successful outgoing internal inter-cell handovers (times) | Outgoing external inter-cell handover requests (times) | Successful outgoing external inter-cell handovers (times)









PS1 counters

Columns: Date | Hour | BSC/RNC | Cell | Number of Uplink TBF Establishment Attempts (times) | Number of Successful Uplink TBF Establishments (times) | Number of Downlink TBF Establishment Attempts (times) | Number of Successful Downlink TBF Establishments (times) | Total number of PDCH preemption attempts (times) | Total number of packet channel allocation failures (times) | Total Bytes of Uplink Users' LLC PDUs (bytes) | LLC Throughput of Uplink Users (kbps) | Total Bytes of Downlink Users' LLC PDUs (bytes) | LLC Throughput of Downlink Users (kbps)


PS2 counters

Columns: SGSN | Date | Hour | GPRS attach requests (times) | GPRS attach accepts (times) | PDP Context Activation Requests (times) | PDP Context Activation Accepts (times)






SMS

Columns: Date | Hour | Total Submissions (times) | Total Successful Submissions (times) | Total Deliveries (times) | Total Successful Deliveries (times)



It is proposed that KPIs which have not been computed because of missing
information or corrupted data will be considered as non-compliant and can be subject
to sanctions.

Appendix B Specification of the Areas (cities and roads)
The drive test is an important task in assessing the quality of service.
For security reasons, two scenarios of drive test itineraries are defined.
Scenario A:
Kabul, Kunduz, Masar-I-Sharif, Maimana, Herat, Kandahar, Kabul.
Scenario A requires enhanced security for the road segment (Herat, Kandahar, Kabul).
Scenario B:
Kabul, Kunduz, Masar-I-Sharif, Maimana, Herat.
Scenario B does not require special security reinforcement (beyond regular safety
measures) for the whole itinerary.
These two scenarios are illustrated in the diagrams presented below.


Code category | Stratum | Site | Category designation
C01 | Main cities stratum | Kabul | Capital (very big town)
C02 | Other cities stratum | Kandahar, Masar-I-Sharif, Herat, Kunduz, Jalalabad | Main cities (big town)
C03 | Villages stratum | Taluqan, Baghlan, Gazni, Lashargah, Farah, Maimana, Shiberghan | Other cities

Classification of the Roads

Code category | Road (Junction) | Category designation | Traffic | Distance [km]
R01 Road stratum | Road (Kabul, Masar) | Main road | Very high | 496
R01 Road stratum | Road (Kabul, Kunduz) | Main road | Very high | 347
R01 Road stratum | Road (Kabul, Kandahar) | Main road | Very high | 497
R01 Road stratum | Road (Kandahar, Herat) | Main road | Very high | 568
R01 Road stratum | Road (Masar, Herat) | Main road | Very high | 735
R01 Road stratum | Road (Kabul, Jalalabad) | Main road | Very high | 155







Code category | Road (Junction) | Category designation | Traffic | Distance [km]
R02 Road stratum | Road (Kabul, Gardez) | Secondary road | Medium | 121
R02 Road stratum | Road (Farah, Zaranj) | Secondary road | Medium | 221
R02 Road stratum | Road (Kabul, Bamiyan) | Secondary road | Medium | 176
R02 Road stratum | Road (Kandahar, Kirin-Kot) | Secondary road | Medium | 160
R02 Road stratum | Road (Baghlan, Ayak) | Secondary road | Medium | 96.8
R02 Road stratum | Road (Herat, Chaghcharan) | Secondary road | Medium | 385







[Diagram: Scenario A itinerary]



[Diagram: Scenario B itinerary]

Appendix C List of Relevant Standards
ITU
E.800 et al. Quality of Telecommunications Services: Concepts, Models,
Objectives and Dependability Planning
G.1000 et al. Communications Quality of Service
G.1010 End-user multimedia QoS categories
G.107 The E-Model: voice quality analysis
G.1070 Opinion model for videophone applications
P.862 (PESQ) Perceptual evaluation of speech quality
Y.1540 IP packet transfer and availability performance parameters

ETSI & 3GPP
202 057 Parts 1-4 User related QoS parameter definitions and measurements
TS 122 105 Universal Mobile Telecommunications System (UMTS);
Services and service capabilities
TS 123 107 Digital cellular telecommunications system; Quality of
Service (QoS) concept and architecture
3GPP TS 23.107 Quality of Service (QoS) concept and architecture

IETF
RFC 2990 Next Steps for the IP QoS Architecture
RFC 3583 Requirements of a Quality of Service (QoS) Solution for Mobile
IP
