High-Density Wireless
Networks for Auditoriums
Validated Reference Design
Copyright
2010 Aruba Networks, Inc. AirWave, Aruba Networks, Aruba Mobility Management System, Bluescanner, For Wireless That
Works, Mobile Edge Architecture, People Move. Networks Must Follow, RFprotect, The All Wireless Workplace Is Now Open For
Business, Green Island, and The Mobile Edge Company are trademarks of Aruba Networks, Inc. All rights reserved. Aruba Networks
reserves the right to change, modify, transfer, or otherwise revise this publication and the product specifications without notice. While
Aruba uses commercially reasonable efforts to ensure the accuracy of the specifications contained in this document, Aruba will assume
no responsibility for any errors or omissions.
Legal Notice
ARUBA DISCLAIMS ANY AND ALL OTHER REPRESENTATIONS AND WARRANTIES, WHETHER EXPRESS, IMPLIED, OR
STATUTORY, INCLUDING WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, TITLE,
NONINFRINGEMENT, ACCURACY AND QUIET ENJOYMENT. IN NO EVENT SHALL THE AGGREGATE LIABILITY OF ARUBA EXCEED
THE AMOUNTS ACTUALLY PAID TO ARUBA UNDER ANY APPLICABLE WRITTEN AGREEMENT OR FOR ARUBA PRODUCTS OR
SERVICES PURCHASED DIRECTLY FROM ARUBA, WHICHEVER IS LESS.
www.arubanetworks.com
1344 Crossman Avenue
Sunnyvale, California 94089
Phone: 408.227.4500
Fax 408.227.4550
High-Density Wireless Networks for Auditoriums Validated Reference Design | Solution Guide
October 2010
Contents

Chapter 1    Introduction
             Reference Documents

Chapter 2    Design Requirements for Auditorium HD WLANs
             Functional Requirements

Chapter 3    Capacity Planning for HD-WLANs

Chapter 4    RF Design for HD WLANs
             Aesthetic Considerations

Chapter 5    Infrastructure Optimizations for HD WLANs

Chapter 6    Configuring ArubaOS for HD-WLANs

Chapter 7    HD WLAN Troubleshooting
             Symptom #2: Device can see SSIDs but not the one it needs

Appendix A   HD WLAN Testbed
             What is a Client Scaling Test?
             Testbed Design

Appendix B   Advanced Capacity Planning Theory for HD WLANs

Appendix C   Basic Picocell Design

Appendix D   Dynamic Frequency Selection Operation
             DFS Summary

Appendix E
Chapter 1
Introduction
This guide explains how to implement an Aruba 802.11n wireless network that must provide high-speed
access to an auditorium-style room with 500 or more seats. Aruba Networks refers to such networks as
high-density wireless LANs (HD WLANs). Lecture halls, hotel ballrooms, and convention centers are
common examples of spaces with this requirement. Because the number of concurrent users on an AP
is limited, serving such a large number of devices requires access point (AP) densities well in excess of
the usual one AP per 2,500 to 5,000 ft2 (225 to 450 m2). Such coverage areas therefore present many special
technical design challenges. This validated reference design provides the design principles, capacity
planning methods, and physical installation knowledge needed to successfully deploy HD WLANs.
(Figure: the Aruba VRD library. Base designs: Campus Wireless Networks, Retail Wireless Networks, and
Virtual Branch Networks. Incremental designs: High-Density Wireless Networks, Optimizing Aruba WLANs
for Roaming Devices, and Wired Multiplexer (MUX).)
A base design is a complete, end-to-end reference design for common customer scenarios. Aruba
publishes the following base designs:
Campus Wireless Networks VRD: This guide describes the best practices for implementing a
large campus wireless LAN (WLAN) that serves thousands of users spread across many different
buildings joined by SONET, MPLS, or any other high-speed, high-availability backbone.
Retail Wireless Networks VRD: This guide describes the best practices for implementing retail
networks for merchants who want to deploy centrally managed and secure WLANs with wireless
intrusion detection capability across distribution centers, warehouses, and hundreds or thousands
of stores.
Virtual Branch Networks VRD: This guide describes the best practices for implementing small
remote networks that serve fewer than 100 wired and wireless devices that are centrally managed
and secured in a manner that replicates the simplicity and ease of use of a software VPN solution.
An incremental design provides an optimization or enhancement that can be applied to any base design.
Aruba publishes the following incremental designs:
High-Density Wireless Networks VRD (this guide): This guide describes the best practices for
implementing coverage zones with high numbers of wireless clients and APs in a single room such
as lecture halls and auditoriums.
Optimizing Aruba WLANs for Roaming Devices VRD: This guide describes the best practices
for implementing an Aruba 802.11 wireless network that supports thousands of highly mobile
devices such as Wi-Fi phones, handheld scanning terminals, voice badges, and computers mounted
to vehicles.
Wired Multiplexer (MUX) VRD: This guide describes the best practices for implementing a wired
network access control system that enables specific wired Ethernet ports on a customer network to
benefit from Aruba role-based security features.
Assumption
Reference Documents
The following technical documents provide additional detail on the technical issues found in
HD WLANs:
ARM Yourself to Increase Enterprise WLAN Data Capacity, Gokul Rajagopalan and Peter
Thornycroft, Aruba Networks, 2009
Adaptive CSMA for Scalable Network Capacity in High-Density WLAN: a Hardware Prototyping
Approach, Jing Zhu, Benjamin Metzler, Xingang Guo and York Liu, Intel Corporation, 2006
Own the Air: Testing Aruba Networks Adaptive Radio Management (ARM) in a High-Density
Client Environment, Network Test Inc., July 2010
Data sheets for Aruba AP-105, AP-124, and AP-125 access points
Data sheets for Aruba AP-ANT-13B, AP-ANT-16, AP-ANT-17, and AP-ANT-18 external antennas
Chapter 2
Design Requirements
for Auditorium HD WLANs
HD WLANs are defined as RF coverage zones with a large number of wireless clients and APs in a single
room. With the proliferation of wireless-enabled personal and enterprise mobile devices, a surprisingly
diverse range of facilities need this type of connectivity:
Hotel ballrooms
Airport concourses
Casinos
This VRD addresses auditorium-style areas. When you understand the auditorium scenario, it is quite
straightforward to apply the design principles to almost any type of high-density coverage zone.
The high concentration of users in any high-density environment presents challenges for designing and
deploying a wireless network. The explosion of Wi-Fi-enabled smartphones means that each person
could have two or more 802.11 NICs vying for service, some of which may be capable of only 2.4-GHz
communication. At the same time, maximum HD WLAN capacity varies from country to country based
on the number of available radio channels. Balancing demand, capacity, and performance in this type of
wireless network requires careful planning.
This chapter defines the functional and technical requirements of the auditorium scenario, including
those for client devices, wired infrastructure, and wireless infrastructure. Understanding these
requirements sets the stage for the design, configuration, and troubleshooting chapters to follow.
Functional Requirements
The typical auditorium addressed by this VRD has a total target capacity of 500 seats. If each user is
carrying a laptop and a Wi-Fi-enabled PDA or smartphone, the total WLAN client count could be as high
as 1,000 devices. The average real-world, per-client bandwidth need is usually no more than 1 Mbps
even for many video streaming deployments. In Chapter 3, Capacity Planning for HD-WLANs on
page 17, we discuss how higher or lower throughput targets alter the total capacity of an HD WLAN.
Figure 2 500 Seat University Lecture Hall
The users in an auditorium are evenly distributed across the space because they are usually sitting in
rows of stadium-type seating. The user density in the seating areas is an average of 1 user per 15 ft2
(1.5 m2), including aisles and other common areas. As many as 20 APs could be deployed in a single
auditorium, depending on the total number of allowed channels in the regulatory domain. Available
mounting locations are often less than ideal, and aesthetic and cable routing considerations limit
installation choices.
Figure 3 shows the user density in a typical auditorium or lecture hall environment.
Figure 3 Auditorium of 320 Seats with Typical Dimensions
The user density of the typical auditorium is approximately 20 times greater than that of an office
environment. In a typical office environment with a mix of cubicles and offices, client density averages
one person per 250 to 350 ft2 (23 to 33 m2), including common areas, with a per-client bandwidth need of 500 Kbps or less.
It is common to deploy one AP every 2,500 to 5,000 ft2 (225 to 450 m2), which provides for average
received signal strengths of -65 to -75 dBm depending on the walls and other structures in the area.
Also, the office environment provides much more flexibility in AP mounting and placement choices.
In universities and convention centers, it is common for several auditoriums of varying capacities to
exist side-by-side or above-and-below one another. This situation makes the design even more challenging
because the rooms are almost always adjacent and close enough to require careful management of
co-channel interference (CCI) and adjacent channel interference (ACI) between auditoriums, including
both intended and unintended RF interaction between APs, between clients, and between APs and clients
in different rooms. As a result, such facilities require special RF design consideration, which is
covered in Chapter 4, RF Design for HD WLANs on page 31.
Simultaneous Logins/Logoffs: The RADIUS or other authentication server must be able to handle
the inrush and outrush of users at fixed times (such as a class start and stop bell). Ensure that the
AAA server can accommodate the expected peak number of authentications per second. You can
use the Aruba command show aaa authentication-server radius statistics to monitor
average response time.
IP Address Space: Sufficient addresses must be available to support not only laptops but also
smartphones and other future Wi-Fi-compatible devices that may expect connectivity. Some surplus
space will be necessary to support inrush and outrush of users in a transparent fashion and in
concert with the DHCP service lease times in order to prevent address exhaustion.
DHCP Service: The DHCP server for the HD WLAN must also be able to accommodate an
appropriate inrush peak load of leases per second. Lease times must be optimized to the length of
sessions in the room so that the address space can be turned over smoothly between classes or
meetings.
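As a rough sizing sketch for the AAA and DHCP requirements above (the device count, inrush window, and safety margin are illustrative assumptions, not Aruba guidance):

```python
# Back-of-the-envelope sizing for AAA and DHCP inrush at a class-change bell.
# All inputs are illustrative assumptions.

def inrush_rate(devices, inrush_seconds):
    """Peak events per second if all devices arrive evenly over the window."""
    return devices / inrush_seconds

def dhcp_pool_size(devices, safety_margin=0.25):
    """Address pool with surplus to ride out overlapping leases during turnover."""
    return int(devices * (1 + safety_margin))

devices = 1000   # 500 seats, two radios per seat (laptop + smartphone)
window = 120     # assume users file in over roughly 2 minutes

print(f"authentications/sec and leases/sec: {inrush_rate(devices, window):.1f}")
print(f"suggested DHCP pool size: {dhcp_pool_size(devices)}")
```

The same per-second figure applies to both the RADIUS server and the DHCP server, since each arriving device triggers one of each event.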
Chapter 3
Capacity Planning
for HD-WLANs
Over the next four chapters you will learn capacity planning, RF design, configuration, and validation
for HD WLANs. In this chapter, you will learn the basic approach to planning an HD WLAN and making
a first-order assessment of whether the desired level of performance is possible for an area of a given
size.
This chapter uses charts and lookup tables to provide the wireless architect with the necessary sizing
parameters. These tables are based on extensive validation testing conducted in the Aruba labs. For
those interested in the mathematics and theory of HD WLAN design behind the charts, Appendix B,
Advanced Capacity Planning Theory for HD WLANs on page 107 provides a technical explanation of
the process.
(Figure: the HD WLAN capacity planning process — choose capacity goal, determine usable channel
count, choose concurrent user target, predict total capacity, validate against goal.)
1. Choose a capacity goal: The first step is to pick an application-layer throughput target linked to
the seating capacity of the auditorium.
2. Determine the usable number of channels: For each band, decide how many nonoverlapping
channels are usable for the HD WLAN. Use a database of regulatory information included here,
augmented by site-specific decisions such as whether or not Dynamic Frequency Selection (DFS)
channels are available.
3. Choose a concurrent user target: Determine the maximum number of simultaneously
transmitting clients that each AP will handle. Use a lookup table based on test data supplied by
Aruba. You must do this for each radio on the AP.
4. Predict total capacity: Use the channel and concurrent user count limits to estimate the maximum
capacity of the auditorium using lookup tables supplied by Aruba.
5. Validate against capacity goal: Compare the capacity prediction with the capacity goal from
step 1. If the prediction falls short, you must start over and adjust the goal, concurrent user limit, or
channel count until you have a plan that you can live with. For large auditoriums over 500 seats, you
should be prepared to accept a per-client throughput of 500 Kbps or less, assuming a 50/50 mix of
.11n and .11a stations and nine usable channels.
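The five steps above can be sketched end-to-end. The per-client throughput value below is an assumed placeholder; real values come from the lookup tables later in this chapter.

```python
# Minimal sketch of the five-step capacity planning loop.
# per_client_kbps is an assumed figure, not Aruba test data.

def predict_capacity(channels, users_per_radio, per_client_kbps):
    total_users = channels * users_per_radio        # one radio per channel, no reuse
    aggregate_kbps = total_users * per_client_kbps
    return total_users, aggregate_kbps

goal_users, goal_kbps = 500, 500                    # 500 seats at 500 Kbps each

users, agg = predict_capacity(channels=9,           # e.g. US non-DFS 5-GHz channels
                              users_per_radio=50,
                              per_client_kbps=560)
meets_goal = users >= goal_users and agg / users >= goal_kbps
print(users, meets_goal)   # 450 users: falls short, so adjust the goal or channel count
```

When the prediction falls short, as here, step 5 sends you back to revisit the goal, the concurrent-user limit, or the channel count.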
NOTE: This guide assumes that channels will not be reused within a single auditorium.
If channel reuse is required to achieve the capacity goal, see Appendix C, Basic Picocell Design on
page 113 for an advanced discussion of the theoretical issues involved in managing AP-to-AP and
client-to-client interference. In practice, reuse is extremely difficult to achieve in most auditoriums due to
their relatively small size and the signal propagation characteristics of multiple-in, multiple-out (MIMO)
radios. Reuse requires more complex calculations and testing, as well as potential modification of
physical structures in the user environment.
Total number of devices: Often, this is just equal to the seating capacity of the area. Sometimes,
each seat may contain more than one client (that is, one laptop and one Wi-Fi-capable smartphone).
This is important because every MAC address consumes airtime, an IP address, and other network
resources.
Minimum bandwidth per device: This is primarily driven by the mix of data, voice, and video
applications that will be used in the room. Aruba recommends using LAN traffic studies to precisely
quantify this value.
Each classroom has 30 students who each need 2 Mbps of symmetrical throughput.
The auditorium holds 500 people. Each one has a laptop that must have at least 350 Kbps for data
and a voice handset that requires at least 128 Kbps.
The trading floor must serve 800 people with at least 512 Kbps each.
Each of these scenarios provides the wireless architect with a clear, concise, and measurable end state.
It's a good idea to build in future capacity needs. While the number of seats in the auditorium is not
likely to change, it is nearly certain that the number of 802.11 radios per seat will increase in the future.
Be sure to consider the actual duty cycle of each device type when setting the capacity goal. In many
cases, it is unlikely that every device will need access to the maximum capacity simultaneously (unless
there are specific applications that require it such as interactive learning systems). It's a good idea to
use a wireless packet capture utility to study the actual bandwidth requirements of a typical user. Many
customers initially overestimate their bandwidth requirements.
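Expressed numerically, the three example end states above translate into aggregate capacity goals:

```python
# Aggregate capacity goals for the three example scenarios, in Kbps.

def goal_kbps(clients, kbps_each):
    return clients * kbps_each

classroom = goal_kbps(30, 2000)                          # 30 students at 2 Mbps
auditorium = goal_kbps(500, 350) + goal_kbps(500, 128)   # laptop data + voice handset
trading_floor = goal_kbps(800, 512)

print(classroom / 1000, "Mbps")       # 60.0 Mbps
print(auditorium / 1000, "Mbps")      # 239.0 Mbps
print(trading_floor / 1000, "Mbps")   # 409.6 Mbps
```

Each figure is the measurable end state against which the capacity prediction is later validated.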
Wi-Fi operates in the 2.4-GHz band and in different segments of the 5-GHz band. The available RF
channels are subject to national regulations, but generally there is 83 MHz available at 2.4 GHz and
around 460 MHz at 5 GHz. The 802.11 standard uses 20-MHz or 40-MHz (for 802.11n) channels, so
standard Wi-Fi equipment is also constrained by these parameters. The number of allowed
nonoverlapping channels is the primary capacity constraint on an HD WLAN. For this reason, HD
WLANs should always use the 5-GHz band for primary client service because most regulatory domains
have many more channels in this band.
(Figure: the US 5-GHz channel layout. Lower bands from the 5150-MHz band edge: channels 36-64.
US intermediate band (UNII-II Extended), 5470-5725 MHz: channels 100-140, 11 x 20-MHz or
5 x 40-MHz channels, requires DFS. US UNII-III/ISM band, 5725-5850 MHz: channels 149-165,
4 x 20-MHz or 2 x 40-MHz channels.)
In 2007 the radio regulatory bodies in many countries allowed the use of the UNII-II extended band
from 5470 MHz to 5725 MHz as long as UNII-II equipment was capable of Dynamic Frequency Selection
(DFS). DFS requires that the AP monitor all RF channels for the presence of radar pulses and switch to
a different channel if a radar system is located. Wi-Fi equipment that is DFS-certified can use the
extended band, which adds up to another eleven 20-MHz channels or five 40-MHz channels (depending
on the radio regulatory rules in each country).
Table 2 lists the typical channels available for some example regulatory domains at the time of
publication.
DFS Channels
Table 2 Typical 5GHz Channels Available for Use in Selected Regulatory Domains

Channel #  Freq (MHz)  USA  Europe  Japan  Singapore  China  Israel  Korea  Brazil
36         5180        Yes  Yes     Yes    Yes        No     Yes     Yes    Yes
40         5200        Yes  Yes     Yes    Yes        No     Yes     Yes    Yes
44         5220        Yes  Yes     Yes    Yes        No     Yes     Yes    Yes
48         5240        Yes  Yes     Yes    Yes        No     Yes     Yes    Yes
52         5260        Yes  Yes     Yes    Yes        No     Yes     Yes    Yes
56         5280        Yes  Yes     Yes    Yes        No     Yes     Yes    Yes
60         5300        Yes  Yes     Yes    Yes        No     Yes     Yes    Yes
64         5320        Yes  Yes     Yes    Yes        No     Yes     Yes    Yes
100        5500        Yes  Yes     Yes    No         No     No      Yes    Yes
104        5520        Yes  Yes     Yes    No         No     No      Yes    Yes
108        5540        Yes  Yes     Yes    No         No     No      Yes    Yes
112        5560        Yes  Yes     Yes    No         No     No      Yes    Yes
116        5580        Yes  Yes     Yes    No         No     No      Yes    Yes
120        5600        No   No      Yes    No         No     No      Yes    No
124        5620        No   No      Yes    No         No     No      Yes    No
128        5640        No   No      Yes    No         No     No      Yes    No
132        5660        No   No      Yes    No         No     No      No     No
136        5680        Yes  Yes     Yes    No         No     No      No     Yes
140        5700        Yes  Yes     Yes    No         No     No      No     Yes
149        5745        Yes  No      No     Yes        Yes    No      Yes    Yes
153        5765        Yes  No      No     Yes        Yes    No      Yes    Yes
157        5785        Yes  No      No     Yes        Yes    No      Yes    Yes
161        5805        Yes  No      No     Yes        Yes    No      Yes    Yes
165        5825        Yes  No      No     Yes        Yes    No      Yes    Yes
Total                  20   15      19     13         5      8       21     20
Actual channel availability for any given installation depends on the specific AP model selected, the
current status of regulations, and any local country-specific deviations or changes from this table since
the time of publication. Aruba recommends that you contact our Technical Assistance Center or a
professional installer to obtain a specific list for your deployment. An Aruba controller will also report
the valid channels for a given regulatory domain with the show ap allowed-channels country-code <country code> command.
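The channel counts in Table 2 can also be reproduced programmatically. The sketch below transcribes the US and European rows and splits them by DFS requirement; the channel sets come from the table above and may have changed since publication, so always verify with show ap allowed-channels on the controller.

```python
# Counting usable 5-GHz channels with and without DFS, per Table 2.
# In the US, only channels 36-48 and 149-165 are usable without DFS;
# everything from 52 through 140 requires DFS.

NON_DFS = {36, 40, 44, 48, 149, 153, 157, 161, 165}

ALLOWED = {
    "USA":    {36, 40, 44, 48, 52, 56, 60, 64, 100, 104, 108, 112, 116,
               136, 140, 149, 153, 157, 161, 165},
    "Europe": {36, 40, 44, 48, 52, 56, 60, 64, 100, 104, 108, 112, 116,
               136, 140},
}

for country, chans in ALLOWED.items():
    without_dfs = chans & NON_DFS
    print(country, len(chans), "channels with DFS,", len(without_dfs), "without")
# USA: 20 with DFS, 9 without; Europe: 15 with DFS, 4 without
```

The nine non-DFS US channels match the "nine usable channels" assumption used for the large-auditorium planning example in the previous chapter.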
NOTE: As of October 5, 2009, the United States FCC and the European Telecommunications Standards
Institute have disallowed 5600 to 5650 MHz (approximately channels 120-132) for use with WLANs. This is to
avoid interference with airport Terminal Doppler Weather Radar systems. Aruba APs with approvals as of that
date, including the AP-120 series and AP-105, are allowed to continue using those channels, but future
AP models may not support them.
Enabling or disabling specific channels is done through the Regulatory Domain Profile of the AP Group
to which the auditorium APs belong. Configuration of channel availability is covered in Chapter 6,
Configuring ArubaOS for HD-WLANs on page 67.
First, actual or false positive radar events can be extremely disruptive to a WLAN that attempts to use
DFS channels. Users on DFS channels can potentially experience lengthy service interruptions from
radar events. Because radar frequencies do not align with 802.11 channelization, such events can
impact multiple Wi-Fi channels simultaneously. See Appendix D, Dynamic Frequency
Selection Operation on page 119 for a more detailed discussion of radar operation and DFS
compatibility.
Second, as of this writing, many 802.11 client Network Interface Cards (NICs) do not support DFS
channels, especially outside the United States. Client devices in an auditorium are not generally under
the control of the facility operator, so always be sure to include non-DFS channels in your HD WLAN
channel plan for these devices.
NOTE: The question of usability is also a function of the client and what channels its chipset/driver
combination supports for that regulatory profile. For example, with driver version 13.1.1.1, both the
Intel 5100agn and 5300agn WLAN NICs support all DFS channels in the US (both 52-64 and 100-140).
However, with the same driver, the Intel 4965agn does not support channels 100-140. Another
example is the Cisco 7925G voice handset, which does not support channel 165.
Third, ArubaOS will not allow the Aruba Receive Sensitivity Tuning-Based Channel Reuse feature to be
used with DFS channels, because it could result in the AP missing radar events. This feature is only
available on the non-DFS channels in any regulatory domain.
Site-Specific Restrictions
Because high-density coverage zones are just one part of a larger facility, the channel plan for the rest
of the site may also impose constraints on channel availability. Be sure to consider any reserved
channels that are required for indoor or outdoor mesh operations, or for dedicated applications such as
IP surveillance video. It is prudent to conduct a spectrum clearing survey to ensure that no fixed
frequency interference sources would further reduce channel selection.
(Figure: an example 5-GHz channel layout with two-cell isolation between APs assigned to the same
channel.)
However, in an auditorium, channel reuse is driven by the number of devices to be served. Because
each radio can serve a finite number of devices, there is a limit to the total number of clients that can be
in an area without either oversubscribing the APs or reusing the allowed radio channels.
Achieving channel reuse in a single room of less than 10,000 ft2 (930 m2) is technically challenging
and requires expensive directional antennas and costly physical installation. The antennas and cables can
negatively impact the room aesthetics, which is a concern in most buildings. However, no channel reuse
is needed for auditoriums of up to nearly 1,000 devices in the United States, Europe, Japan, and Korea
with DFS enabled (assuming 50 simultaneously transmitting clients per radio). Without DFS, up to 650
devices can be accommodated in the US and 400 devices in Europe.
As this covers most common auditorium sizes, the main body of this VRD uses a simple lookup table
approach for capacity planning assuming that no channel reuse occurs. Appendix C, Basic Picocell
Design on page 113 presents the mathematics behind channel reuse distances. If your high-density
coverage zone does require reuse, picocells with under-floor mounting will likely be required. This is
described in Chapter 4, RF Design for HD WLANs on page 31.
The three nonoverlapping 2.4-GHz channels (1, 6, and 11) are available in most countries today. With a
small amount of overlap, four channels have sometimes been employed to increase overall system capacity.
However, four-channel plans are not advisable in HD WLANs due to the very high levels of ACI already
present in the environment.
Because of the very limited number of nonoverlapping channels in the 2.4-GHz band, it is vital to
anticipate how many of those radios will be on that band and to conduct a basic traffic study for the
applications expected in your high-density coverage area. Aruba has found that most smartphones that
provide basic push email service have low duty cycles and consume 256 Kbps or less. Voice-over-Wi-Fi
handsets using the higher-quality G.711 codec generate 128 Kbps of bidirectional traffic.
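As a sketch of such a traffic study, the offered load implied by the duty-cycle figures above can be spread across the three nonoverlapping 2.4-GHz channels. The device counts are illustrative assumptions; the 256 Kbps and 128 Kbps figures come from the text.

```python
# Rough 2.4-GHz offered-load estimate from per-device duty-cycle figures.
# Device counts are illustrative assumptions.

smartphones, handsets = 300, 50          # assumed 2.4-GHz population
load_kbps = smartphones * 256 + handsets * 128
channels = 3                             # nonoverlapping 2.4-GHz channels

print(round(load_kbps / channels / 1000, 1), "Mbps offered per channel")
```

A result like this makes it easy to see whether the expected load will oversubscribe the three available channels before any APs are placed.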
(Figure: supported users versus number of 2.4-GHz APs, at 50 users per radio. Capacity steps at 150,
300, and 450 users correspond to no reuse, 1 reuse, and 2 reuses; plans requiring 2 or 3 reuses are
marked NOT PRACTICAL.)
In planning mixed 2.4-GHz and 5-GHz deployments with dual-band APs, only one 2.4-GHz radio should
be enabled on each of the three channels. Don't forget that these channels are very likely already being
reused outside the auditorium, which will further reduce the overall capacity of each 2.4-GHz channel.

NOTE: With under-floor mounting, it may be possible to reuse each 2.4-GHz channel one time in a very
large auditorium over 10,000 ft2 (930 m2). If this is a requirement in your environment, see the
section on picocells using under-floor mounting in Chapter 4, RF Design for HD WLANs on page 31.
As part of the validation testing for this VRD, Aruba completed open-air client scaling test runs for five
different mixes of 802.11n (HT20) and 802.11a clients, ranging from 100% HT20 to 100% 802.11a.
The testbed included a heterogeneous mix of 50 different laptops and netbooks with a wide variety of
operating systems and wireless NICs, just as you would find in a real auditorium. Ixia Chariot was used
as the traffic generator.
Figure 9 shows the effect of these combinations on application-layer throughput. The left vertical axis is
the average per-client application-layer throughput in Mbps (shown by the lines). The right vertical axis
shows the total channel capacity relative to the total throughput for 10 clients transmitting at one time
(shown by the bars). When we change just 25% of the clients on a 5-GHz HT20 channel to be 802.11a
only, the average per-client throughput is reduced by between 20% and 25%, depending on the number
of stations in the test. Increasing the .11a client mix to 50/50 results in another 25% reduction in both
aggregate and per-client throughput. Interestingly, little difference was observed with less than 50%
HT20 clients.
Figure 9 5-GHz Per-Client Mixed-Mode TCP Client Scaling Performance
These results were obtained with airtime fairness enabled in preferred access mode, which provides
somewhat more transmit slots to HT clients. Without airtime fairness, legacy clients starve newer
802.11n clients by consuming a greater share of the airtime. Airtime fairness reduces the amount of
transmit time made available to legacy stations, effectively penalizing them so that the fastest clients
obtain the bulk of the airtime. In Chapter 5, Infrastructure Optimizations for HD WLANs on page 51,
you will learn more about how to leverage this feature in your HD WLAN.
Clients  100% HT20  75/25 mix  50/50 mix  25/75 mix  100% 11a
10       5.99 Mbps  4.69 Mbps  4.17 Mbps  3.96 Mbps  1.50 Mbps
20       2.99 Mbps  2.20 Mbps  1.73 Mbps  1.72 Mbps  0.75 Mbps
30       1.81 Mbps  1.46 Mbps  1.10 Mbps  1.07 Mbps  0.50 Mbps
40       1.30 Mbps  1.03 Mbps  0.75 Mbps  0.68 Mbps  0.36 Mbps
50       0.94 Mbps  0.77 Mbps  0.54 Mbps  0.56 Mbps  0.28 Mbps
(mix = ratio of HT20 to 11a clients)
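Multiplying the per-client figures by the client count gives the implied aggregate channel throughput, and reproduces the roughly 20-25% per-client reduction cited above. The intermediate-mix label used below (75% HT20 / 25% 11a) is inferred from the surrounding text, not from a labeled data column.

```python
# Aggregate channel throughput implied by the per-client test data
# (clients x per-client Mbps). The "75/25" label is an inference.

per_client = {                  # clients: per-client Mbps
    "100% HT20": {10: 5.99, 20: 2.99, 30: 1.81, 40: 1.30, 50: 0.94},
    "75/25":     {10: 4.69, 20: 2.20, 30: 1.46, 40: 1.03, 50: 0.77},
}

for mix, rates in per_client.items():
    aggregate = {n: round(n * r, 1) for n, r in rates.items()}
    print(mix, aggregate)

drop = 1 - per_client["75/25"][10] / per_client["100% HT20"][10]
print(f"per-client drop at 10 clients: {drop:.0%}")   # about 22%
```

The 22% figure at 10 clients sits inside the 20-25% range reported for converting a quarter of the clients to 802.11a-only.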
5-GHz Capacity
Use Figure 10 to quickly arrive at the total device capacity of your HD WLAN in 5 GHz. Choose your
country and whether DFS is available or not. Follow that upward to the line that matches the
concurrent user target you picked in Step #3: Choose a Concurrent User Target on page 25. The total
user/device count can be seen on the Y axis.
Figure 10 HD WLAN User Capacity Predictor
(Chart: total supported users, 200 to 1,200 on the Y axis, versus number of 5-GHz radios, 9 to 24 on
the X axis, with one line each for targets of 10, 20, 30, 40, and 50 users per radio.)
The chart allows a wireless designer to rapidly assess the capacity limit of a given auditorium. Table 4
provides the same information in tabular form.
Table 4 HD WLAN User Capacity Matrix - 5 GHz
Radios  10/radio  20/radio  30/radio  40/radio  50/radio
1       10        20        30        40        50
2       20        40        60        80        100
3       30        60        90        120       150
4       40        80        120       160       200
5       50        100       150       200       250
6       60        120       180       240       300
7       70        140       210       280       350
8       80        160       240       320       400
9       90        180       270       360       450
10      100       200       300       400       500
11      110       220       330       440       550
12      120       240       360       480       600
13      130       260       390       520       650
14      140       280       420       560       700
15      150       300       450       600       750
16      160       320       480       640       800
17      170       340       510       680       850
18      180       360       540       720       900
19      190       380       570       760       950
20      200       400       600       800       1,000
21      210       420       630       840       1,050
22      220       440       660       880       1,100
23      230       460       690       920       1,150
24      240       480       720       960       1,200
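Each cell in Table 4 is simply radios multiplied by the concurrent-user target, so the matrix can be sketched in one line (assuming one radio per channel and no reuse, as throughout this chapter):

```python
# Table 4 in one function: maximum device count, assuming one radio
# per channel and no channel reuse.

def hd_wlan_capacity(radios, users_per_radio):
    return radios * users_per_radio

print(hd_wlan_capacity(20, 50))   # 1000 devices with 20 radios at 50 users each
```

This is the same calculation the capacity predictor chart encodes graphically.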
2.4-GHz Capacity
We begin by determining how large the current population of 2.4-GHz-only devices is and what type of
growth to expect on that band. The following approaches can be used to answer these questions:
Simply assume that each user has one 5-GHz and one 2.4-GHz client (such as a laptop and a
smartphone). This is the worst case.
If dual-band coverage exists elsewhere in the facility, use historical WLAN client association data
from a network monitoring system, such as the AirWave Wireless Management Suite, to obtain a
ratio of 2.4-GHz to 5-GHz users as well as per-station bandwidth consumption.
In the second case, you would then multiply the base occupancy of the auditorium by the ratio of users
to get the 2.4-GHz population. To be conservative, increase the ratio by 5-10% to provide a safety margin
for near-term growth in the 2.4-GHz band.
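The ratio-based estimate described above can be sketched as follows; the 0.6 association ratio is an invented example value, not a measured one.

```python
# Estimating the 2.4-GHz client population from a measured association
# ratio, with the 5-10% near-term growth margin suggested above.

def population_24ghz(seats, ratio_24_to_5, margin=0.10):
    return int(seats * ratio_24_to_5 * (1 + margin))

# e.g., AirWave history shows 0.6 2.4-GHz clients per 5-GHz client (assumed)
print(population_24ghz(500, 0.6))   # 330 devices to plan for
```

The result feeds directly into the 2.4-GHz capacity matrix in Table 5.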
Table 5 lists the maximum number of 2.4-GHz devices that are supportable for a given number of
nonoverlapping channels.
Table 5 HD WLAN User Capacity Matrix - 2.4 GHz
Radios  10/radio  20/radio  30/radio  40/radio  50/radio
1       10        20        30        40        50
2       20        40        60        80        100
3       30        60        90        120       150
4*      40        80        120       160       200
5*      50        100       150       200       250
6*      60        120       180       240       300

* CAUTION: 1 reuse is required, which requires picocell deployment. See Chapter 4, RF Design for HD WLANs on
page 31 and Appendix C, Basic Picocell Design on page 113 for more information.
The obvious problem with this chart is how to support a 500-seat or larger auditorium where every user
has an iPhone, BlackBerry, or other 2.4-GHz-only-capable smartphone. If picocells are not feasible, then
the only solution is to oversubscribe each radio. Use Aruba's airtime fairness feature to help distribute
capacity evenly among the users associated to each AP.
Chapter 4
RF Design for HD WLANs
Coverage in HD WLANs is achieved by carefully combining the number of APs as determined in the
previous chapter with the physical space for which the designer is providing wireless services.
Placing many APs in close proximity to one another and enabling them to operate with minimal
interference requires the use of several specific wireless design principles. These principles must be
balanced against building limitations like mounting restrictions, cabling requirements, room shape, and
room size. This chapter will teach you how to achieve this balance successfully.
Overhead Coverage
Ceilings are a common AP mounting location because they generally allow an unobstructed view down
to the wireless clients. By distributing APs consistently and evenly across a ceiling, you are able to limit
AP-AP interference (also known as coupling) while providing very uniform signal levels for all client
devices at floor level. Figure 11 shows what an overhead coverage deployment would conceptually look
like.
Figure 11 Simplified Overhead Coverage Example
(Side and overhead views of evenly spaced ceiling-mounted APs assigned to nonoverlapping 5-GHz
channels.)
Overhead coverage is a good choice when uniform signal is desired everywhere in the auditorium.
Overhead APs are usually out of view above eye level. It is even possible to conceal the system
completely by flush mounting external antennas to the ceiling. Of course, it must be possible to access
the ceiling without too much difficulty or expense to pull cable and install equipment. No channel reuse
is possible with overhead coverage because the signal spreads throughout the room; every AP will be
heard at high signal strength everywhere in the auditorium. This holds even for areas underneath
balconies of up to 10 rows, because APs in the front portion of the auditorium will generally have
favorable line-of-sight even when the AP immediately overhead is obstructed.
Some omnidirectional antennas are designed with built-in electrical downtilt. Aruba recommends the
use of these downtilt or squint antennas for overhead coverage, either integrated directly into the AP or
externally connected. Although they are omnidirectional in the horizontal plane, they have
directionality in the vertical plane. They focus substantial energy in the downward direction or, if
mounted under the floor facing up, they focus and receive energy upward. See Table 7 for specifications
of the models that Aruba recommends.
Figure 12 AP-ANT-16 Downtilt Antenna Flush-Mounted to Ceiling Grid
These antennas look like patch antennas but they are installed facing downward. They are electrically
designed to provide a full 360 degrees of omnidirectional coverage with standard vertical polarization.
However, when viewing the E-plane from the side, we can see that the antenna provides approximately
120 degrees of vertical beamwidth with the direction of maximum gain centered around a 45-degree
down angle, as shown in Figure 13. This produces a coverage pattern shaped like a cone underneath
the antenna.
Figure 13 E-Plane Antenna Pattern of AP-ANT-16
These are commonly referred to as downtilt or squint antennas. From the plot, it is clear that the
antenna pattern helps with interference rejection in two important ways:
External room interference: Because the direction of maximum gain is straight down, 802.11
signals outside the room on the same floor will not be aligned within the 3-dB beamwidth of the
antenna. In the case of two auditoriums on top of one another, the back lobe is up to 12 dB down
from the main lobe.
Reduced AP-AP interference at ceiling level: In the plane of the ceiling, the pattern of a downtilt
antenna is about 8 dB down from the main lobe, which allows APs to be spaced somewhat more
closely for a given EIRP.
A ceiling deployment can occur at, below, or above the level of the ceiling surface. With above-ceiling
installations where external antennas are not being used, take care to leverage building
obstructions such as pillars, ductwork, or floor joists, which can benefit the RF design by further reducing
AP-AP coupling within the room. The closer the obstruction is to the AP, the greater the blocking effect.
APs should never be placed more than 6 inches above the ceiling material, to minimize obstructions in the
direction of the users.
Figure 14 Use Attenuating Building Materials to Reduce AP-AP Coupling
Here is a summary of the advantages and disadvantages of overhead coverage for auditoriums:
Pros
Cons
Side coverage, with APs mounted on walls or pillars, is a good alternative when overhead mounting is
impractical. Typical scenarios include:
Co-located APs in an A/V area in the back of an auditorium with directional antennas facing
forwards.
Hotel ballrooms where APs with integrated antennas can only be placed along the sides of the room,
mounted to speaker stands or simply placed on tables.
Where pillars or columns exist in very large auditoriums, it is often practical to mount on them
3-6 ft (1-2 m) above the users.
Structures with no overhead or under-floor access, which could include temporary structures like
tents or open air fairs.
As with overhead coverage, channel reuse is not possible when mounting to walls or pillars. Care must
be taken to orient antenna patterns to cover the intended area and reduce AP-to-AP interference.
Figure 15 shows what a wall-based side-coverage solution that uses integrated omnidirectional
antennas looks like conceptually.
Figure 15 Simplified Side Coverage Example with Integrated Antenna
The illustration is meant to show AP position and antenna pattern, not the actual signal propagation. In
fact, even in the very largest auditoriums every AP will likely be able to hear every other AP. It is vital
that adjacent channels, such as 36 and 40, not be adjacent on the wall. Aruba ARM will automatically
manage this for you, but the level of CCI/ACI in a side coverage design is much less desirable than in the
overhead or under-floor cases. You may find that mounting closer to the floor is more successful. For
example, one university customer experienced an issue when they side-mounted the APs at 15-20 ft
(4.5-6 m) above floor height. The APs all saw each other with such strong signal strength that they
automatically tuned their power down to match the ARM coverage index, leaving AP-to-client signals
weaker than required. This resulted in seated clients getting very inconsistent connectivity. When the
APs were moved to floor level, located underneath the desks/seats in a few locations, much better
performance was achieved.
You will note that half of the wall-mounted AP signals are lost to the next room (and 75% of the signal in
the corners). With multiple adjacent HD WLANs this can be exploited by the wireless designer, but
otherwise it represents a waste of signal.
You can overcome the signal leakage problem through the use of low-gain external directional antennas
aimed sideways. This can also be achieved very inexpensively by mounting the Aruba AP-105 with its
integrated downtilt pattern vertically on the wall, pointing back to the seats. In this case, no special
antenna is required. See Table 8 for specifications on the models that Aruba recommends.
Figure 16 Simplified Side Coverage Example with Directional Antennas
This strategy also allows APs to be spaced slightly closer together for the same reasons explained under
Overhead Coverage. For details on computing minimum AP-AP separation, see Appendix C, Basic
Picocell Design on page 113.
Aruba strongly advises against the use of high-gain directional antennas (8 dBi or more) in auditoriums
for several reasons:
Questionable benefit: With MIMO technology, signal scattering in typical size auditoriums negates
any value of a narrower beamwidth. At distances typically required in an HD WLAN, higher gain
antennas are not necessary for good coverage and can increase the interfering signal levels within
the coverage space significantly.
Poor near-field signal: Narrow vertical-beamwidth antennas mounted just 12-15 ft (4-5 m) above
the floor do not actually reach the ground for dozens of yards (meters). Close in to the antenna,
clients may experience weak signal as a result of being outside the 3-dB beamwidth.
Increased interference outside room: High-gain directional antennas can adversely affect
WLANs outside the auditorium in the direction of maximum gain.
Multiple radomes: The maximum gain for a dual-band antenna in a single radome is about 8 dBi.
Higher gain requires separate antenna radomes for each band. This can be unsightly.
Aesthetics: MIMO panel antennas are relatively large, have multiple RF cables, and generally
require an azimuth-elevation swivel mount. This looks great on a rooftop mast, but not so good in an
ornate auditorium.
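The near-field point above can be checked with simple trigonometry: the lower 3-dB edge of a directional antenna's beam first reaches the floor at a horizontal distance of height / tan(downtilt + half-beamwidth). A minimal sketch, where the antenna parameters are hypothetical examples rather than Aruba specifications:

```python
import math

def beam_floor_distance_m(mount_height_m, downtilt_deg, vert_beamwidth_deg):
    """Horizontal distance at which the lower 3-dB edge of the beam
    first reaches the floor. Returns inf if the edge never points down."""
    edge_deg = downtilt_deg + vert_beamwidth_deg / 2.0
    if edge_deg <= 0:
        return float("inf")
    return mount_height_m / math.tan(math.radians(edge_deg))

# Hypothetical 10-degree vertical beamwidth panel with no downtilt,
# mounted 4.5 m (~15 ft) above the floor:
print(round(beam_floor_distance_m(4.5, 0, 10)))  # ~51 m before the beam edge lands
# A 60-degree beamwidth antenna at the same height:
print(round(beam_floor_distance_m(4.5, 0, 60)))  # ~8 m
```

With the narrow beam, seats closer than roughly 51 m sit outside the 3-dB beamwidth, which is exactly the near-field weak-signal problem described above.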
Sometimes pillars or columns exist in an auditorium, and they may even have existing cable pathways
to them. These can be used by the wireless designer to achieve more uniform coverage of a room than
is possible from just the walls alone. When using integrated omnidirectional antennas, be sure to take
into account the shadow that a pillar or column creates on the opposite side from the AP. This can be
used to the designer's advantage to limit AP-AP coupling. The closer the AP is to the pillar, the greater
the blocking effect.
As you can see, an infinite variety of side-coverage scenarios are possible. Here is a summary of the
advantages and disadvantages of side coverage for auditoriums:
Pros
Cons
Floor mounting is the best choice when there is convenient access underneath the auditorium either for
locating APs or simply pulling cable up into the auditorium from beneath. APs can be located in small
enclosures that are permanently mounted underneath or behind seats.
This strategy has all the advantages of overhead coverage, without the maintenance access headaches.
Because signal is directed upward, impact on adjacent HD WLANs on the same floor is negligible. In
multifloor buildings, inter-floor isolation is also generally good.
It may also be possible to install APs in the ceiling of the floor or basement underneath, shooting up
through the floor. This method can allow for even finer control of the cell size. However, it may be
necessary to use directional antennas with 6-8 dBi higher gain to compensate for interfloor absorption,
such as the AP-ANT-18. Many invisible construction details can influence RF penetration of floor slabs.
Validation testing in a variety of possible configurations should be completed before this method is
selected. The distance from the AP to the slab and floor construction have a direct impact on the size of
the cell in the user space.
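The relationship between AP-to-slab distance and cell size can be approximated with simple cone geometry: the 3-dB illuminated width at the slab is roughly 2 x distance x tan(half-beamwidth). A rough sketch, using a 60-degree beamwidth as an example and assuming an idealized free-space cone:

```python
import math

def illuminated_width_m(dist_m, vert_beamwidth_deg):
    """3-dB cone width where an upward-facing antenna's beam meets the slab."""
    return 2.0 * dist_m * math.tan(math.radians(vert_beamwidth_deg / 2.0))

# A 60-degree beam mounted 2 m below the slab:
print(round(illuminated_width_m(2.0, 60.0), 1))  # ~2.3 m wide at the slab
# Moving the antenna to 4 m below doubles the illuminated width:
print(round(illuminated_width_m(4.0, 60.0), 1))  # ~4.6 m
```

This is why validation testing matters: the geometry sets a lower bound on cell size, but slab construction and absorption dominate the actual signal level in the user space.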
Figure 19 Effect of AP Distance on Picocell Width
Aruba has studied signal propagation of underfloor mounting. Figure 20 shows an AirMagnet survey of
an AP-124 with AP-ANT-16 facing up on channel 44 at 3 dBm conducted power, or 6 dBm total EIRP. It
is mounted underneath a layer of -in plywood.
The radius of the -70 dBm signal was approximately 10 ft (3 m) in this test. Aruba subsequently set up
two APs 40 ft (12 m) apart and measured signal roll off between them. Figure 21 shows that roughly 20
dB of isolation was achieved between these cells.
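As a sanity check, the roll-off between picocells can be modeled with a log-distance path-loss formula. The exponent n=2.2 is the value cited later in Figure 34; the 1-m free-space reference and the omission of floor and seat absorption are our simplifying assumptions:

```python
import math

def path_loss_db(d_m, freq_mhz=5220.0, n=2.2):
    """Log-distance path loss: free-space loss at a 1 m reference,
    then 10*n*log10(d) roll-off beyond it. Channel 44 center is 5220 MHz."""
    pl_1m = 32.44 + 20 * math.log10(1e-3) + 20 * math.log10(freq_mhz)
    return pl_1m + 10 * n * math.log10(d_m)

def rx_dbm(eirp_dbm, d_m):
    """Received signal strength at distance d for a given EIRP."""
    return eirp_dbm - path_loss_db(d_m)

# 6 dBm EIRP picocell, as in the survey above:
for d in (3, 6, 12):
    print(d, "m:", round(rx_dbm(6, d), 1), "dBm")
```

With these assumptions the model predicts roughly -51 dBm at 3 m; the measured -70 dBm radius at that distance suggests the plywood layer and enclosure contribute substantial additional attenuation that free-space propagation alone does not capture.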
Figure 21 AirMagnet 3D Survey of Side-by-Side Picocells at 6 dBm EIRP
Here is a summary of the advantages and disadvantages of floor coverage for auditoriums. For more
detailed information on picocell design, see Appendix C, Basic Picocell Design on page 113 or contact
your local Aruba representative.
Pros
Cons
Recommended Products
Aruba offers both integrated-antenna and external-antenna capable 802.11n APs to enable you to
implement the plan of your choice. Table 6 compares features of the Aruba 802.11n APs, particularly
antennas and RF performance.
Table 6 Aruba 802.11n APs

AP-105 (integrated antennas): dual radio, 2x2:2 MIMO, integrated downtilt antenna.
Advantages: best TX power, best RX sensitivity, lowest cost, integrated downtilt antenna, smallest
footprint, wall or ceiling mount.

AP-125 (integrated antennas): dual radio, 3x3:2 MIMO, integrated dipole antenna.
Advantages: 3x3 MIMO, high-performance CPU, integrated dipole antenna, wall or ceiling mount.

AP-124 (external antennas): dual radio, 3x3:2 MIMO, three dual-band RPSMA connectors; 5-GHz receive
sensitivity same as the AP-125; maximum antenna gain depends on the selected external antenna.
Advantages: 3x3 MIMO, high-performance CPU, supports external antennas, AP can be concealed behind
walls or ceilings.
Table 7 and Table 8 list the antennas that are recommended for use with the AP-124 in external antenna
deployments.
Table 7 Downtilt Antennas

Model: AP-ANT-13B-KIT | AP-ANT-16
E-Plane (Vertical) Antenna Pattern: > 60 degrees, centered at a 45-degree down angle (both models)
H-Plane (Horizontal) Antenna Pattern: omnidirectional (both models)
Table 8 Directional Antennas

Model: AP-ANT-17 | AP-ANT-18
Antenna Elements: 3 (linear vertical and dual slant +/- 45 degrees), both models
E-Plane (Vertical) Antenna Pattern: 60 degrees with 15-degree electrical downtilt, both models
H-Plane (Horizontal) Antenna Pattern: 120 degrees (AP-ANT-17) | 60 degrees (AP-ANT-18)
Figure (HD_274): AP and antenna selection flowchart. For overhead coverage with ceilings under 20 ft,
use either the AP-105 or the AP-125 (the AP-105 if the AP must be concealed above the ceiling); for
higher ceilings, use the AP-124 with the AP-ANT-13B-KIT or, if a single radome is required, the
AP-ANT-16. For side coverage with integrated antennas, wall mount the AP-125 (omni) or the AP-105
(directional); with external antennas, use the AP-124 with the AP-ANT-17 (120-degree beam) or the
AP-ANT-18 (60-degree beam). For picocells, use the AP-105 facing up when the AP is in the auditorium
floor itself, or the AP-124 with the AP-ANT-18 facing up from the floor below.
If a wide horizontal beamwidth (120 degrees), low-gain directional is needed, the AP-ANT-17 should
be used.
If a narrow horizontal beamwidth (60 degrees) is needed, the AP-ANT-18 should be used.
If an external downtilt antenna is needed, and a very small antenna is desired, choose the
AP-ANT-13B-KIT. This includes three small units, each less than 2 in (5 cm) square. However, they must be
individually mounted with 4-6 in (10-15 cm) separation between them.
Alternatively, if you prefer a single radome, choose the AP-ANT-16. While larger than all the
AP-ANT-13B antennas put together, it requires only a single installation.
For underfloor picocell deployments with the AP on the ceiling below, the AP-ANT-18 is
recommended facing straight up. If the AP will be in the auditorium (in the floor itself or a
floor-mounted enclosure), then use the AP-105 with no external antenna facing up.
However, before you choose an external downtilt antenna, be aware that the RF performance of the
AP-105 with its integrated antenna is equal to or better than an AP-124 with either the AP-ANT-13B or
AP-ANT-16. In general, you will find that the AP-105 is the more economical and higher-performing
solution. Unless you have a need to conceal the AP outside the user space, the AP-105 is the better
choice.
Typically, if the APs are co-located with their antennas, the second distance can be ignored because the
characteristics of antennas used will solely determine the recommended distance. This is typically the
case with an integrated antenna AP or an external antenna that is at the same location as the AP (within
one meter). However, if the antennas are remotely located from the APs as may be the case when APs
are located in a closet with RF extension cables to the antennas, the distance between the APs in the
closet can be important to consider in addition to the spacing between the remote antennas.
In this case, we consider 2.4 GHz as the worst case due to increased free-space propagation in that
band. Table 9 lists the minimum required separation for two APs with 25 MHz minimum center-frequency
separation (that is, channel 1 to 6 or 6 to 11). This provides an additional 15-dB reduction in coupling.
The interference target is typically recommended to be -85 dBm to ensure that no channel bandwidth
degradation occurs and all data rates are available. However, in HD WLANs this may not be possible
depending on the number of channels in use, so -75 dBm is sometimes used as a compromise between
increased capacity and reduced peak performance.
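Separation distances like those in Table 9 follow from solving a link budget for distance: find d such that transmit power minus path loss minus channel rejection equals the interference target. A simplified sketch; the path-loss exponent, the 1-m free-space reference, and the 2.4-GHz center frequency are illustrative assumptions on our part, while Aruba's exact model is in Appendix C:

```python
import math

def min_separation_m(tx_dbm, target_dbm, channel_rejection_db=15.0,
                     freq_mhz=2437.0, n=2.5):
    """Distance at which an interferer's signal falls to the target level.

    Path loss needed = tx - target - channel rejection. Uses free-space
    loss to a 1 m reference plus a 10*n*log10(d) roll-off beyond it.
    """
    pl_needed = tx_dbm - target_dbm - channel_rejection_db
    pl_1m = 32.44 + 20 * math.log10(1e-3) + 20 * math.log10(freq_mhz)
    return 10 ** ((pl_needed - pl_1m) / (10 * n))

# 15 dBm transmitter, -85 dBm target, 15 dB rejection for 5-channel spacing:
print(round(min_separation_m(15, -85)))  # ~62 m
```

With these assumed values the rough model lands in the same ballpark as Table 9's 200 ft / 61 m cell, but the published tables come from Aruba's model and include factors not captured here.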
Table 9 Interfering AP-to-AP Minimum Mounting Distance (Five 802.11b/g Channel Separation)

Transmit Power    Interference Target    Interference Target    Interference Target
(dBm)             -85 dBm                -80 dBm                -75 dBm
15                200 ft / 61 m          114 ft / 35 m          65 ft / 20 m
12                144 ft / 44 m          82 ft / 25 m           46 ft / 14 m
9                 98 ft / 30 m           58 ft / 17 m           32 ft / 9.8 m
6                 72 ft / 22 m           39 ft / 12 m           22 ft / 6.9 m
NOTE: See Appendix C, Basic Picocell Design on page 113 for a detailed explanation of the math behind
this table.
Table 10 Minimum Antenna-to-Antenna Mounting Distance

Transmit Power    Interference Target    Interference Target    Interference Target
(dBm)             -85 dBm                -80 dBm                -75 dBm
15                12.8 ft / 3.9 m        7.2 ft / 2.2 m         3.9 ft / 1.2 m
12                9.1 ft / 2.8 m         5.2 ft / 1.6 m         2.9 ft / 0.9 m
9                 6.2 ft / 1.9 m         3.6 ft / 1.1 m         1.9 ft / 0.6 m
6                 4.6 ft / 1.4 m         2.6 ft / 0.8 m         1.3 ft / 0.4 m
NOTE: See Appendix C, Basic Picocell Design on page 113 for a detailed explanation of the math behind
this table.
If the antennas are remotely located from the APs, the values of Table 10 apply to the minimum spacing
between antennas and it is a good idea to check that the minimum spacing between APs meets the
values of Table 11, which are computed for the direct coupling between APs that are located in a closet.
Table 11 AP spacing (channel 1 to 6 or 6 to 11), APs in a closet
Transmit Power    Interference Target    Interference Target    Interference Target
(dBm)             -85 dBm                -80 dBm                -75 dBm
15                1.3 ft / 0.4 m         0.7 ft / 0.22 m        0.4 ft / 0.12 m
12                1.0 ft / 0.3 m         0.5 ft / 0.16 m        0.3 ft / 0.09 m
9                 0.7 ft / 0.2 m         0.4 ft / 0.11 m        0.2 ft / 0.06 m
6                 0.5 ft / 0.14 m        0.3 ft / 0.08 m        0.1 ft / 0.04 m
Aesthetic Considerations
In many auditoriums, aesthetic requirements significantly limit the ability to attach APs in view. The
availability of suitable mounting locations can have a significant impact on the performance of the
overall RF design. In the auditorium shown in Figure 24, high and low ceilings, dense users, and tightly
controlled aesthetics severely limit the options available to mount APs.
Sometimes a suitable cover can be utilized to hide the AP, but in most cases it is necessary to mount the
AP in spaces that are not visible. These spaces may include interstitial spaces between floors, drop
ceilings, behind curtains, catwalks, and maintenance areas.
Figure 24 Aesthetics Requirements Vary Between Auditoriums
Aruba recommends the following best practices for installations with restrictions on mounting:
The small, attractive design of the AP-105 with no antennas makes it resemble a smoke alarm or
other typical ceiling device. The status lights on the AP can be disabled so there is no indication of
activity from the ground. Aesthetics committees are likely to approve the use of the AP-105 in
ceiling-mounted or wall-mounted deployments.
Another option for wall-mounted installations is to use a flush-mounted panel antenna like the
AP-ANT-18, connected to an AP-124 mounted on the other side of the wall or inside the wall itself.
For installations that absolutely cannot have any visible network equipment, mounting of AP-124
with AP-ANT-18 in the interfloor space below aiming up is the best solution.
Select mounting locations that have no obstructions between the front of the antennas (or
integrated antenna APs) and the intended wireless clients.
If external antennas are being used, plan to mount your APs as close to their antennas as possible. If
absolutely necessary, use good-quality, low-loss coaxial cable to connect AP to antenna when
mounting the AP some distance away from the antenna.
Do not mix mounting strategies in the same room. When planning adjacent HD WLANs, use the
same strategy (overhead, side, or picocell) in all rooms.
NOTE: Each strategy is carefully designed to (i) ensure a uniform signal level throughout the auditorium;
and (ii) control AP-to-AP interference both inside and outside the auditorium. Mixing strategies will
reduce performance and increase interference.
Always mount antennas with built-in downtilt flat against the ceiling or floor so that the beam is
exactly vertical.
Keep a safe distance between your integrated antenna APs and any location where people will be
present. There are Specific Absorption Rate (SAR) distance requirements designed to protect the
human body from coming into too-close contact with wireless devices and wireless energy. In the
U.S. the SAR regulations require at least 6 in (15 cm) of clearance between WLAN antennas and the
human body. Plan to allow at least this much clearance, though more is better.
When using side coverage with directional antennas on opposite sides of the same room, mount the
antennas using an appropriate amount of mechanical downtilt so that the 3-dB beamwidth of the
E-plane is aimed below the far antennas. (Note that the AP-ANT-17 and AP-ANT-18 have a built-in
downtilt of about 20 degrees.)
Managing Clients
We stated earlier that the client devices dominate the CCI/ACI problem in HD WLANs because they
greatly outnumber the APs. Always use very low EIRP on the AP in a high-density deployment. Then,
enabling TPC is critical to getting as many client devices as possible to lower their power to match the
APs. Clients that do not honor TPC and use full power may create interference with adjacent
auditoriums. There is little you can do about it; user education is the key. Provide resources for your
users that identify the best version of driver and its appropriate configuration. Strongly encourage users
to update their drivers, and remind them often.
Figures (HD_259, HD_260, HD_266): examples of right and wrong AP placement between adjacent
classrooms; do not place back-to-back APs on the same channel.
Chapter 5
Infrastructure Optimizations
for HD WLANs
The HD WLAN capacity plan and RF coverage strategy you selected in the last two chapters depend on
a number of very important assumptions. For example, the usable channel count assumes that the AP
radios are optimally assigned and that all clients can use them. The concurrent user target assumes that
all clients in the auditorium are evenly distributed across APs, rather than being clustered together on
just a few of them. In this chapter, you will learn about specific Aruba infrastructure features that help
manage the environment to turn these assumptions into reality so that your design will work as
expected. Along with the capacity plan and RF design, the controller configuration is the third part of
the recipe for a successful high-density wireless network.
Optimal Channel Distribution: even distribution of channels with ARM; load-aware, voice-aware, and
video-aware scanning enabled; unnecessary 2.4-GHz radios disabled with Mode-Aware ARM or static
assignment; DFS channels enabled if being used.
Optimal Client Distribution
Optimal Power Control
Optimal Airtime Management
This chapter presents the ArubaOS features behind these optimizations in detail. Some of these features
require that the wireless designer makes certain choices, and these are covered as well.
The ARM algorithm takes PHY and MAC errors plus interference and coverage measurements as inputs, and
produces channel selection and transmit power selection as outputs.
Each AP periodically scans all allowed channels for other APs, clients, rogue APs, background noise,
and interference. During the scan, the AP is not servicing its own associated clients, so scanning can be
suspended for situations such as clients in power-save mode, active voice calls, or heavy load on the
AP.
When the scan is complete, two figures are derived: the interference index and coverage index.
These indexes are used to calculate the optimum channel and transmit power for the AP.
The interference index is a single figure that represents Wi-Fi activity and non-Wi-Fi noise and
interference on a channel. When the interference index on the current channel is high compared to
other channels, the AP will look for a better channel, generally choosing the channel with the lowest
interference index. This tends to avoid non-Wi-Fi interference, but also to minimize CCI as other APs on
the same channel contribute to the interference index.
The coverage index comprises the number of APs transmitting on a particular channel, weighted by
their signal strengths as measured by the AP. The ARM algorithm aims to maximize and equalize
coverage indexes for all channels, and this is the primary factor controlling an AP's transmit power,
within configured limits. ARM also seeks to maximize the separation of adjacent channels when
possible, for instance separating channels 36 and 40 by at least one cell.
The result of the ARM channel reuse management algorithm in an HD WLAN is an optimum RF plan
that makes the best use of the available spectrum by distributing channels within the high-density
coverage zone so as to minimize CCI with APs outside.
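Conceptually, the channel decision amounts to picking the channel with the lowest interference index, with a switching margin to avoid flapping. This is an illustration of the idea only, not the ArubaOS ARM implementation, and the margin value is invented:

```python
def pick_channel(interference_index, current, margin_db=5.0):
    """Return the channel to use given per-channel interference indexes.

    Moves off the current channel only when another channel is better by
    at least margin_db, to avoid constant channel flapping.
    """
    best = min(interference_index, key=interference_index.get)
    if interference_index[current] - interference_index[best] >= margin_db:
        return best
    return current

idx = {36: 42.0, 40: 12.0, 44: 25.0}
print(pick_channel(idx, 36))  # 40: far less interference, so move
print(pick_channel(idx, 40))  # 40: already on the best channel
```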
Mode-Aware ARM
HD WLANs need many more 5-GHz radios than 2.4-GHz radios. The Aruba Mode-Aware feature
dynamically shifts surplus radios in the same RF neighborhood to become air monitors. The feature
actually works on both bands, but in an HD WLAN this feature primarily helps reduce or eliminate
overcoverage in the 2.4-GHz band.
The Mode-Aware algorithm is aware of the physical geography of the network, so it will only convert
nonedge APs into temporary air monitors when there is excessive RF coverage.
APs cannot be individually configured for Mode-Aware; the feature works across the entire physical
AP pool in each AP group.
Some customers may prefer to statically assign which 2.4-GHz radios are enabled on which APs. This
may be accomplished by making AP-specific profile assignments in either the GUI or CLI.
Band Steering
Most enterprise WLANs use dual-radio APs, which provide simultaneous coverage in the 2.4-GHz and
5-GHz bands. In Wi-Fi, clients are primarily responsible for association choices, and so they should be
able to pick the optimum AP and frequency band, based on where they will achieve the best
performance. However, a number of factors prevent this in practice:
Some clients, including most Wi-Fi phones, older PCs, bar code readers, and other special-purpose
devices are only capable of 2.4-GHz operation. These devices have no option to use the 5-GHz band,
so it is generally desirable for 5-GHz-capable clients to use the 5 GHz band, which minimizes traffic
on the 2.4-GHz band.
While many notebook PCs, the most common WLAN client, are now capable of operation in either
band, they typically have a preference for 2.4 GHz, because that is the most commonly available.
When they find a suitable 2.4-GHz network, they usually stay in that band, even when 5-GHz service
is available.
The result is that even in dual-band networks, most clients connect at 2.4 GHz, even though it is the
most crowded, and interference-prone band and despite 5-GHz availability. As a result, the 2.4-GHz
band becomes congested, even though there is plentiful capacity at 5 GHz, and network usage is
suboptimal.
The solution is for the HD WLAN to steer 5-GHz-capable clients to that band by giving them clear
conditions, which allows 2.4-GHz-limited clients more data capacity as their own 2.4-GHz band
becomes less crowded.
The infrastructure-controlled steering mechanism used in ARM monitors probe requests from all clients
and notes when they transmit on the 5-GHz band. Association requests are refused at 2.4 GHz (with
exceptions for persistent clients to avoid disruption), so the client only hears 5-GHz APs, and connects
to them. Wi-Fi devices are not designed with this environment in mind, so the algorithm must be
fail-safe and must allow connection at 2.4 GHz when the client resists steering.
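The probe-monitoring logic can be sketched as follows. This is a conceptual illustration only, not ArubaOS code, and the rejection cap is an assumed fail-safe value:

```python
class BandSteerer:
    """Conceptual sketch of probe-based band steering."""

    def __init__(self, max_rejections=2):
        self.seen_on_5ghz = set()   # clients observed probing on 5 GHz
        self.rejections = {}        # per-client 2.4-GHz refusal count
        self.max_rejections = max_rejections

    def on_probe_request(self, mac, band):
        if band == "5GHz":
            self.seen_on_5ghz.add(mac)

    def allow_association(self, mac, band):
        if band == "5GHz" or mac not in self.seen_on_5ghz:
            return True
        # Client is 5-GHz capable but trying 2.4 GHz: refuse a few times,
        # then fail safe and let it connect anyway.
        n = self.rejections.get(mac, 0)
        if n < self.max_rejections:
            self.rejections[mac] = n + 1
            return False
        return True
```

A client that probes on 5 GHz gets its first couple of 2.4-GHz association attempts refused; if it persists, the fail-safe admits it at 2.4 GHz rather than leaving it disconnected.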
Figure 29 Effect of Band Steering on Throughput (Mbps)
Figure 29 shows the effect of band steering on data throughput (Mbps, vertical scale) for a population
of 802.11b and 802.11g clients at 2.4 GHz and 802.11a and 802.11n clients at 5 GHz. In this case, as more
802.11a- and 802.11n-capable clients were steered away from 2.4 GHz, the data throughput of both
802.11b and 802.11g clients increased, while the new mix of clients at 5 GHz was more favorable to
802.11a and 802.11n.
Figure: throughput (Mbps) versus number of channels in use (3 through 7), with no load balancing and
with ARM load balancing.
The ARM algorithm uses the number of clients, rather than data rates or load because we have found
that historical patterns of behavior are not a good indicator of future activity. A device may be passive
for hours, and then suddenly start a high-rate transaction, and after it has begun, it would be disruptive
to balance it to another channel. Similarly, a very active client may suddenly fall silent. Traffic is
unpredictable, and the optimum solution flows from assuming that each client is equally capable of
generating traffic.
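Client-count balancing reduces to a very simple rule, sketched here as an illustration (not the actual ARM implementation):

```python
def choose_ap(client_counts):
    """Pick the AP with the fewest associated clients among those the
    client can hear. Treats every client as an equal potential load,
    per the reasoning above.
    """
    return min(client_counts, key=client_counts.get)

# A new client hears three APs with these association counts:
print(choose_ap({"ap-1": 12, "ap-2": 4, "ap-3": 9}))  # ap-2
```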
Wi-Fi devices have, as yet, no standard way to detect dynamic load on an AP. However, the new 802.11k
amendment will allow APs to advertise current traffic and available capacity, and when 802.11k-capable
clients appear, the ARM load balancing algorithm may extend infrastructure control through this
mechanism.
Physical Carrier Sense: For the channel to be idle, the Clear Channel Assessment (CCA) must
report that no energy is detected above a defined threshold. CCA is a complex subject beyond the
scope of this guide. For purposes of HD WLANs, the key point is that strong ACI will cause the CCA
to report a channel as busy.
Virtual Carrier Sense: For the channel to be idle, the Network Allocation Vector (NAV) must be
zero. All 802.11 frames contain a preamble that includes a length field that tells receiving stations
how much time that frame will take on the air. When a Wi-Fi station receives a frame with a valid
preamble from any other station, whether part of the same Basic Service Set (BSS) or not, it must
use the duration field to set a counter called the NAV. This is essentially a timer that is always
counting down. As long as the NAV is greater than zero, the virtual carrier knows that the medium is
busy. This is the primary mechanism of detecting so-called co-channel interference. It is not
interference per se, like Bluetooth, but a way of ensuring that only one station can transmit at a
time.
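The NAV mechanism described above amounts to a countdown timer that any validly received frame can extend, regardless of which BSS it belongs to. A minimal sketch:

```python
class VirtualCarrierSense:
    """NAV bookkeeping sketch: the medium is busy while the NAV timer runs."""

    def __init__(self):
        self.nav_expires_at = 0.0  # absolute time when the NAV reaches zero

    def on_frame(self, now, duration_s):
        # Any frame's duration field can only extend the NAV, never shorten it.
        self.nav_expires_at = max(self.nav_expires_at, now + duration_s)

    def medium_idle(self, now):
        return now >= self.nav_expires_at
```

A station overhearing a 2-ms reservation at t=0 will report the medium busy until t=2 ms, even if it decodes nothing further in that interval.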
To maximize the performance of any HD WLAN, it is of fundamental importance to control the transmit
power of stations in the auditorium to reduce ACI, and to limit the receive sensitivity of the AP to
provide some protection against weak co-channel sources. It may also be necessary to enable
Request-to-Send / Clear-to-Send (RTS/CTS) depending on conditions in each individual auditorium.
Figure (HD_275): the 802.11 OFDM 20-MHz transmit spectral mask: 0 dBr within +/-9 MHz of the center
frequency (fc), -20 dBr at +/-11 MHz, -28 dBr at +/-20 MHz, and -40 dBr at +/-30 MHz and beyond.
However, in an HD WLAN with multiple adjacent channel APs spaced close together, Wi-Fi signals are
received at sufficiently high power levels to cause the 802.11 carrier sense mechanism to declare the
channel busy. In this situation, adjacent channels have effectively become part of the same collision
domain. This problem is even more significant for adjacent clients that are even more numerous and
more tightly packed than the APs. Therefore, at the densities required for HD WLANs, otherwise
nonoverlapping 5-GHz channels actually do overlap.
Consider the HD WLAN in Figure 32, which has three pairs of APs and clients, each one on an adjacent
20-MHz channel. Pairs 1 and 3 are transmitting heavy-duty cycle traffic such as a video stream. All six
stations are configured to use 20 dBm EIRP.
Figure 32 ACI Example with APs and Clients at Short Range
AP2 and station 2 on channel 40 now want to transmit and perform a CCA. Because pair 1 is only 3.2 ft
(1 m) away, their transmissions are received at -44 dBm, while signals from pair 3 travel 6.5 ft (2 m) and
are received at -50 dBm. Neither AP2 nor station 2 is allowed to transmit because the detected energy
exceeds the CCA threshold, even though no one else is using the channel.
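Whether an adjacent-channel transmitter trips the CCA can be estimated from a link budget: received power = EIRP minus free-space path loss minus the receiver's adjacent-channel rejection. A sketch; the -62 dBm CCA threshold and the 17-dB rejection figure are illustrative assumptions, not values taken from this document:

```python
import math

def fspl_db(d_m, freq_mhz=5200.0):
    """Free-space path loss in dB (d in meters, f in MHz)."""
    return 32.44 + 20 * math.log10(d_m / 1000.0) + 20 * math.log10(freq_mhz)

def cca_busy(eirp_dbm, aci_rejection_db, d_m, cca_threshold_dbm=-62.0):
    """True if an adjacent-channel transmitter at d_m trips the CCA."""
    rx_dbm = eirp_dbm - fspl_db(d_m) - aci_rejection_db
    return rx_dbm > cca_threshold_dbm

print(cca_busy(20, 17, 1.0))   # True: a 20 dBm neighbor 1 m away blocks us
print(cca_busy(20, 17, 20.0))  # False: at 20 m the skirt falls below threshold
```

At the half-meter to one-meter spacings typical of packed seating, almost any adjacent-channel transmission lands above the threshold, which is exactly how nonoverlapping channels end up in one collision domain.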
Figure 33 Frequency Domain Illustration of ACI at Short Range
Inside an auditorium, with minimal free space propagation loss between stations, the edge of the skirt
can easily be -75 dBm or higher. This is easily modeled. Note how reducing the transmit EIRP from
20 dBm to just 3 dBm reduces the interference radius by more than a factor of 4.
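That factor follows directly from a log-distance model with the exponent n=2.2 stated in the Figure 34 caption: the interference radius scales as 10^(delta_P / (10*n)). A quick check:

```python
def interference_radius_ratio(eirp_hi_dbm, eirp_lo_dbm, n=2.2):
    """Factor by which the interference radius shrinks when EIRP is reduced,
    under a log-distance path-loss model with exponent n."""
    return 10 ** ((eirp_hi_dbm - eirp_lo_dbm) / (10 * n))

print(round(interference_radius_ratio(20, 3), 1))  # 5.9: more than a factor of 4
```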
Figure 34 ACI Power vs. Receiver Distance
(Peak Skirt Power = 20 dBr, n=2.2, 2.4-GHz NF = -95 dBm, 5-GHz NF = -105 dBm)
The effect of ACI is easy to measure in an HD WLAN environment. The following test was conducted
with two side-by-side groups of 10 clients, each associated to an AP on an adjacent HT20 channel. A
baseline was taken first, with each group testing separately. Then the test was rerun with each group
(channel) transmitting simultaneously. The two groups were moved 25 ft (7.6 m) and 50 ft (15.2 m)
apart with tests run at both locations. These results align with the previous model quite well.
Figure 35 TCP Throughput with Decreasing Distance and Increasing ACI
The primary method to reduce ACI is to use the minimum amount of transmit power necessary for the
size of the auditorium. In general, Aruba's research shows that an EIRP of 6 dBm is more than adequate
for most high-density zones.
Reducing client transmit power is even more important than reducing AP power. Ensure that 802.11h
Transmit Power Control is enabled in your HD WLAN to influence those clients that honor it to match
the AP transmit power. In large, heterogeneous auditoriums where IT does not control the user devices,
it's a good idea to ask all users to go into their client NIC utility and reduce the transmit power to a
medium value. We will explore how to reduce AP and client power in more detail in the next section.
From an AP perspective, the other method of reducing ACI is to ensure maximum possible physical
separation of adjacent channel APs. This is the reason to evenly distribute APs throughout the coverage
area. ARM will then make channel assignments to maximize the physical distance between same and
adjacent channels.
[Figure: time-domain diagram of data/ACK exchanges between devices A and D sharing one channel, with backoff (w) and transmit (t) intervals; the lower panel is captioned "When A-w1 and D-t1 are not simultaneous (all devices are in one collision domain)"]
This effect is also very easy to measure. We took the same two groups of 10 clients and put them on the
same channel at a distance of 50 ft (15.2 m). Each group was run separately and the results added. This
produced a solo AP and dual solo AP baseline. Then the test was rerun with both groups transmitting
simultaneously. A third group of 10 clients with another AP was then added to the same channel and the
test was rerun. The results of the previous ACI tests are also shown for comparison.
We see in the results that CCI reduces the overall capacity of a channel as a result of the contention and
collision effects just described. It is also clear that adding more APs actually reduces capacity when
they share the same collision domain.
Figure 37 ACI vs. CCI: Bidirectional TCP Throughput
There are three basic strategies to minimize CCI effects in a high-density coverage area:
Good RF Design: Do not reuse channels inside the same HD WLAN to limit CCI effects in the same
area. Choose a coverage strategy that will minimize CCI from other APs near the auditorium, taking
into account the construction of the building. If using a picocell strategy, engage an experienced
wireless integrator with the training and tools to properly design it.
Limit transmit power: Do not use even 1 dB more power than is absolutely required. Less is truly
more, because less power will produce more throughput.
Control the Receive Sensitivity Threshold: Use the Aruba Channel Reuse Management (CRM)
feature to selectively deafen the APs in the auditorium. CRM includes an intelligent dynamic mode
and also a static mode. This can provide a performance benefit on downstream traffic leaving the
AP.
You may also wish to experiment with enabling RTS/CTS if the above methods are not yielding the
desired level of control.
The RST algorithm measures received signal levels from associated clients and applies a moving average to reduce the AP's receive sensitivity based on the worst-case (that is, farthest) client. The
adjustment is dynamic, so if all clients connect with good signal strength, the sensitivity will be
considerably reduced. However, if some clients are distant, the reduction in sensitivity will be less. A
static mode is also available if the wireless designer wants to specify a fixed power level for the filter.
This has the effect of tuning out more distant transmissions while maintaining responsiveness in cases
where a client has a weak signal.
Figure 38 Operation of the Receive Sensitivity Tuning Threshold
[Figure: received power scale from 0 dBm to -80 dBm on Channel 1; frames above the Rx sensitivity tuning threshold are decoded, frames below it are ignored]
By dynamically or statically setting the RST threshold higher (closer to 0 dBm), signals from more distant APs are ignored, transmit deferrals are reduced, and network throughput is increased. However, the RST threshold must not be set so high that there is insufficient SNR to demodulate the highest data rates. Aruba generally recommends a minimum SNR of 25 dB to achieve MCS7 and MCS15. You can use Figure 34 on page 57 to estimate the ideal threshold if you wish to use static mode. Simply identify the lowest signal level for the maximum EIRP you allow ARM to choose and use that as your threshold.
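As a sketch of that guidance, a static threshold might be chosen like this. The 3 dB guard band is an assumption for illustration; the -105 dBm noise floor matches the 5-GHz figure used earlier in this document.

```python
def static_rst_threshold(lowest_client_dbm, noise_floor_dbm=-105.0,
                         required_snr_db=25.0, guard_db=3.0):
    """Place the static RST threshold just below the weakest client
    signal that must still be decoded, then confirm that a client at
    the threshold retains enough SNR for MCS7/MCS15."""
    threshold = lowest_client_dbm - guard_db
    if threshold - noise_floor_dbm < required_snr_db:
        raise ValueError("threshold leaves too little SNR for MCS7/15")
    return threshold

# Farthest expected client heard at -65 dBm -> threshold of -68 dBm,
# which still leaves 37 dB of SNR above a -105 dBm noise floor.
assert static_rst_threshold(-65.0) == -68.0
```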
The Aruba WLAN infrastructure maintains application performance in high-density areas (such as
lecture halls) with scheduled airtime fairness.
Fair access: Allocates the same airtime to all clients through token allocation
Preferred access is generally recommended for HD WLANs. This option applies higher weights to faster
modes, for example, assuring that an 802.11n client can complete a transaction much faster than its
802.11a equivalent. Preferential fairness offers the highest overall data capacity, but at some cost to
less-capable clients. Some network managers would welcome this as a subtle nudge to the user
population to upgrade to 802.11n clients.
Figure 39 shows the effect of using all three modes. On the left, the absolute airtime obtained by a mix of a/b/g/n clients using Default, Fair, and Preferred access is shown in milliseconds. On the right, the UDP throughput achieved by those same clients is shown. We see that Fair access increases 11n client time-on-channel by up to 1089%, yielding a 3X increase in throughput. Preferred access turns in a 3176% increase in airtime, which yields a 5X increase in throughput. Clearly, the Aruba Airtime Fairness feature has a significant impact on any auditorium with an expected heterogeneous client mix.
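A toy model of the three shaping policies helps make this ordering intuitive. The weights and the 0.6 MAC-efficiency factor below are assumptions for illustration, not Aruba's actual scheduler.

```python
def per_client_throughput(rates_mbps, policy="fair", efficiency=0.6):
    """Toy airtime-allocation model.

    default   - equal frame opportunities: airtime share ~ 1/rate
                (slow clients consume most of the airtime)
    fair      - equal airtime share per client
    preferred - airtime weighted toward faster clients (weight ~ rate)
    """
    if policy == "default":
        weights = [1.0 / r for r in rates_mbps]
    elif policy == "fair":
        weights = [1.0] * len(rates_mbps)
    elif policy == "preferred":
        weights = list(rates_mbps)
    else:
        raise ValueError(policy)
    total = sum(weights)
    return [w / total * r * efficiency for w, r in zip(weights, rates_mbps)]

# One 802.11n client (150 Mbps PHY) sharing with one 802.11b client
# (11 Mbps PHY): fair access lifts the 11n client sharply, and
# preferred access lifts it further still.
default_tp = per_client_throughput([150, 11], "default")
fair_tp = per_client_throughput([150, 11], "fair")
pref_tp = per_client_throughput([150, 11], "preferred")
assert pref_tp[0] > fair_tp[0] > default_tp[0]
```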
Figure 39 Performance Improvement with Airtime Fairness Fair and Preferred Modes
A particularly powerful way to evaluate the operation of Airtime Fairness is to examine its impact on
individual clients and flows. The throughput vs. time graph from a 20-client TCP downstream Ixia
Chariot test in Figure 40 shows the difference in the individual client throughput when the shaping
policy is toggled from default access to fair access. The quasi-random contention-based access on the
left gives way to a much steadier result on the right due to the airtime shaping algorithm that imposes
consistent access tokens to different client types.
Figure 40 Effect of Airtime Fairness Token Algorithm on Individual TCP Streams
Limit which devices can appear in the controller's user table by specifying exactly which subnets and protocols are allowed through the validuser IP access list. The following CLI command can be used: firewall local-valid-users.
If IPv6 is not required, it is suggested to block it via Ethernet ACL on each mobility controller
interface and user-role. IPv6 quickly consumes user entries on the controller, and it is chatty with
multicast by default with some devices. It is a good general security best practice to disable any
unused network protocols to minimize potential risks.
If netbios-ns, netbios-dgm, mDNS, UPnP, and SSDP protocols are not required, it is strongly
suggested to block them in the appropriate user role. These protocols are quite chatty through
device queries or announcements and are mainly used for discovering devices in small networks,
such as in-home networks. Most devices that support these protocols can easily use DNS instead,
which is a more optimal protocol for large, highly mobile networks.
Prevent HD WLAN clients from accidentally being configured as DHCP Servers by blocking the
protocol port udp 68, which is used for DHCP server replies. This setting should be applied to
every user role.
When creating ACLs, use netdestination aliases when several rules have protocols and actions in
common with multiple hosts or networks, to simplify firewall policy configuration. The
netdestination alias allows adding IP addresses by host, network, range, or by using the invert
feature. It is best to use network to specify a range of hosts when creating a netdestination alias to
minimize the number of ACL entries created on the controllers. The maximum limit is 8,000 entries
for the Aruba M3, 3000, and 600-series controllers. The limit is 4,000 entries for the Aruba 2400 and 800-series controllers.
Optimizing Multicast Modulation Rates
[Figure 41: multicast modulation rate selection on Channel A; lowest basic rate = 1 Mbps, rate to z = 2 Mbps, rate to y = 6 Mbps, with associated clients at rates from 2 to 24 Mbps]
The default behavior in 802.11 is to transmit multicast traffic at the lowest configured basic rate for the
AP, so it stands the best chance of reaching all associated clients. This can be very expensive in terms
of time on the medium, and multicast has been the subject of many optimization techniques.
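The cost is easy to quantify: the time a multicast frame occupies the medium is inversely proportional to the chosen rate. This is a payload-only sketch; preamble and contention overhead are ignored.

```python
def airtime_us(frame_bytes, rate_mbps):
    """Time on the medium for one frame payload at a given PHY rate.

    bits / (Mbit/s) yields microseconds directly.
    """
    return frame_bytes * 8 / rate_mbps

# A 1500-byte multicast frame at the 1 Mbps lowest basic rate versus
# a 24 Mbps lowest actual client rate: a 24x reduction in airtime.
slow = airtime_us(1500, 1)    # 12000 us
fast = airtime_us(1500, 24)   # 500 us
assert slow / fast == 24
```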
ARM technology includes a number of techniques to reduce the time on the medium of multicast traffic:
Instead of transmitting all traffic at the lowest configured rate of the AP, the AP can identify the
lowest actual rate used by all of its clients, or a configured minimum rate, which is often
considerably higher. This is shown in Figure 41.
APs are configured as bridges by default, so they automatically transmit all multicast traffic whether
or not there is a member of the multicast group on a particular AP. By using IGMP snooping, the
infrastructure can identify which APs and clients need particular transmissions, blocking all others.
The Aruba solution tackles the multicast reliability problem on multiple fronts:
IGMP Snooping ensures that the wired infrastructure sends video traffic only to those APs that have
subscribers.
DMO sends multicast traffic as unicast traffic, which can be transmitted at much higher speeds and
has an acknowledgement mechanism ensuring reliable multicast.
Transmission automatically switches back to multicast when the client count rises high enough that the efficiency advantage of unicast is lost.
Multicast rate optimization keeps track of the transmit rates sustainable for each associated client and uses the lowest common sustainable rate for multicast transmissions.
As a result, reliable high-performance multicast video delivery over a high-density wireless network
becomes a reality.
At this time, Aruba does not recommend disabling any MCS rates as it has been observed to cause
unpredictable client driver behaviors.
Keep each VLAN subnet within a VLAN pool to a 24-bit subnet mask.
Do not have more than 10 VLANs within a pool so that broadcast or multicast traffic does not
consume too much air time access.
Chapter 6
Configuring ArubaOS
for HD-WLANs
This chapter explains how to enable and configure the Aruba controller options described in the last
chapter for a high-density deployment. The configuration options for HD WLANs are:
Optimal Channel Distribution: even distribution of channels with ARM; load-aware, voice-aware, and video-aware scanning enabled; unnecessary 2.4-GHz radios disabled with Mode-Aware ARM or static assignment; DFS channels enabled if being used
Optimal Client Distribution
Optimal Power Control
Optimal Airtime Management
This chapter assumes that a complete base configuration already exists on the controller that conforms
with Aruba best practices as laid out in any of the VRD base designs.
Single-band: Enables ARM scanning for channels in either 2.4-GHz or 5-GHz band.
Multiband: Enables ARM scanning for both 2.4-GHz and 5-GHz bands.
Maintain: Keeps the AP operating on the current channel and power level. Does not change the AP
power or channel based upon information gathered during ARM scanning. This setting is most often
used to keep all settings the same while troubleshooting or performing a site survey.
Disable: Returns all APs to the channel set in the relevant RF radio profile. AP power level and
channel will not be changed based upon ARM information.
Figure 42 ARM Profile Configuration
The scanning checkbox enables ARM scanning and is necessary for ARM to function properly.
NOTE
The Aruba controller is application aware and can also perform deep packet inspection of the traffic flowing across the HD WLAN using its ICSA-certified stateful firewall, deferring scanning accordingly.
The following scanning modes can be independently enabled or disabled:
Load aware scanning: In the presence of high traffic loads on the AP, scanning is postponed.
Voice-aware and video-aware scanning can be configured in the controller GUI by selecting the checkbox in the ARM profile or by using the CLI. Aruba recommends that all three modes be employed in most auditorium environments.
Here are the equivalent CLI commands to configure these features from a secure shell:
Voice-aware scanning: Defers ARM scans when active voice calls are present on an AP.
!
rf arm-profile <arm profile name>
voip-aware-scan
!
!
rf arm-profile <arm profile name>
video-aware-scan
!
!
rf arm-profile <arm profile name>
mode-aware
ideal-coverage-index 6
!
DFS channels can also be enabled or disabled via the following CLI commands:
!
ap regulatory-domain-profile default
valid-11a-channel <20mhz channel number>
valid-11a-40mhz-channel-pair <40mhz channel pair>
no valid-11a-channel <20mhz channel number>
no valid-11a-40mhz-channel-pair <40mhz channel pair>
!
NOTE
The Band Steering feature will not work unless you enable the Local Probe Response parameter in
the Wireless LAN SSID profile for the SSID that requires band steering. This parameter is normally
enabled by default.
!
wlan ssid-profile <HD-WLAN client ssid name>
local-probe-response
wlan virtual-ap <HD-WLAN client vap name>
band-steering
!
!
rf dot11a-radio-profile <802.11a rf profile name>
spectrum-load-balancing
!
rf dot11g-radio-profile <802.11g rf profile name>
spectrum-load-balancing
!
NOTE
Minimum and Maximum transmit power must be individually configured on each radio. The example
above shows the 5-GHz radio settings.
!
rf arm-profile <HD-WLAN client arm profile name>
min-tx-power <desired minimum transmit power>
max-tx-power <desired maximum transmit power>
!
!
rf dot11a-radio-profile <802.11a rf profile name>
dot11h
!
rf dot11g-radio-profile <802.11g rf profile name>
dot11h
!
This feature is being renamed to Channel Reuse Management in a future ArubaOS release.
NOTE
The channel reuse mode is configured through an 802.11a or 802.11g RF management profile. You can
configure the channel reuse feature to operate in one of three modes: static, dynamic, or disable. (This
feature is disabled by default.)
Dynamic mode: This mode is recommended for HD-WLANs. In this mode, the Clear Channel
Assessment (CCA) thresholds are based on channel loads, and take into account the location of the
associated clients. When you set the Channel Reuse feature to dynamic mode, this feature is
automatically enabled when the wireless medium around the AP is busy greater than half the time.
The CCA threshold adjusts to accommodate transmissions between the AP and its most distant
associated client.
Static mode: This mode is a coverage-based adaptation of the CCA thresholds. In static mode, the CCA threshold is adjusted according to the configured transmit power level on the AP: as the AP transmit power decreases, the CCA threshold increases, and vice versa.
Disable mode: This mode does not support the tuning of the CCA Detect Threshold.
This feature is not available for DFS channels.
NOTE
To enable RX Sensitivity Channel Reuse, select either dynamic or static from the drop-down menu in
the 802.11a radio profile.
Figure 48 Configuring Channel Reuse in the Radio Profile
The following example shows how to configure RX Sensitivity Tuning-Based Channel Reuse from the CLI:
1. Valid channel reuse policies are disable, dynamic, or static.
2. For static mode, a threshold in dBm must be specified.
!
rf dot11g-radio-profile <802.11g rf profile name>
channel-reuse <disable, dynamic, or static>
channel-reuse-threshold <Rx Sensitivity Threshold value in -dBm>
!
rf dot11a-radio-profile <802.11a rf profile name>
channel-reuse <disable, dynamic, or static>
channel-reuse-threshold <Rx Sensitivity Threshold value in -dBm>
!
It must be created.
2. Valid policy types are default, fair-access, and preferred-access.
3. The 'wlan traffic management
profile' is applied separately to
each radio at the AP group level.
!
wlan traffic-management-profile <wtm profile name>
bw-alloc virtual-ap default share <percentage>
shaping-policy <policy type>
!
ap-group <ap group name>
dot11a-traffic-mgmt-profile <wtm profile name>
dot11g-traffic-mgmt-profile <wtm profile name>
!
To verify that Airtime Fairness is enabled, you may use this procedure:
1. Open the CLI using a terminal emulation program.
2. Type show wlan traffic-management-profile <default> to check the current configuration
(Default is the profile name which may change based on your configuration).
3. Confirm that the station shaping policy corresponds to your selected mode.
The following configuration suggestions have some parameters that are meant for customers that do
not require IPv6 and other chatty protocol connectivity. Limiting this type of connectivity for their
wireless users cuts down unnecessary traffic that is not needed for most day-to-day application and
network use.
1. IPv6 is disabled by default, but
take this configuration step if it
shows enabled with the CLI
command show ipv6 firewall.
!
no ipv6 firewall enable
!
no ipv6 enable
!
ip access-list session deny_mDNS_acl
any any udp 5353 deny
!
ip access-list session deny_SSDP_and_UPnP_acl
any host 239.255.255.250 any deny
any host 239.255.255.253 any deny
!
ip access-list session
deny_client_acting_as_server_acl
user any udp 68 deny
!
user-role <wireless user role name>
session-acl deny_mDNS_acl
session-acl deny_SSDP_and_UPnP_acl
session-acl deny_netbios_acl
session-acl deny_client_acting_as_server_acl
session-acl allowall
!
interface [all active Fastethernet/gigabitethernet/port-channel] <slot/port value>
ip access-group no-ipv6-acl in
!
To enable broadcast and multicast rate optimization, check the box in the SSID profile as shown in Figure 51.
Figure 51 Configuring Multicast Rate Optimization in the SSID Profile
!
wlan ssid-profile <HD-WLAN client ssid profile name>
mcast-rate-opt
!
!
interface vlan <vlan number for every active vlan>
ip igmp snooping
!
The DMO threshold has a default value of 6 clients, but it may be set higher. The DMO threshold specifies the number of HT WLAN clients per virtual AP, per VLAN, for video delivery mode. Video is delivered as multicast when the number of HT clients exceeds the threshold, and as unicast when the number of HT clients is below the threshold. For this computation, 1 legacy client (802.11a/b/g) carries a penalty factor equal to 3 HT clients (802.11n).
For example, if there are three 802.11n clients associated to a VAP and the threshold value is set to 4,
DMO will take place. Once the fourth HT client associates to the same VAP, DMO will no longer take
place. If two 802.11b clients are associated to the VAP and the threshold is set to 4, they will be treated
as if they were 6 HT 802.11n clients and DMO will not take place.
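The threshold arithmetic above can be captured in a few lines. This is a sketch of the rule as described, not Aruba's implementation.

```python
def dmo_active(ht_clients, legacy_clients, threshold=6):
    """DMO conversion decision per virtual AP, per VLAN: each legacy
    (802.11a/b/g) client counts as 3 HT clients, and video is
    converted to unicast only while the effective count stays below
    the threshold."""
    effective = ht_clients + 3 * legacy_clients
    return effective < threshold

# Examples from the text (threshold = 4):
assert dmo_active(3, 0, threshold=4)      # three 11n clients: DMO on
assert not dmo_active(4, 0, threshold=4)  # fourth 11n client: DMO off
assert not dmo_active(0, 2, threshold=4)  # two 11b clients count as 6
```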
Video Scalability
The example below demonstrates the impact of DMO and MRO transport on video scalability as it
relates to over-the-air channel utilization. Unicast transport is almost always optimal; however, there
are use cases in which optimized multicast delivery will reduce channel utilization. This needs to be
balanced against the need to assure reliable delivery and QoS. Thus unicast delivery is preferred and
recommended to ensure reliable delivery and QoS for multicast video applications.
In the example below, channel utilization is estimated for MRO vs. DMO as a function of 802.11n,
802.11a/g, and 802.11b client counts. This model assumes a single 2 Mbps video stream and average
rates of 180, 36, and 5.5 Mbps for 11n, 11a/g, and 11b clients respectively.
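A simplified version of that utilization model can be written directly from the description: DMO sends one unicast copy per subscribed client at that client's own average rate, while MRO sends one multicast copy at the lowest rate sustainable by all clients. MAC overhead is ignored in this sketch.

```python
def dmo_utilization(stream_mbps, client_rates_mbps):
    """DMO: one unicast copy per client at that client's rate."""
    return sum(stream_mbps / r for r in client_rates_mbps)

def mro_utilization(stream_mbps, client_rates_mbps):
    """MRO: a single multicast copy at the lowest common rate."""
    return stream_mbps / min(client_rates_mbps)

# 40 802.11n clients averaging 180 Mbps, one 2 Mbps stream (the
# scenario in Figure 54): 40 unicast copies still fit in the channel.
n_clients = [180.0] * 40
assert dmo_utilization(2.0, n_clients) < 1.0   # about 44% utilization
# One 11b client at 5.5 Mbps drags the common multicast rate down:
mixed = n_clients + [5.5]
assert abs(mro_utilization(2.0, mixed) - 2 / 5.5) < 1e-9
```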
Figure 54 Channel Utilization for MRO and DMO (As a Function of 11n, 11a/g, or 11b Clients)
Note that in Figure 54, 40 11n clients averaging 180 Mbps of PHY rate can sustain 2 Mbps video with
good quality and still remain below the full channel utilization. Also, note that the channel utilization
shown above is for illustration purposes only, and should never exceed 80% in practice.
To verify channel utilization, use the command show ap debug radio-stats ap-name AP-125-2 radio 0 advanced | include Clear and press Enter. This shows the channel utilization and the resulting airtime. High numbers represent high channel utilization, and low numbers indicate that more channel capacity is available for transmissions, with averages over the past 1, 4, and 64 seconds respectively.
Aruba has tested the following configurations and recommends the following settings be used based on
the size of the video streams that will be delivered:
If the video stream bandwidth is around 500 Kbps, the threshold can be set as high as 12.
If the video stream bandwidth is > 2 Mbps then keep the threshold between 6 to 8.
For HD video (stream bandwidth > 10 Mbps) drop threshold to between 2 and 3.
These values will clearly be dependent on the video stream size, the client mix, the number of unique
video streams or channels, the AP density, and the reserved channel capacity (see earlier sections for
instructions on reserving channel capacity for video).
!
wlan ssid-profile <HD-WLAN client ssid profile name>
g-tx-rates 5 11 24 36 48 54
a-tx-rates 24 36 48 54
!
!
wlan virtual-ap <HD-WLAN client virtual ap name>
vlan <HD-WLAN vlan # or list of vlans>
!
Chapter 7
Troubleshooting
for HD WLANs
To troubleshoot client device issues in high-density wireless networks, you must systematically narrow
down the source of the problem by knowing the relationship of all the components in the system from
end to end. This chapter provides the processes that are used by senior Aruba support engineers to
resolve problems with mobile client devices. It will help you to identify and troubleshoot the most
common problems found in WLAN connectivity.
Possible Cause
Has anything changed in the WLAN equipment configuration? (All Aruba Mobility Controllers have an audit log that tracks every GUI and CLI configuration change.)
Things to Check
Wireless client: device hardware, device OS, device supplicant, device driver
AP: physical location, antenna position, AP status, AP configuration
Backend servers: DHCP server, RADIUS server, LDAP server, user database (for example, Microsoft Active Directory)
HD WLAN Troubleshooting
When you receive a report of a connectivity issue related to an HD WLAN, gather the following
information:
Device location (country, city, building, floor, general location, room number)
The location determines the Aruba controller(s) on which you should concentrate troubleshooting
efforts.
[Troubleshooting flowchart — START HERE: Does the device show it is connected to the WLAN? Is the WLAN interface enabled? A series of yes/no decision points follows; if the problem continues, call Aruba TAC.]
This command lists all APs with their respective AP names and their active SSIDs and BSSIDs.
4. Verify that the SSID is not hidden.
If the SSID is hidden, verify that the client is properly configured to associate to it.
5. Check the wireless NIC enable/disable physical switch on the mobile device.
6. Check the wireless NIC enable/disable soft setting within wireless supplicant software.
7. If the device is using Microsoft Windows Operating System, issue a repair in Network Connections
or wireless NIC system tray icon. For MacOS devices, turn AirPort off and then back on again.
If no issues are found and these actions have not corrected the problem, continue with device
troubleshooting. In addition to previously gathered information (username, location, MAC/IP), gather
device hardware model name and number and wireless NIC brand, model, type, and driver version for
further troubleshooting. Also, take a wireless packet capture so that Aruba Support can perform
analysis by means of the AP Remote Packet Capture method or by means of third-party software (for
example, WildPackets OmniPeek, CACE Technologies AirPcap, and so on). Please also provide the
Aruba Support Team all the necessary CLI command output for mobility controller, AP, and user
statistics.
Symptom #2: Device can see SSIDs but not the one it needs
Suggested actions:
1. Verify that the required SSIDs are active and enabled in the Aruba Mobility Controller.
show ap bss-table
Issue this command from any mobility controller, master or local, that is servicing APs. This
command lists all APs with their AP names and their active SSIDs and BSSIDs.
2. Verify that all APs are up and active, especially those in the area of the problem device.
Issue this command from the master mobility controller that is servicing the area of the problem
device. This command lists all known APs serviced from that master mobility controller,
regardless of being up or down.
3. If all APs have proper SSID configurations and no APs are reported down, verify that the client
device is attempting to associate and authenticate.
configure terminal
end
This command starts debugging on all Aruba processes for the wireless device and logs the
results in Aruba logging category user-debug.
View the debug output with this CLI command:
show log user-debug all | include <wireless device MAC address>
This command lists the 802.11 management packets (Association Request, Association
Response, Re-Association Request, Re-Association Response, Disassociation, and Deauth) for
the specified wireless device.
If you see the latest packet as assoc-resp, the wireless device should be authenticating (if VAP
is configured for Layer 2 or Layer 3 authentication) or should be authenticated already.
Issue this command from any master or local mobility controller that is servicing APs to which
the device may attempt to associate. This command shows if the problem client is attempting to
associate. Look for the problem client MAC address. It also shows to which AP the client MAC is
attempting to associate. Note the BSSID.
This command shows the 802.11 state of the wireless device, what SSID it is associated to, what VLAN it is assigned, what PHY type it is using, how long it has been associated to the AP's BSSID, and what capabilities it has, such as WMM, Active/Not Active, RRM client, Band Steerable, or HT-capable.
Use the AP BSSID and device MAC taken from this command.
show ap association | include <AP BSSID that the device is associated to> and
show user-table bssid <AP BSSID that the device is associated to>
This output can be used to verify if there are other devices currently associated to the same AP,
thus helping to rule out infrastructure issues as compared to a single-client issue.
Look for the problem client MAC. This command can be used to determine whether the client is
attempting to authenticate via Layer 2 or Layer 3 authentication and if the request is being
rejected. If the attempt is rejected, this can be established as the reason for client failure.
Investigate authentication server logs as needed.
show auth-tracebuf
If the device is configured to use Layer 2 authentication such as 802.1X, verify that the wireless
device successfully completed all EAP and Key exchange phases using this CLI command:
show auth-tracebuf mac <wireless device mac address>
This command can be used to determine if there are any miscellaneous errors with the mobility
controller, the AP, or the wireless device.
This command can also point to problems with an authentication server not responding to
authentication requests if Layer 2 or Layer 3 authentication is enabled on the virtual AP to which
the device is trying to connect.
If the authentication server is RADIUS, look for excessive RADIUS timeouts or instances of the
Aruba Mobility Controller taking a RADIUS server out of service for the server hold-down timer.
This behavior indicates possible RADIUS server connectivity or performance issues and should
be investigated as needed.
Using these steps, you can determine if the device has passed 802.11 negotiation and is attempting to
authenticate (if Layer 2 or Layer 3 authentication is required). If none of these steps yields information
that helps you correct the problem, take a wireless packet capture for Aruba Support to analyze. You
can use the AP Remote Packet Capture method or third-party software (for example, WildPackets
OmniPeek, CACE Technologies AirPcap, and so on). Please also provide the Aruba Support Team all
the necessary CLI command output for mobility controller, AP, and user statistics.
This command displays all details pertaining to the client. Verify that the IP address is not 0.0.0.0
or a 169.x.x.x address.
This command is also used to verify if the user was successfully authenticated, and displays the
user-role, ACL number, authentication method, and associated AP name/BSSID.
If the device is associated to the right user-role and VLAN but it does not have a valid IP address,
disable and re-enable its wireless adapter or force a DHCP release-renew in the operating
system of the device.
If the problem is not corrected, investigate DHCP infrastructure and connectivity.
DHCP troubleshooting:
Enable DHCP debugging on the Aruba Mobility Controller at the AP device location.
config t
end
View the DHCP debug for the wireless device using this CLI command
show log network all | include <wireless device MAC address>
show user mac <wireless device MAC address> or show user ip <ipaddr>
This command lists all details pertaining to the client. Use this output to confirm that the user's authenticated role is correct.
Use this command to determine which policies are associated to the device's authenticated role and verify that they allow the required protocols for device IP and application connectivity.
This command displays all IP flows between the device and the network.
Have the device attempt a connection to its required network resource and use this command to
confirm that traffic passing from the device is not being denied by the Aruba stateful firewall
role-based policies by verifying no IP flow is marked with the D flag (denied).
Using these steps, you can determine if the device has received a proper IP address, has been placed in
the correct user-role with the correct policies, and verify network connectivity. If none of these steps
yields information that helps you correct the problem, then prepare a wired packet capture for the
Aruba Support team to analyze between the Aruba Mobility Controller and the uplink switch. This can
be done with built-in operating system applications like tcpdump, network monitor, or third-party
software like Wireshark, Ethereal, or WildPackets OmniPeek/EtherPeek. Another method to achieve device packet capture is by implementing session mirroring in the device's user-role on the mobility controller.
Anything with Last Rx SNR value of 25 or greater normally provides good performance with the
higher supported 802.11 data rates.
4. Compare the problem user's stated location with the building and AP floor plan, or use Aruba RF Plan.
5. Ask the user who is reporting the trouble if anyone else nearby is having the same issue. This
information assists in determining if this is an infrastructure or single-user problem.
6. Check the user log and the AP 802.11 management frames for possible cause of disconnection.
This command determines from when and where the disconnection originated (either the AP or
the device) and helps determine the reason.
7. Check the 802.11 association state of the wireless device.
show ap debug client-table ap-name <Aruba AP name where the wireless device is
associated to>
Part of this CLI output displays the Last_Rx_SNR, Tx_Rate, and Rx_Rate of the wireless device.
If the SNR is 15 or lower, the wireless device is possibly too far from the AP. This might be due to the device's roaming algorithm not being optimal; the device may need to be forced to look for a closer AP by disabling and re-enabling its network adapter.
If the Tx_Rate or Rx_Rate are 1, 2, or 6, the device may be experiencing interference or is too far
away from the AP.
If the Tx Retry rate is constantly 35% or higher, the device may be experiencing interference or is
far away from the AP.
There might be non-802.11 interference if the MAC and PHY errors are at an aggregate of 20% or
higher, which can be seen using this CLI command:
show ap arm rf-summary ap-name <Aruba AP name where the wireless device is
associated to>
8. Check mobility trail to determine if the client is bouncing between APs even when stationary.
9. Check device frame retry rate, noise levels, and SNR for the client.
10-20% is normal
20-30% is intermediate
A rate of 40%, for example, means that 40% of the frames sent to the air have been retransmitted.
This is a symptom of heavy interference or low signal strength between the device and the AP.
Take a wireless packet capture to see if the 802.11 frame retries are due to the AP not hearing the
wireless device, or the wireless device is not hearing the AP due to interference, or the device is
too far from the AP.
Channel Noise:
If channel noise is at a value of 75 or below (that is, a noise floor of -75 dBm or higher), this is a
critical interference level that should be investigated with a spectrum analyzer.
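As a rough illustration, the SNR, retry-rate, and noise thresholds from steps 7 and 9 above can be collected into a single triage helper. This is a hypothetical sketch; the function name and structure are our own, not an Aruba tool.

```python
# Hypothetical triage helper encoding the rule-of-thumb thresholds above.

def assess_client(last_rx_snr, tx_retry_pct, noise_value=None):
    """Return a list of findings based on the thresholds in this section."""
    findings = []

    # SNR thresholds (step 7): >= 25 is good; <= 15 suggests the client is
    # too far from the AP or roaming suboptimally.
    if last_rx_snr >= 25:
        findings.append("SNR good: expect higher 802.11 data rates")
    elif last_rx_snr <= 15:
        findings.append("SNR low: client possibly too far from AP")

    # Frame retry thresholds (step 9): 10-20% normal, 20-30% intermediate,
    # higher rates point to interference or low signal strength.
    if tx_retry_pct < 20:
        findings.append("retry rate normal")
    elif tx_retry_pct < 30:
        findings.append("retry rate intermediate")
    else:
        findings.append("retry rate high: suspect interference or low signal")

    # A reported channel noise value of 75 or below (a noise floor of
    # -75 dBm or higher) is a critical interference level.
    if noise_value is not None and noise_value <= 75:
        findings.append("critical noise: inspect with a spectrum analyzer")

    return findings
```

For example, a client reporting SNR 20 with a 40% retry rate would flag only the high retry rate, pointing the investigation toward interference rather than distance.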
From these steps you can determine possible causes for poor performance or roaming issues due to
suboptimal device driver behavior, roaming outside of the WLAN coverage area, or interference. If
none of these steps yields information that helps you correct the problem, then take a wireless packet
capture for Aruba Support to analyze by means of the AP Remote Packet Capture method or third-party
software (for example, WildPackets OmniPeek, CACE Technologies AirPcap, and so on). Please also
provide the Aruba Support Team with all the necessary CLI command output for mobility controller,
AP, and user statistics.
7. Provide the wireless device's make, model number, and OS version, including any service packs
or patches.
8. Provide the wireless LAN card's make, model number, driver date, driver version, and configuration
on the wireless device.
9. Provide a detailed network topology:
a. Include all the devices in the network between the user and the Aruba WLAN Controller with IP
addresses and Interface numbers, if possible.
b. The diagram can be formatted as Visio, PowerPoint, JPEG, TIF, etc., or it can even be
handwritten and then faxed to the Aruba Support Team (1-408-227-4550).
10. Provide any wired or wireless sniffer traces taken during the time of the problem.
11. Provide the following HD WLAN statistic output on the mobility controller:
a. show aaa state user <wireless client ip address>
b. show ap association client-mac <wireless device's mac address>
c. show ap debug mgmt-frames client-mac <wireless device's mac address>
d. show ap debug client-stats <wireless device's mac address> advanced
Run this command at least three times during the debugging.
e. show ap monitor stats ap-name <ap name> mac <client mac> verbose
Run this command at least three times during the debugging.
f. show auth-tracebuf mac <wireless client mac address>
12. Provide the following AP statistics on the mobility controller output:
a. show ap tech-support ap-name <Aruba AP name where the wireless device is
associated to>
Run this command at least three times for every AP the wireless device has a problem with
performance or roaming to.
13. If Layer 3 Mobility is enabled on the mobility controllers, provide the following CLI output:
a. show ip mobile binding | begin <wireless device's mac address>
b. show ip mobile domain
c. show ip mobile global
d. show ip mobile host <wireless device's mac address>
e. show ip mobile remote <wireless device's mac address>
f. show ip mobile trace <wireless device's mac address>
g. show ip mobile traffic foreign-agent
h. show ip mobile traffic home-agent
i. show ip mobile traffic proxy
j. show ip mobile traffic proxy-dhcp
k. show ip mobile trail <wireless device's mac address>
l. show ip mobile visitor <wireless device's mac address>
Appendix A
HD WLAN Testbed
In Step #3: Choose a Concurrent User Target on page 25 in Chapter 3, Capacity Planning
for HD-WLANs, we presented summary HT20 results from the HD WLAN testbed that Aruba used while
authoring this VRD. This appendix explains the testbed design, the test plans, and a summary of the
most interesting results for both 20-MHz and 40-MHz channel widths.
Testbed Design
The need for real-world, open-air performance data when planning an HD WLAN cannot be
overstated. Such data takes out much of the guesswork, but it can be expensive and time-consuming to
obtain because it requires dozens of workstations, plenty of spare network hardware, skilled engineers,
shielded test facilities, and specialized measurement tools. Recognizing this challenge and the
broad-based marketplace need, Aruba undertook a research program into client scaling as part of its
industry leadership efforts to assist customers with HD WLAN capacity planning.
Aruba tested 50 late-model laptops with a diverse mix of manufacturers, operating systems, and
wireless adapters. They are summarized in Table 14. The goal was to mimic the uncontrolled,
heterogeneous environment that exists in most auditoriums.
Table 14 HD WLAN Testbed Device Population
Form Factor     Manufacturer    Wireless Adapter    Operating System
Laptop: 40      Acer            Intel 5100agn       Windows XP
Netbook: 10     Apple           Intel 4965agn       Windows Vista
                Dell            Intel 5300agn       Windows 7
                HP              Broadcom 4321agn    MacOS
                Lenovo          Dell 1490
                Toshiba         Dell 1505agn
                                Dell 1515agn
                                Linksys WPC600N
TOTAL: 50       TOTAL: 50       TOTAL: 50           TOTAL: 50
HD WLAN Testbed | 95
HD WLAN Testbed | 95
Our test facility in San Jose, California is in an area with virtually no wireless transmitters, so the RF is
extremely clean. Laptops were placed in three rows with spacing between units of 4 in (10 cm), as
shown in Figure 58.
Figure 58 Aruba HD WLAN Test Area During 30 Station Test
Ixia Chariot 7.1 was used to generate repeatable IP traffic loads and to provide a control plane for the
tests. Most tests were run three times as a quality check. Each machine in the testbed had two active
network interfaces. The interface under test was the wireless NIC. To ensure that measurement data
was not lost during a test run due to wireless contention, all Ixia management traffic was sent via a
wired Ethernet link.
An Aruba 3600 controller running ArubaOS 3.4.2.3 was used to execute all tests. A single Aruba AP-125
was used for the client scaling tests. CCI and ACI tests used three AP-125s at varying distances
depending on the test case. Open authentication was used on the test SSIDs. Channel 157 was used for
the HT20 tests, and channel 161 was used for the HT40 tests.
Each complete test case to characterize a given variable thus included 32 separate runs. Table 15 lists
the actual client counts for each test that was run.
Table 15 20-MHz Channel Tests
Clients   100% HT20 /   75% HT20 /    50% HT20 /    25% HT20 /    0% HT20 /
          0% 802.11a    25% 802.11a   50% 802.11a   75% 802.11a   100% 802.11a
1         1 / 0         n/a           n/a           n/a           0 / 1
6         6 / 0         4 / 2         3 / 3         2 / 4         0 / 6
10        10 / 0        7 / 3         5 / 5         3 / 7         0 / 10
20        20 / 0        15 / 5        10 / 10       5 / 15        0 / 20
30        30 / 0        22 / 8        15 / 15       8 / 22        0 / 30
40        40 / 0        30 / 10       20 / 20       10 / 30       0 / 40
50        50 / 0        37 / 13       25 / 25       13 / 37       0 / 50
The Chariot script used was throughput.scr with default settings and a duration of 30 seconds. This
script generates continuous TCP traffic. Four streams were used on each client. Aruba conducted
upstream, downstream, and bidirectional test cases for each combination in Table 15.
For 20-MHz channels, we are interested in the following questions:
How does aggregate channel capacity change as more clients are added to a channel?
How does per-client throughput change as more clients are added to a channel?
How much does throughput change as the ratio of legacy stations increases?
How many stations can contend for the channel before overall channel capacity begins to decline?
Figure 59 ACI Test AP Placement: three APs on channels 40, 44, and 48, separated by 20 ft (6 m) and
25 ft (7.6 m) (overhead view)
The ACI test results are presented in Chapter 5, Infrastructure Optimizations for HD WLANs on
page 51.
Table 16 TCP Bidirectional Mixed PHY Scaling Test (Aggregate Channel)
Clients   100% HT20    75% HT20     50% HT20     25% HT20     100% 802.11a
1         84.6 Mbps    n/a          n/a          n/a          22.4 Mbps
6         68.5 Mbps    53.2 Mbps    44.1 Mbps    43.1 Mbps    17.3 Mbps
10        59.9 Mbps    46.9 Mbps    41.6 Mbps    39.6 Mbps    16.9 Mbps
20        59.8 Mbps    43.9 Mbps    34.5 Mbps    34.3 Mbps    14.9 Mbps
30        54.2 Mbps    43.8 Mbps    32.9 Mbps    32.0 Mbps    14.9 Mbps
40        52.0 Mbps    41.1 Mbps    29.8 Mbps    27.3 Mbps    14.3 Mbps
50        46.8 Mbps    38.4 Mbps    26.9 Mbps    28.0 Mbps    14.0 Mbps
Some interesting items in the data stand out from the numerical presentation:
The peak single-client channel capacity was nearly 85 Mbps for pure HT20 vs. 22 Mbps for 802.11a.
The AP-125 provides robust and consistent performance with 50 stations and 200 individual flows.
The results were extremely repeatable, which builds confidence in the accuracy of the testbed and
data collected.
Figure 60 shows the same data displayed in chart form, showing six through 50 clients.
Figure 60 TCP Bidirectional Mixed PHY Scaling Test (Aggregate Channel)
The aggregate channel capacity is not constant, but rather decreases as more clients are added. As
more stations contend for the medium, the rate of collisions and other PHY-layer errors begins to
climb. This in turn reduces the effective maximum throughput of the channel.
However, overall channel capacity with 50 stations degrades by just 40%, which indicates that the
channel is robust in the face of significant contention for the medium.
Each PHY type mix produces very repeatable performance relative to other mixes. This suggests
that results obtained by Aruba can be reliably extrapolated to other environments.
The performance of the 50/50 and 25/75 PHY mixes is nearly identical. This implies that the
performance gain of a mixed-mode HT network is capped until the legacy stations fall below
50% of the population.
Figure 61 shows another view of the same data, showing the channel capacity for each scaling
increment relative to six-client throughput.
Figure 61 Relative Channel Capacity with Increasing Client Counts
The main conclusions that can be drawn from this chart are:
Pure 802.11a legacy maintains the highest relative throughput at high load, losing just 20% of the six-client channel capacity with eight times more clients.
The 50/50 and 25/75 PHY mixes cluster together, suffering the greatest relative throughput loss at
high load of nearly 40%. Clearly, the presence of many legacy PHYs creates inefficiencies in channel
operation.
The 75/25 and pure HT20 PHY mixes also cluster together, maintaining nearly 70% of the six-station
channel capacity even with 50 stations contending simultaneously.
Table 17 TCP Bidirectional Mixed PHY Scaling Test (Per Client)
Clients   100% HT20    75% HT20     50% HT20     25% HT20     100% 802.11a
1         84.6 Mbps    n/a          n/a          n/a          22.4 Mbps
6         11.4 Mbps    8.8 Mbps     7.3 Mbps     7.2 Mbps     2.9 Mbps
10        5.9 Mbps     4.6 Mbps     4.1 Mbps     3.9 Mbps     1.5 Mbps
20        2.9 Mbps     2.2 Mbps     1.7 Mbps     1.7 Mbps     0.7 Mbps
30        1.8 Mbps     1.4 Mbps     1.1 Mbps     1.0 Mbps     0.5 Mbps
40        1.3 Mbps     1.0 Mbps     0.7 Mbps     0.6 Mbps     0.3 Mbps
50        0.9 Mbps     0.7 Mbps     0.5 Mbps     0.5 Mbps     0.2 Mbps
Figure 62 shows the same data charted, showing values from 10 concurrent users out to 50 stations.
Figure 62 TCP Bidirectional Mixed PHY Scaling Test (Per Client)
Some of the principal insights that should be drawn from this chart are:
On average, an auditorium with 100% HT20 devices can deliver 3 Mbps each at 20 stations, and
nearly a full 1 Mbps at 50 stations.
An auditorium with a 50/50 mix of devices can deliver 1 Mbps each at 30 stations and 512 Kbps each
at 50 stations.
Average per-client throughput declines in a very predictable way out to at least 50 concurrent users.
The principal insights that should be drawn from this chart are that on a per-client basis:
A cell with 25% legacy devices will achieve an average of 20% less throughput than pure HT20.
A cell with more than 50% legacy devices will achieve an average of 40% less throughput than pure
HT20.
An auditorium with 100% legacy devices will achieve an average of 75% less throughput than pure
HT20. Put differently, a pure HT20 client environment will deliver four times the performance of a
pure legacy environment.
Interestingly, little difference was observed with more than 50% legacy clients. So long as at least one
HT20 client exists in the environment, the overall throughput will roughly double. But it cannot exceed
this amount until the legacy station ratio drops below half.
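As an illustration of how the per-client figures relate to the aggregate figures, per-client throughput is simply the aggregate channel capacity divided by the number of contending stations. A minimal sketch using the measured pure-HT20 aggregates from the scaling test (published per-client values are rounded to 0.1 Mbps):

```python
# Measured pure-HT20 aggregate channel capacity (Mbps) by client count,
# taken from the bidirectional scaling test data above.
HT20_AGGREGATE_MBPS = {6: 68.5, 10: 59.9, 20: 59.8, 30: 54.2, 40: 52.0, 50: 46.8}

def per_client_mbps(clients):
    """Average per-client throughput is aggregate capacity / station count."""
    return HT20_AGGREGATE_MBPS[clients] / clients
```

For example, per_client_mbps(20) yields about 2.99 Mbps, matching the roughly 3 Mbps at 20 stations cited above, and 50 stations yields about 0.94 Mbps.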
How does total HT40 channel capacity change as clients are added?
The behavioral characteristics of 40-MHz channels on Aruba APs in a high-density setting are similar to
that of 20-MHz channels. Table 18 lists the average TCP up, down, and bidirectional throughput for
increasing numbers of HT40 clients.
Table 18 TCP HT40 Client Scaling Test (Aggregate Channel)
Clients   HT40 TCP Up    HT40 TCP Down   HT40 TCP Bidirectional
1         138.5 Mbps     133.0 Mbps      154.0 Mbps
10        145.1 Mbps     134.0 Mbps      151.2 Mbps
20        136.3 Mbps     132.2 Mbps      141.3 Mbps
30        126.0 Mbps     112.9 Mbps      132.2 Mbps
40        113.0 Mbps     116.4 Mbps      115.9 Mbps
50        96.4 Mbps      97.1 Mbps       108.7 Mbps
Average single-client channel capacity of 154 Mbps for pure HT40 is 181% of the 85 Mbps seen with
pure HT20.
Average 50-client capacity of 108 Mbps is 232% of the 47 Mbps seen with pure HT20.
TCP up and downstream performance were consistent, with approximately a 10% gain seen for the
bidirectional case.
Figure 64 shows the same aggregate channel throughput results displayed in chart form, showing one
through 50 clients.
Figure 64 TCP HT40 Client Scaling Test (Aggregate Channel)
Table 19 TCP HT40 Client Scaling Test (Per Client)
Clients   HT40 TCP Up   HT40 TCP Down   HT40 TCP Bidirectional
1         138.5 Mbps    133.0 Mbps      154.0 Mbps
10        14.5 Mbps     13.4 Mbps       15.1 Mbps
20        6.8 Mbps      6.6 Mbps        7.0 Mbps
30        4.2 Mbps      3.7 Mbps        4.4 Mbps
40        2.8 Mbps      2.9 Mbps        2.9 Mbps
50        1.9 Mbps      1.9 Mbps       2.1 Mbps
Figure 65 shows the same data charted, showing values from 10 concurrent users out to 50 stations.
Figure 65 TCP HT40 Client Scaling Test (Per Client)
Some of the key insights that should be drawn from this chart are:
On average, pure HT40 cells deliver roughly 230% of the per-client throughput of pure HT20 cells.
The rate of channel degradation due to increasing contention is nearly identical for both HT40 and
HT20.
A cell with 100% HT40 devices can deliver 4 Mbps each at 30 stations vs. 1.8 Mbps for HT20.
Appendix B
Advanced Capacity Planning Theory for HD WLANs
Performance in the IT world is measured in terms of the total data transferred by a given number of
devices in a given time. This benchmark rises inexorably, year after year, as a result of advancing
standards and innovations in implementation. This benchmark can also be applied to a high-density
wireless coverage zone. However, you must consider other variables besides throughput and user
counts to successfully achieve a desired capacity plan. These variables include maximum concurrent
users, the number of usable channels, channel width, and the number of allowable channel reuses. In
this appendix you will learn some of the theoretical basis of the HD WLAN capacity planning
methodology presented in Chapter 3, Capacity Planning for HD-WLANs on page 17.
The radio budget formula is:

D = C x U x R

where:
D = maximum number of concurrent devices
C = total number of 5 GHz and 2.4 GHz nonoverlapping RF channels available for use
U = concurrent user target per radio
R = number of channel reuses
The formula allows us to quickly estimate how many 802.11 clients can be supported in a room when
key values are known. For example, if we make the following assumptions:
Single-band 5-GHz deployment in the United States with 20-MHz (HT20) channels where the UNII-2
Extended band is allowed (C=20), a concurrent user target of 25 (U=25), and no channel reuse (R=1)
Then D = 20 x 25 x 1, or 500 maximum concurrent devices, all of which must be 5-GHz-capable. This
provides the wireless designer with a quick snapshot of the feasibility of covering a given high-density
zone.
If we know the targeted number of clients, we can solve for the number of required channel reuses
when the other values are known:
R = D / (C x U)
If R is less than or equal to one, it means that you do not need to reuse channels within the same room.
This is the preferred situation, because it means a simpler and cheaper RF design. If you are getting a
value of R that is between 1 and 1.5, it is strongly in your interest to revisit your assumptions to see
whether you can compromise in a way that allows you to avoid channel reuse.
Values of R greater than 1.5 mean that you must have more than one AP on the same
channel in the same room. This almost certainly means that under-floor mounted external antennas will
be needed to control the propagation of signal within the room, and careful control of AP and client
transmit power will be required (among other factors). RF coverage strategies for multiple-reuse HD
WLANs are explained in Chapter 4, RF Design for HD WLANs on page 31 and Appendix C, Basic
Picocell Design on page 113.
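As an illustration, the reuse calculation and its interpretation bands can be sketched as follows (function names are hypothetical):

```python
# Sketch of the radio budget reuse calculation, R = D / (C x U), with the
# interpretation bands described above. Function names are our own.

def channel_reuses_required(devices, channels, users_per_radio):
    """R = D / (C x U)."""
    return devices / (channels * users_per_radio)

def interpret_reuse(r):
    """Map R onto the design guidance described in this section."""
    if r <= 1.0:
        return "no channel reuse needed"
    if r <= 1.5:
        return "revisit assumptions to try to avoid reuse"
    return "multiple same-channel APs per room (picocell design)"
```

For the European auditorium example (D=500, C=22, U=25) this yields R of about 0.91, so no reuse is needed; for the trading floor (D=1,000, C=9, U=20) it yields R of about 5.6, firmly in picocell territory.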
To gain an understanding of how the radio budget is used to obtain a reuse requirement, consider
the simplified examples in Table 21.
Table 21 Example Radio Budgets
                          Example 1:                 Example 2:
                          European Auditorium        New York Trading Floor
AP Type                   802.11gn (dual-band)       802.11a (single-band)
Primary Capacity Goal     D = 500 users              D = 1,000 users
Concurrent User Target    U = 25                     U = 20
Available Channels        C = 22                     C = 9
Channel Reuses Required   500 / (22 x 25) = 0.9      1,000 / (9 x 20) = 5.5
                          = No Reuse                 = 5+ reuses per channel
In the European auditorium (example 1), we have 22 available channels (including three on the 2.4-GHz
band for iPod and smartphone device types and 19 in the 5-GHz band). In the United Kingdom, there are
presently 13 channels in 2.4 GHz and 24 channels in 5 GHz available. We also set U to the commonly
used value of 25 concurrent users per AP. With this information, we can determine that there are more
user berths (550) than users (500). So radio channels do not need to be reused in the auditorium. This
greatly simplifies our RF design. However, if more than 75 of the devices lack 5-GHz radios, the plan
must be reconsidered.
Example 2, the trading floor in New York City, is more complicated, not only because of the larger
device population, but also because DFS events greatly reduce the available channels. In this case, we
set U to a conservative value of 20 to allow for future growth. The radio budget tells us that we'll need to
reuse each channel at least five times in the same room to fit all 1,000 devices into the 180 available
concurrent user slots. This will require a very special RF design, and possibly customization of the
client device radio driver, to meet the primary capacity goal. Knowing that the channel reuse factor (R)
is 5.5 allows the wireless designer to assess the difficulty level of the design, and begin to think about
cost/benefit justification.
The throughput budget formula is:

Bd = (C x R x Br) / D

where Br is the per-radio throughput target. For example, if we make the following assumptions:
Single-band 5-GHz HT40 deployment in the United States where the UNII-2 Extended band is
allowed (C=11)
Then Bd = (11 * 1 * 150) / 600 = 2.75 Mbps per HT40-capable client assuming that all clients are evenly
distributed across all radios and ACI losses do not degrade per-radio throughput below 150 Mbps.
However, this design has a significant limitation. Each radio would have to support 55 clients assuming
a 100% duty cycle. Even though Aruba APs have been proven stable well beyond 50 clients, you will see
data later in this chapter that shows that the overall capacity of the channel begins to degrade above 20
stations due to contention between stations.
It is always better to use 20-MHz (HT20) channels with a high-density 802.11n deployment than
40-MHz (HT40) channels.
NOTE
In this case, Bd = (24 * 1 * 75) / 600 = 3.0 Mbps per HT20-capable client. Not only is this 9% more
throughput per client, but the number of concurrent users per radio is a much healthier value of 25.
In fact, the actual HT40 performance is likely to be closer to 120 Mbps than the 150 Mbps planning value
because total channel capacity degrades due to management overhead and contention with larger
numbers of clients. In this case, the use of 20-MHz channels would yield a 26% improvement in total
throughput vs. the HT40 case.
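The two throughput budgets above can be reproduced with a small sketch (names are our own):

```python
# Sketch of the throughput budget, Bd = (C x R x Br) / D, used above to
# compare the HT40 and HT20 plans for the same 600-device room.

def per_device_bandwidth_mbps(channels, reuses, radio_mbps, devices):
    """Bd = (C x R x Br) / D, in Mbps per device."""
    return (channels * reuses * radio_mbps) / devices

ht40 = per_device_bandwidth_mbps(11, 1, 150, 600)  # 11 HT40 channels at 150 Mbps
ht20 = per_device_bandwidth_mbps(24, 1, 75, 600)   # 24 HT20 channels at 75 Mbps
```

This reproduces the 2.75 Mbps (HT40) and 3.0 Mbps (HT20) figures above, and makes it easy to re-run the comparison with a derated HT40 planning value such as 120 Mbps.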
Table 22 shows how the auditorium and trading floor examples from the previous section work from a
bandwidth perspective.
Table 22 Example Throughput Budgets
                          Example 1:                          Example 2:
                          European Auditorium                 New York Trading Floor
AP Type                   802.11gn (dual-band)                802.11a (single-band)
Primary Capacity Goal     D = 500 users                       D = 1,000 users
Channel Reuse             R = 1 (no reuse)                    R = 5.5
Estimated Per-Device      Bd = ((3 x 21) + (19 x 75)) / 500   Bd = (9 x 5.5 x 10) / 1,000
Throughput                   = 2.98 Mbps                         = 495 Kbps
In example 1 on the left, we have to use two different values for Br. On the three 802.11g channels we
use 21 Mbps, while on the 19 HT20 channels we can use the full 75 Mbps. The throughput budget
formula shows that the primary capacity goal can easily be met. Note that these are best-case
throughput values; in practice, rate adaptation and client orientation and distance from the AP reduce
them to some lower average.
Example 2, the trading floor in New York City, is more challenging. Notice that we used a figure of just
10 Mbps for Br. This is because R is very high, with each channel reused five times in the same physical
area. We saw in Chapter 5, Infrastructure Optimizations for HD WLANs, that adding additional
same-channel APs actually reduces aggregate throughput. Even with a working picocell design,
contention between clients will increase. Therefore, we assume that we will achieve no more than
50% of the normal channel capacity in any given cell.
Therefore, to meet the primary capacity goal of 512 Kbps per client, the RF design must deliver at least
10 Mbps of throughput on each radio. Aruba has validated in a lab and in customer production
environments that this result can be achieved with this amount of channel reuse in a single room.
However, even if the full 10 Mbps per radio is successfully reached, the bandwidth budget formula
shows that each client will receive at most 495 Kbps, which is slightly less throughput than the goal.
This result tells the wireless architect that the application developers will need to be consulted to verify
that they can operate with a lower level of bandwidth than requested.
Figure: HD WLAN capacity planning workflow - device count (D), channel count (C), concurrent user
target (U), capacity reserve (-U), determine reuse (R), radio bandwidth target (Br), determine
available per-device bandwidth (Bd), validate primary capacity goal
1. Device count (D): Determine the number of concurrent wireless client connections needed over
the useful life of the network.
2. Channel count (C): Determine the number of different, nonoverlapping frequencies that are
available and usable by the expected client device drivers.
3. Concurrent user target (U): Determine the number of concurrently transmitting clients that each
AP can handle (per radio).
4. Capacity reserve (-U): Choose the amount of spare capacity that you want to hold back for traffic
peaks and future growth. Adjust U downward by this amount.
5. Determine reuse (R): Use the radio budget formula to determine if each channel will need to be
used more than one time
6. Radio bandwidth target (Br): Look up the maximum per-radio throughput for the radio type and
R value (802.11a/b/g/n). Tables are provided for this purpose in Appendix A, HD WLAN Testbed on
page 95. If you plan to reuse channels in the same room, divide the single-radio throughput value by
the number of reuses.
7. Determine available per-device bandwidth (Bd): Derive the per client value and compare to the
primary capacity goal.
8. Validate primary capacity goal: If the primary capacity goal cannot be achieved, make necessary
design compromises and repeat steps 1-7 with adjusted input values.
This methodology allows the wireless architect to make an assessment of how much bandwidth can be
made available to a given population of users.
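The eight steps above can be sketched as a single check. All function and parameter names are our own, and Br is assumed to be the looked-up (and, for reused channels, already derated) per-radio throughput:

```python
# A minimal sketch of capacity planning steps 1-8 as one function.

def validate_capacity_plan(devices, channels, users_per_radio, reserve,
                           radio_mbps, goal_mbps):
    users = users_per_radio - reserve                # step 4: capacity reserve
    reuses = devices / (channels * users)            # step 5: R = D / (C x U)
    bd = (channels * reuses * radio_mbps) / devices  # step 7: Bd = (C x R x Br) / D
    return reuses, bd, bd >= goal_mbps               # step 8: validate goal
```

For the trading floor (D=1,000, C=9, U=20, Br=10 Mbps, goal 0.512 Mbps) this returns R of about 5.6 and Bd of 0.5 Mbps, just short of the goal, which is the same conclusion reached above. Note that when R is derived this way, Bd reduces algebraically to Br / U.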
Appendix C
Basic Picocell Design
The main body of this VRD assumes that no channel reuse is needed to implement an HD WLAN in an
auditorium. This is easily achievable in countries that offer 13 or more 20-MHz channels in the 5-GHz
band. However, this is much harder for countries such as China with only five allowed 5-GHz channels
at the time of this writing, or sites that reserve channels for other purposes such as medical telemetry
or security video.
This appendix covers the basics of picocell design to help wireless engineers understand the
requirements and constraints of channel reuse. A full treatment of picocell design is beyond the scope
of this guide. Contact your local Aruba representative for further information or engineering
consultation.
Client device radios tend to increase picocell radius as opposed to shrinking the cell radius.
The link budget for clients at the picocell edge must factor in variable amounts of structural and
body loss.
The minimum channel reuse distance between picocells on the same channel must be determined
and observed.
Figure: Overhead view of a 20-row by 20-seat auditorium showing the inner AP radius (r1), the client
interference radius (r2), and the outer picocell radius (r3)
Inner AP radius (r1): This is the usual cell edge of the AP. It is the target data rate radius, not the
interference radius. It is defined as the maximum distance at which the SNR exceeds the value
required to demodulate the desired minimum data rate, typically MCS7 and MCS15 in an HD WLAN.
In a picocell operating at very low transmit power, this distance is often less than 30 ft (10 m).
Client interference radius (r2): This is the distance at which a client radio transmission can
interfere with a same-channel transmission by another station. Typically, this means that the SNR is
4 dB or greater, which is the minimum required to decode an 802.11a/n frame. Therefore, this radius
is much greater than the inner AP radius.
A picocell network works best when the seats are full. In this case, the increased lateral human body
attenuation will shrink the client interference radius dramatically. This effect is deliberately exploited
in picocell design to achieve reuse.
NOTE
Outer picocell radius (r3): This is the outer boundary of all the client interference radii when
multiple clients exist at the edge of the inner AP radius. This is the effective radius of the picocell.
r3 expands and contracts depending on how full the seats are.
From this analysis, it is clear that client transmissions have the result of increasing the size of a picocell
due to the small power levels and short distances involved.
Figure: Link budget components for an under-floor AP - transmit power (PTX), antenna gains (GTX,
GRX), and free space (Lfreespace), floor (Lfloor), and body (Lbody) losses
In the figure, all three types of loss vary with the angle of incidence, which increases the distance that
RF energy must travel. Free space loss increases with distance. Floor loss increases as the length of the
path through the floor goes up. Body loss increases as the number of bodies or body parts in the
transmission path increases.
The quickest, surest way to obtain reliable planning data is to do an active RF survey with a test AP
installed on, in, or under the floor as envisioned by the designer. Using a site survey tool such as
AirMagnet or Ekahau, it is possible to quickly determine Lfloor + Lfreespace for an empty auditorium.
To quantify Lbody, the test can be repeated with volunteers filling up a section, or possibly during an
actual event. The Aruba Customer Engineering (ACE) organization has measured body loss data, and is
available to consult with customers planning picocell systems. Ask your local Aruba systems engineer
for more information.
distance (km) = 10 ^ ((PTX - PRX + GTX + GRX - Lfloor - Lbody - 20 x log10(f) - 32.4) / 20)

where f is the frequency in MHz.
In Chapter 4, RF Design for HD WLANs on page 31 we presented minimum separation distances for
adjacent channel APs, meaning those with at least 20-MHz separation of center frequencies. Those were
computed with this formula. In the same-channel case, Table 23 shows the required separation for no
interference and partial interference in both 2.4 GHz and 5 GHz.
Table 23 AP to AP Minimum Separation Distance APs Operating on Same Channel
Power     5 GHz                                      2.4 GHz
Setting   -85 dBm       -80 dBm      -75 dBm         -85 dBm       -80 dBm       -75 dBm
15 dBm    152 ft/46 m   85 ft/26 m   48 ft/15 m      321 ft/98 m   180 ft/55 m   101 ft/31 m
12 dBm    107 ft/33 m   60 ft/18 m   34 ft/10 m      227 ft/69 m   128 ft/39 m   72 ft/22 m
9 dBm     76 ft/23 m    43 ft/13 m   24 ft/7 m       161 ft/49 m   90 ft/28 m    51 ft/16 m
6 dBm     54 ft/17 m    30 ft/9 m    17 ft/5 m       114 ft/35 m   64 ft/20 m    36 ft/11 m
As a general rule, picocell networks require five or more channels to ensure at least a one-cell gap
between same-channel APs. With nine channels it is possible to ensure at least two-cell separation and
at least 40 MHz of frequency isolation between adjacent channels.
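As a rough sketch, the separation-distance formula above can be computed as follows. The 20 dB of combined floor and body loss used in the example is our assumption; with it, the result lands near the 2.4-GHz, 15-dBm, -85-dBm value in Table 23, but the exact losses behind the table are not stated.

```python
import math

# Sketch of: distance (km) = 10 ^ ((PTX - PRX + GTX + GRX - Lfloor - Lbody
#                                   - 20 x log10(f) - 32.4) / 20)
# with f in MHz, powers in dBm, and gains/losses in dB.

def separation_km(p_tx, p_rx, f_mhz, g_tx=0.0, g_rx=0.0,
                  l_floor=0.0, l_body=0.0):
    exponent = (p_tx - p_rx + g_tx + g_rx - l_floor - l_body
                - 20 * math.log10(f_mhz) - 32.4) / 20
    return 10 ** exponent

# 15 dBm AP, -85 dBm interference threshold, channel 6 (2437 MHz), and an
# assumed ~20 dB of combined floor + body loss: roughly 98 m.
d_m = separation_km(15, -85, 2437, l_floor=10, l_body=10) * 1000
```

Without the assumed floor and body losses, the pure free-space result is about ten times larger, which illustrates how heavily picocell reuse depends on structural and body attenuation.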
seats = concurrent user limit / (devices per seat x average duty cycle)

where:
Concurrent user limit is the value you chose in Chapter 3, Capacity Planning for HD-WLANs on
page 17.
Average duty cycle is the percentage of time the devices have data to transmit.
For example, a cell with a 50 concurrent user limit, where each seat has one device and the duty cycle
is 10%, could support 500 seats. However, a cell with a 25 concurrent user limit, a single device per
seat, and a 75% duty cycle can only support 33 seats.
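A minimal sketch of the seat-count formula above (names are our own; one device per seat matches the examples):

```python
# Sketch of: seats = concurrent user limit / (devices per seat x duty cycle).

def supported_seats(concurrent_user_limit, devices_per_seat, avg_duty_cycle):
    """How many seats a picocell can serve for a given activity level."""
    return concurrent_user_limit / (devices_per_seat * avg_duty_cycle)
```

This reproduces both examples: 50 concurrent users at a 10% duty cycle supports 500 seats, while 25 concurrent users at a 75% duty cycle supports only about 33.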
Our focus is 5-GHz coverage in this guide, and it should be noted that some additional clients can be
served in the 2.4-GHz band, which permits some increase in the size of the picocell from a capacity
perspective. However, due to the inherent limitations of 2.4 GHz, it is often better to exclude it from the
capacity plan.
Appendix D
Dynamic Frequency Selection Operation
With a total of about twenty 20-MHz channels (different vendors support slightly different numbers),
the 5-GHz band with Dynamic Frequency Selection (DFS) now has sufficient channels to implement most
HD WLAN scenarios. So why wouldn't everyone use DFS?
Three significant exceptions could adversely affect HD WLAN performance with DFS enabled. The
wireless architect must assess whether any of these exceptions applies to their organization:
Need for the Aruba Receive Sensitivity Tuning-Based Channel Reuse feature
If you do plan to use DFS channels, here is an overview of how DFS works and what you can expect
when radar events occur.
For more information, please see Chapter 4, RF Design for HD WLANs on page 31.
does not mean there is no radar. Other common sources of radar include marine shipping traffic,
military installations, and Doppler weather systems at local television stations.
A DFS survey is relatively simple to perform, and requires an Aruba controller and AP. These are the
basic steps:
1. Install the controller with ARM scanning disabled. If the controller is not in the location where the
survey will take place, arrange for wide-area connectivity to the AP.
2. Provision the AP to operate on channel 52.
3. Allow the AP to dwell on that channel for four hours.
4. If a radar event has occurred, it is noted in the system log, and you'll notice that the AP has
moved to another channel.
5. Repeat steps 2 and 3 on the next highest 20-MHz channel until channel 140 has been completed.
Unfortunately, radar pulses cannot be detected with any PC-based portable spectrum analyzers on the
market as of this writing. The cost of renting and operating a laboratory-quality spectrum analyzer is
typically much higher than simply using the Aruba equipment you intend to deploy.
DFS Summary
DFS channels are a vital weapon in the wireless architect's arsenal when planning any HD WLAN.
However, due to the importance of these possible exceptions, all DFS channels are disabled in ArubaOS
by default but can be easily enabled. This is explained in Chapter 6, Configuring ArubaOS
for HD-WLANs on page 67.
Before enabling DFS channels in any WLAN system, it is critical to complete a DFS survey and to
understand the behavior of all client devices in the system on all DFS channels.
Appendix E
Aruba Contact Information
Web Site: http://www.arubanetworks.com
Support Site: https://support.arubanetworks.com
Licensing Site: https://licensing.arubanetworks.com/login.php
WSIRT Site: http://www.arubanetworks.com/support/wsirt.php

Support Emails
Support: support@arubanetworks.com
EMEA: emea_support@arubanetworks.com
WSIRT (please email details of any security problem found in an Aruba product): wsirt@arubanetworks.com

Telephone Support
Aruba Corporate: +1 (408) 227-4500
FAX: +1 (408) 227-4550
Support:
United States: +1-800-WI-FI-LAN (800-943-4526) or 1 650 3856589
Canada: 1 800 9434526 or 1 650 3856589
Saudi Arabia: 800 8445708
UAE: 800 04416077
India: 91 044 66768150
Support numbers are also available for Australia, the United Kingdom, Japan, Korea, Singapore,
Taiwan, Belgium, Israel, Ireland, Hong Kong, Germany, France, China, and Egypt.