
Rethinking the Data Center

An Internet.com Networking eBook
Contents

[ Rethinking the Datacenter ]

This content was adapted from Internet.com's InternetNews, ServerWatch, and bITa Planet Web sites. Contributors: Paul Rubens, Drew Robb, Judy Mottl, and Jennifer Zaino.

2   Enterprises Face Data Growth Explosion
    Judy Mottl

4   What's the State of Your Data Center?
    Jennifer Zaino

6   Create a Recession-Proof Data Center
    Paul Rubens

9   Greening Your Data Center — You May Have No Choice
    Paul Rubens

12  Hardware for Virtualization: Do's and Don'ts
    Drew Robb

15  Why Tape Libraries Still Matter
    Drew Robb

18  Facilities Management Crosses Chasm to the Data Center
    Paul Rubens
Rethinking the Datacenter, An Internet.com Networking eBook.


© 2008, Jupitermedia Corp.


Enterprises Face Data Growth Explosion


By Judy Mottl

If you think storing your enterprise data is a tough challenge now, it's nothing compared to what it might be in just a few years.

According to a study from research firm IDC and storage vendor EMC, data requirements are growing at an annual rate of 60 percent. Today, that figure tops 45 gigabytes for every person, or 281 exabytes total (equivalent to 281 billion GB).

What should concern IT managers is that the report predicts the total amount of digital information — the "digital universe," as the study's authors call it — will balloon to 1,800 exabytes by 2011.

The findings should serve as a wake-up call to enterprises, said Charles King, principal analyst at Pund-IT.

"Creation of information is accelerating at a torrid pace, and if organizations want the benefits of information they'll need effective management tools," King wrote in a response to the IDC/EMC report. Chief factors responsible for the growth in data include burgeoning Internet access in emerging countries, increasing numbers of datacenters supporting cloud computing and the rise in social networks, the study found.

Less than 5 percent of the digital universe is from data center servers, and only 35 percent is drawn from the enterprise overall, according to IDC. Nevertheless, the IT impact will be extensive, ranging from the need to boost information governance to improving data security.

Individuals create about 70 percent of the digital universe, although companies are responsible for the security, privacy, reliability and compliance of 85 percent of that data, the study said.

King said IT will need to cope by assessing relationships with business units that classify data.
Additionally, enterprises will have to set and enforce policies for data security, access and retention, and adopt tools for contending with issues like unstructured data search, database analytics, and resource pooling, he said.

Certain business segments may be more affected than others, since they churn out more data. The financial industry, for example, accounts for just 6 percent of the digital universe. Media and communications firms, meanwhile, collectively generate 10 times that amount, according to the study.

The EMC/IDC study also found that while not all information created and transmitted is stored, enterprises will be storing only about half on average until 2011.

The report comes as a follow-up to an earlier, similarly aggressive IDC forecast about data growth.

The research firm said it based its conclusions on estimates of how much data is captured or created annually from roughly 30 classes of devices or applications. It then converted the data to megabytes using assumptions about usage and compression.
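As a back-of-the-envelope check on the headline figures above — not part of the IDC/EMC methodology, and using an assumed world-population figure — the arithmetic works out roughly as follows:

    # Rough check of the IDC/EMC "digital universe" figures (decimal units).
    EXABYTE_IN_GB = 1_000_000_000        # 1 EB = 1 billion GB

    universe_2007_eb = 281               # exabytes created and replicated in 2007
    world_population = 6.6e9             # assumption: roughly 6.6 billion people

    per_person_gb = universe_2007_eb * EXABYTE_IN_GB / world_population
    print(f"~{per_person_gb:.0f} GB per person")       # ~43 GB, close to the cited 45 GB

    growth_rate = 0.60                   # 60 percent annual growth
    universe_2011_eb = universe_2007_eb * (1 + growth_rate) ** 4
    print(f"~{universe_2011_eb:,.0f} EB by 2011")      # ~1,842 EB, in line with the 1,800 EB forecast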


What’s the State of Your Data Center?


By Jennifer Zaino

If you're like the data center managers surveyed by Symantec in the fall of 2007, maybe things aren't as good as you'd like them to be.

Symantec issued the results of its inaugural "State of the Data Center" report, based on a survey of more than 800 data center managers in Global 2000 and other large companies worldwide, with average annual IT budgets of $75 million in the U.S. and $54 million outside the States.

"We found data center managers looking at a number of challenges and techniques to combat them," says Sean Derrington, Symantec's director of storage management. "Fixed costs are continuing to increase. Sixty-nine percent say expenditures are growing 5 percent a year, so a larger and larger piece of that IT budget goes to fixed costs. That doesn't leave much incremental dollars for IT managers to play around with."

Trying to get out of that vicious cycle, and free up dollars for more strategic uses, data center managers are attempting to contain costs, deploying new technologies such as server virtualization, but finding that the money they save on hardware sometimes gets swallowed up by the increasing management complexity of those environments. That's keeping an emerging technology such as virtualization from making the leap from the test and development environment to production systems, Derrington notes.

"So the recommendation is to figure out how to create a software infrastructure that runs across the entire data center and works across physical and virtual systems, so regardless of the technology you selected to constrain costs, you won't have to sacrifice skills training for IT staff. They perform the same task the same way," Derrington says.

Symantec, of course, makes products designed to meet this need, such as Veritas NetBackup data protection and Veritas Cluster Server disaster recovery and high availability solution.


One interesting finding of the survey was that increasing or significantly increasing demands by the business, in combination with overall data center growth, are compounding the problems of data center complexity. The respondents noted that service-level expectations have increased 85 percent over the past two years — and 51 percent admit to not having met service-level agreements in the same time period.

"As they're looking at negotiating service levels, they have to figure out how to deliver those services," says Derrington — and they are hitting a wall because of inadequate staff.

Fifty-seven percent say staff skills do not meet current needs, and 60 percent say skill sets are too narrow. For example, they don't just want a Tivoli storage administrator, but someone who can work across backup and data protection infrastructures. Sixty-six percent also say there are too many applications to manage.

Consistency in operations supported by a standardized software infrastructure can make all the difference here as well, Symantec believes.

"They want to automate the same things that are potentially repetitive," says Derrington. "Take, for example, storage provisioning. That task includes storage administrators, server administrators, SAN architects, maybe the business, maybe procurement and finance. How can a company actually define the workflow and process so everyone knows what needs to be done, so the person on the job for a day provisioning storage does the same thing as someone who has been on the job for 10 years?"

That's not to say data center managers are looking for automatons. In fact, part of the reason data center managers are having staffing troubles is that they want employees not only to solve technology problems, but also to understand the implications of technology for the business.

"The reason being that an individual could write the best code or solve the best way to write a script to integrate Technology A with Technology B," Derrington says, "but if they don't understand how that impacts the business or how it delivers a business benefit, that person is not going to be as valuable."


Create a Recession-Proof Data Center


By Paul Rubens

You don't need a Nobel Prize in economics to realize that the world's economies are facing a slowdown or recession head on. And it doesn't take a genius, or large leap of logic, to work out that your data center's budget is likely to face a cut.

Whether you have an inkling a cut is coming or you haven't been warned of an impending budget cut, establishing a course of action to cut costs now would be a wise move, according to Ken McGee, a vice president and Fellow at Gartner.

As far back as 2007 Gartner was warning about the need to prepare for a recession. Since then, things have obviously changed for the worse.

"Since that time, the factors we based the research on — such as GDP growth projections and expert predictions for the likelihood of a recession — have worsened to a degree that convinces us it is now time for clients to prepare for cutting IT costs," McGee said in January.

McGee recommends dedicating top staff exclusively to investigating IT cost-cutting measures, and appointing a senior auditor or accountant to the team to provide an official record of the team's performance. He also recommends reporting progress to senior managers on a weekly basis and identifying a liaison with a legal representative to make it easier to work through legal issues that may crop up in connection with maintenance and other contracts or penalty clauses. This is to ensure cost-cutting measures don't result in increased legal liabilities for your company.

So, having established that now is the time to take measures to help the data center weather a recession, the question is where should you look to cut costs?

Cost-Cutting Sweet Spots

One of the most significant data center costs is electricity — for powering both the computing equipment and the systems used to provide cooling. Virtualization can play a key role in reducing overall electricity consumption, as it reduces the number of physical boxes needed to power and cool.

A single physical server hosting a number of virtual machines can replace two, three, or sometimes many more underutilized physical servers.


Although a physical server working at 80 percent utilization uses more electricity than one working at 20 percent, it is still far more energy-efficient than running four servers at 20 percent along with the accompanying four disk drives, four inefficient power supplies, and so on.
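The energy arithmetic behind that claim is easy to sketch; the wattage figures below are illustrative assumptions, not measurements from the article:

    # Hypothetical comparison: four lightly loaded servers vs. one consolidated host.
    watts_at_20_pct = 200                # assumed draw of a server at ~20 percent utilization
    watts_at_80_pct = 300                # assumed draw of the same class of server at ~80 percent
    hours_per_year = 24 * 365

    before_kwh = 4 * watts_at_20_pct * hours_per_year / 1000   # four underutilized boxes
    after_kwh = watts_at_80_pct * hours_per_year / 1000        # one busier consolidated host

    saving = 1 - after_kwh / before_kwh
    print(f"Before: {before_kwh:,.0f} kWh/year  After: {after_kwh:,.0f} kWh/year")
    print(f"Server energy saved: {saving:.0%}, before counting the matching cooling savings")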

Virtualization also shrinks costs by reducing the amount of hardware that must be replaced. If you operate fewer servers, you then have fewer to replace when they reach the end of their lives. Thanks to advanced virtual machine management software from the likes of Microsoft and VMware, the time spent setting up and configuring them (and thus the associated cost) can be much less than that spent managing comparable physical servers.

And virtualization need not be restricted to servers. What's true of servers is true of storage systems, too: Storage virtualization can cut costs by reducing over-provisioning and reducing the number of disks and other storage media that must be powered (and cooled), bought and replaced.

This leads to the concept of automation. Data center automation can take a vast amount of investment, but it also promises significant cost savings. In a time of recession it's prudent to look at initiatives that carry a modest price point and offer a relatively fast payback period. These may include patch management and security alerting (which in turn may enable lower-cost remote working practices) and labor-intensive tasks, such as password resets. Voice authentication systems, for example, can dramatically reduce password reset costs in organizations that have large numbers of employees calling the IT help desk with password problems. Such systems automatically authenticate the user and reset relevant passwords.
Any automation software worth its salt also has the added benefit that when it reduces the number of man-hours spent dealing with a task, managers have the flexibility to choose between reducing data center human resource costs and reassigning employees to other tasks, including implementing further cost-cutting systems — thereby creating a virtuous circle.

A more straightforward, but contentious, strategy is application consolidation. Clearly the more applications your data center runs, the more complex and expensive it will be to manage them. Thus, consolidating on as few applications as possible makes good financial sense, assuming, of course, the apps are up to the required task. If these are open source applications, which in practice probably means Linux-based ones, then there's a potential for significant savings in terms of operating system and applications license fees, and CALs.

Bear in mind that significant support costs will remain, and Microsoft and other large vendors make the case that the total cost of ownership of open source software is no lower than closed source, but at the very least, you may be able to use open-source alternatives as bargaining chips to get a better deal from your existing closed-source vendors.

As well as looking at changes that can be made at the micro level, it's also useful to look at the macro level at the way your whole data center operations are structured. For example, you may have set yourself a target of "the five nines" for system availability, but it's worth evaluating if this is really necessary. How much would it reduce your costs to ease this target to 99.9 percent? And what impact would it have on the profitability of the business as a whole?

If you can identify only a few applications that require 99.999 percent uptime, it's important to consider if your data center is the best place from which to provide them. A specialized application service provider may be able to provide this sort of reliability at a lower cost for a fixed, per-user fee, with compensation if they fall below this service level. It certainly doesn't make sense to provide more redundancy than you need: That's simply pouring money down the drain.
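For context, the gap between those two targets is straightforward to quantify — this is standard availability arithmetic rather than a figure from the article:

    # Permitted downtime per year at different availability targets.
    HOURS_PER_YEAR = 24 * 365

    for availability in (0.99999, 0.999):
        downtime_hours = (1 - availability) * HOURS_PER_YEAR
        print(f"{availability:.3%}: about {downtime_hours * 60:.0f} minutes "
              f"({downtime_hours:.1f} hours) of downtime per year")

Five nines allows roughly five minutes of downtime a year; 99.9 percent allows nearly nine hours — a very different (and much cheaper) engineering problem.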
Also consider whether your data center is operating longer hours than necessary. Thanks to the power of remote management tools, you may find it makes more sense financially to leave it unmanned at certain times, while having a number of staff "on call" to sort out problems remotely, should the need arise.

Finally, it's worth mentioning best-practice IT management frameworks like the IT Infrastructure Library (ITIL) and Microsoft Operations Framework (MOF). Aligning operations to these frameworks is a medium- to long-term project, but they are intended to ensure that all IT services, including those associated with the data center, are delivered as efficiently as possible.

If you can achieve that, you are a long way down the path to ensuring your data center can endure any slowdown the economy can throw at it — not just this time, but the next time, and the time after that.

Virtualization's Success Hampers Server Sales
By Andy Patrizio

IDC revised its forecast in terms of both server dollar and unit sales in the coming years. It attributes the downshift to the increasing popularity of virtualization and more powerful servers. In both cases, one server can accomplish what previously took several.

IDC reported unit sales slid in 2006, while dollar sales grew, an indication that fewer but more powerful machines are being sold. Instead of a 61 percent increase in server shipments by 2010, IDC now expects server sales will grow by 39 percent.

In projecting this trend out a few years, the research firm had to revise its server sales projections downward. Between now and 2010, IDC sees the x86-based server market dollars shrinking by 9 percent, from $36 billion to $33 billion, and actual unit sales declining 18 percent, from 10.5 million servers to 8.7 million servers.

This is due to what Michelle Bailey, research vice president for IDC's Enterprise Platforms and Datacenter Trends division, called a "perfect storm" of virtualization and multi-core processors.

"On its own, multi-core wouldn't have been that interesting," she told internetnews.com. "It probably would have been just another speed bump. It's the addition of virtualization that lets you take advantage of multi-core much more quickly."

Virtualization lets you run multiple single-threaded apps and get the benefits of multi-core technology without having to rewrite applications to be multi-threaded. So a single machine with a dozen or more virtual environments can run the applications in a way a single-core system cannot.

"It allows you to fully exploit an unutilized processor. Virtualization is what we think of as the killer app for multi-core. It lets customers take advantage of multi-core early without having to re-architect for it," said Bailey.

IDC estimates that the number of virtual servers will rise to more than 1.7 million physical servers by 2010, resulting in 7.9 million logical servers. Virtualized servers will represent 14.6 percent of all physical servers in 2010 compared to just 4.5 percent in 2005.

This means customers are growing more confident in the uptime reliability of x86-based hardware. While they haven't approached mainframes in reliability, x86 systems are a lot better than in previous years, and come with better configuration and management tools.

A virtualized server going down could have far greater impact than a single application server going down, but Bailey said IT is not as concerned about that. "I would say customer perception around putting too many eggs in one basket has changed. A virtual environment is no less available than a single environment," she said.

However, there won't be a great spillover benefit when it comes to power and cooling issues, a growing headache for IT. While Bailey sees the potential for server consolidation, she expects that virtualization will more likely extend the lifespan of a server, thus keeping more machines deployed, so there won't be a thinning of the herd. Worldwide, power and cooling cost IS organizations $30 billion in 2006, and that will hit $45 billion by 2010.

Greening Your Data Center — You May Have No Choice
By Paul Rubens
The writing has been on the wall for some time.

Electricity use in data centers is skyrocketing, sending corporate energy bills through the roof, creating environmental concerns and generating negative publicity for large corporations.

Because IT budgets are limited and because governments in Europe and the United States may soon impose carbon taxes on wasteful data centers, something's got to give. Data centers are going to have to "go green."

It's not as if no one saw this coming. The aggregate electricity use for servers actually doubled between 2000 and 2005, both in the U.S. and around the world as a whole, according to research conducted by Jonathan Koomey, a consulting professor at Stanford University.

In the U.S. alone, servers in data centers accounted for 0.6 percent of total electricity usage in 2005. But that's only half the story. When you include the energy needed to get rid of the heat generated by these servers, that figure doubles, so these data centers are responsible for about 1.2 percent of total U.S. electricity consumption, equivalent to the output of about five 1,000MW power stations, and costing $2.7 billion — about the gross national product of an entire country like Zambia or Nepal.

Unless data centers go green, energy costs could soon spiral out of control, according to Rakesh Kumar, a vice president at Gartner. In a report titled "Why 'Going Green' Will Become Essential for Data Centers," he says that because space is limited, many organizations are deploying high-density systems that require considerably more power and cooling than last-generation hardware.

Add to that rising global energy prices, and the proportion of IT budgets spent on energy could easily rise from 5 percent to 15 percent in five years. The mooted introduction of carbon taxes would make this proportion even higher. "When people look at the amount of energy being consumed and model energy prices, and think about risk management and energy supply, they should begin to get worried," Kumar said.
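The "five power stations" comparison can be sanity-checked with rough numbers; the total-generation figure below is an assumption for the sketch (on the order of 4,000 TWh for the U.S. in 2005):

    # Rough check of the "about five 1,000 MW power stations" comparison.
    us_generation_twh = 4000                 # assumed total U.S. electricity, TWh/year
    datacenter_share = 0.012                 # 1.2 percent, per the Koomey figures above

    datacenter_twh = us_generation_twh * datacenter_share   # ~48 TWh/year
    plant_twh = 1000e6 * 24 * 365 / 1e12                     # 1,000 MW running flat out ≈ 8.8 TWh/year

    print(f"Data centers: ~{datacenter_twh:.0f} TWh/year, "
          f"or ~{datacenter_twh / plant_twh:.1f} plants' worth of output")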


It's Not Easy Being Green

Since most data centers historically have not been designed with the environment in mind, Kumar says more than 60 percent of the energy used for cooling purposes is actually wasted. This is bad for the environment and reflects poorly on the organizations concerned — especially if, as increasingly is the case, they have corporate social responsibility commitments. And as a growing number of companies are adopting a "carbon neutral" policy (either out of genuine concern for the environment or for the positive PR this can produce), pressure from head office to reduce the carbon footprint of the data center, to help reduce overall carbon emissions, will become more intense. "There's no doubt that in the short term this problem is a financial one, but behind that there is the need of organizations to be seen to be green," he said.

So what can be done to "green" the data center? "There is no one solution that will solve the problem — this is a collective issue and it will require a raft of solutions," Kumar said. "You need to start by getting some metrics to understand the problem, because it's not going to go away."

The ideal solution is to start from the ground up by designing and building a new data center with energy efficiency in mind. This includes looking at the thermal properties of the building being constructed, the layout of the building for maximum cooling efficiency, and even the site of the building: Locating a new data center far from urban areas means that it might be more feasible to incorporate renewable energy sources such as wind turbines or solar panels into the design, for example. For more specific guidance, organizations can turn to standards such as the U.S. Green Building Council's Leadership in Energy and Environmental Design certification. Vendor programs such as the Green Grid, an information network sponsored by AMD, IBM, HP, and Sun, may also be a useful source of information.

Assuming you're not quite ready to tear down your buildings and start again, there's still plenty you can do to reduce your electricity bill and reduce your carbon footprint. Perhaps the most effective action you can take is to reduce the number of servers in use at any one time. Each server you switch off can reduce your electricity bill by up to $500 per year (and reduce the amount of carbon dioxide released into the air annually by perhaps 2,000 pounds) directly, with about the same savings again realizable from reduced cooling requirements.

It may be that you have servers that don't need to be on at all hours of the day and night, but it's more likely that you can reduce the number of servers you need through virtualization. If you run corporate applications on separate servers, many may be only 10 to 20 percent utilized. Virtualization can dramatically cut the number of physical servers you need, while technology from companies such as VMware can ensure that your virtual machines can be switched to higher-capacity physical machines during peak times.

If you do retire some servers, it obviously makes sense to get rid of the older ones. This has the added benefit of increasing your overall server energy efficiency because newer multi-core chips can offer significant performance gains over older ones, whilst using almost 50 percent less power. Power management technologies such as Intel's Demand Based Switching can further reduce individual processor electricity consumption by up to 30 percent.

Another area where you can make significant power savings is the server power supplies themselves. That's because they can vary enormously in efficiency, especially under certain loads. Bad power supplies waste about half of the energy they consume (and thus the same again is used by cooling systems to dissipate the heat generated by this wasted energy). To compound this, power supplies running at a small fraction of their rated capacity are often even more inefficient. Look for power supplies with the 80 Plus certification — this means that the power supply will be at least 80 percent efficient even when running at just 20 percent of its full capacity.
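To see what that certification is worth, here is a small sketch comparing a poor supply with an 80 Plus unit; the server load figure is an assumption for illustration:

    # Wall-socket draw for a server whose components need 250 W of DC power (assumed load).
    dc_load_w = 250

    for label, efficiency in (("50%-efficient supply", 0.50), ("80 Plus supply", 0.80)):
        ac_draw_w = dc_load_w / efficiency    # power pulled from the wall
        wasted_w = ac_draw_w - dc_load_w      # lost as heat, which the cooling plant must then remove
        print(f"{label}: draws {ac_draw_w:.0f} W, wastes {wasted_w:.0f} W as heat")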

It's a Long Way to Tipperary

The answer to the question "how do you make your data center greener" is similar to the traditional question from the Emerald Isle: "How do you get to Tipperary?" The answer in both cases is "If I were you I wouldn't start from here." What this means is that while you can certainly make savings by switching power supplies and switching off unused machines, the real solution requires a total rethinking of the data center. This ranges from the design of the buildings and the cooling systems they contain, to the extensive use of virtualization to increase server utilization, all the way down to the use of energy-efficient equipment, from power supplies to smart, power-managed processors. It's not a cheap undertaking, but one that may prove vital for the survival of the data center, the corporation, and perhaps — in a small way — even the planet.


Hardware for Virtualization: Do’s and Don’ts


By Drew Robb

Virtualization is catching on like never before. Just about every server vendor is advocating it heavily, and IT departments worldwide are buying into the technology in ever-increasing numbers.

"The use of virtualization in the mainstream is now relatively commonplace, rather than just in development and test," said Clive Longbottom, an analyst at U.K.-based Quocirca. "In addition, business continuity based on long-distance virtualization is being seen more often."

As a result, the time has come to more closely align hardware purchasing with virtualization deployment. So what are some of the important do's and don'ts of buying servers and other hardware for a virtual data center infrastructure? What questions should IT managers ask before they make selection decisions on servers? And how should storage virtualization gear be integrated into the data center?

Do's and Don'ts

There are, of course, plenty of ways to virtualize, depending on the applications being addressed. This article will focus on a typical case where infrastructure and business logic applications are the main targets.

With that in mind, one obvious target is memory. It is a smart policy to buy larger servers that hold more memory to get the best return on investment. While single- and dual-processor systems can host multiple applications under normal circumstances, problems arise when two or more hit peak usage periods.

"Our field experience has shown that you can host more VMs [virtual machines] per processor and drive higher overall utilization on the server if there are more resources within the physical system," said Jay Bretzmann, worldwide marketing manager, System x at IBM. "VMware's code permits dynamic load balancing across the unused processor resources allocated to separate virtual machines."
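One rough way to act on the memory point is to size VM density per host from the RAM budget. All figures below are illustrative assumptions, not recommendations from IBM or the article:

    # Hypothetical memory-based sizing for a consolidation host.
    host_ram_gb = 64              # assumed physical RAM in the candidate server
    hypervisor_overhead_gb = 4    # assumed reservation for the hypervisor itself
    vm_ram_gb = 4                 # assumed average allocation per virtual machine
    peak_headroom = 0.25          # keep 25% free for the periods when several VMs spike together

    usable_gb = (host_ram_gb - hypervisor_overhead_gb) * (1 - peak_headroom)
    max_vms = int(usable_gb // vm_ram_gb)
    print(f"Roughly {max_vms} VMs per host under these assumptions")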


He advised buying servers with more reliability features, especially those that predict pending failures and send alerts to move the workloads before the system experiences a hard failure. Despite the added cost, organizations should bear in mind that such servers are the cornerstone of any virtualization solution. Therefore, they deserve the lion's share of investment.

"Businesses will lose significant productivity if the consolidation server fails," said Bretzmann. "A hard crash can lead to hours of downtime depending upon what failed."

Longbottom, however, made the point that an organization need not spend an arm and a leg for virtualization hardware — as long as it doesn't go too low end.

"Cost of items should be low — these items may need swapping in and out as time goes on," said Longbottom. "But don't just go for the cheapest kit around — make sure that you get what is needed."

This is best achieved by looking for highly dense systems. Think either stackable within a 19-inch rack or usable as a blade chassis system. By focusing on such systems, overall cooling and power budgets can be better contained. Remember, too, not every server is capable of being managed in a virtual environment. Therefore, all assets should be recognizable by standard systems management tools.

Just as there are things you must do, several key don'ts should be observed as well. One that is often violated is that servers should not be configured with lots of internal storage.

"Servers that load VMs from local storage don't have the ability to use technologies like VMotion to move workloads from one server to another," cautioned Bretzmann.

What about virtualizing everything? That's a no-no, too. Although many applications benefit from this technology, in some cases, it actually makes things worse. For example, database servers should not be virtualized for performance reasons.

Support is another important issue to consider.

"Find out if the adoption of virtualization will cause any application support problems," said Bretzmann. "Not all ISVs have tested their applications with VMware."

Storage Virtualization

Most of the provisos covered above also apply to purchasing gear for storage virtualization.

"Most of the same rules for classic physical environments still apply to virtual environments — it's really a question of providing a robust environment for the application and its data," said John Lallier, vice president of technology at FalconStor Software.

While virtual environments can shield users from hardware-specific dependencies, they can also introduce other issues. One concern when consolidating applications on a single virtualization server, for example, is that you may be over-consolidating to the detriment of performance and re-introducing a single point of failure. When one physical server fails, multiple virtual application servers are affected.

"Customers should look for systems that can provide the same level of data protection that they already enjoy in their physical environments," said Lallier.

He believes, therefore, that storage purchasers should opt for resilient and highly available gear that will keep vital services active no matter what hardware problems arise. In addition, Lallier suggests investing in several layers of protection for large distributed applications that may span multiple application servers. This should include disaster recovery (DR) technology so operations can quickly resume at remote sites. To keep costs down, he said, users should select DR solutions that do not require an enormous investment in bandwidth.

As a cost-cutting measure, Lallier advocates doubling up virtual environments. If the user is deploying a virtual environment to better manage application servers, for example, why not use the same virtualization environment to better manage the data protection servers? As an example, FalconStor has created virtual appliances for VMware Virtual Infrastructure that enable users to make use of its continuous data protection (CDP) or virtual tape library (VTL) systems, which can be installed and managed as easily as application servers in this environment.

Of course, every vendor has a different take.

NetApp provides an alternative to FalconStor using the snapshot technology available in its StoreVault S500. This storage array handles instant backups and restores without disrupting the established IT environment.

"Useful products are able to host VMs over multiple protocols, and the StoreVault can do it via NFS, iSCSI or FCP — whatever your environment needs," said Andrew Meyer, StoreVault Product Marketing Manager at NetApp.

"Don't get trapped into buying numerous products for each individual solution. One product that is flexible with multiple options (it can handle VMs, create a SAN, handle NAS needs, and provide snapshots and replication) may be a smarter investment as a piece of infrastructure."


Why Tape Libraries Still Matter


By Drew Robb

Tape libraries aren't exactly a booming business or front-page news these days, but at the same time, they're not faring all that badly in the face of the disk-based backup onslaught. According to Freeman Reports, total revenue from all tape libraries declined 15.6 percent in 2006 compared to 2005, while unit shipments declined 4.5 percent.

Despite those statistics, tape users purchased more than 50 percent more capacity as they migrated to higher-capacity and higher-performance tape drives and cartridges. Thus, what looks like a fading industry on the surface is very much alive and kicking.

Revenue still amounted to a healthy $1.81 billion in 2006 and was expected to be $1.77 billion in 2007. According to Freeman Reports, it will rise to $2.15 billion by 2012. Within those numbers, older formats like 8-millimeter and DLT library sales continue to falter, offset by increased sales of LTO and half-inch cartridge libraries.

LTO has evolved into the dominant player, accounting for 88 percent of library unit shipments and 58 percent of library revenue. LTO capacity and throughput grew by leaps and bounds during the past few years. LTO-2 offered 200 GB native and 30 to 35 MB/s, whereas LTO-3 provides 400 GB and 80 MB/s, and the new LTO-4 delivers 800 GB and 120 MB/s. LTO-4 is also the first open systems tape drive technology to incorporate native encryption.
of library revenue. LTO capacity and throughput grew


With the growing popularity of disk-based backup and recovery solutions and the continued consolidation of tape library resources, however, tape is increasingly taking on a more specialized role in data protection. In many cases, tape is being used for disaster recovery and centralized backup.

"Corporations must retain data for long periods of time and ensure compliance with internal service-level agreements and government regulations," said Mark Eastman, product line director, Tape Automation Systems for Quantum. "As a result, customers are demanding higher security, capacity, performance and reliability across their tape investments. Automation platforms incorporating the latest-generation LTO-4 technology deliver on these important features."
Quantum

On the vendor side, the top players are Sun Microsystems, IBM, and Quantum. Quantum gained serious ground in the enterprise tape library market with its acquisition of ADIC several years back.

At the high end of the scale, the Quantum Scalar i2000 has a starting price of $65,000. According to Eastman, the i2000 is designed to meet the rigors of high-duty-cycle data center operations and integration with disk-based backup solutions. It uses a standard 19-inch rack form and can hold 746 cartridges per square meter, as well as up to 192 LTO bulk loading slots in one library.

In the midrange, the Scalar i500 is priced beginning at $25,000. The entry Scalar 50 has a starting price of $8,000. One box contains 38 slots, and its Quantum StorageCare Vision data management and reporting tools enable users to monitor multiple tape libraries and disk systems from one screen.

"Backup and restore capabilities are just as critical in busy workgroups and remote environments as they are anywhere else," said Eastman. "The Scalar 50 tape library provides them with an easy-to-use, reliable and scalable solution that simplifies the backup process."

IBM Tape

According to IDC, IBM offers the leading enterprise tape drive in the TS1120. This tape drive comes with Encryption Key Manager for Java platform (EKM) to encrypt information being written to tape.

"EKM technology is used in high-end enterprise accounts by Fortune 100 companies in a variety of industries including banking, finance and securities," said Bruce Master, senior program manager, Worldwide Tape Storage Systems Marketing at IBM. "IBM's LTO tape offerings have achieved nearly 900,000 drive shipments and over 10 million cartridge shipments."

The company's highest-end tape library is the TS3500, which scales up to 6,800 slots and up to 192 LTO tape drives. Lower down the ladder comes the TS3310, which can deal with up to 398 slots and 18 LTO drives. The company offers various lower-end models such as the TS3100 with 24 slots.

IBM also offers tape virtualization products, such as the TS7520 and TS7700. The TS7700 Tape Virtualization Engine is for mainframes and can be configured to participate in a grid environment.

"Two or three TS7700s can communicate and replicate with each other over an IP network," said Master. "This arrangement helps reduce or eliminate bottlenecks in the tape environment, supports the re-reference of volumes without the physical delays typical to tape I/O, helps increase performance of tape processes, and helps to protect data and address business continuity objectives."

Sun StorageTek

Like the other big vendors, Sun provides encryption for tape systems. The StorageTek T10000 tape drive, for example, includes this feature and has a starting price of $37,000.

At the high end on the tape library side is the StorageTek SL8500, with a starting price of $195,830. It can house up to 56 petabytes (70,000 slots) and can be shared among mainframe, Solaris, AS/400, Windows, Linux and Unix systems.

Lower down the line is the StorageTek SL500 (starting at $16,400), an 8U rackmount tape automation model that scales from 30 to 575 LTO slots and can deal with multiple cartridge types, such as LTO and SDLT/DLT-S4. Its maximum capacity is around 460 terabytes (uncompressed).

"We are seeing strong adoption of the scalable libraries in the distributed and small business space, as evidenced by continued growth of the SL500," said Alex North, group manager for tape at Sun Microsystems. "The SL500 is particularly good for such applications as e-mail servers, database applications and file servers."

Encryption is another feature making its way into StorageTek tape technology. Sun's StorageTek T10000 tape drive is an example of a product that has built-in software to encrypt your data. The T10000 pricing begins at $37,000.

Green Tape

As for the future of tape, these vendors are committed to it and believe it will continue to play an important role. In fact, they argue, tape usage will accelerate as green data center trends strengthen.

"Tape storage TCO is as much as an order of magnitude less expensive than disk storage," said Master. "Its consumption of energy for power and cooling is anywhere from 20 to 100 times less expensive than disk storage."


Facilities Management Crosses Chasm to the Data Center
By Paul Rubens

It wasn't so long ago that the facilities management (FM) team stalked the corridors of office buildings with greasy blue coats and large bunches of keys. That image is now as out of date as carbon paper and typing pools: Today's facilities manager is more likely to be found in a white short-sleeved shirt behind a 21-inch flat-screen monitor, looking at CAD drawings and updating an asset database in a high-tech basement lair.

The role of the FM department has changed, too. If you are involved in planning and running a modern data center, it's a good idea to get facilities management involved. Today's FM departments have much to offer data centers and the administrators that manage them. Working with them helps facilitate a flexible data center that is green and energy-efficient. Together, they enable the data center to supply the desired IT services to the people who need them, at close to optimal cost.

First, let's clear up some basics: the facilities management department does not dictate what technology is used in the data center. That's an IT decision and nothing will change that. "Essentially, facility management is about power, cooling and fire protection, and also, where data centers are concerned, physical access controls," said Kevin Janus, vice president of the International Facility Management Association (IFMA) IT Council. "It is not involved in what servers you run, but it is concerned with the environment in which they will live."

A facility manager can help with a number of environmental factors, purely because he has a complete overview of a building and its current and planned future uses — something IT staff probably lack. "Obviously you don't want the IT department creating a data center when there are kitchens on the floor above because of the danger of leaks," Janus points out.

But the real issues are power and air conditioning. Air conditioning is the number one consumer of power. Servers, as anyone who has worked in a data center can testify, generate a great deal of heat.


The high-density racks that are becoming increasingly common in today's data centers consume vast amounts of power, and a similar amount of power is needed to dissipate this heat. That makes the planning and layout of the data center, and the provision of power and air conditioning equipment, crucial.

This falls clearly under the FM purview.

How can FM help? In an organization of any size, it's likely that the facility managers will have a computer-aided facility management (CAFM) package at their disposal. Among other things, a CAFM will usually store CAD floor plans of the building and a database of assets. For the data center, this will likely include plans showing the layouts of racks. In many cases, the database will hold the location of each server, the applications running on these servers, and information about the departments that "own" each application, where relevant.

Software tools can also carry out calculations to work out the amount of power that must be supplied in a given area of the data center, and the corresponding cooling capacity needed to remove the resulting heat. Information like this is clearly invaluable for the IT department because no matter what IT strategy is in place, the available power and cooling capacity presents constraints. The only way the IT department can be free to install and run the hardware it wants is if FM has already put in place the power and cooling it requires. And the only way for FM to know the IT requirements is for the two departments to communicate regularly.
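The core of that power-and-cooling calculation is simple to sketch; the rack count and per-rack load below are assumptions for illustration:

    # Rough power-to-cooling arithmetic for one zone of a data center.
    racks = 10
    kw_per_rack = 6                                  # assumed average IT load per rack

    it_load_kw = racks * kw_per_rack
    cooling_btu_per_hr = it_load_kw * 1000 * 3.412   # 1 W of heat ≈ 3.412 BTU/hr
    cooling_tons = cooling_btu_per_hr / 12000        # 1 ton of cooling = 12,000 BTU/hr

    print(f"IT load: {it_load_kw} kW")
    print(f"Cooling required: ~{cooling_btu_per_hr:,.0f} BTU/hr (~{cooling_tons:.0f} tons)")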
"The IT strategy may call for increased use of virtualization two or three years down the line, but they won't necessarily know what implication that has for the facility, especially in terms of A/C," said Chris Keller, a past president of the IFMA's IT Council. "But it's also important to look at how the strategy will impact on people and the office layout elsewhere in the building. If the IT department wants to replace printer stations with inexpensive printers on every desk, then more power and A/C is going to be needed throughout the building or it won't be possible."

When a data center space is initially populated, the FM department can help design the layout of the racks to maximize the efficiency of the cooling systems. Detailing current thinking on hot and cold aisles and other energy-efficient data center layout techniques is beyond the scope of this article. However, bear in mind that input from the FM department and the software tools at its disposal makes it possible to design a data center layout that will use significantly less energy and cost less to keep at an acceptable operating temperature than a badly laid out one.

What about making changes to existing data centers? "The contents of racks have to be managed, and if the A/C can't handle it then racks or individual servers have to be moved," said Keller. "Then the question is how do you know which servers you are moving and how do you keep track of where they are going? The FM department has, in a CAFM database, the place to store that information, and can offer it to the IT department. There's no point in the IT department doing it all again when the information already exists. From the CEO's point of view, redundancy is not the way to go," he said.

The message from the basement, then, is very clear. By involving the FM team in the planning and layout of your data center, it can provide the tools and resources to ensure the data center will be practical to run, and as green and energy-efficient as possible. By keeping lines of communication open between the two organizations, the data center will be flexible enough to accommodate the changes that you have planned, so you can deliver the services you want in the way you want, without worrying about where you are going to put the boxes, whether you are going to run out of power, or if the servers might melt when a new system is deployed.

This content was adapted from Internet.com's InternetNews, ServerWatch, and bITa Planet Web sites. Contributors: Paul Rubens, Drew Robb, Judy Mottl, and Jennifer Zaino.
