
Using IT initiatives to save money and reinforce green credentials
An IT platform, in many cases, is still a wasteful environment. Applying certain simple, cost-effective and non-invasive software can result in big financial paybacks and markedly improve green credentials.
April 2016

Much of the content of an organisation's website is relatively static. Even where content does change, much of it will still be accessed repeatedly by users between changes.
Data caching allows websites to load this relatively static data (text, images, videos) into stores that are closer to the user, and that can utilise faster volatile or non-volatile storage systems to improve performance.
As well as providing a much better end-user experience, this off-loading of retrieving and presenting information to users also has other impacts.
As long as the right caching software is chosen, the number of back-end servers can be reduced. This
then leads to considerable cost savings, with lower licensing costs, fewer system administrators, lower
power use and less datacentre space. For organisations that want to save money while being able
to demonstrate improved green credentials for their corporate social responsibility (CSR) strategies,
using the right type of web data caching software can be a game changer.

quocirca

Clarifying the business impact of technology

Report authors
Clive Longbottom

Tel: +44 118 948 3360


Email: Clive.Longbottom@Quocirca.com

Bob Tarzey

Tel: +44 1753 855 794


Email: Bob.Tarzey@Quocirca.com

Sponsor

The web server dilemma: architect for peak or average load?

Back in the dim, dark past of IT, when the business was actively dissuaded from knowing anything about the technology it used, the common approach was to run one application workload on top of one server, with dedicated storage. Those server and storage resources had to be architected to deal with the peak requirements of the given workloads, otherwise when those peaks happened, the total system would grind to a halt.
For early websites, that meant dedicated hardware and software was purchased, provisioned and operated for a highly variable workload. This workload could have massive peaks during certain hours of the day, or during certain periods of the year. The platform had to be architected from the beginning to cope with such peaks, or run the risk of failing to perform adequately just when the system was required most.

By extension, this over-provisioning had a big cost. Every one of these underutilised servers needed to be procured. They needed to be deployed, and then provisioned with an operating system and the rest of the application stack. Licences had to be paid for; maintenance agreements signed and paid for; systems management software, sysadmins and space in the data centre provided: all costs that the business had to meet.

Quocirca's research back in 2008¹ showed that this way of dealing with IT equipment led to average server utilisation rates of 10-20% and, for storage, around 30%. If server clustering was used to provide higher levels of availability, that average utilisation could drop to as low as 2.5%. To look at it a different way, the organisation was paying for 97.5% of a system that was, for long periods of time, doing no useful work. Sure, it was needed for that hour or so a day, or for the two or three weeks a year when activity spiked, but otherwise it was just sitting idle.

The lack of truth in PUE

However, the resource issue is not the only downside. A data centre's efficiency can be measured by its power usage effectiveness (PUE). This is the ratio of the facility's overall energy use to the energy used directly by the IT equipment within it. Whereas the direct energy only covers that used by servers, storage and network equipment, the total energy includes cooling, lighting, auxiliary power losses and so on. Most privately owned data centres will be running at a PUE of between 1.5 and 2.5, meaning that for every Watt (W) of energy being used by the IT equipment, somewhere between 0.5 and 1.5 Watts are being used by the rest of the facility.

Power usage effectiveness (PUE)
PUE is an over-simplistic means of calculating the overall efficiency of a data centre. It is calculated by taking the total power used by the data centre and dividing it by the amount of power used by the IT equipment itself. As it does not take into account what useful work the IT equipment itself is doing, a data centre where no compute workloads are being run can still have a better PUE than a data centre where massive amounts of actual compute are being carried out.
It is therefore incumbent on an organisation to ensure that its PUE figure is backed by an IT platform that has been suitably optimised as well.
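As a rough sketch, the PUE calculation and its blind spot can be expressed in a few lines of Python. The figures are illustrative, not measurements, and the function name is our own shorthand for the ratio described above.

```python
# A minimal sketch of the PUE calculation, illustrating why PUE says
# nothing about useful work. Figures are illustrative only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (always >= 1.0)."""
    return total_facility_kw / it_equipment_kw

# Two hypothetical data centres drawing identical power:
busy_dc = pue(total_facility_kw=140.0, it_equipment_kw=100.0)  # servers fully loaded
idle_dc = pue(total_facility_kw=140.0, it_equipment_kw=100.0)  # servers mostly idle

# PUE is identical (1.4) even though one site does almost no useful work.
assert busy_dc == idle_dc == 1.4
```

Note that utilisation never appears in the formula: that is precisely the metric's limitation.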

As an example, consider an IT department that states that it is doing OK: it has calculated that the data centre's PUE is 1.4, so for every 100W used by the IT equipment, only 40W is being used in peripheral systems. Sounds good; sounds like something that can be put on the organisation's website as a nice, big, green tick.
However, this figure has been calculated from systems that could be 97.5% underutilised for indefinite periods of time. So, for every 100W of energy fed to the IT equipment, only 2.5W is being used for useful work. That 100W is, based on the PUE, actually 140W of total power. That industry-leading data centre with a PUE of 1.4 is wasting 137.5W of every 140W pumped into it.
This is expensive not only financially, but at a sustainability level as well. For organisations that want to make sure that their corporate social responsibility (CSR) statement is not just for show, green strategies should be demonstrable with actual results.

Wasting resources through data movement
Much of the activity on storage and servers is in dealing with the movement of data that does not change very often. A web page that is commonly accessed by visitors to an organisation's website will always, without caching, have to be retrieved from the main server system. All of this takes up resources and, in the majority of cases, this resource is wasted.

Virtualisation is not the answer

Sure, virtualisation of servers within the data centre has managed to push utilisation rates up, although further Quocirca research² only showed a move to the mid-teens to low-twenties percentages. Private cloud computing holds the promise to drive utilisation rates far higher, with the elasticity of shared resources meaning that architecting for peak usage on a workload-by-workload basis can be managed against a far more optimised IT estate.
However, such hardware and platform advances are expensive in themselves. They require complex upfront planning and a fundamental change to the way the IT platform is operated. Virtualisation and cloud should not be seen as the only ways to optimise an IT estate.

¹ http://quocirca.com/content/data-centre-asset-planning
² http://quocirca.com/content/data-centre-asset-planning
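The arithmetic behind the 1.4 PUE example can be sketched as follows, using the illustrative figures from the text rather than real measurements:

```python
# A sketch of the example above: combining PUE with server utilisation
# to estimate how much of the delivered power does useful work.

it_power_w = 100.0      # power drawn by the IT equipment
pue = 1.4               # data centre PUE
utilisation = 0.025     # 2.5% average server utilisation (clustered estate)

total_power_w = it_power_w * pue            # 140 W drawn by the whole facility
useful_power_w = it_power_w * utilisation   # 2.5 W doing useful work
wasted_power_w = total_power_w - useful_power_w

print(f"Total: {total_power_w:.1f} W, useful: {useful_power_w:.1f} W, "
      f"wasted: {wasted_power_w:.1f} W")
# Total: 140.0 W, useful: 2.5 W, wasted: 137.5 W
```

The point of the sketch is that PUE and utilisation multiply: a good PUE cannot compensate for an idle estate.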

Solving the website dilemma

Intelligent data caching can significantly minimise the inefficiency of retrieving the same data repeatedly from the original server.

Using intelligent data caching
By using intelligent data caching, this wasteful activity can be minimised. The website can still have a single physical master copy, but assets such as web pages can be cached in volatile or non-volatile storage systems closer to the main groups of users after just one person has accessed that web page.

Caching works by moving a single copy of a web page to volatile or non-volatile storage systems closer to the main groups of users after just one person has accessed that web page. Indeed, in some cases, the assets can be actively pushed through into caches, either as soon as they are available, or to replace older versions that are already in the cache. Therefore, when another user requests the page, it is retrieved immediately, using far fewer resources (and delivering content more efficiently).
This may sound like a minor issue. To the end user, however, the perceived performance can be improved by up to 1,000%. For the organisation, the server load can be reduced by up to 80%. Indeed, using Varnish Plus from Varnish Software, a vendor of web architecture performance software, several organisations have found that they can reduce their server estates considerably.
Examples of some of Varnish's successes include:
An e-commerce customer that went from 46 servers down to 5, just through implementing Varnish.
A customer within the mobile advertising space that went from 12 front-end servers to 2 front-end and 2 back-end servers.
One of the largest online media companies in Scandinavia, which was able to run its high-volume site on a single server (down from 12) after implementing Varnish.
These examples may appear to be purely technical with few business repercussions, but there are business and bottom-line-oriented benefits, such as:
Reduced hardware needs and a drop in licensing, maintenance, energy and human resource costs. Existing hardware can be used elsewhere, can be turned off, can be stored for future use or can be securely cleansed and sold off on the second-hand market.
Improved risk management. With a far smaller server estate to manage, more effort can be put into high availability, possibly through using some of the freed-up hardware to provide an N+M redundancy approach (where N is the number of active items, such as servers, and M is the number of available failover items). Meanwhile, customers are happier due to the better performance, and are more likely to stay loyal.
Increased flexibility and agility. The business can be more fleet of foot in what it offers to customers: knowing that new offers will be served far more rapidly to customers means that more new ideas can be tried out.
Improved green credentials and CSR-related benefits. Fewer servers using less cooling and backup/auxiliary power systems means less energy and a reduced carbon footprint. All of this comes with the minimum of systems change within the IT platform: no re-writing of applications, no fork-lift upgrades of the platform, no changes to how customers access the websites.

Cutting the cost of providing performance

Providing a fast, responsive website to users should not be an arduous and expensive task. Rather than focusing on how best to build a hardware platform, or how to tune that platform and the website software to better perform, looking at intelligent and effective caching will deliver greater dividends to the business in more ways than one.
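The basic caching mechanism described above can be sketched in a few lines of Python. The `fetch_from_origin` function and the in-memory dictionary are illustrative stand-ins for the back-end servers and the cache store, not Varnish's actual implementation.

```python
# A minimal sketch of the caching pattern: the first request for a page
# goes to the origin server; later requests are served from the cache.

ORIGIN_FETCHES = 0  # counts expensive trips to the back-end servers

def fetch_from_origin(path: str) -> str:
    """Simulates retrieving a page from the main server system."""
    global ORIGIN_FETCHES
    ORIGIN_FETCHES += 1
    return f"<html>content of {path}</html>"

cache: dict[str, str] = {}

def get(path: str) -> str:
    if path not in cache:                 # cache miss: one trip to origin
        cache[path] = fetch_from_origin(path)
    return cache[path]                    # cache hit: served locally

# Five users request the same page; the origin is only hit once.
for _ in range(5):
    get("/products/cameras")
assert ORIGIN_FETCHES == 1
```

This is why back-end server counts fall so sharply: after the first request, repeat traffic never reaches the origin at all.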

User Case Study


Nikon
Business problem:
Nikon's Image Business Unit (IBU) has had to respond to meet market needs. The world of photography has changed many times since Nikon started trading in 1917, and digital photography has created a market where pricing against quality has become a major area of focus.
Nikon has invested heavily in e-commerce initiatives, including a highly rich, content-driven customer experience via its website. Much of this content is, by its very nature, high-definition photography: Nikon has to be able to deliver this speedily and effectively to its prospects and customers, no matter how and when they are accessing the site.
This content is driven by an upstream content management system (CMS). An asset in this CMS may be referenced by many different entries across the website, so Nikon needed to make sure that if an asset changed, then all versions of that asset changed immediately. The CMS had a rudimentary means of flushing the cache: this required a rebuild of the cache, which slowed down the system until the cache was rebuilt. However, Nikon not only had to be cognisant of the capabilities of end users' technology in accessing the website, but also of the cost to Nikon itself. It would not be cost-effective to architect a hardware platform that managed to meet users' expectations natively.
Thought process:
Nikon knew that it needed a solution that was non-invasive and required no special technology at the user's end. It also knew that it had to find a way to present the rich experience to the user, as fast as possible, without the need for costly hardware or proprietary software.
In conjunction with Kanban Solutions, Nikon first looked to use the open-source Varnish Cache system, and noticed an immediate and considerable improvement in performance. However, Nikon was still left with cache issues. The basic Varnish Cache could have Time To Live (TTL) triggers set, but this was still an essentially time-based, rather than event-based, approach.
Solution chosen:
Nikon decided to move to the commercially supported Varnish Plus from Varnish Software. In this way, Nikon gained the granularity it desired. Now, as the CMS drives new digital assets into the website, the cache immediately invalidates the existing asset, without impacting all the other assets in the cache.
Business benefits:
Nikon now has a highly performant website, dealing with spikes of thousands of requests per second. Response times have more than halved: high-resolution assets already in the cache are routinely retrieved and served in 500ms, whereas it used to take 1,300ms. Where such assets were not cached, it used to take around 7 seconds; with Varnish's caching functionality, this came down to less than a second.
Through the use of Varnish Plus, Nikon has brought its server requirements down from 7 to 4 Linux servers, enabling IT budget to be re-focused on customer-facing initiatives.
The new system has led to an improvement of over 60% in organic traffic to the Nikon site, a figure which is still expected to grow.
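The distinction that drove Nikon's choice, time-based expiry versus event-driven invalidation, can be sketched as below. The `Cache` class and its `purge` method are hypothetical illustrations of the two approaches, not Varnish's actual interface.

```python
# A sketch contrasting TTL-based expiry (stale data lingers until the
# clock runs out) with event-driven purging (the asset is dropped the
# moment the CMS publishes a new version). Illustrative only.

import time

class Cache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.store: dict[str, tuple[str, float]] = {}  # key -> (value, stored_at)

    def get(self, key: str):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if time.monotonic() - stored_at > self.ttl:  # TTL expiry: only goes
            del self.store[key]                      # stale after the timeout
            return None
        return value

    def put(self, key: str, value: str):
        self.store[key] = (value, time.monotonic())

    def purge(self, key: str):
        """Event-driven invalidation: drop the asset as soon as it changes."""
        self.store.pop(key, None)

cache = Cache(ttl_seconds=300)
cache.put("/assets/photo.jpg", "v1")
cache.purge("/assets/photo.jpg")                # CMS publishes a new version
assert cache.get("/assets/photo.jpg") is None   # stale copy gone immediately
```

With TTL alone, the old asset would have been served for up to five more minutes; the purge removes it at the moment of change, which is the granularity the case study describes.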


Conclusions

The costs of creating a hardware-focused high-performance website are too high for most organisations to contemplate. Hardware dependencies lead to over-provisioning of systems, a lack of guaranteed performance and to brand reputational issues when customers find website response to be too slow.
Using the right type of data caching, websites can be optimised for consistent, fast performance without the need for any rewriting of existing code. Server estates can also be optimised, lowering the number of servers in use and gaining cost benefits from lower energy use, licensing, administration and other costs.
For an IT department seeking ways of better meeting a business's needs, website data caching is a low-cost, high-impact means of supporting the business in how it deals with its customers.
For a business that is looking for ways of ensuring consistent, high levels of performance to its customers, data caching has to be the preferred means of managing this.
With any change within an organisation impacting three main variables, web data caching can be looked at in this way:
Cost - web data caching removes the need for over-provisioning of capital-cost hardware, and minimises all the associated costs around owning, maintaining and running those servers.
Risk - web data caching ensures that performance is consistent and that customers will retrieve and see information in an acceptable timeframe. Therefore, the risks of losing customers are minimised, and brand reputation can be more easily maintained.
Value - through the use of advanced web data caching systems, a business can be far more flexible in the offers it makes to its customers, and can change content on its sites more often. This additional agility and flexibility opens the door for the business to be back in charge, with IT being the facilitator to its needs, rather than a constraint.


About Varnish Software


Varnish Software delivers the world's most widely used caching engine based on open source. Using this flexible caching technology, our products, Varnish Plus and Varnish API Engine, help companies like the New York Times, Vimeo, Tesco, Nikon, CacheFly and Fujitsu deliver web and application content fast, reliably and at massive scale to provide exceptional customer experiences. Supported by an innovative developer community, our open-source project, Varnish Cache, continues to flourish and makes 2.2 million websites worldwide run faster.
Varnish Software has offices in London, New York, Los Angeles, Stockholm, Oslo and Paris.
Follow Varnish Software on Twitter: https://twitter.com/varnishsoftware
www.varnish-software.com
London +44 20 7060 9955
New York +1 646 586 2052


About Quocirca
Quocirca is a primary research and analysis company specialising in the business impact of information technology and communications (ITC). With world-wide, native-language reach, Quocirca provides in-depth insights into the views of buyers and influencers in large, mid-sized and small organisations. Its analyst team is made up of real-world practitioners with first-hand experience of ITC delivery who continuously research and track the industry and its real usage in the markets.
Through researching perceptions, Quocirca uncovers the real hurdles to technology adoption: the personal and political aspects of an organisation's environment and the pressures of the need for demonstrable business value in any implementation. This capability to uncover and report back on the end-user perceptions in the market enables Quocirca to provide advice on the realities of technology adoption, not the promises.
Quocirca research is always pragmatic, business-orientated and conducted in the context of the bigger picture. ITC has the ability to transform businesses and the processes that drive them, but often fails to do so. Quocirca's mission is to help organisations improve their success rate in process enablement through better levels of understanding and the adoption of the correct technologies at the correct time.
Quocirca works with global and local providers of ITC products and services to help them deliver on the promise that ITC holds for business. Quocirca's clients include Oracle, IBM, CA, O2, T-Mobile, HP, Xerox, Ricoh and Symantec, along with other large and medium-sized vendors, service providers and more specialist firms.
Details of Quocirca's work and the services it offers can be found at www.quocirca.com
Disclaimer:
This report has been written independently by Quocirca Ltd. During the preparation of this report, Quocirca may have used a number of sources for the information and views provided.
Although Quocirca has attempted wherever possible to validate the information received from each vendor, Quocirca cannot be held responsible for any errors in information received in
this manner.
Although Quocirca has taken what steps it can to ensure that the information provided in this report is true and reflects real market conditions, Quocirca cannot take any responsibility for
the ultimate reliability of the details presented. Therefore, Quocirca expressly disclaims all warranties and claims as to the validity of the data presented here, including any and all consequential losses incurred by any organisation or individual taking any action based on such data and advice.
All brand and product names are recognised and acknowledged as trademarks or service marks of their respective holders.
