Make your sites load faster
CLIMB THE RANKINGS: Give your sites a performance boost today!

The future of SEO
ESSENTIAL! The dos and don'ts for better Google rankings in 2014 and beyond
Revised & updated for 2014

Google's guide to Analytics
INSIDER TIPS: Google reveals how to best use Analytics to measure your business
Future Publishing Limited 2014. All rights reserved. No part of this magazine may be
used or reproduced without the written permission of the publisher. Future Publishing
Limited (company number 2008885) is registered in England and Wales. The registered
office of Future Publishing Limited is at Beauford Court, 30 Monmouth Street, Bath
BA1 2BW. All information contained in this magazine is for information only and is,
as far as we are aware, correct at the time of going to press. Future cannot accept any
responsibility for errors or inaccuracies in such information. Readers are advised to
contact manufacturers and retailers directly with regard to the price of products/
services referred to in this magazine. If you submit unsolicited material to us, you
automatically grant Future a licence to publish your submission in whole or in part in all
editions of the magazine, including licensed editions worldwide and in any physical or
digital format throughout the world. Any material you submit is sent at your risk and,
although every care is taken, neither Future nor its employees, agents or subcontractors
shall be liable for loss or damage.
Welcome
Contents
Localising content
Blacklisted tricks
Expert tutorials
Search
Analytics
Marketing
Semantic search
Optimising web pages
Inconsistent data?
Blogging
15 post-Penguin tips
Backlinking top tips
Testing times
Using infographics
Remarketing
Google AdWords
Inbound marketing: Why the term isn't relevant
Social
Conversational SEO
Website migration: Move sites with renewed ease
Getting started

30 best SEO tools
The data Sometimes you want to be able to see all of the data. There are plenty of tools available to help you on a daily basis
Trusted Proxies Suitable for most in-house SME SEO teams and agencies, Trusted Proxies can be configured to run on a server
SEO Spider Screaming Frog's SEO Spider has become the 'go to' site crawler
SEO Analyzer Bing's tool is great for quickly identifying page issues
Ahrefs A new tool to the scene, this link data monitoring tool is fast and has a powerful API
Python Pyscape Pyscape (http://netm.ag/pyscape-bz92) solves the problem of getting data from the Mozscape API in bulk
Fresh Web Explorer The new darling of the real-time mentions monitoring scene, you can compare mentions of your favourite terms found on the internet up to four weeks ago
Social Crawlytics A great site-crawl-based competitive social analytics tool that provides page-by-page social metrics and it's free
1. Satisfaction, guaranteed
Let's start with the bedrock of search marketing: there is really no such thing as
guaranteed rankings when it comes to organic, or natural, search results. Any
company or specialist proffering such should be treated warily; ask whether
they're referring specifically to organic search terms or paid search. While it is
possible to speculate on long-tail niche keyword searches, for all but the most
niche key terms, results will vary and can take weeks, if not months. A good
search marketer will set realistic expectations, using SEO to prime all areas of
your website and content, rather than offer empty promises.
There are one or two hundred factors that influence your ranking with the
search engines; no company or individual can control all of these. SEO might be
best achieved with great skill, but there are myriad external factors, dependent
on the success of your products or services, not to mention a slice of luck,
involved in determining whether or not you achieve good visibility on the
internet mantelpiece.
Search Engine Land's (http://searchengineland.com) editor-in-chief, Matt
McGee, says: "The only way to even possibly come close to guaranteeing
rankings is if you're doing it on the paid side and happen to have a term that
you're willing to bid high enough on and to get high enough clickthrough to
sustain top spot. Also, personalisation comes into play: what you see might be
different to what I see, so there's absolutely no way to guarantee a number one
ranking on Google."
SEO land Search Engine Land is a new hub for everything related to SEO
Having a high PageRank is nice, but it doesn't automatically mean high rankings
for everything, and it certainly doesn't automatically mean you're going to get
tons of traffic and sales.
McGee adds: "It's still often seen as the number one factor in Google's
algorithm when it's actually one of a couple of hundred factors. It's a very visible
symbol for a lot of webmasters and business owners, but the more time you
spend in the search world, the sooner you realise it's not the be-all and end-all."
3. Endorsed by Google
Put simply, if you're dealing with a firm who make any allusion that they're
endorsed or approved by Google for optimisation purposes, it's likely
they're a fraud. The reality is that Google does not endorse any SEO company.
They do have Analytics and AdWords certification, so providers in these areas
can take tests for accreditation. "Google definitely does not put their stamp of
approval on any individual consultant or company," affirms Matt McGee.
"Personally, I'm not opposed to the idea of some accreditation or regulatory
standards, given this very subject matter and the unregulated nature of the search
world, but I just can't see it happening any time soon." Google's Webmaster
Guidelines (https://support.google.com/webmasters/answer/35769) and
its beginner's guide to SEO (http://netm.ag/starter-bz92), as well as various
Beginner's guide: Google's SEO starter guide (http://netm.ag/starter-bz92) covers around a dozen common areas that webmasters could consider optimising
of black-hat SEO is likely to do you much more harm than good as the search
engines get better and better at sniffing out sites, from dubious redirects to
affiliate link farms, that simply don't deserve to be there. The basic adage is: if
it works for the user, it's likely to have a place on Google; how far up you climb
is dependent on a myriad of factors, and those sites that cheat aren't just risking
their credibility, but usually reek of over-optimisation, which in some cases can
be a by-product of a site that was never really designed to please its audience
first and foremost. Being gung-ho in your quest for high rankings at the expense
of your content is nearly always a futile process.
6. Keywords? Cram 'em in
The notion that every page needs a certain percentage of keywords to
outrank the competition is a fallacy. Says Matt McGee: "I've always said you do
have to use the keywords; you need to have pages that talk about the products
and services you sell. There's no perfect number: it's not that if you mention the
keyword seven times on this page I'm automatically going to rank well. It
doesn't work that way: there are so many other factors, and a page that gets a
lot of inbound links with the right anchor text can rank for terms that don't
even appear on the page. The notion that there's a perfect percentage for
keywords simply isn't true."
Furthermore, your copy should be persuasive, informative and punchy: you'll
only serve to limit your copy's punch by simply cramming keywords into the text.
Be verbose, create opportunities to talk about your company, products and
niche verticals, but never be repetitive.
David Mihm adds: "It's a myth to say 'I will optimise your website's keyword
density'. It is important to include keywords on your pages but there is no
magic number of times to use a keyword. Write your text for humans!"
AdWords myth: Don't expect Google to boost rankings just because you bought AdWords
Check listings: Moz's Getlisted.org (https://getlisted.org) checks your listings on Google, Bing, and other local search engines
8. Land here
Every page on your site should be treated as a potential landing page; you can't
assume a visitor is going to land on your homepage or your products overview
page. The idea that you have one special search landing page is not helpful. All
pages are possible landing pages.
a successful company wouldn't settle with just one single marketing investment.
If you think you've achieved all your SEO, I'll bet you're not making the most
of your website and your offline marketing activities. There's always more that
can be done, and even if your rankings don't immediately benefit, your site
will. Even with limited resources, adding or improving a single page every
month is better than leaving a static site to flounder, which may, in time, be
superseded by your competition and afforded less currency by your users and
engines alike.
Five search marketing trends
Kelvin Newman, director at SiteVisibility, shares his top
five trends for search marketing
It's been an interesting couple of years for the digital marketing
industry: Facebook continued to rise despite a shaky initial public
offering (IPO) and a fightback from Microsoft's Bing. Let's not forget
the interesting new platform, Google+, which celebrated its first birthday
in June 2012. While Google's social network may be struggling to catch the
public's attention, it promises to be very influential in the future of search.
The continued impact of Google's Penguin and Panda updates has
re-shaped the search and SEO industry. Brands of all shapes and sizes have had
to learn how to adapt to more white-hat tactics to prevent being penalised
by Google's algorithm updates, which target webspam.
Pleasingly for most honest SEOs, the decrease in ranking for some sites has
actually opened up opportunities for those who have played by the rules in
the past.
So, with Penguins and Pandas aside, here's what we think will shape the
industry throughout the year and beyond:
1. Structured data
Google and other search engines are pulling more structured data into search
result pages. Therefore, it will be vital for digital marketers to mark up data in
search-friendly ways, such as using schema.org, microformats, or microdata.
The advantage of structured data is that it allows users to refine their searches
using concepts rather than just individual keywords or phrases.
Apple's Siri Voice recognition technology provides new ways to discover content online
businesses should have one and I expect this to be part of most digital
marketing strategies in the rest of 2013 and beyond.
20 best Drupal modules for SEO
Mark Roden, editor of WebCommune, assembles
a comprehensive SEO toolkit for the Drupal CMS
Drupal is the best CMS for search engine optimization (SEO). The
community has contributed a ton of modules to ensure webmasters
are adhering to best practices and are equipped for the future. In
fact, the wealth of CMS tools provides users with the ability to control all
elements of their campaign.
Due to the impending final release of Drupal 8 at the time of writing,
the following list of modules is mainly for Drupal 7. (Drupal typically only
supports two versions, so it's a safe bet to focus on the middle ground of 7 in
the meantime.) With the proper combination of modules, Drupal morphs into a
flexible platform equipped for the ever-changing world of SEO.
2. SEO Checklist
A vital download. This module (https://drupal.org/project/seo_checklist) ensures you've
dotted the i's and crossed the t's in your quest to be the best. The module
doesn't actually enforce any function, but does provide a comprehensive
checklist for on-site SEO, marking off each item as you complete it. This can prove to
be highly valuable for those who aren't so familiar with the logistics of SEO.
3. Page Title
This module (https://drupal.org/project/page_title) provides the ability to
control individual nodes, setting content in the <title> tag. It's one of the single
most important elements in a successful SEO campaign, and a vital module.
4. Path Redirect
With Path Redirect (https://drupal.org/project/path_redirect), you can redirect
URLs from one path to another. Using this module is important for
maintaining the integrity of your site, and preventing search engine crawls from
resulting in error. Additionally, links existing on external sites are preserved, and
wont result in a 404.
5. Global Redirect
Global Redirect (https://drupal.org/project/globalredirect) fixes duplicate URL
bugs that tend to occur with clean URLs and the Pathauto module. Although
aliases for URLs may appear different, they are in fact duplicate. With the Global
Redirect module, a 301 redirect is created from the old node to the alias.
6. Metatag
The concept of meta tags is still a source of comedy. A great deal of weight used
to be put on them. Hysteria ensued. Their importance has decreased, though use
of the module can't hurt for providing structured metadata. In reference to SEO,
meta tags (https://drupal.org/project/metatag) include the meta description tag
and the meta keywords tag that search engines often pick up on.
7. Search 404
To ensure "not all who wander are lost", Search 404 performs a search on the
keywords within a non-existent URL. The module does a great job of keeping
visitors on your site using search engine keyword detection and expression-based
filtering from the URL. Search 404 (https://drupal.org/project/search404)
is recommended to keep bounce rates down and user engagement up.
8. XML Sitemap
XML Sitemap (https://drupal.org/project/xmlsitemap) generates a dynamic
sitemap built for search engines, adhering to the specification of sitemaps.org.
It enables you to configure content types, taxonomy terms and more for
intelligent crawls from search engines.
9. Site Verification
This module (https://drupal.org/project/site_verify) helps with the verification
process of your website. Supported methods of verification include adding
meta tags or uploading a file. Once set up, a bi-directional check is initiated,
and search engines can then verify that you are the site owner.
SEO Checklist Great for beginner SEO, this provides a handy checklist for on-site SEO
Paths Use Pathauto and Sub-Pathauto to create new alias and keyword-rich paths for URLs
15. Footermap
Search engines reward websites designed for the best user experience.
Footermap (https://drupal.org/project/footermap) generates a site map block in
the footer region to improve navigation. Use links sparingly and efficiently.
16. Pathauto
The Pathauto module (https://drupal.org/project/pathauto) is a staple of Drupal
SEO, enabling keyword-rich paths for URLs. It ensures search engines and site
visitors can gather information on content through URLs.
17. Sub-Pathauto
Automatically creating a new alias for URLs based on, or extending, an existing
alias, Sub-Pathauto (https://drupal.org/project/subpathauto) allows for further
generation of user-friendly URLs.
Localising content for better SEO
Michelle Craw shows you how to improve rankings
and increase traffic by localising online content
For any business operating in or selling to multiple international
territories, the localisation of your online content is a crucial tool
in your attempts to increase traffic, improve rankings and increase
conversion rates.
Google, which is still by far the most commonly used search engine in the
western world, is constantly refining the algorithms that underpin its searches in
order to provide high-quality, relevant, useful results for its users. Several recent
Google updates have focused on providing web developers with the opportunity
to provide localised versions of the same website in different territories.
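Google's documented mechanism for declaring these localised versions is the rel="alternate" hreflang annotation, which tells the engine which URL serves which language or region. A minimal sketch, using invented example URLs for a site with UK, US and German editions:

```html
<!-- Placed in the <head> of every localised version; each page lists
     all alternates, including itself (URLs here are hypothetical) -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/" />
<link rel="alternate" hreflang="de" href="http://www.example.com/de/" />
<!-- Fallback for visitors whose language/region isn't listed above -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```

The annotations must be reciprocal: every alternate page lists the same set, otherwise the engine may ignore them.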
Page views The data shows one third of UK page views were from mobiles and tablets
your site in a way that search engines can better use it. The type of rich markup
we always recommend, and which is recommended by Google, is that found at
schema.org (http://schema.org), also known as microdata. Marking up your content with
these tags provides users with additional information and makes your content
look more interesting on the results page.
Another example is how a few results for a popular recipe will show
additional information in the search results, such as star ratings, reviews, cooking
time and calorie information, which helps attract the attention of users.
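To make the recipe example concrete, here is a sketch of such markup using schema.org microdata; the dish, rating and times are invented for illustration:

```html
<!-- itemscope/itemtype declare a schema.org Recipe; itemprop labels
     the properties that engines can surface as rich snippets -->
<div itemscope itemtype="http://schema.org/Recipe">
  <h1 itemprop="name">Lemon Drizzle Cake</h1>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 from
    <span itemprop="reviewCount">28</span> reviews
  </div>
  <time itemprop="cookTime" datetime="PT45M">45 mins</time>
  <div itemprop="nutrition" itemscope
       itemtype="http://schema.org/NutritionInformation">
    <span itemprop="calories">320 calories</span> per slice
  </div>
</div>
```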
Rich snippets
Enhanced search results listings, known among SEO practitioners as rich
snippets, can capture users' attention and encourage them to click through
from the search engine results page.
There are multiple different types of rich markup, and it's becoming an
increasingly important feature of SEO. Essentially, it's a way to structure code on
Bad medicine
1) Keyword stuffing is bad medicine. Proper keyword use is not the concern of
this article, so for now we'll focus only on the improper kind. Keyword overuse
leads to synonym underuse, and makes for content that's inaccessible to the
average human user. Though people might not be able to read your content,
search engine robots still will. Oversaturated pages will get you penalised.
The average safe density of keywords should be between two and eight
per cent of your total word count. When creating copy you should think of
your audience, not of your page ranking.
2) Hidden text is invisible to human eyes. Keywords or links can be
camouflaged by colour-matching text to background, leaving them unreadable
to human visitors but perfectly readable to search engine bots.
More complex methods employ cascading style sheets (CSS) or layering to
hide text beneath surface copy. Such text is also readable to a search engine
spider and not a human user. Black hat operatives attempt to fill their sites
with hidden content for the express purpose of achieving higher rankings in
search lists, regardless of whether their pages are relevant to a user's initial
search request. Google law basically states that you should build your website
for users, not for search engines. Ignoring this advice by including hidden text
or links is one of the quickest ways to get your site blacklist-bound.
3) Doorway/gateway/bridge/portal pages are created specifically for search
engine bots. They are designed to target particular keywords or phrases and
will usually be extremely user-unfriendly and/or difficult to read. Because they
are simple devices used to trick people towards actual websites, they rarely
contain anything useful (other than any prominent CLICK HERE links through
to the real destinations). Black hat webmasters create portal or bridge pages
that bypass the need to click on a link completely, using fast meta refresh
commands that whisk you to another site (without so much as a by-your-leave).
For this reason, many search engines now refuse to accept pages that
use fast meta refresh.
4) Cloaking can be achieved either through IP address delivery or agent
delivery. As with people, bots are identified by their user agent or their IP
addresses. Two sets of content are created: one delivered to the Googlebot,
the other to human visitors. The bot is deceived by the fake pages (the content
of which is usually saturated with targeted keywords) and grants the website
a higher relevancy ranking. When the user clicks on what they perceive to be
a promising link, they're promptly forwarded to a browser page that's nothing
to do with their original search.
5) Mirror websites are two or more separate sites that use identical content,
but employ different keywords and meta tags to obtain rankings on
search engine rankings. The process is costly as well as time consuming and, if
Google finds out, can lead to you getting your entire network dropped from
the index (including the site that you're optimising).
6) Link farms, specifically free-for-all link farms (FFAs), are to be avoided like
the plague. When Google inevitably identifies an FFA link farm as a 'bad
neighbourhood', it will also infect any linked pages and eventually deflate
their values.
Link scheme participants obtain links from farm sites in exchange for fees
or backlinks, but in either case it's almost certainly an unsound investment.
8) Backlink generation is a good thing. However, generating backlinks too
quickly is a bad thing. A new website that suddenly surfaces with an inordinate
number of backlinks looks suspicious, and spamming will be suspected by
Google. Therefore, you should build backlinks at a natural pace to avoid
incurring penalties.
When attaching backlinks to blog or forum posts, you should always keep
your content relevant and attempt to bring something to the conversation.
If you don't do this, you will be recognised as the spammer you are and
rightfully punished.
9) Scraper sites are the online equivalent of a chop-shop. They are spam
websites that steal existing content using web scraping, often to generate
advertising revenue or to manipulate search engine ranks. Web scraping
works in a similar way to web indexing, which is the process employed by
most search engines to create ranking lists.
Unscrupulous black hat webmasters use scraping to gather content, before
repackaging it for their own purposes. Using someone else's content (even
open content) can constitute copyright violation if you fail to adequately
reference it.
10) Phishing pages are, according to Google, pages "designed to look like
another page in an attempt to steal users' personal information". The reasons
why phishing will get you blacklisted should be obvious, so don't even think
about doing it.
Page title
Page titles are an extremely important part of your page, because they are
very frequently relayed in search engine results and carry weight in search
engine ranking algorithms. It's important to keep the title as concise and
contextually rich as possible: 65 characters is a good rule of thumb.
A frequent mistake is to incorrectly format the title by placing the name of
the website at the start of the title tag.
It's highly recommended to place the website name at the end of the title:
in a search engine result, the name of the website is generally of little interest
to the people searching, and you may sacrifice a lot of your clickthrough rate
for that page.
Bad form(at): An example of a badly formatted page title. As the website name is presented first, it can create less accessible search results
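Applied to a hypothetical product page, the contrast looks like this:

```html
<!-- Poorly formatted: brand first, the page topic pushed out of view -->
<title>Example Widgets Ltd | Shop | Blue Widgets</title>

<!-- Recommended: topic first, brand last, within roughly 65 characters -->
<title>Blue Widgets for Sale | Example Widgets Ltd</title>
```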
This is not only awful from a usability point of view, but also from an SEO
point of view. One highly important part of a link's anatomy is the text within
the link: this provides a very strong clue to search crawlers about what the
page being linked to is about. If you've ever heard of Google Bombing, this is
the underlying reason why it works.
NoScript NoScript is a popular extension for Mozilla-based browsers that only allows JavaScript to be executed on websites the user chooses. When designing an accessible and SEO
Meta tags
It's common knowledge now that the meta keywords tag should be
considered redundant for SEO purposes. Not only is it a waste of markup, it
also gives your competitors strong clues about the terms you are targeting!
There are, however, other very useful meta tags which should be utilised
on your website.
1. Description meta tag: The description meta tag should be a concise
overview of the page. It is often displayed in search engine results, so not
only is it important to design it as concisely and as descriptively as possible,
but also think about how appealing it is for a potential visitor to click on.
Descriptions shouldn't extend more than around 160 characters in length.
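As a sketch, a description for a hypothetical widget page might read as follows (the copy is invented, and kept under the ~160-character guideline):

```html
<meta name="description" content="Hand-built blue widgets with free UK delivery. Browse the full range, compare sizes and order online today." />
```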
2. Canonical meta tag: The canonical meta tag is an important one that is often
overlooked by web developers. To understand why we need the canonical
meta tag, we have to understand that search engines can treat pages with
slight variations in their URLs as separate and distinct pages. As an example,
take these two URLs:

http://www.example.com/shop/widget.html
http://www.example.com/shop/widget.html?visitID=123

They could be treated as distinct URLs even though they display exactly the
same content. This could impact on your site negatively, because you ideally
want the search engines to only index the first URL and ignore the second.
The canonical meta tag solves this issue:

<link href="http://www.example.com/shop/widget.html" rel="canonical" />

Placing the canonical meta tag on the widget.html page lets crawlers
know your preferred version of the page.
Sitemaps
Sitemaps should be kept up to date and contain every URL you want to be
indexed. You might not realise that some pages on your site are buried so deeply
in your website, and are so hard to access, that a search crawler may not explore
that deeply. By listing every page on your site in a sitemap you've made your site
far more accessible to the search crawlers, and you can be sure that the search
engines will know about all your content.
Sitemaps have evolved and are now commonly written in XML. The XML
schema for sitemaps comes with a few options, such as the last modification
date, how frequently the page is changed, and its relative priority.
If you are not completely confident in your usage of the more advanced
attributes, such as the change frequency and priority, it's best to ignore them.
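A minimal sitemap following the sitemaps.org protocol might look like this; the URLs and dates are invented, and the optional attributes can simply be left out if you are unsure of them:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <!-- Only <loc> is required; the optional tags can be omitted -->
    <loc>http://www.example.com/shop/widget.html</loc>
  </url>
</urlset>
```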
Sitemaps Sitemaps should be kept up to date and contain every URL you want to be indexed. Sitemaps.org provides guidelines on the protocol
Keyword density
Reading up on SEO, you have probably come across terms such as "keyword
density", referring to the percentage of words in a particular body of text that
are relevant to the search terms you are interested in. The theory is that if you
hit a specific density of keywords in a body of text you will be ranked higher in
search results.
Keyword density is often presented as an oversimplification of a numerical
statistic called tf*idf. tf*idf reflects the importance of a word in a body of text,
or collection of bodies of text, in a far more accurate way than rudimentary
keyword density measurements. Its mathematical description probably isn't
the end of the story, though: it's likely search engines have modified this statistic
and weighted it differently in different cases to improve the quality of returned results.
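In its textbook form (a sketch only; the engines' actual weightings are unpublished), tf*idf multiplies how often a term appears in a document by the log of how rare it is across the collection:

```latex
% tf-idf of term t in document d, over a collection D of N documents
\mathrm{tf\text{-}idf}(t, d, D) = \mathrm{tf}(t, d) \times \log \frac{N}{\lvert \{ d' \in D : t \in d' \} \rvert}
```

A term that is frequent on one page but rare elsewhere scores highly; a term that appears on every page scores near zero no matter how often it is repeated, which is one reason raw keyword density is a poor proxy.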
What conclusion should we draw from this as a new startup? You should
probably ignore it all. When you're writing content such as a new blog post,
you need to remind yourself of your objectives: you're trying to write content
that people will want to read. Text tuned to specific keyword densities has a
potentially large downside, which is that the text becomes increasingly obscure.
A well-written body of text will likely attract more good-quality links and social
shares, which in turn will increase the value of your website in search engines'
eyes. Don't worry about keyword densities. Instead, worry about the quality of
your writing.
Features
We reveal the dos and don'ts for better Google rankings
Author
Pete Wailes is the CTO at Builtvisible (builtvisible.com), an international creative search marketing agency, and developer of the CSS library OpenDAWS.

Illustration
Linzie Hunter is a Scottish illustrator and hand-lettering artist based in Peckham, South London. Her clients include Time Magazine, The Guardian, Hallmark, VH1, Nike, BBC, Orange and Marks & Spencer. www.linziehunter.co.uk
Redesign with SEO in mind
During 2013, we decided to refresh the SEOgadget brand (now Builtvisible).
In doing this, we've followed the processes outlined here:
Researching existing examples of agency sites
to understand what a good agency website
looks like today
Prototyping first, enabling stakeholders to
better understand the concepts
Taking content concepts to external figures to
understand the impact and gain feedback
Crafting new and legacy content, including
video and presentations
Creating a complete demo site, functionally
accurate, but without all the final assets in
place to allow for final revisions
This process has saved weeks of time and
refinement as, at every stage, something that
truly represents the end product is being built
around and showcased. It's also allowed for a
far more flexible creative process: if we've
required revisions, we've been able to make
them in the browser and interact with those
changes live.
It's also allowed us to adapt the design
rapidly and sign off amends as the final content
has been produced, where that content has
necessitated changes that hadn't been foreseen
previously. The speed of change testing and
revision development has therefore been roughly
halved, versus the previous PSD-to-HTML-to-final-version
method of development.
A Better Way
Further, there are campaigns like Imaginate by
Red Bull (imaginate.redbull.com) that couldn't exist
anywhere other than digital, which reach millions
through really creative storytelling, combined with
the inherent shareability that digital content can
have. These show a wonderful understanding of
the way that the consumer mindset works in 2014.
However, while these have all won multiple prizes
and serious awards, as well as huge traffic and
mindshare for their clients/publications, each has
areas where they fall short.
A regular check-up is
required to make sure
that content continues
to perform
These issues range from failures of cross-browser
compatibility to a lack of specificity around the
message, failing to ensure the content is findable
from search engines, and so on. Each would have
been easily fixable. If you're creating something
that's tied to an event, acquire the domains around
the main campaign terms. For #showyourcolour,
that would've been showyourcolour.com and
understand those things become less understanding
of the language that creatives use to describe the
outputs they need, and how they work. Enter the SEO
industry, which is perfectly placed to act as a third
component to unify these two vital elements.
Above: Polygon's Xbox One review was timely, beautifully built, and nailed the targeting to a single group of passionate users: gamers
Further reading
Now that the piece is built, it can seem that the job is
finished. However, there's a key component missing
to all this at the moment, which we've seen time
and time again: no dedicated area for marketing
the content. Looking at two recent examples: firstly,
Every Shot Imaginable (www.youtube.com/user/
Everyshotimaginable) was launched with a YouTube
channel, but without a dedicated area on the
European Tour website for that content. As a result,
the site doesn't rank well for the names of its videos,
or of the campaign. Nor was the campaign name
particularly picked up on by the target market. If it
had set up a dedicated section on the site, talking
about the campaign, it would have had a far more
compelling place to drive traffic to. This would also
have likely produced better social engagement,
as they would have been able to tailor the copy
Resources
If you're looking to get started with modern SEO best practice, here are a few blogs and events to point you in the right direction:
Blogs
Conferences
MozCon moz.com/mozcon
SearchLove www.distilled.net/events
Future of Web Design futureofwebdesign.com/london-2014
Conclusion
In the SEO industry there exists an army of people
who are passionate about creating amazing
experiences for consumers, and who want to build
amazing content for their clients, pushing to create
great work.
We believe that SEO has a chance to really help not
just the agencies involved, but the consumers and
brands too in enabling discovery and re-discovery of
the great content produced.
The search optimisation industry may not be perfect but, at its best, it's helping to develop better websites and create more engaging content for clients of all sizes.
It's not quite the industry we want yet, but we can see it from here.
Get to the
top of Google!
SEO is a shapeshifter: its current, grown-up incarnation is audience-driven, engine- and user-friendly.
Bryson Meunier has the details
Changing trends Google Trends data shows that the number of searches for 'web design' has declined over time, to the point where it is now eclipsed by searches for 'SEO'
SEO mission control Google Webmaster Tools has many valuable reports on the crawling and indexing of your content, as well as who links to you and which queries your site appears for
Challenge:
l Images were hidden behind JavaScript and not indexed.
l Images were hosted on another domain (a
common CDN) making it impossible to create
image sitemaps.
l Title tags were branded, making it difficult for
the engines to understand what keywords the
site was relevant for.
l Most common search phrases were not used
in content.
l As a new site, there was a lack of authority.
Strategy:
l Rewrote the URLs so that they would appear to the engines to be hosted on the client's site, to get more content indexed.
l Made images more accessible by adding
noscript tags to the download page.
l Used sitemaps and image sitemaps to ensure
that the engines were aware of our content
and the structure of our site.
Results
On the rise SEO can increase your traffic significantly if you do it correctly. In the case study shown above, organic
traffic grew by more than 4,000 per cent in the first six months
Way forward Detailed, integrated SEO plans can be crucial
Big hitters Google offers more than 570 videos for webmasters, which, in four years, have had over 10 million views
Company Search Engine Land
Role Founding editor
Web www.searchengineland.com
On target Vanessa Fox of Nine by Blue (www.ninebyblue.com) created this searcher persona in order to connect audience goals with relevant content from the business
Readymade people Resolution Media's ClearTarget Behavioral Analysis takes keyword research to another level by harnessing the power of big data and automation to create actionable searcher personas
Live example
For the example, this is what should be in the <head> section of the URL (this format should be included for all pages on the site). The content title and description should carry through to the Open Graph tags:
<html prefix="og: http://ogp.me/ns#">
<head profile="http://www.w3.org/1999/xhtml/vocab">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<link rel="shortcut icon" href="http://www.brysonmeunier.com/sites/all/themes/spirezen/favicon.ico" type="image/vnd.microsoft.icon" />
<meta content="SEO Mentioned Again on The Good Wife" about="/seo-mentioned-again-on-the-good-wife/" property="dc:title" />
<link rel="shortlink" href="/node/398" />
<link rel="canonical" href="/seo-mentioned-again-on-the-good-wife/" />
<title>SEO Mentioned Again on The Good Wife | BrysonMeunier.com</title>
<meta name="description" content="CBS Television show &quot;The Good Wife&quot; mentions SEO again in a fictional trial. Read more." />
<meta property="og:title" content="SEO Mentioned Again on The Good Wife" />
<meta property="og:description" content="CBS Television show &quot;The Good Wife&quot; mentions SEO again in a fictional trial. Read more." />
<meta property="og:type" content="article" />
<meta property="og:url" content="http://www.brysonmeunier.com/seo-mentioned-again-on-the-good-wife/" />
<meta property="og:image" content="http://www.brysonmeunier.com/wp-content/uploads/2012/10/seo-from-the-good-wife.jpg" />
<style type="text/css" media="all">@import url("http://www.brysonmeunier.com/modules/system/system.base.css?mbovli");
A change in analysis
Delivering on this promise frequently requires a
new type of analysis. In the past, marketers have
done keyword research to uncover keywords as
proxies for user intent. In Marketing in the Age
of Google, Vanessa Fox describes the process
of creating searcher personas that get beyond
simple keyword matching and search volume
exercises. And still others, such as iCrossing's Core Audience (www.coreaudience.com) and Resolution Media's ClearTarget (http://netm.ag/cleartarget-238), try to understand characteristics of audiences, including but not limited to the keywords that they use.
For some businesses, mobility will not change
user intent. For example, news is not going
to be rewritten for a separate platform, as
Up next Open Graph tags and other structured data help power search innovations like last year's Knowledge Graph
To the point Google's Matt Cutts is active in the webmaster community, answering key questions in videos such as 'How important is it to have keywords in a domain name?'
Primary source Matt Cutts is one of the most popular personalities in SEO. As Google's head of webspam, his blog has been required reading for SEOs since 2005
Resources
Link building means earning
hard links, not easy links
Danny Sullivan explains the rationale behind his
rant from SMX Advanced 2012, going into detail
about the concept of link earning, as opposed
to link building for the sake of acquiring links en
masse (http://netm.ag/sullivan-236).
Google Webmasters YouTube channel
Since 2009, Matt Cutts, Maile Ohye and other
members of the Google Webmaster team
have been posting videos aimed at helping
webmasters increase the visibility of their
sites. Among the 500-plus videos uploaded
Reduce your
bounce rate
How do you keep visitors on your site longer once they've clicked through from a search result? David Deutsch gives the lowdown
Sources of traffic
Reducing your website's bounce rate is a great first step in improving its overall performance and conversion rate.
First you need to analyse the bounce rates of
the different traffic sources. Focus your efforts
Testing
How do you know what's working and what isn't in terms of keeping visitors on your site? Answer: you don't. So assume nothing, test everything.
We're privileged to work in an industry where we can test every idea we have without spending any money. Thanks to Google's Content Experiments (in Analytics), we can create, test and monitor the performance of landing pages. The purpose of testing your landing pages is to reduce bounce rates and increase conversion rates.
Setting up an A/B split test in Google Content Experiments is easy and free. Here's a basic overview of the steps involved.
l Create a campaign to test landing pages against each other.
l Upload the URL of the original page that you're testing.
l Upload the URLs of the other landing pages you want to test against the original.
l Copy the JavaScript code provided by Google and paste it into the relevant pages you're trying to test. (This will require access to the HTML code of the pages.)
l Send at least 500 visitors to the primary URL and Google will separate those visitors randomly for you to test the performance of the pages in a non-biased way. You can do this by using AdWords, email campaigns or even affiliates to send traffic from specific keywords through the funnel.
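Content Experiments performs the random split between pages for you; conceptually, the mechanism is something like this simplified sketch (not Google's actual code — the variation URLs are made up for illustration):

```javascript
// Pick one landing page uniformly at random from the original plus
// the alternatives. Passing `rand` in makes the choice testable.
function chooseVariation(variations, rand = Math.random) {
  return variations[Math.floor(rand() * variations.length)];
}

// e.g. chooseVariation(['/landing-a.html', '/landing-b.html'])
```

The real service also remembers which variation a visitor saw, so they get a consistent experience on repeat visits.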
Golden rules
Stats amazing The bounce rates for Real Gap's PPC pages are much lower than for the organic pages
Googles
guide to
Analytics
With the emergence of new web technologies, make sure you're making best use of Google Analytics to measure your business. Google's Justin Cutroni presents his pro's guide
Views Use the Dimension selection links to view your event categories, actions and labels
Comparing price Rasmussen College was able to see price interest by school of study
Dimensions The example above shows the Countries dimension and standard metrics for
each value of the dimension
Events In Content > Events reports, view event data based on Category, Action or Label
Custom dimensions Create custom dimensions in the Google Analytics Admin section
Cross-device measurement
The next feature in Universal Analytics is the
ability to measure how your customers connect
with your business across devices. The feature
will be able to measure a user as they navigate
across all of these devices. In order to do this, you
as a business must be able to provide some type
of primary key.
For example, if you're an airline and have a frequent flyer ID for each of your users, you can specify that Google Analytics use this as the unique identifier on each platform. (This feature will be released soon.)
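With analytics.js, the primary key is supplied through the userId field on the create command. A minimal sketch (the property ID 'UA-XXXXXX-Y' and frequent-flyer number are placeholders; the command-queue stub from the standard snippet is reproduced so the example is self-contained):

```javascript
// Stub ga() as a command queue until the analytics.js library loads,
// exactly as Google's standard snippet does.
var ga = ga || function () {
  (ga.q = ga.q || []).push(arguments);
};

// Tie every hit to the business's own primary key via userId:
ga('create', 'UA-XXXXXX-Y', 'auto', { userId: 'FF-1234567' });
ga('send', 'pageview');
```

Once the same userId is sent from web, mobile web and app, Analytics can stitch those sessions into a single cross-device user.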
Finally, in addition to collecting data from any
Custom dimensions Once you create the Custom Dimension, you must then add the necessary code to collect the data
l Goals: Goals (http://netm.ag/goals-244), or conversion tracking, is an absolutely critical feature. Make sure you define goals for your business.
l Implementing ecommerce: If you sell a product online then you should be using the ecommerce module (http://netm.ag/ecomm-244) to measure your transactions.
l Tracking campaigns: Like goals, campaign tracking (http://netm.ag/track-244) is a critical feature.
l Mobile app tracking: Everything you ever wanted to know about the Android SDK (http://netm.ag/android-244) and the iOS SDK (http://netm.ag/ios-244).
l Universal Analytics: Learn more about the next generation of Google Analytics, Universal Analytics (http://netm.ag/uni244).
l Measurement Protocol: The foundation of Universal Analytics, the Measurement Protocol (http://netm.ag/proto-244), lets you collect data from any network-connected device.
GAServiceManager.getInstance().
setDispatchPeriod(60);
In addition to standard data, you can also measure information that's specific to the app world, like app crashes and exceptions. This information is particularly useful as you look to improve app performance and the user experience. It only takes a single line of code to collect this information. In Android, the code would look something like this:
<bool name="ga_reportUncaughtExceptions">true</bool>
And, in iOS, expect the code to look something
like this:
[GAI sharedInstance].trackUncaughtExceptions =
YES;
Wrapping up
Hopefully, we've been able to broaden your perspective of what's possible with Google Analytics. It's important to remember that, in order to be actionable, data needs to relate directly to business strategies and tactics. Using the full breadth of features that are available in Google Analytics, you'll be able to better align your data with your business.
Optimise your
[Infographic labels: Simplify pages · HTTP compression · Minify JavaScript · Resize images · Convert events]
Start render
What it means: As its name suggests, start render indicates when content begins to display in the user's browser. However, it doesn't indicate whether the first content to populate the browser is useful or important, or is simply ads and widgets. This term seems to have evolved as an alternative to end-user response time, but it's not yet widely used outside of hardcore performance circles.
When it's useful: When measuring large batches of pages or the performance of the same page over time, it's good to keep an eye on this number. Ideally, visitors should start seeing usable content within two seconds. If your start render times are higher than this, you need to take a closer look.
Load time
What it means: The time it takes for all page resources to render in the browser, from those you can see, such as text and images, to those you can't, such as third-party analytics scripts. Load time needs to be taken with a grain of salt, because it isn't an indicator of when a site begins to be interactive. A site with a load time of 10 seconds can be almost fully interactive in the first five seconds. That's because load time can be inflated by third-party scripts, such as analytics, which users can't even see.
When it's useful: Load time is handy when measuring and analysing large batches of websites, because it can give you a sense of larger performance trends.
Survey An average ecommerce site takes 8.9 seconds to load over an LTE network and 11.5 seconds over 3G
Intermediate optimisation
techniques
Once you've nailed the low-hanging fruit, this set of techniques should be next on your list.
3 Compress images
The problem: Images account for a full 60 per cent of the average web page's payload. In my travels,
After compression,
you can typically
expect a 10 to 40
per cent decrease
in file size
I regularly see sites that use unoptimised and unnecessarily bulky images. Combating this bulk is a huge step toward making pages faster.
The solution: I can't stress enough the importance of ensuring your images are saved in the optimal compressed format.
These formats are:
l Photos JPEG, PNG-24
l Low complexity (few colours) GIF, PNG-8
l Low complexity with transparency GIF, PNG-8
l High complexity with transparency PNG-24
l Line art SVG
But it's not enough to use the right format. Whatever tool was used to create the graphic won't necessarily save the file in the most efficient way, which is why you need to pass all images through a compression tool. (See the sidebar for a list of tools to consider.) After compression, you can typically expect to see a 10-40 per cent decrease in file size, without any noticeable sacrifice to image quality.
4 Minify JavaScript and CSS
The problem: A page's source code can contain a
Waterfall chart In a case study involving an un-optimised version of the Velocity conference homepage (http://velocityconf.com), keepalives and HTTP compression shaved more than five seconds from the page's load time
6 Optimise localStorage
The problem: Caching is an essential technique
for improving load times for repeat visitors, or
for visitors who view multiple pages during a
single visit, but desktop and mobile caches are
not created equal.
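The localStorage-as-cache idea can be sketched as a simple read-through cache. This is a minimal illustration, not a production implementation: the in-memory stand-in for the browser's localStorage exists only so the logic runs anywhere, and the key name is made up.

```javascript
// Use real localStorage in a browser; otherwise fall back to an
// in-memory map with the same getItem/setItem shape.
const store = (typeof window !== 'undefined' && window.localStorage)
  ? window.localStorage
  : (() => {
      const m = new Map();
      return {
        getItem: (k) => (m.has(k) ? m.get(k) : null),
        setItem: (k, v) => m.set(k, String(v))
      };
    })();

// Return the cached value if present; otherwise fetch it once and store it.
function cachedFetch(key, fetcher) {
  const hit = store.getItem(key);
  if (hit !== null) return hit;   // repeat visit: no network needed
  const value = fetcher();        // first visit: fetch and cache
  store.setItem(key, value);
  return value;
}
```

On a real site, `fetcher` would request the CSS or JavaScript over the network and the cached copy would be injected into the page on repeat views.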
Traditional browser caching doesn't work well for mobile devices. Mobile browser
Velocity homepage Intermediate performance techniques shaved an additional four seconds off the mobile load time
Advanced optimisation
techniques
After you have successfully implemented the core
and intermediate best practices mentioned
previously in this article, there are still a few
remaining things you can do to ensure that you're
squeezing every last drop of performance from
your pages.
Responsive site Global Islands Vulnerability Research Adaptation Policy and Development
(GIVRAPD) is a responsive WordPress website built with LESS & SMACSS
Revenue Less revenue is generated via mobile sites
11 Resize images
The problem: I'm filing this under advanced techniques because it's tricky to implement, especially for large, complex, dynamic sites, but it's a critical performance challenge. As I've already mentioned, images account for a huge portion of a typical page's payload, which is crippling for mobile, not to mention completely unnecessary for smaller screens.
The solution: Don't waste bandwidth by relying on the browser to scale a high-resolution image into a smaller width and height. Instead, it's best to dynamically resize images in your application, or even replace images with smaller versions for mobile sites.
Another option is to load a very low-resolution
version of an image initially to get the page up as
quickly as possible, and then replace that with a
higher-resolution version on the onload or ready
event, after the user has had a chance to begin
interacting with the page.
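The swap step of that low-resolution-first technique might look like the following sketch. The `data-src` attribute is an assumed convention, not part of any standard: pages ship with a tiny placeholder in `src` and the full image URL in `data-src`, then upgrade once the page is usable.

```javascript
// Replace each placeholder with its full-resolution version.
function upgradeImages(images) {
  images.forEach(function (img) {
    var full = img.getAttribute('data-src');
    if (full) img.src = full;   // swap placeholder for the real image
  });
}

// In a browser, run it once the page has loaded:
// window.addEventListener('load', function () {
//   upgradeImages(Array.from(document.querySelectorAll('img[data-src]')));
// });
```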
12 Simplify pages with HTML5 and CSS3
The problem: There's no real problem here, per se. This technique is a pure optimisation play.
The solution: The HTML5 specification includes
new structural elements, such as header, nav,
article, and footer. Using these semantic elements
yields a simpler and more efficiently parsed page
than using generic nested div and span tags. A
simpler page is smaller and loads faster, and a
simpler DOM means faster JavaScript execution.
The new tags are quickly being adopted in new
browser versions, including mobile browsers.
Similarly, new CSS3 features can help create lightweight pages by providing built-in support for things like gradients, rounded borders, shadows, animations, transitions and other graphical effects that previously required you to load images.
The short answer is: yes. The long answer is: yes, but perhaps not as much as you might think.
Content delivery networks (CDNs) have
emerged as an excellent tool for mitigating web
latency. In web performance circles, latency is
the amount of time it takes for the host server to
receive and process a request for a page object.
The amount of latency depends largely on how
far away the user is from the server.
To put this in real-world terms, say you visit a
web page and that page contains 100 resources.
Your browser has to make 100 individual
requests to the sites host server(s) in order to
retrieve those objects. Each of those requests
experiences at least 20-30ms of latency. (More
typically, latency is in the 75-140ms range.) This
adds up to two or three seconds, which is pretty
significant when you consider it as just one
factor that can slow your pages down.
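The arithmetic above, spelled out (assuming the worst case, in which the requests are effectively serialised):

```javascript
// 100 resources, each paying at least the latency floor cited above.
var resources = 100;
var latencyMs = 25;   // mid-range of the 20-30ms minimum
var totalSeconds = (resources * latencyMs) / 1000;
console.log(totalSeconds + ' seconds of pure latency');
```

In practice browsers parallelise requests, but latency still compounds, which is why moving content closer to the user with a CDN helps.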
When you also consider that a page can have
upwards of 300 or 400 objects, and that latency
Take away
No matter what evolutionary leaps we make in mobile technology, web pages are only going to grow bigger and more complex. To keep pace and maintain some semblance of control, we need to continue to innovate our practices for optimising directly at the page level.
Final reduction In the final reveal, implementing some advanced optimisation techniques brought the mobile load time for the Velocity conference's content-rich homepage down to a satisfactory 5.56 seconds
Master
mobile
navigation
Users want content. But, argues
Aaron Gustafson, you first need
to ensure they can locate it,
whatever their device type
Evolution
When the mobile web first became a reality, the few companies that felt the need to venture into this uncharted territory (mostly airlines and financial institutions) did so by creating completely separate mobile websites, often with unique content and navigation. Frequently, these sites amounted to a 'lite' version of the parent website, with content
Real deal Authentic Jobs (www.authenticjobs.com) hides nav unless a screen is 768px wide and supports media queries
Clarity is key
Whether designing for desktop or mobile, the
clarity of your navigation labels is key.
As Jared Spool pointed out in Stop Hiding Behind Products (http://webstandardssherpa.com/reviews/stop-hiding-behind-products), we should avoid using generic labels (such as 'products') in our navigation when we can use more specific terms such as 'snow melters' or 'DVD players'. Meaningful labels will help our users more quickly (and accurately) suss out what lies behind our navigation links. It also sets an expectation for what they will find when they click, while reducing the likelihood that they will need to bounce in and out of several sub-pages to find what they're looking for.
Similarly, it is important to realise when your
chosen label may be inappropriate for your
audience. For instance, avoid using internal
corporate jargon in your navigation (or copy
Navigation strategies
I'm going to take you on a whirlwind tour of mobile navigation strategies. Each has its pros and cons but, more importantly, each has its own set of dependencies. Most, as you'd expect, rely on media queries. Some have source order requirements. And a few rely on JavaScript, the absence of which can make for an awkward interface.
Hide it
Our first strategy takes its cue from the old mobile web camp and only enables users to accomplish key tasks (as identified by the UX team, upper management or user testing of an existing mobile site). Users are offered little (or no) navigation and can only access a subset of a website's features. In some cases, the decision to reduce or remove navigation is made to conserve real estate. Users who only experience such a website on a mobile device may never know they're missing features, but users who visit it on multiple platforms (which is an increasing trend) are likely to become frustrated when they can't see items they're used to accessing.
Trim it
If you are struggling to find a reasonable layout
for a large navigation menu comprised of several
tiers of navigation items, you might want to
consider reducing the number of links to only the
Shrink it
If your site's navigation is relatively succinct, it's possible you'll be able to get away with simply adjusting the layout and size of your navigation items. 'Succinct' is obviously a matter of opinion, so you should be cognisant of how much space your navigation occupies even when
Author Implementing
Responsive Design
(New Riders, 2012)
URL www.timkadlec.com
Baby steps Confab 2012 (www.confab2012.com) makes minor accommodations for small-screen users, but clusters its nav away at the top of the display
Rearrange it
For a number of years now, web standards
advocates, SEO consultants and accessibility experts
have been arguing in favour of putting the content first in terms of source order. After all, if you're using CSS, it's a breeze to move your navigation to the top of the page.
The benefit of this approach is that it provides
immediate access to the meat of your page for
your users and for search engine spiders alike.
Collapse it
One of the more popular mechanisms for managing
larger navigation lists on mobile is the drop-down
menu. This particular UI construct can be accomplished in a few different ways, each of which has its own set of dependencies. To choose the right one, you must first determine whether or not the menu should push the page content down when it expands.
Starbucks (www.starbucks.com) is probably
the most popular example of a drop-down menu
that pushes the content down the page rather
than sliding over it. In order to accomplish this,
the navigation list must appear at the top of the
document so that when it expands, it pushes all
subsequent content down the page.
/* Hide the nav by default */
#nav .nav_menu {
display: block;
Natural approach Contents (http://contentsmagazine.com) is well named: its content is placed first in source order
height: 0;
overflow: hidden;
}
/* Open it when JS adds the .open class */
#nav.open .nav_menu {
height: auto;
}
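The stylesheet above waits for script to add an .open class to #nav. A dependency-free toggle might look like this sketch (the element IDs are assumptions about the markup):

```javascript
// Toggle the menu open or closed and report its new state.
function toggleNav(nav) {
  nav.classList.toggle('open');
  return nav.classList.contains('open');
}

// In the browser, wire it to the menu button:
// document.getElementById('nav_toggle').addEventListener('click',
//   function () { toggleNav(document.getElementById('nav')); });
```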
body:not(:target) #nav {
/* these styles are only applied if :target and :not
are understood (and the body is not
targeted, of course) */
}
Convert it
Another popular mechanism for shrinking
navigation on mobile devices is to convert it into a
select element. The benefit of this approach is that
it drastically reduces the space required for
navigation, while maintaining a look and feel that is
familiar to the user. It also places no limitation on
the number or depth of navigation items. The
downside is that using this scheme requires either
Swap out Here a list nav is switched for a select element
Reveal it
The final mobile navigation paradigm showing
potential is the slide to reveal nav treatment
popularised by apps such as Path and Sparrow on
iOS, and Facebook on the web.
On Facebook's mobile site, the navigation is contained in a div classified as mSideArea and the main content is contained in a div identified as page. These two divs are contained within an outer div identified as viewport. This outer div is relatively positioned to create a positioning context for .mSideArea, which is absolutely positioned and given a width of 260px and a negative left offset of the same amount to move it out of view; #page is positioned relatively with no offsets.
#viewport,
#page {
position: relative;
}
.mSideArea {
width: 260px;
position: absolute;
left: -260px;
}
Facebook mobile The site uses a common menu icon, then slides the page to the right to reveal the navigation behind it
The reveal is accomplished by adding a class of sideShowing to the body element. The addition of this class triggers #page to receive a left offset of 260px (shifting the page content to the right) and sets .mSideArea's left offset back to 0, moving it into the 260px of empty space to the left of #page.
.sideShowing #page {
left: 260px;
}
.sideShowing .mSideArea {
left: 0;
}
This is a pretty clever piece of code, to be sure. And, should you want to make it a little slinkier, add a CSS3 transition:
#page,
.mSideArea {
/* Insert prefixed versions here */
Reading list
There are tons of smart people experimenting
with different mobile navigation schemes and
concepts. I've mentioned a few in this article, but I thought it would be good to point out some other gems I found in my research.
http://netm.ag/kenny-232
I'm intrigued by Tom Kenny's approach with this technique. I don't think it's quite ready for prime time, but it is an interesting idea. Keep an eye on this one.
http://netm.ag/johansson-232
Doing exactly what it says on the tin, in this
piece Roger Johansson explores an alternative
to converting your menu into a select using CSS
and a bit of JavaScript.
http://netm.ag/scharnagl-232
In this experiment, Michael Scharnagl explores
prioritised navigation schemes in responsive
navigation. I like this concept because it allows
http://netm.ag/gillenwater-232
In this bit of required reading, Zoe Gillenwater
runs through a ton of media query configuration
options, giving pros and cons for each approach.
Understand
your audience
Good research isn't just about finding out how many unique visitors you have. Rob Mills sets out techniques to help you get to know your users and make more informed decisions
Before I entered the design world I
was an audience research executive at
BBC Wales. This gave me a valuable
grounding in the importance of audiences,
research and having data to support any
decisions you make. It also taught me that there's a big difference between knowing your audience and understanding your audience.
Knowing is associated with top-level data such
as having 23,000 unique visitors a month to a site.
Understanding your audience is about finding out
as much as possible about the people behind those
numbers, including your users' social and cultural
situations and media consumption as well as their
likes, dislikes and needs.
Why understand?
Targeting content
Targeting is vital: the web is evolving quickly and
marketplaces are increasingly competitive. Greater
choice has resulted in fragmentation, so audiences
are now spread more thinly and you need to work
hard to get them and keep them. Users are
selective about where they spend their time online,
and this is a concern if they are consumers with
money to spend.
Knowing vs understanding
Top-level tricks of the trade such as Google
Analytics and the Jetpack WordPress plug-in
(http://netm.ag/jetpack-233) provide an overview
of your audience, but address knowing more than understanding. With Analytics you can find out your users' language, location,
Interpretation A handy grid featuring the key research methods, together with their main pros and cons
Establishing a framework
To dig deeper you need to choose the research methods that best offer the level of detail that you need. Then you need to derive meaning from the data to find the stories within. The first step is to establish your objectives. It might be best to categorise the audience information you aim to obtain, perhaps in the following way:
l Basics
l Lifestyle
l Media
Colour coded Understanding your audience means you can make informed decisions about design elements
Brand identity Use the data from your research about your audience to provide insight and help define your designs
Beat Google
link penalties
Whether it's a problem with Penguin or a manual links penalty, Tim Grice says there is a way back into Google's good graces
Admission of guilt
A lot of people believe adding links into the disavow tool is an admission of guilt and that submitting a file will cause further trust issues with your website. This simply isn't true. I've never seen a site react negatively to the submission of a disavow file.
Anchor text
Timescales
Typically, you will get a response after a
reconsideration within two weeks. If you are
successful and have a penalty revoked you
may have up to four weeks to wait before
any rankings come back.
A word of warning
If you haven't had an unnatural links message warning, you need to be very careful when using the disavow and reconsideration process. It's likely Google hasn't found your bad links and is still counting them. Disavowing and sending in a reconsideration request will trigger a full evaluation of your profile, and you may have added links to the disavow file that were still counting in your favour.
Negative signals
There seems to be a genuine fear of using this tool around the SEO community. However, if you've had the unnatural links message, you really shouldn't worry. I have yet to see even one negative consequence of using the tool to remedy an unnatural links message. Likewise, I have yet to see any negative results through the submission of multiple reconsideration requests. If you've had a manual penalty, you simply need to go through with this process. Don't worry either about another penalty hitting as a result of being transparent.
Reconsideration requests
Even though I would still recommend sending in a detailed reconsideration request, I'm 95 per cent sure Google aren't reading them or delving into any Google Docs sent. However, I would continue to write a good reconsideration request and send all data, just to show willing.
Disavowed You need to have a Google Webmaster Tools account set up to be able to use the disavow tool to disavow links
Start again?
Is it time to just give up and start again? I've yet to come across a hopeless case. We've had sites where we have had to remove over 5,000 linking domains and still managed to secure a positive result.
Reinclusion After filling re-inclusion requests, many sites see dramatic recoveries in traffic (credit: www.johnfdoherty.com)
Site explorer The search engine for links, this tool by SEOmoz allows you to perform
competitive website research and explore backlinks, anchor text (and more) for free
Link audit
Make sure you undergo a thorough link audit.
Combine Open Site Explorer (www.opensiteexplorer.org),
Majestic SEO (www.majesticseo.com) and
Webmaster Tools (www.google.com/webmasters).
Majestic A link intelligence tool for SEO and internet PR and marketing. Majestic SEO's Site
Explorer shows inbound link and site summary data, as well as your site's backlink history
Manual penalty
If you get a manual penalty response, happy days!
You will recover within 10 days.
Algorithmic issue
You may get one of two messages about
algorithmic issues. If your issue is algorithmic,
adding the suspect links into the disavow tool will
help you overcome it. You may be suffering with a
Panda penalty, which will need to be investigated.
Again, this message means there aren't any
manual actions and the issue is algorithmic. If it's
down to links, disavowing them will alleviate issues.
In summary
The best plan of action:
1 Carry out a link audit and classify links
2 Manually remove aggressive anchor text
3 Add spam links to a text file
4 Disavow spam links
5 File reconsideration
6 Await response
7 Recover rankings
Throughout, you should build great links
through real outreach and marketing, too.
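For step 3, the text file Google expects follows a simple format: one URL or domain per line, with optional comments prefixed by #. A minimal sketch (the domains below are placeholders, not real examples from an audit):

```
# Site owner contacted twice, no response
domain:spammy-directory.example.com
http://link-farm.example.net/widgets-page.html
```

The domain: prefix disavows every link from that domain, which is usually safer than listing individual URLs from a site you know to be spammy.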
Tim Grice speaks regularly about the search industry.
For more information about upcoming events, visit:
www.branded3.com/events
Content strategy
Giving context
to your content
In the first of a four-part series,
Sandi Wassmer establishes a
framework for content strategy
Content is defined in the Oxford English
Dictionary as "the things that are held or
included in something", which is more
than a tad ambiguous, so it's not surprising that
interpretations of the word in the context of web
design and development vary so widely.
If you ask 10 web designers or devs what
content is and what role they play in its creation,
you may get 10 different answers, ranging from
"Content? What? I'm a designer, not a copywriter"
to "I'm glad you asked. Sit down and let me tell you
about the importance of taxonomies and metadata
management." But they will all know one thing: the
success of their site, web app, mobile app, or other
platform that content is delivered on, relies on it.
In a word: Findable
When a website is
broken down into
discrete elements,
it's all just content
along and gave it style, with marketers joining in
once they realised what an incredible tool they had
at their disposal. However, in web design, creative
teams are multidisciplinary, a lot more flexible and
have developers as part of their core.
From brand managers to developers, designers to
copywriters, user experience folk to technical project
managers to online marketers and more, all play
pivotal roles in the life of your content.
In large organisations, content strategy is
becoming an integral part of marketing and
business strategy, with clear delineation of roles and
responsibilities, but in smaller ones, content strategy,
like many other aspects of web design, is shared
among those involved in its delivery. However you
slice it, thinking strategically is essential.
Making data meaningful
Content management
Content management and content management
systems are not the same thing, and this little
misunderstanding can be the source of great
frustration for those involved in content strategy.
Content management is a process, something you
plan and do; a content management system is a
piece of software and is something you utilise.
Content management systems, from a five-page
WordPress blog to an enterprise-level information
management system, are what most digital teams
use to create, publish, manage, distribute and store
all manner of web content, and for those not versed
in the merits of content strategy, it's easy to simply
rely on what the system provides. Although you may
get lucky and find a CMS that is perfectly in line with
your content strategy, it's improbable. However, as
content management systems are not created by
designers or content strategists, it is important to
understand the inherent systems and structures that
exist within your CMS: these may impose restrictions
on your strategy and approach to design, particularly
if the inherent systems are not flexible, or you don't
have the resources to modify them.
You also need to consider that once your
wonderfully designed website is finished, it will be
handed over to marketing to manage the day-to-day
Metadata classification
l Administrative metadata is used to improve
your CMS or other method of managing
content for administrative purposes.
l Structural metadata is the data inherent in
your CMS database. It's unseen by users, but
is key to keeping your content well managed.
l Descriptive metadata (such as tags used to
categorise and filter blogs or news content)
is what frontend designers and devs need to
consider. If managed well, they can greatly
improve the findability of content; used badly,
they are a hindrance and not fit for purpose.
Taxonomy
User intelligence
Amassing user intelligence is an evolving area,
but the variously named methodologies are all
derivatives of either quantitative or qualitative
research. Traditional market research processes still
stand us in good stead, and technology provides
enormously improved ways of collecting, collating
and analysing the collected data.
Market research's
insights into a site's
target audience
can be invaluable
Quantitative evaluation is objective: it is about
statistics and analytics that measure performance against
set targets. In contrast, qualitative evaluation looks
at the quality of interactions and is more subjective.
Analytics data and reports from your existing website
should be used, where available, because these
provide the context for deeper understanding, and
can also be used to compare against analytics data
post-launch.
Quantitative research
Statistical quantitative research starts with the
collection of data, and there are a number of
tools available for this, such as questionnaires,
online surveys, qualitative research (the whys and
wherefores), interviews, focus groups, random
sampling, projective groups, heatmaps, eye-tracking
and product testing.
Ethnography
Admin CMSes offer many ways to manage online content
Put knowledge into action
In the third of a four-part series, Sandi Wassmer argues that it pays to plan before embarking on any project
…possibilities of bringing the disciplines of responsive design and content strategy together.
Defining design goals
When defining your goals, the understanding you
On commencing any project, you should
give full consideration to the different
places your content needs to get to,
and remember that it frequently needs to do so
simultaneously. So if you don't think strategically
and plan for the different outcomes, you'll find things
coming unstuck pretty quickly.
You will need to use whatever resources you
have to make this process more efficient. If you are
creating a news story that needs to populate different
channels, in different formats, at different times and
in different languages, the way that you create, store,
distribute, manage, maintain and eventually archive
the different versions of a single source of content
(and all of its associated files and metadata) is key.
Even if your brief says that you are only designing
a website, you must nonetheless consider the
different distribution channels, which can be
categorised as follows:
l Websites, web applications and mobile apps
l Social media, blogs, news feeds and aggregators
l Photo, video and media sharing
l IPTV
l Gaming
l Communications
Made-to-measure The
iPlayer on the PS3 and
Xbox delivers the same
content to distinctly
different environments
Plan content before you build
Assess and decide
You need to be brutally honest about what
will work and what wont. Set out clear
criteria before you assess, and stick to them.
Mind the gap
Once you know what you've got, you need
to figure out what you haven't got. What
content is missing? What do you need
to fulfil your goals and objectives? What
do your users really want? See this as an
opportunity to innovate.
Scoping
Just as you scope a web design project, you will
have to determine the resources available to deliver
against your design goals and website's objectives,
whether they're people, content assets or budget,
and there will be hard decisions to make. Resources
are always finite, so attention to detail at this stage is
a must. Your scope will need to include details of:
l The different distinct types of content that need to be created
l How much of each of the different types of content is required
l How often the different types of content will need to be updated
l Which distribution channels will utilise the content
l What the associated costs are
What you determine here will provide the framework
for the work to come. You may be surprised at the
results, because it is usually this stage that serves as
a bit of a reality check. It is incredible how quickly
requirements that were absolutes fall by the wayside
when it becomes evident just how much work goes
into getting content right. It is not that content
Bringing your
content to life
Stages of interaction
Users are content editors
too so make it easy for
them. The Flickr Uploadr
guides users through the
process of uploading
images to the site
Multiple devices Content strategy must evolve in line with access patterns. IMDb (www.imdb.com) is a good example
of how a complex set of data can be organised to ensure that users can access it as intended using a range of devices
l CMS development
l Database development
l Client-side scripting
l Server-side programming
l Web applications development
l Mobile applications development
Image: www.strangelove.me
In depth Find out all you can about your users: not
only who they are, but why they use your content
Best fit A good example of content aligned to structure, games website Thunderbolt (www.thunderboltgames.com) uses responsive design techniques to provide elegant experiences across different platforms
You will need to determine dependencies and
interdependencies, and make sure you have
made contingencies and can accommodate change.
Launch a website
http://karenmcgrane.com
Photography/Daniel Byrne
Content vs structure
After leaving the cloistered world of
academia, McGrane landed her first job as an
information architect with Razorfish. "I was the
first IA. I was the first person with any kind of UX
background, hired when Razorfish was like 30
people. And I have never done anything else!
"I think I'm so lucky in that this job [matches]
how my brain works," she adds. "I feel really lucky
to [have found] a career that allows me to do what
I do best. I was very lucky the internet came along
right when it did. It seemed like fate."
All this sets us to wondering: when it comes
to books, films, anything with a narrative
framework, which does she find most alluring:
content or structure? "Easily the structure and
the management of it, the engineering system
behind it," is her response. "I love a good story.
I love a solid narrative. I love a good turn of
phrase and I enjoy the act of writing. What's
really interesting is trying to reverse engineer
something. It's pattern recognition really. You can
let [structure] just wash over you or you can be the
kind of person who is constantly trying to figure
out how the system works. So yeah, I'm definitely
a systems person. If I go to Disneyland, am I
letting it wash over me? No. I'm like: why is this so
great? What are they doing to make this a great
experience? I'd say that's a common trait in UX
Karen McGrane
Job UX and content strategist
Online http://karenmcgrane.com
Twitter @karenmcgrane
Recent projects Hearst
Corporation, American Express
Publishing, Celebrity Cruises
Tutorials
Expert tutorials
Step-by-step skills to boost your page rankings from leading industry experts
Nginx: serve faster web pages
Download the files! >
Nginx is a high performance and open source web server. Jaime Hall explains
how to use it to serve all static content and speed up page loading times
Knowledge needed Intermediate command line, basic server knowledge
Requires Linux (uses Debian for the example)
Project time 30 minutes
If you don't have Apache already installed, let's install it, along with PHP:
apt-get install apache2 apache2.2-common apache2-mpm-prefork apache2-utils libapache2-mod-php5 php5 php5-common
At this point, you can install any other additional modules that you may
require for PHP like the GD library, cURL, ImageMagick, MySQL etc.
If your server is currently restricting port access via IPTables or similar, then
you will need to open up port 8080 so that any non-static content can proxy
through to Apache on that port as well as the standard 80 port for Nginx.
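As a sketch of those firewall rules (rule order and any existing chains will vary per server, so treat this as an illustration rather than a drop-in config):

```shell
# Accept web traffic for Nginx on port 80 and the Apache backend on 8080
iptables -A INPUT -p tcp --dport 80 -j ACCEPT
iptables -A INPUT -p tcp --dport 8080 -j ACCEPT
```

If Apache only ever receives traffic from Nginx on the same machine, you could restrict the 8080 rule to the loopback interface instead of opening it publicly.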
nano /etc/apt/sources.list
deb http://packages.dotdeb.org squeeze all
deb-src http://packages.dotdeb.org squeeze all
wget http://www.dotdeb.org/dotdeb.gpg
cat dotdeb.gpg | apt-key add -
apt-get update
Need to know Nginx has a full list of all available configuration settings
Load time Use Pingdom's toolkit to test the load time of web pages
multiple requests whereas Apache has to spawn new processes or threads for
each request it receives. This uses less memory as each process has a certain
memory overhead every time one is spawned. Because of this, a much more
predictable memory usage can be achieved under large traffic loads.
Because Nginx is only serving static files, the process itself uses very little
memory and doesn't need the overhead of additional modules that Apache
may require (such as mod_rewrite, mod_php, mod_deflate and so on).
Below are some example memory usages for Apache and Nginx as seen on
a couple of live web servers using fairly default installations. These will vary
between servers, but they give a fairly good representation: Nginx, 4.62MB
and Apache, 29.12MB.
Theres a massive difference in overhead between the two, which will
increase with traffic. You can see why spawning multiple Apache processes
can make memory usage fluctuate greatly.
client_body_buffer_size 8K;
# Header buffer size for the request header from client
client_header_buffer_size 1k;
# Maximum accepted body size of client request
client_max_body_size 2m;
Network flow Resources as they're requested from the server via debug toolbar
# Set expiration headers
expires 30d;
}
# All other extensions not in the above
location / {
# Root directory for Apache site files
root /var/www/example-site/httpdocs/;
# Index file to look for
index index.html index.htm index.php;
nano /etc/nginx/sites-available/example-site
server {
# Maximum number and size of buffers for large headers to read from client request
large_client_header_buffers 2 1k;
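Pulling the fragments above together, a minimal sketch of the split setup might look like this. The domain and paths are placeholders, and your real virtual host would also carry the buffer and expires settings shown earlier:

```nginx
server {
    listen 80;
    server_name example-site.com;

    # Static assets: served directly by Nginx with far-future expiry
    location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
        root /var/www/example-site/httpdocs/;
        expires 30d;
    }

    # Everything else: proxied through to Apache listening on port 8080
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

The proxy_set_header lines pass the original hostname and client IP through to Apache, so your application and access logs still see real visitor addresses rather than 127.0.0.1.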
# Directory permissions
<Directory /var/www/example/httpdocs/>
Options FollowSymLinks
AllowOverride All
Order allow,deny
allow from all
</Directory>
#
ErrorLog /var/log/apache2/error.log
# Possible values include: debug, info, notice, warn, error, crit,
# alert, emerg.
LogLevel warn
# Location for the access log
CustomLog /var/log/apache2/access.log combined
</VirtualHost>
Test 1
Test 2
There is almost a 33 per cent drop in load time between just Apache and
using Nginx to proxy to Apache. The next test was a concurrency stress test.
This should show more in the way of the benefit of Nginx and how event-driven
asynchronous request handling works better for static content.
Before and between each test, Nginx and Apache were restarted and left
for five minutes to allow the server to settle. To run the stress test, ab
(ApacheBench) was used against the Oasis Overland site.
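The exact parameters used in these tests aren't reproduced here, but an ab invocation of this general shape runs such a test (the URL and figures are placeholders, not the ones from the test above):

```shell
# 1,000 requests in total, 100 at a time; ab reports requests/second
# and response-time percentiles when it finishes
ab -n 1000 -c 100 http://www.example.com/
```

Note the trailing slash: ab requires a full path on the URL, and hitting the same page you measured in Pingdom makes the two sets of numbers easier to compare.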
Test 1
Test 2
02 Natural selection
Don't over-optimise. With the release
of the Penguin update, Google is rewarding
sites that are more natural. So avoid stuffing
keywords into your title tags and content. Don't
have hidden text or links, don't use cloaking
or sneaky redirects. You get the picture. Keep it
natural and concentrate on user experience.
04 Internal linking
Make good use of internal linking. When
designing a great-looking site we sometimes
forget about which pages are the most
important and how they're linked to. Make sure
your internal links support your most important
pages and use good descriptive anchor text to
let search engine spiders understand what the
page is about. You can use a website crawler
such as the SEO Spider Tool from Screaming
Frog, or Xenu's Link Sleuth (both of which are
free to use), to see how your internal linking
affects spidering.
05 Be careful
Be careful of free open source themes
and plug-ins. Many free WordPress themes
include hidden code that will place
spammy external links on your website or,
even worse, malware. Don't trust themes
and plug-ins: check them first
before installing, or get them from a trusted
source. Usually these hidden links and so on
are buried in Base64-encoded PHP, so you can
check for that before installing. To easily check
WordPress themes, install and run TAC (Theme
Authenticity Checker).
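A quick manual spot-check is possible from the command line too. This is only a sketch: the theme path is a placeholder, and legitimate themes can also call base64_decode, so treat any hits as leads to inspect rather than proof of malware:

```shell
# Scan a WordPress theme directory for obfuscated PHP --
# base64_decode is a common hiding spot for injected links
THEME_DIR="wp-content/themes/example-theme"
grep -rln "base64_decode" "$THEME_DIR" 2>/dev/null || echo "no base64_decode calls found"
```

grep -rln lists only the file names containing a match, which is usually enough to tell you which template files to open and inspect.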
Place to be Create listings in directories such as Google Places to increase your visibility
07 Research keywords early
When building your website, perform
keyword research at an early stage so that it
can be built into the design of your site. If you
have an existing site, consider running a test
using Google AdWords to see what converts
for you. This way, you can easily identify the
required landing pages (money pages) to ensure
that your design caters for them, without horrible
bolt-on afterthoughts.
08 Use microdata
Take advantage of Google's rich snippets
by using microdata in your web build. Ever seen
review rating stars in an organic listing? That's a
rich snippet and it can greatly improve your
clickthrough rates (CTR). You can mark up ratings,
videos, people, products and many other things.
Visit http://schema.org to find out more.
09 Clear information
Optimise your website for local search
results. Create a listing in local search engines and
directories, for example, Google Places and Yahoo
Local. Display your business information clearly
on your website, for example, in the footer. Check
your whois data and make it the same as the
business's if it's different. Consider local business
directories, businesses you work with and your
local chamber of commerce.
10 Think social
Search engines are placing great weighting
on social signals, so you want to encourage
the use of networks such as Facebook
and Twitter. Make sure your content is easy to
share, and promote content through your own
social profiles. Concentrate on getting shares and
retweets, rather than just increasing followers and
Likes. It's the sharing of content that we want; the
followers just give us the audience to do so.
Glenn Alan Jacobs is managing director
of SpeedySEO, an online marketing
agency based in Essex
www.speedyseo.com
03 Manage 301 redirects
If moving to a new website or URL,
Tutorials
Schema: utilise structured data
Download the files! >
There's a shift towards the semantic web from Google and other search providers.
Luke Hardiman demonstrates how to use structured data for ecommerce products
Knowledge needed Basic HTML
Requires Text editor, Google Structured Data Testing Tool
Project time 30 minutes
There's never been a better time for small online businesses and
startups with great merchandise to get their wares in front of new
customers, and all without having to rely on paid clicks, banners or
any other annoying, expensive ad campaigns. In many ways, content is king
again, and not just the editorial kind. Surround your online products with
quality, contextually relevant content, mark it up in a semantically sound way
and you'll be taking the high road to commercial success.
Obsessing over semantic markup used to be more or less the exclusive domain
of web standardistas, many of whom were driven by anything but a commercial
agenda. Some, such as web designer Jeffrey Zeldman, did in fact present the
business case (www.zeldman.com/dwws) for web standards many years ago.
Arguments for big business adopting web standards were met with varying
degrees of success and no shortage of deaf ears.
Structured data now makes that case more emphatically than ever before,
with visual evidence in the form of rich snippets and money in the bank,
delivered on the back of improved search engine performance. These benefits
will extend beyond the big search engines in the near future.
Wrath of Penguin
Major Google updates (Panda in 2011 (http://netm.ag/panda-241) and Penguin
in 2012 (http://netm.ag/penguin-241)) took a far-reaching swing at spammy and
poor quality content. Sites judged to have a poor user experience, too many
ads, or deemed to have benefitted from dodgy link-building campaigns suffered
massive drops in organic traffic. Many of them paid a high price in lost revenue.
So what does this SEO smackdown have to do with frontend development?
Well, it's more about the increased focus on semantics and what that means
for nearly anybody producing and marking up online content, from your
Schema.org Google, Bing, Yahoo and Yandex showed unanimous support by declaring
Schemas microdata vocabulary as the preferred standard in 2011
Enter Schema.org
In June 2011, Google, Bing, Yahoo and Yandex all announced a rare show of
unanimous support and declared Schema.org's (http://schema.org) microdata
vocabulary the preferred standard. This provoked no small amount of protest
from the microformats and RDFa people, who had been working on their own
data vocabularies for a considerable amount of time. Nevertheless, pragmatists
have appreciated the across-the-board adoption by the big search players, the
freedom to extend the vocabulary (there'll be more on this later), the relative
ease of use, the testing tools and the plentiful documentation that Schema.org
brings to the table.
The joint declaration of support by the big search players succeeded
in lighting the intended fire. In the recent past, business-minded frontend
developers have set about implementing microdata wherever a site's content
fits the available structured data vocabulary. The rewards are beginning to
pay dividends, as with the author rich snippets, in the form of increased
clickthrough rates from the search results.
Q&A Ask Google a question, and it's now trying to deliver you an answer, rather than
just list a bunch of links to sites that may or may not do that
There's no question that highly relevant search results that stand out from
the competition are going to bring you more traffic. Not only that, but it's
becoming clear that rich snippet traffic is more likely to convert into revenue. A
user that clicks on a rich snippet result invariably has a better understanding of
the content they're going to find at the end of that click. They're making a more
informed choice, which could mean they'll be more likely to place an order at
the end of their session.
By far and away, the best approach is to build microdata into the CMS's
templates. No frontend developer wants an inbox full of Word documents from
the content team, and to have to set about marking up the contents of each one
on a case-by-case basis.
Personally, I mostly work with ExpressionEngine (http://ellislab.com/expressionengine),
which is known for being a very flexible publishing system.
I always try to start each project with solid information architecture and at least
a basic data dictionary. These are the two main ingredients necessary for setting
up the CMS. If you're fortunate enough to already have your data entities
well-defined in your CMS, you're off to a flying start and maybe you are ready
to skip ahead to implementation at the template level.
Google's tool This will kick out a list of all of the identified items on your page, as
well as any errors or entities which do not fit within your chosen schemas
Provenance
Data integrity
For microdata purposes, data integrity allows for coding up templates
where each type of content can be outputted into the relevant HTML5
element, which in turn contains the specific microdata attributes to
define exactly what type of content lives inside.
Get your HTML and itemtypes ready to receive whatever new
products the content team throws at them. Once you've achieved this,
you've won significant victories for your web platform on three
different fronts:
l At the business level, you have a highly flexible and product-specific
publishing system for your client
l At the user level, you have well-defined data that can be searched
or outputted wherever needed, putting your content to work in the
service of a great user experience
l At the findability level, you have detailed product data and attributes
that describe semantically rich entities for search engines as well as
for aggregators
Marking up a tour
In terms of the immediate benefits available to us, and based on our tour's
product attributes as listed above, there are a few different types of rich snippet
which we can target with our product content.
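Review stars are one such snippet. A minimal sketch of rating markup (the figures here are invented for illustration, not taken from the sample file):

```html
<!-- Hypothetical example: star-rating rich snippet via AggregateRating -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Multi-country Overland Tour</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> out of
    <span itemprop="bestRating">5</span>,
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```

Note that the visible text doubles as the machine-readable values, which keeps the markup honest: users and bots see the same rating.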
The following code snippets are taken from the sample HTML file provided,
which marks up an African overland tour with some of the entities listed above.
Go ahead and customise this to your own needs.
Authorship
To start, well add <link> tags for the rel="author" and rel="publisher"
attributes. For the example below, I have these pointing to the Google+ profile
104
Our tour The detail template contains a many aspects which can be formatted as
structured entities, from pricing to events and a considerable amount of geodata
Tutorials
Pricing An 'In Stock' message pops up next to the price in your search result. This is
achieved by creating an itemscope for the product
of the pages author. You may or may not want to do this, based on how
connected your author is within their industry. An author who is a respected
authority on the pages subject matter, with a strong social media footprint or a
considerable weight of content published elsewhere online, is going to be a
valuable person to link up here.
<!DOCTYPE html>
<html>
<head>
<title>Multi-country Overland Tour</title>
<!-- we're specifying authorship here, which gets us an author thumbnail in the Google SERPs -->
<link rel="author" href="https://plus.google.com/u/0/117466339632939110111/posts" />
<!-- we're specifying the publisher here, which ties the content to our Google+ Profile -->
<link rel="publisher" href="https://plus.google.com/101112670412082074285/posts">
With the current state of play, I've found that authorship tends to override
other structured data Google may have found and leave you with a rich
Breadcrumb navigation
Here we add a breadcrumb (http://netm.ag/crumbs-241) itemprop attribute to
the nav element. All you need to tell Google is that this set of links contains
information about your content architecture. Using breadcrumbs will not only
give you a rich snippet that reflects your site's structure, but consistent use of
breadcrumbs will also serve to expose your information architecture to bots and
users alike. Double win.
<!-- The WebPage itemtype is implied, but I'm declaring it below for the sake of clarity -->
<body itemscope itemtype="http://schema.org/WebPage">
<!-- The breadcrumb is not part of the Product schema and must sit
outside of the Product itemtype below -->
<nav itemprop="breadcrumb">
You are here
<a href="/">Home</a> >
<a href="/budget-safaris/">Budget Safaris</a> >
<a href="/budget-camping-safaris/">Budget Camping Safaris</a> >
Multi-country Overland Tour
</nav>
Canon EOS This is the search result detail you're aiming for. There's breadcrumb, star
rating, review date, author name and thumbnail, and links to more of his content
Pricing
Here we create a new offer (http://netm.ag/offer-241) itemscope for the
product, contained within the <div> that wraps our pricing information. This
specifies currency and amount (price), and I'm also using the new GoodRelations
businessFunction attribute to specify that we are selling this tour, as opposed to
leasing it or any other type of trading.
<!-- the 'offer' - pricing details. Using GoodRelations there is huge scope
for additional detail here -->
<div itemscope itemprop="offers" itemtype="http://schema.org/Offer">
Priced from <abbr title="Per Person Sharing">pps</abbr>
<meta itemprop="priceCurrency" content="GBP">
<span itemprop="price">999</span>
<link itemprop="businessFunction" href="http://purl.org/goodRelations/v1#Sell">
</div>
Specifying <link itemprop="availability" href="http://schema.org/InStock">
can also get you a little 'In stock' message that pops up next to the price in your
search result. I'm not sure how well this fits with the selling of tours, and there
may be a better way to handle this for products that are not off-the-shelf items.
(I've left it out of the example code.)
Events
Because events aren't catered for at the moment as properties of the product
vocabulary, I haven't used an itemprop attribute on any of the <li>s. I'm still
going to include them on the page as structured data entities, though. This is in
the hope that the search engines and aggregators will make use of the clearly
defined content.
<!-- we're marking up the scheduled tour departures as events.
Note: events are not valid properties of the Product schema, hence
no itemprop attribute -->
<h3>Departure Dates</h3>
<ul>
<li itemscope itemtype="http://schema.org/Event">
<meta itemprop="name" content="Scheduled Tour Departure">
<meta itemprop="location" content="Cape Town City Centre">
<meta itemprop="duration" content="10 Days">
<time itemprop="startDate" datetime="2013-03-08">Fri, 08 Mar
2013</time> -
<time itemprop="endDate" datetime="2013-03-17">Sun, 17 Mar
2013</time>
</li>
<li itemscope itemtype="http://schema.org/Event">
Vicky Rowe Here's a rich snippet for a blog post, showing verified authorship
Webmaster Tools Be aware that entities wont show up immediately. The time scale
depends upon how often and how deeply Googlebot is crawling your content
That wraps up the example code. One thing to bear in mind with this
particular example is that we have a lot of entities all competing for attention on one
product page. You may find you get better results (particularly with Google
rich snippets and long tail searches) by adding more content to each entity, for
example longer reviews, and then breaking things out into separate URLs,
something like this:
.../my-tour/reviews/
.../my-tour/locations-and-map/
.../my-tour/departure-dates/
This way you don't compromise your chances of getting a separate rich
snippet for each entity.
Separate URLs also mean you're more likely to rank for long tail search
queries, for example location+product, product+review and so forth. The long
tail can be a conversion gold mine. The specificity of the query often means
the user is quite far down the decision-making process, and is more likely to
convert to a sale or a booking.
Tracking progress
Once you have some structured data entities live on your site, Google allows
you to monitor the indexation progress via Google Webmaster Tools.
Under the Labs section in Webmaster Tools there's also an Authorship report,
showing how often rich snippets were presented in the search results and the
percentage of clickthroughs that resulted.
Similar to its policy on cloaking, Google's guidelines on rich snippets
(http://netm.ag/rich-241) specify that any content marked up with structured data also
be shown to the user. Don't be tempted to present one set of entity-stuffed
structured data to the engines without making it visible on the page. At best
you'll get no benefit, and at worst you'll be penalised and your site could
disappear from the search results.
ASP.NET: boost page performance
Bundling and minification are two very effective techniques to provide a faster page
load time and browsing experience for your website. Dean Hume explains how
Knowledge needed ASP.NET, basic CSS and JavaScript
Requires Visual Studio 2012 or Visual Studio Express 2012 (free)
Project time 1 hour
Visual Studio Visual Studio 2012 offers a whole host of great features designed to
help develop interactive web applications, with new tools for JavaScript and jQuery
input[type="search"] {
  -webkit-box-sizing: content-box;
  -moz-box-sizing: content-box;
  box-sizing: content-box;
  -webkit-appearance: textfield;
}
input[type="search"]::-webkit-search-decoration,
input[type="search"]::-webkit-search-cancel-button {
  -webkit-appearance: none;
}
textarea {
  overflow: auto;
  vertical-align: top;
}
If we take this same file and minify the contents, it will look a little
something like this:
input[type="search"]{-webkit-box-sizing:content-box;-moz-box-sizing:content-box;box-sizing:content-box;-webkit-appearance:textfield}input[type="search"]::-webkit-search-decoration,input[type="search"]::-webkit-search-cancel-button{-webkit-appearance:none}textarea{overflow:auto;vertical-align:top}
The contents of the file have had all the spaces, comments and line breaks
stripped out, which has made a significant difference to the overall
size of the CSS. The code above isn't exactly easy to read and will be a bit
of a pain to debug. However, your browser doesn't care what the code looks
like; it will process it in exactly the same way. The key difference here is that
the minified code will download a lot faster and allow your users to begin
interacting with your web pages sooner.
In order to see the size differences between minified and unminified files, I
compared the following popular frontend frameworks:
File Name               Original    Minified    Saving
Twitter Bootstrap CSS   98KB        80KB        19%
Twitter Bootstrap JS    60KB        27.8KB      53.7%
Zurb Foundation         99.3KB      74.9KB      25%
jQuery                  225.78KB    93.28KB     58.68%
jQuery Mobile           240KB       91KB        62%
As you can see from the table above, there are considerable file size savings
to be made by simply minifying your CSS and JavaScript files. The amount
you will be able to save will differ depending on the contents and file type,
but these changes can make a big difference to overall page load times.
Getting started Begin by creating a new project in Visual Studio. In this example, we
are going to use an ASP.NET MVC 4 application
The chart above shows the growth of JavaScript file sizes between
January 2011 and January 2013. It's on a steady incline and reflects an
interesting trend. While having a lot of JavaScript on your website may be
inevitable, keep in mind that it may block the UI thread and make your
website slower. By sticking to best practices, such as bundling
and minifying CSS/script files, we give our users a better and faster
browsing experience.
What is bundling? Bundling is the act of combining all the JavaScript files
in a web page into one single file, and similarly all the CSS files into one. Web
pages can contain multiple script tags and style tags referencing different files
on your server. The more CSS or script files you have in your web page, the
more HTTP requests your users need to make every time they visit. HTTP
requests are expensive because each one means a round trip from the
browser to the server. If this number can be reduced, then page load
times can be improved. And since minifying a file doesn't affect the way it's
rendered, the two techniques can safely be combined.
Getting started
Let's get started with the code example. Start by firing up Visual Studio and
creating a new project. Choose an ASP.NET MVC 4 application and give your
project a name; I've chosen to name mine NetMagBundling. Click OK.
Next, you will be presented with a screen similar to the one pictured on the
right. Choose the Basic template and click OK. This will create a new project
and provide you with the default ASP.NET MVC solution files.
In this example, Im going to use the Twitter Bootstrap framework along
with a copy of the latest jQuery library in order to get a simple design up and
running. The Twitter Bootstrap framework is available to download at http://
twitter.github.io/bootstrap. Once you've downloaded the Twitter Bootstrap
files, add them to your project. I have added my CSS to the Content folder and
JavaScript to the Scripts folder. Ignore the minified versions of the files that
come with the Twitter Bootstrap download, as we are going to use the ASP.NET
bundling framework to bundle and handle this for us automatically.
Basic template Next, choose the Basic project template. This will create all the
resource files that we are going to need to get started
In your Solution Explorer, you will also notice a folder called App_Start.
The App_Start folder contains a file called BundleConfig.cs, which enables
you to set and manage the bundling and minification for the project. Open
the BundleConfig.cs file and you will notice a method called RegisterBundles.
We will use this method to tell the ASP.NET framework which files to minify
and how to bundle them together. When you create a basic project in Visual
Studio, it will provide you with common libraries, and you may notice that the
RegisterBundles method already contains some code. Remove this code,
as we are going to write our own that bundles only the files we need.
Update the code inside the RegisterBundles method to the following:
public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // Minify the CSS
        bundles.Add(new StyleBundle("~/css/minify").Include(
            "~/Content/css/bootstrap.css",
            "~/Content/css/bootstrap-responsive.css"));
        // Minify the JavaScript
        bundles.Add(new ScriptBundle("~/js/minify").Include(
            "~/Scripts/bootstrap.js"));
    }
}
In the snippet of code above, you will notice that there's a reference
to the Bootstrap CSS and JavaScript files. We are creating a new
Compression + minification = a boost
If you're looking to squeeze even more speed out of your page load
times, then you should be looking at HTTP compression. It's simply an
algorithm that eliminates redundancy from a file in order
to create a file that's smaller in size than the original representation.
In the same way that you may zip a file on your hard drive and then
unzip it again when needed, HTTP compression does the same thing
with your web page components. When a user requests a file from
the server, the server can return a compressed version of the file.
The users browser then understands that it needs to decompress the
file that came back from the server. By passing a smaller file back
between the server and browser, it makes the download time a lot
quicker. It also shaves off a considerable amount of bytes that need to
be downloaded.
Most web servers already have the ability to compress files; the feature
simply needs to be enabled. Check the settings for whichever web server
you are using and don't forget to switch compression on.
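On IIS7 and later, for example, compression can be switched on from Web.config (a minimal sketch; the exact settings and defaults vary by server and hosting environment):

```xml
<!-- Enable static and dynamic HTTP compression on IIS7+ -->
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>
```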
StyleBundle and ScriptBundle that will minify and bundle both of these
files together respectively. The paths inside the Include method both
point to the location of the Twitter Bootstrap files. If we take the StyleBundle,
for example, there is a path that points to "~/css/minify". This path doesn't exist
in our file structure, but is in fact an endpoint. We will point to this endpoint in
the HTML, and the ASP.NET bundling framework will create our new bundle on
application start-up.
This new bundle will take the CSS files, minify them and combine them
into one, and do the same with the JavaScript files. This
means less to download and fewer HTTP requests being made.
In order to use the bundles that we've just created in your web page, create
a new view and add the following code:
@{
Layout = null;
}
<!DOCTYPE html>
<html>
<head>
<title>Bundled Bootstrap template</title>
@Styles.Render("~/css/minify")
</head>
<body>
<div class="container">
<h1>Minifying and Bundled CSS/JavaScript</h1>
</div>
@Scripts.Render("~/js/minify")
</body>
</html>
I've created a simple HTML layout that uses the Twitter Bootstrap classes. You
will notice a method called Styles.Render, which uses the bundle that
we defined and writes out the style tags for us. Similarly, there's also a
Scripts.Render method that bundles the scripts and writes out the script tags
onto the page.
Let's fire up the application and compare the differences. If you run the
application and view the source of the page, you will notice HTML similar
to the following:
<!DOCTYPE html>
<html>
<head>
<title>Bundled Bootstrap template</title>
<link href="/Content/css/bootstrap.css" rel="stylesheet"/>
<link href="/Content/css/bootstrap-responsive.css" rel="stylesheet"/>
</head>
<body>
<div class="container">
<h1>Minifying and Bundled CSS/JavaScript</h1>
</div>
<script src="/Scripts/bootstrap.js"></script>
</body>
</html>
OK, so the HTML contains all the script tags and style tags that we added
to the bundles. However, they don't appear to be minified or combined
together. This is because we're running the application in debug mode. In
order for the minification and bundling to work, we need to ensure that the
application is in release mode. Bundling and minification is enabled or disabled
by setting the value of the debug attribute in the compilation element in the
Web.config file.
In your Solution Explorer, navigate to your Web.config file and update the
following line:
Starting line In your Solution Explorer is a folder called App_Start containing a file
called BundleConfig.cs. This enables configuration of bundling and minification settings
<system.web>
<compilation debug="false" />
<!-- Lines removed for clarity. -->
</system.web>
Bootstrap In this example, we are going to use the Twitter Bootstrap framework.
Using frameworks If you haven't already heard of Twitter Bootstrap, it's a powerful
frontend framework. Download it and add the files to the CSS and Scripts folders accordingly
Set the debug attribute of the compilation element in the Web.config file to
false to ensure that you are no longer developing in debug mode. If you start
the application again and inspect the HTML of the page, you'll notice the
bundles in action.
If you use CSS or script preprocessors in your applications, you can process
and apply custom transforms yourself. Custom transforms allow the
framework to interpret the files and apply bundling accordingly. Although
the example in this tutorial ran through bundling in ASP.NET MVC, the
bundling framework also works in ASP.NET Webforms (www.asp.net/web-forms).
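If you do write a custom transform, the framework exposes an IBundleTransform interface with a single Process method. A minimal sketch (the class name and the comment it prepends are illustrative, not part of the tutorial's project):

```csharp
using System.Web.Optimization;

// A minimal custom transform: the bundling framework calls Process once
// per bundle, handing it the combined content to rewrite.
public class HeaderStampTransform : IBundleTransform
{
    public void Process(BundleContext context, BundleResponse response)
    {
        // A real transform might compile LESS/Sass or rewrite URLs here;
        // this one simply prepends a comment to the combined output.
        response.Content = "/* bundled on the fly */\n" + response.Content;
        response.ContentType = "text/css";
    }
}

// Registering it on a bundle inside RegisterBundles:
// bundles.Add(new Bundle("~/css/custom", new HeaderStampTransform())
//     .Include("~/Content/css/site.css"));
```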
Summary
By using this new feature of ASP.NET, you get three benefits in one: your files
are minified, combined together and cacheable. This means that your
users download less, make fewer HTTP requests and won't need to download
the files again unless the contents change.
If we take a look at the file sizes before and after bundling, there is a
noticeable difference:
File Type     Original        Bundled      Saving
Total         206.3KB         147.41KB     30%
JavaScript    60.3KB          27.6KB       55%
CSS           124 + 21.6KB    119.9KB      17.66%
While this is a very simple page, we have still managed to reduce the
overall page weight by around 30 per cent. That is a significant reduction on the
original weight of the page and will certainly benefit your users.
The average connection speed for internet users worldwide is around
1.8Mbit/s. In the UK, our connection speeds are around 6Mbit/s, which is
significantly higher than the rest of the world. However, as web developers, it's
important that we think about users globally. Many users may be accessing
your website from locations around the world with poor connection
speeds. These performance techniques will make a big difference to those users
and their overall browsing experience while on your website. As we know, the
bigger the files, the longer they take to download. This is
especially relevant for users accessing websites on mobile devices or tablets.
3G speeds can vary wildly depending on a number of factors and can be flaky
at the best of times. Think about performance best practices when developing
your website; all your users will benefit from your work!
If you are using ASP.NET MVC in your next application, give bundling a go.
For less than an hour's worth of work, your users will notice the results! The
source files for this project are available to download at http://github.com/
deanhume/NetMagBundling.
<!DOCTYPE html>
<html>
<head>
<title>Bundled Bootstrap template</title>
<link href="/css/minify?v=x2tjtoRj4l9AoKFwO-qVI5gpFTC7fxSWPa0gEL-BrNY1" rel="stylesheet"/>
</head>
<body>
<div class="container">
<h1>Minifying and Bundled CSS/JavaScript</h1>
</div>
<script src="/js/minify?v=ii4SomVnNME7Iq1GL51nWqk9KnL_D13MXTIn-0yYx6I1"></script>
</body>
</html>
One of the first things to look at is the size of your HTML code. This is
probably one of the most overlooked areas, perhaps because people assume
it's no longer so relevant with modern broadband connections. Some content
management systems are fairly liberal with the amount of markup they churn
out, one reason why it can be better to handcraft your own sites.
As a guideline, you should easily be able to fit most pages in under 50KB of
HTML code, and if you're under 20KB then you're doing very well. There are
obviously exceptions, but this is a fairly good rule of thumb.
These sorts of targets have real world implications for your website and
business. Google's Marissa Mayer spoke in 2006 about an experiment in which
the number of results returned by the search engine was increased to 30. This
slowed down the page load time by around 500ms, and a 20 per cent drop
in traffic was attributed to this. Amazon, meanwhile, artificially delayed the
page load in 100ms increments and found that even very small delays would
result in substantial and costly drops in revenue.
Other adverse associations linked with slow websites include lessened
credibility, lower perceived quality and the site being seen as less interesting
and attractive (see http://netm.ag/webpsychology-231). Increased user
frustration and increased blood pressure are two other effects we have
probably all experienced at some point! But how can we make sure our
websites load speedily enough to avoid these issues?
It's also important to bear in mind that people are browsing full websites
more frequently on mobile devices now. Speed differences between sites
viewed from a mobile are often more noticeable, owing to them having
slower transfer rates than wired connections. Two competing websites with
a 100KB size difference per page can mean more than one second's load time
difference on some slow mobile networks, well into the 'interrupted thought
flow' region specified by Jakob Nielsen. The trimmer, faster website is going
to be a lot less frustrating to browse, giving it a distinct competitive edge over
fatter websites and going a long way towards encouraging repeat visits.
One important feature of most web servers is the ability to serve the HTML
in a compressed format. As HTML by nature contains a lot of repeating data, it
is a perfect candidate for compression. For example, one homepage's
18.1KB of HTML is reduced to 6.3KB when served in compressed format. That's
a 65 per cent saving! Compression algorithms increase in efficiency the
larger the body of text they have to work from, so you will see larger savings
on bigger pages.
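The effect is easy to demonstrate from the command line. The sketch below generates a repetitive sample page (a stand-in for one saved from your own site) and compares its raw size with its gzipped size:

```shell
# Repetitive markup like HTML compresses extremely well: build a sample
# page of 200 identical menu entries, then weigh it raw and gzipped.
for i in $(seq 1 200); do
  echo '<li class="nav-item"><a href="/page">A typical menu entry</a></li>'
done > page.html

raw=$(wc -c < page.html)
gzip -c page.html > page.html.gz
compressed=$(wc -c < page.html.gz)

echo "raw: ${raw} bytes, gzipped: ${compressed} bytes"
```

On real pages the saving is smaller than for perfectly repetitive markup, but the 65 per cent figure quoted above is typical.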
Free speed Open source web page performance grading browser plug-in YSlow
(yslow.org) is based on the Yahoo Developer Network's website performance recommendations
Tooled up There are alternative high quality resources for measuring performance,
such as Google's free web-based PageSpeed Online tool
One-piece Sprite sheets are easy to implement and can offer significant improvements
on page performance by reducing the total number of HTTP requests
Further techniques
There are other, easier to implement techniques that can offer great benefits
to your site's speed. One is to put your JavaScript files at the end of your HTML
document, just before the closing body tag, because browsers have limits on
how many resources they can download in parallel from the same host.
The original HTTP 1.1 specification, written in 1999, recommends browsers
should only download up to two resources in parallel from each
hostname, but modern browsers by default have a limit of around six.
Testing time Pingdoms free tool for analysing the waterfall of your web page helps
break down each resources load time, which can help point out bottlenecks
Break it down Google Analytics has several useful tools and reports inside it that can
help you identify the slowest pages on your website
YSlow is an open source plug-in for all major web browsers (except
Internet Explorer) that can help analyse the speed of your web pages.
It uses Yahoo's rules for high performance websites to suggest ways in
which you can improve a page's performance, and can be downloaded
for free from http://yslow.org.
Yahoo's rules for high performance websites are comprehensive
and regarded as one of the best sources of information on how
to speed up your website, so YSlow is an exceptionally useful tool. It
will quickly point out issues that you may have assumed were set up
properly but were not. For example, you may assume that all your text
content is served gzipped when in fact some elements may not be, or you
may not be using correct expire headers on some resources. YSlow will
also highlight any non-minified CSS/JavaScript files and duplicate CSS/
JavaScript files, as well as a host of other useful and often easily fixed
problems that can have noticeable effects on your websites performance.
Installing and running YSlow is a quick and easy process, and it will
immediately highlight problem areas. However, sometimes you will come
across recommendations from YSlow that are out of your control. For
instance, on our website (www.scirra.com) it recommends we configure
our entity tags (ETags). Unfortunately on IIS it isn't possible to
configure these to meet YSlow's recommendation (and even if it were, the
performance improvement would be so minor that it shouldn't be of too
much concern to you anyway).
If you are using third party plug-ins and scripts, you may also find they
do not appease YSlow fully. You will sometimes have to make judgement
calls on whether finding alternatives or removing these plug-ins
will actually yield performance improvements that make your efforts
worthwhile; more often than not the answer is no.
Spread the load The right-hand image shows how content is distributed on a CDN,
compared with a traditional one-server setup on the left
If your web page has more than six external resources (such as images,
JavaScript or CSS files), it may improve performance to serve
them from multiple domains (such as a subdomain of your main domain
name, or a CDN) to ensure the browser does not hit its maximum limit on
parallel downloads.
Rather than splitting multiple requests onto different domains, you may
instead consider combining them. Every HTTP request has an overhead
associated with it. Dozens of images, such as icons, served as separate
resources will create a lot of wasteful overhead and cause a slowdown on
your website, often a significant one. By combining your images into one
image, known as a sprite sheet, you can reduce the number of requests
required. To display an image, you define an element in CSS with the
width and height of the image you want to show, then set its
background to the sprite sheet. Using the background-position property,
you move the sprite sheet into position so that it appears on your
website as the intended image.
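A minimal sketch of the technique (the file name, icon size and offsets here are illustrative):

```css
/* One HTTP request fetches every icon (hypothetical sheet of 32px icons) */
.icon {
  width: 32px;
  height: 32px;
  background-image: url("sprites.png");
}

/* Shift the sheet left so the second icon in the row shows through */
.icon-search {
  background-position: -32px 0;
}

/* Mouseover state stored one row down on the same sheet */
.icon-search:hover {
  background-position: -32px -32px;
}
```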
Sprite sheets also offer other benefits. If you're using mouseover images,
storing them on the same sprite sheet means that when the mouseover is
initiated there is no delay, because the mouseover image has already been
downloaded.
Situation normal The Chrome Web Store loads a lot of content with Ajax in a way
that feels like a fast, natural browsing experience
With naively implemented Ajax loading, the back and forward buttons,
bookmarking pages or refreshing the page all behave in unexpected ways. When
designing websites it's advisable not to interfere with low level behaviours
such as this: it's very disconcerting and unfriendly to users. A prime example
would be the efforts some websites make to disable right-clicking on
their web pages as a futile attempt to prevent copyright violations. Although
implementing Ajax doesn't affect the operation of the browser with the same
intention as disabling right-clicking, the effects are similar.
HTML5 goes some way to addressing these issues with the History API. It is
well supported in browsers (apart from Internet Explorer, though support
is planned for IE10). Working with the HTML5 History API, we can load
content with Ajax while at the same time simulating a normal browsing
experience for users. When used properly, the back, forward and refresh
buttons all work as expected. The address bar URL can also be updated,
meaning that bookmarking works properly again. If implemented
correctly you can strip away a lot of repeated loading of resources, as well as
providing graceful fallbacks for browsers with JavaScript disabled.
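A sketch of the idea follows. The data-ajax attribute, the 'content' element ID and the ?partial=1 convention are all assumptions for illustration, not part of any particular framework:

```javascript
// Pure helper: derive the URL of the server-rendered partial for a page.
function partialUrl(pageUrl) {
  return pageUrl + (pageUrl.indexOf('?') === -1 ? '?partial=1' : '&partial=1');
}

// Fetch a partial and swap it into the page's content region.
function loadInto(url) {
  return fetch(partialUrl(url))
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('content').innerHTML = html;
    });
}

// Browser-only wiring, guarded so the helpers above stay testable elsewhere.
if (typeof window !== 'undefined' && window.history && window.history.pushState) {
  document.addEventListener('click', function (e) {
    var link = e.target.closest('a[data-ajax]');
    if (!link) return;
    e.preventDefault();
    loadInto(link.href).then(function () {
      // Update the address bar so bookmarking and refresh keep working
      history.pushState({ url: link.href }, '', link.href);
    });
  });
  // Back/forward: re-render the partial for the URL stored in the state
  window.addEventListener('popstate', function (e) {
    if (e.state && e.state.url) loadInto(e.state.url);
  });
}
```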
There is a big downside, however: depending on the complexity and
function of the site you are trying to build, implementing Ajax content
loading with the History API in a way that is invisible to the user is difficult. If
the site uses server-side scripting as well, you may also find yourself writing
things twice, once in JavaScript and again on the server, which can lead
to maintenance problems and inconsistencies. It can be difficult and time
consuming to perfect, but if it does work as intended you can significantly
reduce actual as well as perceived load times for the user.
When attempting to improve the speed of your site you may run into some
unsolvable problems. As mentioned at the start of this article, it's no secret
that Google uses page speed as a ranking metric. This should be a significant
motivation to improve your site's speed. However, you may notice that
resources such as Google Webmaster Tools' page speed reports show
slower load times than you would expect.
The cause can be third-party scripts such as Facebook Like buttons or
Tweet buttons. These can often have wait times in the region of hundreds of
milliseconds, which can drag your entire website load time down significantly.
But this isn't an argument to remove these scripts: it's probably more
important to have the social media buttons on your website. These buttons
usually occupy relatively small spaces on your page, so will not significantly
affect the visitor's perceived loading time, which is what we should primarily
be catering for when making speed optimisations.
For more ways to speed up your site visit: www.netmagazine.com/features/15surefire-ways-speed-your-site
Downsizing A screenshot from the IIS7 web server showing how easy it is to enable compression
CDNs: improve page load times
Dean Hume explains how content delivery networks can drastically improve
the performance of your website by serving data that's closer to your users
Knowledge needed HTTP web concepts, FTP tools
Requires Web browser for profiling (Firebug, Chrome Dev Tools)
Project time 30 mins
OK, so the benefits are apparent, but are large companies the only parties
that can afford to use a content delivery network? The answer's no: CDN
technology is commercially available to all developers and it is highly
affordable. You'll pay only for the file storage space and outgoing bandwidth
that you actually use. I use a CDN for my personal blog, which receives a few
thousand hits a month, and I only pay 30p a month.
Performance gains
The most important part developers can play in enhancing the browsing
experience for users is improving the speed and response times of our
applications. In tests conducted at Yahoo, sites that moved static
content off their application web servers and onto a CDN improved end-user
response times by 20 per cent or more (see http://developer.yahoo.com/
performance/rules.html#cdn). Now, you may not get this level of
improvement, but in my opinion even a performance increase close to
that is worth it.
Steve Souders, head performance engineer at Google, originally coined
the 'performance golden rule'. This states that developers should optimise
frontend performance first, because that's where 80 per cent or more of
the end-user response time is spent. Just think about all the static
components in your web pages: images, style sheets, JavaScript and so on.
If we can improve the performance of these static files, we can
make big gains in terms of users' perception of our sites.
Feeling blue The Windows Azure management portal is an intuitive interface
Canine control This image was used as a benchmark across all the content delivery
networks tested
Tutorials
Business impact
In 2006, Google discovered that shifting from a 10-result page loading
in 0.4 seconds to a 30-result page loading in 0.9 seconds reduced traffic
and ad revenues by 20 per cent. When the Google Maps homepage was
shrunk from 100KB to 70-80KB, traffic rose 10 per cent in the first week,
and an additional 25 per cent in the following three weeks.
Over at Amazon it was found that every 100ms increase in load time
of the www.amazon.com homepage decreased sales by 1 per cent (for
more see www.websiteoptimization.com/speed/tweak/psychology-webperformance). Other major online players have revealed similar results:
There are still some other great CDNs out there worth looking into: check
out CacheFly (www.cachefly.com), EdgeCast (www.edgecast.com), GoGrid
CDN (http://netm.ag/gogrid-232) and Google App Engine (https://
appengine.google.com).
Amazon Cloudfront
Amazon is by far the most popular CDN option out there. The company has
also been creating cloud services for many years and has built up a great set
of products. I have been using Amazon Cloudfront for a few months now
and so far it has been fast, cheap and relatively easy to set up. Again, as with
most CDNs on the market today, Amazon charges only for the content that
you deliver through the network; there's no monthly fee.
Prices compared with the rest of the CDN services are very competitive,
and by default the cheapest storage locations are in both the
US and Europe. I do feel that setting up an Amazon Cloudfront solution
was a little more difficult than the other CDNs: it seemed aimed more at
developers than junior webmasters. However, once set up it was easy to use
and very efficient.
One downside that I noted about Amazon Cloudfront is that there isn't
native support for Gzip. It can be enabled using custom scripts, but it would
Shining example Chrome Dev Tools come bundled with the Chrome browser
and offer developers a great way to investigate and develop applications
be nice if it were offered as part of the package. Customer service also isn't
included by default: you need to pay a bit extra in order to receive this facility.
This could make things tricky if you are a new developer starting out with
cloud tools.
Rackspace CDN
Rackspace comes across as a bit of a dark horse in the CDN department.
Pricing was really attractive (similar to Amazon's Cloudfront), as well as being
simple to calculate and flexible. Rackspace uses the Akamai network as the
base for its CDN service.
Akamai has been around for a long time now, and has a massive global
network with servers deployed in 72 countries. Akamai is also the world
leader in content distribution, boasting 73,000 servers around the world, so
by using this option you get all the benefits of a world-class network
without prohibitive costs.
Functional fowl Cyberduck is a cloud storage browser for Mac and Windows. It
enables you to upload files to Amazon, Rackspace and Google Storage with ease
- Yahoo found a 400ms delay caused a 5-9 per cent decrease in traffic.
- Bing discovered that a two-second delay caused a 4.3 per cent drop in
revenue per user.
- Mozilla made its download page 2.2 seconds faster and was
rewarded with an increase of 15.4 per cent in downloads.
- Netflix enabled Gzip on the server; simply by this single action pages
became 13-25 per cent faster and traffic volume was cut by 50 per cent!
Profiling tools
Most modern browsers come with built-in developer tools enabling you
to see the network usage of your site. I find it extremely useful to use
the dev tools that Chrome has on offer; to fire them up, simply hit F12
in your browser. There is also a Google Page Speed add-on available
that integrates with Chrome Developer Tools. It is very useful for
determining a rating for your site's page speed, and also gives you a
list of areas that you can improve on.
Alternatively, Google offers a web page that enables you to profile
your site without having to install any plug-ins. Try it out at
https://developers.google.com/speed/pagespeed/insights. The site will
also allow you to profile your site for mobile devices.
My other favourite tool is Firebug, a plug-in available for Firefox and,
more recently, Chrome. Firebug has an advanced JavaScript debugger,
as well as the ability to accurately analyse network usage and
performance. Head to http://getfirebug.com to download it.
The set-up was easy and even included an online tool that I could
use to upload files. I also came across a great utility online
called Cyberduck (http://cyberduck.ch). This offers an FTP-esque file explorer
that you can quickly connect to your Rackspace storage account; the tool
can also be used with any Amazon storage account.
In contrast to Amazon, Rackspace automatically enables Gzip compression
for any static file that you upload. The client support for Rackspace also
seems quite impressive: it offers a permanent phone service for client queries.
Response times (milliseconds)

                    Average    Fastest    Slowest
Rackspace CDN       18ms       8ms        47ms
Amazon Cloudfront   37ms       17ms       57ms
Azure CDN           17ms       8ms        59ms

Pricing

Rackspace CDN: $0.10 / GB storage; $0.12 / GB bandwidth out
Amazon Cloudfront: $0.0075 / 100 transactions
Azure CDN: $0.1252 / 100 transactions
Comparing
As you can see from the tables above, there is little to choose between the
three CDNs, and all offered a superb service. But if I were to pick a winner it
would have to be Rackspace: it consistently offered the best response times,
the customer support has been great and the price is very competitive.
Conclusion
It is really easy to get set up with a CDN, and if you made only one change
to your site today, serving your static files from a CDN would improve your
performance significantly. You could even have one CDN account serving
loads of different websites that you work on. In this article, I've reviewed a
few of the different CDN services out there, but whichever one you decide to
go with, your users can only benefit!
Global improvements No matter which CDN you choose, your customers will
benefit from having closer access to static files
Download the files! The files you need for this tutorial can be found at
http://netm.ag/resp-231
design and build phases. They become fiddly to navigate around, or maybe
the fixed width is wider than the user's viewport, making it difficult to zoom in,
pan, zoom out and find what they are looking for.
Frustrating? For sure. But more frustrating as a developer is that these
websites should have been built in such a fashion that they scale down to fit
any viewport size.
Many sites using media queries strip out information, hiding certain aspects
of the site that they deem less important. So the user with a smaller device gets
an easier-to-use website, but with stripped-down content.
But why should I, on a mobile device, not get the same benefits from a
website as a desktop user?
With the help of media queries we can completely customise the layout of
our website depending on screen size, which is great, but do we really need to
supply several adaptations of our site?
And why should we settle for a site that's so badly designed or built that it
can't scale gracefully?
User frustration
Some people believe that it's okay to cut features and eliminate content they
believe is non-essential to the user. But how can you be sure that the
information you are cutting or pushing to a secondary page is not the content
that is most important to me? You can't.
As an example, I was on the Nike Football website on my MacBook,
reading about the football academy they are running with the Premier
League, which I found really interesting; it's one of the main features as you
arrive at the website.
Desktop delights Nike Football's full site features main navigation offering all available options including the feature on
the company's football academy that's visible in the image above
Just padding The iPad version of Nike's site says 'train like a pro', but the desktop
version's football academy article can't be found here at all
However, when I tried to show a friend of mine on my iPhone, I discovered
Nike has its own mobile site, and Nike Football consists of just two options: one
about the latest Mercurial Vapor boots (not interested), and one about the new
technology used on Nike's football shirts (not interested).
I tried my iPad and it was completely different again, and there was still no
sign of the academy information I was looking for.
It's not just Nike that's guilty of this; it's hundreds of sites. And I find it
highly frustrating that I should get penalised for using a different device. I feel
that if content isn't worth showing to a smaller-device user, then it probably
isn't worth showing to anybody.
The first thing we need to understand is that responsive web design isn't just
about mobile: it considers all viewport sizes. And secondly, developing a good
CSS Regions
Adobe's proposal seeks to enable magazine-style layouts to be created
on the web. Find out more at:
l http://dev.w3.org/csswg/css3-regions
l www.css-tricks.com/content-folding
l www.broken-links.com/2011/11/07/introducing-css-regions
responsive website requires more time and effort than just using media queries.
With a vast and growing number of web-enabled devices, it's important to give
your website the best possible chance to facilitate a solid user experience.
We know that by having a responsive site we can use a single codebase. This
is great in that it means we needn't adjust our content for each device. But
many websites hide content deemed unnecessary to mobile users, and there
are two issues with this.
Firstly, it effectively penalises mobile users browsing the website. And
secondly, including a hidden style in our CSS doesn't mean the content doesn't
get downloaded. This can massively affect performance, especially for those on
poor connections.
So perhaps the best way to go about designing a website is to consider
mobile, or smaller devices, first. This way you can focus on the most important
information your site needs to give. And then, if necessary, you can use
conditional loading techniques where your layout grid, large images and media
queries are applied on top of the pre-existing small-screen design.
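One simple way to apply that layering in CSS alone is to declare the lightweight small-screen presentation first, and add the heavier assets only inside a min-width media query. A minimal sketch (the selector, breakpoint and image path here are hypothetical):

```css
/* Small screens get the lightweight default: no large imagery */
#banner {
  background: #333;
}

/* Wider viewports layer the large image on top */
@media screen and (min-width: 600px) {
  #banner {
    background: url(../images/banner-large.jpg) 0 0 no-repeat;
    background-size: cover;
  }
}
```

Browsers generally won't download a background image declared inside a media query that never matches, unlike content that is merely hidden with display: none, which is what makes this approach kinder to users on poor connections.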
The real reason many full websites are unusable on mobile devices is
because they are unusable on any device. If it's designed well enough in the
first place and built correctly, then it should scale up or down gracefully and
effectively. A responsive site doesn't necessarily have to be targeted at mobile
devices; if it's built correctly it doesn't have to be targeted at any particular
device. It will just work. Ethan Marcotte sums it up well in his article
'Responsive Web Design' from A List Apart: "Rather than tailoring disconnected
designs to each of an ever-increasing number of web devices, we can treat
them as facets of the same experience," he writes. "We can design for an
optimal viewing experience, but embed standards-based technologies into our
designs to make them not only more flexible, but more adaptive to the media
that renders them."
The walkthrough
For the purpose of this tutorial I have put together a website that scales
beautifully between large and small screens. You keep all the content at all
sizes. And with the use of media queries I have switched the navigation from
a horizontal display to a vertical display for smaller devices, and given the user
enough padding on the realigned adaptation to work well on touch screens.
One thing that I especially like, when you view smaller-screen versions of
sites where the main navigation fills the screen area, is the ability to skip to the
content you really want using page anchors. Having this appear at the top of
the page helps prevent mobile users from having to scroll down to get to
the main body of content.
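A skip link of this kind only needs an anchor at the top of the page pointing at the main content's ID. A minimal sketch (the IDs are chosen to match the pattern in this tutorial's stylesheet, where #skipTo is shown only on small screens):

```html
<!-- Shown only on small screens: the #skipTo styles handle that -->
<ul id="skipTo">
  <li><a href="#main">Skip to content</a></li>
</ul>

<!-- The page anchor the skip link jumps to -->
<section id="main">
  <h1>Main section</h1>
</section>
```

Because it is plain HTML, the link still works with no JavaScript and no CSS at all, which keeps the enhancement robust.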
From the top Designwoop's '15 detailed responsive web design tutorials' offers
plenty for newcomers to the subject to chew on
</nav>
<div id="banner">
<img src="images/kaws.jpg" alt="banner" />
</div>
</header>
<section id="main">
<h1>Main section</h1>
<p>Lorem</p>
</section>
<aside>
<h1>Sub-section</h1>
<p>Lorem</p>
</aside>
</div>
</body>
</html>
When it comes to the CSS, setting a max-width is a good idea in order to
stop the site scaling across enormous screens, and this won't prevent the
page from shrinking. One main issue when switching from fixed widths to fluid
is images. There is a simple fix for this in your CSS. Just set your images'
width to 100%:
/* Structure */
#wrapper {
width: 96%;
max-width: 920px;
margin: auto;
padding: 2%;
}
#main {
width: 60%;
margin-right: 5%;
float: left;
}
aside {
width: 35%;
float: right;
}
/* Logo H1 */
header h1 {
height: 70px;
width: 160px;
float: left;
display: block;
background: url(../images/demo.gif) 0 0 no-repeat;
text-indent: -9999px;
}
/* Nav */
header nav {
float: right;
margin-top: 40px;
}
header nav li {
display: inline;
margin-left: 15px;
}
#skipTo {
display: none;
}
#skipTo li {
background: #b1fffc;
}
/* Banner */
#banner {
float: left;
margin-bottom: 15px;
width: 100%;
}
#banner img {
width: 100%;
}
Your image will now display at its parent element's full width and will
contract along with it. Just be sure your image's max-width doesn't exceed the
max-width of its container, otherwise it may pop outside. Remember, to use
this method effectively the image must be large enough to scale up to the
size of your largest set viewport.
Using large images can affect load time, so on smaller viewports
where they are unnecessary there is a responsive image method where
Fluid movements Combining a series of grabs (above) enables the impact of transitions between screen sizes to be appreciated fully
you would detect the user's screen size and pull in a smaller or larger image
depending on what was necessary. There are still a few major challenges
with this method, but it is still worth looking into. Mat Marquis, a member of the
jQuery Mobile team, has written a great article on this method in which he
explains the pros and cons: http://netm.ag/respimage-231.
The main reason you may want to switch the navigation is that, scaled
down, it could become unreadable and hard to tap. By using this method,
you are enabling the user to access it more easily. You will also notice in the
code that we have made some changes to the #main and aside sections to
switch them to one column.
/* Media Queries */
@media screen and (max-width: 480px) {
#skipTo {
display: block;
}
header nav, #main, aside {
float: left;
clear: left;
margin: 0 0 10px;
width: 100%;
}
header nav li {
margin: 0;
background: #efefef;
display: block;
margin-bottom: 3px;
}
header nav a {
display: block;
padding: 10px;
text-align: center;
}
}
You will notice on some mobile devices that your website automatically
shrinks itself to fit the screen, which is where we get the issues of having to
zoom in to navigate through fiddly content.
To allow your media queries to take full effect, a typical mobile-optimised site
contains something like the following:
<meta name="viewport" content="width=device-width, minimum-scale=1.0,
maximum-scale=1.0" />
The width property controls the size of the viewport. It can be set to a
specific number of pixels like width=960 or to the device-width value which is
the width of the screen in pixels at a scale of 100%. The initial-scale property
controls the zoom level when the page is first loaded. The maximum-scale,
minimum-scale, and user-scalable properties control how users are allowed to
zoom the page in or out.
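Putting those properties together, a common starting point is to match the device width and leave the zoom level at 100% when the page loads:

```html
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
```

Note that pinning maximum-scale and minimum-scale to 1.0, as in the earlier snippet, also stops users zooming at all, so consider whether that trade-off is acceptable for accessibility before locking the scale.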
As I said before, responsive web design has never been about making sites
for mobile devices. It's about adapting layouts to viewport sizes. Having a
responsive site that adjusts to varying viewports should be the default option. If
you wish to create a mobile version that looks completely different and shows
RWD: retrofit older sites responsively
Download the files!
You can use responsive techniques on older sites as a first step toward better
small-screen experiences. Check your idealism at the door, says Ben Callahan
Knowledge needed Intermediate CSS and HTML, understanding of
responsive techniques
Requires Text editor, web browser, inspector, patience!
Project time Less than an hour
Let's take a look at an existing site. We're going to use the Responsive
Design Twitter account, @RWD, and start to experiment with the grid itself.
Fire up your browser (in my case this is Chrome), head over to
www.twitter.com/rwd, and open the inspector. You should now see something
along the lines of Figure A.
Next, let's drill into the markup a bit. In the body tag, you'll see a div with
an ID of doc. Inside that are two divs, one with an ID of page-outer. Generally, I
start off looking for fixed-width containers.
The #page-outer element doesn't have a width specified in the CSS, so drill
down one level further to the div with an ID of page-container. You'll notice
that this has a width of 837 pixels set in the CSS. We're going to change it to
100%, simply by clicking on 837px in the inspector and replacing it with 100%.
Figure B Changing the width of the page-container div from 837px to 100% using the
inspector in Chrome
its total width is now more than 100% (28px more, to be precise). This
introduces a bit of left/right scrolling in the browser window. We can alleviate
this by adding a new style with the inspector. If you're using Chrome, click the +
symbol (the New Style Rule button) at the top of the Styles palette. If you still
have the #page-container element selected, it will pre-populate the new style
rule's selector with that ID. We're just going to add the box-sizing property and
set it to border-box.
#page-container {
box-sizing: border-box;
}
The box-sizing property (http://dev.w3.org/csswg/css3-ui/#box-sizing) forces any
padding or borders of the element to be laid out and drawn inside the specified
width and height. See Paul Irish's article (http://netm.ag/irish-235) on this
property for cross-browser compatibility and performance concerns.
With this rule applied, you'll see that the browser no longer requires any
horizontal scroll. The 28 pixels of padding (14 pixels on each side of
#page-container) is now counted inside the 100% width, exactly what we need
in this case. Now, let's get those columns flexing a bit.
Inside the #page-container div is a div with a class of dashboard, which
contains the entire left-hand column. Upon inspection, you'll see that it has a
width of 302 pixels specified. 302 divided by 837 gives the relative width that
the .dashboard element took up when the layout was locked at 837 pixels. It's
approximately 36%, so we'll set that in the inspector.
.dashboard {
width: 36%; /* 302/837 = 36ish% */
float: left;
}
Taking the same approach with the right column (with a class of content-main),
which has a width of 522 pixels, gives us about 63%. This leaves us a 1% gutter
between the columns, which looks about right in this layout (see Figure D).
.content-main {
width: 63%; /* 522/837 = 63ish% */
float: right;
}
I love this kind of experimentation because it gives you a very good feel for
what's actually possible. As you can see, within just a few minutes and with
only a handful of styles, we're able to get the Twitter site flexing pretty well.
Obviously, I selected Twitter because it's a fairly simple layout and has clean
markup to work with. You may not be so lucky on your project. Remember, this
doesn't mean you can't be successful! However, you will want to try this kind of
in-browser experimentation before you sign a contract.
We don't generally use grid systems in our HTML/CSS work. However,
there are cases where it makes sense, particularly when you're handing
Figure C The primary container on the Responsive Design Twitter account page,
Figure D Here the Responsive Design Twitter page has been made fluid with the
Chrome inspector
templates off to be managed by another organisation. In scenarios
where you need to use a grid system and you already have a CSS
preprocessor in your workflow, you may be able to use one of the semantic
grid systems out there. Two that we've used and had great success with are
The Semantic Grid System (www.semantic.gs) and Susy (http://susy.oddbird.
net). These tools don't require non-semantic class names (which break down in
responsive web design). Generally, they use mixins or functions to redefine the
widths at various breakpoints.
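As a rough sketch of that mixin approach (Sass syntax; the span helper, column counts and breakpoint are invented for illustration, not taken from either library):

```scss
// Invented helper: turn a column count into a fluid percentage width.
@mixin span($columns, $total: 12) {
  float: left;
  width: percentage($columns / $total);
}

.dashboard {
  @include span(4);   // 4 of 12 columns wide on large screens

  @media screen and (max-width: 480px) {
    width: 100%;      // redefined at the breakpoint, no extra classes
    float: none;
  }
}
```

The point is that the grid maths lives in the stylesheet, so the markup keeps meaningful class names like .dashboard rather than presentational ones like .col-4.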
Images
After spending some time fighting with inline styles, I've landed on a few tricks
to help you out. In cases where width and height attributes are specified, you
can actually override these with a simple width or height declaration in your
CSS. If you are dealing with inline styles, you can always use the !important
keyword in CSS to override these declarations. Obviously, be careful where
and when you do this, but it does work.
Another trick is to use min-width and/or max-width instead of !important.
Perhaps you want to set an image to fill 100% of its container and to maintain
its aspect ratio. You can do this even if there are inline styles specified, by
setting the min-width and/or max-width to 100%.
<div class="column">
<img src="/i/image.png" alt="alt text" style="width: 200px; height: 100px;">
</div>
In a scenario like this, you could force the image to be flexible by using both
min-width and max-width, like so:
Take it further For more on how to apply media queries in any project check out
the W3C Recommendation for Media Queries (www.w3.org/TR/css3-mediaqueries)
.column {
width: 50%;
}
Figure E Data tables can prove a headache in RWD. Above is the starting point for
our table example, as viewed in Chrome
Essentially, the min-width and max-width rules combine to force the image
to 100%, regardless of the inline widths. Because the height is also specified
inline, we still need to override that with an !important keyword to maintain
the aspect ratio.
This seems to work really well in all modern browsers. IE8 and older give a
bit of trouble with the auto rule on the height. The real point of this exercise is
to encourage the level of exploration that's needed for retrofitting. It's not until
you attempt something you've never done with CSS that you begin to realise
how powerful it can be.
Check out the /Images folder to see some tests I've been running to override
inline styles with CSS. These are by no means exhaustive, so make sure you
combine overriding efforts like this with a healthy dose of testing.
Tables
Tables of data are always a challenge in responsive web design. Particularly
in retrofitting, where you often can't touch the markup, they can make for a
difficult time. Let's look at an example. Here is a pretty standard table, with
some attributes specified to apply style (see Figure E):
<table border="0" bgcolor="#eeeeee">
<thead bgcolor="#000000" style="color: #fff">
<tr>
<th width="100"></th>
<th width="81">Today</th>
<th width="81">Sep 28</th>
...
<th width="81">Jun 28</th>
</tr>
</thead>
<tr>
<th width="100">11 Payments</th>
<td width="81">$27.00</td>
<td width="81">$18.00</td>
...
<td width="81">$18.00</td>
</tr>
...
</table>
This table represents a list of payment schedules, and it's based on an actual
retrofit project that we worked on at Sparkbox. At the end of a series of
questions, this was presented to the user in a modal dialog. Initially, I thought
there was no way. After a few minutes in the inspector, I was able to get this
responding fairly easily.
Here are just a few styles that shift the table around to make it much more
digestible on small screens:
/* make browsers include padding and border inside the width */
* {
-moz-box-sizing: border-box;
-webkit-box-sizing: border-box;
box-sizing: border-box;
}
/* table cells get "display: block" and "float: left" */
th, td {
display: block;
float: left;
text-align: center;
border: 0;
border-bottom: 1px solid #aaa;
}
/* the far left column will be full-width and called out */
th {
width: 100%;
background-color: #000;
color: #fff;
}
.column img {
min-width: 100%;
max-width: 100%;
height: auto !important;
}
Figure F After applying a few styles, we have a table that is much more manageable at
smaller viewport widths
Wrap up
Remember, our focus with a retrofit is really on the user. We're using the power
of responsive CSS techniques to quickly create a better experience. This isn't a
long-term solution, but there can be real benefits to a phased approach.
Thanks to Stephanie Rieger (@stephanierieger) for her peer review of this tutorial
Web pro
Contents
Search: Semantic search; Optimising web pages; Inconsistent data?; 15 post-Penguin tips; Backlinking top tips; Testing times; Conversational search; Google AdWords
Analytics
Social
Marketing: Using infographics; Inbound marketing (why the term isn't relevant); Remarketing
Blogging
Site migration: Migration without migraines
Search
Recommended
Good examples
of semantics
Name Siri
URL http://netm.ag/siri-230
Info Talk to Siri as you would to a person. Say something like, "Tell my
wife I'm running late", "Remind me to call the vet" or "Do I need an
umbrella?" and Siri answers you. This is a perfect example of semantics.
Use it to optimise your site!
Semantic search
One of the hardest things for most website optimisers to come to grips
with is the idea of using semantics when optimising their pages.
In normal everyday conversation we all skip
over the details of what other people say and
assume that they mean one thing when they
actually say another.
Semantic search tries to understand the
searchers intent and the meaning of the query
rather than parsing through keywords like a
dictionary. Currently the search engines give you results based solely on the
text and the keywords that you put in that search. Essentially, they give you
their best guess.
Semantic search will dive into the relationship
between those words, how they work together,
and attempt to understand what they mean.
When people search, they are trying to answer a question; they just type a
truncated version of that question. So far, keyword
research has been largely data-driven around
the popularity of the keywords/phrases in their
question. Keyword research in semantic search
focuses on what that person actually means
when searching for that keyword.
What is a car?
Where can I buy a new car? A used car?
How do I drive a car?
What are the latest cars?
Name Search Engine Land
URL http://searchengineland.com/library/google/google-penguin-update
Info Great article collection on the latest Google update.
Name SEOmoz
URL http://moz.com/blog/how-wpmu-org-recovered-from-the-penguin-update
Info An article about how to recover from the Penguin update.
Google's Penguin update has seen the use of traditional SEO shortcuts take
a huge hit. Originally, when search engines were just beginning to take off, you
could simply hammer in the right keywords and generate loads of backlink
traffic for your website, ranking above others regardless of the quality of the
service on offer.
The purpose of the Penguin update is to develop a better means of
determining relevancy for every search. By recognising organic content, real
sites get rewarded, affirming the golden rule of SEO: content is king.
The importance
of unique visits to
your sites and people
going places knowing
what they want and
how they can get it
Useful SEO
resources
Name Google Conversion Optimizer
URL http://netm.ag/googleoptimiser-246
Info A fantastic tool for increasing the performance of your landing pages
and your keywords.
Name Maxymiser
URL www.maxymiser.com/landing-page-optimization
Info If you want more robust A/B testing software, Maxymiser is a fantastic
service that will increase the performance of your website by some way.
15 post-Penguin
backlink tips
Expert Advice
Name Stephen Lock
Job title Former SEO product
marketing manager
Company Analytics SEO
URL www.analyticsseo.com
Useful SEO
resources
Asynchronous JavaScript and XML (Ajax) is used to create more dynamic
websites. It makes requests back to the server to update the content without
reloading the page.
The problem
When a search engine robot visits a web page to index the content, it
doesn't click on the links and buttons like an ordinary visitor. Instead, it notes
the URLs associated with each page, visits them individually and then
indexes them.
Ajax wants its pages to be minimal, which is an approach opposite to that
of the search engines.
Solution number one
First, you should degrade your pages to normal flat mark-up. This is
important for non-JavaScript-capable browsers and search engines. When you
use the Ajax call, make sure that the page has the same content
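A sketch of that graceful-degradation pattern: the link works as a plain page request, and JavaScript, where available, intercepts it to fetch the same content with Ajax (the URL and IDs here are hypothetical):

```html
<!-- Works without JavaScript: a normal link to a real, indexable page -->
<a id="more-link" href="/articles/page-2.html">More articles</a>
<div id="articles"></div>

<script>
document.getElementById("more-link").addEventListener("click", function (e) {
  e.preventDefault(); // enhance: load the same URL in place instead
  var xhr = new XMLHttpRequest();
  xhr.open("GET", this.href);
  xhr.onload = function () {
    document.getElementById("articles").innerHTML = xhr.responseText;
  };
  xhr.send();
});
</script>
```

Because the href points at a real page, search engine robots and non-JavaScript browsers follow the link normally, while capable browsers get the dynamic behaviour.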
Solution
number two
Second, you can use
Ajax in minimalist
fashion, so search
engines can see the
optimised content,
while at the same
Useful SEO
resources
Name My Blog Guest
Info Guest blogging is one of the most natural methods of obtaining
backlinks. Join My Blog Guest and submit your posts to other blog owners.
www.myblogguest.com
Post-Penguin
link building
As most of you will know all too well, Google's Penguin update took the
online world by storm. Websites that had historically ranked well suddenly
found themselves nowhere at all on SERPs, despite having never touched
black-hat SEO techniques. With this in mind, many webmasters and SEOs
are unsure as to which SEO techniques are safe, and what needs to be avoided.
Here are some techniques and opportunities that are safe to use in
post-Penguin times, which won't see webmasters receiving unnatural link
warning messages from Google.
Always seek out natural linking opportunities. Ask yourself whether the link
you're placing exists simply to pass on PageRank or whether it adds to the
purpose of where it's placed. For example, an informative guest blog post on a
relevant website with a link to your website is considered natural and safe to
use, as it's adding related resources within a useful article. If a link
Name PR Web
Info Submit and distribute your press release through PR Web for your
website to obtain authoritative backlinks from a range of trusted sources.
It's also searchable.
http://uk.prweb.com
Name Google SERP Snippet Optimisation Tool
Info Ensure your title tags and meta descriptions not only look great but
display properly on the search engine results pages with this free tool that
simulates Google's SERPs.
www.seomofo.com/snippet-optimizer.html
Expert advice
Name Rory Lofthouse
Job title SEO consultant
Company Bronco
URL www.bronco.co.uk
Title tags
Title tags are now
limited by pixel width
Heading tags
Optimising <H> tags is
often overlooked or
wrongly implemented.
Ideally you should only
have one <H1> tag per
page, but Google will
value your site equally
Descriptions
Your meta description
is a chance to promote
your company. Google
Content
Content is king and
everyone knows it, but
people still feel the
need to copy content
SEO
troubleshooting
resources
Name Common Technical SEO Problems and How to Solve Them
Info A good overview of common technical problems SEOs face, including
canonicalisation problems, indexed 404s and improper redirects.
http://netm.ag/common-235
a higher presence in universal search than ever before. Done right, optimising
images can be fairly quick and easy, and significantly affect site rankings, too.
When linking to an image, use descriptive anchor text. Avoid generic terms
such as 'image' or basic filenames that do not provide search engines with
information that's meaningful.
4 Surrounding content: A page that contains at least 200 words of quality,
relevant content surrounding an image helps define it and is a key component
in getting it ranked. Captions describing it are also beneficial: they're one of
the most-read pieces of content on a site and offer search engines further
information.
Making images a bigger part of your SEO strategy is a wise investment that
can produce significant results without a major commitment of time.
Recommended
SEO
resources
Name Inbound Marketing is Incomplete Marketing
Info Raven Tools co-founder Jon Henshaw says that good marketing requires
marketing to all stages of the consumer decision journey.
http://squawk.im/industry-news/inbound-marketing-incomplete
Don't call me an inbound marketer
Someone recently asked me if Resolution will be adopting the term
inbound marketing to describe what we do. SEOmoz and others have
popularised the term, with people like Rand Fishkin (http://moz.com/rand)
arguing that we can't just be SEOs anymore. Here's why I won't be adopting it:
We already have a word for marketing with explicit or implicit permission.
It's called permission marketing, and Seth Godin (www.sethgodin.com/sg)
popularised it as far back as 1999. Inbound marketing is a synonym with one
less syllable. There doesn't seem to be any advantage to anyone but HubSpot
(www.hubspot.com) and SEOmoz in calling it inbound marketing rather than
permission marketing.
I agree with Rand Fishkin's recent video that SEO touches a number of
different fields (http://netm.ag/seo-243). I don't agree that this is a new
phenomenon. SEOs, especially enterprise SEOs, have always had to
understand related fields
Expert advice
Name Matt Ballek
Job title YouTube strategist
and video SEO
Company VidiSEO
URL www.vidiseo.com
3 Annotation
On YouTube, add an annotation to your selected video and set the link type
to Associated Website. Then, simply paste your URL complete with tracking
string. Now any clicks on that annotation will appear in Google Analytics, so
you can see how these visitors interact with your site or track their movement
towards conversion goals.
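As an illustration of such a tracking string (the domain and parameter values here are made up), a campaign-tagged URL typically looks like:

```
http://www.example.com/?utm_source=youtube&utm_medium=annotation&utm_campaign=spring-launch
```

Google Analytics reads the utm_source, utm_medium and utm_campaign parameters to attribute the visit, so each annotation can carry its own values and be compared in your reports.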
Optimisation
resources
Name Disavow Links Tool
Info Google's tool for cleaning up a spammy link profile should only be used
by websites that are having problems getting bad links taken down.
http://netm.ag/disavow-bz92
Here's how we do it at Resolution Media:
1 Use tools such as Open Site Explorer (www.opensiteexplorer.org),
BrightEdge (www.brightedge.com) or Majestic SEO (www.majesticseo.com) to
identify relevant websites that might be interested in linking to your site. Use
brand keywords to find who's already talking about
4 Provide value or go away. Ask yourself what you can offer the webmaster
that would be of value to them. If the answer is nothing or money, you're
wasting not only your time, but the webmaster's too. If you provide value over
time, you will most likely get a link from the site owner without having to
request it.
SEO integration
resources
Name Structuring an SEO Project
Info OMD UK SEO director Sam Crocker explains how to integrate SEO into
the web design process, proposing ways to structure SEO beyond the
retainer model.
http://netm.ag/crocker-238
Name How SEO Can Work with Content Strategy
Info SEO veteran Lee Odden explains the common ground between SEOs
and content strategists.
http://netm.ag/odden-238
Name SEO Webmaster Tools Help
Info Google's official SEO guide says: "A great time to hire [an SEO] is when
you're considering a redesign, or planning to launch a new site", and lists
specific points SEOs can help with.
http://netm.ag/seodef-238
The SWFObject method (http://code.google.com/p/swfobject) makes it
possible to embed Flash content in a standards-compliant way. But improper
copying, or failure to update the alternative content in the container div, can
mean this becomes out of line with the content within the Flash movie.
SEO
resources
Name Tools for Pulling Rank from SMX Advanced
Info Michael King's tools presentation from SMX Advanced 2012 is almost a
year old, but the tools he presents are still useful to SEOs.
http://netm.ag/king-240
Seven essential
tools for SEOs
There are so many essential tools for the practice
of SEO. Here are seven more tools to add to your
SEO toolbox:
URI Valet (http://urivalet.com) Checking server headers is essential for
diagnosing a number of crawling and indexing issues. This is the best server
header checker that I've seen. It has more user agents than most, including
mobile user agents for mobile SEO.
Feedly (http://feedly.com) Knowledge management is a task in SEO that can
separate the casual practitioner from the SEO expert. Many SEOs use social
networks such as Twitter for knowledge sharing; I personally find it too noisy.
With Google Reader discontinued, Feedly can handle both RSS and
social feeds.
Screaming Frog (www.screamingfrog.co.uk/
seo-spider) If Xenu were created with SEO in
mind, Screaming Frog would be the result. Easy
exclusions, greater flexibility and sitemap
Name 78 Resources for Every Internet Marketer's Toolkit
Info These aren't exclusively SEO tools, but this big list is fairly exhaustive
and is organised by function and clearly labelled as free or paid-for, easily
getting you to the most valuable tools.
http://netm.ag/davies-240
Expert advice
Name Sam Crocker
Job title Digital director
Company OMD UK
URL www.samuelcrocker.com
Although microformats and RDFa may be supported, Google recommends
using microdata, so this would be the best long-term implementation from a
rich snippet perspective.
Before you implement this mark-up, check which content types are currently
supported by rich snippets (updated list from Google: http://netm.ag/
richsnippets-240), plus the additional requirements to get Google to display
authorship information: http://netm.ag/authorinfo-240.
Visit http://schema.org
to see hierarchy,
schema types and
other documentation.
Implement microdata
onsite or use Googles
Data Highlighter
(http://netm.ag/
highlighter-240) if
onsite HTML
implementation is
not feasible.
Test the
implementation
with the Rich Snippet
Testing Tool (http://
netm.ag/
richtesting-240).
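For illustration, here is a minimal microdata sketch of the kind of mark-up these steps produce, in this case a product with a review (names and values are hypothetical; check schema.org for the current vocabulary):

```html
<!-- A product page fragment marked up with schema.org microdata.
     The itemscope/itemtype attributes declare the entity types;
     itemprop attributes label individual properties. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <div itemprop="review" itemscope itemtype="http://schema.org/Review">
    <span itemprop="author">Jane Doe</span> rated it
    <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
      <span itemprop="ratingValue">4</span>/<span itemprop="bestRating">5</span>
    </div>
  </div>
</div>
```

Run mark-up like this through the Rich Snippet Testing Tool before relying on it; Google only displays snippets for supported types.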
Top tips
Results may vary. For
many schema types
(such as reviews),
this can lead to a
considerable uplift
in traffic compared
with ranking in a
similar position
without rich
snippets displayed.
Displaying some
information (such as
price or stock) in the
SERPs may actually
reduce CTR but
increase conversion.
Always consider the
user experience.
If you are showing
reviews on your site,
try to implement the
mark-up on a
product page from
which a purchase
can be made.
Keep information
up to date to keep
users happy.
Web pro
Search
Conversational
search links
Name Google's
Impressive
Conversational Search
Goes Live on Chrome
Info Danny Sullivan
explains what it is and
why he thinks it really
is one of those
significant changes.
http://netm.ag/
search-244
Conversational
search and SEO
If you use Chrome, you now have the
ability to speak your search terms
instead of typing them. What, if anything,
does conversational search mean for SEO?
Now Google has the ability to answer more
conversational queries, searchers may begin
speaking to it with more natural language
queries (eg 'Tell me the name of the best
restaurant in Hell's Kitchen in New York'), which
are generally longer and more complex. If you're
relying on a few high volume keywords for the
bulk of your revenue, consider long-tail
keywords. Companies like Bloomreach (www.
bloomreach.com) create pages for long-tail
queries to help with conversational search.
What keywords bring up direct answers?
If you're depending on exact-match domains
like WhatTimeIsItInLondon.com to drive your
business, this is another nail in your coffin.
Where possible, Google's conversational search
continues the trend of providing users with the
Name Google
announcement on
conversational search
Info The official
announcement on
conversational search by
Google's Amit Singhal.
http://netm.ag/amit-244
Name Implications of
conversational search on
SEO from 360i
Info 360i's group
director of SEO, Mike
Dobbs, provides his take
on how conversational
search will affect
SEO, noting how
conversational search
may further complicate
the analytics blackout
that secure search has
given marketers.
http://netm.ag/360i-244
1 Claim a location
To ensure you show up
for relevant local
search queries, make
sure that the full
Name, Address and
Phone number (NAP)
of your business
appears together in
static HTML. See the
coding information in
Schema.org/LocalBusiness.
Claim your local Google+
Page at
www.google.com/business/placesforbusiness.
If
you operate a multi-
location business, be
sure that each location
has its own dedicated
location page on
which this NAP
appears. Then claim
all of your locations
at Google Places for
Business. During the
claiming process, use
the individual location
pages as the URLs
associated with each.
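As a sketch, the NAP block in static HTML might look like this, using schema.org/LocalBusiness microdata (the business details are hypothetical):

```html
<!-- Name, Address and Phone (NAP) together in static HTML,
     labelled with schema.org/LocalBusiness properties -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Dry Cleaners</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 High Street</span>,
    <span itemprop="addressLocality">Bath</span>
    <span itemprop="postalCode">BA1 1AA</span>
  </div>
  Tel: <span itemprop="telephone">01225 000000</span>
</div>
```

For a multi-location business, repeat a block like this on each location's dedicated page.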
2 Be a local brand
To increase the
likelihood that Google
will show a Knowledge
SEO
resources
Name How to use the
Bing site move tool
Info We need to give
Bing a lot of credit for
providing a useful tool
in its webmaster tools
before Google. This one
allows you to let Bing
know which new URLs
are associated with
which old ones, even
without 301s in place.
http://netm.ag/bing-242
Site migration
without migraines
I was fortunate to be one of the experts
at SES New York (http://sesconference.
com/newyork), which is part of a series of
events for search and social marketing. I was part
of the conference's Meet the Experts: Roundtable
Forum, talking webmasters through the perils
of site migration along with Eric Enge of Stone
Temple Consulting.
If you don't account for SEO when developing a
new site, the worst case scenario is losing all of the
historical data that the search engines depend on
to rank your site properly in search results.
Over time, links and shares accumulate against
pages' URLs on the web, and if you don't have a plan
for how that equity will be transferred, you could
find yourself starting over. However, there's an
easier way: keep your URLs.
So many times, URLs are changed arbitrarily: for a
new CMS, or for tracking purposes, or because the
new developers want to start over entirely and
don't consider SEO or the business consequences.
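When URLs do have to change, one-to-one 301 redirects are the standard way to pass that accumulated equity to the new URLs. A minimal Apache mod_alias sketch, with hypothetical paths:

```apache
# Map each old URL to its new equivalent with a permanent (301) redirect
Redirect 301 /old-products/widget.html /products/widget
Redirect 301 /about-us.php /about

# Avoid blanket-redirecting everything to the homepage: equity is
# passed page to page, not site-wide
```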
Name Website
migration tips for SEOs
Info An article by
iCrossing's Modesto
Siotos on site migration,
including a full process
for webmasters and
SEOs to follow.
http://netm.ag/
seomoz-242
Name How to avoid
SEO disaster during a
website redesign
Info An article by Glenn
Gabe includes case
studies of companies
that didn't take SEO
into account
before launching.
http://netm.ag/
disaster-242
Expert advice
Name Tyson Braun
Job title SEO supervisor
Company Lowes
URL www.lowes.com
1 Unobtrusive
JavaScript
If you're using
JavaScript, separate it into
behaviour and content
layers so text-only
browsers and search
engines can read the
content presented
without disrupting the
basic HTML.
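As a sketch, the content layer stays in plain HTML while the behaviour layer attaches to it from script (the id is hypothetical):

```html
<!-- Content layer: readable by search engines and text-only browsers,
     and works even with JavaScript disabled -->
<a id="help-link" href="/help">Help</a>

<!-- Behaviour layer: enhances the link only when JavaScript runs -->
<script>
  document.getElementById('help-link').onclick = function () {
    window.open(this.href, 'help', 'width=400,height=300');
    return false; // suppress normal navigation when the popup opens
  };
</script>
```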
2 Semantic HTML
Wherever possible,
use HTML markup that
conveys semantic meaning
rather than presentation-only
signals, so content displays
sensibly even without
complex CSS styling. This
also improves your
text-to-code ratio, letting
search engine bots crawl
your pages more quickly.
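As a quick illustration of the difference:

```html
<!-- Presentation-only: says nothing about what the content is -->
<div class="big-red-text">Summer sale</div>

<!-- Semantic: tells bots and text-only browsers this is a heading -->
<h2>Summer sale</h2>
```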
5 Flash as an
enhancement
If you're utilising Flash,
build it into your page
as an enhancement to an
HTML or HTML5
experience. This will
allow the rich
experience of Flash
Analytics
Q&A
Name Peter ONeill
Job title Founder
Company L3 Analytics
Web www.l3analytics.com
Inconsistent data in
Google Analytics?
When Google introduced the premium
version of Google Analytics (GA), it was
more aimed at implementing reliability
rather than providing an upgraded feature set.
It's no secret that one of the problems plaguing
the web analytics community is the lack of human
resource (decent web analytics practitioners). In
bringing out the premium version of GA, Google
was aiming to alleviate some of this anxiety by
providing paying customers with, among other
things, a service that guaranteed 24-hour support
to deal with issues relating to their product. (A
short trawl around some of the forums will confirm
that GA, like pretty much all other web analytics
tools, is not without its faults.) An additional part
of the service included what appears to be a more
robust data set as a result of zero sampling.
Sampling is an issue that presents the rest of
the non-premium GA user population with a bit
of a problem relating to accuracy and consistency.
Google sells the benefits of sampling in GA from
the standpoint of speed, although for long-term
users this may be a debatable point. Users are now
able to adjust the level of sampling themselves,
but the degree to which they can access a full data
set varies depending on the report being run and
[Chart: methods used for conversion optimisation, by percentage of respondents - A/B testing, competitor benchmarking, copy optimisation, segmentation, abandonment email, usability testing, multivariate testing; values range from 15% to 46%]
Q&A
Name Dan Barker
Job title Independent
ecommerce consultant
Web www.barker.dj
[Chart: 'What are the most exciting digital-related opportunities for your organisation in 2013?' - percentage of respondents, 2012 vs 2013]
Analytics
Q&A
Name Tim
Leighton-Boyce
Job title Google
Analytics and
ecommerce consultant
Remarketing with
Google Analytics
Imagine a person walks into your store,
has a good look around, picks out an
item, gets to the cash till and is all ready
to pay, but decides they want to do some last
minute shopping around first and walks out; that
would be disappointing. The first thing you'd
want to do is get them back in. If you were the
owner of an online store, getting that customer
back would be much easier using Remarketing
with Google Analytics (http://netm.ag/ga-bz92).
Last year, Google Analytics (GA) announced its
new remarketing option, which is like an enhanced
version of AdWords remarketing. Making it work
is pretty simple. First, ensure your AdWords and
Google Analytics accounts are linked. If you're,
in the parlance, a 'joined-up marketer', then this
should already be done. Second, add a line of
code to the standard Google Analytics tracking
code on every page of your site. This brings in the
DoubleClick cookie that's needed for remarketing.
You can find the requisite line of code in GA's
help centre (http://netm.ag/help-bz92). Third,
amend your privacy policy to reflect the use of GA
remarketing. Find guidelines in the help centre.
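At the time, that change amounted to pointing the standard asynchronous tracking snippet at Google's DoubleClick-enabled script (dc.js) instead of ga.js. This is a sketch of the documented variant, so do check the help centre for the current instructions (the UA property ID is a placeholder):

```html
<script>
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-Y']); // your GA property ID
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript'; ga.async = true;
    // Remarketing: load dc.js (which sets the DoubleClick cookie)
    // in place of the usual .google-analytics.com/ga.js
    ga.src = ('https:' == document.location.protocol ? 'https://' : 'http://') +
      'stats.g.doubleclick.net/dc.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```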
Now, for the interesting bit. In the admin menu
for your Google Analytics account, you'll find a tab
[Chart: 'How does your organisation regard the following challenges for 2013?' - respondents answering 'on the radar' or 'not an issue', 2012 vs 2013]
Q&A
Name Anna Lewis
Job title Digital
marketing executive
Company Koozai
Web www.koozai.com
Twitter @Koozai_Anna
Conversions
are changing
which is a legitimate thing to do; and four if you
consider one and three as part of the whole.
It wouldn't be unreasonable to work on the
basis that if each step in the conversion journey
were optimised to its maximum using a rigorous
testing schedule (at least for the digital steps in the
journey), then this will act as the rising tide that
floats all [customer] boats in the context of driving
improved performance. In addition, the issue of
visitor analytics still has a slight European-shaped
cloud hanging over it, so it's not yet clear how
much we will really be able to rely on it.
In truth, it doesn't matter how smart the
analysis and insight is: if the resource to act on it
isn't available internally or externally via agencies
and consultants, then progress won't be made.
It's surprising how many organisations aren't
set up to move quickly to take advantage of the
output they get from their web analytics and other
data. This is where a startup mentality must be
adopted in organisations of all sizes. As everybody
proclaims, the pace of change online is both rapid
and relentless. To know that and not structure
human resources accordingly is baffling. l
[Chart: percentage of respondents by campaign objective - brand awareness, increased footfall, product launch/release, registrations, site traffic, sustained in-market presence]
Marketing
Q&A
Increase: 68%
Decrease: 2%
Stay the same: 18%
No plans to utilise: 12%
Source: www.socialmediaexaminer.com/
SocialMediaMarketingIndustryReport2012.pdf
Q&A
Name Paul Roberts
Role Motion/
graphic designer
Company
Harmonix Graphics
URL www.behance.
net/harmonix
Using infographics
as a marketing tool
Start by submitting your infographic to free
showcase websites like Visual.ly, Infographic File,
Infographics Only and Visual Loop. Getting featured
on these services will get you a high-quality link to
your website, and will also serve to drive visitors.
Next, publicise it on social sharing sites like
Tumblr, Pinterest, Reddit and StumbleUpon, and get
ready to tweet about it and put it on your Facebook
page. Now you are going to really pump up the
marketing action. Have a look around, and find
other infographics that have covered a similar topic.
Use a backlink research tool like Open Site Explorer,
and discover sites that have referred to these
infographics. Get in touch with these site owners,
and let them know about your new infographic.
And finally, make sure your influential contacts
know about your infographic. Drop them an email
and invite them to use it on their own blogs! l
A powerful tool in
your marketing arsenal
[Chart: chart types used in infographics - pie chart 24%, pictorial chart 24%, line chart 32%, bar chart]
*Source: http://ivancash.com/Infographic-Infographic
THE AWARD-WINNING
Computer Arts readers know design matters. That's why we've completely reinvented our digital edition as a fully interactive iPad experience with impeccable usability. There's also additional interactive content, such as image galleries and bonus videos, which bring the motion content featured in the magazine to life.
TRY IT FOR FREE TODAY WITH OUR NO-OBLIGATION 30-DAY TRIAL AT http://goo.gl/sMcPj (UK) or http://goo.gl/aib83 (US)
Testing times
Q&A
Name Steve Durr
Role MD
Company This Is
Kode Limited
URL http://thisiskode.
com
Testing times
[Chart: website elements tested, 2011 vs 2012 - call to action buttons, page layout, copy, navigation, checkout process, promotions and offers, images, product selection process, security fields; values range from 11% to 71%]
Source: http://econsultancy.com/uk/reports/conversion-rate-optimization-report
Marketing
Q&A
Name Ben Wood
Role Online marketer
Company
Hallam Internet
URL www.hallaminternet.com
[Table: typical bounce rates by industry - simple landing pages with one call to action such as 'add to cart': 70-90%; content websites with high search visibility (often for irrelevant terms): 40-60%; other ranges shown: 10-30%, 20-40%, 30-50%]
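For reference, bounce rate is simply the share of sessions that view a single page before leaving; a minimal sketch of the calculation:

```javascript
// Bounce rate (%) = single-page sessions / total sessions * 100
function bounceRate(sessions) {
  if (sessions.length === 0) return 0;
  const bounces = sessions.filter(s => s.pageviews === 1).length;
  return Math.round((bounces / sessions.length) * 100);
}

// e.g. three of four sessions viewing a single page
console.log(bounceRate([
  { pageviews: 1 }, { pageviews: 1 }, { pageviews: 1 }, { pageviews: 4 }
])); // 75
```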
Google
AdWords
The cons
l Reporting of performance by device will now be
a lot more time consuming.
l We'll be able to view tablet performance
individually, but we won't be able to optimise
it. If we want to increase visibility on tablet
The pros
l Fewer campaigns to manage; three device
campaigns will become one.
l We still have the ability to adjust mobile bids.
l We can now adjust bids by location, allowing us
to be more visible to customers nearer to our
offline business.
l We can now see performance by each individual
site link in AdWords, allowing PPC managers to
use these insights to improve engagement.
l We can schedule site links by time and day, and
assign specific site links to mobile devices.
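As a sketch of how these bid adjustments combine (they stack multiplicatively; the figures are hypothetical):

```javascript
// Effective bid = base bid x (1 + adjustment/100) for each
// applicable adjustment, e.g. a mobile modifier and a location modifier.
function effectiveBid(baseBid, adjustmentsPct) {
  return adjustmentsPct.reduce(
    (bid, pct) => bid * (1 + pct / 100),
    baseBid
  );
}

// -25% on mobile, +15% near the store: 1.00 * 0.75 * 1.15 = 0.8625
console.log(effectiveBid(1.0, [-25, 15]));
```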
Social
Creating good social content
Social influence
measuring tools
Name Klout
Info Loved and hated
in equal measure, Klout
frequently elicits
negative comments due
to being less than
transparent about its
scoring method. It can
be useful as a research
starting point, but isn't
to be taken too seriously.
(www.klout.com)
Creating good
social content
As content is so vital to successful social
media engagement, you want to make it
more likely that your content is engaged
with and shared online. Here's how to effectively
create and use visual and audio content.
It's clear that we live in the era of the visual
web. Human beings are hardwired to respond to
visual cues. We can specifically see this at work on
Facebook, where its algorithm, EdgeRank, gives
more weight (and therefore priority in the news
feed) to photographs over pretty much any other
type of content. So how do you go about creating
photos that people will be more inclined to share?
Well, assuming you're not a brilliant professional
photographer, there are a couple of points to think
about. Firstly, try to develop your own style; on
Instagram, the most popular users tend to have
a style that doesn't change wildly from photo
to photo. Secondly, there has been an explosion
in popularity of photo memes. Come up with,
or contribute to, a good one and you could find
Name Peerindex
Info Does a similar job
to Klout. Includes useful
lists of people within
different topics. Like
Klout, it also runs perks
for those with
scores that measure up.
(www.peerindex.com)
Name Kred
Info Created by the
team that run social
analytics tool
PeopleBrowsr, Kred is
more open about the
factors it takes into
account with its scoring.
(www.kred.com)
3 Impersonation is
not always flattering.
Ask the tweeter who
impersonated Wendi
Deng. You could face a
libel action, or one of
passing-off, and could
even face a fraud claim.
4 Content can be
shared on the terms of
use of most platforms,
but content that
contains an extract
from a copyright work
could lead to a claim for
copyright infringement.
Tools to travel
back in social
media time
Name Timehop
Info Timehop collates
updates from various
social media platforms
from one year ago and
sends them to you in a
daily email. It's
surprisingly rewarding to
be reminded of what you
were doing.
www.timehop.com
Targeting
the customer
Very few people would argue with
the assertion that smartphones have
revolutionised our behaviour as
individuals and consumers. By 2016, the number
of UK users is expected to double to 41.9m.
Marketers have long understood the importance of
optimising websites for mobile but the Social Local
Mobile (or SoLoMo) phenomenon means there
are a whole host of other opportunities to connect
with customers.
On one level SoLoMo is about serving up
useful information, offers or discounts based on
the physical location of an individual. This may be
as a result of local search: for instance, I search
for the nearest drycleaners on my mobile and the
site gives me the details of the nearest branch to
my physical location. Or I check in on Foursquare
and am notified of a nearby restaurant offering
a special discount that day, a restaurant that also
happens to have been recommended by several
of my friends on Foursquare. That's SoLoMo in
action. The concept has been around for a few years
now, but the technology and associated apps are
Name OhLife
Info Appealingly simple
to use, OhLife sends a
daily email asking how
did your day go?. Reply
with as much or as little
as you like, and you end
up with a neat life diary.
www.ohlife.com
Name Momento
Info This beautifully
executed iPhone app
helps capture your life as
you go along. It collates
updates from many social
networks; you can add
your own updates, and
its also a useful way to
search back through old
tweets by date.
www.momentoapp.com
Expert advice
Name Julian Ranger
Job title CEO
Company SocialSafe
Web www.socialsafe.net
As internet users,
we are increasingly
pouring our lives into
online profiles that we
don't own. We may be
in possession of the
keys, but the truth is,
we're really driving
Social
Why you can't speed up social marketing
Tools to create
social media
infographics
Name Visual.ly
Info One of the better-known tools to help you
produce infographics,
Visual.ly plugs into your
Facebook or Twitter data.
You can use existing
templates or tap into the
Visual.ly marketplace for
a more custom and
costly solution.
www.visual.ly
Name Easel.ly
Info An infographic-creation tool similar to
Visual.ly, Easel.ly offers a
few more options for
customising its output,
using its bank of available
icons and graphics.
www.easel.ly
Name Infogr.am
Info Infogr.am is a
simple-to-use tool that
enables you to create not
just infographics but also
charts and diagrams. It
costs nothing to sign up.
www.infogr.am
1 Timed repetition
Information can be
lost quickly on Twitter
as your tweet slides
down your followers
timelines. Mentioning
something more than
once is fine, but do so
at intervals, with other
tweets in between: if
it's all you tweet about,
people will just see your
content as noise, which
will have the opposite
effect to the one you
intend. Consider time
zones, too, so that your
Colin Grieves on
Social data
and search
With the release of Facebook's Graph Search, the inevitable rise of social search
will increasingly be at the core of marketing budgets in 2013, says Colin Grieves
Social search
With Graph Search, marketers will be faced with
the same issues, as they look to use it effectively
for their business. For a long time, Facebook's
search bar has been uncomfortably inadequate.
Therefore, the Graph Search announcement
was a relief, but, as always, advertisers are already
working to identify the associated opportunities
The obsession
with blindly chasing
fans or followers
has thankfully lost
credence
Colin Grieves
Social
Make your content more shareable
Online
professional
profiles
Name LinkedIn
Info The original
and most well-known
professional social
network. Often mocked,
but worth keeping up to
date and seeking
recommendations. LI
can be a great source of
work for freelancers.
www.linkedin.com
Name Publicate
Info The idea behind this
newer platform is that by
allowing people to share
a wide variety of content
about not just their work
history but also their
interests and passions, it
will provide a better
snapshot of the individual.
http://publicate.it
Name Pinterest
Info Admittedly, Pinterest
isn't a professional profile
tool per se, but there are
some brilliant examples of
creatives using Pinterest
as a visual CV of work. It's
a good way to get
yourself noticed.
http://pinterest.com
Expert advice
Name Tom Mason
Job title Head of social media
Company Delineo
Web www.delineo.com
Twitter @totmac
Content may be king, but maintaining a blog
takes a bit of effort. Yet blogging doesn't have to
take up all of your time and there are a number
of ways you can minimise the legwork. Here are
a few techniques to reduce the stress of blogging
and provide quality content for your readers.
Any questions
Don't be afraid to
ask for ideas from
your social media
connections. Some of
the most interesting,
relevant posts come
from questions posed
by followers. Address a
specific need or concern
and you're providing
readers with a valuable
resource they'll want to
visit again and again.
10 things
10 things you
may have missed
Subscribe to
net magazine
today and save
See page 69 for
more details!
Hopefully by now you'll be an SEO expert but just in case you're still
hungry for more, here are 10 things in this issue you may have missed
01 DON'T PAY FOR LINK BUILDING
What happens when you pay an SEO firm
to link build for you? More often than not, it
will spam other websites on your behalf with
automated tools. It's a selfish tactic: you're
receiving negligible benefits (if any at all) at the
expense of honest webmasters' time, and they
have to clean it up off their sites.
This and more SEO tips for startups on page 26
03 RESPONSIVELY
RETROFITTING OLDER SITES
Most of us probably agree that the web is never
really done. The real-time nature of the beast is
what makes our medium unique, yet we often
choose File > New over a steady evolution of our
sites. The truth is that we do not always get to
start over.
Ben Callahan on the first steps towards better
small-screen experiences. See page 126
04 FASTER LOADING SITES
One of the first things to look at is the size
of your HTML code. This is probably one of the
most overlooked areas, perhaps because people
assume it's no longer so relevant with modern
broadband connections. Some content
management systems are fairly liberal with the
amount they churn out, one reason why it can
be better to handcraft your own sites.
Tom Gullen shows you how to make your sites load
ultra-quick. See page 112
02 BOUNCE RATE
08 GOOGLE DOESN'T ENDORSE SEO COMPANIES
Put simply, if you're dealing with a firm that
makes any allusion to being endorsed or
approved by Google for optimisation purposes,
it's likely they're a fraud. The reality is that Google
09 THE FUTURE?
Looking at the industry for a moment, over
10 OPTIMISE!
I would like to propose that the SEO
industry, and all of us who define it, start living
up to our names. We're optimisers, says James
Swinarski. We compete with one another in
the search results by making content relevant,
accessible and findable. It's not an easy task.
More from James Swinarski on page 140
Visit www.myfavouritemagazines.co.uk/design
for more information