
SEO

SEO (search engine optimization) is a technique that helps search engines find your site and rank it higher than the millions of other sites in response to a search query. SEO thus helps you get traffic from search engines. Search engines perform several activities in order to deliver search results: crawling, indexing, processing, calculating relevancy, and retrieving.

SEO: What We Need To Do
a) Keywords*
b) Back Links
c) Meta Tags*
d) Static URLs*
e) Images and Visuals (alt attribute and filename)*
f) Animation, Movies, and JavaScript
g) Sitemap.xml and submitting the website to Google
h) Site Promotion to Increase Traffic

SEO Tools: Complete automation of the SEO process is impossible, that's for sure. Tools can be an advantage but can also be a disadvantage. Hence they should be used:
- to speed up the SEO process
- to analyze advantages and drawbacks
- to see what can be automated
- to get information in a more categorized and organized way
- to get inspiration and more ideas

Various tools will be used along the way to help us optimize content and links.

http://www.smashingmagazine.com/2006/09/22/complete-list-of-best-seo-tools/

a) Keywords
- Keywords are the most important item in SEO, visible to both users and search engines.
- Keywords with low competition should be given more preference; they help get better results.
- The location of keywords is very important: quality matters as well as quantity. Keywords in the page title, the headings, and the first paragraphs count for more than many keywords at the bottom of the page or in the description.
- Keyword stuffing should be avoided; content should make sense and be relevant, not written just for the sake of search engines. Google can impose penalties if it finds keyword manipulation.

- We have to try to use more headings for subtopics; Google loves this and uses headings for crawling and indexing.
- Keyword density should be high, but balanced with readable content.

Tools

Google AdWords Keyword Tool: a keyword tool from Google that provides specific and similar keywords.

Keyword Density Checker: keyword density is the percentage of occurrences of your keywords relative to the rest of the text on your webpage. This tool crawls the specified URL, extracts text as a search engine would, removes common stop words, and analyzes the density of the keywords.

Keyword Difficulty Tool: used to analyze the competitive landscape for a particular search term or phrase. It issues a percentage score and provides a detailed analysis of the top-ranking sites on Google and Yahoo.
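The density computation such a checker performs can be sketched in a few lines. This is a minimal illustration, not any particular tool's algorithm; the stop-word list here is a small made-up subset for the example.

```python
import re
from collections import Counter

# A small illustrative subset of English stop words; real tools use far
# larger lists.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it", "for"}

def keyword_density(text, keyword):
    """Return the keyword's share of non-stop-word tokens, as a percentage."""
    words = [w for w in re.findall(r"[a-z']+", text.lower())
             if w not in STOP_WORDS]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return round(100.0 * hits / len(words), 2)

text = "Solar energy systems are clean. A solar installation pays for itself."
print(keyword_density(text, "solar"))  # 22.22
```

A density far above a few percent usually reads as stuffing; the point of the tool is to flag imbalance, not to maximize the number.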

Keyword Suggestion Tool: helps you choose the right keywords for your website. You can see which keyword combinations are more popular and also get ideas for more keyword combinations.

b) Back Links/Anchor Text

In layman's terms, there are two types of links: inbound and outbound. Outbound links start from your site and lead to an external site, while inbound links, or backlinks, come from an external site to yours.

Why are backlinks important? The number of backlinks is an indication of the popularity or importance of a website. Backlinks matter for SEO because some search engines, notably Google, give more credit to websites that have a large number of quality backlinks, and consider those websites more relevant than others in their results pages for a search query. Arco needs to create a lot of backlinks: from web portals such as IndiaMART and Alibaba, from search directories, and from other relevant websites (e.g. renewableworldenergy.com and other solar communities/magazines/forums). Backlinks should not come from websites that are irrelevant to our content. Getting backlinks from YouTube will be very beneficial if we can manage to get a high number of hits on our videos. Backlinks are not only followed by users, generating traffic, but also help search engines rank your page higher. Backlinks can also be used to promote websites by paying other website owners to display your URLs.

Anchor Text
- A very simple concept that helps Google calculate your website's relevancy for a search query.
- For hyperlinks that lead to a page on our website, instead of writing "click here" or pasting the URL, it is much more beneficial to write the relevant keyword for that page, i.e. "Solar Energy Systems" in our case.

c) Meta Tags
Meta tags summarize the information on a page for search engine crawlers. This information is not directly visible to humans visiting your website. The most popular are the meta keywords and meta description tags.

Meta tags are made for search engines, to help them crawl, index, and calculate the relevancy of your website/page; they are not for users, as the information is not visible to them. The meta description is what you include in the description of the page; Google sometimes uses it to display information and summarize your website for users. Meta keywords are all the relevant keywords you write related to your product/content. Avoid meta tag stuffing. The popularity of meta tags has gone down of late, and search engines don't give them as much importance, but there is still no harm in using them for SEO development.

For instance, for the dog adoption site, the meta description tag could be something like this: <meta name="description" content="Adopting a dog saves a life and brings joy to your house. All you need to know when you consider adopting a dog.">

You may consider including alternative spellings (or even common misspellings of your keywords) in the meta keywords tag. It might be a very small boost to your search engine rankings, but why miss the chance? E.g. <meta name="keywords" content="adopt, adoption, dog, dogs, puppy, canine, save a life, homeless animals">

Meta Robots - robots.txt
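Both forms control what crawlers may index: a robots meta tag goes in an individual page's <head>, while a robots.txt file sits at the site root and covers whole paths. A minimal sketch (the paths and URLs below are hypothetical placeholders, not Arco's real structure):

```html
<!-- Per-page control: keep this page out of the index,
     but let the crawler follow its links -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt at the site root
User-agent: *
Disallow: /admin/
Disallow: /print/
Sitemap: http://www.example.com/sitemap.xml
```

Note that Disallow only blocks crawling of a path; the meta tag is the per-page way to keep something out of the index.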

d) Static URLs
- Write URLs in simplified form, i.e. www.arco.in/solarenergysystems/installations, instead of letting them look like www.arco.in/904*^*%4345solar#@#Install34.
- Google does not like dynamic URLs (cryptic text) and has trouble indexing them.
- A URL rewriting tool can generate the rules for this (rewriting typically relies on Apache's mod_rewrite, commonly available on Linux servers).
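Assuming an Apache server with mod_rewrite enabled, a friendly static-looking URL can be mapped internally to a dynamic script. The rule below is a sketch; the script name and parameters are hypothetical:

```apacheconf
# .htaccess sketch: visitors and search engines see the clean URL,
# while the server quietly serves the dynamic page behind it.
RewriteEngine On
RewriteRule ^solarenergysystems/installations/?$ products.php?cat=solar&page=install [L,QSA]
```

The [L] flag stops rule processing after a match, and [QSA] preserves any query string the visitor supplied.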

e) Images and Visuals
- Change the design of the images*


As already mentioned, search engines have no means to directly index extras like images, sounds, Flash movies, and JavaScript. Instead, they rely on you to provide a meaningful textual description, and based on it they can index these files. In a sense, the situation is similar to that of text ten or so years ago: you provide a description in the meta tag, and search engines use this description to index and process your page. If technology advances further, one day it might be possible for search engines to index images, movies, etc., but for the time being this is just a dream.

Images are an essential part of any web page, and from a designer's point of view they are not an extra but a mandatory item for every site. However, here designers and search engines are at two poles, because for search engines every piece of information that is buried in an image is lost. When working with designers, it sometimes takes a while to explain that having textual links (with proper anchor text) instead of shiny images is not a whim, and that clear text navigation really is mandatory. Yes, it can be hard to find the right balance between artistic performance and SEO-friendliness, but since even the finest site is lost in cyberspace if it cannot be found by search engines, a compromise on its visual appearance cannot be avoided.

With all that said, the idea is not to skip images altogether; nowadays that is impossible, because the result would be a most ugly site. Rather, images should be used for illustration and decoration, not for navigation or, even worse, for displaying text (in a fancy font, for example). Most importantly, in the alt attribute of the <img> tag, always provide a meaningful textual description of the image. The HTML specification does not require this, but search engines do. Also, it does not hurt to give meaningful names to the image files themselves rather than naming them image1.jpg, image2.jpg, imageN.jpg. For instance, in the next example the image file has an informative name and the alt attribute provides enough additional information: <img src="one_month_Jim.jpg" alt="A picture of Jim when he was a one-month puppy">. Still, don't go to extremes like writing 20-word alt tags for 1-pixel images, because this looks suspicious and starts to smell like keyword stuffing.

f) Animation and Movies

The situation with animation and movies is similar to that with images: they are valuable from a designer's point of view but are not loved by search engines. For instance, it is still pretty common to have an impressive Flash introduction on the home page. You just cannot imagine what a disadvantage with search engines this is: it is a number-one rankings killer! And it gets even worse if you use Flash to tell a story that could be written in plain text and hence crawled and indexed by search engines. One workaround is to provide search engines with an HTML version of the Flash movie, but in this case make sure that you have excluded the original Flash movie from indexing (this is done in the robots.txt file, but the explanation of this file is not a beginner's topic, which is why it is excluded from this tutorial); otherwise you can be penalized for duplicate content. There are rumors that Google is building new search technology that will allow searching inside animation and movies, and that the .swf format will contain new metadata usable by search engines, but until then, you'd better either refrain from using (too much) Flash, or at least provide a textual description of the movie (you can use an <alt> tag to describe the movie).

JavaScript
This is another hot potato. Everybody knows that pure HTML is powerless to make complex sites with a lot of functionality (HTML was never intended to be a programming language for building Web applications, so nobody expects to use HTML to write to a database or even to store session information) as required by today's Web users, and that is why other languages (like JavaScript or PHP) come to enhance HTML. For now, search engines simply ignore the JavaScript they encounter on a page. As a result, first, if you have links inside JavaScript code, chances are they will not be spidered. Second, if JavaScript is in the HTML file itself (rather than in an external .js file that is invoked when necessary), it clutters the HTML file, and spiders might just skip it and move on to the next site. Just for your information, there is a <noscript> tag that lets you provide an alternative to running the script in the browser, but because most of its applications are pretty complicated, it is hardly suitable to explain here.
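In the simple case of a script-driven menu, the <noscript> fallback is just a plain list of links the spider can follow. A sketch, with hypothetical file names:

```html
<!-- The menu is drawn by JavaScript for visitors... -->
<script src="menu.js"></script>
<!-- ...while crawlers (and users without JavaScript) get plain,
     followable text links instead. -->
<noscript>
  <ul>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/installations.html">Installations</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</noscript>
```

A sitemap page linked from every page achieves the same goal without touching the menu markup.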

g) Sitemap.xml
Prepare two sitemaps: one for users, one for search engines. An XML Sitemap (upper-case) file, which you can submit through Google's Webmaster Tools, makes it easier for Google to discover the pages on your site. Using a Sitemap file is also one way (though not guaranteed) to tell Google which version of a URL you'd prefer as the canonical one (e.g. http://brandonsbaseballcards.com/ or http://www.brandonsbaseballcards.com/; more on what's a preferred domain). Google helped create the open-source Sitemap Generator script to help you create a Sitemap file for your site.
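A Sitemap file is a short XML document listing each URL, optionally with a last-modified date and hints about change frequency and relative priority. A minimal sketch, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/solarenergysystems/installations</loc>
    <lastmod>2013-02-15</lastmod>
  </url>
</urlset>
```

Only <loc> is required per URL; the other fields are hints that crawlers may ignore.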

h) Site Promotion for an Increase in Traffic
- YouTube
- SlideShare
- Twitter
- Pinterest
- LinkedIn
- Social bookmarking websites: digg.com, reddit.com, stumbleupon.com, etc.

Other Methods to Increase Traffic

PAID SEM: Google AdWords ALTERNATIVES. When it comes to running a PPC campaign, the first name that comes to mind is AdWords from Google. While this is a really good ad network, it is also a very expensive one. Even with careful keyword selection and targeting, it is not uncommon to pay more for AdWords ads than you make in revenue. Cost is the main reason to look for alternatives to AdWords. Fortunately, such alternatives do exist, and the best part is that some of them can even outperform AdWords in terms of return on investment. Here are 12 great AdWords alternatives to consider:

http://www.webconfs.com/google-adwords-alternatives-article-46.php
Top Ten SEO Mistakes
1. Targeting the wrong keywords
This is a mistake many people make, and what is worse, even experienced SEO experts make it. People choose keywords that in their minds are descriptive of their website, but that average users just may not search for. For instance, if you have a relationship site, you might discover that "relationship guide" does not work for you, even though it contains the "relationship" keyword, while "dating advice" works like a charm. Choosing the right keywords can make or break your SEO campaign. Even if you are very resourceful, you can't think of all the great keywords on your own, but a good keyword suggestion tool, for instance the Website Keyword Suggestion tool, will help you find keywords that are good for your site.

2. Ignoring the title tag


Leaving the <title> tag empty is also very common. This is one of the most important places to have a keyword, because not only does it help you in optimization but the text in your <title> tag shows in the search results as your page title.

3. A Flash website without an HTML alternative


Flash might be attractive, but not to search engines and users. If you really insist that your site be Flash-based and you want search engines to love it, provide an HTML version. Here are some more tips for optimizing Flash sites. Search engines don't like Flash sites for a reason: a spider can't read Flash content and therefore can't index it.

4. JavaScript menus
Using JavaScript for navigation is not bad, as long as you understand that search engines do not read JavaScript and you build your web pages accordingly. So if you have JavaScript menus you can't do without, you should consider building a sitemap (or putting the links in a <noscript> tag) so that all your links will be crawlable.

5. Lack of consistency and maintenance


If you want to be successful, you need to continuously optimize your site and keep an eye on the competition and on changes in the ranking algorithms of search engines.

6. Concentrating too much on meta tags


A lot of people seem to think SEO is about getting your meta keywords and description correct. In fact, meta tags are becoming (if not already) a thing of the past. You can create your meta keywords and descriptions, but don't expect to rank well only because of this.

7. Using only images for headings


Many people think that an image looks better than text for headings and menus. Yes, an image can make your site look more distinctive, but in terms of SEO, images for headings and menus are a big mistake, because h1, h2, etc. tags and menu links are important SEO items. If you are afraid that your h1, h2, etc. tags look horrible, try modifying them in a stylesheet, or consider this approach: http://www.stopdesign.com/articles/replace_text.
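The stylesheet route keeps the heading as crawlable text while still letting you control its look. A small sketch; the class name, colors, and image file are illustrative:

```html
<!-- A real, indexable heading... -->
<h1 class="fancy">Solar Energy Systems</h1>

<!-- ...styled to taste in CSS instead of being baked into an image. -->
<style>
  h1.fancy {
    font-family: Georgia, serif;
    color: #c60;
    background: url(heading-texture.png) repeat-x;
    padding: 4px 0;
  }
</style>
```

Search engines see the text inside <h1>; only visitors see the decoration.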

8. Ignoring URLs
Many people underestimate how important a good URL is. Dynamic page names are still very frequent, and no keywords in the URL is more the rule than the exception. Yes, it is possible to rank high even without keywords in the URL, but all else being equal, if you have keywords in the URL (in the domain itself, or in file names that are part of the URL), this gives you an additional advantage over your competitors. Keywords in URLs are more important for MSN and Yahoo!, but even with Google their relative weight is high, so there is no excuse for keywordless URLs.

9. Backlink spamming
It is a common delusion that more backlinks are ALWAYS better, and because of this webmasters resort to link farms, forum/newsgroup spam, etc., which could ultimately get their site banned. In fact, what you need are quality backlinks. Here is some more information on the importance of backlinks.

10. Lack of keywords in the content


Once you have settled on your keywords, modify your content and put the keywords wherever it makes sense. It is even better to make them bold or to highlight them.

Google's Guidelines

When your site is ready:

Submit it to Google at http://www.google.com/submityourcontent/.

Submit a Sitemap using Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages.

Make sure all the sites that should know about your pages are aware your site is online.

Design and content guidelines

Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.

Keep the links on a given page to a reasonable number.

Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.

Make sure that your <title> elements and ALT attributes are descriptive and accurate.

Check for broken links and correct HTML.

If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

Review our recommended best practices for images, video and rich snippets.

Technical guidelines

Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
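The server-side decision behind If-Modified-Since can be sketched independently of any particular web server: compare the date the client sends with the resource's last-modified date and answer 304 Not Modified when nothing changed. The function name and dates below are ours, for illustration:

```python
from email.utils import parsedate_to_datetime

def respond_status(if_modified_since, last_modified):
    """Decide between 304 Not Modified and 200 OK.

    Both arguments are HTTP-date strings; if_modified_since may be None
    when the client sent no such header."""
    if not if_modified_since:
        return 200
    try:
        cached = parsedate_to_datetime(if_modified_since)
    except (TypeError, ValueError):
        return 200  # unparseable header: play safe, send the full page
    changed = parsedate_to_datetime(last_modified)
    # Not modified since the client's cached copy -> short 304,
    # saving the bandwidth of resending the body
    return 304 if changed <= cached else 200

page_modified = "Sat, 01 Jun 2013 10:00:00 GMT"
print(respond_status("Sun, 02 Jun 2013 10:00:00 GMT", page_modified))  # 304
print(respond_status("Fri, 31 May 2013 10:00:00 GMT", page_modified))  # 200
```

In practice the web server (Apache, nginx, etc.) does this for static files automatically; the sketch just shows why supporting the header saves bandwidth.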

Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.

Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.

Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.

Test your site to make sure that it appears correctly in different browsers.

Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.

Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.

Quality guidelines
These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here. It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google's quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. While we may not take manual action in response to every report, spam reports are prioritized based on user impact, and in some cases may lead to complete removal of a spammy site from Google's search results. Not all manual actions result in removal, however. Even in cases where we take action on a reported site, the effects of these actions may not be obvious.

Quality guidelines - basic principles

Make pages primarily for users, not for search engines.

Don't deceive your users.

Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"

Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.

Quality guidelines - specific guidelines

Avoid the following techniques:

Automatically generated content

Participating in link schemes

Cloaking

Sneaky redirects

Hidden text or links

Doorway pages

Scraped content

Participating in affiliate programs without adding sufficient value

Loading pages with irrelevant keywords

Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware

Abusing rich snippets markup

Sending automated queries to Google

Engage in good practices like the following:

Monitoring your site for hacking and removing hacked content as soon as it appears

Preventing and removing user-generated spam on your site

If you determine that your site doesn't meet these guidelines, you can modify your site so that it does and then submit your site for reconsideration.
