
Search Engine Optimization Easy Guide

This guide will see you through the search engine optimization (SEO) process from code to upload, with a no-nonsense explanation of each technique, and direct links to the resources you need.

First off, here are two myths that need to be dispelled:

1st Myth: There are ‘tricks’ that can get you to the top of major search engine rankings.

Whilst I’m going to show you how to optimise your site to give you the best possible chance of appearing high up in the search rankings, you need to be aware that Google and Yahoo have been successful by making an art form out of suppressing spam techniques, thereby producing the most relevant results for their users.

In short, they are very practiced at spotting deliberate attempts to deceive the web crawlers, and may even penalise your site if it severely breaks the rules.

2nd Myth: You need to pay an ‘expert’ to help you achieve the highest possible placement in the search engine rankings.

Of course there are plenty of best practices to follow, and a good understanding of how the search engines work will be an asset to you, but this is no harder than getting a decent grasp of best usability practices, or of how to design an accessible site. Follow this guide, and you’ll be covering pretty much everything an SEO expert would suggest.

Rule Numero Uno

After all is said and done, the most important thing you can do to boost your site’s rankings is to get other search-engine-indexed web pages to link to your site.

Do not, however, use special services that offer to link to your site from loads of other web sites. They’re scams at the end of the day, and the search engines are wise to it. You might even hurt your rankings by using these services.

Now that’s out of the way, let’s fine-tune your site, because every little helps…

Code to Include In Your Web Pages

Before we run off scouring the web for search engines to submit our lovely new site to, let’s get our web pages sorted first. We’re going to make them search engine friendly.

1. Start with the file name, and make it meaningful. Rather than call your page ‘cogSEO2.html’, use some proper words separated by hyphens or underscores, and try something like ‘cognize_seo_optimization_guide.html’.
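As a sketch of that naming convention, a tiny helper can turn a page title into a search-friendly file name (the `slugify` name and this Python example are my own illustration, not part of any SEO toolchain):

```python
import re

def slugify(title):
    """Turn a page title into a search-friendly file name.

    Lower-cases the title and collapses every run of
    non-alphanumeric characters into a single underscore.
    """
    slug = re.sub(r"[^a-z0-9]+", "_", title.lower()).strip("_")
    return slug + ".html"

# "cogSEO2.html" becomes something a human (and a bot) can read:
print(slugify("Cognize SEO Optimization Guide"))
# → cognize_seo_optimization_guide.html
```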

2. Give every page a meaningful <title> tag. Put your site name in by all means, but add a little information after it, up to about 70 characters.
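For example (the wording here is just an illustration):

```html
<title>Cognize - Search Engine Optimization Easy Guide</title>
```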

3. Add in relevant <meta> information. Two very common meta tags are the ones that provide the description of the page and keywords.

Whilst you may think that these two tags might be just the thing to get you shooting to the top of Google, the historic abuse of keywords has limited the attention that the search engine bots will pay to them. If you do use the keywords meta tag, be careful not to spam: the search bots may consider this bad practice and harm your ranking.

The description meta tag also has limited use for ranking purposes. What it will do however is dictate the way that you want your page to be described in the search engine results. If this suits, then you may want to use this meta tag.

<meta name="Description" content="Cognize - a web developers resource.">
<meta name="Keywords" content="html, css, javascript, web design,
article, tutorials, resource, programs, internet, web, cognize, coding,
code, script, form, validate, w3c, w3, mozilla, mdc, lint, debugger, search">

Other meta tags you can use are shown below. Meta tags are not essential, however; the search engines will still make sense of your site. Use them at your own discretion.

<meta name="author" content="me">
<meta name="subject" content="Web Development, Design, Consulting">
<meta name="Classification" content="Cognize - a web developers resource.">
<meta name="Geography" content="Your Full Address">
<meta name="Language" content="English">
<meta http-equiv="Expires" content="never">
<meta name="Copyright" content="Cognize.co.uk">
<meta name="Designer" content="me">
<meta name="Publisher" content="http://Cognize.co.uk">
<meta name="Revisit-After" content="1">
<meta name="distribution" content="Global">
<meta name="city" content="My City">
<meta name="country" content="England, UK, United Kingdom, Britain, Great Britain">

A good automatic meta tag generator can be found at SEO Tools.

4. Create a site-map.html page and link to it from every page.

You should create and maintain a file named site-map.html, place it in the root directory, and link to it from every page on your site. Make the links to the page, and the links on the page, basic <a> tags to ensure that the bots do not have any problems accessing the file. This is a good way of ensuring that the whole of your site is crawlable.
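As a sketch, a minimal site-map.html needs nothing more than plain links (the page names here are assumed for illustration):

```html
<!DOCTYPE html>
<html>
<head><title>Cognize - Site Map</title></head>
<body>
<h1>Site Map</h1>
<ul>
<li><a href="index.html">Home</a></li>
<li><a href="resources.html">Resources</a></li>
<li><a href="cognize_seo_optimization_guide.html">SEO Guide</a></li>
</ul>
</body>
</html>
```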

5. Follow usability and accessibility best practices; they also help search engines.

Basic rules really: all the little things that make your web site more accessible to special browsers, such as using the alt attribute for images, or using HTML links rather than dynamically created ones.

Providing HTML alternatives where other technologies are used all helps the search bots crawl your site more easily. If you are able to check your web site in Lynx, that may give you a good idea of how a search bot sees your site. If you use images, make the alt text descriptive, using around seven words if you can.

6. Add multiple basic html links to the main areas of your site on all pages.

7. Make your page content rich. Some search engines test the ratio of code to text on your page. The more text, the more content rich the page will be considered, thus improving your ranking. Write pages that clearly and accurately describe your content.
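To get a feel for that code-to-text ratio, here is a rough sketch using only Python’s standard library (the exact measure a search engine uses is not published; this is just an approximation):

```python
from html.parser import HTMLParser

class TextCollector(HTMLParser):
    """Gather the visible text content of a page, ignoring markup."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_ratio(html):
    """Rough share of a page that is text rather than markup (0.0 to 1.0)."""
    collector = TextCollector()
    collector.feed(html)
    text = "".join(collector.chunks)
    return len(text) / len(html) if html else 0.0

# A page that is mostly tags scores low; a content-rich page scores higher.
print(text_ratio("<p>hello</p>"))
```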

8. When adding text and content to your page, think about keywords. Some of the SEO tools suggested later will help you to work out which are the best keywords. The keywords in the meta data are often ignored, but those in your content will not be.

9. Check the page for broken links. If a link to a page is broken, a search bot might not be able to reach it. The W3C link checker can check online sites automatically.
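The W3C link checker is the easy route, but the first half of the job, pulling every link out of a page, is simple to sketch in standard-library Python (requesting each URL to see whether it still resolves is left as an exercise):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Each extracted URL could then be fetched (e.g. with urllib.request)
# to check that it does not return a 404.
print(extract_links('<p><a href="/a.html">A</a> <a href="b.html">B</a></p>'))
```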

In your web space

There are two main types of file you can include in your web space that can help to guide search bots through your site.

1. Add a ‘sitemap.xml’ file.

This is a site map in a special recognised format (XML) especially for search engines. It should be placed at the highest level in your website (i.e. in your root directory).

Here is an example:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.cognize.co.uk/</loc>
<changefreq>daily</changefreq>
<priority>0.7</priority>
</url>
<url>
<loc>http://www.cognize.co.uk/resources.html</loc>
<changefreq>daily</changefreq>
<priority>0.5</priority>
</url>
</urlset>

You can copy this code exactly, changing the URLs in the <loc> tags and the values in <changefreq> and <priority>, and this will be enough to help the search bot check all of the pages you specify in your site, even if it struggles with the links for some reason.

As a quick guide, <loc> refers to the URL of the page.

<changefreq> tells the search bot how often the page changes. Example options are ‘daily’, ‘weekly’ or ‘monthly’.

<priority> tells the search bot how important this page is in the context of your site. Values range from 0.0 to 1.0. Putting 1 for every page will not increase the ranking of your site; priority only expresses a page’s importance relative to the other pages on your site.

There are other tags and options you can use, although they are beyond the scope of this guide. For complete information on site maps, visit Google Site Maps Protocol.

If you want a short cut, a nice little xml sitemap generator can be found here xml-sitemaps.com.
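If you’d rather script it yourself, a sitemap in the format above can be generated with Python’s standard library. A minimal sketch (the page list is assumed for illustration):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build sitemap XML from (loc, changefreq, priority) tuples."""
    ET.register_namespace("", SITEMAP_NS)  # serialise without a prefix
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for loc, changefreq, priority in pages:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = loc
        ET.SubElement(url, "{%s}changefreq" % SITEMAP_NS).text = changefreq
        ET.SubElement(url, "{%s}priority" % SITEMAP_NS).text = str(priority)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

xml = build_sitemap([
    ("http://www.cognize.co.uk/", "daily", 0.7),
    ("http://www.cognize.co.uk/resources.html", "daily", 0.5),
])
print(xml)
```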

2. Add in a file named robots.txt

A robots.txt file placed in your root directory will give a search bot a little more info. If you would like the bot not to crawl certain pages, maybe because they aren’t particularly relevant to incoming searches, then this is the way to do it.

Again, a full-blown explanation of exactly how these files work is beyond the scope of this guide, but full documentation can be found in Google’s robots.txt guide.

For now, here is an example of a robots.txt file that you can place in your web space. It’s very simple: these are the contents of a (very short) complete file asking the bot not to check the directory named ‘boring’, because, well, it’s boring!

User-Agent: *
Disallow: /boring/
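You can sanity-check a robots.txt file like this one with Python’s built-in robotparser before uploading it (a quick sketch, not a required step):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-Agent: *",
    "Disallow: /boring/",
])

# Ordinary pages stay crawlable; anything under /boring/ is blocked.
print(rp.can_fetch("*", "http://www.cognize.co.uk/index.html"))        # True
print(rp.can_fetch("*", "http://www.cognize.co.uk/boring/page.html"))  # False
```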

Putting it Out There

To make things easy for you, here is a list of the main search engines that you will want to submit your site to. You may have to satisfy certain conditions before they will index your pages. For example, you will need to sign up for a Google account if you want to use their stats system to check how your site is doing.

Please note that even with submission, it can take the search engines a little while to get around to your site; we’re talking weeks. If, however, you’ve followed the rules so far, you should find it’s probably quicker than that, especially if you can get links to your site into web pages that are already indexed.

Google https://www.google.com/webmasters/tools/siteoverview

Yahoo https://siteexplorer.search.yahoo.com/submit

MSN http://search.msn.com.sg/docs/submit.aspx

LIVE http://search.live.com/docs/submit.aspx

Open Directory Project http://www.dmoz.org/

AltaVista uses the Yahoo engine, so there’s no need to submit to this one.

Ask Jeeves is included in Teoma, which you cannot manually submit to.

Checking Your Position and More Fine Tuning

Once you’ve completed the above, you can start using more tools to check how your site’s doing.

Some great SEO resources for webmasters can be found here:

SEO Chat

Self SEO

Google Web Masters

The Final Word

We started with it and so we’ll finish with it, because it can’t be emphasised enough:

Get as many incoming links to your site as you can. If you maintain other sites, have them link to one another.

It all helps!


March 26, 2008 at 3:00 pm

