A little flavour of what's in store...

Friday, November 03, 2006

article - what search engine factors do you have control over?

“Outside of the obvious (webpage title and description) those items over which the webmaster has the most control are: Page Rank, Trust Rank, Anchor Text, Keyword Density, Domain Age, URL, and Relevant Links.”

Did you know Google are supposedly watching the sale of old domains in an attempt to cut down on blatant attempts to gain higher listings? Older domains may perform better because they have been around long enough to be discounted as a link-building ploy and to earn that coveted ‘trust’ status. There’s not much you can do about that except continue building inbound links slowly and use blogs and RSS feeds to spread the word of your URL’s existence – something the older URL may not be doing.
Your first-choice URL may already have been taken; even so, it pays to register and use a URL that includes your product or service, e.g. fabulousphotogifts.co.uk sells personalised photo gifts. The advice below recommends using hyphens to separate keywords in a URL, but beware: hyphens confuse people. Do they use them or not? Visitors invariably type a hyphenated address wrong and end up on an error page or, worse, a competitor’s website.

How Can a Webmaster Use These Items to Help Ranking?
“First off--the obvious, each and every web page should have a descriptive page specific title and description. The title, description, and header tags are channels to communicate the most important details of a specific webpage. They should be used effectively, but not be abused. The web page should make use of h1 and h2 tags (header tags) to emphasize pertinent keywords and phrases.”
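As a rough sketch of what the quoted advice looks like in practice – using the fabulousphotogifts.co.uk example from above, with the title, description and heading wording invented for illustration:

```html
<!-- Illustrative markup only: the wording here is made up for the
     personalised photo gifts example mentioned earlier -->
<head>
  <title>Personalised Photo Gifts | Fabulous Photo Gifts</title>
  <meta name="description"
        content="Personalised photo gifts: mugs, mouse mats and canvas prints made from your own photos.">
</head>
<body>
  <h1>Personalised Photo Gifts</h1>
  <h2>Photo Mugs, Mouse Mats and Canvas Prints</h2>
  <!-- page content... -->
</body>
```

Note the title and description describe this specific page, and the h1/h2 carry the pertinent phrases without repeating them to excess.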

Equally important is to use your page descriptions in your sitemap document. If you use each page’s description and hyperlink that whole line to the relevant page, you should score a few extra brownie points with the search engines. And because you link to your sitemap document from your home page, it’s only one level down, so the search engines should crawl it easily.
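A plain HTML sitemap page along those lines might look like this – the page names and descriptions are made up for illustration:

```html
<!-- Each line is the page's own description, hyperlinked in full
     to the page it describes (example URLs only) -->
<a href="/photo-mugs.html">Personalised photo mugs printed with your own pictures</a><br>
<a href="/mouse-mats.html">Custom mouse mats made from your favourite photos</a><br>
<a href="/canvas-prints.html">Canvas prints of your photos, framed and delivered</a><br>
```

The whole description is the anchor text, so the link itself tells the search engine what the target page is about.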

Google Sitemaps won’t allow you to list off-site URLs, such as an RSS feed generated via a third-party site, but your own HTML sitemap page will, and you can also add a link directly to the .xml (etc.) document in your web folder.
H1 and H2 tags – of course. Be wary of keyword stuffing here. It’s not enough simply to repeat your keywords; search engines will spot such an obvious attempt. Instead, ask whether the sentence or paragraph actually makes sense. By starting a line with “We provide double glazing in the following areas...” you can legitimately add a list of place names. Expand on this by repeating the exercise in H2 tags but with a different context: “as well as providing double glazing quotes in (places here), we also supply and fit uPVC conservatories, windows and doors”, etc.
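Marked up, that double glazing example might look like this – the place names are invented for illustration:

```html
<!-- Illustrative only: the towns and wording are placeholders -->
<h1>We provide double glazing in the following areas:
    Leeds, Bradford, York and Harrogate</h1>

<h2>As well as providing double glazing quotes in Leeds, Bradford,
    York and Harrogate, we also supply and fit uPVC conservatories,
    windows and doors</h2>
```

Both headings read as natural sentences, so the place names and product keywords appear legitimately rather than as a stuffed list.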

“Particular attention should be paid when formatting urls. Keywords related to the webpage can and should be used in the webpage urls. Use hyphens rather than underscores between the keywords. Search engines are designed by developers and programming languages will recognize a hyphen and distinguish separate words, while an underscore blends the words. Keywords in the URL should not be abused, as search engines do not appreciate excessively long urls. Avoid using characters like ID= in the URL since many search engines will see it as a unique session ID and not spider the contents of the webpage.”

A really excellent tip there about ID in URLs. Long URLs are more of a problem for your potential visitor: remembering one and typing it in without errors is hard. If they don’t get it right first time, unless there’s a very good reason to persevere, they’ll just try finding you via a search engine or, worse, end up doing a general search for the product or service you sell. www.kumandavealookatemormissout.co.uk might have seemed like a good idea at the time.
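To illustrate the quoted advice on URL formatting – these URLs are made up:

```
Good:  www.example.co.uk/double-glazing-quotes.html   (hyphens separate the keywords)
Bad:   www.example.co.uk/double_glazing_quotes.html   (an underscore blends the words)
Avoid: www.example.co.uk/page.php?ID=12345            (may be read as a session ID and not spidered)
```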

“The website's navigation depth should not exceed 3-4 levels. The shallow website depth will make search engines deep crawl easier, ensuring that they will be able to spider the entire content of your website. If you add a new page and wish for it to be spidered quickly, add a link to it from an existing spidered web page.”

If your website has a deeper page structure than this, or indeed uses a database to provide site content, then one way around the problem is a site map written in plain HTML and linked to from your home page (see above). On a similar theme, once you’ve submitted your site to Google (by hand), create a Google account and set up a Google Sitemap for your site. This document usually has an .xml file extension; you’ll find details on Google of how to create and submit one.
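A minimal sketch of what such a sitemap .xml file looks like, assuming the Sitemap protocol format – the URLs and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Placeholder URLs and date; replace with your own pages -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.co.uk/</loc>
    <lastmod>2006-11-03</lastmod>
  </url>
  <url>
    <loc>http://www.example.co.uk/photo-mugs.html</loc>
  </url>
</urlset>
```

Each `url` entry lists one page you want spidered; `lastmod` is optional but helps the crawler decide what has changed.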

As part of the submission process, you’ll have to verify that you ‘own the site concerned’, i.e. that you have access to its index page via FTP (file transfer protocol). You have the choice of either uploading a separate HTML file with a filename provided by Google or inserting a tag in the 'head' of your index page.
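The second option – the tag in the 'head' of your index page – looks roughly like this; the verification value shown is a placeholder, as Google supplies the real one when you set up the sitemap:

```html
<head>
  <title>Your Home Page</title>
  <!-- Placeholder verification value: Google generates the real content string -->
  <meta name="verify-v1" content="abc123placeholder=">
</head>
```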
Once verified, you can then submit your Google Sitemap and help Google spider all your pages. I’ve had great success with this Google facility. Particularly where it’s a new URL, it seems that hand-submitting the URL, then submitting a Google Sitemap and getting a link from a crawled, established page does get you noticed on specific name searches.

Don’t forget, Blogs and RSS feeds are a good way of getting your domain linked to or visited. Also, if you take part in any online forums, then adding links to your site as part of your ‘signature’ will get attention and hopefully a few visits.

My thanks and acknowledgement to Sitepro News and Sharon Housley

Purple 13
