Author: SEO Ray

How do I use the robots meta tag?

The robots meta tag tells search engine crawlers whether they should crawl and index the web page they are on. Most major search engines support the robots meta tag, but using the robots.txt standard is more efficient. Since search engines will try to crawl and index every web page they find by default, the meta tag is really only necessary when you want to prevent a web page from being indexed. Below is an example of the robots meta tag syntax.
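The tag takes the following general form, with the content attribute holding one or more of the directives listed below:

```html
<meta name="robots" content="index">
```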

The above code must be inserted between the <head> and </head> tags in your web page's source code. The content attribute is where you insert one of the elements below to tell a search engine crawler what to do with this page; you can insert more than one element by separating them with commas.

* INDEX (Crawl the web page and add it to the crawler's index)
* NOINDEX (Do not add the web page to the crawler's index)
* NOFOLLOW (Do not follow or crawl the links on this page)
* NOARCHIVE (Do not cache and archive a copy of this web page)
* NOSNIPPET (Do not display a description in search results)
* NOODP (Do not use the Open Directory Project description)
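The directives can be combined with commas. For example, to keep a page out of the index and tell crawlers not to follow its links:

```html
<meta name="robots" content="noindex, nofollow">
```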

Remember, you only need to use this tag to prevent crawling, caching, etc., of a web page. You can usually just leave it out of your web page's source code, or use the robots.txt standard to direct crawlers on indexing and crawling of your site.
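For site-wide rules, a robots.txt file placed in your site's root directory does the same job in one place. A minimal illustrative example (the /private/ directory is a placeholder):

```text
User-agent: *
Disallow: /private/
```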

What is the best domain name extension to use?

When it comes to picking the correct domain name extension, it really doesn't matter. The only thing to consider is that if you choose a country domain extension, you'll get better ranking in that country's search results. The only reason certain extensions are more common than others in search results is because they've been around much longer.

Some domain extensions are only a few years old compared to others that have been used almost since the beginning of the web. You do want an extension people can easily remember. Web surfers are more prone to remember *.com, *.org and *.net URLs than, let's say, *.us or *.ws extensions. What you need to remember is that the extension will not have any effect on your search engine positioning.

If you register the same name under two different extensions, both will have the same ability to rank in the search engines; one will not have any higher value over the other due to the extension. It should be noted that web surfers would probably remember the *.com version over the other.

What is the most search engine friendly blogging software?

Almost all blogging software is unique in its own way. Some packages have unique features while others have standard features found in all other blogging software. When it comes to SEO Toronto Services, not many blogging software packages are up to par. In fact, even some of the most SEO friendly blogs still have drawbacks, mostly with duplicate content. The issue is that you can find the same content for a blog article at 3 or more URLs on your site, which search engines would consider duplicate content. Let's review a list of things, SEO wise, a blog should have or do.

1. Static Search Engine Friendly URLs (e.g. /my-post-title/ rather than a query string such as ?p=123)
2. RSS Feeds (feeds should only display a summary of posts, not the entire post)
3. Create Unique Meta Tags Per Post (title, keywords, description, etc.)
4. Customizable (ability to change design, layout, colors, etc.)
5. Proper Internal Linking (all pages should interlink with one another)
6. W3C Validated (all pages should successfully pass W3C validation)
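As a sketch of point 3, each post's page should emit its own head values rather than sharing one site-wide set. The values below are placeholders that would normally be generated from the post itself:

```html
<head>
  <!-- Unique per-post values, usually filled in by the blogging software -->
  <title>My Post Title - Blog Name</title>
  <meta name="description" content="A one or two sentence summary of this post.">
  <meta name="keywords" content="keyword1, keyword2, keyword3">
</head>
```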

Most likely there are other things to consider when picking out the most search engine friendly blogging software, but the above list covers the basic things to look for. Below is a small list of blogging software Civic SEO picked out; they are not in any specific order. Contact us to report bad links or to get your blogging software added to the list.

Search Engine Friendly Blogging Software

* WordPress
* b2evolution
* Blogger (this is also a free hosted blog service)
* Movable Type

We would love to help this list grow, so please contact us with your favorite search engine friendly blog software today!

How do I redirect the non-www URL to the www URL?

This is a great method to get your website indexed correctly by search engine spiders. Many search engine spiders consider yourdomain.com and www.yourdomain.com to be separate sites and index them separately. This can create PageRank and duplicate content issues: if your links are spread out between the two versions, your site will show a lower PageRank value than it really has. You'll also risk search engines thinking you have duplicate content, and in search engine optimization terms this is a bad thing.

Add the below code to your .htaccess file to redirect your non-www URL to your www URL. You need to change yourdomain.com to your website's domain name.

RewriteEngine on
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]

If you want to redirect your www URL to your non-www version instead, use the below code and insert it into your .htaccess file.

RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [R=301,L]