So this fine morning I attended an e-business seminar arranged by the Vaughan Business Center in Vaughan, Ontario. It was good to see good old Dolores Rotondo back at VBEC (centrebusiness.com) after her 9 months of absence.
The first part of the seminar was presented by Andrew of Biz Launch. I had attended a few of Andrew’s previous seminars years ago and always enjoyed them. He has a way of making seminars more interesting: a nice touch of humour, and his efforts to engage the audience always work like a charm for me. He talked about the importance of social media exposure for businesses. I realized once again that this is something we all just have to do, including for our own business.
The second part was quite “un-enjoyable” by comparison. This part of the seminar was run by a guy named Bruno. He was running a seminar on Search Engine Optimization, yet his only experience seemed to be SEO for his own website. He was saying he expects a web designer to do the search engine optimization (say what? Now the web designer also needs to be an SEO expert?), and by this “SEO” term he seemed to mean “onsite” SEO only. Yeah, sure! Back in the days when the grass used to be greener, maybe “his” SEO may have worked well.
Anyways, the food that followed his seminar was good, so overall the seminar was worth attending. Pino and I spoke with Andrew afterwards, as well as Judith, a business consultant. We are sure to hook up again in the coming weeks.
The keywords meta tag is used by search engine crawlers to acquire keywords related to your website. It is widely believed that the major search engines no longer use this meta tag. However, many small search engines and services do use the keywords meta tag, and you never know what algorithm changes the major search engines might make in the future.
All your web pages should contain this meta tag; it will never hurt your rankings in search engines if used properly. Below is an example of the keywords meta tag syntax.

<meta name="keywords" content="keyword one, keyword two, keyword phrase three">

The above syntax must be inserted between the <head> and </head> tags in your web page’s source code. The content attribute is where you insert your keywords and keyword phrases, separated by commas. Civic SEO recommends using only around 15-20 keywords at most to prevent keyword stuffing; this is not an industry standard but a recommendation by Civic SEO.
Always make sure to use the proper keywords for your site’s content.
Due to certain factors, a forum signature link may or may not provide an SEO benefit. On most forums that allow signature links, the members almost always utilize this feature; due to that simple fact, the SEO and Pagerank value of each backlink isn’t that great. The more backlinks that appear on a web page, the less weight each backlink gets. It is still a good way to promote your website and increase backlinks; you will also receive some traffic from the forum itself, depending on the forum’s niche and your website’s content.
Now, let’s say the forum uses the rel=nofollow attribute for all signature links, which means you receive zero SEO or Pagerank value from the backlink. You would still receive traffic from members visiting your site via your signature. So in any case, even if the forum(s) you post on use the rel=nofollow attribute, you can still receive some benefit from the link in the form of traffic.
So, after reviewing the SEO value or non-value of a signature link, it’s still wise to utilize them wherever possible. Not only will you receive some sort of traffic, the forum might also remove the rel=nofollow attribute from its signatures in the future. Always make sure to use the proper anchor text in your signature backlinks. Proper anchor text is the keyword or keyword phrase you are trying to rank for on the search engines.
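As a rough sketch (the URL and keyword phrase below are placeholders, not from any real forum), a signature backlink with proper anchor text might look like this:

```html
<!-- Signature backlink: "blue widgets" is the keyword phrase this
     hypothetical site wants to rank for -->
<a href="http://www.example.com/">blue widgets</a>

<!-- The same link as a forum using rel=nofollow would output it -->
<a href="http://www.example.com/" rel="nofollow">blue widgets</a>
```

Either way the link still sends visitors to your site; only the first version passes any Pagerank.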
The robots meta tag tells search engine crawlers whether they should crawl and index the web page they are on. Most major search engines honor the robots meta tag, but using the robots.txt standard is more efficient. Since search engines will try to crawl every web page anyway, the meta tag is only necessary when you want to prevent a web page from being indexed. Below is an example of the robots meta tag syntax.

<meta name="robots" content="NOINDEX, NOFOLLOW">

The above code must be inserted between the <head> and </head> tags in your web page’s source code. The content attribute is where you insert one or more of the elements below to tell a search engine crawler what to do with this page; separate multiple elements with commas.
* INDEX (Add the web page to the search engine’s index; this is the default)
* NOINDEX (Do not add the web page to the search engine’s index)
* NOFOLLOW (Do not follow the links on this web page)
* NOARCHIVE (Do not cache and archive a copy of this web page)
* NOSNIPPET (Do not display a description snippet in search results)
* NOODP (Do not use the Open Directory Project description in search results)
Remember, you only need this tag to prevent the crawling, caching, etc., of a web page. You can usually leave it out of your web page’s source code, or use the robots.txt standard to direct crawlers on the indexing and crawling of your site.
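For comparison, here is what a minimal robots.txt (placed in your site’s root directory) might look like; the directory paths are made-up examples, not a recommendation:

```
# Applies to all crawlers
User-agent: *
# Hypothetical directories we don't want crawled
Disallow: /print-versions/
Disallow: /admin/
```

This covers whole sections of a site in one file, instead of adding a robots meta tag to every page.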
When it comes to picking the correct domain name extension, it really doesn’t matter. The only thing to consider is that if you choose a country domain extension, you’ll get better ranking in that country’s search results. The only reason certain extensions are more common than others in search results is that they’ve been around much longer.
Some domain extensions are only a few years old compared to others that have been used almost since the beginning of the web. You do want an extension people can easily remember. Web surfers are more prone to remember *.com, *.org and *.net URLs than, let’s say, *.us or *.ws extensions. What you need to remember is that the extension will not have any effect on your search engine positioning.
If you register news.com or news.ws, they will both have an equal ability to rank in the search engines; one will not have any higher value over the other due to the extension. It should be noted that web surfers would probably remember the *.com version over the other.
The description meta tag is used by search engines to display an accurate description of your web page in search results. This meta tag is used by most major search engines and is an excellent search engine optimization tool. If you do not use a description meta tag, the search engine will attempt to create its own from data on your web page. You will get much better results using your own description than the one generated by the crawler.
Below is an example of the description meta tag syntax.

<meta name="description" content="A short, accurate summary of what this web page is about.">

The above syntax must be inserted between the <head> and </head> tags in your web page’s source code. Always use a human-friendly, accurate description of no more than 200 characters. Do not write descriptions strictly for the crawlers; the better the description, the more likely a user is to click your site in search results. Never use the same description on every web page; each web page’s meta description should be different and unique.
One thing you need to remember: Google is constantly updating your web pages’ Pagerank behind the scenes. “The update” actually refers to when you can publicly see the change, in the Google Toolbar and on Pagerank lookup websites. Sometimes during an update you can catch a change in some of the data centers.
Since Google doesn’t publicly announce an official Pagerank update schedule, we can only go by recorded times in the past and estimate when updates happen. Currently Google seems to update the public Pagerank quarterly, or every 3 months. This still isn’t an exact time frame, and sometimes it can go beyond 3 months before we see a public Pagerank update.
I suggest you don’t worry about Pagerank as a webmaster; instead, spend your time and effort increasing your SERPs (Search Engine Results Position). It is not uncommon for a lower Pagerank website to rank higher in search results than a similar website with higher Pagerank. Spend your worrying on SERPs, not Pagerank.
When someone refers to search engine friendly URLs, they are talking about web page addresses that are designed to be indexed more easily by search engine crawlers. Many websites use dynamic content, which is HTML generated on the fly by the web server instead of HTML stored in a static file. This means the URL doesn’t just reference a fixed web address; it also passes data to the software running on the web server. This is used to create large websites without the hassle of making each web page one by one.
The problem with dynamic URLs is that search engines don’t like certain characters in URLs; this can create major issues with dynamically created websites. If a crawler removes certain data or strings from a URL, the web page won’t be rendered correctly, or maybe not at all. This is where search engine friendly URLs come into play. With rewritten URLs, your web server knows that a certain URL really points to another; web browsers and crawlers see one URL while it’s being mapped server side to another. This is done using Apache’s mod_rewrite module or an IIS add-on. Below are a few examples of friendly URLs and not-so-friendly URLs.
Non-Search Engine Friendly URL Example: www.example.com/index.php?page=products&id=42
Search Engine Friendly URL Example: www.example.com/products/blue-widgets/
Learning to correctly re-write URLs can be a tricky task; many excellent guides are available on the internet and on Civic SEO. Always back up your .htaccess file in case you make a mistake and have to revert to your old settings. Once you successfully transform your URLs into search engine friendly URLs, not only will they look much cleaner, they will also help your site get indexed more efficiently. You’ll also see better search engine rankings thanks to the keywords and keyword phrases that can be injected into the URLs.
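As a sketch, assuming an Apache server with mod_rewrite enabled (the URL pattern and script name below are made up for illustration), a .htaccess rewrite rule might look like this:

```apache
RewriteEngine On
# Map the friendly URL /products/blue-widgets/ to the real dynamic script
RewriteRule ^products/([a-z0-9-]+)/?$ index.php?page=products&item=$1 [L]
```

The browser and the crawler only ever see the friendly URL; the mapping back to index.php happens entirely server side.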
Almost all blogging software is unique in its own way. Some packages have unique features, while others have the standard features found in all other blogging software. When it comes to search engine optimization, not many blogging software packages are up to par. In fact, even some of the most SEO friendly blogs still have drawbacks, mostly with duplicate content. The issue is that the same blog article can often be found at 3 or more URLs on your site, which search engines consider duplicate content. Let’s review a list of things, SEO wise, a blog should have or do.
1. Static Search Engine Friendly URLs. (ex. www.blog.com/post/how-to-blog)
2. RSS Feeds (feeds should only display a summary of posts, not the entire post)
3. Create Unique Meta Tags Per Post (title, keywords, description, etc)
4. Customizable (ability to change design, layout, colors, etc)
5. Proper Internal Linking (all pages should somehow inter-link with one another)
6. W3C Validated (all pages should successfully pass w3c validation)
Most likely there are other things to consider when picking out the most search engine friendly blogging software, but the above list covers the basic things to look for. Below is a small list of blogging software Civic SEO picked out; they are not in any specific order. Contact us to report bad links or to get your blogging software added to the list.
Search Engine Friendly Blogging Software
* www.wordpress.org – WordPress
* www.b2evolution.net – b2evolution
* www.blogger.com – Blogger (This is also a free hosted blog service)
* www.movabletype.org – Movable Type
We would love to help this list grow, please contact us with your favorite search engine friendly blog software today!
When you are changing web hosting providers, there are two factors that determine if you will lose Pagerank:
1. Are you keeping the same domain name?
2. Are you keeping the same URL and file structure?
Basically, if you answered yes to both questions, your Google Pagerank will remain intact. If you switch domain names, you have to make sure you 301 redirect all the old URLs to the new ones to transfer the Pagerank; this will also prevent broken bookmarks and broken links from other sites. If you are changing your URL or file structure, you also need to 301 redirect all old URLs or filenames to the new ones.
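Assuming an Apache server (the filenames and domain names below are placeholders), the 301 redirects mentioned above can be set up in your .htaccess file:

```apache
# Permanently redirect a single moved page
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Permanently redirect an entire old domain to a new one (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

The 301 status code tells crawlers the move is permanent, which is what prompts them to transfer the old URL’s value to the new one.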
Due to unforeseen factors and changes in Google’s search algorithms, you still might lose some Pagerank and never notice. This is because Google Pagerank changes in real time behind the scenes, while the public Pagerank visible in the Google Toolbar and on Pagerank check websites is only updated every 3 to 4 months.