So you want great rankings, do ya? Well if that’s your goal, then my question to you is “does your site make it easy for the search engines to give you great rankings?” Tough question, I know; but if you start here, your chances of succeeding are much higher!
Custom 404 Error Pages

When a search engine bot or a visitor clicks a link only to find the page is not there, do they see the dreaded default 404 error message? If they do, you need to know two things: 1. It's ugly and provides a poor user experience! 2. Search engine bots can't click a back button, so a dead-end 404 page stops them from progressing through your site.
By simply locating the 404 error page setting on your server, you can easily create a custom 404 error page that matches the look and feel of your own site, uses your site's navigation and displays a simple message explaining that the page is not available. This still passes the message to the search engines that the page is missing (via a 404 status code), while allowing both visitors and search bots to continue through your site.
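On an Apache server, for example, pointing the server at your custom page can be as simple as one line in the .htaccess file (a minimal sketch; the /errors/404.html path is just a placeholder for wherever you save your page):

```apache
# Serve our custom, site-branded page for any missing URL.
# The server still returns a 404 status code, so search
# engines know the page genuinely doesn't exist.
ErrorDocument 404 /errors/404.html
```

One caution: use a local path, not a full URL. If you write `ErrorDocument 404 http://www.example.com/404.html`, Apache issues a redirect and the browser receives a 302 instead of a 404, which defeats the purpose.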
Unique Titles & Meta Data
As more and more CMS and Ecommerce shopping carts incorporate SEO-friendly practices into their development, we hear less about sites being unable to set unique titles and meta data. However, if your site was built on an older version or is still stuck in the stone ages, take heed! Duplicate titles and meta data can create omitted results, and omitted results are unable to rank because... well... they're omitted! This means if you have, say, 2,000 pages on your site and 1,500 of those pages are omitted, then you have 1,500 pages that are not helping drive traffic to your site.

You can easily do a site: search on Google to see how many results it shows for your site, then go to the end of the results and see how many pages are being omitted. If the number is high, I recommend looking at how to incorporate more unique titles and meta data into your website. If you've taken the time to create 2,000 web pages, you might as well have all of those pages working in your favor! For some, the fix may be as simple as uploading the latest upgrade, while for others it may require asking your developers or design firm to make some simple configuration changes to your database and admin panel.
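As a quick illustration (the store and wording here are invented), "unique" simply means each page's head section carries its own title and description rather than one site-wide boilerplate repeated on every URL:

```html
<!-- Product page: title and description are specific to this one page -->
<head>
  <title>Blue Widget 3000 - Widgets | Example Store</title>
  <meta name="description"
        content="The Blue Widget 3000 features a rust-proof finish
                 and free shipping. In stock and ready to ship.">
</head>
```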
301 & 302 Redirects

This one many still seem to get wrong! If you drop a page from your site with the intention that it will never come back, redirect the old URL to either a comparable page (or product, if it's an Ecommerce site) or up to the main category page that will best help people find what they were looking for. Leaving the page to show a 404 error message is like leaving money on the table! If the page has PageRank or inbound links, or even had PageRank flowing to it from other internal pages, simply set up a 301 redirect. If you have a large number of pages that need to be removed or deleted, I recommend working with a developer to set up the proper rules in the .htaccess file; done poorly, the redirects can slow the server's response when a visitor tries to load a page. The 301 status code informs the search engines that this page is gone forever, that they should pass all PageRank and links pointing to the old page on to the new location, and that they should remove the old URL from their index. If you're taking pages down temporarily with the intention of bringing them back at a later date, you will want to set up what is referred to as a 302 redirect. A 302 means 'temporarily moved' (and essentially does not pass PageRank and link credit on to the new location). Remember: 301 means 'gone forever', while 302 means 'will return'.
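On an Apache server, both kinds of redirect can be declared in the .htaccess file. A minimal sketch, with placeholder paths:

```apache
# 301: the old product page is gone for good - send visitors,
# PageRank and link credit to the closest comparable page.
Redirect 301 /products/old-widget.html /products/new-widget.html

# 302: this page is only down temporarily and will return,
# so point visitors at the category page without passing credit.
Redirect 302 /products/seasonal-widget.html /products/
```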
URL Rewrites

If you have decided to rename your files to make them more search engine friendly or to provide a better site structure, you will want to learn to use the rewrite rules in your .htaccess file. If you're on a Windows IIS server, a program such as ISAPI Rewrite will let you carry out the same task. Essentially, you are sending a message from the server that any URL that used to follow the old structure should now follow the new one, while also informing the search engines that this is a 301 (permanent) redirect. The most important part to remember is that the rule does two jobs: it tells the server to rewrite the URLs to the new structure, and it passes the message to the search engines that the change is permanent and all credit should flow from the old file structure to the new one.
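As a sketch of what such a rule looks like with Apache's mod_rewrite (the directory names are invented for the example):

```apache
RewriteEngine On
# Any URL under the old /shop/ structure is permanently
# redirected to the same path under /store/. The R=301 flag
# tells search engines to pass full credit to the new URLs.
RewriteRule ^shop/(.*)$ /store/$1 [R=301,L]
```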
The Robots.txt File

The robots.txt file communicates with the search engines, telling them which folders and page locations they are restricted from crawling. Some people go wild restricting the search engines from whole swaths of their site without realizing the impact. For example, if you sell products online and block your images folder, photos of your products will likely never be indexed in Google Images. Granted, if your company posts some wild photos online from a great holiday party, you may not want those indexed in the search engines; but, for the most part, if the files are not private and it's "okay" for them to be indexed, I recommend allowing those images to be indexed. One thing I like to use the robots.txt file for is blocking duplicate content. There are other ways to handle duplicate content, such as the rel="canonical" tag, settings in Google Webmaster Tools, etc. I actually find it fun, however, locating all the variations of duplicate content, blocking them and keeping control over them through the robots.txt file. If you are inexperienced in site configuration, you can easily take advantage of Google Webmaster Tools and use the "Let Google Decide" option, which basically keeps you from getting it wrong!
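Pulling those ideas together, a robots.txt along these lines (the folder names are placeholders) blocks the duplicate and private areas while leaving product images crawlable:

```
User-agent: *
# Block duplicate-content URL variations (e.g. print versions)
Disallow: /print/
# Keep the staff party photos out of the index
Disallow: /private-photos/
# Note: no Disallow line for /images/, so product photos
# remain crawlable for Google Images
```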
So the next time you ask yourself the question “does my site make it easy for the search engines to give me great rankings?” you can be confident knowing it does!