SEO FAQ: Why Isn’t My Site on Google?

Is your website not showing up in Google search results?

That’s a common problem for new and experienced SEOs alike!

If you’ve noticed your site isn’t showing up in the SERPs, here’s where to start!

Video: Why won’t my site show up in Google search results?


Thomas Liquori, an Internet marketing analyst, discusses the 5 most common problems that keep your site from showing up in Google search results.


Transcript: 5 common reasons your site doesn’t show up in Google

Hello everyone! My name is Thomas Liquori and I’m an Internet Marketing Analyst at WebpageFX. Today’s question is, “Why isn’t my site on Google?” There could be a number of reasons why your website is not showing up in the organic search results, and we are going to discuss a few of the most common ones today.
The first and most common reason, for brand-new sites on a brand-new domain name, is just that: your site is too new, and Google hasn’t had time to discover it yet through its crawling and indexing process. Crawling and indexing can take some time and depend on many factors. In general, Google does not make predictions or guarantees about when, or if, the URLs of your website will be crawled or indexed. One way to help get your site crawled and indexed is to submit it to Google Search Console, formerly Google Webmaster Tools. Once you verify that you are the rightful owner of the site, you can submit a sitemap listing all of the URLs associated with your site. By submitting a sitemap to Google, you are essentially telling them, “Hey! Come check out my site to find some great information on whatever topic my website is about.”
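A sitemap is just an XML file listing your pages. Here is a minimal sketch using a placeholder domain and hypothetical paths (swap in your own URLs before submitting):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to crawl -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```

Upload the file to your site’s root (for example, /sitemap.xml) and then submit that URL in the Sitemaps section of Search Console.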
This is the best practice to ensure you get Google to come out and start crawling and indexing your site properly.
Once a few days or a couple of weeks go by and you still are not seeing anything in the SERPs for your website, you can perform a site search on Google to check whether your website is being indexed properly. A site search is a query on Google such as site:webpagefx.com. When you submit it, Google loads all of the pages of your site that it has indexed. You can see how many pages are indexed by looking right above the first listing, which is usually the link to Google Search Console or the home page of your website. In this instance, WebpageFX has 4,000 pages indexed in Google’s search results.
If you see that Google is not indexing all of your URLs, the best thing to do is return to Google Search Console and check for any warnings in the Sitemaps section and any URL errors in the Crawl section, both of which are visible when you first log into a website’s dashboard in Google Search Console.
When it comes to crawl errors, the most common one you will see is the 404 HTTP status code. A 404 means the URL being linked to cannot be found. Putting a 301 redirect on not-found URLs is best practice: it tells Google that the page has moved and that another URL is the closest replacement. A 301 is a permanent redirect, different from a 302, which is a temporary redirect. A 301 also passes link juice to the URL you are redirecting to, which is why it is the best practice for SEO. Once you put a 301 redirect in place, mark the error as fixed in Google Search Console. Google will then attempt to re-crawl the URL to see if it has indeed been fixed. If it has not, you can expect to see the same error again on Google’s next crawl of your site.
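On an Apache server, a 301 like this is often a single line in your site’s .htaccess file; the paths and domain below are placeholders for illustration:

```apache
# Permanently redirect a 404ing URL to its closest replacement.
# The old path and new URL here are hypothetical examples.
Redirect 301 /old-page/ https://www.example.com/new-page/
```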
Now that we know one of the most common reasons as to why your site may not be appearing on Google, let’s touch on some other topics.
Another reason your site may not be shown on Google is that you are blocking robots in your robots.txt file. Make sure the user agent is set to allow, not disallow, in your robots file. It’s basically like this: pretend you have a robot knocking at the door of your house trying to get in. You can either let the robot in to check out your house, or not. The same holds true for robots on the web. They use the robots.txt file to see which areas of the site they are allowed to come in and crawl. If the user agent in the file is set to disallow, you are basically telling them to take a hike, which means they have no idea what the site is about, so how are they supposed to index it properly in the SERPs? If you are using a WordPress site, as many people do, make sure that under the Reading settings you do not have the “Discourage search engines from indexing this site” checkbox checked.
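If you want to sanity-check robots.txt rules, Python’s standard library can parse them. This is a sketch using made-up rules and a placeholder domain; note how a single extra slash is the difference between allowing everything and blocking everything:

```python
# Sanity-check robots.txt rules with Python's standard library.
# The rules and URLs here are illustrative placeholders.
from urllib.robotparser import RobotFileParser

# An open robots.txt: an empty Disallow means "crawl everything".
open_rules = RobotFileParser()
open_rules.parse(["User-agent: *", "Disallow:"])

# One extra slash locks every crawler out of the whole site.
closed_rules = RobotFileParser()
closed_rules.parse(["User-agent: *", "Disallow: /"])

print(open_rules.can_fetch("Googlebot", "https://www.example.com/"))    # True
print(closed_rules.can_fetch("Googlebot", "https://www.example.com/"))  # False
```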
Another reason could be a poorly configured .htaccess file. The .htaccess file resides in the root folder of your website, and if configured incorrectly it can cause problems such as an infinite redirect loop on the home page, which will never let your site load properly.
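As one example of how a loop can happen: a common .htaccess snippet redirects the bare domain to www, and forgetting the host condition makes the rule match its own destination forever. A sketch with a placeholder domain:

```apache
RewriteEngine On
# Only rewrite when the host is the bare domain. Without this
# condition, the rule below would also match www.example.com and
# keep redirecting to itself in an infinite loop.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```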
Meta tags can also be a culprit. Make sure you do not have a noindex, nofollow robots meta tag on your site; that tag should be index, follow instead (which is also what search engines assume when the tag is absent).
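The tag in question sits in each page’s head section. Here is what both versions look like:

```html
<!-- Safe: explicitly allows indexing (this is also the default
     behavior when the tag is omitted entirely) -->
<meta name="robots" content="index, follow">

<!-- Dangerous: this version keeps the page out of the SERPs -->
<!-- <meta name="robots" content="noindex, nofollow"> -->
```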
Another issue you may come across is a domain name that had a life before you came along. The previous owner of that name may have used shady tactics such as scraping content, link schemes, cloaking, or being deceptive or malicious, or may simply have added no value to the web. Google may have applied a manual action, in other words penalized the site, for those shady practices and dropped it from the SERPs. Then you come along and use this domain thinking everything is fine and, bam, you have now inherited the previous owner’s problem.
One way to check whether any manual actions have been applied to your site is to look in Google Search Console under Manual Actions, under the Search Traffic tab. Google will also send notifications through Search Console to your email if it finds anything suspicious that could be causing problems with indexing your site properly. For example, if your website is hacked, Google will send a notification to the site owner through Search Console warning that if you do not fix the problem, you risk your organic placement in the SERPs. In the search results themselves, Google will also place a little line of text under your site saying “This site may be hacked,” which will definitely scare potential visitors away.
Another way to see if you have any potential issues is to look in Search Console under Links to Your Site, under the Search Traffic tab. Here you can see all of the backlinks Google sees pointing to your site, where you will be able to identify potentially harmful links. If any are found, you can create a disavow file and submit it to Google.
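A disavow file is a plain text file with one URL or domain per line; the domains below are made-up examples:

```
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory.example

# Or disavow a single page that links to you:
http://low-quality-site.example/links.html
```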
If you come across anything harmful, or anything Google has flagged as a manual action on your website, the best thing to do once the issue is fixed is to put in a reconsideration request with Google, so they can come back, verify that all issues with the site have been addressed, and remove the manual action.
The points we have discussed above are some of the most common issues you will come across when trying to figure out why your site is not appearing on Google. Ultimately, the best thing to do is to follow Google’s Webmaster Guidelines when working to get your website indexed in the search results.