For those of you who have taken care of your website – conscientiously publishing, keeping it up to date, and marketing it intelligently – it’s frustrating to see it trumped by competitor sites in Google’s search results. Worse, when your manager or clients see the same thing, they’ll be asking: What’s wrong?
In a previous post I explored answers by looking at a site’s Page Authority (PA) and link profile, factors that have been correlated with relevance and ranking. In this post we’ll look at a couple of technical factors that appear to impact search results – things that can be monitored in order to head off problems in the first place.
Using Google Webmaster Tools
As a website owner, you should have a Google Webmaster Tools (GWT) account. If you haven’t set it up yet, this is an opportune time to figure it out. If you do have an account, but haven’t checked it in a while, don’t wait for a problem to drive you there.
Google says that its guidelines are in place to ensure a good user experience. When Google detects an issue with your website and reports it in GWT, it’s your opportunity to fix it and possibly impact how your website is displayed in the search results.
Sorry Pages = Bad User Experience
When people try to access your website but get a “sorry” page instead – known as a 404 Error page – that’s not a good result for the user or for you. Google monitors the incidence of 404 Error pages and reports them as Crawl Errors, located in the Crawl section in the left column of the Site Dashboard.
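You don’t have to wait for the crawl-error report to find broken pages – a short script can spot-check your most important URLs on demand. Here’s a minimal sketch in Python using only the standard library; the URLs are placeholders, so substitute your own pages:

```python
import urllib.request
import urllib.error

def is_broken(status):
    """Treat 404s and server errors as pages that fail the visitor."""
    return status == 404 or status >= 500

def page_status(url, timeout=10):
    """Return the HTTP status code for a URL (e.g. 200, 301, 404)."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code  # urllib raises on 4xx/5xx; the code is on the exception
    except OSError:
        return None  # DNS failure, refused connection, timeout, etc.

# Spot-check a handful of important pages (placeholder URLs, not real pages).
for url in ("https://example.com/", "https://example.com/old-page"):
    status = page_status(url)
    if status is None:
        print(url, "unreachable")
    elif is_broken(status):
        print(url, status, "<-- fix or redirect this page")
    else:
        print(url, status, "ok")
```

Run it against your key landing pages whenever you publish changes; anything flagged as broken is a candidate for a fix or a redirect before Google’s crawler finds it.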
If you recently relaunched your site and the pages stayed the same but their URLs changed, each old URL should have been redirected to its new version through a “301 Redirect.” If a large number of them were not redirected and show up on the crawl error report, work with your web developer to make sure website visitors get to a real page, not a 404 Error page. By helping them get to the information they want, you’ve created an opportunity to engage with them, so your efforts will pay off!
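When a migration involves many URLs, it helps to keep the old-to-new mapping in one place and sanity-check it before your developer turns it into server rules. Two common problems are redirect chains (old URL → intermediate URL → final URL, which wastes crawl budget) and loops. Here’s a minimal sketch in Python; the paths are hypothetical examples, not from any real site:

```python
def resolve(path, redirects, max_hops=5):
    """Follow a redirect map from `path` to its final destination.

    Returns (final_path, hop_count). Raises ValueError on a loop or an
    overly long chain -- both are worth fixing before going live.
    """
    seen = [path]
    while path in redirects:
        path = redirects[path]
        if path in seen:
            raise ValueError("redirect loop: " + " -> ".join(seen + [path]))
        seen.append(path)
        if len(seen) - 1 > max_hops:
            raise ValueError("redirect chain too long: " + " -> ".join(seen))
    return path, len(seen) - 1

# Hypothetical mapping from a site migration: old URL path -> new URL path.
redirects = {
    "/old-about.html": "/about/",
    "/old-services.html": "/services.html",  # chains through a second hop
    "/services.html": "/services/",
}

print(resolve("/old-about.html", redirects))     # one hop: fine
print(resolve("/old-services.html", redirects))  # two hops: collapse to one
```

Any entry that resolves in more than one hop should be rewritten to point directly at the final URL, so visitors and crawlers get exactly one 301 on the way to the page.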
Manual Actions: A Day of Reckoning
When Google detects violations of its quality guidelines, it might let you know by reporting a Manual Action in the Search Traffic section.
One commonly reported violation of Google’s quality guidelines, “unnatural links detected,” is a euphemism Google uses to identify link spam – links to your website that appear to come from spammy (useless) websites. If you do not recognize the “unnatural” links, they could be the result of a link building project that was carried out for your company. Google’s Penguin algorithm change in 2012 cracked down on links of questionable value. If you were paying somebody for links, the site might be suffering as a result.
Whatever the manual action against the site, you don’t want to be on Google’s radar. If you need help figuring out how to fix it, consult the GWT help forum or a reputable SEO firm. And remember, don’t pay for links!
A complete discussion of GWT is beyond the scope of this post, but I hope touching on a couple of areas helps you appreciate the importance of getting familiar with every section and checking in on a regular basis.
The Danger of “Set it and Forget it”
Despite the great tools that Google supplies, managers stressed for time often don’t use them to monitor their websites’ performance on a regular basis. The Internet is a dynamic and fluid environment that affects even the most static website. Use the information available in Google Webmaster Tools and Google Analytics to help guide marketing priorities and to keep an eye on trends. Doing so will help you stay ahead of the curve, and ahead of your competitors.