
How Google Search Algorithm Updates Affect Your Business

07/11/2017 | Digital Marketing | 15 minutes

Like it or not, if you want to rank on Google Search then you need to pander to their ranking algorithm. The Google Gods will be most displeased if you don’t, and you’ll struggle to see first-page rankings unless you follow their ever-changing guidelines. If you’re not immersed in the world of SEO on a daily basis then it can be difficult to keep up with these changes, let alone edit your website to comply with them. More importantly, almost all of the businesses we work with see their website as a vehicle to drive sales of their product or service, so if you’re not ranking on page one then you’re going to struggle to attract new customers or clients organically.

We’ve taken a look at how these algorithm updates can affect the search visibility of your website, the impact of this on conversion and the overall impact on your business, as well as how you can react to these changes and stay ahead of your competition.

How do I know if I’ve been affected by a Google Algorithm Update?

Unless you’re checking your organic traffic and keyword rankings on a regular basis, you won’t know if you’ve been hit by a Google Algorithm Update until it starts impacting your enquiries or sales. At Urban Element we use Google Analytics to track our organic traffic, the SEMrush Sensor to measure volatility in the SERPs, the SEMrush keyword rankings module to track positions, and Moz’s Google Algorithm Update Timeline to monitor confirmed Google Algorithm Updates.
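If you want to sanity-check this yourself, here’s a rough sketch of the kind of comparison we mean: it looks at average daily organic sessions in the four weeks either side of a confirmed update date, using a Google Analytics export. The file name and column names are assumptions about how you’ve exported the data, not a fixed format.

```python
# A rough sketch: compare average daily organic sessions in the weeks either
# side of a confirmed algorithm update, using a Google Analytics CSV export.
# The file name and the "date" / "organic_sessions" columns are assumptions.
from datetime import timedelta

import pandas as pd

UPDATE_DATE = pd.Timestamp("2017-03-08")  # e.g. when the Fred update was first observed
WINDOW = timedelta(days=28)

traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"])

before = traffic[(traffic["date"] >= UPDATE_DATE - WINDOW) & (traffic["date"] < UPDATE_DATE)]
after = traffic[(traffic["date"] > UPDATE_DATE) & (traffic["date"] <= UPDATE_DATE + WINDOW)]

change = (after["organic_sessions"].mean() - before["organic_sessions"].mean()) \
    / before["organic_sessions"].mean() * 100

print(f"Organic sessions changed by {change:+.1f}% in the four weeks after the update")
```

A sustained drop that lines up with a confirmed update date is a strong hint that the update, rather than seasonality, is to blame.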


Pandering to Panda

Google Panda has been around since 2011, but Google released updates to the content-centric algorithm up until 2015. Panda aims to give users better quality search results by giving pages a quality score based on how useful and unique their content is. Pages that meet the criteria will rank and those that don’t will inevitably bomb. Finding out if you’ve been hit by Google’s Panda update means looking at historical Google Analytics data from as early as 2011, or taking a look at when the subsequent updates happened and correlating this with your organic traffic. Once you’ve identified that you’ve been smitten by Google’s Panda, you’ll need to identify where your shortcomings are.

How do I recover from Google Panda?

Depending on the size of your website, Panda can be relatively easy to recover from in terms of complexity; the difficulty comes in the time it takes to recover. You may need to create fresh content, cull any pages with thin (short) content, or update content on any old, outdated blog posts. First, you’ll need to identify the offending pages on your site, and there are a number of tools that can help you do this.
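As a rough illustration of that first step, the sketch below flags pages that fall under a word-count threshold using a crawl export. The file name, the “url” and “word_count” columns, and the 300-word cut-off are all assumptions about your crawler’s output rather than hard rules.

```python
# A simple illustration of flagging potentially thin pages from a crawl export.
# The file name and the "url" / "word_count" columns are assumptions about
# your crawler's export format; the 300-word cut-off is only a rule of thumb.
import csv

THIN_CONTENT_THRESHOLD = 300  # words

with open("crawl_export.csv", newline="") as f:
    pages = list(csv.DictReader(f))

thin_pages = [p for p in pages if int(p["word_count"]) < THIN_CONTENT_THRESHOLD]

for page in sorted(thin_pages, key=lambda p: int(p["word_count"])):
    print(f'{page["word_count"]:>5} words  {page["url"]}')

print(f"{len(thin_pages)} of {len(pages)} pages fall below the threshold")
```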


What about Google’s other pets?

Many content specialists bandy about the phrase “content is king”, and Panda’s close relation Hummingbird would agree. Like Panda, Hummingbird aims to penalise sites with poor, keyword-stuffed content, but it places more importance on searcher intent. In essence, Hummingbird has paved the way for devices such as Google Home, Amazon Echo and the personal assistants on our phones such as Siri. Our searching habits have become less direct and more conversational, which means that giving the user relevant results as quickly as possible has become increasingly important.

How can I use Hummingbird to my advantage?

The key here is to answer questions from a unique perspective. Finding a niche that other content creators haven’t covered is difficult, but not impossible. I like to use keyword research tools to find out what questions people are actually asking.
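To give a flavour of how this kind of research works, here’s a tiny sketch that generates question-style queries around a seed topic. The seed keyword and question prefixes are purely illustrative; you’d still feed the output into your keyword research tool to check real search volumes.

```python
# A very small sketch of generating question-style search queries around a
# seed topic, in the spirit of question-research tools. The seed keyword and
# prefixes are illustrative placeholders, not a definitive list.
SEED_KEYWORD = "web design"  # illustrative seed topic

QUESTION_PREFIXES = [
    "how much does", "how long does", "what is", "why is",
    "which", "who offers", "where can I find", "is",
]

queries = [f"{prefix} {SEED_KEYWORD}" for prefix in QUESTION_PREFIXES]

for query in queries:
    print(query)
```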


Pleasing the Pigeon and the Possum

Pigeon was released in 2014 to help Google understand where its users are located and adjust its SERPs to show results that are close to the searcher. You can help to please the Pigeon algorithm by adding Organization or LocalBusiness schema.org mark-up to your site, a universally understood format used by search engines to gain a better understanding of website content. Another way of telling Google where you’re located is by adding your business listing to local directories with consistent NAP (Name, Address, Phone number) details.
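By way of illustration, here’s a minimal sketch of LocalBusiness mark-up generated as a JSON-LD snippet for a page’s head. The business details are placeholders and should match the NAP details you use in your directory listings.

```python
# A minimal sketch of generating schema.org LocalBusiness mark-up as a JSON-LD
# snippet to embed in a page. The business details below are placeholders.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Web Design Agency",
    "url": "https://www.example.co.uk",
    "telephone": "+44 1234 567890",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Witney",
        "addressRegion": "Oxfordshire",
        "postalCode": "OX28 1AA",
        "addressCountry": "GB",
    },
}

snippet = f'<script type="application/ld+json">\n{json.dumps(local_business, indent=2)}\n</script>'
print(snippet)
```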

More recently, in 2016, Google released an extension to its local algorithm called Possum which placed more importance on the physical location of the searcher. As we have learned at Urban Element, it’s a challenge to rank for “web design oxford” when your physical address is in Witney. For ourselves, and many of our clients, this has prompted us to focus more on tracking keywords based on user location, rather than national keyword tracking.

What about backlinks?

Links have been important to Google’s ranking algorithm from the very beginning, and its spiders use these links to crawl from site to site, indexing as they go. However, this part of the algorithm was open to exploitation and it wasn’t uncommon for SEOs to buy backlinks in an attempt to gain higher search rankings. When Penguin was launched in 2012, Google had a very clear goal: to audit these links and crack down on any unnatural or manipulative linking practices.

How do I recover from Google Penguin?

Dealing with internal links is easy: either remove the link or slap a ‘rel=nofollow’ on it. External links are trickier to tame. Most of the time external backlinks are out of your control; however, Google recognises this and offers a ‘disavow tool’ where you can ask Google’s robots to ignore links from a particular domain or page.

How do I find bad backlinks?

Google Search Console allows you to see the sites within Google’s index that are linking to your site, but it’s difficult to know which links are good and which are bad. Many SEOs will suggest Ahrefs or Majestic SEO for deep link analysis, but for a lot of businesses the cost of these tools will outweigh the benefits. Personally, I’m a fan of using the SEMrush Backlink Audit tool, which breaks links down into those it believes to be ‘toxic’, ‘potentially toxic’ and ‘non-toxic’. This way you can see which links are doing harm to your visibility and which are beneficial to your overall SEO; you can even create your own disavow file from within the tool. Pretty neat, huh?
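Once you’ve settled on the domains you want Google to ignore, the disavow file itself is just a plain text list. Here’s a short sketch of generating one from a list of domains you’ve judged to be toxic; the domains below are placeholders.

```python
# A short sketch of turning a list of domains you have judged to be toxic
# (for example, exported from a backlink audit) into the disavow file format
# Google Search Console accepts: one "domain:" entry or full URL per line,
# with "#" lines treated as comments. The domains below are placeholders.
from datetime import date

toxic_domains = [
    "spammy-directory.example",
    "link-farm.example",
]

lines = [f"# Disavow file generated {date.today().isoformat()}"]
lines += [f"domain:{domain}" for domain in sorted(toxic_domains)]

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```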


What is Google’s most recent algorithm update?

Fred is Google’s most recent algorithm update. Fred is similar to Panda in that it targets pages with thin content, but it also takes a dislike to article-style content with very little supporting media. In addition, Fred targets sites where a large number of affiliate links and adverts dominate the page, meaning that many sites that rely on advertising revenue are having to tone down their ratio of ad content to page content.
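If you’re unsure how your own pages stack up, a crude check is to compare the amount of body copy with the number of ad or affiliate links on each page. The sketch below illustrates the idea; the sample HTML and the ‘affiliate’ marker in the link URLs are placeholders, and real pages would be fetched and checked against your actual ad networks.

```python
# A rough illustration of comparing body copy with outbound affiliate links on
# a page. The sample HTML and the "affiliate" marker are placeholders only.
from html.parser import HTMLParser

class PageStats(HTMLParser):
    def __init__(self):
        super().__init__()
        self.words = 0
        self.affiliate_links = 0

    def handle_starttag(self, tag, attrs):
        # Count links whose URL looks like an affiliate/tracking link.
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if "affiliate" in href:
                self.affiliate_links += 1

    def handle_data(self, data):
        # Count words of visible text between tags.
        self.words += len(data.split())

sample_html = """<p>Some genuinely useful content for the reader...</p>
<a href="https://tracker.example/affiliate?id=1">Buy now</a>"""

stats = PageStats()
stats.feed(sample_html)
print(f"{stats.words} words of copy, {stats.affiliate_links} affiliate links")
```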

TL;DR

Google just wants you to create good, useful content that is shareable, unique and will generate traffic organically. They have a number of basic principles and specific rules set out in their Webmaster Guidelines. If you’re interested in staying in Google’s good books then we suggest you check them out.
