Google Penguin : Recovery Questions & Examination Update

 

Preface: For those of you who aren't familiar with Google updates (where have you been!), please read our previous posts for more background and history: A 'Wake-up' Call For SEO - & - Learning From The Panda Update - & - Dealing With Google Penalties - & - Google Penguin : An Examination & Conclusions

 

It's now been quite a while since Matt Cutts officially announced the worldwide 'launch' of the Penguin algorithm change (April 24th 2012) on the Google Webmaster Central Blog. In all my years in SEO, I've never seen such widespread panic and confusion over an algorithm change. And it's completely understandable; Google effectively cut the legs out from underneath millions of sites and their rankings.

And we're not talking about 'dodgy black-hat' sites here, but solid real-world business websites; businesses that pay the salaries of many people. The sheer quantity of highly respected websites that have been slammed into search-oblivion is just mind-boggling. You only have to visit any major SEO forum these days and much of the talk is about one thing: Penguin (or Panda) Recovery.

I wrote about Penguin within days of it going live, after analysing a mountain of information across tens of thousands of client domains, and consulting with several other technical SEOs with equally large data-sets... (See here)

...And in all honesty, not a great deal has changed from my original analysis, but there are many refinements which have become apparent as time has ticked by. Add to this the constant on-going discussion and comparison of data between myself and many other senior SEO's, and I felt it was time to post an update.

Why Make These Changes?

One key factor that you should never lose sight of is why Google has done this. Google's spin-doctors will tell you that it's all about combating 'web-spam', but all the evidence points to the contrary. All of the Panda and Penguin updates so far have been about one thing: making more money and hitting corporate/shareholder growth targets. By turning the natural rankings upside down and thus collapsing the cash-flow of so many businesses, Google has immediately increased global AdWords spend (and, unfortunately, individual bid-prices along with it). The double-whammy is that business-people who don't understand what is happening are scare-mongered away from using SEO as a source of leads and sales. All of this is in the interests of Google and its shareholders; everybody else hurts, while Google's income and profits go up.

Now, maybe that sounds a little dramatic for some of you; and if that's the case then I guess you were one of the very few who escaped being 'Pandalised'. But as an SEO with literally thousands of paying customers and tens of thousands of domains under my link-building purview, I can state quite plainly that you would definitely be in the minority!

Current Google search results are now the lowest quality they have ever been. The volume of bland sites with little or no useful information that are now ranking is quite simply astounding. I would strongly recommend that you try out Bing for your own personal searches; I think you'll quickly find the returned results far more useful than the majority of rubbish that comes back from Google. But as SEOs and business people, we obviously still have to focus on Google rankings from the commercial and business-building perspective, as that's where the majority of the world is still searching (for the time being at least!)

Key Factors In Panda & Penguin

First, you need to completely understand that Penguin is an algorithmic change, as I said in my last post. This means it's pointless contacting Google with reconsideration requests or pleading for any kind of help or direction. They have no particular desire to help you anyway, and if you cause yourself to be further reviewed, I can guarantee that what they'll find will only ever be to your detriment.

And the simple fact of the matter is that they can't do anything anyway. Since Penguin is algorithmic in nature, there are no penalties as such that they could remove. The only thing that will change your rankings now is either another change to the algorithm, or a change in the factors that determine your ranking. Since we have no control over the algorithm, we can only address the factors that have caused the ranking change.

All current analysis shows that there are 2 main areas that have affected rankings:

  1. Anchor-text profile: Generally over-optimisation or over-use of 'keyword' terms.
  2. On-page SEO and content quality/factors of your site/pages.

The Google Panda updates of late 2011 and early 2012 introduced gradual changes in these areas, but it was Penguin that really turned the sensitivity up to crazy proportions. SEO has, at a simple level, always been about getting inbound links with a variety of your keywords used as the anchor-text. That's a gross over-simplification, but at its core it has formed the fundamental drive of all link-building done for the last 10 years. Google 'played along' with this SEO tactic by rewarding sites with higher rankings when they built more links with the 'appropriate' keyword anchor-texts.

Now, of course, there's a lot more to it than that: link velocity and volume have been an important part of avoiding penalty triggers, together with diversity in IPs/referring domains, as well as link-type and platform etc. But essentially, the 'rules' of the game (not as defined by Google outwardly, because they never do that, but as defined by what actually worked) were fairly constant. There were plenty of things you could do wrong to mess with your results, but the triggers and percentages were always reasonably stable. That was until 2012... What Panda started to do (from mid 2011) was alter the fundamental trigger-rules and values; Penguin just whacked the sensitivity dial up a lot.

Up until mid 2011, you could often get away with anchor-profiles that were 70-90% keyword-based. I've been advocating the use of diverse anchor-texts since 2006/2007, and using URL/branded anchors as well as junk/generic anchors has been a systemised part of our link-building for many years. A sensible link-profile would always have been limited to roughly 1/3 keyword anchors (highly varied ones which provide long-tail/LSI variation), around 1/3 generic anchors and 1/3 URL/branding anchors. (For those of you using Backlink Banzai since 2008/2009, you'll recognise this as the 100% Naturalisation option.)

But we saw many sites with absurdly high numbers of exact-match anchors that still did very well, and many SEOs argued that our more cautious system 'wasted' links! Seeing the colossal numbers of those sites laid to waste by Panda and Penguin wasn't exactly the kind of vindication we were hoping for, and I've had many conversations with other SEOs who've come to me hat in hand asking for help with over-optimisation. But the key point here is that everyone was playing by the rules that Google rewarded; Google might not have openly stated this or even admitted it to be the case, but the fact that following these rules gave increased rankings means it was their system that rewarded this kind of link-building.

This is why Penguin has upset so many people; it's not the fact that there are new rules to follow, but that Google has almost done an about-face on some of the major factors that everyone has been using for many years. It's easy for them to hide behind their holier-than-thou attitude, as they've always been anti-SEO, and therefore a change in the rules is par for the course as far as they're concerned. But it's the dramatic and sudden nature of these changes that has caught so many unawares.

Understanding Anchor-Text Profiles

Anchor-text profiles are widely misunderstood and are often misread at a far too superficial level, so I thought I'd take some time to explain some basics.

Essentially, you start with some kind of link report which provides the list of URLs that link to your sites/domains. Each URL on this report should also show the anchor-text used for that link, i.e. the text of the hyperlink shown on the site that is providing the direct link to you.

Many reports will group the anchor-texts together and maybe show that 126 links use 'X Anchor' and 94 links use 'Y Anchor'. You want this report to be as comprehensive as possible so you can get a good idea of the distribution of your inbound anchors.

Majestic SEO, as well as most other professional link-analysis tools, will have an anchor text report which can be downloaded as a CSV (Comma Separated Values) file, and then be opened in MS Excel or some other spreadsheet for more in-depth analysis.
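
If you'd rather script the analysis than work in Excel, a few lines of Python will do the grouping for you. This is just a minimal sketch: the file name and the 'Anchor Text' column are hypothetical placeholders, so match them to whatever your link tool actually exports (one row per link is assumed).

    import csv
    from collections import Counter

    def load_anchor_counts(csv_path, anchor_column="Anchor Text"):
        """Count how many inbound links use each anchor text.
        Assumes one row per link; adjust the column name to your export."""
        counts = Counter()
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                anchor = row.get(anchor_column, "").strip().lower()
                if anchor:
                    counts[anchor] += 1
        return counts

    anchor_counts = load_anchor_counts("anchor_export.csv")
    total_links = sum(anchor_counts.values())
    print(f"{total_links} links, {len(anchor_counts)} distinct anchors")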

At a simple level, the key information you're looking for is the percentage of links that contain the keywords that your page has been created/optimised for. But you also want to see the overall number of keywords used, and the range and density of 'root' keyword-terms (we'll cover this in a moment in point 3) that make up the anchor-texts of the overall profile.

You'll actually be paying attention to a number of important factors here:

  1. The top 10 most-used anchors to your page, and their percentages of use in the overall list.
  2. The total number of different anchor-phrases used as compared to the size of the overall list.
  3. Whether a few 'root' keywords are continually repeated throughout; thus increasing their individual word density.
  4. The number/percentage of one-off/original anchors used overall.
  5. The number/percentage of keyword vs non-keyword related anchors overall.
  6. The number/percentage of brand/URL/naked anchors overall.
  7. The number/percentage of generic/junk anchors overall.

So, one-by-one, let's cover each of these. And we'll use arbitrary examples based on the weight-loss niche to help us explain.

1. The top 10 most-used anchors to your page, and their percentages of use in the overall list.

'Ideal' percentages and ratios vary, depending on the sector or market you're in. But generally, to be safe, we've found that no more than 10% of your inbound anchors should contain any single keyword-phrase - especially one that has been optimised on-page. e.g. if the exact-match term 'weight loss diets' appears more than 10% of the time, then that search-term runs the risk of being penalised for that page. It doesn't necessarily mean it will of course, but the chances increase the higher the percentage gets.

For most URLs, you'll find that the top 3 keywords used will utterly dominate the percentages. This is simply because most people build links for about 3 terms per page. It's quite usual to find that the top 3 terms occupy 60-95% of the link-spread, with the top term accounting for well over half the links in many cases. When you see these numbers, it's easy to understand how Google is using this against you. It's not natural, and it stands out like a sore thumb.

If you've got fewer than 10 different anchor texts, then at least one of them automatically accounts for over 10%.
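
Carrying on from the loading sketch earlier, the 10% check is just a matter of ranking the counts and comparing each anchor's share against the threshold. The 10% figure is the rule of thumb from this section, not a published Google number.

    def top_anchor_report(anchor_counts, top_n=10, limit=0.10):
        """Print the most-used anchors with their share of all links,
        flagging any single phrase above the rule-of-thumb limit."""
        total = sum(anchor_counts.values())
        for anchor, count in anchor_counts.most_common(top_n):
            share = count / total
            flag = "  <-- over limit" if share > limit else ""
            print(f"{share:6.1%}  {count:5d}  {anchor}{flag}")

    top_anchor_report(anchor_counts)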

Dilute, Dilute, Dilute!

We introduced the 'Penguinator' option in Backlink Banzai and Wiki Whirlwind shortly after Penguin went live. This was a further diversification of our Naturalisation option.

We also recommended that the 'base' keyword-set for each URL be increased to 5-8 phrases, with a decent amount of keyword variety for the naturalisation software to key-off and create even more extended anchors.

It was also worthwhile adding company/brand names etc as keywords to further add to the anchor mix and dilute the current anchor-text profile. Many members changed all their anchors to different LSI keywords (without using any of the old root keywords for that page) to completely dilute their profile.

2. The total number of different anchor-phrases used as compared to the size of the overall list.

If you have 250 links and 25 different anchors, then you have a unique-phrase density factor of 10. A higher factor is bad, as it means fewer individual link-phrases per page. You should ideally aim for a factor below 5 to help make the overall profile look natural. So if you have 250 links, then there should be at least 250/5 = 50 different terms used overall; 1,000 links would require 200 different terms. Now obviously these aren't all phrases that you're 'targeting', but that anchor diversity is important to create the look and feel of natural link-building. And don't forget that the majority of this diversity will be made up of one-off anchor-phrases (see point 4).
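
The density factor is simply total links divided by distinct anchor phrases, so the check (and the number of distinct anchors you'd need for a target factor of 5) takes only a couple of lines. It re-uses the anchor_counts from the earlier sketch, and the target of 5 is our guideline rather than anything official.

    def phrase_density_factor(anchor_counts):
        """Total links divided by distinct anchor phrases; lower is better."""
        return sum(anchor_counts.values()) / len(anchor_counts)

    factor = phrase_density_factor(anchor_counts)
    needed = sum(anchor_counts.values()) / 5  # distinct anchors needed for a factor of 5
    print(f"Density factor: {factor:.1f} (aim below 5, i.e. at least {needed:.0f} distinct anchors)")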

3. Whether a few 'root' keywords are continually repeated; thus increasing their individual word density.

This is the area that most commonly confuses everyone, but it's very important to understand, as otherwise your link-building efforts to dilute your anchor-profile can be a complete waste of time.

Google parses anchor-text phrases using the individual 'root' keywords that make up the phrase. When it looks at 'weight loss diets' as an anchor-text, it sees an anchor for 'weight', an anchor for 'loss', and an anchor for 'diets'. This is fairly obvious really, and it's why we can get a lot of long-tail rankings from using a variety of keywords. But people often don't make the reverse connection when it comes to over-optimisation.

Therefore, if you had 100 unique anchor phrases from 100 different links, it sounds like you're completely safe from Penguin, as you never exceed 1% individual phrase-density. But what if all those phrases were extensions/long-tails of 'weight loss diets'?

i.e. 'rapid weight loss diets', 'successful weight loss diets', 'top weight loss diets uncovered' etc.

They're all different when considered as a single, stand-alone exact-match phrase, but they end up with a 100% density of the root keywords 'weight', 'loss' and 'diets'. You can pretty much guarantee that you'll be penalised for the phrase 'weight loss diets' and also 'weight loss' etc.

This is why it's so important to use a variety of 'root' keywords and LSI/synonym keywords to break up the overall density of your keyword-phrases. Yes; you want to use extenders and modifiers heavily to vary your exact-match anchor-phrases and provide lots of variety - thus helping you with point 2. But you also need to use a decent variety of base keywords in the first place, as otherwise you end up in the scenario we've just explained.

However, the ratios for this are NOT the same or as strict as the 10% in point 1. We generally find that as long as your root keywords are below 20% in terms of overall density throughout all the anchors, you won't experience many problems.
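
To check this, break every anchor into its individual words and weight each word by the number of links that use that anchor. The sketch below continues from the earlier anchor_counts example; the small stop-word list is an arbitrary placeholder and the 20% ceiling is the guideline from this section.

    from collections import Counter

    STOP_WORDS = {"the", "a", "an", "of", "for", "and", "to", "on", "in", "this"}

    def root_word_density(anchor_counts):
        """Share of all (non stop-word) anchor words accounted for by each root word,
        weighted by how many links use each anchor."""
        word_counts = Counter()
        for anchor, links in anchor_counts.items():
            for word in anchor.split():
                if word not in STOP_WORDS:
                    word_counts[word] += links
        total_words = sum(word_counts.values())
        return {w: c / total_words for w, c in word_counts.items()}

    densities = root_word_density(anchor_counts)
    for word, share in sorted(densities.items(), key=lambda x: -x[1])[:10]:
        warn = "  <-- over the 20% guideline" if share > 0.20 else ""
        print(f"{share:6.1%}  {word}{warn}")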

Using EMD/PMD URLs (which are inherently keyword-rich) isn't as bad as using plain keyword phrases either. You can get away with a higher percentage of URL/naked anchors that are exact/phrase-match domains, as Google seems to go easier on these, by virtue of the fact that they're brand/URL anchors - and a more 'natural' anchor-text to use for linking anyway. But you still need to be very careful these days with EMD/PMD domain names, as Google has become increasingly strict with keyword-oriented domains.

4. The number/percentage of one-off/original anchors used overall.

You want plenty of 'one-off' anchors; phrases that only ever appear once (or twice). The more, the merrier. Nothing looks more natural than a one-off anchor that doesn't get repeated, especially if it's slightly longer and seems 'informal' or 'chatty', e.g. 'check out this total doozy of a post on weight-loss'. That sounds like a friendly, fun and 'social' link that someone has created to your page.

Percentages and ratios are hard to pin down here, but we take care of it as part of our usual link-building by always using a massive number of extensions on our keyword phrases - thus creating lots of links that only feature that exact phrase once or twice. (Backlink Banzai and Wiki Whirlwind, for example, automatically use over 100 extensions per defined keyword as part of our anchor naturalisation process.) We also use hundreds of different variations on generic phrases like 'click here' etc.
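
Measuring this is straightforward once you have the anchor counts from the earlier sketch: just work out what fraction of your links use an anchor that appears no more than once or twice.

    def one_off_share(anchor_counts, max_uses=2):
        """Fraction of links whose anchor text appears no more than max_uses times."""
        total = sum(anchor_counts.values())
        rare_links = sum(c for c in anchor_counts.values() if c <= max_uses)
        return rare_links / total

    print(f"One-off (or near one-off) anchors: {one_off_share(anchor_counts):.1%} of all links")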

5. The number/percentage of keyword vs non-keyword related anchors overall.

Taking all the previous points into account, you should ensure that you're always building a decent percentage of links that are non-keyword based. We've found that, depending on how well you've mixed up the keyword phrases, you need at least 2/3 of anchors to be non-keyword based. For Penguin recovery, we recommend starting by pushing out 1/2 of all your new links as bare URL anchors and 1/2 as generic/junk anchors. You should do this until a more stable balance has been achieved and you start to see rankings improve.

6. The number/percentage of brand/URL/naked anchors overall.

URL/Naked anchors look like this:

'http://www.example.com' or 'www.example.com' or 'example.com' etc.

You're using the URL itself as the anchor-text. This is one of the most natural looking links - as it doesn't smack of any keyword manipulation.

Brand anchors are the company/product/brand name - and are often contained in the URL anyway. The URL itself could be the brand - as in 'facebook.com'. But a brand anchor can also be non-keyword related and can be useful in diluting the anchor-pool with non-keyword related terms that are commercially relevant. i.e. 'Amazon' would be a brand link to Amazon.com, but is obviously not related to any keywords. 'Amazon Books' starts becoming more keyword related - as it's now using a contextually relevant keyword. Mixing these up and using brand names with keywords can also be a good way of generating a wider variety of anchor-texts.

Post-Penguin, the ideal percentage of URL-based anchors has increased dramatically. It's hard to put an individual percentage on it, as it needs to be taken in context with the rest of the overall profile, but 1/3 to 1/2 (33%-50%) seems a good target right now.

7. The number/percentage of generic/junk anchors overall.

Generic or junk anchors look like this:

'Click here for more info' or 'Check these guys out' or 'This site offers some useful tips' etc.

Again, post-Penguin, the ideal percentage of generic anchors has increased dramatically, and it also needs to be taken in context with the rest of the overall profile, but 1/3 to 1/2 (33%-50%) is a good target. Combined, the percentage of non-keyword anchors probably needs to be around the 70%-90% mark, but again, percentages vary somewhat by sector.
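
Points 5 to 7 all come down to splitting the anchor pool into keyword, URL/brand and generic buckets. A rough classifier like the one below gives a quick read on that split; the generic, brand and root-keyword lists are hypothetical examples you'd replace with your own, and the percentages quoted above are guidelines rather than hard rules.

    import re
    from collections import Counter

    GENERIC_PHRASES = {"click here", "read more", "this site", "check these guys out", "more info"}
    BRAND_TERMS = {"acme", "acme fitness"}               # hypothetical brand names
    ROOT_KEYWORDS = {"weight", "loss", "diet", "diets"}  # your target root keywords

    URL_PATTERN = re.compile(r"^(https?://|www\.)|\.[a-z]{2,}(/|$)", re.IGNORECASE)

    def classify_anchor(anchor):
        """Very rough bucket assignment: url/brand, generic, keyword or other."""
        a = anchor.lower().strip()
        if URL_PATTERN.search(a) or a in BRAND_TERMS:
            return "url/brand"
        if a in GENERIC_PHRASES:
            return "generic"
        if any(word in ROOT_KEYWORDS for word in a.split()):
            return "keyword"
        return "other"

    buckets = Counter()
    for anchor, links in anchor_counts.items():
        buckets[classify_anchor(anchor)] += links

    total = sum(buckets.values())
    for bucket, links in buckets.most_common():
        print(f"{links / total:6.1%}  {bucket}")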

Mixing Up Your Root Keyword Pool

As discussed in point 3 above, the overall use and density of some root keywords can be the downfall of a page post-Penguin. When you originally target a page at an SEO phrase, you probably (hopefully!) based it on some keyword research suggesting that the term would be commercially worth your while. It's this view that has stopped many SEOs from using variety in their anchors over the years, as they figure that building links without commercially-driven keywords is simply a waste of effort. But as we've discussed, this thinking completely ignores the reality of Google's algorithm. If you were a chef and you wanted to make a good stew, you wouldn't just keep adding more of the most important ingredient; the dish 'works' because of a combination of factors, and often some of the smallest ingredients can make the biggest impact.

LSI is a phrase that has been over-used in SEO for years, and true LSI is mind-bogglingly complex computationally. But when SEOs say use 'LSI phrases', they mean keywords or phrases that convey the same thing with different words; commonly synonyms or related meanings. Words often belong to a conceptual 'family'; if you looked in a thesaurus under 'contract', you might find the synonyms: agreement, deal, treaty, bond, pact. These all carry similar meanings in some context, and are therefore useful in anchor creation, as they'll support LSI-similar keywords and also help increase long-tail traffic to your site.

Google uses a variety of clever contextual/relevancy algorithms (it couldn't be full LSI, as the processing power required would be staggering), which you can see when you search for certain terms and words with similar meanings appear in the search results. This shows that using these themed word-sets can help further diversify your anchor-pool, while still retaining commercial relevancy and keeping Google happy.

Generally, we recommend starting with a base of a few main anchors (the closely similar ones that you're targeting) and then extending them with LSI keywords and thematically similar long-tails so that your base-list of keywords is at 7-15 terms. These terms should follow the rules discussed in point 3, and should avoid centring too much on 1 or 2 root words. You can then take these starting keywords and extend them out with modifiers to create hundreds of different anchors that each contain one of these base anchors. Backlink Banzai and Wiki Whirlwind do this automatically with our Naturalisation options; you simply enter the base 7-15 terms and we take care of the extensions/modifiers plus the generic and URL anchors. All you have to do is choose the level of Naturalisation: What percentage of the output anchors you want modified.

Non Anchor-Related Factors

You'll also want to monitor the following factors that are not anchor-related:

8. The distribution/percentage of Do-Follow vs No-Follow anchors overall.

Many SEOs have considered no-follow links to be a waste of time, as the point of the no-follow tag was to tell the search engine not to pass any ranking factors along that hyperlink. At least, that's what seemed to be the case...

Now, we know that no-follow links do affect rankings and seem to pass authority/trust-rank quite well. And of course, it's another part of what makes a link-profile look 'natural'. If all your links are do-follow then that's another flag of manipulated link-building.

Since Penguin hit, we've seen a rise in the required percentage of no-follow links to a minimum of 10-20% of the overall link-profile, but we recommend 20-30% as a safe target to shoot for.
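
If your link export includes a no-follow flag (many tools provide one, though the column name varies; 'NoFollow' below is a hypothetical placeholder), the do-follow/no-follow split is another quick ratio to compute.

    import csv

    def nofollow_share(csv_path, flag_column="NoFollow"):
        """Fraction of links flagged as no-follow in the export."""
        total = nofollow = 0
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                total += 1
                if row.get(flag_column, "").strip().lower() in {"true", "yes", "1"}:
                    nofollow += 1
        return nofollow / total if total else 0.0

    print(f"No-follow links: {nofollow_share('anchor_export.csv'):.1%} (20-30% is our target)")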

9. The overall distribution of link-type/platforms.

Many studies show that those sites with a narrow band of link-types are usually hit much harder than those with a wider variety of sources. If all your links are from blog comments and forum-profiles, then you've got an unnatural link type/platform spread. You want links from as many different sources as possible, but keep an eye on the quality. There are many link-building services out there; just be sure that you can meet the criteria established in the points covered above. It's pointless building more platform/type variety if you screw-up your anchor-text profile along the way.

10. The overall number of individual referring domains/IP's.

This has always been important, and isn't really a change, but it's something that you should continue to monitor. It's pointless having 1,000 links if they're all from the same site or IP address. This is not entirely the case with very high PR sites, but the likelihood of you getting tens or hundreds of links from a very high PR site is remote (e.g. 100 links from 100 different Amazon pages would all be worth having!). The big Web 2.0 properties also act slightly differently here. It would be worth having 100 links from different WordPress.com pages/sub-domains, for instance, as it's a PR9 site with millions upon millions of pages. But this needs to be tempered with all the other points discussed above and kept in proportion to the rest of the links built.

Try to avoid or remove any site-wide/blog-roll type links which cause thousands of links to show from the same domain. These can quickly back-fire and cause a lot of ranking damage. Many SEO link-analysis tools have reports which show the number of referring domains, and this can be useful for spotting site-wide or blog-roll type links when they happen.

As a side-note, you'll often hear SEOs discuss SEO/network hosting, or 'Class C' hosting. This is the practice of hosting multiple websites (often a blog network) on different servers and IP addresses to make them look individual and separate, i.e. to try and stop Google spotting their fundamental interconnectedness and penalising or discounting their linking-power. Although not technically correct from a networking perspective, SEOs refer to 'Class C' hosting as varying the third number in the IP address. Take an address like 111.222.233.244: each of the 4 numbers ranges from 0-255. The first number is the most 'important' - you could think of it like the 'country', the 2nd number could be compared to the 'state/region', the third to a district/road, and the fourth to a house/building. IP addresses bear no actual resemblance to geographic places in this way at all, but the analogy helps you picture how the numbers become less significant towards the end. A single server will commonly have one or a few 'Class D' numbers, so a lot of websites on shared hosting will have similar (or the same) 'Class D' addresses. By varying the 'Class C' number, SEOs attempt to mimic the server diversity of real, 'disconnected', independent websites (although you'd ideally want variety in the Class A and B too, of course). The only point of this is when you're linking from these sites to other sites, and you want each site's link to 'count' as an independent/separate link.
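
If your report also lists the referring URL and IP address for each link, the same kind of counting shows how concentrated your profile is by domain and by 'Class C' block (treated here, in the loose SEO sense above, as simply the first three octets of the IPv4 address). The column names are hypothetical, so adjust them to your export.

    import csv
    from collections import Counter
    from urllib.parse import urlparse

    def referring_diversity(csv_path, url_column="Referring URL", ip_column="Referring IP"):
        """Count links per referring domain and per 'Class C' block (first three octets)."""
        domains, class_c = Counter(), Counter()
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                url = row.get(url_column, "")
                if url:
                    domains[urlparse(url).netloc.lower()] += 1
                ip = row.get(ip_column, "").strip()
                if ip.count(".") == 3:
                    class_c[".".join(ip.split(".")[:3])] += 1
        return domains, class_c

    domains, class_c = referring_diversity("anchor_export.csv")
    print(f"{len(domains)} referring domains, {len(class_c)} distinct 'Class C' blocks")
    print("Heaviest domains:", domains.most_common(5))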

General Link-Building Factors

Overall link volume and link velocity are still very important; never lose sight of this. Maintaining a steady (or rising) and constant flow of links to a page is an important part of establishing social factors for a page. Constant link activity means regular page caching, and tells Google that the page remains socially significant and worth ranking. Fresh links are as important now as aged links.

But that doesn't mean that all your link-building needs to be pointed directly at your 'money' sites; far from it. Post-Penguin, it has become more important than ever to protect yourself from Google's destructive and arbitrary changes. You can do this by using 'tiered' linking and establishing a range of 1st tier 'disposable' ranking assets.

I've discussed this at length in other posts, but as a summary: each of these 1st tier pages links directly to your money-page via a single outbound link, and consists of well-written, contextually-relevant content. Each should also follow the points above and use varying anchor-texts; not just keywords. These assets can then be SEO'd and have links built to them in the same way that your money-pages would be, but you can (and should) push higher volumes of links at them.

You're essentially trying to rank them in their own right, so that when you have lots of them, your money pages are being linked to by a lot of other ranking-assets which are building their own authority.

They're 'disposable' in the sense that if they get messed with by Google, you can just throw them away and build another one! You should aim to build hundreds (or thousands) of these over time, and if you can get them on high PR/trust domains like 2.0's/micro-blogs/quality article sites or forums etc. then you also leech authority from that site. With 2.0's, it's essential that the content is relevant and not too 'salesy', preferably written in the style of a hobbyist or fan - as these get left alone more.

The whole point of tiered link-building is to:

  1. Establish external assets that form a ring of defence against the possible Google penalties that can be experienced from direct-linking,
  2. Build more ranking assets in their own right that can be ranked for long-tail terms and thus increase your traffic,
  3. Possibly enable you to achieve multiple-rankings in the Top 50 - for extra chances of being clicked,
  4. Allow more aggressive link-building by not sending it all directly to your key pages.

Many aggressive black-hatters use 3-4 tiers or more and push literally millions of links at the bottom levels to help 'juice up' the entire link structure. You should note that these pyramids/structures should not be neatly organised; they need to be messy and chaotic in construction, so as not to look contrived. Do not use closed 'link-wheels', as they're far too obvious. The outside 'edges' of any link structure need to be as messy as possible, so that it's not an obvious self-contained entity. It's also worthwhile linking out to other sites at the edges of the structure so that it's not a closed system, which will further grey the boundaries.

It's incredibly important now to get into the mind-set of 'scale' and a diversified sales-funnel. As you build more ranking-assets, you add constant breadth to your sales-funnel. It's far better to have 100 pages each generating 1% of your traffic than 1 page generating 100% of it; you're protected to a much greater degree. Look into the vast range of long-tails that are sure to exist in your industry/sector; they're MUCH easier to rank for - and most large sites get 50-80% of their traffic from long-tails - so go after those easy targets and build automatic diversity and protection into your traffic flow.

All 1st tier linking should always be done via content-based linking, i.e. a link that is published with surrounding content. This should be at least 300-500 words of reasonable content; not junk or auto-spun gibberish. Non-content-based links should be saved for the lower tiers. We always use content-based linking for tier 2 as well, and tiers 3 & 4 are where we would use lower-quality/spammy links.

Although we've seen an increase in the power of using relevant content on the linking page, it's not always necessary (or even viable) for the vast majority of your lower-tier links. You should ensure that all your manually-built 1st tier assets are relevant (obviously, as you're trying to rank these in their own right), and the majority of your 2nd tier too, but going for relevant content in all your lower-tier link-building would hugely restrict the number of sites you can use, and the pool of IPs etc. that are available. And volume is still important!

On-Page SEO Factors

Some of the key issues that we've found with penalised sites are:

  1. Link-swapping or auto-reciprocal link schemes which have a single self-hosted page that you've linked to from your home-page. When you view this page, there are often multiple tabbed pages with hundreds of links which are shared across the member network. You've become a link-farm in Google's eyes!
  2. Sites where the owner has decided to target multiple geo-locations by adding extra pages; one for each area. So 'Bristol Plumbing', 'Manchester Plumbing', 'Southampton Plumbing' etc. We've seen sites with hundreds or even thousands of these auto-produced pages which feature spun content and very little variation. Even with great-quality spun content, the page structure is always the same and the page and header titles really only vary by the place name. This is blindingly obvious for Google to spot; and they don't like sites trying to take up 1,000 pages of their index when only 1 is 'necessary' - just to cover 1,000 different keyword terms. This used to work; but now it's a ticking time-bomb for an established site. You could use this technique on disposable grey-hat sites (which you're prepared to lose), but NOT on brand/important sites.
  3. Pages with massively over-optimised content/titles and very high/unnatural keyword densities. You can de-optimise a title by adding extra words to make it sound more human. Maybe move the keywords so they're not the first words of the title - to soften them a little. Keyword density used to be an SEO target; now it should be viewed as a limit (a quick way to measure it is sketched after this list). Keep it natural and use synonyms/LSI phrases which keep the page contextual but avoid over-use of the keyword terms. If you have a good, natural-sounding page-title (6/7 words or more), you can use that in a few anchors that link into that page.
  4. Pages that feature constant duplicate-content; some large block of text that appears on all the pages. If you have a standard block that needs to be seen on each page, then link to it and maybe use a graphic to highlight it, but don't just add the same 300 words to all 150 pages of your site.
  5. Very 'thin' content on most pages; maybe only 100-200 words and a lot of pictures or adverts etc. Google doesn't like thin sites... You need written content to rank well. You'll also inherently end up with very short visitor times and high bounce-rates - which will tell Google that you're not worth ranking so well. You want to be an authority on your subject, and to do that you need good content to explain and engage your visitors. The longer you can keep people interested on your site, the better. (Short videos can also work very well here to keep users on-site - and the video content can be provided as written content to win both sides of the equation.)
  6. Bad internal navigation. It's generally agreed that a site that has more internal page inter-linking with embedded contextual anchors looks more natural. If you look at Wikipedia, you'll see a lot of in-content inter-linking throughout the site. This is a footprint of a larger, more organised and professional information-rich website. Typical affiliate or sales websites have very little (or no) inter-linking; just a static link bar. Obviously, you need the content available to inter-link, but if your site is so short of content that you can't inter-link between relevant subjects then you need to look at increasing this. And we're talking about specific content-related inter-linking inside sentences, (like Wikipedia,) NOT generic site-wide links.
  7. Slow site page-load/access times. Often from low-quality shared hosting, over-laden plugin-heavy WordPress sites or too many large graphics files. There are many ways to optimise and speed up your site page-load times, but good-quality business-class hosting takes care of most of your issues. It doesn't need to be a dedicated server, or even a VPS for most sites, but you want to make sure you're with a good business-quality host with low user ratios and proper 'virtualised' server technology. Virtualised server hosting means that individual sites and blocks are handled as separate entities, and so if another site 'goes down' or is experiencing slow load-times, then you're not affected by it; which is common with domestic shared-hosting.

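On the keyword-density point in item 3, one simple way to measure it (conventions vary, so treat this as a rough sketch) is to count whole-word occurrences of the phrase against the total word count of the page. The file name is a placeholder for a plain-text copy of your page copy.

    import re

    def keyword_density(page_text, phrase):
        """Rough on-page density: whole-word, case-insensitive occurrences of the
        phrase, times its word count, divided by total words on the page."""
        words = re.findall(r"[a-z0-9']+", page_text.lower())
        phrase_words = phrase.lower().split()
        hits = sum(
            words[i:i + len(phrase_words)] == phrase_words
            for i in range(len(words) - len(phrase_words) + 1)
        )
        return hits * len(phrase_words) / max(len(words), 1)

    text = open("page.txt", encoding="utf-8").read()  # plain-text copy of the page body
    print(f"'weight loss diets': {keyword_density(text, 'weight loss diets'):.1%}")
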
Conclusions

I know there are people that will read all this and say "Just tell me what to do...", or "Cut to the chase... What do I need to get my rankings back?" and my answer to them is the same that I give to all the people I speak to daily asking the same thing...

There is NO universal cookie-cutter solution. What works for one site will be completely different from what works for another, because the bottom line is that the Google ranking algorithm is incredibly complex, and what tripped your issues won't be what tripped someone else's. Only a solid understanding of the possible factors, together with some time spent in analysis of your sites/pages will yield the clues as to the direction you need to follow. There are services out there that will do this for you of course; but they're not cheap.

I, for one, believe that it's important to start gaining a higher level of technical understanding of the SEO and on-line marketing process - if you want to survive online going forward.

I'm not saying everyone needs to become a technical SEO, but if your business relies heavily on 'free' methods of online lead and sales generation, then it's in your interests to start taking this very seriously, and gain the education and experience necessary to win. The days of simple, cheap, 'quick-and-easy' SEO are pretty much over.

There are many tools and services which can streamline and automate much of what needs to be done, but a proper understanding of how to use and intelligently direct them is becoming paramount. If nothing else, you'll then be able to spot the services that will just waste your time, and those that offer something of value.

Obviously, as explained in-depth above, you certainly need a constant drip-feed of diverse and naturalised anchor-text links; something that our members have enjoyed for many years!

Google is in a constant state of change, and the smart money would bet that they're not going to be taking it any easier on us any time soon. Therefore, it's now absolutely critical to start protecting yourself with smart SEO and scale/diversity-oriented business-thinking.

Good luck with your SEO...

Jay


