Archive for June, 2012
In the past we used to accumulate all the directories Google would present to us (Free, Bid, Paid) and submit blindly, without understanding the effects this might have in the long run. Today, in the post-Panda and Penguin age, it's evident that this activity can be flagged as low-quality SEO, resented by Google.
Did Google Drop 100+ Small Directories? On May 16, 2012, Barry Schwartz reported that 100+ small directories had been removed from Google's index.
SEOMoz dug deeper and conducted research to investigate how widespread this phenomenon is: Web Directory Submission Danger: Analysis of 2,678 Directories Shows 20% Penalized/Banned by Google
Then why are directories still relevant?
1. Because it works, and it's a good long-term tactic when done right, as a selective one-time effort.
2. SEOMoz did not come to the conclusion that you should not submit to directories; on the contrary. All they did was remove the deindexed directories from their list of directories (available to PRO accounts), leaving only high-quality recommended directories, and they even stated that
“Some directories and link resource lists are likely going to be valuable and useful long term”.
How can we use directories as an efficient SEO effort?
We need to hand-pick our directories:
• Check the site’s metrics and traffic (Page Rank, Alexa Rank, Majestic’s Citation/Trust Flow, SEOMoz’s Domain and Page Authority).
• Make sure to submit to a relevant category that is also cached (having PageRank as well would be preferable).
• The best directories are niche directories, meaning directories of many sites relating to the same field. However, a high-quality general directory with a specific relevant category would be better than a low-quality directory in our niche.
• Make sure that they have a strong quality assessment mechanism. If a link directory puts quality at the top of its priorities, it will obviously not publish every link submitted to it, regardless of its contents and the site it points to. The kind of directory that would actually be good for your site is the kind that hand-picks the sites submitted to it and approves only the high-quality sites in each category. To understand how Google might evaluate a good quality assessment mechanism, we can go back in time to February 23, 2011, to the Farmer/Panda Update.
This update dealt with article directories with thin content and poor quality assessment mechanisms, meaning that anyone could have anything published on them absolutely free, which added immense quantities of spam to the web. Once Google realized how ridiculously easy it was to get spam published on these sites, without any filtering mechanisms, many of the biggest article directories sustained serious ranking hits or disappeared from the index altogether.
This is why we need to be prepared, and to make sure we submit to directories with a strong quality mechanism, the kind Google respects.
Since it's not easy to maintain such a serious quality assurance operation, especially on big link directories getting hundreds of submissions a day, some directories take on volunteers. DMOZ, the highest-quality directory on the web, works this way, and getting accepted there can sometimes take months.
Other high-quality directories hire quality personnel and require payment from each submitter to finance this effort; these are called 'Paid Directories'.
Google realizes that this sort of high-quality effort requires funding, so the links in those directories are not considered 'paid links' of the negative kind, like those you would just buy from someone; there is reasoning behind each inclusion. Someone behind the scenes hand-picked those links and published them only if relevant.
Andrew Shotland refers to this type of directory in a post he wrote for Search Engine Land, The Four Keys To Post-Penguin Directory Submission Happiness –
“Many paid directories continue to be good signals of authority, due to the cost of entry that pays for their editorial reviews that help ensure quality. In particular, vertical and niche directories can be high quality, but you should steer clear of those that are covered in ads and don’t have an editorial review process.”
• If you choose to find a list of directories and start submitting, this can be a good tactic; just make sure it comes from a trusted source.
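The hand-picking criteria above can be boiled down to a simple filter. Below is a minimal sketch, assuming hypothetical directory names and metric values you would collect manually from the tools mentioned (PageRank, Domain Authority, and so on); the thresholds are arbitrary illustrations, not recommendations.

```python
# Sketch: filter a hand-collected list of candidate directories by the
# quality criteria discussed above. All names, metric values and
# thresholds below are hypothetical examples.

candidates = [
    {"name": "niche-dir.example", "pagerank": 5, "domain_authority": 62,
     "category_cached": True, "editorial_review": True},
    {"name": "free-for-all.example", "pagerank": 0, "domain_authority": 12,
     "category_cached": False, "editorial_review": False},
]

def worth_submitting(d, min_pr=3, min_da=40):
    """Keep only directories with decent metrics, a cached relevant
    category, and a real quality-assessment (editorial) process."""
    return (d["pagerank"] >= min_pr
            and d["domain_authority"] >= min_da
            and d["category_cached"]
            and d["editorial_review"])

shortlist = [d["name"] for d in candidates if worth_submitting(d)]
print(shortlist)  # only the niche directory with editorial review survives
```

The point of the editorial-review flag is exactly the "strong quality assessment mechanism" argument above: strong metrics alone shouldn't get a directory onto your shortlist.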
Case Study about Efficiency of Directories, Neutralizing Other Link Building Methods
The best case study to show how to use directories right is a brand-new site with no other link building activity on it, which is exactly what we used.
- In order not to waste time and to make this as efficient as possible, we decided to choose the 20 absolute best link directories (following the parameters above) and submit the listing to all of them as a one-time effort (as opposed to an ongoing one).
- Each of the listings included a different and unique description.
- Following the request of 99.9% of the directories, the title had to be the site's name/company name, as opposed to generic keywords. We were only allowed to submit the generic keyword in 2 of the 20 directories, so the rest were submitted under "the brand name + keyword" (which, surprisingly, turned out for the best).
- Without any other links to the site, and just those ~20 directory listings accepted, we managed to reach the top 100 under the generic term we used in combination with the brand, when we weren't even trying (FYI, we are talking about one of the most searched and competitive terms in Google UK).
- Once we did add link building activity under this keyword, we immediately jumped to #35.
To conclude, good, high-quality directory links are not only helpful; they can be a great basis for your site's liftoff and a good source of link diversity.
In order to start with this effort, you have to get into Google's state of mind, think about whether they would appreciate the directory you're assessing, and ask yourself a few important questions:
- Are there other high-quality sites from my niche listed in it?
- Reading the guidelines – do they describe their filtering mechanisms (which sites are welcome and which are out)?
- Did I find this directory on a trusted source?
- Would I trust this directory enough to pay them to register?
- Does it have good traffic and metrics?
Once you ask these relevant questions, this effort could really boost your SEO.
On June 7, 2012 Google released a summary of their 39 activities throughout May. We have divided these into 8 short but important highlight categories:
1. Auto Complete:
Google continues to refine the autocomplete suggestions by:
- Adding predictions below the search box as the user types
- Reducing arbitrary predictions to remove low quality suggestions
- Better handling of input such as caps lock, and better adaptation to the user's language.
- Tweaks to autocomplete in mobile as well
2. News Results:
Google improves the ranking algorithm for news search results, enabling fresher and more accurate results to appear on the first page. (Improvements to ranking for news results)
3. Ranking Signals
Google continue to tweak their ranking algorithm:
- Improvement of their 'inorganic backlinks' detection mechanism, to better identify link schemes (and purchased links, as those go hand in hand) and make it less likely that a page ranks thanks to those links. 'This change ensures we're using those signals appropriately in the rest of our ranking.'
- Penguin tweaks – minor changes made to the Penguin algorithm to improve signals and refresh their data (this is probably a result of the number of feedback forms received, judgment of what constitutes a 'fair complaint', and the restoration of a few of the sites that got hurt).
- Tweaks to their scoring of results, with a minimal impact on the results quality.
- A better display of results from diverse domains, as relevant as possible to the user’s query [project codename “Other Ranking Components”]
4. SERP Titles
- Improvements to their mechanism for handling truncated or irrelevant titles in the SERPs – shortening or changing the titles so they are relevant and sensible. They also remove irrelevant bits of text such as 'go back'.
5. Freshness:
This is an important element Google are constantly trying to improve, as it affects their rankings:
- A better detection of new events
- Improvements to their algorithm designed to set the threshold of what fresh content really is.
- A better identification of queries which would most likely include fresh results
6. Google Plus:
- The algorithm that decides which relevant Google Plus pages to display on the right-hand side of the search results [project codename "Social Search"] has been updated (make your G+ page as relevant and up to date as possible, and users could get it as a suggestion even if they didn't mean to search for your site).
7. Mobile:
- Precise location option on tablet (personalized search results relevant to your more precise location).
- Improvements to local search on tablet.
- Internationalization of “recent” search feature on mobile.
- Improvements to autocomplete on Mobile.
8. Sports Search Features:
- Unified Soccer feature. This change unifies the soccer search feature experience across leagues in Spain, England, Germany and Italy, providing scores and scheduling information right on the search results page.
- Improvements to NBA search feature. This launch makes it so we’ll more often return relevant NBA scores and information right at the top of your search results. Try searching for [nba playoffs] or [heat games].
- New Golf search feature. This change introduces a new search feature for the Professional Golf Association (PGA) and PGA Tour, including information about tour matches and golfers. Try searching for [tiger woods] or [2012 pga schedule].
This edition of Google’s monthly changes truly reflects what really matters to Google, and which topics are of high priority. Upgrades to freshness, autocomplete, mobile compatibility and ranking signals are issues that are tweaked on a monthly basis.
There are, however, a few points we should follow up on:
Ranking factors: Google is constantly improving the way they detect purchased links. If that is the tactic that works for you, make sure you link like a human; this study will provide useful insights on natural linking. Also, despite Matt Cutts' denial of a new Penguin update, the algorithm tweak was important enough to appear in the summary.
Freshness: Update your site frequently to rank quicker and more easily, as freshness is one of the most important ranking signals and is tweaked constantly.
Mobile: Mobile accessibility is becoming more crucial than ever – adapt your site to mobile users to be favored by the mobile bot.
Google Plus: It's very likely that we'll see more and more updates in the Google Plus category. Google are trying to encourage SEOs to use Google Plus, with the underlying promise that this social medium will evolve into a powerful ranking factor. The more users join, the more reliable this factor becomes, and the stronger its effect.
At the latest SMX Advanced in Seattle, Matt Cutts finally sat down with Danny Sullivan, Search Engine Land's Editor-in-Chief, for a post-Penguin interview, entitled You & A with Matt Cutts, where they discussed many of the issues close to our hearts as SEOs: penalties vs. algorithm updates, links vs. social signals, whether Google Plus is powerful as a signal, paid links, allowing admins to disavow links, whether they act on all spam reports, and much more.
The interview has been broken down below by topic: Matt Cutts' view on each topic (as quoted from Search Engine Land), followed by what we can make of it.
Topic: Penguin – Penalty vs. Algorithmic Change and Settling New Penguin Rumors
MC: “It’s an algorithmic change, but when we use a word like “penalty,” we’re talking about a manual action taken by the web spam team — it wasn’t that.
We don’t think of it as a penalty. We think of it as, “We have over 200 signals, and this is one of the signals.”
"We don't use the word "penalty" much; we refer to things as a "manual action." In the past year, we've been more transparent about times when we take manual action. We send out alerts via Google Webmaster Tools."
As far as Google are concerned, Penguin was an algorithmic change, one of the 200+ signals they use for ranking. Since it was in fact an automatic filter, admins were given the opportunity to provide their feedback on how the change affected their site (similar to a reconsideration request, but not the same, as reconsideration comes after manual action has been taken), and once all data from the feedback forms had been processed, Google released a "minor tweak" or "data refresh". While admins refer to it as Penguin 1.1, Matt Cutts does not see it this way. Purely semantics.
More on semantics: what SEOs view as a 'penalty', Google view as a 'manual action'. Only if you were penalized will you receive a Google Webmaster Tools alert; you will not be notified of an algorithmic change such as Penguin, or any other for that matter.
Topic: ‘Negative SEO’
MC: “Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.
Some have suggested that Google could disavow links. Even though we put in a lot of protection against negative SEO, there’s been so much talk about that that we’re talking about being able to enable that, maybe in a month or two or three.”
Matt Cutts refers to Google's guideline changes regarding 'Negative SEO', changes that many SEOs have already spotted.
While they were confident in their ability to filter out negative SEO, quite a few recent cases and reconsideration requests showed them that they are not bulletproof when it comes to blocking that sort of attack (hence the guideline changes).
They are working on a new tool to enable admins to ‘deny’ links on their profile, which should be available in the coming months.
Topic: The Value of Sitewide Links
MC: “We’ve done a good job of ignoring boilerplate, site wide links. In the last few months, we’ve been trying to make the point that not only is link buying like that not doing any good, we’re turning the dial up to let people know that certain link spam techniques are a waste of money.”
Google officially ignores sitewide links; don't get those, it's a waste of your money. The best links are editorial links from a reputable site that has chosen to include your site manually.
Topic: Interacting with Admins for Quality Guidelines Violations
MC: “If you roll out a new algorithm, it can affect millions of sites. It’s not practical to notify website owners when you have 500 algo changes every year, but we can notify when there’s been manual action against a specific site.
One thing I’d like to clear up — the news earlier this year about 700,000 warnings. The vast majority of those were because we started sending out messages even for cases of very obvious black hat techniques. So now we’re completely transparent with the warnings we send. Typically your website ranking will drop if you don’t take action after you get one of those warnings.”
Google say that they are moving towards a more "transparent" perspective, where they notify admins whenever manual action has been taken, as in the case of the 700K "unnatural links" notifications. Your job once this notice has been received is to remove the bad links from your profile (e.g. too many exact anchor matches, or links purchased from networks), or your rankings will drop.
However, this does not provide an answer for admins who have witnessed a significant drop in rankings with no update or alert in sight. You can only guess that you have bad links pointing at you, or that Google have received complaints about your site.
Topic: Paid Links
MC: “We’re always working on improving our tools. Some of the tools that we built, for example, to spot blog networks, can also be used to spot link buying. People sometimes think they can buy links without a footprint, but you don’t know about the person on the other side. People need to realize that, as we build up new tools, paid links becomes a higher risk endeavor. We’ve said it for years, but we’re starting to enforce it more.
I believe, if you ask any SEO, is SEO harder now than 5-6 years ago, I think they’d say it’s a little more challenging. You can expect that to increase. Google is getting more serious about buying and selling links. Penguin showed that some stuff that may work short term won’t work in the long term.
DS: What’s the deal on paid inclusion? Is it coming to web search?
MC: You call it paid inclusion, but it’s a separately labeled box and it’s not in web ranking. Google’s take on paid inclusion is when you take money and don’t disclose it.”
Google are now very clear on what a purchased link looks like, i.e., not like how a natural person would link to a site, and they are taking measures to spot these, making link buying a very dangerous activity. Later in the interview, Matt Cutts says that Google are still using links for ranking purposes. The Internet world is not as naïve as Matt Cutts perceives it; if someone has a spectacular site, he'd want to make money off it. If you want to rank your site, make sure to think like a human and get your anchors to make sense; people don't use exact match domains naturally.
When it comes to paid inclusion, it's a double-edged sword: you have to clearly admit to paid inclusion, but when you do, you could get penalized. It's a sensitive issue.
Topic: Nofollowing Affiliate links
“If it’s large enough affiliate network, we know about them and handle it appropriately on our side, but if you’re at all worried, use nofollow”
(Corrected quote from Bruce Clay)
While Matt Cutts recommends nofollowing affiliate links, this is practically impossible; you can't always control what your affiliates do (rarely ever).
Topic: The Value of Links vs. Social Signals
DS: Do links still work, or are social signals gonna replace them?
MC: "The actual percentage of nofollow links on the web is a single digit percentage, and it's a pretty small percentage. To say that links are a dead signal is wrong. I wouldn't write the epitaph for links just yet."
Though social signals are growing in importance, Google still can't afford to devalue all links; they depend on them for rankings. With Facebook blocking a lot of its information from them, this situation is going to remain true for a while.
Topic: Bounce Rate Data From Google Analytics
DS: You don’t look at how quickly someone bounces from a search result and back to Google?
MC: “Webspam doesn’t use Google Analytics. I asked again before this conference and was told, No, Google does not use analytics in its rankings.”
That's true, Google does not use Analytics bounce rate stats to help them rank sites. They do use their internal bounce rate, measured when someone bounces from a certain result back to the search results (so, as you can see, he's not answering Danny's question accurately).
Topic: WordPress Theme Links
DS: “How come WordPress didn’t get penalized with all the blogs that have WordPress links in their footer?
MC: If you look at the volume of those links, most of them are from quality sites. WPMU had a pretty good number of links from lower quality sites.”
WordPress theme links can get you penalized if there are too many of them from totally unrelated, low-quality sites. If that equation is balanced, you will not get penalized (or you can submit a feedback form, documenting your genuine removal efforts, to bounce back).
Topic: Positive Vs. Negative Ranking Factors
DS: “So we have positive ranking factors and negative ranking factors?
There are indeed positive and negative ranking factors – you can rise and fall by your own actions; it's not only relative to the other results.
Topic: Handling of Rich Snippets Abuse
MC: “Maybe removing ability for a domain to have rich snippets if there’s abuse”.
People have been abusing rich snippets lately; Google have picked up on this, and they either change the snippets or deny the site's usage of rich snippets altogether.
Topic: Removing links following an Alert via Webmaster Tools
MC: “We want to see an earnest effort to remove the links. When you do a reconsideration request, we’ll look at some of the links and see “how much progress have they made?” We’ve talked about the idea of adding a disavow-this-link tool.
DS: What if you can’t get rid of bad links pointing to a page, should we get rid of the page?
MC: If it’s not an important page, you could. Or you could at least document the effort to remove the links and share it with us.”
Following the "unnatural links" notification, they do want to see a genuine removal attempt; if that doesn't work, start over with a fresh new domain, redirecting the old one.
Topic: Do they act on all Spam Reports?
MC: “We’ve shut down tens of thousands, maybe hundreds of thousands of domains involved in link buying. When you get a spam report, you want to take action, but it may not be as high impact as doing something about one of our own leads. We use a factor of four — we measure the potential impact by four and if it still shows up near the bottom of the list, we may not take action on it.”
When it comes to spam reports, they prioritize by guideline violation; if the site has violated a relatively minor one, they won't necessarily act.
Topic: Google Plus as a Ranking Factor for SEO
MC: “When we look at +1, we’ve found it’s not necessarily the best quality signal right now.
DS: You have to be on Google+ if you want to rank well in Google.
MC: No!!!! It’s still early days on how valuable the Google+ data will be.”
The only reason +1 is not necessarily the best quality signal is that a very small number of users actually use it; that's the real reason it's too early to know. There are accounts of admins saying that +1s have helped them rank better. Once there are enough people on Google Plus, they will have substantial data they could really use. They are probably already using it right now, but to a small extent; nothing too definitive.
Topic: If Panda and Penguin Hit You – Should You Start Over?
“Sometimes you should. It’s possible to recover, but if you’re a fly-by-night spammer, it might be better to start over.”
Google are probably not aware that a 301 redirect enables a penalized site to regain its old rankings, as David Naylor pointed out on his blog.
So have we learned anything new from Matt Cutts?
- The more experienced SEOs have already learned to perceive any statement from Matt Cutts as "the tip of the iceberg": a catalyst to start digging below the surface in order to get to the underlying truth. This interview was no exception.
- Listen carefully and understand the different semantics (SEOs vs. Matt Cutts): many SEOs refer to the latest Penguin tweak as 'Penguin 1.1', while Matt Cutts denies an additional Penguin update; there were just minor tweaks to the algorithm. If you read between the lines, there are a few more interesting examples where you have to 'understand' both sides in order to translate his statements into usable SEO data.
We did, however, learn a few interesting facts:
- Penguin is an automatic algorithmic filter, not a penalty.
- Penalty means manual action has been taken.
- There is a good probability that your competitors could hurt you with negative SEO – Google’s filters are strong but not bulletproof.
- Google are working on a mechanism for Google Webmaster Tools, giving admins the opportunity to deny links.
- Sitewide links have little to no value.
- Google have much better detection and enforcement mechanisms to deal with paid links.
- Google do not use Bounce Rate from Analytics as a ranking factor.
- Affiliate links should be nofollowed.
- Links still count for ranking purposes.
- Social Media metrics are (still) not strong enough to make a real difference.
- Google do not act on all spam reports.
- Did both Panda and Penguin eliminate your site? Start fresh with a new domain.