Wednesday, February 25, 2015

Google Panda Turns Four Years Old Today

Can you believe it has been four years since Google officially released its Panda algorithm update? Before that, Google was getting a tremendous amount of flak over the quality of its search results; then Panda touched down.

Panda 1.0 was released on February 24, 2011, four years ago today, and it may have been the most significant quality algorithm Google has released to date, even more so than Penguin.

Since then, we've covered the Panda update hundreds of times. We tracked over 20 confirmed updates and likely 50+ unconfirmed updates to the algorithm.

One thing is for sure, Panda is part of the SEO handbook. It fundamentally changed how many SEOs worked on sites. It fundamentally changed which type of web sites ranked well in Google. It fundamentally changed hundreds, if not thousands, of businesses and how they operate online. It changed the search landscape tremendously.

Now, it has been around four months since a Panda refresh, which concerns many. The last official update was Panda 4.1 in September and the last unconfirmed update was in October.

To see all our coverage on Panda, go here.

I know Penguin was substantial for many SEOs because of the link aspect, but Panda did impact more sites overall. It also tremendously cleaned up the quality of Google's results almost overnight.

Forum discussion at Twitter.

Tuesday, February 24, 2015

Google Crawling Your Robots.txt File?

A fairly common question I see in the support forums is why Google is crawling the robots.txt file.

Today I spotted a thread from a webmaster in the Google Webmaster Help forums where Google was crawling the mobile version of the robots.txt file, which does not exist.

In short, it was trying to crawl /robots.txt?m=1. In this case, it is not really something this webmaster can control, since he is on Google's Blogger system.
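To see why this surfaces as a separate crawl error, here is a minimal Python sketch (the blogspot hostname is illustrative): the ?m=1 variant has the same path as the real robots.txt, but the query string makes it a distinct URL as far as crawl reports are concerned.

```python
from urllib.parse import urlsplit

# Blogger appends ?m=1 when redirecting smartphone requests,
# so the crawler ends up fetching a second, distinct URL
# that maps to the same path.
plain = urlsplit("http://example.blogspot.com/robots.txt")
mobile = urlsplit("http://example.blogspot.com/robots.txt?m=1")

print(plain.path == mobile.path)  # True -- both are /robots.txt
print(mobile.query)               # m=1 -- what makes it a "new" URL
```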

Here is a screen shot of the error:


Google's John Mueller told the webmaster he would relay the issue to the Blogger team but he doesn't need to worry. John wrote:

This can happen when you explicitly crawl the robots.txt with a smartphone Googlebot, such as you can do through Fetch as Google in Webmaster Tools, or such as what might happen if we try to crawl the robots.txt file for indexing (for example, if there's a link to it). We don't fetch the robots.txt file in that way for normal robots.txt processing, so that's not something you'd need to worry about. I agree it's a bit confusing (when trying to diagnose this kind of issue) when Blogger does that redirect for smartphone requests, so I'll pass that on to the team as feedback. Thanks!

Yes, these errors can get messy in Google Webmaster Tools.

Forum discussion at Google Webmaster Help.

Google: When Will Be The Next PageRank Update? Probably Never

Google has said time and time again that you should not pay attention to the Google Toolbar PageRank score, yet they have yet to zero out any of the data; instead, they have kept it as it was since the last accidental update back on December 6, 2013.

The last Google Toolbar PageRank update was over 14 months ago, and before that, the last intentional update was on February 4, 2013, just over two years ago.

Google has told us they won't be updating the metric and they probably won't update it in the future.

John Mueller from Google was asked this again at the 53:16 mark in a video hangout from about two weeks ago. The question was "When will be the next google PageRank update?"

John replied:

Probably Never 

This is something I think we've stopped updating, at least the toolbar PageRank that is shown. I don't know the future of the Toolbar in general, but at least from the PageRank side, this is probably something we're not going to update.

John then goes on to explain the accidental PageRank update. Then the webmasters ask him what metric to look at instead, and he doesn't give a clear answer.

Here is the video, so you can see how confident he was this time when answering:


Forum discussion at Google+.

Google News Publishers Should Use Standout Tags For Third-Party Sources Often

Back in September 2011, Google News introduced the standout tag, a way for publishers to say these specific stories are really exceptional and Google News readers should really see it.

But did you know that it isn't reserved only for your own stories, and that Google encourages you to use it to point to third-party stories that are better than yours?

The Google News help article explains this and explains the standout tag as:

If your news organization breaks a big story, or publishes an extraordinary work of journalism, you can indicate this by using the standout tag. When determining whether to use this tag for your own article, consider whether that article meets the following criteria:
  • Your article is an original source for the story.
  • Your organization invested significant resources in reporting or producing the article.
  • The article deserves special recognition.
  • You haven't used standout on your own articles more than seven times in the past calendar week.
In addition, we strongly recommend citing standout articles from other publishers when your own article draws from that standout piece of journalism. When determining whether to use this tag to cite the work of others, consider the following criteria:
  • The publisher's article was the original source for the story you are now reporting.
  • The original source invested significant resources in reporting or producing the article.
  • You know that the original article deserves special recognition.
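Mechanically, one common form of the tag is a link element in the article's head; here is a rough sketch with a placeholder URL (check Google News' current documentation for the exact markup it supports):

```html
<!-- In the <head> of your article page. Point href at your own
     article's URL for an original story, or at the third-party
     article you are crediting as the original source. -->
<link rel="standout" href="http://news.example.com/breaking_story" />
```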

Also, in the hangout from the other week with Stacie Chan of the Google News team, Stacie answered the question:

Is it necessary to occasionally use 3rd party standouts in order for your own standouts to carry any weight?

The answer:

Oh, great question. Yes. 

The standout tag is built on the ecosystem of publishers using the standout tag. 
In a sense, you can build up credibility by using the standout tag and referring to XYZ.com rather than always referring to your site, ABC.com. 
The only thing we ask is that you use the standout tag in your own site only seven times per week. 
But really, you can link out to as many other third-party sites with the standout tag as frequently as you’d like. In fact, we encourage it.

So it seems that the more you use it well on third-party sites, the more Google may trust you when you use it on your own site?

Here is the video that starts at 49:38:


Forum discussion at Google+.

Friday, February 20, 2015

Google Tests A Search Algorithm Update Yesterday Morning & Reverts Back?

It seems Google may have tested a search algorithm update late yesterday morning and then may have pulled it back.

The ongoing WebmasterWorld thread has chatter from the SEO community around the update, but the chatter soon died down after things settled back. Plus, most of the automated Google SERP tracking tools show very little fluctuation, if any.

Here are some of the posts late yesterday morning at WebmasterWorld:

comparative to yesterday my traffic is up with about +25%


Something major appears to have rolled out just after yesterday's short, hour long surge of excellent converting traffic. Today it's back to 1 or zero on the site.


I'm seeing plenty of ranking changes for my site. I regained dozens of previously dropped key phrases. A few downward moves, but only -1 or -2, but at the same time many +3 and +4 deltas. BUT......


Woke up to almost double traffic, im almost at my daily visits of 3k unique s already and its 10 am here in the uk, something is happening here, anyone else?

Clearly, timing based on these posts in the forum is hard to nail down. The webmasters may be looking at different metrics that are delayed or not real-time.

The SERPs.com, SERP Metrics and Advanced Web Ranking tools all show a slight uptick in changes in the Google search results.

Maybe something big is coming and this is just an early sign? We are due a Panda update. :)

Forum discussion at WebmasterWorld.

Tuesday, February 17, 2015

Google: The Submit To Index Feature Is For Critical Cases


In Google Webmaster Tools, there is an option to fetch and render pages as GoogleBot would crawl them. After a page has been fetched, Google will sometimes give you the option of submitting it to the index.

John Mueller said in a Google Webmaster Help thread that while the fetch and render quota is 500 per week, the submit to index quota is 500 per month.

He also said the submit to index feature is meant to be used on a very limited basis, only in "critical" cases where you need the changes to be "reflected in search faster than usual."

It seems, at least in the thread, that some want to use it all the time for new pages added daily, and they do not have enough quota.

Forum discussion at Google Webmaster Help.

Google Has Issues Crawling Recursive Redirects

Ever see a web page that just keeps redirecting back to itself when you visit it? Imagine if visiting Google.com reloaded Google.com over and over again. If a spider/crawler such as GoogleBot is sent into such a redirect, it never gets the chance to crawl the content on that page.

That is the issue one webmaster had in a Google Webmaster Help thread. He recently had his .com home page redirect back to the same exact URL, the .com home page. He has since changed it to go to the .net, but you can still see the issue when you try to use Google Translate on the page: it spits back an error saying "The page you requested attempted to redirect to itself, which could cause an infinite loop."


John Mueller said in the thread that once he removes the recursive redirect, Google should be able to crawl and index the content. John said:

The main problem we're seeing is that your homepage is redirecting to itself -- so we can't actually crawl it at all. Once you remove that recursive redirect, we'll be able to focus more on the content of your pages. 

Of course, this is where Fetch and Render comes in handy, to see if this is just happening to spiders.
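The general failure mode is easy to sketch. Below is a toy Python example of the kind of loop detection a crawler might perform; the redirect map stands in for real HTTP 301/302 responses and is not GoogleBot's actual logic:

```python
def follow_redirects(url, redirects, max_hops=10):
    """Follow a chain of redirects, bailing out if a URL repeats."""
    seen = set()
    while url in redirects:
        if url in seen or len(seen) >= max_hops:
            raise RuntimeError("redirect loop detected at " + url)
        seen.add(url)
        url = redirects[url]
    return url

# A healthy redirect chain resolves to a final URL...
print(follow_redirects("http://example.com/old",
                       {"http://example.com/old": "http://example.com/new"}))

# ...but a page that redirects to itself, like the home page in the
# thread above, never yields any content to crawl.
try:
    follow_redirects("http://example.com/",
                     {"http://example.com/": "http://example.com/"})
except RuntimeError as e:
    print(e)
```

Real crawlers and HTTP clients enforce a similar hop limit rather than following redirects forever, which is why the webmaster saw errors instead of an endless fetch.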

Forum discussion at Google Webmaster Help.

Google's John Mueller: I'd Avoid Link Building In General

In a Google+ live hangout with Google's John Mueller last Friday morning, John answered the question, "is link building in any way good?"

Yes, a point-blank question on whether you should do link building or not.

The question was asked at the 55:40 mark and is embedded directly below at the start time.

John said, "in general, I'd try to avoid that," that being any link building. John added that focusing on link building is probably going to lead to more problems for your site than actual help.

Here is the transcript to the question, "is link building in any way good?"

That is a good question. 

In general, I’d try to avoid that. 
So that you are really sure that your content kind of stands on its own and make it possible for other people of course to link to your content. Make it easy, maybe, put a little widget on your page, if you like this, this is how you can link to it. Make sure that the URLs on your web site are easy to copy and paste. All of those things make it a little bit easier.
We do use links as part of our algorithm but we use lots and lots of other factors as well. So only focusing on links is probably going to cause more problems for your web site than actually help.

Here is the video:


Will you stop with the link building after hearing this?

Forum discussion at Google+.

Friday, February 13, 2015

See The New Search Impact Reports In Google Webmaster Tools Over Here

As we covered, Google promised to test a new alpha version of the search queries report and that has now been released to the first set of alpha/beta testers this morning.

Google said this new report will be changing drastically over the testing period but I wanted to show you the reporting features in the report they named "Search Impact."

The Search Impact report shows you a breakdown of your clicks and position metrics by one of six dimensions: date, popular queries, top pages, leading countries, user device and Google Search property. In addition, you can filter and compare across these dimensions. The Google Search property dimension did not work for me, but heck, it is in alpha.

The report offers these views; keep in mind this is early testing, so the data and reports can be inaccurate:

  • By Date
  • Date Comparison
  • By Queries
  • By Pages
  • By Countries
  • Country Filter Option
  • By Device
  • By Web Property

Note, you can drill into each line by clicking on it to filter the report specifically by that query, page, country, device and so on.

This is still an early alpha and those who see this are expected to contribute feedback to Google based on having early access.

Forum discussion continued at Google+.

Tuesday, February 10, 2015

Google's E-Commerce Update: Was Last Week's Search Tweak E-Commerce Focused?

There is no doubt in the minds of the vast majority of SEOs and webmasters that there was indeed a search algorithm change at Google last week. Google wouldn't confirm it but did tell us it was not related to Panda or Penguin.

But since it was fairly significant, at least to a fair number of webmasters, we are still asking: what was this update?

Throughout the weekend and over the past few days, I've been hearing people say this mostly impacted e-commerce-related web sites. The results are still fluctuating, so it might be Google testing something, but our friends at Search Metrics broke down their data for us.

In a post by Marcus Tober, he says this update seems to have focused mostly on brand e-commerce terms, mostly around misspellings of brand keywords. Marcus said it is "mostly concentrated on e-commerce and keywords with measurable CPC."

He pinpointed a few keywords that were shuffled big time, such as [adiddas], [ebbay] and [zapos], misspellings of brand names, but also real brand names without the typos, such as [nike]. Marcus said:

It appears quite possible that Google tries to clean-up brand searches and therefore tries to adjust also typos. Since the development seems to not be finished, we will keep an eye on that.


Adding:

Brands seem to profit from the development, while others have lost a good chunk of rankings, e.g. sites in the fashion industry. Google seems to be optimizing brand searches. An interesting side effect: for keywords with typos the SERPs seemed to be strongly edited – and adjusted to the “correct” SERP.


Marcus also mentions mobile but says there isn't much evidence there, yet, to show anything.

The search results seem to be settling down a bit this weekend but who knows what today brings.

Forum discussion at WebmasterWorld and Twitter.

Tuesday, February 3, 2015

Google's John Mueller: My Friend's Web Site Has No Links & Ranks In Google Well

In a Google+ webmaster hangout from a couple weeks ago, Google's John Mueller was at the GooglePlex and had a hangout with two of his colleagues.

Josh asked, at the 26:03 mark, why Google uses links for ranking to the degree it does. He said Google should use other means to rank sites. Of course, he knows they do use other factors, but his point was: why place so much weight on links?

John Mueller answered, saying they do have tons of other factors beyond just links.

John then said a friend of his "back home" in Zurich just put up a new web site for the local neighborhood. The new site does not have any links pointing to it at all, yet over 300 pages are indexed, pages are ranking, and the site is getting a lot of traffic from search.

No one has ever linked to the site, he said, but they did submit sitemaps to Google and they have an RSS feed.

The web site does fairly well without any links at all, John said. So it is not the case that you absolutely need links to rank, he added. John did say that links are "obviously" part of Google's ranking factors, but they aren't something Google relies on exclusively.

We know that a year ago, Matt Cutts said Google tried turning off links in their ranking algorithm and the results were horrible. We also know that Yandex, the huge Russian search engine, removed links from its ranking algorithm for a niche segment when that segment was spammed to death.

Here is the video embed at the start time:


Forum discussion at Google+.
