Thursday, March 28, 2013

12% Say Social Has No Influence On Google Rankings


A few months ago, we asked whether social signals influence Google rankings.

We received almost 500 responses from the SEO community, and since I am offline today, I figured it would be a good day to share the results.

In short, only 12% said social signals have absolutely no influence on Google rankings. 88% said there was some influence.

  • 46% said there was social influence on Google rankings.
  • 42% said there was only a little social influence on Google rankings.
  • 12% said there was no social influence on Google rankings.

Who is right? Probably the 12%. But it depends on how you interpret the question. Core rankings? Without personalization? Setting aside the fact that social activity can lead to links?

Anyway, there you have it.

Forum discussion continued at WebmasterWorld.

60% Of SEOs Not Worried About Using Mass No Index


A few months ago, I polled you guys asking how concerned you are about too many noindex tags hurting your search rankings.



Well, the poll results are in with over 200 responses, and I thought you'd like to see that most of you are not concerned with using too many noindex meta tags.

  • 60% said they are not worried
  • 23% said they are worried
  • 16% said it depends; they may be worried

If the noindex tags are in place on purpose, I wouldn't worry.
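For reference, here is what a page-level noindex looks like; these are standard robots directives, shown purely for illustration:

```html
<!-- In the page's <head>: keep this page out of the search index
     (it can still be crawled, but it won't be listed) -->
<meta name="robots" content="noindex">

<!-- Equivalent HTTP response header, useful for non-HTML files:
     X-Robots-Tag: noindex -->
```

Applying this deliberately across many pages is exactly what the directive is designed for, which fits the poll's takeaway.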

Forum discussion continued at WebmasterWorld.

Google Publishers Question AdSense Case Study


Google published a case study on their AdSense blog showing how a web site named MakeupAlley.com was able to increase clicks on affiliate links by over 20% and realize a 50% lift in AdSense revenue.
The results were:

  • 171% increase in on-site interactions
  • 21% more clicks on affiliate links
  • 50% lift in AdSense revenue
  • Deeper understanding of using Analytics to assess user experience

Looking at the before and after design and layout with Web Archive doesn't show much of a difference. martinibuster at WebmasterWorld highlights the main changes:

A side by side review shows that the new MUA site only has two navigation choices in the nav bar, compared with eight choices last year. Site visitors are now funneled to the Product Reviews section (an obvious money page section) and the forums. Fonts and layout look identical.
Gone is the link to the Links section and the link to Favorites is also removed. Somewhat redundant so no surprise. Also gone are links to MY MUA, Swap, Mail, and Diary. What they did to the navbar was create a tighter focus on driving site visitors to the money pages and to their community (content creation & site stickiness).
Clicking through to the Product Reviews section and doing a side by side review reveals a web page layout that is essentially the same. The site is supposed to be highly successful.

You can see that he is questioning how a site like that can really see such positive results. Maybe something else is going on?

He asks, "What do you think? Big change? Meh? Any lessons here?"

The thread is pretty large, so chew it over and maybe we will learn something.

Forum discussion at WebmasterWorld.

54% Say Google Indexes New Sites Within A Week


A few months ago we asked how long it typically takes Google to index a new site.

Google is all about speed; from its crawlers to its index, it wants new, relevant and unique content indexed immediately. So how long does it really take Google to index a new site?

Well, since Google won't tell us, I asked SEOs and webmasters what they thought. Here are the results...
New Sites Get Indexed Within:

  • 54% said within a week
  • 30% said within a day
  • 12% said within a month
  • 3% said within 3 months
  • 1% said longer than 3 months

Of course, a lot of it is dependent on the type of site, content and quality. But this is good to see from the eyes of the SEO and webmaster community.
We had almost 500 responses on this poll as well.


Forum discussion continued at WebmasterWorld.

This story was scheduled to be posted today and was not written today.

Google Discontinues Blocked Sites After Months Of It Not Working


In March 2011, two years ago, Google introduced a blocked sites feature that allowed searchers to block unwanted sites via Google's search interface. In December 2012, the feature stopped working, and today Google posted a message that blocked sites is discontinued.

Google said back then, "we're adding this feature because we believe giving you control over the results you find will provide an even more personalized and enjoyable experience on Google." Today, I guess, Google no longer believes in giving you control over the results you find?
Or maybe searchers were not really using it? Google doesn't say.

Google's Matt Cutts did respond to this in a Hacker News thread saying:

Note that you can continue to use the Chrome extension to block sites, which I believe we rolled out before this feature. Get it at chrome.google.com/webstore/detail/personal-blocklist-by-goo/nolijncfnkgaikbjbdaogikpmpbdcdef.

And here's the blog post we did about the Chrome extension.


So if you really want to block results, you can.

Forum discussion at Hacker News.

Image credit to BigStockPhoto for blocks

Wednesday, March 27, 2013

A new opt-out tool


Webmasters have several ways to keep their sites' content out of Google's search results. Today, as promised, we're providing a way for websites to opt out of having their content that Google has crawled appear on Google Shopping, Advisor, Flights, Hotels, and Google+ Local search.

Webmasters can now choose this option through our Webmaster Tools, and crawled content currently being displayed on Shopping, Advisor, Flights, Hotels, or Google+ Local search pages will be removed within 30 days.

Saturday, March 23, 2013

More Verification Controls With Google Webmaster Tools

Google announced they have improved a few things with Google Webmaster Tools verification.

(1) They now show more details on who is verified for your site and how.

(2) They no longer let you unverify an owner unless that owner's verification method is removed from the site.

(3) They shortened the CNAME string to support more DNS providers.

Here is an example of number one:

[screenshot: verification details view]

Here is an example of an error message with number two:

[screenshot: verification error message]

These are very welcome improvements to Google Webmaster Tools.
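For context, the CNAME method works by adding a record to your domain's DNS, and the shortened string matters because some providers cap the length of record names. A hypothetical zone-file entry (the token and exact target shown here are placeholders, not real Google-issued values):

```
; Placeholder values -- Webmaster Tools generates the real token per site
TOKEN.example.com.   3600   IN   CNAME   TARGET.dv.googlehosted.com.
```

The shorter the token, the more DNS providers can accept the record without truncation.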

Forum discussion at Google+.

Thursday, March 21, 2013

Easier Management of Website Verifications

To help webmasters manage the verified owners for their websites in Webmaster Tools, we’ve recently introduced three new features:
  • Verification details view: You can now see the methods used to verify an owner for your site. In the Manage owners page for your site, you can now find the new Verification details link. This screenshot shows the verification details of a user who is verified using both an HTML file uploaded to the site and a meta tag:


    Where appropriate, the Verification details will have links to the correct URL on your site where the verification can be found to help you find it faster.
  • Requiring the verification method be removed from the site before unverifying an owner: You now need to remove the verification method from your site before unverifying an owner from Webmaster Tools. Webmaster Tools now checks the method that the owner used to verify ownership of the site, and will show an error message if the verification is still found. For example, this is the error message shown when an unverification was attempted while the DNS CNAME verification method was still found on the DNS records of the domain:



  • Shorter CNAME verification string: We’ve slightly modified the CNAME verification string to make it shorter, to support a larger number of DNS providers. Some systems limit the number of characters that can be used in DNS records, which meant that some users were not able to use the CNAME verification method. We’ve now made the CNAME verification string use fewer characters. Existing CNAME verifications will continue to be valid.
We hope these changes make it easier for you to use Webmaster Tools. As always, please post in our Verification forum if you have any questions or feedback.

Google: Sorry We Accidentally Penalized Your Site

Yesterday, Digg was completely delisted from Google's search index. Some thought it was an issue with their robots.txt file, some thought it was a Google bug and some had no idea what to think.

What was it? Google accidentally penalized the whole site. Yes, it was an accident. 

Matt Cutts, Google's head of search spam, said on Hacker News, "we were tackling a spammer and inadvertently took action on the root page of digg.com."

Google released an official statement as well:
We're sorry about the inconvenience this morning to people trying to search for Digg. In the process of removing a spammy submitted link on Digg.com, we inadvertently applied the webspam action to the whole site. We're correcting this, and the fix should be deployed shortly.
Shortly after, someone responded to Matt on Hacker News asking, "if this would happen to a less popular site, what chances does a site-owner have of getting attention to this problem, and getting it fixed?"

Forum discussion at Hacker News.

TheShortCutts.com - The Quick Answer To Google SEO Videos

One of those "why didn't I think of that?" ideas: introducing theshortcutts.com.

The Short Cutts catalogues the 500+ videos Matt Cutts, Google's head of search spam, has published and boils each down to a quick answer. For example, "Does Google still need text to understand my site?" gets the one word answer "Yes."

Here is that video:



Forum discussion at Google+.

Google Translates Mobile SEO Recommendations In 11 Languages

About a year ago, at SMX Advanced, Google finally documented its best practices and official recommendations for mobile-friendly SEO design.

Now, Google is translating those recommendations into 11 additional languages. Google said they are doing this because there are "more and more users worldwide with mobile devices access[ing] the Internet."

Here are the links to the translated versions:
Forum discussion at Google+.

YouTube Search Trends On Google Trends

YouTube announced that YouTube searches are now supported within Google Trends.

Go to Google Trends, plot some searches and then on the left hand side click on the "limit to" option and select "YouTube."

You can then see the search trends on YouTube search.

For example, here is the spike in interest on YouTube for [harlem shake]:

[screenshot: harlem shake trend chart]

And while cat videos are more popular than dog videos on YouTube, recently goat videos have been the talk of the town:

[screenshot: cat, dog and goat video trend chart]

Google Trends can be a good source of keyword ideas for SEOs and publishers.

Forum discussion at Google+.

Wednesday, March 20, 2013

Bing Webmaster Tools One-Ups Google With Site Move Tool

Bing announced they added a feature named the Site Move Tool to Bing Webmaster Tools.
 
Yes, Google does have a "Change of Address" tool, but Bing's new tool does a bit more. Bing was clear to mention that:
Whereas other Webmaster Tools only allow a "Change of Address" from one domain to another, we wanted to allow web publishers more flexibility.
It offers two things, one of which Google does not offer.

(1) Moving a ton of URLs from within the same domain. For example, you upgrade your CMS and all your URLs change but you are still on the same domain name. This tool allows you to communicate the mass URL change.

(2) A typical domain name to domain name change. For example, you finally acquire the domain name of your dreams and you migrate your site from the old domain name to the new one.

Just like the Google change of address, Bing still wants you to have your 301 redirects in place.
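Both move types still rest on plain 301 redirects. A minimal sketch in Apache mod_rewrite terms (the domains and paths are made-up examples, not from Bing's announcement):

```apache
RewriteEngine On

# (1) Same-domain URL change, e.g. after a CMS upgrade
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]

# (2) Full domain-to-domain move
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://new-domain.com/$1 [R=301,L]
```

The Site Move Tool tells Bing about the move; the redirects make sure visitors and crawlers actually land on the new URLs.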

Bing also has two caveats:

(1) you must set up the redirects to a new domain that resolves, and 

(2) once you do the move, you cannot issue another move request for six months.

Forum discussion at Google+.

Tuesday, March 19, 2013

Google DMCA Algorithm Worries Some Webmasters

Several months ago, Google launched a new search algorithm aimed at downgrading the rankings of sites that have received many legitimate DMCA requests.

Since then, we've seen the number of DMCA filings spike. Was it related to the algorithm update, or to more savvy publishers trying to protect their content? I don't know.

What I do know is that even several months later, there are some webmasters still nervous about this update.

One WebmasterWorld thread has a webmaster asking if he should worry about receiving "8 DMCA complaints in 10 days time." The quick answer is no. You'd probably need a heck of a lot more than 8 DMCA requests for Google's algorithm to kick in.

But any user generated content site should have policies and checks in place to ensure the content used on the site is unique, valuable, and not stolen. If you do not, you can be hit not just by the DMCA algorithm but by algorithms like Panda.

Forum discussion at WebmasterWorld.

Google Stops Indexing Craigslist; Matt Cutts Fixes

A HackerNews thread highlights a blog post by Tempest Nathan where he said Google stopped indexing Craigslist.

It was true, Google did stop indexing Craigslist. But why?

Did Craigslist spam Google? Did they violate Google's webmaster guidelines? Did they add the noindex directive to their pages? Nope. None of this.

It was a technical quirk. 

Matt Cutts, Google's head of search spam, explained in the HackerNews thread that they are fixing the issue on Google's end, and this is what technically happened:
To understand what happened, you need to know about the “Expires” HTTP header and Google’s “unavailable_after” extension to the Robots Exclusion Protocol. As you can see at http://googleblog.blogspot.com/2007/07/robots-exclusion-protocol-now-with-even.html , Google’s “unavailable_after” lets a website say “after date X, remove this page from Google’s main web search results.” In contrast, the “Expires” HTTP header relates to caching, and gives the date when a page is considered stale.
A few years ago, users were complaining that Google was returning pages from Craigslist that were defunct or where the offer had expired a long time ago. And at the time, Craigslist was using the “Expires” HTTP header as if it were “unavailable_after”–that is, the Expires header was describing when the listing on Craigslist was obsolete and shouldn’t be shown to users. We ended up writing an algorithm for sites that appeared to be using the Expires header (instead of “unavailable_after”) to try to list when content was defunct and shouldn’t be shown anymore.
You might be able to see where this is going. Not too long ago, Craigslist changed how they generated the “Expires” HTTP header. It looks like they moved to the traditional interpretation of Expires for caching, and our indexing system didn’t notice. We’re in the process of fixing this, and I expect it to be fixed pretty quickly. The indexing team has already corrected this, so now it’s just a matter of re-crawling Craigslist over the next few days.
So we were trying to go the extra mile to help users not see defunct pages, but that caused an issue when Craigslist changed how they used the “Expires” HTTP header. It sounded like you preferred Google’s Custom Search API over Bing’s so it should be safe to switch back to Google if you want. Thanks again for pointing this out.
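Matt's distinction between the two date mechanisms can be sketched in a few lines of Python (a hypothetical illustration with made-up helper names, not Google's actual logic):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def date_has_passed(http_date: str, now: datetime) -> bool:
    """True if an RFC 1123 HTTP date is in the past."""
    return parsedate_to_datetime(http_date) <= now

now = datetime(2013, 3, 28, tzinfo=timezone.utc)

# Caching interpretation: the cached copy is stale after this date,
# but the page itself is still valid and should stay indexed.
expires_header = "Wed, 27 Mar 2013 00:00:00 GMT"

# Robots interpretation: the page should be dropped from search
# results after this date (the meaning Craigslist originally intended).
unavailable_after = (
    '<meta name="googlebot" '
    'content="unavailable_after: 27-Mar-2013 00:00:00 GMT">'
)

# Treating the two the same is the bug: a stale cache date does not
# mean the listing itself is defunct.
print(date_has_passed(expires_header, now))  # True: stale cache, not a dead page
```

Once Craigslist switched Expires back to its standard caching meaning, Google's special-case handling kept dropping pages that were still live, which is what Google then had to unwind.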
Forum discussion at HackerNews.

Google Admits To Penalizing The BBC, But Only Granularly

Friday we broke the story that the BBC received an unnatural links notification from Google.

This was a big deal - a huge news organization received a Google notification about bad things happening to their web site. I mean, if you can't trust the BBC from a link quality point of view, who can you trust? (Fox News folks, relax)

So what happened? Google's John Mueller dug into the details and discovered this was a "granular" penalty. John said in the Google Webmaster Help thread:
Looking into the details here, what happened was that we found unnatural links to an individual article, and took a granular action based on that. This is not negatively affecting the rest of your website on a whole.
So one page on the BBC had unnatural links. Because of that, Google took action on that one individual article. Google did not take action against the rest of the BBC web site.

The thing is, John did not tell Nick which page. So there is a manual penalty on a specific article page and Nick has no clue which page it is. Shouldn't Google tell him which page so he can fix it somehow?

Anyway, this is an interesting case of Google penalizing the largest news organization in the world but only one specific page.

Forum discussion at Google Webmaster Help.

Google Panda Goes Into Hiding: No More Official Confirmations

On Friday I reported at Search Engine Land that Google will no longer give us confirmations or more details on future Panda updates.

Why? Google no longer plans to push out manual Panda refreshes; all future updates should be part of Google's rolling updates. Google told us at Search Engine Land:
I don't expect us to tweet about or confirm current or future Panda updates because they'll be incorporated into our indexing process and thus be more gradual.
Honestly, I prefer it that way. There is a lot of stress in getting Google to respond and confirm these updates, and I feel it on both ends: you, the SEOs and webmasters, want the confirmation, and Google is not always eager to give it to me.

So now, with Panda, I won't have to worry about it anymore.

For webmasters, not knowing is never good. But if Panda truly is part of Google's rolling updates, then there is no specific confirmation Google could give that would help you anyway.

Forum discussion at Google+.

Saturday, March 16, 2013

Did Google Penalize BBC News For Unnatural Links?

The BBC News, the world's largest broadcast news organization, has just received an unnatural links notification from Google. I kid you not.

A Google Webmaster Help thread has a representative of the BBC web site asking Google for help in identifying the unnatural links, so they can make sure BBC.co.uk is clear of any Google penalty.
Nick from the BBC wrote:

I am a representative of the BBC site and on Saturday we got a 'notice of detected unnatural links'.

Given the BBC site is so huge, with so many independently run sub sections, with literally thousands of agents and authors, can you give us a little clue as to where we might look for these 'unnatural links'.

To be clear, he is not denying that the BBC or one of the agents who run a section of the BBC did something wrong. He is saying that he needs help finding which section has the issue.

Was the BBC penalized by Google? Hard for me to tell. They do rank for their name [bbc]. Have they lost any traffic to their site due to this link notification? Only the BBC would know for sure.

But clearly Nick is looking to rectify the situation as soon as possible.

It makes you wonder: even a site with such a huge backlink profile can have these issues. I am not sure if the issue is with incoming or outgoing links, but the notification seems to imply this is an incoming link issue.

Oh, so I guess this doesn't hurt Matt's claim that big brands do get penalized and treated equally to other brands.

Forum discussion at Google Webmaster Help.

Friday, March 15, 2013

Google Panda Update 25 Seems To Have Hit

There are many webmasters and SEOs believing right now that Google has released an update to their Panda algorithm late yesterday.

We’ve reached out to Google to confirm or deny the Panda update, as we’ve done 24 times previously; but this time, Google told us they are unlikely to confirm future Panda updates since Panda will be incorporated into their indexing processes.

It would not be surprising if this was indeed a Panda update since Matt Cutts, Google’s head of search spam, did say at SMX West that a Panda update will be rolling out this Friday through the weekend. Matt then said although an update is expected this weekend, don’t be surprised if you don’t notice it because the Panda updates are going to be more integrated and less noticeable in the future.

I am not sure if this last push was the final manual Panda refresh or if Panda is already fully integrated into Google's normal indexing process. My guess is that this was Google's last manual push and that, from now on, they will most likely not do manual pushes of the algorithm.

The last Panda update we had confirmation on was Panda #24, so this one would be coined Panda version 25.

Here are all the releases so far for Panda:
  1. Panda Update 1, Feb. 24, 2011 (11.8% of queries; announced; English in US only)
  2. Panda Update 2, April 11, 2011 (2% of queries; announced; rolled out in English internationally)
  3. Panda Update 3, May 10, 2011 (no change given; confirmed, not announced)
  4. Panda Update 4, June 16, 2011 (no change given; confirmed, not announced)
  5. Panda Update 5, July 23, 2011 (no change given; confirmed, not announced)
  6. Panda Update 6, Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
  7. Panda Update 7, Sept. 28, 2011 (no change given; confirmed, not announced)
  8. Panda Update 8, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
  9. Panda Update 9, Nov. 18, 2011: (less than 1% of queries; announced)
  10. Panda Update 10, Jan. 18, 2012 (no change given; confirmed, not announced)
  11. Panda Update 11, Feb. 27, 2012 (no change given; announced)
  12. Panda Update 12, March 23, 2012 (about 1.6% of queries impacted; announced)
  13. Panda Update 13, April 19, 2012 (no change given; belatedly revealed)
  14. Panda Update 14, April 27, 2012: (no change given; confirmed; first update within days of another)
  15. Panda Update 15, June 9, 2012: (1% of queries; belatedly announced)
  16. Panda Update 16, June 25, 2012: (about 1% of queries; announced)
  17. Panda Update 17, July 24, 2012: (about 1% of queries; announced)
  18. Panda Update 18, Aug. 20, 2012: (about 1% of queries; belatedly announced)
  19. Panda Update 19, Sept. 18, 2012: (less than 0.7% of queries; announced)
  20. Panda Update 20, Sept. 27, 2012 (2.4% of English queries impacted; belatedly announced)
  21. Panda Update 21, Nov. 5, 2012 (1.1% of English-language queries in US; 0.4% worldwide; confirmed, not announced)
  22. Panda Update 22, Nov. 21, 2012 (0.8% of English queries were affected; confirmed, not announced)
  23. Panda Update 23, Dec. 21, 2012 (1.3% of English queries were affected; confirmed, announced)
  24. Panda Update 24, Jan. 22, 2013 (1.2% of English queries were affected; confirmed, announced)
  25. Panda Update 25, March 15, 2013 (confirmed as coming; not confirmed as having happened)

Google: We’re Unlikely To Confirm Current Or Future Panda Updates

What’s that, a Panda Update you just felt? Some believe so. But Google says it is unlikely to confirm that or any future Panda Update, as it has done in the past, because of the new “gradual” rollout infrastructure it is using for Panda Update changes.

Earlier this week at SMX West, Google’s Matt Cutts said a new Panda Update might hit this week, then later said that the update — and future updates — would no longer be apparent as an abrupt change. Rather, Panda changes would roll out over a series of days.

Because of this, Google now says it's unlikely to officially confirm if a Panda Update has hit. A Google spokesperson told us:

“I don’t expect us to tweet about or confirm current or future Panda updates because they’ll be incorporated into our indexing process and thus be more gradual.”

Until now, Google has confirmed all 24 Panda Updates that have happened; the full list appears in the story above.

Going forward, we’ll have to make our own call if a Panda update has hit. We’re doing that with the last on the list, Panda Update 25, which we believe has hit.
