Tuesday, June 24, 2014

Google Responds To Impact Of Blocking CSS & JS & Panda 4.0

Yesterday we covered some SEO theories that blocking JavaScript & CSS triggers Panda 4.0 issues. I honestly didn't believe there was a relation, based on the example provided and the very few other sites reporting the same effects, but now we have a response from a Googler.

Well, maybe the response is a bit Google-like and cloudy.

One webmaster posted the theory on Google Webmaster Help, and John Mueller responded to the specific case at hand, not necessarily to Panda 4.0 and how it relates to blocking CSS & JavaScript. But he did respond to the question about being hit by Panda while blocking content via external files.
John Mueller of Google wrote:

Allowing crawling of JavaScript and CSS makes it a lot easier for us to recognize your site's content and to give your site the credit that it deserves for that content. For example, if you're pulling in content via AJAX/JSON feeds, that would be invisible to us if you disallowed crawling of your JavaScript. Similarly, if you're using CSS to handle a responsive design that works fantastically on smartphones, we wouldn't be able to recognize that if the CSS were disallowed from crawling. This is why we make the recommendation to allow crawling of URLs that significantly affect the layout or content of a page. I'm not sure which JavaScript snippet you're referring to, but it sounds like it's not the kind that would be visible at all. If you're seeing issues, they would be unrelated to that piece of JavaScript being blocked from crawling.
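The scenario John describes is easy to check for your own site. Here is a minimal sketch using Python's standard-library robots.txt parser, with hypothetical rules that block script and style directories (the kind of setup this theory is about):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks script/style directories --
# the sort of configuration that keeps Googlebot from rendering pages.
ROBOTS_TXT = """\
User-agent: *
Disallow: /js/
Disallow: /css/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under the "*" group here, so rendering resources
# are blocked while regular HTML pages remain crawlable:
print(rp.can_fetch("Googlebot", "http://example.com/js/app.js"))   # False
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # True
```

A quick loop like this over your CSS and JavaScript URLs tells you whether Google can actually fetch the resources it needs to render your pages.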


So is John saying that if you block content, it may impact the Panda algorithm? Or is he saying that Google can't see the blocked content anyway, so it has no impact on Panda? Or maybe it may or may not have an impact, because Panda is about content and perhaps layout?

See how this can get confusing. What is your take?

Forum discussion at Google Webmaster Help.

Update: John responded again basically implying it is not Panda. He wrote:

Looking at your site, those disallowed scripts are definitely not causing a problem -- it's primarily an issue of problematic links here. That's what I'd focus on first. Since there's a manual action involved, that's something which you can work on to resolve.


He then answers the specific question at hand head on:

Regarding your more general question of whether disallowed scripts, CSS files, etc play a role in our Panda quality algorithm: our quality algorithms primarily try to understand the overall quality of a page (or website), and disallowing crawling of individual aspects is generally seen as more of a technical issue so that wouldn't be a primary factor in our quality algorithms. There's definitely no simple "CSS or JavaScript is disallowed from crawling, therefore the quality algorithms view the site negatively" relationship.


He goes on in more detail, so check out the thread.

Image credit to BigStockPhoto for Panda Java Mug


Saturday, June 21, 2014

Video: Matt Cutts Interview At SMX Advanced On Payday Loan, Reconsideration Requests & MetaFilter

Here is the video from last week's SMX Advanced with Matt Cutts and Danny Sullivan. In the video you will find a lot of what we covered, specifically Matt announcing the Payday Loan algorithm going live the next day, the new reconsideration request rejection notices and the MetaFilter story. Plus a couple of other things.

Here are the times for the main points he made:


Here is the video:


You can see how I was confused with the Metafilter part...

Here are our stories coming out of it:


Forum discussion at Google+.

Friday, June 20, 2014

Google Manual Action Over 302 Redirected Links

A WebmasterWorld thread has one webmaster claiming he received a reconsideration request rejection notice and in that notice, Google gave example links.

One of the example links is a link that 302 redirects to his web site. A 302 redirect is a temporary redirect and is often thought not to pass proper and full link juice from the URL to the target site. But as many of you know, it may and often does.
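For anyone auditing a link profile, the distinction is just the HTTP status code on the hop. A small sketch classifying the redirect status codes involved (the status code values are from the HTTP spec; the function name is mine):

```python
# Permanent redirects: 301 Moved Permanently, 308 Permanent Redirect.
PERMANENT = {301, 308}
# Temporary redirects: 302 Found, 303 See Other, 307 Temporary Redirect.
# The /goto/[site] links described above respond with one of these.
TEMPORARY = {302, 303, 307}

def redirect_kind(status: int) -> str:
    """Classify an HTTP status code as a permanent or temporary redirect."""
    if status in PERMANENT:
        return "permanent"
    if status in TEMPORARY:
        return "temporary"
    return "not a redirect"

print(redirect_kind(301))  # permanent
print(redirect_kind(302))  # temporary
```

The takeaway from this thread is that Google appears to treat both kinds as links for manual-action purposes, so a cleanup script should not filter out the temporary ones.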

This webmaster was surprised Google would use links that are redirected in a 302 manner as a bad link towards the site. He wrote:

I got a reply to a recon request this morning and one of the example links was a page that didn't actually contain a physical direct link to my site.

It contained a URL that included a /goto/[mysite] command which eventually 302 redirected to my site.

If this is the sort of thing that Google is concerned about then it really increases the workload for anyone cleaning up their link profile.


So, clearly, you need to dig deep into your link profile and clean out even URLs that reach your site through a 302 redirect.

Forum discussion at WebmasterWorld.

Tuesday, June 17, 2014

Google's Matt Cutts Said Pay Day Loan Algorithm Rolled Out But Did It?

As I reported from the airport Thursday night at Search Engine Land, Google's Matt Cutts said the Payday Loan algorithm was rolling out, but the truth is, I haven't seen signs of it.

In fact, I've asked many folks in the black hat and spammy categories about this and they've seen nothing. There is only one site I see reporting any changes and even there I am not convinced.

As you know, the Google Payday Loan 3.0 algorithm was released, targeting spammy queries over spammy sites - whatever that means. But even though it began rolling out Thursday night, as Matt Cutts said on Twitter - I do not see reports from within the community about it.



What I do see are sites not in the spammy category reporting major changes in their rankings. The ongoing WebmasterWorld thread has folks complaining of major shifts starting Thursday night/Friday - but they are not in the spammy categories.

Here are some complaints:

  • I'm not in a spammy category (consumer electronics) and my site was pushed down for no apparent reason with this update. Seems Google messed something up.
  • I'm not in a spammy category. I've been decimated. We can't pay our bills now.
  • Meanwhile the site that never gets updated, that rips off other content got a nice boost. Way to go.



Some benefited, as you would imagine:

  • We don't know what Google classifies as "spammy". But I'd say consumer electronics queries might indeed fall into that category. Take a look how many (legit and automated) sites are out there.
  • I had a nice boost on June 12th. Consumer electronics too, with high quality indepth (often 6000+ words) articles.

A senior member, I think, has it right, saying this was a slight modification to the Panda 4.0 algorithm:

  • I've noticed a slight drop in traffic since Thursday.
  • I've compared my (low level) bing and yahoo traffic to google and as best I can tell it seems just related to google.
  • I benefited from the recent Panda 4.0. A couple of other people posting here have mentioned slight drops in traffic and a positive Panda 4 effect.
  • I'd bet that Google has tweaked the Panda 4 algo.
  • In regard to the PayDay algo, I don't think that would much apply to my content.

Scanning the Black Hat World forums, there is very little chatter there. Most of it is people asking, did you see anything? And most replies say, "I saw some minor fluctuations for a few of my sites but really nothing major," and the like.

I asked Matt Cutts on Twitter if there really was a PayDay Loan 3.0 and didn't get a response:


I suspect what most of you are seeing is a small update to Panda 4.0 and not Pay Day Loan specific. But I am not Google and I do not have inside information.


Friday, June 13, 2014

The MetaFilter Google Update Is In Process Says Google's Matt Cutts

As you may remember, MetaFilter got nailed by an unconfirmed update on November 17, 2012 that Google denied.

Google has now confirmed that update, 1.5 years later. Not only that, Google's Matt Cutts said on Twitter, in response to my story on Search Engine Land that they are working on updating that unconfirmed update from 1.5 years ago.

Specifically, MetaFilter doesn't have to make any changes to their site, but when Google releases the update in the next few weeks or so, MetaFilter will see an increase in referrals from Google.

Matt Cutts tweeted these tweets:





The update from November 17, 2012 seemed to have had a major impact on many forum-like web sites. Google wouldn't talk about it then, but now, they are talking.

I suspect any of you hit on that date will want to follow along to see when the MetaFilter update (as I'm naming it) is released, and to see if you are also released from that unconfirmed update.

Google's Matt Cutts will not give us more details on that update.

Another side note: When Google does not confirm updates, it clearly doesn't mean there was no update. So I will keep reporting unconfirmed Google updates - just in case years later Google confirms them.

Forum discussion at Twitter.

Personalized Certificates & Public Profiles Pages For Google Partners

Google announced that Google Partners now have two new ways to share their AdWords certification with prospects or clients.

(1) You can now get a personalized HTML certificate that you can showcase.

(2) You can now also use your sharable public profile page.

Here is a screen shot of them both, one overlaid on top of the other:

To view the HTML version of your certificate:
  1. Sign in to your Google Partners account at www.google.com/partners
  2. Click the ‘My profile’ link in the ‘Overview’ section. 
  3. If you’re AdWords certified, click the ‘AdWords Certified’ link to open a printable, HTML version of your certificate.
  4. You can print the certificate or save it to your desktop to share with potential clients.
To share your Google Partners public profile:
  1. Select the ‘My Profile’ page link from the menu.
  2. Visit the ‘Public Profile’ section of this page. By default, your public profile page is visible only to you. 
  3. To make your profile visible to others, click the ‘Share with’ drop-down menu and select ‘Public.’ 
  4. You can now click the ‘View Profile’ link and begin sharing your page with anyone. 
  5. If you’re signed in to your Partners account, your profile's visibility status will appear at the top of the page. You may always change it from public back to private. 

Forum discussion at Google+.

Google Spam Algorithm Version 3.0 Launches Today

As I reported last night out of the SMX Advanced keynote with Matt Cutts - the third version of the Google Spam Algorithm, also known as the PayDay Loan algorithm, is launching today.

Didn't we just have version 2.0 a couple weeks ago? Yes we did.

Matt Cutts explained that 2.0 was targeting specifically spammy sites, while 3.0 targets more spammy queries. So the algorithm looks more at the query versus looking at the site. You are smart SEOs, so you can figure out what that means.

Matt also explained that version 2.0 also added some negative SEO factors, to reduce the amount of negative SEO that can happen. If you believe that.

I do not believe this launched just yet, it may launch today sometime, or maybe tomorrow - but likely today, Matt Cutts said. Based on the lack of complaints in the black hat forums, I do not think it launched at the time I am writing this.

Reminder, this will look specifically at spammy queries, such as [payday loans], [casinos], [viagra] and other highly spammy queries.
Forum discussion at WebmasterWorld, Twitter and Google+.

Update: It began rolling out at 4:40pm EST:



Tuesday, June 10, 2014

Webmaster: I've Tested Negative SEO Through Links & It Works

Last week we reported that most webmasters are claiming negative SEO is easier now than ever. If you look at the conversation there, you will see it is somewhat of a hot topic.

That being said, since then one webmaster decided not just to say it works, but also to explain how he implemented the technique. Sadly, it was not too difficult, according to this webmaster.
The steps?

The first month, contract a couple $5 guest blog posts [make sure the posts are in broken English of course], then go back to what you were doing.

Second month, try a few more [4-8] $5 [broken English] guest blog posts and add some forum link drops to the mix. Go back to what you normally do -- Nothing will happen.

Third month, add even more [broken-English] guest blog links [2x or 3x per week], increase the forum link drops and sign up for long-term ["undetectable"] directory additions.

If the site hasn't tanked yet, month 4 hit 'em with 20,000 inbound links all at once -- Keep doing it and eventually the site you're aiming at will tank and they won't be able to figure out how to recover -- It takes almost none of your time and costs very little to tank a site due to the "penalty mentality" Google has decided to run with.


Yea, it is not rocket science, and any SEO who went about this would likely and logically take these steps.

Does and can it work on most sites? I do not know. I doubt it can work on really well established sites. But on the average mom and pop e-commerce site, sure - why not.

As we said before, negative SEO is not new, in fact, Google has said it is rare but possible since 2007. Sites as large as Expedia may have suffered from it and Google had to reword their documentation on the topic. 

Saturday, June 7, 2014

Google's Matt Cutts: The Number One SEO Mistake Is...

If you had to guess the most common SEO mistake, what would you guess? Building a site of all images or in Flash? Making a title tag that says "Home" instead of your core keyword phrase? Spamming Google? Nope.

Matt Cutts, Google's head of search spam, said it is not having a site at all.
In a fun video, where Matt goes invisible for parts of it, he says that is the number one mistake. If you do not have a web site, then Google does not know you exist. You are invisible to Google. So number one priority, get a web site.

Here is the video:


Forum discussion at Twitter & Google+.

Thursday, June 5, 2014

58% Say Google's Penalties Are Immoral

In May, we wrote twice about the ethics of Google penalties and covered the debate around that, asking you to vote and let us know if you think Google penalties are unethical or immoral.

We have over 200 responses now; I am actually surprised by the lack of votes, but I'll share them with you anyway.

58% of you said you do think the penalties issued by Google are unethical or immoral. 34% said you do not think they are unethical or immoral. Finally, 8% said you are unsure if they are or are not.

Here is the pie chart:

Google's Penalties Are Immoral Results

Forum discussion continued at Google+, Google+ and Twitter.

This post was scheduled to be posted today but was written at an earlier point in time. The author is not around on June 4th or 5th to respond to comments.

16% Claim Recovery After Google's Panda 4.0 Update

On May 20th, Google began rolling out Panda 4.0 and shortly after we asked you to take our Panda poll. Thousands of you did and I wanted to share the results.

I asked, How Did Google Panda 4.0 Impact Your Site?

  • 28% : My Rankings Decreased, But I Was Never Previously Hurt By Panda
  • 24% : My Rankings Remained The Same, But I Was Never Previously Hurt By Panda
  • 20% : My Rankings Increased, But I Was Never Hurt By Panda
  • 16% : I Recovered From My Previous Panda Penalty
  • 12% : I Did Not Recover From My Previous Panda Penalty

I am surprised so many are claiming a recovery; that is over 200 of you saying your site recovered! Congrats!

Google Panda 4.0 Update Poll Results



Wednesday, June 4, 2014

You Can Now Connect Your Google+ Page With Your Google Maps Listing

You can finally connect your Google+ company page to your Google Maps listing.

For the past year or so, I had two listings for my company in Google+. Then, somehow, I had a third. Now, I was able to connect my Google Maps verified listing to my Google+ company page. And the biggest surprise to me, it worked! Google Maps is known for bugs and this seemed to actually work.

My company's Google+ page over here now also has a "verified local business" badge, because I went through the steps to do so.

The steps are described here:

(1) Assuming you have a verified Maps listing and Google+ page, log in to Google+, select Pages from the left-hand navigation ribbon, and click Manage this page on the local page.

(2) The local page will look almost the same as your current page, but it will include a verification shield next to the page's name. If you hover over the badge, you'll see "Verified local business".

Connect Google+ Page

(3) From the top left corner, choose Dashboard > Settings

(4) Scroll to the "Profile" section. Next to “This page is connected to Google Maps”, click Connect a different page.

(5) In the "Link a different page to this location" dialog, pick the page that you’d like to connect to Maps and click Next.

(6) You’ll see a list describing the changes to the newly-created page you’re connecting to Maps, and the local Google+ page you’re disconnecting from Maps. Click Confirm.

That is it and it worked.


The page that’s newly connected to Maps will:
  • Display the name and verification badge from the former local page.
  • Display the business information (hours, phone number, etc.) and reviews from the former local page.
  • No longer display prior owner responses to local reviews. Your existing reviews may take a few hours to show up after connecting the page.
  • Carry over followers, posts, and managers. 
  • Carry over the custom URL (if you’ve set one up).
  • Possibly remove ad campaigns associated with the page. To start a new campaign, visit http://www.adwords.google.com/express and follow the step-by-step instructions.

The former local page will:
  • No longer display on Google Maps
  • No longer display local business information or reviews
  • Be renamed to "Backup of "
  • Still be visible on Google+
  • Retain followers, posts, and managers from the former local page
  • Retain the custom URL from the former local page (if you’ve set one up)
  • No longer display AdWords Express campaigns associated with the page.

Forum discussion at Google Business Help.

70%+ Want Public Disclosure Of Google Penalties

Back in February we covered the topic of penalty disclosure and asked our audience if Google should confirm, publicly, that a site was or was not penalized by a manual action or algorithmic action.

We asked in our poll "Should Google Confirm Penalties To Public?"

The majority of you think it is good for Google to publicly disclose penalties: 71% of you, actually, while only 21% think it would be a bad idea.

Here is the chart:

Disclose Google Penalties Publicly

For more details on this topic, see our post named Should Google Confirm Penalties To Public?

Forum discussion continued at WebmasterWorld.


June 2014 Google Webmaster Report

May was an incredibly busy month for Google SEO related topics, specifically seeing Panda 4.0 roll out, the Google Spam algo being updated, press release sites taking a dive and several unconfirmed updates that Google wouldn't talk about.

We also saw eBay get slapped, a popular site named MetaFilter talk about their Google penalty and a Nest competitor call out Google.
So it was a busy month!

Here is a recap by category, if you want to catch up quickly:

Google Panda 4.0:
Google Spam Algorithm 2.0:
Not Confirmed Google Updates:
Google Penalty Topics:
Google Videos & SEO Topics:
Google Webmaster Tools:
Google Legal Issues:

Also check out the previous month at the May 2014 Google Webmaster Report.

Tuesday, June 3, 2014

Google App Indexing Now Supports Other Languages

Google announced Friday that their App Indexing protocol, which lets you mark up your content so Google can deep link to your Android app's content from the search results, now supports other languages. It started off slowly and then rolled out internationally, but only for English-language Android apps.
Now, it supports Chinese (Traditional), French, German, Italian, Japanese, Brazilian Portuguese, Russian, and Spanish.
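As a refresher on how the markup works: App Indexing ties a web page to its in-app equivalent via an alternate link element pointing at an android-app:// deep link. A hypothetical example (the package name and URL here are placeholders, not from Google's announcement) might look like:

```html
<!-- Deep link format: android-app://{package-name}/{scheme}/{host/path} -->
<link rel="alternate"
      href="android-app://com.example.android/http/example.com/page" />
```

When Google crawls the page and sees this element, the matching search result can open directly in the Android app instead of the browser.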

With the announcement, the first set of internationalized apps include Fairfax Domain, MercadoLibre, Letras.Mus.br, Vagalume, Idealo, L'Equipe, Player.fm, Upcoming, Au Feminin, Marmiton, and chip.de.

To learn more about bringing your Android app into the mix, see this page. For other languages check out Chinese (Traditional), French, German, Italian, Japanese, Brazilian Portuguese, Russian, and Spanish.

Bing also has quietly added app linking for Windows based apps in the Bing search results.

Google: You Rank As You Should, It's Not Panda

I honestly love seeing this, even though it somewhat upsets me to see. I know, weird... But Google's John Mueller responded to a webmaster in a Google Webmaster Help thread about him losing 90% of his "indexing" after the Panda 4.0 update. He said 15 years of work are now "down the toilet."

John Mueller said nope, "your sites are ranking where they'd normally rank." I scratched my head and said, do they? And looking at SearchMetrics, it doesn't seem Panda 4.0 hit the site recently at all:


Of course, this makes me go back to me wanting Google to add an automated action viewer or the like to Google Webmaster Tools.
John told this webmaster that the "door is open" and he can do wonders with his site. John wrote:

Your sites are ranking where they'd normally rank -- there's no specific algorithm that's treating them in any special way at the moment. So in short, the door's open. That said, things always change in search, be it our algorithms, what users expect, or what other sites are doing. It's normal to see fluctuations over time. My recommendation would be to look at your sites overall and think about what you could do to significantly improve them. Maybe it makes sense to fold them all into a single site, so that it's easier to focus & implement changes? With a handful of sites, that's essentially up to you (personally I'd try to keep things simple and put everything into a single one, if these were my sites).


I just don't get how people claim things like this without even consulting anyone. Step one: see what is impacting your web site and tackle it from there.

Forum discussion at Google Webmaster Help.

Tabke: Google Penalized eBay For Dropping AdWords Ads

As you know, eBay was penalized by Google either via a manual action or algorithmic action or both. Google and eBay won't confirm it, but it is clear from the SearchMetrics charts.

The reason for the penalty is not 100% clear; a site like eBay likely had lots of issues. But there are also a lot of theories out there, some citing that eBay stopped buying AdWords ads and that this is how Google is getting back at them.

Brett Tabke, the founder of WebmasterWorld and head of the PubCon conference (he's been around the block for a while), said at WebmasterWorld:

eBay 1 Year After Dropping AdWords: Pay up or get booted.

I don't think I have seen a more egregious case of Googles complete duplicity and lack of transparency in it's search methods. I think it clearly sends a signal to others that if you want organic results, you have to "pay up" or you will be booted.


Why did Google wait a year according to Tabke?

Yes. They needed ye Old plausible deniability after ebay has been running AdWords for almost 14 years. I've heard it is well over a billion that they spent.


Of course, not all agree with Tabke's theory but many do.

Do you think this is Google's ad side connecting with the organic side and paying eBay back?

Sunday, June 1, 2014

Press Release Sites See Huge Drop In Google Ranking After Panda 4.0

Yesterday I reported at Search Engine Land that Panda 4.0 may have hurt press release sites and showed SearchMetrics data for PRWeb, PR Newswire, BusinessWire and PRLog all losing between 60% to 85% of their SEO visibility.

This drop seems to have come right after the Google Panda 4.0 update. The controversy around press release sites was mostly about the links flowing from those releases, not necessarily the duplicative nature of the content. But let's be honest, many, many press releases are content-thin and spammy on the content end, not just on the link end. So maybe, just maybe, Panda 4.0 adjusted for it and the big sites felt it?

Sean from SEER also documented the drop via SEM Rush data and screen shots of before and after rankings for some press releases.

Here are SearchMetrics charts:

[SearchMetrics visibility charts for PRWeb, PR Newswire, BusinessWire and PRLog; click for full size]

Was this a Panda thing? It is not clear. It seems like it, but these could also be manual actions, similar to the overlap with eBay?

Google's Matt Cutts: It's Silly To Think We Penalize Vivint Because It Competes With Nest

Pando wrote an article named After Google bought Nest, it removed one of the company’s biggest competitors from search results showing how Vivint was removed from Google shortly after Google acquired Nest.

Matt Cutts, Google's head of search spam called this "silly," in a Hacker News thread. In fact, he said the reason they were removed was because they participated in guest blog link spamming. Matt said, Google penalized them in November 2013, well before they acquired Nest in January 2014.

Here is Matt's response:

It's a shame that Pando's inquiry didn't make it to me, because the suggestion that Google took action on vivint.com because it was somehow related to Nest is silly. As part of a crackdown on a spammy blog posting network, we took action on vivint.com--along with hundreds of other sites at the same time that were attempting to spam search results.

We took action on vivint.com because it was spamming with low-quality or spam articles like...[removed, see hacker news for the links]

and a bunch more links, not to mention 25,000+ links from a site with a paid relationship where the links should have been nofollowed.

When we took webspam action, we alerted Vivint via a notice in Webmaster Tools about unnatural links to their site. And when Vivint had done sufficient work to clean up the spammy links, we granted their reconsideration request. This had nothing whatsoever to do with Nest. The webspam team caught Vivint spamming. We held them (along with many other sites using the same spammy guest post network) accountable until they cleaned the spam up. That's all.


Matt added:

In this case, we started dissecting this particular spammy guest blog posting network in November of 2013, and Google didn't acquire of Nest until January of 2014. So Vivint was link spamming (and was caught by the webspam team for spamming) before Google even acquired Nest.


You can then see Matt and the author of the Pando arguing on Twitter:



AJAX Hash Bang URLs No Longer Work In Fetch As Google Tool

This week, Google introduced the new Fetch as Google tool, which now supports rendering the page as well. The tool is actually very sweet in many ways, including visualizing what Google sees and showing you what Google cannot see.

But there is an issue: the Fetch as Google feature no longer supports hash bang (#!) URLs, the AJAX-style URLs some newer sites use.

Here is a picture from Aaron Bradley on Google+ and Google Webmaster Help showing the issue:

click for full size

John Mueller from Google responded saying this is something Google should probably fix. He wrote:

That seems like something we should fix / support here too - thanks for posting! In the meantime, you can just rewrite the URLs yourself and submit those.
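John's workaround refers to Google's AJAX crawling scheme, under which a #! URL maps to an _escaped_fragment_ URL that the crawler fetches instead. A rough sketch of that rewrite (percent-encoding the whole fragment is a simplification of the scheme's escaping rules):

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Rewrite a #! (hash-bang) URL into its _escaped_fragment_ form,
    per Google's AJAX crawling scheme. Non-#! URLs pass through as-is."""
    if "#!" not in url:
        return url
    base, fragment = url.split("#!", 1)
    # Append to an existing query string if the URL already has one.
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="")

print(escaped_fragment_url("http://example.com/#!/page"))
# http://example.com/?_escaped_fragment_=%2Fpage
```

Submitting the rewritten form to Fetch as Google is the manual version of what the tool presumably did automatically before.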


Forum discussion at Google+ and Google Webmaster Help.