Friday, October 31, 2014

Google AdSense To Google Analytics Linking Updated

Google said they've simplified the linking process between your Google AdSense accounts and Google Analytics reporting.

Google said "you can link your AdSense account within Google Analytics in fewer steps and can also link your AdSense account to multiple Google Analytics accounts."

This is the process to do it:

  1. Sign in to your Google Analytics account.
  2. Click the Admin tab at the top of the page.
  3. In the "Account" column, select the Analytics account that contains the property you want to link with your AdSense account.
  4. In the "Property" column, select the Analytics property you want to link, and click AdSense Linking.
  5. On the "AdSense Linking" page, click + New AdSense Link.
  6. Select the AdSense property that you want to link with your Analytics property.
  7. Click Continue.
  8. Select the Analytics views in which you want your AdSense data to be available.
  9. Click Enable Link.
  10. Click Done.
    Your Analytics and AdSense accounts are now linked.

Forum discussion at Google+.

Negative SEO Is Real & Google Needs To Fix It

There is no doubt that the perception in the industry is that negative SEO is a problem.

So much so, that in our last poll, only 11% thought negative SEO doesn't work. Now that we have "white hat" clients requesting negative SEO and most SEOs feel it is easy to do, it is an issue. Heck, even Google has been saying it is possible since 2007.

Yet Google continues to downplay it.

Bottom line: negative SEO is an issue. At the very least, it is a perception issue, and I personally believe it is also an issue for Google's algorithms. Even if Google denies the latter, they need to at least fix the perception, and that can be done through more transparency.

Recently I spotted a Black Hat World thread where SEOs talk about how they use Fiverr to buy lots of links pointing at a competitor, so that the competitor gets hit by negative SEO (either a manual action or a Penguin demotion).

It is an issue and I strongly think Google needs to think hard about this and try to fix it.

Forum discussion at Black Hat World.

Tuesday, October 28, 2014

Google's New nositelinkssearchbox Meta Tag

In early September, Google announced the new Sitelinks Search Box. It basically enables you to control how the search box within your sitelinks in Google's search results works with your site (that is, if a search box comes up at all).
But if you do not want a sitelinks search box to come up for your brand name, you can now disable it. The meta tag to do so is:


<meta name="google" content="nositelinkssearchbox">


Using this will tell Google not to show a sitelinks search box when your site appears in the search results. The sitelinks search box will be disabled as part of the normal Googlebot crawling and processing of the page, which can take a few weeks depending on the site and other factors.
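If you want to verify the tag is in place on a page, here is a minimal sketch using Python's standard library html.parser. The checker class and the sample HTML are hypothetical, purely for illustration:

```python
from html.parser import HTMLParser

class SitelinksSearchboxChecker(HTMLParser):
    """Scans a page for <meta name="google" content="nositelinkssearchbox">."""
    def __init__(self):
        super().__init__()
        self.opted_out = False

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs
        if tag == "meta":
            d = dict(attrs)
            if ((d.get("name") or "").lower() == "google"
                    and (d.get("content") or "").lower() == "nositelinkssearchbox"):
                self.opted_out = True

# Hypothetical page that opts out of the sitelinks search box
HTML = '<html><head><meta name="google" content="nositelinkssearchbox"></head></html>'
checker = SitelinksSearchboxChecker()
checker.feed(HTML)
print(checker.opted_out)  # True
```

Remember, though, the tag only takes effect once Googlebot recrawls and reprocesses the page, which can take a few weeks.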

Menashe Avramov spotted this...

Forum discussion at Google+.

Google Guidelines: Blocking CSS Or JavaScript Directly Can Harm Your Rankings

Google announced a change to their Webmaster Guidelines, specifically telling webmasters what they've been saying for years, and more strongly over the past few months: do not block Googlebot from crawling your CSS or JavaScript!

Google's Pierre Far wrote, "Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings."

The old guideline that was changed was from:

Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

To the new guideline:

To help Google fully understand your site's contents, allow all of your site's assets, such as CSS and JavaScript files, to be crawled. The Google indexing system renders webpages using the HTML of a page as well as its assets such as images, CSS, and Javascript files. To see the page assets that Googlebot cannot crawl and to debug directives in your robots.txt file, use the Fetch as Google and the robots.txt Tester tools in Webmaster Tools.
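As a rough self-check of the guideline above, Python's standard library urllib.robotparser can simulate whether a crawler user agent is allowed to fetch a given asset. The robots.txt contents and URLs below are made-up examples, not from any real site:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks a directory holding CSS and JavaScript
ROBOTS_TXT = """\
User-agent: *
Disallow: /assets/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under the "*" rule here, so the stylesheet is blocked
print(rp.can_fetch("Googlebot", "https://example.com/assets/style.css"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))        # True
```

For the official view of what Googlebot can and cannot crawl, the Fetch as Google and robots.txt Tester tools mentioned in the guideline are the way to go.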

In May, Google introduced the new fetch and render GoogleBot feature and told us they are fully rendering your HTML web pages, just as a user's browser would render them.

To quote Pierre Far from Google via Google+:

Let me be super clear about what this means: By blocking crawling of CSS and JS, you're actively harming the indexing of your pages. It's the easiest SEO you can do today. And don't forget your mobile site either!

The mobile reference is super important as well, read more on that over here.

Google's DMCA Piracy Algorithm Went Live Last Week

Remember when we reported that, during all the Penguin and Panda madness, Google said they would be pushing out a DMCA algorithm update?

Well, that actually rolled out last week.

Torrentfreak, a site that covers torrent news, reported many sites noticed a big drop in Google traffic.

The Isohunt.to team told TF late last week, "earlier this week all search traffic dropped in half." They shared this graph showing the drop in Google traffic:



SearchMetrics also noticed the traffic drop on torrent sites. Marcus Tober shared tons of examples of sites getting nailed in the Google search results. Here is one example of a site losing 90%+ of its SEO visibility in Google:

click for full size

How do you know if Panda, Penguin or Pirate hit you? Good question. Glenn Gabe posted that some of his clients are confused.

Forum discussion at Google+.

Thursday, October 23, 2014

SEO: 68% Of Videos Removed Or Demoted In Google's Search Results

Videos in the Google search results used to be one of the most popular vertical search features you'd find in the search results. But they are showing up less often these days.

Some SEOs say 68% less often. A Black Hat World thread has one SEO asking where they have all gone, and another SEO posted stats from a sample of about 10,000 queries saying the drop has been close to 68%.

He said, "roughly 68% of video keywords were demoted/deranked/removed right off page 1 of Google."

Here is the data he published:

Based on a sample pool of 10,014 video keywords ranking on page 1, 11 months ago:

  • 6,782 were removed from page 1
  • 3,101 dropped from the top 3 to positions 3-14
  • 131 stayed in the top 3
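As a quick sanity check, the forum poster's breakdown does add up to the full sample, and the removal rate works out to roughly the 68% he claimed:

```python
# Figures quoted from the Black Hat World post above
sample  = 10_014
removed = 6_782
dropped = 3_101
stayed  = 131

# The three buckets cover the entire sample pool
assert removed + dropped + stayed == sample

print(f"{removed / sample:.1%} removed from page 1")  # 67.7%, i.e. roughly 68%
```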

Mozcast has shown a decline as well:


What has replaced them? Well, this doesn't even factor in the answer boxes Google is showing way more often. Even if there is a video, the video is often pushed down for answers, such as in this example:


Have your videos taken a dive?

Forum discussion at Black Hat World.

Google Penguin 3.0 Updating Again?

Google launched Penguin 3.0 late Friday night and we have a lot more detail on it now, including that it should be rolling out for the next few weeks now.

Late Saturday night, the Penguin update seemed to have settled down. Many thought it wasn't done, and those assumptions were indeed correct: Google told us it wasn't done. But it was partially done, because the fluctuations stopped then.

It seems that some savvy SEOs are noticing an uptick in ranking changes yesterday, maybe more so in the UK region.

The ongoing WebmasterWorld thread has folks chiming in again around 4pm EDT yesterday. Here are some of the comments about it starting back up:

Its definitely still rolling out, 10% down on this time last week per hour.

Agreed. Are you in the UK?

I think it's starting to look better after nearly a full week of shuffling.

It is definitely still rolling. Some sites are popping in and out of first page on long tail keywords since Tuesday in the niche I monitor. Anyone else seeing big movements in the long tail keyword part of their serps?


The folks in the Black Hat World forum, however, are not really talking much about it starting back up yet, which is weird.

Did you see any changes this morning? The various rank checking tools are all pretty stable with high readings but Penguin updates don't often show up as much on these tools.




One SEO said he saw changes:


Wednesday, October 22, 2014

Google AutoCorrects: Penguin 3.0 Still Rolling Out & 1% Impact

As you already know, Google launched Penguin 3.0 late Friday. But yesterday, Google's John Mueller said he believed the roll out was complete but then hours later stepped back on that.

Today we learn from Google's Pierre Far on Google+ that the roll out is far from complete; it is still rolling out and will continue to do so for the "next few weeks."

Pierre said, "it’s a slow worldwide rollout, so you may notice it settling down over the next few weeks."

Pierre also confirmed, once again, that the rollout began "last Friday," but added that it is "affecting fewer than 1% of queries in US English search results." He also stated the obvious about any algorithm update:

This refresh helps sites that have already cleaned up the webspam signals discovered in the previous Penguin iteration, and demotes sites with newly-discovered spam.


But the key here is that Pierre said "refresh," which tells me no new signals were added to the algorithm; the only thing Google did was rerun it. Why they couldn't just "refresh" the algorithm several months ago is beyond me. I thought we were waiting for a major rewrite that adds more signals and makes it faster to run?

Penguin 3.0 Recap
  • Started rolling out late Friday night, October 17th
  • Will continue to roll out for next few weeks
  • Is a worldwide roll out
  • Impacts less than 1% of English queries (but may have a smaller or greater impact in other languages)
  • Only a refresh, no new signals added
  • It helps sites that cleaned up their link profiles recover from previous Penguin demotions
  • It demotes sites that have a bad link profile

Forum discussion at Google+.

Tuesday, October 21, 2014

Google On Being In Google News As A Search Quality Ranking Signal

Yesterday, in the hangout where John Mueller said he felt the Penguin 3.0 roll out was complete (it wasn't), John had a special guest from the Google News team, Stacie.

Stacie is the Google rep in the Google News forum; we've quoted her here dozens of times. She answered a ton of specific Google News related questions in the Google Hangout on Google+.

She said in the hangout that being accepted into Google News as a publisher "is a signal that I am sure the search algorithm uses." Why? Because Google humans review each publisher to make sure it meets the quality guidelines, so it would make sense for it to be a ranking signal. Stacie said this at 40 minutes and 21 seconds into the video.


But then I had to follow up and confirm this with John, who has more access to the search algorithm than Stacie. I asked John to clarify what Stacie said, and he did at 42 minutes and 39 seconds:


He said it is mostly about how quickly Google indexes pages, not about a quality signal for ranking. He said it is something they probably treat separately, but I am not sure John was confident with that answer. He then added there is likely a correlation between quality sites being approved into Google News and the algorithm liking those sites.

However, neither seemed confident in answering how Google News may or may not impact the normal web search algorithm.

Forum discussion at Google+.

Google Currently Won't Accept HTTPS Sites Into Google Trusted Store Program

Google launched the Trusted Store program back in 2011, their way of vouching for online merchants.

But if those merchants are fully HTTPS, per Google's latest recommendations and algorithms, they are currently out of luck.

Michael, who has a fully HTTPS-enabled site, tried to apply but was rejected because his site was fully HTTPS. He posted his concern in the Google AdWords Help community and said:

In May, I was told that we did not qualify to become a trusted store at this time because our entire site is served via HTTPS, that you are aware of this limitation and that it is being addressed. Despite arguing that there is no reason to shuttle e-commerce customers between HTTP / HTTPS protocols anymore, that using HTTPS to protect all customer interactions is essential, and that Google now recognizes HTTPS as a signal to rank sites higher in search results, there was nothing that could be done and I would have to wait for a fix.


Google did indeed respond; Mini from the AdWords team said they are working on it:

The Google Trusted Stores team is actively working with Merchants that have a full HTTPS website to make it possible for them to display the Google Trusted Stores badge. We ask that they please contact the Trusted Stores team for more information.


When will it come exactly? Well, that is unclear but I suspect very soon.

Forum discussion at Google AdWords Help.



Monday, October 20, 2014

Confirmed: Google Penguin 3.0 Released Late Friday Night

I am working on getting confirmation from Google but I have never seen the forums light up as much as they are now.

Update & Confirmed: Google Sunday afternoon has confirmed they have done a Penguin update. I am trying to get more details at this moment.

Early reports came from webmasters and SEOs in various online forums and social media. The SEO industry chatter is at an all time high and it all leads to Penguin 3.0 being released.

The Google Penguin algorithm had not been updated in over a year, not since Penguin 2.1. The forums lighting up include WebmasterWorld, Google Webmaster Help, DigitalPoint Forums, Threadwatch and BlackHat World. Twitter, Facebook, Google+ and other social networks are on fire now with Penguin chatter.

The first report I saw came from BlackHat World late Friday night. That thread is now about 250 posts deep, and a lot of the SEOs there are claiming they were hit. But there are also a lot of people reporting major increases in rankings.

It is unclear if this is a refresh to the Penguin algorithm or a revised algorithm update. Again, I am waiting to get more details from Google on this.

But it seems like 90%+ of SEOs are in agreement that Google refreshed Penguin over the weekend. Will they reverse it? Was it a test? Will it stick? That is the big question.


Tuesday, October 14, 2014

Google: Spam Is A Hard Problem For User Generated Content Platforms

Google's John Mueller admitted on Twitter that for user generated content (UGC) sites, spam is a hard issue and problem.

John wrote, "spam is a hard problem for sites that focus on UGC; keeping spammers out & keeping things easy is tough."

We know both Mozilla and Sprint had small Google penalties around user generated content on their sites. One of the reasons I killed my forum was that there was too much spam to deal with (I have enough of that in the comments here), and Google has given advice over the years on how to handle it.

John from Google, however, admits it is a hard problem to tackle.




Here are some stories we have on this problem:




Google Change Of Address Tool Adds Step-By-Step Process Walk-Through

Google has updated their change of address tool, the first major update since it launched back in 2009.

The new features make it easier for novices to go through the process of using the tool correctly.

Google's John Mueller announced it on Google+ explaining the new tool will "guide you through a part of the process, double-checking that you have the sites verified, that the redirects are in place, and that the redirect doesn't break verification."

Here is a screen shot of the process the tool will take you through:


The new update does not bring HTTPS migration support to the tool. John Mueller said "this is only for moves from one domain to another," and HTTP to HTTPS counts as the same domain.
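For the redirects part of that checklist, a crude sketch of the kind of check the tool runs might look like this. The function name, domain names, and logic are my own illustration, not Google's actual implementation:

```python
def redirect_in_place(status, location, old_path, new_domain):
    """Sketch of one check the updated tool performs: each URL on the
    old domain should 301-redirect to the matching URL on the new domain.
    The names here are illustrative, not Google's code."""
    expected = "http://" + new_domain + old_path
    return status == 301 and location == expected

# A permanent redirect to the matching page on the new domain passes
print(redirect_in_place(301, "http://newsite.com/page", "/page", "newsite.com"))  # True
# A temporary (302) redirect does not count as a completed move
print(redirect_in_place(302, "http://newsite.com/page", "/page", "newsite.com"))  # False
```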

Forum discussion at Google+.

Wednesday, October 8, 2014

Google: We Won't Be Updating Toolbar PageRank

December 6, 2013 was the last time we had a Google Toolbar PageRank update, over 10 months ago. That is not as long a wait as for a Penguin refresh (which is way more important), but still significant.

Google's John Mueller said in a video Google+ hangout yesterday that Google will probably not be updating Toolbar PageRank in the future. He said this at 20 minutes and 30 seconds into the video:

We will probably not going to be updating it [PageRank] going forward, at least in the Toolbar PageRank.



In December, when Google updated Toolbar PageRank, it turned out not to be really intentional, and I doubt that Google will be updating it going forward.

Goodbye Toolbar PageRank!
Forum discussion at Google+.

Tuesday, October 7, 2014

Google Webmaster Tools 404 Errors Sorted By Priority

Did you know that the Google Webmaster Tools report for crawl errors, specifically the 404 page not found report, is sorted in priority order?

I didn't know that, but supposedly it has always been that way. Google orders the errors they find on your site in their crawl errors report by importance or priority.

Google's John Mueller said this in a Google+ video hangout, 10 minutes and 30 seconds in.

Here is the snippet of the video:


Many SEOs and webmasters, including myself, did not know that.

Forum discussion at Google+.

Chatter: Google Penguin or Panda Refresh Happening?

The forums and the SEO community have been buzzing all weekend about major shifts in the Google search rankings. Some are 100% positive it is the final roll out of Panda 4.1, while others are saying Google is rolling out Penguin 3.0.

I am honestly not sure it is either. There are people saying their Panda sites have recovered over the weekend, and there are people who say they are 100% sure it is a Penguin update. So far, Google has not confirmed any algorithm update with me over the weekend. I'll try to get something on record.

Over the weekend, the WebmasterWorld Panda thread sparked up, as did the WebmasterWorld Penguin thread. Here are some quotes:

Panda Quotes:

I think it's just finishing. This weekend I might have been hit. It's been a very bad weekend.


And there we go again.


Friday night I was hit again; I do not know if it's Panda or another algo.

Penguin Quotes:

Pretty sure they are at least testing penguin right now. Seeing some big movements.


If it's rolling out, then I guess it's no lube again. Another downward plunge.

The comments in both my Penguin posts are pretty on fire with people saying Panda or Penguin is refreshing now also.

Marie Haynes posted on Twitter that she sees nothing regarding Penguin but one of her Panda clients recovered a lot of traffic over the weekend. She also shared this graph:


The various tracking tools do show higher volatility over the weekend:

mozcast october
serpmetrics october
serps october
algoroo october

Again, I do not have confirmation on Google about any algorithm update yet but I can tell you the comments and forums are pretty wild right now. This normally means Google is or has launched something.


Update: Google's John Mueller told me in a hangout this morning that Penguin was not launched yet.

Update #2: Google told me Panda 4.1 is still rolling out. 

Saturday, October 4, 2014

Is Google Really Releasing Penguin 3.0 Next Week? Maybe...

Yesterday at SMX East, I reported on Search Engine Land in real-time that Google's Gary Illyes said on stage that Penguin may come next week.

Tomorrow, which is Yom Kippur, will be the one-year anniversary of the last Penguin refresh, Penguin 2.1. So, this update better come today or next week or else! Or else what? :)

So what did Gary say on stage yesterday? Some people don't believe it may come next week. To be clear, he did not say it WILL come next week. He said that, based on internal communication from two weeks ago, the decision maker on if and when the algorithm will be pushed live said it would be live in weeks. Since that communication was about two weeks old, we asked if "weeks" meant a "few weeks"; if so, that would mean Penguin would be released a week from now. He said, with a smile, it may be released next week. You can also see Gary's comments about my coverage in these two posts on Google+.

But he clarified, if the tests show issues, then they won't push it out.

What was interesting was that he said that as of that internal email, any new link disavow changes won't be included in that. Again, if you disavowed links within the past two weeks or so, those links will not be disavowed in the next generation Penguin release. So clearly Google has done some initial work to stage this Penguin update - if the disavows have been fully processed for this release already.

In addition, Gary explained that this new update will support faster refreshes of Penguin in the future, as we knew, but when prodded more, he said likely on a monthly cycle. I guess a lot like how Panda is run monthly now.

Gary also explained the new Penguin refresh should be less painful. He said they could have run Penguin over and over again but that would have just hurt more and more webmasters. But the new one will be "easier a bit" on webmasters, so much so webmasters will find it a "delight." He used the word "delight" but I am not sure it made much sense in context (I don't think his first language is English and he lives outside of the US, so keep that in mind).

That being said, Gary seems to be intimately involved in the Penguin algorithm.

Forum discussion at Google+, HighRankings and WebmasterWorld.

Thursday, October 2, 2014

Connection Time & HTTP Status Codes Used By GoogleBot For Crawl Efficiency

There is nothing worse than knowing Google is unintentionally having a tough time accessing your content. Well, truth is, if you accidentally robots.txt or nofollow your site out of Google, I don't feel bad for you. But if your server flakes out on you, then I do feel your pain.

Yesterday at SMX East, the great Gary Illyes from the Google search quality group shared two tidbits that you may not have officially heard on the record from Google about crawl efficiency with GoogleBot.

Now, you know that GoogleBot will play nice with your server. If they feel crawling it too hard will hurt the server, they back off. But what signals do they use for determining that? Google has never really shared that information until yesterday.

They use (1) connection time and (2) server status codes.

If Google sees it taking longer and longer to connect to a web page on your domain between GoogleBot's requests, it will figure it should back off a bit or stop crawling. If GoogleBot is served HTTP status codes in the 5xx range, it will also back off a bit or stop crawling. Of course, it will try again soon, but the last thing Google wants to do is take down your site for users.

So if I were you, I'd have reporting configured on (1) connection time and (2) 5xx server status codes.
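A toy model of those two back-off signals might look like this. The function name and the 2-second threshold are illustrative assumptions for the sketch, not values Google has published:

```python
def should_back_off(connect_seconds, status_code, slow_threshold=2.0):
    """Toy model of the two signals Gary described: back off when the
    server returns a 5xx error, or when connections get slow. The
    2-second threshold is an illustrative guess, not a Google value."""
    if 500 <= status_code <= 599:
        return True
    return connect_seconds > slow_threshold

print(should_back_off(0.2, 503))  # True: server error, back off
print(should_back_off(3.5, 200))  # True: slow connection, back off
print(should_back_off(0.2, 200))  # False: fast and healthy, keep crawling
```

Monitoring these two numbers in your own reporting gives you the same early warning GoogleBot gets.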

Forum discussion at Google+.

Was I Hit By Google Panda 4.1? Looks Like I Was...

Last Thursday, September 25th, Google announced that they began rolling out Panda 4.1, a revised algorithm to improve their detection of Panda-like signals, ultimately aimed at helping smaller sites do better with the algorithm.

That being said, it seems like this site, the Search Engine Roundtable, took a major hit on September 25th, the day they announced Panda. I suggested it previously in my post on Monday sharing this graphic:

SER Panda Hit?
But now that it is a week later, I wanted to share more data with you.
Let's look at SearchMetrics showing I got nailed by over 60%!
SearchMetrics

Here is Google Webmaster Tools showing I took a dive of about 70%!

Google Webmaster Tools
Google Analytics makes things look less bad:
Google Analytics

Was this site hit by Panda? My sources tell me no, and I have two different sources telling me this. I am just not sure what is going on if it is not Panda. I mean, look, any good SEO would see the huge drop on the day of and after Panda was confirmed by Google and conclude this has to be Panda. But it was not. What was it?

Forum discussion at Cre8asite Forums and Google+.