Tuesday, September 30, 2014

Matt Cutts Missing Google Search Quality?

Back in July, Matt Cutts shocked the SEO industry by announcing his extended vacation - many thought he'd never come back. He has not officially returned yet, so it is unclear if he will.

That said, it does seem he misses the search quality team and the SEO industry.

Last week, he shared a bit about his spammy search history and couldn't resist mocking black hats. Now, Matt is getting into the Panda 4.1 news.

Matt Cutts reshared the news about Panda 4.1 on both Google+ and Twitter.
On Google+ he wrote:

Sounds like there's a new Panda update slowly rolling out.


On Twitter he said a similar thing:




Is Matt Cutts itching to come back into his spam fighting role? Will he come back soon?

Forum discussion at Google+ and Twitter.

Google: There Isn't A Magical SEO Advantage By Switching To HTTPS

Google's John Mueller said in a video hangout, 22 minutes and 21 seconds in, that there is no magical SEO advantage to switching from HTTP to HTTPS.

But didn't Google tell us that HTTPS is a ranking signal? Yes, but it is only a "very lightweight signal," though its weight may increase over time. It hasn't yet.

Jennifer Slegg first covered it and transcribed it:

I wouldn’t expect any visible change when you move from http to https, just from that change, just from SEO reasons. That kind of ranking effect is very small and very subtle. It’s not something where you will see a rise in rankings just from going to https

I think that in the long run, it is definitely a good idea, and we might make that factor stronger at some point, maybe years in the future, but at the moment you won’t see any magical SEO advantage from doing that.

That said, anytime you make significant changes in your site, change the site’s URLs, you are definitely going to see some fluctuations in the short term. So you’ll likely see some drop or some changes as we recrawl and reindex everything. In the long run, it will settle down to about the same place, it won’t settle down to some place that’s like a point higher or something like that.


Here is the video:


My thought: if you know what you are doing, make the switch. It can only benefit you if you do it right, and in the long run Google will likely tweak the signal and give it more weight.

Google Panda 4.1 Now Rolling Out; Aims To Help Smaller Web Sites

One of my fears was that Google would announce a big algorithm update while I was offline for Rosh Hashanah, and they did just that - although the update was not Google Penguin related, it was Google Panda related.

This time, Pierre Far had the privilege of announcing it on his Google+ page. He wrote:

Panda update rolling out

Earlier this week, we started a slow rollout of an improved Panda algorithm, and we expect to have everything done sometime next week.

Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice.

Depending on the locale, around 3-5% of queries are affected.


So the rollout likely began on Tuesday or Wednesday of last week. On Wednesday, I actually thought something was going on and asked some folks I know who track this closely on Twitter, but was pretty much shot down by them. So I let it slide. :(

Anyway, Panda began rolling out and I suspect by now, it is fully rolled out. This is code named Panda 4.1, the previous official major Panda release was Panda 4.0 on May 20, 2014. Since then, Panda has continued to refresh fairly often on a monthly cycle and will continue to do that even with Panda 4.1.

This update, Panda 4.1, impacts 3-5% of search queries, so it is a pretty major one. But Pierre Far from Google said it is friendly to smaller sites: "greater diversity of high-quality small- and medium-sized sites ranking higher," he said. Is that true? Well, time will tell, but many are not too confident.

Pierre also said the reason they are announcing it is that it is not a simple refresh; they added more signals to the algorithm. Pierre explained Google added a "few more signals to help Panda identify low-quality content more precisely."

I am a bit concerned this site was hit by Panda 4.1. Here is my traffic for the past few weeks, notice the huge decline from the previous two weekends to this one. It may be that I was offline, posting stale content, on Thursday and Friday but I rarely post new content on weekends (Saturday and Sunday).

SER Panda Hit?

This shows a roughly 30% drop in organic Google traffic from the previous weekends to this past weekend. The rest of the week shows normal traffic, including the Thursday and Friday I was offline with content that was not groundbreaking.

There is a lot of analysis going on, which I will dig deeper into after the dust settles over this week. But if you want to dig now, here are those threads listed below. 

Saturday, September 27, 2014

SEO Web Traffic: Easy SEO Tips To Boost Website Or Blog Traffic

Every blogger and website owner wants their blog posts or web pages to reach the maximum number of people and rank well in the search engines. This is all the more important because without visitors, no blog or website can survive for long.

Even if you have the best possible content, it is useless if you don't have enough readers. You simply can't deny that higher traffic means more money and less traffic means a dead business.

To make your blog or website do really well and get lots of traffic, you'll need to follow a few tips:

1. Keep Writing And Make Your Content Unique And Of Good Quality

You need to keep posting new content, because the more you post, the more your blog or website will get noticed in the search engines. Frequently updated sites and blogs are preferred by the search engines, so keep writing and keep posting from time to time. If possible, post once each day, but if you are too busy for that, post at least one to two times weekly.

But merely posting won't help you as much as posting quality content will. Make your content informative, useful, and interesting. Choose topics that are currently creating a buzz, find the right keywords, and include them in your post.

2. Use Keywords

Using keywords is vital for your website as well as your blog content. Proper keyword usage increases the chances of your blog posts or website pages being displayed in the search engines for a specific keyword search. The best way to choose your keywords is to use a keyword research tool and find the most favorable keywords related to your post. There are many of them available for free, such as the Google Keywords tool.

3. Choose A Simple Domain Name

If you haven't created one already, you still have a chance to work on it now. Make sure that your domain name isn't a big, complicated play on words but a simple, easy-to-remember set of words. Again, it is best to choose one that relates to the central topic of your website or blog, and try to keep the name as short as possible, since short names are easier to remember than long ones.

4. Make Your Post Titles Engaging

It isn't a very difficult task, but it does require you to work a bit intelligently. Titles are the very first thing your readers come across, so you need to be extra careful with them. Keep them interesting, attractive, and to the point, and if possible try placing your keyword in them. Browse other popular blogs in your niche and study their post titles; this will give you a better idea.

5. Make Your Posts Search Engine Friendly

Pick the best keywords that fit your content and place them in your post titles, the main body, links, footers, meta tags, etc. To make them more prominent, put them in bold or italics. You can even consider using them in file names, so you can have files like yourkeyword.html.

But you also need to know that overusing keywords is disliked by the search engines, so keep them within limits. The best way to do this is to use synonyms of the keywords.

6. Discover What Your Competitors Are Doing

Search engines analyze incoming links to your website as part of their ranking criteria. Knowing how many incoming links your competitors have will give you a tremendous edge. Of course, you still have to discover your competitors before you can analyze them.

Your analysis of competitors should include these extremely important linking criteria:

- Competitor rank in the search engines
- Quantity AND quality of incoming links (prioritized)
- What keywords are in the title of the linking page?
- Percentage of links containing specific keywords in the link text
- The Google PageRank or MozRank of linking pages
- The quality of the linking domain and the linking page (measured by links and mentions)

Aside from using some of the awesome SEO software mentioned on this website, here are some things I personally do when researching a competitor:

- Click the link to their Site Map page and see what keywords you find in the links
- Get a savvy web person to find and analyze their XML sitemap for keywords in page names
- View the HTML title and meta tags of your top competitors to compile a list of required content

7. Use Heading Tags Properly

Make good use of heading tags in your web page content; they provide search engines with information on the structure of the HTML document, and search engines typically place higher value on these tags relative to other text on the web page (except perhaps hyperlinks).

Use the h1 tag for the main topic of the page. Make use of h2 through h6 tags to indicate content hierarchy and to delineate blocks of similar content. I don't suggest using multiple h1 tags on one page, so your key topic isn't diluted.
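For instance, a heading structure following this advice might look like the generic illustration below (the section names are placeholders, not from any particular site):

```html
<!-- One h1 for the page's main topic -->
<h1>Easy SEO Tips</h1>

<!-- h2 tags delineate the major sections -->
<h2>Use Keywords</h2>
<p>...</p>

<h2>Use Heading Tags Properly</h2>
<!-- h3 tags nest beneath their h2 to show the content hierarchy -->
<h3>Why Only One h1 Per Page</h3>
<p>...</p>
```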

8. Follow W3C Standards

Search engines love grammatical, clean code. Clean code makes the site easier to index, and can be an indicator of how well a website is built.

Following W3C standards also nearly forces you to write semantic markup, which can only be a good thing for SEO.

9. Place Your Blog And Website Links In Your Social Profiles

If you have profiles on social sites like LinkedIn, Google+, Facebook, Twitter, Pinterest, etc., simply add the link to your blog or website to your profiles on those social sites. This will send you traffic from the visitors to your profiles.

10. Submit Your Blog Or Website To The Search Engines

Every search engine and directory offers a section through which you can submit the link to your blog or website. Google, Yahoo!, Bing, and DMOZ are the main ones.

Friday, September 26, 2014

Google Panda 4.1 Update: Google Launches Panda 4.1

Google has announced a new Panda update, counted as 4.1. The latest Google Panda algorithm update comes four months after the last and is designed to keep "thin" or poor-content websites from ranking well.

Google said in a post on Google+ that a "slow rollout" began in the third week of September 2014 and will continue into next week before being complete. Google said that, depending on location, about 3% to 5% of search queries will be affected by this Google Panda 4.1 algorithm update.

This is a genuine Google Panda algorithm update, not simply a data refresh like some of the earlier releases.

This new Google Panda algorithm update means that new sites not previously hit by Panda might get impacted.

The update also means that anyone who was penalized in the last update and has made the right changes to their website can recover.

The latest Google Panda algorithm update comes four months after the last, which suggests that this might be a new quarterly cycle that we’re on.

Google Panda algorithm had been updated on a roughly monthly basis during 2012. In 2013, most of the year saw no update at all.

The rollout means anyone who was penalized by Panda in the last update has a chance to emerge, if they made the right changes. So if you were hit by Panda, made alterations to your site, you’ll know by the end of next week if those were good enough, if you see an increase in traffic.

The rollout also means that new sites not previously hit by Panda might get impacted. If you’ve seen a sudden traffic drop from Google this week, or note one in the coming days, then this latest Panda Update is likely to blame.

Past Google Panda Updates:

Panda Update 1, AKA Panda 1.0, Feb. 24, 2011
Panda Update 2, AKA Panda 2.0, April 11, 2011
Panda Update 3, May 10, 2011
Panda Update 4, June 16, 2011
Panda Update 5, July 23, 2011
Panda Update 6, Aug. 12, 2011
Panda Update 7, Sept. 28, 2011
Panda Update 8, AKA Panda 3.0, Oct. 19, 2011
Panda Update 9, Nov. 18, 2011
Panda Update 10, Jan. 18, 2012
Panda Update 11, Feb. 27, 2012
Panda Update 12, March 23, 2012
Panda Update 13, April 19, 2012
Panda Update 14, April 27, 2012
Panda Update 15, June 9, 2012
Panda Update 16, June 25, 2012
Panda Update 17, July 24, 2012
Panda Update 18, Aug. 20, 2012
Panda Update 19, Sept. 18, 2012
Panda Update 20, Sept. 27, 2012
Panda Update 21, Nov. 5, 2012
Panda Update 22, Nov. 21, 2012
Panda Update 23, Dec. 21, 2012
Panda Update 24, Jan. 22, 2013
Panda Update 25, March 15, 2013
Panda Update 26, AKA Panda 4.0, May 20, 2014 (7.5% of English queries were affected; confirmed, announced)
Panda Update 27, AKA Panda 4.1, Sept. 25, 2014 (3-5% of queries were affected; confirmed, announced)

Google: GoogleBot Doesn't Lose Sleep Over Broken Links

Google's John Mueller has an awesome line in a response to a Google Webmaster Help thread question about broken links.

John wrote, "The web changes, sometimes old links break. Googlebot isn't going to lose sleep over broken links."

John explained that GoogleBot can handle broken links but many of your users cannot, so make sure to fix them - more as a usability issue than an SEO issue.

John wrote:

If you find things like this, I'd fix it primarily for your users, so that they're able to use your site completely. I wouldn't treat this as something that you'd need to do for SEO purposes on your site, it's really more like other regular maintenance that you might do for your users.


That being said, if GoogleBot only hits broken links on your site and your internal navigation is all broken, it may have a problem indexing and ranking your web site.
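If you want to find broken links yourself rather than wait for users to hit them, a minimal starting point (my own sketch, not anything from the thread) is to collect every href on a page with Python's standard library; each collected URL could then be checked with a HEAD request:

```python
# Minimal broken-link audit sketch: gather hrefs from a page's HTML.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href value found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) tuples for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<p><a href="/about">About</a> <a href="/old-page">Old</a></p>'
collector = LinkCollector()
collector.feed(html)
print(collector.links)  # each of these URLs would then get a HEAD request
```

Crawling your own internal navigation this way surfaces the 404s that matter most for usability.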

Forum discussion at Google Webmaster Help.

Thursday, September 25, 2014

Only 11% Say Negative SEO Never Works In Google

Back in June we asked you guys to fill out a poll asking if you've tried negative SEO and if it worked.

We have over 300 responses to our poll and I wanted to share the results with you.

  • 53% said It Works; I Was Able To Negatively Impact A Third-Party Site
  • 36% said It Sometimes Works
  • 11% said It Does Not Work, I Was Not Able To Negatively Impact A Third-Party Site

Here is the fancy pie chart:
Negative SEO Works Poll Results

Disclaimer: Please see my poll disclaimer post before coming to any conclusions on these results.

Forum discussion at WebmasterWorld.

Panda 4.1 — Google’s 27th Panda Update — Is Rolling Out

Google has announced that the latest version of its Panda Update — a filter designed to penalize “thin” or poor content from ranking well — has been released.

Google said in a post on Google+ that a “slow rollout” began earlier this week and will continue into next week, before being complete. Google said that depending on location, about 3%-to-5% of search queries will be affected.

Anything different about this latest release? Google says it’s supposed to be more precise and will allow more high-quality small and medium-sized sites to rank better. From the post:

Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice.

New Chance For Some; New Penalty For Others

The rollout means anyone who was penalized by Panda in the last update has a chance to emerge, if they made the right changes. So if you were hit by Panda, made alterations to your site, you’ll know by the end of next week if those were good enough, if you see an increase in traffic.
The rollout also means that new sites not previously hit by Panda might get impacted. If you’ve seen a sudden traffic drop from Google this week, or note one in the coming days, then this latest Panda Update is likely to blame.

About That Number

Why are we calling it Panda 4.1? Well, Google itself called the last one Panda 4.0 and deemed it a major update. This isn’t as big of a change, so we’re going with Panda 4.1.

We actually prefer to number these updates in the order that they've happened, because trying to determine whether something is a "major" or "minor" Panda update is imprecise and leads to numbering absurdities like having a Panda 3.92 update.

But since Google called the last one Panda 4.0, we went with that name - and we'll continue with the old-fashioned numbering system unless it gets absurd again.

For the record, here’s the list of confirmed Panda Updates, with some of the major changes called out with their AKA (also known as) names:
  1. Panda 1.0, Feb. 24, 2011 (11.8% of queries; announced; English in US only)
  2. Panda 2.0, April 11, 2011 (2% of queries; announced; rolled out in English internationally)
  3. Panda Update 3, May 10, 2011 (no change given; confirmed, not announced)
  4. Panda Update 4, June 16, 2011 (no change given; confirmed, not announced)
  5. Panda Update 5, July 23, 2011 (no change given; confirmed, not announced)
  6. Panda Update 6, Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
  7. Panda Update 7, Sept. 28, 2011 (no change given; confirmed, not announced)
  8. Panda 3.0, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
  9. Panda Update 9, Nov. 18, 2011: (less than 1% of queries; announced)
  10. Panda Update 10, Jan. 18, 2012 (no change given; confirmed, not announced)
  11. Panda Update 11, Feb. 27, 2012 (no change given; announced)
  12. Panda Update 12, March 23, 2012 (about 1.6% of queries impacted; announced)
  13. Panda Update 13, April 19, 2012 (no change given; belatedly revealed)
  14. Panda Update 14, April 27, 2012: (no change given; confirmed; first update within days of another)
  15. Panda Update 15, June 9, 2012: (1% of queries; belatedly announced)
  16. Panda Update 16, June 25, 2012: (about 1% of queries; announced)
  17. Panda Update 17, July 24, 2012: (about 1% of queries; announced)
  18. Panda Update 18, Aug. 20, 2012: (about 1% of queries; belatedly announced)
  19. Panda Update 19, Sept. 18, 2012: (less than 0.7% of queries; announced)
  20. Panda Update 20, Sept. 27, 2012 (2.4% of English queries impacted; belatedly announced)
  21. Panda Update 21, Nov. 5, 2012 (1.1% of English-language queries in US; 0.4% worldwide; confirmed, not announced)
  22. Panda Update 22, Nov. 21, 2012 (0.8% of English queries were affected; confirmed, not announced)
  23. Panda Update 23, Dec. 21, 2012 (1.3% of English queries were affected; confirmed, announced)
  24. Panda Update 24, Jan. 22, 2013 (1.2% of English queries were affected; confirmed, announced)
  25. Panda Update 25, March 15, 2013 (confirmed as coming; not confirmed as having happened)
  26. Panda 4.0, May 20, 2014 (7.5% of English queries were affected; confirmed, announced)
  27. Panda 4.1, Sept. 25, 2014 (3-5% of queries were affected; confirmed, announced)
The latest update comes four months after the last, which suggests that this might be a new quarterly cycle that we’re on. Panda had been updated on a roughly monthly basis during 2012. In 2013, most of the year saw no update at all.


Of course, there could have been unannounced releases of Panda that have happened. The list above is only for those that have been confirmed by Google.

Google's Matt Cutts Taunts Black Hat SEOs

Yesterday we reported about Google targeting PBNs with massive manual actions. The funny thing is, Google's PR team would not confirm or deny it, but Google's Matt Cutts, who is currently on an extended vacation, took time out to taunt and mock black hat SEOs.

Keep in mind, this is part of Google's strategy: they want to break the spirits of black hat SEOs.

Matt Cutts wrote on Twitter "Blackhat SEO fads: like walking into a dark alley, packed with used car salesmen, who won't show you their cars." He references the story I wrote on Search Engine Land on the PBNs.



It seems like Matt is almost ready to come out of early retirement, or his extended vacation, and get back to work. This tweet from Matt came a couple of hours before he said he still searches for spammy queries these days.

Forum discussion at Twitter.

Wednesday, September 24, 2014

Report: Google Severely Hits PBNs (Private Blog Networks)

The black hat and gray SEO communities are buzzing about Google's latest efforts to target PBNs, Private Blog Networks.

Over the past couple of days, Google has reportedly gone after many PBNs that are being used to manipulate its rankings. Google has sent out hundreds, if not thousands, of manual action notifications to those participating in these networks.

Bill Lambert posted a comment on this blog saying there is a "massive private blog network update going on." He added this is a "complete slaughter fest."

BlackHatWorld has comments from some of those who run PBNs, obviously trying to downplay the impact, but they do admit it is going on.

Alex from one PBN wrote in the thread:

Yes, I'm aware of the current wave of PBN de-indexation. I have quite a few friends in this industry who got totally destroyed and I also know folks with the shittiest possible PBN content ever out there whose networks are totally intact. Even though there is no doubt that Google is taking serious action against webmasters who are hitting it big with PBNs, either promoting their own money sites or running services, I don't believe the end of PBNs is somewhere near. NOT EVEN CLOSE.


Greg Nunan has a more detailed blog post on the de-indexing of these PBNs. He said many sites are receiving "thin content" warnings via Google Webmaster Tools.


Have you been hit by this?

Google has not confirmed or denied these reports but they likely would not confirm or deny them if I asked.

Forum discussion at Black Hat World.

Update: Here is another site hit by this manual action.

Sunday, September 21, 2014

Google: You Don't Rank Well Because GoogleBot Is Not Impressed

I am sure you have seen it at least a hundred times: a person complaining their site doesn't rank well in Google. You look at the site and say, well, there are thousands of other sites competing for the same terms, and your website has nothing unique or value-adding compared to the other 1,000 websites.

Google's John Mueller responded in the same tone to an adult (NSFW) site owner who complained about some issues in the Google Webmaster Help forums.

John said your site "doesn't impress Googlebot much" because it is "essentially the same as so many other sites that just reuse video feeds."

Here is what John wrote:

The bigger problem is that your site is essentially the same as so many other sites that just reuse video feeds. That doesn't impress Googlebot much, so while I understand it's not trivial, I think in the long run it would be really helpful to just make sure that you offer something that users explicitly want to see from your site, something that they'll recommend over all of the other, similar sites, something unique, compelling and of high quality. That isn't something which can be fixed with a meta-tag, so instead of spending too much time on the technicalities of language-detection, I'd take a step back and think about what you could do to make something much more significant.


I run into this so often: why just squeeze things out of an ordinary site? Why not go big and do better? But thinking outside of the box is just way too hard for most of these business owners.

Forum discussion at Google Webmaster Help.

Tuesday, September 16, 2014

Google AdWords Ad Preview & Diagnosis Tool Technical Issues

There are confirmed reports at Google AdWords Help forums of a bug with the Google AdWords Ad Preview and Diagnosis tools.

Some are reporting that when testing localization within the tool, it shows that the ads are not running when they are indeed running.

Dan S, an AdWords advertiser wrote:

I'm having a similar problem with the Ad Preview Tool but the ads themselves seem to still be running. Keyword Diagnosis gives same result as bubble and preview tool.


Google's Kathleen confirmed the issue and said the Google team is looking into it. Kathleen wrote:

Thanks for bringing this to our attention. This issue has been detected by our technical team and they are currently looking into this. I'll update this thread when I have more information!

Forum discussion at Google AdWords Help.

Google Malware Notice Only On Organic Result?

Colin McDermott posted on Twitter that Google shows a malware warning on the organic free search result for a site but not on the paid search ad.

Here is a screen shot that I was able to replicate:

Google Malware Notice Only On Organic Result

Funny how on the same exact search results page, the malware warning only shows for the organic result and not the paid search ad. Which one is more delayed? The paid result or the organic result?

You'd think Google would run both through the same security checks?

The site was hacked a few days ago, but seems to be okay now.

But I guess not.

Forum discussion at Twitter.

Google Shares Common Webmaster Issues With SearchAction Schema & Sitelinks Search

Google Sitelinks Search Box

About ten days ago, Google revamped their search-within-a-site feature and renamed it Sitelinks search. They also gave webmasters a way to communicate that they have an internal search engine that should be used with the Sitelinks search box.

We already discussed some myths around it but now Google's Mariya Moeva shared some of the more common technical implementation issues they've seen when webmasters have used the SearchAction schema markup.

These issues include:

(1) If you replace the curly braces and all that's inside of it with a search term, it'll lead to a valid URL on your site. For instance, if your "target" value is "http://www.example.com/search?q={searchTerm}", please make sure that "http://www.example.com/search?q=foo" and "http://www.example.com/search?q=bar" both lead to search result pages about "foo" and "bar".

(2) The "query-input" field must point to the string that's inside the curly braces in the "target" field. For example, if your "target" value is "http://www.example.com/search?q={searchTerm}", you must use "searchTerm" as the "name" within "query-input".

Don't make these two basic but common mistakes.
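Putting the two rules quoted above together, a correct implementation looks roughly like the markup sketch below (example.com, the /search path, and the q parameter are the placeholder values from those examples, not any real site's):

```json
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "url": "http://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "http://www.example.com/search?q={searchTerm}",
    "query-input": "required name=searchTerm"
  }
}
```

Note how the string inside the curly braces in "target" (searchTerm) matches the name in "query-input", and substituting a real query for {searchTerm} yields a working search URL.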

Forum discussion at Google+.

Google Now Using Lakh Numbering Format For Estimated Search Results

One of our Indian readers from Arindam Inc. sent me a screen shot via email of Google now using the Lakh numbering format for the estimated search results count.

A lakh or lac is a unit in the South Asian numbering system equal to one hundred thousand (100,000; scientific notation: 10^5). In the Indian numbering system, it is written as 1,00,000. It is widely used in both official and other contexts in Bangladesh, India, Myanmar, Nepal, Pakistan, and Sri Lanka, and is often used in Indian, Pakistani, and Sri Lankan English.
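The grouping rule itself is simple: the last three digits form one group, and every higher group has two digits. Here is a small illustrative sketch in Python (my own helper, not anything Google exposes):

```python
def lakh_format(n: int) -> str:
    """Format a non-negative integer with Indian digit grouping:
    the last three digits, then groups of two (1,00,000 = one lakh)."""
    s = str(n)
    if len(s) <= 3:
        return s
    head, tail = s[:-3], s[-3:]
    groups = []
    # Peel off two-digit groups from the right of the remaining head
    while len(head) > 2:
        groups.insert(0, head[-2:])
        head = head[:-2]
    groups.insert(0, head)
    return ",".join(groups + [tail])

print(lakh_format(100000))    # 1,00,000  (one lakh)
print(lakh_format(12345678))  # 1,23,45,678
```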

Here is a screen shot of how it looks in Google India for [weather kolkata]:

Google Using Lakh Number Format For Number Of Results

Here is the same result on Google.com, with the standard number format:


I believe this is a pretty new change that is now available on Google India and other localized properties.

Forum discussion at Google+.

Friday, September 12, 2014

Google: Penguin 3.0 Will Most Likely Launch In 2014

Google's John Mueller announced in a video hangout on Google+ this morning that the third version of Penguin will launch "in the reasonable future," and he expects it will happen within 2014.

The question was raised by @edwardjohnnash, "Will the Penguin 3.0 launch in 2014?"

The answer John gave was:

My guess is yes. But as always there are always things that can happen between. I am pretty confident we will have something in the reasonable future. But not today, so we will definitely let you know when it is happening.


He answers this at 54 minutes and 45 seconds into the video. Here is the embed:


Hat tip to James Hale @Jameshale21 and @ChrisLDyson for the heads up on this.

And we do know the next version of Penguin will be designed to run faster and more frequently.

Forum discussion at Google+.

Google Processes The Disavow File Continuously

For many SEOs and Webmasters, this is probably not news, but I like to document things like this in any case.

The disavow file - the file you use to tell Google which links you do not want pointing to your site - is a file that Google "continuously" processes.
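As a refresher, the disavow file is plain text with one entry per line, either a full URL or a domain: directive, plus optional # comments. A minimal sketch (the domains here are placeholders):

```text
# Links from these sources were not authorized by us
domain:spammy-directory.example
http://another-site.example/page-with-bad-link.html
```

Domain-level entries disavow every link from that domain, while URL entries cover only the single page.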

John Mueller, who works at Google and whom we quote several times a day here, said this on Twitter in response to a question posed by @WholesaleClear and @Marie_Haynes:



This was in response to a question about migrating to HTTPS and moving the disavow file there. Since the file is continuously processed, moving the file to HTTPS would kick in pretty quickly as well. Of course, with Penguin, the disavow won't take effect until Penguin is refreshed, which should happen within the year.

Thursday, September 11, 2014

Google Robots.txt File Updated & Google Stops Indexing Itself

Yesterday we reported that Google was indexing itself. Google caught wind of it, and Gary from Google commented on my Google+ post telling me they were looking into it.

Last night at around 6pm EDT, Google updated their robots.txt file to block the search results from being indexed.
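The directive involved is simple; blocking a search-results path in robots.txt looks roughly like this (an illustrative sketch, not a copy of Google's actual file):

```text
User-agent: *
Disallow: /search
```

Any path starting with /search is then off-limits to compliant crawlers, which keeps a site's own search result pages out of the index.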

Hacker News picked up on it as well, and yesterday before 6pm, it looked like this:


Now it looks like this:


Even Google makes mistakes I guess.

Forum discussion at Hacker News and Google+.

Google: We're Working On A Solution To Refresh Penguin Faster

We know webmasters, especially those impacted by Penguin 2.1 are getting really anxious. It has been 11 months and 6 days since the last Google Penguin refresh. That is a long time for a site.

But Google understand that but has told us it is complex to refresh Penguin and it is taking them time.

Google confirmed a few newish items around Penguin and what the engineers at Google are working on with the algorithm this week.

In a video hangout on Google+ with Google's John Mueller, John shared they are working on speeding up the algorithm so it refreshes faster. John also admitted that Google's algorithms don't always reflect the changes webmasters make to improve their sites in a "reasonable time," specifically around Penguin.

At 34 minutes and 32 seconds into the video, John said:

(1) Google is working on a "solution that generally refreshes faster" specifically talking about Penguin.

(2) He said "we are trying to speed things up" around Penguin.

(3) He also admitted that "our algorithms don't reflect that [webmasters efforts to clean up the issues around their sites being impacted by Penguin] in a reasonable time."

Here is the video:


Forum discussion at Google+.

Wednesday, September 10, 2014

Google: Even After An Algorithm Update, Your Ranking Changes May Not Be Immediately Visible

In response to our story on Google confirming a Penguin refresh is needed to recover from Penguin, Google's John Mueller added more context to his answer in the Google Webmaster Help thread.

In short, he made a few points:

(1) Even when Google runs an algorithm update or refresh, the "changes aren't immediately visible even after a refresh, that's normal," John said.

(2) A single site is "never in a void alone with just a single algorithm," many algorithms impact a single site.

(3) There are "some cases where a site is strongly affected by a single algorithm," but John added that "that doesn't mean that it won't see any changes until that algorithm or its data is refreshed."

(4) In a "theoretical void of just your site and a single algorithm," John said, "you'd need to wait for the algorithm and/or its data to refresh to see any changes based on the new situation."

In summary, John is explaining that ranking changes happen to sites impacted by Penguin as well, because a site impacted by Penguin is likely also affected by other algorithms. If you clean up the Penguin side, you are likely also cleaning up issues with other algorithms that may be impacting your site negatively or positively. Thus you may see changes, slowly, even while under the Penguin hit, until Google releases the full Penguin refresh.

Here is the full post by John, I recommend you read it a couple times:

Let me try a longer answer :-)
In theory: If a site is affected by any specific algorithm or its data, and it fixes the issue that led to that situation, then the algorithm and/or its data must be refreshed in order to see those changes. Sometimes those changes aren't immediately visible even after a refresh, that's normal too.
In practice, a site is never in a void alone with just a single algorithm. We use over 200 factors in crawling, indexing, and ranking. While there are some cases where a site is strongly affected by a single algorithm, that doesn't mean that it won't see any changes until that algorithm or its data is refreshed. For example, if a site is strongly affected by a web-spam algorithm, and you resolve all of those web-spam issues and work to make your site fantastic, you're likely to see changes in search even before that algorithm or its data is refreshed. Some of those effects might be directly related to the changes you made (other algorithms finding that your site is really much better), some of them might be more indirect (users loving your updated site and recommending it to others).
So yes, in a theoretical void of just your site and a single algorithm (and of course such a void doesn't really exist!), you'd need to wait for the algorithm and/or its data to refresh to see any changes based on the new situation. In practice, however, things are much more involved, and improvements that you make (especially significant ones) are likely to have visible effects even outside of that single algorithm. One part that helps to keep in mind here is that you shouldn't be focusing on individual factors of individual algorithms, it makes much more sense to focus on your site overall -- cleaning up individual issues, but not assuming that these are the only aspects worth working on.
All that said, we do realize that it would be great if we could speed the refresh-cycle of some of these algorithms up a bit, and I know the team is working on that. I know it can be frustrating to not see changes after spending a lot of time to improve things. In the meantime, I'd really recommend - as above - not focusing on any specific aspect of an algorithm, and instead making sure that your site is (or becomes) the absolute best of its kind by far.
Cheers John

The Google algorithm(s) is one big melting pot of zeros and ones.

Forum discussion at Google Webmaster Help.

Google Panda Update On September 5th

On Friday, I saw dozens of threads in the Google Webmaster Help forums with people complaining about major traffic losses. But the automated tracking tools from Moz, SearchMetrics, Algoroo, SERPS.com and others didn't really show any signs. Plus, places like WebmasterWorld were fairly calm, with only the occasional person complaining.

So at first, I thought it was one of those mass manual actions that impact some networks of sites. But I seem to be wrong. It looks like a Panda refresh, and a large one at that.

Why do I think so?

(1) One person who complained received a response from Google's John Mueller this morning in the Google Webmaster Help forums. The response is the templated response Googlers give to Panda victims:

I'd recommend making sure your website has unique, compelling, and high-quality content of its own -- not just content from other websites.


(2) Glenn Gabe reported that several of the clients he is helping with Panda issues saw recoveries on Friday.





He even shared screenshots of Analytics showing the recoveries:


Did you see any Panda sites recover or get slammed on Friday?

Forum discussion at Google Webmaster Help and Google+.

Google Indexing & Ranking Google Search Results

Google is pretty clear about not wanting search results listed in its own search results; it has been since at least 2007. So why is Google indexing and ranking its own search results?

Rob, a reader here, spotted this and had his friend Chris Dyson write a blog post on it. Chris shared it on Twitter, but here are screenshots of results I could personally replicate:

Here is the cached version of the page:

Dan Petrovic also found a case like this a few months ago.

Maybe that is why Gary locked up GoogleBot.

Forum discussion at Twitter.
