Thursday, May 29, 2014

Does Deleting Your Web History At Google Really Do Anything?

Delete Google Web History
Jacques Mattheij wrote a blog post named Google Web/Search History Disable Does Absolutely Nothing, basically claiming that when you hit that delete button, Google just hides the data from your Google Web History page but does nothing more.

Of course, this upset Google's Matt Cutts, who responded that Mattheij was wrong.

Matt posted on Hacker News:

Just a quick point. The article says "there is no guarantee whatsoever that google does anything except for changing what they display to you."
If you're on the page at https://history.google.com/history/ and click on the gear and then "Help" the page about deleting search history is at https://support.google.com/accounts/answer/465 and it says
"What happens to your history when it's deleted
When you delete items from your Web History, they are no longer associated with your Google Account. However, Google may store searches in a separate logs system to prevent spam and abuse and to improve our services."
The article claims that there's no guarantee that Google does anything other than change the display. Google actually does quite a bit of work to disassociate items from your Google account if/when you delete them.

He then explains why Google does keep the search data for 18 months, for spam prevention reasons. Matt said, "an easy example of preventing spam would be to detect and stop people trying to spam Google Suggest, which is based on queries that users do."

I am sure many of you will enjoy that last quote, because clearly there are a lot of Google Suggest spam attempts that Google watches over.

Either way, technically, there are likely ways to associate your searches back to you even if you delete them, like in the AOL example from 2006. But Google does do things to disassociate the queries from your Google account.

Google: In 2009 We Selected Only One Of Your Anchor Text When...

Back in 2008, Moz published a report claiming the first anchor text counts - meaning, if you have a web page that has two or more links going to the same page, but the anchor text of those links differ, Google would pick the first anchor text and ignore the ones that follow.

We covered that back in 2008 and then SEOs began debating the validity of the study. Later on Branko confirmed the claims, questioning the scientific nature of the other studies.

Now, Google's Matt Cutts comes out with a video saying that in 2009, Google may (which means did, imo) have picked one of the anchor texts and not counted the others. He did not say the first, but it is likely the first. First how? First in the source code, or first after skipping the navigation? That is unclear, but watch the video:


Either way, SEOs like to obsess about these things - Cutts says, don't.

It changes over time, and he clearly didn't check to see what Google does now, in 2014.

Forum discussion at Google+ and Twitter.

Wednesday, May 28, 2014

Google May Be Pushing Out A New Update: Possibly Penguin?

I am seeing very early signs of another possible Google update, based both on chatter at WebmasterWorld and in other forums and social media spaces. I am also getting private data shared with me by those who are tracking this closely.

The feeling is that Google is testing a new Penguin refresh and that it may be rolling out slowly. Or it might be something else. For context, we had the Panda 4.0 update last week, and the weekend before that (before Panda 4) we had a Spam algorithm update. Prior to that we reported on tons of ranking shifts, which Google denied despite all the signs of an update.

And now we are seeing something that rolled out yesterday and had big impacts on some sites.

Here are two graphs of sites impacted by a few things, including Penguin; notice the spikes in both over the past couple of days:


On the 26th, these sites, both without any manual actions, saw huge changes. This was well after the official Google updates.

Last night, Rae Hoffman posted on Twitter that she was seeing something going on.


Patrick Gavin, an old SEO and founder of Text Link Ads back in the day, thinks it is Penguin related:


In the updated WebmasterWorld thread, we have users saying:

I think something is rolling out again

I'm seeing a lot of international traffic suddenly.

Yep I recovered all traffic last week and im seeing a traffic drop off now over the last 2-3 hours, hope its not a roll back.


I should note, most of the tracking tools, such as Mozcast, Algoroo, SERPs.com and SearchMetrics, are not showing major signs of an update. Keep in mind, Penguin updates mostly impact heavily SEOed sites, whereas Panda is felt more by those tools.

Have you seen any change in the past couple days with your Google traffic?

Google: Adult Web Sites Are Not Automatically Spammy Web Sites

Some folks in the search space feel that adult sites, pornographic sites, are automatically considered to be spammy by Google. Truth is, while there is a ton of spam in those industries, not all adult sites are spammy.

Google's John Mueller said so in a Stack Exchange thread. He wrote:

Also, "adult" doesn't necessarily mean that it's spammy.


Google has given advice in the past on how to build quality adult sites. There was a time when many adult sites were penalized, but Google reversed that penalty.

That being said, many adult sites are spammy because the adult industry is very competitive. But Google has been targeting that spam with their Spam algorithm.

Forum discussion at Stack Exchange.

Google Webmaster Tools Adds Blocked Resources Debugging With Fetch & Render

As expected, Google has introduced a new tool to help webmasters determine what JavaScript, CSS, and other resources they are blocking from GoogleBot's crawl.

Google announced it yesterday, as a new feature in the Fetch as Google tool in Webmaster Tools, which now has Fetch & Render:

Google Fetch & Render

Clicking on it will give webmasters the option to specify how Google should fetch it, i.e. desktop, smartphone or feature phones. Then it will show you a visual representation of what Google renders and more importantly, at the bottom, it will show you which resources you are blocking GoogleBot from crawling.

The RustyBrick desktop version of our site rendered:


The RustyBrick mobile version of our site rendered:


Notice what is being blocked: all Google resources. :)

Now, the old version is still available under the other "fetch" tab and that is very useful as well.
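If you want a quick sanity check outside of Webmaster Tools, you can test whether GoogleBot is allowed to crawl a given resource per your robots.txt. Here is a minimal sketch, assuming Python and made-up example URLs (my own illustration, not a Google tool):

# Minimal sketch: check whether GoogleBot may crawl specific resources
# according to robots.txt. The site and resource URLs are made-up examples.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
RESOURCES = [
    "https://www.example.com/css/site.css",
    "https://www.example.com/js/app.js",
]

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for url in RESOURCES:
    allowed = parser.can_fetch("Googlebot", url)
    print(("crawlable" if allowed else "BLOCKED") + ": " + url)

Note this only checks robots.txt rules; the Fetch & Render report remains the better view of what GoogleBot actually sees.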

Tuesday, May 27, 2014

Law Firm Sues Their SEO Company For Using Spammy Tactics

Google's Matt Cutts posted on Twitter a link to a story by Eric Goldman (a guy who knows his legal SEO stuff) about how a law firm sued their SEO company, not necessarily for failing to achieve the rankings they wanted, but rather for violating Google's webmaster guidelines and/or using spammy techniques.

Matt Cutts of Google called the claims "interesting." Indeed. You often hear of people suing SEO firms for not getting what they paid for, but not for violating Google's guidelines. The court document reads, in part:

The action is based on the fact that, at the time that the Defendants were promoting this marketing scheme to the Victim Firms, they knew that the techniques they proposed to use were in violation of the guidelines already well-established and published by Google; knew that Google was moving rapidly to crack down on violators; knew that use of these techniques would not only fail to enhance the likelihood that the Victim Firms would rise in Google’s rankings but would actually be downgraded to the point where the websites being used by the Victim Firms would become “contaminated” for search engine purposes; knew that they intended to use automated programs rather than direct personal effort to create the appearance that links to the Victim Firms webpages (the key to rising in search engine rankings) were being generated in the numbers represented; and knew that they intended to cloak their schemes in allegations of “trade secrets” to avoid the balance of the scheme from coming to light.


Greg Sterling at Search Engine Land asks some interesting what-ifs:

(1) Will SEO firms that go outside the bounds of established “white hat” SEO practice be automatically vulnerable to liability?

(2) Will the court limit liability in cases where the plaintiff has not done any “due diligence” on the SEO practitioner? In other words, what burden does a buyer of SEO services have to investigate the SEO firm? (Probably none.)

(3) What damages might be assessed in situations where a ranking penalty has occurred? (e.g., fees paid, lost revenue?)

(4) What might be recoverable when there is no Google ranking penalty?

Lesson for SEO firms working with law firms - be incredibly careful or just don't do SEO work for lawyers. :)

Also, if the law firm does win in court on the basis of a violation of Google's guidelines, I assume that will give Google's guidelines a bit more clout and make them not just guidelines but maybe even "the law" in some sense. Which can be very scary.

Forum discussion at Twitter.

Sunday, May 25, 2014

SEO: I Got Comfortable & It Resulted In Being Hit By Google's Panda Filter

The ongoing thread about Google Panda 4.0 at WebmasterWorld has an interesting post from someone who was hit by Panda a while back and is slowly recovering.

In short, he blames himself. He said he got comfortable and lazy and didn't invest in his site because it was doing well. Then one day, Panda came along and took it all away from him, but he blamed himself for sitting back and not innovating, thinking it would last forever.

It is a pretty amazing post and, honestly, a nice read. Let me share it here, but he shares more feedback in the thread:

We're up 100% (Google traffic) on one site and 75% on another since Monday. That doesn't come close to closing the gap that Panda and Penguin created over the last 3 years but it helps.

The site with a 100% increase (from 600 daily visitors to 1,200) was completely rebuilt. We've actually seen a gradual 1,200% increase since early this year with various other updates. As of today, that's 2,400% traffic growth since January.

The main work done was: [snip]
We pretty much did everything we could think of that would push the needle in the right direction. We spent over $100,000 on the rebuild not to mention thousands of man hours.

On the site with a 75% traffic increase we did much of the same however it was newer and needed less of an overhaul.

Frankly, after all of our work and frustration for years now not seeing an increase, always dreaming of the day when Google would come back around, I still have high expectations for growth to come. Back in 2011, we had 7,000 users a day coming from Google that eventually slipped to 50 early this year. We're back to 1,200 but we'll need to double three more times to attain our former glory.

I will say this about Panda/Penguin and Google's algo improvements in general. I was the first to complain about how Google almost destroyed my business. It has been very hard. We have laid off numerous employees and lost more advertising customers than I can count. It's cost us more than one potential acquirer. Safe to say that Panda/Penguin has cost me personally millions of dollars in the last 3 years.

And I believe, Google did the right thing with these improvements. I got comfortable and when you get comfortable you can get lazy. I got lazy. I didn't innovate my sites. I didn't improve the content like I should have. I didn't police the scrapers and copyright thieves to protect my content. I didn't employ the newest, fastest technology to make sure my users had the best possible experience. I didn't do these things because we were getting $10,000's of AdSense revenue a month without making any investments. I thought that would never change. Why invest in the health of the cash cow if it just keeps producing milk every single month with only minimal care?

Shame on me and every other webmaster that got comfortable. It's like a marriage. You don't get to stop working at impressing your spouse once you're married. You have to keep working at it or one day, you'll wake up and they'll be gone.

I will never let this happen again to my business. I have learned a valuable lesson that I hope I get to use to my advantage before the bankers come beating down my door.


Just a reminder to take my Panda 4.0 poll if you have not.

Wednesday, May 21, 2014

Is eBay A Big Loser In Google’s Panda 4.0 Update? — Winners & Losers Data

Yesterday, Google began rolling out their Panda 4.0 update, designed to penalize low-quality content. That has generated both "winners" that have moved up in rankings and "losers" that have dropped down, and eBay might be one of the big losers.

Searchmetrics gave us their initial winners and losers charts, based on rankings they continually monitor. These show that one of the biggest losers was eBay. According to the data, eBay lost a tremendous amount of traffic from Google, much of it from the ebay.com/bhp/ area of its site.

Another huge loser was Ask.com (yes, the search engine), which lost a tremendous amount of traffic in their Questions section at ask.com/question/.

The Losers: Ask.com, eBay, Biography.com & Google-Backed RetailMeNot

The top losers include Ask.com, ebay.com, biography.com and retailmenot.com. I should note, retailmenot.com is backed by Google's venture arm. Here is the top list of losers from the Searchmetrics initial analysis:

Searchmetrics Panda 4.0 losers chart

Here is a Searchmetrics chart showing eBay's UK drop, comparing their main root domain versus the directory:

ebay.co.uk in Google UK

Dr. Peter Meyers from Moz also documented, using Moz's analytics, how much eBay lost with this update. Pete said, "over the course of about three days, eBay fell from #6 in our Big 10 to #25." Meyers digs deep into the analysis on the Moz blog.


Refugeeks looked at early SEMRush data, which also showed a steep decline for eBay's web site in Google. Note, SEMRush will be sending me more data as they work it up at their office. Here is a chart from the UK data:

ebay.com keywords lost, Google UK data

The Winners

With all algorithm updates, there are also those who win and gain rankings. The big winners seem to be glassdoor.com, emedicinehealth.com, medterms.com, yourdictionary.com and shopstyle.com.

Searchmetrics Panda 4.0 winners chart

The Searchmetrics data is sorted by the increase in SEO visibility in absolute numbers, with the relative percentage change also shown.

Only Losers Really Know If They Lost

As we said with the Panda 3.5 Winners & Losers report, lists like this aren’t perfect. The sites above may have had gains and drops for other reasons; less visibility this week because last week they were visible for different news stories, for example.



It’s also worth remembering that this is a sample of search terms. The only way to really know if any update has hurt or helped you is to look at your search-driven traffic from Google, rather than particular rankings or lists like this, which have become popular after Google updates. If you’ve seen a significant increase, you’ve probably been rewarded by it. A big decrease? Then you were probably hit.

Google Spam Algorithm Version 2.0 Released Over Weekend

As I broke last night, Google released an update to the Payday Loan Algorithm, also known as the Google Spam Algorithm, over the weekend.

No, I was not crazy for expecting something big: things happened over the weekend, throughout the month, and this week. In fact, Google also started rolling out Panda 4.0 yesterday, which I will talk about a bit later.

The Spam Algorithm specifically targets spammy queries such as payday loan, viagra, and the related types of queries that spammers target heavily.

Matt Cutts described the algorithm before they launched it initially a year ago in this video:


This update noticeably impacted about 0.2% of English queries, Google's Matt Cutts told me.

The previous one impacted English queries by 0.3% but was much more noticeable outside of the English language.

There is no doubt that black hats took notice; a BlackHatWorld thread has a bunch of them talking about how "Google is getting better" and noting drops in their rankings.

Anyway, even though Google said nothing is going on, we've been seeing signs of major changes and ranking shifts all throughout the month. I suspect those were tests for both this Google Spam Algorithm version 2.0 and the Panda 4.0 release.


Google AdSense Publishers Receive Personally Identifiable Information (PII) Breach Notifications

A Google AdSense Help thread has well over a hundred posts from AdSense publishers complaining they received a serious notification from Google that they are in "breach" of "passing personally identifiable information (PII) to Google."

This email was sent from the Google Policy Team and reads:

Dear customer:
It has come to our attention that you are passing personally identifiable information (PII) to Google through your use of one or more of Google's advertising products -- DFP, AdSense, and/or Doubleclick AdExchange.
Our systems have detected PII, including email addresses and/or passwords, being passed from each of the domain names below. We have also included below an example of an ad request that we received from your account (from which the PII detected has been redacted).
Our contracts and policies prohibit information being passed to us that we could use or recognize as PII. Sending us PII has put you in breach of those terms.
You should review your implementation of Google tags on your pages, including whether PII of any nature may feature in the URLs of such pages.
Please give this matter your immediate attention. You should submit your response in this form.
If you fail to achieve compliance with your contract within 30 days we may disable ad serving on your account(s). If you fail to submit any response within 14 days, access to your account will be suspended.
Domain names at issue:

Again, there are a tremendous number of AdSense publishers complaining they received this violation notice and are clueless as to what to do.

Despite there being hundreds of posts in the thread, not a single Google representative has responded about the issue since it was posted on May 19th.

This may just be Google starting to enforce a policy they have had in place for a while, but is this the first time they are enforcing it?

Again, Google needs to chime in and help these publishers.
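In the meantime, publishers trying to figure out where the PII might be coming from could start by scanning their page URLs, since email addresses sometimes end up in query strings and then get passed along in ad requests. Here is a rough sketch, assuming Python and hypothetical URLs pulled from your logs (my own illustration, not anything from Google):

# Rough sketch: flag page URLs whose query strings contain email-like values,
# one common way PII ends up being passed along in ad requests.
# The URLs below are hypothetical examples.
import re
from urllib.parse import urlparse, parse_qs

EMAIL_RE = re.compile(r"[^@\s/&=?]+@[^@\s/&=?]+\.[A-Za-z]{2,}")

urls_from_logs = [
    "https://www.example.com/thanks?email=jane.doe%40example.org",
    "https://www.example.com/article?id=123",
]

for url in urls_from_logs:
    query = urlparse(url).query
    for key, values in parse_qs(query).items():
        for value in values:
            if EMAIL_RE.search(value):
                print("possible PII in '" + key + "' parameter: " + url)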

Tuesday, May 20, 2014

Google Begins Rolling Out Panda 4.0 Now

Google’s Matt Cutts announced on Twitter that they have released version 4.0 of the Google Panda algorithm.

Google’s Panda algorithm is designed to prevent sites with poor quality content from working their way into Google’s top search results.

But didn’t Google stop updating us on Panda refreshes and updates since they are monthly rolling updates? Yes, but this is a bigger update.

Panda 4.0 must be a major update to the actual algorithm versus just a data refresh. Meaning, Google has made changes to how Panda identifies sites and has released a new version of the algorithm today.

Is this the softer and gentler Panda algorithm? From talking to Google, it sounds like this update will be gentler for some sites, and lay the groundwork for future changes in that direction.

Google told us that Panda 4.0 affects different languages to different degrees. In English for example, the impact is ~7.5% of queries that are affected to a degree that a regular user might notice.

Here are the previous confirmed Panda updates. Note that we numbered them by each refresh and update, but 4.0 is how Google named this specific update:

  1. Panda Update 1, Feb. 24, 2011 (11.8% of queries; announced; English in US only)
  2. Panda Update 2, April 11, 2011 (2% of queries; announced; rolled out in English internationally)
  3. Panda Update 3, May 10, 2011 (no change given; confirmed, not announced)
  4. Panda Update 4, June 16, 2011 (no change given; confirmed, not announced)
  5. Panda Update 5, July 23, 2011 (no change given; confirmed, not announced)
  6. Panda Update 6, Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
  7. Panda Update 7, Sept. 28, 2011 (no change given; confirmed, not announced)
  8. Panda Update 8, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
  9. Panda Update 9, Nov. 18, 2011: (less than 1% of queries; announced)
  10. Panda Update 10, Jan. 18, 2012 (no change given; confirmed, not announced)
  11. Panda Update 11, Feb. 27, 2012 (no change given; announced)
  12. Panda Update 12, March 23, 2012 (about 1.6% of queries impacted; announced)
  13. Panda Update 13, April 19, 2012 (no change given; belatedly revealed)
  14. Panda Update 14, April 27, 2012: (no change given; confirmed; first update within days of another)
  15. Panda Update 15, June 9, 2012: (1% of queries; belatedly announced)
  16. Panda Update 16, June 25, 2012: (about 1% of queries; announced)
  17. Panda Update 17, July 24, 2012: (about 1% of queries; announced)
  18. Panda Update 18, Aug. 20, 2012: (about 1% of queries; belatedly announced)
  19. Panda Update 19, Sept. 18, 2012: (less than 0.7% of queries; announced)
  20. Panda Update 20, Sept. 27, 2012 (2.4% of English queries impacted; belatedly announced)
  21. Panda Update 21, Nov. 5, 2012 (1.1% of English-language queries in US; 0.4% worldwide; confirmed, not announced)
  22. Panda Update 22, Nov. 21, 2012 (0.8% of English queries were affected; confirmed, not announced)
  23. Panda Update 23, Dec. 21, 2012 (1.3% of English queries were affected; confirmed, announced)
  24. Panda Update 24, Jan. 22, 2013 (1.2% of English queries were affected; confirmed, announced)
  25. Panda Update 25, March 15, 2013 (confirmed as coming; not confirmed as having happened)

Google Algorithm & Ranking Shifts On Fire This Month

Despite Google saying nothing is going on, we've been seeing signs of major changes and reports of major changes in the Google search results and rankings.

Again, over the weekend I was getting private emails, and there is tons of public chatter in the WebmasterWorld and even BlackHatWorld forums, about Google making a lot of changes.

Some are suspecting a massive Penguin update is about to hit, while others think it might be a Panda refresh, and others think it is just Google's normal actions on link networks. It is almost impossible to tell without a confirmation from Google; for all we know, it could be all three or more.

Even the tracking tools are all over the place; it seems like it even broke MozCast. SERPs.com shows major activity on Saturday, as do SERP Metrics and Algoroo.


Here are some recent chatter comments from the forums:

Regardless of the weather anywhere overall my lowest-ever page views at 45.7% of 2014 average with the UK at 37.4% however fortunately not as bad as Friday at 26.5%, what a mess.

i did a small analysis.. i used two of my sites and ranked purely with GSA and 10 articles scraped and smashed one was used. it got hit.

for my one another site i used 5 unique articles, spun with wordai, smashed as 1 articles with GSA.. it got hit but only 4-9 position drops.. not like first one..

so its getting clear that its penguin 3.0

Saturday, May 17, 2014

Google: If Content Isn't Changed, Manual Actions Won't Be Removed

An interesting Google Webmaster Help thread has a webmaster who has a manual action; he said he removed the content and even left the page blank, but the manual action won't go away.

Google's John Mueller explained why in an interesting way. He wrote:

If there's no content on these pages, there's nothing that users would be missing by lifting a manual action. If you'd like your reconsideration request to be processed, you really need to first have unique, compelling, and high-quality content of your own on these pages (not just rewritten, spun, autogenerated, or otherwise reused content).


You see, he said "if there's no content on these pages, there's nothing that users would be missing by lifting a manual action."

That is true, but if the manual action was for the content, then once the content is gone, shouldn't the action be removed? That assumes the penalty here was for the content.

Anyway, I wanted to share this with you all because the response was pretty interesting.

Wednesday, May 14, 2014

Google's Matt Cutts: Anticipate The Query To Better Control Titles In Google

Google's Matt Cutts posted a video explaining why and when Google may use something other than your title tag for the search results title snippet.

Matt Cutts suggested that it is best for you to try to anticipate what the user will search for when crafting your title tags. When you do that, and the title matches the query, Google will likely show your title tag.

Google uses three criteria when determining if they should use your title tag:


(1) Something that is "relatively" short.

(2) A good description of the page and, "ideally," the site that the page is on.

(3) Relevance to the query.

If you fail on these criteria, then Google may instead use:

(1) content on your page,

(2) the anchor text of links pointing to the page, and/or

(3) the Open Directory Project.
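If you want to make those criteria concrete, here is a small illustrative sketch, assuming Python and made-up thresholds (my own illustration, not anything Google has published), that checks a proposed title against an anticipated query:

# Illustrative sketch only: check a proposed title tag against an anticipated
# query, loosely following the three criteria above. Thresholds are made up.
def title_check(title, anticipated_query, site_name):
    title_words = set(title.lower().split())
    query_words = set(anticipated_query.lower().split())
    overlap = len(title_words & query_words) / max(len(query_words), 1)
    return {
        "relatively_short": len(title) <= 65,        # made-up length cutoff
        "describes_site": site_name.lower() in title.lower(),
        "relevant_to_query": overlap >= 0.5,         # made-up overlap cutoff
    }

# Hypothetical title, query and site name.
print(title_check(
    "How To Fix A Running Toilet - Example Plumbing",
    "how to fix a running toilet",
    "Example Plumbing",
))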

Here is the video:



Google's Matt Cutts Looks Back & Shares His Webspam Regrets...

I rarely submit questions for Google's Matt Cutts, but I did submit one and he actually answered it in a video.

The question was: Matt, what are your regrets related to web spam? I specifically asked, "Was there a key moment in your spam fighting career where you made a mistake that you regret, related to spam?"

Matt said he regrets at least two things, maybe more, but he listed these two:

(1) Not acting faster on paid links that pass PageRank.

(2) Making the wrong decision early on about content farms.

Here is the video where he digs into each one:


I also summarized it in detail at Search Engine Land, so if you do not want to watch the video, you can read my summary there.

Here is a cleaned-up version of the automated transcript of the video:

Today we have a fun webmaster question from Barry Schwartz in New York. Barry asks, "Was there a key moment in your spam fighting career where you made a mistake that you regret, related to spam?" So he's not just talking about other sorts of choices or anything like that.

I can think of at least two mistakes, although there have probably been a half million related to spam. I remember talking to a very well-known SEO at a search conference in San Jose, probably seven years ago, give or take, and he said, you know what, paid links are just too prevalent, they're too common, there's no way that you guys will be able to crack down on them and enforce it, come up with good algorithms or take manual action, sort of put the genie back in the bottle, as he put it. And that was when I realized that we had made a mistake and that we had allowed paid links that pass PageRank to go a little bit too far and become a little bit too common. And so in the early days, 2005-2006, you'd see Google cracking down a lot more aggressively and taking a pretty hard line in our rhetoric about links that pass PageRank. At this point most people know that Google disapproves of it, it probably violates the Federal Trade Commission's guidelines, all those sorts of things. We now have algorithms that target it, we take spam reports about it, and so for the most part people realize that's not a good idea; if they do that, they might face the consequences. So for the most part people try to steer clear of paid links that pass PageRank at this point, but we probably waited too long before we started to take a strong stand on that particular issue.

Another mistake that I remember is around content farms. We were getting internal complaints where people said, look, this website or that website is really bad, it's just poor quality stuff, it's really low quality, it's a really poor user experience. And I had been to one particular page on one of those sites, because at one point my toilet was running and I was like, okay, how do you diagnose a toilet that keeps running, and I had gotten a good answer from that particular page. And I think I might have over-generalized a little bit and been like, no, no, there's pretty good quality content on these sites, because look at this one page that diagnoses why your toilet runs and how you fix that and all that sort of stuff. The mistake that I made was judging from that one anecdote and not doing larger-scale samples, or listening to the feedback, or, you know, looking at more pages on the site. And so I think it took us a little bit longer to realize that some of these lower quality sites, or content farms or whatever you want to call them, were sort of mass-creating pages rather than really solving users' needs with fantastic content. So as a result we did wake up to that, we started working on it months before it really became wide-scale in terms of complaints, but we probably could have been working on it even earlier.

Regardless, you know, we're always looking for good feedback, we're always looking at what we are missing and what we need to do to make our web results better quality. And so anytime we roll something out, there's always the question of, could you have thought of some way to stop that, to take better action, a more clever algorithm, and could you have done it sooner? Google does a lot of great work and that's very rewarding, and we feel like, okay, a lot of people have been working hard, and at the same time you always wonder, could you be doing something better, could you find a cleaner way to do it, a more elegant way to do it, something with higher precision or higher recall? And that's okay, you know, it's healthy for us to be asking ourselves that.

So, great question. Those are a couple of key moments that I remember where we made a mistake by not paying attention to a particular topic sooner. Hope that helps.

Google's Update From Last Week Reversing Itself?

On Friday I reported about a lot of chatter and signals of a possible update from Google coming down the pipe.

Well, as of this morning, it seems some are saying their rankings are back to where they were, for better or worse.

One person said in the comments:

My site is now back rankings. Maybe Google running test for new algorithms.

And another:

All the rankings have resumed :) Two hours ago the shuffle brought an improvement


Folks at the ongoing WebmasterWorld thread said similar things:

my website is back again on the first page, did some kind of update happened ?


The tracking tools are kind of all reporting normal patterns, although many won't show this chatter until tomorrow anyway.

Patrick Altoft documented a lot of ranking losses for big brands, but I assume by now most of those have returned. Note, SearchMetrics runs its analytics weekly, so those losses could have happened during the 24-48 hours of this Google change, which has now reverted itself?

Forum discussion at Twitter & WebmasterWorld.

Update: Google told us nothing is going on and there was no update.

Would A Domain Like appleandapples.com Trigger A Google Penalty?

Sometimes I see these unique questions in the SEO forums and I just have to share them.

This one comes from High Rankings Forums where a webmaster asks if he can use a singular and plural version of the same word, to be his brand name and domain name. For example, appleandapples.com would be the domain name and the company name would be Apple & Apples or something like that.

His concern: would it lead to a Google penalty?

Why the concern? I assume around Google's exact match domain penalty algorithm, the Google EMD update.

But truth is, if you have a nice combo name and you like it for your brand, go for it.

I doubt it can hurt you; will it help? I doubt that also. Go with a name you are proud of, but make sure it works for your customers.

Forum discussion at High Rankings Forums.

Matt Cutts Likes Duane Forrester's Example Of Good Links: Unknown Links

On Monday, we shared a bold example from Bing's Duane Forrester about what good links are versus bad links.

It was bold because Duane from Bing basically said the only really valuable links are the ones you don't know you are getting in advance. Meaning, the other links you get can maybe get you in trouble or not help you. So if you write a good article, maybe something a little better than this article, and you know you'll get links out of it, then that is bad? If you build a useful and fun tool and you know you'll get links out of it, maybe that is bad?

Of course not. But Duane is making a point. And I think we all get his point.

Google's Matt Cutts, the head spam man in the world of search, said on Twitter that he liked, or "enjoyed," the definition: that maybe pre-knowledge of a link makes it, by definition, an unnatural link.

Is Google endorsing the definition that an unnatural link is a link you know you will get before you get it? And does that mean knowing you are getting a link from a specific source X, or does it mean knowing that an effort will result in links in general? I'd assume source X would be the better definition.

Google: Nothing To See Here, No Update Going On

Despite all the signals, chatter and reporting showing that Google's search results appear to be updating as if there was an algorithmic change at Google, Google has told me that there is nothing going on.

Google said that they are not working on anything and there is nothing going on around this. They wouldn't comment more than that.

How can there be this much discussion and noise around a non-Google update? It has happened before. Either this centers around normal algorithmic shifts without a refresh or update, maybe around link penalties, or maybe Google is hiding something from us? I am not sure. I am just telling you that Google has denied this.

So if your site took a dive in the past week or so and you have no manual action, then it isn't real - it is fake. No, I am kidding, it obviously is real but we cannot attribute it to a confirmed Google update. In fact, it is Google saying there was no update.

I tried.

Forum discussion at WebmasterWorld.

Monday, May 12, 2014

Official Google Advice On Internationalizing Your Home Page

Google has published their official advice on the Google Webmaster Central blog on how to handle your home page when your web site serves multiple languages and countries.

Zineb Ait Bahajji and Gary Illyes, Google Webmaster Trends Analysts, wrote the post together trying to break out the possibilities into three categories:

(1) Having one home page that shows all users the same content, no matter their language or location.

(2) Bringing users to a landing page asking them to pick their desired home page.

(3) Automatically redirecting users to the proper home page based on various location/language detection techniques.

Google supports all these options but gives guidance in this post on how to handle it in all these cases.

Same Content For All Users

The first method is to have the .com serve the English version, the .fr the French version, the .co.il the Hebrew version, and so on. Each domain name serves its own version of your home page, no matter who accesses it. If someone lands on the .com but is not expecting the English version, you may want to show an overlay telling that user you have an alternative home page for them.

Landing Page For Users To Choose Version:

The next option is to send users to a country selector page on your homepage or generic URL that lets the user pick which content they want to see. If you implement this method, Google recommends you use the x-default rel-alternate-hreflang annotation for the country selector page. Google said the x-default value helps them recognize pages that are not specific to one language or region.
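As a quick illustration of those annotations, here is a small sketch, assuming Python and made-up example URLs, that prints the rel-alternate-hreflang link tags for each localized home page, with the country selector page marked as x-default:

# Sketch: print rel-alternate-hreflang link tags for each localized home page,
# plus x-default for the country selector page. URLs are made-up examples.
versions = {
    "en": "https://www.example.com/en/",
    "fr": "https://www.example.com/fr/",
    "he": "https://www.example.com/he/",
    "x-default": "https://www.example.com/",  # the country selector page
}

for lang, url in versions.items():
    print('<link rel="alternate" hreflang="' + lang + '" href="' + url + '" />')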

Dynamic Serving Based On Location/Language Settings:

The final option is to just send the user to the home page you think they want. You can determine this by detecting the location and language settings of the user, then using a server-side 302 redirect or dynamically serving the right HTML content. In this case you will want to use the x-default rel-alternate-hreflang annotation as well. Google highly recommends that when you do this, you offer the user a way out, to go to a different version, just in case you get it wrong or the user prefers a different language.
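As a sketch of that detection step, assuming Python, a made-up set of localized home pages, and a default fallback, you could pick the 302 redirect target from the Accept-Language header like this (a real implementation would issue the redirect from your server or framework, and should still link to the other versions so the user has a way out):

# Sketch: choose a 302 redirect target from the Accept-Language header.
# Home page paths and the default are made-up examples.
HOME_PAGES = {"en": "/en/", "fr": "/fr/", "he": "/he/"}
DEFAULT = "/en/"

def redirect_target(accept_language):
    # Parse entries like "fr-FR,fr;q=0.9,en;q=0.8" into (quality, language).
    prefs = []
    for part in accept_language.split(","):
        pieces = part.strip().split(";")
        lang = pieces[0].split("-")[0].lower()
        quality = float(pieces[1].split("=")[1]) if len(pieces) > 1 else 1.0
        prefs.append((quality, lang))
    for _, lang in sorted(prefs, reverse=True):
        if lang in HOME_PAGES:
            return HOME_PAGES[lang]
    return DEFAULT

print(redirect_target("fr-FR,fr;q=0.9,en;q=0.8"))  # -> /fr/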

Google recommends that all country and language versions of your pages:
  • Have rel-alternate-hreflang annotations.
  • Are accessible for Googlebot’s crawling and indexing: do not block the crawling or indexing of your localized pages.
  • Always allow users to switch local version or language: you can do that using a drop down menu for instance.

You can learn more about this all on the Google Webmaster Central blog.

Google’s Matt Cutts Regrets Not Acting Faster On Paid Links & Content Farms

In Google’s Matt Cutts’ latest video, he answers a question I personally asked about what decisions he regrets making in the past related to webspam. My question specifically was:

Was there a key moment in your spam fighting career where you made a mistake that you regret, related to spam?

Matt answered it in under four minutes, explaining that he regrets not acting sooner on (1) paid links and (2) content farms.

Google’s Paid Links Regret

Matt explained that several years ago at a search conference in San Jose, a well-known SEO told him that paid links were too common and there was no way for Google to fight against them. That is when Matt said he realized that Google had made a mistake and had allowed paid links that pass PageRank to go too far. So in 2005 or so, Google cracked down heavily on paid links, and at this point, Matt said, “most people” realize paid links are against Google’s guidelines and possibly against the FTC’s guidelines, and that Google has algorithms that fight against them as well as manual actions around paid links. But Matt regrets not taking action sooner and waiting too long.

Google’s Content Farms Regret

The second regret Matt admitted to was around not acting sooner on content farms. Matt Cutts explained that early on, he did get some user complaints about the horrible user experience some of these content farms had. But when Matt himself went to one of the sites based on a search on how to fix a toilet in his home, he felt the user experience was good. He said he “over generalized” based on that one example, when he should have looked at the site overall and not just one page.

Because of that over generalization, Google didn’t act as fast as they should have on content farms and thus it became more of an issue on the web and for Google to deal with. Here Matt is specifically talking about Panda.

Matt did say that Google does do a lot of “great work” and finds it “rewarding” on the whole. But at the same time, he said he always “wonders” if you could do better by acting one way or another.