Wednesday, July 30, 2014

Google Search Tool To Filter Private Content

Kenichi Suzuki, one of my favorite SEOs from Japan, posted a screen shot on Google+ of a search tool that filters results by private content shared with you.

Here is his screen shot:

Google Search Tool To Filter Private Content

I personally do not see this feature because I am a Google Apps user, but most Gmail users should be able to see it by conducting a search, selecting "search tools," and clicking on the "private" option in the filters.

This type of filter will show you content that was shared with you on Google+ or Gmail if you are signed in to your Google Account.

It is similar to the feature Google launched in 2013, where you could type "my photos" and so forth, but this is now an explicit search filter.

I don't think this is new, but I never see it myself since it is not enabled for Google Apps users by default.

Would Google Release A Penguin Update In The Summer? It's Been 10 Months!

It has been exactly 297 days since the last official Penguin update, Penguin 2.1, which rolled out on October 4th. That means it has been almost 10 months since the last update.

10 months of waiting for those hit hard by the Penguin algorithm. 10 months of earning a fraction of their normal traffic, hoping Google pushes out an update, waiting to see if their spam removal efforts have worked and whether they can get back to paying their payroll and supporting their families.

Let's look at the history of the Penguin updates:

Google Penguin Updates:

  • Penguin 1.0 on April 24, 2012
  • Penguin 1.1 on May 26, 2012
  • Penguin 1.2 on October 5, 2012
  • Penguin 2.0 on May 22, 2013
  • Penguin 2.1 on October 4, 2013

We had breaks of:

  • 1 month between Penguin 1.0 and 1.1
  • 4.5 months between Penguin 1.1 and 1.2
  • 7.5 months between Penguin 1.2 and 2.0
  • 4.5 months between Penguin 2.0 and 2.1

So this is by far the longest anyone has had to wait for a Penguin refresh or update from Google.

Will Google release the update in the summer? Penguins are not really summer animals, and Google is often less likely to release major updates in the summer, with the exception, I guess, of the Google Pigeon update?

I cannot imagine Google waiting until October to push out its next update.

When do you think Google will push out a Penguin refresh? If I had to guess, I'd say we'll see one before the end of August.

Forum discussion at WebmasterWorld.

Google Begins Issuing Invites To Google Domains

A month ago, Google announced it is slowly getting into the domain name registration business through Google Domains and offered a form to sign up to be invited.
Well, some have been invited over time, but now those who have had access are being given the ability to invite friends and colleagues. I believe these invites are flowing mostly from Googlers who have accounts with Google Domains.

For example, John Mueller is asking if certain folks want invites to Google Domains on his Google+ account. I know other Googlers have offered invites to others as well.

It seems like this Google Domains rollout is working a lot like the Gmail invite system.

Kim Clinkunbroomer posted on Google+ a screen shot of the interface for Google Domains:


I also believe GDG Kansas City is sending out invites tomorrow, learn more about that on Google+.

Google Long Term SEO Requires Solid Money Management

Greg Niland, aka goodroi, posted a thread in WebmasterWorld explaining that money management is vital in your long term SEO success.

Greg said, "having good SEO results in Google actually requires smart money management." Yes, this applies to any business you are in, but maybe a bit more for SEO, because of all the aspects out of your control.

Greg laid out why:

  • SEO requires money: you need to pay for domains, servers, content writers, developers, etc.
  • Google updates can be unpredictable, and you can find your web site penalized and not making you money. You need cash reserves for those cases.
  • Don't waste your money now assuming you'll always make as much money as you have in the past.
  • Invest in other areas, not just your web sites, because one day Google could replace you.

Have you learned this the hard way?

Forum discussion at WebmasterWorld.

Google: Moving A Site Has No Long Term Negative Impact

Google has said this before, but it is always comforting to hear them say it in multiple ways.

Google's Zineb Ait Bahajji said in a Google Webmaster Help thread that if done correctly, there will be "no negative impact on the website's visibility in the long term."

She said this is in Google's documentation:

As the documentation explain very well, if the site move is done correctly, it will have no negative impact on the website's visibility in the long term.

To quote the article : "[T]he visibility of your content in web search may fluctuate temporarily during the move. This is normal and a site’s rankings will settle down over time."


"Settle down over time" doesn't necessarily mean "no negative impact" but since Zineb is making that 100% clear now, it is good to share with you all.

Google: "if the site move is done correctly, it will have no negative impact on the website's visibility in the long term."

How can you do it correctly? Well, Google has a very detailed document and walkthrough in their help section.
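The core of a correct site move is a sitewide 301 (permanent) redirect from every old URL to its new equivalent. As a hedged sketch only, with placeholder domain names rather than anything from Google's documentation, an Apache .htaccess on the old domain might look like this:

```apache
# .htaccess on the old domain: permanently (301) redirect every URL
# to the same path on the new domain, preserving the rest of the URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-example.com/$1 [R=301,L]
```

Pair the redirects with the Change of Address feature in Webmaster Tools and keep the old domain's redirects in place for a long while.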

Saturday, July 26, 2014

Google Hypocrisy: Keyword Rich & User Friendly Links Should Die

Back in the days before Google, online usability folks were all about making user friendly hyperlinks that communicated to the user what the link was about and what to expect when they clicked it. That means a keyword-rich anchor text link that describes the page it links to.

I remember, early on, folks in that community being upset with me when I linked the words "click here" or "more here" and so forth instead of using keyword-rich anchor text.

Usability advocate, Kim Krause Berg posted in the Cre8asite Forums a rant about this.

She said, "Hypocritical Google Dislikes User Friendly Links."

I've stated before that I don't give a flying cow about whether or not Google thinks a site I link out to is acceptable or not. If I decide to link to a page I like, I'm doing it. If I choose link anchor text that describes where my visitor will land, that is my right. All links are mystery links to people unless it is clearly explained where the link will take them. This is usability 101. Google knows this.


Is Google trying to kill user friendly links because of the manipulation of it by a group of SEOs?

Do they have other choices? Is this what Google is aiming to do?

Forum discussion at Cre8asite Forums.

Google Right To Be Forgotten Success Rate Is 50%

As you know, Google has started taking down content in the European Union based on the right to be forgotten form submission requests.

Google shared statistics with the Wall Street Journal, Search Engine Land and others on the number of requests and take downs.

Google said as of the 18th of July, they have received 91,000 right to be forgotten requests involving more than 328,000 different URLs in Europe. Google rejects about 30% of takedown requests, Google asks for more information in around 15% of the cases and Google approves over half of the requests.

So you have a pretty good chance of getting content removed from the European Google results by using that form. Better than a 50% shot.

The WSJ added that most requests came from France, with 17,500 requests. Germany had 16,500 requests, and 12,000 originated in the U.K. Some 8,000 requests came from Spain, 7,500 from Italy, and 5,500 from the Netherlands.

Google Launches New Local Search Algorithm: SEOs Notice Significant Ranking Changes

Last night, Google pushed out a new and major local search ranking algorithm change. I broke the story at Search Engine Land where Google provided details for me on this update.

Note: Later we named the update the Pigeon update.

This is not really a spam change but more of a fundamental change to the local search ranking algorithm. Google would not tell me the percentage of queries impacted by this change but based on early reports, I'd say it is a significant number of queries impacted by this local algorithm change.

The changes have rolled out to both the Google Maps search results and Google Web search results. Google told me that the new local algorithm ties "deeper into their web search capabilities, including the hundreds of ranking signals they use in web search along with search features such as Knowledge Graph, spelling correction, synonyms and more." It also improves accuracy on distance and location rankings.

This has rolled out only for US English results and we have no idea when and if it will roll out for other languages or countries.

Significant Changes Seen


SEOs and webmasters have already picked up on significant changes. I have a screen shot from a search result for [ice cream] from earlier this week, and the results are now different even for that query.
Before:

Google Maps Search Adds Scrollable Search Results

After:

Google Launches New Local Search Algorithm Results

Even without knowing there was an update, SEOs picked up on it last night and this morning in the Black Hat World forum and the WebmasterWorld (paid membership required) forum. Many forum threads are discussing the widespread ranking changes.

Also, back in 2009, Google dropped local results for web designers and SEOs. Well, now it seems some of that has come back with this new local search algorithm.

Thursday, July 24, 2014

Google Issues Manual Action For Links On Moz's YouMoz

The co-founder of Moz, Rand Fishkin, posted on Twitter and also on the Moz blog that a contributor to YouMoz, the user-generated content portion of Moz, received a Google manual action for a link violation that cited an article posted on YouMoz.

Here is the violation notice as posted by Rand:

YouMoz Google Link Violation

So this goes back to guest blogging being dead for SEO purposes, which concerned a lot of YouMoz contributors. Moz reached out to Matt Cutts, and he responded via email that Moz should be okay. Here is a snippet of the email:

That said, with the specific instance of Moz.com, for the most part it's an example of a site that does good due diligence, so on average Moz.com is linking to non-problematic sites. If Moz were to lower its quality standards then that could eventually affect Moz's reputation.

The factors that make things safer are the commonsense things you'd expect, e.g. adding a nofollow will eliminate the linking issue completely. Short of that, keyword rich anchortext is higher risk than navigational anchortext like a person or site's name, and so on.


In this case, the anchor text was indeed keyword rich:

link example

But each and every YouMoz post is reviewed by someone at Moz, so Rand is upset about this.

Danny Sullivan on Twitter pulled out more questionable things about this guest blog post:



Anyway, I did email Google about this and I may post something more detailed at Search Engine Land if/when I get a response. A shame Cutts is on leave; I wonder who will take the hit for this.

Tuesday, July 22, 2014

Google: The Effects Of Duplicate Content In Search

Duplicate content in the SEO space has been an important topic for as long as I've been in the industry. I've covered it here countless times.

Today, I wanted to share what Google's John Mueller says about the "effects you'd see with content duplication within a website."

John says there are two main issues with duplicate content:

(1) Google's algorithms will choose one URL to show for the content in search. Maybe it won't choose the URL you'd choose. If you have a preference, make it known (through redirects, rel=canonical, internal links, etc).

(2) Depending on the amount of duplication (is each piece of content hosted 2x, 20x or 200x?), it can happen that the process of crawling is too much for the server, or that new/updated content isn't picked up as quickly as it otherwise might be.

John said that with a "reasonable amount of duplication" and a "reasonably strong server," these are not issues: "neither of these are real problems. Most users won't notice the choice of URL, and crawling can still be sufficient."
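A minimal illustration of John's first point, with a placeholder URL rather than anything from the thread: when several URLs serve the same content, a rel=canonical tag tells Google which one you'd choose.

```html
<!-- Placed in the <head> of https://example.com/product?sessionid=123
     and any other duplicate variants, this points Google at the
     preferred URL for the content: -->
<link rel="canonical" href="https://example.com/product">
```

Redirects and consistent internal linking accomplish the same thing more forcefully, per John's list.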

Forum discussion at Stack Exchange.

Thursday, July 17, 2014

Google Revamps Webmaster Tools Robots.txt Tester Tool

Google announced that they've updated the Robots.txt tester tool within Google Webmaster Tools.

The updated tool adds three things:

(1) It highlights which line in your robots.txt file is blocking a specific page.

(2) It lets you make test changes to your robots.txt and test them before you make the file live.

(3) It shows you older versions of your robots.txt file so you can see past issues.

Here is a test showing you how it highlights the single story I block from search engines on this site:


This is a view of the history of my robots.txt file:

Google Robots.txt Tester Tool History

Google also issued a subtle hint to try the fetch and render tool again and to stop blocking stuff from Google.
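If you want to sanity-check a robots.txt rule outside of Webmaster Tools, Python's standard library can parse one locally. This is just a local sketch with a made-up rule, not Google's tester itself:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt similar to this site's, which blocks a single story.
rules = """
User-agent: *
Disallow: /2014/07/blocked-story.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the * group here, so the first URL is blocked
# and the second is allowed.
print(parser.can_fetch("Googlebot", "/2014/07/blocked-story.html"))
print(parser.can_fetch("Googlebot", "/2014/07/another-story.html"))
```

Google's tool goes further, of course, by matching against the exact parsing logic Googlebot uses.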

Forum discussion at Google+.

Tuesday, July 15, 2014

Google Wants You To Link Your Google My Business & AdWords Accounts

Google announced they want you to link your new Google My Business account, formerly Google Places, with your Google AdWords account.

This is the new and "upgraded location extensions" feature that should offer you "a better way to display your business locations in every ad," said Google.

How do you do it?

(1) Create your Google My Business account and add your business locations over here.

(2) Link your Google My Business account to AdWords. Linking your accounts in one easy step allows your business info to appear with your ads. All campaigns will have location extensions automatically enabled when you link your accounts. You’ll also be able to customize your upgraded location extensions for different devices. Learn more over here.

Google Wants You To Link Your Google My Business & AdWords Accounts

Google said they will continue to "improve account location extensions with the goal of upgrading all AdWords accounts over the next few months."

Google Notice: This Site Uses Flash, May Not Work On Your Device

Google announced that if you conduct a Google search on your iOS or Android device and a site that comes up is built with Flash, they will warn the searcher that the site may not work on their device, since clicking through would be a bad user experience.
The note reads:

Uses Flash. May not work on your device.

Try anyway | Learn more


Google said this is launching today, and the notice will show when their algorithms detect pages that may not work on their devices. For example, Adobe Flash is not supported on iOS devices or on Android versions 4.1 and higher, and a page whose contents are mostly Flash may be noted.

Google's Pierre Far said on Google+:

If your website still uses deprecated technologies that don't work on mobile devices, it's already well past the time to update it. For example, if a page's main contents (or solely) uses Flash that doesn't work on many mobile devices, starting today we will note that in the snippets in our search results.

So what should you do? Simple: Use HTML(5), JS, and CSS as they are the only technologies widely (and sometimes solely!) supported by all devices. For that, many Googlers have been working on Web Fundamentals to bring you the modern best practices.


Time to upgrade folks.
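If you want a rough idea of whether your own pages would trip this kind of detection, you can scan the HTML for object/embed tags that reference Flash. A minimal sketch (this is my own illustration, not Google's detection logic):

```python
from html.parser import HTMLParser

class FlashDetector(HTMLParser):
    """Flags <embed>/<object> tags that reference Flash content."""

    def __init__(self):
        super().__init__()
        self.uses_flash = False

    def handle_starttag(self, tag, attrs):
        if tag not in ("embed", "object"):
            return
        attrs = dict(attrs)
        mime = attrs.get("type") or ""
        src = attrs.get("src") or attrs.get("data") or ""
        if "shockwave-flash" in mime or src.endswith(".swf"):
            self.uses_flash = True

detector = FlashDetector()
detector.feed(
    '<object data="movie.swf" type="application/x-shockwave-flash"></object>'
)
print(detector.uses_flash)  # True
```

Google presumably also weighs how much of the page's main content is Flash, which a tag scan like this can't tell you.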

Google recently began showing faulty redirect notifications in the mobile search results as well.

Forum discussion at Google+ & WebmasterWorld.

Sunday, July 6, 2014

Google On Can You Redirect A Penalty To Another Site

Can you redirect a penalty to a new or different domain name? Let's say site A has a bad link penalty, either manual action or algorithmic issue. And you or a competitor decide to use a 301 redirect from site A to a site without a penalty (i.e. site B). Would site A's penalty cause site B to get a penalty also?

This is not a new question and we know penalties may follow you even without redirects. But if done in a negative SEO way, will Google catch it?

John Mueller of Google answered this in an unofficial Google Webmaster Hangout Friday afternoon at 27 minutes and 30 seconds in.

John said:

(1) Google is usually good at catching these cases.
(2) He has personally never seen a case where this caused an issue for a good site.
(3) If you are worried and want to ensure nothing bad happens, you can disavow the links pointing to the penalized site (i.e. site A). Disavowing the domain that was 301ed to you seems like it wouldn't help.
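For reference, the disavow file John mentions is just a plain text list uploaded through the Disavow Links tool. The domains below are placeholders, not anything from the hangout:

```text
# Disavow file for site A, covering the spammy links pointed at it.
# Disavow everything from an entire spammy domain:
domain:spammy-links-example.com
# Or disavow a single offending page:
http://another-example.com/paid-links-page.html
```

Lines starting with # are comments, and the `domain:` prefix covers every URL on that host.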

Here is the video:

Let Google Know About Your Negative SEO

A thread at Google Webmaster Help asks a question that goes mostly unanswered in the thread itself: "Negative SEO. How can we let the Webmaster Team know?"

How do you let Google know when you think you've been hit by negative SEO? Good question, and one various Googlers have answered over time.

Beyond most of them saying that it is rare to be hurt by negative SEO, the advice is: if you are worried, post the details in Google Webmaster Help, and if you have a manual action, you can even submit a reconsideration request and let them know that way.

Truth is, either way, the best place to get a response, whether from Google or from someone who can help with your Google issues but does not work at Google, is the Google Webmaster Help forums.

You can also try to get on a live hangout with John Mueller or another Googler by visiting Google Webmaster on Google+ and finding one of their live hangouts.

Either way, you will almost always have to do work to "fix" the issue, even if someone else did it to you. To get help with those steps, you need to isolate what your issue is and then fix it from there.

Google HTTPS Everywhere: Why, How & SEO Implications

Google has published a video of the HTTPS Everywhere presentation on YouTube. The presentation was given by Pierre Far from the Google Webmaster team and Ilya Grigorik from the Google Developer team.

The presentation was awesome because it was conversational and kept you interested.

The outline of the presentation was:

  • Why you need HTTPS
  • How to deploy it correctly, without hurting website performance, using techniques like HSTS, session resumption, SPDY, and more
  • How to make sure your secure sites get indexed correctly
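Of the techniques mentioned, HSTS is the simplest to show: it is a single response header telling browsers to always use HTTPS for your host. A hedged nginx sketch (the domain and the one-year max-age are illustrative, and certificate directives are omitted):

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    # HSTS: browsers that have seen this header will refuse to
    # connect to this host over plain HTTP for the next year.
    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains";
}
```

Start with a short max-age while testing, since browsers cache the policy for the full duration.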

The video is 45 minutes long:

Here are some resources as discussed in the presentation:


Forum discussion at Google+.

Google's Panda Algorithm Forces PR Newswire To Remove Press Release Spam

As you may remember, Google Panda 4.0 had a major negative impact on many press release sites, including the largest one, PR Newswire. In response, PR Newswire issued a press release about cracking down on "spammers": taking action on low-quality press releases before they hit the feeds and going back to remove many of the spammier older releases.

PR Newswire, according to SearchMetrics, saw a huge dive in "SEO Visibility" after Panda 4.0.

What are their new guidelines that will fix this? They include:

  • Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
  • Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
  • Assessing release length, guarding against issue of very short, unsubstantial messages that are mere vehicles for links;
  • Guarding against overuse of keywords and/or links within the message.

Of course, you got to love this quote:

"Google's recent algorithm update is essentially a technology-based editorial guideline for content quality, and PR Newswire is aligning our processes with those standards to ensure that press releases distributed are high-quality, authenticated content," noted Jason Edelboim, senior vice president of global product for PR Newswire. "Google's recent action targeting low-quality content in the Panda 4.0 update affirms the importance of ensuring press releases and other content distributed via PR Newswire's network are of real utility and interest to journalists and bloggers, as well as the general public."

If only they took these steps before Google slapped them. If only.

I assume this is having a major impact on their business, both from the traffic drop and now from preventing a whole segment of spammer customers from paying them for press releases.

Google Removes Authorship Images From Search Results

Yesterday, I broke the news at Search Engine Land that Google has decided to drop the images and circle counts from the authorship snippets in the Google search results.

Google's John Mueller posted this on Google+ and is taking the brunt of the hit on this announcement. He wrote:

We've been doing lots of work to clean up the visual design of our search results, in particular creating a better mobile experience and a more consistent design across devices. As a part of this, we're simplifying the way authorship is shown in mobile and desktop search results, removing the profile photo and circle count. (Our experiments indicate that click-through behavior on this new less-cluttered design is similar to the previous one.)


Yes, he actually wrote, "our experiments indicate that click-through behavior on this new less-cluttered design is similar to the previous one." There are plenty of studies and even Google early on claiming otherwise, including examples from webmasters. But whatever, why would Google lie to us about that?

Do you think this:

google-authorship-image

Wouldn't get more clicks than this?

google-authorship-without-image
Really, Google?

Anyway, at least the news results do still contain an image, a smaller image:
google-authorship-news

I told Google they really need to update the Structured Data Testing Tool, because it still shows the authorship image in the preview:

google-structured-data-tool-author image

Now, this isn't 100% live yet; it is rolling out over the next few days, so you can still benefit from it for a little while.

Why is Google removing it? Well, it is less cluttered. Truth is, I think Google can't handle the spam issue even after rich snippet reductions. Remember when Google dropped authorship for 75 minutes? I guess that was a real test?
As you can imagine, publishers are not happy. There are tons of comments at Search Engine Land and complaints on Google+ and WebmasterWorld. Many in fact are considering dropping it completely now from their markup. But would that be wise? There are signs it may be used in rankings in the future and is currently used for in depth articles.

Bing: How We Pick Your Title Tag

Microsoft Bing published a blog post on how they decide on what title tags to publish in their search results.

We covered this in detail most recently via a Google Matt Cutts video on how Google selects a title tag. So let's go over it with Bing, which is fairly similar.

Bing has the user in mind, so its goals in displaying titles are to:

  • Optimize titles for relevance to the user. Titles are very powerful when it comes to showing how a site or document is relevant to a user’s query.
  • Optimize snippets. Snippets also help the user differentiate between search results at a more granular level.
  • Optimize display URLs. Users look at URLs to validate the source of information and gauge its authenticity. Bing tries to make it easy to see who is providing the information.

Bing then provides four ways to help ensure Bing chooses the title tag you specified:

  • Make the HTML Title relevant to the queries that would be used to search your site without being overly long or repetitive. Avoid generic titles like “Home” or “About Us”.
  • If you embed OpenGraph, etc., make sure it is consistent with the title you want, and that all the fields are correct, for example that your site name is correct.
  • If your site is listed on dmoz.org or other directories make sure the entry is correct.
  • Don’t block Bing's crawler. Refer to Bing's crawler control instructions, but keep in mind that you should not block Bingbot if you want your content indexed. Slowing the crawl rate (via the Webmaster Tools), blocking Bingbot in the robots.txt file, or even blocking Bing's IP addresses can prevent Bing from crawling and indexing your content.
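Tying the first two tips together: the HTML title and any OpenGraph markup should agree. A sketch with placeholder names (Bing's post doesn't prescribe exact markup):

```html
<head>
  <!-- Descriptive, query-relevant title; avoid generic "Home". -->
  <title>Blue Widgets | Acme Widget Co.</title>
  <!-- Keep OpenGraph fields consistent with the title and site name. -->
  <meta property="og:title" content="Blue Widgets | Acme Widget Co.">
  <meta property="og:site_name" content="Acme Widget Co.">
</head>
```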

Matt Cutts, Google's Lead Spam Fighter, Takes Extended Time Off

Google's head of search spam, Matt Cutts, announced on his blog that he is taking "a leave" through October of this year. In short, he is going completely off the grid and not dealing with anything Google related, i.e. his work, for the next four months or so.

Why? Did he burn out? Is he looking to do other work? He says no. He said he is doing it because his wife deserves it:

When I joined Google, my wife and I agreed that I would work for 4-5 years, and then she’d get to see more of me. I talked about this as recently as last month and as early as 2006. And now, almost fifteen years later I’d like to be there for my wife more. I know she’d like me to be around more too, and not just physically present while my mind is still on work.

So we’re going to take some time off for a few months.


Spammers are going to take over Google's search results? Nah. Matt Cutts wrote, "Thanks to a deep bench of smart engineers and spam fighters, the webspam team is in more-than-capable hands. Seriously, they’re much better at spam fighting than I am, so don’t worry on that score."

Honestly, Matt has been dealing with so much negativity for so long: how Google isn't fair, how Google is immoral, how Google is evil, and even positive things that get turned around on him. He has been attacked for years and just recently was verbally called a liar by Jason Calacanis, who said he wants revenge. But I doubt that it finally got to him and that is why he is taking off; if so, he would have burned out a long time ago.

Is this a sign of things to come? Will he eventually retire and get into something else, like maybe politics? I wouldn't be surprised if that was part of his timeline all along.

But in October/November, he will come back and see that Google did manage to chug along without him. He will be more comfortable leaving Google to handle the day-to-day stuff, and perhaps more comfortable eventually leaving Google altogether. That being said, he will also return refreshed, energized and ready to get his hands dirty in web spam again, at least for the time being.

Who will be the next poster-boy at Google to be slammed and blamed for Google's algorithm updates? Maybe John Mueller? John did take the most recent hit for announcing the authorship change. Maybe the blame will be nameless, spread across Google as a team? It is unclear, but I think things are set up in a way that this can work.

In any event, Matt can take the time off (he has earned it) and more importantly deserves the time off. We will miss you Matt but your wife deserves the time. Enjoy!

Forum discussion at Twitter, Google+, WebmasterWorld and more coverage on Techmeme.

Google Killer Robots.txt File Blocks Terminators From Co-Founders

Google posted a new robots.txt file at google.com/killer-robots.txt that is designed to help secure the Google co-founders, Larry Page & Sergey Brin.
The file says:

Google Killer Robots.txt

User-Agent: T-1000
User-Agent: T-800
Disallow: /+LarryPage
Disallow: /+SergeyBrin

There are actually some Googlers who are upset that they are not included on the disallow list. Okay, maybe they were joking.

T-1000 and T-800 are two Terminator models from the first two movies in The Terminator franchise.

This is pretty funny.

Forum discussion at Hacker News.