Friday, January 31, 2014

Can You Rank In Google Without Content?

A WebmasterWorld thread features a webmaster whose site doesn't have any real textual content; it is basically statistics and specifications offered as PDF or Zip downloads.

Can you rank web pages with no content at all in Google?

A good example of a page that ranks without having the exact words on it is the Adobe Reader page which ranks for [click here].

But what about a page with almost no content? Is it possible to rank on anchor text alone?

Yes, but only for very obscure, non-competitive terms.

The WebmasterWorld thread has netmeg adding:

Some sites just aren't meant for search engines.


Forum discussion at WebmasterWorld.

Thursday, January 30, 2014

Dear Google, We Can Help With Your Google Rankings

You and I get them all the time, spammy emails from SEO companies telling us how they can get us more traffic and higher rankings in Google.

But did you know Google gets them as well? Yes. Not just Matt Cutts at his personal blog, or other Googlers at theirs; Google.com itself gets these emails, telling Google that it isn't ranking well in its own search engine, but not to worry, because this SEO company can help.

John Mueller from Google posted one such example email solicitation, addressed to Google, on Google+. Here is how it read:

Dear Google Team,

I thought you might like to know some of the reasons why you are not getting enough Organic search engine and Social Media traffic foryour website.

1. Your website is not ranking top in Google organic search for many competitive keywords.

2. Your website profile needs to have regular update in major Social Media sites.

3. Your site has less number of Google & Yahoo back links, this can be improved further.


I think Google should take them up on the offer. I mean, can it hurt?

It just amazes me people do this.

Google's Matt Cutts: Don't Use Article Directories For Link Building

In a short video yesterday, Google's Matt Cutts told webmasters and SEOs not to use article directory sites for link building strategies.

Well, he sort of said not to use them, hinting at it at the end by saying:

My personal recommendation would be probably not to upload an article there.


But on Twitter, he was clearer, saying simply "no."




Here is the video:


Will this stop SEOs from using it? I doubt it. 

Friday, January 24, 2014

Was Expedia Targeted By Negative SEO?

Earlier this week, Expedia saw a 25% decline in search visibility on Google. Both Google and Expedia declined to comment on what was going on, so we are left speculating.

A USA Today article suggests that negative SEO might have been a factor in Expedia's ranking drop.

Expedia may have been hit by a "negative SEO" campaign that hammered the travel website's rankings on Google searches, according to an analysis by the firm that uncovered the problems.


I can't imagine negative SEO having such a serious impact on a large internet brand like Expedia. For a site that size to be hit by negative SEO is almost impossible. Unless the whole thing was staged and it fooled Google, which is possible but unlikely, I can't imagine this was negative SEO. Here's why:


(1) It would make Google look ridiculous. 

(2) It could set Google up for legal worries, because the penalty impacted Expedia's shareholders.

(3) It would make it possible for any huge site and publicly traded company to be gamed both in search and in the stock market and that is way too scary.

Robert at WebmasterWorld added a funny point:

Worth noting that the links from the USA Today online article are apparently not nofollowed. Maybe this is part of the Expedia recovery effort.

Guest Bloggers Ask About Disavowing Links In Their Posts

Earlier this week, Matt Cutts of Google stuck a fork in guest blogging for links or SEO purposes.

Now we have guest bloggers asking in the Google Webmaster Help forums whether they should be disavowing, nofollowing or removing links they have on the sites they've guest blogged for.

If you are asking the question, I'd say the answer is yes - disavow, nofollow or remove those links.

Deon in the forums said that he once considered those to be his "good backlinks" but now with all the guest blog frenzy, he is rightfully worried.

Like I said, if you are asking the question, in my opinion, you should likely disavow the links. I am sure many would argue with me on this point, but better safe than sorry.

Thursday, January 23, 2014

Does Google Read Content Within Strike Or Del Tag?

Here is a question I never saw asked before, which is rare. A Google Webmaster Help thread has a webmaster asking how Google handles the content within a strike or del tag.

The strike tag is not supported in HTML5; it was replaced by the del tag, but both do the same thing. For example, to strike a word in HTML you would write <del>like this</del>, and it renders with a line through it. But is Google reading, indexing and ranking the words within strike or del tags?

Google's John Mueller responded to the thread adding:

If these are only additional mentions of the same kind of content as you otherwise have on the page (and not the only mentions, or the primary content of the page), then removing them is fine.


So let's try it: I am going to put the following phrase in a del tag and see if this page ranks for it any time soon. The phrase is erjdlcytnslx84mdss2.
Let's see what Google does with it.
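We don't actually know how Google treats struck-through text, but the two possible treatments can be sketched in a few lines of Python (a hypothetical illustration, not Google's actual pipeline): extract the page text either keeping or dropping the contents of del/strike tags.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects page text, optionally skipping <del>/<strike> content."""
    def __init__(self, skip_struck=False):
        super().__init__()
        self.skip_struck = skip_struck
        self.struck_depth = 0  # nesting depth inside del/strike/s tags
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if tag in ("del", "strike", "s"):
            self.struck_depth += 1

    def handle_endtag(self, tag):
        if tag in ("del", "strike", "s") and self.struck_depth:
            self.struck_depth -= 1

    def handle_data(self, data):
        if not (self.skip_struck and self.struck_depth):
            self.parts.append(data)

def visible_text(html, skip_struck=False):
    parser = TextExtractor(skip_struck)
    parser.feed(html)
    return "".join(parser.parts)

page = "The phrase is <del>erjdlcytnslx84mdss2</del> struck out."
print(visible_text(page))                    # keeps the struck phrase
print(visible_text(page, skip_struck=True))  # drops it
```

If the test phrase ranks, Google's behavior looks like the first call; if it never does, more like the second.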

Forum discussion at Google Webmaster Help.

Update: It looks like Google News indexed it so far:

del-strike-index-google

Google: No Special Indexing For Facebook Or Twitter Because They Can Block Us But..

Yesterday Matt Cutts of Google released a video explaining that Google currently doesn't do any fancy indexing or ranking for Facebook or Twitter pages. He also said that part of that is they don't currently try to extract social data, such as the number of likes or tweets a page gets.

Why? Isn't it good data? Well, yes, it is, but here is why, according to Matt:
(1) They don't want to pour engineering time into capturing this data only to be blocked by Facebook or Twitter. In fact, Matt said they did this once and were blocked; I believe he is referring to when Google real-time search was killed after the deal with Twitter fell apart.

So they don't want to spend all that time capturing and processing data that they may be blocked from using in the future.
(2) They are worried they can't crawl it fast enough to keep it up to date. Social data changes a lot, and fast; stale results could even insult someone, say by showing an outdated relationship status.

But Matt adds that they do plan on doing this in the future. Ten years down the road, or soon? I believe we should expect it sooner rather than later. They are actively working on subject authority ranking, and this is a large piece of it.

Here is the video, worth a watch:


December 2013 U.S. Search Engine Rankings By comScore

I don't often cover these reports, but I figured it would be nice to compare the year-end search market share from December 2013, which was recently released, to the December 2012 market share.

Google is of course the winner, but they aren't really gaining much share year-over-year. Microsoft has, as has Yahoo, both of which are powered by Bing. So in December 2013, Google had 67.3% share, while Bing powered 29% of search. AOL is powered mostly by Google, so if you add that in, it brings Google to 68.6%.
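The share arithmetic above checks out; here is the math, with AOL's roughly 1.3% share inferred from the 68.6% combined figure (the comScore report, not quoted here, would have the exact number):

```python
# December 2013 comScore shares (percent), per the post.
google = 67.3
bing_powered = 29.0   # Microsoft + Yahoo, both powered by Bing
aol = 1.3             # inferred: mostly Google-powered

# Adding AOL's Google-powered share to Google's own.
google_total = round(google + aol, 1)
print(google_total)  # 68.6
```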

Here are the charts from the two years.

December 2013:

comScore Search Market Share December 2013

December 2012:

comScore Search Market Share December 2012

Yahoo Search Goes SSL: Not Provided

Yahoo is now defaulting all their search results to SSL, which means secure search, but also means that pesky [not provided] figure will increase.

Danny Sullivan broke the story at Search Engine Land yesterday. In short, if you do a search at Yahoo.com or search.yahoo.com, the results will redirect to https://search.yahoo.com/. When a searcher clicks from the search results to your site, you won't get the referrer data or the query data; well, that is unless your site is secure itself.

Google shows [not provided] for 90%+ of search referrers. Google specifically does not follow the protocol convention: they strip the search query data even when the searcher goes from HTTPS to HTTPS, i.e. from Google to your secure site.

Yahoo does not do that, Yahoo will pass the referrer and query data if your site is secure.

Bing just added SSL support to search, but you need to actively go to the HTTPS version of Bing to use it. When you do, the referrer data is not passed to an unsecure site, like Yahoo, which again follows the protocol. But from HTTPS to HTTPS it does pass, unlike Google.

Danny summed it up well. If you have a standard HTTP site, then:

  • Google: secure search is the default, Google referrer passed but search terms stripped, except for advertisers.
  • Bing: secure search is optional, no referrers passed
  • Yahoo: secure search is the default, no referrers passed
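The behavior above can be restated as a small decision function; this is just a sketch of Danny's summary as of January 2014 (assuming the searcher is on each engine's HTTPS results, which is the default for Google and Yahoo but opt-in for Bing), not any engine's documented behavior:

```python
def search_referrer(engine, destination_https, advertiser=False):
    """Return (referrer_passed, query_terms_passed) for a click from
    the engine's secure search results to your site."""
    if engine == "google":
        # Google passes a google.com referrer but strips the search
        # terms, except for AdWords advertisers.
        return (True, advertiser)
    if engine in ("bing", "yahoo"):
        # Secure Bing and Yahoo follow the protocol convention:
        # HTTPS -> HTTPS passes everything, HTTPS -> HTTP passes nothing.
        return (destination_https, destination_https)
    raise ValueError("unknown engine: %s" % engine)

print(search_referrer("google", destination_https=False))  # (True, False)
print(search_referrer("yahoo", destination_https=False))   # (False, False)
print(search_referrer("yahoo", destination_https=True))    # (True, True)
```

The practical upshot: serving your site over HTTPS recovers query data from Yahoo and secure Bing, but not from Google.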

I decided to launch a What Is My Secure Referrer? site, so once Google, Bing and Yahoo index it, you can search for [what is my secure referrer], click on the https://referrer.rustybrick.com URL, and it will report back your referrer. There are plenty of sites that show the non-HTTPS referrer path.

Wednesday, January 22, 2014

Google AdSense Direct - Google Takes 15% Of Your Direct Ad Sales

Initially, Google's answer for helping small publishers sell direct ads, while filling those ad spots with AdSense when direct ads were not available, was DoubleClick for Publishers (initially called Google Ad Manager). Truth is, it is too complex for most publishers.

So Google AdSense launched a new product yesterday named AdSense Direct, which basically swaps out your AdSense units for your direct-sold ads and then, when that campaign ends, brings the AdSense units back. What does Google get?

(1) More AdSense placements when your direct ads are not live, where Google takes about 30% of the revenue.

(2) On the direct ads, Google takes 15% of the ad revenue.

(3) Advertisers need to use Google Wallet.

But for many small publishers, this is a no-brainer. Direct ad sales can bring in more money but are hard to land, so when you do get one, using this platform might make sense.
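As a quick sanity check of the economics, here is the publisher's share under the two splits described above (the ~30% AdSense cut is the post's approximation, not an exact published figure):

```python
def publisher_take(gross, direct_sold):
    """Publisher's share of ad revenue under the splits described above.
    The 30% AdSense cut is an approximation from the post."""
    google_cut = 0.15 if direct_sold else 0.30
    return round(gross * (1 - google_cut), 2)

print(publisher_take(1000, direct_sold=True))   # 850.0 on a direct deal
print(publisher_take(1000, direct_sold=False))  # 700.0 from backfill AdSense
```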

Here is a video on how it works:


For more details, see the FAQs.

It seems like many publishers are not terribly into the idea of letting Google or AdSense manage their direct ad sales. A WebmasterWorld thread has senior member Netmeg saying "Not sure I'm on board but I have to look into it more."

Google Drops Discussions Search Filter & Others

Now when you search Google, a number of search filters have been removed, including the discussions filter; the vertical options have been drastically reduced. Here is an example.

Google Drops Discussion Filter

But if you look at a screenshot from six months ago, the options for that same search were much more extensive.

Google Discussion Filter

Now we have web, images, shopping, maps, news, videos, books, flights and applications. Missing are places (which is maps, I guess), blogs, discussions, recipes, and patents.

All Google Testing also notes that the order of the filter options now changes based on your query. Interesting.

For some, the removal of these additional filters is very upsetting. A Google Web Search Help thread has one searcher who said "very annoying as I rely on this function a lot."

Google AdWords New Design Live

A few months ago we reported that Google was testing a new AdWords design for the advertiser console. Well, Google announced last night that it is rolling out to everyone now.

The new design is aimed at making the AdWords console more unified with the other products, services and platforms Google offers. Google said the new design also aims to give the console "a simple and beautiful user experience that helps you get tasks done quickly and efficiently."

Here is a picture of the new design:
Google AdWords New Design - click for full size

If you do not see the changes yet, you should in the next "few weeks" Google says.

Honestly, there is not much chatter about this new design in the community - I guess not too many people are seeing it yet.

Google Webmaster Tools Now Reports Crawl Errors On The Final Redirect URL

John Mueller from Google announced on the Google Webmaster Central blog that they've updated the crawl error reports specifically to handle better reporting on redirects.

In the past, Google reported the error observed at the end of the redirect chain under the first URL (URL A in the diagram below). Now they report it under the final URL (URL B in the diagram below).

Google Webmaster Tools Updates Crawl Error Reports For Redirects

John Mueller summed it up on Google+ saying:

The short version is that we'll be showing these errors for the final destination URL instead of the redirecting URL, which hopefully makes things a bit easier to diagnose.


Many webmasters are happier with this change because it makes it easier for them to diagnose the crawl errors on these redirects. 
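The change boils down to which end of the redirect chain gets the error; a hypothetical sketch, with example.com URLs standing in for the A and B of the diagram:

```python
def crawl_error_report_url(redirect_chain, new_behavior=True):
    """Given a redirect chain whose last URL is where the error
    (e.g. a 404) was observed, return the URL the crawl error
    report attributes the error to."""
    # New behavior: the final destination URL (URL B).
    # Old behavior: the first, redirecting URL (URL A).
    return redirect_chain[-1] if new_behavior else redirect_chain[0]

chain = ["http://example.com/url-a", "http://example.com/url-b"]
print(crawl_error_report_url(chain, new_behavior=False))  # URL A (old)
print(crawl_error_report_url(chain))                      # URL B (new)
```

Reporting against URL B is easier to act on, since that is the page that actually has to be fixed.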

Pretty Google Alerts Won't Let Users Switch From HTML To Text Version

For those who subscribe to Google Alerts via email, it is hard not to notice the new formatted HTML emails for those alerts. Most of my alerts are set up via RSS, but I kept some legacy ones just to track the email versions.

The only issue with the launch of the new design is that the text version no longer works.

Here is the new HTML version of the alert, it is indeed nicer to look at:

new HTML Google Alerts

Here is the old HTML version of the alert:

old HTML Google Alerts

Now, when you go to manage your alerts, there is a button to switch to text alerts. But when you click it, the setting doesn't take, and the HTML versions keep coming.

Text Google Alerts

This was reported in the Google Web Search Help forums, but no one from Google has replied about the issue yet.

Google's Matt Cutts: Guest Blogging For Links Or SEO Is Over & Dead

Yesterday, Matt Cutts, Google's head of search spam and the most loved-to-be-hated man at Google, posted on his personal blog that guest blogging is dead, at least for link building or SEO purposes.

At first, his language in the blog post was much broader, saying guest blogging as a whole is dead. He then clarified that guest blogging is dead only for purposes of link building or search engine optimization.

Matt explained that while guest blogging was a great thing, the SEO space ruined it. He said, this is why "we can’t have nice things in the SEO space." Matt decided to use some serious language and said "so stick a fork in it: guest blogging is done; it’s just gotten too spammy."

He said:

In general I wouldn’t recommend accepting a guest blog post unless you are willing to vouch for someone personally or know them well. Likewise, I wouldn’t recommend relying on guest posting, guest blogging sites, or guest blogging SEO as a linkbuilding strategy.


Later on he added a bit more clarification:

It seems like most people are getting the spirit of what I was trying to say, but I’ll add a bit more context. I’m not trying to throw the baby out with the bath water. There are still many good reasons to do some guest blogging (exposure, branding, increased reach, community, etc.). Those reasons existed way before Google and they’ll continue into the future. And there are absolutely some fantastic, high-quality guest bloggers out there. I changed the title of this post to make it more clear that I’m talking about guest blogging for search engine optimization (SEO) purposes.

I’m also not talking about multi-author blogs. High-quality multi-author blogs like Boing Boing have been around since the beginning of the web, and they can be compelling, wonderful, and useful.

I just want to highlight that a bunch of low-quality or spam sites have latched on to “guest blogging” as their link-building strategy, and we see a lot more spammy attempts to do guest blogging. Because of that, I’d recommend skepticism (or at least caution) when someone reaches out and offers you a guest blog article.

Of course, the SEO community is buzzing about the topic. We have threads at Cre8asite Forums and WebmasterWorld, and a big Q&A session at Hacker News. Twitter and Facebook are exploding too; you can see Matt Cutts' tweet.

Personally, I never felt comfortable letting anyone guest post here. It is my site; I am not that inviting, I guess. We've maybe had three guest posts from outside authors, on rare occasions.

Of course, none of you should be shocked by this news. Here are some recent stories we posted on guest blogging:

Thursday, January 16, 2014

Google Apologizes For The Hotel Listing Hijack In Google Places

I have to assume most of you have heard by now about the huge mess going on with Google Maps business listings in the hotel sector. Danny Sullivan and the team at Search Engine Land wrote up an awesome story explaining how Thousands Of Hotel Listings Were Hijacked In Google+ Local.

I can tell you this story was in the works for a few days; Danny broke it just the other day. It is honestly shocking that something like this can happen to huge hotel chains, and even more shocking how Google tries to sweep it under the rug. Yea, I know Google Maps is plagued with issues, especially on the Google Places business listing side. But this is a huge mess.

In short, somehow, spammers hijacked listings of hotel chains across the world, replacing each hotel's URL with a URL to book the room on their own affiliate site. This likely ended up costing the hotels a tremendous amount in affiliate fees; I wouldn't blame them if they refused to pay and sued the affiliate that did this.

Here is an example showing one listing with a hijacked URL:

Google Responds To Hotel Listing Highjack

Google barely said anything, but now they have had their community manager, Jade Wang, respond in a Google Business Help thread that no one really looks at. She wrote:

Spam issue for some hotels in Places for Business:

We've identified a spam issue in Places for Business that is impacting a limited number of business listings in the hotel vertical. The issue is limited to changing the URLs for the business. The team is working to fix the problem as soon as possible and prevent it from happening again. We apologize for any inconvenience this may have caused.


At least they apologized for "any inconvenience" it may have caused "some hotels."

It is just amazing how bad Google Maps for business is and how many hacks and issues are within it.

Google's Matt Cutts: We Don't Have Different Algorithms For Different Ranking Positions

Google's Matt Cutts released a video yesterday answering the question: does Google have different algorithms for different positions in the search results?

That is, do positions one through three use one ranking algorithm, positions four through six another, and so on?

The answer is no, at least for the organic web search results. Of course, the ads have different algorithms, as do the local results, the videos and so on, but the web results do not work that way.

Here is the video:


Tuesday, January 14, 2014

Dated Content Impacts Your Google Rankings?

There is an interesting thread that I am not sure I am reading too much into. The Google Webmaster Help thread has a representative from Rasmussen College complaining that his business management degree page no longer ranks well in Google. But the complaint is not the interesting part. :)

The interesting part is that he said it only started dropping in the rankings when Google decided to add a date to its snippet in the search results. Google dated the page despite it not having any date-based information on it, despite the page not being article-like content, and despite the page being a sales page for the school.

google date ranking drop

So John Mueller of Google simply responds:

Thanks for posting -- I've passed your feedback on to the team!


The question is, why did Google date the page? If it did, did that impact the rankings of the page or was it something else?

Clearly, for QDF (query deserves freshness) types of keyword phrases, old articles should not rank. But what made this page a dated page? What made the keyword phrase responsive to QDF?

I may be reading too much into the lack of information in John's post. But I figured I'd share it and see what you all had to say.

One-Line Google Reconsideration Request Worked

A webmaster at WebmasterWorld claimed that while some reconsideration requests he submits take five or more attempts, with tons of detail sent to Google, one he sent just recently went through with a one-line request.

The request read:

We managed to clean up all our backlinks and we have updated the disavow list. Please remove our manual penalty. Many thanks!


Google approved it, responding that the reconsideration request was accepted and the penalty removed.

What is even more amazing is that he didn't even contact any webmasters to remove links; he just uploaded a disavow file.

We know Google has said they want to see you manually removing links, or they may not approve your disavow and reconsideration request.

The thing is, it seems to depend on which Google representative is reviewing your reconsideration request, and what mood they are in at the time.

Do lazy reconsideration requests work for you?

Only 15% Say The Disavow Tool Benefits Their Rankings

The Google Disavow tool is a popular subject with all the link penalties floating around. But the big question is, does it actually help your site to rank better?

In November of last year, we ran a poll asking our audience and over 400 SEOs responded. The responses are not shocking at all.

Only 15% said it benefited their rankings by disavowing sites. 50% said it had no benefit at all. The rest were unsure.

Why is this not shocking? Well, we have said this before: disavowing links won't always improve rankings, because the links you are disavowing once benefited you in the rankings. Once disavowed, those links no longer count, and they probably already didn't count by the time you used the tool, because Google had penalized them. Either way, you need new links to make up for the ones that no longer count. Yes, disavowing might remove the penalty, but it won't necessarily improve your rankings.

The disavow tool is not the answer to rankings - it is a solution for fixing a penalty but not necessarily for improving your rankings.

So when the WebmasterWorld thread picked up some steam again, one person said:

Case 1) Had a site that was penalized. Disavowed a bunch of domains, opened reconsideration request. Manual penalty removed. Two months on, the domain gets even less traffic than before.

Case 2) Domain is penalized. Sent in a huge disavow file. Site falls further in rankings and is still penalized.

If I was a webmaster I would just ignore the disavow tool completely. It's useless. 

Thursday, January 9, 2014

Google To Publishers: Use The Canonical Tag For Duplicate Article Consolidation

Duplicate content is a fun topic in the SEO space. Google says don't worry about it, because some 30% of the web is duplicative and Google deals with it. But all good SEOs know you don't want five pages all targeting the same keyword phrase, because that spreads you too thin.

In Matt Cutts' latest video, he says that publishers who write several breaking stories on the same topic can, and maybe should, use the canonical tag to clean things up, pointing all the stories to one main story once the dust settles.

Listen to the video:


For publishers, that is really hard to do.

It is true: if you have the exact same story on multiple URLs, make sure your CMS sets the canonical tag for them. But if several writers cover the same story and produce several similar articles on the same topic, that is harder to deal with. I guess it is best to have an editor consolidate the stories, but that is rare in the news publishing world.
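For the simple case, the same story at multiple URLs, the fix is one line in each duplicate's <head>. Here is a hypothetical sketch that emits that tag (the example.com URLs are made up for illustration):

```python
import html

def canonical_tag(main_url):
    """Emit the <link rel="canonical"> tag each duplicate page should
    carry, pointing at the consolidated main story."""
    return '<link rel="canonical" href="%s">' % html.escape(main_url, quote=True)

main_story = "http://example.com/news/big-story-final"
for dup in ("http://example.com/news/big-story-breaking",
            "http://example.com/news/big-story-update"):
    # Each duplicate's <head> would carry this tag.
    print(dup, "->", canonical_tag(main_story))
```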

Wednesday, January 8, 2014

Google Search Queries Report Now Showing Exact Numbers

If you visit your Google Webmaster Tools Search Queries report, you will see an "update" line across the graph. When you place your cursor over it, it reads, "An improvement to our top search queries data was applied retroactively on 12/31/13."

Indeed it was! Google is now showing exact numbers; they are no longer rounding your search query data. This gives you the exact impressions and clicks for a given keyword or page on a specific date or within a specific date range.

Here is a picture:

Google Search Queries Report  Update Line

John Mueller announced this on the Google blog and said on Google+, "the search query data is now no longer bucketed / rounded."

Those who use the data for their own reporting via the API are even more excited. 

Google's Matt Cutts: Yea, Google Changes Search On Average Twice A Day

The other day, Matt Cutts of Google released a 10-minute video on YouTube explaining the How Search Works page Google launched in March 2013.

Matt said a few relevant things you may not have noticed before. The main point I took from it, which I knew, but he made a point of saying, was that Google made 665 changes to the search engine (UI and more) in 2012. That is about two changes per day. So when people like me go to Google and say something is up, Matt's answer is: yea, every day we release a couple of changes, so something is always up.

Here is the video; the relevant part is about 3 minutes and 10 seconds in:



Another thing he mentioned was that he loves that the portal shows examples of spam sites removed by Google in almost real time, so you can see the spam as it is removed. On that note, my company built a search engine for those spam examples shortly after they launched. It let you search the sites Google manually penalized, to get a better understanding of what Google removes. Google forced us to shut it down because they didn't want us crawling their examples and giving you an index to search through.

That being said, Google's Matt Cutts also has a video on how search works in general from 2012.

Google AdWords Invalid Activity Refunds Increasing?

WebmasterWorld's administrator, engine, aka Neil Marshall, created a thread at WebmasterWorld saying he is noticing a large increase in refunds to his AdWords campaigns. The refunds are attributed to "invalid activity."

He was surprised to see so many refunds recently specific to that reason of invalid activity.

He said, "I have noticed an increase in invalid activity refunds recently."

Then he asks why. Is it because of an increase in bot activity? More accidental clicks from mobile users? More spammers?

Have you noticed an increase in refunds due to invalid clicks?

Forum discussion at WebmasterWorld.

Google Webmaster Tools Search Queries Improved For Separate Mobile Sites

Google's Maile Ohye announced improvements yesterday to how Google reports search queries for web sites that run their mobile sites on separate URLs, such as m.example.com, example.mobi and so forth.

Now, instead of counting Skip Redirect impressions under the desktop URL, Google will break them out under the mobile site. You can view your m. site's profile and set Filters to "Mobile," from Dec 31, 2013 onwards.
Here is a picture showing the details:

click for full size

John Mueller from Google said on Google+, "this will make it a bit easier for you to keep track of the queries going there. "

Of course, if you have the same URL or a responsive design, this wouldn't apply.

Tuesday, January 7, 2014

Issues With Google Webmaster Tools Verification

A Google Webmaster Help thread has dozens of webmasters complaining about not being able to verify their web sites.

The first report came in yesterday afternoon. The webmaster said, "I just lost verification of all of my sites."

Google's Gary Illyes responded an hour later that the team is on it. Gary wrote:

Thanks for the reports, guys! I escalated them to the Webmaster Tools team and they are looking into it. Thanks again for reporting it!


Hopefully the issue will be fixed by the time this post goes live.

Google: Your Rankings Dropped Because Of Links, Not Hummingbird

A Google Webmaster Help thread has an office furniture site owner complaining that his Google rankings dropped after the Hummingbird update was pushed out.

Of course, you and I know that sounds like a stretch, given that we do not know the exact rollout dates of the Hummingbird algorithm.

But it is nice to see Google representatives saying so too.

Zineb Ait Bahajji from Google said it wasn't the Hummingbird algorithm but rather links; not just any links, but over-optimized links.

Here is what Zineb wrote:

The Hummingbird update does probably not have anything to do with your website's traffic loss.

The primary issue here is that you are using certain over-optimization techniques on your site that are in violation of our quality guidelines, especially when it comes to building unnatural and spammy links.


Interesting that she calls it "over-optimization techniques," don't you think?

Google Webmaster Tools Search Queries Shows Top Pages By Keyword Clicks

You know the Search Queries report in Google Webmaster Tools? Of course you do. Did you know you can show your top pages by impressions and clicks? And did you know that now you can see the top keywords that led to impressions and clicks on those pages?

Cyrus Shepard posted about it on Google+ and it is true. Here is a screen shot showing that you can expand the page report to show the keywords that led to the traffic to that page.

Google Webmaster Tools Search Queries Shows Top Pages By Keyword Clicks

As Cyrus said, "Massive, much needed improvement."

Indeed, this gives you back some of the data Google took away with [not provided] and SSL search.

Friday, January 3, 2014

Google: You Must Wait A Few Weeks To Submit A New Reconsideration Request

We know Google doesn't like it when you submit a new reconsideration request a day or two after you get a rejection response from your reconsideration request. They want to see you put effort in after you get that response.

So now, it seems Google is telling webmasters who receive reconsideration request rejections that they need to wait a "few weeks" before submitting a new one. If they don't listen, Google will ignore their submissions.

A Google Webmaster Help thread has one example, with Darren Jamieson posting the response he received for his site. The section that is new is:

Removing links takes time. Due to the large volume of requests we receive, and to give you a better chance of your next reconsideration request being successful, we won't review another request from this site for a few weeks from now. We recommend that you take the necessary time to remove unnatural backlinks to your site, and then file another reconsideration request.


It seems like more and more reconsideration request rejection replies have this message in them.

Hat tip to Jon Hogg from iprospect.co.uk for sending me this thread.

Google: Can't Crawl Your Robots.txt Then We Stop Crawling Your Site

Did you know that if Google cannot crawl your robots.txt file, it will stop crawling your whole site?

This doesn't mean you need to have a robots.txt file; you can simply not have one. But if you do have one, and Google knows you do but cannot access it, then Google will stop crawling your site.

Google's Eric Kuan said this in a Google Webmaster Help thread. He wrote:

If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to prevent it from crawling pages that have been blocked by the robots.txt file. If this isn't happening frequently, then it's probably a one off issue you won't need to worry about. If it's happening frequently or if you're worried, you should consider contacting your hosting or service provider to see if they encountered any issues on the date that you saw the crawl error.


This also doesn't mean you can't block your robots.txt from showing up in the search results, you can. But be careful with that.

In short, if your robots.txt file doesn't return either a 200 or a 404 response code, you've got an issue.
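Eric Kuan's description boils down to three cases, keyed on the HTTP status a crawler gets for /robots.txt. Here is a sketch of that logic (an illustration of the described behavior, not Google's actual code):

```python
def crawl_policy(robots_txt_status):
    """What a crawler does based on the robots.txt fetch result,
    per the behavior described above."""
    if robots_txt_status == 200:
        return "parse robots.txt and obey its rules"
    if robots_txt_status == 404:
        return "no robots.txt exists: crawl the site normally"
    # 5xx errors, timeouts, etc.: the crawler can't tell which pages
    # are blocked, so it halts rather than risk fetching them.
    return "halt crawling until robots.txt is reachable again"

print(crawl_policy(503))  # halt crawling until robots.txt is reachable again
```

That is why a flaky server returning intermittent 5xx errors on robots.txt can quietly stall crawling of the entire site.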

Spammers Cold Calling Local Businesses - Claiming They Need To Update Their Business Listings

Baruch Labunski from Rank Secure in Canada pinged me the other day, asking how to stop the phone calls he is getting from spammers implying they are from Google, trying to sell him services to update his Google Places business listing.

He said he constantly gets automated phone calls that say, "Our records show you have not updated your free Google (business) listing." The call then asks you to press 1 to update the listing and 2 to exit. I had no clue what he was talking about; I never got a call like this, but it seems many do.

There is a blog post from David Meharey with complaints and also an older Google Business Help thread with complaints.

The calls get excessive, annoying and are incredibly spammy. If you use Google Voice, an iPhone or have a flexible PBX, you can easily block the numbers. Your phone company might be able to help as well.

I assume Google could sue the company doing this, but it is not Google calling you.

The thread has some opinions on who is making the calls.

UserInteraction Schema For SEO & Social?

A WebmasterWorld thread has a webmaster asking if Google will give him a ranking boost because a piece of content on his site passed 1,000 Facebook likes.

It is one of those weird questions that makes you shake your head. Why?

(1) Google has said they don't use Facebook likes, or even Google +1s, directly in their ranking algorithm. Heck, Google doesn't have access to Facebook data for the most part.

(2) However, if you really have content that is organically loved and liked by that many people, it is natural for a lot of people to link to it and share it, and thus Google will pick up on those other signals and likely rank the content well.

That being said, should you mark up your pages so the Facebook likes on a specific page can be easily picked up?

There is a Schema.org markup type named UserInteraction that lets you mark up social features such as Likes, Checkins, Tweets, Visits, and so forth.

Of course, you need a way to automate this rather than manually entering the data into your HTML code. So you need a good social tracking system, or to integrate directly with each API from Twitter, Google+, Facebook, Foursquare, Analytics, etc.
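As a sketch of what the automation might emit: the 2014-era schema.org markup used an interactionCount property with values like "UserLikes:1000". Verify the exact property and value names against schema.org before using this; the helper and the counts below are hypothetical.

```python
def interaction_count_meta(kind, count):
    """Build an old-style schema.org interactionCount meta tag,
    e.g. for embedding inside an itemscope'd article."""
    return '<meta itemprop="interactionCount" content="%s:%d" />' % (kind, count)

# Hypothetical counts pulled from your social tracking system or the APIs.
for kind, count in (("UserLikes", 1000), ("UserTweets", 250)):
    print(interaction_count_meta(kind, count))
```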


Do you add this schema data to your site? If not, will you?  