Tuesday, December 31, 2013

Google Not Indexing Your Sitemap URLs? Might Be A Canonical Issue

A Google Webmaster Help thread has a webmaster upset that Google reports it has indexed none of the URLs submitted via the XML Sitemap file. Obviously, this can be concerning to any webmaster.

The thing is, you need to be careful which Sitemap file you submit. If the URLs in your Sitemap are the non-www versions while your preferred domain is www, or vice versa, Google may be very literal and report that it indexed none of the URLs you submitted.

Google's Zineb said in the thread:

All the URLs listed in your XML Sitemap are non www. and they all permanently redirect to the www. version (which is your preferred domain). That explains why these specific URLs are not indexed. In order to fix that, you'll need to specify the right URLs with www., resubmit your Sitemap and wait for it to be processed again.


So technically, it is an "easy fix" and the site is indeed being indexed. But a report like this can be scary to see in Google Webmaster Tools.
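You can catch this yourself before the report scares you: fetch your Sitemap and flag any URL that redirects somewhere else, such as a non-www URL hopping to its www version. Here is a minimal sketch in Python; the Sitemap URL is hypothetical:

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "http://example.com/sitemap.xml"  # hypothetical
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP) as f:
    tree = ET.parse(f)

for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    head = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(head) as resp:
        final_url = resp.geturl()  # urlopen follows redirects
    if final_url != url:
        # These are the URLs Google may report as "not indexed".
        print(url, "redirects to", final_url)

Any URL this prints is one Google may count as unindexed, even though the destination page itself may be indexed just fine.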

Top Search Stories Of 2013: Penguin, Panda, Other Google Updates & Search Logos

I normally hate posting a top-stories recap of the year, since I spend a heck of a lot of time hand-selecting what I think the year's most important stories were when I do my anniversary post. But here it is, like I did last year.

Similar to last year, many of the top stories are specific to Google updates such as Google Penguin, Google Panda and others. And to no one's surprise, there were a lot of popular Google logo stories. The list is ordered by most-visited pages, based on Google Analytics data.

Top Search Stories Of 2013:

Friday, December 27, 2013

Google's Matt Cutts: Expired Domains With Penalties Last For...

We know manual actions expire, but do algorithmic actions expire as well?

When you pick up a new domain name, you need to check whether it has a bad history. We know expired domains can either benefit you, do nothing for you, or seriously hurt your efforts on your new site.

Google's Matt Cutts chimed in on the difference between a manual action and an algorithmic action on an expired domain, and the difference, to me, is a bit scary. This is based on a Google Webmaster Help thread from Christmas.
First, a copy and paste of what Matt said, then my interpretation of it:

The short answer is that it depends. If domain hasn't really been on the web since 2001, I would expect any manual webspam actions to have expired a long time ago.

It's possible that the domain did some things in 2001 that would lead to algorithmic ranking issues, but the web typically changes enough in ~12 years that I'd be surprised if you ran into issues. Typically when you buy a site and run into problems, it's because someone was spamming more recently with the domain.


So clearly, if the domain expired years ago, you probably don't need to worry about a manual action. But to be safe, log in to Webmaster Tools and see if it still has a manual action. If so, submit a reconsideration request. And if it does have a manual action, I wouldn't be surprised if some algorithm is also impacting it.

On the algorithmic side, it is unclear how long the action will last. Matt implies it would be rare for a 12-year-old expired domain to still be hurt by an algorithm, but it is possible. In that case, you are probably in trouble and should probably find a new domain before you invest much more.

But how much time would it take for the algorithmic actions on an expired domain to expire as well? I guess it still depends on the links pointing to the site.

Google's Matt Cutts Tells Webmaster The Penguin Algorithm Is Impacting Rankings

It seems like Google is on a trend of telling webmasters which algorithms are impacting their site the most, at least in a negative way.

The other day a Googler told a webmaster that Panda was hurting their site; now, Google's Matt Cutts is telling another webmaster that the Penguin algorithm is impacting theirs.

The thread is at Google Webmaster Help and this is how Google's Matt Cutts responded:

Nope, it's not the title and meta descriptions. I'd recommend reading up about Penguin and looking into how to clean things up for Penguin.


Which is why I'd love to see an automated actions viewer in Google Webmaster Tools. I know it is a lot to ask but hey, I am asking.

Google's Matt Cutts: Don't Copy Wikipedia Content & Expect To Rank Well

A Google Webmaster Help thread has Google's head of search spam, Matt Cutts, giving advice to a webmaster. The advice: don't copy content from Wikipedia and expect to rank well.

The truth is, Wikipedia is a great source for facts, but webmasters simply should not copy and paste it verbatim. You can use it when writing stories and for fact checking, but not for copy and paste.
Matt Cutts wrote:

I picked a page at random: http://www.listofwonders.com/top-10-famous-haunted-places-in-the-world and the first sentence of the first haunted place is "Berry Pomeroy Castle, a Tudor mansion within the walls of an earlier castle, is near the village of Berry Pomeroy, in England."

If you look up the Wikipedia page of Berry Pomeroy Castle, the first sentence of the Wikipedia page is "Berry Pomeroy Castle, a Tudor mansion within the walls of an earlier castle, is near the village of Berry Pomeroy, in South Devon, England."

That was the very first random thing I checked, and it doesn't bode well for your site. If you're just copying text and pictures from other sites, I'd expect that your site would only be adding a limited amount of value for visitors, so it's not a huge surprise that your site doesn't get a ton of traffic at this point. I'd take some time to think about ways to add more value for someone who lands on your site.


It is tempting to just copy and paste and try to pass the content off as your own. But use Wikipedia as research, not as a content source. Write your story or summary around your research; don't just copy and paste.

Thursday, December 26, 2013

Black Hats Prepare To Spam Google's Author Authority Algorithm

For the past six months, Google has been working on an algorithm to promote authorities on topics.

In short, Google is going to try to figure out which authors or individuals are authorities on a specific topic and promote their content across any site, in some way. You can read more about it in the links above.

This morning, I spotted a thread at Black Hat World where "black hats" are seeking ways to exploit this algorithm by "faking" author authority.
This is how one explained it:

So Google now allows you to "tag" an author in your content. Good authors who are popular get extra ranking bonuses for their articles.

So it seems very simple to me. Find a popular author in your niche, and tag him in your links to your content.

Extra link juice off someone else's work.


Another added:

see i am thinking about using my own "fake authors" and starting to build authority around them by using them on all my press releases and article submissions... then later in time, anything posted by that author will be easier to rank?


I doubt, seriously doubt, it will be that easy. But hey, someone has to keep Google on their toes.
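For context, the "tagging" these posters describe is Google's authorship markup: a rel=author link from the article to a Google+ profile, which the profile confirms by linking back to the site in its "Contributor to" section. A minimal sketch of the markup, with a hypothetical profile URL:

<!-- On the article page: a byline link to the author's Google+ profile. -->
<a href="https://plus.google.com/112345678901234567890?rel=author">Jane Author</a>

<!-- Or as a link element in the page head: -->
<link rel="author" href="https://plus.google.com/112345678901234567890">

<!-- Verified authorship needs the Google+ profile to link back to the site
     as a contributor - the loop the posters above hope to fake or hijack. -->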

Google: Sitemaps Do Not Guarantee Indexing

This is likely obvious to most of the readers here, but it is simple: submitting an XML Sitemap to Google does not mean the pages in that Sitemap will be fully indexed.

A Google Webmaster Help thread has Google's Gary Illyes responding to a question about why a site that has submitted 40,000 pages only has 100 pages indexed in Google.

For example, here are two sites that have submitted their URLs to Google via an XML Sitemap file. One has submitted 17,987 pages, and Google has actually indexed all of them, plus one. :) The other has submitted over 7 million pages, but Google has only indexed about 4 million of them, roughly 53% of the pages submitted.

Why did Google index all the pages on one site but only about half on the other? Why did Google index only 100 of the 40,000 pages of the site complaining above?

Gary from Google explains:

First and foremost, submitting a Sitemap doesn't guarantee the pages referenced in it will be indexed. Think of a Sitemap as a way to help Googlebot find your content: if the URLs weren't included in the Sitemap, the crawlers might have a harder time finding those URLs and thus they might be indexed slower.

Another thing you want to pay attention to is that our algorithms may decide not to index certain URLs at all. For instance, if the content is shallow, it may totally happen it will not be indexed at all.


Google may take a look and decide, based on the content or the PageRank, that the page is not worth indexing.

Tuesday, December 24, 2013

Matt Cutts: Google's Hummingbird Algorithm Affects 90% Of Searches

I keep coming back to episode 227 of TWiG. In the video, Matt Cutts talked about the Hummingbird algorithm at exactly one hour and twenty minutes in, and he spent maybe a minute or so on it.

Matt Cutts said that the Hummingbird algorithm actually affects 90% of all searches, but only to a small degree. So while Panda may have impacted 10% or so of queries and Penguin closer to 3%, Hummingbird impacted 90%. But Matt Cutts said it was only to a small degree, such that users should not notice.

Here is the snippet of what Matt Cutts said:

Hummingbird affects 90% of all searches but usually to a small degree because we are saying, this particular document isn't really about what the user is searching for.



I know Google has told us searchers and SEOs should not have noticed any impact on rankings or traffic from this algorithm update. But we suspected it may have impacted some.

With such a large footprint, 90%, it had to have an impact to some degree.

Google: Is Amazon Spamming Google With Footer Links?

Remember the warning from Google's Matt Cutts about linking too many sites together using footer links and keyword-rich anchor text?

EGOL, a former Cre8asite moderator, posted in the Cre8asite Forums that this is exactly what Amazon does. He said, yes, Amazon deploys a "Massive, Insane Crosslinking Scheme." Well, at least by the standard Google's Matt Cutts laid out.

Where? Right on their home page:

Amazon Footer Links

Oh, maybe Amazon nofollows the links? Nope.

amazon footer links

The thing is, the anchor text for all the links is incredibly keyword rich. Here are some examples:

  • Diapers.com - Everything - But The Baby
  • Bookworm.com - Books For Children Of All Ages
  • DPReview - Digital Photography

Those are just a few, as you can see.

EGOL asks, "So, I wonder how Matt would have answered the question if the person was asking about fifty followed links in his footer was J. Bezos?"
I bet he would say Google knows about them and doesn't count them.

Friday, December 20, 2013

Google Continues Work On Promoting Topic Authorities

Back in May, we covered how Google is planning on releasing an algorithm to promote subject authorities in the search results. Some call this the "good guy algorithm" and some call it the "author authority algorithm," while others may have some other names for it.

In episode 227 of TWiG, Matt Cutts mentioned that Google is not only trying to break the spirits of the "bad guys" but is also trying to promote the good guys.

Craig Moore transcribed on Google+ what Matt Cutts of Google said. Matt said it about 1 hour and 20 minutes into the video.

We have been working on a lot of different stuff. We are actually now doing work on how to promote good guys. So if you are an authority in a space, if you search for podcasts, you want to return something like Twit.tv. So we are trying to figure out who are the authorities in the individual little topic areas and then how do we make sure those sites show up, for medical, or shopping or travel or any one of thousands of other topics. That is to be done algorithmically not by humans ... So page rank is sort of this global importance. The New York times is important so if they link to you then you must also be important. But you can start to drill down in individual topic areas and say okay if Jeff Jarvis (Prof of journalism) links to me he is an expert in journalism and so therefore I might be a little bit more relevant in the journalistic field. We're trying to measure those kinds of topics. Because you know you really want to listen to the experts in each area if you can.



I am terribly interested to know when this launches and how it impacts the results.

Google has not given an ETA for this launch, and I don't expect them to. But with any winners, in this case the "good guys," others will see losses. So while this algorithm may be about promoting content, some content will ultimately drop in the results because of it.

New Look For Medical Knowledge In Google's Search Results

A WebmasterWorld thread has discussion around a new look for answers around medical or health related queries in Google.

The new treatment, as described by a webmaster, "replaces results with content," which is what the Knowledge Graph does. He goes on to explain that because "content from our sites is placed directly on Google[,] I feel that this will have a huge negative impact in terms of how much actual traffic Google will send to our sites."

Here is a screen shot of what he means:


At least in this case, Google is linking to the sources. It won't be long until Google removes those links and just places the content in its Knowledge Graph. Of course, this pushes down the organic results, which you won't even see above the fold on most screens.

I doubt any of you are surprised with this, outside of the fact that Google is actually linking to the source here.

Google December 19th Ranking Fluctuations Reported

So Google confirmed they reduced how often authorship snippets show in the results. We know that. Matt Cutts strongly implied there was no update on December 17th, despite all the tracking tools lighting up on that date. The same logic, that Google minimizes algorithm updates before the holidays, should apply to what I am seeing today: a lot of chatter, in some niches, about a Google update.

The key indicator I use is webmaster/SEO chatter. I check the chatter at WebmasterWorld and dozens of other forums, and it picked up yesterday. Martin Ice Web, a Senior Member at WebmasterWorld who is based in Germany, is the loudest in claiming an update today. Many agree and see major changes, and there are many threads in the Google forums with individual complaints.

That being said, one of the tools I rarely show you, because it often doesn't match the other tools, is the DigitalPoint Keyword Tracker average-change report. It is a fairly new addition to the forum's sidebar, and it reports changes in actual rankings for hundreds of thousands of sites and, I'd say, millions of keywords. It is based on what webmasters enter into the tracking tool.

It showed a bit of a change on the 17th, but on the 19th it really skyrocketed, just like the forums did.



I emailed Google yesterday to find out if something specific is going on with rankings, but I have yet to hear back.

It could be a refresh of Panda or something else, but I have no confirmation from Google.

Wednesday, December 18, 2013

Google's Matt Cutts: Our Algorithms Try To Break Black Hat SEOs' Spirits

A couple weeks ago, Google's Matt Cutts was on This Week in Google (TWiG) and on episode 227 Matt had some interesting things to say. He said that Google specifically tries to break the spirits of black hat SEOs.

At about an hour and 25 minutes into the video, Matt said:

If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and we’ve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, let’s just stay on the high road from now on.


Here is the video; scroll to just before 1:25:


So in short, Google doesn't just look to prevent money from going to spammers; it looks to break their spirits.

Google: We Don't Control Content On The Web

I spotted a thread in the Google Webmaster Help forums asking a common question about removing content from showing up in Google. The response was even more interesting.

Google's Eric Kuan, from the search quality team, said that Google does not control the content on the web.

He wrote:

Google doesn't control the contents of the web, so before you submit a URL removal request, the content on the page has to be removed. There are some exceptions that pertain to personal information that could cause harm. You can find more information about those exceptions here: https://support.google.com/websearch/answer/2744324.


True, Google cannot control what people publish on the internet, but it is the gateway to that content.

I bet many SEOs and webmasters would argue with Google's definition of "control" here.

Tuesday, December 17, 2013

Anglo Rank Promises To Rebuild After Google Link Penalty

As many of you remember, Anglo Rank was publicly penalized by Google, as Google keeps targeting more and more link networks.

Shortly after, some customers began receiving Google penalties, and the owner of Anglo Rank has been reversing link orders since.

Saturday night, the owner of Anglo Rank announced on Black Hat World that he has halted link orders and is working on rebuilding his network. bluematter, the alias the owner goes by, wrote:

From last few days we are working 24/7 to sort out our affected clients In the recent update, We think that we have spoken to most of you and sorted something out for you, If there Is someone who hasn't got In touch with us kindly contact us and we'll surely help you out.

Our main focus will be now to relaunch this service and make It even better and stronger than before, We'll be closing this thread for now so we can fully work on Improving our system and keep ANGLO RANK moving forward.

The support will be there 24/7, If you need anything just get In touch

Thank you for your patience and understanding guys


I guess this link network has not learned its lesson, or maybe it has: as Google burns a network, rebuild and do it all again. That is the black hat way, right?

Backlinks.com: The Next Link Network Penalized By Google

Google's head of search spam, Matt Cutts, publicly outed on Twitter another link network that Google has penalized. Matt Cutts' new trend is to share a line from the link network's marketing material and then add a word or two saying the opposite: that Google caught them.

Here is Matt's tweet:

"Our installation code/software used to publish the sold links is not detectable by the search engine bots." Au contraire!

— Matt Cutts (@mattcutts) December 13, 2013


This came a week, literally a week, after Google outed Anglo Rank.

In fact, Matt joked on Twitter that Google "should start taking requests for which link networks to tackle next."

Meanwhile, the folks at Black Hat World are not happy, for a few reasons. One said, "It's crazy how he can get away with ruining businesses. It's not like spamming the internet's illegal and Google doesn't own the internet." Well, it is spamming Google, and I guess Google has the right to fight back?

That being said, guys: stop buying links unless you want to play the crash-and-burn game.

I received an email from someone who got one of these link penalties but swore they never paid for links. It turns out they had: their SEO company bought links on their behalf.

Be careful, and don't mess up your 10-year-old web site with these schemes.

Friday, December 13, 2013

Reporting Offensive Images In Google Image Search

Google has been changing the way they let searchers report offensive images for years now.

Now, it may be even more confusing to report offensive images. A user reported a NSFW search result to me on Twitter for a keyword that is very safe for work. The keyword is [dining], and if you scroll down through the image results, you get a topless woman in a seductive pose.

So I tried to figure out how to report the image, and it took me some time. Here are the steps to report an offensive image in Google Image Search.

(1) Click on the image so the preview opens, then click the "send feedback" link on the bottom right of the image box; it is really small:

Reporting Offensive Images In Google Image Search
(2) Type in your feedback and hit next:
Reporting Offensive Images In Google Image Search

(3) Select how you want to send the feedback; I used the default "highlight" option:

Reporting Offensive Images In Google Image Search

(4) Review the details and hit submit:

Reporting Offensive Images In Google Image Search


(5) Google then thanks you. That said, I did this yesterday and the image is still there:

Reporting Offensive Images In Google Image Search
Forum discussion at Twitter.

Thursday, December 12, 2013

Google Panda Impacting Your Mega Site? Use Sitemaps To Recover?

A WebmasterWorld thread has discussion around how to determine which sections or categories of your web site are impacted by Google's Panda algorithm.

Panda has not gone away; sites are still suffering in Google's rankings under the Panda algorithm. New sites are hit monthly, and some sites are released from its grip as well.

The thread talks about one technique for large sites to determine which sections of their sites are impacted by Panda.

The concept is to use XML Sitemaps, breaking the Sitemap files into the logical structure of the web site. Once Google processes all the files, Webmaster Tools will quickly show you how many pages within each Sitemap file were indexed.

One webmaster explained how he went about this technique:

The sector I was perfoming in. allowed me to created dozens of sitemaps of 100 pages each. No reason why any of the pages should not be indexed. I found some sitemaps with 0 listed then others from 25 up to the full 100. I then discovered trends. IE pages with similar title tags and URLS. (the on page content was considerably different, which is why I did not remove them initially)

I then did different experiments with each sitemap group, until I saw a recovery, then applied the solutions across the board.
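To make the technique concrete, here is a minimal sketch in Python that writes one Sitemap per site section plus a Sitemap index; the section names and URLs are hypothetical:

from xml.sax.saxutils import escape

# Hypothetical site sections mapped to their URLs.
sections = {
    "reviews": ["http://www.example.com/reviews/page-%d" % i for i in range(1, 101)],
    "guides": ["http://www.example.com/guides/page-%d" % i for i in range(1, 101)],
}

# One sitemap file per section, so Webmaster Tools reports indexed
# counts section by section.
for name, urls in sections.items():
    with open("sitemap-%s.xml" % name, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write("  <url><loc>%s</loc></url>\n" % escape(url))
        f.write("</urlset>\n")

# A sitemap index pointing at the per-section files.
with open("sitemap-index.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for name in sections:
        f.write("  <sitemap><loc>http://www.example.com/sitemap-%s.xml</loc></sitemap>\n" % name)
    f.write("</sitemapindex>\n")

Submit the index, and once it is processed, compare submitted versus indexed counts per section to see where the problem pages cluster.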


The question I have, and I am not certain of the answer: I thought that for sites impacted by Panda, the pages are indexed but just don't rank as well. Meaning, if a page is not indexed, that is more of a crawl budget and PageRank issue (or a server setup issue) than a Panda issue. With Panda, the content has to be indexed for Google to know not to rank it well. Am I wrong?

Google's Matt Cutts Agrees, Guest Blogging Is Getting Out Of Hand

So Matt Cutts released another video yesterday, and it is at least his fourth video on the topic of guest blogging.

The deal is, Google is saying guest blogging as a whole is getting spammier by the day. As good things get abused, over time those good things turn into bad things.

Matt said guest blogging is "growing" in terms of spam and abuse. So he laid out the basic "do nots" that everyone who reads here already knows:

  • Don’t make guest blogging your only link building strategy
  • Don’t send out thousands of mass emails offering to guest blog to random sites
  • Don’t use the same guest article on two different sites
  • Don’t take one article and spin it many times

Here is the video:


Like I said, we've covered the topic many times and Matt has three other videos. Here are links to those topics:

Google Penalty Notifications Sent To Anglo Rank Link Buyers

As you know, Google's Matt Cutts publicly announced that Google went after Anglo Rank's link network and that the penalties and notifications would go out within a few days.

Starting yesterday, some Anglo Rank customers claimed they had received such notifications, and some claimed they saw huge drops in rankings.

One shared a screen shot of his link penalty notification publicly on Black Hat World. The notification was for "unnatural inbound links"; here is a picture of it, including the affected site:

Anglo Rank Google Webmaster Penalty Notification

Now, I have no way of confirming that this specific site received this specific notification because of Anglo Rank, but this person claims so.

Another customer of Anglo Rank claims he saw major drops in rankings and shared a chart showing the decline.

Meanwhile, the operator of Anglo Rank, who goes by the alias bluematter, is responding to customers, offering cancellations or redirection of links. Here are some of the things he has said in the thread:

It will keep working like It was before, they can target few sites here and there but It Is not possible for them to take these private networks down which has 10000s of sites.

After this whole tweet thing happened, we did a full backend audit which we do anyway in few days. We went through all the links and checked and replaced all the links which were no longer Indexed In google anymore, when they flag network sites this Is the first thing they do deindex them.

The other thing which we are doing right now Is to move all our clients to the sites which were added recently in these networks for example like a week ago, as these sites were added just days ago so there Is no chance they'll be flagged.

Mreover If you are using a blackhat service and not expecting a manual penalty or rank drops after few months then I don't think this or any other blackhat service is for you man. These links are for churn and burn sites like i have said many times before in this thread do not point these links to a site which you can't afford to lose

Thank you for the update but what was you expecting that you gonna be still ranking for 10 years ehhh? thats why i have been mentioning this 10s of times in this thread that this service is blackhat and for churn and burn sites. If you would like us to move your links to a new domain let us know and we'll sort it out for you.


It is always interesting to see the fallout and the reactions from both the webmasters and the owners of these programs.

Tuesday, December 10, 2013

Facebook's News Feed Gets Its Own Google Panda Algorithm

A week or so ago, Facebook announced changes to its News Feed algorithm, trying to surface better shared content to users.

Facebook said the "update to News Feed ranking recognizes that people want to see more relevant news and what their friends have to say about it." They want more relevant items to show up in the feed, and fresher comments as well.

AllThingsD equated this to the likes of Google's Panda algorithm.

Peter from AllThingsD interviewed Lars Backstrom, the News Feed manager at Facebook, and asked him about Panda, to which Lars responded:

I’m not totally familiar with the details of Panda. At least from the way you described it, it’s maybe not quite at that scale. But it’s kind of a step in that direction.

Whenever we make a change like this, it has the potential to break some of the strategies employed by people who get distribution on Facebook. My favorite example of this is when you have a photo, and then a very explicit call to action where you say “one like = one respect.”

So, when the text or photo has a call to action, those posts naturally do much better. And in a traditional feed ranking, where we’re evaluating just on the number of likes, those things all did very well.


The funny thing is that Matt Cutts tweeted it:




Google’s Matt Cutts: Link Spamming Google For A Specific Time Period? Then Mass Disavow Those Links.

In a recent video released by Matt Cutts, Google’s head of search spam, Matt answered the question, “How can a site recover from a period of spamming links?”

The example given was Interflora, which was penalized by Google for buying links but only for 11 days. The question was: how can a site with a penalty get its rankings back in 11 days, like Interflora did?

Matt didn’t give a specific answer to the question; instead, he said he wanted to answer it in a general sense.

Matt said you should disavow the bad links with a vengeance, disavowing every link that might be paid. Don’t use the disavow tool a single link at a time; instead, use the domain-level disavow option. Matt has said this before, explaining that you should use the disavow tool more like a machete.

So if you know you paid for links within a specific date range, you can technically disavow all the links you acquired in that range, or at least most of them, at the domain level, as in the sketch below.
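As a rough illustration, if you keep a log of when each link was acquired, a few lines of Python can turn the suspect period into domain-level disavow entries. The link records here are hypothetical; the output format, # comment lines plus domain: entries, is the one Google's disavow tool accepts:

from datetime import date
from urllib.parse import urlparse

# Hypothetical records of (linking URL, date acquired).
links = [
    ("http://spammy-directory.example/widgets", date(2013, 3, 2)),
    ("http://news.example.org/honest-review", date(2012, 7, 9)),
]

start, end = date(2013, 1, 1), date(2013, 6, 30)  # the paid-link period
domains = {urlparse(url).netloc for url, when in links if start <= when <= end}

with open("disavow.txt", "w") as f:
    f.write("# Links acquired between %s and %s\n" % (start, end))
    for domain in sorted(domains):
        f.write("domain:%s\n" % domain)

That produces a file you upload through the disavow tool, with one machete-style domain: line per offending site.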

Here is the video:

Friday, December 6, 2013

Google Shocks Webmasters By Updating Toolbar PageRank Before Year End

Early this morning, Google pushed out a Google Toolbar PageRank update. This update shocked webmasters because no one expected it, at least not in 2013.

As you may remember, the last Toolbar PageRank update was over 10 months ago, on February 4th. As I said at the six-month point, it was unusual for Google not to push out a PageRank update quarterly. Then Matt Cutts, Google's head of search spam, told us there wouldn't be a PageRank update before 2014 - or at least he thought so.

Today, December 6th, the Google Toolbar PageRank values have been updated. I guess the upper management, executives, or Larry Page, didn't want PageRank to go away after all.

This makes me sad, as I am sure it makes many Googlers sad. Why? Because SEOs and webmasters obsess over Toolbar PageRank, to Google's own chagrin. And as I have said time after time:

Despite PageRank still being part of the algorithm, SEOs know that Toolbar PageRank is often outdated and not that useful. In addition, there are many other factors in the algorithm that may or may not be as important as PageRank.


Anyway, I do hope your green pixel dust improved since February.

Here is a useful video from Matt on this:



Update: Matt Cutts confirmed it on Twitter; it seemed like an afterthought:




Google In-Depth Articles Adds Links To More In-Depth Articles

Google's in-depth articles feature just got an update. Google announced it on Google+, highlighting two new features:

(1) A link under the 3 in-depth articles to show more in-depth articles

(2) Explore links to other topics and queries that will show more search results that contain in-depth articles.

Here is a picture of a search for [federal reserve] that shows these two features:

Google In-Depth Articles

Rubén Gómez documented how to search Google's in-depth articles with a URL parameter: specifically, adding &ida_m=1 to the end of your search URL.
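For example, taking the [federal reserve] query above, the results URL would look something like https://www.google.com/search?q=federal+reserve&ida_m=1.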

For more on how to get your content to show up in these results, see our original story on in-depth articles.

Google's Matt Cutts: We Don't Like Content Stitching

Google's Matt Cutts posted a video yesterday explaining that "stitching content" is a bad idea. In short, content stitching is when you take snippets from various articles across the web and place them on a single page, whether you link to the sources or not.

Matt explains there is a difference when you write a summary of a topic using sources: you aren't simply copying and pasting those sources, you are summarizing them and explaining them in more detail. He cites Wikipedia as a good example of doing this.

Bad examples would be just copying and pasting quotes, with or without links to those sources.

I joked on Google+: isn't Google theoretically doing this with its Knowledge Graph? Basically taking snippets of content and putting them together in a box, and heck, they aren't even citing the source.

The truth is, no one likes to read these types of stitched pages. But the Knowledge Graph is user friendly and useful.

Here is Matt's video:


Tuesday, November 26, 2013

M-Dot Domains Need To Be Verified Separately In Google Webmaster Tools

In Google Webmaster Help there is a straightforward question and answer about how to handle M-dot sites (i.e., m.domain.com) in Google Webmaster Tools.

In short, an M-dot is a separate site and should be verified separately in Google Webmaster Tools.

Zineb, a Google Webmaster support representative, answered each question:

(Q) Having verified domain ie.http://www.domain.com in GWT do I need to separately verify it's mobile version ie. m.domain.com? If so which method is the best for mobile?
(A) Yes. You need to verify both URLs in Webmaster Tools. Regarding verification methods, it depends on what you prefer :)

Furthermore, she answered a question about a mobile site on a separate domain:

(Q) Also if the mobile website will have different URL ie. m.domainmobile.com how will that affect the verification (obviously having verified http://www.domain.com wouldn't help).
(A) I don't see why it would affect the verification. Make sure to add the bidirectional annotation to both your sites (mobile and desktop) to help our algorithms understand the relationship between your desktop and mobile pages.
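For reference, the bidirectional annotation Zineb mentions is Google's documented rel=alternate/rel=canonical pairing for separate mobile URLs. A minimal sketch, with hypothetical URLs:

<!-- On the desktop page (http://www.example.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the corresponding mobile page (http://m.example.com/page): -->
<link rel="canonical" href="http://www.example.com/page">

Each desktop page points to its mobile equivalent, and each mobile page points back, which is what helps the algorithms understand the relationship.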

Google Places For Business Adds Reviews Section

Google announced that Google Places for Business has added a new section for business owners to manage and respond to reviews left on their Google Maps business listings.

Google said:

Today, we’re introducing Reviews in Google Places for Business. Now you can learn what your customers are saying about your business on Google and across the web, in one place. If you have a verified business listing, you will now see your customer ratings and reviews in the easy-to-use review inbox.


Google's Jade Wang shared the news with businesses in the Google Business Help forum, adding:

To get started, go to the dashboard you use to manage your business information, click the listing you’d like to manage, and choose Reviews from the left hand navigation menu.

You’ll see a Reviews inbox listing any reviews Google users have left for your business as well as snippets of reviews written about your business on other websites. The Reviews analytics tab includes information detailing where users have evaluated your business, and the average score of reviews of your business.


Not all listings are in the new Google+ Local section yet; they will be upgraded in the future.

Here is a picture:

Google Places For Business Adds Reviews Section

Local SEOs are very happy about this addition - that is, until it gets plagued with bugs.

Google’s Matt Cutts: We Dropped The 100 Links Per Page Guideline But We May Take Action If It Is Too Spammy

Google’s Matt Cutts posted a video explaining why Google no longer has that 100-links-per-page Webmaster guideline.

In fact, the guideline was dropped well before 2008, but SEOs and webmasters still think having over 100 links on a page is something that may lead to a penalty.

The truth is: no, it won't. Sites like Techmeme likely have thousands of links on their home page, and they are not penalized by Google.

That being said, if a site looks spammy and has way too many links on a single page, Google reserves the right to take action on it.

Matt also explained that your PageRank is divided by the number of links on a page. So if page A links to pages B, C and D, page A's outgoing PageRank is split three ways. If you have hundreds of links, it is divided by hundreds, and so forth.
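To make the arithmetic concrete, here is a toy sketch of that split under the classic simplified PageRank model; the 0.85 damping factor is the textbook value, an assumption on my part, and real rankings involve far more than this:

def passed_pagerank(page_rank, outlink_count, damping=0.85):
    # Each linked page's rough share of this page's PageRank (simplified model).
    return damping * page_rank / outlink_count

# Page A with PageRank 1.0 linking to pages B, C and D: each gets ~0.283.
print(passed_pagerank(1.0, 3))

# The same page with 300 links passes only ~0.0028 per link.
print(passed_pagerank(1.0, 300))

So more links per page means a thinner slice of PageRank per link, but as Matt says, not a penalty.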

Here is Matt’s video:


Sunday, November 24, 2013

Google's Search Results Rocky This Week.

This week has been a mess for Google's search results, despite Google denying to me that anything is going on.

Some tools sparked up on November 14th, although I didn't see matching webmaster chatter then.

All this week, I've been seeing a lot of sporadic complaints in both the WebmasterWorld and Google Webmaster Help forums. Typically, these sporadic reports, at least I think, mean Google targeted a link network and some sites were majorly impacted by it. It may also be a weird Google bug, but not necessarily a Google update. Of course, it could be that Panda was updated, which Google has stopped confirming.

It is hard to tell.

The tools are all lighting up over the past few days. Moz shows warmer-than-normal temperatures the past couple of days, SERPs.com has higher volatility numbers than normal, SERP Metrics shows higher flux than normal, and Algoroo has been in the red the past couple of days. Something seems to be up.

Is it a major algorithm update? I don't think so. If I had to guess, maybe Panda was rerun or maybe Google squashed some sort of link network.

I am honestly not sure but Google is indeed a bit rocky the past few days.

Forum discussion at WebmasterWorld and Google Webmaster Help.

Google Not Indexing Your Home Page? Remove The NoIndex Tag.

One of the most basic reasons Google or any search engine won't index and rank a web page is because you won't allow them to.

Either the spiders cannot access the site due to technical issues, or you are explicitly instructing them not to index your pages with a noindex tag.

I've written about this before, in Don't Overlook The Obvious: NoIndex Tag, but it often is overlooked.

Heck, I see it all the time in the forums. I've been called by large Fortune 500 companies with SEO issues, and more than once I've seen a noindex tag on their home page causing the issue. Sometimes these tags are hard to spot due to redirects, so use an HTTP header checker tool to inspect the responses before the redirects, as in the sketch below.
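Here is a minimal sketch of that check in Python, using the requests library and not following redirects, so you see the first response in the chain; the URL is hypothetical:

import requests

url = "http://www.example.com/"
resp = requests.get(url, allow_redirects=False, timeout=10)
print(resp.status_code, resp.headers.get("Location", "(no redirect)"))

# noindex can arrive as an HTTP header...
if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
    print("X-Robots-Tag noindex on this response")

# ...or as a robots meta tag in the HTML itself.
body = resp.text.lower()
if 'name="robots"' in body and "noindex" in body:
    print("Possible noindex robots meta tag; inspect the HTML")

Run it against each hop of the redirect chain and the offending noindex usually turns up fast.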

But don't overlook the obvious, check that first.

John Mueller of Google spotted a complaint in the Google Webmaster Help forums, and after a day of back and forth between the site owner and people in the forum, John came in and said it was more obvious than that:

It looks like a lot of your pages had a noindex robots meta tag on them for a while and dropped out because of that. In the meantime, that meta tag is gone, so if you can keep it out, you should be good to go :).


Of course, Google knows that because they have those details. But if those details are hidden from SEOs and the meta tag is removed before they can spot the issue - then what?

Forum discussion at Google Webmaster Help.

Friday, November 22, 2013

Google’s Matt Cutts: Feel Free To Use The Disavow Tool Even Without A Manual Action

In a new video answer today, Google's head of search spam, Matt Cutts, says that you can use the disavow tool even if you do not have a manual action.

In which cases can you use this tool even without a manual action?

(1) You tried to remove links but the webmaster linking to you will not remove the links.

(2) You think you may have been hit by negative SEO.

(3) You see links pointing to your site you do not want to be associated with.

(4) You saw a link bomb attack and are afraid it might hurt your site.

(5) You are afraid someone will submit a spam report about you.

(6) You see your rankings drop and you think it has to do with an algorithm Google ran, such as the Penguin algorithm.

(7) You can’t sleep at night because of some of the links you have.

But Matt said the primary use case for the disavow tool is when you did bad SEO, or hired a bad SEO who built bad links to your site, and you can't get those links removed. Still, feel free to use the tool in the cases above, even without a manual action.


