Tuesday, February 25, 2014

Google Targets Two Polish Link Networks While Continuing To Target German Link Networks

Google's head of search spam, Matt Cutts, posted on Twitter that Google has taken action on two link networks operated in Poland this week. Matt wrote that Google is “not done with Germany yet, but we just took action on two Polish link networks.”

The Google Poland Webmaster Blog posted a reminder today about unnatural links and how to submit a reconsideration request.

Matt Cutts didn’t drop a hint on which Polish link networks were specifically targeted, as he has done in the past. But he did confirm that Google took action on two link networks within Poland.

Here is Matt’s tweet:



Earlier this month, Google’s Matt Cutts announced that Google took action on a large German SEO agency and its clients for link schemes. This came after a warning from Cutts that Google would target German link violations.

Google Penalties Might Follow You To A New Domain Name

So we know that if you have a penalty on your site and you move your site to a new domain, redirecting the old URLs to that new domain, the penalty will follow because of the redirects. That is known.

What I did not know is that if you move your site to a new domain but do not redirect the old domain to the new one, Google may still pass along the penalty, even without the redirects.

If the new site is basically a copy of the old one and all you are doing is moving it to a new domain in order to leave your link or other Google penalty behind, it might backfire on you. Google's John Mueller told me yesterday in a video, at 23 minutes and 15 seconds in, that it is very possible the penalty will follow you to the new domain even without redirects.

John said that if you have a site with a penalty and you take the site and simply move it to a new domain name, even without using the site migration tool or setting up redirects, Google may figure out it was a site move and pass along the penalty.

So the short answer is: do not simply move your site to a new domain, because Google may pick up on it, stamp it as a site move automatically and thus pass along your penalty.
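For reference, here is what the redirect-based move looks like, as a minimal sketch: a tiny Node server in TypeScript that permanently redirects every path on the old domain to a hypothetical new one. With redirects like this in place, the penalty is known to carry over; John's point is that even without this step, Google may still detect the move.

```typescript
// Minimal sketch of a domain move done with 301 redirects (hypothetical domain).
import http from "node:http";

const server = http.createServer((req, res) => {
  // Send every URL on the old domain to the same path on the new domain.
  res.writeHead(301, { Location: `https://new-domain.example${req.url ?? "/"}` });
  res.end();
});

// Served from the old domain; redirects like these make the move explicit to Google.
server.listen(80);
```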

Here is the video, fast forward to 23 minutes and 15 seconds in:


Google May Or May Not Use EXIF Data For Ranking Images

Typically when Matt Cutts of Google posts a video, there is a yes or no, or he is leaning strongly one way or another in his answer. But this time, I am honestly confused.

Matt posted a video on the question, "Does Google use EXIF data from pictures as a ranking factor?" Yes or no? The answer he gave was that they reserve the right to use EXIF data in image search rankings. But do they? I don't know.

Listen to his answer:


Was that a yes or a no? Was it an "it depends"? Was it "they don't now, but they may in the future"? I am honestly not sure.

What is your take?

On Google+, some think yes and some think of course they do. But why such a wishy-washy answer?
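For those wondering what EXIF data actually is: it is metadata the camera embeds in the image file itself, such as the camera model, capture date and GPS coordinates. Here is a minimal sketch, for Node in TypeScript, that only detects whether a JPEG carries an EXIF segment at all; parsing the individual tags takes a full reader, and nothing here implies how, or whether, Google weighs them.

```typescript
// EXIF lives in a JPEG's APP1 (0xFFE1) segment, whose payload starts with "Exif".
import { readFileSync } from "node:fs";

function hasExif(path: string): boolean {
  const buf = readFileSync(path);
  let offset = 2; // skip the 0xFFD8 start-of-image marker
  while (offset + 4 < buf.length && buf[offset] === 0xff) {
    const marker = buf[offset + 1];
    const size = buf.readUInt16BE(offset + 2); // segment length, includes these 2 bytes
    if (marker === 0xe1 && buf.toString("ascii", offset + 4, offset + 8) === "Exif") {
      return true;
    }
    offset += 2 + size; // jump past this segment to the next marker
  }
  return false;
}

console.log(hasExif("photo.jpg")); // does this image carry EXIF metadata at all?
```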

Sunday, February 23, 2014

Google's Matt Cutts Stops By WebmasterWorld After Three Years Of Silence

The last time Matt Cutts of Google posted something at WebmasterWorld was over three years ago on January 4, 2011. He broke his silence there yesterday posting in a private member-only WebmasterWorld thread.

Why does this matter? I am not sure, but he posted as the famous GoogleGuy at WebmasterWorld for years and years. Three years of not posting at a forum that was like a home to Cutts is a pretty long time. Of course, Matt is busy helping webmasters at a larger scale with his videos and blog posts.

What did Matt post? I won't share all the details of the thread, since it is a private thread. But I will say a large site (which was unnamed) accidentally blocked itself with a robots.txt file and was looking for ways to get reindexed quickly. Matt suggested using the Fetch as Googlebot feature to expedite things.

Here is a screen cap of what Matt wrote:

[Screenshot: Matt Cutts's WebmasterWorld post]

So the tip is useful in that we know Fetch as Googlebot can expedite some indexing.
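As a refresher on the kind of mistake in that thread, here is a minimal sketch in TypeScript: a robots.txt that accidentally blocks an entire site, plus a simplified disallow check. Real robots.txt matching (wildcards, longest-rule-wins, per-agent groups) is more involved than this prefix test.

```typescript
// The classic accidental block: one directive that disallows everything.
const robotsTxt = `
User-agent: *
Disallow: /
`;

// Simplified check: is this path covered by any Disallow prefix rule?
function isDisallowed(path: string, robots: string): boolean {
  const rules = robots
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.toLowerCase().startsWith("disallow:"))
    .map((line) => line.slice("disallow:".length).trim())
    .filter((rule) => rule.length > 0);
  return rules.some((rule) => path.startsWith(rule));
}

console.log(isDisallowed("/any/page", robotsTxt)); // true -- the whole site is blocked
```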

Google Analytics Now Within Google+ Local Dashboards

Daniel Waisberg, the Analytics Advocate at Google, posted on his Google+ page a new feature that rolled out within the Google+ Local Admin Dashboard screens.

Now, if you link your Google+ Local listing with your Google Analytics account, the Google+ Local Dashboard will show a snapshot of Google Analytics data. The data will include a 30-day snapshot of your web site's traffic, showing new visits, unique visits and pageviews, and the percentage change from the previous 30 days.

Here is a picture:

[Screenshot: the Google Analytics snapshot in the Google+ Local Dashboard]

Forum discussion at Google+.

Google Places Quality Guidelines Updated For Business Names

Google has updated their Google Places quality guidelines once again, this time to clarify how you can name your business within Google Places/Google Local/Google Maps.

Jade Wang from Google pulled out the changes and posted them in the Google Places Help forums. The changes include:


  • Your title should reflect your business's real-world title.
  • In addition to your business's real-world title, you may include a single descriptor that helps customers locate your business or understand what your business offers.
  • Marketing taglines, phone numbers, store codes, or URLs are not valid descriptors.
  • Examples of acceptable titles with descriptors (in italics for demonstration purposes) are "Starbucks Downtown" or "Joe's Pizza Delivery". Examples that would not be accepted would be "#1 Seattle Plumbing", "Joe's Pizza Best Delivery", or "Joe's Pizza Restaurant Dallas".


Hopefully that clarifies things a bit better, because these guidelines are updated relatively frequently.
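To make the rules concrete, here is a hypothetical heuristic in the spirit of the guidelines, in TypeScript; the patterns are my own invention for illustration, not anything Google publishes.

```typescript
// Flag descriptors the guidelines call invalid: phone numbers, URLs, and
// marketing-tagline wording. Store codes are too site-specific to detect
// generically, so this only covers the obvious cases.
function flagTitleIssues(title: string): string[] {
  const issues: string[] = [];
  if (/\d{3}[-.\s]?\d{3}[-.\s]?\d{4}/.test(title)) issues.push("contains a phone number");
  if (/https?:\/\/|www\./i.test(title)) issues.push("contains a URL");
  if (/#1|\b(best|cheapest|top[ -]?rated)\b/i.test(title)) issues.push("marketing tagline wording");
  return issues;
}

console.log(flagTitleIssues("Joe's Pizza Delivery")); // [] -- one locating descriptor is fine
console.log(flagTitleIssues("#1 Seattle Plumbing")); // ["marketing tagline wording"]
```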

Thursday, February 20, 2014

Bing Updates Webmaster Guidelines: Keyword Stuffing Now Off Limits

Last night, Bing updated their webmaster guidelines, adding a section about "keyword stuffing." Surprised it wasn't there from the onset? Yeah, me too, but truthfully, there are a ton of things they can and should add there that are not currently there.

What is new? The section on keyword stuffing, which reads:

When creating content, make sure to create your content for real users and readers, not to entice search engines to rank your content better. Stuffing your content with specific keywords with the sole intent of artificially inflating the probability of ranking for specific search terms is in violation of our guidelines and can lead to demotion or even the delisting of your website from our search results.


I verified using various caching services that the paragraph was indeed not there a day or two ago.

That being said, the language is pretty strong. If you do use keyword stuffing techniques on your site, Bing may give your site a "demotion" or, even worse, "delist" your site from the Bing search results.
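Bing does not publish how it measures stuffing, but as a rough illustration of the signal, here is a TypeScript sketch that computes how much a single term dominates a page's text; any fixed threshold you would apply to it is an assumption, not a published number.

```typescript
// Fraction of a page's words that are one specific term.
function keywordDensity(text: string, term: string): number {
  const words = text.toLowerCase().split(/\W+/).filter(Boolean);
  const hits = words.filter((word) => word === term.toLowerCase()).length;
  return words.length ? hits / words.length : 0;
}

const copy = "cheap flights cheap flights book cheap flights today cheap flights";
console.log(keywordDensity(copy, "cheap")); // 0.4 -- 4 of 10 words, clearly stuffed
```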

Hat tip to +MenasheAvramov for informing me about this.

Google: We May Show Your Redirected URLs In The Search Results

aakk9999, the moderator of the Google forum at WebmasterWorld, spotted that Google's John Mueller said Google may display the redirecting URL in the search results instead of the destination URL, when they deem it appropriate.

What this means is that if you are redirecting URL A to URL B, Google may decide to show URL A in the search results instead of URL B.

It is interesting to note that Webmaster Tools just updated to show the destination URLs in the crawl errors.

Why would Google show the redirecting URL over the destination URL in the search results? John said the redirecting URL may be "actually a nicer looking URL or we have more signals pointing to that URL, maybe canonical, maybe a lot of links pointing to that URL."

Here is the video, where John answers this at about 46:37 in:


Here is the transcript by aakk9999:

Q: And then shouldn't the target be part of Google's index, and not the first one? Because some pages are basically not indexed even though they are the target of a 301 redirect.


A: That depends a bit on how we actually index that. So what would happen here is we would crawl and index all of these pages, and then we might see that one of these URLs is actually a nicer looking URL, or we have more signals pointing to that URL, maybe canonical, maybe a lot of links pointing to that URL, and we will actually show that other URL, the one that is still redirecting, instead of the final destination URL in the search results.

So oftentimes we'll see that, especially with URLs that redirect from the root of the domain to a lower level page, and we try to show a more reasonable URL in the search results even if we know that it actually redirects somewhere else in the end.

So it is not that there is anything technically wrong with that kind of setup, but you might see that we just show one of the higher level pages instead of the lower level URLs, just because it looks nicer in the search results.

Google's Matt Cutts: We Tested Dropping Backlinks From Algorithm, It Was Much Worse

Matt Cutts' latest video has Google admitting that they do indeed test their search results by turning off link data as part of their algorithm. Matt Cutts said the results would be "much, much worse" if they did that for real.

That does make sense, since Google's core algorithm was originally based mostly on links and PageRank, and they have spent all these years improving on it. They have invested so much time and so many resources in using links to rank sites that dropping links now would make a mess.

It is funny, because a couple of weeks ago we asked you what you would do if Google dropped backlinks from the algorithm. So far, we have over 300 responses: 34% said they would be very excited, 32% said they'd be curious and 17% said they'd be very concerned.

Here is Matt's video on the topic:


Here is the transcription:

So we don't have a version like that that is exposed to the public, but we have our own experiments like that internally, and the quality looks much, much worse. It turns out backlinks, even though there is some noise and certainly a lot of spam, for the most part are still a really, really big win in terms of quality of search results.

We played around with the idea of turning off backlink relevance, and at least for now, backlink relevance still really helps in terms of making sure that we return the best, most relevant, most topical set of search results.

Wednesday, February 19, 2014

Google’s Matt Cutts: Backlink Relevancy Is A Big Win In Terms Of Search Quality

In today’s video from Matt Cutts, Google’s head of search spam reaffirms the significance of backlinks.

The video is in response to the following question:

Does the big G have a version of the search engine that totally excludes any backlink relevancy? I’m wondering what search would look like and am curious to try it out.

While Cutts says there is no public version of the search engine that excludes backlink relevancy, he confirms his team has run internal tests to see what would happen.

According to Cutts, excluding backlink relevancy resulted in “much worse” search quality.

“It turns out backlinks, even though there’s some noise and certainly a lot of spam, for the most part are still a really, really big win in terms of quality for search results,” says Cutts.


Tuesday, February 18, 2014

Google: GoogleBot Follows Up To Five Redirects At A Time

Google's John Mueller said in a webmaster hangout on Friday that GoogleBot will follow up to five redirects at a time; past that, you are probably out of luck.

I don't believe we previously had a solid number on how many redirects Google will follow. This may be, and I may be wrong, the first time Google has given a number for how many redirects they follow at one time.

We have had Matt Cutts talk about PageRank dilution through redirects in the past.

Google's John Mueller said this at 46 minutes and 3 seconds into the hangout embedded below:


Of course, this is useful information for SEOs when doing audits.
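For example, an audit script like this minimal sketch can count the hops in a redirect chain and flag anything past five, the number John mentioned. It uses the standard fetch API with manual redirect handling (Node 18+), and the URL is hypothetical.

```typescript
// Count redirect hops for a URL (Node 18+, where fetch exposes the raw 3xx response).
async function countRedirectHops(startUrl: string, maxHops = 10): Promise<number> {
  let url = startUrl;
  let hops = 0;
  while (hops < maxHops) {
    const res = await fetch(url, { redirect: "manual" });
    const location = res.headers.get("location");
    if (res.status < 300 || res.status >= 400 || !location) break; // not a redirect
    url = new URL(location, url).toString(); // resolve relative Location headers
    hops++;
  }
  return hops;
}

countRedirectHops("https://example.com/old-page").then((hops) => {
  if (hops > 5) console.warn(`${hops} hops -- GoogleBot may give up past five`);
  else console.log(`${hops} redirect hop(s) -- within the limit John described`);
});
```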

Recovery Stories Emerge From Google's Third Page Layout Refresh

On February 6, 2014, Google ran their third refresh of the Page Layout algorithm.

I am happy to report that some webmasters who were initially hit by either the first or second implementations of the Page Layout algorithm have reported recoveries on February 6th.

The ongoing WebmasterWorld thread has a couple of stories; here is one:

Lost 50% traffic then, in one hour. In years never recovered until now, this change, February 6th, 2014. I'm 25-30% up, high for top keywords.

Even though it's about 2.5 years too late, I will take an acknowledgement of all the positive changes I made since that penalty, and I hope it sticks around.

After I redesigned my site in early 2012 using Matt's recommendations, this site, google above fold view tool and removed ads above fold (It was too many, I admit it then and now, I followed adsense suggestions blindly), text naturally moved up. I left site alone and have not touched it since but to add new pages, in that same redesigned early 2012 style.


Of course, as with any recovery stories or examples, we do not have access to their raw data, so we are taking their word for it.

And of course, alongside any recovery story, there are many stories of sites that were penalized by Google in this update.

Friday, February 14, 2014

Google's Advice On Infinite Crawl Pages & SEO

Yesterday, Google's John Mueller, Maile Ohye, and Joachim Kupke co-authored a technical blog post on the Google Webmaster Blog on how to make infinite scroll pages more search engine friendly.

The issue, as you can understand, is that GoogleBot and other crawlers can't scroll down a page and thus can't trigger the action that loads more content. In Google's blog post, they announced the definitive guide on how to make infinite scroll pages more search-friendly.

In short, Google is recommending that you convert the infinite scroll page into a paginated series by using the HTML5 History API; a sketch of the idea follows the key points below. John even made a demo page of infinite scroll that is search engine friendly.


Google's key points here are twofold:
  • Coverage: All individual items are accessible. With traditional infinite scroll, individual items displayed after the initial page load aren’t discoverable to crawlers.
  • No overlap: Each item is listed only once in the paginated series (that is, no duplication of items).
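Here is a minimal browser-side sketch of that recommendation in TypeScript, with hypothetical endpoints: each chunk of the infinite scroll corresponds to a real paginated URL (?page=N) that crawlers can fetch directly, and the History API keeps the address bar in sync as the user scrolls.

```typescript
let page = 1;
let loading = false;

async function loadNextPage(): Promise<void> {
  if (loading) return; // avoid duplicate loads while one request is in flight
  loading = true;
  page++;
  // Each chunk is the same content served at a real, crawlable paginated URL.
  const res = await fetch(`/items?page=${page}`);
  document.querySelector("#items")!.insertAdjacentHTML("beforeend", await res.text());
  // Update the address bar so the visible state maps to a shareable, indexable URL.
  history.pushState({ page }, "", `?page=${page}`);
  loading = false;
}

window.addEventListener("scroll", () => {
  const nearBottom =
    window.innerHeight + window.scrollY >= document.body.offsetHeight - 200;
  if (nearBottom) void loadNextPage();
});
```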

The Typical Day Of A Google Spam Fighter

Yesterday, Matt Cutts, Google's head of search spam, posted a long video describing what the typical day is like for a Google spam fighter.

First, he explains that there are two types of spam fighters: the engineers who write algorithms and the manual-action spam fighters who handle specific cases by hand.

He also explains that there really is no typical day. What a spam fighter is assigned to do for the week might change as the week goes on, depending on the crisis at hand.

It is an interesting video that helps you get some insight into the thought process behind spam algorithms and manual actions.

Here is the video:


What is your favorite takeaway?

Google February Update: Possibly A Google Panda Refresh?

There is a tremendous amount of chatter at WebmasterWorld about a Google update over the past couple of days.

On February 6th we had the Page Layout algorithm update, but that didn't cause much of a fuss in the SEO forums. Something over the past couple of days is.

Moderator Travelin Cat said:

I'm following about 25 client sites and all but 3 had a huge jump in traffic on the 11th. Some doubled their traffic. Hoping this is a trend going forward.

Also, these are all in the U.S., on the West Coast.


There are a lot of people who agree and are noticing changes between the 11th and today.

Mozcast shows heavy activity earlier on, but not on the 11th; however, it has not updated today yet, so who knows. SERPs.com also shows a steady, high-volatility pattern. SERP Metrics is off the charts on February 12th, and Algoroo also shows a lot of high activity.

Plus, we have a lot of chatter at WebmasterWorld and the other Google forums.

Have you noticed a change in rankings at Google on February 11th through 13th? Some are suspecting it might be the monthly Panda refresh.

Google Webmaster Guidelines Updated: Don't Block Google Ads With Robots.txt

A new update has been made to the Google webmaster guidelines document, which now says you should not block the destination URL of a Google ad with your robots.txt file.

The new text is under the technical guidelines and reads:

Make efforts to ensure that a robots.txt file does not block a destination URL for a Google Ad product. Adding such a block can disable or disadvantage the Ad.
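In other words, a hypothetical robots.txt entry like this one, where /ad-landing/ stands in for the path that a site's Google ads point to, is exactly what the new guideline warns against:

```
User-agent: *
# Blocking the landing-page path your Google ads point to can
# disable or disadvantage those ads, per the new guideline.
Disallow: /ad-landing/
```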

The guideline just before it somewhat implies the complete opposite:

Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.


So what do you make of this new change to the Google webmaster guidelines?

This was spotted first by Menashe Avramov.

Thursday, February 13, 2014

Oy Vey: How Google Crippled SEO Companies By One "SEO Expert"

I was almost floored reading a post at the High Rankings Forum from an "SEO expert" who explained how Google has crippled SEO companies and is driving them into "the dreaded adwords."

Grammar issues aside, it is amazing that (1) an SEO can blame Google for making their algorithm detect low quality SEO tactics and (2) an SEO can call these professional SEO tactics.

Here are the "SEO companies tactics" that Google neutralized, supposedly crippling SEO companies in the process. I will quote:

1 - Guest Posting, dont do it
2 - Directory Submissions, dont do it
3 - Article Directories - Dont do it
4 - Forum posting - No relevance or juice passed (dont do it)
5 - Forum Signatures - dont do it
6 - Forum Profiles - dont do it
7 - blog commenting - dont do it
8 - Larger more popular sites dont automatically rank better - Cough bullshit - Asda, Tesco, Amazon, John Lewis just to name a few
9 - Link wheeling - dont do it
10 - Dont buy Links
11 - Dont allow dofollow links in image adverts (banners etc)
12- Create good content. - might work great for very localised search terms but doesnt help in the slightest for national searches

This saddens me. It really does.

Google's Matt Cutts: Bad Grammar In Comments Doesn't Hurt Your Rankings

The other day, Google's Matt Cutts posted a video answering whether poor grammar in comments hurts a page's rankings. In short, the answer is no - bad or poor grammar in comments does not hurt your rankings.

Here is the video answer:


Now, Matt did add that you should not let spam comments through, implying that those may impact your rankings. He also implied that bad grammar in what you write on your own site is a bad thing, and that you should make sure to use proper grammar. Back in 2011, Matt actually said grammar is not used in the ranking algorithm at all. He said this after Panda launched, by the way.

Here is that video:


Now, he does go on to explain that reputable sites tend to spell better. He also mentioned the reading level feature they launched a while back.

Thursday, February 6, 2014

Google & EU Settle Antitrust Claims Over Competitor Links

Yesterday, news broke via Bloomberg that Google and the European Union have settled the antitrust probe regarding rival links in the search results.

Google will dodge EU fines and any finding that it discriminated against competing sites, a year after the U.S. Federal Trade Commission dropped a similar investigation by saying Google was motivated more by innovation than by trying to stifle competition.

The five-year pledge to the European Commission lets Google add new services or alter its search page as long as it grants three links to rival services next to its own specialized search results such as Google Shopping, the Brussels-based EU said. Competitors will pay at least 3 euro cents (4 U.S. cents) to bid for a spot in a shaded box on some of Google’s search pages.


How far will Google have to go to make the EU happy about showing competition in the search results? Danny Sullivan posted examples of images the EU released showing what they expect to see. Will Google go to this extreme? I can't imagine it, but we will see:

Here are examples:

[Screenshots: the EU's example layouts]

This is some bold stuff to expect from Google.

Google Giving Some Business Owners 3 Weeks To Save Their Google Places Listing

Some business owners are reporting getting emails from Google and are asking in the Google Business Help forums if they are legit.

The emails are legit and are coming from Google; the subject line is, "Action Required: You have 3 weeks to save your Google Places Listing."

Jade Wang from Google explained that they are making "some changes to Google Places for Business and Google Maps." Because of these changes, Google is "asking business owners to review and confirm some of the information in their Google Places accounts."

Jade added:

If you did receive this email, don’t worry. Please log into Places for Business, take a look at your business information, update it if necessary, and click “Submit.” You’ll need to do this for all listings in your account by February 21, 2014, so they can stay on Google Maps. Otherwise, you’ll need to add your business information and undergo PIN verification using Google Places again.

Google+ Local Pages Now Show To Owners Before PIN Verification

Google's Jade Wang posted an announcement in a Google Business Help forum thread that they are changing how Google+ Local pages work for business owners.

Now, business owners can see the Google+ Local page they created before it is verified via the PIN verification system. It won't show on the map or in other places consumers might see it until it is verified, but it will show to the local business owner.

Jade explained this in detail:

If you’re creating a listing in the new Places for Business dashboard, now, you won’t have to wait to complete PIN verification before you can see the +page, for most businesses. Just follow the link from your dashboard to see the new page. You will be able to use Google+ social features on this unverified page, but please note -- you still need to complete PIN verification before the page will start showing up in Google Maps and across other Google properties.

If you’ve got an unverified local Google+ page (made using Google+ in the local business/place category), then we still encourage you to PIN verify this page so that it can start appearing in Google Maps and across other Google properties.

If you’re creating a local Google+ page (using Google+ selecting the local business/place category) for a business that we think is already in Google Maps, then you may need to go through both PIN verification and our admin request flow before you can manage the page.

Wednesday, February 5, 2014

Google Sends Manual Actions For Rich Snippet Spam & Spammy Structured Markup

Rich snippet spam has been an issue since rich snippets came out, and Google eventually added a tool for reporting rich snippet spam. Then, recently, Google dropped the number of rich snippets showing in the search results.

It seems like Google is now sending out notifications to those who have been spammy with their rich snippets.

One webmaster posted in the Google Webmaster Help forums a notification he received of a manual action for "Spammy structured markup."

Here is the text of that notification:

Spammy structured markup

Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google's Rich Snippet Quality guidelines.


This is the first time I have seen a webmaster report getting a manual action sent to them for spammy structured markup.

Tuesday, February 4, 2014

Google's New Search Funnels Attribution Modeling Tool

Google announced on Google+ a new AdWords tool for modeling the attribution of the clicks that lead to your conversions. The tool is named the Search Funnels Attribution Modeling Tool.

The Search Funnels Attribution Modeling Tool offers five models for assigning value to the keywords, ad groups, and campaigns that lead to conversions. While most advertisers assign all of the credit for a conversion to the last click, that may not reflect the true value of each interaction.

The attribution models included in the Search Funnels Attribution Modeling Tool are listed below; a sketch of how each one distributes credit follows the list:

  • Last click: Gives all credit for the conversion to the last-clicked keyword
  • First click: Gives all credit for the conversion to the first-clicked keyword
  • Linear: Distributes the credit for the conversion equally across all clicks on the path
  • Time decay: Gives more credit to clicks that happened closer in time to the conversion
  • Position-based: Gives 40% of credit to both the first- and last-clicked keyword, with the remaining 20% spread out across the other clicks on the path
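To make the math concrete, here is a TypeScript sketch of how each model distributes credit across an ordered path of clicks. The time-decay half-life and the normalization details are assumptions for illustration; Google does not publish the exact formulas.

```typescript
type Model = "lastClick" | "firstClick" | "linear" | "timeDecay" | "positionBased";

// Returns one credit share per click on the conversion path (shares sum to 1).
// daysBeforeConversion is only used by the time-decay model.
function distributeCredit(
  model: Model,
  clicks: number,
  daysBeforeConversion: number[] = []
): number[] {
  const credit = new Array<number>(clicks).fill(0);
  switch (model) {
    case "lastClick":
      credit[clicks - 1] = 1;
      return credit;
    case "firstClick":
      credit[0] = 1;
      return credit;
    case "linear":
      return credit.fill(1 / clicks);
    case "timeDecay": {
      // Assumed exponential decay with a 7-day half-life.
      const weights = daysBeforeConversion.map((d) => Math.pow(2, -d / 7));
      const total = weights.reduce((a, b) => a + b, 0);
      return weights.map((w) => w / total);
    }
    case "positionBased": {
      if (clicks === 1) return [1];
      credit[0] = 0.4;
      credit[clicks - 1] += 0.4;
      for (let i = 1; i < clicks - 1; i++) credit[i] = 0.2 / (clicks - 2);
      const total = credit.reduce((a, b) => a + b, 0);
      return credit.map((c) => c / total); // normalizes the two-click edge case
    }
  }
}

console.log(distributeCredit("positionBased", 4)); // [0.4, 0.1, 0.1, 0.4]
console.log(distributeCredit("timeDecay", 3, [14, 7, 0])); // later clicks earn more
```

Note how the position-based branch reproduces the 40/20/40 split described in the list above.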


[Screenshot: the AdWords Search Funnels attribution modeling tool]

AdWords said on Google+:

Use this tool to examine five different attribution models in AdWords to better understand how different bids for undervalued keywords can help you reach customers earlier in the purchase journey, driving even more conversions.
We’re working hard on attribution-related features to help you better measure the value of your AdWords advertising. Stay tuned!

More details over here.

DMOZ Drops Over 1 Million Sites From Directory?

Did you notice that DMOZ, one of the oldest and largest human-crafted web directories, has dropped over 1 million sites and roughly 10,000 editors?

A DigitalPoint Forum thread first noticed it. If you look at the live site now, you will see 4,261,763 sites, 89,252 editors and over 1,019,865 categories in the footer. But if you go to the WayBackMachine archive, you will see 5,310,345 sites, 99,997 editors and over 1,019,508 categories.

Here are screenshots:

NOW:

[Screenshot: the current DMOZ footer counts]

OLD:

[Screenshot: the archived DMOZ footer counts]

As you can see, DMOZ dropped 1,048,582 sites and 10,745 editors from their directory. There was no announcement about this, so I am not sure if it is just a glitch in the footer.

They did however post a rare blog post announcing a new feature for reporting listings.

Of course, most of you don't bother with DMOZ listings anymore anyway, but still, it is interesting to see 1 million sites just vanish from DMOZ.