Monday, June 24, 2013

Google's Matt Cutts Says DuckDuckGo Maybe Not So Private?

Matt Cutts, Google's head of search spam, has a history of defending Google and pointing out misconceptions of their competitor, DuckDuckGo, on Hacker News.

He did so when talking about DuckDuckGo & robots.txt directives and he did so when talking about filter bubbles.

Next up, a new one talking about DuckDuckGo's traffic increases due to the NSA privacy concerns. The Hacker News thread has Matt (1) defending Paul Buchheit's slogan of "don't be evil" and then (2) saying he believes he saw IP addresses in DuckDuckGo ads.

Matt wrote in the thread:

In the past I believe I've seen search ad links on DDG that included my IP address in the URL.


This is in response to a Bing ad deal DuckDuckGo has.

What is Matt insinuating back at DuckDuckGo? :)

I love this stuff but note, the animated GIF is not meant to mock, I wanted to find an image of Matt pointing something out and decided to go this route.

Forum discussion at Hacker News.

Saturday, June 22, 2013

Google Links Reconsideration Request FAQs

Two of my favorite Google Search Quality folks, Kaspar Szymanski and Uli Lutz, put together an article on the Google Webmaster Central blog named Backlinks and reconsideration requests.

The article is basically an FAQ on when to use a reconsideration request, specifically around bad and low-quality links.

Here are the questions but to get the answers, go to the story:

  • When should I file a reconsideration request?
  • Should I file a reconsideration request if I think my site is affected by an algorithmic change?
  • How can I assess the quality of a site’s backlinks?
  • How do I clean a bad backlink profile?
  • How much information do I need to provide?
  • How long does it take to process reconsideration requests?
  • What are the possible outcomes of a reconsideration request?
  • Where can I get more guidance?

Kaspar added on Google+ that they are "working on translating this post to Chinese, Japanese, French, Polish, Portuguese, German and Italian."

Forum discussion at Google+.

Friday, June 21, 2013

Google Language Change: Content That Gets Used & Shared, Not Linked To.

As I reported yesterday at Search Engine Land with Google Changes Ranking Advice, Says Build Quality Sites Not Links - Google made a change to their messaging on the ranking article help document. This was first spotted by @Baeumlisberger and it is an important change.

As I said yesterday, this "change is to keep Google consistent with their general change in messaging that content is what webmasters should focus on, not links."

What was the actual change? The line used to read, "In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages." Google changed the last part to read "by creating high-quality sites that users will want to use and share."

So it is no longer about increasing the number of quality links, it is more about increasing people who want to use and share your content. Again, this is Google's new messaging.

This is not to say that links are no longer important or that social is now more important. Social is currently not a major factor, but that may change in the future. It is simply the way Google wants webmasters to think about their content.

Here are screen shots from before and after...

Old:

Google high quality links

New:

Google high quality content

Some folks feel Google is trying to water down things and confuse webmasters.

Forum discussion at Threadwatch, Google+ and Twitter.

Wednesday, June 19, 2013

Google Search May Be Updating

There is a lot of new chatter in the WebmasterWorld forums, along with random threads at other forums including Google Webmaster Help, about Google's search results shifting around, with some webmasters claiming huge declines in referrals from Google starting around 8pm EDT last night.

Some are hoping it was just an Analytics issue but looking at the reports throughout the day and this morning, it seems like it wasn't.
One webmaster said:

Another 50% drop from what has left till yesterday. That is now 25% left from pre-panda/penguin. Serps itself look quiet spammy like normal.

Another said:

Seeing 10-15% drop here. Pretty much all of our Google traffic is gone. Direct only now.


We know we are waiting for Google to slowly push out a softer Panda update, but would a softer update result in more complaints or fewer?

The last time we reported a shift without Google confirmation was on June 5th, and there are still many sites that have seen huge drops on that date, despite Google not confirming anything.

Mozcast just updated and it is not showing major changes.

Did you notice major changes late last night into this morning?

Forum discussion at WebmasterWorld.

Tuesday, June 18, 2013

Search Spammers Admit Defeat After Spam Algorithm But Vow To Return

Last Wednesday Google released the spam algorithm to target very spammy niches and "illegal" link building and spamming techniques. Did it work?

For the most part, it seems it did. Despite some black hats mocking Matt Cutts, it seems most of the black hatters in that niche are complaining and admitting defeat.

A BlackHatWorld thread has several people saying they were hit by this update. Here are some quotes.

Seems Matt cutts has been true to his words... first page which was once full of "black hat methods" ranked sites for "payday loans" is now pretty much cleared up...

i managed to rank 2nd and 8th on page one a few days ago, i lasted 8 hours, then was totally sandboxed...


Others explained that while their methods worked in the past, they will have to adapt to get their sites to rank in the future.

Not dead at all. 

Will just take longer than a few hours this time round.

The question also being asked in the thread is which other keyword phrases were impacted by this update. It is not just payday loans; it is likely pornographic terms, casino terms and others.

As a side note, there is another thread started the same day as the payday update, where SAPE link users are claiming they were hit. Not all, but many. Maybe they were in those spammy niches? Weren't SAPE links already penalized? Yes, but I guess maybe not all of those links were.

So while some blackhats may give up, the good ones won't.

Forum discussion at BlackHatWorld.

Matt Cutts: Stock Images Currently Don't Impact Rankings But We'll Look Into It

The latest video from Google's Matt Cutts answers a quick question, "does using stock photos on your pages have a negative effect on rankings?" The answer is currently no.

The images you use on a page have no ranking impact, positive or negative, on whether that web page ranks well. At least not directly.

Matt doesn't get into the possibility of nicer, more unique images having more appeal to readers, bloggers and reporters, which can then encourage more shares and links to the story. But looking at stock images as a ranking factor is currently not something Google does.

Matt Cutts said they may look into using the uniqueness of an image as part of, or a signal in, the ranking algorithm. But I am not sure if he meant that. :)

Anyway, stock images, like the one I used here, should have no impact on whether this story ranks.


Forum discussion at Google+.

Verify your site in Webmaster Tools using Google Tag Manager

If you use Google Tag Manager to add and update your site tags, now you can quickly and easily verify ownership of your site in Webmaster Tools using the container snippet code.
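For context, a rough sketch of where that container snippet usually sits, assuming the placement Tag Manager recommends (immediately after the opening body tag). GTM-XXXXXX is a placeholder; always copy the exact snippet from your own Tag Manager account rather than this sketch:

<body>
<!-- Google Tag Manager container snippet, placed immediately after the opening body tag; GTM-XXXXXX is a placeholder container ID -->
<noscript><iframe src="//www.googletagmanager.com/ns.html?id=GTM-XXXXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<script>/* gtm.js loader script copied from your Tag Manager account goes here */</script>
<!-- End Google Tag Manager -->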


Here’s how it’s done:

1. On the Webmaster Tools home page, click Manage site for the site you’d like to verify, then select Verify this site. If you haven’t added the site yet, you can click the Add a site button in the top right corner.





To do this, you must have "View, Edit, and Manage" account level permissions in Google Tag Manager.

2. On the Verification page, select Google Tag Manager as the verification method and follow the steps on your screen.





3. Click Verify.

And you’re done!

If you’ve got any questions about this verification method, drop by the Webmaster Help Forum.

Saturday, June 15, 2013

Google Search Engineer Defends SEO On Hacker News

Normally it is Googlers defending Google on Hacker News, but Ryan Moulton, a software engineer at Google since 2006 who I think works on the Google search team, is defending SEO and the SEO business there.

In short, a blogger called out an email he received from a link builder asking to buy or get a link from his site. You and I get them all the time. But this blogger called this the destruction of the web.

In response, Ryan Moulton, aka moultano on Hacker News, said:

A large part of SEO is just making it obvious what your page is about. This helps both the user and the search engine. A lot of it is just usability, but usability specifically for a user who is coming from a search engine.


Well, he said that in response to a comment where someone said SEO is a "bad thing."

He later explains why sites and businesses should rank on Google:

You should read this as shorthand for "build your business the traditional way and the users will come." Don't just count on ranking highly for a competitive query for your business to succeed. Build a brand and customers, and people will seek you out. You will have no difficulty ranking for [the name of your company] and this is where most good sites get most of their traffic from Google.


Matt Cutts also chimed in there a few times, also on some levels defending SEO as a business practice and also defending Google when needed.

Forum discussion at Hacker News.

Thursday, June 13, 2013

Google’s Matt Cutts: Same Ad-To-Organic Ratio As Google, You’re Safe From The Top-Heavy Algorithm

At SMX Advanced tonight, Google’s head of search spam, Matt Cutts, announced that if you have the same number of ads or fewer than Google has in its search results, then you are safe from their top heavy algorithm.

In short, if your ad to organic ratio is the same or less than what you see in Google’s search results, you are safe.

This came up during the Ask The SEO session, where Matt Cutts was encouraged to come up on stage to answer some questions.

One question was around why Google has so many ads in its search results. Danny Sullivan joked, would Google penalize Google under the top heavy algorithm? Matt responded seriously that even if the search results pages were indexed by Google, the algorithm that determines whether a web page should be penalized or negatively impacted by the top heavy update would not be triggered.


So you can use Google search results pages as a benchmark for not going overboard on the top heavy update.

Matt Cutts: Google Panda Updated Monthly But Slowly Rolled Out

Back in March, Google said they would stop confirming Panda updates because they are now more baked into the index and algorithm.

Well, at SMX Advanced, Matt Cutts of Google announced that Panda is still updated roughly monthly (although it has been 6 weeks because they are trying to soften Panda - note, the next Panda update should release some sites), but these updates are rolled out gradually over a 10 day period. (How is that for a run on sentence?)

So, Panda might be pushed out on the 1st of the month but take ten full days to fully roll out everywhere. Then it will happen again roughly a month later. So 1/3rd of the month, Panda is rolling out. I called this at Search Engine Land, the Panda Dance.

For more details on the Panda updates, click here.

Forum discussion at WebmasterWorld.

Tuesday, June 11, 2013

Google Payday Loan Algorithm: Google Search Algorithm Update To Target Spammy Queries

Google has officially launched a new search update to target “spammy queries” such as payday loan, pornographic and other heavily spammed queries.

Matt Cutts, Google’s head of search spam, announced this on Twitter saying “We just started a new ranking update today for some spammy queries.” He pointed to the video he published where he talked about upcoming Google SEO changes.

Our summary then was:

While queries that tend to be spammy in nature, such as [pay day loans] or some pornographic related queries, were somewhat less likely to be a target for Google’s search spam team – Matt Cutts said Google is more likely to look at this area in the near future. He made it sound like these requests are coming from outside of Google and thus Google wants to address those concerns with these types of queries.

Here is the video where he pre-announced this change, at about 2 minutes and 30 seconds in:


While at SMX Advanced, Matt Cutts explained this goes after unique link schemes, many of which are illegal. He also added that this is a worldwide update, not just being rolled out in the U.S. but globally.


This update impacted roughly 0.3% of U.S. queries, but Matt said it went as high as 4% for Turkish queries, where web spam is typically higher.

Google's Matt Cutts On Disavow Tool Mistakes

Google's Matt Cutts posted a video the other day explaining the top six or so mistakes that SEOs and webmasters make when using the disavow tool.

By far, the most common mistake is uploading anything but text (TXT) files. Many upload Word documents or Excel files; they should not - you should only upload TXT files.

Here is the video followed by the six most common disavow tool mistakes:


(1) The file you upload should be a regular text file only, with no special formatting. People often upload Word docs, Excel spreadsheets, etc. Just upload a text file.

(2) Typically, users' first attempt is to be very specific and fine-tuned with individual URLs. Instead, use the domain: command and disavow the whole site. That is often better. See our machete story.

(3) Wrong syntax is another common issue; use the correct syntax (see the sample file after this list).

(4) Do not write the story of why you are disavowing in the disavow text file. Do that in the reconsideration request instead, not in the text file.

(5) Related to that, when people do add explanations to the file, they use comment tags. Don't add lots of comments, or any at all; they just increase the chance of errors in Google's parser.

(6) The disavow tool is not the be-all and end-all. It will not cure everything. Clean up your links outside of the disavow tool as well; don't just go this route.
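For reference, a minimal disavow file following that advice might look something like this (the domains and URL below are hypothetical examples, not real sites):

domain:spammy-directory-example.com
domain:paid-links-example.net
http://www.link-network-example.org/spammy-page.html

Lines starting with # are treated as comments, but per mistakes (4) and (5) above, keep them to a minimum or leave them out entirely.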

Kaspar Szymanski, a member of the Google web spam team said on Google+:

As I'm involved in the reconsideration request process, let me tell you this a very important video to watch for any webmaster who has experienced spammy backlinks issues.


Forum discussion at Google+.

Google Webmaster Tools Reporting False URL Removals?

A Google Webmaster Help thread has a possible bug report in Google Webmaster Tools.

The issue is, the index status report within Webmaster Tools is reporting two URLs as being removed from a specific web site. But that webmaster says he did not use the URL removal tool within Webmaster Tools, nor did he block the URLs via robots.txt or other methods.

Here is a screen shot showing no URLs removed via the removal tool:

google removal url tool

Here is a screen shot showing two urls removed via the index status report:

google removal url tool

Seems like no one in the thread has a clear answer.

Google's Gary Illyes finally did respond, promising to have someone at Google look into the issue. He wrote:

Thanks for posting this. I sent it over to the guys responsible for that feature in Webmaster Tools and they're going to take a look.

I'll come back with more details in the unlikely case you need to change something on your side.

Have you noticed issues in this area recently?

Forum discussion at Google Webmaster Help.

Thursday, June 6, 2013

Google’s Matt Cutts: Web Spam Benefits From Using Rel=”Author”

A new video by Google’s head of search spam, Matt Cutts, talks about how potentially using rel=”author” structured data can help Google’s Web spam team improve search quality.

Matt Cutts explains that moving from the anonymous Web to a Web with identity helps Google understand the authority and trust of the person writing that content. It can help identify a spammer from an author with a lot of authority and credibility.

The example given by Cutts is of our own Founding Editor, Danny Sullivan. If Danny writes something in a low PageRank forum, Google may consider that post written by Danny with more authority, despite the overall domain/forum it was written on having low PageRank.
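In practice, at the time this mostly meant tying content to a Google+ profile. A rough sketch of what that markup looked like (the profile URL and author name are placeholders, not from Matt's video):

<link rel="author" href="https://plus.google.com/1234567890"/>

or, as a byline link within the page:

<a href="https://plus.google.com/1234567890" rel="author">Author Name</a>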

Here is the video from Matt:



Let’s not forget that just because Google talks about an algorithm or has a patent on one, it does not mean Google uses that algorithm in their live search results.

A Google Update Kicking Into Gear Now?

As I mentioned in our monthly Google webmaster report this morning, it seems like there is an uptick in chatter around a possible Google update kicking off this morning. It is extremely early but I've received emails, tweets and other correspondence about a possible Google update.

In addition, the WebmasterWorld thread has some renewed discussion around changes in traffic and search results and both SERPmetrics and especially SERPs.com show volatility in the Google search results. And MozCast just updated showing incredibly high changes in the Google search results.

Mozcast:

mozCast June 5

SERPs.com:

SERPs June 5

SERP Metrics:

SERP metrics June 5

So it does seem something is going on based on all the signals I am following.

Did you notice an update? Have your rankings changed?

Forum discussion at WebmasterWorld.

Wednesday, June 5, 2013

Google: Your UGC Content May Have Triggered Panda

There is an interesting discussion going on in the Google Webmaster Help forums between Google and the owner of an old community site that is supposedly heavily moderated. It is long, but in summary, it seems like the details are as follows.

This site has been around since the 90s. It is a community site where people ask English questions and others respond and help. The site has done well in Google since maybe before Google was even on the radar. But in mid-November, likely because of the Panda 22 update, that site took a major hit in Google's search results.

John Mueller of Google came in to pinpoint specific pages that have poor quality content and linked the site owner to the official Google Panda advice blog post.

John then goes deeper and talks about the pros and cons of UGC, which he has done many many times. John had a lot of advice but let me pull out one point:

One of the difficulties of running a great website that focuses on UGC is keeping the overall quality upright. Without some level of policing and evaluating the content, most sites are overrun by spam and low-quality content.

The thing is, this site owner said the site is very heavily moderated. He said:

Unless i'm being a 'love-blind' we wouldn't have survived this long (as a community) without a heck of a lot of policing and housekeeping.

He then lists out all the things they do to police and moderate the content and it is a long list.

The interesting part is that DaniWeb, which we've covered before as being hit by Panda, also claimed they were hit by this November Panda #22 update. Dani posted those details in a different thread. Both this English forum and DaniWeb are huge communities that are supposedly heavily moderated.

So is Panda not working right, or are these sites' moderation guidelines not working right?

Forum discussion at Google Webmaster Help.

June 2013 Google Webmaster Report

In the past month, we had a major Penguin update, Google went after a link network, likely TLA, Matt announced ten SEO changes coming down the pipe and took a clear stance on advertorials.

It was a busy month to say the least and now I am seeing more than normal chatter around a possible Google update. I'll probably write about that right after I submit this post, so stay tuned.

I'll summarize the key posts specific to webmaster related Google topics below. But to see last month's summary, go here.

The summary is broken down into these categories: Google Update, Google Links, Google SEO, Google Search Features, Google Webmaster Tools and Polls.

Forum discussion at WebmasterWorld.

Tuesday, June 4, 2013

Google's Matt Cutts: Text Link Ads Link Sellers Targeted

A few weeks ago, we reported that Google busted another link seller network. We didn't know which link seller network it was - until now. Matt Cutts, Google's head of search spam, tweeted last week that the link network they went after was "TLA linkselling sites."

TLA, as everyone in our industry knows, is Text Link Ads. They are one of the older, more substantial and much more visible link selling networks. Heck, I even had them on this site until I nofollowed my paid links. Also, in 2004 or so, my company built the software behind the site, mostly how they bill link buyers and pay link sellers, but it grew into more. Keep in mind, this was all before the nofollow attribute, not that it matters.

That being said, @patrickaltoft asked Matt Cutts, "do you mean link sellers in general or specifically ones hosting ads via Text Link Ads the company?"

Matt responded, "capital TLA."



TLA's home page PageRank is still a 6 but the link sellers may have taken a hit?

Are you a TLA link seller? Did you notice a hit? Are you a TLA link buyer? Did you notice a hit?

Forum discussion continued at BlackHatWorld, WebmasterWorld and Twitter.

Have Links To Disavow? Google Says Google Webmaster Tools Link Report Is Enough?

Last week I wrote about How Do You Uncover Your Spammy Links? There I said that when you want to disavow links or have them removed, using the Google Webmaster Tools link report is probably not enough; you probably need to use third party tools as well.

Well, according to Google you may not need to use third party tools.
Spotted by @Marie_Haynes and @joehall, Aaseesh Marina from Google's search quality team said all you might need is Google Webmaster Tools "Links to Your Site" report.

Aaseesh said:

You can get a good idea of your backlink profile from the links provided in the 'Links to Your Site' section under 'Traffic' in your Webmaster Tools account. You can use that download a sample of your site's backlinks from there and remove any unnatural link you find.


Is that Google saying all you need is Google Webmaster Tools link report or do you really need more?

If you think about it, the links Google reports are the links Google knows about. Or does Google not report all your links? Are the reports very delayed or inaccurate? Those are all questions top of mind for link builders.

There is a pretty good conversation around these questions as well as how to best use the Google link tool to get at this data at Twitter.

Would you trust Google's link tool in this case?

Update: I asked Google's John Mueller this in a Google Hangout today and he confirmed that you do not need to use third-party tools and that Google Webmaster Tools is fine. He did say sometimes third-party tools may help with cleaner reports, but they are not needed.

Forum discussion at Twitter.

Google's Official Tips On Internationalizing A Web Site

Google's Jens O. Meiert and Tony Ruscoe posted on the Google Webmaster Blog 6 quick tips for international web sites.

This is a topic we've covered a lot and honestly it is somewhat foreign (pun intended) to me.
The 6 tips in short are:

1. Make pages I18N-ready in the markup, not the style sheets

2. Use one style sheet for all locales

3. Use the [dir='rtl'] attribute selector

4. Use the :lang() pseudo class

5. Mirror left- and right-related values

6. Keep an eye on the details

Now, a WebmasterWorld thread asked why Google is saying to use the language attribute when it doesn't seem to work. One webmaster said, "Google ignores code level language informations," so why are they recommending it? Do they really ignore that information? I didn't think they did.

In any event, feel free to join the discussion on these tips in the forums.

Forum discussion at Google+ & WebmasterWorld.

Saturday, June 1, 2013

6 Quick Tips for International Websites


Note from the editors: After previously looking into various ways to handle internationalization for Google’s web-search, here’s a post from Google Web Studio team members with tips for web developers.

Many websites exist in more than one language, and more and more websites are made available for more than one language. Yet, building a website for more than one language doesn’t simply mean translation, or localization (L10N), and that’s it. It requires a few more things, all of which are related to internationalization (I18N). In this post we share a few tips for international websites.

1. Make pages I18N-ready in the markup, not the style sheets


Language and directionality are inherent to the contents of the document. If possible you should hence always use markup, not style sheets, for internationalization purposes. Use @lang and @dir, at least on the html element:

<html lang="ar" dir="rtl">

Avoid coming up with your own solutions like special classes or IDs.

As for I18N in style sheets, you can’t always rely on CSS: The CSS spec defines that conforming user agents may ignore properties like direction or unicode-bidi. (For XML, the situation changes again. XML doesn’t offer special internationalization markup, so here it’s advisable to use CSS.)

2. Use one style sheet for all locales


Instead of creating separate style sheets for LTR and RTL directionality, or even each language, bundle everything in one style sheet. That makes your internationalization rules much easier to understand and maintain.

So instead of embedding an alternative style sheet like

<link href="default.rtl.css" rel="stylesheet">

just use your existing

<link href="default.css" rel="stylesheet">

When taking this approach you’ll need to complement existing CSS rules by their international counterparts:

3. Use the [dir='rtl'] attribute selector


Since we recommend to stick with the style sheet you have (tip #2), you need a different way of selecting elements you need to style differently for the other directionality. As RTL contents require specific markup (tip #1), this should be easy: For most modern browsers, we can simply use [dir='rtl'].

Here’s an example:


aside {
  float: right;
  margin: 0 0 1em 1em;
}

[dir='rtl'] aside {
  float: left;
  margin: 0 1em 1em 0;
}

4. Use the :lang() pseudo class


To target documents of a particular language, use the :lang() pseudo class. (Note that we’re talking documents here, not text snippets, as targeting snippets of a particular language makes things a little more complex.)

For example, if you discover that bold formatting doesn’t work very well for Chinese documents (which indeed it does not), use the following:


:lang(zh) strong,
:lang(zh) b {
  font-weight: normal;
  color: #900;
}

5. Mirror left- and right-related values


When working with both LTR and RTL contents it’s important to mirror all the values that change directionality. Among the properties to watch out for is everything related to borders, margins, and paddings, but also position-related properties, float, or text-align.

For example, what’s text-align: left in LTR needs to be text-align: right in RTL.
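As a minimal sketch of that mirroring in the single-style-sheet approach (the .sidebar class here is just a hypothetical example, not from the original post):

.sidebar {
  text-align: left;
  padding-left: 1em;
}

[dir='rtl'] .sidebar {
  text-align: right;
  padding-left: 0;
  padding-right: 1em;
}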

There are tools to make it easy to “flip” directionality. One of them is CSSJanus, though it has been written for the “separate style sheet” realm, not the “same style sheet” one.

6. Keep an eye on the details

Watch out for the following items:
  • Images designed for left or right, like arrows or backgrounds, light sources in box-shadow and text-shadow values, and JavaScript positioning and animations: These may require being swapped and accommodated for in the opposite directionality.
  • Font sizes and fonts, especially for non-Latin alphabets: Depending on the script and font, the default font size may be too small. Consider tweaking the size and, if necessary, the font.
  • CSS specificity: When using the [dir='rtl'] (or [dir='ltr']) hook (tip #2), you’re using a selector of higher specificity. This can lead to issues. Just have an eye out, and adjust accordingly.


If you have any questions or feedback, check the Internationalization Webmaster Help Forum, or leave your comments here.