Thursday, April 24, 2014

Look Back At Older Google Maps Street View Images

Google announced that, where available, it will show the older images it has taken of streets via Google Street View, so you can go back in time.

Google Street View has been taking pictures since 2007, and Google is going to let you go back in time to view those images as they were taken.

Now, if you see a clock icon in the upper left-hand portion of a Street View image, click on it and move the slider through time and select a thumbnail to see that same place in previous years or seasons.


It does not seem to be live yet but when it goes live, I am sure many of you will have a lot of fun.

Here is a fun video of this:


Google Keyword Planner Tool Gets New Features

Kim reports that the Keyword Planner tool within the Google AdWords console quietly added nine new features today.

The new features include:

  • Specify a time period for keyword data
  • Compare time periods over time
  • View absolute and relative changes in each ad group and keyword's volume between two periods
  • Visualize mobile trends
  • See breakdowns by device and targeted location
  • Use flexible time periods
  • Apply device segmentation and bid adjustments
  • View location breakdowns for sub-geos of your targeted location
  • See visualizations and estimates for sub-geos

Kim has tons of screenshots and explanations of each new addition to the Keyword Planner tool on her blog.

Google's Matt Cutts On How To Buy Domain Names

Google's Matt Cutts posted a video on the question How can I research a domain that I may want to purchase?

Matt Cutts gave a few tips on what to do, and what steps to take, when researching a domain name purchase.

(1) Do a site: command search and if no content is found, that can be a bad thing. Of course, if the domain name was never registered, is new or is parked, then it is likely no content will be found. But if you are buying someone else's established domain name, then no content found is a red flag.

(2) Search for the domain name without the .com and see what people wrote about it. See if there are bad stories on it. See if someone went around and did a lot of spamming.

(3) Use archive.org to see the site before you owned it. Did it look spammy?

(4) Ask to see Google Webmaster Tools and look for messages and stats there.

(5) Ask to see Google Analytics or other analytics they may use.

Now, if you do buy a domain name that has issues, you can submit a reconsideration request. Matt Cutts said that before you do that, ask yourself why you bought the domain name. Was it for links and rankings, or just because you like the domain name? If it was for links, typically those links will not help you. You may want to disavow all those links and go forward with the reconsideration request to start fresh.
Here is the video:

Wednesday, April 23, 2014

Google AdWords App Marketing, Estimated Total Conversions & More

Google announced what they showcased the other day at an event they hosted for 170 search marketing experts, the AdWords Performance Forum.
In short, Google is rolling out new AdWords features for Apps, improved Estimated Total Conversions and more advanced features within the AdWords console for managing ads.

(1) Apps: You can now promote apps to users who install apps related to yours across search, display and YouTube. Google will use the app installs and in-app purchases to enable advertisers to show ads. Also, there is a new feature to do "app re-engagement", which is a new campaign type in AdWords for both search and display so that consumers can be taken directly into already-installed apps. Plus, you can soon measure conversions across the entire lifecycle of the app - from install to re-engagement to in-app purchases.

(2) Estimated Total Conversions has been upgraded to support in-store purchases. Google said "As people search more online for local businesses and then go into the store to make purchases, we're testing ways to measure the effectiveness of search ads at driving in-store sales, using anonymized purchase data from retail partners."

(3) Google also added new tools within the AdWords console for advertisers including:

  • More bulk actions: With new bulk capabilities for extensions and settings, you'll be able to easily set up campaign settings like location targeting and ad rotation across all your campaigns (even if you have thousands of them!). This should be particularly useful for big seasonal promotions, when you might want to update all of your campaigns at once.
  • Automated bidding: Building off of the momentum that Google has with automated bidding, Google is adding the ability for you to maximize the number of conversions or the total value of conversions. For example, if you’re a car maker, you’ll be able to set up your campaign to allocate your budget in a way that maximizes the number of people who visit your site to design a custom vehicle.
  • Advanced reporting: To help you analyze your data better (without the endless downloading and re-formatting of data) Google is providing you with new multi-dimensional data analysis and visualization tools so you can perform most, if not all of your data analysis, right here inside AdWords. Google is also making it easy for you to turn your data into tables, graphs and charts so that you can download them and share with your teams.
  • Your own lab: One of the great benefits of AdWords is that it offers an incredible platform to test and tweak live campaigns. So Google built a lab inside AdWords for you to prepare ideas for your campaigns, see what they’ll look like, and then run tests with live traffic as an experimental trial. You can experiment with almost anything in your campaign, including bid changes, new keywords, different campaign settings, special bids for times and locations, different kinds of ad formats, and more.

A Google Update Brewing?

The ongoing WebmasterWorld thread has been pretty active with webmaster chatter since Sunday, when I went offline for two days.

It also appears that the automated tracking tools are seeing nice spikes over the past couple of days, including Mozcast, SERPs.com and SERP Metrics, although Algoroo doesn't seem to agree.

About ten days ago we thought we saw a Panda refresh but Google would not confirm it.

Are you noticing fluctuations in your Google rankings and referrals?

Wednesday, April 16, 2014

70% Said Google's Penguin 2.1 Update Hurt Them

About six months ago, Google unleashed a major update to Penguin, the Penguin 2.1 update.

The early feedback from that update was that it had a big impact on webmasters, mostly SEOs who used links in an aggressive manner.
So I posted a poll asking you how you did with the update. Since, six months later, there have been no reports of a new Penguin refresh or update, I figured I'd post the poll results today.

We had over 2,800 responses to the poll, and 70% said they saw a drop in Google referrer traffic after the update, so 70% seem to have been negatively impacted by it. Only 7%, or just about 200 people, reported that their sites recovered from a previous Penguin update with Penguin 2.1. 6% said they saw an increase in traffic; where one site goes down, another benefits. And 17% said they saw no change at all before or after.
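As a quick sanity check on the arithmetic above, here is a tiny Python sketch. The 2,800 response count and the four percentages come from the poll; the bucket labels are just my shorthand:

```python
responses = 2800  # "over 2,800 responses"

# Reported shares of the poll, in percent
shares = {"traffic dropped": 70, "recovered": 7, "traffic increased": 6, "no change": 17}
assert sum(shares.values()) == 100  # the four buckets cover all respondents

for label, pct in shares.items():
    count = responses * pct // 100
    print(f"{label}: ~{count} respondents")
# "recovered" works out to ~196 people, matching the "just about 200" figure
```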

Here is the pie chart:

google penguin 2.1 poll

Again, we have been waiting over six months for a Penguin refresh. I think we are due for one soon.

Google Penguin Updates:



Disclaimer: Please see my poll disclaimer post before coming to any conclusions on these results.

Tuesday, April 15, 2014

Google Panda Refresh Or Softer Panda Update?

Let me start off by saying that I normally would wait another 24 hours before posting this because the chatter is so new, but I will be offline tomorrow and the day after, so I am going with what I see now.

It seems, based on the very early chatter, that a Panda refresh started late last night or this morning. Some are asking if it is version two of the softer Panda update that Google's Matt Cutts promised.

The ongoing WebmasterWorld thread has posts from late yesterday and early this morning with questions about changes at Google that seem to relate to sites impacted by the Panda algorithm.
One webmaster said:

Ok, I see a bad sign of another silent update. Lowest traffic in the last 5 years. It looks like a Panda reiteration.

Another webmaster said this is the opposite of the "softer" update that they were expecting:

My figures and search results seems very similar to the pre-soft-Panda...

A senior member agreed:

@Mentant, I can confim your observation. Lowest traffic ever. All our main keys are gone, replace by brands that do not have anything in common with the search string. For sure this is a Panda. I think they took an old one and let it go through the index.

Google keeps going their way to destroy all ecom except amazon/ebay. There is no sign of "Leveing" or even Panda being softer.


So the questions we have now:

(1) Is Google pushing out an update?

(2) If so, was it just a typical monthly Panda refresh?

(3) Or was it the softer Panda update that seems harder for many?

Forum discussion at WebmasterWorld.

Update: Google told me on the record there was no update:

Just checked with the team and there is nothing we're aware of on this. Thanks for reaching out.

Friday, April 11, 2014

Google AdWords Goes Not Provided: Referrer Data Gone For Paid Search Ads

The rumors were true: Google has indeed gone the not provided route for paid search ads, AdWords ads.

Let me step back and explain what is happening, so you really understand it.

What is changing is that the advertiser will not see the full referrer data, including the keyword string, passed through to their server. In short, when someone goes to Google, searches for widget and then clicks on your ad, Google communicates the details of that click through what is called a referrer. Now, a lot of what is communicated in that click has been removed, including the keyword the person searched for. That won't pass to your log files, through the referrer data, or to the standard analytics packages.

Does that mean you are losing the knowledge of what people are searching for to click on your ads? Nope. Not really.

The AdWords reports still show them to you, the APIs still give them to you, and a lot of the AdWords software built by third parties uses the API, so nothing has changed with those applications. But the raw referrer data is gone, and just like with organic search, that data won't be passed through the referrer to your log files or analytics.

What is interesting is that there are ways to get the data but not necessarily the true keyword data. You can use ValueTrack, a parameter you can set in AdWords to trigger the destination URL to include the keyword and other parameters. So technically, you can set the destination URL to have that data, such as:

http://www.example.com/?matchtype=b&keyword=blue%20socks

The keyword parameter will pass the specific keyword that triggered your ad. So if you aren't using that, you can. Of course, again, most tools automatically handle a lot of this.
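To make the mechanics concrete, here is a minimal Python sketch of the round trip: AdWords fills in ValueTrack placeholders such as {matchtype} and {keyword} at click time, and your landing page (or log processor) reads them back out of the query string. The URLs and helper names here are illustrative, not part of any AdWords API:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_destination_url(base, keyword, matchtype):
    """Simulate AdWords filling in a ValueTrack-style destination URL.

    In AdWords you would set the template once, e.g.
    http://www.example.com/?matchtype={matchtype}&keyword={keyword}
    and Google substitutes the real values on each ad click.
    """
    return base + "?" + urlencode({"matchtype": matchtype, "keyword": keyword})

def parse_landing_url(url):
    """Recover the keyword data on the landing-page side (e.g. in your logs)."""
    query = parse_qs(urlparse(url).query)
    return {key: values[0] for key, values in query.items()}

url = build_destination_url("http://www.example.com/", "blue socks", "b")
print(url)                     # http://www.example.com/?matchtype=b&keyword=blue+socks
print(parse_landing_url(url))  # {'matchtype': 'b', 'keyword': 'blue socks'}
```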

This impacts mostly those who don't use those tools or who relied on basic Google Analytics and/or old-fashioned technology.

You can learn more about this over here.

Google: We Mostly Ignore Web 2.0 Links

In a Google Webmaster Help thread, Google's John Mueller said Google mostly ignores Web 2.0 links, specifically calling out Pinterest, YouTube and others.
He wrote:

We're already mostly ignoring those links (just like we're ignoring the "web 2.0" / Pinterest / YouTube / article / etc links), but if you see them and want to make sure that they're not causing any issues, disavowing them is fine.


Yea, most of the "Web 2.0" links are nofollowed anyway, which is why Google is ignoring them, because they are being instructed to do so.

Most links on YouTube, Pinterest and so forth are nofollowed. But John also says "article" in his example.

In any event, figured I'd share this comment from John more widely. 
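For reference, a nofollowed link is simply an anchor tag whose rel attribute contains "nofollow". Here is a small, self-contained Python sketch of how you might count them in a page's HTML; the sample markup is made up for illustration:

```python
from html.parser import HTMLParser

class NofollowCounter(HTMLParser):
    """Count total links and how many carry rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.total = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        self.total += 1
        rel = dict(attrs).get("rel") or ""
        # rel is a space-separated token list and is case-insensitive
        if "nofollow" in rel.lower().split():
            self.nofollow += 1

sample = """
<a href="http://example.com/a">followed link</a>
<a href="http://example.com/b" rel="nofollow">nofollowed link</a>
<a href="http://example.com/c" rel="NOFOLLOW noopener">also nofollowed</a>
"""
counter = NofollowCounter()
counter.feed(sample)
print(counter.total, counter.nofollow)  # 3 2
```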

Wednesday, April 9, 2014

Google Analytics & AdWords Make It Easier To Link Accounts

Google announced they have streamlined the process to link your Google AdWords account to your Google Analytics account.

The new mechanism is launching in the upcoming weeks and will give you the ability to link multiple AdWords accounts all at once.

To access this, go to your Google Analytics account, click the "Admin" button in the header, and you should see a new "AdWords Linking" section in the "Property" column. If, like me, you do not see it yet, then it isn't live for your account yet.

When you do see it, you can check off the properties you want to include in your Analytics reports:

Google Analytics & AdWords Linking

Google's Matt Cutts: You've Got A Mild Case Of Penguin

The Google Penguin algorithm has been taunting SEOs for a long time now, but did you know there are different degrees of Penguin?

It is not that you are hit by the Penguin algorithm or not. You can be hit harder or softer depending on your links.

Google's head of search spam, Matt Cutts, posted on Twitter to a webmaster that he has a "very mild case of Penguin." Yea, there is clearly such a thing as degrees of Penguin.

Here is Matt's tweet:




So I guess a site that is impacted by the Penguin algorithm can be impacted a lot or a little or somewhere in between?

Matt Cutts: Google Penalizes Seven Link Networks In Japan

Google's Matt Cutts announced on Twitter at 2:30am this morning that over the past few months, Google has taken action on seven link networks in Japan.

Matt said, "over last few months they've taken action on seven link networks!" He said this makes him "incredibly proud" because this levels the playing field in Japan.

Here is Matt's tweet:




There is no specific date as to when these link networks were targeted and penalized, so it would be hard to tell if your site was impacted by this or an algorithm. That being said, most of my readers here likely do not try to rank for Japanese related content (except a few maybe).

So let's see who Google targeted over links in the past few months:




Google Places Updates Bulk Location Management Dashboard

Just a couple weeks ago, Google updated the bulk import tool for managing your Google Places local listings.

Yesterday, Jade Wang from Google announced in the Google Business Help forums that there is another update coming to the location management dashboard. The update includes status of your updates, a conflicts interface and an edit timeframe screen.

Here is a screen shot of the new interface:


The update includes:

  • Status of your locations on Maps: Now, Google will show you a column that describes the status of each location on Maps. You will be able to tell at a glance which locations are live, unverified, have errors or data conflicts, are duplicates, or are pending review.
  • Updated data conflicts interface: The updated interface will show you details on how a location page might differ on Maps/Search results versus what's in your dashboard. Google will show you what is live on Google, and which field is different from what’s in your dashboard. From there, you'll be able to take action.
  • Improved edit timeframe: Google is working on improving the speed with which your data goes live on Google.

Other things coming in the future include:

  • Make updates and posts to your customers using the Google+ page for your location
  • See Insights to help you track your business's performance on Google
  • Filter to view relevant subsets of locations within your account

Thursday, April 3, 2014

Google Maps Search Nearby Link Returns

When Google launched the new Google Maps, the search nearby functionality got harder to use. That has now changed: Google has brought back the quick link to search nearby within Google Maps.

Now, when you browse or search Google Maps for a specific area, Google will return a link to "search nearby," and clicking it will give you the ability to search for a specific keyword near that location: parking, pizza, gas and so on.

Here is a picture:

Google Maps Search Nearby

Google announced this on Google+ and said:

Looking for a bookstore by your work? You can now easily search for places nearby a location in a few ways using the new #GoogleMaps for desktop.

Econsultancy Nofollows All Contributor & Guest Posts

With all the talk of guest blogging for SEO being dead and sites being penalized over guest blogging, the Econsultancy blog has decided to nofollow all guest blog contributions.

Econsultancy, in my mind, is a respected blog on internet marketing topics. The content quality, both editorial and contributed, always seemed very high to me. For them to make this move makes me wonder. They said they "want to play it safe," but is the move a way for them to get attention and links? It worked here.

Either way, I don't blame them for going that route. Heck, I don't even allow guest posts here, never really did.

But this is a question many blogs and news sites are asking themselves: should they also go the route of nofollowing links left by guest bloggers, even when those guest posts go through a strict editorial process? Lots of sites are asking, or are going to be asking, this question in the near future.

Of course, you should check out the conversation on Twitter about this change. The debate on guest blogging will live on for some time.

Matt Cutts: Google's Search Results Will Always Be In Flux

In an April 1st video released by Google's Matt Cutts, Matt makes it clear that Google will never stop adapting and changing the search results. The search results will always be changing; the result of that change is "flux," Matt said.

Google has to change because the web is changing, searchers are changing, devices are changing and spammers are changing.

Even if the results are perfect today, Cutts said, they still need to work on making them perfect in the future. Why? Because searchers will demand more and more.

So yes, SEOs, you need to always be adapting and changing as well.

Here is the video and yes, you will notice his shirt is changing colors as he does this video (that was part of the April Fools bit).


Wednesday, April 2, 2014

Google Index Status Reports Now With HTTPS & Subdirectories

Google announced the Index Status reports within Google Webmaster Tools now lets you differentiate between HTTP, HTTPS and subdirectories.

Google's John Mueller said "If you're a data-driven SEO (or just love to see how your site's indexed), you'll love this change :). In Webmaster Tools, we've now made it possible to differentiate the "index status" information for http / https as well as for subdirectories." Zineb explained "you can now see index status data for your HTTPS sites and subdirectories."

You will see on the report an "update" line that will convey when the reporting changed to handle this.

Google Index Status Update HTTPS

As of March 9, 2014, the Index Status reflects the data of your specific protocol and site combination as it is verified in Webmaster Tools (i.e. distinguishing www and https variations).

Here are the technical details:

We do not show aggregate data for all versions of your site. While Google crawls and indexes content from your site regardless of whether you have verified the site in Webmaster Tools, the number of indexed URLs reported in Index status are specific to those associated with your site version.

For example, suppose you have a site with 10 URLs that people can view without signing in, and 100 URLs that people can only see once they sign into your site. If you have added only one version of your site to Webmaster Tools (e.g. http://www.example.com), you would see Index status totals only for the non-secure portion of your site, which would be a much lower number than for all URLs on your site.

Therefore, in order to see the index count for your secure site, you will need to add it to Webmaster Tools (e.g. https://www.example.com) and then select it from the Site Selector.

Similarly, you can verify a subdirectory of your site with Webmaster Tools, and only data for that subdirectory will be shown in its Index status (www.example.com/blog/). However, the top-level domain will continue to reflect the total count of URLs indexed for that domain.

Tuesday, April 1, 2014

A Shorter Google Title Tag After Redesign? Maybe.

Google has a new design, and with that the length of your title tag, the blue clickable link in the Google search results, may be impacted.

Pete Meyers from Moz posted New Title Tag Guidelines & Preview Tool. He says the new title tag length is not set at 55 characters or any specific length, but ranges between 42 and 68 characters depending on Google's algorithm.

Here is the distribution chart showing the likelihood of each cutoff length:



I should note, this has no impact on rankings. Just because Google cuts off the display of your title tag doesn't mean it isn't used in its entirety for rankings. But it does mean that your title tags may be less click friendly.
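A rough way to flag at-risk titles, based on the 42-68 character range from Pete Meyers' analysis; this is only a heuristic sketch, since the real cutoff depends on pixel width, not character count:

```python
SAFE_MIN, SAFE_MAX = 42, 68  # character range observed in the Moz study

def truncation_risk(title):
    """Classify a title tag's risk of being cut off in the new design."""
    length = len(title)
    if length <= SAFE_MIN:
        return "safe"    # short enough to display in full in every observed case
    if length <= SAFE_MAX:
        return "maybe"   # outcome depends on the widths of the characters used
    return "likely"      # longer than any title observed to display in full

for title in (
    "Short Title",
    "A Medium Length Title That Might Fit In The SERPs",
    "An Extremely Long Title Tag That Goes On And On And Will Almost Certainly Be Cut Off",
):
    print(f"{len(title):3d}  {truncation_risk(title)}  {title}")
```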

A WebmasterWorld thread is seeing a lot of questions about the title tag changes.

As martinibuster put it:

This is just the display of the Title tag, not the consumption of it by the algorithm. So, as you already noted, a shorter title may be more general. I'm not changing anything for Google. If a slightly longer title makes sense then that's what's going in the title.


This doesn't really change anything for me, though. I don't do exact match longtail titles. Prefer to match it generally in the title and more exactly in the text of the content.


Check out the Moz post for more details.

Time To Get Creative With Your Link Removal Requests

One trend we've seen in the past year or so is that link builders are now extremely busy with link removals. Getting links to your web site has never been easy; it is a talent to get webmasters to read your link requests. But getting links removed is often harder.

Why is it harder? Well, asking for a link is a positive thing. People like positive things. You tell them they have a great site, you are wonderful, etc etc. But link removal requests are negative and people do not like negative things. You are asking a webmaster to remove a link you have to them because Google thinks it is low quality. How dare you tell a webmaster their site is low quality!

That being said, some webmasters and SEOs are getting creative with link removal requests.

As I covered at Search Engine Land, diamond shop Brilliance did just that. They sent emails with a pirate theme and hoped webmasters would take the bait. And some did. Here was the email they sent out:


Shai Barel of Brilliance said it worked well and even shared three email responses from webmasters who complied with the link removal request. Some of those replies were fun as well:

Ahoy Ye Matey website removed.


I appreciate the way you approached bloggers about this so I will waive my typical fee and remove the link.


What a great removal request! Brightened up my day for sure...

Did you do something creative or fun for your link removal requests?

Google’s Matt Cutts On How They Evaluate New Search Algorithms

Google Head of Search Spam Matt Cutts posted a video today answering how Google evaluates which new search algorithms to use and which to throw away or adapt.

The question was posed by James Foster of Sydney, Australia who asked:

What are some of the metrics that Google uses to evaluate whether one iteration of the ranking algorithm is delivering better quality results to users than another?

Matt Cutts breaks it down to about three steps of the evaluation process:

(1) They test the algorithm offline, benchmarking how the results rank with the new algorithm and whether the URLs are higher quality than those returned by the algorithms previously in place. Quality is based on how the search quality raters rated the URLs in previous cases. If the URLs were unrated, Google can ask these raters to rate the new URLs or to compare the old search results to the new test set. Based on those metrics, Google may decide to move the test to the next phase.

(2) Live tests, where Google will sample a subset of real live searchers and give them the new results with the new set of test algorithms. If Google sees a higher click rate on the new search results, it may imply that the new results are better than the older ones. This is not always the case, specifically with webspam, Cutts said. But in general, the more clicks on a specific search result page, the better quality the results.

(3) Then the Google Search Quality Launch Committee has the ultimate say on if the algorithm goes live to the public or not.

Matt said Google has this down to a “pretty good system” but every now and then they need to refine some of the processes within this workflow.

Here is the video:



More Precise Index Status Data for Your Site Variations


The Google Webmaster Tools Index Status feature reports how many pages on your site are indexed by Google. In the past, we didn’t show index status data for HTTPS websites independently, but rather we included everything in the HTTP site’s report. In the last months, we’ve heard from you that you’d like to use Webmaster Tools to track your indexed URLs for sections of your website, including the parts that use HTTPS.

We’ve seen that nearly 10% of all URLs already use a secure connection to transfer data via HTTPS, and we hope to see more webmasters move their websites from HTTP to HTTPS in the future. We’re happy to announce a refinement in the way your site’s index status data is displayed in Webmaster Tools: the Index Status feature now tracks your site’s indexed URLs for each protocol (HTTP and HTTPS) as well as for verified subdirectories.

This makes it easy for you to monitor different sections of your site. For example, the following URLs each show their own data in Webmaster Tools Index Status report, provided they are verified separately:



The refined data will be visible for webmasters whose site's URLs are on HTTPS or who have subdirectories verified, such as https://example.com/folder/. Data for subdirectories will be included in the higher-level verified sites on the same hostname and protocol.

If you have a website on HTTPS or if some of your content is indexed under different subdomains, you will see a change in the corresponding Index Status reports. The screenshots below illustrate the changes that you may see on your HTTP and HTTPS sites’ Index Status graphs for instance:


HTTP site’s Index Status showing drop

HTTPS site’s Index Status showing increase


An “Update” annotation has been added to the Index Status graph for March 9th, showing when we started collecting this data. This change does not affect the way we index your URLs, nor does it have an impact on the overall number of URLs indexed on your domain. It is a change that only affects the reporting of data in Webmaster Tools user interface.

In order to see your data correctly, you will need to verify all existing variants of your site (www., non-www., HTTPS, subdirectories, subdomains) in Google Webmaster Tools. We recommend that your preferred domains and canonical URLs are configured accordingly.
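To illustrate the variants mentioned above, here is a small Python sketch that enumerates the protocol and www combinations you would verify separately (subdirectories and subdomains come on top of these; example.com is a placeholder):

```python
from itertools import product

def site_variants(domain):
    """Protocol/host combinations that each get their own Index Status report."""
    return [f"{scheme}://{host}{domain}/"
            for scheme, host in product(("http", "https"), ("", "www."))]

for variant in site_variants("example.com"):
    print(variant)
# http://example.com/
# http://www.example.com/
# https://example.com/
# https://www.example.com/
```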

Note that if you wish to submit a Sitemap, you will need to do so for the preferred variant of your website, using the corresponding URLs. Robots.txt files are also read separately for each protocol and hostname.
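The per-protocol robots.txt point is worth spelling out: the file is resolved from the scheme and hostname of the page, so the HTTP and HTTPS versions of a site each get their own. A minimal sketch:

```python
from urllib.parse import urlsplit

def robots_url(page_url):
    """robots.txt is fetched separately for each protocol + hostname pair."""
    parts = urlsplit(page_url)
    return f"{parts.scheme}://{parts.netloc}/robots.txt"

print(robots_url("http://www.example.com/blog/post"))    # http://www.example.com/robots.txt
print(robots_url("https://example.com/shop/item?id=1"))  # https://example.com/robots.txt
```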

We hope that you’ll find this update useful, and that it’ll help you monitor, identify and fix indexing problems with your website. You can find additional details in our Index Status Help Center article. As usual, if you have any questions, don’t hesitate to ask in our webmaster Help Forum.