Saturday, June 16, 2018

Google: Cache-Control Headers Do Not Impact GoogleBot

The Cache-Control general header has no impact on GoogleBot or how it crawls your web pages. The Cache-Control general-header "field is used to specify directives for caching mechanisms in both requests and responses," say the MDN docs. "Caching directives are unidirectional, meaning that a given directive in a request is not implying that the same directive is to be given in the response," they explain.
So using this header should not impact how GoogleBot picks up content on the page.
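For context, here is a minimal sketch of what a response carrying this header might look like; the header values are illustrative only and are not taken from any specific site or from John's example:

HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Cache-Control: no-cache, no-store, max-age=0

Per John's statement, GoogleBot should crawl and index a page served with these directives the same way it would a page served with, say, Cache-Control: public, max-age=86400.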
Here is John Mueller's tweet:

Tuesday, February 23, 2016

Google Confirms Four Ads At The Top & Removing Right Side Ads

Google has been testing four AdWords ads since 2006 and recently stepped up those tests in late 2015. Now, according to Search Engine Land, Google has confirmed that they are switching over from showing ads on the right side to having four ads at the top.
Google will no longer show ads to the right of its search results; instead, it will show four ads at the top, with two exceptions: Google will still show PLAs (Product Listing Ads) and ads in the knowledge graph on the right-hand side.
This is the talk of the SEM industry, which has been buzzing about it all weekend.
Here is a screen shot of the ads showing on the right:
Now imagine those ads on the right completely gone.
Here is a picture of that:
Dr. Pete Meyers said on Twitter on Friday that 19% of the queries he tested showed four ads at the top. He then wrote a more data-intensive blog post at Moz digging into the numbers behind this change.
A Google spokesperson said:
We've been testing this layout for a long time, so some people might see it on a very small number of commercial queries. We'll continue to make tweaks, but this is designed for highly commercial queries where the layout is able to provide more relevant results for people searching and better performance for advertisers.
Overall, the community is mixed on this, but it seems most are just in shock.
The thread on this at the Google AdWords Help forum is titled "Bye bye small businesses." The folks at WebmasterWorld are calling it "what a mess," and the Local Search Forums aren't happy about the organic results being pushed down even further.
The shocking thing to me is that the forums are pretty quiet overall about this massive change, but that may change as more and more webmasters see it rolled out.
I have to assume most of my readers are not in favor of a fourth ad at the top, even if it means the ads on the right go away.

Sunday, October 11, 2015

Google Confirms The Real Time Penguin Algorithm Is Coming Soon

Google’s Gary Illyes said today at SMX East that the next Penguin update will come in the “foreseeable” future, adding “I hope” by the end of the year, and that it will be the real-time version of the algorithm.
Back in July, Gary Illyes told us that Penguin was months away, and we are almost there. Illyes was overly cautious and would not give us a timeline or date, but he did imply it would be happening soon.

Real-time Penguin

This version of the Penguin algorithm will be real-time, at least that is the goal, Gary said. That means that as soon as Google discovers that a link is removed or disavowed, the Penguin algorithm will process it in real time, and you would be able to recover from a Penguin penalty incredibly quickly. However, you could end up with a Penguin penalty just as quickly.
Google had already told us this back in June, but it is nice to know they are on track to make this happen soon.

Google: New Algorithm Changes “Aggressively Targeting Hacked Spam,” May Impact 5% Of Queries

Google says it’s rolling out a series of search algorithm changes that “aggressively” target the presence of hacked spam in its search results.
Ning Song, the engineer who wrote today’s blog post, says Google is turning up the dial in its algorithms to remove hacked sites from Google’s search results:
We are aggressively targeting hacked spam in order to protect users and webmasters.
The algorithmic changes will eventually impact roughly 5% of queries, depending on the language. As we roll out the new algorithms, users might notice that for certain queries, only the most relevant results are shown, reducing the number of results shown.
This is due to the large amount of hacked spam being removed, and should improve in the near future. We are continuing tuning our systems to weed out the bad content while retaining the organic, legitimate results.
Hacked sites are a long-running and common problem on the Web, which makes them a problem for Google, too. Earlier this year, the IT security company Sophos announced that it had notified Google of “hundreds of thousands” of high-ranking, cloaked PDF documents on hacked websites. In 2013, Google revealed that hacked sites were the second most common cause of manual actions. Around that same time, Google launched a help center for hacked sites that’s still online today.
Google is encouraging webmasters, site owners and SEOs with questions or feedback to speak up in the Webmaster Help Forums.
Postscript by Barry Schwartz: Google’s Gary Illyes confirmed that this algorithm typically impacts only the realm of “spammy queries” and not generic normal queries.

Thursday, July 30, 2015

How To Quickly Unblock Google From CSS & JavaScript, What Google Looks At & Number Notified

Yesterday, Google sent mass notifications for blocked JavaScript and CSS. I recommend you read that story if you haven't yet.
Since then, there have been many questions about what to do. I recommended yesterday that you unblock your CSS and JavaScript files, use the Fetch and Render tool to check for remaining issues, and review the email you received from Google for more details.
But Google is sharing more information now.

How To Quickly Unblock JavaScript & CSS Assets


Gary Illyes from Google posted on Stack Overflow the cheat, or quick way, of unblocking your JavaScript and CSS files for Googlebot. Gary said the "simplest form of allow rule to allow crawling javascript and css resources" is to add this to your robots.txt file:

User-Agent: Googlebot
Allow: .js
Allow: .css

Gary said this will open it all up for GoogleBot.
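If your robots.txt already disallows a directory that holds scripts or stylesheets, these allow rules sit alongside that disallow. Here is a minimal sketch assuming a hypothetical /assets/ directory that is otherwise blocked; the path and the wildcard-style rules are illustrative and not from Gary's answer:

User-Agent: Googlebot
Disallow: /assets/
Allow: /*.js$
Allow: /*.css$

Googlebot supports the * and $ pattern matching used here, and when allow and disallow rules conflict, Google goes with the less restrictive (allow) rule, so the .js and .css files inside /assets/ stay crawlable while the rest of the directory remains blocked. Either way, re-run the Fetch and Render tool afterwards to confirm the resources are no longer reported as blocked.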

Google Checks Your Home Page & Mobile View


When Google checks for blocked CSS and JavaScript assets, it doesn't go too deep into your site. It looks mostly at just your home page and the mobile/smartphone view of your web site.
John Mueller of Google said as much in a comment on his own Google+ post, saying "we're primarily looking at the site's homepage & for the smartphone view of the page."

Google Doesn't Look At 3rd Party Embeds


John Mueller of Google also said there that you shouldn't get this notification from Google if it is a third party embed (ad code, social embeds, etc) that has blocked JS or CSS. You will see these warnings in Google Search Console, but you should not have received an email from Google for 3rd party issues.
John wrote in that Google+ post:

We're looking for local, embedded, blocked JS & CSS. So it would be for URLs that you can "allow" in your robots.txt, not something on other people's sites (though blocked content on other sites can cause problems too, eg, if you're using a JS framework that's hosted on a blocked URL).


How Many Webmasters Received This Notification?


I asked Gary Illyes from Google about how many people received this notification. I didn't think he would answer, but he did shed some light on it.
He said on Twitter that Google sent out 18.7% as many of these notifications as it sent for the mobile usability issues. So you thought this JS and CSS notification went out to a ton of people? The mobile usability notification reached more than five times as many webmasters (1 / 0.187 ≈ 5.3); put another way, this batch was roughly 81% smaller.