Kate O'Donovan from the Google AdWords team in Dublin confirmed that the mobile-friendly algorithm that began rolling out last week has no impact on your AdWords Quality Score.
Kate actually posted the announcement about the organic mobile-friendly algorithm in the Google AdWords Help forum and ended the announcement by clearly stating this algorithmic change has no impact on your AdWords ads.
Kate wrote:
Remember this update will not affect your AdWords performance or your individual Quality Scores.
In the SEO world, there are a lot of strong feelings about what should be indexed and what should not be indexed by Google. SEO is now about removing content, removing links, removing potentially useful stuff from your site. I find it incredibly comical at times that this is where the industry has gone - and Google is mostly at fault for this with Panda and Penguin.
That being said, someone asked Google's John Mueller about tag clouds, and whether the tag results on your site should be blocked from Google's index. This was asked in a Google+ hangout at the 49:57 mark in the video:
Question: Would you suggest to block tag-pages in the robots.txt? I use a lot of tags to group my tutorial pages for my readers. Is that duplicate content and so on bad for my site?
Answer: This really kind of depends on your web site - the kind of site that you have.
I think there are some kinds of tag pages that are essentially like search results pages, which probably don't make sense to get indexed.
There are other kinds of tag pages that are almost like category pages, where you have a useful collection of individual pieces of content that match this category. And that might be something you do want to have indexed.
So that's not something where I'd say there is a default answer that works well for everyone. You have to work that out for your web site yourself. Look at some of those sample pages and be as objective as you can in saying, well, is this really something I want to have indexed, or is this something I don't really need to have indexed.
Here is the video embed:
Heck, I am constantly linking to tag pages on this site because I find it more useful to tag my content than to place it in categories. I personally use my tag pages all the time, but they may not be as useful to others as they are to me. Either way, I personally find mine useful. :)
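If you do decide your tag pages look more like search results pages than useful category pages, blocking them is a one-line robots.txt rule. A minimal sketch, assuming your CMS serves tag pages under a /tag/ path (that path is my assumption - substitute whatever your site actually uses):

```text
# Hypothetical example: keep crawlers out of tag archive pages.
# The /tag/ prefix is an assumption about your CMS's URL structure.
User-agent: *
Disallow: /tag/
```

Keep in mind robots.txt only blocks crawling; if you want the pages crawled but kept out of the index, a noindex robots meta tag on the tag pages is the other option.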
Well, it is now over six months since a Panda update, and just a couple of weeks ago, Google's John Mueller told us both Panda and Penguin are pushed manually, which specifically contradicts the statements about them being real time.
In a Google+ thread, Rae Hoffman has been giving it to Google over the confusion and contradictory information they've been giving.
So John Mueller of Google responded that he knows it sucks, but it wasn't by design; they will likely have more cases of this, but again, they will try to limit it. Here is what he wrote:
Rae Hoffman I agree, that sucks. Please call us out when you see that happening. I don't think we can eliminate all of these cases, sometimes the basis of a comment changes internally, but it's certainly not by design.
Yes, it really does suck. It makes it look like Google has no clue what they are doing. It makes it hard for SEOs and webmasters to communicate to their clients. It makes it hard to build a better web all around.
Danny wrote that the ultimate solution for this is likely an automated action viewer in Google Webmaster Tools, but I can't see that happening anytime soon.
Can you believe the last official Google Panda refresh/update was Panda 4.1 on September 25, 2014? We've seen updates to it since, but those updates stopped around October 24, 2014, which is 5 months and 2 weeks or so ago.
I asked John Mueller of Google in a Google+ hangout at the 47:50 mark in the video:
There hasn’t been a Panda update in a while, since October or so, right?
In typical John fashion, he answers:
That’s possible, yea.
Watch the video to see how he answers it:
We know Google has to push these updates manually, at least for now, but hasn't in about six months.
A half a year is a long time to wait for a Panda refresh, don't you think?
A thread at the Google Webmaster Help forums has a webmaster concerned that Google isn't allowing him to manually use the submit to index option in the fetch and render tool in Google Webmaster Tools for all the URLs on his site. As you know, there is a limit to how often you can use that feature per day/per month.
In response to the webmaster's concern, Google's John Mueller said in the thread:
You don't need to submit pages when they're changed -- we recrawl the web automatically to pick those changes up (you could also use sitemaps & feeds if you wanted to point out individual changed pages).
Yep, that is what GoogleBot does - it is all about seeking out and consuming new and changed content and web pages. So if you really want to expedite it for a large number of pages, use Sitemaps.
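Pointing out changed pages via a sitemap mostly comes down to listing each URL with a lastmod date so crawlers can see what changed and when. A minimal sketch following the sitemaps.org protocol (the domain and path here are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the page URL; lastmod tells crawlers when it last changed -->
    <loc>https://www.example.com/updated-page.html</loc>
    <lastmod>2015-04-20</lastmod>
  </url>
</urlset>
```

Resubmit or ping the sitemap after updating it, and Googlebot can prioritize recrawling the changed URLs without you submitting each one by hand.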
Google News optimization fascinates me because it is a different algorithm and there are weird rules. I knew Google News had a rule about having a unique three-digit number in the URL if you do not use Google News Sitemaps, but did you know that if you use 1999 or 2000 in the URL, you are out of luck?
Stacie Chan explained this in her presentation during a Google+ hangout at the 3:51 mark. Here is the slide she shared:
She said:
The exception is if you've got four digits and it leads off with 199 or 200, as you can see, those typically reflect a year.
However, now that we've got into the 2010 and beyond years, 201 actually works for your article URLs.
All this is said always with an exception. If you decide that your CMS doesn't spit out these random three digits, that's fine as well. You can always submit a Google News Sitemap. You submit that through your webmaster account. And then you don't have to abide by Google News's three-digit rule.
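For those curious what that looks like, a Google News Sitemap is a regular sitemap with an extra news namespace and per-article metadata. A minimal sketch based on Google's news sitemap format (the publication name, URL, and dates are all made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/articles/some-story</loc>
    <news:news>
      <!-- publication name and language identify the source -->
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2015-04-20</news:publication_date>
      <news:title>Some Story Title</news:title>
    </news:news>
  </url>
</urlset>
```

With a sitemap like this submitted through your webmaster account, the article URLs themselves don't need to carry the unique three-digit number.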
A thread at the Local Search Forum asks a question most SEOs will ask themselves over time... Does your ranking in Google's local algorithm impact the overall core Google search algorithm at all?
The truth is, it does impact your visibility and rankings in web search, because Google's local results are often embedded in the web search results as a local pack.
But purely on the algorithmic side, does Google's local algorithm, now known as the Pigeon update, directly impact your organic rankings? Not really, in my opinion.
The truth is, the Pigeon update now includes many of Google's core web algorithm factors. Google said back then that the update ties "deeper into their web search capabilities, including the hundreds of ranking signals they use in web search along with search features such as Knowledge Graph, spelling correction, synonyms and more."
So if anything, the web search algorithm impacts more of the local results than the other way around.
One local SEO said:
In my experience local rankings are influenced by organic ones (pigeon update) and not vice versa. They are two very different algorithms. In a competitive market you usually have to have something ranking on the first page organically to show up in the local pack.