In the SEO world, there are a lot of strong feelings about what should and should not be indexed by Google. SEO is now largely about removing content, removing links, removing potentially useful stuff from your site. I find it incredibly comical at times that this is where the industry has gone, and Google is mostly at fault for it with Panda and Penguin.
That being said, someone asked Google's John Mueller about tag clouds, specifically whether the tag pages on your site should be blocked from being indexed by Google. This was asked in a Google+ hangout at the 49:57 mark in the video:
Question: Would you suggest blocking tag pages in the robots.txt? I use a lot of tags to group my tutorial pages for my readers. Is that duplicate content and so on bad for my site?
Answer: This really kind of depends on your web site - the kind of site that you have.

I think there are some kinds of tag pages that are essentially like search results pages, which probably don't make sense to get indexed. There are other kinds of tag pages that are almost like category pages, where you have a useful collection of individual pieces of content that match this category. And that might be something you do want to have indexed.

So that's not something where I'd say there is a default answer that works well for everyone. You have to work that out for your web site yourself. Look at some of those sample pages and be as objective as you can in saying, well, is this really something I want to have indexed or is this something I don't really need to have indexed.
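If you do decide your tag pages behave like search results and should be kept out, the robots.txt rule itself is simple. A minimal sketch, assuming your tags live under a /tag/ path (the path is hypothetical; adjust it to your site's URL structure):

```
User-agent: *
Disallow: /tag/
```

One caveat: robots.txt blocks crawling, not indexing, so URLs that are already indexed or heavily linked can still show up in results. If you want the pages crawled but dropped from the index, a noindex robots meta tag on the tag pages is the other option.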
Here is the video embed:
Heck, I am constantly linking to tag pages on this site because I find it more useful to tag my content than to place it in categories. I personally use my tag pages all the time, but they may not be as useful to others as they are to me. Either way, I find mine useful. :)
Forum discussion at Google+.