Thursday, February 28, 2013

Google Appending Brand In Front Of Search Results Title


Typically, when Google shows a clickable headline in the search results, it pulls from your title tag meta data in the source code of the page. There are of course plenty of times where Google will rename the headline depending on the query.

What is new is that Google seems to be appending the brand followed by a colon before some titles or headlines in the search results. Here is an example:

Google Brand Title Snippet

Now if you look at the source code of this site you will see the title is not written this way:

Title Tag No Brand

Why is this happening? We are not sure; Google has not commented on it yet.

Gordon Campbell also noticed this and posted about it.

Forum discussion at Google+.

Wednesday, February 27, 2013

Matt Cutts: 301 Redirects Dilute PageRank Equally To Normal Links

Back about three years ago, we covered an interview between Google's Matt Cutts and Eric Enge of Stone Temple Consulting where we thought we learned that 301 Redirects Do Not Pass Full PageRank & Link Value.

The truth is, they do not pass full PageRank, but neither do normal links, which we knew. The issue was that most people felt 301 redirects pass LESS PageRank than normal links, and that is not true.

Google's Matt Cutts posted a video yesterday saying:
 
The amount of PageRank that dissipates through a 301 is currently identical to the amount of PageRank that dissipates through a link.

So currently, there is no difference between a 301 and a link in terms of PageRank dilution.

That being said, if you have many redirects, like chains of them from one URL to another to another, that is a known bad thing. But one or so won't hurt you.
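To make the chain point concrete, here is a purely illustrative Python sketch. Matt says the PageRank dissipating through a 301 is identical to that of a link; the exact fraction is not public, so the 15% per-hop figure below is just the commonly assumed damping-factor number, not anything Google has confirmed:

```python
# Hypothetical: if every hop (plain link or 301) loses the same fraction of
# PageRank, a chain of redirects compounds the loss. The 15% dissipation
# figure is an assumption, not an official number.

def pagerank_retained(hops: int, dissipation: float = 0.15) -> float:
    """Fraction of PageRank surviving `hops` links/redirects in a row."""
    return (1.0 - dissipation) ** hops

one_hop = pagerank_retained(1)    # a single redirect keeps ~85%
chain = pagerank_retained(4)      # a 4-hop chain keeps only ~52%
```

Under that assumption, one redirect is harmless but a four-hop chain has already lost nearly half the value, which is why chains are the thing to avoid.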

Here is Matt's video:



This is one of those topics that is pretty cut and dried, but as you ask more questions, with more variables in the mix, it becomes a "what if" type of scenario that is not so cut and dried. That is why we had the confusion in the first place: Matt did not want to lead people on the first time, I guess.

Forum discussion at WebmasterWorld & Google+.

Saturday, February 23, 2013

Google AdSense Publisher Claims JavaScript Framebuster Lead To Google Ban

A WebmasterWorld thread has one Google AdSense publisher claiming that a JavaScript framebuster technique led to his site and AdSense account being banned by Google.

He said that after several years, his account was banned by Google AdSense because his JavaScript framebuster code caused the page to reload in an endless loop. It led to a crazy number of impressions over a 40 minute period. The publisher wrote:

The account was disabled on Jan 18 2013. But Google Analytics intelligence report showed a spike in pageviews (and they weren't clicks but pageviews do inflate advertiser costs) on 18th Jan 2013. After digging deep, I realised that someone from France (I'm in India) was using the Google Translate service to view my site. And my site has a framebuster javascript which caused the page to load in an endless loop. The usual pageviews of about 800 per day got inflated to 4500 that day. I've sent them a clarification. But I doubt they'll listen of their highhandedness. I was at fault but I'm not a cheat. I've paid my apologies and the last respects to AdSense.

Some are having a hard time believing a publisher would be banned so quickly for such a mistake. Some are saying it is possible and are trying to ensure things like this do not happen to themselves.

Netmeg said that she is extra careful: 

Stuff like this is why I watch my traffic like a hawk (not my AdSense, my traffic) and if I see any anomalies, AdSense comes off till I nail them down.
I guess that is being incredibly careful.

Are you paranoid about this? Have you seen anything like this before?

Forum discussion at WebmasterWorld.

How To Leverage Search Queries Data Within Google Webmaster Tools

About three years ago, Google launched search query data within Google Webmaster Tools.

The reporting in the search queries section is one of the more valuable areas within Webmaster Tools, especially since Google defaulted to SSL search years ago.

Maile Ohye from Google posted a 12 minute video going through all the ways SEOs and webmasters can use the search queries data to benefit their sites and conversions. Here is that video:



I recommend you watch it when you are not distracted. I think for most people, it will be a good 12 minutes spent.

Forum discussion at Google+.

Friday, February 22, 2013

A reminder about selling links that pass PageRank

Google has said for years that selling links that pass PageRank violates our quality guidelines. We continue to reiterate that guidance periodically to help remind site owners and webmasters of that policy.

Please be wary if someone approaches you and wants to pay you for links or "advertorial" pages on your site that pass PageRank. Selling links (or entire advertorial pages with embedded links) that pass PageRank violates our quality guidelines, and Google does take action on such violations. The consequences for a link-selling site start with losing trust in Google's search results, as well as a reduction of the site's visible PageRank in the Google Toolbar. The consequences can also include lower rankings for that site in Google's search results.

If you receive a warning for selling links that pass PageRank in Google's Webmaster Tools, you'll see a notification message to look for "possibly artificial or unnatural links on your site pointing to other sites that could be intended to manipulate PageRank." That's an indication that your site has lost trust in Google's index.

To address the issue, make sure that any paid links on your site don't pass PageRank. You can remove any paid links or advertorial pages, or make sure that any paid hyperlinks have the rel="nofollow" attribute. After ensuring that no paid links on your site pass PageRank, you can submit a reconsideration request and if you had a manual webspam action on your site, someone at Google will review the request. After the request has been reviewed, you'll get a notification back about whether the reconsideration request was granted or not.

We do take this issue very seriously, so we recommend you avoid selling (and buying) links that pass PageRank in order to prevent loss of trust, lower PageRank in the Google Toolbar, lower rankings, or in an extreme case, removal from Google's search results.

Thursday, February 21, 2013

Updating Your Images? Should You Keep The Original File Names?

A WebmasterWorld thread brings up a topic I've honestly never considered before (which is why I love SEO forums). Here is the situation...

You have a web site with lots of images, let's say an e-commerce site with lots of product images. You decide to replace all the old images for each product with new fancy images.

The question is, what do you do with the old images? Options are:

(1) Leave them on the server.
(2) Delete them from the server.

Now, if you decide to leave them, all they will do is eat up storage space (and bandwidth, if anything still requests them). That is, unless they somehow continue to display on the original product landing page and those somehow convert.

If you decide to delete them, then you have a few things to consider. Do you just delete them and upload new ones under new file names, or do you delete the old ones and replace them with the new ones but keep the same file names?

For example, you sell blue widgets on a page. That page has an image of a blue widget at domain.com/images/bluewidget.jpg. Do you put the new image at a new file name, i.e. /images/newbluewidget.jpg, or just overwrite the old one at /images/bluewidget.jpg?

What would you do?

Update: Google's Pierre Far responded to my question on Google+ with the answer:

You can keep the same filename if you're just updating the image of a product - i.e. the new image and the old image are about the same thing. As Googlebot re-crawls the images (assuming there aren't any robots.txt disallow directives), we'll start showing the new images in the search results. How long it will take to refresh all images on a site depends on a lot of things.

Forum discussion at WebmasterWorld.

Google Rarely Updates The Penguin Algorithm

It has almost been five months since the last Penguin refresh and no updates to the Penguin algorithm are in sight.

In fact, I reported yesterday at Search Engine Land that No, Google Hasn’t Released Unannounced Penguin Updates. Why did I have to report that? Couple reasons:

(1) There was some speculation based on a video hangout with John Mueller that Penguin refreshed regularly. It does not, it never did, and the truth is, it refreshes very rarely.

(2) It has been almost five months since an official Penguin update and I wanted to make sure we didn't miss any updates.

Google has told us that Penguin is rarely refreshed, unlike Panda, and we haven't missed any Penguin refreshes since.

What was John talking about? He was talking about how normal link analysis is refreshed and rerun continuously.

I posted this on my Google+ page and then someone brought up the Zebra update. There is no such thing, stop asking me about Zebras. There was not a Zebra update.

Forum discussion at Google+.

Google's Official Guidelines For Moving A Business On Google Maps

We've covered the major issues with moving a business within Google Maps a couple times - in short, it is a scary and upsetting thing to have to deal with - on top of all the other issues with Google Maps.

Google's Jade Wang finally gave official advice on what to do when you do move your business. She wrote in a Google Business Help thread:

Are you the verified business owner of a page, and is your business moving locations? Here's what you do.

Edit your address in Google Places for Business or in the Google+ page admin area, whichever you are using to manage the page. This will either make a new page or edit the address on the existing page. It may take a week or two after editing your address before you see an update. At this point, you may need to go through a verification process again. Don't worry -- this is normal.

If you see a page that's still got the old address, click on Report a problem and mark that location as closed. Provide the link to the new address or information about the new location if possible. You can find more instructions on closing a location here: http://goo.gl/YZIjq

I actually took these steps for my business and it did create two listings. But I do not want to mark my old listing as "closed" because my old listing is still the number one result, and I do not want my business to appear as if we closed down.

Wednesday, February 20, 2013

Google Panda #25 Coming Today? Not Sure.

An ongoing WebmasterWorld thread has some chatter around an increase in GoogleBot crawl activity as well as some early ranking fluctuations.

That, and since we are just about at the 30 day mark from the previous Google Panda update, Panda #24, we suspect a Panda update is about to hit today or tomorrow.

Normally, days before a Panda update is announced by Google, we see this type of chatter and GoogleBot activity. The issue is, it has been almost 5 months since the last confirmed Penguin update, so webmasters are unsure what is going on with that.

That being said, Mozcast showed some activity the other day, so did SERPs.com, however SERPMetrics doesn't show much, and now DigitalPoint shows changes (See "Search Engine Rank Changes") also but nothing crazy.

Is a Panda refresh about to hit us? I suspect so but only Google can confirm that.

Forum discussion at WebmasterWorld.

A Google Penalty Removal Leads To Less Google Traffic?

A WebmasterWorld thread has an interesting case I've never seen before.

A webmaster, who is a "senior member" at WebmasterWorld, claims he received a notification that Google revoked a manual penalty they had on his site for a while. Three days after the penalty revocation the traffic from Google "completely" stopped coming in.

It is like he was better off with the manual penalty versus not having the penalty at all.

We know Google does manually revoke partial penalties but this doesn't seem like a partial revocation.

One theory given in the thread is that now the manual penalty was revoked, an automated algorithmic penalty kicked in and made things worse. I am not sure I believe that 100%. I don't think it works that way, but who knows - I don't have that type of knowledge. It just doesn't seem right based on the knowledge I do have.

Netmeg explained his theory in the thread:
Is it possible that once the manual action was revoked, one of the algorithmic changes finally kicked in?
Or maybe it is just a temporary glitch and things will fix themselves in a few more days?

Forum discussion at WebmasterWorld.

Tuesday, February 19, 2013

Webmasters Sending Link Modification Requests Due To Google Link Notifications

A WebmasterWorld thread has one webmaster explaining he is now receiving emails from other webmasters to change how he links to them.

The issue is, Google is sending out messages about unnatural links and in response to that, webmasters are freaking out and emailing everyone and anything that links to them to change the link.

Heck, I've even received some asking me to either change the link or remove the link completely. 

We even asked if someone can sue for linking to someone else. The issue now is that Google is sending a link warning saying your site is being penalized, so there is more evidence that a site linking to you is potentially causing it financial damage. So a lawsuit is not out of the question.

Some are taking other approaches and charging a fee to remove links. I kid you not.

In the case at WebmasterWorld, this webmaster received an email from another webmaster saying:
We have been penalized for unnatural links, can you please change the anchor text of your links from "blah" to "blah blah".
I don't see how changing the anchor text will help, it is still an unnatural link.

SEO Best Practices: High Quality Content

Today we’re discussing SEO best practices. When you’re trying to get a handle on “how to do SEO”, you’ll often find articles that refer to “old” SEO practices versus “new” SEO practices.

One well known historical (i.e. “old”) method of Search Engine Optimization includes dominating search results by stuffing keywords into titles, domain names, URLs and content. Now, it’s much more complicated, factoring in social engagement, social shares, the ‘weight’ of the incoming links, the amount of time (seconds and minutes) that visitors stay on your site versus bounce rate, etc.

One SEO ingredient that seems to be staying in first place on a consistent basis, and probably will remain there, is ‘high quality content’. Such content is defined differently all the time, but here’s a nice quote by Miranda Miller that describes it perfectly:

When the purpose of your blog posts, press releases, website copy or other content is to attract people to your business and convert them to customers, your content needs to offer them some value, by way of informing, convincing, or entertaining.

You need to add value for your readers by writing and sharing clear and concise information that they’re looking for, by helping them see you’ve provided a solution for their problem, or by just simply making it enjoyable to spend 10 minutes reading your piece of content. For tips on how to create great content, check out our previous post on how to write great blog posts.

Poorly written content is an SEO technique that keeps rearing its ugly head all over the Internet. SEO “experts” write low-quality content overflowing with keywords and promise their clients that they’ll achieve first-page rankings. Sadly, this isn’t the case. Google is concerned with one thing: serving the most relevant and high quality search results to the user in a timely manner.

SEO Best Practices Call to Action

Here are some questions to ask yourself when developing your content marketing plan:
  • Who will create the content for my website? Am I doing it myself or will I be outsourcing this task? If you decide to outsource this task, make sure you see samples of their work.
  • Who is your target client? Who are you writing this content for and why? Your copy should reflect the answer to these questions.
  • Do you know what your keywords are? If you are writing your own content, are you worried that it will be boring and bland because you’re writing about the same thing over and over?
  • Do you know how to measure the results of your SEO? Do you understand industry lingo such as bounce rate, conversions, etc.?

15 Important Factors of SEO Structure

The credibility of any website depends on its design, content, accessibility, rankings and its SEO structure. Many websites cut corners on their on-page and off-page optimization and still expect quick results. That won't happen until your website follows a proper SEO structure, as search engines pay close attention to a website's back-end SEO structure when ranking it. Your website should guide the search engine bots as to what to index and what not to. Below are the 15 important factors of SEO structure you should follow in your on-page and off-page optimization. When followed, these techniques can give excellent results, and the beauty of these practices is that they won't cost you a penny unless you hire someone to do them.


15 Important Factors of SEO Structure: On-Page and Off-Page Optimization


The on-page and off-page optimization factors below are arranged alphabetically for easy understanding. Since both practices are crucial to a website's success, it is best to follow them all without skipping anything.

 

Anchor Text:


Search engines give a lot of importance to anchor text when ranking pages in the SERPs (Search Engine Results Pages). Even if a website is not properly SEO structured, it can still rank well with a good number of anchor text links pointing to a page. Remember, when you link any particular text, to use a relevant keyword as the anchor. So instead of using "Click here to know more", use the actual keyword you want that page to rank for.
[Screenshot: Google results for "click here", with Adobe Reader ranked number one]

The more links that point to a page, the higher that page can rank in Google. Also make sure to write proper alternate text for linked images, so that you don't miss a chance to rank through images as well. In the screenshot above, you may have observed that Adobe Reader is ranked number one for the keyword "click here". This means a lot of websites are pointing to the page get.adobe.com/reader/ with the anchor text "Click Here".

 

Critical Errors:

 

Your website should avoid HTTP client and server errors if it is to rank well in search engines. When your website throws these errors at visitors, it creates a bad impression and makes visitors leave the site immediately. That can skew your website analytics and will hurt your reputation in the long run. Avoiding these kinds of critical errors keeps your site clean for the search engines and helps it rank well over time. Some of the critical client and server errors are:
  • 400 – Bad Request
  • 401 – Unauthorized
  • 403 – Forbidden/Access Denied
  • 404 – File Not Found
  • 408 – Request Timeout
  • 500 – Internal Server Error
  • 501 – Not Implemented
  • 502 – Service Temporarily Overloaded (Bad Gateway)
  • 503 – Service Unavailable
Note: Take quick action on 404 and 500 errors on your website.
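A small illustrative Python sketch of that triage, following the note above that 404 and 500 deserve immediate attention (the categories and urgency rules here are just the post's advice expressed as code):

```python
# Triage HTTP status codes: 4xx are client errors, 5xx are server errors,
# and 404/500 get flagged for immediate action per the note above.

URGENT = {404, 500}

def triage(status: int) -> str:
    if status in URGENT:
        return "fix now"
    if 400 <= status < 500:
        return "client error"
    if 500 <= status < 600:
        return "server error"
    return "ok"
```

Run your crawl log through something like this and fix the "fix now" bucket first.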

 

Canonical Link Element:

 

The canonical link element is used to clean up duplicate URLs on a website. If a website has a messy URL structure, rel="canonical" declares the preferred URL of a page and helps consolidate the duplicate URLs within the site. If your website produces several different URL structures for the same page, search engines may treat them as duplicates and filter some of those pages out of the results. So always keep an eye on your canonical markup to avoid these issues.
Syntax: <link rel="canonical" href="http://example.com/page.html"/>


Example:
  • http://www.example.com/page.html?pid=fgq3304 (the duplicate URL, which should declare the canonical via rel="canonical")
  • http://www.example.com/page.html (the canonical URL)
You can learn more about canonical link element and canonical HTTP headers from Google itself.
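As a purely illustrative sketch matching the example above (page.html?pid=... canonicalizing to page.html), here is how the canonical URL and tag could be derived by stripping query parameters. Real sites must whitelist any parameters that genuinely change content before doing this:

```python
# Sketch: derive a canonical URL by dropping the query string and fragment.
# Assumes query parameters never change the page content -- verify that
# before applying this to a real site.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

def canonical_tag(url: str) -> str:
    return '<link rel="canonical" href="%s"/>' % canonical_url(url)
```

So canonical_url("http://www.example.com/page.html?pid=fgq3304") yields the clean page.html URL from the example.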

 

Do-Follow & No-Follow Attributes:

 

Followed and nofollow links are two concepts you should understand before structuring your site's SEO. A followed link (the default; there is no actual "dofollow" attribute) lets the spiders pass through the link and crawl onward, which is why most websites link internally: to let the bots crawl throughout the site. The rel="nofollow" attribute, by contrast, asks the bots to ignore the link. Keep a sensible balance of followed and nofollowed links within your website; too many of either may affect your site in the long run.
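To audit that balance, a quick stdlib-only Python sketch (illustrative, not a production crawler) can tally followed versus rel="nofollow" links on a page:

```python
# Count followed vs. rel="nofollow" links in an HTML snippet using the
# standard library's HTMLParser.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        rel = dict(attrs).get("rel") or ""
        if "nofollow" in rel.split():
            self.nofollow += 1
        else:
            self.followed += 1

audit = LinkAudit()
audit.feed('<a href="/a">internal</a> <a rel="nofollow" href="http://x.test">ad</a>')
```

After feeding a page, audit.followed and audit.nofollow give you the two tallies to compare.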

dofollow and nofollow links

 

Duplicate Pages:

 

Google shows no mercy to websites that are careless about duplicate content. A website should keep its pages, content and links clean, or it risks being filtered or demoted in Google's search listings. Though there are many ways to find duplicate issues within a website, I personally use the Screaming Frog SEO Spider tool to detect and eliminate duplicate pages on my sites.
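A crude stand-in for what crawler tools report: hashing each page's normalized body text flags exact duplicates. This illustrative sketch only catches word-for-word copies, not near-duplicates:

```python
# Flag exact-duplicate pages by hashing whitespace/case-normalized text.
import hashlib

def content_hash(text: str) -> str:
    normalized = " ".join(text.split()).lower()
    return hashlib.sha256(normalized.encode()).hexdigest()

def find_duplicates(pages: dict) -> list:
    """Return (duplicate_url, original_url) pairs."""
    seen, dupes = {}, []
    for url, body in pages.items():
        h = content_hash(body)
        if h in seen:
            dupes.append((url, seen[h]))
        else:
            seen[h] = url
    return dupes
```

For example, two product pages differing only in spacing or capitalization would be reported as one duplicate pair.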

duplicate issue 

 

External Links:

 

External links are simply hyperlinks that point to another domain. If your website points to another website, that is an external (outbound) link; similarly, if another website points to yours, that too is an external link. Keep track of all followed links and their status codes from your website, as it is important to know how your site is linked with other domains. Never link to bad sources for quick results; that practice can ruin a whole website in the long run.

external link 

You should also keep an eye on the inlinks and outlinks of every page linking to a URI.
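The internal/external bookkeeping above boils down to comparing hostnames; here is a minimal illustrative sketch of that check:

```python
# Classify a link as external by comparing its hostname to the site's host.
# Relative links (no hostname) are treated as internal.
from urllib.parse import urlsplit

def is_external(link: str, site_host: str) -> bool:
    host = urlsplit(link).netloc
    return bool(host) and host != site_host
```

Splitting a page's links this way gives you the outbound list to check status codes and link quality against.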

 

File Size:

 

A website should always keep an eye on its page file sizes, as they affect loading speed and can hurt site performance in many ways. Google pays close attention to page speed, as it ties directly into user experience and bounce rate. If visitors experience slow pages even on fast connections, they tend to leave quickly without waiting for the page to load completely. That is very bad for your site's reputation, and you may be losing loyal visitors until you correct the issue. You can avoid these kinds of issues with changes like page compression, smaller images, trimming excess code, and so on.
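On the page compression point: most servers can gzip responses transparently (Content-Encoding: gzip), and HTML compresses very well because markup is repetitive. A small illustrative measurement:

```python
# Measure how much gzip shrinks a page. HTML's repetitive markup usually
# compresses dramatically; the sample page below is an extreme example.
import gzip

def compression_ratio(html: str) -> float:
    raw = html.encode()
    return len(gzip.compress(raw)) / len(raw)

page = "<p>widget</p>" * 500   # toy repetitive page, ~6.5 KB uncompressed
ratio = compression_ratio(page)
```

Real pages won't compress quite this hard, but 60-80% savings on HTML is common, which is why enabling compression is one of the cheapest speed wins.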

speed up your website

 

 

Header Tags:

 

Whenever you use H1 and H2 tags in your on-page optimization, follow a few guidelines. Don't skip writing an H1 and H2 for your page content; if you are writing an article, follow the H1/H2 pattern with relevant keywords and try to avoid piling on multiple header tags. Also make sure a header tag is not duplicated on a page and does not exceed 70 characters.
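Those rules (exactly one H1, nothing over 70 characters) are easy to script; here is an illustrative stdlib-only check:

```python
# Audit H1/H2 headings: expect exactly one H1, and no heading over 70 chars.
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.headings = []   # list of (tag, text)
        self._open = None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2"):
            self._open = tag

    def handle_data(self, data):
        if self._open and data.strip():
            self.headings.append((self._open, data.strip()))

    def handle_endtag(self, tag):
        if tag == self._open:
            self._open = None

def heading_problems(html: str) -> list:
    audit = HeadingAudit()
    audit.feed(html)
    issues = []
    h1s = [text for tag, text in audit.headings if tag == "h1"]
    if len(h1s) != 1:
        issues.append("expected exactly one h1")
    issues += ["too long: " + t for _, t in audit.headings if len(t) > 70]
    return issues
```

An empty issues list means the page passes both heading checks from the paragraph above.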

header tags

 

 

Images:

 

Keep track of your site's images: check for files exceeding 100 KB, images missing alt text, and images whose alt text runs over 100 characters. These factors are very important for image SEO, as images can fetch a good amount of traffic in the long run.

 

Meta Data:

 

Meta title, meta description and meta keywords are on-page elements you need to get right. An optimized page title should be genuine, consistent with the H1, and no longer than about 70 characters; likewise, write the meta description within about 156 characters and don't duplicate it anywhere else. Google pays little attention to meta keywords these days, though some other search engines may still read them.
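A tiny illustrative check for those limits. Note the 70/156 cutoffs are the rules of thumb from the paragraph above; search engines actually truncate by pixel width, so treat these as approximations:

```python
# Check meta title/description against the rule-of-thumb length limits
# (70 chars for titles, 156 for descriptions -- approximations only).

def check_meta(title: str, description: str) -> list:
    issues = []
    if len(title) > 70:
        issues.append("title too long")
    if len(description) > 156:
        issues.append("description too long")
    return issues
```

Running this over every page's title/description pair quickly surfaces the ones likely to get truncated in the SERPs.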

meta data

 

 

Page Depth Level:

 

Page depth is a tricky concept for novice users. It describes how many levels (clicks) a search engine must crawl from the homepage to reach your content. If your content sits many clicks away, it is harder for the search engines to reach. Note that what matters is click distance from the homepage, not how deep the URL path looks. For example:
  • example.com/deep/deep/deep/deeppage.php (linked directly from the home page, so it is okay)
  • example.com/rootpage.php (4 clicks from the homepage, which might create a problem)
These kinds of page depth issues can creep in without your noticing. Pay attention to how deep your pages sit, and to what steps can be taken to avoid the problem.
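Click depth is just a breadth-first search over your internal links. This illustrative sketch computes it for a toy site mirroring the example above, where the deep-looking URL is actually one click from home:

```python
# Compute click depth from the homepage via BFS over internal links.
from collections import deque

def click_depths(links: dict, home: str = "/") -> dict:
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depths:
                depths[nxt] = depths[page] + 1
                queue.append(nxt)
    return depths

site = {
    "/": ["/deep/deep/deep/deeppage.php", "/cat"],
    "/cat": ["/sub"],
    "/sub": ["/sub2"],
    "/sub2": ["/rootpage.php"],
}
depths = click_depths(site)
```

Here the deep-path page is depth 1 while the root-level rootpage.php is depth 4: exactly the inversion the example warns about.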

 

Redirects:

 

Redirects are very useful when you are moving your website to a new domain. It is a process of forwarding one URL to different URL with three major kinds of redirects.
  1. 301 Redirect – Moved Permanently
  2. 302 Redirect – Found or Moved Temporarily
  3. Meta Refresh
Syntax: wp_redirect(get_permalink($url),301);

301 redirects

There are many plugins in WordPress which help you to redirect a particular page to some other page without any issues.
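Whether done by a plugin or by hand, a redirect map is worth auditing: single hops are fine, but chains (and loops) waste crawl effort. An illustrative Python sketch of following such a map:

```python
# Follow a map of 301s to reveal chains; a loop guard and hop limit keep
# this from running forever on broken configurations.

def redirect_chain(redirects: dict, url: str, limit: int = 10) -> list:
    chain = [url]
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        if url in chain:        # redirect loop detected, stop
            break
        chain.append(url)
    return chain

hops = {"/old": "/interim", "/interim": "/new"}
```

Any chain longer than two entries here means visitors and bots are bouncing through an intermediate URL that should be repointed directly at the final destination.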

 

Robots File: 

 

You should learn how to write a robots.txt file, as it is the main way to keep the Google bots from crawling whatever you don't want them to. A robots.txt file mainly consists of User-agent, Disallow, Allow and Sitemap lines; directives such as noindex, nofollow, noarchive, nosnippet, noodp and noydir belong in the robots meta tag (or X-Robots-Tag header) instead, not in robots.txt. Each of these is important in its own way, so get comfortable with the robots rules to keep your site's SEO structure in perfect shape.
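Python's standard library can parse robots.txt rules, which makes it easy to sanity-check what a well-behaved crawler would be allowed to fetch. A small illustrative example:

```python
# Parse a robots.txt ruleset and test what a crawler may fetch.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

public_ok = parser.can_fetch("*", "http://example.com/public/page.html")
private_ok = parser.can_fetch("*", "http://example.com/private/page.html")
```

Checking your own rules this way before deploying them helps avoid accidentally blocking sections you want crawled.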

google robots

 

 

X-Robots-Tag HTTP Header:


The X-Robots-Tag header is the HTTP-header counterpart of the robots meta tag: it lets you apply the same directives, just a little differently. Keep in mind that if a page has many links pointing to it, it can rank even without being crawled, which shows that a page cannot truly be hidden just by blocking access in robots.txt; you cannot really hide anything you don't want the search engines to notice that way. Since this topic is a little complicated, I would suggest reading up on it only if you are interested.
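To illustrate, the header is just a comma-separated directive list in the HTTP response; here is a minimal sketch of checking a response's headers for noindex (a real client should also handle case-insensitive header names and user-agent-scoped forms):

```python
# Check whether a response's X-Robots-Tag header carries "noindex".
# Simplified: assumes exact header-name casing and no user-agent prefix.

def is_noindexed(headers: dict) -> bool:
    value = headers.get("X-Robots-Tag", "")
    directives = [d.strip() for d in value.lower().split(",")]
    return "noindex" in directives
```

Unlike the meta tag, this works for non-HTML resources (PDFs, images) too, which is the main reason to reach for the header form.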

XML Sitemap Generator:

 

Any website which doesn’t have an XML sitemap is considered to perform very poorly in all aspects. Your website should have a proper XML sitemap, and must be submitted to Google search engines via Webmasters for good results in future. You can create basic XML sitemap using Google XML sitemaps plugin for WordPress.

These are the 15 important factors of SEO structure to follow for better results. Some of the topics above may be completely new and confusing; don't put too much effort into anything beyond your own site's optimization needs. Please leave your valuable comments and queries if you are confused about any part of this article.

Sunday, February 17, 2013

How to Recover From Google Panda Update #24?

Are you still looking to recover from Google Panda update #24? If you were affected by the update, which was released on 22 January 2013, then you need to change your techniques, SEO plans and working strategies. Here I will share some important tips that can help you recover from the recent Google Panda update.

Google's new search algorithm update affected 1.2% of English queries. It hit most, if not all, low-quality websites featuring poor-quality backlinks and thin content. The update was designed to remove low-quality websites from the top of the SERPs (Search Engine Result Pages). Webmasters who follow good SEO practices should not be penalized by either the Panda or the Penguin updates.

Some SEO tips that will help you recover from the Google Panda update

Quality Content

It is often said that content is king in SEO, so update your website or blog content on a regular basis to keep the information fresh and appealing to readers. Also check that your website content has not been copied to other websites. If your website content is copied from anywhere, you must change it or risk being penalized by Google; duplicate content definitely affects your search engine ranking. Do not submit a single article or blog post to multiple websites, as such practices are also penalized by Google's updates. Quality content means writing a complete post, with full details and targeted keywords, for one website only.

Create Backlinks from high PR Websites

Create high-quality backlinks from high PageRank (PR) websites, and ensure that those websites are related to your article or blog topic. Do not create backlinks on unrelated topics. Submit your website to high PR directories, bookmarking sites, forums, etc. Some other link-building techniques you can use are:
  • Guest Posting: - Guest posting is effective for creating back links to your website/blog from high page rank websites and it also helps drive a lot of traffic to your website or blog.
  • Article Submissions: - Submit articles to high PR directories like ezinearticles.com, but never submit a single article to multiple websites. You should also use one URL link in your author biography in the article, which will help drive traffic.
  • Forum Posting: - Forum posting is another good way that can help improve traffic to your website and it also gives good quality backlinks. You can include a link to your site in signature area.
  • Press releases: - Press releases can also help provide good back links to your website/blog.
  • Video Submission: - Video submissions of presentations also help increase inbound traffic to your website. Create a video and submit it to YouTube, Blip.tv, Vimeo and Viddler. Google tends to give more weight to a video than to a text article, and video submissions will definitely increase product visibility on the Internet.

Currently, social media is playing a key role in the success of new websites and blogs. Google is often accused of giving priority to G+ (Google+) over Twitter, but making your presence felt on Facebook, G+ and Twitter is still essential. If you are not currently using social media sites like Google Plus, Facebook, Twitter and Tumblr, you should start immediately. If you don't have enough time to perform social media optimization (SMO) yourself, you can hire a third party to carry out those tasks.

Stop selling external links: If you are selling paid links from your website, you must stop immediately if you want your website to stay at the top of Google's search rankings. If you need to link out to any website, ensure that you use the rel="nofollow" attribute for such links, because Google seldom penalizes sites carrying nofollow links even when they are unrelated to the topic of the website or blog.