Wednesday, June 27, 2012

Google Panda Update Version 3.8 On June 25th

Google has announced they pushed out a new refresh to the Panda algorithm recently. This update “noticeably affects only ~1% of queries worldwide,” said Google on Twitter.

There were earlier rumors of an update over the weekend, but Google said the rollout began on June 25th, not over the weekend.

The previous Panda update was on June 8th, and the one before that on April 26th. Typically, Google pushes out Panda and Penguin refreshes every month or so. Although the last Panda update was just over two weeks ago, Google evidently felt a new refresh was warranted.


Google said there were no updates to the algorithm or changes in the signals. This was simply a basic data refresh where they ran the algorithm again.

For more on the Panda update, see our Panda update category.

Wednesday, June 20, 2012

What is Robots.txt?

The robots exclusion standard, more commonly known as robots.txt, is a text file placed in the root directory of a website. It is a convention created to direct the activity of search engine crawlers, also called web spiders. The file tells crawlers which parts of a website to crawl and which to leave alone, separating what should be discoverable through search from what is meant only for the site's creators. Search engines consult robots.txt when categorizing and archiving web pages, and webmasters use it to keep works-in-progress and private areas out of the index.

A robots.txt file works as a request that specific robots ignore the directories or files it lists. Websites with sub-domains generally need a separate robots.txt file for each sub-domain, so that information not meant for the public is not picked up in keyword searches. Excluding such pages also raises the keyword relevance of the actual page text and keeps visitors from landing on content that is misleading or irrelevant to their searches. Keep in mind, though, that robots.txt is purely advisory: no rule requires websites to have a robots.txt file, and nothing forces crawlers to obey one.
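To see how a crawler applies these rules, here is a small sketch using Python's built-in robots.txt parser. The domain, paths, and user-agent name are hypothetical examples.

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (hypothetical): block two directories
# for every crawler.
rules = """
User-agent: *
Disallow: /private/
Disallow: /drafts/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks each URL before fetching it.
print(parser.can_fetch("MyBot", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("MyBot", "http://example.com/index.html"))         # True
```

The same `can_fetch` check is what "polite" spiders perform internally before requesting any page from your site.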
  
What are Search Engine Spiders?

These sneaky devils are automated programs that scan your website for content marked as available for web robots to retrieve and rank appropriately for the searcher. These spiders, or web crawlers, essentially seek out any information not masked by the robots.txt file.

How Does a Robots.txt Blockage Come About?


Robots.txt blocks most commonly originate on staging servers. If you find yourself at the mercy of a robots.txt problem, it likely stems from the moment your staging server was rolled over to the live server. Web developers use robots.txt to keep in-progress content from being indexed as duplicate content during the building process, but the file must be relaxed when your site eventually goes live.

How To Check Your Site for Robots.txt 


You can manually check your website to rule out the possibility that it is suffering from the effects of a misplaced robots.txt setting. No need to panic over the possibility of being blacklisted by Google; keep calm and follow these simple steps:
  • Enter your domain name followed by a forward slash and robots.txt in the address bar. For example: http://thedomainname.com/robots.txt
  • If you get a 404 error page, your site simply does not have a robots.txt file.
  • Alternatively, log into your Google Webmaster Tools account to see which URLs are restricted by a robots.txt rule.
  • If your robots.txt file shows:
     User-agent: *
     Disallow: /
 
you'll need to make changes right away. That pair of lines tells every crawler to stay away from the entire site, and you should never see it on a live website.
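If you want to automate that check, a short script can flag the dangerous pattern. This is a simplified sketch: it only detects a site-wide `Disallow: /` under `User-agent: *`, and in practice you would first fetch your live `/robots.txt` file (for example with `urllib.request`).

```python
def blocks_entire_site(robots_txt: str) -> bool:
    """Return True if a 'User-agent: *' group disallows the whole site."""
    applies_to_all = False
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("user-agent:"):
            agent = line.split(":", 1)[1].strip()
            applies_to_all = (agent == "*")
        elif applies_to_all and line.lower().startswith("disallow:"):
            if line.split(":", 1)[1].strip() == "/":
                return True
    return False

print(blocks_entire_site("User-agent: *\nDisallow: /"))      # True
print(blocks_entire_site("User-agent: *\nDisallow: /logs"))  # False
```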


How To Prevent Parts of Your Site From Being Indexed


Robots.txt rules can serve you just as readily as they can hurt your website. To hide certain sections of your site from spiders and web crawlers, you can add rules to your robots.txt file. To keep ads or log files from being indexed, for example, those paths would be disallowed like this:

    User-agent: *
    Disallow: /ads

    Disallow: /logs
  • Unfortunately, robots.txt isn't a cure-all for everything you'd rather keep out of search, and you may notice its blanket effect. The basic protocol allows neither wildcards in the Disallow line nor "Allow:" lines. Google has since extended the format to support both, but these extensions are not universally honored, so it is recommended that they be used ONLY inside a "User-agent:" group aimed at Google's crawler.
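The difference between the basic protocol and Google's extension can be sketched in a few lines. The matching functions below are simplified illustrations with hypothetical paths, not a full implementation of either specification.

```python
import re

def standard_match(rule: str, path: str) -> bool:
    # Original robots.txt matching: a plain path-prefix comparison.
    return path.startswith(rule)

def google_match(rule: str, path: str) -> bool:
    # Google's extension: '*' matches any run of characters,
    # '$' anchors the rule to the end of the URL path.
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

print(standard_match("/ads", "/ads/banner.png"))  # True  (prefix match)
print(standard_match("/*.pdf", "/files/a.pdf"))   # False (no wildcards)
print(google_match("/*.pdf", "/files/a.pdf"))     # True  (wildcard honored)
```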

 

Does the Robots.txt Prevent Users From Viewing Certain Content?


Absolutely not. Adding rules to your robots.txt will only stop web-screening spiders from collecting content from those portions of your site. The content itself remains fully viewable to every visitor to the page, and those visitors will be completely unaware of its robots.txt status. In all honesty, robots.txt only denies access to "polite" spiders; in reality, less well-mannered crawlers are likely weaving through that data anyway.

If you really want to protect certain data, content, or sections of your website, your best bet is to password-protect those areas. Also remember that if you want content officially removed from the index, you must add a robots "noindex" meta tag to each and every page you want unequivocally removed.
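As a rough illustration, this is the kind of tag a crawler looks for when deciding whether to drop a page from the index. The detector below is a simplified sketch using only Python's standard library, and the page markup is example content.

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Detect a <meta name="robots" content="...noindex..."> tag."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "meta" and a.get("name", "").lower() == "robots"
                and "noindex" in a.get("content", "").lower()):
            self.noindex = True

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
detector = NoindexDetector()
detector.feed(page)
print(detector.noindex)  # True
```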

Understanding the simpler aspects of running and maintaining your website will likely save you money both up front and over the life of your business. If your website has disappeared from Google search, or has become extremely hard to find, your first step should be to double-check your robots.txt. There is no need to pay a tech professional when you are well equipped to rule out the easy fixes yourself and get back to the land of the living as far as the web is concerned!

Search Engine Optimization Techniques: Article Submission Is Best

SEO professionals use several techniques, and article submission is one of the most successful among them. You write articles relevant to your business and submit them to article directories to build the popularity of your website.
The main purposes of article submission are to drive good traffic to your website and to earn high-quality backlinks from reputable sites. Article directories are free websites, so you can gain that traffic at no cost.
Article submission has several benefits that are enormously helpful for promoting your business around the world. It can raise your search engine ranking in relatively little time and earn the high-quality backlinks that well-rated websites depend on.
While writing an article, keep in mind a few tips that will help you produce good-quality content and entice every visitor to read your article from top to bottom:
1. Your article must be original and valuable, providing genuinely useful information to the visitor.
2. Use your keywords in the article, but don't over-saturate the text with them; that is known as keyword stuffing.
3. Aim for a sensible length: content that is too short looks thin, and content that is too long annoys visitors.
4. Always choose a unique title for your article that incorporates your main keywords.
Content is king, and it is the key to a successful website, so always produce unique, high-quality content for visitors, not for search engines. If you are a search engine optimization professional, you must keep the latest Google algorithm updates in mind and do high-quality, legitimate work. Below I share some of the most popular and important article submission websites, which should help you attain good rankings on the major search engines.

Off Page Optimization

Off-page optimization is an integral part of search engine optimization. Many factors are involved in this side of SEO, and together they help a website reach top rankings in the organic results on the major search engines. Off-page factors are entirely different from on-page ones. In this post I will explain the purpose of each factor, so let's start with the first and most essential.

     Off Page Search Engine Optimization Checklist

1   Search Engine Submissions
2   Directory Submissions
3   Social Bookmarking Submissions
4   Article Directory Submissions
5   Press Releases Submissions
6   RSS feed Submissions
7   Classified Ads Posting
8   Forum Posting
9   Link Wheel
10  Hub Pages Creation
11  Squidoo lens creation
12  Yahoo Answers
13  Video Submissions
14  Blog Creation & Commenting
15  Deep Link Submission
16  Blog Directory Submission
17  Product Listing
18  Profile Creation
19  Use Keywords in anchor text
20  Obtain links from high ranking publisher sites

Latest Updated On Page Optimization Strategy


In this post, I summarize the most important on-page optimization factors. They should prove enormously helpful in building a sound website and improving its readability for returning and new visitors alike, and they will bring your website to profitability sooner.

Anyone running an online business today and hoping to succeed in their venture should remember one thing: "no pain, no gain." If you want success, you have to work in unique and better ways. "Successful people don't do different things; they do things in a different way," and the same principle applies to optimizing your website.

On page factors

1 Title Tags Optimization
2 Meta Tags Optimization
3 Meta Descriptions
4 HTML Tags Optimization
5 Keywords Optimization
6 Link Optimization
7 Images Optimization
8 Content Optimization
9 URL Structure
10 Internal Link Strategy
11 Keywords Density
12 Site Map Both XML and HTML
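Keyword density (factor 11 in the list above) is easy to quantify. Below is a rough sketch in Python; the tokenizer and the sample sentence are my own simplifications, and real SEO tools normalize text more carefully.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO tips: good SEO content reads naturally, not like SEO spam."
print(round(keyword_density(sample, "seo"), 1))  # 27.3 (far too high for real copy)
```

A density that high would read as keyword stuffing; most practitioners aim for natural prose where the keyword appears only a few times per hundred words.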


During on-page optimization, avoid common mistakes like:

1 Never use duplicate Title & Content in your website
2 Never use URL variant of the same page
3 Avoid Hidden text
4 Hidden Links
5 Keywords Stuffing
6 Doorways pages
7 Cloaking

If you make these common mistakes, Google will penalize you, your website will never become profitable, and you will not achieve success in your online ventures.

What is Press Release: Its Benefit and Writing Steps

A press release, also known simply as a news release, is how people announce new products and services, sales, and so on. It gives users up-to-date information about a company's products, website, and much more. Publishing a press release is a legitimate way to spread awareness of your new services and products around the world, and it is an excellent tool for drawing users' attention to your website.

A press release is essentially a communication tool that lets users know about your company and website. Press releases speak to users, and they also speak to search engines, which means they can bring you good traffic and increase your website's popularity and visibility. Through them, the public and your clients learn about the latest services and products you have launched or will launch very soon.

Your website can gain several benefits from a press release:

1. A press release is a marketing tool: it can boost your sales because it delivers the latest information about your products and services, and visitors and clients get better, more detailed information from it.

2. It provides exposure for your website and business: a press release helps present your website and business to the whole world and gives users solid information about what you do.

3. It introduces your site or product to new users: people discover your website and your new products through a press release, and over time, if your products and services prove reliable and effective, those readers can become clients.

4. It gives your website better visibility to search engine spiders: a press release increases your website's visibility to the search engines, and you will get good hits and traffic from it.

5. It increases the number of backlinks from other websites: backlinks play a vital role in search engine ranking, and if you have plenty of good ones, your website will rank better for your keywords.

How to write a press release:

First of all, your press release's title or heading should be strong and effective enough to grab the reader's attention. This is essential for keeping the reader's interest as they move through the detail section of the press release.

Second, make it interesting, but avoid embellishment. When giving the details of the press release, use real-life examples that relate to your topic rather than invented illustrations.

Next, write an accurate and interesting body. The body of a press release covers the basics: who, what, when, where, and why. The first paragraph should state briefly what the press release is about; the following paragraphs explain who is involved, why it matters, where the product or service is available, and when it will happen.

Finally, the press release should be concise and free of grammatical mistakes. Avoid filler and words so difficult that general readers may not understand them. Be sure to check the press release for punctuation and grammatical errors, and make sure you have followed the proper format, which is essential for any type of press release.

Classified Ads: The Best Source of Traffic for Your Website

In this era the internet reaches every corner of the world, so you can find almost anything online in moments with a few keystrokes. Millions of websites run on the internet for different purposes; mainly, people build websites to promote their brands, products, and services, and to achieve that goal they look for a search engine optimizer. SEO professionals are familiar with many different activities and techniques, each with its own function, and search engine optimization remains the best way to get a website to the top of the organic listings.

Among all the SEO tactics, classified ads are one of the best methods for driving good traffic to your website. Classified ads are basically online ads, mostly free, placed on classifieds sites. They work the same way as advertisements in any other medium, such as television, radio, or newspapers.

You can promote your business by submitting your classifieds to classified websites, and you will get near-instantaneous traffic. When people look for any kind of product or service, thousands of them browse classifieds sites and gather the information they need to make a decision. The major benefit of classified ads is that they are posted immediately and customers can contact you right away; you don't have to wait a long time for traffic the way you do with organic rankings.

Classified ads are especially helpful for promoting a small business: if you can't afford costly services and your website doesn't rank in the organic listings, this is one of the best options available. Most people who want to sell products and services give classified ads top preference.

I hope this article helps you understand classified ads: what classifieds submission is worth and how it works for your business. Google's latest algorithm gives more weight to high-PR classified sites, which are very helpful for driving good traffic. I have shared some important classified websites that can bring substantial traffic to your site.

What is Search Engine Submission and Its Benefits?


In my previous posts I explained what search engine optimization is and what its parts are, but this article is specifically about the first and foremost off-page activity: search engine submission. Anyone working in the SEO field who wants to become a search engine optimization expert needs this complete picture, and I hope every one of my posts is helpful to newcomers and specialists alike.
Search engine submission is the first step in SEO: it means submitting your website directly to the search engines so that their crawlers find and crawl it sooner. The activity is not mandatory, though, because the major search engines, Google, Yahoo, and Bing, crawl your website automatically.

Benefits of Search Engines Submission

When you create an immaculate, elegant website with unique, high-quality content, you naturally want to tell your old and new customers about it, and search engine submission is a good first step in promoting it. Submission can increase your website's link popularity, traffic, and ranking, all of which make your business more profitable and more valuable.

Importance of Search Engines Submission

Many people wonder what the importance of search engine submission is. When you upload a new website and submit it to all the major search engines, it gets indexed sooner and has a better chance of good rankings across those engines, such as Google, Yahoo, and MSN.

What is Forum Posting And How It Can Improve Your Website Ranking


Google's updated algorithm brought huge changes to link building and SEO strategy; in this post we will talk only about SEO techniques. Search engine optimization covers a lot of factors, but forum posting stands out because it has many benefits that help improve your website's position in the SERPs. Forum posting not only earns one-way backlinks but also brings niche traffic to your website.

Forum posting basically means posting new threads and replying to old and new ones. Users select threads relevant to their websites and join the discussion. The most important part of forum posting is creating a signature; in the signature, a user places links on specific keywords. Every forum has its own rules: some forum sites allow signatures at registration time, others only after several posts.

Benefits of Forum Posting

1. Create relevant backlinks: everyone running an online business shares the goal of making their website successful and popular, and if you want to build a good reputation for your site, relevant, good-quality links are the key. After May 24th, 2012, thousands of websites lost rankings, traffic, and reputation due to unnatural, low-quality backlinks, so if you want good rankings on the major search engines, you must do high-quality, legitimate work.

2. Niche traffic to your website: if your website gets relevant traffic, you can sell your products and services well, because those visitors eventually become clients and customers. Forums are among the best places to find targeted, relevant traffic. When you reply to other threads, write concise replies that entice visitors to seek further details about your business; you can draw substantial traffic from others' threads as well as your own.

3. Get direct links from forum posting: one of the foremost features of forums is that they produce direct traffic to your website. Readers who find your post interesting, or who want more information, will surely visit your site. Forum posting also improves your SERPs, because a steady stream of incoming links is automatically recognized by the popular search engines, which then index your website more frequently, and eventually its ranking increases.

I hope you have learned a lot about forum posting. Always build your strategy with Google's algorithms in mind and follow the rules. The updated algorithms now give more weight to backlinks from high-PR websites, so I have shared a long list of high-PR forum posting websites that should be extremely helpful in making your website profitable.

What is Directory Submission and Its Benefits?


What Is Directory Submission

Directory submission is an easy and reliable method of building one-way backlinks for your website.

Directory submission is simply the filing of your website's details in an online web directory. The process involves identifying the proper category, entering the website or webpage details under that related category, and getting the listing approved by a directory editor. It is not merely a data-entry job; it involves the decision of choosing a proper category, which only an attentive directory submitter can make well.

Directory Submission Benefits

Directory submissions are the affordable search engine optimization (SEO) method of choice for thousands of internet business owners across the planet. If you've spent any amount of time looking for affordable website promotion, you've likely come across webmaster forums or blogs boasting about the benefits of directory submissions. For those who don't know what all of the fuss is about, we've outlined a few directory submission benefits and what this website promotion service can do for you and your internet business in today's competitive keyword market.

Directory submissions provide increased link popularity.

By and large, the biggest reason to use directory submissions as part of your website's optimization and promotion is that incoming links are one of the most important factors in the major search engines' ranking algorithms, especially in competitive keyword markets. For instance, if you wanted your website to improve its Google rankings for keywords like "search engine optimization", "web hosting" or "car insurance", you simply could not move into the top 10 for those terms without a significantly large number of incoming links. In competitive keyword markets it is impossible to maintain, or even obtain, good placement in the SERPs (search engine results pages) without a large number of backlinks boosting your global link popularity.

In addition to increasing link popularity, there are a number of other directory submission benefits:

1. Getting listed in major search engines. The days of using the "add URL" page to submit your website to Google are long past; major search engines now find new websites through incoming links. With directory submissions, your website can get listed in Google and other major search engines in a matter of days, not the 4-6 weeks it takes using Google's "add a website" page.

2. Increased visits from search engine robots. Search engine robots are the agents that scour the web looking for new websites; if you use a directory submission service, or submit to directories on your own, you are increasing the likelihood of search engine robots crawling your website more often.

3. Increased visibility to search engines. The more incoming links you have to your website, such as those obtained through directory submissions and other link building strategies, the better. By increasing your global link popularity you increase the frequency with which search engine robots access your website, thus increasing your website's visibility to the search engines.

4. Keyword targeting. Directory submissions help website owners build keyword relevancy for their websites. By using keyword focused anchor text for your directory listings, you are improving the keyword relevancy of your website for the keywords that you use for the "website title" field during directory submissions.

5. Brand building. This falls in line with keyword targeting, but it deserves a spot on this list. Website owners can increase brand awareness by using their business name within the anchor text of their directory listing. When used in conjunction with keyword-focused anchor text during the directory submission process, you're effectively getting keyword targeting, link building and brand name building all in the same step.

6. Relevant link building. If a directory is well-managed, you will get the benefit of receiving a contextual link, or a link from a page with topically related content. This, in itself, is a powerful directory submission benefit.

There are many directory submission benefits, and a big reason they are such a popular website promotion method is that they are affordable. When considering the cost-to-benefit ratio, directory submissions are the wholesale website promotion method of choice for many of today's website owners on a budget.

If you're looking for an affordable website promotion service, directory submissions are all that and more. Mixing in the benefits of keyword targeting, brand building and increasing your website's link popularity that offer long-term results for a nominal fee is a no-brainer. When you're a website owner on a budget, directory submissions give you the biggest "bang for your buck" in terms of affordable website promotion services, and coupled with the benefits they provide what more could you ask for?

Importance of Directory Submissions:-

Increasing online visibility is at the core of every online business marketing strategy. Driving traffic to your website and getting noticed by search engines takes a lot of time and effort.

Links play a major role in determining your ranking position on the various search engines. Obtaining inbound links (back links) in both quantity and quality can influence your search engine rank immensely. One simple way to obtain back links is to submit your link to directories. Online directories exist for the sole purpose of providing web users with links to sites categorized under relevant topics. Directory submission is a website optimization strategy that no website owner should ignore.

Directories catalog links for easy access to users. Much earlier, directories were the primary source for web users to find websites on various topics. Today search engines have taken over but directories have not lost their importance yet. You cannot obviously expect too much traffic from a directory submission. But you can expect an improvement in your search engine placement.

Here is how it works. When submitting to a directory, the major hindrance, as well as the major benefit, lies in the human editor. Your submission will be reviewed by an actual person who determines how relevant and unique your website is. After passing this scrutiny and editing, your website will be accepted into the directory. That automatically establishes the credibility of your website, which is why search engines count links from directories.

Some directories even feed their databases to other directories and search engines. Search engines base a certain factor on directories in judging a website’s popularity and relevance. Depending on the quality of the directory and the number of back links your website has, search engines will be able to determine your relevance and qualify your website accordingly.

Note how important a directory listing is. If your website is low on content but rich in images, flash content, etc., search engines may not be able to categorize it when the search spider visits. By submitting to a directory under a particular category, you allow the search engine to file your site under a relevant topic.

There are general and specific directories available online. Specific directories target only those websites based on a particular subject or field. No matter what your business is, odds are that directories will be available that cater specifically to it. There are also regional directories available which may be highly useful if your online business targets a local audience.

What is Directory Submission and Benefits?

Search engine optimization consists of various kinds of activities, and directory submission is among the most basic and important. When you start an SEO project, you first do search engine submission to get indexed quickly by the major search engines; after that you move to directory submission, where you build a pile of one-way backlinks for your website. Those links are essential for every successful website, and search engines always favor one-way backlinks.

Directory submissions strengthen your website's presence in the search engines and help it achieve good rankings. While submitting your website to directory sites, pay close attention to common mistakes: if you repeat them, your website will never become profitable. I am writing this article with the latest changes to Google's algorithm in mind. Everyone in online business knows that thousands of websites lost backlinks, rankings, traffic, and popularity after May 24th because of the Google Penguin update, which detects various kinds of mistakes, so I recommend that no one repeat those mistakes.

First of all, most directory sites are free: you don't need to spend money, only time and good concentration. Google strongly favors one-way backlinks, so don't give much preference to paid or reciprocal links, and submit your website to the major directories; DMOZ and the Yahoo directory organize links under a variety of categories and sub-categories.

Choosing the correct category is the most important step, because visitors reach their destination through well-chosen categories and find themselves in the right place. Visitors who arrive at your website the correct way rate it well, because it provides exactly the information they wanted; the major benefits are a healthy bounce rate and growing popularity in the search engines.

Relevancy matters everywhere: in your website's content, keywords, and backlinks. If your website is relevant across all of these factors, it can earn real money, so always get relevant backlinks from other websites. Directory submission delivers relevant backlinks through relevant categories, so choose your categories carefully.
High-quality work: thousands of websites were penalized by Google Penguin and Panda, and anyone who wants to recover needs to do high-quality, legitimate work and earn backlinks from high-PR websites. Thousands of directory sites have good PR, so give them preference and get your backlinks there.

Tuesday, June 19, 2012

How to get XML Sitemap for a Blogger Blog

Search engine optimization is not a new term for webmasters today. If I want my website to be popular, I have to spend a lot of time on different aspects of SEO. SEO covers a lot of things collectively, like submitting your URL to search engines, link building, and so on. When you create a new page on your website, it usually gets indexed automatically, because search engine bots keep visiting your site regularly. But what if the bots miss some of the pages? In that situation, the best thing you can do is create an XML sitemap and submit it to the major search engines: Google, Bing, and Yahoo.

What is XML sitemap?

An XML sitemap is a URL directory of all the pages that exist on a website. When you submit a sitemap to a search engine, the engine can crawl and index any pages that were missed when its bots visited your website.
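For a concrete picture, here is a minimal sketch that reads a sitemap with Python's standard library. The sitemap content is a made-up example; the namespace URL is the one defined by the sitemap protocol.

```python
import xml.etree.ElementTree as ET

# Example sitemap content (hypothetical URLs).
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about.html</loc></url>
</urlset>"""

# Extract every <loc> entry, respecting the sitemap XML namespace.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)  # ['http://example.com/', 'http://example.com/about.html']
```

A search engine does essentially this, then queues each extracted URL for crawling.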

Sitemap of a Blogger blog

The default sitemap of a Blogger blog can be a problem, because it contains only 26 URLs: your pages, your home page, and your posts. So if you have 4 pages on your blog, only 21 post URLs fit in the default sitemap. That may not matter for a small blog, but if you have been blogging for a long time, you will want a fully functional XML sitemap that includes every URL on your blog. In the steps below, I will show you how to create a full XML sitemap to submit to the search engines and earn better rankings.

How to get a complete sitemap for a blogger blog?

The steps below work for blogs hosted on the Blogger platform, whether it is a standard Blogger blog (something.blogspot.com) or a Blogger blog using a custom domain.
Here are the steps to create a complete XML sitemap for a Blogger blog:
  • Open Sitemap Generator and type the address of your blogger blog. (In case of a self hosted custom domain, type your domain name)
  • Click on Generate Sitemap and wait for a few moments.
  • This will automatically generate the list of all the URLs in your blog. Copy all the generated text.
  • Now, Open your blogger Dashboard and navigate to Settings >> Search Preferences and enable the Custom Robots.txt option (available in the crawling and indexing section). Paste the text you have already copied and save the changes.
The next time the search engine bots visit your website, all of the URLs will be picked up automatically from your robots.txt file. You can also submit your sitemap manually: Click here and enter your robots file URL, i.e. BlogURL/robots.txt.
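The text the generator produces is ordinary robots.txt syntax with one or more Sitemap: lines appended. The exact feed URLs depend on your blog and post count, so the lines below are only an illustrative sketch with placeholder values:

```
User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
```

Blogs with more than 500 posts would get additional Sitemap: lines, each covering another range of posts.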

Monday, June 18, 2012

Proper SEO and the Robots.txt File

When it comes to SEO, most people understand that a website must have content, "search engine friendly" site architecture/HTML, and meta data (title tags and meta descriptions).

Another element that can trip up websites, if implemented incorrectly, is robots.txt. I was recently reminded of this while reviewing the website of a large company that had spent considerable money building a mobile version of their website in a sub-directory. That's fine, but a disallow statement in their robots.txt file (Disallow: /mobile/) meant that the mobile site wasn't accessible to search engines.


Let’s review how to properly implement robots.txt to avoid search ranking problems and damage to your business, as well as how to correctly disallow search engine crawling.

 

What is a Robots.txt File?

 

Simply put, if you go to domain.com/robots.txt, you should see a list of directories of the website that the site owner is asking the search engines to "skip" (or "disallow"). However, if you aren’t careful when editing a robots.txt file, you could be putting information in your robots.txt file that could really hurt your business.


There's tons of information about the robots.txt file available at the Web Robots Pages, including the proper usage of the disallow feature, and blocking "bad bots" from indexing your website.


The general rule of thumb is to make sure a robots.txt file exists at the root of your domain (e.g., domain.com/robots.txt). To exclude all robots from indexing part of your website, your robots.txt file would look something like this:


User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

The above syntax would tell all robots not to index the /cgi-bin/, the /tmp/, and the /junk/ directories on your website.
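You can check how crawlers will interpret such a file using Python's standard urllib.robotparser module. This is a quick sketch in which the rules are supplied directly as text rather than fetched from a live site:

```python
from urllib.robotparser import RobotFileParser

# The same rules as the example above, supplied as lines of text.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /cgi-bin/ is blocked for every crawler; the rest of the site stays open.
print(parser.can_fetch("*", "http://example.com/cgi-bin/script.pl"))  # False
print(parser.can_fetch("*", "http://example.com/about.html"))         # True
```

Testing a file this way before uploading it is a cheap safeguard against the kind of accidental site-wide blocks described below.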

 

Other Real Life Examples of Robots.txt Gone Wrong


In the past, I reviewed a website that had a good amount of content and several high quality backlinks. However, the website had virtually no presence in the search engine results pages (SERPs).


What happened? Penalty? Well, no. The site's owner had included a disallow of "/" in their robots.txt file, telling the search engine robots not to crawl any part of the website.
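In robots.txt terms, the offending directive amounts to a two-line file that blocks every compliant crawler from the entire site:

```
User-agent: *
Disallow: /
```

Because "/" matches every path on the domain, this single character is the difference between a fully indexed site and an invisible one.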


In another case, an SEO company edited the robots.txt file to disallow indexing of all parts of a website after the site's owner stopped paying the SEO company.


I also remember reviewing a company's website and noticing that several directories from their former site were disallowed in their robots.txt file. Instead of disallowing the search engines from indexing the old legacy pages, the company should have set up 301 permanent redirects to pass the value from the old pages to the new ones. By blocking them instead, all of that value was lost.

 

Robots.txt Dos and Don'ts


There are many good reasons to stop the search engines from indexing certain directories on a website while allowing others, for SEO purposes. Let's look at some examples.
Here's what you should do with robots.txt:
  • Take a look at all of the directories in your website. Most likely, there are directories that you'd want to disallow the search engines from indexing, including directories like /cgi-bin/, /wp-admin/, /cart/, /scripts/, and others that might include sensitive data.
  • Stop the search engines from indexing certain directories of your site that might include duplicate content. For example, some websites have "print versions" of web pages and articles that allow visitors to print them easily. You should only allow the search engines to index one version of your content.
  • Make sure that nothing stops the search engines from indexing the main content of your website.
  • Look for certain files on your site that you might want to disallow the search engines from indexing, such as certain scripts, or files that might contain email addresses, phone numbers, or other sensitive data.
Here's what you should not do with robots.txt:
  • Keep comments in your robots.txt file to a minimum; the standard permits lines beginning with "#", but the file should stay as simple and unambiguous as possible.
  • Don't list all your files in the robots.txt file. Listing the files allows people to find files that you don't want them to find.
  • The original robots exclusion standard has no "Allow" command, so there's generally no need to add one to your robots.txt file (though some search engines, including Google, support an "Allow" directive as an extension).
By taking a good look at your website's robots.txt file and making sure that the syntax is set up correctly, you'll avoid search engine ranking problems. By disallowing the search engines from indexing duplicate content on your website, you can potentially overcome duplicate content issues that might hurt your search engine rankings.
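Putting the "dos" above together, a conservative robots.txt for a typical site might look something like this (the directory names are illustrative, not a recommendation for any specific site):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /scripts/
Disallow: /print/
```

Everything not listed remains crawlable, which satisfies the rule that nothing should stop the search engines from indexing your main content.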

Tuesday, June 12, 2012

The Four Keys To Post - Penguin Directory Submission Happiness

The Google+ Local launch came out while I was working on this piece so it merits a brief mention. While Google+ Local will be a big deal, at the moment for most local businesses, it is not a huge deal. Thus far, there does not appear to be much change in Google’s main local rankings algorithm besides the fact that Google+ Local pages will be indexed versus the old Places pages which were pseudo-indexed.
It seems as if this update is mostly about getting us ready for changes coming down the road, where social activity gets even more ingrained in Google’s algo. If you’re interested in the subject, you can read some more thoughts on why Google+ Local may be a ghost town and some prognostication from about a year ago that I think is still pretty much on the money. Of course, the local searcharati have plenty to say on the matter.
Now back to our regularly scheduled programming…
In this post-Penguin landscape (imagine Mad Max, but with more geeks and less leather) there has been a renewed focus on quality directory submissions. While not as glitzy and glamorous as more recent strategies (like infographics and guest posting on blogs) – locally focused, vertical and niche directory submissions have been a solid bet in any link-builder’s portfolio for quite some time.
For the most part they are a pretty painless task, and while some may cost you a bit, they can provide positive results. However, while directory submissions are generally safe, a focus on quality, diversity, timing and relevance must play a central part in selecting which directories to submit to.

Quality

All local, vertical and niche directories are not created equal, and higher quality ones should really be the only ones that you are focusing on.
Make sure to check and see if Google has the directory indexed. If it doesn’t, that is a major red flag that you should pass up that particular directory.
Paid niche directories are typically seen as a safe bet. When I say “paid”, I’m not talking about those kind of paid links. It may seem contradictory that paid links are bad while paid directories are okay, but Google is looking for, and refining its definition of, signals of authority.
Many paid directories continue to be good signals of authority, due to the cost of entry that pays for their editorial reviews that help ensure quality. In particular, vertical and niche directories can be high quality, but you should steer clear of those that are covered in ads and don’t have an editorial review process.

Relevance

If you were beginning to peruse a scrapbooking directory (yes, they exist) and came across multiple listings for divorce lawyers, would you be likely to stay on the site? No? Well, Google agrees and has been actively deindexing directories that do such things.
Making sure that the directories that you are submitting to are relevant and have a suitable category for your listing is crucial.
If a quality directory is showing up in the SERPs for keywords that you want to rank for, then you should definitely look into getting a listing there.
Another good way to find directories that are relevant to your website is to add “directories” to any preferred search terms. Or follow blogs that put together directory lists like this one.
Even those directories which don’t link back to your site may have considerable authority and will add value as a citation.
If you’re a lawyer, look at Avvo, Lawyers.com and the like. Doctors? Healthgrades and LocateADoc are oldies but goodies (full disclosure: I have consulted with LocateADoc and got a good deal on a nose job).

Diversity

Remember: “Variety is the spice of life.” Get yourself a kitschy framed cross-stitching of this idiom and put it on your desk so you will see it every day.
Link builders can no longer rely on the same tired old bag of tricks to get the job done.
A diverse link profile is an absolute must. Directory submissions alone are not going to cut it, and using the same title, link, and keywords for every submission really isn’t going to help and could possibly get you Penguin-slapped.
Make sure that you are using various deep links. Create a variety of titles and descriptions using various keywords that are appropriate for the directory you are submitting to. Also, remember that natural link profiles have nofollow links as well, so don’t pass them over.

Timing

As tempting as it may seem to use an automated submission tool, it won’t get you anywhere in the long run. Directory submissions used to be about submitting to as many directories as you could in one fell swoop and hopefully 60% of them would stick.
Now that Google has a feisty bird paying attention to your link profile, taking a more laid back approach and submitting to highly relevant niche directories over several months or even a year is highly recommended. Patience, grasshopper. Patience.
Many SEO bloggers tend to behave like these updates from Google are huge paradigm shifts that are going to melt our faces off. We can be a pretty dramatic bunch.
In reality, the majority of these updates are simply trying to make sure relevance and quality are at the forefront of the SERPs. Directory submissions are still a highly useful SEO tool, but as with every other weapon in your modern link-building arsenal, they must be used in a legitimate fashion.
Now go watch a motivational half-time speech on YouTube and get out there and build some links.