An in-depth article on how to remove/fix Supplemental Results from Google search results

Steveb of WebmasterWorld has an excellent post on how to remove Supplemental Results. I agree 100% with what he says, and I recommend his post to everyone who has Supplemental Results in Google and wants to remove them. Supplemental Results are mostly caused when a page that once existed on a site is later removed by the site owner or for some other reason. They are also caused when a page that was crawled once had links pointing to it and those links have since dropped off completely.

Article posted on this blog by Afzal Khan.

Here is his post:

"Google's ill-advised Supplemental index is polluting their search results in many ways, but the most obviously stupid one is in refusing to EVER forget a page that has been long deleted from a domain. There are other types of Supplementals in existence, but this post deals specifically with Supplemental listings for pages that have not existed for quite some time.
The current situation: Google refuses to recognize a 301 of a Supplemental listing. Google refuses to delete a Supplemental listing that is now a nonexistent 404 (not a custom 404 page, a literal nothing there), no matter if it is linked to from dozens of pages. In both of these situations, even if Google crawls through the links every day for six months, it will not remove the Supplemental listing or obey the 301. Google also refuses to obey its own URL removal tool for Supplementals: it only "hides" the Supplementals for six months, and then returns them to the index.
As of the past couple of days, I have succeeded (using the tactics below) in getting some Supplementals removed from about 15% of the datacenters. On the other 85%, however, they have returned to being Supplemental.
Some folks have hundreds or thousands of this type of Supplemental, which would make this strategy nearly impossible, but if you have fewer than twenty or so...
1) Place a new, nearly blank page on old/supplemental URL.
2) Put no actual words on it (that it could ever rank for in the future). Only put "PageHasMoved" text plus link text like "MySiteMap" or "GoToNewPage" to appropriate pages on your site for a human should they stumble onto this page.
3) If you have twenty supplementals put links on all of them to all twenty of these new pages. In other words, interlink all the new pages so they all have quite a few links to them.
4) Create a new master "Removed" page which will serve as a permanent sitemap for your problem/supplemental URLs. Link to this page from your main page. (In a month or so you can get rid of the front page link, but continue to link to this Removed page from your site map or other pages, so Google will continually crawl it and be continually reminded that the Supplementals are gone.)
5) Also link from your main page (and others if you want) to some of the other Supplementals, so these new pages and the links on them get crawled daily (or as often as you get crawled).
6) If you are crawled daily, wait ten days.
7) After ten days the old Supplemental pages should show their new "PageHasMoved" caches. If you search for that text restricted to your domain, those pages will show in the results, BUT they will still ALSO continue to show for searches for the text on the ancient Supplemental caches.
8) Now put 301s on all the Supplemental URLs. Redirect them to either the page with the content that used to be on the Supplemental, or to some page you don't care about ranking, like an "About Us" page.
9) Link to some or all of the 301ed Supplementals from your main page, your Removed page and perhaps a few others. In other words, make very sure Google sees these new 301s every day.
10) Wait about ten more days, longer if you aren't crawled much. At that point the 15% datacenters should first show no cache for the 301ed pages, and then hours later the listings will be removed. The 85% datacenters will however simply revert to showing the old Supplemental caches and old Supplemental listings, as if nothing happened.
11) Acting on faith that the 15% datacenters will be what Google chooses in the long run, now use the URL removal tool to remove/hide the Supplementals from the 85% datacenters.
Will the above accomplish anything? Probably not. The 85% of the datacenters may just be reflecting the fact that Google will never under any circumstances allow a Supplemental to be permanently removed. However, the 15% do offer hope that Google might actually obey a 301 if brute forced.
Then, from now on, whenever you remove a page be sure to 301 the old URL to another one, even if just to an "About Us" page. Then add the old URL to your "Removed" page where it will regularly be seen and crawled. An extra safe step could be to first make the old page a "PageHasMoved" page before you redirect it, so if it ever does come back as a Supplemental, at least it will come back with no searchable keywords on the page.
Examples of 15% datacenters: 216.239.59.104, 216.239.57.99, 64.233.183.99
Examples of 85% datacenters: 216.239.39.104, 64.233.161.99, 64.233.161.105 "
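For anyone who wants to script the busywork, here is a minimal sketch of steps 1-5 and 8. This is my own illustration, not part of Steveb's post; all paths, file names, and the /about-us.html redirect target are made-up examples:

```python
# Illustrative sketch only: the paths below and the /about-us.html target
# are assumptions, not real URLs from the post.
SUPPLEMENTAL_PATHS = ["/old-page-1.html", "/old-page-2.html", "/old-page-3.html"]

def stub_page(path, all_paths):
    """Steps 1-3: a nearly blank page with no rankable words, just
    "PageHasMoved" text plus links for a stray human visitor, and links
    to every other stub page so they all get crawled."""
    links = "".join(
        '<a href="%s">GoToNewPage</a> ' % p for p in all_paths if p != path
    )
    return (
        "<html><body>PageHasMoved "
        '<a href="/sitemap.html">MySiteMap</a> %s</body></html>' % links
    )

def removed_index(all_paths):
    """Step 4: the master "Removed" page linking every problem URL."""
    links = "".join('<a href="%s">Removed</a> ' % p for p in all_paths)
    return "<html><body>Removed pages: %s</body></html>" % links

def htaccess_301s(all_paths, target="/about-us.html"):
    """Step 8 (phase two): Apache mod_alias lines that 301 each old URL."""
    return "\n".join("Redirect 301 %s %s" % (p, target) for p in all_paths)
```

Phase one: upload the generated stub pages at the old URLs and put the "Removed" index where it will be crawled. Phase two, after the ten-day wait: drop the stub pages and add the generated `Redirect 301` lines to your Apache .htaccess so the old URLs permanently redirect.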


Regards,

Afzal Khan

The actual source of this article is the Search Engine Genie blog. I recommend all readers of the Toprank SEO Blog to visit the links at my favourite blogs.

Importance of Sitemap Page in your website

Howdy Folks,

Hope you people are rocking in your life!!! Well, it's been long since I have written any fresh SEO article. Today I finally decided to come up with a new topic, taking some time out of my busy schedule.

Today I am going to talk about one of the hottest topics nowadays – sitemaps. There are many SEO tips and tricks that help in optimizing a site, but one whose importance is sometimes underestimated is the sitemap, which can help your site rank well in search engines.

A sitemap, as the name suggests, is like a map of your website – i.e., on one single page you show the structure of your site, its sections, the links between them, etc. A sitemap makes navigating your site easier, and keeping an updated sitemap on your site is fruitful both for your users and for search engines. It is also an important way of communicating with search engines: with a sitemap page you tell search engines where you'd like them to go, while in robots.txt you tell search engines which parts of your site to exclude from indexing.
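To illustrate that division of labour, here is a generic robots.txt sketch (the paths and domain are placeholders, not from any real site) that excludes one section from indexing while pointing crawlers at the sitemap:

```
User-agent: *
Disallow: /private/
Sitemap: http://www.example.com/sitemap.xml
```

The `Disallow` line keeps spiders out of the excluded section, while the sitemap tells them which pages you do want crawled.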

Sitemaps have always been part of best Web design practices, but with their adoption by search engines they have become even more important. However, it is necessary to clarify that if you are interested in sitemaps mainly from an SEO point of view, you cannot rely on the conventional sitemap alone (though currently Yahoo! and MSN still keep to the standard HTML format). For instance, Google Sitemaps uses a special XML format that is different from the ordinary HTML sitemap intended for human visitors.

One might ask why two sitemaps are necessary. The answer is obvious: one is for humans, the other is for spiders (for now mainly Googlebot, but it is reasonable to expect that other crawlers will join the club shortly). In that connection it is necessary to clarify that having two sitemaps is not regarded as duplicate content. In 'Introduction to Sitemaps', Google explicitly states that using a sitemap will never lead to a penalty for your site.
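For reference, a minimal XML sitemap in the sitemaps.org 0.9 format (a sketch with a placeholder URL; a real file would list every URL you want crawled, and only `<loc>` is mandatory) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-18</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You then submit the file's URL to Google Sitemaps, while the ordinary HTML sitemap stays on your site for human visitors.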

Do check Toprank SEO Blog for other feature articles or mail me at afzal.bpl@gmail.com for other SEO articles which you would love to know in detail.

SEO Forum

Hey friends,

Sharing with you some great news about SEO and other Search Engine Marketing topics: do join the exclusive forum made to discuss search engine optimisation techniques at SEO India Forum.

Wish to see you guys there.

Enjoy, and have fun!

Afzal