Importance of a Sitemap Page on Your Website

Howdy Folks,

Hope you people are rocking in your lives!!! Well, it's been a long time since I wrote a fresh SEO article. Today I finally decided to come up with a new topic, taking some time out of my busy schedule.

Today I am going to talk about the hottest topic nowadays – sitemaps. There are many SEO tips and tricks that help in optimizing a site, but one whose importance is sometimes underestimated is the sitemap, and it can do a lot to help you rank well in search engines.

A sitemap, as the name suggests, is like a map of your website: on one single page you show the structure of your site, its sections, the links between them, and so on. A sitemap makes navigation easier for your visitors, and keeping an updated sitemap on your site is fruitful both for your users and for search engines. It is also an important channel of communication with search engines: with a sitemap you tell search engines where you'd like them to go, while in robots.txt you tell them which parts of your site to exclude from indexing.
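For example, here is a minimal robots.txt sketch (the path is hypothetical) that asks all crawlers to stay out of one section of a site:

    User-agent: *
    Disallow: /private/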

Sitemaps have always been part of best Web design practices, but with their adoption by search engines they have become even more important. However, it is necessary to make a clarification: if you are interested in sitemaps mainly from an SEO point of view, you cannot get by with the conventional sitemap alone (though currently Yahoo! and MSN still keep to the standard HTML format). For instance, Google Sitemaps uses a special XML format that is different from the ordinary HTML sitemap meant for human visitors.
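As a sketch, a Google-style XML sitemap entry looks roughly like this (the URL and values are placeholders, and the exact namespace depends on the protocol version, so check Google's documentation for the current format):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2006-08-19</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>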

One might ask why two sitemaps are necessary. The answer is obvious: one is for humans, the other is for spiders (for now mainly Googlebot, but it is reasonable to expect that other crawlers will join the club shortly). In this connection it is necessary to clarify that having two sitemaps is not regarded as duplicate content. In 'Introduction to Sitemaps', Google explicitly states that using a sitemap will never lead to a penalty for your site.

Do check the Toprank SEO Blog for other feature articles, or mail me at afzal.bpl@gmail.com for other SEO articles you would love to know about in detail.

SEO Forum

Hey friends,

Sharing some great news about SEO and other search engine marketing topics: do join the exclusive forum made to discuss search engine optimisation techniques at the SEO India Forum.

Hope to see you guys there.

Njoy!!! n have fun!

Afzal

Precharge Projectnet SEO Contest

MEGA SEO Contest - Guys, blasting news for all of you who read my blog regularly: an SEO contest is on, and the details are written below. You can also check this at the SEOCompetition Blog.

Precharge Projectnet SEO Contest, The Exact Phrase - "precharge projectnet"
The Goal - to drive attention to the ever increasing problem surrounding the biggest challenge we may ever face. Everyday, people in Washington DC and all over the country are trying very hard to limit the exponential growth that the internet continues to have. These limitations may forever change the way we work and play online and preCharge believes this challenge needs some serious exposure before it's too late.

Contest starts Sunday August 19, 2006 and ends December 19, 2006. (duration is approximately 4 months) - Just in time for an extra special holiday surprise. We will even guarantee payment by December 24th, 2006 via Western Union or PayPal.

The URL that ranks highest on Google for the phrase "precharge projectnet", with a domain registered on or after August 19, 2006, wins.

Good luck to you all, and to me too, as I am also participating in this contest.

Regards,

Afzal Khan

What Google Said - When You Weren't Listening

Article printed from SEO-News: http://www.seo-news.com
HTML version available at: http://www.seo-news.com/archives.html
What Google Said When You Weren't Listening By Kim Roach (c) 2006

Article Posted on this Blog by Afzal Khan

Google wants to create quality search engine results just as
badly as you want to acquire high search engine rankings.
Fortunately for us, Google provides web masters with plenty of
guidelines and tips for building a Google-Friendly site.

Unfortunately, many web masters simply aren't listening. Most
web masters seem to be pulling tips and strategies from almost
every source but Google itself. However, Google has some of the
most beneficial SEO tips to be found online.

Here are just a few of the questions that you can find answered
directly by Google.

Q. Does Google index dynamic pages?

A. Yes. Google indexes dynamically generated pages. This
includes pages with the following file extensions: .asp, .php,
and pages with question marks in their URLs. However, these
pages can cause problems for the Googlebot and may be ignored.

Fortunately, there is a solution. If you feel that your
dynamically generated pages are being ignored, you may want to
consider creating static copies of those pages for the
Googlebot. Keep in mind, if you choose to do this, be sure to
include a robots.txt file that disallows the dynamic pages so
that Google doesn't see those pages as duplicate content.
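As an illustration (the directory names are hypothetical), if your dynamic pages live under /cgi-bin/ and the static copies elsewhere, a robots.txt along these lines keeps Googlebot away from the dynamic versions:

    User-agent: Googlebot
    Disallow: /cgi-bin/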

Q. Does Google index sites that use ASP?
A. Yes. Google is able to index most types of pages and files
with very few exceptions. This includes pdf, asp, jsp, html,
shtml, xml, doc, xls, ppt, rtf, wks, lwp, wri, swf, cfm, and
php. This is not a complete list, but it gives a good overview.

Q. Does Google index sites that use Macromedia Flash?
A. Yes. Google indexes pages that use Macromedia Flash. However,
Google may have problems indexing Flash pages. If you are
concerned that your Flash content is inhibiting Google's ability
to crawl your site, you may want to consider creating HTML
copies of those Flash pages. As always, you will need to include
a robots.txt file that disallows the Flash pages so that Google
does not recognize those pages as duplicate content.

Q. How do I add my site to Google's search results?
A. According to Google, inclusion in Google's
search results is free and easy. They also state that it is
unnecessary to submit your site to Google. Google uses software
known as "spiders" to crawl the web on a regular basis and find
sites to add to the index.

When a spider misses a site, it is often because of one of the
following reasons:

1. The site is not well connected with other sites through an
inbound linking structure.

2. The site launched after Google's most recent crawl was
completed.

3. Poor web site design makes it difficult for Google to
effectively crawl your content.

4. The site was temporarily unavailable at the time of
crawling or an error was received. You can use Google
Sitemaps to see if the Google crawlers received errors
when trying to crawl your site.

Q. How can I get my web site into Google's Mobile index?
A. Google Mobile offers Google Web Search, Local Search, and
Image Search for web sites that are configured for mobile
devices. Google adds new sites to their mobile Web index every
time they crawl the Web.

To let Google know about your mobile site, it is best to submit
a Mobile Sitemap. To help ensure that Google's mobile crawlers can crawl and index your site, you should:

* Use well-formed markup
* Validate your markup
* Use the right DOCTYPE and Content-Type for the markup language that you are using.
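As a sketch, a page in XHTML Mobile Profile (one common mobile markup choice; the DOCTYPE below is specific to that profile) would begin like this and be served with the matching Content-Type of application/vnd.wap.xhtml+xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
        "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">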

Q. Will participation in Adsense or Adwords affect my listing in
Google's free search results?

A. Google's advertising programs are independent of their search
results. Participation in an advertising program will have no
effect on your organic search engine rankings.

Q. Why does my site have a PageRank of zero?
A. Google has an answer for this as well. According to
Google, a page may be assigned a rank of zero if Google crawls
very few sites that link to that particular site. In addition to
this, pages that have recently been added to the Google index
may also show a PageRank of zero. This is simply because they
haven't been crawled by Googlebot yet and haven't been ranked
yet.

The key is to be patient. A page's PageRank score may increase
naturally with further crawls.

Q. My URL changed. How can I get Google to index my new site?
A. Google cannot manually change your URL in the search
results. However, there are steps you can take to ensure a
smooth transition.

First, you can redirect visitors to your new site. To do this,
simply use an HTTP 301 (permanent) redirect. This ensures that
Google's crawler will discover your new URL.
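On an Apache server, for example, a single line in the old site's .htaccess (the domain is a placeholder) issues the 301 for every request:

    Redirect 301 / http://www.newdomain.com/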

To preserve your rank, you will need to tell others who link to
your site about your change of address. To find a portion of the
sites that link to yours, you can go to the Google search engine
and type in: link:www.mydomain.com . To obtain a more comprehensive
list of pages that link to or mention yours, perform a Google
search on your URL in quotes: "www.mydomain.com".

Q. How often does Google crawl the web?
A. Google's spiders crawl the web on a regular basis to rebuild
their index. Crawls are based on a number of factors, including
Pagerank, links to a page, and a web site's structure. This is
just a small list. There are a variety of factors that can
affect the crawl frequency of individual sites.

Q. How do I create a Google friendly site?
A. To help Google find, index, and rank your site, it is
suggested that you follow their Webmaster Guidelines.

Here are some of the general guidelines that Google offers to
web masters:

* Have other relevant sites link to yours.

* Submit a sitemap.

* Submit your site to relevant directories such as
the Open Directory Project and Yahoo. For a complete
listing of web directories, go to Strongestlinks.com

* Make sure each and every page is reachable from at least
one static text link.

* Offer your visitors a site with links that point to the
most important parts of your site. If your sitemap is larger
than 100 links, you may want to break the site map into
separate pages.

* Keep the links on any given page to a reasonable number
(less than 100).

* Check for broken links and correct HTML.

* Create a useful site that is full of information-rich content.
Your pages should be written in a way that clearly and
accurately describes your content.

* Make sure that your TITLE and ALT tags are descriptive and
accurate.

* Use a text browser such as Lynx
to examine your web site. Most search engine spiders see your
site in much the same way as Lynx would (see the quick check after this list).

* Allow search bots to crawl your sites without session Ids
or arguments that track their path through the site.

* Make use of the robots.txt file, which tells crawlers which
directories they can or cannot crawl.
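Here is the quick Lynx check mentioned above (the URL is a placeholder): dumping a page in text mode shows roughly what a spider sees.

    lynx -dump http://www.example.com/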


Q. How can I report a site that is spamming the Google search
results?

A. Google is constantly working to improve the quality of their
search results. Therefore, they have implemented a program that
allows web searchers to report spam that they find within the
search engine results. These Spam Reports are submitted directly
to Google engineers and are used to devise long-term solutions
to fight spam.

However, before you submit a site as being spam, Google highly
suggests that you take a look at their webmaster guidelines
to determine if sites are acceptable or not.

http://www.google.com/contact/spamreport.html

Q. Why are sites blocked from the Google index?

A. Sites may be blocked from the Google index if they do not
meet certain quality standards. Google does not comment on the
individual reason for pages being removed. However, they do
reveal that certain actions such as cloaking, writing text that
can be seen by search engines but not by users, or setting up
pages/links with the sole purpose of fooling the search engines
may result in removal from the index.

If you receive a notification that your site violates Google's
quality guidelines, you can correct your site to meet their
guidelines and then request reinclusion.

So there you have it, some of the many tips that Google is
handing out for free. If you want to obtain high search engine
rankings for the long-term, Google actually provides some very
good advice.
===============================================
Kim Roach is a staff writer and editor for the SiteProNews
and SEO-News newsletters.
Contact: kim@seo-news.com
================================================

Copyright © 2006 Jayde Online, Inc. All Rights Reserved.
SEO-News is a registered service mark of Jayde Online, Inc.

Article Posted on this Blog by Afzal Khan

More SEO Blogs

Article Posted on this blog by Afzal Khan

Here's some good news for all you readers of the Toprankseo Blog. As you guys have realised, we always look to add detailed and descriptive articles about SEO (Search Engine Optimisation) for all the novices who want to acquire skills and update their SEO knowledge, thus enhancing their skills. We are now going to give you information about blogs and websites publishing newer information on search engine optimisation.

In the current section I am providing you with the link to an SEO blog maintained by Sonika Mishra, an SEO expert in Delhi. As you all might be aware, Delhi is like an SEO hub for all the SEO experts in India, and I am sure you will all benefit from Sonika's efforts in publishing SEO articles.

About Sonika Mishra: she is a very knowledgeable and experienced person in this field. She has been sharpening her skills in the SEO field for the last 2 years and 6 months, and she regularly publishes good and interesting articles on her own blog.

Read more SEO articles at http://seo-expert-delhi.blogspot.com/.

Title :- SEO EXPERT DELHI-SEO & Web Promotion India

Description :- Search engine ranking, Search engine optimization, Search engine placement, Website optimization, Search engine positioning, Web site optimization, High search engine ranking, Web page optimization, Search engine promotion, Top search engine ranking, High search engine rankings, Search engine rankings, Better search engine placement, Web site optimization, High search engine placement, Search engines optimization, Website optimization.

Njoy!!!

Afzal Khan

Google Search Engine Optimization Pitfalls

By John Hill (c) 2006
Article Posted on this blog by Afzal Khan

On Page Factors - Is Your Website Search Engine Friendly?

So you have a website but where is it on Google? Have you fallen
foul of a penalty or have you overlooked one of the many common
search engine optimization pitfalls when designing your
site?

Understanding what works for the search engines and what doesn't
when it comes to the content on your website can have a crucial
impact on the relevance and/or page rank of your pages from an
SEO perspective.

Here we highlight common mistakes that could affect your ranking
on Google and other search engines.

Optimizing for the Correct Keywords

Basically 'Get real' about what keywords you feel your website
can be ranked for. If you have a ten page website in a highly
competitive market then ranking naturally for the major terms
will be close to impossible.

Use the Overture keyword tool together with the number of
results on Google to find out what keywords are searched for and
how many other websites are targeting them. If you are lucky
then you might even find a popular keyword that not many other
websites are optimized for. Alternatively a good tool for this
job is Wordtracker from Rivergold Associates Ltd.

Code Validation

If your html code is not valid, then this could make it very
difficult or even impossible for a search engine to separate
your page content from your code. If the search engine cannot
see your content, then your page will obviously have no
relevance.

Frames

Even though most, if not all, major search engines now index
frames, and even with the use of the NOFRAMES tag, you run the
risk of your pages being displayed in the search engine results
out of context. As each individual page is indexed separately,
it is likely that your website visitors will not see your pages
within your frameset and will effectively be stuck on the page
they arrive at.

If you must use frames then create a 'Home' link on each of your
individual content pages and point the link at your frameset
index page.
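A sketch of such a link (index.html as the frameset page is an assumption); target="_top" makes the browser load the frameset in the full window instead of inside the current frame:

    <p><a href="index.html" target="_top">Home</a></p>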

JavaScript Navigation

If you use JavaScript to control your website navigation, then
search engine spiders may have problems crawling your site. If
you must use JavaScript, then there are two options available to
you:

* Use the NOSCRIPT tag to replicate the JavaScript link in
standard HTML.

* Replicate your JavaScript links as standard HTML links in
the footer of your page.
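For example (the page names are hypothetical), the NOSCRIPT approach pairs each scripted link with a plain HTML one that spiders can follow:

    <script type="text/javascript">
      document.write('<a href="javascript:go(\'about\')">About Us</a>');
    </script>
    <noscript>
      <a href="about.html">About Us</a>
    </noscript>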

Flash Content

Currently only Google can index Macromedia Flash files, and how
much or how little content it sees is open to debate. So until
search engine technology is able to handle your .swf as standard,
it would be advisable to avoid the use of these files.

Again if you must use Flash then offer a standard HTML
alternative within NOEMBED tags.
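A sketch of that fallback (the file names are hypothetical):

    <embed src="intro.swf" width="550" height="400">
    <noembed>
      <p><a href="intro.html">View the HTML version of this page</a></p>
    </noembed>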

Dynamic URLs

Although Google and Yahoo are able to crawl complicated URLs it
is still advisable to keep your URLs simple and avoid the use of
long query strings. Do not include session IDs in the URL as
these can either create a 'spider trap' where the spider indexes
the page over and over again or, at worst, your pages will not
get indexed at all.

If you do need to include parameters in the URL, then limit them
to two and the number of characters per parameter to ten or
less.

The best SEO solution for dynamic URLs is to use Mod-rewrite or
Multiviews on Apache.
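As a sketch of the mod_rewrite approach (the URL scheme and script name are hypothetical), an .htaccess rule can expose a static-looking URL while a dynamic script actually serves the page:

    RewriteEngine On
    RewriteRule ^products/([0-9]+)\.html$ product.php?id=$1 [L]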

No Sitemap

A sitemap is the search engine optimization tool of choice to
ensure every page within your website is indexed by all search
engines. You should link to your site map from, at least, your
homepage but preferably from every page on your website.

If your website contains hundreds of pages then split the
sitemap into several categorized maps and link these all
together. Try and keep the number of links per page on a sitemap
to less than 100.

Excessive Links

Excessive links on a given page (Google recommends having no
more than 100) can lower its relevance and, although it does
not result in a ban, this does nothing for your search engine
optimization strategy.

Be Careful Who You Link To

As you have no control over who links to your website, incoming
links will not harm your rank. However, outbound links from your
website to 'bad neighbourhoods' like link farms will harm your
ranking.

As a rule ensure as many of your outbound links as possible link
to websites that are topical to your field of business.

Article Posted on this blog by Afzal Khan


=======================================
John Hill - Developer, Designer and SEO Professional with
E-Gain New Media (http://www.e-gain.co.uk) offering website design
(http://www.e-gain.co.uk/web-development/website_development/web-site-design/),
search engine optimization
(http://www.e-gain.co.uk/online_marketing/business_solutions/search-engine-optimisation/)
and PPC Management.
=======================================

The Advance Of Algorithms - New Keyword Optimization Rules

By Matt Jackson (c) 2006

Posted on this blog by Afzal Khan

Maintaining and marketing a website can be a difficult task,
especially for those who have little or no experience. SEO rules
are constantly changing, and even then many SEO professionals
disagree on the actual specifics required to optimize a website.
This is in no small part due to the search engines themselves.

Major search engines like Google are constantly striving to
ensure that sites at the top of their result pages offer
invaluable information or service to their visitors. However,
webmasters who are looking to make quick money while offering
very little quality content are always finding new ways to beat
the search engines at their own game. For this reason, search
engines regularly change the methods they use to determine the
relevancy and importance of your site.

Evolving Search Engines

The first step you should take is to ensure that your website
will do an effective job of turning visitors into money. The
content needs to be optimized so that both search engine spiders
and human visitors deem it to be a useful website.
Once upon a time, effective optimization entailed cramming
content with as many keywords as possible and while this once
generated good search engine results it invariably put visitors
off. It is also now frowned upon and penalized as being spam by
all of the major search engines.

The Evolution And Improvement Of Algorithms

Search engines use specific algorithms to determine the
relevance of your website. The calculations from these
algorithms determine where on the search engine result pages
your website will appear. In order to keep unscrupulous
webmasters guessing and to ensure that results are always up to
date, major search engines regularly update their algorithms.

Recent Advances
The result of some of the most recent changes has seen the
impetus move away from optimizing websites for search engines
and instead the algorithms are now geared to promote websites
that give true value to visitors. They're not only changing,
they are evolving into more intelligent and accurate algorithms.
While the use of keywords based around the relevant topic is
still important, it is also important to ensure that visitors
are your main priority.

Keyword Optimization

Keyword optimization is now more heavily guarded. Those who
include keywords too often will have their sites labeled as
spam, whereas not enough instances of the appropriate keyword
means you won't receive the desired results. However, the
algorithms have become particularly smart and as well as the
keywords you want to target you should include other relevant
keywords. Including inflexions of keywords is one excellent way
to ensure that your site is deemed to be relevant. Inflexions
are slight changes to your keyword. For example, inflexions of
the keyword "advertising" include advertise, advertised,
advertisement, etc...

Keyword Inclusion

Weight is also given to keywords that are included in certain
sections of a page. These sections include the title tag, meta
tags (only relevant to smaller search engines now), header tags,
image alt tags and formatting tags (e.g. keywords in bold or
italicized) of your text. With image alt tags and hyperlink
title tags it is important that you don't simply fill these with
keywords because this will be ignored at best, and penalized at
worst.
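As an illustration (the keyword "advertising" and the file names are placeholders), here is how those sections might carry a keyword on a single page:

    <head>
      <title>Advertising Tips for Small Businesses</title>
      <meta name="description" content="Practical advertising advice for small firms.">
    </head>
    <body>
      <h1>Advertising on a Budget</h1>
      <p><strong>Advertising</strong> works best when ...</p>
      <img src="layouts.gif" alt="Sample advertising layouts">
    </body>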

Natural Content Writing

One of the most effective ways to ensure that your site is
keyword optimized properly is to write the content naturally
first. Once you have done this, go through and ensure that any
relevant keywords are included throughout the text. Only place
them where they would appear naturally and remove them from
anywhere they appear awkward. Once you've written the
content, you should also check the remaining factors to ensure
everything is OK.

SEO Keyword Checklist

Below is a keyword checklist to ensure that you have fully
optimized your web pages to the current, generally accepted
search engine algorithm rules.

URL: Get your primary keyword as close to the beginning
of the URL as possible.

Title Tag: The title should be between 10 and 50
characters and include one or more keywords while still being
descriptive.

Description Meta Tag: The description meta tag should be
insightful and useful but it should also contain one or two of
your more important keywords.

Keyword Meta Tag: It makes sense that you should include
all of your keywords in the keyword meta tag. Do not include any
words that don't appear in the body of your text.

Keyword Density: Keyword density is the percentage of your page's
text made up of keywords. A total keyword density (all keywords
combined) of around 15% to 20% is the maximum you should aim for,
and anything less than 5% is unlikely to yield good results.
Density for a single keyword should be between 1% and 7%;
1% seems too low, and 7% a little too high. Wherever possible
aim for approx 5% with the primary keyword and 3% with secondary
and subsequent keywords.
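As a quick worked example (the numbers are illustrative): on a
500-word page, a primary keyword used 25 times gives 25/500 = 5%
density, and a secondary keyword used 15 times gives 3%.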

Header Tags (e.g. H1 and H2 tags): More weight is given
to keywords that appear within H1 tags, then H2 tags and so on.

Text Formatting Fonts (e.g. strong, bold and underline):
This may not offer much weight in algorithms, but generally if
you bold the first instance of your keywords and the last
instance of your primary keyword you should see some positive
results.

Beginning Of Text: The closer you can get your keywords
to the beginning of your page content the better. Try to include
your primary keyword within the first sentence or two and also
within the last paragraph.

Key-Phrases As Whole Phrases: If you are targeting Internet
Marketing as a key phrase, then do not split the words up if
possible. Some effect is noticed if the words are split, but
much more benefit is received by including the phrase as a
whole.

Alt Text: Include your keyword at least once in the Alt tag of
any images. Ensure that the text is relevant to the image and
gives some information.

Posted on this blog by Afzal Khan

========================================================
Matt Jackson, founder of WebWiseWords (http://www.webwisewords.com),
is a professional copywriter offering a professional service.
Whether your business or your website needs a website content
copywriter, an SEO copywriter, a press release copywriter or a
copywriter (http://www.webwisewords.com) for any other purpose,
WebWiseWords can craft the words you want.
========================================================

Google buys search algorithm invented by Israeli student

Source: haaretzdaily.com

Search engine giant Google recently acquired an advanced text search algorithm invented by Ori Alon, an Israeli student. Sources believe Yahoo and Microsoft were also negotiating with the University of New South Wales in Australia, where Alon is a doctoral student in computer science.

Google, Alon and the university all refused to comment, though Google confirmed that "Ori Alon works at Google's Mountain View, California offices."

The University acknowledged that Yahoo and Microsoft had conducted negotiations with its business development company.

Alon told TheMarker in an interview six months ago that the university had registered a patent on the invention.

Orion, as it is called, which Alon developed with faculty, returns only the most relevant textual results. In addition, the software, which currently operates only in English, offers a list of topics directly related to the original search.

"For example, if you search for information on the War of Independence, you'll receive a list of related words, like Etzel, Palmach, Ben-Gurion," he explained. Text will only appear on the results page if it contains enough words relevant to the search and the connection between them is reasonable. Orion also rates the texts by the quality of the site on which they appear.

Google Algorithm Problems

Google Algorithm Problems
By Rodney Ringler (c) 2006.
Posted on this blog by Afzal Khan

Have you noticed anything different with Google lately? The
Webmaster community certainly has, and if recent talk on several
search engine optimization (SEO) forums is an indicator,
Webmasters are very frustrated. For approximately two years
Google has introduced a series of algorithm and filter changes
that have led to unpredictable search engine results, and many
clean (non-spam) websites have been dropped from the rankings.
Google updates used to be monthly, and then quarterly. Now, with
so many servers, there seem to be several different sets of search
engine results rolling through the servers at any time during a
quarter. Part of this is the recent Big Daddy update, which is a
Google infrastructure update as much as an algorithm update. We
believe Big Daddy is using a 64 bit architecture. Pages seem to
go from a first page ranking to a spot on the 100th page, or
worse yet to the Supplemental index. Google algorithm changes
started in November 2003 with the Florida update, which now
ranks as a legendary event in the Webmaster community. Then came
updates named Austin, Brandy, Bourbon, and Jagger. Now we are
dealing with the BigDaddy!

The algorithm problems seem to fall into 4 categories. There are
canonical issues, duplicate content issues, the Sandbox, and
supplemental page issues.

1. Canonical Issues: These occur when a search engine
treats www.yourdomain.com, yourdomain.com, and yourdomain.com/index.html
all as different websites. When Google does this, it then flags
the different copies as duplicate content and penalizes them.
Also, if the version not penalized is http://yourdomain.com, but
all of the other websites link to yours using www.yourdomain.com,
then the version left in the index will have no ranking. These
are basic issues that other major search engines, such as Yahoo
and MSN, have no problem dealing with. Google is possibly the
greatest search engine in the world (ranking themselves as a 10
on a scale of 1 to 10). They provide tremendous results for a
wide range of topics, and yet they cannot get some basic indexing
issues resolved.
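For instance (the domain is a placeholder), an Apache mod_rewrite sketch that 301-redirects the bare domain to the www version, so that only one canonical copy of each page gets indexed:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]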

2. The Sandbox: This has become one of the legends of
the search engine world. It appears that websites, or links to them,
are "sandboxed" for a period before they are given full rank in the
index, kind of like a maturing time. Some even think it is only
applied to a set of competitive keywords, because they were the
ones being manipulated the most. The Sandbox existence is
debated, and Google has never officially confirmed it. The
hypothesis behind the Sandbox is that Google knows that someone
cannot create a 100,000 page website overnight, so they have
implemented a type of time penalty for new links and sites
before fully making the index.

3. Duplicate Content Issues: These have become a major
issue on the Internet. Because web pages drive search engine rankings,
black hat SEOs (search engine optimizers) started duplicating
entire sites' content under their own domain name, thereby
instantly producing a ton of web pages (an example of this would
be to download an Encyclopedia onto your website). As a result
of this abuse, Google aggressively attacked duplicate content
abusers with their algorithm updates. But in the process they
knocked out many legitimate sites as collateral damage. One
example occurs when someone scrapes your website. Google sees
both sites and may determine the legitimate one to be the
duplicate. About the only thing a Webmaster can do is track down
these sites as they are scraped, and submit a spam report to
Google. Another big issue with duplicate content is that there
are a lot of legitimate uses of duplicate content. News feeds
are the most obvious example. A news story is covered by many
websites because it is content the viewers want. Any filter will
inevitably catch some legitimate uses.

4. Supplemental Page Issues: Webmasters fondly refer to
this as Supplemental Hell. This issue has been reported on places like
WebmasterWorld for over a year, but a major shake up around
February 23rd has led to a huge outcry from the Webmaster
community. This recent shakeup was part of the ongoing BigDaddy
rollout that should finish this month. This issue is still
unclear, but here is what we know. Google has 2 indexes: the
Main index that you get when you search, and the Supplemental
index that contains pages that are old, no longer active, have
received errors, etc. The Supplemental index is a type of
graveyard where web pages go when they are no longer deemed
active. No one disputes the need for a Supplemental index. The
problem, though, is that active, recent, and clean pages have
been showing up in the Supplemental index. Like a dungeon, once
they go in, they rarely come out. This issue has been reported
with a low noise level for over a year, but the recent February
upset has led to a lot of discussion around it. There is not a
lot we know about this issue, and no one can seem to find a
common cause leading to it.

Google updates were once fairly predictable, with monthly
updates that Webmasters anticipated with both joy and angst.
Google followed a well published algorithm that gave each
website a Page Rank, which is a number given to each webpage
based on the number and rank of other web pages pointing to it.
When someone searches on a term, all of the web pages deemed
relevant are then ordered by their Page Rank.

Google uses a number of factors such as keyword density, page
titles, meta tags, and header tags to determine which pages are
relevant. This original algorithm favored incoming links and the
anchor text of them. The more links you got with an anchor text,
the better you ranked for that keyword. As Google gained the
bulk of internet searches in the early part of the decade,
ranking well in their engine became highly coveted. Add to this
the release of Google's Adsense program, and it became very
lucrative. If a website could rank high for a popular keyword,
they could run Google ads under Adsense and split the revenue
with Google!

This combination led to an avalanche of SEO'ing like the
Webmaster world had never seen. The whole nature of links between
websites changed. Websites used to link to one another because
it was good information for their visitors. But now that link to
another website could reduce your search engine rankings, and if
it is a link to a competitor, it might boost his. In Google's
algorithm, links coming into your website boost the site's Page
Rank (PR), while links from your web pages to other sites reduce
your PR. People started creating link farms, doing reciprocal
link partnerships, and buying/selling links. Webmasters started
linking to each other for mutual ranking help or money, instead
of quality content for their visitors. This also led to the
wholesale scraping of websites. Black hat SEO's will take the
whole content of a website, put Google's ad on it, get a few
high powered incoming links, and the next thing you know they
are ranking high in Google and generating revenue from Google's
Adsense without providing any unique website content.

Worse yet, as Google tries to go after this duplicate content,
they sometimes get the real company instead of the scraper. This
is all part of the cat and mouse game that has become the Google
algorithm. Once Google realized the manipulation that was
happening, they decided to aggressively alter their algorithms
to prevent it. After all, their goal is to find the most
relevant results for their searchers. At the same time, they
also faced huge growth with the internet explosion. This has led
to a period of unstable updates, causing many top ranking
websites to disappear while many spam and scraped websites
remain. In spite of Google's efforts, every change seems to
catch more quality websites. Many spam sites and websites that
violate Google's guidelines are caught, but there is an endless
tide of more spam websites taking their place.

Some people might believe that this is not a problem. Google is
there to provide the best relevant listings for what people are
searching on, and for the most part the end user has not noticed
an issue with Google's listings. If they only drop thousands of
listings out of millions, then the results are still very good.
These problems may not be affecting Google's bottom line now,
but having a search engine that cannot be evolved without
producing unintended results will hurt them over time in several
ways.

First, as the competition from MSN and Yahoo grows, having
the best results will no longer be a given, and these drops in
quality listings will hurt them. Next, to stay competitive
Google will need to continue to change their algorithms. This
will be harder if they cannot make changes without producing
unintended results. Finally, having the Webmaster community lose
faith in them will make them vulnerable to competition.
Webmasters provide Google with two things. They are the word of
mouth experts. Also, they run the websites that use Google's
Adsense program. Unlike other monopolies, it is easy to switch
search engines. People might also criticize Webmasters for
relying on a business model that requires free search engine
traffic. Fluctuations in ranking are part of the internet
business, and most Webmasters realize this. Webmasters are
simply asking Google to fix bugs that cause unintended issues
with their sites.

Most Webmasters may blame ranking losses on Google and their
bugs. But the truth is that many Webmasters do violate some of
the guidelines that Google lays out. Most consider it harmless
to bend the rules a little, and assume this is not the reason
their websites have issues. In some cases, though, Google is
right and has just tweaked its algorithm in the right direction.
Here is an example: Google seems to be watching the
incoming links to your site to make sure they don't have the same anchor
text (this is the text used in the link on the website linking
to you). If too many links use the same anchor text, Google
discounts these links. This was originally done by some people
to inflate their rankings. Other people did it because one
anchor text usually makes sense. This is not really a black hat
SEO trick, and it is not called out in Google's guidelines, but
it has caused some websites to lose rank.

Webmasters realize that Google needs to fight spam and black
hat SEO manipulation. And to their credit, there is a Google
Engineer named Matt Cutts who has a Blog site and participates
in SEO forums to assist Webmasters. But given the revenue impact
that Google rankings have on companies, Webmasters would like to
see even more communication around the known issues, and help
with identifying future algorithm issues. No one expects Google
to reveal their algorithm or what changes they are making. Rumor
on the forum boards speculates that Google is currently looking
at items like the age of the domain name, websites on the same
IP, and frequency of fresh content. It would be nice from a
Webmaster standpoint to be able to report potential bugs to
Google, and get a response. It is in Google's best interest to
have a bug free algorithm. This will in turn provide the best
search engine results for everyone.

==============================================
Rodney Ringler is President of Advantage1 Web Services, Inc.,
which owns a network of Web Hosting Informational Websites
including Hostchart.com (http://www.hostchart.com),
Resellerconnection.com (http://www.resellerconnection.com),
Foundhost.com (http://www.foundhost.com) and Resellerforums.com
(http://www.resellerforums.com).
==============================================

"Google: The "Big Daddy' of the Blogging World"

Article Courtesy of Merle - http://MCPromotions.com
Posted on this Blog by Afzal Khan

It seems like every time you turn your head these
days you hear something about Google. Definitely
a force to be reckoned with, Google is king online.
Although they're known mostly as a search engine,
they provide many other helpful services to today's
website owner.

Their non-search services include Blogger, Google
Adsense, Adwords, Google Desktop, and more. If you
use one or more of these services, Google probably
publishes a helpful blog to go with it.

Let's examine a few:

1) Google Adsense: http://adsense.blogspot.com

Many website owners use Google's Adsense to display
ads on their websites and generate a monthly income
in exchange for their efforts. This blog is a "look
inside Google Adsense." Who better to learn the ins
and outs of Adsense from than from Google's employees
themselves?

You'll pick up plenty of tips to help improve your
click thrus and learn about many Adsense features
you may not fully understand. If you're an Adsense
user, you need to subscribe to this page.

2) Google SiteMaps: http://sitemaps.blogspot.com

Google created "Sitemaps" for website owners to
supply information about their sites so that Google
can easily crawl through and index their pages. A
sitemap gives Google more complete information about
your website that they may miss when doing a regular
index visit.

This blog covers the "nitty gritty" of sitemaps,
how they work and things you'll need to pay attention
to when creating your files. If you want to fully
understand the program and how to
use it to your best advantage, stop on over.

3) Google Adwords: http://adwords.blogspot.com

Adwords is Google's pay per click ad program. Your
text ads are displayed both on search engine results
pages and other websites when someone searches for
the keywords/phrases that you have placed bids on.

If you use Adwords, you can really learn a lot
about how to improve your conversions and get
more for your dollars here.

4) Google Reader: http://googlereader.blogspot.com

Google Reader was designed to make it easier to
keep up with all of the information you like
to read online. Using it can help you get all of the
latest updates for your favorite websites.

This blog will keep you posted on the latest
reader updates, any known bug problems and tips
for getting more out of Reader.

5) Blogger: http://buzz.blogger.com

I love Blogger. It's the fastest, easiest way to
start your own blog at no charge -- and it's owned
by Google, who publish their own blog to help you get
the most out of blogging at Blogger. Read it for
fun stories and news of the latest
add ons and enhancements from Google's Blogger
team.

6) Google Video: http://googlevideo.blogspot.com

Google Video allows anyone to upload videos to
their servers and others can go and watch them.
This blog showcases some of those videos and
upcoming special event videos.

7) Google's Blog: http://googleblog.blogspot.com

Google is like a small country and this blog will
give you a glimpse into their culture. Posts are
done by various employees in various departments.
You'll learn a lot about Google's products and other
technology news. If you're obsessed with Google,
this one should be on your "must see" list.

8) Adwords API Blog: http://adwordsapi.blogspot.com

This free Google service allows developers to
engineer computer programs that interact directly
with the Adwords server. You might say it's for "tech
heads." If you're a "geek" you'll want to hang
your hat here.

9) Google Desktop: http://googledesktop.blogspot.com

Google Desktop is downloadable software you can
use to quickly find files and other information
on your computer. This blog will tell you about
updates, available plug-ins and more.

10) Google Base: http://googlebase.blogspot.com/

Google Base is a place where you can post all types
of content and have it show up on Google.
This Blog will enlighten you on how best to use
it, and even showcases others having success with
Google's latest service.

It can be a full-time job just trying
to keep up with all the delightful
services Google provides. By bookmarking their
blogs, and/or subscribing to their feeds, you too
can become an "expert" on all things Google.


=========================================
By Merle- Want to Know the SECRETS of Article
Promotion? Discover everything you need to know
in this brand New Ebook, "How to Use Articles to
Drive Website Traffic". Get your F-r-e-e Copy now at
http://articleannouncer.mcpromotions.com