Google Search Engine Optimization Pitfalls

By John Hill (c) 2006
Article Posted on this blog by Afzal Khan

On Page Factors - Is Your Website Search Engine Friendly?

So you have a website but where is it on Google? Have you fallen
foul of a penalty or have you overlooked one of the many common
search engine optimization pitfalls when designing your
site?

Understanding what works for the search engines and what doesn't
when it comes to the content on your website can have a crucial
impact on the relevance and/or page rank of your pages from an
SEO perspective.

Here we highlight common mistakes that could affect your ranking
on Google and other search engines.

Optimizing for the Correct Keywords

Basically, 'get real' about which keywords you feel your website
can rank for. If you have a ten-page website in a highly
competitive market, then ranking naturally for the major terms
will be close to impossible.

Use the Overture keyword tool together with the number of
results on Google to find out which keywords are searched for
and how many other websites are targeting them. If you are
lucky, you might even find a popular keyword that not many other
websites are optimized for. Alternatively, a good tool for this
job is Wordtracker from Rivergold Associates Ltd.

Code Validation

If your HTML code is not valid, then this could make it very
difficult or even impossible for a search engine to separate
your page content from your code. If the search engine cannot
see your content, then your page will obviously have no
relevance.
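
As a rough illustration (the markup below is invented for the
example), validators typically flag problems such as unclosed
elements, unquoted attribute values and missing alt attributes:

    <!-- Invalid: unclosed paragraph, unquoted attribute, no alt text -->
    <p>Welcome to our widget store
    <img src=widget.jpg>

    <!-- Valid equivalent -->
    <p>Welcome to our widget store</p>
    <img src="widget.jpg" alt="Blue widget from our standard range">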

Frames

Even though most, if not all, major search engines now index
frames, and even with the use of the NOFRAMES tag, you run the
risk of your pages being displayed in the search engine results
out of context. Because each individual page is indexed
separately, visitors arriving from a search engine will not see
your pages within your frameset and will effectively be stuck on
the page they land on.

If you must use frames then create a 'Home' link on each of your
individual content pages and point the link at your frameset
index page.
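
A minimal sketch of this arrangement (the file names are
invented for illustration) might look like the following:

    <!-- index.html - the frameset page -->
    <frameset cols="200,*">
      <frame src="menu.html" name="menu">
      <frame src="content.html" name="main">
      <noframes>
        <body>
          <p>Browse the site: <a href="content.html">start here</a>.</p>
        </body>
      </noframes>
    </frameset>

    <!-- content.html - an individual content page -->
    <p><a href="index.html">Home</a> - return to the full framed site</p>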

JavaScript Navigation

If you use JavaScript to control your website navigation, then
search engine spiders may have problems crawling your site. If
you must use JavaScript, then there are two options available to
you:

* Use the NOSCRIPT tag to replicate the JavaScript link in
standard HTML.

* Replicate your JavaScript links as standard HTML links in
the footer of your page.
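
A minimal sketch of both options, assuming a hypothetical
openPage() function and a products.html page:

    <!-- Option 1: NOSCRIPT fallback beside the JavaScript link -->
    <a href="javascript:openPage('products')">Products</a>
    <noscript>
      <a href="products.html">Products</a>
    </noscript>

    <!-- Option 2: plain HTML links repeated in the page footer -->
    <p class="footer">
      <a href="index.html">Home</a> | <a href="products.html">Products</a> |
      <a href="contact.html">Contact</a>
    </p>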

Flash Content

Currently only Google can index Macromedia Flash files, and how
much or how little content it actually sees is open to debate.
So until search engine technology is able to handle your .swf
files as standard, it would be advisable to avoid using them.

Again if you must use Flash then offer a standard HTML
alternative within NOEMBED tags.
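
A rough sketch of a Flash movie with an HTML alternative inside
NOEMBED tags (the file name and text are placeholders):

    <object width="550" height="400">
      <param name="movie" value="intro.swf">
      <embed src="intro.swf" width="550" height="400">
        <noembed>
          <p>Welcome to our widget range - see the
             <a href="products.html">HTML product pages</a>.</p>
        </noembed>
      </embed>
    </object>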

Dynamic URLs

Although Google and Yahoo are able to crawl complicated URLs, it
is still advisable to keep your URLs simple and avoid long query
strings. Do not include session IDs in the URL: these can create
a 'spider trap', where the spider indexes the page over and over
again, or, at worst, prevent your pages from being indexed at
all.

If you do need to include parameters in the URL, then limit them
to two and keep the number of characters per parameter to ten or
less.

The best SEO solution for dynamic URLs is to use mod_rewrite or
MultiViews on Apache.
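
For illustration only (the paths, parameters and IDs below are
made up), the progression looks like this; the last link shows
the kind of static-looking URL that an Apache rewrite rule can
map back to the underlying script:

    <!-- Risky: long query string plus a session ID (a possible spider trap) -->
    <a href="product.php?id=1432&amp;cat=27&amp;sort=price&amp;sid=8f3a9c2e71">Blue widget</a>

    <!-- Better: no session ID and at most two short parameters -->
    <a href="product.php?id=1432&amp;cat=27">Blue widget</a>

    <!-- Best: a static-looking URL rewritten to product.php behind the scenes -->
    <a href="/widgets/blue-widget.html">Blue widget</a>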

No Sitemap

A sitemap is the search engine optimization tool of choice to
ensure every page within your website is indexed by all search
engines. You should link to your sitemap from at least your
homepage, but preferably from every page on your website.

If your website contains hundreds of pages, then split the
sitemap into several categorized maps and link them all
together. Try to keep the number of links per sitemap page to
fewer than 100.
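
A minimal sketch of one categorized sitemap page (the page names
are invented), kept well under 100 links:

    <h1>Site Map</h1>

    <h2>Widgets</h2>
    <ul>
      <li><a href="widgets/blue-widgets.html">Blue widgets</a></li>
      <li><a href="widgets/red-widgets.html">Red widgets</a></li>
    </ul>

    <h2>Support</h2>
    <ul>
      <li><a href="support/faq.html">Frequently asked questions</a></li>
      <li><a href="support/contact.html">Contact us</a></li>
    </ul>

    <p>More pages: <a href="sitemap-articles.html">article sitemap</a></p>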

Excessive Links

Excessive links on a given page (Google recommends having no
more than 100) can lower its relevance and, although it does
not result in a ban, this does nothing for your search engine
optimization strategy.

Be Careful Who You Link To

As you have no control over who links to your website, incoming
links will not harm your rank. However, outbound links from your
website to 'bad neighbourhoods' like link farms will harm your
ranking.

As a rule, ensure that as many of your outbound links as
possible point to websites that are relevant to your field of
business.



=======================================
John Hill - Developer, Designer and SEO Professional with
E-Gain New Media (http://www.e-gain.co.uk) offering website
design (http://www.e-gain.co.uk/web-development/
website_development/web-site-design/), search engine optimization
(http://www.e-gain.co.uk/online_marketing/business_solutions/
search-engine-optimisation/) and PPC Management.
=======================================

The Advance Of Algorithms - New Keyword Optimization Rules

By Matt Jackson (c) 2006

Posted on this blog by Afzal Khan

Maintaining and marketing a website can be a difficult task,
especially for those with little or no experience. SEO rules are
constantly changing, and even then, many SEO professionals
disagree on the actual specifics required to optimize a website.
This is in no small part due to the search engines themselves.

Major search engines like Google are constantly striving to
ensure that sites at the top of their result pages offer
invaluable information or service to their visitors. However,
webmasters who are looking to make quick money while offering
very little quality content are always finding new ways to beat
the search engines at their own game. For this reason, search
engines regularly change the methods they use to determine the
relevancy and importance of your site.

Evolving Search Engines

The first step you should take is to ensure that your website
will do an effective job of turning visitors into money. The
content needs to be optimized so that both search engines and
human visitors deem it to be a useful website. Once upon a time,
effective optimization entailed cramming content with as many
keywords as possible, and while this once generated good search
engine results, it invariably put visitors off. It is also now
frowned upon and penalized as spam by all of the major search
engines.

The Evolution And Improvement Of Algorithms

Search engines use specific algorithms to determine the
relevance of your website. The calculations from these
algorithms determine where on the search engine result pages
your website will appear. In order to keep unscrupulous
webmasters guessing and to ensure that results are always up to
date, major search engines regularly update their algorithms.

Recent Advances

Some of the most recent changes have shifted the emphasis away
from optimizing websites purely for search engines; the
algorithms are now geared to promote websites that give true
value to visitors. They are not only changing, they are evolving
into more intelligent and accurate algorithms. While the use of
keywords based around the relevant topic is still important, it
is also important to ensure that visitors are your main
priority.

Keyword Optimization

Keyword optimization is now more heavily policed. Those who
include keywords too often will have their sites labeled as
spam, whereas too few instances of the appropriate keyword means
you won't receive the desired results. However, the algorithms
have become particularly smart, and as well as the keywords you
want to target you should include other relevant keywords.
Including inflexions of keywords is one excellent way to ensure
that your site is deemed to be relevant. Inflexions are slight
variations of your keyword. For example, inflexions of the
keyword "advertising" include advertise, advertised,
advertisement, etc.

Keyword Inclusion

Weight is also given to keywords that are included in certain
sections of a page. These sections include the title tag, meta
tags (only relevant to smaller search engines now), header tags,
image alt tags and formatting tags (e.g. keywords in bold or
italicized) of your text. With image alt tags and hyperlink
title tags it is important that you don't simply fill these with
keywords because this will be ignored at best, and penalized at
worst.

Natural Content Writing

One of the most effective ways to ensure that your site is
keyword optimized properly is to write the content naturally
first. Once you have done this, go through and ensure that any
relevant keywords are included throughout the text. Only place
them where they would appear naturally, and remove them from
anywhere they read awkwardly. Once you've written the content,
you should also check the remaining factors to ensure everything
is in order.

SEO Keyword Checklist

Below is a keyword checklist to ensure that you have fully
optimized your web pages to the current, generally accepted
search engine algorithm rules.

URL: Get your primary keyword as close to the beginning
of the URL as possible.

Title Tag: The title should be between 10 and 50
characters and include one or more keywords while still being
descriptive.

Description Meta Tag: The description meta tag should be
insightful and useful but it should also contain one or two of
your more important keywords.

Keyword Meta Tag: It makes sense that you should include
all of your keywords in the keyword meta tag. Do not include any
words that don't appear in the body of your text.
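
Pulling the URL, title and meta tag points together, the head of
a hypothetical page targeting 'blue widgets' might start like
this (all names and text are purely illustrative):

    <!-- URL: http://www.example.com/blue-widgets.html -->
    <head>
      <title>Blue Widgets - Custom Blue Widget Supplier</title>
      <meta name="description"
            content="Blue widgets made to order, with advice on choosing the right widget and free delivery on bulk orders.">
      <meta name="keywords"
            content="blue widgets, blue widget, custom widgets, widget supplier">
    </head>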

Keyword Density: Your content should be made up of your
keywords woven into the rest of your text. A total keyword
density (all keywords combined) of around 15% to 20% is the
maximum you should aim for, and anything less than 5% is
unlikely to yield good results. Density for a single keyword
should be between 1% and 7%; 1% is probably too low, and 7% a
little too high. Wherever possible aim for approximately 5% for
the primary keyword and 3% for secondary and subsequent
keywords. As a rough example, a 400-word page would mention its
primary keyword about 20 times to reach 5%.

Header Tags (e.g. H1 and H2 tags): More weight is given
to keywords that appear within H1 tags, then H2 tags and so on.

Text Formatting Fonts (e.g. strong, bold and underline):
This may not offer much weight in algorithms, but generally if
you bold the first instance of your keywords and the last
instance of your primary keyword you should see some positive
results.

Beginning Of Text: The closer you can get your keywords
to the beginning of your page content the better. Try to include
your primary keyword within the first sentence or two and also
within the last paragraph.

Key-Phrases As Whole Phrases: If you are targeting Internet
Marketing as a key phrase then do not split the words up if
possible. Some effect is noticed if the words are split, but
much more benefit is received by including the phrase as a
whole.

Alt Text: Include your keyword at least once in the Alt tag of
any images. Ensure that the text is relevant to the image and
gives some information.
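
And the visible body of the same hypothetical page, showing the
header tag, early keyword placement, bold first instance, the
whole key-phrase and the image alt text points in one place:

    <h1>Blue Widgets</h1>
    <p><strong>Blue widgets</strong> from our workshop are made to
       order, and every blue widget is tested before it ships.</p>
    <img src="blue-widget.jpg" alt="Hand-finished blue widget ready for delivery">
    <h2>Why choose our blue widgets?</h2>
    <p>...</p>
    <p>Order your blue widgets today and see the difference.</p>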


========================================================
Matt Jackson, founder of WebWiseWords (http://www.webwisewords.com),
is a professional copywriter offering a professional service.
Whether your business or your website needs a website content
copywriter, an SEO copywriter, a press release copywriter or a
copywriter (http://www.webwisewords.com) for any other purpose
WebWiseWords can craft the words you want.
========================================================

Google buys search algorithm invented by Israeli student

Source : haaretzdaily.com

Search engine giant Google recently acquired an advanced text search algorithm invented by Ori Alon, an Israeli student. Sources believe Yahoo and Microsoft were also negotiating with the University of New South Wales in Australia, where Alon is a doctoral student in computer science.

Google, Alon and the university all refused to comment, though Google confirmed that "Ori Alon works at Google's Mountain View, California offices."

The University acknowledged that Yahoo and Microsoft had conducted negotiations with its business development company.

Alon told TheMarker in an interview six months ago that the university had registered a patent on the invention.

Orion, as it is called, which Alon developed with faculty members, returns only the most relevant textual results. In addition, the software, which currently operates only in English, offers a list of topics directly related to the original source.

"For example, if you search information on the War of Independence, you'll receive a list of related words, like Etzel, Palmach, Ben-Gurion," he explained. The text will only appear on the results page if enough words relevant to the search and the link between them is reasonable. Orion also rates the texts by quality of the site in which they appear.

Google Algorithm Problems

By Rodney Ringler (c) 2006.
Posted on this blog by Afzal Khan

Have you noticed anything different with Google lately? The
Webmaster community certainly has, and if recent talk on several
search engine optimization (SEO) forums is an indicator,
Webmasters are very frustrated. For approximately two years
Google has introduced a series of algorithm and filter changes
that have led to unpredictable search engine results, and many
clean (non-spam) websites have been dropped from the rankings.
Google updates used to be monthly, and then quarterly. Now, with
so many servers, there seem to be several different sets of
search engine results rolling through those servers at any time
during a quarter. Part of this is the recent Big Daddy update,
which is a Google infrastructure update as much as an algorithm
update. We believe Big Daddy is using a 64-bit architecture.
Pages seem to go from a first page ranking to a spot on the
100th page, or worse yet to the Supplemental index. Google
algorithm changes started in November 2003 with the Florida
update, which now ranks as a legendary event in the Webmaster
community. Then came updates named Austin, Brandy, Bourbon, and
Jagger. Now we are dealing with Big Daddy!

The algorithm problems seem to fall into four categories:
canonical issues, duplicate content issues, the Sandbox, and
supplemental page issues.

1. Canonical Issues: These occur when a search engine
treats www.yourdomain.com, yourdomain.com, and yourdomain.com/index.html
all as different websites. When Google does this, it then flags
the different copies as duplicate content and penalizes them.
Also, if the version that escapes the penalty is
http://yourdomain.com but other websites link to you using
www.yourdomain.com, then the version left in the index will have
no ranking. These
are basic issues that other major search engines, such as Yahoo
and MSN, have no problem dealing with. Google is possibly the
greatest search engine in the world (ranking themselves as a 10
on a scale of 1 to 10). They provide tremendous results for a
wide range of topics, and yet they cannot get some basic indexing
issues resolved.

2. The Sandbox: This has become one of the legends of
the search engine world. It appears that websites, or links to them,
are "sandboxed" for a period before they are given full rank in the
index, kind of like a maturing time. Some even think it is only
applied to a set of competitive keywords, because they were the
ones being manipulated the most. The Sandbox's existence is
debated, and Google has never officially confirmed it. The
hypothesis behind the Sandbox is that Google knows that someone
cannot create a 100,000-page website overnight, so they have
implemented a type of time penalty for new links and sites
before they fully make it into the index.

3. Duplicate Content Issues: These have become a major
issue on the Internet. Because web pages drive search engine rankings,
black hat SEOs (search engine optimizers) started duplicating
entire sites' content under their own domain name, thereby
instantly producing a ton of web pages (an example of this would
be to download an Encyclopedia onto your website). As a result
of this abuse, Google aggressively attacked duplicate content
abusers with their algorithm updates. But in the process they
knocked out many legitimate sites as collateral damage. One
example occurs when someone scrapes your website. Google sees
both sites and may determine the legitimate one to be the
duplicate. About the only thing a Webmaster can do is track down
these sites as they are scraped, and submit a spam report to
Google. Another big issue with duplicate content is that there
are a lot of legitimate uses of duplicate content. News feeds
are the most obvious example. A news story is covered by many
websites because it is content the viewers want. Any filter will
inevitably catch some legitimate uses.

4. Supplemental Page Issues: Webmasters fondly refer to
this as Supplemental Hell. This issue has been reported on places like
WebmasterWorld for over a year, but a major shake up around
February 23rd has led to a huge outcry from the Webmaster
community. This recent shakeup was part of the ongoing BigDaddy
rollout that should finish this month. This issue is still
unclear, but here is what we know. Google has 2 indexes: the
Main index that you get when you search, and the Supplemental
index that contains pages that are old, no longer active, have
received errors, etc. The Supplemental index is a type of
graveyard where web pages go when they are no longer deemed
active. No one disputes the need for a Supplemental index. The
problem, though, is that active, recent, and clean pages have
been showing up in the Supplemental index. Like a dungeon, once
they go in, they rarely come out. This issue has been reported
with a low noise level for over a year, but the recent February
upset has led to a lot of discussion around it. There is not a
lot we know about this issue, and no one can seem to find a
common cause leading to it.

Google updates were once fairly predictable, with monthly
updates that Webmasters anticipated with both joy and angst.
Google followed a well published algorithm that gave each
website a Page Rank, which is a number given to each webpage
based on the number and rank of other web pages pointing to it.
When someone searches on a term, all of the web pages deemed
relevant are then ordered by their Page Rank.
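
For reference, the originally published formula (from the 1998
Brin and Page paper) computes the PageRank of a page A from the
pages T1...Tn that link to it:

    PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)

Here d is a damping factor (the paper suggests 0.85) and C(T_i)
is the number of outbound links on page T_i.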

Google uses a number of factors such as keyword density, page
titles, meta tags, and header tags to determine which pages are
relevant. This original algorithm favored incoming links and the
anchor text of them. The more links you got with an anchor text,
the better you ranked for that keyword. As Google gained the
bulk of internet searches in the early part of the decade,
ranking well in their engine became highly coveted. Add to this
the release of Google's Adsense program, and it became very
lucrative. If a website could rank high for a popular keyword,
they could run Google ads under Adsense and split the revenue
with Google!

This combination led to an avalanche of SEO'ing like the
Webmaster world had never seen. The whole nature of links between
websites changed. Websites used to link to one another because
it was good information for their visitors. But now that link to
another website could reduce your search engine rankings, and if
it is a link to a competitor, it might boost his. In Google's
algorithm, links coming into your website boost the site's Page
Rank (PR), while links from your web pages to other sites reduce
your PR. People started creating link farms, doing reciprocal
link partnerships, and buying/selling links. Webmasters started
linking to each other for mutual ranking help or money, instead
of quality content for their visitors. This also led to the
wholesale scraping of websites. Black hat SEOs will take the
whole content of a website, put Google's ad on it, get a few
high powered incoming links, and the next thing you know they
are ranking high in Google and generating revenue from Google's
Adsense without providing any unique website content.

Worse yet, as Google tries to go after this duplicate content,
they sometimes get the real company instead of the scraper. This
is all part of the cat and mouse game that has become the Google
algorithm. Once Google realized the manipulation that was
happening, they decided to aggressively alter their algorithms
to prevent it. After all, their goal is to find the most
relevant results for their searchers. At the same time, they
also faced huge growth with the internet explosion. This has led
to a period of unstable updates, causing many top ranking
websites to disappear while many spam and scraped websites
remain. In spite of Google's efforts, every change seems to
catch more quality websites. Many spam sites and websites that
violate Google's guidelines are caught, but there is an endless
tide of more spam websites taking their place.

Some people might believe that this is not a problem. Google is
there to provide the best relevant listings for what people are
searching on, and for the most part the end user has not noticed
an issue with Google's listings. If they only drop thousands of
listings out of millions, then the results are still very good.
These problems may not be affecting Google's bottom line now,
but having a search engine that cannot be evolved without
producing unintended results will hurt them over time in several
ways.

First, as the competition from MSN and Yahoo grows, having
the best results will no longer be a given, and these drops in
quality listings will hurt them. Next, to stay competitive
Google will need to continue to change their algorithms. This
will be harder if they cannot make changes without producing
unintended results. Finally, having the Webmaster community lose
faith in them will make them vulnerable to competition.
Webmasters provide Google with two things. They are the word of
mouth experts. Also, they run the websites that use Google's
Adsense program. Unlike other monopolies, it is easy to switch
search engines. People might also criticize Webmasters for
relying on a business model that requires free search engine
traffic. Fluctuations in ranking are part of the internet
business, and most Webmasters realize this. Webmasters are
simply asking Google to fix bugs that cause unintended issues
with their sites.

Most Webmasters may blame ranking losses on Google and their
bugs. But the truth is that many Webmasters do violate some of
the guidelines that Google lays out. Most consider it harmless
to bend the rules a little, and assume this is not the reason
their websites have issues. In some cases, though, Google is
right and has just tweaked its algorithm in the right direction.
Here is an example: Google seems to be watching the
incoming links to your site to make sure they don't have the same anchor
text (this is the text used in the link on the website linking
to you). If too many links use the same anchor text, Google
discounts these links. This was originally done by some people
to inflate their rankings. Other people did it because one
anchor text usually makes sense. This is not really a black hat
SEO trick, and it is not called out in Google's guidelines, but
it has caused some websites to lose rank.

Webmasters realize that Google needs to fight spam and black
hat SEO manipulation. And to their credit, there is a Google
Engineer named Matt Cutts who has a Blog site and participates
in SEO forums to assist Webmasters. But given the revenue impact
that Google rankings have on companies, Webmasters would like to
see even more communication around the known issues, and help
with identifying future algorithm issues. No one expects Google
to reveal their algorithm or what changes they are making. Rumor
on the forum boards speculates that Google is currently looking
at items like the age of the domain name, websites on the same
IP, and frequency of fresh content. It would be nice from a
Webmaster standpoint to be able to report potential bugs to
Google, and get a response. It is in Google's best interest to
have a bug free algorithm. This will in turn provide the best
search engine results for everyone.

==============================================
Rodney Ringler is President of Advantage1 Web Services, Inc.,
which owns a network of Web Hosting Informational Websites
including Hostchart.com (http://www.hostchart.com),
Resellerconnection.com (http://www.resellerconnection.com),
Foundhost.com (http://www.foundhost.com) and Resellerforums.com
(http://www.resellerforums.com).
==============================================

"Google: The "Big Daddy' of the Blogging World"

Article Courtesy of Merle http://MCPromotions.com
Posted on this Blog by Afzal Khan

It seems like every time you turn your head these
days you hear something about Google. Definitely
a force to be reckoned with, Google is king online.
Although they're known mostly as a search engine,
they provide many other helpful services to today's
website owner.

Their non-search services include Blogger, Google Adsense,
Adwords, Google Desktop, and more. If you
use one or more of these services, Google probably
publishes a helpful blog to go with it.

Let's examine a few:

1) Google Adsense: http://adsense.blogspot.com

Many website owners use Google's Adsense to display
ads on their websites and generate a monthly income
in exchange for their efforts. This blog is a "look
inside Google Adsense." Who better to learn the ins and outs of
Adsense than Google's employees themselves?

You'll pick up plenty of tips to help improve your
click-throughs and learn about many Adsense features
you may not fully understand. If you're an Adsense
user, you need to subscribe to this page.

2) Google SiteMaps: http://sitemaps.blogspot.com

Google created "Sitemaps" for website owners to
supply information about their sites so that Google
can easily crawl through and index their pages. A
sitemap gives Google more complete information about your
website that it might otherwise miss during a regular crawl.

This blog covers the "nitty gritty" of sitemaps,
how they work and things you'll need to pay attention
to when creating your files. If you want to fully
understand the program and how to
use it to your best advantage, stop on over.

3) Google Adwords: http://adwords.blogspot.com

Adwords is Google's pay per click ad program. Your
text ads are displayed both on search engine results
pages and other websites when someone searches for
the keywords/phrases that you have placed bids on.

If you use Adwords, you can really learn a lot
about how to improve your conversions and get
more for your dollars here.

4) Google Reader: http://googlereader.blogspot.com

Google Reader was designed to make it easier to
keep up with all of the information you like
to read online. Using it can help you get all of the
latest updates for your favorite websites.

This blog will keep you posted on the latest
reader updates, any known bug problems and tips
for getting more out of Reader.

5) Blogger: http://buzz.blogger.com

I love Blogger. It's the fastest, easiest way to
start your own blog at no charge -- and it's owned
by Google, who publish their own blog to help you get
the most out of blogging at Blogger. Read it for
fun stories and news of the latest
add ons and enhancements from Google's Blogger
team.

6) Google Video: http://googlevideo.blogspot.com

Google Video allows anyone to upload videos to
their servers and others can go and watch them.
This blog showcases some of those videos and
upcoming special event videos.

7) Google's Blog: http://googleblog.blogspot.com

Google is like a small country and this blog will
give you a glimpse into their culture. Posts are
done by various employees in various departments.
You'll learn a lot about Google's products and other
technology news. If you're obsessed with Google,
this one should be on your "must see" list.

8) Adwords API Blog: http://adwordsapi.blogspot.com

This free Google service allows developers to
engineer computer programs that interact directly
with the Adwords server. You might say it's for "tech
heads." If you're a "geek" you'll want to hang
your hat here.

9) Google Desktop: http://googledesktop.blogspot.com

Google Desktop is downloadable software you can
use to quickly find files and other information
on your computer. This blog will tell you about
updates, available plug-ins and more.

10) Google Base: http://googlebase.blogspot.com/

Google Base is a place where you can post all types
of content and have it show up on Google.
This Blog will enlighten you on how best to use
it, and even showcases others having success with
Google's latest service.

It can be a full-time job just trying to keep up with all the
delightful services Google provides. By bookmarking their blogs,
and/or subscribing to their feeds, you too can become an
"expert" on all things Google.


=========================================
By Merle- Want to Know the SECRETS of Article
Promotion? Discover everything you need to know
in this brand New Ebook, "How to Use Articles to
Drive Website Traffic". Get your F-r-e-e Copy now at
http://articleannouncer.mcpromotions.com

An SEO Checklist

Article Courtesy by Google Page Rank.net

Posted on this blog by Afzal Khan

Search engine optimization is on every webmaster's mind these days. Achieving a favorable ranking for the right keywords can mean a steady stream of targeted traffic to your site, and all for free - that's hard to beat. The key to high search engine rankings is structuring your website correctly, including plenty of content that is relevant to your keywords, and making sure your website is spider-friendly. You can use this checklist to make sure all of your Web pages can be found, indexed and ranked correctly:

Your website is themed. Your site deals with an identifiable theme which is obvious from the text on the home page and reinforced by all the other pages on your site. In other words, all the individual Web pages relate to each other and deal with various aspects of some central theme. The text on your home page should state clearly what that theme is and what your website is about, and the other pages should reinforce that.

Your Web pages have enough high quality, relevant content. Spiders come to your website looking for content. If a page doesn't have much content, or the content doesn't appear closely related to the page's title and your website's theme, the page probably won't be indexed or if it is indexed it won't rank well. Search engines love quality content and lots of it - content is what Web searchers are looking for and search engines try to provide.

Your website's navigational structure is relatively flat. You don't want important pages to be too "deep" within your website, meaning it takes several clicks to get there from the home page. Search engines typically index the home page first, then gradually index other pages on a site over time. Many spiders are programmed to only go three layers deep - if some of your important content is buried deeper than that, it may never be found and indexed at all.

You've created a unique "Title" tag for each page. The title is one of the most important aspects of any Web page from an SEO standpoint, especially for Google (which is the most important search engine to optimize for). Don't use a generic title for all your pages; use the keywords you're targeting for that page and keep it brief but descriptive.

You use the "Description" meta tag. Contains a highly descriptive sentence about the content and purpose of your page, and contains your most important keyword phrase early in the sentence. Not all of the search engines will display this "canned" description when they list the page in search results, but many of them will, so it's worth getting it right. Check out our Webmaster Tools for an easy way to create great meta tags.

You use the "Keywords" meta tag. As with the meta tag description, not every search engine will use the keywords meta tag. But some will use it and none will penalize you for having it. Also, having a short list of the keywords you're targeting will help you write appropriate content for each page. The keyword tage should contain your targeted keyword phrase and common variations, common misspellings and related terms. Make sure your keywords relate closely to the page content and tie into the overall theme of your site.

Your keywords are included in the visible page content, preferably high up on the page. You have to achieve a balance here - you want to include keyword phrases (and variations) a number of times within your text, but not so many times that you appear to be guilty of "keyword stuffing". The trick is to work the keywords into the text so that it reads as naturally as possible for your site visitors. Remember, you can incorporate keywords into any Web page element that is potentially viewable by site visitors - header text, link text and titles, table captions, the "Alt" attribute of the image tag, the "title" attribute of the link tag, etc.

Every page of your website can be reached by search engine spiders. This is critical - if your pages can't be found, they can't be indexed and included in search results, let alone rank well. Search engines use spiders to explore your website and index the pages, so every page must be accessible by following text links. If pages require a password to view, are generated by a script in response to a query, or have a long and complicated URL, spiders may not be able to read them. You need to have simple text links to the pages you want indexed.

You've included a site map. Unless your site is very small, it's a good idea to create a site map with text links, and to link to the site map from your home page. In addition to each link, include descriptive text containing the relevant keywords for that page.

You link to your most important pages from other pages on your site. Internal links help determine page rank since they show which pages of your site are most important. The more links you have to a page, relative to other pages on your site, the more importance search engines will assign to it.

You use keywords in your link text. When you create a text link to another page on your site, use that page's targeted keywords as the text for the link (inside the anchor tags that create the link). Make it as descriptive as possible. For example, a link that says "Premium Customized Widgets" is much better than one that says simply "Product Page", and indicates to search engine spiders what that linked page is about.
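
In markup, the difference between the two links mentioned above (the URL is invented) is simply:

    <!-- Weak link text -->
    <a href="widgets/custom.html">Product Page</a>

    <!-- Keyword-rich link text -->
    <a href="widgets/custom.html">Premium Customized Widgets</a>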

Your site doesn't use frames. If possible, don't use frames on any page you want to get indexed by search engines. If you feel you simply must use frames for a page, then also make use of the "noframes" HTML tags to provide alternative text that spiders can read (and make that text descriptive rather than just a notice that "This site uses frames etc. etc.").

You don't use automatic page redirects. Don't make any pages automatically redirect the visitor to another page (the exception is a page you've deleted for good - in which case you should use a "301 redirect", a permanent redirect which is acceptable to search engines).

Your important content is in plain text and not contained in images. Search engine spiders can't "read" content in JPEG, GIF, or PNG files. If you really feel that using an image rather than text is crucial to your design, at least put the same text in the image's "Alt" tag (or in the "title" tag if you're using the image as a hyperlink).

Your important content is not contained in Flash files. Flash is a wonderful technology, but unfortunately spiders don't have the required "plugin" to view Flash files. As a result, Flash content is mostly inaccessible to search engine spiders. Some can find and follow hyperlinks within the Flash file, but unless those links lead to pages with readable HTML content this won't help you much. Don't create all-Flash pages for any content you want to get indexed - instead, put that content in the HTML portion of the page.

Links and keywords are not hidden inside JavaScript code. If your links use JavaScript to direct the user to the appropriate page (for instance, a drop-down list) or important content is contained within JavaScript code (when it's displayed dynamically using DHTML, for instance) search engine spiders won't be able to "see" it. You can, however, use the "noscript" HTML tags to provide an alternative that can be read by spiders.

You've optimized every important page of your website individually. Don't stop at your home page. Take the trouble to optimize any page which has a reasonable chance of being indexed by the major search engines, targeting appropriate keywords for each. If you face a lot of competition it may be nearly impossible to get a top ranking for your home page, but you can still get a lot of search engine traffic to your site from other pages which are focused on very specific keyword phrases.

You didn't duplicate content. Each page of your site should have unique content that distinguishes it from every other page on your site. Duplicating content or having pages that are only slightly different might be seen as "search engine spamming" (trying to manipulate search engine results).

You provide linking instructions for those who want to link to your site. Somewhere on your site state your policies about other people linking to your site and provide the wording you'd like them to use in their link. You want to encourage other people to link to your site, preferably using link text and a description that reflect the keywords for that page. For their convenience provide the ready-made HTML code for the link - not everyone will use it, but most often they will use your preferred text as a courtesy as long as it is truly descriptive of your site and doesn't contain "marketing hype".

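A sketch of the kind of ready-made snippet you might publish for this purpose (the domain and wording are placeholders):

    <p>Link to us by copying this HTML onto your own page:</p>
    <textarea rows="3" cols="60" readonly="readonly">
    <a href="http://www.example.com/" title="Custom blue widgets">Example Widgets - custom blue widgets made to order</a>
    </textarea>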

Important hyperlinks are plain text links and not image links or image maps. Text links are better from an SEO standpoint than image links, as spiders can't read text from an image file. If you feel you really must use a graphic as a link, at least include a text description (including the relevant keywords) by using the "title" attribute of the link tag.

Your website is free of coding errors and broken links. HTML coding errors and non-working links can keep search engine spiders from correctly reading and indexing your pages. For that reason, it's a good idea to use a Web page validation utility to check your HTML code to make sure it's error-free.

Search Engine Ratings

Posted on this blog by Afzal Khan

Hitwise uses a combination of anonymous web surfing data provided by ISPs in various countries and its own panel-based measurements to determine which sites are most popular on the web. The data encompasses the surfing activities of 25 million people, worldwide.

The chart below shows the share of all visits made to both search and directory sites, as classified by Hitwise, by US web surfers during July 2005. "Domain" shows the core domain of the service involved.

The pie chart generated by comScore Media Metrix Search Engine Ratings also favors Google above all other search engines.


Data taken from Search Engine Watch.

12 Tips to help build the foundation for a new SEO career

By John Alexander
Posted on this blog By Afzal Khan

TIP 1. Set your focus on your clients' success.

Stop focusing on sales and start focusing on your client's success! Do all that you can do to make them successful. Pour all of your talents into making their projects work. So many folks I talk to can never stop thinking about
where they will make their next sale, instead of working on delivering results to the clients they ALREADY have. In so doing, you establish "lifetime" residuals.

Make your client successful and they will literally become part of YOUR sales team.

TIP 2. A difference in your performance is a difference in your profits.

If you are NOT up to speed, you'd better catch up fast. Truly, a difference in your performance is a difference in your profits! If you are not up to speed on solid SEO marketing techniques and methods, start learning now. Take a course or study at a live workshop, but however you do it, get your SEO skills up to speed so you can genuinely help get your customers results. If you can show them a strategy that really puts dollars in their pocket, they'll put dollars in YOUR pocket!

TIP 3. Have confidence in your own strategies and explore profit sharing.

How's your batting average with profit sharing? Don't be afraid to explore this one! If your skills are medium to above average, why not share in the profits yourself? I am referring to offering someone a vertical deal. This would be a deal where you own part of the company in return for making it successful with your SEO skills.
Don't brush this off. There are some exceptional deals to be had if you start thinking laterally.

TIP 4. Don't forget community and charitable work.
When's the last time you helped promote a charitable cause at NO cost? Build a site and promote it for the literacy council or the Easter Seals Society or your local Rotary Group or your Chamber of Commerce.

Don't forget that this work can OFTEN open unusual and even surprising doors. Help make others successful and you will NOT fail. Many important leaders within your local community will be serving on these committees right next to you!
This is a great way to network and meet new people and help the cause too.

TIP 5. Work on Relationship building and position yourself for success to come.

Watch the latest SEO trends and position yourself to quickly take advantage. The study and practice of SEO has been extraordinary. A few years ago, I never dreamed that my study of optimization would lead to the Internet lifestyle. As a result of our work, we enjoy wonderful repeat business and client loyalty.

If you have not been enjoying good profits, a rewarding lifestyle and being appreciated by your clients, then you need to consider a plan of action.

Tip 6. Set your course of action and get started.

Consider taking a live hands-on SEO workshop which can kick-start your professional SEO career in just a few days of hands-on training. Or if you cannot travel, consider taking an Online SEO Training Course.

Tip 7. Choose carefully which voices you listen to.

You MUST be able to "deliver" and make a difference. Do whatever it takes to get your SEO skills and lateral thinking skills up to speed. In business there are many voices offering free advice. You need to choose carefully who you will listen to.

Tip 8. Run a balanced business.

Are you charging for what your services are worth? There are some folks who charge steeply and don't even know how to get the results. For goodness sakes, if you're good at what you do, make sure you are charging well for your services. You DESERVE fair reward if you're helping other business owners to prosper.
(Some folks are afraid to charge for their work)

Note: The ones that charge steeply but DON'T deliver may make a few dollars initially, but they won't enjoy the customer loyalty, the referral business, and the repeat business that you do, and they won't have a "customer for life" like you will.

Tip 9. Don't forget to recognize and be thankful for the progress you've made.

Yes, this is extremely important. How else can you truly measure your progress unless you benchmark along the way? Be sure to benchmark your victories but, even more important, celebrate your CLIENTS' VICTORIES too! After all, you helped bring them about.

Tip 10. Give something back to your community (with gladness).

Look for opportunities to help others who genuinely need help and avoid those who are only after your talents to exploit them. (Trust me, when your SEO talents and success stories increase, you'll have strangers coming out of the woodwork to take you to dinner and wine you and dine you and pick your brain). Proceed with wisdom.

Tip 11. You must be willing to change and take action!

Performing the way you perform now has delivered a certain result. Maybe you're happy with that result. If you're happy with this result, carry on exactly the same way and you should get very similar results. However, if you are NOT happy with your results now, then you must change the way you do things.

Some people go all their life complaining that they would like a better career
or a better position in life and yet they continue taking the same actions and
getting the same results year after year.

Think of it like this...
Same action = same result
Different action = new results

Tip 12. Surround yourself with high quality people!

Without a doubt, your SEO interests and abilities will make way for many new working relationships. Be sure to carefully choose the people you want to work with. Look for those with whom you can share synergies and be very observant of
the skills and abilities within the others that surround you. No single person can operate as effectively as a group of people that work as a team. Learn to choose the right people to work with and recognize the latent strengths and talents
that will sometimes be present but initially hidden within the group. Build to your ultimate potential by choosing wisely, encouraging one another and recognizing the talent in one another.

Is a career in SEO right for you?

Are you ready to take your SEO career to the next step?

We would be delighted to meet with you personally and teach you exactly how
search engines work. The beauty of learning new SEO skills is that it puts you
back in the position of having the power to make your own choices.


About John Alexander
john@searchengineworkshops.com
John Alexander is Co-director of Training at Search Engine Workshops offering live, SEO Workshops with partner Robin Nobles as well as online search engine marketing courses through Online Web Training. John is author of an e-book called Wordtracker Magic and co-author of the Totally Non-Technical Guide for A Successful Web Site. John is also an official member of the customer support team at Wordtracker.com.

Black Worm Hits India hard.

Posted By Afzal Khan
January 30, 2006

It has been observed that the Black Worm, also known as W32.Vb.i or W32.Nayem.E, has been actively spreading in India for the last two weeks. It is a mass-mailing worm that also spreads using remote shares. After a long gap there has been an outbreak-like situation, as this worm managed to spread all over the globe within a few hours when it first appeared on the Internet. The reason the worm was so successful in spreading is that it propagates by creating a MIME-encoded compressed executable with an unusual extension (.HQX, .BHX), which did not have any kind of header to classify the file. As a result, mail gateway scanners were not able to decode the attachment and scan the infected files. This is why the worm slipped through even though the mail servers had updated anti-virus scan engines. Many of the leading anti-virus vendors had to make changes to their scan engines so that the scanners could decode the file and scan the infected attachment.

AntiVirus Quick Heal from India was the first anti-virus to detect this worm when it first hit the net, according to the report generated and published by PC-Wallet Magazine, Germany. According to PC-Wallet, Germany, the worm was first caught and detected on 16th January 2006 at 10:00 (GMT) by Quick Heal AntiVirus. For more details on the outbreak response times of various other anti-virus products worldwide, see:
http://www.pcmag.com/article2/0,1895,1916880,00.asp

According to US-based LURHQ, a leading provider of Threat and Vulnerability Management services, this worm has hit countries like India, Italy and Peru hard, with high infection rates. Among them, India is by far the hardest hit country in terms of overall infection rate to date. Live statistics of the infection rate per country can be found on their website at http://www.lurhq.com/blackworm-stats.html
This worm attaches itself to e-mail messages as an executable file with various different names, and occasionally it compresses itself with ZIP, encodes the compressed file using MIME encoding and then attaches the encoded file to the e-mail messages.

The worm has several network spreading routines. One of them enumerates all available shares, then reads the values of the registry key where personal documents and recently opened files are stored. It copies itself to such folders using a file name with an executable extension that matches the name of a document in that folder. The worm also copies itself to network shares under the same name. Once active, this worm first tries to delete the folders of well-known international anti-virus products (e.g. Norton AntiVirus, McAfee, Trend etc.)

This worm has a dangerous payload: it will delete all the documents, worksheets, presentations, database files and compressed backup files from the system on every 3rd day of the month. This is a very serious payload considering that the worm has spread all over India and the first payload day, 3rd February, is arriving very soon. We recommend all our users keep their anti-virus software updated, up and running. All Quick Heal users have been protected from this worm from day one.

For computer users who do not have Quick Heal, we have a special Black Worm removal tool freely available from our website: http://www.quickheal.co.in/public/alerts/i-worm.VB_Bi.asp
More Information
Black Worm Analysis
Free removal tool for Black Worm

Open federation for Google Talk

By Mike Jazayeri, Product Manager, Google Talk

We've just announced open federation for the Google Talk service. What does that mean, you might be wondering. No, it has nothing to do with Star Trek. "Open federation" is technical jargon for when people on different services can talk to each other. For example, email is a federated system. You might have a .edu address and I have a Gmail address, but you and I can still exchange email. The same for the phone: there's nothing that prevents Cingular users from talking to Sprint users.

Unfortunately, this is not the case with many IM and Internet voice calling services today. You can only talk to people on the particular service you have an account on (so you need an account on every service to talk to everybody, which is pretty cumbersome). With open federation, you get to choose your service provider and you can talk to people on any other federated service (and vice versa).

In addition to the Google Talk service, many other companies, universities, and corporations support open federation today. This means you can now talk to millions of users around the world all with a single account on the service provider of your choice.

We think this is pretty exciting -- though perhaps not as exciting as something Star Trek-related -- and we hope it will bring us one step closer to making IM and Internet voice calling as ubiquitous as email.