Giving Search a Human Touch

Howdy Folks,

How were the New Year celebrations? Enjoyed them, I hope! Well, I wish you and your family a prosperous New Year 2007 ahead, and may it bring joy to your life!

Now coming to the posting: after the hangover of the New Year party and the Eid celebrations, everything is looking very dull to me, but I still managed to read blogs and other interesting articles, as I regularly do without fail, and I am blogging here about a great article I found today during a search: "Giving Search a Human Touch". How does it sound? Cool, isn't it? It describes Jimmy Wales's vision of a totally transparent social search engine, and once you hear him explain it, you realize that his plan just might work.

Read below the article written by reporter Michael Calore, editor of Webmonkey. You can find the original copy of this article at Wired.com.

As usual: article posted on this blog by Afzal Khan.

Giving Search a Human Touch

The idea of building a better search engine sounds almost laughable on the surface.
After all, isn't there already a massively successful internet search player with a seemingly insurmountable market share? But to hear Jimmy Wales, co-founder of Wikipedia and chairman of the for-profit wiki site Wikia, describe his vision of a totally transparent social search engine -- one built with open-source software and inspired by the collaborative spirit of wikis -- you realize that his plan just might work.

Wales' plan for the Search Wikia project is to put ordinary users in charge of ranking search results. Heavy lifting such as indexing and raw ranking will still be done by machines, but the more nuanced work of deciding how search results are displayed will be completed by humans.

Google, the current King of Search, ranks search results based on the perceived trust of the web community at large -- the more links a page receives, the more it's trusted as an authoritative source of information, and the higher the rank. However, this method is open to tinkering, trickery and hacks, all of which damage the relevancy of results.

If successful, Wales' project, which launches in early 2007, will be able to filter out such irrelevant results. Operating much the same way as Wales' Wikipedia, both the software algorithms powering Search Wikia and the changes applied by the community will be made transparent on the project's website.

Wired News spoke to Jimmy Wales about Search Wikia. We discussed the ins and outs of how the model will likely work, what it will take to build it, and what sorts of criticisms it will face.

Wired News: Can you describe the new search engine in your own words?

Jimmy Wales: The core of the concept is the open-source nature of everything we're intending to do -- making all of the algorithms public, making all of the data public and trying to achieve the maximum possible transparency. Developers, users, or anyone who wants to can come and see how we're doing things and give us advice and information about how to make things better.

Additionally, we want to bring in some of the concepts of the wiki model -- building a genuine community for discussion and debate to add that human element to the project.

I mention "community" to distinguish us as something different. A lot of times, when people talk about these kinds of (projects), they're not thinking about communities. They're thinking about users randomly voting, and that action turning into something larger. I really don't like the term "crowdsourcing." We're really more about getting lots of people engaged in conversations about how things should be done.

WN: How are the communities going to be managed?

Wales: I don't know! (laughter) If you asked me how the Wikipedia community is managed, I wouldn't know the answer to that, either. I don't think it makes sense to manage a community.

It's about building a space where good people can come in and manage themselves and manage each other. They can have a distinct and clear purpose -- a moral purpose -- that unites people and brings them together to do something useful.

WN: How will the human-powered ranking element work?

Wales: We don't know. That's something that's really very open-ended at this moment. It's really up to the community, and I suspect that there won't be a one-size-fits-all answer. It will depend on the topic and the type of search being conducted.

One of the things that made Wikipedia successful was a really strong avoidance of a priori thinking about exactly "how." We all have a pretty good intuitive sense of what a good search result is. A variety of different factors make a search result "good," qualitatively speaking. How we get those kinds of results for the most possible searches depends on a lot of factors.

A lot of the earlier social search projects fell apart because they were committed a priori to some very specific concept of how it should work. When that worked in some cases but not others, they were too stuck in one mold rather than seeing that a variety of approaches depending on the particular topic is really the way to do it.

WN: I'm envisioning that Wikia Search will incorporate some sort of voting system, and that users will be able to adjust and rank lists of results. Is this the case?

Wales: Yes, but how exactly and under what circumstances that would work is really an empirical question that we'll experiment with. At Wikipedia and in the wiki world, one of the things we've always pushed hard against is voting. Voting is usually not the best way to get a correct answer by consensus. Voting can be gamed, it can be played with. It's a crutch of a tool that you can use when you don't have anything better to use. Sometimes, there is no better way. You have to say, "We've tried to get a consensus and we couldn't, so we took a vote."
In general, envisioning some sort of pre-built algorithm for counting people's votes is just not a good idea.

WN: Speaking of gaming, what methodologies do you think Search Wikia will employ to fight gaming?

Wales: I think the most important thing to use to fight against gaming is genuine human community. Those kinds of gaming behaviors pop up when there is an algorithm that works in some mechanical way, and then people find a way to exploit it. It's pretty hard to do that within a community of people who know each other. Basically, if you're being a jerk, they'll tell you to knock it off and you'll be blocked from the site. It's pretty simple for humans to see through that sort of thing. The real way to fight it is to have a group of people who trust each other, with that trust having been built over a period of time.

WN: Will there be some sort of validation that happens when results are ranked by users? Will knowledgeable contributors get the chance to vet changes?

Wales: Yes. The keys of good design here have to do with transparency -- everybody can see what everyone else has done. The communities will have the ability to effect and modify changes as they see fit.

WN: What forms of open-source software are you applying to this search project, and why do you think those would be more successful than proprietary search software?

Wales: Here's the main thing. If we publish all the software -- and we'll be starting with Lucene and Nutch, which are these open source projects that are out there and already quite good -- and do all of our modifications transparently in public, then other programmers can come and check the code. If you see things that aren't working well, you can contribute. People who are coders can contribute in one way, and ordinary people using the site can also contribute in other ways.

It's mostly about the trust that you get from that transparency. You can see for yourself, if you choose to dig into it, how things are ranked and why certain results are ranked the way they are. You can also choose to download the whole thing and do tests or tweak it to make it better in certain areas. That kind of transparency helps if you see a problem with search in some area that you care about, like some technical field for example. There's no good way for you to go and tell Google that their search is broken in this area, or that they need to disambiguate these terms -- or whatever.

By having an up-front commitment to transparency, I think you can do that.

WN: One of the key arguments in favor of a new search model is that traditional search engines like Google are subjected to spam more and more often. How can a wiki-powered search engine better fight search spam?

Wales: Again, I think it's that human element. Humans can recognize that a domain is not returning good results, and if you have a good community of people to discuss it, you can just kick them out of the search engine. It seems pretty simple to me -- it's an editorial judgment. You just have to have a broad base of people who can do that.

WN: How are you going to build this broad base? Will there be an outreach, or are you expecting people to just come to you?

Wales: I think people will come. If we're doing interesting work and people find it fun, then people will come.

WN: When do you expect to see Search Wikia up and running?

Wales: The project to build the community to build the search engine is launching in the first quarter of 2007, not the search engine itself. We may have something up pretty quickly, maybe some sort of demo or test for people to start playing with. But we don't want to build up expectations that people can come in three months and check out this Google-killing search engine that we've written from scratch. It's not going to happen that fast.

What we want to do now is get the community going and get the transparent algorithms going so we can start the real work. It's going to be a couple of years before this really turns into something interesting.

Waiting for your feedback about this news. You can mail me by clicking on this link: Afzal Khan.

How to Remove/Fix Supplemental Results from Google Search Results

Steveb of WebmasterWorld has an excellent posting on how to remove supplemental results. I agree 100% with what he says, and I recommend his posting to everyone who has supplemental results in Google and wants to remove them. Supplemental results are mostly caused when a page that once existed on a site is later removed by the site owner or lost for some other reason. They are also caused when a page that was crawled once had links pointing to it and those links later dropped off completely.

Article posted on this blog by Afzal Khan.

Here is his posting

"Google's ill-advised Supplemental index is polluting their search results in many ways, but the most obviously stupid one is in refusing to EVER forget a page that has been long deleted from a domain. There are other types of Supplementals in existence, but this post deals specifically with Supplemental listings for pages that have not existed for quite some time.
The current situation:
* Google refuses to recognize a 301 of a Supplemental listing.
* Google refuses to delete a Supplemental listing that is now a nonexistent 404 (not a custom 404 page, a literal nothing there), no matter if it is linked to from dozens of pages.
* In both of the above situations, even if Google crawls through links every day for six months, it will not remove the Supplemental listing or obey a 301.
* Google refuses to obey its own URL removal tool for Supplementals. It only "hides" the Supplementals for six months, and then returns them to the index.
As of the past couple days, I have succeeded (using the below tactics) in getting some Supplementals removed from about 15% of the datacenters. On the other 85%, however, they have returned to being Supplemental.
Some folks have hundreds or thousands of this type of Supplemental, which would make this strategy nearly impossible, but if you have less than twenty or so...
1) Place a new, nearly blank page on old/supplemental URL.
2) Put no actual words on it (that it could ever rank for in the future). Only put "PageHasMoved" text plus link text like "MySiteMap" or "GoToNewPage" pointing to appropriate pages on your site, for any human who stumbles onto this page.
3) If you have twenty supplementals put links on all of them to all twenty of these new pages. In other words, interlink all the new pages so they all have quite a few links to them.
4) Create a new master "Removed" page which will serve as a permanent sitemap for your problem/supplemental URLs. Link to this page from your main page. (In a month or so you can get rid of the front page link, but continue to link to this Removed page from your site map or other pages, so Google will continually crawl it and be continually reminded that the Supplementals are gone.)
5) Also link from your main page (and others if you want) to some of the other Supplementals, so these new pages and the links on them get crawled daily (or as often as you get crawled).
6) If you are crawled daily, wait ten days.
7) After ten days the old Supplemental pages should show their new "PageHasMoved" caches. If you search for that text restricted to your domain, those pages will show in the results, BUT they will still ALSO continue to show for searches for the text on the ancient Supplemental caches.
8) Now put 301s on all the Supplemental URLs. Redirect them to either the page with the content that used to be on the Supplemental, or to some page you don't care about ranking, like an "About Us" page.
9) Link to some or all of the 301ed Supplementals from your main page, your Removed page and perhaps a few others. In other words, make very sure Google sees these new 301s every day.
10) Wait about ten more days, longer if you aren't crawled much. At that point the 15% datacenters should first show no cache for the 301ed pages, and then hours later the listings will be removed. The 85% datacenters will however simply revert to showing the old Supplemental caches and old Supplemental listings, as if nothing happened.
11) Acting on faith that the 15% datacenters will be what Google chooses in the long run, now use the URL removal tool to remove/hide the Supplementals from the 85% datacenters.
Will the above accomplish anything? Probably not. The 85% of the datacenters may just be reflecting the fact that Google will never under any circumstances allow a Supplemental to be permanently removed. However, the 15% do offer hope that Google might actually obey a 301 if brute forced.
Then, from now on, whenever you remove a page be sure to 301 the old URL to another one, even if just to an "About Us" page. Then add the old URL to your "Removed" page where it will regularly be seen and crawled. An extra safe step could be to first make the old page a "PageHasMoved" page before you redirect it, so if it ever does come back as a Supplemental, at least it will come back with no searchable keywords on the page.
Examples of 15% datacenters: 216.239.59.104, 216.239.57.99, 64.233.183.99. Examples of 85% datacenters: 216.239.39.104, 64.233.161.99, 64.233.161.105."
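
To make step 8 concrete, here is a minimal sketch of what those 301s might look like in an Apache .htaccess file (the page names are hypothetical placeholders, not URLs from steveb's post; adapt them to your own old/supplemental URLs):

    # Hypothetical .htaccess sketch for step 8: permanently redirect each
    # old/supplemental URL once its "PageHasMoved" cache has been picked up.
    Redirect permanent /old-widgets-page.html http://www.example.com/widgets.html
    Redirect permanent /obsolete-promo.html http://www.example.com/about-us.html

The "Redirect permanent" directive sends an HTTP 301, which is exactly what the posting is trying to get Google to obey.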


Regards,

Afzal Khan

Article posted on this blog by Afzal Khan. The actual source of this article is the Search Engine Genie Blog. I recommend that all readers of the Toprankseo Blog visit the links at my favourite blogs.

Importance of a Sitemap Page on Your Website

Howdy Folks,

Hope you people are rocking in your lives!!! Well, it has been long since I wrote any fresh SEO article. Today I finally decided to come up with a new topic, taking some time out of my busy schedule.

Today I am going to talk about one of the hottest topics nowadays – the sitemap. There are many SEO tips and tricks that help in optimizing a site, but one whose importance for ranking well in search engines is sometimes underestimated is the sitemap.

A sitemap, as the name suggests, is like a map of your website – i.e. on one single page you show the structure of your site, its sections, the links between them, etc. A sitemap makes navigation easier for your visitors, and keeping an updated sitemap on your site is fruitful both for your users and for search engines. It is also an important way of communicating with search engines: with a sitemap page you tell search engines where you would like them to go, while in robots.txt you tell them which parts of your site to exclude from indexing.

Sitemaps have always been part of best Web design practices, but with their adoption by search engines they have become even more important. However, a clarification is necessary: if you are interested in sitemaps mainly from an SEO point of view, you cannot rely on the conventional sitemap alone (though currently Yahoo! and MSN still keep to the standard HTML format). For instance, Google Sitemaps uses a special (XML) format that is different from the ordinary HTML sitemap meant for human visitors.

One might ask why two sitemaps are necessary. The answer is obvious: one is for humans, the other is for spiders (for now mainly Googlebot, but it is reasonable to expect that other crawlers will join the club shortly). In that regard it is necessary to clarify that having two sitemaps is not regarded as duplicate content. In 'Introduction to Sitemaps', Google explicitly states that using a sitemap will never lead to a penalty for your site.
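
To make the difference concrete, here is a minimal sketch of such an XML sitemap (the domain, date and values are placeholder assumptions). It follows the sitemaps.org 0.9 protocol that the major engines agreed on in late 2006, and it is typically saved as sitemap.xml in the site root and submitted through Google Sitemaps:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2007-01-05</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
      <!-- one <url> block per page you want the spiders to find -->
    </urlset>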

Do check the Toprank SEO Blog for other featured articles, or mail me at afzal.bpl@gmail.com for other SEO articles you would like to know about in detail.

SEO Forum

Hey friends,

Sharing some great news about SEO and other search engine marketing: do join the exclusive forum made to discuss search engine optimisation techniques at the SEO India Forum.

Hope to see you guys there.

Njoy!!! And have fun!

Afzal

Precharge Projectnet SEO Contest

MEGA SEO Contest - Guys, blasting news for all of you who read my blog regularly: an SEO contest is on, and the details are written below. You can also check this at the SEOCompetition Blog.

Precharge Projectnet SEO Contest, The Exact Phrase - "precharge projectnet"
The Goal - to drive attention to the ever increasing problem surrounding the biggest challenge we may ever face. Everyday, people in Washington DC and all over the country are trying very hard to limit the exponential growth that the internet continues to have. These limitations may forever change the way we work and play online and preCharge believes this challenge needs some serious exposure before it's too late.

Contest starts Sunday August 19, 2006 and ends December 19, 2006. (duration is approximately 4 months) - Just in time for an extra special holiday surprise. We will even guarantee payment by December 24th, 2006 via Western Union or PayPal.

The URL that ranks highest on Google for the phrase "precharge projectnet", using a domain registered on or after August 19, 2006, wins.

Good luck to all of you, and to me too, as I am also participating in this contest.

Regards,

Afzal Khan

What Google Said - When You Weren't Listening

Article printed from SEO-News: http://www.seo-news.com
HTML version available at: http://www.seo-news.com/archives.html

What Google Said When You Weren't Listening
By Kim Roach (c) 2006

Article Posted on this Blog by Afzal Khan

Google wants to create quality search engine results just as
badly as you want to acquire high search engine rankings.
Fortunately for us, Google provides web masters with plenty of
guidelines and tips for building a Google-Friendly site.

Unfortunately, many web masters simply aren't listening. Most
web masters seem to be pulling tips and strategies from almost
every source but Google itself. However, Google has some of the
most beneficial SEO tips to be found online.

Here are just a few of the questions that you can find answered
directly by Google.

Q. Does Google index dynamic pages?

A. Yes. Google indexes dynamically generated pages. This
includes pages with the following file extensions: .asp, .php,
and pages with question marks in their URLs. However, these
pages can cause problems for the Googlebot and may be ignored.

Fortunately, there is a solution. If you feel that your
dynamically generated pages are being ignored, you may want to
consider creating static copies of those pages for the
Googlebot. Keep in mind, if you choose to do this, be sure to
include a robots.txt file that disallows the dynamic pages so
that Google doesn't see those pages as duplicate content.
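
As a rough illustration (the script name is a hypothetical
example, and the trailing wildcard rule is a Google-specific
extension to robots.txt), such a file could look like this:

    # Hypothetical robots.txt: keep Googlebot away from the
    # dynamic pages so the static copies are not treated as
    # duplicate content.
    User-agent: Googlebot
    Disallow: /catalog.php
    Disallow: /*?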

Q. Does Google index sites that use ASP?
A. Yes. Google is able to index most types of pages and files
with very few exceptions. This includes pdf, asp, jsp, html,
shtml, xml, doc, xls, ppt, rtf, wks, lwp, wri, swf, cfm, and
php. This is not a complete list, but it gives a good overview.

Q. Does Google index sites that use Macromedia Flash?
A. Yes. Google indexes pages that use Macromedia Flash. However,
Google may have problems indexing Flash pages. If you are
concerned that your Flash content is inhibiting Google's ability
to crawl your site, you may want to consider creating HTML
copies of those Flash pages. As always, you will need to include
a robots.txt file that disallows the Flash pages so that Google
does not recognize those pages as duplicate content.

Q. How do I add my site to Google's search results?
A. According to Google, inclusion in Google's
search results is free and easy. They also state that it is
unnecessary to submit your site to Google. Google uses software
known as "spiders" to crawl the web on a regular basis and find
sites to add to the index.

When a spider misses a site, it is often because of one of the
following reasons:

1. The site is not well connected with other sites through an
inbound linking structure.

2. The site launched after Google's most recent crawl was
completed.

3. Poor web site design makes it difficult for Google to
effectively crawl your content.

4. The site was temporarily unavailable at the time of
crawling or an error was received. You can use Google
Sitemaps to see if the Google crawlers received errors
when trying to crawl your site.

Q. How can I get my web site into Google's Mobile index?
A. Google Mobile offers Google Web Search, Local Search, and
Image Search for web sites that are configured for mobile
devices. Google adds new sites to their mobile Web index every
time they crawl the Web.

To let Google know about your mobile site, it is best to submit
a Mobile Sitemap. To help ensure that Google's mobile crawlers
can crawl and index your site, you should:

* Use well-formed markup
* Validate your markup
* Use the right DOCTYPE and Content-Type for the markup language that you are using.

Q. Will participation in AdSense or AdWords affect my listing in
Google's free search results?

A. Google's advertising programs are independent of their search
results. Participation in an advertising program will have no
effect on your organic search engine rankings.

Q. Why does my site have a PageRank of zero?
A. Google has an answer for this as well. According to
Google, a page may be assigned a rank of zero if Google crawls
very few sites that link to that particular site. In addition to
this, pages that have recently been added to the Google index
may also show a PageRank of zero. This is simply because they
haven't been crawled by Googlebot yet and haven't been ranked
yet.

The key is to be patient. A page's PageRank score may increase
naturally with further crawls.

Q. My URL changed. How can I get Google to index my new site?
A. Google cannot manually change your URL in the search
results. However, there are steps you can take to ensure a
smooth transition.

First, you can redirect visitors to your new site. To do this,
simply use an HTTP 301 (permanent) redirect. This ensures that
Google's crawler will discover your new URL.
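
On an Apache server, a site-wide 301 can be sketched roughly
as follows (the domain name is a placeholder, and this assumes
an .htaccess file on the old site with mod_rewrite enabled):

    # Hypothetical .htaccess on the old address: send every
    # request, path included, to the new domain with a
    # permanent (301) redirect.
    RewriteEngine On
    RewriteRule ^(.*)$ http://www.my-new-domain.com/$1 [R=301,L]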

To preserve your rank, you will need to tell others who link to
your site about your change of address. To find a portion of the
sites that link to yours, you can go to the Google search engine
and type in: link:www.mydomain.com . To obtain a comprehensive
list of links that point to your page, perform a Google search
on your URL in quotes: "www.mydomain.com".

Q. How often does Google crawl the web?
A. Google's spiders crawl the web on a regular basis to rebuild
their index. Crawls are based on a number of factors, including
PageRank, links to a page, and a web site's structure. This is
just a small list. There are a variety of factors that can
affect the crawl frequency of individual sites.

Q. How do I create a Google friendly site?
A. To help Google find, index, and rank your site, it is
suggested that you follow their Webmaster Guidelines.

Here are some of the general guidelines that Google offers to
web masters:

* Have other relevant sites link to yours.

* Submit a sitemap.

* Submit your site to relevant directories such as
the Open Directory Project and Yahoo. For a complete
listing of web directories, go to Strongestlinks.com

* Make sure each and every page is reachable from at least
one static text link.

* Offer your visitors a site with links that point to the
most important parts of your site. If your sitemap is larger
than 100 links, you may want to break the site map into
separate pages.

* Keep the links on any given page to a reasonable number
(less than 100).

* Check for broken links and correct HTML.

* Create a useful site that is full of information-rich content.
Your pages should be written in a way that clearly and
accurately describes your content.

* Make sure that your TITLE and ALT tags are descriptive and
accurate.

* Use a text browser such as Lynx
to examine your web site. Most search engine spiders see your
site in much the same way as Lynx would.

* Allow search bots to crawl your sites without session Ids
or arguments that track their path through the site.

* Make use of the robots.txt file, which tells crawlers which
directories they can or cannot crawl.


Q. How can I report a site that is spamming the Google search
results?

A. Google is constantly working to improve the quality of their
search results. Therefore, they have implemented a program that
allows web searchers to report spam that they find within the
search engine results. These Spam Reports are submitted directly
to Google engineers and are used to devise long-term solutions
to fight spam.

However, before you submit a site as being spam, Google highly
suggests that you take a look at their webmaster guidelines
to determine if sites are acceptable or not.

http://www.google.com/contact/spamreport.html

Q. Why are sites blocked from the Google index?

A. Sites may be blocked from the Google index if they do not
meet certain quality standards. Google does not comment on the
individual reason for pages being removed. However, they do
reveal that certain actions such as cloaking, writing text that
can be seen by search engines but not by users, or setting up
pages/links with the sole purpose of fooling the search engines
may result in removal from the index.

If you receive a notification that your site violates Google's
quality guidelines, you can correct your site to meet their
guidelines and then request reinclusion.

So there you have it, some of the many tips that Google is
handing out for free. If you want to obtain high search engine
rankings for the long-term, Google actually provides some very
good advice.
===============================================
Kim Roach is a staff writer and editor for the SiteProNews
and SEO-News
newsletters.
You can contact Kim at:
Contact: kim@seo-news.com
================================================

Copyright © 2006 Jayde Online, Inc. All Rights Reserved.
SEO-News is a registered service mark of Jayde Online, Inc.

Article Posted on this Blog by Afzal Khan

More SEO Blogs

Article Posted on this blog by Afzal Khan

Here is some good news for all the readers of the Toprankseo Blog. As you guys have realised, we always look to add detailed and descriptive articles about SEO (Search Engine Optimisation) for all the novices who want to acquire skills and update their SEO knowledge, thus enhancing their skills. We are now going to give you information about blogs and websites publishing newer information on search engine optimisation.

In the current section I am providing you with the link to an SEO blog maintained by Sonika Mishra, an SEO expert in Delhi. As you all might be aware, Delhi is like an SEO hub for SEO experts in India, and I am sure you will all benefit from Sonika's efforts in publishing SEO articles.

About Sonika Mishra: she is a very knowledgeable and experienced person in this field. She has been sharpening her skills in the SEO field for the last 2 years and 6 months, writing articles for her own blog and regularly publishing good and interesting articles there.

Read out more about SEO articles at http://seo-expert-delhi.blogspot.com/.

Title :- SEO EXPERT DELHI-SEO & Web Promotion India

Description :- Search engine ranking, Search engine optimization, Search engine placement, Website optimization, Search engine positioning, Web site optimization, High search engine ranking, Web page optimization, Search engine promotion, Top search engine ranking, High search engine rankings, Search engine rankings, Better search engine placement, Web site optimization, High search engine placement, Search engines optimization, Website optimization.

Njoy!!!

Afzal Khan

Google Search Engine Optimization Pitfalls

By John Hill (c) 2006
Article Posted on this blog by Afzal Khan

On Page Factors - Is Your Website Search Engine Friendly?

So you have a website but where is it on Google? Have you fallen
foul of a penalty or have you overlooked one of the many common
search engine optimization pitfalls when designing your
site?

Understanding what works for the search engines and what doesn't
when it comes to the content on your website can have a crucial
impact on the relevance and/or page rank of your pages from an
SEO perspective.

Here we highlight common mistakes that could affect your ranking
on Google and other search engines.

Optimizing for the Correct Keywords

Basically 'Get real' about what keywords you feel your website
can be ranked for. If you have a ten page website in a highly
competitive market then ranking naturally for the major terms
will be close to impossible.

Use the Overture keyword tool together with the number of
results on Google to find out what keywords are searched for and
how many other websites are targeting them. If you are lucky
then you might even find a popular keyword that not many other
websites are optimized for. Alternatively a good tool for this
job is Wordtracker from Rivergold Associates Ltd.

Code Validation

If your html code is not valid, then this could make it very
difficult or even impossible for a search engine to separate
your page content from your code. If the search engine cannot
see your content, then your page will obviously have no
relevance.

Frames

Even though most, if not all, major search engines now index
frames and even with the use of the NOFRAMES tag, you run the
risk of your pages being displayed in the search engine results
out of context. As each individual page is indexed separately,
it is likely that your website visitors will not see your pages
within your frame and will effectively be stuck on the page they
arrive at.

If you must use frames then create a 'Home' link on each of your
individual content pages and point the link at your frameset
index page.
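
As a small hedged example (the file name is an assumption), a
link like the following on each content page reloads the full
frameset, because target="_top" breaks the page out of any
frame it landed in:

    <!-- placed on each individual content page -->
    <a href="/index.html" target="_top">Home</a>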

JavaScript Navigation

If you use JavaScript to control your website navigation, then
search engine spiders may have problems crawling your site. If
you must use JavaScript, then there are two options available to
you:

* Use the NOSCRIPT tag to replicate the JavaScript link in
standard HTML.

* Replicate your JavaScript links as standard HTML links in
the footer of your page.
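
To sketch the first option (the function and page names here
are hypothetical), the script-driven link is paired with a
plain HTML copy inside NOSCRIPT so spiders still have a
standard link to follow:

    <script type="text/javascript">
      // script-driven navigation that spiders may not execute
      document.write('<a href="javascript:showSection(\'products\')">Products</a>');
    </script>
    <noscript>
      <!-- plain HTML equivalent for spiders and script-less browsers -->
      <a href="/products.html">Products</a>
    </noscript>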

Flash Content

Currently only Google can index Macromedia Flash files; how
much or how little content they see is open to debate. So until
search engine technology is able to handle your .swf as standard
it would be advisable to avoid the use of these.

Again if you must use Flash then offer a standard HTML
alternative within NOEMBED tags.
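
A minimal sketch of that approach (the file names and text are
made up for illustration); the NOEMBED block is what spiders
and non-Flash browsers fall back to:

    <embed src="/flash/intro.swf" width="550" height="400">
    <noembed>
      <!-- plain HTML alternative for spiders and non-Flash visitors -->
      <p>Welcome to Example Widgets.
         <a href="/products.html">Browse our products</a>.</p>
    </noembed>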

Dynamic URLs

Although Google and Yahoo are able to crawl complicated URLs it
is still advisable to keep your URLs simple and avoid the use of
long query strings. Do not include session IDs in the URL as
these can either create a 'spider trap' where the spider indexes
the page over and over again or, at worst, your pages will not
get indexed at all.

If you do need to include parameters in the URL, then limit them
to two and the number of characters per parameter to ten or
less.

The best SEO solution for dynamic URLs is to use Mod-rewrite or
Multiviews on Apache.
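
As a rough illustration of the mod_rewrite approach (the paths
and parameter name are assumptions), a rule like the one below
publishes a simple, static-looking URL while the real dynamic
script still does the work:

    # Hypothetical .htaccess sketch: /products/123.html is what
    # the spider sees, /product.php?id=123 is what actually runs.
    RewriteEngine On
    RewriteRule ^products/([0-9]+)\.html$ /product.php?id=$1 [L]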

No Sitemap

A sitemap is the search engine optimization tool of choice to
ensure every page within your website is indexed by all search
engines. You should link to your site map from, at least, your
homepage but preferably from every page on your website.

If your website contains hundreds of pages then split the
sitemap into several categorized maps and link these all
together. Try and keep the number of links per page on a sitemap
to less than 100.

Excessive Links

Excessive links on a given page (Google recommends having no
more than 100) can lower its relevance and, although it does
not result in a ban, this does nothing for your search engine
optimization strategy.

Be Careful Who You Link To

As you have no control over who links to your website, incoming
links will not harm your rank. However, outbound links from your
website to 'bad neighbourhoods' like link farms will harm your
ranking.

As a rule ensure as many of your outbound links as possible link
to websites that are topical to your field of business.

Article Posted on this blog by Afzal Khan


=======================================
John Hill - Developer, Designer and SEO Professional with
E-Gain New Media (http://www.e-gain.co.uk) offering website
design (http://www.e-gain.co.uk/web-development/
website_development/web-site-design/), search engine optimization
(http://www.e-gain.co.uk/online_marketing/business_solutions/
search-engine-optimisation/) and PPC Management.
=======================================

The Advance Of Algorithms - New Keyword Optimization Rules

By Matt Jackson (c) 2006

Posted on this blog by Afzal Khan

Maintaining and marketing a website can be a difficult task
especially for those who are inexperienced or who have very
little experience. SEO rules are constantly changing and even
then, many SEO professionals disagree on the actual specifics
required to optimize a website. This is in no small part due to
the search engines themselves.

Major search engines like Google are constantly striving to
ensure that sites at the top of their result pages offer
invaluable information or service to their visitors. However,
webmasters who are looking to make quick money while offering
very little quality content are always finding new ways to beat
the search engines at their own game. For this reason, search
engines regularly change the methods they use to determine the
relevancy and importance of your site.

Evolving Search Engines

The first step you should take is to ensure that your website
will do an effective job of turning visitors into money. The
content needs to be optimized so that both search engine
visitors and human visitors deem it to be a useful website.
Once upon a time, effective optimization entailed cramming
content with as many keywords as possible and while this once
generated good search engine results it invariably put visitors
off. It is also now frowned upon and penalized as being spam by
all of the major search engines.

The Evolution And Improvement Of Algorithms

Search engines use specific algorithms to determine the
relevance of your website. The calculations from these
algorithms determine where on the search engine result pages
your website will appear. In order to keep the unscrupulous
webmasters guessing and ensuring that results are always up to
date, major search engines regularly update their algorithms.

Recent Advances

The result of some of the most recent changes has seen the
impetus move away from optimizing websites for search engines
and instead the algorithms are now geared to promote websites
that give true value to visitors. They're not only changing,
they are evolving into more intelligent and accurate algorithms.
While the use of keywords based around the relevant topic is
still important, it is also important to ensure that visitors
are your main priority.

Keyword Optimization

Keyword optimization is now more heavily guarded. Those who
include keywords too often will have their sites labeled as
spam, whereas not enough instances of the appropriate keyword
means you won't receive the desired results. However, the
algorithms have become particularly smart and as well as the
keywords you want to target you should include other relevant
keywords. Including inflexions of keywords is one excellent way
to ensure that your site is deemed to be relevant. Inflexions
are slight changes to your keyword. For example, inflexions of
the keyword "advertising" include advertise, advertised,
advertisement, etc...

Keyword Inclusion

Weight is also given to keywords that are included in certain
sections of a page. These sections include the title tag, meta
tags (only relevant to smaller search engines now), header tags,
image alt tags and formatting tags (e.g. keywords in bold or
italicized) of your text. With image alt tags and hyperlink
title tags it is important that you don't simply fill these with
keywords because this will be ignored at best, and penalized at
worst.
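
Pulling those locations together, here is a hedged HTML sketch
for a page targeting the keyword "advertising" (the file and
image names are made up for illustration):

    <html>
    <head>
      <title>Advertising Tips for Small Businesses</title>
      <meta name="description" content="Practical advertising advice for small businesses.">
      <meta name="keywords" content="advertising, advertise, advertisement">
    </head>
    <body>
      <h1>Advertising Tips</h1>
      <p>Effective <strong>advertising</strong> starts with knowing
         your audience and the keywords they actually search for.</p>
      <img src="/images/advertising-costs.gif"
           alt="Chart comparing advertising costs by medium">
    </body>
    </html>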

Natural Content Writing

One of the most effective ways to ensure that your site is
keyword optimized properly is to write the content naturally
first. Once you have done this, go through and ensure that any
relevant keywords are included throughout the text. Only place
them where they would appear naturally and remove them from
anywhere where they appear awkward. Once you've written the
content you should also check the remaining factors to ensure
everything is ok.

SEO Keyword Checklist

Below is a keyword checklist to ensure that you have fully
optimized your web pages to the current, generally accepted
search engine algorithm rules.

URL: Get your primary keyword as close to the beginning
of the URL as possible.

Title Tag: The title should be between 10 and 50
characters and include one or more keywords while still being
descriptive.

Description Meta Tag: The description meta tag should be
insightful and useful but it should also contain one or two of
your more important keywords.

Keyword Meta Tag: It makes sense that you should include
all of your keywords in the keyword meta tag. Do not include any
words that don't appear in the body of your text.

Keyword Density: Your content should be a mix of your
keywords and other text. A total keyword density (all keywords
combined) of around 15% to 20% is the maximum you should aim
for, and anything less than 5% is unlikely to yield good
results. Density for a single keyword should be between 1% and
7%: 1% seems too low and 7% a little too high, so wherever
possible aim for approximately 5% for the primary keyword and
3% for secondary and subsequent keywords (see the worked
example after this checklist).

Header Tags (e.g. H1 and H2 tags): More weight is given
to keywords that appear within H1 tags, then H2 tags and so on.

Text Formatting Fonts (e.g. strong, bold and underline):
This may not offer much weight in algorithms, but generally if
you bold the first instance of your keywords and the last
instance of your primary keyword you should see some positive
results.

Beginning Of Text: The closer you can get your keywords
to the beginning of your page content the better. Try to include
your primary keyword within the first sentence or two and also
within the last paragraph.

Key-Phrases As Whole Phrases: If you are targeting Internet
Marketing as a key phrase then do not split the words up if
possible. Some effect is noticed if the words are split, but
much more benefit is received by including the phrase as a
whole.

Alt Text: Include your keyword at least once in the Alt tag of
any images. Ensure that the text is relevant to the image and
gives some information.
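
As a rough worked example of the density figures in the
checklist (assuming the usual definition of density as keyword
occurrences divided by total words): on a 400-word page, a
primary keyword used 20 times gives 20 / 400 = 5%, and a
secondary keyword used 12 times gives 12 / 400 = 3%, both
within the suggested ranges.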

Posted on this blog by Afzal Khan

========================================================
Matt Jackson, founder of WebWiseWords (http://www.webwisewords.com),
is a professional copywriter offering a professional service.
Whether your business or your website needs a website content
copywriter, an SEO copywriter, a press release copywriter or a
copywriter (http://www.webwisewords.com) for any other purpose
WebWiseWords can craft the words you want.
========================================================

Google buys search algorithm invented by Israeli student

Source : haaretzdaily.com

Search engine giant Google recently acquired an advanced text search algorithm invented by Ori Alon, an Israeli student. Sources believe Yahoo and Microsoft were also negotiating with the University of New South Wales in Australia, where Alon is a doctoral student in computer science.

Google, Alon and the university all refused to comment, though Google confirmed that "Ori Alon works at Google's Mountain View, California offices."

The University acknowledged that Yahoo and Microsoft had conducted negotiations with its business development company.

Alon told TheMarker in an interview six months ago that the university had registered a patent on the invention.

Orion, as it is called, which Alon developed with faculty members, returns only the most relevant textual results. In addition, the software, which currently operates only in English, offers a list of topics directly related to the original source.

"For example, if you search information on the War of Independence, you'll receive a list of related words, like Etzel, Palmach, Ben-Gurion," he explained. The text will only appear on the results page if enough words relevant to the search and the link between them is reasonable. Orion also rates the texts by quality of the site in which they appear.