Wednesday, January 21, 2009

Web Development

Bits Infoware Solutions specializes in developing dynamic database-driven websites with user-friendly content management interfaces. This approach reduces the total cost of updating and maintaining your website by allowing non-specialists to handle the work themselves. In addition to timely, cost-effective, and professional work, Bits Infoware Solutions offers you our unique understanding of the social investing audience. Our clients include some of the most well-respected names in social investing.

If paid web hosting is more than your budget allows, consider free web hosting instead.

All Bits Infoware Solutions projects start with, and are carried through by, strong project management:

  • A Bits project manager will be your constant companion in the development process.
  • With their support and guidance, your needs and wishes will be communicated to our development team.
  • Our strategic planning professionals are available to work with you to achieve specific communications and data management needs.

Site development tasks are fulfilled through our areas of focus, including:

  • Website design and production
  • Database, ecommerce and other application development
  • Web multimedia development, including Flash
  • Streaming video and media hosting
  • Website localization/language translation
  • Site maintenance and support
  • Search engine placement

Bits Infoware Solutions is a full-service web development company serving the needs of organizations that take their web presence seriously.

Thursday, January 24, 2008

SEO

What is SEO?

Search engine optimization is the art and science of publishing information in a format which will make search engines believe that your content satisfies the needs of their users for relevant search queries. SEO, like search, is a field much older than I am. In fact, it was not originally even named search engine optimization, and to this day most people are still uncertain where that phrase came from.

Early SEO

Early search engine optimization consisted mostly of using descriptive file names, page titles, and meta descriptions. As search advanced, on-the-page factors grew more important, and people started aiming for specific keyword densities.
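
To make the keyword-density idea concrete, here is a minimal Python sketch of the metric early SEOs aimed for. The tokenizer and the sample page are invented for illustration; real engines processed text far more carefully.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the percentage of words in `text` that match `keyword`.

    A crude approximation of the metric early SEOs targeted.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical over-optimized page copy.
page = "Cheap widgets! Our widgets are the best widgets for widget lovers."
print(f"{keyword_density(page, 'widgets'):.1f}%")  # ~27% - far too dense
```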

Link Analysis

One of the big things that gave Google an advantage over their competitors was the introduction of PageRank, which graded the value of a page based on the number and quality of links pointing at it. Up until the end of 2003 search was exceptionally easy to manipulate. If you wanted to rank for something all you had to do was buy a few powerful links and place the words you wanted to rank for in the link anchor text.
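
Since the PageRank idea itself is public (from Brin and Page's original paper), a toy power-iteration sketch can show the core intuition: a page's score depends on the number and quality of pages linking to it. The three-page graph below is hypothetical, and real PageRank runs at web scale with many refinements.

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iters: int = 50) -> dict[str, float]:
    """Toy PageRank: each page splits its score evenly among its outlinks."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:
                for out in outs:
                    new[out] += damping * rank[page] / len(outs)
        rank = new
    return rank

# Hypothetical three-page web: two pages link to 'hub'.
graph = {"a": ["hub"], "b": ["hub"], "hub": ["a"]}
print(pagerank(graph))  # 'hub' ends up with the highest score
```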

Search Gets More Sophisticated

On November 15, 2003 Google began introducing many more semantic elements into its search product. Researchers and SEOs alike have noticed wild changes in search relevancy during that update and many times since, but many searchers remain oblivious to the changes.

Search engines would prefer to bias search results toward informational resources to make the commercial ads on the search results appear more appealing. You can see an example of how search can be biased toward commercial or informational resources by playing with Yahoo! Mindset.

Curbing Link Spam

On January 18, 2005, Google, MSN, and Yahoo! announced the release of the rel="nofollow" link attribute, which allows blog owners to block comment spam from passing link popularity. People continued to spam blogs and other resources, partly because search engines may still count some nofollow links, and partly because many of the pages they spammed still rank.
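
As a concrete illustration, here is a minimal Python sketch of how a crawler might skip rel="nofollow" links when collecting links that are allowed to pass link popularity. The class name and sample HTML are invented for illustration.

```python
from html.parser import HTMLParser

class FollowedLinkCollector(HTMLParser):
    """Collect hrefs from <a> tags, skipping any marked rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.followed: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rels = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rels and attrs.get("href"):
            self.followed.append(attrs["href"])

markup = '''<a href="http://example.com/">editorial link</a>
<a href="http://spam.example/" rel="nofollow">comment spam</a>'''
parser = FollowedLinkCollector()
parser.feed(markup)
print(parser.followed)  # ['http://example.com/']
```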

Since 2003 Google has come out with many advanced filters and crawling patterns to help make quality editorial links count more and to devalue many overtly paid links and other forms of link manipulation.

Historical, Editorial, & Usage Data

Older websites may be given more trust in relevancy algorithms than newer websites (just existing for a period of time is a signal of quality). All major search engines use human editors to help review content quality and improve their relevancy algorithms. Search engines may also factor in user acceptance and other usage data to help determine whether a site needs to be reviewed for editorial quality and whether linkage data is legitimate.

Google has also heavily pushed giving away useful software, tools, and services which allow them to personalize search results based on the searcher's historical preferences.

Self Reinforcing Market Positions

In many verticals search is self-reinforcing, a winner-take-most battle. Jakob Nielsen's The Power of Defaults notes that the top search result is clicked on as often as 42% of the time. Not only are the distribution and traffic stream highly disproportionate, but many people tend to link to the results that were easy to find, which makes the system even more self-reinforcing, as noted in Mike Grehan's Filthy Linking Rich.

A key thing to remember if you are trying to catch up with another website is that you have to do better than what was already done, and significantly enough better that it is comment-worthy or citation-worthy. You have to make people want to switch their world view to seeing you as an authority on your topic. Search engines will follow what people think.

Hypocrisy in Search

Google engineer Matt Cutts frequently comments that any paid link should have the nofollow attribute applied to it, although Google hypocritically does not place the nofollow attribute on links they buy. They also have placed their ads on the leading Warez site and continued to serve ads on sites that they banned for spamming. Yahoo! Shopping has also been known to be a big link buyer.

Tuesday, January 22, 2008

Search Engine Optimization

Optimistic about your search placement?

Consider how profitable you could be if your prospective customers find your website near the top of search results for the keywords used in your industry. How much of that keyword traffic is currently going to your competitors’ sites instead of yours?

Search Engine Optimization

Today there will be millions of searches performed on the Internet, many of them looking for exactly what you are selling. The process of getting your website to the top of the search engines is known as search engine optimization (SEO). SEO has proven to be one of the most cost-effective forms of marketing available. Bits Infoware Solutions is recognized as a leader in the art and science of search engine optimization by Fortune 500 companies, emerging Internet entities, small businesses, and professional firms.

Search optimization is a hyper-competitive endeavor that requires intense focus and a thorough, up-to-date understanding of how search engine algorithms and robots operate. The staff at Bits Infoware Solutions has been in the search engine marketing industry for over a decade, and our tactics continuously evolve to meet the needs of our clients. In the uncertain world of search engine optimization, our straightforward approach has consistently delivered results, which is why we are so optimistic about search marketing. We apply our proven SEO process so your website can achieve that same high level of success.

In addition to search engine optimization, we provide other Internet marketing services including: search-friendly web design, pay-per-click (PPC) advertising campaigns, link building programs, social media optimization, blog optimization and marketing, targeted local search marketing, and mobile marketing optimization.

Every “Johnny-come-lately” can claim to be the best Internet marketing firm around, but the real proof is in the results. Bits Infoware Solutions has true success stories from real-world clients to show you what you can expect in terms of return on investment. We understand that top rankings are important for your business, but even more important is that those top search rankings are bringing in targeted traffic that converts to sales.

If you’re ready to find out how we can improve your website’s search engine ranking, request a proposal.

Be sure to check out our new Bits Infoware Solutions blog, where we answer SEO questions and offer tips on optimizing your website.

Request a free ranking report to see how your site ranks in the search engines. If you have any other questions, please contact us.

About Bits Infoware Solutions

Bits Infoware Solutions is a search engine optimization firm located in Ghaziabad.

Each member of the Bits Infoware Solutions staff is an expert in various aspects of search marketing. Our primary areas of expertise include search engine optimization, paid search marketing, affiliate marketing, search-friendly web design, social networking, and link building. Bits Infoware Solutions has served clients ranging from tiny startup companies to large corporations, achieving a high level of success with each client.

The mission of Bits Infoware Solutions is to give businesses of all sizes the opportunity to harness the power of search engines as a marketing medium. When properly utilized, search engine optimization gives small businesses an opportunity to compete at the same level as larger corporations. We are able to get your company in front of prospective customers at the precise moment when they are looking for your products or services.

Our search engine optimization philosophy is to work within the search engines’ guidelines to provide content that is optimized for the best keywords. We make the necessary adjustments to your website, and get you linked in all the right places, to make the job of finding your website easier. By providing the search engines with quality, relevant content, we deliver long-term results in the form of more traffic and more sales to your website.

Services

Search Engine Optimization & Marketing Services

Our search engine marketing services are customized to meet your goals and objectives. There are several factors that will influence the cost and time required to market your site in the search engines. These factors include your competition, how much marketing you have already done for your site, your website’s content and theme, your business objectives, and of course, your search engine marketing budget. For a customized quote on services for your website, request a proposal.

Keyword Research

A very important service we provide is to help identify the best possible keywords to target with your online marketing campaigns. Intelligent keyword choice is essential for reaching the right audience. We will take your existing keyword list and expand the list to include many more keywords and phrases that are relevant to your business. We will then recommend keywords to target based on keyword search frequency and relevance to your website.

Competitive Analysis

We take an in-depth look at the competitive marketplace for your chosen keywords. Often you will find many companies competing for a share of the traffic on your keywords, even though they may not be direct competitors. It is important to understand what the competition is doing so you can identify key areas of weakness and other opportunities. Our competitive analysis reports allow you to do this.

Search Engine Optimization (SEO)

Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a web site from search engines via “organic” search results. The organic search results are the ones that appear in the main body of the search results page. Studies have shown that these organic search listings are clicked on more often than the sponsored search listings, which is why it is so crucial that companies optimize their websites for the best possible placement. Our SEO process is based on over 10 years of research and experience getting our clients’ sites to the top of the search engines.

Paid Search Marketing (PPC)

Pay-Per-Click (PPC) ads are the sponsored listing advertisements that appear at the top or on the right side of the organic search listings. Each time someone clicks on your PPC ad, you pay the search engines for that click. Our PPC marketing experts set up and manage cost-efficient PPC campaigns. PPC advertising offers the benefits of full control over your ad copy and keyword targeting. Combined with an SEO campaign, PPC can also be a very effective tool to increase your brand’s exposure in the search engines and increase sales.

Search-Friendly Web Design

Too often web designers build websites that look pretty, but don’t get results. With our web design services, we focus on a website that works–for your customers and the search engines.

Local Search Optimization

Our local search optimization services help businesses be found by people searching for local businesses. The amount of people searching for local services continues to increase, so the opportunity to get more business from local search is also growing.

Social Media Optimization

Social media has emerged as a huge part of the web landscape. Our staff members are constantly studying the social networking arena and looking for opportunities to use social media to increase our clients’ website exposure. When properly executed, social profiles on social networking sites improve your image, build your brand, and send more traffic to your website. We can create a social media optimization strategy that will work for your business.

Contextual Advertising

Contextual advertisements are text ads that show up within content on targeted websites. Similar to PPC search ads, these ads can be targeted either by keywords that are relevant to your ad, or by the website URL of the site displaying the ad.

Website Analytics

We analyze your website analytics reports to determine which keywords and search engines are driving traffic to your site, and which of those keywords convert to a new customer. This type of data helps us understand what is working and make the necessary adjustments to fully optimize your search marketing campaigns.
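
As a rough illustration of this kind of analysis, the sketch below tallies search phrases out of HTTP referrer URLs. The log lines and the engine-to-parameter mapping ('q' for Google and MSN, 'p' for Yahoo!) are era-appropriate assumptions, not a definitive parser.

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Query-string parameter each engine used for the search terms
# ('p' for Yahoo!, 'q' for Google and MSN) -- treat these as assumptions.
ENGINE_PARAMS = {"google": "q", "search.msn": "q", "yahoo": "p"}

def keywords_from_referrers(referrers: list[str]) -> Counter:
    """Tally which search phrases sent visitors, per the HTTP referrer."""
    counts: Counter = Counter()
    for ref in referrers:
        parsed = urlparse(ref)
        for engine, param in ENGINE_PARAMS.items():
            if engine in parsed.netloc:
                for phrase in parse_qs(parsed.query).get(param, []):
                    counts[phrase.lower()] += 1
    return counts

log = ["http://www.google.com/search?q=blue+widgets",
       "http://search.yahoo.com/search?p=blue+widgets",
       "http://www.google.com/search?q=widget+repair"]
print(keywords_from_referrers(log).most_common())
```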

Link Building

One very important factor in how well your site shows up in search results is the number of high-quality, relevant links pointing to your site. We use various methods to obtain these high quality links to improve your rankings in the search engines.

Content Development/Copywriting

Our writers produce high quality, unique content that maintains your marketing message and is also optimized for the search engine spiders. Unique content is important for SEO, but that same content needs to sell your product or service to your customers. We work with you to develop a web content strategy for delivering fresh content on your website.

Blog Marketing

A blog is one of the best ways to produce fresh, unique, search engine friendly content for your site, as well as dramatically increase customer experience. Whether you need help setting up your blog, updating it, or just need some advice, we have the experience and results to help you increase your traffic, sales, and blog visibility.

Online Public Relations/Press Release Optimization

Online public relations is the process of knowing where your customers and constituents spend their time online, and influencing those people using better, faster and less-expensive PR and marketing communications. We can optimize your press releases and communications for the search engines and help spread your message to the people who are interested, to reach a much larger and more targeted audience than traditional PR.

SEO Process

Over the past decade, we have developed a six-step process for obtaining top rankings in the search engines. The following is an outline of our search engine optimization process:

Research Keywords

The first thing we do is create a robust keyword portfolio of relevant keyword phrases that are searched on by potential customers. We begin with your input on what your most important keywords are, then expand that list by cross-checking the list against the search engines’ databases of actual keyword searches. This helps us target the right words—the ones that will actually deliver customers to your website.

Analyze Competitors

We then determine what your top competitors are doing in regards to search engine optimization. We look at how many links they have, where they are getting those links, and the keywords they are optimizing their websites for. This gives us the edge to see what is working to get top results for your keywords so we can improve upon those strategies and move your site above your competitors in the search engines.

Develop Search Strategy

After careful analysis of your competitors, we will formulate a strategy for improving your overall search engine visibility. We will lay out a step-by-step plan for what needs to happen for your site to achieve optimal results for your keywords.

Optimize Website Content

Based on your chosen keywords, we will make the necessary adjustments to your website to include important keyword phrases in the right places. This will enable your site to show up for your selected keywords. We will also make recommendations for additional content or keyword-related topics to include on your website to attract additional keyword search traffic.

Link Building

Links from other websites are a very important part of getting your site ranked well in the search engines. We recruit links to your website from various sources. We submit your site to the most appropriate categories in all the major web directories. We then identify additional niche directories to submit your site to, including local and industry-specific sites. Additional links are obtained from online press releases, articles, and other sources. Link building is an ongoing process that must be done continually to stay ahead of the competition.

Monitor and Adjust

Each month we provide ranking reports to track the status of your site in the search engines. We monitor your website traffic, paying close attention to where the traffic is coming from and which keywords people use to find your website. We are able to analyze which keywords are most effective in generating leads or sales and adjust our search engine optimization tactics to focus on the most productive keywords.

How Search Engines Work: Search Engine Relevancy Reviewed

A comprehensive review of search engine relevancy algorithms, published by Aaron Wall on June 13, 2006. While some of the general details have changed, the major themes referenced in this article were still relevant when I reviewed it a year after publication.

Why Did I Write This Article?

Many people think a search engine is broken or their site is banned if it does not rank in one engine while it ranks well in others. This article will hopefully help searchers understand the different relevancy criteria used by different engines. While people have looked at search engine ranking factors on a global level, I do not think anyone has spent much time comparing and contrasting how different search engines compute their relevancy scores or bias their algorithms.

I think one of the hardest parts of explaining SEO is that the web is such a diverse place where things that make sense in one industry would make no sense in another industry. In addition, different engines look for different things to determine search relevancy. Things that help you rank in one engine could preclude you from ranking in another.

There is enough traffic out there that it might make sense to do ultra-aggressive SEO on one burnable site for quick rankings in MSN and Yahoo! while using slow-growth, long-term, low-risk techniques on a site you eventually want to rank in Google.

I thought it would be worth compiling notes comparing how relevancy is defined by each engine (or how I perceive it based on my experiences). This page makes no aim to be comprehensive, but is designed more for making it easy for people new to the web to understand the differences between the different engines.

Bias is Universal. The Definition of Relevancy is Not.

Each major large-scale search engine is run by a large corporation. Each of these corporations is a for-profit entity with its own underlying principles and core beliefs which help guide how it crafts search relevancy. While some machines automatically evolve search relevancy via genetic algorithms, each major search engine still has some human input in how relevancy is calculated (at the very least, humans write some of the algorithms).

The Bias of this Article

When I started writing this article I wanted it to approach search relevancy from more of an academic standpoint, but my perspective on search is that of someone who understands it from a marketing perspective.

The Short Version

Yahoo!

· has been in the search game for many years

· is better than MSN but nowhere near as good as Google at determining if a link is a natural citation or not.

· has a ton of internal content and a paid inclusion program, both of which give them an incentive to bias search results toward commercial results

· things like cheesy off-topic reciprocal links still work great in Yahoo!

MSN Search

· new to the search game

· is bad at determining if a link is natural or artificial in nature

· due to sucking at link analysis they place too much weight on the page content

· their poor relevancy algorithms cause a heavy bias toward commercial results

· likes bursty recent links

· new sites that are generally untrusted in other systems can rank quickly in MSN Search

· things like cheesy off-topic reciprocal links still work great in MSN Search

Google

· has been in the search game a long time, and saw the web graph when it was much cleaner than the current web graph

· is much better than the other engines at determining if a link is a true editorial citation or an artificial link

· looks for natural link growth over time

· heavily biases search results toward informational resources

· trusts old sites way too much

· a page on a site or subdomain of a site with significant age or link related trust can rank much better than it should, even with no external citations

· they have aggressive duplicate content filters that filter out many pages with similar content

· if a page is obviously focused on a term they may filter the document out for that term. On-page variation and link anchor text variation are important. A page with a single reference or a few references to a modifier will frequently outrank pages that are heavily focused on a search phrase containing that modifier

· crawl depth is determined not only by link quantity, but also by link quality. Excessive low-quality links may make your site less likely to be crawled deeply or even included in the index

· things like cheesy off-topic reciprocal links are generally ineffective in Google when you consider the associated opportunity cost

Ask

· looks at topical communities

· due to their heavy emphasis on topical communities they are slow to rank sites until they are heavily cited from within their topical community

· due to their limited market share they probably are not worth paying much attention to unless you are in a vertical where they have a strong brand that drives significant search traffic

The long version...

Yahoo! Search

Yahoo! was founded in 1994 by David Filo and Jerry Yang as a directory of websites. For many years they outsourced their search service to other providers, but by the end of 2002 they realized the importance and value of search and started aggressively acquiring search companies.

Overture purchased AllTheWeb and AltaVista. Yahoo! purchased Inktomi (in December 2002) and then consumed Overture (in July 2003), combining the technologies from the various search companies they bought to make a new search engine. Yahoo! dumped Google in favor of their own in-house technology on February 17th, 2004.

Yahoo! has a cool Netrospective of their first 10 years, and Bill Slawski posted a list of many of the companies Yahoo! consumed since Overture.

On Page Content

Yahoo! offers a paid inclusion program, so when Yahoo! Search users click on high ranked paid inclusion results in the organic search results Yahoo! profits. In part to make it easy for paid inclusion participants to rank, I believe Yahoo! places greater weight on on-the-page content than a search engine like Google does.

Being the #1 content destination site on the web, Yahoo! has a boatload of their own content which they frequently reference in the search results. Since they have so much of their own content and make money from some commercial organic search results it might make sense for them to bias their search results a bit toward commercial websites.

Using descriptive page titles and page content goes a long way in Yahoo!

In my opinion their results seem to be biased more toward commerce than informational sites, when compared with Google.

Crawling

Yahoo! is pretty good at crawling sites deeply so long as they have sufficient link popularity to get all their pages indexed. One note of caution is that Yahoo! may not want to deeply index sites with many variables in the URL string, especially since

· Yahoo! already has a boatload of their own content they would like to promote (including verticals like Yahoo! Shopping)

· Yahoo! offers paid inclusion, which can help Yahoo! increase revenue by charging merchants to index some of their deep database contents.

You can use Yahoo! Site Explorer to see how well they are indexing your site and which sites link at your site.

Query Processing

Certain words in a search query are better at defining the goals of the searcher. If you search Yahoo! for something like "how to SEO" many of the top ranked results will have "how to" and "SEO" in the page titles, which might indicate that Yahoo! puts quite a bit of weight even on common words that occur in the search query.

Yahoo! seems to be more about text matching when compared to Google, which seems to be more about concept matching.
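
To illustrate what literal text matching means, here is a toy Python scorer that counts query-term occurrences and weights title hits more heavily. The weights are invented for illustration and are not Yahoo!'s actual formula.

```python
def literal_match_score(query: str, title: str, body: str) -> float:
    """Naive text-matching relevance: every query word counts, and
    title occurrences count extra. Weights (3x / 1x) are made up."""
    score = 0.0
    for term in query.lower().split():
        score += 3.0 * title.lower().split().count(term)  # title hit
        score += 1.0 * body.lower().split().count(term)   # body hit
    return score

print(literal_match_score("how to seo",
                          "How to SEO Your Site",
                          "A how to guide covering SEO basics."))  # 12.0
```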

Link Reputation

Yahoo! is still fairly easy to manipulate using low- to mid-quality links and somewhat too aggressively focused anchor text. Rand Fishkin recently posted about many Technorati pages ranking well for their core terms in Yahoo!. Those pages primarily have the exact same anchor text in almost all of the links pointing at them.

Sites with the trust score of Technorati may be able to get away with more unnatural patterns than most webmasters can, but I have seen sites flamethrown with poorly mixed anchor text on low quality links, only to see the sites rank pretty well in Yahoo! quickly.

Page vs Site

A few years ago at a Search Engine Strategies conference Jon Glick stated that Yahoo! looked at both links to a page and links to a site when determining the relevancy of a page. Pages on newer sites can still rank well even if their associated domain does not have much trust built up yet so long as they have some descriptive inbound links.

Site Age

Yahoo! may place some weight on older sites, but the effect is nowhere near as pronounced as the effect in Google's SERPs.

It is not unreasonable for new sites to rank in Yahoo! in as little as 2 or 3 months.

Paid Search

Yahoo! prices their ads in an open auction, with the highest bidder ranking highest. By early 2007 they aim to make Yahoo! Search Marketing more of a closed system which factors clickthrough rate (and other algorithmic factors) into their ad ranking algorithm.

Yahoo! also offers a paid inclusion program which charges a flat rate per click to list your site in Yahoo!'s organic search results.

Yahoo! also offers a contextual ad network. The Yahoo! Publisher program does not have the depth that Google's ad system has, and they seem to be trying to make up for that by biasing their targeting to more expensive ads, which generally causes their syndicated ads to have a higher click cost but lower average clickthrough rate.

Editorial

Yahoo! has many editorial elements to their search product. When a person pays for Yahoo! Search Submit that content is reviewed to ensure it matches Yahoo!'s quality guidelines. Sites submitted to the Yahoo! Directory are reviewed for quality as well.

In addition to those two forms of paid reviews, Yahoo! also frequently reviews their search results in many industries. For competitive search queries some of the top search results may be hand coded. If you search for Viagra, for example, the top 5 listings looked useful, and then I had to scroll down to #82 before I found another result that wasn't spammy.

Yahoo! also manually reviews some of the spammy categories somewhat frequently and then reviews other samples of their index. Sometimes you will see a referral like http://corp.yahoo-inc.com/project/health-blogs/keepers if they reviewed your site and rated it well.

Sites which have been editorially reviewed and were of decent quality may be given a small boost in relevancy score. Sites which were reviewed and are of poor quality may be demoted in relevancy or removed from the search index.

Yahoo! has published their content quality guidelines. Some sites that are filtered out of search results by automated algorithms may return if the site cleans up the associated problems, but typically if any engine manually reviews your site and removes it for spamming you have to clean it up and then plead your case. You can request to have your domain evaluated for re-inclusion using this form.

Social Aspects

Yahoo! firmly believes in the human aspect of search. They paid many millions of dollars to buy Del.icio.us, a social bookmarking site. They also have a similar product native to Yahoo! called My Yahoo!

Yahoo! has also pushed a question answering service called Yahoo! Answers which they heavily promote in their search results and throughout their network. Yahoo! Answers allows anyone to ask or answer questions. Yahoo! is also trying to mix amateur content from Yahoo! Answers with professionally sourced content in verticals such as Yahoo! Tech.

Yahoo! SEO Tools

Yahoo! has a number of useful SEO tools.

· Overture Keyword Selector Tool - shows prior month search volumes across Yahoo! and their search network.

· Overture View Bids Tool - displays the top ads and bid prices by keyword in the Yahoo! Search Marketing ad network.

· Yahoo! Site Explorer - shows which pages Yahoo! has indexed from a site and which pages they know of that link at pages on your site.

· Yahoo! Mindset - shows you how Yahoo! can bias search results more toward informational or commercial search results.

· Yahoo! Advanced Search Page - makes it easy to look for .edu and .gov backlinks while doing link:http://www.site.com/page.html searches (links to an individual page) or linkdomain:www.site.com/ searches (links to any page on a particular domain)

· Yahoo! Buzz - shows current popular searches

Yahoo! Business Perspectives

Being the largest content site on the web means Yahoo! runs into some conflicts of interest, since Yahoo! properties are themselves large internal customers. For example, Yahoo! Shopping was a large link buyer for a period of time while Yahoo! Search pushed the position that they didn't agree with link buying. Offering paid inclusion and having so much internal content makes it sensible for Yahoo! to have a somewhat commercial bias to their search results.

They believe strongly in the human and social aspects of search, pushing products like Yahoo! Answers and My Yahoo!.

I think Yahoo!'s biggest weakness is the diverse set of things that they do. In many fields they not only have internal customers, but in some fields they have product duplication, like with Yahoo! My Web and Del.icio.us.

Search Marketing Perspective

I believe if you do standard textbook SEO practices and actively build quality links it is reasonable to expect to be able to rank well in Yahoo! within 2 or 3 months. If you are trying to rank for highly spammed keyword phrases keep in mind that the top 5 or so results may be editorially selected, but if you use longer tail search queries or look beyond the top 5 for highly profitable terms you can see that many people are indeed still spamming them to bits.

As Yahoo! pushes more of their vertical offerings it may make sense to give your site and brand additional exposure to Yahoo!'s traffic by doing things like providing a few authoritative answers to topically relevant questions on Yahoo! Answers.

Learn More

* Yahoo! Search Content Quality Guidelines

* Yahoo! Search Help

* Yahoo! Search Blog

* Yahoo! Search Submit - paid inclusion

* Yahoo! Publisher Search Blog - blog discussing Yahoo!'s contextual ad product

* Yahoo! Research - Yahoo!'s research lab

Worker Blogs

* Jeremy Zawodny - probably known as one of the top 10 most famous bloggers on the web

* Tim Converse - it is surprising his blog doesn't get more comments on it for how interesting it is

* Tim Mayer - are you going to update this anytime soon Tim?

MSN Search

MSN Search had many incarnations, being powered by the likes of Inktomi and Looksmart for a number of years. After Yahoo! bought Inktomi and Overture it was obvious to Microsoft that they needed to develop their own search product. They launched their technology preview of their search engine around July 1st of 2004. They formally switched from Yahoo! organic search results to their own in house technology on January 31st, 2005. MSN announced they dumped Yahoo!'s search ad program on May 4th, 2006.

On Page Content

Using descriptive page titles and page content goes a long way to help you rank in MSN. I have seen examples of many domains that ranked for things like

state name + insurance type + insurance

on sites that were not very authoritative which only had a few instances of state name and insurance as the anchor text. Adding the word health, life, etc. to the page title made the site relevant for those types of insurance, in spite of the site having few authoritative links and no relevant anchor text for those specific niches.

Additionally, internal pages on sites like those can rank well for many relevant queries just by being hyper focused, but MSN currently drives little traffic when compared with the likes of Google.

Crawling

MSN has gotten better at crawling, but I still think Yahoo! and Google are much better at it. It is best to avoid session IDs, sending bots cookies, or using many variables in URL strings. MSN is nowhere near as comprehensive as Yahoo! or Google at crawling deeply through large sites like eBay.com or Amazon.com.
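
As one concrete way to act on that advice, the sketch below strips session-style parameters from URLs so every crawler sees one stable address per page. The parameter list is illustrative, not exhaustive.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameter names that commonly carry session state (illustrative list).
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def canonical_url(url: str) -> str:
    """Drop session-style query parameters so crawlers see one stable URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("http://shop.example/item?id=42&PHPSESSID=abc123"))
# http://shop.example/item?id=42
```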

Query Processing

I believe MSN might be a bit better than Yahoo! at processing queries for meaning instead of taking them quite so literally, but I do not believe they are as good as Google is at it.

While MSN offers a tool that estimates how commercial a page or query is, I think their inability to distinguish quality links from low-quality links makes their results exceptionally biased toward commercial results.

Link Reputation

By the time Microsoft got in the search game the web graph was polluted with spammy and bought links. Because of this, and Microsoft's limited crawling history, they are not as good as the other major search engines at telling the difference between real organic citations and low quality links.

MSN search reacts much more quickly than the other engines at ranking new sites due to link bursts. Sites with relatively few quality links that gain enough descriptive links are able to quickly rank in MSN. I have seen sites rank for one of the top few dozen most expensive phrases on the net in about a week.

Page vs Site

I think all major search engines consider site authority when evaluating individual pages, but with MSN it seems as though you do not need to build as much site authority as you would to rank well in the other engines.

Site Age

Due to MSN's limited crawling history and the web graph being highly polluted before they got into search they are not as good as the other engines at determining age related trust scores. New sites doing general textbook SEO and acquiring a few descriptive inbound links (perhaps even low quality links) can rank well in MSN within a month.

Paid Search

Microsoft's paid search product, AdCenter, is the most advanced search ad platform on the web. Like Google, MSN ranks ads based on both max bid price and ad clickthrough rate. In addition to those relevancy factors MSN also allows you to place adjustable bids based on demographic details. For example, a mortgage lead from a wealthy older person might be worth more than an equivalent search from a younger and poorer person.
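
A toy sketch of that ranking idea: score each ad by expected revenue per impression (bid times clickthrough rate), scaled by an AdCenter-style demographic bid adjustment. All the numbers are hypothetical.

```python
def ad_rank_score(max_bid: float, ctr: float, demo_multiplier: float = 1.0) -> float:
    """Toy ad score: expected revenue per impression (bid x CTR),
    scaled by a demographic bid adjustment (all values hypothetical)."""
    return max_bid * ctr * demo_multiplier

# Hypothetical mortgage advertisers. A demographic multiplier (say, +50%
# for older, wealthier searchers) can flip who wins the auction.
ads = {
    "lender_a": ad_rank_score(5.00, 0.010),                       # high bid, weak ad
    "lender_b": ad_rank_score(4.00, 0.020),                       # better clickthrough
    "lender_c": ad_rank_score(4.00, 0.015, demo_multiplier=1.5),  # demographic boost
}
print(max(ads, key=ads.get))  # lender_c wins despite the lower raw bid
```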

Editorial

All major search engines have internal relevancy measurement teams. MSN seems to be highly lacking in this department, or they are trying to use the fact that their search results are spammy as a marketing angle.

MSN is running many promotional campaigns to try to get people to try out MSN Search, and in many cases some of the searches they are sending people to have bogus spam or pornography-type results in them. A good example of this is when they used Stacy Keibler to market their Celebrity Maps product. As of this writing, their top search result for Stacy Keibler is still pure spam.

Based on MSN's lack of feedback or concern toward the obvious search spam noted above on a popular search marketing community site, I think MSN is trying to automate much of their spam detection, but it is not a topic you see people talk about very often. Here are MSN's Guidelines for Successful Indexing, but they still have a lot of spam in their search results. ;)

Social Aspects

Microsoft continues to lag in understanding what the web is about. Executives there should read The Cluetrain Manifesto. Twice. Or maybe three times.

They don't get the web. They are a software company posing as a web company.

They launch many products as though they still have the market stranglehold monopolies they once enjoyed, and as though they are not rapidly losing them. Many of Microsoft's most innovative moves get little coverage because when they launch key products they often do so without supporting other browsers, while trying to lock you into logging in to Microsoft.

MSN SEO Tools

MSN has a wide array of new and interesting search marketing tools. Their biggest limiting factor is MSN's limited search market share.

Some of the more interesting tools are

· Keyword Search Funnel Tool - shows terms that people search for before or after they search for a particular keyword

· Demographic Prediction Tool - predicts the demographics of searchers by keyword or site visitors by website

· Online Commercial Intention Detection Tool - estimates the probability of a search query or web page being commercial, informational, or transactional

· Search Result Clustering Tool - clusters search results based on related topics

You can view more of their tools under the demo section at Microsoft's Adlab.

MSN Business Perspectives

Microsoft has too many search brands for a company building its own technology in house.

They have MSN Search, Microsoft AdCenter, and Windows Live Search. All these things are pretty much the same and are meshed together; the only difference between them is that Microsoft does not know which brand they want to push.

Microsoft also heavily undermines their own credibility by recommending doorway page generator software and fake Alexa traffic generator software.

It seems as though Microsoft is big, slow moving, and late to the game.

Search Marketing Perspective

I believe if you do standard textbook SEO practices and actively build links it is reasonable to expect to be able to rank well in MSN within about a month. If you are trying to rank for highly spammed keyword phrases keep in mind that many of the top results will have thousands and thousands of spammy links. The biggest benefit to new webmasters trying to rank in Microsoft is how quickly they rank new sites which have shown inbound link bursts.

One note of caution with Microsoft Search is that they are so new to the market that they are rapidly changing their relevancy algorithms as they try to play catch up with Yahoo! and Google, both of which had many years of a head start on them. Having said that, expect that sometimes you will rank where your site does not belong, and over time some of those rankings may go away. Additionally sometimes they may not rank you where you do belong, and the rankings will continue to shift to and fro as they keep testing new technologies.

Microsoft has a small market share, but the biggest things a search marketer has to consider with Microsoft are their vast vats of cash and their dominance on the operating system front.

So far they have lost many distribution battles to Google, but they picked up Amazon.com as a partner, and they can use their operating system software pricing to gain influence over computer manufacturer related distribution partnerships.

The next version of Internet Explorer will integrate search into the browser. This may increase the overall size of the search market by making search more convenient, and boost Microsoft's share of the search pie. This will also require search engines to bid for placement as the default search provider, and nobody is sitting on as much cash as Microsoft is.

Microsoft has one of the largest email user bases. They have been testing integrating search and showing contextually relevant ads in desktop email software. Microsoft also purchased Massive, Inc., a firm which places ads in video games.

Microsoft users tend to be default users who are less advertising-averse than the typical Google user. Even though Microsoft has a small market share, they should not be overlooked, given their primitive search algorithms (and thus the ease of relevancy manipulation), defaultish users, and the potential market growth opportunity associated with the launch of their next web browser.

Learn More

* MSN Guidelines for Successful Indexing

* MSN Site Owner Help

* MSN Search Blog

* MSN AdCenter Blog

* Microsoft AdLab

* Microsoft Research

Worker Blogs

* Robert Scoble - he is probably known as one of the top 10 bloggers, but after working for Microsoft for years he left on June 10th, 2006.

Google

Google sprang out of a Stanford research project to find authoritative link sources on the web. In January of 1996 Larry Page and Sergey Brin began working on BackRub (what a horrible name, eh?).

Despite the dotcom fever of the day, they had little interest in building a company of their own around the technology they had developed, and they tried shopping the Google search technology to portals, to no avail.

Among those they called on was friend and Yahoo! founder David Filo. Filo agreed that their technology was solid, but encouraged Larry and Sergey to grow the service themselves by starting a search engine company. "When it's fully developed and scalable," he told them, "let's talk again." Others were less interested in Google, as it was now known. One portal CEO told them, "As long as we're 80 percent as good as our competitors, that's good enough. Our users don't really care about search."

Eventually they set up their own search company. Traditionally search was viewed as a loss leader, but within a few years of forming the company Google won distribution partnerships with AOL and Yahoo! that helped build their brand as the industry leader in search.

Google did not have a profitable business model until the third iteration of their popular AdWords advertising program in February of 2002, and was worth over 100 billion dollars by the end of 2005.

On Page Content

If a phrase is obviously targeted (i.e., the exact same phrase appears in most of the following locations: most of your inbound links, internal links, the start of your page title, the beginning of your first page header, etc.) then Google may filter the document out of the search results for that phrase. Other search engines may have similar algorithms, but if they do, those algorithms are not as sophisticated or as aggressively deployed as Google's.

Google is scanning millions of books, which should help them create an algorithm that is pretty good at differentiating real text patterns from spammy manipulative text (although I have seen many garbage-content cloaked pages ranking well in Google, especially for 3- and 4-word search queries).

You need to write naturally and make your copy look more like a news article than a heavily SEOed page if you want to rank well in Google. Sometimes using fewer occurrences of the phrase you want to rank for will work better than using more.

You also want to sprinkle modifiers and semantically related text in your pages that you want to rank well in Google.

Some of Google's content filters may look at pages on a page-by-page basis, while others may look across a site or a section of a site to see how similar different pages on the same site are. If many pages are exceptionally similar to other content on your own site or to content on other sites, Google may be less willing to crawl those pages and may throw them into the supplemental index. Pages in the supplemental index rarely rank well, since they are generally trusted far less than pages in the regular search index.

Duplicate content detection is not just based on some magical percentage of similar content on a page, but is based on a variety of factors. Both Bill Slawski and Todd Malicoat offer great posts about duplicate content detection. This shingles PDF explains some duplicate content detection techniques.
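
For a concrete sense of the shingling technique mentioned above, here is a minimal Python sketch: break each document into overlapping runs of w words and compare the sets with Jaccard similarity. The window size and sample texts are arbitrary.

```python
def shingles(text: str, w: int = 4) -> set[tuple[str, ...]]:
    """All contiguous runs of w words ('shingles') in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: str, b: str, w: int = 4) -> float:
    """Resemblance of two documents as overlap of their shingle sets."""
    sa, sb = shingles(a, w), shingles(b, w)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog near the river bank"
scraped  = "the quick brown fox jumps over the lazy dog near a shallow stream"
print(f"{jaccard(original, scraped):.2f}")  # ~0.54, a near-duplicate candidate
```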

I wrote a blog post about natural SEO copywriting which expounds on the points of writing unique natural content that will rank well in Google.

Crawling

While Google is more efficient at crawling than competing engines, it appears as though with Google's BigDaddy update they are looking at both inbound and outbound link quality to help set crawl priority, crawl depth, and whether or not a site even gets crawled at all. To quote Matt Cutts:

The sites that fit “no pages in Bigdaddy” criteria were sites where our algorithms had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling.

In the past crawl depth was generally a function of PageRank (PageRank is a measure of link equity, and the more of it you had, the better you would get indexed). Now this crawl penalty for having an excessive portion of your inbound or outbound links pointing into low-quality parts of the web creates an added cost, which makes dealing in spammy low-quality links far less appealing for those who want to rank in Google.

Query Processing

While I mentioned above that Yahoo! seemed to have a bit of a bias toward commercial search results it is also worth noting that Google's organic search results are heavily biased toward informational websites and web pages.

Google is much better than Yahoo! or MSN at determining the true intent of a query and trying to match that instead of doing direct text matching. Common words like "how to" may be significantly deweighted compared to other terms in the search query that provide better discrimination value.
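
One standard way to model that "discrimination value" is inverse document frequency (IDF): weight each query term by how rare it is in the index. Whether Google used exactly this is not established here; the sketch and the document frequencies below are illustrative.

```python
import math

def idf_weights(query: str, doc_freq: dict[str, int], n_docs: int) -> dict[str, float]:
    """Weight each query term by log(N/df): terms found everywhere score
    near zero, while rare terms dominate the query."""
    return {t: math.log(n_docs / doc_freq.get(t, 1)) for t in query.lower().split()}

# Hypothetical document frequencies in a 1,000,000-page index.
df = {"how": 900_000, "to": 950_000, "seo": 12_000}
print(idf_weights("how to SEO", df, 1_000_000))
# {'how': 0.105, 'to': 0.051, 'seo': 4.42} -- 'seo' carries the query
```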

Google and some of the other major search engines may try to answer many common questions related to the concept being searched for. For example, in a given set of search results you may see any of the following:

· a relevant .gov and/or .edu document

· a recent news article about the topic

· a page from a well known directory such as DMOZ or the Yahoo! Directory

· a page from the Wikipedia

· an archived page from an authority site about the topic

· the authoritative document about the history of the field and recent changes

· a smaller hyper focused authority site on the topic

· a PDF report on the topic

· a relevant Amazon, eBay, or shopping comparison page on the topic

· one of the most well branded and well known niche retailers catering to that market

· product manufacturer or wholesaler sites

· a blog post / review from a popular community or blog site about a slightly broader field

Some of the top results may answer specific relevant queries or be hard to beat, while others might be easy to compete with. You just have to think about how and why each result was chosen to be in the top 10 to learn which ones you will be competing against and which ones may fall away over time.

Link Reputation

PageRank is a weighted measure of link popularity, but Google's search algorithms have moved far beyond just looking at PageRank.

As mentioned above, gaining an excessive number of low quality links may hurt your ability to get indexed in Google, so stay away from known spammy link exchange hubs and other sources of junk links. I still sometimes get a few junk links, but I make sure that I try to offset any junky link by getting a greater number of good links.

If your site ranks well, some garbage automated links will end up pointing at you whether you like it or not. Don't worry about those links; just worry about trying to get a few real high-quality editorial links.

Google is much better at being able to determine the difference between real editorial citations and low quality, spammy, bought, or artificial links.

When determining link reputation Google (and other engines) may look at the following signals (a toy scoring sketch follows the list):

· link age

· rate of link acquisition

· anchor text diversity

· deep link ratio

· link source quality (based on who links to them and who else they link at)

· whether links are editorial citations in real content (or whether they are on spammy pages or near other obviously non-editorial links)

· does anybody actually click on the link?
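
A toy score combining a few of the signals above might look like the following sketch. Every weight, cap, and signal encoding here is invented for illustration; real engines combine far more data in far more sophisticated ways.

```python
def link_reputation(age_days: int, anchor_variants: int, total_anchors: int,
                    source_quality: float, in_content: bool,
                    gets_clicks: bool) -> float:
    """Toy link-reputation score over a subset of the signals listed above."""
    score = 0.0
    score += min(age_days / 365.0, 2.0)               # link age, capped at 2 years
    if total_anchors:
        score += anchor_variants / total_anchors      # anchor text diversity
    score += 2.0 * source_quality                     # 0..1 quality of linking page
    score += 1.0 if in_content else -1.0              # editorial citation vs. footer spam
    score += 0.5 if gets_clicks else 0.0              # does anybody actually click it?
    return score

print(link_reputation(age_days=730, anchor_variants=8, total_anchors=10,
                      source_quality=0.9, in_content=True, gets_clicks=True))  # 6.1
```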

It is generally believed that .edu and .gov links are trusted highly in Google because they are generally harder to influence than the average .com link, but keep in mind that there are some junky .edu links too (I have seen stuff like .edu casino link exchange directories). While the TrustRank research paper had some names from Yahoo! on it, I think it is worth reading the TrustRank research paper (PDF) and the link spam mass estimation paper (PDF), or at least my condensed versions of them here and here, to understand how Google is looking at links.

When getting links for Google it is best to look in virgin lands that have not been combed over heavily by other SEOs. Either get real editorial citations or get citations from quality sites that have not yet been abused by others. Google may strip the ability to pass link authority (even from quality sites) if those sites are known obvious link sellers or other types of link manipulators. Make sure you mix up your anchor text and get some links with semantically related text.

Google likely collects usage data via Google search, Google Analytics, Google AdWords, Google AdSense, Google News, Google Accounts, Google Notebook, Google Calendar, Google Talk, Google's feed reader, Google search history annotations, and Gmail. They also created a Firefox browser bookmark sync tool and an anti-phishing tool built into Firefox, and they have a relationship with Opera (another web browser company). Most likely they can lay some of this data over the top of the link graph as a corroborating source for the legitimacy of the linkage data. Other search engines may also look at usage data.

Page vs Site

Sites need to earn a certain amount of trust before they can rank for competitive search queries in Google. If you put up a new page on a new site and expect it to rank right away for competitive terms you are probably going to be disappointed.

If you put that exact same content on an old trusted domain and link to it from another page on that domain it can leverage the domain trust to quickly rank and bypass the concept many people call the Google Sandbox.

Many people have been exploiting this algorithmic hole by throwing up spammy subdomains on free hosting sites or other authoritative sites that allow users to sign up for a cheap or free publishing account. This is polluting Google's SERPs pretty badly, so they are going to have to make some major changes on this front pretty soon.

Site Age

Google filed a patent about information retrieval based on historical data which described many of the things they may look for when determining how much to trust a site. Many of the things I mentioned in the link section above are relevant to site-age-related trust (i.e., to be well trusted due to site age you need at least some link trust score and some age score).

I have seen some old sites with exclusively low quality links rank well in Google based primarily on their site age, but if a site is old AND has powerful links it can go a long way to helping you rank just about any page you write (so long as you write it fairly naturally).

Older trusted sites may also be given a pass on many things that would cause newer lesser trusted sites to be demoted or de-indexed.

The Google Sandbox is a concept many SEOs mention frequently. The idea of the 'box is that new sites which should be relevant struggle to rank for some queries they would be expected to rank for. While some people have dismissed the sandbox as garbage, Google's Matt Cutts said in an interview that they did not intentionally create the sandbox effect, but that it arose as a side effect of their algorithms:

"I think a lot of what's perceived as the sandbox is artefacts where, in our indexing, some data may take longer to be computed than other data."

You can listen to the full Matt Cutts audio interviews here and here.

Paid Search

Google AdWords factors both max bid price and clickthrough rate into its ad ranking algorithm. In addition, they automate reviews of landing page quality and use that as another factor in their ad relevancy algorithm to reduce the amount of arbitrage and other noisy signals in the AdWords program.

The Google AdSense program is an extension of Google AdWords which offers a vast ad network across many content websites that distribute contextually relevant Google ads. These ads are sold on a cost per click or flat rate CPM basis.

Editorial

Google is known to be far more aggressive with their filters and algorithms than the other search engines are. They are known to throw the baby out with the bath water quite often. They flat out despise relevancy manipulation, and have shown they are willing to trade some short term relevancy if it guides people along toward making higher quality content.

Short term if your site is filtered out of the results during an update it may be worth looking into common footprints of sites that were hurt in that update, but it is probably not worth changing your site structure and content format over one update if you are creating true value add content that is aimed at your customer base. Sometimes Google goes too far with their filters and then adjusts them back.

Google published their official webmaster guidelines and their thoughts on SEO. Matt Cutts is also known to publish SEO tips on his personal blog. Keep in mind that Matt's job as Google's search quality leader may bias his perspective a bit.

A site by the name of Search Bistro uncovered a couple of internal Google documents which have been used to teach remote quality raters what to look for when evaluating search quality since at least 2003:

· Google Spam Recognition Guide for Raters (doc) - discusses the types of sites Google considers spam. Generally sites which do not add any direct value to the search or commerce experience.

· General Guidelines on Random-Query Evaluation (PDF) - shows how sites can be classified based on their value, from vital to useful to relevant to not relevant to off topic to offensive

These raters may be used to

· help train the search algorithms, or

· flag low quality sites for internal reviews, or

· manually review suspected spam sites

If Google bans or penalizes your site due to an automated filter and it is your first infraction usually the site may return to the index within about 60 days of you fixing the problem. If Google manually bans your site you have to clean up your site and plead your case to get reincluded. To do so their webmaster guidelines state that you have to click a request reinclusion link from within the Google Sitemaps program.

Google Sitemaps gives you a bit of useful information from Google about which keywords your site is ranking for and which keywords people are clicking on your listing for.

Social Aspects

Google allows people to write notes about different websites they visit using Google Notebook. Google also allows you to mark and share your favorite feeds and posts. Google also lets you flavorize search boxes on your site to be biased towards the topics your website covers.

Google is not as entrenched in the social aspects of search as Yahoo! is, but Google seems to throw out many more small tests, hoping one will stick. They are trying to make software more collaborative and get people to share things like spreadsheets and calendars, while also integrating chat into email. If they can create a framework where these things mesh well, they may be able to gain further market share by offering free productivity tools.

Google SEO Tools

· Google Sitemaps - helps you determine if Google is having problems indexing your site.

· AdWords Keyword Tool - shows keywords related to an entered keyword, web page, or web site

· AdWords Traffic Estimator - estimates the bid price required to rank #1 85% of the time among Google AdWords ads shown near Google search results, and how much traffic an AdWords ad would drive

· Google Suggest - auto completes search queries based on the most common searches starting with the characters or words you have entered

· Google Trends - shows multi-year search trends

· Google Sets - creates semantically related keyword sets based on keyword(s) you enter

· Google Zeitgeist - shows quickly rising and falling search queries

· Google related sites - shows sites that Google thinks are related to your site; search for related:www.site.com

· Google related word search - shows terms semantically related to a keyword; search for ~term -term

Business Perspectives

Google has the largest search distribution, the largest ad network, and by far the most efficient search ad auction. They have aggressively extended their brand and amazing search distribution network through partnerships with small web publishers, traditional media companies, portals like AOL, computer and other hardware manufacturers such as Dell, and popular web browsers such as Firefox and Opera.

I think Google's biggest strength is also their biggest weakness. With some aspects of business they are exceptionally idealistic. While that may provide them an amazingly cheap marketing vehicle for spreading their messages and core beliefs it could also be part of what unravels Google.

As they throw out bits of their relevancy in an attempt to keep their algorithm hard to manipulate, they create holes where competing search businesses can become more efficient.

In the real world there are celebrity endorsements. Google's idealistic hatred of bought links, and of other things which act like online celebrity endorsements, may leave holes in their algorithms, business model, and business philosophy that allow a competitor to sneak in and grab a large segment of the market by making the celebrity endorsement factor part of the way businesses are marketed.

Search Marketing Perspective

If you are new to a market and are trying to compete for generic competitive terms it can take a year or more to rank well in Google. Buying older established sites with aged trusted quality citations might also be a good way to enter competitive marketplaces.

If you have better products than the competition, are a strong viral marketer, or can afford to combine your SEO efforts with traditional marketing it is much easier to get natural citations than if you try to force your way into the index.

Creating a small site with high quality unique content and focusing on getting a few exceptionally high quality links can help a new site rank quickly. In the past I believed that a link was a link and that there was just about no such thing as a bad link, but Google has changed that significantly over the last few years. With Google sometimes less is more.

At this point, links that may seem relatively expensive at first glance compared to cheaper alternatives (like the $299-a-year Yahoo! Directory listing) can be a great buy: owners of the most spammy sites would not want their sites manually reviewed by any of the major search companies, so both Yahoo! and Google are likely to place more than average weight on a Yahoo! Directory listing.

Also, getting a few citations from high quality, relevant resources can go a long way toward improving your overall Google search relevancy.

Right now I think Google is doing a junky job with some of its search relevancy, by placing too much trust in older domains and favoring pages that have only one or a few occurrences of certain modifiers on the page. In doing this they are ranking many cloaked pages for terms other than the terms those pages target; for example, I have seen Google rank real-content home mortgage pages for student loan searches, largely because 'student loans' appeared in the global site navigation on the home mortgage page.

Learn More

* Google on SEO

* Google's Webmaster Guidelines

* Google Spam Recognition Guide for Raters (doc)

* General Guidelines on Random-Query Evaluation (PDF)

* Google Blog

* Google AdWords

* Google AdWords Blog

* Google AdSense

* Google AdSense Blog

* Google Sitemaps

* Google Sitemaps Blog

* Papers Written by Googlers

* Patent about information retrieval based on historical data

Worker Blogs

· Matt Cutts - Matt is an amazingly friendly and absurdly accessible guy given his position as the head of Google's search quality team.

· Adam Lasnik - a sharded version of Matt Cutts. A Cuttlet, if you will.

Ask

Ask was originally created as Ask Jeeves, founded by Garrett Gruener and David Warthen in 1996 and launched in April of 1997. It was a natural language query processing engine that used editors to match common search queries, and backfilled its search results via a meta search engine that queried other popular engines.

As the web scaled and other search technologies improved, Ask Jeeves tried using other technologies, such as Direct Hit (which roughly based popularity on page views, until it was spammed to death), and then in 2001 they acquired Teoma, which is the core search technology they still use today. In March of 2005 InterActive Corp. announced they were buying Ask Jeeves, and by March of 2006 they had dumped Jeeves, changing the brand to Ask.

On Page Content

For topics where there is a large community, Ask is good at matching concepts and authoritative sources. Where those communities do not exist, Ask relies a bit too much on on-page content and is fairly susceptible to repetitive, keyword-dense search spam.

Crawling

Ask is generally slower at crawling new pages and sites than the other major engines are. However, they also own Bloglines, which gives them an incentive to quickly index popular blog content and other rapidly updated content channels.

Query Processing

I believe Ask has a heavy bias toward topical authority sites, independent of anchor text or on-page content. This has a large effect on the result set they provide for any query, in that it creates results that are more conceptually and community oriented than keyword oriented.

Link Reputation

Ask is focused on topical communities using a concept they call Subject-Specific Popularity℠. This means that if you are entering a saturated or hyper-saturated field, Ask will generally be one of the slowest engines to rank your site, since they will only trust it after many topical authorities have shown they trust it by citing it. Due to this heavy bias toward topical communities, for generic searches they seem to weigh how many quality related citations you have far more than anchor text. For queries where there is not much of a topical community, their relevancy algorithms are nowhere near as sharp.
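The "many topical authorities citing you" idea descends from Jon Kleinberg's hubs-and-authorities research (linked in the Learn More section below). Here is a minimal Python sketch of that underlying idea; it is a textbook HITS-style iteration over a hypothetical link graph, not Ask's actual ExpertRank implementation:

```python
# Toy hubs-and-authorities (HITS-style) iteration, the research idea
# that underlies Ask's topical ranking. Illustrative only; this is not
# Ask's actual ExpertRank algorithm, and the graph is hypothetical.

def hits(links, iterations=20):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A page is a good authority if good hubs link to it.
        auth = {p: sum(hub[q] for q in pages if p in links.get(q, ()))
                for p in pages}
        # A page is a good hub if it links to good authorities.
        hub = {p: sum(auth[t] for t in links.get(p, ())) for p in pages}
        # Normalize so scores stay bounded across iterations.
        a_norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        h_norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        auth = {p: v / a_norm for p, v in auth.items()}
        hub = {p: v / h_norm for p, v in hub.items()}
    return hub, auth

# Hypothetical mini link graph within one topical community.
graph = {
    "topical-directory": ["authority-a", "authority-b"],
    "blog-post": ["authority-a"],
    "authority-a": ["authority-b"],
}
hub, auth = hits(graph)
print(sorted(auth.items(), key=lambda kv: -kv[1]))
```

The intuition matches the text above: a page only accumulates authority once pages the community already treats as good hubs link to it, which is why a new site in a saturated field ranks slowly on Ask.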

Page vs Site

Pages on a well referenced trusted site tend to rank better than one would expect. For example, I saw some spammy press releases on a popular press release site ranking well for some generic SEO related queries. Presumably many companies link to some of their press release pages and this perhaps helps those types of sites be seen as community hubs.

Site Age

Directly I do not believe it is much of a factor. Indirectly I believe it is important in that it usually takes some finite amount of time to become a site that is approved by your topical peers.

Paid Search

Ask gets most of their paid search ads from Google AdWords. Some ad buyers in verticals where Ask users convert well may also want to buy ads directly from Ask. Ask will only place their internal ads above the Google AdWords ads if they feel the internal ads will bring in more revenue.

Editorial

Ask heavily relies upon topical communities and industry experts to be, in essence, the editors of its search results. They give an overview of their ExpertRank technology on their web search FAQ page. While they have such limited distribution that few people talk about their search spam policies, they do reference a customer feedback form on their editorial guidelines page.

Social Aspects

Ask is a true underdog in the search space. While they offer Bloglines and many of the save-a-search personalization features the other search companies offer, they do not have the critical mass of users that some of the other major search companies have.

Ask SEO Tools

Ask search results show related search phrases in the right-hand column. Due to the nature of their algorithms, Ask is generally not good at offering link citation searches, but recently their Bloglines service has allowed you to look for blog citations by authority, date, or relevance.

Business Perspectives

Ask is owned by InterActive Corp. While Ask is considered to be far behind in search volume, Barry Diller, their CEO, has made large comebacks in the television space in the past.

InterActive Corp. has some of the strongest brands in expensive niches such as online dating, loans, and travel. If they sell ad inventory space in some of those top tier markets they can have a significant effect on the search landscape in many markets. They also push the Ask.com search box on many of those well branded sites.

Ask's two biggest weak points are:

* their limited distribution

* their heavy reliance on Google for providing their sponsored links

Search Marketing Perspective

Ask generally has such limited market share that I have never really worried much about them. If I were in a vertical where they drove significant traffic I might pay a bit more attention. If you are in one of the verticals where they have a strong brand, it is worth watching how they bias their search results and/or search ads toward their other internal properties, and how strongly they push their search offering on those properties.

In areas where there is only a limited relevant community for Ask to build its topical model around, its relevancy drops sharply.

Learn More about Ask

· Ask.com Editorial Guidelines

· Ask.com Web Search FAQ page

· Official Ask Jeeves Blog

· Jon Kleinberg's home page - has links to Kleinberg's papers on hubs and authorities. Jon's research was a large part of the foundation that led to Ask's current search algorithm.

· Mike Grehan's Topic Distillation PDF - PDF about hubs & authorities

Other Search Systems

Classic large scale hypertextual search engines are only one type of information retrieval and organization system. There are many other types of search that we do not usually think of as search.

Example Vertical Search Engines

General large scale web search is just one type of search. There are many other types of search engines and information organization tools, for example

· the Yellow Pages

· Television program guides

· directories like DMOZ, the Yahoo! Directory, LII, or specialty directories

· encyclopedia type sites like Wikipedia

· large general structured databases like Google Base

· shopping search like Froogle

· local search like Google Local

· news search like Yahoo! News

· blog search like Technorati

· tag search like Del.icio.us

· video search like YouTube

· photo search like Flickr

· meme trackers like Techmeme

· social communities like Digg

· social networks like MySpace

· some people may also rely on an individual content channel, or a group of them, to find the most interesting things, delivered through daily updated content streams

Limitations of Vertical Search

Vertical search services typically either:

· have exceptionally limited scale, because everything runs through central editors, or

· have less editorial control than large scale search engines.

Vertical search is limited because:

· there are fewer available signals of quality to measure in their algorithms,

· many of them use recency as one of their sorting signals, and

· they typically have less content available (due to content copyright restrictions or limited available content).

Ways to Organize Vertical Search

Thus vertical search services either have to trust vetted content partners or must rely heavily on things like usage data. Some vertical services are even built around displaying the most popular items (like Digg) or the most frequently viewed items (like Popular Google Videos or Google Video Movers & Shakers).
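As a toy illustration of the popularity-plus-recency sorting described above, here is one common pattern: a vote or view count damped by the item's age. The formula and the gravity constant are illustrative assumptions, not the actual ranking of Digg, Google Video, or any other service:

```python
# Toy recency-weighted popularity score, the general pattern behind
# "most popular right now" sorting. The formula and gravity constant
# are illustrative assumptions, not any real site's algorithm.

def score(votes, age_hours, gravity=1.8):
    # Newer items need fewer votes to outrank older ones.
    return votes / (age_hours + 2) ** gravity

items = [("fresh story", 40, 2), ("older hit", 500, 48)]
for name, votes, age in sorted(items, key=lambda i: -score(i[1], i[2])):
    print(f"{name}: {score(votes, age):.3f}")
```

Here the two-hour-old story with 40 votes outscores the two-day-old story with 500 votes, which is exactly the freshness bias the recency signal creates.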

Many search systems are thin, me-too aggregators, but by limiting their data sources, structuring their sources, or providing a different means of search, many of these systems are more useful than general global search systems.

Vertical Folds Into General Search

Danny Sullivan wrote a piece called Searching With Invisible Tabs which talks about how Google will fold vertical search into their global search product.

From a marketing perspective, the things you have to remember are:

· some vertical search services are harder to get into due to editorial vetting processes

· some vertical search services are easy to dominate due to limited competition

· vertical search is being folded into many global search products

Search Engines as Efficient Media Companies

Google, Microsoft, and Yahoo! are all trying to increase ad network efficiency, extend their current ad networks to place more relevant ads, and increase the accessibility of ads to smaller merchants via automation and hyper-targeting.

· Google is launching a payment processor which will help them get better conversion statistics. That will allow them to improve ad targeting and increase ad automation.

· Google is trying to roll out free WiFi services so they can track mobile search ads.

· Companies like Olive Software will make it easier for offline content to come online, and make it easy to cite those works like regular web pages. After a few media companies find a way to make sense of the search model many others will likely trickle across.

· Yahoo! bought Flickr, Del.icio.us, Upcoming.org, and many other vertical search companies.

· Microsoft bought Massive, Inc., a company which sells targeted ads in video games.

· Google paid roughly 1 billion dollars to buy dMarc Broadcasting, a radio advertising automation network.

· Google is trying to sell ads in print media.

· Google is doing research on passive automation of social web integration into the television experience (PDF).

The search battle is, in general, a battle for near-perfect market data, which can be leveraged in a near-infinite number of ways, with each additional vertical or efficiency gain lending the network potentially larger scale AND even greater network efficiency.

Due to brand limitations and limited market size many vertical search services will remain unscathed by global search powerhouses, but as search improves in efficiency the major search engines will swallow many additional markets and make them more efficient.

Thanks to...

* Danny Sullivan and all the other people that were easy to reference while writing this.

* Google, for being evil and divergent enough with their algorithm to make such an article a worthwhile endeavor.