HYBRID SERVICES

These services present both directory-based and crawler-based results, although most hybrid services favor one type of listing over another. MSN Search, for example, typically will present listings from the LookSmart directory before it presents crawler-based results provided by Inktomi. Yahoo! uses its own directory service, which is supplemented with crawler-based results provided by Google (that may have changed by the time you read this book).

Designing for Search Engines

But, you might ask, how do crawler-based search engines determine relevancy when confronted with hundreds of millions of web pages to sort through? They use algorithms: sets of rules that govern how their spiders crawl, how pages are indexed, and how pages are ranked within the results for a specific search term. Although exactly how a search engine's algorithm works is a closely kept secret, all major search engines follow the same general rules.

The remainder of this chapter, hopefully, will help you to understand how to design your web pages so that your website will get the proper search engine and directory rankings that it needs to be successful.

When someone queries a search engine for a keyword related to your site's products/services, does your web page appear in the top 20 matches, or does your competition's? If your web pages aren't listed within the first two or three pages of results, you lose. To avoid such a circumstance, take the inner workings of search engines into consideration when designing your new website. If you ignore the criteria necessary for optimal placement, your website will miss out on traffic it would otherwise have received.

NOTE
The three most popular search portals are Google, AlltheWeb, and Yahoo!. Trailing behind these giants are MSN Search, AOL Search, Askjeeves, and HotBot. All of these search portals in one way or another use spiders to crawl or search the Internet. Humans then review the results in an effort to refine the search engine's database.

Mergers and acquisitions are changing the search portal landscape. As of mid-October 2003, Yahoo! owns AltaVista, AlltheWeb, Inktomi, and Overture. At this writing, AltaVista and AlltheWeb continue to be available at their historic locations; however, they may soon share the same underlying database. Note that Inktomi remains the back-end search engine at MSN Search and is still available at HotBot.

As discussed previously, a spider is a small program that gives weight to the placement and frequency of words and uses ranking algorithms during the search process. And as explained in the "Spider" text box, while the location and frequency of keywords on a web page are generally given the most weight, related words and word relevance, along with other criteria such as a descriptive page title of five or so words, body copy, placement of keywords, and meta tags within your HTML code, all play a role in how a search engine ranks your web pages.

Here is an illustration of how your website’s design influences a web page’s ranking:

Say that a potential customer types in “antiquarian books.” That customer wants to find websites that have content about and/or sell antiquarian books. Since the search engine assumes the same thing, the results will be top heavy with web pages having that search term appearing in their HTML title tag. It assumes those web pages are more relevant than those without the term in their title tag. But search engines don’t stop there. They also check to see if the words “antiquarian books” appear near the top of a web page, such as in the headline or in the first few paragraphs of text because it is assumed that any page relevant to the topic will mention those words somewhere near the top of the page.

Now let’s consider frequency using the same scenario. A search engine also analyzes how often the keywords “antiquarian” and “books” appear in relation to other words in a web page. Those with a higher frequency are often deemed more relevant than other web pages.

So even though search engines vary on how they rank websites, every web page should include:

  • Page <TITLE> tag.
  • Keyword meta tag containing more than one word.
  • Description meta tag.
  • <!-- comment tags -->.
  • First 25 words (or 255 characters) of text.
  • NOFRAMES tag.
  • Hidden FORM tag.
  • HTML tags.
  • ALT tags.

Let’s look a bit closer at each of these elements:

Title: The title you choose will be the most important decision you make affecting search engine ranking and listing. There is no specific science to it; just keep it simple. Look at the web page, and the first five or so descriptive words that come to mind can be the title. Another way to look at it: think of your title as a catchy headline for an ad.

Text: When it comes to the text of a submitted web page, search engines vary in their indexing procedures. Some will index the full text of a submitted page, while others will only take into account the first 25 words (or 255 characters) of the page (the 25/255 rule). So, write the text of a submitted page so that the important keywords appear more than once in the first 25 words.
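To make the 25/255 rule concrete, here is a minimal sketch that counts how often each keyword appears in a page's opening words. The helper name and sample text are our own illustrations, not part of any search engine:

```python
def keywords_in_opening(text, keywords, word_limit=25):
    """Count how often each keyword appears in the first `word_limit` words."""
    opening = [w.strip(".,!?;:").lower() for w in text.split()[:word_limit]]
    return {kw: opening.count(kw.lower()) for kw in keywords}

page_text = ("Antiquarian books for every collector. Our antiquarian "
             "bookstore stocks rare books, first editions and fine bindings.")
print(keywords_in_opening(page_text, ["antiquarian", "books"]))
```

Running a check like this on your opening copy tells you quickly whether the keywords you care about actually appear, more than once, inside the window an engine may index.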

Something else you can do is to place a transparent GIF image, one pixel in size, at the top of a submitted web page and insert a description of the page in its ALT tag, following the 25/255 rule.
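As a sketch of this trick, the one-pixel image tag could be generated like this. The function name and `spacer.gif` file name are hypothetical:

```python
def spacer_img_tag(description, src="spacer.gif"):
    """Build a 1x1 transparent GIF tag whose ALT text carries the page description."""
    return ('<img src="%s" width="1" height="1" border="0" alt="%s">'
            % (src, description))

print(spacer_img_tag("Antiquarian books, rare first editions and fine bindings."))
```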

Meta Tags: They are indispensable tools in your battle for search engine ranking. Put them, along with keywords relevant to each specific page, on each page of your website.

When we discuss meta tags in this chapter we are discussing only description and keyword tags. A description meta tag is exactly what it sounds like: it gives a description of a web page for the search engine summary. A keyword meta tag, again, is exactly what it states: it gives keywords (which should never be fewer than two words) for the search engine to associate with a specific web page. These meta tags, which go inside the header tags, are crucial for optimal search engine indexing. Your meta tags should reflect the content of the first couple of sentences of the main body. Make certain that the words you use in your keyword tags are words someone would actually type in to find your website.

Keyword hints:

  • Keywords are target words that will drive people to your website.
  • When choosing your keywords, always use the plural of the word. Searching for "car" will find sites with "cars" in their keywords, but searching for "cars" will not find sites with only the singular "car" in their keywords.
  • Almost any site on the Web could use “web,” “internet,” “net,” or “services,” as a descriptive keyword. Don’t! Using these and other like words to target potential customers is fruitless and most of the spiders actively ignore common words such as these.
  • Include incorrect spellings of keywords that are routinely misspelled. For example, the word “accommodations” is commonly misspelled as “accomodations” so include both in your keywords.

An example:

<HEAD>

<TITLE>Best Online Widget Store in the Universe</TITLE>

<META name="description" content="An online store with all the Widgets you would ever want.">

<META name="keywords" content="widgets, widget accessories, widget howto, widget books, widget articles, widget technical papers, widget software, working with widgets, designing with widgets">

</HEAD>

For guidelines on what you should do with meta tags, go to a search engine, say AlltheWeb, and search for a term or word that you hope someone would use to find your website. Then go to the top-ranked websites and use the "view source" feature of your browser to see what kind of meta tags each of these sites uses. Study them and understand their relationship to the web page, then use this information when you compose your own meta tags.
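If you want to automate that "view source" study, a small script using Python's standard-library HTML parser can pull out the description and keyword meta tags from a saved page. This is a sketch for offline study (the class name is ours), not a crawler:

```python
from html.parser import HTMLParser

class MetaTagExtractor(HTMLParser):
    """Collect description and keyword meta tags from an HTML source."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            if name in ("description", "keywords"):
                self.meta[name] = d.get("content", "")

page = '''<head>
<meta name="description" content="An online store with all the Widgets you would ever want.">
<meta name="keywords" content="widgets, widget accessories, widget books">
</head>'''
parser = MetaTagExtractor()
parser.feed(page)
print(parser.meta)
```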

Keywords: There are two ways to approach keywords: a blanket strategy and a targeted strategy. When you use a large list of keywords, your pages will be found by a variety of surfers using an extensive range of search strings, but your web pages will not, in all probability, be among the top-ranking pages; this is the blanket strategy. When you use a limited number of keywords, the density of those few keywords increases and therefore pushes your pages higher up the list; this is the targeted strategy.

If your website offers either a limited number of products/services or products/services that can be adequately covered with a short keyword list, then the targeted strategy is for you. In other words, you're confident that potential customers will search for those specific words above all others. However, if you have a wide variety of products/services on your website (such as Drugstore.com or Outpost.com), then you might use the blanket method. Or consider the doorway page, mentioned later in this section.

Keyword Mix: Pay attention to your keyword mix. Keyword density (the ratio of a keyword or keyphrase to the total words [depth] on a page) is the factor that search engines most consider when assigning relevancy ratings to web pages. Achieving the right keyword or keyphrase mix has become almost a science. Some search engines look for various combinations of keyword density, i.e., the number of keywords versus total word count must be within a certain range, and to complicate matters, they assign different “weights” to components such as Title, Meta Tags, Links, Body Text, Headline, etc.
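The density calculation itself is simple arithmetic. Here is a minimal sketch (our own helper, not any engine's actual formula) that computes the ratio of keyphrase words to total words on a page:

```python
def keyword_density(text, phrase):
    """Ratio of words belonging to keyphrase matches to total words in text."""
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window of the phrase's length across the page text.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return (hits * n) / len(words) if words else 0.0

body = "Antiquarian books are our passion. We buy and sell antiquarian books daily."
print(round(keyword_density(body, "antiquarian books"), 2))
```

A tool of this kind lets you check a draft page against whatever density range you are targeting before you submit it.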

Link Popularity: This refers both to the number of links within your web pages that point to similar websites and the number of websites that link back to your e-commerce site. Your links to other websites must be on relevant pages, that is, pages that have as much as possible to do with the common theme of your website and that are not just a page full of links. Pages that are full of links are commonly referred to as "link farms" and are ignored by spiders.

Search engines view a website with a large number of incoming links (i.e. other websites that have links to your website) as an important or popular website. Thus, according to search engines, a website with lots of links leading to it generally implies that the website is a valued one and the search engine’s database would not be complete without it. Link popularity is vital if your site is to achieve a high search engine placement ranking.

Use Optimization Tools

If your keyword density is too low, your page will not be rated high enough in relevancy; conversely, if it is too high, your site may be penalized for "keyword stuffing." This sounds complicated, but there is a way to get help: keyword optimizing tools such as GRSoftware's Keyword Density Analyzer (www.grsoftware.net) and Webpositioning Gold (www.website-promoters.com). Or check out Keyworddensity.com, which provides free online analysis of any web page. Another option is to use a service like Abalone Designs (www.abalone.ca), Dragonfly Design (http://dragonfly-design.com/special-offers.html) or etrafficjams.com. All three provide free website analyses as a marketing tool.

Many of you will decide to purchase search engine optimization tools because such tools can help you to get the desired results from search engine submissions, by providing you with the means to tweak your website so that it works with visiting search engine spiders and their complex, math-based formulas used to rank websites. Here is what to look for when selecting a search engine optimization tool:

  • Good documentation. You want a product that provides a clear overview of the different steps you need to take to prepare and then submit your website to the various search engines and directories.
  • An intuitive interface that provides ease of use. When you buy a search engine optimization tool, you don’t want to spend a lot of time learning how to use the program.
  • Submission options. Look for a product that will let you choose the number of pages that can be submitted to the search engines. With most search engines, it is better to submit all of your web pages individually, but not all tools allow this type of submission.
  • A viable search engine list. Watch out for search submission products that say they will submit your site to thousands of search engines. While this might help drive traffic to your website (there are thousands of micro search engines), only a small portion will help you increase the number of relevant visits to your website. Submitting your site to Lawcrawler, or some other specialty search engine unrelated to your website’s content, will not have much value. The best search engine tools focus on the major search engines and directories. Then add regional and specialty engines.
  • Keyword selection. Choose a tool that guarantees that its keyword selection tools are based on an analysis of millions of words and phrases entered in search queries daily.
  • Page analysis. Look for a product that offers page analysis, i.e. there are tools to examine each web page from a search engine’s viewpoint. The tool should check for standard techniques and elements that will improve your ranking for your selected keyword phrases. A list of problem areas and how to correct them should be presented after the analysis is complete.
  • Select specific search engines and directories. Find an optimization tool that doesn’t provide an “all-or-nothing” submission option. Instead, choose a search engine optimization product that allows you to select the search engines and directories to which you want to submit your website pages.
  • Tracking and ongoing analysis. Look for a product that enables you to assess whether your web pages have been listed in the index of each submitted search engine and directory, and how they rank in the various indexes.
  • Money-back guarantee. Don't spend your money on an optimization tool that won't provide a money-back guarantee. Many search engines and directories are now very selective about automated submission sources (which is what all search engine optimization tools are). Find a product that guarantees listings in the search engines and directories on its list.

Most search engine optimization tools are relatively inexpensive. To give you a sampling of what’s available, check out:

  • Axandra/Voget Selbach Enterprises GmbH’s Internet Business Promoter (IBP) product (www.axandra). The cost ranges from free to $350.00.
  • Microsoft’s bCentral (www.bcentral.com). The cost for the annual plan is $79.
  • Netmechanic.com’s search engine optimization tools are available on an annual plan basis. The cost per URL subscription is $49.
  • Websiteceo.com offers four editions with different pricing options. The cost ranges from free to $495.

NOTE
By submitting your website to a search engine you speed up the spidering process: after you have submitted your URL, the search engine sends a spider to "investigate" your new website. But you should also be aware that the information the spider returns will often be exactly what appears on the "results page" of the search engine.

You Can Do More

Now let's examine some other steps you can take that might improve your web page ranking.

Cloaking, or "stealth scripts": Cloaking uses software to serve one page to surfers and a different page to search engines. There are even sophisticated scripts that can serve a different page to each search engine, allowing you to customize a page for specific engines. A secondary reason for cloaking your pages is to protect your highly optimized page from code thieves. All a thief can steal is your "surfer" page.

It is imperative that your search engine pages (i.e. cloaked pages) represent your surfer pages fairly and accurately. Search engines once threatened to ban cloaked sites. However, most engines have admitted, reluctantly, that they will not take any action unless the surfer page is on a different subject than the search engine page. But if you use keywords and phrases that are not related to your actual content, you risk having your entire website banned from most, if not all, search engine indexes.

Doorway Pages: Search engines do a poor job of indexing and scoring web pages that use dynamically generated pages, frames, or JavaScript. This is where doorway pages, also called entry or bridge pages (they all mean the same thing), come into play.

Doorway pages also can be used to create alternate entrances to your website so as to target a specific search engine with a page designed to deal with that search engine’s criteria. Although doorway pages should be designed carefully to target specific keywords for individual search engines, they should also provide customer-centric information and point the customer to the “guts” of your website and have the same look and feel as the rest of your website.

Doorway pages can help you to obtain a high ranking using the unique ranking algorithms of each search engine. A web page that is highly ranked by one search engine may not fare as well on another. You will want to cater to the top search engines, which at this writing are Google, AlltheWeb, Yahoo!-owned Inktomi, and Teoma (Ask Jeeves). Ideally, you need one doorway page for each target search engine and for each keyword or keyword phrase that a potential customer might use to find your site. For example, if you target six search engines and anticipate that customers will search using one of five different keywords/phrases, you'll need 30 doorway pages: 6 search engines times 5 keywords/phrases.
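The engine-times-keyphrase arithmetic can be sketched as a page-naming plan. The engine list and keyphrases below are illustrative examples only:

```python
from itertools import product

engines = ["google", "alltheweb", "inktomi", "teoma"]
phrases = ["antiquarian books", "rare books", "first editions",
           "fine bindings", "book collecting"]

# One doorway page per (engine, keyphrase) pair.
doorways = ["%s-%s.html" % (e, p.replace(" ", "-"))
            for e, p in product(engines, phrases)]
print(len(doorways))   # 4 engines x 5 phrases = 20 pages
print(doorways[0])     # google-antiquarian-books.html
```

Enumerating the pages up front makes it easy to see how quickly the doorway-page count grows as you add engines or keyphrases.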

There are three types of doorway pages:

  • A page that invites the customer to continue on to your website’s home page (at the same time it provides the specific search engine with a page that it will find highly relevant).
  • A page that is semi-invisible to the customer through the use of a JavaScript redirection technique. The page that is submitted to the search engines is stripped down to the minimum so that the search engine finds it highly relevant, but a customer will only see the page as a "flash" before the real page is presented to the browser. There are two problems with this method: if a customer's browser is not enabled for JavaScript, the customer will see the unattractive stripped-down page, and some search engines find the "redirection" code and then downgrade the relevancy of the page.
  • A doorway page that is completely invisible to your customers. This is where software like iPush.net comes in: an IP-based delivery and cloaking system designed to help you get the best possible search engine results and keep them. An oversimplification of this software is that it "cloaks" the doorway pages so that only the spiders from a specific search engine see a specific page. This allows you to:
    • Never worry about your visitors seeing your doorway pages. Since they won't see them, you can design them strictly based on a particular engine's algorithm. They may make no sense at all, but the engine they were designed for will love them.
    • Never worry about someone stealing your rankings, because outsiders won't be able to see the HTML code that got the ranking.
    • Design your regular pages freely without fear of hurting your rankings.
    • Use multiple pages without worry of hitting any limit in any search engine's criteria (www.ipush.net for details).

As a side note: Yahoo! does not accept doorway pages.

To find all the nitty gritty details necessary to design effective doorway pages, search the Web with one of the many search engines using the keywords “doorway pages.” Or visit Spider-food.net for extensive information on doorway pages and other search engine goodies. Another great site is Spiderhunter.com where you can find a tutorial on cloaking techniques and free cloaking script for your use, plus many other interesting details about spiders.

Submitting Your Web Pages

The best practice (there are exceptions) is to submit each page of your website, individually, to all of the search engines and directories. You may think that submitting each and every page of your website is not necessary since some pages may have, for example, investor information or contact information. But every page that is listed is like an entry in a drawing — the more entries, the more chances you have.

Use a search engine optimization tool (previously discussed) for the submission process: you can submit some of your web pages by hand to specific search engines and automate the rest using web-positioning software such as Submitta.com or the previously mentioned Webpositioning Gold (www.website-promoters.com), or check out one of the many website promotion services available online.

If you want to try submitting your web pages yourself here is a guide:

  • Submit your main URL (i.e. http://yourdomainname.com) after you have finished designing your website.
  • Submit other important pages in weekly intervals and in very small batches (no more than 10 a day) since search engines are very sensitive to what they consider spamming.
  • A large website should first submit its most important and customer-centric web pages, keyword-wise, since it is easy for a website with 200 or so pages to hit its page limit (usually 50 or so pages) with search engines.
  • Once you have submitted all of your web pages, you need to re-submit not only each time you make substantial changes to a page, but also once every three or so months, following the procedure set out above.
  • Pages that are generated “on the fly” usually will not be indexed so don’t submit them.
  • If you have pages with frames, don’t submit them since most spiders will not crawl a page with frames (no matter what you might have read to the contrary).
  • Test and check to see how your website rates with the search engines after your submission procedure is completed.
  • Monitor your website listings regularly. Sometimes your listing can just disappear or some kind of error can cause the link to become bad, etc. When you find something wrong, re-submit that web page.

Pay attention to how your website is listed on a search engine. Does the listing identify what your website is and the products/services provided? To assure that your search engine listing provides the information the surfing public needs, use your title tag (e.g. <title>Best Online Widget Store in the Universe</title>). For the descriptive paragraph, search engines then use one of the following, depending on the engine: either your description tag (<META name="description" content="An online store with all the Widgets you would ever want.">) or the first 250 words (or so) of visible text on your site.
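A rough sketch of that listing logic, assuming the fallback behavior described above (the function name and field names are ours, not any engine's):

```python
def build_listing(title, description_meta, visible_text, word_limit=250):
    """Assemble a results listing: the title tag plus either the description
    meta tag or, failing that, the opening words of visible text."""
    summary = description_meta or " ".join(visible_text.split()[:word_limit])
    return {"title": title, "summary": summary}

listing = build_listing(
    "Best Online Widget Store in the Universe",
    "An online store with all the Widgets you would ever want.",
    "Welcome to our widget store, stocking every widget imaginable.")
print(listing["summary"])
```

The point of the sketch is the fallback: if the description meta tag is empty, the summary a surfer sees comes straight from your opening body text, which is why both need attention.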

Although it will take a little effort, it is important that you balance these tags. In other words, sometimes what you need to put in as a title or descriptive tag (to get a high ranking with a particular search engine) will not help you in your quest to have potential customers easily find your page through commonly used keyword searches.



The Complete E-Commerce Book, Second Edition: Design, Build & Maintain a Successful Web-based Business
ISBN: B001KVZJWC
Year: 2004
Pages: 159