Wednesday, June 15, 2005

SEO Principles

The following is an excerpt from Infoworld.com

What are some basic SEO principles?

A Web site built with users in mind is likely to fare well with the software "spiders" that search engines use to index content and rank sites, Thurow says. The three building blocks of such a site are:

-- Text: Use content that includes words and phrases your company's target audience is likely to type into search queries. If you're selling mountain bikes, that term should be everywhere -- in page headings, navigation buttons, photo captions and the like (a rough way to check this is sketched after this list).

-- Links: A site's navigation scheme should be accessible, coherent and consistent so that spiders and humans can easily traverse it.

-- Popularity: A good site will prompt others to link to it. External links from reputable sites will enhance your Web site's ranking in search engines.
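The first two building blocks can be checked mechanically. The following is a minimal sketch, not a real SEO tool: the sample page, the target phrase "mountain bike", and the elements it inspects (title, headings, image alt text, plain anchor links) are all assumptions made for the example, and it uses only Python's standard library.

# Rough audit of the "Text" and "Links" building blocks: does the target
# phrase appear in the title, headings and image alt text, and which plain
# <a href> links could a spider follow? Sample HTML and phrase are invented.
from html.parser import HTMLParser

class SeoAudit(HTMLParser):
    def __init__(self, phrase):
        super().__init__()
        self.phrase = phrase.lower()
        self.hits = {"title": False, "headings": False, "alt text": False}
        self.links = []          # crawlable <a href="..."> targets
        self._context = None     # element whose text we are currently inside

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._context = "title"
        elif tag in ("h1", "h2", "h3"):
            self._context = "headings"
        elif tag == "img" and self.phrase in attrs.get("alt", "").lower():
            self.hits["alt text"] = True
        elif tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])

    def handle_endtag(self, tag):
        if tag in ("title", "h1", "h2", "h3"):
            self._context = None

    def handle_data(self, data):
        if self._context and self.phrase in data.lower():
            self.hits[self._context] = True

page = """<html><head><title>Acme Mountain Bikes</title></head>
<body><h1>Mountain bikes for every trail</h1>
<img src="bike.jpg" alt="red mountain bike on a trail">
<a href="/mountain-bikes/hardtail">Hardtail mountain bikes</a></body></html>"""

audit = SeoAudit("mountain bike")
audit.feed(page)
print(audit.hits)    # {'title': True, 'headings': True, 'alt text': True}
print(audit.links)   # ['/mountain-bikes/hardtail']

A page that fails such a check -- for example, navigation rendered only as images or scripts with no plain links -- is harder for both spiders and users to traverse.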

Why should a company bother to use SEO if it can buy pay-per-click ads?

A company can always buy pay-per-click (PPC) ads on search engines. These ads appear whenever users query specific keywords. However, JupiterResearch has found that the average company gets about 80 percent of its commercial search engine referrals from organic results and the rest from PPC ads. So while PPC ads can complement organic search results, particularly when doing seasonal promotions, they are no substitute.

How much should a company expect to spend on SEO consulting work?

It depends on the size of the Web site and the condition it's in, says Chris Winfield, president and cofounder of search engine marketer 10e20. Most of his company's clients sign up for a year's service, paying between US$5,000 and $15,000 per month, plus sometimes an initial, one-time fee, he says.

What should CMOs look for when evaluating SEO consultants?

In addition to checking the obvious -- references, expertise and resources -- it's critical to hire a consultant who abides by a search engine's SEO guidelines. Avoid search engine marketing firms that offer to elevate rankings for a low, one-time fee, since the only way to legitimately guarantee top positioning is with a PPC ad. Unscrupulous firms typically try to trick search engine spiders by employing so-called black hat techniques, such as stuffing a Web site with hidden keywords. When a search engine detects such a trick, it will ban the site from its index. Then no amount of SEO will make it appear.

2 comments:

  1. Eye-tracking studies have shown that people using search engines scan a results page from top to bottom and left to right, looking for a relevant result. Placement at or near the top of the rankings therefore increases the number of searchers who will visit a site. However, more search engine referrals do not guarantee more sales. SEO is not necessarily an appropriate strategy for every website, and other Internet marketing strategies can be much more effective, depending on the site operator's goals. A successful Internet marketing campaign may drive organic search traffic to a site's pages, but it may also involve the use of paid advertising on search engines and other pages, building high-quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs so site owners can measure their successes, and improving a site's conversion rate.
    SEO may generate a return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Because of this uncertainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.
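    To make the conversion-rate point concrete: it is just completed conversions divided by visits, so an analytics program only needs those two counts per channel. The sketch below uses invented figures purely for illustration -- the visit counts, order counts, ad spend and order value are not from any real site.

# Illustrative only: invented one-month figures showing why more referrals do
# not automatically mean more sales -- conversion rate and cost per channel
# matter as much as raw volume.
visits = {"organic": 8000, "ppc": 2000}        # referrals per channel
orders = {"organic": 120,  "ppc": 60}          # completed purchases
ad_spend = {"organic": 0.0, "ppc": 3000.0}     # what each channel cost
avg_order_value = 90.0                         # revenue per order

for channel in visits:
    conversion_rate = orders[channel] / visits[channel]
    net_revenue = orders[channel] * avg_order_value - ad_spend[channel]
    print(f"{channel}: {conversion_rate:.1%} conversion rate, "
          f"${net_revenue:,.0f} net revenue")

# organic: 1.5% conversion rate, $10,800 net revenue
# ppc: 3.0% conversion rate, $2,400 net revenue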

  2. There are many significant factors that play a role in how major engines such as Google rank websites (an illustrative sketch of how such signals might combine follows this list):

    age of the domain (including how long the site has been online, the age of individual pages, and measures of how frequently content is updated); overall there appears to be a bias in favour of sites that have existed for a few years
    quantity (including the number of pages/files, the number of words in aggregate, and the number of words per page)
    availability (is the content available 24/7?)
    positive reputation (the number of citations by other sites, particularly citations by sites that themselves score highly on age, quantity and so forth)
    technical negatives (non-compliant code, broken outgoing links, conflicts between page titles and page content, indications that metatags have been heavily 'optimised' through recurrent use of words such as sex or adult, use of 'invisible' text)
    reputation negatives (outgoing links that point to sites with a low reputation, files that feature illegal content or malware, participation in commercial link-exchange schemes, sudden spurts in links to sites with a poor reputation)
    quality indicators (uniqueness of content, inclusion of bibliographic material and of automatically verifiable contact details)
    measures of user satisfaction (e.g. click-through from the initial entry page to other pages on the site, time spent on the site, correlation between free and paid-placement search results)
    user demographics (matching site content with information about users)
    auspices (weighting in favour of recognised publishers, government agencies and professional organisations)
    IP addresses (weighting against ISPs/ICHs perceived as permissive toward spammers, and against address blocks with an unusually high number of low-reputation sites)
    keywords (in particular keywords that appear in natural syntax rather than at an unnatural frequency or scattered at random to subvert the algorithm)
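    Nobody outside the search engines knows how these signals are actually weighted or combined, so the sketch below is purely illustrative: the signal names, weights and sample pages are invented, and it only shows the general shape the list describes -- positive signals add to a score, negative signals subtract, and results are ordered by the total.

# Hypothetical weights and signals -- not any real engine's algorithm, just an
# illustration of how positive and negative factors like those listed above
# could combine into a single score used to order results.
WEIGHTS = {
    "domain_age_years":     2.0,   # bias toward sites online for a few years
    "inbound_citations":    0.5,   # reputation: links from other sites
    "content_uniqueness":  10.0,   # quality indicator, scored 0.0 .. 1.0
    "technical_negatives": -3.0,   # broken links, over-'optimised' metatags, ...
    "bad_neighbourhood":   -5.0,   # outgoing links to low-reputation sites
}

def score(signals):
    """Weighted sum of whatever signals were measured for a page."""
    return sum(WEIGHTS[name] * value for name, value in signals.items())

pages = {
    "established-site.example": {
        "domain_age_years": 6, "inbound_citations": 40,
        "content_uniqueness": 0.9, "technical_negatives": 1,
    },
    "new-spammy-site.example": {
        "domain_age_years": 0.2, "inbound_citations": 5,
        "content_uniqueness": 0.1, "technical_negatives": 8,
        "bad_neighbourhood": 3,
    },
}

for url in sorted(pages, key=lambda u: score(pages[u]), reverse=True):
    print(f"{score(pages[url]):7.1f}  {url}")

# Output:
#    38.0  established-site.example
#   -35.1  new-spammy-site.example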
