Performics’ ABC’s to Spider-Friendly URL Structures

March 9, 2009

Posted by Sam Battin, Senior Search Specialist (Natural Search)

1. Avoid Dynamic Parameters

Dynamic parameters are character strings, typically appended to a URL for customer tracking, that change with each unique user. They often appear as jumbled text, such as www.yoursite.com/campaigns/printer_promo?mco=MTE3Mzc.

Dynamic parameters cause great confusion for search engine spiders. Because the parameter value changes with each visitor, the same page can surface under many distinct URLs, and spiders often index this duplicate content. That not only wastes the spiders’ time, but also distracts them from indexing other crucial pages on an advertiser’s Web site.

Advertisers should consider alternative tracking solutions such as cookies. Cookies store pertinent information that a Web site needs to remember, such as a visitor’s name, without altering the URL. If an advertiser needs a workaround, a robots.txt file can prevent indexation of pages with dynamic URLs, as sketched below.
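A minimal robots.txt sketch, assuming the wildcard syntax the major engines’ spiders support (wildcards are an extension, not part of the original robots.txt standard, so support should be verified for each spider):

    User-agent: *
    # Block crawling of any URL that carries a query string (e.g., ?mco=MTE3Mzc)
    Disallow: /*?

With tracking moved to a cookie, the URL itself stays clean; for example, the server could send the value from the URL above in a header instead (Set-Cookie: mco=MTE3Mzc; Path=/), and spiders would see only www.yoursite.com/campaigns/printer_promo.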

2. Do Advertisers Benefit More from Sub-Domains or Sub-Directories?

The answer is neither. A sub-domain is a DNS (domain name system) alias, such as dvds.yoursite.com, while a sub-directory is a folder on an advertiser’s Web site, such as www.yoursite.com/dvds. Neither structure has a greater natural search advantage; search engine spiders treat the two exactly the same with regard to visibility, as the sketch below illustrates. Therefore, advertisers do not need to worry about which structure is the better option for spiders.
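A hedged illustration of the distinction, assuming a BIND-style zone file and a typical document root (both hypothetical details; a sub-domain may equally be an A record, and document-root layouts vary by host):

    ; yoursite.com zone file — the sub-domain is just another DNS record
    dvds    IN  CNAME   www.yoursite.com.

    # the sub-directory is just a folder under the existing site
    /var/www/yoursite/dvds/index.html   ->   www.yoursite.com/dvds

Either way, spiders simply crawl whatever pages the URL resolves to.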

Nevertheless, advertisers should consider the length of their sub-directory paths. It’s best to keep them short and concise (www.yoursite.com/dvds/horror). Search engine spiders prefer flat directory structures, which are much easier to crawl. In a perfect world, every file would sit directly within the root domain; since that is rarely possible or practical, the shorter the path, the better.

3. Create Logical Content Divisions

If a flat sub-directory structure is not possible, advertisers should, at the very least, ensure that their sub-directories are arranged in a logical manner. Additionally, advertisers should use terms that clearly identify the content in each sub-directory.

For example, a sub-directory structure such as www.yoursite.com/televisions/plasma is intuitive to search engine spiders, since each sub-directory contains descriptive words arranged in a logical order (broad to specific).

However, a directory structure such as www.yoursite.com/contents/us/plasma/browse/home/shop/televisions is very confusing to spiders: the sub-directory names are too generic to convey meaning, and they are not arranged in a logical order.
