When Good Forms Go Bad


Posted by Brad Beiter, Senior Program Manager (Natural Search)

Many Web sites require users to input data into forms in order to advance to the next page and get more information. This may mean requiring the user to choose a product from a JavaScript drop-down menu or to type a ZIP code into a field to find a store location. While these forms are easy for users to navigate, search engine spiders generally can’t input data into forms to get past them. Unless the spider has another path around the form, the pages behind it won’t be indexed. This is a common problem for large retailers with many product offerings and/or store locations behind forms. As search becomes more geo-targeted, making sure spiders can index store locator pages is particularly important. Here are a few tips on creating search-engine-friendly Web forms:

· Alternate HTML Navigation – Creating an HTML path through the form for the spider works well for sites that don’t have too many pages behind the form. For instance, you could create alternate navigation in an HTML sitemap on your own site. As long as the sitemap is clearly linked from the footer of your homepage, it can lead the spider to different product areas, different versions of the site (such as by country) or different store/campus locations (see the first sketch after this list).

· XML Sitemap – For sites with many pages behind forms, such as a retailer with thousands of locations, creating an XML Sitemap of all the URLs and submitting it to the engines is the best strategy. In this case, each individual page needs its own unique URL (see the second sketch after this list).
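To make the first tip concrete, here is a minimal sketch in Python of how a crawlable HTML sitemap page might be generated for a store locator that otherwise sits behind a ZIP-code form. The store names, paths and output file name are all hypothetical, not taken from any particular site.

```python
# Hypothetical sketch: generate a plain HTML sitemap page whose links a
# spider can follow, in place of a ZIP-code form it cannot submit.
from html import escape

# Hypothetical store pages; in practice these would come from your database.
STORE_PAGES = [
    ("Chicago - State Street", "/stores/chicago-state-street/"),
    ("New York - 5th Avenue", "/stores/new-york-5th-avenue/"),
    ("Seattle - Pike Place", "/stores/seattle-pike-place/"),
]

def build_sitemap_html(pages):
    """Return a simple HTML page with one plain <a> link per store page."""
    items = "\n".join(
        f'    <li><a href="{escape(url)}">{escape(name)}</a></li>'
        for name, url in pages
    )
    return (
        "<html>\n<head><title>Store Locations</title></head>\n<body>\n"
        "  <h1>Store Locations</h1>\n"
        f"  <ul>\n{items}\n  </ul>\n"
        "</body>\n</html>\n"
    )

if __name__ == "__main__":
    # Write the page; linking to it from the homepage footer lets spiders find it.
    with open("store-sitemap.html", "w", encoding="utf-8") as f:
        f.write(build_sitemap_html(STORE_PAGES))
```

For the second tip, the sketch below, again in Python with a hypothetical domain and paths, writes a sitemap.xml in the standard Sitemaps protocol format from a list of unique store URLs; the resulting file is what you would submit to the engines.

```python
# Hypothetical sketch: write an XML Sitemap (sitemaps.org protocol) listing
# one unique URL per store location page.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Hypothetical domain and store paths; a large retailer would pull thousands
# of these from its location database.
BASE_URL = "https://www.example.com"
STORE_PATHS = [
    "/stores/chicago-state-street/",
    "/stores/new-york-5th-avenue/",
    "/stores/seattle-pike-place/",
]

def build_sitemap(base_url, paths):
    """Return an ElementTree for a <urlset> with one <url>/<loc> per page."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for path in paths:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = base_url + path
    return ET.ElementTree(urlset)

if __name__ == "__main__":
    tree = build_sitemap(BASE_URL, STORE_PATHS)
    # Submit the resulting file to the engines (e.g. through their webmaster tools).
    tree.write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```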

More information on optimizing store locator forms.

