Search Engine Friendly URLs - URL Rewriting



Article by Thomas A. Powell and Joe Lima from Port80 Software

Towards Next Generation URLs

For many years we have heard about the impending death of URLs that are difficult to type, remember and preserve. The use of URLs has actually improved little thus far, but changes are afoot in both development practices and web server technology that should help advance URLs to the next generation.

Dirty URLs

Complex, hard-to-read URLs are often dubbed dirty URLs because they tend to be littered with punctuation and identifiers that are at best irrelevant to the ordinary user. URLs such as http://www.example.com/cgi-bin/gen.pl?id=4&view=basic are commonplace in today's dynamic web. Unfortunately, dirty URLs have a variety of troubling aspects, including:

Dirty URLs are difficult to type

The length, punctuation, and complexity of these URLs make typos commonplace.

Dirty URLs do not promote usability

Because dirty URLs are long and complex, they are difficult to repeat or remember and provide few clues for average users as to what a particular resource actually contains or the function it performs.

Dirty URLs are a security risk

The query string that follows the question mark (?) in a dirty URL is often modified by hackers attempting a front-door attack on a web application. The file extensions used in complex URLs, such as .asp, .jsp, .pl, and so on, also give away valuable information about the implementation of a dynamic web site that a potential hacker may exploit.

Dirty URLs impede abstraction and maintainability

Because dirty URLs generally expose the technology used (via the file extension) and the parameters used (via the query string), they do not promote abstraction. Instead of hiding such implementation details, dirty URLs expose the underlying "wiring" of a site. As a result, changing from one technology to another is a difficult and painful process filled with the potential for broken links and numerous required redirects.


Why Use Dirty URLs?

Given the numerous problems with dirty URLs, one might wonder why they are used at all. The most obvious reason is simply convention - using them has been, and so far still is, an accepted practice in web development. This fact aside, dirty URLs do have a few real benefits, including:

They are portable

A dirty URL generally contains all the information necessary to reconstruct a particular dynamic query. For example, consider how a query for "web server software" appears in Google...

http://www.google.com/search?hl=en&ie=UTF-8&oe=UTF-8&q=web+server+software

Given this URL, you can rerun the query at any time in the future. Though difficult to type, it is easily bookmarked.

They can discourage unwanted reuse

The negative aspects of a dirty URL can be regarded as positive when the intent is to discourage the user from typing a URL, remembering it, or saving it as a bookmark. The intimidating look and length of a dirty URL can be a signal to both user and search engine to stay away from a page that is bound to change. This is often simply a welcome side effect, rather than a conscious access control policy - frequently nothing is done to prevent actual use of the URL by means of session variables or referring URL checks.


Cleaning URLs

The disadvantages of dirty URLs far outweigh their advantages in most situations. If the last 30 or 40 years of software development history are any indication of where development for the web is headed, abstraction and data hiding will inevitably increase as web sites and applications continue to grow in complexity. Thus, web developers should work toward cleaner URLs by using the following techniques:

Keep them short and sweet

The first path to better URLs is to design them properly from the start. Try to make the site directories and file names short but meaningful. Obviously, /products is better than /p, but resist the urge to get too descriptive. Having www.xyz.com/productcatalog doesn't add much meaning (if a user looks for a product catalog, they might well expect to find it at or near the top-level products page), but it does needlessly restrict what the page can reasonably contain in the future. It's also harder to remember or guess at. Shoot for the shortest identifiers consistent with a general description of the page's (or directory's) contents or function.

Avoid punctuation in file names

Often designers use names like product_spec_sheet.html or product-spec-sheet.html. The underscore is often difficult to notice and type, and these connectors are usually a sign of a carelessly designed site structure. They are only required because the last rule wasn't followed.

Admin Note: We agree that using the underscore in file naming structures is not user friendly. We recommend separating words in file names with hyphens (-).

Use lower case and try to address case sensitivity issues

Given the last tip, you might instead name a file ProductSpecSheet.html. However, casing in URLs is troublesome because, depending on the web server's operating system, file and directory names may or may not be case sensitive. For example, http://www.xyz.com/Products.html and http://www.xyz.com/products.html are two different files on a UNIX system but the same file on a Windows system. Add to this the fact that www.xyz.com and WWW.XYZ.COM are always the same domain, and the potential for confusion becomes apparent. The best solution is to make all file and directory names lowercase by default and, in a case-sensitive server operating environment, to ensure that URLs will be correctly processed no matter what casing is used. This is not easy to do under Apache on Unix/Linux systems, although URL rewriting and spellchecking can help (discussed on page 2 of this article).
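
On Apache, two modules can paper over casing mismatches when the file system is case sensitive. The sketch below is only a minimal illustration under stated assumptions (an Apache 2.x server with mod_speling and mod_rewrite enabled, configured at the server or virtual host level), not a drop-in configuration.

    # Let Apache correct case mismatches (and single-character misspellings)
    # in requested file names -- requires mod_speling
    CheckSpelling On

    # Alternatively, redirect any request containing uppercase letters to an
    # all-lowercase path using mod_rewrite's built-in tolower map
    RewriteEngine On
    RewriteMap lowercase int:tolower
    RewriteCond %{REQUEST_URI} [A-Z]
    RewriteRule (.*) ${lowercase:$1} [R=301,L]

The external redirect keeps one canonical, lowercase form of each URL in bookmarks and search engine indexes instead of silently serving mixed-case duplicates.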

Do not expose technology via directory names

Directory names commonly or easily associated with a given server-side technology unnecessarily disclose implementation details and discourage permanent URLs. More generic paths should be used: for example, instead of /cgi-bin or /javascript, use a /scripts directory; instead of /css, use /styles; and so on.
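
On Apache, mod_alias makes it straightforward to publish generic paths while keeping the real directories wherever they happen to live. A small sketch; the file system paths below are illustrative assumptions, not taken from the article.

    # Expose CGI scripts under a generic /scripts path rather than /cgi-bin
    ScriptAlias /scripts/ /usr/lib/cgi-bin/

    # Serve stylesheets under /styles regardless of where they sit on disk
    Alias /styles/ /var/www/site/css/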

Plan for host name typos

The reality of end user navigation is that around half of all site traffic comes from direct type-in or bookmarked access. If users want to go to Amazon's web site, they know to type in www.amazon.com. However, accidentally typing ww.amazon.com or wwww.amazon.com is fairly easy if a user is in a hurry. Adding a few entries to a site's domain name service to map w, ww, and wwww to the main site, as well as the common www.site.com and site.com, is well worth the few minutes required to set them up.
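
Once those DNS entries exist, the web server still has to accept the typo host names and, ideally, bounce visitors to the canonical one. The following is a minimal Apache sketch, not the article's own configuration; the xyz.com names are the article's placeholders and mod_rewrite is assumed to be enabled.

    <VirtualHost *:80>
        ServerName www.xyz.com
        # Accept the bare domain and the common "www" typos mapped in DNS
        ServerAlias xyz.com w.xyz.com ww.xyz.com wwww.xyz.com

        # Send any non-canonical host to www.xyz.com, preserving the path
        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www\.xyz\.com$ [NC]
        RewriteRule (.*) http://www.xyz.com$1 [R=301,L]
    </VirtualHost>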

Plan for domain name typos

If possible, secure common "fat finger" typos of domain names. Given the proximity of the "z" and "x" keys on a standard computer QWERTY keyboard, it is no wonder Amazon also has contingency domains like amaxon.com. Google allows for such variations as gooogle.com and gogle.com. Unfortunately, many web traffic aggregators will purchase the typo domains for common sites, but most organizations should find some of their typo domains readily available. Organizations with names that are difficult to spell, like "Ximed," might want to have related domains like "Zimed" or "Zymed" for users who know the name of the organization but not the correct spelling. The particular domains needed for a company should reveal themselves during the course of regular offline correspondence with customers.

Support multiple domain forms

If an organization has many forms to its name, such as International Business Machines and IBM, it is wise to register both forms. Some companies will register their legal form as well, so XYZ, LLC or ABC, Inc. might register xyzllc.com and abcinc.com in addition to their primary domains. While this seems like a significant investment, if you use one of the new breed of low-cost registrars, the yearly price for numerous domains is quite reasonable. Given alternate domain extensions like .net, .org, .biz and so on, the question arises: where to stop? Anecdotally, the benefits drop off significantly with the newer alternate extensions (like .biz, .cc, and so on), so it is better to stick with the common .com form and any regional domains that are appropriate (e.g. co.uk).

Add guessable entry point URLs

Since users guess domain names, it is not a stretch for users - particularly power users - to guess directory paths in URLs. For example, a user trying to find information about Microsoft Word might type http://www.microsoft.com/word. Mapping multiple URLs to common guessable site entry points is fairly easy to do. Many sites have already begun to create a variety of synonym URLs for sections. For example, to access the careers section of the site, the canonical URL might be http://www.xyz.com/careers. However, adding in URLs like http://www.xyz.com/career, http://www.xyz.com/jobs, or http://www.xyz.com/hr is easy and vastly improves the chances that the user will hit the target. You could even go so far as to add hostname remapping so that http://investor.xyz.com, http://ir.xyz.com, http://investors.xyz.com, and so on all go to http://www.xyz.com/investor. The effort made to think about URLs in this fashion not only improves their usability, but should also promote long term maintainability by encouraging the modularization of site information.
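
Wiring the synonym paths and host names to their canonical targets is a one-time server configuration task. Below is a minimal sketch using Apache's mod_alias, with the xyz.com URLs taken from the article's examples; it is illustrative only.

    # Path synonyms all redirect to the canonical careers section
    RedirectMatch permanent ^/career/?$ http://www.xyz.com/careers
    RedirectMatch permanent ^/jobs/?$   http://www.xyz.com/careers
    RedirectMatch permanent ^/hr/?$     http://www.xyz.com/careers

    # Host name synonyms for the investor relations section
    <VirtualHost *:80>
        ServerName investor.xyz.com
        ServerAlias ir.xyz.com investors.xyz.com
        Redirect permanent / http://www.xyz.com/investor/
    </VirtualHost>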

Where possible, remove query strings by pre-generating dynamic pages

Often, complex URLs like http://www.xyz.com/press/releasedetail.asp?pressid=5 result from an inappropriate use of dynamic pages. Many developers use server-side scripting technologies like ASP/ASP.NET, ColdFusion, PHP, and so on to generate "dynamic" pages which are actually static. In the previous URL, for example, the ASP script pulls press release content from a database using a primary key of 5 and generates a page. In nearly all cases, however, this type of page is static in both content and presentation. Generating the page dynamically at view time wastes precious server resources, slows the page down, and adds unnecessary complexity to the URL. Some dynamic caches and content distribution networks will alleviate the performance penalty, but the unnecessarily complex URLs remain. It is easy to pre-generate such a page in its static form and clean its URL. Thus, http://www.xyz.com/press/releasedetail.asp?pressid=5 might become http://www.xyz.com/press/pressrelease5, or something more descriptive like http://www.xyz.com/press/2003-07-04 - or, better still, http://www.xyz.com/press/newproduct. The question of when to generate a page, at request time or beforehand, is not much different from the question of whether a program should be interpreted or compiled.
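
Where full pre-generation is impractical, the same clean URLs can still be served by leaving the script in place and rewriting the friendly path to it behind the scenes - the technique taken up on page 2. A minimal mod_rewrite sketch, assuming a hypothetical lookup file pressmap.txt that pairs descriptive names with press release IDs:

    # pressmap.txt (hypothetical), one "name id" pair per line, e.g.:
    #   newproduct 5
    #   2003-07-04 5
    RewriteEngine On
    RewriteMap press txt:/etc/apache2/pressmap.txt

    # /press/newproduct is rewritten internally to the existing ASP script
    RewriteRule ^/press/([A-Za-z0-9-]+)$ /press/releasedetail.asp?pressid=${press:$1} [L]

Visitors and search engines only ever see the clean form, and the .asp implementation detail stays hidden and can be replaced later without breaking links.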


This is a two page article on URL Rewriting and Search Engine Friendly URLs. The next section discusses URL Rewriting and Content Negotiation.
