I picked up an interesting topic in the last couple of days; it seems PC Magazine's John Dvorak made a post specifically mentioning 'Long Urls'.
"Long URLs are bogus! Tags like "nude" and "naked" are counterfeits! SEO is a big business, and from what I can tell its proponents are modern snake-oil salesmen."
John Dvorak : http://www.pcmag.com/article2/0,2817,2340694,00.asp
(just because he doesn't believe in SEO, I've helpfully put a 'rel=nofollow' on that link)
In the article, John explains how he changed the Url format of his Wordpress blog away from the typical CMS style. He goes on to say that his traffic fell, concludes "it does nothing", and from there dismisses the entire field of SEO as snake oil. He even takes the time to put the boot into Tag Clouds, saying they don't work either.
I'm here to tell you: he is wrong, plain wrong. Patricio Robles covers the likely explanation well in a reply post.
In that post, Robles explains how John may not have correctly redirected his old Urls to his new Urls using 301 redirects, or may not have given the change enough time to work.
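To make the 301 point concrete: a permanent redirect tells search engines the page has moved for good, so the old Url's ranking transfers to the new one. Here's a minimal sketch of the idea in Python — the old-to-new mapping and the new Url path are invented for illustration, and a real site would do this in the web server or CMS rather than in application code:

```python
# Hypothetical mapping of old database-key Urls to their new keyword Urls.
OLD_TO_NEW = {
    "/Default.aspx?tabid=65&EntryId=57": "/Blog/long-urls-and-seo",
}

def redirect_for(path):
    """Return a (status, location) pair for an incoming request path:
    a 301 to the new Url if we know the old one, otherwise a plain 200."""
    new_path = OLD_TO_NEW.get(path)
    if new_path:
        # 301 = Moved Permanently: search engines update their index
        # and carry the old Url's ranking over to the new address.
        return 301, new_path
    return 200, None
```

Without this step, the search engine treats the new Urls as brand-new pages with no history — which is exactly the traffic drop Dvorak saw.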
DNN Human Friendly or 'Long' Urls
Where I'd like to step in and help eliminate some of the fuss is to say this: Long Urls with keywords in the address absolutely work. DotNetNuke standard Urls are similar to the standard Wordpress Urls mentioned above. This blog post, without any sort of Url modification, would look like this: http://www.ifinity.com.au/Default.aspx?tabid=65&EntryId=57
There are several aspects to why good urls are important to Search Engine Optimisation strategies:
1. Keywords in Urls and in Anchor Text Matter
2. Clear Urls which signal content are more 'clickable'
3. Database Id ridden Urls are an invention of lazy programmers
I'll discuss each of these points separately.
Keywords in Urls and in Anchor Text Matter
First I'll cover why keywords in the link text are so important. It's because they convey, in an unambiguous way, what the destination Url is all about. You can only choose a few keywords, so generally they indicate the correct content of the destination page.
There's an old experiment that is useful in determining how much the text in a link matters when optimising for a particular keyword. It's very simple: just google the words 'click here'. I'll even put a link in for you (opens in new window): http://www.google.com/search?q=click+here
What comes up at the coveted number 1 spot? Why, it's the Adobe Acrobat Reader download page. If you search that page for 'click here', or look in the Url, or look in the page meta tags or description, you won't find the words 'click here' anywhere. So how does it rate in the #1 spot? Simple: millions of web pages around the world have a link which says something like 'Click Here to download Acrobat Reader'. Google interprets those links and determines that when it comes to 'click here', the Acrobat download page must be pretty important.
But you will have noted that the Url for the destination page doesn't contain the words 'click here'. So why do keywords in the Url matter?
Two reasons:
1. People are lazy copy/paste addicts.
2. You have to choose a limited number of words for your Url, so the choice of the Url conveys important meaning about the page.
The first reason is clearly displayed in this blog post. Instead of coming up with meaningful text for the links to the blog posts above, I just posted the Url and left it at that. Windows Live Writer does the heavy lifting and converts it into a link for me, as does Outlook, as does just about every rich text editor out there today. Chances are that most of the time you get a link to your site, you won't have control over the link text either. This next bit is important: because people are lazy and generally just post the Url as the link, you get de-facto optimisation of the link text by having your chosen keywords in the Url. It's a simple but often overlooked fact - you can't go wrong relying on the laziness of other people. Think of it this way: the DVD industry's business model relies on the fact that people are lazy and want to sit on their couch for hours on end. I think yours should too.
The second reason is that search engines very much use the content of the Url to rank pages for keywords. Again, you have a limited choice as to what to put in your Url, so search engines reasonably deduce that the contents of a page closely match it. A Url can't easily be spammed or faked. This is why there are duplicate-content penalties in search engines: otherwise we'd all post 100 Urls covering all possible search phrases for a page. By rewarding site owners for having canonical Urls, the search engines force us to pick one single set of keywords for a Url. And then, they know, the contents of the page closely match the Url.
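The canonical-Url idea can be sketched as a normalisation step: collapse the many trivially different spellings of an address onto one agreed form, so a search engine (or your own link checks) sees a single Url per page. A rough illustration in Python — the exact rules any real site applies will vary:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalise(url):
    """Map trivially different spellings of a Url onto one canonical form:
    lowercase the scheme and host, drop the default port, sort the query
    parameters, and trim any trailing slash from the path."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.endswith(":80"):
        host = host[:-3]          # :80 is the default http port
    path = parts.path.rstrip("/") or "/"
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunsplit((parts.scheme.lower(), host, path, query, ""))
```

With a rule like this, `HTTP://Example.com:80/Blog/?b=2&a=1` and `http://example.com/Blog?a=1&b=2` both resolve to the same single address, so the page's ranking isn't split across duplicates.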
I'll give you a concrete example: on November 10, 2008, I converted this blog to use the new Blog 3.5 release, which put keywords into the Urls. Comparing the period from August 10 to November 10 with November 11 to February 11 (handily, exactly 3 months ago), my average pageviews/day for this blog have increased by 55%. In that time I've done nothing else differently, and the majority of the visits are for old content - I've only managed 7 posts in that 3-month period. It's not properly scientific, but I'm confident that improved rankings and the extra clickability of better links account for at least half of the gains. Who wouldn't like a 55% increase in traffic?
But don't just take my word for it: even Matt Cutts (he of the Google anti-spam team) has search engine optimised Urls. I think he should know.
Clear Urls are more clickable
If your page is lucky enough to appear somewhere in the top search results for a phrase, the person doing the searching is going to be presented with a link to your page. The search engine will present that link, as is. The person looking at a list of links on a search results page then has to make a decision : which link to click on? Now, statistics and studies tell us that most people are going to click on the first result.
But if they don't, which out of the others are they going to try?
My experience with Google Adwords testing tells me that the text of the URL absolutely affects the click-through rate. I've run two ads with the same content but with different 'link text' (in Adwords you can type any old thing in for the Url Text, the actual link isn't visible to the visitor). And I'm here to tell you that links that look like they match the search criteria get clicked on more.
Why? I've no idea but I'm guessing the more plain and simple the link, the more people trust what's on the other end of it. Perhaps it's wrapped up in the psychology of labelling and branding - linked to why all pasta sauce tins are red and not white. Personally I don't care, but I use the information to spur me into creating clickable Urls for my sites.
Database Ids in Urls are the result of Lazy Programmers
I'm here to tell you I'm a lazy programmer. Any programmer who tells you they would not rather choose the quicker, easier route to a destination is probably not speaking the truth. That's why so much software has terrible user interfaces: the user is being forced to conform to the program's model, rather than the other way around. It's part of the reason the iPod is so successful: the UI assumes the user knows nothing about the internal file storage of the music. You hardly ever see a filename on an iPod: it always picks up the song title and artist and shows those instead.
In the early days of database-driven websites and content management systems, the main focus was getting the things to work. To expedite that, programmers took the quickest and easiest route to Urls: exposing the underlying database structure as part of the Url. No interpretation or logic required: send the table's unique id in, get the unique table record out. The really early stuff just put together a query based on the contents of the Url, which made those sites laughably easy targets for Sql injection attacks. Nowadays platforms like DNN resist Sql injection very well, but the early days of the architecture are still there: you get a page of a DNN database by giving it the TabId. The TabId is just an auto-generated number that the database provides. It's why most DNN sites have a Home page with TabId 36 - because the Home page is the 36th record created on a new install.
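The difference between those really early query-from-the-Url sites and a hardened platform comes down to parameterisation. A generic sketch in Python with SQLite - not DNN's actual code; the table and values are invented to mirror the TabId example:

```python
import sqlite3

# Toy stand-in for a CMS pages table, keyed by an auto-generated id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Tabs (TabId INTEGER PRIMARY KEY, TabName TEXT)")
conn.execute("INSERT INTO Tabs VALUES (36, 'Home')")

def get_tab_unsafe(tab_id_from_url):
    # DANGEROUS: the raw Url value is pasted straight into the SQL text,
    # so a crafted tabid parameter can inject arbitrary SQL into the query.
    sql = "SELECT TabName FROM Tabs WHERE TabId = " + tab_id_from_url
    return conn.execute(sql).fetchone()

def get_tab_safe(tab_id_from_url):
    # SAFE: the value is bound as a parameter, so it is only ever treated
    # as data, never as executable SQL, whatever the Url contains.
    sql = "SELECT TabName FROM Tabs WHERE TabId = ?"
    return conn.execute(sql, (tab_id_from_url,)).fetchone()
```

Feed both functions a value like `0 OR TabName = 'Home'` and the unsafe one happily rewrites its own WHERE clause, while the parameterised one simply matches nothing.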
So the only person that benefits from having database Ids in the Url is the programmer. Not the search engines, not the site owner, nobody else. If the iPod had come out with an interface that showed '01 - song.mp3' in the list, it would never have caught on. You shouldn't do this with your website either.
Long Urls with Keywords are an SEO Must Have
To recap, if you have a long Url with keywords accurately depicting the page content, you're better off than if you have a Url stuffed with numbers and other database-generated keys. It's better for the users, and what's better for the users helps you in your search engine optimisation efforts.
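Generating that kind of keyword Url from a page title is simple enough to sketch. A minimal version in Python - any real rewriting module will have its own, more careful rules for duplicates, reserved characters and length limits:

```python
import re

def slugify(title):
    """Turn a page title into a keyword-carrying Url segment:
    lowercase it, replace runs of non-alphanumeric characters with
    hyphens, and trim stray hyphens from the ends."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")
```

So a title like 'Why Long Urls are Better for SEO' yields a path segment of why-long-urls-are-better-for-seo - keywords a search engine and a human can both read, instead of a bare EntryId.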
And I'm not saying this just because I distribute a DotNetNuke extension to help transform the Urls of your DNN site: the whole reason for building it in the first place was to get better results for my own site. That it caught on in popularity was a happy by-product.
If you use the Url Master module for your DotNetNuke website, you will get better SEO results. I get emails all the time from happy customers telling me their site is performing better. You won't have the problems that John Dvorak had, because it automatically 301 redirects all your old urls. It's just a pity he chose Wordpress over DNN, I guess.
Your say: have you got an anecdotal story about how putting in human-friendly Urls helped your site? Please share in the comments below.