views: 35
answers: 2
Hi all,

I have done some refactoring on an ASP.NET MVC application that is already deployed to a live web site. Among the changes was moving functionality to a new controller, which caused some URLs to change. Shortly afterwards, the various search engine robots started hammering the old URLs.

What is the right way to handle this in general?

  • Ignore it? In time the search engines should figure out that they get nothing but 404 from the old URLs.
  • Block the old URLs with robots.txt?
  • Continue to catch the old URLs and redirect to the new ones? Users navigating the site would never hit the redirects, as the URLs are updated throughout the new version of the site. I see it as garbage code - unless it could be handled by some fancy routing?
  • Other?

As always, all comments welcome...

Thanks, Anders, Denmark

+1  A: 

Here is a very good article on this exact subject: http://www.codeproject.com/KB/aspnet/webformmvcharmony.aspx. There is a section called "Handling Legacy URLs". The beauty of the approach is that existing users who have bookmarked the old URL can still use their old links, but a redirect is sent to their browser with a "301 Moved Permanently" status code, which tells the browser that a permanent redirect is occurring. It is up to the browser to honour this code or not, and support for it varies, but whatever happens, the user will see the new MVC version of your page.
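To make the idea concrete, here is a minimal sketch of what such a legacy redirect can look like. The controller and action names are hypothetical placeholders; `RedirectToActionPermanent` is available from ASP.NET MVC 3 onwards (earlier versions need a custom `ActionResult`, as in the linked article):

```csharp
using System.Web.Mvc;

// Hypothetical controller kept only to catch the old URLs.
// Old URL: /OldProducts/Details/5 -> new URL: /Products/Details/5
public class OldProductsController : Controller
{
    public ActionResult Details(int id)
    {
        // Issues a 301 Moved Permanently, so browsers and search
        // engines learn the new location and update their indexes.
        return RedirectToActionPermanent("Details", "Products", new { id });
    }
}
```

This keeps the old routes alive without duplicating any real logic; each legacy action is a one-liner pointing at the new location.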

Daniel Dyson
The nuggets were deep in the article - but they were there! Thanks for the answer and the reference.
Anders Juul
You are welcome
Daniel Dyson
+1  A: 

The answer depends on how important usability and SEO are to you and your site. I have added 301 redirects for old routes by adding the old action methods back in and doing 301 redirects to the new URLs. You might also consider re-submitting your sitemap to the search engines if you are concerned with SEO.

KOTJMF
Hi KOTJMF, I'll follow your and Daniel's advice - thanks for taking the time!
Anders Juul
You're welcome :)
KOTJMF