Making the SEO Transition to BlogEngine.NET

It was important for me to make a speedy and complete SEO transition of my four years of Community Server blog posts to BlogEngine.NET.  If tonight you went to Google and searched on “” you’d find a nice library of posts that click clean and true to their current location.  The site index is still building, but everything seems to be populating and promulgating according to the Master Plan.

I wanted to share a bit of the BlogEngine.NET SEO setup process, with a few pointers on transitioning from Community Server thrown in.  Before I cover the setup I'll answer your question about what I did for URL redirection.  The answer is "not much."  I redirected my /blog/rss.aspx and /mainfeed.aspx CS feeds to BE.NET's /blog/syndication.axd to make sure subscribers were unaffected, but really nothing beyond that.  Ignoring redirection might draw some criticism, but my focus was on updating all site URLs as quickly as possible and simply not worrying about the rest.  I mean, I love Microsoft, but how many times have you ended up on a "this item no longer exists" page?  One thing I did do was create a friendly and functional 404 welcome page that displays whenever a visitor hits a dead link anywhere on the site.
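For the feed redirects, a permanent (301) redirect is what keeps subscribers and search engines pointed at the new feed. On IIS 7 one way to sketch this is with the httpRedirect element in web.config (an illustration only; the location path assumes the old feed lived at the site root, and it requires the IIS HTTP Redirect module):

```xml
<!-- Sketch: 301-redirect an old Community Server feed URL to the
     BlogEngine.NET feed handler. Repeat a <location> block per old URL. -->
<location path="mainfeed.aspx">
  <system.webServer>
    <httpRedirect enabled="true"
                  destination="/blog/syndication.axd"
                  httpResponseCode="Permanent" />
  </system.webServer>
</location>
```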



This display of a default 404 page is actually built into BlogEngine.NET, which I think is pretty impressive.  I simply customized the .ASPX a bit.  To ensure that the page displays on all 404s I added the following to my root web.config.


<customErrors mode="RemoteOnly" defaultRedirect="/blog/error404.aspx">
    <error statusCode="403" redirect="/blog/error404.aspx"/>
    <error statusCode="404" redirect="/blog/error404.aspx"/>
</customErrors>


The next issue was to update my sitemap.  BlogEngine.NET is very SEO savvy and handles the creation of a sitemap as well.  The BE.NET sitemap is served at SiteMap.axd; there's no physical page, but it is referenced in the BE.NET robots.txt file.  To activate it, uncomment the #sitemap line and insert your domain name.  If your BE.NET blog was at the root, your SEO work would be complete.  But my blog is at /blog and I'm updating my urls from another blogging application, so I'm not quite done yet.
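Once uncommented, the sitemap line in robots.txt looks something like this (example.com is a placeholder for your own domain):

```
Sitemap: http://example.com/blog/SiteMap.axd
```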

The next step was to create a SiteMap.aspx page in my site root directory.  I found a C# template in the Google Webmaster area and tweaked it a bit.  Here's my final version source.  Then I submitted it as a Sitemap in my Google Webmaster account.  I also submitted my /blog/SiteMap.axd, after which my submitted url count started spinning up.
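Whichever route you take, the output follows the standard sitemaps.org XML format, which is what Google Webmaster Tools expects. A minimal entry looks like this (the URL and values below are made-up examples, not from my actual sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/blog/post/sample-post.aspx</loc>
    <lastmod>2008-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```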

The robots.txt played a big factor in ringing out the old and ringing in the new.  There are a number of directories that no longer exist on DBVT2008, so I wanted to purge them from Google’s index.  An important step was disallowing /blog/archive, which was the root path of all of my old Community Server post urls. 

Here is my current root directory robots.txt file.

User-Agent: *
Allow: /
Allow: /blog
Disallow: /x
Disallow: /images/
Disallow: /photos/
Disallow: /blog/archive/
Disallow: /csbits/
Disallow: /running/
Disallow: /itunes/

The final step in purging the old urls was using the "Remove URLs" service in Google's Webmaster Tools under the "Tools" menu, where I entered the Disallowed paths containing old site content and post addresses that no longer existed.  Within 24 hours of submitting the Remove URLs requests, the old Community Server posts were no longer being indexed in Google.

I hope this post helps you set up SEO with BE.NET. I'm really happy with the results and how I am now BE-ing seen in Google.  BE-ing seen.  HAHAHA!

Article written by

A long-time developer, I was an early adopter of Linux in the mid-'90s for a few years until I entered corporate environments and worked with Microsoft technologies like ASP, then .NET. In 2008 I released Sueetie, an Online Community Platform built in .NET. In late 2012 I returned to my Linux roots and locked in on Java development. Much of my work is available on GitHub.