Michael K. Campbell has recently launched www.sqlservervideos.com, a site that provides free SQL Server training videos.
There aren't many videos yet, but it's a site worth keeping an eye on.
In the Windows Dev Pro Update Newsletter, he shared the deployment issues he faced while putting up the site, and they made for interesting reading. I'm excerpting them here as I couldn't find an online link to the same material -
403 Access Forbidden. Initially, every page on my site greeted me with this error. As an IIS 6 veteran, my initial thought was that I'd forgotten to ACL my files and content. But the site was running on IIS 7, where automatic worker process injection removes that headache. So my next thought was that maybe something like UrlScan was interfering with my URL rewriting engine. Happily, however, the site is hosted with ORCS Web, and they were all over the problem in a jiffy. The culprit? When I initially created the site with them, it was hosted on IIS 6 and was later transitioned to IIS 7. As part of the conversion process, the site had been configured to use a classic application pool instead of an integrated pool, so my URL rewriting code simply wasn't allowed to run.
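To make the classic-versus-integrated distinction concrete, here is a rough sketch of a managed URL-rewriting HttpModule (hypothetical code, not his actual rewriting engine). Registered under <system.webServer><modules> and running in an integrated application pool, a module like this sees every request, including extensionless URLs; in a classic pool, managed modules only see requests explicitly mapped to ASP.NET, which is why rewriting code of this sort never gets a chance to run.

```csharp
// Hypothetical URL-rewriting module - a minimal sketch, not the actual engine used on the site.
using System;
using System.Web;

public class SimpleRewriteModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        // Hook the request as early as possible so downstream handlers see the rewritten path.
        application.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;

        // Example rule: map a friendly, extensionless URL onto the physical page that serves it.
        if (app.Request.Path.Equals("/videos", StringComparison.OrdinalIgnoreCase))
        {
            app.Context.RewritePath("~/VideoList.aspx");
        }
    }

    public void Dispose() { }
}
```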
Errors in my Errors. Once the site was up and running, I decided to test out my very cool error pages. The idea was to give end users and visitors helpful markup and information while still returning HTTP 500 and 404 responses to search engines and bots. I got errors all right, just not the ones I wanted. Worse, they somehow managed to ripple through the site and tear the whole thing down after a few requests. Sadly, this was all my fault and required a few tweaks to my processing logic. Because Global.asax was also in charge of routing these errors, I added exception handling to my error handling code as well; I knew that unhandled exceptions in Global.asax turn ugly very quickly.
Replacing "soft 404s" with "hard 404s" as described in Errors in my Errors above is a Google SEO best practice.
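The "guarded" error routing he describes might look something like the sketch below - illustrative only, with made-up page paths, not his actual Global.asax. The point is to hand friendly pages to visitors while still emitting hard 404/500 status codes, and to make sure the error handler itself can never throw an unhandled exception.

```csharp
// Global.asax.cs - an illustrative sketch of defensive error routing, not the site's real code.
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Error(object sender, EventArgs e)
    {
        try
        {
            HttpException httpEx = Server.GetLastError() as HttpException;
            int statusCode = (httpEx != null) ? httpEx.GetHttpCode() : 500;

            Server.ClearError();
            Response.Clear();
            Response.StatusCode = statusCode;          // a "hard" 404/500, not a soft 200
            Response.TrySkipIisCustomErrors = true;    // keep IIS 7 from substituting its own error page
            Server.Execute(statusCode == 404 ? "~/errors/404.aspx" : "~/errors/500.aspx");
            CompleteRequest();                         // end cleanly, without Response.End's ThreadAbortException
        }
        catch
        {
            // If the error handler itself fails, fall back to a bare 500 rather than letting
            // an unhandled exception in Global.asax tear the whole site down.
            Server.ClearError();
            Response.Clear();
            Response.StatusCode = 500;
        }
    }
}
```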
Stupid Robots or Stupid Code? Since SEO is such a key part of my site, imagine my horror (a few days after launch) when I discovered that while normal visitors were accessing the site fine, search engines encountered nothing but errors. Tracing through my code, I found something I had written months ago to ensure that search engine requests wouldn't be counted along with normal ad impressions. Sadly, while this code worked in testing, it just didn't hold up in the real world with real robots. I think I've fixed the problem, but I'm hoping for the glorious return of the robots.
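His bot-detection code isn't shown, but the failure mode is a familiar one: a check that leans on ASP.NET's browser-capability data (Request.Browser.Crawler) passes with the simulated user agents you test against and then misbehaves with real crawlers in production. Here is a hedged sketch of a more forgiving check - the class name and signature list are purely illustrative:

```csharp
// Illustrative bot detection - not the site's actual code. Request.Browser.Crawler depends on
// browser-definition files that are often out of date, so a user-agent keyword list is a
// common safety net; either way, the check should never throw for an unrecognized client.
using System;
using System.Web;

public static class BotDetection
{
    private static readonly string[] BotSignatures =
        { "googlebot", "msnbot", "slurp", "baiduspider", "crawler", "spider", "bot" };

    public static bool IsSearchEngine(HttpRequest request)
    {
        try
        {
            if (request.Browser != null && request.Browser.Crawler)
                return true;

            string userAgent = request.UserAgent;
            if (string.IsNullOrEmpty(userAgent))
                return false;

            userAgent = userAgent.ToLowerInvariant();
            foreach (string signature in BotSignatures)
            {
                if (userAgent.Contains(signature))
                    return true;
            }
            return false;
        }
        catch
        {
            // When in doubt, treat the request as a normal visitor rather than erroring out;
            // skipping one ad impression is cheaper than serving a crawler a broken page.
            return false;
        }
    }
}
```

The impression-counting code would then simply skip the tally whenever IsSearchEngine(Request) returns true, while the page itself renders normally for bots and visitors alike.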
Further on, he also points out that there is now a tool for migrating IIS 6 applications and servers to IIS 7.
Related links -
Tips on ASP.NET Hosting & Deployment