Several years ago I wrote an article about creating a custom context provider for CodeRush. Since then CodeRush Classic, as it is now called, has been replaced by CodeRush for Roslyn, which is built on top of Roslyn. Now seems like a good time to update the provider. Rather than making you read both articles I’m going to repost the old article with the changes needed for Roslyn. The code is semantically similar but had to be rewritten to use Roslyn.
NuGet with Active Directory Support
In a previous article I discussed how to host a private NuGet repository. If you aren’t familiar with NuGet then please refer to that article. If you’re hosting a private gallery then chances are you’re on a network (probably an Active Directory one). One downside to hosting a private NuGet gallery is that it is tied to Forms authentication. For a public repository this makes sense, but for a private one it would be better to have NuGet use Windows authentication. This article will discuss the changes that have to be made to the NuGet source to support Windows authentication. The changes are scattered but minor.
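To give a flavor of the direction: once IIS and web.config are switched over to Windows authentication, the gallery can read the caller’s identity from the request instead of asking for a Forms login. The helper below is only an illustrative sketch (the class name and domain-stripping behavior are my own, not code from the NuGet Gallery source):

```csharp
using System.Security.Principal;
using System.Web;

// Illustrative only: maps the Windows identity on the request to the plain
// user name the gallery can look up. Assumes IIS and web.config are already
// configured for Windows authentication.
public static class WindowsUserResolver
{
    public static string GetGalleryUserName(HttpContextBase context)
    {
        IIdentity identity = context.User == null ? null : context.User.Identity;
        if (identity == null || !identity.IsAuthenticated)
            return null;

        // "DOMAIN\jsmith" -> "jsmith" so existing user lookups keep working.
        string name = identity.Name;
        int slash = name.IndexOf('\\');
        return slash >= 0 ? name.Substring(slash + 1) : name;
    }
}
```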
Hosting a Private Visual Studio Gallery
Visual Studio extensions are really popular these days. Not surprisingly Visual Studio has a very active extension gallery. Unfortunately the gallery is public, so if you develop some extensions for your personal use, or for your company, then you cannot use the public gallery. Fortunately VS supports the concept of a private gallery. A private gallery is nothing more than an extension gallery that is not accessible to the world. It is discussed on MSDN so refer to that documentation for more information on the whats and whys. Unfortunately the documentation is a little vague and setting one up is not trivial. I’m going to discuss the steps needed to get a private gallery up and running.
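At its core a private gallery is an ATOM feed that lists your .vsix files (the real feed also needs the VSIX-specific metadata discussed in the article). As an illustrative sketch, a bare-bones feed can be generated with the built-in syndication types; the titles and URLs below are placeholders:

```csharp
using System;
using System.Collections.Generic;
using System.ServiceModel.Syndication;  // reference System.ServiceModel.dll
using System.Xml;

class GalleryFeedBuilder
{
    static void Main()
    {
        var feed = new SyndicationFeed(
            "My Company Extensions",
            "Private Visual Studio extension gallery",
            new Uri("http://vsgallery.example.com/feed.atom"));

        // One entry per extension; a real gallery entry also carries the
        // VSIX id/version metadata covered in the article.
        feed.Items = new List<SyndicationItem>
        {
            new SyndicationItem(
                "My Extension",
                "Does something useful",
                new Uri("http://vsgallery.example.com/MyExtension.vsix"))
        };

        using (var writer = XmlWriter.Create("feed.atom"))
        {
            new Atom10FeedFormatter(feed).WriteTo(writer);
        }
    }
}
```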
Setting Up a Private NuGet Repository
These days most developers use NuGet to reference third-party dependencies like EntityFramework, Enterprise Library and NInject. A full discussion of NuGet is beyond the scope of this post. There are lots of articles on what it is, how to use it and how to package and publish to it, so I’ll skip all of that. But what about internal dependencies like shared libraries used by company applications? It turns out you can host your own private NuGet repository as well. Visual Studio is already set up to support this; the hard part is getting the repository set up. At my company we’ve been running a local NuGet repository for a year and, for the most part, it works great. We have a company policy that all dependencies must be stored in NuGet. This pretty much eliminates versioning issues, simplifies sharing code across projects and removes the need to store binaries in source control. In this post I’m going to go over the steps to set up a private NuGet repository. Others have posted on this topic and there is some documentation available, but unfortunately it is out of date.
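Once the repository is up, a quick way to sanity-check that the feed is actually serving packages is to query it with the NuGet.Core library. This is just an illustrative sketch; the feed URL is a placeholder for wherever you end up hosting the repository:

```csharp
using System;
using System.Linq;
using NuGet;  // from the NuGet.Core package

class FeedSmokeTest
{
    static void Main()
    {
        // Placeholder URL; point this at your own hosted feed.
        var repository = PackageRepositoryFactory.Default
            .CreateRepository("http://nuget.example.com/nuget");

        // List a handful of packages just to prove the feed is answering.
        foreach (IPackage package in repository.GetPackages().Take(10))
        {
            Console.WriteLine("{0} {1}", package.Id, package.Version);
        }
    }
}
```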
Diskeeper 2012
Condusiv has recently released Diskeeper 12, the next version of Diskeeper. I’ve been running it for a while now and I’m still satisfied with the way it optimizes my drives without eating up resources. Traditionally I wipe my machine on a yearly basis because of all the extra stuff that gets installed and the resulting slowdown of the hard drives, but since I’ve been using Diskeeper I’m averaging closer to 18 months. The hard drive just isn’t slowing down.
CodeRush – Writing a Context Provider
There are two types of developers in the world – those who use CodeRush and those who use ReSharper. I happen to be in the CodeRush (CR) group for various reasons. One of the things I really like about CR is its flexibility and the ability to easily define my own custom templates. A template in CR is like a smart code snippet: when you type a certain key combination in the right (configurable) context, what you typed is replaced by something else. In this post I’m going to discuss a simple context provider that can be used in CR templates.
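The CodeRush-specific plumbing is covered in the article itself, but the Roslyn half of the job – deciding what surrounds the caret – mostly comes down to walking the syntax tree. The check below (is the caret inside a class declaration?) is my own minimal illustration, not the article’s actual provider:

```csharp
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

static class CaretContext
{
    // Returns true when the given caret position falls inside a class declaration.
    public static bool IsInsideClass(string sourceText, int position)
    {
        SyntaxTree tree = CSharpSyntaxTree.ParseText(sourceText);
        SyntaxNode root = tree.GetRoot();

        // Find the token at the caret and walk up its ancestors.
        SyntaxToken token = root.FindToken(position);
        return token.Parent != null
            && token.Parent.AncestorsAndSelf().OfType<ClassDeclarationSyntax>().Any();
    }
}
```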
SandCastle – Is It Blowing Away?
I’m really, really disappointed with Microsoft about SandCastle. For those of you not in the know, SandCastle (SC) is the documentation generator from Microsoft. Supposedly they use it internally for generating the .NET Framework documentation, but given the tools they have released publicly I find that hard to believe. The last time SC was updated was 2010. It’s been over a year and I still can’t generate anything near the style or accuracy of the existing documentation.
I long for the days of NDoc, where I could hand my documentation to a GUI tool and it would spit out professional-looking documentation. SC spits out beta-quality documentation with styles that hardly work, and it absolutely cannot handle anything beyond basic doc comments. This is really sad. It doesn’t even ship with a GUI that would let me configure the various options to generate something reasonably close without having to read through help files. Fortunately there are third-party tools available, but honestly they don’t update that often either.
What makes me really mad is that the whole reason SC is supposed to be so awesome is that it is configurable to the point that we should be able to generate any style of documentation. The reality, though, is that the existing styles are horribly broken, can’t handle any external content and don’t even match the existing framework styles. You’d figure that MS would release the very same style and resources that are used for the framework, but I’ve yet to see any SC-generated documentation come close. There’s always something broken, whether it is bad image references, poorly formatted examples or just plain ugly styles.
Don’t even get me started on the horrible new MS Help system that was introduced in VS2010. Help is sliding so far backwards that I think MS needs to just start over. The day we can’t ship a simple help file (or even a bootstrapper) is a sad day indeed. I’d hate to be the folks at MS who have to go through all the steps needed to install/modify help files just for testing. This is truly ridiculous and a bad, bad sign for help in general.
Therefore I throw out the challenge for MS to step up to the plate and actually provide us with an updated version of SC that can generate MSDN-style documentation out of the box without all the extra work generally involved. Better yet, integrate this functionality into VS directly so I don’t have to use third-party tools. Unless MS fixes SC I feel that it’ll fall by the wayside like so many other MS projects. This is unfortunate because documentation is key to good class library design.
Reflector Is Dead
(The views expressed in this post are my own and are not reflections of my employer, peers or any company anywhere. Take it as you wish.)
It was bound to happen. Honestly, did anybody not see this coming? Reflector is officially a dead product to the majority of .NET developers. In other words Red Gate (RG) is making it a commercial-only product (http://www.reflector.net/2011/04/why-we-reversed-some-of-our-reflector-decision/). After some backlash they have decided to release the old version for free, but read below to see why this isn’t quite what it seems.
First some history. Reflector has been around a long time – for free. Most everybody inside and outside of Microsoft will mention Reflector when talking about disassembling code. A few years ago the author of Reflector sold the code to RG. I’m sure the original thought was that Reflector would remain free and RG would be able to make money off Pro versions. How many times have we heard this story? Early changes were annoying but tolerable. We now had to install the product instead of xcopying it because, you know, they can’t add licensing to an xcopy image. We also got the annoying “please buy our Pro version” ads. Again, annoying but tolerable.
As one could expect RG didn’t make sufficient money off the Pro version to cover their costs. They had to recoup the initial purchase price plus the cost of ongoing maintenance. Why would somebody pay money for something that is free? The only good answer is if the paid version had features worth paying for. What features did RG add that were actually worth anything? I can’t think of one. Let’s see, they added integration with VS. Funny, I had that by using Tools -> External Tools in VS. They added shell integration. Again, had that by a simple registry change. In other words they added absolutely nothing to an existing free tool and expected that people would want the Pro version. They could have gotten sneaky and started removing features that were previously free but that would have caused an uproar.
So the folks at RG have decided that they can’t sustain a free product anymore and therefore are completely eliminating the free version. Even worse is that they removed all options for actually getting the free version before (or as) they announced it (just go read the forums). Fortunately (maybe) they have temporarily added back the free version BUT you must do the following: 1) have an existing copy of v6, 2) check for updates and 3) do so before the deadline (which I believe is August 2011). After that you’re out of luck. Even more sinister is that they say it is a free, unsupported version but the fine print says that you actually get an activation license for 5 machines. So what does that mean if you have to reinstall? I have absolutely no idea but it sounds like a limited version to me.
Now one could argue that $35 isn’t a bad price for Reflector, and I would wholeheartedly agree IF 1) it was a new product that they had actually written, 2) it provided functionality that was not available elsewhere and 3) it hadn’t been previously available for free for years. RG probably looked at other products (e.g. Linux) that have both free and paid versions and thought they could do the same. It didn’t work out. Their decision is undoubtedly a business one. While I can understand their decision I don’t have to support it. After reflecting on Reflector I’ve decided that I will continue to use the free version of Reflector until such time as a better tool comes along or my activations run out. Then I’ll switch over to the less useful, but still capable, ILDasm. All RG has done is anger those who feel betrayed by the “free-to-paid” switch. I doubt they’ll see any additional money.
What does the future hold for Reflector? Unfortunately I don’t think it is good. RG is trying to recoup their costs and I don’t think they’re going to be able to do it. Most devs are not going to pay for the Pro version if they have the free version (which is probably why the licensing is set up the way it is). They might get some new customers but I don’t know that it’ll be enough over the long term. I expect that Reflector is going to effectively die from lack of money. The only way I really see Reflector surviving is for RG to release it to open source (again) and let the community support it themselves. Yes, RG would lose money, but the way I see it RG needs to cut their losses and move on.
RIP (free) Reflector. You were a good tool. You will be missed.
Diskeeper Undelete to the Rescue
For those of you who are not aware, Diskeeper Corporation (http://www.diskeeper.com), the creators of the Diskeeper defragmentation tool that I’ve recommended in the past, has a relatively new tool out called Undelete. I received a copy a while back, provided my feedback to Diskeeper and then moved on. I personally do not use undelete tools. My philosophy is “if I delete an important file then maybe I’ll pay attention better next time”. Needless to say I have deleted more than one important document in the past and cursed myself for doing it.
Fast forward to recent days while I was at work. A coworker was cleaning up their hard drive and inadvertently deleted entire directories of important files, most of which were not backed up. Ignoring the whole “should have had a backup plan” discussion, they were in trouble. A quick search of the Internet revealed a couple of potential undelete tools. I downloaded one in trial mode and tried it. Sure enough the files were still there but, since it was a trial, we’d have to buy the product to recover them. Then I remembered Diskeeper sending me a copy of their new Undelete tool.
I did a quick check and sure enough I had a license for the program and, even better, it comes with Emergency Undelete, which allows you to recover files even if Undelete was not already installed. This is exactly what I needed. I burned a copy of Emergency Undelete to CD (see below as to why) and ran it on my coworker’s computer. Sure enough, it very quickly found the files that had been deleted. Even better, it was able to restore almost all of them. We restored all the files to a USB drive and I left my coworker to figure out which files they actually needed. I went back to my desk happy that we were able to recover most of the files and impressed with the speed and ease with which we could do it. It saved my coworker days of work in trying to recover the data by hand.
Without a doubt Emergency Undelete is something I’m keeping around on CD for emergency purposes. I’m still not comfortable running Undelete-like tools but Emergency Undelete is definitely handy to have. If nothing else it makes me look like a magician to folks who just lost some critical files. If you do not have an emergency undelete program then you should get one. Regular backups are great but they require time and effort to restore. I, for one, will be recommending Undelete from Diskeeper because I can say first hand that it works as advertised.
Caveat: When a file is deleted it is generally just removed from the file system’s directory table. The actual data is generally still there. Undelete programs work by scanning the drive and finding these files. However the operating system treats the space used by newly deleted files just like free space, so the more data that gets written to a drive the more likely it is that your deleted file will be overwritten. When you discover that you accidentally deleted a file it is critical that you stop doing anything that might write to the drive. This includes running a web browser, shutting down Windows or even closing programs. Any of these could write files and overwrite your data. Go to another computer and do the following.
- Get an undelete program like Emergency Undelete.
- Most good programs are going to allow you to copy the necessary files to removable media like a USB drive or CD. Put the undelete program on the media. A CD is best because you can store the program there and use it whenever you need it. It saves time and eliminates the need for a secondary computer.
- Put the CD (or USB) containing the undelete program into the target computer and run the undelete program.
- If all goes well you should see the file(s) you want to recover. Select the file(s) you want to recover. If you are unsure then it might be best to recover all the files and sort out later which ones you actually need.
- Now you need to restore them, but you cannot restore them to the target machine. Again, any writes might overwrite the very data you are trying to recover. Restore the files to removable media or a secondary hard drive. A USB drive works great here.
- Once you have recovered all the files you might need, you can begin placing them back onto the target machine. After you start this step you can assume that any files that were not recovered are gone.