Legacy Code

Postmodern VB6: A Quick Start with SimplyVBUnit

In a previous post about the “VB6 problem” I wrote that salvaging business critical classic Visual Basic applications involved these steps:

  1. Build it
  2. Test it
  3. Change it
  4. Repeat

That post (Setting up a VB6 Environment) concerned the first step and will take you through setting up an environment which can build legacy applications using Visual Studio 6. This post starts the discussion about the second step: testing.

I’ll start by repeating a critical thing to remember about legacy code: in its current state, the legacy code supports today’s business.

Consider the case where that statement is not true. Imagine that the legacy application does not support today’s business. If this were true then it would follow that either the business is failing or the application is no longer being used.

Although the application supports today’s business, we often feel that it could support the business better. An application could use computing or human resources more efficiently by taking less time to run or to interact with. We may want to capture new business by adding features. Whatever the changes and our reasons for them, we need to make these changes without breaking the current functionality, and that’s where testing comes in. We can write tests to characterize the application’s current behavior. We assume this behavior is correct because it supports today’s business. Then we make changes and re-run our tests to ensure that we haven’t broken anything in the process.

We must take this characterization step even when we don’t assume that the legacy code is “correct by default”. If we are changing the code to fix a long-standing bug, we still want to know how the code behaves today. Once we have characterized the current behavior with a test, we can change the test to expect the desired behavior. Finally, we change the code to correctly express the new behavior.
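
To make that concrete, here is a minimal sketch of a characterization test, written in the SimplyVBUnit syntax introduced below; LegacyInvoice and DiscountFor are hypothetical stand-ins for whatever legacy routine you are pinning down:

Public Sub DiscountFor_CharacterizesCurrentBehavior()
    Dim sut As New LegacyInvoice   ' hypothetical legacy class

    ' Record what the code does today, even if 0.15 later turns out
    ' to be a bug. To fix the bug we first change this expectation,
    ' then change the code to match.
    Assert.That sut.DiscountFor(100), Iz.EqualTo(0.15)
End Sub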

Before we can apply these concepts to the specific case of salvaging classic Visual Basic code we need to know more about the testing tools available for VB6. In this post we will look at a specific unit testing tool for Visual Basic called SimplyVBUnit. SimplyVBUnit is an MIT licensed open source project, which is actively maintained (the most recent update as of this writing was April 18, 2015–not that long ago at all). There are actually a couple more unit testing frameworks for VB6, but SimplyVBUnit seems like a good way to get our feet wet.

Install

If you followed my first guide to setting up a VB6 environment, then you will want to install SimplyVBUnit inside XP Mode. Use the Start Menu to select Windows Virtual PC and then Windows XP Mode. (If you set up your environment differently, then I will trust you to install it wherever is best for your environment.)

The SimplyVBUnit project provides an installer program, so fire up IE8 and download the latest version from SourceForge. You may need to click a little warning banner to allow the download.

Download SimplyVBUnit

Once the download completes, double click the installer to run it. Again, you may have to click through a security warning to let XP know you really want to install.

SimplyVBUnit Setup Wizard

The setup wizard is easy to use: just click Next until the wizard completes. Unfortunately, the installer is not the end of the story.

SimplyVBUnit has a decent wiki on SourceForge. This wiki includes a page on the installation and a few on setting up tests. The installation page reads like release notes and pretty much just tells us to run the wizard. When we get to the new project instructions, the very first instruction shows a SimplyVBUnit Project template in the VB6 New Project dialog. However, I don’t see that on my system after completing the wizard.

No Template

To see the template, you will need to take the following additional steps inside XP Mode:

  1. Open My Computer
  2. Navigate to C:\Program Files\SimplyVBUnit 4.1\Source\Projects
  3. Copy all files in this folder
  4. Paste these files into C:\Program Files\Microsoft Visual Studio\VB98\Template\Projects

Install Template

Now when you start VB6, the template should appear. Close Visual Studio, then log off of XP Mode so that we can start experimenting with SimplyVBUnit.

Create Test Project

Now start VB6 as a virtualized application. Once VB6 starts, you should be able to follow along with the getting started guide on the Wiki. First, create a new SimplyVBUnit Project. Next use the Tools menu to select Options, then select the General tab in the dialog that appears. Change the Error Trapping option to Break on Unhandled Errors. Click OK.

Now you can use the play button to start debugging. The SimplyVBUnit test runner will launch and you can click Run. Your test run should indicate success, since you currently have no tests, and therefore no test failures.

SimplyVBUnit Test Runner

Create Production Library

Although our overall goal is to learn how to salvage legacy code, our immediate goal is to learn about SimplyVBUnit. So let’s take the easy path and continue to follow along with the wiki by creating a brand new class library to test.

Stop the debugger if it is still running. Next, use the File menu to select Add Project..., then choose ActiveX DLL. By default, this project will be created with the name Project1, and contain a class called Class1. I can’t stand such terrible names, even during a demo, so I will make up some better ones: CoolCalculator for the project and Calculator for the class.

With that very important step completed, we can continue by adding a reference to CoolCalculator in the SimplyVBUnitTesting project. Select SimplyVBUnitTesting in the Project Group pane, and then use the Project menu to select References.... CoolCalculator should be near the top of the Available References list, with its check box unchecked. Check the box, then click OK.

Add References

The wiki advises us to save our work so far, and that seems like a good idea. Use the File menu to select Save Project Group. VB6 starts by saving each file, and by default it wants to put the files in the VB98 folder under Program Files. This is no good; navigate to My Documents, create a folder called CalculatorDemo, and save your files there. You will need to save the Calculator class, the CoolCalculator project file, the frmTestRunner form from SimplyVBUnit, and the SimplyVBUnitTesting project file (also from the SimplyVBUnit template). Finally, save the project group file using the name CalculatorDemo.

Create Test Class

Now we will create a test class and use it to drive a feature in CoolCalculator. Right click on SimplyVBUnitTesting in the project group, and select Add, then Class Module. Make sure Class Module is selected in the dialog that opens, and click Open.

Once again, this creates a class called Class1, so rename it to CalculatorTests. Next we will register this test class with the test runner, frmTestRunner. Right click on frmTestRunner and select View Code.

The frmTestRunner contains a stub for Form_Load with comments that explain how to set up a test case (which we have already started). SimplyVBUnit also provides a comment demonstrating how to register a test case with the runner. Register our test case now by adding the following code on the line below the example comment:

AddTest New CalculatorTests
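
In context, Form_Load ends up looking something like this (the template’s explanatory comments are paraphrased here):

Private Sub Form_Load()
    ' The template includes comments like this one, showing how
    ' to register a test class with the runner:
    ' AddTest New MyTestClass
    AddTest New CalculatorTests
End Sub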

Now we can add a test to our test suite. Let’s make sure our calculator can add. Type the following code into the CalculatorTests class.

Option Explicit

Public Sub Add_Two_Numbers()
    Dim calc As New Calculator

    Dim actual As Integer
    actual = calc.Add(1, 1)

    Assert.That actual, Iz.EqualTo(2), "1 + 1 = 2"
End Sub

This code creates a new Calculator instance, uses it to add 1 to itself, then constructs a test assertion to verify the result. We can hit the play button to see what happens when we try to run this test. We expect that the test runner will try to run the test, since we registered it. We also expect that the test will fail, since Calculator has no implementation.

Test Failure

As predicted, the test runs and fails. If we click OK on the compilation error dialog, we will drop into the debugger at the beginning of the Add_Two_Numbers method (yellow highlight). Notice also that the .Add method is highlighted in blue.

Having verified our expectations, let’s stop the debugger and add some code to the production library. Open the Calculator class and add these lines of code:

Option Explicit

Public Function Add(ByVal left As Integer, ByVal right As Integer) As Integer

End Function

This code should get us past the compilation error, because it will provide the method Add, which is expected by the test. However, we still expect the test to fail, because Add has no body, and therefore we expect it to return the default value for integer, which is 0.

Another Failure

Once again, the test runner confirms our expectations. So let’s add a little more code to see if we can get the test to pass. Stop the debugger and add the following line of code to the Add method body.

Add = left + right
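
The completed function now reads:

Public Function Add(ByVal left As Integer, ByVal right As Integer) As Integer
    Add = left + right
End Function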

Use the play button to run the test again. At this point we expect that the test should pass. The test runner confirms our expectation.

Passing Test

The wiki goes on to explain Testing Multiple Scenarios, which looks to me like data-driven or theory tests, depending on which jargon you prefer. That’s a nice feature, but I’ll leave its exploration as an exercise for the reader.

We are done with the CalculatorDemo project for now, so close VB and save any changes.

Review

In this post we used the classic Visual Basic environment that we created in Setting up a VB6 Environment to explore a native Visual Basic testing tool called SimplyVBUnit. Although the SimplyVBUnit installer did not automatically copy the testing templates for us, we were able to work our way past that problem and build an example test which we used to drive the creation of a simple calculator feature.

Although working with SimplyVBUnit was relatively simple, there are two things to consider before choosing it as the means to salvage VB6 code and make it safe to update. First, remember that the example we built for this post did not focus on characterizing legacy code. Instead, we took the easy path of creating a new project so that we could become familiar with SimplyVBUnit mechanics. Before going further, it makes sense to explore characterizing an existing code base with SimplyVBUnit. What do I even mean by characterization, and might there be pitfalls that we have not yet uncovered on the easy path?

Second, we should ask ourselves if native VB6 unit testing is the way to go. As discussed in the first Postmodern VB6 post, our goal when working with VB6 should be to get to a state where we aren’t working with VB6 anymore. If we accept that goal, and plan to migrate to a new platform, then we should seriously consider writing our tests in the new platform. After all, if we succeed in migrating, then all VB6 code will be retired, including the tests we are writing now. Furthermore, all the tests we write in VB6 will just be more code we need to convert to the new platform later. Even if we are able to figure out how to add SimplyVBUnit tests to an existing codebase, we should still explore the possibility of using a testing tool which is native to our target platform.

Those considerations aside, it’s good to have SimplyVBUnit in our tool set, and I hope this post makes it easier to get started.

Legacy Code

Postmodern VB6: Setting up a VB6 Environment

Lately I’ve been talking with folks about the “VB6 problem.” Although some love the platform, and some hate it, the classic flavor of Visual Basic is on life support at best, while its IDE is in a body bag wondering why no one has taken it to the morgue. There is a petition to bring the platform back to life, or to open source the existing runtime, or even to create a new open source runtime. These ideas offer hope to some, but organizations that rely on VB6 programs for a necessary business process find hope a thin comfort. For these organizations, it is past time to move on. Moving on means rewriting or refactoring, although in this case “salvaging” might be a better term. How to choose a way forward?

Many organizations may get away with taking the inadvisable rewriting route. Some applications are small and well understood, or not actually that critical. However, teams with large, complex and critical applications do exist. Perhaps the application’s author has left the team, or can’t remember the purpose behind all the implementation details. Could you correctly classify a given line of code written six months ago as dead, a bug, a fix, or a critical business rule?

You can? Congratulations, I knew my readers were all above average. Let’s try something harder: can you do the same trick for code written in 1998?

(Were you even born when that code was written by a junior developer who is now the CTO and very proud that his code has been running “flawlessly” all these years?)

The one thing we know about production code is that it supports the current business, maybe not perfectly, but it does work. To continue to support the “working” quality of the application will require a refactor/salvage operation. But what challenges lie ahead for those who choose this path with VB6?

Well, that’s what this post is about. If you want to salvage code, you need to build it, test it, change it, and repeat. That’s challenging enough when the code will remain on the same platform, say a .net 2.0 to .net 4.5 migration. But with a VB6 salvage, you have the added challenge of migrating to a new platform.

That is certainly too much ground to cover in one post, so let’s look at the first step: building the code.

Setup Visual Studio 6.0 on a modern platform

If you want to build classic Visual Basic projects, you will need a copy of Visual Studio 6.0. This software can be harder to find than you might expect, since Visual Studio 6.0 is no longer available from MSDN. I believe this is because of a settlement with Sun Microsystems which required Microsoft to stop distributing software that included the MSJVM. But I’m not a lawyer; I just know the software is not there. You can still buy copies on eBay, and prices range between $200 and $400 (at a glance, as of today, YMMV, etc…).

Lucky for me, I acquired a used copy from the SDSU bookstore in late 2001 for $50, and the 5 CDs (2 for Visual Studio, 2 for MSDN, and 1 for Visual J++) have been living on various closet shelves all these years.

Visual Studio 6.0 Media
CDs, oh yeah….

Of course, these days having CDs doesn’t always mean that you have access to the software. Many machines, including my main development machine, don’t have optical drives. So what I really wanted was ISOs. My desktop machine has an optical drive, and I used a handy little utility called ISO Recorder to copy the data off the disks and into image files. I was careful to check the back of the CD jewel boxes and sleeves for CD key stickers, and copy the keys into read me files which I stored alongside the ISO images.

My current dev machine is a MacBook Pro, and after transferring the ISOs onto the laptop I created a new virtual machine with vmWare Fusion. I did some research to see which version of Windows I should try to install VS6 onto. While some blogs and forum posts report successful installs all the way up to Windows 8, I formed the impression that the experience began to degrade with Windows 7. This makes some sense: although the VB6 runtime is currently supported all the way through Windows 8, the development tools reached end of life in 2008. So, there are caveats if you try to install VS6 onto later operating systems. For example, there is an incompatible component you must de-select during install, or the install will hang. You must run the install as admin, or it will reportedly fail. Finally, some users report that the IDE must run with Windows XP compatibility shims applied. This is all hearsay from me, and I’m not even providing links to these reports because I decided to cut to the chase instead of trying every combination of OS and tweaks until one works.

If we can get the IDE to run by pretending to be Windows XP, why not just use Windows XP Mode or an actual Windows XP install? I decided to go with XP Mode because I prefer Windows 7 to XP, and double virtualization is always a blast. I grabbed a Windows 7 ISO and stood up the new VM with the default storage (60GB) and RAM (2GB) provided by the vmWare wizard. After the Windows installation completed, I used boxstarter to install some essential applications and utilities and all critical Windows updates. This can take a while, but I look at it as a “pay me now or pay me later” choice. Eventually, those updates need to come down, so I choose to get it over with up front.
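
I won’t reproduce my exact script here, but to give a flavor of what boxstarter runs: a package body is just PowerShell with Boxstarter’s helpers available. Something along these lines, where the Chocolatey package names are placeholders and the helper names are from my memory of Boxstarter, so treat it as a sketch:

# Sketch of a Boxstarter package body (PowerShell); package names are placeholders
Update-ExecutionPolicy Unrestricted
cinst notepadplusplus   # Chocolatey installs for essential utilities
cinst 7zip
Install-WindowsUpdate -AcceptEula   # pull down critical Windows updates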

You might wonder why I’m installing utilities into Windows 7, when Visual Studio 6.0 will run in XP Mode. Remember, this is a salvage operation with the goal of migrating to a modern platform. It could be any platform, but let’s just assume we are going to .net. As the salvage progresses, I will use VS6 less, and switch to more modern tools, which will run natively on Windows 7. I will install the minimal number of tools into XP Mode, and my main environment will be Windows 7. Visual Studio 6 will run alongside these modern tools in a window as a virtualized application.

Visual Studio 6 as Virtualized Application
The Goal

Eventually I will install a modern version of Visual Studio into the Windows 7 host and use it to work with the salvaged code as I port it to .net. For me this is one reason to stick with XP Mode instead of a full XP install. I want to make sure that my modern tools work seamlessly, without putting Visual Studio 6.0 into an uncomfortably modern environment.

Once the boxstarter script finished the initial configuration, I downloaded Virtual PC and Windows XP Mode from Microsoft’s download site. Remember that XP is also at the end of its life, so if you are even considering going this route, grab these installers, set everything up, and make a backup. Microsoft is under no obligation to make these tools available, and as we saw with Visual Studio 6.0, outside events like lawsuits can have an impact on whether or not you can get the software in the future. Anyway, enough FUD; the installers for these two tools are straightforward run-and-click wizards. Be aware that the Virtual PC installer requires a reboot. Also, if your setup is similar to mine, vmWare will prompt you for your admin password whenever you start the machine after XP Mode is enabled. Apparently the way that XP Mode hooks into network traffic requires your permission. When you are done, you’ll have a shiny copy of Windows 7 with a (shiny?) copy of Windows XP running inside it, waiting for you to install Visual Studio.

Windows 7 with Windows XP Mode
XP Mode Achievement Unlocked

To install Visual Studio 6 you will need to insert the first CD into the virtual optical drive on the Windows 7 host. In vmWare, you click on the little CD icon and select “Choose Disc or Disc Image”.

Optical Drive Callout
Insert disc here

Select the image for the first Visual Studio CD-ROM, then click the CD icon again, and choose “Connect CD/DVD”. In a moment, Windows 7 will recognize the disk and you should be able to see it in File Explorer. If Windows XP Mode was running when you inserted the disk, then XP will also automatically recognize the disk and you will see it in My Computer. Double click the optical drive in My Computer and setup should launch.

Disc available in XP Mode
Disc, Ahoy!

The wizard will guide you through the rest of the setup. An XP Mode reboot will be required once the installer updates the Microsoft Virtual Machine for Java (yes, the very same troublemaker that caused the lawsuit). I choose to install “Visual Studio 6.0 Professional Edition” and not the “Server Applications” because I have to make a choice before I can click Next. If I must, I’ll come back for the server applications later. I click “Next” and “Continue” until another wizard starts and eventually asks me if I want to customize setup. I choose to customize, and then I choose “Select All” since I have no idea what components might be required during salvage operations. I choose to “Register Environment Variables” when prompted. I am going to opt into everything I can, since this is a virtual machine dedicated to one purpose: make salvage as easy as possible. Finally the wizard stops asking questions and copies files for a while. After it finishes, it wants to reboot XP Mode again.

When XP finishes restarting, a new wizard pops up, wanting to know if I would like to install MSDN. The default is yes, but you may be wondering why I need MSDN installed locally. Isn’t MSDN all on the internet now? Yes, it is, but that is the MSDN of today, and what we need is the MSDN of years gone by. MSDN was different back then, it was not just the giant reference it still is today, but it was the help file too. So, install it–it’s only two CDs after all.

To install, you will need to “Locate Microsoft Developer Network Disk 1”. This means, “Put the disc I want into the tray, please.” Our tray is still virtual, so follow the same process you used to insert the Visual Studio ISO, except insert the first MSDN ISO this time, then click OK. Click “Continue”, “OK”, “I Agree”, and so forth until the wizard asks you how much MSDN to install. Choose “Complete” to get all of it. The wizard copies about 14% of MSDN before asking for the next disk. I guess the wizard took up too much space on the first disk. Do as the wizard commands and it will copy the other 86% off the second disk.

Next up, opt in to InstallShield and swap back to Visual Studio Disk 1. Why couldn’t this happen before the MSDN install? Don’t ask questions. Do as the wizard commands. Click “Next” and “Yes” a few more times, but you can opt out of viewing the read me file. To install BackOffice, you will need to insert Disk 2. You might need SQL Server 6.5? Exchange 5.0? Let’s rethink this opt-in-to-everything idea. We can always install it later.

Opt out of registering your product and you are done!

Take it for a spin

When setup completes you should be able to start Visual Basic 6.0 and see it running inside the XP Mode VM.

Visual Basic 6.0
Shiny 1998 tech

Once you have seen the IDE start, close VB and then log off the XP Mode shell. Once you are logged out, close the XP Mode window and you will see a message indicating that the machine is going into hibernation. When hibernation completes, use the Windows 7 start menu to find the Visual Studio applications under “Windows Virtual PC” > “Windows XP Mode Applications” > “Microsoft Visual Studio 6.0”. Click Visual Basic 6.0 and a progress bar will appear while XP Mode “prepares” to run the IDE. This just means XP Mode is resuming from hibernation. If you didn’t log off from XP Mode before hibernating, XP Mode will get angry with you and ask if it is OK to log you off. The Visual Basic New Project dialog should appear within thirty seconds or so, along with an alert from XP complaining that you don’t have any antivirus.

On my machine, the IDE text is practically illegible because of the difference between modern monitor resolutions and the resolutions that were common in the Visual Studio 6 era. So I used vmWare to configure the Windows 7 machine to scale its display. I clicked on the settings wrench to the left of the virtual optical disk, then clicked Display, and finally unchecked the option “Use full resolution for Retina display”. This change kicks in immediately, and Windows will want you to log off so it can “refresh” your settings. Once you log back in, VS6 should be significantly more legible.

We have a development environment that we can actually see, but does it work? Before trying to load our legacy app into the IDE to build it, let’s take a trip down memory lane and write a simple “Hello World” application as a smoke test.

Select “Standard EXE” from the New Project dialog.

New Project Dialog
Select Standard EXE

Click Project1 in the Project panel and rename the project to “HelloWorld”.

Renaming the Project
Rename the project

Click Form1 in the project panel, rename it to “HelloForm”, and change the caption to “Hello World”.

Button Control
Make a button

Next select the CommandButton control from the toolbox and draw a button on the form. Select the button and rename it to GreetCommand. Change the caption to “Greet”.

Greet Button
Greet Command

Double click the button to access the code behind. Type the following in the event handler sub:


MsgBox "Greetings from 1998", vbOkOnly, "Greetings"

Now click the play button to start debugging. Click the “Greet” button to get your greeting.

Debug button callout
Play that funky app

Now let’s save the project. Use the File menu to select “Save Project As…”. You will see the file system from the point of view of the XP Mode VM, but don’t worry. The important folders like “Documents” are synchronized across both machines. Navigate to “My Documents”, create a folder called HelloWorld, and save your files there. You will see the same folder and files appear in the Windows 7 Documents library.

Now make the project. Use the File menu to select “Make HelloWorld.exe …”. Again, navigate to the folder you created in “My Documents” and click OK. On the Windows 7 side, double click HelloWorld.exe, then click the button to get your greeting.

Hello World
Greetings from 1998

Congratulations

We have managed to set up a development environment for Visual Basic 6.0 salvage operations. We are hosting our tools in a relatively modern operating system: Windows 7. Windows 7 allows us to create a full-blown Windows XP system in order to gain seamless compatibility with even older tools, in this case Visual Studio 6.0. Our Visual Studio installation is working well enough that we can write a Hello World application, compile it, and run it in the more modern Windows 7 environment.

If you are planning on salvaging business critical code, you still have a long road ahead of you. It is my hope that this post makes the first step along the path a little easier. Thanks for reading.

Coding

Packaging Contract Assemblies Like A Pro

This is a short follow up to my NuGet Like A Pro post. I left an additional step out of that post, even though I almost always need to do it. I didn’t think this next step was widely applicable, and wanted to position the previous post as a “Super Duper Happy Path” that most people could follow without confusing digressions.

However, I did myself a disservice by leaving it out, because now whenever I need to refresh my memory by reading my own post, I am still left having to figure out this one step again. So, I’m going to post it here so that I’ve got all my documentation in one place.

Contract Assemblies

If you don’t know what they are, then you probably don’t need to read this post. However, if you are curious, they are artifacts created when using Code Contracts.

Code Contracts provide a language-agnostic way to express coding assumptions in .NET programs. The contracts take the form of preconditions, postconditions, and object invariants. Contracts act as checked documentation of your external and internal APIs. The contracts are used to improve testing via runtime checking, enable static contract verification, and documentation generation.

In other words, Code Contracts are another form of static analysis, and client code needs to know about your contracts in order to properly evaluate its own contracts. This is where the contract assembly comes in: it provides the contract information about the assembly in your package.
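
As a quick illustration (my own sketch, not code from the library we package below), preconditions and postconditions look like this in C#:

using System.Diagnostics.Contracts;

public static class SafeMath
{
    public static int Divide(int numerator, int denominator)
    {
        // Precondition: callers must never pass zero.
        Contract.Requires(denominator != 0);

        // Postcondition: relates the result to the inputs.
        Contract.Ensures(Contract.Result<int>() == numerator / denominator);

        return numerator / denominator;
    }
}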

So you need to create this assembly, put it in your NuGet package so that the contract checker can find it, and then give NuGet a hint indicating that only the “normal” assembly should get a project reference, while the contract assembly (which only contains metadata) should not be referenced by the project.

Creating the Contract Assembly

This step is easy, but I will include it for those who are new.  First one must visit the Visual Studio Gallery and download the tools.  Once the tools are installed, the Visual Studio project properties page will grow a new blade, pictured below.

Code Contracts Control Panel

I check almost everything in the “Static Checking” section and leave “Runtime Checking” alone.  It would be off topic to explain why in this post, but you can visit the Code Contracts website and make your own decision.  You can also choose not to turn anything on, yet still build the Contract Reference Assembly.  This will let clients using Contracts know that you don’t have any.

By default, the Contract Reference Assembly is not configured to build, but as you can see in the red rectangle, I have turned it on.

Now when I build my project, the additional assembly is created (below the build output folder, in a subfolder called “CodeContracts”).

The Contract Assembly

Adding to the package

Now that you have the assembly, you can let NuGet know about it by adding a file reference to the nuspec file. This reference goes in the files node, which is a child of the package node. I usually put it right after the metadata node:

  </metadata>
  <files>
    <file src="bin\Debug\CodeContracts\TemporaryFile.Contracts.dll" target="lib\net45" />
  </files>
</package>

After rebuilding, you will see that the Contract assembly is now packaged with the normal library.

Packaged

However, if you were to use this package as is, NuGet would add a reference to the Contracts assembly as well as the library. To prevent that, we provide NuGet a whitelist of assemblies which should be referenced, and it will ignore the rest.

To do this, add a child node to metadata called “references” and a “reference” node for the normal library.

    <references>
      <reference file="TemporaryFile.dll" />
    </references>
  </metadata>
  ...
</package>

Now rebuild again, and the NuGet Package Explorer will indicate that the assembly references have been “filtered”.

Filtered

Conclusion

So, to distribute Contract Assemblies (or any other assembly which should not be referenced) follow the steps above. First create the assembly you want to distribute. Next add a file reference to the nuspec which points at the new assembly. Then, add a references node and add references to each assembly which should be referenced (the new assembly should not be in this section, but the original assembly should be). After filtering your references you are ready to go. Upload your package to your favorite feed (nuget.org, myget.org, ProGet, etc…) and pour yourself a drink.

Coding

NuGet like a Pro, the MSBuild way

Back in 2012 I posted an article on this blog called “Creating Packages with NuGet the MSBuild Way”. That post described an MSBuild-integrated method to create NuGet packages from your own source code. It has remained one of my most popular posts. Like many popular things on the internet, it has been out of date for some time now. When I check on my blog and see that the most visited article of the day is “Creating Packages with NuGet the MSBuild Way”, I wonder if visitors know that it’s out of date. Do they dismiss me as a crank and leave the page immediately? Even worse: do they follow the outdated and complicated recipe described in the post?

In 2012, I needed to integrate packaging into MSBuild because I could not find a plug-in for CruiseControl.net that would create NuGet packages. There may be a plug-in now, I don’t know. After a couple years creating NuGet packages, many of the tools I use from day to day have changed, including my source control and continuous integration options. Even though I now have the option to create CI builds on TFS, where NuGetter is available, I still use MSBuild to configure my projects to create packages every time I hit F6.

I have a new, simple process for setting this up, and it usually takes me about five minutes to convert an existing project to produce a NuGet package as part of its build output. I start with some existing code, enable package restore, make one small edit to my project file, and build. That’s all it takes to get the first package in the first five minutes.

If I want to continue customizing after creating the first package, I pull the nuspec file out of the existing package, put the nuspec file next to my project file, and customize from there; that’s the second five minutes.

Finally, I make some small modifications to the nuget.targets file provided by package restore in order to automate some cleanup; that takes about five more minutes.

It takes me about fifteen minutes to get everything set up just how I like it, but if your needs are simple, you can be done in five minutes. Hopefully this simplified process will be much more useful to my future visitors and help you, dear reader, understand how easy it is to create NuGet packages for your open source (or private) projects. So read on for all the details!

Build

Start with Some Code

Any Class Library will do. The important thing is that it’s something you want to share. Either it’s your open source project, or a bit of private code which you’d like to share with your customers, other departments in your organization, or just your team.

For this example I’ve created a super-cool class called TemporaryFile. TemporaryFile provides a disposable wrapper around a FileInfo which deletes the file when the Dispose method executes. This allows the user to control the lifetime of the temporary file with a using statement, or trust the garbage collector to take care of it during finalization. I find myself creating and deleting temporary files for a certain class of unit tests, and a wrapper like this takes a lot of the grunt work out of the task.

namespace TemporaryFile
{
    using System;
    using System.IO;
    using ApprovalUtilities.Utilities;

    public class Temp : IDisposable
    {
        private readonly FileInfo backingFile;

        public Temp(string name)
        {
            this.backingFile =
                            new FileInfo(PathUtilities.GetAdjacentFile(name));
            this.backingFile.Create().Close();
        }

        ~Temp()
        {
            this.Dispose();
        }

        public FileInfo File
        {
            get
            {
                return this.backingFile;
            }
        }

        public void Dispose()
        {
            // File on the file system is not a managed resource
            if (this.backingFile.Exists)
            {
                this.backingFile.Delete();
            }
        }
    }
}
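
To make the intended lifetime concrete, a typical use in a test might look like this (my own sketch; it assumes a using System.IO; directive and a reference to the library above):

using (var temp = new TemporaryFile.Temp("scratch.txt"))
{
    // The backing file exists for the duration of the block.
    File.WriteAllText(temp.File.FullName, "test data");

    // ... exercise the code under test that consumes the file ...
}
// Dispose ran at the end of the block, so the file is gone now.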

Notice that the class uses a method from PathUtilities in ApprovalUtilities (part of ApprovalTests). I added this method call solely to generate a dependency on another package, which in turn helps demonstrate how much metadata NuGet can infer for you without explicit configuration. Relying on inference is a big part of keeping this process fast and simple–as long as the inferred information meets your needs.

However, the way I used PathUtilities here turned out to be a bug. So don’t copy this code. It is useful to have a bug in the code when doing demos, so I left it in there. If you think the temporary file idea sounds super useful, then a bug-free version is now available as part of ApprovalUtilities.

If you examine the NugetLikeAPro repository on GitHub, TemporaryFile is a plain old C# class library. It has a test project but not much else is going on.

Enable Package Restore

The NuGet documentation is very good, and covers a lot of ground but if it covered everything then you wouldn’t need me!  I think that “Using NuGet without committing packages to source control” contains a lot of good information about what happens when you click the “Enable Package Restore” menu item, but it does not emphasize something very important to us as package creators: the NuGet.Build package installed by package restore contains everything you need to convert a project to create packages.

When you enable package restore, two packages are added to your solution: NuGet.CommandLine and NuGet.Build.  You could add these yourself, but that would be two steps instead of one.  Package restore also performs a third, more tedious step for you: it updates your project files to reference a new MSBuild script and adds a $(SolutionDir) property so that the new script can do its work.  The project files need to reference an MSBuild script (NuGet.targets) in order to run the package restore target before the build.  The package restore article doesn’t mention that the script also defines a build package target, which can create a package for you after the build completes.

So, let’s enable package restore on TemporaryFile and see what we get.

Image of the Visual Studio solution context menu
Enable Package Restore

Just as promised by the documentation, the process added a solution folder and three files: NuGet.targets, NuGet.exe, and NuGet.Config. NuGet.Config is only needed by TFS users, so you can probably delete it safely. It has no impact on what we are doing here. By observing red checkmarks in the Solution Explorer, we can also see that the process modified TemporaryFile.csproj and TemporaryFile.Tests.csproj.

Image showing Visual Studio solution explorer
Modifications to Solution

Let’s see what changes package restore made to TemporaryFile.

diff --git a/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj b/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
index c1e5a2c..85e156b 100644
--- a/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
+++ b/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
@@ -11,6 +11,8 @@
 <AssemblyName>TemporaryFile</AssemblyName>
 <TargetFrameworkVersion>v4.5</TargetFrameworkVersion>
 <FileAlignment>512</FileAlignment>
+ <SolutionDir Condition="$(SolutionDir) == '' Or $(SolutionDir) == '*Undefined*'">..\</SolutionDir>
+ <RestorePackages>true</RestorePackages>
 </PropertyGroup>
 <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
 <DebugSymbols>true</DebugSymbols>
@@ -49,6 +51,13 @@
 <None Include="packages.config" />
 </ItemGroup>
 <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
+ <Import Project="$(SolutionDir)\.nuget\NuGet.targets" Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />
+ <Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
+ <PropertyGroup>
+ <ErrorText>This project references NuGet package(s) that are missing on this computer. Enable NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
+ </PropertyGroup>
+ <Error Condition="!Exists('$(SolutionDir)\.nuget\NuGet.targets')" Text="$([System.String]::Format('$(ErrorText)', '$(SolutionDir)\.nuget\NuGet.targets'))" />
+ </Target>
 <!-- To modify your build process, add your task inside one of the targets below and uncomment it.
 Other similar extension points exist, see Microsoft.Common.targets.
 <Target Name="BeforeBuild">

Lines 18-24 create the reference to the NuGet.targets file in the .nuget folder, and add some error handling if the script is missing during the build. On line 9 the $(SolutionDir) property is created, and its default value is the project’s parent directory. NuGet.targets uses this piece of configuration to find resources it needs, like NuGet.exe or the solution packages folder. Finally, on line 10, package restore is enabled by adding the RestorePackages property and setting its value to true. (Side note: this is a bit misleading. It is getting harder and harder to opt out of package restore. If you set this to false, Visual Studio will set it to true again during the build, unless you opt out again using a separate Visual Studio option.)

Editing project files is a bit tedious because you have to unload them, open them again as XML files, make your changes, and then reload them. It’s not hard to learn, but it’s at least four mouse clicks and then some typing in an obscure syntax without much intellisense (although R# helps here). It’s nice that the Enable Package Restore menu item did all that editing for you with one click. Remember that the process also added two NuGet packages for you, so you can add all that to your overall click-savings. Note that the documentation mentions a new feature available in NuGet 2.7 called “Automatic Package Restore”. This feature is enabled by default and solves some problems caused by package restore in certain scenarios. It’s already on by default, so we can imagine that someday a program manager at Microsoft is going to say, “Hey, let’s get rid of that ‘Enable Package Restore’ menu item.”

If the Enable Package Restore “gesture” is ever removed then we can install the NuGet packages ourselves and make the necessary changes to the project files.  This will get tedious and use way more than the five minutes I’ve allotted to the process, so I’m sure someone will think of a clever way to automate it again with yet another NuGet package.  However, this is all just my own speculation.  Today we live in the Golden Age of NuGet package creation, and package restore does 99% of the work for us.

One Small Edit

The NuGet.targets file provided by the NuGet.Build package provides a “BuildPackage” target. Unlike the “RestorePackages” target, the build package target is not enabled by default. So, we have to edit our project file to turn it on. Editing the file in Visual Studio is a several-step process. If I were to make the change from within the IDE, I would: right-click on the TemporaryFile node in Solution Explorer, select “Unload Project”, right-click again, select “Edit Project”, edit the project file, save the project file, close the project file, right-click the project again, select “Reload Project”. It’s a hassle.

An image of the project context menu in Solution Explorer
Too Much Work

I find it’s easiest to use a regular text editor to make this change rather than Visual Studio. Anything should work; I often use Sublime Text or Notepad++. Plain old Notepad or WordPad should work fine. I prefer Sublime because I keep my “Projects” folder open in Sublime by default so that I can glance at code or edit these types of files quickly. However you choose to do it, you only need to add one property in order to turn on the BuildPackage target.

diff --git a/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj b/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
index 85e156b..e42d010 100644
--- a/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
+++ b/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
@@ -13,6 +13,7 @@
 <FileAlignment>512</FileAlignment>
 <SolutionDir Condition="$(SolutionDir) == '' Or $(SolutionDir) == '*Undefined*'">..\</SolutionDir>
 <RestorePackages>true</RestorePackages>
+ <BuildPackage>true</BuildPackage>
 </PropertyGroup>
 <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
 <DebugSymbols>true</DebugSymbols>

I usually put it right below the RestorePackages property (line 9), but you can choose where it goes. For example, if you wanted to only create packages for debug builds, you could move it down a few lines to line 12, into the next PropertyGroup, which is only defined when Debug is selected. The same technique would work to restrict package creation to Release builds, if that’s what you would like to do (see the sketch below). If you made the change outside Visual Studio, the IDE will notice and ask you if you want to reload the project. You do, so click “Reload” or “Reload All”.
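
For instance, a Release-only setup would look something like this sketch, reusing the conditioned PropertyGroup pattern the project file already contains:

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <!-- Create a package only when building the Release configuration -->
  <BuildPackage>true</BuildPackage>
</PropertyGroup>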

An Image of the "File Modification Detected" dialog
You need to reload now

Once the BuildPackage property is set to true, MSBuild will execute the corresponding target in NuGet.targets and create a package for you on every build. This package will get most of its configuration by inference, and appear in the bin directory next to your normal build outputs.

An image of Windows File Explorer
BuildPackage creates two packages by default

BuildPackage created two packages for me. One is an ordinary NuGet package, which contains the TemporaryFile assembly, and one is a “Symbol” package, which includes the same assembly along with additional debugging resources.

An image of the standard NuGet package, open in NuGet Package Explorer
The ‘Standard’ NuGet package

We didn’t provide NuGet with any configuration information.  NuGet configured these packages by convention, and used the project and assembly information to infer what the package configuration should be.  By opening the standard package in NuGet Package Explorer we can see what NuGet came up with.  The Id, Version, Title, and Copyright are all inferred by examining assembly attributes.  These attributes are defined in AssemblyInfo.cs by default.

using System.Reflection;
using System.Runtime.InteropServices;

[assembly: AssemblyTitle("TemporaryFile")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("TemporaryFile")]
[assembly: AssemblyCopyright("Copyright ©  2013")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
[assembly: ComVisible(false)]
[assembly: Guid("4365a184-3046-4e59-ba28-0eeaaa41e795")]
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]

Authors and Owners are both set to “James” which is my user name on the machine where I created the package. NuGet would prefer to use the value from “AssemblyCompany” for these fields, but I haven’t filled it out yet. Since AssemblyCompany was empty, NuGet moved on to the next convention and chose my user name instead. NuGet would also prefer to use “AssemblyDescription” to populate the Description value, but this was also blank. Since there is no other logical place (yet) for NuGet to find a description, the program simply gave up and used the word “Description” instead. NuGet uses the build log to warn me (lines 4, 5, 11, and 12 below) when this happens.

1>  Attempting to build package from 'TemporaryFile.csproj'.
1>  Packing files from 'C:\Users\James\Documents\GitHub\Blogs\NugetLikeAPro\TemporaryFile\bin\Debug'.
1>  Found packages.config. Using packages listed as dependencies
1>EXEC : warning : Description was not specified. Using 'Description'.
1>EXEC : warning : Author was not specified. Using 'James'.
1>  Successfully created package 'C:\Users\James\Documents\GitHub\Blogs\NugetLikeAPro\TemporaryFile\bin\Debug\TemporaryFile.1.0.0.0.nupkg'.
1>
1>  Attempting to build symbols package for 'TemporaryFile.csproj'.
1>  Packing files from 'C:\Users\James\Documents\GitHub\Blogs\NugetLikeAPro\TemporaryFile\bin\Debug'.
1>  Found packages.config. Using packages listed as dependencies
1>EXEC : warning : Description was not specified. Using 'Description'.
1>EXEC : warning : Author was not specified. Using 'James'.
1>  Successfully created package 'C:\Users\James\Documents\GitHub\Blogs\NugetLikeAPro\TemporaryFile\bin\Debug\TemporaryFile.1.0.0.0.symbols.nupkg'.

Notice on lines 3 and 10 that NuGet detected that my project depends on another NuGet package. It infers this by finding the ‘packages.config’ file where NuGet lists the project dependencies, reading that file, and automatically configuring TemporaryFile to depend on ApprovalUtilities.
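
For reference, the packages.config file NuGet reads here is a small XML file along these lines (reconstructed by hand: the version matches the dependency NuGet later writes into the nuspec, and the targetFramework attribute is my assumption):

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="ApprovalUtilities" version="3.0.5" targetFramework="net45" />
</packages>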

Overall NuGet did a pretty good job, and this package is actually usable. Before we move on to customizing this package, let’s take a look at its sibling, the symbol package.

An Image of the Symbol package open in NuGet Package Explorer
The Symbol Package

The symbol package configuration is identical to the standard package. Version, Id, Authors, and the rest are all the same. However, there are more files in the symbol package. Along with the class library, the lib/net45 folder contains the debugging symbols. There is also a new folder called src. Under the src directory, we can find all the source code for TemporaryFile.dll. Altogether, this extra content gives Visual Studio enough information to provide a complete step-through debugging experience for this NuGet package. What to do with this package and how to configure Visual Studio to use it are topics better handled on their own, so I won’t cover them further here. Stay tuned.

Customize

There are a few things I would like to change in this package before sharing it with the team/customers/world. I don’t like the default values for Author/Owner and Description. At a minimum the Author field should contain my last name, or perhaps my twitter handle or something I’d like the world to know me by. It is also appropriate to use your company name in this field. The description is important because this package will probably end up in a gallery and certainly be presented in the NuGet Package Manager inside Visual Studio. You need a good, concise description so people have an idea what you are trying to share with them. The copyright isn’t claimed by anyone either; be careful here, because some default Visual Studio installs automatically use “Microsoft” as the default copyright holder (this seems to have been fixed in 2013; now it’s just blank). Finally, I don’t like the default 3-dot version number; I prefer the 2-dot version, so I’d like to change that too. These are the low-hanging fruit which can be customized using AssemblyInfo.cs.

diff --git a/NugetLikeAPro/TemporaryFile/Properties/AssemblyInfo.cs b/NugetLikeAPro/TemporaryFile/Properties/AssemblyInfo.cs
index 7c3c830..bf494d8 100644
--- a/NugetLikeAPro/TemporaryFile/Properties/AssemblyInfo.cs
+++ b/NugetLikeAPro/TemporaryFile/Properties/AssemblyInfo.cs
@@ -2,14 +2,14 @@
 using System.Runtime.InteropServices;

 [assembly: AssemblyTitle("TemporaryFile")]
-[assembly: AssemblyDescription("")]
+[assembly: AssemblyDescription("A file that deletes itself when disposed")]
 [assembly: AssemblyConfiguration("")]
-[assembly: AssemblyCompany("")]
+[assembly: AssemblyCompany("ACME Co.")]
 [assembly: AssemblyProduct("TemporaryFile")]
-[assembly: AssemblyCopyright("Copyright ©  2013")]
+[assembly: AssemblyCopyright("Copyright © Jim Counts 2013")]
 [assembly: AssemblyTrademark("")]
 [assembly: AssemblyCulture("")]
 [assembly: ComVisible(false)]
 [assembly: Guid("4365a184-3046-4e59-ba28-0eeaaa41e795")]
-[assembly: AssemblyVersion("1.0.0.0")]
-[assembly: AssemblyFileVersion("1.0.0.0")]
\ No newline at end of file
+[assembly: AssemblyVersion("1.0.0")]
+[assembly: AssemblyFileVersion("0.0.1")]
\ No newline at end of file

I filled in or edited the attributes which NuGet checks when looking for configuration information: AssemblyDescription, AssemblyCompany, AssemblyCopyright, and AssemblyVersion. I also changed AssemblyFileVersion, even though NuGet doesn’t use it, and I left AssemblyTitle alone because I was happy with the value already there. After building again, these changes should show up in the newly created package.

An Image of the NuGet Package Explorer showing updated metadata
Most AssemblyInfo changes are applied automatically

NuGet applied most of my changes automatically, and all the build warnings are gone. But I expected a 2-dot version number, both in the package name and as part of the metadata; that 3-dot version is still hanging around. I can take greater control over the version number, as well as many other aspects of the package metadata, by providing a “nuspec” metadata file. If this file has the same name as my project and is in the same directory as my project, then NuGet will prefer to use the data from the nuspec.

Pull the Nuspec File Out

You can generate nuspec files from assemblies or project files using NuGet.exe. In the past I’ve found this method for creating nuspec files to be tedious, because it creates configuration I don’t always need, or configuration with boilerplate text that I need to delete. My old solution was some fairly complex MSBuild scripts that transformed generated files, but today I just create the default package as described above, rip its metadata, then customize to my liking. If you have NuGet Package Explorer open, it’s pretty easy to use the “Save Metadata As…” menu item under “File” and save the nuspec file next to your project file (remove the version number from the filename if you do this).
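
If you do want to try the generator route anyway, the command is NuGet.exe’s spec command; run from the project directory, it should emit a tokenized nuspec full of the boilerplate I mentioned:

.nuget\NuGet.exe spec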

Another way to retrieve the package nuspec file is with an unzip tool. NuGet packages are zip files, and tools like 7-zip recognize this, but you can always change the extension from nupkg to zip if 7-zip isn’t handy. Once the file has a zip extension, any zip utility can manipulate it, including the native support built into Windows.

An image showing the nuget package as a zip, open in Windows File Explorer
Nupkg files are Zip files

You can extract all the files from the zip, or just the nuspec file.  You will only need the nuspec file.

Put the Nuspec File Next to the Project

Once you have pulled the nuspec file out of the existing package, move it to the project directory.  It should sit in the same folder where the csproj file is (or vbproj, or fsproj) and have the same base name as the csproj.  There should be no version number in the nuspec file name, so remove it if there is.

An image showing the nuspec file in the project folder.
Put the nuspec file in the project folder

You can also add the item to the project using Visual Studio for easy access from the IDE, but it is not required.  I usually add it.

Make Changes

Now, let’s take a look at what is inside the nuspec file.

<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
 <metadata>
 <id>TemporaryFile</id>
 <version>1.0.0.0</version>
 <title>TemporaryFile</title>
 <authors>ACME Co.</authors>
 <owners>ACME Co.</owners>
 <requireLicenseAcceptance>false</requireLicenseAcceptance>
 <description>A file that deletes itself when disposed</description>
 <copyright>Copyright © Jim Counts 2013</copyright>
 <dependencies>
 <dependency id="ApprovalUtilities" version="3.0.5" />
 </dependencies>
 </metadata>
</package>

We can see that most of the information in the nuspec file is the exact information displayed in the package explorer. I can now override the defaults by editing this file. Any XML or text editor will work; it’s very convenient to use Visual Studio if you add the nuspec file to the project, so that’s what I usually do.

diff --git a/NugetLikeAPro/TemporaryFile/TemporaryFile.nuspec b/NugetLikeAPro/TemporaryFile/TemporaryFile.nuspec
index 5770b72..815c44e 100644
--- a/NugetLikeAPro/TemporaryFile/TemporaryFile.nuspec
+++ b/NugetLikeAPro/TemporaryFile/TemporaryFile.nuspec
@@ -2,15 +2,12 @@
 <package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
 <metadata>
 <id>TemporaryFile</id>
- <version>1.0.0.0</version>
+ <version>0.0.1</version>
 <title>TemporaryFile</title>
- <authors>ACME Co.</authors>
+ <authors>@jamesrcounts</authors>
 <owners>ACME Co.</owners>
 <requireLicenseAcceptance>false</requireLicenseAcceptance>
 <description>A file that deletes itself when disposed</description>
 <copyright>Copyright © Jim Counts 2013</copyright>
- <dependencies>
- <dependency id="ApprovalUtilities" version="3.0.5" />
- </dependencies>
 </metadata>
 </package>
\ No newline at end of file

I changed the version number to “0.0.1” and updated the author to use my twitter handle. “ACME Co.” is still the owner, and I removed the dependency list. I prefer to allow NuGet to continue to infer this information on its own.

With these changes, the next package I build should reflect the new version number in the file name, and show updated metadata for Version and Authors.  However, the dependency list should remain the same in the completed package.

An image of Nuget Package Explorer showing the applied customizations
That’s More Like It

Automate

You’ll need some way to share your package now that you’ve created one.  If it’s an open source project you can definitely upload it to nuget.org if you like.  For private code, that’s probably not a good idea.  There are solutions out there, and I wrote about one of them in a previous article: Use ProGet to Host Your Private Packages.  In the interest of making sure this article doesn’t get any longer than it already is, I won’t cover options for sharing private packages here.

However, there are a couple things you can do now which will make your life easier once you do start sharing your package. First, nuget.targets does not clean up after itself during clean and rebuild. This means that all your old package versions will hang around in the build folder until you delete them manually. Besides taking up space, those packages eventually slow you down when you get ready to share. If you are using the NuGet Package Explorer to share, you have to scroll past an increasingly long list of old package versions to find the new version you want to upload, and if you use the command line utility, all those old versions increase the amount of typing and tabbing needed to complete the command. Finally, I find the quickest way to push packages is with a custom script which wraps the command line utility, and that script is much easier to write when the bin folder only contains the latest package.

Cleanup with nuget.targets

To integrate nuget.targets with "Clean" and "Rebuild" you need to add a new target to the script, add a new item group which lists the files to clean, and finally add a hook using the "CleanDependsOn" property that will actually execute the target.

NuGet.targets is already in your solution under the .nuget folder; open it and add what you need.

diff --git a/NugetLikeAPro/.nuget/NuGet.targets b/NugetLikeAPro/.nuget/NuGet.targets
index 8962872..a5cebf3 100644
--- a/NugetLikeAPro/.nuget/NuGet.targets
+++ b/NugetLikeAPro/.nuget/NuGet.targets
@@ -1,5 +1,8 @@
 <?xml version="1.0" encoding="utf-8"?>
 <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
+ <ItemGroup>
+ <OutputPackages Include="$(TargetDir)*.nupkg" />
+ </ItemGroup>
 <PropertyGroup>
 <SolutionDir Condition="$(SolutionDir) == '' Or $(SolutionDir) == '*Undefined*'">$(MSBuildProjectDirectory)\..\</SolutionDir>

@@ -83,6 +86,11 @@
 $(BuildDependsOn);
 BuildPackage;
 </BuildDependsOn>
+
+ <CleanDependsOn Condition="$(BuildPackage) == 'true'">
+ $(CleanDependsOn);
+ CleanPackages;
+ </CleanDependsOn>
 </PropertyGroup>

 <Target Name="CheckPrerequisites">
@@ -118,6 +126,10 @@
 Condition=" '$(OS)' == 'Windows_NT' " />
 </Target>

+ <Target Name="CleanPackages">
+ <Delete Files="@(OutputPackages)"></Delete>
+ </Target>
+
 <UsingTask TaskName="DownloadNuGet" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
 <ParameterGroup>
 <OutputFilename ParameterType="System.String" Required="true" />

In the first hunk I define a collection of items called "OutputPackages", which uses a glob to find all the NuGet packages in the bin directory, referred to in the script as TargetDir.

Then I use this item collection in the new CleanPackages target, defined in the last hunk.  CleanPackages is a very simple target that uses MSBuild's built-in Delete task to remove the files in the OutputPackages collection.

Finally, I instruct MSBuild to run this target during clean by hooking into the CleanDependsOn property in the second hunk.  CleanDependsOn is one of several hooks provided for modifying targets defined in Microsoft.Common.Targets. I add back any existing dependencies first, then append the CleanPackages target to the end of the list.  Now, MSBuild will clean up old packages whenever I Clean or Rebuild my project.

Write a push script

Pushing your packages to nuget.org is pretty simple because it is the default destination for nuget.exe.  Both NuGet.exe and the NuGet Package Explorer will allow you to specify a custom host to push your package to, but I'm paranoid that I will forget to specify the host and send packages to nuget.org that I don't want to share publicly.

So, to speed things up, and to keep the risk of mistakes to a minimum, I use a simple shell script to push my packages.  Here is an example that would push to a local ProGet server.

.nuget\NuGet.exe push .\TemporaryFile\bin\Debug\*.nupkg -apikey Admin:Admin -source http://localhost:81/nuget/Default

I specified ProGet's default credentials as the API key, but if you plan to push to nuget.org I suggest you use the NuGet "setapikey" option to configure the API key on your machine; that way you don't have to commit the key to source control.
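
A one-time setup looks something like this (the key shown is just a placeholder, of course):

.nuget\NuGet.exe setapikey your-api-key-here

After that, pushes to nuget.org no longer need an -apikey argument on the command line.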

Recap

In this post I showed how to create basic packages with MSBuild and how to customize them, then shared a couple of automation tips I find useful.  Once you have converted a few projects to produce packages this way, you can do the conversion in about 15 minutes for straightforward packages.  NuGet packages can become complex, and you may need to do a lot more in the customization stage.  However, for most cases I find that these few steps are enough: enable package restore, add the BuildPackage property, rip the nuspec file from the first package, customize a few pieces of AssemblyInfo and nuspec metadata, and start sharing your package.

Once you have the package, you can quit, or you can make your life a little easier by adding a cleanup target and a push script.  Either way, I hope you find this information useful, and a bit more approachable than my previous post on this topic.

The GitHub repository to accompany this post is here: https://github.com/jamesrcounts/Blogs/tree/master/NugetLikeAPro

Coding, Testing, Uncategorized

Clarify Your Test Intention with ApprovalTests

In this post I’m going to explore the benefits of shorter tests. Today, I’m not interested in shortening test run times (although that’s a good thing too). Instead, I am interested in shortening the amount of code I have to read before I can figure out (or remember) the intention of a test.

If you believe, even a little bit, that "the tests are the documentation" or "the tests are the spec", then the tests better be crystal clear about what they are trying to prove. If they are not clear then the "specification" aspect of the test will be lost to future readers (possibly yourself).

So let's look at one way intention gets less clear when tests are long.

Long Test

While engaged in some practical refactoring at work, I recently came across some really long tests. The general domain was parsing, but I’ve changed the specifics to protect the guilty. I’m pasting the complete test here because I want to give you a taste of how overwhelming the initial test looked.

namespace ApprovalsExample
{
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        public void ItConvertsJsonToPoco()
        {
            const string Source = @"{
            ""status"": ""upcoming"",
            ""visibility"": ""public"",
            ""maybe_rsvp_count"": 0,
            ""venue"": {
                ""id"": 11835602,
                ""zip"": ""92660"",
                ""lon"": -117.867828,
                ""repinned"": false,
                ""name"": ""TEKsystems"",
                ""state"": ""CA"",
                ""address_1"": ""100 Bayview Circle #3400"",
                ""lat"": 33.655819,
                ""city"": ""Newport Beach"",
                ""country"": ""us""
            },
            ""id"": ""124139172"",
            ""utc_offset"": -25200000,
            ""duration"": 10800000,
            ""time"": 1378947600000,
            ""waitlist_count"": 0,
            ""announced"": false,
            ""updated"": 1370985561000,
            ""yes_rsvp_count"": 7,
            ""created"": 1370985561000,
            ""event_url"": ""http://www.meetup.com/vNext-OrangeCounty/events/124139172/"",
            ""description"": ""<p><strong>Talk Info :</strong></p>\n<p>The techniques for building applications have changed dramatically in the last <br />\n\nfew years. Gone are the days of single-tier, battle-ship gray, boring user <br />\n\ninterfaces. Users demand that your applications (or portions) run on more than <br />\n\none device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />\n\nto create your business rules and data layer as a service. This seminar will <br />\n\nassume you have some knowledge of .NET but have been developing <br />\n\napplications the old way and you are now looking to see how to use WCF and <br />\n\nthe Model-View-View-Model (MVVM) design pattern to create applications that <br />\n\ncan be run one more than one user interface platform. This session has many <br />\n\ndemonstrations and you will be led step-by-step through the code. You will walk <br />\n\naway with a sample set of services that run on Silverlight, Windows Forms, <br />\n\nWPF, Windows Phone and ASP.NET.</p>\n<p> </p>\n<p><strong>About The Speaker</strong></p>\n<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />\n\nPartner in Southern California. Paul acts as the Microsoft Regional Director for <br />\n\nSouthern California assisting the local Microsoft offices with several of their <br />\n\nevents each year and being an evangelist for them. Paul has authored literally <br />\n\nhundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />\n\nWindows Phone and SQL Server. Paul can be reached via email at <br />\n\nPSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />\n\n<a href=\""http://www.CodeHaystack.com\"">www.CodeHaystack.com</a>.</p>"",
            ""how_to_find_us"": ""Office is on the 3rd floor of the North Tower - Occupied by TekSystems"",
            ""name"": ""Paul D. Sheriff - Architecting Applications for Multiple User Interfaces"",
            ""headcount"": 0,
            ""group"": {
                ""id"": 2983232,
                ""group_lat"": 33.650001525878906,
                ""name"": ""vNext_OC"",
                ""group_lon"": -117.58999633789062,
                ""join_mode"": ""open"",
                ""urlname"": ""vNext-OrangeCounty"",
                ""who"": ""Members""
            }
        }";

            var o = Event.DeserializeJson(Source);
            const string Answer = @"Announced: False, Created: 1370985561000, Description: <p><strong>Talk Info :</strong></p>
<p>The techniques for building applications have changed dramatically in the last <br />

few years. Gone are the days of single-tier, battle-ship gray, boring user <br />

interfaces. Users demand that your applications (or portions) run on more than <br />

one device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />

to create your business rules and data layer as a service. This seminar will <br />

assume you have some knowledge of .NET but have been developing <br />

applications the old way and you are now looking to see how to use WCF and <br />

the Model-View-View-Model (MVVM) design pattern to create applications that <br />

can be run one more than one user interface platform. This session has many <br />

demonstrations and you will be led step-by-step through the code. You will walk <br />

away with a sample set of services that run on Silverlight, Windows Forms, <br />

WPF, Windows Phone and ASP.NET.</p>
<p> </p>
<p><strong>About The Speaker</strong></p>
<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />

Partner in Southern California. Paul acts as the Microsoft Regional Director for <br />

Southern California assisting the local Microsoft offices with several of their <br />

events each year and being an evangelist for them. Paul has authored literally <br />

hundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />

Windows Phone and SQL Server. Paul can be reached via email at <br />

PSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />

<a href=""http://www.CodeHaystack.com"">www.CodeHaystack.com</a>.</p>, Duration: 10800000, EventUrl: , Group: ApprovalsExample.Group, HowToFindUs: , Headcount: 0, Id: 124139172, MaybeRsvpCount: 0, Name: Paul D. Sheriff - Architecting Applications for Multiple User Interfaces, Status: upcoming, Time: 1378947600000, Updated: 1370985561000, UtcOffset: 0, Venue: ApprovalsExample.Venue, Visibility: public, WaitlistCount: 0, YesRsvpCount: 0";
            Assert.AreEqual(Answer, o.ToString());
        }
    }
}

We can guess from the initial JSON blob (grabbed from vNext OC‘s meetup.com event stream) and the test name that the intention is to demonstrate something about converting JSON into .net objects. But the input data for the test is so large that we must scroll almost an entire page before seeing the first executable line of code:

var o = Event.DeserializeJson(Source);

Once we get past the first 40 or so lines, we finally see that the Event class does the parsing. Next we have 40 or so lines of expectation definition before we reach a very simple assert:

Assert.AreEqual(Answer, o.ToString());

So the test is not that hard to understand, but the signal-to-noise ratio is wimpy: 2:83. In this high level test, the specifics of the JSON source are not important. The only important thing about the source text is that it produces the expected result. Likewise the only important thing about the expected result is that it is correct and it corresponds to the provided input. So, both giant strings are noise.

Alternatives

Of course, the thrust of my argument is that ApprovalTests provides the best set of tools for cleaning up a test like this. But let me set up a couple of straw-men first.

AAA

Maybe you read the test and thought, "Fool! You didn’t follow triple-A!" While it is true that the test doesn’t conform to the Arrange/Act/Assert pattern, making it conform to AAA only yields a small improvement. By moving the call to DeserializeJson from the middle of the test down to the end, next to the assert, I now conform to the pattern:

[TestMethod]
public void ItConvertsJsonToPoco()
{
    /* 80 lines of "Arrange" omitted: the giant Source and Answer strings */

    var o = Event.DeserializeJson(Source);
    Assert.AreEqual(Answer, o.ToString());
}

What is the improvement? Well, now all the executable code sits together, so you no longer have to hunt for the "Act" part; just skip to the bottom, and there it is. Knowing where things should be is one of the strengths of AAA, I’ll concede that. Unfortunately, we haven’t done anything to fix the signal-to-noise ratio; it is still 2:83. It’s a little easier to find the signal, because it’s all bunched up at the end of the noise (past two pages of noise now).

Move the Noise

To gain any traction on the signal to noise ratio, we need to put the noise somewhere else.

Many testers labor under a misconception similar to this: "A test must not interact with any system except the one under test." Most include the file system under the category "any". Clearly, I am not a subscriber to this line of thinking, but I can take a jab at this straw-man by pointing out that the tests themselves live in code files on the file system. So, I would not worry about that, but since so many do, let's see what kind of improvement we can get by moving things around. We could promote the big strings to fields and reduce the number of lines in our test body.

[TestMethod]
public void ItConvertsJsonToPoco()
{
    var o = Event.DeserializeJson(Source);
    Assert.AreEqual(Answer, o.ToString());
}

This certainly makes this one test look nicer. If I only consider the test method we have fantastic signal-to-noise: 1:1. This is not to say that it is absolutely clear what this test intends to prove, but we can very quickly see how it tries to prove it. So, good signal-to-noise isn’t everything, but it helps.

Can we stop here and call it a day? You could, of course; the test still passes. Not surprisingly though, I say no.

I have problems with this approach. In this example, I’ve only written one test, and this solution seems to work OK, but does it scale? At work, the actual test suite contained many tests, and this solution would not scale well. Applying "move the noise" to all the tests would result in half-a-dozen "sources" and half-a-dozen "answers". These were of varying lengths, some much longer than 40 lines, so we are talking about a preamble of many hundreds of lines of "Arrange" starting off the class before we get to any "Act" or "Assert".

I also have a problem with maintaining giant strings inside the tests, no matter where they are put in the code. First, you often run afoul of newlines and quote marks. The newlines in the answer conform to the newlines in your environment; in my case this means CRLF. The JSON blob has a mixture of line endings, so something must be done to the answer or the source to get them to match. Then we have quote marks. The JSON uses double quotes, so I had to convert them to double-double quotes to make the multi-line verbatim string literal work. Of course I could have escaped everything and used a regular string literal instead… but that’s work too. I don’t want to do any extra work.
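
To see why this is tedious, here is the same tiny fragment both ways; neither form is pleasant to maintain at 40-plus lines:

// Verbatim literal: every embedded quote must be doubled.
const string Verbatim = @"{ ""status"": ""upcoming"" }";

// Regular literal: every embedded quote must be backslash-escaped instead.
const string Escaped = "{ \"status\": \"upcoming\" }";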

Giant strings in the code are also very easy to mess up. If you are clumsy like me (or maybe you are perfect… but you might have someone clumsy on your team) your cursor often ends up where you least expect it when you’re in the middle of typing a tweet to your sweet old grandmother (that’s what twitter is for, right?). Next thing you know, your test is failing because somehow the phrase "I really liked the pie @Grandma" ended up in your giant string. I don’t like constructing my tests in such a way that debugging sessions can result from dumb mistakes.

Use ApprovalTests to Hide the Noise

ApprovalTests for .net is an assertion library that enhances your existing test framework with new capabilities for long strings, dictionaries, collections, log files, web pages, WinForm views, WPF views, Entity Framework queries, event configurations, and RDLC reports. If this is the first you’ve ever heard of ApprovalTests, then I encourage you to explore further by watching a few short videos on YouTube, posted by the creator of ApprovalTests, Llewellyn Falco. Don’t let the purple hair put you off; they are great videos.

ApprovalTests provides the perfect solution for shortening the long test presented at the beginning of this post. In fact, that test’s original author had essentially re-invented approval testing without knowing it, and without gaining the power that the ApprovalTests library would provide. Our test has three parts: a known input, an action, and a known correct output. The output is the big answer string, and we know it is correct because the test passed when I inherited it from its original author. Approval testing is about capturing human intelligence: a human has declared, "This is what the DeserializeJson method produces." We should continue to check that the correct answer is given, and an approval test automates that check.

In particular, the ApprovalTests library not only automates this check for us, but provides us with better feedback on failure. It also hides the noisy strings most of the time, but will present us with an opportunity to review or update the answer when the test fails.
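
To make the mechanism concrete, here is a minimal sketch of what an approval check does. This is my own illustration, not the library's actual implementation, and the file naming is simplified:

using System;
using System.IO;

public static class ApprovalSketch
{
    // Illustration only: the real library derives the base name from the
    // test class and method, e.g. "JsonParserTest.ItConvertsJsonToPoco".
    public static void Verify(string received, string baseName)
    {
        var receivedFile = baseName + ".received.txt";
        var approvedFile = baseName + ".approved.txt";

        // Write what we received so a human (or a reporter) can inspect it.
        File.WriteAllText(receivedFile, received);

        var approved = File.Exists(approvedFile)
            ? File.ReadAllText(approvedFile)
            : string.Empty;

        if (!string.Equals(approved, received))
        {
            // The real library launches a reporter here so you can review
            // the difference and approve the new output if it is correct.
            throw new Exception("Received does not match approved: " + receivedFile);
        }

        // Match: clean up the received file and pass silently.
        File.Delete(receivedFile);
    }
}

Once the approved file contains the blessed output, the check passes on every run until the behavior changes.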

At work I refactored the original test into an ApprovalTest, but for this post, I’ll just continue from where we left off. I’ll post all the code so you can watch it shrink. So here is where we want to go:

namespace ApprovalsExample
{
    using System.IO;
    using ApprovalTests;
    using ApprovalTests.Reporters;
    using ApprovalUtilities.Utilities;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        [UseReporter(typeof(VisualStudioReporter))]
        public void ItConvertsJsonToPoco()
        {
            var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
            var o = Event.DeserializeJson(text);
            Approvals.Verify(o);
        }
    }
}

And here is where we are after "moving the noise":

[TestClass]
public class JsonParserTest
{
    private const string Expected = @"Announced: False, Created: 1370985561000, Description: <p><strong>Talk Info :</strong></p>
<p>The techniques for building applications have changed dramatically in the last <br />

few years. Gone are the days of single-tier, battle-ship gray, boring user <br />

interfaces. Users demand that your applications (or portions) run on more than <br />

one device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />

to create your business rules and data layer as a service. This seminar will <br />

assume you have some knowledge of .NET but have been developing <br />

applications the old way and you are now looking to see how to use WCF and <br />

the Model-View-View-Model (MVVM) design pattern to create applications that <br />

can be run one more than one user interface platform. This session has many <br />

demonstrations and you will be led step-by-step through the code. You will walk <br />

away with a sample set of services that run on Silverlight, Windows Forms, <br />

WPF, Windows Phone and ASP.NET.</p>
<p> </p>
<p><strong>About The Speaker</strong></p>
<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />

Partner in Southern California. Paul acts as the Microsoft Regional Director for <br />

Southern California assisting the local Microsoft offices with several of their <br />

events each year and being an evangelist for them. Paul has authored literally <br />

hundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />

Windows Phone and SQL Server. Paul can be reached via email at <br />

PSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />

<a href=""http://www.CodeHaystack.com"">www.CodeHaystack.com</a>.</p>, Duration: 10800000, EventUrl: , Group: ApprovalsExample.Group, HowToFindUs: , Headcount: 0, Id: 124139172, MaybeRsvpCount: 0, Name: Paul D. Sheriff - Architecting Applications for Multiple User Interfaces, Status: upcoming, Time: 1378947600000, Updated: 1370985561000, UtcOffset: 0, Venue: ApprovalsExample.Venue, Visibility: public, WaitlistCount: 0, YesRsvpCount: 0";

    private const string Source = @"{
        ""status"": ""upcoming"",
        ""visibility"": ""public"",
        ""maybe_rsvp_count"": 0,
        ""venue"": {
            ""id"": 11835602,
            ""zip"": ""92660"",
            ""lon"": -117.867828,
            ""repinned"": false,
            ""name"": ""TEKsystems"",
            ""state"": ""CA"",
            ""address_1"": ""100 Bayview Circle #3400"",
            ""lat"": 33.655819,
            ""city"": ""Newport Beach"",
            ""country"": ""us""
        },
        ""id"": ""124139172"",
        ""utc_offset"": -25200000,
        ""duration"": 10800000,
        ""time"": 1378947600000,
        ""waitlist_count"": 0,
        ""announced"": false,
        ""updated"": 1370985561000,
        ""yes_rsvp_count"": 7,
        ""created"": 1370985561000,
        ""event_url"": ""http://www.meetup.com/vNext-OrangeCounty/events/124139172/"",
        ""description"": ""<p><strong>Talk Info :</strong></p>\n<p>The techniques for building applications have changed dramatically in the last <br />\n\nfew years. Gone are the days of single-tier, battle-ship gray, boring user <br />\n\ninterfaces. Users demand that your applications (or portions) run on more than <br />\n\none device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />\n\nto create your business rules and data layer as a service. This seminar will <br />\n\nassume you have some knowledge of .NET but have been developing <br />\n\napplications the old way and you are now looking to see how to use WCF and <br />\n\nthe Model-View-View-Model (MVVM) design pattern to create applications that <br />\n\ncan be run one more than one user interface platform. This session has many <br />\n\ndemonstrations and you will be led step-by-step through the code. You will walk <br />\n\naway with a sample set of services that run on Silverlight, Windows Forms, <br />\n\nWPF, Windows Phone and ASP.NET.</p>\n<p> </p>\n<p><strong>About The Speaker</strong></p>\n<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />\n\nPartner in Southern California. Paul acts as the Microsoft Regional Director for <br />\n\nSouthern California assisting the local Microsoft offices with several of their <br />\n\nevents each year and being an evangelist for them. Paul has authored literally <br />\n\nhundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />\n\nWindows Phone and SQL Server. Paul can be reached via email at <br />\n\nPSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />\n\n<a href=\""http://www.CodeHaystack.com\"">www.CodeHaystack.com</a>.</p>"",
        ""how_to_find_us"": ""Office is on the 3rd floor of the North Tower - Occupied by TekSystems"",
        ""name"": ""Paul D. Sheriff - Architecting Applications for Multiple User Interfaces"",
        ""headcount"": 0,
        ""group"": {
            ""id"": 2983232,
            ""group_lat"": 33.650001525878906,
            ""name"": ""vNext_OC"",
            ""group_lon"": -117.58999633789062,
            ""join_mode"": ""open"",
            ""urlname"": ""vNext-OrangeCounty"",
            ""who"": ""Members""
        }
    }";

    /// <summary>
    /// Parse this JSON into a POCO object.
    /// </summary>
    [TestMethod]
    public void ItConvertsJsonToPoco()
    {
        var o = Event.DeserializeJson(Source);
        Assert.AreEqual(Expected, o.ToString());
    }
}

Let's start refactoring.

Hide Source in File

After adding ApprovalTests to the project using NuGet, I can take advantage of ApprovalUtilities to help me move the big source string into a file that sits next to the code file. I could do this by making a file and using cut and paste, but as I previously discussed, I had to mangle the source with double-double quotes to make the string literal work. I could demangle the source by hand, but letting the computer do it will be quicker and less error prone.

Here are the relevant portions of the code:

namespace ApprovalsExample
{
    using System.IO;
    using ApprovalUtilities.Utilities;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        /* Giant strings still here, omitted for clarity */

        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        public void ItConvertsJsonToPoco()
        {
            File.WriteAllText(PathUtilities.GetAdjacentFile("sample.json"), Source);
            var o = Event.DeserializeJson(Source);
            Assert.AreEqual(Expected, o.ToString());
        }
    }
}

I added a couple of namespaces that I will need going forward, and added a line of code to write the giant source string into a file. Notice that I am still using the giant source string in the test. I’m just going to change one thing at a time as I refactor, then run the tests before making the next change. The next time I run this test, PathUtilities will provide the full path to a non-existent file next to the code file called "sample.json". Then WriteAllText will create that file by dumping the giant source string into it. So I run the test, it passes, and now I have a copy of the source in "sample.json":

{
        "status": "upcoming",
        "visibility": "public",
        "maybe_rsvp_count": 0,
        "venue": {
            "id": 11835602,
            "zip": "92660",
            "lon": -117.867828,
            "repinned": false,
            "name": "TEKsystems",
            "state": "CA",
            "address_1": "100 Bayview Circle #3400",
            "lat": 33.655819,
            "city": "Newport Beach",
            "country": "us"
        },
        "id": "124139172",
        "utc_offset": -25200000,
        "duration": 10800000,
        "time": 1378947600000,
        "waitlist_count": 0,
        "announced": false,
        "updated": 1370985561000,
        "yes_rsvp_count": 7,
        "created": 1370985561000,
        "event_url": "http://www.meetup.com/vNext-OrangeCounty/events/124139172/",
        "description": "<p><strong>Talk Info :</strong></p>\n<p>The techniques for building applications have changed dramatically in the last <br />\n\nfew years. Gone are the days of single-tier, battle-ship gray, boring user <br />\n\ninterfaces. Users demand that your applications (or portions) run on more than <br />\n\none device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />\n\nto create your business rules and data layer as a service. This seminar will <br />\n\nassume you have some knowledge of .NET but have been developing <br />\n\napplications the old way and you are now looking to see how to use WCF and <br />\n\nthe Model-View-View-Model (MVVM) design pattern to create applications that <br />\n\ncan be run one more than one user interface platform. This session has many <br />\n\ndemonstrations and you will be led step-by-step through the code. You will walk <br />\n\naway with a sample set of services that run on Silverlight, Windows Forms, <br />\n\nWPF, Windows Phone and ASP.NET.</p>\n<p> </p>\n<p><strong>About The Speaker</strong></p>\n<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />\n\nPartner in Southern California. Paul acts as the Microsoft Regional Director for <br />\n\nSouthern California assisting the local Microsoft offices with several of their <br />\n\nevents each year and being an evangelist for them. Paul has authored literally <br />\n\nhundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />\n\nWindows Phone and SQL Server. Paul can be reached via email at <br />\n\nPSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />\n\n<a href=\"http://www.CodeHaystack.com\">www.CodeHaystack.com</a>.</p>",
        "how_to_find_us": "Office is on the 3rd floor of the North Tower - Occupied by TekSystems",
        "name": "Paul D. Sheriff - Architecting Applications for Multiple User Interfaces",
        "headcount": 0,
        "group": {
            "id": 2983232,
            "group_lat": 33.650001525878906,
            "name": "vNext_OC",
            "group_lon": -117.58999633789062,
            "join_mode": "open",
            "urlname": "vNext-OrangeCounty",
            "who": "Members"
        }
    }

Admittedly, the indentation is a little funky, but at least all the double-double quotes are now back to single double quotes. A trip to JSONLint shows the blob is kosher. Now I can refactor the test to use this file instead of the giant string. Only two lines need to change:

var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
var o = Event.DeserializeJson(text);

I changed WriteAllText to ReadAllText, then captured the result in a variable. Next, I updated the call to DeserializeJson to use the text I just read, instead of the string stored in Source. I run the test and it passes.
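
As an aside, in case you are wondering how GetAdjacentFile locates the right directory: it resolves a path next to the test's source file. Here is a rough sketch of the idea; this version uses the CallerFilePath attribute for simplicity, and the real utility's implementation may differ, so treat it as an illustration only:

using System.IO;
using System.Runtime.CompilerServices;

public static class PathSketch
{
    // Returns a path for fileName located next to the calling source file.
    public static string GetAdjacentFile(
        string fileName,
        [CallerFilePath] string sourcePath = "")
    {
        return Path.Combine(Path.GetDirectoryName(sourcePath), fileName);
    }
}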

Now my refactoring tool tells me that the Source field is unused. So I delete the giant string and run the test. It passes, leaving me with the same test, minus about 40 lines of string.

namespace ApprovalsExample
{
    using System.IO;
    using ApprovalUtilities.Utilities;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        private const string Expected = @"Announced: False, Created: 1370985561000, Description: <p><strong>Talk Info :</strong></p>
<p>The techniques for building applications have changed dramatically in the last <br />

few years. Gone are the days of single-tier, battle-ship gray, boring user <br />

interfaces. Users demand that your applications (or portions) run on more than <br />

one device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />

to create your business rules and data layer as a service. This seminar will <br />

assume you have some knowledge of .NET but have been developing <br />

applications the old way and you are now looking to see how to use WCF and <br />

the Model-View-View-Model (MVVM) design pattern to create applications that <br />

can be run one more than one user interface platform. This session has many <br />

demonstrations and you will be led step-by-step through the code. You will walk <br />

away with a sample set of services that run on Silverlight, Windows Forms, <br />

WPF, Windows Phone and ASP.NET.</p>
<p> </p>
<p><strong>About The Speaker</strong></p>
<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />

Partner in Southern California. Paul acts as the Microsoft Regional Director for <br />

Southern California assisting the local Microsoft offices with several of their <br />

events each year and being an evangelist for them. Paul has authored literally <br />

hundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />

Windows Phone and SQL Server. Paul can be reached via email at <br />

PSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />

<a href=""http://www.CodeHaystack.com"">www.CodeHaystack.com</a>.</p>, Duration: 10800000, EventUrl: , Group: ApprovalsExample.Group, HowToFindUs: , Headcount: 0, Id: 124139172, MaybeRsvpCount: 0, Name: Paul D. Sheriff - Architecting Applications for Multiple User Interfaces, Status: upcoming, Time: 1378947600000, Updated: 1370985561000, UtcOffset: 0, Venue: ApprovalsExample.Venue, Visibility: public, WaitlistCount: 0, YesRsvpCount: 0";

        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        public void ItConvertsJsonToPoco()
        {
            var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
            var o = Event.DeserializeJson(text);
            Assert.AreEqual(Expected, o.ToString());
        }
    }
}

Hide Expectation in File

I could use a similar technique to hide the expectation in a file, but I don’t need to because hiding the expectation is built into the library. This is one of the tasks that ApprovalTests excels at. So, leaving all else the same, I will add a couple namespaces to the code, and make a couple small changes to the test.

namespace ApprovalsExample
{        
    using ApprovalTests;
    using ApprovalTests.Reporters;
    /* Other namespace imports remain the same */

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        private const string Expected = @"Announced: False, Created: 1370985561000, Description: ..."
        /* this giant string remains here for now */

        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        [UseReporter(typeof(VisualStudioReporter))]
        public void ItConvertsJsonToPoco()
        {
            var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
            var o = Event.DeserializeJson(text);
            Assert.AreEqual(Expected, o.ToString());
            Approvals.Verify(o);
        }
    }
}

I run this test and it fails, but this failure now occurs after the Assert, when I make the call to Verify. This is expected behavior for ApprovalTests. Until I have approved my expected output, ApprovalTests cannot check it for me, so it must continue to fail until I give my blessing to something. Besides failing, it also gives me the opportunity to review the results by launching a reporter. In this case, the output appears in Visual Studio’s diff viewer because I specified the VisualStudioReporter when I attached the UseReporter attribute to the test method.

The output we see on the left side is simply the result of converting the instance o into a string. Event happens to have a decent ToString implementation, but I could have manipulated the output by formatting or redacting the data before calling Verify. Now the only question is whether I should give this result my blessing.

In fact, it’s not a question at all: I know that I can immediately approve the output because the original test still passes. Although the test shows as a failure in the test runner, I can see that it failed when it reached the Approval, meaning the Assert still passed. Since the assert checks the same output that Verify checks, if the Assert is good, the output received by Verify must also be good. Visual Studio does not provide merging unless you are connected to TFS (as far as I can tell), so my options for approval are:

  1. Select all the left side and copy/paste to the right side.
  2. Use file explorer to rename the "received" file to JsonParserTest.ItConvertsJsonToPoco.approved.txt.

I will go with option two because I don’t trust copy/paste not to make mischief with things like line-endings and character encoding.
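
Assuming the default layout, where ApprovalTests writes the received file next to the test code, the rename is a one-liner from a command prompt in that directory:

ren JsonParserTest.ItConvertsJsonToPoco.received.txt JsonParserTest.ItConvertsJsonToPoco.approved.txt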

After renaming the file, I run the test again and it passes. I should note that I normally choose to use the composite DiffReporter, which searches my system for a working diff utility and uses that to show me the results. These utilities (KDiff3, BeyondCompare, Perforce, and many more…) usually let me approve the result without resorting to renaming files. I don’t know what Microsoft thinks it is accomplishing by hobbling their diff utility in this way.
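
Switching is just a matter of changing the attribute on the test method; DiffReporter lives in the same ApprovalTests.Reporters namespace we already imported:

[UseReporter(typeof(DiffReporter))]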

Next, I delete the original assert, re-run the test, and it passes.

/// <summary>
/// Parse this JSON into a POCO object.
/// </summary>
[TestMethod]
[UseReporter(typeof(VisualStudioReporter))]
public void ItConvertsJsonToPoco()
{
    var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
    var o = Event.DeserializeJson(text);
    Approvals.Verify(o);
}

Now that the original Assert is gone, my refactoring tool tells me that the Expected field (formerly Answer) is unused, so I delete it, and run the test.

With the second giant string removed, I’m left with this:

namespace ApprovalsExample
{
    using System.IO;
    using ApprovalTests;
    using ApprovalTests.Reporters;
    using ApprovalUtilities.Utilities;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        [UseReporter(typeof(VisualStudioReporter))]
        public void ItConvertsJsonToPoco()
        {
            var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
            var o = Event.DeserializeJson(text);
            Approvals.Verify(o);
        }
    }
}

And I’ve reached my goal. If you still care about the signal-to-noise ratio, it’s 2:3. But more importantly, the entire test, including all the cruft of namespaces, attributes and comments, can be seen and understood at a glance. I would probably not spend more than a few seconds reading this test before moving on to read the actual implementation of DeserializeJson. ApprovalTests has allowed me to shorten up this test, which makes it take up less mental real estate and lets me spend more of my brain thinking about the production code instead of the test.

The code for this example is available on GitHub.

Coding

What’s New in CompositionTests 2.0

Download the latest version of CompositionTests from nuget.org!

ApprovalTests 3.0

Updated the dependency on ApprovalTests to 3.0.01. Thanks to the new version updating policy for ApprovalTests, CompositionTests should remain forward compatible with future versions of ApprovalTests, unless there are breaking changes in the API.

New version policy

Following Llewellyn’s lead with ApprovalTests, I am adopting a JSON.NET-style version update policy. Adopting this policy will enable me to sign CompositionTests in the future without creating forward-compatibility problems for anyone else. For now, the package remains unsigned because its other dependency, the MEFX Diagnostic Library, is unsigned. I’ll have to decide whether I’m willing to do anything about that before I can consider a signed version of CompositionTests.

The impact is that the CompositionTests AssemblyVersion will stay at 2.0.0 from now on. The real version can be found by looking at AssemblyFileVersion, or by looking at the nuget package version, which will be 2.0.1 for this release.
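
In AssemblyInfo.cs terms, the policy looks something like this; the values are illustrative, based on the version numbers mentioned above:

using System.Reflection;

// Frozen from now on, so strong-named consumers keep binding across releases.
[assembly: AssemblyVersion("2.0.0")]

// The "real" version, bumped with every release (2.0.1 for this one).
[assembly: AssemblyFileVersion("2.0.1")]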

Common Language Specification Compliance

The CompositionTests library now declares itself CLS compliant. However, MEFX.Core does not make the same declaration, so certain methods that interact with the core are individually marked non-compliant. I don’t think that MEFX.Core uses anything non-compliant; the library is simply missing the declaration of compliance. I don’t think Microsoft plans to provide any more updates to this piece, so I’ll have to decide whether I’m willing to modify and maintain a fork of MEFX.Core before I can do anything about that missing attribute.
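
In code, the arrangement looks roughly like this; the class and member shown are hypothetical, standing in for the methods whose signatures expose MEFX.Core types:

using System;

// Assembly-level declaration of compliance.
[assembly: CLSCompliant(true)]

public static class Example
{
    // Members whose signatures expose MEFX.Core types opt out individually.
    [CLSCompliant(false)]
    public static void InteractsWithMefxCore(/* MEFX.Core types */) { }
}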

Removed Obsolete Methods

Methods and types marked with the ObsoleteAttribute in the 1.0 time-frame have been removed in order to clean up the interface in 2.0. You must now migrate to Verify* and MefComposition if you wish to use new versions of the library.