NuGet like a Pro, the MSBuild way

Back in 2012 I posted an article on this blog called “Creating Packages with NuGet the MSBuild Way”. That post described an MSBuild-integrated method to create NuGet packages from your own source code. It has remained one of my most popular posts. Like many popular things on the internet, it has been out of date for some time now. When I check on my blog and see that the most visited article of the day is “Creating Packages with NuGet the MSBuild Way”, I wonder if visitors know that it’s out of date. Do they dismiss me as a crank and leave the page immediately? Even worse: do they follow the outdated and complicated recipe described in the post?

In 2012, I needed to integrate packaging into MSBuild because I could not find a plug-in for CruiseControl.net that would create NuGet packages.  There may be a plug-in now; I don’t know.  In the couple of years I have spent creating NuGet packages, many of the tools I use from day to day have changed, including my source control and continuous integration options. Even though I now have the option to create CI builds on TFS, where NuGetter is available, I still use MSBuild to configure my projects to create packages every time I hit F6.

I have a new, simple process for setting this up, and it usually takes me about five minutes to convert an existing project to produce a NuGet package as part of its build output. I start with some existing code, enable package restore, make one small edit to my project file, and build. That’s all it takes to get the first package in the first five minutes.

If I want to continue customizing after creating the first package, I pull the nuspec file out of the existing package, put the nuspec file next to my project file, and customize from there; that’s the second five minutes.

Finally, I make some small modifications to the nuget.targets file provided by package restore in order to automate some cleanup; that takes about five more minutes.

It takes me about fifteen minutes to get everything set up just how I like it, but if your needs are simple, you can be done in five minutes. Hopefully this simplified process will be much more useful to my future visitors and help you, dear reader, understand how easy it is to create NuGet packages for your open source (or private) projects.  So read on for all the details!

Build

Start with Some Code

Any Class Library will do.  The important thing is that it’s something you want to share.  Either it’s your open source project, or a bit of private code which you’d like to share with your customers, other departments in your organization, or just your team.

For this example I’ve created a super-cool class called TemporaryFile.  TemporaryFile provides a disposable wrapper around a FileInfo which deletes the file when the Dispose method executes.  This allows the user to control the lifetime of the temporary file with a using statement, or trust the garbage collector to take care of it during finalization (there is a usage sketch a little further below).  I find myself creating and deleting temporary files for a certain class of unit tests, and a wrapper like this takes a lot of the grunt work out of the task.

namespace TemporaryFile
{
    using System;
    using System.IO;
    using ApprovalUtilities.Utilities;

    public class Temp : IDisposable
    {
        private readonly FileInfo backingFile;

        public Temp(string name)
        {
            this.backingFile =
                            new FileInfo(PathUtilities.GetAdjacentFile(name));
            this.backingFile.Create().Close();
        }

        ~Temp()
        {
            this.Dispose();
        }

        public FileInfo File
        {
            get
            {
                return this.backingFile;
            }
        }

        public void Dispose()
        {
            // File on the file system is not a managed resource
            if (this.backingFile.Exists)
            {
                this.backingFile.Delete();
            }
        }
    }
}

Notice that the class uses a method from PathUtilities in ApprovalUtilities (part of ApprovalTests).  I added this method call solely to generate a dependency on another package, which in turn helps demonstrate how much metadata NuGet can infer for you without explicit configuration.  Relying on inference is a big part of keeping this process fast and simple, as long as the inferred information meets your needs.

However, the way I used PathUtilities here turned out to be a bug.  So don’t copy this code.  It is useful to have a bug in the code when doing demos, so I left it in there.  If you think the temporary file idea sounds super useful, a bug-free version is now available as part of ApprovalUtilities.
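
To show the intended usage pattern, here is a quick sketch of a consumer test.  The test name and file contents are made up for illustration, and given the bug just mentioned, treat this as a statement of intent rather than a verified, passing test.

namespace TemporaryFile.Tests
{
    using System.IO;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class TempUsageSketch
    {
        [TestMethod]
        public void DeletesFileWhenDisposed()
        {
            string path;

            // The using block bounds the lifetime of the temporary file.
            using (var temp = new Temp("scratch.txt"))
            {
                path = temp.File.FullName;
                File.WriteAllText(path, "throwaway data");
                Assert.IsTrue(File.Exists(path));
            }

            // Dispose ran at the end of the block, so the file should be gone.
            Assert.IsFalse(File.Exists(path));
        }
    }
}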

If you examine the NugetLikeAPro repository on GitHub, TemporaryFile is a plain old C# class library.  It has a test project, but not much else is going on.

Enable Package Restore

The NuGet documentation is very good and covers a lot of ground, but if it covered everything then you wouldn’t need me!  I think that “Using NuGet without committing packages to source control” contains a lot of good information about what happens when you click the “Enable Package Restore” menu item, but it does not emphasize something very important to us as package creators: the NuGet.Build package installed by package restore contains everything you need to convert a project to create packages.

When you enable package restore, two packages are added to your solution: NuGet.CommandLine and NuGet.Build.  You could add these yourself, but that would be two steps instead of one.  Package restore also performs a third, more tedious step for you: it updates your project files to reference a new MSBuild script (NuGet.targets) and adds a $(SolutionDir) property so that the new script can do its work.  The project files need to reference the script in order to run the package restore target before the build.  What the package restore article doesn’t mention is that the script also defines a BuildPackage target, which can create a package for you after the build completes.

So, let’s enable package restore on TemporaryFile and see what we get.

Image of the Visual Studio solution context menu

Enable Package Restore

Just as promised by the documentation, the process added a solution folder and three files: NuGet.targets, NuGet.exe, and NuGet.Config.  NuGet.Config is only needed by TFS users, so you can probably delete it safely; it has no impact on what we are doing here.  From the red check marks in Solution Explorer we can also see that the process modified TemporaryFile.csproj and TemporaryFile.Tests.csproj.

Image showing Visual Studio solution explorer

Modifications to Solution

Let’s see what changes package restore made to TemporaryFile.

diff --git a/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj b/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
index c1e5a2c..85e156b 100644
--- a/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
+++ b/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
@@ -11,6 +11,8 @@
 <AssemblyName>TemporaryFile</AssemblyName>
 <TargetFrameworkVersion>v4.5</TargetFrameworkVersion>
 <FileAlignment>512</FileAlignment>
+ <SolutionDir Condition="$(SolutionDir) == '' Or $(SolutionDir) == '*Undefined*'">..\</SolutionDir>
+ <RestorePackages>true</RestorePackages>
 </PropertyGroup>
 <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
 <DebugSymbols>true</DebugSymbols>
@@ -49,6 +51,13 @@
 <None Include="packages.config" />
 </ItemGroup>
 <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
+ <Import Project="$(SolutionDir)\.nuget\NuGet.targets" Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />
+ <Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
+ <PropertyGroup>
+ <ErrorText>This project references NuGet package(s) that are missing on this computer. Enable NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
+ </PropertyGroup>
+ <Error Condition="!Exists('$(SolutionDir)\.nuget\NuGet.targets')" Text="$([System.String]::Format('$(ErrorText)', '$(SolutionDir)\.nuget\NuGet.targets'))" />
+ </Target>
 <!-- To modify your build process, add your task inside one of the targets below and uncomment it.
 Other similar extension points exist, see Microsoft.Common.targets.
 <Target Name="BeforeBuild">

Lines 18-24 create the reference to the NuGet.targets file in the .nuget folder and add some error handling if the script is missing during the build.  On line 9 the $(SolutionDir) property is created, and its default value is the project’s parent directory.  NuGet.targets uses this piece of configuration to find resources it needs, like NuGet.exe or the solution packages folder.  Finally, on line 10, package restore is enabled by adding the RestorePackages property and setting its value to true.  (Side note: this is a bit misleading.  It is getting harder and harder to opt out of package restore.  If you set this to false, Visual Studio will set it to true again during the build, unless you opt out again using a separate Visual Studio option.)

Editing project files is a bit tedious because you have to unload them, open them again as XML files, make your changes, and then reload them.  It’s not hard to learn, but it’s at least four mouse clicks and then some typing in an obscure syntax without much IntelliSense (although R# helps here).  It’s nice that the Enable Package Restore menu item did all that editing for you with one click.  Remember that the process also added two NuGet packages for you, so you can add all that to your overall click savings.  Note that the documentation mentions a new feature available in NuGet 2.7 called “Automatic Package Restore”.  This feature is enabled by default and solves some problems caused by package restore in certain scenarios.  Since it’s already on by default, we can imagine that someday a program manager at Microsoft is going to say, “Hey, let’s get rid of that ‘Enable Package Restore’ menu item.”

If the Enable Package Restore “gesture” is ever removed then we can install the NuGet packages ourselves and make the necessary changes to the project files.  This will get tedious and use way more than the five minutes I’ve allotted to the process, so I’m sure someone will think of a clever way to automate it again with yet another NuGet package.  However, this is all just my own speculation.  Today we live in the Golden Age of NuGet package creation, and package restore does 99% of the work for us.

One Small Edit

The NuGet.targets file provided by the NuGet.Build package defines a “BuildPackage” target.  Unlike the “RestorePackages” target, the BuildPackage target is not enabled by default, so we have to edit our project file to turn it on.  Editing the file in Visual Studio is a several-step process.  If I were to make the change from within the IDE, I would: right-click on the TemporaryFile node in Solution Explorer, select “Unload Project”, right-click again, select “Edit Project”, edit the project file, save the project file, close the project file, right-click the project again, and select “Reload Project”.  It’s a hassle.

An image of the project context menu in Solution Explorer

Too Much Work

I find it’s easiest to use a regular text editor to make this change rather than Visual Studio.  Anything should work; I often use Sublime Text or Notepad++, and plain old Notepad or WordPad would be fine too.  I prefer Sublime because I keep my “Projects” folder open in Sublime by default so that I can glance at code or edit these types of files quickly.  However you choose to do it, you only need to add one property in order to turn on the BuildPackage target.

diff --git a/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj b/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
index 85e156b..e42d010 100644
--- a/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
+++ b/NugetLikeAPro/TemporaryFile/TemporaryFile.csproj
@@ -13,6 +13,7 @@
 <FileAlignment>512</FileAlignment>
 <SolutionDir Condition="$(SolutionDir) == '' Or $(SolutionDir) == '*Undefined*'">..\</SolutionDir>
 <RestorePackages>true</RestorePackages>
+ <BuildPackage>true</BuildPackage>
 </PropertyGroup>
 <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
 <DebugSymbols>true</DebugSymbols>

I usually put it right below the RestorePackages property (line 9), but you can choose where it goes.  For example, if you wanted to create packages only for Debug builds, you could move it down a few lines to line 12, into the next PropertyGroup, which is only defined when the Debug configuration is selected (there is a sketch of that placement after the screenshot below).  The same technique would work to restrict package creation to Release builds, if that’s what you would like to do.  If you make the change outside Visual Studio, the IDE will notice and ask you if you want to reload the project.  You do, so click “Reload” or “Reload All”.

An Image of the "File Modification Detected" dialog

You need to reload now
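
Here is a rough sketch of the Debug-only placement mentioned above; the surrounding values are the usual Visual Studio defaults for a class library and may differ in your project.

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <DebugSymbols>true</DebugSymbols>
  <DebugType>full</DebugType>
  <Optimize>false</Optimize>
  <OutputPath>bin\Debug\</OutputPath>
  <!-- Because this group only applies to Debug|AnyCPU, packages are only built for Debug builds. -->
  <BuildPackage>true</BuildPackage>
</PropertyGroup>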

Once the BuildPackage property is set to true, MSBuild will execute the corresponding target in NuGet.targets and create a package for you on every build.  This package will get most of its configuration by inference, and it will appear in the bin directory next to your normal build outputs.

An image of Windows File Explorer

BuildPackage creates two packages by default

BuildPackage created two packages for me.  One is an ordinary NuGet package, which contains the TemporaryFile assembly, and one is a “Symbol” package, which includes the same assembly along with additional debugging resources.

An image of the standard NuGet package, open in NuGet Package Explorer

The ‘Standard’ NuGet package

We didn’t provide NuGet with any configuration information.  NuGet configured these packages by convention, and used the project and assembly information to infer what the package configuration should be.  By opening the standard package in NuGet Package Explorer we can see what NuGet came up with.  The Id, Version, Title, and Copyright are all inferred by examining assembly attributes.  These attributes are defined in AssemblyInfo.cs by default.

using System.Reflection;
using System.Runtime.InteropServices;

[assembly: AssemblyTitle("TemporaryFile")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("TemporaryFile")]
[assembly: AssemblyCopyright("Copyright ©  2013")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
[assembly: ComVisible(false)]
[assembly: Guid("4365a184-3046-4e59-ba28-0eeaaa41e795")]
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]

Authors and Owners are both set to “James” which is my user name on the machine where I created the package. NuGet would prefer to use the value from “AssemblyCompany” for these fields, but I haven’t filled it out yet. Since AssemblyCompany was empty, NuGet moved on to the next convention and chose my user name instead. NuGet would also prefer to use “AssemblyDescription” to populate the Description value, but this was also blank. Since there is no other logical place (yet) for NuGet to find a description, the program simply gave up and used the word “Description” instead. NuGet uses the build log to warn me (lines 4, 5, 11, and 12 below) when this happens.

1>  Attempting to build package from 'TemporaryFile.csproj'.
1>  Packing files from 'C:\Users\James\Documents\GitHub\Blogs\NugetLikeAPro\TemporaryFile\bin\Debug'.
1>  Found packages.config. Using packages listed as dependencies
1>EXEC : warning : Description was not specified. Using 'Description'.
1>EXEC : warning : Author was not specified. Using 'James'.
1>  Successfully created package 'C:\Users\James\Documents\GitHub\Blogs\NugetLikeAPro\TemporaryFile\bin\Debug\TemporaryFile.1.0.0.0.nupkg'.
1>
1>  Attempting to build symbols package for 'TemporaryFile.csproj'.
1>  Packing files from 'C:\Users\James\Documents\GitHub\Blogs\NugetLikeAPro\TemporaryFile\bin\Debug'.
1>  Found packages.config. Using packages listed as dependencies
1>EXEC : warning : Description was not specified. Using 'Description'.
1>EXEC : warning : Author was not specified. Using 'James'.
1>  Successfully created package 'C:\Users\James\Documents\GitHub\Blogs\NugetLikeAPro\TemporaryFile\bin\Debug\TemporaryFile.1.0.0.0.symbols.nupkg'.

Notice on lines 3 and 10 that NuGet detected that my project depends on another NuGet package. It finds the packages.config file where NuGet lists the project’s dependencies, reads it, and automatically configures TemporaryFile to depend on ApprovalUtilities.
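
For reference, the packages.config that NuGet reads here looks roughly like this; the version and target framework reflect whatever was installed into the project, so treat the exact values as illustrative.

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Each entry becomes a dependency in the generated package. -->
  <package id="ApprovalUtilities" version="3.0.5" targetFramework="net45" />
</packages>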

Overall NuGet did a pretty good job, and this package is actually usable.  Before we move on to customizing this package, let’s take a look at its sibling, the symbol package.

An Image of the Symbol package open in NuGet Package Explorer

The Symbol Package

The symbol package configuration is identical to the standard package.  Version, Id, Authors, and the rest are all the same.  However, there are more files in the symbol package.  Along with the class library, the lib/net45 folder contains the debugging symbols.  There is also a new folder called src.  Under the src directory, we can find all the source code for TemporaryFile.dll.  Altogether, this extra content gives Visual Studio enough information to provide a complete step-through debugging experience for this NuGet package.  What to do with this package and how to configure Visual Studio to use it are topics better handled on their own, so I won’t cover them further here.  Stay tuned.

Customize

There are a few things I would like to change in this package before sharing it with the team/customers/world. I don’t like the default values for Author/Owner and Description. At a minimum the Author field should contain my last name, or perhaps my twitter handle or something I’d like the world to know me by; it is also appropriate to use your company name in this field. The description is important because this package will probably end up in a gallery and certainly be presented in the NuGet Package Manager inside Visual Studio, so you need a good, concise description to give people an idea of what you are trying to share with them.  The copyright isn’t claimed by anyone either; be careful here, because some default Visual Studio installs automatically use “Microsoft” as the default copyright holder (this seems to have been fixed in Visual Studio 2013, where it is now just blank).  Finally, I don’t like the default 3-dot version number; I prefer the 2-dot version, so I’d like to change that too.  These are the low-hanging fruit which can be customized using AssemblyInfo.cs.

diff --git a/NugetLikeAPro/TemporaryFile/Properties/AssemblyInfo.cs b/NugetLikeAPro/TemporaryFile/Properties/AssemblyInfo.cs
index 7c3c830..bf494d8 100644
--- a/NugetLikeAPro/TemporaryFile/Properties/AssemblyInfo.cs
+++ b/NugetLikeAPro/TemporaryFile/Properties/AssemblyInfo.cs
@@ -2,14 +2,14 @@
 using System.Runtime.InteropServices;

 [assembly: AssemblyTitle("TemporaryFile")]
-[assembly: AssemblyDescription("")]
+[assembly: AssemblyDescription("A file that deletes itself when disposed")]
 [assembly: AssemblyConfiguration("")]
-[assembly: AssemblyCompany("")]
+[assembly: AssemblyCompany("ACME Co.")]
 [assembly: AssemblyProduct("TemporaryFile")]
-[assembly: AssemblyCopyright("Copyright ©  2013")]
+[assembly: AssemblyCopyright("Copyright © Jim Counts 2013")]
 [assembly: AssemblyTrademark("")]
 [assembly: AssemblyCulture("")]
 [assembly: ComVisible(false)]
 [assembly: Guid("4365a184-3046-4e59-ba28-0eeaaa41e795")]
-[assembly: AssemblyVersion("1.0.0.0")]
-[assembly: AssemblyFileVersion("1.0.0.0")]
\ No newline at end of file
+[assembly: AssemblyVersion("1.0.0")]
+[assembly: AssemblyFileVersion("0.0.1")]
\ No newline at end of file

I filled in or edited the attributes which NuGet checks when looking for configuration information: AssemblyDescription, AssemblyCompany, AssemblyCopyright, and AssemblyVersion.  I also changed AssemblyFileVersion, even though NuGet doesn’t use it, and I left AssemblyTitle alone because I was happy with the value already there.  After building again, these changes should show up in the newly created package.

An Image of the NuGet Package Explorer showing updated metadata

Most AssemblyInfo changes are applied automatically

NuGet applied most of my changes automatically, and all the build warnings are gone.  But I expected a 2-dot version number both in the package name and as part of the metadata, and that 3-dot version is still hanging around.  I can take greater control over the version number, as well as many other aspects of the package metadata, by providing a “nuspec” metadata file.  If this file has the same base name as my project and sits in the same directory, then NuGet will prefer the data from the nuspec.

Pull the Nuspec File Out

You can generate nuspec files from assemblies or project files using NuGet.exe.  In the past I’ve found this method for creating nuspec files tedious because it generates configuration I don’t always need, or configuration with boilerplate text that I have to delete.  My old solution was some fairly complex MSBuild scripts that transformed the generated files, but today I just create the default package as described above, rip its metadata, and customize to my liking.  If you have NuGet Package Explorer open, it’s pretty easy to use the “Save Metadata As…” menu item under “File” and save the nuspec file next to your project file (remove the version number from the filename if you do this).
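
If you do want to try the generation route, the command is roughly this, run from the project directory (adjust the path to wherever NuGet.exe lives in your solution); it emits a tokenized nuspec full of the boilerplate I mentioned.

..\.nuget\NuGet.exe spec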

Another way to retrieve the package nuspec file is with an unzip tool.  NuGet packages are zip files, and tools like 7-Zip recognize this, but you can always change the extension from nupkg to zip if 7-Zip isn’t handy.   Once the file has a zip extension, any zip utility can manipulate it, including the native support built into Windows.

An image showing the nuget package as a zip, open in Windows File Explorer

Nupkg files are Zip files

You can extract all the files from the zip or just the nuspec file; the nuspec file is the only one you will need.

Put the Nuspec File Next to the Project

Once you have pulled the nuspec file out of the existing package, move it to the project directory.  It should sit in the same folder where the csproj file is (or vbproj, or fsproj) and have the same base name as the csproj.  There should be no version number in the nuspec file name, so remove it if there is.

An image showing the nuspec file in the project folder.

Put the nuspec file in the project folder

You can also add the item to the project using Visual Studio for easy access from the IDE, but it is not required.  I usually add it.

Make Changes

Now, let’s take a look at what is inside the nuspec file.

<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
  <metadata>
    <id>TemporaryFile</id>
    <version>1.0.0.0</version>
    <title>TemporaryFile</title>
    <authors>ACME Co.</authors>
    <owners>ACME Co.</owners>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>A file that deletes itself when disposed</description>
    <copyright>Copyright © Jim Counts 2013</copyright>
    <dependencies>
      <dependency id="ApprovalUtilities" version="3.0.5" />
    </dependencies>
  </metadata>
</package>

We can see that most of the information in the nuspec file is exactly what is displayed in the package explorer. I can now override the defaults by editing this file.  Any XML or text editor will work; it’s very convenient to use Visual Studio if you add the nuspec file to the project, so that’s what I usually do.

diff --git a/NugetLikeAPro/TemporaryFile/TemporaryFile.nuspec b/NugetLikeAPro/TemporaryFile/TemporaryFile.nuspec
index 5770b72..815c44e 100644
--- a/NugetLikeAPro/TemporaryFile/TemporaryFile.nuspec
+++ b/NugetLikeAPro/TemporaryFile/TemporaryFile.nuspec
@@ -2,15 +2,12 @@
 <package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
 <metadata>
 <id>TemporaryFile</id>
- <version>1.0.0.0</version>
+ <version>0.0.1</version>
 <title>TemporaryFile</title>
- <authors>ACME Co.</authors>
+ <authors>@jamesrcounts</authors>
 <owners>ACME Co.</owners>
 <requireLicenseAcceptance>false</requireLicenseAcceptance>
 <description>A file that deletes itself when disposed</description>
 <copyright>Copyright © Jim Counts 2013</copyright>
- <dependencies>
- <dependency id="ApprovalUtilities" version="3.0.5" />
- </dependencies>
 </metadata>
 </package>
\ No newline at end of file

I changed the version number to “0.0.1” and updated the author to use my twitter handle.  “ACME Co.” is still the owner, and I removed the dependency list.  I prefer to allow NuGet to continue to infer this information on its own.

With these changes, the next package I build should reflect the new version number in the file name, and show updated metadata for Version and Authors.  However, the dependency list should remain the same in the completed package.

An image of Nuget Package Explorer showing the applied customizations

That’s More Like It

Automate

You’ll need some way to share your package now that you’ve created one.  If it’s an open source project you can definitely upload it to nuget.org if you like.  For private code, that’s probably not a good idea.  There are solutions out there, and I wrote about one of them in a previous article: Use ProGet to Host Your Private Packages.  In the interest of making sure this article doesn’t get any longer than it already is, I won’t cover options for sharing private packages here.

However, there are a couple things you can do now which will make your life easier once you do start sharing your package.  First, NuGet.targets does not clean up after itself during clean and rebuild.  This means that all your old package versions will hang around in the build folder until you delete them manually.  Besides taking up space, those packages eventually slow you down when you get ready to share.  If you are using NuGet Package Explorer to share, you have to scroll past an increasingly long list of old package versions to find the new version you want to upload, and if you use the command-line utility, all those old versions increase the amount of typing and tabbing needed to complete the command.  Finally, I find the quickest way to push packages is with a custom script which wraps the command-line utility, and that script is much easier to write when the bin folder only contains the latest package.

Cleanup with nuget.targets

To integrate NuGet.targets with “Clean” and “Rebuild” you need to add a new target to the script, add a new item group which lists the files to clean, and finally add a hook using the “CleanDependsOn” property that will actually execute the target.

NuGet.targets is already part of your solution, in the .nuget folder; open it and add what you need.

diff --git a/NugetLikeAPro/.nuget/NuGet.targets b/NugetLikeAPro/.nuget/NuGet.targets
index 8962872..a5cebf3 100644
--- a/NugetLikeAPro/.nuget/NuGet.targets
+++ b/NugetLikeAPro/.nuget/NuGet.targets
@@ -1,5 +1,8 @@
 <?xml version="1.0" encoding="utf-8"?>
 <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
+ <ItemGroup>
+ <OutputPackages Include="$(TargetDir)*.nupkg" />
+ </ItemGroup>
 <PropertyGroup>
 <SolutionDir Condition="$(SolutionDir) == '' Or $(SolutionDir) == '*Undefined*'">$(MSBuildProjectDirectory)\..\</SolutionDir>

@@ -83,6 +86,11 @@
 $(BuildDependsOn);
 BuildPackage;
 </BuildDependsOn>
+
+ <CleanDependsOn Condition="$(BuildPackage) == 'true'">
+ $(CleanDependsOn);
+ CleanPackages;
+ </CleanDependsOn>
 </PropertyGroup>

 <Target Name="CheckPrerequisites">
@@ -118,6 +126,10 @@
 Condition=" '$(OS)' == 'Windows_NT' " />
 </Target>

+ <Target Name="CleanPackages">
+ <Delete Files="@(OutputPackages)"></Delete>
+ </Target>
+
 <UsingTask TaskName="DownloadNuGet" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
 <ParameterGroup>
 <OutputFilename ParameterType="System.String" Required="true" />

On lines 8-10 I define a collection of items called “OutputPackages” which uses a glob to find all the NuGet packages in the bin directory, referred to in the script as $(TargetDir).

Then I use this item collection in the new target defined on lines 30-32.  The CleanPackages target is a very simple target that uses MSBuild’s built-in Delete task to remove the files in the OutputPackages collection.

Finally, I instruct MSBuild to run this target during clean by hooking into the CleanDependsOn property using lines 19-22.  CleanDependsOn is one of several hooks provided for modifying targets defined in Microsoft.Common.Targets. On line 20, I add back any existing dependencies and on line 21 I append the CleanPackages target to the end of the list.  Now, MSBuild will clean up old packages whenever I Clean or Rebuild my project.

Write a push script

Pushing your packages to NuGet.org is pretty simple because it is the default for nuget.exe.  Both NuGet.exe and the NuGet Package Explorer will allow you to specify a custom host to push your package to, but I’m paranoid that I will forget to specify the host and send packages to nuget.org that I don’t want to share publicly.

So, to speed things up, and to keep the risk of mistakes to a minimum, I use a simple shell script to push my packages.  Here is an example that would push to a local ProGet server.

.nuget\NuGet.exe push .\TemporaryFile\bin\Debug\*.nupkg -apikey Admin:Admin -source http://localhost:81/nuget/Default

I specified ProGet’s default credentials as the API key, but if you plan to push to nuget.org I suggest you use the NuGet “setapikey” command to configure the API key on your machine; that way you don’t have to commit the key to source control.
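
For the nuget.org case, that one-time setup looks something like the following; the key shown is a placeholder for your own, and as I recall, omitting the -source argument stores the key for the default nuget.org gallery.

.nuget\NuGet.exe setapikey your-api-key-goes-here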

Recap

In this post I showed how to create basic packages with MSBuild and customize them, and I gave a couple of automation tips I find useful.  Once you have converted a few projects to produce packages this way, you can do the conversion in about 15 minutes for straightforward packages.  NuGet packages can become complex, and you may need to do a lot more in the customization stage.  However, for most cases I find that these few steps are enough: enable package restore, add the BuildPackage property, rip the nuspec file from the first package, customize a few pieces of AssemblyInfo and nuspec metadata, and start sharing your package.

Once you have the package, you can quit, or you can make your life a little easier by adding a cleanup target and a push script.  Either way, I hope you find this information useful, and a bit more approachable than my previous post on this topic.

The GitHub repository to accompany this post is here: https://github.com/jamesrcounts/Blogs/tree/master/NugetLikeAPro

Clarify Your Test Intention with ApprovalTests

In this post I’m going to explore the benefits of shorter tests. Today, I’m not interested in shortening test run times (although that’s a good thing too). Instead, I am interested in shortening the amount of code I have to read before I can figure out (or remember) the intention of a test.

If you believe, even a little bit, that "the tests are the documentation" or "the tests are the spec", then the tests better be crystal clear about what they are trying to prove. If they are not clear then the "specification" aspect of the test will be lost to future readers (possibly yourself).

So let’s look at one way intention gets less clear when tests are long.

Long Test

While engaged in some practical refactoring at work, I recently came across some really long tests. The general domain was parsing, but I’ve changed the specifics to protect the guilty. I’m pasting the complete test here because I want to give you a taste of how overwhelming the initial test looked.

namespace ApprovalsExample
{
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        public void ItConvertsJsonToPoco()
        {
            const string Source = @"{
            ""status"": ""upcoming"",
            ""visibility"": ""public"",
            ""maybe_rsvp_count"": 0,
            ""venue"": {
                ""id"": 11835602,
                ""zip"": ""92660"",
                ""lon"": -117.867828,
                ""repinned"": false,
                ""name"": ""TEKsystems"",
                ""state"": ""CA"",
                ""address_1"": ""100 Bayview Circle #3400"",
                ""lat"": 33.655819,
                ""city"": ""Newport Beach"",
                ""country"": ""us""
            },
            ""id"": ""124139172"",
            ""utc_offset"": -25200000,
            ""duration"": 10800000,
            ""time"": 1378947600000,
            ""waitlist_count"": 0,
            ""announced"": false,
            ""updated"": 1370985561000,
            ""yes_rsvp_count"": 7,
            ""created"": 1370985561000,
            ""event_url"": ""http://www.meetup.com/vNext-OrangeCounty/events/124139172/"",
            ""description"": ""<p><strong>Talk Info :</strong></p>\n<p>The techniques for building applications have changed dramatically in the last <br />\n\nfew years. Gone are the days of single-tier, battle-ship gray, boring user <br />\n\ninterfaces. Users demand that your applications (or portions) run on more than <br />\n\none device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />\n\nto create your business rules and data layer as a service. This seminar will <br />\n\nassume you have some knowledge of .NET but have been developing <br />\n\napplications the old way and you are now looking to see how to use WCF and <br />\n\nthe Model-View-View-Model (MVVM) design pattern to create applications that <br />\n\ncan be run one more than one user interface platform. This session has many <br />\n\ndemonstrations and you will be led step-by-step through the code. You will walk <br />\n\naway with a sample set of services that run on Silverlight, Windows Forms, <br />\n\nWPF, Windows Phone and ASP.NET.</p>\n<p> </p>\n<p><strong>About The Speaker</strong></p>\n<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />\n\nPartner in Southern California. Paul acts as the Microsoft Regional Director for <br />\n\nSouthern California assisting the local Microsoft offices with several of their <br />\n\nevents each year and being an evangelist for them. Paul has authored literally <br />\n\nhundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />\n\nWindows Phone and SQL Server. Paul can be reached via email at <br />\n\nPSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />\n\n<a href=\""http://www.CodeHaystack.com\"">www.CodeHaystack.com</a>.</p>"",
            ""how_to_find_us"": ""Office is on the 3rd floor of the North Tower - Occupied by TekSystems"",
            ""name"": ""Paul D. Sheriff - Architecting Applications for Multiple User Interfaces"",
            ""headcount"": 0,
            ""group"": {
                ""id"": 2983232,
                ""group_lat"": 33.650001525878906,
                ""name"": ""vNext_OC"",
                ""group_lon"": -117.58999633789062,
                ""join_mode"": ""open"",
                ""urlname"": ""vNext-OrangeCounty"",
                ""who"": ""Members""
            }
        }";

            var o = Event.DeserializeJson(Source);
            const string Answer = @"Announced: False, Created: 1370985561000, Description: <p><strong>Talk Info :</strong></p>
<p>The techniques for building applications have changed dramatically in the last <br />

few years. Gone are the days of single-tier, battle-ship gray, boring user <br />

interfaces. Users demand that your applications (or portions) run on more than <br />

one device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />

to create your business rules and data layer as a service. This seminar will <br />

assume you have some knowledge of .NET but have been developing <br />

applications the old way and you are now looking to see how to use WCF and <br />

the Model-View-View-Model (MVVM) design pattern to create applications that <br />

can be run one more than one user interface platform. This session has many <br />

demonstrations and you will be led step-by-step through the code. You will walk <br />

away with a sample set of services that run on Silverlight, Windows Forms, <br />

WPF, Windows Phone and ASP.NET.</p>
<p> </p>
<p><strong>About The Speaker</strong></p>
<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />

Partner in Southern California. Paul acts as the Microsoft Regional Director for <br />

Southern California assisting the local Microsoft offices with several of their <br />

events each year and being an evangelist for them. Paul has authored literally <br />

hundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />

Windows Phone and SQL Server. Paul can be reached via email at <br />

PSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />

<a href=""http://www.CodeHaystack.com"">www.CodeHaystack.com</a>.</p>, Duration: 10800000, EventUrl: , Group: ApprovalsExample.Group, HowToFindUs: , Headcount: 0, Id: 124139172, MaybeRsvpCount: 0, Name: Paul D. Sheriff - Architecting Applications for Multiple User Interfaces, Status: upcoming, Time: 1378947600000, Updated: 1370985561000, UtcOffset: 0, Venue: ApprovalsExample.Venue, Visibility: public, WaitlistCount: 0, YesRsvpCount: 0";
            Assert.AreEqual(Answer, o.ToString());
        }
    }
}

We can guess from the initial JSON blob (grabbed from vNext OC‘s meetup.com event stream) and the test name that the intention is to demonstrate something about converting JSON into .net objects. But the input data for the test is so large that we must scroll almost an entire page before seeing the first executable line of code:

var o = Event.DeserializeJson(Source);

Once we get past the first 40 or so lines, we finally see that the Event class does the parsing. Next we have 40 or so lines of expectation definition before we reach a very simple assert:

Assert.AreEqual(Answer, o.ToString());

So the test is not that hard to understand, but the signal-to-noise ratio is wimpy: 2:83. In this high level test, the specifics of the JSON source are not important. The only important thing about the source text is that it produces the expected result. Likewise the only important thing about the expected result is that it is correct and it corresponds to the provided input. So, both giant strings are noise.

Alternatives

Of course, the thrust of my argument is that ApprovalTests provides the best set of tools for cleaning up a test like this. But let me set up a couple of straw men first.

AAA

Maybe you read the test and thought, "Fool! You didn’t follow triple-A!" While it is true that the test doesn’t conform to the Arrange/Act/Assert pattern, making it conform to AAA only yields a small improvement. By moving the call to DeserializeJson from line 42 to line 83, I now conform to the pattern:

[TestMethod]
public void ItConvertsJsonToPoco()
{
    const string Source = @"{
    /* 80 lines of "Arrange" omitted */

    var o = Event.DeserializeJson(Source);
    Assert.AreEqual(Answer, o.ToString());
}

What is the improvement? Well, now all the code is next to each other, so you no longer have to hunt for the “Act” part; just skip to the bottom, and there it is. Knowing where things should be is one of the strengths of AAA, I’ll concede that. Unfortunately, we haven’t done anything to fix the signal-to-noise ratio; it is still 2:83. It’s a little easier to find the signal, because it’s all bunched up at the end of the noise (past two pages of noise now).

Move the Noise

To gain any traction on the signal-to-noise ratio, we need to put the noise somewhere else.

Many testers labor under a misconception similar to this: “A test must not interact with any system except the one under test.” They usually include the file system under the category “any”. Clearly, I am not a subscriber to this line of thinking, but I can take a jab at the straw man by pointing out that the tests exist in code files that live on the file system. So, I would not worry about that, but since so many do, let’s see what kind of improvement we can get by moving things around. We could promote the big strings to fields and reduce the number of lines in our test body.

[TestMethod]
public void ItConvertsJsonToPoco()
{
    var o = Event.DeserializeJson(Source);
    Assert.AreEqual(Answer, o.ToString());
}

This certainly makes this one test look nicer. If I only consider the test method, we have a fantastic signal-to-noise ratio: 1:1. This is not to say that it is absolutely clear what this test intends to prove, but we can very quickly see how it tries to prove it. So, good signal-to-noise isn’t everything, but it helps.

Can we stop here and call it a day? Of course, the truth is that you can, because the test still passes. Not surprisingly though, I say no.

I have problems with this approach. In this example, I’ve only written one test, and this solution seems to work OK, but does it scale? At work, the actual test suite contained many tests, and this solution would not scale well. Applying “move the noise” to all the tests would result in half-a-dozen “sources” and half-a-dozen “answers”. These were of varying lengths, some much longer than 40 lines, so we are talking about a preamble of many hundreds of lines of “Arrange” starting off the class before we get to any “Act” or “Assert”.

I also have a problem with maintaining giant strings inside the tests, no matter where they are put in the code. First, you often run afoul of newlines and quote marks. The newlines in the answer conform to the line endings in your environment; in my case this means CRLF. The JSON blob has a mixture of line endings, so something must be done to the answer or the source to get them to match. Then we have quote marks. The JSON uses double quotes, so I had to convert them to double-double quotes to make the multi-line verbatim string literal work. Of course I could have escaped everything and used a normal string literal… but that’s work too. I don’t want to do any extra work.
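
To make the quoting problem concrete, here is a tiny sketch of the two literal styles; the verbatim form is what forces the double-double quotes in the test above.

// Regular literal: double quotes are escaped with backslashes, newlines written as \r\n.
var escaped = "{ \"status\": \"upcoming\" }\r\n";

// Verbatim literal: double quotes are doubled, and line breaks are taken exactly as typed.
var verbatim = @"{ ""status"": ""upcoming"" }";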

Giant strings in the code are also very easy to mess up. If you are clumsy like me (or maybe you are perfect… but you might have someone clumsy on your team) your cursor often ends up where you least expect it when you’re in the middle of typing a tweet to your sweet old grandmother (that’s what twitter is for, right?). Next thing you know, your test is failing because somehow the phrase “I really liked the pie @Grandma” ends up in your giant string. I don’t like constructing my tests in such a way that debugging sessions can result from dumb mistakes.

Use ApprovalTests to Hide the Noise

ApprovalTests for .net is an assertion library that enhances your existing test framework with new capabilities for long strings, dictionaries, collections, log files, web pages, WinForms views, WPF views, Entity Framework queries, event configurations, and RDLC reports. If this is the first you’ve ever heard of ApprovalTests, then I encourage you to explore further by watching a few short videos on YouTube, posted by the creator of ApprovalTests, Llewellyn Falco. Don’t let the purple hair put you off; they are great videos.

ApprovalTests provides the perfect solution for shortening the long test presented at the beginning of this post. In fact, that test’s original author had essentially re-invented approval testing without knowing it, and without gaining the power that the ApprovalTests library would provide. Our test has three parts: a known input, an action, and a known correct output. The output is the big answer string, and we know it is correct because the test passed when I inherited it from its original author. Approval testing is about capturing human intelligence; a human has declared, “This is what the DeserializeJson method produces.” We should continue to check that the correct answer is given. An approval test automates this check.

In particular, the ApprovalTests library not only automates this check for us, but provides us with better feedback on failure. It also hides the noisy strings most of the time, but will present us with an opportunity to review or update the answer when the test fails.

At work I refactored the original test into an ApprovalTest, but for this post I’ll just continue from where we were. I’ll post all the code so you can watch it shrink. So here is where we want to go:

namespace ApprovalsExample
{
    using System.IO;
    using ApprovalTests;
    using ApprovalTests.Reporters;
    using ApprovalUtilities.Utilities;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        [UseReporter(typeof(VisualStudioReporter))]
        public void ItConvertsJsonToPoco()
        {
            var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
            var o = Event.DeserializeJson(text);
            Approvals.Verify(o);
        }
    }
}

And here is where we are after "moving the noise":

[TestClass]
public class JsonParserTest
{
    private const string Expected = @"Announced: False, Created: 1370985561000, Description: <p><strong>Talk Info :</strong></p>
<p>The techniques for building applications have changed dramatically in the last <br />

few years. Gone are the days of single-tier, battle-ship gray, boring user <br />

interfaces. Users demand that your applications (or portions) run on more than <br />

one device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />

to create your business rules and data layer as a service. This seminar will <br />

assume you have some knowledge of .NET but have been developing <br />

applications the old way and you are now looking to see how to use WCF and <br />

the Model-View-View-Model (MVVM) design pattern to create applications that <br />

can be run one more than one user interface platform. This session has many <br />

demonstrations and you will be led step-by-step through the code. You will walk <br />

away with a sample set of services that run on Silverlight, Windows Forms, <br />

WPF, Windows Phone and ASP.NET.</p>
<p> </p>
<p><strong>About The Speaker</strong></p>
<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />

Partner in Southern California. Paul acts as the Microsoft Regional Director for <br />

Southern California assisting the local Microsoft offices with several of their <br />

events each year and being an evangelist for them. Paul has authored literally <br />

hundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />

Windows Phone and SQL Server. Paul can be reached via email at <br />

PSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />

<a href=""http://www.CodeHaystack.com"">www.CodeHaystack.com</a>.</p>, Duration: 10800000, EventUrl: , Group: ApprovalsExample.Group, HowToFindUs: , Headcount: 0, Id: 124139172, MaybeRsvpCount: 0, Name: Paul D. Sheriff - Architecting Applications for Multiple User Interfaces, Status: upcoming, Time: 1378947600000, Updated: 1370985561000, UtcOffset: 0, Venue: ApprovalsExample.Venue, Visibility: public, WaitlistCount: 0, YesRsvpCount: 0";

    private const string Source = @"{
        ""status"": ""upcoming"",
        ""visibility"": ""public"",
        ""maybe_rsvp_count"": 0,
        ""venue"": {
            ""id"": 11835602,
            ""zip"": ""92660"",
            ""lon"": -117.867828,
            ""repinned"": false,
            ""name"": ""TEKsystems"",
            ""state"": ""CA"",
            ""address_1"": ""100 Bayview Circle #3400"",
            ""lat"": 33.655819,
            ""city"": ""Newport Beach"",
            ""country"": ""us""
        },
        ""id"": ""124139172"",
        ""utc_offset"": -25200000,
        ""duration"": 10800000,
        ""time"": 1378947600000,
        ""waitlist_count"": 0,
        ""announced"": false,
        ""updated"": 1370985561000,
        ""yes_rsvp_count"": 7,
        ""created"": 1370985561000,
        ""event_url"": ""http://www.meetup.com/vNext-OrangeCounty/events/124139172/"",
        ""description"": ""<p><strong>Talk Info :</strong></p>\n<p>The techniques for building applications have changed dramatically in the last <br />\n\nfew years. Gone are the days of single-tier, battle-ship gray, boring user <br />\n\ninterfaces. Users demand that your applications (or portions) run on more than <br />\n\none device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />\n\nto create your business rules and data layer as a service. This seminar will <br />\n\nassume you have some knowledge of .NET but have been developing <br />\n\napplications the old way and you are now looking to see how to use WCF and <br />\n\nthe Model-View-View-Model (MVVM) design pattern to create applications that <br />\n\ncan be run one more than one user interface platform. This session has many <br />\n\ndemonstrations and you will be led step-by-step through the code. You will walk <br />\n\naway with a sample set of services that run on Silverlight, Windows Forms, <br />\n\nWPF, Windows Phone and ASP.NET.</p>\n<p> </p>\n<p><strong>About The Speaker</strong></p>\n<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />\n\nPartner in Southern California. Paul acts as the Microsoft Regional Director for <br />\n\nSouthern California assisting the local Microsoft offices with several of their <br />\n\nevents each year and being an evangelist for them. Paul has authored literally <br />\n\nhundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />\n\nWindows Phone and SQL Server. Paul can be reached via email at <br />\n\nPSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />\n\n<a href=\""http://www.CodeHaystack.com\"">www.CodeHaystack.com</a>.</p>"",
        ""how_to_find_us"": ""Office is on the 3rd floor of the North Tower - Occupied by TekSystems"",
        ""name"": ""Paul D. Sheriff - Architecting Applications for Multiple User Interfaces"",
        ""headcount"": 0,
        ""group"": {
            ""id"": 2983232,
            ""group_lat"": 33.650001525878906,
            ""name"": ""vNext_OC"",
            ""group_lon"": -117.58999633789062,
            ""join_mode"": ""open"",
            ""urlname"": ""vNext-OrangeCounty"",
            ""who"": ""Members""
        }
    }";

    /// <summary>
    /// Parse this JSON into a POCO object.
    /// </summary>
    [TestMethod]
    public void ItConvertsJsonToPoco()
    {
        var o = Event.DeserializeJson(Source);
        Assert.AreEqual(Expected, o.ToString());
    }
}

Let’s start refactoring.

Hide Source in File

After adding ApprovalTests to the project using NuGet, I can take advantage of ApprovalUtilities to help me move the big source string into a file that sits next to the code file.  I could do this by making a file and using cut and paste, but as I previously discussed, I had to mangle the source with double-double quotes to make the string literal work.  I could demangle the source by hand, but letting the computer do it will be quick and less error-prone.

Here are the relevant portions of the code:

namespace ApprovalsExample
{
    using System.IO;
    using ApprovalUtilities.Utilities;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        /* Giant strings still here, omitted for clarity */

        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        public void ItConvertsJsonToPoco()
        {
            File.WriteAllText(PathUtilities.GetAdjacentFile("sample.json"), Source);
            var o = Event.DeserializeJson(Source);
            Assert.AreEqual(Expected, o.ToString());
        }
    }
}

I added a couple of namespaces that I will need going forward, and added a line of code to write the giant source string into a file. Notice that I am still using the giant source string in the test. I’m just going to change one thing at a time as I refactor, then run the tests before making the next change. The next time I run this test, PathUtilities will provide the full path to a non-existent file next to the code file called "sample.json". Then WriteAllText will create that file by dumping the giant source string into it. So I run the test, it passes, and now I have a copy of the source in "sample.json":

{
        "status": "upcoming",
        "visibility": "public",
        "maybe_rsvp_count": 0,
        "venue": {
            "id": 11835602,
            "zip": "92660",
            "lon": -117.867828,
            "repinned": false,
            "name": "TEKsystems",
            "state": "CA",
            "address_1": "100 Bayview Circle #3400",
            "lat": 33.655819,
            "city": "Newport Beach",
            "country": "us"
        },
        "id": "124139172",
        "utc_offset": -25200000,
        "duration": 10800000,
        "time": 1378947600000,
        "waitlist_count": 0,
        "announced": false,
        "updated": 1370985561000,
        "yes_rsvp_count": 7,
        "created": 1370985561000,
        "event_url": "http://www.meetup.com/vNext-OrangeCounty/events/124139172/",
        "description": "<p><strong>Talk Info :</strong></p>\n<p>The techniques for building applications have changed dramatically in the last <br />\n\nfew years. Gone are the days of single-tier, battle-ship gray, boring user <br />\n\ninterfaces. Users demand that your applications (or portions) run on more than <br />\n\none device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />\n\nto create your business rules and data layer as a service. This seminar will <br />\n\nassume you have some knowledge of .NET but have been developing <br />\n\napplications the old way and you are now looking to see how to use WCF and <br />\n\nthe Model-View-View-Model (MVVM) design pattern to create applications that <br />\n\ncan be run one more than one user interface platform. This session has many <br />\n\ndemonstrations and you will be led step-by-step through the code. You will walk <br />\n\naway with a sample set of services that run on Silverlight, Windows Forms, <br />\n\nWPF, Windows Phone and ASP.NET.</p>\n<p> </p>\n<p><strong>About The Speaker</strong></p>\n<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />\n\nPartner in Southern California. Paul acts as the Microsoft Regional Director for <br />\n\nSouthern California assisting the local Microsoft offices with several of their <br />\n\nevents each year and being an evangelist for them. Paul has authored literally <br />\n\nhundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />\n\nWindows Phone and SQL Server. Paul can be reached via email at <br />\n\nPSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />\n\n<a href=\"http://www.CodeHaystack.com\">www.CodeHaystack.com</a>.</p>",
        "how_to_find_us": "Office is on the 3rd floor of the North Tower - Occupied by TekSystems",
        "name": "Paul D. Sheriff - Architecting Applications for Multiple User Interfaces",
        "headcount": 0,
        "group": {
            "id": 2983232,
            "group_lat": 33.650001525878906,
            "name": "vNext_OC",
            "group_lon": -117.58999633789062,
            "join_mode": "open",
            "urlname": "vNext-OrangeCounty",
            "who": "Members"
        }
    }

Admittedly, the indentation is a little funky, but at least all the double-double quotes are now back to single double quotes. A trip to JSONLint shows the blob is kosher. Now I can refactor the test to use this file instead of the giant string. Only two lines need to change:

var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
var o = Event.DeserializeJson(text);

I changed WriteAllText to ReadAllText, then captured the result in a variable. Next, I updated the call to DeserializeJson to use the text I just read, instead of the string stored in Source. I run the test and it passes.

Now my refactoring tool tells me that the Source field is unused. So I delete the giant string and run the test. It passes, leaving me with the same test, minus about 40 lines of string.

namespace ApprovalsExample
{
    using System.IO;
    using ApprovalUtilities.Utilities;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        private const string Expected = @"Announced: False, Created: 1370985561000, Description: <p><strong>Talk Info :</strong></p>
<p>The techniques for building applications have changed dramatically in the last <br />

few years. Gone are the days of single-tier, battle-ship gray, boring user <br />

interfaces. Users demand that your applications (or portions) run on more than <br />

one device. This session will take you on a tour of how you should be architecting your application by breaking it up into services. You will learn how <br />

to create your business rules and data layer as a service. This seminar will <br />

assume you have some knowledge of .NET but have been developing <br />

applications the old way and you are now looking to see how to use WCF and <br />

the Model-View-View-Model (MVVM) design pattern to create applications that <br />

can be run one more than one user interface platform. This session has many <br />

demonstrations and you will be led step-by-step through the code. You will walk <br />

away with a sample set of services that run on Silverlight, Windows Forms, <br />

WPF, Windows Phone and ASP.NET.</p>
<p> </p>
<p><strong>About The Speaker</strong></p>
<p>Paul D. Sheriff is the President of PDSA, Inc. (www.pdsa.com), a Microsoft <br />

Partner in Southern California. Paul acts as the Microsoft Regional Director for <br />

Southern California assisting the local Microsoft offices with several of their <br />

events each year and being an evangelist for them. Paul has authored literally <br />

hundreds of books, webcasts, videos and articles on .NET, WPF, Silverlight, <br />

Windows Phone and SQL Server. Paul can be reached via email at <br />

PSheriff@pdsa.com. Check out Paul's new code generator 'Haystack' at <br />

<a href=""http://www.CodeHaystack.com"">www.CodeHaystack.com</a>.</p>, Duration: 10800000, EventUrl: , Group: ApprovalsExample.Group, HowToFindUs: , Headcount: 0, Id: 124139172, MaybeRsvpCount: 0, Name: Paul D. Sheriff - Architecting Applications for Multiple User Interfaces, Status: upcoming, Time: 1378947600000, Updated: 1370985561000, UtcOffset: 0, Venue: ApprovalsExample.Venue, Visibility: public, WaitlistCount: 0, YesRsvpCount: 0";

        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        public void ItConvertsJsonToPoco()
        {
            var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
            var o = Event.DeserializeJson(text);
            Assert.AreEqual(Expected, o.ToString());
        }
    }
}

Hide Expectation in File

I could use a similar technique to hide the expectation in a file, but I don’t need to because hiding the expectation is built into the library. This is one of the tasks that ApprovalTests excels at. So, leaving all else the same, I will add a couple namespaces to the code, and make a couple small changes to the test.

namespace ApprovalsExample
{        
    using ApprovalTests;
    using ApprovalTests.Reporters;
    /* Other namespace imports remain the same */

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        private const string Expected = @"Announced: False, Created: 1370985561000, Description: ...";
        /* this giant string remains here for now */

        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        [UseReporter(typeof(VisualStudioReporter))]
        public void ItConvertsJsonToPoco()
        {
            var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
            var o = Event.DeserializeJson(text);
            Assert.AreEqual(Expected, o.ToString());
            Approvals.Verify(o);
        }
    }
}

I run this test and it fails, but this failure now occurs after the Assert, when I make the call to Verify. This is expected behavior for ApprovalTests. Until I have approved my expected output, ApprovalTests cannot check it for me, so it must continue to fail until I give my blessing to something. Besides failing, it also gives me the opportunity to review the results by launching a reporter. In this case, the output appears in Visual Studio’s diff viewer because I specified the VisualStudioReporter when I attached the UseReporter attribute to the test method.

The output we see on the left side is simply the result of converting the instance o into a string. Event happens to have a decent ToString formatting method, but I could have manipulated the output by formatting or redacting the data before calling Verify. Now the only question is whether I should give this result my blessing.

In fact, it’s not a question at all: I know that I can immediately approve the output because the original test still passes. Although the test shows as a failure in the test runner, I can see that it failed when it reached the Approval, meaning the Assert still passed. Since the Assert checks the same output that Verify checks, if the Assert is good, the output received by Verify must also be good. Visual Studio does not provide merging unless you are connected to TFS (as far as I can tell), so my options for approval are:

  1. Select all the left side and copy/paste to the right side.
  2. Use file explorer to rename the "received" file to JsonParserTest.ItConvertsJsonToPoco.approved.txt.

I will go with option two because I don’t trust copy/paste not to make mischief with things like line-endings and character encoding.

After renaming the file, I run the test again and it passes. I should note that I normally choose to use the composite DiffReporter which searches my system for a working diff utility and uses that to show me the results. These utilities (Kdiff3, BeyondCompare, Perforce, and many more…) usually let me approve the result without resorting to renaming files. I don’t know what Microsoft thinks it is accomplishing by hobbling their diff utility in this way.
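
Switching reporters later is just an attribute change. Here is a sketch of what the test would look like with the composite reporter (the body stays exactly the same as above):

[TestMethod]
[UseReporter(typeof(DiffReporter))]
public void ItConvertsJsonToPoco()
{
    // Body unchanged; DiffReporter searches for an installed diff tool
    // (Kdiff3, BeyondCompare, Perforce, and so on) and launches the first one it finds.
}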

Next, I delete the original assert, re-run the test, and it passes.

/// <summary>
/// Parse this JSON into a POCO object.
/// </summary>
[TestMethod]
[UseReporter(typeof(VisualStudioReporter))]
public void ItConvertsJsonToPoco()
{
    var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
    var o = Event.DeserializeJson(text);
    Approvals.Verify(o);
}

Now that the original Assert is gone, my refactoring tool tells me that the Expected field (formerly Answer) is unused, so I delete it, and run the test.

With the second giant string removed, I’m left with this:

namespace ApprovalsExample
{
    using System.IO;
    using ApprovalTests;
    using ApprovalTests.Reporters;
    using ApprovalUtilities.Utilities;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    /// <summary>
    /// Describe a JSON parser.
    /// </summary>
    [TestClass]
    public class JsonParserTest
    {
        /// <summary>
        /// Parse this JSON into a POCO object.
        /// </summary>
        [TestMethod]
        [UseReporter(typeof(VisualStudioReporter))]
        public void ItConvertsJsonToPoco()
        {
            var text = File.ReadAllText(PathUtilities.GetAdjacentFile("sample.json"));
            var o = Event.DeserializeJson(text);
            Approvals.Verify(o);
        }
    }
}

And I’ve reached my goal. If you still care about the signal-to-noise ratio, it’s 2:3. But more importantly, the entire test, including all the cruft of namespaces, attributes, and comments, can be seen and understood at a glance. I would probably not spend more than a few seconds reading this test before moving on to read the actual implementation of DeserializeJson. ApprovalTests has allowed me to shorten up this test, which makes it take up less mental real estate and lets me spend more of my brain thinking about the production code instead of the test.

The code for this example is available on GitHub.

What’s New in CompositionTests 2.0

Download the latest version of CompositionTests from nuget.org!

ApprovalTests 3.0

Updated the dependency on ApprovalTests to 3.0.01. Thanks to the new version updating policy for ApprovalTests, CompositionTests should remain forward compatible with future versions of ApprovalTests, unless there are breaking changes in the API.

New version policy

Following Llewellyn’s lead with ApprovalTests, I am adopting a JSON.NET-style version update policy. Adopting this policy will enable me to sign CompositionTests in the future without creating forward-compatibility problems for anyone else. For now, the package remains unsigned because its other dependency, the MEFX Diagnostic Library, is unsigned. I’ll have to decide if I’m willing to do anything about that before I can consider a signed version of CompositionTests.

The impact is that the CompositionTests AssemblyVersion will stay at 2.0.0 from now on. The real version can be found by looking at AssemblyFileVersion, or by looking at the nuget package version, which will be 2.0.1 for this release.
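
Concretely, the policy boils down to something like this in AssemblyInfo.cs (a sketch, not the actual CompositionTests source):

using System.Reflection;

// The assembly version is pinned so existing references never break.
[assembly: AssemblyVersion("2.0.0")]

// The file version carries the real release number and matches the NuGet package.
[assembly: AssemblyFileVersion("2.0.1")]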

Common Language Specification Compliance

The CompositionTests library now declares itself CLS compliant. However, MEFX.Core does not make the same declaration, so certain methods that interact with the core are individually marked non-compliant. I don’t think that MEFX.Core uses anything non-compliant; the library is simply missing the declaration of compliance. I don’t think Microsoft has plans to provide any more updates to this piece, so I’ll have to decide whether I’m willing to modify and maintain a fork of MEFX.Core before I can do anything about that missing attribute.
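
The pattern looks something like this sketch (illustrative only; InteropExample is a made-up name, not a real CompositionTests type):

using System;

[assembly: CLSCompliant(true)]

public static class InteropExample
{
    // A member whose signature involves something non-compliant opts out individually;
    // the uint parameter is what makes the attribute necessary in this sketch.
    [CLSCompliant(false)]
    public static void TouchesTheCore(uint nonCompliantParameter)
    {
    }
}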

Removed Obsolete Methods

Methods and types marked with the ObsoleteAttribute in the 1.0 time-frame have been removed in order to clean up the interface in 2.0. You must now migrate to Verify* and MefComposition if you wish to use new versions of the library.

Beyond the Event Horizon: WinForms Plumbing

In my last post in this series about testing event configurations, I discovered that my code base thus far doesn’t work with the WinForms event system.  After examining the classes in the System.Windows.Forms namespace, I figured out that I’m going to need to put some plumbing in place before I can even think about writing a query to lock down the event configuration for WinForms.

This article might not make a lot of sense if you haven’t read the previous entry, and that article might be confusing if you haven’t read them all, so here’s a table of contents describing where we’ve been so far:

  1. “Beyond the Event Horizon: Delegate Basics” — Explores the useful Delegate.GetInvocationList method.
  2. “Beyond the Event Horizon: Event Basics” — Explains the relationship between Delegates and Events, and how the compiler implements simple events.
  3. “Beyond the Event Horizon: Events You Don’t Own” — Shows how to use reflection to retrieve delegates for events declared on classes you can’t or won’t change.
  4. “Beyond the Event Horizon: Event Complications” — Completes the toolset introduced in part 3 by handling inherited events and events of any delegate type.
  5. “Beyond the Event Horizon: WinForms Event System” — In which we discover that part 4 did not complete the toolset after all.

You can download the code associated with these articles from GitHub.  While I hope the code is interesting to you, it’s only a reimplementation of features already available in the ApprovalTests library.  You can download ApprovalTests from SourceForge or NuGet and start using these features immediately.  If you need help getting started with ApprovalTests, check out Llewellyn Falco’s video series on YouTube.

To pick up where we left off, I need to make some plumbing.  First I need to create a public wrapper for the inaccessible ListEntry class.  Then I need to create an enumerable adapter for the EventHandlerList class.

Speaking the Unspeakable

I’ll start by creating the public wrapper for ListEntry that will take care of all the reflection necessary to access its fields. First, I’ll need to figure out how to get an instance of ListEntry so I can write tests for the wrapper implementation. Here is my test skeleton:

[TestMethod]
public void RetrieveListEntryWithReflection()
{
    // Create an object which should contain an EventHandlerList with something in it
    // Get the private property Component.Events
    // Get the private field head
    // Assert that head's type is named "ListEntry"
}

I know that ListEntry is nested in EventHandlerList and that EventHandlerList.head is the only direct reference to a ListEntry value on EventHandlerList. I also want to guard against the possibility that a random type happens to have a field called “events” which has nothing to do with events. Here’s a prototype:

[TestMethod]
public void RetrieveListEntryWithReflection()
{
    // Create an object which should contain an EventHandlerList with something in it
    var value = new DemoForm();

    // Get the private field Component.events
    BindingFlags bindingFlags = BindingFlags.NonPublic | BindingFlags.Instance;
    var listInfo = value.GetType().EnumerateFieldsWithInherited(bindingFlags).SingleOrDefault(fi => fi.Name == "events" && typeof(EventHandlerList).IsAssignableFrom(fi.FieldType));
    Assert.IsNotNull(listInfo);
    var eventHandlerList = listInfo.GetValue(value);
    Assert.IsNotNull(eventHandlerList);

    // Get the private field head
    var headInfo = eventHandlerList.GetType().GetField("head", bindingFlags);
    Assert.IsNotNull(headInfo);
    var head = headInfo.GetValue(eventHandlerList);
    Assert.IsNotNull(head);

    // Assert that head's type is named "ListEntry"
    Assert.AreEqual("ListEntry", head.GetType().Name);
}

This isn’t pretty, but it works.  I see a couple groupings that I can make into methods.  It makes sense to pull out a method that gets the EventHandlerList and one that extracts the ListEntry called “head”.

Here are some tests for a new method, GetEventHandlerList:

[TestMethod]
public void GetEventHandlerList()
{
    Assert.IsNotNull(new DemoForm().GetEventHandlerList());
}

[TestMethod]
public void PocoHasNoEventHandlerList()
{
    Assert.IsNull(new Poco().GetEventHandlerList());
}

[TestMethod]
public void SpoilerHasNoEventHandlerList()
{
    Assert.IsNull(new Spoiler().GetEventHandlerList());
}

[TestMethod]
public void NullHasNoEventHandlerList()
{
    Assert.IsNull(ReflectionUtility.GetEventHandlerList(null));
}

Spoiler is a little class that contains a field called “events” which has nothing to do with raising events.

public class Spoiler
{
    private string events = "I spoil ur reflektion.";
}

Now I need to create the method to get these tests to compile.

public static EventHandlerList GetEventHandlerList(this object value)
{
    return null;
}

With this stub in place, the Null, Poco, and Spoiler tests pass.  To get the DemoForm test to pass, I will need to copy some code from RetrieveListEntryWithReflection.  After copying the relevant section and deleting the calls to Assert, I get this:

public static EventHandlerList GetEventHandlerList(this object value)
{
    BindingFlags bindingFlags = BindingFlags.NonPublic | BindingFlags.Instance;
    var listInfo = value.GetType().EnumerateFieldsWithInherited(bindingFlags)
        .SingleOrDefault(fi => fi.Name == "events" &&
                         typeof(EventHandlerList).IsAssignableFrom(fi.FieldType));
    var eventHandlerList = listInfo.GetValue(value);
    return (EventHandlerList)eventHandlerList;
}

With this implementation my DemoForm tests pass but my other three fail.  There is also some redundancy with methods and constants already provided by ReflectionUtility.  So, after a little refactoring I get this:

public static EventHandlerList GetEventHandlerList(this object value)
{
    var lists = from fieldInfo in GetType(value).EnumerateFieldsWithInherited(NonPublicInstance)
                where 
                    fieldInfo.Name == "events" &&
                    typeof(EventHandlerList).IsAssignableFrom(fieldInfo.FieldType)
                select fieldInfo.GetValue<EventHandlerList>(value);

    return lists.SingleOrDefault();
}

This implementation passes all four of the tests (DemoForm, Poco, Spoiler, and Null).  So what did I do?  First, I eliminated the local declaration of the binding flags and used the constant already defined by ReflectionUtility.  To get past the first null reference exception (caused by the Null test), I used my customized version of GetType, which returns typeof(void) when the input is null.  The where clause survived intact, but I modified the select clause.  I used my own custom GetValue<> method to handle casting from object to the EventHandlerList type.  For Null, Poco, and Spoiler, nothing on the type matches the constraints, so the select clause never executes and I don’t need additional null checking there.  Finally, I unpack the matching value, or return null if there were no matches.
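
For reference, those two helpers could look something like this sketch; the real implementations live in ReflectionUtility in the GitHub repo and may differ in detail:

// Mapping null to typeof(void) lets callers chain calls without null checks.
public static Type GetType(object value)
{
    return value == null ? typeof(void) : value.GetType();
}

// Wraps FieldInfo.GetValue and takes care of the cast from object.
public static T GetValue<T>(this FieldInfo fieldInfo, object instance) where T : class
{
    return fieldInfo.GetValue(instance) as T;
}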

Now I can update RetrieveListEntryWithReflection to use this method.

[TestMethod]
public void RetrieveListEntryWithReflection()
{
    // Create an object which should contain an EventHandlerList with something in it
    var value = new DemoForm();

    // Get the private field Component.events
    var eventHandlerList = value.GetEventHandlerList();
    Assert.IsNotNull(eventHandlerList);

    // Get the private field head
    BindingFlags bindingFlags = BindingFlags.NonPublic | BindingFlags.Instance;
    var headInfo = eventHandlerList.GetType().GetField("head", bindingFlags);
    Assert.IsNotNull(headInfo);
    var head = headInfo.GetValue(eventHandlerList);
    Assert.IsNotNull(head);

    // Assert that head's type is named "ListEntry"
    Assert.AreEqual("ListEntry", head.GetType().Name);
}

The next chunk of RetrieveListEntryWithReflection pulls the head field from the EventHandlerList. When it comes to extracting the head, I have the advantage of being able to restrict the input to EventHandlerList instances. This means I don’t have to worry so much about spoilers, but I still need to check for nulls.  Here are some tests:

[TestMethod]
public void DemoFormEventHandlerListHasHead()
{
    var eventHandlerList = new DemoForm().GetEventHandlerList();
    Assert.AreEqual("ListEntry", eventHandlerList.GetHead().GetType().Name);
}

[TestMethod]
public void ButtonHasNoHead()
{
    Assert.IsNull(new Button().GetEventHandlerList().GetHead());
}

[TestMethod]
public void NullHasNoHead()
{
    Assert.IsNull(ReflectionUtility.GetHead(null));
}

Note the second test, ButtonHasNoHead. Button is a control and has many things in common with Form, but in this case GetEventHandlerList returns null because I haven’t wired up any of Button’s events.

Here’s a first draft of GetHead copied from  RetrieveListEntryWithReflection:

public static object GetHead(this EventHandlerList value)
{
    BindingFlags bindingFlags = BindingFlags.NonPublic | BindingFlags.Instance;
    var headInfo = value.GetType().GetField("head", bindingFlags);
    var head = headInfo.GetValue(value);
    return head;
}

With GetHead I have no choice but to return an object, because the ListEntry type remains un-nameable.  This method is not well protected from null reference exceptions and two of my tests are failing because of that.  More refactoring is in order.

public static object GetHead(this EventHandlerList value)
{
    var headInfo = GetType(value).GetField("head", NonPublicInstance);
    return headInfo == null ? null : headInfo.GetValue(value);
}

Again I used the binding flags defined by ReflectionUtility, and the customized GetType method.  One more null check and all my tests are good to go.

I now have a method to retrieve objects of the right type (ListEntry) that I can use for testing.  I can also update RetrieveListEntryWithReflection to use the new method.

[TestMethod]
public void RetrieveListEntryWithReflection()
{
    // Create an object which should contain an EventHandlerList with something in it
    var value = new DemoForm();

    // Get the private field head
    var head = value.GetEventHandlerList().GetHead();
    Assert.IsNotNull(head);

    // Assert that head's type is named "ListEntry"
    Assert.AreEqual("ListEntry", head.GetType().Name);
}

In fact, I can use ApprovalTests to make this test not only more explicit, but shorter too.

[TestMethod]
public void RetrieveListEntryWithReflection()
{
    var head = new DemoForm().GetEventHandlerList().GetHead();
    ApprovalTests.Approvals.Verify(head.GetType().FullName);
}

By approving the full name, I feel more confident I’m getting the right type instead of some other type which might happen to have the same name in a different namespace.

Wrapping ListEntry

Because ListEntry is unspeakable outside an EventHandlerList instance, I’m forced to start my wrapper with a constructor that takes an object as its input.

public class ListEntryWrapper
{
    public ListEntryWrapper(object value)
    {

    }
}

Since I must pass an object I can’t rely on the compiler to require that it is a ListEntry. I need to verify this at runtime, then decide what I should do if something other than a ListEntry instance is provided.  I’ll do this by getting my hands on ListEntry’s Type using reflection, and comparing it to the type passed into the constructor.

private static readonly Lazy<Type> ListEntryType =
    new Lazy<Type>(() => typeof(EventHandlerList).GetNestedType("ListEntry", BindingFlags.NonPublic));

public ListEntryWrapper(object value)
{
    if (ReflectionUtility.GetType(value) == ListEntryType.Value)
    {
        // Do something with a genuine ListEntry instance.
    }
}

Since I know the nested type’s name, getting a reference to its Type is not too difficult.  Since every ListEntryWrapper will need access to this information and the information doesn’t change, I store the Type in a static, read only and Lazy field.  Using Lazy might be premature optimization, but I haven’t had much chance to play with Lazy and this seems like a good learning opportunity.

I don’t want to put the burden of type-checking on the caller, so I’ll use the NullObject pattern if the wrong type is passed in.  When the caller gives me the wrong kind of object I will just let the reference pass out of scope and implement appropriate “do-nothing” behavior in the wrapper’s properties. Since the ListEntry is part of a single-linked list, do-nothing implies that ListEntryWrapper’s properties should return null.

On the other hand, if the object really is a ListEntry I should store a reference to it so that I can reflect over the value later.

private readonly object listEntry;

public ListEntryWrapper(object value)
{
    if (ReflectionUtility.GetType(value) == ListEntryType.Value)
    {
        this.listEntry = value;
    }
}

Since ListEntry is unspeakable, all its members are unspeakable, and each member must be accessed through reflection.  Here’s a test I can use to get feedback while I build against the correct type, and one which shows the result when the type is incorrect. I also pulled the ceremony around getting a ListEntry instance into a convenience method, GetListEntry.

[TestMethod]
public void ListEntryIsWrapped()
{
    var listEntryWrapper = new ListEntryWrapper(GetListEntry());
    ApprovalTests.Approvals.Verify(listEntryWrapper.WritePropertiesToString());
}

[TestMethod]
public void WrongObjectIsntWrapped()
{
    var listEntryWrapper = new ListEntryWrapper(new object());
    ApprovalTests.Approvals.Verify(listEntryWrapper.WritePropertiesToString());
}

At the moment, both tests produce the same uninteresting results:

ListEntryWrapper
{
}

I should see some differentiation once I create accessors for each of ListEntry’s members: key, handler and next.

private static readonly Lazy<FieldInfo> KeyInfo =
    new Lazy<FieldInfo>(
        () => ListEntryType.Value.GetField("key", BindingFlags.NonPublic | BindingFlags.Instance));

public object Key
{
    get
    {
        return this.listEntry == null ? 
            null :
            KeyInfo.Value.GetValue(this.listEntry);
    }
}

Here is the result for the actual list entry:

ListEntryWrapper
{
    Key: System.Object
}

And the result for the NullObject scenario:

ListEntryWrapper
{
    Key: NULL
}

I can repeat this technique to implement the remaining properties. Things go fine for Handler, but I run into a problem implementing Next. To keep the linked list going, I need to wrap ListEntry.next in a new ListEntryWrapper. But, if the ListEntry is a singleton or the last link in the list, then “next” should be null.  The ListEntryWrapper constructor uses the customized GetType method to retrieve typeof(void) when value is null.  Since typeof(void) is not ListEntry’s type, the wrapper reverts to its NullObject behavior… and the linked list terminates with a NullObject followed by a null.  There is one extra link in the chain.  To avoid having to remember this whenever I use the wrapper, Next needs some special logic.

public ListEntryWrapper Next
{
    get
    {
        if (this.listEntry == null)
        {
            return null;
        }

        object nextValue = NextInfo.Value.GetValue(this.listEntry);
        return nextValue == null ? null : new ListEntryWrapper(nextValue);
    }
}

Now, if an object of the wrong type is passed in, listEntry will be null and Next will be null.  If the correct type is passed in, listEntry will be populated, but GetValue will return null when listEntry is the last link in the list.  When that happens, the property will return null; otherwise it will wrap the non-null value and return it.  This behavior is more intuitive and now the results look correct.  Here are the final results for the wrapped ListEntry.

ListEntryWrapper
{
    Key: System.Object
    Handler: System.EventHandler
    Next: NULL
}

To round out my tests, I should add one that covers the case where the ListEntry is not a singleton.

[TestMethod]
public void ListHasMoreThanOneEntry()
{
    var button = new Button();
    button.Click += (s, e) => { return; };
    button.LostFocus += (s, e) => { return; };
    var wrapper = new ListEntryWrapper(button.GetEventHandlerList().GetHead());
    ApprovalTests.Approvals.Verify(wrapper.WritePropertiesToString());
}

This test works as expected, and produces these results:

ListEntryWrapper
{
    Handler: System.EventHandler
    Key: System.Object
    Next: EventReflection.Demo.ListEntryWrapper
}

With this last bit, I have a wrapper for ListEntry that allows me to manipulate these objects like normal public types.  Now, I can turn to the next problem, which is that EventHandlerList doesn’t implement an IEnumerable interface.

Making EventHandlerList into a… List

Other than a few CRUD operations (add/remove/find) EventHandlerList is just a reference to one ListEntry, called “head”. To access any other list entries beyond “head”, you have to go through “head”. I’ll know that I’ve visited all the entries when I find a ListEntry where the “next” parameter is a null reference.

The procedure outlined above is called “walking” the list. It should be fairly easy to implement now that I have ListEntryWrapper to work with, and because I already wrote the reflection to retrieve the “head” as part of testing ListEntryWrapper. Rather than creating a wrapper class that implements IEnumerable<ListEntryWrapper>, I can simply take advantage of the yield statement and have the compiler generate the enumerable class for me.

I’ll create a method to produce the Button from ListHasMoreThanOneEntry, and reuse it in my next test. If I’m successful, a test like this should show two ListEntryWrappers in the result.

[TestMethod]
public void AsEnumerableMethodAdaptsEventHandlerList()
{
    var button = GetTestButton();
    ApprovalTests.Approvals.VerifyAll(
        button.GetEventHandlerList().AsEnumerable(),
        e => e.WritePropertiesToString());
}
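
GetTestButton is just the Button setup pulled out of ListHasMoreThanOneEntry, so a plausible sketch looks like this:

private static Button GetTestButton()
{
    // Same setup as ListHasMoreThanOneEntry: two handlers so the list has two entries.
    var button = new Button();
    button.Click += (s, e) => { return; };
    button.LostFocus += (s, e) => { return; };
    return button;
}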

I stub out a new extension method and call it AsEnumerable because that describes what the method does. Although there are methods called AsEnumerable in the framework, their parameters do not have types compatible with EventHandlerList, so I will need to provide the implementation. Of course, before I get too far ahead of myself, I should make sure my AsEnumerable implementation can handle nulls.

[TestMethod]
public void NullListIsEmpty()
{
    var button = new Button();
    Assert.IsFalse(button.GetEventHandlerList().AsEnumerable().Any());
}

I’ve decided to handle null by returning an empty set rather than throwing an exception or returning null. This will save me the trouble of checking for null when querying, since LINQ can handle empty sets just fine.

Here is an AsEnumerable implementation that satisfies both tests:

public static IEnumerable<ListEntryWrapper> AsEnumerable(this EventHandlerList source)
{
    object value = source.GetHead();
    if (value == null)
    {
        yield break;
    }

    for (var head = new ListEntryWrapper(value); head != null; head = head.Next)
    {
        yield return head;
    }
}

And the results show two items in the list.

ListEntryWrapper
{
    Handler: System.EventHandler
    Key: System.Object
    Next: EventReflection.Demo.ListEntryWrapper
}

ListEntryWrapper
{
    Handler: System.EventHandler
    Key: System.Object
    Next: NULL
}

And just like that, I can now bring all the power of LINQ to bear on EventHandlerList.
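
For example, here is a sketch of a small query over the adapter, reusing the GetTestButton helper and the wrapper’s Handler property:

var button = GetTestButton();
var handlerCount = button.GetEventHandlerList()
    .AsEnumerable()
    .Count(entry => entry.Handler is EventHandler);
// handlerCount is 2 for the test Button wired up earlier.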

Relationship With EventApprovals

The classes in ApprovalTests track pretty closely to what I’ve shown here.  The enumerable adapter is hosted in ApprovalUtilities.Reflection.HandlerListHelper.  This class also contains the GetHead method.  The AsEnumerable method is implemented with a while loop, but this detail has no effect.  ILSpy shows that the compiler converts this loop into a for loop like I’ve shown above.

In ApprovalTests, you can find the wrapper class in ApprovalUtilities.Reflection.HandlerListEntry.  Because the ApprovalTests libraries target .NET 3.5, Lazy<T> isn’t available.  The properties are still lazy, but the laziness is implemented by hand.  I created a GetField<T> method which took the field name as a parameter and leveraged the methods in ReflectionUtilities rather than specifying the binding flags a second time.  Conceptually, both wrappers work the same, including the extra null checking in the Next property.

Up Next

The plumbing is done and I can return to the domain problem of querying for events that have handlers attached.  Check back soon for the thrilling conclusion to this blog series: “Beyond the Event Horizon: Testing WinForms”.

Beyond the Event Horizon: WinForms Event System

You remember WinForms, don’t you? WinForms applications are still around, in huge quantities. Some of us still need to maintain or rehabilitate WinForms applications, and testing events is particularly important when working with WinForms. The WinForms designer takes responsibility for wiring up many events for you, then discourages you from thinking too hard about what it did. Not only does it put the event wiring code in a semi-hidden Designer file, it also encloses the code it doesn’t want you to touch in a region and adds a large comment explaining that you really should slowly back away before you hurt yourself.

Even if you heed all of the warnings and stay out of that code, it’s still far too easy to add useless empty handlers from within the designer, or worse, accidentally unhook your events.  Unit testing your handler implementations won’t help; nothing in a handler’s implementation changes when it is wired to (or unwired from) an event.  You need something like the event tests I’ve been working on to make sure that the proper code is invoked when you perform an action on the GUI (for example, clicking a button).

To build a system for inventorying events on WinForms controls, you should have a good idea how events work elsewhere in the .NET Framework.  Here’s a short list of posts on this blog which should get you up to speed:

  1. “Beyond the Event Horizon: Delegate Basics” — Explores the useful Delegate.GetInvocationList method.
  2. “Beyond the Event Horizon: Event Basics” — Explains the relationship between Delegates and Events, and how the compiler implements simple events.
  3. “Beyond the Event Horizon: Events You Don’t Own” — Shows how to use reflection to retrieve delegates for events declared on classes you can’t or won’t change.
  4. “Beyond the Event Horizon: Event Complications” — Completes the toolset introduced in part 3 by handling inherited events and events of any delegate type.

I’ve made the code associated with these articles available on GitHub.  The code described in this article is a reimplementation of features available in a free, open source testing library called ApprovalTests, which you can download from SourceForge or NuGet and start using immediately.  By using ApprovalTests, you can save yourself the bother of cut-and-paste, and the maintenance headache of keeping your own copy of the code.  Need help getting started with ApprovalTests?  Check out Llewellyn Falco’s fantastic series on YouTube and you will be up to speed in no time.

So, will my event testing system work on WinForms? On the surface it seems like it will, but let’s give it a try and see what happens.

Start With A Failing Test

First I’ll create a simple Form.  I won’t be using the designer.  Some of you might have never ventured into the “forbidden zone” inside the Designer file.  I’ll implement the Form by hand, and this will give you your first bit of insight into how events work in WinForms.

First I need to add a reference to System.Windows.Forms and import the namespace having the same name.

using System.Windows.Forms;

public class DemoForm : Form
{
}

While I design, I’ll use WinFormsApprovals.Verify to get feedback as I work.

[TestClass]
public class WinFormsDemo
{
    [TestMethod]
    public void VerifyDemoFormView()
    {
        WinFormsApprovals.Verify(new DemoForm());
    }
}

This test will continue to fail until I approve it, so each time I run it I’ll see what DemoForm looks like.  Now that I can see what I’m doing I can quickly add a few controls to the form. For example, here I add a Button.

private Button button1;

public DemoForm()
{
    this.button1 = new Button();
    this.button1.Text = "Click Me!";
    this.Controls.Add(this.button1);
}

And I can see what this looks like in TortoiseIDiff:

[Screenshot: WinFormsEvents]

And here’s a CheckBox:

private CheckBox checkBox1;

public DemoForm()
{
    this.button1 = new Button();
    this.button1.Text = "Click Me!";
    this.Controls.Add(this.button1);

    this.checkBox1 = new CheckBox();
    this.checkBox1.Text = "Check Me!";
    this.checkBox1.Location = new Point(
        this.button1.Location.X + this.button1.Width + 10,
        this.button1.Location.Y);
    this.Controls.Add(this.checkBox1);
}

The layout code is ugly, but it gets the job done.

[Screenshot: WinFormsEvents2]

Finally, I’ll have a ListBox too:

private ListBox listBox1;

public DemoForm()
{
    this.button1 = new Button();
    this.button1.Text = "Click Me!";
    this.Controls.Add(this.button1);

    this.checkBox1 = new CheckBox();
    this.checkBox1.Text = "Check Me!";
    this.checkBox1.Location = new Point(
        this.button1.Location.X + this.button1.Width + 10,
        this.button1.Location.Y);
    this.Controls.Add(this.checkBox1);

    this.listBox1 = new ListBox();
    this.listBox1.Location = new Point(
        10,
        this.button1.Location.Y + this.button1.Height + 10);
    this.listBox1.Size = new Size(
        this.Width - 40,
        this.Height - this.button1.Height - 40);
    this.Controls.Add(this.listBox1);
}

And here’s my final Form.  Fantastic design if you ask me.  I’ll apply a ClipboardReporterAttribute to the test method and approve the result.  Doing this locks down the look and feel.

[Screenshot: WinFormsEvents3]

Now I want to add another failing test to lock down the event handlers for the form.

[TestMethod]
public void VerifyDemoFormEvents()
{
    EventUtility.VerifyEventCallbacks(new DemoForm());
}

And here are the results:

Event callbacks for DemoForm

Throughout these posts I’ve been implying that this code base won’t work with WinForms, but we haven’t proved that yet.  Before I can say that it’s not working, I need to wire up some events!

public DemoForm()
{
    // ...

    this.button1.Click += this.ButtonClick;
    this.checkBox1.CheckedChanged += this.HandleCheckedChanged;
    this.Load += this.HandleFormLoad;
}

I’ve wired up three events, but none of them show up in the results.  Right away I can see that there might be a problem with the Button and CheckBox: those events are wired up to child objects, and I’ve never tried to do anything other than get events off the top-level object.  However, the HandleFormLoad method is wired up to the Form.Load event, which is part of the top-level object, and it isn’t showing up either.

Now I’ve got a TODO list:

  1. Figure out why the handler attached to Load is missing.
  2. Detect events wired up to child controls.

WinForms Event Implementation

In my last post, I thought I had discovered the end-all-be-all technique for finding events of any type, but the missing Load event proves me wrong.  According to MSDN, Form.Load has this signature:

public event EventHandler Load

That doesn’t seem very exotic.  When I debug my test and query the type in the Immediate window, I get the following output:

typeof(DemoForm).GetEvents()
{System.Reflection.EventInfo[91]}
    [0]: {System.EventHandler AutoSizeChanged}
    [1]: {System.EventHandler AutoValidateChanged}
    [2]: {System.ComponentModel.CancelEventHandler HelpButtonClicked}
    [3]: {System.EventHandler MaximizedBoundsChanged}
    [4]: {System.EventHandler MaximumSizeChanged}
    [5]: {System.EventHandler MarginChanged}
    [6]: {System.EventHandler MinimumSizeChanged}
    [7]: {System.EventHandler TabIndexChanged}
    [8]: {System.EventHandler TabStopChanged}
    [9]: {System.EventHandler Activated}
    [10]: {System.ComponentModel.CancelEventHandler Closing}
    [11]: {System.EventHandler Closed}
    [12]: {System.EventHandler Deactivate}
    [13]: {System.Windows.Forms.FormClosingEventHandler FormClosing}
    [14]: {System.Windows.Forms.FormClosedEventHandler FormClosed}
    [15]: {System.EventHandler Load}
    // Many more events...

Maybe something is wrong with the way I’m building my type collection in GetEventCallbacks. I’ll use Extract Method to break out the type collection query so that I can test it in isolation.

public static IEnumerable<EventCallback> GetEventCallbacks(
    this object value)
{
    return value.GetEventsForTypes(GetEventTypes(value).ToArray());
}

public static IEnumerable<Type> GetEventTypes(object value)
{
    return GetType(value).GetEvents().Select(ei => ei.EventHandlerType).Distinct();
}

This change doesn’t break any existing tests.  I’ll add a test to see what happens when this method queries DemoForm.

[TestMethod]
public void GetEventTypeForDemoForm()
{
    ApprovalTests.Approvals.VerifyAll(
        ReflectionUtility.GetEventTypes(new DemoForm()), string.Empty);
}

The results look good, and EventHandler, the delegate type backing Form.Load, is right there in slot 0.

[0] = System.EventHandler
[1] = System.ComponentModel.CancelEventHandler
[2] = System.Windows.Forms.FormClosingEventHandler
[3] = System.Windows.Forms.FormClosedEventHandler
[4] = System.Windows.Forms.InputLanguageChangedEventHandler
[5] = System.Windows.Forms.InputLanguageChangingEventHandler
[6] = System.Windows.Forms.ScrollEventHandler
[7] = System.Windows.Forms.ControlEventHandler
[8] = System.Windows.Forms.DragEventHandler
[9] = System.Windows.Forms.GiveFeedbackEventHandler
[10] = System.Windows.Forms.HelpEventHandler
[11] = System.Windows.Forms.InvalidateEventHandler
[12] = System.Windows.Forms.PaintEventHandler
[13] = System.Windows.Forms.QueryContinueDragEventHandler
[14] = System.Windows.Forms.QueryAccessibilityHelpEventHandler
[15] = System.Windows.Forms.KeyEventHandler
[16] = System.Windows.Forms.KeyPressEventHandler
[17] = System.Windows.Forms.LayoutEventHandler
[18] = System.Windows.Forms.MouseEventHandler
[19] = System.Windows.Forms.PreviewKeyDownEventHandler
[20] = System.Windows.Forms.UICuesEventHandler

The next logical problem could be that the GetEventsForTypes method is not finding any fields assignable to EventHandler on Form or any of its ancestors.  I’ll write another test to focus in on that possibility.

[TestMethod]
public void GetEventsForDemoFormEventHandlers()
{
    ApprovalTests.Approvals.VerifyAll(
        new DemoForm().GetEventsForTypes(typeof(EventHandler)),
        string.Empty);
}

ApprovalTests reports that the result set is empty.  That’s a problem.  Why isn’t it finding the field backing Form.Load?  Is there a field backing Form.Load at all?  My testing assumes that every event is implemented by the compiler, and as a result, every event should have a delegate backing the event, declared as a private field.  What if WinForms uses custom add/remove methods instead of compiler-implemented events?

I can use ILSpy to figure out what’s going on.  Sure enough, ILSpy shows a custom add/remove implementation.

public event EventHandler Load
{
    add
    {
        base.Events.AddHandler(Form.EVENT_LOAD, value);
    }
    remove
    {
        base.Events.RemoveHandler(Form.EVENT_LOAD, value);
    }
}

Instead of adding and removing delegates from a private instance field on Form, these add and remove methods make calls to a protected property called Events. ILSpy tells me that Events is an instance of EventHandlerList. I don’t know what that is yet, but before I try to figure it out I want to spend a little more time inside Form. The purpose of value is easy to understand, it is the delegate to add or remove, but what about Form.EVENT_LOAD?

EVENT_LOAD refers to a private static read-only object, initialized to new object(). This argument is just a reference to some unique chunk of memory on the managed heap; it can’t be changed, and every instance of Form has access to the same unique reference. I notice that there are many more static objects like this on Form. For example, Form has EVENT_MENUCOMPLETE, EVENT_MENUSTART, EVENT_RESIZEBEGIN, and so on. Presumably there is a static object corresponding to each event implemented with custom add/remove methods similar to Form.Load.
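
In other words, the declaration boils down to something like this:

// Illustrative sketch; the real field is private to System.Windows.Forms.Form.
private static readonly object EVENT_LOAD = new object();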

If I navigate to the Events property declaration I find it declared on System.ComponentModel.Component:

protected EventHandlerList Events
{
    get
    {
        if (this.events == null)
        {
            this.events = new EventHandlerList(this);
        }
        return this.events;
    }
}

The property lazily instantiates an EventHandlerList when needed, and that’s about it. EventHandlerList also lives in the System.ComponentModel namespace. Despite its name, it does not derive from List, nor does it implement any list or collection interfaces.  I’ll look at AddHandler to see what it does when it receives the static object and handler delegate.

public void AddHandler(object key, Delegate value)
{
    EventHandlerList.ListEntry listEntry = this.Find(key);
    if (listEntry != null)
    {
        listEntry.handler = Delegate.Combine(listEntry.handler, value);
        return;
    }
    this.head = new EventHandlerList.ListEntry(key, value, this.head);
}

From this method’s point of view, the static object is called key and the delegate is called value. So, EventHandlerList is logically closer to a dictionary than a list.  When AddHandler executes, the dictionary attempts to find an existing value with that key. When found, the new delegate is combined with the existing delegates, otherwise the method creates a new dictionary entry. The last line of the method might give some insight into how EventHandlerList stores its data, if I knew how EventHandlerList.ListEntry was implemented.

Here is ListEntry’s entire implementation:

private sealed class ListEntry
{
    internal EventHandlerList.ListEntry next;
    internal object key;
    internal Delegate handler;
    public ListEntry(object key, Delegate handler, EventHandlerList.ListEntry next)
    {
        this.next = next;
        this.key = key;
        this.handler = handler;
    }
}

In a typical .NET dictionary, each entry provides a key and a value. ListEntry provides these members, but also a reference to the next entry. So, EventHandlerList is a hybrid dictionary/linked-list.

Missing Pieces

After finding ListEntry I don’t need to look any further. Since WinForms has taken us deep into a rabbit hole, I’ll restate my goals:

Given a Form to examine, make one call that will inventory the event handlers attached to the Form and each of the Form’s controls. Display the inventory where each invocation list is associated with the event it’s attached to, and each event is associated with the type it is declared on.

Let’s compare how I solved each part of this problem in the Poco example with how I will need to solve it in the WinForms example.

  • Find Invocation List
    • Poco: Find the delegate field backing the event; call GetInvocationList
    • WinForms: Find the ListEntry instance in Events; access the handler field and call GetInvocationList
  • Associate Invocation List with Event
    • Poco: Use the delegate field name.
    • WinForms: Use the name of the static object used as the ListEntry key.
  • Associate Event with Object
    • Poco: Use the Type name.
    • WinForms: Use the Form type name for Form Events.  Use the Control type name for each child control.

I think the biggest difference while constructing a query will be the way the invocation list is associated with an event name.  With POCO events I could get both the invocation list and the name from the same delegate reference.  By the time a Component stores an event delegate in a ListEntry, the event name is lost and everything is called key. The CLR doesn’t care about the name because it checks the key with a reference comparison; it just needs the pointer.

Besides the query, I’m going to need a lot of plumbing just to make the data structure queryable. ListEntry is private, nested, and sealed. The compiler won’t even acknowledge its existence:

EventHandlerList.ListEntry entry;  // Won't compile

Error:

Error   1   The type name 'ListEntry' does not exist in the type 'System.ComponentModel.EventHandlerList'

And since EventHandlerList doesn’t implement IEnumerable or IEnumerable<T>, it doesn’t play nice with LINQ:

EventHandlerList ehl = new EventHandlerList();
var q = from e in ehl select e; // Won't compile

Error:

Error   1   Could not find an implementation of the query pattern for source type 'System.ComponentModel.EventHandlerList'.  'Select' not found.

So, before I can write my queries, I’ll need to get under there and do some plumbing.

Relationship With EventApprovals

This article has been a bit of a tease.  While I hope I’ve provided some useful information, I haven’t really produced any solutions.  The relationship with EventApprovals is that all of these problems are already solved in ApprovalTests!  When you call ApprovalTests.Events.EventApprovals.VerifyEvents, WinForms events are supported and will appear in your inventory.
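
A sketch of what that looks like, assuming VerifyEvents takes the object to inspect just as my hand-rolled VerifyEventCallbacks does:

[TestMethod]
public void VerifyDemoFormEvents()
{
    ApprovalTests.Events.EventApprovals.VerifyEvents(new DemoForm());
}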

Up Next

In this article’s next installment, “Beyond the Event Horizon: WinForms Plumbing”, I will use what I’ve learned about WinForms to build up the plumbing pieces necessary to make a usable wrapper for ListEntry and an enumerable adapter for EventHandlerList.  With those pieces of plumbing out of the way, I can return to my TODO list, find the missing Form.Load event, and figure out how I want to capture the events declared on child controls.

Beyond the Event Horizon: Event Complications

This article will continue to expand the capabilities of my reflection-based event tests.  Although my VerifyEventCallbacks test method can inventory collections of EventHandler based events declared directly on a class, it only works on events backed by EventHandler delegates, and it won’t detect events declared on base classes.   To be truly useful, these shortcomings need to be addressed.

If you have read the previous articles in this series, then I hope you have gained a decent grasp of some fundamental event system concepts.  Here is a short table of contents for where we’ve been so far:

  1. “Beyond the Event Horizon: Delegate Basics” — Explores the useful Delegate.GetInvocationList method.
  2. “Beyond the Event Horizon: Event Basics” — Explains the relationship between Delegates and Events, and how the compiler implements simple events.
  3. “Beyond the Event Horizon: Events You Don’t Own” — Shows how to use reflection to retrieve delegates for events declared on classes you can’t or won’t change.

For your enjoyment and education, you can get the code associated with these articles from GitHub.  However, remember that this code is nothing more than a reimplementation of features already available in the ApprovalTests library, which is a free, open source library you can use to enhance your tests.  If your primary interest is to use these features, then don’t bother with cut-and-paste, just get yourself a copy of ApprovalTests from SourceForge or NuGet.

Don’t know what ApprovalTests are?  You will get more out of this article if you take a moment to watch a few videos in Llewellyn Falco’s ApprovalTests tutorial series on YouTube.

A Comprehensive Inventory

So far my tests against Poco look pretty nice. I tried to keep the test general, because it would be nice to reuse my extension methods on objects besides Poco instances. I’m worried that Poco doesn’t represent objects I might find in the real world.

Here are a couple traits of Poco that indicate it may not be complicated enough to model real world classes:

  1. Poco’s events are only based on EventHandler. I should add some events based on EventHandler<T> or the always popular (hated?) PropertyChangedEventHandler.
  2. Poco’s events are all declared on Poco. I should introduce a class that inherits some of its events from a base class.

I’ll create a class with both these features by inheriting from Poco and implementing INotifyPropertyChanged on the descendant.

using System.ComponentModel;
public class PocoExtension : Poco, INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    protected virtual void OnPropertyChanged(object sender, PropertyChangedEventArgs e)
    {
        var handler = this.PropertyChanged;
        if (handler != null)
        {
            handler(sender, e);
        }
    }
}

And I’ll write a test for this class.

[TestMethod]
public void PocoExtensionTest()
{
    var target = new PocoExtension();
    target.ProcessStarted += Domain.HandleProcessStarted;
    target.PropertyChanged += Domain.HandlePropertyChanged;
    EventUtility.VerifyEventCallbacks(target);
}

As a developer using PocoExtension I can wire up a handler to ProcessStarted (which is inherited from Poco) just as easily as I can wire up a handler to PropertyChanged (which is declared on PocoExtension). Both events are part of the same object, so why should I need to worry about whether they are part of the same class? My intuition is that both events should show up in the inventory. Likewise, I wire up my handlers in the exact same manner, even though ProcessStarted and PropertyChanged leverage different delegate types to specify compatible handlers. They are both events, why should I have to worry about the delegate type? My intuition remains that both events should show up in the inventory. The test results do not meet my intuitive expectation:

Event callbacks for PocoExtension

With all the preamble in this article, it shouldn’t surprise you that neither event was found, but here is the final proof.  More importantly, I have a failing test that I can use to guide me toward a solution.  I have two problems to solve:

  1. Detect inherited events.
  2. Detect events not based on EventHandler

Which one to attack first?  The second requirement seems harder because all delegates inherit from MulticastDelegate whether they are associated with events or not.  It’s natural to think that EventHandler<T> derives from EventHandler but other than similar names (and sharing MulticastDelegate as a base type) these two delegates have no relationship.  Remember, delegates are their own types, and one of the rules for the delegate type is that they are all implicitly sealed (at the language level, the compiler can do what it likes when implementing delegates in IL).  So, there is no least derived type that I could use to find all the “event” delegates because none of them can even derive from each other in the first place.
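
A quick sanity check confirms the point:

// Both checks return false; the only shared base types are Delegate and MulticastDelegate.
bool a = typeof(EventHandler).IsAssignableFrom(typeof(EventHandler<EventArgs>));
bool b = typeof(EventHandler<EventArgs>).IsAssignableFrom(typeof(EventHandler));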

So, without inheritance to lean on, I’ll need some other way to filter for delegates that are related to events. I’ll come back to this problem after dealing with the easier problem of detecting inherited events on PocoExtension.

Inherited Events

I wired up two event handlers in my test: ProcessStarted and PropertyChanged.  Neither was detected.  With ProcessStarted I know that the problem is not the delegate type, because it is backed by an EventHandler delegate.   So, the problem with this event must be inheritance.  In other words, my test does not detect ProcessStarted because it is declared on the base class (Poco).

Nothing has changed about the backing field I’m looking for, it’s still a private instance field on the declaring class.  Although the reflection API provides a BindingFlag that will flatten class hierarchies, private fields are not included, so this field is not showing up in my query.  I need to implement this capability myself.

The procedure seems straightforward.  Given an instance of Type, I can use the Type.BaseType property to get the less derived type.  I can crawl up this inheritance chain in a loop until it ends, collecting private fields as I go.  My guess is that I’m probably not the first developer to come up with this idea, and I wonder if maybe someone out there has a better solution than simple iteration.  However, after some research, it looks like everyone is just iterating so that’s what I’ll do.

Here is my current GetEventHandlers implementation.

public static IEnumerable<EventCallback> GetEventHandlers(this object value)
{
    if (value == null)
    {
        return null;
    }

    return from fieldInfo in value.GetType().GetFields(NonPublicInstance)
           where typeof(EventHandler).IsAssignableFrom(fieldInfo.FieldType)
           let callback = fieldInfo.GetValue<EventHandler>(value)
           where callback != null
           select new EventCallback(fieldInfo.Name, callback);
}

The problem is that GetFields does not include inherited private fields (“inherited private” is a weird thing to say—all that I mean is these fields exist at runtime and have values).  I can’t change the behavior of GetFields, so I need to replace it.  I’ll create an extension method on Type that does what I need:

public static IEnumerable<FieldInfo> EnumerateFieldsWithInherited(
    this Type typeInfo,
    BindingFlags bindingFlags)
{
    for (var type = typeInfo; type != null; type = type.BaseType)
    {
        foreach (var fieldInfo in type.GetFields(bindingFlags))
        {
            yield return fieldInfo;
        }
    }
}

This method more or less implements the procedure described above, but instead of collecting the private fields, it streams them out as they are needed.  Now I can test whether updating GetEventHandlers to use this method will result in any changes to my results.
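
The change to GetEventHandlers itself is tiny, just a swap of GetFields for EnumerateFieldsWithInherited; a sketch of the updated method (the exact code is in the GitHub repo):

public static IEnumerable<EventCallback> GetEventHandlers(this object value)
{
    if (value == null)
    {
        return null;
    }

    // Same query as before, but the field enumeration now walks up the
    // inheritance chain, so base-class backing fields are included.
    return from fieldInfo in value.GetType().EnumerateFieldsWithInherited(NonPublicInstance)
           where typeof(EventHandler).IsAssignableFrom(fieldInfo.FieldType)
           let callback = fieldInfo.GetValue<EventHandler>(value)
           where callback != null
           select new EventCallback(fieldInfo.Name, callback);
}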

New results:

Event callbacks for PocoExtension

ProcessStarted
    [0] Void HandleProcessStarted(System.Object, System.EventArgs)

The test found the inherited event, excellent. My previous tests still pass, so I haven’t broken anything either. That’s one down, on to the next challenge.

Find events of any type

The current GetEventHandlers method does just what it advertises. It gets fields declared as EventHandler. Unfortunately, nothing requires that events be declared as EventHandler and there are many other options. Because delegates have no meaningful inheritance relationships, these other options don’t even inherit from EventHandler.

I’ll approach this problem by refactoring out the “defective” part of GetEventHandlers. I’ll end up with two methods: GetEventHandlers will continue to work as advertised, but it will use a new method, GetEventsForType, for the heavy lifting.

public static IEnumerable<EventCallback> GetEventHandlers(this object value)
{
    return value.GetEventsForType(typeof(EventHandler));
}

public static IEnumerable<EventCallback> GetEventsForType(
    this object value, 
    Type type)
{
    if (value == null)
    {
        return null;
    }

    return from fieldInfo in value.GetType().EnumerateFieldsWithInherited(NonPublicInstance)
           where type.IsAssignableFrom(fieldInfo.FieldType)
           let callback = fieldInfo.GetValue<EventHandler>(value)
           where callback != null
           select new EventCallback(fieldInfo.Name, callback);
}

After making this change, my existing test on Poco still passes, and the output for my new test on PocoExtension is the same. Technically, PocoExtensionTest is failing, but that’s only because I haven’t approved anything yet. So this change hasn’t broken anything.

To get my test into a state where I can approve the result, I need to keep working on GetEventsForType.  In its current form, GetEventsForType lets me specify a single type to filter for, but I know that PocoExtension uses more than one delegate type for its events.  I would rather pass a collection of delegate types to GetEventsForType (and pluralize the name).  Once I have a collection of types, I can change the query to collect backing fields for any type in the collection.

public static IEnumerable<EventCallback> GetEventsForTypes(
    this object value,
    params Type[] types)
{
    if (value == null)
    {
        return null;
    }

    return from fieldInfo in value.GetType().EnumerateFieldsWithInherited(NonPublicInstance)
           where types.Any(t => t == fieldInfo.FieldType)
           let callback = fieldInfo.GetValue<EventHandler>(value)
           where callback != null
           select new EventCallback(fieldInfo.Name, callback);
}

Keeping in mind that delegate inheritance is restricted, the old query’s use of IsAssignableFrom started to smell.  If there are no inheritance chains for delegates, then a simple equality check should suffice.  My existing tests are happy with this change, but PocoExtensionTest still doesn’t detect the PropertyChanged handlers.

The last piece of the puzzle is to create the collection of delegate types associated with PocoExtension’s backing fields and pass it to GetEventsForTypes.  I know the reflection API has GetProperties, GetConstructors, and GetFields, so why not GetEvents?  As a matter of fact, such a method exists. It returns an EventInfo array, and each member includes an EventHandlerType property that I can use to create a collection of types. Another extension method is in order, but naming it is hard.  GetEventHandlers still seems like the best name because it’s so general, but it’s already taken and it implies a false relationship with the EventHandler delegate. So, I’ll go with Callback.

public static IEnumerable<EventCallback> GetEventCallbacks(
    this object value)
{
    var types = value.GetType().GetEvents()
        .Select(ei => ei.EventHandlerType).Distinct();
    return value.GetEventsForTypes(types.ToArray());
}

Notice that I don’t pass any binding flags to GetEvents. Events should be public. They could be protected or perhaps even private, but that seems weird enough that I’m not going to worry about it. I update VerifyEventCallbacks to use this new method:

public static void VerifyEventCallbacks(object value)
{
    // ...
    foreach (var callback in value.GetEventCallbacks())
    {
        buffer.AppendLine(callback.ToString());
    }
    // ...
}

Making this change doesn’t break any passing tests, but it does “break” the test I haven’t approved yet.  Instead of showing me any results, that test now throws an InvalidCastException.  Turns out that I did not pay enough attention when I let my refactoring tool extract GetEventsForTypes.

Although the method takes an array of types, the query still attempts to cast everything to EventHandler.  Again, since inheritance relationships don’t exist between delegate types, PropertyChangedEventHandler can’t be cast to EventHandler.  My only choices are to cast to Delegate or MulticastDelegate.  Delegate will work fine because the only thing I need to do is call GetInvocationList.

public static IEnumerable<EventCallback> GetEventsForTypes(
    this object value,
    params Type[] types)
{
    // ...
    return from fieldInfo in value.GetType().EnumerateFieldsWithInherited(NonPublicInstance)
           where types.Any(t => t == fieldInfo.FieldType)
           let callback = fieldInfo.GetValue<Delegate>(value)
           where callback != null
           select new EventCallback(fieldInfo.Name, callback);
}

Now my test makes it all the way to the call to Approvals.Verify and produces output:

Event callbacks for PocoExtension

PropertyChanged
    [0] Void HandlePropertyChanged(System.Object, System.ComponentModel.PropertyChangedEventArgs)

ProcessStarted
    [0] Void HandleProcessStarted(System.Object, System.EventArgs)

More importantly, it’s output that I can approve.  Cue Borat: “Great Success!”

Cleanup

Hopefully cleanup will be easier this time than it was in the “Making it Better” section of the last segment of this series.  But a little cleanup is necessary, because once again GetType is called before the null check, this time in GetEventCallbacks.

Here’s a test to detect the defect:

[TestMethod]
public void NullHasNoEventCallbacks()
{
    Assert.IsFalse(ReflectionUtility.GetEventCallbacks(null).Any());
}

For previous null checks I used an if-null-return-null pattern to handle the null case.  However, as these null checks multiply I’m getting tired of writing the same code over and over again, simple as it may be.  So in this test I’m not going to look for a null return value; I’m going to look for an empty collection.  If I can come up with a NullObject solution to the null cases that I like, then I’ll refactor my null checks to use that instead of returning null.

I want a NullObject that is a do-nothing implementation of Type.  The NullType should respond to GetFields and GetEvents calls with empty arrays.  Before I run off and create one, I should see if the framework already has a Type that would work as a NullType.  It turns out that a suitable type does exist: typeof(void).  MSDN says System.Void is rarely useful in a typical application and that it is used by classes in System.Reflection.  I’m doing reflection, so I’ll use it too.  I just need to make sure that System.Void stands in whenever I try to get the type of a null value.

public static Type GetType(object value)
{
    return value == null ? typeof(void) : value.GetType();
}
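As a quick sanity check (again, just an illustrative test, not library code), System.Void behaves exactly the way a NullType should:

[TestMethod]
public void VoidHasNoFieldsOrEvents()
{
    // System.Void declares no instance fields and no events, so the queries
    // downstream simply produce empty results when it stands in for null.
    Assert.IsFalse(typeof(void).GetFields(
        BindingFlags.NonPublic | BindingFlags.Instance).Any());
    Assert.IsFalse(typeof(void).GetEvents().Any());
}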

Notice that GetType is not an extension method.  Now I change GetEventCallbacks to use the new method.  That gets me past my first null reference exception, but my test still hits a null reference when it tries to call Any on the results from GetEventCallbacks.  I need to follow the execution a little further and make sure that GetEventCallbacks never returns null.  (By the way, Code Contracts would be a great way to help diagnose and resolve this type of issue, but I’ll just investigate it “by hand” for this example.)

Remember that GetEventsForTypes will return null when value is null.  I’ll update this method to use the new GetType method instead of a local null check and see what happens.

public static IEnumerable<EventCallback> GetEventsForTypes(
    this object value,
    params Type[] types)
{
    return from fieldInfo in GetType(value).EnumerateFieldsWithInherited(NonPublicInstance)
        // ...
}

NullHasNoEventCallbacks passes after making this update!  However, an older test, NullHasNoProcessCompletedHandler, fails now.  This test is expecting GetEventHandlers to return null, so I need to update its expectation.

[TestMethod]
public void NullHasNoProcessCompletedHandler()
{
    Assert.IsFalse(ReflectionUtility.GetEventHandlers(null).Any());
}

This new version of the test passes.  Now I’ll just search for any more null-checks and see if I can use the new method there.  The last candidate null check is in VerifyEventCallbacks.  I put a null check in there to make it safe to retrieve the type name.  If I use System.Void then instead of verifying an empty result I’ll get a result like this:

Event callbacks for Void

I’m on the fence about whether that is better than the empty result, but I lean toward the empty result.  Seeing an empty result seems more likely to make me consider that I passed in a null value than seeing “Void” would.  So I leave VerifyEventCallbacks alone.

Relationship with EventApprovals

Unfortunately these scenarios are not supported in ApprovalTests 2.0.  When I wrote EventApprovals, I needed to inventory the event handlers on a WinForms application.   Neither inheritance problems nor problems with delegate types surfaced while using EventApprovals against a WinForms target, because of the custom event implementation WinForms uses.

Eventually, when I used EventApprovals with a POCO class that extended an INotifyPropertyChanged implementer, I encountered these problems and figured this stuff out.  The good news is that Llewellyn and I got together recently, and these fixes have made their way upstream into the ApprovalTests source.  As of this writing, you should compile ApprovalTests from source if you need these features immediately.  If you’re reading this later on and you have ApprovalTests 2.1 or greater, then you already have these features.  Once you have a version of ApprovalTests with these fixes, you don’t have to do anything special; EventApprovals.VerifyEvents takes advantage of them automatically.

In terms of implementation, the delegate type issue is solved almost identically to what I’ve shown here.  For inheritance, Llewellyn thought it would be fun to solve the problem with recursion, so that’s a little different.

Up Next

I’m feeling better about my tools now. To review, these extension methods can dynamically find all event-backing delegates, regardless of type. And they can find delegates no matter where they are declared in the class hierarchy. It looks like this set of extension methods can handle all my event testing needs for any object, but is that really so?
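Before answering that, here is a sketch of typical usage against PocoExtension to make the review concrete (the lambda handlers are placeholders; any handler wired to any event would show up):

[TestMethod]
public void SketchOfTypicalUsage()
{
    var target = new PocoExtension();
    target.ProcessStarted += (sender, e) => { /* ... */ };
    target.PropertyChanged += (sender, e) => { /* ... */ };

    // Finds every wired-up delegate, regardless of its type and regardless of
    // where in the class hierarchy its backing field is declared.
    foreach (var callback in target.GetEventCallbacks())
    {
        Console.WriteLine(callback);
    }
}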

It turns out that there is a large and important set of events that these methods will completely fail to find.  I’ve mentioned it a couple of times already: Windows Forms.  I’ll take a look at WinForms events next time in “Beyond the Event Horizon: WinForms Event System”.