Monday, October 30, 2006

NCoverExplorer... v1.3.5

The latest build is now available for download from this site, which includes updates to both NCoverExplorer and the Extras file (for the stylesheets and NAnt/MSBuild tasks). I have held off officially releasing for a while so as to offer the usual "combined installation" option with the TestDriven.Net 2.0 RTM which Jamie Cansdale has just released - congratulations Jamie!

The leading new feature in this release, as I blogged about a number of weeks ago, is enhanced integration with NCover for generating NAnt scripts, MSBuild scripts and command lines. These should support the whole gamut of permutations of NCover versions (1.3.3 and 1.5.x), with or without my custom NAnt/MSBuild tasks. Of course you can also choose to just run NCover directly from the same screen, with the results being displayed immediately within NCoverExplorer. For people not using TestDriven.Net, or needing to report coverage without using the VS.Net IDE, this should hopefully prove a useful option.

Function coverage is another significant enhancement. For a while now NCoverExplorer's "function coverage" has simply shown the number of visits incurred by each method. This has now been expanded to include a wider range of options, such as showing the percentage of methods covered on a class and/or the number of methods unvisited.

There is also a new coverage report "Module/Class Function Summary" which can be used within NCoverExplorer or CruiseControl.Net (please upgrade your CC.Net stylesheets).

A new filter option of "Hide All Above Threshold" has been added to supplement the existing options of hiding unvisited or 100% covered nodes. For those of you who set a threshold for coverage below 100% this option should allow you to focus quickly on just the elements of the codebase that require your attention. Filtering and sorting options are now automatically reapplied when refreshing/reloading files in a session.

Sorting and filtering apply to reports generated either from within NCoverExplorer or from the command line. NCoverExplorer.Console.exe adds new /sort: and /filter: arguments. Both these and the /report: argument will now take either numeric values or the enumeration text, as do the NAnt/MSBuild tasks. Type ncoverexplorer.console /? to see the full set of options.

  ncoverexplorer.console /h /report:ModuleClassFunctionSummary coverage.xml
  ncoverexplorer.console /h /report:5 coverage.xml

A significant amount of refactoring has been done to the NAnt and MSBuild tasks for NCover/NCoverExplorer to fix some bugs with various NCover versions. As part of this exercise I ended up renaming the assemblies and namespaces. My apologies to those of you who have to update your scripts, but it made the documentation generation and maintenance easier for me. Speaking of which, at long last some documentation for these tasks is published outside of the source code. It is available both within the Extras file and online - here for MSBuild and here for NAnt. You will also find these links on the NCoverExplorer Help menu.

On top of all that there has been the usual swarm of important bug fixes (such as a fix for memory leaks) and minor tweaks - please read the Release Notes for more details. As with every release I encourage you to upgrade so as to trade in your old known bugs for some shiny new ones.

While NCover 1.5.5 was released a little while ago, there are unfortunately some "showstopping" issues with it - not only with .Net 1.0/1.1 but also when used with NCoverExplorer. I posted the details about the problem here on the NCover forum. The issues will impact the integrity of your coverage results and there is absolutely nothing I can do to work around them - garbage in, garbage out I'm afraid. Until NCover 1.5.6 is released your best bet may be to stick with NCover 1.3.3 or NCover 1.5.4, or else you may end up with results like this:

For any issues with the release or further enhancement suggestions, please post to the NCoverExplorer forums or send me an e-mail (Help -> Send Feedback).

Download TestDriven.Net 2.0 combined install

Download the latest NCoverExplorer versions from here.

Release Notes
MSBuild Task Documentation
NAnt Task Documentation

Sunday, October 29, 2006

CruiseControl.Net... Serialised Build Queues

As a little break from NCoverExplorer development I started dabbling last weekend with the CruiseControl.Net source code. A disclaimer up front - this is all at the experimental stage and subject to further approval by the CC.Net project admins but it looks promising so far...

The main problem I am trying to resolve is the situation where you want to prevent two projects from running at the same time. Why would you want to do that? Perhaps you have resource constraints such as a file system directory or a database that the unit tests run against. For instance we have separate projects for continuous and daily builds - without a locking mechanism we would likely get a failed build if they both ran at the same time (one may do a "clean" while the other is trying to "build", etc).

A useful plugin developed recently by Richard Hensley, available from here, implements a locking mechanism which does help solve this. However it has a few limitations, due to being a plugin, which are best overcome by changing the CruiseControl.Net source code (in my opinion anyway). Various other implementations I have seen, or even developed myself using mutexes or lock files, all have similar issues.

What sort of limitations? As a build manager and developer I want visibility of the build queues, the ability to cancel pending items on the queue and to more granularly control the order that things get built.

My first stab at this is still a work in progress but looks like this...

1. You can optionally add a "queueName" attribute to each <project> node to force projects into a serialised integration queue. If you don't specify a queue, the project will have its own queue named after the project - effectively the same behaviour you have now.

2. You can also optionally add a "queuePriority" attribute to control the order of pending integrations on the queue. A priority of zero (default) just means build in the order placed on the queue.

3. A new optional panel in CCTray (see the demo gif below) to display the state of your queues across all polled servers. From CCTray you can right-click to cancel pending integrations provided they have not yet started.

4. A new web page plugin to allow web-based viewing of the queues and cancel pending integrations. Still to be developed.
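Putting points 1 and 2 together, a ccnet.config using these new attributes might look something like the sketch below. This is a work in progress, so treat the attribute names and priority semantics as provisional, and the project names here are obviously illustrative:

```xml
<cruisecontrol>
  <!-- Both projects share the "BuildServer" queue, so their integrations
       are serialised rather than running concurrently. -->
  <project name="ContinuousBuild" queueName="BuildServer" queuePriority="0">
    <!-- triggers, tasks and publishers as usual -->
  </project>
  <!-- queuePriority of 0 (the default) means first-in, first-out;
       a non-zero priority controls ordering of pending integrations. -->
  <project name="DailyBuild" queueName="BuildServer" queuePriority="1">
    <!-- triggers, tasks and publishers as usual -->
  </project>
</cruisecontrol>
```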

Here's a little taster of what it looks like at the moment...

Comments, suggestions and feedback welcomed... there is a thread for this on the ccnet-devel Google newsgroup.

[Update: 22nd-Jun-07 This patch has been integrated into the CruiseControl.Net 1.3.0 release now publicly available]

Thursday, October 12, 2006

NCoverExplorer... Merging NCover Reports

There have been a couple of posts recently by Jeremy Miller and others debating how best to structure unit test code. One of the arguments mentioned by a number of people against multiple unit test assemblies is the difficulty of merging multiple NCover coverage files on the build server.

To the people who posted comments along the lines of "it's just not possible"... well in actual fact it is, and here are a couple of ways to do it.

The first option for NUnit users is to use a .nunit project file listing your test assemblies. This requires only a single execution of NUnit and means that only one coverage file is produced in the first place, avoiding the problem altogether. I have a custom NAnt task that I wrote for dynamically creating the .nunit project file based on a file pattern just to avoid any maintenance - if anyone else is interested let me know and I will make it available.
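For anyone who hasn't seen one, a .nunit project file is just a small XML file listing the test assemblies - something along these lines (the assembly names are of course illustrative):

```xml
<NUnitProject>
  <Settings activeconfig="Default" />
  <Config name="Default" binpathtype="Auto">
    <!-- One entry per test assembly; NUnit runs them as a single suite,
         so NCover produces one combined coverage file. -->
    <assembly path="MyApp.Core.Tests.dll" />
    <assembly path="MyApp.Web.Tests.dll" />
  </Config>
</NUnitProject>
```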

The second option is to use NCoverExplorer. The console executable version is designed to be used on the build server. Included amongst its abilities since v1.3.4 is the ability to merge raw NCover coverage files and save the result, along with xml/html reports if required. In addition you can optionally apply coverage exclusions etc at the same time.

This can be run either directly from the command line or using the NAnt/MSBuild tasks available in the NCoverExplorer.Extras zip file.

So... to merge a bunch of suitably named coverage.xml files from a test coverage run into a single file using the command line it can be as simple as:

ncoverexplorer.console.exe *.coverage.xml /s:merged.coverage.xml

If you want to produce an actual report instead, just add the /r argument indicating the report type, and any combination of /x or /h to produce xml or html output as appropriate. Of course the xml output is designed to be merged into your CC.Net build file and transformed using the CC.Net stylesheets also in that Extras zip file. Use /? to see the full command line options.
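So a merge that also emits both report formats in one pass might look like the line below - treat the exact argument syntax as illustrative and check /? for the definitive options:

```
ncoverexplorer.console.exe *.coverage.xml /r:ModuleClassFunctionSummary /x:coverage-summary.xml /h:coverage-summary.html
```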

The same operations can also be performed using the NAnt or MSBuild tasks, documentation for which is found here for NAnt and here for MSBuild.
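As a rough sketch of the NAnt flavour - the task and attribute names here are from memory, so verify them against the linked documentation before use:

```xml
<ncoverexplorer program="NCoverExplorer.Console.exe"
                projectName="MyProject"
                reportType="ModuleClassFunctionSummary"
                mergeFileName="merged.coverage.xml"
                htmlReportName="coverage-summary.html">
  <!-- The raw NCover output files from each test assembly run. -->
  <fileset>
    <include name="*.coverage.xml" />
  </fileset>
</ncoverexplorer>
```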

[Update Nov 14th 2006: I have made available the NAnt and MSBuild tasks to dynamically create nunit project files as detailed here.]

Saturday, October 07, 2006

BinarySerializer and Strings in .Net 1.1

I've just verified that this problem does appear to have been rectified in .Net 2.0 so undoubtedly it is "old news" to many people but this one had us guessing for a while at work this week...

When using .Net Remoting over tcp/binary with a "large" collection of objects (6,000 of them) we found that our application was going into meltdown and taking over 6 minutes to return the results - yikes!

The actual population time of the custom collection from the DataReader was negligible - it was when it hit the "return" statement to leave the application server domain that the CPU utilisation shot up to 100% and the lag began. The objects being serialized had around 40 fields containing a few Int32 and DateTime data types with the bulk being strings.

As a colleague very funnily cracked at the time - asking the users to go Tools -> Options -> Tiers -> 2 Tiers was probably not going to be acceptable... ;-)

After much shagging around with a profiler and Reflector I eventually figured out that it was to do with the way the BinarySerializer works. It maintains an internal object table so as to ensure it only serializes each object in the graph once - and no doubt to avoid cyclic references. The problem is that it uses the hashcode of each object as a starting point for where to store it in that internal object table. If it finds the same object instance at that hash position it can return it; otherwise it rehashes and tries the next "bucket", and so on until it finds a match (or a free space to put this object).

However when you have an object graph with lots of System.String field instances that all have the same value, this design decision fell to bits. Such is the case when populating your objects using DataReader.GetString(), which gives you a new string instance each time. As the hashcode for a string is based on its contents, you now get an exponential increase in hashing collisions the more rows you have in your result set.

As it turns out this is a known problem for which there is the dreaded hotfix dated from Dec 2004. I say "dreaded" as like most large companies out there we have absolutely no show of getting a hotfix deployed due to the logistics involved. When will Microsoft EVER get their act together over their .Net service pack strategy (i.e. more than once every 3 years would be a start!).

Our workaround? Well, as I have hinted at above, what breaks the serializer is instances of strings having the same value. So why not have a cache of the string instances that you retrieve when using DataReader.GetString()? That way the same value "xyz" is serialized only once instead of once per row - smaller object graph, faster serialization performance.
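Our production code was C#, but the caching trick itself is language-agnostic: intern each value through a dictionary so that equal strings share a single instance. Here is a minimal Python sketch of the same idea (all names illustrative):

```python
class StringCache:
    """Return one canonical instance per distinct string value."""

    def __init__(self):
        self._cache = {}

    def get(self, value):
        # setdefault keeps the first instance seen for each value and
        # hands that same instance back for every later equal string.
        return self._cache.setdefault(value, value)

# Simulate 6,000 rows, each materialising a fresh string instance
# (as DataReader.GetString() does), but with only 3 distinct values.
rows = ["status_" + str(i % 3) for i in range(6000)]

cache = StringCache()
canonical = [cache.get(s) for s in rows]

# Without the cache: 6,000 separate objects for the serializer to track.
assert len({id(s) for s in rows}) == 6000
# With the cache: just 3 instances in the object graph.
assert len({id(s) for s in canonical}) == 3
```

The values are unchanged - only the number of distinct instances shrinks, which is exactly what keeps the serializer's internal object table happy.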

Sure enough, after the change our total round trip performance dropped to under a couple of seconds for those 6,000 records (the serialization itself taking under 0.2 seconds). Nothing like that great feeling on a Friday afternoon of having spanked a problem like this... That we hadn't really hit the problem previously was just luck - either screens had few string columns in the objects, the string values were mostly unique, or the resultsets were not very large.

Another solution would have been to use ISerializable and take control ourselves, for which there are some good articles on CodeProject like this one. However the cost is it introduces another maintenance point in each of your entity classes. For large development teams in the early phases of a project with a continually evolving data model like my current one that's bound to go subtly wrong at some point. We may yet need to resort to that - but I would prefer to hold off until we know we need that extra few drops of performance!

As I said at the top this problem seems to have been rectified in .Net 2.0 from my quick testing this morning - had we been using that at work it would have avoided many hours of frustration this week. Then again as per my .Net 2.0 TreeView performance problem post we would likely have had some other problems to deal with. I guess this is why we get paid the big bucks right?

Thursday, October 05, 2006

CCStatistics for CruiseControl.Net 1.1

A few days ago CruiseControl.Net 1.1 was officially released. Thanks to all those responsible - it has become an entrenched daily tool for many of us. I've upgraded a couple of build servers so far - well worth doing for the performance improvements and statistics features alone. This also gave cause for me to knock up a supporting tool called "CCStatistics" - more on this below.

When upgrading there are a few things you need to change in your CC.Net project files that are not all listed in the release notes...
  1. Remove any <publishExceptions> elements.
  2. Change the format of your <weburl> links to jump to the latest report in CCTray. They should now be in the format below, unlike the query-string approach used previously:
  3. If you want to use the new statistics functionality you need to add a <statistics /> node to your <publishers> section. Without it when you click on the "View Statistics" page you get a nasty exception being thrown.
Speaking of statistics, if you hadn't guessed already this is one of the more interesting additions for me. In my opinion it is a "0.9" version in terms of readiness for release - there are a number of bugs I found which I have added to Jira, and the documentation is incomplete or misleading in a couple of instances at the time of writing.

That said it has the potential to be a stonkingly good feature. Who doesn't want to see the most common reasons their build fails (when you have ClearCase performance issues like we do) or see their unit tests count, code coverage etc over time all on a single web page? No more endless drilling down into build log hyperlinks...

To use this feature you add a <statistics> node to your <publishers> section in the project file. I recommend you put this at the end of your publishers (after any <merge> element) so that way you can get additional custom statistics from files merged into the build log. When CC.Net runs the project it will then generate in your artifacts folder:
  • report.xml (not statistics.xml as the documentation says). Contains all the statistics for builds to date.
  • reports.bmp (which is actually a png - JIRA). A graphic illustrating the "TestCount" statistic across the build log cycle.
  • statistics.csv (JIRA). Intended to be an exportable alternative to report.xml - but it has garbage output in it currently due to a few bugs.
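Pulling the placement advice above together, a publishers section with statistics at the end might look like this (the merge file names are purely illustrative):

```xml
<publishers>
  <merge>
    <files>
      <file>nunit-results.xml</file>
      <file>coverage.xml</file>
    </files>
  </merge>
  <xmllogger />
  <!-- Last, so statistics can be sourced from the merged files above. -->
  <statistics />
</publishers>
```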
When you click on the "View Statistics" link from a build project page you will see the standard statistics on a page like the following:

The report is fairly basic in appearance but that should be easily changed by editing the statistics.xsl file. That is next on my list and once done I will contribute it for anyone interested (or feel free to beat me to it!). Strangely the reports.bmp file is not referenced in this stylesheet by default - so the graph produced is not actually being used yet as far as I can tell.

A minor limitation in the current release is that you cannot "replace" statistics with your own ones without editing the xsl. Not the end of the world but again I've posted a feature request on JIRA on ideas to make that more flexible. You can however add new statistics as mentioned here. So for instance if we want to add the code coverage totals for the build as a result of running NCoverExplorer we can do this:

<firstMatch name="Coverage" xpath="//coverageReport/project/@coverage" />

This will result in an additional column being added to the statistics page on the right-hand side. You could of course add other NCoverExplorer statistics as well such as lines of code etc.
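Along the same lines, other values from the NCoverExplorer report could be captured - though note the attribute name in the xpath below is a guess on my part, so check it against your own coverage report xml first:

```xml
<firstMatch name="LinesOfCode" xpath="//coverageReport/project/@nonCommentLines" />
```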

There is a catch to all this - statistics will only be generated for future builds, not for your legacy ones. That is where my CCStatistics tool comes in. This will parse all your build log files for a project and produce updated report.xml, reports.bmp and statistics.csv files as though you had been running CruiseControl.Net 1.1 "forever".

There is another benefit to this tool I can see. Undoubtedly you will over time decide on additional statistics you would like to measure. Provided they are sourced from data that existed in historical build logs, you can regenerate all your statistics to date with the new data included to give you a much better picture of the progress of your project. If the data isn't present (say you just started merging NCoverExplorer code coverage) then the values will be blank.

It should be pretty trivial to use - drop the executable into your CruiseControl.Net\server folder, click on the Load button to list your CC.Net projects, choose which to recalculate statistics for and away it goes...

I've made all the source code available - perhaps the CruiseControl.Net developers might be interested in adding it to their solution and maintaining it? There is also a compiled executable built against CC.Net release 1.1.1. To use the source code just unzip into the same \project source code folder as the other CC.Net projects and add to ccnet.sln.

This was a very quick hack from a few hours work and has the usual "it works on my machine" proviso. There is no care or attention over internationalization and I have only ever used the 1.x versions of CruiseControl.Net. So use at your own "risk" - although as the only file access it does is reading logs and writing the couple of statistics files I don't think there's much to go badly wrong. The code itself has a few "hacks" so as to work without touching the original CC.Net source code - these may be cleaned up at a later date if the dependent objects are enhanced. Note also that this code results in the same "useless" .csv file - that bug is in CC.Net source and even if I did work around it obviously your future builds would still have incorrect lines being published.

If you notice any problems or have suggestions feel free to let me know. Note that it will backup your previous statistics files with a .old prefix so if you notice a problem after running it once you can revert.

Also thanks to Jamie Cansdale for his help today with a sticky issue I had (the NetReflector component only "works" if the assembly containing the type it is trying to instantiate is loaded in memory - just referencing the .dll containing it is not enough).

Download CCStatistics executable (depends on CCNet)
Download CCStatistics source code

[Update 14-Oct-06: I've pushed up a new version which includes progress bars and timings. When processing a big directory of large files CCStatistics may become unresponsive - don't panic as it will finish eventually. I have also updated the unittests.xsl stylesheet to fix the still outstanding issue in JIRA of test suite failures not showing up that I blogged about previously.]

[Update 22-Sep-07: Damon Carr has significantly updated this tool to work with the latest CC.Net (1.3), which you can download here as mentioned in the comments below. Thanks also to Drew Noakes for his previous work for compatibility with CC.Net 1.2. Having "served its purpose" at the time for my needs I did not have the time to maintain it myself, but am glad that others have found it useful enough to do so. Well done guys...]