Monday, December 14, 2009

ASP.NET MVC XML Model Binder

Update 12/14/2011
This is one of my most highly viewed posts. Unfortunately, this post applies to version 1 of ASP.NET MVC (it might also work with version 2 with some small modifications). If that is what you are looking for, then please read on! Otherwise, for more relevant coverage of this topic I recommend Jimmy Bogard's post.

I needed a way to receive raw XML in my action methods, rather than form-encoded values. Model binding makes it a piece of cake:
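A minimal sketch of such a binder for MVC 1 might look like the following (XmlModelBinder is just a placeholder name); it hands the raw request body to the action as a string:

using System.IO;
using System.Web.Mvc;

public class XmlModelBinder : IModelBinder
{
    public object BindModel(ControllerContext controllerContext,
                            ModelBindingContext bindingContext)
    {
        // Read the raw XML out of the request body instead of the form collection.
        var inputStream = controllerContext.HttpContext.Request.InputStream;
        inputStream.Position = 0;
        using (var reader = new StreamReader(inputStream))
        {
            return reader.ReadToEnd();
        }
    }
}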



In my action method all I have to do is apply the model binder attribute:
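Roughly like this, assuming a POST-only Edit action; the Content(...) call is just a stand-in for whatever the action actually does with the XML:

[AcceptVerbs(HttpVerbs.Post)]
public ActionResult Edit([ModelBinder(typeof(XmlModelBinder))] string xml)
{
    // xml now holds the raw request body.
    return Content(xml, "text/xml");
}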



Now the edit action receives an XML string. Even better, I can get a strongly typed object if it is serializable:
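A sketch of that variation, using XmlSerializer against the type declared by the action parameter (bindingContext.ModelType):

using System.Web.Mvc;
using System.Xml.Serialization;

public class XmlModelBinder : IModelBinder
{
    public object BindModel(ControllerContext controllerContext,
                            ModelBindingContext bindingContext)
    {
        var inputStream = controllerContext.HttpContext.Request.InputStream;
        inputStream.Position = 0;

        // Deserialize the request body into whatever type the action parameter declares.
        var serializer = new XmlSerializer(bindingContext.ModelType);
        return serializer.Deserialize(inputStream);
    }
}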



Now my action method looks like this:
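Something along these lines, where Customer stands in for whatever serializable type the action expects:

[AcceptVerbs(HttpVerbs.Post)]
public ActionResult Edit([ModelBinder(typeof(XmlModelBinder))] Customer customer)
{
    // customer was deserialized from the XML request body by the binder.
    return View(customer);
}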



Applying the model binder that way is a little too verbose for my taste, so I wrote a custom model binder attribute:
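One way to do that is to derive from CustomModelBinderAttribute; the FromXml name below is just an example:

using System;
using System.Web.Mvc;

[AttributeUsage(AttributeTargets.Parameter)]
public class FromXmlAttribute : CustomModelBinderAttribute
{
    public override IModelBinder GetBinder()
    {
        // Lets actions write [FromXml] instead of [ModelBinder(typeof(XmlModelBinder))].
        return new XmlModelBinder();
    }
}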



Finally my action method is nice and readable:
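With the custom attribute in place, the action might read:

[AcceptVerbs(HttpVerbs.Post)]
public ActionResult Edit([FromXml] Customer customer)
{
    // Same binding behavior as before, with far less ceremony on the parameter.
    return View(customer);
}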


Tuesday, December 8, 2009

"Good Enough" Test Automation

If you are practicing Agile and you want to stay releasable at the end of every iteration, you must automate your tests. If you do not automate, one of two things will happen:
  1. Your velocity will decrease each sprint as you need to reserve more and more time for regression testing.
  2. You will not run (or will forget to run) regression tests, and you will run the risk of breaking something unintentionally. Best case, your testers will catch any problems before release. Worst case, your customers will find problems after release.
Now, here's the fun part: you have 2-3 weeks to build and test something potentially releasable. How can you possibly squeeze test automation into that time frame? The same way you attack that next killer feature: start small and iterate.

A few days ago I was testing some changes to a multi-step online signup process. The current code base is, shall we say, less than testable at the unit level. As a result, all testing for this feature has been manual in the past. After a few manual test runs, introducing some automated tests at the browser level seemed like a good idea. I immediately thought about continuous integration, but quickly realized dealing with all the necessary data setup and tear-down was going to take several days to get right. All I needed at that moment was a script to fill out the forms and click through the whole process. I coded a short Watir script, ran it a few times, and visually verified the process still worked end-to-end.

If all you need is a script to do the boring, repetitive parts of testing for you, write the script. If the script is useful, you'll keep coming back to it, and eventually get it running under continuous integration. Just take the first step, choose a tool, and start automating.

Wednesday, December 2, 2009

Missing reports in CruiseControl.NET

My team experienced a spell of build failures last week due to broken unit tests. It wasn't a big deal; broken tests happen from time to time, whoever broke them fixes them, and life goes on. However, something strange happened when we looked at the build report to see what failed.



The report showed that the NUnit process had exited with an error code, but the unit test summary claimed that no tests had been run.

I logged in to the build server and checked out the build log. The unit test results were there, and I was able to search the log to find which ones failed, but obviously nobody wants to log in to the build server and search through a text file whenever a unit test breaks. On closer inspection, I realized that all of the XML data for the test results was wrapped in a CDATA section. One developer pointed out that this was likely caused by strange characters in the test results, and the documentation for the file merge task confirmed the possibility. However, my team has amassed about 2,500 unit tests, so finding the offending character in approximately 10,000 lines of test results would have been difficult, to say the least.

After some digging, I found out that CruiseControl.NET logs a debug message when a file cannot be merged because it contains invalid XML. I opened up the log (ccnet.log by default), and a quick search for "output is not valid XML" revealed the exact line number in the test results that caused the file merge task to fail. Now we are happily integrating continuously once again.