Rash thoughts about .NET, C#, F# and Dynamics NAV.


"Every solution will only lead to new problems."

Saturday, 31. January 2009


n-dimensional k-means clustering with F# – part II

Filed under: BioInformatik,English posts,F#,Informatik — Steffen Forkmann at 11:28 Uhr

In the last post I showed how we can calculate clusters of n-dimensional objects with the k-means algorithm in F#. This time I will simplify the code by using the built-in type vector.

First of all we redefine the interface for “clusterable” objects:

type IClusterable =
  abstract DimValues: vector with get

This time we do not need any dimension information – the vector type takes care of this. Now we can reduce our distance function to:

let calcDist (p1:vector) (p2:vector) = (p1 - p2).Norm

The Norm member calculates the 2-norm of a vector, i.e. its Euclidean length, so calcDist still computes the n-dimensional Euclidean distance.
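As a quick sanity check (this assumes the vector construction function that comes with the F# math types), the distance between (0, 3) and (4, 0) should be 5:

// a 3-4-5 triangle, so the expected distance is 5.0
let testDist = calcDist (vector [0.; 3.]) (vector [4.; 0.])  // 5.0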

The next step is the definition of an n-dimensional Centroid type. This time we simply use a type alias for vector:

type Centroid = vector

The calculation of the centroid and the centroid error can be written as:

let calcCentroid (items:'a list when 'a :> IClusterable)
     (oldCentroid:Centroid) =
  let count = items.Length
  if count = 0 then oldCentroid else  
  let sum =
    items
      |> List.fold_left 
        (fun vec item -> vec + item.DimValues) // vector addition
        (nullVector oldCentroid)
   
  sum * (1./(count |> float))  // multiply with scalar
     

let calcError centroid (items:'a list when 'a :> IClusterable) =
  let sq x = x * x

  items
    |> List.sum_by (fun i -> calcDist centroid i.DimValues |> sq)

The calcError function is straightforward but not perfect in terms of performance. Since we use the calcDist function and hence the 2-norm, the function first calculates a square root and then squares the result again.
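If this ever becomes a bottleneck, one option is to sum the squared components directly and skip the square root altogether. Here is a minimal sketch of that idea (my own variant, not from the original code; it assumes the vector type exposes a Length property and an indexer):

/// squared Euclidean distance without the sqrt/square round trip
let calcSquaredDist (p1:vector) (p2:vector) =
  let d = p1 - p2
  let mutable acc = 0.
  for i in 0 .. d.Length - 1 do
    acc <- acc + d.[i] * d.[i]
  acc

/// calcError variant based on the squared distance
let calcError' centroid (items:'a list when 'a :> IClusterable) =
  items |> List.sum_by (fun i -> calcSquaredDist centroid i.DimValues)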

The rest of the code is pretty much the same as in the last post:

/// creates an n-dimensional zero vector
let nullVector (v:vector) = v * 0.
    
type Cluster<'a> when 'a :> IClusterable =
  {Items: 'a list;
   Count: int;
   Centroid: Centroid;
   Error: float}

  member x.Print =
     printfn "(Items: %d, Centroid: %A, Error: %.2f)"
       x.Count x.Centroid x.Error

  static member CreateCluster (item:'a) =
    let items = [item]
    let empty = nullVector item.DimValues

    let centroid = calcCentroid items empty
    {Items = items;
     Count = 1;
     Centroid = centroid;
     Error = calcError centroid items}

  member x.EvolveCluster (items:'a list) =
    let l = items.Length
    let centroid = calcCentroid items x.Centroid
    {x with
      Items = items;
      Count = l;
      Centroid = centroid;
      Error = calcError centroid items} 

open System

let rand = new Random()

let InitClusters (k:int)
      (items:'a array when 'a :> IClusterable)  =
  let length = items.Length
  let getInitItem() = items.[rand.Next length]
  Array.init k (fun x ->
                  getInitItem()
                    |> Cluster<'a>.CreateCluster)
                    
                    
let AssignItemsToClusters k (clusters:Cluster<'a> array)
      (items:'a seq when 'a :> IClusterable) =
  if k <= 0 then failwith "KMeans needs k > 0"
  let findNearestCluster (item:'a) =
    let minDist,nearest,lastPos =
      clusters
        |> Array.fold_left
            (fun (minDist,nearest,pos) cluster ->
               let distance = calcDist item.DimValues (cluster.Centroid)
               if distance < minDist then
                 (distance,pos,pos+1)
               else
                 (minDist,nearest,pos+1))
            (Double.PositiveInfinity,0,0)
    nearest

  let assigned =
    items
      |> Seq.map (fun item -> item,findNearestCluster item)

  let newClusters = Array.create k []
  for item,nearest in assigned do
    newClusters.[nearest] <- item::(newClusters.[nearest])

  clusters
    |> Array.mapi (fun i c -> c.EvolveCluster newClusters.[i])  

    
let calcClusterError (clusters:Cluster<'a> array) =
  clusters
    |> Array.fold_left (fun acc cluster -> acc + cluster.Error) 0.  
                      
let kMeansClustering K epsilon
       (items:'a array when 'a :> IClusterable) =
  let k = if K <= 0 then 1 else K
  let rec clustering lastError (clusters:Cluster<'a> array) =
    let newClusters =
      AssignItemsToClusters k clusters items
    let newError = calcClusterError newClusters

    if abs(lastError - newError) > epsilon then
      clustering newError newClusters
    else
      newClusters,newError    

  InitClusters k items
    |> clustering Double.PositiveInfinity

This vector approach shortens the code for the kMeans algorithm, but the runtime is a little longer. I am interested in any suggestions.


Thursday, 29. January 2009


n-dimensional k-means clustering with F#

Filed under: BioInformatik,English posts,F#,Informatik — Steffen Forkmann at 11:38 Uhr

The k-means algorithm is one of the simplest unsupervised learning algorithms for solving the well-known clustering problem. It alternates between assigning every item to its nearest centroid and recomputing each centroid as the mean of the items assigned to it, until the overall error stops improving. In this article I will show how we can implement this in F#.

First of all we define an interface for “clusterable” objects:

type IClusterable =
  abstract DimValues: float array with get
  abstract Dimensions: int

The next step is to define a distance function. We will use n-dimensional Euclidean distance here.

Euclidean distance: d(p, q) = sqrt((p1 - q1)^2 + (p2 - q2)^2 + … + (pn - qn)^2)

let calcDist (p1:IClusterable) (p2:IClusterable) =
  let sq x = x * x
  if p1.Dimensions <> p2.Dimensions then
    failwith "Cluster dimensions aren't equal."

  p2.DimValues
    |> Array.fold2
        (fun acc x y -> x - y |> sq |> (+) acc)
        0. p1.DimValues
    |> sqrt

Now we define an n-dimensional Centroid type. Our kMeans algorithm will minimize the squared distances to k centroids (or “means”).

type Centroid =
  {Values: float array;
   dimensions: int}

  member x.Print = printfn "%A" x.Values

  interface IClusterable with
    member x.DimValues = x.Values
    member x.Dimensions = x.dimensions

The centroid of a finite set of n-dimensional points x1, x2, …, xn is calculated as

C = (x1 + x2 + … + xn) / n

let calcCentroid (items:'a list when 'a :> IClusterable)
     (oldCentroid:Centroid) =
  let count = items.Length
  if count = 0 then oldCentroid else
  let mValues =
    [|for d in 0..oldCentroid.dimensions-1 do
        let sum =
          items
            |> List.sumBy (fun item -> item.DimValues.[d])

        yield sum / (count |> float)|]

  { Values = mValues;
    dimensions = oldCentroid.dimensions}

We made a small modification – if we don’t have any items assigned to a cluster the centroid won’t change. This is important for the robustness of the algorithm.

Now we can calculate the squared errors to a centroid:

let calcError centroid (items:'a list when 'a :> IClusterable) =
  let calcDiffToCentroid (item:'a) =
    centroid.Values
      |> Array.fold2
          (fun acc c i -> acc + (c-i)*(c-i))
          0. item.DimValues

  items
    |> List.sumBy calcDiffToCentroid

For storing cluster information we create a cluster type:

type Cluster<'a> when 'a :> IClusterable =
  {Items: 'a list;
   Count: int;
   Centroid: Centroid;
   Error: float}

  member x.Print =
     printfn "(Items: %d, Centroid: %A, Error: %.2f)"
       x.Count x.Centroid x.Error

  static member CreateCluster dimensions (item:'a) =
    let items = [item]
    let empty =
      { Values = [||];
        dimensions = dimensions}

    let centroid = calcCentroid items empty
    { Items = items;
      Count = 1;
      Centroid = centroid;
      Error = calcError centroid items}

  member x.EvolveCluster (items:'a list) =
    let l = items.Length
    let centroid = calcCentroid items x.Centroid
    {x with
      Items = items;
      Count = l;
      Centroid = centroid;
      Error = calcError centroid items}

Now we need a function to initialize the clusters and a function to assign items to the nearest cluster:

open System

let rand = new Random()

let InitClusters (k:int) dimensions
      (items:'a array when 'a :> IClusterable) =
  let length = items.Length
  let getInitItem() = items.[rand.Next length]
  Array.init k (fun _ ->
                  getInitItem()
                    |> Cluster<'a>.CreateCluster dimensions)

 

let AssignItemsToClusters k dimensions (clusters:Cluster<'a> array)
      (items:'a seq when 'a :> IClusterable) =
  if k <= 0 then failwith "KMeans needs k > 0"
  let findNearestCluster item =
    let minDist,nearest,lastPos =
      clusters
        |> Array.fold
            (fun (minDist,nearest,pos) cluster ->
               let distance = calcDist item (cluster.Centroid)
               if distance < minDist then
                 (distance,pos,pos+1)
               else
                 (minDist,nearest,pos+1))
            (Double.PositiveInfinity,0,0)
    nearest

  let assigned =
    items
      |> Seq.map (fun item -> item,findNearestCluster item)

  let newClusters = Array.create k []
  for item,nearest in assigned do
    newClusters.[nearest] <- item::(newClusters.[nearest])

  clusters
    |> Array.mapi (fun i c -> c.EvolveCluster newClusters.[i])

The last step is a function which calculates the squared error over all clusters:

let calcClusterError (clusters:Cluster<'a> array) =
  clusters
    |> Array.sumBy (fun cluster -> cluster.Error)

Now it is an easy task to write the kMeans algorithm:

let kMeansClustering K dimensions epsilon
       (items:'a array when 'a :> IClusterable) =
  let k = if K <= 0 then 1 else K
  let rec clustering lastError (clusters:Cluster<'a> array) =
    let newClusters =
      AssignItemsToClusters k dimensions clusters items
    let newError = calcClusterError newClusters

    if abs(lastError - newError) > epsilon then
      clustering newError newClusters
    else
      newClusters,newError

  InitClusters k dimensions items
    |> clustering Double.PositiveInfinity

We can test this algorithm with random 2-dimensional points:

type Point =
  {X:float; Y: float}

  member x.Print = printfn "(%.2f,%.2f)" x.X x.Y

  interface IClusterable with
    member x.DimValues = [| x.X; x.Y |]
    member x.Dimensions = 2

let point x y = {X = x; Y = y}
let getRandomCoordinate() = rand.NextDouble() - 0.5 |> (*) 10.
let randPoint _ = point (getRandomCoordinate()) (getRandomCoordinate())
let points n = Array.init n randPoint
let items = points 1000

printfn "Items:"
items |> Seq.iter (fun i -> i.Print)

let clusters,error = kMeansClustering 3 2 0.0001 items
printfn "\n"
clusters |> Seq.iter (fun c -> c.Print)
printfn "Error: %A" error

Next time I will show how we can simplify the code by using F#’s built-in type vector instead of array.


Tuesday, 20. January 2009


F# Bootcamp in Leipzig

Filed under: F#,Veranstaltungen — Steffen Forkmann at 11:45 Uhr

On Friday, 29 May 2009, I will be running a free “.NET Bootcamp” in Leipzig on the topic of “Functional Programming in F#”. Registration will be possible from March on the events page of the .NET User Group Leipzig.

“Functional programming languages have held an important place in academia for quite some time. Soon one of these languages could even make the jump from research straight into the mainstream: alongside C# and VB.NET, Visual Studio 2010 will offer the functional programming language F# as a third main language. The .NET Bootcamp on F# is intended to give an insight into functional concepts and their realization in F#. In particular it will cover “higher-order functions”, type inference, currying, pattern matching, “immutability” and parallel programming.”

Event abstract
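To give a first impression of some of these concepts, here is a tiny illustrative F# snippet (my own example, not taken from the bootcamp material):

// currying and partial application: add has type int -> int -> int
let add x y = x + y
let increment = add 1

// higher-order function: List.map takes a function as an argument
let incremented = List.map increment [1; 2; 3]   // [2; 3; 4]

// pattern matching
let describe n =
  match n with
  | 0 -> "zero"
  | n when n < 0 -> "negative"
  | _ -> "positive"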

Update: Registration is now open.

Update: The questions for the BootCamp can now be downloaded here.


Saturday, 17. January 2009


Stammtisch 1/2009 of the .NET User Group Leipzig

Filed under: .NET 3.0,Veranstaltungen — Steffen Forkmann at 14:58 Uhr

On 20 January 2009, from 19:30 to 21:30, the first Stammtisch (regulars' meetup) of the .NET User Group Leipzig in 2009 will take place. The meeting point is the TELEGRAPH Café in Leipzig, and as a special highlight Ralf Westphal has announced that he will be there. No registration is required.




Sunday, 11. January 2009


ReSharper 4.5 and F#

Filed under: F#,Tools — Steffen Forkmann at 19:16 Uhr

Since I am working with hybrid solutions (containing both F# and C# projects), I had to deactivate ReSharper. ReSharper had a problem analyzing my F# sources (see JIRA bug entry #79203). The result was that every single type and function defined in F# was marked as an error. It nearly drove me crazy. On the one hand I had gotten used to all the nice ReSharper refactorings (and the NUnit runner), and on the other hand I got all these false positive errors.

But from now on those hard times are over. Today I tested build 1153 (see the nightly builds for version 4.5) – and everything works fine.

ReSharper is running again

Thank you guys at JetBrains. 🙂


German add-on modules for Dynamics NAV 2009 available

Filed under: Dynamics NAV 2009,msu solutions GmbH — Steffen Forkmann at 14:26 Uhr

Since 22 December 2008, the add-on database for NAV 2009 with the modules Zahlungsverkehr (payments) and Kostenrechnung (cost accounting) has been available for download on PartnerSource.

“This release contains the following files:

  • Classic fdb database
  • MDF database for SQL Server
  • Online help for payments (DEU and ENU)
  • Changes.doc for payments and cost accounting

Note:
The payment objects also include the SEPA functionality.
The online help for cost accounting will follow shortly.”

[Source: PartnerSource]

Further links:


How I do Continuous Integration with my C# / F# projects – part IV: Adding a documentation

Filed under: C#,English posts,F#,Tools,Visual Studio — Steffen Forkmann at 12:46 Uhr

In the last 3 posts I showed how to set up a Continuous Integration environment for F# or C# projects with Subversion (part I), TeamCity (part II) and NUnit (part III).

This time I want to show how we can set up an automated documentation build.

Installing and using GhostDoc

“GhostDoc is a free add-in for Visual Studio that automatically generates XML documentation comments for C#. Either by using existing documentation inherited from base classes or implemented interfaces, or by deducing comments from name and type of e.g. methods, properties or parameters.”

[product website]

GhostDoc is one of my favorite Visual Studio plugins. It allows me to generate comments for nearly all my C# functions. Of course these generated comments aren’t sufficient in every case – but they are a very good start.

Unfortunately GhostDoc doesn’t work for F# 🙁 – the current version supports C#, and the support for VB.NET is called “experimental”.

Download and install GhostDoc from http://www.roland-weigelt.de/ghostdoc/.

Now you should be able to generate XML-based comments directly in your C# code:

Using GhostDoc

Generated XML-comment

The next step is to enable the XML documentation output in your Visual Studio build settings:

Enabling the XML documentation file in the project build settings
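If you prefer to edit the C# project file directly, this checkbox corresponds to the DocumentationFile property in the project's MSBuild file. A minimal sketch (the configuration condition and output path below are just examples and depend on your project):

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <DocumentationFile>bin\Release\TestCITestLib.XML</DocumentationFile>
</PropertyGroup>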

Committing these changes and adjusting the build artifacts will produce the input for the documentation build:

Adjust the build artifacts

Build artifacts

Using Sandcastle to generate a documentation

“Sandcastle produces accurate, MSDN style, comprehensive documentation by reflecting over the source assemblies and optionally integrating XML Documentation Comments. Sandcastle has the following key features:

  • Works with or without authored comments
  • Supports Generics and .NET Framework 2.0
  • Sandcastle has 2 main components (MrefBuilder and Build Assembler)
  • MrefBuilder generates reflection xml file for Build Assembler
  • Build Assembler includes syntax generation, transformation..etc
  • Sandcastle is used internally to build .Net Framework documentation”

[Microsoft.com]

Download and install “Sandcastle – Documentation Compiler for Managed Class Libraries” from Microsoft’s download page or http://www.codeplex.com/Sandcastle.

For .chm generation you also have to install the “HTML Help Workshop”. If you want the fancy HTML Help 2.x style (like MSDN has), you need “Innovasys HelpStudio Lite”, which is part of the Visual Studio 2008 SDK.

“HelpStudio Lite is offered with the Visual Studio SDK as an installed component that integrates with Visual Studio. HelpStudio Lite provides a set of authoring tools you use to author and build Help content, create and manage Help projects, and compile Help files that can be integrated with the Visual Studio Help collection.”

[MSDN]

Last but not least, I recommend installing the Sandcastle Help File Builder (SHFB) – this tool gives you a GUI and helps to automate the Sandcastle process.

“Sandcastle, created by Microsoft, is a tool used for creating MSDN-style documentation from .NET assemblies and their associated XML comments files. The current version is the May 2008 release. It is command line based and has no GUI front-end, project management features, or an automated build process like those that you can find in NDoc. The Sandcastle Help File Builder was created to fill in the gaps, provide the missing NDoc-like features that are used most often, and provide graphical and command line based tools to build a help file in an automated fashion.”

[product homepage]

After the installation, start SHFB and create a documentation project:

Sandcastle Help File Builder

Add the TestCITestLib.dll to your project and add nunit.framework.dll as a dependency. Now try to compile your help project – if everything is fine the output should look something like this:

Generated Help

Setting up the documentation build

One of the main principles of Continuous Integration is “Keep the Build Fast” – so I am working with staged builds here. The documentation build should only be started if the first build was successful and all UnitTests are positive. For most projects it is enough to generate the documentation daily or even weekly.

First of all we have to create a simple MSBuild file which executes the SHFB project:

<Project ToolsVersion="3.5" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- 3rd Party Program Settings -->
  <PropertyGroup>
    <SandCastleHFBPath>c:\Program Files (x86)\EWSoftware\Sandcastle Help File Builder\</SandCastleHFBPath>
    <SandCastleHFBCmd>$(SandCastleHFBPath)SandcastleBuilderConsole.exe</SandCastleHFBCmd>
    <SandCastleHFBProject>HelpProject.shfb</SandCastleHFBProject>
  </PropertyGroup>

  <Target Name="BuildDocumentation">
    <!-- Build source code docs -->
    <Exec Command="%22$(SandCastleHFBCmd)%22 %22$(SandCastleHFBProject)%22" />
  </Target>
</Project>
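To test the documentation build locally before wiring it into TeamCity, you can call the target from a Visual Studio command prompt. The file name below is just what I use in this sketch; substitute whatever you named the build file:

msbuild DocumentationBuild.proj /t:BuildDocumentation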

Add this build file and the SHFB project to your Visual Studio solution folder and commit these changes.

Put the build settings into the solution folder

Now we can create a new TeamCity build configuration:

Create a new build configuration

Use the same Version Control Settings as in the first build, but choose MSBuild as the build runner:

Take the MSBuild runner

We want the documentation to be generated after a successful main build so we add a “dependency build trigger”:

Adding a dependency build trigger

Now we need the artifacts from the main build as the input for our documentation build:

Set up artifacts dependencies

Be sure to copy the artifacts to the directory specified in your .shfb project. Now run the DocumentationBuild – if everything is fine, the DocumentationBuild should give you the Documentation.chm as a new artifact:

The generated Documentation.chm as a new build artifact


Friday, 9. January 2009


The city of Halle/Saale and the parking monopoly

Filed under: Steffen — Steffen Forkmann at 16:41 Uhr
The initial situation

Since this topic has been annoying me for so long, I finally have to blog about it. I have been living in the Charlottenviertel in Halle/Saale for a few years. The district is mainly characterized by its central location near the city centre and by having a large cinema as well as a fairly large disco. As in many city centres, parking is rather scarce, so especially at peak times (Tuesday, for example, is “Super-Kino-Dienstag”, and at the weekend of course the disco is busy) residents often have to move to other districts to find a parking space.

The new parking rules

On 1 August 2008 the parking situation in the Charlottenviertel in Halle/Saale was reorganized. According to an information letter from the city, this had become necessary because of the permanent shortage of parking spaces and was done “in favour of the residents”.

The city had noticed that the nice big parking garage in the Charlottenviertel is hardly used by the many disco and cinema visitors, who instead park on public spaces that are actually reserved for residents. [As a resident, by the way, you have to buy a residents' parking permit for €30.70.]

According to HalleForum.de, however, the starting point for the new rules is a draft law from 2002 which provides that visitors to the shops in the district get more parking space during the day. The wording “in favour of the residents” is accordingly justified by the fact that the number of residents-only parking spaces was increased from 89 to 141.

The situation today

Since 1 August quite a bit has happened from a resident's point of view. There are now indeed many free parking spaces – you just aren't allowed to park on them. Residents are only permitted to leave their car there free of charge from 18:00 to 8:00.

18:00 to 8:00?? That means citizens are expected to be at work for at least 10 hours a day, or to buy a parking ticket for the remaining time. Even on Saturdays you are not allowed to park on these spaces during the day – are all citizens supposed to be at work then as well?

There are also a few parking spaces that are available to residents around the clock. These are the residents-only spaces mentioned above. I even manage to get such a space about 5 times a month. The rest of the time the motto is: park illegally. I have no other choice – I even check every single one of these spaces each time – but usually all of them really are taken.

Chronic illegal parking – objections are pointless

Illegal parking of course means that at regular intervals our parking wardens attach little tickets to the car. As a rule these result in a €5 warning fine. I have already lodged an objection three times – always on the grounds that no residents-only spaces were available at the “time of the offence”.

However, each of these three objections was converted, without any comment (!), from a warning into a fine, raising the penalty to €30 (which in each case corresponds to the cost of a residents' parking permit for a whole year 🙁 ).

An alternative: the parking garage?

Now that the third objection has been turned into a fine without comment, I have looked into the cost of a permanent space in the parking garage. Such a space now costs a proud €122 per month – who is supposed to pay that?

The kicker, however, is that the parking garage belongs to HAVAG and therefore, once again, to the city. For me this means: the fewer spaces in the district are available to residents and the more residents therefore switch to the parking garage, the more money flows into the city's strained coffers.

Outlook

A petition was started recently – let's see whether it can achieve anything against the mayor's parking monopoly. 😉


Thursday, 8. January 2009


How I do Continuous Integration with my C# / F# projects – part III: Running automated UnitTests

Filed under: C#,English posts,F#,Tools,Visual Studio — Steffen Forkmann at 19:13 Uhr

In the last two posts I showed how to set up a Subversion (part I: Setting up Source Control) and a TeamCity server (part II: Setting up a Continuous Integration Server).

This time I will show how we can integrate NUnit to run automated tests on each build. TeamCity supports all major testing frameworks (including MS Test), but I will concentrate on NUnit here.

"NUnit is a unit-testing framework for all .Net languages. Initially ported from JUnit, the current production release, version 2.4, is the fifth major release of this xUnit based unit testing tool for Microsoft .NET. It is written entirely in C# and has been completely redesigned to take advantage of many .NET language features, for example custom attributes and other reflection related capabilities. NUnit brings xUnit to all .NET languages."

[product homepage]

Creating a TestProject

First of all download and install NUnit 2.4.8 (or higher) from http://www.nunit.org/.

Now we add a small function to our F# source code:

let rec factorial = function  
  | 0 -> 1
  | n when n > 0 -> n * factorial (n-1)
  | _ -> invalid_arg "Argument not valid"
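A quick check in F# Interactive (after sending the function to FSI) confirms the expected behaviour:

> factorial 5;;
val it : int = 120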

This is the function we want to test. We add a new C# class library to our solution (e.g. “TestCITestLib” 😉 ) and add a reference to nunit.framework. Inside this new TestLibrary we add a TestClass with the following code:

namespace TestCITestLib
{
    using NUnit.Framework;

    [TestFixture]
    public class FactorialTest
    {
        [Test]
        public void TestFactorial()
        {
            Assert.AreEqual(1, Program.factorial(0));
            Assert.AreEqual(1, Program.factorial(1));
            Assert.AreEqual(120, Program.factorial(5));
        }

        [Test]
        public void TestFactorialException()
        {
            Program.factorial(-1);
        }
    }
}

To ensure the build runner is able to compile our solution we put the nunit.framework.dll near to our TestProject and commit our changes.

Adding Nunit.framework.dll

Configure TeamCity for UnitTesting

The next step is to tell TeamCity that the build runner should run our UnitTests:

Configure build runner for NUnit

If we now run the build we should get the following error:

UnitTest error during automated build

Our second test function failed because we didn’t expect the System.ArgumentException. We can fix this issue by adding the corresponding attribute to the test function:

[Test,
 ExpectedException(typeof(System.ArgumentException))]
public void TestFactorialException()
{
    Program.factorial(-1);
}

Tests passed

Configure the build output

At this point we have a minimalistic Continuous Integration infrastructure. Every time someone performs a commit on our repository, an automated build is started and the sources are tested against the given UnitTests. Now we should concentrate on getting our build output – the artifacts. The term artifact usually refers to files or directories produced during a build. Examples of such artifacts are:

  • Binaries (*.exe, *.dll)
  • Software packages and installers (*.zip, *.msi)
  • Documentation files (e.g. help files)
  • Reports (test reports, coverage reports, …)

At this time we are only interested in the binaries (this means CITestLib.dll). We can add the following artifact definition to our TeamCity project:

Configure artifacts in TeamCity
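The artifact definition itself is just a list of paths relative to the build checkout directory. In this setup it could look roughly like the following line (the exact path is an assumption and depends on your solution layout):

CITestLib/bin/Release/*.dll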

If we now rebuild our solution the build runner collects the configured artifacts and stores them with all build information:

Collected artifacts

Next time I will show how we can add more artifacts – e.g. an automated documentation.


How I do Continuous Integration with my C# / F# projects – part II: Setting up a Continuous Integration Server

Filed under: C#,English posts,F#,Tools,Visual Studio — Steffen Forkmann at 14:50 Uhr

In the last post I showed how easy it is to install Subversion and how it can be integrated into Visual Studio 2008 via AnkhSVN. This time we will set up a Continuous Integration server and configure a build runner.

As a Continuous Integration Server I recommend JetBrains TeamCity. You can download the free professional edition at http://www.jetbrains.com/teamcity/.

Installing TeamCity

During the installation process TeamCity asks for a port number. Make sure there is no conflict with other web applications on your server. I chose port 8085 – and my first build agent got these default settings:

Configure a Build Agent

In the next step you have to accept the License Agreement and create an administrator account:

Set up administrator account

Creating a Project

Now you can create your project and set up the build configuration:

Create project on your TeamCity Server

Create CITest project

Create Build Configuration

Set up version control root

Setting up a build runner

For setting up specific build runners, see the TeamCity documentation. For now I will use the “sln2008” build runner (the runner for Microsoft Visual Studio 2008 solution files).

Runner for Microsoft Visual Studio 2008 solution files

Now add a build trigger, so that whenever someone performs a commit on the Subversion repository the server starts a build.

Build trigger

Testing the build runner

After this step we have two options to start a build: clicking the “Run” button on the project website, or performing a check-in:

Triggering the build running with a checkin

After performing the Commit the pending build appears on the project website:

Pending build 

After 60 seconds (see my configuration above) the build starts. After the build is complete you can see the results in different ways. The simplest is via the project website:

View build results

Of course TeamCity gives you a lot of different notification and monitoring possibilities, including mail, RSS feeds or system tray notifications.

Next time I will show how we can integrate UnitTesting in our automated build scenario.
