TFS Build 2015 … and versioning!

August 24, 2015

Lately I got some time to play a bit more with the new build system that was released with TFS 2015 and that is also available for Visual Studio Online. The new build system was initially announced as Build vNext, but now with the release of TFS 2015 it’s safe to call it Team Foundation Build 2015 (TFBuild 2015), while the “old” build system can be referred to as the XAML (workflow) build system. Colin Dembovsky has a great post on why you should switch to the new build system.

In recent years I had to implement a lot of customizations in the XAML build system and I became very productive with the workflow activities. Along the way I developed a number of generic activities which I could reuse for other assignments, and I really knew my way around the build workflow. In many cases the TFS Build Extensions were used to avoid reinventing the wheel. So, at first I was a bit sceptical about the rise of yet another build system, but I clearly saw some interesting advantages, which are explained in the post by Colin. One disadvantage of the XAML build system is the steep learning curve to master the customization process and the deployment mechanism to refresh the TFS build controller(s). But as I experienced, once you got there, you could integrate very powerful customizations into the build process. Anyway, the “old” build system won’t disappear and you can still rely on this functionality for quite some time, but I recommend having a good look at the new build system and using it for your new/future build definitions.

In this post I want to share how I integrated a common task into the build process: versioning. With the available build steps it has become extremely simple to hook your own scripts into the build process, and in those scripts you have access to a set of predefined build variables.
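For example, a PowerShell build step can read those predefined variables straight from the environment (a trivial sketch; these are the standard variable names, nothing specific to my build definition):

    # Predefined build variables are exposed to scripts as environment variables
    Write-Host "Build definition : $env:BUILD_DEFINITIONNAME"
    Write-Host "Build number     : $env:BUILD_BUILDNUMBER"
    Write-Host "Sources directory: $env:BUILD_SOURCESDIRECTORY"
    Write-Host "Source version   : $env:BUILD_SOURCEVERSION"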

In my previous blog post I wrote about adopting a Global .NET Versioning Strategy and the existence of a third (optional) version attribute: AssemblyInformationalVersion. Let’s use this strategy to add versioning to a sample Fabrikam software application.

My build definition:

BuildDefinition-1

In the screenshot above you can see that I launch a PowerShell script (PreBuild.ps1) before building the solution and that I pass one argument, productVersion, to the script. The PowerShell script does the magic in the background: it replaces the values for AssemblyVersion, AssemblyFileVersion and AssemblyInformationalVersion in the AssemblyInfo files, based on this product version. The product version is passed as a whole to the AssemblyVersion and AssemblyInformationalVersion attributes. The AssemblyFileVersion is replaced with a full version number consisting of the major and minor numbers of the product version, a Julian-style date and an incremental build number (a minimal sketch of this logic follows the breakdown below).

BuildResult-1

Assembly File Version = 1.0.15236.3

  • 1 => taken from “Major” product version
  • 0 => taken from “Minor” product version
  • 15236 => generated by build process: “15” = year 2015, “236” = day of year 2015
  • 3 => third build, run on day 236 in year 2015
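The real PreBuild.ps1 is shared as a gist further down, but as a minimal sketch the core of such a script could look like this (the parameter name, the three-digit day-of-year padding and the way the revision is taken from the build number are my own assumptions, not necessarily what the gist does):

    param(
        [Parameter(Mandatory = $true)]
        [string]$productVersion   # e.g. "1.0.0.0", passed as an argument by the PowerShell build step
    )

    $version = [Version]$productVersion

    # Julian-style date stamp: two-digit year + day of year (e.g. "15236" for day 236 of 2015)
    $now = Get-Date
    $dateStamp = "{0:yy}{1:000}" -f $now, $now.DayOfYear

    # Incremental build number, assumed to be the last part of the build number
    # (e.g. a build number format ending in $(Rev:.r)); BUILD_BUILDNUMBER is a predefined variable
    $revision = ($env:BUILD_BUILDNUMBER -split '\.')[-1]

    $assemblyVersion              = $productVersion
    $assemblyFileVersion          = "{0}.{1}.{2}.{3}" -f $version.Major, $version.Minor, $dateStamp, $revision
    $assemblyInformationalVersion = $productVersion

    # Rewrite the version attributes in all AssemblyInfo files under the sources directory
    Get-ChildItem -Path $env:BUILD_SOURCESDIRECTORY -Recurse -Include AssemblyInfo.cs, AssemblyInfo.vb |
        ForEach-Object {
            $file = $_
            $file.IsReadOnly = $false   # files fetched from version control are read-only
            $content = Get-Content $file.FullName
            $content = $content -replace 'AssemblyVersion\(".*"\)', "AssemblyVersion(""$assemblyVersion"")"
            $content = $content -replace 'AssemblyFileVersion\(".*"\)', "AssemblyFileVersion(""$assemblyFileVersion"")"
            $content = $content -replace 'AssemblyInformationalVersion\(".*"\)', "AssemblyInformationalVersion(""$assemblyInformationalVersion"")"
            Set-Content -Path $file.FullName -Value $content -Encoding UTF8
        }

Hooked in as the PreBuild.ps1 step above with “-productVersion 1.0.0.0” as argument, this produces the 1.0.15236.3 style of file version from the breakdown above.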

Looking at the assembly details of a custom-built Fabrikam assembly now reveals correct metadata:

AssemblyDetails

I also modified the build number format to have some more version information displayed in the build run.

BuildDefinition-2

BuildRuns-1
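For illustration only (an assumed format, not necessarily the exact one in the screenshots), a build number format along these lines surfaces the product version, the date stamp and the incremental revision in the run name; the $(Rev:.r) part is also what a versioning script can pick up as the incremental build number:

    1.0_$(Year:yy)$(DayOfYear)$(Rev:.r)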

I added a gist at GitHub to share the PowerShell script. Note that the script has been used for experimentation and may not be production-ready. It certainly lacks proper validation and error handling. Use at your own risk.

Also have a look at some similar inspiring blog posts about versioning TFS builds, which helped me develop the PowerShell script that works for my scenario.


Update MSBuild Toolpath in TFS build process template

October 20, 2014

I have experienced a number of migration scenarios where it was decided to first upgrade old Visual Studio 2010 solutions to the latest and greatest version of Visual Studio (VS 2013 at this moment) without forcing a TFS upgrade at the same time.

Depending on the types of projects included in the Visual Studio solution, the TFS build might not work anymore because it requires different MSBuild .targets files (related to the .NET Framework version / Visual Studio version).

The easiest way to fix these TFS build failures is to modify the TFS 2010 build process templates and explicitly set the MSBuild ToolPath variable in the MSBuild activity to the MSBuild location of the upgraded Visual Studio version.

Visual Studio 2013 => C:\Program Files (x86)\MSBuild\12.0\Bin
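A quick way to double-check that path on the build machine before editing the template (a trivial PowerShell sketch; adjust the path to the MSBuild version you need):

    # MSBuild 12.0 ships with Visual Studio 2013; /version prints the MSBuild version found there
    & 'C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe' /version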

MSBuild


Integration of Dynamics CRM 2011 solutions with TFS

December 28, 2012

Some weeks ago I was asked to do a proof of concept: design a TFS 2010 solution to fully (nothing less than 100%) automate a complex Dynamics CRM 2011 deployment for various environments (dev / test / staging / production). Many different components were involved: the CRM solution itself, but also web applications, database objects, reports (SSRS), transformations, …

Dynamics CRM

It has been an interesting journey so far and along the way I got to know (a bit) how Dynamics CRM 2011 works. Not to my surprise, I realized that it’s quite hard to push all source-related items to TFS and to force ALL changes/updates to a CRM environment from a version-controlled solution in TFS. Many things in the CRM environment are easily modified by the development team via the CRM web UI and, as a result, stored directly in the CRM database(s). So the POC also required me to think about enforcing best practices for the CRM development team to avoid inconsistencies in the global deployment solution, and I definitely wanted to end up with a build-once, deploy-many solution.

Anyway, I won’t talk about the entire scope of the POC, but I want to highlight the approach I took for automating the export & extract operation from the development CRM 2011 instance via a TFS build definition. The goal here was to automatically capture the daily changes which were published to deployed CRM development solutions.

EXPORT
The MSCRM 2011 Toolkit contains a Solution Export command line utility which enabled me to export one or multiple CRM solutions from an existing Solutions Export Profile into a single compressed solution file (.zip).

EXTRACT
The compressed solution file (zip format) is of course not ideal for tracking individual changes or for binding it to a version control repository. Luckily, with the latest release of the Dynamics CRM 2011 SDK, a new tool (SolutionPackager) was added to extract the different components into individual files.

The SolutionPackager tool, available in the Microsoft Dynamics CRM 2011 Update Rollup 10 version of the Microsoft Dynamics CRM SDK download, resolves the problem of source code control and team development of solution files. The tool identifies individual components in the compressed solution file and extracts them out to individual files. The tool can also re-create a solution file by packing the files that had been previously extracted. This enables multiple people to work independently on a single solution and extract their changes into a common location. Because each component in the solution file is broken into multiple files, it becomes possible to merge customizations without overwriting prior changes. A secondary use of the SolutionPackager tool is that it can be invoked from an automated build process to generate a compressed solution file from previously extracted component files without needing an active Microsoft Dynamics CRM server.
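As a minimal sketch, both the extract and the reverse pack can be scripted like this (the SDK location and file paths are assumptions for the example; /action, /zipfile and /folder are the documented SolutionPackager switches):

    # Path to the SolutionPackager tool from the Dynamics CRM 2011 SDK (assumed location)
    $packager = 'C:\CRM2011SDK\bin\SolutionPackager.exe'

    # Extract the exported, compressed solution file into individual component files
    & $packager /action:Extract /zipfile:C:\Drops\FabrikamSolution.zip /folder:C:\CrmWorkspace\FabrikamSolution

    # Re-create a compressed solution file from the version-controlled component files
    & $packager /action:Pack /zipfile:C:\Builds\FabrikamSolution.zip /folder:C:\CrmWorkspace\FabrikamSolution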

TFS 2010

So, these tools opened the door for me to work out a custom build process (workflow) in TFS 2010 with the following sequential activities:

  • Export CRM solution from dev environment (MSCRM 2011 Toolkit)
  • Prepare TFS workspace before extract of solution file [Get Latest + Check-Out]
  • Extract compressed solution file into TFS workspace (SolutionPackager)
  • Scan TFS workspace for changes/additions/deletions
  • Check-In pending changes of the TFS workspace as a single changeset

The scan of the TFS workspace – to end up with all differences (changes/additions/deletions) – was a bit more complex than expected because I needed several TFS API Workspace calls like PendEdit, PendAdd, PendDelete, … I also made use of the EvaluateCheckin2 method to detect potential conflicts and to perform proper exception handling.
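The actual implementation was a custom workflow activity written against the TFS 2010 API, but the outline below sketches the same idea in PowerShell (collection URL, workspace path and solution folder are placeholders; the add/delete detection and the EvaluateCheckin2 conflict handling are left out):

    # Load the TFS 2010 client assemblies from the GAC
    [void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.Client')
    [void][Reflection.Assembly]::LoadWithPartialName('Microsoft.TeamFoundation.VersionControl.Client')

    $tpc = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection(
        [Uri]'http://tfsserver:8080/tfs/DefaultCollection')
    $vcs = $tpc.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])

    # The build workspace into which the SolutionPackager output was extracted
    $workspace      = $vcs.GetWorkspace('C:\BuildWorkspace\CRM')
    $solutionFolder = 'C:\BuildWorkspace\CRM\FabrikamSolution'
    $recursion      = [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full

    # Pend edits for the extracted content; the real activity also pends adds for new files
    # and deletes for files that are no longer part of the extract
    [void]$workspace.PendEdit($solutionFolder, $recursion)
    # $workspace.PendAdd($newFile)      # per newly extracted file
    # $workspace.PendDelete($oldFile)   # per file missing from the extract

    # Check in all pending changes as a single changeset
    $pendingChanges = $workspace.GetPendingChanges()
    if ($pendingChanges.Length -gt 0) {
        [void]$workspace.CheckIn($pendingChanges, 'Automated check-in of extracted CRM solution components')
    }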

This process allows the development team to easily follow up on the incremental changes (via TFS changesets) which were applied to the dev CRM environment. Note that the SolutionPackager tool is also able to generate a compressed solution file from the individual component files.


Managing Builds through the Build Quality value

February 12, 2012

The integration in TFS between builds and work items is just too good. In this blog post I will describe a solution based on a TFS Server Plugin to get more fine-grained control over completed builds.

Some completed builds are more valuable to the development team than others, and only a few of them will eventually be published/deployed for (public) testing. This means that testers (in theory) should only be able to log bugs against those particular builds.

There are two important fields on a bug work item which may point to a build: “Found in Build” and “Integrated in Build”.

BuildQuality1

For both fields a value can be selected from a combobox. The values in the combobox come from the Builds global list of the active Team Project.

BuildQuality2

By default, ALL finished builds (failed + succeeded) will trigger the BuildCompletion event in the Team Project Collection and the accompanying Build Number will be appended to the existing global list for the builds in the Team Project. In Team Projects where a lot of builds are defined, this “Builds” global list will be flooded by superfluous builds which should never be selected for the above fields. CI builds for example should not be part of this global list. Only full builds which are deliberately transferred for testing should be searchable in the above comboboxes.

So, how do you explicitly mark a build for “Testing”? This can easily be done by setting a Build Quality for a given build.

BuildQuality3

Note that people who want to modify the Build Quality need the “Edit build quality” permission.

BuildQuality4

Build Quality values can also be managed in a dedicated list.

BuildQuality5

Using a specific Build Quality value, it’s also possible to create a TFS Server Plugin that listens to the BuildQualityChangedNotification event and only adds the Build Number to the global list when the Build Quality is set to “Ready for Initial Test”. Of course, the default event-subscription for the BuildCompleted event must be disabled and the existing global list should be cleaned up.

This is exactly what I did. Only the Build Numbers of the builds that get a desired Build Quality (“Ready For Initial Test”) will be pushed to the global list of the Builds in the Team Project.

When your team makes full use of Microsoft Test Manager to file bugs, the “Found in build” field can be set automatically by associating particular builds with a Test Plan. Test execution can then be done against an approved build list. The “Integrated in build” field is normally set automatically by the build process, which picks up a bug resolution through a changeset at check-in time.

BuildQuality6

BuildQuality7

Some more details on how I implemented the customization of the global build list:

Delete the out-of-the-box event-subscription for the BuildCompleted event

The easiest way is actually to navigate to the event-subscription table via SQL Management Studio (tbl_EventSubscription in the specific Team Project Collection database) and to delete the event-subscription row from there. But that shortcut is not really recommended, I’m afraid. A safer solution is to rely on the BisSubscribe command-line tool (the executable can be found in <drive>:\Program Files\Microsoft Team Foundation Server 2010\Tools). Strangely enough, there’s no option to list the existing event-subscriptions, but there certainly is an unsubscribe switch to unsubscribe from an event-subscription. You will need the ID of the event-subscription. The only way to get this ID without writing custom code against the TFS API is to look it up in the tbl_EventSubscription table in SQL Management Studio. Note that it’s the integer ID you need, not the GUID subscriber ID.

BuildQuality8

BuildQuality9

Recreating this default event-subscription is possible through the command BisSubscribe /eventType BuildCompletionEvent /address http://<servername>:8080/tfs/<collectionname>/WorkItemTracking/v1.0/Integration.asmx /collection http://<servername>:8080/tfs/<collectionname>.

Clean up the existing “Builds” global list

The global lists of a Team Project Collection can be exported/imported with the witadmin command-line tool, or you can export/import a global list through the UI with the Process Editor from the TFS Power Tools.
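For example with witadmin (the collection URL is a placeholder):

    # Export all global lists of the collection, edit GlobalList.xml, then import it back
    witadmin exportgloballist /collection:http://tfsserver:8080/tfs/DefaultCollection /f:GlobalList.xml
    witadmin importgloballist /collection:http://tfsserver:8080/tfs/DefaultCollection /f:GlobalList.xml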

BuildQuality10

With the Process Editor you will be prompted to store the GlobalList.xml file, after which you can edit the list. Delete as many ListItem entries as you want in order to clean up the Builds list. This is the value list that will be used for the “Found in Build” and “Integrated in Build” fields.

BuildQuality11

Import the global list back to TFS via witadmin or the Process Editor.

BuildQuality12

To see the result immediately, you may need to restart Visual Studio to clear the cache.

Activate the TFS Server Plugin for processing BuildQualityChangedNotification events

Since TFS 2010, it has become fairly easy to write custom event handlers that run in the context of Team Foundation Server. It’s a plugin (DLL) that needs to be deployed on the TFS Application Tier. How to accomplish this is not well documented, but your best starting point is Chapter 25 of the book Professional Team Foundation Server 2010. Grant Holliday – one of the authors – explains how the ISubscriber interface can be used to extend Team Foundation Server.

For my purpose I implemented the ISubscriber interface in my BuildQualityChangedEventHandler class. The Build Quality I will use in my example is “Ready For Initial Test”. Setting this quality should append the Build Number to the Builds global list of the Team Project.

BuildQuality13

The next task was to pick up the BuildQualityChangedNotificationEvent and to take action when the Build Quality was set to “Ready For Initial Test”: export the global list, append the current Build Number and import the list back into TFS.

BuildQuality14

That’s it: build the assembly and drop it in the plug-in folder on all active TFS Application Tiers.

BuildQuality15

Setting the Build Quality to “Ready For Initial Test” does the trick and adds the Build Number to the Build global list for the appropriate Team Project.

BuildQuality16

As an extra, I also decided to keep the “Ready For Initial Test” builds indefinitely.

BuildQuality17

BuildQuality18

Controlling your builds with the Build Quality value in combination with a TFS Server plugin gives you a lot of power! The next step could be to trigger deployments off a Build Quality value …


The importance of Continuous Building/Integration

December 13, 2011

Automatically triggering a build after each check-in (continuous integration option in TFS Build Definitions) in a development branch is a good practice that nowadays needs less explanation than a few years ago. Many development teams seem to adopt this practice quite well, but there’s still a lot of room for improvement.

The main benefit of enabling continuous (integration) builds is clear: providing early feedback on the validity of the latest code changes that were committed to the version control repository. I won’t discuss in detail the different checks (compilation, deploy, test) that should be part of the validation step (as always: it depends!), but for this post I want to focus mainly on the importance of having at least a continuous (integration) build. A future topic to tackle is how to get your CI builds as fast as possible with TFS 2010.

TriggerCI

The first guideline that developers should follow if they want to reap the full benefits of CI builds is to check in early and often. I’m often worried/shocked when developers show me the amount of pending changes in their local workspace. This nearly always leads to the conclusion that those developers are working on many different tasks at the same time, which should be avoided at all costs. The risk of committing unwanted files for a dedicated change is just too high. There are a number of version control features available to safely switch to another task: shelving (see the small example after this paragraph) or creating an extra workspace. The next version of TFS (TFS11) will even include better support for task-driven development through a revised Team Explorer (more details in this blog post by Brian Harry).

Checking in early and often does of course not mean that developers should just check in changes that are not ready yet or not locally validated, but it does mean that developers should work on small incremental changes. This requires every involved developer to break down their assigned work into small tasks which may result in a check-in operation. Ideally, as a developer, you should commit your changes to the version control repository at least daily. And don’t forget to provide a decent comment for the check-in operation in addition to an association with a work item. This metadata might become quite important in some merging scenarios.
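For completeness, parking work in a shelveset takes a single tf.exe command (the names and comment below are hypothetical):

    # Park all pending changes on the server without checking them in
    tf shelve "WIP-Task123" /comment:"Parked work for task 123" /replace /noprompt

    # Later, bring the shelved changes back into the workspace
    tf unshelve "WIP-Task123"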

Having a CI build and adopting a check-in early and often policy means that a broken build can only be caused by a small incremental change, which should be easier to fix than a large changeset or a combination of multiple changesets by different developers. More frequent check-ins further decrease the number of files that could be part of a conflicting changeset that caused the build to fail.

BuildOverview

Another golden rule in a CI environment is to fix a broken build as soon as possible and to delay other commits of pending changes. This implies that a notification mechanism should be put in place to warn the development team about broken builds. In Team Foundation Server 2010, you can easily set up an e-mail alert or you can rely on the built-in Build Notifications tool. The developer who caused the build to fail must take responsibility to fix it immediately. A developer’s check-in operation can only be considered complete/done when the CI build has successfully built the latest changeset.

My recommended 10-step workflow for all developers:

DeveloperWorkflow

After the last step, you may start all over again or you may just go home! Committing changes to version control means you should at least wait until the CI build has finished successfully. Do you really want to impact the entire development team by going home without verifying your latest changes?

This reminds me of my early years as a junior .NET developer in a mid-sized development team. At that time there wasn’t a proper build server yet and we had never heard of continuous integration. I don’t have to tell you how many times all developers were impacted by an invalid check-in and how much time we spent finding the culprit. You have probably all been in a similar situation.

  • Dev1: My local build fails after a get latest
  • Dev1: Who did a check-in recently? Anyone heard of method X?
  • Dev2: I just did a get latest and everything works fine!
  • Dev3: Are you sure that you took a “real” latest of everything?
  • Dev1: Wait, let me redo a get latest to be sure!
  • Dev4: Damn, I also took a get latest and my build now fails!
  • Dev2: Sorry guys, I might have forgotten to check-in a file!

In some cases, the person who caused the build to fail was not at his desk (sick/holiday/meeting) the moment the build failure was discovered! Yeah, those were the days we never want to go back to, right? Not to mention what this type of troubleshooting costs!

In brief, it’s all about discipline and communication. Having a nice up-to-date build dashboard on a big screen might definitely help to make everyone aware of the importance of the latest build status.

In a future blog post I will talk about the importance of CI build speed. It’s obvious that to provide early feedback, CI builds must run as fast as possible. What’s the value of a CI build that takes more than an hour?! There are some interesting options you can enable in the TFS Build Definitions to optimize CI builds.

Note that since TFS 2010 there is also the Gated Check-in trigger option, which even prevents broken builds by delaying the actual commit until a successful build has run with the latest code changes from a temporary shelveset.


Errors during build execution while downloading large files from Version Control

August 3, 2011

In a large SharePoint development project at a customer, I was confronted with a not always reproducible build error while syncing the build workspace:

Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host


This error sometimes occurred while downloading the larger database backup files (> 1.5GB) which are required in the deployment and test phase.

Luckily a patch (KB981898) for the TFS Application Tier (Windows Server 2008 R2) does exist to resolve this particular issue! Also have a look at this blog article for more background info.

Also thanks to Giulio Vian for helping me out with this issue!


Publication of Test Results to TFS 2008

July 16, 2010

Lately I’ve been struggling with some weird behavior during a Team Build (TFS 2008). The build also executed a set of unit tests, which passed, but the step that publishes the test results to Team Foundation Server failed time after time.

BuildFailed

PublishFailed

I couldn’t find any additional information (event log, TFS log, …) about the root cause of this failure, but while limiting the test methods for the test run I bumped into a test method whose name consisted of 461 characters!

Apparently there’s a hard limit of 256 characters for the test method names that are published to the TFS data warehouse.

