Migrate inline images to VSTS

September 14, 2016

Lately I have been planning a number of migrations to move companies, small and large, from TFS on-premises to Visual Studio Team Services (VSTS).

There are a number of options to migrate data from TFS to VSTS, but option 3 [high-fidelity database migration] is unfortunately not yet available. So, most of the time I still use custom/third-party tooling to perform the migration, which is not always straightforward and may be very time-consuming.

One serious issue that popped up in a migration to VSTS (using the TFS Integration Tools) was that the inline images in the Description field (or other html fields) of migrated work items were not properly migrated. The inline images actually still refer to the old TFS on-premises environment, because the html value of the Description field contains an <img> element with a source set to http://<tfs-on-prem>:8080/tfs/<tpc>/workitemtracking/v1.0/attachfilehandler.ashx?filenameguid=<guid>&filename=<filename>.png.

As a result, all inline images are still only stored in the old TFS environment and have not been uploaded to VSTS; the html value of the Description field has been migrated as-is. You might not notice this right after the migration of the selected work items, because as long as the old TFS environment is still available, the inline images will be displayed. But what if the old TFS environment has been archived/destroyed?

To correct this and to upload the original images to VSTS, I have written some code (TFS API) that loops over the VSTS work items to detect image links to the old TFS environment. Using the source link of the original image, I download the image to my local disk and upload it as an attachment to the VSTS work item. Finally, I replace the original image source link with the new VSTS image link. Good to know: once inline images are detected inside the html field, those images are stored on the server, so the temporary image file attachments may be deleted afterwards.
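The detection step can be sketched in a few lines. Below is a minimal Python sketch of the idea (the helper name and sample host are my own; the real code runs against the TFS API) that pulls the legacy image links out of an html field value:

```python
import re

# Matches the attachment handler URL that TFS on-premises generates for
# inline images. Assumes the src attribute is wrapped in double quotes.
LEGACY_IMG_SRC = re.compile(
    r'<img[^>]+src="(?P<src>https?://(?P<host>[^/"]+)/[^"]*'
    r'attachfilehandler\.ashx\?filenameguid=(?P<guid>[^&"]+)'
    r'&(?:amp;)?filename=(?P<name>[^"]+))"',
    re.IGNORECASE,
)

def find_legacy_images(html, old_host):
    """Return (full src url, file name) pairs for inline images that are
    still served by the old TFS on-premises host."""
    return [
        (m.group("src"), m.group("name"))
        for m in LEGACY_IMG_SRC.finditer(html)
        if m.group("host").split(":")[0] == old_host
    ]
```

Each returned link can then be downloaded and re-uploaded as a work item attachment, after which the src attribute is rewritten to the new VSTS attachment url.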


Retain VSTS Build Indefinitely – Fetch Build ID from RM artifact variable

June 26, 2016

While showing a Visual Studio Release Management demo in a Practical DevOps training, I stressed how important it was that the build artifact, which was used during a release, was not destroyed by the built-in retention policy. By default, the output of a build run is only stored for 10 days. So, in case you really want to keep the build and build artifacts, you must take care of this yourself.

For that purpose I created a powershell script which calls the VSTS REST API to accomplish this.

The powershell script is called from a release management task at the beginning of the release process.


The powershell script calls the VSTS Build v2 REST API and uses Basic Authentication (passed in the headers) with a Personal Access Token password.
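As a sketch of that call: the Build 2.0 REST API retains a build by PATCHing keepForever onto it, and the Basic auth header is built from an empty user name with the PAT as password. The Python below (account and project names are made up) only composes the request, it doesn’t send it:

```python
import base64
import json

def keep_forever_request(account, project, build_id, pat):
    """Build (url, headers, body) for the VSTS Build 2.0 REST call that
    marks a build as retained indefinitely. Hand the result to any HTTP
    client to actually execute the PATCH."""
    url = (f"https://{account}.visualstudio.com/DefaultCollection/"
           f"{project}/_apis/build/builds/{build_id}?api-version=2.0")
    # Basic auth with a PAT: empty user name, the token as password.
    token = base64.b64encode(f":{pat}".encode()).decode()
    headers = {
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"keepForever": True})
    return url, headers, body
```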

At the time I worked on this activity (probably still during the private preview of VSRM – September 2015), it was not yet possible to fetch the exact Build ID through the build artifact that is linked in the release definition. That’s why I dropped a simple text file containing the Build ID during the build process, which was also stored in the build artifact. That file was then parsed in the release management process to obtain the Build ID.


Apparently, this workaround is no longer necessary and you can now fetch the Build ID directly from the build artifact via a pre-defined release management artifact variable RELEASE_ARTIFACTS_[source-alias]_[variable-name]. Read more about the available RM artifact variables.
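For illustration, reading that variable from a script could look like the Python sketch below (the source alias FabrikamBuild is hypothetical):

```python
import os

def build_id_from_release_env(source_alias, environ=os.environ):
    """Read the build ID that Release Management exposes for a linked
    build artifact, e.g. RELEASE_ARTIFACTS_FabrikamBuild_BUILDID."""
    return environ.get(f"RELEASE_ARTIFACTS_{source_alias}_BUILDID")
```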

Next task on my todo list => create a VSTS extension to provide a dedicated build/release task.

Migrating TFVC (TFS on-premises) to Git (VSTS)

June 9, 2016

In the last couple of months I have been getting more requests to move TFVC version control history to a git repository in Visual Studio Team Services (VSTS). Migrating from TFVC to TFVC is currently possible via the TFS Integration Tools, but it is not that straightforward to accomplish. Migrating to a git repository is much simpler and is certainly the way to go if you were already planning to adopt git in the future. The migration can be done via Git-TF, a set of cross-platform command-line tools that facilitate sharing of changes between Team Foundation Server, Visual Studio Team Services and Git.

What do you need to get started?

  • Download git via https://git-scm.com/downloads
  • Download and extract Git-TF to your computer
  • Add the extracted git-tf path to the system environment variable “path”
  • Create a new “git” Team Project in VSTS

Migration Steps:

  • Open a command-line prompt and navigate to a directory where you want to host the local git repository
  • Run git-tf clone to pull all TFS changeset info from TFVC into a new local git repo. The first argument is the Team Project Collection url. You pass the TF version control path to the exact branch in the second argument, and you end the command with the --deep flag to ensure that the full history of the branch ends up as separate commits in the git repo. Pass your credentials to connect to TFS and execute the command.


  • Once you have a local git repository it’s easy to push it towards an empty central VSTS git repository. First use the git remote add command to link your local git repo to the remote “origin” and afterwards you can push all changes via git push.
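The whole migration boils down to three commands. A small Python sketch (collection url, branch path and git url are made up) that composes them:

```python
def tfvc_to_git_commands(collection_url, tfvc_branch_path, vsts_git_url):
    """Compose the command sequence described above; run the commands from
    the directory that should host the local git repository."""
    return [
        # --deep converts every TFVC changeset into a separate git commit.
        f'git-tf clone {collection_url} "{tfvc_branch_path}" --deep',
        # Link the local repo to the empty central VSTS git repo.
        "git remote add origin " + vsts_git_url,
        # Push the full converted history.
        "git push -u origin --all",
    ]
```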


Navigate to the Code Hub in your VSTS Team Project and you should see all code history inside the git repo. A big plus is that the original changeset date/time stamps are now part of the git commit info.

Work Item Query via TFS API and the dayPrecision parameter

March 7, 2016

By default, TFS doesn’t pay attention to the time part when comparing datetime values in work item queries. If you want to run a query that takes the exact timestamp into account, you must set the dayPrecision parameter to false in the Query constructor.
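The same switch is also exposed in the REST API as the timePrecision parameter on the WIQL endpoint. A Python sketch (account, project and api-version are assumptions on my side) that builds such a request without sending it:

```python
import json

def wiql_request(account, project, wiql, time_precision):
    """Build the POST request for the WIQL endpoint. With time_precision
    set to True, datetime comparisons honour the time part instead of
    being rounded to whole days."""
    url = (f"https://{account}.visualstudio.com/DefaultCollection/"
           f"{project}/_apis/wit/wiql"
           f"?api-version=1.0&timePrecision={str(time_precision).lower()}")
    return url, json.dumps({"query": wiql})
```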


using the dayPrecision parameter in the Query constructor

MSDN documentation: https://msdn.microsoft.com/en-us/library/bb133075(v=vs.120).aspx

Mystery resolved!


(upcoming) ALM/DevOps Community Talks

February 8, 2016

Q1 of 2016 has already been quite busy with the organisation of VISUG 10 Years and the upcoming conference Techorama 2016. The user group and all community activities are simply no longer a small side job.

For the celebration of VISUG 10 Years I delivered a session on DevOps with Visual Studio Release Management.

In March (as part of a Techorama 2016 promo tour), I will be travelling to Scotland for a User Group Talk in Glasgow (March 10): Advanced Techniques for Web Deploy (msdeploy) to simplify the deployment of web applications. The next day (March 11), I will again deliver my DevOps with Visual Studio Release Management talk in Edinburgh. Thanks to the Scottish Developers User Group for setting this up!

Destroying deleted branches in TFS

January 13, 2016

Everyone using a version control system has at some point used a delete command to remove files/folders that are no longer required.

Team Foundation Version Control (TFVC) treats a delete [tf delete] as a pending change in your workspace (type = delete). The final checkin command removes the item from the version control server, but does not delete the item permanently. All historical changes to the file remain available for lookup. In fact, it’s also possible to undelete [tf undelete] an item and bring it back into play.

Only in a few scenarios might you want to go one step further and really destroy the deleted files from TFS …

Recently a customer was complaining about the exponential growth of their TFS 2013 databases (exceeding 400GB for a specific TPC database). Running some SQL scripts on the tbl_content table of the Team Project Collection database revealed that a lot of version control and file container data was added to the TFS database every month, ranging from 2.5GB a few months ago up to 8GB in the last months.

After some investigation I noticed that the development team was creating a lot of feature branches (150+) for a specific application to isolate their feature development. This by itself shouldn’t have a big impact on the growth of the TFS databases, because a branch is like a shallow copy of each of the files from the source branch: both branches originally refer to the same copy of each file in TFS, and extra changes trigger a deltafication process to optimize storage. More info on this complex procedure is explained in this old TFS 2010 blog post, which should still be valid for the latest version of TFS. But the main problem was that the feature branches also contained a lot of references (NuGet packages / binary files) which were added to version control as well, and many dependency changes were initiated in the feature branches. As described in the old blog post, TFS will also try to compute deltas for binary files, but it’s much more difficult to predict database growth based on the sizes of these deltas: the results of deltafication vary greatly depending on the type of file and the nature of the changes.

After further discussions, we decided to permanently destroy old deleted feature branches in version control and see the impact on the database size. I wrote a powershell script to iterate over a version control root folder, search for deleted branches and destroy them only if the latest check-in did not occur in the last 100 days. In the script I used the startcleanup switch to immediately trigger the clean-up process in TFS instead of waiting for the daily clean-up background process.
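The selection logic of that script can be sketched as follows (Python instead of powershell, names are my own); the resulting paths would then be handed to tf destroy with the /startcleanup switch:

```python
from datetime import datetime, timedelta

def branches_to_destroy(deleted_branches, latest_checkin, now, max_age_days=100):
    """Select deleted branches whose latest check-in is older than
    max_age_days. deleted_branches: iterable of server paths;
    latest_checkin: dict mapping path -> datetime of the last check-in."""
    cutoff = now - timedelta(days=max_age_days)
    return [p for p in deleted_branches if latest_checkin[p] < cutoff]

# Each selected path would then be fed to something like:
#   tf destroy "<path>" /startcleanup /noprompt
# where /startcleanup triggers the TFS clean-up job immediately instead of
# waiting for the daily background run.
```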

I don’t remember the exact results on the database size, but it certainly decreased the total size of the TFS database for the involved Team Project Collection. With the upcoming planned upgrade to TFS 2015, we also used the Test Attachment Cleaner (part of the TFS Power Tools) to remove old diagnostic test data. So, in the end we were able to reduce the total size of the TPC database by a fair number of GB, which will also speed up the future upgrade to TFS 2015.

Another initiative was started to stop using TFS to keep (physical) track of the dependencies. TFS is indeed a version control repository to store sources, but there are better tools out there to store software packages/artefacts.

TFS Build 2015 … and versioning!

August 24, 2015

Lately I got some time to play a bit more with the new build system which was released with TFS 2015 and which is also available for Visual Studio Online. The new build system was initially announced as build vNext, but now, with the release of TFS 2015, it’s safe to call it Team Foundation Build 2015 (TFBuild 2015), while the “old” build system can be referred to as the xaml (workflow) build system. Colin Dembovsky has a great post on why you should switch to the new build system.

In the last years, I had to implement a lot of customizations in the xaml build system and I became very productive with the workflow activities. Along the way I developed a number of generic activities which I could reuse for other assignments, and I really knew my way around the build workflow. In many cases, the TFS Build Extensions were used to avoid reinventing the wheel. So, in the first phase I was a bit sceptical about the rise of yet another build system, but I clearly saw some interesting advantages, which are explained in the post by Colin. One disadvantage of the xaml build system is the steep learning curve to master the customization process and also the deployment mechanism to refresh the TFS build controller(s). But as I experienced, once you got there, you were able to integrate very powerful customizations into the build process. Anyway, the “old” build system won’t disappear and you can still rely on this functionality for quite some time, but I recommend having a good look at the new build system and using it for your new/future build definitions.

In this post I want to share how I integrated a common activity into the build process: versioning. With the available build steps it has become extremely simple to hook your own scripts into the build process, and in your scripts you have access to a number of predefined build variables.

In my previous blog post I wrote about adopting a Global .NET Versioning Strategy and the existence of a third (optional) version attribute: AssemblyInformationalVersion. Let’s use this strategy to add versioning to a sample Fabrikam software application.

My build definition:


In the screenshot above you can see that I launch a powershell script (PreBuild.ps1) before building the solution and that I pass one argument, productVersion, to the script. The powershell script does the magic in the background to replace all versioning values for AssemblyVersion, AssemblyFileVersion and AssemblyInformationalVersion in the AssemblyInfo files, based on this product version. The product version is passed as a whole to the AssemblyVersion and AssemblyInformationalVersion attributes. The AssemblyFileVersion is replaced with a full version number consisting of the major and minor version number of the product version, a Julian based date and an incremental build number.
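The replacement itself comes down to a regex substitution over the AssemblyInfo content. A Python sketch of what PreBuild.ps1 does (the function name is mine; the real script is powershell):

```python
import re

def apply_versions(assembly_info, assembly_version, file_version, info_version):
    """Replace the three version attributes in an AssemblyInfo.cs text.
    Assumes the attributes are already present in the file."""
    replacements = {
        "AssemblyVersion": assembly_version,
        "AssemblyFileVersion": file_version,
        "AssemblyInformationalVersion": info_version,
    }
    for attribute, version in replacements.items():
        assembly_info = re.sub(
            r'%s\("[^"]*"\)' % attribute,
            '%s("%s")' % (attribute, version),
            assembly_info,
        )
    return assembly_info
```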


Assembly File Version = 1.0.15236.3

  • 1 => taken from “Major” product version
  • 0 => taken from “Minor” product version
  • 15236 => generated by the build process: “15” = year 2015, “236” = day 236 of year 2015
  • 3 => third build run on day 236 in year 2015
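The derivation of the file version fits in a few lines; a Python sketch (function name is mine):

```python
from datetime import date

def assembly_file_version(product_version, build_date, build_of_day):
    """Derive the AssemblyFileVersion from the major/minor product version,
    a Julian style date stamp (yy + day of year) and the daily build counter."""
    major, minor = product_version.split(".")[:2]
    julian = "%02d%03d" % (build_date.year % 100, build_date.timetuple().tm_yday)
    return f"{major}.{minor}.{julian}.{build_of_day}"
```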

Looking at the assembly details of a custom-built Fabrikam assembly now reveals correct metadata:


I also modified the build number format to display some more version information in the build run.
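As an example, a build number format along these lines (the exact tokens you can use depend on your TFS version, so treat this as a sketch) produces build names such as Fabrikam_1.0.15236.3:

```
$(BuildDefinitionName)_1.0.$(Year:yy)$(DayOfYear)$(Rev:.r)
```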



I added a gist on GitHub to share the powershell script. Note that the script has been used for experimentation and may not be ready for production use: it certainly lacks proper validation and error handling. Use at your own risk.

Also have a look at some similar inspiring blog posts about versioning TFS Builds, which helped me to develop the powershell script that works for my scenario.