Deep copy of Test Cases to a TFS Team Project in another Team Project Collection

July 22, 2015

A few days ago I got stuck in a project migrating a set of test cases from an “old” Team Project to a new Team Project in another Team Project Collection (TFS 2013 Update 4 environment). Via the TFS API, I used the work item Copy method to perform a deep copy (including links and attachments) with the WorkItemCopyFlags set to CopyFiles.

ITestCase testCase = <MyTestCase>;
WorkItem wiCopy = testCase.WorkItem.Copy(<TargetProjectWorkItemType>, WorkItemCopyFlags.CopyFiles);
wiCopy.Save();

This always worked when copying test cases to a Team Project in the same Team Project Collection (TPC) as the original Team Project, but copying the test cases to a Team Project in another TPC always caused an error:

TF237136: File attachment was not found on attachment server.

I expected this to work, but after I verified that the MTMCopyTool generated the same error when copying test cases across Team Project Collections, I decided to look for a workaround: downloading and uploading the attachments as an after-processing step.
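The after-processing itself can stay close to the original snippet. Below is a minimal sketch, assuming the copy is now taken without CopyFiles and each attachment is downloaded to a temp file and re-added to the copied work item (the angle-bracket placeholders mirror the snippet above; the temp-folder handling is my own assumption):

```csharp
// Hedged sketch: copy the work item without attachments, then re-attach
// files downloaded from the source collection.
ITestCase testCase = <MyTestCase>;
WorkItem wiCopy = testCase.WorkItem.Copy(<TargetProjectWorkItemType>, WorkItemCopyFlags.None);

using (var client = new System.Net.WebClient())
{
    client.Credentials = System.Net.CredentialCache.DefaultCredentials;
    foreach (Attachment attachment in testCase.WorkItem.Attachments)
    {
        // Download the attachment from the source collection to a temp file ...
        string localPath = System.IO.Path.Combine(System.IO.Path.GetTempPath(), attachment.Name);
        client.DownloadFile(attachment.Uri, localPath);

        // ... and add it to the copied work item in the target collection.
        wiCopy.Attachments.Add(new Attachment(localPath, attachment.Comment));
    }
}
wiCopy.Save();
```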

DownloadUploadAttachments

Not perfect, but good enough to keep going and moving all attachments to the other Team Project in the other Team Project Collection.


Reporting Services issues after migration to TFS 2015 RC2 (from TFS 2010)

July 10, 2015

Yesterday, I blogged about the migration activities to get to TFS 2015 RC2, but I was still stuck at an error during new Team Project creation.

TeamProjectFailure

Looking into the logs, it appeared that something went wrong during the upload of the first Reporting Services report, “Bug Status” (part of the latest TFS 2015 Agile process template).

TeamProjectFailure-ReportUpload

First I thought something was wrong with the permissions on the Reporting Services site, because I had also applied a certificate to offer access via https, but that wasn’t it: everything seemed to be set up correctly. Time to look further into the details of the Bug Status .rdl report definition and why it couldn’t be uploaded correctly to the Reporting Services site. After downloading the process template from the Team Project Collection and opening the report with notepad, I knew I was getting closer to the root cause.

DataSourceIssue

As shown in the picture above, the (default TFS 2015) Data Source mentioned in the report definition did not exist in the Report Server database after the migration from TFS 2010 SP1. In the TFS 2010 timeframe, all report definitions were linked to a “2010” Data Source.

Instead of renaming the existing Data Sources (that would break the existing reports, created from a TFS 2010 Team Project), I duplicated the entries to provide extra Data Sources with the expected names for usage with TFS 2015.
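For a larger number of Data Sources this duplication can also be scripted against the Reporting Services web service. A hedged sketch, assuming my server URL, item paths and Data Source names; note that stored credentials are not returned by the web service, so a stored password must be re-entered afterwards:

```powershell
# Hedged sketch: duplicate the existing TFS 2010 shared data source under the
# name the TFS 2015 process templates expect, keeping the old one intact.
$rsUri = "http://myserver/ReportServer/ReportService2010.asmx?wsdl"
$rs = New-WebServiceProxy -Uri $rsUri -UseDefaultCredential

# Read the definition of the "2010" data source and re-create it under the new name.
$definition = $rs.GetDataSourceContents("/Tfs2010ReportDS")
$rs.CreateDataSource("TfsReportDS", "/", $false, $definition, $null) | Out-Null
```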

Issues resolved. Back on track!


Upgrading TFS 2010 SP1 to TFS 2015 RC2

July 9, 2015

Today I got the opportunity to upgrade a customer from TFS 2010 SP1 to TFS 2015. As blogged by Brian Harry a few days ago, the RTM release of TFS 2015 (not Visual Studio 2015!) is delayed (for a good reason!), and in order to further test the overall migration/upgrade process a new RC2 release has been made available (with a go-live license). That’s the build I used today to upgrade from TFS 2010 SP1. The official installation guide is not yet available, so here are some tips to get you started in the right direction …

At the customer I got two new servers at my disposal with a clean Windows Server 2012 R2 OS. The first server is used for the TFS Application Tier and TFS Data Tier (Single Server Topology). The second one will be used as the TFS Build Server.

Prerequisites

Installing SQL Server 2014 Standard Edition

Before starting the installation of SQL Server 2014, I enabled the .NET 3.5 feature on Windows Server 2012 R2.

OSFeature

Required SQL Server features:

SQL2014-FeatureSelection

I always recommend using dedicated domain service accounts for running SQL Server. In this case I decided to reuse the <TFSSERVICE> account. Note that all services are set to start automatically.

ServiceAccountsForSQL

SQL2014-InstallationResult

After the successful installation of SQL Server 2014, I started the backup procedure on the old TFS 2010 environment and copied all SQL .bak files to the new server to restore all TFS-related databases to the SQL Server 2014 instance.

SQLDatabases

Configuration of SQL 2014 Reporting Services

Start-RS

Link to the existing restored “ReportServer” SQL Server database.

SetRSDatabase

RS-ExistingDB

RS-SelectReportServerDatabase

RS-Summary

Restore the SQL Server Reporting Services Encryption key.

RS-RestoreEncryptionKey

RS-RestoreEncryptionKey2

Apply the Web Service URL and the Report Manager URL configuration.

RS-Webservice

RS-ReportManager

Remove the “old” server from the scale-out deployment configuration (see http://intovsts.net/2014/07/14/tfs-migration-upgrade-scale-out-reporting-services-issues/). Then test the Report Server website …

RS-Running

Installation of TFS 2015 RC2

InstallTFS2015

trialLicense

The trial license is valid for 90 days and can be extended for 30 days. At this moment, there are no product keys available for the RC2 release.

Wizard-Upgrade

Wizard-Upgrade2

Wizard-Upgrade3

Wizard-Upgrade4

No Build Configuration yet.

Wizard-Upgrade5

Wizard-Upgrade6

Wizard-Upgrade7

Wizard-Upgrade8

Wizard-Upgrade9

Wizard-Upgrade10

No SharePoint link (yet).

Wizard-Upgrade11

Verification …

Wizard-Upgrade-Verification

Successful upgrade!

Upgrade-Success

Once the TFS 2015 RTM release becomes available, it’s only a minor upgrade to move to the official RTM bits. Interested in the release notes for TFS 2015? Check out the official news update, posted in April 2015.


Connect Visual Studio Release Management to Visual Studio Online and Microsoft Azure

November 6, 2014

Today was a very big day for developers around the world. Microsoft announced a ton of exciting news at the Connect event in New York, with keynotes from Scott Guthrie, Soma Somasegar, Scott Hanselman and Brian Harry. Read more details in the Microsoft News Center: Microsoft takes .NET open source and cross-platform, adds new development capabilities with Visual Studio 2015, .NET 2015 and Visual Studio Online. On MSDN, you can now also download Team Foundation Server and Visual Studio Release Management 2013 Update 4, which offers the capability to connect Visual Studio Release Management to Visual Studio Online. Previously, it was only possible to connect Visual Studio Release Management to an on-premises Team Foundation Server.

I wrote a small guide to get you started with Visual Studio Release Management for Visual Studio Online and how to create an Azure Release Management environment via the (old) Azure Management Portal.

  1. Download and Install the latest bits of the Release Management Client.
  2. Configure the Release Management Client to connect to Visual Studio Online. Provide the URL of your Visual Studio Online account.

    image

    Hit OK and after logging in with your account credentials to VSO, the RM client should be connected to “TFS in the cloud”.

  3. Create a new Cloud Service via the Azure Portal to group all Release Management VMs.

    image

    image

  4. Create a new Storage Account via the Azure Portal to group all Release Management VM storage.

    image

    image

    Note that I have created my storage account in West Europe. Be consistent and reuse your region of choice when creating the VM(s) later. Find out more about all Azure Regions.

  5. Create new (Gallery) VM(s) via the Azure Portal in the newly created Storage Account.

    image

    Follow the wizard to create a new VM and don’t forget to add an HTTP endpoint …

    image

    image

  6. Download the Azure Publish Settings file to get the Management Certificate key.

    To be able to complete the next step, you will need the Management Certificate key to connect from the Release Management client to your Azure subscription. There are a number of ways to get the key, but the easiest way in my opinion is to navigate to https://manage.windowsazure.com/publishsettings, log in with your Azure credentials and save the publishsettings file. Open the file with notepad and copy the full ManagementCertificate value (without quotes) of your subscription.
  7. Configure the Release Management Client to add your Microsoft Azure subscription.

    Navigate to the Manage Azure section in the Release Management Client and add a new Azure Subscription.

    image

    image

    Make sure to enter the Storage Account Name you have created in one of the previous steps.

  8. Create a new Release Management Environment, linked to the Azure subscription.

    Navigate to the Configure Paths tab and select the Environments link to create a new vNext Azure environment.

    image

    Now, before you can add your VMs to an environment, you must link the environment to an Azure environment via the “Link Azure Environment” button at the top right.

    image

    Select the Azure VM Cloud Service endpoint you want to include in the new Release Management (RM) environment and finish off by hitting the Link button.

    SNAGHTMLd6951fe

    This linking will enable you to select Azure VM machines in the RM environment. Also note that the Environment name and the MSDN Subscription have been populated (read-only), but you still need to link the Azure Servers.

    image

    Finally you can select the Azure VM which was created in one of the previous steps. Linking this VM from this dialog window will push it into the RM environment. Hit Save and Close to finish.

    image

    image

    image

With one or more Release Management environments set up, you can start defining a Release Path, and once that has been set up, it will be possible to create a new (vNext) Release Template to define your deployment actions. All the appropriate Azure VMs will show up in the Release Template toolbox.

image

Enjoy connecting RM to VSO! No more excuses not to set up your release pipeline!

Update TFS Build Controller via Powershell

October 21, 2014

During a trial migration of TFS, I typically prepare a lot of command-line tools to run some background update processes. Instead of writing little C# command-line applications in Visual Studio, I’m moving more and more away from Visual Studio and doing this kind of work via PowerShell.

For a migration upgrade from TFS 2010 to TFS 2013, I have to deal with an update of the Build Controller to a new build machine/server. For a Team Project Collection of 100+ Team Projects and lots of build definitions, this is not something you want to do manually or delegate to all involved dev teams.

Instead of having to start from zero, I found this interesting post from Daniel Mann which does exactly what I want: updating the Build Controller from PowerShell via the TFS API. The only catch is that it does so for one dedicated Team Project.

So, the PowerShell script only needed a bit of tweaking to fetch all available Team Projects via the ListAllProjects method of the ICommonStructureService interface.

param(
    [Parameter(Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]
    $TfsUrl,
    [Parameter(Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]
    $NewBuildController,
    [Switch]
    $WhatIf
)

# Point all build definitions of a single Team Project to the new controller.
Function SetBuildControllerForTeamProject($TeamProject, $BuildController, $WhatIf)
{
    $buildDefinitions = $buildClient.QueryBuildDefinitions($TeamProject)
    $buildDefinitions | % {
        Write-Host " >> Checking Build Definition" $_.Name
        Write-Host " >>" $_.Name "is using" $_.BuildController.Name

        if ($_.BuildController.Uri -ne $controller.Uri) {
            Write-Host " >>> Setting" $_.Name "to use $BuildController"
            if (!$WhatIf) {
                $_.BuildController = $controller
                $_.Save()
            }
        }
        else {
            Write-Host " >> Build controller is already set. Taking no action."
        }
    }
}

# Load the TFS 2013 (version 12.0) client object model assemblies.
add-type -Path 'C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0\Microsoft.TeamFoundation.Common.dll'
add-type -Path 'C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0\Microsoft.TeamFoundation.Client.dll'
add-type -Path 'C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0\Microsoft.TeamFoundation.Build.Client.dll'

# Connect to the Team Project Collection and grab the required services.
$tfsUri = new-object System.Uri $TfsUrl
$tpc = new-object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection $tfsUri

$buildClient = $tpc.GetService('Microsoft.TeamFoundation.Build.Client.IBuildServer')
$commonStructure = $tpc.GetService('Microsoft.TeamFoundation.Server.ICommonStructureService')
$controller = $buildClient.GetBuildController($NewBuildController)

# Fetch all Team Projects in the collection and process them alphabetically.
$allTeamProjectInfo = $commonStructure.ListAllProjects()
$sortedTeamProjectInfo = $allTeamProjectInfo | sort-object { $_.Name }

foreach ($teamProjectInfo in $sortedTeamProjectInfo)
{
    Write-Host '************** Scanning Build Definitions in Team Project' $teamProjectInfo.Name
    SetBuildControllerForTeamProject $teamProjectInfo.Name $NewBuildController $WhatIf
    Write-Host ''
}
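Assuming the script above is saved as Update-BuildController.ps1 (a file name of my own choosing), I recommend a dry run with the -WhatIf switch first:

```powershell
# Preview which build definitions would change, without saving anything ...
.\Update-BuildController.ps1 -TfsUrl "http://tfs2013:8080/tfs/DefaultCollection" `
    -NewBuildController "NewBuildServer - Controller" -WhatIf

# ... then run it for real.
.\Update-BuildController.ps1 -TfsUrl "http://tfs2013:8080/tfs/DefaultCollection" `
    -NewBuildController "NewBuildServer - Controller"
```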

You may also use the Community TFS Build Manager (Visual Studio 2013 extension) to modify a number of Build Definitions in bulk, but for my migration scenario, I preferred to have a PowerShell script. Having this script also allows me to modify other settings in the Build Definitions … for example the Build Drop Location.

Community TFS Build Manager


Update MSBuild Toolpath in TFS build process template

October 20, 2014

I have experienced a number of migration scenarios where it was decided to first upgrade old Visual Studio 2010 solutions to the latest and greatest version of Visual Studio (VS 2013 at this moment) without forcing a TFS upgrade at the same time.

Depending on the type of included projects for the Visual Studio solution, the TFS build might not work anymore because it requires other MSBuild .targets files (related to the .NET Framework version / Visual Studio version).

The easiest way to fix your TFS build failures is to modify the TFS 2010 build process templates and explicitly set the MSBuild ToolPath variable in the MSBuild activity to the upgraded Visual Studio version.

Visual Studio 2013 => C:\Program Files (x86)\MSBuild\12.0\Bin
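In the build process template XAML this boils down to something like the fragment below. This is a hedged sketch: the activity carries many more arguments in a real DefaultTemplate.xaml, which are left as they were.

```xml
<!-- MSBuild activity with an explicit ToolPath pointing at the
     MSBuild 12.0 (Visual Studio 2013) tools. -->
<mtbwa:MSBuild Project="[localProject]"
               ToolPath="C:\Program Files (x86)\MSBuild\12.0\Bin" />
```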

MSBuild


Split test runs for TFS Build and inspect test results

September 22, 2014

As a consultant, I often have to deal with custom requests which cannot be handled in TFS out-of-the-box. Many customizations end up becoming valuable for other customers as well. Unfortunately, I don’t always find the time to write about them and work out a more generic solution that could help other people.

But recently I got an interesting request to intervene during the test run on the TFS Build Server, because a complete test run took way too much time. The solution built on the server consisted of a big set of Unit Tests and a big set of Integration Tests. The Integration Tests required a deployment of a SQL Server database with some reference data. All tests were run at the same time, which caused builds to run for a long time, even if one of the Unit Tests failed at the beginning of the test run: the test run only completes (success/failure) after running ALL tests.

So, the goal was to quickly detect when a test fails (= fail early!) and to have the possibility to stop the build process immediately after the first test failure (= stop/fail the build at the point where one of the tests fails). The customer didn’t see any added value in running the remaining tests, knowing that one test had already failed. Instead of waiting 30 minutes or longer for the full test results, the developers could already start fixing the first test failure, and stopping the build would also decrease the load on the build server and test environment. We also agreed to only deploy the database when all Unit Tests succeeded.

How to separate the Integration Tests from the Unit Tests?

image

My sample solution above contains 2 separate projects/assemblies to host the Unit Tests and the Integration Tests. During the configuration of a Build Definition, you can easily define 2 separate test runs.

image

The first test run definition will only fetch the Unit Tests, while the second test run definition will look for the Integration Tests. Note that I specified a specific name for the test run definition. I will use this name later to filter the test run definitions. Creating multiple test run definitions is a flexible and easy way to split your big test run in multiple smaller test runs.

How to filter the test run definitions before the execution of a Test Run?

Time to customize the build process a bit so that first only the Unit Tests can be run before deciding to proceed with a database deployment and the run of the Integration Tests.

image

Instead of running the default VS Test Runner activity which would run all test run definitions (“AutomatedTests”), you need to filter for the Unit Tests. This can be done by modifying the TestSpec parameter for the RunAgileTestRunner activity. A where clause is added to the AutomatedTests value to search only for the “UnitTests” test run definition(s).
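In text form, the filter is a VB lambda expression along these lines. This is a hedged sketch: the exact property exposing the test run name on the spec object (RunName here) is an assumption and may differ per template version, so check it against your own template.

```vb
' Filter the AutomatedTests process parameter down to the "UnitTests"
' test run definition(s) before handing it to RunAgileTestRunner.
AutomatedTests.Where(Function(spec) spec.RunName = "UnitTests")
```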

image

Result => only the Unit Tests will be executed by the VS Test Runner.

After this test run we can check the TestStatus of the build to stop the build or (in case of no test failures) to continue with a database deployment and the run of the Integration Tests.

In the ForEach activity after the database deployment operation I added a TestSpec filter for the AutomatedTests to only fetch the “IntegrationTests”.

image

The sequence in the ForEach activity will then call the VS Test Runner again and check for test failures to potentially stop the build in case of a failure in the Integration Tests.

The more fine-grained you define your Integration Tests (= different test run definitions, making use of the test assembly file specification or the test case filter), the sooner you can inspect the test results and fail the build without running the other set(s) of Integration Tests.

Inspect Test Results during a Test Run (no filters)?

In the beginning, I started looking into options to inspect the test results during the ongoing one-and-only test run (no different test run definitions / no requirement for filters). I quickly stumbled on this MSDN page to learn more about the VSTest.Console.exe command-line options. By adding the switch /Logger:trx it’s possible to drop the test results into a Visual Studio Test Results File (.trx), but the problem is that the .trx file is only persisted to disk once the test run finishes. I didn’t find a way to get to the test results while the test run was still executing.
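For reference, the switch in question looks like this on the command line (the test assembly name is illustrative):

```
vstest.console.exe IntegrationTests.dll /Logger:trx
```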

To stop the build in the customized build process template, I did throw an exception which is sufficient to stop the build process.

You can download the build process template which I used to write this blog entry. It’s far from complete and not fully tested, but it could help you to understand the filtering of the test run definitions.

