Integration of Dynamics CRM 2011 solutions with TFS

December 28, 2012

Some weeks ago I was asked for a proof of concept to design a TFS 2010 solution to fully (not less than 100%) automate a complex Dynamics CRM 2011 deployment for various environments (dev / test / staging / production). Many different components were involved: the CRM solution itself, but also web applications, database objects, reports (SSRS), transformations, …

Dynamics CRM

It has been an interesting journey so far, and along the way I got to know (a bit) how Dynamics CRM 2011 works. Not to my surprise, I realized that it’s quite hard to push all source-related items to TFS and to force ALL changes/updates to a CRM environment from a version-controlled solution in TFS. Many things in the CRM environment are easily modified by the development team via the CRM web UI and, as a result, stored directly in the CRM database(s). So the POC also required me to think about enforcing best practices for the CRM development team to avoid inconsistencies in the global deployment solution, and I definitely wanted to end up with a build-once, deploy-many solution.

Anyway, I won’t cover the entire scope of the POC, but I do want to highlight the approach I took for automating the export and extract operations from the development CRM 2011 instance via a TFS build definition. The goal was to automatically capture the daily changes published to the deployed CRM development solutions.

EXPORT
The MSCRM 2011 Toolkit contains a Solution Export command-line utility which enabled me to export one or more CRM solutions, based on an existing Solutions Export Profile, into a single compressed solution file (.zip).

EXTRACT
The compressed solution file (zip format) is of course not ideal for tracking individual changes and binding them to a version control repository. Luckily, the latest release of the Dynamics CRM 2011 SDK added a new tool (SolutionPackager) to extract the different components into individual files.

The SolutionPackager tool, available in the Microsoft Dynamics CRM 2011 Update Rollup 10 version of the Microsoft Dynamics CRM SDK download, resolves the problem of source code control and team development of solution files. The tool identifies individual components in the compressed solution file and extracts them out to individual files. The tool can also re-create a solution file by packing the files that had been previously extracted. This enables multiple people to work independently on a single solution and extract their changes into a common location. Because each component in the solution file is broken into multiple files, it becomes possible to merge customizations without overwriting prior changes. A secondary use of the SolutionPackager tool is that it can be invoked from an automated build process to generate a compressed solution file from previously extracted component files without needing an active Microsoft Dynamics CRM server.
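For illustration, the extract step and a later repack could look like this (paths and file names are hypothetical; /action, /zipfile and /folder are the tool’s documented switches):

    SolutionPackager /action:Extract /zipfile:C:\Drops\CrmSolution.zip /folder:C:\Workspace\CrmSolution
    SolutionPackager /action:Pack /zipfile:C:\Drops\CrmSolution_Rebuilt.zip /folder:C:\Workspace\CrmSolution

The Pack action is what makes the build-once scenario possible: a build server can recreate the .zip from the version-controlled component files without an active CRM server.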

TFS 2010

So, these tools opened the door for me to work out a custom build process (workflow) in TFS 2010 with the following sequential activities:

  • Export CRM solution from dev environment (MSCRM 2011 Toolkit)
  • Prepare TFS workspace before extract of solution file [Get Latest + Check-Out]
  • Extract compressed solution file into TFS workspace (SolutionPackager)
  • Scan TFS workspace for changes/additions/deletions
  • Check-In pending changes of the TFS workspace as a single changeset

Scanning the TFS workspace to end up with all differences (changes/additions/deletions) was a bit more complex than expected, because I needed several TFS API Workspace calls (PendEdit, PendAdd, PendDelete, …). I also made use of the EvaluateCheckin2 method to detect potential conflicts and to perform proper exception handling.
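A minimal sketch of that scan-and-check-in step against the TFS 2010 client object model (Microsoft.TeamFoundation.VersionControl.Client) is shown below. The collection URL, workspace name and local path are assumptions, the add/delete detection is reduced to its essence, and the exact EvaluateCheckin2 overload may differ per SDK version:

    using System;
    using System.IO;
    using Microsoft.TeamFoundation.Client;
    using Microsoft.TeamFoundation.VersionControl.Client;

    class WorkspaceScan
    {
        static void Main()
        {
            // Hypothetical collection URL and workspace name.
            var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
                new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
            var vcs = collection.GetService<VersionControlServer>();
            Workspace workspace = vcs.GetWorkspace("CrmBuildWorkspace", vcs.AuthorizedUser);

            // Pend edits on everything under the local root the extract wrote into.
            string localRoot = @"C:\Workspace\CrmSolution";
            workspace.PendEdit(localRoot, RecursionType.Full);

            // New files on disk that are unknown to version control become adds.
            foreach (string file in Directory.GetFiles(localRoot, "*", SearchOption.AllDirectories))
            {
                string serverItem = workspace.GetServerItemForLocalItem(file);
                if (!vcs.ServerItemExists(serverItem, ItemType.File))
                    workspace.PendAdd(file);
            }

            // Server items that no longer exist on disk would become deletes
            // (mirror of the loop above using workspace.PendDelete, omitted here).

            PendingChange[] pending = workspace.GetPendingChanges();
            if (pending.Length > 0)
            {
                // Evaluate policies/conflicts before committing everything as one changeset.
                CheckinEvaluationResult evaluation = workspace.EvaluateCheckin2(
                    CheckinEvaluationOptions.All, pending, pending,
                    "Automated CRM solution extract", null, null);

                if (evaluation.Conflicts.Length == 0)
                    workspace.CheckIn(pending, "Automated CRM solution extract");
            }
        }
    }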

This process allows the development team to easily follow up (via TFS changesets) on the incremental changes that were applied to the dev CRM environment. Note that the SolutionPackager tool can also regenerate a compressed solution file from the individual component files, which enables the build-once part of the story.


TFS 2010: Virus Scan file/folder exclusions

December 20, 2011

At several customers I have seen that some virus scanning systems (no names) have a serious negative impact on the performance of Team Foundation Server. The build server in particular can be an easy target for virus scanning actions.

Here’s the latest recommendation I sent to the Ops Team. Note that all involved servers run Windows Server 2008 R2 (64-bit) and the software is installed on the C drive. The TFS Application Tier also has WSS 3.0 (SP2) installed. Use at your own risk, and good luck getting this approved by your Ops Team. Traditionally they apply the presumption of innocence principle: the virus scanner is innocent until proven guilty!

  • TFS Application Tier
    • C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services\_tfs_data [TFS Version Control Cache folder]
    • C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions [SharePoint Portal]
    • C:\Windows\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files [SharePoint Portal + Web Access]
    • C:\Users\TFSWSSAccount\AppData\Local\Temp [SharePoint Portal]
  • TFS Build Server(s)
    • Build Agent(s) working directory: D:\Builds
    • TFS local cache folder for TFSBUILD account: C:\Users\TFSBUILDAccount\AppData\Local\Microsoft\Team Foundation
  • TFS Data Tier
    • SQL Server database files: .mdf (data) + .ldf (log)
  • Client DevSeat
    • TFS workspaces folder: D:\TFS

I did find some official references to back up these exclusions.


The importance of Continuous Building/Integration

December 13, 2011

Automatically triggering a build after each check-in (the continuous integration trigger option in TFS build definitions) in a development branch is a good practice that nowadays needs less explanation than a few years ago. Many development teams seem to have adopted this practice quite well, but there’s still a lot of room for improvement.

The main benefit of enabling continuous (integration) builds is clear: providing early feedback on the validity of the latest code changes committed to the version control repository. I won’t discuss in detail the different checks (compilation, deployment, testing) that should be part of the validation step (as always: it depends!); for this post I want to focus on the importance of having at least a continuous (integration) build. A future topic to tackle is how to get your CI builds as fast as possible with TFS 2010.

[Image: TriggerCI – the continuous integration trigger option in a TFS build definition]

The first guideline developers should follow if they want to reap the full benefits of CI builds is to check in early and often. I’m often worried (even shocked) when developers show me the amount of pending changes in their local workspace. It nearly always leads to the conclusion that those developers are working on many different tasks at the same time, which should be avoided at all costs: the risk of committing unwanted files for a dedicated change is just too high. There are version control features available to safely switch to another task: shelving, or creating an extra workspace. The next version of TFS (TFS11) will even include better support for task-driven development through a revised Team Explorer (more details in this blog post by Brian Harry).

Checking in early and often does of course not mean that developers should just check in changes that are not ready yet or not locally validated, but it does mean that developers should work on small incremental changes. This requires every developer involved to break down their assigned work into small tasks, each of which may result in a check-in operation. Ideally, as a developer, you should commit your changes to the version control repository at least daily. And don’t forget to provide a decent comment for the check-in operation, in addition to an association with a work item: this metadata may become quite important in some merging scenarios.
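Shelving, for instance, is a one-liner with the tf.exe client that ships with Team Explorer; the shelveset name below is just an example:

    tf shelve "TaskSwitch-WIP" /comment:"Parking unfinished changes before switching tasks" /noprompt
    tf unshelve "TaskSwitch-WIP"

The first command parks all pending changes on the server (add /move to also undo them locally) and the second restores them into the workspace when you return to the task.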

Having a CI build and adopting a check-in early and often policy means that a broken build can only be caused by a small incremental change, which should be easier to fix than a large changeset or a combination of multiple changesets from different developers. More frequent check-ins further decrease the number of files that could be part of a conflicting changeset that causes the build to fail.

[Image: BuildOverview – build status overview]

Another golden rule in a CI environment is to fix a broken build as soon as possible and to delay other commits of pending changes. This implies that a notification mechanism should be put in place to warn the development team about broken builds. In Team Foundation Server 2010, you can easily set up an e-mail alert or rely on the built-in Build Notifications tool. The developer who caused the build to fail must take responsibility for fixing it immediately. A developer’s check-in operation can only be considered complete/done when the CI build has successfully built the latest changeset.

My recommended 10-step workflow for all developers:

[Image: DeveloperWorkflow – the recommended 10-step developer workflow]

After the last step you may start all over again, or you may just go home! Committing changes to version control requires you to at least wait until the CI build has finished successfully. Do you really want to impact the entire development team by going home without verifying your latest changes?

This reminds me of my early years as a junior .NET developer in a mid-sized development team. At that time there was no proper build server yet, and we had never heard of continuous integration. I don’t have to tell you how many times all developers were impacted by an invalid check-in, or how much time we spent finding the culprit. You have probably all been in a similar situation:

  • Dev1: My local build fails after a get latest
  • Dev1: Who did a check-in recently? Anyone heard of method X?
  • Dev2: I just did a get latest and everything works fine!
  • Dev3: Are you sure that you took a “real” latest of everything?
  • Dev1: Wait, let me redo a get latest to be sure!
  • Dev4: Damn, I also took a get latest and my build now fails!
  • Dev2: Sorry guys, I might have forgotten to check-in a file!

In some cases, the person who caused the build to fail was not even at his desk (sick/holiday/meeting) when the build failure was discovered! Yeah, those were times none of us ever wants to go back to, right? Not to mention what this type of troubleshooting costs!

In brief, it’s all about discipline and communication. Having an up-to-date build dashboard on a big screen can definitely help to make everyone aware of the importance of the latest build status.

In a future blog post I will talk about the importance of CI build speed. Obviously, to provide early feedback, CI builds must run as fast as possible: what’s the value of a CI build that takes more than an hour?! There are some interesting options you can enable in TFS build definitions to optimize CI builds.

Note that since TFS 2010 there is also the Gated Check-in trigger option, which prevents broken builds altogether by delaying the actual commit until a successful build has run with the latest code changes from a temporary shelveset.


Errors during build execution while downloading large files from Version Control

August 3, 2011

In a large SharePoint development project at a customer, I was confronted with a not-always-reproducible build error while syncing the build workspace:

Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host


This error sometimes occurred while downloading the larger database backup files (> 1.5 GB) that are required in the deployment and test phases.

Luckily, a patch (KB981898) for the TFS Application Tier (Windows Server 2008 R2) exists to resolve this particular issue! Also have a look at this blog article for more background info.

Also thanks to Giulio Vian for helping me out with this issue!


Work Item Only View (WIOV) Users in TFS 2010

June 29, 2011

Team Web Access (TWA) is a customizable web interface for accessing Team Foundation Server project data. It acts as a client of Team Foundation Server and provides most of the functionality available through Visual Studio Team Explorer. Users who connect to TFS via TWA must also have a valid Client Access License (CAL).

[Image: WIOV – Work Item Only View]

But … even without a CAL you may create and view/modify work items that were created by you in Visual Studio Team Foundation Server. To perform these tasks, you need only Team Web Access in Work Item Only View (WIOV) and the required permissions.

Taken from the Visual Studio 2010 Licensing Whitepaper (section Client Access License Exception for Certain Work Items):

A user does not need a CAL or External Connector License to create new work items or to update work items that that same user has created. This exception applies only to work items related to defect filing or enhancement requests. However, a CAL is required when a user views or modifies a work item created by another user or interacts with Team Foundation Server in any other way.

Great, but how to make this work inside Team Foundation Server 2010?

Open up the Team Foundation Administration Console and click the Group Membership link on the Application Tier tab.


There you will find a TFS security group named Work Item Only View Users. Add users to this group when they should only have access to the Work Item Only View feature.
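For scripting scenarios, membership of this group can presumably also be managed with the tfssecurity command-line tool; a minimal sketch with a hypothetical account and server URL (verify the exact group and identity syntax against your environment):

    tfssecurity /g+ "Work Item Only View Users" n:DOMAIN\jdoe /server:http://tfsserver:8080/tfs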

The actual downgrade of permissions for these users is configured through the Security Administration.


This TFS security group is denied access to the full set of Web Access features.

Being part of this group gives you a limited version of Team Web Access, where you can only view and manage your own work items.



ALM End-To-End

May 12, 2011

Today I gave a short presentation (45 minutes) on Application Lifecycle Management End-To-End. The session was part of the ISV Discovery Day.

The presentation focused primarily on showing the audience the end-to-end ALM workflow in Visual Studio 2010, from gathering and managing requirements up to releasing an application to production.

Topics covered during the demo:

  • Requirements Management with work item queries and Excel workbooks
  • Status reports with SQL Server Reporting Services and Excel Reports
  • Work Item Management association during check-in
  • Continuous Integration builds
  • Test Case Management with Microsoft Test Manager

Download the slides here.


MVP Summit 2011 – VS ALM stuff

March 9, 2011

Last week I was in Seattle for the MVP Summit, where I was one of the 78 Visual Studio ALM MVPs present to explore the future of the Application Lifecycle Management platform.

On Sunday I already enjoyed a full day of interesting MVP-to-MVP sessions, and I want to highlight one session in particular, from Sven Hubert, a German Visual Studio ALM MVP. He gave a nice talk on extended dependency management with Team Foundation Server. Many of us have certainly been in the situation where a custom solution was required to manage internal/external dependencies, and this type of automated solution really rocks. Be sure to check out his blog article where he describes his solution for managing dependencies with Team Foundation Server. If you understand German, you might also want to have a peek at the AIT TFS blog.

The other days were filled with a bunch of interesting sessions by the ALM/TFS product team about the (near) future of Microsoft’s ALM offering. It’s good to see that the people at Microsoft are really listening to customer/community feedback, and they are extremely passionate about the new features they want to introduce into the product. I had a blast being part of this gathering at Microsoft and made a few important new connections.

Meanwhile, a few important releases have been made, and a few Visual Studio ALM MVPs and Microsoft people have been working hard to publish new ALM books.

Really exciting times to be involved in the ALM space!


Going beyond nice and easy installation scenarios of TFS 2010

January 29, 2011

The TFS product team has done a great job in facilitating the installation and configuration of Team Foundation Server 2010, but don’t get too excited just yet. Some scenarios are still a real challenge to complete successfully and require a lot of planning and testing. It all depends on whether you are migrating from a previous Team Foundation Server (2005/2008) and what type of TFS topology you are looking for.

Easy

The easiest scenario is of course the TFS 2010 basic installation: a clean install of TFS 2010 on a client operating system (Vista / Windows 7), with no hassle with SharePoint and SQL Server Reporting Services. This scenario can be completed by any software developer and is perfect for a local development playground for version control, work item management and build automation. You may be up and running in less than an hour! This is the only installation type where you don’t really need the famous TFS 2010 Installation Guide.

The other easy scenario is setting up a clean Team Foundation Server 2010 (no upgrade) on a single server (Application Tier and Data Tier on the same machine). This is especially true if you use all new installations of the prerequisite software to host the configuration database, the report server and the portal server (Windows SharePoint Services 3.0). This scenario already requires some familiarity with the other (optional) TFS components: SharePoint and SQL Server Reporting Services. Read more about the single-server installation procedure.

Difficult

Selecting a (complex) multiple-server installation for Team Foundation Server should always be based on actual requirements, not on the fact that it would be cool to try to set it up. Don’t even think about this scenario when you have fewer than 250 potential users. Here the complexity automatically increases: you will probably need to pass the security department to get clearance for the required TFS service accounts, and you will need to work closely with the SQL Server and SharePoint operations teams to work out a matching infrastructure for the SQL Server database instance, the SQL Server Report Server, the SQL Server Analysis Server and the Microsoft Office SharePoint Server (MOSS or WSS). Most of the time they won’t immediately give you the infrastructure you really need: you will have to earn their respect and understand their overall policies. This is definitely the part where you won’t have full control of the situation. The bigger the company, the more time it will take to get agreement from all parties, so start these negotiations as early as possible!

Let’s shift to upgrading from a previous Team Foundation Server (2005/2008) to Team Foundation Server 2010. Here you have two options: an in-place upgrade or a migration upgrade. The in-place upgrade to TFS 2010 uses the exact same hardware the previous version was running on, while the migration upgrade allows you to move to new and better hardware for all TFS components at the same time. You might also want to consider moving from 32-bit to 64-bit servers. An extra difficulty may come up when the previous TFS installation still uses a SQL Server 2005 data store, because TFS 2010 no longer supports SQL Server 2005. Before going down the upgrade path, be sure to carefully read the TFS 2010 Supplemental Upgrade Guide on CodePlex.

In my opinion it’s always better to go for the migration upgrade. On top of the benefits of new hardware for Team Foundation Server 2010 (always very welcome!), you always have an easy fallback scenario in case the migration upgrade doesn’t succeed.

Advanced

A not so well known feature of TFS 2010 is the ability to import data and projects from one or more previous Team Foundation Servers (2005/2008) into new Team Project Collections in a single TFS 2010 instance. Importing has different characteristics than upgrading, but in some situations a combination of upgrading and importing may be the best fit. Consider the following situation:

  • A running multi-server TFS 2008 instance with 150+ Team Projects.
  • The Team Projects should be split into different Team Project Collections, each matching a specific division in the company.
  • Not all divisions can or want to upgrade at the same time due to different important release cycles; the risk and possible impact are much bigger when everything is upgraded at once.

Possible solution:

  • Perform a migration upgrade to TFS 2010.
  • Split the upgraded Team Project Collection into new Team Project Collections for the divisions that were ready for the upgrade.
  • Divisions that were not ready for the upgrade can temporarily continue to work on the old TFS 2008 infrastructure.
  • To investigate a future import of the remaining 2008 Team Projects for one or more divisions into TFS 2010, a trial import can be executed. During the trial period, all actions needed to become TFS 2010 ready can be studied (builds, reports, portal, …).
  • Once the trials are successful, one or more final imports (depending on the number of divisions left to move to TFS 2010) can be performed into new TFS 2010 Team Project Collections, combined with the rework necessary to bring them fully operational on the new infrastructure.
  • After all remaining Team Projects have been imported into TFS 2010, the TFS 2008 environment is no longer needed.

Over time you will end up with a TFS 2010 environment containing all desired Team Project Collections, and you will have mitigated the risk of a single big-bang upgrade that forces everyone to be ready at the same time. One downside of the import command is that it does not upgrade the reports or team project portals associated with the projects and databases to TFS 2010. That’s why it’s certainly a good practice to first perform an upgrade and bring as many Team Projects as possible back online in TFS 2010 right after that initial upgrade; the original purpose of the import action is to consolidate different TFS environments into a single TFS instance.


TFS 2010 Team Project Security Management

January 26, 2011

Setting up security for all Team Projects on all involved TFS components (TFS, SharePoint and SQL Server Reporting Services) for each individual user can be quite frustrating and error-prone.

I have seen this type of mismanagement once too often now, so it’s about time to publish some basic guidelines on how to manage Team Project security rights and permissions across all involved TFS components.

Download my recommended strategy for getting rid of the familiar red crosses in Team Explorer and for managing TFS security wisely.

Download TFS2010TeamProjectSecurityManagement.pdf.

Content:

  • New Team Project
  • Group Membership for Team Project
  • What about security for SharePoint and SQL Reporting Services
  • Welcoming the TFS Administration Tool (v2.1)
  • Make use of Active Directory groups

Several official references were used in the recommendation.

A final note to conclude: the Team Project permission sets explained here are not the only permission sets available in a Team Project. Read my previous blog post on fine-grained permissions in TFS 2010 for more information.


Required permission for TFS 2010 Backup Plan

January 24, 2011

While setting up a TFS Backup Plan (part of the TFS 2010 Power Tools) on a new Team Foundation Server, I ran into a security issue.

[Image: TFSBackupError – Grant Backup Plan Permissions failure]

[ Grant Backup Plan Permissions ] Account tfssetup failed to create backups using path TfsBackups.

The error didn’t go away, even after making sure that the tfssetup account had the appropriate rights on the shared network folder where the backups would be dropped.

Giving Everyone modify rights resolved the issue immediately, but of course that’s not the solution I was looking for.

So Process Monitor from Sysinternals once again came to the rescue: it revealed that SQL Server (sqlservr.exe) itself was trying to access the shared network folder.

Solution: also grant modify rights on the shared network folder to the SQLService account that runs SQL Server.
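As a sketch, granting those rights from the command line could look as follows (the share path and account name are examples; note that share-level permissions are managed separately from these NTFS permissions):

    icacls \\fileserver\TfsBackups /grant "DOMAIN\SQLService:(OI)(CI)M"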

