Database Deployment. Hacking the VS database project. Part 2.

In the last article we looked at how, with the aid of pre-build tasks, we could combine hand-written update scripts with the Visual Studio database project. This concluding article goes through the steps needed to set up an automated build task in Team Foundation Server, based on the solution we created earlier.

We are going to set up a continuous-integration-style build task, though for brevity I’m simply calling it ‘Test’. This is an automated build that runs every time a new check-in is made. The benefit of this is that if developers make breaking changes to the codebase, it is picked up almost immediately after check-in. Developers can work on just a subset of the projects in a solution but still be confident that the whole solution builds, as it will be built by the continuous integration build task. This is also an ideal place to put unit tests; these can be set up to raise build errors if they fail. However, unit testing is outside the scope of this article.

Setting up a build

I’m assuming that you already have a build controller and a build agent set up. If you don’t, then don’t worry; it’s very straightforward. You can muddle through with the built-in wizard or read about it here

Go to Team Explorer, Home, Builds and select “New Build Definition”.

Under General, give your new build a descriptive name.


Under Trigger, set the trigger type to “Continuous Integration – Build each check-in”. Have a read of the other types as well; they are fairly self-explanatory.


Under Workspace, make sure that every folder your solution uses is in this list, and that the folder structure is reflected in the same way in the Build Agent Folders. Essentially, Team Build is going to copy all of the source files onto the build agent before building the solution; this screen defines the folder mapping.


Process is the tab where we have to make the majority of changes:

  1. Under “Items to Build” -> “Configurations to Build”, add a configuration called “Test” with Platform “Any CPU”. This will be important later.
  2. Under Advanced, set the following to False: “Associate Changesets and Work Items”, “Create Work Item on Failure” and “Label Sources”. These settings are very specific to bug tracking and labelling; I don’t know why they are not False by default.
  3. Also under Advanced, add the following for “MSBuild Arguments”. This tells MSBuild that you want to build, but that you also want to publish changes to a database.

 /t:Build /t:Publish /p:SqlPublishProfilePath=FootballTest.publish.xml
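If you want to verify these arguments before wiring up the build definition, you can run the same targets locally from a Visual Studio command prompt. The solution file name below is a placeholder for your own:

```
msbuild FootballSolution.sln /t:Build /t:Publish /p:Configuration=Test /p:SqlPublishProfilePath=FootballTest.publish.xml
```

This builds the project and then publishes it using the named profile, which is exactly what the build agent will do on each check-in.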

The publish settings

Follow the steps from the previous article as if you were going to publish the solution: set the connection string to that of your test environment, then click “Save Profile As..” and save the profile in the root of your solution, so that the name matches the one in the MSBuild arguments above.
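For reference, the saved profile is just a small MSBuild file. A FootballTest.publish.xml for a test environment looks something like the following; the server name in the connection string is a placeholder for your own:

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- Database to publish to; replace with your own test database name -->
    <TargetDatabaseName>FootballTest</TargetDatabaseName>
    <!-- Connection string for the test environment (placeholder server name) -->
    <TargetConnectionString>Data Source=TESTSQL01;Integrated Security=True</TargetConnectionString>
    <ProfileVersionNumber>1</ProfileVersionNumber>
  </PropertyGroup>
</Project>
```

If the publish settings dialog produced extra deployment options (such as blocking on possible data loss), they will appear as additional elements in the same PropertyGroup.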

Editing the project file

Open Configuration Manager from the Build menu. Add a new solution configuration called Test, and create new project configurations as well. I like to delete the default Debug and Release ones, but you can do as you wish, so long as there is one that matches the configuration in the build definition we just created.

This is where things get interesting. The problem at this point is that we do not need (or want) all of the pre-build tasks to run on the build agent: as long as the solution was building when it was checked in, the dacpac file will be up to date, so we don’t need to regenerate it. We also don’t want all of these pre-build tasks running on the build server, as we would then need an instance of SQL Server there. Unfortunately, Visual Studio does not let you specify different pre-build tasks for different build configurations out of the box. We can achieve this by editing the project file manually.

In the Solution Explorer, right-click the project and select “Unload Project”. Right-click the project again and you will get the option “Edit ProjectName.sqlproj”. You’ll notice that a number of the PropertyGroup elements have Condition attributes; a PropertyGroup is only active when its condition is met. Handily, some already exist where the Configuration is the condition being tested. We need to find the element that contains the pre-build events, give it a condition so that it is only active for the Dev configuration, then copy it and enter a different set of pre-build tasks for the Test configuration.

<PropertyGroup Condition=" '$(Configuration)' == 'Dev' ">
  <PreBuildEvent>CD "$(SolutionDir)"
CD ..\FootballDatabase\Build
CALL 01RestoreStructure.bat "$(SolutionDir)"
CALL 02WriteStructureScript.bat "$(SolutionDir)"
CALL 03UpdateStructure.bat "$(SolutionDir)"
CALL 04ImportDatabaseObjects.bat "$(SolutionDir)" "$(VS110COMNTOOLS)"
CALL 05WriteDataScript.bat "$(SolutionDir)"</PreBuildEvent>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)' == 'Test' ">
  <PreBuildEvent>CD "$(SolutionDir)"
CD ..\FootballDatabase\Build
CALL 02WriteStructureScript.bat "$(SolutionDir)"
CALL 05WriteDataScript.bat "$(SolutionDir)"</PreBuildEvent>
</PropertyGroup>

The .bat files that write the scripts must be left in, as the generated scripts are not under source control.

That’s it. Reload your project file and check in the changes, which should trigger a new build. You can see the status of your builds via Team Explorer -> Builds, or you can use the web interface provided with TFS. If your build is failing, check that permissions are set up correctly; your build agent will need permission to alter the database in your test environment, for instance. In the build definition there was an option to copy build output to a drop folder. You may have been tempted to turn this off (especially as only UNC paths are supported), but this isn’t a good idea, as it greatly reduces the amount of build output you can access and makes debugging tricky.

Another great tool that comes as standard in the Visual Studio tools is Build Notifications. This adds a notification icon to the taskbar which will let you know if your continuous integration build has failed.

There is no limit to the number of builds you can set up against a single solution. You may decide that you want a separate environment for load testing, or a UAT environment that is only triggered manually; if you have internal live environments, you could use TFS to perform upgrades. I would suggest you add security to your build task if you are going to set it up with live connection strings; this is supported as standard.

I would strongly advise some sort of automated deployment. Upgrading your software is always going to be a risk; this is one way you can mitigate that risk, by ensuring that your deployment process has as little manual intervention as possible. This is the best solution that I can come up with; the drawbacks are that it is quite a serious hack and there are a lot of steps where things could potentially go wrong. Please comment if you can see any improvements that could be made, or any other third-party tools that can be employed to aid deployment.

Download the solution

Download the data script DataGeneration.sql
