Migrate a VSS repository to TFS

August 2nd, 2012

Recently I had to migrate parts of a Microsoft Visual SourceSafe 2005 repository to a Microsoft Team Foundation Server 2010 repository. In this blog post I show what I had to do and which pitfalls I encountered.

The tool

To migrate a repository you have at least two options: migrate only the latest snapshot, or migrate the whole history. Normally you prefer to migrate the whole history, so you don’t lose the main benefit of a version control system.

To migrate a repository from Visual SourceSafe (VSS) with its complete history, there is a tool that ships with Visual Studio: vssconverter.exe. You find it in the following directory: C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE.

To migrate a repository, or just a part of it, from VSS to Team Foundation Server (TFS), you have to perform two steps: analyze and migrate.

There is quite good documentation about the process and the tool itself on MSDN.

Analyze step

In the analyze step the VSSConverter tool checks whether there are any problems and creates a user mapping file.

To start the analyze step, you enter the following at the command line:

vssconverter.exe analyze analyzesettings.xml

The analyzesettings.xml file looks like the following:

<?xml version="1.0" encoding="utf-8"?>
<SourceControlConverter>
     <ConverterSpecificSetting>
          <Source name="VSS">
               <VSSDatabase name="\\server\vss"/>
               <UserMap name="Usermap.xml"/>
          </Source>
          <ProjectMap>
               <Project Source="$/Project/Scripts/Func"/>
               <Project Source="$/Project/Scripts/Proc"/>
               <Project Source="$/Project/Scripts/Trig"/>
               <Project Source="$/Project/Scripts/View"/>
          </ProjectMap>
     </ConverterSpecificSetting>
     <Settings>
          <Output file="AnalysisReport.xml"/>
     </Settings>
</SourceControlConverter>

Running this command produces two files: Usermap.xml and AnalysisReport.xml. You can open AnalysisReport.xml to see whether there are any problems. You have to modify the Usermap.xml file before you can continue with the next step.

In the user mapping file (Usermap.xml) you map the VSS users to the users you use with TFS. The file looks like this:

<?xml version="1.0" encoding="utf-8"?>
<UserMappings xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <UserMap From="michael" To="DOMAIN\michael"/>
  <UserMap From="john" To="DOMAIN\john"/>
  <UserMap From="ADMIN" To="DOMAIN\michael"/>
</UserMappings>

I had some trouble here with the correct domain name. As a result, the user mapping didn’t work during the migration and all history entries showed me as the user. So I had to destroy the already migrated items in the TFS repository with the following command:

tf.exe destroy $/Project/Main/Source/Data/Project/SQL

After that, I corrected the Usermap.xml file and started the migration step again.

Migration step

For the migration step you need a migration settings file. Such a file looks like the following:

<?xml version="1.0" encoding="utf-8"?>
<SourceControlConverter>
     <ConverterSpecificSetting>
          <Source name="VSS">
               <VSSDatabase name="\\server\vss"/>
               <UserMap name="Usermap.xml"/>
          </Source>
          <ProjectMap>
               <Project Source="$/Project/Scripts/Func" Destination="$/Project/Main/Source/Data/Project/SQL/Func"/>
               <Project Source="$/Project/Scripts/Proc" Destination="$/Project/Main/Source/Data/Project/SQL/Proc"/>
               <Project Source="$/Project/Scripts/Trig" Destination="$/Project/Main/Source/Data/Project/SQL/Trig"/>
               <Project Source="$/Project/Scripts/View" Destination="$/Project/Main/Source/Data/Project/SQL/View"/>
          </ProjectMap>
     </ConverterSpecificSetting>
     <Settings>
          <TeamFoundationServer name="tfs" port="8080" protocol="http" collection="tfs/DefaultCollection"/>
          <Output file="MigrationReport.xml"/>
     </Settings>
</SourceControlConverter>

This settings file looks quite similar to the analysis settings file. But in the ProjectMap section each project has a Destination attribute that defines the directory in the TFS repository to which the VSS data is migrated.

In the Settings section there is an important element: TeamFoundationServer. For TFS 2010 you have to define the collection attribute; the migration won’t work without it.

You start the migration with the following command line statement:

vssconverter.exe migrate migratesettings.xml

As a result you will get a MigrationReport.xml file, which you can open in a browser to check whether there were any problems. I also recommend having a look at the VSSConverter.log file; it contains valuable additional information.
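To spot-check whether the user mapping worked, you can for example query the migrated history with tf.exe. This is just a sketch: the server URL and destination path below are the ones assumed in the settings files above and have to be adapted to your environment.

```shell
REM List the migrated changesets, including the mapped user names
tf.exe history $/Project/Main/Source/Data/Project/SQL /recursive /collection:http://tfs:8080/tfs/DefaultCollection /noprompt
```

If the output still shows the migrating account instead of the mapped domain users, the Usermap.xml file needs another round of corrections.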

If you like this, follow me on twitter…

Categories: .NET, Good practices

VDD – the new programming manifesto?

July 17th, 2012

When I was at the NDC, I also had the opportunity to visit the city of Oslo with colleagues. During a stop in front of a little shop, a colleague discovered a postcard about Viking laws. When I read it, I was really surprised how well the laws fit today’s software practices.

The Viking laws are grouped into four sections. I pick the most interesting laws from each section and relate them to software engineering.

Be brave and aggressive

Here you can read laws like "Be versatile and agile", "Attack one target at a time" or "Don’t plan everything in detail". Those laws are valid for agile software projects too. But one of the more interesting laws is "Use top quality weapons" – just replace the word "weapons" with the word "tools".

Be prepared

In this section you can read laws like "Keep weapons in good conditions", which in software development means keeping your code and tools in good condition. Another law is "Keep in shape", which means continuous learning and improving your skills. The law "Find good battle comrades" is also interesting, because learning from other programmers and engineers in our industry is very important (by practicing pair programming, for example).

Be a good merchant

In the software industry, this section is perhaps about business orientation and delivering business value to customers.

Keep the camp in order

When I read this section I thought of the "Boy Scout rule", which fits quite well. Teamwork is an important point here too.


There are so many well-fitting laws here – why don’t we do Viking Driven Development ;-)?


Categories: Agile

My personal wrap-up of the NDC 2012

June 21st, 2012

I was at the Norwegian Developer Conference (NDC) 2012 in Oslo. It is one of the best conferences I know of in Europe. One reason is that a lot of alpha geeks speak there.

Over three days there were 8 parallel tracks, so you have to plan your own program. Mine looked like this:

Wednesday, 6.6.2012

Keynote, Aral Balkan

Decisions, Decisions, Dan North

Professional Software Development, Robert C. Martin (Uncle Bob)

Agile Estimating, Mike Cohn

Modeling Distributed Systems with NServiceBus Studio, Udi Dahan

Fakes, Isolation Unit Tests, Jonathan “Peli” de Halleux

Social Clairvoyance, Gary Short

Thursday, 7.6.2012

The process, technology and practice of Continuous Delivery, Dave Farley

Busting the BDD myths, Gojko Adzic

Reinventing Software Quality, Gojko Adzic

Moving from Scrum to Kanban, Rachel Davies

The surprising science behind agile leadership, Jonathan Rasmusson

Dealing with Dynamically-Typed Legacy Code, Michael Feathers

Deep Design Lessons, Michael Feathers

Friday, 8.6.2012

Developers: The Prima Donnas of the 21st Century, Hadi Hariri

RabbitMQ Hands On, Alvaro Videla

NDC Cage Match: NodeJS vs. ASP.NET, Rob Conery, Damian Edwards, Jon Galloway

Clojure for the Web, Bodil Stokke

Responsive Web Design, Bruce Lawson

Caring about Code Quality, Venkat Subramaniam


One highlight was the speaker Gojko Adzic. I already knew him, because I read his blog. But I didn’t know how entertaining he could be without losing a bit of useful information. I also liked his sarcasm. Another highlight was Hadi Hariri’s rant about not getting things done. But one of the biggest highlights was the keynote by Aral Balkan. It was a fresh, clear and motivating keynote, just great.


There were a lot of talks about software quality, which was nice to see. This is a topic that is very important to me as well. But there wasn’t much fresh content on the topic itself. The only exception was Gojko’s second talk.



Quality isn’t a tool – you can’t install it!

June 20th, 2012

Did you ever ask yourself why one team in an organization produces very good software quality while another team in the same organization struggles to get things done – and the things it gets done are of really bad quality? It is also interesting that the same rules (methodologies, procedures, tools, frameworks, etc.) exist for both teams. But why can and does this happen?

Some people – mostly managers or vendors – try to distill quality into a recipe. Vendors can sell it expensively (with consulting) and managers can buy it to prove to their bosses that they didn’t do anything wrong (they used the standard procedures and tools). This whole thing is ridiculous, also because it happens again and again (also with agile practices).

So you can’t buy quality, nor can you install it in your team or organization. It also isn’t a tool that creates quality (many managers think that frameworks guarantee quality, which is completely wrong). But why can one team be so much better than another? This brings us back to my question at the beginning of this post.

The only reason for this difference is the people. The people in the good team – or at least some of them – care about their profession. They keep up to date (read blogs, articles, books, etc.) and lead other team members to become better. But one of the most important things is, unfortunately, discipline. You have to improve yourself constantly (step out of your comfort zone) and always look for improvements for yourself and your team. And yes, that isn’t easy.

Once you are one of those people, beware of becoming dogmatic: stay pragmatic, but insist on the important principles as long as your arguments are reasonable.

And the managers? If they are not able to change themselves (especially the middle management), then it’s up to you to change the organization – it’s your career and life.



Are stale data evil?

February 27th, 2012

When you’re a software engineer who builds software for enterprises like banks or insurance companies, it is normal to have huge databases (several gigabytes). Such systems have an operative application in which users do the company’s daily business, and there are more informative (or strategic) parts of the system which the management uses. At first glance there isn’t a problem with these two views, but as you probably know, such companies usually have a data warehouse solution for the second, management-oriented part.

But what if your customer doesn’t want a data warehouse solution? Or can’t afford one? Then you will probably add reports and search views to your application. In this blog post I describe some aspects to consider if you have to choose this variant.

Stale data as a requirement

Unfortunately the question "how old can the data on this report/search be?" is rarely asked. If the answer is "the report/search has to show the right data", then you have to ask the customer again. The problem is that the data may already be stale right after the query, because somebody changed it in the meantime.

In my experience there are only a few reports that need data as fresh as possible. But it is essential that you ask this question.

Isolate only as far as needed

Most searches or reports hit the essential tables of your relational database, so it is important that those searches or reports don’t affect your daily business. You may now ask yourself how those queries could have any impact.

If you use Microsoft SQL Server, the default isolation level is "Read committed". If a query is badly written, it can happen that its shared locks escalate to a table-level lock that blocks any inserts or updates on the whole table. If that happens, your users will notice: they will be left waiting while they try to save their data.

When you create a search or a report, you always have to ask yourself which isolation level to use. If you use dirty reads (isolation level "Read uncommitted"), you will probably never take any locks, but you have to deal with data that may be wrong: uncommitted data can be rolled back, and the same query would not return the rolled-back data again.
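As an illustration, here is a minimal T-SQL sketch of a report query that accepts dirty reads in exchange for taking no shared locks; the table and column names are made-up examples, not from a real schema:

```sql
-- Run the report with dirty reads: no shared locks, but possibly
-- uncommitted (and later rolled-back) data in the result.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT CustomerId, SUM(Amount) AS Total
FROM dbo.Orders            -- hypothetical reporting table
GROUP BY CustomerId;

-- The same effect per table, using a hint instead of the session setting:
-- SELECT CustomerId, SUM(Amount) AS Total
-- FROM dbo.Orders WITH (NOLOCK)
-- GROUP BY CustomerId;
```

Whether this trade-off is acceptable is exactly the "how old/how wrong may the data be?" question you have to clarify with the customer.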


Stale or even wrong data in a search or an informational report isn’t necessarily a mistake. Sometimes it’s just good enough to fulfil the requirements and make the customer happy. And that’s what it’s all about.


Categories: Software architecture