Archive for the ‘.NET’ Category

NDC 2014

June 16th, 2014

I attended this year’s NDC (Norwegian Developers Conference) in Oslo. It was a very interesting conference, but, as a short summary, it felt like a consolidation. JavaScript – as some people say, now in its fourth generation (simple scripts, AJAX, MVC frameworks, SPAs) – is finally accepted as a language like C# or Java. In the agile world, too, there is no hype anymore around Scrum or Kanban; the discussions were more about how and when to use them.
One major topic I saw in several sessions was maintainable code: code has to be readable, clear and simple. This is something I have been trying to teach my team members for a few years now (yes, I too was once a geek who tried more or less every new fancy framework).
For me, the highlights were the sessions “Seven ineffective coding habits of many programmers” by Kevlin Henney, “Code that fits your brain” by Adam Tornhill and “Beautiful builds” by Roy Osherove.

Here are the sessions I attended:

4.6.2014:

5.6.2014:

6.6.2014:

Besides Scott Hanselman’s fun presentation about JavaScript, there was another funny one: “History of programming, Part 1”. You can watch all recorded sessions on Vimeo.
After the conference I stayed for the weekend in Oslo. And yes, it is a really nice place to be and the people are really friendly.



Migrate a VSS repository to TFS

August 2nd, 2012

Recently I had to migrate parts of a Microsoft Visual SourceSafe 2005 repository to a Microsoft Team Foundation Server 2010 repository. In this blog post I show what I had to do and which pitfalls I ran into.

The tool

To migrate a repository you have at least two possibilities: migrate only the latest snapshot or the whole history. Normally you prefer migrating the whole history, so you don’t lose the advantages you gained from using a version control system.

To migrate a repository from Visual SourceSafe (VSS) with its complete history, there is a tool that comes with Visual Studio: vssconverter.exe. You find it in the following directory: C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE.

To migrate a repository, or just a part of it, from VSS to Team Foundation Server (TFS), you have to go through two steps: analyze and migrate.

There is quite good documentation about the process and the tool itself on MSDN.

Analyze step

In the analyze step, the VSSConverter tool checks whether there are any problems and creates a user mapping file.

To start the analyze step, you enter the following on the command line:

vssconverter.exe analyze analyzesettings.xml

The analyzesettings.xml file looks like the following:

<?xml version="1.0" encoding="utf-8"?>
<SourceControlConverter>
<ConverterSpecificSetting>
     <Source name="VSS">
          <VSSDatabase name="\\server\vss"/>
          <UserMap name="Usermap.xml"/>
     </Source>
     <ProjectMap>
          <Project Source="$/Project/Scripts/Func"/>
          <Project Source="$/Project/Scripts/Proc"/>
          <Project Source="$/Project/Scripts/Trig"/>
          <Project Source="$/Project/Scripts/View"/>
     </ProjectMap>
</ConverterSpecificSetting>
<Settings>
     <Output file="AnalysisReport.xml"/>
</Settings>
</SourceControlConverter>

Running this command produces two files: Usermap.xml and AnalysisReport.xml. You can open AnalysisReport.xml to see whether there are any problems. The Usermap.xml file you have to modify before you can continue with the next step.

In the user mapping file (Usermap.xml) you map the VSS users to the users you use with TFS. The file looks like the following:

<?xml version="1.0" encoding="utf-8"?>
<UserMappings xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <UserMap From="michael" To="DOMAIN\michael"/>
  <UserMap From="john" To="DOMAIN\john"/>
  <UserMap From="ADMIN" To="DOMAIN\michael"/>
</UserMappings>

I had some trouble here with the correct domain name. As a result, the user mapping didn’t work during the migration and all history entries had me as the user. So I had to destroy the migrated items in the TFS repository with the following command line statement:

tf.exe destroy $/Project/Main/Source/Data/Project/SQL

After that, I corrected the Usermap.xml file and started the migration step again.

Migration step

For the migration step you need a migration settings file. Such a file looks like the following:

<?xml version="1.0" encoding="utf-8"?>
<SourceControlConverter>
<ConverterSpecificSetting>
     <Source name="VSS">
          <VSSDatabase name="\\server\vss"/>
          <UserMap name="Usermap.xml"/>
     </Source>
     <ProjectMap>
          <Project Source="$/Project/Scripts/Func" Destination="$/Project/Main/Source/Data/Project/SQL/Func"/>
          <Project Source="$/Project/Scripts/Proc" Destination="$/Project/Main/Source/Data/Project/SQL/Proc"/>
          <Project Source="$/Project/Scripts/Trig" Destination="$/Project/Main/Source/Data/Project/SQL/Trig"/>
          <Project Source="$/Project/Scripts/View" Destination="$/Project/Main/Source/Data/Project/SQL/View"/>
     </ProjectMap>
</ConverterSpecificSetting>
<Settings>
     <TeamFoundationServer name="tfs" port="8080" protocol="http" collection="tfs/DefaultCollection"/>
     <Output file="MigrationReport.xml"/>
</Settings>
</SourceControlConverter>

This settings file looks quite similar to the analysis settings file, but in the ProjectMap section each Project element has a Destination attribute, which defines the directory in the TFS repository into which you want to migrate the VSS data.

In the Settings section there is one important entry: TeamFoundationServer. For TFS 2010 you have to define the collection attribute; the migration won’t work without it. The attribute values combine to the collection URL, here http://tfs:8080/tfs/DefaultCollection.

You start the migration with the following command line statement:

vssconverter.exe migrate migratesettings.xml

As a result of this statement you will receive a MigrationReport.xml file, which you can open in a browser to check whether there were any problems. I also recommend having a look at the VSSConverter.log file; it contains some valuable additional information.

Categories: .NET, Good practices

TF.exe or maybe the most useful tool for TFS

January 31st, 2012

When you have to use TFS, there are moments when you miss some features in the UI tools of Visual Studio. In those situations the console tool tf.exe is very useful. In this short blog post I have summarized the commands I used the most.

Update an old version of a branch

It can happen that the production branch or an older feature branch isn’t up to date because somebody forgot one or several merges from the main branch. To fix that, you can use the following statement:

tf.exe merge /recursive /force /version:T $/Project/Main/Source $/Project/Production/Source
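
If you first want to see which changesets are still missing in the target branch, the merge command offers a /candidate option that only lists the merge candidates instead of merging; here it is shown with the same example paths:

tf.exe merge /candidate /recursive $/Project/Main/Source $/Project/Production/Source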

Delete a branch permanently

After a while the number of merged feature branches increases. Normally you would delete the merged feature branches so they are no longer visible in Visual Studio. But in my case I also display the deleted items, which means that I still see all the deleted feature branches. To remove the old merged feature branches permanently, you can use the following statement:

tf.exe destroy $/Project/Development/FeatureX
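
Because destroy is irreversible, a dry run first can save you a lot of grief; the destroy command supports a /preview option, which only lists what would be destroyed without actually deleting anything:

tf.exe destroy /preview $/Project/Development/FeatureX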

Categories: .NET, Good practices

DataSet and deleted rows

January 31st, 2012

Yes, I know, the DataSet isn’t leading-edge technology, but like the company where I currently work, there are several companies that still use the DataSet as their data access technology.

Recently, I had to migrate the DataSet subclasses of my current employer’s framework to .NET 4.0 and add LINQ support. After this migration I wrote some unit tests and found a surprising fact: if you use LINQ with a DataSet, you have to handle deleted rows yourself, which isn’t the behaviour you would expect.

First, here is a little setup to run the tests:

// Create a typed DataSet and add a single row to the BEC table.
TestDataSet ds = new TestDataSet();
TestDataSet.BECRow row = ds.BEC.NewBECRow();
row.BecId = Guid.NewGuid();
row.RunDat = DateTime.Now;
row.MachineName = Environment.MachineName;
row.CrtDat = DateTime.Now;
row.CrtUsr = "pw";
ds.BEC.AddBECRow(row);
// AcceptChanges marks the new row as Unchanged.
ds.AcceptChanges();

The old way

In the ages before LINQ, you got the desired rows out of a typed or untyped DataSet in the following way:

string strFilter = string.Format("{0} = '{1}'", ds.BEC.CrtUsrColumn.ColumnName, "pw");
TestDataSet.BECRow[] rowsBEC = (TestDataSet.BECRow[]) ds.BEC.Select(strFilter);
Console.WriteLine("Row count: {0}", rowsBEC.Length);

As you would expect, the result here is one row. The ugly parts are the magic filter string and the cast for the rowsBEC variable. But if you’re using DataSets, you are used to such code.

The new way?

With LINQ, the code could look much nicer:

var query = from r in ds.BEC
			where r.CrtUsr == "pw"
			select r;
Console.WriteLine("Row count: {0}", query.Count());

That’s better, and I thought I had the job done. But I was wrong, seriously wrong. This code behaves differently: you will also get the deleted rows (when no where clause exists) or, even worse, get a DeletedRowInaccessibleException. This exception occurs when you try to access a property of a deleted row, which will be the case here if there are deleted rows in the DataTable.
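
To make the pitfall visible in isolation, here is a minimal sketch, continuing the setup from above: delete the row and run the same query again.

// Mark the row as deleted; its RowState changes from Unchanged to Deleted.
row.Delete();

// Enumerating the typed table now touches the deleted row, and accessing
// r.CrtUsr on it throws a DeletedRowInaccessibleException.
var query = from r in ds.BEC
            where r.CrtUsr == "pw"
            select r;
Console.WriteLine("Row count: {0}", query.Count());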

Why? Well, if you look at the code of the Select method (for example with the tool Reflector), you will see that there is an implicit filter on the current rows (added, modified and unchanged).
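
Roughly, that implicit filter corresponds to the following check; this is just a sketch of the behaviour, not the actual framework code:

// Sketch: Select() only returns rows whose state is "current".
static bool IsCurrent(DataRow r)
{
    return r.RowState == DataRowState.Added
        || r.RowState == DataRowState.Modified
        || r.RowState == DataRowState.Unchanged;
}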

The solutions

So, how can you fix this issue? There are at least two ways. First, the most obvious one:

var query = from r in (TestDataSet.BECRow[]) ds.BEC.Select()
			where r.CrtUsr == "pw"
			select r;

This solution works, but you have to do the ugly cast. Well, because we have a framework and we subclassed every DataSet class, we could fix that with an override in the generic base class.
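
As a minimal sketch of such a hook, assuming a generic base table class of your own (the names SafeTypedTableBase and CurrentRows are made up for this example; it requires System.Data, System.Linq and System.Collections.Generic):

// Hypothetical generic base class for all typed tables in the framework.
public abstract class SafeTypedTableBase<TRow> : TypedTableBase<TRow>
    where TRow : DataRow
{
    // Exposes only the current rows (added, modified, unchanged),
    // so LINQ queries never touch deleted rows.
    public IEnumerable<TRow> CurrentRows()
    {
        return Select().Cast<TRow>();
    }
}

A query would then read from ds.BEC.CurrentRows() instead of directly from ds.BEC.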

If you don’t have a framework where you subclassed all the DataSet classes, there is an alternative way:

var query = from r in ds.BEC
			 where r.RowState != DataRowState.Deleted && r.CrtUsr == "pw"
			 select r;

The order here is important: you have to check the RowState first, and only after that can you add the real where condition. The && is also important because it short-circuits, so the right-hand side is never evaluated if the row is deleted.
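
For illustration, here is a sketch with the reversed order, which would still fail:

// Wrong: r.CrtUsr is evaluated before the RowState check, so a
// deleted row still throws a DeletedRowInaccessibleException.
var broken = from r in ds.BEC
             where r.CrtUsr == "pw" && r.RowState != DataRowState.Deleted
             select r;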

Conclusion

There are some pitfalls with the DataSet and LINQ, because you wouldn’t expect this behaviour with deleted rows. It doesn’t make much sense, because I have almost never seen a reason why my business logic would want to deal with deleted rows. The behaviour is also unnatural: you think of the DataSet as an in-memory database, and with a database you don’t have to deal with deleted rows unless you are implementing a trigger.

Categories: .NET

Grooming your code base

September 1st, 2011

When you’re doing Test Driven Development (TDD), it’s built into the process: red-green-refactor. Refactoring doesn’t only mean improving your new code; it is also important to make your existing code nicer.

If you are a .NET developer, you should have the Visual Studio add-on ReSharper. With this tool you get a marker bar on the right-hand side of your editor. At the top of this marker bar there is a little square which shows the worst item found (error or warning). In our company we recently introduced this tool, and we also introduced a new rule: "Check in only green code!"
This rule means that the square has to be green: no errors and no warnings are allowed in the current file.

Reality check

New tools and rules are nice, but we all know the pearl of wisdom "A fool with a tool is still a fool". You aren’t immediately a better software developer when you do TDD or use ReSharper; you have to know how to use them.
All of us have a lot of legacy code in our code base, I mean really bad code and a remarkable collection of anti-patterns (god object, Swiss army knife, etc.), and ReSharper shows tons of warnings and errors. Now you could start to fix that code base, but be honest, that would be a project of its own. So, should we just resign, or should we try to improve our code base?

Grooming your code base

When you know Scrum, then you also know the term grooming the product backlog. It is essential for making progress and doing the right things. We should have the same goal for our code base: we want to implement new features fast and change things quickly without the risk of breaking current behaviour.
The word risk is critical here: don’t try to improve your whole code base at once. It would be a huge task (or even a project of its own), and maybe you would fix things which will change in the near future anyway. So, the right way is step by step, just following the boy scout rule: "Leave your code better than you found it".
If you’re doing TDD, try to groom your code base during the refactor step. If not, just clean up the code you touch and make it green (in the ReSharper marker bar). Don’t try to fix a whole class; just improve the methods you need to touch.

Make the field a bit more green

In a classical brownfield project there will be a lot of bad legacy code. So it is important to plan some bigger refactoring tasks to make the code base better and decrease the technical debt. If the code coverage is very low, those tasks are a good opportunity to increase the number of tests and the coverage. But be aware of those tests: they could document wrong behaviour as correct.
