Just a little post on this topic: as the author of the T4-based hbm2net and of Db2hbm, I think I can offer an opinion from a critic's point of view. As a user I can say that, as an OR/M, NHibernate is a great platform: almost every scenario is covered in depth, and even the infamous "legacy DB from hell" is handled gracefully. So here are some of my thoughts:
This comes from my experience with Db2hbm. The tool can be great if (like me, like everybody in the first half of the learning curve) you have to start with a big database and don't want to do all the work by hand. Db2hbm has a strategy called mapping exceptions that you can use to customize the mapping, but the generated product is still monkey code: you have to abandon the automatic lifecycle in favor of custom mappings. This is bad in terms of code generation, since the difference between a code generator that works 90% of the time and one that works 100% of the time is a lot more than 10%.
This comes from hbm2net. I personally worked on the old version in NHContrib and produced a version based on a T4 template. I never had to modify the template and always got a working class for my mapping, even with exotic associations. This is really pleasant: since we already did the boring job of writing the hbm, creating the class from it by hand would just be painful. In my experience, there is nothing in the class that is not already said in the hbm.
There is nothing painful in writing XML once we enable IntelliSense for it. It is easier to grasp all the mapping details by writing your own mapping, and as soon as you acquire good practice you can leverage some NH mapping sugar to deal with really powerful mapping constructs. With the meta tag, the hbm can even collect extra information to feed more code generators.
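As a small sketch of what this looks like (the `Cat` class, table, and column names here are hypothetical), an hbm.xml file can carry a meta tag that NHibernate itself ignores but a code generator like hbm2net can read:

```xml
<?xml version="1.0" encoding="utf-8"?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2"
                   assembly="MyApp" namespace="MyApp.Domain">
  <class name="Cat" table="CATS">
    <!-- ignored by NHibernate at runtime, consumed by code generators -->
    <meta attribute="class-description">A cat in the household.</meta>
    <id name="Id" column="CAT_ID">
      <generator class="guid.comb" />
    </id>
    <property name="Name" column="NAME" length="64" />
  </class>
</hibernate-mapping>
```

Referencing the schema (urn:nhibernate-mapping-2.2) is also what makes the IntelliSense mentioned above work in Visual Studio.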
Fluent NHibernate is probably the most famous way to avoid writing XML with NHibernate. Even if there is an AutoMapping feature that avoids writing any code, you eventually end up writing things like this:
public class CatMap : ClassMap<Cat>
{
    public CatMap()
    {
        Id(x => x.Id);
        Map(x => x.Name);
        Map(x => x.Sex);
        References(x => x.Mate);
        HasMany(x => x.Kittens);
    }
}
So if writing the class for the HBM is monkey code, here you monkey twice, writing two classes to say the same thing.
ConfORM is the newest, and maybe the most innovative, no-XML mapping strategy. Fabio Maulo probably found some inspiration for it in this post. To me, the main idea in ConfORM is the pattern strategy, which helps you write complex mappings without ever touching an XML file. The XML layer is avoided even internally, so building a session factory is noticeably faster. Since Fabio is the lead developer of NHibernate, some of these concepts will probably appear in the next NH version.
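A minimal sketch of the ConfORM flow, from my memory of its API at the time (so treat the exact names as approximate, and `Cat` as a hypothetical domain class): you describe the domain to an ObjectRelationalMapper, compile the mapping in memory, and feed it straight to the NHibernate configuration, with no XML round trip.

```csharp
using ConfOrm;
using ConfOrm.NH;
using NHibernate.Cfg;
using NHibernate.Cfg.MappingSchema;

var orm = new ObjectRelationalMapper();
orm.TablePerClass<Cat>();              // declare the mapping strategy for the hierarchy

var mapper = new Mapper(orm);
// compile the mapping entirely in memory, no hbm.xml involved
HbmMapping mapping = mapper.CompileMappingFor(new[] { typeof(Cat) });

var configuration = new Configuration();
configuration.AddDeserializedMapping(mapping, "Domain");
```

The pattern strategy plugs into this pipeline: pattern appliers recognize shapes in the domain model and decide the mapping, which is what lets complex models stay XML-free.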
I like code-first OR/M. But in real life this has happened to me zero times. Having a legacy DB (well, a DB-first approach, which is the same thing) is quite common for me, and I think for a lot of people. Writing the mapping in code is then not an option, or at least does not make sense as a replacement for hbm: we eventually end up duplicating the same concepts. ConfORM by design avoids composite keys, which are really common in legacy schemas. We need something that can easily grab the pattern the DB designer had in mind and port it to our NH solution.
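For the record, this is the kind of composite-key mapping a legacy schema forces on us (table and column names here are hypothetical), which hbm.xml handles directly:

```xml
<class name="OrderLine" table="ORDER_LINES">
  <composite-id>
    <key-property name="OrderId" column="ORDER_ID" />
    <key-property name="LineNumber" column="LINE_NO" />
  </composite-id>
  <property name="Quantity" column="QTY" />
</class>
```

A DB-first tool needs to recognize patterns like this one and emit the mapping for us, instead of ruling them out by design.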