Even though you probably haven’t noticed (or cared), I want to apologize for my lack of posting over the last few months.
Sure, I’ve been extremely busy, but in many ways I feel like I’ve been a horrible member of our software community, unfairly learning without sharing my findings with everyone else.
So here I am again, promising to try to contribute more in the future, in the hope of encouraging others to do the same.
There are several things I feel I should write about (including the amazing software architecture workshop that took place last month in Cortina d’Ampezzo, Italy) but today I will just mention the more recent Microsoft Architect Forum 2006 which was superbly organized by Bill O’Brien here in Dublin.
I really couldn’t miss the full-day event since Beat Schwegler and Ingo Rammer were set to dig deep into a topic that I simply can’t afford to ignore these days: Software Factories.
Beat and Ingo were excellent in articulating Microsoft’s vision of how to combine model-driven development, guidance, frameworks and tools together with the objective of creating one or more product lines able to systematically exploit commonalities among the members of software product families.
Bill has kindly made available all the slides in this post so I won’t go into the details of each session.
During his analysis, Beat emphasized that we should consider that, in many cases, up to 70% of the cost of a software project goes into operations rather than pure development; as a consequence, he recommended that we should start adopting model-driven development not only for code generation, but also for requirements, deployment, configuration, and, more generally, for all other activities that are involved during the full lifecycle of a project.
While in principle I don’t disagree with this thought, today I find it quite unlikely that different stakeholders (business analysts, network engineers, enterprise architects, solutions architects, developers, security specialists, QA testers, etc.) would agree to use the current incarnation of tools and designers and be confined to a single hosting environment, namely Visual Studio Team System.
But hey, in fairness we are talking about a medium-to-long-term goal here, so I will surely change my mind on this point if and when Microsoft (or I :-)) gets there.
Ingo was really impeccable throughout the day and used several examples to illustrate the capabilities of the DSL tools. In one specific instance, however, I could not help but notice that the version of the domain-specific language that he used did not provide a particular option he needed for his demonstration; he then resorted to manually modifying the generated code to accomplish his objectives. Tut-tut Ingo, you are not supposed to do that ;-).
I know I know, it was just a demo, and I really sound way too fussy.
It would be good however if somebody out there explained that, in the real world:
- We obviously cannot modify generated code, since the DSL models will diligently overwrite everything at the next transformation; as a consequence:
- Put a comment header in each template to explain that “This code has been generated by a tool…do not modify…etc.”
- Do not put the generated code under source control. That code is a dispensable artifact. Put the designer file and the templates under source control instead.
- Modifying a template is clearly a better option; in general, however:
- You need to put it under source control
- You need to deploy it in a centralized location so that it can be shared across different applications in the same family
- A change to a template must not be made lightly, as it could easily break all the other existing solutions that use the same template.
- You need to unambiguously identify the version in use (you may put a version number in the header, or even resort to changing the file name if changes are substantial and break existing applications that use it)
- A change in the template that breaks existing applications may trigger the beginning of a new product line, particularly if you can’t accomplish 100% code generation of a solution and you can’t afford to retrofit all the existing solutions.
- If we are unable to build systems that achieve 100% code generation (which happens if the degree of variation in a product family is not completely known):
- We need careful guidance (patterns) to understand how to happily write manual code beside generated code. How do we create the extension points? What do we put in the underlying frameworks and what do we keep in a template? Who should make the changes in the first place? Is it up to us to rediscover these tricks over and over? Sure, we can use base classes, template methods, etc. But no, the one-size-fits-all solution of using partial classes does not work all the time (wake up people, we often deal with XML files with little or no control over the reader…has anyone heard of the app.config file, for example?)
- There is an evident abstraction leak, as developers need to understand the generated code since they will be writing code beside it and even debugging it if necessary (tip: keep templates “thin” by leveraging rich frameworks instead). By the way, the idea that the “smartest” people will write DSLs and templates while everyone else will just use them is flawed in my opinion, but I will definitely save that explanation for another post.
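To make the points above concrete, here is a minimal sketch of the discipline I’m describing, written in plain Python rather than the actual DSL Tools/T4 machinery (all names — `generate_entity`, `CustomerBase`, the version number — are hypothetical): the generator stamps its output with a “do not modify” header and a template version, and the only sanctioned extension point is a hand-written class in a separate, source-controlled file.

```python
# Sketch only -- illustrates the generated/manual split, not Microsoft's tooling.

# The header every generated file receives: it warns readers off editing
# and unambiguously identifies the template version that produced it.
GENERATED_HEADER = (
    "# <auto-generated>\n"
    "#     This code has been generated by a tool (template version 1.2).\n"
    "#     Changes to this file will be lost when the code is regenerated.\n"
    "# </auto-generated>\n"
)

def generate_entity(name, fields):
    """Emit a base class from a (hypothetical) model: a name plus field list."""
    lines = [GENERATED_HEADER, f"class {name}Base:"]
    lines.append("    def __init__(self):")
    for field in fields:
        lines.append(f"        self.{field} = None")
    # Extension point: the generated code declares a hook it does not
    # implement, so manual logic lives in a subclass, never in this file.
    lines.append("    def validate(self):")
    lines.append("        pass  # override in the hand-written subclass")
    return "\n".join(lines) + "\n"

# Regeneration simply rewrites the whole file -- which is exactly why the
# generated output is a dispensable artifact and stays out of source control.
code = generate_entity("Customer", ["name", "email"])

# customer.py (hand-written, under source control, never overwritten):
#
# class Customer(CustomerBase):
#     def validate(self):
#         assert self.email is not None
```

Only the template (the `generate_entity` logic here) and the hand-written subclass go under source control; regenerating `code` is always safe because nothing manual lives in it.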
I’m exhausted already and I’m aware that this list is far from complete; and I haven’t even started talking about how I see we could version a domain-specific language or an entire product line using the current tools. Or maybe I should attempt to figure out how I would assemble effective teams within this new paradigm. Has anybody discussed yet how to deal with developers’ natural resistance to model-driven development? How should we address their concerns about domain-restricted job specialization and the rightful dislike for anything that contains the word factory in it? Or has everybody agreed that this is not an issue?
As often happens, the real problem is that the tools are getting there, with their capabilities and limitations, and we really need to go beyond the simple APIs to be good at using them.
But perhaps it is unfair to ask a toolmaker to tell us how to excel at using those tools.
After all Mozart wasn’t a piano maker, right ;-)?