Model Driven Architecture (MDA) is the idea that you create models which express the problem you need to solve in a way that is free of any (or at least most) implementation technology, and then you generate the implementation for one or more specific platforms. The claim is that working at a higher level of abstraction is far more powerful and productive. In addition, your models outlive technologies, so when your first language or platform becomes obsolete, you still have something you can carry into your next-generation solution. Another key claimed benefit is that much of the boilerplate and "grunt work" can be generated: once the computer understands the semantics of your situation, it can help you more.
Some claim this approach is 10 times more productive, and that it is the way we will all be building software in 10 years.
However, this is all theory so far; I am wondering what the outcomes are when the rubber meets the road. Also, the "official" version of MDA comes from the OMG and seems very heavyweight. It is built around UML, which might be considered good or bad depending on who you ask (I'm leaning towards "bad").
But, in spite of those concerns, it is hard to argue with the idea of working at a higher level of abstraction and "teaching" the computer to understand the semantics of your problem and solution. Imagine a series of ER models which simply express facts about your domain, and then imagine using those to generate a significant portion of your solution, first in one set of technologies and then again in another set of technologies.
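To make that concrete, here is a minimal, hypothetical sketch of the idea (this is not any official OMG/MDA tooling, and the model format is something I made up for illustration): one platform-independent entity model, with two small generators that target different "platforms" from the same model.

```python
# Hypothetical sketch: a platform-independent model plus two generators.
from dataclasses import dataclass

@dataclass
class Field:
    name: str
    type: str           # abstract type: "int", "string", "date"
    required: bool = True

@dataclass
class Entity:
    name: str
    fields: list

# The platform-independent model: just facts about the domain.
CUSTOMER = Entity("Customer", [
    Field("id", "int"),
    Field("name", "string"),
    Field("signup_date", "date", required=False),
])

SQL_TYPES = {"int": "INTEGER", "string": "VARCHAR(255)", "date": "DATE"}
PY_TYPES = {"int": "int", "string": "str", "date": "datetime.date"}

def to_sql(entity: Entity) -> str:
    """Generate a CREATE TABLE statement from the abstract model."""
    cols = [
        f"  {f.name} {SQL_TYPES[f.type]}" + ("" if f.required else " NULL")
        for f in entity.fields
    ]
    return f"CREATE TABLE {entity.name} (\n" + ",\n".join(cols) + "\n);"

def to_python(entity: Entity) -> str:
    """Generate a Python dataclass definition from the same model."""
    lines = ["@dataclass", f"class {entity.name}:"]
    for f in entity.fields:
        py = PY_TYPES[f.type]
        lines.append(f"    {f.name}: {py}" + ("" if f.required else " | None = None"))
    return "\n".join(lines)

if __name__ == "__main__":
    print(to_sql(CUSTOMER))
    print()
    print(to_python(CUSTOMER))
```

Swapping in a third generator (say, for a different database or language) would not touch the model at all, which is the part of the promise I find appealing; whether it holds up at real-world scale is exactly what I'm asking about.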
So, I'd love to hear from people who really are doing MDA right now ("official" or not). What tools are you using? How is it working out? How much of the theoretical promise have you been able to capture? Do you see a true 10X effectiveness increase?