Embarking on another new server-side project (Java based), and the question here is whether to stick to a pure J2EE approach or to mix and match the best tools from open source frameworks. After J2EE 1.4, we are seeing a plethora of technology choices: Spring (it's borderline as to exactly when it came out, but its popularity clearly shot up after 1.4), SCA, which seems to overlap with both J2EE and Spring, and many, many other open source Java frameworks. Maintaining these new frameworks over the long term is going to be difficult. Developers do like the best tool for the job and always seem to welcome newer choices.
In my opinion, J2EE hasn't kept pace with the trends in the industry, and won't in the future either.
Web services, RSS, XML in general, ORM approaches: it has lagged on all of them. J2EE was advertised as something that would let you commit to a technology standard and thereby reduce the risk of committing to a single vendor of app-platform technology. But that never really worked out, and now it is clearly no longer realistic.
What J2EE failed to recognize was that in deploying server-based apps, the vendor of the so-called "application platform" is not the sole, or even the dominant, investment. There are databases involved, and sometimes queuing systems. It's rare to find a shop that doesn't use Oracle-specific features in administering or operating its database, if it has Oracle DB. Or WebLogic-specific features, if it has WebLogic Server.
Beyond the "app server + db" pair, there are source code control systems, testing frameworks and tools, requirements management systems, rules engines, development tools, search/index engines, performance analysis tools, and on and on. You're going to end up committing to specific vendors, or at least specific implementations, for many of these things.
Twenty years ago, in the pre-"Open Systems" days (I know I sound like an old codger), committing to an app platform meant committing to that vendor for everything. If you bought your server from DEC, then you bought DEC networking, DEC dev tools, and DEC databases. If you bought a server from Apollo, then you bought Apollo terminals, Apollo networking gear, Apollo compilers, and so on. This had two adverse effects: (a) prices were high, because customers couldn't switch compilers, networking gear, and so on; once you committed to one piece, you were captive; and (b) there was a risk that the supplier would go out of business and you'd be left stranded on an abandoned platform. These were real problems.
The promise of J2EE, even when it was launched 10 years ago, was based on addressing those same problems: keeping "app platform" prices low, and keeping the risk of being abandoned low.
But Linux, open source, and the intertubes have done much more to alleviate those risks than J2EE ever did. Today you can get Tomcat for free, an x64 server for almost nothing, and open-source source control and testing frameworks. You're still at risk of a project being abandoned, but that's just the way things are.
So the originally stated reason for J2EE's existence isn't really valid anymore. Even so, J2EE is here, known, and common. So, is it worth using?
No, I'd say it should be judged on its merits like anything else. Is Hibernate better than EJB3? Maybe. Is Spring a better design metaphor than session beans? Arguably. But of course it depends on the project and the shop.
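To make the "design metaphor" point concrete, here is a minimal, hand-wired sketch in plain Java (no Spring, no container; the class names are invented for illustration). The Spring metaphor is that components are ordinary POJOs handed their collaborators from the outside, whereas the classic session-bean metaphor has components living inside, and looked up from, a managed container:

    // Invented names for illustration only.
    interface GreetingService {
        String greet(String name);
    }

    // An ordinary POJO implementation. Under Spring this would be a bean;
    // as an EJB session bean it would instead be a container-managed
    // component (e.g. annotated @Stateless and obtained via JNDI or
    // injection).
    class ConsoleGreetingService implements GreetingService {
        public String greet(String name) {
            return "Hello, " + name;
        }
    }

    // The client receives its collaborator through the constructor rather
    // than looking it up from a container: the inversion-of-control idea
    // that Spring popularized.
    class GreetingClient {
        private final GreetingService service;

        GreetingClient(GreetingService service) {
            this.service = service;
        }

        void run() {
            System.out.println(service.greet("world"));
        }
    }

    public class WiringDemo {
        public static void main(String[] args) {
            // Hand-wiring here; Spring would do this step from
            // configuration, and a J2EE container would do it via
            // deployment descriptors.
            GreetingClient client =
                new GreetingClient(new ConsoleGreetingService());
            client.run();
        }
    }

Spring automates that last wiring step from configuration; the important part is that the objects themselves stay plain Java, which is why they are easy to unit test outside any container.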
You should make that decision on your own, and on the technical merits. I don't think J2EE should get any special consideration.