Frameworks simplify coding at the cost of speed and of obscuring the underlying OS. With the passing of Moore's law, do you think there might be a shift away from frameworks?
I suspect one of the reasons Vista was not an outstanding success was that it ran much slower than XP; because computers were no longer improving in speed as dramatically as before, the change felt like a step backwards.
For years, gains in CPU speed outstripped the growing demands of software, so new frameworks that added layers of OS obfuscation and bloat did little harm. Just imagine how fast Windows 95 would run on today's hardware (given a few memory tweaks). Win2K and then WinXP were great improvements, and we could live with them being slower because computers kept getting faster.
However, even years ago, I noticed that programs written with the Microsoft Foundation Classes (MFC) didn't seem quite as crisp as code doing the same thing written directly to the API. Since the proliferation of frameworks like .NET and others can only have made this situation worse, is it possible that we might discover that being able to write code in C directly to the Win32 API (or the equivalent in other OSes) will become a strong competitive advantage, even if it does take longer to write? Or will the trade-off in longer development time just never be worth it?