Possible Duplicate:
What is your longest-held programming assumption that turned out to be incorrect?
What did you once believe (that is topical for SO) that you no longer do?
Why did you change your mind?
Me: I grew up on C. I thought that pointers – and manual control of allocation sizes and so forth (saving a few bytes here and there) – were important, and that scripting or interpreted languages (e.g. Java) were inefficient because they hid this.
Man, was I wrong (in practical terms). In real life (working on production systems with tens of millions of users), this has been totally insignificant compared to the real speed and memory bottlenecks (bad SQL, network I/O, race conditions, caching, etc.); language-level stuff has essentially never been worth any sacrifice.
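To give a rough sense of the scale difference, here is a back-of-envelope sketch in Java. The class name, latency figure, and per-row savings figure are assumptions for illustration, not measurements from any real system:

```java
// Rough comparison: cost of an "N+1" query pattern vs. the total payoff of
// a pointer-level micro-optimisation. All numbers below are assumed.
public class BottleneckSketch {
    public static void main(String[] args) {
        int rows = 10_000;               // rows fetched for one page
        double roundTripMs = 1.0;        // assumed DB/network round trip (~1 ms)
        double perRowCpuSavingNs = 5.0;  // assumed saving from a low-level trick, per row

        // One query per row ("N+1") vs. one batched query for everything.
        double nPlusOneMs = rows * roundTripMs;
        double batchedMs = roundTripMs;

        // Total saved by micro-optimising the inner loop, converted to ms.
        double cpuSavingMs = rows * perRowCpuSavingNs / 1_000_000.0;

        System.out.printf("One query per row (N+1): %.0f ms%n", nPlusOneMs);
        System.out.printf("Single batched query:    %.0f ms%n", batchedMs);
        System.out.printf("Pointer-level savings over all rows: %.2f ms%n", cpuSavingMs);
    }
}
```

With numbers anywhere near these, fixing one bad query pattern buys back seconds, while even a free micro-optimisation of the inner loop buys back a fraction of a millisecond.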
Not having to deal with that pointer-level stuff any more makes my coding much more enjoyable, since I get to write much more about what I want to happen and much less about how it runs.
IOW, high-level programming is good.
Caveat: This applies only to systems where you have the luxury of modern processors and RAM sizes. Operating on, e.g., an old Palm Pilot with a Motorola DragonBall processor, one starts running into processing constraints. Sorry to all the embedded systems programmers out there. ;)