What was the single thing you learned (either in classes or during work) that felt most like scales falling off your eyes?

For me, it was a lecture about microcode, because that filled the gap of understanding between electrons flowing through transistors to form logic gates, and assembler programming. It finally made me feel that I understood completely how a computer works, on all levels.

Related question: What is the single hardest programming skill or concept you have learned?

A: 

That I should learn something practical: take care of animals, grow plants, or learn to survive. When I understood how computers worked - the same way you did - I also understood that I had to get out of this electron-fantasy very soon :D

Skuta
If you're participating on this website, I'd say it means you didn't get out after all :)
Michael Borgwardt
Friends got me a plant!
Skuta
+1  A: 

Learning about programming language concepts - for instance static/dynamic linking, parsing, stacks and heaps - and the inner workings of how computer languages function.

krosenvold
+21  A: 

Polymorphism - suddenly Object-Orientation made sense!
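
A minimal sketch of the idea in Python (the Shape/Circle/Square names are made up purely for illustration) - one call site, and each object answers it in its own way:

    class Shape:
        def area(self):
            raise NotImplementedError

    class Circle(Shape):
        def __init__(self, r):
            self.r = r
        def area(self):
            return 3.14159 * self.r ** 2

    class Square(Shape):
        def __init__(self, side):
            self.side = side
        def area(self):
            return self.side ** 2

    # One loop, one call site - each object supplies its own behaviour.
    for shape in [Circle(1.0), Square(2.0)]:
        print(shape.area())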

Brabster
+5  A: 

CPU infrastructure and machine organisation: how the bits got from memory into the CPU, and what happened during their execution. Understanding that was a turning point for me.

LenW
+66  A: 

I think the first time I realised "Wow, the computer does whatever I tell it!", followed by the first time I realised "Oh, it really does exactly what I tell it, not what I want it to do."

That and, like you, when I realised I had learnt enough to have a rough idea of how a computer works, all the way from electrons to user interfaces. I find having the understanding of the levels below the one you are working on to be very helpful, especially when things don't happen as you expect - you are then able to reason about it from first principles and often work out why the machine is doing what it's doing. Knowing how the computer works down to physical processes also helps to reinforce that it's a machine and prevent one from anthropomorphising it - intentionally or otherwise.

lemnisca
Oh yes - I read somewhere some guy working on the first computers wrote about the moment he realized that he'd be spending much of his professional life from then on fixing mistakes in his own programs.
Michael Borgwardt
Yes - this is the biggest stepping stone for most people. They think that when there is an error in whatever they are doing, it is due to some random chance. With probability infinitesimally close to 1, it was actually exactly how their crappy app was written.
Overflown
+53  A: 

Pointers.

Once I figured out how to use a location, rather than the contents of a location, a lot of things became clearer to me.
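
C pointers are the real subject here; as a rough Python analogue only, the closest equivalent of "passing the location rather than the contents" is passing a reference to a shared mutable object:

    # Rough analogue only - Python has no raw pointers.
    # Passing the "location" (a shared mutable object) lets the callee
    # change what the caller sees; passing a copy of the contents does not.
    def bump_shared(counts):      # receives a reference to the same list
        counts[0] += 1

    def bump_copy(value):         # rebinding a local copy of the number
        value += 1

    shared = [0]
    bump_shared(shared)
    print(shared[0])   # 1 - the caller sees the change

    plain = 0
    bump_copy(plain)
    print(plain)       # 0 - the caller's value is untouched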

Abizern
In what way? I'm curious as to the meaning of this.
Well, for one thing, if you don't understand pointers, you're not likely to understand how object references work...and hence you're not likely to understand OO very well.
Kyralessa
Also - passing values or references between functions. In particular - passing objects with multiple levels of indirection.
Abizern
Oh, DavidK - you might know me as Stompy on LFGSS - so we have something to talk about the next time we go on a ride.
Abizern
@Abizer: That will make an entertaining ride. I know pointers I just wanted to see the definition of "lot of things" expanded.
I'll bug you on LFGSS, I just know you read that place more ;)
+110  A: 

When I stopped listening to the lecturer telling me to think of data objects as "like cars, made of components" and started thinking of them as custom data types with their own commands. Suddenly I could program Java.
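
A tiny Python sketch of that shift - a custom data type bundling its data with its own commands (the BankAccount example is invented purely for illustration):

    class BankAccount:
        """A custom data type: its data plus the commands that belong to it."""
        def __init__(self, balance=0):
            self.balance = balance

        def deposit(self, amount):
            self.balance += amount

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    account = BankAccount(100)
    account.deposit(50)     # a command the type itself provides
    account.withdraw(30)
    print(account.balance)  # 120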

Richeh
Wish I could vote this up more than once!
Software Monkey
:) Perfect answer!
furtelwart
Totally agree - sometimes I look back and think perhaps my lecturers didn't know what the hell they were talking about
Ben Aston
+1, Many Interface tutorials use the Car metaphor as well. Interface Car, Class Ford : Car, Class Mazda : Car. Why would I create a class for each individual brand? Completely screwed up my understanding of interfaces.
TT
Indeed, looking through the things they try to teach you is the biggest step -- and the hardest, because it makes it hard to talk to other students.
Tetha
Can anyone cite any articles, blogs, etc. that expand on thinking about objects in terms of custom data types? As someone new to OOP this concept makes a lot more sense to me and it'd be nice to see some tutorials that taught OOP this way.
vg1890
Also, keep in mind that it is a difficult job to be a teacher, especially with 100 students in a class. EVERYBODY LEARNS DIFFERENTLY!
Redbeard 0x0A
note that the "custom data types" way of looking at objects is valid only in class-based languages like Java, not so much in prototype-based languages like JavaScript
Michael Borgwardt
+14  A: 

"Wow, can I make my own class!?!!?"

IceHeat
That is a great moment indeed!
discorax
Especially after having them show up in Intellisense, just like the "real" built-in classes.
IceHeat
+9  A: 

Storage. Once it twigged that each bit of data actually needed to live somewhere, in memory or on disk, and not just mysteriously exist somehow, things fell into place, programming-wise.

Jonathan
+40  A: 

Recursion.

Specifically, when I learned in university that one could implement a path-finding algorithm with a simple recursive function. I'd previously thought it was only useful for things like computing factorials.

It completely opened my mind.
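
A minimal sketch of that kind of recursive path-finder, in Python on a made-up grid - each call just asks "can I reach the goal from here?":

    # '#' is a wall, 'S' the start, 'G' the goal.
    grid = [
        "S..#",
        ".#.#",
        "...G",
    ]

    def find_path(r, c, visited=None):
        if visited is None:
            visited = set()
        if r < 0 or r >= len(grid) or c < 0 or c >= len(grid[0]):
            return False                      # off the map
        if grid[r][c] == "#" or (r, c) in visited:
            return False                      # wall, or already tried
        if grid[r][c] == "G":
            return True                       # reached the goal
        visited.add((r, c))
        # Try each neighbour; the recursion carries the bookkeeping a loop
        # would need an explicit stack for.
        return any(find_path(r + dr, c + dc, visited)
                   for dr, dc in [(1, 0), (-1, 0), (0, 1), (0, -1)])

    print(find_path(0, 0))   # True - there is a route from S to G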

csl
To understand recursion, you must first understand recursion.
Dave Markle
To understand recursion make sure you have a terminating case.
Eugene M
// upvoted for the comments.
EightyEight
And the sad thing is that although everyone sees their first recursion when the professor says "you can compute factorials with it", it's actually completely inappropriate to use it for something a FOR loop is designed for, and the real uses only come much later
Slartibartfast
+19  A: 

GIT

has changed the way I see and do my work more than any other tool. No longer treating version control as a backup tool, and having the whole power of your own repository at hand, is very enlightening:

  • Do everything in private - even large changes over a large number of patches - without anyone even having a chance to see your stupid mistakes
  • If in doubt, add one more branch
  • Move patches and branches around like Lego bricks
  • Decide not only what the code looks like but also what the history looks like
  • Have line-by-line control over which of the changes actually make it into the commit
  • Reorder, rebase, squash and split patches with ease
  • Have a set of damn powerful tools that build on the basic version control functionality

While the learning curve is a bit steeper than that of other tools, the view from the top is much better.

But you will get so much better if everyone sees your stupid mistakes!
PeterAllenWebb
+28  A: 

Work.

Once I started working, applying my theoretical knowledge, I realised how little I knew about the important things in this business.

For example, being a DB wiz means nothing unless you have the skills to extract the real requirements from a client, and ignore what he has scribbled on the back of a fag packet.

Another example is when I learnt that just because I've done Task X a million times and have considered all the expected consequences, I should still take a backup; I found out just how many unexpected consequences there could be, and how incredibly likely they are to occur. A senior colleague once told me: 'Never take a step forwards unless you are reasonably confident that you can take the same step backwards'.

CJM
Million to one chances show up 9 times out of 10. -- some Terry Pratchett character.
Zan Lynx
Fag packet is a slang term for a packet of cigarettes widely used throughout the United Kingdom. Learn something new everyday. I was trying to figure out what kind of "packet" could be considered gay.
Simucal
As for that senior colleague's advice, I bet it was upsetting when they first heard about the 2nd law of thermodynamics.
Daniel Earwicker
Simucal - sorry, a translation would have removed the implied disdain in my story. An alternative would be to write 'on the back of a torn-up beer mat'. In reality, sometimes we don't even get requirements on a fag packet!
CJM
+1  A: 

Application of the Strategy pattern in some PHP code (then I started to look into design patterns).

This one opened my eyes to object-oriented programming, which I had earlier thought was just stupid parent-child relations.
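
For anyone who hasn't met it, a minimal Strategy sketch (in Python rather than PHP, with invented names) - the algorithm becomes an object you can swap out:

    # The sorting "strategy" is a swappable object.
    class ByPrice:
        def key(self, item):
            return item["price"]

    class ByName:
        def key(self, item):
            return item["name"]

    def list_products(products, strategy):
        return sorted(products, key=strategy.key)

    products = [{"name": "mouse", "price": 20}, {"name": "keyboard", "price": 45}]
    print(list_products(products, ByPrice()))   # cheapest first
    print(list_products(products, ByName()))    # alphabetical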

+13  A: 

"A computer can't do anything complicated, it can just do simple things lots of times fast." That was a big starting point for me in programming.

Also recursion: although I learned about it right at the start, I found I needed to go back over it and learn it in more depth as the rest of my programming got better. From time to time I realise it's time for me to go and relearn recursion.

glenatron
Seems like a strange distinction to me - of course they can do complicated things, they just do it by doing simple things lots of times and fast.
Daniel Earwicker
Earwicker, I think that is the point. A daunting project is suddenly less daunting when broken down into a series of small stages.
CJM
+4  A: 

Realizing that what you produce lives longer than you think - which is a nice way of saying people a long long time from now will be dealing with what you write today.

I got a call in 2002 from someone at a company I left in 1989 pleading for help with a database application I wrote on a Sun workstation in C & UNIFY. It had since been ported to VAXen/INGRES and then to Windows 2000/SQL Server. They were STILL using it and trying to build a web app out of it!

n8wrl
A: 

Not quite a duplicate, but very closely related to this post.

chills42
+21  A: 

I have to say, top of the list has to be function pointers in C.

I'd missed the class and copied the notes from a friend. I wrote an example and fireworks went off in my head; suddenly it occurred to me - all at once - just how incredibly powerful they were. With these you could write ANYTHING: compilers, operating systems, etc. It completely changed how I looked at code, and in my mind made everything achievable.

Later that year I implemented a project using what I now realise was a primitive form of Polymorphism. It took less than a quarter of the code the rest of the class used. All thanks to (deep announcer's voice) the power of Function Pointers.
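
The C syntax doesn't carry over directly, but the same trick - dispatching through a table of functions - can be sketched in Python with first-class functions (names invented for illustration):

    # Dispatch through a table of functions: the trick C function pointers
    # make possible, giving a primitive form of polymorphism.
    def draw_circle(shape):
        print("circle of radius", shape["r"])

    def draw_square(shape):
        print("square of side", shape["side"])

    draw_table = {"circle": draw_circle, "square": draw_square}

    shapes = [{"kind": "circle", "r": 2}, {"kind": "square", "side": 3}]
    for shape in shapes:
        draw_table[shape["kind"]](shape)   # call through the table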

Years later - after learning some OO - on the job I was introduced to UML and Design patterns. Suddenly I had a vocabulary that allowed me to communicate all the cool ideas I was having. E.g. instead of saying

  • "What we need is a thing and you'd have a couple of different ones, but like you'd only be using one at a time and what it does is create the specific objects for you and don't worry it'll be fine I know what I'm talking about"

I could say

  • "We need a Factory"

and everyone would know what I was talking about (and if not, I'd hand them my design patterns book, and they could RTFM)

Binary Worrier
+13  A: 

When I needed to solve a problem and realized I could do it with a program.

I once had a math professor tell me to stop writing programs to prove theorems; he said I was "missing the point". I was a freshman and very proud of myself for using my CS skills to make my life easier.
Mark Robinson
+11  A: 

When I read how LISP works. It was beautiful.

Varun Mahajan
+3  A: 

Wait, you mean I didn't really need to learn calculus for web development jobs?

BenMaddox
I thought about this one a bunch as well. Though the longer I'm in the industry the more I appreciate the calculus I did learn. Not that I ever use it but the problem solving and the way you tackle a hard math problem can be brought over into the development world.
Joshua Hudson
Oh, it is helpful, but not needed. I'd say the problem solving aspect was mostly learned through algebra and geometry for me. I guess my point was I didn't *need* to take it that far.
BenMaddox
+7  A: 

"Hello World!"

+26  A: 

Functional Programming. Doing Miranda and Haskell changed the way I think about programming and solving certain kinds of problem.

Using my second programming language (Fortran -- don't laugh) also opened my mind.

Stephen Darlington
+6  A: 

Computer Science is not about digital computers and it is not a science.

It was said a long time ago. But most programmers don't get it.

Our design of this introductory computer-science subject reflects two major concerns. First, we want to establish the idea that a computer language is not just a way of getting a computer to perform operations but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Second, we believe that the essential material to be addressed by a subject at this level is not the syntax of particular programming-language constructs, nor clever algorithms for computing particular functions efficiently, nor even the mathematical analysis of algorithms and the foundations of computing, but rather the techniques used to control the intellectual complexity of large software systems. (SICP)

J.F. Sebastian
Related http://www.cs.uni.edu/~wallingf/blog/archives/monthly/2008-09.html#e2008-09-16T21_43_48.htm
J.F. Sebastian
"Thus, programs must be written for people to read, and only incidentally for machines to execute." Try telling that to the compiler!
Daniel Earwicker
Science is developing an understanding of the universe through controlled repetition. That's what CS does. It isn't any less a science because the subset of the universe it examines is manmade.
Robert Rossney
@Robert Rossney: What you're saying doesn't contradict my point (maybe I've expressed it poorly); see http://uk.youtube.com/watch?v=zQLUPjefuWA
J.F. Sebastian
I quote that from SICP all the time!
BobbyShaftoe
Also, I don't think Computer Science is exactly a science, as science is really about explanation - you have testable hypotheses and so forth. Computer Science is more aligned with mathematics and logic than science.
BobbyShaftoe
+7  A: 

I learned to program early on - probably when I was 12 back in 1987. I just kept programming and writing little things that suited my purposes or studies. I'd never considered myself an actual developer until at one point during a co-op term it suddenly occurred to me that professional programs (Word / Lotus 123 / Doom) all worked by using the same ints and float variables I was using, and that I actually was a developer.

I remember the thought struck me so hard that I stopped and said 'huh.' For me, that's a massive emotional outburst.

Kieveli
+1  A: 

My first introduction to patterns. Finally I figured out how not to reinvent the wheel every week. I finally figured out what colour it should have been in the first place.

Pete OHanlon
+9  A: 

Realizing that there are unsolvable problems. And then, realizing that one can approximate solutions to these problems.

mepcotterell
A: 

IOC Containers. All of a sudden your application is beautiful - less coupled, easier to maintain and a lot easier to write in the first place! Ban Spaghetti!! Microsoft's Unity is easy to use, but NInject is superb. There are loads out there (StructureMap, Castle Windsor, AutoFAC etc) and it doesn't really matter which one you use - just use one.

Stuart Harris
+5  A: 

Digital Logic Design class... The professor said that the XOR and AND gates form a universal set. In other words, you could build any computer from a combination of them. Pretty mind-blowing! :)
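
A quick sanity check of that claim, sketched in Python with the gates as functions on 0/1 bits (NOT additionally assumes a constant 1 is available, as it is in a real circuit):

    # XOR and AND as primitive gates; NOT and OR built from them.
    def XOR(a, b): return a ^ b
    def AND(a, b): return a & b

    def NOT(a):    return XOR(a, 1)                  # needs the constant 1
    def OR(a, b):  return XOR(XOR(a, b), AND(a, b))  # no constant needed

    # Verify the derived gates against their truth tables.
    for a in (0, 1):
        for b in (0, 1):
            assert OR(a, b) == (a | b)
        assert NOT(a) == (1 - a)
    print("NOT and OR recovered from XOR and AND")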

NAND *is* "not AND" :)
Barry Brown
NAND alone is sufficient, but XOR + AND is also universal
Kim
+10  A: 

My moment was when I was reading the S&ICP book: I realized that lambdas in the presence of closures allow you to implement any data structure you like. Suddenly, cons, car and cdr were not the essence of Lisp, as we were taught back in Russia. Lambda was.

(not that this affected my day-to-day C++ slog, but it was beautiful)
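
The classic demonstration, sketched here in Python rather than Scheme - a cons cell built out of nothing but a closure:

    # A pair ("cons cell") made of nothing but a closure.
    def cons(a, b):
        def pair(which):
            return a if which == 0 else b
        return pair

    def car(p): return p(0)
    def cdr(p): return p(1)

    # Enough to build lists, trees, anything - the data lives in the closure.
    p = cons(1, cons(2, cons(3, None)))
    print(car(p), car(cdr(p)), car(cdr(cdr(p))))   # 1 2 3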

Arkadiy
+68  A: 

Humility. Going into my first 9 to 5 development job, about twelve years ago, thinking it was going to be a cakewalk and quickly getting put in my place. It is the realization that knowledge is not the same as experience.

joseph.ferris
I would vote this up a hundred times if I could.
Joshua Hudson
Some of us had the opposite experience.
Tmdean
You realized that knowledge is the same as experience? Good luck with that!
joseph.ferris
+6  A: 

Deterministic finite state automata and the realization that there is a direct transformation that goes from a drawing with circles and arrows to logic gates - this was the zen moment of knowing that software and hardware are one.
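
A minimal sketch of those circles and arrows as a plain transition table (the automaton here - "even number of 1s" - is made up for illustration); the same table is exactly what a circuit would encode:

    # A two-state DFA accepting bit strings with an even number of 1s.
    transitions = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd",  "0"): "odd",  ("odd",  "1"): "even",
    }

    def accepts(bits, state="even"):
        for bit in bits:
            state = transitions[(state, bit)]
        return state == "even"

    print(accepts("1101"))  # False - three 1s
    print(accepts("1001"))  # True  - two 1s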

plinth
+1  A: 

It was at my first real programming job when "scope" clicked. I had a basic understanding of it, but wasn't optimizing for it in high school. My job (right after high school) made sure that I knew what scope was.

I had a basic understanding of all the OOP concepts. However, until scope clicked, I wasn't able to dive in and start running with development.

Jeremiah
+4  A: 

Probably work, and working with people. I think the hardest thing ever is to work with people in a group and modify pre-existing code within a tight deadline. In school, all we were taught was "Make a student register application", which only took 100 lines and rarely gave any insight into large-scale applications, maintenance, or working with groups.

A: 

That all the theoretical stuff I learned in CS doesn't necessarily apply that well in the real world. Yup, normalization should always be done to database tables... wait?!!! All your data is in this one table? What is this column doing all over the place?

We should use UML for all the documentation - wait, what, you guys have never heard of UML? Is this some sort of twilight zone?

And that trying to write a good program is about more than finding the algorithm with the best big-O, or just remembering the syntax really, really well.

melaos
+9  A: 

When I understood that data and instructions were both just bit patterns in storage, and that what happened with them depended totally on what process was interpreting those bit patterns. The eight bit byte 00110000 (hex 30) could be a 6502 branch on minus opcode or the ASCII code for digit 0 or the number 48. etc.
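
The same point, sketched in Python for the readings it can show (the 6502 opcode reading is the answer's own example and isn't demonstrated here):

    # One and the same bit pattern, read several ways.
    byte = 0b00110000          # hex 0x30
    print(byte)                # 48   - read as a number
    print(chr(byte))           # '0'  - read as an ASCII character
    print(bytes([byte]))       # b'0' - read as raw storage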

Jim Davis
+3  A: 

Mine was an anti-CS moment: realizing that for all the attention we pay to the complexity of a data structure, the constant matters a lot, and can get messed up by memory layout, file systems, etc.

Uri
Yes, Big-O notation can be misleading.
MaD70
+7  A: 

When I realized that code and data were the same thing. It's all bits and code can be manipulated just like any other data.
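
A small Python sketch of that idea - program text built and edited like any other data, then handed back to the interpreter:

    # Code treated as data: built as a string, edited like text,
    # then executed like code again.
    source = "def greet(name):\n    return 'hello ' + name\n"
    source = source.replace("hello", "goodbye")   # manipulate it like data

    namespace = {}
    exec(source, namespace)                        # now it's code again
    print(namespace["greet"]("world"))             # goodbye world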

Ferruccio
Only in von Neumann architectures. ;-)
Paul Nathan
+3  A: 

My first big system.

For about 10 years, I had programmed on and off as a hobby, writing programs of between 10 and 100 lines. I then arrived at college, and completed some very complex, yet brief, algorithm assignments - still under 1000 lines, perhaps 3-5 files.

Then, I took a class with a term project. The task was to create a web-based information system - you know, something similar to things people actually use. There is no experience quite like starting from nothing, figuring out a technology, and creating a multi-thousand-line application. It seemed somewhat magic that my SQL commands actually created a functioning database, and that they actually made it over a network. It was also an eye-opener to realize that I, as a programmer, was fully capable of creating things commonly sold for thousands of dollars.

Dylan White
+5  A: 

Hashing. When I realized that I could use a mathematical transform of a key to pick an index in O(1), and that I could get constant time storage and retrieval I wanted to store EVERYTHING in a hash table!
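
A toy sketch of the idea in Python (not how Python's own dict is implemented) - the key is transformed straight into a bucket index, so nothing has to be scanned:

    class ToyHashTable:
        def __init__(self, size=8):
            self.buckets = [[] for _ in range(size)]

        def _index(self, key):
            return hash(key) % len(self.buckets)   # the "mathematical transform"

        def put(self, key, value):
            bucket = self.buckets[self._index(key)]
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))

        def get(self, key):
            for k, v in self.buckets[self._index(key)]:
                if k == key:
                    return v
            raise KeyError(key)

    table = ToyHashTable()
    table.put("answer", 42)
    print(table.get("answer"))   # 42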

dicroce
I also remember learning hash-tables as something mind-blowing... too bad that only a year later we've learned the cache implications of over-using hashtables!
Oak
+1  A: 

Well, this was really early in my CS education, but at one point I'd written a number of short straight-line execution programs. Then I learned about arrays and loops, and it really was an amazing experience to see that light bulb switch on.

Larry Gritz
+5  A: 

Starting out: GOSUB. Wow! you can REUSE bits of code in your program?

At Uni: The Universal Turing Machine. A realisation of how simple computers fundamentally are.

Work : More difficult to pick one thing out, but possibly finally grasping the implications of apply-templates in xslt.

Alohci
+3  A: 

The most profound thing I learned going for my Bachelor's in CS came during my AI class, when I learned that information can be considered interchangeable with energy. This totally blew my mind and changed the way I look at the world.

More practically, I didn't have a true understanding of pointers until taking an assembly language course. Before understanding their implementation, they might as well have been useful and yet unpredictable gremlins.

Ubiquitous
+7  A: 

The most important thing I ever came to understand as a programmer is that it is universally my fault when my code behaves incorrectly. Even in the few cases where it is not my fault, it's still probably my fault.

Mike Burton
A: 

Learning Smalltalk, in my concrete case Squeak.

zedoo
+2  A: 

When I realized that there is much more to computer programming than writing compilers and assemblers...

gagneet
A: 

Data Structures.

When I learned all of the common ones I realised that a lot of the code I had written could have been done a lot better. Although I learned all of that stuff during my first year of university, so I wasn't exactly an expert by that stage. Before learning about linked lists, I was doing silly things like creating very large arrays and simply hoping that the array would never be exhausted.

Now when I see a problem, I have a much clearer idea of how the data should be stored and accessed, along with the speeds of each implementation.

A: 

I had started Windows programming with MFC but the concept of windows (parent, child, sibling, etc.) was not very clear to me. It may seem very weird, but as soon as I read about the GetDlgItem function, everything became clear :-). Suddenly reading MSDN became my favorite hobby.

+1  A: 

I was writing a small C++ program for a data structures class in college (using a DOS version of Borland). I had gone through a few iterations, but by now I understood exactly what it was doing. It was so simple, there was NO WAY it couldn't work... except that it didn't!

Stepping through the debugger, I watched it jump to some "random" line of code "for no reason at all". At my wit's end after watching it do this 10 or 15 times, I rebooted the PC and ran the program again. It worked fine! Hmmm... Guess I should've paid more attention to all those lessons about pointers and needing to be careful about accidentally venturing past the end of your arrays!

A: 

Python's use of lists. After reviewing the list of methods, I was extremely confused as to why something called a "list" would need these. Working through them, however, taught me quite a bit about data structures, including stacks, queues, linked lists, and eventually tuples, dictionaries, and sets as I worked through "why does this need something different than a list?"

For a while, though, my Python code did more list manipulation than my Scheme code.
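
A small sketch of the kind of thing that working-through turns up - a list is a fine stack but a poor queue, which is why deque exists:

    from collections import deque

    # A list is already a fine stack...
    stack = []
    stack.append("a")      # push
    stack.append("b")
    print(stack.pop())     # "b" - last in, first out

    # ...but a poor queue: pop(0) shifts every element. deque avoids that.
    queue = deque(["a", "b"])
    queue.append("c")      # enqueue
    print(queue.popleft()) # "a" - first in, first out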

J.T. Hurley
+1  A: 

Learning that a computer can be built using nothing but NAND gates and a clock.

Then much later on actually simulating this process myself. http://www.hackszine.com/blog/archive/2008/03/from_nand_to_tetris_in_12_step.html

Andy Webb
+3  A: 

Once, when I had been programming for about 6 years, my company sent me on a totally inappropriate course (can't even remember what it was - something to do with local area networks maybe...). Bored with it, I had a look in the class next door where they were doing structured programming. There were more course notes than students, so I was able to take one home and read it. There was nothing in it that I didn't kind of know intuitively and from experience - but I had never been explicitly taught it either, and having the principles and reasoning behind it spelt out was very illuminating.

Since then I have learned many other languages, and newer techniques such as OO, but the principles of structured programming are just as valid now as they were then.

Tony Andrews
A: 

'A' + 32 = 'a'

Took me a while to fully appreciate the fact that empirically everything is just numbers and not abstract 'a' letters and 'b' letters etc. I'm also an EE, so I'm slightly biased.

Ben
+1  A: 

University classes in theoretical computer science and compiler construction.

On the theoretical side, I learned about terms like correctness and formal provability, and where the limits of being able to write correct software lie. In my view, some knowledge in this area is mandatory for writing software that does what it should, even if formal proofs are actually hardly ever done in real-life software development.

There has been no other place where I learned so much about programming, the importance of theory (in some areas), and also things like how to implement complex data structures efficiently, as in compiler construction. Knowledge in this area not only helps with related problems like building parsers for complex data formats, macro languages or similar things, but also helps you get some idea of what the computer actually does when we enter instructions in a high-level language, and what needs to be done to implement software efficiently.

mh
A: 

Recursion.

Jon DellOro
A: 

This is taking me way back to my youth. Apple ProDOS had just come out. Prior to that was plain ol' Apple DOS 3.3 with its flat file system, which I cut my teeth on.

I had an "a ha!" moment when I figured out the difference between absolute and relative pathnames and that they were interchangeable. The concept of the "current working directory" suddenly took on a whole new dimension which was missing before. Sure, it had all been explained in numerous books and magazines, but it didn't sink in until that moment.

Barry Brown
A: 

Learning how pointers worked in C was definitely a light-bulb moment. But I had a much better one a few years later: modularity and abstraction. What is significant is that it came after I'd been doing both for months. Experience can be a wonderful teacher.

(What actually happened was that I was learning how to write Windows programs in C against the Win16 API. The Petzold book was absolute gold, but it taught "start with this skeleton". That was the key. I eventually had a batch file to start a new program by copying the template I had made of the essential pieces. When I learnt DDE, there was so much mechanical stuff you had to do that it was (by then) natural to abstract it away into another .c file. Then I built a small library on top of my own DDE one and that's when I realized what I was doing. The lesson has stayed with me ever since.)

staticsan
+1  A: 

Sitting with a user group of professionals all asking about solutions to their problems and realising that every single one was a problem to do with individual people and not technical issues. Every problem was a people problem.

Nat
A: 

I dunno if any scales fell off my eyes :-) but something that I thought was really cool was spatial data structures, like kd-trees and PR quadtrees. I also liked doing 3D graphics with matrices.

Willie Wheeler
+1  A: 

When I realized what functions were, a light bulb went off. "I don't have to do copy and paste anymore!"

da_code_monkey
It goes dark when lightbulbs go off ;)
mackenir
+2  A: 

Functional Programming.

Although I'm a determined C# programmer, this tutorial was an eye-opener to me that there are other ways of solving problems. Using overloading in C# to emulate functional programming, I was able to refactor some of my more complex algorithms with less than 20% of the code I had before, and with better readability.

MZywitza
+1  A: 

AND, NOT and OR. I was aware of what they did, but one day our teacher explained to us how you need to arrange them to add up two 4-bit values. One minute later I understood how you would go about arranging them to do whatever operation you want to perform on operands of any size you'd like. Two minutes later I was thinking "a 32-bit CPU must be about the most complicated thing in the world, but I still understand how it works, yay".
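
A sketch of that arrangement in Python, with only AND, NOT and OR as primitives: XOR is built from them, a full adder from that, and four full adders chained into a 4-bit ripple-carry adder:

    # Only AND, NOT and OR as primitives; everything else is arrangement.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    def XOR(a, b):                      # (a AND NOT b) OR (NOT a AND b)
        return OR(AND(a, NOT(b)), AND(NOT(a), b))

    def full_adder(a, b, carry_in):
        s = XOR(XOR(a, b), carry_in)
        carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
        return s, carry_out

    def add4(x, y):
        """Add two 4-bit values like a ripple-carry adder, least significant bit first."""
        carry, result = 0, 0
        for i in range(4):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result, carry

    print(add4(0b0101, 0b0110))   # (11, 0): 5 + 6 = 11, no carry out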

Of course every processor I would have built back then would be missing state, but we learned that later :-)

Recursion, polymorphism, the power of LISP and some other things I can't think of right now were also pretty big eye openers.

Maximilian
A: 

Lisp and the idea that you can use code to execute other code.

Shalmanese
A: 

In my 2nd CS C++ class, I finished my homework assignment on pointers. I couldn't believe it actually worked.

bpapa
+2  A: 

1972 -- "fixed" a broken photo-typesetting machine by having the 4-bit computer flash every character twice, thereby getting the newspaper to the pressroom. The manufacturer's technician was 150 miles away, and was able to replace the weak flash power supply the next day. Fixing or working around broken hardware with software was a "WOW" moment for me.

+1  A: 

Being astounded that the IBM 1620 could do millions of operations, each nearly instantaneously, and never make a mistake. Computers really were a brave new world.

For a mechanical engineer, where things slip, wear out, fatigue, rust, and eventually break, that was phenomenal.

Or that a chunk of program that would take millions of operations could be invoked by a single instruction. That is like a machine the size of a battleship hanging comfortably from the thinnest wire.

Mike Dunlavey
+3  A: 

I first encountered the idea of object-orientation in college, and while I understood the mechanics well enough — the "how", if you will — I didn't quite get the "why". It seemed like just another way of representing data and actions, and a fairly cumbersome one at that. It certainly didn't inspire me to stop writing procedural code at the time.

Some time later, however, I found myself reading through the language documentation of an interpreted language I was considering taking up (I've forgotten which) and while scanning through the examples, found what seemed the single most transcendent notion I'd ever encountered. The example was something akin to the following:

" foo ".trim();

In all my courses, I had never seen an object method called on a literal. It astounded me! For whatever reason, the idea of objects suddenly made sense. Classes as a way of structuring data had seemed clear enough before, but until that moment, the idea that objects could be so deeply embedded in the design of a language that actual string literals were objects with class methods had never occurred to me.

I've always felt a great debt to whatever anonymous programmer decided to add that particular example. Not only did it greatly expand my concept of how code is written, but I don't think I would have survived learning JavaScript without it!

Ben Blank
+1  A: 

Concurrent programming/Multi-threading flow control. This is where we got an assignment where we would enter a number of kids and a number of toys, and each kid would receive a random toy and try to complete the set by trading duplicates to get any that were missing. The key was to create semaphores to prevent cheating, like somebody getting 2 toys while giving away one. Very cool assignment that showed how complex the real world can be.
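
A loose sketch of the semaphore idea in Python (invented names, not the original assignment) - the semaphore counts the toys in the box, so a trade can never take out a toy that wasn't put in:

    import threading

    toys_available = threading.Semaphore(0)   # the box starts empty
    box = []
    box_lock = threading.Lock()

    def give_toy(toy):
        with box_lock:
            box.append(toy)
        toys_available.release()               # one more toy may now be taken

    def take_toy():
        toys_available.acquire()               # blocks until a toy exists
        with box_lock:
            return box.pop()

    def trade(my_duplicate):
        give_toy(my_duplicate)                 # give one away...
        return take_toy()                      # ...and take exactly one back

    give_toy("robot")                          # someone else's duplicate
    print(trade("dinosaur"))                   # exactly one toy back, never two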

Second on the list would be the realisation that a function only needs about a dozen lines of code to do its work. I remember that being said, and seeing some examples where that is how things are done; if it takes more lines of code, it can likely be refactored down to that size.

JB King
+2  A: 

The biggest thing I learned was that the customers have no idea what they want, and once you show them something they will usually realise they don't want exactly that, and "can you make that green?" or "how about a picture of a tree here instead?" will happen.

Basically Joel said it best in his article The Iceberg Secret, Revealed: an application takes 1-10% of your time to look good and 90-99% to be functional, but the customer will only care about that 1-10%.

WACM161
A: 

Test-Driven-Development and Domain Driven Design.

Nilsson's book, Domain Driven Development, has opened my eyes to the benefits of testing and modeling.

It was an unnerving experience looking back at my untested code.

Chance
A: 

It's been said many times but it's the same for me: Pointers

A spooky "clunk" noise at the back of my head as the relationship between code and hardware fell into place in a way it never really had before.

I'm sure there are many other ways to get that "ah ha" moment and that I was a dullard to have taken so long to have it happen for me but, until it happens, there is something fundamentally missing in one's grasp of the whole system one is working with.

duncan
+1  A: 

Probably the most eye-opening experience was a required computer engineering class that went over logic gates, flipflops, adders, all the way up to state machines and ALUs. It was fun learning how those things worked but at the end of the class we actually designed a CPU. It was shocking to see how it worked. A CPU instruction was really just a bit pattern used by multiplexers to specify the input registers, output registers, and operation of the ALU (obviously modern CPUs are much more complex).

It was then that I felt like I understood computers "all the way down", from the higher-level stuff like Java and C++, through lower-level stuff like assembly, and of course knowing how logic gates worked. But the CPU design, which connected the highest-level hardware devices - registers, ALUs, etc. - up to the lowest level of programming - assembler - meant that I now had a 'complete path' all the way from transistors at the very lowest level to whatever you could imagine at the highest level: OO design, scripting languages, whatever.

Other than that, the theoretical stuff was enjoyable, but it was a gradual progression of "cool stuff" rather than any one 'ah hah' moment.

Chad Okere
A: 

When I started to learn Design Patterns, I then realized the real power of polymorphism. It really opened my eyes, and has completely changed the way I think about every project.

A: 

Much like the OP, my epiphany occurred while tracing instructions through a pipeline. It was like the last piece of the puzzle. All of a sudden there was no mystery left about computers. This was all there was to know; everything else was just gravy.

unclerojelio
A: 

I am 24 and still learning a lot of CS stuff, but so far the biggest eye-opener has been my exercise to learn Common Lisp and reading the SICP book.

Amit
A: 

With assembly, writing bytes to the address space of the screen and seeing pixels change.

Ates Goral
A: 

Coming from a C++, C#, and Java background, many of the concepts of Scheme (a dialect of Lisp) such as first-class procedures, code as data, and data as code were eye-opening (see Structure and Interpretation of Computer Programs).

Clojure (a JVM Lisp) is also eye-opening for its use of built-in concurrency.

Steve Betten
+1  A: 

It's the day you start thinking in code, rather than thinking about what code you're going to be writing.

This seems to happen about 1 - 3 years after starting a language.

leppie
+1  A: 

0.

"Computer science is no more about computers than astronomy is about telescopes."

1.

Computer science and mathematics are closely linked. Math would help me with cutting edge Computer science.

2.

That I didn't need to know advanced mathematics to be a successful programmer (to have high income).

Patrick Gryciuk
A: 

When I first saw how a clever algorithm could be used to replace a bunch of really stinky code, I realized there was more to programming than just learning IBM 1401 instructions. Many times I have ached to start coding a project, and then forced myself to do some more thinking.

gary
A: 

The Desk Calculator example in Kernighan and Pike. It demonstrated how to use lex, yacc, and function pointers, and how to implement a new language, all in one simple example.

Larry Watanabe