views: 701
answers: 17

Hello,

What would be a good methodology for learning how computers and computer programming work?

For example, would you recommend learning how circuits work, then assembly language, and then higher level languages?

+12  A: 

Build one. Improve it. Use it. Program it. Install different OSes. Read those big fat books. Ask lots of questions. Go get a CS degree. Don't stop learning.

Paul Beckingham
+9  A: 

I'd suggest a computer engineering bachelor's degree.

Jeb
+6  A: 

I'd say, start with programming, then go to an emulator, then architecture. Yes, counter to history.

A good language will teach you how to interact with a computer. Building your own emulator will teach you how the hardware works from the software side (see the sketch below); building the computer finishes the equation.

If you start from the hardware, it doesn't help programming as much as programming helps you learn the hardware.
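
As a taste of the emulator step mentioned above, here is a minimal sketch in C of a fetch-decode-execute loop for a made-up four-instruction machine. The opcodes and layout are invented purely for illustration; they don't correspond to any real chip.

    #include <stdio.h>

    /* Toy machine: fetch an opcode, decode it, execute it, repeat. */
    enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_PRINT = 3 };

    int main(void)
    {
        unsigned char program[] = {
            OP_LOAD, 5,     /* acc = 5   */
            OP_ADD,  7,     /* acc += 7  */
            OP_PRINT,       /* print acc */
            OP_HALT
        };
        int pc = 0, acc = 0;

        for (;;) {
            unsigned char op = program[pc++];    /* fetch */
            switch (op) {                        /* decode */
            case OP_LOAD:  acc = program[pc++];  break;   /* execute */
            case OP_ADD:   acc += program[pc++]; break;
            case OP_PRINT: printf("%d\n", acc);  break;
            case OP_HALT:  return 0;
            }
        }
    }

Growing something like this into a real emulator (registers, flags, memory-mapped I/O) is exactly the exercise that teaches you the hardware from the software side.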

+1 : 3 major facets, 3 good projects. I'd throw in a chapter or 2 of a discrete math book on binary logic.
SnOrfus
+2  A: 

I think you will find that the more you learn, the more you realize you don't know, as is the case with every subject. You may just get to the point where no one else knows the answers... then you can do a PhD thesis :)

ccook
+3  A: 

That's a tall order.

Hardware side:

  • Build a cheap computer; I've always liked Ars Technica's build guides.
  • Read tons of architecture articles.
  • Install different OSes (Windows, Linux, etc.)

Software side:

  • Learn Assembly
  • Starter book on C
  • Expert C Programming: Deep C Secrets by Peter van der Linden (awesome book)

Random other good books

Keep reading, keep asking questions, keep learning.

Rob McCready
+4  A: 
  1. You will never know everything there is to know about computers.
  2. If you try to learn all the history you will never catch up.

That said, never stop learning. It may be impossible to learn it all, but that doesn't mean you can't try :)

There are many resources out there that you can learn from. Wikipedia would be a great place to start with learning the history.

Read "In the Beginning... Was the Command Line" to learn a little bit about the command line.

Tanj
"In the beginning was the command line" and lo it was good. The geeks said unto them selves 'behold this goodness' and were efficient. Then there was the GUI and the geeks said unto the world 'behold, for it is bloated and inefficient' and prophetically, they were correct (and still efficient)
Adam Hawes
+1  A: 

I find most of the fun comes from the topic being so deep. Pick any direction at all and follow it as long as it is interesting.

That said, cut your teeth on some actual programming earlier rather than later. Java, C, .NET, whatever tools are handy or are being used by people you know. Having some face time with people can be invaluable.

Ryan Townshend
+5  A: 

Well, first I don't think you have enough time to learn everything...

  • First, you need to learn a lot of math, including number theory and logic.
  • Then you need to learn about computational theory, and read the works of Gödel, von Neumann and Turing.
  • Read about Herman Hollerith and IBM and the application of Jacquard-loom technology to counting stuff.
  • Then you need to learn the science of electricity, to understand how triodes/transistors work. This will give you an understanding of the very bottom of how a computer processor works.
  • You should also read about Colossus, the electronic computer that was used to crack German codes during the Second World War.
  • Then you need to pick a hardware architecture to start with. If you REALLY want to understand, find a really old integrated circuit with a simple instruction set. Or start with a modern chip and learn assembly.
  • Learn how assembly translates to machine code.
  • Build a compiler for a higher-level language. I would recommend something simple like a LISP, since you won't need to worry about complex parsing (see the sketch after this list).
  • Learn FORTRAN. And not one of the modern ones. IV would be good. Get a punchcard reader to really experience the history.
  • Learn Smalltalk to learn about object-oriented programming.
  • Add CLOS to your LISP compiler.
  • Implement your own OS
  • Implement your own file system
  • Get a quantum physics degree if you really want to understand how modern hard drives and memory chips work (e.g., spintronics, quantum limitations of microprocessors).
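
For the LISP suggestion above: a minimal C sketch (my own illustration, not from any particular compiler book) of why an s-expression language barely needs a parser. The entire grammar is "an atom, or a parenthesized list of expressions," so a short recursive walk covers it.

    #include <ctype.h>
    #include <stdio.h>

    /* Walks an s-expression string and prints its structure.
     * A real compiler front end would build a tree here instead of printing. */
    static const char *walk(const char *s, int depth)
    {
        while (*s) {
            if (isspace((unsigned char)*s)) {
                s++;
            } else if (*s == '(') {
                printf("%*slist (\n", depth * 2, "");
                s = walk(s + 1, depth + 1);          /* recurse into the sub-list */
            } else if (*s == ')') {
                printf("%*s)\n", (depth - 1) * 2, "");
                return s + 1;                        /* hand control back to the caller */
            } else {
                const char *start = s;               /* an atom: read up to the next delimiter */
                while (*s && !isspace((unsigned char)*s) && *s != '(' && *s != ')')
                    s++;
                printf("%*satom %.*s\n", depth * 2, "", (int)(s - start), start);
            }
        }
        return s;
    }

    int main(void)
    {
        walk("(+ 1 (* 2 3))", 0);   /* assumes well-formed input; this is just a sketch */
        return 0;
    }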

That should give you a good start at learning everything.

Technical Bard
+1 for 'start' at the end of that
ccook
Then there's always the time-honored method of "download Python and screw around with it for a few days."
rtperson
Yeah, but Python isn't going to teach you about chip fabrication. Personally, I'm getting too old to worry about knowing everything. There just isn't time.
David Thornley
-1 You don't need to know half of that to 'learn how computers and programming works'. Much of that is a) a lesson in computer history or b) the beginnings of learning how to build one (i.e. not put one together)
SnOrfus
SnOrfus - the OP edited the original post to remove their original request to learn "everything". This is why I presented this facetious list.
Technical Bard
+5  A: 

Start with the 0's.

After you've mastered that, move on to the 1's.

Baltimark
A: 

I suggest starting here, to get a good overview.

EvilTeach
+1  A: 

Read Danny Hillis's The Pattern on the Stone. Learn to program. After you've been programming for a while, if you're still interested, check out The Elements of Computing Systems: Building a Modern Computer from First Principles. By then you'll have seen plenty of pointers to more things to study.

Darius Bacon
A: 

Unfortunately, getting a computer science or computer engineering degree won't make you an expert on all the topics of computer science or programming. First of all, you have to be aware that it is not an easy task and it may take several years, so you have to be patient. I recommend following the YAGNI (You Ain't Gonna Need It) principle: at first, try to get a background in several fields of computer science, then choose the one that you like most and become an expert on that topic.

systemsfault
+6  A: 

I decided to do just that when I was 15, and just kept up with it for--well, forever.

Learned to program BASIC from the book that came with my computer (the first TRS-80 came with a great book targeted at virtually any age; not sure I've seen as good a beginner book since).

Learned assembly and binary by hand-coding CPU instructions from a Z80 CPU databook. Learned all about registers and how a CPU operates from that. Also messed with assembly on our DEC PDP-11 at school (just loved the fact that it used base 8 instead of base 16).
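
To make the hand-coding step concrete, here is a from-memory sketch of the byte-by-byte translation a Z80 databook walks you through: each mnemonic maps to fixed opcode bytes. The values below are recalled from memory, so check them against a real Z80 reference before trusting them.

    #include <stdio.h>

    int main(void)
    {
        /* "Add two numbers" hand-assembled into Z80 machine code (from memory). */
        unsigned char code[] = {
            0x3E, 0x05,   /* LD A, 5   ; load 5 into register A */
            0x06, 0x07,   /* LD B, 7   ; load 7 into register B */
            0x80,         /* ADD A, B  ; A = A + B              */
            0x76          /* HALT      ; stop the CPU           */
        };

        for (size_t i = 0; i < sizeof code; i++)
            printf("%02X ", code[i]);   /* the bytes you would key in by hand */
        printf("\n");
        return 0;
    }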

Knowing assembly helped a lot with my first job in C. I didn't know C at the time, but I had picked up the concept of pointers while accessing BASIC variables from assembly, so there really weren't many surprises. Getting used to the syntactical exceptions was the only hard part (the for loop, for example, is different from anything else and irritated me a lot at first).
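
A minimal sketch of that pointer intuition: the same indirection you do in assembly (hold an address, then load or store whatever it points at), written in C.

    #include <stdio.h>

    int main(void)
    {
        int value = 42;
        int *p = &value;    /* p holds the address of value */

        printf("address: %p\n", (void *)p);
        printf("value at that address: %d\n", *p);   /* dereference: like indirect addressing */

        *p = 7;             /* writing through the pointer changes value itself */
        printf("value is now: %d\n", value);
        return 0;
    }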

Took some electronics classes and paid special attention to gates and flip-flops. Still couldn't figure out how to get from a bunch of gates and flip-flops to a CPU; that took learning about timing.
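
A rough sketch of the timing point (my own toy illustration, not something from the class described below): a D flip-flop only captures its input on a clock edge, and that "when" is what turns loose gates into stored state.

    #include <stdio.h>

    /* Edge-triggered D flip-flop: the clock decides when the input is captured;
     * between edges the stored bit simply holds its value. */
    struct dff {
        int q;         /* stored bit */
        int last_clk;  /* previous clock level, used to detect a rising edge */
    };

    static void tick(struct dff *ff, int clk, int d)
    {
        if (clk && !ff->last_clk)   /* rising edge: capture D */
            ff->q = d;
        ff->last_clk = clk;
    }

    int main(void)
    {
        struct dff ff = {0, 0};
        int d_values[] = {1, 1, 0, 1};

        for (int cycle = 0; cycle < 4; cycle++) {
            tick(&ff, 0, d_values[cycle]);   /* clock low: nothing changes */
            tick(&ff, 1, d_values[cycle]);   /* rising edge: Q takes D     */
            printf("cycle %d: D=%d Q=%d\n", cycle, d_values[cycle], ff.q);
        }
        return 0;
    }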

Learned about timing signals (a critical part of bringing it all together) in the Navy: one of their classes involved troubleshooting a box that was essentially an exploded CPU. You could operate on any transistor, and go all the way up to plugging in CPU instructions through toggle switches and having the CPU execute them (100 bytes of memory). They could break any transistor and you had to be able to find it. (Outside of training, whenever you troubleshot anything, you just swapped cards until it worked.)

Edit: By the way, the best part of this class was the 50-page book, about 2/3 the size of a desk, that had every single part of that thing diagrammed. Studied every inch of it until I "got" what every single wire did (at least on a logic level; forget the power supply stuff).

Took a job assembling PCs (because up until then I wasn't really comfortable socketing RAM, changing cards and hard drives, swapping power supplies, ...).

Took jobs in finance, databases, and just about any other field I could find. Whenever I took a job, a big consideration was how much I'd learn from it. Tried to focus on learning business practices and tools. Took jobs where I spent a good deal of time on customer sites.

That was the first 15 years of my career; the last 10 have been the possibly more difficult task of understanding higher-level design (focusing on OOD), learning to think in terms of who's reading or using your code (instead of just making your code solve a problem), and thinking more in terms of making others more productive through how I interact with them.

I guess a big part was just never being afraid to jump in. I've never looked at the computer doing something and said "Boy, I could never make it do that". If someone needed something, I just did it.

Knowing about the history of how all these things came about (the school version and who invented what) is, for me, completely irrelevant. I pick up bits and pieces here and there, but for the most part I just care about how it works. I concentrate on design patterns and books that apply to my work; the ones I can't apply yet, I try to understand so I know when I should apply them, etc.

Damn, this is starting to sound too much like some computer psychopath resume. Sorry.

Bill K
+1 for providing the best example I've seen of somebody trying to learn everything about computers.
David Thornley
+2  A: 

You could read Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It provides a very nice historical perspective on the development of computers.

It's an easy and very good read, in my opinion.

David Klein
A: 

In no particular order...

Consider working in QA for a while. It will have the effect of making you see your programs the way mathematicians see theorems. Many people writing code can fall into the seduction of creating something that they then gaze upon uncritically. A stint in QA can have good effects on establishing good programming habits. It will also help make you a good debugger.

Learn to work without an IDE so you know what is going on behind the scenes. Otherwise, the IDE becomes a crutch.

Get a basic understanding of how computers work, including the relationship between CPU, memory and IO. No matter what languages you learn or tools you come to use, everything rests on these principles. Learning this will serve you well.

Finally, get a computer and take it apart and put it back together again. If a physical decomposition is not feasible for you then start with the OS instead. Begin with an “empty” computer, just the OS running. Then start being destructive and remove things, change file protections, etc. Exploring the effects of specific actions will reveal patterns to you that will lead to discoveries of the foundations of the OS. What’s the worst that can happen? So you have to reformat the disk and reinstall the OS. So what? What is gained is far more valuable. The same can be done with a perfectly running program. Take its source and start breaking it – one thing at a time. The resultant error messages will tell you a lot about the compiler.

This leads to the last suggestion. Once you start coding yourself, get in the habit of adding small parts and compiling often.

yetanotherdave
A: 

I wanted to learn exactly what you're looking for. Of course, the full answer is: never stop learning, etc. But if you want the most condensed self-paced crash course, read Charles Petzold's Code: The Hidden Language of Computer Hardware and Software, then read The Elements of Computing Systems: Building a Modern Computer from First Principles.

This will jump-start your overall understanding better than a half dozen or more specialized university courses.

There's no magic bullet here, and these books don't contain any secrets. They are just super-focused, with exactly the goal of understanding computer-related concepts in an accessible way from top to bottom.

Dinah