views:

533

answers:

10

I think knowing how computers work can be very beneficial for programming efficient applications. Is there a book that doesn't teach programming, but instead teaches how computers work in a way that helps you make better decisions as a programmer?

Thanks.

+8  A: 

It sounds like you'll be interested in low-level or embedded programming.

Code: The Hidden Language of Computer Hardware and Software

Mike
Curse you! You beat me to it :)
Kevin
Thank you, that book looks very interesting and seems like it might be just what I was looking for.
John Isaacks
+3  A: 

Computer Architecture: A Quantitative Approach, 3rd Edition is a great resource for understanding how computers work.

Aaron Saarela
Ben Collins
que que
A: 

I wholeheartedly agree with Aaron Saarela's suggestion, but also take a look at "Computer Organization and Design: The Hardware/Software Interface". The latest edition of either of these books is not essential, as the earlier editions were almost as good. You'll miss a bit of detail on the very newest chip designs with earlier editions, but save a lot of cash.

You'll probably need to work some exercises to get the most from either of these books. Computers aren't simple machines, unfortunately.

PeterAllenWebb
+2  A: 
Floetic
I used the same book. I liked it, but I don't have anything to compare it with. +1 anyway.
Michael Myers
+1  A: 

Did you look at A Peek at Computer Electronics in the Pragmatic Programmers' Things You Should Know series?

EDIT: this book really explains the lowest-level things. Thinking more about it, though, I think that to improve the decisions one makes as a programmer, one is better off reading about operating systems. That is really where programs run, since modern OSes create an abstraction layer above the hardware. I'd recommend "Understanding the Linux Kernel" (Bovet and Cesati); the third edition covers version 2.6 of the kernel. Knowing about memory management, paging, the scheduler, program layout, signals, etc. helps you see what is happening behind the scenes.
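
To see what that abstraction layer feels like from user space, here is a minimal C sketch (POSIX-only, purely illustrative) that asks the kernel for its page size and registers a signal handler:

    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Handler the kernel invokes when the process receives SIGINT. */
    static void on_sigint(int signo)
    {
        (void)signo;
        write(STDOUT_FILENO, "caught SIGINT\n", 14); /* async-signal-safe */
        _exit(0);
    }

    int main(void)
    {
        /* The page size is the granularity of the kernel's memory management. */
        printf("page size: %ld bytes\n", sysconf(_SC_PAGESIZE));

        signal(SIGINT, on_sigint); /* register the handler */
        pause();                   /* sleep until a signal arrives (try Ctrl-C) */
        return 0;
    }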

philippe
I am actually currently reading The Pragmatic Programmer, great suggestion, thanks!
John Isaacks
A: 

While the books by Hennessy & Patterson are classics, I have always recommended another book to my students: Structured Computer Organization by Andrew Tanenbaum. To me, it is far simpler to read and understand. He introduces the computer in layers, starting briefly at the transistor layer, moving through logic, functional and control blocks, and instruction sets, and finally ending up at the software layer. The H&P books assume that you have some prerequisite knowledge, as they are designed as course texts.

sybreon
A: 

You have a long journey ahead depending on how much you want to keep up with the changes.

I would not recommend any specific book without knowing what your specific area of interest is. You have to love digital equipment (as hardware) more than software, imho..

As an example, and put very simply: you have MS, with its extremely bad JIT code generation, chasing Java, which does it much better, which in turn chases C++, which is itself behind modern CPU capabilities (yes, Intel leads language design, not the other way around). This is happening all the time, and even Intel is chasing the GPU guys, who cannot write a decent bloody compiler (or graphics driver software that doesn't cause childish problems). What a funny-looking graph..

Another way to look at it is Intel chasing AMD, which nicked Alpha ideas, which ripped off transputers, with attempts at custom FPGA or Amiga-like designs along the way..

If, as you say, you're looking at software-efficiency aspects, then sure: think MIMD, and it isn't available yet :) Seriously, for efficiency you only need to worry about the von Neumann architecture, and about the fact that cache memory is sacred. Or simply that easily exploitable hardware advances are coming to an abrupt end, as is evident with Intel's latest quad cores..
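
To make the "cache memory is sacred" point concrete, here is a small C sketch (illustrative; exact timings depend on your hardware): both loops sum the same matrix, but the first walks memory sequentially while the second strides across rows, and the sequential one is typically several times faster.

    #include <stdio.h>

    #define N 1024
    static double m[N][N]; /* C stores this array row by row */

    int main(void)
    {
        double sum = 0.0;

        /* Cache-friendly: consecutive accesses touch consecutive addresses. */
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                sum += m[i][j];

        /* Cache-hostile: each access jumps N * sizeof(double) bytes ahead,
           so nearly every load misses the cache. */
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                sum += m[i][j];

        printf("%f\n", sum);
        return 0;
    }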

Then again, Google showed that by doing all the dirty, distributed work in software you can lift the abstraction and still rule; i.e. it can pay off, so the kids programming mass-scale hardware don't have to worry about it. According to them, they crash and burn a few 1TB disks just by running the odd testing process or two.

Pick your poison, of course.. just don't do Java + Oracle, or C# + WPF + LINQ, and then talk about software efficiency :)

rama-jka toti
Lots of chasing around, and I have no idea what you're saying. And what do you mean MIMD isn't available yet? Multicore certainly qualifies as MIMD, as do a number of commercially available technologies.
Ben Collins
Note the ":)" next to it, aka a joke... Dear me, you should take it less seriously. Btw, MIMD on the quad cores is just getting worse and worse: first it shares cache, and second it starts to suck on memory latency with each new addition.
rama-jka toti
And the "Seriously," after it...
rama-jka toti
A: 

You might also want to read up on the history of operating systems, how and why they were formed. I just took a course on operating systems, and it was actually quite good. It started with those old punch-card computers, then went on to talk about assembly language, and then higher- and higher-level languages. At that point we had relatively lots of CPU power, but we weren't using it to its maximum capacity, and computers could only process jobs one after the other. So we needed an operating system so that users could run multiple programs "simultaneously". But hardware kept evolving, and it was difficult for users to write the low-level stuff that dealt with the hardware directly, so operating systems needed to abstract this away so that programmers could just make a system call to read and write files and such... well, I don't want to spoil all the fun; I'm sure you can find more info if you're interested.

You might also be interested in computer architecture if you want to learn about AND/OR gates, how the CPU actually crunches bits, how RAM works, and all the circuitry that goes into a computer.
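
Those gates map directly onto C's bitwise operators; here is a minimal sketch of a one-bit half adder, the building block an ALU uses to add numbers:

    #include <stdio.h>

    /* A half adder built from the gates an ALU is made of:
       sum = a XOR b, carry = a AND b. */
    static void half_adder(unsigned a, unsigned b,
                           unsigned *sum, unsigned *carry)
    {
        *sum   = a ^ b; /* XOR gate */
        *carry = a & b; /* AND gate */
    }

    int main(void)
    {
        for (unsigned a = 0; a <= 1; a++)
            for (unsigned b = 0; b <= 1; b++) {
                unsigned s, c;
                half_adder(a, b, &s, &c);
                printf("%u + %u = carry %u, sum %u\n", a, b, c, s);
            }
        return 0;
    }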

You'll probably also want to learn about the heap (not the data structure) and the stack.
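
For a quick illustration of the difference (a minimal C sketch): stack storage disappears when the function returns, while heap storage lives until you explicitly free it.

    #include <stdio.h>
    #include <stdlib.h>

    int *make_on_heap(void)
    {
        /* Heap: allocated by the runtime, survives the return,
           and must be released explicitly. */
        int *p = malloc(sizeof *p);
        if (p) *p = 42;
        return p;
    }

    int main(void)
    {
        int local = 7; /* stack: freed automatically when main returns */
        int *heaped = make_on_heap();

        if (heaped) {
            printf("stack %d, heap %d\n", local, *heaped);
            free(heaped); /* forgetting this leaks memory */
        }
        return 0;
    }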

Also, all your programs essentially get boiled down to assembly language, so I'd look into that too.
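
For instance, here is a trivial C function together with, roughly, the x86-64 assembly a compiler like GCC emits for it at -O2 (the exact instructions vary by compiler and target):

    /* add.c: compile with `gcc -O2 -S add.c` to see the assembly yourself. */
    int add(int a, int b)
    {
        return a + b;
    }

    /* Typical x86-64 output (System V ABI: a arrives in edi, b in esi):
     *
     * add:
     *     lea eax, [rdi+rsi]   ; eax = a + b
     *     ret                  ; the return value travels back in eax
     */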

A computing science degree really covers all this stuff, but if you don't have four years and several grand to toss away, I'd snoop through Wikipedia, and then you can dig up a good book if you want detailed info.

Mark
A: 

In fact, SICP has some goodies on this topic.

TokenMacGuy
And the accompanying lectures are online. http://groups.csail.mit.edu/mac/classes/6.001/abelson-sussman-lectures/
A: 

(image)

Yes, I posted this for some other question on SO, but this is my suggestion for understanding how a computer works. I think many people will agree..

puttaraju