views: 654
answers: 6

How much hardware understanding does one need to fully comprehend the "Operating Systems" and "Computer Architecture" courses one takes as a computer science student?

+4  A: 

At that level, the more you know the better, but the bare necessity for computer architecture is boolean logic design. Understand how you build registers, adders, multiplexers, flip-flops, etc. from basic logic units (AND, OR, clocks). You can probably understand operating systems starting from a basic understanding of assembly, memory-mapped I/O, and interrupts.
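
For a concrete feel of what "building an adder from basic logic units" means, here is a minimal sketch in C (just an illustration, not coursework code) where bitwise AND/OR/XOR operators stand in for gates in a 1-bit full adder, and four of them are chained into a ripple-carry adder:

    #include <stdio.h>

    /* A 1-bit full adder built only from AND, OR and XOR "gates"
       (bitwise operators on 0/1 values stand in for the gates). */
    void full_adder(int a, int b, int cin, int *sum, int *cout)
    {
        *sum  = a ^ b ^ cin;
        *cout = (a & b) | (cin & (a ^ b));
    }

    int main(void)
    {
        /* Chain four full adders into a 4-bit ripple-carry adder: 5 + 3. */
        int x[4] = {1, 0, 1, 0};   /* 5 = 0101, least significant bit first */
        int y[4] = {1, 1, 0, 0};   /* 3 = 0011, least significant bit first */
        int carry = 0, result = 0;
        for (int i = 0; i < 4; i++) {
            int s;
            full_adder(x[i], y[i], carry, &s, &carry);
            result |= s << i;
        }
        printf("%d\n", result);    /* prints 8 */
        return 0;
    }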

EDIT: I'm not certain what you mean by "hardware". Do you consider logic design to be hardware, or were you talking about transistors? I suppose it wouldn't hurt to understand the basics of semiconductors, but architecture is abstracted above the real hardware level. I would also say that operating systems are abstracted above the architecture.

CookieOfFortune
+2  A: 

"Computer science is no more about computers than astronomy is about telescopes."

S.Lott
Yes, but your analogy is not completely correct: computer architecture is about computers in the same way that (insert some telescope design field) is about telescopes.
yx
An appropriate quote, as long as it is not interpreted the wrong way. An astronomer who knows nothing about telescopes would be unlikely to excel in astronomy.
e.James
Exactly, that's the point. If you don't know how your telescope works, the physics behind it, its features and limitations, then you are a very poor astronomer.
DrJokepu
What about the radio folks? What about the theoreticians? What about the mathematical modeling of orbits and positions? Telescopes have nothing to do with that.
S.Lott
@S.Lott: Even though the physics of telescopes doesn't control astronomers' work, the operation of telescopes informs the history of their science and the sorts of observations they can make about the world (what's a theory that can't be falsified?). I think it's a good analogy for why hardware is important for computer scientists to understand.
mquander
"hardware is important for [some] computer scientists". But not all. Indeed, for some areas a telescope/hardware is irrelevant and for other parts it's confusing or detrimental. Hardware knowledge is just one (narrow) specialty topic.
S.Lott
+4  A: 

At the very basic level, you should know about the von Neumann architecture and how it maps onto real-life computers. Above that, the more the better. And not just for the OS: in garbage-collected and VM languages, knowing how the heap, the stack, and instructions work and are executed tells you what will perform badly and how to improve it to get the best out of the architecture.
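
As one hedged illustration of that kind of architecture awareness, the sketch below sums the same 2D array twice, once in row-major order and once in column-major order. The arithmetic is identical, but the cache-friendly traversal is usually several times faster (exact numbers depend on the machine):

    #include <stdio.h>
    #include <time.h>

    #define N 4096                      /* ~64 MB of ints, larger than any cache */

    static int grid[N][N];

    int main(void)
    {
        long sum = 0;
        for (int i = 0; i < N; i++)     /* touch every element once up front */
            for (int j = 0; j < N; j++)
                grid[i][j] = i + j;

        clock_t t0 = clock();
        for (int i = 0; i < N; i++)     /* row-major: walks memory sequentially */
            for (int j = 0; j < N; j++)
                sum += grid[i][j];

        clock_t t1 = clock();
        for (int j = 0; j < N; j++)     /* column-major: a 16 KB stride per access,
                                           so the cache misses constantly */
            for (int i = 0; i < N; i++)
                sum += grid[i][j];

        clock_t t2 = clock();
        printf("row-major: %.2fs  column-major: %.2fs  (sum=%ld)\n",
               (double)(t1 - t0) / CLOCKS_PER_SEC,
               (double)(t2 - t1) / CLOCKS_PER_SEC, sum);
        return 0;
    }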

thecoop
+1  A: 

A good way to determine the baseline of hardware knowledge generally needed for Comp Sci studies is to visit the curriculum websites of a range of well-regarded universities. I'd check the Comp Sci curricula at MIT, Stanford, the University of Illinois at Urbana-Champaign (UIUC), Georgia Tech, etc., and then distill a common core from them.

Furthermore, you could phone the guidance counselors at the universities you attend or are applying to in order to get a personalized view of your needs. They can guide you based on your goals, and professors even more so. They are surprisingly accessible and very willing to give feedback on things like this.

Recently, I looked at pursuing my master's degree. As a UIUC alum, I emailed a few old professors there, told them of my interest, and asked several questions aimed at understanding grad school and their perspective. They shared their thoughts, and most invited me to call and chat.

Personally, I'd agree with @CookieOfFortune. The more you know about how a computer works internally, the more you can use that to your advantage while writing software. That said, you don't really need to understand the physics of electronics in any great depth. It's interesting, sure, but your focus should be on circuitry, logic, etc. Much of this should be covered in a good operating systems course, or at least that course should give you springboards to learn more on your own.

Mike Caron
+2  A: 

It helps when you are trying to optimize for the hardware you are targeting. Take a hard drive, for example: it helps to write software that takes advantage of locality to minimize seek time. If you just treat the drive as something that 'works' and scatter files and data all over the place, you will run into severe fragmentation and end up with lower performance.
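
A rough sketch of the difference, assuming a file called bigfile.dat that is much larger than the OS page cache (the name, block size, and read count are just placeholders):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define BLOCK 4096
    #define READS 2000

    static double now(void)             /* wall-clock seconds (POSIX) */
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    int main(void)
    {
        FILE *f = fopen("bigfile.dat", "rb");   /* placeholder file name */
        if (!f) { perror("fopen"); return 1; }

        char buf[BLOCK];
        fseek(f, 0, SEEK_END);
        long blocks = ftell(f) / BLOCK;
        if (blocks < READS) { fputs("file too small\n", stderr); return 1; }
        rewind(f);

        double t0 = now();
        for (long i = 0; i < READS; i++)        /* sequential: disk-friendly */
            fread(buf, 1, BLOCK, f);

        double t1 = now();
        for (long i = 0; i < READS; i++) {      /* scattered: a seek per read */
            fseek(f, (rand() % blocks) * (long)BLOCK, SEEK_SET);
            fread(buf, 1, BLOCK, f);
        }

        double t2 = now();
        printf("sequential: %.3fs  scattered: %.3fs\n", t1 - t0, t2 - t1);
        fclose(f);
        return 0;
    }

(On an SSD the gap shrinks dramatically, which is itself a lesson about the hardware you are targeting.)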

A lot of this is taken into consideration when designing an operating system since you are trying to maximize performance. So in short, learning something about it can help, and certainly can't hurt in any way.

yx
+2  A: 

Two thoughts:

First, everything is going parallel. Multi-threading is one thing, multi-core is another. There are oodles of issues around caching, memory architecture, resource allocation, etc. Many of these are 'handled' for you, but the more you know about the metal the better.
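
One small, hedged example of "the metal" leaking through: the sketch below pads two per-thread counters onto separate cache lines. Remove the padding and the same loops can run dramatically slower, because both threads keep bouncing a single cache line between cores (false sharing). The 64-byte line size and any timings are machine-dependent assumptions:

    #include <pthread.h>
    #include <stdio.h>

    #define ITERS 100000000L

    /* The padding gives each counter its own 64-byte cache line. Drop the
       pad[] member and both counters share a line, so the two threads fight
       over it (false sharing). Compile with: cc -O2 -pthread */
    struct padded { volatile long value; char pad[64 - sizeof(long)]; };

    static struct padded counters[2];

    static void *bump(void *arg)
    {
        long idx = (long)arg;
        for (long i = 0; i < ITERS; i++)
            counters[idx].value++;
        return NULL;
    }

    int main(void)
    {
        pthread_t t[2];
        for (long i = 0; i < 2; i++)
            pthread_create(&t[i], NULL, bump, (void *)i);
        for (int i = 0; i < 2; i++)
            pthread_join(t[i], NULL);
        printf("%ld %ld\n", counters[0].value, counters[1].value);
        return 0;
    }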

Second, number representation in hardware. This is as old as computer science itself, but it still trips everyone up. Not sure who said this, but it's perfect: "Mapping an infinity of numbers onto a finite number of bits involves approximations." Understanding this, and numerical analysis in general, will save your bacon time and again. Serialization and endianness, etc.
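
The classic illustration of that quote, in C: 0.1 and 0.2 have no exact binary representation, so their sum only approximates 0.3:

    #include <stdio.h>

    int main(void)
    {
        double a = 0.1, b = 0.2;
        printf("%.17g\n", a + b);        /* prints 0.30000000000000004 */
        printf("%d\n", (a + b) == 0.3);  /* prints 0: the sum is not exactly 0.3 */
        return 0;
    }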

Besides, it's fun!

n8wrl