How much hardware understanding does one need to fully comprehend "Operating System" and "Computer Architecture" courses one takes as a computer science student?
At that level, the more you know the better, but the bare necessity for computer architecture is Boolean logic design. Understand how you design registers, adders, multiplexers, flip-flops, etc. from basic logic units (AND, OR, clocks). You can probably understand operating systems starting from a basic understanding of assembly, memory-mapped I/O, and interrupts.
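To make the "adders from basic gates" point concrete, here is a minimal sketch (the function names are mine, not from any course) that builds a one-bit full adder out of nothing but AND/OR/XOR, then chains it into a ripple-carry adder, exactly the kind of exercise an architecture course starts with:

```python
def full_adder(a, b, cin):
    """One-bit full adder built only from AND/OR/XOR gates."""
    s = a ^ b ^ cin                       # sum bit
    cout = (a & b) | (cin & (a ^ b))      # carry-out
    return s, cout

def ripple_add(x, y, width=8):
    """Add two integers bit by bit, like a hardware ripple-carry adder.

    The result wraps modulo 2**width, just as a fixed-width register would.
    """
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(23, 42))    # 65
print(ripple_add(200, 100))  # 44 -- overflow wraps at 8 bits (300 mod 256)
```

The overflow behavior falls out for free: a fixed number of full adders simply has nowhere to put the final carry, which is why fixed-width integer arithmetic wraps around.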
EDIT: I'm not certain what you mean by "hardware", do you consider logic design to be hardware? Or were you talking about transistors? I suppose it wouldn't hurt to understand the basics of semiconductors, but architecture is abstracted above the real hardware level. I would also say that operating systems are abstracted above the architecture.
"Computer science is no more about computers than astronomy is about telescopes."
At the very basic level, you should know about the Von Neumann architecture and how it maps onto real-life computers. Above that, the more the better. And not just for the OS - in garbage-collected and VM-based languages, know how the heap, the stack, and instructions work and are executed, so you know what will perform badly and how to improve it to get the best out of the architecture.
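The core of the Von Neumann model - instructions and data living in the same memory, driven by a fetch-decode-execute loop - fits in a few lines. This toy machine and its opcodes (LOAD/ADD/STORE/HALT) are invented for illustration, not taken from any real ISA:

```python
def run(memory):
    """A toy Von Neumann machine: code and data share one memory."""
    pc, acc = 0, 0                 # program counter and accumulator
    while True:
        op, arg = memory[pc]       # fetch
        pc += 1
        if op == "LOAD":           # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program in cells 0-3, data in cells 4-6: compute 2 + 3 into cell 6.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(mem)[6])  # 5
```

Because code and data share the same memory, a STORE could just as easily overwrite an instruction - which is exactly why real machines need memory protection, and why understanding this model makes parts of an OS course click.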
A good way to determine a baseline of hardware knowledge generally needed for Comp Sci studies is to visit the curriculum websites of a wide range of prestigious universities. Personally, I'd check the Comp Sci curricula at MIT, Stanford, University of Illinois at Urbana-Champaign (UIUC), Georgia Tech, etc., and draw an average picture from those.
Furthermore, you could phone up guidance counselors at the universities you are attending or applying to in order to get a personalized view of your needs. They would be able to guide you based on your goals - professors even more so. They are surprisingly accessible and very willing to give feedback on things like this.
Recently, I looked into getting my master's degree. As an alum of UIUC, I emailed a few old professors there and told them of my interest. I asked them several questions aimed at understanding grad school and their perspective. They shared their thoughts, and most invited me to call and chat.
Personally, I'd agree with @CookieOfFortune. The more you know about how a computer works internally, the more you can use that to your advantage while writing software. That said, it's not as if you really need to understand the physics of electronics to a high degree. It's interesting, sure, but your focus should be on circuitry, logic, etc. Much of this should be presented in a good Operating Systems course or at least provide you with springboards to learn more on your own.
It helps when you are trying to optimize for the hardware you are targeting. Take a hard drive, for example: it helps to write software that takes advantage of locality to minimize seek time. If you just treat the drive as something that 'works' and scatter files and data all over the place, you will run into severe fragmentation and end up with lower performance.
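A rough way to see the locality argument is to model the drive head's movement. This is a deliberately simplified cost model (one unit of cost per track moved, ignoring rotation); the function name is mine:

```python
def seek_cost(block_positions):
    """Total head movement for visiting blocks in the given order.

    Simplified model: cost = distance the head travels between blocks.
    """
    head, cost = 0, 0
    for pos in block_positions:
        cost += abs(pos - head)
        head = pos
    return cost

# A file laid out contiguously vs. the same 100 blocks fragmented
# across the disk (37 is coprime to 100, so this visits every track).
contiguous = seek_cost(range(100))
fragmented = seek_cost([(i * 37) % 100 for i in range(100)])
print(contiguous, fragmented)  # the fragmented order costs far more
```

The exact numbers don't matter; the point is that the same data, read in a different physical order, can cost orders of magnitude more head movement - which is why filesystems and databases work so hard to keep related data close together.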
A lot of this is taken into consideration when designing an operating system since you are trying to maximize performance. So in short, learning something about it can help, and certainly can't hurt in any way.
Two thoughts:
First, everything is going parallel. Multi-threading is one thing, multi-core is another. There are oodles of issues around caching, memory architecture, resource allocation, etc. Many of these are 'handled' for you, but the more you know about the metal, the better.
Second, number representation in hardware. This is as old as computer science itself, but it still trips everyone up. I'm not sure who said this, but it's perfect: "Mapping an infinity of numbers onto a finite number of bits involves approximations." Understanding this, and numerical analysis in general, will save your bacon time and again. Serialization, endianness, etc.
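Both halves of that - finite-precision approximation and endianness - show up in a few lines of standard Python. 0.1 has no exact binary representation, and `struct` exposes the byte order the hardware actually uses:

```python
import math
import struct

# Finite bits means approximation: 0.1 is not exactly representable in binary.
print(0.1 + 0.2 == 0.3)               # False
print(0.1 + 0.2)                      # 0.30000000000000004
print(math.isclose(0.1 + 0.2, 0.3))   # True: compare floats with a tolerance

# Endianness: the same 32-bit integer, serialized two ways.
print(struct.pack("<I", 1).hex())     # '01000000' (little-endian)
print(struct.pack(">I", 1).hex())     # '00000001' (big-endian)
```

This is why you compare floats with a tolerance rather than `==`, and why binary file formats and network protocols must pin down a byte order instead of trusting whatever the local CPU happens to use.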
Besides, it's fun!