I was told by a professor that, using C code, one could heat a single pixel on an old monitor to the point that the monitor would overheat and smoke. Have any of you come across anything that would support this? We're having a debate in my office on whether this is possible or not.
This myth probably pops up because monitors have an effect where pixels that don't change become "burnt in", but the term is slang; nothing is literally burning.
On a CRT or plasma, you can "burn in" a pixel by using it excessively, leaving a permanent ghost of the image stuck on the screen. An LCD will appear to do this too, but if you simply leave the monitor off for a few hours, the retained image will go away; CRTs and plasmas, on the other hand, are damaged for good.
With old PC monochrome monitors, you could programmatically turn off the horizontal sync signal, which would cause components inside the monitor (the horizontal output stage) to overheat and physically fail.
Well, the old multi-sync CRTs were a bit flaky. Get one into a state (resolution) where the vertical and horizontal deflection coils stopped sweeping the electron beam around, without turning the beam off, and it would burn a nice pinhole in the phosphor coating. Messing with the signals sent to the CRT wasn't hard; you could reprogram the CRT controller with some simple OUT instructions. Smoke? Nah, the damage was on the inside. It was a problem for a year or two; I was just a pup back then.
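For anyone wondering what those "simple OUT instructions" looked like, here is a minimal DOS-era C sketch, assuming a Borland-style compiler (outportb from <dos.h>) and the documented 6845 CRTC port pair on an MDA adapter. It only writes the harmless cursor-shape registers, but the same two-port sequence aimed at registers 0-3 (the horizontal timing registers) is what could push a fixed-frequency monitor's sync out of range, as the two answers above describe.

```c
/* Illustrative sketch only: how DOS-era C poked the 6845 CRT
 * controller through port I/O. Ports are the documented MDA values
 * (0x3D4/0x3D5 on CGA); outportb() from Borland's <dos.h> is
 * assumed. Don't feed bogus sync values to real fixed-frequency
 * hardware. */
#include <dos.h>

#define CRTC_INDEX 0x3B4  /* 6845 index register on MDA */
#define CRTC_DATA  0x3B5  /* 6845 data register on MDA  */

/* Write one byte to an internal 6845 register (0..17). */
static void crtc_write(unsigned char reg, unsigned char value)
{
    outportb(CRTC_INDEX, reg);   /* select the internal register */
    outportb(CRTC_DATA, value);  /* write its new value          */
}

int main(void)
{
    /* Registers 0-3 hold horizontal total / displayed / sync
     * position / sync width. A fixed-frequency monitor expects
     * these to match its design; wildly wrong values stop or
     * mis-time the sync pulses, which is what stressed the
     * horizontal output stage on MDA-class monitors. */
    crtc_write(10, 0x06);  /* harmless demo: cursor start scanline */
    crtc_write(11, 0x07);  /* harmless demo: cursor end scanline   */
    return 0;
}
```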
Never actually smoked one myself, but great urban professor myth.
Ah, the killer poke. This was clearly possible on certain very early computer models; the Commodore PET is the one that springs to mind.
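For reference, the PET incident is usually attributed to the BASIC one-liner POKE 59458,62, which wrote to a memory-mapped 6522 VIA register (address 59458 = 0xE842) to defeat screen blanking and speed up video writes; on later, CRTC-based PETs that reportedly overdrove the video circuitry. Below is a hedged C rendering of the same memory-mapped write, purely for illustration (the PET was actually programmed in BASIC or 6502 assembly, and this does nothing useful on a modern machine).

```c
/* Conceptual C equivalent of the PET "killer poke",
 * BASIC: POKE 59458,62. Address 59458 (0xE842) falls in the PET's
 * 6522 VIA register block; writing 62 there changed the blanking/
 * retrace behaviour. Illustrative only -- do not expect this to
 * run (or be safe) anywhere real. */
#include <stdint.h>

int main(void)
{
    volatile uint8_t *via_reg = (volatile uint8_t *)0xE842u;  /* 59458 */
    *via_reg = 62;  /* the infamous POKE 59458,62 */
    return 0;
}
```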