Reading about the G.729 codec, I found this interesting tidbit about "Comfort Noise":
A comfort noise generator (CNG) is also set up because, in a communication channel, if transmission stops and the link goes quiet due to the absence of speech, the receiving side may assume the link has been cut. By inserting comfort noise, an artificial version of the old analog hiss is played during silence to assure the receiver that the link is still active and operational.
This is the kind of thing a good programmer needs to know about before designing VoIP software, for instance; a rough sketch of the idea follows below.
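On the receiving side, the idea boils down to "synthesize quiet noise whenever no speech frames arrive." Here's a minimal Python sketch of that behavior; the frame layout, the noise level, and the `decode_speech` stub are hypothetical placeholders, not the actual G.729 algorithm (a real decoder would receive SID frames describing the background-noise level):

```python
import random

FRAME_SAMPLES = 80          # one 10 ms frame at 8 kHz, the frame size G.729 uses
COMFORT_NOISE_LEVEL = 200.0 # hypothetical amplitude for the background hiss

def decode_speech(frame):
    # Placeholder for the actual speech decoder.
    return frame

def decode_frame(frame):
    """Return PCM samples for one received frame.

    `frame` is assumed to be None when the sender has stopped
    transmitting because of silence suppression.
    """
    if frame is None:
        # No speech arrived: synthesize low-level Gaussian noise so the
        # listener hears a faint hiss instead of dead silence.
        return [random.gauss(0.0, COMFORT_NOISE_LEVEL)
                for _ in range(FRAME_SAMPLES)]
    return decode_speech(frame)

# Example: one frame of speech followed by a silent gap.
stream = [[0.0] * FRAME_SAMPLES, None, None]
playback = [decode_frame(f) for f in stream]
```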
Earlier today I also learned about Saccadic Suppression:
Humans avoid retinal blurring during eye movement by temporarily attenuating the data flowing from the retina into the brain. An amusing way to demonstrate this phenomenon is to look at your face in a mirror. Holding your head steady, look at one eye and then the other, rapidly shifting your gaze between the two. The image is stable and you do not see your own eye movement, but another person watching you will clearly see your eyes move.
This has applications in video game development and other visual and graphics work; a sketch of one way it might be exploited follows.
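As a rough illustration, a renderer might lean on the same masking effect by dropping expensive detail while the view is whipping around, since the player can't perceive fine detail during that motion anyway. This is only a minimal Python sketch under my own assumptions; the threshold and the level-of-detail names are hypothetical and not taken from any particular engine:

```python
import math

# Hypothetical threshold: angular speeds above this (radians/second) are
# treated as saccade-like camera motion, during which fine detail is
# effectively masked for the viewer.
SACCADE_SPEED_THRESHOLD = math.radians(180)

def is_saccade_like(prev_yaw, curr_yaw, dt):
    """True if the camera rotated fast enough this frame that the
    resulting motion masks fine visual detail."""
    angular_speed = abs(curr_yaw - prev_yaw) / dt
    return angular_speed > SACCADE_SPEED_THRESHOLD

def choose_lod(prev_yaw, curr_yaw, dt):
    """Drop to a cheaper level of detail while the view is whipping
    around; restore full detail once it settles."""
    return "low" if is_saccade_like(prev_yaw, curr_yaw, dt) else "high"

# A slow pan keeps full detail; a fast flick does not.
print(choose_lod(0.0, math.radians(2), dt=1 / 60))   # high
print(choose_lod(0.0, math.radians(20), dt=1 / 60))  # low
```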
There are many books on user interface design, but I have yet to see a single reference that enumerates most of the human factors we should understand when designing software. I expect a lot of software engineers learn this by the seat of their pants: they design something, find that it's odd or annoying, and fiddle with it until it feels comfortable. Yet the answers already exist, the studies have been done, and someone knows not only how to fix our issue, but why it's an issue.
- Without getting a BS/BA in a dozen different professions, where would I look for this sort of information?
- Am I doomed to stumble across it in daily internet surfing (which many companies/managers frown on)?
- What other human factors affect programming? (Please link a reference or resource, or at least give a googleable technical name; alternatively, post a new question about it with the tag "human-factors".)