I am working on a library designed to communicate with external devices over RS232 serial. While thinking about an error-handling strategy, exceptions seemed to be the right, industry-standard way of reporting errors.
So I read a few guidelines on exceptions. One states pretty clearly that I should not worry about the performance hit:
> Do not use error codes because of concerns that exceptions might affect performance negatively.
Another told me NOT to throw exceptions in normal cases:
> Do not use exceptions for normal or expected errors, or for normal flow of control.
I am not able to draw a clear line between normal/expected cases and the rest. For example, in my library an operation may fail for any of the reasons below (see the sketch after this list):
- There is no response from the device (no cable connected, device not turned on, wrong baud rate).
- The request is rejected because the device couldn't authenticate it.
- Communication fails midway (someone tripped over the cable, the device was suddenly powered off).
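To make that concrete, here is a rough sketch of how those failure modes might look as an exception hierarchy. All class names here are made up by me for illustration, not taken from any existing library:

```java
// Hypothetical base type for anything that goes wrong on the serial link.
class DeviceCommunicationException extends Exception {
    DeviceCommunicationException(String message) { super(message); }
    DeviceCommunicationException(String message, Throwable cause) { super(message, cause); }
}

// Case 1: no response at all (no cable, device off, wrong baud rate).
class NoResponseException extends DeviceCommunicationException {
    NoResponseException(String message) { super(message); }
}

// Case 2: the device answered but refused to authenticate the request.
class RequestRejectedException extends DeviceCommunicationException {
    RequestRejectedException(String message) { super(message); }
}

// Case 3: the link dropped mid-transfer (tripped cable, sudden power-off).
class LinkLostException extends DeviceCommunicationException {
    LinkLostException(String message, Throwable cause) { super(message, cause); }
}
```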
I can think of all of the above as expected problems, because they happen very often in practice (in fact, marketing folks frequently call me to fix the "problem" in my software, only for us to find out that they didn't connect the cable to their laptop). So maybe exceptions should not be thrown, because otherwise the application programmer will have to catch them in plenty of places (and a lot of catch blocks are NOT nice to have, I believe).
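Then again, if every failure derives from one common base class, the application doesn't need a catch block per failure mode; a single handler near the UI might be enough. A sketch, assuming the hypothetical classes above plus a made-up `Device` interface:

```java
// 'Device' stands in for whatever handle my library would expose;
// readTemperature() is a made-up operation that can fail in all three ways.
interface Device {
    double readTemperature() throws DeviceCommunicationException;
}

class App {
    static double readTemperatureOrDefault(Device device) {
        try {
            return device.readTemperature();
        } catch (NoResponseException e) {
            // By far the most common case: nothing is plugged in.
            System.err.println("Check the cable, power and baud rate.");
        } catch (DeviceCommunicationException e) {
            // Everything else: rejected request, link lost mid-transfer.
            System.err.println("Communication failed: " + e.getMessage());
        }
        return Double.NaN; // sentinel so the application can carry on
    }
}
```

That would keep it down to one or two catch blocks per application-level operation, which doesn't seem so bad.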
On the other hand, I also tend to think that these are all errors which I somehow need to report to the application programmer, and exceptions seem to be the way to do that. If I don't use exceptions, I will have to report these problems through error codes or error enums (ugly, I know).
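For comparison, the error-code alternative I have in mind would look roughly like this (again, purely illustrative names):

```java
// Hypothetical status codes mirroring the three failure modes above.
enum CommStatus {
    OK,
    NO_RESPONSE,       // no cable, device off, wrong baud rate
    REQUEST_REJECTED,  // device couldn't authenticate the request
    LINK_LOST          // communication dropped mid-transfer
}

// Small wrapper so the value and the status travel together.
final class CommResult<T> {
    final CommStatus status;
    final T value; // null unless status == OK

    CommResult(CommStatus status, T value) {
        this.status = status;
        this.value = value;
    }
}
```

Every caller would then have to check `status` after every single call, which is exactly the ugliness I'm worried about.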
Which approach do you think I should take?