I'm in a Math class about coding theory and we have to do a project for our term mark.
My project is a client paired with an echo server, which shows what errors were introduced on the trip from the client to the server and back. It also tests different error-correcting schemes for efficiency and for how well they are suited to this task.
The coding isn't really a problem; I was able to make something that detects errors, asks for retransmission when it can't fix them, and so on.
The problem is that, so far, the only way I can introduce any kind of bit error is artificially, since the other layers of the network stack have their own error-detection and correction protocols.
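For reference, this is roughly how I'm introducing errors artificially at the moment (just a sketch; the function name is my own, only to illustrate the idea):

```python
def flip_bit(data: bytes, bit_index: int) -> bytes:
    """Flip a single bit in a byte string, i.e. corrupt a frame on purpose."""
    corrupted = bytearray(data)
    corrupted[bit_index // 8] ^= 1 << (bit_index % 8)  # XOR toggles exactly one bit
    return bytes(corrupted)

# Example: corrupt the 10th bit of the message before echoing it back.
message = b"hello, echo server"
print(flip_bit(message, 10))
```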
My question: Is there a way to get around this?
I have no idea how I would go about this or even where to begin.
Also, I know there are protocols I can't tinker with, so there's always going to be error correction happening in the background at those levels. But what I'd like is to be able to pretend that one of those layers wasn't doing its own checking, so that my application gets the chance to play that role.
If this can't be done, what are some good methods of simulating errors introduced during transmission? I was unable to find statistical information about the distribution of errors for even a simple example of a channel. Given that, I could continue with my current approach of having the server introduce errors into the message.
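The simplest thing I can think of is to flip each bit independently with some fixed probability (a binary symmetric channel); a rough sketch of what I mean is below, with the 1% crossover probability just a placeholder since I don't know what a realistic value would be:

```python
import random

def bsc(data: bytes, p: float, rng: random.Random) -> bytes:
    """Binary symmetric channel: flip each bit independently with probability p."""
    out = bytearray(data)
    for byte_index in range(len(out)):
        for bit in range(8):
            if rng.random() < p:
                out[byte_index] ^= 1 << bit
    return bytes(out)

# Example: pass the echoed message through a channel with 1% crossover probability.
message = b"hello, echo server"
noisy = bsc(message, 0.01, random.Random(42))
print(noisy)
```

But real channels presumably produce burst errors rather than uniformly spread ones, so if anyone can point me to realistic parameters or a better model, that would help a lot.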