I'm trying to remember how the math works for computing the remainder in the XOR (modulo-2) division used by Cyclic Redundancy Checks to verify the remainder bits of a network message.
I shouldn't have tossed that textbook.
This is easily done in code, but how is it worked out by hand?
I know it looks something like a standard division algorithm, but I can't remember where to go from there to get the remainder.
       _________
1010 | 101101000
Note: I did Google it, but wasn't able to find anywhere that mapped out the steps for working out the remainder.
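
For reference, this is roughly the sort of thing I mean by "easily done in code"; the function name and the string representation of the bits are just my own choices:

def crc_remainder(message_bits, divisor_bits):
    # Work on a mutable copy of the message bits.
    bits = list(message_bits)
    n = len(divisor_bits)
    # Slide across the message; whenever the current leading bit is 1,
    # XOR the divisor in at that position (modulo-2 "subtraction").
    for i in range(len(bits) - n + 1):
        if bits[i] == '1':
            for j in range(n):
                bits[i + j] = str(int(bits[i + j]) ^ int(divisor_bits[j]))
    # Whatever is left in the last n-1 positions is the remainder.
    return ''.join(bits[-(n - 1):])

# The example above: dividing 101101000 by 1010 leaves remainder 000.
print(crc_remainder('101101000', '1010'))

What I'm after is the equivalent of those XOR steps written out by hand, like long division on paper.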