Ok, so we've got some detail now - perhaps 10 GB of total (uncompressed) data transferred every 3 days. That's about ten transfers a month, so roughly 100 GB per month.
That's not actually a particularly sizeable amount of data these days. Whose bandwidth are you trying to save - yours, or your clients'?
Does the data compress readily? For raw binary data it's not uncommon to achieve 50% compression, and if the data has a lot of repeated patterns then 80%+ is possible.
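If you want a quick sense of how compressible your data actually is, gzip a representative sample and compare the sizes - a minimal sketch in Python (the file name is just a placeholder):

```python
import gzip

# Read a representative sample of your data (placeholder file name).
with open("sample.bin", "rb") as f:
    raw = f.read()

# Compress it and compare sizes to estimate the achievable savings.
compressed = gzip.compress(raw, compresslevel=6)
ratio = 1 - len(compressed) / len(raw)
print(f"Original: {len(raw)} bytes, compressed: {len(compressed)} bytes "
      f"({ratio:.0%} saved)")
```

If gzip gets you 80%, your 100 GB/month problem becomes a 20 GB/month problem with almost no engineering effort.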
That said, if you really do need a system that transfers just the changes, my thoughts are (there's a rough sketch after the list):
- make sure you've got a well-defined primary key field, and use it to identify each record
- record a timestamp for each record to say when it last changed
- have each client tell you the timestamp of the last change it knows of, so you can calculate the deltas
- ensure that full downloads are possible too, in case clients get out of sync
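Here's a minimal sketch of that scheme using Python and sqlite3 - the table and column names (`records`, `id`, `payload`, `last_modified`) are assumptions, so adapt them to your actual schema:

```python
import sqlite3
from typing import Optional

def get_changes(conn: sqlite3.Connection, client_last_sync: Optional[str]):
    """Return rows changed since the client's last known timestamp.

    A client that has no timestamp (new, or known to be out of sync)
    gets the full-download fallback instead.
    """
    if client_last_sync is None:
        # Full download: new or out-of-sync clients start fresh.
        return conn.execute(
            "SELECT id, payload, last_modified FROM records"
        ).fetchall()
    # Delta: only records modified after the client's last sync point.
    return conn.execute(
        "SELECT id, payload, last_modified FROM records"
        " WHERE last_modified > ?",
        (client_last_sync,),
    ).fetchall()
```

The client stores the largest `last_modified` value it receives and sends it back with its next request; if that value can't be trusted (say, after a restore from backup), the client just passes `None` and gets the full download again.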