I have been struggling with how best to prevent duplicate posting of data.
Here are the steps:
1. Client submits data with a unique guid (client-generated guid, guaranteed unique)
2. Server-side software verifies the client guid doesn't already exist in the DB
3. Begins a transaction
4. Processes the data (takes between 1 and 20 seconds, depending on the payload)
5. Commits the transaction
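The steps above can be sketched as follows. This is a minimal Python sketch, using an in-memory set and list in place of the real DB and threads in place of concurrent requests; all names (`handle_submission`, `processed_guids`, etc.) are hypothetical stand-ins for my actual server code. Run as-is, it reproduces the duplicate processing described in the failure scenario below:

```python
import threading
import time

# In-memory stand-ins for the DB (hypothetical names, for illustration only)
processed_guids = set()   # guids visible in the DB after commit
results = []              # each entry represents one committed "transaction"

def handle_submission(guid, payload):
    # Step 2: check the guid doesn't exist yet.
    # Nothing stops a second request from passing this check
    # while the first is still between steps 3 and 5.
    if guid in processed_guids:
        return "duplicate rejected"
    # Steps 3-4: begin transaction and process (simulated 0.1 s of work)
    time.sleep(0.1)
    # Step 5: commit -- only now does the guid become visible to the check
    processed_guids.add(guid)
    results.append((guid, payload))
    return "committed"

# Two submissions with the same guid arriving inside the race window:
t1 = threading.Thread(target=handle_submission, args=("1", "payload"))
t2 = threading.Thread(target=handle_submission, args=("1", "payload"))
t1.start(); t2.start()
t1.join(); t2.join()
print(len(results))  # 2 -- the same payload was processed twice
```

Because the existence check and the commit are not atomic, both requests pass step 2 before either reaches step 5.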
Here is the failure scenario: the client submits data with guid "1", then resubmits with guid "1" before step (5) completes for the original submission. The data ends up being processed twice.
What is the best design pattern to prevent this without using semaphores or blocking? The user should still be able to resubmit in case the first submission fails for some reason (a hardware issue on the server side, etc.).
Thanks!