We have a set of actions or "Jobs" which we'd like to happen one at a time (not concurrently), i.e. Job A can't run while Job B is running, and you can't have two Job C instances running at the same time.
If a thread attempts to run a job while another job is in progress, it should get an error. We shouldn't just queue up the request.
Some jobs happen asynchronously: the user makes a request, we return a status message and an id, and then process the job asynchronously on the server.
We're looking for advice on how to handle this scenario.
One option is to lock on a shared object:
public class Global {
    public static final Object lock = new Object();
}

public class JobA {
    public void go() {
        synchronized (Global.lock) {
            // Do A stuff
        }
    }
}

public class JobB {
    public void go() {
        synchronized (Global.lock) {
            // Do B stuff
        }
    }
}
Problems with this:
- Concurrent requests would queue up waiting on the lock instead of getting back the error message we want to return.
- If JobA wants to write a message to a queue to process asynchronously, how can we be assured that when the message is read from the queue by another method in JobA, that method will be able to acquire the Global.lock lock before another instance of JobA starts?
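To make the fail-fast requirement concrete, here is a minimal sketch of what we mean, using java.util.concurrent.locks.ReentrantLock.tryLock(), which returns false immediately instead of blocking (the class and method names are just for illustration, and this still doesn't answer the asynchronous hand-off question above):

```java
import java.util.concurrent.locks.ReentrantLock;

public class Jobs {
    // Single lock shared by all job types, mirroring Global.lock.
    private static final ReentrantLock LOCK = new ReentrantLock();

    public static String runJobA() {
        // tryLock() returns false immediately if the lock is held,
        // so the caller gets an error instead of queuing up.
        if (!LOCK.tryLock()) {
            return "ERROR: another job is already running";
        }
        try {
            // Do A stuff
            return "OK";
        } finally {
            LOCK.unlock();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        // Simulate another job holding the lock for a while.
        Thread holder = new Thread(() -> {
            LOCK.lock();
            try {
                Thread.sleep(200);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } finally {
                LOCK.unlock();
            }
        });
        holder.start();
        Thread.sleep(50); // let the holder acquire the lock first
        System.out.println(runJobA()); // rejected while the other job runs
        holder.join();
        System.out.println(runJobA()); // succeeds once the lock is free
    }
}
```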
Any suggestions on a better approach?