Why not use hypermedia to constrain access?
Use something like

    POST /A

to initiate the first process. Then, when it is complete, the results should provide a link to follow to initiate the second process:
    <ResultsOfProcessA>
        <Status>Complete</Status>
        <ProcessB href="/B"/>
    </ResultsOfProcessA>
Follow the link to initiate the second process,

    POST /B

and repeat for part C.
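To make this concrete, here is a minimal client sketch in Python (using the requests library; the base URL and the XML parsing details are assumptions for illustration) that follows the advertised link instead of constructing /B by hand:

    # Minimal client sketch: initiate process A, then follow whatever link
    # the response advertises rather than hard-coding the next URL.
    import requests
    import xml.etree.ElementTree as ET

    BASE = "https://example.com"  # assumed base URL, for illustration only

    def run_step(path):
        """POST to the given path and return the parsed XML result."""
        response = requests.post(BASE + path)
        response.raise_for_status()
        return ET.fromstring(response.text)

    # Start process A; the server decides what may happen next.
    result_a = run_step("/A")

    # Only proceed if the representation actually advertises a ProcessB link.
    if result_a.findtext("Status") == "Complete":
        link_b = result_a.find("ProcessB")
        if link_b is not None:
            result_b = run_step(link_b.get("href"))
            # ...and likewise follow a ProcessC link from result_b, if present.

The point of the sketch is that the client never decides on its own that /B is next; it only acts on links the server has chosen to include.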
Admittedly, a badly behaved client could cache the link to step B and attempt to reuse it in some future request to circumvent the sequence. However, it would not be too difficult to assign some kind of token when doing step A and require that the token be passed to steps B and C, to prevent the client from constructing the URL manually.
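As a rough sketch of the token idea (Flask and the token-in-the-link scheme are assumptions for illustration, not the only possible design): step A issues a one-time token and embeds it in the ProcessB link, and step B refuses to run without it.

    import secrets
    from flask import Flask, abort, request

    app = Flask(__name__)
    issued_tokens = set()  # in-memory for the sketch; a real service would persist this

    @app.route("/A", methods=["POST"])
    def run_process_a():
        # ... run process A here ...
        token = secrets.token_urlsafe(16)
        issued_tokens.add(token)
        # Advertise the next step, embedding the token in the link so the
        # client cannot simply construct /B by hand.
        body = ('<ResultsOfProcessA><Status>Complete</Status>'
                f'<ProcessB href="/B?token={token}"/></ResultsOfProcessA>')
        return body, 200, {"Content-Type": "application/xml"}

    @app.route("/B", methods=["POST"])
    def run_process_b():
        token = request.args.get("token")
        if token not in issued_tokens:
            abort(409)  # B is only allowed after A has issued a token
        issued_tokens.discard(token)  # one-time use
        # ... run process B here, and issue a token for C in the same way ...
        return ("<ResultsOfProcessB><Status>Complete</Status></ResultsOfProcessB>",
                200, {"Content-Type": "application/xml"})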
Reading your comments further, it seems that you have a situation where A could be run either before or after B. In that case I would suggest creating a resource D that represents the status of the entire set of processes (A, B, and C). When a client retrieves D, it is presented with the URIs it is allowed to follow. Once a client has initiated process A, the D resource should remove the B link for the duration of the processing, and the opposite should occur when B is initiated before A.

The other advantage of this technique is that it is obvious whether A or B has already been run for the day, as the status can be displayed in D. Once A and B have both been run, D can contain a link for C.
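One possible sketch of how D might decide which links to expose (again using Flask, with an in-memory state record; the names and state values are illustrative assumptions):

    from flask import Flask

    app = Flask(__name__)

    # Hypothetical record of today's state for each process:
    # None = not yet run, "running", or "complete". The /A and /B handlers
    # would flip these values as the processes start and finish.
    state = {"A": None, "B": None}

    @app.route("/D", methods=["GET"])
    def get_daily_status():
        links = []
        # Offer A only if it has not run and B is not currently running.
        if state["A"] is None and state["B"] != "running":
            links.append('<ProcessA href="/A"/>')
        # Offer B only if it has not run and A is not currently running.
        if state["B"] is None and state["A"] != "running":
            links.append('<ProcessB href="/B"/>')
        # C only appears once both A and B have completed.
        if state["A"] == "complete" and state["B"] == "complete":
            links.append('<ProcessC href="/C"/>')
        status_a = state["A"] or "NotRun"
        status_b = state["B"] or "NotRun"
        body = (f'<DailyStatus a="{status_a}" b="{status_b}">'
                + "".join(links) + "</DailyStatus>")
        return body, 200, {"Content-Type": "application/xml"}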
The hypermedia is not a 100% foolproof solution, because you could have two clients with the same copy of D: both might think that process A has not been run, and both could attempt to run it simultaneously. This could be addressed by putting some kind of "Last-Modified" timestamp on D, updating it whenever the status of D changes, and requiring clients to send that timestamp back when they initiate a process, so that the later of two conflicting requests can be denied. Based on the description of your scenario, this seems to be more of an edge case, and the hypermedia would catch most attempts to run processes in parallel.
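A rough sketch of that guard, assuming the timestamp is carried in the standard Last-Modified / If-Unmodified-Since headers (one reasonable reading of the idea, not the only one):

    from datetime import datetime, timezone
    from email.utils import format_datetime, parsedate_to_datetime
    from flask import Flask, abort, request

    app = Flask(__name__)
    d_last_modified = datetime.now(timezone.utc).replace(microsecond=0)

    @app.route("/D", methods=["GET"])
    def get_daily_status():
        # ... build the representation of D as before ...
        return "<DailyStatus/>", 200, {
            "Content-Type": "application/xml",
            "Last-Modified": format_datetime(d_last_modified, usegmt=True),
        }

    @app.route("/A", methods=["POST"])
    def run_process_a():
        global d_last_modified
        header = request.headers.get("If-Unmodified-Since")
        if header is None or parsedate_to_datetime(header) < d_last_modified:
            # The client's copy of D is stale (or it sent no precondition):
            # someone else changed the status first, so deny this attempt.
            abort(412)
        # ... start process A ...
        d_last_modified = datetime.now(timezone.utc).replace(microsecond=0)
        return "", 202

The client that lost the race gets a 412, re-fetches D, and sees that A is already running, so the double-start is avoided in most cases.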