Within my Rails application, I'd like to generate requests that behave identically to "genuine" HTTP requests.
For a somewhat contrived example, suppose I were creating a system that could batch incoming HTTP requests for later processing. The interface for it would be something like:
- Create a new batch resource via the usual CRUD methodology (POST, then receive the location of the newly created resource).
- Update the batch resource by sending it URLs, HTTP methods, and data to be added to the collection of requests it will later perform in bulk.
- "Process" the batch resource: iterate over its collection of requests (each represented by a URL, HTTP method, and a set of data) and somehow tell Rails to process them exactly as it would have were they coming in as normal, "non-batched" requests. (A possible routes sketch for this interface follows the list.)
It seems to me that there are two important pieces of work that need to happen to make this functional:
First, the incoming requests need to be saved for later. This could simply be a matter of recording various aspects of the incoming request--path, method, data, headers, etc.--that are already exposed on the request object within a controller. It would be nice if there were a more "automatic" way of handling this--perhaps something closer to object marshaling or serialization--but the brute-force approach of recording individual attributes should work as well.
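For the brute-force version, something like this controller-level helper is the kind of thing I have in mind. `BatchedRequest` is a hypothetical model with `path`, `http_method`, `body`, and `headers` columns; everything it records comes straight off the standard `request` object:

```ruby
# A minimal sketch of brute-force capture, run inside a controller where
# the request object is in scope.
def capture_request(batch)
  batch.batched_requests.create!(
    path:        request.fullpath,        # path plus query string
    http_method: request.request_method,  # "GET", "POST", ...
    body:        request.raw_post,        # unparsed request body
    # request.env mixes genuine HTTP headers with Rack internals; keeping
    # only the HTTP_* keys leaves something safely serializable.
    headers:     request.env.select { |k, _| k.start_with?("HTTP_") }
  )
end
```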
Second, the saved requests need to be re-injected into the Rails application at a later time and go through the same process a normal HTTP request goes through: routing, controllers, views, etc. I'd like to capture the response as a string, much as an HTTP client would have seen it, and I'd like to do this using Rails' internal machinery rather than using an HTTP library to have the application literally make a new request to itself.
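Something along these lines seems plausible for the replay side, relying on the fact that `Rails.application` is itself a Rack app and that `Rack::MockRequest.env_for` can fabricate a Rack env without any socket being involved. The `stored` object here is assumed to carry the attributes captured above:

```ruby
require "rack/mock"  # ships with the rack gem Rails already depends on

# Replay one stored request through the full stack (routing, controllers,
# views) and return the status, headers, and body string an HTTP client
# would have seen.
def replay(stored)
  env = Rack::MockRequest.env_for(
    stored.path,
    method: stored.http_method,
    input:  stored.body.to_s
  )
  env.merge!(stored.headers || {})  # restore the captured HTTP_* headers

  # Rails.application is a Rack app, so #call runs the normal request cycle.
  status, headers, body = Rails.application.call(env)

  # Collect the body into a single string, per the Rack body protocol.
  response_text = String.new
  body.each { |chunk| response_text << chunk }
  body.close if body.respond_to?(:close)

  [status, headers, response_text]
end
```

I believe `ActionDispatch::Integration::Session.new(Rails.application)`, the same machinery integration tests use, would be a higher-level alternative here, but the raw Rack call makes it clear that no real HTTP is involved.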
Thoughts?