What's a good way to download HTTP URLs (e.g. http://0.0.0.0/foo.htm) in C++ on Linux? I strongly prefer something asynchronous. My program will have an event loop that repeatedly initiates multiple (very small) downloads and acts on them when they finish (either by polling or by being notified somehow). I would rather not have to spawn multiple threads/processes to accomplish this; that shouldn't be necessary.

Should I look into libraries like libcurl? I suppose I could implement it manually with non-blocking TCP sockets and select() calls, but that would likely be less convenient.
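
For context, the manual approach might look something like the sketch below (placeholder loopback address and path; DNS, error handling, and partial writes are all ignored), which is roughly the plumbing I'd prefer a library to hide:

    // One non-blocking socket driven by select(). Placeholder IP/path; no
    // DNS, no error handling, no HTTP parsing, just the raw socket mechanics.
    #include <sys/socket.h>
    #include <sys/select.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <cstdio>

    int main()
    {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        fcntl(fd, F_SETFL, O_NONBLOCK);

        sockaddr_in addr = {};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(80);
        inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);      // placeholder address

        connect(fd, (sockaddr*)&addr, sizeof addr);           // returns EINPROGRESS

        // Wait until the socket becomes writable, i.e. the connect finished.
        fd_set wfds; FD_ZERO(&wfds); FD_SET(fd, &wfds);
        select(fd + 1, 0, &wfds, 0, 0);

        const char req[] = "GET /foo.htm HTTP/1.0\r\nHost: 127.0.0.1\r\n\r\n";
        write(fd, req, sizeof req - 1);

        // Poll for readability and collect the response until the server closes.
        char buf[4096];
        for (;;) {
            fd_set rfds; FD_ZERO(&rfds); FD_SET(fd, &rfds);
            select(fd + 1, &rfds, 0, 0, 0);
            ssize_t n = read(fd, buf, sizeof buf);
            if (n <= 0) break;
            fwrite(buf, 1, n, stdout);
        }
        close(fd);
    }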

+4  A: 

Libcurl is the way to go. See http://curlpp.org for C++ bindings and an excellent set of tutorials.

Matt
Thanks. However, I'm not seeing anything about the multi interface (needed for the asynchronous stuff I was talking about) in the documentation. It's implemented in curlpp, though. I think I'll just stick to the C API to stay on the safe side.
Joey Adams
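
A minimal sketch of the curl multi interface in plain C/C++ might look roughly like this (placeholder URL, error checks trimmed); the select() call is where your own event loop would go, and more easy handles can be added at any time for the other small downloads:

    #include <curl/curl.h>
    #include <sys/select.h>
    #include <string>
    #include <iostream>

    // Append each chunk libcurl hands us to a std::string.
    static size_t write_cb(char* data, size_t size, size_t nmemb, void* userp)
    {
        static_cast<std::string*>(userp)->append(data, size * nmemb);
        return size * nmemb;
    }

    int main()
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURLM* multi = curl_multi_init();

        std::string body;
        CURL* easy = curl_easy_init();
        curl_easy_setopt(easy, CURLOPT_URL, "http://127.0.0.1/foo.htm");  // placeholder
        curl_easy_setopt(easy, CURLOPT_WRITEFUNCTION, write_cb);
        curl_easy_setopt(easy, CURLOPT_WRITEDATA, &body);
        curl_multi_add_handle(multi, easy);            // further handles can be added later

        int still_running = 0;
        curl_multi_perform(multi, &still_running);

        while (still_running) {
            fd_set r, w, e;
            FD_ZERO(&r); FD_ZERO(&w); FD_ZERO(&e);
            int maxfd = -1;
            long timeout_ms = -1;
            curl_multi_timeout(multi, &timeout_ms);
            if (timeout_ms < 0) timeout_ms = 100;
            timeval tv;
            tv.tv_sec  = timeout_ms / 1000;
            tv.tv_usec = (timeout_ms % 1000) * 1000;
            curl_multi_fdset(multi, &r, &w, &e, &maxfd);

            // This select() would normally be your program's own event loop.
            select(maxfd + 1, &r, &w, &e, &tv);
            curl_multi_perform(multi, &still_running);

            // Check which transfers have finished.
            int msgs_left = 0;
            while (CURLMsg* msg = curl_multi_info_read(multi, &msgs_left)) {
                if (msg->msg == CURLMSG_DONE)
                    std::cout << "done, " << body.size() << " bytes\n";
            }
        }

        curl_multi_remove_handle(multi, easy);
        curl_easy_cleanup(easy);
        curl_multi_cleanup(multi);
        curl_global_cleanup();
    }
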
+1  A: 

Have you considered Qt's network module? It provides classes for asynchronous downloads, for example QNetworkAccessManager.

Donotalo
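
A minimal sketch of that approach, assuming Qt 4.4+ and linking against QtNetwork (the URL is a placeholder and error handling is omitted):

    #include <QCoreApplication>
    #include <QNetworkAccessManager>
    #include <QNetworkRequest>
    #include <QNetworkReply>
    #include <QUrl>
    #include <QDebug>

    int main(int argc, char* argv[])
    {
        QCoreApplication app(argc, argv);

        QNetworkAccessManager manager;
        // finished(QNetworkReply*) is emitted when the non-blocking request completes.
        QObject::connect(&manager, SIGNAL(finished(QNetworkReply*)),
                         &app, SLOT(quit()));

        QNetworkReply* reply =
            manager.get(QNetworkRequest(QUrl("http://127.0.0.1/foo.htm")));  // placeholder

        app.exec();                    // Qt's event loop drives the transfer
        qDebug() << reply->readAll();  // body is available once finished() fired
    }

In a real program you would keep the event loop running and handle each reply in a slot instead of quitting.
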
+5  A: 

You can use boost::asio to perform asynchronous I/O. The Boost.Asio documentation includes an example of an async HTTP client.

Joakim Karlsson
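
A condensed sketch along the lines of that example, in the classic io_service/boost::bind style (host and path are placeholders, error handling is trimmed):

    #include <boost/asio.hpp>
    #include <boost/bind.hpp>
    #include <iostream>
    #include <ostream>
    #include <string>

    using boost::asio::ip::tcp;

    class http_get
    {
    public:
        http_get(boost::asio::io_service& io, const std::string& host, const std::string& path)
            : resolver_(io), socket_(io)
        {
            std::ostream req(&request_);
            req << "GET " << path << " HTTP/1.0\r\n"
                << "Host: " << host << "\r\n"
                << "Connection: close\r\n\r\n";

            resolver_.async_resolve(tcp::resolver::query(host, "http"),
                boost::bind(&http_get::on_resolve, this,
                            boost::asio::placeholders::error,
                            boost::asio::placeholders::iterator));
        }

    private:
        void on_resolve(const boost::system::error_code& ec, tcp::resolver::iterator it)
        {
            if (ec) return;
            socket_.async_connect(*it,                   // just try the first endpoint
                boost::bind(&http_get::on_connect, this,
                            boost::asio::placeholders::error));
        }

        void on_connect(const boost::system::error_code& ec)
        {
            if (ec) return;
            boost::asio::async_write(socket_, request_,
                boost::bind(&http_get::on_write, this,
                            boost::asio::placeholders::error));
        }

        void on_write(const boost::system::error_code& ec)
        {
            if (ec) return;
            // Read everything until the server closes the connection (HTTP/1.0).
            boost::asio::async_read(socket_, response_, boost::asio::transfer_all(),
                boost::bind(&http_get::on_read, this,
                            boost::asio::placeholders::error));
        }

        void on_read(const boost::system::error_code& ec)
        {
            if (ec && ec != boost::asio::error::eof) return;  // eof is the normal end here
            std::cout << &response_;                          // headers + body
        }

        tcp::resolver resolver_;
        tcp::socket socket_;
        boost::asio::streambuf request_;
        boost::asio::streambuf response_;
    };

    int main()
    {
        boost::asio::io_service io;
        http_get get(io, "example.com", "/");   // placeholder URL
        io.run();                               // single-threaded event loop
    }

The single io.run() call is the event loop; several http_get objects can share it, so many small downloads proceed concurrently on one thread.
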
A: 

Isn't it possible without using any libraries? Could it be done with plain C++ socket programming?

Sayan Ghosh
Read my answer below.
Ankur Gupta
A: 

A QThread, when run, can have its own event loop. Inside the QThread you can create an instance of QHttp, and since QHttp uses the Qt event loop to do its work, you get asynchronous HTTP calls without blocking the main thread. Also note that inter-thread communication in Qt is very easy.

Head straight to http://doc.qt.nokia.com and look at the classes' documentation for more detail.

Ankur Gupta
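
A rough sketch of that idea, assuming Qt 4's QHttp from the QtNetwork module (since superseded by QNetworkAccessManager); host and path are placeholders, and run() uses a local QEventLoop as the thread's own event loop:

    #include <QThread>
    #include <QEventLoop>
    #include <QHttp>
    #include <QDebug>

    class DownloadThread : public QThread
    {
    protected:
        void run()
        {
            QHttp http("127.0.0.1");                        // placeholder host
            QEventLoop loop;                                // this thread's own event loop
            QObject::connect(&http, SIGNAL(done(bool)), &loop, SLOT(quit()));
            http.get("/foo.htm");                           // placeholder path
            loop.exec();                                    // returns when done(bool) fires
            qDebug() << http.readAll();                     // hand the data back via signals
        }
    };

Start it from the main thread with DownloadThread t; t.start(); and use queued signal/slot connections to deliver the result back.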