NOTE: I fixed the user-agent problem, and I also added an extra byte to match the content length. No luck, however.

I don't understand this. I ran the code below, and the resulting JSON string said the link is expired (meaning invalid).

However, the curl commands do the exact same thing and work: I either get the expected string with the URL, or it says I need to wait (from a few seconds up to a minute).

Why? What's the difference between the two? It's maddening that they behave differently (it has cost me HOURS of debugging).

NOTE: the only cookie the site requires is SID (tested); it holds your session ID. The first POST activates the session and the second request checks its status via the returned JSON string. Feel free to restrict the CookieContainer to SID only if you like.

WARNING: you may want to change SID to a different value so other people aren't activating your session. You may also want to hit the second URL first to confirm the session ID is unused and reports expired/invalid before you start.

Additional note: if you do the POST with curl or in your browser, you can stick the SID into a .NET CookieContainer and the second command will work. But issuing the first command (the one with POST data) from .NET does not. I have used this PostData function for many other sites that require POST, and so far it has worked. Obviously the Method matters, and I can see it is indeed POST when issuing the first command.

    static void Main(string[] args)
    {
        var cookie = new CookieContainer();
        PostData("http://uploading.com/files/get/37e36ed8/", "action=second_page&file_id=9134949&code=37e36ed8", cookie);
        Thread.Sleep(4000);
        var res = PostData("http://uploading.com/files/get/?JsHttpRequest=12719362769080-xml&action=get_link&file_id=9134949&code=37e36ed8&pass=undefined",
            null/*this makes it GET*/, cookie);

        Console.WriteLine(res);
        /*
        curl -b "SID=37468830" -A "DUMMY_User_Aggent" -d "action=second_page&file_id=9134949&code=37e36ed8" "http://uploading.com/files/get/37e36ed8/"
        curl -b "SID=37468830" -A "DUMMY_User_Aggent" "http://uploading.com/files/get/?JsHttpRequest=12719362769080-xml&action=get_link&file_id=9134949&code=37e36ed8&pass=undefined"
        */
    }
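For reference, the PostData helper itself isn't shown above. A minimal sketch of what such a helper typically looks like with HttpWebRequest follows; the names and the convention that a null body means GET are my assumptions, not the asker's actual code:

```csharp
using System.IO;
using System.Net;
using System.Text;

// Hypothetical sketch of the PostData helper referenced in the question.
// Assumptions: POST with a form-encoded body when postData is non-null,
// plain GET otherwise; the shared CookieContainer carries the SID cookie.
static class Http
{
    public static string PostData(string url, string postData, CookieContainer cookies)
    {
        var req = (HttpWebRequest)WebRequest.Create(url);
        req.CookieContainer = cookies;
        req.UserAgent = "DUMMY_User_Aggent";   // match the -A value used with curl

        if (postData != null)
        {
            req.Method = "POST";
            req.ContentType = "application/x-www-form-urlencoded";
            byte[] body = Encoding.ASCII.GetBytes(postData);
            req.ContentLength = body.Length;
            using (var s = req.GetRequestStream())
                s.Write(body, 0, body.Length);
        }

        // Read the whole response body back as a string (the JSON result).
        using (var resp = req.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
            return reader.ReadToEnd();
    }
}
```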

I am fed up with HttpWebRequest and the .NET HTTP clients. I did it using curl (after learning how to set it up for .NET and refreshing my memory). It was simple and quick, and it WORKED. No stupid problems.

        Curl.GlobalInit((int)CURLinitFlag.CURL_GLOBAL_ALL);
        var curl = new Easy();
        curl.SetOpt(CURLoption.CURLOPT_USERAGENT, "DUMMY_User_Aggent");
        curl.SetOpt(CURLoption.CURLOPT_COOKIE, "SID=4385743");


        curl.SetOpt(CURLoption.CURLOPT_POSTFIELDS, "action=second_page&file_id=9134949&code=37e36ed8");
        curl.SetOpt(CURLoption.CURLOPT_URL, "http://uploading.com/files/get/37e36ed8/");
        curl.Perform();


        curl.SetOpt(CURLoption.CURLOPT_POST, 0);   // switch the handle back to GET (CURLOPT_HTTPGET is libcurl's documented way to do this)
        Easy.WriteFunction wf = new Easy.WriteFunction(OnWriteData);
        curl.SetOpt(CURLoption.CURLOPT_WRITEFUNCTION, wf);
        curl.SetOpt(CURLoption.CURLOPT_URL, "http://uploading.com/files/get/?JsHttpRequest=12719362769080-xml&action=get_link&file_id=9134949&code=37e36ed8&pass=undefined");
        curl.Perform();

        curl.Cleanup();
        Curl.GlobalCleanup();
A: 

Try using the WebClient class.

It will make your code MUCH shorter, too.

Foxfire
How do I set the POST data and the cookies?
Foxfire
That's pretty good. Can I set cookies? Because I need that in both calls; if I could, I would always use this.
Never mind, a small wrapper solves it: http://couldbedone.blogspot.com/2007/08/webclient-handling-cookies.html. But that's nearly the same as the function I already have. Good to know, though =)
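The wrapper approach from that link boils down to subclassing WebClient and overriding GetWebRequest so every request gets the same CookieContainer. A minimal sketch (class name is mine):

```csharp
using System;
using System.Net;

// Cookie-aware WebClient: attaches one shared CookieContainer to every
// request, so cookies set by one call (e.g. SID) persist to the next.
class CookieWebClient : WebClient
{
    private readonly CookieContainer cookies = new CookieContainer();
    public CookieContainer Cookies { get { return cookies; } }

    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        var http = request as HttpWebRequest;
        if (http != null)
            http.CookieContainer = cookies;
        return request;
    }
}
```

Usage would mirror the two-step flow above: seed `wc.Cookies` with the SID cookie, call `UploadString` for the POST, then `DownloadString` for the status check.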
A: 

Apparently you're trying to download something from uploading.com with a program, which is something they don't want you to do.
If you aren't sending a User-Agent with curl, they have no way of knowing you're automating it, while .NET will probably send a blacklisted User-Agent.

tstenner
I fixed the user-agent problem. If they didn't want me downloading with a program, they would likely have blacklisted curl, which they haven't.
Also, I tried -A "" so no agent is sent. It worked. So, what gives?
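One way to settle "what's the difference between the two" is to point both clients at a tiny local listener and compare the raw headers each one sends. This is a hypothetical debugging aid, not part of the original code; the usual suspects with HttpWebRequest are a missing User-Agent and the `Expect: 100-continue` header that .NET adds to POSTs by default (curl does not):

```csharp
using System;
using System.Net;
using System.Text;
using System.Threading;

// Echo the headers of one incoming request so you can diff what
// HttpWebRequest sends against what "curl -v" shows.
class HeaderEcho
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/");
        listener.Start();

        // Fire the request under test from another thread.
        ThreadPool.QueueUserWorkItem(_ =>
        {
            var req = (HttpWebRequest)WebRequest.Create("http://localhost:8080/");
            req.Method = "POST";
            req.ContentType = "application/x-www-form-urlencoded";
            byte[] body = Encoding.ASCII.GetBytes("action=second_page");
            req.ContentLength = body.Length;
            using (var s = req.GetRequestStream())
                s.Write(body, 0, body.Length);
            req.GetResponse().Close();
        });

        // Print every header the client actually sent.
        var ctx = listener.GetContext();
        foreach (string key in ctx.Request.Headers)
            Console.WriteLine(key + ": " + ctx.Request.Headers[key]);
        ctx.Response.Close();
        listener.Stop();
    }
}
```

If `Expect: 100-continue` shows up in the diff, `ServicePointManager.Expect100Continue = false;` is worth trying; some servers mishandle that header.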