views: 409
answers: 8
I just ran into something weird. I have two JSON arrays which hold different data. Usually I only need data from one of them, but when I make two AJAX calls to get data from both arrays, I end up with the exact same data from both calls.

Say array 1 holds JSON data on users, and array 2 holds JSON data on houses, and I want to get data from both arrays: (PS! I cut out the url and type to save a couple of lines.)

$.ajax({
   async:false,
   data:"type=users&id=3,5,6",
   success: function(data) {
      data = JSON.parse(data);
      alert(data.length) //will alert 3 as expected
   }
});

Then I make the second call to get some houses in there as well:

$.ajax({
   async:false,
   data:"type=houses&id=2,4",
   success: function(data) {
      data = JSON.parse(data);
      alert(data.length) //alerts 3 as well...
   }
});

When I look at the params and response with Firebug, I can see that the params are correct, but the response is wrong.

In my PHP I even tried just echoing back whatever comes in:

echo $_GET['id'] . ", " . $_GET['type'];

Which made the request look exactly the same on both calls... If I put an alert in between the AJAX calls, I get the expected result (since the system waits). But I thought making them both synchronous would be enough to keep the calls from colliding..?

edit: I've tried creating a copy of the PHP file that is called from the AJAX function, and pointing the two calls at separate files, which makes everything work as expected. So I'm fairly sure there's nothing wrong with the JavaScript.

more edit: If I remove the parameters from the second AJAX call, I still get the same result. Looking at the request with Firebug I can see that the params list is empty, but the response is still identical...

even more edit: Looking around in Firebug, I can see there is a header called Connection with the value keep-alive, and then a header called Keep-Alive with the value 300. I'm guessing this might have something to do with it? Can't find anything on it in the jQuery docs, though...

source code: I've made a small test case, which reproduces the problem:

PHP

echo $_GET['test'];
die();

HTML

<script>
    $(document).ready(function() {
      $.ajax({
        type:"get",
        url:"bugtest.php",
        data="test=hello",
        success: function(data) {
          $("output").append(data);
        }
      });
      $.ajax({
        type:"get",
        url:"bugtest.php",
        data="test=world!",
        success: function(data) {
          $("output").append(" "+data);
        }
      });
    });
  </script>
  <h1>AJAX bug in Aptana's PHP server?</h1>
  <output></output>

That's all I have, and it does the same thing: Instead of getting hello world! like I'd expect, I get hello hello...

+5  A: 

Did you try adding a timestamp to the URL, just to avoid caching?

 data:"type=houses&id=2,4&"+timestamp

UPDATE:

Or just try cache:false in the $.ajax options.
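For illustration, a minimal sketch of the cache:false suggestion, assuming a placeholder url ("data.php") and type, since the question omits them:

$.ajax({
   type:"get",
   url:"data.php",  // placeholder url
   cache:false,     // jQuery appends a "_=<timestamp>" parameter to GET requests
   data:"type=houses&id=2,4",
   success: function(data) {
      data = JSON.parse(data);
      alert(data.length);
   }
});

Doing the same thing by hand would be data:"type=houses&id=2,4&_=" + new Date().getTime().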

You're using async:false.

Async:

By default, all requests are sent asynchronously (i.e. this is set to true by default). If you need synchronous requests, set this option to false. Note that synchronous requests may temporarily lock the browser, disabling any actions while the request is active.

Maybe it's because of this temporary blocking... you could try setting it to true.

Wbdvlpr
Would I have to do something with that timestamp in PHP, then?
peirix
No, I know the URL params are different, but I just wanted to try. Did you try print_r($_REQUEST)?
Wbdvlpr
Yup. It will produce the same error. It will just continue to think I passed in users and 3 ids. `cache:false` didn't help either...
peirix
OK .. please try async:true
Wbdvlpr
That will of course produce the same error, which kinda makes sense. It will have two calls over each other, so the params are likely to get messed up over the wire.
peirix
HTTP is sent over TCP, so messing up 'over the wire' just won't happen. If it's stuffing up, it'll be either on the client or server, and from the description, I'd presume JavaScript - PHP page executions don't have much room to influence one another.
Mead
+6  A: 

Hi,

First of all, synchronous AJAX calls are fundamentally bad on a live system. Testing is OK, but don't do this at home, kids! Why? Because if the server doesn't answer in a second, or at all (which is quite probable), the browser tab becomes non-responsive. In the worst case (old Mozillas) the whole browser will freeze.

</teaching>

  • Caching should not be the problem, since the URLs are different when you send GET requests (POSTs are most likely not cached at all).

  • Your alert() test tells you that it is not PHP's fault, because it sends back everything as requested.

  • So we're left with JavaScript:

    • If you had hand-written the code, I'd guess that you were doing a wrong check on the request's onreadystatechange (that is, the function handling the returned data checks the wrong request object; see the sketch just below this list).

    • Can you tell us which library you use? Looks like Prototype to me.
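A minimal sketch of the kind of mistake meant in that bullet, with hand-written XHR (not the asker's code; the url and the load function are made up): one shared request variable makes every callback read from whichever request was started last.

var xhr; // one shared request object: the classic mistake

function load(params, handle) {
   xhr = new XMLHttpRequest();           // overwrites the previous request
   xhr.open("GET", "data.php?" + params, true);
   xhr.onreadystatechange = function() {
      if (xhr.readyState === 4) {        // checks the shared variable, not the request this handler belongs to
         handle(xhr.responseText);
      }
   };
   xhr.send(null);
}

load("type=users&id=3,5,6", function(text) { alert(text); });
load("type=houses&id=2,4", function(text) { alert(text); });
// depending on timing, both callbacks can end up reading the houses response,
// or the first callback may never see its own data at all

jQuery creates a fresh request object for every $.ajax call, so this exact mistake shouldn't apply to the code in the question, but it's the first thing to rule out with hand-written code.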

Cheers,

Boldewyn
Yeah, this is for a prototype setup, so everything here is sorta hacked together :p I'm using jQuery, but I'm not so sure JS is to blame here. Firebug tells me that the params sent in are correct, but PHP is reading the old params for some reason...
peirix
Looks like jQuery to me.
Wbdvlpr
Just realized my comment might be confusing. It's a prototype setup as in I'm prototyping a new website, not using the Prototype framework. Sheesh, what a bad name for a framework :p
peirix
A: 

Try putting the second request in a timeout which waits for 0 seconds.

$.ajax({
   async:false,
   data:"type=users&id=3,5,6",
   success: function(data) {
      data = JSON.parse(data);
      alert(data.length) //will alert 3 as expected
   }
});

setTimeout(function(){$.ajax({
   async:false,
   data:"type=houses&id=2,4",
   success: function(data) {
      data = JSON.parse(data);
      alert(data.length) //alerts 3 as well...
   }
})}, 0);
Marius
Already tried that, and it doesn't help. I have to set the timeout to at least 500ms for it to work...
peirix
That doesn't set a timeout correctly. That's passing the *result* of calling $.ajax() to be called after 0ms. Hell, if you do this and there's a difference between setting 0 and 500, things smell worse than fishy, Mr peirix.
Mead
Yeah, and that's why I think there's something wrong with the PHP, and not the JavaScript...
peirix
Thanks Mead, I fixed that now. Not sure if peirix tested it the proper way, or the way I originally did it
Marius
I actually misread your answer, and thought you had this all along. So this is what I've tried, and it works if I set the timer to something along the lines of 500ms or sometimes even higher.
peirix
A: 

Is it possible that there is some problem with the data variable?

What happens if you change your second request to:

$.ajax({
   async:false,
   data:"type=houses&id=2,4",
   success: function(dataHouse) {
      dataHouse = JSON.parse(dataHouse);
      alert(dataHouse.length)
   }
});
ArneRie
Nope, because looking at the response from the php file, I can see that the response is wrong. So I think the error must be in PHP somehow...
peirix
But are you sure that the same request data isn't being sent twice? I'm wondering, as ArneRie is, whether that "data" variable is getting tripped up inside the ajax method somehow. Have you tried this in Firebug and checked what parameters are actually going out?
Brother Erryn
@Erryn: Yeah, as noted in my question, I've verified that the parameters are correct, but the request is messed up somehow.
peirix
A: 

Put the next ajax call in the response handlers of the first ajax call:

function nextAjaxCall(){
  $.ajax({
    async:false,
    data:"type=houses&id=2,4",
    success: function(data) {
      data = JSON.parse(data);
      alert(data.length) //alerts 3 as well...
    }
  });
}
$.ajax({
  async:false,
  data:"type=users&id=3,5,6",
  success: function(data) {
    data = JSON.parse(data);
    alert(data.length) //will alert 3 as expected
    nextAjaxCall();
  },
  error: function(){
    //error handling
    nextAjaxCall();
  }
});

Make sure that you call the second ajax even if the first one doesn't return successfully.

Marius
Still get the same result. As I mentioned in your other answer, PHP seems to need at least 500ms before it can accept any new AJAX calls...
peirix
+2  A: 

Try using POST.

Also, if PHP receives two identical requests when they actually aren't supposed to be identical, then it's not PHP's fault; the bug is somewhere between the place where you make the $.ajax call and the place where the web server passes the request to PHP.

Could you provide some more info about the web server? Which one do you use? Do you have some sort of caching or optimization going on there? Maybe you use some PHP framework or PHP "bootstrap" which could cache requests?

You could also try to make a standalone test. Strip the Ajax code out and use a minimal PHP script which just echoes the request. See if you get the same bug.

Some copy-pasted data of the requests sent and the requests received in PHP would be nice.


I've never seen anything like this; I'm almost sure it's caching related. As a workaround you could just merge both requests into one (you would have to change the PHP script a bit; see the sketch below). It's actually probably a better solution anyway, since you should always try to make as few requests to the server as you can.
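Roughly, the merge idea could look like this on the client side (placeholder url and parameter names; the PHP script would then echo one json_encode'd object with a users key and a houses key):

$.ajax({
   type:"get",
   url:"data.php",                      // placeholder url
   data:"userIds=3,5,6&houseIds=2,4",   // both requests folded into one
   success: function(data) {
      data = JSON.parse(data);
      alert(data.users.length);         // 3
      alert(data.houses.length);        // 2
   }
});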

Maiku Mori
POST gives the same result. I'm just using Aptana's built-in server, and I have no idea what the settings are for it...
peirix
And taking the time (5 seconds) to upload and test this on our server at work, I found that it actually works... :\ So that would mean that something's up with Aptana's PHP server set-up...
peirix
I think the server just couldn't handle it, or it's a bug. Maybe open an issue over at the Aptana bug tracker and see what they have to say about it. That is, if you can make a small test case which reproduces the bug.
Maiku Mori
See the edit. I'll open a bug at Aptana...
peirix
I've checked the provided test case (it contains errors, btw) on Apache 2.2.12 and PHP 5.3.0, with different versions of jQuery - all return the correct answer (although jQuery eats up the space between Hello and world!, but that's beside the point ;) - so my guess is that this is an issue on the server (Aptana) side.
Igor Klimer
Yeah, I've also tested on Apache recently, and that works... So I'm now fairly sure this problem is isolated to the Aptana set-up somehow. And I also noticed the space being eaten up, but as you said: beside the point. Which errors were there, btw?
peirix
Not much space, so: my HTML test file: http://pastie.org/642489, my PHP test file: http://pastie.org/642490
Igor Klimer
A: 

Try setting global:false in the jQuery ajax() options.

If that doesn't help, try calling two different PHP pages - one for houses and one for users. If that works, it should at least help narrow the problem down.
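For reference, a sketch of the global:false suggestion applied to one of the calls from the question (url and type still omitted, as in the question):

$.ajax({
   global:false,   // don't fire jQuery's global ajax event handlers for this request
   async:false,
   data:"type=houses&id=2,4",
   success: function(data) {
      data = JSON.parse(data);
      alert(data.length);
   }
});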

Scott Saunders
As noted in one of my edits, I've tried with two different files, and that works... Setting `global:false` doesn't help either... :\
peirix
+2  A: 

In order to determine whether PHP or JS is the problem, you could use Charles to monitor your requests (I trust Charles more than Firebug, actually, dunno why) and generate a unique value server-side:

<?php echo uniqid();

If both requests return the same value, there is something wrong with your server; otherwise it's the browser or the JS code.

ChrisR
Yup, they are both identical... So I'm just gonna go ahead and accept that this is probably some error with Aptana's built in Jetty server...
peirix
Accepted as the answer because it manages to point the fault somewhere, and because the bounty runs out in a couple of hours, and someone should get the `150`.
peirix
I've had a similar issue using Aptana, like the page is being cached, but it shouldn't be. This gives me some more ideas to try, thanks for mentioning it!
Schamp