tags:
views: 469
answers: 6

Hmmmm... ok, so I'm trying to do exactly the same thing as in my previous Python question, but in PHP. See my previous (answered) question.

So let's say the PHP script from the previous question does something, then starts another PHP script on the same server, does something else and quits (while the second script continues its work). How do I achieve this? (If my question is unclear, feel free to comment and I'll edit it. It's a bit chaotic, heh.)

Please note that the PHP script is also a web page (so maybe we can use the same approach as in the previous question, where the answer was a snippet that made Python simply open a URL instead of running a subprocess... I don't know whether that's useful information; maybe it works differently in PHP, as I'm not very experienced with it). I also want the scripts to be independent: when the first PHP script finishes, the second one should keep working even though the first one has ended.

What do you think is the most elegant way to do this? Would echoing an iframe work, or should I do it differently?

+4  A: 

If a user will be reaching the PHP script with a web browser, I would use AJAX to call the second page.

The user won't even know it is being called.

See w3schools for a tutorial on AJAX
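
A minimal sketch of what that could look like (the filenames first.php and second.php are placeholders, not anything mentioned here): the first PHP page simply emits a bit of JavaScript that requests the second script in the background.

<?php
// first.php (hypothetical name): do the main work here, then emit a page
// whose JavaScript quietly fires a request at second.php.
?>
<html>
  <body>
    <p>Main page content goes here.</p>
    <script type="text/javascript">
      // Fire-and-forget request; the user never sees it happen.
      var xhr = new XMLHttpRequest();
      xhr.open('GET', 'second.php', true); // true = asynchronous
      xhr.send();
    </script>
  </body>
</html>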

Justin Giboney
+1  A: 

If you need this other script to run reliably, depending on the client would not be wise.

What I'd do is use the technique described in the answer to this StackOverflow question (which points to this comment in the PHP documentation) and include the other script as your post-processing.

However, that comment was written in 2006, and things may have changed since then. Please give the technique a try (as I will be doing, just for fun) and see if it works for you :)
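
For reference, here is the general shape of that trick, sketched from memory rather than copied from the linked comment (second_script.php is a placeholder name): send a complete response so the browser disconnects, then keep executing.

<?php
// Finish the HTTP response, then carry on working server-side.
ignore_user_abort(true);   // don't die when the client goes away
set_time_limit(0);         // the post-processing may take a while

ob_start();
echo 'Normal page output goes here.';
$size = ob_get_length();

header('Content-Length: ' . $size);
header('Connection: close');
ob_end_flush();            // send the buffered output...
flush();                   // ...and push it out to the client

// The client has its page now; include the other script as post-processing.
include 'second_script.php';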

Matchu
+1  A: 

To the best of my knowledge, there isn't any way to run a second page (AJAX being the exception). As far as I know, PHP doesn't support multiple threads (please correct me if I'm wrong), and the 'single-thread' nature of the web seems to defeat it anyway.

I would be more inclined to look at your specific application, find out why you need to have two separate pages run - and then re-engineer the process so it does not.

I'd be willing to bet that a re-engineering would end up being less of a headache from a development standpoint, as well as from a logic and implementation one.

EvilChookie
+1  A: 

This is a bit of a Unix/Linux-only hack that might not work on shared web servers:

file1.php

<?php
// file1.php: launch file2.php in the background and return immediately.
$somearg = escapeshellarg('blah');
// Redirecting output and backgrounding the process means exec() won't wait for it.
exec("php file2.php $somearg > /dev/null &");

file2.php

<?php
// file2.php: do some stuff that will take a while.
// $argv[0] is this script's name and $argv[1] will contain 'blah'.
// This script will continue to run after file1.php exits. You might want to
// set max_execution_time to a sensible value.
Tom Haigh
+1  A: 

You should look into pcntl_fork if you want to fork off a separate process. Also look at ignore_user_abort.
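
Something along these lines, assuming the script is run from the command line rather than under mod_php (see the comments below), with second_script.php as a placeholder name:

<?php
// CLI-only sketch: fork, let the child do the long-running work,
// and let the parent finish up and exit whenever it likes.
$pid = pcntl_fork();
if ($pid === -1) {
    die('could not fork');
} elseif ($pid === 0) {
    // Child process: runs independently of the parent from here on.
    include 'second_script.php';
    exit(0);
}
// Parent process: $pid is the child's PID; do whatever else is needed and exit.
echo "started background work in process $pid\n";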

shadowhand
Does that method work on Apache? This comment in the documentation (http://us2.php.net/manual/en/function.pcntl-fork.php#49949) says no, but others say kinda.
Matchu
Additionally, ignore_user_abort only involves the user pressing stop, not the script itself closing the connection early.
Matchu
No, pcntl_fork will not work under mod_php. The OP did not specify whether the script would be used for the web or the CLI.
shadowhand
A: 

I'm having trouble getting my previous answer to work, though I suspect it may be the fault of my own server, or perhaps newer browsers refusing to close the connection when instructed (I'm really not a pro on how that stuff works).

If that method doesn't work for you either, try this article on pseudo-multi-threading in PHP and see if you have better luck :)

Matchu