Q:

Hello

I am trying to write a custom url_rewriter for squid. I also want to use other url_rewriter programs such as squidGuard alongside it, so I need a wrapper that can drive both (or any other program).

Squid communicates with external programs over STDIN/STDOUT: it gives you a URL and you have to send the new one (or the old one) back. My first attempt was to do that loop in PHP itself.
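
Something like this minimal sketch is what I mean by looping in PHP (the actual rewrite logic is omitted; the field layout matches the line squid hands the rewriter):

#!/opt/lampp/bin/php -q
<?php
// one persistent process: keep reading request lines from squid on STDIN
while (($line = fgets(STDIN)) !== false) {
    $fields = explode(' ', trim($line));
    $url = $fields[0];
    // ... rewrite $url here ...
    fwrite(STDOUT, $url."\n");  // answer squid with the (possibly new) URL
    fflush(STDOUT);             // squid expects an unbuffered reply
}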

That approach had devastating memory usage even while doing nothing, so I changed it to a wrapper: a bash script of only a few lines does the looping instead of PHP and calls PHP externally. When the PHP script is done with the URL, it returns it and exits. This works much better than looping inside the PHP script.

There is almost nothing in the PHP script right now (I'm still developing it). It only replaces video.yahoo.com with youtube.com, sets a few variables, and uses an explode() to parse the input string. That's all, yet the script still uses a huge amount of memory.
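
(The replacement itself isn't in the stripped-down script below yet; it would be a single line, something like this, so it can't explain the memory usage on its own:)

$url=str_replace('video.yahoo.com','youtube.com',$url);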


Here is the top output:

  PID USER   VIRT  RES  SHR S %CPU %MEM   TIME+ COMMAND
32059 squid 19720 7432 4396 R  0.9  2.9 0:00.02 php
32063 squid 19720 7436 4396 R  0.9  2.9 0:00.02 php
32066 squid 19720 7436 4396 R  0.9  2.9 0:00.02 php
32068 squid 19460 6188 3472 R  0.9  2.4 0:00.02 php
32070 squid 19720 7432 4396 R  0.9  2.9 0:00.02 php
32074 squid 19588 6792 3924 R  0.9  2.6 0:00.02 php
32077 squid 19720 7436 4396 R  0.9  2.9 0:00.02 php


Here is the PHP script:

#!/opt/lampp/bin/php -q 
<?php
ini_set('html_errors',false);
ini_set('implicit_flush',true);
ini_set('max_execution_time',0);
ini_set('register_argc_argv',true);

$nl="\n"; $tab="\t";
$ds=DIRECTORY_SEPARATOR;
$lamppdir='/opt/lampp/';
$htdocsdir='/opt/lampp/htdocs/';
$wdir='/opt/lampp/htdocs/bin/';
$incdir=$htdocsdir.'inc/';
$logfile=$wdir.'log.txt';

if ($argc>1){
    // squid passes the whole request line as a single argument;
    // split it into its fields (url, source ip, user, method, urlgroup, my ip, my port)
    $return=explode(' ',trim($argv[1]));
    $url=$return[0];
    $sourceip=$return[1];
    $user=$return[2];
    $method=$return[3];
    $urlgroup=$return[4];
    $myip=$return[5];
    $myport=$return[6];

    $logdata=$argv[1];

    // if(strlen($logdata)>50){ file_put_contents($logfile,$logdata.$nl,FILE_APPEND); }

    // send the (possibly rewritten) url back to squid
    fwrite(STDOUT,$return[0]."\r\n");
}

exit(0);

And here is the bash script:

#!/bin/bash
lamppdir=/opt/lampp/
phpexecpath=/opt/lampp/bin/php
phpredirectorpath=/opt/lampp/htdocs/bin/redir.php
logdfile=/opt/lampp/htdocs/bin/log.txt
forcedexit=false

# loop forever: read one line from squid, hand it to the PHP script, echo the result back
while [ "${forcedexit}" != "true" ]
do
    read squidinput
    phpout=`"${phpexecpath}" "${phpredirectorpath}" "${squidinput}"`
    echo "${phpout}"
done

echo "\r\n"

exit 0


I already googled for useful documentation about the PHP CLI and resource usage, but no luck.

Do you have any advice for decreasing resource usage?

A: 

I bet you will laugh at this: I was looking in the wrong place.

After a long time tracing squid's stdin/stdout, I just added an if statement to the loop to check the input string's length before treating it as a URL.

while [ "${forcedexit}" != "true" ]
do
    read squidinput
    # only call PHP when squid actually sent something
    if [ -n "${squidinput}" ]
    then
        phpout=`"${phpexecpath}" "${phpredirectorpath}" "${squidinput}"`
        echo "${phpout}"
    fi
done

Result: there are no PHP processes waiting in the background anymore, because each one handles its input and exits within milliseconds.

Without the if statement, squid was sending empty spaces and newlines to the script, so it never stopped. I was trimming the input string in PHP, which is why I didn't catch squid's weird stdin earlier; trimming $argv is just a habit I've picked up. The squid version was 2.6STABLE7, and it's probably the same in earlier versions. I just lost half a day :( Thank you to everyone who read this.
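
For what it's worth, the same guard applies if anyone goes the pure-PHP loop route instead (a sketch, assuming the same input format as above):

while (($line = fgets(STDIN)) !== false) {
    $line = trim($line);
    if ($line === '') {
        continue;    // skip the empty lines squid sends
    }
    $fields = explode(' ', $line);
    fwrite(STDOUT, $fields[0]."\n");
    fflush(STDOUT);
}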

risyasin