echo "sed -i 's/NULL/\\N/g' ".$_REQUEST['para'].".sql";

The above statement works, but it fails when I use it in exec() like this...

exec("sed -i 's/NULL//\/\/\N/g' ".$_REQUEST['para'].".sql");
+3  A: 

You should escape backslashes with backslashes, not with forward slashes, like this:

exec("sed -i 's/NULL/\\\\N/g' ".$_REQUEST['para'].".sql");

EDIT: I wrote the answer without looking at what the code actually does. Don't do this, because $_REQUEST['para'] can be whatever the user wants, which opens the door to code injection. Use the PHP functions, as the other answer suggests.

Amnon
Words that contain NULL in them are getting replaced too, e.g. TNULLL became TNL. How do I add word boundaries?
shantanuo
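One way to add word boundaries (a sketch, assuming GNU sed, whose regexes support \b; other seds use \< and \>):

// the PHP string below reaches sed as s/\bNULL\b/\\N/g,
// so only whole-word NULL matches; $file is assumed to be sanitized already
exec("sed -i 's/\\bNULL\\b/\\\\N/g' " . escapeshellarg($file));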
+3  A: 

Although it's entirely up to you, my advice is not to call system commands unnecessarily. In PHP, you can use preg_replace() to replicate the functionality of sed.

preg_replace("/NULL/", "\\N", file_get_contents($_REQUEST['para'] . ".sql"))
ghostdog74
I think you're missing a quote.
Mark Byers
I think that what should be replaced is not the file name but the contents of the file. But your idea is better and safer.
stefita
@mark, thks. fixed
ghostdog74
+1  A: 

Building on ghostdog's idea, here's code that will actually do what you want (the original code he posted didn't actually read the content of the file in):

// basename() protects against directory traversal;
// ideally we should also do an is_writable() check
$file = basename($_REQUEST['para'] . ".sql");
$text = file_get_contents($file);
$text = str_replace('NULL', '\\N', $text); // no need for a regex
file_put_contents($file, $text);

Admittedly, however, if the file in question is more than a few meg, this is inadvisable as the whole file will be read into memory. You could read it in chunks, but that'd get a bit more complicated:

$file = basename($_REQUEST['para'].".sql");
$tmpFile = tempnam("/tmp", "FOO");
$in = fopen($file, 'r');
$tmp = fopen($tmpFile, 'w');
// replace line by line so the whole file is never held in memory
while ($line = fgets($in)) {
  $line = str_replace('NULL', '\\N', $line);
  fputs($tmp, $line);
}
fclose($tmp);
fclose($in);
rename($tmpFile, $file);

If the file is 100+ meg, honestly, calling sed directly like you are will be faster. When it comes to large files, the overhead of trying to reproduce a tool like sed/grep with its PHP equivalent just isn't worth it. However, you need to at least take some steps to protect yourself if you're going to do so:

Taking some basic steps to secure Amnon's code:

$file = basename($_REQUEST['para'].".sql");
if(!is_writable($file))
  throw new Exception('bad filename');
exec("sed -i 's/NULL/\\\\N/g' ".escapeshellarg($file));
  1. First, we call basename(), which strips any path from the filename (e.g., if an attacker submitted the string '/etc/passwd', we'd at least now be limiting them to the file 'passwd' in the current working directory).
  2. Next, we ensure that the file is, in fact, writable. If not, we shouldn't continue.
  3. Finally, we call escapeshellarg() on the filename. Failure to do so allows arbitrary command execution: e.g., if the attacker submitted the string /etc/passwd; rm -rf /; #, you'd end up with the command sed 's/blah/blah/' /etc/passwd; rm -rf /; #.sql. It should be clear that while that exact command may not work, finding one that actually would is trivial. (An illustration follows this list.)
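As a quick illustration of point 3 (using the hypothetical attack string from above), escapeshellarg() wraps its argument in single quotes, escaping any embedded quotes, so the shell sees one literal argument rather than extra commands:

$evil = "/etc/passwd; rm -rf /; #";
echo "sed -i 's/NULL/\\\\N/g' " . escapeshellarg($evil);
// prints: sed -i 's/NULL/\\N/g' '/etc/passwd; rm -rf /; #'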
Frank Farmer
Fatal error: Allowed memory size of 20971520 bytes exhausted (tried to allocate 7444267 bytes) in somefile.php on line number 123. I tried to increase the allocated memory by using ini_set("memory_limit","20M"); but it is still not working as expected.
shantanuo
Made major revisions to cover the other 2 ways of approaching this. The first approach is fine for small files; the second, for files more than a few meg, but probably less than 100; and the final is honestly the fastest way to go for files 100 meg and larger.
Frank Farmer
I suggest increasing the memory limit to 32 or 64 meg, perhaps. I've had to do that to run some scripts before.
Alex JL
Increasing the memory limit is definitely an acceptable hack sometimes (after all, why rewrite code if all you need is another 3 meg?). But if you get, say, a 2 gig file, you're probably past the point where upping the memory limit is going to help. Then it's time to stop loading the whole file into memory all at once.
Frank Farmer