The script below, test.php, is intended to be placed in a specific directory of all my WordPress sites. Its purpose is to grab the file at the $source address below and extract it to the directory in which it resides. That's all it's intended to do.

For example, I will have a dashboard interface on my central server that lists all my sites in which this script is present. I will then execute a cURL routine that iterates over every site and performs a call on this script, effectively sending the update file to all of them at once.

The call is made like so...

...processing site 1 update...
http://targetsite1.com/somedeepdirectory/test.php?myquery=updates.zip

...processing site 2 update...
http://targetsite2.com/somedeepdirectory/test.php?myquery=updates.zip

...etc until all my sites have been updated.

My question is (1) how secure (hardened) is this script as-is, and (2) what checks should I put in place to make it more so?

I'm thinking that at a minimum I would restrict the character count for myquery, as well as check the payload in myquery for malicious or unexpected file types.

<?php

// TEST.PHP

$source = 'http://mycentralserver.com/protected/'.$_GET['myquery'];
$target = '.';

$out_file = fopen(basename($source), 'w');
$in_file = fopen($source, 'r');
while (($chunk = fgets($in_file)) !== false) { // a bare truthiness test would stop early on a "0" line
    fputs($out_file, $chunk);
}
fclose($in_file);
fclose($out_file);

$zip = new ZipArchive();
$result = $zip->open(basename($source));
if ($result === TRUE) { // open() returns an int error code (which is truthy) on failure
    $zip->extractTo($target);
    $zip->close();
}

?>
+2  A: 

I could pass

http://targetsite1.com/test.php?myquery=/tmp/somefile.zip

and clobber your site with any file I could manage to get somewhere on your webhost.

-- actually I'm not sure about that. It would have to be web accessible on mycentralserver.com.

Paul Tomblin
This sounds interesting, Paul. Can you suggest some means of thwarting, or at least minimizing, the risks imposed by this sort of exploit?
Scott B
http starts the string, so fopen will force http.
Rook
@michael: I'm not sure this averts Paul's assumption. Can you help me understand?
Scott B
Consider only allowing a whitelist of filenames in $_GET['myquery'] and not arbitrary filenames. Remember that ../ can be used to get outside of your /protected directory (but a whitelist of allowed filenames will automatically protect you from this).
Cheekysoft
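A minimal sketch of Cheekysoft's whitelist idea; the filenames in the list are placeholders for whatever update archives you actually publish:

```php
<?php
// Whitelist sketch: only accept known update filenames; anything else,
// including ../ traversal attempts, is rejected outright.
function allowed_update_file($name) {
    // Placeholder entries -- substitute the archives you actually publish.
    $whitelist = array('updates.zip', 'theme-update.zip');
    return in_array($name, $whitelist, true);
}

// Usage (before building $source):
// if (!isset($_GET['myquery']) || !allowed_update_file($_GET['myquery'])) {
//     header('HTTP/1.1 400 Bad Request');
//     exit('Invalid update file requested');
// }
```

Because the comparison is an exact string match, inputs like `../../../etc/passwd` never reach fopen() at all.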
Well, with this HTTP GET request the script will make this request: http://mycentralserver.com/protected/tmp/somefile.zip. You won't be able to force fopen into accessing a local file, because the URI is hardcoded and that can't be changed with $_GET['myquery'], even if you use ../../../../tmp/somefile.zip. You can clobber files and overwrite somefile.zip on the local system, but that's easy to check for and the OP probably should be doing that.
Rook
Michael: what if I throw in a lot of backspace characters? :)
Broam
+2  A: 

The sites in question dumbly download the .ZIP file from your central server on command. They don't verify that the .ZIP file came from you, or that your central server hasn't been compromised.

I'd use GPG to sign your ZIPs. They don't need to be encrypted (though that never hurts), but they do need to be signed.

Broam
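One way Broam's suggestion might look in practice, assuming the `gpg` binary is installed on the satellite server and the central server's public key has already been imported into its keyring (the function name and file paths are illustrative):

```php
<?php
// Verify a detached GPG signature on the downloaded ZIP before extracting.
// Returns true only if gpg reports a good signature (exit status 0).
function zip_signature_ok($zipPath, $sigPath) {
    $cmd = sprintf('gpg --verify %s %s 2>&1',
                   escapeshellarg($sigPath), escapeshellarg($zipPath));
    exec($cmd, $output, $status);
    return $status === 0;
}

// Usage sketch: fetch updates.zip and updates.zip.sig, then:
// if (!zip_signature_ok('updates.zip', 'updates.zip.sig')) {
//     unlink('updates.zip');
//     exit('Signature verification failed');
// }
```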
That's a great suggestion. I'll look into this as a hardening strategy (assuming I can overcome the scenario presented by @paul tomblin above)...
Scott B
If you check that the zip file is always signed with YOUR key (and possibly encrypted), then Paul can pass any number of files he wants--but your script will correctly reject them. (Remember that anyone can verify a signature: signing encrypts with the private key, and verification decrypts with the public key.)
Broam
+1  A: 

Consider what your script is doing. You are accessing some sort of query engine, creating a zip file, and passing data back and forth between that file and the query engine. Fortunately this code is executed on the server and not the client machine, so it is at least as insecure as access to your web server. Beyond the web server itself, it is only as secure as the network architecture needed to reach the service running that query engine, or as secure as access to the zip file the script creates.

Unless those data repositories store personally identifiable information, nobody is going to want to mess with your code unless doing so provides an easy means of defacing a site with some degree of traffic or reputation. So, what mechanisms exist to prevent people from freely injecting malicious code into your query engine or your zip archive?

+3  A: 

The security of this script in its current state is pretty good, but I do have a few concerns. Under NO CONDITION must you accidentally download a .php file and store it in your web root. That is the worst thing that could happen to this script, as it would be a remote code execution vulnerability. Files should be downloaded into a specific directory; if you are concerned about others accessing that directory, put a "deny from all" in a .htaccess in that folder. If there are any errors in this script you should delete the downloaded file; in fact, I recommend deleting the downloaded files asap.

My concern is that the script should fail gracefully. You should check to make sure that you have obtained what you were looking for. Even if the file isn't a .php file, it can contain PHP code (<?php ... ?>) which could then be include()'ed, turning a Local File Include (LFI) vulnerability into full-blown remote code execution.

In a secure PHP configuration allow_url_fopen should be OFF, and PhpInfoSec agrees with me. This means that fopen() cannot be used for HTTP. allow_url_fopen is enabled by default, and I disable it on all production systems. The reason is that I have personally written a remote code execution exploit for Coppermine Photo Gallery that took advantage of this insecure default. cURL should ALWAYS be used for HTTP in PHP; it is more secure and more stable.

Rook
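A sketch of the cURL replacement Rook describes; the function name and timeout value are assumptions, and the curl extension must be available:

```php
<?php
// Download $url to the local path $destPath using cURL instead of
// fopen(), so allow_url_fopen can stay off. Returns true on success.
function fetch_update($url, $destPath) {
    $fh = fopen($destPath, 'w');              // plain local file handle
    if ($fh === false) {
        return false;
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fh);          // stream the body to the file
    curl_setopt($ch, CURLOPT_FAILONERROR, true);  // HTTP >= 400 counts as failure
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fh);
    if ($ok !== true) {
        unlink($destPath);                    // don't leave partial downloads around
        return false;
    }
    return true;
}
```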
Thanks Michael, I'll take your advice. Can I ask you to elaborate a bit though when you say...
Scott B
"If there are any errors in this script..." are you referring to the zip file that is being sent from the central server to the target server?
Scott B
...and when you say "I recommend deleting the downloaded files asap", I'm assuming you are talking about the files extracted from the zip, right? I can't do that because those files are simply updated files that overwrite the existing theme files.
Scott B
"In a secure php configuration allow_url_fopen should be OFF" > since I'm using fopen() as part of the zip extraction, how would you replace this?
Scott B
I would seriously consider saving the remote file to a folder OUTSIDE the webroot and processing it in that safe location. Remember to clean up after you have finished.
Cheekysoft
Download the file with curl, maybe into /tmp/ with a randomly generated file name, and then extract it with fopen(). Also, as a side note: Directory Traversal is a devastating attack, but the only attacker-controlled variable is $_GET['myquery'], which cannot be used to remove the http:// part, so an attacker couldn't force fopen() to access a local file even with allow_url_fopen=On. I am informing you of the best security practice, not necessarily a vulnerability in your code.
Rook
@Cheekysoft yep you beat me to the punch! We agree most of the time anyway ;)
Rook
@Cheekysoft: I suppose I could create some rudimentary cipher technique on my sender and receiver files so that I'm changing the names of the files that come in. Would the exploiter still be able to divine or execute their control if I change the names of their payload files? (You'll probably see holes in this, but I'm thinking out loud :-)
Scott B
@Scott The file name could be a primary key in the database; both servers could connect to the same database to obtain the value. The main points are that you want a unique value to prevent clobbering of files, and you don't want nasty extensions like .php, .cgi etc. Btw, we prefer to be called hackers, not exploiters ;) .
Rook
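Rook's two points -- a unique local name and no dangerous extensions -- might be sketched like this (the /tmp location and the zip-only rule are assumptions):

```php
<?php
// Map a requested update file to a safe, random local path, or return
// false if the extension is anything other than .zip.
function safe_local_name($requested, $dir = '/tmp') {
    $ext = strtolower(pathinfo($requested, PATHINFO_EXTENSION));
    if ($ext !== 'zip') {
        return false;           // refuses .php, .cgi, .phtml, and friends
    }
    // A random name prevents clobbering and is unguessable to an attacker.
    return $dir . '/update-' . bin2hex(random_bytes(8)) . '.zip';
}
```

Combined with downloading to a folder outside the web root, even a hostile archive never becomes directly reachable over HTTP.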
A: 

One problem here is that you don't check for directory traversal, so one could (in principle) copy the whole disk to location '.'

If ./ is accessible from the web, you have a big problem.

My advice is to recursively remove all instances of .. so you can avoid directory traversal.

To be more precise: if I were to pass ../../../../../../etc/passwd as input, then that file would be copied to the current working directory. Since your script obviously has write access there, the passwd file would be exposed.

Also note the encoding: a . or a / can be encoded in a dozen different ways, so don't do a hardcoded search-and-replace on ../

Henri
why recursively?
Jimmy
Imagine you don't do it recursively. I, as an attacker, will input "....//"; the ../ will be replaced by an empty string, and what's left over is another ../
Henri
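Henri's point can be demonstrated in a few lines (a sketch only; as the surrounding comments note, a whitelist is still the safer approach):

```php
<?php
// A single str_replace() pass can be defeated by nesting: removing
// '../' from '....//' leaves behind a fresh '../'. Stripping in a loop
// until no occurrence remains closes that particular hole.
function strip_traversal($path) {
    while (strpos($path, '../') !== false) {
        $path = str_replace('../', '', $path);
    }
    return $path;
}

// One pass:  str_replace('../', '', '....//')  leaves '../'
// Looped:    strip_traversal('....//')         leaves ''
```

This only handles the literal ../ sequence; encoded variants are exactly why Henri and Cheekysoft recommend whitelisting instead.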
Much better to allow a whitelist than deny a blacklist. Remember that regex replace functions can stop at a newline character. Also bear in mind that there could be other vulnerabilities that allow an attacker to 'break out', by using different character combinations to '..'. Additionally, bear in mind that an attacker could use this script to launch an attack against your central server, not just your satellite servers. You don't want your satellite sending arbitrary attack URLs to you.
Cheekysoft
I agree about the blacklisting vs whitelisting. However, whitelisting is not always possible. But if it is, use it!
Henri
In this case, it looks like a whitelist **should** be possible. Since the OP has control over the path/s and name/s of the files to be downloaded.
Blair McMillan
@Blair - Definitely do.
Scott B
+1  A: 

Consider switching off fopen_wrappers and using the curl or http libraries to access intended HTTP content.

This will secure your system from other vulnerabilities where a malicious attacker can gain access to web content inside a script in which you intended to restrict access to the local file system. With fopen_wrappers off, file means (local) file and http means http.

Cheekysoft
I will only have control over mycentralserver.com. The satellite sites belong to customers (which is why I'm earnestly seeking hardening advice on this remote upgrade procedure). I won't deploy the procedure until I've settled that it's as sound as it should be...
Scott B
@Cheekysoft: If I hard-code the HTTP call to mycentralserver so that I'm always calling the same file, with no $_GET parameters, what would my risk profile be then?
Scott B
Remove the user-supplied data and you have a good few fewer things to worry about. Arguably, I would also think the chance of suffering a problem "in the real world" drops hugely.
Cheekysoft
+1  A: 

Think also about who can execute this script. I would protect it in some way, so that a random guy/robot on the internet can't call it easily. Consider restricting it by source IP, or maybe implement a password check, if the script is intended to be called interactively.

Cheekysoft
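A minimal sketch of both checks Cheekysoft mentions; the IP address and token values are placeholders you would replace with your own:

```php
<?php
// Require both a known source IP and a shared secret before the updater
// will run. hash_equals() gives a constant-time token comparison.
function caller_authorized($remoteAddr, $token) {
    $allowed_ip   = '203.0.113.10';                     // central server (example IP)
    $shared_token = 'replace-with-a-long-random-value'; // placeholder secret
    return $remoteAddr === $allowed_ip
        && hash_equals($shared_token, (string)$token);
}

// Usage at the top of test.php:
// if (!caller_authorized($_SERVER['REMOTE_ADDR'],
//                        isset($_GET['token']) ? $_GET['token'] : '')) {
//     header('HTTP/1.1 403 Forbidden');
//     exit;
// }
```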