views: 287
answers: 5

My hosted scripts have been moved and no longer work.

The specified CGI application misbehaved by not returning a complete set of HTTP headers.

I notice that someone at my host company has modified my scripts so that where I used to have

use lib 'd:/myorig/LIB';

I now have

use lib '//newhost/LIB';

Should this work?

I tried 1800 INFORMATION's suggestion and ran the minimal script of

#!perl -w
use lib '//whatever/lib';
print "success";

...which gave the same result.

Update: ysth's suggestion of fatalsToBrowser did indeed reveal more information. It looks like the path (added by someone at the hosting company) might be wrong.

Update 2: The hosting company now says that these scripts (unchanged from the previous host, mind you) are throwing lots of syntax errors. "Since we cannot debug your scripts for you, we suggest you contact the original programmer and ask them for help." <grinds teeth>

Partial Resolution: The hosting company finally realised they hadn't set permissions correctly. They still aren't right, and (aargh) they don't allow site owners to set folder permissions, not even on folders within their own sites.

+3  A: 

I don't know if it should work or not, but my intuition is that it would be okay. However, the two use lib lines you posted are not equivalent.

# go to the 'd' drive and use the 'myorig/LIB' directory on that drive
use lib 'd:/myorig/LIB';

# go to the 'newhost' server's 'LIB' share - nothing below the share is specified - this looks suspect to me
use lib '//newhost/LIB';

Perhaps you need to specify a directory below the share on the server? Permissions are also worth checking: perhaps the user the CGI runs as cannot access that network path.

Also, you could write a simple (non-CGI) program to test your theory and just run it:

#!perl -w
use lib '//whatever/lib';
print "success";

Then just run that on the server if you can and see what happens.

1800 INFORMATION
+2  A: 

No; the path is incomplete - it needs both a server name and a complete path to the directory. It is also bad practice, because your application now depends on two machines being up and monitored rather than one.
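As a rough sanity check (the paths below are hypothetical examples, and this regex is only a sketch of the minimal //server/share shape, not a full UNC validator), a few lines of Perl can test whether a path at least names both a server and a share:

```perl
#!perl
use strict;
use warnings;

# True only if the path has both a server and a share component,
# e.g. //server/share or //server/share/subdir.
sub looks_like_valid_unc {
    my ($path) = @_;
    return $path =~ m{^//[^/]+/[^/]+} ? 1 : 0;
}

print looks_like_valid_unc('//newhost/LIB') ? "ok\n" : "bad\n";   # server + share => ok
print looks_like_valid_unc('//newhostLIB')  ? "ok\n" : "bad\n";   # server only => bad
```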

ojblass
+1 for the two machines comment
Ed Guiness
+1  A: 
The specified CGI application misbehaved by not returning a complete set of HTTP headers.

That's a generic symptom, not the real error. If you are lucky, your hosting company will make an error log available to you that will show the actual error perl is dying with. If not, consider using

use CGI::Carp "fatalsToBrowser";

for testing. (If you are paranoid (which is not a bad thing to be), you will refrain from leaving that enabled once you are done testing, since errors can commonly provide information about your code or even your database that may help a black hat exploit security holes.)
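One way to keep the line in your source without ever exposing errors in production is to load it conditionally via the core `if` pragma. This is only a sketch: DEV_MODE is a hypothetical environment flag you would set yourself while testing, not something the host defines.

```perl
#!perl
use strict;
use warnings;

# DEV_MODE is a hypothetical flag set only in your test environment.
# CGI::Carp is loaded (and fatalsToBrowser enabled) only when it is set,
# so error details are never shown to visitors in production.
use if $ENV{DEV_MODE}, 'CGI::Carp' => 'fatalsToBrowser';

print "Content-Type: text/plain\n\n";
print "script compiled and ran\n";
```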

ysth
+1  A: 

I know I ran into trouble trying to use mapped drives and UNC paths from Apache, because the Apache user was not allowed to use network drives. That was difficult to figure out, but it is possible to do. That may be a related problem.
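A quick way to see which account the web server is actually running your script as is to print it from a throwaway CGI. This is a sketch: which of these fallbacks is populated depends on the server and OS, and service accounts often report nothing useful via getlogin().

```perl
#!perl
use strict;
use warnings;

print "Content-Type: text/plain\n\n";

# Service accounts (the ones Apache/IIS run CGIs under) frequently cannot
# see network shares; knowing the account name is the first step to
# asking the host to grant it access.
my $user = getlogin() || $ENV{USERNAME} || $ENV{USER} || 'unknown';
print "Running as: $user\n";
```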

jettero
+1  A: 
#!perl -w
use strict;

# A plain CGI response: header, blank line, then the body.
# (A raw "HTTP/1.0 200 OK" status line only works for nph- scripts.)
print "Content-Type: text/plain\n\n";

my $path = "//whatever/lib";
print "Exists ",    (-e $path ? "yes" : "no"), "\n";
print "Directory ", (-d $path ? "yes" : "no"), "\n";
print "Readable ",  (-r $path ? "yes" : "no"), "\n";
print "Listing:\n";
print "\t$_\n" for glob "$path/*";