
Many times on a page I have to set POST and GET values in PHP, like this.

I just want to know whether it is better to keep doing it the way I have above, or whether performance would be unaffected by moving it into a function like in the code below.

This would make the code much easier to write, but at the expense of making extra function calls on the page.

I have all the time in the world, so making the code as fast as possible is more important to me than making it "easier to write or faster to develop".

Appreciate any advice, and please nothing about whichever makes it easier to develop; I am talking pure performance here =)

<?php
// Wrapped in a function:
function arg_p($name, $default = null) {
    return isset($_GET[$name]) ? $_GET[$name] : $default;
}

$pagesize = arg_p('pagesize', 10);

// Versus inline:
$pagesize = isset($_GET['pagesize']) ? $_GET['pagesize'] : 10;
?>
+3  A: 

Sure, you'll probably get a performance benefit from not wrapping it in a function. But would it be noticeable? Not really.

Your time is worth more than the small amount of CPU resources you'd save.

Simon
A: 

Whilst performance wouldn't really be affected, the more code you can take out of the HTML stream, the better.

graham.reeds
+2  A: 

I doubt the difference in speed would be noticeable unless you are doing it many hundreds of times.

scragar
+2  A: 

A function call is a performance hit, but you should also think about maintainability: wrapping it in a function makes future changes easier (and copy-paste is bad for that).
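(A hypothetical illustration of the maintainability point, not from the question: if every read goes through one function, a later change, such as trimming whitespace, only has to be made in one place instead of at every inline ternary. The `trim()` call here is an invented example of such a change.)

```php
<?php
// One wrapper to change later, instead of dozens of inline ternaries.
function arg_p($name, $default = null) {
    if (!isset($_GET[$name])) {
        return $default;
    }
    // A future requirement (e.g. trimming whitespace) lands in one place:
    return trim($_GET[$name]);
}

$_GET['pagesize'] = ' 25 ';
var_dump(arg_p('pagesize', 10)); // string(2) "25"
var_dump(arg_p('missing', 10));  // int(10)
?>
```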

PiotrLegnica
StackOverflow is not 4chan.
Mike Daniels
Well, I've never really been on 4chan. But, if "copypasta" is so bad, then I'll change it.
PiotrLegnica
"copypasta" existed before 4chan, btw. Nothing wrong with it.
Josh Davis
+4  A: 

If you have all the time in the world, why don't you just test it?

<?php
// How many iterations?
$iterations = 100000;

// Inline
$timer_start = microtime(TRUE);
for($i = 0; $i < $iterations; $i++) {
  $pagesize = (isset($_GET['pagesize'])) ? $_GET['pagesize'] : 10;
}
$time_spent = microtime(TRUE) - $timer_start;
printf("Inline: %.3fs\n", $time_spent);

// By function call
function arg_p($name, $default = null) {
  return (isset($_GET[$name])) ? $_GET[$name] : $default;
}

$timer_start = microtime(TRUE);
for($i = 0; $i < $iterations; $i++) {
  $pagesize = arg_p('pagesize', 10);
}
$time_spent = microtime(TRUE) - $timer_start;
printf("By function call: %.3fs\n", $time_spent);
?>

On my machine, this gives pretty clear results in favor of inline execution by a factor of almost 10. But you need a lot of iterations to really notice it.

(I would still use a function though, even if my answering this shows that I have time to waste ;)

Henrik Opel
Thanks, using that shows about a 70% slowdown from using the function instead. In a large-scale system I think that is worth noticing.
jasondavis
@jasondavis: 70% in and of itself is worth noticing, but how much of your total runtime is taken up by those calls? If it's more than a couple of percent, you've probably got bigger problems. In reality, if I were that concerned about it, I'd suggest a single function at the start of the script that 'standardises' my GET parameters, and from there just read from $_GET directly. Basically: ensure your $_GET array is populated correctly, with defaults if need be, then never test it again.
Narcissus
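(A sketch of what Narcissus describes, assuming a single known set of defaults; the `$defaults` array and its keys are illustrative, not taken from the comment.)

```php
<?php
// Normalise $_GET once at the top of the script: merge defaults in,
// letting any values actually supplied in the request win.
$defaults = array('pagesize' => 10, 'page' => 1);
$_GET = array_merge($defaults, $_GET);

// Later code needs no isset() checks and no per-read function calls:
$pagesize = $_GET['pagesize'];
var_dump($pagesize); // int(10) when no ?pagesize= was supplied
?>
```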