views:

139

answers:

5

At my workplace we're planning a major refactor of our core product, a web application with several 'modules'. I put that in quotes because it's one of our main concerns: the modules are not really modules; the whole thing is monolithic. The application is written in PHP with Smarty templating, using PEAR to access a MySQL database. We're not really concerned with database independence, although it would be nice if it wouldn't take months to implement.

Our main concerns are that development time/cost keeps climbing because of bugs popping up in unrelated places, and because there is no sound common architecture to rely on for the most common functionality (each module is basically copied and pasted from the previous one, then adapted).

I've got some experience with the web MVC principle, mainly in ASP.NET MVC. I like the clean separation it offers, and the testability. However, when I tried this on a local machine, the app was simply a lot slower than it should be.

Alright, enough introduction, off to the questions:

  • Should I rely on caching modules, like APC? Does that remove most of the overhead a good architecture introduces?

  • The application is mainly read. Writing is mainly single values (change a single field on a record). Any OR/M for PHP that are good at this?
  • Also looking for a flexible MVC framework. I know Zend, CakePHP, maybe Symfony?

The tricky part is that we don't have the luxury of being able to do a full rewrite. We'll have to incrementally improve a currently very messy codebase. This has to be done while writing new code, or fixing bugs. One thing I'd really, REALLY like to be able to do is write a regression test for a new bug before fixing it, to prevent it from popping up again later (this happens, occasionally).
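The regression-test idea can be sketched in plain PHP (PHPUnit would be the natural home for it); `formatPrice` and the bug it illustrates are made up for the example:

```php
<?php
// Hypothetical bug report: formatPrice() dropped the trailing zero on
// amounts like 12.50. Before fixing it, pin the behaviour down in a
// test so the bug cannot silently come back later.
function formatPrice(float $amount): string
{
    // The fix: always render exactly two decimals.
    return number_format($amount, 2, '.', '');
}

// Minimal regression check; in PHPUnit this would be an assertSame().
assert(formatPrice(12.5) === '12.50');
assert(formatPrice(3.0) === '3.00');
echo "regression test passed\n";
```

Run the test once to watch it fail against the old code, apply the fix, and keep the test in the suite so the bug stays fixed.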

The stack I'm currently considering contains:

  • MVC framework of choice
  • Logging (log4php?)
  • an OR/M of choice (doesn't have to be dynamic, code generation is fine too)
  • IoC container of choice
  • Smarty Templating, perhaps abstracted so we can switch it out if we need to.
  • Opcode cache of choice (we're using one now, forgot which one, have to ask sysadmin)

The main point that worries me is the performance implication of writing clean code in PHP. Since it's an interpreted language, as opposed to the compiled .NET/Java web stacks, creating abstractions for otherwise inline code (with the obligatory separation into different files) might create new problems on another level.


Note: Retag if you come up with more appropriate tags, I'm not sure on the current ones.

+3  A: 

Having a clean setup isn't a performance issue, usually. Most performance is spent with databases or other external systems you're talking to.

Except for these there are usually one or two hotspots which might be worth optimizing but for that you should start with a clean design, then use a profiler (like XDebug or ZendDebugger) to identify the bottlenecks.
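Enabling the Xdebug profiler is just a php.ini change; a sketch using Xdebug 2.x directive names (Xdebug 3 later renamed these to `xdebug.mode=profile` and friends):

```ini
; Profile only when the XDEBUG_PROFILE trigger (GET/POST/cookie) is
; present, so ordinary requests are not slowed down by default.
xdebug.profiler_enable = 0
xdebug.profiler_enable_trigger = 1
xdebug.profiler_output_dir = /tmp
```

The resulting cachegrind files can then be inspected with a cachegrind viewer to find the hotspots.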

A clean software design is far more important than the 0.01% performance gain from an "optimized" design. Usually it's even cheaper to buy and run more hardware than to worry about an "optimized" codebase that is unmaintainable.

johannes
+1  A: 

There isn't any good reason that well-structured object-oriented code should perform significantly worse than spaghetti PHP code in a database-driven web application. You need to do some profiling to find where your bottlenecks are and optimize accordingly.

Asaph
A: 

You do have a tough (but not uncommon) situation.

As far as organizing the code to minimize bugs, all I can give is a tip of the cap to DRY.

For performance issues, those are easy to find, because their very slowness shows them to you.

Mike Dunlavey
+2  A: 

I'd stress budgeting time to build tests, with the following arguments to management:

  1. When developers fix a bug, let them write a test for that bug. Bugs recur far more often than they should, and this is a cheap and effective way to stop that for good.
  2. When developers build new functionality, let them write tests underneath it. Since they're completely familiar with the functionality at that point, this is the cheapest time to build the "safety net" of automated testing.

Don't candy-coat how long testing will take; whether it's 1% of your time or 50%, give it to the manager straight, but stress that building automated tests as a safety net will keep users from hitting as many bugs, and will free developer time for new development instead of bug fixing.

Dean J
+2  A: 

As far as combining an MVC component with a spaghetti-code component goes, we had a similar issue on a large project. What worked well was simply taking a directory and making it the new docroot for the MVC app (Zend Framework in our case), so that:

old part:

http://site.com/data.php
http://site.com/other.php

new part:

http://site.com/app/controller/action/...
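Assuming Apache, that split can be wired up with a rewrite rule in an .htaccess at the docroot that leaves legacy scripts alone and sends everything under /app to the framework's front controller (the /app paths are assumptions for the example):

```apache
# Existing files (legacy *.php scripts) keep being served directly;
# everything else under /app goes to the MVC front controller.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^app/ app/index.php [L,QSA]
```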

Re authentication, you have a couple of choices. Probably the most logical is to redirect your login.php script to the MVC login, and then pass control back to the original page you wanted to reach, with the necessary info carried along as a GET parameter. This lets the legacy and new systems coexist transparently.
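A sketch of that hand-off from the legacy login.php, assuming an MVC route at `/app/auth/login` (the route and the `return` parameter name are made up for the example):

```php
<?php
// Build the MVC login URL, carrying the originally requested page
// along as a GET parameter so the user can be sent back afterwards.
function legacyLoginRedirectUrl(string $returnTo): string
{
    return '/app/auth/login?return=' . urlencode($returnTo);
}

// In the legacy login.php this would be used as:
//   header('Location: ' . legacyLoginRedirectUrl($_SERVER['REQUEST_URI']));
//   exit;
echo legacyLoginRedirectUrl('/data.php?id=5') . "\n";
```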

Re slowness, before I pulled out XDebug I would try to isolate the problematic part and just output the time it takes. Faster, IMHO.
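That quick-and-dirty timing is just two `microtime(true)` calls around the suspect block; `timeSection` is a helper invented for the example:

```php
<?php
// Run an arbitrary section of code and return the elapsed seconds.
function timeSection(callable $section): float
{
    $start = microtime(true);
    $section();
    return microtime(true) - $start;
}

// Stand-in for a suspect block of legacy code:
$elapsed = timeSection(function () {
    usleep(50000); // 50 ms
});
printf("suspect block took %.3f s\n", $elapsed);
```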

timpone
I've done some work with cachegrind: get the Xdebug profiler up and it will log all calls into a file, which can then be parsed with a tool. At a glance you can easily see what takes a long time. We already optimized some old DB queries that took far too long for what they needed to do.
Erik van Brakel