
I am working on my first Drupal project, running on XAMPP on my MacBook. It's a prototype and has received positive feedback from my client.

I am going to deploy the project on a Linux VPS in two weeks. Is there a better way than redoing everything on the server from scratch?

  • install Drupal
  • download modules (CCK, Views, Date, Calendar)
  • create the content
  • ...

Thanks

+1  A: 

I don't work with Drupal, but I do work with Joomla a lot. I deploy by archiving all the files in the web root (tar and gzip in my case, but you could use zip) and then uploading and expanding that archive on the production server. I then take a SQL dump (mysqldump -u user -h host -p databasename > dump.sql), upload that, and use the reverse command to insert the data (mysql -u produser -h prodDBserver -p prodDatabase < dump.sql). If you don't have shell access you can upload the files one at a time and write a PHP script to import dump.sql.
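The commands described above amount to something like the following sketch (the paths, usernames, hostnames, and database names are placeholders for your own values):

```shell
# On the development machine: archive the web root and dump the database
tar -czf site.tar.gz -C /path/to/webroot .
mysqldump -u devuser -p devdatabase > dump.sql

# Upload both files, then on the production server:
tar -xzf site.tar.gz -C /path/to/production/webroot
mysql -u produser -h prodDBserver -p prodDatabase < dump.sql
```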

+18  A: 

A couple of tips:

  • Use source control, NOT FTP/etc., for the files. It doesn't matter what you use; we tend to spin up an Unfuddle.com subversion account for each client so they have a place to log bugs as well, but the critical first step is getting the full source tree of your site into version control. When changes are made on the testing server or staging server, you see if they work, you commit, then you update on the live server. Rollbacks and deployments get a lot, lot simpler. For clusters of multiple webheads you can repeat the process, or rsync from a single 'canonical' server.

  • If you use SVN, though, you can also use CVS checkouts of Drupal and other modules/themes; the SVN and CVS metadata will be able to live beside each other happily.

  • For bulky folders like the files directory, use a symlink in the 'proper' location to point to a server-side directory outside of the webroot. That lets your source control repo include all the code and a symlink, instead of all the code and all the files users have uploaded.

  • Databases are trickier; cleaning up the dev/staging DB and pushing it to live is easiest for the initial rollout but there are a few wrinkles when doing incremental DB updates if users on the live site are also generating content.
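The symlink trick from the third tip looks roughly like this (the paths here are illustrative; in production the uploads directory would live somewhere like /var/www/uploads, outside the webroot):

```shell
# Keep user uploads outside the webroot and out of the repository;
# only the symlink itself gets committed to version control.
UPLOADS=/tmp/example-uploads        # in production: e.g. /var/www/uploads
WEBROOT=/tmp/example-webroot        # your Drupal webroot checkout
mkdir -p "$UPLOADS" "$WEBROOT/sites/default"
ln -s "$UPLOADS" "$WEBROOT/sites/default/files"
```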

I did a presentation on Drupal deployment best practices last year. Feel free to check the slides out.

Eaton
Thanks for your tips! In fact, I watched your presentation before asking this question ;-) I keep looking for alternatives and experimenting with suggestions. My VPS supplier suggested another option: Virtual Appliance + rsync + SQL dump/restore. Any comment on that vs the CVS way?
ohho
rsync can definitely work, though most of the projects I work on involve distributed teams where SVN/CVS as a central syncing mechanism helps at more than just deploy time. SQL dump/restore is the method we use when pushing the DB out for 'launch', though other methods are needed for ongoing updates.
Eaton
A very good presentation Eaton. Thanks!
Leandro Ardissone
A: 

If you're new to deployment (and/or Drupal), be sure to do everything in one lump. You have to be quite careful once there are users affecting content while you are working on another copy.

It is possible to leave alone the tables that hold actual content, taxonomy, users, etc. (rather than their structure) and push only the ones relating to configuration. However, this adds an order of magnitude of complexity.

Apologies if deployment is old hat to you and this is therefore vaguely insulting.

+6  A: 

We've had an extensive discussion on this at my workplace, and the way we finally settled on was pushing code updates (including modules and themes) from development to staging to production. We're using Subversion for this, and it's working well so far.

What's particularly important is that you automate a process for pushing the database back from production, so that your developers can keep their copies of the database as close to production as possible. In a mission-critical environment, you want to be absolutely certain a module update isn't going to hose your database. The process we use is as follows:

  1. Install a module on the development server.
  2. Take note of whatever changes and updates were necessary. If there are any hitches, revert and do again until you have a solid, error-free process.
  3. Test your changes! Repeat your testing process as a normal, logged-in user, and again as an anonymous user.
  4. If the update process involved anything other than running update.php, then write a script to do it.
  5. Copy the production database to your staging server, and perform the same steps immediately. If it fails, diagnose the failure and return to step 1. Otherwise, continue.
  6. Test your changes!
  7. BACK UP YOUR PRODUCTION DATABASE and TAKE NOTE OF THE REVISION YOU HAVE CHECKED OUT FROM SVN.
  8. Put your production Drupal in maintenance mode, run "svn update" on your production tree, and go through your update process.
  9. Take Drupal out of maintenance mode and test everything (as admin, regular user, and anonymous).

And that's it. One thing you can never really expect for a community framework such as Drupal is to be able to move your database from testing to production after you go live. From then on, all database moves are from production to testing, which complicates the deployment process somewhat. Be careful! :)
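Steps 7–9 above can be sketched as a script (the paths and database names are placeholders, and the use of Drush is an assumption; the same steps can be done by hand through the admin UI and update.php):

```shell
# Step 7: back up the production database and record the deployed SVN revision
mysqldump -u produser -p proddb > backup-$(date +%Y%m%d).sql
svn info /path/to/webroot | grep Revision >> deploy.log

# Step 8: maintenance mode on, pull the new code, run the update process
drush vset site_offline 1          # Drupal 6 maintenance-mode variable
svn update /path/to/webroot
drush updatedb -y                  # equivalent of running update.php

# Step 9: back online, then test as admin, regular user, and anonymous
drush vset site_offline 0
```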

Lendrick
+2  A: 

I'm surprised that no one mentioned the Deployment module:

http://drupal.org/project/deploy

Dave
It was implicitly mentioned in @Eaton's presentation.
Török Gábor
+1  A: 

We use the Features module extensively to capture features and then install them easily at the production site.
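A minimal sketch of that workflow with Drush (the feature name my_site_feature is hypothetical, and the exact export invocation varies by Features version):

```shell
# On development: export a feature module capturing views, content types, etc.
drush features-export my_site_feature

# On production, after deploying the code: enable the feature, or revert
# it to the exported state if it is already enabled
drush pm-enable my_site_feature -y
drush features-revert my_site_feature -y
```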

Niels
A: 

A good strategy that I have found, and am currently implementing, is to use a combination of the Deploy module to migrate my content, and then drush along with dbscripts to merge and update core and modules. It takes care of database merging even if you have live content, handles security and module updates, and I currently have mine set up to work with svn.

Ryan L