I'm pre-emptively marking this a community wiki because, well, it is.

I'm interested in hearing from people who have real-life success stories of moving to continuous integration for a web application. In particular, I'm most interested in the following:

  • What tools did you use for aggregation of reporting (e.g. Hudson)?
  • What code did your application consist of (e.g. just a JavaScript API, mix of server side code + client side code, etc.)
  • What tools did you use for performing Unit Testing?
  • What tools did your QA use for performing functional, smoke, etc. testing?
  • What cultural obstacles did you have to overcome, if you had to overcome anything, to move to CI?

As for me, I'm in the middle of what will be a successful integration, but there has been a cost to it (I actually cried at one point late at night). The reason: I'm artificially constrained by my environment, and on top of that I'm trying to do things within those constraints that no one has done before.

But never fear, there's no crying in programming (most of the time), and here are my own answers to my questions:

  • Hudson is my CI master tool. Development is happening on Mac, Windows and Linux, but the CI builds are all happening on Linux.
  • I'm responsible for a pure JavaScript API, plus a tiny bit of PHP.
  • I'm using YUI Test 3 for unit testing, pushed through Selenium Remote Control running in a headless environment on a Linux box (a minimal test sketch follows this list).
  • QA is using Selenium with all tests written in Java Selenese.
  • Cultural obstacles: This was interesting. People just don't understand testing JavaScript. For example, here's a response from a meeting, "No really, I need a browser to run my unit tests in. Why would I ever run my tests outside of a browser?" Or, "Come on, developers need to run these unit tests, too. Why do I have to package everything up and treat everything in such a formal way?"
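
To follow up on the YUI Test bullet above, here is a minimal sketch of a YUI Test 3 test case of the kind Selenium RC could drive in a browser. Everything in it is illustrative: myapi and its add function are hypothetical stand-ins for a real JavaScript API, and the test page is assumed to load the YUI 3 seed file first.

    // Minimal YUI Test 3 sketch. Assumes the YUI 3 seed (yui-min.js) is
    // already loaded on the page; myapi and add() are hypothetical.
    YUI().use('test', function (Y) {
        var apiTests = new Y.Test.Case({
            name: 'myapi.add basics',

            // Test method names must begin with "test" to be picked up.
            testAddSumsTwoNumbers: function () {
                Y.Assert.areEqual(5, myapi.add(2, 3));
            },

            testAddRejectsNonNumericInput: function () {
                Y.Assert.isNaN(myapi.add('a', 3));
            }
        });

        Y.Test.Runner.add(apiTests);
        // Selenium RC can open this page and poll for the runner's results.
        Y.Test.Runner.run();
    });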
+2  A: 

I moved our product, built by a small seven-person team, to Hudson. Here is a bit of the background:
Our team is very talented, yet suffers from the new-developer "I don't need to test" mentality. I could not tell you how many times in the first couple of weeks our build was broken repeatedly in the same day. We were following an Agile development methodology that used iterations/sprints to produce releasable code at certain intervals. By now I am sure you can see my pain. I installed Hudson and haven't looked back, because I no longer have to continuously monitor the build and the status of the unit tests.

• What tools did you use for aggregation of reporting (e.g. Hudson)?
Hudson

• What code did your application consist of (e.g. just a JavaScript API, mix of server side code + client side code, etc.)
Grails (Groovy, jQuery, JavaScript)

• What tools did you use for performing Unit Testing?
Groovy unit tests were created for every file, plus one for every bug encountered, to assist with regression testing and give the customer actionable results.
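
The tests themselves were Groovy, but the one-test-per-bug pattern is language-agnostic; here is the same idea sketched in the JavaScript/YUI Test 3 stack from the question above, with a made-up bug number and API:

    // One regression test per reported bug, named after the bug ID.
    // Hypothetical example: bug #412, where myapi.add() mishandled
    // large inputs. The test pins the fixed behavior in place.
    YUI().use('test', function (Y) {
        Y.Test.Runner.add(new Y.Test.Case({
            name: 'regression suite',
            testBug412AddLargeInputs: function () {
                Y.Assert.areEqual(4000000000, myapi.add(2000000000, 2000000000));
            }
        }));
        Y.Test.Runner.run();
    });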

• What tools did your QA use for performing functional, smoke, etc. testing?
Smoke testing only, because of our team's size and personalities.

• What cultural obstacles did you have to overcome, if you had to overcome anything, to move to CI?
Honestly, this was a huge hurdle. Developers complained about the amount of time needed to create the unit tests, though they later realized that time was repaid by having the code continuously tested by Hudson. Developers also initially felt that I was trying to "point the finger" at people for breaking the builds.

If you want more information let me know.

tathamr
I hear of more and more people building unit tests around reported bugs. I like that methodology as a very simple way of deciding where to concentrate time when building unit tests.
jeremyosborne
We have found it very useful. It's also increasing our customers' trust, because we build the unit test runs into the delivered documentation.
tathamr
Do you have an example you could share of how you build the unit tests into the delivered documentation?
jeremyosborne
Yes, our releases include both a Cliff's Notes version and a detailed version of the entire release. In the Cliff's Notes we have a section that lists any bugs that failed the regression test. In the detailed section we list every bug found during User Acceptance Testing and display a grid of bug number, description, and pass/fail status. We did this because the customer told us about a group they were currently working with that kept producing the same error every couple of releases.
tathamr