views: 274
answers: 9

In small enough organizations, should there be completely independent QA and Dev roles, or should each role involve some time (e.g., 1 day a week) doing the role of the other side?

I'm not talking about unit tests. I'm talking about a QA focusing on the system also contributing some production code, and a dev spending some time analyzing and testing a separate part of the system.

It seems to me that this sort of juggling might make sense because the QA gets a better understanding and personal stake in the system, while the dev gets a better understanding of quality and testing issues beyond unit testing. But I am sure there are also reasons against it...

+1  A: 

Every company is different. What works for one company will not necessarily automatically work for another.

That said, sure, it makes sense to have employees who are as well-rounded as possible. The more knowledge someone has, the better. Should you force someone to contribute code to a new release? I'm not sure it should be a strict requirement. But with only a small number of people, I'd tend toward having everyone do a little bit of everything: partly to spread knowledge and reduce what you lose when someone leaves, and partly because you probably have far more work than people and can't afford to play games like "well, I work in Dept X and we don't touch that, sorry."

It sounds reasonable and pragmatic, sure, but there cannot be a hard and fast rule. If a good developer is a bad tester, I wouldn't hold it against them, and vice versa.

matt b
+2  A: 

I think developers should focus on writing production code and unit tests, whereas QA should focus on integration testing, test automation, and acceptance-level testing. If the QA team is good and the API documentation is good, I think it would be fine for QA to write unit tests that exercise an API according to its spec.
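To make the "QA writes unit tests against a spec" idea concrete, here is a minimal sketch in Python's `unittest`. Everything here is hypothetical: `parse_price` and its spec are invented for illustration; the point is that the tests are written from the documented contract, not from reading the implementation.

```python
import unittest

# Hypothetical API under test. Assume its spec says: accept a string like
# "$12.99", return the amount in cents, and raise ValueError on bad input.
def parse_price(text: str) -> int:
    if not text.startswith("$"):
        raise ValueError("price must start with '$'")
    dollars, _, cents = text[1:].partition(".")
    if not dollars.isdigit() or not (cents.isdigit() and len(cents) == 2):
        raise ValueError("malformed price: " + text)
    return int(dollars) * 100 + int(cents)

class ParsePriceSpecTest(unittest.TestCase):
    """Tests derived from the documented spec, not the implementation."""

    def test_parses_dollars_and_cents(self):
        self.assertEqual(parse_price("$12.99"), 1299)

    def test_rejects_missing_currency_symbol(self):
        with self.assertRaises(ValueError):
            parse_price("12.99")

    def test_rejects_malformed_cents(self):
        with self.assertRaises(ValueError):
            parse_price("$12.9")
```

A QA engineer could write the `ParsePriceSpecTest` cases from the API documentation alone, which is exactly the black-box mindset the answer describes.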

A lot of the QA people I've worked with were mainly proficient at writing procedural code, such as automation scripts. I'm not sure I would want them writing production code, especially where complex, object-oriented design patterns are in use, or anything much beyond a basic level of coding.

Just my opinion.

Andy White
So you feel that QA people are usually less capable devs? Or just that they focus on something else?
Uri
I think it depends on the company. In my current company, the QA group was put together with the intent of doing more black-box/integration/acceptance testing rather than code-level testing, so they are not in a position to write a lot of code. On the other hand, I know that at Microsoft test coding is a full-fledged career path, so their testers would be more able to write test code.
Andy White
+1  A: 

I think it's a great idea to have this type of "cross-pollination". I believe it helps developers and testers work together more effectively by giving them a better understanding of each other's roles.

Lance Richardson
+1  A: 

Putting them in each other's shoes will help them understand and cooperate with each other more. Whenever a disagreement arises, each will understand the position the other is coming from.

Plus, testing is a great learning experience for a dev. A little QA goes a long way toward making him/her more circumspect about writing code that merely "just works".

Having said that, QA should not be writing mission-critical production code, and devs should not be QA'ing mission-critical features.

Aditya Sehgal
+9  A: 

Some points to consider:

  • Presumably, you hired your developers because they were good at writing code, and your testers because they were good at QA. Does it make sense from a business perspective to pay them to do each others' jobs?
  • If your testers understand the code too well, will they develop blind spots?
  • On the same note, can your developers really be effective testers, with their intimate knowledge of the application?
Amanda S
In order to thoroughly test a program, you need to understand how it may fail. Programming experience -> better tester
Treb
A programmer that does some testing sees the program from a different perspective, which helps him see the big picture. Testing experience -> better programmer.
Treb
@Treb: In general, yes. When talking about a single instance of a program/system, not necessarily (hence Amanda's point about blind spots).
Andrew Coleson
+6  A: 

Speaking as a QA guy, I find the idea intriguing. Having the chance to develop professional code sounds like a great idea; I also like the idea of exposing the developers to the QA world, so they know what it takes to advocate for a defect fix.

Here are few thoughts regarding pros/cons of such an approach.

Pros:

  1. QA would gain a better feel for working in a true development environment. Very often, QA is relegated to ad hoc script creation, where automated tests are written in a somewhat rushed and slipshod manner. This might give them an opportunity to expand their horizons into a more structured development cycle. This would also provide some insight into how they are writing their scripts, and may give a few ideas for better test development.
  2. QA might have a little more stake in the release cycle. Though from a personal standpoint, I would say I associate a lot of my pride with our releases and the quality therein, sometimes it really does seem like QA doesn't have much investment in the overall product. We are often seen as "bug finders" rather than engineers, and I wonder if this type of approach would give an even greater veneer of professionalism (for lack of a better word) to the QA team.
  3. Developers would possibly gain a better feel for QA as a practice. I've often had developers tell me they have no idea what I do for a living. Having developers test code would give them a slight taste of "eating your own dog food," so to speak.
  4. This would give the QA a chance to expand their resume a bit. Many QA personnel I know are concerned about their marketability. Good developers are typically able to pick up jobs relatively quickly; testing positions seem harder to find. Anything which helps employees to broaden their experience in the field would be an attractive proposal.

Cons:

  1. Developers should not be relied upon to test their own code. In the same vein, QA should not be relied upon to test their own code. "Cross-pollination" (to steal Lance's excellent phrase) is dangerous inasmuch as it could result in people validating their own work. Generally speaking, this is not a good idea. People are often blind to their own shortcomings or mistakes, and developed code is best validated when tested by third parties. Of course, proper monitoring and management could mitigate this risk...but it's a concern.
  2. QA are not professional developers and developers are not professional QA. I cringe whenever a developer hands me code he/she has "tested." It's not that I look down on their skill set: on the contrary, I couldn't write the code they have. But I also recognize the differences between our definition of "tested code." In the same way, I wouldn't, as a QA guy, want my code to be given high-visibility to a customer which could hurt or impede the customer's relationship. Note: I don't necessarily mean mission critical: sometimes a customer relationship can be impaired by simple usability flaws -- a somewhat common mistake for an immature developer. My concern would be high visibility (which I think includes mission critical); I know I'm not a professional developer, and I wouldn't want to be held to the same standard.
  3. This increases the chance of rework. I don't necessarily trust developers to adequately test code, and they shouldn't trust me to adequately develop a solution. In both cases, the possibility of someone needing to go back over previous work to do it "right" would be a concern.

On the whole, I think the idea is very interesting, and it would be great to hear of stories where this has been attempted.

One similar approach I've been involved in is "bug days." These are days where developers sit alongside QA and they team up to find as many defects as possible. Days like this are outstanding: professional relationships between the QA and development teams are strengthened, and respect for each other's skills typically increases. Devs can better understand how QA works to find bugs, and QA can better understand how much the devs know as they rattle off solutions to the bugs QA finds. It's not a perfect way to address the issue: QA still doesn't do much production-level code. But it really aids in promoting better understanding between the positions.

bedwyr
A: 

I've seen it work well:

  • In an Agile shop doing Test-Driven Development, a C++ developer who knew CppUnit would pair with a tester who knew how to automate using an in-house GUI automation tool. For each story, they would decide what blend of Unit/GUI testing would be most effective, and they'd work together to get the tests/code written. The testers came in not knowing C++, and the developers came in not knowing GUI automation. It was so successful that the first project to take the approach did a company-wide presentation on it. Nobody on that project wanted to go back to "the old way," where testers lagged a sprint or two behind developers.

  • Bug Bashes, Guerrilla Testing, whatever you want to call it: days where developers test the product. I've been at several shops where this was successful in terms of bugs found. Session-Based Exploratory Testing can be helpful here if you want to add a little structure and reporting.

  • A tester with programming skills works on the development team for a while as a junior programmer. In one instance the task was to beef up the team's C++ unit tests for some legacy code.
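The unit/GUI blend described in the first bullet can be sketched roughly as covering the same behavior at two levels. In that story the unit tests were CppUnit and the UI tests used an in-house GUI tool; this Python sketch with invented names (`apply_discount`, `CheckoutScreen`) just illustrates the two levels of coverage:

```python
# Hypothetical system: a core function plus a thin "UI" layer over it.

def apply_discount(total_cents: int, percent: int) -> int:
    """Core logic a developer would cover with unit tests."""
    return total_cents - (total_cents * percent) // 100

class CheckoutScreen:
    """Stand-in for the GUI a tester would drive with automation."""
    def __init__(self):
        self.total_cents = 0

    def enter_total(self, text: str):
        self.total_cents = int(text)

    def click_apply_discount(self, percent: int):
        self.total_cents = apply_discount(self.total_cents, percent)

    def displayed_total(self) -> str:
        return f"{self.total_cents} cents"

# Unit-level check (the developer's side of the pairing):
assert apply_discount(1000, 10) == 900

# UI-level check (the tester's side): same behavior, driven end to end.
screen = CheckoutScreen()
screen.enter_total("1000")
screen.click_apply_discount(10)
assert screen.displayed_total() == "900 cents"
```

The pairing decision is essentially which behaviors get the fast unit-level coverage and which need the slower end-to-end path; here both are shown for one behavior.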

Pete TerMaat
A: 

In my experience, QA should understand your product as well as your customers do, if possible. They should be well versed in the product's problem domain and should be able to troubleshoot customer issues as well as level-2 support staff can, for issues that don't need code changes. And while test scripts are necessary, not all QA staff need to be able to write them, so not all QA staff need any programming ability. In fact, not being programmers will lead QA to find more bugs than if they were coding gurus.

Additionally, if you allow QA staff to code parts of the system, what happens when there are bug reports against those parts? Do they take over doing the bugfix? If they do, who QAs the bugfix? If they don't, can they really QA the programmer's changes when they know the code? Knowing the code biases you in subtle ways, which is why you have QA in the first place. For them, the system is a black box, and it is their job to make sure that inputs generate the correct outputs. It is not their job to know how it does this. And knowing how can create blind spots, reducing their effectiveness.

On the flip side, a substantial number of coders hate "testing", or regard it as menial, entry-level work. Having them work in QA can affect morale, which affects productivity.

Short Answer: No.

jmucchiello
A: 

As a QA guy, I did some programming and implemented new features. It made me a better tester, as I got an even deeper understanding of the system. There were only two major rules to follow: 1) You can't QA your own code, so this requires at least 2 people on the QA team. 2) It must go through the standard development process, which means code review by the lead developer.

Cross-pollination is useful. It helps you learn additional skills and allows for easier shuffling of employees if necessary. Plus, it's good for QA to get burned by a few kickbacks, to keep the ego in check.

Mike DeMaria