views:

725

answers:

4

I'm looking for process suggestions, and I've seen a few around the site. What I'd love to hear is what you specifically use at your company, or just you and your hobby projects. Any links to other websites talking about these topics are certainly welcome!

Some questions to base an answer off of:

  1. How do users report bugs/feature requests to you? What software do you use to keep track of them?
  2. How do bugs/feature requests get turned into "work"? Do you plan the work? Do you have a schedule?
  3. Do you have specs and follow them? How detailed are they?
  4. Do you have a technical lead? What is their role? Do they do any programming themselves, or just architecture/mentoring?
  5. Do you unit test? How has it helped you? What would you say your coverage is?
  6. Do you code review? When working on a tight deadline, does code readability suffer? Do you plan to go back later and clean it up?
  7. Do you document? How much commenting do you or your company feel comfortable with? (Description of class, each method and inside methods? Or just tricky parts of the code?)
  8. What does your SCM flow look like? Do you use feature branches, tags? What does your "trunk" or "master" look like? Is it where new development happens, or the most stable part of your code base?
+1  A: 

To give a better answer, my company's policy is to use XP as much as possible and to follow the principles and practices as outlined in the Agile manifesto.

http://agilemanifesto.org/

http://www.extremeprogramming.org/

So this includes things like story cards, test-driven development, pair programming, automated testing, continuous integration, one-click installs and so on. We are not big on documentation, but we realize that we need to produce just enough documentation in order to create working software.

In a nutshell:

  • create just enough user stories to start development (user stories here are meant to be the beginning of a conversation with the business, not completed specs or fully fleshed-out use cases, but short bits of business value that can be implemented in less than one iteration)
  • iteratively implement story cards based on what the business prioritizes as the most important
  • get feedback from the business on what was just implemented (e.g., good, bad, almost, etc)
  • repeat until business decides that the software is good enough
Chris Johnston
you'd be surprised. yours is not an answer btw, post it as comment and delete it.
SilentGhost
you'd be surprised. you are asking for decades of experience to be summarized on an internet forum.
hmcclungiii
No, he's not asking what processes EXIST, he's asking what *your company* does.
JoshJordan
"what "your company" does" would be something that "EXIST"S wouldn't it?
hmcclungiii
-1. This isn't an answer at all.
Alex Fort
What processes are out there is not a good question. What processes individual posters' companies use is a reasonable question, particularly as community wiki.
David Thornley
How is it not an answer? Because it didn't specifically address each of the question's 8 bullet points, point by point? I don't think it's worth downvoting for that.
Alan Hensel
He had originally stated "A Google search will give you much better results and spend some time looking into things like Agile, XP, Scrum. Your question is too big to really answer in a forum like this."
Zach Gardner
+9  A: 

For my (small) company:

  • We design the UI first. This is absolutely critical for our designs, as a complex UI will almost immediately alienate potential buyers. We prototype our designs on paper, then as we decide on specifics for the design, prepare the View and any appropriate Controller code for continuous interactive prototyping of our designs.

  • As we move towards an acceptable UI, we then write a paper spec for the workflow logic of the application. Paper is cheap, and churning through designs guarantees that you've at least spent a small amount of time thinking about the implementation rather than coding blind.

  • Our specs are kept in revision control along with our source. If we decide on a change, or want to experiment, we branch the code, and IMMEDIATELY update the spec to detail what we're trying to accomplish with this particular branch. Unit tests for branches are not required; however, they are required for anything we want to incorporate back into trunk. We've found this encourages experiments.

  • Specs are not holy, nor are they owned by any particular individual. By committing the spec to the democratic environment of source control, we encourage constant experimentation and revision - as long as it is documented so we aren't saying "WTF?" later.
    On a recent iPhone game (not yet published), we ended up with almost 500 branches, which later translated into nearly 20 different features, a huge number of concept simplifications ("Tap to Cancel" on the progress bar instead of a separate button), a number of rejected ideas, and 3 new projects. The great thing is each and every idea was documented, so it was easy to visualize how the idea could change the product.

  • After each major build (anything in trunk gets updated, with unit tests passing), we try to have at least 2 people test out the project. Mostly, we try to find people who have little knowledge of computers, as we've found it's far too easy to design complexity rather than simplicity.

  • We use Doxygen to generate our documentation. We don't have auto-generation incorporated into our build process yet, but we are working on it.

  • We do not code review. If the unit test works, and the source doesn't cause problems, it's probably ok - but this is because we are able to rely on the quality of our programmers. This probably would not work in all environments.

  • Unit testing has been a god-send for our programming practices. Since any new code can not be passed into trunk without appropriate unit tests, we have fairly good coverage with our trunk, and moderate coverage in our branches. However, it is no substitute for user testing - only a tool to aid in getting to that point.

  • For bug tracking, we use Bugzilla. We don't like it, but it works for now. We will probably soon either roll our own solution or migrate to FogBugz. Our goal is to not release software until we reach a zero-known-bugs status. Because of this stance, our updates to our existing code packages are usually fairly minimal.
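The trunk-gating rule described above (no new code reaches trunk without appropriate unit tests) boils down to small tests like this. A minimal sketch using Python's `unittest` module; the function and names are hypothetical, not from the answer:

```python
import unittest

# Hypothetical function headed for trunk; under the rule above it needs
# tests covering its behavior before it can be merged.
def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(value, high))

class ClampTests(unittest.TestCase):
    def test_within_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_above_range(self):
        self.assertEqual(clamp(42, 0, 10), 10)
```

A continuous-integration server (or a pre-merge hook) would run these via `python -m unittest` and reject the merge on any failure.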

So, basically, our flow usually looks something like this:

  1. Paper UI Spec + Planning » Mental Testing » Step 1
  2. View Code + Unit Tests » User Testing » Step 1 or 2
  3. Paper Controller & Model Spec + Planning » Mental Testing » Step 2 or 3
  4. Model & Controller Code + Unit Tests » User Testing » Step 3 or 4
  5. Branched Idea » Spec » Coding (no unit tests) » Mental Testing » Rejection
  6. Branched Idea » Spec » Coding (no unit tests) » Mental Testing » Acceptance » Unit Tests » Trunk » Step 2 or 4
  7. Known Bugs » Bug Tracker » Bug Repair » Step 2 or 4
  8. Finished Product » Bug Reports » Step 2 or 4

Our process is not perfect by any means, but a perfect process would also imply perfect humans and technology - and THAT's not going to happen anytime soon. The amount of paper we go through in planning is staggering - maybe it's time for us to get a contract with Dunder Mifflin?

Zach Gardner
+5  A: 

I am not sure why this question was downvoted. I think it's a great question. It's one thing to do a Google search and read some random websites, which a lot of the time are trying to sell you something rather than be objective. It's another thing to ask the SO crowd, who are developers and IT managers, to share their experiences and what works or doesn't work for their teams.

Now that this point is out of the way: I am sure a lot of developers will point you towards "Agile" and/or Scrum. Keep in mind that these terms are often used very loosely, especially Agile. I am probably going to sound very controversial by saying this, which is not my intention, but these methodologies are over-hyped, especially Scrum, which is more of a product marketed by Scrum consultants than a "real" methodology. Having said that, at the end of the day you've got to use what works best for you and your team. If that's Agile/Scrum/XP or whatever, go for it. At the same time you need to be flexible about it; don't become religious about any methodology, tool, or technology. If something is not working for you, or you can get more efficient by changing something, go for it.

To be more specific regarding your questions. Here's the basic summary of techniques that have been working for me (a lot of these are common sense):

  1. Organize all the documents and emails pertaining to a specific project, and make them accessible to others through a central location (I use MS OneNote 2007 and love it for all my documentation, progress, features, bug tracking, etc.)

  2. All meetings (which you should try to minimize) should be followed by action items where each item is assigned to a specific person. Any verbal agreement should be put into a written document. All documents added to the project site/repository. (MS OneNote in my case)

  3. Before starting any new development, have a written document of what the system will be capable of doing (and what it won't do). Commit to it, but be flexible to business needs. How detailed should the document be? Detailed enough that everyone understands what the final system will be capable of.

  4. Schedules are good, but be realistic and honest with yourself and the business users. The basic guideline that I use: release quality, usable software that lacks some features rather than buggy software with all the features.

  5. Have open lines of communication within your dev team and between your developers and business groups, but at the end of the day one person (or a few key people) should be responsible for making key decisions.

  6. Unit test where it makes sense. But DO NOT become obsessive about it. 100% code coverage != no bugs, nor does it mean the software works correctly according to the specs.

  7. Do have code standards, and code reviews. Commit to standards, but if it does not work for some situations allow for flexibility.

  8. Comment your code especially hard to read/understand parts, but don't make it into a novel.

  9. Go back and clean up your code if you're already working on that class/method: implementing a new feature, working on a bug fix, etc. But don't refactor it just for the sake of refactoring, unless you have nothing else to do and you're bored.

  10. And the last and most important item: do not become religious about any specific methodology or technology. Borrow the best aspects from each, and find the balance that works for you and your team.
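The warning in item 6, that 100% coverage does not imply correctness, can be made concrete. In this sketch (a hypothetical function, not from the answer), the three calls below execute every line of the function, yet it is still wrong for leap years:

```python
# Deliberately buggy: ignores leap years.
def days_in_month(month, year):
    if month == 2:
        return 28          # bug: should be 29 when year is a leap year
    if month in (4, 6, 9, 11):
        return 30
    return 31

# These three calls hit every line and branch (100% coverage), and all pass:
assert days_in_month(2, 2021) == 28
assert days_in_month(4, 2021) == 30
assert days_in_month(1, 2021) == 31

# ...but days_in_month(2, 2020) returns 28 instead of 29.
# Full coverage, still buggy: coverage measures what ran, not what's correct.
```

Coverage tools tell you which code your tests never exercised; only well-chosen test inputs tell you whether the code is right.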

WebMatrix
What do you do about inaction items? Utilize synergy? ;-) (sorry, just a little pet peeve of mine. They're "items".)
Alan Hensel
+3  A: 
  1. We use Trac as our bug/feature request tracking system
  2. Trac Tickets are reviewed, changed to be workable units and then assigned to a milestone
  3. The Trac tickets are our specs, containing mostly very sparse information which has to be talked over during the milestone
  4. No, but our development team consists only of two members
  5. Yes, we test, and yes, TDD has helped us very much. Coverage is at about 70 percent (Cobertura)
  6. No, we refactor when appropriate (during code changes)
  7. We document only public methods and classes; our maximum line count is 40, so methods are usually small enough to be self-describing (if there is such a thing ;-)
  8. svn with trunk, rc and stable branches
    1. trunk - Development of new features, bugfixing of older features
    2. rc - For in house testing, bugfixes are merged down from trunk
    3. stable - only bugfixing merged down from trunk or rc
dhiller