I'm sure most of us know about the Joel Test by now. I've got an interview with a company that claims to score a 12 on the Joel Test. I see it this way: if I were to put something on my resume, I would expect to be quizzed on it by my interviewer. Thus, I think it's fair for me to verify that they implement all 12 points of the Joel Test.

The problem is that it would be too easy for someone to have scripted answers ready for all 12 points if asked about them directly. So what are some ways of determining whether they really implement all 12 points? Are there any particular questions I can ask? Are there any other means of verifying that they're the real deal?

+3  A: 

There are things worse than scripted answers. There are companies who ignore half the criteria and say, "oh, that criterion is not relevant to our company because of $foo, we'll mark it as done!"

I believe that if a given criterion is, for whatever reason, not applicable to how a company works, they should create an alternative rule in its stead.

For example, one place argued that because they were running PHP, a purely dynamic language that requires no compilation, "one step to build" was automatically met, ignoring all the benefits that come with a compile phase, such as type checking and other static checks. Such a company should instead have an automated test kit that every developer can run, where "breaking the build" means those tests failing.
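As a rough illustration, such a one-step "build" can be a single script that runs every check, where a nonzero exit status is what "breaking the build" means. This sketch is purely hypothetical (the script name, file layout, and use of pytest are my assumptions, not anything that company actually had):

    #!/usr/bin/env python
    """build.py: a hypothetical one-step 'build' for a project with no compile phase."""
    import subprocess
    import sys

    # Assumed layout: app.py at the top level, tests under tests/.
    STEPS = [
        ["python", "-m", "py_compile", "app.py"],  # syntax check: the closest thing to a compile
        ["python", "-m", "pytest", "tests"],       # the automated test kit everyone can run
    ]

    def main():
        for step in STEPS:
            print("running:", " ".join(step))
            if subprocess.call(step) != 0:
                print("BUILD BROKEN by:", " ".join(step))
                return 1  # nonzero exit = broken build
        print("build OK")
        return 0

    if __name__ == "__main__":
        sys.exit(main())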

"Do you have the best tools money can buy" is also often met with "oh but its opensource!, its free!, money is not even required!" , and there needs to be something to fill this logical void too.

Kent Fredric
Sometimes the best tools money can buy are a matter of personal choice. To me, the best SCM is Perforce. Not everyone agrees with me on this, and it doesn't work for every situation.
Mitch Haile
With free tools, you might ask "do your coders make improvements to the programs they use?" or "how do you contribute to the development of x?", i.e., spending time instead of money to get better tools (or even sponsoring others to improve open-source tools).
OJW
+2  A: 

If you expect them to flat-out lie to you in response to a straight question, then don't go to the interview. Maybe that should be Joel's point 13: "we don't tell applicants obvious lies that they'll realise are lies on Day 1 on the job and, fifty-fifty, be so pissed off that they quit".

If you ask about the details of how they achieve the bullet points, then eventually they either have to explain it, lie, or fail to answer (depending on how many interviews you have, the interviewer might genuinely not know some of those 12 points, for decent reasons, but they'll know someone who does). Twelve is a pretty big laundry list, so depending on the length of the interview and how many other questions you have, it might be sensible to pick the two or three items you absolutely need to know about, and if they handle those OK, not worry about the rest. For instance, asking about point 11 is probably low on your priority list.

There's no way you can verify that their answers are true, so just make it clear that the answer is important to you; if they don't know the details, you'd prefer them to refer you to someone who does rather than take their best guess or nod and smile.

Steve Jessop
The problem is that I have no way of knowing if I should suspect that they'll flat out lie. Of course all things that happen in an interview are questionable. I just want to know about some ways that will at least tell me if it's likely that they know what they're talking about.
Jason Baker
If they flat-out lied to you, don't show up for Day 2. You're not honor-bound to work for a company that thinks so little of you that it'll lie. I think onebyone's got it.
jcollum
Well, the way to beat scripted answers is to ask for details not in the initial answer. So when they say, "yes, we have an up-to-date schedule kept on each product's wiki page", ask them who updates it, how often, and how slips are communicated to engineers.
Steve Jessop
+1  A: 
  • Looking at the code base might give you an idea of the build process and the overall complexity of the code; you do not want to end up working with a piece of crap code.
  • Speaking to someone from the team you would be working with might be a good way to gauge their attitude towards new technologies and the general vibe.
  • Whilst you are looking at the code base on someone's machine (I guess), you might be able to see the tools they use (if they are using free ones when better alternatives are out there, that shows the company does not care about tools).

I am not sure whether all of these will be possible, but I would definitely give them a try.

andHapp
+6  A: 

For some of the points, why not ask to be shown the internal bug system, the build web page, the source control browser, and so on? If those tools are missing, or if people cannot give you reasonable-sounding metrics about how those tools are used, I'd be concerned.

One point to keep in mind, though: it's more than possible to score 12/12 on the Joel Test and still have a disaster of a company/engineering situation. Maybe not a disaster on super fundamental levels, but you can have automated builds and tests and still not be solving a problem that customers care about... and so on. The Joel Test is a fine starting point, but I've worked at places that were depressingly bad and still passed it.

Also, don't get trapped into thinking that using a free tool means it's not the best money can buy. While I love Perforce ($800 per year per developer), I also love Bugzilla (free, but you can donate money, and I have). I would not turn down a job because a company uses a different SCM or a different bug system.

Mitch Haile
I agree 100%. I just didn't want to duplicate anything in other threads.
Jason Baker
Yes! Free tools are often great. "Best that money can buy" isn't necessarily "most expensive."
Bill Karwin
+41  A: 

It's reasonable to say, "show me." Ask them for examples and concrete details of their support for the Joel Test subjects. Since they claim they score all 12 points, they are obviously proud of it. People tend to like to show off, so they'll probably be eager to share more details.

If you ask more specific questions, it'll become apparent from their descriptions whether they really do those good practices.

One can think of many specific follow-up questions to the basic Joel Test questions (which lead each item below):

  1. Do you use source control? What source control system do you use? Why did you pick that one? What is your branch/release policy? What are your tag naming conventions? Do you organize your tree by code vs. tests at the top with all modules under each directory, or do you organize by module at the top with code and tests under each module directory?
  2. Can you make a build in one step? What tools do you use to make builds? How long does it take to go from a clean checkout to an installation image? What would it take to modify the build? Is it integrated into your testing harness? What would it take to duplicate a build environment? Are the build scripts and tools also under source control?
  3. Do you make daily builds? What software testing tools do you use for daily builds? Do you use a Continuous Integration tool? If so, which one? How do you identify who "broke the build?" What is your test coverage?
  4. Do you have a bug database? What bug tracker software do you use? Why did you pick that one? What customizations did you apply to it? Can you show me trends of the rate of bugs logged or bugs fixed per month? How does a change in source control get associated with the relevant bug? (see the sketch after this list)
  5. Do you fix bugs before writing new code? What is your bug triage process? Who is involved in prioritizing bugs? How many bugs did you fix in the last release of your product? Do you do bug hunts with bounties for finding critical bugs?
  6. Do you have an up-to-date schedule? Can I see it? How far are you ahead of/behind schedule right now? How do you do estimating? How accurate a method has that turned out to be?
  7. Do you have a spec? Can I read one? Do you have a spec template? Can I see that? Who writes the specs? Who reviews and approves the specs?
  8. Do programmers have quiet working conditions? Can I see the cubicle or work area for the position I'm interviewing for? (or an equivalent work area)
  9. Do you use the best tools money can buy? What tools do you use? Are you up to date on versions? What tools do you want that you don't have yet? Why not?
  10. Do you have testers? How many? Can I meet one? Do testers do black-box or white-box testing?
  11. Do new candidates write code during their interview? What code would you like me to write? What are you looking for by seeing my code?
  12. Do you do hallway usability testing? How frequently? Can I see a report documenting one of your usability testing sessions? Can you give me an example of something you changed in the product as a result of usability testing?
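A side note on point 4's last follow-up: one common concrete answer is a commit-message convention, where every check-in must name a bug ID that the tracker can link to. Here is a minimal sketch of checking that convention, assuming git and a made-up "BUG-1234" ID format (the branch name is also my assumption):

    import re
    import subprocess

    # Assumed convention: every commit subject mentions an ID like "BUG-1234",
    # so any change in source control can be traced back to its bug entry.
    BUG_ID = re.compile(r"\bBUG-\d+\b")

    def commits_missing_bug_ids(rev_range="origin/master..HEAD"):
        """Return 'hash subject' lines for commits whose subjects lack a bug ID."""
        log = subprocess.check_output(
            ["git", "log", "--format=%h %s", rev_range]
        ).decode()
        return [line for line in log.splitlines() if not BUG_ID.search(line)]

    if __name__ == "__main__":
        for line in commits_missing_bug_ids():
            print("no bug ID:", line)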

Beware if their answers to the specific follow-up questions are evasive, like "um, yeah, we are committed to doing more best practices, and we'll be looking to you to help us effect changes toward that goal." If they're so committed to it, why don't they have anything to show for it yet? Probably because, as at many companies, when the schedule is in jeopardy, following "best practices" goes out the window.

Bill Karwin
Very Nice Answer!
Mark Brittingham
Ask open-ended questions and let them sink or swim. For example, 2 and 3 can be covered by the question, "Would you tell me about your build process?" 5, 6, and 7 can be hit with "Describe a development cycle for me from start to finish."
plinth
@plinth: good suggestion. Though open-ended questions might be subject to B.S.-ing. A mix of some open-ended and some specific, verifiable questions is probably best.
Bill Karwin
+1 - Quite a good set of interview questions, digging into specifics of claims.
ConcernedOfTunbridgeWells
excellent, excellent answer sir. My only critique would be that #11 could be answered rather easily by asking yourself "Did I write code during the interview?" :)
Jordan