I'm a development manager in a small software house producing products written primarily in Java and a bunch of Java-related web technologies and frameworks (with the odd bit of C++ when we need something lower level).

When one of the developers comes to me and says "I want to knock up an internal tool in Perl / Python / Ruby / Visual Basic / Fortran / 6800 Assembler" (basically anything that's not on our core technology list), my immediate response is that I don't want something we may not be able to support when that person leaves - internal tools have a way of becoming critical, and you need to be able to maintain them independently of any particular individual.

My view is that it's not as if I'm asking them to write a web app in C; our core technology list generally contains tools which are fine for the sort of job in question (if perhaps not as good as some alternatives). But how strictly should standards be applied in these situations?

(Marked as community wiki as I know it's subjective - though not, I hope, argumentative - but I'm sure people will close it if they think it's unreasonable.)

+6  A: 

Standards are, by definition, arbitrary restrictions on potential solutions. By nature, they don't always include the 'best' tool for a given job (certainly not for all interpretations of 'best'). But adhering to standards allows you to keep costs reasonable, and to stay flexible (avoiding situations where you don't dare touch your toolchain because of a nonstandard technology for which you have insufficiently skilled staff). So generally speaking, be firm on your standards: you have an excellent case.

However, given the flux of the world in general and IT in particular, standards also need to evolve. That's where you should leave some wiggle room for experimentation: allow your developers to throw together e.g. a Ruby tool to get a feel for it, but budget for its replacement with an approved technology, and don't permit the original developer to maintain it, in order to get second opinions and decrease the risk of lock-in to the original developer. (Which, granted, is going to hurt in an emergency.) If the new technology works out and significantly surpasses your existing standards (key point!), consider phasing it in. But not without phasing out something else (in at least nine cases out of ten): you don't want your standard portfolio growing uncontrollably.

Pontus Gagge
+1 for focusing on the financial side and the irresponsibility of having a wildly grown code base. Been there, seen that at customers. TERRIBLE.
TomTom
+1 for "standards also need to evolve." Don't create programmer hell by fearing change so much that you don't even allow gradual changes. I've heard so many horror stories about shops still using .NET 1.0 or 1.1 because of their fear of change; at some point the cost of moving on to more-current technologies is less than the cost of staying put. Think it's easier to find libraries for .NET 1.0 than for later versions of the framework? Think it's easier to find developers? Sure, Java isn't "legacy" now, but it may be some day.
Jacob
A: 

Your position seems eminently reasonable; that is to say, you are able to explain the reasons behind what you are telling people.

The reason they cannot do something faster, and perhaps better, with their favorite language is that you would go crazy managing it. Except, what if you didn't?

A sane management strategy might be (might be!) to generally say no to these things, but every now and then, say yes. And try to learn.

Here is my suggestion: an internal tool that is well written, built in a language you don't normally use, just might open you up to further use of that technology. Python and Django, Ruby and Rails, and the list goes on and on. So you're a Java guy. Good for you.

But your development team and organization can probably outperform expectations when you leave behind arbitrary limits, and restrict only what really has a 100% solid business and financial case behind it.

Fear of learning a new programming language is the difference between the mediocre and the good people in your organization. Artificial limits on their productivity will, in time, cause you to lose those squeaky wheels. They will go elsewhere. Your question should be: "If I tick off these guys and they leave, did I lose something good, something worth keeping? And do the people I have now, who all write everything in Java, have the capability to build me my next big product, or solve my next big problem, when I need it solved? Or are they all plodding egalitarian members of a groupthink team, not one of them capable of an original thought?"

I submit, to be beaten up, the following proposal:

i. Say no to Visual Basic and C#, on the grounds that Java and the JVM offer a superset of the benefits of the .NET managed-code environment. Moving to .NET means keeping around a second virtual machine technology, runtime, and set of competing frameworks, while losing Java's portability and open source JVM platform - a net loss to your organization.

ii. Say yes to Ruby and Python, because they are on the list of languages that startups are using to change the world. They are stars in ascendancy.

iii. Say no when you have a clear business case to do so, and give a provisional yes when six months or less will give you a better answer on what will provide the optimal results and capabilities for your software team. Is there anything you can't do in Java? No. Is there anything that you can't do faster, and possibly better, in Python and Ruby? Some of your people are saying yes? Give them a chance to prove it (see the sketch after this list for the sort of tool in question).

iv. Also, say no to extensions and improvements to your main Java products using secondary languages. If used, Python and Ruby are for standalone internal tools, not as "first class languages to be used instead of Java to extend your primary existing Java applications", and not as a destination for a total rewrite, and not as a way to abandon Java.
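To make item iii concrete, here is a minimal sketch of the sort of standalone internal tool being argued over: a Python script that counts ERROR lines per day in an application log. The file name, log format, and date-stamp layout are assumptions made purely for illustration, not anything from the question; the same job is perfectly doable in Java, just with more ceremony, and that trade-off is exactly what item iii asks you to weigh.

    #!/usr/bin/env python3
    # Hypothetical internal tool: count ERROR lines per day in a log file.
    # Assumes each line starts with a "YYYY-MM-DD ..." timestamp; both the
    # default file name and the log format are illustrative assumptions.
    import sys
    from collections import Counter

    def error_counts(path):
        counts = Counter()
        with open(path, encoding="utf-8") as log:
            for line in log:
                if "ERROR" in line:
                    counts[line[:10]] += 1   # leading date stamp as the key
        return counts

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "app.log"
        for day, n in sorted(error_counts(path).items()):
            print(f"{day}  {n} errors")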

Warren P
+1 for "Fear of learning a new programming language is the difference between the mediocre, and the good people in your organization."
Jacob
-1 for ignoring the reality of having to keep more knowledge around, which has a significant impact on hiring decisions, or leads to tons of orphaned code. Been there, seen that. Nice argument in theory, but you're killing yourself in practice.
TomTom
I've never worked with programmers so bad that they couldn't easily pick up a new language, but maybe I'm just lucky. What I have seen is that the "forced" learning of new skills also enhances the company's core skill set due to a widened perspective. I've also seen how programmers are stifled at companies that stick with the same technologies through fear of change for far too long; being stuck with libraries and tools for legacy technologies just because of arbitrary management decisions can be very frustrating. Technology should be selected based on its merits, not merely its familiarity.
Jacob
Even when PHP was new, I would have said no. PHP was designed by a guy who wrote unmaintainable hack upon unmaintainable hack. The story of PHP is the story of "the first 90% gets done fast, the last 10% never gets done right at all, even after you spend another 90% of the time/money on it". Merits are 100% key here, as Jacob says. Python and Ruby are excellent on their merits.
Warren P
Orphaned code happens whenever there is developer turnover, even when it's all in Java. It's the "write-only code" phenomenon. Maybe some people have gotten burned more by it being in a language that the hiring person or managing person (the OP in this case) doesn't know. Maybe the answer is to learn something new every year. If you're the director of the software team, shouldn't you be on the leading edge, not the trailing one?
Warren P