views:

172

answers:

3

Recently I was comparing an old Windows DOS command for deleting all the files in a directory with a scripted equivalent - I noticed the "modernised" version required typing 50 times as many keystrokes to achieve the same outcome.

Are these additional keystrokes enhancing productivity? Are they serving a purpose that has been quantified, for example, reducing coding error rates?

The issue as I see it is that a computer language written primarily to accommodate the Von Neumann architecture - rather than the way we think - forces us to solve problems by juggling three problem domains in our heads: (a) the original problem, (b) the problem restructured to fit the Von Neumann architecture, and (c) the mapping rules needed to translate back and forth between (a) and (b).

As a rule of thumb, the more efficient a computer language's notation - in the sense that it enables you to work directly with the problem at hand - the lower the coding overhead. A lower coding overhead makes problem solving more tractable and thereby reduces both the code written and the room for error. It should definitely not increase workload!

Which computer language in your opinion makes for the most efficient problem resolution platform - in that it enables you to think directly in terms of the original problem without having to do cross-domain problem juggling?

For interest I did a byte count of 37 different solutions to Conway's game of life and came up with the following stats:

J : 80, 
APL : 145, 
Mathematica : 182, 
Ursala : 374, 
JAMES II : 394, 
SETL : 559,
ZPL : 652, 
PicoLisp : 906, 
F# : 1029, 
Vedit macro language : 1239, 
AutoHotkey : 1344, 
E : 1365, 
Perl 6 : 1372, 
TI-89 BASIC : 1422, 
Perl : 1475, 
PureBasic : 1526, 
Ocaml : 1538,
Ruby : 1567, 
Forth : 1607, 
Python : 1638, 
Haskell : 1771, 
Clojure : 1837, 
Tcl : 1888, 
R : 2031, 
Common Lisp : 2185, 
OZ : 2320, 
Scheme : 2414, 
Fortran : 2485, 
C : 2717, 
ADA : 2734, 
D : 3040, 
C# : 3409, 
6502 Assembly : 3496, 
Delphi : 3742, 
ALGOL 68 : 3830, 
VB.NET : 4607, 
Java : 5138, 
Scala : 5427   

(See e.g. http://rosettacode.org/wiki/Conway's_Game_of_Life)

Comments?

Please be specific about the merits of the notational approach taken by the language you critique, and do so from a reasonably high level - preferably with direct project experience.

+1  A: 

You used Conway's Game of Life as an example, and no language can solve that more elegantly or efficiently than APL. The reason is its full array/matrix manipulation via very powerful single- or multi-character operators.
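To see why the array style is so terse, note that one Life generation is just: sum eight shifted copies of the grid, then apply the birth/survival rule pointwise. Here is a rough sketch of that idea in plain Python (not the APL code, and far wordier than APL's rotate-and-sum operators, but the same shape of solution):

```python
def life_step(grid):
    """One generation of Conway's Life on a toroidal grid, array-style:
    sum eight shifted copies of the grid, then apply the rule pointwise."""
    h, w = len(grid), len(grid[0])

    def shift(dy, dx):
        # Toroidal shift: cell (y, x) of the result holds grid[y+dy][x+dx].
        return [[grid[(y + dy) % h][(x + dx) % w] for x in range(w)]
                for y in range(h)]

    shifts = [shift(dy, dx)
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)
              if (dy, dx) != (0, 0)]
    neighbours = [[sum(s[y][x] for s in shifts) for x in range(w)]
                  for y in range(h)]
    # A cell is alive next step if it has 3 neighbours,
    # or is alive now and has 2.
    return [[1 if neighbours[y][x] == 3
             or (grid[y][x] and neighbours[y][x] == 2) else 0
             for x in range(w)]
            for y in range(h)]
```

In APL the shifted copies and the pointwise rule each collapse to a handful of operator characters, which is where the byte-count advantage comes from.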

See: Whatever Happened to APL? and my story about my combinatorics assignment that compares APL with PL/I.

If you're talking about "efficient" in terms of keystrokes to solve a problem, APL will be tough to beat.

Your byte count of 145 for APL solving Conway's game is wrong. The solution you were looking at is a very inefficient one.

This is one solution:

[APL solution shown as an image]

That's 68 bytes and beats the J solution. I think there are other APL solutions that are even better.

Also see this video about it.

lkessler
I just wonder about the *minutes per keystroke* ratio...
Camilo Martin
@Camilo: It's like typing. Some people type by thinking about every letter they type. Some think about every word. And some look at the words and their fingers just type it. APL is the same. If you practise it, or have a natural ability for it, you'll be fast.
lkessler
Well, it does look astonishingly small, but I think I can understand the VB.NET or Java sample in less time than I'd take to understand something so condensed... Of course it's a matter of practice too, but APL looks like golfscript on steroids (though it also looks like a hell of a fun language to play with!).
Camilo Martin
@Camilo. Yes. Look at the video about it in the link I give in my answer, and prepare to be impressed.
lkessler
+1  A: 

Are you really comparing del /s *.* with an implementation of the same? I bet the author of the script could have shelled out and executed the built-in del command. It's impossible to say why he didn't, but he could have had a good reason.

I'm all for less ceremony and as low cyclomatic complexity as is reasonable, but keystroke count seems like a really bad metric of how easy the code is to read (Perl - why are you looking at me like that?) or how well it maps to the problem domain. Just change all your variable names to one character and you save lots of keystrokes! Or make the code totally unreadable with some advanced code golfing. Not very productive.
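For contrast, here is roughly what a "scripted equivalent" of a flat, non-recursive `del *.*` might look like in Python - several lines instead of one command, so many more keystrokes, but explicit about what it touches (a portable sketch, not the script the questioner measured):

```python
import pathlib


def delete_all_files(directory):
    """Rough Python equivalent of a non-recursive `del *.*`:
    remove every regular file directly inside `directory`,
    leaving subdirectories untouched. Returns the count removed."""
    removed = 0
    for entry in pathlib.Path(directory).iterdir():
        if entry.is_file():
            entry.unlink()
            removed += 1
    return removed
```

The extra verbosity buys you a return value and an obvious place to insert guards or logging - things a raw `del` gives you no hook for.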

Jonas Elfström
+1 for Perl looking at you.;)
Paul
A: 

Are these additional keystrokes enhancing productivity? Are they serving a purpose that has been quantified, for example reducing coding error rates?

I think in part they are. Even if I typed 10 times faster, a project wouldn't be done 1% faster in the end. But take a look at bat files: they look like spaghetti. No, more like ramen.

Most of the time when I code some "quick n' dirty" script, I have to run it to check that it does indeed work. But in a modern language I hardly ever face stupid surprises (like deleting the wrong file because the script got invoked from a network drive or a shortcut) at run-time.
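That kind of surprise is exactly what a few extra lines can rule out. A hedged sketch of such a guard - the `.project` marker file here is a made-up convention, purely illustrative - refusing to delete anything unless the target directory is recognisably the one we meant:

```python
import pathlib


def safe_delete(directory, expected_marker=".project"):
    """Delete the regular files in `directory`, but only if it contains
    the marker file - a guard against the script being invoked from an
    unexpected working directory (marker convention is hypothetical)."""
    d = pathlib.Path(directory).resolve()
    if not (d / expected_marker).exists():
        raise RuntimeError(f"{d} does not look like the intended directory")
    for entry in d.iterdir():
        if entry.is_file() and entry.name != expected_marker:
            entry.unlink()
```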

Camilo Martin