I read somewhere the following statement:

In Information Technology related jobs 60% of learned knowledge is not useful after one and a half years.

I'm not sure if I remember the numbers correctly (though I think they are close to the original), and I don't know who the author of this claim is. I'm also not sure whether it was about general IT knowledge or domain knowledge. I've searched Google a lot for it, but unfortunately I can't find it.

So my first question is: Where can I find the correct version of this statement?

I also have some open questions related to this subject.

I talked about this with my colleagues from university, and one of them said that this is a statement about stupid people who can't figure out what is important to learn. I disagree with him. He is a good mobile applications developer. But would he be as good after spending years on a desert island (of course, without his Android-based phone ;) or any other Internet-enabled device)?

However, I think he is right about one important thing: devaluation comes faster or slower depending on the type of knowledge. And this is my last question: What programming-related skills don't lose value over time?

Some interesting links:

+18  A: 

What programming-related skills don't lose value over time?

What is technology-agnostic:

  • Skills to design good and resilient architecture
  • Writing robust and quality code
  • Understanding how hardware works
  • Basics of compilers, programming languages, complexity and performance
  • Algorithms and data structures
  • Database theory
  • Time management skills
  • Project management skills
  • Domain-/industry-specific knowledge
  • Business of software
  • Social and communication skills

As to specific languages and frameworks, yes, this kind of knowledge tends to fade and devalue. Some time ago it was Pascal/Delphi, don't need it anymore. Then I played with C++/WinAPI/MFC/COM, don't need it any longer. Then with ASP.NET WebForms, was great, but now we have ASP.NET MVC. Something else will definitely follow and replace/upgrade the current stack. Then something else. And so on.

Developer Art
I completely agree. It's the core theory and concepts across a wide range of topics that will stay valuable the longest. Oddly enough, it's the type of knowledge a university tries to teach you. This knowledge is also useful in teaching yourself the specifics needed for the task at hand. Of course, employers often complain that students come out of school without any real-world or directly applicable skills. I personally don't care whether someone knows how to write Visual Basic or not. It seems I have started off on a tangent.
Sean Copenhaver
And even this knowledge (specific languages) should stay relevant for at least a few years. Heck, even yer good ol' COBOL is still used in many places.
Kuroki Kaze
I'd add: * Understanding that there are two audiences you write code for: the computer (i.e. "Understanding how hardware works" above) and other developers (not quite the same thing as "Social and communication skills"). Also, by "MVC", do you mean model-view-controller, Microsoft Visual C, or something else?
Mike DeSimone
+1  A: 

The statement as stated only applies to people who are surfing the bleeding edge for their entire career, which doesn't apply to very many people.

Moreover, programming techniques are relatively static, within the confines of individual languages. Things change, but many things also remain the same. If you got a quality education (theory emphasised over individual platforms and deployment solutions) much of what you learned should remain useful.

Your individual job will dictate what knowledge is going to be outdated, and how quickly. There are plenty of old COBOL systems still lying around, with developers who've been working on the same code, in the same environment, for decades.

Satanicpuppy
+1  A: 

Specific knowledge might become useless, but the wisdom you picked up from making mistakes with a specific technology will not. I'm sure there are dozens of FORTRAN card-punchers on this board that can tell me the same thing.

..reminds me of a quote..

"Understanding that a tomato is a fruit is knowledge. Knowing not to put it in a fruit salad is wisdom."

San Jacinto
+1  A: 

What makes a developer good will translate to new languages, frameworks, and technologies. You can put a good developer on a desert island for 5 years, and he/she will be surprisingly effective after just a few months back in civilization. Some characteristics of great developers that translate to seemingly completely different situations:

  • They continually balance competing concerns. Someone else will have a large body of knowledge and (supposedly) best practices, but they can't balance them.
  • They have a balance between confidence and humility. They have a good feel for when they need to investigate something further, and when to just do what they think is best. They know when just good enough is good enough and when something has to be done really well.
  • They know success is delivered by being flexible, practical, and keeping it simple.
  • They can collaborate and work in a team, they learn from anyone, and they respect ideas on their merit.

The disadvantage of being on a desert island is that, because a great developer gets better at these things with experience, those 5 lost years were 5 years where they could have gotten even better.

Patrick Karcher
A: 

1.5 half years? == 9 months?

I simply don't believe that knowledge of, for example, SQL or Java (or any other language's) syntax is useless in 9 years, let alone 9 months.

As for foundational principles such as separation of concerns and the different kinds of testing, arguably they've been around since the days of valves and paper tape, and they remain valuable.

I do agree that if you didn't learn something new in the last 9 months you probably missed a trick, but knowledge is often incremental; learning something new doesn't make what you already know useless. Rather, it may even help you learn new stuff.

djna
+5  A: 

As a Lisp programmer, I disagree, at least as far as languages are concerned.

Seriously: every couple years the popular language changes, which just means I'm writing Lisp with a different syntax. (Hey guys what's REMOVE-IF-NOT called today? Python: "filter". Ruby: "select". C#: "Where".) And learning syntax is way less than 60% of what I learn. I haven't seen any wholly new concepts in ages. A 2009 personal computer is pretty much like a 1995 personal computer but with a lot of ugly hacks (needed for the slow tiny PCs of the day) replaced with more robust implementations.
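To make the point concrete, here's a minimal sketch in Python: Common Lisp's REMOVE-IF-NOT, Python's filter, Ruby's select, and C#'s Where are all the same operation, keeping the elements that satisfy a predicate. The remove_if_not helper name below is my own, not a standard library function.

```python
def remove_if_not(predicate, items):
    """Common Lisp's REMOVE-IF-NOT, spelled as a Python function:
    keep only the items for which the predicate is true."""
    return [x for x in items if predicate(x)]

numbers = [1, 2, 3, 4, 5, 6]
is_even = lambda n: n % 2 == 0

# Python's built-in filter() is the same concept under a different name.
evens_builtin = list(filter(is_even, numbers))
evens_custom = remove_if_not(is_even, numbers)

print(evens_builtin)  # [2, 4, 6]
print(evens_custom)   # [2, 4, 6]
```

Once you recognize the underlying concept, picking up the next language's spelling of it takes minutes, not months.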

If there was, I don't know, a new model of computation, then maybe I'd have some real learning to do. Erlang looks the closest to that these days, but even Erlang isn't that different. If you have a computer science degree (and maybe even if not), you probably (I hope!) learned C, Lisp, and Haskell, which are basically the spanning set of modern languages, so you can pick up any new language pretty quick.

The other aspect of what I learn is domain knowledge, and that changes every time I change jobs (so less frequently than every 1.5 years!), but that's true of any occupation, from writers to welders.

Ken
+1 for domain knowledge
MatthieuF
A: 

I think there are two parts to your question. The first is if your knowledge stays useful, and as others have said, the core theory of what you're doing most certainly stays useful over long periods of time. But this is subject to "use it or lose it" just like any other skill or knowledge (this is the second part). You ask about a developer stuck on a desert island for a few years. IMHO, such a developer would be useless. Not because technology has passed them by, but because they haven't used their skills for several years and would have forgotten a lot of it. They'd essentially be regressing in their experience level. For evidence of this decay after you're no longer using your development skills regularly, just examine your local programmer promoted into management :)

rmeador
A: 

Your quote is probably true for domain-specific knowledge. Core software engineering principles will not decay this quickly. And you would be surprised how little a lot of professionals care about their craft; this number would only be that high if the dev doesn't care enough to refresh their skills.

RHicke
A: 

The video "Did You Know; Shift Happens - Globalization; Information Age" has something like that quote in it and may be a source of the statistic.

JB King