Possible Duplicate:
What is your longest-held programming assumption that turned out to be incorrect?

What do you consider to be the most harmful misconception about programming from people who are new to programming that you have seen?

+3  A: 

That they will "break" something!

Or, to define "newcomers" as those who don't fear that: "It'll be easy to change! It's software!"


Rob Wells
+1  A: 

"But you can do anything!"

Williham Totland
But... you can. That's my favorite part about programming.
William Brendel
+12  A: 
  1. That their program will work.
  2. If the previous hurdle is overcome miraculously, that their program will work as expected by the end user
  3. If the previous hurdle is again overcome miraculously, that their program will stand the test of time, i.e. that it will be maintainable
  4. If all of the previous hurdles are again overcome miraculously, that their second system will be as good or better
Andrew from NZSG
+3  A: 

That the program has to be correct the first time.

Fail fast, early, and often. It's the only way to get better.

Mike Reedell
+38  A: 

Maybe not the most harmful, but they usually can't estimate how long stuff will take to get done; they think it can be done much faster than it really can (myself included).

As for harmful stuff, good companies usually keep beginners away from where they can do much harm. They are usually encouraged to work alongside someone more experienced, so they can learn better.

Samuel Carrijo
plz one more upvote, so I can get a comma =)
Samuel Carrijo
The question says _beginners_ :)
Daniel Daranas
@samuelcarrijo I regularly upvote anyone who has 990+ rep, except if they said something really silly :)
Daniel Daranas
@daniel I know even experienced guys still make lots of mistakes estimating, but at least in my case, I'd think things could be done two to four times faster than they really took... Old times... (now my estimates are usually only off by around 60% =P)
Samuel Carrijo
Same here. Before, my estimates needed to be multiplied by pi, while now just a 30% safety net is enough :)
Daniel Daranas
So, what's the difference between the newbies and the old hands? It seems to me that everybody underestimates time.
David Thornley
I think my estimates would usually be pretty good if all factors were under my control, but they're not... Inevitably, I hit some major stumbling block. Very often, it's a bug or limitation I didn't realize in some library (usually in-house library) that we're using. Also, the compiler has cost me tons of time due to bugs and non-compliance (MSVC6). Does that make me a bad estimator? I still run over my estimates even when I add in a factor of 3 or so of what I think it would take me working with reasonable technology...
@David That was the point of my first comment. Then samuelcarrijo and I seemed to agree that the problem doesn't go away, it's just mitigated in its quantitative impact. I've yet to see a precise estimating developer. For normal people, I think Joel's Painless Software Schedules is the most effective approach.
Daniel Daranas
+7  A: 

It has something to do with computers.

Why is that a misconception?
I think he means it isn't really about computers but about logic and sets and algorithms and such (universal concepts that can be separated from the computer hardware they usually run on). Ebo, maybe if you re-worded a bit...?
By this statement you're implying that programming has nothing to do with computers? Perhaps you mean "Programming is only about computers."
+36  A: 

That if their code doesn't compile or work, it is because of a bug in the compiler.

@Neil, Yes! Seen Jeff's post on this topic?
Rob Wells
@Rob No, this is the synthesis of my own experience as an instructor.
@Neil: I vehemently disagree. Hardware errors are much more common.
Andrew from NZSG
@Andrew Back in CP/M days (two 5.25" floppies, no hard disk) hardware problems WERE much more common - the disk drives were always failing. I remember the happy day I took delivery of my first hard disk. It was from DEC, 8Mb capacity, and came in packaging suitable for a small washing machine. Bliss!
+3  A: 

That all there is to it is building cool new stuff everyday. Maintenance IS a part of programming!

+1  A: 

That it is a promising career path and they should all go there. Then it takes years to clean up the system of primates' code.

+50  A: 

Re-inventing standard library functions/classes.

After going through a language book/tutorial, most beginners - knowing how to handle strings and numbers - will invent their own date functions, their own 'compression algorithms', their own SORT implementations.

Oh, and they always spend their first day searching for clrscr();.

Guilty. Luckily I have learned from my "sins" long ago. This got a chuckle out of me.
Of course I talk from personal 'experience' too;)
I can't say I agree that this is harmful. Implementing some of the basic stuff (even if it already exists) can be a good way for beginners to learn the basics, as well as how *not* to implement things. This is not harmful as long as you *eventually* figure out that standard libraries exist. I would take a programmer who wrote his own linked list implementation over one that uses the built-in libraries without question...
William Brendel
@William: agreed: *I* would take a programmer who *once upon a time* wrote his own linked list implementation(s), too.
I wrote my own hash-table once. :-) Actually, I was quite experienced by then and needed something extremely fast, capable of handling tens of thousands of insertions per second. I succeeded at that. But kids, don't try that at home! Leave it to us trained Professionals...
Workshop Alex
Don't forget ones writing "encryption algorithms".
I'm ashamed to say how many C++ string classes I've implemented. Of course, that was pre-standardisation, I'd never do it nowadays. No, no, no indeed.
That's not only beginners; I was asked in some job interviews to re-implement a Map.
Is it really harmful? I mean, is re-inventing the wheel really a waste of time? I used to think it's the best possible way to learn efficient design, as you try to reason about why and how the professional/standard/best design decisions were made.
Vivek Sharma
It's been a long first day then. :)
I don't agree with this either. I would rather work with people who have an idea of how this stuff is done.
Ed Swangren
Almost thought I would be clear of this one, except for that last `clrscr()`. darn!
this reminds me of the old verbiage on 'how to think' vs 'what to think'
+9  A: 

That you have to use every feature of the language you are learning, inheritance above all.

Updated: being obsessive about inline assembly code in C


That automated testing is a waste of time.

+50  A: 

That because their program compiles and runs it does what they expect it to do.

Yes! This is quite seductive and can be hard to get rid of.
Anders Eurenius
This is what I agree with right here. Students jump to conclusions just when they get code to compile and run. They understand syntax errors, but easily forget logic errors.
hey - the compiler said '0 errors' who am I to argue?
Martin Beckett
It compiles - ship it!

Most new programmers overestimate the intelligence of the compiler, in my experience. This might take the form of expecting C arrays to multiply like vectors or matrices, right down to telling the computer what they want in English. ("diagonalize matrix A;") I've also seen people expect the compiler to be completely aware of all the code right from the beginning, and so be lax about what order things go in.
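That expectation can be made concrete with a small C++ sketch (the function and variable names are mine, not from the thread): arrays have no built-in element-wise math, so the loop the beginner hoped the compiler would infer has to be spelled out.

```cpp
#include <cstddef>

// The loop the compiler will never write for you: writing `c = a * b`
// on two raw arrays simply does not compile in C or C++.
void multiplyElementwise(const double* a, const double* b,
                         double* out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = a[i] * b[i];   // element-wise product, stated explicitly
}
```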

+29  A: 

That if their program works on their own computer, then it will work on everybody else's computer too.

"But it works on my machine!"

Greg Hewgill
Assuming you aren't doing 3d games or kernel hacking, it usually will run on most other computers too.
@Zifre: I vehemently disagree. Even number-crunching programs sometimes have problems on other machines, e.g. some system resource runs out where you didn't check for it and *bang*!
Anton Tykhyy
Learning the difference between works and works well. I see this especially on database-driven apps. No, 1000 rows in your test db is not "a lot" of data. Also, no, a 1MB javascript file is not a good idea just because it's fast on a LAN.
+1: this is one of the most _frustrating_ things to hear from a developer when QA logs a defect.
We're not shipping your machine!
@Zifre: Well, unless the programmer has hardcoded everything, because *of course* everyone has a `C:\Windows\System32` and can write there sans restrictions. Not that I'd ever do anything like FRIEND told me about that. Yes, that's it.
+23  A: 

That the user is a programmer.

No that's more from experienced programmers.
@Zifre No. The more experience you get, the more you learn about users culture. IMHO
I'm sort of in the camp that a beginner programmer evolves into an experienced one when they learn to think about their users.
That's actually true for me, as I'm the only one who uses my programs. I get sent data, I run my programs, and send my boss the results of my analysis.
Andrew Grimm
+25  A: 

That programming is all about the syntax. Turns out it is all about problem solving.

William Edmondson
Agreed. I was guilty of this when I started out. A course in algorithms will quickly cure a beginner of that!
Sometimes using the syntax is the problem solving... but in general agreed.
I had a college intern tell me this a few weeks ago. I don't know how well I was able to convince them otherwise.
+1  A: 

Overestimating the importance (and the time share) of actually writing code followed by a little testing/debugging, while underestimating or simply forgetting about writing unit tests, and other important activities such as requirements, writing specifications, design, system test, and customer acceptance.

Daniel Daranas
+21  A: 

Thinking if it doesn't look horribly complicated it must be wrong or "bad" code.

I must admit years ago in school I was guilty of thinking my programs didn't look complicated enough! These days I want to cry if something doesn't turn out as simple as:



//go home


Peter Spain
+21  A: 
  • Programming is easy: Programming is a lot of fun but don't ever think of it as being easy. It takes a lot of experience, learning, and failure to get better at it and be humble about it.
  • Tools do it for me so I don't need to learn what happens underneath the covers: Tools make things a lot easier and allow you to get things done quicker. However, you still need to know and get familiar with what's happening underneath the covers because sooner or later you will need to pop open the hood.
  • Lack of curiosity
  • It's all about the newest and the coolest technologies: Not necessarily. It is about what's right for the customer and the problem you're trying to solve.
Mehmet Aras
Or, alternately, that "I don't need any stupid tools".
David Thornley
I somewhat disagree with your answer to the last point; programming need not involve a "customer" in the business sense. I'd rephrase that as "it's about what gets the job done"
hasen j
+6  A: 

The most common misconception is that you can write an application by starting your favorite IDE/editor and then write code immediately.

Yes, it will create an application. Yes, it's probably [email protected] too when you're finished...

You start developing software by first creating a design. Preferably with pen and paper or with some useful tools on your computer. Writing the actual code just happens to be a small part of the whole process. (If not, you're doing something wrong!)

Workshop Alex
+1  A: 

Clever programmers know that:

  1. The best way to speed up your application is to come up with a better algorithm
  2. Unit testing is the best way to speed up your development and cut down on debugging
  3. Never implement a feature that you're not sure you need
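Point 1 can be illustrated with a classic sketch (hypothetical helper functions, not from the answer): two ways to compute Fibonacci numbers that give identical answers at wildly different cost.

```cpp
#include <cstdint>

// Exponential-time recursion: every call spawns two more, O(2^n) work.
std::uint64_t fibSlow(int n) {
    return n < 2 ? static_cast<std::uint64_t>(n)
                 : fibSlow(n - 1) + fibSlow(n - 2);
}

// Same answers in O(n): no micro-optimization of fibSlow comes close.
std::uint64_t fibFast(int n) {
    std::uint64_t a = 0, b = 1;
    for (int i = 0; i < n; ++i) {
        std::uint64_t next = a + b;
        a = b;
        b = next;
    }
    return a;
}
```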
+10  A: 

"I am going to make a ton of money by playing with computers!"

Edit: Another one that drives me nuts:

"The other guy's code isn't calling mine correctly, so it's not my fault the system doesn't work." -- with no proactive investigation, diagnosis, suggested patch, nothing. As a manager or a team leader, this really gets under my skin.

Mitch Haile
Actually the first one is quite true for me. Maybe if you've never had a "real" job (only IT jobs) you can't really appreciate how pleasant our work is and how well paid it is, relatively speaking.
That was mostly referring to people in freshmen computer science classes who know nothing about programming, math, computer science, computer hardware, anything. Sort of like people who watched Matlock or Perry Mason and decided to become lawyers.
Mitch Haile
OK, I see what you mean... I had that too when I started my CS degree in '98: people who had never even owned a computer but had heard that IT was where the easy money was. Thankfully many of them left after the first 2 CS1 lectures, and I think most of those people are choosing some other major since the dot-com bubble burst (accounting? I can't think of any other reason someone says "hey I really want to be an accountant!").

Or, to add another insult to injury, the newbie starts to improve the performance of a piece of code, making it 5 times better and being very proud of himself... Until someone reminds him that he improved the performance of just a small piece of the whole process with a net result of one second for a process that takes two hours.

(I've actually had a colleague who did something dumb. A process had to import half a million of records and he was real proud that he made it start up faster simply by skipping some initialization. As a result, the first log entry would appear within a second instead of after 10 seconds. Unfortunately, the whole process slowed down from 30 minutes to 6 hours...)

Workshop Alex
I don't think that's a common problem (at least I've never had it).
+15  A: 

"The problem is not in my program, it's a bug in the library / OS / language."

"It worked on my machine! What is wrong with yours?"

"Everything is a pattern, you just have to find them."

"I don't need to test because I only made a one line change."

"Source control is a waste of time for this project."

Aaron Saarela
You don't hear these quotes from beginning programmers, you get them from Code Monkeys (see:
Kelly French
How about "I don't need to post a code review because I only made a one line change." Experienced and competent person (me), and that turned out to be a really bad idea. (There was only one way to test it, and that turned out to be expensive.)
David Thornley
It's funny how quickly the last one is unlearned though.
Mike Robinson
+2  A: 
  1. They read a tutorial on the web and copy-paste; the code works, but they don't know why, and they are happy with it.
  2. The code works on the local machine but not on others
  3. The problem is with the machine, not with the alien between the chair and the keyboard
  4. They write the code, but when it comes to maintenance they prefer a beer...
+6  A: 

That their code doesn't need to be documented. They're the only ones who will ever look at it, right?

+9  A: 

That you have to have design patterns in your code.

Even worse, "that you *have* to use every pattern from The Book" (whichever book it is at the moment).
+4  A: 

That the hard part is typing in the code. The farther up you go, the more that comes to be the easy part.

+1  A: 

Being resistant to changing code because of some gut feeling that it will be slower, e.g. changing nested ifs to a table-driven approach.

Robert Munteanu
+8  A: 

That cool == usable.

Bryan Oakley
Sometimes Cool != usable at all, however.
Surely if it's not usable, it can't be cool?
I don't agree. I've seen "cool" websites that weren't particularly usable. Many marketing related sites can lay claim to being 'cool' but not be particularly usable. I can't think of specific references at the moment but I know I've stumbled on product websites that were full of flash and video and cute effects and fancy fonts and professional imagery and definitely looked "cool", but which failed to make it easy for me to actually get at the information I was seeking.
Bryan Oakley
+2  A: 

That the rigidity of language syntax is there to annoy them or "for show".

It's not until much later (course in automata/formal languages and later on in compilation) that they realize that the reason that they do have to put that semicolon or close that brace is because otherwise the compiler can't parse their program unambiguously.

This probably comes from the fluidity of natural language, which this generation of students is probably even more apt to believe in thanks to texting.


Another misconception is that when they get a compilation error or an exception, the actual error is exactly in the line indicated by the compiler.

Unfortunately, the source is often somewhere earlier (e.g., missing brace) or in some earlier state change, but there's a tendency to stick to the line the compiler/runtime indicated.


Most of this stuff mentioned is, as far as I'm concerned, not related to the beginner programmer, but to the programmer that has made it past the stage of working out how most of this works - how learning the 2nd, 3rd and 4th language was way easier than the 1st and so forth - but who has yet to be part of a big "serious" project.

A beginner's misconception for me includes things such as:

  1. When code doesn't compile or throws errors, the error messages don't say anything other than the fact that there is an error (i.e. making sense of the error messages PHP pumps out)
  2. With respect to web programming, understanding the entire relationship between PHP and HTML seems to be a big hurdle to many
  3. When I had a beginners' programming class there was A LOT of confusion about just how everything worked - granted, we were pretty much shown VB.Net 2005 Express, shown how to create a new project with a window on it, given the function "Rand" (I think it is called; I'm not a VB guy) and then asked to make a game that used random numbers. Need I say more than that way less than half of the class ever got the difference between using a local function/sub variable and declaring the variable in the class? And none of them, I don't think, ever got the slightest clue of what the hell OOP was, or the fact that they could create their own objects.

I can't honestly remember my own ones (and hey, I probably STILL carry around with some stupid misconceptions which is why I haven't realised what mine were, cause they still ARE my misconceptions) - but my guess is, that it was very much like what I have just described.

+2  A: 

Testing is not important / necessary.

Unit tests are a waste of time

Certain code styles (naming conventions, etc.) are not important


That pseudo code is how things are supposed to look. Too many new programmers try to write code like they would write a sentence, and well... it just doesn't work like that.

My wife has a BA in English, she is recently trying to go back for a CS degree. I am seeing this first hand as she tries to write her code as:

If Myint = 1

   cout ...

+1  A: 
  • Forgetting the Design phase. I work with students all the time who just want to jump into the code without a thought as to the ultimate design.
  • Confusion about how assignment works, as detailed here.
+9  A: 

Disabusing them of the notion that "perfect but very late" is better than "acceptable and on time".

No one is going to care if some weekly report runs in 5 seconds rather than 8 if it is two months late.

what about "acceptable and late"?

As a part-time instructor I observed that they usually think programming is VERY VERY HARD!

I find that this generally depends on the person; some people have the mind for it, some don't. My sister can barely grok simple logical processes in code, but give her a pencil and she'll draw a masterpiece, and I'm the exact opposite.
If it's so easy, why are there so many terrible programmers out there?
What I meant with VERY VERY HARD is that they can't really learn programming even if they tried very hard.
+5  A: 

The most harmful misconception is to assume that people in software industry know what they're doing. Beginners tend to trust everything written in product's documentation, they trust error messages and exception descriptions. They even trust stuff posted on blogs.


In my experience with newbie friends, I think that the common misconception is that validating data and making your code fail-proof is just a waste of time. Really, EVERYONE in my CA class skips validating the INPUT data!

  • The other one is that you only need to sit in the chair and code. Forget about writing down your problem and studying the best approaches before even touching the keyboard. And they also create such complex code when a much simpler and prettier approach would work.

Just my 2 cents.

+2  A: 

That languages like Java, Python, etc "don't have" pointers as opposed to C.

(beware I quoted the negative)

don't forget C#!

That everyone else is a genius because they can code it up right quick and you can't. After you sit with them a while you see they solve problems just like you, and it is really a matter of experience, which in turn gives intuition - oh, and that they use search engines, just like you.

+2  A: 

I think one that hasn't been mentioned yet is that some students assume they will always have valid data/input. In reality, valid data is only one condition and they forget about all the forms of invalid data/input.
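One way to frame the habit worth building - a small C++ sketch (the function name is mine) in which invalid input is an expected return value rather than an assumed-away case:

```cpp
#include <optional>
#include <string>

// Valid data is only one case. This parse helper reports failure via
// std::optional instead of assuming the string is always a well-formed int.
std::optional<int> parseInt(const std::string& s) {
    try {
        std::size_t pos = 0;
        int v = std::stoi(s, &pos);
        if (pos != s.size()) return std::nullopt;  // trailing junk, e.g. "42x"
        return v;
    } catch (...) {
        return std::nullopt;  // empty, non-numeric, or out of range
    }
}
```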

+7  A: 

The worse misconception I've encountered, and the hardest to be rid of, is that programming is writing code, and not reading it.

Avihu Turzion

That the difference between a successful project, and a failure, is due to technical issues or choices.

In fact, the difference is almost always due to social (people-based / team-based) factors.

So true. I wish everyone had a better grasp of this, not just newbies. IMO interviews should be 20% technical and 80% figuring out if the candidate fits in with the team. You can teach tech skills if necessary but you can't give someone a new personality.
Newton Falls
True. But there are people who are incapable of learning something and then applying it to a similar situation. (ie. they know how to loop from 1 to 5 but not from 5 to 10)
+11  A: 

The real problem I've seen with programming tyros is "programming is magic", meaning not truly grokking that the computer will operate exactly logically, and will do exactly the same thing every time given the exact same input.

They write something that they think should sort of do what they want, and then when it doesn't work, rather than approach the problem logically, they start changing things semi-randomly, hoping, apparently, to appease the gods of computer magic by their sheer tenacity or willingness to abase themselves upon the altar of whimsy. They feel that the computer is capricious, changes things randomly, and the best they can hope for is to get things to a vague approximation of working, and hope the stars stay aligned for long periods.

Of course, even to experienced programmers, it can feel that way sometimes, but there is an inherent knowledge that what is happening is happening for a specific reason, and you just have to dig down to get to that reason.

Hmm... There is a bit of semi-randomness when it comes to doing things concurrently. Just because a multithreaded program ran with the correct results once doesn't mean that the right thread will win a race condition every time. It seems to be very difficult for beginners to realize when they have created a race, or to realize that inconsistent behavior is because of such a logic bug in their own code.
I thought about getting into that, but decided to avoid it: even then, it's not random! It's just that you don't have direct control over the inputs. The race condition, if you break it down far enough, will resolve one way or the other because of the exact situation...but because we can't directly control those factors, we speak of it as though it were random. But I thought that getting into that in detail would dilute the point of my answer.
Ah, the joys of shotgun debugging! This was especially fun with *some people* (ahem) who thought that using version control was a boring and unnecessary chore. "Well, can you revert this to the previous revision?" "What previous revision? I don't keep any...can I create the previous revision now and then revert to it?"
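A hedged illustration of the point that races are logic bugs, not randomness (the function name is mine): in C++, making the shared update indivisible with std::atomic removes the dependence on thread interleaving and restores a deterministic result.

```cpp
#include <atomic>
#include <thread>

// Two threads bump a shared counter 100000 times each. With a plain int the
// interleaving would decide the final value; std::atomic makes each increment
// an indivisible read-modify-write, so the outcome is deterministic again.
int countWithAtomics() {
    std::atomic<int> counter{0};
    auto work = [&counter] {
        for (int i = 0; i < 100000; ++i)
            ++counter;            // atomic increment
    };
    std::thread t1(work), t2(work);
    t1.join();
    t2.join();
    return counter.load();        // 200000 every run
}
```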
+12  A: 

most harmful misconception (financial version):

"That a college education is required to know or have understanding about how to write software."

I'd say "That a college education is **enough** to know or have understanding about how to write software."
+8  A: 

The most harmful misconception is: You are done when you get the code to work.

Uncle Bob
+2  A: 

In object oriented programming, using instance variables where local variables would have been more appropriate, especially in multithreaded frameworks (e.g., servlets).

More generally, using a wider scope than appropriate.

Jack Leow
+1  A: 

Lack of indentation, improper variable names, and poor commenting. It's harmful because they are able to write small programs despite these mistakes.

Umair Ahmed

First that they can ignore error checking, or that error checking can be treated as an afterthought, or that catching the exception means you can ignore the fact that it happened. This gives you code that does things like:

Date date = null;
try {
    date = format.parse(dateString);
} catch (ParseException e) {
    log.debug("exception: ", e);
}
// If parsing failed, date is still null, so this throws NullPointerException:
String message = "The date was " + date.toString();

Second misconception would be that programming will get easy. It doesn't. The problems will grow to match your abilities.

Kevin Peterson

Having been a beginner once, I would say that the biggest misconception on the part of beginners is that

if (thisBool = true)
    blah blah blah...

is valid code.
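What makes the snippet dangerous is that in C-family languages it really does compile. A small C++ sketch of what actually happens (the function name is mine):

```cpp
// `if (flag = true)` assigns true to flag and then tests the assigned value,
// so the branch is taken every time and the original flag value is clobbered.
// Most compilers warn about this with -Wall (-Wparentheses).
bool assignmentNotComparison() {
    bool flag = false;
    if (flag = true) {    // assignment, not comparison: condition is always true
        return flag;      // always reached; flag is now true
    }
    return false;         // never reached
}
```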

well, it can be valid in some languages.
hasen j
True, but it almost never does what the user is intending it to do...

Command lines and text editors are things of the past, I have an IDE so I don't [need to] care about what happens under the hood.

hasen j

The conceptualization of how references are passed in languages like Java and C#.
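A C++ analogue may help here (function names are mine): the pointer is itself copied into the callee, just as object references are passed by value in Java and C#, so reseating the copy changes nothing for the caller while mutating through it does.

```cpp
#include <string>

// The pointer is passed by value: reseat() changes only its local copy,
// while mutate() reaches through the copy to the caller's object.
void reseat(std::string* p) {
    p = nullptr;          // only the local copy of the pointer changes
}

void mutate(std::string* p) {
    *p += "!";            // the caller's object is modified
}
```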

+1  A: 

That they don't have to learn anything new.

+1  A: 

That a 500+ line function is acceptable provided it's well-commented. I've seen experienced developers do this, and refuse to break it down into maintainable chunks because the function "only did what it was supposed to, and each operation was commented."

+1  A: 

That being a software developer is all about knowing programming languages and API's.

Ed Swangren

That you can ignore variable types in a dynamic language.

It's very common to see PHP programmers do things like:

$a = false;
if ($a == "false") ...


$b = "0";
if ($b) ...
+2  A: 

Early on:

  • But isn't all the world an x86?
  • I have to pass a size with that buffer?
  • Error checking? Why?
  • The STL is too complicated. I'd rather implement everything myself.
    • (Use std::swap()! std::swap()! Start there, then branch out to more...)
  • Not knowing that you cannot treat binary buffers as strings without first null terminating them. (Think: read(), recv(), etc.)

Later on:

Wrongly thinking that...

  • That there are 8 bits in a byte.
  • That garbage collection will save you from resource management.

  • Endianness? Padding? I can't just write()/send()/etc. the whole struct?

  • Threads and deadlocks and race conditions oh my.
  • i18n? (2009, and we're still learning that the earth is round!)
  • I could have written this better. Time to rewrite. (Hint: refactor.)
  • Time related, wrongly thinking that:
    • That within a calendar year, DST starts before it ends.
    • That all timezones are + or - whole hours.
    • That the max UTC offset is + or - 12 hours.
    • That there are 60 seconds in a minute.
    • That 1900 is a leap year.

Wrongly thinking that:

  • 16-bit is enough to hold a Unicode code point.
  • I can ignore FOSS libraries that will do 90% of the work for me.
  • That C/C++/Python/Lisp/C#/.Net/Java/VB6/Ruby/PHP/bash/assembler is the perfect language for every task!
There are 60 seconds in a minute, right? What's the catch?
Liran Orevi
Leap seconds add the possibility of a 61st second in UTC, i.e.: 23:59:59 UTC ... 23:59:60 UTC ... 00:00:00 UTC
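The leap-year item in the list above is similar: the Gregorian rule has two exceptions, which is exactly where 1900 trips people up. A minimal sketch (the function name is mine):

```cpp
// Gregorian rule: divisible by 4 -> leap, unless divisible by 100 -> not,
// unless divisible by 400 -> leap again. Hence 2000 was a leap year but
// 1900 was not.
bool isLeapYear(int year) {
    return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
}
```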
+1  A: 
  • That [Insert language, language feature, pattern, technology du jour] is a silver bullet for all problems.
  • That starting over is a good idea.
  • That Unicode is just a compiler switch.
  • That this small change won't break anything.
  • That looking something up is admitting failure.
+1  A: 

"We'll cross that bridge when we come to it"

Mike Robinson
+1  A: 

Most harmful that I have seen is that they don't have to understand what they are doing (i.e. it is acceptable to paste some code you don't understand because it "appears" to work).

It also truly amazes me the number of people who don't seem to grasp the difference between AND and OR.

Also the edge cases have to be handled. Anytime there is a decision point, you must handle all the possible paths from that point even if they rarely happen.

And in database terms, just because the query runs doesn't mean the results are what the user wanted.


Being inflexible when it comes to choices, aka the "when you have a hammer, everything looks like a nail" syndrome. This might include trying to solve every problem in their favorite language.

+1  A: 

A common misconception among beginners and managers is that programming is basically memorizing password-like "codes" and typing the correct code for a problem into the computer. If you don't know the "code" for a problem you're either stupid or lying.

Dour High Arch
I wonder where you or they got that idea.
Cristian Ciupitu
I think there are many influences; the word "code" itself makes people think of secret Da Vinci-like codes; cargo-cult programmers who simply cut and paste source without understanding it think all programming is like this; movies where "hackers" do things by guessing passwords as if a feature is there all along, you only have to guess the right secret code. Plus the natural human tendency to assume that anything one does not understand must be simple.
Dour High Arch
  1. That copying code from MSDN or somewhere else is a valid way to solve a problem

  2. Unit tests and contracts make no sense and it's only for those guys at the university

  3. Design patterns: see 2.

  4. Visual Basic 6.0 and Java are good programming languages

  5. Assembling XML by string concatenation is an intelligent thing to do

  6. Functional programming is complicated

  7. Object oriented design is only there to annoy people

  8. That the more flexible a component is the easier it is to use (which is not only terribad but also terrisad)

Turing Complete