Possible Duplicate:
What is your longest-held programming assumption that turned out to be incorrect?
What do you consider to be the most harmful misconception about programming from people who are new to programming that you have seen?
That they will "break" something!
Or, to define "newcomers" instead as those who don't worry about that: "It'll be easy to change! It's software!"
That the program has to be correct the first time.
Fail fast, early, and often. It's the only way to get better.
Maybe not the most harmful, but they usually can't estimate how long things will take; they think work can be done much faster than it really can (myself included).
As for the harmful stuff, good companies usually keep beginners away from places where they can do much harm. They are usually set to work alongside someone more experienced, so they can learn better.
That if their code doesn't compile or work, it is because of a bug in the compiler.
That all there is to it is building cool new stuff everyday. Maintenance IS a part of programming!
That it is a promising career path and they should all go into it. Then it takes years to clean the primates' code out of the system.
Re-inventing standard library functions/classes.
After going through a language book/tutorial, most beginners - knowing how to handle strings and numbers - will invent their own date functions, their own 'compression algorithms', their own SORT implementations.
Oh, and they always spend their first day searching for clrscr();
That you have to use every feature of the language you are learning, inheritance above all.
Update: being obsessive about inline assembly code in C.
That because their program compiles and runs it does what they expect it to do.
Most new programmers overestimate the intelligence of the compiler, in my experience. This might take the form of expecting C arrays to multiply like vectors or matrices, right down to telling the computer what they want in English ("diagonalize matrix A;"). I've also seen people expect the compiler to be completely aware of all the code right from the beginning, and so be lax about what order things go in.
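The array expectation is a good concrete example. As a minimal sketch (in Java rather than C, but the point is the same, and the class and variable names are made up): there is no operator that multiplies two arrays element-wise, so the loop has to be spelled out.

public class ElementwiseProduct {
    public static void main(String[] args) {
        double[] a = {1.0, 2.0, 3.0};
        double[] b = {4.0, 5.0, 6.0};

        // There is no "a * b" for arrays; the compiler will not infer the intent.
        // Element-wise multiplication has to be written out explicitly:
        double[] c = new double[a.length];
        for (int i = 0; i < a.length; i++) {
            c[i] = a[i] * b[i];
        }
        System.out.println(java.util.Arrays.toString(c)); // [4.0, 10.0, 18.0]
    }
}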
That if their program works on their own computer, then it will work on everybody else's computer too.
"But it works on my machine!"
That programming is all about the syntax. Turns out it is all about problem solving.
Overestimating the importance (and the time share) of actually writing code followed by a little testing/debugging, while underestimating or simply forgetting about writing unit tests, and other important activities such as requirements, writing specifications, design, system test, and customer acceptance.
Thinking if it doesn't look horribly complicated it must be wrong or "bad" code.
I must admit years ago in school I was guilty of thinking my programs didn't look complicated enough! These days I want to cry if something doesn't turn out as simple as:
//start
if(something)
{
do_stuff();
}
//go home
:P
The most common misconception is that you can write an application by starting your favorite IDE/editor and then write code immediately.
Yes, it will create an application. Yes, it's probably cr@p too when you're finished...
You start developing software by first creating a design. Preferably with pen and paper or with some useful tools on your computer. Writing the actual code just happens to be a small part of the whole process. (If not, you're doing something wrong!)
Clever programmers know that:
"I am going to make a ton of money by playing with computers!"
Edit: Another one that drives me nuts:
"The other guy's code isn't calling mine correctly, so it's not my fault the system doesn't work." -- with no proactive investigation, diagnosis, suggested patch, nothing. As a manager or a team leader, this really gets under my skin.
Or, to add insult to injury, the newbie starts improving the performance of a piece of code, makes it 5 times faster and is very proud of himself... until someone reminds him that he only improved a small piece of the whole process, with a net saving of one second in a process that takes two hours.
(I've actually had a colleague who did something dumb. A process had to import half a million records, and he was really proud that he made it start up faster simply by skipping some initialization. As a result, the first log entry would appear within a second instead of after 10 seconds. Unfortunately, the whole process slowed down from 30 minutes to 6 hours...)
"The problem is not in my program, it's a bug in the library / OS / language."
"It worked on my machine! What is wrong with yours?"
"Everything is a pattern, you just have to find them."
"I don't need to test because I only made a one line change."
"Source control is a waste of time for this project."
That their code doesn't need to be documented. They're the only ones who will ever look at it, right?
That the hard part is typing in the code. The farther up you go, the more that comes to be the easy part.
Being resistant to changing code because of some gut feeling that it will be slower, e.g. changing nested ifs to a table-driven approach.
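For illustration, a minimal Java sketch of the table-driven version (ShippingCost, costFor and the rates are invented for the example); the behaviour matches a nested-if ladder, and whether it is actually slower is something to measure, not assume.

import java.util.Map;

public class ShippingCost {
    // A lookup table replaces a ladder of nested ifs; adding a country is one entry, not another branch.
    private static final Map<String, Double> SHIPPING = Map.of(
            "US", 5.0,
            "CA", 7.5,
            "DE", 9.0
    );

    static double costFor(String country) {
        return SHIPPING.getOrDefault(country, 15.0); // unknown countries get the default rate
    }

    public static void main(String[] args) {
        System.out.println(costFor("CA")); // 7.5
        System.out.println(costFor("JP")); // 15.0 (default)
    }
}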
That the rigidity of language syntax is there to annoy them or "for show".
It's not until much later (a course in automata/formal languages, and later one in compilation) that they realize that the reason they have to put that semicolon or close that brace is that otherwise the compiler can't parse their program unambiguously.
This probably comes from the fluidity of natural language, which this generation of students is probably even more apt to believe in thanks to texting.
Another misconception is that when they get a compilation error or an exception, the actual error is exactly on the line indicated by the compiler or runtime.
Unfortunately, the source is often somewhere earlier (e.g., a missing brace) or in some earlier state change, but there's a tendency to stick to the line the compiler/runtime indicated.
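A runnable Java illustration of the "earlier state change" case (the class and field are hypothetical): the NullPointerException is reported on the println line, but the real mistake is the missing initialization above it.

public class LineNumberTrap {
    static String name; // the real mistake: the field is never initialized, so it stays null

    public static void main(String[] args) {
        // ...imagine plenty of unrelated code here...
        System.out.println(name.length()); // the NullPointerException is reported on THIS line
    }
}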
Most of the stuff mentioned here is, as far as I'm concerned, not about the beginner programmer, but about the programmer who has made it past the stage of working out how most of this works - who found that learning the 2nd, 3rd and 4th languages was way easier than the 1st, and so forth - but who has yet to be part of a big "serious" project.
Beginner's misconceptions, for me, include things such as:
- Testing is not important / necessary.
- Unit tests are a waste of time.
- Certain code styles (naming conventions, etc.) are not important.
I can't honestly remember my own ones (and hey, I probably STILL carry around some stupid misconceptions, which is why I haven't realised what mine were, because they still ARE my misconceptions) - but my guess is that they were very much like what I have just described.
That pseudo code is how things are supposed to look. Too many new programmers try to write code like they would write a sentence, and well... it just doesn't work like that.
My wife has a BA in English and is currently going back for a CS degree. I am seeing this first hand as she tries to write her code as:
If Myint = 1
Then
cout ...
Else
Disabusing them of the notion that "perfect but very late" is better than "acceptable and on time".
No one is going to care if some weekly report runs in 5 seconds rather than 8 if it is two months late.
As a part-time instructor I observed that they usually think programming is VERY VERY HARD!
The most harmful misconception is to assume that people in the software industry know what they're doing. Beginners tend to trust everything written in a product's documentation; they trust error messages and exception descriptions. They even trust stuff posted on blogs.
In my experience with newbie friends, I think the most common misconception is that validating data and making your code fail-proof is just a waste of time. Really, EVERYONE in my CA class skips validating the INPUT data!
Just my 2 cents.
That languages like Java, Python, etc. "don't have" pointers, as opposed to C.
(Beware: I put the negative in quotes.)
That everyone else is a genius because they can code something up quickly and you can't. After you sit with them a while, you see they solve problems just like you do, and that it is really a matter of experience, which in turn gives intuition - oh, and that they use search engines, just like you.
I think one that hasn't been mentioned yet is that some students assume they will always have valid data/input. In reality, valid data is only one condition and they forget about all the forms of invalid data/input.
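A minimal Java sketch of what "the happy path is only one case" means in practice (the prompt, range and class name are invented): the valid number is one branch; everything else still has to go somewhere.

import java.util.Scanner;

public class AgeInput {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        System.out.print("Enter your age: ");
        String raw = in.nextLine();

        // The valid value is only one of the cases the code has to handle.
        try {
            int age = Integer.parseInt(raw.trim());
            if (age < 0 || age > 150) {
                System.out.println("Age out of range: " + age);
            } else {
                System.out.println("Thanks, age recorded: " + age);
            }
        } catch (NumberFormatException e) {
            System.out.println("Not a number: \"" + raw + "\"");
        }
    }
}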
The worse misconception I've encountered, and the hardest to be rid of, is that programming is writing code, and not reading it.
That the difference between a successful project, and a failure, is due to technical issues or choices.
In fact, the difference is almost always due to social (people-based / team-based) factors.
The real problem I've seen with programming tyros is "programming is magic", meaning not truly grokking that the computer will operate exactly logically, and will do exactly the same thing every time given the exact same input.
They write something that they think should sort of do what they want, and then when it doesn't work, rather than approach the problem logically, they start changing things semi-randomly, hoping, apparently, to appease the gods of computer magic by their sheer tenacity or willingness to abase themselves upon the altar of whimsy. They feel that the computer is capricious and changes things randomly, and that the best they can hope for is to get things to a vague approximation of working, and hope the stars stay aligned for long periods.
Of course, even to experienced programmers, it can feel that way sometimes, but there is an inherent knowledge that what is happening is happening for a specific reason, and you just have to dig down to get to that reason.
Most harmful misconception (financial version):
"That a college education is required to know or have understanding about how to write software."
The most harmful misconception is: You are done when you get the code to work.
In object oriented programming, using instance variables where local variables would have been more appropriate, especially in multithreaded frameworks (e.g., servlets).
More generally, using a wider scope than appropriate.
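A plain-Java sketch of the difference (the handler class is a stand-in for a servlet, since servlet instances are shared between request threads; the names are made up):

public class GreetingHandler {
    private String name; // BUG: one instance field shared by every concurrent request

    public String handleShared(String requestName) {
        name = requestName;             // thread A's value can be overwritten by thread B right here,
        return "Hello, " + name;        // so A may end up greeting B's user
    }

    public String handleLocal(String requestName) {
        String localName = requestName; // a local variable is confined to this call (and thread)
        return "Hello, " + localName;
    }
}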
Lack of indentation, improper variable names, and poor commenting. It's harmful because they are able to get small programs working despite these mistakes.
First that they can ignore error checking, or that error checking can be treated as an afterthought, or that catching the exception means you can ignore the fact that it happened. This gives you code that does things like:
try {
    date = format.parse(dateString);
} catch (ParseException e) {
    log.debug("exception: ", e);   // the failure is logged and then ignored
}
// if parsing failed, date is still null (or stale) here...
String message = "The date was " + date.toString();   // ...and this line throws a NullPointerException
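For contrast, a sketch of dealing with the failure where it happens rather than later (format, dateString and log are assumed to exist as in the snippet above; whether to rethrow or substitute a default depends on the context):

Date date;
try {
    date = format.parse(dateString);
} catch (ParseException e) {
    log.debug("unparseable date: " + dateString, e);
    throw new IllegalArgumentException("Invalid date: " + dateString, e); // fail here, not later with an NPE
}
String message = "The date was " + date; // safe: date is guaranteed to be set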
Second misconception would be that programming will get easy. It doesn't. The problems will grow to match your abilities.
Having been a beginner once, I would say that the biggest misconception on the part of beginners is that
if (thisBool = true)
blah blah blah...
is valid code.
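For the record (a minimal Java illustration, with the variable name taken from the snippet above): "thisBool = true" is an assignment whose value is true, so it compiles, the branch always runs, and the variable is silently overwritten; the comparison the beginner meant is just "if (thisBool)".

boolean thisBool = false;

if (thisBool = true) {   // assignment, not comparison: always true, and thisBool is now true
    // this branch always executes
}

if (thisBool) {          // the intended test
    // ...
}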
Command lines and text editors are things of the past, I have an IDE so I don't [need to] care about what happens under the hood.
The conceptualization of how references are passed in languages like Java and C#.
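A minimal Java sketch of what actually happens (names invented): the reference itself is passed by value, so reassigning the parameter does nothing for the caller, while mutating the object it points to is visible to both.

import java.util.ArrayList;
import java.util.List;

public class ReferencePassing {
    static void reassign(List<String> list) {
        list = new ArrayList<>();      // only the local copy of the reference changes
        list.add("inside reassign");
    }

    static void mutate(List<String> list) {
        list.add("inside mutate");     // caller and callee share the same object
    }

    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        reassign(names);
        System.out.println(names);     // []  (the caller's reference is untouched)
        mutate(names);
        System.out.println(names);     // [inside mutate]
    }
}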
That a 500+ line function is acceptable provided it's well-commented. I've seen experienced developers do this, and refuse to break it down into maintainable chunks because the function "only did what it was supposed to, and each operation was commented."
That being a software developer is all about knowing programming languages and API's.
That you can ignore variable types in a dynamic language.
It's very common to see PHP programmers do things like:
$a = false;
if ($a == "false") ...   // never true: comparing a bool to a string converts "false" to true
or:
$b = "0";
if ($b) ...              // never runs: the string "0" is falsy in PHP
Early on:
Later on:
Wrongly thinking that...
That garbage collection will save you from resource management.
Endianness? Padding? I can't just write()/send()/etc. the whole struct?
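Following up on the garbage-collection point: GC reclaims memory, but files, sockets and database handles are only released when you close them. A minimal Java sketch using try-with-resources (the file name is made up):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadFirstLine {
    public static void main(String[] args) throws IOException {
        // try-with-resources closes the reader deterministically;
        // waiting for the garbage collector to do it can leak file handles for a long time.
        try (BufferedReader reader = new BufferedReader(new FileReader("example.txt"))) {
            System.out.println(reader.readLine());
        }
    }
}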
Wrongly thinking that:
Most harmful that I have seen is that they don't have to understand what they are doing (i.e. it is acceptable to paste some code you don't understand because it "appears" to work).
It also truly amazes me the number of people who don't seem to grasp the difference between AND and OR.
Also the edge cases have to be handled. Anytime there is a decision point, you must handle all the possible paths from that point even if they rarely happen.
And in database terms, just because the query runs doesn't mean the results are what the user wanted.
Being inflexible when it comes to choices, a.k.a. the "when you have a hammer, everything looks like a nail" syndrome. This might include trying to solve every problem in their favorite language.
A common misconception among beginners and managers is that programming is basically memorizing password-like "codes" and typing the correct code for a problem into the computer. If you don't know the "code" for a problem you're either stupid or lying.
1. That copying code from MSDN or somewhere else is a valid way to solve a problem
2. Unit tests and contracts make no sense and are only for those guys at the university
3. Design patterns: see 2.
4. Visual Basic 6.0 and Java are good programming languages
5. Assembling XML by string concatenation is an intelligent thing to do (see the sketch after this list)
6. Functional programming is complicated
7. Object-oriented design is only there to annoy people
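On point 5: string concatenation breaks as soon as a value contains &, < or quotes, while a DOM builder escapes for you. A minimal sketch with the standard javax.xml APIs (the element name and content are invented):

import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class BuildXml {
    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();

        Element user = doc.createElement("user");
        user.setTextContent("Smith & Jones <admin>"); // escaped automatically on output
        doc.appendChild(user);

        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        System.out.println(out); // <user>Smith &amp; Jones &lt;admin&gt;</user> (preceded by the XML declaration)
    }
}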
That the more flexible a component is the easier it is to use (which is not only terribad but also terrisad)