views: 18862
answers: 190

I am doing some research into common errors and poor assumptions made by junior (and perhaps senior) software engineers.

What was your longest-held assumption that was eventually corrected?

For example, I long assumed that the size of an integer was a standard, when in fact it depends on the language and target. A bit embarrassing to state, but there it is.

Be frank; what firm belief did you have, and roughly how long did you maintain the assumption? It can be about an algorithm, a language, a programming concept, testing, or anything else about programming, programming languages, or computer science.

+58  A: 

That programming is impossible.

Not kidding, I always thought that programming was some impossible thing to learn, and I always stayed away from it. And when I got near code, I could never understand it.

Then one day I just sat down and read some basic beginner tutorials, and worked my way from there. And today I work as a programmer and I love every minute of it.

To add: I don't think programming is easy. It's a challenge, I love learning more, and there is nothing more fun than solving some programming problem.

Ólafur Waage
thanks for the added background information. +1
Demi
Amen! But, hey, don't proclaim this view from rooftops. We don't want *everyone* to know programming is fun, now do we? ;) ;P
Peter Perháč
MasterPeter: It would give us more fodder to increase our rep when they come here asking questions.
TheTXI
I would say that programming *is* hard *to do right*. It is, however, possible, which seems to be your point.
Steve S
Good point Steve, I've updated the answer.
Ólafur Waage
@TheTXI - good point, too :))
Peter Perháč
I also voted to reopen it since it's been changed to CW.
Ólafur Waage
@Olafur: Thank you for the vote and for your answer.
Demi
@Olafur: Why would you want the question to be wiki, but not your answer?
gnovice
This mirrors my experience exactly. I wish I'd started sooner now :P
Skilldrick
Heh, I had this assumption too. It's intimidating to look at code when you are a beginner who doesn't know anything.
Click Upvote
+19  A: 

That programming is easy.

RedFilter
What about programming, more specifically?
Demi
Programming is easy. Programming well or programming correctly, those are not so easy.
jmucchiello
Really? I never noticed... I just figured it was that minds attuned to good programming are rare...
AviD
+195  A: 

That bug-free software was possible.

JaredPar
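A comment from JaredPar further down makes this concrete: even Hello World usually ignores the possibility that its output fails. A minimal C# rendering of that blind spot (the thread's example uses C's printf, so this translation is an assumption):

using System;
using System.IO;

class Hello
{
    static void Main()
    {
        // The canonical Hello World assumes the write succeeds. It can
        // actually fail (closed pipe, full disk when output is redirected),
        // the kind of latent bug even trivial programs carry.
        try
        {
            Console.WriteLine("Hello, world!");
        }
        catch (IOException)
        {
            Environment.Exit(1); // can't report via stdout; that's what failed
        }
    }
}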
+1, although NASA almost managed it
Patrick McDonald
Yes, but the "almost" cost a few million dollars :)
Jem
It hasn't yet been achieved, but it's definitely possible (as long as we continue to work with deterministic digital hardware). We'd have to start from scratch with a new, carefully engineered OS and hardware free of serious design faults.
Triynko
What's deterministic digital hardware?
Liran Orevi
To be a bit more specific, "Bug free software was possible on a normal budget"
Frank Farmer
You will never have bug-free software, just software that hasn't had any bugs found yet.
Mark Glorie
+1 to Patrick; it's almost a utopia, but I think everyone knows it's not bug-free software, it's almost-bug-free software.
dr. evil
I don't want bug-free programs, it's not fun, and it would drop the employment rate of developers by 10!
Nicolas Dorier
Wow, got a -1 vote for this answer. Do people really think bug free software is possible?
JaredPar
@Triynko your "possible" and @JaredPar's "possible" are not the same. Theory and practice might be the same in theory but are very different in practice.
wilhelmtell
Depends if you mean algorithmic correctness or 'does what the user wants it to do'. The former is certainly possible.
SnOrfus
@wilhelmtell: Theory and practice are only different when you don't know what "theory" means. People seem to confuse it with "conjecture". @Jared: yeah, I always thought if I could just be smarter, try harder, learn more, I could do it. This is the same reason we make so many laws, have so many years of schooling, take so many drugs, etc.: maybe with a little more control we can make everything perfect, right? (Off topic much?)
Jay Bazuzi
@JaredPar I've seen many Hello World applications that are COMPLETELY bug free =P In fact, a lot of programming books I pick up have this little guy in one fashion or another, and I can't think of a one that had a bug in it =P
Joseph
@Joseph, part of the problem is people think Hello World programs are bug free. They're not. Most do not check for errors in printf for instance or account for other failed IO attempts.
JaredPar
@JaredPar's right--most trivial software has bugs. All non-trivial software has bugs. It is simply not possible to write interesting, bug-free software with today's tools; and we seem to be cool with that for the most part. Software really does an amazing job considering how complex it gets. Even the notorious Office always seems to recover my unsaved documents should I cross it :). We've come a long way this decade in *handling* the inevitable bugs better.
Michael Haren
Autopilots with Autoland systems for aircraft must be capable of more than 1 million landings without an accident. Most use three computers working in parallel: 1 flying the plane, 2 checking the first for mistakes. That is about as close to bug-free as you will get. For the record, I worked on those systems, but I did not program them.
Jim C
I'll have to downvote this. It *is* possible to have bug-free software, and there is a constantly-developing science to achieve this. Convincing yourself that there is no bug-free software is just an excuse so that you don't have to care. That helps to stagnate an important research subject in Software Engineering.
Juliano
I know one - TeX. There's even a prize for each bug, afaik.
Tobias Langner
Isn't this C program bug free? void main() {} ;-)
RussellH
@RussellH, no. You've failed to specify a return value and the resulting process will return random garbage memory.
JaredPar
@JaredPar: Bug free software is completely possible. However, it requires patience, and very skilled programmers, both of which are in short supply. Which makes Bug free software possible, just unlikely.
Chris Lively
"Bug free" software has to account for EVERY eventuality. Including bad design and UI, faulty components, and to take it to the extreme then maybe even power failures?
Robin Day
I still don't understand: what is a bug, actually? :)
THEn
Please don't dissect the literal meaning of the word "possible." We're computer programmers. Writing bug-free software is simply not possible. :)
Donotalo
+200  A: 

That non-programmers understand what I'm talking about.

Robin Day
understand/care..
nickf
I still have this one at times... I thought at least my wife would have started to understand properly by now :P
workmad3
Oh dear, I fear I may be yet to learn this!
thatismatt
yeah, sometimes I forget my audience and end up with a bunch of people with blank looks on their faces staring at me. It's nice when people show an interest though.
Petey B
This is my biggest frustration with the profession.
Andres Jaan Tack
I still struggle with this; at least now I recognise the glazed expression from my wife when I'm talking techie!
JLWarlow
+171  A: 

That private member variables were private to the instance and not the class.

Dave Webb
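A minimal C# sketch of what the answer means (class and member names are invented for illustration; C# and Java both scope "private" to the class, not the instance):

using System;

class Person
{
    private string myVar;

    public Person(string value) { myVar = value; }

    public bool SameAs(Person other)
    {
        // Legal: "private" restricts access to the class, not to the
        // individual instance, so we can read other.myVar directly
        // instead of going through a getter.
        return myVar == other.myVar;
    }
}

class Demo
{
    static void Main()
    {
        Console.WriteLine(new Person("a").SameAs(new Person("a"))); // True
    }
}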
That's a common one. I usually have to provide test cases before people will believe me.
Bill the Lizard
Some people think they should be. See 'case 4' in http://gbracha.blogspot.com/2009/03/subsuming-packages-and-other-stories.html
Daniel James
I held that assumption until... just now.
TheMissingLINQ
You can do that in Scala: private[this] val foo = 42
egaga
This one is new to me! Although I can't think of a case where this would have been useful to me in the past.
ebrown
@ebrown I usually only find it useful when writing an equals() method
Dave Webb
They are in Ruby.
Mike Kucera
I found this out recently when creating a Clone function. I was surprised it worked.
Meta-Knight
Mike Kucera, thanks, I didn't know that about Ruby. Worse, it hasn't come up yet :)
Yar
I knew it after 3 or 4 years of hard programming
Nicolas Dorier
This is so normal to me that this answer didn't make sense the first few times I read it. Now I want to learn Ruby so it can confuse me the other way. :)
jmucchiello
VB6 uses the instance-only meaning for private. It actually has no scope for 'this class only'.
Strilanc
Wow... I had no idea
Graphain
New to me too! Thanks :)
rmz
I think my brain just broke... I need to go fix it now
Joseph
For a moment, I thought this said "values", not "variables", which made me wonder how in the world any code I've written to date has ever worked.
Jeremy Frey
It depends on the language. In C# that is correct, but I'm not sure that is true for every language that supports encapsulation.
Krzysztof Koźmic
What are you talking about? Does somebody have an example? :P
Kiewic
@Kiewic If you have a private member variable called myVar inside your class you can reference other.myVar directly in your code if other is an instance of this class. I had assumed because it was "private" you had to use other.getMyVar() even inside the class.
Dave Webb
@Dave Webb It all makes sense now, thank you!
Kiewic
A discussion about this answer can be found here: http://stackoverflow.com/questions/1357496/can-you-explain-this-thing-about-encapsulation
Moayad Mardini
I was never confused by this. How else would you overload a comparison operator?
Matt Brunell
@Dave Webb, shouldn't your *incorrect* assumption be "That private member variables were private to the instance *and* the class." In other words I can access directly a private variable on another object, if that object is the same class as me. I'm confused.
Ash
I learn this and then forget it ... and then learn it and then forget it ...
cplotts
Wow this actually just rocked my world slightly.
Finbarr
@Mike Kucera: That's like saying "Nobody can jump 10m high - god can".
Vincent
+17  A: 

That having no defects is possible before going live.

It is definitely not true; even P2 defects get left open at times.

TStamper
How about the assumption that your internal names for priority levels are my internal names for priority levels? Over here, what y'all call TPS reports are called SRP reports! ;)
Doug McClean
+32  A: 

That condition checks like:

if (condition1 && condition2 && condition3)

are performed in an unspecified order...

User
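For reference, the languages discussed below do specify the order: && evaluates left to right and stops at the first false operand. A small C# sketch of the idiom that guarantee enables (myList is a made-up variable):

using System;
using System.Collections.Generic;

class ShortCircuitDemo
{
    static void Main()
    {
        List<int> myList = null; // imagine this came from elsewhere

        // Left-to-right with short-circuiting: when myList is null the
        // first operand is false, so myList.Count is never evaluated.
        if (myList != null && myList.Count > 0)
            Console.WriteLine(myList[0]);
        else
            Console.WriteLine("empty or missing");
    }
}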
In what language? Languages like C/C++, Java, and Python guarantee that the conditions are evaluated left to right and that evaluation stops at the first condition that returns false. It's part of the language spec. I assume that most other languages make the same guarantee.
Clint Miller
@Clint: Yeah, hence "that turned out to be incorrect".
bzlm
yeah, this one is cool. It makes writing stuff like if (myList != null && ...) a lot easier
Zack
Lucas
Damien
@Damien: In C# it won't; try it out yourself. In pre-.NET VB that would crash.
Binary Worrier
New thing learned; you are correct, it is in order.
Damien
. . . Actually it will crash in VB.NET too unless you use AndAlso, re Lucas' comment
Binary Worrier
Some languages don't have an order specified. For the languages that do, they usually brag about how they have "short-circuit" logic.
Unknown
in C++ they may be performed in an unspecified order AFAIR.
Krzysztof Koźmic
As I read this I remembered that I was actually told in my CS class that I couldn't trust the execution order, and we were told to write => if(x!=null) if(x.getX()==5) <= so it would always work. I totally forgot about that until now :)
edorian
Note that in C/C++, function arguments are evaluated in an unspecified order. For example: `printf("%d, %d\n", i++, i);`
Joey Adams
+52  A: 

That anything other than insertion/bubble sort was quite simply dark magic.

Robin Day
Haha, I like this one, as it hits close to home. Sort in faster than n-squared time?? Unpossible!
Ross
I am a RESEARCHER in sorting algorithms! And they STILL feel like dark magic.
SPWorley
No amount of recursion helps me grok sorting networks, such as bitonic sorts.
TokenMacGuy
I once had a line of code in my program that was long and complicated, and I didn't feel like breaking it up or explaining it (it was some complicated lighting formula), so I put it all on one line and #define'd it to be DARK_MAGICK, and the only comment was a warning against trying to unravel the mysteries of the dark magick
Alex
Arno, I guess that's to be expected. It seems a fundamental property of our world that the closer you look at something, the crazier it gets. Atoms, people, algorithms... you name it. (As a side note, I just realized that if I cared about such things I'd accept that as the basis for a proof of the existence of a god.)
peterchen
Bogosort is the most mysterious of them all.
Alex Beardsley
+12  A: 

I used to assume it's enough to program Win32 applications.

Also that every program must come with a GUI, because command-line is "outdated".

Peter Perháč
+1 These definitely hit home.
Evan Plaice
+71  A: 

That all languages are (mostly) created equal.

For a good long while I figured that the language of choice didn't really make much of a difference in the difficulty of the development process and the potential for project success. This is definitely not true.

Choosing the right language for the job is as important/critical as any other single project decision that is made.

Overhed
I feel that choosing the right libraries is what matters. It just so happens there's often a 1-to-1 correspondence between languages and libraries...
Kevin Montrose
But if two programming languages are both Turing complete then what's the difference? You can write any program in either language! ;)
Bill the Lizard
I disagree; the decision of what language to use is way less important than who will actually be implementing the project, as just one example of many other more important decisions.
Boris Terzic
A study referred to in the (still) wonderful project management book "Peopleware" concluded that despite the strong opinions language choice engenders, there wasn't a strong difference in productivity between languages (except for assembly, which notably underperformed higher-level languages). However, that study was from around the time when Ada was the hot new language on the block, so it may be time to redo such a study.
Daniel Martin
While Turing-complete languages do make it possible to implement the same application, the syntax and metaphors of some languages do often lend themselves to particular problem types
Crippledsmurf
BrainFu** is as Turing-complete as Python is.
hasen j
Think about doing COM Integration with C# vs VB.NET. The optional parameters... This goes away with C# 4.0, but to this point it was/is true.
Nate Bross
The usual wisdom is that productivity in lines of debugged code per unit time is roughly the same across languages, while the number of lines needed for a given amount of functionality can vary considerably.
David Thornley
That Turing complete languages are somehow equally applicable is a common misconception. A Turing complete language can compute everything that a Turing machine can (and often implied the other way around too). There is absolutely no implications regarding performance. An operation that takes linear time in one language could very well take exponential time on another and they could still both be Turing complete. There's a huge difference between what's theoretically computable and what is feasible in practice.
TrayMan
@TrayMan, I just assumed the comment from Bill the Lizard was a joke.
Alex Baranosky
+281  A: 

That people knew what they wanted.

For the longest time I thought I would talk with people, they would describe a problem or workflow and I would put it into code and automate it. Turns out every time that happens, what they thought they wanted wasn't actually what they wanted.

Edit: I agree with most of the comments. This is not a technical answer and may not be what the questioner was looking for. It doesn't apply only to programming. I'm sure it's not my longest-held assumption either, but it was the most striking thing I've learned in the 10 short years I've been doing this. I'm sure it was pure naivete on my part but the way my brain is/was wired and the teaching and experiences I had prior to entering the business world led me to believe that I would be doing what I answered; that I would be able to use code and computers to fix people's problems.

I guess this answer is similar to Robin's about non-programmers understanding/caring about what I'm talking about. It's about learning the business as an agile, iterative, interactive process. It's about learning the difference between being a programming-code-monkey and being a software developer. It's about realizing that there is a difference between the two and that to be really good in the field, it's not just syntax and typing speed.

Edit: This answer is now community-wiki to appease people upset at this answer giving me rep.

Instantsoup
Or change what they want after seeing what they previously wanted. People like to change their minds. I know, cuz I'm a people.
sheepsimulator
You were giving them what they asked for, not what they wanted.
Brent Baisley
Why do boring uncontroversial no-answers get up-voted so excessively?!
nes1983
I mean, dude, that is not a statement that can be "wrong" or "false"; it isn't really a statement, more your perception that changed. It doesn't satisfy the terms of the original question and describes more your increasingly pessimistic look upon mankind than your progress as a scientist.
nes1983
Wow. Sounds like someone needs a hug.
bzlm
@niko- this is a good answer..even I upvoted it and my answer has to compete ;)
TStamper
@Brent Baisley : Worse, you are giving them what you think they asked for based on what they asked for, not what they wanted.
Hao Wooi Lim
What lead you to make that assumption in the first place?
Daniel Daranas
Not to be pedantic, but it seems like what you're finding is that building systems is an INTERACTIVE process. Idea, prototype (and sometimes deployment), trash bin, new idea... etc. Or you can see it as "people are stupid" which is true as well...
Yar
That's why I like agile project management.
Scoregraphic
@Niko: The answer is OK. The fact that he got nearly 400 Rep off of it is BS.
gnovice
On the other hand, some people know what they want but it's absolutely non-sensical. For example, one of our clients "I know this guy, and personally I know that he has this much money, we want to highlight him on our web page, but we can't store anywhere anything that indicates this, can you do it?". Hmmm.
Kezzer
"I think you can have what you want or what you need, but you can't have both... usually." (from Hal Hartley's "Simple Men")
Daniel Daranas
@Daniel Daranas: Sounds like the uncertainty principle for software...
Treb
My god @ people complaining, stackoverflow rep is not a competition. Upvote if you enjoyed the answer, don't downvote because you are jealous you didn't post it first.
Dmitri Farkov
@All - expounded on my reasonings and stopped the rep gain. Thanks for the comments, I was surprised this answer generated what it did!
Instantsoup
This definitely deserves to be the highest upvoted answer. The idea that naively asking users what they want will produce the best product is widespread and very, very wrong. Especially important to keep in mind when building "shrinkwrap" or web software: the users that are giving you feedback are only a subset of your total users, and this will skew your perspective.
Wedge
Maybe I want it to be the highest voted answer. Maybe I don't. I'm not sure.
Daniel Daranas
My favorite is when you see something wrong with what the customer asked for, and point it out. "When I build it like this, it's going to do X, which I'm almost certain you don't want it to do". They tell you to build it anyway, and then 6 months later panic when the release is pushed back because it's doing X and it needs to be 'fixed'.
Jherico
@Jherico this is called a "change request" in the contract = $$$.
Daniel Daranas
Sometimes customers make you implement something just to help them decide what they want. Ideally, a "proof of concept" should do it, but it happens very rarely.
Sergiu
Damn..so true for me
kudor gyozo
+33  A: 

For the first few years I was programming I didn't catch on that 1 Kbyte is technically 1024 bytes, not 1000. I was always a little perplexed by the fact that the sizes of my data files seemed slightly off from what I expected them to be.

gnovice
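A throwaway C# sketch of where the discrepancy comes from (the 300 GB drive is the example a commenter uses below):

using System;

class KilobyteDemo
{
    static void Main()
    {
        long marketing = 300L * 1000 * 1000 * 1000; // "300 GB" on the box
        long binary    = 300L * 1024 * 1024 * 1024; // 300 GB as the OS counts it

        // The OS divides by 2^30, so the "300 GB" drive shows up smaller.
        Console.WriteLine(marketing / Math.Pow(2, 30));              // ~279.4
        Console.WriteLine((binary - marketing) / (double)marketing); // ~0.0737, the "7.3%" gap mentioned below
    }
}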
Hard drive manufacturers still haven't caught on...
Michael Myers
@mmyers I think you mean hard drive marketers right? Or are the drives actually built like that?
Instantsoup
well they'll manufacture a drive with 300 billion bytes and call it 300GB, when 300GB is (for most intents) 300 x 2^30 bytes (7.3% difference there!)
nickf
There's a silly (and rarely used) "formal" definition of 1024: "kibi-". As in "A Commodore 64 has 64 Kibibytes of memory". Ugh. It is confusing in some instances. Certain specific areas of computing use K=1000 like networking bitrates (Kbit/s), and others use it for marketing reasons (HD's mostly). Memory and file sizes are normally quoted in K=1024/etc.
jesup
I may date myself with this statement, but every time I see/hear the word "Kibibytes", I think of that character from "Fat Albert" (Mushmouth, I think) who added B's to everything he said. "Probagrambing is rebabeally harbd."
gnovice
Hey, stop the kibi hating. MeBi and KiBi are at least unbambiguobus.
bzlm
Kilo means 1000, Mega means 1000000, Giga means 1000000000. It's the RAM and OS makers that got it wrong, not the drive makers.
Mark Ransom
@Mark Ransom: Actually it's not the RAM/OS makers that got it wrong. In binary, 1024 is 10 bits all set to 1, it is a nice round figure [10000000000], as all data is stored in binary it makes sense to use a KB as 1024. What is retarded is that everyone uses different standards. They should all use one or the other, regardless of which they pick and stop confusing the hell out of everyone...
BenAlabaster
Those prefixes had a defined meaning long before someone tried to adapt them to binary numbers. You shouldn't need to know the context to decipher the prefix meaning. I understand how it came about, but I think it was a mistake that has caused confusion for far too long.
Mark Ransom
Wow, looks like I unintentionally set off quite the feud. =)
gnovice
No one's going to do it? Seriously? Okay, I'll do it... http://xkcd.com/394/
Erik Forbes
Maybe we should go back to binary coded decimal... math functions would make fewer mistakes... of course they would be slower and use more RAM, but who cares :) (4th time's a charm)
Matthew Whited
And it's Erik FTW!
gnovice
Wait ... it's not?
hasen j
IIRC IBM used 2k48 and 4k96 (about the memory size of their mainframes) during the sixties, and in informal talk it was just "4k". In the beginning only small numbers were used and the difference was so small that nobody cared, or "everyone" knew by the context if 1k meant 1000 or 1024.
some
@BenAlabaster: Actually 1024 (1 KiB) is 2^10 or binary 10000000000, which is certainly *not* 10 bits all set to 1. And Mark is quite correct, the definitions of Kilo and Mega were around and defined in engineering circles long before computer guys borrowed them for their own (inexact) uses. It's time for computer geeks like us to let it go, admit we were wrong, and start using the right notation to mean the right thing.
Software Monkey
+9  A: 

That XML namespaces (or worse, well-formedness) are in some way more difficult than trying to do without them.

A very common blunder, even at the W3C!

Simon Gibbs
It's not that they're worse. It's that they take a language that's already pretty ugly/verbose and make it a lot more ugly/verbose.
Evan Plaice
A: 

This language also OOPs....

Access Denied
OOPs, I did it again?
bzlm
What language? C#? English?
Triynko
There is Logic dude in "OOPs" you didnt get it i.e. "Object Oriented Programing".
Access Denied
Still doesn't make sense.
nicholaides
Object Oriented Programming Style. The 's' should be in uppercase, or the grammar changed.
Arafangion
+35  A: 

I believed that creating programs would be exactly like what was taught in class...you sit down with a group of people, go over a problem, come up with a solution, etc. etc. Instead, the real world is "Here is my problem, I need it solved, go" and ten minutes later you get another, leaving you no real time to plan out your solution efficiently.

Aaron
I think that's called life.
Robin Day
hmmm... it's time you bail out of that company...
jpinto3912
@jpinto3912: No. Because the next company will also be a part of life (see previous comment).
Treb
Hmm, this is how we work :(
kudor gyozo
+15  A: 

That .NET structs (C# and VB.NET) were reference types, just like classes.

I "received" that piece of wisdom at some point shortly before or after .NET 1.0 arrived on the scene (I've no idea where from, it may have sprung whole from my mind, like Athena from the brow of Zeus), and kept it until disabused of the notion by Jon Skeet about 4 months ago.

Thanks Jon.

P.S. Not programming related, but I also believed (until about 5 minutes ago) that "Apollo sprang whole from the brow of Zeus".

Binary Worrier
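A small C# sketch of the difference the answer is about (type names invented): assigning a struct copies the value, assigning a class instance copies only the reference.

using System;

struct PointStruct { public int X; }
class  PointClass  { public int X; }

class StructDemo
{
    static void Main()
    {
        var s1 = new PointStruct { X = 1 };
        var s2 = s1;             // value type: s2 is an independent copy
        s2.X = 99;
        Console.WriteLine(s1.X); // 1: the original is untouched

        var c1 = new PointClass { X = 1 };
        var c2 = c1;             // reference type: both point at one object
        c2.X = 99;
        Console.WriteLine(c1.X); // 99: the change is visible through c1
    }
}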
Athena came from the brow of Zeus. Apollo was born the old fashioned way
Brian Postow
It shows that I didn't attend a University, I never studied the classics properly . . . Oh the shame . . .
Binary Worrier
If you use Vb.Net, you are studying the classics every day.
bzlm
Upvoted just for the first comment and the last sentence.
Michael Myers
In C++ struct is the same as class so most people that come to C# assume it's the same here.
kubal5003
+42  A: 

That C++ was somehow intrinsically better than all other languages.

This I received from a friend a couple of years ahead of me in college. I kept it with me for an embarrassingly long time (I'm blushing right now). It was only after working with it for 2 years or so that I could see the cracks for what they were.

No one - and nothing - is perfect, there is always room for improvement.

Binary Worrier
It's not?! Uh oh....
Drew Hall
"better" will bring you tons of less-than-hateful comments. But I would say it is one of the most fast-executing, flexible, free-from-hurdles one. It's also one that takes your youth to proper learn it, only to find you could do more or less the same app. (albeit requiring some extra tonne or two of electricity-generating coal) with java or C#.
jpinto3912
@JP: I'm happy with my choice of words :)
Binary Worrier
Productivity matters more in the world of business applications. Of course, there are some niches where C++ is required, and the only option.
Shaw
@Shaw: Indeed recently - for a pet project - I chose to write one particular component in Managed C++, purely for performance reasons. It's just that I no longer believe it's "intrinsically superior" to all other languages.
Binary Worrier
I've always assumed C++ is worse than straight ANSI C, simply because the kind of trouble that I've seen C++ programmers get into is so much more complicated than the kind of trouble I've seen C programmers get into.
Nosredna
@Nosredna: Is that an assumption for the list? :)
gridzbi
Actually, the language that's better than all other is Common Lisp. C++ isn't bad, though.
David Thornley
+1  A: 

That software engineers are always honest about what they are doing now or have done to your software in the past.

u07ch
+60  A: 

"On Error Resume Next" was some kind of error handling

Paulo Guedes
I feel you...but in vbscript (esp. asp), it was the ONLY "error handling" option available, combined with judicious checking whether an error actually occurred, and a fair amount of prayer.
flatline
Yeah... it is some kind... just a kind that we are glad to be getting away from
Matthew Whited
Well?! but it is. You start your error-handling block with On Error Resume Next, try something, and then If (err.number<>0) then
jpinto3912
Isn't this the only vbscript equivalent to a try catch?
James
-1: It is a kind of error handling. It just isn't that elegant.
JohnFx
Resume Next thing is awesome :) Ignoring anything and saying "Of course it will work, I wrote it!" (I did that before.)
JCasso
I know it is fun to pick on this construct, but to be fair it was INTENDED to be used for inline error handling If (Err.code=12...)
JohnFx
1 more reason why VB should die a quick and painful death. @jpinto3912 Add '<>' to the die/quick/painful list too while you're at it.
Evan Plaice
@Evan yeah, I'm still typing != for the first 30mins I start vba-ing. But to wish VBA would die quickly over try-catch semantics and the odd <> is a bit too much, hmm?
jpinto3912
@jpinto3912 Obviously, I'm not a fan of VB. The language has awkward semantics and a lot of awkward practices that are special to VB. IMHO, it's a MS Word macro language that grew to be way more than it ever should have been. Mostly, it irritates me that it's a lot harder to find C# info on Goog because everything about .NET is flooded with really old VB examples and Goog gives age precedence over usefulness in that case. I'll admit. I have religious anti-VB views. Hence the whole 'death to the infidels' speak ;)
Evan Plaice
@jpinto3912 '<>' isn't really so much a VB-specific thing. It just rubs me the wrong way because 'x <> y' literally means 'x is less-than greater-than y'. So, x becomes omnipotent when y is involved ;). I like != because it's a simple 'not equal'
Evan Plaice
But there's on error goto!
box9
+17  A: 

Ok, I learned programming rather early. I was 14 or so. And I held all kinds of crazy beliefs, but don't ask me about the precise timing, because that was a … long while ago.

  • Ok, so, I believed for a while that if you use the synchronized keyword in Java, then Java solves this nasty synchronization thing for you

  • I believed for at least half a year, likely more, that static typing would improve performance.

  • I believed that freeing something would return memory back to the OS.

  • I believed that malloc calls boil down to checking if there is enough free space on the OS, so malloc would be inexpensive.

  • I thought a long while that Java was built with all the benefits and flaws of the other languages in mind, into a "perfect blend" that would take the best properties of the other languages and reject the mistakes.

  • I vastly overestimated the number of cases where LinkedLists outperform ArrayLists.

  • For a while I thought that NP-hardness was a proof that no INSTANCE could be solved efficiently, which is trivially false.

  • I thought that finding the best flight-plan on travel agency web sites would take so long because of the "Travelling Salesman Problem", as I proudly chuckled to my relatives (when I was small, alright?!)

Could come up with more. No idea how long I stuck to each of them. Sorry.

PS:
Ahh, ok, this one got cleared up not so slowly, but I see newbies do this every now and then, so I thought you might be interested: I also thought that to store an uncertain number of things, you'd need to declare a new variable for each. So I'd create variables a1, a2, a3, ..., rather than using one variable a, which I would declare to be a vector.

nes1983
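For the PS, a minimal sketch of the two approaches, in C# since the thread spans several languages:

using System;

class ArrayDemo
{
    static void Main()
    {
        // The newbie way: one named variable per value.
        int a1 = 10, a2 = 20, a3 = 30;
        Console.WriteLine(a1 + a2 + a3);

        // One variable 'a' holding all of them, which still works
        // when the number of things isn't known until runtime.
        int[] a = { 10, 20, 30 };
        int sum = 0;
        foreach (int value in a)
            sum += value;
        Console.WriteLine(sum);
    }
}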
No, no - you're supposed to create a1, a2, a3 etc but they're ALL supposed to be vectors.
AviD
That traveling salesman just made my day. :D
Andrew Szeto
Wait, so after half a year your statically-typed program runs slower? Memory gets "long" when you free it? free space on the OS? ...?!
Qwertie
+56  A: 

That optimizing == rewriting in assembly language.

When I first really understood assembly (coming from BASIC) it seemed that the only way to make code run faster was to rewrite it in assembly. It took quite a few years to realize that compilers can be very good at optimization, and especially with CPUs with branch prediction etc. they can probably do a better job than a human can in a reasonable amount of time. Also that spending time on optimizing the algorithm is likely to give you a better win than spending time converting from a high- to a low-level language. Also that premature optimization is the root of all evil...

danio
Peek and Poke are your friends :)
Matthew Whited
Pervert! Say that to the judge!
scraimer
This is where complexity theory comes in. Assembly is generally micro-optimization. Making your algorithm's time complexity smaller is where speed is gained.
PeteT
@scraimer: Fancy seeing you here, I never would have expected it ;-)
Robert S. Barnes
@ Matthew - "Peek and Poke are your friends :)": **EXTREMELY jealous I didn't write that first.
FastAl
A: 

Of course you could look at FindBugs and PMD but these are my favorite gotchas and tricks (all Java):

Fields are not overridden, they are shadowed.

There is no explicit super.super access.

Classes with no constructors defined have an implicit zero-argument constructor. I made a practical error related to this one this year.

To get a reference to an inner class's parent you can use the syntax "Outer.this" to disambiguate method calls or synchronize.

Classes are "friends of themselves" in C++ terms, private methods and fields of any instance of that class can be referenced from any method of the same class, even static methods. This would have made some of my early clone() and copy constructors much simpler.

Protected methods and fields are accessible in a static context of extending classes, but only if that class is in the same package. I'm glad that flex.messaging.io.amf isn't a sealed package.

Karl the Pagan
Okay, which language is this? I'm pretty sure it isn't C or C++.
David Thornley
Java. FindBugs and PMD are Java static analysis tools.
Alan
+91  A: 

That software quality will lead to greater sales. Sometimes it does, but not always.

Ian Ringrose
Selling software? That's so 1999.
bzlm
Lots of subscription-based websites nowadays...
altCognito
Microsoft sure makes a killing at it.
asp316
gotta love this one, so true.
dr. evil
Also true of music, sadly.
RedFilter
I wish that improving the quality / performance of our software counted as a feature
Tom Leys
+21  A: 

This is embarrassing, but for the longest time I didn't really grasp the difference between reference types and value types. I thought you had to use the ref keyword to change an object in a different method.

This is one of the most fundamental concepts to C# that I should have known.

Aaron Daniels
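A small C# sketch of the distinction (the Box type is invented): you don't need ref to mutate an object through a reference-type parameter, only to make the caller's variable point at a different object.

using System;

class Box { public int Value; }

class RefDemo
{
    static void Change(Box b)      { b.Value = 42; }              // mutates the caller's object
    static void Replace(Box b)     { b = new Box { Value = 7 }; } // only rebinds the local copy
    static void Replace(ref Box b) { b = new Box { Value = 7 }; } // rebinds the caller's variable

    static void Main()
    {
        var box = new Box { Value = 1 };

        Change(box);
        Console.WriteLine(box.Value); // 42: no ref needed to change the object

        Replace(box);
        Console.WriteLine(box.Value); // still 42: the reassignment didn't escape

        Replace(ref box);
        Console.WriteLine(box.Value); // 7: ref swapped the object itself
    }
}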
You'd be surprised how many developers do not know the difference.
Richard Hein
When I did phone interviews, I would always have a question for the candidate about this because it's so commonly misunderstood.
Kyralessa
...and most of them missed it. They could explain the difference between pass-by-value and pass-by-reference for value types, but few of them grokked it for reference types.
Kyralessa
correct me if I'm wrong (not 100% sure): value types = int/bool/decimal etc. and reference types = classes?
LnDCobra
@LnDCobra you're right and wrong. Classes are fundamentally reference types but value types (int, bool, decimal) can also be passed by reference using the ref keyword.
Evan Plaice
+1  A: 

That 640K should be enough for anybody (DOS). That was widely believed by a lot of people for a number of years.

The first time I had a system with 8MB of RAM, I thought that was far more than I needed. That ran the OS (Mac) plus all the applications I was using (Word, Email, Firefox, etc).

Brent Baisley
You ran Firefox on an 8MB machine? What decade was this, and how did you get a hold of such an early copy ;) (intended sarcasm)
Evert
How is this assumption programming related? Did you use Word, Email (is that an actual application?) and Firefox to program?
bzlm
His statement was about memory usage from programming... while his examples were not. I don't see why this was down voted.
Matthew Whited
Dude, there wasn't a Firefox back then. And Word was probably Notepad, lol.
hasen j
You're right, it was Mosaic (NCSA?). I actually meant to say FoxBase, not Firefox. And there was a program called "Mail", which Microsoft bought. They also bought Fox Software, makers of Foxbase.
Brent Baisley
@Brent Baisley: then why don't you edit your answer?
Cristian Ciupitu
+255  A: 

That I know where the performance problem is without profiling

lothar
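In that spirit, even a crude stopwatch beats guessing; a C# sketch (the two candidate functions are invented stand-ins for wherever you think the time goes):

using System;
using System.Diagnostics;
using System.Linq;

class ProfileFirst
{
    static long SumLoop(int[] xs) { long s = 0; foreach (int x in xs) s += x; return s; }
    static long SumLinq(int[] xs) { return xs.Sum(x => (long)x); }

    static void Main()
    {
        int[] data = Enumerable.Range(0, 1000000).ToArray();

        foreach (string name in new[] { "loop", "linq" })
        {
            var sw = Stopwatch.StartNew();
            long result = name == "loop" ? SumLoop(data) : SumLinq(data);
            sw.Stop();
            // Let the measurement, not intuition, say where the time goes.
            Console.WriteLine("{0}: {1} ms (sum={2})", name, sw.ElapsedMilliseconds, result);
        }
    }
}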
I think this is why premature optimization is so commonplace.
Hao Wooi Lim
+1 Wow, someone included an answer that wasn't trivial or off-topic.
Mark Rogers
I've got some tablets that should help with premature optimization...
AndyM
+92  A: 

I thought I should move towards abstracting as much as possible. I got hit in the head majorly with this, because of too many intertwined little bits of functionality.

Now I try to keep things as simple and decoupled as possible. Refactoring to make something abstract is much easier than predicting how I need to abstract something.

Thus I moved from developing the framework that rules them all, to snippets of functionality that get the job done. Never looked back, except when I think about the time I naively thought I would be the one developing the next big thing.

Evert
Decoupled = true Abstraction. Abstract for its own sake is... premature optimization.
Jared Updike
This goes along with what I've found doing performance tuning. There can be a lovely program with multiple layers of abstraction. Then the workload gets heavy, and guess what is costing all the time ... all the abstractions. Computers execute instructions, not abstractions.
Mike Dunlavey
Abstraction and generalisation are powerful tools, sadly used to generalise an abstract use case with one single implementation. The funny thing is that whenever there is a need to change the implementation, the abstractions and generalisations have to change too...
KarlP
I totally agree with Jared ... if you have managed to get to "simple and decoupled" you have achieved true abstraction. How can things be decoupled if you haven't abstracted things out into interfaces and factories etc...? How can it be simple unless you remove all the "if type = this then do this, or if the type is that then do something else code"?
Richard Hein
Same here. I think I learned about abstraction *before* making a whole lot of spaghetti code. They should've taught how to get things done even if the code is spaghetti, and *then* teach us about OO and abstraction.
hasen j
What about abstraction in the event that you may make use of it in the future, even if it's inefficient now?
Andrew Weir
Andrew, the point I tried to convey was that you can probably refactor your code when you do need it. I realize this is a blanket generalized statement, but it's a good rule of thumb.
Evert
This is biting me right now! I think I fell into "architecture astronaut" mode when I was designing my software, and now that I'm implementing it, I see that I wasted a lot of time making it much more flexible than necessary. I'm still in favor of abstraction, but you pay for it up front with a lot of effort. I think I should have focused on making something that works well instead of the "one true system".
A. Levy
"Refactoring to make something abstract is much easier than predicting how I need to abstract something." <--
Alex Baranosky
That's put very nicely.
John G
I've never really understood the importance of abstract classes. Sure, there's a 1/100 case where it makes sense to template the sub-classes but IMHO, abstract base classes are wayyyy overused. It has to be a backlash of Univ style teaching that assumes theory==practice.
Evan Plaice
+2  A: 

That all OOP languages have the same concept of object orientation.

  • A Java interface != a method's interface.
  • A Java interface is a language-specific solution for the need to have multiple inheritance. Ruby's mixins attempt to solve the same problem.
  • Inheritance provided out of the box in Javascript is very different from how Java implements inheritance.
Alan
+30  A: 

That my programming would be faster and better if I performed it alone.

bzlm
But it can't get as ugly as Pair-Programming :-) except for maybe your code
Egg
That all depends on the other person. =)
JohnFx
+132  A: 

I thought that static typing was sitting very still at your keyboard.

edr
Sincere or not, this made me laugh hard at the end of a long day of work. :P
MrZombie
++ for a good laugh. sounds like something my (non-technical) husband would come up with.
jess
+1! I thought duck typing involved typing too. Or ducks. Or both.
SqlACID
+1 for a laugh!
LnDCobra
I thought duck punching...
Evan Plaice
+34  A: 

I thought mainstream design patterns were awesome, when they were introduced in a CS class. I had programmed about 8 years as a hobby before that, and I really didn't have a solid understanding of how to create good abstractions.

Design patterns felt like magic; you could do really neat stuff. Later I discovered functional programming (via Mozart/Oz, OCaml, later Scala, Haskell, and Clojure), and then I understood that many of the patterns were just boilerplate, or additional complexity, because the language wasn't expressive enough.

Of course there are almost always some kinds of patterns, but they are at a higher level in expressive languages. Now I've been doing some professional coding in Java, and I really feel the pain when I have to use a convention such as the visitor or command pattern, instead of pattern matching and higher-order functions.

egaga
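A small C# illustration of that point (all names invented): with first-class functions, the Command pattern's interface-plus-classes ceremony collapses into passing behaviour as a value.

using System;

class CommandDemo
{
    // The "invoker" takes the behaviour directly; no ICommand interface,
    // no SaveCommand/PrintCommand classes, no boilerplate hierarchy.
    static void Run(Action command) { command(); }

    static void Main()
    {
        Action save  = () => Console.WriteLine("saving...");
        Action print = () => Console.WriteLine("printing...");

        Run(save);
        Run(print);
    }
}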
"many of the patterns were just boilerplate, or additional complexity, because the language wasn't expressive enough." Expressiveness is simply boilerplate code hardwired into the language.
Unknown
Not true. How is it boilerplate to have first-class stuff instead of limiting the capabilities of a programmer, as in the case of higher-order functions? Lisps are a beautiful example of this.
egaga
+10  A: 

That I should always optimize my code. That's not to say I shouldn't think it through before I write it, but that I should think hard about how to squeeze every bit of performance out of each statement, even to the point of sacrificing readability.

Jimmy
+201  A: 

That I should have only one exit point from a function/method.

Dug
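A minimal C# sketch of the guard-clause style discussed in the comments below: exit early for the degenerate cases instead of funnelling everything to a single return.

using System;

class GuardClauseDemo
{
    static string Describe(int[] values)
    {
        // Early exits keep the interesting logic unindented,
        // at the cost of multiple exit points.
        if (values == null)     return "no data";
        if (values.Length == 0) return "empty";

        return "first = " + values[0];
    }

    static void Main()
    {
        Console.WriteLine(Describe(null));
        Console.WriteLine(Describe(new int[0]));
        Console.WriteLine(Describe(new[] { 5, 6 }));
    }
}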
Excellent realization; exit as often as necessary. One should bail out of a function as soon as it makes no sense to continue further into it. Doing this can reduce complexity and increase readability by, for example, avoiding deeply nested conditionals, when they are preconditions required for the method to run properly. In modern languages with memory management and resource constructs like using/finally, continuing all the way to the end of a method dogmatically makes no sense.
Triynko
Who came up with this, by the way? It's like a programming urban legend.
brad
People who have to debug other people's code are who came up with this.
gatorfax
If your code is so deeply nested that you can't try to make one exit point, then refactor the code. Move stuff into methods with names that describe what's happening, etc.... Sure you have to follow each method to understand exactly what each method is doing, but at least you can get the big picture just by looking at which methods are called, under which conditions. Then you can't put in many exit points can you? So by all your exit points you make someone have to totally reorganize your code, so they can refactor tons of nested conditionals into separate methods.
Richard Hein
My Data structures teacher used to teach that. I always thought it was confusing and unnecessary, however, his exams were based on that assumption.
Edison Gustavo Muenz
I think this commonly-held but wrong idea is based on a misunderstanding. When you exit a function, you should always *return* to the same point. That was an important rule in languages like BASIC that didn't enforce it: The rule meant, for instance, that you should use GOSUB instead of GOTO. In languages like C# or Java that call methods, it's automatic. But because it's automatic, I think it morphed from the logical "only one return-to point" to the nonsensical "only one exit point".
Kyralessa
From languages like C where you need to manually release resources. Multiple exit points were a good chance for leaking resources. IMO there's no point to it in languages with exceptions, as you often don't know your exit points anymore, or they are in the middle of a statement. -- In these languages, all that remains is "structure for readability".
peterchen
I think this *rule* is made up by flow-chart oriented people (like some stupid instructors in my college :-P)
Mehrdad Afshari
-1. A single exit point is the only sane way to enforce post-conditions.
Adrian McCarthy
@Adrian McCarthy - only if you're using C. Most other languages have something like try/finally, some languages/platforms have built-in support for postconditions.
Daniel Earwicker
I have never had a problem with multiple exits as long as they are used responsibly. Use them either at the top of the function to return early on unexpected params (usually null) or in an if/switch/try-catch statement in the logic. Just don't hide them everywhere.
PeteT
'return', 'continue', and 'exit' are my favourite keywords.
Evan Plaice
People often confuse *bad programming* with *disobeying a rule*. So they try to enforce rules like "one exit point" globally because they think it will eliminate bad programming. OEP makes sense when you need your method to do cleanup. In the other 95% of cases it leads to hideous nesting and *less readable* code. It's all about using the right tool for the job. For some functions in some languages OEP is a very good idea.
Jason Williams
@Adrian: If you've got something complex enough that you've got real problems with an early exit, you might be best to divide the function into two pieces, an outer which handles resources and pre/post-conditions, and an inner which does the real work and which can use early exit as necessary. Most of the time you don't need that sort of complexity though.
Donal Fellows
@Donal Fellows: I'd say just the opposite. If you find yourself needing multiple exit points, then you probably haven't decomposed the problem properly.
Adrian McCarthy
@Adrian: This sounds like one of these arguments where different sides think that the other is utterly wrong.
Donal Fellows
+2  A: 

If you can't read my code, you just don't know the language. I had a few code reviews where I tried to counter with that.

Took a couple more years to learn there's a time and place to be magical with your code and it is in the libraries, not the application. The app is for clarity and readability. Magic is best used when hidden behind extension methods and frameworks.

Ball
In fact, you should never be magical. It's simple to write code to do what you want. I can imagine 6-7 ways of doing the same thing. Only a couple of those are easy for others, or yourself in 6 months, to read. That's the real challenge. That's the real goal of programming - to make it easy for other humans to read. Even in a library, other people will need to extend or modify it. Always keep it readable.
Kieveli
+5  A: 
  • Programming Language == Compiler/Interpreter
  • Programming Language == IDE
  • Programming Language == Standard Library
Tahir Akhtar
+54  A: 
  • That the company executives care about the quality of the code.
  • That fewer lines is better.
Trampas Kirk
they DO care, but you have to combine artist-skills with worker-skills. Every piece of algorithm can't be a piece of art too. Some of it will be plumbing, so reuse the "less used". Remember the old 80/20 rule: 80% of the program is used 20% of the time. So focus 80% on 20% of the code and make that a REAL PIECE OF ART! :OP
BerggreenDK
fewer lines are better! Part of the reason I dislike Java as a language is that doing anything takes up so many lines of code. Fewer lines of code means it is easier to change your code.
Claudiu
It depends on what you're removing to get fewer lines. If the code is still readable with fewer lines then it's good. However, there are plenty of ways to reduce the number of lines of code that make the code worse.
Herms
Except when people take the "fewer lines is better" mentality too far, with chained method calls 7 deep, so that when one of them throws a null pointer, you have no idea which it was. Or they condense so many actions into one line that it's 150 characters long and performs 4 operations. This makes it both harder to read and harder to debug, but is not any faster, nor does it use less memory during execution.
Trampas Kirk
The real killer to the fewer lines issue is the PHB. When the manager doesn't read 10k lines of your buddy's work and compares it to your 1k lines which he also didn't read, he is likely to assume that you work only 1 hour per day.
TokenMacGuy
I think we need to make the distinction between "fewer lines" and "less code".
scraimer
If your line ends in ))))) and you're not writing Lisp, you have too-few lines.
James M.
It SHOULDN'T (!) be like that - but how do you expect a company's "X (feel free to add any type) managers" to understand/care about the quality of the code, if the only thing (at least from my own experience) they understand (sadly) is this: more $$$, more $$$, etc.
+8  A: 

In C++, for a long time I was thinking that the compiler rejects you when you give a definition for a pure virtual method.

I was astonished when I realized that I was mistaken.

Many times when I tell someone else to give a default implementation of their pure virtual destructor for their abstract class, they look back at me with BIG eyes. And I know from there that a long discussion will follow ... It seems a common belief somewhat spread among C++ beginners (as I consider myself too .. I am still learning currently!)

wikipedia link to c++'s pure virtual methods

yves Baumes
Holy crap! I am gonna quiz all my friends with C++ experience, see if any of them know this, 'cause I sure didn't.
KeyserSoze
Most of the time it doesn't make sense - if you're forced to override a method anyway, why waste time on an implementation? Destructors are a special case, because they're always called even when they're overridden.
Mark Ransom
Heh :). I've spent *way* too much time debugging problems that resulted from having forgotten to add a virtual destructor to a base class.
Reuben
Mark: It allows you to provide a "default" implementation while still requiring the author of the derived class to think about whether they should use the default implementation. Rarely useful really. But it is there if that's the style you want.
jmucchiello
+4  A: 

I could spend days trying to reduce the amount of memory my business layer used, only to realize later that the WinForms (GUI) part of my project used 4 times more memory than the rest of the application.

Burnsys
+12  A: 

That this:

SomeClass object(initialValue);

and this:

SomeClass object = initialValue;

were guaranteed to be equivalent in C++. I thought the second form was guaranteed to be interpreted as if it had been written as the first form. Not so: see C++ Initialization Syntax.

Kristopher Johnson
Only a few days ago I made this mistake in an answer I gave on SO.
Mark Ransom
+12  A: 

I thought all I needed to do to improve database performance was put the database in 3rd normal form.

Oorang
+14  A: 

That if conditions were evaluated every line, and if you wrote code like this:

Dim a as Boolean = True
If a Then
    Console.WriteLine("1")
    a = False
    Console.WriteLine("2")
Else
    Console.WriteLine("3")
End If

Then the output would be:

1
3
MiffTheFox
This is one misconception I never had/heard of.
Brad Gilbert
Some of my friends used to play this robot-programming game where this was actually the case in the half-assed language you programmed your 'bot in.
Zarkonnen
This is awesome. This is the only answer that I'm upvoting, except for that other answer that I upvoted. Something like this does happen when you iterate through an Array and try to remove elements from the array...
Yar
Holy crap, wouldn't that be a show stopper?
NTDLS
Wouldn't you find out that wasn't true the first time you stepped through it with the debugger?
John MacIntyre
It's roughly how conditional instructions on ARM work. They all have an `"if (a) then ..."` pattern (or `NOT a`), where `a` is one of the CPU flags. Since conditional jumps aren't very fast, it makes sense to have multiple conditional instructions with the same condition in a row. But if you did change that condition flag halfway, subsequent instructions will use the new flag value.
MSalters
+109  A: 

For the longest time I thought that Bad Programming was something that happened on the fringe... that Doing Things Correctly was the norm. I'm not so naive these days.

Boo
I used to think Bad Programming was only done by other programmers, until I was done in by one of my Bad Programs. Now I Do Things Correctly! (You believe me this time, right?)
Jared Updike
OMG, I would upvote this again and again, till my clickfinger gets as tired as my code-reviewing eyes...
AviD
Totally. I've gone from "That never happens" to "That never happens except at *this* job" to "Every place has bad code."
Kyralessa
I kinda still hold that belief. I guess I haven't seen much of how code is written in the real world (typical companies).
hasen j
Hacking is the norm. Engineering is the purview of the truly competent. If I ever meet a software engineer I'll let you know.
corlettk
@corlettk: You mean monkey-coding is the norm, no? Hacking is an art, a high-level of art mind you, that I'm far far away from achieving.
hasen j
@Hasen: No, hacking is an analogy to unskillfully taking an axe to a tree, chiseling off tiny pieces in a mad panic with no real plan, and creating a bloody great mess until the tree finally falls on your head. A "hack" is "one who produces banal and mediocre work in the hope of gaining commercial success". Why it was that the computer field changed "hack" to mean "skilled", I'll never know.
Software Monkey
+48  A: 

I would say that storing the year element of a date as 2 digits was an assumption that afflicted an entire generation of developers. The money that was blown on Y2K was pretty horrific.

MikeJ
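The underlying ambiguity is easy to demonstrate today; a short C# sketch (which century "39" lands in depends on the runtime's configurable pivot, Calendar.TwoDigitYearMax):

using System;
using System.Globalization;

class TwoDigitYearDemo
{
    static void Main()
    {
        var us = new CultureInfo("en-US");

        // A two-digit year forces the parser to guess the century --
        // the same guess a generation of systems baked in implicitly.
        DateTime d = DateTime.ParseExact("01/01/39", "MM/dd/yy", us);
        Console.WriteLine(d.Year);
        Console.WriteLine(us.Calendar.TwoDigitYearMax);
    }
}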
This is the only answer that I'll upvote, though it's a CW so it doesn't matter...
Yar
Good thing I am just in this for the babes. :-)
MikeJ
IIRC some systems back in the '60s and maybe '70s only used one digit because it used less memory. I have even seen paper forms where "196_" and "197_" were preprinted.
some
I still see forms with 200_ and presumably there are some now with 201_ printed.
Macha
The sad part is... Unix will have its second round at this in 2038
Evan Plaice
@Evan: If, that is, they're still using 32 bit machines in 2038.
Billy ONeal
@Billy Just because the machine architecture changes doesn't mean the data format will. Storing 2 digits of resolution in int format would make a byte (8bit) date format and, yet, it affected tons of 32bit hardware architecture machines in 2k. This is just one more example of why you don't let low level hardware guys specify data formats. They penny pinch bits with the knowledge that there will be a scheduled SNAFU in the distant future.
Evan Plaice
@Evan: Good point.
Billy ONeal
thanks for the edit.
MikeJ
+15  A: 

I used to believe that the majority of work on an application was actually programming. I'm sure this is true in some cases, but in my experience I spend more time researching, documenting, discussing, and analyzing than actually coding. (I work on software that operates a laser-based sensor, and determining how best to control the hardware is much more challenging than writing the code to do so.)

I also used to think that open environments where programmers can look over their shoulder and ask the guy (usually) next to them a question were the best environments for a team of programmers to hammer out a solution. It turns out that a dark lonely room is more productive, team or no team.

When I graduated, I assumed that programming professionally would be like programming in college, meaning that I would be given the inputs and expected outputs and asked to fill in the black box that does the conversion. In reality, I have to figure out the inputs, outputs and the black box.

I didn't use to think marketing and sales guys were the scourge of the human race. So naive.

Atilio Jobson
Someone else lost a vote just so I could vote this up.
Unkwntech
My personal favourite is this sort of conversation: BA: "The system requires these outputs. || Me: OK, we'll need these inputs. || BA: But the data-entry will cost millions! || Me: Yes, and where did you expect the system to get this data? || BA: Can't you make it up?"
corlettk
Your last point is missing lawyers and bean-counters to go with the whole scourge-of-the-human-race part
Evan Plaice
+2  A: 

That I grok programming. By studying the SICP book I saw that I knew nothing. At least now I am delving into programming more.

Nick D
+3  A: 

The assumption that I was supposed to make the program 100% complete and bug-free before reporting it as "completed". Sometimes the company wants to release the program while there are still many bugs, to get market share first.

動靜能量
+4  A: 

That after I finished CS school, I could start a job and apply the knowledge I learned in school to real-world applications. (I actually wish I hadn't wasted 4 years of my life learning operating systems and Prolog.)

luvPlsQL
What's sad is that (to me at least) operating systems, prolog, and similar subjects (esp. AI and 3d graphics) were fun. I probably would have chose a different career if I knew the "real world" was far more mundane.
Cybis
agreed. It seems like most of us get stuck doing web applications and fairly simple database work after studying some hard core C/C++ development.
luvPlsQL
On the other hand, the reverse is just as true: "That I can build real world applications (well) without understanding the basics such as operating systems and prolog" - I find this very common amongst the bad programmers I meet...
AviD
+1  A: 

The assumption that writing code really well and as bug-free as possible was the best thing I could do. It turns out that sometimes managers prefer people who try to become their favorite over people who do nice work.

動靜能量
The good programmer is both the favorite and codes really well! :D
Nicolas Dorier
Slashene: Great comment :-). But obviously, people who try to please their manager are not the ones who are the most serious about their work (are you? :-) )... And most of the time, doing better work (with fewer bugs) takes more time: something your manager will always disagree with (even when you know you HAVE to do it).
yves Baumes
What if your manager makes weird noises all the time and is really relaxed with all his friends who also work at the company, but is really strict and has the highest expectations of you? His friends don't have to answer to you since they know they cannot get fired, while your manager will call you on your cell phone whenever he has a question, expecting an immediate response, and will even yell at you because he thinks that paying you means he can.
動靜能量
+5  A: 

I had never met integer promotion before, and thought that 'z' would hold 255 in this code:

#include <stdlib.h>  /* for abs() */

unsigned char x = 1;
unsigned char y = 2;
/* x - y promotes both operands to int first, so the argument is -1, not 255 */
unsigned char z = abs(x - y);

The correct value of z is 1.
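On a typical implementation with 8-bit chars and 32-bit ints, the usual arithmetic conversions mean the call behaves as if it had been written:

unsigned char z = (unsigned char) abs((int)x - (int)y);  /* abs(-1) == 1 */

The subtraction never happens in unsigned char arithmetic at all (see the comment thread below for the exotic case where char and int have the same width).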

Andrey
Wow. That is evil
MikeJ
Depending on the implementation, z could be 65535. Or various other values.
Windows programmer
no, it could not. described behavior is correct according to standard. do you know a compiler that acts like you described?
Andrey
The standard allows a conforming implementation to define unsigned char (and plain char and signed char) as 16 bits, and to define int (and unsigned int) as 16 bits. It doesn't matter if I know a compiler that defines char as 16 bits. It doesn't matter if I used compilers that defined int as 16 bits. The standard allows it.
Windows programmer
sizeof(char) is not important in this case because due to promotion, the given expression that is passed as argument to abs becomes (int)x - (int)y, and abs(-1) would always be 1.
Andrey
Of course sizeof(char) is not important. sizeof(char) is always 1. Meanwhile, the standard allows a conforming implementation to define unsigned char as 16 bits and unsigned int as 16 bits. In that implementation, perfectly legally, x is 1, y is 2, for the subtraction x promotes to unsigned int with value 1, y promotes to unsigned int with value 2, the result of the subtraction is 65535, and abs(65535) is 65535. In that implementation the standard requires unsigned char to promote to unsigned int because plain (signed) int can't hold all the values that unsigned char can hold.
Windows programmer
I've just figured out that Andrey didn't know that CHAR_BIT is implementation defined. 16 is allowed.
Windows programmer
I got your point; but in what case would unsigned char be promoted to unsigned int? It might happen only if int is not capable of holding all the values of unsigned char. If that is the case, then abs((int)65535) would return 1, because 65535 would represent -1 as an int. If it is not the case (int is capable of holding unsigned char values), then promotion would be to int, not to unsigned int. z would still be 1.
Andrey
"it might happen only in case if int is not capable to hold values of unsigned char" -- Bingo, exactly as I wrote above. 16 bits from 16 bits. Thank you for finally understanding. ... "z would still be 1" -- oops, try again to read what you wrote a few lines earlier.
Windows programmer
OK, I see what Andrey almost said. We need to know which programming language is in use. If it's C then abs isn't overloaded and the argument will be demoted from unsigned int to int.
Windows programmer
ok, and if C++? wouldn't the same thing happen in C++? abs has overloads for int and long (it wouldn't make sense to have abs overloads for unsigned types)
Andrey
In my example with 16 bit char and 16 bit int, when C++ chooses an overload that takes a built in promotion from unsigned int, that built in promotion is going to be long not int. Plain int can lose some values of unsigned int but plain long cannot. (Again remember, this is in my example with 16 bit char and 16 bit int. In a different example where CHAR_BIT is 64 and int and long are all 64 bits, long would also lose some values of unsigned int.)
Windows programmer
+8  A: 

Some of the things that I still have trouble with are the following misconceptions - I still try and hold on to them even though I know better:

  • All stakeholders will make decisions about software design objectively. Those that aren't embroiled in writing the code make all sorts of decisions based entirely on emotion that don't always make sense to us developers.
  • Project budgets always make sense - I've seen companies that are quite happy to drop [just for example] $50,000 a month for years rather than pay $250,000 to have a project completed in 6 months. The government for one loses their annual budget if they don't spend it - so spend it they will, come hell or high water. It astounds me at how many project dollars are wasted on things like this.
  • You should always use the right tools for the right job - sometimes this decision is not in your hands. Sometimes it comes down from on high that "thou shalt use X technology" for this project, leaving you thinking "WTF! Who came up with that ridiculous idea?"... the guy paying your paycheque, that's who, now get it done.
  • Programming ideology comes first and foremost, everything else is secondary. In reality, deadlines and business objectives need to be met in order to get your paycheque. Sometimes you make the worst decisions because you just don't have time to do it the right way... just as sometimes that word is on the tip of your tongue but the minute it takes to recall it makes you choose a different and less ideal word. There isn't always time to do it right, sometimes there is only time to do it - however that may be. Hence oft' seen anti-patterns used by so called experienced developers who have to knock out a solution to a problem 10 minutes before the presentation deadline for the software being delivered to your best client tomorrow.
BenAlabaster
+2  A: 

Turns out that under Linux it doesn't matter whether you check if a memory allocation succeeded, as the kernel will lie to you: it will either actually allocate the memory at some time in the future, or abort your program altogether if it doesn't have the memory you need when you touch it.
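A minimal sketch of the effect in C, assuming a 64-bit Linux box; the exact behavior depends on the overcommit mode in /proc/sys/vm/overcommit_memory:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    size_t huge = (size_t)64 << 30;  /* 64 GiB */

    /* With overcommit, this can "succeed" even without 64 GiB available... */
    char *p = malloc(huge);
    if (p == NULL) {                 /* ...so this check may never fire */
        perror("malloc");
        return 1;
    }

    /* Pages are only really allocated when touched; if the memory isn't
       there, the OOM killer may abort the process somewhere in here. */
    memset(p, 0, huge);

    free(p);
    return 0;
}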

zaratustra
+2  A: 

That programming elegance combined with automation was an adequate substitute for good old-fashioned testing.

ep4169
+2  A: 

I used to think that I would never program like a top-tier developer, like the MS developers, but now I think I can write code that's just as clean, or even cleaner.

Fred Yang
Go and have a look at the example code in the WDK (Windows Driver Kit), most of it is part of the Windows build and to my eyes pretty horrible.
Tony Edgecombe
+6  A: 

OO is not necessarily better than non-OO.

I assumed that OO was always better... then I discovered other techniques, such as functional programming, and had the realization that OO is not always better.

sean riley
You assumed that "OO is not necessarily better than non-OO" and your assumption turned out to be false, i.e. OO _is_ necessarily better than non-OO? or you assumed that OO was necessarily better than non-OO and then you learnt that it is not necessarily better?
Daniel Daranas
Sorry, that was ambiguous. i assumed that OO was always better.. then i discovered other techniques, such as functional programming, and had the realization that OO is not always better.
sean riley
Thanks for the clarification - that's what I imagined, but I wanted to have your precise thoughts!
Daniel Daranas
+1  A: 

That threads in Windows are cheap.

Turns out this is only somewhat true. A thread has a certain amount of overhead and reserves its own stack (roughly a megabyte of address space apiece on Windows by default). So if I find myself dealing with dozens of threads within a single application, I ask myself how I can simplify and consolidate everything into fewer threads.
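One common consolidation: for lots of short-lived tasks, borrow threads from the OS pool instead of spawning one per task. A rough sketch in C using the Win32 QueueUserWorkItem API (do_work and the loop bound are just demo choices; real code would synchronize instead of sleeping):

#include <windows.h>
#include <stdio.h>

/* Runs on a worker thread borrowed from the process thread pool. */
static DWORD WINAPI do_work(LPVOID context)
{
    printf("processing item %d\n", (int)(INT_PTR)context);
    return 0;
}

int main(void)
{
    /* Instead of 100 CreateThread calls (each thread reserving its own
       stack), reuse a handful of pool threads. */
    for (int i = 0; i < 100; i++)
        QueueUserWorkItem(do_work, (PVOID)(INT_PTR)i, WT_EXECUTEDEFAULT);

    Sleep(1000);  /* crude wait so the demo's output appears */
    return 0;
}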

Steve Wortham
+133  A: 

That you can fully understand a problem before you start developing.

Jeffrey Hines
This, my friend, should be: "That you can fully understand a problem." But it is so true. And apparently a hard concept to understand or even accept.
KarlP
You cannot understand the problem "fully", but you certainly MUST understand the problem (to some degree) before you start developing. http://bit.ly/kLXgL
OscarRyz
Sometimes you have to start developing to understand the problem. And/or, the problem changes the more you develop.
Evan Plaice
+1  A: 

That everything I wrote would fail at some point in the foreseeable future.

Not that everything won't eventually fall apart, but early on in my programming education, when I found try..catch blocks...I wrapped EVERYTHING in them....things that, if they failed, would have represented much bigger problems than my programs would be handling (e.g., the north and south pole have switched places)

Gus
My favourite bug: Apparently, the first time an F111 flew over the equator in "terrain following mode" (500ft above the ocean at about mach 1) it turned itself over... that's the only way the software could make sense of "left" and "right" at a negative latitude. Oops!
corlettk
+7  A: 

As an old procedural programmer, I didn't really understand OO when I first started programming in Java for a hobby project. I wrote lots of code without really understanding the point of interfaces, and tried to maximize code re-use by forcing everything into an inheritance hierarchy - wishing Java had multiple inheritance when things wouldn't fit cleanly into one hierarchy. My code worked, but I wince at that early stuff now.

When I started reading about dynamic languages and trying to figure out a good one to learn, reading about Python's significant whitespace turned me off - I was convinced that I would hate that. But when I eventually learned Python, it became something I really like. We generally make the effort in whatever language to have consistent indent levels, but get nothing for it in return (other than the visual readability). In Python, I found that I wasn't doing any more effort than I had before with regard to indent levels, and Python handled what I'd been having to use braces or whatever for in other languages. It makes Python feel cleaner to me now.

Anon
+11  A: 

Back when I programmed on the TI-83, I thought you couldn't assign a variable to itself. So (ignoring that this is C code, not TI-BASIC) instead of writing

c = c + 1;

I would write

d = c + 1;
c = d;

When I learned about += and ++ it blew my mind.

Chris Lutz
and then you learned that some languages don't have those, and the clock gets set back.
Yar
I at least understood that if I wrote "c++" it was equivalent to "c = c + 1", so it wasn't set back far.
Chris Lutz
I grew up on Visual Basic at first (first program at 6-7), so I was raised with iterators.
Dmitri Farkov
I have dreamt of building a BASIC interpreter with this exact sort of brain-damage.
TokenMacGuy
+6  A: 

That gotos are harmful.

Now we use continue or break.
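The harmful gotos were the backward spaghetti jumps; a forward jump to a single cleanup label is still idiomatic C. A small sketch (process() and its body are hypothetical):

#include <stdio.h>
#include <stdlib.h>

/* One exit path means no duplicated cleanup code on each error branch. */
int process(const char *path)
{
    int ret = -1;
    char *buf = malloc(4096);
    FILE *f = fopen(path, "r");

    if (buf == NULL || f == NULL)
        goto cleanup;            /* forward jump only, single target */

    /* ... read from f into buf and do the real work ... */
    ret = 0;

cleanup:
    if (f != NULL)
        fclose(f);
    free(buf);                   /* free(NULL) is a no-op */
    return ret;
}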

hWorks
Well, it's not true any more because we've stopped using them!
mike g
It's all in how they're used. Compilers use nothing else.
Mike Dunlavey
That's like saying that writing programs in machine code is good because all computers use machine code. Goto's are harmful because they encourage programmers to create code that is difficult to read and debug.
Zack
@Zack, So GOTOs aren't harmful - programmers are harmful.
U62
@U62 GOTOs aren't harmful, programmers that use GOTOs are harmful.
Evan Plaice
+55  A: 

That programming software requires a strong foundation in higher math.

For years before I started coding I was always told that to be a good programmer you had to be good at advanced algebra, geometry, calculus, trig, etc.

Ten years later and I have only once had to do anything that an eighth grader couldn't.

ChiperSoft
Very true. In most cases you don't need to be a math expert. The only time I ever really needed to know any advanced math was when I was doing 3D programming as a hobby. In fact, it was actually the 3D programming during high school that inspired me to pay better attention in trig and pre-cal classes. Other than that though, very basic math is usually all you need.
Steve Wortham
I think you were misinformed. Sure, to be a good *programmer*, you don't really need to use much higher level math, but to truly understand and apply certain computer science concepts, you're going to need more than just eighth grade math.
htw
I would say being comfortable with binary logic (which isn't that much math) and how the CPU actually works (memory allocation, device communication, ALU and the interaction with whatever registers you have on your platform) is far more important to be a good programmer than a thorough understanding of advanced mathematics.
Martin P. Hellwig
I think the emphasis on math is to teach critical thinking skills and problem solving not as something that you would use in every day computer programming.
Zack
The kind of abstraction you need to understand advanced mathematics is very similar to the abstraction you need to create software.
OscarRyz
@Zack, @Oscar - I think thats the idea behind it, and thats what academics (my professors included) would like to believe. However, the abstractions for advanced mathematics are actually VERY different from abstractions you need to create actual software. In fact, it's rare to find people that do both well.
AviD
I think functional programming concepts are much easier to understand if you have a stronger foundation in mathematics, simply because you aren't frightened off by the syntax as much. It looks familiar. I made the mistake of using simple mathematical functions to demonstrate the functional programming concepts new to C#. Some people immediately declared that it was too complex.
Richard Hein
I would also have to disagree to an extent. It does not *REQUIRE* a good foundation in higher maths, but so many patterns and concepts are based on mathematical techniques that, when creating more complex software, a good mathematical understanding will be useful for writing your logic and algorithms.
Sheff
In good programming, you're right. In computer science, a strong foundation in mathematics is needed to really understand the depths of some of the topics.
fd
I think there's just some correlation, not causation. If you enjoy math, you migth be more likely to enjoy certain aspects of programming.
peterchen
Once I started working in relational theory and data management, I found that many of my iterative calculations could have been better handled by a better understanding of mathematical concepts and algorithms. Calculus II? Maybe not. But pre-calculus and trig, absolutely.
Chris Kaminski
In my programming career the only thing I regret is not having done MORE math. There's many advanced programming concepts that have me befuddled on a daily basis. (That said, it's mostly related to functional programming and type systems)
Rehno Lindeque
The only reason they gave us so many maths at the university was to make sure not all students would pass the first two years. Testing a student on his skills in mathematics is definitely more straightforward than testing on programming.
Dimitri C.
I completely and whole heartedly disagree with this answer.
Akusete
I've had to use algorithms based on `mod` functions all the time (usually in conditionals based on indexes), and I never learned that in eighth grade.
Lance Roberts
+18  A: 

That Unix and Linux OSs are well designed ... I should probably qualify this(!)

Firstly, the view is reinforced by such anti-truisms as:

  • every subsequent OS developed ends up redesigning Unix poorly (it's said about Lisp as well, where it is more true).
  • the list of rules that make the 'Unix philosophy'. It's not that they are wrong, it's the implication that Unix itself follows them closely.

It may be more true to say that they were well designed/well done, and surely parts of them are, but even this is just a relative judgment, relative to some awful versions of Windows. Here are some examples of things that are done badly:

  • configuration is a mess, ad-hoc flat file configs are not good
  • the C programming language should have been replaced (by something like D) a long time ago
  • shell scripting is schizophrenic. It is not good for development as it is shorthand designed for quick typing.
  • directory structures are badly named
  • the GNU tool chain is unnecessarily arcane
  • the belief that general purpose always trumps special purpose

Overall they require unnecessary expertise to operate. Or rather a lot of knowledge where there is only a moderate amount of understanding.

It's not all bad. Linux is politically better and not corrupted by business needs, but sadly to a large degree a lot of the technical highground has been lost.

mike g
It's better designed than Windows...
Zifre
I still think that the flat-file config is better, but ad-hoc is a disaster. Seems to me that the MacOS X plist mechanism makes a very good compromise
TokenMacGuy
I think the general point stands: Linux carries around a lot of cruft a bit like windows does. but your specific points don't really convince me, configuration works fine for most things (depends on how the configuration files are implemented), shell scripting is now done with python a lot, directory structure is down to taste, ... Bit weak really
wds
There are probably 'better' negative points - I haven't used it regularly for a while - but do you think that maybe you have low standards on this? Conf works, but not well; really there should be a standard (with type information) which would make things like context-sensitive help and decent GUI tools possible (that could handle all versions of conf files, for instance). IMHO your POV lacks vision on this.
mike g
It isn't just shell scripting. There is a complete lack of separation between the human interface (the interactive shell) and the engineering interface (the programs). The shell has perverted the API for programs, and torpedoed one of the goals of Unix (everything as small programs): output as human-readable text, and input as save-on-typing single-char switches. Take rm (a toy example): "rm -f" is OK for an expert's command line, but in scripts it should be something like "remove force=true". The human layer should be a separate layer.
mike g
I won't rant much on dir naming, but the idea that it's completely subjective is wrong, I think.
mike g
Some of us thought that Unix redesigned Multics poorly. Unix was intentionally designed to avoid being everything Multics was, and then the rest was hacked in instead of being designed in.
Windows programmer
I'd argue that *nix in general are more secure and some brands (in my case Linux Mint) are more stable. Toss in Norton on Windows and my *nix will burn windows any day in performance. Better designed? not necessarily. Better in general? Yep. SideNote: I have rarely if ever touched a config file in Mint. Just about everything can be done with the GUI.
Evan Plaice
Linux not corrupted by business needs?! So many superior ideas have been rejected for inferior ones. Linux is today mostly about politics: my ex-colleague worked his ass off to get his patch accepted into the linux kernel (a slight enhancement of the TCP protocol). He could tell many "interesting" stories about people trying to block/sabotage patch acceptance on very dubious and sometimes incorrect technical grounds and assumptions.
zvrba
+42  A: 

That XML would be a truly interoperable and human readable data format.

Alex. S.
XML isn't a panacea but I wouldn't like to go back to the days where I regularly saw applications trying to squeeze relational data into single csv files.
Tony Edgecombe
its an inter-operable syntax, no doubt about that. Its just that syntax is often the least important aspect of any solution.
Simon Gibbs
+1, you could add small and fast to the wishlist too.
MarkJ
True but an improvement over csv and fixed length where without the documentation you are screwed.
PeteT
I love XML for the standardization it brought to data formats and for correctly handling character encodings. I hate what is sometimes done *using* XML, however.
Joachim Sauer
+4  A: 

Bitwise comparisons on integers in SQL WHERE clauses are practically free in terms of query performance.

As it happens, this is somewhat true for the first half-million rows or so. After that it turns out to be extremely UN-free.

JohnFx
UN-free == expensive? Is this a hidden political statement about the United Nations? Awesomes.
Kieveli
Please, which RDBMS does this apply to? I've never had a problemo in Access, Sequal, Ingres, Postgres, Informix, or MySql... though I've only (knowingly) dealt with multimillion row tables in Ingres and Informix.
corlettk
In my case SQL Server, but I think it would apply to any RDBMS. The trouble is that bitwise operations are not sargable and won't use indexes efficiently. The operation, however, is so fast that even with a table scan I didn't notice it until it got really large.
JohnFx
The DB goes fast for the first half million and then slows down?
Qwertie
Just saying the performance profile resembles O(N)
JohnFx
+453  A: 

For a long time I assumed that everyone else had this super-mastery of all programming concepts (design patterns, the latest new language, computational complexity, lambda expressions, you name it).

Reading blogs, Stack Overflow and programming books always seemed to make me feel that I was behind the curve on the things that all programmers must just know intuitively.

I've realized over time that I'm effectively comparing my knowledge to the collective knowledge of many people, not a single individual and that is a pretty high bar for anyone. Most programmers in the real world have a cache of knowledge that is required to do their jobs and have more than a few areas that they are either weak or completely ignorant of.

JohnFx
So true! That's the problem of this age. Information is also discouraging. I had this revelation a few weeks ago when I felt like a complete loser in everything I did (not the first time) regarding research. Guys who get their papers published in IEEE Transactions do not necessarily have the same skills as guys who work at Google, boast on StackOverflow, are excellent professors, or write great programming blogs. Of course, the best guys are exponentially cooler than we are, but they don't know everything you know that you don't know. So, stay cool.
zilupe
It also helps to understand that those bloggers aren't writing everything off the top of their heads either. Good bloggers research their topics and learn new things while writing posts.
JohnFx
I obsess daily about the stuff I don't have time to read about and learn. It leaves me with a horrendous feeling of guilt sometimes.
brad
I know how you feel. I try really hard to keep up with this stuff, but I do have a day job, after all!
JohnFx
The url is self explanatory: http://www.liveintentionally.com/Too_Many_Choices.htm
corlettk
Totally! Thanks for putting it like that, I felt so alienated when I started reading this site thinking "shit, I don't know anything!".
atc
I do exactly the same and had never realized it... i felt for a long time too pesimistic and modest about my skills based on the phenomenon described above...
Konstantinos
@Zilupe: Amen to that. I've published a few international conference papers and journals. In the eyes of some people, that sounded cool. Until you realized that it doesn't really take much effort to publish a paper. We're no genius. We're just like everyone else. We made mistakes, and we publish crap papers. Well, except for some minority group of real geniuses...
Hao Wooi Lim
Absolutely agree!! Most developers working on commercial projects are expected to deliver against tough deadlines. I wonder how many follow OO practices, good coding guidelines, test driven development etc. I have seen many .NET projects that don't follow a proper layered architecture mainly due to time and resource constraints.
Shaw
Oh well, I guess I was a little off-topic there. What I meant to say is that not all of us are genius developers, so when you see someone at SO espousing the importance of TDD and OO - all perfectly given with good intentions - don't feel bad that you aren't good enough because you aren't doing them. We all work under the circumstances we find ourselves in, and sometimes it's not possible to follow good practices.
Shaw
"[...] I'm effectively comparing my knowledge to the collective knowledge of many people". <<<< Exactly!!
hasen j
Maybe it is that tinge of guilt/regret that pushes us to get better at our craft. I suppose it is ultimately a good thing.
JohnFx
+1 Good thing I read this. I thought I was the only one.
Randell
This saved my brain from frying. Lately i've been thinking more about what i dont know and how it seems that everyone knows everything so perfectly than actually learning something! Glad to see i'm not the only one!
LnDCobra
Man I couldn't have read this at a better time..
Jeriko
I hear you brother.........
Night Shade
I agree with JohnFx's posting. However, FWIW @JohnFx... I DO know more than you ;-)
user279521
@user279521 - Oh yeah? How many fingers am I holding up?
JohnFx
@JohnFx 2months later..... +1 (damn you are slow dude)....
user279521
+1  A: 

That learning a whole new language would be really really hard.

too much php
it is learning the standard library that is hard.
GameFreak
A: 

That salesmen manage customer expectations realistically. (Trained in under-promising and over-delivering)

That software requirements generally come from market research.

Maltrap
+3  A: 

For a long time (about 5 years) I thought that PHP rocks.

I thought that I knew algorithms. And then I joined Topcoder.com

zilupe
Yea, I've been there. It's funny how obvious that mistake becomes once you learn about that little thing called namespaces.
Evan Plaice
+20  A: 

I thought I would need it.

MrValdez
Joke explanation: The line is the opposite of YAGNI (You ain't gonna need it). In essence, I thought I would need a class/module/functionality/etc. before I could complete my program.
MrValdez
I GET IT .
Ólafur Waage
I've long thought there should be an opposing principle: BWIIDNI?
Daniel Earwicker
+1  A: 

That run-time performance mattered. Total solution time is what matters, often.

Since learning python, I have weaned myself from my attachment to static typing.

mataap
I have tried Python before, but, believe it or not, I write more bugs in Python than C++ (and I don't have a whole lot of C++ experience). Static typing is just so much more productive.
Zifre
@Zifre: there is some truth there, but it also matters how quick you can fix them and how fast you can write the whole program. I had my share of bugs caused by dynamic typing, but since they were easy to fix they didn't bother me too much.
Cristian Ciupitu
+5  A: 

I just recently found out that over a million instructions are executed in a Hello World! C++ program I wrote. I never would have expected so much for anything as simple as a single cout statement.

rzrgenesys187
wow ... where did you find that?
hasen j
Wow, oh ouzer! Where _did_ you find that out? Got links?
corlettk
I was doing some research for a project and using pin (http://www.pintool.org/). I typed out a little Hello World program to test with their pre-made instruction counting tools and was amazed at the output.
rzrgenesys187
Hmm... I just compiled a basic hello world program c++ and it was 5974 lines of assembly.
GameFreak
+2  A: 

Since college days, I had thought myself a master of programming, since I could code and others couldn't. But when I joined a company, I was struck by my ignorance of the basics. All my assumptions about myself turned out to be wrong! Now I know what I need to know and what I do not know!

Manohar
+3  A: 

When at college (mid 90's) they only had Windows 3.11 machines in the computer lab (I know, weird college).

For a while I thought that only the Windows platform was relevant to me as a professional programmer and that all other platforms were only interesting from an historical academic point of view.

After graduating from school and learning about modern unixes and linux environments I couldn't help feeling angry and disappointed about my lame school.

I still cannot believe I graduated with a computer engineering degree without ever seeing a bash shell or even hearing about emacs or vim.

Sergio Acosta
That's... impressive, is about the only word I can think of.
mavnn
I got lucky... We had a xenix (an early Microsoft unix port to Intel) system at my TAFE college. I got to play, and one of my mates was hired back as the sys-admin... and we figured it out together. When I started work on Solaris I was streets ahead of my compatriots. Yep, a Windows only university environment is totally suckful.
corlettk
Who uses Unix anyway? At least that's what I thought when I was FORCED to learn ONLY Unix in uni, basically treating non-Unix environments as either toys for home (Windows) or nonexistent legacy (mainframes etc.).
AviD
+2  A: 

That it was so important to write efficient programs, without wasting a byte or a CPU cycle.

But with more experience, it's not about bytes or CPU cycles; it's about your flow of thought - continuous, uninterrupted, much like a poem.

Essentially, don't try too hard.

Lakshman Prasad
> Don't sweat the little stuff!~~ anon.
corlettk
I do the same thing! "Well, just possibly, why can't I optimize/combine that SQL query..." (for a quiz that gets 5 page views a month :) ).
CodeJoust
+1  A: 

I did not know that something divided by 0 in JavaScript is Infinity (IEEE 754 arithmetic). I learnt it the hard way recently.
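The same IEEE 754 rules show up in any language that uses doubles; a quick sketch in C (floating-point division only - integer division by zero is undefined behavior):

#include <stdio.h>

int main(void)
{
    double pos = 1.0 / 0.0;    /* +infinity under IEEE 754 */
    double neg = -1.0 / 0.0;   /* -infinity */
    double indet = 0.0 / 0.0;  /* NaN: 0/0 is the indeterminate case */

    printf("%f %f %f\n", pos, neg, indet);  /* typically prints: inf -inf nan */
    return 0;
}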

Chetan Sastry
Something divided by zero in anything is infinity.
Sam152
Nah, in most programming languages it's an error. To most mathematicians, it's undefined (which is definitely _not_ the same as it being infinity).
mavnn
Yeah, there was a time when I thought NaN was an elderly female relative...
Dan Diplo
That's weird, shouldn't the result be NaN?
Qwertie
+14  A: 

That code reviews are a waste of time.

Having moved from a company where they were entirely optional to one where they are mandatory (even audited) I've come to understand their usefulness. Having a second set of eyes on code, even on the most trivial pieces, can:

  • save you embarrassment when you screw up something trivial (a trivial code review, for instance, would have prevented us from spamming hundreds of emails to our customers at my previous job)
  • teach you things that you didn't know in the first place (I'm ever learning new libraries at my current job - inevitably at a big company, someone has already stumbled upon the problem you have and done a better job solving it - it's just a matter of knowing where to look)
  • at the very least, ensure that someone other than yourself knows how things work.

In the end, I wind up happier with the code I submit here, than in my previous employment, even though back then I thought I knew everything :)

James
My first introduction to code reviews was in an organization that didn't actually believe in them, but wanted to say they did them. When I had my first experience of a real, honest code review, it was a bit of a shock.
Mark Bessey
+135  A: 

Smart People are Always Smarter than Me.

I can really beat myself up when I make mistakes and often get told off for self-deprecating. I used to look up in awe at a lot of developers and often assumed that since they knew more than me about X, they were smarter than me overall.

As I have continued to gain experience and meet more people, I have started to realise that oftentimes, while they know more than me in a particular subject, they are not necessarily smarter than me/you.

Moral of the story: Never underestimate what you can bring to the table.

Rob Cooper
Good one! I am currently working with a colleague who really knows A LOT about .NET development. Took me some time to realise that I am better at understanding what the customers needs.
Treb
And on the other hand, that I know more than other people. It turns out that they just know different stuff. The other moral: Never underestimate what someone else can bring to the table.
thursdaysgeek
Here's that old "Do unto others" thing again... I'm coining a new phrase: tech bullying ~ the state of feeling superior because you know some stuff, and making the mistake of letting everyone else know it. @seealso: smartass.
corlettk
Excellent observation - my version is more negative "Everyone does stupid now and then". Somewhat related to "don't flip the bozo bit".
peterchen
You only have to worry when stupid people are smarter than you.
Brad Gilbert
We confide in our strength,without boasting of it;we respect that of others,without fearing it--Thomas Jefferson
krishna
A: 

He said he knew programming, it must be true!

Alfred
+91  A: 

That women find computer programmers sexy...

Bill Mueller
Wait a second???
çağdaş
he he he he.. okey, I was looking for something to keep my smile for the rest of the day. I think I've found it!!! :)
OscarRyz
Only women who put value in being able to feed one's family.
Kyralessa
I just didn't know that they find us un-sexy. I thought we were just like everybody else.
hasen j
Programmer women find programmers sexy... oh yes.
jess
On average, perhaps not. The geek stigma is vanishing but it's far from gone completely. Even so, there are still many women out there that like geeky/engineer/programmer types. They're just harder to find and may not make their attraction obvious.
rashfeather
The dude in "Hackers" got Angelina Jolie. That's not a bad haul.
JohnFx
My girlfriend isn't really technical, and she finds my programming sexy :)
Carson Myers
"Ooh, baby! Yeah, say 'if' - throw me some exceptions.. Yeah, you know how I want it" :P
cwap
But do men find women programmers sexy?
rlb.usa
Well DUH... Gamer babes are good too
Matthew Whited
@rlb.usa, at times, yes
Randell
What? Programmers are rich? When did this happen?
Filip Navara
Just run away when a girl finds you sexy because you are a programmer. She may be a freak.
JCasso
And a geek [one].
mlvljr
This definitely rings true. Women find you un-sexy, not the rest of us :P
Dan Beam
Some women (the right kind) are attracted to insightful intelligent guys. Which, minus the prototypical neck-beard and sausage-gut, are pretty common traits of programmers. Sprinkle on a little concern for self-image/hygiene and the occasional thrill/excitement of extreme sports and you're well on your way.
Evan Plaice
Note: Neckbeards, and socks-with-sandals should make any sensible person cringe.
Evan Plaice
Women find men with steady paychecks sexy. A significant subset of this group is 'programmers'
mickeyf
True.. :D I love a guy who can program.
Dian
+8  A: 

My incorrect assumption: That while there's always some room for improvement, in my case, I am pretty much as good a programmer as I can be.

When I first got out of college, I'd already been programming C for 6 years, knew all about "structured programming", thought "OO" was just a fad, and thought "man, I am good!!"

10 years later, I was thinking "OK, back then I was nowhere near as good as I thought I was... now I get the ideas of polymorphism and how to write clean OO programs... now I'm really good".

So somehow, I was always really good, yet also always getting way better than I was earlier.

The penny dropped not long after that and I finally have "some" humility. There's always more to learn (have yet to write a proper program in a purely functional language like Haskell).

Paul Hollingsworth
I second the motion. Nobody is anywhere near half as good as they think they are, but that doesn't seem to prevent the smart ones from learning. The dumb ones persist with their delusions of adequacy despite all the evidence to the contrary, and refuse to learn or be taught.
corlettk
+13  A: 

That the design of the NT operating system is flawed compared to UNIX. It turned out that the NT kernel and its design decisions are very similar to any modern UNIX-like system, and that most of the problems you get in the kernel result from buggy third-party drivers written by buggy companies.

Martin P. Hellwig
I protest. One fundamental thing delineates Windows from Unix: memory management. Windows detects an attempt to break in; Unix detects an attempt to break out... so Windows programs can and do use unallocated memory. Yeck!
corlettk
@corlettk - do you have any references for what you mean by that?
Daniel Earwicker
It's wrong anyway. The relevant windows mechanism is page tables. He's suggesting that Windows VirtualAlloc()s everything, and you only need VirtualProtect to ask permission. The whole need for VirtualAlloc() pretty much proves him wrong.
MSalters
@Martin P. Hellwig Windows blocks RAW packets for 'security reasons'. If security is an excuse to block something, shouldn't they block the whole internet? Nuff said...
Evan Plaice
+2  A: 

That somehow a company that runs a large number of fairly high profile/high traffic websites actually knew what the heck they were doing. It ended up they were for the most part clueless and extremely lucky to be in the position that they were in. So I guess the moral would be,

solid software engineering && best practices != business success

or....

most critical software systems == crap

Tom Willis
The instance doesn't always represent the whole. I guess the company concerned must be very, very lucky to be in their current position... that or they're actually a front for a US bank.
corlettk
how about dead tree news? ;)
Tom Willis
A: 

That profiling and performance analysis were the same thing.

Then I found out that profilers, while better than nothing, contain faulty assumptions, such as:

  • only aggregates matter, not details
  • statistical precision is necessary in locating performance problems
  • measuring time, and locating unnecessary time-consuming operations, are the same thing
Mike Dunlavey
A profiler is a generic solution which was only ever intended to put you in "the ball park". Don't bother optimising code which the profile doesn't prove is a performance bottle-neck. I agree that this can be misleading. Once upon a time I found myself optimising an equals method, which was called literally trillions of times... until I said to myself "Hang on, millions yes, trillions no. Why is equals called trillions of times?" The moral of the story is that a profiler isn't a replacement for an IQ. Cheers. Keith.
corlettk
@corlettk: What I do now is wait until the program is being slow, and then take several samples of the call stack, using the "pause" button. Then I look for call sites that appear on multiple samples. Any such call site is a spot that, if I can optimize it, will speed up my program substantially. This flies in the face of all accepted wisdom about profiling.
Mike Dunlavey
+20  A: 

That dynamically typed languages like Python or Ruby are somehow less qualified for use on large projects.

FogleBird
I had this same awakening circa 2000. I read some stuff on the original wiki at www.c2.com and ended up starting this page: http://www.c2.com/cgi/wiki?UnificationOfStaticTypesAndUnitTests and was on the verge of concluding that I was irrationally attached to static typing. But I've since begun using an environment (C#) in which static typing really brings the IDE to life during editing, and I'm now pretty convinced that statically typed languages are better because they are easier to work with. There is no dynamically typed language that would not be improved by some static type info! :)
Daniel Earwicker
Statically typed can make the IDE infinitely better while dynamically typed can make the code infinitely shorter. It takes black magic to bend a statically typed language to break the boundaries whereas dynamically typed languages oft times don't have enough or clearly defined boundaries. Choose your poison.
Evan Plaice
It is possible to have nice short code in a statically typed language; see Haskell and boo.
Qwertie
+20  A: 

"The project will be done in 2 weeks"

and

"That will take 2 hour to implement"

Marc
Now I always take that time x2 or x3. If I delivered "on time", then I'll get praised on how fast that was
Eric
Yeah, and then you spend 3 hours just fighting a stupid bug, and they think you're not doing anything. Tell me about it.
hasen j
@Eric: Yes, I've been doing this for the past while and it's working out great. I even get to take time off (I'm self-employed, not a work truant!).
DisgruntledGoat
+1 because "It will be done in two weeks!" has become such a running joke with me that I have to mentally bitch slap myself every time I earnestly give an estimate that is, yet again, two weeks.
BlairHippo
+4  A: 

I used to think I was a pretty good programmer. Held that position for 2 years.

When you work in a vacuum, it's easy to fill the room :-D

Zachary Spencer
+4  A: 

That the now popular $ sign was illegal as part of a java/javascript identifier.

Hannes de Jager
Look at Perl and PHP, then you really wish it was illegal ;)
Frank
+2  A: 

This is embarrassing, but for the longest time I had believed it was more memory efficient to nest my method calls, or make multiple method calls, than to create a variable to store the value for each method call in C#.

Alexander Kahoun
You mean, more efficient not to store intermediate results in temporary variables? In .NET temporary variables do have a tiny bit of overhead compared to intermediate values, but the compiler will often create temporaries without you asking for them anyway, which you'll often see if you disassemble to CIL. You generally don't have to create "an object" to store the result of a method; I assume you mean "variable".
Qwertie
@Qwertie: Thanks. I updated the answer to read more clearly.
Alexander Kahoun
+2  A: 

Not longest-held, but at some point and for several years I:

  • Thought Microsoft Windows was the only operating system in the world (it was 1992)
  • Knowing DOS was more than enough to have "advanced" OS knowledge.

That's why I didn't choose "computer course" in high school. I thought that I knew already enough about computers.

Later at university and out of my mistake:

  • I thought that UNIX OS/programs were perfect and DOS/Windows would never come close to them (back then it looked so true; I guess Linus et al. thought the same, and that's why Linux is sooo similar to UNIX and not... well, other OSes)

Finally and for a long time, I thought that:

  • Only my software sucks and commercial software was flawless, because... it was "COMMERCIAL" software
  • USA software/engineers/products were synonyms for excellence, and anything from outside was just a poor attempt.
OscarRyz
Oh, now you hurt my feelings. (I am a Swedish developer. Just kidding!)
Andreas Rejbrand
A: 

That an identity column can contain duplicate values: http://stackoverflow.com/questions/889855/identity-column-in-sql-server/889863#889863

Justin Ethier
+4  A: 

Thinking that I know everything about a certain language / topic in programming. Just not possible.

Dmitri Farkov
+4  A: 

That ASCII was stored in a different way to binary

Ben Reeves
What?! It is... ASCII is a character code, binary is a way of writing numbers...
Zifre
I meant that I thought an image and a text file were stored differently on disk - that an image was binary and text was something else.
Ben Reeves
There is a nugget of truth in this. a few filesystems, especially network filesystems, handle bytes corresponding to newlines differently depending on whether they think the file is text or non-text. In particular, some made it very difficult to fix this when it happens to be wrong. Few new technologies do this because its a terrible idea.
TokenMacGuy
(Open)VMS for instance does it, so technically not entirely wrong. And the reason why C supports both file modes.
MSalters
+1  A: 

That because I built the software on my "standard" environment, it would work on everyone's machine/server. Only to discover that I had installed some obscure libraries and services that were actually being used. And then to discover that I had leveraged a bug that was subsequently patched.

Bluephlame
+1  A: 

You can't diagnose 'intermittent errors' in production. Rebooting the server is the only way to fix it.

Maybe it was MORE true in my early days of ASP coding. But there are a lot of good profiling tools to find memory leaks and other weird issues. Perfmon also provides lots of good diagnostic data. Plus, you should be coding diagnostic logging into your application.

russau
+5  A: 

That virtual-machine architectures like Java and .NET were essentially worthless for anything except toy projects because of performance issues.

(Well, to be fair, maybe that WAS true at some point.)

JCCyC
That myth persists to this day. Counter argument: http://cplus.about.com/od/programmingchallenges/a/challenge12.htm java 0.02688359274 seconds; C# 0.166 secs; C++ 429.46 secs; http://forums.sun.com/thread.jspa?messageID=10435068#10435068 1st and 2nd are both VMs, so don't tell me C++ is inherently faster, or slower. A bad craftsman blames his tools. The best violins were made before we knew how to measure anything with sufficient precision to reproduce them. Aside: Bob Wilson on quantum physics: http://www.videosift.com/video/Robert-Anton-Wilson-explains-Quantum-Physics
corlettk
Just to nitpick, but .Net isn't a virtual machine. It's a just-in-time compiler, such that the IL is compiled to native machine code one time per deployment.
Joel Coehoorn
True, it uses a JIT, but using .NET "feels" the same as a Java-style VM design (and of course Java has a JIT too).
Qwertie
+4  A: 

In the early days, most personal computers had a cassette tape interface for loading and storing programs. I did not have a computer at this time but read everything I could get my hands on (mostly magazines) that had anything to do with computers (this was the late 70's - no internet for me). For some reason I was under the impression that programs were executed directly from the cassette tape and that the only reason computers had any RAM was to store variables while the program ran. I figured that when the code had to execute a jump instruction, it would somehow rewind or advance the tape to the correct position and continue from there.

Ferruccio
I still, to this day, have not been able to adequately explain the difference between volatile memory (RAM) and non-volatile storage (hard disk etc.) to my mother.
Dan Diplo
Gotta love it...Amazing how things have changed... was that bytecode on the tapes? No high-level languages there.
CodeJoust
+2  A: 

I thought Windows 3.1 was only a platform to play solitaire, and DOS was a platform for BASICA.

henry
+1  A: 

That I knew how to write a proper web application and had it all figured out. When I had to design stuff that worked in all the browsers, it screwed me.

Pranali Desai
+2  A: 

Error handling is unnecessary when you have tested your code thoroughly.

too much php
+2  A: 

I always assumed that anyone writing any code for any language used an editing program.

I was working with a client of mine who had me on mostly as support and to write some of the more complex things for him. Well, one day he messed up a file, big time. He accidentally saved over three hours' worth of his own work, and when I asked him why he didn't save more often he replied, "because I wasn't done". Naturally, this was not an acceptable answer, and I poked and prodded a little further. I eventually found out that he had never used any editing program, EVER! Not even notepad.exe! He had been using an online CPanel editor for files! It didn't even have a 'Find' function. He could never save until he was done, because he was editing the live file on the site!

Needless to say I was flabbergasted, and he's still using the CPanel editor to this day...

CPanel's editor! CPanel is a good management tool, but seriously... I only use that for on-the-road patches... Never trust a remote server; sometimes I just copy a long comment to the clipboard so I don't have to worry if it doesn't post... (too many things online, like killing sessions when you have a good, long comment or post, etc.)
CodeJoust
For quick patches, I admit I have done that.
Macha
+3  A: 

That everyone else is using the latest and greatest technology, while my team is the only one stuck with inferior, outdated tools. (Except for the mystic COBOL dinosaurs.)

ammoQ
Be nice to the COBOL dinosaurs. I control your bank account ;-)
corlettk
+1  A: 

That understanding pointers and recursion would be freakin' hard.

That Integers in VB6 have a different size than in .NET.

That VB6 could do bit-level operations.

That professional programmers make bug-less software.

yelinna
+4  A: 

That everyone wants to produce the best/most suitable code possible for a problem...

AWC
oh, it's not true?
hasen j
+1  A: 

That OOP was obsolete :( I still regret thinking that till this very day.

Leo Jweda
Yeah, AOP totally superseded OOP, didn't you get the memo ;-)
corlettk
+2  A: 

Learning regular expressions will save you time

rjdevereux
really? They haven't saved you time? They save me a ton of work daily.
Demi
corlettk
LOL @ that, this reminds me of the quote: Some people, when confronted with a problem, think "I know, I'll use regular expressions." Now they have two problems. Thanks Jeff :D
Leo Jweda
They will, if you don't overuse them. Knowing the basics saved me a ton of time! (Try doing 4 chained str_replace's in a row...)
CodeJoust
+10  A: 

That bytes and characters were practically the same thing - "ASCII" was just a way of mapping a byte value to a glyph on the screen.

Reading about Unicode really opened my eyes (although I still don't fully understand it).
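A tiny illustration of where the byte/character split shows up, in C, assuming the string is UTF-8 encoded:

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* "héllo": the é (U+00E9) encodes as two bytes in UTF-8 */
    const char *s = "h\xc3\xa9llo";

    /* strlen counts bytes, not characters: prints 6, even though
       a reader would say the string is 5 characters long */
    printf("bytes: %zu\n", strlen(s));
    return 0;
}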

Cybis
Great article: http://www.joelonsoftware.com/articles/Unicode.html
corlettk
Indeed, things don't get hairy at all until you find about the transcoding tables that are in the individual font files.
TokenMacGuy
+12  A: 

That it's a 9-5 job.

Nir
+5  A: 

That C++ was the coolest language out there!

hasen j
Of course it is. Don't you know?
Here Be Wolves
Yea, I used to think so, and I even used to argue *for* it.
hasen j
What's wrong with C++? I mean, I know there are things wrong with it, but it is pretty cool. I would argue for it.
Carson Myers
It's definetly not the coolest
hasen j
Not the coolest? You can do OOP and metaprogramming in an *efficient* way!
Eduardo León
It was cool... now it is sadly old and unDRY :(. Sure, sure, still the best for efficient code, but not *that* cool. Metaprogramming? You mean Template Black Magic Trickery that Halts Compilers? Python is the new cool kid around... sure, it's a somewhat slow kid... but cool. Anyway, C++ is going through some surgeries to come out as the new C++0x... oops, C++1x kid. Then it will be cool again, like a 60-year-old man dressing like a 15-year-old teenager!
e.tadeu
-1. Template meta programming in C++ is the coolest thing there is.
Viktor Sehr
+56  A: 

That a large comment/code ratio is a good thing.

It took me a while to realize that code should be self documenting. Sure, a comment here and there is helpful if the code can't be made clearer or if there's an important reason why something is being done. But, in general, it's better to spend that comment time renaming variables. It's cleaner, clearer and the comments don't get "out of sync" with the code.
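A toy sketch of the trade-off (hypothetical names, in C): the comment in the first version silently goes stale when the logic changes, while the renamed version documents itself:

/* comment-dependent: */
int calc(int a, int b)
{
    return a * b / 100;  /* price times discount percent */
}

/* self-documenting: */
int discount_cents(int price_cents, int discount_percent)
{
    return price_cents * discount_percent / 100;
}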

Clay Nichols
I agree _in_ the actual code... excluding javadoc comments (or equivalent).
corlettk
+1, don't even get me started on the treatises I used to write for 10 line functions
wds
To add to this, an assert() statement is better than documenting a precondition/postcondition. .NET 4 code contracts can automatically be turned into documentation, too!
Robert Fraser
A: 

That dimension n is an instance of dimension (n+1) when they're equivalent.

Larsson
+3  A: 

My longest held (and therefore most costly) incorrect assumption was: "The business's requirements are sane and reasonable, I'm just not understanding them yet."

100 green assumptions sitting on the wall,
and if one green assumption should accidentally fall,
there'd be 99 green assumptions sitting on the wall.

Alternately:

Humpty Dumpty sat on the wall.
Humpty Dumpty had a great fall,
and all the king's horses and all the king's men
said, "Effim, he's only a tech."

corlettk
+2  A: 

That there is never enough time to finish before the deadline.

Lukas Šalkauskas
+6  A: 

Don't use advanced implementation-specific features, because you might want to switch implementations "sometime". I've done this time and again, and almost invariably the switch never happened.

Martin DeMello
+2  A: 

That a WTF is always evidence of a bad professional.

In fact I've been realizing recently how many WTF's I committed myself throughout my career, but I was comforted when StackOverflow showed me they are just another software metric.

Romulo A. Ceccon
+2  A: 

That, by learning an exact science, I wouldn't need to improve my limited social skills.

Romulo A. Ceccon
software engineering is not an exact science though
wds
+9  A: 

That IDEs would get faster.

Anirudh
The trend continues with VS 2010 :(
Ben Aston
At least you get something for it. More recent IDEs do their job better than those from a while ago.
Billy ONeal
+2  A: 

That variables are actually just names for specific areas in the memory.

Gumbo
+3  A: 

That, being the owner of the code I write, I'm the only person who should understand or touch it.

Romulo A. Ceccon
A: 

Thinking I was the only person who does this: when I need a routine I've written before but can't remember what I did, I simply copy/paste my own code.

Now I know that everybody does that.

yelinna
+1  A: 

If I have a powerful static type system like the one in ML or Haskell, I should use it to encode as many invariants as possible. Only with experience did I learn that sometimes it's better to let the invariants be dynamic.

Norman Ramsey
A: 

When I was learning algorithms in junior middle school, I thought NPC meant just "non-polynomial" problems, i.e. that the complexity of such a problem was no simpler than polynomial. I didn't recognize I was wrong until I learned computational theory in college -_-b

ZelluX
A: 

Java is slow. So many Perl fanboys on Slashdot regurgitate this; it's sad.

mP
+5  A: 

It's important to subscribe to many RSS feeds, read many blogs and participate in open source projects.

I realized that what is really important is that I spend more time actually coding. I have had the habit of reading and following many blogs, and while they are a rich source of information, it's really impossible to assimilate everything. It's very important to have balanced reading, and to put more emphasis on practice.

Regarding open source, I'm afraid I won't be popular. I have tried participating in open source, mostly in .NET. I'm appalled to see that many open source projects don't even follow a proper architecture. I saw one system in .NET not using a layered architecture, with database connection code all over the place, including code-behind, and I gave up.

Shaw
+13  A: 

This is really embarrassing, but when I was starting to learn how to program, nothing could satisfy me. I wanted to write video games. Not the trivial little programs all these books wanted me to write. So I decided I could easily skip 10 chapters and ignore the basics.

So I basically ignored variables!

The problem was that I could not tell keywords apart from conventions:

Car car = new Car(); //good
Car test = new Car(); //wrong must be lowercase car!

for (int i = 0; i < 10; i++) //good
for (int test = 0; test < 10; test++)//wrong must be i

I did this for over a year and even made a tic-tac-toe game in 3000 lines! I was thrilled by my awesomeness at that point, until I found a tic-tac-toe in 150 lines on the Internet. Then I realized I was an idiot and started over again.

MrHus
LOL. I like that one :)
Thomas Bratt
tic tac toe in 3000 lines, lawl
Petey B
Reminds me of when I did my first BASIC and didn't realise that $str in "input $str" could be named just about anything.
Viktor Sehr
+2  A: 

That creating a successful application can easily be done by programmers alone. Software is also about ease of use, good looks, documentation and proper marketing. Software development is multidisciplinary, and failing at one discipline will probably sink the application.

Tarscher
+9  A: 

That one day I'd have a realistic idea how long it would take to build some nontrivial code/system/whatever.

DarkSquid
And if you have a good estimate, it's because you've advanced to the point that such projects are trivial for you :-)
scraimer
+5  A: 

I am a young fledgling developer hoping to do this professionally because it's what I love, and this is a list of opinions I once held that I have learned, through my brief experience, are wrong:

The horrible mess you end up with when you don't separate user interface from logic at all is acceptable and is how everyone writes software

There's no such thing as too much complexity, or abstraction

One Class One Responsibility - I never really had this concept; it's been very formative for me

Testing is something I don't need to do when I'm coding in my bedroom

I don't need source control because it's overkill for the projects I do

Developers do everything, we're supposed to know how to design icons and make awesome looking layouts

Dispose doesn't always need a finaliser

An exception should be thrown whenever any type of error occurs

Exceptions are for error cases, and a lot of the time it's OK to just return a value indicating failure. I've only come to understand this recently; I'd been saying it, while still throwing exceptions, for much longer

I can write an application that has no bugs at all

Crippledsmurf
Those are nice lessons, but... Which one(s) of those assumptions turned out to be incorrect?
Windows programmer
Recently, I've learned: Git is amazing, and I thought the same thing. I'm also learning tests (other than testing manually... time consuming). One thing you might be missing: debug using debuggers, not printing out at various execution points (if possible). On coding to no errors: don't ever try to write a reliable program that relies on an external source. My only problem with a super-simple CMS was that I relied on Yahoo and f_open, which hosting disabled, and Yahoo changed the endpoint...
CodeJoust
+20  A: 

One assumption I had as a rookie in those days was that people with more years in the field are automatically better developers.

Arcturus
Why may I not upvote you twice?
Luc M
+1, but also, conversely, that I understood the problem better and could come up with a better solution than the senior dev.
Nathan Koop
+1  A: 

That a language suitable for systems programming must support [mutable] variables.

james woodyatt
A: 

That ASP.NET was so complicated it's not worth learning... oh wait, that one is right.

drozzy
not seeking facetious answers. less opinion, more honesty please.
Demi
well, the question is tagged "subjective"... however one elects to express that subjectivity is immaterial.
Jarret Hardie
@Jarret Hardie : it wasn't tagged sarcastic, however. His "answer" isn't answering the question at all - he's making a comment not about his assumptions but about his prejudices.
Demi
I disagree. If you are a professional programmer working on Windows ... it's totally worth the effort. (no down vote, but I do disagree)
John MacIntyre
+12  A: 

That I was a good programmer!

dean nolan
Welcome to the club!
Luc M
+1  A: 

That full Unicode support was a prerequisite for successfully deploying software to Asian regions.

Gerard
+4  A: 

That managers know what they're talking about.

zvolkov
+1  A: 

I thought writing good-enough software was an easy task.

Upul
+2  A: 

Common poor assumption: "Quality of code is secondary." Even poorer assumption: "Quality of code is not important at all."

Quality of code can be a very broad concept. I discussed it quite thoroughly here.

Daniel Ribeiro
+2  A: 

That the more lines of code, the better the software would be.

Wow, that's one you definitely don't want. I spend a lot of time cleaning up code. The fewer lines the better (and the clearer the syntax).
CodeJoust
+1  A: 

That our development methods were chosen and used because they were the best of breed.

Then I figured out that the tools we use had a much greater impact on what we did, when we did it, and how we did it than I thought.

Karen Lopez
+2  A: 

That you could memset( this, 0, sizeof(TheObject) ) a C++ object in its constructor with no negative consequences

bobobobo
You'll zero out the vtable! If there's a vtable, I think it can only work if there is a derived class (which overwrites the vtable pointer when its constructor starts).
Qwertie
+8  A: 

That object orientation is always the best way to design source code and will always be.

Viktor Sehr
hehe, tell me about it. These days I hate OOP
hasen j
+1  A: 

That people actually cared about the technologies being used (open source/closed source).

Ritesh M Nayak
Edit it to licenses not technologies.
Evan Plaice
+1  A: 

In the early eighties, when I started playing around with computers (a ZX81 with 1K of memory), I used to spend hours typing in reams of machine code (bytes, not human-readable assembly language) for games from magazines, essentially using BASIC POKE instructions.

I believed that if I ever entered a single instruction incorrectly then I'd have to go back to the beginning and start entering the machine code again from the start.

Damian Mehers
Ow. I've never had that one (other than writing classes in interactive consoles for fun... just because Macs have Ruby installed out of the box).
CodeJoust
+3  A: 

That the evaluation order of the operands of && in C/C++ was compiler-specific. So writing:

if ( ( pointer != NULL ) && ( pointer->doSomething() ) )

was unsafe, because the evaluation order could be swapped. I found out recently (after many years of spouting that lie) that it's part of the ANSI C specification: && evaluates left to right, short-circuits, and puts a sequence point after its first operand, so you can rely on the order and it's perfectly safe.
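For what it's worth, Java inherited the same guarantee, so the idiom carries over directly; a minimal sketch (the array here is invented for the example):

public class ShortCircuit {
    public static void main(String[] args) {
        int[] data = null;
        // && evaluates its left operand first and skips the right one when
        // the left is false, so this cannot throw a NullPointerException.
        if (data != null && data.length > 0) {
            System.out.println("first element: " + data[0]);
        } else {
            System.out.println("no data");
        }
    }
}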

James

http://stackoverflow.com/questions/888224/what-is-your-longest-held-programming-assumption-that-turned-out-to-be-incorrect/888259#888259
Michael Myers
mmyers, you mentioned exactly this question, which this answerer's answer answered almost perfectly. Did you forget to add something else?
Windows programmer
Meanwhile, the evaluation of most expressions, including expressions in if statements, can often be compiler-specific. James Norris's if expression contains three operators. Two of the three do not impose any ordering.
Windows programmer
Yes, I didn't spot the previous point, thanks! If you look under conditional operators in the ANSI C specification: http://std.dkuug.dk/JTC1/SC22/WG14/www/docs/n843.pdf, 6.5.13 Logical AND operator... "Unlike the bitwise binary &, there is a sequence point after the evaluation of the first operand. If the first operand compares equal to 0, the second operand is not evaluated." Further: I find it difficult to believe that languages like C++/Java, built after the C spec, do not follow this rule too.
I don't know of any language that doesn't use short-circuit and/or logic. Well, okay, VB. But it seems like I rely on short-circuit and/or logic just about every day. Hard to imagine what my code would look like if I didn't know this basic principle.
Qwertie
+3  A: 

That I would ever become wealthy programming software for someone else

slf
There were days when I thought the same
Viktor Sehr
+2  A: 

That marketing guys care about what you do.

crauscher
Actually, that marketing guys UNDERSTAND what is possible and what isn't, so they don't try to sell the solution to famine everywhere in the world.
Eduardo León
+2  A: 

That you needed a client specification to complete a project. More often than not you start with a sales meeting and a notepad. Of course, at the end of the meeting they would like a deadline: "just ballpark it".

Bill Szerdy
A: 

That the number of sides of a coin isn't 10.

Binary jokes are sooo...
Vincent
+6  A: 

That we as software engineers can understand what the user really wants.

Jim Evans
+6  A: 

That more comments are better. I've always tried to make my code as readable as possible--mainly because I'm almost certainly the guy that's going to fix the bug that I let slip by. So in years past, I used to have paragraphs after paragraphs of comments.

Eventually it dawned on me that there's a point where more comments--no matter how neatly structured--add no value and actually become a hassle to maintain. These days, I take the table-of-contents + footnotes approach and everyone's happier for it.

hythlodayr
This is a duplicate
ssg
+12  A: 

When I first started after graduating from university I expected that more senior developers would know what they were doing. Boy was I wrong....

Mark
@Mark, and that people who told you "what was correct" wouldn't just be "saying something" because they didn't actually know the answer. (-:
Rob Wells
That's funny, my biggest misconception when I became a senior developer was that I expected university graduates would know what they were doing. :-)
tnyfst
@Mark: LOL @tnyfst: LOL again ;-)
Treb
+6  A: 

G'day,

That I'd be just designing and writing code.

No requirements gathering, documentation or supporting.

cheers,

Rob Wells
Thankfully all of that was drilled into me at university! I would have been given the shock of my life otherwise ;-)
Barry Gallagher
Ah... the number 1 reason why I got my diploma in IT, then went on straight to sign on in law enforcement. (ironically, I'm now a cop assigned to an IT project, doing requirements gathering, documentation and users-vendors liaison.) =P
Darkwoof
+2  A: 

I never thought I would be a professional programmer, I thought I would be working with electronics. But in the end, programming is so much easier and pays so much better that what started as a side job became my main thing.

Otávio Décio
+2  A: 

In school, you are taught that programming is "read input, process data, write output". In reality, there is rarely a processing step -- most coding is just "read input, write output".

Generally, it's either "read from user, write to database" or "read from database, display on screen". Those two cases cover about 95% of the work you'll ever do.

James Curran
I would vote this up, but technically it's not correct. I understand what you mean, but in reality there is still plenty of processing occurring.
Richard Hein
Um, my code processes data constantly, and in numerous ways...
Qwertie
+3  A: 

Satisfying a customer by implementing what he wants - unfortunately this implies that the customer knows what he wants.

tanascius
+2  A: 

My biggest preconception was that I would be allowed program the way I wanted to. Then of course I left university and got employed by a company that had ridiculous frameworks, rules and procedures in place that not only meant I wasn't programming the way I wanted to, but meant I was programming badly.

Barry Gallagher
+3  A: 

That people would care about best practices, or even consistency.

l0b0
+2  A: 

The less code the better. Now I know that sometimes it's worth having more lines of code if it makes the code easier to read/understand.

Sergio
+1  A: 

I assumed it was going to be a rollercoaster ride of fast cars, loose women, private jets and daring escapades. Just wait until I get my hands on that career advisor....

Visage
+2  A: 
  • I thought I'd be coding for 8 hours straight. Realistically, I get 4 hours a day of coding, 1 hour for lunch, 1 for coffee breaks, and 2 for screwing around / chit-chatting / stack over- and under-flowing.

  • Prior to working, I thought that all clients would be idiots who don't know two craps about computers. Boy was I wrong on that one. Sometimes we get projects from people who could do it better than we can; they just don't have the time to do it.

  • I thought cubicles were bad. Right now I love them :D I actually moved from an office with a door to a cubicle. I like the openness.

  • That programmers are not athletic. I thought that I was the only one who goes to the gym. Where I work, at least 10 of us go to the gym every day at 5 am.

  • I thought there would be no women programmers. A couple of our leads are ladies.

dassouki
I wish I could get just one woman programmer! Where I come from I think it's a taboo, I don't know :(
gath
I would like to find a woman programmer to marry..
Qwertie
+6  A: 
  • My co-workers were/are producing supposedly bad code because they sucked/suck. It took me a while to learn that I should first check what really happened. Most of the time, bad code was caused by lack of management, customers who didn't want to check what they really wanted and started changing their minds like there's no tomorrow, or other circumstances out of anyone's control, like an economic crisis.
  • Customers demand "for yesterday" features because they are stupid: not really. It's about communication. If someone tells them everything can really be done in 1 week, guess what? They'll want it in 1 week.
  • "Never change code that works". This is not a good thing IMO. You obviously don't have to change what's really working. However, if you never change a piece of code because it's supposedly working and too complex to change, you may end up finding out that the code isn't really doing what it's supposed to do. E.g.: I've seen sales commission calculation software doing wrong calculations for two years because nobody wanted to maintain it. Nobody at sales knew about it. The formula was so complex they didn't really know how to check the numbers.
João Marcus
+17  A: 

That I can understand my own code without comments!!!

TheMachineCharmer
It hurts when you realize that you can't understand it!
Luc M
That's when you know you did a poor job writing it ;) But yeah, it hurts :(
LnDCobra
On a similar note: "That I can understand my comments"
edorian
+1  A: 

The specs are complete and sufficient

SeanX
+2  A: 

That Java passes copies of objects to functions, not references.

In other words, I thought that if you pass an object into a method, then change the object in some way, it doesn't change the object in the calling scope. I always passed objects into methods, manipulated them, then returned them!
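For illustration, a minimal Java sketch of what actually happens (the Box class is invented for the example): the method receives a copy of the reference, so mutating the object is visible to the caller, but reassigning the parameter is not.

public class PassByValue {
    static class Box { int value; }

    static void mutate(Box b) {
        b.value = 42;  // both copies of the reference point at the same object
    }

    static void reassign(Box b) {
        b = new Box(); // only the local copy of the reference changes
        b.value = 99;  // the caller's Box is untouched
    }

    public static void main(String[] args) {
        Box box = new Box();
        mutate(box);
        System.out.println(box.value); // prints 42
        reassign(box);
        System.out.println(box.value); // still prints 42
    }
}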

DisgruntledGoat
That just allows you to().do().this(). :)
Arafangion
+7  A: 

I think I was 10 years old when someone convinced me that there will be a computer capable of running an infinite loop in under 3 seconds.

scraimer
+2  A: 

I always believed that to be a good programmer one has to know all the inner workings of the system. I was ashamed of the fact that I didn't know everything there is to know about the language - its libraries, patterns, snippets - before starting to code. Well, I am not so naive anymore.

lune
+3  A: 

That Python was an impractical, annoying language (I can still read some comments in my early code complaining about it) and C++ was the only true object-oriented language.

I was so wrong I still feel ashamed.

Stefano Borini
A: 

A program can eventually have all of its problems ironed out.

TomFromThePool
+3  A: 
Chris S
+4  A: 

That I need to define all the variables I'll use in my function at its beginning (Pascal style).

I used to believe I needed to think about ALL the resources to be used by my function and define them before I started coding; this is probably because my first language was Pascal, where that's a requirement. Then, when I moved to C, I would define temp variables that were used only within loops outside those loops, disregarding in-loop scope, just so that "everything would be defined at the beginning".

It took me several years to understand that defining all the resources in advance is not a sacred cow, and that scoping is itself extremely important to code readability.
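For illustration, a minimal Java sketch of the difference (the sums are just filler):

public class Scoping {
    public static void main(String[] args) {
        // Pascal style: everything declared up front, so i stays alive
        // (and mutable) for the rest of the method.
        int sum = 0;
        int i;
        for (i = 0; i < 10; i++) {
            sum += i;
        }

        // Scoped style: j exists only inside the loop, so a reader knows
        // at a glance it cannot be used, or misused, anywhere else.
        int total = 0;
        for (int j = 0; j < 10; j++) {
            total += j;
        }

        System.out.println(sum + " " + total);
    }
}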

Roee Adler
+3  A: 

That other people would be as bothered by known bugs as I was, and would make fixing them a priority over project work.

CodeByMoonlight
+4  A: 

That my schooling would prepare me for a job in the field.

SnOrfus
+2  A: 

That the benefit of OOP is that you get to reuse the object, when in reality it's the reuse of the rest of the code, by creating a new object that has the same interface.

In reality, the object might be 2% of the code, so reusing it gets you only a 2% benefit. The real benefit is reusing the other 98% of the code: by creating a new object that keeps the same interface, all the other code can do something completely different. Now you have reuse of 98% of the code. Well worth the 3x longer it takes to write something as an object.

E.g., if you have a drawing program and suddenly there is a new shape you want to be able to draw, you just change the ShapeObject (while keeping the interface the same). Nothing else in the program has to change.
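For illustration, a minimal Java sketch of that drawing-program idea (all names invented here): adding Star touches nothing but the new class, while the rendering loop, the "other 98%", is reused unchanged.

import java.util.List;

interface Shape {
    void draw(); // the stable interface the rest of the program relies on
}

class Circle implements Shape {
    public void draw() { System.out.println("drawing a circle"); }
}

// The brand-new shape: none of the code below needs to change for it.
class Star implements Shape {
    public void draw() { System.out.println("drawing a star"); }
}

public class DrawingProgram {
    // The reused 98%: works for every Shape, present and future.
    static void render(List<Shape> shapes) {
        for (Shape s : shapes) {
            s.draw();
        }
    }

    public static void main(String[] args) {
        render(List.of(new Circle(), new Star()));
    }
}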

Clay Nichols
+2  A: 

That I wouldn't need to rapidly refactor my object-oriented code. Martin Fowler finally opened my eyes.

Ritesh M Nayak
+2  A: 

That I would never find a practical use in programming for the Karnaugh maps I was taught in my computer science curriculum.
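They do surface occasionally when simplifying branch conditions. A small, hypothetical Java example: grouping the 1-cells of a two-variable Karnaugh map collapses (a && b) || (a && !b) down to just a.

public class KarnaughExample {
    public static void main(String[] args) {
        for (boolean a : new boolean[] { false, true }) {
            for (boolean b : new boolean[] { false, true }) {
                // Condition taken straight from a truth table:
                boolean verbose = (a && b) || (a && !b);
                // The same condition after the K-map grouping: b drops out.
                boolean simplified = a;
                System.out.println(a + ", " + b + " -> "
                        + verbose + " == " + simplified);
            }
        }
    }
}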

schemathings
+2  A: 

That tests were just another method of procrastination.

Macha
+3  A: 

That learning a language is just learning its syntax and the most common parts of the standard library.

Macha
+3  A: 

I thought "duck typing" was actually "duct typing" when I first heard of it, similar to the way people often say duck tape. "Duck typing" just sounded wrong, while "duct typing" made a weird kind of sense (cobbled-together types).

Chinmay Kanchi
LOL, like duct tape typing.
Evan Plaice
Wait, it's not duck tape? I'm free to use it on geese, too?
Don Branson
Kind of makes me wonder why it isn't duct typing - seems very appropriate!
Kevin Wright
+1  A: 

That programming is for juniors and that the best project managers are people who can’t program.

Kdeveloper
A: 

That an HTML element's id and name attributes were interchangeable.

It turns out that elements with 'name' attributes are what gets referenced when a form is POSTed, while 'id' attributes are used for DOM references.

Mark Redman
+7  A: 

I was convinced, for at least 6 years, that every problem had exactly 1 solution.

Utterly unaware of multiple algorithms with differing complexities, space/time tradeoffs, OOP vs. Functional vs. Imperative, levels of abstraction and undecidable problems. When that blissful naivety broke, it opened up a world of possibilities and slammed the door on simply sitting down and building things. Took me a long time to figure out how to just pick one and run with it.

Kim Reece
A: 

That I could convince traditional procedural programmers of why OOP oft-times provides a better solution.

That is, a language that describes the world needs the ability to describe complex objects and their relationships.

Arguments usually included nonsense about abstract classes, which I responded to with "not all OOP programmers are fresh out of Uni and still obsessed with abstracts". Or the classic, "there's nothing you could do in OOP that I couldn't do with strictly procedural programming", which I usually replied to with, "It's not that you could, it's whether you would if you had a more extensive toolset".

I've learned to just accept that they don't see the world through the same lens I do.

Evan Plaice
Traditional procedural programmers have a different view of life. To them, a computer uses a program to process data. Input->Program->Output. Entangling data with procedures adds no value. In other words, in the mindset of a traditional programmer, the program is not even trying to describe complex objects and their relationships. It is not making a model of anything. It's using algorithms that read input and write output.
ammoQ
+2  A: 

That you never finish the project you don't start.

Seems really stupid, but I put off so many projects because the scale was simply overwhelming. Having just finished a monster of a project, I realized I never would have started it had I grasped the scope up front. In reality, though, even the most complex system is pretty simple when broken into discrete, well-defined pieces; looked at on the macro level, it is quickly overwhelming.

Serapth
+1  A: 

That PHP's mysql_fetch_row was the only way to retrieve data from an executed SQL query.

Honestly - I programmed an entire web application without using mysql_fetch_array, and had to change bunches of numeric indexes every time I wanted to change the query to add an extra column.

Julian H. Lam
+4  A: 

That the Internet Explorer 6 box model was an evil, dumb idea MS came up with only to break compatibility with other browsers.

Lots of CSS work convinced me that it's actually much more logical, and can make page design maintenance (changing a block's paddings/borders/margins) much easier.

Think of the real world: changing an A4 page's padding or border widths doesn't change the page width.

TheOsp
It may have a logical basis, but it still goes against the CSS specification, and is therefore a bug.
Kevin Wright
+4  A: 

That I can read SO and still get any work done.

FastAl
So true. +1 for humor.
cplotts
+1  A: 

That simplicity almost always beats complexity. KISS - Keep It Simple Stupid rules.

Edit: As Georg states below, I got this one reversed; my mind must have gotten lost in the replies. Simplicity almost always makes your code better, if used correctly.

mwgriffith
You might have misread the question. According to the title, it sounds like the belief in simplicity turned out to be incorrect?
Georg Fritzsche
I'd actually agree with that. The best and fastest software in the world is incredibly complex, and it got there for a reason.
Andres Jaan Tack
Sorry, I must have lost the question after reading too many replies. But you're right, I should have said that I believed complexity was better than simplicity. Meaning that simplicity is usually the best way to go when programming: it's easier to maintain, easier to debug, and occasionally even runs faster.
mwgriffith
+4  A: 

That bytecode-interpreted languages (like C# or F#) are slower than those reset-button hogs that compile directly to machine code.

Well, when I first held that belief (in the 80s), it was true. However, even in the C# era I sometimes wondered if "putting that inner loop into a .cpp file would make my app go faster".

Luckily, no.

Sadly, I only realized that a few years ago.

Turing Complete
Here's another: C# is not a bytecode-interpreted language. There is a "bytecode" analog in IL, but C# IL is JIT-compiled to fully native code before it executes; it is never interpreted.
Joel Coehoorn
That's only part of what I meant. My belief was that JIT output was far inferior to directly compiled code, which is wrong.
Turing Complete
+2  A: 

That procedural developers/programmers unfamiliar with SQL and relational databases don't need any formal training or understanding of how to work with SQL, and that a quick read of something like SQL For Dummies is enough to work with relational databases like Oracle & SQL Server.

Far too often, errors in applications dealing with data stored in a relational database like Oracle or SQL Server are caused by a lack of understanding of how to use the language of relational databases: SQL.

I used to work for a software vendor who had the mentality that all a developer needed was the SQL For Dummies book or something similar and they would be fully equipped to handle any relational database issue. Now that the clients of this vendor have databases measuring in the hundreds of gigabytes, this lack of SQL knowledge is coming back around in a negative way. It's not just bad-performing lookups, updates and inserts that are a problem; the actual design of the database itself is the real obstacle.

All of that could have been avoided, at far less cost now, if at the time the development lead had treated SQL and relational databases with the same level of respect as the language they built the application with.

Don't dismiss SQL as unimportant, because it WILL come back to haunt you eventually. You may be able to get away with it for a while, even years, but you will eventually hit that breaking point where you can't progress without a complete re-design of your database, and that is when the costs will be highest.

A: 

That... who needs JUnit testing when breakpoints are effective (when testing applications in debug mode)? I realised later why...

The Elite Gentleman
what is the reason why?
TandemAdam
The Elite Gentleman
A: 

That temporary solutions are not permanent solutions;
or in other words: workarounds are not forever :)).

Andrei T. Ursan
That's what you say! I don't wanna know how many of my workarounds are still floating around...
Bobby
Well yeah, that's why what I said is wrong, isn't it? My point is that workarounds are forever; the world is just not perfect at all...
Andrei T. Ursan
A: 

thread = process
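For illustration, a minimal Java sketch of why the two are not the same (simplified; join() provides the only synchronization needed here): threads run inside one process and share its memory, whereas separate processes each get their own copy of everything.

public class ThreadsShareMemory {
    static int shared = 0; // one copy, visible to every thread in this process

    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread(() -> shared = 42);
        t.start();
        t.join(); // join() makes the thread's write visible to main

        // A child *process* would have written to its own copy of `shared`;
        // a thread writes to the very same one we read here.
        System.out.println(shared); // prints 42
    }
}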

radi