So... you've sat down with a blank sheet in front of you and a browser open in the background with access to the manual. What, in your opinion, makes you scratch your head and think, "Now how..."?
Connect to a database. Pull some data out of it and display it.
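A minimal sketch of that in Python, assuming a SQLite file called example.db with a users table (both names are just placeholders):

import sqlite3

# Open the database, run a query, and display the rows.
conn = sqlite3.connect("example.db")
try:
    for row in conn.execute("SELECT id, name FROM users"):
        print(row)
finally:
    conn.close()

Getting even that far in a new language tends to expose how it handles libraries, resources, and errors, which is why it makes a good first exercise.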
Best practices tend to trip me up. Why is something implemented that way over the alternatives. Is it just convention/programmer quirks, or is there some special language-specific reason for it?
I find that stuff can take a long time to pick up.
hello, world.
Seriously.
It varies from trivial (in which case I'll move on to the next thing having not "tripped up") up to 15 or so text files (a Symbian app doing everything the way you're supposed to, and it's extremely easy to get it wrong). It might even be impossible, in which case I settle for flashing an LED.
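At the trivial end of that spectrum it really is a one-liner; in Python, for example:

print("hello, world")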
Don't assume anything about a language / environment you know literally nothing about. Anything might trip me up, so I try to watch where I'm walking...
It depends on the language. When I learn a new language, I always feel the need to sell it to myself. So the first thing I do is write something that the language is supposed to be good at. I'm looking for that WOW moment that gets me hooked and drives me forward. There is nothing worse than doing something you think is useless.
The variable syntax messes me up for a while if it's different from what I'm used to.
Literal formats, especially in languages like F# which have many varieties for different sizes of int, signed/unsigned, floating point, etc.
Learning a new platform is worse than learning a new language. Yet another language which gives you something based on the POSIX C API is pretty trivial to pick up; learning say Win32 if you already know C++ is definitely not.
However, I usually learn a programming language with a specific goal in mind, so after the number guessing game I usually start writing the actual program I wanted to write in the first place.
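For reference, the number guessing game mentioned above is usually something along these lines (a Python sketch; the details obviously vary by language):

import random

# The computer picks a number; the player guesses until they hit it.
secret = random.randint(1, 100)
guess = None
while guess != secret:
    guess = int(input("Guess a number between 1 and 100: "))
    if guess < secret:
        print("Too low.")
    elif guess > secret:
        print("Too high.")
print("Got it!")

It's small, but it touches input, output, loops, conditionals, and the standard library, which is why it's such a popular first program.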
Terminology. For example, an "object" in one language might have slightly different semantics than an object in another.
And this might just be part of a larger issue -- when I learn a new language, I'm starting from the assumptions of every language I've dealt with in the past. Carrying those assumptions into the new language has tripped me up several times, the most notable example being the transition from Java to C# with respect to virtual methods (i.e., all are virtual in Java, only those you specify as such are virtual in C#).
It's always something to do with syntax.
I'll mix up whitespace if that matters for the language, add in curly brackets or semi-colons where they aren't needed, make variables global when I mean for them to be local or vice-versa, mix up lists, sets, arrays, prototypes, and tables, use the wrong if-then formatting, use the wrong operator to concatenate strings...
The same thing also tends to happen if I stop using a language for a while and then start using it again.
Now how do I write assert(1 + 2*3 == 7) in this language? :)

An assert statement is the most primitive form of debugging construct that is present in every language that I have ever tried. Once I know how to assert something, I can start recording the things that I am finding out about that language. I start off with 1 + 2*3 == 7 because it covers equality (is it a single = or double?) and arithmetic precedence (* is evaluated first).
Next tests would be:
assert("helloworld" == "hello"+"world") // can + concat strings?
assert("1two" == 1 + "two") // convert numbers to string or just die
and so on...
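For what it's worth, here is how those probes come out in Python 3 (other languages will answer differently, which is the whole point of the exercise):

assert 1 + 2 * 3 == 7                      # == tests equality; * binds tighter than +
assert "hello" + "world" == "helloworld"   # + does concatenate strings

try:
    1 + "two"                              # mixing int and str...
except TypeError:
    print("...just dies instead of converting the number to a string")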
If I'm working on GUIs, it's always "How do I get to the toolbox/controls/widgets?"
New funky characters... Think Perl or something like that, which adds @, ->, <-, <=>, $, #
Depends on the language. The place where I usually run into problems is in dealing with containers. Most languages have a set of container classes that do about the same thing, but are just different enough to throw you off. For example, adding an element might be push_back in one library, add in another, and append in a third.
These questions (for a variety of reasons) are what trip me up. Not because the answers are hard to find, but because they're the wrong questions.
And for the record:
The first step (and often the most difficult) is setting up the environment. Depending on the language and your experience level, this can be a real pain in the butt.
'Wait, how do I install Apache?!' 'Crap, gcc doesn't have a pretty IDE with a button that looks like a green play button - how do I compile/link/execute my code?!?!'
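For the record, the bare-bones cycle that quote is complaining about is only a few commands (assuming a single-file program called hello.c):

gcc -c hello.c -o hello.o   # compile
gcc hello.o -o hello        # link
./hello                     # execute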
My first step is finding a very tiny 'Hello World' program in that particular language and getting it to build/run on my machine.
If that turns out to be an ugly/painful experience, I go out of my way to find one of the most popular IDEs available for that language. I wouldn't write C# code in Notepad and use the command-line compiler; I don't see why anyone would do that with another language. I have painful memories of making websites (mostly just HTML and JavaScript) in Notepad. Why? Because 'real programmers (ha!) use Notepad'.
What a joke.
None of that has anything to do with learning the language exactly, but it's pretty hard to read a book in the dark.