views: 195
answers: 4
I just finished my first year in college and I'm currently looking for an internship opportunity. I just went through our college's archive of job postings from the last 3 months, and this is what I found employers are currently looking for on the technical side:

J2EE
Scripting languages (Perl/Python)
SQL - usually any flavor, MySQL the most common
Web development stuff (HTML, CSS, XML, JavaScript, JSP)

Problem is, I don't know any of those except (X)HTML and CSS. I'm currently working through K&R C, and I'm not sure if I should stop.

My question is: should I be worried now, and should I try to become proficient in at least one of those (I'm thinking web development) so I can be at least a bit marketable to potential employers?

I'm also looking for volunteer opportunities here, and lots of them are looking for web designers, so I guess you can see where I'm going with this now...

+3  A: 

It is important to build your base before you diversify your skill set. Rather than learning too many things at once, I recommend you work double time on learning plain C or C++, as per your class (but, as said, putting in overtime, trying things out on your own in C, etc.). When you start being proficient with C, you'll see that much of this knowledge "ports" to Java or Python, for example.

If you are really champing at the bit, JavaScript (or even JavaScript with jQuery) would be a good parallel learning track, as it is both similar enough and different enough that you run less risk of mixing things up than you would with C and Java. (Also, JavaScript's requirements are minimal: mostly you just need an editor and a web browser...)

Edit:   Why learning C/C++ is important
A comment on this answer dismissed the value of learning "a language he'll never use at work" and spoke strongly about the "uselessness of pointers". On the surface these arguments hold some truth:

  • There is, probably, a less than even chance [a] that anyone starting a programming career in this millennium will make direct use of C/C++ as part of his/her professional endeavors
  • Pointers are powerful but dangerous things, and most languages are the better for sheltering their users from such bug-producing devices. In fact, some languages like Java were explicitly created to void (pun not intended) the need for pointers! (The sketch just after this list makes both halves of that point concrete.)
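
To make that concrete, here is a minimal C sketch of my own (an illustration, not anything from the thread): the swap function shows the power of pointers, and the commented-out dereference shows the danger.

    #include <stdio.h>

    /* The power: a function can modify its caller's variables
       directly through pointers - no return value needed. */
    void swap(int *a, int *b) {
        int tmp = *a;
        *a = *b;
        *b = tmp;
    }

    int main(void) {
        int x = 1, y = 2;
        swap(&x, &y);
        printf("x=%d y=%d\n", x, y);  /* prints: x=2 y=1 */

        /* The danger: nothing stops you from holding a pointer
           to nowhere valid. This compiles cleanly, but the
           commented-out line below is undefined behavior. */
        int *p = NULL;
        /* printf("%d\n", *p); */     /* dereferencing NULL: likely a crash */
        (void)p;                      /* silence the unused-variable warning */
        return 0;
    }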

And yet, in the face of such technically correct observations, many industry observers - who are more qualified and eloquent than I - persist in their recommendation that C/C++ should continue to be included among the founding blocks of a practically-oriented education in Computer Science. They also point out the relevance of the C language in and of itself, as applied to various areas of the industry. I'll let such sources speak for themselves; one can find various forms of their argument online. I'd like, however, to present briefly why learning the language makes sense for the OP and other young people seeking a career in the field.

  • He is already engaged in this class and might as well succeed at it!
    That's obviously a very weak argument with regard to the relevance of C/C++ "in the absolute", yet it is a valid and practical argument for him. Getting good grades and the respect of his professors may eventually afford him many additional opportunities.
  • A semester-long class in any language is hardly enough exposure to irremediably make him unfit to learn anything else. In the worst of cases, therefore, there is no harm done. However, thanks to C's relative simplicity and focus on close-to-the-machine concepts, this should be enough time to give him an understanding of things at this particular level of abstraction.
  • If nothing else, this will provide him with elements of syntax broadly used elsewhere; for better or worse, C and C-like syntax have permeated many languages (as the snippet after this list illustrates).
  • A well-rounded university CS curriculum will include classes on topics such as Operating Systems or Compilers, and such courses will either directly use or make reference to the language. Here again, one could argue that the recipients of such an education rarely design a full-blown OS or a general-purpose language during their professional lives, and that learning either of these is therefore a waste of time. Here again: direct usage of a particular knowledge or skill isn't the sole criterion for relevance.
  • The purpose of a formal education is to learn meta-skills and to acquire a broad culture which will help one adapt to (and in some cases shape) the evolving set of technologies associated with CS and IT. Learning a particular language can (and will!) be done at any time during one's career, and is often done by reading books, practicing alone, and other self-teaching approaches. Such learning is made easier when applied atop solid foundations rather than atop a somewhat sophomoric grasp of concepts, as is sometimes observed in people who learned too much too soon (and who focused on learning the operational interface of things rather than their essence).
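
On the syntax point: the function below is plain C (a hypothetical example of my own, not from any source above), yet its braces, for loop, if/else and return would read almost unchanged in Java, JavaScript or C#.

    /* Plain C (C99), but the control-flow syntax - braces,
       for loops, if/else, return - carries over almost
       verbatim to Java, JavaScript, C# and many others. */
    int sum_of_evens(int n) {
        int total = 0;
        for (int i = 0; i <= n; i++) {
            if (i % 2 == 0) {
                total += i;
            }
        }
        return total;
    }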

[a] "less than even chance": this is just wild guess. I didn't attempt to quantify this scientifically by any mean, I'm just stating my agreement with the fact that a minority of the next generation of coders will have direct need for C/C++ and an even smaller majority will be frequent practitioners.

mjv
So he should learn a language he will never use at work, because then it's easier to learn the languages that are used? C'mon man, you are lying to yourself and this dude. Pointers are useless.
01
Just to balance out the hideous comment at the top (@01): I completely agree with this answer. These skills are still needed in today's programming. C++, for example, is still used a lot in programming jobs (something the question asked about), and knowledge of pointers and other fundamentals is very useful.
Lucas Jones
+1  A: 

the problem is, it's really hard to learn useful skills without having a real project to work on. reading a book about a programming language won't teach you how to use it - only using it will teach you how to use it.

thankfully, we nerds have come up with some solutions to this problem. for instance, if you want to learn python (and you do!), try to get as far as you can in the python challenge - it's fun, and it'll teach you the best scripting language out there. once you know python, you can use it as a base to learn more high-level programming paradigms, like object-oriented development, or to build some basic web apps.

i really don't recommend starting with a language like javascript. it definitely has its place, but its inconsistencies will drive you bonkers.

there are real reasons to learn C. you will most likely need it in your coursework, especially if you have any systems-oriented classes. after writing an OS, a shell, a TCP implementation and a web server as part of my coursework, i definitely felt like i had a good grasp on the fundamentals of how computers really work - and you can't do any of that stuff without C. if you're one of those developers who dabble in web frameworks and don't know what's underneath it all, you'll a) always be dependent on sysadmins, b) never have a good understanding of performance issues, and c) never really know the craft of programming.
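
to give a flavor of that kind of coursework, here's a minimal sketch of the heart of a toy shell - my own illustration assuming a POSIX system (fork, exec, wait), not code from any actual course:

    #include <stdio.h>
    #include <string.h>
    #include <sys/wait.h>
    #include <unistd.h>

    /* a toy shell: read a command name, fork a child process,
       exec the command in the child, and wait for it to finish. */
    int main(void) {
        char line[256];
        for (;;) {
            printf("tinysh> ");
            fflush(stdout);
            if (fgets(line, sizeof line, stdin) == NULL)
                break;                        /* EOF (ctrl-D) exits the loop */
            line[strcspn(line, "\n")] = '\0'; /* strip the trailing newline */
            if (line[0] == '\0')
                continue;                     /* empty input: prompt again */
            pid_t pid = fork();
            if (pid == 0) {
                /* child: replace this process with the command
                   (no argument parsing in this sketch) */
                execlp(line, line, (char *)NULL);
                perror("exec failed");        /* only reached if exec fails */
                _exit(127);
            } else if (pid > 0) {
                waitpid(pid, NULL, 0);        /* parent: wait for the child */
            } else {
                perror("fork failed");
            }
        }
        return 0;
    }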

ultimately, though, don't stress out - you've got plenty of time.

Igor
A: 

One thing I recommend is to specialize in one language or platform and be better than anyone in your class on that subject. (Pass certifications - it's really important for a student; my life was so much easier after that...) Work on what you love, not what THEY need.

This platform will be your tool, and you'll have to polish it every single day.

This way, you'll never have to search for an internship; you'll just have to choose the best place to work.

After that, I highly recommend choosing another specialized field you'd like to know, one which doesn't overlap with what you have already learned (you don't need to be able to create websites in Java + Python + Ruby + .NET)... then repeat what I've said. Never stop learning.

Nicolas Dorier
+2  A: 

This looks like a common problem with learning technologies.

Don't focus too much on any particular technology. By the time you graduate, the technologies will have changed 10 times (a slight exaggeration).

Of course, during the internship you'll probably be expected to write things in some language with some set of libraries. You should know it well enough to be comfortable writing real-world programs with it. Don't expect to memorize the entire standard library or anything like that - there are guys who do that, but it's a really short-sighted strategy.

The real value is in things which don't change that quickly: design skills, algorithms, math problems, theoretical foundations of computer science, good judgment, good development practices... Some of these you'll learn in college (the more theoretical ones), and some during internships (the more practical ones; just try to make sure that what they think are best practices really are best practices - apply some critical thinking).

phjr