In a Channel 9 interview, David Callahan of Microsoft said,

Yes we would like to give you perfectly safe tools. That would require a complete rewrite of the infrastructure on up. So instead we will give you strong, effective tools and guidance on how to use them.

Is this really a good idea? Should we just continue refining our tools? Or should we drop them in favor of tools like functional languages that are verifiably safe?

+4  A: 

You change your tool set as much as you need right now and no more. If you just dropped the old style, 98% of programmers (including myself) would lose a lot of productivity, and the industry would lose money. The latter is the reason most of us are here anyway.

I'd also like to draw attention to Parallel Extensions for .NET and also Flow Based Programming for Java, C#, and ActionScript 3.

http://www.jpaulmorrison.com/fbp/ http://code.google.com/p/actionflowscript/wiki/HomePage

Justin Bozonier
+1  A: 

I think we are still at the beginning of this. Inevitably there will be comparisons of new-world threading vs. the traditional models we have now. I remember the procedural vs. OOP framework wars of the '90s. There is always going to be a space/time trade-off in computer science that must be balanced.

Any new infrastructure is going to have to survive on its own merits: can you solve the problem expressively, without lots and lots of work and tricky debugging? As good as VS is, it has remained essentially where other IDEs were two decades ago. One thing is for sure: threads are really a 3D kind of problem, and 2D ASCII files are not going to get us there. We have to have the ability to trace the sequence in another dimension. Most of us have a picture in our heads that simulates what the debugger does in a step-by-step trace. With threads we don't have that visual capability, and that's what makes it hard for us to imagine what will happen/can happen when.

MikeJ
+15  A: 

Judging from history, the sequence of events will be as follows:

  1. Users say, "We want languages that do new-thing, but we don't want to throw away our investment in old-thing."
  2. Toolmakers add new-thing features to old-thing languages.
  3. Users notice that new-thing + old-thing turns out to be a ghastly hybrid. They say, "Forget what we said earlier. We just want to do new-thing."
  4. Toolmakers make languages that just do new-thing.
  5. Academics whine, "But we've had languages that did new-thing since day 0! Why do you ignore us?"

In the case of structured programming, example languages are:

  • old-thing: early Fortran, COBOL, early BASIC
  • old-thing + new-thing: late Fortran, late BASIC
  • new-thing: C, Pascal, any modern language
  • whining academics: Lisp, Algol

For OO, examples are:

  • old-thing: C, Pascal
  • old-thing + new-thing: C++, late Pascal
  • new-thing: Java, C#
  • whining academics: Smalltalk

For parallel programming, we have:

  • old-thing: Java, C#
  • old-thing + new-thing: future versions of Java and C#
  • new-thing: F# might lead the way...
  • whining academics: Haskell, Erlang

Notes for pedants:

  • Erlang came from industry, not academia
  • Haskell's support for parallelism is still maturing
  • etc. Look, this is just a brief sketch from an embittered, whining academic.
dysfunctor
In any case, a most enjoyable sketch. Thank you.
Daniel Jomphe
Functional languages do not solve the parallelism problem. The automatic gains you can get are too fine-grained. Functional languages just make it easier to deal with side effects. They do not help you design coarse-grained parallelism to make an algorithm fast on a parallel machine.
cdv
cdv: At the programming language level, there are three promising approaches to parallelism: Nested Data Parallelism, Software Transactional Memory, and the Actor model. All three are most naturally expressed in the functional paradigm.
dysfunctor
I'll say more on that: Of course functional languages don't solve the parallelism problem. There is no "silver bullet" that can solve the essential problems of software complexity, but we need the best tools that we can get and, the way things are looking, functional is the only game in town.
dysfunctor
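Of the three approaches dysfunctor lists, the Actor model is the easiest to sketch briefly. Here is a minimal illustration in Python (chosen only for brevity; a real actor system such as Erlang's adds supervision, distribution, and far more): state is owned by one actor and changed only via messages, so the state itself needs no locks.

```python
import queue
import threading

# Minimal actor sketch (illustrative only): the actor owns its state
# (count) and mutates it only in response to messages taken from its
# mailbox, so no locks are needed on the state itself.
def counter_actor(mailbox, replies):
    count = 0
    while True:
        msg = mailbox.get()
        if msg == "stop":
            replies.put(count)
            return
        count += 1

mailbox, replies = queue.Queue(), queue.Queue()
threading.Thread(target=counter_actor, args=(mailbox, replies)).start()

for _ in range(5):
    mailbox.put("inc")   # fire-and-forget messages
mailbox.put("stop")      # ask the actor to report and exit

final_count = replies.get()
print(final_count)  # prints 5
```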
If I could +10 this I would, well said.
Jim Burger
A: 

Strangely enough, Microsoft will be bringing out PLINQ, a parallel form of LINQ, in the not too distant future, as well as the Task Parallel Library (TPL), which began life as part of Robotics Studio. Both of these technologies will help developers by abstracting away some of the complexities of synchronisation and locking, in much the same way as the .NET 2.0 BackgroundWorker. You will still be able to use the threading nuts and bolts, but I suspect most people will simply use the abstractions.
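For readers unfamiliar with the shape of these abstractions, the idea behind PLINQ-style parallelism (declare what to compute over a collection and let the runtime handle partitioning, scheduling, and joining) can be sketched with Python's standard concurrent.futures. This is an analogy only, not the .NET API:

```python
from concurrent.futures import ThreadPoolExecutor

def expensive(n):
    # stand-in for real per-item work
    return n * n

items = range(10)

# The executor handles partitioning, scheduling, and joining;
# no explicit threads or locks appear in the "query" itself.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(expensive, items))

total = sum(results)
print(total)  # prints 285
```

The sequential version would simply be `sum(expensive(n) for n in items)`; the point of the abstraction is that parallelising it changes one line, not the whole design.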

Mitch Wheat
+2  A: 

Based on the way hardware and infrastructure are heading, I think there will need to be a shift towards something like Erlang. However, I think a hybrid solution will emerge, combining the benefits of current programming languages with parallel programming constructs.

Brandon
+1  A: 

He lied. It is not possible to construct perfectly safe tools, at least not for programming in a Turing-complete language. It is possible to create safer tools. But even state-of-the-art tools, things academics "have been doing forever", or even things they have only just figured out, will not make writing arbitrary parallel code either easy or safe. Easier and/or safer, maybe. Just because we would like parallel programming to be simple doesn't mean it can be.

ejgottl
+1  A: 

One of the lessons of the Connection Machine (a massively parallel SIMD computer with over 65,000 processors) was that parallel programming is really, really hard: so hard that they basically had to ship a programmer with each machine sold!

So please improve the tools to the point where I don't have to care about parallelism; it just works faster/better. Anything less than that will relegate parallel programming to a relatively elite few.

Steven A. Lowe