views:

539

answers:

2

After reading some OpenJDK mailing-list entries, it seems that the Oracle developers are currently removing even more things from the closure proposal, because earlier design mistakes in the Java language complicate the introduction of closures to Java.

Considering that Scala's closures are much more powerful than the closures planned for Java 8, I wonder whether it will be possible to, e.g., call a Java method taking a closure from Scala, or define a closure in Java and pass it to a Scala function, etc.?

So will Java closures be represented in bytecode like their Scala counterparts, or differently? Will it be possible to close the functionality gap between Java and Scala closures?

+6  A: 

You would likely be able to do this extremely easily using implicit conversions à la collection.JavaConversions, whether or not they come out of the box.

Of course, this is not obviously so, because it may be the case that Java closures are turned into types which get generated by the JVM at runtime - I seem to recall a Neal Gafter presentation saying something along these lines.
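To illustrate the kind of conversion layer I mean: a small sketch in the style of collection.JavaConversions, converting between Scala functions and one Java SAM type. `ClosureConversions` and the choice of `Runnable` as the stand-in SAM are my own invented examples, not anything that actually ships:

```scala
import scala.language.implicitConversions

// Hypothetical sketch: two-way implicit conversions between Scala
// functions and a Java SAM type (here java.lang.Runnable), in the
// style of scala.collection.JavaConversions.
object ClosureConversions {
  // Wrap a Scala function so it satisfies the Java SAM interface.
  implicit def functionToRunnable(f: () => Unit): Runnable =
    new Runnable { def run(): Unit = f() }

  // And the reverse: view a Runnable as a Scala function value.
  implicit def runnableToFunction(r: Runnable): () => Unit =
    () => r.run()
}
```

With these in scope, a Scala function value can be passed wherever a `Runnable` is expected, and vice versa, without either side knowing about the other's closure representation.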

oxbow_lakes
+11  A: 

I think it's more complicated than assuming there are two groups of stakeholders here. The Oracle people seem to be working mostly independently of the Project Lambda people, occasionally throwing something over the wall that the Project Lambda people then find out about indirectly. (Scala is, of course, the third stakeholder.)

Since the latest Project Lambda proposal is to eliminate function types altogether, and just create some sort of fancy inference for implementing interfaces that have a single abstract method (SAM types), I foresee the following:

  • Calling Scala code that requires a Scala closure will depend entirely on the implementation of the Function* traits (and the implementation of traits in general) -- whether it appears to the Java compiler as a SAM (which it is in Scala-land) or whether the non-abstract methods also appear abstract to the JVM. (I would think they currently do look abstract, since traits are implemented as interfaces, but I know almost nothing about Scala's implementation. This could be a big hurdle to interoperability.)

    Java generics (in particular how to express Int/int/Integer, or Unit/Nothing/void in a generic interface) may also complicate things.

  • Using Scala functions to implement Java SAMs will not be any different than it is now -- you need to create an implicit conversion for the specific interface you wish to implement.
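As a concrete sketch of that second point: one implicit conversion targeting one specific Java SAM, `java.util.Comparator`. The object name `SamConversions` is invented for illustration:

```scala
import java.util.Comparator
import scala.language.implicitConversions

// Sketch: let a two-argument Scala function stand in for
// java.util.Comparator wherever the Java API expects one.
// One such conversion is needed per target SAM interface.
object SamConversions {
  implicit def functionToComparator[A](f: (A, A) => Int): Comparator[A] =
    new Comparator[A] { def compare(x: A, y: A): Int = f(x, y) }
}
```

With the conversion in scope, a plain Scala function can be handed to a Java method taking a `Comparator`:

```scala
import SamConversions._

val xs = java.util.Arrays.asList("bb", "a", "ccc")
val byLength: (String, String) => Int = (a, b) => a.length - b.length
java.util.Collections.sort(xs, byLength) // converted to Comparator[String]
```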

If the JVM gets function types (and Oracle seems not to have eliminated that possibility), it may depend on how they're implemented. If they're first-class objects implementing a particular interface, then all Scala needs to do to be compatible is make Function* implement the new interface. If an entirely new kind of type is added to the JVM, then this could be difficult -- the Scala developers may wrap them using magic like they currently do for Arrays, or they may create implicit conversions. (A new language concept seems a bit far-fetched.)

I hope that one of the results of all of this discussion is that all of the various JVM languages will agree on some standard way to represent closures -- so that Scala, Groovy, JRuby, etc... can all pass closures back and forth with a minimum of hassle.

What's more interesting to me is the proposals for virtual extension methods that will allow the Java Collections API to use lambdas. Depending on how these are implemented, they may greatly simplify some of the binary compatibility problems that we've had to deal with when changing Scala code, and they may help to more easily and efficiently implement traits.
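For a sense of what virtual extension methods would give Java interfaces, here is the trait pattern Scala already supports: one abstract method plus concrete methods defined in terms of it. `MiniCollection` and `FromList` are toy names I made up for the sketch:

```scala
// A toy trait: one abstract method, plus a concrete method implemented
// in terms of it -- roughly the shape a Java interface with virtual
// extension methods ("defender" methods) would take.
trait MiniCollection[A] {
  def foreach(f: A => Unit): Unit // abstract, the SAM-like core

  // Concrete method carried by the trait itself.
  def exists(p: A => Boolean): Boolean = {
    var found = false
    foreach(a => if (p(a)) found = true)
    found
  }
}

// An implementer only needs to supply the one abstract method.
class FromList[A](xs: List[A]) extends MiniCollection[A] {
  def foreach(f: A => Unit): Unit = xs.foreach(f)
}
```

If the JVM learned to host the concrete method directly in the interface, Scala could compile traits like this without generating the separate forwarder/implementation classes it needs today.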

I hope that some of the Scala developers are getting involved and offering their input, but I haven't actually seen any discussion of Scala on the Project Lambda lists, nor any participants who jump out to me as being Scala developers.

Ken Bloom
Thanks! Your answer is great. The best thing imho would be if Oracle would just adopt Scala's FunctionX classes so that things are interoperable without any magic. But I guess they will try anything to just _not_ do that.
soc
@soc: Scala's FunctionX classes work well for a number of Scala-specific reasons: (1) our type system hides the distinction between primitives and boxed types, and can fit `void`/`Unit` and `Nothing` into the type system naturally. (2) Definition-site covariance lets us easily pass `FunctionX`s in a natural way using only generics, and Java developers may feel the need to implement a radically new concept in their type system to get this. (I think that's why Project Lambda is abandoning the idea of a function type.)
Ken Bloom
@soc: That said, I am disturbed by Project Lambda's apparent lack of discussion with other JVM constituencies.
Ken Bloom
Ken, the "problem" with FunctionX is that they are not single-method, which may cause problems as things stand. Also, note that there's no magic with `Array` in Scala since 2.8.0.
Daniel
@Daniel, I'm kinda hoping traits like `FunctionX` can be implemented by public defender methods, then the JVM (which would know something about public defender methods) should recognize that there's only one unimplemented method making `FunctionX` a SAM. As far as `Array`: it still manages to appear in the Scala class hierarchy, and there are implicit conversions to wrap it with the usual `Seq` methods. If the JVM has real function types, the same thing can be done for those.
Ken Bloom
@Ken Just like `Int` appears in the Scala class hierarchy, but Scala's `Array` is fully Java's `Array` nowadays. Of course, there are implicits, but I wouldn't use the word "magic" for that, whereas it certainly could be used to describe arrays on previous versions of Scala.
Daniel