Hi SO users,

I'm trying to evolve an API. As part of this evolution I need to change the return type of a method to a subclass (specialize it) so that advanced clients can access the new functionality. Example (ignore the ugly naming):

public interface Entity {
  boolean a();
}

public interface Intf1 {
  Entity entity();
}

public interface Main {
  Intf1 intf();
}

I now want to have ExtendedEntity, Intf2 and Main like this:

public interface ExtendedEntity extends Entity {
  boolean b();
}

public interface Intf2 extends Intf1 {
  ExtendedEntity entity();
}

public interface Main {
  Intf2 intf();
}

However, since a method's return type is part of its signature, clients already compiled against the previous version of the code fail with linkage errors (a method-not-found error, IIRC).

What I would like to do is add a method to Main with a different return type. The two methods (one that returns the supertype and one that returns the subtype) should map to the same implementation method (which returns the subtype). Note - as far as I understand, this is allowed by the JVM, but not by the Java language spec.

My solution, which seems to work, abuses (I have no other word for it) the Java class system to add the required interface.

public interface Main_Backward_Compatible {
  Intf1 intf();
}

public interface Main extends Main_Backward_Compatible {
  Intf2 intf();
}

Now old clients' invokevirtual lookup will resolve to a method with the correct signature (since a method with the old return type still exists in the type hierarchy), and the implementation that actually runs is the one that returns the subtype Intf2.
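To illustrate the mechanism (a minimal, self-contained sketch - MainImpl and the nested toy interfaces are hypothetical names, not from my real API): javac emits a synthetic bridge method with the old descriptor when a covariant override is involved, which is what lets the old-signature call still link:

```java
import java.lang.reflect.Method;
import java.util.Arrays;

public class BridgeDemo {
  interface Intf1 { }
  interface Intf2 extends Intf1 { }

  interface Main_Backward_Compatible { Intf1 intf(); }
  interface Main extends Main_Backward_Compatible { Intf2 intf(); }

  // One implementation body serves both declarations. The compiler
  // generates a synthetic bridge method with the old ()LIntf1; descriptor
  // that delegates to the covariant ()LIntf2; method, so already-compiled
  // callers of Main_Backward_Compatible.intf() still resolve a method.
  static class MainImpl implements Main {
    public Intf2 intf() { return new Intf2() { }; }
  }

  public static void main(String[] args) {
    // Old-client view: call through the backward-compatible interface.
    Main_Backward_Compatible oldClientView = new MainImpl();
    Intf1 result = oldClientView.intf(); // dispatches via the bridge
    System.out.println("old-signature call works: " + (result instanceof Intf2));

    // Reflection shows a bridge method named intf among the public methods.
    boolean bridgePresent = Arrays.stream(MainImpl.class.getMethods())
        .filter(m -> m.getName().equals("intf"))
        .anyMatch(Method::isBridge);
    System.out.println("bridge present: " + bridgePresent);
  }
}
```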

This seems to work. In all the tests I could devise (barring reflection - but I don't care about that bit) it did work.
Will it always work? Is my reasoning (about the invokevirtual) correct?

And another, related, question - are there tools to check "real" binary compatibility? The only ones I've found look at each method by itself, but fail to consider type hierarchy.

Thanks,
Ran.

Edit - Tools I've tried and found "not so good" (do not take into account type hierarchy):

  1. Clirr 0.6.
  2. IntelliJ "APIComparator" plugin.

Edit2 - Of course, my clients are barred from creating implementation classes to my interfaces (think services). However, if you want the example to be complete, think abstract class (for Main) instead of interface.

A: 

It would be simpler not to change the existing interfaces at all. Anyone using your new interface will be writing new code anyway.

Implementations of the existing Main.intf() signature can return an instance of Intf2.
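For instance (a sketch using the toy types from the question; MainImpl is a hypothetical name), the unchanged signature can already hand back the richer type, and new clients simply downcast:

```java
public class CovariantReturnDemo {
  interface Entity { boolean a(); }
  interface ExtendedEntity extends Entity { boolean b(); }
  interface Intf1 { Entity entity(); }
  interface Intf2 extends Intf1 { ExtendedEntity entity(); }
  interface Main { Intf1 intf(); }

  static class MainImpl implements Main {
    public Intf1 intf() {
      // Declared return type stays Intf1, but the instance is an Intf2.
      return new Intf2() {
        public ExtendedEntity entity() {
          return new ExtendedEntity() {
            public boolean a() { return true; }
            public boolean b() { return true; }
          };
        }
      };
    }
  }

  public static void main(String[] args) {
    Main main = new MainImpl();
    Intf1 oldView = main.intf();           // old clients: unchanged code
    Intf2 advanced = (Intf2) main.intf();  // new clients: one cast
    System.out.println(advanced.entity().b());
  }
}
```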

Optionally, you could provide a new accessor that does not require casting:

public interface Main2 extends Main {
  Intf2 intf2();
}
Andy Thomas-Cramer
This just pushes the problem one step further - Main (or Main2) needs to be served from somewhere. It's true I could propagate this all the way upward, but that would touch a huge number of interfaces, none of which is related to the change (factories, services, more factories, more services...).
Ran Biron
You can return an instance of Intf2 from a method declared to return Intf1. You can avoid modification of existing interfaces.
Andy Thomas-Cramer
+1  A: 

This was long enough that I admit I didn't read everything scrupulously, but it seems like you might actually want to leverage generics here. If you parameterize Intf1, I think you can maintain binary compatibility while introducing specializations:

public interface Intf1<T extends Entity> {
  T entity(); //erasure is still Entity so binary compatibility
}

public interface Intf2 extends Intf1<ExtendedEntity> { //if even needed
}

public interface Main {
  Intf1<ExtendedEntity> intf(); //erasure is still Intf1, the raw type
}
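A compilable sketch of the idea (Impl is a hypothetical name): old code keeps using the raw Intf1 and sees plain Entity through erasure, while new code parameterizes and gets ExtendedEntity without any cast:

```java
public class GenericsDemo {
  interface Entity { boolean a(); }
  interface ExtendedEntity extends Entity { boolean b(); }

  interface Intf1<T extends Entity> {
    T entity(); // erasure is still ()LEntity; -> binary compatible
  }

  static class Impl implements Intf1<ExtendedEntity> {
    public ExtendedEntity entity() {
      return new ExtendedEntity() {
        public boolean a() { return true; }
        public boolean b() { return true; }
      };
    }
  }

  @SuppressWarnings("rawtypes")
  public static void main(String[] args) {
    Intf1 raw = new Impl();                   // old client: raw type, warning only
    Entity e = raw.entity();                  // sees only Entity
    Intf1<ExtendedEntity> typed = new Impl(); // new client: parameterized
    System.out.println(e.a() && typed.entity().b());
  }
}
```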

Edit #1: There are some caveats when trying to maintain binary compatibility. See the Generics Tutorial chapters 6 and 10 for more information.

Edit #2:

You can extend this concept to typing Main as well:

public interface Main<T, I extends Intf1<T>> {
    I intf(); //still has the same erasure as it used to, so binary compatible
}

Old clients would then be able to use the raw Main type as they used to with no recompilation needed, and new clients would type their references to Main:

Main<ExtendedEntity, Intf2> myMain = Factory.getMeAMain();
Intf2 intf = myMain.intf();
Mark Peters
I can't do this because I didn't think of it before - Intf1 is already in the public API. Regarding new interfaces - I don't know in advance what I'll want to parameterize, so the next time I need to make a change I'd be in the same situation. Also, I can't retrofit Intf1 as a generic type, since that violates binary compatibility (you can change the generic parameter, but not whether a class is parameterized or not).
Ran Biron
@Ran: I'm pretty sure you're wrong. When the collections APIs were retrofitted with generics in Java 1.5, for example, there was no need to recompile all existing code. So yeah I think you've got that wrong. See http://java.sun.com/j2se/1.5/pdf/generics-tutorial.pdf (The Generics Tutorial) (particularly sections 6.3 and 10) for a discussion of this issue.
Mark Peters
@Mark: Interesting. I'll have to consider this against the other solution (as always, the actual code is more complex than the toy example - I do need to return Intf2 since I'm adding methods to it). However you definitely got a +1 from me :)
Ran Biron
@Ran: I'll add a bit more on that note. As to whether Generics is a suitable alternative for you depends on your actual scenario. You shouldn't use them just because they make what you're trying to do possible...particularly in an API...but if the class hierarchy looks like it should use generics *anyway*, this is a way to move forward.
Mark Peters
@Mark: It just hit me now - if I'm using generics, I'll break source compatibility. This is less serious for me than breaking binary compatibility, but worrisome nonetheless. It's a nice idea anyway.
Ran Biron
@Ran: I'm not sure what you mean by that. At the most you'll get warnings about using raw types.
Mark Peters