I know why one shouldn't do that. But is there a way to explain to a layman why this is not possible? You can explain this to a layman easily:
Animal animal = new Dog();
A dog is a kind of animal, but a list of dogs is not a list of animals.
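Concretely, the two assignments in question look like this (a minimal sketch, assuming Dog extends Animal):
Animal animal = new Dog();              // compiles: a Dog is an Animal
List<Dog> dogs = new ArrayList<Dog>();
// List<Animal> animals = dogs;         // does not compile: a List<Dog> is not a List<Animal>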
A List<Animal> is an object where you can insert any animal, for example a cat or an octopus. An ArrayList<Dog> isn't.
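A short sketch of that point (Cat and Octopus here are hypothetical Animal subclasses, used only for illustration):
List<Animal> animals = new ArrayList<Animal>();
animals.add(new Cat());      // fine: any Animal may be inserted
animals.add(new Octopus());  // also fine
// An ArrayList<Dog> must only ever hold Dogs, so it cannot stand in for a List<Animal>.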
Suppose you could do this. One of the things that someone handed a List<Animal> would reasonably expect to be able to do is to add a Giraffe to it. What should happen when someone tries to add a Giraffe to animals? A run-time error? That would seem to defeat the purpose of compile-time typing.
Notice that if you have
List<Dog> dogs = new ArrayList<Dog>();
then, if you could do
List<Animal> animals = dogs;
this does not turn dogs into a List<Animal>. The data structure underlying animals is still an ArrayList<Dog>, so if you try to insert an Elephant into animals, you are actually inserting it into an ArrayList<Dog>, which is not going to work (the Elephant is obviously way too big ;-).
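You can even force this situation past the compiler with an unchecked cast, which makes the danger concrete. A sketch (Elephant is a hypothetical Animal subclass):
List<Dog> dogs = new ArrayList<Dog>();
@SuppressWarnings("unchecked")
List<Animal> animals = (List<Animal>) (List<?>) dogs; // forcing what the compiler forbids
animals.add(new Elephant()); // compiles and runs: animals believes it holds Animals
Dog d = dogs.get(0);         // ClassCastException at run time: that element is an Elephant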
By inheriting, you are actually creating a common type for several classes. Here you have a common Animal type. You use it by creating a list whose element type is Animal and keeping values of the inherited types (Dog, Cat, etc.) in it.
Eg:
List<Animal> animals = new ArrayList<Animal>();
animals.add(new Dog());
animals.add(new Cat());
.......
Got it?
Imagine you create a list of Dogs. You then declare this as List<Animal> and hand it to a colleague. He, not unreasonably, believes he can put a Cat in it.
He then gives it back to you, and you now have a list of Dogs, with a Cat in the middle of it. Chaos ensues.
It's important to note that this restriction is there due to the mutability of the list. In Scala (for example), you can declare that a list of Dogs is a list of Animals. That's because Scala lists are (by default) immutable, and so adding a Cat to a list of Dogs would give you a new list of Animals.
The answer you're looking for is to do with concepts called covariance and contravariance. Some languages support these (.NET 4 adds support, for example), but some of the basic problems are demonstrated by code like this:
List<Animal> animals = new List<Dog>();
animals.Add(myDog); // works fine - this is a list of Dogs
animals.Add(myCat); // would compile fine if this were allowed, but would crash!
Because Cat derives from Animal, a compile-time check would suggest that it can be added to a List<Animal>. But, at run time, you can't add a Cat to a list of Dogs!
So, though it may seem intuitively simple, these problems are actually very complex to deal with.
There's an MSDN overview of co/contravariance in .NET 4 here: http://msdn.microsoft.com/en-us/library/dd799517(VS.100).aspx - it's all applicable to Java too, though I don't know what Java's support is like.
What you're trying to do is the following:
List<? extends Animal> animals = new ArrayList<Dog>();
That should work: you can read Animals out of such a list, but the compiler won't let you add anything to it, which is exactly what makes the assignment safe.
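A short sketch of both halves of that guarantee:
List<Dog> dogs = new ArrayList<Dog>();
dogs.add(new Dog());
List<? extends Animal> animals = dogs; // allowed: covariant, read-only view
Animal first = animals.get(0);         // reading as Animal is safe
// animals.add(new Dog());             // does not compile: nothing may be added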
First, let's define our animal kingdom:
interface Animal {
}

class Dog implements Animal {
    Integer dogTag() {
        return 0;
    }
}

class Doberman extends Dog {
}
Consider two parameterized interfaces:
interface Container<T> {
    T get();
}

interface Comparator<T> {
    int compare(T a, T b);
}
And implementations of these where T is Dog.
class DogContainer implements Container<Dog> {
    private Dog dog;

    public Dog get() {
        dog = new Dog();
        return dog;
    }
}

class DogComparator implements Comparator<Dog> {
    public int compare(Dog a, Dog b) {
        return a.dogTag().compareTo(b.dogTag());
    }
}
What you are asking is quite reasonable in the context of this Container interface:
Container<Dog> kennel = new DogContainer();

// Invalid Java because of invariance.
// Container<Animal> zoo = new DogContainer();

// But we can annotate the type argument in the type of zoo
// to make it co-variant.
Container<? extends Animal> zoo = new DogContainer();
So why doesn't Java do this automatically? Consider what this would mean for Comparator.
Comparator<Dog> dogComp = new DogComparator();
// Invalid Java, and nonsensical -- we couldn't use our DogComparator to compare cats!
// Comparator<Animal> animalComp = new DogComparator();
// Invalid Java, because Comparator is invariant in T
// Comparator<Doberman> dobermanComp = new DogComparator();
// So we introduce a contra-variance annotation on the type of dobermanComp.
Comparator<? super Doberman> dobermanComp = new DogComparator();
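A quick usage sketch of that contravariant view, using the classes defined above: a DogComparator can safely compare two Dobermans, because every Doberman is a Dog.
Comparator<? super Doberman> dobermanComp = new DogComparator();
int result = dobermanComp.compare(new Doberman(), new Doberman()); // fine: Dobermans are Dogs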
If Java automatically allowed Container<Dog> to be assigned to Container<Animal>, one would also expect that a Comparator<Dog> could be assigned to a Comparator<Animal>, which makes no sense -- how could a Comparator<Dog> compare two Cats?
So what is the difference between Container and Comparator? Container produces values of the type T, whereas Comparator consumes them. These correspond to covariant and contra-variant usages of the type parameter.
Sometimes the type parameter is used in both positions, making the interface invariant.
interface Adder<T> {
    T plus(T a, T b);
}

Adder<Integer> addInt = new Adder<Integer>() {
    public Integer plus(Integer a, Integer b) {
        return a + b;
    }
};
Adder<? extends Object> aObj = addInt;

// Compile error: plus cannot be called through the wildcard type,
// because the compiler cannot know the exact type argument T.
// aObj.plus(new Object(), new Object());
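For contrast, a quick sketch of using the same Adder at its exact type, where T is invariant and everything checks out:
Integer five = addInt.plus(2, 3); // fine: T is exactly Integer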
For backwards-compatibility reasons, Java defaults to invariance. You must explicitly choose the appropriate variance with ? extends X or ? super X on the types of the variables, fields, parameters, or method returns.
This is a real hassle -- every time someone uses a generic type, they must make this decision! Surely the authors of Container and Comparator should be able to declare this once and for all.
This is called 'Declaration Site Variance', and is available in Scala.
trait Container[+T] { ... }
trait Comparator[-T] { ... }
The best layman answer I can give is this: the designers of generics did not want to repeat the decision made in Java's array type system, which made arrays unsafe.
This is possible with arrays:
Object[] objArray = new String[] { "Hello!" }; // compiles: Java arrays are covariant
objArray[0] = new Object();                    // fails at run time
This code compiles just fine because of the way the array type system works in Java. It would raise an ArrayStoreException at run time.
The decision was made not to allow such unsafe behavior for generics.
See also elsewhere: Java Arrays Break Type Safety, which many consider one of Java's design flaws.
This is a good read and should explain it a little bit more:
http://blogs.msdn.com/csharpfaq/archive/2010/02/16/covariance-and-contravariance-faq.aspx
If you couldn't mutate the list, then your reasoning would be perfectly sound. Unfortunately, a List<> is manipulated imperatively, which means you can change a List<Animal> by adding a new Animal to it. If you were allowed to use a List<Dog> as a List<Animal>, you could wind up with a list that also contains a Cat.
If List<> were incapable of mutation (like in Scala), then you could treat a List<Dog> as a List<Animal>. For instance, C# makes this behavior possible with covariant and contravariant generic type arguments.
This is an instance of the more general Liskov substitution principle.
The fact that mutation causes you an issue here happens elsewhere. Consider the types Square and Rectangle.
Is a Square a Rectangle? Certainly -- from a mathematical perspective.
You could define a Rectangle class which offers readable getWidth and getHeight properties. You could even add methods that calculate its area or perimeter, based on those properties.
You could then define a Square class that subclasses Rectangle and makes both getWidth and getHeight return the same value.
But what happens when you start allowing mutation via setWidth or setHeight?
Now, Square is no longer a reasonable subclass of Rectangle. Mutating one of those properties would have to silently change the other in order to maintain the invariant, and Liskov's substitution principle would be violated. Changing the width of a Square would have an unexpected side effect. In order to remain a square, you would have to change the height as well, but you only asked to change the width!
You can't use your Square whenever you could have used a Rectangle. So, in the presence of mutation, a Square is not a Rectangle!
You could make a new method on Rectangle that knows how to clone the rectangle with a new width or a new height, and then your Square could safely devolve to a Rectangle during the cloning process, but now you are no longer mutating the original value.
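A minimal sketch of that cloning approach (class and method names here are illustrative, not from any standard API):
class Rectangle {
    private final int width;
    private final int height;

    Rectangle(int width, int height) {
        this.width = width;
        this.height = height;
    }

    int getWidth()  { return width; }
    int getHeight() { return height; }
    int area()      { return width * height; }

    // Cloning instead of mutating: no invariant can be silently broken.
    Rectangle withWidth(int newWidth) {
        return new Rectangle(newWidth, height);
    }
}

class Square extends Rectangle {
    Square(int side) {
        super(side, side);
    }
    // withWidth still returns a plain Rectangle: the Square safely
    // "devolves" rather than mutating itself into an invalid state.
}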
Similarly, a List<Dog> cannot be a List<Animal> when its interface empowers you to add new items to the list.
I'd say the simplest answer is to ignore the cats and dogs; they aren't relevant. What's important is the list itself.
List<Dog> and List<Animal> are different types; that Dog derives from Animal has no bearing on this at all.
This statement is invalid
List<Animal> dogs = new List<Dog>();
for the same reason this one is
AnimalList dogs = new DogList();
While Dog may inherit from Animal, the list class generated by List<Animal> doesn't inherit from the list class generated by List<Dog>.
It's a mistake to assume that because two classes are related, using them as generic parameters will make those generic classes also be related. While you could certainly add a dog to a List<Animal>, that doesn't imply that List<Dog> is a subclass of List<Animal>.
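To make that last distinction concrete, here is a short Java sketch of the same point (the C#-style new List<Dog>() above becomes new ArrayList<Dog>() in Java):
List<Animal> zoo = new ArrayList<Animal>();
zoo.add(new Dog());                          // fine: a Dog is an Animal
// List<Animal> dogs = new ArrayList<Dog>(); // still does not compile:
// ArrayList<Dog> is not a subtype of List<Animal>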