- Type: Bug
- Resolution: Unresolved
- Priority: P5
- Affects Version/s: 5.0, 6, 7, 8
The main question I have is how type inference works in javac for
cases where GJ-like type inference is actually undecidable. Unfortunately,
with the introduction of variance annotations, the least upper bound
of two types does not always exist. Naturally, this would lead to a
looping compiler. javac doesn't loop but returns arbitrary results; so
I wonder if this is intentional, and if it's intentional, I wonder
what the correct behavior of a Java compiler is in such a case.
Here is an example:
class A<T> {}
class B extends A<B> {}
class C extends A<C> {}

class D {
    <t> t choose(t x, t y) {
        return x;
    }

    void foo() {
        B b = new B();
        C c = new C();
        A<? extends A<? extends A<?>>> x = choose(b, c);
    }
}
The polymorphic method 'choose' of class D expects two parameters of the
same type. When this method is called, the compiler has to infer the
"best" instantiation of the type parameter such that both actual
parameters conform to it. In my example, the compiler has to
compute the least upper bound of B and C. A first approximation is
A<? extends A<?>>; a better type would be A<? extends A<? extends A<?>>>,
and so on. The problem is that javac simply infers the type
A<? extends A<?>>; this choice is arbitrary, and if a programmer
wants a more precise type (as in the program above), they have to force
the instantiation manually by supplying the desired type argument. I
personally think that simply returning *a* solution is not a good idea;
the compiler should try to construct a sufficiently deep type and fail
if the type required by the user is deeper (which never happens if the
compiler's limit is high enough). This could then be sold as an
implementation restriction, which is irrelevant for basically all
hand-written Java programs.
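To make both points concrete, the sketch below reuses the report's class declarations (the `demo` method and its variable names are hypothetical additions). It shows that ever-deeper supertypes of B and C can be written down, so the chain of "better" candidates never terminates, and that supplying the type argument explicitly side-steps javac's inference altogether:

```java
class A<T> {}
class B extends A<B> {}
class C extends A<C> {}

class D {
    <T> T choose(T x, T y) {
        return x;
    }

    void demo() {
        B b = new B();
        C c = new C();

        // Each of these is a valid supertype of both B and C, and each
        // is strictly more precise than the one before it; the chain
        // continues without bound, so no least upper bound exists.
        A<?> s1 = b;
        A<? extends A<?>> s2 = b;
        A<? extends A<? extends A<?>>> s3 = b;

        // Workaround described in the report: give the type argument
        // explicitly, so no inference is performed and the deeper type
        // is accepted.
        A<? extends A<? extends A<?>>> x =
            this.<A<? extends A<? extends A<?>>>>choose(b, c);
    }
}
```

With the explicit type witness, the assignment compiles regardless of which approximation javac's inference would otherwise pick.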
- is blocked by: JDK-8078095 4.10.4: Fix dependency on unspecified "infinite types" in lub (Open)
- relates to: JDK-4929881 What is the type of b?Integer.class:Float.class (Resolved)
- relates to: JDK-8043725 javac fails with StackOverflowException (Closed)