`trait` can extend `abstract class`, but isn't a true JVM subtype

I discovered by accident that a trait can extend an abstract class (after 8 years as a Scala dev!!). Putting aside the wisdom of doing so, it appears that, from a JVM reflection standpoint, the type hierarchy doesn’t exist:

scala> scala.util.Properties.versionNumberString
res0: String = 2.11.12

scala> abstract class Foo {
     |   val blah: String = "foo"
     | }
defined class Foo

scala> trait Bar extends Foo {
     |   val merf: String = "bar"
     | }
defined trait Bar

scala> classOf[Foo].isAssignableFrom(classOf[Bar])
res1: Boolean = false

scala> (new Bar {}).merf // Test to see if field exists in subtype
res2: String = bar

scala> classOf[Bar].getInterfaces
res3: Array[Class[_]] = Array()

scala> classOf[Bar].getSuperclass
res4: Class[_ >: Bar] = null

No superclass or interfaces!

My suspicion is that scalac defers wiring in the `extends` relationship until a concrete type is declared, as hinted by this (confusing) result:

scala> (new Bar{}).getClass.getSuperclass
res5: Class[?0] forSome { type ?0 >: ?0; type ?0 <: Bar } = class Foo
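
A quick follow-up check is consistent with that (not from the session above, so the expected results in the comments are my assumption): the concrete anonymous class gets Foo as its JVM superclass and picks up Bar only as an interface.

// same Foo and Bar definitions as above
val cls = (new Bar {}).getClass
cls.getSuperclass                  // expected: class Foo
cls.getInterfaces.map(_.getName)   // expected to include "Bar"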

Again, I’m very surprised a trait can extend an abstract class at all (why allow it?). But even so, is it unreasonable to expect the type relationship to appear in the JVM class definition?

Any insight into the “correctness” of all this is appreciated. It feels like a major impedance mismatch between the Scala and JVM type systems, so I’m interested in more background. (It’s causing havoc with some Encoder code in Apache Spark, and I’m trying to find a workaround.)

Thanks!

PS: The real issue I’m struggling with is that I have a class something like:

case class Baz[+T <: Foo](t: T)

When some reflection-based serialization code attempts to do:

classOf[Baz[_]].getConstructor(classOf[Bar])

you get:

java.lang.NoSuchMethodException: Baz.<init>(Bar)
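
As far as I can tell (sketch below; the expected results in the comments are my guess, not verified output), getConstructor wants an exact match against the erased constructor signature, and the declared type of t erases to its upper bound Foo:

// the erased primary constructor is Baz(Foo), because T's upper bound is Foo
classOf[Baz[_]].getConstructors.head.getParameterTypes // expected: Array(class Foo)
classOf[Baz[_]].getConstructor(classOf[Foo])           // expected to succeed
classOf[Baz[_]].getConstructor(classOf[Bar])           // expected: NoSuchMethodException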

The JVM doesn’t know anything about types or type hierarchies in Scala (or Java, for that matter), only about JVM classes.

There is nothing in the Scala spec about how language constructs are represented on the JVM, so having strong expectations around that is indeed unwise, unless you’re willing to tangle with unspecified implementation details.

Spark generally tangos with these implementation details quite successfully when the types are straightforward, and it sounds like you’re running into a case where it fails to do so. The workaround, I suppose, is not to have a trait extend an abstract class.
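
For example, here is a minimal sketch of that workaround, assuming Foo can itself become a trait rather than an abstract class (the expected results in the comments are assumptions, not verified output); the relationship then does show up in the JVM class definition:

// in a fresh session
trait Foo { val blah: String = "foo" }
trait Bar extends Foo { val merf: String = "bar" }

classOf[Foo].isAssignableFrom(classOf[Bar]) // expected: true
classOf[Bar].getInterfaces                  // expected to contain interface Foo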

The JVM doesn’t know anything about types or type hierarchies in Scala (or Java, for that matter), only about JVM classes.

What do you mean? The JVM knows superclasses and interfaces and tells you via Java reflection:

Welcome to Scala 2.13.0 (OpenJDK 64-Bit Server VM, Java 1.8.0_222).
Type in expressions for evaluation. Or try :help.

scala> "Yo!".getClass.getSuperclass.getCanonicalName
res6: String = java.lang.Object

scala> "Yo!".getClass.getInterfaces.map(_.getCanonicalName)
res10: Array[String] = Array(java.io.Serializable, java.lang.Comparable, java.lang.CharSequence)

If anyone is interested in the gory details of the actual use case, here’s a bug report and deeper analysis around it.

Those are classes, not types.
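
For instance (a small illustration, not taken from the sessions above): erasure means distinct Scala types can share a single JVM class, and that class is all reflection ever sees.

List(1).getClass == List("a").getClass // expected: true, same runtime class for List[Int] and List[String]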