The Curious Case of the Failed Deserialization In the House of SBT

This is biting me when running, under SBT, a test suite that succeeds in IntelliJ.

I’ve minimised the problem down to these files:

build.sbt:

lazy val javaVersion = "17"
lazy val scala2_13_Version = "2.13.18"
lazy val scala3_Version = "3.3.7"

ThisBuild / scalaVersion := scala2_13_Version

lazy val settings = Seq(
  crossScalaVersions := Seq(scala2_13_Version, scala3_Version),
  name               := "americium",
  scalacOptions ++= (CrossVersion.partialVersion(
    scalaVersion.value
  ) match {
    case Some((2, _)) =>
      Seq("-Xsource:3", s"-java-output-version:$javaVersion")
    case Some((3, _)) =>
      Seq("-explain", s"-java-output-version:$javaVersion")

    case _ => Nil
  }),
  javacOptions ++= Seq("-source", javaVersion, "-target", javaVersion),
  Test / test / logLevel := Level.Error,

  libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.19" % Test,
  libraryDependencies += "org.scalatestplus" %% "scalacheck-1-16" % "3.2.14.0" % Test
)

lazy val americium = (project in file("."))
  .settings(settings: _*)
  .disablePlugins(plugins.JUnitXmlReportPlugin)

(The cross-building part isn’t required to reproduce the problem; I just wanted to see whether the choice of 2.13 versus 3 had any bearing. It doesn’t, although there were salient differences in how the error surfaced in an earlier, less minimised reproduction that used test forking.)

Here’s the test:

package com.sageserpent.americium

import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

import _root_.java.io.{
  ByteArrayInputStream,
  ByteArrayOutputStream,
  ObjectInputStream,
  ObjectOutputStream
}
import scala.collection.immutable.SortedMap
import scala.util.Using

object TroubleAtMill {

  def deepCopy[X](original: X): X = {
    Using.resource(new ByteArrayOutputStream()) { outputStream =>
      Using.resource(new ObjectOutputStream(outputStream)) {
        objectOutputStream =>
          objectOutputStream.writeObject(original)

          Using.resource(new ByteArrayInputStream(outputStream.toByteArray)) {
            inputStream =>
              Using.resource(new ObjectInputStream(inputStream)) {
                objectInputStream =>
                  objectInputStream.readObject().asInstanceOf[X]
              }
          }
      }
    }
  }

  case class JackInABox[Caze](caze: Caze)

}

class TroubleAtMill extends AnyFlatSpec with Matchers {
  import TroubleAtMill.*

  "deep copying via Java serialization" should "be OK in all circumstances including under SBT" in {
    println(2 -> JackInABox(99)) // OK.

    println(deepCopy(2 -> JackInABox(99))) // OK.

    println(SortedMap(2 -> false)) // OK

    println(deepCopy(SortedMap(2 -> false))) // OK

    println(SortedMap(2 -> JackInABox(99))) // OK.

    println(
      deepCopy(SortedMap(2 -> JackInABox(99)))
    ) // IntelliJ - OK, SBT - not working for me!
  }

}

Meta-problem disclaimer: yes, I’m using Java serialization and not Kryo this time, because, well, reasons. I’m deep copying because, well, more reasons.

When I run this test in IntelliJ, I see the expected output:

(2,JackInABox(99))
(2,JackInABox(99))
TreeMap(2 -> false)
TreeMap(2 -> false)
TreeMap(2 -> JackInABox(99))
TreeMap(2 -> JackInABox(99))

When I run this in SBT, I see (regardless of whether using 2.13.18 or 3.3.7):

sbt:americium> test
[info] compiling 1 Scala source to /Users/gerardmurphy/IdeaProjects/americium/target/scala-2.13/test-classes ...
(2,JackInABox(99))
(2,JackInABox(99))
TreeMap(2 -> false)
TreeMap(2 -> false)
TreeMap(2 -> JackInABox(99))
[error] Test suite com.sageserpent.americium.TroubleAtMill failed with java.lang.ClassNotFoundException: com.sageserpent.americium.TroubleAtMill$JackInABox.
[error] This may be due to the ClassLoaderLayeringStrategy (ScalaLibrary) used by your task.
[error] To improve performance and reduce memory, sbt attempts to cache the class loaders used to load the project dependencies.
[error] The project class files are loaded in a separate class loader that is created for each test run.
[error] The test class loader accesses the project dependency classes using the cached project dependency classloader.
[error] With this approach, class loading may fail under the following conditions:
[error] 
[error]  * Dependencies use reflection to access classes in your project's classpath.
[error]    Java serialization/deserialization may cause this.
[error]  * An open package is accessed across layers. If the project's classes access or extend
[error]    jvm package private classes defined in a project dependency, it may cause an IllegalAccessError
[error]    because the jvm enforces package private at the classloader level.
[error] 
[error] These issues, along with others that were not enumerated above, may be resolved by changing the class loader layering strategy.
[error] The Flat and ScalaLibrary strategies bundle the full project classpath in the same class loader.
[error] To use one of these strategies, set the ClassLoaderLayeringStrategy key
[error] in your configuration, for example:
[error] 
[error] set americium / Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.ScalaLibrary
[error] set americium / Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat
[error] 
[error] See ClassLoaderLayeringStrategy.scala for the full list of options.
[error] Failed tests:
[error] 	com.sageserpent.americium.TroubleAtMill
[error] (Test / test) sbt.TestsFailedException: Tests unsuccessful

Now I can start messing around with this classLoaderLayeringStrategy, but what is going on here? Should this happen in the first place?

When I was trying to minimise this problem I spent a lot of time on a different exception, where a serialization proxy for SortedMap was failing a runtime type test in ObjectStreamClass.java; that may have just been a consequence of this more fundamental problem (it’s gone in the minimised test).


Well, sticking a Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat into the SBT definition made the problem go away, so at least that’s out of the way.
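For concreteness, the cure is a one-line addition slotted into the build definition above (a sketch of where it goes):

```scala
// build.sbt: collapse the whole test classpath into a single class loader,
// so that deserialization can find the project's own (test) classes again.
Test / classLoaderLayeringStrategy := ClassLoaderLayeringStrategy.Flat
```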

I’m still curious as to what was going on, but it’s not a burning issue any more. :sweat_smile:

(EDIT: today I learned that SBT tries to optimise class-loading times for test code, so it uses layered class loaders. I guess these don’t always play well with Java reflection out of the box, but OK, it works with the appropriate jemmying.)
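The wrinkle can be shown in miniature with no SBT in the picture at all. This sketch (LayerDemo, Boxed, serialize and deserializeVia are all made-up names) forces every class in the stream to resolve through a loader that cannot see the application’s own classes, with the JDK’s platform loader standing in for a class loader layer that lacks the test classes:

```scala
import _root_.java.io.{
  ByteArrayInputStream,
  ByteArrayOutputStream,
  ObjectInputStream,
  ObjectOutputStream,
  ObjectStreamClass
}

object LayerDemo {
  case class Boxed(n: Int)

  def serialize(x: AnyRef): Array[Byte] = {
    val buffer = new ByteArrayOutputStream()
    val out    = new ObjectOutputStream(buffer)
    out.writeObject(x)
    out.close()
    buffer.toByteArray
  }

  // Deserialize, resolving every class through `loader` alone - mimicking a
  // layer that can see the JDK and library classes but not the test classes.
  def deserializeVia(bytes: Array[Byte], loader: ClassLoader): Any = {
    val in = new ObjectInputStream(new ByteArrayInputStream(bytes)) {
      override protected def resolveClass(desc: ObjectStreamClass): Class[_] =
        Class.forName(desc.getName, false, loader)
    }
    try in.readObject()
    finally in.close()
  }

  def main(args: Array[String]): Unit = {
    val bytes = serialize(Boxed(42))
    // The loader that defined Boxed round-trips it happily...
    println(deserializeVia(bytes, getClass.getClassLoader))
    // ...whereas a loader that sees java.* but not Boxed fails with the same
    // ClassNotFoundException as the SBT run above.
    try deserializeVia(bytes, ClassLoader.getPlatformClassLoader)
    catch {
      case e: ClassNotFoundException => println(s"Same failure as under SBT: $e")
    }
  }
}
```

That matches what SBT’s own error text hints at: “Dependencies use reflection to access classes in your project’s classpath. Java serialization/deserialization may cause this.”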


Thanks, TIL “jemmy”. I see you spell it “minimise” like “vise”. (I just realized that “vise” and “vice” might be the spelling upon which all nations must unite.)

I wonder how you pronounce “burning issue”.

I verified it works the same under sbt 2.0.0-RC8 (as we are in the era in which we must ask, which version of sbt).

I would be embarrassed to say I don’t know why deserialization (aka deserialisation, not to be confused with derealisation) has that behavior, if my previous experience with class loaders and serious serialization had not put me beyond that bar. I recall that Scala 2 REPL was aware of sbt class loading.

I “assume” (feel free to mock me with that “ass” word if not the other) that deserialization of X uses the class loader of X and not the context class loader, which is the “layered” loader.
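If that assumption holds, the classic dodge needs no SBT setting at all: an ObjectInputStream that tries the context class loader first and only then falls back to the default behaviour. A sketch (ContextAwareObjectInputStream is a made-up name, and whether the test runner actually installs the layered loader as the context loader is precisely the unverified assumption):

```scala
import _root_.java.io.{InputStream, ObjectInputStream, ObjectStreamClass}

// Resolve classes through the thread's context class loader, falling back to
// ObjectInputStream's default rule when there is no context loader or it
// cannot find the class.
class ContextAwareObjectInputStream(in: InputStream)
    extends ObjectInputStream(in) {
  override protected def resolveClass(desc: ObjectStreamClass): Class[_] = {
    val contextLoader = Thread.currentThread.getContextClassLoader
    if (contextLoader eq null) super.resolveClass(desc)
    else
      try Class.forName(desc.getName, false, contextLoader)
      catch { case _: ClassNotFoundException => super.resolveClass(desc) }
  }
}
```

Swapping this in for the plain `new ObjectInputStream(inputStream)` in `deepCopy` would then pick up whatever loader the runner set as the context one.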

I don’t see anything in the serialisation docs about class loading. I apologise in advance for starting every sentence with “I”, as though it were all about me. In my defence, I reverted to the other spellings with “s”, “c”, and “z”.


We might call the new syntax style of “less fewer braces” simply the “embarrassed style”, which embraces less the radical format.


बुर्निङ् इस्शू (“burning issue”, rendered in Devanagari)

Afsos hai (“alas”), I’m not good with ITRANS. The halant bit isn’t what I’d write, but it’s the best I could do. Had to ratch around to remember how to do that, marra. Got to jemmy in the phonetics, ah?

I like to mock in my tests sometimes, but not on this occasion. :smile:.
