JDK bytecode version

Hi,

I’m umming and ahhing about what version of Java bytecode to target for a Scala library…

I maintain a library (Americium) that is intended to be used by both Java and Scala developers.

It has two APIs, one for Java folks using Java collections, Java options, Java function types etc and another for Scala using the obvious Scala equivalents. It’s expected that most clients will pick a single API entry point and work with that, although it is possible to get one form of the API from another to mix-and-match. Cool.

For a long time, I built this library using an explicit version of the Java language, wiring it into the build via javacOptions ++= Seq("-source", javaVersion, "-target", javaVersion).
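In sbt terms, that looked roughly like this (a sketch; javaVersion is assumed to be a val defined elsewhere in the build):

val javaVersion = "1.9" // assumption: defined somewhere in the build
// Note this only affects javac, i.e. the compiled *Java* sources - not the Scala ones.
javacOptions ++= Seq("-source", javaVersion, "-target", javaVersion)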

This started off as 1.8 aka Java 8, then went to 1.9 / Java 9 to support some Java test code.

In the meantime, I’ve been happily building using whatever the latest and greatest JDK is - up to 21 for day to day work, with some experimental work on 22.

I was under the impression that setting javacOptions would also affect Scala code generation, but it has recently come to light that the published JARs are a mixture: Java 8 bytecode (classfile version 52) from the Scala sources, and Java 9 bytecode (classfile version 53) from the Java sources, as explicitly set by javacOptions.
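For anyone wanting to check their own JARs: the classfile major version lives in bytes 6 and 7 of each .class file, after the 0xCAFEBABE magic and the minor version. A quick sketch in Scala (the file name is just a placeholder):

import java.nio.file.{Files, Paths}

// Bytes 0-3: 0xCAFEBABE magic; bytes 4-5: minor version; bytes 6-7: major version (big-endian).
def classfileMajor(path: String): Int = {
  val bytes = Files.readAllBytes(Paths.get(path))
  ((bytes(6) & 0xff) << 8) | (bytes(7) & 0xff)
}

println(classfileMajor("SomeClass.class")) // 52 = Java 8, 53 = Java 9, 61 = Java 17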

To make matters worse, both the production and test Java and Scala code have little dependencies here and there on Java 11 JDK features that have happily compiled because I’ve been developing with a recent JDK. I’m not completely sure what’s going on there, but it seems the bytecode is generated as described above, presumably with an implicit dependency on a more recent JDK runtime library. A dependency on Java 15 has even crept into some test code.

I discovered this recently after wiring java-output-version into scalacOptions, thinking this would just be a nice ‘conform with style’ change, and then finding that a downstream project won’t build against the latest published library - the Scala bytecode has now jumped up to match the Java bytecode version. Playing around with the now consistently set Java version then revealed the implicit JDK runtime dependencies.

What a mess!

Anyway, I’ve made a release that sets the Java version to 17 LTS, switched to doing builds on GitHub, and jemmied the GitHub build to use JDK 17 to get things in order.

I’m taking the view that chasing JDK 8 isn’t worthwhile, as it is likely that shops using Java 8 are either running unmaintained legacy code or have an overly cautious / conservative attitude which would make it unlikely that they would want to use a new library anyway. However, JDK 17 is still fairly recent, and the only report with numbers I’ve seen indicates a split in adoption between JDK 11 LTS and JDK 17 LTS, so perhaps I’m being a bit draconian in expecting users to bump up to 17.

I’m interested to hear from folk responsible for production code as to what they would do in this position or expect as users…

I think you should set the target and release flags in scalacOptions as well: target defines which bytecode version your Scala files will have, and release which version of the JDK standard library will be used while compiling, so you can avoid accidentally using a method that was only added in, say, Java 21.
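For example, a minimal sketch for a Scala 2.13 build (flag spellings as in the compiler’s -help output quoted further down the thread):

scalacOptions ++= Seq(
  "-target:11",  // classfile version of the emitted bytecode (55 = Java 11)
  "-release:11"  // compile against the Java 11 standard library API only
)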

AFAIK both are now replaced by java-output-version, which sets both values at once, since it was assumed that users will always want them in sync.
But I’m not sure if they wanted to revert that, since it seems libraries do need to use the two with different values.
cc @som-snytt

I am probably the worst person to answer… but my two cents.

  1. I do think it is saner to be explicit about the bytecode version and keep it consistent everywhere.
  2. I also agree that Java 8 support is past due.
  3. At work we are already migrating from Java 17 to Java 21.
  4. My little neotypes targets Java 17 because the underlying neo4j-java-driver now requires Java 17 at minimum because reasons.
  5. I would say that if you can support Java 11 with minimal issues, target that; otherwise, target the minimum LTS that your dependencies need.

Thanks all for the responses so far. I used java-output-version rather than the target and release options as I wanted to keep as much as possible in sync.

(As an aside, from what I can see there is nothing stopping me from referencing a JDK library higher than the targeted Java version, though - this seems to be down to the choice of JDK used to do the build. I’d love to be mistaken on this one, so if anyone knows better or knows a fix, please say so…)

@BalmungSan - I’m impressed that you work somewhere that is moving off Java 17 to Java 21. Given that you cite a library that is published using Java 17, perhaps I’m not being so draconian after all. My concern was not to make a ratchet that forces downstream users into an upgrade, but there are many others doing this, I can just participate in the communal subdivided shame - or is it heroism? :laughing:

I did experiment with a split build where test code can go up to Java 17 or higher and production code has to stick at Java 11. It does indeed generate the right bytecodes for the two categories of code, but I’m a bit queasy about testing with a JDK that isn’t the one targeted by the published library.
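The shape of that experiment, roughly (a sketch, not the actual Americium build; assumes sbt 1.x slash syntax):

// Published (Compile) code sticks at Java 11; test code may use Java 17 features.
Compile / scalacOptions += "-java-output-version:11"
Test / scalacOptions += "-java-output-version:17"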

Ironically, Americium is expected to be a test-only dependency for most intended users, but I’d like to establish a good pattern for other projects all the same.

AFAIK, yeah, you are mistaken.
The flag is supposed to ensure that you only reference stuff that is available in the specified Java version. I may be wrong, though.

Me too :sweat_smile:

This is indeed heroism; after all, winners write history, so let’s just win and thus be the good ones :stuck_out_tongue:
Jokes aside, I do think the push to upgrade has to come from somewhere. Ideally that somewhere is giants like Spring, Spark, Typelevel, ZIO, etc. But I can also see them being too careful, given their big codebases and user bases. As such, IMHO, the push may have to start with smaller libraries.

It’s funny how the whole point of Java never breaking backwards compatibility is that upgrades should be easy. And yet nobody wants to upgrade :grimacing:

I went back and did some more digging, and I’m discovering it’s more subtle than I first thought.

The SBT setting javacOptions ++= Seq("-source", javaVersion, "-target", javaVersion) determines the classfile format (loosely the bytecode version, you know what I mean) for the compiled Java sources.

The SBT setting

scalacOptions ++= (CrossVersion.partialVersion(scalaVersion.value) match {
  case Some((2, _)) =>
    Seq("-Xsource:3", s"-java-output-version:$javaVersion")
  case Some((3, _)) =>
    Seq("-explain", s"-java-output-version:$javaVersion")
  case _ => Nil
})

determines the classfile format for the compiled Scala sources. Possibly it overrides what’s set by javacOptions too?

Where it gets interesting is that if the Scala settings leave out java-output-version, then not only does the classfile format of the compiled Scala sources default to Java 8 (52), but the build merrily pulls in runtime dependencies from the JDK used to build - which happens to be JDK 17 - so all those quiet dependencies on java.nio.file.Path.of (55), Optional.ifPresentOrElse (53), String.translateEscapes (59) etc. are referenced.

This all works at runtime when running under a JDK >= 15, but will presumably explode on earlier JDKs in the range [8, 15), despite the classfile formats being 8 - or whatever javacOptions picks up.
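Concretely, something like this compiles cleanly on a JDK 17 toolchain with no java-output-version set, yet fails with NoSuchMethodError / NoClassDefFoundError on the older runtimes the classfile version claims to support:

// Each line compiles under JDK 17 but fails on runtimes older than noted.
val p = java.nio.file.Path.of("tmp", "x")                           // Java 11+
java.util.Optional.of(1).ifPresentOrElse(i => println(i), () => ()) // Java 9+
println("a\\nb".translateEscapes())                                 // Java 15+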

Reinstating java-output-version does pick up the errant references to dependencies that are not available - I’m not sure how it diagnoses this when the build JDK is 17; I speculate that the JDK embeds a ‘defined since version …’ label on the compiled class or method definitions. Or something …

Anyway, this is good news now that the SBT build is belt-and-braces with both javacOptions and scalacOptions - that should flag any accidental referencing of JDK features beyond the targeted JDK version.

Yeah, this behavior is exploited by fs2, for example. The flow interop module can only work on Java 9+, but the whole library is compiled and published using Java 8; meaning those bits are only accessible if the runtime is Java 9+ and will crash if referenced on Java 8 - but if you don’t reference them, the JAR is completely safe.

In order to do that, the publish (and test) pipelines run on Java 17 (IIRC), but they don’t set -java-output-version, leaving the default of Java 8. If they set the flag, that would break - which is what I was referring to previously.
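A minimal sketch of that pattern (not fs2’s actual code): the JVM resolves class references lazily, so a Java 8 classfile that mentions a Java 9+ class loads fine on Java 8 and only blows up if the offending method is actually called there.

object FlowInterop {
  // java.util.concurrent.Flow exists only since Java 9. This method compiles
  // into a Java 8 classfile when java-output-version is left unset on a newer
  // JDK; on a Java 8 runtime it throws NoClassDefFoundError only when invoked.
  def defaultBufferSize(): Int = java.util.concurrent.Flow.defaultBufferSize()
}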

For further options or choices or confusion, there is a newer option, -system, for picking your JDK location (where on JDK 8 you would use -bootclasspath), so -target is undeprecated for that use case.

I agree that it is confusing that -help says

  -release:<release>           Compile for a version of the Java API and target class file. (8,9,10,11,12,13,14,15,16,17,18,19,20,21,[22])
  -target:<target>             Target platform for object files. ([8],9,10,11,12,13,14,15,16,17,18,19,20,21,22)
                                 deprecated: Use -release instead to compile against the correct platform API.

where defaults for -release and -target are not coordinated. Maybe the compiler should warn, or an sbt plugin could warn if you’re publishing with uncoordinated settings.
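Something like this hypothetical sbt check could do it (the task name and the flag parsing are invented for illustration; it only handles the "-flag:value" spelling shown in the -help output above):

lazy val checkJavaTargets = taskKey[Unit]("Fail on uncoordinated -release/-target")

checkJavaTargets := {
  val opts = scalacOptions.value
  def valueOf(flag: String): Option[String] =
    opts.collectFirst { case o if o.startsWith(flag + ":") => o.stripPrefix(flag + ":") }
  (valueOf("-release"), valueOf("-target")) match {
    case (Some(r), Some(t)) if r != t =>
      sys.error(s"-release:$r and -target:$t disagree; pick one version")
    case _ => () // unset or coordinated - fine
  }
}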

(For local testing, I may or may not care. I find class file conflicts while “trying something real quick” to be quite annoying because unnecessary.)

The canonical way for an artifact to support multiple JDK versions is a multi-release JAR. I have no idea if anyone does that.

I just contributed support for “nest mates.” It would be too bad if everyone says, We’re stuck publishing for JDK 8 so we can’t use that.

Ah - I followed the link, and lo - I find ct.sym. So that’s how the magic is performed, at least for the public runtime bits.
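(For the curious: ct.sym is just a ZIP file under $JAVA_HOME/lib holding per-release API signatures. A quick sketch to peek inside it from Scala:)

import java.util.zip.ZipFile
import scala.jdk.CollectionConverters._

// List a few entries; the names encode which JDK releases each signature applies to.
val ct = new ZipFile(sys.props("java.home") + "/lib/ct.sym")
ct.entries.asScala.take(5).foreach(e => println(e.getName))
ct.close()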

(On a trivia note, I found this link: The Anatomy of ct.sym — How javac Ensures Backwards Compatibility - Gunnar Morling and saw the quote hic sunt dracones; that got me thinking that my use of the word ‘draconian’ might be an etymological reference to mythical beasts, but no - there was a person called Draco who was a bit of a hardliner. Well now.)

Many thanks for your collective help. I shall stew over this a bit longer…

I remember the model sentence from our Greek grammar, to the effect that there were many laws but one punishment, viz, death, so the Athenians said the laws of Draco were the laws of a serpent (δράκοντος).

However, Wikipedia gives a nice summary with this little-known factoid: “Instead of oral laws known to a special class, arbitrarily applied and interpreted, all laws were written, thus being made known to all literate citizens as The Scala Specification.”

And that is why, to this day, the only penalty in Scala is thread death (θάνατος).

It’s fitting that so many were eager for a serpentine mascot, coiled like the famous staircase.

To follow up the meme, if you don’t pick a button, you get JDK 21 or 22 API.

I think the usual recommendation is a slow simmer rather than stewing.
