Hi,
We are working on JVM performance optimization for Apache Spark; in that context we are seeing the following Scala issue with JDK 9. Any suggestions are much appreciated.
With JDK 8, spark-shell comes up properly:
unknown484d7ef0b631.attlocal.net:/home/ramki->echo $JAVA_HOME
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.131-2.b11.el7_3.x86_64/jre
unknown484d7ef0b631.attlocal.net:/home/ramki->spark-shell
…
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_131)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
With a build based on JDK 9, spark-shell reports the following error (all other jtreg tests pass with my build):
unknown484d7ef0b631.attlocal.net:/home/ramki->echo $JAVA_HOME
/usr/local/jvm/openjdk-9-internal/
unknown484d7ef0b631.attlocal.net:/home/ramki->spark-shell
…
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
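For reference, one way to follow the message's own hint is to pass it through Spark's driver JVM options; a minimal sketch, with the caveat that whether the Scala REPL honors the scala.usejavacp system property here, and whether it helps at all on JDK 9 (where Scala 2.11.8 predates the new module layout and the removal of rt.jar), are assumptions worth verifying:

```properties
# conf/spark-defaults.conf (or pass via --conf on the spark-shell command line)
# Ask the embedded Scala compiler to use the JVM's own classpath,
# per the "-usejavacp" hint in the error message above.
spark.driver.extraJavaOptions  -Dscala.usejavacp=true
```

Equivalently, on the command line: spark-shell --conf spark.driver.extraJavaOptions=-Dscala.usejavacp=true. Even with the java classpath made visible, 2.11.8's classpath handling may still fail to locate java.lang.Object on JDK 9, since rt.jar no longer exists there.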
--
Thanks,
Ramki