JVM performance optimization for Apache Spark

Hi,

We are working on JVM performance optimization for Apache Spark – in that context we are seeing the following Scala issue with JDK 9. Any suggestions are much appreciated.

With JDK 8, spark-shell comes up properly:

unknown484d7ef0b631.attlocal.net:/home/ramki->echo $JAVA_HOME


Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_131)
Type in expressions to have them evaluated.
Type :help for more information.


With a build based on JDK 9, spark-shell emits the following error. All other jtreg tests pass with my build.

unknown484d7ef0b631.attlocal.net:/home/ramki->echo $JAVA_HOME


Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

Failed to initialize compiler: object java.lang.Object in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
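As the error message itself suggests, when the Scala compiler is embedded programmatically (as spark-shell does), the `usejavacp` flag must be set so the compiler picks up the JVM classpath and can locate `java.lang.Object`. A minimal sketch of what that looks like, assuming only the standard `scala.tools.nsc.Settings` API mentioned in the error text:

```scala
import scala.tools.nsc.Settings

object UseJavaCpSketch {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    // Equivalent of passing -usejavacp on the command line:
    // tell the embedded compiler to use the java.class.path
    // when building its boot/class path.
    settings.usejavacp.value = true
    println(s"usejavacp = ${settings.usejavacp.value}")
  }
}
```

Note that this only addresses the classpath lookup; whether Scala 2.11.8 itself runs on a JDK 9 build is a separate question, since that Scala release predates JDK 9.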


Spark performance tuning is the process of adjusting the settings for the memory, cores, and executor instances used by the system. This process helps ensure that Spark performs optimally and prevents resource bottlenecks in Apache Spark.
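For illustration, the memory, core, and instance settings mentioned above correspond to standard Spark configuration properties; a hedged sketch of a `spark-defaults.conf` fragment (the specific values here are placeholders, not recommendations):

```
# Memory allocated to each executor JVM
spark.executor.memory     4g
# CPU cores per executor
spark.executor.cores      2
# Number of executor instances
spark.executor.instances  10
```

The same properties can be passed on the command line via `spark-submit --conf key=value`.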

See: https://data-flair.training/blogs/apache-spark-performance-tuning/