Run Spark code written in Scala on a Spark cluster

Hello folks,

I have a question about how to run Spark code written in Scala on a Spark cluster.

I have the IntelliJ IDE installed on my laptop, and I am working on some Big Data Spark POCs written in Scala. My requirement is that the Spark Scala code written in IntelliJ should run on the Spark cluster when I click Run. My Spark cluster resides in the Windows Azure cloud. How can I achieve this?

Please help me with this!

Thanks

Hi,

You can achieve this by specifying the master URL when you build your SparkSession in your test class. That URL determines where the application will run; the supported formats are listed here: https://spark.apache.org/docs/latest/submitting-applications.html#master-urls

import org.apache.spark.sql.SparkSession

// Point the session at the cluster's standalone master instead of running locally
val spark = SparkSession.builder
  .master("spark://HOST1:PORT1")
  .appName("Simple Application")
  .getOrCreate()
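
To make this concrete, here is a minimal self-contained sketch of what such an application might look like. The master URL is a placeholder (substitute the actual host and port of your Azure cluster's master), and the object name and the tiny count job are just illustrative assumptions, not anything prescribed by Spark:

import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // Placeholder master URL; replace with your cluster's actual address
    val spark = SparkSession.builder
      .master("spark://HOST1:PORT1")
      .appName("Simple Application")
      .getOrCreate()

    // Trivial job to verify the cluster connection: count a small dataset
    import spark.implicits._
    val count = Seq(1, 2, 3, 4, 5).toDS().count()
    println(s"Count: $count")

    spark.stop()
  }
}

One caveat: running directly from IntelliJ this way makes your laptop the driver, so the cluster's executors must be able to reach it over the network, and the Spark version on your classpath should match the cluster's. In practice, many people instead package the application as a JAR and submit it with spark-submit, as described on the page linked above.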