Run Spark code written in Scala on a Spark cluster

Hello folks,

I have a question about how to run Spark code written in Scala on a Spark cluster.

I have the IntelliJ IDE installed on my laptop, and I am working on some big data Spark POCs written in Scala. My requirement is that the Spark Scala code written in IntelliJ should run on the Spark cluster when I click Run. My Spark cluster resides in the Windows Azure cloud. How can I achieve this?

Please help me with this!



You can achieve this when you build your SparkSession in your application class: alongside the application name, specify where the application will run by setting the master URL,

.appName("Simple Application")
.master("spark://<master-host>:7077")
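Putting that together, here is a minimal sketch of a driver program that targets a remote standalone cluster. The host `spark://<master-host>:7077` is a placeholder you would replace with your Azure master's address, and `SimpleApp` is just an illustrative name:

```scala
import org.apache.spark.sql.SparkSession

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // Build a session pointed at the remote cluster.
    // "spark://<master-host>:7077" is a placeholder for your master URL.
    val spark = SparkSession.builder()
      .appName("Simple Application")
      .master("spark://<master-host>:7077")
      .getOrCreate()

    // Simple sanity check that work is actually distributed to the cluster.
    val count = spark.range(1, 1000).count()
    println(s"Row count: $count")

    spark.stop()
  }
}
```

One caveat: running this directly from IntelliJ only works if the executors on the cluster can load your application classes, so in practice it is more common to leave `.master(...)` out of the code, package the project as a jar, and pass the master URL to `spark-submit` with its `--master` option instead.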