Issue with Spark session - scala

Hi All - I am getting the below error when I execute this Scala (2.13) code in IntelliJ:
import org.apache.spark.sql.SparkSession

object SparkSessionTest {
  def main(args: Array[String]): Unit = {
    SparkSession.clearActiveSession()
    val spark = SparkSession
      .builder()
      .config("spark.master", "local")
      .config("spark.driver.memory", "2048m")
      .config("spark.executor.memory", "2048m")
      .config("spark.driver.bindAddress", "127.0.0.1")
      .config("spark.driver.host", "localhost")
      .getOrCreate()
    println("Spark Version : " + spark.version)
  }
}
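For context, a minimal sketch of the build.sbt this setup would need — the exact version numbers are assumptions, but Scala 2.13 requires Spark 3.2 or later (the first release cross-built for 2.13):

```scala
// build.sbt -- assumed versions; Spark releases before 3.2 only support Scala 2.12
scalaVersion := "2.13.12"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.1"
```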

You don’t seem to have included the error message?