How would I filter a dataframe in spark scala based on a map condition?

Hi how’s it going?

Say I have a dataframe, and a dictionary containing mappings as follows:

val df = spark.read
    .format("csv")
    .option("sep",",")
    .option("inferSchema","true")
    .option("header","true")
    .load(dbPath + "data" + ".csv")

val cols = df.columns

val lookup_dict = Map("column1" -> "numeric",
                      "column2" -> "string",
                      "column3" -> "date")

I want to filter df down to only the columns whose values in lookup_dict are equal to “date”.

So in the case above, it would return a single-column dataframe with “column3”, because it’s mapped to “date” in lookup_dict.

Thank you.

I don’t use Spark SQL much, but I think you want .select.

https://spark.apache.org/docs/latest/sql-getting-started.html
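
Something along these lines should work (a rough sketch, assuming the keys of lookup_dict match the DataFrame’s column names exactly):

// Collect the names of the columns mapped to "date" in lookup_dict
val dateCols = lookup_dict
  .filter { case (_, colType) => colType == "date" }
  .keys
  .toSeq

// Drop any map entries that don't correspond to an actual column, just in case
val selectable = dateCols.filter(df.columns.contains)

// Select only those columns; for the example map this keeps just "column3"
val dateDf = df.select(selectable.map(df.col): _*)

df.select(selectable.head, selectable.tail: _*) would also work, but it throws if nothing in the map is "date", so the Column-based version above is a bit safer.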