How would I filter a dataframe in spark scala based on a map condition?

Hi how’s it going?

Say I have a dataframe, and a dictionary containing mappings as follows:

val df = spark.read
    .format("csv")
    .option("header", "true")
    .load(dbPath + "data" + ".csv")

val cols = df.columns

val lookup_dict = Map("column1" -> "numeric",
                      "column3" -> "date")

I want to filter df down to only the columns whose mapped values are equal to "date".

So in the case above, it would return a single-column dataframe containing "column3", since that column maps to "date" in lookup_dict.

Thank you.

I don't use spark.sql much, but I think you want .select.
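A minimal sketch of that approach: filter the column names through the map first, then pass the survivors to .select. The lookup_dict entries follow the question, and the cols array stands in for df.columns (the names here are assumptions).

```scala
// Mapping from the question ("column3" is the one tagged "date").
val lookup_dict = Map("column1" -> "numeric",
                      "column3" -> "date")

// Stand-in for df.columns; note "column2" has no entry in the map.
val cols = Array("column1", "column2", "column3")

// Keep only the column names whose mapped value is "date".
// Using lookup_dict.get(...).contains(...) skips unmapped columns
// instead of throwing a NoSuchElementException.
val dateCols = cols.filter(c => lookup_dict.get(c).contains("date"))
// dateCols is Array("column3")

// Then select just those columns from the dataframe:
// val filtered = df.select(dateCols.map(df.col): _*)
```

The select call is commented out since it needs your SparkSession, but df.select takes a varargs of Column, so the dateCols.map(df.col): _* expansion hands it exactly the matching columns.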