Set up an automatic PySpark notebook run from Alteryx

I need to schedule a PySpark notebook to run using Alteryx. There seem to be Python and Apache Spark Code tool options available in Alteryx, but no PySpark option.

Can anyone tell me how to set up an automatic PySpark notebook run from Alteryx?

This is a Scala forum. While Spark was written in Scala, and also uses Scala as one of its notebook languages, neither Python nor Alteryx is addressed here.

You likely are looking for an Apache Spark discussion forum.

The Apache Spark Code tool is a code editor that creates an Apache Spark context and executes Apache Spark commands directly from Designer. The tool supports writing code in Python, R, or Scala.

Once an Apache Spark Direct connection is established, the Code Editor activates. Use Insert Code to generate template functions in the editor.

Read Data creates a readAlteryxData function to return the incoming data as an Apache Spark SQL DataFrame.
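
For reference, here is a minimal sketch of what PySpark-style code inside the Apache Spark Code tool might look like, assuming Python is selected as the tool's language and that the readAlteryxData/writeAlteryxData template functions generated by Insert Code are available (treat the function names, anchor numbers, and the "amount" column as assumptions for illustration, not a verified setup):

```python
# Sketch for the Alteryx Apache Spark Code tool with Python selected.
# readAlteryxData / writeAlteryxData are the template functions that the
# tool's Insert Code menu is documented to generate; they are assumptions
# here, not a verified API.
from pyspark.sql import functions as F

# Return the data from incoming connection #1 as a Spark SQL DataFrame.
df = readAlteryxData(1)

# Example transformation: keep rows with a positive "amount" column
# (hypothetical column name) and add a doubled copy of it.
result = (
    df.filter(F.col("amount") > 0)
      .withColumn("amount_doubled", F.col("amount") * 2)
)

# Write the result to output anchor #1 so downstream Alteryx tools can use it.
writeAlteryxData(result, 1)
```

Scheduling the workflow itself would still be handled by Alteryx (for example, the Alteryx Server scheduler), not from within the code.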

Spark uses the programming languages Scala, Java, Python, and R.

This is a Scala programming language forum. You will need to go to the Spark and/or Alteryx forums to resolve these application-level issues.
