How it works...
This section explains how SparkSession serves as the entry point for developing in Spark.
- Starting with Spark 2.0, it is no longer necessary to create a SparkConf and SparkContext to begin development in Spark; creating a SparkSession handles that initialization for you, as the sketch after this bullet shows. Additionally, it is important to note that SparkSession is part of the sql module from pyspark.
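A minimal sketch of this, assuming a local PySpark installation: the pre-2.0 SparkConf/SparkContext boilerplate is replaced by a single SparkSession, which still exposes the underlying SparkContext if you need it.

```python
from pyspark.sql import SparkSession

# One call replaces the old SparkConf + SparkContext setup.
spark = SparkSession.builder.getOrCreate()

# The SparkContext is created for you and remains accessible.
sc = spark.sparkContext
```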
- We can assign properties to our SparkSession, combined in the sketch after this list:
- master: assigns the Spark master URL to run on our local machine with the maximum available number of cores
- appName: assigns a name to the application
- config: assigns 6gb to the spark.executor.memory property
- getOrCreate: ensures that a SparkSession is created if one is not available, and retrieves the existing one if it is available
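Taken together, these properties map onto a single builder chain. This is a minimal sketch; the application name spark_example is a placeholder, not the recipe's original name.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local[*]") \
    .appName("spark_example") \
    .config("spark.executor.memory", "6gb") \
    .getOrCreate()
```

Because getOrCreate reuses a live session, re-running this block in a notebook returns the same SparkSession rather than spawning a second one.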