- Apache Spark Deep Learning Cookbook
- Ahmed Sherif, Amrith Ravindra
How to do it...
When working with PySpark, a SparkSession must first be imported and initialized before any DataFrame can be created:
- Import a SparkSession using the following script:
from pyspark.sql import SparkSession
- Configure a SparkSession:
spark = SparkSession.builder \
.master("local") \
.appName("Neural Network Model") \
.config("spark.executor.memory", "6gb") \
.getOrCreate()
sc = spark.sparkContext
- In this situation, the SparkSession appName has been set to Neural Network Model and 6gb has been assigned to the executor memory via the spark.executor.memory configuration property.