Is it possible to get the current Spark context settings in PySpark?

Solution 1:

Spark 2.1+

spark.sparkContext.getConf().getAll(), where spark is your SparkSession (this returns a list of all configured settings as (key, value) pairs)
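
If you want a dict for quick lookups, you can convert that list of pairs; a minimal sketch, assuming an existing SparkSession named spark:

conf_pairs = spark.sparkContext.getConf().getAll()  # list of (key, value) tuples
conf_dict = dict(conf_pairs)                        # convert to a dict for easy lookups
print(conf_dict.get('spark.app.name'))              # e.g. 'PySparkShell'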

Solution 2:

Yes: sc.getConf().getAll()

Which uses the method:

SparkConf.getAll()

as accessed by

SparkContext.getConf()

For example:

In [4]: sc.getConf().getAll()
Out[4]:
[(u'spark.master', u'local'),
 (u'spark.rdd.compress', u'True'),
 (u'spark.serializer.objectStreamReset', u'100'),
 (u'spark.app.name', u'PySparkShell')]
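
To read a single setting rather than the whole list, SparkConf also has a get() method; a short sketch, assuming sc is the SparkContext of a running shell:

sc.getConf().get('spark.master')                   # e.g. u'local'
sc.getConf().get('spark.executor.memory', '1g')    # the second argument is a fallback default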

Solution 3:

Spark 1.6+ (Scala, e.g. in spark-shell):

sc.getConf.getAll.foreach(println)
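
A rough PySpark equivalent of that Scala one-liner (an addition here, not part of the original answer) would be:

for key, value in sc.getConf().getAll():
    print(key, value)   # print each configured (key, value) pair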

Solution 4:

Update the configuration in Spark 2.3.1

To change the default spark configurations you can follow these steps:

Import the required classes

from pyspark.conf import SparkConf
from pyspark.sql import SparkSession

Get the default configurations

spark.sparkContext._conf.getAll()

Update the default configurations

conf = spark.sparkContext._conf.setAll([('spark.executor.memory', '4g'),
                                        ('spark.app.name', 'Spark Updated Conf'),
                                        ('spark.executor.cores', '4'),
                                        ('spark.cores.max', '4'),
                                        ('spark.driver.memory', '4g')])

Stop the current Spark Session

spark.sparkContext.stop()

Create a Spark Session

spark = SparkSession.builder.config(conf=conf).getOrCreate()
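
Putting these steps together, a minimal end-to-end sketch (the memory and core values are only examples):

from pyspark.conf import SparkConf
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()           # existing or new session
print(spark.sparkContext._conf.getAll())             # inspect the current settings

# Build an updated configuration (example values)
conf = spark.sparkContext._conf.setAll([('spark.executor.memory', '4g'),
                                        ('spark.app.name', 'Spark Updated Conf')])

spark.sparkContext.stop()                            # stop the old context
spark = SparkSession.builder.config(conf=conf).getOrCreate()

# Verify that the new settings took effect
print(spark.sparkContext.getConf().get('spark.app.name'))   # 'Spark Updated Conf'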