Issue while opening Spark shell

I am trying to start Spark with the command

$ spark-shell

but I get the following warning. How can I fix it?

Warning:

WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.

Solution 1:

You can change the Spark UI port at launch time from the command prompt:

[hadoop@localhost ~]$ spark-shell --conf spark.ui.port=4041

By default, Spark uses port 4040.

Solution 2:

By default, Spark will try to bind port 4040. In your case another Spark process is already bound to 4040.

The following message is not an error; Spark will simply run its UI on port 4041:

WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.

From Spark Documentation:

Every SparkContext launches a web UI, by default on port 4040, that displays useful information about the application.

If multiple SparkContexts are running on the same host, they will bind to successive ports beginning with 4040 (4041, 4042, etc).
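This "try the next port" behavior can be sketched in a few lines of Python. This is an illustrative simulation of the retry logic using plain sockets, not Spark's actual code; the helper name and the starting port are made up for the example (a high port is used so the demo doesn't collide with a real Spark UI):

```python
import socket

def bind_first_free(start_port, max_retries=16):
    """Try start_port, start_port+1, ... like Spark's SparkUI port retry."""
    for offset in range(max_retries):
        port = start_port + offset
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("127.0.0.1", port))
            return s, port          # bound successfully on this port
        except OSError:
            s.close()               # port in use; try the next one
    raise OSError(
        f"failed after {max_retries} retries (starting from {start_port})"
    )

# First "SparkContext" grabs the base port; the second one,
# finding it taken, falls through to the next port.
s1, p1 = bind_first_free(45713)
s2, p2 = bind_first_free(45713)
print(p1, p2)  # the second binder lands one port higher
s1.close()
s2.close()
```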

Solution 3:

The previous answers also helped me start the Spark shell. On further research, I found that Spark makes 16 attempts to auto-allocate a port. Refer to the Spark documentation.


Helpfully, the error message itself suggests the fix: configure an unused port explicitly, or raise the retry limit, and start the Spark shell again.

java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
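Both remedies named in the message can be passed as `--conf` options when launching the shell (the port number 4050 below is just an example; pick any port that is free on your machine):

```shell
# Pin the UI to an explicit, unused port instead of relying on retries:
spark-shell --conf spark.ui.port=4050

# Or raise the number of ports Spark will try beyond the default of 16:
spark-shell --conf spark.port.maxRetries=32
```

The same keys can also be set permanently in `conf/spark-defaults.conf` if you routinely run many shells on one host.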