Can't start spark-shell on a Windows 10 Spark 3.2.0 install

Issue

When I try to run spark-shell I get a huge error message, which you can see here: https://pastebin.com/8D6RGxUJ

Install

I followed this tutorial, except that I already had Python and Java installed, and I used Spark 3.2.0 instead.

Config: Windows 10 (a quick way to verify these values is sketched just after the list)

  • HADOOP_HOME: C:\hadoop (downloaded from https://github.com/cdarlint/winutils/tree/master/hadoop-3.2.0/bin)

  • JAVA_HOME: C:\PROGRA~2\Java\jre1.8.0_311

  • SPARK_HOME: C:\Spark\spark-3.2.0-bin-hadoop3.2

  • In Path:

    • %SPARK_HOME%\bin
    • %HADOOP_HOME%\bin
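
To make sure these values are actually visible to the JVM, here is a minimal sketch you can paste into spark-shell once it starts (or into any Scala REPL); the variable names are the ones from the list above, nothing else is assumed.

```scala
// Print the environment variables Spark relies on, so a missing or
// mistyped value shows up immediately.
Seq("HADOOP_HOME", "JAVA_HOME", "SPARK_HOME").foreach { name =>
  println(s"$name = ${sys.env.getOrElse(name, "<not set>")}")
}

// Check that the spark and hadoop bin directories really are on the Path seen by the JVM.
val path = sys.env.getOrElse("Path", "")
Seq("spark", "hadoop").foreach { hint =>
  println(s"Path contains a '$hint' entry: ${path.toLowerCase.split(';').exists(_.contains(hint))}")
}
```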

Solution 1:

My guess is that you have to put winutils.exe inside the %SPARK_HOME%\bin folder itself. I discovered this after starting from scratch and following this tutorial!
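
As a rough sanity check after applying the fix, something like the following, pasted into spark-shell, verifies where winutils.exe actually is and runs a trivial job; the two candidate locations are assumed from the paths in the question, so adjust them if your layout differs.

```scala
// Check the two places winutils.exe could live (paths assumed from the question),
// then run a tiny DataFrame job to confirm the install is usable.
import java.nio.file.{Files, Paths}

Seq(
  sys.env.get("SPARK_HOME").map(h => Paths.get(h, "bin", "winutils.exe")),
  sys.env.get("HADOOP_HOME").map(h => Paths.get(h, "bin", "winutils.exe"))
).flatten.foreach(p => println(s"$p exists: ${Files.exists(p)}"))

// If this prints five rows, spark-shell is working end to end.
spark.range(1, 6).selectExpr("id", "id * id AS square").show()
```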