Why does spark-shell fail with a NullPointerException?

I used Spark 1.5.2 with Hadoop 2.6 and had similar problems. I solved them with the following steps (consolidated into a command-prompt sketch after the list):

  1. Download winutils.exe from the repository to some local folder, e.g. C:\hadoop\bin.

  2. Set the HADOOP_HOME environment variable to C:\hadoop.

  3. Create c:\tmp\hive directory (using Windows Explorer or any other tool).

  4. Open a command prompt with administrator rights.

  5. Run C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive (when run from the C: drive, /tmp/hive resolves to the c:\tmp\hive directory created in step 3).
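
Putting steps 2-5 together, a minimal sketch of the elevated command-prompt session (the C:\hadoop path is just the example folder from step 1):

    rem Assumes winutils.exe has already been downloaded to C:\hadoop\bin (step 1).
    rem Persist HADOOP_HOME for new sessions and also set it for the current one:
    setx HADOOP_HOME C:\hadoop
    set HADOOP_HOME=C:\hadoop
    rem Create the Hive scratch directory and make it world-writable:
    mkdir C:\tmp\hive
    C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive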

With that, I still get some warnings, but no ERRORs, and I can run Spark applications just fine.
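
As a quick sanity check (my own sketch, not part of the original steps), you can exercise the Hive-backed sqlContext that the 1.5.x shell creates; if the /tmp/hive permissions are correct, the following runs without the NullPointerException:

    // Typed at the scala> prompt of spark-shell (1.5.x pre-creates sc and sqlContext).
    sc.parallelize(1 to 100).sum()         // basic RDD check
    sqlContext.sql("SHOW TABLES").show()   // touches the Hive scratch dir (/tmp/hive)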


I was facing a similar issue and resolved it by putting winutils.exe inside a bin folder: set HADOOP_HOME to C:\Winutils and place winutils.exe in C:\Winutils\bin (sketched below).
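
A minimal command-prompt sketch of that layout (C:\Winutils is just the example folder from above):

    rem Expected layout (assumption): C:\Winutils\bin\winutils.exe
    setx HADOOP_HOME C:\Winutils
    rem Hadoop's shell utilities look for %HADOOP_HOME%\bin\winutils.exe,
    rem which is why the extra bin subfolder is required.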

The winutils binaries for 64-bit Windows 10 are available at https://github.com/steveloughran/winutils/tree/master/hadoop-2.6.0/bin.

Also ensure that the command prompt has administrative access.

Refer to https://wiki.apache.org/hadoop/WindowsProblems.