The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- (on Windows)

I am running Spark on Windows 7. When I use Hive, I see the following error:

The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- 

The permissions are set as follows:

C:\tmp>ls -la
total 20
drwxr-xr-x    1 ADMIN Administ        0 Dec 10 13:06 .
drwxr-xr-x    1 ADMIN Administ    28672 Dec 10 09:53 ..
drwxr-xr-x    2 ADMIN Administ        0 Dec 10 12:22 hive

I have also granted "Full control" to all users via Windows -> Properties -> Security -> Advanced.

But I still see the same error. Any help, please? I have checked a number of links; some say this is a bug in Spark 1.5. Is this true?

Thanks Aarthi


Solution 1:

First of all, make sure you are using the correct winutils build for your OS. The next step is permissions.
On Windows, run the following command from cmd:

D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive

This assumes you have already downloaded winutils and set the HADOOP_HOME environment variable.
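Putting the steps above together, a minimal setup sketch might look like the following, assuming winutils.exe was extracted to D:\winutils\bin (adjust the paths to match your machine):

```shell
:: HADOOP_HOME must point at the directory that CONTAINS the bin folder,
:: not at bin itself (persists for future cmd sessions):
setx HADOOP_HOME "D:\winutils"

:: Open the Hive scratch directory to everyone:
D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive

:: Verify the new permissions took effect:
D:\winutils\bin\winutils.exe ls D:\tmp\hive
```

The `ls` output should now show `drwxrwxrwx` for the directory; if it still shows restricted permissions, check that the winutils build matches your OS architecture (32- vs 64-bit).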

Solution 2:

First things first, check your computer's domain. Try

c:\work\hadoop-2.2\bin\winutils.exe ls c:/tmp/hive

If this command reports "access denied" or "FindFileOwnerAndPermission error (1789): The trust relationship between this workstation and the primary domain failed.",

it means your computer's domain controller is not reachable. A possible reason is that you are not on the same VPN as your domain controller. Connect to the VPN and try again.

Then try the solution provided by Viktor or Nishu.