On my Windows system, tests fail because Context.getScratchDir fails:

java.io.IOException: Failed to set permissions of path: \HW\project\hive-monarch\itests\qtest\target\tmp\scratchdir\hive_2013-11-28_16-20-53_995_395869757529987755-1 to 0700
        at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:691)
        at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:664)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:290)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:290)
        at org.apache.hadoop.fs.ProxyFileSystem.setPermission(ProxyFileSystem.java:290)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:290)
        at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:212)
        at org.apache.hadoop.hive.ql.Context.getExternalScratchDir(Context.java:272)
        at org.apache.hadoop.hive.ql.Context.getExternalTmpFileURI(Context.java:365)
        ...

This seems related to the platform-specific chmod, which on Windows relies on winutils.exe.
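To illustrate what I think is happening (this is my own standalone sketch of the mechanism, not the actual Hadoop code): setPermission shells out to a platform-specific chmod, and on Windows that delegate is winutils.exe, so if it is missing or broken the call fails and surfaces exactly as the IOException above:

```java
import java.io.IOException;

public class ChmodSketch {

    // My assumption of what FileUtil.setPermission boils down to:
    // pick a platform-specific chmod command line.
    static String[] chmodCommand(boolean isWindows, String perm, String path) {
        if (isWindows) {
            // On Windows, chmod is delegated to winutils.exe,
            // which must be resolvable (e.g. under HADOOP_HOME\bin).
            return new String[] {"winutils.exe", "chmod", perm, path};
        }
        return new String[] {"chmod", perm, path};
    }

    static void setPermission(String path, String perm)
            throws IOException, InterruptedException {
        boolean win = System.getProperty("os.name").startsWith("Windows");
        Process p = new ProcessBuilder(chmodCommand(win, perm, path)).start();
        if (p.waitFor() != 0) {
            // A non-zero exit (or a missing winutils.exe) surfaces as
            // "Failed to set permissions of path: ... to 0700".
            throw new IOException(
                "Failed to set permissions of path: " + path + " to " + perm);
        }
    }
}
```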

But I don't understand how Hive testing decides which Hadoop shim to use. ShimLoader.getMajorVersion uses VersionInfo.getVersion(); in the debugger I can see this returns "1.2.1", and accordingly the -core-1.2.1 JARs from my Maven cache are used. This seems to ignore my current HADOOP_HOME and any CLASSPATH I try to set. The loaded JARs do not work properly on Windows, and that causes my problems.
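As far as I can tell, the selection boils down to parsing that version string into a shim name. Here is a standalone sketch of my understanding (the "0.20S"/"0.23" mapping is my reading of the shim code, not verified):

```java
public class ShimVersionSketch {

    // Rough equivalent of ShimLoader.getMajorVersion(): map the string
    // returned by VersionInfo.getVersion() to a shim identifier.
    static String majorVersion(String hadoopVersion) {
        String[] parts = hadoopVersion.split("\\.");
        switch (Integer.parseInt(parts[0])) {
            case 0:
                return parts[0] + "." + parts[1]; // e.g. "0.20", "0.23"
            case 1:
                return "0.20S"; // Hadoop 1.x -> secure 0.20 shims
            case 2:
                return "0.23";  // Hadoop 2.x -> 0.23 shims
            default:
                throw new IllegalArgumentException(
                    "Unrecognized Hadoop version: " + hadoopVersion);
        }
    }

    public static void main(String[] args) {
        // With the 1.2.1 JARs from my Maven cache on the classpath,
        // this is the path that gets taken:
        System.out.println(majorVersion("1.2.1"));
    }
}
```

The point being: the decision is driven entirely by whichever hadoop-core JAR is on the test classpath, not by HADOOP_HOME.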

How can I control which shims are used when I run mvn test?

Thanks,
~Remus
