[
https://issues.apache.org/jira/browse/HIVE-324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12683214#action_12683214
]
Ashish Thusoo commented on HIVE-324:
------------------------------------
I think this should go into the tmp file that is created for the query, as that
gets cleaned up automatically once the query finishes executing. The right way
would be to augment moveWork to include a tmp directory and pass it in from
SemanticAnalyzer, first generating the name there with the getTmpFile() call.
You can look at SemanticAnalyzer.java to see how this is done.
Thanks,
Ashish
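A minimal sketch of that idea, with hypothetical names (the real fix would go through Hive's getTmpFile()/moveWork plumbing and Hadoop's FileSystem API, not java.nio): derive the staging path from a per-query scratch directory, so it disappears with the query's cleanup instead of being hard-coded under a world /tmp the user may not be allowed to write to.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Random;

public class ScratchDirSketch {
    // Hypothetical stand-in for the per-query scratch root that
    // SemanticAnalyzer.getTmpFile() would hand out; NOT Hive's real API.
    static Path queryScratchDir(Path scratchRoot, String queryId) throws IOException {
        Path dir = scratchRoot.resolve(queryId);
        Files.createDirectories(dir);
        return dir;
    }

    // Instead of new Path("/tmp/" + randGen.nextInt()), place the staging
    // directory under the query's own scratch dir, so it is removed along
    // with the query's other tmp files when execution finishes.
    static Path stagingDir(Path queryScratch, Random randGen) throws IOException {
        Path staging = queryScratch.resolve(Integer.toString(randGen.nextInt(Integer.MAX_VALUE)));
        Files.createDirectories(staging);
        return staging;
    }

    public static void main(String[] args) throws IOException {
        Path root = Files.createTempDirectory("hive-scratch");
        Path staging = stagingDir(queryScratchDir(root, "query_0001"), new Random());
        System.out.println(Files.isDirectory(staging)); // staging dir exists
        System.out.println(staging.startsWith(root));   // and lives under the scratch root
    }
}
```

The point is only the placement of the path: everything staged for the load lands under a root the user already owns and that the query's teardown deletes.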
> AccessControlException when load data into table
> ------------------------------------------------
>
> Key: HIVE-324
> URL: https://issues.apache.org/jira/browse/HIVE-324
> Project: Hadoop Hive
> Issue Type: Bug
> Components: Metastore
> Affects Versions: 0.3.0
> Reporter: Min Zhou
> Assignee: Min Zhou
> Priority: Critical
> Fix For: 0.3.0
>
> Attachments: hive.patch
>
>
> When loading data as a non-supergroup Hadoop user, Hadoop throws an
> AccessControlException because Hive tries to perform a write operation in the
> /tmp directory.
> This is obviously not allowed.
> See line 752 in Hive.java:
> Path tmppath = new Path("/tmp/" + randGen.nextInt());
> try {
>     fs.mkdirs(tmppath);
>     ...
> }
> Those lines cause the exception.
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.