[
https://issues.apache.org/jira/browse/HIVE-4824?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13750226#comment-13750226
]
Eugene Koifman commented on HIVE-4824:
--------------------------------------
Another possibility is to just call HCatCli directly from WebHCat - that would
simplify the architecture and dramatically improve the performance of DDL operations.
One possible issue here is concurrency - Hive code is not completely thread-safe.
We could use a new ClassLoader for each call to HCatCli - this would work around
the concurrency issues and would still be a good step forward. A sketch of the idea follows.
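A minimal sketch of the ClassLoader-per-call idea, not tied to any eventual patch: the HCatCli class name and the -e flag are assumed from the 0.12 layout, and if HCatCli's main() calls System.exit() an alternate entry point or an exit trap would also be needed.
{code:java}
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

public class IsolatedHCatCliRunner {
    // Jars making up the HCat CLI classpath; how these are located is up to WebHCat.
    private final URL[] hcatClasspath;

    public IsolatedHCatCliRunner(URL[] hcatClasspath) {
        this.hcatClasspath = hcatClasspath;
    }

    /**
     * Runs one DDL command through HCatCli inside a throwaway ClassLoader so that
     * any static state in Hive/HCat code is confined to this single call.
     */
    public void runDdl(String ddl) throws Exception {
        // Fresh child loader per call; parent is WebHCat's own loader.
        URLClassLoader isolated = new URLClassLoader(hcatClasspath,
                getClass().getClassLoader());
        ClassLoader previous = Thread.currentThread().getContextClassLoader();
        try {
            Thread.currentThread().setContextClassLoader(isolated);
            // Class name assumed for HCatalog as shipped with Hive 0.12;
            // older releases used a different package.
            Class<?> cli = Class.forName("org.apache.hive.hcatalog.cli.HCatCli",
                    true, isolated);
            Method main = cli.getMethod("main", String[].class);
            main.invoke(null, (Object) new String[] { "-e", ddl });
        } finally {
            Thread.currentThread().setContextClassLoader(previous);
            isolated.close(); // Java 7+
        }
    }
}
{code}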
> make TestWebHCatE2e run w/o requiring installing external hadoop
> ----------------------------------------------------------------
>
> Key: HIVE-4824
> URL: https://issues.apache.org/jira/browse/HIVE-4824
> Project: Hive
> Issue Type: Bug
> Components: HCatalog
> Affects Versions: 0.12.0
> Reporter: Eugene Koifman
> Assignee: Eugene Koifman
>
> Currently WebHCat uses hive/build/dist/hcatalog/bin/hcat to execute DDL
> commands, which in turn uses the hadoop jar command.
> This in turn requires that the HADOOP_HOME env var be defined and point to an
> existing Hadoop install.
> Need to see if we can apply the hive/testutils/hadoop idea here to make WebHCat
> not depend on an external Hadoop.
> This will make unit tests better/easier to write and make the dev/test cycle
> simpler.