[ https://issues.apache.org/jira/browse/HIVE-752?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12749039#action_12749039 ]

Todd Lipcon commented on HIVE-752:
----------------------------------

bq. Sure, every feature of Hive should be tested by hudson, but I do not think 
JUnit replaces real testing. I personally test everything on a live cluster 
because the world is full of gotchas. 

Unfortunately this increases the barrier to contribution significantly. As 
feature sets grow it becomes impossible to do this without a full time QA 
staff. So, for now, we need at least continuous smoke testing of all features, 
including HWI imho. When a release candidate is rolled it's time for the 
significant manual testing - asking for that on every commit is pretty 
difficult.

bq. I do think the shims should have some independent JUnit tests. For example, 
what happens when someone tries to compile with 0.21? I am hoping a unit test 
failure would be present in the shims.

The shims should be covered by virtue of their use elsewhere. I think in the 
original ticket I suggested that we get a separate build running on Hudson that 
runs all of the unit tests against all of the supported Hadoop versions - I'm 
not sure who's in charge of the Hive Hudson, but if Cloudera can help get that 
set up, we'd be happy to.

bq. I have much 0_20 angst: (1) all the JMX stuff got renamed and all my cacti 
templates are broken; (2) in general it seems like a lot changed and everyone 
is chasing after it.

+1 :) Also agreed that it is a blocker.

I'm getting on a plane in a couple of hours for a week-long vacation, but I'll 
try to sneak in some time to get this fixed. In the meantime, it would be great 
if you could write up a "manual test plan" for HWI. Its existence isn't even 
mentioned in README.txt, so it's difficult for new contributors to figure out 
what functionality they need to manually verify.

> Encountered ClassNotFound exception when trying HWI server
> ----------------------------------------------------------
>
>                 Key: HIVE-752
>                 URL: https://issues.apache.org/jira/browse/HIVE-752
>             Project: Hadoop Hive
>          Issue Type: Bug
>          Components: Clients
>         Environment: Hadoop 0.18.3
>            Reporter: Venkat Ramachandran
>            Assignee: Edward Capriolo
>            Priority: Blocker
>         Attachments: hive-752.diff
>
>
> Encountered ClassNotFound exception (for class: 
> org.apache.jetty.hive.shims.Jetty18Shims) when trying to start HWI server on 
> Hadoop 18.
> It appears that the class ShimLoader 
> (org.apache.hadoop.hive.shims.ShimLoader) is referring to incorrect classes 
> as below:
> static {
>   JETTY_SHIM_CLASSES.put("0.17", "org.apache.jetty.hive.shims.Jetty17Shims");
>   JETTY_SHIM_CLASSES.put("0.18", "org.apache.jetty.hive.shims.Jetty18Shims");
>   JETTY_SHIM_CLASSES.put("0.19", "org.apache.jetty.hive.shims.Jetty19Shims");
>   JETTY_SHIM_CLASSES.put("0.20", "org.apache.jetty.hive.shims.Jetty20Shims");
> }
> However, I think it should be as below:
> static {
>   JETTY_SHIM_CLASSES.put("0.17", "org.apache.hadoop.hive.shims.Jetty17Shims");
>   JETTY_SHIM_CLASSES.put("0.18", "org.apache.hadoop.hive.shims.Jetty18Shims");
>   JETTY_SHIM_CLASSES.put("0.19", "org.apache.hadoop.hive.shims.Jetty19Shims");
>   JETTY_SHIM_CLASSES.put("0.20", "org.apache.hadoop.hive.shims.Jetty20Shims");
> }
> Wondering if anybody else encountered this.
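The reported fix boils down to keying the shim map on class names in the package that actually contains the classes (org.apache.hadoop.hive.shims rather than org.apache.jetty.hive.shims). A minimal self-contained sketch of that version-keyed lookup pattern follows; the map entries mirror the reporter's corrected block, but the helper method and version parsing here are illustrative, not the actual ShimLoader code:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of a version-keyed shim lookup, assuming the
// corrected class names from the issue description.
public class ShimLookupSketch {
  private static final Map<String, String> JETTY_SHIM_CLASSES = new HashMap<>();
  static {
    JETTY_SHIM_CLASSES.put("0.17", "org.apache.hadoop.hive.shims.Jetty17Shims");
    JETTY_SHIM_CLASSES.put("0.18", "org.apache.hadoop.hive.shims.Jetty18Shims");
    JETTY_SHIM_CLASSES.put("0.19", "org.apache.hadoop.hive.shims.Jetty19Shims");
    JETTY_SHIM_CLASSES.put("0.20", "org.apache.hadoop.hive.shims.Jetty20Shims");
  }

  // Looks up the shim class name for a full Hadoop version string such as
  // "0.18.3", keyed on its major.minor prefix. A real loader would then
  // call Class.forName() on the result; with the wrong package names in
  // the map, that forName() call is what throws ClassNotFoundException.
  public static String shimClassFor(String hadoopVersion) {
    String[] parts = hadoopVersion.split("\\.");
    String majorMinor = parts[0] + "." + parts[1];
    String className = JETTY_SHIM_CLASSES.get(majorMinor);
    if (className == null) {
      throw new IllegalArgumentException(
          "Unsupported Hadoop version: " + hadoopVersion);
    }
    return className;
  }

  public static void main(String[] args) {
    // Matches the reporter's environment (Hadoop 0.18.3).
    System.out.println(shimClassFor("0.18.3"));
  }
}
```

With the original (wrong) map, the lookup itself would still succeed; the failure only surfaces when the returned name is fed to Class.forName(), which is why the bug shows up as ClassNotFoundException at HWI startup rather than at compile time.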

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
