> On Nov. 14, 2014, 7:32 p.m., Marcelo Vanzin wrote:
> > ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveSparkClient.java, line 26
> > <https://reviews.apache.org/r/27987/diff/3/?file=763278#file763278line26>
> >
> >     nit: space before {
> >
> >     Maybe implement Closeable?

Fixed.
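For illustration, a minimal sketch of what implementing Closeable on the
client interface could look like; the interface body below is a placeholder,
not the actual patch:

    // Hypothetical sketch: only close() is shown. The real HiveSparkClient
    // interface in the patch declares the job-submission methods as well.
    import java.io.Closeable;
    import java.io.IOException;
    import java.io.Serializable;

    public interface HiveSparkClient extends Serializable, Closeable {
      // Closeable lets callers release the underlying SparkContext or
      // remote connection with try-with-resources:
      //
      //   try (HiveSparkClient client = ...) {
      //     // submit jobs
      //   } // close() runs automatically here
      @Override
      void close() throws IOException;
    }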
> On Nov. 14, 2014, 7:32 p.m., Marcelo Vanzin wrote:
> > ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveSparkClientFactory.java, line 101
> > <https://reviews.apache.org/r/27987/diff/3/?file=763279#file763279line101>
> >
> >     Doesn't this work?
> >
> >     for (Map.Entry<String, String> entry : hiveConf)

Yes, it works. Fixed.


> On Nov. 14, 2014, 7:32 p.m., Marcelo Vanzin wrote:
> > ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveSparkClientFactory.java, line 107
> > <https://reviews.apache.org/r/27987/diff/3/?file=763279#file763279line107>
> >
> >     Is Hive still using commons-logging? slf4j makes this much better since it handles format strings for you...

Hive still uses commons-logging, so I kept it here.


> On Nov. 14, 2014, 7:32 p.m., Marcelo Vanzin wrote:
> > ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java, line 94
> > <https://reviews.apache.org/r/27987/diff/3/?file=763281#file763281line94>
> >
> >     Don't you get warnings here since JobHandle needs a type parameter?

Actually no warning in my IntelliJ, which is strange. Fixed it anyway.


> On Nov. 14, 2014, 7:32 p.m., Marcelo Vanzin wrote:
> > ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java, line 51
> > <https://reviews.apache.org/r/27987/diff/3/?file=763284#file763284line51>
> >
> >     You could use:
> >
> >     new URI(path).getScheme() != null

Fixed.


> On Nov. 14, 2014, 7:32 p.m., Marcelo Vanzin wrote:
> > ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java, line 55
> > <https://reviews.apache.org/r/27987/diff/3/?file=763284#file763284line55>
> >
> >     You could use:
> >
> >     new File(path).toURI().toURL()

Fixed.


- chengxiang


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/27987/#review61479
-----------------------------------------------------------
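For reference, the reviewer's three code suggestions above, collected into
one illustrative sketch; the class and method names here are mine, not the
actual patch:

    // Illustrative only: the suggested one-liners in compilable form.
    import java.io.File;
    import java.net.MalformedURLException;
    import java.net.URI;
    import java.net.URISyntaxException;
    import java.net.URL;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.hadoop.hive.conf.HiveConf;

    public final class ReviewSuggestions {

      // HiveConf extends Hadoop's Configuration, which implements
      // Iterable<Map.Entry<String, String>>, so the enhanced for-loop
      // works without an explicit iterator.
      static Map<String, String> copySettings(HiveConf hiveConf) {
        Map<String, String> settings = new HashMap<String, String>();
        for (Map.Entry<String, String> entry : hiveConf) {
          settings.put(entry.getKey(), entry.getValue());
        }
        return settings;
      }

      // A path already carries a scheme (e.g. "hdfs://...") exactly when
      // getScheme() returns non-null.
      static boolean hasScheme(String path) throws URISyntaxException {
        return new URI(path).getScheme() != null;
      }

      // Turn a local filesystem path into a URL via its URI form.
      static URL toURL(String path) throws MalformedURLException {
        return new File(path).toURI().toURL();
      }
    }

On the logging point: slf4j would allow LOG.info("loaded {} settings", n)
instead of string concatenation, but since Hive is on commons-logging the
concatenation style stays.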
On Nov. 14, 2014, 3:43 a.m., chengxiang li wrote:
>
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/27987/
> -----------------------------------------------------------
>
> (Updated Nov. 14, 2014, 3:43 a.m.)
>
>
> Review request for hive, Rui Li, Szehon Ho, and Xuefu Zhang.
>
>
> Bugs: HIVE-8833
>     https://issues.apache.org/jira/browse/HIVE-8833
>
>
> Repository: hive-git
>
>
> Description
> -------
>
> Hive should support submitting Spark jobs through both a local Spark client and a remote Spark client. We should unify the Spark client API and implement the remote Spark client through the Remote Spark Context.
>
>
> Diffs
> -----
>
>   ql/pom.xml 06d7f27
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveSparkClient.java PRE-CREATION
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveSparkClientFactory.java PRE-CREATION
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/LocalHiveSparkClient.java PRE-CREATION
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java PRE-CREATION
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkClient.java ee16c9e
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkTask.java 2fea62d
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/SparkUtilities.java e3e6d16
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/session/SparkSessionImpl.java 51e0510
>   ql/src/java/org/apache/hadoop/hive/ql/exec/spark/status/SparkJobRef.java bf43b6e
>   ql/src/java/org/apache/hadoop/hive/ql/optimizer/spark/SetSparkReducerParallelism.java d4d14a3
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClient.java 8346b28
>   spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java 5af66ee
>
> Diff: https://reviews.apache.org/r/27987/diff/
>
>
> Testing
> -------
>
>
> Thanks,
>
> chengxiang li
>
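To make the unified-client shape in the description concrete, a minimal
sketch follows. The class names loosely mirror the files in the diff, but
every signature and body here is an assumption, not the actual patch:

    // Hypothetical sketch of the unified client API; not the actual patch.
    import java.io.Closeable;
    import java.io.IOException;
    import java.io.Serializable;
    import java.util.Map;

    interface UnifiedSparkClient extends Serializable, Closeable {
      // Placeholder for the real submission methods (the patch submits
      // SparkWork and returns a job reference).
      void submit(String jobDescription) throws IOException;
    }

    // In-process implementation: would wrap a live SparkContext.
    class LocalClientSketch implements UnifiedSparkClient {
      public void submit(String jobDescription) { /* run on local SparkContext */ }
      public void close() { /* stop the SparkContext */ }
    }

    // RPC implementation: would talk to a Remote Spark Context via spark-client.
    class RemoteClientSketch implements UnifiedSparkClient {
      public void submit(String jobDescription) { /* send the job over RPC */ }
      public void close() { /* shut down the remote connection */ }
    }

    final class ClientFactorySketch {
      // Choose the in-process client for local masters and the remote client
      // otherwise; the real HiveSparkClientFactory may decide differently.
      static UnifiedSparkClient create(Map<String, String> sparkConf) {
        String master = sparkConf.get("spark.master");
        return (master != null && master.startsWith("local"))
            ? new LocalClientSketch()
            : new RemoteClientSketch();
      }
    }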