Interesting.
I will be watching your PR.
On Wed, Nov 18, 2015 at 7:51 AM, 임정택 wrote:
Ted,
I suspect I hit the issue https://issues.apache.org/jira/browse/SPARK-11818
Could you take a look at the issue and verify that it makes sense?
Thanks,
Jungtaek Lim (HeartSaVioR)
2015-11-18 20:32 GMT+09:00 Ted Yu :
Here is related code:
private static void checkDefaultsVersion(Configuration conf) {
  if (conf.getBoolean("hbase.defaults.for.version.skip", Boolean.FALSE))
    return;
  String defaultsVersion = conf.get("hbase.defaults.for.version");
  String thisVersion = VersionInfo.getVersion();
  // aborts startup if the hbase-default.xml found on the classpath was
  // built for a different HBase version than the one actually running
  if (!thisVersion.equals(defaultsVersion))
    throw new RuntimeException(
        "hbase-default.xml file seems to be for an older version of HBase ("
            + defaultsVersion + "), this version is " + thisVersion);
}
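The guard above boils down to a string comparison between the version recorded in hbase-default.xml and the running HBase version, with a skip flag that bypasses it. A dependency-free sketch of that logic (the method and parameter names here mirror the snippet but are my own, not the exact HBase source):

```java
public class VersionGuard {
    // Mirrors checkDefaultsVersion: the skip flag wins; otherwise the
    // hbase-default.xml version must exactly match the running version.
    static boolean defaultsCompatible(boolean skip,
                                      String defaultsVersion,
                                      String thisVersion) {
        if (skip) return true;
        return thisVersion.equals(defaultsVersion);
    }

    public static void main(String[] args) {
        // Mismatch fails the check, which is what aborts the executor.
        System.out.println(defaultsCompatible(false, "0.98.7-hadoop2", "1.0.0"));
        // The skip flag bypasses the check entirely.
        System.out.println(defaultsCompatible(true, "0.98.7-hadoop2", "1.0.0"));
    }
}
```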
I am a bit curious:
HBase depends on HDFS.
Has HDFS support for Mesos been fully implemented?
Last time I checked, there was still work to be done.
Thanks
Ted,
Could you elaborate, please?
I maintain separate HBase and Mesos clusters for some reasons, and
I can only make it work via spark-submit or spark-shell / Zeppelin with a
newly initialized SparkContext.
Thanks,
Jungtaek Lim (HeartSaVioR)
2015-11-17 22:17 GMT+09:00 Ted Yu
I see - your HBase cluster is separate from the Mesos cluster.
I somehow got the (incorrect) impression that the HBase cluster runs on Mesos.
On Tue, Nov 17, 2015 at 7:53 PM, 임정택 wrote:
Ted,
Thanks for the reply.
My fat jar's only Spark-related dependency is spark-core, marked as
"provided".
It seems Spark only adds hbase-common 0.98.7-hadoop2 in the spark-examples
module.
And if there are two hbase-default.xml files on the classpath, shouldn't one
of them be loaded, instead of
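On the classpath question above: when two copies of a resource such as hbase-default.xml sit on the classpath, `getResource` returns the first one in classloader search order, so which copy "wins" depends on jar ordering. A minimal probe with no HBase dependency (the resource probed in `main` is illustrative; with the HBase jars present you would probe "hbase-default.xml" instead):

```java
import java.net.URL;
import java.util.Collections;
import java.util.List;

public class ClasspathProbe {
    // Enumerate every copy of a resource visible on the classpath, in
    // classloader search order; code that calls getResource(...) only
    // ever sees the first entry of this list.
    static List<URL> copiesOf(String resource) throws Exception {
        return Collections.list(
            Thread.currentThread().getContextClassLoader()
                  .getResources(resource));
    }

    public static void main(String[] args) throws Exception {
        // Probe a resource guaranteed to exist on any JVM; if this
        // printed 2 or more for hbase-default.xml, two jars are
        // shipping conflicting copies.
        List<URL> copies = copiesOf("java/lang/Object.class");
        System.out.println("copies found: " + copies.size());
    }
}
```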
Oh, one thing I missed: I built the Spark 1.4.1 cluster on a 6-node
Mesos 0.22.1 H/A (via ZK) cluster.
2015-11-17 18:01 GMT+09:00 임정택 :
Hi all,
I'm evaluating Zeppelin to run a driver which interacts with HBase.
I use a fat jar to include HBase dependencies, and see failures at the
executor level.
I thought it was Zeppelin's issue, but it fails on spark-shell, too.
I loaded the fat jar via the --jars option:
> ./bin/spark-shell --jars
I can make it work from both sides (Zeppelin, spark-shell) by initializing
another SparkContext and running there.
But since this feels like a workaround, I'd love to learn proper ways (or
more elegant workarounds) to resolve this.
Please let me know if you have any suggestions.
Best,
Jungtaek Lim