Hi all,
I am wondering what the community thinks about the need to remove SSL from
Spark's internal communication (File Server and Broadcast Server).
The problems I see are the following:
1. Each user must have their own keystore/truststore to use for their jobs -
sharing keystores is obviously insecure.
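For concreteness, a per-user setup means roughly the following in each user's
spark-defaults.conf (the spark.ssl.* keys are Spark's documented SSL settings;
the paths and placeholder passwords below are made up for illustration):

spark.ssl.enabled              true
spark.ssl.keyStore             /home/alice/.ssl/keystore.jks
spark.ssl.keyStorePassword     ********
spark.ssl.trustStore           /home/alice/.ssl/truststore.jks
spark.ssl.trustStorePassword   ********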
> scala> fs.delete(new org.apache.hadoop.fs.Path("/tmp/does-not-exist"),
> true)
> res3: Boolean = false
>
> Does that explain your confusion?
>
>
> On Sat, Jan 14, 2017 at 11:37 AM, Marcelo Vanzin
> wrote:
>
> Are you actually seeing a problem or just questioning the code?
>
> If the staging directory does not exist, fs.delete(stagingDirPath, true) won't
> cause failure but just return false.
>
>
> Rostyslav Sotnychenko wrote
> > Hi all!
> >
> > I am a bit confused why the Spark AM and the Client are both trying to
> > delete the staging directory.
> >
> > https://github.com/apache/spark/blob/branch-2.1/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala#L1110
Hi all!
I am a bit confused why the Spark AM and the Client are both trying to delete
the staging directory.
https://github.com/apache/spark/blob/branch-2.1/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala#L1110
https://github.com/apache/spark/blob/branch-2.1/yarn/src/main/scala/org/apache/spark
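For what it's worth, the behavior Marcelo demonstrates above is the documented
Hadoop FileSystem.delete contract: deleting a path that does not exist returns
false rather than throwing, which is why it is safe for both the AM and the
Client to attempt the cleanup. A minimal self-contained sketch (assuming a
plain local Hadoop Configuration; the path and object name are illustrative):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object DoubleDeleteDemo {
  def main(args: Array[String]): Unit = {
    val fs = FileSystem.get(new Configuration())
    val stagingDirPath = new Path("/tmp/spark-staging-demo")

    fs.mkdirs(stagingDirPath)
    // First delete removes the directory and returns true.
    println(fs.delete(stagingDirPath, true))
    // Second delete finds nothing to remove and returns false, without throwing.
    println(fs.delete(stagingDirPath, true))
  }
}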
Hello!
I tried compiling Spark 2.0 with Hive 2.0, but as expected this failed.
So I am wondering whether there are any ongoing discussions about adding
support for Hive 2.x to Spark? I was unable to find any JIRA about this.
Thanks,
Rostyslav
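For context, "compiling Spark 2.0 with Hive 2.0" above would amount to
overriding the Hive version that Spark's build pins, along these lines (the
-Dhive.version override is an illustrative assumption of how one would try it,
not a supported configuration; -Phive and -Phive-thriftserver are the standard
build profiles):

./build/mvn -DskipTests -Phive -Phive-thriftserver -Dhive.version=2.0.0 clean package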
Hello!
I have a question regarding Hive and Spark.
As far as I know, in order to use Hive-on-Spark one needs to compile Spark
without the Hive profile, but that means it won't be possible to access
Hive from normal Spark jobs.
How is the community going to address this issue? Making two different
Spark builds?
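For reference, the two builds being contrasted would be produced with the
standard commands from Spark's build documentation:

# Hive-less build, which Hive-on-Spark expects:
./build/mvn -DskipTests clean package

# Hive-enabled build, needed for Spark jobs to access Hive tables:
./build/mvn -DskipTests -Phive -Phive-thriftserver clean package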
com%3E>,
the second is on some company's JIRA
<https://jira.talendforge.org/browse/TBD-3615>.
The only current workaround I have is upgrading to Spark 1.4.1, but this
isn't a solution.
Does anyone know how to deal with it?
Thanks in advance,
Rostyslav Sotnychenko