Github user pwendell closed the pull request at:
https://github.com/apache/spark/pull/3130
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-71903029
Closing this to re-open with a new version.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-70231376
Hey All,
I don't see where spark-submit fits into the issue of having a conflicting
Jetty library. If you have an application that requires a conflicting
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-70299385
One case I suppose where it could be relevant is if we only shaded Jetty
inside of our assembly jar. That would only help users of spark-submit (also,
it would be much
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-70321240
Looks like we're on the same page. However I believe this still raises the
question of how to best do the shading itself. It looks like the short-term
solution is to
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-69662311
Hi,
I was wondering if this is still being updated?
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-66189849
Wanted to follow up on this - the priority of getting this done was just
increased for us.
Github user mingyukim commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-65959927
To add a more specific use case of using Spark without spark-submit, we
have a REST server that has a long-running SparkContext, which serves simple
queries like
Github user ash211 commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-65878128
I think a core disconnect here is that the Spark team thought the majority
use of Spark in applications would be through the spark-submit script. But
Matt and I (and
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-65878359
Totally makes sense. I don't think I have enough context in the Spark world
as a whole to suggest a holistic build design, but I agree that this is where
the disconnect
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-65878693
I am also in the camp that wants to use Spark without `spark-submit`. My
broad experience is that it's quite possible to programmatically configure
`SparkContext`
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-65126080
Update on this?
Github user tomerk commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-63707707
I'm also seeing an error in Spark Streaming when using mvn package
-DskipTests:
``` sh
[INFO] compiler plugin:
```
Github user msgehard commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-63398400
I am also trying to do the same thing. We are trying to run Spark Streaming
embedded in another app that serves up data via Spring websockets. If you'd
like someone
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62608315
Any update on this?
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62621917
Personally I'm backlogged trying to get the Scala 2.11 patch in for this
release. This might have to go into master on the 1.3 timeline - since there
are still a bunch
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62113523
[Test build #23045 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23045/consoleFull)
for PR 3130 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62113528
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62116902
[Test build #23046 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23046/consoleFull)
for PR 3130 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62116911
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62195376
@pwendell I tried to run a mvn compile and it broke trying to compile
Spark-SQL. Can you verify you're getting the same behavior? I'm taking your
changes and playing
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62201800
I also built the project with sbt/sbt clean compile assembly, and when
starting the master, I got the following stack trace:
:43:14 ERROR ActorSystemImpl:
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62202566
Can you try clearing your local ivy and maven caches before you build?
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62210767
It passed mvn compile after clearing the caches once, but now it's failing
on make-distribution in packaging GraphX. Do I need to clear the local caches
before each
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62211101
Scratch that, it failed on streaming, not GraphX
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62212257
Nope, you shouldn't have to. I can dig into this more later. I'm fine to also
work backwards from the last patch, but I think we'll end up at the same thing
either way.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-61942278
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-61942275
[Test build #22992 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/22992/consoleFull)
for PR 3130 at commit
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62029607
@ash211 please take a look at this as well. Going to test now.
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62031510
Looks good to me if Spark still works (sorry, won't have a chance to test
this). Could you do a quick check to make sure jetty isn't somehow being
packaged into the
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62039059
@mccheah just a heads up I did add a small thing to this patch to make
SPARK_PREPEND_CLASSES work.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62046549
[Test build #23010 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23010/consoleFull)
for PR 3130 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62046556
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62057049
@mccheah any luck with this? I just made some minor changes after testing
with SPARK_PREPEND_CLASSES.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62057697
[Test build #23017 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23017/consoleFull)
for PR 3130 at commit
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62057981
Working on it. Will let you know shortly.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-6207
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62069990
[Test build #23017 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23017/consoleFull)
for PR 3130 at commit
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62070813
I'm pretty sure this doesn't work when Spark is built with maven. I'm going
to try with sbt, but this is what I've found so far.
I used make-distribution.sh to
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62071738
@mccheah can you describe in more detail how you're running things?
The general contract is that you compile your app against spark-core, using
a provided scope
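The "provided scope" contract described here would look roughly like the
following in an application's pom.xml. This is a sketch, not taken from the
patch itself; the artifact name and version shown are illustrative:

```xml
<!-- Sketch: compile against spark-core, but don't bundle it; at runtime
     the cluster-deployed Spark assembly supplies these classes. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.0</version>
  <scope>provided</scope>
</dependency>
```

Under this model an application's own Jetty never meets Spark's Jetty at
compile time, which is why the disagreement below centers on applications
that embed the spark-core jar directly instead.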
Github user mccheah commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62080459
This is not the way that we pull in the Spark dependency. We launch our
server as a standalone application, and specify that the spark core jar is a
library for the
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62081092
I see. That kind of usage does require the spark-core jar to include the
shaded jetty, which Patrick's patch doesn't do.
It also means that all other Spark
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62106993
Hey @mccheah I didn't realize you were trying to use Spark in this way. If
we want to support this we'd need to shade jetty in our core pom. I think
that's actually
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62107118
[Test build #23045 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23045/consoleFull)
for PR 3130 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62107609
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62108201
Unfortunately this doesn't work super well in relation to inter-project
dependencies. I saw this in the streaming compile:
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62109653
Jenkins, retest this please.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-62109943
[Test build #23046 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23046/consoleFull)
for PR 3130 at commit
GitHub user pwendell opened a pull request:
https://github.com/apache/spark/pull/3130
SPARK-3996: Shade Jetty in Spark deliverables.
This patch relocates Jetty classes inside of the assembly jar in Spark.
It does not modify Spark's disclosed dependency on Jetty, since we
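Relocations of this kind are typically expressed with the maven-shade-plugin.
A minimal sketch of such a configuration follows; the shaded package name
`org.spark-project.jetty` is an assumption for illustration, not necessarily
the one this patch uses:

```xml
<!-- Hypothetical pom.xml fragment: rewrite Jetty class packages inside the
     built jar so they cannot conflict with an application's own Jetty. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.eclipse.jetty</pattern>
            <shadedPattern>org.spark-project.jetty</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Note that relocation rewrites both the class files and the bytecode
references to them, so shaded and unshaded Jetty can coexist on one
classpath.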
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-61935203
@vanzin and @mccheah please take a look. I verified locally that this shades
Jetty correctly in Spark packages.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3130#issuecomment-61935458
[Test build #22992 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/22992/consoleFull)
for PR 3130 at commit