I had to make a small change to Emre's suggestion above, in order for my
changes to get picked up. This worked for me:
mvn --projects sql/core -DskipTests install #not package
mvn --projects assembly/ -DskipTests install
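For anyone following along, the two-step Maven workflow above can be sketched as a single script. The module paths are the ones from this thread; the flags are standard Maven options:

```shell
#!/usr/bin/env sh
# Sketch of the per-module rebuild workflow from this thread.
# "install" (not "package") publishes the rebuilt module to the local
# ~/.m2 repository, so downstream modules resolve the changed classes.
mvn --projects sql/core -DskipTests install

# The assembly module bundles everything into the runnable jar, so it
# must be rebuilt afterwards for the change to reach spark-submit.
mvn --projects assembly/ -DskipTests install
```

`--projects` (short form `-pl`) restricts the Maven reactor to the listed modules; adding `-am` (`--also-make`) would additionally rebuild anything those modules depend on.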
Pramod
On Tue, May 5, 2015 at 2:36 AM, Iulian Dragoș wrote:
I'm probably the only Eclipse user here, but it seems I have the best
workflow :) At least for me things work as they should: once I imported
projects in the workspace I can build and run/debug tests from the IDE. I
only go to sbt when I need to re-create projects or I want to run the full
test suite.
In addition to Michael's suggestion, in my sbt workflow I also use "~" to
automatically kick off builds and unit tests. For example:
sbt/sbt "~streaming/test-only *BasicOperationsSuite*"
It will automatically detect any file changes in the project and kick off
the compilation and testing.
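For readers unfamiliar with sbt's watch mode, a minimal sketch (the suite pattern is the one from this message; `test-only` is the task name sbt used at the time, later renamed `testOnly`):

```shell
# "~" is sbt's watch prefix: it re-runs the given task every time a
# watched source file changes, giving a fast edit-compile-test loop.
sbt/sbt "~streaming/test-only *BasicOperationsSuite*"

# The same prefix works for any task, e.g. a pure compile loop:
sbt/sbt "~streaming/compile"
```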
So my full w
FWIW... My Spark SQL development workflow is usually to run "build/sbt
sparkShell" or "build/sbt 'sql/test-only '". These commands
start in as little as 30s on my laptop, automatically figure out which
subprojects need to be rebuilt, and don't require the expensive assembly
creation.
On Mon, May
Hi,
Is it really necessary to run "mvn --projects assembly/ -DskipTests
install"? Could you please explain why this is needed?
I got the changes after running "mvn --projects streaming/ -DskipTests
package".
Regards,
Meethu
On Monday 04 May 2015 02:20 PM, Emre Sevinc wrote:
Just to give you an example:
When I was trying to make a small change only to the Streaming component of
Spark, first I built and installed the whole Spark project (this took about
15 minutes on my 4-core, 4 GB RAM laptop). Then, after having changed files
only in Streaming, I ran something like (
No, I just need to build one project at a time. Right now Spark SQL.
Pramod
On Mon, May 4, 2015 at 12:09 AM, Emre Sevinc wrote:
Hello Pramod,
Do you need to build the whole project every time? Generally you don't,
e.g., when I was changing some files that belong only to Spark Streaming, I
was building only the streaming module (of course after having built and
installed the whole project, but that was done only once), and then th
Using the built-in Maven and Zinc, it takes around 10 minutes for each build.
Is that reasonable?
My maven opts looks like this:
$ echo $MAVEN_OPTS
-Xmx12000m -XX:MaxPermSize=2048m
I'm running it as build/mvn -DskipTests package
Should I be tweaking my Zinc/Nailgun config?
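As a point of comparison, a hedged sketch of a leaner configuration based on the Spark build documentation of that era (the sizes are illustrative, not a recommendation; `MaxPermSize` only applies to Java 7 and older):

```shell
# Illustrative JVM settings for the Maven build; tune for your machine.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m"

# build/mvn bootstraps its own Maven, Scala, and Zinc, and starts the
# Zinc incremental-compilation server automatically if it isn't running.
build/mvn -DskipTests package
```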
Pramod
On Sun, May 3, 2
https://spark.apache.org/docs/latest/building-spark.html#building-with-buildmvn
On Sun, May 3, 2015 at 2:54 PM, Pramod Biligiri wrote:
This is great. I didn't know about the mvn script in the build directory.
Pramod
On Fri, May 1, 2015 at 9:51 AM, York, Brennon wrote:
Following what Ted said, if you leverage the `mvn` from within the
`build/` directory of Spark you'll get zinc for free which should help
speed up build times.
On 5/1/15, 9:45 AM, "Ted Yu" wrote:
Pramod:
Please remember to run Zinc so that the build is faster.
Cheers
On Fri, May 1, 2015 at 9:36 AM, Ulanov, Alexander wrote:
Hi Pramod,
For cluster-like tests you might want to use the same code as in mllib's
LocalClusterSparkContext. You can rebuild only the package that you change and
then run this main class.
Best regards, Alexander
Hi Pramod,
If you are using sbt as your build, then you need to run sbt assembly once
and then use sbt ~compile. Also export SPARK_PREPEND_CLASSES=1 in your
shell and on all nodes.
You could maybe try this out?
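As a step-by-step sketch of that suggestion (assuming the sbt launcher shipped in the repo's build/ directory; SPARK_PREPEND_CLASSES makes the launch scripts prefer freshly compiled classes over the stale assembly jar):

```shell
# One-time: build the full assembly jar.
build/sbt assembly

# Make Spark's launch scripts put freshly compiled classes on the
# classpath ahead of the assembly jar. Export this on every node.
export SPARK_PREPEND_CLASSES=1

# Keep an incremental compile loop running while you edit:
build/sbt ~compile
```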
Thanks,
Prashant Sharma
On Fri, May 1, 2015 at 2:16 PM, Pramod Biligiri wrote: