I've filed JIRA SPARK-22055 & SPARK-22054 to port the release scripts
and allow injecting of the RM's key.
>
> On Mon, Sep 18, 2017 at 8:11 PM, Patrick Wendell <patr...@databricks.com>
> wrote:
For the current release - maybe Holden could just sign the artifacts with
her own key manually, if this is a concern. I don't think that would
require modifying the release pipeline, except to just remove/ignore the
existing signatures.
- Patrick
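As a rough sketch of that manual step (the filename is invented for the demo; the gpg line is shown commented out since signing needs the RM's private key, while the checksum part runs anywhere):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"
echo "release bytes" > spark-1.4.1-bin.tgz   # stand-in for a real artifact

# The RM would sign with their own key (requires a private key, so shown only):
#   gpg --armor --detach-sign -u rm@example.com spark-1.4.1-bin.tgz

# Checksums are deterministic and verifiable without any key:
sha512sum spark-1.4.1-bin.tgz > spark-1.4.1-bin.tgz.sha512
sha512sum -c spark-1.4.1-bin.tgz.sha512
```

The detached `.asc` signature and the checksum file would then be uploaded next to the artifact, replacing any existing signatures.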
On Mon, Sep 18, 2017 at 7:56 PM, Reynold Xin
https://github.com/apache/spark/tree/master/dev/create-release
- Patrick
On Mon, Sep 18, 2017 at 6:23 PM, Patrick Wendell <patr...@databricks.com>
wrote:
> One thing we could do is modify the release tooling to allow the key to be
> injected each time, thus allowing any RM to insert t
extra care to make sure that can't happen, even if it
> is an annoyance for the release managers.
>
> On Sun, Sep 17, 2017 at 10:12 PM, Patrick Wendell <patr...@databricks.com>
> wrote:
>
Spark's release pipeline is automated and part of that automation includes
securely injecting this key for the purpose of signing. I asked the ASF to
provide a service account key several years ago but they suggested that we
use a key attributed to an individual even if the process is automated.
I
+1
On Wed, Dec 16, 2015 at 6:15 PM, Ted Yu wrote:
> Ran test suite (minus docker-integration-tests)
> All passed
>
> +1
>
> [INFO] Spark Project External ZeroMQ .. SUCCESS [
> 13.647 s]
> [INFO] Spark Project External Kafka ...
In terms of advertising the status of the release and whether an RC is
likely to go out, the best mechanism I can think of is our current
mechanism of using JIRA and respecting the semantics of a blocker JIRA. We
could do a better job, though, of creating a JIRA dashboard for each release and
I also feel the same as Reynold. I agree we should minimize API breaks and
focus on fixing things around the edges that were mistakes (e.g. exposing
Guava and Akka) rather than any overhaul that could fragment the community.
Ideally a major release is a lightweight process we can do every couple of
Hey Jakob,
The builds in Spark are largely maintained by me, Sean, and Michael
Armbrust (for SBT). For historical reasons, Spark supports both a Maven and
SBT build. Maven is the build of reference for packaging Spark and is used
by many downstream packagers and to build all Spark releases. SBT
I believe this is some bug in our tests. For some reason we are using way
more memory than necessary. We'll probably need to log into Jenkins and
heap dump some running tests and figure out what is going on.
On Mon, Nov 2, 2015 at 7:42 AM, Ted Yu wrote:
> Looks like
I verified that the issue with built binaries being present in the source
release is fixed. Haven't done enough vetting for a full vote, but did
verify that.
On Sun, Oct 25, 2015 at 12:07 AM, Reynold Xin wrote:
> Please vote on releasing the following candidate as Apache
de me w/some specific failures so i can look
> in to them more closely?
>
> On Mon, Oct 19, 2015 at 12:27 PM, Patrick Wendell <pwend...@gmail.com>
> wrote:
> > Hey Shane,
> >
> > It also appears that every Spark build is failing right now. Could it be
> > related
I think many of them are coming from the Spark 1.4 builds:
https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/Spark-1.4-Maven-pre-YARN/3900/console
On Mon, Oct 19, 2015 at 1:44 PM, Patrick Wendell <pwend...@gmail.com> wrote:
> This is what I'
Hey Shane,
It also appears that every Spark build is failing right now. Could it be
related to your changes?
- Patrick
On Mon, Oct 19, 2015 at 11:13 AM, shane knapp wrote:
> worker 05 is back up now... looks like the machine OOMed and needed
> to be kicked.
>
> On Mon,
I would tend to agree with this approach. We should audit all
@Experimental labels before the 1.6 release and clear them out when
appropriate.
- Patrick
On Wed, Oct 14, 2015 at 2:13 AM, Sean Owen wrote:
> Someone asked, is "ML pipelines" stable? I said, no, most of the key
Hi Jakob,
There is a temporary issue with the Scala 2.11 build in SBT. The problem is
this wasn't previously covered by our automated tests so it broke without
us knowing - this has been actively discussed on the dev list in the last
24 hours. I am trying to get it working in our test harness
there was a reason this is hard. Certainly not
> worth building for each PR.
>
> On Mon, Oct 12, 2015 at 5:16 PM, Patrick Wendell <pwend...@gmail.com>
> wrote:
> > We already do automated compile testing for Scala 2.11 similar to Hadoop
> > versions:
> >
> >
I think Daniel is correct here. The source artifact incorrectly includes
jars. It is inadvertent and not part of our intended release process. This
was something I noticed in Spark 1.5.0 and filed a JIRA and was fixed by
updating our build scripts to fix it. However, our build environment was
not
*to not include binaries.
On Sun, Oct 11, 2015 at 9:35 PM, Patrick Wendell <pwend...@gmail.com> wrote:
like they are part of tests, and still nothing to do with Spark binaries.
> Those can and should stay.
>
> On Mon, Oct 12, 2015, 5:35 AM Patrick Wendell <pwend...@gmail.com> wrote:
>
ifacts should be a deterministic function of the
> source at a certain point in time.
>
> I think the concern is about putting Spark binaries or its dependencies
> into a source release. That should not happen, but it is not what has
> happened here.
>
> On Mon, Oct 12, 2015, 6:03 AM
I would push back slightly. The reason we have the PR builds taking so long
is death by a million small things that we add. Doing a full 2.11 compile
is on the order of minutes... it's a nontrivial increase to the build times.
It doesn't seem that bad to me to go back post-hoc once in a while and fix
2.11
the foreseeable future?
>
> Nick
>
>
> On Tue, Oct 6, 2015 at 1:13 AM Patrick Wendell <pwend...@gmail.com> wrote:
>
>> The missing artifacts are uploaded now. Things should propagate in the
>> next 24 hours. If there are still issues past then ping this
Hey Holden,
It would be helpful if you could outline the set of features you'd imagine
being part of Spark in a short doc. I didn't see a README on the existing
repo, so it's hard to know exactly what is being proposed.
As a general point of process, we've typically avoided merging modules into
The missing artifacts are uploaded now. Things should propagate in the next
24 hours. If there are still issues past then ping this thread. Thanks!
- Patrick
On Mon, Oct 5, 2015 at 2:41 PM, Nicholas Chammas wrote:
> Thanks for looking into this Josh.
>
> On Mon,
, Oct 1, 2015 at 11:30 AM, Patrick Wendell <pwend...@gmail.com> wrote:
> Ah - I can update it. Usually I do it after the release is cut. It's
> just a standard 3 month cadence.
>
> On Thu, Oct 1, 2015 at 3:55 AM, Sean Owen <so...@cloudera.com> wrote:
>> My guess is th
Ah - I can update it. Usually I do it after the release is cut. It's
just a standard 3 month cadence.
On Thu, Oct 1, 2015 at 3:55 AM, Sean Owen wrote:
> My guess is that the 1.6 merge window should close at the end of
> November (2 months from now)? I can update it but wanted
Hey Richard,
My assessment (just looked before I saw Sean's email) is the same as
his. The NOTICE file embeds other projects' licenses. If those
licenses themselves have pointers to other files or dependencies, we
don't embed them. I think this is standard practice.
- Patrick
On Thu, Sep 24,
I think it would be a big improvement to get rid of it. It's not how
jars are supposed to be packaged and it has caused problems in many
different contexts over the years.
For me a key step in moving away would be to fully audit/understand
all compatibility implications of removing it. If other
I just added snapshot builds for 1.5. They will take a few hours to
build, but once we get them working they should publish every few hours.
https://amplab.cs.berkeley.edu/jenkins/view/Spark-Packaging
- Patrick
On Mon, Sep 21, 2015 at 10:36 PM, Bin Wang wrote:
> However I find
Hi All,
For pull requests that modify the build, you can now test different
build permutations as part of the pull request builder. To trigger
these, you add a special phrase to the title of the pull request.
Current options are:
[test-maven] - run tests using maven and not sbt
[test-hadoop1.0]
There is already code in place that restricts which tests run
depending on which code is modified. However, changes inside of
Spark's core currently require running all dependent tests. If you
have some ideas about how to improve that heuristic, it would be
great.
- Patrick
On Tue, Aug 25, 2015
Hey All,
Was wondering if people would be willing to avoid merging build
changes until we have put the tests in better shape. The reason is
that build changes are the most likely to cause downstream issues with
the test matrix and it's very difficult to reverse engineer which
patches caused which
Hey Meihua,
If you are a user of Spark, one thing that is really helpful is to run
Spark 1.5 on your workload and report any issues, performance
regressions, etc.
- Patrick
On Mon, Aug 3, 2015 at 11:49 PM, Akhil Das ak...@sigmoidanalytics.com wrote:
I think you can start from here
Yeah the best bet is to use ./build/mvn --force (otherwise we'll still
use your system maven).
- Patrick
On Mon, Aug 3, 2015 at 1:26 PM, Sean Owen so...@cloudera.com wrote:
That statement is true for Spark 1.4.x. But you've reminded me that I
failed to update this doc for 1.5, to say Maven
Hey All,
I got it up and running - it was a newly surfaced bug in the build scripts.
- Patrick
On Wed, Jul 29, 2015 at 6:05 AM, Bharath Ravi Kumar reachb...@gmail.com wrote:
Hey Patrick,
Any update on this front please?
Thanks,
Bharath
On Fri, Jul 24, 2015 at 8:38 PM, Patrick Wendell
Hey All,
I've mostly kept quiet since I am not very active in maintaining this
code anymore. However, it is a bit odd that the project is
split-brained with a lot of the code being on github and some in the
Spark repo.
If the consensus is to migrate everything to github, that seems okay
with me.
Yeah this could make sense - allowing data sources to register a short
name. What mechanism did you have in mind? To use the jar service loader?
The only issue is that there could be conflicts since many of these are
third party packages. If the same name were registered twice I'm not sure
what
Thanks Ted for pointing this out. CC to Ryan and TD
On Tue, Jul 28, 2015 at 8:25 AM, Ted Yu yuzhih...@gmail.com wrote:
Hi,
I noticed that ReceiverTrackerSuite is failing in master Jenkins build for
both hadoop profiles.
The failure seems to start with:
Hi All,
If there is a build break (i.e. a compile issue or consistently
failing test) that somehow makes it into master, the best protocol is:
1. Revert the offending patch.
2. File a JIRA and assign it to the committer of the offending patch.
The JIRA should contain links to broken builds.
noticed the last (1.5) build has a timestamp of 16th July. Have nightly
builds been discontinued since then?
Thanks,
Bharath
On Sun, May 24, 2015 at 1:11 PM, Patrick Wendell pwend...@gmail.com wrote:
Hi All,
This week I got around to setting up nightly builds for Spark on
Jenkins. I'd like
Hi All,
A few times I've been asked about backporting and when to backport and
not backport fix patches. Since I have managed this for many of the
past releases, I wanted to point out the way I have been thinking
about it. If we have some consensus I can put it on the wiki.
The trade off when
I think we should just revert this patch on all affected branches. No
reason to leave the builds broken until a fix is in place.
- Patrick
On Sun, Jul 19, 2015 at 6:03 PM, Josh Rosen rosenvi...@gmail.com wrote:
Yep, I emailed TD about it; I think that we may need to make a change to the
pull
:
Responses inline, with some liberties on ordering.
On Sun, Jul 12, 2015 at 10:32 PM, Patrick Wendell pwend...@gmail.com
wrote:
Hey Sean B,
Would you mind outlining for me how we go about changing this policy -
I think it's outdated and doesn't make much sense. Ideally I'd like to
propose
Hey Sean,
One other thing I'd be okay doing is moving the main text about
nightly builds to the wiki and just having a header called "Nightly
builds" at the end of the downloads page that says "For developers,
Spark maintains nightly builds. More information is available on the
[Spark developer
+1 from me too
On Sat, Jul 18, 2015 at 3:32 AM, Ted Yu yuzhih...@gmail.com wrote:
+1 to removing commit messages.
On Jul 18, 2015, at 1:35 AM, Sean Owen so...@cloudera.com wrote:
+1 to removing them. Sometimes there are 50+ commits because people
have been merging from master into their
One related note here is that we have a Java version of this that is
an abstract class - in the doc it says that it exists more or less to
allow for binary compatibility (it says it's for Java users, but
really Scala could use this also):
Actually the Java one is a concrete class.
On Wed, Jul 15, 2015 at 12:14 PM, Patrick Wendell pwend...@gmail.com wrote:
One related note here is that we have a Java version of this that is
an abstract class - in the doc it says that it exists more or less to
allow for binary compatibility
Hi All,
I'm happy to announce the Spark 1.4.1 maintenance release.
We recommend all users on the 1.4 branch upgrade to
this release, which contains several important bug fixes.
Download Spark 1.4.1 - http://spark.apache.org/downloads.html
Release notes -
This vote passes with 14 +1 (7 binding) votes and no 0 or -1 votes.
+1 (14):
Patrick Wendell
Reynold Xin
Sean Owen
Burak Yavuz
Mark Hamstra
Michael Armbrust
Andrew Or
York, Brennon
Krishna Sankar
Luciano Resende
Holden Karau
Tom Graves
Denny Lee
Sean McNamara
- Patrick
On Wed, Jul 8, 2015 at 10
pretty good to me. Mark it developers-only, not formally
tested by the community, etc.)
On Sun, Jul 12, 2015 at 7:50 PM, Patrick Wendell pwend...@gmail.com wrote:
Hey Sean B.,
Thanks for bringing this to our attention. I think putting them on the
developer wiki would substantially decrease
I think we can close this vote soon. Any additional votes/testing would
be much appreciated!
On Fri, Jul 10, 2015 at 11:30 AM, Sean McNamara
sean.mcnam...@webtrends.com wrote:
+1
Sean
On Jul 8, 2015, at 11:55 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing
Hey Sean B.,
Thanks for bringing this to our attention. I think putting them on the
developer wiki would substantially decrease visibility in a way that
is not beneficial to the project - this feature was specifically
requested by developers from other projects that integrate with Spark.
If the
+1
On Wed, Jul 8, 2015 at 10:55 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here:
http://s.apache.org/spark-1.4.1
The tag to be voted
with exit code 1. See the log4j logs for more
detail. (HiveSparkSubmitSuite.scala:92)
On Tue, Jul 7, 2015 at 8:06 PM, Patrick Wendell pwend...@gmail.com
wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.1!
This release fixes a handful of known issues
Hey All,
The issue that Josh pointed out is not just a test failure, it's an
issue with an important bug fix that was not correctly back-ported
into the 1.4 branch. Unfortunately the overall state of the 1.4 branch
tests on Jenkins was not in great shape so this was missed earlier on.
Given that
Please vote on releasing the following candidate as Apache Spark version 1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here:
http://s.apache.org/spark-1.4.1
The tag to be voted on is v1.4.1-rc4 (commit dbaa5c2):
This vote is cancelled in favor of RC4.
- Patrick
On Tue, Jul 7, 2015 at 12:06 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here:
http
Hey All,
This vote is cancelled in favor of RC3.
- Patrick
On Fri, Jul 3, 2015 at 1:15 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here
-03 17:35 GMT-07:00 Krishna Sankar ksanka...@gmail.com:
Patrick,
I assume an RC3 will be out for folks like me to test the
distribution. As usual, I will run the tests when you have a new
distribution.
Cheers
k/
On Fri, Jul 3, 2015 at 4:38 PM, Patrick Wendell pwend...@gmail.com
wrote
-Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
this also gave the ‘Dependency-reduced POM’ loop
Robin
On 3 Jul 2015, at 23:41, Patrick Wendell pwend...@gmail.com wrote:
What if you use the built-in maven (i.e. build/mvn). It might be that
we require a newer version of maven than
Let's continue the discussion on the other thread relating to the master build.
On Fri, Jul 3, 2015 at 4:13 PM, Patrick Wendell pwend...@gmail.com wrote:
Thanks - it appears this is just a legitimate issue with the build,
affecting all versions of Maven.
On Fri, Jul 3, 2015 at 4:02 PM
:
Doesn't change anything for me.
On Fri, Jul 3, 2015 at 3:45 PM Patrick Wendell pwend...@gmail.com wrote:
Can you try using the built in maven build/mvn...? All of our builds
are passing on Jenkins so I wonder if it's a maven version issue:
https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA
://github.com/apache/spark/commit/bc51bcaea734fe64a90d007559e76f5ceebfea9e
On Fri, Jul 3, 2015 at 4:36 PM, Patrick Wendell pwend...@gmail.com wrote:
Okay I did some forensics with Sean Owen. Some things about this bug:
1. The underlying cause is that we added some code to make the tests
of sub
Can you try using the built in maven build/mvn...? All of our builds
are passing on Jenkins so I wonder if it's a maven version issue:
https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/
- Patrick
On Fri, Jul 3, 2015 at 3:14 PM, Ted Yu yuzhih...@gmail.com wrote:
Please take a look at
, family: mac
Let me nuke it and reinstall maven.
Cheers
k/
On Fri, Jul 3, 2015 at 3:41 PM, Patrick Wendell pwend...@gmail.com wrote:
What if you use the built-in maven (i.e. build/mvn). It might be that
we require a newer version of maven than you have. The release itself
is built with maven
the time of the RC voting is an
interesting topic, Sean I like your most recent proposal. Maybe we can
put that on the wiki or start a DISCUSS thread to cover that topic.
On Tue, Jun 23, 2015 at 10:37 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate
Please vote on releasing the following candidate as Apache Spark version 1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here:
http://s.apache.org/spark-1.4.1
The tag to be voted on is v1.4.1-rc2 (commit 07b95c7):
/
On Tue, Jun 23, 2015 at 10:37 PM, Patrick Wendell pwend...@gmail.com
wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here:
http://s.apache.org/spark-1.4.1
The tag to be voted
something that is definitely being worked on for
1.4.1?
On Wed, Jun 24, 2015 at 6:56 PM, Patrick Wendell pwend...@gmail.com wrote:
Hey Sean,
This is being shipped now because there is a severe bug in 1.4.0 that
can cause data corruption for Parquet users.
There are no blockers targeted
of that before we ask people to
seriously test these bits?
On Wed, Jun 24, 2015 at 8:37 AM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here
Please vote on releasing the following candidate as Apache Spark version 1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here:
http://s.apache.org/spark-1.4.1
The tag to be voted on is v1.4.1-rc1 (commit 60e08e5):
is that it's much more efficient for us as the Spark
maintainers to pay this cost rather than to force a lot of our users
to deal with painful upgrades.
On Sat, Jun 13, 2015 at 1:39 AM, Steve Loughran ste...@hortonworks.com wrote:
On 12 Jun 2015, at 17:12, Patrick Wendell pwend...@gmail.com
I feel this is quite different from the Java 6 decision and personally
I don't see sufficient cause to do it.
I would like to understand though Sean - what is the proposal exactly?
Hadoop 2 itself supports all of the Hadoop 1 API's, so things like
removing the Hadoop 1 variant of sc.hadoopFile,
Hi All,
I'm happy to announce the availability of Spark 1.4.0! Spark 1.4.0 is
the fifth release on the API-compatible 1.X line. It is Spark's
largest release ever, with contributions from 210 developers and more
than 1,000 commits!
A huge thanks go to all of the individuals and organizations
This vote passes! Thanks to everyone who voted. I will get the release
artifacts and notes up within a day or two.
+1 (23 votes):
Reynold Xin*
Patrick Wendell*
Matei Zaharia*
Andrew Or*
Timothy Chen
Calvin Jia
Burak Yavuz
Krishna Sankar
Hari Shreedharan
Ram Sriharsha*
Kousuke Saruta
Sandy Ryza
Hey Hector,
It's not a bad idea. I think we'd want to do this by virtue of
allowing custom repositories, so users can add bintray or others.
- Patrick
On Wed, Jun 10, 2015 at 6:23 PM, Hector Yee hector@gmail.com wrote:
Hi Spark devs,
Is it possible to add jcenter or bintray support for
Hi All,
Thanks for the continued voting! I'm going to leave this thread open
for another few days to continue to collect feedback.
- Patrick
On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
Hey Mike,
Stage IDs are not guaranteed to be sequential because of the way the
DAG scheduler works (IDs only ever increase). In some cases stage ID
numbers are skipped when stages are generated.
Any stage ID that appears in the Spark UI is an actual stage, so if
you see IDs in there, but they are not
Hey All,
Just a request here - it would be great if people could create JIRAs
for any and all merged pull requests. The reason is that when patches
get reverted due to build breaks or other issues, it is very difficult
to keep track of what is going on if there is no JIRA. Here is a list
of 5
I will give +1 as well.
On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin r...@databricks.com wrote:
Let me give you the 1st
+1
On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell pwend...@gmail.com wrote:
Hey all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
exact commit and all
randomSplit
9a88be1 [SPARK-6013] [ML] Add more Python ML examples for spark.ml
2bd4460 [SPARK-7954] [SPARKR] Create SparkContext in sparkRSQL init
On Fri, May 29, 2015 at 4:40 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.0
Please vote on releasing the following candidate as Apache Spark version 1.4.0!
The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=22596c534a38cfdda91aef18aa9037ab101e4251
The release files, including signatures, digests, etc.
Hey all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
exact commit and all other information is correct. (thanks Shivaram
who pointed this out).
On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache
:978)
... but maybe I missed the memo about how to build for Hive? do I
still need another Hive profile?
Other tests, signatures, etc look good.
On Sat, May 30, 2015 at 12:40 AM, Patrick Wendell pwend...@gmail.com
wrote:
Please vote on releasing the following candidate as Apache Spark
Thanks for all the discussion on the vote thread. I am canceling this
vote in favor of RC3.
On Sun, May 24, 2015 at 12:22 AM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.0!
The tag to be voted on is v1.4.0-rc2 (commit
Please vote on releasing the following candidate as Apache Spark version 1.4.0!
The tag to be voted on is v1.4.0-rc3 (commit dd109a8):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=dd109a8746ec07c7c83995890fc2c0cd7a693730
The release files, including signatures, digests, etc.
Hi James,
As I said before that is not a blocker issue for this release, thanks.
Separately, there are some comments in this code review that indicate
you may be facing a bug in your own code rather than with Spark:
https://github.com/apache/spark/pull/5688#issuecomment-104491410
Please follow
Hey jameszhouyi,
Since SPARK-7119 is not a regression from earlier versions, we won't
hold the release for it. However, please comment on the JIRA if it is
affecting you... it will help us prioritize the bug.
- Patrick
On Fri, May 22, 2015 at 8:41 PM, jameszhouyi yiaz...@gmail.com wrote:
We
This vote is cancelled in favor of RC2.
On Tue, May 19, 2015 at 9:10 AM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.0!
The tag to be voted on is v1.4.0-rc1 (commit 777a081):
https://git-wip-us.apache.org/repos/asf?p
Please vote on releasing the following candidate as Apache Spark version 1.4.0!
The tag to be voted on is v1.4.0-rc2 (commit 03fb26a3):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=03fb26a3e50e00739cc815ba4e2e82d71d003168
The release files, including signatures, digests, etc.
Hi All,
This week I got around to setting up nightly builds for Spark on
Jenkins. I'd like feedback on these and if it's going well I can merge
the relevant automation scripts into Spark mainline and document it on
the website. Right now I'm doing:
1. SNAPSHOTs of Spark master and release
, Patrick Wendell pwend...@gmail.com wrote:
Hi All - unfortunately the fix introduced another bug, which is that
fixVersion was not updated properly. I've updated the script and had
one other person test it.
So committers please pull from master again thanks!
- Patrick
On Tue, May 12, 2015
Yes - Spark packages can include non-ASF licenses.
On Sat, May 23, 2015 at 6:16 PM, Debasish Das debasish.da...@gmail.com wrote:
Hi,
Is it possible to include GPL/LGPL code in Spark packages, or must it be
licensed under Apache as well?
I want to expose Professor Tim Davis's LGPL library for
are not getting included.
On Fri, May 22, 2015 at 2:44 PM, Andrew Psaltis psaltis.and...@gmail.com
wrote:
All,
Should all the docs work from
http://people.apache.org/~pwendell/spark-1.4.0-rc1-docs/ ? If so the R API
docs 404.
On Tue, May 19, 2015 at 11:10 AM, Patrick Wendell pwend
, Filter,sortByKey (word count)
2.6. Recommendation (Movielens medium dataset ~1 M ratings) OK
Model evaluation/optimization (rank, numIter, lambda) with itertools
OK
Cheers
k/
On Tue, May 19, 2015 at 9:10 AM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing
Huai
On Tue, May 19, 2015 at 5:10 PM, Patrick Wendell pwend...@gmail.com
wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.0!
The tag to be voted on is v1.4.0-rc1 (commit 777a081):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h
Hey All,
Since we are now voting, please tread very carefully with branch-1.4 merges.
For instance, bug fixes that don't represent regressions from 1.3.X
probably shouldn't be merged unless they are extremely simple
and well reviewed.
As usual mature/core components (e.g. Spark core)
Please vote on releasing the following candidate as Apache Spark version 1.4.0!
The tag to be voted on is v1.4.0-rc1 (commit 777a081):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=777a08166f1fb144146ba32581d4632c3466541e
The release files, including signatures, digests, etc.
AM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version
1.4.0!
The tag to be voted on is v1.4.0-rc1 (commit 777a081):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=777a08166f1fb144146ba32581d4632c3466541e