Re: If you use Spark 1.5 and disabled Tungsten mode ...

2015-11-01 Thread Reynold Xin
Thanks for reporting it, Sjoerd. You might have a different version of Janino brought in from somewhere else. This should fix your problem: https://github.com/apache/spark/pull/9372 Can you give it a try? On Tue, Oct 27, 2015 at 9:12 PM, Sjoerd Mulder wrote: > No the

unscribe

2015-11-01 Thread Chenxi Li
unscribe

Re: Downloading Hadoop from s3://spark-related-packages/

2015-11-01 Thread Steve Loughran
On 1 Nov 2015, at 03:17, Nicholas Chammas wrote: https://s3.amazonaws.com/spark-related-packages/ spark-ec2 uses this bucket to download and install HDFS on clusters. Is it owned by the Spark project or by the AMPLab? Anyway, it

Re: Spark 1.6 Release Schedule

2015-11-01 Thread Sean Owen
I like the idea, but I think there's already a lot of triage backlog. Can we more concretely address this now and during the next two weeks? 1.6.0 stats from JIRA: 344 issues targeted at 1.6.0, of which 253 are from committers, of which 215 are improvements/other, of which 5 are

Re: Downloading Hadoop from s3://spark-related-packages/

2015-11-01 Thread Shivaram Venkataraman
I think that getting them from the ASF mirrors is a better strategy in general as it'll remove the overhead of keeping the S3 bucket up to date. It works in the spark-ec2 case because we only support a limited number of Hadoop versions from the tool. FWIW I don't have write access to the bucket

Some spark apps fail with "All masters are unresponsive", while others pass normally

2015-11-01 Thread Romi Kuntsman
[adding dev list since it's probably a bug, but I'm not sure how to reproduce so I can open a bug about it] Hi, I have a standalone Spark 1.4.0 cluster with 100s of applications running every day. From time to time, the applications crash with the following error (see below) But at the same

Re: unscribe

2015-11-01 Thread Ted Yu
Please take a look at first section of spark.apache.org/community FYI On Sun, Nov 1, 2015 at 1:09 AM, Chenxi Li wrote: > unscribe >

Re: [Spark MLlib] about linear regression issue

2015-11-01 Thread DB Tsai
For constraints like all weights >= 0, people use LBFGS-B, which is supported in our optimization library, Breeze. https://github.com/scalanlp/breeze/issues/323 However, in Spark's LiR, our implementation doesn't have constraint support. I do see this is useful given we're experimenting
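To illustrate the constrained-optimization idea discussed above (weights forced to be >= 0), here is a minimal pure-Python sketch. It uses projected gradient descent rather than LBFGS-B — a deliberately simpler technique that enforces the same box constraint by clamping after each step; the function name and learning-rate choices are illustrative, not from Spark or Breeze.

```python
# Sketch: least squares with a nonnegativity constraint via projected
# gradient descent. LBFGS-B handles general box constraints; this shows
# the same constraint mechanism with a simpler update rule.

def nnls_projected_gradient(X, y, lr=0.01, iters=5000):
    """Minimize ||Xw - y||^2 subject to w >= 0 (pure-Python sketch)."""
    n_features = len(X[0])
    w = [0.0] * n_features
    for _ in range(iters):
        # residuals r = Xw - y
        r = [sum(xi[j] * w[j] for j in range(n_features)) - yi
             for xi, yi in zip(X, y)]
        # gradient g = 2 * X^T r
        g = [2 * sum(X[i][j] * r[i] for i in range(len(X)))
             for j in range(n_features)]
        # gradient step, then project onto the feasible set w >= 0
        w = [max(0.0, w[j] - lr * g[j]) for j in range(n_features)]
    return w

w = nnls_projected_gradient([[1.0, 0.0], [0.0, 1.0]], [2.0, -3.0])
# unconstrained optimum is (2, -3); the projection clamps the second weight to 0
```

The projection step (the `max(0.0, ...)`) is what distinguishes this from plain gradient descent; LBFGS-B achieves the same feasibility guarantee with a quasi-Newton update instead.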

Re: Downloading Hadoop from s3://spark-related-packages/

2015-11-01 Thread Nicholas Chammas
Oh, sweet! For example: http://www.apache.org/dyn/closer.cgi/hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz?asjson=1 Thanks for sharing that tip. Looks like you can also use as_json (vs. asjson). Nick ​
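The `asjson=1` trick above returns a JSON description of the mirror list instead of an HTML page. As a sketch of how a tool like spark-ec2 could use it: the field names below (`preferred`, `path_info`) are assumptions about the endpoint's response shape, shown here against a canned sample rather than a live request.

```python
# Sketch: build a download URL from the mirror-list JSON returned by
# closer.cgi?asjson=1. Field names "preferred" and "path_info" are
# assumed; the sample response stands in for a real HTTP call.
import json

def mirror_url(closer_json: str) -> str:
    """Join the preferred mirror with the requested file path."""
    info = json.loads(closer_json)
    return info["preferred"].rstrip("/") + "/" + info["path_info"]

# Example response, trimmed to the two fields used here:
sample = json.dumps({
    "preferred": "https://mirror.example.org/apache/",
    "path_info": "hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz",
})
print(mirror_url(sample))
```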

Re: Downloading Hadoop from s3://spark-related-packages/

2015-11-01 Thread Shivaram Venkataraman
I think the lua one at https://svn.apache.org/repos/asf/infrastructure/site/trunk/content/dyn/closer.lua has replaced the cgi one from before. Also it looks like the lua one also supports `action=download` with a filename argument. So you could just do something like wget
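The `action=download` behavior described above gives a single URL that redirects to a working mirror. A small sketch of constructing such a URL (the parameter names come from the message; the redirect behavior itself is the endpoint's, not verified here):

```python
# Sketch: build a closer.lua URL that asks Apache's mirror resolver to
# redirect straight to a mirror hosting the given file.
from urllib.parse import urlencode

def closer_download_url(filename: str) -> str:
    base = "https://www.apache.org/dyn/closer.lua"
    return base + "?" + urlencode({"action": "download", "filename": filename})

print(closer_download_url("hadoop/common/hadoop-2.7.1/hadoop-2.7.1.tar.gz"))
```

Something like `wget --content-disposition <url>` would then follow the redirect and save the file under its real name.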

Re: Downloading Hadoop from s3://spark-related-packages/

2015-11-01 Thread Nicholas Chammas
Hmm, yeah, some Googling confirms this, though there isn't any clear documentation about this. Strangely, if I click on the link from your email the download works, but curl and wget somehow don't get redirected correctly... Nick On Sun, Nov 1, 2015 at 6:40 PM Shivaram Venkataraman <

Re: Downloading Hadoop from s3://spark-related-packages/

2015-11-01 Thread Nicholas Chammas
OK, I’ll focus on the Apache mirrors going forward. The problem with the Apache mirrors, if I am not mistaken, is that you cannot use a single URL that automatically redirects you to a working mirror to download Hadoop. You have to pick a specific mirror and pray it doesn’t disappear tomorrow.

Re: Downloading Hadoop from s3://spark-related-packages/

2015-11-01 Thread Shivaram Venkataraman
On Sun, Nov 1, 2015 at 2:16 PM, Nicholas Chammas wrote: > OK, I’ll focus on the Apache mirrors going forward. > > The problem with the Apache mirrors, if I am not mistaken, is that you > cannot use a single URL that automatically redirects you to a working mirror > to

Re: Unable to run applications on spark in standalone cluster mode

2015-11-01 Thread Akhil Das
Can you paste the contents of your spark-env.sh file? Also would be good to have a look at the /etc/hosts file. Cannot bind to the given ip address can be resolved if you put the hostname instead of the ip address. Also make sure the configuration (conf directory) across your cluster have the same
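A minimal `spark-env.sh` fragment along the lines suggested above — a resolvable hostname instead of a raw IP. The hostnames are placeholders; only the variable names are real Spark 1.x settings.

```shell
# spark-env.sh (sketch; hostnames are placeholders)
# Binding by hostname rather than raw IP avoids "cannot bind" errors
# when the IP in conf doesn't match an interface on the node.
export SPARK_MASTER_IP=spark-master.example.com   # master bind address (Spark 1.x name)
export SPARK_LOCAL_IP=$(hostname -f)              # address this node binds to
```

As the message notes, the same `conf` directory should be distributed to every node in the cluster, and `/etc/hosts` must resolve these hostnames consistently.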

Re: Implementation of RNN/LSTM in Spark

2015-11-01 Thread Sasaki Kai
Hi, Disha There seems to be no JIRA on RNN/LSTM directly. But there were several tickets about other types of networks regarding deep learning. Stacked Auto Encoder https://issues.apache.org/jira/browse/SPARK-2623 CNN

Implementation of RNN/LSTM in Spark

2015-11-01 Thread Disha Shrivastava
Hi, I wanted to know if someone is working on implementing RNN/LSTM in Spark or has already done so. I am also willing to contribute to it and get some guidance on how to go about it. Thanks and Regards Disha Masters Student, IIT Delhi