for you to start.
There's some good discussion there as well as links to papers.
http://www.quora.com/Machine-Learning/What-is-the-difference-between-L1-and-L2-regularization
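Beyond the link, the difference is easy to see numerically. A minimal sketch (mine, not from the thread; `lam` is the regularization strength, `eta` a step size):

```python
import numpy as np

# Sketch (not from the thread): the two penalties and one gradient step
# taken on the penalty term alone.

def l1_penalty(w, lam):
    return lam * np.sum(np.abs(w))        # encourages sparsity

def l2_penalty(w, lam):
    return 0.5 * lam * np.sum(w ** 2)     # shrinks weights smoothly

def l1_step(w, lam, eta):
    # subgradient of |w_i| is sign(w_i): a constant pull toward zero,
    # so small weights are driven all the way to (or past) zero
    return w - eta * lam * np.sign(w)

def l2_step(w, lam, eta):
    # gradient of (lam/2) * w_i^2 is lam * w_i: proportional shrinkage,
    # so weights approach zero but rarely hit it exactly
    return w - eta * lam * w

w = np.array([0.05, -0.5, 2.0])
print(l1_step(w, lam=1.0, eta=0.1))   # the 0.05 weight crosses zero
print(l2_step(w, lam=1.0, eta=0.1))   # every weight scaled by 0.9
```

This is the practical upshot: L1 tends to zero out features entirely, while L2 only shrinks them.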
Sent while mobile. Pls excuse typos etc.
On Jan 8, 2014 2:24 PM, Walrus theCat walrusthe...@gmail.com wrote:
Hi,
Can
, and the L2 norm just means Euclidean length (L2 regularization typically penalizes its square). It's
not something you would write an ML paper on any more than the
vector dot product is. Are you asking something else?
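Concretely (a quick sketch, not part of the original reply):

```python
import math

v = [3.0, 4.0]
l1_norm = sum(abs(x) for x in v)            # 3 + 4 = 7
l2_norm = math.sqrt(sum(x * x for x in v))  # Euclidean length: sqrt(25) = 5
squared_l2 = sum(x * x for x in v)          # 25; the usual L2 penalty term
print(l1_norm, l2_norm, squared_l2)
```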
On Thu, Jan 9, 2014 at 6:19 PM, Walrus theCat walrusthe...@gmail.com
wrote:
Thanks Christopher,
I wanted to know
information including the error logs?
--Hossein
On Thu, Dec 12, 2013 at 7:50 AM, Walrus theCat walrusthe...@gmail.com wrote:
Hi,
I'm reading through the STDERR logs of my slaves, and about 1/4 of them
don't actually start. Instead, the only thing on the log is the command
that should have launched the process. Thoughts?
Thanks
Hi all,
I've had smashing success with this code on Spark 0.7.x, and with the same
code on Spark 0.8.0 using a smaller data set. However, when I try to use a
larger data set, some strange behavior occurs.
I'm trying to do L2 regularization with Logistic Regression using the new
MLlib.
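For reference, here is the math involved, independent of the MLlib API (a hedged sketch; the function names are mine): one SGD step for logistic regression with an L2 penalty, minimizing log(1 + exp(-y * w.x)) + (lam/2) * ||w||^2 with labels y in {-1, +1}.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sgd_step(w, x, y, lam, eta):
    """One step on example (x, y), y in {-1, +1}, L2 strength lam."""
    margin = y * sum(wi * xi for wi, xi in zip(w, x))
    g = -y * (1.0 - sigmoid(margin))  # scalar factor of the log-loss gradient
    # total gradient for weight i: g * x_i + lam * w_i
    return [wi - eta * (g * xi + lam * wi) for wi, xi in zip(w, x)]

# Toy run on two separable points
data = [([1.0, 0.0], 1), ([0.0, 1.0], -1)]
w = [0.0, 0.0]
for _ in range(100):
    for x, y in data:
        w = sgd_step(w, x, y, lam=0.01, eta=0.1)
print(w)  # first weight ends positive, second negative
```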
Reading
Anyone have any ideas based on the stack trace?
Thanks
On Sun, Dec 1, 2013 at 9:09 PM, Walrus theCat walrusthe...@gmail.com wrote:
Shouldn't? I imported the new 0.8.0 jars into my build path, and had to
update my imports accordingly. The only way I upload the spark jars myself
.)
On Fri, Nov 29, 2013 at 10:12 PM, Ashish Rangole arang...@gmail.com wrote:
I am sure you have already checked this, but any chance the classpath has
v0.7.x jars in it?
On Nov 29, 2013 4:40 PM, Walrus theCat walrusthe...@gmail.com wrote:
The full context isn't much -- this is the first thing I do
of it on there.
Matei
On Nov 27, 2013, at 6:04 PM, Walrus theCat walrusthe...@gmail.com wrote:
To clarify, I just undid that var... field.. thing described above, and
it throws the same error.
On Wed, Nov 27, 2013 at 5:53 PM, Walrus theCat walrusthe...@gmail.com wrote:
Hi all,
This exception gets thrown
The full context isn't much -- this is the first thing I do in my main
method (assign a value to sc), and it throws this error.
On Fri, Nov 29, 2013 at 10:38 AM, Walrus theCat walrusthe...@gmail.com wrote:
Hi Matei,
Good to hear from you. The stack trace is below. I launched the
instances
.
Thank you,
Walrus theCat
On Wed, Nov 27, 2013 at 5:53 PM, Walrus theCat walrusthe...@gmail.com wrote:
Hi all,
This exception gets thrown when I assign a value to the variable holding
my SparkContext. I initialize
, Walrus theCat walrusthe...@gmail.com wrote:
Hi,
I just updated my imports and tried to run my app using Spark 0.8, but it
breaks. The AMI's spark-shell says it's 0.7.3 or thereabouts, which is
what my app previously used. What is the official, step-by-step solution
to using Spark 0.8 on EC2
or so.
Matei
On Nov 11, 2013, at 12:13 PM, Walrus theCat walrusthe...@gmail.com
wrote:
Hi,
The docs say that we should be careful to increase spark.akka.threads
when our cluster size increases. Is there a rule of thumb for how much we
should increase it? For instance
to all worker nodes
multiple times. Then the broadcast variable is a good choice
2013/11/7 Walrus theCat walrusthe...@gmail.com
Shangyu,
Thanks for the tip re: the flag! Maybe the broadcast variable is only
for complex data structures?
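Broadcast variables aren't only for complex structures; they pay off whenever large read-only data would otherwise be shipped inside every task closure. A rough illustration outside Spark (the PySpark calls in the trailing comments use the standard `SparkContext.broadcast` API):

```python
import pickle

# Large read-only lookup table shared by all tasks.
lookup = {i: i * i for i in range(10_000)}

def make_task(table):
    # Closure capture: `table` is serialized along with every task shipped.
    def task(x):
        return table.get(x, 0)
    return task

task = make_task(lookup)
per_task_cost = len(pickle.dumps(lookup))  # paid per task without a broadcast
print(task(7), per_task_cost)

# With a broadcast variable the data ships to each worker only once:
#   bcast = sc.broadcast(lookup)
#   rdd.map(lambda x: bcast.value.get(x, 0))
```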
On Sun, Nov 3, 2013 at 7:58 PM, Shangyu Luo lsy
Will that cause a hit to performance or cause the program to crash?
Thanks
Are there heuristics to check when the scheduler says it is missing
parents and just hangs?
On Thu, Oct 31, 2013 at 4:56 PM, Walrus theCat walrusthe...@gmail.com wrote:
Hi,
I'm not sure what's going on here. My code seems to be working thus far
(map at SparkLR:90 completed.) What can I do to help the scheduler out
here?
Thanks
13/10/31 02:10:13 INFO scheduler.DAGScheduler: Completed ShuffleMapTask(10,
211)
13/10/31 02:10:13 INFO scheduler.DAGScheduler: Stage