Re: Upgrade to Spark 1.2.1 using Guava

2015-03-02 Thread Pat Ferrel
Marcelo’s work-around works. So if you are using the itemsimilarity stuff, the CLI has a way to solve the class-not-found error, and I can point out how to do the equivalent if you are using the library API. Ping me if you care.

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-28 Thread Erlend Hamnaberg
Yes. I ran into this problem with a Mahout snapshot and Spark 1.2.0. I didn't really try to figure out why, since there were already too many moving parts in my app. Obviously there is a classpath issue somewhere. /Erlend

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-28 Thread Pat Ferrel
Maybe, but any time the workaround is to use "spark-submit --conf spark.executor.extraClassPath=/guava.jar blah", that means standalone apps must have hard-coded paths that are honored on every worker. And as you know, a lib is pretty much blocked from using this version of Spark, hence the …

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Sean Owen
This seems like a job for userClassPathFirst. Or could be. It's definitely an issue of visibility between where the serializer is and where the user class is. At the top you said, Pat, that you didn't try this, but why not?
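
For reference, a minimal sketch of what trying userClassPathFirst could look like from a standalone app. The exact property name is version-dependent (around Spark 1.2 the experimental flag was spark.files.userClassPathFirst; spark.executor.userClassPathFirst arrived later), so treat this as an assumption to verify against your version's docs:

    import org.apache.spark.{SparkConf, SparkContext}

    // Ask executors to prefer the user's classes over Spark's own copy
    // when both are on the classpath (experimental around Spark 1.2.x).
    val conf = new SparkConf()
      .setAppName("guava-classpath-test") // hypothetical app name
      .set("spark.files.userClassPathFirst", "true")
    val sc = new SparkContext(conf)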

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Pat Ferrel
I’ll try to find a JIRA for it. I hope a fix is in 1.3.

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Pat Ferrel
Thanks! That worked.

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Pat Ferrel
I don’t use spark-submit; I have a standalone app. So I guess you want me to add that key/value to the conf in my code and make sure the jar exists on the workers.
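
A minimal sketch of that library-API equivalent of the spark-submit flag, assuming the Guava jar sits at the same absolute path on every worker (/path/to/guava.jar and the master URL are placeholders, not real locations):

    import org.apache.spark.{SparkConf, SparkContext}

    // Equivalent of:
    //   spark-submit --conf spark.executor.extraClassPath=/path/to/guava.jar ...
    // The jar must already exist at this path on every worker node.
    val conf = new SparkConf()
      .setAppName("standalone-app")
      .setMaster("spark://master:7077") // placeholder master URL
      .set("spark.executor.extraClassPath", "/path/to/guava.jar")
    val sc = new SparkContext(conf)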

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Marcelo Vanzin
Sorry, I'm still confused about what config you're changing. I'm suggesting using: spark-submit --conf spark.executor.extraClassPath=/guava.jar …

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Pat Ferrel
I changed it in the Spark master conf, which is also the only worker. I added a path to the jar that has Guava in it. Still can't find the class. Trying Erlend's idea next.

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Marcelo Vanzin
spark.executor.extraClassPath is an app configuration, not a worker configuration, so if you're trying to set it in the worker configuration it will definitely not work.

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Pat Ferrel
@Erlend hah, we were trying to merge your PR and ran into this; small world. You actually compile the JavaSerializer source in your project? @Marcelo do you mean modifying spark.executor.extraClassPath on all workers? I tried that and it didn't seem to work.

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Erlend Hamnaberg
Hi. I have had a similar issue. I had to pull the JavaSerializer source into my own project, just so I got the classloading of this class under control. This must be a classloader issue with Spark. -E

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Marcelo Vanzin
Ah, I see. That makes a lot of sense now. You might be running into some weird classloader visibility issue. I've seen some bugs in JIRA about this in the past; maybe you're hitting one of them. Until I have some time to investigate (or if you're curious, feel free to scavenge JIRA), a workaround is to add the Guava jar to spark.executor.extraClassPath so the executors can see it.
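
A hedged diagnostic sketch (not from the thread) for narrowing down this kind of visibility problem: run a trivial task and ask which classloader, if any, can see the Guava class on the executor side. Output lands in the executor logs, not the driver console:

    // Probe executor-side class visibility for the class that fails to load.
    sc.parallelize(1 to 1, 1).foreach { _ =>
      try {
        val cls = Class.forName("com.google.common.collect.HashBiMap")
        println(s"visible, loaded by: ${cls.getClassLoader}")
      } catch {
        case e: ClassNotFoundException => println(s"not visible: $e")
      }
    }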

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-27 Thread Pat Ferrel
I understand that I need to supply Guava to Spark. The HashBiMap is created in the client and broadcast to the workers, so it is needed in both. To achieve this there is a deps.jar with Guava (and Scopt, but that is only for the client). Scopt is found, so I know the jar is fine for the client.
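
A sketch of the pattern being described, assuming Guava's HashBiMap (which is serializable) is built on the driver and broadcast; deserializing it inside a task is exactly what requires Guava on the executor classpath as well:

    import com.google.common.collect.HashBiMap
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("bimap-broadcast"))

    // Build the bimap on the driver ("client") side.
    val bimap = HashBiMap.create[String, Integer]()
    bimap.put("alice", 1)
    bimap.put("bob", 2)

    // Each executor must be able to load com.google.common.collect.HashBiMap
    // to deserialize the broadcast value, hence Guava is needed on both sides.
    val broadcastMap = sc.broadcast(bimap)
    val ids = sc.parallelize(Seq("alice", "bob")).map(n => broadcastMap.value.get(n))
    println(ids.collect().mkString(", "))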

Re: Upgrade to Spark 1.2.1 using Guava

2015-02-25 Thread Marcelo Vanzin
Guava is not in Spark. (Well, long version: it's in Spark, but it's relocated to a different package, except for some special classes leaked through the public API.) If your app needs Guava, it needs to package Guava with it (e.g. by using maven-shade-plugin, or using "--jars" if only the executors use it).
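
Since this thread is Scala/sbt flavored, here is the same relocation idea sketched with sbt-assembly's shading rules instead of maven-shade-plugin (assumes sbt-assembly 0.14+; the shaded package name my.shaded.guava is made up for illustration):

    // build.sbt (requires the sbt-assembly plugin)
    // Relocate the app's copy of Guava into a private namespace so it
    // cannot collide with the copy Spark relocates internally.
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.common.**" -> "my.shaded.guava.@1").inAll
    )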