Hi Ted,

I did explicitly include breeze in my pom.xml:

    <dependency>
      <groupId>org.scalanlp</groupId>
      <artifactId>breeze_${scala.binary.version}</artifactId>
      <version>0.9</version>
    </dependency>

But this error message still appears.
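
For reference, one thing I plan to try next, assuming Spark 1.1.0 no longer
provides commons-math3 transitively (which would explain why Spark 1.0.0
worked), is declaring it with the default compile scope instead of runtime,
so the class is available both at compile time and on the driver/executor
classpath:

    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-math3</artifactId>
      <version>3.3</version>
      <!-- no runtime scope: compile (the default) keeps the jar on both
           the compile and runtime classpaths -->
    </dependency>

To check which artifact, if any, is still pulling commons-math3 in (and at
which version), the dependency plugin can print the filtered tree:

    mvn dependency:tree -Dincludes=org.apache.commons:commons-math3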


Thanks!

On Sat, Oct 4, 2014 at 2:03 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> See the last comment in that thread from Xiangrui:
>
> bq. include breeze in the dependency set of your project. Do not rely on
> transitive dependencies
>
> Cheers
>
> On Sat, Oct 4, 2014 at 1:48 PM, 陈韵竹 <anny9...@gmail.com> wrote:
>
>> Hi Ted,
>>
>> So, according to the previous posts, the problem should be solved by
>> changing the spark-1.1.0 core pom file?
>>
>> Thanks!
>>
>> On Sat, Oct 4, 2014 at 1:06 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Cycling bits:
>>> http://search-hadoop.com/m/JW1q5UX9S1/breeze+spark&subj=Build+error+when+using+spark+with+breeze
>>>
>>> On Sat, Oct 4, 2014 at 12:59 PM, anny9699 <anny9...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I use breeze.stats.distributions.Bernoulli in my code, but I hit this
>>>> problem:
>>>> java.lang.NoClassDefFoundError:
>>>> org/apache/commons/math3/random/RandomGenerator
>>>>
>>>> I read the earlier posts about this problem, and when I added
>>>>     <dependency>
>>>>       <groupId>org.apache.commons</groupId>
>>>>       <artifactId>commons-math3</artifactId>
>>>>       <version>3.3</version>
>>>>       <scope>runtime</scope>
>>>>     </dependency>
>>>> to my pom.xml, more serious issues appeared. The breeze dependency is
>>>> already in my pom.xml. I am using Spark-1.1.0; I didn't seem to hit this
>>>> issue when I was using Spark-1.0.0. Does anyone have any suggestions?
>>>>
>>>> Thanks a lot!
>>>> Anny
>>>>
>>>>
>>>
>>
>
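
One more thought on Xiangrui's advice quoted above: since the point is not
to rely on transitive dependencies, bundling everything into an assembly
(uber) jar for spark-submit is another option. A minimal sketch with the
maven-shade-plugin, assuming a plain Maven build with the Spark artifacts
marked as provided so they are not bundled (the plugin version here is just
illustrative):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <executions>
        <execution>
          <!-- build the uber jar when the package phase runs -->
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>

With that in place, mvn package produces a jar that already contains the
breeze and commons-math3 classes, so the executors should not be missing
org/apache/commons/math3/random/RandomGenerator.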
