Dilip:
Can you give the command you used?

Which release were you building?
What OS did you build on ?

Cheers

On Thu, Nov 5, 2015 at 10:21 AM, Dilip Biswal <[email protected]> wrote:

> Hello,
>
> I am getting the same build error about not being able to find
> com.google.common.hash.HashCodes.
>
> Is there a solution to this?
>
> Regards,
> Dilip Biswal
> Tel: 408-463-4980
> [email protected]
>
>
>
> From:        Jean-Baptiste Onofré <[email protected]>
> To:        Ted Yu <[email protected]>
> Cc:        "[email protected]" <[email protected]>
> Date:        11/03/2015 07:20 AM
> Subject:        Re: Master build fails ?
> ------------------------------
>
>
>
> Hi Ted,
>
> thanks for the update. The build with sbt is in progress on my box.
>
> Regards
> JB
>
> On 11/03/2015 03:31 PM, Ted Yu wrote:
> > Interesting, the sbt builds were not all failing:
> >
> > https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/
> >
> > FYI
> >
> > On Tue, Nov 3, 2015 at 5:58 AM, Jean-Baptiste Onofré <[email protected]> wrote:
>
> >
> >     Hi Jacek,
> >
> >     it works fine with mvn: the problem is with sbt.
> >
> >     I suspect a different reactor order in sbt compared to mvn.
> >
> >     Regards
> >     JB
> >
> >     On 11/03/2015 02:44 PM, Jacek Laskowski wrote:
> >
> >         Hi,
> >
> >         Just built the sources using the following command and it worked
> >         fine.
> >
> >         ➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
> >         -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
> >         -DskipTests clean install
> >         ...
> >         [INFO] ------------------------------------------------------------------------
> >         [INFO] BUILD SUCCESS
> >         [INFO] ------------------------------------------------------------------------
> >         [INFO] Total time: 14:15 min
> >         [INFO] Finished at: 2015-11-03T14:40:40+01:00
> >         [INFO] Final Memory: 438M/1972M
> >         [INFO] ------------------------------------------------------------------------
> >
> >         ➜  spark git:(master) ✗ java -version
> >         java version "1.8.0_66"
> >         Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
> >         Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
> >
> >         I'm on Mac OS.
> >
> >         Pozdrawiam,
> >         Jacek
> >
> >         --
> >         Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
> >         Follow me at https://twitter.com/jaceklaskowski
> >         Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
>
> >
> >
> >         On Tue, Nov 3, 2015 at 1:37 PM, Jean-Baptiste Onofré
> >         <[email protected]> wrote:
> >
> >             Thanks for the update. I used mvn to build, but without the
> >             hive profile.
> >
> >             Let me try mvn with the same options as you, and sbt as well.
> >
> >             I'll keep you posted.
> >
> >             Regards
> >             JB
> >
> >             On 11/03/2015 12:55 PM, Jeff Zhang wrote:
> >
> >
> >                 I found that it is due to SPARK-11073.
> >
> >                 Here's the command I used to build:
> >
> >                 build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive
> >                 -Phive-thriftserver -Psparkr
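> >
> >                 If you want to confirm which Guava versions the build
> >                 resolves, the Maven dependency plugin can filter for it,
> >                 e.g.:
> >
> >                 ./build/mvn dependency:tree -Dincludes=com.google.guava:guava
> >
> >                 (That command is a suggested check, not one taken from
> >                 this thread; adjust the profiles to match your build.)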
> >
> >                 On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré
> >                 <[email protected]> wrote:
> >
> >
> >                       Hi Jeff,
> >
> >                       it works for me (when skipping the tests).
> >
> >                       Let me try again, just to be sure.
> >
> >                       Regards
> >                       JB
> >
> >
> >                       On 11/03/2015 11:50 AM, Jeff Zhang wrote:
> >
> >                           Looks like it's due to Guava version conflicts:
> >                           I see both Guava 14.0.1 and 16.0.1 under
> >                           lib_managed/bundles. Has anyone else hit this
> >                           issue too?
> >
> >                           [error] /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26:
> >                           object HashCodes is not a member of package com.google.common.hash
> >                           [error] import com.google.common.hash.HashCodes
> >                           [error]        ^
> >                           [info] Resolving org.apache.commons#commons-math;2.2 ...
> >                           [error] /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384:
> >                           not found: value HashCodes
> >                           [error]         val cookie = HashCodes.fromBytes(secret).toString()
> >                           [error]                      ^
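> >
> >                           For context: HashCodes was removed in Guava 16,
> >                           where the public factory moved to HashCode. A
> >                           minimal sketch of a Guava-16-compatible call,
> >                           assuming Guava 16 is what the build resolves
> >                           (a workaround sketch, not necessarily the
> >                           actual SPARK-11073 change):
> >
> >                           import com.google.common.hash.HashCode
> >
> >                           // HashCode.fromBytes is public since Guava 15
> >                           // and replaces the removed HashCodes.fromBytes
> >                           val cookie = HashCode.fromBytes(secret).toString()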
> >
> >
> >
> >
> >                           --
> >                           Best Regards
> >
> >                           Jeff Zhang
> >
> >
> >                       --
> >                       Jean-Baptiste Onofré
> >                       [email protected]
> >                       http://blog.nanthrax.net
> >                       Talend - http://www.talend.com
> >
> >
> >
> >
> >
> >                 --
> >                 Best Regards
> >
> >                 Jeff Zhang
> >
> >
> >
> >             --
> >             Jean-Baptiste Onofré
> >             [email protected]
> >             http://blog.nanthrax.net
> >             Talend - http://www.talend.com
> >
> >
> >
> >
> >
> >
> >
> >     --
> >     Jean-Baptiste Onofré
> >     [email protected]
> >     http://blog.nanthrax.net
> >     Talend - http://www.talend.com
> >
> >
> >
>
> --
> Jean-Baptiste Onofré
> [email protected]
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
>
>
>
>
