Thanks for the update. I used mvn to build, but without the hive profile.

Let me try mvn with the same options as you, and sbt as well.
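
For reference, mapping your sbt profiles onto Maven, the invocation I plan to try would be something like the following (untested on my side, assuming the build/mvn wrapper in the source tree):

build/mvn -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Psparkr -DskipTests clean package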

I'll keep you posted.
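
In the meantime, regarding the Guava conflict quoted below, one check I plan to do on my side is to see which Guava bundles sbt resolved, and to clear the cached jars before rebuilding. A minimal sketch, assuming the default lib_managed/ layout at the repository root:

ls lib_managed/bundles | grep -i guava
rm -rf lib_managed

After that, a fresh "build/sbt clean compile ..." with your profiles should hopefully resolve a single Guava version.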

Regards
JB

On 11/03/2015 12:55 PM, Jeff Zhang wrote:
I found it is due to SPARK-11073.

Here's the command I used to build:

build/sbt clean compile -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Psparkr

On Tue, Nov 3, 2015 at 7:52 PM, Jean-Baptiste Onofré <j...@nanthrax.net> wrote:

    Hi Jeff,

    it works for me (when skipping the tests).

    Let me try again, just to be sure.

    Regards
    JB


    On 11/03/2015 11:50 AM, Jeff Zhang wrote:

        Looks like it's due to a Guava version conflict: I see both guava
        14.0.1 and 16.0.1 under lib_managed/bundles. Has anyone else hit
        this issue?

        [error] /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:26: object HashCodes is not a member of package com.google.common.hash
        [error] import com.google.common.hash.HashCodes
        [error]        ^
        [info] Resolving org.apache.commons#commons-math;2.2 ...
        [error] /Users/jzhang/github/spark_apache/core/src/main/scala/org/apache/spark/SecurityManager.scala:384: not found: value HashCodes
        [error]         val cookie = HashCodes.fromBytes(secret).toString()
        [error]                      ^




        --
        Best Regards

        Jeff Zhang


    --
    Jean-Baptiste Onofré
    jbono...@apache.org
    http://blog.nanthrax.net
    Talend - http://www.talend.com





--
Best Regards

Jeff Zhang

--
Jean-Baptiste Onofré
jbono...@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
