[ https://issues.apache.org/jira/browse/MAHOUT-1616?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14191252#comment-14191252 ]

ASF GitHub Bot commented on MAHOUT-1616:
----------------------------------------

Github user dlyubimov commented on the pull request:

    https://github.com/apache/mahout/pull/54#issuecomment-61210738
  
    oh.

    that's the hadoop versioning thing again, then.

    keep in mind -- the hadoop version in the maven dependencies is not the
    same as the actual hadoop version spark has at runtime.

    in fact, the spark modules pull their hadoop version from spark's
    transitive dependencies as resolved in your local maven repository.

    by default the nightly will pull the default spark artifact from central,
    which carries whatever default hadoop version was compiled into it, and
    that is most likely what the spark module uses for local tests. but
    generally it shouldn't matter which hadoop version the spark tests run
    on, because they are not writing or reading files created with any other
    version.
    
    
    On Thu, Oct 30, 2014 at 6:23 PM, Pat Ferrel <[email protected]>
    wrote:
    
    > I'm getting the snappy build test error that broke the nightly build.
    > Switching back to master and trying again
    >
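
For reference (not from the thread): running "mvn dependency:tree -Dincludes=org.apache.hadoop" inside the spark module shows exactly which Hadoop version rides in on Spark's transitive dependencies. To pin a different Hadoop version than the one the central Spark artifact was built against, the usual Maven pattern is an exclusion plus an explicit dependency; the coordinates and property names below are illustrative, not taken from the Mahout pom:

    <!-- drop the hadoop-client that comes in transitively with spark-core -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>${spark.version}</version>
      <exclusions>
        <exclusion>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-client</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <!-- and declare the desired hadoop version explicitly -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
    </dependency>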


> Better support for hadoop dependencies of multiple versions 
> ------------------------------------------------------------
>
>                 Key: MAHOUT-1616
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1616
>             Project: Mahout
>          Issue Type: Improvement
>          Components: build
>            Reporter: Gokhan Capan
>            Assignee: Gokhan Capan
>




