[ https://issues.apache.org/jira/browse/SPARK-4816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15051056#comment-15051056 ]

RJ Nowling edited comment on SPARK-4816 at 12/10/15 2:52 PM:
-------------------------------------------------------------

Hi [~srowen],

I haven't tried master yet, but that wouldn't address the problem I'm seeing.  
As I said, I downloaded the source tarball from the spark.apache.org web site 
rather than checking out branch-1.4.  I suspect it has something to do with the 
release process (though I'm saying this without knowing what that process 
involves).  I ran the same build command against both the source tarball (which 
reported excluding the native libs from the shading) and the branch-1.4 head 
from git (which reported including the native libs in the shading).
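
For concreteness, the comparison I'm describing is roughly the following (the 
directory name and assembly jar path are placeholders and will vary with the 
version and Hadoop profile; the build command is the one from the issue 
description):

{code}
# From the unpacked release source tarball (directory name is illustrative)
cd spark-1.4.x
mvn -Pnetlib-lgpl -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests clean package

# Check whether the netlib native classes ended up in the assembly jar
jar tf assembly/target/scala-2.10/spark-assembly-*.jar | grep -i netlib

# Then repeat exactly the same two steps from a git checkout of branch-1.4
{code}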

The .m2 repo shouldn't be an issue.  Normally, Spark pulls in the {{core}} 
artifact ID, which excludes the native libraries.  When the {{netlib-lgpl}} 
profile is enabled, the Spark MLlib pom.xml adds the {{all}} artifact ID, which 
pulls in the native libs.  ({{all}} is really just a pom.xml that pulls in 
{{core}} plus the native libs.)
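
A quick way to see which netlib-java artifacts Maven actually resolves (a 
sketch; the group ID is netlib-java's {{com.github.fommil.netlib}}, and the 
exact dependency:tree output will differ between versions):

{code}
# Without the profile: only the JVM-only "core" artifact should be resolved
mvn dependency:tree | grep com.github.fommil.netlib

# With the profile: the "all" artifact (and the native implementation jars it
# aggregates) should also appear under the mllib module
mvn -Pnetlib-lgpl dependency:tree | grep com.github.fommil.netlib
{code}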

I get that this is weird.  I also get that my knowledge of the release process 
is basically zero.  But I shouldn't get different results from git vs the 
released source tarball.  Maybe it's not the release process -- maybe something 
has changed in the meantime.  I'll search through the commits on branch-1.4 for 
anything related to shading.
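
Something along these lines, assuming the release tag the tarball was cut from 
(the {{v1.4.x}} tag name below is a placeholder):

{code}
# Commits on branch-1.4 since the release, restricted to the poms most likely
# to affect shading
git log --oneline v1.4.x..branch-1.4 -- assembly/pom.xml mllib/pom.xml

# Or search commit messages directly
git log --oneline -i --grep=shade branch-1.4
{code}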


> Maven profile netlib-lgpl does not work
> ---------------------------------------
>
>                 Key: SPARK-4816
>                 URL: https://issues.apache.org/jira/browse/SPARK-4816
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.1.0
>         Environment: maven 3.0.5 / Ubuntu
>            Reporter: Guillaume Pitel
>            Priority: Minor
>             Fix For: 1.1.1
>
>
> When doing what the documentation recommends to recompile Spark with the 
> netlib native system binding (i.e., to bind with OpenBLAS or, in my case, MKL), 
> mvn -Pnetlib-lgpl -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests 
> clean package
> The resulting assembly jar still lacked the netlib-system class. (I checked 
> the content of spark-assembly...jar)
> When forcing the netlib-lgpl profile in the MLlib package to be active, the 
> jar is correctly built.
> So I guess it's a problem with the way Maven passes profile activations to 
> child modules.
> Also, despite the documentation claiming that if the job's jar contains 
> netlib with the necessary bindings it should work, it does not. The 
> classloader must be unhappy with two occurrences of netlib?


