Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1813#discussion_r16377010
  
    --- Diff: project/SparkBuild.scala ---
    @@ -276,6 +289,41 @@ object Assembly {
     
     }
     
    +/**
    + * Settings for the spark-core artifact. We don't want to expose Guava as a compile-time dependency,
    + * but at the same time the Java API exposes a Guava type (Optional). So we package it with the
    + * spark-core jar using the assembly plugin, and use the assembly deliverable as the main artifact
    + * for that project, disabling the non-assembly jar.
    + */
    +object CoreAssembly {
    --- End diff --
    
    Does this approach no longer create the `spark-core` jar? I don't think that is workable, because many people do things like `sbt/sbt publish-local` and then link applications against their local install of Spark. Could we just name the assembly jar identically to the normal `spark-core` package jar?
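    
    For illustration only, a minimal sketch of what that could look like, assuming sbt 0.13 with the sbt-assembly 0.11.x keys (the plugin version, key names, and the exact jar-name pattern are assumptions, not what this PR actually does):
    
    ```scala
    // Hypothetical sketch: bundle only Guava into the core jar and give the
    // assembly the same file name as the regular spark-core package jar, so
    // publish-local users keep resolving the same artifact name.
    object CoreAssembly {
      import sbtassembly.Plugin._
      import AssemblyKeys._
    
      lazy val settings = assemblySettings ++ Seq(
        // exclude every dependency jar except Guava from the merged output
        excludedJars in assembly := {
          (fullClasspath in assembly).value.filterNot(_.data.getName.startsWith("guava"))
        },
        // name the assembly exactly like the default package jar
        jarName in assembly := s"${name.value}_${scalaBinaryVersion.value}-${version.value}.jar",
        // have the normal packaging task produce the assembly instead
        packageBin in Compile := assembly.value
      )
    }
    ```
    
    Something along those lines would keep `publish-local` working without having to disable the non-assembly jar outright.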


