Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17416#discussion_r108002222
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -941,6 +965,12 @@ private[spark] object SparkSubmitUtils {
           val ri = ModuleRevisionId.newInstance(mvn.groupId, mvn.artifactId, 
mvn.version)
           val dd = new DefaultDependencyDescriptor(ri, false, false)
           dd.addDependencyConfiguration(ivyConfName, ivyConfName + "(runtime)")
    +      if (mvn.classifier.isDefined) {
    --- End diff ---
    
    CC @brkyvz and @BryanCutler (and maybe @vanzin) for a look. This does _not_ 
quite work yet, but it's close. I believe it's this stanza that's not quite 
right, where I try to add classifier info to Ivy's description of an artifact. 
    
    For example, `./bin/spark-shell --packages 
edu.stanford.nlp:stanford-corenlp:jar:models:3.4.1` runs and seems to resolve 
the right artifact (with its classifier) locally, with no errors, but the jar 
doesn't appear to end up on the classpath.
    
    I don't need you to debug it for me; I'm mostly wondering whether you know 
Ivy much better than I do (I hadn't seen it until today) and can spot an 
obvious error in this approach. I can't quite work out which Ivy concept maps 
to Maven's "classifier".


