[ 
https://issues.apache.org/jira/browse/MAHOUT-2086?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Palumbo updated MAHOUT-2086:
-----------------------------------
    Summary: use consistent SBT resolvable jar naming scheme with the correct 
convention  (was: change jar naming scheme to the correct convention)

> use consistent SBT resolvable jar naming scheme with the correct convention
> ---------------------------------------------------------------------------
>
>                 Key: MAHOUT-2086
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-2086
>             Project: Mahout
>          Issue Type: Bug
>    Affects Versions: 14.1
>            Reporter: Andrew Palumbo
>            Assignee: Andrew Palumbo
>            Priority: Critical
>             Fix For: 14.1
>
>
>  Currently we build a jar for each module, module_x, in its respective target 
> directory, naming it
> {code:java}
>  
> $MAHOUT_HOME/module_x/target/mahout-module-x_${scala.compat.version}-{project.version}.jar.{code}
>  We then copy each jar file to the {{/lib}} directory using an ant plugin 
> configured in the {{pom.xml}}.  [~pferrel] has had issues resolving these artifacts via SBT.  
>  
> It seems that our artifacts should be resolvable by SBT, e.g., for core math, 
> Scala 2.12, Mahout v14.1:
> {code:java}
>  libraryDependencies += "org.apache.mahout" % "mahout-core_2.12" % 
> "14.1"{code}
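>  
> A note on the sbt side: the {{%%}} operator appends the Scala binary version to the artifact name automatically, so (assuming the artifact is published under the standard {{_2.12}} suffix) the same dependency could also be written as:

```scala
// build.sbt sketch (assumption: standard cross-versioned publishing layout).
// With scalaVersion set to a 2.12.x release, %% resolves "mahout-core"
// to the artifact "mahout-core_2.12".
libraryDependencies += "org.apache.mahout" %% "mahout-core" % "14.1"
```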
>  
> Currently we have our Spark build pegged to 2.4.3.
> I had trouble finding a naming convention for artifacts in the Scala docs.  
> However, this Spark style guide [1] makes the case for:
> {{module-x_scalaVersion-projectVersion.jar}}
> h2. JAR Files
> You can build projects that support multiple Spark versions or just a single 
> Spark version.
> h3. Projects that support a single Spark version
> JAR files built for a specific Spark version should be named like this:
>  
> {{spark-testing-base_2.11-2.1.0_0.6.0.jar}}
> Generically:
>  
> {{spark-testing-base_scalaVersion-sparkVersion_projectVersion.jar}}
> If you're using {{sbt assembly}}, you can use the following line of code to build 
> a JAR file using the correct naming convention:
> {code:java}
> assemblyJarName in assembly := s"${name.value}_${scalaBinaryVersion.value}-${sparkVersion.value}_${version.value}.jar"{code}
> If you're using {{sbt package}}, you can add this code to your {{build.sbt}} 
> file to generate a JAR file that follows the naming conventions.
> {code:java}
> artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
>   artifact.name + "_" + sv.binary + "-" + sparkVersion.value + "_" + module.revision + "." + artifact.extension }{code}
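>  
> The resulting name can be sanity-checked with a plain Scala function. This is only a sketch; {{jarName}} is a hypothetical helper, not part of the style guide:

```scala
// Hypothetical helper mirroring the single-Spark-version pattern:
// name_scalaBinaryVersion-sparkVersion_projectVersion.jar
def jarName(name: String, scalaBinary: String, spark: String, version: String): String =
  s"${name}_${scalaBinary}-${spark}_${version}.jar"

println(jarName("spark-testing-base", "2.11", "2.1.0", "0.6.0"))
// prints spark-testing-base_2.11-2.1.0_0.6.0.jar
```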
> h3. Projects that support multiple Spark versions
> JAR files built for multiple Spark versions should be named like this:
> {{spark-testing-base_2.11-0.6.0.jar}}
> Generically:
> {{spark-testing-base_scalaVersion-projectVersion.jar}}
> If you're using {{sbt assembly}}, you can use the following line of code to build 
> a JAR file using the correct naming convention:
> {code:java}
> assemblyJarName in assembly := s"${name.value}_${scalaBinaryVersion.value}-${version.value}.jar"{code}
> If you're using {{sbt package}}, you can add this code to your {{build.sbt}} 
> file to generate a JAR file that follows the naming conventions.
> {code:java}
> artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
>   artifact.name + "_" + sv.binary + "-" + module.revision + "." + artifact.extension }{code}
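>  
> Applied to the Mahout example from above, this two-part pattern can be sketched the same way ({{jarName}} is again a hypothetical helper for illustration):

```scala
// Hypothetical helper mirroring the multi-Spark-version pattern:
// name_scalaBinaryVersion-projectVersion.jar
def jarName(name: String, scalaBinary: String, version: String): String =
  s"${name}_${scalaBinary}-${version}.jar"

println(jarName("mahout-core", "2.12", "14.1"))
// prints mahout-core_2.12-14.1.jar
```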
>  
>  
> [1] https://github.com/mrpowers/spark-style-guide



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
