RE: Spark builds: allow user override of project version at buildtime
So, I actually tried this, and it built without problems, but publishing the artifacts to Artifactory ended up with some strangeness in the child poms, where the property wasn't resolved. This leads to issues when pulling them into other projects: "Could not find org.apache.spark:spark-parent_2.10:${spark.version}." There's conflicting information on the web about whether this should or shouldn't work, and whether or not it's a good idea. The broad consensus is that it's a bit of a hack around Maven, so it's probably not something we should do. I'll explore whether sbt is more flexible and does what's needed.

Andrew

From: Michael Armbrust [mailto:mich...@databricks.com]
Sent: 26 August 2015 03:12
To: Marcelo Vanzin
Cc: Rowson, Andrew G. (Financial & Risk); dev@spark.apache.org
Subject: Re: Spark builds: allow user override of project version at buildtime

This isn't really answering the question, but for what it is worth, I manage several different branches of Spark and publish custom-named versions regularly to an internal repository, and this is *much* easier with SBT than with Maven. You can actually link the Spark SBT build into an external SBT build and write commands that cross-publish as needed. For your case, something as simple as build/sbt "set version in Global := '1.4.1-custom-string'" publish might do the trick.

On Tue, Aug 25, 2015 at 10:09 AM, Marcelo Vanzin wrote:

> On Tue, Aug 25, 2015 at 2:17 AM, wrote:
> > Then, if I wanted to do a build against a specific profile, I could also
> > pass in a -Dspark.version=1.4.1-custom-string and have the output artifacts
> > correctly named. The default behaviour should be the same. Child pom files
> > would need to reference ${spark.version} in their parent section, I think.
> >
> > Any objections to this?
>
> Have you tried it? My understanding is that no project does that because
> it doesn't work. To resolve properties you need to read the parent pom(s),
> and if there's a variable reference there, well, you can't do it.
> Chicken & egg.
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
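[Editor's note: the chicken-and-egg problem Marcelo describes can be sketched with a hypothetical child POM fragment. Maven must fetch and parse the parent POM before it can resolve any inherited properties, but here the parent's coordinates themselves contain a property reference, so resolution can never get started. This fragment is illustrative only, not taken from the actual Spark build.]

```xml
<!-- Hypothetical child pom.xml fragment, for illustration only. -->
<project>
  <parent>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-parent_2.10</artifactId>
    <!-- Maven needs the parent POM in order to resolve properties, but it
         needs this version already resolved in order to locate the parent
         POM in the first place: chicken and egg. -->
    <version>${spark.version}</version>
  </parent>
  <artifactId>spark-core_2.10</artifactId>
</project>
```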
Re: Spark builds: allow user override of project version at buildtime
This isn't really answering the question, but for what it is worth, I manage several different branches of Spark and publish custom-named versions regularly to an internal repository, and this is *much* easier with SBT than with Maven. You can actually link the Spark SBT build into an external SBT build and write commands that cross-publish as needed. For your case, something as simple as build/sbt "set version in Global := '1.4.1-custom-string'" publish might do the trick.

On Tue, Aug 25, 2015 at 10:09 AM, Marcelo Vanzin wrote:

> On Tue, Aug 25, 2015 at 2:17 AM, wrote:
> > Then, if I wanted to do a build against a specific profile, I could also
> > pass in a -Dspark.version=1.4.1-custom-string and have the output artifacts
> > correctly named. The default behaviour should be the same. Child pom files
> > would need to reference ${spark.version} in their parent section, I think.
> >
> > Any objections to this?
>
> Have you tried it? My understanding is that no project does that because
> it doesn't work. To resolve properties you need to read the parent pom(s),
> and if there's a variable reference there, well, you can't do it.
> Chicken & egg.
>
> --
> Marcelo
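[Editor's note: a sketch of the invocation Michael suggests, with one caveat on quoting. The argument to sbt's `set` is evaluated as a Scala expression, and in Scala single quotes delimit character literals rather than strings, so the version value needs double quotes; wrapping the whole `set` expression in shell single quotes keeps them intact. The version string below is just a placeholder.]

```shell
# Hypothetical invocation, for illustration only: override the version for
# every module in the build, then publish. The inner double quotes form the
# Scala string literal that "set" evaluates; the outer single quotes protect
# them from the shell.
build/sbt 'set version in Global := "1.4.1-custom-string"' publish
```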
Re: Spark builds: allow user override of project version at buildtime
On Tue, Aug 25, 2015 at 2:17 AM, wrote:
> Then, if I wanted to do a build against a specific profile, I could also
> pass in a -Dspark.version=1.4.1-custom-string and have the output artifacts
> correctly named. The default behaviour should be the same. Child pom files
> would need to reference ${spark.version} in their parent section, I think.
>
> Any objections to this?

Have you tried it? My understanding is that no project does that because it doesn't work. To resolve properties you need to read the parent pom(s), and if there's a variable reference there, well, you can't do it. Chicken & egg.

--
Marcelo