Github user tmalaska commented on the pull request:
https://github.com/apache/spark/pull/507#issuecomment-41309801
@pwendell this is a great question.
The answer is either a bug in sbt or something I'm missing in the
SparkBuild.scala.
In the Flume 1.4.0 pom.xml there is a dependency on thrift, but the version
is declared with a property, and that property is defined in a profile. I'm not
sure if the issue is related to the property, the profile, or the combination,
but sbt does not pick up the value of the thrift version property and I get the
following exception.
sbt.ResolveException: unresolved dependency:
org.apache.thrift#libthrift;${thrift.version}: not found
Maven works just fine, so I left that as is.
So, with my limited understanding of sbt and why it was crapping out, I
decided to exclude the thrift dependency from Flume 1.4.0 and declare it
directly in the SparkBuild.scala file.
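For reference, a minimal sketch of what that workaround looks like in an sbt
build definition; the exact module names and the libthrift version below are
illustrative assumptions based on the description above, not the exact lines
from this PR:

    // Sketch of the workaround: exclude the transitive thrift dependency
    // that sbt fails to resolve (because of the ${thrift.version} property)
    // and declare libthrift explicitly with a concrete version.
    libraryDependencies ++= Seq(
      "org.apache.flume" % "flume-ng-sdk" % "1.4.0"
        exclude("org.apache.thrift", "libthrift"),
      "org.apache.thrift" % "libthrift" % "0.9.0"  // assumed version
    )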
I'm open to any and all help here. I don't know enough about sbt to know
why it is having trouble with this.
Side note: sbt works fine with Flume 1.3.0, because there the thrift version
is hard-coded in the Flume pom.xml. Flume 1.4.0 introduced the property.