Hi, I'm putting together a Spark package (in the spark-packages.org sense) and I'd like to make use of the class org.apache.spark.mllib.util.TestingUtils, which lives in mllib/src/test. Can I declare a dependency in my build.sbt to pull in a suitable jar?

I have searched around but have not been able to identify a published jar that contains TestingUtils. I suppose I could cut 'n' paste the relevant bits from the source code, but I'd really rather just declare a dependency. I looked at a few other packages at spark-packages.org, but I couldn't find an example of a project doing something similar.
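For what it's worth, the usual sbt idiom for this kind of thing is a "tests"-classified dependency, assuming the project publishes its test classes as a test jar to Maven Central (I haven't verified that the Spark release I name below actually does, and the version number is only illustrative):

```scala
// build.sbt -- a sketch, not verified against any particular Spark release.
// Replace "1.6.0" with the Spark version your package builds against.
libraryDependencies ++= Seq(
  // The regular mllib artifact, provided by the Spark runtime.
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "provided",
  // The test jar, which -- if published -- would contain classes from
  // mllib/src/test such as org.apache.spark.mllib.util.TestingUtils.
  "org.apache.spark" %% "spark-mllib" % "1.6.0" % "test" classifier "tests"
)
```

If no such classified artifact exists for the version in question, this resolves to a "not found" error at update time, which at least confirms quickly whether the test jar is available.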
Thanks in advance for any light you can shed on this problem.

Robert Dodier