If you're running Hadoop as well, you might be able to use the Hortonworks distribution now that Hortonworks supports Spark.
Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Thu, Apr 16, 2015 at 2:19 PM, Arun Lists <lists.a...@gmail.com> wrote:

> Thanks, Matei! We'll try that and let you know if it works. You are
> correct in inferring that some of the problems we had were with
> dependencies.
>
> We also had problems with the spark-submit scripts. I will get the details
> from the engineer who worked on the Windows builds and provide them to you.
>
> arun
>
> On Thu, Apr 16, 2015 at 10:44 AM, Matei Zaharia <matei.zaha...@gmail.com>
> wrote:
>
>> You could build Spark with Scala 2.11 on Mac / Linux and transfer it over
>> to Windows. AFAIK it should build on Windows too; the only problem is that
>> Maven might take a long time to download dependencies. What errors are you
>> seeing?
>>
>> Matei
>>
>> > On Apr 16, 2015, at 9:23 AM, Arun Lists <lists.a...@gmail.com> wrote:
>> >
>> > We run Spark on Mac and Linux but also need to run it on Windows 8.1
>> and Windows Server. We ran into problems with the Scala 2.10 binary bundle
>> for Spark 1.3.0 but managed to get it working. However, on Mac/Linux we
>> are on Scala 2.11.6 (we built Spark from source). On Windows, however,
>> despite our best efforts we cannot get Spark 1.3.0, as built from source,
>> working with Scala 2.11.6. Spark has too many moving parts and dependencies!
>> >
>> > When can we expect to see a binary bundle for Spark 1.3.0 that is built
>> for Scala 2.11.6? I read somewhere that the only reason Spark 1.3.0
>> is still built for Scala 2.10 is that Kafka is still on Scala 2.10. For
>> those of us who don't use Kafka, can we have a Scala 2.11 bundle?
>> >
>> > If there isn't an official bundle arriving any time soon, can someone
>> who has built it successfully for Windows 8.1 please share it with the group?
>> >
>> > Thanks,
>> > arun
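
[Editor's note: for anyone following the thread, below is a minimal sketch of the "build with Scala 2.11 on Mac/Linux, then copy to Windows" approach Matei describes. It assumes the dev/change-version-to-2.11.sh helper and the scala-2.11 Maven profile present in the Spark 1.3.x source tree; script and profile names have changed between Spark releases, so check the "Building Spark" page for your exact version. The application class and jar names at the end are placeholders, not anything from this thread.]

    # From the root of a Spark 1.3.x source checkout, on Mac or Linux:

    # 1. Switch the build's POMs over to Scala 2.11.
    ./dev/change-version-to-2.11.sh

    # 2. Build against Scala 2.11, skipping tests to save time.
    mvn -Pscala-2.11 -DskipTests clean package

    # 3. Copy the built tree to the Windows machine, then launch jobs there
    #    with the .cmd variants of the scripts, e.g. (placeholder names):
    #    bin\spark-submit.cmd --class com.example.MyApp my-app.jar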