What was the build error? You didn't say, so are you sure the build actually succeeded? Try running from the Spark home directory, not bin. I know we do run tests on Windows and they appear to pass, etc.
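That NoSuchMethodError is the classic signature of a Guava version conflict: the three-argument checkArgument overload the trace wants was only added in newer Guava (20.0, as far as I know), so an older guava jar winning on the classpath produces exactly this failure. A quick plain-JDK probe (just a sketch, not part of Spark; class name and invocation are my own) can show which jar is being loaded:

```java
// GuavaProbe.java: report which jar Guava's Preconditions class loads from,
// and whether the overload from the stack trace is present.
// Run it with the same classpath spark-shell uses, e.g. java -cp "jars/*" GuavaProbe
public class GuavaProbe {
    public static void main(String[] args) {
        try {
            Class<?> pre = Class.forName("com.google.common.base.Preconditions");
            // Which jar actually won the classpath race.
            System.out.println("Preconditions loaded from: "
                    + pre.getProtectionDomain().getCodeSource().getLocation());
            // checkArgument(boolean, String, Object) exists only in newer Guava;
            // NoSuchMethodException here means an old Guava is on the classpath.
            pre.getMethod("checkArgument", boolean.class, String.class, Object.class);
            System.out.println("checkArgument(boolean, String, Object): present");
        } catch (ClassNotFoundException e) {
            System.out.println("Guava is not on the classpath");
        } catch (NoSuchMethodException e) {
            System.out.println("checkArgument(boolean, String, Object): MISSING (old Guava)");
        }
    }
}
```

If it reports the overload as missing, the fix is usually to find where the stale guava jar comes from (mvn dependency:tree) rather than anything Windows-specific.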
On Thu, Dec 5, 2019 at 3:28 PM Ping Liu <pingpinga...@gmail.com> wrote:
>
> Hello,
>
> I understand Spark is preferably built on Linux, but I have a Windows
> machine with only a slow VirtualBox VM for Linux, so I would like to be
> able to build and run Spark on Windows directly.
>
> Unfortunately, neither of these commands, both listed at
> http://spark.apache.org/docs/latest/building-spark.html#specifying-the-hadoop-version-and-enabling-yarn,
> works for me (I run them directly from the Spark root directory):
>
>     # Apache Hadoop 2.6.X
>     ./build/mvn -Pyarn -DskipTests clean package
>
>     # Apache Hadoop 2.7.X and later
>     ./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.3 -DskipTests clean package
>
> Then I tried:
>
>     mvn -Pyarn -Phadoop-3.2 -Dhadoop.version=3.2.1 -DskipTests clean package
>
> Now the build works, but when I run spark-shell I get the following error:
>
>     D:\apache\spark\bin>spark-shell
>     Exception in thread "main" java.lang.NoSuchMethodError:
>     com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
>         at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
>         at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
>         at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopHiveConfigurations(SparkHadoopUtil.scala:456)
>         at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration(SparkHadoopUtil.scala:427)
>         at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$2(SparkSubmit.scala:342)
>         at org.apache.spark.deploy.SparkSubmit$$Lambda$132/817978763.apply(Unknown Source)
>         at scala.Option.getOrElse(Option.scala:189)
>         at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:342)
>         at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:871)
>         at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>         at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>         at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
>         at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> Has anyone built and run the Spark source code successfully on Windows?
> Could you please share your experience?
>
> Thanks a lot!
>
> Ping

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
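For reference, a separate stumbling block when launching Spark on Windows (distinct from the Guava mismatch in the trace above) is Hadoop's winutils.exe requirement. A typical cmd session looks roughly like this; the D:\ paths and HADOOP_HOME location are placeholders of mine, not from the thread:

```shell
:: Windows cmd sketch (paths are placeholders). Hadoop's filesystem code on
:: Windows needs winutils.exe; HADOOP_HOME must point at a directory whose
:: bin\ contains a winutils.exe matching the Hadoop version Spark was built against.
set HADOOP_HOME=D:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%

:: Launch from the Spark root directory, not from bin\.
cd /d D:\apache\spark
bin\spark-shell
```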