works OK:
mvn -Pdeb -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Phive -DskipTests clean package
am I running the correct sbt command?
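For reference, a minimal sanity check, assuming a standard Spark source layout (the `/opt/spark` fallback path is a placeholder, not from this thread), that the built tree actually contains the PySpark sources the error complains about:

```shell
# Confirm the Spark tree contains the python/pyspark sources that ship
# with the source distribution; if they are missing, no build command
# will make "import pyspark" work.
check_pyspark_sources() {
    if [ -d "$1/python/pyspark" ]; then
        echo "pyspark sources found"
    else
        echo "pyspark sources missing"
    fi
}

check_pyspark_sources "${SPARK_HOME:-/opt/spark}"
```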
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/No-module-named-pyspark-latest-built-tp18740p18787.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
Forgot to mention that this setup works in Spark standalone mode; the problem only occurs when I run on YARN.
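A "No module named pyspark" failure on YARN usually means the executors' Python cannot see Spark's Python sources. A minimal sketch, assuming a conventional Spark layout (the helper name and the `/opt/spark` fallback are illustrative, not from this thread), of assembling the PYTHONPATH entries PySpark needs: the `python/` directory plus the bundled py4j zip.

```shell
# Build a PYTHONPATH value pointing at Spark's Python sources and the
# py4j zip it bundles (the py4j version varies by Spark release, so
# glob for it rather than hard-coding a filename).
pyspark_pythonpath() {
    printf '%s' "$1/python"
    for zip in "$1"/python/lib/py4j-*-src.zip; do
        [ -e "$zip" ] && printf ':%s' "$zip"
    done
}

spark_home="${SPARK_HOME:-/opt/spark}"
export PYTHONPATH="$(pyspark_pythonpath "$spark_home")${PYTHONPATH:+:$PYTHONPATH}"
echo "$PYTHONPATH"
```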
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/No-module-named-pyspark-latest-built-tp18740p18777.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.