[ https://issues.apache.org/jira/browse/SPARK-23417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16368031#comment-16368031 ]
Bruce Robbins commented on SPARK-23417:
---------------------------------------

This does the trick:
{noformat}
build/sbt -Pkafka-0-8 assembly/package streaming-kafka-0-8-assembly/assembly
{noformat}
There are also errant instructions for building a Flume assembly jar. In that case, the following works:
{noformat}
build/sbt -Pflume assembly/package streaming-flume-assembly/assembly
{noformat}
I can submit a PR to fix these messages.

By the way, the above is just for the pyspark-streaming tests. The pyspark-sql tests have similar build requirements (e.g., at least one test needs a build with Hive profiles, and udf.py needs /sql/core/target/scala-2.11/test-classes/test/org/apache/spark/sql/JavaStringLength.class to exist). The pyspark-sql tests don't check for these requirements; they just throw exceptions. But I won't address that here.

> pyspark tests give wrong sbt instructions
> -----------------------------------------
>
> Key: SPARK-23417
> URL: https://issues.apache.org/jira/browse/SPARK-23417
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 2.4.0
> Reporter: Jose Torres
> Priority: Minor
>
> When running python/run-tests, the script indicates that I must run
> "'build/sbt assembly/package streaming-kafka-0-8-assembly/assembly' or
> 'build/mvn -Pkafka-0-8 package'". The sbt command fails:
>
> [error] Expected ID character
> [error] Not a valid command: streaming-kafka-0-8-assembly
> [error] Expected project ID
> [error] Expected configuration
> [error] Expected ':' (if selecting a configuration)
> [error] Expected key
> [error] Not a valid key: streaming-kafka-0-8-assembly
> [error] streaming-kafka-0-8-assembly/assembly

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
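For reference, a minimal sketch of the full build-and-test workflow the comment describes, assuming a Spark 2.x source checkout (the profile flag is the fix being discussed; the --modules option to python/run-tests is how the streaming test group is selected):

```shell
# The sbt subproject streaming-kafka-0-8-assembly only exists when the
# kafka-0-8 profile is enabled; without -Pkafka-0-8, sbt fails with
# "Not a valid key: streaming-kafka-0-8-assembly".
build/sbt -Pkafka-0-8 assembly/package streaming-kafka-0-8-assembly/assembly

# Same pattern for the Flume assembly jar:
build/sbt -Pflume assembly/package streaming-flume-assembly/assembly

# Then run only the streaming Python tests:
python/run-tests --modules=pyspark-streaming
```

These are command fragments to run from the repository root, not a standalone script; the Maven equivalent mentioned in the issue would be build/mvn -Pkafka-0-8 package.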