[ https://issues.apache.org/jira/browse/SPARK-3521?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Radim Kolar closed SPARK-3521.
------------------------------
       Resolution: Not a Problem
    Fix Version/s: 1.1.1

The compile problem is fixed on the GitHub branch-1.1.

> Missing modules in 1.1.0 source distribution - can't be built with Maven
> -------------------------------------------------------------------------
>
>                 Key: SPARK-3521
>                 URL: https://issues.apache.org/jira/browse/SPARK-3521
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.1.0
>            Reporter: Radim Kolar
>            Priority: Minor
>             Fix For: 1.1.1
>
> The modules {{bagel}}, {{mllib}}, {{flume-sink}} and {{flume}} are missing from the source distribution, so Spark can't be built with Maven. It can't be built with {{sbt/sbt}} either, due to another bug (_java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-api;1.6.1_).
>
> {noformat}
> (hsn@sanatana:pts/6):work/spark-1.1.0% mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.1 -DskipTests clean package
> [INFO] Scanning for projects...
> [ERROR] The build could not read 1 project -> [Help 1]
> [ERROR]
> [ERROR] The project org.apache.spark:spark-parent:1.1.0 (/home/hsn/myports/spark11/work/spark-1.1.0/pom.xml) has 4 errors
> [ERROR] Child module /home/hsn/myports/spark11/work/spark-1.1.0/bagel of /home/hsn/myports/spark11/work/spark-1.1.0/pom.xml does not exist
> [ERROR] Child module /home/hsn/myports/spark11/work/spark-1.1.0/mllib of /home/hsn/myports/spark11/work/spark-1.1.0/pom.xml does not exist
> [ERROR] Child module /home/hsn/myports/spark11/work/spark-1.1.0/external/flume of /home/hsn/myports/spark11/work/spark-1.1.0/pom.xml does not exist
> [ERROR] Child module /home/hsn/myports/spark11/work/spark-1.1.0/external/flume-sink/pom.xml of /home/hsn/myports/spark11/work/spark-1.1.0/pom.xml does not exist
> {noformat}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
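Since the close note says the compile problem is fixed on the GitHub branch-1.1, a minimal sketch of picking up the fixed sources and rebuilding with the same profiles the reporter used (assumes git and Maven are installed; the apache/spark GitHub mirror URL is an assumption):

```shell
# Sketch: check out the branch-1.1 line, where the child modules
# (bagel, mllib, external/flume, external/flume-sink) are present,
# instead of the incomplete 1.1.0 source tarball.
git clone --branch branch-1.1 --single-branch https://github.com/apache/spark.git
cd spark
# Same invocation as in the report above.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.1 -DskipTests clean package
```

With the modules present, the Maven reactor can resolve every `<module>` entry in the parent pom.xml, so the "Child module ... does not exist" errors no longer occur.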