Re: Assembly jar file name does not match profile selection
Hi Alessandro,

It's fixed by SPARK-3787 and will be applied to 1.2.1 and 1.3.0.
https://issues.apache.org/jira/browse/SPARK-3787

- Kousuke

(2014/12/27 11:15), Alessandro Baretta wrote:
> I am building Spark with sbt off of branch 1.2. I'm using the following
> command:
>
>     sbt/sbt -Pyarn -Phadoop-2.3 assembly
>
> (http://spark.apache.org/docs/latest/building-spark.html#building-with-sbt)
>
> Although the jar file I obtain does contain the proper version of the
> Hadoop libraries (v. 2.4), the assembly jar file name refers to Hadoop
> v1.0.4:
>
>     ./assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
>
> Any idea why?
>
> Alex
Unsupported Catalyst types in Parquet
Michael,

I'm having trouble storing my SchemaRDDs in Parquet format with Spark SQL, because my RDDs have DateType and DecimalType fields. What would it take to add Parquet support for these Catalyst types? Are there any other Catalyst types for which there is no Parquet support?

Alex
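Until native support lands, one common workaround for gaps like this is to map the unsupported types onto primitives the Parquet writer does accept: dates as epoch days, decimals as unscaled longs with a fixed, agreed-upon scale. The sketch below is a generic JVM illustration of those two mappings, not Spark or Parquet API code; the class and method names are hypothetical.

```java
import java.math.BigDecimal;
import java.time.LocalDate;

// Hypothetical illustration (not Spark code): representing dates and decimals
// with primitive types that any columnar format can store.
public class TypeWorkarounds {

    // DateType workaround: store the date as days since 1970-01-01.
    static int dateToEpochDay(LocalDate d) {
        return (int) d.toEpochDay();
    }

    static LocalDate epochDayToDate(int days) {
        return LocalDate.ofEpochDay(days);
    }

    // DecimalType workaround: store the unscaled value as a long, with the
    // scale fixed by convention in the schema (here, 2 fractional digits).
    static final int SCALE = 2;

    static long decimalToLong(BigDecimal d) {
        return d.setScale(SCALE).unscaledValue().longValueExact();
    }

    static BigDecimal longToDecimal(long unscaled) {
        return BigDecimal.valueOf(unscaled, SCALE);
    }

    public static void main(String[] args) {
        LocalDate d = LocalDate.of(2014, 12, 27);
        int days = dateToEpochDay(d);
        System.out.println(days + " -> " + epochDayToDate(days));

        BigDecimal price = new BigDecimal("19.99");
        long raw = decimalToLong(price);      // unscaled: 1999 at scale 2
        System.out.println(raw + " -> " + longToDecimal(raw));
    }
}
```

Both mappings round-trip losslessly as long as every reader and writer agrees on the scale; fixing the scale per column is exactly what a schema annotation would do once real DecimalType support exists.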
Re: Assembly jar file name does not match profile selection
Here's what I get:

    ./assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop2.6.0.jar

Alex

On Fri, Dec 26, 2014 at 8:41 PM, Ted Yu wrote:
> Can you try this command?
>
>     sbt/sbt -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive assembly
>
> On Fri, Dec 26, 2014 at 6:15 PM, Alessandro Baretta wrote:
>
>> I am building Spark with sbt off of branch 1.2. I'm using the following
>> command:
>>
>>     sbt/sbt -Pyarn -Phadoop-2.3 assembly
>>
>> (http://spark.apache.org/docs/latest/building-spark.html#building-with-sbt)
>>
>> Although the jar file I obtain does contain the proper version of the
>> Hadoop libraries (v. 2.4), the assembly jar file name refers to Hadoop
>> v1.0.4:
>>
>>     ./assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
>>
>> Any idea why?
>>
>> Alex
Re: SQLContext is Serializable, SparkContext is not
The SparkContext reference is transient.

On Fri, Dec 26, 2014 at 6:11 PM, Alessandro Baretta wrote:
> How, O how can this be? Doesn't the SQLContext hold a reference to the
> SparkContext?
>
> Alex
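The mechanism behind this answer can be shown with plain Java serialization. This is a minimal sketch with made-up class names, not Spark's actual SQLContext code: a field marked `transient` is skipped when the object is serialized and comes back as `null` after deserialization, so a class can be Serializable even though it holds a reference to a non-serializable object.

```java
import java.io.*;

public class TransientDemo {

    // Stand-in for SparkContext: deliberately NOT Serializable.
    static class HeavyContext {
        final String name;
        HeavyContext(String name) { this.name = name; }
    }

    // Stand-in for SQLContext: Serializable despite referencing HeavyContext,
    // because the reference is transient and thus never written out.
    static class LightContext implements Serializable {
        transient HeavyContext ctx;   // skipped by serialization
        final String label;
        LightContext(HeavyContext ctx, String label) {
            this.ctx = ctx;
            this.label = label;
        }
    }

    // Serialize to a byte array and read the object back.
    static LightContext roundTrip(LightContext in) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(in);   // succeeds: the transient field is ignored
        }
        try (ObjectInputStream is =
                 new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
            return (LightContext) is.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        LightContext copy = roundTrip(new LightContext(new HeavyContext("sc"), "sql"));
        System.out.println(copy.label);        // non-transient state survives
        System.out.println(copy.ctx == null);  // transient reference is gone
    }
}
```

This also explains the practical consequence: code that runs on executors after deserialization cannot rely on the transient reference being there, which is why the SparkContext itself is only usable on the driver.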
Re: Assembly jar file name does not match profile selection
Can you try this command?

    sbt/sbt -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive assembly

On Fri, Dec 26, 2014 at 6:15 PM, Alessandro Baretta wrote:
> I am building Spark with sbt off of branch 1.2. I'm using the following
> command:
>
>     sbt/sbt -Pyarn -Phadoop-2.3 assembly
>
> (http://spark.apache.org/docs/latest/building-spark.html#building-with-sbt)
>
> Although the jar file I obtain does contain the proper version of the
> Hadoop libraries (v. 2.4), the assembly jar file name refers to Hadoop
> v1.0.4:
>
>     ./assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar
>
> Any idea why?
>
> Alex
Assembly jar file name does not match profile selection
I am building Spark with sbt off of branch 1.2. I'm using the following command:

    sbt/sbt -Pyarn -Phadoop-2.3 assembly

(http://spark.apache.org/docs/latest/building-spark.html#building-with-sbt)

Although the jar file I obtain does contain the proper version of the Hadoop libraries (v. 2.4), the assembly jar file name refers to Hadoop v1.0.4:

    ./assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar

Any idea why?

Alex
SQLContext is Serializable, SparkContext is not
How, O how can this be? Doesn't the SQLContext hold a reference to the SparkContext? Alex
SQL specification for reference during Spark SQL development
Do we have access to the SQL specification (say, SQL-92) for reference during Spark SQL development? I know it's not freely available on the web. Usually, you can only access drafts. I know that, generally, we look to other systems (especially Hive) when figuring out how something in Spark SQL should work, but it might be nice to have the standard available for reference. Nick
Re: Spark 1.2.0 Repl
It was not intended to be a public API, but there is a request to keep publishing it as a developer API:
https://issues.apache.org/jira/browse/SPARK-4923

On Dec 26, 2014 2:09 PM, "Dirceu Semighini Filho" <dirceu.semigh...@gmail.com> wrote:
> Hello,
> Is there any reason for not publishing the Spark REPL in version 1.2.0?
> In repl/pom.xml, the deploy and publish steps are being skipped.
>
> Regards,
> Dirceu
Spark 1.2.0 Repl
Hello,

Is there any reason for not publishing the Spark REPL in version 1.2.0? In repl/pom.xml, the deploy and publish steps are being skipped.

Regards,
Dirceu