org.apache.spark.sql.sources.DDLException: Unsupported dataType: [1.1] failure: ``varchar'' expected but identifier char found in spark-sql

2015-02-16 Thread Qiuzhuang Lian
Hi, I am not sure this has been reported already or not, I run into this error under spark-sql shell as build from newest of spark git trunk, spark-sql describe qiuzhuang_hcatlog_import; 15/02/17 14:38:36 ERROR SparkSQLDriver: Failed in [describe qiuzhuang_hcatlog_import]

Too many open files error

2014-11-19 Thread Qiuzhuang Lian
Hi All, while doing some ETL I ran into a 'Too many open files' error, as in the following logs. Thanks, Qiuzhuang 14/11/20 20:12:02 INFO collection.ExternalAppendOnlyMap: Thread 63 spilling in-memory map of 100.8 KB to disk (953 times so far) 14/11/20 20:12:02 ERROR storage.DiskBlockObjectWriter:

src/main/resources/kv1.txt not found in example of HiveFromSpark

2014-11-04 Thread Qiuzhuang Lian
When running the HiveFromSpark example via the run-example shell, I got this error: FAILED: SemanticException Line 1:23 Invalid path ''src/main/resources/kv1.txt'': No files matching path file:/home/kand/javaprojects/spark/src/main/resources/kv1.txt == END HIVE FAILURE OUTPUT
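The relative path `src/main/resources/kv1.txt` only resolves against the current working directory, so the lookup fails unless the shell is launched from the directory containing that subtree. A hedged sketch of the usual workaround, building the path from an explicit root instead (assuming, as the error suggests, the file lives under the Spark checkout; `Kv1Path` is a hypothetical helper, not part of Spark):

```scala
import java.nio.file.Paths

// Hypothetical helper: resolve kv1.txt against an explicit Spark home
// rather than the working directory, which is what makes the example's
// relative path fail when run from elsewhere.
object Kv1Path {
  def resolve(sparkHome: String): String =
    Paths.get(sparkHome, "examples", "src", "main", "resources", "kv1.txt")
      .toString

  def main(args: Array[String]): Unit =
    println(resolve("/home/kand/javaprojects/spark"))
}
```

Note the `examples/` segment is an assumption about the source-tree layout; adjust it to wherever kv1.txt actually sits in your checkout.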

serialVersionUID incompatible error in class BlockManagerId

2014-10-24 Thread Qiuzhuang Lian
Hi, I updated from git today, and when connecting to the Spark cluster I got the serialVersionUID incompatible error in class BlockManagerId. Here is the log. Shouldn't we give BlockManagerId a constant serialVersionUID to avoid this? Thanks, Qiuzhuang scala val rdd = sc.parallelize(1 to
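The suggested fix can be sketched as follows (hedged: `NodeId` is a stand-in class for illustration, not Spark's `BlockManagerId`). Pinning `serialVersionUID` explicitly means recompiled builds keep the same stream version, so Java serialization does not reject objects across a rebuild:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream,
  ObjectInputStream, ObjectOutputStream}

// Stand-in for BlockManagerId: pinning serialVersionUID keeps serialized
// instances compatible even when the class is recompiled.
@SerialVersionUID(1L)
case class NodeId(host: String, port: Int)

object SerialUidDemo {
  // Serialize and deserialize an instance through Java serialization.
  def roundTrip(id: NodeId): NodeId = {
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(id)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    in.readObject().asInstanceOf[NodeId]
  }

  def main(args: Array[String]): Unit =
    println(roundTrip(NodeId("localhost", 7077)) == NodeId("localhost", 7077))
}
```

Without the annotation, the JVM derives the UID from class internals, so even a compatible-looking rebuild can change it and trigger exactly this `InvalidClassException`.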

Re: serialVersionUID incompatible error in class BlockManagerId

2014-10-24 Thread Qiuzhuang Lian
this issue, I’d check that you’ve run the “package” and “assembly” phases and that your Spark cluster is using this updated version. - Josh On October 24, 2014 at 6:17:26 PM, Qiuzhuang Lian (qiuzhuang.l...@gmail.com) wrote: Hi, I updated from git today and when connecting to spark cluster, I got

Re: Run ScalaTest inside Intellij IDEA

2014-06-11 Thread Qiuzhuang Lian
) at com.martiansoftware.nailgun.NGSession.run(NGSession.java:319) On Jun 11, 2014, at 11:17 AM, Qiuzhuang Lian qiuzhuang.l...@gmail.com wrote: I also run into this problem when running examples in IDEA. The issue looks to be that it depends on too many jars, and the classpath seems to have a length limit. So I

How to add user local repository defined in localRepository in settings.xml into Spark SBT build

2014-06-05 Thread Qiuzhuang Lian
Hi, I customized the localRepository tag in MVN_HOME/conf/settings.xml to manage my local Maven jars: <localRepository>F:/Java/maven-build/.m2/repository</localRepository> However, when I build Spark with SBT, it seems that it still uses the default .m2 repository under Path.userHome + /.m2/repository. How
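SBT does not read Maven's settings.xml on its own, so the custom location has to be registered with SBT directly. A hedged sketch of one way to do that in a build definition (the resolver name is arbitrary, and the path simply mirrors the settings.xml value above; whether the era's Spark SBT build picks this up unmodified is an assumption):

```scala
// Hypothetical build.sbt fragment: expose the custom local Maven
// repository from settings.xml as an sbt resolver, since sbt ignores
// Maven's settings.xml by default.
resolvers += "custom-local-m2" at "file:///F:/Java/maven-build/.m2/repository"
```

An alternative is overriding the Ivy/SBT home via JVM properties (e.g. `-Dsbt.ivy.home=...`) rather than touching the build file.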