This is how the table was created:
transactions = parts.map(lambda p: Row(customer_id=long(p[0]),
    chain=int(p[1]), dept=int(p[2]), category=int(p[3]), company=int(p[4]),
    brand=int(p[5]), date=str(p[6]), productsize=float(p[7]),
    productmeasure=str(p[8]), purchasequantity=int(p[9]),
    purchaseamount=float(p[10])))
How was the table created? Would you mind sharing the related code? It
seems that the underlying type of the |customer_id| field is actually
long, but the schema says it is integer, so this is basically a type
mismatch error.
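One way to avoid the cast error is to make the parsed type match the declared schema: either declare |customer_id| as LongType in an explicit schema, or parse every field consistently. A minimal sketch of the parsing side, with the field layout assumed from the snippet above (Spark itself is omitted so the example stays self-contained; the sample record is made up):

```python
# Minimal sketch: parse one CSV record into consistently typed fields.
# customer_id is kept as a Python int (unbounded), which should be paired
# with LongType in an explicit schema rather than an inferred IntegerType.
def parse_transaction(line):
    p = line.split(",")
    return {
        "customer_id": int(p[0]),       # 64-bit IDs: declare LongType
        "chain": int(p[1]),
        "dept": int(p[2]),
        "category": int(p[3]),
        "company": int(p[4]),
        "brand": int(p[5]),
        "date": str(p[6]),
        "productsize": float(p[7]),
        "productmeasure": str(p[8]),
        "purchasequantity": int(p[9]),
        "purchaseamount": float(p[10]),
    }

# Hypothetical record with an ID larger than a 32-bit integer can hold.
row = parse_transaction(
    "98468631867,4,26,2606,104460040,7996,2013-03-01,64.0,OZ,1,5.29")
```

The key point is that the ID above overflows a 32-bit integer, so a schema that declares it as IntegerType will fail at unboxing time exactly as in the stack trace below.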
The first query succeeds because |SchemaRDD.count()| is translated to
something that only counts rows and never deserializes the |customer_id|
values, so the bad cast is not triggered.
Hi Cheng,
I am using Spark 1.1.0.
This is the stack trace:
14/10/10 12:17:40 WARN TaskSetManager: Lost task 120.0 in stage 7.0 (TID
2235, spark-w-0.c.db.internal): java.lang.ClassCastException: java.lang.Long
cannot be cast to java.lang.Integer
scala.runtime.BoxesRunTime.unboxToInt(Boxes
I am using the Python API. Unfortunately, I cannot find an equivalent of
the isCached method in the SQLContext section of the documentation:
https://spark.apache.org/docs/1.1.0/api/python/index.html
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-Except
Hi Poiuytrez, what version of Spark are you using? Exception details
like the stack trace are really needed to investigate this issue. You can
find them in the executor logs, or just browse the application
stderr/stdout links from the Spark Web UI.
On 10/9/14 9:37 PM, poiuytrez wrote:
Hello,
I have a
Can you try checking whether the table is being cached? You can use the
isCached method. More details are here -
http://spark.apache.org/docs/1.0.2/api/java/org/apache/spark/sql/SQLContext.html