Hi Nashri,

This error comes from the Spark native JDBC connector as well. I'm
currently looking into it; let me check.

In the meantime, can you open a JIRA for this?
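
One possible interim workaround, assuming the root cause here is that MySQL is
parsing the double-quoted column names the Spark JDBC connector emits (e.g.
SELECT "id", "name", ...) as string literals rather than identifiers, so the
driver receives the literal string 'id' and ResultSet.getInt() fails on it. If
that is the cause, enabling ANSI_QUOTES on the MySQL server makes double quotes
behave as identifier quotes:

```sql
-- Assumption: without ANSI_QUOTES, MySQL reads "id" as the string
-- literal 'id' instead of a column name, which then fails in getInt().
-- ANSI_QUOTES tells MySQL to treat double-quoted names as identifiers.
SET GLOBAL sql_mode = CONCAT(@@sql_mode, ',ANSI_QUOTES');
```

Note that SET GLOBAL only affects new connections, so the DAS server's
connection pool would need to be re-established for it to take effect.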

Best,

On Thu, Aug 27, 2015 at 12:20 PM, Aaquibah Nashry <[email protected]> wrote:

> Hi,
>
> I created a datasource (RDBMS) and I am trying to retrieve data in that to
> the DAL. I was able to create the temporary tables using:
>
>
>
> *create temporary table tempSparkTable using CarbonJDBC options
> (dataSource "localDB", tableName "test123");*
>
> *CREATE TEMPORARY TABLE tempDASTable USING CarbonAnalytics OPTIONS
> (tableName "dasTemp", schema "id INT, name STRING, countN INT");*
>
> When I tried to insert values using the following command:
>
> *insert overwrite table tempDASTable select id, name, countN from
> tempSparkTable;*
>
> I got the following error:
>
> *ERROR:* Job aborted due to stage failure: Task 0 in stage 1.0 failed 1
> times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1, localhost):
> java.sql.SQLException: Invalid value for getInt() - 'id'
>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1094)
>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:997)
>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:983)
>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:928)
>     at com.mysql.jdbc.ResultSetImpl.getInt(ResultSetImpl.java:2758)
>     at org.apache.spark.sql.jdbc.JDBCRDD$$anon$1.getNext(JDBCRDD.scala:416)
>     at org.apache.spark.sql.jdbc.JDBCRDD$$anon$1.hasNext(JDBCRDD.scala:472)
>     at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
>     at org.wso2.carbon.analytics.spark.core.sources.AnalyticsWritingFunction.apply(AnalyticsWritingFunction.java:72)
>     at org.wso2.carbon.analytics.spark.core.sources.AnalyticsWritingFunction.apply(AnalyticsWritingFunction.java:41)
>     at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:878)
>     at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:878)
>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
>     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1767)
>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
>     at org.apache.spark.scheduler.Task.run(Task.scala:70)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:745)
>
>
> I get the same error when I try to run the following command in the
> DAS console:
>
> *select * from tempSparkTable*
>
>
> Kindly provide assistance to overcome this issue.
> Thanks
>
> Regards,
>
> M.R.Aaquibah Nashry
> *Intern, Engineering**| **WSO2, Inc.*
> Mobile : +94 773946123
> Tel      : +94 112662541
> Email : [email protected] <[email protected]>
>



-- 
*Niranda Perera*
Software Engineer, WSO2 Inc.
Mobile: +94-71-554-8430
Twitter: @n1r44 <https://twitter.com/N1R44>
https://pythagoreanscript.wordpress.com/
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev
