Hi Inosh,

Thanks for the info. As you pointed out, specifying the DB2 type as DB2/NT
fixed it: the error is gone and the statistics are published successfully.

Thanks

On Tue, Nov 24, 2015 at 11:06 AM, Inosh Goonewardena <in...@wso2.com> wrote:

> Hi Lakshman,
>
> There was an issue [1] and it has been fixed now. As a workaround, can
> you try specifying the exact DB2 database type instead of DB* in
> rdbms-config.xml?
>
> [1] https://wso2.org/jira/browse/DAS-311
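
[Editor's note] The workaround above amounts to making the name attribute of the matching database entry in rdbms-config.xml (under the DAS configuration directory) match the product name the JDBC driver actually reports; DB2 on Windows reports "DB2/NT". The snippet below is only an illustrative sketch, not the exact DAS schema: the child elements of `<database>` are elided and should be copied from the stock file.

```xml
<!-- Illustrative sketch only; copy the real element layout from the
     rdbms-config.xml shipped with DAS. The key point is the name
     attribute, which must match the product name reported by the
     JDBC driver (DB2 on Windows reports "DB2/NT"). -->
<rdbms-configuration>
   <database name="DB2/NT">
      <!-- per-database query templates go here, as in the stock file -->
   </database>
</rdbms-configuration>
```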
>
>
> On Tuesday, November 24, 2015, Lakshman Udayakantha <lakshm...@wso2.com>
> wrote:
>
>> Hi,
>>
>> I configured APIM with DAS to publish runtime statistics to DB2. When
>> the Spark query runs via the cron job, I get the error below.
>>
>> [2015-11-24 10:15:00,007]  INFO {org.wso2.carbon.analytics.spark.core.AnalyticsTask} -  Executing the schedule task for: APIM_STAT_SCRIPT for tenant id: -1234
>> [2015-11-24 10:16:07,353] ERROR {org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter} -  Error in executing task: None.get
>> java.lang.RuntimeException: None.get
>>     at org.apache.spark.sql.jdbc.carbon.JDBCRelation.insert(JDBCRelation.scala:193)
>>     at org.apache.spark.sql.sources.InsertIntoDataSource.run(commands.scala:53)
>>     at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
>>     at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
>>     at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
>>     at org.apache.spark.sql.execution.SparkPlan$anonfun$execute$1.apply(SparkPlan.scala:88)
>>     at org.apache.spark.sql.execution.SparkPlan$anonfun$execute$1.apply(SparkPlan.scala:88)
>>     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>>     at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
>>     at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:950)
>>     at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:950)
>>     at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
>>     at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:128)
>>     at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>>     at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)
>>     at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:710)
>>     at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:692)
>>     at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:199)
>>     at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:149)
>>     at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:57)
>>     at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
>>     at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>>     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>     at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.util.NoSuchElementException: None.get
>>     at scala.None$.get(Option.scala:313)
>>     at scala.None$.get(Option.scala:311)
>>     at org.apache.spark.sql.jdbc.carbon.JdbcUtils$.getQueryConfigEntry(JdbcUtils.scala:69)
>>     at org.apache.spark.sql.jdbc.carbon.JdbcUtils$.tableExists(JdbcUtils.scala:45)
>>     at org.apache.spark.sql.jdbc.carbon.JDBCRelation.insert(JDBCRelation.scala:170)
>>     ... 26 more
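
[Editor's note] The root cause in the trace above is Scala's None.get: JdbcUtils.getQueryConfigEntry looks up a query configuration entry for the configured database type, finds none, and calling .get on the resulting None throws NoSuchElementException. A minimal sketch of that failure mode using Java's Optional (the class and variable names here are illustrative, not DAS code):

```java
import java.util.NoSuchElementException;
import java.util.Optional;

public class NoneGetDemo {
    public static void main(String[] args) {
        // Hypothetical stand-in for the DAS lookup: searching the configured
        // database entries for a product name that is not listed yields an
        // empty Optional (the Java analogue of Scala's None).
        Optional<String> queryConfigEntry = Optional.empty();
        try {
            // .get() on an empty Optional mirrors Scala's None.get, which is
            // exactly what getQueryConfigEntry hit in the trace above.
            queryConfigEntry.get();
        } catch (NoSuchElementException e) {
            System.out.println("caught: " + e.getMessage()); // prints "caught: No value present"
        }
    }
}
```

This is why the error disappears once the configured database name matches an existing entry: the lookup then returns a present value and .get succeeds.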
>>
>>
>> I debugged the analytics code and found that this line throws the error:
>>
>> this.sqlCtx.sql(query);
>>
>> when running the query below:
>>
>> INSERT OVERWRITE TABLE X1234_APIRequestSummaryData SELECT
>> api,api_version,version,apiPublisher,consumerKey,userId,context,max_request_time,total_request_count,hostName,year,month,day,time
>> FROM X1234_API_REQUEST_SUMMARY_FINAL
>>
>> Any idea on how to resolve this issue would be much appreciated.
>>
>> Thanks
>> --
>> Lakshman Udayakantha
>> WSO2 Inc. www.wso2.com
>> lean.enterprise.middleware
>> Mobile: 0714388124
>>
>>
>
> --
> Thanks & Regards,
>
> Inosh Goonewardena
> Associate Technical Lead- WSO2 Inc.
> Mobile: +94779966317
>
>


-- 
Lakshman Udayakantha
WSO2 Inc. www.wso2.com
lean.enterprise.middleware
Mobile: 0714388124
_______________________________________________
Dev mailing list
Dev@wso2.org
http://wso2.org/cgi-bin/mailman/listinfo/dev
