Hi Devs,

I was trying to publish APIM stats to DAS by following this blog post [1].

In the DAS console I got the following error:

[2015-11-04 12:10:08,163] ERROR {org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter} -  Error in executing task: Can't call commit when autocommit=true
java.lang.RuntimeException: Can't call commit when autocommit=true
    at org.apache.spark.sql.jdbc.carbon.JDBCRelation.insert(JDBCRelation.scala:193)
    at org.apache.spark.sql.sources.InsertIntoDataSource.run(commands.scala:53)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
    at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:950)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:950)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:128)
    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)


Immediately after that error, the following error appeared:

[2015-11-04 12:15:00,027] ERROR {org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter} -  Error in executing task: Error while connecting to datasource WSO2AM_STATS_DB : Table 'TP_WSO2AM_STATS_DB.API_REQUEST_SUMMARY' doesn't exist
java.lang.RuntimeException: Error while connecting to datasource WSO2AM_STATS_DB : Table 'TP_WSO2AM_STATS_DB.API_REQUEST_SUMMARY' doesn't exist
    at org.apache.spark.sql.jdbc.carbon.JDBCRelation.liftedTree1$1(JDBCRelation.scala:143)
    at org.apache.spark.sql.jdbc.carbon.JDBCRelation.<init>(JDBCRelation.scala:137)
    at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider.createRelation(JDBCRelation.scala:119)
    at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:269)
    at org.apache.spark.sql.sources.CreateTempTableUsing.run(ddl.scala:412)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
    at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:950)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:950)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
    at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:128)
    at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)


As the error message says, the 'TP_WSO2AM_STATS_DB.API_REQUEST_SUMMARY' table was gone!

*I added the relaxAutoCommit=true property to the JDBC connection string to work around the first error, and the stats feature then worked.*
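
For anyone hitting the same thing, here is a minimal standalone sketch of the driver behaviour (assuming MySQL Connector/J on the classpath; the host, port, and credentials are placeholders - only the TP_WSO2AM_STATS_DB schema name comes from the error above). Connector/J throws exactly this SQLException when commit() is called while autoCommit is true, and relaxAutoCommit=true tells the driver to tolerate such calls instead:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class RelaxAutoCommitDemo {
        public static void main(String[] args) throws SQLException {
            // Placeholder host/port/credentials - adjust to your environment.
            String base = "jdbc:mysql://localhost:3306/TP_WSO2AM_STATS_DB";

            try (Connection conn = DriverManager.getConnection(base, "user", "pass")) {
                // New connections start with autoCommit=true, so this throws:
                // java.sql.SQLException: Can't call commit when autocommit=true
                conn.commit();
            } catch (SQLException e) {
                System.out.println("Default driver settings: " + e.getMessage());
            }

            // With relaxAutoCommit=true the driver ignores the redundant commit()
            // instead of throwing, which is why the Spark task then succeeds.
            try (Connection conn = DriverManager.getConnection(
                    base + "?relaxAutoCommit=true", "user", "pass")) {
                conn.commit(); // tolerated, no exception
            }
        }
    }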

Is the table deletion a result of the first error (trying to commit when autocommit is enabled)?
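
What I suspect is something like the following - purely an illustrative sketch of the general autocommit pitfall, not the actual JDBCRelation.insert code. With autoCommit=true every statement is committed the moment it runs, so if the insert path drops or truncates the table first and then fails on the explicit commit(), the destructive step is already durable and there is no transaction left to roll it back:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class AutoCommitDropDemo {
        public static void main(String[] args) throws SQLException {
            // Placeholder host/port/credentials.
            String url = "jdbc:mysql://localhost:3306/TP_WSO2AM_STATS_DB";
            try (Connection conn = DriverManager.getConnection(url, "user", "pass");
                 Statement st = conn.createStatement()) {
                // Under autoCommit=true this DROP is committed immediately...
                st.executeUpdate("DROP TABLE IF EXISTS API_REQUEST_SUMMARY");
                // ...so when the subsequent commit() throws, the table is already
                // gone and nothing can undo the DROP.
                conn.commit(); // throws: Can't call commit when autocommit=true
            }
        }
    }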

Thanks
Rushmin




[1] -
http://blog.rukspot.com/2015/09/publishing-apim-runtime-statistics-to.html



-- 
*Rushmin Fernando*
*Technical Lead*

WSO2 Inc. <http://wso2.com/> - Lean . Enterprise . Middleware

email : rush...@wso2.com
mobile : +94772310855