Hi Jorge,

Did you recreate the table schemas in the RDBMS and check? Can you send the
stack trace of the error?

Thanks and Regards.

On Fri, Jan 15, 2016 at 9:03 PM, Jorge <[email protected]> wrote:

> Hi all.
>
> In my setup the error still arises, with
>
>         <defaultAutoCommit>false</defaultAutoCommit>
>
> set in the configuration and ?relaxAutoCommit=true appended to the
> connection URL.
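>
> For reference, a full MySQL connection URL with the parameter appended
> would look like the line below (host, port and database name here are
> placeholders, not values taken from this setup):
>
>         jdbc:mysql://localhost:3306/WSO2AM_STATS_DB?relaxAutoCommit=true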
>
> I set up another test environment with the same WSO2 DAS and the error is
> gone. This is very strange, because it still persists in my production
> environment.
>
> Any idea?
>
> Regards,
>                   Jorge.
>
> 2015-11-04 12:36 GMT-05:00 Niranda Perera <[email protected]>:
>
>> Hi Rushmin,
>>
>> Yes, what Sinthuja explained is correct. This is a known bug in the
>> current CarbonJDBC connector, and it is expected to be fixed in the
>> upcoming DAS patch release.
>>
>> rgds
>>
>> On Wed, Nov 4, 2015 at 9:46 PM, Rukshan Premathunga <[email protected]>
>> wrote:
>>
>>> Hi Rushmin/Sinthuja,
>>>
>>> I'll update the blog accordingly. Thanks for pointing out the issue and
>>> the solution.
>>>
>>> Thanks and Regards.
>>>
>>> On Wed, Nov 4, 2015 at 9:09 PM, Rushmin Fernando <[email protected]>
>>> wrote:
>>>
>>>> Yes, that explains it! Thanks Sinthuja :-)
>>>>
>>>> On Wed, Nov 4, 2015 at 3:36 PM, Sinthuja Ragendran <[email protected]>
>>>> wrote:
>>>>
>>>>> Hi Rushmin,
>>>>>
>>>>>
>>>>> On Wed, Nov 4, 2015 at 8:53 PM, Rushmin Fernando <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Hi Sinthuja,
>>>>>>
>>>>>> Yes, the relaxAutoCommit parameter was not there in the beginning,
>>>>>> and the table drop was probably due to that.
>>>>>>
>>>>>> Having the relaxAutoCommit in place and recreating the tables solved
>>>>>> the issue :-)
>>>>>>
>>>>>
>>>>> Great!
>>>>>
>>>>>
>>>>>>
>>>>>> Just wanted to know whether the auto-commit related error caused the
>>>>>> table drop.
>>>>>>
>>>>>
>>>>> No, the error didn't cause the table to drop; the default behaviour
>>>>> itself drops the table and creates it again. Due to the exception,
>>>>> however, only the drop was executed and the creation step never ran.
>>>>>
>>>>> Thanks,
>>>>> Sinthuja.
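The drop-then-recreate failure described above can be sketched as follows. This is a minimal illustration only: it uses Python's sqlite3 in place of MySQL, the column name is made up, and the RuntimeError merely stands in for the connector's "Can't call commit when autocommit=true" exception.

```python
import sqlite3

# The connector's default behaviour: DROP the summary table, then
# re-CREATE it. An exception raised between the two steps leaves the
# table missing for every later query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE API_REQUEST_SUMMARY (api TEXT, hits INTEGER)")

def drop_and_recreate(conn, fail_between):
    conn.execute("DROP TABLE API_REQUEST_SUMMARY")
    if fail_between:
        # stand-in for "Can't call commit when autocommit=true"
        raise RuntimeError("Can't call commit when autocommit=true")
    conn.execute("CREATE TABLE API_REQUEST_SUMMARY (api TEXT, hits INTEGER)")

def table_exists(conn):
    # sqlite_master lists all tables in the database
    return conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type = 'table' "
        "AND name = 'API_REQUEST_SUMMARY'"
    ).fetchone() is not None

try:
    drop_and_recreate(conn, fail_between=True)
except RuntimeError:
    pass

print(table_exists(conn))  # False: only the DROP half ran
```

This is why recreating the tables (and keeping relaxAutoCommit in place so the exception no longer fires) resolves the subsequent "table doesn't exist" errors.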
>>>>>
>>>>>>
>>>>>> Thank you for your clarification.
>>>>>>
>>>>>> Thanks
>>>>>> Rushmin
>>>>>>
>>>>>> On Wed, Nov 4, 2015 at 2:20 PM, Sinthuja Ragendran <[email protected]
>>>>>> > wrote:
>>>>>>
>>>>>>> Hi Rushmin,
>>>>>>>
>>>>>>> Did you create the table after encountering the exception and after
>>>>>>> adding the relaxAutoCommit parameter? During the exception the table
>>>>>>> could have been dropped, and the connector is now searching for a
>>>>>>> non-existent table. Hence I wanted to make sure that you have run the
>>>>>>> mysql script again and are still seeing this exception.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Sinthuja.
>>>>>>>
>>>>>>> On Wed, Nov 4, 2015 at 7:46 PM, Rushmin Fernando <[email protected]>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi Sinthuja,
>>>>>>>>
>>>>>>>> Thanks for the quick response.
>>>>>>>>
>>>>>>>> I did create the tables as per the instructions :-)
>>>>>>>>
>>>>>>>> But the 'API_REQUEST_SUMMARY' table was gone after the auto-commit
>>>>>>>> error, causing the subsequent error messages.
>>>>>>>>
>>>>>>>> I could reproduce it consistently.
>>>>>>>>
>>>>>>>> Thanks
>>>>>>>> Rushmin
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Nov 4, 2015 at 2:09 PM, Sinthuja Ragendran <
>>>>>>>> [email protected]> wrote:
>>>>>>>>
>>>>>>>>> Hi Rushmin,
>>>>>>>>>
>>>>>>>>> The Spark JDBC connector first drops the existing table and then
>>>>>>>>> recreates it, but for the very first run the table needs to exist
>>>>>>>>> already. Therefore, please create the tables in the database by
>>>>>>>>> running the mysql script [1] given in the blog post.
>>>>>>>>>
>>>>>>>>> [1]
>>>>>>>>> https://github.com/ruks/WSO2-APIM_DAS_Analytics_CApp/tree/v1.0.2/dbscripts
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Sinthuja.
>>>>>>>>>
>>>>>>>>> On Wed, Nov 4, 2015 at 7:24 PM, Rushmin Fernando <[email protected]
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> Hi Devs,
>>>>>>>>>>
>>>>>>>>>> I was trying to publish APIM stats to DAS following this blog [1].
>>>>>>>>>>
>>>>>>>>>> In the DAS console I got the following error.
>>>>>>>>>>
>>>>>>>>>> [2015-11-04 12:10:08,163] ERROR
>>>>>>>>>> {org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter} -  Error in
>>>>>>>>>> executing task: Can't call commit when autocommit=true
>>>>>>>>>> java.lang.RuntimeException: Can't call commit when autocommit=true
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.jdbc.carbon.JDBCRelation.insert(JDBCRelation.scala:193)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.sources.InsertIntoDataSource.run(commands.scala:53)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:950)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:950)
>>>>>>>>>> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
>>>>>>>>>> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:128)
>>>>>>>>>> at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>>>>>>>>>> at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Immediately after that error, the error below appeared.
>>>>>>>>>>
>>>>>>>>>> [2015-11-04 12:15:00,027] ERROR
>>>>>>>>>> {org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter} -  Error in
>>>>>>>>>> executing task: Error while connecting to datasource
>>>>>>>>>> WSO2AM_STATS_DB : Table 'TP_WSO2AM_STATS_DB.API_REQUEST_SUMMARY'
>>>>>>>>>> doesn't exist
>>>>>>>>>> java.lang.RuntimeException: Error while connecting to datasource
>>>>>>>>>> WSO2AM_STATS_DB : Table 'TP_WSO2AM_STATS_DB.API_REQUEST_SUMMARY'
>>>>>>>>>> doesn't exist
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.jdbc.carbon.JDBCRelation.liftedTree1$1(JDBCRelation.scala:143)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.jdbc.carbon.JDBCRelation.<init>(JDBCRelation.scala:137)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider.createRelation(JDBCRelation.scala:119)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:269)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.sources.CreateTempTableUsing.run(ddl.scala:412)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:950)
>>>>>>>>>> at
>>>>>>>>>> org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:950)
>>>>>>>>>> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
>>>>>>>>>> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:128)
>>>>>>>>>> at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
>>>>>>>>>> at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:755)
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> As the error message says, the
>>>>>>>>>> 'TP_WSO2AM_STATS_DB.API_REQUEST_SUMMARY' table was gone!
>>>>>>>>>>
>>>>>>>>>> *I added the relaxAutoCommit=true property to the JDBC connection
>>>>>>>>>> string to solve the first error, and the stats feature worked.*
>>>>>>>>>>
>>>>>>>>>> Is the table deletion a result of the first error (trying to
>>>>>>>>>> commit when auto-commit is enabled)?
>>>>>>>>>>
>>>>>>>>>> Thanks
>>>>>>>>>> Rushmin
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> [1] -
>>>>>>>>>> http://blog.rukspot.com/2015/09/publishing-apim-runtime-statistics-to.html
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> *Rushmin Fernando*
>>>>>>>>>> *Technical Lead*
>>>>>>>>>>
>>>>>>>>>> WSO2 Inc. <http://wso2.com/> - Lean . Enterprise . Middleware
>>>>>>>>>>
>>>>>>>>>> email : [email protected]
>>>>>>>>>> mobile : +94772310855
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> _______________________________________________
>>>>>>>>>> Dev mailing list
>>>>>>>>>> [email protected]
>>>>>>>>>> http://wso2.org/cgi-bin/mailman/listinfo/dev
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> *Sinthuja Rajendran*
>>>>>>>>> Associate Technical Lead
>>>>>>>>> WSO2, Inc.:http://wso2.com
>>>>>>>>>
>>>>>>>>> Blog: http://sinthu-rajan.blogspot.com/
>>>>>>>>> Mobile: +94774273955
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> *Rushmin Fernando*
>>>>>>>> *Technical Lead*
>>>>>>>>
>>>>>>>> WSO2 Inc. <http://wso2.com/> - Lean . Enterprise . Middleware
>>>>>>>>
>>>>>>>> email : [email protected]
>>>>>>>> mobile : +94772310855
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> *Sinthuja Rajendran*
>>>>>>> Associate Technical Lead
>>>>>>> WSO2, Inc.:http://wso2.com
>>>>>>>
>>>>>>> Blog: http://sinthu-rajan.blogspot.com/
>>>>>>> Mobile: +94774273955
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> *Rushmin Fernando*
>>>>>> *Technical Lead*
>>>>>>
>>>>>> WSO2 Inc. <http://wso2.com/> - Lean . Enterprise . Middleware
>>>>>>
>>>>>> email : [email protected]
>>>>>> mobile : +94772310855
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> *Sinthuja Rajendran*
>>>>> Associate Technical Lead
>>>>> WSO2, Inc.:http://wso2.com
>>>>>
>>>>> Blog: http://sinthu-rajan.blogspot.com/
>>>>> Mobile: +94774273955
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> *Rushmin Fernando*
>>>> *Technical Lead*
>>>>
>>>> WSO2 Inc. <http://wso2.com/> - Lean . Enterprise . Middleware
>>>>
>>>> email : [email protected]
>>>> mobile : +94772310855
>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> Rukshan Chathuranga.
>>> Software Engineer.
>>> WSO2, Inc.
>>>
>>
>>
>>
>> --
>> *Niranda Perera*
>> Software Engineer, WSO2 Inc.
>> Mobile: +94-71-554-8430
>> Twitter: @n1r44 <https://twitter.com/N1R44>
>> https://pythagoreanscript.wordpress.com/
>>
>>
>>
>


-- 
Rukshan Chathuranga.
Software Engineer.
WSO2, Inc.
