Hi Sanjeewa,

On Mon, May 13, 2013 at 11:23 AM, Tharindu Mathew <[email protected]> wrote:

> BAM 2.0.1 is missing some key features and using that is not an acceptable
> answer. It HAS to work with 2.2.0 or higher. Since it doesn't work with
> 2.2.0, it at least has to work with 2.3.0.
>

I think the issue with 2.2.0 was due to a problem in the toolbox. If we use
that toolbox, it should work with 2.2.0 as well, shouldn't it? AFAIK we
didn't fix anything on the BAM side for the BAM-APIM integration.


> Can we get an integration test for the API-M toolbox in the BAM
> integration tests?


+1
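
For what it's worth, the 9161 that keeps coming up in the thread below is presumably just Cassandra's default Thrift RPC port (9160) plus BAM's Carbon port offset of 1. A minimal sketch of that arithmetic, assuming the default base port and the offsets described in the thread (the server names are illustrative):

```python
# Illustrative only: WSO2 Carbon applies a single port offset to the ports a
# server opens. 9160 is the usual Cassandra Thrift RPC default; the offsets
# below match the setup Nuwan describes (API-M = 0, BAM = 1, ESB = 2).
CASSANDRA_THRIFT_BASE = 9160

offsets = {"APIM": 0, "BAM": 1, "ESB": 2}

def thrift_port(offset, base=CASSANDRA_THRIFT_BASE):
    """Effective Thrift port once the Carbon port offset is applied."""
    return base + offset

# BAM runs with offset 1, so its embedded Cassandra listens on 9161,
# which is why the datasource URL has to use 9161 rather than 9160.
assert thrift_port(offsets["BAM"]) == 9161
```

So a connection-refused error on the default port usually just means the datasource URL was left pointing at 9160 on an offset server.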


>
>
> On Mon, May 13, 2013 at 11:17 AM, Sanjeewa Malalgoda <[email protected]> wrote:
>
>> API Manager 1.4.0 and BAM 2.3.0 work together without any issue. It also
>> works fine with BAM 2.0.1 if we change the Cassandra data source definition
>> to the old format.
>>
>> Thanks.
>>
>>
>> On Mon, May 13, 2013 at 11:13 AM, Tharindu Mathew <[email protected]> wrote:
>>
>>> Excellent.
>>>
>>> So, are all BAM and API-M integration issues sorted out?
>>>
>>>
>>> On Sat, May 11, 2013 at 8:27 PM, Nuwan Dias <[email protected]> wrote:
>>>
>>>> Hi,
>>>>
>>>> The problem was that I had not changed the port in the Cassandra
>>>> DataSource. Sorry for the noise. It works fine when I change the port to
>>>> 9161.
>>>>
>>>> Thanks,
>>>> NuwanD.
>>>>
>>>>
>>>> On Sat, May 11, 2013 at 8:19 PM, Tharindu Mathew <[email protected]> wrote:
>>>>
>>>>> Was the pack smoke tested? Are the BAM scenarios working?
>>>>>
>>>>> Sent from my Samsung Galaxy S3
>>>>> On May 11, 2013 7:59 PM, "Sanjeewa Malalgoda" <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>> I think this is because of an error in the data source configuration on
>>>>>> the BAM side. Please double-check the WSO2BAM_CASSANDRA_DATASOURCE
>>>>>> configuration. It should be as follows (the port offset has to be set
>>>>>> there as well). It worked for me.
>>>>>>
>>>>>>         <datasource>
>>>>>>             <name>WSO2BAM_CASSANDRA_DATASOURCE</name>
>>>>>>             <description>The datasource used for Cassandra data</description>
>>>>>>             <definition type="RDBMS">
>>>>>>                 <configuration>
>>>>>>                     <url>jdbc:cassandra://127.0.0.1:9161/EVENT_KS</url>
>>>>>>                     <username>admin</username>
>>>>>>                     <password>admin</password>
>>>>>>                 </configuration>
>>>>>>             </definition>
>>>>>>         </datasource>
>>>>>>
>>>>>> Thanks,
>>>>>> Sanjeewa.
>>>>>>
>>>>>> On Sat, May 11, 2013 at 3:01 PM, Nuwan Dias <[email protected]> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I am trying out the latest BAM pack to monitor statistics of the API
>>>>>>> Gateway, but I'm getting the following error on the BAM console when the
>>>>>>> Hive script runs. What could be wrong here? The issue is a Thrift
>>>>>>> connection-refused error. I am running an API Manager (offset 0), ESB
>>>>>>> (offset 2), and BAM (offset 1) on the same machine. I can see that the
>>>>>>> data has been written to Cassandra. I have attached the toolbox as well.
>>>>>>>
>>>>>>> Hive history
>>>>>>> file=/home/nuwan/Work/AM/10-05-2013/wso2bam-2.3.0/tmp/hive/wso2-querylogs/hive_job_log_nuwan_201305111446_1870617742.txt
>>>>>>> [2013-05-11 14:48:00,092] ERROR
>>>>>>> {org.apache.hadoop.hive.cassandra.CassandraProxyClient} -  Error while
>>>>>>> trying to connect to cassandra host:localhost
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraException: unable to
>>>>>>> connect to server
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraClientHolder.initClient(CassandraClientHolder.java:57)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraClientHolder.<init>(CassandraClientHolder.java:37)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraProxyClient.createConnection(CassandraProxyClient.java:181)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraProxyClient.initializeConnection(CassandraProxyClient.java:207)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraProxyClient.<init>(CassandraProxyClient.java:144)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraManager.openConnection(CassandraManager.java:185)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraStorageHandler.preCreateTable(CassandraStorageHandler.java:224)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:397)
>>>>>>> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:540)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3479)
>>>>>>> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:225)
>>>>>>>  at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>>>>  at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1351)
>>>>>>> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1126)
>>>>>>>  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:934)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:201)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:187)
>>>>>>> at
>>>>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:337)
>>>>>>>  at
>>>>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:232)
>>>>>>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>>>  at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>>>>  at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>>>> at java.lang.Thread.run(Thread.java:662)
>>>>>>> Caused by: org.apache.thrift.transport.TTransportException:
>>>>>>> java.net.ConnectException: Connection refused
>>>>>>> at org.apache.thrift.transport.TSocket.open(TSocket.java:183)
>>>>>>>  at
>>>>>>> org.apache.thrift.transport.TFramedTransport.open(TFramedTransport.java:81)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraClientHolder.initClient(CassandraClientHolder.java:54)
>>>>>>>  ... 24 more
>>>>>>> Caused by: java.net.ConnectException: Connection refused
>>>>>>> at java.net.PlainSocketImpl.socketConnect(Native Method)
>>>>>>>  at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
>>>>>>> at
>>>>>>> java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
>>>>>>>  at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
>>>>>>> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
>>>>>>>  at java.net.Socket.connect(Socket.java:529)
>>>>>>> at org.apache.thrift.transport.TSocket.open(TSocket.java:178)
>>>>>>>  ... 26 more
>>>>>>> FAILED: Error in metadata: java.lang.NullPointerException
>>>>>>> [2013-05-11 14:48:00,093] ERROR
>>>>>>> {org.apache.hadoop.hive.ql.exec.Task} -  FAILED: Error in metadata:
>>>>>>> java.lang.NullPointerException
>>>>>>> org.apache.hadoop.hive.ql.metadata.HiveException:
>>>>>>> java.lang.NullPointerException
>>>>>>> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:546)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3479)
>>>>>>> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:225)
>>>>>>>  at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>>>>  at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1351)
>>>>>>> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1126)
>>>>>>>  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:934)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:201)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:187)
>>>>>>> at
>>>>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:337)
>>>>>>>  at
>>>>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:232)
>>>>>>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>>>  at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>>>>  at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>>>> at java.lang.Thread.run(Thread.java:662)
>>>>>>> Caused by: java.lang.NullPointerException
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraProxyClient.initializeConnection(CassandraProxyClient.java:223)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraProxyClient.<init>(CassandraProxyClient.java:144)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraManager.openConnection(CassandraManager.java:185)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.cassandra.CassandraStorageHandler.preCreateTable(CassandraStorageHandler.java:224)
>>>>>>> at
>>>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:397)
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:540)
>>>>>>> ... 16 more
>>>>>>>
>>>>>>> FAILED: Execution Error, return code 1 from
>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>>>> [2013-05-11 14:48:00,094] ERROR {org.apache.hadoop.hive.ql.Driver} -
>>>>>>>  FAILED: Execution Error, return code 1 from
>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>>>> [2013-05-11 14:48:00,094] ERROR
>>>>>>> {org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl} -  Error
>>>>>>> while executing Hive script.
>>>>>>> Query returned non-zero code: 9, cause: FAILED: Execution Error,
>>>>>>> return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>>>>>>> java.sql.SQLException: Query returned non-zero code: 9, cause:
>>>>>>> FAILED: Execution Error, return code 1 from
>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>>>>  at
>>>>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
>>>>>>> at
>>>>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:337)
>>>>>>>  at
>>>>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:232)
>>>>>>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>>>  at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>>>>  at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>>>> at java.lang.Thread.run(Thread.java:662)
>>>>>>> [2013-05-11 14:48:00,096] ERROR
>>>>>>> {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} -  Error 
>>>>>>> while
>>>>>>> executing script : am_stats_analyzer_664
>>>>>>> org.wso2.carbon.analytics.hive.exception.HiveExecutionException:
>>>>>>> Error while executing Hive script.Query returned non-zero code: 9, 
>>>>>>> cause:
>>>>>>> FAILED: Execution Error, return code 1 from
>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>>>>  at
>>>>>>> org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:117)
>>>>>>> at
>>>>>>> org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask.execute(HiveScriptExecutorTask.java:60)
>>>>>>>  at
>>>>>>> org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:56)
>>>>>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
>>>>>>>  at
>>>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
>>>>>>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>>>  at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>>> at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>>>>>>>  at
>>>>>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>>>>>>> at java.lang.Thread.run(Thread.java:662)
>>>>>>>
>>>>>>> Thanks,
>>>>>>> NuwanD
>>>>>>>
>>>>>>>
>>>>>>> On Fri, May 10, 2013 at 11:09 AM, Gihan Anuruddha <[email protected]> wrote:
>>>>>>>
>>>>>>>> [Adding dev@]
>>>>>>>> Hi Nuwan,
>>>>>>>>
>>>>>>>> Noted. Sorry for the inconvenience.
>>>>>>>>
>>>>>>>> Regards,
>>>>>>>> Gihan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, May 10, 2013 at 10:53 AM, Nuwan Dias <[email protected]> wrote:
>>>>>>>>
>>>>>>>>> Hi Gihan,
>>>>>>>>>
>>>>>>>>> This should be in the dev@ list.
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> NuwanD.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, May 10, 2013 at 10:31 AM, Gihan Anuruddha <[email protected]> wrote:
>>>>>>>>>
>>>>>>>>>> Hi QA Team,
>>>>>>>>>>
>>>>>>>>>> Please find latest BAM 2.3.0 pack for QA testing at [1].
>>>>>>>>>>
>>>>>>>>>> [1] -
>>>>>>>>>> http://173.164.178.33/builds/BAM-2.3.0/09-05-2013/wso2bam-2.3.0.zip
>>>>>>>>>>
>>>>>>>>>> Regards,
>>>>>>>>>> Gihan
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> ---
>>>>>>>>>> W.G. Gihan Anuruddha
>>>>>>>>>> Senior Software Engineer | WSO2, Inc.
>>>>>>>>>> M: +94772272595
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Nuwan Dias
>>>>>>>>>
>>>>>>>>> Member, Management Committee - Solutions Technology Group
>>>>>>>>> Software Engineer - WSO2, Inc. http://wso2.com
>>>>>>>>> email : [email protected]
>>>>>>>>> Phone : +94 777 775 729
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> ---
>>>>>>>> W.G. Gihan Anuruddha
>>>>>>>> Senior Software Engineer | WSO2, Inc.
>>>>>>>> M: +94772272595
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Nuwan Dias
>>>>>>>
>>>>>>> Member, Management Committee - Solutions Technology Group
>>>>>>> Software Engineer - WSO2, Inc. http://wso2.com
>>>>>>> email : [email protected]
>>>>>>> Phone : +94 777 775 729
>>>>>>>
>>>>>>> _______________________________________________
>>>>>>> Dev mailing list
>>>>>>> [email protected]
>>>>>>> http://wso2.org/cgi-bin/mailman/listinfo/dev
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> *Sanjeewa Malalgoda*
>>>>>> WSO2 Inc.
>>>>>> Mobile : +14084122175 | +94713068779
>>>>>>
>>>>>> blog : http://sanjeewamalalgoda.blogspot.com/
>>>>>>
>>>>>>
>>>>>>
>>>>
>>>>
>>>> --
>>>> Nuwan Dias
>>>>
>>>> Member, Management Committee - Solutions Technology Group
>>>> Software Engineer - WSO2, Inc. http://wso2.com
>>>> email : [email protected]
>>>> Phone : +94 777 775 729
>>>>
>>>
>>>
>>>
>>> --
>>> Regards,
>>>
>>> Tharindu Mathew
>>>
>>> Associate Technical Lead, WSO2 BAM
>>> Member - Data Mgmt. Committee
>>>
>>> blog: http://tharindumathew.com/
>>> M: +94777759908
>>>
>>
>>
>>
>> --
>> *Sanjeewa Malalgoda*
>> WSO2 Inc.
>> Mobile : +14084122175 | +94713068779
>>
>> blog : http://sanjeewamalalgoda.blogspot.com/
>>
>
>
>
> --
> Regards,
>
> Tharindu Mathew
>
> Associate Technical Lead, WSO2 BAM
> Member - Data Mgmt. Committee
>
> blog: http://tharindumathew.com/
> M: +94777759908
>
>
>
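
For anyone hitting this later: the port offset that produces 9161 is normally configured in the Carbon server's repository/conf/carbon.xml. A sketch of the relevant element, assuming the standard Carbon layout (surrounding elements elided):

```xml
<!-- repository/conf/carbon.xml (fragment) -->
<!-- Carbon adds this offset to the default ports the server opens,
     including the embedded Cassandra Thrift port (9160 + 1 = 9161). -->
<Ports>
    <Offset>1</Offset>
</Ports>
```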


--
*Kasun Weranga*

Member, Management Committee - Data Technologies
Software Engineer
*WSO2, Inc.*
lean.enterprise.middleware.
mobile : +94 772314602
blog : http://kasunweranga.blogspot.com/
