Adding Amani and NuwanD.

On Wed, Nov 5, 2014 at 3:56 PM, Lakshman Udayakantha <lakshm...@wso2.com>
wrote:

> Hi Gihan,
>
> As per the offline discussion with you, I changed the cron values in the
> analyzers.properties file so that they do not overlap (prime numbers). Now
> the errors are not appearing, and publishing is also happening
> successfully. The reason was that running several Hive scripts at the same
> time corrupts the H2 metastore. Thank you for pointing me in the right
> direction.
>
> @looping all
>
> Another thing: what are the suggestions for these cron values? For
> example, using prime numbers we can reduce the chance of several scripts
> running at the same time. Suggestions with valid reasons are welcome.
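>
> For illustration, something along these lines is what I mean (the property
> key format here is hypothetical; only the Quartz cron layout and the prime
> minute steps matter):
>
>   # analyzers.properties -- minute fields step by distinct primes, so
>   # within an hour the scripts coincide only at minute 0
>   am_request_stats_analyzer.cron=0 0/7 * * * ?
>   am_response_stats_analyzer.cron=0 0/11 * * * ?
>   am_fault_stats_analyzer.cron=0 0/13 * * * ?
>   am_destination_stats_analyzer.cron=0 0/17 * * * ?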
>
> Thanks
>
> On Wed, Nov 5, 2014 at 2:50 PM, Lakshman Udayakantha <lakshm...@wso2.com>
> wrote:
>
>> Hi Gihan,
>>
>> Yes, that's the case. When I delete the metastore, all the errors are
>> gone and publishing is also OK. But when I restart the BAM instance, the
>> errors appear again. Any idea why that is?
>>
>> Thanks
>>
>> On Wed, Nov 5, 2014 at 2:22 PM, Gihan Anuruddha <gi...@wso2.com> wrote:
>>
>>> The meta database might have got corrupted. Delete metastore_db.h2.db in
>>> <BAM_HOME>/repository/database and try again.
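>>>
>>> For example (a sketch; stop the BAM server first and adjust the path to
>>> your installation):
>>>
>>>   rm <BAM_HOME>/repository/database/metastore_db.h2.db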
>>>
>>> On Wed, Nov 5, 2014 at 2:10 PM, Lakshman Udayakantha <lakshm...@wso2.com> wrote:
>>>
>>>> Hi Gihan,
>>>>
>>>> I didn't use any tool to explore the H2 database; I just checked the
>>>> published statistics in the APIM publisher GUI, and as you mentioned,
>>>> the AUTO_SERVER property is true by default. Any other clue to spot the
>>>> error?
>>>>
>>>> Thanks
>>>>
>>>> On Wed, Nov 5, 2014 at 1:58 PM, Gihan Anuruddha <gi...@wso2.com> wrote:
>>>>
>>>>> This can happen if you are using any tool to explore the H2 database
>>>>> while the server has it open. Make sure you add the AUTO_SERVER=TRUE
>>>>> part at the end of the connection URL (AUTO_SERVER enables H2's mixed
>>>>> mode, so multiple processes can open the same database file):
>>>>>
>>>>> <url>jdbc:h2:<BAM_HOME>/repository/database/APIMGTSTATS_DB;AUTO_SERVER=TRUE</url>
>>>>>
>>>>> Regards,
>>>>> Gihan
>>>>>
>>>>> On Wed, Nov 5, 2014 at 1:27 PM, Lakshman Udayakantha <lakshm...@wso2.com> wrote:
>>>>>
>>>>>> I am working on this jira [1]. As per the offline discussion with
>>>>>> Amani, I have to break the Hive script into individual scripts to
>>>>>> address the following analytic logics:
>>>>>>
>>>>>> org_wso2_apimgt_statistics_destination
>>>>>>
>>>>>> org_wso2_apimgt_statistics_request
>>>>>>
>>>>>> org_wso2_apimgt_statistics_response
>>>>>>
>>>>>> org_wso2_apimgt_statistics_fault
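>>>>>>
>>>>>> (Each of these ends up in its own script along the lines of the sketch
>>>>>> below. The column list, table names, and summary query are illustrative
>>>>>> rather than the exact ones in the attached toolbox; the TBLPROPERTIES
>>>>>> keys shown are the usual ones for the BAM JDBC storage handler.)
>>>>>>
>>>>>> -- illustrative H2-backed summary table, written via the JDBC handler
>>>>>> CREATE EXTERNAL TABLE IF NOT EXISTS APIRequestSummaryData (
>>>>>>     api STRING, userId STRING, context STRING,
>>>>>>     total_request_count INT, hostName STRING, time STRING)
>>>>>> STORED BY 'org.wso2.carbon.hadoop.hive.jdbc.storage.JDBCStorageHandler'
>>>>>> TBLPROPERTIES (
>>>>>>     'wso2.carbon.datasource.name' = 'WSO2AM_STATS_DB',
>>>>>>     'hive.jdbc.update.on.duplicate' = 'true',
>>>>>>     'hive.jdbc.primary.key.fields' = 'api,userId,context,hostName,time');
>>>>>>
>>>>>> -- illustrative summarisation over the Cassandra-backed event table
>>>>>> INSERT OVERWRITE TABLE APIRequestSummaryData
>>>>>>     SELECT api, userId, context, count(1), hostName, time
>>>>>>     FROM APIRequestEventTable
>>>>>>     GROUP BY api, userId, context, hostName, time;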
>>>>>>
>>>>>> I have attached the analytic toolbox, which includes four individual
>>>>>> Hive scripts addressing the above analytic logics, here [2]. When I
>>>>>> deployed this new toolbox on BAM, it wrote some errors to the server
>>>>>> logs; I have pasted the part of the server log that contains them
>>>>>> below. Nevertheless, I can still see published statistics on the APIM
>>>>>> dashboard. Does anyone have any idea why I am getting these errors?
>>>>>>
>>>>>> [2014-11-05 12:46:15,955] ERROR {org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBManager} -  Failed to get connection
>>>>>> org.h2.jdbc.JdbcSQLException: Connection is broken: "null" [90067-140]
>>>>>> at org.h2.message.DbException.getJdbcSQLException(DbException.java:327)
>>>>>> at org.h2.message.DbException.get(DbException.java:156)
>>>>>> at org.h2.engine.SessionRemote.connectServer(SessionRemote.java:331)
>>>>>> at org.h2.engine.SessionRemote.connectEmbeddedOrServer(SessionRemote.java:253)
>>>>>> at org.h2.engine.SessionRemote.createSession(SessionRemote.java:219)
>>>>>> at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:111)
>>>>>> at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:95)
>>>>>> at org.h2.Driver.connect(Driver.java:73)
>>>>>> at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>>>> at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBManager.createConnection(DBManager.java:73)
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBManager.createConnection(DBManager.java:85)
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.JDBCDataOutputFormat.getHiveRecordWriter(JDBCDataOutputFormat.java:48)
>>>>>> at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:236)
>>>>>> at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:224)
>>>>>> at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:478)
>>>>>> at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:526)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:964)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:781)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:707)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:248)
>>>>>> at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:518)
>>>>>> at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:419)
>>>>>> at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:257)
>>>>>> Caused by: java.net.UnknownHostException: null
>>>>>> at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
>>>>>> at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:883)
>>>>>> at java.net.InetAddress.getAddressFromNameService(InetAddress.java:1236)
>>>>>> at java.net.InetAddress.getAllByName0(InetAddress.java:1187)
>>>>>> at java.net.InetAddress.getAllByName(InetAddress.java:1117)
>>>>>> at java.net.InetAddress.getAllByName(InetAddress.java:1053)
>>>>>> at java.net.InetAddress.getByName(InetAddress.java:1003)
>>>>>> at org.h2.util.NetUtils.createSocket(NetUtils.java:90)
>>>>>> at org.h2.engine.SessionRemote.initTransfer(SessionRemote.java:96)
>>>>>> at org.h2.engine.SessionRemote.connectServer(SessionRemote.java:327)
>>>>>> ... 30 more
>>>>>>
>>>>>> [2014-11-05 12:46:15,963] FATAL {ExecReducer} -  org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":"testAPI2:v1.0.0","_col1":"carbon.super","_col2":"/test/api2","_col3":"10.100.5.161","_col4":2014,"_col5":10,"_col6":31,"_col7":10,"_col8":14,"_col9":"2014-10-31 10:14"},"value":{"_col0":{"count":1,"sum":583.0},"_col1":1},"alias":0}
>>>>>> at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:257)
>>>>>> at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:518)
>>>>>> at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:419)
>>>>>> at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:257)
>>>>>> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:720)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:248)
>>>>>> ... 3 more
>>>>>> Caused by: java.lang.NullPointerException
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.isRowExisting(DBOperation.java:144)
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.writeToDB(DBOperation.java:59)
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordWriter.write(DBRecordWriter.java:35)
>>>>>> at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:589)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:964)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:781)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:707)
>>>>>> ... 5 more
>>>>>>
>>>>>> [2014-11-05 12:46:15,964]  WARN {org.apache.hadoop.mapred.LocalJobRunner} -  job_local_0001
>>>>>> java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":"testAPI2:v1.0.0","_col1":"carbon.super","_col2":"/test/api2","_col3":"10.100.5.161","_col4":2014,"_col5":10,"_col6":31,"_col7":10,"_col8":14,"_col9":"2014-10-31 10:14"},"value":{"_col0":{"count":1,"sum":583.0},"_col1":1},"alias":0}
>>>>>> at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:269)
>>>>>> at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:518)
>>>>>> at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:419)
>>>>>> at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:257)
>>>>>> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":"testAPI2:v1.0.0","_col1":"carbon.super","_col2":"/test/api2","_col3":"10.100.5.161","_col4":2014,"_col5":10,"_col6":31,"_col7":10,"_col8":14,"_col9":"2014-10-31 10:14"},"value":{"_col0":{"count":1,"sum":583.0},"_col1":1},"alias":0}
>>>>>> at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:257)
>>>>>> ... 3 more
>>>>>> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:720)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:248)
>>>>>> ... 3 more
>>>>>> Caused by: java.lang.NullPointerException
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.isRowExisting(DBOperation.java:144)
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBOperation.writeToDB(DBOperation.java:59)
>>>>>> at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordWriter.write(DBRecordWriter.java:35)
>>>>>> at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:589)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:467)
>>>>>> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:758)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:964)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:781)
>>>>>> at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:707)
>>>>>> ... 5 more
>>>>>>
>>>>>> Ended Job = job_local_0001 with errors
>>>>>> [2014-11-05 12:46:16,665] ERROR {org.apache.hadoop.hive.ql.exec.ExecDriver} -  Ended Job = job_local_0001 with errors
>>>>>> Error during job, obtaining debugging information...
>>>>>> [2014-11-05 12:46:16,667] ERROR {org.apache.hadoop.hive.ql.exec.ExecDriver} -  Error during job, obtaining debugging information...
>>>>>> Execution failed with exit status: 2
>>>>>> [2014-11-05 12:46:16,827] ERROR {org.apache.hadoop.hive.ql.exec.Task} -  Execution failed with exit status: 2
>>>>>> Obtaining error information
>>>>>> [2014-11-05 12:46:16,827] ERROR {org.apache.hadoop.hive.ql.exec.Task} -  Obtaining error information
>>>>>> Task failed!
>>>>>> Task ID:
>>>>>>   Stage-0
>>>>>> Logs:
>>>>>> [2014-11-05 12:46:16,827] ERROR {org.apache.hadoop.hive.ql.exec.Task} -
>>>>>> Task failed!
>>>>>> Task ID:
>>>>>>   Stage-0
>>>>>> Logs:
>>>>>> /Users/lakshman/Desktop/hiveBreaker/test2/wso2bam-2.5.0/repository/logs//wso2carbon.log
>>>>>> [2014-11-05 12:46:16,827] ERROR {org.apache.hadoop.hive.ql.exec.Task} -
>>>>>> /Users/lakshman/Desktop/hiveBreaker/test2/wso2bam-2.5.0/repository/logs//wso2carbon.log
>>>>>> [2014-11-05 12:46:16,827] ERROR {org.apache.hadoop.hive.ql.exec.ExecDriver} -  Execution failed with exit status: 2
>>>>>> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
>>>>>> [2014-11-05 12:46:16,828] ERROR {org.apache.hadoop.hive.ql.Driver} -  FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
>>>>>>
>>>>>> [2014-11-05 12:46:16,832] ERROR {org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl} -  Error while executing Hive script.
>>>>>> Query returned non-zero code: 9, cause: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
>>>>>> java.sql.SQLException: Query returned non-zero code: 9, cause: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
>>>>>> at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
>>>>>> at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.executeHiveQuery(HiveExecutorServiceImpl.java:578)
>>>>>> at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:286)
>>>>>> at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:189)
>>>>>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>>> at java.lang.Thread.run(Thread.java:695)
>>>>>>
>>>>>> [2014-11-05 12:46:16,833] ERROR {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} -  Error while executing script : am_response_stats_analyzer
>>>>>> org.wso2.carbon.analytics.hive.exception.HiveExecutionException: Error while executing Hive script.Query returned non-zero code: 9, cause: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
>>>>>> at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:115)
>>>>>> at org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask.execute(HiveScriptExecutorTask.java:70)
>>>>>> at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
>>>>>> at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
>>>>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>>>>> at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>>> at java.lang.Thread.run(Thread.java:695)
>>>>>>
>>>>>> 2014-11-05 12:46:17,620 null map = 100%,  reduce = 100%
>>>>>>
>>>>>>
>>>>>> [1] https://wso2.org/jira/browse/APIMANAGER-2992
>>>>>>
>>>>>> [2] https://drive.google.com/file/d/0B9KDy4GJKr1vNFp0NEVwbnZhbjg/view?usp=sharing
>>>>>>
>>>>>> --
>>>>>> Lakshman Udayakantha
>>>>>> WSO2 Inc. www.wso2.com
>>>>>> lean.enterprise.middleware
>>>>>> Mobile: 0711241005
>>>>>>
>>>>>>
>>>>>> _______________________________________________
>>>>>> Dev mailing list
>>>>>> Dev@wso2.org
>>>>>> http://wso2.org/cgi-bin/mailman/listinfo/dev
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> W.G. Gihan Anuruddha
>>>>> Senior Software Engineer | WSO2, Inc.
>>>>> M: +94772272595
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Lakshman Udayakantha
>>>> WSO2 Inc. www.wso2.com
>>>> lean.enterprise.middleware
>>>> Mobile: 0711241005
>>>>
>>>>
>>>
>>>
>>> --
>>> W.G. Gihan Anuruddha
>>> Senior Software Engineer | WSO2, Inc.
>>> M: +94772272595
>>>
>>
>>
>>
>> --
>> Lakshman Udayakantha
>> WSO2 Inc. www.wso2.com
>> lean.enterprise.middleware
>> Mobile: 0711241005
>>
>>
>
>
> --
> Lakshman Udayakantha
> WSO2 Inc. www.wso2.com
> lean.enterprise.middleware
> Mobile: 0711241005
>
>


-- 
Lakshman Udayakantha
WSO2 Inc. www.wso2.com
lean.enterprise.middleware
Mobile: 0711241005
_______________________________________________
Dev mailing list
Dev@wso2.org
http://wso2.org/cgi-bin/mailman/listinfo/dev