Re: [Dev] [DAS][310] Errors while running APIM_INCREMENTAL_PROCESSING_SCRIPT

2016-09-21 Thread Malith Munasinghe
Thank you, Supun, for that information.

Regards,
Malith


Re: [Dev] [DAS][310] Errors while running APIM_INCREMENTAL_PROCESSING_SCRIPT

2016-09-21 Thread Supun Sethunga
BTW, this issue has already been fixed in the analytics-apim master branch.

Regards,
Supun


Re: [Dev] [DAS][310] Errors while running APIM_INCREMENTAL_PROCESSING_SCRIPT

2016-09-21 Thread Malith Munasinghe
Hi All,

Thanks for the prompt responses; we will do the needful.

Regards,
Malith


Re: [Dev] [DAS][310] Errors while running APIM_INCREMENTAL_PROCESSING_SCRIPT

2016-09-21 Thread Rukshan Premathunga
Hi Malith,

The CApp we provided with Analytics APIM will not work on DAS as-is, because
of the changes that went into DAS 3.1.0. Because of that, we need to either
use Analytics APIM or update the CApp with the above changes.

Thanks and Regards.


Re: [Dev] [DAS][310] Errors while running APIM_INCREMENTAL_PROCESSING_SCRIPT

2016-09-21 Thread Niranda Perera
Hi Malith,

Yes, correct! You need to change the script.

Additionally, there are some changes in the carbonJdbc connector as well,
so you might need to watch out for those too.

Please check with the APIM and ESB teams whether we are doing a feature
release with the DAS 3.1.0 changes.
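
To illustrate, a rough sketch of the change, assuming the incrementalParams
syntax from [2] carries over directly: only the option name changes, since
DAS 3.1.0 rejects the old key ("Unknown options : incrementalprocessing").
The schema and primaryKeys strings are elided here; they stay exactly as in
the original mail:

CREATE TEMPORARY TABLE APIMGT_PERMINUTE_REQUEST_DATA
USING CarbonAnalytics
OPTIONS(tableName "ORG_WSO2_APIMGT_STATISTICS_PERMINUTEREQUEST",
        schema "year INT -i, month INT -i, ...",
        primaryKeys "year, month, day, hour, minute, ...",
        incrementalParams "APIMGT_PERMINUTE_REQUEST_DATA, HOUR",
        mergeSchema "false");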

cheers


[Dev] [DAS][310] Errors while running APIM_INCREMENTAL_PROCESSING_SCRIPT

2016-09-21 Thread Malith Munasinghe
Hi All,

While preparing a DAS 3.1.0 instance to run APIM Analytics, I added the
features as in [1]. After deploying the CApp for APIM Analytics, I ran into
the error below. According to the error, *incrementalProcessing* is not a
valid option, and according to [2] the syntax for this option is
*incrementalParams*. In order to get DAS 3.1.0 to process APIM Analytics,
do we have to change the scripts to use this option as well?


TID: [-1234] [] [2016-09-21 08:54:00,019] ERROR {org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService} - Error while executing query : CREATE TEMPORARY TABLE APIMGT_PERMINUTE_REQUEST_DATA USING CarbonAnalytics OPTIONS(tableName "ORG_WSO2_APIMGT_STATISTICS_PERMINUTEREQUEST", schema "year INT -i, month INT -i, day INT -i, hour INT -i, minute INT -i, consumerKey STRING, context STRING, api_version STRING, api STRING, version STRING, requestTime LONG, userId STRING, hostName STRING, apiPublisher STRING, total_request_count LONG, resourceTemplate STRING, method STRING, applicationName STRING, tenantDomain STRING, userAgent STRING, resourcePath STRING, request INT, applicationId STRING, tier STRING, throttledOut BOOLEAN, clientIp STRING, applicationOwner STRING, _timestamp LONG -i", primaryKeys "year, month, day, hour, minute, consumerKey, context, api_version, userId, hostName, apiPublisher, resourceTemplate, method, userAgent, clientIp", incrementalProcessing "APIMGT_PERMINUTE_REQUEST_DATA, HOUR", mergeSchema "false") {org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService}
org.wso2.carbon.analytics.spark.core.exception.AnalyticsExecutionException: Exception in executing query CREATE TEMPORARY TABLE APIMGT_PERMINUTE_REQUEST_DATA USING CarbonAnalytics OPTIONS(tableName "ORG_WSO2_APIMGT_STATISTICS_PERMINUTEREQUEST", schema "year INT -i, month INT -i, day INT -i, hour INT -i, minute INT -i, consumerKey STRING, context STRING, api_version STRING, api STRING, version STRING, requestTime LONG, userId STRING, hostName STRING, apiPublisher STRING, total_request_count LONG, resourceTemplate STRING, method STRING, applicationName STRING, tenantDomain STRING, userAgent STRING, resourcePath STRING, request INT, applicationId STRING, tier STRING, throttledOut BOOLEAN, clientIp STRING, applicationOwner STRING, _timestamp LONG -i", primaryKeys "year, month, day, hour, minute, consumerKey, context, api_version, userId, hostName, apiPublisher, resourceTemplate, method, userAgent, clientIp", incrementalProcessing "APIMGT_PERMINUTE_REQUEST_DATA, HOUR", mergeSchema "false")
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:764)
    at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:721)
    at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:201)
    at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:151)
    at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:60)
    at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Unknown options : incrementalprocessing
    at org.wso2.carbon.analytics.spark.core.sources.AnalyticsRelationProvider.checkParameters(AnalyticsRelationProvider.java:123)
    at org.wso2.carbon.analytics.spark.core.sources.AnalyticsRelationProvider.setParameters(AnalyticsRelationProvider.java:113)
    at org.wso2.carbon.analytics.spark.core.sources.AnalyticsRelationProvider.createRelation(AnalyticsRelationProvider.java:75)
    at org.wso2.carbon.analytics.spark.core.sources.AnalyticsRelationProvider.createRelation(AnalyticsRelationProvider.java:45)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
    at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:92)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
    at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.ap
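
For context on what the script would look like after the rename, here is a
minimal, self-contained sketch of the incremental-processing flow as I
understand it from [2]. The APIMGT_HOURLY_SUMMARY table, the SELECT, and the
INCREMENTAL_TABLE_COMMIT step are illustrative assumptions based on the DAS
3.1.0 incremental-processing docs, not lines taken from the actual CApp:

-- source table, flagged for incremental processing
-- (incrementalParams takes a unique id and a time window: "<uniqueId>, <timeWindow>")
CREATE TEMPORARY TABLE APIMGT_PERMINUTE_REQUEST_DATA
USING CarbonAnalytics
OPTIONS(tableName "ORG_WSO2_APIMGT_STATISTICS_PERMINUTEREQUEST",
        incrementalParams "APIMGT_PERMINUTE_REQUEST_DATA, HOUR");

-- hypothetical summary table, for illustration only
CREATE TEMPORARY TABLE APIMGT_HOURLY_SUMMARY
USING CarbonAnalytics
OPTIONS(tableName "APIMGT_HOURLY_SUMMARY",
        schema "year INT, month INT, day INT, hour INT, api STRING, total_request_count LONG");

-- with incremental processing on, this should only read rows that arrived
-- since the last committed run
INSERT INTO TABLE APIMGT_HOURLY_SUMMARY
SELECT year, month, day, hour, api, sum(total_request_count) AS total_request_count
FROM APIMGT_PERMINUTE_REQUEST_DATA
GROUP BY year, month, day, hour, api;

-- mark the processed window as consumed so the next run starts after it
INCREMENTAL_TABLE_COMMIT APIMGT_PERMINUTE_REQUEST_DATA;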