Hi Imesh,

Please find the answers below:

1. What does this analysis do?

The Spark query processes the raw data I receive as Thrift events: it
counts the members in different states, per application and per cluster,
every t seconds. The Spark script is scheduled to execute every t seconds
to do this.
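
As a rough sketch (the table and column names here are illustrative only,
not the actual schema), the aggregation in Spark SQL is along these lines:

SELECT application_id, cluster_id, member_status, COUNT(*) AS member_count
FROM member_status_events
GROUP BY application_id, cluster_id, member_status;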

2. Why can't we directly insert the results into the RDBMS?

I'm doing the above processing in the DAS Spark environment, i.e. on top of
the DAL (Data Access Layer). We can't insert the results directly into the
RDBMS because the DAL doesn't know anything about our RDBMS table; the
temporary table we create in the Spark environment is only a mapping to the
RDBMS table. As I understand it, 'create temporary table' does not create a
physical table, it just maps to our RDBMS table in the Spark environment.
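
To illustrate the flow (the temporary table name and datasource name below
are placeholders), the two steps in the Spark environment are the mapping
and the scheduled insert:

CREATE TEMPORARY TABLE member_counts using CarbonJDBC options (dataSource
"my_rdbms_datasource", tableName "member_counts");
INSERT INTO TABLE member_counts
SELECT application_id, cluster_id, member_status, COUNT(*)
FROM member_status_events
GROUP BY application_id, cluster_id, member_status;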

Thanks.

On Thu, Oct 1, 2015 at 1:33 PM, Sinthuja Ragendran <[email protected]>
wrote:

> Hi Niranda,
>
> On Thu, Oct 1, 2015 at 1:28 PM, Niranda Perera <[email protected]> wrote:
>
>> Hi Thanuja and Imesh,
>>
>> Let me clarify the use of the term "create temporary table" with regard
>> to Spark.
>> Inside DAS we save ('persist') data in DAL (Data Access Layer) tables. So
>> in order for us to query these tables, Spark needs some sort of a mapping
>> to the DAL tables in its runtime environment. This mapping is created by
>> the temporary table queries. These temp tables are only a mapping, not a
>> physical table.
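>>
>> To illustrate (the table names here are just examples), a DAL mapping is
>> made with something like:
>>
>> CREATE TEMPORARY TABLE events using CarbonAnalytics options (tableName
>> "EVENTS_TABLE");
>>
>> Queries such as SELECT COUNT(*) FROM events then run against the
>> underlying DAL table, and dropping the temporary table removes only the
>> mapping, not the persisted data.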
>>
>> @Thanuja, yes you are correct! We have to manually create the tables in
>> MySQL before making the temp table mapping in Spark SQL.
>>
> With the Carbon JDBC connector, can we try to create the table if it does
> not exist? Maybe we can let the users pass the actual create table
> statement as another parameter in the options. IMHO it would be more
> user-friendly if we could do that, WDYT?
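>
> For example (purely a sketch of the proposal; the createTableQuery option
> is hypothetical and does not exist today):
>
> CREATE TEMPORARY TABLE sample using CarbonJDBC options (dataSource
> "sample_datasource", tableName "sample_table", createTableQuery "CREATE
> TABLE IF NOT EXISTS sample_table (id INT, state VARCHAR(50))");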
>
> Thanks,
> Sinthuja.
>
>
>> On Thu, Oct 1, 2015 at 9:53 AM, Thanuja Uruththirakodeeswaran <
>> [email protected]> wrote:
>>>
>>>
>>> In the DAS Spark environment, we can't directly insert the analyzed data
>>> into our MySQL table. We have to create a temporary table over our
>>> datasource and manipulate the data through it.
>>>
>>
>> Yes, can you please explain the reasons? What does this analysis do? Why
>> can't we directly insert the results into the RDBMS?
>>
>> Thanks
>>
>> On Thu, Oct 1, 2015 at 9:53 AM, Thanuja Uruththirakodeeswaran <
>> [email protected]> wrote:
>>
>>> Hi Imesh,
>>>
>>> If we take the above scenario, I need to insert the analyzed/aggregated
>>> data, obtained as the result of the Spark SQL processing, into my MySQL
>>> table (sample_table). To do that, we first need to create a temporary
>>> table in the Spark environment over the corresponding MySQL datasource
>>> (sample_datasource) and table (sample_table); only then, by inserting
>>> data into this temporary table, can we update our MySQL table.
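>>>
>>> As a sketch (the source table name analyzed_results is hypothetical),
>>> the update step in the Spark environment would then be:
>>>
>>> INSERT INTO TABLE sample SELECT * FROM analyzed_results;
>>>
>>> Spark writes these rows through the 'sample' mapping into the underlying
>>> sample_table in MySQL.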
>>>
>>> In the DAS Spark environment, we can't directly insert the analyzed data
>>> into our MySQL table. We have to create a temporary table over our
>>> datasource and manipulate the data through it. I think that's why it is
>>> named a '*temporary*' table.
>>>
>>> @Niranda Please correct me if I'm wrong.
>>>
>>> Thanks.
>>>
>>> On Thu, Oct 1, 2015 at 7:00 AM, Imesh Gunaratne <[email protected]> wrote:
>>>
>>>> Hi Thanuja,
>>>>
>>>> Can you please explain the purpose of these temporary tables?
>>>>
>>>> Thanks
>>>>
>>>> On Wed, Sep 30, 2015 at 11:53 PM, Thanuja Uruththirakodeeswaran <
>>>> [email protected]> wrote:
>>>>
>>>>> Hi All,
>>>>>
>>>>> When we create temporary tables in the Spark environment using the
>>>>> CarbonJDBC option as explained in [1], we specify a dataSource and a
>>>>> tableName from which the temporary table in the Spark environment gets
>>>>> its data, as follows:
>>>>> CREATE TEMPORARY TABLE <temp_table> using CarbonJDBC options
>>>>> (dataSource "<datasource name>", tableName "<table name>");
>>>>>
>>>>> I've used a MySQL database (datasource name: sample_datasource) as the
>>>>> dataSource, and a MySQL table created in that database (sample_table)
>>>>> as the tableName, as follows:
>>>>> CREATE TEMPORARY TABLE sample using CarbonJDBC options (dataSource
>>>>> "sample_datasource", tableName "sample_table");
>>>>>
>>>>> But I'm creating the MySQL database and tables by executing SQL
>>>>> statements manually. Is there a way in DAS to put these SQL statements
>>>>> inside a script and create the database and tables when we start the
>>>>> server?
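>>>>>
>>>>> For reference, the statements I currently run manually are along these
>>>>> lines (the schema shown is only illustrative):
>>>>>
>>>>> CREATE DATABASE IF NOT EXISTS sample_db;
>>>>> USE sample_db;
>>>>> CREATE TABLE IF NOT EXISTS sample_table (id INT, state VARCHAR(50));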
>>>>>
>>>>> [1]. https://docs.wso2.com/display/DAS300/Spark+Query+Language
>>>>>
>>>>> Thanks.
>>>>>
>>>>> --
>>>>> Thanuja Uruththirakodeeswaran
>>>>> Software Engineer
>>>>> WSO2 Inc.;http://wso2.com
>>>>> lean.enterprise.middleware
>>>>>
>>>>> mobile: +94 774363167
>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> *Imesh Gunaratne*
>>>> Senior Technical Lead
>>>> WSO2 Inc: http://wso2.com
>>>> T: +94 11 214 5345 M: +94 77 374 2057
>>>> W: http://imesh.gunaratne.org
>>>> Lean . Enterprise . Middleware
>>>>
>>>>
>>>
>>>
>>> --
>>> Thanuja Uruththirakodeeswaran
>>> Software Engineer
>>> WSO2 Inc.;http://wso2.com
>>> lean.enterprise.middleware
>>>
>>> mobile: +94 774363167
>>>
>>
>>
>>
>> --
>> *Imesh Gunaratne*
>> Senior Technical Lead
>> WSO2 Inc: http://wso2.com
>> T: +94 11 214 5345 M: +94 77 374 2057
>> W: http://imesh.gunaratne.org
>> Lean . Enterprise . Middleware
>>
>>
>>
>
>
> --
> *Sinthuja Rajendran*
> Associate Technical Lead
> WSO2, Inc.:http://wso2.com
>
> Blog: http://sinthu-rajan.blogspot.com/
> Mobile: +94774273955
>
>
>
>


-- 
Thanuja Uruththirakodeeswaran
Software Engineer
WSO2 Inc.;http://wso2.com
lean.enterprise.middleware

mobile: +94 774363167
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev
