Hi Niranda,

I have created a JIRA [1] to track the issues we have with the CarbonJDBC option.
[1] https://wso2.org/jira/browse/DAS-273

Thanks.

On Thu, Oct 1, 2015 at 3:12 PM, Sinthuja Ragendran <[email protected]> wrote:

> Hi Niranda,
>
> On Thu, Oct 1, 2015 at 2:28 PM, Inosh Goonewardena <[email protected]> wrote:
>
>> Hi Niranda,
>>
>> On Thu, Oct 1, 2015 at 1:33 PM, Sinthuja Ragendran <[email protected]> wrote:
>>
>>> Hi Niranda,
>>>
>>> On Thu, Oct 1, 2015 at 1:28 PM, Niranda Perera <[email protected]> wrote:
>>>
>>>> Hi Thanuja and Imesh,
>>>>
>>>> Let me clarify the use of the term "create temporary table" with regard
>>>> to Spark. Inside DAS we save ('persist') data in DAL (Data Access Layer)
>>>> tables. For Spark to query these tables, it needs some sort of mapping
>>>> to the DAL tables in its runtime environment. This mapping is what the
>>>> temporary table queries create. These temp tables are only a mapping,
>>>> not a physical table.
>>>>
>>>> @Thanuja, yes, you are correct! We have to manually create the tables in
>>>> MySQL before making the temp table mapping in Spark SQL.
>>>
>>> With the Carbon JDBC connector, can we try to create the table if it does
>>> not exist? Maybe we can let users pass the actual create table statement
>>> as another parameter in the options. IMHO it would be more user friendly
>>> if we could do that. WDYT?
>>
>> Yes, +1. For tables created using CarbonAnalytics it is possible to
>> provide the table schema as shown below [1]. I believe we can use a
>> similar approach in CarbonJDBC to provide the create table query.
>>
>> As per the current implementation, even though the table is created
>> manually before the script execution, executing an "insert overwrite ..."
>> statement will delete the original table and recreate it using a
>> generated schema (the schema information is generated from the original
>> table structure).
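[Editor's note: an illustrative sketch of the suggestion above. If CarbonJDBC accepted schema options the way CarbonAnalytics does, the mapping query might look as follows. The "schema" and "primaryKeys" options for CarbonJDBC are only a proposal in this thread, not an existing feature; the table and column names are taken from the examples quoted below.]

```sql
-- Hypothetical: CarbonJDBC with a schema option, mirroring CarbonAnalytics.
-- "schema" and "primaryKeys" are proposed here, not implemented.
CREATE TEMPORARY TABLE sample
USING CarbonJDBC
OPTIONS (dataSource "sample_datasource",
         tableName "sample_table",
         schema "house_id INT, household_id INT, plug_id INT, usage FLOAT",
         primaryKeys "household_id, plug_id"
);
```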
>> In this approach, the table that is re-created at query execution will
>> not have the primary keys and indexes of the original table (if there
>> were any). So if we can provide a complete create table query, we can
>> preserve the original table structure too.
>
> Yeah, +1.
>
>> On the other hand, I believe we should also support "insert into ..."
>> statements in CarbonJDBC. An "insert into ..." statement will not delete
>> and recreate the table like the "insert overwrite ..." statement does;
>> it will only update the existing table [2].
>
> Yeah, I also have a concern about this. Currently the insert overwrite
> statement drops the table and repopulates the data entirely, so a
> dashboard reading from the table may see empty or partial data. This is
> also an issue in the data purging scenario: the summarised data will be
> cleaned along with the original data, and no old data is available to
> repopulate the summarised data again. Can we make insert overwrite
> replace a row if it already exists, rather than dropping the entire
> table, to avoid such issues?
>
> Thanks,
> Sinthuja.
>
>> [1] CREATE TEMPORARY TABLE plugUsage
>>     USING CarbonAnalytics
>>     OPTIONS (tableName "plug_usage",
>>              schema "house_id INT, household_id INT, plug_id INT, usage FLOAT",
>>              primaryKeys "household_id, plug_id"
>>     );
>>
>> [2] [Dev] [Architecture] Carbon Spark JDBC connector
>>
>>> Thanks,
>>> Sinthuja.
>>>
>>>> On Thu, Oct 1, 2015 at 9:53 AM, Thanuja Uruththirakodeeswaran
>>>> <[email protected]> wrote:
>>>>
>>>>> In the DAS Spark environment, we can't directly insert the analyzed
>>>>> data into our MySQL table. We should create a temporary table using
>>>>> our datasources to manipulate them.
>>>>
>>>> Yes, can you please explain the reasons? What does this analysis do?
>>>> Why can't we insert it directly into the RDBMS?
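[Editor's note: to make the difference discussed above concrete, a minimal sketch of the two statement forms, using table and column names assumed from the examples quoted in this thread.]

```sql
-- "insert overwrite" replaces the target table's contents from the query
-- result; under the current CarbonJDBC implementation it drops and
-- recreates the table, losing the original primary keys and indexes.
INSERT OVERWRITE TABLE sample
SELECT house_id, household_id, plug_id, AVG(usage) AS usage
FROM plugUsage
GROUP BY house_id, household_id, plug_id;

-- "insert into" (proposed above for CarbonJDBC) would append to the
-- existing table instead, leaving its structure intact.
INSERT INTO TABLE sample
SELECT house_id, household_id, plug_id, AVG(usage) AS usage
FROM plugUsage
GROUP BY house_id, household_id, plug_id;
```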
>>>>
>>>> Thanks
>>>>
>>>> On Thu, Oct 1, 2015 at 9:53 AM, Thanuja Uruththirakodeeswaran
>>>> <[email protected]> wrote:
>>>>
>>>>> Hi Imesh,
>>>>>
>>>>> Taking the above scenario, I need to insert the analyzed/aggregated
>>>>> data, obtained as the result of Spark SQL processing, into my MySQL
>>>>> table (sample_table). To do that, we first need to create a temporary
>>>>> table in the Spark environment using the corresponding MySQL database
>>>>> (sample_datasource) and table (sample_table); only by inserting data
>>>>> into this temporary table can we update our MySQL table.
>>>>>
>>>>> In the DAS Spark environment, we can't directly insert the analyzed
>>>>> data into our MySQL table. We should create a temporary table using
>>>>> our datasources to manipulate them. I think that's why they named it
>>>>> a 'temporary' table.
>>>>>
>>>>> @Niranda Please correct me if I'm wrong.
>>>>>
>>>>> Thanks.
>>>>>
>>>>> On Thu, Oct 1, 2015 at 7:00 AM, Imesh Gunaratne <[email protected]> wrote:
>>>>>
>>>>>> Hi Thanuja,
>>>>>>
>>>>>> Can you please explain the purpose of these temporary tables?
>>>>>>
>>>>>> Thanks
>>>>>>
>>>>>> On Wed, Sep 30, 2015 at 11:53 PM, Thanuja Uruththirakodeeswaran
>>>>>> <[email protected]> wrote:
>>>>>>
>>>>>>> Hi All,
>>>>>>>
>>>>>>> When we create temporary tables in the Spark environment using the
>>>>>>> CarbonJDBC option as explained in [1], we use a datasource and a
>>>>>>> tableName from which the Spark temporary table will get its data,
>>>>>>> as follows:
>>>>>>>
>>>>>>> CREATE TEMPORARY TABLE <temp_table> USING CarbonJDBC
>>>>>>> OPTIONS (dataSource "<datasource name>", tableName "<table name>");
>>>>>>>
>>>>>>> I've used a MySQL database (sample_datasource) as the datasource,
>>>>>>> and a MySQL table created in that database (sample_table) as the
>>>>>>> tableName, as follows:
>>>>>>>
>>>>>>> CREATE TEMPORARY TABLE sample USING CarbonJDBC
>>>>>>> OPTIONS (dataSource "sample_datasource", tableName "sample_table");
>>>>>>>
>>>>>>> But I'm creating the MySQL database and tables by executing SQL
>>>>>>> statements manually. Is there a way in DAS to add these SQL
>>>>>>> statements to a script and create the database and tables when we
>>>>>>> start the server?
>>>>>>>
>>>>>>> [1] https://docs.wso2.com/display/DAS300/Spark+Query+Language
>>>>>>>
>>>>>>> Thanks.
>>>>>>>
>>>>>>> --
>>>>>>> Thanuja Uruththirakodeeswaran
>>>>>>> Software Engineer
>>>>>>> WSO2 Inc.; http://wso2.com
>>>>>>> lean.enterprise.middleware
>>>>>>>
>>>>>>> mobile: +94 774363167
>>>>>>
>>>>>> --
>>>>>> Imesh Gunaratne
>>>>>> Senior Technical Lead
>>>>>> WSO2 Inc: http://wso2.com
>>>>>> T: +94 11 214 5345 M: +94 77 374 2057
>>>>>> W: http://imesh.gunaratne.org
>>>>>> Lean . Enterprise . Middleware
>>>>>
>>>>> --
>>>>> Thanuja Uruththirakodeeswaran
>>>>> Software Engineer
>>>>> WSO2 Inc.; http://wso2.com
>>>>> lean.enterprise.middleware
>>>>>
>>>>> mobile: +94 774363167
>>>>
>>>> --
>>>> Imesh Gunaratne
>>>> Senior Technical Lead
>>>> WSO2 Inc: http://wso2.com
>>>> T: +94 11 214 5345 M: +94 77 374 2057
>>>> W: http://imesh.gunaratne.org
>>>> Lean . Enterprise . Middleware
>>>
>>> --
>>> Sinthuja Rajendran
>>> Associate Technical Lead
>>> WSO2, Inc.: http://wso2.com
>>>
>>> Blog: http://sinthu-rajan.blogspot.com/
>>> Mobile: +94774273955
>>
>> --
>> Thanks & Regards,
>>
>> Inosh Goonewardena
>> Associate Technical Lead - WSO2 Inc.
>> Mobile: +94779966317
>
> --
> Sinthuja Rajendran
> Associate Technical Lead
> WSO2, Inc.: http://wso2.com
>
> Blog: http://sinthu-rajan.blogspot.com/
> Mobile: +94774273955

--
Thanuja Uruththirakodeeswaran
Software Engineer
WSO2 Inc.; http://wso2.com
lean.enterprise.middleware

mobile: +94 774363167
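[Editor's note: until CarbonJDBC can create tables itself, the manual MySQL setup discussed in this thread can at least be kept in a DDL script and run before the server starts. A minimal sketch; the database and table names come from the examples above, but the exact column list and key are assumptions for illustration.]

```sql
-- init.sql: pre-create the database and table that the Spark temporary
-- table maps to. The schema here mirrors the plugUsage example in this
-- thread and is assumed, not prescribed.
CREATE DATABASE IF NOT EXISTS sample_datasource;

CREATE TABLE IF NOT EXISTS sample_datasource.sample_table (
    house_id     INT,
    household_id INT,
    plug_id      INT,
    `usage`      FLOAT,
    PRIMARY KEY (household_id, plug_id)
);
```

A script like this could be run with the mysql client (e.g. `mysql -u <user> -p < init.sql`) before starting the DAS server, so the "insert overwrite" queries have an existing table to map to.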
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev
