Hi,

I'm trying to do a batch insert using MyBatis and Phoenix, and I'm running
into an exception (org.springframework.jdbc.BadSqlGrammarException).

---------------------------------------------------------------------------------------------------------------------

Here is an example of what I'm doing:

*I have two entities:*
Authors { authorId, firstName, lastName }
Books { bookId, bookTitle, authorId }

*Data:*
Authors: Record-1: 001, john, henry
Books: Record-1: 001, A database primer, 001
Books: Record-2: 002, Building a datawarehouse, 001

*Model object:*
AuthorsBooksModel { authorId, firstName, lastName, booksList {bookId,
bookTitle} }
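
For clarity, here is a minimal sketch of the model as plain POJOs
(getters/setters omitted; field names mirror the entities above):

import java.util.List;

// Minimal sketch of the nested model; getters/setters omitted for brevity.
public class AuthorsBooksModel {
    public String authorId;
    public String firstName;
    public String lastName;
    public List<Book> booksList;

    public static class Book {
        public String bookId;
        public String bookTitle;
    }
}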

*Phoenix table (denormalized):*

authorsbooks { authorid, firstname, lastname, bookid, booktitle }
Create script: create table authorsbooks (authorid varchar, firstname
varchar, lastname varchar, bookid varchar, booktitle varchar, constraint
ab_pk primary key(authorid, bookid));

*Query:*
Using MyBatis batching, I create an upsert statement from my nested model
object (AuthorsBooksModel), which is passed to Phoenix:
upsert into authorsbooks (authorid, firstname, lastname, bookid, booktitle)
values ('001', 'john', 'henry', '001', 'A database primer'), ('001',
'john', 'henry', '002', 'Building a datawarehouse');

The above statement fails in the application with *BadSqlGrammarException*.
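
For reference, here is roughly how that statement gets generated (a
simplified sketch using MyBatis' annotation form; the interface and method
names are illustrative):

import org.apache.ibatis.annotations.Insert;

public interface AuthorsBooksMapper {
    // Builds one multi-row upsert per model object; booksList drives the
    // repeated value tuples. Property names mirror the model above.
    @Insert("<script>"
        + "upsert into authorsbooks (authorid, firstname, lastname, bookid, booktitle) values "
        + "<foreach collection='booksList' item='b' separator=','>"
        + "(#{authorId}, #{firstName}, #{lastName}, #{b.bookId}, #{b.bookTitle})"
        + "</foreach>"
        + "</script>")
    int upsertAuthorBooks(AuthorsBooksModel model);
}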

*Phoenix command line:*

When I execute the same upsert statement directly at the Phoenix console, I
see the following error:

0: jdbc:phoenix:test.abc.def.com,sfdv> upsert into authorsbooks (authorid,
firstname, lastname, bookid, booktitle) values ('001', 'john', 'henry',
'001', 'A database primer'), ('001', 'john', 'henry', '002', 'Building a
datawarehouse');

*Error: ERROR 602 (42P00): Syntax error. Missing "EOF" at line 1, column
136. (state=42P00,code=602)*
---------------------------------------------------------------------------------------------------------------------

The same scenario works fine with MyBatis against PostgreSQL.
*Create script:* create table authorsbooks (authorid varchar, firstname
varchar, lastname varchar, bookid varchar, booktitle varchar, constraint
ab_pk primary key(authorid, bookid));
*Insert:* insert into authorsbooks (authorid, firstname, lastname, bookid,
booktitle) values ('001', 'john', 'henry', '001', 'A database primer'),
('001', 'john', 'henry', '002', 'Building a datawarehouse');

---------------------------------------------------------------------------------------------------------------------

In my real-time application, denormalizing my nested model object will
produce hundreds of thousands of records, and executing one statement at a
time from MyBatis to Phoenix is going to be a network overhead concern
(roughly the pattern sketched below).

Can someone please take a look at the above scenario and help me fix it, or
suggest an alternative?

Thanks,
Vamsi Attluri.
