Hi all,

I found a typo in the jdbc-client-driver documentation on the web:
https://apacheignite-sql.readme.io/v2.4/docs/jdbc-client-driver#streaming-mode

The text with the typo is:
// Opening a connection in the streaming mode and time based flushing set.
Connection conn = 
DriverManager.getConnection("jdbc:ignite:cfg://streaming=true@streamingFlushFrequency=1000@file:///etc/config/ignite-jdbc.xml");

The corrected text (parameters separated by ':' rather than '@') is:
Connection conn =
DriverManager.getConnection("jdbc:ignite:cfg://streaming=true:streamingFlushFrequency=1000:cache=personCache@file:/root/apache-ignite-fabric-2.4.0-bin/examples/config/example-cache.xml");

I tried the above text, and it worked.
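For reference, a minimal sketch of using the corrected streaming-mode URL for batched inserts; the config path, cache name, and table are assumptions based on this thread, not confirmed details:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class StreamingInsertSketch {
    // Client-driver URL: parameters separated by ':', Spring XML path after '@'.
    // The path and cache name here are placeholders (assumptions).
    static final String URL =
        "jdbc:ignite:cfg://streaming=true:streamingFlushFrequency=1000:cache=personCache"
        + "@file:/etc/config/ignite-jdbc.xml";

    public static void main(String[] args) throws Exception {
        // Register the JDBC client driver.
        Class.forName("org.apache.ignite.IgniteJdbcDriver");

        try (Connection conn = DriverManager.getConnection(URL);
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO Person(_key, firstName, lastName) VALUES (?, ?, ?)")) {
            for (int i = 0; i < 1000; i++) {
                ps.setString(1, "key" + i);
                ps.setString(2, "first" + i);
                ps.setString(3, "last" + i);
                // In streaming mode, updates are buffered and flushed
                // roughly every streamingFlushFrequency milliseconds.
                ps.executeUpdate();
            }
        }
    }
}
```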

Thanks

Rick

From: Prasad Bhalerao [mailto:[email protected]]
Sent: Saturday, March 31, 2018 11:32 AM
To: [email protected]
Subject: Re: How to insert multiple rows/data into Cache once

Hi Andrey,

I have a similar requirement, and I am using the cache.putAll method to update
existing entries or insert new ones.
I will be updating/inserting close to 3 million entries in one go.

I am using the write-through approach to update/insert/delete the data in
Oracle tables.
I am using the cache store's writeAll/deleteAll methods to achieve this.

I am doing all of this in a single Ignite distributed transaction.


Now the questions are:
1a) Can I use a streamer inside an Ignite transaction?
1b) Can I use Ignite JDBC bulk update, insert, and delete inside an Ignite
distributed transaction?

2) If I use a streamer, will it invoke the cache store's writeAll method?
That is, does the write-through approach work with a streamer?


3) If I use JDBC bulk mode for cache updates, inserts, or deletes, will it
invoke the cache store's writeAll and deleteAll methods?
Does the write-through approach work with JDBC bulk update/insert/delete?


4) Does Ignite have any cache APIs meant only for updates? put/putAll will
insert or overwrite. What if I just want to update existing entries?
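On question 4: for what it's worth, IgniteCache implements the JCache interface, which has a replace(key, value) operation that updates only existing entries and never inserts; this is my reading, not confirmed in this thread. The semantics mirror java.util.concurrent.ConcurrentMap.replace, sketched here with a plain map for illustration:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class ReplaceSemantics {
    // Update-only write: succeeds only if the key already exists; never inserts.
    // replace() returns the previous value, or null when the key was absent.
    static boolean updateOnly(ConcurrentMap<String, String> cache,
                              String key, String value) {
        return cache.replace(key, value) != null;
    }

    public static void main(String[] args) {
        ConcurrentMap<String, String> cache = new ConcurrentHashMap<>();
        cache.put("existing", "v1");

        System.out.println(updateOnly(cache, "existing", "v2")); // true: updated
        System.out.println(updateOnly(cache, "missing", "v1"));  // false: no insert
        System.out.println(cache.containsKey("missing"));        // false
    }
}
```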


Thanks,
Prasad

On Fri, Mar 30, 2018, 11:12 PM Andrey Mashenkov
<[email protected]> wrote:
Hi,

Ignite has two JDBC drivers:
1. The client driver [1] starts a client node (with all failover features), and
you have to pass the client node config in the URL.
2. The thin driver [2] connects directly to one of the Ignite server nodes.

So, you need the first one to be able to use streaming mode.

[1] https://apacheignite-sql.readme.io/docs/jdbc-client-driver
[2] https://apacheignite-sql.readme.io/docs/jdbc-driver
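For reference, the two connection schemes look roughly like this; the host and config path below are placeholders of mine, not details from the thread:

```java
public class IgniteJdbcUrls {
    // Client driver: starts a full client node configured by a Spring XML file.
    static final String CLIENT_DRIVER_CLASS = "org.apache.ignite.IgniteJdbcDriver";
    static final String CLIENT_URL =
        "jdbc:ignite:cfg://cache=personCache@file:/etc/config/ignite-jdbc.xml";

    // Thin driver: connects over TCP to a single server node.
    static final String THIN_DRIVER_CLASS = "org.apache.ignite.IgniteJdbcThinDriver";
    static final String THIN_URL = "jdbc:ignite:thin://127.0.0.1";
}
```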

On Fri, Mar 30, 2018 at 1:16 PM,
<[email protected]> wrote:
Hi Andrey,

I am trying to run [2], as:
// Register JDBC driver.
Class.forName("org.apache.ignite.IgniteJdbcDriver");
// Opening connection in the streaming mode.
Connection conn = 
DriverManager.getConnection("jdbc:ignite:cfg://streaming=true@file:///etc/config/ignite-jdbc.xml");

However, I'm a bit confused about the ignite-jdbc.xml setting in [2].

I do not know how to find or create that XML file, and here I start the Ignite
node from within the JVM.

Can I write Java code to produce the ignite-jdbc configuration, or is a
complete Spring XML configuration the only option?
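For what it's worth, ignite-jdbc.xml is just a Spring beans file declaring an IgniteConfiguration bean that the client driver loads; a minimal sketch (the cache name and exact contents are my assumptions, not from the docs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">
    <bean class="org.apache.ignite.configuration.IgniteConfiguration">
        <property name="cacheConfiguration">
            <bean class="org.apache.ignite.configuration.CacheConfiguration">
                <property name="name" value="personCache"/>
            </bean>
        </property>
    </bean>
</beans>
```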

By the way, I have tried [1], and it worked well.

Finally, I still need to use SQL from a client node and write data into the
cache quickly.

Thank you for helping me

Rick


From: Andrey Mashenkov [mailto:[email protected]]
Sent: Thursday, March 29, 2018 6:20 PM
To: [email protected]
Subject: Re: How to insert multiple rows/data into Cache once
Subject: Re: How to insert multiple rows/data into Cache once

Hi,

Try to use DataStreamer for fast cache load [1].
If you need to use SQL, you can try bulk mode updates via JDBC [2].


Also, a COPY SQL command [3] will be available in the next release, 2.5.
The feature is already in master, so you can build from source to try it. See
example [4].

[1] https://apacheignite.readme.io/docs/data-streamers
[2] https://apacheignite.readme.io/v2.0/docs/jdbc-driver#section-streaming-mode
[3] https://issues.apache.org/jira/browse/IGNITE-6917
[4] 
https://github.com/apache/ignite/blob/master/examples/src/main/java/org/apache/ignite/examples/sql/SqlJdbcCopyExample.java
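A minimal sketch of the DataStreamer approach from [1]; the cache name and default node startup here are my assumptions:

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class StreamerSketch {
    // Key format used below; factored out only so it can be tested in isolation.
    static String key(int i) {
        return "key" + i;
    }

    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            ignite.getOrCreateCache("igniteCache");

            // The streamer batches entries internally and loads them in parallel,
            // which is much faster than per-entry put() calls.
            try (IgniteDataStreamer<String, String> streamer =
                     ignite.dataStreamer("igniteCache")) {
                for (int i = 0; i < 100_000; i++)
                    streamer.addData(key(i), Integer.toString(i));
            } // close() flushes any remaining buffered entries
        }
    }
}
```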

On Thu, Mar 29, 2018 at 11:30 AM,
<[email protected]> wrote:
Dear all,

I am trying to use the SqlFieldsQuery API to insert data into one cache on
Ignite.

I can insert one row into the cache at a time.

However, I do not know how to insert multiple rows into the cache at once.

For example, I would like to insert 1000 rows into the cache at once.

Here, I provide my code so that everyone can reproduce my situation.
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheAtomicityMode;
import org.apache.ignite.cache.CacheMode;
import org.apache.ignite.cache.query.SqlFieldsQuery;
import org.apache.ignite.cache.query.annotations.QuerySqlField;
import org.apache.ignite.configuration.CacheConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;

public class IgniteCreateServer {
  // Static so it can be instantiated without an enclosing IgniteCreateServer.
  public static class Person {
    @QuerySqlField
    private String firstName;
    @QuerySqlField
    private String lastName;

    public Person(String firstName, String lastName) {
      this.firstName = firstName;
      this.lastName = lastName;
    }
  }

  public static void main(String[] args) {
    IgniteConfiguration cfg = new IgniteConfiguration();

    CacheConfiguration<String, String> cacheConf = new CacheConfiguration<>();
    cacheConf.setName("igniteCache");
    cacheConf.setIndexedTypes(String.class, String.class);
    cacheConf.setCacheMode(CacheMode.REPLICATED);
    cacheConf.setAtomicityMode(CacheAtomicityMode.ATOMIC);
    cfg.setCacheConfiguration(cacheConf);

    Ignite igniteNode = Ignition.getOrStart(cfg);
    IgniteCache<String, String> cacheKeyvalue = igniteNode.getOrCreateCache(cacheConf);

    long starttime = System.currentTimeMillis();
    int datasize = 100000;
    for (int i = 0; i < datasize; i++) {
      cacheKeyvalue.put("key " + Integer.toString(i), Integer.toString(i));
    }
    long endtime = System.currentTimeMillis();
    System.out.println("write " + datasize + " pairkeyvalue data: spend "
        + (endtime - starttime) + " milliseconds");

    //=========================================================================

    CacheConfiguration<String, Person> cacheCfg = new CacheConfiguration<>();
    cacheCfg.setName("personCache");
    cacheCfg.setIndexedTypes(String.class, Person.class);
    cacheCfg.setCacheMode(CacheMode.REPLICATED);
    cacheCfg.setAtomicityMode(CacheAtomicityMode.ATOMIC);
    IgniteCache<String, Person> cacheKeyTable = igniteNode.getOrCreateCache(cacheCfg);

    long starttime1 = System.currentTimeMillis();
    for (int i = 0; i < datasize; i++) {
      // _key is passed as a String to match setIndexedTypes(String.class, Person.class).
      cacheKeyTable.query(new SqlFieldsQuery(
          "INSERT INTO Person(_key, firstName, lastName) VALUES(?,?,?)")
          .setArgs(Integer.toString(i), "key " + Integer.toString(i), Integer.toString(i)));
    }
    long endtime1 = System.currentTimeMillis();
    System.out.println("write " + datasize + " table data: spend "
        + (endtime1 - starttime1) + " milliseconds");
  }
}

My code's output is:
write 100000 pairkeyvalue data: spend 4734 milliseconds
write 100000 table data: spend 2846 milliseconds

From the above result, it seems that inserting data into the cache via SQL is
faster than using cache.put() in a loop.

I am not sure whether this is expected?

In addition, it is important for me to insert data into the cache via SQL, so I
would like to insert multiple rows at once to speed it up.
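One option, assuming Ignite SQL accepts the H2-style multi-row VALUES syntax, is a single multi-row INSERT; the statement can be built like this (the helper name is mine):

```java
public class MultiRowInsert {
    // Builds "INSERT INTO Person(_key, firstName, lastName)
    //         VALUES (?, ?, ?), (?, ?, ?), ..." for the given row count.
    static String buildInsert(int rows) {
        StringBuilder sb = new StringBuilder(
            "INSERT INTO Person(_key, firstName, lastName) VALUES ");
        for (int i = 0; i < rows; i++) {
            if (i > 0) sb.append(", ");
            sb.append("(?, ?, ?)");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildInsert(3));
    }
}
```

The resulting statement would then be passed to SqlFieldsQuery with 3 * rows arguments via setArgs, so many rows go through a single query instead of one query per row.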

If any further information is needed, I will be glad to provide it as soon as
possible.

Thanks

Rick







--
This email may contain confidential information. Please do not use or disclose
it in any way, and delete it if you are not the intended recipient.



--
Best regards,
Andrey V. Mashenkov





--
Best regards,
Andrey V. Mashenkov


