I have a 22 GB H2 database. The database became unresponsive, so I want to migrate it to PostgreSQL. I've tried the following:
- Use "script to dump" from the web front end to create a dump. However, Postgresql can't process this dump because it contains unknown functions. - Use a Java program to read from the h2 db and insert into the Postgresql db. However, this doesn't work. It takes one hour to even connect to the h2 database and the "select *" statement never executes. The Java program uses jdbc to connect to the databases and "select * from tweets" to create a resultSet Iterator. This approach worked before when the db was < 1 GB. top shows me that the java program is running with 15% - 100% cpu load. When I start the Java programm, the db doubles in size and I can't even connect to it with the web front end. However, I have a copy of the 22 GB file. What to do? I've already tried "shutdown defrag" and compact to no avail. I use h2-1.4.195. The db was created with create table tweets (id long auto_increment primary key, text varchar(255), created date, retweet boolean, truncated boolean, link boolean, followers int) select * works from the web front end only. system_sequence has value 177130355. -- You received this message because you are subscribed to the Google Groups "H2 Database" group. To unsubscribe from this group and stop receiving emails from it, send an email to [email protected]. To post to this group, send email to [email protected]. Visit this group at https://groups.google.com/group/h2-database. For more options, visit https://groups.google.com/d/optout.
