Hi all,
Would somebody know how a user password is protected in Derby?
Thank you very much.
Regards,
Greg.
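For reference, Derby's BUILTIN authentication provider stores user passwords that are set as database properties in hashed form rather than clear text (entries written into derby.properties, by contrast, stay as plain text on disk). A minimal sketch of enabling it, assuming an embedded database named myDB and a hypothetical user and password:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class SetupBuiltinAuth {
    public static void main(String[] args) throws Exception {
        // Database name is hypothetical.
        Connection conn =
            DriverManager.getConnection("jdbc:derby:myDB;create=true");
        CallableStatement cs = conn.prepareCall(
            "CALL SYSCS_UTIL.SYSCS_SET_DATABASE_PROPERTY(?, ?)");

        // Require authentication and select the BUILTIN provider.
        cs.setString(1, "derby.connection.requireAuthentication");
        cs.setString(2, "true");
        cs.execute();
        cs.setString(1, "derby.authentication.provider");
        cs.setString(2, "BUILTIN");
        cs.execute();

        // Define a user. Stored as a database property, the password is
        // kept by Derby in hashed form, not as the clear text below.
        cs.setString(1, "derby.user.greg");   // hypothetical user
        cs.setString(2, "secret");            // hypothetical password
        cs.execute();

        cs.close();
        conn.close();
    }
}

Once derby.connection.requireAuthentication is true, every connection must supply user and password attributes in the JDBC URL or connection properties.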
Thank you very much. It works!
Greg
On Sunday, 08 January 2006 at 15:21 -0800, Daniel John Debrunner wrote:
Grégoire Dubois wrote:
Here is some sample code.
Thanks for the repro.
If you change the statement type from ResultSet.TYPE_SCROLL_INSENSITIVE to
ResultSet.TYPE_FORWARD_ONLY, the slowdown should go away; a sketch follows.
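A minimal sketch of the suggested change, with a hypothetical connection URL; the scroll-insensitive variant backs the result with a materialized copy, which appears to be where the declared BLOB(2G) column width hurts:

import java.sql.*;

public class ForwardOnlyQuery {
    public static void main(String[] args) throws SQLException {
        Connection conn = DriverManager.getConnection("jdbc:derby:myDB");
        // TYPE_FORWARD_ONLY instead of TYPE_SCROLL_INSENSITIVE.
        Statement stmt = conn.createStatement(
            ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        ResultSet rs = stmt.executeQuery(
            "SELECT DISTINCT db_file.ID, db_file.name, db_file.reference,"
            + " db_file.hash FROM db_file ORDER BY db_file.name");
        while (rs.next()) {
            System.out.println(rs.getString("name"));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}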
Hi all,
Here is the SELECT I run on the following table. If file is a BLOB(2G), the query is very slow (30-60 s), even when there is only one row in the table. But if I replace BLOB(2G) with BLOB(5M) or BLOB(1G), the query becomes very fast.
Is there a reason? Is there a workaround?
I don't read the blob in my query (the blob column is db_file.file, and it isn't used in the SELECT):
SELECT DISTINCT db_file.ID,db_file.name,db_file.reference,db_file.hash FROM db_file ORDER BY db_file.name;
And the slowness of the query isn't related to the data actually stored in the blob; it is …
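For reference, the shape of the table can be reconstructed from the SELECT above. Only the column names and the BLOB(2G) type come from the thread; the other column types are assumptions for illustration:

import java.sql.*;

public class CreateDbFileTable {
    public static void main(String[] args) throws SQLException {
        // Database name is hypothetical.
        Connection conn =
            DriverManager.getConnection("jdbc:derby:myDB;create=true");
        Statement stmt = conn.createStatement();
        stmt.executeUpdate(
            "CREATE TABLE db_file ("
            + "ID INT PRIMARY KEY, "
            + "name VARCHAR(255), "
            + "reference VARCHAR(255), "
            + "hash CHAR(40), "
            + "file BLOB(2G))");   // the declared size that triggers the slowdown
        stmt.close();
        conn.close();
    }
}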
OK, the problem must be on my side, as I see the same behaviour with MySQL.
Thanks for your help.
Best regards.
On Monday, 09 January 2006 at 20:02 +0100, Grégoire Dubois wrote:
I don't read the blob in my query (the blob column is db_file.file, and it isn't used in the SELECT):
SELECT
Here is some sample code.
I write a file from disk into the database, then read it back from the database and write it out to disk.
import java.sql.*;
import java.io.*;
/**
*
* @author greg
*/
public class derby_filewrite_fileread {
private static File file = new
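The archived snippet breaks off above. A self-contained sketch of the same round trip, with hypothetical file and database names, using streams on both sides so the file never has to fit in memory at once:

import java.io.*;
import java.sql.*;

public class DerbyFileWriteFileRead {
    public static void main(String[] args) throws Exception {
        File input  = new File("input.bin");   // hypothetical paths
        File output = new File("output.bin");

        Connection conn =
            DriverManager.getConnection("jdbc:derby:myDB;create=true");
        Statement s = conn.createStatement();
        s.executeUpdate("CREATE TABLE files (id INT PRIMARY KEY, data BLOB)");

        // Write: stream the file into the BLOB column.
        PreparedStatement ins =
            conn.prepareStatement("INSERT INTO files VALUES (?, ?)");
        ins.setInt(1, 1);
        InputStream in = new BufferedInputStream(new FileInputStream(input));
        ins.setBinaryStream(2, in, (int) input.length()); // this overload takes an int length
        ins.executeUpdate();
        in.close();
        ins.close();

        // Read: stream the BLOB back out to disk in small chunks.
        ResultSet rs = s.executeQuery("SELECT data FROM files WHERE id = 1");
        if (rs.next()) {
            InputStream blobIn = rs.getBinaryStream(1);
            OutputStream out =
                new BufferedOutputStream(new FileOutputStream(output));
            byte[] buf = new byte[8192];
            for (int n; (n = blobIn.read(buf)) != -1; ) {
                out.write(buf, 0, n);
            }
            out.close();
            blobIn.close();
        }
        rs.close();
        s.close();
        conn.close();
    }
}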
Hi all,
I am using the embedded driver.
I can write big blobs into the database, but reading blobs larger than a certain size fails with a memory error (java.lang.OutOfMemoryError).
How to repeat:
Insert a blob of about 10 MB or more and then try to read it back out.
Does someone else have the same problem?
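A common cause of this error is materializing the whole value at once, for example with ResultSet.getBytes; reading through a stream keeps memory use flat. A sketch, assuming the hypothetical files(id INT, data BLOB) table from the earlier example:

import java.io.InputStream;
import java.sql.*;

public class BlobReadDemo {
    // Reads the BLOB in fixed-size chunks. rs.getBytes(1) would instead
    // pull the whole value into one byte[] and can run out of heap.
    static long streamBlob(Connection conn, int id) throws Exception {
        PreparedStatement ps =
            conn.prepareStatement("SELECT data FROM files WHERE id = ?");
        ps.setInt(1, id);
        ResultSet rs = ps.executeQuery();
        long total = 0;
        if (rs.next()) {
            InputStream in = rs.getBinaryStream(1);
            byte[] buf = new byte[8192];
            for (int n; (n = in.read(buf)) != -1; ) {
                total += n;   // process each chunk here instead of buffering it
            }
            in.close();
        }
        rs.close();
        ps.close();
        return total;
    }
}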
Hi all,
I inserted a 700 MB file into a blob (using the embedded JDBC driver).
Then I used DELETE to remove the blob from the database. The blob is
indeed removed, but the disk space isn't reclaimed.
Is there a command that reclaims disk space when using blobs?
Thank you
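Derby does not shrink the database files on DELETE by itself, but the SYSCS_UTIL.SYSCS_COMPRESS_TABLE system procedure rebuilds a table and returns its unused space to the filesystem. A sketch, with hypothetical schema and table names:

import java.sql.*;

public class ReclaimBlobSpace {
    public static void main(String[] args) throws SQLException {
        Connection conn = DriverManager.getConnection("jdbc:derby:myDB");
        // 'APP' and 'DB_FILE' are assumptions; the third argument requests
        // a sequential (slower but lower-memory) rebuild.
        CallableStatement cs = conn.prepareCall(
            "CALL SYSCS_UTIL.SYSCS_COMPRESS_TABLE(?, ?, ?)");
        cs.setString(1, "APP");
        cs.setString(2, "DB_FILE");
        cs.setShort(3, (short) 1);
        cs.execute();
        cs.close();
        conn.close();
    }
}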
It works great with the embedded driver. The problem comes with the
network client driver.
I filed a bug report for this: DERBY-550.
I'll watch DERBY-326.
On Thursday, 01 September 2005 at 11:02 -0700, Sunitha Kambhampati wrote:
Rajes Akkineni wrote:
Hi,
I have a similar problem.
Hi all,
I'm using the org.apache.derby.jdbc.ClientDriver driver to access the
Derby database over the network.
Writing small files (smaller than 5 MB) into the database works fine,
but I can't write big files (40 MB, for example) without getting
java.lang.OutOfMemoryError.
I think …
With more than 2 GB of RAM, is there another way to write big BLOBs into Derby?
Thank you very much.
Grégoire.
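Until the client driver can stream large values (the DERBY-326 work mentioned earlier), one workaround, not specific to Derby, is to split the file across several rows of a smaller BLOB column so that no single value has to be buffered whole. A sketch with a hypothetical table and an arbitrary chunk size:

import java.io.*;
import java.sql.*;

public class ChunkedBlobWrite {
    private static final int CHUNK = 4 * 1024 * 1024; // 4 MB per row (arbitrary)

    public static void main(String[] args) throws Exception {
        File input = new File("big.bin");             // hypothetical path
        Connection conn = DriverManager.getConnection(
            "jdbc:derby://localhost:1527/myDB");      // network client URL

        Statement s = conn.createStatement();
        s.executeUpdate(
            "CREATE TABLE file_chunks ("
            + "file_id INT, seq INT, data BLOB(5M), "
            + "PRIMARY KEY (file_id, seq))");
        s.close();

        PreparedStatement ins = conn.prepareStatement(
            "INSERT INTO file_chunks VALUES (?, ?, ?)");
        InputStream in = new BufferedInputStream(new FileInputStream(input));
        byte[] buf = new byte[CHUNK];
        int seq = 0;
        for (int n; (n = in.read(buf)) != -1; seq++) {
            ins.setInt(1, 1);                         // file id
            ins.setInt(2, seq);                       // chunk order
            ins.setBinaryStream(3, new ByteArrayInputStream(buf, 0, n), n);
            ins.executeUpdate();                      // at most CHUNK bytes in flight
        }
        in.close();
        ins.close();
        conn.close();
    }
}

Reading the file back is the mirror image: select the chunks ordered by seq and append each one to the output file.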
On Wednesday, 31 August 2005 at 07:23 -0700, Kathey Marsden wrote:
Grégoire Dubois wrote:
Hi all,
I'm using the org.apache.derby.jdbc.ClientDriver driver to access the
Derby database
Perhaps this could help: the Mckoi DBMS (http://mckoi.com/database/)
has a JDBC driver that supports blob streaming over the network, and the
code is open source.
I tried IBM DB2 and, as with Derby, streaming isn't supported by its
JDBC driver. For such a big DBMS, I found that odd.
I'm aiming to try it.