Re: [sqlite] reg:blob data reading

2006-09-19 Thread Dennis Jenkins

Dennis Jenkins wrote:

Teg wrote:

Hello Dennis,
  I'm probably going to be offering optional encryption too. Why did
you choose to use the SQLite encryption extensions versus just
encrypting the blobs after you read them back in and before you write
them out?


1) We wanted the entire database encrypted.  There is sensitive 
non-blob data too.


2) Dr. Hipp's encryption extension is well tested and already 
integrated into sqlite.


3) The encryption is very transparent to the rest of our application.  
I don't have to manually call functions to look up keys and encrypt or 
decrypt blocks of data.




4) Updates to the blobs can now take advantage of the ACIDity of the 
sqlite engine.  (Is that a valid use of the acronym 'ACID'?  Gotta love 
the English language.  We can conjugate anything any way we want to.)
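
For illustration, a minimal sketch of what that transparency looks like when the
commercial encryption extension (SEE) is compiled in; sqlite3_key() belongs to
that add-on rather than the public amalgamation, and the file name and key below
are made up:

    /* Sketch only: opening an encrypted database with the SEE add-on.
     * sqlite3_key() is declared by SEE builds of sqlite3.h, not by the
     * stock amalgamation.  File name and key are placeholders. */
    #include <sqlite3.h>

    static int open_encrypted(sqlite3 **db)
    {
        int rc = sqlite3_open("app.db", db);
        if (rc != SQLITE_OK) return rc;

        /* After this call every page is encrypted/decrypted transparently;
         * the rest of the application just runs ordinary SQL. */
        return sqlite3_key(*db, "correct horse battery staple", 28);
    }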







Re: Re[4]: [sqlite] reg:blob data reading

2006-09-18 Thread Thomas . L
On Mon, 18 Sep 2006 10:20:36 -0400, you wrote:

>There's no right or wrong way.

There is a paper from June 2006 with the title:
   "To BLOB or Not To BLOB:
Large Object Storage in a Database or a Filesystem?"

at
http://research.microsoft.com/research/pubs/view.aspx?type=technical%20report=1089

Maybe it will shed some light... ;-)


Best Regards
Thomas

www.thlu.de




Re[2]: [sqlite] reg:blob data reading

2006-09-18 Thread Teg
Hello Dennis,

Monday, September 18, 2006, 11:50:03 AM, you wrote:

DJ> Jay Sprenkle wrote:
>> On 9/18/06, Teg <[EMAIL PROTECTED]> wrote:
>>> Hello Jay,

DJ> Everyone has different needs.  We like keeping all of the data (blobs
DJ> included) in one data file.  We also use the encryption extension, and
DJ> it is mandatory that our blobs be encrypted.  I don't need "read 
DJ> arbitrary byte ranges from a blob" for my work project, but I could use
DJ> them in a personal project that involves sqlite (no encryption here; but
DJ> it is important to keep all data in one data file).




I'm probably going to be offering optional encryption too. Why did
you choose to use the SQLite encryption extensions versus just
encrypting the blobs after you read them back in and before you write
them out?

-- 
Best regards,
 Teg    mailto:[EMAIL PROTECTED]





Re: [sqlite] reg:blob data reading

2006-09-18 Thread Dennis Jenkins

Jay Sprenkle wrote:

On 9/18/06, Teg <[EMAIL PROTECTED]> wrote:

Hello Jay,

The whole reason I store files in the DB in the first place is to have
a single "package" to move around and backup when needed. My
application is storing whole series of PNG and JPG files in the
DB with meta data describing where the images came from.


My technique won't help you then. I use it for things like scanning images
of documents and using the database to keep track of the documents.
I never have to search a picture using a select statement so it would
be silly for me to put them into the database. I just back up the file
system using off the shelf backup software and it works fine.



Everyone has different needs.  We like keeping all of the data (blobs 
included) in one data file.  We also use the encryption extension, and 
it is mandatory that our blobs be encrypted.  I don't need "read 
arbitrary byte ranges from a blob" for my work project, but I could use 
them in a personal project that involves sqlite (no encryption here; but 
it is important to keep all data in one data file).






Re: Re[4]: [sqlite] reg:blob data reading

2006-09-18 Thread Jay Sprenkle

On 9/18/06, Teg <[EMAIL PROTECTED]> wrote:

Hello Jay,

There's no right or wrong way. I was just suggesting that there are
cases where you want to store the whole file in the DB. I have an
application that generates 1000's of 150K compressed files. I've been
toying with the idea of shoving them all onto a DB because of the way
Windows groans when you have to enumerate folders with many small
files.


The scanner I use takes care of that by making a directory for each
batch of documents scanned. I've seen that same thing on Windows
systems for other projects. I ended up having to create a manager that would
store files into numbered sub directories to avoid it.


The downside of course is I'd have to vacuum the tables from time to
time.


That's why we get paid the 'big bucks'   ;)




Re[4]: [sqlite] reg:blob data reading

2006-09-18 Thread Teg
Hello Jay,

Monday, September 18, 2006, 10:05:19 AM, you wrote:

JS> On 9/18/06, Teg <[EMAIL PROTECTED]> wrote:
>> Hello Jay,
>>
>> The whole reason I store files in the DB in the first place is to have
>> a single "package" to move around and backup when needed. My
>> application is storing whole series of PNG and JPG files in the
>> DB with meta data describing where the images came from.

JS> My technique won't help you then. I use it for things like scanning images
JS> of documents and using the database to keep track of the documents.
JS> I never have to search a picture using a select statement so it would
JS> be silly for me to put them into the database. I just back up the file
JS> system using off the shelf backup software and it works fine.

JS> --
JS> SqliteImporter and SqliteReplicator: Command line utilities for Sqlite
JS> http://www.reddawn.net/~jsprenkl/Sqlite

JS> Cthulhu Bucks!
JS> http://www.cthulhubucks.com



There's no right or wrong way. I was just suggesting that there are
cases where you want to store the whole file in the DB. I have an
application that generates 1000's of 150K compressed files. I've been
toying with the idea of shoving them all onto a DB because of the way
Windows groans when you have to enumerate folders with many small
files.

In the case of these small files, performance is dominated by
enumerating and decompressing so, even if it's a bit slower selecting
the files out of the DB, any improvement in enumeration speed would make
a noticeable performance boost.

The downside of course is I'd have to vacuum the tables from time to
time.

-- 
Best regards,
 Teg    mailto:[EMAIL PROTECTED]
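
For reference, the periodic vacuum mentioned above is a single statement; a
minimal sketch through the C API (error handling trimmed, connection assumed
already open):

    /* Sketch: reclaim space left behind by deleted blob rows.
     * VACUUM rewrites the whole database file, so run it sparingly. */
    #include <sqlite3.h>
    #include <stdio.h>

    static void compact_db(sqlite3 *db)
    {
        char *errmsg = NULL;
        if (sqlite3_exec(db, "VACUUM;", 0, 0, &errmsg) != SQLITE_OK) {
            fprintf(stderr, "VACUUM failed: %s\n", errmsg ? errmsg : "unknown");
            sqlite3_free(errmsg);
        }
    }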





Re: [sqlite] reg:blob data reading

2006-09-18 Thread sandhya
Thank you very much; I just need confirmation on this.
If you don't mind, could you please tell me: if I want to perform that kind of
operation, is there any way other than storing the blob in some temp file and
reading it using normal fopen() calls?

Please do the needful.
Thanks a lot
Sandhya R




- Original Message - 
From: <[EMAIL PROTECTED]>
To: <sqlite-users@sqlite.org>; "Teg" <[EMAIL PROTECTED]>
Sent: Monday, September 18, 2006 7:32 PM
Subject: Re: [sqlite] reg:blob data reading


> SQLite does not (at this time) have the ability to incrementally
> read or write BLOBs.  You have to read and write the whole blob
> all at once.
> --
> D. Richard Hipp   <[EMAIL PROTECTED]>
>
>






Re: Re[2]: [sqlite] reg:blob data reading

2006-09-18 Thread Jay Sprenkle

On 9/18/06, Teg <[EMAIL PROTECTED]> wrote:

Hello Jay,

The whole reason I store files in the DB in the first place is to have
a single "package" to move around and backup when needed. My
application is storing whole series of PNG and JPG files in the
DB with meta data describing where the images came from.


My technique won't help you then. I use it for things like scanning images
of documents and using the database to keep track of the documents.
I never have to search a picture using a select statement so it would
be silly for me to put them into the database. I just back up the file
system using off the shelf backup software and it works fine.

--
SqliteImporter and SqliteReplicator: Command line utilities for Sqlite
http://www.reddawn.net/~jsprenkl/Sqlite

Cthulhu Bucks!
http://www.cthulhubucks.com




Re: [sqlite] reg:blob data reading

2006-09-18 Thread drh
SQLite does not (at this time) have the ability to incrementally
read or write BLOBs.  You have to read and write the whole blob
all at once.
--
D. Richard Hipp   <[EMAIL PROTECTED]>
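
Since the whole blob comes back in one piece, the read is just an ordinary
SELECT; a minimal sketch with the C API, assuming an illustrative table
files(name TEXT, data BLOB):

    /* Sketch only: read an entire BLOB column in one shot.
     * Table "files(name TEXT, data BLOB)" is assumed for illustration. */
    #include <sqlite3.h>
    #include <stdlib.h>
    #include <string.h>

    /* Returns a malloc'd copy of the blob (caller frees), or NULL. */
    static unsigned char *read_whole_blob(sqlite3 *db, const char *name, int *len)
    {
        sqlite3_stmt *stmt;
        unsigned char *copy = NULL;
        *len = 0;

        if (sqlite3_prepare(db, "SELECT data FROM files WHERE name = ?",
                            -1, &stmt, 0) != SQLITE_OK)
            return NULL;

        sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);

        if (sqlite3_step(stmt) == SQLITE_ROW) {
            const void *blob = sqlite3_column_blob(stmt, 0);
            int n = sqlite3_column_bytes(stmt, 0);
            copy = malloc(n > 0 ? n : 1);
            if (copy && n > 0)
                memcpy(copy, blob, n);  /* pointer from column_blob dies at finalize */
            *len = n;
        }
        sqlite3_finalize(stmt);
        return copy;
    }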





Re[2]: [sqlite] reg:blob data reading

2006-09-18 Thread Teg
Hello Jay,

Monday, September 18, 2006, 9:23:27 AM, you wrote:

JS> On 9/18/06, sandhya <[EMAIL PROTECTED]> wrote:
>> I think too, if they are Big-Blobs, it is better to store only a Reference
>> to a File.
>>
>> May i know litlle more clearly about this?What it mean actually?

JS> Store the path to the file in the database ( C:\somefile.dat   or
JS> /tmp/somefile.dat ).
JS> Then open the file using regular file handling routines ( fopen() etc ).



The whole reason I store files in the DB in the first place is to have
a single "package" to move around and backup when needed. My
application is storing whole series of PNG and JPG files in the
DB with meta data describing where the images came from.

I like the concept of being able to set an upper limit on the number
of retrieved bytes on a blob. I don't see any easy way to do it
though. My image files tend to be from 50-500K so, performance-wise, it's
pretty quick.

-- 
Best regards,
 Teg    mailto:[EMAIL PROTECTED]





Re: [sqlite] reg:blob data reading

2006-09-18 Thread Jay Sprenkle

On 9/18/06, sandhya <[EMAIL PROTECTED]> wrote:

I think too, if they are Big-Blobs, it is better to store only a Reference
to a File.

May I know a little more clearly about this? What does it mean actually?


Store the path to the file in the database ( C:\somefile.dat   or
/tmp/somefile.dat ).
Then open the file using regular file handling routines ( fopen() etc ).
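
A minimal sketch of that approach, assuming an illustrative table
documents(id INTEGER PRIMARY KEY, path TEXT); the names are placeholders:

    /* Sketch only: the database stores a path, the bytes live on disk. */
    #include <sqlite3.h>
    #include <stdio.h>

    static FILE *open_document(sqlite3 *db, int doc_id)
    {
        sqlite3_stmt *stmt;
        FILE *fp = NULL;

        if (sqlite3_prepare(db, "SELECT path FROM documents WHERE id = ?",
                            -1, &stmt, 0) != SQLITE_OK)
            return NULL;

        sqlite3_bind_int(stmt, 1, doc_id);
        if (sqlite3_step(stmt) == SQLITE_ROW) {
            const char *path = (const char *)sqlite3_column_text(stmt, 0);
            if (path)
                fp = fopen(path, "rb");  /* ordinary file I/O from here on */
        }
        sqlite3_finalize(stmt);
        return fp;
    }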




Re: [sqlite] reg:blob data reading

2006-09-18 Thread sandhya
You mean,
  I have to get the entire BLOB from the DB, store it in some
temp file, and read it from there?

Is there no way we can read the required number of bytes of the data from the
DB directly, setting the pointer or handle to the current position?
Please tell me whether the way I am thinking is wrong.

Help me
Thank you
Sandhya






- Original Message - 
From: "Jay Sprenkle" <[EMAIL PROTECTED]>
To: <sqlite-users@sqlite.org>
Sent: Monday, September 18, 2006 6:53 PM
Subject: Re: [sqlite] reg:blob data reading


> On 9/18/06, sandhya <[EMAIL PROTECTED]> wrote:
> > I think too, if they are Big-Blobs, it is better to store only a Reference
> > to a File.
> >
> > May I know a little more clearly about this? What does it mean actually?
>
> Store the path to the file in the database ( C:\somefile.dat   or
> /tmp/somefile.dat ).
> Then open the file using regular file handling routines ( fopen() etc ).
>






[sqlite] reg:blob data reading

2006-09-18 Thread sandhya
I think too, if they are Big-Blobs, it is better to store only a Reference
to a File.

May I know a little more clearly about this? What does it mean actually?

Right now what I am doing is this: I have a directory containing many files,
and I have loaded all of the files into the database, with their content stored
as blobs.
But my application reads only 512 bytes at a time. If that is the case,
how can I handle this data?
Please suggest your views.

Thanks a lot
Sandhya R
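
Given drh's answer earlier in the thread (the blob has to be read all at
once), one workaround is to pull the whole blob into memory and hand the
application its 512-byte reads from that buffer. A sketch, with made-up names,
building on the read_whole_blob() helper sketched earlier:

    /* Sketch only: serve fixed 512-byte reads out of an in-memory copy of
     * the blob, since SQLite (at the time) cannot read blobs incrementally.
     * read_whole_blob() is the hypothetical helper sketched earlier. */
    #include <string.h>

    typedef struct {
        unsigned char *data;   /* full blob, fetched in one shot */
        int            len;
        int            pos;    /* current read offset */
    } blob_reader;

    /* Copies up to 512 bytes into out; returns how many bytes were copied. */
    static int blob_read_chunk(blob_reader *r, unsigned char out[512])
    {
        int n = r->len - r->pos;
        if (n <= 0) return 0;
        if (n > 512) n = 512;
        memcpy(out, r->data + r->pos, n);
        r->pos += n;
        return n;
    }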