> 1. If you are committing 1MB of data into the database per operation,
> that would take a looong time, and I'm not sure if that is the most
> efficient, or even the easiest way to do what you are trying to do. Why
> I say that is, Oracle is fine when you break up the data and commit,
> perhaps line by line, to the database. Not very fast or efficient, but
> still manageable. My experience of Oracle, though limited, has not
> exactly given me much confidence in its ability to handle huge data in
> individual fields. Stuff the fields with too much data, and try to work
> with them, and you will realize the necessity to tune Oracle. I'm not an
> expert on Oracle, nor do I care to be, so I won't try this.

I put huge amounts of data into Oracle using transactions, and it is fast and
quite efficient.  The largest LOB I have tried so far is about 43 MB.
Breaking the text up by line would make the SQL quite a bit more
complicated, so I'd stay away from that.

Also, if you set up an interMedia Text index you can do keyword searches
from SQL against the text data (assuming it's in a CLOB).  Setting up the
index isn't too hard, but if you want it in a particular tablespace (other
than the user's default tablespace) you'll have to change your default
tablespace to the one you want the index in before creating the index;
after the index is created you can change back to the old one.  Also, the
text index doesn't auto-update, so you'll have to issue a command
explicitly to trigger the index update.
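
For reference, the index setup described above looks roughly like this.
This is a sketch, not tested against your schema: the table and index
names are hypothetical, and the sync call (ctx_ddl.sync_index) is the
mechanism on more recent Oracle releases -- on older ones the equivalent
is an ALTER INDEX ... REBUILD with a 'sync' parameter, so check the docs
for your version:

  -- Hypothetical table holding the text in a CLOB.
  CREATE TABLE documents (
    id   NUMBER PRIMARY KEY,
    body CLOB
  );

  -- Create the text index (switch your default tablespace first if you
  -- want the index somewhere other than the user's default, as above).
  CREATE INDEX documents_body_idx ON documents (body)
    INDEXTYPE IS ctxsys.context;

  -- Keyword search straight from SQL.
  SELECT id
    FROM documents
   WHERE CONTAINS(body, 'oracle') > 0;

  -- The index doesn't maintain itself; after committing new rows,
  -- trigger a synchronization explicitly.
  BEGIN
    ctx_ddl.sync_index('documents_body_idx');
  END;
  /

Until the sync runs, CONTAINS queries simply won't see the newly
committed rows.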

--mikej
-=-----
mike jackson
[EMAIL PROTECTED]
