We are getting a feed of an 800 MB file that will come in nightly. It needs to be
loaded to the database. Per requirements, we have to add some data to the file before
loading (it's not negotiable).
ksh eats up 24% of total CPU on a 4-CPU Solaris box. We cannot do this. I am not
allowed to
Have you considered Perl? Also, which load are you referring to: adding data to the
data file you receive, or loading it into the db?
Raj
Rajendra dot Jamadagni at nospamespn dot com
Ryan,
Could you cat the second file onto the end of the first file and have
the data load successfully?
cat file2 file1
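A minimal sketch of the concatenation idea (the filenames and contents here are placeholders, not from the original thread). Note that `cat` streams its arguments in order, so `file2`'s contents end up in front of `file1`'s:

```shell
# Stand-in files for the extra data and the nightly feed (contents are illustrative).
printf 'header-row\n' > file2
printf 'feed-row\n'   > file1

# cat only streams bytes in argument order, so this avoids the per-line
# overhead of a ksh read/append loop.
cat file2 file1 > combined.dat
cat combined.dat
```

Because `cat` does no per-line interpretation, it is about as cheap as a file edit can get on a large feed; the CPU cost is essentially I/O.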
How about a second box to perform the editing of the data file?
Something that resource-intensive and mandatory should not have a
problem getting funded.
Ron
Ryan - Can you provide more details? Typically ksh scripts are much, much
more efficient than alternative methods, such as manipulating data within the
database. Depending on which method you are using to measure CPU usage, you
may be seeing 1/4 of one CPU. But even if your script is using a full
If you're on 9i, external tables and pipelined table functions should be
useful...
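A hedged sketch of the external-table approach suggested above (the directory path, table names, and column layout are illustrative assumptions, not from the thread). On 9i an external table lets Oracle read the flat file in place, so the "extra data" can be added in SQL during the load instead of by editing the 800 MB file first:

```sql
-- Directory object pointing at where the nightly file lands (path is a placeholder).
CREATE DIRECTORY feed_dir AS '/data/feeds';

-- External table over the flat file; Oracle reads it like a regular table.
CREATE TABLE nightly_feed_ext (
  id      NUMBER,
  payload VARCHAR2(4000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY feed_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('feed.dat')
);

-- The required extra data is added as the rows stream in, with no
-- separate file-editing pass (target table and added column are hypothetical).
INSERT /*+ APPEND */ INTO nightly_feed
SELECT id, payload, SYSDATE AS loaded_at
FROM   nightly_feed_ext;
```

For row-by-row enrichment that a simple SELECT expression cannot handle, a pipelined table function over the external table keeps the transformation streaming rather than staging the whole file.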
on 1/22/04 7:59 AM, [EMAIL PROTECTED] at [EMAIL PROTECTED] wrote: