data manipulation of a large Unix file

2004-01-22 Thread ryan.gaffuri
We are getting a feed of an 800 MB file that will come in nightly. It needs to be loaded into the database. Per requirements, we have to add some data to the file before loading (it's not negotiable). ksh eats up 24% of total CPU on a 4-CPU Solaris box. We cannot do this. I am not allowed to
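[Editorial sketch, hedged: the post doesn't say what data has to be added, so file names, delimiter, and the added value below are all assumptions. It only shows the general shape of doing the edit in one streaming pass with awk rather than a ksh read/print loop, which is usually where ksh burns CPU on a file this size.]

    #!/bin/ksh
    # Assumed names and layout; the real feed format isn't given in the thread.
    FEED=/data/feed/nightly_feed.dat
    OUT=/data/feed/nightly_feed_loadable.dat
    LOAD_DATE=$(date +%Y%m%d)

    # Append the load date as an extra pipe-delimited column on every record,
    # reading the 800 MB file once and never holding it in memory.
    awk -v d="$LOAD_DATE" -F'|' -v OFS='|' '{ print $0, d }' "$FEED" > "$OUT"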

RE: data manipulation of a large Unix file

2004-01-22 Thread Jamadagni, Rajendra
Have you considered Perl? Also, the load you mention is for what: adding data to the data file you get, or loading it into the db? Raj
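[Editorial sketch, hedged: one way the Perl route might look, again with assumed file names and an assumed extra field. perl -pe streams the file line by line, so the 800 MB feed is never slurped into memory.]

    # Append an assumed load-date field to each record and write a new copy
    # for the load; the value and paths are placeholders, not from the thread.
    perl -pe 's/$/|20040122/' /data/feed/nightly_feed.dat > /data/feed/nightly_feed_loadable.dat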

Re: data manipulation of a large Unix file

2004-01-22 Thread Ron Rogers
Ryan, Could you cat the second file onto the end of the first file and have the data load successfully? cat file2 file1 How about a second box to perform the editing of the data file? Something that resource-intensive and mandatory should not have a problem getting funded. Ron
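[Editorial sketch, hedged: taking the cat idea one step further, the concatenated stream could be fed to SQL*Loader through a named pipe so no second 800 MB copy ever hits disk. File names, control file, and connect string below are assumptions.]

    #!/bin/ksh
    PIPE=/tmp/feed_pipe.dat
    mkfifo "$PIPE"

    # Writer: the extra records first, then the nightly feed, into the pipe.
    cat /data/feed/extra_records.dat /data/feed/nightly_feed.dat > "$PIPE" &

    # Reader: SQL*Loader treats the pipe like an ordinary data file.
    sqlldr userid=scott/tiger control=feed.ctl data="$PIPE" direct=true

    rm -f "$PIPE"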

RE: data manipulation of a large Unix file

2004-01-22 Thread DENNIS WILLIAMS
Ryan - Can you provide more details? Typically ksh scripts are much, much more efficient than alternate methods, such as manipulating data within the database. Depending on which method you are using to measure CPU usage, you may be seeing 1/4 of one CPU. But even if your script is using a full
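[Editorial note, hedged: a quick way to check which interpretation applies. On Solaris, prstat reports a process's CPU column as a share of all processors, so 24% on a 4-CPU box would be roughly one CPU fully busy; ps gives another per-process view. The PID below is a placeholder for whatever the ksh job runs as.]

    prstat -p 12345 5 1
    ps -o pid,pcpu,time,args -p 12345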

Re: data manipulation of a large Unix file

2004-01-22 Thread Tim Gorman
If you're on 9i, external tables and pipelined table functions should be useful...
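[Editorial sketch, hedged: the external-table half of this suggestion, wrapped in a ksh call to sqlplus as a DBA script typically would be. Directory, table, and column names are assumptions, the target table is assumed to exist, and the "added data" is shown as just a load date supplied in the SELECT, since the thread doesn't say what must be added.]

    #!/bin/ksh
    sqlplus -s scott/tiger <<'EOF'
    CREATE OR REPLACE DIRECTORY feed_dir AS '/data/feed';

    CREATE TABLE feed_ext (
      cust_id   NUMBER,
      amount    NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE oracle_loader
      DEFAULT DIRECTORY feed_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|'
      )
      LOCATION ('nightly_feed.dat')
    );

    -- The extra data is added in SQL, so the 800 MB flat file is never
    -- rewritten on the Unix side.
    INSERT /*+ APPEND */ INTO feed_target (cust_id, amount, load_date)
    SELECT cust_id, amount, SYSDATE
    FROM   feed_ext;

    COMMIT;
    EXIT
    EOF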