I have a stored procedure that exports data from a series of tables to a specific format. The file is fixed-width, and there is a data layout document for the mapping. Anyway, my question is...
Each row contains about 40,000 columns, and there are thousands of rows. I build each row in a CLOB, concatenating onto it until the next row starts. Basic stuff. This is very slow, I'm guessing because I'm using CLOBs. In my testing with a VARCHAR2(4000) the procedure is very fast, though obviously with the VARCHAR2 I have to limit the data I retrieve to under 4,000 characters. What I need to know is: is there a better way to handle this?
My first thought is to create 10 VARCHAR2 variables, spread the data across them, and bring it together at the end. Or I could create a table with a VARCHAR2 column, spread the data across multiple rows, and assemble that at the end.
The CLOBs work fine when exporting just a few rows, but when I try to export a few hundred, the procedure takes about an hour and temp tablespace grows by 768 MB.
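To make the 10-variable idea concrete, here is roughly what I have in mind, simplified to a single buffer: concatenate into a VARCHAR2(32767) and only touch the CLOB with DBMS_LOB.WRITEAPPEND when the buffer is nearly full, so the CLOB is written a few thousand times instead of millions. This is just a sketch; the table and column names are made up, and the field widths would come from the layout document.

```sql
DECLARE
  l_out CLOB;
  l_buf VARCHAR2(32767);  -- staging buffer; flushed to the CLOB in large chunks

  PROCEDURE append(p_text IN VARCHAR2) IS
  BEGIN
    -- Flush the buffer into the CLOB only when the next piece won't fit
    IF NVL(LENGTH(l_buf), 0) + NVL(LENGTH(p_text), 0) > 32000 THEN
      DBMS_LOB.WRITEAPPEND(l_out, LENGTH(l_buf), l_buf);
      l_buf := NULL;
    END IF;
    l_buf := l_buf || p_text;
  END;
BEGIN
  DBMS_LOB.CREATETEMPORARY(l_out, cache => TRUE);

  FOR r IN (SELECT col1, col2 FROM some_table) LOOP  -- made-up table/columns
    append(RPAD(r.col1, 20));  -- pad each field to its fixed width
    append(RPAD(r.col2, 30));
    append(CHR(10));           -- end of row
  END LOOP;

  -- Final flush of whatever is left in the buffer
  IF l_buf IS NOT NULL THEN
    DBMS_LOB.WRITEAPPEND(l_out, LENGTH(l_buf), l_buf);
  END IF;

  -- ... write l_out to the export file here ...
  DBMS_LOB.FREETEMPORARY(l_out);
END;
/
```

The point is that each VARCHAR2 concatenation is cheap in-memory work, while every CLOB append goes through the LOB machinery (and temp segments for temporary LOBs), which would explain both the runtime and the temp space growth I'm seeing.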
Thanks,
David

