Along the lines of TG's suggestion about a redesign ... Maybe it doesn't
need to be too drastic. Would it be feasible to process 50 x 1,000
values?

You could build the arrays of 1,000 (even 10,000 would be significantly
better, I would think) invoices in a matrix, substitute 1,000 values,
then tumble through the existing (unchanged?) logic.
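
An untested sketch of what I mean, borrowing the XREF/ARFILE names and
the READV from Mark's posted code (the 1,000 chunk size and the PASS
variable are just illustrative, and exact syntax may differ between D3
and UV/UD):

    INV.TABLE = ''
    L = 0 ; D = 1
    LOOP WHILE D DO
        PASS = '' ; N = 0
        LOOP WHILE N < 1000 AND D DO
            REMOVE ID FROM XREF AT L SETTING D
            READV BAL FROM ARFILE, ID, 10 THEN
                IF BAL # 0 THEN
                    PASS := @VM : ID  ;* plain append, no delimiter scan
                    N += 1
                END
            END
        REPEAT
        INV.TABLE := PASS  ;* splice each chunk in one operation
    REPEAT
    IF LEN(INV.TABLE) THEN
        INV.TABLE = INV.TABLE[2, LEN(INV.TABLE)]  ;* drop leading @VM
    END

The other 14 TABLE variables would get the same chunk-and-splice
treatment, with the final substring stripping the leading value mark.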

Just a thought  
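
Alternatively, since Mark's dimensioned BIG(MV) build already runs in 8
seconds, the 15 TABLE strings could be spun off it in one pass
afterwards, appending with the concatenation operator instead of
<1,-1>. Another untested sketch, with MV and BIG as in his second
example:

    INV.TABLE = ''
    FOR I = 1 TO MV
        INV.TABLE := @VM : BIG(I)<1>  ;* append, never rescans the string
    NEXT I
    INV.TABLE = INV.TABLE[2, LEN(INV.TABLE)]  ;* strip the leading value mark

That would keep the generated program's TABLE names intact, so nothing
else in the 1,500 lines has to change.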

>-----Original Message-----
>From: [EMAIL PROTECTED] 
>[mailto:[EMAIL PROTECTED] On Behalf Of Mark Johnson
>Sent: Saturday, 13 August 2005 3:50 AM
>To: [email protected]
>Subject: Re: [U2] Remove Scenario
>
>I tried that, although the proportional fonts in emails sort of 
>hid it in my original post. I think it was my second example, 
>labeled "slight improvement". If you're suggesting I improve my 
>MV counter, then I doubt that's where the issue lies. The real 
>bottleneck is truly the string of TABLE multi-values that 
>will, in essence, have 50,000 values.
>
>Thanks.
>
>----- Original Message -----
>From: <[EMAIL PROTECTED]>
>To: <[email protected]>
>Sent: Friday, August 12, 2005 1:33 AM
>Subject: Re: [U2] Remove Scenario
>
>
>> One thought. I seem to remember that using VAR += 1 is significantly
>> faster than VAR = VAR + 1. It's not much, but maybe it will help.
>>
>> Karl
>>
>> <quote who="Mark Johnson">
>> > Here's a doozy.
>> >
>> > Thanks for the previous suggestion of using REMOVE instead of the <>
>> > extractions. That's working very well.
>> >
>> > New problem.
>> >
>> > One client's application is written in The Programmer's Helper (TPH),
>> > which MATREADS and has EQUATES assigning variables like INVNO.TABLE TO
>> > CUST(40), etc.
>> >
>> > The program is written with INVNO.TABLE<1,X> style extracts everywhere.
>> > There are probably 15 mv'd fields with the suffix TABLE and their mv
>> > counters are in sync.
>> >
>> > Prior to using REMOVE (it had an issue on D3), I MATREAD in a
>> > BIG(300000) array which breezed through the high item count of 155,000
>> > records. (REMOVE took 8 seconds, BIG took around 12, and <> took over
>> > 9 minutes.)
>> >
>> > Here's the rub. This is a Cash Receipt application where the BIG array
>> > is one customer's invoices. The load-in process jogs through the BIG
>> > array and, for those items with a non-zero balance, it creates these
>> > 15 TABLE variables. Trouble is, if there's 155,000 total records for
>> > this 1 customer, 100,000 may have a balance of zero, leaving 50,000 to
>> > be handled in the application.
>> >
>> > So while REMOVE is a great way to extract from BIG as a dynamic array,
>> > and MATREAD is great for extracting from a DIM array, what would be
>> > the best way to build these 15 separately named TABLE variables? The
>> > original program (sans REMOVE) looked like this:
>> >
>> > C=DCOUNT(BIG,CHAR(254))
>> > FOR I=1 TO C
>> >     ID=BIG<I>
>> >     READV BAL FROM ARFILE, ID, 10 THEN
>> >         IF BAL # 0 THEN
>> >             INV.TABLE<1,-1>=ID
>> >             AAA.TABLE<1,-1>=SOMETHING ELSE
>> >             BBB.TABLE<1,-1>=SOMETHING ELSE
>> >             CCC.TABLE<1,-1>=SOMETHING ELSE
>> >             MMM.TABLE<1,-1>=SOMETHING ELSE
>> >         END
>> >     END
>> > NEXT I
>> >
>> > So while REMOVE is a great extractor for these 150,000 fields, what
>> > is a great inserter for these 15 TABLE variables? In essence, the
>> > BAL # 0 case is 50,000 records.
>> >
>> > I tried
>> > MV=MV+1
>> > INV.TABLE<1,MV>=ID
>> > etc
>> >
>> > and got a minor improvement.
>> >
>> > I tried
>> > INV.TABLE := @VM : SOMETHING ELSE
>> > etc
>> >
>> > and got a slightly better improvement.
>> >
>> > In either case, you could see the progressive (exponential) delay as
>> > it performs these 50,000 (x 15) TABLE actions.
>> >
>> > I tried using my DIM BIG(300000) where the dim element number was 
>> > the insertable MV and I used the dynamic array concept on each 
>> > dimensioned array element. Thus:
>> >
>> > MV=0 ; L=0
>> > LOOP
>> >     REMOVE ID FROM XREF AT L SETTING D
>> >     READV BAL FROM ARFILE, ID, 10 THEN
>> >         IF BAL # 0 THEN
>> >             MV=MV+1
>> >             BIG(MV)<1>=ID
>> >             BIG(MV)<2>=SOMETHING ELSE
>> >             BIG(MV)<3>=SOMETHING ELSE
>> >             BIG(MV)<15>=SOMETHING ELSE
>> >         END
>> >     END
>> > UNTIL D=0 DO ; REPEAT
>> >
>> > and it took only 8 seconds. Cool. So now I have a dimensioned BIG 
>> > array with 50,000 elements each having 15 attributes.
>> >
>> > Because the infidel TABLE variables are scattered throughout this
>> > generated 1,500-line program, I don't want to search and replace them
>> > all with their BIG(MV)<12> equivalents unless I really have to.
>> > Eventually, I have to take these mv'd TABLE variables and writev (sic)
>> > them onto the data file.
>> >
>> > MATBUILD doesn't seem to work with 2-dimensioned dimensioned arrays,
>> > nor with elements containing attributes or values. It only likes the
>> > elements being simple variables.
>> >
>> > If this were a report program I would kick it off on a phantom and be
>> > done with it. Since it's a user-oriented Cash Receipts program, the
>> > user literally waits 5-9 minutes while a single customer 'loads'. Of
>> > course, the larger, more important customers are handled more
>> > frequently, thus more headaches.
>> >
>> > So the question is whether there is an INSERT or append function with
>> > the magic of REMOVE.
>> >
>> > Thanks for any insights.
>> > Mark Johnson
>> > -------
>> > u2-users mailing list
>> > [email protected]
>> > To unsubscribe please visit http://listserver.u2ug.org/
>> >
>>
>>
>> --
>> Karl L. Pearson
>> Director of IT,
>> ATS Industrial Supply
>> Direct: 801-978-4429
>> Toll-free: 800-789-9300
>> Fax: 801-972-3888
>> http://www.atsindustrial.com
>> [EMAIL PROTECTED]
