Hi Dennis,

Sorry, I am not in favour of this if it means I would lose the ability to index arrays with text, or would have to specify the array size in advance.
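For anyone following the thread who hasn't used this feature: here is a minimal gawk sketch (my own illustration, not from the original discussion) of what text-keyed associative arrays buy you. The keys are arbitrary strings, nothing is declared in advance, and the array grows on demand:

```shell
# Count word frequencies with gawk's associative arrays.
# No dimensions, no key range, no type declarations needed.
echo "apple banana apple" |
awk '{ for (i = 1; i <= NF; i++) count[$i]++ }   # keys appear on first use
     END { for (word in count) print word, count[word] }' |
sort
# prints:
# apple 2
# banana 1
```

(The `sort` is there because awk's `for (key in array)` traversal order is unspecified.) Doing the same thing with a pre-declared, integer-indexed array would mean writing your own string-to-index lookup first, which is exactly the extra work being objected to here.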
Personally, I need to index with text keys when I don't know in advance what the size of the array will be. With that kind of constraint, the solution you suggest would in fact considerably slow down the programs I write. My experience with a program doing exactly the same thing: Visual Basic (which does *exactly* what you request), 5 hours; Awk (which does *exactly* what you complain about), 10 minutes. Gawk does it very fast without needing either of those two options. For evidence, look at this link: http://www.cs.wustl.edu/~loui/sigplan "Two pearls in GAWK: its regular expressions and its associative arrays."

By chance, could SQLite help speed up your array processing? After all, a database with two columns of data is just another representation of an array, and you can apply a formula to transform the values of one column.

Marielle

To all the speed freaks: I know that I have pushed for faster array processing and have even proposed an "array sub-processor" as a possible solution. However, after giving this much thought, I don't believe that a separate array sub-processor is needed to solve the problem of speed. Transcript could provide the speed needed for processing arrays in a simpler way: if Transcript had an array declaration command that let the user fix the dimensions and data size/type of an array, and limited the "keys" to integer indexes, then high-speed array processing could be built in.

_______________________________________________
use-revolution mailing list
[email protected]
Please visit this url to subscribe, unsubscribe and manage your subscription preferences:
http://lists.runrev.com/mailman/listinfo/use-revolution
