How big of an array have you created without noticing any reduction in speed?

I'm pondering the best way to store a large conglomerate of data, and was hoping for a general guideline as to how humongous an array I could create without tripping over it.

I'm currently set up to break it into smaller pieces. However, it would be awesome if I could store the data all together.

The array would hold about 300,000 items of data in total, broken into about 300 keys, each holding about 1,000 items.

Currently I have 300 separate arrays planned, only one being loaded at any given time.
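For what it's worth, the question can be tested empirically. Here is a rough sketch in Python as a loose analogy (Revolution arrays are associative, so a nested dict is a reasonable stand-in; all names, sizes, and timings below are illustrative assumptions, not Revolution behavior):

```python
import random
import time

def build(num_keys=300, items_per_key=1000):
    # One nested structure holding everything at once:
    # ~300 top-level keys x ~1,000 sub-items = ~300,000 entries.
    return {
        "key%d" % k: {"item%d" % i: k * items_per_key + i
                      for i in range(items_per_key)}
        for k in range(num_keys)
    }

def time_lookups(data, trials=100_000):
    # Time random lookups to see whether sheer size slows access.
    keys = list(data)
    t0 = time.perf_counter()
    for _ in range(trials):
        k = random.choice(keys)
        _ = data[k]["item500"]
    return time.perf_counter() - t0

data = build()
print(len(data), sum(len(v) for v in data.values()))  # 300 300000
print("lookup time: %.3f s" % time_lookups(data))
```

Since both dicts and Revolution-style associative arrays use hashed key lookup, access time should stay roughly flat as the array grows; the bigger costs tend to be memory footprint and any whole-array operations (saving, copying, iterating all keys).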

Have any of you worked with arrays this size?  Any speed issues?

Shari
--
WINDOWS and MACINTOSH shareware games
Blackjack Gold
http://www.gypsyware.com
_______________________________________________
use-revolution mailing list
[email protected]
Please visit this url to subscribe, unsubscribe and manage your subscription 
preferences:
http://lists.runrev.com/mailman/listinfo/use-revolution
