This was exactly what I was thinking about in case of memory limitations,
but our sparse arrays should not exceed 1 MB of data (even as the result of
summing up millions of records), so I hope we won't need to use CACHETEMP.
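
If we do end up needing it, I imagine something along the lines of the rough
sketch below, using a ^CacheTempX global indexed by $JOB as you suggest (the
subscripts come from my original example; the second value is made up):

 ; rough sketch only: per-process scratch space in the CACHETEMP database
 s ^CacheTempX($J,"Apple","1997","december","Venezuela")=2000
 s ^CacheTempX($J,"Apple","1997","december","Colombia")=1500
 ; ... build up and aggregate under ^CacheTempX($J,...) as needed ...
 ; remove this process's scratch data when finished
 k ^CacheTempX($J)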

I wonder why nobody has written an OLAP server for Cache'; there should be a
market for it.
(Maybe there is one and I just don't know about it.)
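
For what it's worth, the dimension-collapsing routine I mentioned in my
original post (quoted below) could look roughly like this sketch;
CollapseCountry and the variable names are only illustrative, and both
arrays are passed by reference:

CollapseCountry(ProductsSold,Result) ; sum away the country subscript
 ; rough sketch: walk the sparse array with $ORDER and accumulate
 n prod,year,month,country
 k Result
 s prod="" f  s prod=$o(ProductsSold(prod)) q:prod=""  d
 . s year="" f  s year=$o(ProductsSold(prod,year)) q:year=""  d
 . . s month="" f  s month=$o(ProductsSold(prod,year,month)) q:month=""  d
 . . . s country="" f  s country=$o(ProductsSold(prod,year,month,country)) q:country=""  d
 . . . . s Result(prod,year,month)=$g(Result(prod,year,month))+ProductsSold(prod,year,month,country)
 q
 ;
 ; called e.g. as:  d CollapseCountry(.ProductsSold,.Result)
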
Thanks a lot!
Max

"Sam Jones" <[EMAIL PROTECTED]> ha scritto nel messaggio
news:[EMAIL PROTECTED]
> If you have a lot of data and local arrays become too slow, you
> could use a global in CACHETEMP indexed by $J so that each
> "user" has their own space to work in. If you don't want to
> create a global mapping, all globals whose names begin with
> "CacheTemp" are mapped to the CACHETEMP database. Cache'
> SQL uses ^CacheTemp, so you might want to use ^CacheTempX
> (for example).
>
>
> "Max Sebastiani" <[EMAIL PROTECTED]> wrote in message
> news:[EMAIL PROTECTED]
> > Cheers Ramón, and
> > thanks to everybody for your replies!
> >
> > Wow, this is great news for me:
> > Cache' seems to be the best technology for this kind of thing.
> >
> > Performance is not an issue in our case, but using sparse arrays improves
> > the quality of the code a lot, and this makes the difference for us, since
> > we're going to re-sell the solution to more than one customer.
> >
> > Max
> >
> >
> > "Max Sebastiani" <[EMAIL PROTECTED]> ha scritto nel messaggio
> > news:[EMAIL PROTECTED]
> > > Hello, I'm wondering if sparse arrays in COS are suitable for
> > > multidimensional data analysis.
> > >
> > > My intention is to store information into arrays like:
> > >
> > > s ProductsSold("Apple","1997","december","Venezuela")=2000
> > >
> > > and create some routines to manipulate them (collapse dimensions,
> > > apply functions to each term, distribute values, etc.).
> > >
> > > My first question is: are sparse arrays (in memory, not globals)
> > > limited to 32k?
> > >
> > > Max
> > >
> > >
> >
> >
>
>


