Hi Malcolm,

> I understand the problem of determining memory requirements is more
> complicated than just volume.  There are the number of nodes to consider,
> and the number and type of the indexes.
>
> I was curious if any formulas existed I could utilize to determine minimum
> memory requirements.
>

No, there is no such formula.  Apart from the data structure and indexes,
there is one more important factor - the workload - i.e. the queries/updates
you run.  Besides, what does "enough" mean?  100MB is *physically* enough to
run any query on any data.

I believe the only really effective approach is to analyze your queries.  Run
them and look at how many blocks they read/write (this information is
available in the event log after the session is closed).
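
For example, a small script can tally those counters from the log.  This is
only a rough sketch: it assumes (hypothetically) that per-session lines in
event.log contain text like "blocks read: 1234, blocks written: 56" - check
the format your Sedna version actually writes and adjust the pattern.

    #!/usr/bin/env python
    # Hypothetical sketch: sum block I/O counters from a Sedna event log.
    # The regex assumes lines such as
    #   "... blocks read: 1234, blocks written: 56 ..."
    # Adjust it to match what your event.log actually contains.
    import re
    import sys

    PATTERN = re.compile(r"blocks read:\s*(\d+).*?blocks written:\s*(\d+)")

    def summarize(log_path):
        reads = writes = 0
        with open(log_path) as log:
            for line in log:
                match = PATTERN.search(line)
                if match:
                    reads += int(match.group(1))
                    writes += int(match.group(2))
        return reads, writes

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "event.log"
        reads, writes = summarize(path)
        print("blocks read: %d, blocks written: %d" % (reads, writes))

Multiplying the totals by your block size gives a rough idea of how much data
that particular workload touches.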

Ivan Shcheklein,
Sedna Team