Hi

> I'm writing an application that performs complex computations on
> biomolecules. The application needs to read a number of different
> structures (= molecules) from disk and then perform the necessary
> calculations. These molecules are held in an STL vector, for example:
>
> std::vector<CMolecule> m_vMolecules;
>
> This works fine as long as the vector doesn't exceed the available RAM.
> Unfortunately, this is not the general case, because the number of
> molecules to be read is typically higher than 20,000, so when RAM runs
> out the application's efficiency decreases dramatically. Perhaps I could
> read only as many molecules as fit in memory and, when the rest of the
> molecules are needed, release the first chunk and read a new one. Is
> this a good approach? If so, any help on the best way to implement this
> scheme?

How much RAM do you have available? I've allocated vectors with around 20
million elements before without any problem, but I had 2GB of RAM to play
with.

Are you reserving the vector's capacity up front? This can help, as the
vector will not need to reallocate during population. We hit a problem
where the vector ran out of memory even though it occupied only half the
available RAM: reallocation needed a second block of at least the same
size to copy the elements into, and no such block could be allocated.

Are you sorting your elements? If so, consider holding them by pointer.
That way, when you move them around you copy only pointers, not entire
data structures.

Regards
Paul

Paul Grenyer
email: [EMAIL PROTECTED]
web: http://www.paulgrenyer.co.uk
articles: http://www.paulgrenyer.dyndns.org/articles/

Jensen should have gone to Williams.
Ecclestone is killing the sport.


_______________________________________________
msvc mailing list
[EMAIL PROTECTED]
See http://beginthread.com/mailman/listinfo/msvc_beginthread.com for subscription 
changes, and list archive.
