My understanding is that if my server formats messages into strings (to
send to the clients), then each string is allocated new memory from the
heap.  When the strings have been sent and go out of scope (i.e., a search
from the program roots finds no reference to that string), the string
memory may then be reclaimed by the garbage collector.  However, I read
that the garbage collector does not activate until more memory is needed
from the heap.
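To show what I mean, here's a rough sketch - in Java rather than C#, but the
string and GC behavior is analogous, and all the names below are just mine
for illustration:

```java
// Sketch: every formatted message is a brand-new heap object,
// even when two messages have identical contents.
public class MessageDemo {
    static String formatMessage(int id, String payload) {
        // String.format allocates a new String on each call.
        return String.format("MSG %d: %s", id, payload);
    }

    public static void main(String[] args) {
        String a = formatMessage(1, "hello");
        String b = formatMessage(1, "hello");
        System.out.println(a.equals(b)); // same contents...
        System.out.println(a == b);      // ...but two distinct heap objects
        // Once a and b become unreachable from the roots, the collector
        // *may* reclaim them - but only whenever it decides to run.
    }
}
```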

Is that all correct?  Because that would imply that server processes would
slowly grow until they reached their heap maximum.  That does not seem like
a desirable effect...

Or do wonks code their servers using only static Char buffers and
StringBuilder classes?  I have explored this a bit - it's kinda hard to
avoid the system routines that require a string parameter, especially since
you can't cast a StringBuilder to a string.  Yes, I know I can convert the
StringBuilder into a string with ToString(), but that allocates a new
string - which was just what I was trying to avoid...
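The reuse pattern I was picturing looks roughly like this (again sketched in
Java for illustration; setLength(0) plays the role of resetting the buffer,
and the class and method names are made up):

```java
// Reusing a single StringBuilder so each message doesn't allocate
// a fresh buffer.  The catch: handing the result to any routine that
// wants a String still forces a toString() allocation at the end.
public class ReusableFormatter {
    private final StringBuilder buf = new StringBuilder(256);

    String format(int id, String payload) {
        buf.setLength(0);               // reset, keeping the backing array
        buf.append("MSG ").append(id).append(": ").append(payload);
        return buf.toString();          // the unavoidable String allocation
    }
}
```

The buffer itself survives across calls, so the per-message garbage shrinks
to just that one final string.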

Would someone please smack me between the eyes, tell me what stupid/wrong
assumption I'm making, make fun of me (optional), and set me straight?

Or - Is there a list of "best practices" in using strings?

Many thanks,
-Scott Burrington

You can read messages from the Advanced DOTNET archive, unsubscribe from
Advanced DOTNET, or subscribe to other DevelopMentor lists at
http://discuss.develop.com.