Brett W. McCoy wrote:
> On 8/19/07, mano M <[EMAIL PROTECTED]> wrote:
> 
>>   I am using a shell command to sort an input file in my C++ program:
>>   system("sort -t',' -nk1,1 -nk2,2 -nk3,3");
>>
>>   I would like to know how much memory is required to sort a big file of 500 MB.
> 
> That depends on the sort implementation, the OS, etc. (I am guessing
> you are using Linux or some other Unix variant). A 500 MB file is
> going to take a good bit of time to sort. The sort program probably
> uses intermediate files rather than doing it all in memory; you might
> want to take a look at its source code.

According to its documentation, the 'sort' command under Windows does an 
in-memory sort using up to 90% of available RAM and switches to a merge 
sort with a temporary file if more than that is needed.

If you wanted to sort a 500 MB file entirely in memory (assuming 500 MB 
is available), you could load it all in one go, count how many lines are 
in the file, and multiply that by sizeof(char *) to estimate the 
additional RAM needed for a pointer array.  That's a rough estimate, 
give or take a couple hundred bytes.  For example, if the lines average 
50 bytes, that's roughly 10 million lines, so the pointer array adds 
about 40 MB with 4-byte pointers (80 MB with 8-byte pointers).  The 
actual sort operation would just rearrange pointers to the data (which 
you pre-fill with pointers to the start of each line in the big 500 MB 
chunk of data you loaded in).

-- 
Thomas Hruska
CubicleSoft President
Ph: 517-803-4197

*NEW* MyTaskFocus 1.1
Get on task.  Stay on task.

http://www.CubicleSoft.com/MyTaskFocus/
