Confirmed, as long as you use the correct parameters, 'cp' will fully duplicate everything. Whoever ported it did a good job.
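For anyone who hasn't used it, the parameters I mean are the usual Unix ones: -r to recurse into subdirectories and -p to preserve the file dates and permissions. Assuming the port follows standard Unix cp behaviour and is started the usual C68 way (win1_ and win2_ here are just example drive names), the copy would look something like:

EX cp;'-rp win1_ win2_'

The exact option letters and invocation may differ with the port, so check its documentation first.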

jim

On 08-11-2009, at 7:32 PM, P Witte wrote:

James Hunkins wrote:

I have had that happen occasionally in the past, especially when I was doing a lot of development for QDT, which generated a lot of files every time I rebuilt it. My guess is that it is some sort of fragmentation or, less likely, a corruption of the file system tables.

What I did to resolve it was to recursively copy all directories and files to a brand new drive, then, after exiting QPC, rename the original as a backup (just in case) and rename the new one as the original. In QPC you can also just change the names in the configuration for each drive. I used the 'cp' command (I think it came with some Unix command set) to do the copying. One time-saving hint: do not run it in verbose mode, or, depending on how many files there are, it will seem to take forever. Once you have verified that the new drive is good, you can get rid of the backup of the original. I have never had a failure doing this.

Note that copying the entire qxl.win file takes the whole directory data structure with it, so if it had a problem before, it will still have one.

Exactly right, Jim. Way to go! I presume cp maintains the file dates, etc.? Otherwise I have a home-grown routine for this if anyone needs one.

On 08-11-2009, at 6:42 PM, P Witte wrote:

Next step will be to create a new blank QXL.WIN and copy everything over to see if that gives the same result (i.e. checking for a fragmented qxl.win?)
That's what I'd try. Even if you copied all your files into the root directory, they should all fit!
I just tried:
for i = 1 to 999999: save 'win5_t'&i: at 0,0:print i,
on a blank qxl.win file. It's still going strong at 16000 files plus!
I'll let you know the final answer unless Marcel gets there first ;o)

Can't wait for the final result, as after about 16k files in one directory things really start to slow down! No wonder: for each save, the filing subsystem has to check every existing entry before the next save can occur.
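That makes the cost quadratic: saving file number n means scanning the n-1 entries already present, so N files cost roughly N*N/2 entry checks in total, which is already over 10^8 at 16k files. If anyone wants to watch the curve, a timing variant of my one-liner should show it. A rough sketch (DATE returns seconds, and win5_ and the file name stem are just the names from my test):

t = DATE
FOR i = 1 TO 30000
  SAVE 'win5_t'&i
  IF i MOD 1000 = 0 THEN AT 0,0: PRINT i, DATE - t: t = DATE
END FOR i

Each printed figure is the seconds taken for the last 1000 saves, so it should creep steadily upwards.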

Having determined that there is no arbitrary limit, I presume that the max number of files in one directory is probably around 32k (though it could be 65k, depending on whether the relevant 16-bit count is treated as signed or unsigned). However, the practical limit at present is about 16k - 20k, as beyond that, even on a 2.66 GHz Core Duo, things slow down unbearably.

Qpac absolutely hates having to read such a large directory, and if sorting is switched on it takes forever!

A theoretical answer would be much quicker to obtain, but I'm rather rusty at the moment and wouldn't know where to look for the data. But this way of finding out also gives some practical insights.

The final figure at closing time: 27490 files in the root directory! And that took almost one hour to complete, an average of roughly 7-8 files per second.

Good night!

Per
_______________________________________________
QL-Users Mailing List
http://www.q-v-d.demon.co.uk/smsqe.htm
