MY largest file so far. :)
>> On Tue, 16 Feb 2010 11:09:11 -0500, "Dan H. Eicher" <[email protected]> said:

> Thanks.
>
> Another freaky file, but not the 4TB sqlite files that have been the
> problem in the past.
>
> PS.. Can I put in a dsm.excl for files over a certain size?
>
> -rw-------   1 akhade   grad   98P  Jan 29 17:15  ak.txt
>
> r...@backup-0:[/export/homes20/akhade/DBI/BACKUP/Chris]# ls -l | grep txt
> -rw-------   1 akhade   grad   110338190950531072  Jan 29 17:15  ak.txt
>
> A Reconnection attempt will be made in 00:00:09
> A Reconnection attempt will be made in 00:00:08
> A Reconnection attempt will be made in 00:00:07
> A Reconnection attempt will be made in 00:00:06
> A Reconnection attempt will be made in 00:00:05
> A Reconnection attempt will be made in 00:00:04
> A Reconnection attempt will be made in 00:00:03
> A Reconnection attempt will be made in 00:00:02
> A Reconnection attempt will be made in 00:00:01
> A Reconnection attempt will be made in 00:00:00 ... successful
>
> Retry # 2  Normal File-->  110,338,190,950,531,072  /export/homes20/akhade/DBI/BACKUP/Chris/ak.txt  ** Unsuccessful **
> Retry # 3  Normal File-->  110,338,190,950,531,072  /export/homes20/akhade/DBI/BACKUP/Chris/ak.txt  ** Unsuccessful **

- Allen S. Rout
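For the curious: the raw byte count in the second ls listing really does work out to the 98P the first listing shows, if you convert with binary (1024-based) units and round to nearest. A quick sketch of the conversion (the helper name is mine, and the exact rounding rule varies between ls implementations):

```python
def human_size(n: int) -> str:
    """Render a byte count in binary (1024-based) units, rounded to nearest."""
    units = ["B", "K", "M", "G", "T", "P", "E"]
    x = float(n)
    i = 0
    while x >= 1024 and i < len(units) - 1:
        x /= 1024
        i += 1
    return f"{x:.0f}{units[i]}"

# The byte count from the ls -l listing above:
print(human_size(110_338_190_950_531_072))  # -> 98P
```

So the file claims to be roughly 98 PiB, which is why the client keeps choking on it.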
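On Dan's PS: as far as I know the client's include-exclude list matches on path patterns, not sizes, so there's no direct "exclude files over N bytes" statement. One workaround is to pre-generate EXCLUDE lines from a find scan before the scheduled backup runs. A minimal sketch, assuming GNU find (the target path and the 1 GiB threshold are illustrative):

```shell
#!/bin/sh
# Sketch: emit TSM-style EXCLUDE lines for files above a size threshold.
# Since the include-exclude list matches path patterns rather than sizes,
# the list has to be generated ahead of time.
THRESHOLD=$((1024 * 1024 * 1024))     # 1 GiB, in bytes (illustrative)
TARGET=${1:-.}                        # e.g. /export/homes20

# GNU find's -size +Nc compares the apparent size (st_size) in bytes,
# so a huge sparse file is caught even though it uses little disk.
find "$TARGET" -xdev -type f -size +"${THRESHOLD}c" -printf 'exclude "%p"\n'
```

The output could be appended to whatever inclexcl file dsm.sys points at; regenerating it in a presched command keeps it current.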
