Hi, I could use some help in a somewhat unusual situation: A fellow PhD student is doing work time measurements with a proprietary digital video recorder based on Linux. His 120 GB disk is approaching full capacity, and our only backup solution is the department server with its DAT DDS-4 tape drive (20 GB native), running Amanda for standard backups, of course. Without a backup we have to process the data and delete it.
I have not 'rooted' the box yet, but after inserting a graphics card I can see that SuSE Linux 6.0 is used as the OS and that the large disk has no file system or partition table, so it is probably accessed as a raw device (/dev/hdb). That is a clever way to work around the missing large-file handling in the 2.0.36 kernel. It is fine as long as everything is done through the proprietary TV monitor interface, which has only local backup capabilities. But I don't know the data format, so for remote backups I need to split the raw device into chunks of 19 GB and _reliably_ join them back together. Incrementals are out of the question.

I can think of several ways to do the backup, but I have to get the restore right the first time. Has someone actually _done_ something similar? Two ways I can think of to split my backup:

    dd if=/dev/hdb of=/dev/rmt bs=1M skip=... count=...
    tar -Mcvf /dev/rmt /dev/hdb

Thanks for proven recipes,

Johannes Nieß

P.S.: I don't have a second 120 GB disk or equivalent disk storage space easily available. It's an unwieldy amount of data to handle :-(

P.P.S.: The vendor support has not answered my emails.
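For what it's worth, here is a minimal sketch of the dd-based approach: copy the raw device in fixed-size chunks (skip= on backup, seek= with conv=notrunc on restore), so each chunk fits on one tape and the restore is deterministic. The device path, chunk size, and function names are my assumptions, not anything the recorder dictates; verify each chunk with a checksum before deleting anything.

```shell
#!/bin/sh
# Sketch only: split a raw device into fixed-size chunks with dd and
# reassemble them in order. DEV and CHUNK_MB are placeholder assumptions.

DEV=${DEV:-/dev/hdb}        # source raw device (assumed path)
CHUNK_MB=${CHUNK_MB:-19456} # ~19 GB per tape, counted in 1 MB blocks

# split_chunk N OUTFILE: copy chunk number N of $DEV to OUTFILE
# (OUTFILE could just as well be the tape device, e.g. /dev/rmt).
split_chunk() {
    n=$1; out=$2
    dd if="$DEV" of="$out" bs=1M skip=$((n * CHUNK_MB)) count=$CHUNK_MB
}

# restore_chunk N INFILE: write INFILE back into $DEV at chunk N.
# conv=notrunc is essential so earlier chunks are not truncated away.
restore_chunk() {
    n=$1; in=$2
    dd if="$in" of="$DEV" bs=1M seek=$((n * CHUNK_MB)) conv=notrunc
}
```

Usage would be one split_chunk per tape (n = 0, 1, 2, ...), recording an md5sum of each chunk, and the same numbering on restore. The last chunk simply comes out short, which dd handles without complaint.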
