Darxus [EMAIL PROTECTED] writes:
Well... I would say I've had substantial success.
Cygnus gnu win32 gzip/tar was able to extract my 2.6 gigabyte .tar.gz, and
all the filenames seemed to be intact in windows' long filename format.
But when I booted back to Linux, and attempted to copy the files over to
my current home directory,
On Fri, 30 Oct 1998, Hamish Moffatt wrote:
On Thu, Oct 29, 1998 at 05:45:57PM -0800, Joey Hess wrote:
Joey Hess wrote:
* Get and install mtools.deb
* Edit /etc/mtools.conf, set it up so it can access your dos partition, as
say, drive c: (follow the comments, looks easy).
* mtype
I also realized that the mcopy from mtools could be used to copy to
stdout, and that it, in any case, would not stop for a ^Z, so I tried it,
but it had the same problem. Here's the output (which stops extracting at
about 1.2gb):
[EMAIL PROTECTED]:~$ mcopy c:/linux/home.tgz - | tar -ztvf -
I've seen quite a bit going on in this thread. First off,
Darxus [EMAIL PROTECTED] writes:
On Thu, 29 Oct 1998, Laurent PICOULEAU wrote:
I think that dd will have no problem to truncate your file :
dd if=tarfile of=just_a_try bs=1k count=200 would give you a file
named just_a_try containing the beginning of your tarfile but limited
to a
Somewhere in this thread I saw that this might be possible on an
unpartitioned drive. Would it (dd) work on unpartitioned space on your
existing drive? I read SOMEWHERE about lossless partitioning;
apparently you can partition your drive without losing your data.
On Wed, 28 Oct 1998, Darxus wrote:
I installed Windows (see what you made me do??).
[...]
So unlike our beloved Unix utilities, winzip CAN seek past 2gb. BUT it
can't untar and unzip at the same time, and since I don't have over 5.2gb
of fat32 storage space, I don't have enough room to
On Wed, 28 Oct 1998, Kenneth Scharf wrote:
Mind you, it's a bit of a worry we are being outdone by Windows.
[...]
Perhaps it's time for the ext3fs? Hmmm, might look at the source for
ext2fs (and all utilities) for file pointers and change
[EMAIL PROTECTED] wrote:
Darxus writes:
Well, I remembered seeing something somewhere that let me find the inode
number of a file, and found it in ls's man page. Any chance I can use an
inode number to tell dd where to find it ?
Isn't the file on a fat32 filesystem? Seems like this is
Joey Hess wrote:
* Get and install mtools.deb
* Edit /etc/mtools.conf, set it up so it can access your dos partition, as
say, drive c: (follow the comments, looks easy).
* mtype c:/huge.tar.gz | tar
Hm, looking through the mtools docs, you may need to first mcd c: and then
mtype
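Pieced together, the mtools route looks something like the sketch below (untested here; the drive letter and the /dev/hda1 device path are assumptions to adapt to your own partition table):

```shell
# /etc/mtools.conf -- map the DOS partition to a drive letter.
# /dev/hda1 is a placeholder; point it at whatever holds the FAT32 fs.
#   drive c: file="/dev/hda1"

# mtools reads the device directly, so the kernel VFS per-file size
# limit never comes into play:
mcd c:
mtype c:/linux/home.tgz | tar -ztvf -
```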
On Thu, 29 Oct 1998, Joey Hess wrote:
Aha!!! I'll bet John is right - mtools should not go through the VFS layer,
and should be able to read a file this size. It's too bad we don't have an
mcat...
Aha again! We do. Ok, here's what you need to do:
* Get and install mtools.deb
* Edit
Darxus writes:
Well, I started out being really excited. Within seconds I had the index
of my 2.6gb home.tgz scrolling down my screen. But it looks like it will
only extract about 1.2gb this way, and unfortunately, as I now know, the
stuff I need is at the end :/
Something to try with dd:
On Thu, Oct 29, 1998 at 05:45:57PM -0800, Joey Hess wrote:
Joey Hess wrote:
* Get and install mtools.deb
* Edit /etc/mtools.conf, set it up so it can access your dos partition, as
say, drive c: (follow the comments, looks easy).
* mtype c:/huge.tar.gz | tar
Hm, looking through
Darxus wrote:
The command I used is:
mtype c:/linux/home.tgz | gunzip home.tar
This was after I noticed
mtype c:/linux/home.tgz | tar -zxvf - home/darxus
didn't restore everything.
Why did both variants fail? Any errors?
--
see shy jo
Hamish Moffatt wrote:
Under DOS, type is for text files and will terminate reading at the
first ^Z, I would guess; if mtype mimics this, it won't work.
If that's so, it should be pretty easy to hack up an mcat from the mtype
sources that behaves better.
--
see shy jo
On Tue, 27 Oct 1998, Gary L. Hennigan wrote:
[snip]
| You might think that it would sit there chewing on the file for a bit
| before it got to some point beyond what it could deal with. Nope.
| Didn't even start -- failed to even open the file up.
No, compressed files have to be read
David B. Teague [EMAIL PROTECTED] writes:
| On Tue, 27 Oct 1998, Gary L. Hennigan wrote:
|
| [snip]
|
| | You might think that it would sit there chewing on the file for a bit
| | before it got to some point beyond what it could deal with. Nope.
| | Didn't even start -- failed to even open
On 28 Oct 1998, Gary L. Hennigan wrote:
That was an excellent idea, unfortunately Darxus has already tried
this and it didn't work for him. Perhaps gzip tries to read the whole
file and even though, in your case, the file is truncated it'll do
what it can. In Darxus' case that means it's
I installed Windows (see what you made me do??).
I installed WinZip.
I told WinZip to open my 2.6gb home.tgz file... it said okay... it said
this file contains home.tar, do you want me to extract it to a temp dir
and open it?
So unlike our beloved Unix utilities, winzip CAN seek past 2gb. BUT it
Darxus wrote:
I installed Windows (see what you made me do??).
I installed WinZip.
I told WinZip to open my 2.6gb home.tgz file... it said okay... it said
this file contains home.tar, do you want me to extract it to a temp dir
and open it?
So unlike our beloved Unix utilities, winzip CAN seek
On Wed, 28 Oct 1998, Kenneth Scharf wrote:
Does this mean that there is a 2gb limit on any individual file, or
that the ENTIRE file system can't be larger than 2gb? If the latter,
then does that mean that a linux disk partition can't be
On Tue, 27 Oct 1998, Darxus wrote:
On 27 Oct 1998, Gary L. Hennigan wrote:
| I felt like checking. Oops. When I reinstalled and tried to restore it,
| I found out that gzip can't seek to the end of the file (dies around
| 2gb?).
[...]
| You can force gzip to handle it as a stream.
On Wed, 28 Oct 1998, Raymond A. Ingles wrote:
On Wed, 28 Oct 1998, Kenneth Scharf wrote:
Does this mean that there is a 2gb limit on any individual file, or
that the ENTIRE file system can't be larger than 2gb? If the latter,
then
On Thu, Oct 29, 1998 at 12:39:59AM -0600, Mark Panzer wrote:
Darxus wrote:
I installed Windows (see what you made me do??).
I installed WinZip.
I told WinZip to open my 2.6gb home.tgz file... it said okay... it said
this file contains home.tar, you want me to extract it to a temp dir
On Wed, Oct 28, 1998 at 02:57:53PM -0500, Raymond A. Ingles wrote:
There *is* support for 2GB files somewhere, but I think you'll have to
do some web searches or hit the [EMAIL PROTECTED] mailing list
for info on where and how. (Or, as has also been suggested, find a 64-bit
machine. :-/ )
I
In article [EMAIL PROTECTED] you write:
That was an excellent idea, unfortunately Darxus has already tried
this and it didn't work for him. Perhaps gzip tries to read the whole
file and even though, in your case, the file is truncated it'll do
what it can. In Darxus' case that means it's trying to
From: Mark Panzer [EMAIL PROTECTED]
I'm guessing Win95 can go out to 4GB (2^32), so maybe if you tried to
extract this archive in Win it still wouldn't work. (BTW: how did this
file get created? The whole thing would have to be over 4GB uncompressed,
right?) Well, here's what you can assume: 1.
Darxus [EMAIL PROTECTED] writes:
| On 28 Oct 1998, Gary L. Hennigan wrote:
|
| That was an excellent idea, unfortunately Darxus has already tried
| this and it didn't work for him. Perhaps gzip tries to read the whole
| file and even though, in your case, the file is truncated it'll do
|
On 29 Oct 1998, Gary L. Hennigan wrote:
A fellow user, via personal email, suggested you might try the Unix
head utility. Something like:
head --bytes 1900m | gzip -d -c | tar tf -
Unfortunately that has the same result -- nothing.
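For what it's worth, head --bytes does work on a gzipped tar once it gets an input (the quoted command, as shown, has none). A toy-scale sketch with made-up names, 10k standing in for the 1900m above:

```shell
# head|gunzip|tar pipeline at small scale; h/ and h.tgz are made-up
# names, and 10k stands in for the 1900m used against the real archive.
mkdir -p h && printf 'x' > h/f
tar czf h.tgz h
head --bytes 10k h.tgz | gzip -d -c | tar tf -
```

On a file cut short of a full member, tar lists what it can and then reports an unexpected EOF on stderr.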
Darxus writes:
I would not be able to use dd to do a raw copy to that hard drive,
because of the 2gb limit.
Why do you think you can't use dd? It just deals in blocks, and as long as
you are using raw devices the VFS never gets involved.
--
John Hasler
[EMAIL PROTECTED] (John Hasler)
Dancing
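John's point is easy to see at toy scale: dd addresses blocks by offset and never asks the filesystem for a total size. A sketch with an ordinary file standing in for a raw device node (the names are made up):

```shell
# dd reads bs-sized blocks, skipping 'skip' of them first; nothing here
# depends on the filesystem knowing the input's length, which is why
# the same call works against a raw device node like /dev/hda.
printf 'abcdefgh' > blockdev.img
dd if=blockdev.img of=chunk.img bs=1 skip=2 count=4 2>/dev/null
cat chunk.img   # -> cdef
```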
On 29 Oct 1998 [EMAIL PROTECTED] wrote:
Darxus writes:
I would not be able to use dd to do a raw copy to that hard drive,
because of the 2gb limit.
Why do you think you can't use dd? It just deals in blocks, and as long as
you are using raw devices the VFS never gets involved.
[EMAIL
On Wed, Oct 28, 1998 at 10:13:59PM -0500, Darxus wrote:
On 28 Oct 1998, Gary L. Hennigan wrote:
That was an excellent idea, unfortunately Darxus has already tried
this and it didn't work for him. Perhaps gzip tries to read the whole
file and even though, in your case, the file is
On Thu, 29 Oct 1998, Laurent PICOULEAU wrote:
I think that dd will have no problem to truncate your file :
dd if=tarfile of=just_a_try bs=1k count=200 would give you a file
named just_a_try containing the beginning of your tarfile but limited
to a size of slightly less than 2GB. Just
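At toy scale the truncation behaves as Laurent describes: members that fit entirely before the cut stay listable. A sketch with made-up names, two 512-byte blocks standing in for the sub-2GB cut:

```shell
# Build a two-member tar, keep only the first two 512-byte blocks
# (a.txt's header and its padded data), and list what survives.
mkdir -p demo
printf 'hello' > demo/a.txt
printf 'world' > demo/b.txt
tar cf whole.tar demo/a.txt demo/b.txt
dd if=whole.tar of=just_a_try bs=512 count=2 2>/dev/null
tar tf just_a_try 2>/dev/null || true   # demo/a.txt; b.txt is past the cut
```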
Darxus [EMAIL PROTECTED] writes:
| On 29 Oct 1998, Gary L. Hennigan wrote:
|
| A fellow user, via personal email, suggested you might try the Unix
| head utility. Something like:
|
| head --bytes 1900m |gzip -d -c|tar tf -
|
| Unfortunately that has the same result -- nothing.
|
Darxus [EMAIL PROTECTED] writes:
| On Thu, 29 Oct 1998, Laurent PICOULEAU wrote:
|
| I think that dd will have no problem to truncate your file :
| dd if=tarfile of=just_a_try bs=1k count=200 would give you a file
| named just_a_try containing the beginning of your tarfile but limited
|
Date: 29 Oct 1998 14:50:17 -0700
Okay, I wrote this code based on a suggestion from the Kernel mailing
list. Give it a shot and see what happens. You run it just like you
would using cat, e.g.,
readbytes file.tgz | gunzip -c | tar tf -
The suggestion was that perhaps open() wasn't
On Thu, Oct 29, 1998 at 05:26:19PM -0500, Darxus wrote:
Well, I remembered seeing something somewhere that let me find the inode
number of a file, and found it in ls's man page. Any chance I can use an
inode number to tell dd where to find it ?
You can use debugfs to get the inode and
I wrote:
Why do you think you can't use dd? It just deals in blocks, and as long as
you are using raw devices the VFS never gets involved.
^^^
Darxus writes:
[EMAIL PROTECTED]:/mnt/c/linux dd if=home.tgz of=/dev/null
0+0 records in
0+0 records out
That was my 2.6gb
Darxus writes:
Well, I remembered seeing something somewhere that let me find the inode
number of a file, and found it in ls's man page. Any chance I can use an
inode number to tell dd where to find it ?
Isn't the file on a fat32 filesystem? Seems like this is the place for DOS
filesystem
On 27 Oct 1998, Gary L. Hennigan wrote:
Yikes!
Yup :)
I missed your original post. That's what I get for replying to you via
someone else's reply and not reading the subject closely enough. Duh?
Where on earth did you store this file? I could've sworn the ext2fs
had a 2GB/file limit on
Quoting Darxus ([EMAIL PROTECTED]):
Any chance I can split this thing into like, 2 pieces, and be able to
access half of it ?
First half--no problem; second half--no go. Did you try
< filname tar zxf -
Mike Stone
Hi,
They are both lovely suggestions, unfortunately the problem is a bit more
substantial. The 1st thing I tried was tar -zxvf home.tgz, and a couple
of the things I tried soon after that were cat and less. Neither of which
read any of it -- less is the only thing that did anything useful,
Darxus wrote:
Where on earth did you store this file? I could've sworn the ext2fs
had a 2GB/file limit on it? Certainly all the file utilities do. It
fat32 filesystem.
[snip]
Well, I think more than that, the assignment of the pointer to the file
probably failed, before it even got to
On Wed, Oct 28, 1998 at 12:53:53AM -0800, Keith Beattie wrote:
Darxus wrote:
Where on earth did you store this file? I could've sworn the ext2fs
had a 2GB/file limit on it? Certainly all the file utilities do. It
fat32 filesystem.
Hang on a second here, if the file exceeds the
Hi!
Darxus ([EMAIL PROTECTED]):
| Any more ideas ? :)
Only thing I can think of is getting access to a 64-bit machine,
decompressing the file there, tarring the contents off to tape and
then restoring them on your machine. Or at least putting them into
sub-2GB chunks before taking
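The sub-2GB-chunks idea is scriptable at backup time with split; a sketch with made-up names, 1k parts standing in for something like 1900m:

```shell
# Stream the backup through split so no single file crosses the limit.
mkdir -p home2
head -c 4096 /dev/urandom > home2/data   # random so gzip can't shrink it
tar czf - home2 | split -b 1k - home.tgz.part.
# Restore by concatenating in name order (split's suffixes sort):
cat home.tgz.part.* | tar tzf -
```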
Perhaps it's still ON the fat32 file system.
Mind you, it's a bit of a worry we are being outdone by Windows.
A 2Gb limit is almost unreasonable these days; I have some 400-500Mb
MPEG video files here, so a 2Gb one isn't out of the question,
especially with DVD.
Before I wiped redhat to install debian several weeks ago, I backed up my
/etc, /root, and /home directories to .tar.gz's on another drive.
Unfortunately I was overconfident that my syntax was correct, and that
there were no other problems that could occur -- I think I verified that I
could
On 27 Oct 1998, Gary L. Hennigan wrote:
| I felt like checking. Oops. When I reinstalled and tried to restore it, I
| found out that gzip can't seek to the end of the file (dies around 2gb?).
|
| You can force gzip to handle it as a stream. Try something like:
|
| cat tarfile.tgz |
Darxus [EMAIL PROTECTED] writes:
| On 27 Oct 1998, Gary L. Hennigan wrote:
|
| | I felt like checking. Oops. When I reinstalled tried to
| | restore it, I
| | found out that gzip can't seek to the end of the file (dies
| | around 2gb?).
| |
| | You can force gzip to handle it as