Re: can't zip large files > 2gb

2007-05-04 Thread David Banning
 What version of gzip are you using? From www.gzip.org:
 
 gzip 1.2.4 may crash when an input file name is too long (over
 1020 characters). 

not in my case.

 The buffer overflow may be exploited if gzip is
 run by a server such as an ftp server. Some ftp servers allow
 compression and decompression on the fly and are thus vulnerable.

I'm not running it via ftp.  

 This patch (http://www.gzip.org/gzip-1.2.4b.patch) to gzip 1.2.4
 fixes the problem. The beta version 1.3.3
 (http://www.gzip.org/gzip-1.3.3.tar.gz) already includes a
 sufficient patch; use this version if you have to handle files
 larger than 2 GB. 

Tried that version, but still I have the same problem.

 A new official version of gzip will be released soon. Also, check
 if you have compiled gzip with 64-bit I/O. Check the gzip FAQ
 (http://www.gzip.org/#faq10) too. HTH  

I used their suggestion for unzipping; 

gunzip < file.gz > file
 
which actually completes (citing a CRC error), 
but gives a file size 2 bytes greater than the original when unzipped.


Re: can't zip large files > 2gb

2007-05-04 Thread dex

Check for root kits?

Try the same operation on a known working system, take that output
file and do a diff with that and the corrupt one after a 'strings', so
'strings new.gz > new-text', 'strings corrupt.gz > corrupt-text',
'diff new-text corrupt-text'.  I'm just interested in how it's being
corrupted and maybe the strings output will tell you something.
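
Spelled out as a sketch (archive names are placeholders; this assumes
you have a known-good archive of the same source data to compare against):

# dump printable strings from each archive, then diff the dumps
strings new.gz     > new-text
strings corrupt.gz > corrupt-text
diff new-text corrupt-text | head -20   # the first divergences are enough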

Sorry if this was specified before, but did this just start happening
or is this the first time you've tried to gzip large files on this
system?


Re: can't zip large files > 2gb

2007-05-04 Thread David Banning
 Try the same operation on a known working system, take that output
 file and do a diff with that and the corrupt one after a 'strings', so
 'strings new.gz > new-text', 'strings corrupt.gz > corrupt-text',
 'diff new-text corrupt-text'.  I'm just interested in how it's being
 corrupted and maybe the strings output will tell you something.

I don't have a separate system, but I tried the strings output
of the tar before compression and the strings output of the tar
-after- compression and uncompression - as I mentioned, the size
difference is only two bytes.

The result was that memory was exhausted on attempting a diff
of the two files, but there was around a 1 meg difference between
the two 1.5G ascii files.


 Sorry if this was specified before, but did this just start happening
 or is this the first time you've tried to gzip large files on this
 system?

first time I have tried files of this size - but I get the same problem
no matter what compression utility I use; tried gzip, bzip2, rzip
and compress.


Re: can't zip large files > 2gb

2007-05-04 Thread David Banning
 available. At least some of your utilities in /usr/bin are statically
 linked.
 
 In 5.5 the statically linked utilities are in /stand, while dynamically
 linked versions of the same are in their normal places.
OK.

 -r-xr-xr-x  30 root  wheel  2046148 Nov  4  2004 /stand/gzip
 
 I would expect your gzip to be about the same 2M in size.
Yes. It is.

 
 % ldd /stand/gzip /usr/bin/gzip
 ldd: /stand/gzip: not a dynamic executable
 /usr/bin/gzip:
 libc.so.5 => /lib/libc.so.5 (0x2808)

 If you have sources I would try replacing those utilities. If that works
 then we have exonerated your hardware, but now all your software is
 suspect. 

I have upgraded my system to FreeBSD 4.9; the libc.so.5 in your case is
libc.so.4 in my case, but it looks like it has been rebuilt in the upgrade.

I have downloaded and compiled the new 1.3.3 beta gzip from the gzip
website, and built it with no difference in my problem resulting.
I have uninstalled and re-installed bzip2 - also with the same erroneous
results.

What seems strange is that the failure is not a massive failure;
the gzipped and then gunzipped file is only 2 bytes different
on a 3G file. I am wondering now if something could be amiss in
my BIOS - any thoughts here?


Re: can't zip large files > 2gb

2007-05-04 Thread David Kelly
On Fri, May 04, 2007 at 01:42:59PM -0400, David Banning wrote:
 
 What seems strange is that the failure is not a massive failure;
 the gzipped and then gunzipped file is only 2 bytes different
 on a 3G file. I am wondering now if something could be amiss in
 my BIOS - any thoughts here?

FreeBSD should be BIOS agnostic. Once booted the BIOS is out of the
picture.

Where do the original and output differ? Use cmp(1) and it should list
the offset where the first difference occurs:

% cmp aes_ctr.c aes_ctr2.c
aes_ctr.c aes_ctr2.c differ: char 1065, line 50

Probably won't list a line number. What would be telling is if the
problem occurs on a nice boundary like 2G. Sadly that would be typical
of Linux as FreeBSD has not had a 2G file size limit, ever. At least not
since 2.0.0. It has had cases where signed 32 bit values were used that
caused problems. To this day MacOS X's ftp client doesn't understand
file transfers over 2G. Transfers them just fine but gets confused as to
how much it has downloaded and how much is remaining.
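
A hedged sketch of that check (file names are placeholders): cmp's -l
flag lists every differing byte, so a 2-byte mismatch produces just two
lines, and the offsets show whether the damage sits on a boundary.

# -l prints one line per differing byte: decimal offset (1-based),
# then the two differing byte values in octal
cmp -l original.tar restored.tar
# 2 GB boundary for reference: 2^31 = 2147483648 bytes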

Am thinking I know a 4.x machine that can be fired up with little more
effort than to attach a KVM and flip the switch. Will see in about 6 or
8 hours.

-- 
David Kelly N4HHE, [EMAIL PROTECTED]

Whom computers would destroy, they must first drive mad.


Re: can't zip large files > 2gb

2007-05-04 Thread David Kelly
On Fri, May 04, 2007 at 01:33:09PM -0400, David Banning wrote:
 
 first time I have tried files of this size - but I get the same
 problem no matter what compression utility I use; tried gzip, bzip2,
 rzip and compress.

Which is why I suspected a corrupted library and asked that ldd(1) be
run on his utilities. We found they were statically linked, in
/usr/bin/. That sounds fishy.

As for running out of ram, if the whole system was built statically
linked then such would not be surprising.

-- 
David Kelly N4HHE, [EMAIL PROTECTED]

Whom computers would destroy, they must first drive mad.


Re: can't zip large files > 2gb

2007-05-04 Thread CyberLeo Kitsana
David Banning wrote:
 I am attempting to zip large files that are 2GB - 3GB.

If you wish to exclude the disks completely, you can try generating a
large file on a known-good host and running md5 on it, then reading it
via NFS on the suspect machine, piping through gzip and gunzip, and
right into md5, then comparing the numbers.

cat largefile | gzip | gunzip | md5

In this way, the file never even touches the disk, and that subsystem
can be ruled out entirely.
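
As a concrete sketch (host and path names are made up; md5(1) is the
stock FreeBSD digest tool):

# on the known-good NFS server, record the original's digest:
md5 /export/test/largefile

# on the suspect machine, read the same file over NFS - the local
# disks are never involved:
cat /mnt/test/largefile | gzip | gunzip | md5
# matching digests clear the local disk subsystem; a mismatch points
# at RAM, CPU, or the NFS transport instead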

On a side note, does anybody know if those lossless codecs use the FPU?

--
Fuzzy love,
-CyberLeo
Technical Administrator
CyberLeo.Net Webhosting
http://www.CyberLeo.Net
[EMAIL PROTECTED]

Furry Peace! - http://www.fur.com/peace/


Re: can't zip large files > 2gb

2007-05-03 Thread David Banning
I have replaced my memory. It didn't make any difference.

root# gunzip *ian_mail*
gunzip: 3s1.com-ian_mail-full-20070503-0105AM.1.tgz: 
invalid compressed data--format violated
root# 

and another way;

root# tar tzf *ian_mail*

lists most files in the tgz, then terminates with;

...
tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers

gzip: stdin: invalid compressed data--format violated
tar: Child returned status 1
tar: Error exit delayed from previous errors
root#




Re: can't zip large files > 2gb

2007-05-03 Thread Kris Kennaway
On Thu, May 03, 2007 at 01:48:43AM -0400, David Banning wrote:
 I have replaced my memory. It didn't make any difference.
 
 root# gunzip *ian_mail*
 gunzip: 3s1.com-ian_mail-full-20070503-0105AM.1.tgz: 
 invalid compressed data--format violated
 root# 
 
 and another way;
 
 root# tar tzf *ian_mail*
 
 lists most files in the tgz, then terminates with;
 
 ...
 tar: Skipping to next header
 tar: Archive contains obsolescent base-64 headers
 
 gzip: stdin: invalid compressed data--format violated
 tar: Child returned status 1
 tar: Error exit delayed from previous errors
 root#

Unless you can demonstrate some other systematic effect (e.g. always
truncated at the same size), it looks like you have some other kind of
failing hardware that is silently corrupting the data during writing
or reading from disk.

Kris




Re: can't zip large files > 2gb

2007-05-03 Thread David Kelly


On May 2, 2007, at 11:41 PM, David Banning wrote:

  I haven't been paying 100% attention. Just how does it fail? What
  do you mean by corrupt?

  Does the process run to completion?

 All programs zip with no errors. On reading;

 root# bzip2 -t zippedfile.bz2
 bzip2: 3s1.com-smartstage_ftp-full-20070502-0125AM.1b.tar.bz2:
 data integrity (CRC) error in data


Can't keep from thinking somehow your hardware is broken, because
myself and others have been gzipping, bzipping, and zipping large
files for a long time under FreeBSD without problems. BUT on 6.0 I had
intermittent problems trying to download files over 4 GB with
FreeBSD. But seemingly only with old files, more than a day old.

A workaround was to cat file > /dev/null from a shell login about the
same time as I started the ftp download. This clue helped eventually
find the real problem in the FreeBSD kernel.

Just for kicks, try cat file > /dev/null while the compression
process is running on the same file. This might help keep your source
file in cache while the compression process runs.

Apparently you have a spare original copy of the data laying around,
but another thing to try is gzip -c file > file.gz, which does not
destroy the original.


--
David Kelly N4HHE, [EMAIL PROTECTED]

Whom computers would destroy, they must first drive mad.



Re: can't zip large files > 2gb

2007-05-03 Thread DAve

David Banning wrote:

 I have replaced my memory. It didn't make any difference.

 root# gunzip *ian_mail*
 gunzip: 3s1.com-ian_mail-full-20070503-0105AM.1.tgz:
 invalid compressed data--format violated
 root#

 and another way;

 root# tar tzf *ian_mail*

 lists most files in the tgz, then terminates with;

 ...
 tar: Skipping to next header
 tar: Archive contains obsolescent base-64 headers

 gzip: stdin: invalid compressed data--format violated
 tar: Child returned status 1
 tar: Error exit delayed from previous errors
 root#


Just a thought, and maybe nothing at all. Your example shows you are
using tar. What happens if:

1) You only archive with tar, and then read the tar file back - is it
successful? (no compression)


2) You only use gzip - cat a few dozen log files together, for example,
to get the same file size. Can gunzip uncompress them?


3) You state in your earlier emails you tried gzip from ports. Were your
ports updated recently? FreeBSD 4 is no longer supported, so possibly your
new install of gzip is not working properly. If so, you might try
deinstalling gzip, installing the ports system from your install CDs, and
reinstalling gzip. (I have seen many users drop into
/usr/ports/something/something, then enter make;make install;make clean,
and never notice a warning or issue fly by on the monitor during the build.)


Just thinking out loud - try tar and gunzip separately, as in the sketch
below, and see if the problem persists.
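
A minimal way to run that split test (paths are placeholders; tar tf
and gzip -t only read and verify, so nothing is overwritten):

# step 1: tar alone, no compression involved
tar cf /tusr/test.tar /usr/src
tar tf /tusr/test.tar > /dev/null && echo "tar alone: OK"

# step 2: gzip alone, no tar involved - build a comparably large flat file
cat /var/log/* > /tusr/big.dat
gzip -c /tusr/big.dat > /tusr/big.dat.gz
gzip -t /tusr/big.dat.gz && echo "gzip alone: OK"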


Still think maybe a hardware issue.

DAve


--
Three years now I've asked Google why they don't have a
logo change for Memorial Day. Why do they choose to do logos
for other non-international holidays, but nothing for
Veterans?

Maybe they forgot who made that choice possible.


Re: can't zip large files > 2gb

2007-05-03 Thread David Kelly
Correction:

On Thu, May 03, 2007 at 07:10:37AM -0500, David Kelly wrote:
 
 Can't keep from thinking somehow your hardware is broken, because
 myself and others have been gzipping, bzipping, and zipping large
 files for a long time under FreeBSD without problems. BUT on 6.0 I
 had intermittent problems trying to download files over 4 GB with
 FreeBSD. But seemingly only with old files, more than a day old.

over 4 GB *from* FreeBSD. ftpd was having problems reading the large
file, sometimes.

-- 
David Kelly N4HHE, [EMAIL PROTECTED]

Whom computers would destroy, they must first drive mad.


Re: can't zip large files > 2gb

2007-05-03 Thread David Banning
  
  gzip: stdin: invalid compressed data--format violated
  tar: Child returned status 1
  tar: Error exit delayed from previous errors
  root#
 
 Unless you can demonstrate some other systematic effect (e.g. always
 truncated at the same size), it looks like you have some other kind of
 failing hardware that is silently corrupting the data during writing
 or reading from disk.

You are probably right. I am going to upgrade to 6.x and see
if that helps - it may not but I have to upgrade one day anyway.
It will be interesting to see if the problem follows me.

Thanks for your input -


Re: can't zip large files > 2gb

2007-05-03 Thread DAve

David Banning wrote:

  gzip: stdin: invalid compressed data--format violated
  tar: Child returned status 1
  tar: Error exit delayed from previous errors
  root#

  Unless you can demonstrate some other systematic effect (e.g. always
  truncated at the same size), it looks like you have some other kind of
  failing hardware that is silently corrupting the data during writing
  or reading from disk.

 You are probably right. I am going to upgrade to 6.x and see
 if that helps - it may not but I have to upgrade one day anyway.
 It will be interesting to see if the problem follows me.



I would think that over first. Unless the problem you have is noted as 
fixed in a later version, an upgrade for upgrade's sake is not the right 
course of action.


If you upgrade and the problem persists, you won't know what the problem 
was. So far I haven't heard anyone who thinks it is a problem with the 
distribution.


If you upgrade and the problem goes away, you still don't know what the 
problem was. If the problem returns a week, a month, a year down the 
road you are right back where you are now.


Just my two cents worth.

DAve


--
Three years now I've asked Google why they don't have a
logo change for Memorial Day. Why do they choose to do logos
for other non-international holidays, but nothing for
Veterans?

Maybe they forgot who made that choice possible.


Re: can't zip large files > 2gb

2007-05-02 Thread Giorgos Keramidas
On 2007-05-01 15:58, David Banning [EMAIL PROTECTED] wrote:
 I am attempting to zip large files that are 2GB - 3GB.

 uname -a;

 FreeBSD 3s1.com 4.11-STABLE FreeBSD 4.11-STABLE #7

 I have tried gzip, bzip2 from the ports and rzip.

 All give no errors on zipping, but will not unzip, citing CRC
 errors.

 Is there a maximum file size for zipping? Is my system too old?
 Maybe a file or library that all zip programs depend on that is
 corrupt?

A lot of the features related to file sizes and other attributes
of the files stored on a disk depend highly on the type of file
system used on the disk.

What file system does the destination directory live in?

- Giorgos



Re: can't zip large files > 2gb

2007-05-02 Thread David Banning
 Maybe you have defective RAM in the upper memory area.
 Try running Memtest86 to see if you have some bad memory.

You may have something here. I don't have a floppy on this machine,
and I can't shut down my server to test the memory but I may shut
it down long enough to swap the memory chips so I can test them in
another machine.


Re: can't zip large files > 2gb

2007-05-02 Thread David Banning
 A lot of the features related to file sizes and other attributes
 of the files stored on a disk depend highly on the type of file
 system used on the disk.
 
 What file system does the destination directory live in?

Originally my problem was with a dedicated IDE drive (on an IDE cable
in the machine), mounted as a secondary drive - 300G.

I tried it in /usr with the same results.


Re: can't zip large files > 2gb

2007-05-02 Thread Giorgos Keramidas
On 2007-05-02 12:26, David Banning [EMAIL PROTECTED] wrote:
Giorgos Keramidas wrote:
 A lot of the features related to file sizes and other attributes of
 the files stored on a disk depend highly on the type of file system
 used on the disk.
 
 What file system does the destination directory live in?
 
 Originally my problem was with a dedicated IDE drive (on an IDE cable
 in the machine), mounted as a secondary drive - 300G.

 I tried it in /usr with the same results.

The disk type isn't really what I asked about.  Is your /usr file system
mounted from UFS?  (I haven't kept all the messages of the thread, so I
don't remember from the df output; please excuse my short memory if
I'm repeating a question already answered.)



Re: can't zip large files > 2gb

2007-05-02 Thread David Banning
  Originally my problem was with a dedicated IDE drive (on an IDE cable
  in the machine), mounted as a secondary drive - 300G.

  I tried it in /usr with the same results.
 
 The disk type isn't really what I asked about.  Is your /usr file system
 mounted from UFS?  (I haven't kept all the messages of the thread, so I
 don't remember from the df output; please excuse my short memory if
 I'm repeating a question already answered.)

I was actually stabbing at the answer there - yes, both file systems
tried are UFS, each is on a separate drive, both have plenty of space,
and I have done an error-free fsck on one of those drives; the other
is mounted and running so I have not tried fsck. 

Here is a summary;

original 3G tar file; untars fine
gzip; corrupts
bzip2; corrupts
compress; corrupts
rzip; corrupts

I realize this looks like it may be memory, but running top I notice
that the archivers use very little memory, between 1 and 10 meg while
running, though they do keep the processor fairly busy.

There is one thing on my mind - I only have 512M in my machine. I
installed another 512M to make it 1G and the machine crashed once
per week; the new memory card is what I concluded was the problem.

I took out the memory card, concluding that it was the new memory I
installed and then deinstalled that was problematic. Just so we're
clear - all of my zip problems have occurred while running on my
original, problem-free 512M of memory.

Now I'm thinking of another possibility - could it be that installing
the -new- memory caused the machine to reorganize how the -old- memory
was used - exposing a problem in the original memory that the machine
didn't use that often before?

Hope you followed that -



Re: can't zip large files > 2gb

2007-05-02 Thread David Kelly
On Wed, May 02, 2007 at 02:08:16PM -0400, David Banning wrote:

 Here is a summary;
 
 original 3G tar file; untars fine
 gzip; corrupts
 bzip2; corrupts
 compress; corrupts
 rzip; corrupts

I haven't been paying 100% attention. Just how does it fail? What do you
mean by corrupt?

Does the process run to completion?

Are the output zip files reasonable in size?

Are the expanded files reasonable in size? If so where does the mismatch
start?

Is the problem always in the same place for the same input file?

-- 
David Kelly N4HHE, [EMAIL PROTECTED]

Whom computers would destroy, they must first drive mad.


Re: can't zip large files > 2gb

2007-05-02 Thread David Banning
 I haven't been paying 100% attention. Just how does it fail? What do you
 mean by corrupt?
 
 Does the process run to completion?

All programs zip with no errors. On reading;

root# bzip2 -t zippedfile.bz2
bzip2: 3s1.com-smartstage_ftp-full-20070502-0125AM.1b.tar.bz2: 
data integrity (CRC) error in data

You can use the `bzip2recover' program to attempt to recover
data from undamaged sections of corrupted files.

- gzip also cites a CRC error
- can't remember rzip's error (it does output an error)
- uncompress runs without echoing an error, but the expanded tar
  file is not able to be untarred.

 Are the output zip files reasonable in size?

The zipped file size seems reasonable in each case.

 Are the expanded files reasonable in size? 

expanding will not complete, except with uncompress, which expands the
file to the original size plus 6 bytes; the expanded tar file is then
unreadable.

 If so where does the mismatch
 start?

on expanding, it seems the error happens near or at the end of
the expanding process before halting and exiting with an error -
that is, if I attempt to read the file with tar -tzf filename.tgz
in gzip's case or tar -tyf filename.bz2 in bzip2's case.

 
 Is the problem always in the same place for the same input file?

Pretty much, but I can't say if it is exactly the same in each case.

I am going to attempt swapping memory and see if the error continues.
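
One more cheap check before swapping (file names are placeholders):
compress the same input twice and compare the two outputs. gzip is
deterministic for identical input, so any difference between the runs
points at flaky hardware rather than at the compressor.

# compress the identical source twice
gzip -c bigfile.tar > /tusr/a.gz
gzip -c bigfile.tar > /tusr/b.gz
# identical input should give byte-identical output
cmp /tusr/a.gz /tusr/b.gz && echo "runs match"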


Re: can't zip large files > 2gb

2007-05-02 Thread Igor B. Bykhalo
Hello David,

Wednesday, May 2, 2007, 8:46:56 AM, you wrote:

 On Tue, May 01, 2007 at 11:53:55PM -0400, Kris Kennaway wrote:
 On Tue, May 01, 2007 at 11:22:28PM -0400, David Banning wrote:
  Another piece of info - I just compiled rzip and it seems I
  have the same problem there! There must be something in common
  that these programs are using...
 
 Is your filesystem full? :)

 Not at all;

Just guessing:

 Filesystem                 1K-blocks     Used     Avail Capacity  Mounted on
 /dev/ad0s1a                   503966   110728    352922    24%    /
 /dev/ad0s1f                   257998    29528    207832    12%    /tmp
                               ^^^^^^ Your /tmp is about 250 MB
 /dev/ad0s1g                 75407576 51862570  17512400    75%    /usr
 /dev/ad0s1e                   503966   260560    203090    56%    /var
                               ^^^^^^ Your /var (with /var/tmp) is about 500 MB

Can't it be that zip just doesn't have enough space for temporary storage?


 procfs                             4        4         0   100%    /proc
 linprocfs                          4        4         0   100%    /usr/compat/linux/proc
 70.52.121.240:/usr/backup   75331512 15213578  54091414    22%    /usr/optex
 /dev/ad1s1e                307684276 73248808 209820726    26%    /tusr



-- 
Best regards,
 Igor B. Bykhalo                        mailto:[EMAIL PROTECTED]



Re: can't zip large files > 2gb

2007-05-02 Thread David Banning
 Can't it be that zip just doesn't have enough space for temporary storage?

Hi Igor. Thanks for the input. While gzipping and gunzipping I
watched those directories and they don't change. The new file
is being created in the target directory - when it completes,
it deletes the old file in the same directory.


can't zip large files > 2gb

2007-05-01 Thread David Banning
I am attempting to zip large files that are 2GB - 3GB.

uname -a;

FreeBSD 3s1.com 4.11-STABLE FreeBSD 4.11-STABLE #7

I have tried gzip, bzip2 from the ports and rzip.

All give no errors on zipping, but will not unzip, citing
CRC errors.

Is there a maximum file size for zipping? Is my system too old?
Maybe a file or library that all zip programs depend on that is
corrupt?



Re: can't zip large files > 2gb

2007-05-01 Thread Kris Kennaway
On Tue, May 01, 2007 at 03:58:26PM -0400, David Banning wrote:
 I am attempting to zip large files that are 2GB - 3GB.
 
 uname -a;
 
 FreeBSD 3s1.com 4.11-STABLE FreeBSD 4.11-STABLE #7
 
 I have tried gzip, bzip2 from the ports and rzip.

OK, none of those are zip though :) They're completely different
algorithms.

 All give no errors on zipping, but will not unzip, citing
 CRC errors.
 
 Is there a maximum file size for zipping? Is my system too old?
 Maybe a file or library that all zip programs depend on that is
 corrupt?

Quite possibly your system is too old; it works on modern versions of
FreeBSD, at least with gzip and bzip2.

Kris




Re: can't zip large files > 2gb

2007-05-01 Thread Martin Tournoij
On Tue 01 May 2007 16:05, Kris Kennaway wrote:
 On Tue, May 01, 2007 at 03:58:26PM -0400, David Banning wrote:
  I am attempting to zip large files that are 2GB - 3GB.
  
  uname -a;
  
  FreeBSD 3s1.com 4.11-STABLE FreeBSD 4.11-STABLE #7
  
  I have tried gzip, bzip2 from the ports and rzip.
 
 OK, none of those are zip though :) They're completely different
 algorithms.
 
  All give no errors on zipping, but will not unzip, citing
  CRC errors.
  
  Is there a maximum file size for zipping? Is my system too old?
  Maybe a file or library that all zip programs depend on that is
  corrupt?
 
 Quite possibly your system is too old; it works on modern versions of
 FreeBSD, at least with gzip and bzip2.
 
 Kris

I can confirm that compressing large files works without problems on
FreeBSD 6.
A while ago I tested different archivers, and used gzip, bzip2, and
7-zip to compress and decompress some large files (3 to 4 GB).

Martin


Re: can't zip large files > 2gb

2007-05-01 Thread DAve

David Banning wrote:

 I am attempting to zip large files that are 2GB - 3GB.

 uname -a;

 FreeBSD 3s1.com 4.11-STABLE FreeBSD 4.11-STABLE #7

 I have tried gzip, bzip2 from the ports and rzip.

 All give no errors on zipping, but will not unzip, citing
 CRC errors.

 Is there a maximum file size for zipping? Is my system too old?
 Maybe a file or library that all zip programs depend on that is
 corrupt?


Your system is not too old; there were plenty of big files around when
4.11 was released. Sometimes we had to refill the oil lamps before gzip
completed, but we made do.


I routinely gzip and gunzip files several GB in size on a 4.8 release
machine. If I had to point fingers I would question your disk. Try using
gzip and gunzip on a different drive.


Also, you don't say if the files are local; if you are transmitting the
files, make certain they are not being sent in ASCII format - over ftp,
for instance.


DAve


--
Three years now I've asked Google why they don't have a
logo change for Memorial Day. Why do they choose to do logos
for other non-international holidays, but nothing for
Veterans?

Maybe they forgot who made that choice possible.


Re: can't zip large files > 2gb

2007-05-01 Thread David Banning
 Maybe a file or library that all zip programs depend on that is
 corrupt?
 
 Your system is not too old; there were plenty of big files around when
 4.11 was released. Sometimes we had to refill the oil lamps before gzip 
 completed, but we made do.

You are right about the age of the system - I just got the same error
gzipping a 339M file, but not a smaller 149M file. I tried your disk
idea. I unmounted the volume I was using and did an fsck with no
errors. Then I gzipped a new set of files on another drive, and
got the same error. I recompiled gzip from source also. 

 Also, you don't say if the files are local; if you are transmitting the
 files, make certain they are not being sent in ASCII format - over ftp,
 for instance.

I am not transmitting the file via FTP or anything else right now. 
All is local for now.


Re: can't zip large files > 2gb

2007-05-01 Thread David Banning
Another piece of info - I just compiled rzip and it seems I
have the same problem there! There must be something in common
that these programs are using...


Re: can't zip large files > 2gb

2007-05-01 Thread Kris Kennaway
On Tue, May 01, 2007 at 11:22:28PM -0400, David Banning wrote:
 Another piece of info - I just compiled rzip and it seems I
 have the same problem there! There must be something in common
 that these programs are using...

Is your filesystem full? :)

Kris


Re: can't zip large files > 2gb

2007-05-01 Thread David Banning
On Tue, May 01, 2007 at 11:53:55PM -0400, Kris Kennaway wrote:
 On Tue, May 01, 2007 at 11:22:28PM -0400, David Banning wrote:
  Another piece of info - I just compiled rzip and it seems I
  have the same problem there! There must be something in common
  that these programs are using...
 
 Is your filesystem full? :)

Not at all;

Filesystem                 1K-blocks     Used     Avail Capacity  Mounted on
/dev/ad0s1a                   503966   110728    352922    24%    /
/dev/ad0s1f                   257998    29528    207832    12%    /tmp
/dev/ad0s1g                 75407576 51862570  17512400    75%    /usr
/dev/ad0s1e                   503966   260560    203090    56%    /var
procfs                             4        4         0   100%    /proc
linprocfs                          4        4         0   100%    /usr/compat/linux/proc
70.52.121.240:/usr/backup   75331512 15213578  54091414    22%    /usr/optex
/dev/ad1s1e                307684276 73248808 209820726    26%    /tusr