can't zip large files > 2gb

David Banning david+dated+1178731994.335e16 at skytracker.ca
Fri May 4 17:53:23 UTC 2007


> Try the same operation on a known working system, take that output
> file and do a diff with that and the corrupt one after a 'strings', so
> 'strings new.gz > new-text', 'strings corrupt.gz > corrupt-text',
> 'diff new-text corrupt-text'.  I'm just interested in how it's being
> corrupted and maybe the strings output will tell you something.

I don't have a separate system, but I compared the strings output
of the tar before compression with the strings output of the tar
-after- compression and decompression - as I mentioned, the size
difference between the two is only two bytes.
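
For reference, the round trip I'm testing amounts to something like
this (file names are placeholders; md5 is the stock FreeBSD checksum
tool and would confirm the corruption byte-for-byte rather than just
by size):

  md5 big.tar                     # checksum of the original
  gzip -c big.tar > big.tar.gz    # compress to a copy
  gzip -dc big.tar.gz > big2.tar  # decompress to a second copy
  md5 big2.tar                    # differs if the data was corrupted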

The result was that diff exhausted memory on the two files, but
there was around a 1 MB size difference between the two 1.5 GB
ASCII files.
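
Since diff(1) tries to hold both files, cmp(1) might cope better
here; it streams the files and reports where they first diverge:

  cmp before-text after-text
  # prints: before-text after-text differ: char N, line M
  cmp -l before-text after-text | head
  # lists the first differing byte offsets and their values in octal

Whether the differences start at a single offset or are scattered
through the file would say something about how the corruption
happens.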


> Sorry if this was specified before, but did this just start happening
> or is this the first time you've tried to gzip large files on this
> system?

This is the first time I have tried files of this size - but I get
the same problem no matter which compression utility I use; I've
tried gzip, bzip2, rzip, and compress.
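
If it helps narrow things down, a synthetic test file would take tar
out of the picture too - a sketch, with made-up names:

  dd if=/dev/random of=big.test bs=1m count=3072   # ~3 GB of junk
  md5 big.test
  gzip -c big.test > big.test.gz
  gzip -dc big.test.gz | md5    # should match the first checksum

If that fails the same way, the problem is below the compression
utilities - filesystem, disk, or RAM - rather than in gzip and
friends.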

