packages compressed with xz
c.kworr at gmail.com
Sun Dec 5 17:56:44 UTC 2010
05.12.2010 18:03, Ion-Mihai Tetcu wrote:
>> And those ones are all empty at start. So say, if you are compressing
>> something really huge trying to use 4G of memory you end using that
>> much memory between 2G - 3G of source data. And we will need 512MB to
>> decompress that hunk of data.
>> Are the packages _that_ large?
> [ .. ]
> The biggest package that can be produced by a port is a bit over 10G.
OK, I'll try to be sharper. How many of those huge packages will run on a
prehistoric CPU/64 MB RAM combo?
I'll bet even working with OpenOffice will be... challenging.
Let's take the other route. Packages that large benefit the most from
maximum xz compression. The question clearly is: "Would we like to
ditch maximum compression for those huge packages just to accommodate
hardware that will never be able to run them?"
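The tradeoff above can be illustrated with Python's stdlib lzma module,
which wraps the same liblzma library that the xz tool uses. This is only a
sketch of the memory behavior being discussed, not anything from the ports
tree: a decoder's memory need is set by the dictionary size declared in the
stream, so a preset-9 archive (64 MiB dictionary) cannot be decompressed
under a small memory cap even when the payload itself is tiny.

```python
# Sketch (not FreeBSD ports code): how xz preset choice affects the
# memory a *decompressor* must allocate, via Python's stdlib lzma.
import lzma

data = b"example payload " * 1000

small = lzma.compress(data, preset=0)  # ~256 KiB dictionary
big = lzma.compress(data, preset=9)    # 64 MiB dictionary

# liblzma lets the decoder cap its own memory; below the declared
# dictionary size, decoding the preset-9 stream fails outright.
cap = 10 * 1024 * 1024  # 10 MiB, an illustrative low-memory budget
dec = lzma.LZMADecompressor(memlimit=cap)
try:
    dec.decompress(big)
    print("preset 9 fits under the cap")
except lzma.LZMAError:
    print("preset 9 exceeds the 10 MiB cap")

# The preset-0 stream decompresses fine under the same cap.
dec = lzma.LZMADecompressor(memlimit=cap)
print(dec.decompress(small) == data)
```

The point for low-memory hardware is that the cost is paid by every
consumer of the package, not just the build machine.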
Don't listen to me anyway. Our first goal is clearly making FreeBSD work
everywhere. Capping the xz limits would change nothing for me personally,
because all of my machines are capable of building any software I want -
I don't use packages at all. So I don't really have a vote here. We should
think from the point of view of the people running that hardware. And I
think we should support them.
Sphinx of black quartz judge my vow.