I think packaging ought to be able to use binary dependencies. Some
disagree. The binary ZStandard decompressor could be offered in a
gzip-compressed wheel.
The reason an improved packaging format can only use ZStandard and not LZMA
is that we need to improve everyone's experience, not just
On Tue, 13 Oct 2020 05:58:45 -
"Ma Lin" wrote:
>
> I heard that in the data science domain, the data is often huge, such as
> hundreds of GB or more. If people can make full use of multi-core CPUs to
> compress, the experience will be much better than with zlib.
This is true, but in data science it is
> but Python isn't trying to have the "optimal cutting-edge" thing in its
> standard library. More like "the well-established, widely-used" thing.
I also agree with this.
At present, I have confidence in zstd. There seems to be a trend of
programmers switching to zstd.
Don't
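For reference, the "well-established, widely-used" codecs already in the stdlib share a one-shot API and can be compared directly. A minimal sketch (the sample data is invented for illustration; sizes will vary with real input):

```python
import bz2
import lzma
import zlib

# Mildly repetitive sample data; real inputs will behave differently.
data = b"".join(b"record-%d: status=ok payload=abcdef\n" % i for i in range(5000))

for name, mod in [("zlib", zlib), ("bz2", bz2), ("lzma", lzma)]:
    blob = mod.compress(data)
    assert mod.decompress(blob) == data  # round-trip sanity check
    print(f"{name}: {len(blob)} of {len(data)} bytes")
```

A zstd module with the same `compress`/`decompress` shape would slot in alongside these.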
On Tue, Oct 13, 2020 at 8:16 AM Ma Lin wrote:
> Zstd has some advantages: fast speed, multi-threaded compression,
> dictionary for small data, etc. IMO it's suitable as a replacement for
> zlib, but at this time:
> 1. If it is included in the stdlib, it will take advantage of the huge
>
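On the "dictionary for small data" point: zlib already supports preset dictionaries via its `zdict` parameter, which gives a rough feel for what zstd dictionaries do at larger scale. A minimal sketch (the JSON message shape here is invented for illustration):

```python
import zlib

# A preset dictionary of substrings expected in the messages (hypothetical format).
zdict = b'{"user": "", "action": "", "timestamp": ""}'
msg = b'{"user": "alice", "action": "login", "timestamp": "2020-09-23T10:08:00"}'

plain = zlib.compress(msg)

# Compress with the preset dictionary; matching substrings become back-references.
co = zlib.compressobj(zdict=zdict)
with_dict = co.compress(msg) + co.flush()

# The decompressor must be given the same dictionary.
do = zlib.decompressobj(zdict=zdict)
assert do.decompress(with_dict) == msg
print(len(plain), len(with_dict))
```

zstd trains such dictionaries automatically from sample corpora, which is what makes it attractive for many small, similar payloads.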
I wrote a zstd module for stdlib:
https://github.com/animalize/cpython/pull/8/files
And a PyPI version based on it:
PyPI: https://pypi.org/project/pyzstd/
Doc: https://pyzstd.readthedocs.io/en/latest/
If you decide to include it in the stdlib, the work can be done in a short
time.
On 29.09.20 16:26, Omer Katz wrote:
> What are the use-cases for LZMA that make it qualify to be part of the
> stdlib? Why was that library included?
> I think we shouldn't discriminate. If there are a couple of use-cases users
> need and the implementation is sufficiently stable, I see no reason
On Tue, Sep 29, 2020 at 6:34 AM Eric V. Smith wrote:
> > I think we shouldn't discriminate. If there are a couple of use-cases
> users need and the implementation is sufficiently stable, I see no reason
> not to include those libraries in stdlib.
>
I think this was covered earlier in this
On 9/29/2020 9:26 AM, Omer Katz wrote:
> I actually disagree on HTTP2 but that's beside the point.
> What are the use-cases for LZMA that make it qualify to be part of the stdlib?
> Why was that library included?
> I think we shouldn't discriminate. If there are a couple of use-cases users
> need and
I actually disagree on HTTP2 but that's beside the point.
What are the use-cases for LZMA that make it qualify to be part of the stdlib?
Why was that library included?
I think we shouldn't discriminate. If there are a couple of use-cases users
need and the implementation is sufficiently stable,
On Wed, Sep 23, 2020 at 3:10 AM David Mertz wrote:
> On Tue, Sep 22, 2020 at 11:55 PM Paul Moore wrote:
>
>> The point of this request is that Python's packaging infrastructure is
>> looking at what compression we use for wheels - the current
>> compression is suboptimal for huge binaries like
On Wed, 23 Sep 2020 13:26:13 +0300
Omer Katz wrote:
> I pointed out a use case for Brotli & HTTP2 as a concrete example for why
> it'd be more convenient to include brotli as a module.
> I'm sure there are other cases I haven't thought about.
>
> I don't understand why LZMA should be included
Let's put it this way. If you can only support 3 compression algorithms in
the stdlib, which three would you choose? If only 4? If only 10?
Each one is concrete maintenance work. There's nothing *wrong* with any of
them, and someone uses each of the top 10 or 50. But some kind of cut-off
of
On Wed, 23 Sep 2020 at 11:09, David Mertz wrote:
> It's hard to see where packaging would have any advantage with brotli or zstd
> over lzma. XZ is more widely used, and package size seems to dominate speed.
> There are definitely some intermediate compression levels where both brotli
> and
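The "intermediate compression levels" trade-off is visible even within a single stdlib codec; a quick sketch using zlib's level parameter (sample data invented; exact sizes depend on the input):

```python
import zlib

# Semi-structured sample data, somewhere between random and fully repetitive.
data = b"".join(b"%d,%d,north,ok\n" % (i, i * i) for i in range(4000))

for level in (1, 6, 9):
    blob = zlib.compress(data, level)
    print(f"level {level}: {len(blob)} bytes")
```

Brotli and zstd simply cover a much wider span of that same speed-versus-ratio curve.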
I pointed out a use case for Brotli & HTTP2 as a concrete example for why
it'd be more convenient to include brotli as a module.
I'm sure there are other cases I haven't thought about.
I don't understand why LZMA should be included while zstd or brotli
shouldn't.
What's the actual policy here?
On Tue, Sep 22, 2020 at 11:55 PM Paul Moore wrote:
> The point of this request is that Python's packaging infrastructure is
> looking at what compression we use for wheels - the current
> compression is suboptimal for huge binaries like tensorflow. Packaging
> is in a unique situation, because
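Worth noting for the wheel discussion: wheels are plain zip archives, and the zip container (and the stdlib's zipfile) already supports methods beyond deflate, so the constraint is the wheel spec, not the container. A minimal sketch (the member name and payload are invented stand-ins for a large binary extension):

```python
import io
import zipfile

# Stand-in for a large, moderately compressible binary payload.
payload = bytes(range(256)) * 2000  # 512 KB

for label, method in [("deflate", zipfile.ZIP_DEFLATED),
                      ("bzip2", zipfile.ZIP_BZIP2),
                      ("lzma", zipfile.ZIP_LZMA)]:
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", compression=method) as zf:
        zf.writestr("pkg/_native.so", payload)
    print(f"{label}: {buf.tell()} bytes")
```

Adding a zstd method to wheels would need both a zipfile-level codec and a wheel-spec change, which is why the packaging discussion keeps coming back to what the stdlib ships.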
On Wed, 23 Sep 2020 at 08:08, Paul Sokolovsky wrote:
> In the meantime, why can't you use modules on PyPI/github/wherever
> else?
There are significant use cases where 3rd party modules are not easy
to use. But let's not get sucked into that digression here.
The point of this request is that
Mainly because we previously explored creating wheels with better
compression.
But I think that if LZMA was included, then other new algorithms should be
included as well.
On Wed, 23 Sep 2020, 10:08, Paul Sokolovsky wrote:
> Hello,
>
> On Wed, 23 Sep 2020 06:11:27 -
> "Omer Katz"
Hello,
On Wed, 23 Sep 2020 06:11:27 -
"Omer Katz" wrote:
> zstd is used when you need fast compression speed, not the best ratio.
>
> Maybe we can ask Google and Facebook to contribute their
> implementations?
And $$$ to support maintaining it over the years.
In the meantime, why can't
zstd is used when you need fast compression speed, not the best ratio.
Maybe we can ask Google and Facebook to contribute their implementations?
___
Python-ideas mailing list -- python-ideas@python.org
> On 21 Sep 2020, at 16:14, Antoine Pitrou wrote:
>
>
> Hi,
>
> On Mon, 21 Sep 2020 09:31:47 -
> "Omer Katz" <omer.d...@gmail.com> wrote:
>> Hello everyone,
>>
>> We have many compression algorithms as standard modules, but we lack two
>> important new algorithms: brotli and
Hi,
On Mon, 21 Sep 2020 09:31:47 -
"Omer Katz" wrote:
> Hello everyone,
>
> We have many compression algorithms as standard modules, but we lack two
> important new algorithms: brotli and zstandard.
>
> Brotli is an important compression algorithm both for HTTP2 and for
>