Re: [Distutils] Contributing money to package authors/maintainers via PyPI

2016-07-25 Thread Nick Coghlan
On 26 July 2016 at 04:52, Chris Barker  wrote:
> On Sat, Jul 23, 2016 at 12:22 PM, Nathaniel Smith  wrote:
> note: for a higher level of support, the PSF _could_ follow the NumFOCUS
> approach:
>
> NumFOCUS is a properly set-up non-profit that can act as a gateway for
> particular projects, so that the individual projects don't need to set up
> all that accounting and legal infrastructure:
>
> http://www.numfocus.org/information-on-fiscal-sponsorship.html
>
> However, there is still a pretty big barrier to entry to become a sponsored
> organization, as there should be.

The PSF has considered this, but there's not a lot of value we could
provide above and beyond other organisations that already do this for
open source projects in general. For example:

- Software Freedom Conservancy
- Software in the Public Interest
- Outercurve Foundation

However, similar to the more direct crowdfunding options, pointing
folks towards organisations like these via the publisher-facing pages
in Warehouse is certainly something that could be done, especially as
their download counts start to grow.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] bulk upload

2016-07-25 Thread Chris Angelico
On Tue, Jul 26, 2016 at 8:03 AM, Donald Stufft  wrote:
>
>> On Jul 25, 2016, at 5:57 PM, Chris Angelico  wrote:
>>
>> On Tue, Jul 26, 2016 at 6:16 AM, Donald Stufft  wrote:
>>> Disk space is super cheap. We’re currently using Amazon S3 to store our
>>> files, and the storage portion of our “bill” there is something like
>>> $10/month for all of PyPI (out of a total “cost” of ~$35,000/month). Almost
>>> all of our “cost” for PyPI as a whole comes from bandwidth used not from
>>> storage.
>>
>> Does anyone mirror all of PyPI? If so, "storage" suddenly also means
>> "bandwidth”.
>
>
> Yes folks do mirror all of PyPI, but it’s not as simple as storage == 
> bandwidth. The price of the bandwidth is paid generally when the file is 
> uploaded so deleting doesn’t reduce the bandwidth demands of existing 
> mirrors. It *does* increase the bandwidth demands of a brand new mirror, but 
> a single full mirror represents 0.089% of the total monthly bandwidth of PyPI 
> and there are no indications that there are significant numbers of new 
> mirrors being added regularly to where it would even matter.
>

Good stats, thanks.

ChrisA


Re: [Distutils] bulk upload

2016-07-25 Thread Donald Stufft

> On Jul 25, 2016, at 5:57 PM, Chris Angelico  wrote:
> 
> On Tue, Jul 26, 2016 at 6:16 AM, Donald Stufft  wrote:
>> Disk space is super cheap. We’re currently using Amazon S3 to store our
>> files, and the storage portion of our “bill” there is something like
>> $10/month for all of PyPI (out of a total “cost” of ~$35,000/month). Almost
>> all of our “cost” for PyPI as a whole comes from bandwidth used not from
>> storage.
> 
> Does anyone mirror all of PyPI? If so, "storage" suddenly also means
> "bandwidth”.


Yes, folks do mirror all of PyPI, but it's not as simple as storage ==
bandwidth. The bandwidth cost is generally paid once, when a file is
uploaded, so deleting it doesn't reduce the bandwidth demands of existing
mirrors. It *does* increase the bandwidth demands of a brand new mirror,
but a single full mirror represents 0.089% of the total monthly bandwidth
of PyPI, and there are no indications that significant numbers of new
mirrors are being added regularly enough for it to even matter.
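
For a sense of scale, the two figures quoted in this thread pin down rough totals. A back-of-envelope sketch (the S3 storage rate used here is my assumption about 2016-era pricing, not a number from the thread):

```python
# Rough scale implied by the figures above. The ~$0.03/GB-month S3 storage
# rate is an assumed 2016-era price, not a number stated in this thread.
storage_bill = 10.0        # $/month for all of PyPI's files (from the thread)
s3_rate = 0.03             # $/GB-month, assumed S3 standard storage price
mirror_share = 0.00089     # one full mirror = 0.089% of monthly bandwidth

total_storage_gb = storage_bill / s3_rate            # implied size of PyPI
monthly_bandwidth_gb = total_storage_gb / mirror_share

print(f"~{total_storage_gb:.0f} GB stored")
print(f"~{monthly_bandwidth_gb / 1024:.0f} TB/month of bandwidth")
```

This works out to a few hundred GB stored versus a few hundred TB served per month, consistent with the point that bandwidth, not storage, dominates the cost.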


—
Donald Stufft





Re: [Distutils] bulk upload

2016-07-25 Thread Chris Angelico
On Tue, Jul 26, 2016 at 6:16 AM, Donald Stufft  wrote:
> Disk space is super cheap. We’re currently using Amazon S3 to store our
> files, and the storage portion of our “bill” there is something like
> $10/month for all of PyPI (out of a total “cost” of ~$35,000/month). Almost
> all of our “cost” for PyPI as a whole comes from bandwidth used not from
> storage.

Does anyone mirror all of PyPI? If so, "storage" suddenly also means
"bandwidth".

ChrisA


Re: [Distutils] bulk upload

2016-07-25 Thread Donald Stufft

> On Jul 25, 2016, at 3:05 PM, Chris Barker  wrote:
> 
> On Mon, Jul 25, 2016 at 8:55 AM, Robin Becker  wrote:
> In our private readonly pypi we have 93 releases. I don't think that burden 
> should fall on pypi. However, it's not clear to me if I should push micro 
> releases to pypi and then remove them when another release is made. Is there 
> a way to remove a 'release' completely?
> 
> I'm pretty sure there is no way to remove a release (at least not routinely).
> This is by design -- if someone has done something with that particular
> release, we want it to be reproducible.


Authors can delete files, releases, or projects, but can never re-upload an
already-uploaded file, even if they delete it. Actually doing this is
discouraged, though (in the future we may change it to a soft delete that just
hides the release everywhere, with the ability to restore it), basically for
the reason you mentioned: people pin to specific versions (and sometimes
specific hashes) and we don't want to break their deployments.

> 
> I see the point, but it's a bit too bad -- I know I've got some
> releases up there that were replaced VERY soon due to a build error or some
> carelessness on my part :-)
> 
> Apparently, disk space is cheap enough that PyPI doesn't need to worry about 
> it.

Disk space is super cheap. We’re currently using Amazon S3 to store our files,
and the storage portion of our “bill” there is something like $10/month for all
of PyPI (out of a total “cost” of ~$35,000/month). Almost all of our “cost” for
PyPI as a whole comes from bandwidth used, not from storage.

—
Donald Stufft





Re: [Distutils] bulk upload

2016-07-25 Thread Chris Barker
On Mon, Jul 25, 2016 at 8:55 AM, Robin Becker  wrote:

> In our private readonly pypi we have 93 releases. I don't think that
> burden should fall on pypi. However, it's not clear to me if I should push
> micro releases to pypi and then remove them when another release is made.
> Is there a way to remove a 'release' completely?


I'm pretty sure there is no way to remove a release (at least not
routinely). This is by design -- if someone has done something with that
particular release, we want it to be reproducible.

I see the point, but it's a bit too bad -- I know I've got some
releases up there that were replaced VERY soon due to a build error or some
carelessness on my part :-)

Apparently, disk space is cheap enough that PyPI doesn't need to worry
about it.

Are you running into any problems?

> I did try to reduce my manylinux sizes by using a library of shared object
> codes (ie a .a built from the PIC compile objs), but I didn't seem able to
> make this work properly; the resulting .so seems to contain the whole
> library (freetype).


Is this a problem other than file sizes? I think until/if Nathaniel (or
someone :-) ) comes up with a standard way to make wheels of shared libs,
we'll simply have to live with large binaries.

-CHB



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov


Re: [Distutils] Contributing money to package authors/maintainers via PyPI

2016-07-25 Thread Chris Barker
On Sat, Jul 23, 2016 at 12:22 PM, Nathaniel Smith  wrote:

> OTOH, if we give up on that part of the idea, then it becomes much easier
> :-). It'd be straightforward for PyPI to provide a "how to donate to this
> project" box on each project page, that has links to whatever donation
> processing service(s) the project prefers.
>
+1 on this one -- shift all the legal hassles to the projects, but provide
a tiny bit of infrastructure to make it easier for people to find.

note: for a higher level of support, the PSF _could_ follow the NumFOCUS
approach:

NumFOCUS is a properly set-up non-profit that can act as a gateway for
particular projects, so that the individual projects don't need to set up
all that accounting and legal infrastructure:

http://www.numfocus.org/information-on-fiscal-sponsorship.html


However, there is still a pretty big barrier to entry to become a sponsored
organization, as there should be.

So I think it would be great if PyPI could simply put a donate button in
there, but not try to get into the money-handling business.

-CHB



-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov


Re: [Distutils] bulk upload

2016-07-25 Thread Robin Becker

On 25/07/2016 15:30, Daniel Holth wrote:

1. There is a tool called twine that is the best way to upload to PyPI.


Thanks, I'll check that out.
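
For reference, the basic twine workflow looks like this. It is a sketch that assumes the built and tested files have already been collected into a local dist/ directory:

```shell
# Install twine once; it is a regular PyPI package:
pip install --upgrade twine

# Then upload the already-built, already-tested artifacts from dist/
# (prompts for PyPI credentials unless ~/.pypirc is configured):
twine upload dist/*

# Or just a subset, e.g. only the manylinux wheels:
twine upload dist/*manylinux1*.whl
```

Because twine uploads existing files rather than building and uploading in one step, it fits a test-before-upload workflow split across multiple machines.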


2. I'm not aware of any aggregate limits but I'm pretty sure each
individual file can only be so big

In our private read-only PyPI we have 93 releases. I don't think that
burden should fall on PyPI. However, it's not clear to me whether I should
push micro releases to PyPI and then remove them when another release is
made. Is there a way to remove a 'release' completely? The edit pages seem
to suggest so, but does that remove the files?




3. Maybe the platform returns as manylinux1? Set an environment variable to
ask for static linking, and check for it in your build script?


...
I did try manylinux1 (after PEP 513), but it didn't seem to work; I looked
at sys.platform, os.name and the platform module, but see only this:


> platform=Linux-3.16.0-50-generic-x86_64-with-redhat-5.11-Final
> sys.platform=linux2
> os.name=posix

However, it's easy enough to export an environment variable in the docker
startup script.
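
That approach might look like the following sketch; the variable name and the script path are hypothetical, not an established convention:

```shell
# Mark the build environment explicitly (BUILD_IS_MANYLINUX is a made-up
# name for this example). One option is to pass it into the container:
#
#   docker run -e BUILD_IS_MANYLINUX=1 -v "$(pwd)":/io \
#       quay.io/pypa/manylinux1_x86_64 /io/build-wheels.sh
#
# ...another is to export it at the top of the container's startup script:
export BUILD_IS_MANYLINUX=1
```

The build script can then branch on that variable instead of trying to sniff the platform.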


I did try to reduce my manylinux sizes by using a library of shared object
code (i.e. a .a built from the PIC-compiled objects), but I didn't seem able
to make this work properly; the resulting .so seems to contain the whole
library (freetype). The Windows linker seems able to pick up only the
required bits, so the Windows wheels are much smaller.

--
Robin Becker


Re: [Distutils] bulk upload

2016-07-25 Thread Daniel Holth
1. There is a tool called twine that is the best way to upload to PyPI.

2. I'm not aware of any aggregate limits but I'm pretty sure each
individual file can only be so big

3. Maybe the platform returns as manylinux1? Set an environment variable to
ask for static linking, and check for it in your build script?
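
The build-script side of that suggestion might look like this sketch; the variable name STATIC_FREETYPE is hypothetical, chosen just for the example:

```python
# Check an explicit opt-in environment variable in setup.py / the build
# script, rather than trying to sniff the manylinux environment.
# STATIC_FREETYPE is a hypothetical name used for this example.
import os

def want_static_freetype() -> bool:
    """Return True when the build environment asked for static linking."""
    return os.environ.get("STATIC_FREETYPE", "0") == "1"

# Choose link inputs accordingly when building the extension module:
if want_static_freetype():
    extra_objects, libraries = ["/usr/local/lib/libfreetype.a"], []
else:
    extra_objects, libraries = [], ["freetype"]
```

The manylinux build environment would set STATIC_FREETYPE=1 (e.g. via `docker run -e`), while ordinary builds leave it unset and link dynamically.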



On Mon, Jul 25, 2016 at 8:05 AM Robin Becker  wrote:

> I have started to make manylinux wheels for reportlab.
>
> Our workflow is split across multiple machines. In the end we create a
> total of 19 package files (10 manylinux, 8 Windows + 1 source); these
> total 53 MB.
>
> 1) Is there a convenient way to upload a new version starting from the
> package
> files themselves? Normally we try to test the packages before they are
> uploaded
> which implies we cannot just use the distutils upload command.
>
>
> 2) I assume I cannot just keep on uploading new versions to pypi.
> Presumably I
> would have to delete a micro release before uploading a new one and only
> keep
> significant releases.
>
> 3) The manylinux builds are significantly larger than the windows ones
> because
> the manylinux build is not statically linking those bits of freetype which
> we
> use. Is there a way to detect that I'm building under manylinux?
> --
> Robin Becker


[Distutils] bulk upload

2016-07-25 Thread Robin Becker

I have started to make manylinux wheels for reportlab.

Our workflow is split across multiple machines. In the end we create a
total of 19 package files (10 manylinux, 8 Windows + 1 source); these
total 53 MB.


1) Is there a convenient way to upload a new version starting from the
package files themselves? Normally we try to test the packages before they
are uploaded, which implies we cannot just use the distutils upload command.



2) I assume I cannot just keep on uploading new versions to PyPI.
Presumably I would have to delete a micro release before uploading a new
one and only keep significant releases.


3) The manylinux builds are significantly larger than the Windows ones
because the manylinux build is not statically linking those bits of
freetype which we use. Is there a way to detect that I'm building under
manylinux?

--
Robin Becker