The other solution is to create a new git repository just for frequently
updated files like this one… I mean, we don't want to end up like pytz, do we?

PS: A good thing about pytz is that other languages literally just parse
pytz's list for their own timezone implementations. No Python. Easy! With
this being in JSON, I could imagine using the Terraform libraries in Go
(instead of Libcloud) to do multicloud, and using this costing system to
decide where and when.
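
To make the pytz-style point concrete: a consumer wouldn't need Libcloud at
all, just fetch the published JSON and index into it. Rough sketch below (the
bucket URL and the "compute" -> provider -> size layout are my assumptions
about the generated file, not a confirmed schema); the equivalent in Go or
any other language would be just as short:

    import json
    from urllib.request import urlopen

    # Hypothetical published location of the auto-generated pricing file.
    PRICING_URL = "https://example-bucket.s3.amazonaws.com/pricing.json"

    with urlopen(PRICING_URL) as response:
        pricing = json.load(response)

    # Assumed layout: {"compute": {"<provider>": {"<size>": <hourly price>}}}
    price = pricing.get("compute", {}).get("ec2_us_east", {}).get("t2.micro")
    print("t2.micro hourly price:", price)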

Samuel Marks
Charity <https://sydneyscientific.org> | consultancy <https://offscale.io>
| open-source <https://github.com/offscale> | LinkedIn
<https://linkedin.com/in/samuelmarks>


On Thu, Jul 2, 2020 at 9:51 PM Jay Rolette <role...@infinite.io> wrote:

> Same here!
>
> Thanks,
> Jay
>
> On Wed, Jul 1, 2020 at 12:45 PM Francisco Ros <fj...@doalitic.com> wrote:
>
> > Hey Tomaz,
> >
> > I'd really love to see this :-)
> >
> > Thanks,
> > Francisco
> >
> > > On 1 Jul 2020, at 12:00, Tomaz Muraus <to...@apache.org> wrote:
> > >
> > > Recently one of the Libcloud contributors (Eis-D-Z) published various
> > > improvements to our price scraping scripts and added some new ones -
> > > https://github.com/apache/libcloud/pulls/Eis-D-Z.
> > >
> > > I think it would now make sense to run those scraping scripts on a
> > > continuous basis as part of our CI (e.g. once a day) and publish the
> > > generated file to some well known location (e.g. public read-only S3
> > > bucket).
> > >
> > > In fact, that was also the plan when we originally added the
> > > libcloud.pricing.download_pricing_file function and related
> > > functionality quite a long time ago.
> > >
> > > IIRC, the plan was to include an auto-generated pricing file directly
> > > inside the git repo, but this is more complicated and I would need to
> > > ask the ASF infra team whether they even allow something like that
> > > (updating and committing a change as a bot user on our CI - Travis CI).
> > >
> > > So for now, I will probably just publish this auto-generated
> > > pricing.json file to a public read-only S3 bucket (I will make sure to
> > > set up correct rate limits and alerts to prevent abuse, even though the
> > > pricing file itself is quite small).
> > >
> > > What do other people think?
> >
> >
>
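
And for Python users, the existing hook Tomaz mentions should make
consumption a couple of lines once the file is published somewhere stable.
A rough sketch, where the URL, driver name and size id are placeholders
rather than the real published values:

    from libcloud.pricing import download_pricing_file, get_size_price

    # Fetch the generated pricing.json from wherever it ends up being
    # published (placeholder URL) and cache it at Libcloud's custom pricing
    # path (~/.libcloud/pricing.json by default).
    download_pricing_file(
        file_url="https://example-bucket.s3.amazonaws.com/pricing.json")

    # Subsequent lookups read the cached file; names are illustrative only.
    price = get_size_price(driver_type="compute",
                           driver_name="ec2_us_east",
                           size_id="t2.micro")
    print(price)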
