Hey Tomaz,

I'd really love to see this :-)

Thanks,
Francisco

> On 1 Jul 2020, at 12:00, Tomaz Muraus <to...@apache.org> wrote:
> 
> Recently one of the Libcloud contributors (Eis-D-Z) published various
> improvements to our price scraping scripts and added some new ones -
> https://github.com/apache/libcloud/pulls/Eis-D-Z.
> 
> I think it would now make sense to run those scraping scripts on a
> recurring basis as part of our CI (e.g. once a day) and publish the
> generated file to a well-known location (e.g. a public read-only S3
> bucket).
> 
> In fact, that was also the plan when we originally added the
> libcloud.pricing.download_pricing_file function and related
> functionality quite a long time ago.
> 
> IIRC, the plan was to include an auto-generated pricing file directly
> inside the git repo, but that is more complicated and I would need to
> check with the ASF infra team whether they even allow something like that
> (updating and committing a change as a bot user from our CI - Travis CI).
> 
> So for now, I will probably just publish this auto-generated pricing.json
> file to a public read-only S3 bucket (I will make sure to set up appropriate
> rate limits and alerts to prevent abuse, even though the pricing file
> itself is quite small).
> 
> What do other people think?
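
For reference, the publish step described above could be a small script run by
the daily CI job. This is only a sketch: the bucket name, local file path and
the use of boto3 are my assumptions, not a decided implementation.

    # Rough sketch of the publish step, run after the scraping scripts finish.
    # Bucket name and local file path are placeholders.
    import boto3

    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="libcloud/data/pricing.json",  # file regenerated by the scrapers
        Bucket="libcloud-pricing",              # hypothetical public read-only bucket
        Key="pricing.json",
        ExtraArgs={"ACL": "public-read", "ContentType": "application/json"},
    )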

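On the consumer side, fetching the published file with the existing helper
would then look roughly like this (the bucket URL and driver name are
illustrative placeholders):

    from libcloud.pricing import download_pricing_file, get_pricing

    # Hypothetical URL; the real location would be announced once it exists.
    download_pricing_file(
        file_url="https://libcloud-pricing.s3.amazonaws.com/pricing.json")

    # The downloaded copy lands in the user's custom pricing file location,
    # which get_pricing() prefers; the driver name is just an example.
    pricing = get_pricing(driver_type="compute", driver_name="ec2_us_east")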