Those files have now been made public.
I will publish a blog post with some details on that in the near future.
On Sat, Jul 11, 2020 at 8:48 PM Tomaz Muraus wrote:
I added some information on this new behavior here -
https://github.com/apache/libcloud/blob/f122600d2adf181a9b100cdd552cd02979c5b1b9/docs/compute/pricing.rst#downloading-latest-pricing-data-from-an-s3-bucket
Keep in mind that those 3 files are not public yet. I plan to make them
public soon.
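The linked docs describe fetching the latest pricing data from the bucket and caching it locally. A minimal stdlib sketch of that fetch-and-cache pattern (the bucket URL, object name, and the `cache_is_stale`/`refresh_pricing` helpers are all hypothetical illustrations, not real libcloud APIs or object paths):

```python
import json
import os
import time
import urllib.request

# Placeholder URL: the thread references pricing files served from an
# S3 bucket, but the exact bucket and object names are not given here.
PRICING_URL = "https://example-bucket.s3.amazonaws.com/pricing.json"
CACHE_PATH = os.path.expanduser("~/.libcloud/pricing.json")
MAX_AGE_SECONDS = 24 * 3600  # re-download at most once a day


def cache_is_stale(path, max_age, now=None):
    """Return True if the cached file is missing or older than max_age seconds."""
    if not os.path.exists(path):
        return True
    now = time.time() if now is None else now
    return (now - os.path.getmtime(path)) > max_age


def refresh_pricing(url=PRICING_URL, path=CACHE_PATH, max_age=MAX_AGE_SECONDS):
    """Download the pricing file if the local cache is stale, then parse it."""
    if cache_is_stale(path, max_age):
        os.makedirs(os.path.dirname(path), exist_ok=True)
        urllib.request.urlretrieve(url, path)  # network call
    with open(path) as fh:
        return json.load(fh)
```

Libcloud itself exposes similar functionality through its `libcloud.pricing` module (see the linked docs); this sketch only illustrates the general mechanics of pulling frequently updated data files from a bucket.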
Yeah, I would actually prefer a git repository so everything is version
controlled, etc., but I went with the fastest and simplest approach
possible.
I'm not exactly sure what the ASF rules are for something like that (I
would need to ask the ASF infra team to create a new repo, create a bot
account, etc.).
The other solution is to create a new git repository just for frequently
updated files like this one… I mean, we don't want to end up like pytz, do we?
PS: A good thing about pytz is that other languages literally just parse
pytz's list for their own timezone implementations. No Python needed.
Easy! - With this
Same here!
Thanks,
Jay
On Wed, Jul 1, 2020 at 12:45 PM Francisco Ros wrote:
Hey Tomaz,
I'd really love to see this :-)
Thanks,
Francisco
> On 1 Jul 2020, at 12:00, Tomaz Muraus wrote:
Recently one of the Libcloud contributors (Eis-D-Z) published various
improvements to our price scraping scripts and added some new ones -
https://github.com/apache/libcloud/pulls/Eis-D-Z.
I think it would now make sense to run those scraping scripts on a
continuous basis as part of our CI (e.g.