Hi,
On 04/02/2020 at 15:25, Frederik Ramm wrote:
> Hm, the wording is a bit unfortunate really. Of course this "internal
> use only" applies to the personal data in the file which according to
> (the LWG's interpretation of) the GDPR is ok to use for OSM's own
> purposes but not for blasting it
Hi,
On 04.02.20 14:10, Colin Smale wrote:
>> The Geofabrik download server has full history files for every region it
>> offers. Unlike the non-history extracts, these files are only available
>> for users who log in with their OSM user name.
> Aah, thanks Frederik, I didn't know about this.
On 2020-02-04 13:36, Frederik Ramm wrote:
> Hi,
>
> On 04.02.20 13:22, Colin Smale wrote:
>
>> Correct me if I am wrong, but I don't remember ever seeing
>> regional full history files.
>
> The Geofabrik download server has full history files for every region it
> offers. Unlike the non-history extracts, these files are only available
> for users who log in with their OSM user name.
Hi,
On 04.02.20 13:22, Colin Smale wrote:
> Correct me if I am wrong, but I don't remember ever seeing
> regional full history files.
The Geofabrik download server has full history files for every region it
offers. Unlike the non-history extracts, these files are only available
for users who log in with their OSM user name.
I wonder how many users actually need the full planet file. Surely
there are loads of cases where a regional extract would suffice for the
use case in hand. How about encouraging people to consider a
regional download instead?
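Fetching a regional extract is a one-liner. The sketch below assumes the Geofabrik download server's usual URL layout; the region path is only an example, not a recommendation:

```shell
# Geofabrik serves extracts at predictable URLs of the form
#   https://download.geofabrik.de/<region>-latest.osm.pbf
# The region below is just an illustration; pick yours from the site index.
REGION="europe/netherlands"
URL="https://download.geofabrik.de/${REGION}-latest.osm.pbf"
echo "$URL"
# To actually download it (resumable, so a broken transfer is cheap):
#   wget --continue "$URL"
```

A country-sized extract is typically a few hundred megabytes instead of the tens of gigabytes of a full planet.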
Something else, only slightly off-topic: I have often had ideas in
We are talking literally about a one-command "pipeline" that already does
everything right and consumes 1% of the volume of a weekly download of the
planet.
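One such pipeline, assuming the PyOsmium tools are installed and the .pbf file carries a replication timestamp, really is a single command. The sketch below only prints the command rather than touching the network:

```shell
# pyosmium-up-to-date reads the replication timestamp embedded in the
# file, then fetches and applies only the missing diffs, in place.
PLANET="planet.osm.pbf"
CMD="pyosmium-up-to-date ${PLANET}"
echo "$CMD"   # run this from cron for a daily refreshed planet
```

Each run transfers only the diffs since the file's timestamp, which is where the roughly 1%-of-a-weekly-download figure comes from.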
Not to mention that you get a daily (or whatever you want) updated planet out
of it that contains a defined set of diffs (contrary to
A similar discussion happened recently when BitTorrent downloads were
announced as an experimental feature (wiki discussion, Twitter, Reddit).
Andy, I agree that being frugal with bandwidth is important. Yet there is
a significant operations cost involved here that I suspect very few will
actually be willing to pay unless it is made trivial: the cost of
setting up an independent planet file update pipeline, i.e. a Docker image
On 02/02/2020 16:39, Yuri Astrakhan wrote:
> * Anyone working on an evolving project like OpenMapTiles would attest
> that the import schema constantly changes.
Indeed, but ...
> Every time the schema changes, one needs to download the newest planet,
> import it based on the new schema, and run diffs from that point.
While keeping a planet file up to date is really easy, it is probably not the
first idea that comes to mind when you first plan to do something with the data.
This service looks like a good idea, but guarding against abusive use must
still be kept in mind.
Yves
Andy, two major reasons:
* Anyone working on an evolving project like OpenMapTiles would attest that
the import schema constantly changes. Every time the schema changes, one needs
to download the newest planet, import it based on the new schema, and run diffs
from that point.
* Automation / easy
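The reimport cycle described in the first point can be sketched as three steps. The tool names here are assumptions for illustration (osm2pgsql as the importer, PyOsmium for diffs); substitute whatever your project actually uses. The commands are echoed rather than executed:

```shell
# Reimport cycle after a schema change (tool names are assumptions;
# swap in your own importer and diff tool).
PLANET="planet-latest.osm.pbf"
for STEP in \
  "wget --continue https://planet.openstreetmap.org/pbf/${PLANET}" \
  "pyosmium-up-to-date ${PLANET}" \
  "osm2pgsql --create --slim ${PLANET}"
do
  echo "$STEP"
done
```

Bringing the file up to date *before* the import means the fresh database starts from a known timestamp, so subsequent diffs can be applied from exactly that point.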
> For those who download OSM data regularly, there is now a simple way to reduce the load on the primary OSM servers, while also making downloads much faster and ensuring the data is correct.

Apologies if this has been done to death already, but surely if you are downloading the entire planet
The news mentions that downloads from Planet OSM are currently rate-limited
to 400 kB/s and suggests using mirrors, but does not mention the related
announcement about the new tool to simplify such downloads. I think it will
help anyone downloading, and it might be worth including in the next