Simon,
One way to do this is to install a caching proxy and direct web traffic
through it, e.g. an httpd or nginx setup.
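A minimal sketch of such a caching proxy in nginx (the cache path, port, and upstream host here are placeholders, not anything Jena-specific):

```nginx
# Cache responses from a schema host for 30 days so repeated
# Jena runs hit the local cache instead of the network.
proxy_cache_path /var/cache/nginx/schemas keys_zone=schemas:10m inactive=30d;

server {
    listen 8080;

    location / {
        proxy_pass        http://www.w3.org;   # example upstream schema host
        proxy_cache       schemas;
        proxy_cache_valid 200 30d;             # keep successful responses
    }
}
```

You would then point the JVM's HTTP proxy settings (or Jena's HTTP client) at localhost:8080.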
Another way is to use the StreamManager/LocationMapper to map the URLs to
local files and manage those files (e.g. a regular script that updates the
local copies). Beware: there are two LocationMapper classes - the old one in
core for (extreme!) legacy reasons, and the current one in jena-arq/RIOT.
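For the second approach, the mapping can be kept in a location-mapping file that the StreamManager machinery reads. A minimal sketch (the URLs and file paths are placeholders for whichever schemas you snapshot):

```turtle
@prefix lm: <http://jena.hpl.hp.com/2004/08/location-mapping#> .

# Redirect a single remote schema URL to a local copy.
[] lm:mapping
   [ lm:name    "http://example.org/ontology.ttl" ;
     lm:altName "file:etc/schemas/ontology.ttl" ] ;
# Redirect everything under a URL prefix to a local directory.
   lm:mapping
   [ lm:prefix    "http://example.org/" ;
     lm:altPrefix "file:etc/schemas/" ] .
```

A periodic script can refresh the files under etc/schemas/ while the mapping itself stays stable, so runs are reproducible even offline.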
Andy
On 01/07/2022 10:33, Simon Gray wrote:
Hi everyone,
I love how Jena will download schemas on-demand from URLs, but one issue is
that this feature does not guarantee reproducibility, e.g. when there is no
Internet connection or the schema server is offline.
I have downloaded some of the schemas I use and provide these locally, but
occasionally Jena will still error out, since not every schema exists locally
and the remainder are still downloaded over the network. I was wondering if
there is a simple way to persist a snapshot of the schema files downloaded by
any one Jena instance, so that I do not have to go and fetch all of these manually.
Kind regards,
Simon Gray
Software developer,
Centre for Language Technology,
University of Copenhagen