Thanks Sergio,

I will set up a JIRA task.

I'd like to explore the next question in the architecture then: using
native RDBMS backends and their schemas (not just storing triples in a
database). There is a use case for retrofitting a Linked Data Platform onto
existing databases, rather than generating an RDF-encoded data mart (a data
mart makes the LDP transaction support less relevant).

The specific use case I am exploring is augmenting spatio-temporal data (a
very large amount of existing data fits this pattern) with more flexible
metadata graphs, which can then be accessed in different subsets based on
varying amounts of reasoning. LDP makes sense for the metadata, but I want
the platform to be able to reuse the Linked Data layer, including URI
redirection, content negotiation, templating and rendering.
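
To make that concrete, here is a rough sketch (hypothetical URIs and
vocabulary choices, purely illustrative) of an observation whose core
spatio-temporal values come from the existing database, augmented with a
metadata graph managed through LDP:

  @prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
  @prefix dct: <http://purl.org/dc/terms/> .
  @prefix geo: <http://www.w3.org/2003/01/geo/wgs84_pos#> .
  @prefix ex:  <http://example.org/def/> .

  # core record, materialised from the existing RDBMS
  <http://example.org/id/observation/42>
      geo:lat  "-35.28"^^xsd:double ;
      geo:long "149.13"^^xsd:double ;
      dct:date "2016-02-22"^^xsd:date .

  # flexible metadata graph, managed via LDP
  <http://example.org/id/observation/42>
      dct:subject    ex:StreamGauge ;
      dct:conformsTo <http://example.org/def/observation-profile> ;
      dct:isPartOf   <http://example.org/id/dataset/hydrology> .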

It is possible to put a SPARQL endpoint over a relational database with
Triplify or D2RQ, so has anybody thought about the role of Marmotta here?
AFAICT there is no equivalent Apache project, but there are quite a lot of
options out there.
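
For reference, D2RQ's mapping language is itself Turtle. A minimal sketch,
assuming a hypothetical "observations" table, looks roughly like this:

  @prefix d2rq: <http://www.wiwiss.fu-berlin.de/suhl/bizer/D2RQ/0.1#> .
  @prefix map:  <#> .
  @prefix ex:   <http://example.org/def/> .

  map:database a d2rq:Database ;
      d2rq:jdbcDSN    "jdbc:postgresql://localhost/spatial" ;
      d2rq:jdbcDriver "org.postgresql.Driver" .

  # expose each row of the observations table as an ex:Observation
  map:Observation a d2rq:ClassMap ;
      d2rq:dataStorage map:database ;
      d2rq:uriPattern  "observation/@@observations.id@@" ;
      d2rq:class       ex:Observation .

  map:observationDate a d2rq:PropertyBridge ;
      d2rq:belongsToClassMap map:Observation ;
      d2rq:property <http://purl.org/dc/terms/date> ;
      d2rq:column   "observations.obs_date" .

Once such a mapping exists the database is already queryable via SPARQL;
the open question is how Marmotta's Linked Data layer could sit on top of
an endpoint like that.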

LDCache seems to be related: it makes federated resources available to a
client. It's not 100% clear, but it seems to be limited to the LDR GET
methods (?). I'm not sure if this is the way to plug in SPARQL backends,
but having two different federation mechanisms requires some thought.
SPARQL has some built-in federation methods, of course.
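
By "built-in federation" I mean SPARQL 1.1 federated query, i.e. the
SERVICE keyword. A sketch against two hypothetical endpoints (one
D2RQ/Triplify endpoint over the RDBMS, one for Marmotta's own store):

  PREFIX dct: <http://purl.org/dc/terms/>
  PREFIX geo: <http://www.w3.org/2003/01/geo/wgs84_pos#>

  SELECT ?obs ?lat ?long ?subject
  WHERE {
    # observations served by a D2RQ/Triplify endpoint over the RDBMS
    SERVICE <http://example.org/rdbms/sparql> {
      ?obs geo:lat ?lat ; geo:long ?long .
    }
    # metadata graph held in the Marmotta-managed triple store
    SERVICE <http://example.org/marmotta/sparql/select> {
      ?obs dct:subject ?subject .
    }
  }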

Cheers
Rob Atkinson


On Tue, 23 Feb 2016 at 02:02 Sergio Fernández <[email protected]> wrote:

> Hi,
>
> On Mon, Feb 22, 2016 at 3:54 AM, Rob Atkinson <[email protected]>
> wrote:
>
> > I am looking at linked data applications to add value to data exposed
> > via Web Services, supplying the missing semantics about the exposed
> > content needed to actually discover and use those services.
> > I need to be able to traverse graphs composed of things like VoID,
> > RDF Data Cube, etc.
> > The RDF Shapes scope covers this, and there are some elements of
> > Marmotta, such as LDPath, that are relevant. I've previously built the
> > functionality I needed using the LinkedDataAPI (
> > https://github.com/UKGovLD/linked-data-api ).
> >
> > Is there anything in Marmotta to support parameterised SPARQL queries
> > accessed via URL-based APIs, and building a response by traversing
> > paths from the query results?
> >
>
> Not AFAIK, and that would be a really valuable addition to the
> marmotta-sparql module. While I could also see a dedicated LDA module, I
> think that alone could be a good focus to start with.
>
> > The main hurdle I see is that Marmotta is keen to bind to a single
> > repository, whereas I specifically want to combine data from
> > triple-stores and existing RDBMS environments. Decoupling APIs against
> > separate SPARQL endpoints works fine for this. IMHO, if the Marmotta
> > build configured APIs by default against its configured SPARQL
> > endpoint, but an API configuration module allowed additional APIs to be
> > configured against alternative SPARQL endpoints, then all would be
> > bliss.
> >
>
> Well, the architectural choice in Marmotta was to build a Linked Data
> server on top of a single (but pluggable) triple store. I still think
> that if we want to implement any kind of federation it should be done in
> the backend layer, not in the application itself. But we're happy to
> discuss options.
>
> To start, register all this stuff as issues in Jira. I'd also recommend
> you start getting familiar with our code base and APIs. So probably
> adding support for parameterised SPARQL queries via URLs to
> marmotta-sparql could be a good entry point. What do you think, Rob?
>
> Thanks for sharing with us all your ideas.
>
> Cheers.
>
>
> --
> Sergio Fernández
> Partner Technology Manager
> Redlink GmbH
> m: +43 6602747925
> e: [email protected]
> w: http://redlink.co
>
