Re: How to make complex SPARQL queries reusable?

2017-04-24 Thread Paul Tyson
Another option is to express the query logic in a standard rule notation, such as 
RIF, and translate it to SPARQL. This approach is especially well suited when the 
SPARQL queries represent actual business rules.
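
To make the correspondence concrete (the rule below is informal pseudocode, not actual RIF syntax), a "leaf class" rule and its SPARQL rendering line up roughly like this:

```
(* informal rule, not RIF: ?tpe is a leaf class
   if no distinct ?sub is a subclass of it *)
leafClass(?tpe) :- not exists (subClassOf(?sub, ?tpe) and ?sub != ?tpe)

-- corresponding SPARQL fragment --
filter not exists {
  ?sub rdfs:subClassOf ?tpe .
  filter (?sub != ?tpe)
}
```

The rule form stays the single source of truth, and the SPARQL is generated from it.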

Regards,
--Paul

> On Apr 24, 2017, at 06:22, Andy Seaborne  wrote:



Re: How to make complex SPARQL queries reusable?

2017-04-24 Thread Andy Seaborne

Simon,

1/ SpinRDF provides ways of defining custom functions and property 
functions in SPARQL.


http://spinrdf.org/

There'll probably be something in the SHACL-sphere (not in the standard, 
but in the same general area/style/framework) soon.
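
As a rough sketch of what a SPIN function definition can look like (the exact vocabulary and argument conventions should be checked against the SPIN documentation; the ex: names are made up), a function whose body is given as SPARQL text might be declared like this:

```turtle
@prefix spin: <http://spinrdf.org/spin#> .
@prefix sp:   <http://spinrdf.org/sp#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/function#> .

# Hypothetical sketch: ex:isLeafClass(?arg1) is true when ?arg1 has no
# distinct subclass. SPIN binds the first function argument to ?arg1.
ex:isLeafClass a spin:Function ;
    spin:body [ a sp:Ask ;
                sp:text """ASK WHERE {
                             FILTER NOT EXISTS {
                               ?sub rdfs:subClassOf ?arg1 .
                               FILTER (?sub != ?arg1)
                             }
                           }""" ] .
```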


2/ There are methods in ExprUtils to parse expressions in SPARQL syntax.

You could build a general helper.

That does not enable syntax replacement but does cover (?) your MyFunc 
example.
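
For the crudest kind of reuse, plain text splicing also works: keep named SPARQL fragments as strings and expand placeholders before handing the query to the parser. A minimal, Jena-independent sketch (the FragmentLibrary class and the %%name%% placeholder syntax are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical helper: stores named SPARQL fragments and expands
// %%name%% placeholders in a query template before parsing.
public class FragmentLibrary {
    private final Map<String, String> fragments = new HashMap<>();

    public void define(String name, String sparqlFragment) {
        fragments.put(name, sparqlFragment);
    }

    public String expand(String queryTemplate) {
        String result = queryTemplate;
        for (Map.Entry<String, String> e : fragments.entrySet()) {
            result = result.replace("%%" + e.getKey() + "%%", e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        FragmentLibrary lib = new FragmentLibrary();
        lib.define("noSubclass",
            "filter not exists { ?sub rdfs:subClassOf ?tpe . filter (?sub != ?tpe) }");
        // The expanded string can then go to QueryFactory.create(...)
        System.out.println(lib.expand(
            "select * where { ?s a ?tpe . %%noSubclass%% }"));
    }
}
```

This is string manipulation, not syntax-aware composition, so variable capture between fragments is the caller's problem.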


Andy

On 24/04/17 01:02, Simon Schäfer wrote:

Hello,

I have complex SPARQL queries which I would like to divide into several parts 
(like function definitions), both to reuse those parts across different SPARQL 
queries and to make a single SPARQL query easier to understand. What are my 
options to achieve this?

I had a look at Jena's built-in support for user-defined function definitions. 
The problem is that they seem usable only for simple functionality, such as 
calculating the max of two integers. But I have quite complex functionality, 
which I don't want to rewrite in Java. Example:


select * where {
  ?s a ?tpe .
  filter not exists {
?sub rdfs:subClassOf ?tpe .
filter (?sub != ?tpe)
  }
}


It would be great if that could be separated into:


public class MyFunc extends FunctionBase1 {
    public NodeValue exec(NodeValue v) {
        return NodeValue.fromSparql("filter not exists {"
            + "  ?sub rdfs:subClassOf ?tpe ."
            + "  filter (?sub != ?tpe)"
            + "}") ;
    }
}
// and then later
FunctionRegistry.get().put("http://example.org/function#myFunc", MyFunc.class) ;


and then:


prefix fun: <http://example.org/function#>
select * where {
  ?s a ?tpe .
  filter(fun:myFunc(?tpe))
}


Basically I'm looking for a way to call a SPARQL query from within a SPARQL 
query. Is that possible?




Re: How to make complex SPARQL queries reusable?

2017-04-24 Thread Claude Warren
You could use federated queries to return the subquery -- this is probably
not efficient, but it might provide a starting point for further investigation.

If you are doing this in code, you could use the QueryBuilder (
https://jena.apache.org/documentation/extras/querybuilder/) and pass the
subquery to the outer query.
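
A hedged sketch of the federated-query idea (the endpoint URL is a placeholder for wherever the data lives): the reusable part is kept as its own SELECT and pulled in through a SERVICE clause:

```sparql
prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#>

select * where {
  ?s a ?tpe .
  SERVICE <http://localhost:3030/ds/sparql> {
    # the reusable part: types with no distinct subclass
    select ?tpe where {
      ?tpe a rdfs:Class .
      filter not exists {
        ?sub rdfs:subClassOf ?tpe .
        filter (?sub != ?tpe)
      }
    }
  }
}
```

The `?tpe a rdfs:Class` triple is an assumption made here to give the inner query something to select; the SERVICE call goes through the network stack even against a local store, hence the efficiency caveat.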

Claude



-- 
I like: Like Like - The likeliest place on the web

LinkedIn: http://www.linkedin.com/in/claudewarren


Re: How to make complex SPARQL queries reusable?

2017-04-24 Thread Lorenz B.
I don't think that this is possible with the current implementation,
since a Function is supposed to return a single value, and you would need
something that returns a set of values, i.e. bindings.

Have you checked SPARQL SPIN [1]? Maybe this is something you could use.

[1] http://spinrdf.org/

-- 
Lorenz Bühmann
AKSW group, University of Leipzig
Group: http://aksw.org - semantic web research center