Re: resource rdfs label

2016-04-25 Thread Martynas Jusevičius
You should not be looking at the URI as a label.

How did you try rdfs:label? Something like this should work:

SELECT *
WHERE
{
  ?x tre:has_boss tre:Good ;
rdfs:label ?label .
}

It needs the tre: and rdfs: PREFIX declarations, of course.
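
For completeness, here is a rough Jena sketch of running that query and
printing the labels. The file name and the tre: namespace are only guesses
based on the URI in your mail, so adjust both:

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.riot.RDFDataMgr;

public class LabelQuerySketch {
    public static void main(String[] args) {
        // Guessed file name; load the ontology into a plain in-memory model.
        Model model = RDFDataMgr.loadModel("myontology.owl");

        // The tre: namespace is guessed from the URI shown in the mail.
        String queryString =
            "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> " +
            "PREFIX tre:  <http://www.semanticweb/myontology/> " +
            "SELECT ?x ?label WHERE { ?x tre:has_boss tre:Good ; rdfs:label ?label . }";

        QueryExecution qexec = QueryExecutionFactory.create(
                QueryFactory.create(queryString), model);
        try {
            ResultSet results = qexec.execSelect();
            while (results.hasNext()) {
                QuerySolution soln = results.next();
                // Prints the plain label text, provided the ontology actually
                // carries rdfs:label triples for these resources.
                System.out.println(soln.getLiteral("label").getString());
            }
        } finally {
            qexec.close();
        }
    }
}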

On Mon, Apr 25, 2016 at 5:33 PM, mehmet mehmet  wrote:
> This is my Jena query
>
> "SELECT * " + "WHERE {"  ?x tre:has_boss tre:Good." + "}" ;
>
> It gives me the correct answer, but in URI form:
>
> www.semanticweb/myontology/IT_Manager
>
> I want just IT_Manager. I tried rdfs:label but it does not work.
>
> (IT_Manager is a resource object in my ontology)


Re: Limit requests to localhost

2016-04-25 Thread Don Rolph
See below and experiment.

But I think that by moving the localhostFilter rule up before all the
other URL rules you get your desired behavior, since Shiro matches the
[urls] entries in order and the first match wins.

The default shiro.ini file looks something like this:

# Licensed under the terms of http://www.apache.org/licenses/LICENSE-2.0

[main]
# Development
ssl.enabled = false

plainMatcher=org.apache.shiro.authc.credential.SimpleCredentialsMatcher
#iniRealm=org.apache.shiro.realm.text.IniRealm
iniRealm.credentialsMatcher = $plainMatcher

localhostFilter=org.apache.jena.fuseki.authz.LocalhostFilter

[users]
# Implicitly adds "iniRealm =  org.apache.shiro.realm.text.IniRealm"
admin=pw

[roles]

[urls]
## Control functions open to anyone
/$/status = anon
/$/ping   = anon

## and the rest are restricted to localhost.
/$/** = localhostFilter

## If you want simple, basic authentication user/password
## on the operations,
##1 - set a better password in [users] above.
##2 - comment out the "/$/** = localhost" line and use:
## "/$/** = authcBasic,user[admin]"

## or to allow any access.
##/$/** = anon

# Everything else
/**=anon

I believe the key is the localhostFilter statement.

My sense is something like this should work:

# Licensed under the terms of http://www.apache.org/licenses/LICENSE-2.0

[main]
# Development
ssl.enabled = false

plainMatcher=org.apache.shiro.authc.credential.SimpleCredentialsMatcher
#iniRealm=org.apache.shiro.realm.text.IniRealm
iniRealm.credentialsMatcher = $plainMatcher

localhostFilter=org.apache.jena.fuseki.authz.LocalhostFilter

[users]
# Implicitly adds "iniRealm =  org.apache.shiro.realm.text.IniRealm"
admin=pw

[roles]

[urls]
## restricted to localhost.
/** = localhostFilter

## Control functions open to anyone
/$/status = anon
/$/ping   = anon



## If you want simple, basic authentication user/password
## on the operations,
##1 - set a better password in [users] above.
##2 - comment out the "/$/** = localhost" line and use:
## "/$/** = authcBasic,user[admin]"

## or to allow any access.
##/$/** = anon

# Everything else
/**=anon
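
One way to sanity-check the result (a rough sketch; host, port and webapp
path are placeholders for your Tomcat deployment, and the exact status code
for a refused request may vary):

import java.net.HttpURLConnection;
import java.net.URL;

public class LocalhostCheck {
    public static void main(String[] args) throws Exception {
        // Run this once on the server itself and once from another machine;
        // with the config above only the run from localhost should succeed,
        // and the remote run should be refused by the localhostFilter.
        URL url = new URL("http://your-server:8080/fuseki/$/status");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        System.out.println(conn.getResponseCode() + " " + conn.getResponseMessage());
        conn.disconnect();
    }
}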


On Mon, Apr 25, 2016 at 11:21 AM, Bangalore Akhilesh <
bangalore.akhil...@gmail.com> wrote:

> Hi,
>
> We have deployed Fuseki 2 on Tomcat and would like to limit the requests to
> localhost (for security reasons).
>
> I am not well versed with Apache Shiro. So, can you please let me know how
> to go about it?
>
> I would also like to know how to extend Apache Shiro to include custom
> authorization schemes.
>
> Thanks,
> Akhilesh
>



-- 

73,
AB1PH
Don Rolph


resource rdfs label

2016-04-25 Thread mehmet mehmet
This is my Jena query

"SELECT * " + "WHERE {"  ?x tre:has_boss tre:Good." + "}" ;

It gives me the correct answer, but in URI form:

www.semanticweb/myontology/IT_Manager

I want just IT_Manager. I tried rdfs:label but it does not work.

(IT_Manager is a resource object in my ontology)


Re: Permission for Publishing Benchmark results

2016-04-25 Thread Andy Seaborne

(Wrong mailing list!)

One of the tenets of open source is "No Discrimination Against Fields of 
Endeavor" [1].


You don't need permission for an open source system to evaluate it.

Andy

[1] https://opensource.org/osd-annotated

On 25/04/16 09:52, Felix Conrads wrote:



Hey ho

Not sure if this is the best mailing list for the issue, but as I can't
find a better one I will ask here.

We are currently evaluating a triple store benchmark that builds on top
of our previous efforts [1,2,3]. We would kindly like to ask you for
permission to include Blazegraph in the benchmark. We try to ensure fair
benchmarks and are not affiliated with any triple store vendor.

The benchmark uses DBpedia Live [4] as well as Semantic Web Dog Food [5]
and tests both query and update performance, in particular for triple
stores under load. (There is an update thread and multiple threads are
querying the system at the same time.) We used the standard settings for
all triple stores.


Kind regards,

Felix Conrads on behalf of the IGUANA benchmarking team

[1] http://aksw.org/Projects/DBPSB.html

[2] http://jens-lehmann.org/files/2011/dbpsb.pdf

[3] http://jens-lehmann.org/files/2012/aaai_dbpedia_benchmark.pdf
[4] http://live.dbpedia.org
[5] http://semanticweb.org









Re: How to read from file & web in one query

2016-04-25 Thread kumar rohit
I have one local ontology from Protege, and I also want to query the
dbpedia endpoint. The scenario is something like this: I have the subject
and property from my Protege ontology, and the object (which I already
imported into Protege) is from dbpedia.
Suppose I have

PREFIX MO: <www.semanticweb/myontology#>
PREFIX dbpedia: <http://dbpedia.org/resource/>

select ?obj where { MO:TajMahal MO:locatedIn ?obj}

So the subject/predicate come from the local ontology and the
object/instance from dbpedia.
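
For reference, a rough sketch of how that could look with SERVICE (untested;
the file name is a placeholder, the MO namespace is the one from the example
above completed with http:// so it parses as an absolute IRI, and rdfs:label
is just one example of what to fetch from dbpedia):

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.query.ResultSetFormatter;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.riot.RDFDataMgr;

public class ServiceQuerySketch {
    public static void main(String[] args) {
        // Placeholder file name for the Protege export.
        Model local = RDFDataMgr.loadModel("myontology.owl");

        String queryString =
            "PREFIX MO:   <http://www.semanticweb/myontology#> " +
            "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> " +
            "SELECT ?obj ?label WHERE { " +
            // The subject/predicate pattern is matched against the local model.
            "  MO:TajMahal MO:locatedIn ?obj . " +
            // The SERVICE block is sent to the dbpedia endpoint and evaluated there.
            "  SERVICE <http://dbpedia.org/sparql> { " +
            "    ?obj rdfs:label ?label . " +
            "    FILTER ( lang(?label) = 'en' ) " +
            "  } " +
            "}";

        QueryExecution qexec = QueryExecutionFactory.create(
                QueryFactory.create(queryString), local);
        try {
            ResultSetFormatter.out(qexec.execSelect());
        } finally {
            qexec.close();
        }
    }
}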

regards

On Sat, Apr 23, 2016 at 10:02 PM, Andy Seaborne  wrote:

> The SERVICE keyword in SPARQL allows the query to call out to another
> SPARQL endpoint.
>
> Andy
>
>
> On 20/04/16 15:23, kumar rohit wrote:
>
>> Hello
>>
>> How can we read a file from Protege and the web (i.e. dbpedia) in one
>> SPARQL query using Jena code?
>>
>> Suppose I have subject and predicate from my Protege ontology and Object
>> from Dbpedia like:
>>
>> Mod:EiffelTower Mod:resides_In ?var  (?var should be France from Dbpedia
>> resource)
>>
>> How can we write the query in Jena? Is it necessary to first import/read
>> the entire Protege file in Jena, or does some other way exist?
>>
>>
>


Permission for Publishing Benchmark results

2016-04-25 Thread Felix Conrads


Hey ho

Not sure if this is the best mailing list for the issue, but as I can't
find a better one I will ask here.

We are currently evaluating a triple store benchmark that builds on top
of our previous efforts [1,2,3]. We would kindly like to ask you for
permission to include Blazegraph in the benchmark. We try to ensure fair
benchmarks and are not affiliated with any triple store vendor.

The benchmark uses DBpedia Live [4] as well as Semantic Web Dog Food [5]
and tests both query and update performance, in particular for triple
stores under load. (There is an update thread and multiple threads are
querying the system at the same time.) We used the standard settings for
all triple stores.


Kind regards,

Felix Conrads on behalf of the IGUANA benchmarking team

[1] http://aksw.org/Projects/DBPSB.html

[2] http://jens-lehmann.org/files/2011/dbpsb.pdf

[3] http://jens-lehmann.org/files/2012/aaai_dbpedia_benchmark.pdf
[4] http://live.dbpedia.org
[5] http://semanticweb.org







Re: Imported ontology handling

2016-04-25 Thread Laurent Rucquoy
Hello,

Thank you very much for your help; it was very clear and very useful to me.



On 24 April 2016 at 13:05, Dave Reynolds  wrote:

> Hi,
>
> On 22/04/16 12:45, Laurent Rucquoy wrote:
>
>> Hello,
>>
>> I want to manage a TDB dataset, notably to store observations which use
>> terms defined in an external ontology.
>>
>> This ontology is defined in OWL files available on the following web page:
>> https://bioportal.bioontology.org/ontologies/RADLEX?p=summary
>>
>> Example of OWL file used:
>> - 3.13.1 version :
>>
>> http://data.bioontology.org/ontologies/RADLEX/submissions/36/download?apikey=8b5b7825-538d-40e0-9e9e-5ab9274a9aeb
>> - 3.12 version :
>>
>> http://data.bioontology.org/ontologies/RADLEX/submissions/31/download?apikey=8b5b7825-538d-40e0-9e9e-5ab9274a9aeb
>>
>>
>> What is the best practice for handling the use of this ontology?
>>
>> My idea is to import the OWL file as a named model in my TDB whereas my
>> instances are stored in the default model. These instances will be linked
>> to the ontology through  resources (where
>> RID is the local id of terms defined in this ontology)
>>
>> When I have to reason with the ontology, I will use a 'work' model
>> resulting from the union of the ontology named model and the default
>> model.
>>
>>
>> My questions:
>>
>> 1) Is this the right way to reason with imported ontologies (i.e. the
>> default model to store the instances, named models used to import
>> different
>> versions of an ontology and a 'work' model resulting from the union of
>> default model and named model) ?
>>
>
> There's no "right" answer here. It'll depend on your work flow and the
> sorts of queries you want to make.
>
> That said I would suggest putting your instance data in a named graph as
> well, not in the default model. That leaves you free to set "union default"
> so that you can query the union of the instance and ontology data.
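
(For reference, a rough sketch of that setup with TDB: instance and ontology
data each in a named graph, and the union-default switch set per query. Graph
names and file paths below are made up.)

import org.apache.jena.query.Dataset;
import org.apache.jena.query.Query;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.query.ResultSetFormatter;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.tdb.TDB;
import org.apache.jena.tdb.TDBFactory;

public class UnionDefaultSketch {
    public static void main(String[] args) {
        // Made-up directory, graph names and file names, for illustration only.
        Dataset dataset = TDBFactory.createDataset("/path/to/tdb");

        // Ontology and instance data each go into their own named graph.
        Model ontology = dataset.getNamedModel("urn:example:radlex");
        RDFDataMgr.read(ontology, "radlex.owl");
        Model instances = dataset.getNamedModel("urn:example:observations");
        RDFDataMgr.read(instances, "observations.ttl");

        // Query the union of all named graphs by setting the TDB
        // union-default-graph symbol on the query execution context.
        Query query = QueryFactory.create("SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }");
        QueryExecution qexec = QueryExecutionFactory.create(query, dataset);
        qexec.getContext().set(TDB.symUnionDefaultGraph, true);
        try {
            ResultSetFormatter.out(qexec.execSelect());
        } finally {
            qexec.close();
        }
    }
}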
>
> Note that the built-in Jena reasoners are in-memory reasoners only and
> reasoning over a TDB model will be slow and not improve scaling.
>
>> 2) How can I handle the different versions of OWL files?
>>
>> e.g. in one version of this ontology, the RID31872 term is identified by
>> the
>>  uri
>> while the same term is identified by the
>>  uri
>>
>
> Ugh, that's completely horrible. I don't see a reasonable way you can
> handle that.
>
> As far as I can see there is no relationship between the different
> versions of the term. Just because they happen to have the same localname
> is irrelevant, they are different resources. Looking at those files I see
> no provision for versioning - there's no unversioned resources, no
> versioning links, no mapping terms, nothing. Hopefully that's somewhere and
> I'm just missing it.
>
> Unless you have some separate mapping information that isn't included in
> those links then I'm afraid this is a case of "don't start from here".
>
>> Which information will be the most useful to store in my default model to
>> be able to link to the corresponding term in the different versions of the
>> ontology since the base uri could change from one version to the other
>> (while the local part is still the same) ?
>>
>
> As I say, there's just no easy way to handle that. You are dealing with
> "ontologies" that have made no provision for versioning. Indeed I would
> suggest you are dealing with data that started out not as an ontology and
> has just been mapped to OWL syntax.
>
> To fix that would require deep understanding of what the nature of the
> changes are between those different versions.
>
> Assuming the concepts actually have closely related meanings between the
> different versions (a big assumption) then my best advice would be to
> create a new URI set with an unversioned URI corresponding to each concept in
> the union of the ontology versions you are looking at. Use those
> unversioned URIs in your instance data. Then create a set of mappings to
> map your unversioned resources to the versioned ones. Precisely what
> mapping terms to use depends on the detailed semantics involved.
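
(For illustration only, a small Jena sketch of such a mapping graph. The
skos:exactMatch property is just a placeholder for whatever mapping term the
semantics actually justify, and all URIs below are made-up stand-ins, not the
real RadLex ones.)

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.rdf.model.Resource;

public class MappingSketch {
    public static void main(String[] args) {
        Model mappings = ModelFactory.createDefaultModel();

        // Placeholder mapping property; pick one that matches the real semantics.
        Property exactMatch = mappings.createProperty(
                "http://www.w3.org/2004/02/skos/core#", "exactMatch");

        // One unversioned URI per concept, linked to its versioned counterparts.
        Resource unversioned = mappings.createResource("http://example.org/id/RID31872");
        mappings.add(unversioned, exactMatch,
                mappings.createResource("http://example.org/radlex-3.12/RID31872"));
        mappings.add(unversioned, exactMatch,
                mappings.createResource("http://example.org/radlex-3.13.1/RID31872"));

        // Instance data then references only the unversioned URI; this mapping
        // graph links it to each ontology version.
        mappings.write(System.out, "TURTLE");
    }
}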
>
> Dave
>
>