Re: SHACL

2024-04-06 Thread Beaudet, David
Paul, I started writing a validator for Linked Art a couple of years ago that 
has some SHACL examples you might find useful.

https://github.com/linked-art/shacl-validator

- Dave Beaudet
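
For anyone starting from scratch, here is a minimal sketch of SHACL validation
using Jena's own SHACL API; the file names shapes.ttl and data.ttl are
placeholders for your shapes graph and data graph:

import org.apache.jena.graph.Graph;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.shacl.ShaclValidator;
import org.apache.jena.shacl.Shapes;
import org.apache.jena.shacl.ValidationReport;
import org.apache.jena.shacl.lib.ShLib;

public class ShaclCheck {
    public static void main(String[] args) {
        // Load the shapes graph and the data graph (placeholder file names).
        Graph shapesGraph = RDFDataMgr.loadGraph("shapes.ttl");
        Graph dataGraph   = RDFDataMgr.loadGraph("data.ttl");

        // Parse the shapes and run validation.
        Shapes shapes = Shapes.parse(shapesGraph);
        ValidationReport report = ShaclValidator.get().validate(shapes, dataGraph);

        // Print the validation report and an overall verdict.
        ShLib.printReport(report);
        System.out.println(report.conforms() ? "Data conforms" : "Constraint violations found");
    }
}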


On Apr 6, 2024 12:52, Paul Tyson  wrote:
This is old, but might be useful. It has a chapter on SHACL.

https://book.validatingrdf.com/

Regards,
--Paul

On 4/5/24 04:58, Hashim Khan wrote:
> Hi,
> I am interested in working with SHACL shapes for validation. I would
> like to know if someone could point out a nice resource.
>
> Best,
>



Re: JSON-LD: 1.0 or 1.1

2022-04-23 Thread Beaudet, David
FWIW, 1.1 works better for my use cases, which are currently all
Linked Art-centric.

https://linked.art

Dave Beaudet



On Apr 23, 2022 13:16, Andy Seaborne  wrote:
What should the default settings be: JSON-LD 1.0 or 1.1?

It is not a simple choice.

There is slightly prettier writing of JSON-LD 1.1 now - prefixes and
native types.

A new issue is
e.g. 
https://github.com/apache/jena/issues/1254

JSON-LD 1.1 is not completely backwards compatible with JSON-LD 1.0.

Current status:

There are two JSON-LD subsystems with their own language and format
constants as well as terms for the system "JSON-LD" settings.  API code
can choose which it wishes to use. All the previous features of JSON-LD
1.0 writing are available this way.
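
For example (a sketch only - the constant names Lang.JSONLD11, RDFFormat.JSONLD11
and RDFFormat.JSONLD10 are my reading of the 4.x API, so check the Lang/RDFFormat
javadoc for your release), API code can pin a call to one subsystem regardless of
the default binding:

import org.apache.jena.rdf.model.Model;
import org.apache.jena.riot.Lang;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.riot.RDFFormat;

public class JsonLdVersions {
    public static void main(String[] args) {
        // Read with the Titanium (JSON-LD 1.1) subsystem explicitly, rather
        // than whatever application/ld+json is bound to by default.
        Model model = RDFDataMgr.loadModel("data.jsonld", Lang.JSONLD11);

        // Write as JSON-LD 1.1; RDFFormat.JSONLD10 would select the
        // jsonld-java (JSON-LD 1.0) writer instead.
        RDFDataMgr.write(System.out, model, RDFFormat.JSONLD11);
    }
}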

The decision is what the defaults for application/ld+json should be.

This affects Fuseki users, where there isn't scope to have a switch
between 1.0 and 1.1.

In issues/1254, the reading client software isn't Jena so the
conservative choice of writing 1.0 while reading 1.1 does not really
work out.

The decision for Jena is when to switch.

JSON-LD 1.1 is becoming the norm.

It isn't practical to remain at JSON-LD 1.0 indefinitely.

I think we're at the point where we ought to switch reading and writing
to JSON-LD 1.1 unless we have examples/evidence where this is a problem.

 Andy


>> On Fri, Mar 11, 2022 at 11:39 AM Andy Seaborne  wrote:
>>>
>>> Jena has both JSON 1.0, provided by jsonld-java, and JSON-LD 1.1,
>>> provided by Titanium.
>>>
>>> What should the default settings be?
>>>
>>> For parsing that means what is bound to "application/ld+json" and file
>>> extension .jsonld.
>>>
>>> For writing, it means what is setup for Lang.JSONLD.
>>>
>>> This is two decisions - parsing and writing can be different.
>>>
>>>
>>> But.
>>>
>>> It is not so simple:
>>>
>>> 1/ For Java11, the default settings for java.net.http can't contact
>>> schema.org.

This seems to affect only a few (maybe one) Java 11 build versions.
The latest Java 11 available on Ubuntu works.

>>> 2/ Jena is writing JSON-LD 1.1 without much in the way of transformation
>>> or creating a @context from the RDF data. It prints full URIs; numbers
>>> aren't abbreviated, etc., so it is not very pretty.

There is slightly prettier writing of JSON-LD 1.1 now - prefixes and
native types.

So there is "plain": no prefixes, no native types (and hence completely
faithful RDF), and "pretty" (JSON numbers).
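
As a sketch of the difference (the constant names JSONLD11_PLAIN and
JSONLD11_PRETTY are my assumption of the RDFFormat naming; check the javadoc
for your release):

import org.apache.jena.rdf.model.Model;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.riot.RDFFormat;

public class PlainVsPretty {
    public static void main(String[] args) {
        Model model = RDFDataMgr.loadModel("data.ttl");  // placeholder input

        // "Plain": no prefixes, no native types - a completely faithful
        // rendering of the RDF terms.
        RDFDataMgr.write(System.out, model, RDFFormat.JSONLD11_PLAIN);

        // "Pretty": prefixes from the model and native JSON numbers.
        RDFDataMgr.write(System.out, model, RDFFormat.JSONLD11_PRETTY);
    }
}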


Add issue 3:

JSON-LD 1.1 is not completely backwards compatible with JSON-LD 1.0.

e.g. 
https://github.com/apache/jena/issues/1254



Re: Disabling BNode UID generation

2022-02-09 Thread Beaudet, David
I ran across an API call the other day that checks isomorphism. See the
TopBraid SHACL library JUnit test runner; I think it's called by the DASH test
case class to make sure the resulting graph matches the expected response.


On Feb 9, 2022 11:10, "Shaw, Ryan"  wrote:
Thank you, Andy.

I agree that working on the triple level is the correct way to approach this. I 
was looking for something quick and dirty that would work with textual diffing 
by a VCS, hence my focus on the blank node labels.

Are there any examples of how to use the isomorphism utilities in Jena?

> On Feb 5, 2022, at 12:48 PM, Andy Seaborne  wrote:
>
>
>
> On 04/02/2022 19:09, Shaw, Ryan wrote:
>> Hello,
>> I am trying to experiment with generating diffable N-Triples or flat Turtle 
>> files.
> ...
>> Thanks,
>> Ryan
>
>
> Info: There is work on a charter for
>
> "RDF Dataset Canonicalization and Hash Working Group"
>
> https://w3c.github.io/rch-wg-charter/
>
> The end of section 1 has some links to related work.
>
> Given RDF is inherently unordered, canonicalization and "diff of triples" are 
> related.
>
>
> For diff-able files, what counts as "different" between two files?
>
> Instead of changing the bnode algorithm, have you considered making use of 
> bnode-isomorphism? That is, during a diff, maintain a growing mapping from 
> bnodes in one list of triples to bnodes in the other list?
> Iso.isomorphicTriples
>
> (The list being the triples in encounter order during parsing). It is working
> not so much on the syntax as on the abstraction of triples. e.g. a Turtle file
> and an NT file produced by parsing the TTL file can be defined to be "the
> same".
>
> It's fairly portable across files generated by other systems as well, except
> for Turtle lists - Jena has a fixed order for triple generation for a list, but
> it isn't necessarily the same for all systems.
>
> Jena's Turtle algorithm, which is in LangTurtleBase, generates in list order,
> with rdf:first, then rdf:rest; the triple referencing the list appears
> after the list. It happens to be the way the spec explains it:
>    https://www.w3.org/TR/turtle/#sec-parsing-triples
> but that is defining the outcome and isn't a requirement.
>
>Andy
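
On Ryan's question about examples: the simplest entry point is the Model-level
isomorphism check (Iso.isomorphicTriples mentioned above works at the level of
lists of triples). A minimal sketch, with placeholder file names:

import org.apache.jena.rdf.model.Model;
import org.apache.jena.riot.RDFDataMgr;

public class IsoCheck {
    public static void main(String[] args) {
        // Two serializations that should carry the same triples, e.g. a Turtle
        // file and the N-Triples produced from it (placeholder file names).
        Model a = RDFDataMgr.loadModel("before.ttl");
        Model b = RDFDataMgr.loadModel("after.nt");

        // isIsomorphicWith builds a blank-node mapping between the two graphs,
        // so differing blank node labels do not count as a difference.
        System.out.println(a.isIsomorphicWith(b) ? "isomorphic" : "different");
    }
}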



RE: auto-polymorphism?

2022-01-03 Thread Beaudet, David


Seems this is due to using a model with inference.   

Switching from

OntModelSpec modelSpec = OntModelSpec.RDFS_MEM_RDFS_INF;

to

OntModelSpec modelSpec = OntModelSpec.RDFS_MEM;

prevented the additional classes from being added to the individual in the
resulting graph.
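
A minimal sketch of the difference, with placeholder file names for the
ontology and the Linked Art data:

import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.riot.RDFDataMgr;

public class NoDomainInference {
    public static void main(String[] args) {
        // RDFS_MEM keeps only the asserted statements; RDFS_MEM_RDFS_INF would
        // also add the rdf:type triples entailed by rdfs:domain/rdfs:range.
        OntModel model = ModelFactory.createOntologyModel(OntModelSpec.RDFS_MEM);
        RDFDataMgr.read(model, "cidoc-crm.rdf");   // placeholder ontology file
        RDFDataMgr.read(model, "object.jsonld");   // placeholder data file
        // Run SHACL validation against 'model' so it sees only the asserted types.
    }
}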


-Original Message-
From: Beaudet, David  
Sent: Monday, January 3, 2022 4:11 PM
To: users@jena.apache.org
Subject: auto-polymorphism?


Greetings and Happy New Year all.

I'm writing some SHACL to validate Linked Art, which is based on CIDOC-CRM.
When I consume the following JSON-LD file with Jena, the "Place" object becomes
polymorphic in the sense that Jena automatically assigns another class to it:
a property definition in the ontology sets the rdfs:domain to
crm:E33_Linguistic_Object, so the presence of the "language" property pulls in
that extra class. I've pasted the relevant ontology portion below the JSON-LD.
Is there a way to disable the auto-assignment of this additional class so that
SHACL constraints can be used to validate the object of "referred_to_by",
instead of it becoming automatically valid due to the polymorphic behavior?
When I remove the rdfs:domain declaration from the property definition, the
additional class assignment on ingest is prevented.

Thanks,

Dave

{
  "@context": "https://linked.art/ns/v1/linked-art.json",
  "id": "https://linked.art/example/object/0",
  "type": "HumanMadeObject",
  "_label": "Mona Lisa",
  "referred_to_by": [
    {
      "type": "Place",
      "language": [
        {
          "id": "http://vocab.getty.edu/aat/300388277",
          "type": "Language",
          "_label": "English"
        }
      ],
      "classified_as": [
        {
          "id": "http://vocab.getty.edu/aat/300435416",
          "type": "Type",
          "_label": "Description",
          "classified_as": [
            {
              "id": "http://vocab.getty.edu/aat/300418049",
              "type": "Type",
              "_label": "Brief Text"
            }
          ]
        }
      ],
      "content": "This portrait was doubtless started in Florence around 1503. It is thought to be of Lisa Gherardini, wife of a Florentine cloth merchant ..."
    }
  ]
}


<rdf:Property rdf:about="http://www.cidoc-crm.org/cidoc-crm/P72_has_language">
  <rdfs:label xml:lang="fr">est en langue</rdfs:label>
  <rdfs:label xml:lang="en">has language</rdfs:label>
  <rdfs:label xml:lang="de">hat Sprache</rdfs:label>
  <rdfs:label xml:lang="pt">é da língua</rdfs:label>
  <rdfs:comment>This property describes the E56 Language of an E33 Linguistic
Object. Linguistic Objects are composed in one or more human Languages. This
property allows these languages to be documented.</rdfs:comment>
  <rdfs:domain rdf:resource="http://www.cidoc-crm.org/cidoc-crm/E33_Linguistic_Object"/>
  <rdfs:range rdf:resource="http://www.cidoc-crm.org/cidoc-crm/E56_Language"/>
</rdf:Property>
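
To see the behavior Dave describes, a sketch that lists the rdf:type values
Jena ends up with when the RDFS inference spec is used; the property IRI and
the file names are placeholders based on the fragment above:

import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.vocabulary.RDF;

public class ShowInferredTypes {
    public static void main(String[] args) {
        OntModel m = ModelFactory.createOntologyModel(OntModelSpec.RDFS_MEM_RDFS_INF);
        RDFDataMgr.read(m, "cidoc-crm.rdf");    // placeholder ontology file
        RDFDataMgr.read(m, "object.jsonld");    // placeholder data file

        // Assumed property IRI for the "language" term in the fragment above.
        Property hasLanguage =
            m.getProperty("http://www.cidoc-crm.org/cidoc-crm/P72_has_language");

        // With the inference spec, crm:E33_Linguistic_Object shows up as a type
        // of the "Place" node even though the data never asserts it.
        m.listSubjectsWithProperty(hasLanguage).forEachRemaining(s ->
            s.listProperties(RDF.type).forEachRemaining(st ->
                System.out.println(s + " rdf:type " + st.getObject())));
    }
}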


 




auto-polymorphism?

2022-01-03 Thread Beaudet, David

Greetings and Happy New Year all.

I'm writing some SHACL to validate Linked Art, which is based on CIDOC-CRM.
When I consume the following JSON-LD file with Jena, the "Place" object becomes
polymorphic in the sense that Jena automatically assigns another class to it:
a property definition in the ontology sets the rdfs:domain to
crm:E33_Linguistic_Object, so the presence of the "language" property pulls in
that extra class. I've pasted the relevant ontology portion below the JSON-LD.
Is there a way to disable the auto-assignment of this additional class so that
SHACL constraints can be used to validate the object of "referred_to_by",
instead of it becoming automatically valid due to the polymorphic behavior?
When I remove the rdfs:domain declaration from the property definition, the
additional class assignment on ingest is prevented.

Thanks,

Dave

{
  "@context": "https://linked.art/ns/v1/linked-art.json",
  "id": "https://linked.art/example/object/0",
  "type": "HumanMadeObject",
  "_label": "Mona Lisa",
  "referred_to_by": [
    {
      "type": "Place",
      "language": [
        {
          "id": "http://vocab.getty.edu/aat/300388277",
          "type": "Language",
          "_label": "English"
        }
      ],
      "classified_as": [
        {
          "id": "http://vocab.getty.edu/aat/300435416",
          "type": "Type",
          "_label": "Description",
          "classified_as": [
            {
              "id": "http://vocab.getty.edu/aat/300418049",
              "type": "Type",
              "_label": "Brief Text"
            }
          ]
        }
      ],
      "content": "This portrait was doubtless started in Florence around 1503. It is thought to be of Lisa Gherardini, wife of a Florentine cloth merchant ..."
    }
  ]
}


<rdf:Property rdf:about="http://www.cidoc-crm.org/cidoc-crm/P72_has_language">
  <rdfs:label xml:lang="fr">est en langue</rdfs:label>
  <rdfs:label xml:lang="en">has language</rdfs:label>
  <rdfs:label xml:lang="de">hat Sprache</rdfs:label>
  <rdfs:label xml:lang="pt">é da língua</rdfs:label>
  <rdfs:comment>This property describes the E56 Language of an E33 Linguistic
Object. Linguistic Objects are composed in one or more human Languages. This
property allows these languages to be documented.</rdfs:comment>
  <rdfs:domain rdf:resource="http://www.cidoc-crm.org/cidoc-crm/E33_Linguistic_Object"/>
  <rdfs:range rdf:resource="http://www.cidoc-crm.org/cidoc-crm/E56_Language"/>
</rdf:Property>








Re: Missed inference in Jena Fuseki

2021-12-21 Thread Beaudet, David

Hello,

In 2020, there was a question to the list (see below) about inference and
queries involving predicates that have an owl:inverseOf, and I think I'm running
into a similar issue with crm:P46_is_composed_of vs. crm:P46i_forms_part_of,
where the latter is not resolving in SPARQL queries using any of the OWL or
RDFS Jena reasoners.

Was a ticket ever created to investigate this further? Is duplicating the
inverted predicates in the data still the recommended approach? The Jena
documentation seems to suggest that (all?) of the built-in reasoners support
owl:inverseOf, but in practice is that actually the case?

I can provide a full working example with code and data for the latest Jena 
snapshot if it would help.

Thanks,

Dave Beaudet
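
For anyone wanting to reproduce this, a minimal sketch of the check involved,
assuming the CIDOC-CRM RDF file declares the owl:inverseOf axiom between the
two predicates and using placeholder file names:

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.ReasonerRegistry;
import org.apache.jena.riot.RDFDataMgr;

public class InverseCheck {
    public static void main(String[] args) {
        Model schema = RDFDataMgr.loadModel("cidoc-crm.rdf");  // placeholder ontology file
        Model data   = RDFDataMgr.loadModel("objects.ttl");    // placeholder data file

        // Bind the OWL rule reasoner to the schema and data.
        InfModel inf = ModelFactory.createInfModel(ReasonerRegistry.getOWLReasoner(), schema, data);

        // The data only asserts P46_is_composed_of; if owl:inverseOf is applied,
        // the inverse triples become visible to SPARQL.
        String ask = "PREFIX crm: <http://www.cidoc-crm.org/cidoc-crm/> "
                   + "ASK { ?part crm:P46i_forms_part_of ?whole }";
        try (QueryExecution qe = QueryExecutionFactory.create(ask, inf)) {
            System.out.println(qe.execAsk());
        }
    }
}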



Hello,

I'm using Jena Fuseki 3.13.1 (with OWLFBRuleReasoner), and I have asserted
(uploaded) the following triples:

@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix f: <...> .

f:Bob f:hasWife f:Alice .
f:Bob f:hasWife f:Alice2 .
f:Alice2 f:hasHusband f:Bob2 .
f:hasWife a owl:FunctionalProperty .
f:hasWife a owl:InverseFunctionalProperty .
f:hasHusband owl:inverseOf f:hasWife .

Now, if I query ASK { f:Alice owl:sameAs f:Alice2 }, I get true. However, if I
ASK { f:Bob owl:sameAs f:Bob2 }, I get false! Loading the same triples into
another reasoner (owl-rl), the triple f:Bob owl:sameAs f:Bob2 is inferred.

I asked this very question on StackOverflow
(https://stackoverflow.com/questions/59603569/jena-fuseki-missed-inference?noredirect=1#comment105379894_59603569)
and was pointed to the owl-fb rules file. Tweaking it a bit, I noticed that if
I explicitly add the forward version of inverseOf, I do get f:Bob owl:sameAs
f:Bob2:

[inverseOf2b: (?P owl:inverseOf ?Q), (?X ?P ?Y) -> (?Y ?Q ?X) ]

Am I missing something?

Best regards,
Andrea Leofreddi