I meant that the last query in my previous email gives me a single Concept in
Fuseki. Running it against DBpedia, of course, gives lots and lots of other
results.
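(A quick sanity check, independent of Fuseki: the raw N-Triples dump itself should contain many distinct rdf:type objects. This is a minimal shell sketch, not the real workflow; the here-doc sample stands in for the actual .nt dump file.)

```shell
# Count distinct rdf:type objects directly in the raw data, bypassing TDB.
# "sample.nt" is an illustrative stand-in for the real dump file.
cat > sample.nt <<'EOF'
<http://dbpedia.org/resource/Berlin> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://dbpedia.org/ontology/City> .
<http://dbpedia.org/resource/Berlin> <http://dbpedia.org/ontology/country> <http://dbpedia.org/resource/Germany> .
EOF
# Print each distinct type; the real dump should yield far more than one.
awk '$2 == "<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>" { print $3 }' sample.nt | sort -u
```

If that list is long while the Fuseki query still returns a single Concept, the store and its indexes, not the data, are the suspect.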


On Thu, Mar 7, 2013 at 2:11 PM, Ahmed Sobhi <[email protected]> wrote:

> I ended up using the nt version of dbpedia. It takes a lot of time, the
> ~20m triples took around 2.5h. I still haven't tried the ttl.
>
> This time, I got no error/warning messages. The POS.dat, OPS.dat and
> OSP.dat are all around 760MB.
>
> However, I'm still getting weird results.
>
> A query like
>   SELECT ?s ?p ?o WHERE {?s ?p ?o.} LIMIT 100
> works fine
>
> but
>
>   select distinct ?Concept where {[] a ?Concept} LIMIT 100
>
> from dbpedia gives a single Concept <http://www.opengis.net/gml/_Feature>
>
> Something is still wrong. Any ideas?
>
>
>
> On Thu, Mar 7, 2013 at 1:39 PM, Andy Seaborne <[email protected]> wrote:
>
>> On 07/03/13 08:40, Ahmed Sobhi wrote:
>>
>>> I was loading the .ttl version of the dump
>>> http://downloads.dbpedia.org/3.8/en/mappingbased_properties_en.ttl.bz2.
>>> I kept on getting unicode related warnings so maybe that's the reason.
>>>
>>
>> I got a lot (a LOT!) of warnings about bad URIs, including a few normal
>> form (NFC) warnings, but no hard Unicode errors (those would have stopped
>> the load).
>>
>> Did you issue the three different queries against the same running
>> server?  Or did you run the server three times?
>>
>>
>>  In answer to your questions: yes (as far as I remember but I'll double
>>> check), no, 0.2.6.
>>>
>>> I'm repeating but with the .nt version of the dump
>>> http://downloads.dbpedia.org/3.8/en/mappingbased_properties_en.nt.bz2,
>>> which doesn't give any warnings, and I'll let you know what happens.
>>>
>>
>> It is useful to check the data before loading - try "riot --validate".
>> It is faster than loading (I get it parsing at 220K triples/s).
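(A concrete invocation of that check, as a sketch: the filename is the dump discussed earlier in the thread, decompressed first, since handling of compressed input may depend on the riot version in use.)

```shell
# Decompress the dump, keeping the original (-k), then validate it with riot.
# riot ships with Apache Jena; --validate parses and checks the data
# without emitting the triples themselves.
bzip2 -dk mappingbased_properties_en.nt.bz2
riot --validate mappingbased_properties_en.nt
```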
>>
>> The ntriples route (not Turtle) does less checking - the URI issues
>> causing the warnings are still there.
>>
>>         Andy
>>
>>
>>>
>>> On Wed, Mar 6, 2013 at 11:19 PM, Andy Seaborne <[email protected]> wrote:
>>>
>>>  On 06/03/13 19:09, Ahmed Sobhi wrote:
>>>>
>>>>  Hi,
>>>>>
>>>>> I'm a Jena newbie. I am trying to load DBpedia's Ontology Infobox
>>>>> properties
>>>>> http://downloads.dbpedia.org/3.8/en/mappingbased_properties_en.ttl.bz2
>>>>> (http://wiki.dbpedia.org/Downloads38#h227-1) into a Fuseki server
>>>>> running on my machine.
>>>>>
>>>>> For that I did the following
>>>>>
>>>>>      1. Create the tdb using the following command
>>>>>      java tdb.tdbloader --loc=dbpedia mappingbased_properties_en.ttl
>>>>>
>>>>>      2. Ran the server using the following
>>>>>      .\fuseki-server.bat --loc=../dbpedia /dbpedia
>>>>>
>>>>>   ...
>>>>>
>>>>
>>>>
>>>>> However, when specifying the subject instead, I get the correct
>>>>> results.
>>>>
>>>>>
>>>>> Any ideas why specifying the predicate or the object doesn't work?
>>>>>
>>>>>
>>>> Works for me.
>>>>
>>>> Did the load finish cleanly before you started Fuseki?
>>>> Did you move the database after loading?
>>>> Which version are you using?
>>>>
>>>> It could be the POS and OSP indexes are broken - how big are the POS.dat
>>>> and OSP.dat files?
>>>>
>>>>          Andy
>>>>
>>>>
>>>>
>>>>> Note: The exact query works just fine on DBpedia. I'm therefore
>>>>> suspecting Fuseki, but I'm unable to pinpoint why it's not working.
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>
> --
> Best Regards,
> Ahmed Sobhi
> http://about.me/humanzz
>



-- 
Best Regards,
Ahmed Sobhi
http://about.me/humanzz
