On 14/03/2012 15:34, Andreas Schultz wrote:
> Hi Chryssa,
>
> actually I also have problems loading this data set with virtuoso. But the
> problems start after about 120M or so triples. For loading I execute
> following command:
>
> DB.DBA.TTLP_MT(file_to_string_output ('dataset-path'), 'http://default/',
> 'graph-uri', 0)
>
> Hope, that helps.
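
For anyone following along, the arguments to that call are, roughly: the file
contents as a string, the base IRI used to resolve relative IRIs, the target
graph, and a parser-flags bitmask (0 = default behaviour). A commented sketch
with the same placeholder values as above:

    DB.DBA.TTLP_MT (
      file_to_string_output ('dataset-path'),  -- Turtle/N-Triples text to parse
      'http://default/',                       -- base IRI for relative IRIs
      'graph-uri',                             -- target graph IRI
      0                                        -- parser flags, 0 = defaults
    );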
I hadn't noticed that problem before reading your emails.
I checked this on my instance:
wc -l page_links_en.nt
returns 145877010 lines, while this query:
SELECT COUNT(*) WHERE {?s dbpedia-owl:wikiPageWikiLink ?o}
returns 118039661.
Where are the remaining 28M triples? :-)
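
One way to chase the missing triples (a sketch for isql; the prefix is spelled
out explicitly, and load_list only applies if the bulk loader was used) is to
check the loader's per-file status and to count the predicate in every graph,
in case part of the data landed somewhere unexpected:

    -- per-file status and error recorded by the bulk loader
    SELECT ll_file, ll_state, ll_error FROM DB.DBA.load_list;

    -- count the predicate in every graph, not just the expected one
    SPARQL
    PREFIX dbpedia-owl: <http://dbpedia.org/ontology/>
    SELECT ?g (COUNT(*) AS ?c)
    WHERE { GRAPH ?g { ?s dbpedia-owl:wikiPageWikiLink ?o } }
    GROUP BY ?g;

Keep in mind that wc -l counts file lines, while the SPARQL count reflects
what is actually stored, so the two need not match exactly even after a clean
load.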
>
> Andreas
>
>> Andreas, I would be grateful for any feedback or advice, since I have not
>> figured out the problem yet!
>>
>> Thanks as well, Pablo! :)
>>
>>
>>
>> On 14 March 2012 at 1:24 p.m., Pablo Mendes <[email protected]> wrote:
>>
>>> Andreas (in CC) seems to be loading wikiPageLink triples right now.
>>> Maybe
>>> he can advise.
>>>
>>> Cheers,
>>> Pablo
>>>
>>>
>>> 2012/3/14 Chryssa Zerva<[email protected]>
>>>
>>>> I have set NumberOfBuffers and MaxDirtyBuffers according to this link:
>>>> http://docs.openlinksw.com/virtuoso/rdfperformancetuning.html
>>>> (my settings are a bit lower than the suggested 4 GB settings).
>>>>
>>>> Does the buffer configuration affect the number of triples I am able to
>>>> load? I thought it only affected the loading time. Should I change
>>>> something?
>>>>
>>>> Thanks a lot!
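
On the buffer question above: for concreteness, this is a sketch of the
relevant virtuoso.ini lines using the values the linked guide suggests for a
machine with about 4 GB of RAM (check the guide for other sizes; each buffer
corresponds to an 8 KB page):

    [Parameters]
    NumberOfBuffers = 340000   ; ~4 GB RAM recommendation from the tuning guide
    MaxDirtyBuffers = 250000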
>>>>
>>>>
>>>> On 14 March 2012 at 12:28 p.m., Roberto Mirizzi <[email protected]> wrote:
>>>>
>>>> On 14/03/2012 11:02, Chryssa Zerva wrote:
>>>>> If you mean the ll_state column of the load_list table, it has the
>>>>> value 2, and ll_error is NULL.
>>>>>
>>>>> I am aware of the ShortenLongURIs parameter, and I had no problems
>>>>> with long URIs. There was also no error message while loading.
>>>>>
>>>>>
>>>>> What about the buffer settings, NumberOfBuffers and MaxDirtyBuffers?
>>>>>
>>>>> On 14 March 2012 at 11:35 a.m., Hugh Williams <[email protected]> wrote:
>>>>>
>>>>>> Hi Chryssa,
>>>>>>
>>>>>> What does the "load_list" table report as the status of loading that
>>>>>> dataset?
>>>>>>
>>>>>> Please also refer to the following tip on the Virtuoso
>>>>>> "ShortenLongURIs" parameter, which is required for some of the
>>>>>> DBpedia 3.7 datasets to load fully:
>>>>>>
>>>>>>
>>>>>> http://virtuoso.openlinksw.com/dataspace/dav/wiki/Main/VirtTipsAndTricksGuideShortenLongURIs
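
For reference, the tip amounts to a single setting in the [Parameters]
section of virtuoso.ini (a sketch; see the guide above for the details and
caveats):

    [Parameters]
    ShortenLongURIs = 1   ; allow over-long IRIs to be shortened during loading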
>>>>>>
>>>>>> Best Regards
>>>>>> Hugh Williams
>>>>>> Professional Services
>>>>>> OpenLink Software, Inc. // http://www.openlinksw.com/
>>>>>> 10 Burlington Mall Road, Suite 265, Burlington MA 01803
>>>>>> Weblog -- http://www.openlinksw.com/blogs/
>>>>>> LinkedIn -- http://www.linkedin.com/company/openlink-software/
>>>>>> Twitter -- http://twitter.com/OpenLink
>>>>>> Google+ -- http://plus.google.com/100570109519069333827/
>>>>>> Facebook -- http://www.facebook.com/OpenLinkSoftware
>>>>>> Universal Data Access, Integration, and Management Technology
>>>>>> Providers
>>>>>>
>>>>>> On 14 Mar 2012, at 09:17, Chryssa Zerva wrote:
>>>>>>
>>>>>>> Hello, I have installed a DBpedia endpoint and I am trying to load the
>>>>>>> DBpedia 3.7 datasets I have downloaded.
>>>>>>> While all the other files load correctly, when I try to load
>>>>>>> "page_links_en.nt" the loader loads only 4951244 triples instead of
>>>>>>> 145.9M triples. I tried re-downloading the file and reloading it into
>>>>>>> the Virtuoso endpoint, and I still got exactly the same number of
>>>>>>> triples.
>>>>>>> I load the files using the isql commands:
>>>>>>> ld_dir('absolute/path/to/folder', '*.*', 'graphname');
>>>>>>> rdf_loader_run();
>>>>>>>
>>>>>>> I used the same commands for all the other datasets and they worked
>>>>>>> fine.
>>>>>>> Is there a problem with the file I downloaded, or is there something
>>>>>>> else I am doing wrong?
>>>>>>> Thanks a lot for your help,
>>>>>>> Chryssa
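
As a note for anyone who ends up in the same situation: the bulk loader keeps
one row per registered file in DB.DBA.load_list, and a single file can be
retried by removing its row, re-registering it and running the loader again.
A sketch, using the placeholder path and graph name from the message above
(load_list handling may differ slightly between Virtuoso versions):

    -- retry only the problematic file
    DELETE FROM DB.DBA.load_list WHERE ll_file LIKE '%page_links_en.nt';
    ld_dir('absolute/path/to/folder', 'page_links_en.nt', 'graphname');
    rdf_loader_run();
    checkpoint;

    -- then inspect the state and error recorded for that file
    SELECT ll_file, ll_state, ll_error FROM DB.DBA.load_list
     WHERE ll_file LIKE '%page_links_en.nt';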