Hi Benson, I'll reply properly later (deadlines looming) but a quick thought ... have you looked at the RDF coming out of OpenCalais and/or the TSO Document Enrichment Service?
I can't offhand recall to what extent they use OWL, whether they are DL
compliant and whether they reference their ontologies using explicit
owl:imports. But it might be worth a quick look at those to see how others
in a similar space are handling it.

Cheers,
Dave

On Thu, 2010-12-09 at 12:57 -0500, Benson Margulies wrote:
> Dave,
>
> Permit me to take this up one more conceptual level.
>
> So, here at Basis we have a named entity extractor, plus we have been
> dabbling in JAPE rules to build some relationship extraction, and we
> have a coref system coming on line.
>
> We want to represent the output of these things in RDF, and then do
> 'interesting' queries in the RDF we come up with, and we eventually
> want to extend to querying dbpedia.
>
> I read a few books and tutorials and concluded that it made sense to
> work OWL-ishly instead of with naked RDF. I flirted with Proton, but
> decided for now to use my own little ontology.
>
> The flow is that one process does all this NLP and derives RDF/OWL,
> one graph per source. The second process takes these graphs and wants
> to stuff them into a store. And the third will do queries and
> visualization. On another thread I'm borrowing Andy's neurons on the
> subject of choosing a tuple store.
>
> After reading your messages, my thought is that I need to add the
> import into 'process 1', I don't really need any model in 'process 2'
> if I'm just pushing RDF/XML from here to there, and that in process 3
> the big question is to pick a store.
>
> Can you give me a pointer to read up to conform to OWL/DL, which from
> your email seems like it's what I'm stumbling toward doing?
>
> Or do you care to give me a shove in some other direction altogether?
>
> thanks,
> benson
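P.S. In case it helps, the kind of explicit owl:imports declaration I had in
mind for each per-document graph looks roughly like this in Turtle (all the
URIs here are invented placeholders — substitute your own document URI and
ontology URI):

```turtle
@prefix owl: <http://www.w3.org/2002/07/owl#> .

# Placeholder URIs: the per-document graph names itself as an ontology
# and explicitly imports the ontology whose classes and properties it uses.
<http://example.org/output/doc-1>
    a owl:Ontology ;
    owl:imports <http://example.org/my-little-ontology> .
```

With that header present in each graph that 'process 1' emits, an
imports-aware consumer (e.g. a Jena OntModel) can pull in the ontology
automatically rather than you having to wire it up by hand downstream.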
