Catrina: Yes, caching is involved when using
sml:ImportOracleRDFDatabase.

When a query is executed, regardless of where it is executed (e.g. a
script, the SPARQL View, etc.), Composer/Live will query the
appropriate data store and return all results.  So you can write your
query without having to know whether the data lives in a back-end
database, in a text serialization, etc.  If the Oracle back-end is
involved, the query is passed to Oracle and the results are returned
to the script.
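
For instance, a simple lookup like the one below can be run unchanged
from a script step or the SPARQL View; if the data lives in Oracle,
the matching happens on the back-end and only the resulting bindings
come back.  (This is just an illustrative sketch: the ex: prefix and
ex:Widget class are placeholders, not anything from your model.)

    # Hypothetical query; ex: is a placeholder namespace.
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    PREFIX ex:   <http://example.org/schema#>

    SELECT ?node ?label
    WHERE {
        ?node a ex:Widget ;
              rdfs:label ?label .
    }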

One caveat is that Composer's interface will only display up to x
results.  The default for x is 1000; it can be changed under
Preferences > TopBraid Composer > Max number of instances to display.

-- Scott

On Aug 28, 1:06 pm, Catrina <[email protected]>
wrote:
> I'm storing my ontologies in an Oracle database triple store.  I've
> got some additional data that I'd like to merge with the existing data
> in the database.  Therefore, I have a SPARQLMotion script import the
> Oracle RDF database with the sml:ImportOracleRDFDatabase module.  When
> this step is executed, will all the triples from the database be
> loaded into memory?  Is there any caching involved?
>
> Are there any SPARQLMotion modules that will allow me to query the RDF
> store for certain nodes (these are the nodes that I want to update)?
>
> Thanks,
> Catrina