On Mon, 2008-12-08 at 18:59 +0100, Philip Van Hoof wrote:
> All metadata engines are nowadays working on a method to let them get
> their metadata fed by external applications.
> Such APIs come down to storing RDF triples. An RDF triple comes down to a
> URI, a property and a value.
> For example (in Turtle format, which is SPARQL's inline format and the
> W3C's typical RDF serialization format):
> We'd like to make an Evolution plugin that does this for Tracker. 
> Obviously, it would then be as easy as letting software like Beagle
> become an implementer of "prox"'s InsertRDFTriples to start supporting
> Beagle with the same code and Evolution plugin.
> I just don't know which EPlugin hooks I should use. Iterating all
> accounts, for each account all folders, and for each folder all
> CamelMessageInfo instances is trivial, and I know how to do this.
> What I don't know is what reliable hooks are for:
>   * Application started

org.gnome.evolution.shell.events:1.0 (see es-event.c)

sample plugin:

>   * Account added


sample plugin:

For account-added: id = org.gnome.evolution.mail.config.accountDruid
For account-edited: id = org.gnome.evolution.mail.config.accountEditor

>   * Account removed

You may have to write a new hook for this.

>   * Folder created
>   * Folder deleted
>   * Folder moved
>   * Message deleted (expunged)
>   * Message flagged for removal 
>   * Message flagged as Read and as Unread
>   * Message flagged (generic)
>   * Message moved (i.e. deleted + created)
>   * New message received
>     * Full message 
>     * Just the ENVELOPE

If you try to update your metadata for every one of the above operations,
it may be overkill in terms of performance (and, I believe, disk access as
well, from updating your metadata store). You can instead add a new hook
that fires whenever a change is made to the summary DB and listen to that:
all of the above changes eventually have to reach the summary DB to be
persisted.

However, I personally believe:

More and more applications are using sqlite (Firefox and Evolution are my
two most-used apps). So it may be a better idea to map the tables in an
sqlite database directly into the search application's data-store (Beagle,
Tracker, etc.) instead of depending on the applications to hand over
up-to-date data.

When we implemented the on-disk summary for Evolution, we removed the
meta-summary code (used by Beagle). We had to provide a way to help
Beagle / Tracker learn about modified/new mails, so they could (re)index
them. Some suggested adding a DATETIME field containing each record's
last-modified/created timestamp. However, besides bloating the database,
that provides no information about deleted records.

If, inside the sqlite db, we have a special table holding (table-name,
primary keys of the last N modified/added records, time-added), any search
application can make use of it to update its Lucene (or whatever) index.
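A minimal sketch of that idea using Python's sqlite3 module (the table, trigger, and column names here are invented for illustration): triggers append (table name, primary key, operation) rows to a change-log table that an indexer can poll, and unlike a per-record DATETIME column, deletes are recorded too.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE messages (uid TEXT PRIMARY KEY, subject TEXT, flags INTEGER);

-- Hypothetical change-log table: one row per recent modification.
CREATE TABLE changes (tbl TEXT, pk TEXT, op TEXT,
                      time_added DEFAULT (strftime('%s','now')));

CREATE TRIGGER messages_ins AFTER INSERT ON messages
BEGIN INSERT INTO changes (tbl, pk, op) VALUES ('messages', NEW.uid, 'I'); END;

CREATE TRIGGER messages_upd AFTER UPDATE ON messages
BEGIN INSERT INTO changes (tbl, pk, op) VALUES ('messages', NEW.uid, 'U'); END;

-- A plain last-modified DATETIME column could not capture this case.
CREATE TRIGGER messages_del AFTER DELETE ON messages
BEGIN INSERT INTO changes (tbl, pk, op) VALUES ('messages', OLD.uid, 'D'); END;
""")

conn.execute("INSERT INTO messages VALUES ('uid-1', 'hello', 0)")
conn.execute("UPDATE messages SET flags = 1 WHERE uid = 'uid-1'")
conn.execute("DELETE FROM messages WHERE uid = 'uid-1'")

# The indexer polls the recent changes (and would prune what it consumed).
rows = conn.execute("SELECT tbl, pk, op FROM changes ORDER BY rowid").fetchall()
print(rows)  # [('messages', 'uid-1', 'I'), ('messages', 'uid-1', 'U'), ('messages', 'uid-1', 'D')]
```

The indexer never touches the application's own tables except to fetch the changed records by primary key, which keeps the coupling between mailer and indexer down to this one table.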

It may not be the neatest approach, but what I want to say is: instead of
depending on the end-user applications (which use sqlite) to hand over
data, search applications should be able to get the data from the db
itself. This also brings additional benefits, like creating/updating
search indices while the machine is idle instead of choking the
applications while they are running.

My 0.2 EUROes ;-)


Evolution-hackers mailing list