Re: Building beagle 0.3.2
> this may be a somewhat profane question: I'm trying to build beagle
> 0.3.2 on Ubuntu 7.10. I did not find glib-sharp2 anywhere on the net
> though.

BTW, I noticed yesterday that beagle-0.3.2 is in Hardy. Maybe the Hardy
package could be changed to a Gutsy package or something.

- dBera

--
Debajyoti Bera @ http://dtecht.blogspot.com
beagle / KDE fan
Mandriva / Inspiron-1100 user

_______________________________________________
Dashboard-hackers mailing list
Dashboard-hackers@gnome.org
http://mail.gnome.org/mailman/listinfo/dashboard-hackers
Re: Building beagle 0.3.2
Way ahead of you, working on it now. The same packaging issues never
really got fixed, but I guess something is better than nothing.

On Jan 10, 2008 7:13 AM, D Bera [EMAIL PROTECTED] wrote:
> BTW, I noticed yesterday that beagle-0.3.2 is in Hardy. Maybe the
> Hardy package could be changed to a Gutsy package or something.

--
Cheers,
Kevin Kubasik
http://kubasik.net/blog
Re: Building beagle 0.3.2
So yeah, the network-enabled beagle issue still isn't addressed in
Hardy, but if we rebuild and provide backports of the GNOME stack then
I guess it's fine. It's all still more or less the same packaging I
originally did, which is a little bit of a bummer; I was hoping there
would be some awesome means of handling avahi support in a package, but
I guess not.

Bad news is that with the KDE 4 release and the subsequent package
surge, the build servers are a little backed up. Once I have everything
built and in a PPA I'll hit the list again.

Cheers,
Kevin Kubasik

On Jan 10, 2008 8:05 AM, Kevin Kubasik [EMAIL PROTECTED] wrote:
> Way ahead of you, workin on it now, same packaging issues never
> really got fixed, but I guess something is better than nothing.
Re: Building beagle 0.3.2
> So yeah, the network enabled beagle issue still isn't addressed in
> hardy [...] I was hoping there would be some awesome means of
> handling avahi support in a package, but I guess not.

The avahi/networking part has not yet been moved to its own assembly,
so packaging it separately is not an option right now. The Debian
packages have avahi enabled by default, but I strongly recommend
_against_ doing that for normal users, for one simple reason: it has
not received enough testing to say it works. It could also have
security issues and other unforeseen problems.

- dBera

--
Debajyoti Bera @ http://dtecht.blogspot.com
beagle / KDE fan
Mandriva / Inspiron-1100 user
Re: Building beagle 0.3.2
I think it was disabled in the config though, just built with avahi
support, unless that changed. The Gutsy backports will be pretty lame,
a few dep changes and that's about it. We're trying to keep most
packaging dev in Debian now, so we're consolidated; the problem is that
Debian is just a _tad_ bit stricter about packaging rules etc. Anyways,
I'll mention it; if it's not gonna be changed in Debian we can patch
that out for Ubuntu.

-Kevin

On Jan 10, 2008 8:44 AM, D Bera [EMAIL PROTECTED] wrote:
> The avahi/networking part is not yet moved to its own assembly, so
> packaging it separately is not an option right now. The debian
> packages have avahi enabled by default. But I strongly recommend
> _against_ doing that for normal users.
[ANNOUNCE] Nemo 0.2.0 released
Hi,

We're very pleased to announce a new version of Nemo. Nemo is the first
software to support the new Xesam standard[1] for talking to desktop
search servers. In this release we have only tested the Xesam interface
against the beagle xesam adaptor (version 0.2, with the following
patches required[2]), but it should work against all complying servers.
Performance compared to Tracker has been a lot better because of the
improved way one can query data. Tracker is slated to support Xesam in
its next major release.

Furthermore, indexing performance has been improved a lot, and a dialog
has been added to the tray to monitor progress.

Please use the newly created mailing list[3] for reporting
feedback/bugs/wishes. Thanks!

[1]: http://www.xesam.org
[2]: http://mail.gnome.org/archives/dashboard-hackers/2008-January/msg00021.html
[3]: http://groups.google.com/group/codename-nemo

--
Anders Rune Jensen
http://people.iola.dk/arj/
Re: Beagle's scope
On Jan 8, 2008 7:44 AM, D Bera [EMAIL PROTECTED] wrote:
> Hi Kevin,
>
> > 1. RDF Store. I know that the Beagle++ folks had integrated an RDF
> > store into their Beagle modifications. Are there any plans for
> > Beagle proper to include an RDF store? Or does this belong under a
> > separate project? If Beagle will incorporate its own RDF store,
> > should an external application (such as Dashboard) be able to add
> > things to Beagle's RDF store, or should it maintain its own RDF
> > store?
>
> The beagle-rdf-branch in svn has a partial implementation of
> overlaying an RDF store on beagle data, i.e. handling RDF queries
> instead of the usual BeagleClient queries. The code in the branch
> handles all kinds of (subject, predicate, object) queries. What is
> missing is a Semweb.Selectable source to add proper RDF flavour to
> the process. My RDF knowledge is not enough to finish the rest :(
>
> The above is an experiment to see if beagle can be mimicked as an RDF
> store while not designed to be one from the ground up. If it works,
> then any RDF client will be able to query beagle.
>
> The other part, about storing data, is tricky. Again, beagle was
> designed to pull and index information from data, not to store data.
> There is API to add extra data to beagle (search Joe's blog), like
> tags or additional properties. But for serious applications it is
> much more desirable to have a dedicated store and let beagle index it
> for you. There have been on-and-off efforts at writing a separate
> metadata store (possibly a simple sqlite table to start with), but
> none was completed.

Hey, I should probably put this into a branch, since it actually is
sitting on something not completely unlike what is being mentioned
here. Offering some simple API to store triples (read: URI, datatype,
and data) in a sqlite table wasn't the problem; it was determining
where and how we link it in.
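For readers unfamiliar with the terminology, the "all kinds of
(subject, predicate, object) queries" mentioned above can be
illustrated with a toy pattern matcher in Python, where None plays the
role of a wildcard. This is only a sketch of the query shapes involved,
not the branch's actual Semweb-based code:

```python
# Toy (subject, predicate, object) pattern queries over a triple list.
# None in any position acts as a wildcard. Illustrative only; the real
# beagle-rdf-branch answers these queries against the Lucene index.

def match(triples, s=None, p=None, o=None):
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

triples = [
    ("file:///a.mp3", "dc:title",  "Cool Song"),
    ("file:///a.mp3", "dc:author", "Great Band"),
    ("file:///b.odt", "dc:author", "Great Band"),
]

# All statements about one subject:
print(match(triples, s="file:///a.mp3"))
# All subjects with a given predicate/object pair:
print(match(triples, p="dc:author", o="Great Band"))
```

Fixing one, two, or all three positions gives the full range of query
shapes an RDF client would issue.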
Personally, I can't really seem to figure out _why_ people want all
their data siloed into one big btree, but that's beside the point. I
think there is a certain buzz afoot, and seeing how leaftag is dead on
the ground, and we're already a daemon running, it wouldn't be
completely out of the question for us to provide a means to store
metadata (with a line drawn at real data; e-mail should not be stored
in Beagle ;)). The other concern/issue/quandary is how metadata stored
in some sqlite table somewhere should be queried/merged with the
Lucene results. How would those terms weigh?

The exact procedures of storage really aren't all that important to
people. What I do think we might want to look into is simplifying the
process of 'feeding' and 'retrieving' data based on a URI in our API.
Say I have an e-mail that has been assigned a Red label in whatever
e-mail client. Right now, our API makes finding that information about
that e-mail easy if we find it as a result of some other query.
However, while the capability exists, finding out if
'email://[EMAIL PROTECTED]' has a red label is not a seamless API
action.

An idea that I've fiddled with for a while is beefing up our Hit class
to support basic CRUD against Beagle. Hear me out! Say I query by URI
and get a Hit representing what Beagle knows about
'email://[EMAIL PROTECTED]'. We have a pretty robust set of metadata,
and the client program displays it to the user. What if the user now
changes the label to Blue? Granted, normally we would recrawl and
notice right away, but let's pretend this is a 'proactive' program and
all of our indexed data about this e-mail client has to come over the
BeagleClient API. While this change certainly isn't hard, it's a far
cry from:

    hit.Properties["label"] = "blue";
    hit.Save();
    // or
    hit.SaveAsync();

which would do the needed logic to update our indexes with the new
data.
This opens the door to clients using beagle not as some storage
mechanism, but for its powerful tokenizers, stemmers and lightning-fast
full-text search. Building on that, it would be pretty easy to have a
hit.RemoveFromIndex() method. The Create step is a little trickier,
since Hits and Indexables are fundamentally different things; however,
I do think we might want to provide a slightly more 'code-concise'
means of both indexing and querying single URIs or sets of URIs.
Something like:

    client.GetHit(uri);
    client.GetHits(uris);

Maybe something as concise as:

    client.Index(uri, data, properties);

I would also propose adding another override of the AddProperty()
method which just takes two strings and assumes you wanted a Property
(not Keyword, Date, or Unsearched). However, all that doesn't do us
much good without guaranteed persistence. I haven't looked, but if I
were to use the client API to index a URI already in another index, and
that URI then got signaled for removal, would the user-inserted
information still be available?

Anyways, something to think about. Personally, if we start pushing
elements of the BeagleClient API as an easy way to get hyper-powered
search for all your content as an application developer (for
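To make the proposed round trip concrete, here is a toy Python model of
it. Everything here is hypothetical: none of these calls exist in
today's BeagleClient API, a plain dict stands in for the Lucene index,
and the 'email://example-message' URI is made up:

```python
# Hypothetical sketch of the proposed Hit CRUD flow. A dict stands in
# for Beagle's index; the real proposal would route these calls over
# the BeagleClient API.

class Index:
    def __init__(self):
        self._docs = {}                      # uri -> property dict

    def index(self, uri, properties):
        self._docs[uri] = dict(properties)   # (re)index a document

    def remove(self, uri):
        self._docs.pop(uri, None)

    def get_hit(self, uri):
        props = self._docs.get(uri)
        return Hit(self, uri, props) if props is not None else None

class Hit:
    def __init__(self, index, uri, properties):
        self._index = index
        self.uri = uri
        self.properties = dict(properties)

    def save(self):
        # The proposed hit.Save(): push edited metadata back into the index.
        self._index.index(self.uri, self.properties)

    def remove_from_index(self):
        # The proposed hit.RemoveFromIndex().
        self._index.remove(self.uri)

idx = Index()
idx.index("email://example-message", {"label": "red"})

hit = idx.get_hit("email://example-message")
hit.properties["label"] = "blue"
hit.save()

assert idx.get_hit("email://example-message").properties["label"] == "blue"
```

The point of the sketch is the shape of the client code: fetch by URI,
mutate properties, save, with the index update hidden behind Save().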
Re: [Xesam] [ANNOUNCE] Nemo 0.2.0 released
On 10/01/2008, Anders Rune Jensen [EMAIL PROTECTED] wrote:
> We're very pleased to announce a new version of Nemo. Nemo is the
> first software to support the new standard Xesam[1], for talking to
> desktop search servers.

Sweet! And congrats. Be sure to post any problems you have on the Xesam
side of things to this list (as you have already done in the past), so
that we can resolve them asap.

Cheers,
Mikkel
Re: [Xesam] [ANNOUNCE] Nemo 0.2.0 released
On 10/01/2008, Mikkel Kamstrup Erlandsen [EMAIL PROTECTED] wrote:
> Sweet! And congrats. Be sure to post any problems you have on the
> Xesam side of things to this list (as you have already done in the
> past), so that we can resolve them asap.

Two posts in 5 minutes shows my excitement :-) I just wanted you to
know that I added Nemo to http://www.xesam.org/main/XesamUsers

Cheers,
Mikkel
Re: [Xesam] [ANNOUNCE] Nemo 0.2.0 released
> Two posts in 5 minutes shows my excitement :-)

Mii 2oo :)

> I just wanted you to know that I added Nemo to
> http://www.xesam.org/main/XesamUsers

In a sense, Nemo is the first real user; the others are service
providers :) (Command-line clients for the desktop-search apps using
the xesam query language do not qualify as real xesam users.)

- dBera

--
Debajyoti Bera @ http://dtecht.blogspot.com
beagle / KDE fan
Mandriva / Inspiron-1100 user
beagle IPTC crawling question
Hi all,

I searched the list archives and didn't find anything related to this
yet. I am using Google's Picasa to manage my image collection. I have
been adding keywords to the images in the Picasa interface, and
inspection after the fact shows that the keywords are being stored as
IPTC data in the image. For instance, one image with the keywords
"cindy" and "swatch" has this for IPTC data (found using
"exiv2 pr -p i imagename"):

  Iptc.Application2.0x0075   String  152  <hellostamp> <gid>0-0-7fff-0</gid> <md5>0-0-0-0</md5> <origWidth>0</origWidth> <origHeight>0</origHeight> <origSize>0</origSize> </hellostamp>
  Iptc.Application2.Contact  String   82  <picasastamp> <keyword>cindy</keyword> <keyword>swatch</keyword> </picasastamp>
  Iptc.Application2.Keywords String    6  cindy
  Iptc.Application2.Keywords String    7  swatch

Now, Beagle does not pick up either of those two keywords at all.
However, importing that image into F-Spot and then exiting the program
will produce a hit in beagle on one of the keywords correctly. Is this
something that is known? Is there perhaps some way to address this, or
has it already been fixed?

pat
Re: beagle IPTC crawling question
> I am using googles picasa to manage my image collection. I have been
> adding keywords to the images in the picasa interface [...] Now,
> Beagle does not pick up either of those two keywords at all.

IIRC, IPTC indexing was added in 0.3.0. Are you using one of the 0.3.x
releases? Even with 0.3.x, some of the IPTC tags are not extracted due
to a limitation in the F-Spot code that beagle uses for IPTC
extraction. If I recall correctly, it's one of the lens/makernote
related tags. If you are using 0.3.x and beagle is not indexing
Application2, then it might be related to the F-Spot code. Let us know.

BTW, you can use "beagle-extract-content /path/to/filename" to figure
out what properties are extracted.

- dBera

--
Debajyoti Bera @ http://dtecht.blogspot.com
beagle / KDE fan
Mandriva / Inspiron-1100 user
Metadata Storage Daemon
Alright, so there was a quick chat in IRC last night where a few of us
realized that we wanted a simple metadata storage implementation, to
try and centralize what's going on all crazy-like with several
different daemons all coming from completely different directions. My
rough proposal is really two parts:

1) A simple metadata storage service over D-Bus would be quite simple;
obviously better APIs cost us more time and energy, but the backbone of
such a system is extremely rudimentary. I propose that we just go ahead
and write one. No desktop search or filters etc., just a few calls
exposed to D-Bus to store, query and delete triples (a combo of some
unique id, data, and the datatype/metadata). At its core this is a
sqlite db with a little extra work.

2) We take what we learn from the simple implementation and build it
into a Xesam spec for metadata storage, as well as building an
'official' GNOME ontology. While the strength of the current Xesam
query spec is a great indicator of how planning can design a wonderful
system, I think metadata is slightly different. Any true store (one
that reaches the universal acceptance needed for ubiquity) needs to be
generic: _any_ metadata about _any_ source, with social rules governing
where and how data is labeled.

Since I had about an hour to kill this evening, I sloshed together some
Python to outline what I am getting at.
The hodgepodge system I see as most prudent would handle an MP3 as
follows:

  file:///home/kevin/music/song.mp3 | dc:title     | Cool Song
  file:///home/kevin/music/song.mp3 | dc:author    | Great Band
  file:///home/kevin/music/song.mp3 | music:rating | 4
  file:///home/kevin/music/song.mp3 | gnome:tag    | Star

Or some files like:

  file:///home/kevin/Documents/hippo.odt    | gnome:project | ZooAnimals
  file:///home/kevin/Documents/hippo.odt    | gnome:project | FatAnimals
  file:///home/kevin/Documents/giaraffe.odt | gnome:project | ZooAnimals
  file:///home/kevin/Documents/zebrah.odt   | gnome:project | ZooAnimals

We throw in basic timestamping of all actions, and I think we have 90%
of the desktop's metadata storage needs covered. The best part is that
the footprint would be minuscule and the code relatively stable. While
a query system that supports wildcards etc. would probably be way
better, I mostly just wanted to show the idea. I used SQLObject since
it makes life painless and I wanted to finish both this e-mail and the
sample code in under an hour. Combined with proper namespacing of
applications, this is all we really need at the core (maybe a few more
columns or indexes).

Anyways, please share API thoughts so we can at least pick a general
direction. I would be really interested to know a little more about the
more elaborate potential use cases. Honestly, I see 80% of use being:

1) Add lots of attributes for a URI.

2) Query for all attributes associated with a URI, or query for a
specific attribute associated with a URI.

3) Query for URI sets that have a certain value in a certain
attribute.* (*This starts to venture into the realm of our indexers;
obviously this is a regular use case and we would need it plenty. I'm
just noting that any spec we try to make from this should probably
_count_ on the other desktop searches indexing their metadata, so we
really just filter on them.)
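The three use cases above can be sketched in a few dozen lines. This
uses only sqlite3 from the Python stdlib rather than SQLObject, and all
the function names are mine, not taken from the bzr branch:

```python
import sqlite3

# Minimal triple store covering the three listed use cases.
# Illustrative only; the actual prototype in the branch uses SQLObject.

def open_store():
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE triples (
                      uri   TEXT NOT NULL,
                      attr  TEXT NOT NULL,
                      value TEXT NOT NULL)""")
    # Indexes matching the two query directions: by URI, and by attr/value.
    db.execute("CREATE INDEX idx_uri ON triples (uri)")
    db.execute("CREATE INDEX idx_attr_value ON triples (attr, value)")
    return db

def add(db, uri, attr, value):
    # Use case 1: add attributes for a URI.
    db.execute("INSERT INTO triples VALUES (?, ?, ?)", (uri, attr, value))

def attrs_for(db, uri, attr=None):
    # Use case 2: all (attr, value) pairs for a URI, optionally filtered.
    if attr is None:
        return db.execute(
            "SELECT attr, value FROM triples WHERE uri = ?", (uri,)).fetchall()
    return [v for (v,) in db.execute(
        "SELECT value FROM triples WHERE uri = ? AND attr = ?", (uri, attr))]

def uris_with(db, attr, value):
    # Use case 3: every URI carrying attr = value.
    return [u for (u,) in db.execute(
        "SELECT DISTINCT uri FROM triples WHERE attr = ? AND value = ?",
        (attr, value))]

db = open_store()
add(db, "file:///home/kevin/Documents/hippo.odt", "gnome:project", "ZooAnimals")
add(db, "file:///home/kevin/Documents/hippo.odt", "gnome:project", "FatAnimals")
add(db, "file:///home/kevin/Documents/zebrah.odt", "gnome:project", "ZooAnimals")

print(sorted(uris_with(db, "gnome:project", "ZooAnimals")))
```

A real daemon would expose add/attrs_for/uris_with over D-Bus, add a
timestamp column, and persist the db to disk instead of :memory:, but
the storage core really is this small.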
Anyways, the blob of silly test code is in a bzr branch at
http://kubasik.net/dev/metadata_daemon.kkubasik/ so feel free to
"bzr branch http://kubasik.net/dev/metadata_daemon.kkubasik/" away. I
know this isn't at all near a full implementation or spec, but I did
want to get the ball rolling on it, as it seems like a lot of people
agree that an ultra-discreet (and part of GNOME proper) system for
storing and querying metadata is in the near future.

--
Cheers,
Kevin Kubasik
http://kubasik.net/blog