info was stored in the simple RTF files.

However, now I'd like to rewrite this program in Python (using
PyQt), as I want to make it cross-platform and add/remove some
features. I'm now thinking about where to store my information.
Would it be better to use files as I did before, or a database,
SQLite in particular? Which will be faster and more flexible in the
long run? This application will be in memory most of the time, so
I'm also concerned about memory usage.

Not knowing what you do with the files, nor what sort of data they contain, makes it a bit difficult to tailor suggestions to your situation. However:

- sqlite will let you perform arbitrary queries against your data, so if you want to aggregate or perform complex conditional tests, it's just SQL (see the sketch after this list).

- I don't know if you're currently keeping the RTF in memory the whole time, or if you repeatedly reload (whether in one go or streaming) and reparse the file. Either way sounds memory- and/or processor-intensive. With sqlite, the processing is done at the C-module level, the data is kept on disk and only brought into memory as requested, then released when you're done with it.

- concurrently sharing a sqlite database should pose minimal issues, while sharing RTF files concurrently means locking/contention issues. This may not be an issue for you.

- sqlite comes built in with Python 2.5+, while RTF processing is not batteries-included as far as I can tell [1].
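
For concreteness, here's a minimal sketch of what that might look like. The notes.db filename and the notes table with title/body/created columns are invented for the example, since I don't know what your data actually is:

import sqlite3

# Open (or create) the database file; sqlite3 is in the standard
# library from Python 2.5 on.
conn = sqlite3.connect("notes.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS notes (
        id      INTEGER PRIMARY KEY,
        title   TEXT,
        body    TEXT,
        created TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
""")

# Parameterized inserts avoid any quoting/escaping headaches.
conn.executemany(
    "INSERT INTO notes (title, body) VALUES (?, ?)",
    [("shopping", "milk, eggs, bread"),
     ("todo", "port the app to PyQt")])
conn.commit()

# Arbitrary queries -- aggregation and conditional tests are just SQL,
# and only the matching rows are pulled into memory.
matches = conn.execute(
    "SELECT title, length(body) FROM notes"
    " WHERE body LIKE ? ORDER BY created",
    ("%PyQt%",)).fetchall()   # list of (title, length) tuples

conn.close()

Everything the WHERE clause doesn't match stays on disk, which is what keeps the memory footprint small.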

So in the general case, I see sqlite being a notable win over RTF. Depending on your data, if it's just key/value pairs of strings, you might look into the anydbm module [2], which has an even simpler interface.
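
Something along these lines, say (the prefs.db filename and the keys/values are made up; anydbm stores only strings, so anything richer would need to be serialized first):

import anydbm

# anydbm picks whichever dbm implementation is available (dbhash,
# gdbm, dbm, or the dumbdbm fallback); keys and values must be strings.
db = anydbm.open("prefs.db", "c")        # "c": create if it doesn't exist
db["window_geometry"] = "800x600+10+10"
db["last_file"] = "/home/user/notes.rtf"
geometry = db["window_geometry"]         # dict-style lookup
db.close()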

-tkc

[1] http://pyrtf.sf.net
[2] http://docs.python.org/library/anydbm.html

