David Worrall wrote:
> I'm working w. a dynamic dataset of some 3500 tables - each table
> grows sequentially. Total data ~= 5GB
Sorry I've no comparisons for you, but what kind of data is it?
If it's a lot of numbers and short strings, PyTables + numpy could be
very fast.
MySQL has a reputation for being fast too, but I have no numbers to compare.
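
As a rough sketch of what the PyTables route looks like (the file name
and the schema below are made up for illustration, not taken from your
data), appending rows in chunks to a growing table is roughly:

    import numpy as np
    import tables as tb

    # Hypothetical schema for one of the ~3500 tables.
    class Row(tb.IsDescription):
        timestamp = tb.Float64Col()
        value     = tb.Float32Col()
        label     = tb.StringCol(16)

    h5 = tb.open_file("data.h5", mode="a")
    if not hasattr(h5.root, "table_0001"):
        tbl = h5.create_table(h5.root, "table_0001", Row,
                              "grows sequentially")
    else:
        tbl = h5.root.table_0001

    # Appending a whole numpy chunk at once is much faster
    # than writing row by row.
    chunk = np.zeros(1000, dtype=tbl.dtype)
    tbl.append(chunk)
    tbl.flush()
    h5.close()
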
Hi all,
Does anyone know of a study which compares access times for the
various Python persistence techniques?
I'm working w. a dynamic dataset of some 3500 tables - each table
grows sequentially. Total data ~= 5GB
Don't mind if it's a bit awkward - but it's for a time-critical
application, so I need fast access.