And if, at the same time, the scan takes several hours or even up to a full day (I'm talking about libraries well in excess of half a million tracks here) it means this type of user pretty much exclusively uses BMF. Which brings us to the next two points:
I'm sorry to say, but I don't think we should care too much about the 500k+ tracks problem. How many such users do you know? We shouldn't waste too much time on the <1% case. Users who can afford 100k+ tracks should be able to overcome the worst performance by throwing money at the problem: get a powerful, dedicated machine, use SSDs only. I've seen too many cases where such users tried to run LMS on some underpowered hardware (SOHO NAS). If you can afford that many tracks, then you can afford a reasonable machine. It shouldn't cost more than the equivalent of a few thousand tracks - a low percentage of the overall cost, isn't it?
3. It looks like performance with the current SQLite degrades in a non-linear fashion with large libraries. This seems to have been better with MySQL.
I'm happy to provide hooks in LMS and help for a third-party dev to provide a MySQL enabling plugin. But building it into the main LMS distribution to overcome problems of a few users is not going to happen.
--

Michael

_______________________________________________
beta mailing list
[email protected]
http://lists.slimdevices.com/mailman/listinfo/beta
