>>>>> andyg <[email protected]> writes:
> I've changed all scanning in the current 7.4 over to the new scanner
> code I've been working on. I expect there will be bugs. I've started a
> wiki page that attempts to describe how the new scanner in 7.4 works,
> please read it and let me know if you need more info or have problems.
> http://wiki.slimdevices.com/index.php/NewScanner

This sounds like great stuff, Andy, I'm excited to check it out!

In automatic rescan mode, once it detects that a song has changed, how long does it take to update the database? Will it still run full versions of all the slow post-scan steps, or will those be made incremental and fast?

How about rescanning playlists? This is probably my biggest frustration right now: I edit a playlist, and then to get the changes seen by the server I need to trigger a manual rescan, which takes a significant amount of time because of the post-scan steps. It should at least be smart enough to skip the post-scan steps when rescanning the playlists finds no tag changes, which should usually be the case unless your playlist changes refer to (new) files outside the server's normal music directory. This could be a big win for very little effort.

On that web page you write:

| Scanning of file metadata and tags accounted for approx. 1/3 of the
| scan time in prior versions of SqueezeCenter. All the Perl-based file
| scanning modules have been replaced by a single C-based module,
| Audio::Scan. This reduces this metadata/tag reading time to a much
| smaller fraction of the scan process. The rest of the scan time is
| taken in the Perl database abstraction layer (DBIx::Class), and
| remains generally unchanged.

By what fraction has the tag scanning been sped up? Attacking the component that takes 1/3 of the time is certainly useful, but usually one would first go after the component taking the other 2/3. Are there any plans to speed up the database access? From what I understand, DBIx::Class is really slow; is there a plan to use something else?
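To put rough numbers on that 1/3-vs-2/3 point (back-of-the-envelope Amdahl's-law arithmetic only, not measurements of 7.4; the 10x factor is an assumption for illustration):

```python
# Amdahl's-law style estimate: overall speedup when only a fraction
# of the total scan time is accelerated. Illustrative numbers, not
# SqueezeCenter measurements.

def overall_speedup(fraction, factor):
    """Speedup of the whole job when `fraction` of it runs `factor`x faster."""
    return 1.0 / ((1.0 - fraction) + fraction / factor)

# Speeding up the tag-scanning third by 10x:
print(round(overall_speedup(1/3, 10), 2))   # ~1.43x overall

# Speeding up the database two-thirds by 10x instead:
print(round(overall_speedup(2/3, 10), 2))   # ~2.5x overall
```

So even an infinitely fast tag scanner caps the overall win at 1.5x while the database layer is untouched.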
Also, I would think it'd be a big win to cache updates to the database and only commit in bigger chunks while scanning. I've got about 22,000 MP3s and can extract all the tags in under 45 seconds, so I really don't see why a full rescan should take much more than about a minute:

  [mp3] g...@lwm| find . -name '*.mp3' -print | wc -l
  22162
  [mp3] g...@lwm| time find . -name '*.mp3' -print0 | xargs -0 id3info | wc -c
  11067476
  find . -name '*.mp3' -print0  0.03s user 0.07s system 0% cpu 41.251 total
  xargs -0 id3info  12.63s user 28.98s system 96% cpu 43.314 total
  wc -c  0.12s user 0.96s system 2% cpu 43.311 total

And that's only about 10 MB of tag data (probably a lot less if you consider how verbose the output of id3info is); I see no reason not to keep all the tag data in RAM while scanning and commit it to the DB all at once at the end.

thanks,
Greg

_______________________________________________
beta mailing list
[email protected]
http://lists.slimdevices.com/mailman/listinfo/beta
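P.S. A quick sketch of the batched-commit idea above, in Python with SQLite purely for illustration (the `tracks` table and columns are made up; SqueezeCenter's actual schema and DB layer differ):

```python
# Sketch: insert scanned tag rows inside a transaction and commit every
# N rows, instead of committing per track. On an on-disk database each
# commit forces a sync, so batching is where the big win comes from.
import sqlite3

def insert_per_row(conn, rows):
    for r in rows:
        conn.execute("INSERT INTO tracks (path, title) VALUES (?, ?)", r)
        conn.commit()            # one commit (and sync) per track: slow

def insert_batched(conn, rows, batch=1000):
    for i, r in enumerate(rows, 1):
        conn.execute("INSERT INTO tracks (path, title) VALUES (?, ?)", r)
        if i % batch == 0:
            conn.commit()        # amortize commit cost over `batch` rows
    conn.commit()                # flush the final partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tracks (path TEXT, title TEXT)")
rows = [(f"/music/{n}.mp3", f"Track {n}") for n in range(5000)]
insert_batched(conn, rows)
print(conn.execute("SELECT COUNT(*) FROM tracks").fetchone()[0])  # 5000
```

The in-memory DB here won't show the timing difference, but against a real on-disk database the per-row version pays a sync on every commit while the batched one pays it once per thousand rows.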
