Hi all
Just for information:
I'm a user with a large database consisting of 400k tracks. I was stuck
on 7.5.6 due to slow performance of SQLite. I just updated to 7.9 and I
can say WOW! Thank you thank you thank you!!!
Scanning times (all with SSD, Intel Core Duo 2.5 GHz, 3 GB RAM):
7.5.6 dele
...sorry, LMS 7.9 running under Windows 7 Home Premium.
Cheers Bronx
Bronx's Profile: http://forums.slimdevices.com/member.php?userid=33115
View this thread: http://forums.slimdevices.com/showthread.php?t=89391
erland;648454 Wrote:
>
> Maybe it could be the CPU ?
> My virtual machine only have access to one of the CPU cores and I'm
> guessing you have a multi core CPU which probably is an advantage with
> MySQL both since it runs in a separate process and also because it's
> probably more optimized fo
JJZolx;648151 Wrote:
> No, I did not test with a 65,000 track library. The library had 260,000
> tracks. The results I posted above showed the elapsed time at different
> points in the scan.
>
> I see results similar to yours when the library is small.
>
Do you use any customized settings in yo
Philip Meyer;648153 Wrote:
> >A quick analysis indicates that the difference could be:
> >- Linux vs Windows
> >- SQLite version (I think I used a newer version than JJZolx)
> >- MySQL version (I used the same as in 7.5, not sure what JJZolx
> uses)
> >- Tagging difference (my library is tagged d
>A quick analysis indicates that the difference could be:
>- Linux vs Windows
>- SQLite version (I think I used a newer version than JJZolx)
>- MySQL version (I used the same as in 7.5, not sure what JJZolx uses)
>- Tagging difference (my library is tagged differently than JJZolx)
>- FLAC size diff
erland;648148 Wrote:
> As you can see, my results are very different from the similar results
> JJZolx saw with 65 000 tracks. In all my tests SQLite is a lot faster
> than MySQL which is completely opposite to what JJZolx got earlier in
> his tests on Windows.
No, I did not test with a 65,000 t
I've done some tests on my Ubuntu box comparing SQLite with MySQL, and
my results are a bit different from JJZolx's.
I cleared the file system cache before each test and executed each
test case twice to make sure the timings are repeatable. All the
tests started with an empty Cache directory
Is it possible to split the library up? It may be convenient to keep
that many tracks in one folder, but why not split them up?
This may not be possible (I have never tried it, but then again, I only
have a 65GB library!)
--
kappclark
-
Is it possible to time how long it takes to browse the library using the SQLite
and MySQL engines, or is it not possible to get a scan to complete to a
satisfactory level with the MySQL DB? E.g. if artwork was turned off?
Phil
JJZolx;647828 Wrote:
> Here are the results of the first test with a 260,000 file library.
I've plotted those figures on a graph.
Interesting - it shows that SQLite isn't linear - performance gets
slightly worse as the library size increases, whereas MySQL is very
close to linear.
Obviously, t
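The non-linearity Phil describes can also be seen without a graph: convert the cumulative elapsed times at each checkpoint into a per-interval scan rate, and a falling rate means super-linear scan time. A small sketch with made-up checkpoint numbers (not the actual figures from this thread):

```python
def interval_rates(track_counts, elapsed_minutes):
    """Given cumulative track counts and cumulative elapsed minutes at each
    checkpoint, return the tracks-per-minute rate for each interval."""
    rates = []
    for i in range(1, len(track_counts)):
        tracks = track_counts[i] - track_counts[i - 1]
        minutes = elapsed_minutes[i] - elapsed_minutes[i - 1]
        rates.append(tracks / minutes)
    return rates

# Hypothetical checkpoints, NOT the thread's measurements:
counts = [0, 25_000, 50_000, 75_000, 100_000]
sqlite_elapsed = [0, 10, 22, 36, 52]   # each interval slower: super-linear
mysql_elapsed = [0, 14, 28, 42, 56]    # constant rate: linear
```

With numbers like these, `interval_rates(counts, sqlite_elapsed)` is strictly decreasing while the MySQL rates stay flat, which is exactly the shape the plot would show.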
JJZolx;647828 Wrote:
>
> These times were from the initial scans launched by starting up the
> server with an empty cache folder and (in the case of MySQL) an empty
> database. Both scans probably benefited from system caching, since I
> ran them soon after generating the library itself. I ran t
Here are the results of the first test with a 260,000 file library. The
Flac files are all 10 seconds of silence, then tagged exactly as my
real Flac library. What I did to generate the library was to duplicate
the albums in my 32.7k file library multiple times, changing the album
and folder name
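The library-generation trick described above (duplicate the same tagged albums many times, renaming each copy so paths stay unique) can be sketched like this. This is an illustrative reconstruction, not JJZolx's actual script; rewriting the `album` tag inside each FLAC would additionally need an external tool such as metaflac, which is omitted here.

```python
import shutil
from pathlib import Path

def duplicate_library(src: Path, dest: Path, copies: int):
    """Copy every album folder under `src` into `dest` `copies` times,
    appending a counter to each folder name so every path is unique."""
    for album in sorted(p for p in src.iterdir() if p.is_dir()):
        for n in range(1, copies + 1):
            shutil.copytree(album, dest / f"{album.name} (copy {n})")
```

Duplicating a ~32.7k-track source library eight times this way yields roughly the 260k-file test set used here.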
I'd like to point to an exact comparison of MySQL and SQLite in the
German forums. Maybe MrFloppy can give some comments in English here.
After comparing the two databases, the result clearly favours SQLite.
http://forums.slimdevices.com/showthread.php?t=89118
--
arztde
---
Philip Meyer;647712 Wrote:
>
> I'm not sure how the scanner works now, but assume it needs to read
> content back out of the DB when deciding how to process each file
> scanned. As it looks like there aren't any tidy-up phases any more, it
> must be running queries or reading content back out t
>Does anyone know if an ANALYZE of the tables is done after inserting each 5
>tracks (or a different count)?
>Every major database needs this if there are bulk inserts and reads
>happening at the same time (which I expect the scanner does?)
One suggestion during the "new schema" discussions was to have
erland;647703 Wrote:
> Just out of interest, do you have the "Database Memory Config" set to
> "Normal" or "High" ?
>
> With a library of this size, it's probably recommended to set it to
> "High".
Yes, it's set to High.
--
JJZolx
-
JJZolx;647701 Wrote:
> Well, that just plain wasn't going to happen. It would have taken
> weeks.
>
> Trying again with a 260k track library. I'll run the MySQL scan first
> this time.
>
Just out of interest, do you have the "Database Memory Config" set to
"Normal" or "High" ?
With a library o
JJZolx;647688 Wrote:
> If it keeps degrading like this as the database grows larger, it's going
> to take a _very_ long time to scan the whole library. Maybe longer than
> I have the patience for.
Well, that just plain wasn't going to happen. It would have taken
weeks.
Trying again with a 260k
bluegaspode;647691 Wrote:
> Does anyone know if an ANALYZE of the tables is done after inserting each 5
> tracks (or a different count)?
> Every major database needs this if there are bulk inserts and reads
> happening at the same time (which I expect the scanner does?)
>
As far as I can see it ru
Does anyone know if an ANALYZE of the tables is done after inserting each 5
tracks (or a different count)?
Every major database needs this if there are bulk inserts and reads
happening at the same time (which I expect the scanner does?)
--
bluegaspode
Did you know: *'SqueezePlayer' (www.squeezep
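For what it's worth, in SQLite the planner statistics bluegaspode is asking about are refreshed with the ANALYZE statement; whether the scanner actually issues it, and how often, is the open question here. A minimal sketch of interleaving it with a bulk load (the interval is an arbitrary illustration, not LMS's actual behaviour):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tracks (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("CREATE INDEX idx_title ON tracks(title)")

ANALYZE_EVERY = 1000  # hypothetical interval, chosen for illustration
for i in range(5000):
    conn.execute("INSERT INTO tracks (title) VALUES (?)", (f"track {i}",))
    if (i + 1) % ANALYZE_EVERY == 0:
        # Refresh the query planner's statistics mid-bulk-load, so reads
        # issued while the scan is still running get sensible query plans.
        conn.execute("ANALYZE")
conn.commit()

# ANALYZE stores its gathered statistics in the sqlite_stat1 table.
stat_rows = conn.execute("SELECT COUNT(*) FROM sqlite_stat1").fetchone()[0]
```

Whether the extra ANALYZE passes pay off depends on how much the scanner reads back during the load; running it too often would itself slow the bulk insert.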
OK, the new library has been generated: 1,000,000 tracks. (Why mess
around, right?)
I'm scanning it now with a 7.6.1 server running SQLite. I'm not
certain, but already I may be seeing something. Discovery of the 1M
files took just 9:28. Scanning the first 25k tracks took about 10
minutes. Now, at
I'm generating a very large test library right now and will post the
results after it's been scanned under both SQLite and MySQL. Maybe
sometime tomorrow or Monday.
It won't be possible to compare the overall scan times of SQLite/7.6 to
MySQL/7.6, since a clear & rescan with MySQL in 7.6 hangs du
>I worry about SQLite. Does anyone have experience with such large
>databases with SQLite? In the beta forum I've read somewhere that with
>very large databases SQLite may run unexpectedly slowly, possibly
>even slower than MySQL? Thank you for your answers.
>
I worried about the move to SQLite t
Bronx, add another GB of RAM, it's very cheap. Then set the new
advanced performance option in 7.6.x to use extra memory.
http://forums.slimdevices.com/showthread.php?t=88904&page=3 shows
scanning performance of one of my boxes, a mini-ITX with a low-power
dual-core AMD X2 235E @ 2.7 GHz, 4 GB R
Wow, Erland, Slate, thanks for the quick response. I'll surely report
back when I dare to use 7.6.x in a few days.
Daniel
--
Bronx
Bronx's Profile: http://forums.slimdevices.com/member.php?userid=33115
View this thread:
pallfreeman got 200k
http://forums.slimdevices.com/showthread.php?t=89205
For me, 7.6.x full scans are 4 times faster than 7.5.x.
If you are cautious, then wait for 7.6.1 to be released.
Would you have problems if you tried 7.6.1 today? You might:
- if you use cue sheets
- if you use UNC paths
Bronx;647414 Wrote:
> Hi all
>
> I've been a user of the Squeezebox for 8 years now; I'm a silent reader
> of the forum. I have all models except the Slimp and Squeezebox 1 in
> operation. My database is very large and includes over 300,000 tracks.
> It runs stable and ok on a dedicated dual Pen
Hi all
I've been a user of the Squeezebox for 8 years now; I'm a silent reader
of the forum. I have all models except the Slimp and Squeezebox 1 in
operation. My database is very large and includes over 300,000 tracks.
It runs stable and ok on a dedicated dual Pentium D with 3.0 GHz and 1
Giga me