Re: [PHP] Size of Arrays

2004-01-30 Thread Gandolf
Hi all,

 Indeed. The following script had my machine happily swapping like crazy
 at 320,000 array items (each being 1k, task manager showed 640meg total
 memory in use on this development machine with 512meg RAM running WinXP).

 <?php
  // flush/disable any output buffering so the count prints as we go
  while (@ob_end_flush());
  $data = array();
  // keep appending 1 KB strings until the machine runs out of memory
  while (true)
  {
      $data[] = str_repeat('#', 1024);
      print count($data)."\n";
  }
 ?>

 To the OP, why do you need to store all the items in memory at the same
 time? Can't you read some, process them, then read the next lot, process
 them, read the next lot, etc.?

No, I couldn't, because we have to use all 1.6 million values directly. If we
read them in batches, one after the other, it will take a lot of time.

We discovered that we can use an array with 2,000,000 values - any higher and
the server (Intel 1 GHz, 32-bit, 1 GB RAM) gives up.

Ciao,

Sandro




Re: [PHP] Size of Arrays

2004-01-30 Thread Stuart
Gandolf wrote:
To the OP, why do you need to store all the items in memory at the same
time? Can't you read some, process them, then read the next lot, process
them, read the next lot, etc.?
No, I couldn't, because we have to use all 1.6 million values directly. If we
read them in batches, one after the other, it will take a lot of time.
Without testing I can't be sure, but my past experience suggests that 
working with several smaller sets of data one at a time would be more 
efficient than working with one big one if that's possible. I could of 
course be wrong since I don't know the internals of PHP too well, but 
I'd be surprised if I was.
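
For what it's worth, here's a rough sketch of what I mean, assuming the
values live in a text file with one value per line (the file name and
process_chunk() are just placeholders for the example):

<?php
// Placeholder for whatever you actually do with each batch of values.
function process_chunk($chunk)
{
    // ... your processing here ...
}

$fp = fopen('values.txt', 'r');   // assumed input: one value per line
$chunk = array();

while (($line = fgets($fp, 4096)) !== false)
{
    $chunk[] = trim($line);
    if (count($chunk) == 10000)   // work on 10,000 values at a time
    {
        process_chunk($chunk);
        $chunk = array();         // free the memory before reading the next lot
    }
}

if (count($chunk) > 0)
    process_chunk($chunk);        // whatever is left over

fclose($fp);
?>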

--
Stuart


Re: [PHP] Size of Arrays

2004-01-30 Thread Galen
Should anyone be planning experiments or projects with large arrays, 
use a for() loop instead of a foreach() loop when working with your 
array. And be sure to count the array, stick that result into a 
variable, then compare against the variable instead of running the 
count() again.

On huge arrays (hundreds of thousands of elements, three dimensions, 
several megabytes) the performance impact is huge! The downside is 
you'll probably have to use numeric keys instead of associative, but 
for big arrays, numeric keys will probably also keep more memory free 
and improve performance a bit.
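
For illustration, something like this (just a rough sketch; $data here is
only a stand-in for the real array):

<?php
// Sketch only: $data stands in for your big numerically indexed array.
$data = range(1, 500000);

// Count once up front instead of calling count() on every iteration.
$size = count($data);

for ($i = 0; $i < $size; $i++)
{
    $value = $data[$i];
    // ... work with $value here ...
}
?>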

Good luck with your arrays!

-Galen



RE: [PHP] Size of Arrays

2004-01-30 Thread Chris W. Parker
Galen [EMAIL PROTECTED]
on Friday, January 30, 2004 12:35 PM said:

 On huge arrays (hundreds of thousands of elements, three dimensions,
 several megabytes) the performance impact is huge! The downside is
 you'll probably have to use numeric keys instead of associative, but
 for big arrays, numeric keys will probably also keep more memory free
 and improve performance a bit.

And here's another optimization (for the anal type).

$yCnt = -1;
$array_size = count($array);

while (++$yCnt < $array_size)
{
    // do your stuff
}



Chris.




Re: [PHP] Size of Arrays

2004-01-29 Thread David T-G
Sandro --

...and then Gandolf said...
% 
% Hello NG,

Hello from the mailing list :-)


% could anyone tell me what the biggest size of an array is - how many
% values can an array hold?!
% I have to import 100,000 to 10,000,000 values and I have to know where the
% limit is.

1) I know of no practical limit, though realistically it's probably
MAXINT or so.

2) Five minutes with a test script would tell you, and I bet google could
tell you, so why bother the list with a question you could so easily
answer for yourself?  Go and try it and then report back where (if at
all) it breaks.  My bet is that you'll exhaust RAM and swap before you
break php, especially if you actually store something -- like, say, 32
bytes of useless string output -- in each entry.


% 
% Thanks a lot,
% 
% Sandro


HTH & HAND & Good luck

:-D
-- 
David T-G  * There is too much animal courage in 
(play) [EMAIL PROTECTED] * society and not sufficient moral courage.
(work) [EMAIL PROTECTED]  -- Mary Baker Eddy, Science and Health
http://justpickone.org/davidtg/  Shpx gur Pbzzhavpngvbaf Qrprapl Npg!





Re: [PHP] Size of Arrays

2004-01-29 Thread Stuart
David T-G wrote:
2) Five minutes with a test script would tell you, and I bet google could
tell you, so why bother the list with a question you could so easily
answer for yourself?  Go and try it and then report back where (if at
all) it breaks.  My bet is that you'll exhaust RAM and swap before you
break php, especially if you actually store something -- like, say, 32
bytes of useless string output -- in each entry.
Indeed. The following script had my machine happily swapping like crazy 
at 320,000 array items (each being 1k, task manager showed 640meg total 
memory in use on this development machine with 512meg RAM running WinXP).

<?php
// flush/disable any output buffering so the count prints as we go
while (@ob_end_flush());
$data = array();
// keep appending 1 KB strings until the machine runs out of memory
while (true)
{
    $data[] = str_repeat('#', 1024);
    print count($data)."\n";
}
?>
To the OP, why do you need to store all the items in memory at the same
time? Can't you read some, process them, then read the next lot, process
them, read the next lot, etc.?

--
Stuart


Re: [PHP] Size of Arrays

2004-01-29 Thread David T-G
Stuart, et al --

...and then Stuart said...
% 
% David T-G wrote:
% 2) Five minutes with a test script would tell you, and I bet google could
...
% all) it breaks.  My bet is that you'll exhaust RAM and swap before you
% break php, especially if you actually store something -- like, say, 32
% bytes of useless string output -- in each entry.
% 
% Indeed. The following script had my machine happily swapping like crazy 
% at 320,000 array items (each being 1k, task manager showed 640meg total 
% memory in use on this development machine with 512meg RAM running WinXP).
[snip]

*grin*

We can figure, then, that if you cut down to 32b you'll probably get to
about a million entries, which is a nice, BIG, round[ish] number.  You
ought to give it a go.
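
If anyone does try it, the obvious tweak to Stuart's script would be
something like this (untested, just shrinking the string):

<?php
while (@ob_end_flush());
$data = array();
while (true)
{
    $data[] = str_repeat('#', 32);   // 32 bytes per entry instead of 1k
    print count($data)."\n";
}
?>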

My laptop and my dev server are in the middle of a 12G transfer, and my
production servers are, well, production, so I can't try this now, but if
there's interest I'll give it a go later.  My dev box (LFS Linux) only
has 384M RAM, though, so it won't be anything awesome.  A buddy of mine
hosts on a 2G [quad Xeon -- yum!] box with another 2G of swap, but he'd
kill me if I took down his business!

Anyone (perhaps at some company with deeper pockets than mine) have a
hefty dev server that could risk a crash in the name of playing mine's
bigger? :-)


HAND

:-D
-- 
David T-G  * There is too much animal courage in 
(play) [EMAIL PROTECTED] * society and not sufficient moral courage.
(work) [EMAIL PROTECTED]  -- Mary Baker Eddy, Science and Health
http://justpickone.org/davidtg/  Shpx gur Pbzzhavpngvbaf Qrprapl Npg!


