Normally, with extremely small data sets, the actual time is so small that
even if one approach is faster, the performance gain for the overall
process will be insignificant.
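
If you want to put a number on it, a quick Benchmark run is only a few
lines. This is just an untested sketch using the core Benchmark and
SDBM_File modules; the file names ('state_dbm', 'state.txt'), the key
count, and the lookup key are placeholders, not anything from your site:

use strict;
use warnings;
use Fcntl;
use SDBM_File;
use Benchmark qw(timethese);

# Roughly the size of data you describe: ~150 key/value pairs.
my %sample = map { ("key$_" => "value$_") } 1 .. 150;

# Build the DBM once.
tie my %dbm, 'SDBM_File', 'state_dbm', O_RDWR | O_CREAT, 0666
    or die "Cannot tie DBM: $!";
%dbm = %sample;
untie %dbm;

# Build the flat text file once, one "key=value" pair per line.
open my $out, '>', 'state.txt' or die "Cannot write state.txt: $!";
print $out "$_=$sample{$_}\n" for keys %sample;
close $out;

timethese(-2, {
    dbm => sub {
        # What a single page hit would do: tie, one lookup, untie.
        tie my %h, 'SDBM_File', 'state_dbm', O_RDONLY, 0666
            or die "Cannot tie DBM: $!";
        my $v = $h{'key75'};
        untie %h;
    },
    flat => sub {
        # Slurp the whole text file into a hash, then one lookup.
        open my $in, '<', 'state.txt' or die "Cannot read state.txt: $!";
        my %h;
        while (<$in>) {
            chomp;
            my ($k, $v) = split /=/, $_, 2;
            $h{$k} = $v;
        }
        close $in;
        my $v = $h{'key75'};
    },
});

My guess is both come out so fast that the difference is lost in the
noise, which is really the point: at 150 keys this is not a performance
decision.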

You might want to consider other factors when making your decision, for
example:

-Is this one page of many, where all the other pages use a database?

-Who (person or group) will maintain the data, and which format would they
prefer to work with?

-It is small now, but is it likely to grow significantly?
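
On that last point: if there is any chance it grows, it can be worth hiding
the lookup behind a tiny accessor so the backend can change later without
touching the pages. A rough sketch, assuming a "key=value" text file; the
file name and sub names here are made up for illustration:

use strict;
use warnings;

use constant STATE_FILE => 'state.txt';   # hypothetical file name

my %state;   # cached copy, loaded once per process

# Load the flat file the first time it is needed.
sub load_state {
    return \%state if %state;
    open my $in, '<', STATE_FILE or die 'Cannot read ' . STATE_FILE . ": $!";
    while (<$in>) {
        chomp;
        my ($k, $v) = split /=/, $_, 2;
        $state{$k} = $v;
    }
    close $in;
    return \%state;
}

# The only call the rest of the site uses; swap in a DBM or a real
# database here later and nothing else has to change.
sub state_lookup {
    my ($key) = @_;
    return load_state()->{$key};
}

# e.g. my $value = state_lookup('some_key');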


> Behalf Of Scott K Purcell
>
>
> Hello,
> Question: I have just a small amount of state information I need to store
> for a new site I am creating. Most sites I have done require a relational
> DB backend, but this site has a minimal amount of data. It will actually
> be a hash, and involves fewer than 150 keys.
>
> So in my eyes, I have two ways to go: A) create a quick DBM and let my
> script call it each time I need data, or B) write the data to a "text"
> file on the webserver and just read from it each time I need data.
>
> My real question is: which approach is faster? A) Accessing a DBM that is
> local to the webserver, adding entries at login, and accessing those
> entries while the user is online, or B) creating a file that is local to
> the webserver, adding entries there at login, and reading from that file
> while the user is browsing?
>
> If no one has any ideas which is faster, I can try to put together a
> Benchmark::IO study, but I figure someone may have done this already.
>
> Thanks For Your Time,
> Sincerely
> Scott

