Andrew --
...and then Andrew Fenn said...
% I have two text files with two rows of data on each line separated by a
% tab for about 20 lines in each file. Would it be faster accessing this
% data by putting it in a mysql table?
For something so small I'd go with a text file unless you happen to
Hi Jay,
On Tue, 2004-04-20 at 07:52, Jay Blanchard wrote:
[snip]
I have two text files with two rows of data on each line separated by a
tab for about 20 lines in each file. Would it be faster accessing this
data by putting it in a mysql table?
[/snip]
Sounds like the perfect
On Tue, 2004-04-20 at 08:24, John Nichel wrote:
Jay Blanchard wrote:
There is always awk!
Aw, no votes for cat | grep? ;)
No, that would be worse, since the system needs to spawn a new process
and that is slow too, especially on a highly loaded system.
If you are concerned
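William's point about process spawning can be sketched in PHP. The file name and sample data below are invented for illustration, and the shelled-out `cat | grep` variant is left commented out because it forks exactly the extra processes he warns about:

```php
<?php
// In-process lookup: one read, one scan, no fork/exec per request.
$tmp = tempnam(sys_get_temp_dir(), 'data');
file_put_contents($tmp, "alpha\t1\nbeta\t2\ngamma\t3\n");

$hit = null;
foreach (file($tmp, FILE_IGNORE_NEW_LINES) as $line) {
    if (strpos($line, "beta\t") === 0) { // line starts with the key + tab
        $hit = $line;
        break;
    }
}
echo $hit, "\n"; // echoes the matching line: "beta", a tab, "2"

// Shelled-out equivalent: the server must fork and exec *two* programs
// (cat and grep) on every hit, which is the overhead criticised above.
// $hit = trim(shell_exec('cat ' . escapeshellarg($tmp) . ' | grep beta'));

unlink($tmp);
```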
William Lovaton wrote:
On Tue, 2004-04-20 at 08:24, John Nichel wrote:
Jay Blanchard wrote:
There is always awk!
Aw, no votes for cat | grep? ;)
No, that would be worse, since the system needs to spawn a new process
and that is slow too, especially on a highly loaded system.
If you
I have two text files with two rows of data on each line separated by a tab for about
20 lines in each file. Would it be faster accessing this data by putting it in a mysql
table?
[snip]
I have two text files with two rows of data on each line separated by a
tab for about 20 lines in each file. Would it be faster accessing this
data by putting it in a mysql table?
[/snip]
Sounds like the perfect opportunity to set up a test! File I/O is always
slower.
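For a file this small, the plain-text side of such a test is only a few lines. A minimal sketch, assuming the layout described above (two tab-separated fields per line); the filename and sample data here are invented for illustration:

```php
<?php
// Read a tab-separated file into an associative array,
// one key/value pair per line.
function load_pairs($filename)
{
    $pairs = array();
    $flags = FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES;
    foreach (file($filename, $flags) as $line) {
        list($key, $value) = explode("\t", $line, 2);
        $pairs[$key] = $value;
    }
    return $pairs;
}

// Example usage with a throwaway temp file:
$tmp = tempnam(sys_get_temp_dir(), 'pairs');
file_put_contents($tmp, "red\t#ff0000\ngreen\t#00ff00\n");
$pairs = load_pairs($tmp);
echo $pairs['green'], "\n"; // prints #00ff00
unlink($tmp);
```

Wrapping the lookup (and an equivalent MySQL SELECT) in microtime() calls would settle the question for a given setup.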
--
PHP General Mailing List (http://www.php.net/)
Isn't the database kept in a file? Why would that file be faster than a plain
text file?
cheers,
Travis
Jay Blanchard wrote:
[snip]
I have two text files with two rows of data on each line separated by a
tab for about 20 lines in each file. Would it be faster accessing this
data by putting it
[snip]
Isn't the database kept in a file? Why would that file be faster than a
plain text file?
[/snip]
Of course it is, but the file is kept accessible by the database server
daemon, so you are not reading from a file in the traditional sense.
From: [EMAIL PROTECTED]
Sent: 20 April 2004 14:12
To: [EMAIL PROTECTED]
Subject: Re: [PHP] Whats faster? text files or mysql?
Isn't the database kept in a file? Why would that file be faster than a
plain text file?
cheers,
Travis
Jay Blanchard wrote:
[snip]
I have two text files with two
[snip]
I'm not sure but I think it would depend on the programming language, and
with php File I/O is slower than a database (mySql anyway).
Whereas Perl would probably be quicker with the text file.
[/snip]
There is always awk!
From: Andrew Fenn [EMAIL PROTECTED]
I have two text files with two rows of data on each
line separated by a tab for about 20 lines in each file.
Would it be faster accessing this data by putting it in a
mysql table?
Depends what you want to do with the data. If you're searching through for
Mark Cubitt wrote:
I'm not sure but I think it would depend on the programming language, and
with php File I/O is slower than a database (mySql anyway).
Whereas Perl would probably be quicker with the text file.
I could be wrong though
It would really depend on the size of the file, and how the db
Jay Blanchard wrote:
[snip]
I'm not sure but I think it would depend on the programming language, and
with php File I/O is slower than a database (mySql anyway).
Whereas Perl would probably be quicker with the text file.
[/snip]
There is always awk!
Aw, no votes for cat | grep? ;)