Michael B Allen wrote:
Hi Ed,
I very much doubt that exec-ing and then backticking (which forks and
then execs again) would achieve the desired result.
One thing that might work would be to write a long-lived daemon that
opens the file(s) being appended to and waits for data on a named pipe.
PHP callers can then just open the pipe and write to it.
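A minimal caller-side sketch of that idea (the FIFO path is hypothetical, and the long-lived daemon that drains it is assumed to exist elsewhere):

```php
<?php
// Hypothetical FIFO that a long-lived log daemon would create and read
// from, e.g. with posix_mkfifo('/tmp/applog.fifo', 0666) at startup.
$fifo = '/tmp/applog.fifo';

if (!file_exists($fifo)) {
    posix_mkfifo($fifo, 0666); // requires the posix extension
}

// 'r+' opens the FIFO read/write, so the open does not block waiting
// for a reader the way a plain 'w' open would.
$fh = fopen($fifo, 'r+');
fwrite($fh, 'client ' . getmypid() . ": some log line\n");
fclose($fh);
```

One nice property of this route: POSIX guarantees that pipe writes of up to PIPE_BUF bytes are atomic, so concurrent PHP callers need no locking as long as each log line fits in a single fwrite().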
whenever I need efficient high concurrency writes, I use error_log()
----- Original Message -----
From: CED
Date: Wednesday, March 17, 2010 7:46 pm
Subject: [nyphp-talk] Fastest PHP Writing
To: NYPHP Talk
> List,
>
> I am doing some tests on PHP write speeds and concurrency and was
> wondering if anyone has done the same?
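For reference, error_log() with message type 3 appends the raw message straight to a file of your choosing, with no extra fopen()/fclose() in userland (the file path below is made up):

```php
<?php
// error_log() message type 3 appends the message to the given file.
// Note: unlike the other message types, it does NOT add a trailing
// newline itself, so include one in the message.
$logfile = '/tmp/app.log'; // hypothetical path
error_log("GET /index.php 200\n", 3, $logfile);
```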
Mitch Pirtle wrote:
On Wed, Mar 17, 2010 at 7:45 PM, CED wrote:
> Basically I am trying to find the most incredibly fast way to append to a
> file, and support a massive potential concurrency (think huge logging system
> for an ISP etc.).
Why not look into syslog-ng or mongodb's gridfs and/or capped collections?
Ju
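If the boxes already run syslog-ng (or any syslogd), PHP can hand lines to it through the standard syslog functions and let the daemon worry about file handles and concurrency; the ident string and facility below are made up:

```php
<?php
// Hand log lines to the local syslog daemon (syslog-ng, rsyslog, etc.).
// The daemon serializes concurrent writers, so PHP never touches the
// log file directly. 'myapp' and LOG_LOCAL0 are illustrative choices.
openlog('myapp', LOG_PID | LOG_ODELAY, LOG_LOCAL0);
syslog(LOG_INFO, 'request handled');
closelog();
```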
List,
I am doing some tests on PHP write speeds and concurrency and was
wondering if anyone has done the same?
I am currently testing -
fopen('filethingy', 'a')
passthru()
exec()
`echo "stuff" >> filethingy.txt`
Basically I am trying to find the most incredibly fast way to append
to a file, and support a massive potential concurrency (think huge
logging system for an ISP etc.).
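For what it's worth, the approaches above differ mainly in syscall and process overhead; a minimal side-by-side sketch of two of them (the file name is hypothetical):

```php
<?php
$file = '/tmp/filethingy.txt'; // hypothetical

// fopen() in 'a' mode opens the file with O_APPEND: each fwrite() is
// one write() syscall at the end of the file, with no fork involved.
$fh = fopen($file, 'a');
fwrite($fh, "via fwrite\n");
fclose($fh);

// exec() (and backticks) fork a shell and exec echo for every line, so
// each append also pays full process-creation cost on top of the write.
exec('echo "via exec" >> ' . escapeshellarg($file));
```

Under concurrency, small appends through a single O_APPEND write() generally don't interleave on local filesystems; if writes can get large or portability matters, flock($fh, LOCK_EX) around the fwrite() is the belt-and-braces option.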