Just for interest's sake, are you going to record a row in the table for
each log statement, or a row for each log file (as it were)?

Here at work we started doing a row per log statement, but our code
produces a prodigious number of log statements, so each run of the
application was filling up about 100,000 rows.  We found that creating a
normal log file and then inserting the whole thing as a BLOB into the db
was much more efficient.
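
Roughly, the blob insert looks something like this (the table and column
names below are made up for illustration, not our real schema):

import java.io.File;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;

// Sketch only: insert a finished log file as a single BLOB row.
// Table/column names (app_log_files, run_id, log_data) are illustrative.
public class LogFileBlobInserter {
    public void insertLogFile(Connection con, String runId, File logFile)
            throws Exception {
        PreparedStatement ps = con.prepareStatement(
            "INSERT INTO app_log_files (run_id, log_data) VALUES (?, ?)");
        FileInputStream in = new FileInputStream(logFile);
        try {
            ps.setString(1, runId);
            // Stream the whole file into the BLOB column in one insert.
            ps.setBinaryStream(2, in, (int) logFile.length());
            ps.executeUpdate();
        } finally {
            in.close();
            ps.close();
        }
    }
}

One insert per run instead of one per statement keeps the row count (and
the per-statement insert overhead) down to a single round trip.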

I only wish I had more time in the day; this would be a fun project too
:-(

Richard

On Wed, 2003-02-19 at 07:51, Ceki Gülcü wrote:
> 
> Unless someone does it before, I intend to rewrite the JDBCAppender 
> completely. Removing the existing JDBCAppender is my preferred choice.
> 
> At 22:11 18.02.2003 -0800, you wrote:
> >I propose that we place the current version of
> >o.a.log4j.jdbc.JDBCAppender.java into the log4j-sandbox (I don't know if we
> >should remove it altogether from the v1.3 release?).  In doing so,
> >interested log4j developers need to step forward to begin the task of
> >continuing its development.  I know Kevin Steppe had more ideas for evolving
> >this class, and others have spoken up as well.  Sounds like it needs the
> >ability to support different JDBC driver behaviors, CLOBs, etc.  It seems
> >that this appender is popular with the log4j user community, so I think it
> >is right for it to be given some needed attention and upgrades.
> >
> >Interested developers, demonstrating interest, proposals, and ability can be
> >granted committer rights to the sandbox cvs.
> >
> >Once JDBCAppender reaches a stable point of evolution, we can vote to
> >re-admit it to the core release.
> >
> >+1
> >
> >-Mark
> 
> --
> Ceki 
> 
> 



