Hi Ralph

I do not necessarily see this as a problem of the logging configuration.
Yes, it can be solved with the logging configuration, but you could also do
it another way. You could log at debug and roll the logs quickly, then poll
and aggregate the logs at regular, short intervals with a platform like
Zabbix, Loki, Splunk, ... This way you have the log data when you need it
and you can apply retention policies on the aggregated stream of data.
On top of that you can query the logs and derive other metrics of interest.
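
For the "roll quickly" part, here is a minimal sketch with Log4j's
ConfigurationBuilder (names, sizes and the pattern are placeholders; the
equivalent XML configuration works just as well):

import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.config.Configurator;
import org.apache.logging.log4j.core.config.builder.api.AppenderComponentBuilder;
import org.apache.logging.log4j.core.config.builder.api.ComponentBuilder;
import org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilder;
import org.apache.logging.log4j.core.config.builder.api.ConfigurationBuilderFactory;
import org.apache.logging.log4j.core.config.builder.impl.BuiltConfiguration;

public class DebugAndRollQuickly {
    public static void main(String[] args) {
        ConfigurationBuilder<BuiltConfiguration> builder =
                ConfigurationBuilderFactory.newConfigurationBuilder();
        builder.setConfigurationName("debug-and-roll-quickly");

        // Keep only a small on-disk window; the aggregator owns long-term retention.
        ComponentBuilder<?> policies = builder.newComponent("Policies")
                .addComponent(builder.newComponent("SizeBasedTriggeringPolicy")
                        .addAttribute("size", "50 MB"));
        AppenderComponentBuilder rolling = builder.newAppender("rolling", "RollingFile")
                .addAttribute("fileName", "logs/app.log")
                .addAttribute("filePattern", "logs/app-%i.log.gz")
                .add(builder.newLayout("PatternLayout")
                        .addAttribute("pattern", "%d %p %X{accountNumber} %c - %m%n"))
                .addComponent(policies)
                .addComponent(builder.newComponent("DefaultRolloverStrategy")
                        .addAttribute("max", "5"));
        builder.add(rolling);

        // Log everything at debug; filtering and retention happen downstream.
        builder.add(builder.newRootLogger(Level.DEBUG).add(builder.newAppenderRef("rolling")));
        Configurator.initialize(builder.build());
    }
}

With a bounded set of rolled files per instance, the shipper tailing the file
(a Loki, Zabbix or Splunk agent, for example) is what actually keeps the data.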

Dominik

On Thu, 7 Apr 2022 at 08:04, Ralph Goers <ralph.go...@dslextreme.com> wrote:

> Thanks Robert. Yes. We want to log everything that happens for a single
> user
> or a single account across all the services that may get called.
>
> Viewing the logs is not a problem. We route logs for everything to
> Elasticsearch
> and can view them with Kibana.
>
> The issue is simply in getting Log4j to enable the logging for those
> users.
>
> But I think I’ve come up with a better solution. I am going to create a
> new Filter
> that retrieves its configuration from Spring Cloud Config. It will
> periodically poll
> for changes to that file. If the file changes then the values to be
> matched against
> will be updated. So the filter will be configured with the location of the
> file to use
> and the poll interval. This way we don’t need to go through
> reconfiguration so it
> should have less overhead when the file changes. The only downside to this
> is
> that it is still subject to polling delays but without something like
> RabbitMQ or
> Kafka I don’t see a better way to do it.
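>
> Roughly, I am picturing something like the sketch below. The names are
> provisional and it assumes the file in SCC is reachable as plain text over
> HTTP with one account number per line:
>
> import java.io.BufferedReader;
> import java.io.InputStreamReader;
> import java.net.URL;
> import java.nio.charset.StandardCharsets;
> import java.util.Set;
> import java.util.concurrent.ConcurrentHashMap;
> import java.util.concurrent.Executors;
> import java.util.concurrent.ScheduledExecutorService;
> import java.util.concurrent.TimeUnit;
>
> import org.apache.logging.log4j.core.Filter;
> import org.apache.logging.log4j.core.LogEvent;
> import org.apache.logging.log4j.core.config.Node;
> import org.apache.logging.log4j.core.config.plugins.Plugin;
> import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
> import org.apache.logging.log4j.core.config.plugins.PluginFactory;
> import org.apache.logging.log4j.core.filter.AbstractFilter;
>
> // Accepts events whose ThreadContext accountNumber is in the polled list;
> // everything else stays NEUTRAL so the normal levels still apply.
> @Plugin(name = "PolledAccountFilter", category = Node.CATEGORY,
>         elementType = Filter.ELEMENT_TYPE, printObject = true)
> public final class PolledAccountFilter extends AbstractFilter {
>
>     private final Set<String> accounts = ConcurrentHashMap.newKeySet();
>     private final String location;
>
>     private PolledAccountFilter(final String location, final int pollSeconds) {
>         super(Result.ACCEPT, Result.NEUTRAL);
>         this.location = location;
>         // Daemon thread so the poller never keeps the JVM alive; shutdown handling omitted.
>         final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor(r -> {
>             final Thread t = new Thread(r, "account-filter-poller");
>             t.setDaemon(true);
>             return t;
>         });
>         scheduler.scheduleAtFixedRate(this::refresh, 0, pollSeconds, TimeUnit.SECONDS);
>     }
>
>     // Re-reads the list of accounts; on failure the previous values are kept.
>     private void refresh() {
>         final Set<String> latest = ConcurrentHashMap.newKeySet();
>         try (BufferedReader reader = new BufferedReader(
>                 new InputStreamReader(new URL(location).openStream(), StandardCharsets.UTF_8))) {
>             String line;
>             while ((line = reader.readLine()) != null) {
>                 if (!line.trim().isEmpty()) {
>                     latest.add(line.trim());
>                 }
>             }
>             accounts.clear();
>             accounts.addAll(latest);
>         } catch (final Exception ignored) {
>             // keep the old list if the config server is unreachable
>         }
>     }
>
>     // Note: AbstractFilter answers NEUTRAL for the other filter(...) overloads.
>     // A real version would override them too (as DynamicThresholdFilter does)
>     // so an ACCEPT can bypass the logger level check; this only handles built events.
>     @Override
>     public Result filter(final LogEvent event) {
>         final String account = event.getContextData().getValue("accountNumber");
>         return account != null && accounts.contains(account) ? getOnMatch() : getOnMismatch();
>     }
>
>     @PluginFactory
>     public static PolledAccountFilter createFilter(
>             @PluginAttribute("location") final String location,
>             @PluginAttribute(value = "pollIntervalSeconds", defaultInt = 60) final int pollSeconds) {
>         return new PolledAccountFilter(location, pollSeconds);
>     }
> }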
>
> Ralph
>
> > On Apr 6, 2022, at 3:56 PM, Robert Middleton <rmiddle...@apache.org>
> > wrote:
> >
> > So if I'm understanding you correctly, you want to do live debugging of one
> > user's information, but you can't do that because that information is not
> > going to the logs.  In order to get that information, some part of log4j
> > would need to be reconfigured to send that information out to the logs.
> >
> > I haven't done much with Spring before, so I can't really speak to that
> > part of it much.  If only one person is attempting to view the logs, then
> > something like the following would be my idea:
> > 1. Connect to the service using a log viewing application of some kind
> > 2. Using this log viewing application, send a special command that
> > reconfigures a filter on whatever appender the log viewing application is
> > using.  Presumably this is some sort of user input box.
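> >
> > Programmatically, that reconfiguration command could boil down to something
> > like this sketch (the appender name is made up, and the loggers already have
> > to be at DEBUG for an appender-level filter to see the events):
> >
> > import org.apache.logging.log4j.Level;
> > import org.apache.logging.log4j.LogManager;
> > import org.apache.logging.log4j.core.Appender;
> > import org.apache.logging.log4j.core.Filter.Result;
> > import org.apache.logging.log4j.core.LoggerContext;
> > import org.apache.logging.log4j.core.filter.DynamicThresholdFilter;
> > import org.apache.logging.log4j.core.filter.Filterable;
> > import org.apache.logging.log4j.core.util.KeyValuePair;
> >
> > public class LogViewerCommands {
> >
> >     // Invoked when the viewer sends its "debug this account" command.
> >     public static void enableDebugFor(final String accountNumber) {
> >         final LoggerContext ctx = (LoggerContext) LogManager.getContext(false);
> >         // "LogViewer" is whatever appender feeds the viewing application.
> >         final Appender appender = ctx.getConfiguration().getAppender("LogViewer");
> >         if (appender instanceof Filterable) {
> >             final DynamicThresholdFilter filter = DynamicThresholdFilter.createFilter(
> >                     "accountNumber",
> >                     new KeyValuePair[] { new KeyValuePair(accountNumber, "DEBUG") },
> >                     Level.ERROR, Result.ACCEPT, Result.NEUTRAL);
> >             filter.start();
> >             ((Filterable) appender).addFilter(filter);
> >             // removeFilter(filter) undoes this when the session ends.
> >         }
> >     }
> > }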
> >
> > This probably wouldn't scale all that well to the hundreds of instances
> > that you have, as I'm assuming that the reason you would commit a config
> > file in git is so that all the services can pull the new file at (more or
> > less) the same time and all reconfigure appropriately.  The advantage,
> > however, is that it wouldn't require any config file changes, and you don't
> > spam the log viewing application with useless log messages (e.g. filter on
> > the log4j side instead of sending all messages to the log viewer and
> > filtering there).
> >
> > -Robert Middleton
> >
> >
> > On Wed, Apr 6, 2022 at 3:14 AM Ralph Goers <ralph.go...@dslextreme.com>
> > wrote:
> >
> >> I’m looking for some inspiration.
> >>
> >> At work we use Spring Cloud Config and host our logging configuration
> >> there. It is shared by something like 150 services, which comes out to
> >> hundreds of service instances. We have a standard of including the user’s
> >> id and the customer’s account number in the ThreadContext and passing those
> >> automatically from one service to another.  Operationally, we want to
> >> normally log at WARN or INFO. But when a customer calls with a problem we
> >> want to be able to enable debug logging for that user or account. I know I
> >> could use either the DynamicThresholdFilter or the ContextMapFilter to do
> >> this.
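> >>
> >> Conceptually the propagation amounts to something like the following
> >> sketch, with made-up header names and assuming a Spring servlet stack:
> >>
> >> import java.io.IOException;
> >>
> >> import javax.servlet.FilterChain;
> >> import javax.servlet.ServletException;
> >> import javax.servlet.http.HttpServletRequest;
> >> import javax.servlet.http.HttpServletResponse;
> >>
> >> import org.apache.logging.log4j.CloseableThreadContext;
> >> import org.springframework.web.filter.OncePerRequestFilter;
> >>
> >> public class ContextPropagationFilter extends OncePerRequestFilter {
> >>
> >>     @Override
> >>     protected void doFilterInternal(final HttpServletRequest request,
> >>             final HttpServletResponse response, final FilterChain chain)
> >>             throws ServletException, IOException {
> >>         // Hypothetical header names carrying the ids forwarded by the calling service.
> >>         try (CloseableThreadContext.Instance ignored = CloseableThreadContext
> >>                 .put("userId", request.getHeader("X-User-Id"))
> >>                 .put("accountNumber", request.getHeader("X-Account-Number"))) {
> >>             chain.doFilter(request, response);
> >>         }
> >>     }
> >> }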
> >>
> >> So here are the issues.
> >> 1. We don’t really want operations folks editing the log4j configuration
> >> file directly.
> >> 2. Our SCC uses a Git repo hosted by a third party. We don’t really want
> >> to open access to allow Git to call SCC’s endpoint to notify it of changes.
> >> 3. Without the Git notification we will have to poll for changes in SCC.
> >> This means it could be a few minutes before SCC notices changes and then
> >> another few minutes before Log4j notices changes.
> >>
> >>
> >> I’m considering doing something like:
> >>
> >> <DynamicThresholdFilter key="accountNumber" defaultThreshold="ERROR"
> >> onMatch="ACCEPT" onMismatch="NEUTRAL">
> >>   <KeyValuePair key="${scc:accountNumber}" value="DEBUG"/>
> >> </DynamicThresholdFilter>
> >>
> >> Where scc represents a new SpringCloudConfigLookup. This would have to
> >> represent a file in SCC that contains a mapping for accountNumber to
> >> something.
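> >>
> >> A minimal sketch of that lookup (the refresh from SCC is left out; some
> >> poller would call update() whenever the mapping file changes):
> >>
> >> import java.util.Map;
> >> import java.util.concurrent.ConcurrentHashMap;
> >>
> >> import org.apache.logging.log4j.core.LogEvent;
> >> import org.apache.logging.log4j.core.config.plugins.Plugin;
> >> import org.apache.logging.log4j.core.lookup.AbstractLookup;
> >> import org.apache.logging.log4j.core.lookup.StrLookup;
> >>
> >> @Plugin(name = "scc", category = StrLookup.CATEGORY)
> >> public class SpringCloudConfigLookup extends AbstractLookup {
> >>
> >>     // Last contents of the mapping file pulled from SCC.
> >>     private static final Map<String, String> CACHE = new ConcurrentHashMap<>();
> >>
> >>     // A poller (not shown) calls this when the file in SCC changes.
> >>     public static void update(final Map<String, String> latest) {
> >>         CACHE.clear();
> >>         CACHE.putAll(latest);
> >>     }
> >>
> >>     @Override
> >>     public String lookup(final LogEvent event, final String key) {
> >>         return CACHE.get(key);
> >>     }
> >> }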
> >>
> >> There are a number of issues with this.
> >> 1. This still requires editing and committing a file.
> >> 2. It has the polling delay.
> >> 3. Log4j-Spring-Cloud-Config would have to cause the file to be monitored
> >> just as the config file is.
> >> 4. The syntax of the existing Filters requires that when looking for
> >> multiple users each must be specified in its own key. That would mean the
> >> filter would have to be preconfigured with the maximum number of accounts
> >> it can look for. It would also mean the syntax for specifying the account
> >> numbers could get messy. However, it could register a Watcher for the
> >> external file and force a reconfiguration when the file changes.
> >>
> >> An alternative to reconfiguring could be to specify the variable as
> >> $${scc:accountNumber} and resolve the lookup for every log event.
> >> Presumably it would use a cached value and cause the value to change when
> >> the file is changed.
> >>
> >> These are just the first few things I thought of to do this. Does anyone
> >> else have any ideas?
> >>
> >> Ralph
>
>

-- 
Dominik Psenner
