Anyone have any ideas on the best way to do this?

Matt Wise
Sr. Systems Architect
Nextdoor.com
On Sat, Nov 9, 2013 at 5:28 PM, Matt Wise <m...@nextdoor.com> wrote:
> Hey, we'd like to set up a default format for all of our logging systems,
> perhaps looking like this:
>
> "key1=value1;key2=value2;key3=value3...."
>
> With this pattern, we'd allow developers to define any key/value pairs
> they want to log, separated by a common separator.
>
> If we did this, what do we need to do in Flume to parse the key=value
> pairs out into dynamic headers? We pass our data from Flume into both
> HDFS and ElasticSearch sinks. We would really like to have these fields
> dynamically sent to the sinks for much easier parsing and analysis later.
>
> Any thoughts on this? I know that we can define a unique interceptor for
> each service that looks for explicit field names, but that's a nightmare
> to manage. I really want something truly dynamic.
>
> Matt Wise
> Sr. Systems Architect
> Nextdoor.com
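For reference, here is a minimal sketch of the parsing step a single generic custom interceptor could apply to every event body, regardless of which service produced it. The class and method names below are illustrative only (this is plain Java, not the Flume API); in a real interceptor you would call something like this from `intercept(Event)` and merge the result into `event.getHeaders()`:

```java
import java.util.HashMap;
import java.util.Map;

public class KeyValueParser {

    // Split a body like "key1=value1;key2=value2" into a header map.
    // Segments without an '=' (or with an empty key) are skipped so a
    // malformed pair never fails the whole event.
    public static Map<String, String> parse(String body) {
        Map<String, String> headers = new HashMap<>();
        for (String pair : body.split(";")) {
            int eq = pair.indexOf('=');
            if (eq > 0) {
                headers.put(pair.substring(0, eq).trim(),
                            pair.substring(eq + 1).trim());
            }
        }
        return headers;
    }

    public static void main(String[] args) {
        Map<String, String> h =
            parse("key1=value1;key2=value2;key3=value3");
        System.out.println(h.get("key2")); // value2
    }
}
```

Because the keys are discovered at parse time rather than configured per service, one interceptor covers every producer that follows the agreed format; the trade-off is that a stray ';' or '=' inside a value will split incorrectly, so values containing the separators would need escaping or quoting in the logging convention itself.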