Yes, our central server would be Hadoop. Exactly how would this work with Flume? Would I write Avro to a file source which Flume would then ship over to one of our collectors, or is there a better/native way? Would I have to include the schema in each event? FYI, we would be doing this primarily from a Rails application.
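On the "schema in each event" question, it may help to see why Avro's binary encoding makes the schema mandatory on the read side but unnecessary inside each event: encoded records carry no field names or type tags, only values in schema order. Here is a minimal stdlib-only sketch of that encoding, following the Avro spec; the record layout ({"id": long, "name": string}) is illustrative, not from this thread. In practice you'd use the avro gem and an Avro data file or RPC, where the schema is written once (in the file header or handshake), not per event.

```python
def zigzag(n: int) -> int:
    """ZigZag-encode a signed 64-bit integer (used for Avro ints/longs)."""
    return (n << 1) ^ (n >> 63)

def encode_long(n: int) -> bytes:
    """Avro long: zigzag value, then variable-length base-128 bytes."""
    n = zigzag(n)
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_string(s: str) -> bytes:
    """Avro string: long length prefix followed by UTF-8 bytes."""
    raw = s.encode("utf-8")
    return encode_long(len(raw)) + raw

# An Avro record is just its fields concatenated in schema order --
# no field names, no type tags. That's why the reader must already
# have the writer's schema, and why it needn't ride along per event.
event = encode_long(42) + encode_string("click")
print(event)  # b'T\nclick' -- 7 bytes for {"id": 42, "name": "click"}
```

The compactness is the point: the schema lives once per file (or in a registry), and each event is only the raw values.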
Does anyone ever use Avro with a message bus like RabbitMQ?

On May 23, 2013, at 9:16 PM, Sean Busbey <[email protected]> wrote:

> Yep. Avro would be great at that (provided your central consumer is Avro
> friendly, like a Hadoop system). Make sure that all of your schemas have
> default values defined for fields so that schema evolution will be easier in
> the future.
>
>
> On Thu, May 23, 2013 at 4:29 PM, Mark <[email protected]> wrote:
> We're thinking about generating logs and events with Avro and shipping them
> to a central collector service via Flume. Is this a valid use case?
>
>
> --
> Sean
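For reference, a sketch of what Sean's defaults advice looks like in a schema (record and field names here are illustrative): any field added after the first version should carry a "default" so that readers using the new schema can still decode events written with the old one. A common pattern is a nullable union with a null default, where null must be the first branch of the union.

```json
{
  "type": "record",
  "name": "LogEvent",
  "namespace": "com.example.logs",
  "fields": [
    {"name": "message",   "type": "string"},
    {"name": "timestamp", "type": "long"},
    {"name": "host",      "type": ["null", "string"], "default": null}
  ]
}
```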
