Hi,

I have configured Flume to "tail -f" logs from my Varnish server, which are pretty much standard Apache HTTP logs. However, Flume sometimes chokes on special characters and dies, i.e. it stops processing new log entries.
See below for a stack trace.
It seems like this exact issue was reported as a Flume bug in the 1.4.x line:
https://issues.apache.org/jira/browse/FLUME-2052
and it was marked as resolved in 1.5.0. The version I am using is Flume 1.5.2, and I am still seeing this issue...
Could somebody confirm or deny whether what I am seeing is the same issue that should have been fixed, or is this something completely different?
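For reference, my understanding is that FLUME-2052 was resolved by adding a decodeErrorPolicy property to the spooling directory source. A sketch of what I believe the workaround configuration would look like (the agent name a1 is my assumption; the source name r1 and the spoolDir are taken from the error below):

```properties
# a1 is a placeholder agent name; r1 matches the source in the error message
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /data1/varnish-logs-active
a1.sources.r1.inputCharset = UTF-8
# FAIL (the default) aborts on undecodable bytes;
# REPLACE substitutes U+FFFD, IGNORE drops the bytes
a1.sources.r1.decodeErrorPolicy = REPLACE
```

If that property really is in 1.5.2, I would expect REPLACE to keep the source alive even if this turns out to be a different bug.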
Thank you!
Marina


06 Mar 2015 18:16:57,820 ERROR [pool-3-thread-1] (org.apache.flume.source.SpoolDirectorySource$SpoolDirectoryRunnable.run:256) - FATAL: Spool Directory source r1: { spoolDir: /data1/varnish-logs-active }: Uncaught exception in SpoolDirectorySource thread. Restart or reconfigure Flume to continue processing.

java.nio.charset.MalformedInputException: Input length = 1
	at java.nio.charset.CoderResult.throwException(CoderResult.java:260)
	at org.apache.flume.serialization.ResettableFileInputStream.readChar(ResettableFileInputStream.java:195)
	at org.apache.flume.serialization.LineDeserializer.readLine(LineDeserializer.java:134)
	at org.apache.flume.serialization.LineDeserializer.readEvent(LineDeserializer.java:72)
	at org.apache.flume.serialization.LineDeserializer.readEvents(LineDeserializer.java:91)
	at org.apache.flume.client.avro.ReliableSpoolingFileEventReader.readEvents(ReliableSpoolingFileEventReader.java:238)
	at org.apache.flume.source.SpoolDirectorySource$SpoolDirectoryRunnable.run(SpoolDirectorySource.java:227)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
	at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
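To illustrate what I think is happening, here is a minimal JDK-only sketch (no Flume classes; the class and method names are mine) showing how a strict UTF-8 decoder raises the same MalformedInputException on a stray non-UTF-8 byte, while a replacing decoder carries on:

```java
import java.nio.ByteBuffer;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.CharsetDecoder;
import java.nio.charset.CodingErrorAction;
import java.nio.charset.MalformedInputException;
import java.nio.charset.StandardCharsets;

public class DecodeDemo {

    // Strict decoding: malformed input is reported as an exception,
    // which is what the stack trace above shows.
    static String strictDecode(byte[] bytes) throws CharacterCodingException {
        CharsetDecoder dec = StandardCharsets.UTF_8.newDecoder()
                .onMalformedInput(CodingErrorAction.REPORT);
        return dec.decode(ByteBuffer.wrap(bytes)).toString();
    }

    // Lenient decoding: malformed bytes become U+FFFD and processing continues.
    static String lenientDecode(byte[] bytes) throws CharacterCodingException {
        CharsetDecoder dec = StandardCharsets.UTF_8.newDecoder()
                .onMalformedInput(CodingErrorAction.REPLACE);
        return dec.decode(ByteBuffer.wrap(bytes)).toString();
    }

    public static void main(String[] args) throws Exception {
        // 0xFF can never appear in valid UTF-8
        byte[] line = {'l', 'o', 'g', (byte) 0xFF};
        try {
            strictDecode(line);
        } catch (MalformedInputException e) {
            System.out.println("strict: " + e.getMessage()); // "Input length = 1"
        }
        System.out.println("lenient: " + lenientDecode(line)); // "log" + U+FFFD
    }
}
```

So a single bad byte in a log line is enough to trigger this, which matches how rarely it happens for us.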


