> >> b) It's unclear to me where the output data goes when you create a
> >> policy.
> >> E.g. say I have:
> >>
> >> from HDFS_AUDIT_LOG_ENRICHED_STREAM_SANDBOX[str:contains(src,'/hbase')]
> >> select * group by user insert into hdfs_audit_log_enriched_s
There is a data-preparation stage between the data source (the HDFS audit log)
and the Alert Engine. This stage runs in Storm and transforms the raw HDFS log
into something that can be alerted on.
The input for data preparation is the hdfs_audit_log_sandbox topic and the
output is hdfs_audit_log_enriched_sandbox.
The data flow you described should be correct.
But to be accurate, there are two stream-processing stages for HDFS log
monitoring.
Processing 1: data preparation, i.e. enriching the raw audit log. Here "enrich"
means adding extra information to the raw audit log.
The input is topic hdfs_audit_log_sandbox, and the output is topic
hdfs_audit_log_enriched_sandbox.
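To make that flow concrete, here is a minimal standalone sketch of what the
data-preparation stage does, written against the plain kafka-clients API
rather than Eagle's actual Storm topology. The topic names are the ones from
this thread; the broker address and the enrich() logic are placeholders I made
up for illustration:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AuditLogEnricher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "audit-log-enricher");
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Input topic for the data-preparation stage.
            consumer.subscribe(Collections.singletonList("hdfs_audit_log_sandbox"));
            while (true) {
                ConsumerRecords<String, String> records =
                    consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // "Enrich" = attach extra information to the raw log line.
                    String enriched = enrich(record.value());
                    // Output topic, which the alert policies then read from.
                    producer.send(new ProducerRecord<>(
                        "hdfs_audit_log_enriched_sandbox", record.key(), enriched));
                }
            }
        }
    }

    // Placeholder: real enrichment would join the event with extra metadata.
    private static String enrich(String rawAuditLog) {
        return rawAuditLog + " sensitivityType=UNKNOWN";
    }
}

In Eagle this stage runs as a Storm topology rather than a bare consumer loop,
but the topic-in/topic-out shape is the same.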
> > just repackaged it from the tar and it works fine for me.
> > can you please retry ?
> >
> > can somebody else confirm the issue ?
> >
> > Thanks
> > Jayesh
> >
> > On Tue, Aug 8, 2017 at 9:32 PM, Edward Zhang <yonzhang2...@apac
Thanks Jayesh.
I tried it; the build with "mvn clean compile -DskipTests" failed with the
following error:
[ERROR] Failed to execute goal on project alert-assembly: Could not resolve
dependencies for project org.apache.eagle:alert-assembly:jar:0.5.0: The
following artifacts could not be resolved:
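One guess from my side (an assumption, not verified): if alert-assembly
depends on sibling modules from the same source tree, compiling without
installing them first can leave those artifacts unresolved; running
"mvn clean install -DskipTests" from the top-level directory would put them
into the local repository first. I will try that next.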