More or less yes (it transforms SQL statements into Kafka Streams jobs).
But I simply call a REST endpoint that responds with Transfer-Encoding:
chunked, and the data comes in, as you can see in the linked image.
The only issue is that I get blank rows (although if I debug the code the
data is there), and when I stop the job every blank row gets filled with
the correct value.
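
For context, this is roughly what the streaming part does (a simplified
sketch, not the actual interpreter code: the endpoint path, request body,
and names like streamQuery are placeholders, and a plain java.io.OutputStream
stands in for the InterpreterOutputStream the interpreter writes to). The
chunked /query response can interleave blank keep-alive lines, which this
sketch skips; flushing after each row is what pushes the data to the
frontend incrementally.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ChunkedQuerySketch {

  // Streams the chunked /query response row by row into the output stream
  // the interpreter writes to, flushing after each row so the frontend
  // receives it immediately.
  public static void streamQuery(String ksqlServerUrl, String ksql, OutputStream out)
      throws Exception {
    HttpURLConnection conn =
        (HttpURLConnection) new URL(ksqlServerUrl + "/query").openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Content-Type", "application/vnd.ksql.v1+json; charset=utf-8");
    conn.setDoOutput(true);

    // Naive JSON body; real code should escape the statement properly.
    String body = "{\"ksql\":\"" + ksql + "\",\"streamsProperties\":{}}";
    try (OutputStream req = conn.getOutputStream()) {
      req.write(body.getBytes(StandardCharsets.UTF_8));
    }

    try (BufferedReader reader = new BufferedReader(
        new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
      String line;
      while ((line = reader.readLine()) != null) {
        // The chunked response interleaves blank keep-alive lines;
        // skip them so they do not end up as empty rows in the notebook.
        if (line.trim().isEmpty()) {
          continue;
        }
        out.write((line + "\n").getBytes(StandardCharsets.UTF_8));
        out.flush();
      }
    } finally {
      conn.disconnect();
    }
  }
}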


On Mon, Oct 7, 2019 at 08:06 Jeff Zhang <zjf...@gmail.com> wrote:

> I don't know much about KSQL. Is each query running a streaming job that
> never stops?
>
> Andrea Santurbano <sant...@gmail.com> wrote on Mon, Oct 7, 2019 at 4:09 AM:
>
> > Hi guys,
> > I'm building a KSQL interpreter. KSQL is the streaming SQL engine that
> > enables real-time data processing against Apache Kafka topics.
> > I created an interpreter that leverages the InterpreterOutputStream class
> > in order to stream the result from the backend to the frontend, but I get
> > the following representation in the frontend:
> >
> > https://imgur.com/a/J7unk25
> >
> > What am I doing wrong?
> >
> > Here is the Interpreter implementation:
> >
> >
> >
> https://github.com/conker84/zeppelin/blob/kafka/kafka/src/main/java/org/apache/zeppelin/kafka/KafkaKSQLInterpreter.java
> >
> > Thanks a lot
> >
> > Andrea
> >
>
>
> --
> Best Regards
>
> Jeff Zhang
>
