Yes, you should provide the input in the format: s3n://ukey:upass@bucketName/path
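For instance, assembling that URI from its parts (all credential and bucket values below are placeholders, not real ones):

```java
// Minimal sketch of the s3n URI format above. All values are placeholders.
public class S3nUri {
    public static String build(String accessKey, String secretKey,
                               String bucket, String path) {
        // Caution: if the secret key contains '/', it must be URL-encoded
        // (or the key regenerated), or the URI will not parse correctly.
        return String.format("s3n://%s:%s@%s/%s",
                             accessKey, secretKey, bucket, path);
    }

    public static void main(String[] args) {
        System.out.println(build("MYACCESSKEY", "MYSECRETKEY",
                                 "my-bucket", "input/data"));
        // → s3n://MYACCESSKEY:MYSECRETKEY@my-bucket/input/data
    }
}
```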
-Priyanka
On Thu, Dec 1, 2016 at 12:46 PM, Vishal Agrawal wrote:
> Thank you Priyanka for a quick response.
>
> I need to use an S3 bucket as my source of data. So do I need to give my
> S3 bucket path there?
>
>
> Thanks,
>
I'm really struggling to get this working. I am extending
AbstractKafkaSinglePortInputOperator to return a POJO. I read stuff from
Kafka and write to HDFS, pretty basic stuff.
The following example works fine:
KafkaSinglePortInputOperator in = dag.addOperator("kafkaIn", new
KafkaSinglePortInputOperator());
Unfortunately, we’re only seeing the initial connection refused exception that
I posted. So, if the issue is related to the buffer server as suggested below,
is that something we can mitigate? Do we need to allocate more memory to the
buffer servers?
“The exception is raised due to the buffer server […]”
I am attaching my whole project if that can help figure out what I am
missing.
On Thu, Dec 1, 2016 at 7:41 AM, Max Bridgewater
wrote:
> I'm really struggling to get this working. I am extending
> AbstractKafkaSinglePortInputOperator to return a POJO. I read stuff from
> Kafka and write to HDFS,
Is the exception reproducible? If yes, can you provide the full app master
log with debug level enabled?
Thank you,
Vlad
On 12/1/16 05:29, Feldkamp, Brandon (CONT) wrote:
Unfortunately, we’re only seeing the initial connection refused
exception that I posted. So, if the issue is related to the buffer server […]
Max,
The classes under *contrib/src/main/java/com/datatorrent/contrib/kafka* use
the old 0.8 Kafka API
whereas those under *kafka/src/main/java/org/apache/apex/malhar/kafka* use
the new 0.9 API.
*KafkaSinglePortInputOperator* (used in your working version) is in the
latter, whereas *AbstractKafkaSinglePortInputOperator* (the one you are
extending) is in the former.
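The package split above can be kept straight with a small compile-checkable reminder (a sketch only; the package and class names are the ones mentioned in this thread):

```java
// Summary of the package split described above. Importing from the wrong
// package silently gives you the other Kafka client API.
public class KafkaOperatorPackages {
    // Operators here are built on the old Kafka 0.8 consumer API:
    static final String OLD_API_PKG = "com.datatorrent.contrib.kafka";
    // Operators here are built on the new Kafka 0.9 consumer API:
    static final String NEW_API_PKG = "org.apache.apex.malhar.kafka";

    public static void main(String[] args) {
        // The working example's operator comes from the new package:
        System.out.println(NEW_API_PKG + ".KafkaSinglePortInputOperator");
    }
}
```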
Thanks Ram. I changed the class to the following, but it's still not working.
I'm not seeing anything useful in the logs; the app is not even deploying
successfully.
package com.example.myapexapp;
import org.apache.apex.malhar.kafka.AbstractKafkaInputOperator;
import org.apache.kafka.clients.consumer.ConsumerRecord;
I am trying to use the Malhar JSonParser. My code goes like this:
KafkaSinglePortInputOperator in = dag.addOperator("kafkaIn", new
KafkaSinglePortInputOperator());
in.setInitialOffset(
AbstractKafkaInputOperator.InitialOffset.EARLIEST.name());
JsonParser parser = dag.addOperator("jsonParser", new JsonParser());
Max,
Seems like you need to add the following dependency.
<dependency>
  <groupId>com.github.fge</groupId>
  <artifactId>json-schema-validator</artifactId>
  <version>2.0.1</version>
</dependency>
See the following example, but it is marked as optional there, so I'm not
sure if it is really required.
https://github.com/DataTorrent/examples/blob/master/tutorials/parser/pom
Hi Max,
You need "json-schema-validator" in your project pom as Ashwin suggested.
In the contrib project this dependency is marked as optional, so it won't be
included by default unless you include it explicitly from your application
pom.
Ashwin,
We should remove the optional tag from the examples project. Bu…
Hi Max,
Operators must be serializable with Kryo.
The issue is that KafkaTestUserPortStringInputOperator is not Kryo
serializable because of the mapper object.
Please change the following line
private final ObjectMapper mapper = new ObjectMapper();
to
private final transient ObjectMapper mapper = new ObjectMapper();
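To see why the transient modifier fixes the deployment failure, here is a self-contained sketch using plain Java serialization as a stand-in for Kryo (the principle is the same: non-serializable fields must be transient and re-created in setup()). FakeMapper is a hypothetical placeholder for Jackson's ObjectMapper:

```java
import java.io.*;

// Stand-in for a non-serializable member like Jackson's ObjectMapper.
class FakeMapper { }

// Sketch of an operator whose mapper field is transient. Without the
// transient modifier, checkpointing a non-serializable field would fail;
// with it, the field is simply skipped and must be re-created afterward.
class DemoOperator implements Serializable {
    transient FakeMapper mapper;

    // Apex calls setup() after deserialization; re-create the mapper there.
    void setup() {
        mapper = new FakeMapper();
    }

    // Serialize and deserialize, imitating a checkpoint/restore cycle.
    static DemoOperator roundTrip(DemoOperator op) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject(op);
        ObjectInputStream in =
            new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        return (DemoOperator) in.readObject();
    }

    public static void main(String[] args) throws Exception {
        DemoOperator op = new DemoOperator();
        op.setup();
        DemoOperator restored = roundTrip(op);   // succeeds: mapper is transient
        System.out.println(restored.mapper);     // null until setup() runs again
        restored.setup();
        System.out.println(restored.mapper != null);  // true
    }
}
```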