error:
type mismatch;
 found   : com.fasterxml.jackson.module.scala.DefaultScalaModule.type
 required: org.apache.flink.kubernetes.shaded.com.fasterxml.jackson.databind.Module
Is there any way I can fix this?
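One common way out, sketched below under the assumption that the mapper is only needed for the application's own JSON handling (not for Flink's internals), is to build a plain, unshaded Jackson ObjectMapper and register DefaultScalaModule there; the shaded `Module` type in `org.apache.flink.kubernetes.shaded` only accepts modules compiled against the shaded packages, which is why the registration above fails to typecheck.

```scala
// Sketch: keep an unshaded Jackson mapper for application-level JSON work
// instead of registering DefaultScalaModule into Flink's shaded Jackson.
// Requires the jackson-databind and jackson-module-scala artifacts on the
// classpath; names below are the standard ones from those libraries.
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule) // unshaded Module type, so this compiles
```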
regards.
--
Debasish Ghosh
http://manning.com/ghosh2
http://manning.com/ghosh
Twttr: @debasishg
Blog: http://debasishg.blogspot.com
Code: http://github.com/debasishg
l be in the same
consumer-group?
Thanks for any help.
regards.
ported thrown...
>>
>> Best,
>> tison.
>>
>>
>> Biao Liu wrote on Tue, Sep 24, 2019 at 10:34 AM:
>>
>>>
>>> > We submit the code through Kubernetes Flink Operator which uses the
>>> REST API to submit the job to the Job Manager
>>>
>
ubmit one
> job at a time or when multiple jobs are submitted at the same time? I'm
> asking this because I noticed that you used Future to execute the job
> without blocking. I guess ThreadLocal doesn't work well in this case.
>
> Regards,
> Dian
>
> On Sep 23, 2019, at 11:57 PM, Debasish Gho
our function which returns a Unit. It's not an
ExecutionGraph. It builds the DataStreams by reading from Kafka and then
finally writes to Kafka.
>
> Best,
> tison.
>
>
> Debasish Ghosh wrote on Mon, Sep 23, 2019 at 8:21 PM:
>
>> This is the complete stack trace which we get from execution on
>
Could you paste the full exception stack if it exists? It's
> difficult to figure out what's wrong with the current stack trace.
>
> Regards,
> Dian
>
> On Sep 23, 2019, at 6:55 PM, Debasish Ghosh wrote:
>
> Can it be the case that the threadLocal stuff in
> https://github.com/apache/flink
operator? Utils also selects the factory to create the context based on
either thread-local storage or a static mutable variable.
Can these be a source of problems in our case?
regards.
On Mon, Sep 23, 2019 at 3:58 PM Debasish Ghosh
wrote:
> ah .. Ok .. I get the Throwable part. I am us
I also think that the wrong StreamExecutionEnvironment is used.
>
> Regards,
> Dian
>
> [1]
> https://github.com/apache/flink/blob/e2728c0dddafcfe7fac0652084be6c7fd9714d85/flink-clients/src/main/java/org/apache/flink/client/program/OptimizerPlanEnvironment.java#L76
>
> On Sep 23, 2019, at 6:08 PM, Deb
you reached out to the FlinkK8sOperator team on Slack? They’re
>> usually pretty active on there.
>>
>> Here’s the link:
>>
>> https://join.slack.com/t/flinkk8soperator/shared_invite/enQtNzIxMjc5NDYxODkxLTEwMThmN2I0M2QwYjM3ZDljYTFhMGRiNDUzM2FjZGYzNTRjYWNmYTE1NzNlN
AM Debasish Ghosh
wrote:
> Thanks for the pointer .. I will try debugging. I am getting this
> exception running my application on Kubernetes using the Flink operator
> from Lyft.
>
> regards.
>
> On Sat, 21 Sep 2019 at 6:44 AM, Dian Fu wrote:
>
>> This excep
src/main/java/org/apache/flink/client/program/OptimizerPlanEnvironment.java#L76
> [2]
> https://github.com/apache/flink/blob/15a7f978a2e591451c194c41408a12e64d04e9ba/flink-streaming-java/src/main/java/org/apache/flink/streaming/api/environment/StreamPlanEnvironment.java#L57
>
> On Sep 21, 2019, at 4:14 AM, D
to such a stack trace? In my case it starts from
env.execute(..) but does not give any information as to what can go wrong.
Any help will be appreciated.
regards.
re an example of before and after of your classes
> for future reference?
>
> On Thu, 19 Sep 2019, 10:42 Debasish Ghosh,
> wrote:
>
>> We solved the problem of serialization by making some things transient
>> which were being captured as part of the closure. So we no lo
>
> Thanks,
> Biao /'bɪ.aʊ/
>
>
>
> On Thu, 19 Sep 2019 at 15:03, Debasish Ghosh
> wrote:
>
>> I think what you are pointing at is asynchronous datastream operations.
>> In our case we want to submit the entire job in a Future. Something like
>> the follow
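A minimal sketch of that idea, submitting the whole job inside a Future (the method and job names below are assumed, not from the thread):

```scala
// Sketch: run the blocking env.execute() off the submitting thread.
// Caveat from the discussion above: parts of Flink's client keep state in
// ThreadLocals, which the Future's worker thread will not see.
import org.apache.flink.api.common.JobExecutionResult
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

import scala.concurrent.{ExecutionContext, Future}

def submitAsync(env: StreamExecutionEnvironment)
               (implicit ec: ExecutionContext): Future[JobExecutionResult] =
  Future {
    // blocks this worker thread until the job finishes or fails
    env.execute("my-job")
  }
```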
>
> Thanks,
> Rafi
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/operators/asyncio.html#async-io-api
>
> On Wed, Sep 18, 2019 at 8:26 PM Debasish Ghosh
> wrote:
>
>> ok, the above problem was due to some serialization issues
Can anyone please help with an explanation?
regards.
On Wed, Sep 18, 2019 at 12:22 AM Debasish Ghosh
wrote:
> I think the issue may not be linked with Future. What happens is when this
> piece of code is executed ..
>
> val rides: DataStream[TaxiRide] =
> readStream(inTaxiRide)
>
referenced objects, for example, the outer class instance. If the outer
> class is not serializable, this error would happen.
>
> You could try moving that piece of code to a named, non-inner class.
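The suggestion above can be sketched like this (the `rideId` field on TaxiRide is assumed from the Flink training data types; adjust to the actual class):

```scala
// Sketch: a named top-level MapFunction captures nothing from an enclosing
// class, so closure serialization cannot drag in a non-serializable outer
// instance the way an anonymous lambda defined inside one can.
import org.apache.flink.api.common.functions.MapFunction

class RideToId extends MapFunction[TaxiRide, Long] {
  // only serializes this small object, not the class that created it
  override def map(ride: TaxiRide): Long = ride.rideId
}

// usage: rides.map(new RideToId)
```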
>
> Thanks,
> Biao /'bɪ.aʊ/
>
>
>
> On Tue, 17 Sep 2019 at 02
My main question is why serialisation kicks in when I try to execute within
a `Future` and not otherwise.
regards.
On Mon, 16 Sep 2019 at 4:46 PM, Debasish Ghosh
wrote:
> Yes, they are generated from an Avro schema and implement Serializable ..
>
> On Mon, Sep 16, 2019 at 4:40 PM Deep
Yes, they are generated from an Avro schema and implement Serializable ..
On Mon, Sep 16, 2019 at 4:40 PM Deepak Sharma wrote:
> Does TaxiRide or TaxiRideFare implement Serializable?
>
> On Mon, Sep 16, 2019 at 3:47 PM Debasish Ghosh
> wrote:
>
>> Hello -
>>
>
he.avro.Schema$Field
  at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
Any thoughts on why this may happen?
regards.
n
> release-1.9.1.
>
> Best,
> tison.
>
> [1] https://issues.apache.org/jira/browse/FLINK-12501
>
>
> Debasish Ghosh wrote on Mon, Sep 9, 2019 at 6:20 PM:
>
>> Thanks Kurt. I was just asking as it would help us a lot with the issue (
>> https://github.com/apache/fli
l: class Overrides
> [ERROR] location: class
> org.apache.flink.streaming.api.operators.StreamSink
>
> Has anyone run into this problem?
>
> --
> Best Regards,
> Yuval Itzchakov.
>
Hello -
Is there a plan for a Flink 1.9.1 release in the short term? We are using
Flink and Avro, with avrohugger generating Scala case classes from Avro
schemas. Hence we need https://github.com/apache/flink/pull/9565, which has
been closed recently.
regards.
Hello Aljoscha -
I made a comment on your PR (
https://github.com/apache/flink/pull/9565/files#r319598469). With the
suggested fix it runs fine .. Thanks.
regards.
On Fri, Aug 30, 2019 at 4:48 PM Debasish Ghosh
wrote:
> Thanks a lot .. sure I can do a build with this PR and check.
>
>
tFormat.java#L116>
> (compare
> <https://github.com/apache/avro/blob/master/lang/java/avro/src/main/java/org/apache/avro/specific/SpecificDatumReader.java#L37-L46>
> ).
Would love to know if there have been any plans towards fixing this issue ..
regards.
On Thu, Aug 29, 2019 at
> I'm asking just to double check, since from my understanding of the issue,
> the problem should have already existed before.
>
> Thanks,
> Gordon
>
> On Sun, May 12, 2019 at 3:53 PM Debasish Ghosh
> wrote:
>
>> Hello -
>>
>> Facing an issue with
s and a short- to mid-term goal
> of the project is to either remove or shade away these components so Java
> users have a pure Java dependency.
>
> Seth
>
> On Mon, Aug 26, 2019 at 11:59 AM Debasish Ghosh
> wrote:
>
>> actually the scala and java code are completely se
ks but not java, there might've been something
> to do with the implicit variable passing for your `readStream`, which is
> very tricky to mix with Java code. So I would avoid mixing them if possible.
>
> --
> Rong
>
> On Sun, Aug 25, 2019 at 11:10 PM Debasish Ghosh
> wro
Looks like using the following overload of
StreamExecutionEnvironment.addSource, which takes a TypeInformation as well,
does the trick ..

env.addSource(
  FlinkSource.collectionSourceFunction(data),
  TypeInformation.of(Data.class)
)
regards.
On Mon, Aug 26, 2019 at 11:24 AM Debasish Ghosh
oh .. and I am using Flink 1.8 ..
On Mon, Aug 26, 2019 at 12:09 AM Debasish Ghosh
wrote:
> Thanks for the feedback .. here are the details ..
>
> Just to give you some background, the original API is a Scala API, as follows
> ..
>
> final def readStream[In: TypeInformation: Des
is determined by its
> underlying transformation, so you cannot set it directly.
>
> Thanks,
> Rong
>
> On Sat, Aug 24, 2019 at 12:29 PM Debasish Ghosh
> wrote:
>
>> Thanks .. I tried this ..
>>
>> DataStream ins = readStream(in, Data.class, serdeD
ation);
> DataStream<Simple> simples = ins.map((Data d) -> new Simple(d.getName()))
>     .returns(new TypeHint<Simple>(){}.getTypeInfo());
>
> --
> Rong
>
> On Fri, Aug 23, 2019 at 9:55 AM Debasish Ghosh
> wrote:
>
>> Hello -
>>
>> I have the following call to addSource whe
same
exception. Now after adding the returns call, nothing changes.
Any help will be appreciated ..
regards.
considering isolating the classloader that
> contains Akka and Scala, so that applications and Flink can use different
> Akka versions.
>
> https://issues.apache.org/jira/browse/FLINK-10903
>
> Best,
> Haibo
>
> At 2019-07-25 00:07:27, "Debasish Ghosh" wrote
Also wanted to check if anyone has ventured into this exercise of shading
Akka in Flink ..
Is this something that qualifies as one of the roadmap items for Flink?
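For reference, shading Akka inside an application jar with sbt-assembly might look roughly like the hypothetical build fragment below. Note that Akka resolves class names out of its `reference.conf`, so a plain package rename is usually not sufficient on its own; config files need the same renaming applied.

```scala
// Hypothetical build.sbt fragment (sbt-assembly plugin assumed).
// Renames the application's Akka classes into a private namespace so they
// cannot clash with the Akka version bundled by Flink.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("akka.**" -> "myapp.shaded.akka.@1").inAll
)
```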
regards.
On Wed, Jul 24, 2019 at 3:44 PM Debasish Ghosh
wrote:
> Hi Haibo - Thanks for the clarification ..
>
> regards.
>
Hi Haibo - Thanks for the clarification ..
regards.
On Wed, Jul 24, 2019 at 2:58 PM Haibo Sun wrote:
> Hi Debasish Ghosh,
>
> I agree that Flink should shade its Akka.
>
> Maybe you misunderstood me. I mean, in the absence of official shading
> Akka in Flink, the relativel
issue.
>
>
> Haibo Sun 于2019年7月24日周三 下午4:07写道:
>
>> Hi, Debasish Ghosh
>>
>> I don't know why Akka isn't shaded; maybe it can be. Chesnay may be
>> able to answer that.
>> I recommend shading the Akka dependency of your application because it doesn't
ncommon math but also
> be curious why Flink doesn't shade all of its Akka dependencies...
>
> Best,
> tison.
>
>
> Debasish Ghosh 于2019年7月24日周三 下午3:15写道:
>
>> Hello Haibo -
>>
>> Yes, my application depends on Akka 2.5.
>> Just curious, why do you think
Hello Haibo -
Yes, my application depends on Akka 2.5.
Just curious, why do you think it's recommended to shade the Akka version of my
application instead of Flink's?
regards.
On Wed, Jul 24, 2019 at 12:42 PM Haibo Sun wrote:
> Hi Debasish Ghosh,
>
> Does your application have to depen
know Flink
1.9 has upgraded to Akka 2.5, but this is (I think) going to be a recurring
problem down the line, with mismatches between new releases of Akka and
Flink.
regards.
ase check them out and add a new one if they don't describe what you
> are looking for.
>
> Cheers,
> Fabian
>
> [1] https://issues.apache.org/jira/browse/FLINK-9679
> [2] https://issues.apache.org/jira/browse/FLINK-8378
>
> On Fri, Jun 7, 2019 at 6:01 PM,
Hello -
Is there any specific reason we have an AvroDeserializationSchema in Flink but
not an AvroSerializationSchema? Instead we have AvroRowSerializationSchema,
which serializes objects that are represented as (nested) Flink rows.
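For what it's worth, the missing serialization side can be hand-rolled for Avro specific records along these lines (an illustration only, not an API that ships with Flink):

```scala
// Sketch of a custom Avro SerializationSchema for SpecificRecord types.
// Requires the flink-core and avro artifacts on the classpath.
import java.io.ByteArrayOutputStream

import org.apache.avro.io.EncoderFactory
import org.apache.avro.specific.{SpecificDatumWriter, SpecificRecord}
import org.apache.flink.api.common.serialization.SerializationSchema

class AvroSpecificSerializationSchema[T <: SpecificRecord](clazz: Class[T])
    extends SerializationSchema[T] {

  // DatumWriter is not Serializable, so build it lazily on the task manager
  @transient private lazy val writer = new SpecificDatumWriter[T](clazz)

  override def serialize(element: T): Array[Byte] = {
    val out = new ByteArrayOutputStream()
    val encoder = EncoderFactory.get().binaryEncoder(out, null)
    writer.write(element, encoder)
    encoder.flush()
    out.toByteArray
  }
}
```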
regards.
--
Debasish Ghosh
http://manning.com/ghosh2
http
ble to stop it after some timeout. Or maybe if the job fails I
> would like to do some cleanups.
>
> What is the idiomatic way to design such APIs in Flink?
>
> regards.
would like to do some cleanups.
What is the idiomatic way to design such APIs in Flink?
regards.
.
Any help will be appreciated.
regards.
blem should have already existed before.
>
> Thanks,
> Gordon
>
> On Sun, May 12, 2019 at 3:53 PM Debasish Ghosh
> wrote:
>
>> Hello -
>>
>> Facing an issue with Avro serialization of Scala case classes generated
>> through avrohugger ..
>> Scala c
ow(EventTimeSessionWindows.withGap(joinGap))
> .apply(new JoinFunction(), new MyAvroTypeInfo<>(SomeAvro.class));
>
>
> Another benefit of this approach over the Kryo serializer option is that
> you would support state migration.
>
> Hope this helps,
> Rafi
>
>
>
dev/custom_serializers.html
>
> [2]
> https://ci.apache.org/projects/flink/flink-docs-release-1.8/api/java/org/apache/flink/api/common/typeinfo/TypeInfo.html
>
>
> On Mon, May 13, 2019 at 6:50 PM Debasish Ghosh
> wrote:
>
>> Hello -
>>
>> I am using Avro ba
to register such a custom serializer?
regards.
.RuntimeException: Serializing the source elements failed:
> avro.shaded.com.google.common.util.concurrent.UncheckedExecutionException:
> org.apache.avro.AvroRuntimeException: Not a Specific class: class
> pipelines.flink.avro.Data
Any help or workaround will be appreciated ..
regards.
ot;)
> - set "classloader.resolve-order" to "parent-first"
>
>
> On 28.02.2018 14:28, Debasish Ghosh wrote:
>
> Thanks for the suggestion. I copied the application jar to lib. That error
> is gone, but I get another err
ed in 1.4.2 which should
> be released within the next few days.
>
> As a temporary workaround you can copy your app-assembly-1.0.jar into the
> /lib directory.
>
>
> On 28.02.2018 08:45, Debasish Ghosh wrote:
>
> Hi -
>
> Facing a ClassNotFoundExceptio
val flinkScala          = "org.apache.flink" %% "flink-scala"           % "1.4.1" % "provided"
val flinkStreamingScala = "org.apache.flink" %% "flink-streaming-scala" % "1.4.1
the machine directly to get the stdout file, you'll
> find the output.
>
> On Thu, Feb 23, 2017 at 9:24 PM, Debasish Ghosh <ghosh.debas...@gmail.com>
> wrote:
>
>> Yes .. I was running Flink on a DC/OS cluster.
>>
>> AFAIR I checked the taskmanager log fro
etz...@apache.org> wrote:
> Hi Debashish,
>
> did you execute Flink in a distributed setting? print() will output the
> stream contents on stdout on the respective worker node (taskmanager), not
> on the machine that submitted the job.
>
> On Thu, Feb 23, 2017 at 5:41 PM,
ource(new FlinkKafkaConsumer010<>("test", new
>>> SimpleStringSchema(), properties));
>>>
>>> stream.map(s -> Integer.valueOf(s)).flatMap(flatMapper).returns(
>>>
>>> new TypeHint<Tuple2<Integer, Integer>>() {
>