Re: Table API in process function

2023-10-15 Thread Feng Jin
Hi Yashoda,

I don't think this is a reasonable approach, and it is not supported at the moment.

I suggest converting the DataStream produced by windowAll into a Table, and
then using the Table API:

AllWindowProcess -> ConvertDataStreamToTable -> ProcessUsingTableAPI
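
Something like this rough sketch (using a trivial String stream and an
inline window function as stand-ins for your actual types):

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.windowing.ProcessAllWindowFunction;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.util.Collector;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

// Stand-in input; replace with your real stream.
DataStream<String> events = env.fromElements("a", "b", "a");

// 1. AllWindowProcess: your existing ProcessAllWindowFunction logic.
DataStream<String> windowed = events
    .windowAll(TumblingProcessingTimeWindows.of(Time.seconds(10)))
    .process(new ProcessAllWindowFunction<String, String, TimeWindow>() {
        @Override
        public void process(Context ctx, Iterable<String> elements,
                            Collector<String> out) {
            elements.forEach(out::collect); // per-window processing goes here
        }
    });

// 2. ConvertDataStreamToTable: bridge the window output into the Table API.
Table table = tableEnv.fromDataStream(windowed).as("word");
tableEnv.createTemporaryView("windowed_events", table);

// 3. ProcessUsingTableAPI: run SQL / Table API over the window output.
Table enriched = tableEnv.sqlQuery(
    "SELECT word, COUNT(*) AS cnt FROM windowed_events GROUP BY word");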


Best,
Feng

On Fri, Oct 13, 2023 at 9:31 PM Yashoda Krishna T wrote:

> Is it possible to use the Table API inside a ProcessAllWindowFunction?
> Let's say the use case is that the process function should enrich each
> element by running SQL queries over all of the elements in the window
> using the Table API. Is this case supported in Flink? If not, what is the
> suggested way?
>
> Thanks
>


RE: File Source Watermark Issue

2023-10-15 Thread Kirti Dhar Upadhyay K via user
Hi Community,

Can someone help me here?

Regards,
Kirti Dhar

From: Kirti Dhar Upadhyay K
Sent: 10 October 2023 15:52
To: user@flink.apache.org
Subject: File Source Watermark Issue

Hi Team,

I am using the Flink File Source with a window aggregator as the process
function, and I am stuck on a weird issue.
The file source doesn't seem to emit/progress watermarks, whereas if I add
a delay (say 100 ms) while extracting the timestamp from an event, it works fine.

A similar issue is described in the comments here:
https://stackoverflow.com/questions/68736330/flink-watermark-not-advancing-at-all-stuck-at-9223372036854775808/68743019#68743019
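
For context, my setup looks roughly like the following (simplified; the
input path and the parseTimestamp extractor below are placeholders, not my
real code):

import java.time.Duration;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.file.src.FileSource;
import org.apache.flink.connector.file.src.reader.TextLineInputFormat;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

FileSource<String> source = FileSource
    .forRecordStreamFormat(new TextLineInputFormat(), new Path("/path/to/input"))
    .build();

DataStream<String> lines = env.fromSource(
    source,
    WatermarkStrategy
        .<String>forBoundedOutOfOrderness(Duration.ofSeconds(5))
        // Without an artificial ~100 ms sleep in this extractor the
        // watermark never advances; with the sleep it progresses normally.
        .withTimestampAssigner((line, recordTs) -> parseTimestamp(line)),
    "file-source");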

Can someone help me here?

Regards,
Kirti Dhar


Re: NPE in Calcite RelMetadataQueryBase

2023-10-15 Thread Jad Naous
I've now tracked this down to the fact that my application is trying to run
a query in a separate thread from the one that set up the table
environment. Is StreamTableEnvironment supposed to be thread-safe? Can it
be a singleton shared across multiple threads to serve queries, or should
each thread build its own?
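
In other words, I'm trying to choose between roughly these two patterns (a
sketch; the holder class and names are just mine for illustration):

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

class TableEnvHolder {
    // Pattern A: a single shared environment used by all query-serving threads.
    static final StreamTableEnvironment SHARED =
        StreamTableEnvironment.create(StreamExecutionEnvironment.getExecutionEnvironment());

    // Pattern B: one environment per thread.
    static final ThreadLocal<StreamTableEnvironment> PER_THREAD =
        ThreadLocal.withInitial(() ->
            StreamTableEnvironment.create(StreamExecutionEnvironment.getExecutionEnvironment()));
}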

Thanks!

Jad Naous 
Grepr, CEO/Founder



On Fri, Oct 13, 2023 at 5:42 PM Jad Naous wrote:

> Hi all,
>
> We're using the Table API to read in batch mode from an Iceberg table on
> S3, using DynamoDB as the catalog implementation. We're hitting an NPE in
> the Calcite planner. The same query works fine when using the local FS and
> the in-memory catalog. Below is a snipped stack trace. Any thoughts on how
> to troubleshoot this? Any help would be much appreciated!
> Thanks!
> Jad
>
> java.lang.NullPointerException: metadataHandlerProvider
> at java.base/java.util.Objects.requireNonNull(Objects.java:246)
> at org.apache.calcite.rel.metadata.RelMetadataQueryBase.getMetadataHandlerProvider(RelMetadataQueryBase.java:122)
> at org.apache.calcite.rel.metadata.RelMetadataQueryBase.revise(RelMetadataQueryBase.java:118)
> at org.apache.calcite.rel.metadata.RelMetadataQuery.collations(RelMetadataQuery.java:604)
> at org.apache.calcite.rel.metadata.RelMdCollation.project(RelMdCollation.java:291)
> at org.apache.calcite.rel.logical.LogicalProject.lambda$create$0(LogicalProject.java:125)
> at org.apache.calcite.plan.RelTraitSet.replaceIfs(RelTraitSet.java:244)
> at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:124)
> at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:114)
> at org.apache.calcite.rel.core.RelFactories$ProjectFactoryImpl.createProject(RelFactories.java:178)
> at org.apache.calcite.tools.RelBuilder.project_(RelBuilder.java:2191)
> at org.apache.calcite.tools.RelBuilder.project(RelBuilder.java:1970)
> at org.apache.flink.table.planner.plan.QueryOperationConverter$SingleRelVisitor.visit(QueryOperationConverter.java:165)
> at org.apache.flink.table.planner.plan.QueryOperationConverter$SingleRelVisitor.visit(QueryOperationConverter.java:158)
> at org.apache.flink.table.operations.ProjectQueryOperation.accept(ProjectQueryOperation.java:76)
> at org.apache.flink.table.planner.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:155)
> at org.apache.flink.table.planner.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:135)
> at org.apache.flink.table.operations.utils.QueryOperationDefaultVisitor.visit(QueryOperationDefaultVisitor.java:47)
> at org.apache.flink.table.operations.ProjectQueryOperation.accept(ProjectQueryOperation.java:76)
> at org.apache.flink.table.planner.plan.QueryOperationConverter.lambda$defaultMethod$0(QueryOperationConverter.java:154)
> at java.base/java.util.Collections$SingletonList.forEach(Collections.java:4856)
> at org.apache.flink.table.planner.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:154)
> at org.apache.flink.table.planner.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:135)
> at org.apache.flink.table.operations.utils.QueryOperationDefaultVisitor.visit(QueryOperationDefaultVisitor.java:72)
> at org.apache.flink.table.operations.FilterQueryOperation.accept(FilterQueryOperation.java:67)
> at org.apache.flink.table.planner.plan.QueryOperationConverter.lambda$defaultMethod$0(QueryOperationConverter.java:154)
> at java.base/java.util.Collections$SingletonList.forEach(Collections.java:4856)
> at org.apache.flink.table.planner.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:154)
> at org.apache.flink.table.planner.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:135)
> at org.apache.flink.table.operations.utils.QueryOperationDefaultVisitor.visit(QueryOperationDefaultVisitor.java:82)
> at org.apache.flink.table.operations.SortQueryOperation.accept(SortQueryOperation.java:93)
> at org.apache.flink.table.planner.calcite.FlinkRelBuilder.queryOperation(FlinkRelBuilder.java:261)
> at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:224)
> at org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:194)
> at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
> at scala.collection.Iterator.foreach(Iterator.scala:937)
> at scala.collection.Iterator.foreach$(Iterator.scala:937)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
> at scala.collection.IterableLike.foreach(IterableLike.scala:70)
> at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> at scala.collection.TraversableLike.map(TraversableLike.scala:233)
> at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
> at scala.collection.Ab