HeartSaVioR commented on issue #1914:
URL: https://github.com/apache/iceberg/issues/1914#issuecomment-744771533


   I don't know much about Flink support in Iceberg. @openinx can probably give 
more information here.
   
   From Spark 3.0+ you can set up a custom catalog for Iceberg and run CREATE 
TABLE against that catalog, and partitioning by hours is supported in the CREATE 
TABLE syntax (see the sketch below). I don't know whether Flink includes this as 
part of its CREATE TABLE syntax, though. The ability to write to the partitioned 
table is a completely different story, so you may want to check whether you can 
write to a table partitioned by hours from Flink. (I guess you can, but it's 
worth double-checking.)
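
   For reference, a minimal sketch of the Spark 3 side; the catalog name, 
database/table names, and connection settings below are illustrative, not from 
this issue:

   ```sql
   -- Catalog registration goes into Spark config (e.g. spark-defaults.conf or --conf):
   --   spark.sql.catalog.my_catalog           = org.apache.iceberg.spark.SparkCatalog
   --   spark.sql.catalog.my_catalog.type      = hadoop
   --   spark.sql.catalog.my_catalog.warehouse = hdfs://nn:8020/warehouse/path

   -- Hidden partitioning by hour on the event timestamp:
   CREATE TABLE my_catalog.db.events (
     id   BIGINT,
     data STRING,
     ts   TIMESTAMP
   ) USING iceberg
   PARTITIONED BY (hours(ts));
   ```

   To check the write path from Flink, I'd expect something like the following 
to work, assuming the Iceberg Flink runtime jar is on the classpath and the same 
Hadoop catalog is used (again, all names and paths are placeholders):

   ```sql
   CREATE CATALOG my_catalog WITH (
     'type' = 'iceberg',
     'catalog-type' = 'hadoop',
     'warehouse' = 'hdfs://nn:8020/warehouse/path'
   );

   -- Write into the table created from Spark; Iceberg routes rows to hourly
   -- partitions based on ts, so no partition clause is needed here.
   INSERT INTO my_catalog.db.events VALUES (1, 'a', TIMESTAMP '2020-12-15 01:23:45');
   ```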
   
   I don't know much about Flink itself, but it sounds odd that the Flink SQL CLI 
(you meant the CLI, not Flink SQL itself, right?) would only work with a 
standalone cluster. I'd expect the Flink SQL CLI to submit a job to the cluster, 
receive the output, and print it, but I can't say for sure.

