As a rule of thumb in Beam, all IO connectors (and other transforms) are 
supported by default by all runners written in the same SDK, thanks to the 
Beam model.

Since both SnowflakeIO and FlinkRunner are written natively in Java, the 
answer to your question is YES, SnowflakeIO should be supported by 
FlinkRunner. Just add the FlinkRunner dependency to your project and specify 
this runner when running your pipeline (see the “Classic (Java)” examples 
here [1]).

Another option, in case you need to use an IO connector from one SDK in a 
pipeline written in another SDK, is a cross-language pipeline. In that case, 
both the IO connector and the runner have to be portable, and the pipeline 
can then be run using the Beam Portability framework (for example, a Python 
pipeline with Java IO connectors). However, you don’t need that for your use 
case (SnowflakeIO with FlinkRunner) if you use only the Java SDK for your 
Beam pipeline.

You can find more details on supported IO connectors and SDKs here [2].

—
Alexey

[1] https://beam.apache.org/documentation/runners/flink/
[2] https://beam.apache.org/documentation/io/connectors/


> On 13 Oct 2023, at 03:40, mybeam <mybeam...@gmail.com> wrote:
> 
> Hello,
> 
> As per the document on 
> https://beam.apache.org/documentation/io/built-in/snowflake/#running-main-command-with-pipeline-options,
>  I can only see DirectRunner and DataflowRunner, and I have tested the 
> DirectRunner which is working fine. Just wonder if it supports Flink 
> officially. Any comments are welcomed. Thanks.
> 
> 
