Congratulations! Well deserved!




On 07/16/2021 19:50, wangxianghu <wxhj...@126.com> wrote:
Congratulations! Well deserved!

On July 16, 2021, at 6:52 PM, vino yang <vinoy...@apache.org> wrote:

Congratulations to both of you! Well deserved!

Best,
Vino

On Friday, July 16, 2021, at 6:38 PM, leesf <leesf0...@gmail.com> wrote:
Hi all,

Please join me in congratulating our newest committers Pengzhiwei and DannyChan.

Pengzhiwei has been a consistent contributor to Hudi. He has contributed 
numerous features, such as Spark SQL integration with Hudi, a Spark Structured 
Streaming source for Hudi, and a Spark FileIndex for Hudi, along with many 
other good contributions around Spark, and he is also very active in answering 
users' questions. He is a solid team player and an asset to the project.

DannyChan has contributed many good features, such as the new streaming write 
pipeline for Flink with automatic compaction and cleaning (COW and MOR), batch 
and streaming readers for Flink (COW and MOR), and support for Flink SQL 
connectors (reader and writer). He actively joins the ML and answers users' 
questions, and he also wrote a Hudi Flink integration guide and launched a live 
show to promote the Hudi Flink integration for Chinese users.

Thanks so much for your continued contributions to make Hudi better and better!

I would also like to introduce the current state of Hudi in China. Hudi has 
become more and more popular in China with the help of all community members 
and has been adopted by almost all top companies there, including Alibaba, 
Baidu, ByteDance, Huawei, Tencent and others, from startups to large companies, 
with data scales ranging from TB to PB. You can find the logo wall below (PS: 
unofficial statistics, just listing some of them; you can contact me to add 
your company logo if you want).

We could not have achieved this without such a good community and the 
contributions of all community members. Cheers and Go!



Thanks,
Leesf
