hapihu opened a new pull request #16878:
URL: https://github.com/apache/flink/pull/16878


   There are some typos in these modules.

   ```bash
   # Use the codespell tool to check for typos.
   pip install codespell

   codespell -h
   ```
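
   For reference, a minimal sketch of this kind of invocation (`-S`/`--skip` is a standard codespell option; the module path and patterns below are only illustrative):

   ```bash
   # Illustrative only: check one module while skipping generated/non-source files.
   # -S/--skip takes a comma-separated list of glob patterns to exclude.
   codespell flink-core/ -S '*.xml,*.iml,*.txt'
   ```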
   
   1. `codespell flink-java/src`
   
   ```bash
   flink-java/src/main/java/org/apache/flink/api/java/operators/PartitionOperator.java:125: partioning ==> partitioning
   flink-java/src/main/java/org/apache/flink/api/java/operators/PartitionOperator.java:128: neccessary ==> necessary
   ```
   2. `codespell flink-clients/`
   
   ```bash
   flink-clients/src/test/java/org/apache/flink/client/program/DefaultPackagedProgramRetrieverTest.java:545: acessible ==> accessible
   ```
   
   3. `codespell flink-connectors/ -S '.xml' -S '.iml' -S '*.txt'`
   
   ```bash
   flink-connectors/flink-connector-base/src/main/java/org/apache/flink/connector/base/source/reader/SourceReaderOptions.java:25: tht ==> that
   flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/catalog/PostgresCatalogTestBase.java:192: doens't ==> doesn't
   flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/dialect/JdbcDialect.java:96: PostgresSQL ==> postgresql
   flink-connectors/flink-connector-cassandra/src/test/java/org/apache/flink/batch/connectors/cassandra/CustomCassandraAnnotatedPojo.java:38: instanciation ==> instantiation
   flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/HiveParserCalcitePlanner.java:822: partion ==> partition
   flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/HiveParserTypeCheckProcFactory.java:943: funtion ==> function
   flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/copy/HiveASTParseDriver.java:55: funtion ==> function
   flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/copy/HiveASTParseDriver.java:51: characteres ==> characters
   flink-connectors/flink-connector-kinesis/src/main/java/org/apache/flink/streaming/connectors/kinesis/util/KinesisConfigUtil.java:436: paremeters ==> parameters
   flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/copy/HiveParserBaseSemanticAnalyzer.java:2369: Unkown ==> Unknown
   flink-connectors/flink-connector-kinesis/src/main/java/org/apache/flink/streaming/connectors/kinesis/config/ConsumerConfigConstants.java:75: reprsents ==> represents
   flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/functions/hive/HiveFunctionWrapper.java:28: functino ==> function
   flink-connectors/flink-connector-hbase-2.2/src/main/java/org/apache/flink/connector/hbase2/source/HBaseRowDataAsyncLookupFunction.java:62: implemenation
   flink-connectors/flink-connector-pulsar/src/main/java/org/apache/flink/connector/pulsar/source/enumerator/cursor/StartCursor.java:70: ture ==> true
   flink-connectors/flink-connector-pulsar/src/test/resources/containers/txnStandalone.conf:907: partions ==> partitions
   flink-connectors/flink-connector-pulsar/src/test/resources/containers/txnStandalone.conf:468: implementatation ==> implementation
   flink-connectors/flink-connector-files/src/main/java/org/apache/flink/connector/file/src/enumerate/BlockSplittingRecursiveEnumerator.java:141: bloc ==> block
   flink-connectors/flink-connector-files/src/main/java/org/apache/flink/connector/file/src/reader/SimpleStreamFormat.java:37: te ==> the
   flink-connectors/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/source/reader/deserializer/KafkaRecordDeserializationSchema.java:70: determin ==> determine
   flink-connectors/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/source/enumerator/subscriber/TopicListSubscriber.java:36: hav ==> have
   flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/copy/HiveParserQBSubQuery.java:555: correlatd ==> correlated
   flink-connectors/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBaseTest.java:263: intial ==> initial
   flink-connectors/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBaseTest.java:302: intial ==> initial
   flink-connectors/flink-connector-kinesis/src/main/java/org/apache/flink/streaming/connectors/kinesis/internals/KinesisDataFetcher.java:249: wth ==> with
   flink-connectors/flink-connector-files/src/test/java/org/apache/flink/connector/file/sink/writer/FileWriterBucketStateSerializerMigrationTest.java:232: comitted ==> committed
   flink-connectors/flink-connector-rabbitmq/src/main/java/org/apache/flink/streaming/connectors/rabbitmq/RMQDeserializationSchema.java:96: invokation ==> invocation
   flink-connectors/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaConsumerTestBase.java:171: doesnt ==> doesn't
   ```
    
   
   4. `codespell flink-core/ -S '.xml' -S '.iml' -S '*.txt'`
   
    
   ```bash
   flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerConfigSnapshot.java:194: preform ==> perform
   flink-core/src/main/java/org/apache/flink/configuration/CheckpointingOptions.java:150: aways
   flink-core/src/main/java/org/apache/flink/configuration/ConfigOption.java:164: documention ==> documentation
   flink-core/src/main/java/org/apache/flink/configuration/ConfigOption.java:175: documention ==> documentation
   flink-core/src/main/java/org/apache/flink/core/fs/RecoverableFsDataOutputStream.java:54: retured ==> returned
   flink-core/src/test/java/org/apache/flink/api/common/io/DelimitedInputFormatTest.java:352: skipp ==> skip
   flink-core/src/main/java/org/apache/flink/util/InstantiationUtil.java:216: occurences ==> occurrences
   ```
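
   If it is useful for review: codespell can also write unambiguous corrections in place via its `-w`/`--write-changes` flag, optionally confirming each fix with `-i 3`. A rough sketch (module path and skip patterns are illustrative):

   ```bash
   # Sketch: apply corrections in place, prompting before each change.
   # -w/--write-changes edits files directly; -i 3 asks for confirmation
   # and lets you pick a fix when codespell offers more than one.
   codespell -w -i 3 flink-core/ -S '*.xml,*.iml,*.txt'
   ```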
    
   
    

