wuguihu created FLINK-23859:
-------------------------------
Summary: [typo][flink-core][flink-connectors] Fix typos in code
Key: FLINK-23859
URL: https://issues.apache.org/jira/browse/FLINK-23859
Project: Flink
Issue Type: Bug
Reporter: wuguihu
There are a number of typos in these modules.
{code:bash}
# Use the codespell tool to check for typos.
pip install codespell
codespell -h
{code}
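If it helps, codespell can also write its suggested corrections back to the files. The command below is only a sketch of how the fixes in this report could be applied in one pass; the module list and skip patterns simply mirror the per-module commands in the sections that follow, and every change should still be reviewed before committing.
{code:bash}
# Apply the suggested corrections in place (-w / --write-changes),
# skipping generated and non-source files as in the per-module commands below.
codespell -w flink-java/src flink-clients flink-connectors flink-core \
    -S '*.xml' -S '*.iml' -S '*.txt'
{code}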
1. codespell flink-java/src
{code:java}
flink-java/src/main/java/org/apache/flink/api/java/operators/PartitionOperator.java:125: partioning ==> partitioning
flink-java/src/main/java/org/apache/flink/api/java/operators/PartitionOperator.java:128: neccessary ==> necessary
{code}
2. codespell flink-clients/
{code:java}
flink-clients/src/test/java/org/apache/flink/client/program/DefaultPackagedProgramRetrieverTest.java:545: acessible ==> accessible
{code}
3. codespell flink-connectors/ -S '*.xml' -S '*.iml' -S '*.txt'
{code:java}
flink-connectors/flink-connector-base/src/main/java/org/apache/flink/connector/base/source/reader/SourceReaderOptions.java:25: tht ==> that
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/catalog/PostgresCatalogTestBase.java:192: doens't ==> doesn't
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/dialect/JdbcDialect.java:96: PostgresSQL ==> postgresql
flink-connectors/flink-connector-cassandra/src/test/java/org/apache/flink/batch/connectors/cassandra/CustomCassandraAnnotatedPojo.java:38: instanciation ==> instantiation
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/HiveParserCalcitePlanner.java:822: partion ==> partition
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/HiveParserTypeCheckProcFactory.java:943: funtion ==> function
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/copy/HiveASTParseDriver.java:55: funtion ==> function
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/copy/HiveASTParseDriver.java:51: characteres ==> characters
flink-connectors/flink-connector-kinesis/src/main/java/org/apache/flink/streaming/connectors/kinesis/util/KinesisConfigUtil.java:436: paremeters ==> parameters
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/copy/HiveParserBaseSemanticAnalyzer.java:2369: Unkown ==> Unknown
flink-connectors/flink-connector-kinesis/src/main/java/org/apache/flink/streaming/connectors/kinesis/config/ConsumerConfigConstants.java:75: reprsents ==> represents
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/functions/hive/HiveFunctionWrapper.java:28: functino ==> function
flink-connectors/flink-connector-hbase-2.2/src/main/java/org/apache/flink/connector/hbase2/source/HBaseRowDataAsyncLookupFunction.java:62: implemenation ==> implementation
flink-connectors/flink-connector-pulsar/src/main/java/org/apache/flink/connector/pulsar/source/enumerator/cursor/StartCursor.java:70: ture ==> true
flink-connectors/flink-connector-pulsar/src/test/resources/containers/txnStandalone.conf:907: partions ==> partitions
flink-connectors/flink-connector-pulsar/src/test/resources/containers/txnStandalone.conf:468: implementatation ==> implementation
flink-connectors/flink-connector-files/src/main/java/org/apache/flink/connector/file/src/enumerate/BlockSplittingRecursiveEnumerator.java:141: bloc ==> block
flink-connectors/flink-connector-files/src/main/java/org/apache/flink/connector/file/src/reader/SimpleStreamFormat.java:37: te ==> the
flink-connectors/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/source/reader/deserializer/KafkaRecordDeserializationSchema.java:70: determin ==> determine
flink-connectors/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/source/enumerator/subscriber/TopicListSubscriber.java:36: hav ==> have
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/copy/HiveParserQBSubQuery.java:555: correlatd ==> correlated
flink-connectors/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBaseTest.java:263: intial ==> initial
flink-connectors/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBaseTest.java:302: intial ==> initial
flink-connectors/flink-connector-kinesis/src/main/java/org/apache/flink/streaming/connectors/kinesis/internals/KinesisDataFetcher.java:249: wth ==> with
flink-connectors/flink-connector-files/src/test/java/org/apache/flink/connector/file/sink/writer/FileWriterBucketStateSerializerMigrationTest.java:232: comitted ==> committed
flink-connectors/flink-connector-rabbitmq/src/main/java/org/apache/flink/streaming/connectors/rabbitmq/RMQDeserializationSchema.java:96: invokation ==> invocation
flink-connectors/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/KafkaConsumerTestBase.java:171: doesnt ==> doesn't
{code}
4. codespell flink-core/ -S '*.xml' -S '*.iml' -S '*.txt'
{code:bash}
flink-core/src/main/java/org/apache/flink/api/common/typeutils/TypeSerializerConfigSnapshot.java:194: preform ==> perform
flink-core/src/main/java/org/apache/flink/configuration/CheckpointingOptions.java:150: aways ==> always
flink-core/src/main/java/org/apache/flink/configuration/ConfigOption.java:164: documention ==> documentation
flink-core/src/main/java/org/apache/flink/configuration/ConfigOption.java:175: documention ==> documentation
flink-core/src/main/java/org/apache/flink/core/fs/RecoverableFsDataOutputStream.java:54: retured ==> returned
flink-core/src/test/java/org/apache/flink/api/common/io/DelimitedInputFormatTest.java:352: skipp ==> skip
flink-core/src/main/java/org/apache/flink/util/InstantiationUtil.java:216: occurences ==> occurrences
{code}
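Note that codespell occasionally flags identifiers or abbreviations that are intentional. Rather than "fixing" those, they can be excluded with an ignore list (-L / --ignore-words-list); the words in the example below are placeholders only, not findings from this report.
{code:bash}
# Ignore words that are spelled intentionally in this code base.
# "ans" and "nd" are placeholder examples, not actual findings.
codespell flink-core/ -S '*.xml' -S '*.iml' -S '*.txt' -L ans,nd
{code}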