daipom commented on issue #2863:
URL: https://github.com/apache/arrow-adbc/issues/2863#issuecomment-2944946411

   Remaining typos:
   
   ```
   .github/workflows/r-extended.yml:92: verions ==> versions
   CHANGELOG.md:350: Inital ==> Initial
   CHANGELOG.md:693: consisent ==> consistent
   csharp/src/Apache.Arrow.Adbc/AdbcConnection11.cs:356: statistcs ==> statistics
   csharp/src/Apache.Arrow.Adbc/C/CAdbcConnection.cs:37: unintialized ==> uninitialized
   csharp/src/Apache.Arrow.Adbc/C/CAdbcDatabase.cs:39: unintialized ==> uninitialized
   csharp/src/Apache.Arrow.Adbc/C/CAdbcDriver.cs:40: unintialized ==> uninitialized
   csharp/src/Apache.Arrow.Adbc/C/CAdbcDriver.cs:47: unintialized ==> uninitialized
   csharp/src/Apache.Arrow.Adbc/C/CAdbcPartitions.cs:63: unintialized ==> uninitialized
   csharp/src/Apache.Arrow.Adbc/C/CAdbcStatement.cs:47: unintialized ==> uninitialized
   csharp/src/Client/AdbcConnection.cs:45: Intializes ==> Initializes
   csharp/src/Client/AdbcConnection.cs:56: Intializes ==> Initializes
   csharp/src/Drivers/Apache/Hive2/DecimalUtility.cs:172: fron ==> from, front
   csharp/src/Drivers/Apache/Hive2/README.md:70: specifed ==> specified
   csharp/src/Drivers/Apache/Hive2/SqlTypeNameParser.cs:140: defintion ==> definition
   csharp/src/Drivers/Apache/Hive2/SqlTypeNameParser.cs:153: defintion ==> definition
   csharp/src/Drivers/Apache/Hive2/SqlTypeNameParser.cs:35: defintion ==> definition
   csharp/src/Drivers/Apache/Hive2/SqlTypeNameParser.cs:490: STUCT ==> STRUCT
   csharp/src/Drivers/Apache/Impala/README.md:106: standar ==> standard
   csharp/src/Drivers/Apache/Impala/README.md:68: specifed ==> specified
   csharp/src/Drivers/Apache/Spark/README.md:64: specifed ==> specified
   csharp/src/Drivers/Apache/Thrift/BitmapUtilities.cs:53: remaing ==> remaining
   csharp/src/Drivers/Databricks/DatabricksStatement.cs:45: instad ==> instead
   csharp/test/Apache.Arrow.Adbc.Tests/TestBase.cs:124: Ditionary ==> Dictionary
   csharp/test/Apache.Arrow.Adbc.Tests/TestBase.cs:198: formated ==> formatted
   csharp/test/Apache.Arrow.Adbc.Tests/TestBase.cs:214: formated ==> formatted
   csharp/test/Apache.Arrow.Adbc.Tests/TestBase.cs:230: formated ==> formatted
   csharp/test/Apache.Arrow.Adbc.Tests/TestBase.cs:307: statment ==> statement
   csharp/test/Apache.Arrow.Adbc.Tests/TestBase.cs:62: ouput ==> output
   csharp/test/Drivers/Apache/Common/ClientTests.cs:214: timout ==> timeout
   csharp/test/Drivers/Apache/Common/DateTimeValueTests.cs:61: Timstamp ==> Timestamp
   csharp/test/Drivers/Apache/Common/DateTimeValueTests.cs:83: Timstamp ==> Timestamp
   csharp/test/Drivers/Apache/Impala/Resources/impalaconfig.json:8: schem ==> scheme
   csharp/test/Drivers/Apache/Spark/SparkConnectionTest.cs:48: exeption ==> exception, exemption
   csharp/test/Drivers/Apache/Spark/SparkConnectionTest.cs:49: exeption ==> exception, exemption
   csharp/test/Drivers/BigQuery/AuthenticationTests.cs:65: behvior ==> behavior
   csharp/test/Drivers/Databricks/DatabricksConnectionTest.cs:55: exeption ==> exception, exemption
   csharp/test/Drivers/Databricks/DatabricksConnectionTest.cs:56: exeption ==> exception, exemption
   csharp/test/Drivers/Databricks/DateTimeValueTests.cs:42: Timstamp ==> Timestamp
   csharp/test/Drivers/Databricks/StatementTests.cs:437: mintues ==> minutes
   csharp/test/Drivers/Interop/FlightSql/DriverTests.cs:46: caseSenstive ==> case-sensitive
   csharp/test/Drivers/Interop/FlightSql/DriverTests.cs:55: caseSenstive ==> case-sensitive
   csharp/test/Drivers/Interop/FlightSql/readme.md:36: compatibilty ==> compatibility
   csharp/test/Drivers/Interop/FlightSql/readme.md:62: resutls ==> results
   csharp/test/Drivers/Interop/Snowflake/SnowflakeTestingUtils.cs:169: occured ==> occurred
   csharp/test/Drivers/Interop/Snowflake/SnowflakeTestingUtils.cs:169: resouce ==> resource
   dev/bench/README.md:42: passsword ==> password
   dev/release/verify-release-candidate.sh:950: enviroment ==> environment
   go/adbc/adbc.go:647: returnes ==> returns
   go/adbc/driver/bigquery/statement.go:572: Unparseable ==> Unparsable
   go/adbc/driver/flightsql/flightsql_statement.go:429: returnes ==> returns
   go/adbc/driver/internal/driverbase/connection.go:372: implementor ==> implementer
   go/adbc/driver/internal/driverbase/rotating_file_writer.go:260: exising ==> existing
   go/adbc/driver/internal/driverbase/rotating_file_writer.go:293: lexigraphically ==> lexicographically
   go/adbc/driver/snowflake/bulk_ingestion.go:491: propogate ==> propagate
   go/adbc/driver/snowflake/bulk_ingestion.go:561: recieved ==> received
   go/adbc/driver/snowflake/bulk_ingestion.go:88: bandwith ==> bandwidth
   go/adbc/driver/snowflake/driver_test.go:2075: exepected ==> expected
   go/adbc/driver/snowflake/driver_test.go:2093: exepected ==> expected
   go/adbc/driver/snowflake/statement.go:470: returnes ==> returns
   go/adbc/ext.go:68: Implementors ==> Implementers
   r/adbcdrivermanager/src/options.cc:106: suppported ==> supported
   r/adbcdrivermanager/src/options.cc:127: suppported ==> supported
   r/adbcdrivermanager/src/options.cc:84: suppported ==> supported
   rust/core/src/driver_manager.rs:157: droped ==> dropped
   rust/core/src/ffi/types.rs:116: unintialized ==> uninitialized
   rust/core/src/ffi/types.rs:129: unintialized ==> uninitialized
   rust/core/src/ffi/types.rs:133: unintialized ==> uninitialized
   rust/core/src/ffi/types.rs:65: unintialized ==> uninitialized
   rust/core/src/ffi/types.rs:673: transfered ==> transferred
   rust/core/src/ffi/types.rs:77: unintialized ==> uninitialized
   rust/core/src/ffi/types.rs:89: unintialized ==> uninitialized
   rust/core/src/lib.rs:56: langages ==> languages
   rust/driver/dummy/tests/driver_exporter_dummy.rs:19: trough ==> through
   rust/driver/snowflake/src/duration.rs:49: eror ==> error
   rust/README.md:51: trough ==> through
   ```
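
   The list above is in the `path:line: typo ==> suggestion` format (codespell-style). As a rough aid, not anything in the repo, here is a minimal sketch of applying such a report mechanically; the script name and report file name are made up. Entries with more than one suggestion (e.g. `fron ==> from, front`) take only the first candidate, so those should still be reviewed by hand.

   ```python
   #!/usr/bin/env python3
   """Apply a codespell-style report of the form `path:line: typo ==> fix[, alt...]` in place."""
   import re
   import sys

   # One report entry per line; the paths in this report contain no colons.
   ENTRY = re.compile(r"^(?P<path>[^:]+):(?P<lineno>\d+): (?P<typo>\S+) ==> (?P<fix>[^,\n]+)")


   def apply_fix(path: str, lineno: int, typo: str, fix: str) -> bool:
       """Replace the first occurrence of `typo` on the reported line, if it is still there."""
       with open(path, encoding="utf-8") as f:
           lines = f.readlines()
       idx = lineno - 1
       if idx >= len(lines) or typo not in lines[idx]:
           return False  # line shifted or already fixed since the report was generated
       lines[idx] = lines[idx].replace(typo, fix, 1)
       with open(path, "w", encoding="utf-8") as f:
           f.writelines(lines)
       return True


   def main(report_path: str) -> None:
       with open(report_path, encoding="utf-8") as f:
           for raw in f:
               m = ENTRY.match(raw.strip())
               if not m:
                   continue
               ok = apply_fix(m["path"], int(m["lineno"]), m["typo"], m["fix"].strip())
               print("fixed" if ok else "skipped", raw.strip())


   if __name__ == "__main__":
       main(sys.argv[1])
   ```

   Run from the repository root with something like `python3 apply_typo_fixes.py remaining.txt`, where `remaining.txt` holds the block above.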

