baibaichen opened a new pull request, #11719:
URL: https://github.com/apache/incubator-gluten/pull/11719
## What changes were proposed in this pull request?
Add Parquet type widening support to Velox and enable 80 of 84 tests in
`GlutenParquetTypeWideningSuite`.
### Changes
1. **Point Velox to the type-widening branch** (`get-velox.sh`):
Use the `baibaichen/pr3/parquet-type-widening` Velox branch, which adds
INT→Decimal, INT→Double, and Float→Double widening support.
2. **Update VeloxTestSettings** (`spark40 + spark41`):
Remove 15 excludes for widening tests that now pass.
3. **Disable the native writer** (`GlutenParquetTypeWideningSuite.scala`):
This suite tests the read path only. Disabling the native writer lets Spark's
parquet-mr writer produce the correct V2 encodings
(DELTA_BINARY_PACKED/DELTA_BYTE_ARRAY). Remove 10 more excludes.
4. **Fall back to the vanilla reader when vectorized=false**
(`BasicScanExecTransformer.scala`):
When `PARQUET_VECTORIZED_READER_ENABLED=false`, fall back to Spark's
vanilla parquet-mr reader instead of the Velox native reader. This preserves
parquet-mr's behavior (decimal precision narrowing, null on overflow). Remove
34 more excludes.
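The writer switch in change 3 might look like the sketch below. The conf key and the override point are assumptions based on common Gluten test patterns, not a quote of the patch:

```scala
// Hypothetical sketch: force Spark's parquet-mr writer for this suite so the
// read path under test sees standard Parquet V2 encodings
// (DELTA_BINARY_PACKED / DELTA_BYTE_ARRAY).
// The conf key below is an assumption, not copied from the patch.
import org.apache.spark.SparkConf

class GlutenParquetTypeWideningSuite extends ParquetTypeWideningSuite {
  override def sparkConf: SparkConf =
    super.sparkConf
      .set("spark.gluten.sql.native.writer.enabled", "false")
}
```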
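The change-4 fallback can be sketched as a scan-validation check. The method and result-type names are illustrative assumptions, not the actual Gluten API:

```scala
// Illustrative sketch only: when the vectorized Parquet reader is disabled,
// refuse native offload so Spark falls back to its vanilla parquet-mr reader,
// preserving parquet-mr semantics (decimal precision narrowing, null on
// overflow). doValidateInternal/ValidationResult are assumed names.
override protected def doValidateInternal(): ValidationResult = {
  val vectorized = conf.getConf(SQLConf.PARQUET_VECTORIZED_READER_ENABLED)
  if (!vectorized) {
    return ValidationResult.failed(
      "PARQUET_VECTORIZED_READER_ENABLED=false: " +
        "fall back to vanilla parquet-mr reader")
  }
  super.doValidateInternal()
}
```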
### Test Results
| Result | PR2 | PR3 |
|--|-----|-----|
| ✅ Passed | 21 | 80 (+59) |
| ❌ Excluded | 63 | 4 (-59) |
Remaining 4 excludes: Velox does not support the DELTA_BYTE_ARRAY encoding
for FIXED_LEN_BYTE_ARRAY decimals.
Depends on #11689 (PR2).
Fixes #11683
## How was this patch tested?
Local run of `GlutenParquetTypeWideningSuite`: 80 passed / 4 ignored (spark40 and spark41).
## Was this patch authored or co-authored using generative AI tooling?
Yes, co-authored with GitHub Copilot.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]