XuQianJin-Stars opened a new pull request, #2408:
URL: https://github.com/apache/fluss/pull/2408
### Purpose
Linked issue: close #2406
This PR introduces the parser and execution framework for Spark's CALL
procedure command, allowing users to invoke stored procedures using SQL syntax
like `CALL sys.procedure_name(args)`. This provides a foundation for
implementing various administrative and maintenance operations.
### Brief change log
**Core Framework:**
- Added `Procedure` interface in
`fluss-spark-common/src/main/java/org/apache/fluss/spark/procedure/Procedure.java`
- Added `ProcedureParameter` and `ProcedureParameterImpl` for parameter
definitions
- Added `BaseProcedure` abstract class providing common utilities
- Added `ProcedureBuilder` interface for procedure instantiation
- Added `ProcedureCatalog` interface for catalog integration
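The core contract above can be pictured with a minimal, self-contained sketch. This is illustrative only: the real `Procedure` interface in the PR likely works with Spark types such as `StructType` and `InternalRow`, while this version uses plain Java types, and `NoopProcedure` is a hypothetical example, not code from the PR.

```java
// Minimal sketch of the Procedure contract described above. The actual
// interface likely uses Spark types (StructType, InternalRow); plain Java
// types are used here so the sketch stands alone.
interface Procedure {
    String name();                 // procedure name, e.g. "compact"
    String[] parameterNames();     // declared parameter names, in order
    Object[] call(Object... args); // execute and return result rows
}

// Hypothetical example implementation, analogous to a skeleton procedure.
final class NoopProcedure implements Procedure {
    @Override public String name() { return "noop"; }
    @Override public String[] parameterNames() { return new String[] {"table"}; }
    @Override public Object[] call(Object... args) {
        // A real procedure would act on the table identified by args[0].
        return new Object[] {"ok"};
    }
}
```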
**Parser & SQL Extensions:**
- Created ANTLR grammar `FlussSqlExtensions.g4` for CALL statement syntax
- Implemented `FlussSparkSqlParser` extending Spark's `ParserInterface`
- Implemented `FlussSqlExtensionsAstBuilder` to convert ANTLR parse tree to
logical plans
- Added custom `Origin` and `CurrentOrigin` handling for source position
tracking
- Added Maven ANTLR4 plugin configuration to `fluss-spark-common/pom.xml`
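A common pattern for parser extensions like `FlussSparkSqlParser` is to try the extension grammar first and delegate everything else to Spark's built-in parser. The sketch below is a simplification under assumptions: the class and the prefix check are illustrative (the real parser presumably runs the ANTLR grammar and falls back on a parse failure rather than inspecting the SQL text), and `Function<String, Object>` stands in for Spark's `ParserInterface`.

```java
import java.util.Locale;
import java.util.function.Function;

// Illustrative sketch of the delegation pattern a custom Spark SQL parser
// typically uses: handle CALL statements, delegate the rest to Spark.
final class DelegatingParser {
    private final Function<String, Object> flussParser;   // parses CALL statements
    private final Function<String, Object> sparkDelegate; // Spark's default parser

    DelegatingParser(Function<String, Object> flussParser,
                     Function<String, Object> sparkDelegate) {
        this.flussParser = flussParser;
        this.sparkDelegate = sparkDelegate;
    }

    Object parsePlan(String sql) {
        // Simplified routing; a real implementation would attempt the ANTLR
        // parse and fall back to the delegate on a parse error.
        if (sql.trim().toUpperCase(Locale.ROOT).startsWith("CALL")) {
            return flussParser.apply(sql);
        }
        return sparkDelegate.apply(sql);
    }
}
```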
**Logical & Physical Plans:**
- Created `FlussCallStatement` (unresolved) and `FlussCallCommand`
(resolved) logical plan nodes
- Created `FlussCallArgument`, `FlussPositionalArgument`, and
`FlussNamedArgument` for argument representation
- Implemented `CallProcedureExec` physical plan node for execution
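The argument model above distinguishes positional from named arguments. A stripped-down sketch of that shape, with plain `Object` values standing in for the Catalyst expressions the real `FlussCallArgument` types presumably wrap:

```java
// Illustrative argument model: positional arguments carry only a value,
// named arguments additionally carry the parameter name they bind to.
interface CallArgument {
    Object value();
}

final class PositionalArgument implements CallArgument {
    private final Object value;
    PositionalArgument(Object value) { this.value = value; }
    @Override public Object value() { return value; }
}

final class NamedArgument implements CallArgument {
    private final String name;
    private final Object value;
    NamedArgument(String name, Object value) {
        this.name = name;
        this.value = value;
    }
    String name() { return name; }
    @Override public Object value() { return value; }
}
```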
**Analysis & Execution:**
- Implemented `FlussProcedureResolver` analyzer rule for procedure
resolution and validation
- Implemented `FlussStrategy` planner strategy to inject `CallProcedureExec`
- Created `FlussSparkSessionExtensions` to register all custom components
**Catalog Integration:**
- Modified `SparkCatalog` to implement `ProcedureCatalog`
- Updated `FlussSparkTestBase` to enable SQL extensions in test environment
**Procedure Registry:**
- Created `SparkProcedures` registry for managing procedure builders
- Added `NoSuchProcedureException` for error handling
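A registry like `SparkProcedures` usually maps procedure names to builder factories and raises the not-found exception on lookup failure. The sketch below is an assumption-laden stand-in: `ProcedureRegistry` is a hypothetical name, `Supplier<Object>` replaces the real `ProcedureBuilder`, and the case-insensitive lookup is a guess about the behavior.

```java
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;
import java.util.function.Supplier;

// Mirrors the error type added in the PR; message format is illustrative.
final class NoSuchProcedureException extends RuntimeException {
    NoSuchProcedureException(String name) {
        super("Procedure not found: " + name);
    }
}

// Hypothetical registry: names map to factories that build procedure
// instances (Object stands in for the Procedure interface here).
final class ProcedureRegistry {
    private final Map<String, Supplier<Object>> builders = new HashMap<>();

    void register(String name, Supplier<Object> builder) {
        builders.put(name.toLowerCase(Locale.ROOT), builder);
    }

    Object newProcedure(String name) {
        Supplier<Object> builder = builders.get(name.toLowerCase(Locale.ROOT));
        if (builder == null) {
            throw new NoSuchProcedureException(name);
        }
        return builder.get();
    }
}
```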
**Example Implementation:**
- Implemented `CompactProcedure` as a sample procedure (skeleton
implementation)
**Documentation & Tests:**
- Added `PROCEDURES.md` documenting the new feature
- Added `CallStatementParserTest` with comprehensive parser tests
### Tests
**Unit Tests:**
- `CallStatementParserTest`: Tests parsing of CALL statements
- `testCallWithBackticks`: Tests backtick-quoted identifiers
- `testCallWithNamedArguments`: Tests named argument syntax
- `testCallWithPositionalArguments`: Tests positional arguments with
various data types
- `testCallWithMixedArguments`: Tests mixed named and positional arguments
- `testCallSimpleProcedure`: Tests simple procedure call
All existing tests in the `fluss-spark-ut` module pass.
### API and Format
**New Public APIs:**
- `Procedure` interface: Defines contract for stored procedures
- `ProcedureParameter` interface: Defines procedure parameters
- `ProcedureCatalog` interface: Extends Spark's `TableCatalog` with
procedure loading capability
**Modified APIs:**
- `SparkCatalog` now implements `ProcedureCatalog` interface
**No changes to storage format.**
### Documentation
**New feature introduced:** Spark CALL procedure command support
**Documentation added:**
- `fluss-spark/PROCEDURES.md`: Comprehensive guide on using the CALL
procedure feature
- Syntax examples
- Available procedures
- Usage guidelines
- Extension points for custom procedures
**Configuration required:**
Users need to configure Spark session with:
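The configuration snippet was cut off in the original message. Based on the `FlussSparkSessionExtensions` class introduced above, a plausible setting would use Spark's standard `spark.sql.extensions` switch; the fully qualified class name below is a guess inferred from the package paths in this PR, not a confirmed value.

```properties
# Hypothetical: register the Fluss extensions via Spark's standard mechanism.
# The exact package of FlussSparkSessionExtensions may differ.
spark.sql.extensions=org.apache.fluss.spark.FlussSparkSessionExtensions
```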