Hi Yujia,

You might take inspiration from Coral: https://github.com/linkedin/coral. It is based on Calcite but uses the Hive parser (which is compatible with Spark SQL) to generate the SQL and Rel nodes. There is also a PR that uses the native Spark parser: https://github.com/linkedin/coral/pull/339. Merging it is a work in progress.
Thanks, Walaa.

On Thu, May 30, 2024 at 9:07 AM Mihai Budiu <mbu...@gmail.com> wrote:

> The SQL language has several sublanguages: the query language, the data
> definition language, and the data manipulation language. The core of
> Calcite is mostly about the query language, but there are Calcite
> components that deal with the other languages as well (e.g., server,
> babel).
>
> Both of these components also show how the Calcite parser can be
> customized.
>
> In our project we have also extended the parser. You can see, for
> example, our PR, which makes a minimal change to the Calcite parser:
> https://github.com/feldera/feldera/pull/210; it could be a useful
> guideline for your needs. We use Maven in our build, so you can see how
> the build has to be structured. The config.fmpp file has some "metadata"
> about how the changes are integrated into the existing parser, while the
> *.ftl files contain the actual parser code, written in the JavaCC parser
> generator language.
>
> Mihai
> ________________________________
> From: 奚钰佳 <yu...@qq.com.INVALID>
> Sent: Wednesday, May 29, 2024 7:01 PM
> To: dev <dev@calcite.apache.org>
> Subject: How to PARSER the SPARK SQL
>
> Hello Calcite team,
>
> I am Yujia and I want to parse Spark SQL with Calcite, but some keywords
> are not supported by Calcite.
> Here is my question on Stack Overflow:
> https://stackoverflow.com/questions/78547328/how-can-i-parse-the-spark-sql-by-calcite-sqlparser-like-create-temporary-table
>
> I have two questions:
>
> 1. How can I parse Spark SQL with Calcite?
> 2. How can I extend the syntax on demand, e.g., CREATE TEMPORARY TABLE?
>
> For question 1: Is there any way to parse Spark SQL with a parser that
> supports its syntax?
>
> For question 2: I tried to extend the syntax with the steps below, but
> hit a strange compile error with the copied Parser.jj file.
>
> Steps:
> 1. Copy the Parser.jj file from Calcite.
> 2. Add SqlNode SqlCreateTempTable() : ... in parserImpls.ftl.
> 3. Add a class CreateTempTable that extends SqlCall to define
>    getOperator, getOperandList, and unparse.
> 4. Add the related config in config.fmpp.
> 5. Run mvn generate-sources to generate the TestSqlParserImpl defined in
>    step 4.
>
> At step 5, it fails with the error below:
>
> FMPP processing session failed.
> [ERROR] Caused by: freemarker.core.InvalidReferenceException: The
> following has evaluated to null or missing:
> [ERROR] ==> default [in template "Parser.jj" at line 1124, column 43]
> [ERROR] ----
> [ERROR] Tip: If the failing expression is known to legally refer to
> something that's sometimes null or missing, either specify a default
> value like myOptionalVar!myDefault, or use
> <#if myOptionalVar??>when-present<#else>when-missing</#if>. (These only
> cover the last step of the expression; to cover the whole expression,
> use parenthesis: (myOptionalVar.foo)!myDefault, (myOptionalVar.foo)??
> [ERROR] ----
> [ERROR] FTL stack trace ("~" means nesting-related):
> [ERROR] - Failed at: #if (parser.createStatementParserMeth... [in
> template "Parser.jj" at line 1124, column 1]
>
> Are my steps right? And why does the same Parser.jj file produce an
> error when compiled?
>
> Thanks,
>
> Yujia
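[Editor's note on the FMPP error above: the message "==> default ... evaluated to null or missing" indicates that the FreeMarker data model has no `default` variable. In recent Calcite versions, each parser module's config.fmpp pulls Calcite's defaults into that variable via a `default: tdd(...)` entry, and the generated Parser.jj falls back to it for any key the module does not set. A minimal sketch of such a config is below — the package and class names are placeholders for illustration, not taken from the thread, and the exact layout should be checked against the config.fmpp of the Calcite version in use (e.g., the one in the server module):]

```
# config.fmpp -- minimal sketch; package/class names are placeholders.
data: {
  # Calcite's default parser settings. Without this entry, the generated
  # Parser.jj template fails exactly as in the error above, because it
  # references the "default" variable for every key not overridden here.
  default: tdd("../default_config.fmpp")

  parser: {
    # Package and class of the generated parser (adjust to your project).
    package: "com.example.sql.parser",
    class: "TestSqlParserImpl",

    # Productions to try after the CREATE keyword has been consumed.
    createStatementParserMethods: [
      "SqlCreateTempTable"
    ],

    # Template file(s) containing the new productions.
    implementationFiles: [
      "parserImpls.ftl"
    ]
  }
}

freemarkerLinks: {
  includes: includes/
}
```

[A corresponding sketch of the production in parserImpls.ftl follows. Note two assumptions: in recent Calcite versions, methods listed under createStatementParserMethods are called after CREATE is parsed and receive a Span and a replace flag, so they should return a SqlCreate (meaning CreateTempTable would extend SqlCreate rather than plain SqlCall); and if TEMPORARY is not already a token in the Parser.jj being extended, it must also be added to the keywords list in config.fmpp:]

```
// parserImpls.ftl -- sketch only; CreateTempTable is the user's own class.
SqlCreate SqlCreateTempTable(Span s, boolean replace) :
{
    final SqlIdentifier id;
}
{
    <TEMPORARY> <TABLE> id = CompoundIdentifier()
    {
        return new CreateTempTable(s.end(this), replace, id);
    }
}
```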