KevinAppelBofa commented on pull request #34693:
URL: https://github.com/apache/spark/pull/34693#issuecomment-981708494


   @peter-toth thank you for working on this. I was able to compile a Spark
3.3.0-SNAPSHOT with your changes and test them. I ran the sample queries first
and those worked, then I ran the temp table query and that works as well; it
was easy to split into the withClause and the query. I am running into an issue
getting the CTE query to run, though: I have tried splitting it up a few ways,
but I keep hitting the same error, which is pasted below. I'm going to add a
logWarning to dump out the query Spark runs to resolve the schema and see
whether I can run that directly against SQL Server. This is the issue I ran
into originally: I was able to get the test CTE to work, and appending
`where 1=0` ($CTEQUERY where 1=0;) worked there, but in this more complex CTE I
can't find a spot to add the `1=0` so that only the schema comes back.
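
   For reference, here is a minimal sketch of the kind of split I mean (the
JDBC URL, table names, and credentials are placeholders, and I'm writing the
new option as `withClause` to match how I described it above; the exact option
name in the PR may differ):

   ```python
   # Minimal sketch, not the exact code I ran: placeholder URL/credentials/
   # table names, and the option name "withClause" is how I've been referring
   # to it above -- the final name in this PR may be different.
   # Assumes an existing SparkSession named `spark` (e.g. the pyspark shell).
   url = "jdbc:sqlserver://myhost:1433;databaseName=mydb"  # placeholder

   # Temp-table case: the setup statement goes in the with/prepare option and
   # the plain SELECT goes in "query" -- this combination works for me.
   temp_df = (
       spark.read.format("jdbc")
       .option("url", url)
       .option("withClause", "SELECT id, val INTO #TempTab FROM dbo.SomeTable;")
       .option("query", "SELECT * FROM #TempTab")
       .option("user", "someuser").option("password", "somepassword")  # placeholders
       .load()
   )

   # CTE case: same idea, and one of the splits I've tried; this is where I
   # keep hitting the 'WITH' syntax error shown below.
   cte_df = (
       spark.read.format("jdbc")
       .option("url", url)
       .option("withClause", "WITH cte AS (SELECT id, val FROM dbo.SomeTable)")
       .option("query", "SELECT * FROM cte")
       .option("user", "someuser").option("password", "somepassword")
       .load()
   )
   ```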
   
   `py4j.protocol.Py4JJavaError: An error occurred while calling o85.load.
   : com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'WITH'.
           at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:262)
           at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1632)
           at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:602)
           at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:524)
           at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7418)
           at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:3272)
           at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:247)
           at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:222)
           at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:446)
           at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.getQueryOutputSchema(JDBCRDD.scala:69)
           at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:59)
           at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:240)
           at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:36)
           at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:350)
           at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:227)
           at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:209)
           at scala.Option.getOrElse(Option.scala:189)
           at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:209)
           at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:170)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
           at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
           at py4j.Gateway.invoke(Gateway.java:282)
           at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
           at py4j.commands.CallCommand.execute(CallCommand.java:79)
           at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
           at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
           at java.lang.Thread.run(Thread.java:748)`
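
   For context on the error: from my reading of the 3.3.0-SNAPSHOT source,
JDBCRDD.getQueryOutputSchema (in the trace above) wraps the `query` option as a
derived table and probes it with `WHERE 1=0` to fetch only the schema, roughly
as in the sketch below. That wrapped statement is what I expect the logWarning
to dump out, and it is where SQL Server trips, since a `WITH` clause cannot
start a derived table. The alias and exact wrapping here are my assumptions
from reading the source, not something I've captured from the driver yet.

   ```python
   # Rough reconstruction of the schema probe (my reading of the Spark source;
   # the SPARK_GEN_SUBQ_0 alias and exact wrapping are assumptions, not a
   # captured query).
   cte_query = "WITH cte AS (SELECT id, val FROM dbo.SomeTable) SELECT * FROM cte"

   # Spark wraps the "query" option as a derived table and adds WHERE 1=0
   # so that only the schema comes back, no rows:
   probe = f"SELECT * FROM ({cte_query}) SPARK_GEN_SUBQ_0 WHERE 1=0"
   print(probe)
   # SQL Server rejects this shape because a WITH clause is not allowed at the
   # start of a derived table, which matches the
   # "Incorrect syntax near the keyword 'WITH'" error above.
   ```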
   
   

