[jira] [Updated] (SPARK-36862) ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java'

2021-09-28 Thread Magdalena Pilawska (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36862?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Magdalena Pilawska updated SPARK-36862:
---
Affects Version/s: (was: 3.1.2)
  Description: 
Hi,

I am getting the following error when running the spark-submit command:

ERROR CodeGenerator: failed to compile: 
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 
321, Column 103: ')' expected instead of '['

 

It fails when running a Spark SQL command on Delta Lake:
spark.sql(sqlTransformation)

The template of sqlTransformation is as follows:

MERGE INTO target_table AS d
 USING source_table AS s
 ON s.id = d.id
 WHEN MATCHED AND d.hash_value <> s.hash_value
 THEN UPDATE SET d.name = s.name, d.address = s.address
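
For context, a minimal, self-contained Scala sketch of how such a transformation might be driven; the SparkSession settings, Delta package version, and concrete table and column names are illustrative assumptions, not details taken from the reporter's job:

import org.apache.spark.sql.SparkSession

object MergeRepro {
  def main(args: Array[String]): Unit = {
    // Assumed Delta Lake wiring (e.g. submitted with
    // --packages io.delta:delta-core_2.12:1.0.0); adjust to the actual setup.
    val spark = SparkSession.builder()
      .appName("delta-merge-repro")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()

    // sqlTransformation follows the template above; the column list is illustrative.
    val sqlTransformation =
      """MERGE INTO target_table AS d
        |USING source_table AS s
        |ON s.id = d.id
        |WHEN MATCHED AND d.hash_value <> s.hash_value
        |THEN UPDATE SET d.name = s.name, d.address = s.address""".stripMargin

    // On Spark 3.1.x this is the call that surfaces the CodeGenerator compile error.
    spark.sql(sqlTransformation)
  }
}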

 

It is a permanent error for *Spark 3.1.1*.

 

The same works fine with Spark 3.0.0.

 

Here is the full log:

2021-09-22 16:43:22,110 ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 55, Column 103: ')' expected instead of '['
2021-09-22 16:43:22,110 ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 55, Column 103: ')' expected instead of '['
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 55, Column 103: ')' expected instead of '['
 at org.codehaus.janino.TokenStreamImpl.compileException(TokenStreamImpl.java:362)
 at org.codehaus.janino.TokenStreamImpl.read(TokenStreamImpl.java:150)
 at org.codehaus.janino.Parser.read(Parser.java:3703)
 at org.codehaus.janino.Parser.parseFormalParameters(Parser.java:1622)
 at org.codehaus.janino.Parser.parseMethodDeclarationRest(Parser.java:1518)
 at org.codehaus.janino.Parser.parseClassBodyDeclaration(Parser.java:1028)
 at org.codehaus.janino.Parser.parseClassBody(Parser.java:841)
 at org.codehaus.janino.Parser.parseClassDeclarationRest(Parser.java:736)
 at org.codehaus.janino.Parser.parseClassBodyDeclaration(Parser.java:941)
 at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:234)
 at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:205)
 at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:80)
 at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:1427)
 at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1524)
 at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1521)
 at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
 at org.sparkproject.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
 at org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
 at org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
 at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:4000)
 at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
 at org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
 at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1375)
 at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:721)
 at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:720)
 at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
 at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
 at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
 at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
 at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
 at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.inputRDD$lzycompute(ShuffleExchangeExec.scala:160)
 at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.inputRDD(ShuffleExchangeExec.scala:160)
 at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.mapOutputStatisticsFuture$lzycompute(ShuffleExchangeExec.scala:164)
 at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.mapOutputStatisticsFuture(ShuffleExchangeExec.scala:163)
 at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.$anonfun$materializeFuture$2(ShuffleExchangeExec.scala:100)
 at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
 at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.$anonfun$materializeFuture$1(ShuffleExchangeExec.scala:100)
 at org.apache.spark.sql.util.LazyValue.getOrInit(LazyValue.scala:41)
 at org.apache.spark.sql.execution.exchange.Exchange.getOrInitMaterializeFuture(Exchange.scala:68)
 at
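
Editorial note: the frames above show CodeGenerator$.doCompile handing the generated class body to Janino via ClassBodyEvaluator.cook, and Janino reports parse failures as a CompileException carrying the File/Line/Column prefix seen in the log. The standalone sketch below only illustrates that reporting path with a deliberately malformed class body; the exact cook overload and message wording are assumptions that may vary by Janino version, and this is not the code Spark actually generated.

import org.codehaus.commons.compiler.CompileException
import org.codehaus.janino.ClassBodyEvaluator

object JaninoErrorDemo {
  def main(args: Array[String]): Unit = {
    val evaluator = new ClassBodyEvaluator()
    try {
      // Deliberately broken parameter list (missing ')') so the parser fails.
      evaluator.cook("generated.java", "public void apply(int value { }")
    } catch {
      case e: CompileException =>
        // Prints a message of the form:
        // File 'generated.java', Line 1, Column ...: ...
        println(e.getMessage)
    }
  }
}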

[jira] [Comment Edited] (SPARK-36862) ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java'

2021-09-28 Thread Magdalena Pilawska (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-36862?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17421374#comment-17421374
 ] 

Magdalena Pilawska edited comment on SPARK-36862 at 9/28/21, 1:11 PM:
--

I get the physical execution plan as part of the output log, but I cannot share
it publicly, if that is what you mean.

Any thoughts on why the same works on Spark 3.0.0?


was (Author: mpilaw):
I get the physical execution plan as part of the output log, but I cannot share
it publicly, if that is what you mean.

Any thoughts on why the same works on 3.0.0?


[jira] [Commented] (SPARK-36862) ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java'

2021-09-28 Thread Magdalena Pilawska (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-36862?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17421374#comment-17421374
 ] 

Magdalena Pilawska commented on SPARK-36862:


I get the physical execution plan as part of the output log, but I cannot share
it publicly, if that is what you mean.

Any thoughts on why the same works on 3.0.0?
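
(Editorial aside, not from the reporter: two standard Spark SQL knobs can help isolate this. Setting spark.sql.codegen.wholeStage to false forces the interpreted, non-whole-stage-codegen path, and the debug helpers dump the generated Java so the rejected line/column can be inspected. The sketch below reuses the spark session and sqlTransformation string from the description and is only a diagnostic suggestion under those assumptions.)

// Re-run the MERGE with whole-stage codegen disabled to check whether the
// failure is specific to the generated-code path.
spark.conf.set("spark.sql.codegen.wholeStage", "false")
spark.sql(sqlTransformation)

// Dump generated code for a plain query over the same tables to inspect the
// line/column that Janino rejects.
import org.apache.spark.sql.execution.debug._
spark.sql("SELECT * FROM target_table WHERE hash_value IS NOT NULL").debugCodegen()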


[jira] [Commented] (SPARK-36862) ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java'

2021-09-27 Thread Magdalena Pilawska (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-36862?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17420730#comment-17420730
 ] 

Magdalena Pilawska commented on SPARK-36862:


Hi [~kabhwan],

I updated the description with the triggered operation and log details, thanks.


[jira] [Updated] (SPARK-36862) ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java'

2021-09-27 Thread Magdalena Pilawska (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36862?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Magdalena Pilawska updated SPARK-36862:
---
Description: 
Hi,

I am getting the following error when running the spark-submit command:

ERROR CodeGenerator: failed to compile: 
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 
321, Column 103: ')' expected instead of '['

 

It fails when running a Spark SQL command on Delta Lake:
spark.sql(sqlTransformation)

The template of sqlTransformation is as follows:

MERGE INTO target_table AS d
 USING source_table AS s
 ON s.id = d.id
 WHEN MATCHED AND d.hash_value <> s.hash_value
 THEN UPDATE SET d.name = s.name, d.address = s.address
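
As an editorial illustration only, here is a hypothetical minimal pair of Delta tables covering just the columns the template touches, so the statement above could be exercised in isolation (table names and types are assumptions, not the reporter's schema):

// Minimal hypothetical schemas for the two tables referenced by the MERGE.
spark.sql(
  """CREATE TABLE IF NOT EXISTS target_table
    |(id BIGINT, hash_value STRING, name STRING, address STRING)
    |USING DELTA""".stripMargin)

spark.sql(
  """CREATE TABLE IF NOT EXISTS source_table
    |(id BIGINT, hash_value STRING, name STRING, address STRING)
    |USING DELTA""".stripMargin)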

 

It is a permanent error for both *Spark 3.1.1* and *3.1.2*.

 

The same works fine with Spark 3.0.0.

 


[jira] [Updated] (SPARK-36862) ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java'

2021-09-27 Thread Magdalena Pilawska (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36862?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Magdalena Pilawska updated SPARK-36862:
---
Description: 
Hi,

I am getting the following error when running the spark-submit command:

ERROR CodeGenerator: failed to compile: 
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 
321, Column 103: ')' expected instead of '['

 

It fails when running a Spark SQL command on Delta Lake:
spark.sql(sqlTransformation)

The template of sqlTransformation is as follows:

MERGE INTO target_table AS d
USING source_table AS s
ON s.id = d.id
WHEN MATCHED AND d.hash_value <> s.hash_value
THEN UPDATE SET d.name = s.name, d.address = s.address

 

It is a permanent error for both *Spark 3.1.1* and *3.1.2*.

 

The same works fine with Spark 3.0.0.

  was:
Hi,

I am getting the following error when running the spark-submit command:


ERROR CodeGenerator: failed to compile: 
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 
321, Column 103: ')' expected instead of '['

 

It is a permanent error for both Spark 3.1.1 and 3.1.2.

 

The same works fine with Spark 3.0.0.





[jira] [Created] (SPARK-36862) ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java'

2021-09-27 Thread Magdalena Pilawska (Jira)
Magdalena Pilawska created SPARK-36862:
--

 Summary: ERROR CodeGenerator: failed to compile: 
org.codehaus.commons.compiler.CompileException: File 'generated.java'
 Key: SPARK-36862
 URL: https://issues.apache.org/jira/browse/SPARK-36862
 Project: Spark
  Issue Type: Bug
  Components: Spark Submit, SQL
Affects Versions: 3.1.2, 3.1.1
 Environment: Spark 3.1.1 and Spark 3.1.2

hadoop 3.2.1
Reporter: Magdalena Pilawska


Hi,

I am getting the following error when running the spark-submit command:


ERROR CodeGenerator: failed to compile: 
org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 
321, Column 103: ')' expected instead of '['

 

It is a permanent error for both Spark 3.1.1 and 3.1.2.

 

The same works fine with Spark 3.0.0.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org