Re: Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Martin Grigorov
I've found the problem!
It was indeed a local thingy!

$ cat ~/.mavenrc
MAVEN_OPTS='-XX:+TieredCompilation -XX:TieredStopAtLevel=1'

I added this some time ago because it speeds up the build, but it turns out it
also overrides the MAVEN_OPTS environment variable...
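For anyone else who hits this: the mvn launcher script sources ~/.mavenrc, so a
plain assignment there replaces whatever MAVEN_OPTS you exported in the shell.
A minimal sketch of a ~/.mavenrc that keeps the flags without clobbering an
externally set value (assuming a POSIX shell):

# ~/.mavenrc (sketch): append the fast-compile flags to any MAVEN_OPTS
# that is already exported, instead of replacing it
MAVEN_OPTS="${MAVEN_OPTS} -XX:+TieredCompilation -XX:TieredStopAtLevel=1"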

Now it fails with:

[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @
spark-catalyst_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file:
/home/martin/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.15__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin:
BasicArtifact(com.github.ghik,silencer-plugin_2.12.15,1.7.6,null)
[INFO] Compiling 372 Scala sources and 171 Java sources to
/home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes ...

[ERROR] [Error] : error writing
/home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes/org/apache/spark/sql/catalyst/analysis/Analyzer$ResolveGroupingAnalytics$$anonfun$org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveGroupingAnalytics$$replaceGroupingFunc$1.class:
java.nio.file.FileSystemException
/home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes/org/apache/spark/sql/catalyst/analysis/Analyzer$ResolveGroupingAnalytics$$anonfun$org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveGroupingAnalytics$$replaceGroupingFunc$1.class:
File name too long
but this is well documented:
https://spark.apache.org/docs/latest/building-spark.html#encrypted-filesystems
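In case it helps anyone hitting the same symptom: encrypted home directories
(ecryptfs and similar) cap file names well below the usual 255 bytes, which is
exactly what trips up these long synthetic inner-class names. A quick check, as
a sketch (getconf is standard on Linux; the path is just the build's output dir):

$ getconf NAME_MAX /home/martin/git/apache/spark/sql/catalyst/target
# a plain ext4 mount reports 255; an ecryptfs-encrypted home typically reports around 143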

All works now!
Thank you, Sean!


On Thu, Feb 10, 2022 at 10:13 PM Sean Owen  wrote:

> I think it's another occurrence of that setting I had to change, or I had to set
> MAVEN_OPTS. I think this failure happens somewhere that setting doesn't affect,
> though I don't quite understand it. Try the stack size in the test runner
> configs.
>
> On Thu, Feb 10, 2022, 2:02 PM Martin Grigorov 
> wrote:
>
>> Hi Sean,
>>
>> On Thu, Feb 10, 2022 at 5:37 PM Sean Owen  wrote:
>>
>>> Yes I've seen this; the JVM stack size needs to be increased. I'm not
>>> sure if it's env specific (though you and I at least have hit it, I think
>>> others), or whether we need to change our build script.
>>> In the pom.xml file, find "-Xss..." settings and make them something
>>> like "-Xss4m", see if that works.
>>>
>>
>> It is already a much bigger value - 128m (
>> https://github.com/apache/spark/blob/50256bde9bdf217413545a6d2945d6c61bf4cfff/pom.xml#L2845
>> )
>> I've tried smaller and bigger values for all jvmArgs next to this one.
>> None helped!
>> I also have the feeling it is something in my environment that overrides
>> these values but so far I cannot identify anything.
>>
>>
>>
>>>
>>> On Thu, Feb 10, 2022 at 8:54 AM Martin Grigorov 
>>> wrote:
>>>
 Hi,

 I am not able to build Spark due to the following error :

 [ERROR] ## Exception when compiling 543 sources to
 /home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes
 java.lang.BootstrapMethodError: call site initialization exception
 java.lang.invoke.CallSite.makeSite(CallSite.java:341)

 java.lang.invoke.MethodHandleNatives.linkCallSiteImpl(MethodHandleNatives.java:307)

 java.lang.invoke.MethodHandleNatives.linkCallSite(MethodHandleNatives.java:297)
 scala.tools.nsc.typechecker.Typers$Typer.typedBlock(Typers.scala:2504)

 scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103(Typers.scala:5711)

 scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:500)
 scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5746)
 scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5781)
 ...
 Caused by: java.lang.StackOverflowError
 at java.lang.ref.Reference.<init> (Reference.java:303)
 at java.lang.ref.WeakReference.<init> (WeakReference.java:57)
 at java.lang.invoke.MethodType$ConcurrentWeakInternSet$WeakEntry.<init>
 (MethodType.java:1269)
 at java.lang.invoke.MethodType$ConcurrentWeakInternSet.get
 (MethodType.java:1216)
 at java.lang.invoke.MethodType.makeImpl (MethodType.java:302)
 at java.lang.invoke.MethodType.dropParameterTypes
 (MethodType.java:573)
 at java.lang.invoke.MethodType.replaceParameterTypes
 (MethodType.java:467)
 at java.lang.invoke.MethodHandle.asSpreader (MethodHandle.java:875)
 at java.lang.invoke.Invokers.spreadInvoker (Invokers.java:158)
 at java.lang.invoke.CallSite.makeSite (CallSite.java:324)
 at java.lang.invoke.MethodHandleNatives.linkCallSiteImpl
 (MethodHandleNatives.java:307)
 at java.lang.invoke.MethodHandleNatives.linkCallSite
 (MethodHandleNatives.java:297)
 at scala.tools.nsc.typechecker.Typers$Typer.typedBlock
 (Typers.scala:2504)
 at scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103
 (Typers.scala:5711)
 at
 scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1
 (Typers.scala:500)

Re: Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Sean Owen
I think it's another occurrence of that setting I had to change, or I had to set
MAVEN_OPTS. I think this failure happens somewhere that setting doesn't affect,
though I don't quite understand it. Try the stack size in the test runner configs.
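One rough way to test that theory, as a sketch rather than a recommendation
(the exact values are arbitrary, just large enough to rule the stack size out):

# raise the stack for the Maven JVM itself and rebuild only the failing module
export MAVEN_OPTS="-Xmx4g -Xss64m"
./build/mvn -pl sql/catalyst -am -DskipTests clean install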

On Thu, Feb 10, 2022, 2:02 PM Martin Grigorov  wrote:

> Hi Sean,
>
> On Thu, Feb 10, 2022 at 5:37 PM Sean Owen  wrote:
>
>> Yes I've seen this; the JVM stack size needs to be increased. I'm not
>> sure if it's env specific (though you and I at least have hit it, I think
>> others), or whether we need to change our build script.
>> In the pom.xml file, find "-Xss..." settings and make them something like
>> "-Xss4m", see if that works.
>>
>
> It is already a much bigger value - 128m (
> https://github.com/apache/spark/blob/50256bde9bdf217413545a6d2945d6c61bf4cfff/pom.xml#L2845
> )
> I've tried smaller and bigger values for all jvmArgs next to this one.
> None helped!
> I also have the feeling it is something in my environment that overrides
> these values but so far I cannot identify anything.
>
>
>
>>
>> On Thu, Feb 10, 2022 at 8:54 AM Martin Grigorov 
>> wrote:
>>
>>> Hi,
>>>
>>> I am not able to build Spark due to the following error :
>>>
>>> [ERROR] ## Exception when compiling 543 sources to
>>> /home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes
>>> java.lang.BootstrapMethodError: call site initialization exception
>>> java.lang.invoke.CallSite.makeSite(CallSite.java:341)
>>>
>>> java.lang.invoke.MethodHandleNatives.linkCallSiteImpl(MethodHandleNatives.java:307)
>>>
>>> java.lang.invoke.MethodHandleNatives.linkCallSite(MethodHandleNatives.java:297)
>>> scala.tools.nsc.typechecker.Typers$Typer.typedBlock(Typers.scala:2504)
>>>
>>> scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103(Typers.scala:5711)
>>>
>>> scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:500)
>>> scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5746)
>>> scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5781)
>>> ...
>>> Caused by: java.lang.StackOverflowError
>>> at java.lang.ref.Reference.<init> (Reference.java:303)
>>> at java.lang.ref.WeakReference.<init> (WeakReference.java:57)
>>> at java.lang.invoke.MethodType$ConcurrentWeakInternSet$WeakEntry.<init>
>>> (MethodType.java:1269)
>>> at java.lang.invoke.MethodType$ConcurrentWeakInternSet.get
>>> (MethodType.java:1216)
>>> at java.lang.invoke.MethodType.makeImpl (MethodType.java:302)
>>> at java.lang.invoke.MethodType.dropParameterTypes
>>> (MethodType.java:573)
>>> at java.lang.invoke.MethodType.replaceParameterTypes
>>> (MethodType.java:467)
>>> at java.lang.invoke.MethodHandle.asSpreader (MethodHandle.java:875)
>>> at java.lang.invoke.Invokers.spreadInvoker (Invokers.java:158)
>>> at java.lang.invoke.CallSite.makeSite (CallSite.java:324)
>>> at java.lang.invoke.MethodHandleNatives.linkCallSiteImpl
>>> (MethodHandleNatives.java:307)
>>> at java.lang.invoke.MethodHandleNatives.linkCallSite
>>> (MethodHandleNatives.java:297)
>>> at scala.tools.nsc.typechecker.Typers$Typer.typedBlock
>>> (Typers.scala:2504)
>>> at scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103
>>> (Typers.scala:5711)
>>> at
>>> scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1
>>> (Typers.scala:500)
>>> at scala.tools.nsc.typechecker.Typers$Typer.typed1
>>> (Typers.scala:5746)
>>> at scala.tools.nsc.typechecker.Typers$Typer.typed (Typers.scala:5781)
>>>
>>> I have played a lot with the scala-maven-plugin jvmArg settings at [1]
>>> but so far nothing helps.
>>> Same error for Scala 2.12 and 2.13.
>>>
>>> The command I use is: ./build/mvn install -Pkubernetes -DskipTests
>>>
>>> I need to create a distribution from master branch.
>>>
>>> Java: 1.8.0_312
>>> Maven: 3.8.4
>>> OS: Ubuntu 21.10
>>>
>>> Any hints?
>>> Thank you!
>>>
>>> 1.
>>> https://github.com/apache/spark/blob/50256bde9bdf217413545a6d2945d6c61bf4cfff/pom.xml#L2845-L2849
>>>
>>


Re: Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Martin Grigorov
Hi Sean,

On Thu, Feb 10, 2022 at 5:37 PM Sean Owen  wrote:

> Yes I've seen this; the JVM stack size needs to be increased. I'm not sure
> if it's env specific (though you and I at least have hit it, I think
> others), or whether we need to change our build script.
> In the pom.xml file, find "-Xss..." settings and make them something like
> "-Xss4m", see if that works.
>

It is already a much bigger value - 128m (
https://github.com/apache/spark/blob/50256bde9bdf217413545a6d2945d6c61bf4cfff/pom.xml#L2845
)
I've tried smaller and bigger values for all jvmArgs next to this one. None
helped!
I also have the feeling it is something in my environment that overrides
these values but so far I cannot identify anything.
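A few places worth checking for such silent overrides, as a sketch (none of these
files are guaranteed to exist on a given machine):

# Maven's per-user/system rc files and shell profiles can all rewrite MAVEN_OPTS
cat ~/.mavenrc /etc/mavenrc 2>/dev/null
grep -n MAVEN_OPTS ~/.bashrc ~/.profile ~/.zshrc 2>/dev/null
# run the failing module in debug mode and check which -Xss actually reaches the compile
./build/mvn -X -pl sql/catalyst -am -DskipTests compile 2>&1 | grep -i -- "-Xss"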



>
> On Thu, Feb 10, 2022 at 8:54 AM Martin Grigorov 
> wrote:
>
>> Hi,
>>
>> I am not able to build Spark due to the following error :
>>
>> [ERROR] ## Exception when compiling 543 sources to
>> /home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes
>> java.lang.BootstrapMethodError: call site initialization exception
>> java.lang.invoke.CallSite.makeSite(CallSite.java:341)
>>
>> java.lang.invoke.MethodHandleNatives.linkCallSiteImpl(MethodHandleNatives.java:307)
>>
>> java.lang.invoke.MethodHandleNatives.linkCallSite(MethodHandleNatives.java:297)
>> scala.tools.nsc.typechecker.Typers$Typer.typedBlock(Typers.scala:2504)
>>
>> scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103(Typers.scala:5711)
>>
>> scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:500)
>> scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5746)
>> scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5781)
>> ...
>> Caused by: java.lang.StackOverflowError
>> at java.lang.ref.Reference.<init> (Reference.java:303)
>> at java.lang.ref.WeakReference.<init> (WeakReference.java:57)
>> at java.lang.invoke.MethodType$ConcurrentWeakInternSet$WeakEntry.<init>
>> (MethodType.java:1269)
>> at java.lang.invoke.MethodType$ConcurrentWeakInternSet.get
>> (MethodType.java:1216)
>> at java.lang.invoke.MethodType.makeImpl (MethodType.java:302)
>> at java.lang.invoke.MethodType.dropParameterTypes
>> (MethodType.java:573)
>> at java.lang.invoke.MethodType.replaceParameterTypes
>> (MethodType.java:467)
>> at java.lang.invoke.MethodHandle.asSpreader (MethodHandle.java:875)
>> at java.lang.invoke.Invokers.spreadInvoker (Invokers.java:158)
>> at java.lang.invoke.CallSite.makeSite (CallSite.java:324)
>> at java.lang.invoke.MethodHandleNatives.linkCallSiteImpl
>> (MethodHandleNatives.java:307)
>> at java.lang.invoke.MethodHandleNatives.linkCallSite
>> (MethodHandleNatives.java:297)
>> at scala.tools.nsc.typechecker.Typers$Typer.typedBlock
>> (Typers.scala:2504)
>> at scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103
>> (Typers.scala:5711)
>> at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1
>> (Typers.scala:500)
>> at scala.tools.nsc.typechecker.Typers$Typer.typed1 (Typers.scala:5746)
>> at scala.tools.nsc.typechecker.Typers$Typer.typed (Typers.scala:5781)
>>
>> I have played a lot with the scala-maven-plugin jvmArg settings at [1]
>> but so far nothing helps.
>> Same error for Scala 2.12 and 2.13.
>>
>> The command I use is: ./build/mvn install -Pkubernetes -DskipTests
>>
>> I need to create a distribution from master branch.
>>
>> Java: 1.8.0_312
>> Maven: 3.8.4
>> OS: Ubuntu 21.10
>>
>> Any hints?
>> Thank you!
>>
>> 1.
>> https://github.com/apache/spark/blob/50256bde9bdf217413545a6d2945d6c61bf4cfff/pom.xml#L2845-L2849
>>
>


Re: Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Sean Owen
Yes I've seen this; the JVM stack size needs to be increased. I'm not sure
if it's env specific (though you and I at least have hit it, I think
others), or whether we need to change our build script.
In the pom.xml file, find "-Xss..." settings and make them something like
"-Xss4m", see if that works.

On Thu, Feb 10, 2022 at 8:54 AM Martin Grigorov 
wrote:

> Hi,
>
> I am not able to build Spark due to the following error :
>
> [ERROR] ## Exception when compiling 543 sources to
> /home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes
> java.lang.BootstrapMethodError: call site initialization exception
> java.lang.invoke.CallSite.makeSite(CallSite.java:341)
>
> java.lang.invoke.MethodHandleNatives.linkCallSiteImpl(MethodHandleNatives.java:307)
>
> java.lang.invoke.MethodHandleNatives.linkCallSite(MethodHandleNatives.java:297)
> scala.tools.nsc.typechecker.Typers$Typer.typedBlock(Typers.scala:2504)
>
> scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103(Typers.scala:5711)
>
> scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:500)
> scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5746)
> scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5781)
> ...
> Caused by: java.lang.StackOverflowError
> at java.lang.ref.Reference.<init> (Reference.java:303)
> at java.lang.ref.WeakReference.<init> (WeakReference.java:57)
> at java.lang.invoke.MethodType$ConcurrentWeakInternSet$WeakEntry.<init>
> (MethodType.java:1269)
> at java.lang.invoke.MethodType$ConcurrentWeakInternSet.get
> (MethodType.java:1216)
> at java.lang.invoke.MethodType.makeImpl (MethodType.java:302)
> at java.lang.invoke.MethodType.dropParameterTypes (MethodType.java:573)
> at java.lang.invoke.MethodType.replaceParameterTypes
> (MethodType.java:467)
> at java.lang.invoke.MethodHandle.asSpreader (MethodHandle.java:875)
> at java.lang.invoke.Invokers.spreadInvoker (Invokers.java:158)
> at java.lang.invoke.CallSite.makeSite (CallSite.java:324)
> at java.lang.invoke.MethodHandleNatives.linkCallSiteImpl
> (MethodHandleNatives.java:307)
> at java.lang.invoke.MethodHandleNatives.linkCallSite
> (MethodHandleNatives.java:297)
> at scala.tools.nsc.typechecker.Typers$Typer.typedBlock
> (Typers.scala:2504)
> at scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$103
> (Typers.scala:5711)
> at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1
> (Typers.scala:500)
> at scala.tools.nsc.typechecker.Typers$Typer.typed1 (Typers.scala:5746)
> at scala.tools.nsc.typechecker.Typers$Typer.typed (Typers.scala:5781)
>
> I have played a lot with the scala-maven-plugin jvmArg settings at [1] but
> so far nothing helps.
> Same error for Scala 2.12 and 2.13.
>
> The command I use is: ./build/mvn install -Pkubernetes -DskipTests
>
> I need to create a distribution from master branch.
>
> Java: 1.8.0_312
> Maven: 3.8.4
> OS: Ubuntu 21.10
>
> Any hints?
> Thank you!
>
> 1.
> https://github.com/apache/spark/blob/50256bde9bdf217413545a6d2945d6c61bf4cfff/pom.xml#L2845-L2849
>


Re: Problem building Spark

2015-10-19 Thread Ted Yu
See this thread
http://search-hadoop.com/m/q3RTtV3VFNdgNri2=Re+Build+spark+1+5+1+branch+fails

> On Oct 19, 2015, at 6:59 PM, Annabel Melongo 
>  wrote:
> 
> I tried to build Spark according to the build directions and it failed due to
> the following error:
>
> [link preview: "Building Spark - Spark 1.5.1 Documentation" on spark.apache.org]
>
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-assembly-plugin:2.5.5:single (test-jar-with-dependencies)
> on project spark-streaming-mqtt_2.10: Failed to create assembly: Error creating
> assembly archive test-jar-with-dependencies: Problem creating jar: Execution
> exception (and the archive is probably corrupt but I could not delete it):
> Java heap space -> [Help 1]
> 
> Any help? I have a 64-bit Windows 8 machine.


Re: Problem building Spark

2015-10-19 Thread Tathagata Das
Seems to be a heap space issue for Maven. Have you configured Maven's memory
according to the instructions on the web page?

export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
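Since the report mentions a 64-bit Windows 8 machine, the equivalent there would
be along these lines (a sketch assuming the classic cmd.exe shell; PowerShell
would use $env:MAVEN_OPTS instead):

rem give Maven more heap before running the build from the same console window
set MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m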


On Mon, Oct 19, 2015 at 6:59 PM, Annabel Melongo <
melongo_anna...@yahoo.com.invalid> wrote:

> I tried to build Spark according to the build directions and it failed due to
> the following error:
>
> [link preview: "Building Spark - Spark 1.5.1 Documentation" on spark.apache.org]
>
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-assembly-plugin:2.5.5:single (test-jar-with-dependencies)
> on project spark-streaming-mqtt_2.10: Failed to create assembly: Error creating
> assembly archive test-jar-with-dependencies: Problem creating jar: Execution
> exception (and the archive is probably corrupt but I could not delete it):
> Java heap space -> [Help 1]
>
> Any help? I have a 64-bit Windows 8 machine.
>