Yes, I have tried MAVEN_OPTS with

-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m

-Xmx4g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m

-Xmx2g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=512m

None of them worked; all failed with the same error.
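For reference, here is how I'm setting it (a minimal sketch; I'm assuming a bash-style shell, and the flag values are the ones above):

```shell
# Export MAVEN_OPTS so the flags reach the JVM that Maven runs in; setting
# the variable without export (or in a different shell session) won't
# affect mvn at all.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Sanity-check that the variable is actually set in the build shell:
echo "$MAVEN_OPTS"
```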

Thanks

On Sat, Oct 17, 2015 at 2:44 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> Have you set MAVEN_OPTS with the following ?
> -Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
>
> Cheers
>
> On Sat, Oct 17, 2015 at 2:35 PM, Chester Chen <ches...@alpinenow.com>
> wrote:
>
>> I was using JDK 1.7, and the Maven version matches the one in the pom file.
>>
>> (v1.5.1)$ java -version
>> java version "1.7.0_51"
>> Java(TM) SE Runtime Environment (build 1.7.0_51-b13)
>> Java HotSpot(TM) 64-Bit Server VM (build 24.51-b03, mixed mode)
>>
>> Using build/sbt still fails the same way with -Denforcer.skip; with the mvn
>> build, it fails with:
>>
>>
>> [ERROR] PermGen space -> [Help 1]
>> [ERROR]
>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>> -e switch.
>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>
>> I am giving up on this. Just using 1.5.2-SNAPSHOT for now.
>>
>> Chester
>>
>>
>> On Mon, Oct 12, 2015 at 12:05 AM, Xiao Li <gatorsm...@gmail.com> wrote:
>>
>>> Hi, Chester,
>>>
>>> Please check your pom.xml. Your java.version and maven.version might not
>>> match your build environment.
>>>
>>> Or use -Denforcer.skip=true on the command line to skip the check.
>>>
>>> Good luck,
>>>
>>> Xiao Li
>>>
>>> 2015-10-08 10:35 GMT-07:00 Chester Chen <ches...@alpinenow.com>:
>>>
>>>> Question regarding the branch-1.5 build.
>>>>
>>>> Noticed that the Spark project no longer publishes the spark-assembly jar.
>>>> We have to build it ourselves (until we find a way to not depend on the
>>>> assembly jar).
>>>>
>>>>
>>>> I checked out the v1.5.1 release tag, and using sbt to build it, I get
>>>> the following error:
>>>>
>>>> build/sbt -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive
>>>> -Phive-thriftserver -DskipTests clean package assembly
>>>>
>>>>
>>>> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [warn] ::          UNRESOLVED DEPENDENCIES         ::
>>>> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [warn] :: org.apache.spark#spark-network-common_2.10;1.5.1:
>>>> configuration not public in
>>>> org.apache.spark#spark-network-common_2.10;1.5.1: 'test'. It was required
>>>> from org.apache.spark#spark-network-shuffle_2.10;1.5.1 test
>>>> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [warn]
>>>> [warn] Note: Unresolved dependencies path:
>>>> [warn] org.apache.spark:spark-network-common_2.10:1.5.1
>>>> ((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
>>>> [warn]  +- org.apache.spark:spark-network-shuffle_2.10:1.5.1
>>>> [info] Packaging
>>>> /Users/chester/projects/alpine/apache/spark/launcher/target/scala-2.10/spark-launcher_2.10-1.5.1.jar
>>>> ...
>>>> [info] Done packaging.
>>>> [warn] four warnings found
>>>> [warn] Note: Some input files use unchecked or unsafe operations.
>>>> [warn] Note: Recompile with -Xlint:unchecked for details.
>>>> [warn] No main class detected
>>>> [info] Packaging
>>>> /Users/chester/projects/alpine/apache/spark/external/flume-sink/target/scala-2.10/spark-streaming-flume-sink_2.10-1.5.1.jar
>>>> ...
>>>> [info] Done packaging.
>>>> sbt.ResolveException: unresolved dependency:
>>>> org.apache.spark#spark-network-common_2.10;1.5.1: configuration not public
>>>> in org.apache.spark#spark-network-common_2.10;1.5.1: 'test'. It was
>>>> required from org.apache.spark#spark-network-shuffle_2.10;1.5.1 test
>>>>
>>>>
>>>> Somehow network-shuffle can't find the test jar it needs (not sure why
>>>> the test configuration is still needed, even though -DskipTests is
>>>> already specified).
>>>>
>>>> I tried the Maven command as well (without the assembly step), and the
>>>> build failed:
>>>>
>>>> mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive
>>>> -Phive-thriftserver -DskipTests clean package
>>>>
>>>> [ERROR] Failed to execute goal
>>>> org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce
>>>> (enforce-versions) on project spark-parent_2.10: Some Enforcer rules have
>>>> failed. Look above for specific messages explaining why the rule failed. ->
>>>> [Help 1]
>>>> [ERROR]
>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>>>> the -e switch.
>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>> [ERROR]
>>>> [ERROR] For more information about the errors and possible solutions,
>>>> please read the following articles:
>>>> [ERROR] [Help 1]
>>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
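An aside: one way to see why the enforcer rule trips (assuming it is a version rule, as the enforce-versions id suggests) is to ask the plugin what it detects and compare that against the pom:

```shell
# Print the Maven, Java, and OS details the enforcer plugin sees; compare
# these against the <requireMavenVersion>/<requireJavaVersion> rules in
# spark-parent's pom.xml.
mvn enforcer:display-info
```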
>>>>
>>>>
>>>>
>>>> I checked out branch-1.5, replaced "1.5.2-SNAPSHOT" with "1.5.1", and
>>>> build/sbt still fails (same sbt error as above).
>>>>
>>>> But if I keep the version string as "1.5.2-SNAPSHOT", build/sbt works
>>>> fine.
>>>>
>>>>
>>>> Any ideas?
>>>>
>>>> Chester
>>>>
>>>
>>
>
