[ https://issues.apache.org/jira/browse/SPARK-3191?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14117400#comment-14117400 ]

zhengbing li commented on SPARK-3191:
-------------------------------------

Another way to solve this issue is to set an environment variable:
export MAVEN_OPTS="-Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true"
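For example, a minimal one-shot sketch (the profile and version flags are taken
from the build command quoted below; adjust them to your build):

MAVEN_OPTS="-Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true" \
  mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package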

> Add explanation of how to build Spark with Maven behind an HTTP proxy
> ----------------------------------------------------------------------
>
>                 Key: SPARK-3191
>                 URL: https://issues.apache.org/jira/browse/SPARK-3191
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 1.0.2
>         Environment: linux suse 11
> Maven version: apache-maven-3.0.5
> Spark version: 1.0.1
> Maven proxy settings:
>   <proxy>
>       <id>lzb</id>
>       <active>true</active>
>       <protocol>http</protocol>
>       <username>user</username>
>       <password>password</password>
>       <host>proxy.company.com</host>
>       <port>8080</port>
>       <nonProxyHosts>*.company.com</nonProxyHosts>
>     </proxy>
>            Reporter: zhengbing li
>            Priority: Trivial
>              Labels: build, maven
>             Fix For: 1.1.0
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
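> (Aside on the Environment field above: the <proxy> block is assumed to live
> under <proxies> in ~/.m2/settings.xml. A quick way to confirm Maven actually
> resolves it:
>   mvn help:effective-settings
> which prints the merged settings, proxy included.)
>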
> When I run "mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean
> package" in an HTTP proxy environment, the build cannot finish. The error is
> as follows:
> [INFO] Spark Project YARN Stable API ..................... SUCCESS [34.217s]
> [INFO] Spark Project Assembly ............................ FAILURE [43.133s]
> [INFO] Spark Project External Twitter .................... SKIPPED
> [INFO] Spark Project External Kafka ...................... SKIPPED
> [INFO] Spark Project External Flume ...................... SKIPPED
> [INFO] Spark Project External ZeroMQ ..................... SKIPPED
> [INFO] Spark Project External MQTT ....................... SKIPPED
> [INFO] Spark Project Examples ............................ SKIPPED
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Total time: 27:57.309s
> [INFO] Finished at: Sat Aug 23 09:43:21 CST 2014
> [INFO] Final Memory: 51M/1080M
> [INFO] 
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-shade-plugin:2.2:shade (default) on project 
> spark-assembly_2.10: Execution default of goal 
> org.apache.maven.plugins:maven-shade-plugin:2.2:shade failed: Plugin 
> org.apache.maven.plugins:maven-shade-plugin:2.2 or one of its dependencies 
> could not be resolved: Could not find artifact 
> com.google.code.findbugs:jsr305:jar:1.3.9 -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
> switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please 
> read the following articles:
> [ERROR] [Help 1] 
> http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the 
> command
> [ERROR]   mvn <goals> -rf :spark-assembly_2.10
> The build succeeds if I use this command instead:
> mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0
> -Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true
> -DskipTests clean package
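> (These two properties tell Maven's Wagon HTTP transport to skip SSL
> certificate and hostname validation, working around the proxy's interception
> of HTTPS connections. As the error output above suggests, the failed build
> can also be resumed from the broken module instead of starting over; a
> sketch:
>   mvn -Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true \
>     -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests package -rf :spark-assembly_2.10
> )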
> The error message is not very informative, and I spent a long time tracking
> down this issue. To make it easier for others who build Spark behind an HTTP
> proxy, I highly recommend adding this to the documentation.
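> A sketch of the note that could be added (pointing it at
> docs/building-with-maven.md is my assumption):
>   Building behind an HTTP proxy: if plugin or dependency resolution fails
>   (e.g. "Could not find artifact com.google.code.findbugs:jsr305:jar:1.3.9"),
>   pass -Dmaven.wagon.http.ssl.insecure=true and
>   -Dmaven.wagon.http.ssl.allowall=true to mvn, or set them in MAVEN_OPTS.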


