[ 
https://issues.apache.org/jira/browse/SPARK-19176?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15817753#comment-15817753
 ] 

jin xing commented on SPARK-19176:
----------------------------------

[~joshrosen]
What do you think about this? I can open a PR if you agree. Should it go against 
https://github.com/JoshRosen/hive.git ?

> Change bin.xml to be compatible with groupId "org.spark-project.hive"
> ---------------------------------------------------------------------
>
>                 Key: SPARK-19176
>                 URL: https://issues.apache.org/jira/browse/SPARK-19176
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: jin xing
>         Attachments: bin.patch
>
>
> Running "mvn clean package -DskipTests -Phadoop-2,dist" against 
> https://github.com/JoshRosen/hive.git on branch release-1.2.1-spark2 failed 
> with the following error:
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-assembly-plugin:2.3:single (assemble) on 
> project hive-packaging: Assembly is incorrectly configured: bin: Assembly is 
> incorrectly configured: bin:
> [ERROR] Assembly: bin is not configured correctly: One or more filters had 
> unmatched criteria. Check debug log for more information.
> The dependency set in packaging/src/main/assembly/bin.xml should be changed 
> to be compatible with the groupId "org.spark-project.hive".
> We found this issue while doing some customization of the Hive fork on which 
> Spark depends. I'm not sure whether "mvn clean package -DskipTests 
> -Phadoop-2,dist" is the proper way to build it, but I think bin.xml needs to 
> be changed to be compatible with the groupId.
> I'm also not sure whether the source code of the Hive fork on which Spark 
> depends is maintained at https://github.com/JoshRosen/hive.git; I can't find 
> the corresponding JIRA.
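
For illustration only (this fragment is not from the ticket or the attached 
bin.patch, and the actual contents of bin.xml may differ), the kind of change 
being suggested is that any dependencySet include in 
packaging/src/main/assembly/bin.xml that still references the upstream 
groupId would be switched to the forked one:

```xml
<!-- Hypothetical sketch: a dependencySet in the maven-assembly-plugin
     descriptor bin.xml, with the include coordinate changed from the
     upstream groupId to the forked "org.spark-project.hive" groupId so
     the assembly's filter criteria match the built artifacts. -->
<dependencySet>
  <outputDirectory>lib</outputDirectory>
  <includes>
    <!-- before: <include>org.apache.hive:hive-exec</include> -->
    <include>org.spark-project.hive:hive-exec</include>
  </includes>
</dependencySet>
```

With a mismatched groupId, the assembly plugin reports the "One or more 
filters had unmatched criteria" error quoted above, since no built artifact 
satisfies the include pattern.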



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
