[ https://issues.apache.org/jira/browse/SPARK-21814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16138075#comment-16138075 ]
xinzhang edited comment on SPARK-21814 at 8/23/17 3:32 PM:
-----------------------------------------------------------
Thanks for your reply. (I may delete this in an hour, perhaps later.)

was (Author: zhangxin0112zx): Thanks for your reply. (I will delete this in an hour.)

> Spark built from current master cannot use the Hive MySQL metastore
> -------------------------------------------------------------------
>
>                 Key: SPARK-21814
>                 URL: https://issues.apache.org/jira/browse/SPARK-21814
>             Project: Spark
>          Issue Type: Question
>          Components: Build, SQL
>    Affects Versions: 2.2.0
>            Reporter: xinzhang
>
> Hi. I built the Spark (master) source code myself and the build succeeded.
> I used the command:
> ./dev/make-distribution.sh --name custom-spark --pip --r --tgz -Psparkr -Phive -Phive-thriftserver -Pyarn
> But when I used 'spark-sql' to connect to the metastore (I put my Hive
> hive-site.xml into $SPARK_HOME/conf/), it did not work. It
> always connected using Derby (my hive-site.xml uses MySQL as the metastore db).
> I cannot determine the cause of the problem.
> Is my build command right? If not, which command should I use to build the
> project myself?
> Any suggestions would be helpful.
> The last commit in my Spark source checkout is:
> [root@node3 spark]# git log
> commit be72b157ea13ea116c5178a9e41e37ae24090f72
> Author: gatorsmile <gatorsm...@gmail.com>
> Date: Tue Aug 22 17:54:39 2017 +0800
> [SPARK-21803][TEST] Remove the HiveDDLCommandSuite
>
> ## What changes were proposed in this pull request?
> We do not have any Hive-specific parser. It does not make sense to keep a
> parser-specific test suite `HiveDDLCommandSuite.scala` in the Hive package.
> This PR is to
>
> ## How was this patch tested?
> N/A
>
> Author: gatorsmile <gatorsm...@gmail.com>
>
> Closes #19015 from gatorsmile/combineDDL.
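For context, a minimal hive-site.xml sketch that points the metastore at MySQL instead of the default embedded Derby; the hostname `metastore-host`, database name `hive`, and credentials are placeholders, not values from this issue:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <!-- JDBC URL of the MySQL database backing the Hive metastore (placeholder host/db) -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <!-- MySQL JDBC driver class; the driver jar must be on the classpath -->
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <!-- Placeholder credentials for the metastore database -->
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>
```

A file like this must be readable in $SPARK_HOME/conf/, and the MySQL JDBC driver jar must be available to Spark (e.g. via --jars or the driver classpath); if Spark cannot load the driver or the file, it silently falls back to a local Derby metastore, which matches the symptom described above. The build command itself already includes -Phive and -Phive-thriftserver, which is what Hive metastore support requires.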
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org