Hi Stephen,
How did you generate your Maven workspace? You need to make sure the Hive
profile is enabled for it. For example: sbt/sbt -Phive gen-idea.
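For reference, the command Matei mentions, run from the Spark source root (gen-idea is provided by the sbt-idea plugin bundled with Spark's sbt launcher at the time of this thread):

```shell
# Regenerate the IntelliJ project files with the Hive profile enabled,
# so the Hive-related modules and source roots are picked up:
sbt/sbt -Phive gen-idea
```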
Matei
On Oct 28, 2014, at 7:42 PM, Stephen Boesch java...@gmail.com wrote:
I have run on the command line via maven and it is fine:
mvn
Hi Matei,
Until my latest pull from upstream/master it had not been necessary to
add the hive profile: is it required now?
I am not using sbt gen-idea. The way I have been opening the project in IntelliJ
is to Open the parent directory; IJ recognizes it as a Maven project.
There are several steps to do surgery on the
Hey Stephen,
In some cases in the Maven build we now have pluggable source
directories based on profiles, using the Maven build-helper plugin.
This is necessary to support cross-building against different Hive
versions, and there will be additional instances of this to support
Scala 2.11.
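For readers unfamiliar with the mechanism Patrick describes: a Maven profile can attach an extra source directory through the build-helper-maven-plugin's add-source goal. The fragment below is an illustrative sketch, not copied from Spark's actual pom.xml; the profile id and path are assumptions based on the directories mentioned later in this thread.

```xml
<profile>
  <id>hive-0.12.0</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>build-helper-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>add-hive-shim-source</id>
            <phase>generate-sources</phase>
            <goals>
              <goal>add-source</goal>
            </goals>
            <configuration>
              <sources>
                <!-- extra source root compiled only under this profile -->
                <source>v0.12.0/src/main/scala</source>
              </sources>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
```

Because the extra source root only exists once the profile is active, an IDE that imports the project without that profile will not see it, which is exactly the IntelliJ symptom discussed below.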
Thanks Patrick for the heads up.
I have not been able to discover a combination of profiles (i.e.
enabling hive, hive-0.12.0, or hive-0.13.1) that works in IntelliJ with
Maven. A quick note here from anyone who knows how to handle this would be
appreciated.
2014-10-28 20:20 GMT-07:00 Patrick
-Phive enables hive-0.13.1, and "-Phive -Phive-0.12.0" enables
hive-0.12.0. Note that the thrift server is not yet supported in hive-0.13, but
it is expected to go upstream soon (SPARK-3720).
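Spelled out as full command lines, a sketch run from the Spark source root (-DskipTests is just to keep the build quick, not part of the profile selection):

```shell
# Build against Hive 0.13.1 (the default picked up by -Phive):
mvn -Phive -DskipTests clean package

# Build against Hive 0.12.0 (note that both profiles are needed):
mvn -Phive -Phive-0.12.0 -DskipTests clean package
```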
Thanks.
Zhan Zhang
On Oct 28, 2014, at 9:09 PM, Stephen Boesch java...@gmail.com wrote:
Yes, these two combinations work for me.
On 10/29/14 12:32 PM, Zhan Zhang wrote:
-Phive enables hive-0.13.1, and "-Phive -Phive-0.12.0" enables
hive-0.12.0. Note that the thrift server is not yet supported in hive-0.13, but
it is expected to go upstream soon (SPARK-3720).
Thanks.
Zhan
I am interested specifically in how to build (and hopefully run/debug)
under IntelliJ. Your posts sound like command-line Maven, which has
already been working.
Do you have instructions for building in IJ?
2014-10-28 21:38 GMT-07:00 Cheng Lian lian.cs@gmail.com:
Yes, these two
Btw - we should have a section of the official docs that describes a full
from-scratch build in IntelliJ, including any gotchas. Then we can
update it if there are build changes that alter it. I created this
JIRA for it:
https://issues.apache.org/jira/browse/SPARK-4128
On Tue, Oct 28, 2014 at 9:42 PM,
You may first open the root pom.xml file in IDEA, then go to the menu
View / Tool Windows / Maven Projects, and choose the desired Maven profile
combination under the Profiles node (e.g. I usually use hadoop-2.4 +
hive + hive-0.12.0). IDEA will ask you to re-import the Maven projects;
confirm,
I just started a totally fresh IntelliJ project importing from our
root pom. I used all the default options and I added hadoop-2.4,
hive, hive-0.13.1 profiles. I was able to run spark core tests from
within IntelliJ. Didn't try anything beyond that, but FWIW this
worked.
- Patrick
On Tue, Oct
Hao Cheng has just written such a from-scratch guide for building
Spark SQL in IDEA. Although it's written in Chinese, I think the
illustrations are descriptive enough.
http://www.cnblogs.com//articles/4058371.html
On 10/29/14 12:45 PM, Patrick Wendell wrote:
Btw - we should
I have selected the same options as Cheng Lian: hadoop-2.4, hive,
hive-0.12.0. After a full Rebuild in IJ I still see the HiveShim errors.
I really do not know what is different; I pulled from github upstream
master three hours ago.
Just for kicks I am trying PW's combination, which uses
Cheng - to make it recognize the new HiveShim for 0.12 I had to click
on spark-hive under packages in the left pane, then go to Open
Module Settings - then explicitly add the v0.12.0/src/main/scala
folder to the sources by navigating to it and then ctrl+click to add
it as a source. Did you have to
Thanks guys - adding the source root for the shim manually was the issue.
For some reason the other issue I was struggling with
(NoClassDefFoundError on ThreadFactoryBuilder) also disappeared. I am able
to run tests now inside IJ. Woot
2014-10-28 22:13 GMT-07:00 Patrick Wendell
Oops - I actually should have added v0.13.0 (i.e. to match whatever I
did in the profile).
On Tue, Oct 28, 2014 at 10:05 PM, Patrick Wendell pwend...@gmail.com wrote:
Cheng - to make it recognize the new HiveShim for 0.12 I had to click
on spark-hive under packages in the left pane, then go to
Hm, the shim source folder used to be recognized automatically, although
at the wrong directory level (sql/hive/v0.12.0/src
instead of sql/hive/v0.12.0/src/main/scala); it still compiled.
Just tried against a fresh checkout - indeed the shim source
folder needs to be added manually. Sorry for the