[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

2016-11-24 Thread Rui Li (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15694590#comment-15694590 ]

Rui Li commented on HIVE-14825:
---

[~kellyzly] btw I think you're pinging the wrong Rui Li :)

> Figure out the minimum set of required jars for Hive on Spark after bumping 
> up to Spark 2.0.0
> ----------------------------------------------------------------------------
>
>                 Key: HIVE-14825
>                 URL: https://issues.apache.org/jira/browse/HIVE-14825
>             Project: Hive
>          Issue Type: Task
>          Components: Documentation
>            Reporter: Ferdinand Xu
>            Assignee: Rui Li
>             Fix For: 2.2.0
>
>
> Considering that there's no assembly jar for Spark since 2.0.0, we should 
> figure out the minimum set of jars required for HoS to work after bumping up 
> to Spark 2.0.0. That way, users can decide whether to add just the required 
> jars or, for convenience, all the jars under Spark's dir.





[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

2016-11-24 Thread Rui Li (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15694587#comment-15694587 ]

Rui Li commented on HIVE-14825:
---

I don't think so. Figuring out how Spark prepares the classpath for containers 
may help us avoid conflicts. I'll also take a look.



[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

2016-11-23 Thread liyunzhang_intel (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15692463#comment-15692463 ]

liyunzhang_intel commented on HIVE-14825:
---

[~ruili]: Will Spark load all the jars in $HIVE_HOME/lib/ if I copy all of 
$SPARK_HOME/jars/* to $HIVE_HOME/lib? When I read the code, I found that Hive 
only sends hive-exec*.jar to Spark.




[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

2016-11-22 Thread Rui Li (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15688618#comment-15688618 ]

Rui Li commented on HIVE-14825:
---

Hi [~kellyzly], it's in the [Configuring 
Hive|https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started#HiveonSpark:GettingStarted-ConfiguringHive]
 part.
Suppose you run Spark on YARN; then you can link scala-library, spark-core, 
and spark-network-common into your Hive lib directory.
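
For illustration, a minimal sketch of that linking step, assuming $SPARK_HOME 
points at a Spark 2.0.0 install with its jars under $SPARK_HOME/jars (the glob 
patterns are placeholders; check your actual file names):
{noformat}
# Hypothetical sketch -- exact jar versions depend on your Spark build
ln -s "$SPARK_HOME"/jars/scala-library*.jar "$HIVE_HOME"/lib/
ln -s "$SPARK_HOME"/jars/spark-core*.jar "$HIVE_HOME"/lib/
ln -s "$SPARK_HOME"/jars/spark-network-common*.jar "$HIVE_HOME"/lib/
{noformat}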



[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

2016-11-22 Thread liyunzhang_intel (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15688540#comment-15688540 ]

liyunzhang_intel commented on HIVE-14825:
---

[~lirui]: I haven't found the necessary libs on the [current wiki 
page|https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started];
 if I missed something, please tell me.



[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

2016-11-07 Thread Ferdinand Xu (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15644100#comment-15644100 ]

Ferdinand Xu commented on HIVE-14825:
---

Thanks [~lirui] for figuring this out. Could you update the WIKI as well? Thank 
you!



[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

2016-11-07 Thread Rui Li (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15643542#comment-15643542 ]

Rui Li commented on HIVE-14825:
---

For yarn, I can run some simple queries with the following jars:
{noformat}
scala-library
spark-core
spark-network-common
{noformat}
For local mode, the extra jars needed are:
{noformat}
chill-java
chill
jackson-module-paranamer
jackson-module-scala
jersey-container-servlet-core
jersey-server
json4s-ast
kryo-shaded
minlog
scala-xml
spark-launcher
spark-network-shuffle
spark-unsafe
xbean-asm5-shaded
{noformat}
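
In case it helps, a hedged sketch of pulling those local-mode extras into 
Hive's lib dir (assuming the standard $SPARK_HOME/jars layout; the glob 
patterns are illustrative and versions vary by build):
{noformat}
# Hypothetical sketch -- verify the actual jar names under $SPARK_HOME/jars
for jar in chill-java chill jackson-module-paranamer jackson-module-scala \
           jersey-container-servlet-core jersey-server json4s-ast kryo-shaded \
           minlog scala-xml spark-launcher spark-network-shuffle spark-unsafe \
           xbean-asm5-shaded; do
  ln -sf "$SPARK_HOME"/jars/${jar}*.jar "$HIVE_HOME"/lib/
done
{noformat}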



[jira] [Commented] (HIVE-14825) Figure out the minimum set of required jars for Hive on Spark after bumping up to Spark 2.0.0

2016-09-22 Thread Rui Li (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-14825?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15515155#comment-15515155 ]

Rui Li commented on HIVE-14825:
---

Thanks [~Ferd] for tracking this. I expect the minimum set to be fairly small :)
