[jira] [Closed] (SPARK-17622) Cannot run create or load DF on Windows- Spark 2.0.0

2016-09-22 Thread renzhi he (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

renzhi he closed SPARK-17622.
-
Resolution: Not A Bug

> Cannot run create or load DF on Windows- Spark 2.0.0
> 
>
> Key: SPARK-17622
> URL: https://issues.apache.org/jira/browse/SPARK-17622
> Project: Spark
>  Issue Type: Bug
>  Components: SparkR
>Affects Versions: 2.0.0
> Environment: windows 10
> R 3.3.1
> RStudio 1.0.20
>Reporter: renzhi he
>  Labels: windows
>
> Under Spark 2.0.0 on Windows, when I try to load or create data with code similar to the lines below, I get an error message and the functions fail to execute.
> |sc <- sparkR.session(master = "local", sparkConfig = list(spark.driver.memory = "2g"))|
> |df <- as.DataFrame(faithful)|
> Here is the error message:
> Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
>   java.lang.reflect.InvocationTargetException
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
>   at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
>   at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
>   at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
>   at org.apache.spark.sql.hive.HiveSharedSt
> However, under Spark 1.6.1 or 1.6.2, the equivalent code runs without problems:
> |sc1 <- sparkR.init(master = "local", sparkEnvir = list(spark.driver.memory = "2g"))|
> |sqlContext <- sparkRSQL.init(sc1)|
> |df <- as.DataFrame(sqlContext, faithful)|



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-17622) Cannot run create or load DF on Windows- Spark 2.0.0

2016-09-22 Thread renzhi he (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15513639#comment-15513639
 ] 

renzhi he commented on SPARK-17622:
---

Yes, I did not build Hive or the other Hadoop-related modules.
I am still learning how all of these pieces fit together.
I think this issue can be closed ;)




[jira] [Commented] (SPARK-17622) Cannot run create or load DF on Windows- Spark 2.0.0

2016-09-21 Thread renzhi he (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15512050#comment-15512050
 ] 

renzhi he commented on SPARK-17622:
---

Hi Sean,


Sorry, I have only just stepped into this field.

For this "bug", I just added spark.sql.warehouse.dir = "my/own/drive" to my sparkConfig list, and then Spark worked.

Being new to Spark and R, I am still confused about a few things, so I will spend more time reading the official docs. Sorry for bothering you :)

Best wishes,
Renzhi
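The workaround above can be written out as a complete sparkConfig list. This is only a sketch under the assumption (suggested by the stack trace) that sparkR.session() fails while the Hive client tries to create the default warehouse directory on Windows; "D:/spark-warehouse" is a hypothetical placeholder for any local directory the user can write to, not a path from this issue.

```r
# Sketch of the reported workaround (requires a local Spark 2.0.0 install;
# "D:/spark-warehouse" is a placeholder for any writable local directory).
library(SparkR)

sc <- sparkR.session(
  master = "local",
  sparkConfig = list(
    spark.driver.memory     = "2g",
    # Point the SQL warehouse at a path Spark can actually create on Windows,
    # instead of the default location that triggers the InvocationTargetException.
    spark.sql.warehouse.dir = "D:/spark-warehouse"
  )
)

df <- as.DataFrame(faithful)  # should now succeed once the session starts cleanly
head(df)
```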

 




[jira] [Updated] (SPARK-17622) Cannot run create or load DF on Windows- Spark 2.0.0

2016-09-21 Thread renzhi he (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

renzhi he updated SPARK-17622:
--
Description: 
Under Spark 2.0.0 on Windows, when I try to load or create data with code similar to the lines below, I get an error message and the functions fail to execute.
|sc <- sparkR.session(master = "local", sparkConfig = list(spark.driver.memory = "2g"))|
|df <- as.DataFrame(faithful)|


Here is the error message:
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedSt


However, under Spark 1.6.1 or 1.6.2, the equivalent code runs without problems:
|sc1 <- sparkR.init(master = "local", sparkEnvir = list(spark.driver.memory = "2g"))|
|sqlContext <- sparkRSQL.init(sc1)|
|df <- as.DataFrame(sqlContext, faithful)|



[jira] [Updated] (SPARK-17622) Cannot run create or load DF on Windows- Spark 2.0.0

2016-09-21 Thread renzhi he (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

renzhi he updated SPARK-17622:
--


[jira] [Updated] (SPARK-17622) Cannot run create or load DF on Windows- Spark 2.0.0

2016-09-21 Thread renzhi he (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

renzhi he updated SPARK-17622:
--
Description: (was: sc <- sparkR.session(master = "local[*]", sparkConfig = list(spark.driver.memory = "2g"))

df <- as.DataFrame(faithful)

I get the error below:

Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedSt


On Spark 1.6.1 and 1.6.2, the corresponding code runs fine:
sc1 <- sparkR.init(master = "local[*]", sparkEnvir = list(spark.driver.memory = "2g"))
sqlContext <- sparkRSQL.init(sc1)
df <- as.DataFrame(sqlContext, faithful))




[jira] [Updated] (SPARK-17622) Cannot run create or load DF on Windows- Spark 2.0.0

2016-09-21 Thread renzhi he (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

renzhi he updated SPARK-17622:
--
Description: 
sc <- sparkR.session(master = "local[*]", sparkConfig = list(spark.driver.memory = "2g"))

df <- as.DataFrame(faithful)

I get the error below:

Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedSt


On Spark 1.6.1 and 1.6.2, the corresponding code runs fine:
sc1 <- sparkR.init(master = "local[*]", sparkEnvir = list(spark.driver.memory = "2g"))
sqlContext <- sparkRSQL.init(sc1)
df <- as.DataFrame(sqlContext, faithful)

  was:
sc <- sparkR.session(master = "local[*]", appName = "sparkR", sparkConfig = list(spark.driver.memory = "2g"))

df <- as.DataFrame(faithful)

I get the error below:

Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedSt


On Spark 1.6.1 and 1.6.2, the corresponding code runs fine:
sc1 <- sparkR.init(master = "local[*]", sparkEnvir = list(spark.driver.memory = "2g"))
sqlContext <- sparkRSQL.init(sc1)
df <- as.DataFrame(sqlContext, faithful)



[jira] [Updated] (SPARK-17622) Cannot run create or load DF on Windows- Spark 2.0.0

2016-09-21 Thread renzhi he (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

renzhi he updated SPARK-17622:
--
Summary: Cannot run create or load DF on Windows- Spark 2.0.0  (was: Cannot 
run SparkR function on Windows- Spark 2.0.0)




[jira] [Updated] (SPARK-17622) Cannot run SparkR function on Windows- Spark 2.0.0

2016-09-21 Thread renzhi he (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-17622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

renzhi he updated SPARK-17622:
--
Description: 
sc <- sparkR.session(master = "local[*]", appName = "sparkR", sparkConfig = list(spark.driver.memory = "2g"))

df <- as.DataFrame(faithful)

I get the error below:

Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedSt


On Spark 1.6.1 and 1.6.2, the corresponding code runs fine:
sc1 <- sparkR.init(master = "local[*]", sparkEnvir = list(spark.driver.memory = "2g"))
sqlContext <- sparkRSQL.init(sc1)
df <- as.DataFrame(sqlContext, faithful)

  was:
sc <- sparkR.session(master = "spark://spark01.cmua.dom:7077", appName = "sparkR", sparkConfig = list(spark.driver.memory = "2g"))

df <- as.DataFrame(faithful)


I get the error below:
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedSt



[jira] [Created] (SPARK-17622) Cannot run SparkR function on Windows- Spark 2.0.0

2016-09-21 Thread renzhi he (JIRA)
renzhi he created SPARK-17622:
-

 Summary: Cannot run SparkR function on Windows- Spark 2.0.0
 Key: SPARK-17622
 URL: https://issues.apache.org/jira/browse/SPARK-17622
 Project: Spark
  Issue Type: Bug
  Components: Java API
Affects Versions: 2.0.0
 Environment: windows 10
R 3.3.1
RStudio 1.0.20
Reporter: renzhi he
 Fix For: 1.6.2, 1.6.1


sc <- sparkR.session(master = "spark://spark01.cmua.dom:7077", appName = "sparkR", sparkConfig = list(spark.driver.memory = "2g"))

df <- as.DataFrame(faithful)


I get the error below:
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedSt


