[jira] [Updated] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-28 Thread Michael Armbrust (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michael Armbrust updated SPARK-9280:

Issue Type: Sub-task  (was: Bug)
Parent: SPARK-9410

 New HiveContext object unexpectedly loads configuration settings from history 
 --

 Key: SPARK-9280
 URL: https://issues.apache.org/jira/browse/SPARK-9280
 Project: Spark
  Issue Type: Sub-task
  Components: SQL
Affects Versions: 1.3.1, 1.4.1
Reporter: Tien-Dung LE

 In a Spark session, stopping a Spark context and creating a new Spark context
 and Hive context does not clean the Spark SQL configuration. More precisely,
 the new Hive context still keeps the previous configuration settings. It
 would be great if someone could let us know how to avoid this situation.
 {code:title=New hive context should not load the configurations from history}
 sqlContext.setConf("spark.sql.shuffle.partitions", "10")
 sc.stop
 val sparkConf2 = new org.apache.spark.SparkConf()
 val sc2 = new org.apache.spark.SparkContext(sparkConf2)
 val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
 sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
 // got 20 as expected
 sqlContext2.setConf("foo", "foo")
 sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
 // expected 30 but got 10
 {code}
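 A possible workaround, pending a proper fix (a sketch only, not a confirmed
 fix; it assumes the Spark 1.4-era SQLContext API, where setConf/getConf take
 string keys and values): explicitly overwrite any setting the new context
 must not inherit, rather than relying on getConf's default argument, which
 only applies while the key is unset.
 {code:title=Workaround sketch: overwrite instead of inherit}
 // Re-apply the desired value on the fresh context so a stale "10"
 // cannot leak in once the underlying configuration is initialized.
 val desiredPartitions = "30"
 sqlContext2.setConf("spark.sql.shuffle.partitions", desiredPartitions)
 // The lookup is now deterministic regardless of the previous session.
 assert(sqlContext2.getConf("spark.sql.shuffle.partitions") == desiredPartitions)
 {code}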





[jira] [Updated] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-27 Thread Tien-Dung LE (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tien-Dung LE updated SPARK-9280:

Description: 
In a Spark session, stopping a Spark context and creating a new Spark context
and Hive context does not clean the Spark SQL configuration. More precisely,
the new Hive context still keeps the previous configuration settings. It would
be great if someone could let us know how to avoid this situation.

{code:title=New hive context should not load the configurations from history}
sqlContext.setConf("spark.sql.shuffle.partitions", "10")
sc.stop

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)

sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
// got 20 as expected

sqlContext2.setConf("foo", "foo")

sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// expected 30 but got 10
{code}

  was:
In a spark-shell session, stopping a Spark context and creating a new Spark
context and Hive context does not clean the Spark SQL configuration. More
precisely, the new Hive context still keeps the previous configuration
settings. It would be great if someone could let us know how to avoid this
situation.

{code:title=New hive context should not load the configurations from history}
sqlContext.setConf("spark.sql.shuffle.partitions", "10")
sc.stop

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)

sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
// got 20 as expected

sqlContext2.setConf("foo", "foo")

sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// expected 30 but got 10
{code}


 New HiveContext object unexpectedly loads configuration settings from history 
 --

 Key: SPARK-9280
 URL: https://issues.apache.org/jira/browse/SPARK-9280
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.3.1, 1.4.1
Reporter: Tien-Dung LE

 In a Spark session, stopping a Spark context and creating a new Spark context
 and Hive context does not clean the Spark SQL configuration. More precisely,
 the new Hive context still keeps the previous configuration settings. It
 would be great if someone could let us know how to avoid this situation.
 {code:title=New hive context should not load the configurations from history}
 sqlContext.setConf("spark.sql.shuffle.partitions", "10")
 sc.stop
 val sparkConf2 = new org.apache.spark.SparkConf()
 val sc2 = new org.apache.spark.SparkContext(sparkConf2)
 val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
 sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
 // got 20 as expected
 sqlContext2.setConf("foo", "foo")
 sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
 // expected 30 but got 10
 {code}






[jira] [Updated] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-27 Thread Tien-Dung LE (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tien-Dung LE updated SPARK-9280:

Description: 
In a spark-shell session, stopping a Spark context and creating a new Spark
context and Hive context does not clean the Spark SQL configuration. More
precisely, the new Hive context still keeps the previous configuration
settings. It would be great if someone could let us know how to avoid this
situation.

{code:title=New hive context should not load the configurations from history}
sqlContext.setConf("spark.sql.shuffle.partitions", "10")
sc.stop

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)

sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
// got 20 as expected

sqlContext2.setConf("foo", "foo")

sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// expected 30 but got 10
{code}

  was:
In a spark-shell session, stopping a Spark context and creating a new Spark
context and Hive context does not clean the Spark SQL configuration. More
precisely, the new Hive context still keeps the previous configuration
settings. It would be great if someone could let us know how to avoid this
situation.

{code:title=New hive context should not load the configurations from history}
case class Foo(x: Int = (math.random * 1e3).toInt)
val foo = (1 to 100).map(i => Foo()).toDF
foo.saveAsParquetFile("foo")
sqlContext.setConf("spark.sql.shuffle.partitions", "10")

sc.stop

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)

sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
// got 20 as expected
val foo2 = sqlContext2.parquetFile("foo")
sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// expected 30 but got 10
{code}


 New HiveContext object unexpectedly loads configuration settings from history 
 --

 Key: SPARK-9280
 URL: https://issues.apache.org/jira/browse/SPARK-9280
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.3.1, 1.4.1
Reporter: Tien-Dung LE

 In a spark-shell session, stopping a Spark context and creating a new Spark
 context and Hive context does not clean the Spark SQL configuration. More
 precisely, the new Hive context still keeps the previous configuration
 settings. It would be great if someone could let us know how to avoid this
 situation.
 {code:title=New hive context should not load the configurations from history}
 sqlContext.setConf("spark.sql.shuffle.partitions", "10")
 sc.stop
 val sparkConf2 = new org.apache.spark.SparkConf()
 val sc2 = new org.apache.spark.SparkContext(sparkConf2)
 val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
 sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
 // got 20 as expected
 sqlContext2.setConf("foo", "foo")
 sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
 // expected 30 but got 10
 {code}
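 One detail worth highlighting in the trace above: the first getConf on the
 new context still returns the supplied default, and the stale "10" only
 surfaces after an unrelated setConf. This ordering suggests the previous
 settings are loaded lazily, when the underlying configuration is first
 touched by a write. A probe sketch (an illustration only, assuming the
 Spark 1.4-era API):
 {code:title=Probe sketch: stale value appears only after the first write}
 val before = sqlContext2.getConf("spark.sql.shuffle.partitions", "20") // "20"
 sqlContext2.setConf("foo", "foo") // any write appears to trigger the reload
 val after = sqlContext2.getConf("spark.sql.shuffle.partitions", "30") // "10"
 println(s"before=$before, after=$after")
 {code}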






[jira] [Updated] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-24 Thread Tien-Dung LE (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tien-Dung LE updated SPARK-9280:

Affects Version/s: 1.4.1

 New HiveContext object unexpectedly loads configuration settings from history 
 --

 Key: SPARK-9280
 URL: https://issues.apache.org/jira/browse/SPARK-9280
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.3.1, 1.4.1
Reporter: Tien-Dung LE

 In a spark-shell session, stopping a Spark context and creating a new Spark
 context and Hive context does not clean the Spark SQL configuration. More
 precisely, the new Hive context still keeps the previous configuration
 settings. It would be great if someone could let us know how to avoid this
 situation.
 {code:title=New hive context should not load the configurations from history}
 case class Foo(x: Int = (math.random * 1e3).toInt)
 val foo = (1 to 100).map(i => Foo()).toDF
 foo.saveAsParquetFile("foo")
 sqlContext.setConf("spark.sql.shuffle.partitions", "10")
 sc.stop
 val sparkConf2 = new org.apache.spark.SparkConf()
 val sc2 = new org.apache.spark.SparkContext(sparkConf2)
 val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
 sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
 // got 20 as expected
 val foo2 = sqlContext2.parquetFile("foo")
 sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
 // expected 30 but got 10
 {code}
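 Worth noting: the fresh SparkConf built above carries no SQL settings of its
 own (a new SparkConf picks up only spark.* JVM system properties, which is
 also how spark-defaults.conf values arrive under spark-shell), so there is no
 visible source for the leaked value. A small sanity check (a sketch, assuming
 the standard SparkConf API and an environment where nothing external sets
 this key):
 {code:title=Sanity check sketch}
 // In a default environment nothing sets spark.sql.shuffle.partitions here,
 // so a brand-new SparkConf should not contain it.
 val checkConf = new org.apache.spark.SparkConf()
 assert(!checkConf.contains("spark.sql.shuffle.partitions"))
 {code}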






[jira] [Updated] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-23 Thread Tien-Dung LE (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tien-Dung LE updated SPARK-9280:

Affects Version/s: 1.3.1

 New HiveContext object unexpectedly loads configuration settings from history 
 --

 Key: SPARK-9280
 URL: https://issues.apache.org/jira/browse/SPARK-9280
 Project: Spark
  Issue Type: Bug
Affects Versions: 1.3.1
Reporter: Tien-Dung LE

 In a spark-shell session, stopping a Spark context and creating a new Spark
 context and Hive context does not clean the Spark SQL configuration. More
 precisely, the new Hive context still keeps the previous configuration
 settings. Here is a code snippet that shows this scenario.
 {code:title=New hive context should not load the configurations from history}
 case class Foo(x: Int = (math.random * 1e3).toInt)
 val foo = (1 to 100).map(i => Foo()).toDF
 foo.saveAsParquetFile("foo")
 sqlContext.setConf("spark.sql.shuffle.partitions", "10")
 sc.stop
 val sparkConf2 = new org.apache.spark.SparkConf()
 val sc2 = new org.apache.spark.SparkContext(sparkConf2)
 val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
 sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
 val foo2 = sqlContext2.parquetFile("foo")
 sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
 // expected 30 but got 10
 {code}






[jira] [Updated] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-23 Thread Tien-Dung LE (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tien-Dung LE updated SPARK-9280:

Description: 
In a spark-shell session, stopping a Spark context and creating a new Spark
context and Hive context does not clean the Spark SQL configuration. More
precisely, the new Hive context still keeps the previous configuration
settings. It would be great if someone could let us know how to avoid this
situation.

{code:title=New hive context should not load the configurations from history}
case class Foo(x: Int = (math.random * 1e3).toInt)
val foo = (1 to 100).map(i => Foo()).toDF
foo.saveAsParquetFile("foo")
sqlContext.setConf("spark.sql.shuffle.partitions", "10")

sc.stop

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)

sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
// got 20 as expected
val foo2 = sqlContext2.parquetFile("foo")
sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// expected 30 but got 10
{code}

  was:
In a spark-shell session, stopping a Spark context and creating a new Spark
context and Hive context does not clean the Spark SQL configuration. More
precisely, the new Hive context still keeps the previous configuration
settings. It would be great if someone could let us know how to avoid this
situation.

{code:title=New hive context should not load the configurations from history}
case class Foo(x: Int = (math.random * 1e3).toInt)
val foo = (1 to 100).map(i => Foo()).toDF
foo.saveAsParquetFile("foo")
sqlContext.setConf("spark.sql.shuffle.partitions", "10")

sc.stop

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)

sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
val foo2 = sqlContext2.parquetFile("foo")
sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// expected 30 but got 10
{code}


 New HiveContext object unexpectedly loads configuration settings from history 
 --

 Key: SPARK-9280
 URL: https://issues.apache.org/jira/browse/SPARK-9280
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.3.1
Reporter: Tien-Dung LE

 In a spark-shell session, stopping a Spark context and creating a new Spark
 context and Hive context does not clean the Spark SQL configuration. More
 precisely, the new Hive context still keeps the previous configuration
 settings. It would be great if someone could let us know how to avoid this
 situation.
 {code:title=New hive context should not load the configurations from history}
 case class Foo(x: Int = (math.random * 1e3).toInt)
 val foo = (1 to 100).map(i => Foo()).toDF
 foo.saveAsParquetFile("foo")
 sqlContext.setConf("spark.sql.shuffle.partitions", "10")
 sc.stop
 val sparkConf2 = new org.apache.spark.SparkConf()
 val sc2 = new org.apache.spark.SparkContext(sparkConf2)
 val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
 sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
 // got 20 as expected
 val foo2 = sqlContext2.parquetFile("foo")
 sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
 // expected 30 but got 10
 {code}






[jira] [Updated] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-23 Thread Tien-Dung LE (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tien-Dung LE updated SPARK-9280:

Component/s: SQL

 New HiveContext object unexpectedly loads configuration settings from history 
 --

 Key: SPARK-9280
 URL: https://issues.apache.org/jira/browse/SPARK-9280
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.3.1
Reporter: Tien-Dung LE

 In a spark-shell session, stopping a Spark context and creating a new Spark
 context and Hive context does not clean the Spark SQL configuration. More
 precisely, the new Hive context still keeps the previous configuration
 settings. Here is a code snippet that shows this scenario.
 {code:title=New hive context should not load the configurations from history}
 case class Foo(x: Int = (math.random * 1e3).toInt)
 val foo = (1 to 100).map(i => Foo()).toDF
 foo.saveAsParquetFile("foo")
 sqlContext.setConf("spark.sql.shuffle.partitions", "10")
 sc.stop
 val sparkConf2 = new org.apache.spark.SparkConf()
 val sc2 = new org.apache.spark.SparkContext(sparkConf2)
 val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
 sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
 val foo2 = sqlContext2.parquetFile("foo")
 sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
 // expected 30 but got 10
 {code}






[jira] [Updated] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

2015-07-23 Thread Tien-Dung LE (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tien-Dung LE updated SPARK-9280:

Description: 
In a spark-shell session, stopping a Spark context and creating a new Spark
context and Hive context does not clean the Spark SQL configuration. More
precisely, the new Hive context still keeps the previous configuration
settings. It would be great if someone could let us know how to avoid this
situation.

{code:title=New hive context should not load the configurations from history}
case class Foo(x: Int = (math.random * 1e3).toInt)
val foo = (1 to 100).map(i => Foo()).toDF
foo.saveAsParquetFile("foo")
sqlContext.setConf("spark.sql.shuffle.partitions", "10")

sc.stop

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)

sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
val foo2 = sqlContext2.parquetFile("foo")
sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// expected 30 but got 10
{code}

  was:
In a spark-shell session, stopping a Spark context and creating a new Spark
context and Hive context does not clean the Spark SQL configuration. More
precisely, the new Hive context still keeps the previous configuration
settings. Here is a code snippet that shows this scenario.

{code:title=New hive context should not load the configurations from history}
case class Foo(x: Int = (math.random * 1e3).toInt)
val foo = (1 to 100).map(i => Foo()).toDF
foo.saveAsParquetFile("foo")
sqlContext.setConf("spark.sql.shuffle.partitions", "10")

sc.stop

val sparkConf2 = new org.apache.spark.SparkConf()
val sc2 = new org.apache.spark.SparkContext(sparkConf2)
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)

sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
val foo2 = sqlContext2.parquetFile("foo")
sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// expected 30 but got 10
{code}


 New HiveContext object unexpectedly loads configuration settings from history 
 --

 Key: SPARK-9280
 URL: https://issues.apache.org/jira/browse/SPARK-9280
 Project: Spark
  Issue Type: Bug
  Components: SQL
Affects Versions: 1.3.1
Reporter: Tien-Dung LE

 In a spark-shell session, stopping a Spark context and creating a new Spark
 context and Hive context does not clean the Spark SQL configuration. More
 precisely, the new Hive context still keeps the previous configuration
 settings. It would be great if someone could let us know how to avoid this
 situation.
 {code:title=New hive context should not load the configurations from history}
 case class Foo(x: Int = (math.random * 1e3).toInt)
 val foo = (1 to 100).map(i => Foo()).toDF
 foo.saveAsParquetFile("foo")
 sqlContext.setConf("spark.sql.shuffle.partitions", "10")
 sc.stop
 val sparkConf2 = new org.apache.spark.SparkConf()
 val sc2 = new org.apache.spark.SparkContext(sparkConf2)
 val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
 sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
 val foo2 = sqlContext2.parquetFile("foo")
 sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
 // expected 30 but got 10
 {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org