[GitHub] spark pull request #21726: Branch 2.3

2018-07-18 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/21726





[GitHub] spark pull request #21726: Branch 2.3

2018-07-06 Thread shshahpk
GitHub user shshahpk opened a pull request:

https://github.com/apache/spark/pull/21726

Branch 2.3

## What changes were proposed in this pull request?

(Please fill in changes proposed in this fix)

## How was this patch tested?

(Please explain how this patch was tested. E.g. unit tests, integration tests, manual tests)
(If this patch involves UI changes, please attach a screenshot; otherwise, remove this)

Please review http://spark.apache.org/contributing.html before opening a pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/apache/spark branch-2.3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/21726.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

This closes #21726


commit dfb16147791ff87342ff852105420a5eac5c553b
Author: Dongjoon Hyun 
Date:   2018-02-09T04:54:57Z

[SPARK-23186][SQL] Initialize DriverManager first before loading JDBC Drivers

## What changes were proposed in this pull request?

Since some JDBC drivers have class-initialization code that calls back into `DriverManager`, we need to initialize `DriverManager` first in order to avoid potential executor-side **deadlock** situations like the following (or [STORM-2527](https://issues.apache.org/jira/browse/STORM-2527)).
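
The idea can be illustrated with a minimal, hypothetical Java sketch (this is not Spark's actual `DriverRegistry` code; the class and method names here are invented for illustration). Touching `DriverManager` up front forces its static initializer to finish before any driver class is loaded reflectively, so a driver whose own static initializer calls back into `DriverManager` cannot deadlock against it. The thread dumps below show the cycle this prevents:

```java
import java.sql.DriverManager;

// Hypothetical sketch: ensure DriverManager's <clinit> completes before
// any JDBC driver class is loaded via reflection.
public class DriverRegistrySketch {
    static {
        // Any reference to DriverManager triggers its static initializer,
        // which itself scans for drivers via ServiceLoader. Doing this
        // eagerly, on one thread, breaks the cyclic-initialization deadlock.
        DriverManager.getDrivers();
    }

    // Loading a driver class afterwards is safe: even if its static
    // initializer calls into DriverManager, that class is already
    // fully initialized.
    public static void register(String className) throws Exception {
        Class.forName(className).getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) {
        System.out.println("DriverManager initialized: "
                + (DriverManager.getDrivers() != null));
    }
}
```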

```
Thread 9587: (state = BLOCKED)
 - sun.reflect.NativeConstructorAccessorImpl.newInstance0(java.lang.reflect.Constructor, java.lang.Object[]) bci=0 (Compiled frame; information may be imprecise)
 - sun.reflect.NativeConstructorAccessorImpl.newInstance(java.lang.Object[]) bci=85, line=62 (Compiled frame)
 - sun.reflect.DelegatingConstructorAccessorImpl.newInstance(java.lang.Object[]) bci=5, line=45 (Compiled frame)
 - java.lang.reflect.Constructor.newInstance(java.lang.Object[]) bci=79, line=423 (Compiled frame)
 - java.lang.Class.newInstance() bci=138, line=442 (Compiled frame)
 - java.util.ServiceLoader$LazyIterator.nextService() bci=119, line=380 (Interpreted frame)
 - java.util.ServiceLoader$LazyIterator.next() bci=11, line=404 (Interpreted frame)
 - java.util.ServiceLoader$1.next() bci=37, line=480 (Interpreted frame)
 - java.sql.DriverManager$2.run() bci=21, line=603 (Interpreted frame)
 - java.sql.DriverManager$2.run() bci=1, line=583 (Interpreted frame)
 - java.security.AccessController.doPrivileged(java.security.PrivilegedAction) bci=0 (Compiled frame)
 - java.sql.DriverManager.loadInitialDrivers() bci=27, line=583 (Interpreted frame)
 - java.sql.DriverManager.<clinit>() bci=32, line=101 (Interpreted frame)
 - org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(java.lang.String, java.lang.Integer, java.lang.String, java.util.Properties) bci=12, line=98 (Interpreted frame)
 - org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(org.apache.hadoop.conf.Configuration, java.util.Properties) bci=22, line=57 (Interpreted frame)
 - org.apache.phoenix.mapreduce.PhoenixInputFormat.getQueryPlan(org.apache.hadoop.mapreduce.JobContext, org.apache.hadoop.conf.Configuration) bci=61, line=116 (Interpreted frame)
 - org.apache.phoenix.mapreduce.PhoenixInputFormat.createRecordReader(org.apache.hadoop.mapreduce.InputSplit, org.apache.hadoop.mapreduce.TaskAttemptContext) bci=10, line=71 (Interpreted frame)
 - org.apache.spark.rdd.NewHadoopRDD$$anon$1.<init>(org.apache.spark.rdd.NewHadoopRDD, org.apache.spark.Partition, org.apache.spark.TaskContext) bci=233, line=156 (Interpreted frame)

Thread 9170: (state = BLOCKED)
 - org.apache.phoenix.jdbc.PhoenixDriver.<clinit>() bci=35, line=125 (Interpreted frame)
 - sun.reflect.NativeConstructorAccessorImpl.newInstance0(java.lang.reflect.Constructor, java.lang.Object[]) bci=0 (Compiled frame)
 - sun.reflect.NativeConstructorAccessorImpl.newInstance(java.lang.Object[]) bci=85, line=62 (Compiled frame)
 - sun.reflect.DelegatingConstructorAccessorImpl.newInstance(java.lang.Object[]) bci=5, line=45 (Compiled frame)
 - java.lang.reflect.Constructor.newInstance(java.lang.Object[]) bci=79, line=423 (Compiled frame)
 - java.lang.Class.newInstance() bci=138, line=442 (Compiled frame)
 - org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(java.lang.String) bci=89, line=46 (Interpreted frame)
 - org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$2.apply() bci=7, line=53 (Interpreted frame)
 - org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$2.apply() bci=1, line=52 (Interpreted frame)
 - org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anon$1.<init>(org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD, org.apache.spark.Partition,