Github user Stibbons commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14180#discussion_r73142048
  
    --- Diff: python/pyspark/worker.py ---
    @@ -19,18 +19,27 @@
     Worker that receives input from Piped RDD.
     """
     from __future__ import print_function
    +
     import os
    +import socket
     import sys
     import time
    -import socket
     import traceback
     
    +from pyspark import shuffle
     from pyspark.accumulators import _accumulatorRegistry
    -from pyspark.broadcast import Broadcast, _broadcastRegistry
    +from pyspark.broadcast import Broadcast
    +from pyspark.broadcast import _broadcastRegistry
     from pyspark.files import SparkFiles
    -from pyspark.serializers import write_with_length, write_int, read_long, \
    -    write_long, read_int, SpecialLengths, UTF8Deserializer, PickleSerializer, BatchedSerializer
    -from pyspark import shuffle
    +from pyspark.serializers import BatchedSerializer
    +from pyspark.serializers import PickleSerializer
    +from pyspark.serializers import SpecialLengths
    +from pyspark.serializers import UTF8Deserializer
    +from pyspark.serializers import read_int
    +from pyspark.serializers import read_long
    +from pyspark.serializers import write_int
    +from pyspark.serializers import write_long
    +from pyspark.serializers import write_with_length
     
    --- End diff ---
    
    I have a habit of rearranging import statements. It is more readable and maintainable, and it eases merges. I can move this to an external PR.
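    The style can be sketched with stdlib modules (a hypothetical illustration, not taken from the diff):

    ```python
    # One imported name per line, alphabetized. When two branches each add a
    # different import, they touch different lines, so three-way merges
    # usually apply cleanly instead of conflicting on one long import line.
    from collections import OrderedDict
    from collections import defaultdict
    from collections import namedtuple

    # The imports behave exactly as a combined one-line form would.
    Point = namedtuple("Point", ["x", "y"])
    counts = defaultdict(int)
    counts[Point(0, 0)] += 1
    print(len(counts))  # → 1
    ```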

