Github user tdas commented on the pull request:

    https://github.com/apache/spark/pull/4754#issuecomment-75930186
  
    I tested it, and it's still not working. I enabled verbose logging on spark-submit and saw this:
    
    ```
    [tdas @ Zion spark2] bin/spark-submit --verbose --master local[4] 
--repositories 
https://repository.apache.org/content/repositories/orgapachespark-1069/ 
--packages org.apache.spark:spark-streaming-kafka_2.10:1.3.0 
examples/src/main/python/streaming/kafka_wordcount.py localhost:2181 test
    Using properties file: null
    Parsed arguments:
      master                  local[4]
      deployMode              null
      executorMemory          null
      executorCores           null
      totalExecutorCores      null
      propertiesFile          null
      driverMemory            null
      driverCores             null
      driverExtraClassPath    null
      driverExtraLibraryPath  null
      driverExtraJavaOptions  null
      supervise               false
      queue                   null
      numExecutors            null
      files                   null
      pyFiles                 null
      archives                null
      mainClass               null
      primaryResource         
file:/Users/tdas/Projects/Spark/spark2/examples/src/main/python/streaming/kafka_wordcount.py
      name                    kafka_wordcount.py
      childArgs               [localhost:2181 test]
      jars                    null
      packages                org.apache.spark:spark-streaming-kafka_2.10:1.3.0
      repositories            
https://repository.apache.org/content/repositories/orgapachespark-1069/
      verbose                 true
    
    Spark properties used, including those specified through
     --conf and those from the properties file null:
    
    
    
    Ivy Default Cache set to: /Users/tdas/.ivy2/cache
    The jars for the packages stored in: /Users/tdas/.ivy2/jars
    https://repository.apache.org/content/repositories/orgapachespark-1069/ 
added as a remote repository with the name: repo-1
    :: loading settings :: url = 
jar:file:/Users/tdas/Projects/Spark/spark2/assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar!/org/apache/ivy/core/settings/ivysettings.xml
    org.apache.spark#spark-streaming-kafka_2.10 added as a dependency
    :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found org.apache.spark#spark-streaming-kafka_2.10;1.3.0 in repo-1
        found org.apache.kafka#kafka_2.10;0.8.1.1 in list
        found com.yammer.metrics#metrics-core;2.2.0 in list
        found org.slf4j#slf4j-api;1.7.10 in list
        found org.xerial.snappy#snappy-java;1.1.1.6 in list
        found com.101tec#zkclient;0.3 in list
        found log4j#log4j;1.2.17 in list
        found org.spark-project.spark#unused;1.0.0 in list
    :: resolution report :: resolve 370ms :: artifacts dl 17ms
        :: modules in use:
        com.101tec#zkclient;0.3 from list in [default]
        com.yammer.metrics#metrics-core;2.2.0 from list in [default]
        log4j#log4j;1.2.17 from list in [default]
        org.apache.kafka#kafka_2.10;0.8.1.1 from list in [default]
        org.apache.spark#spark-streaming-kafka_2.10;1.3.0 from repo-1 in 
[default]
        org.slf4j#slf4j-api;1.7.10 from list in [default]
        org.spark-project.spark#unused;1.0.0 from list in [default]
        org.xerial.snappy#snappy-java;1.1.1.6 from list in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   8   |   0   |   0   |   0   ||   8   |   0   |
        ---------------------------------------------------------------------
    :: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        0 artifacts copied, 8 already retrieved (0kB/7ms)
    Main class:
    org.apache.spark.deploy.PythonRunner
    Arguments:
    
file:/Users/tdas/Projects/Spark/spark2/examples/src/main/python/streaming/kafka_wordcount.py
    
/Users/tdas/.ivy2/jars/spark-streaming-kafka_2.10.jar,/Users/tdas/.ivy2/jars/kafka_2.10.jar,/Users/tdas/.ivy2/jars/unused.jar,/Users/tdas/.ivy2/jars/metrics-core.jar,/Users/tdas/.ivy2/jars/snappy-java.jar,/Users/tdas/.ivy2/jars/zkclient.jar,/Users/tdas/.ivy2/jars/slf4j-api.jar,/Users/tdas/.ivy2/jars/log4j.jar
    localhost:2181
    test
    System properties:
    SPARK_SUBMIT -> true
    spark.submit.pyFiles -> 
/Users/tdas/.ivy2/jars/spark-streaming-kafka_2.10.jar,/Users/tdas/.ivy2/jars/kafka_2.10.jar,/Users/tdas/.ivy2/jars/unused.jar,/Users/tdas/.ivy2/jars/metrics-core.jar,/Users/tdas/.ivy2/jars/snappy-java.jar,/Users/tdas/.ivy2/jars/zkclient.jar,/Users/tdas/.ivy2/jars/slf4j-api.jar,/Users/tdas/.ivy2/jars/log4j.jar
    spark.files -> 
file:/Users/tdas/Projects/Spark/spark2/examples/src/main/python/streaming/kafka_wordcount.py,file:/Users/tdas/.ivy2/jars/spark-streaming-kafka_2.10.jar,file:/Users/tdas/.ivy2/jars/kafka_2.10.jar,file:/Users/tdas/.ivy2/jars/unused.jar,file:/Users/tdas/.ivy2/jars/metrics-core.jar,file:/Users/tdas/.ivy2/jars/snappy-java.jar,file:/Users/tdas/.ivy2/jars/zkclient.jar,file:/Users/tdas/.ivy2/jars/slf4j-api.jar,file:/Users/tdas/.ivy2/jars/log4j.jar
    spark.app.name -> kafka_wordcount.py
    spark.jars -> 
file:/Users/tdas/.ivy2/jars/spark-streaming-kafka_2.10.jar,file:/Users/tdas/.ivy2/jars/kafka_2.10.jar,file:/Users/tdas/.ivy2/jars/unused.jar,file:/Users/tdas/.ivy2/jars/metrics-core.jar,file:/Users/tdas/.ivy2/jars/snappy-java.jar,file:/Users/tdas/.ivy2/jars/zkclient.jar,file:/Users/tdas/.ivy2/jars/slf4j-api.jar,file:/Users/tdas/.ivy2/jars/log4j.jar
    spark.master -> local[4]
    spark.driver.extraClassPath -> 
/Users/tdas/.ivy2/jars/spark-streaming-kafka_2.10.jar,/Users/tdas/.ivy2/jars/kafka_2.10.jar,/Users/tdas/.ivy2/jars/unused.jar,/Users/tdas/.ivy2/jars/metrics-core.jar,/Users/tdas/.ivy2/jars/snappy-java.jar,/Users/tdas/.ivy2/jars/zkclient.jar,/Users/tdas/.ivy2/jars/slf4j-api.jar,/Users/tdas/.ivy2/jars/log4j.jar
    Classpath elements:
    /Users/tdas/.ivy2/jars/spark-streaming-kafka_2.10.jar
    /Users/tdas/.ivy2/jars/kafka_2.10.jar
    /Users/tdas/.ivy2/jars/unused.jar
    /Users/tdas/.ivy2/jars/metrics-core.jar
    /Users/tdas/.ivy2/jars/snappy-java.jar
    /Users/tdas/.ivy2/jars/zkclient.jar
    /Users/tdas/.ivy2/jars/slf4j-api.jar
    /Users/tdas/.ivy2/jars/log4j.jar
    
    ```
    
    So I can see that the relevant jars are being added to the classpath elements, but PySpark is still unable to find org.apache.spark.streaming.kafka.KafkaUtils (from /Users/tdas/.ivy2/jars/spark-streaming-kafka_2.10.jar).
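    For what it's worth, one quick way to rule out a bad artifact is to confirm the class file is actually inside the downloaded jar. A throwaway diagnostic sketch (the helper and the demo jar here are hypothetical, not part of this PR; point it at ~/.ivy2/jars/spark-streaming-kafka_2.10.jar to run the real check):

```python
import zipfile

def jar_contains_class(jar_path, class_name):
    """Return True if the jar has an entry for the fully-qualified class."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()

# Demo against a throwaway jar standing in for the Ivy-cached artifact.
with zipfile.ZipFile("demo.jar", "w") as jar:
    jar.writestr("org/apache/spark/streaming/kafka/KafkaUtils.class", b"")

print(jar_contains_class("demo.jar",
                         "org.apache.spark.streaming.kafka.KafkaUtils"))
# prints True
```

    If the class is present in the jar, the problem is more likely how the driver classpath is wired up than the artifact itself.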
    
    Let's debug this tomorrow morning.

