Just wanted to point out that the stack trace doesn't actually show the
error coming from code in the NiFi Site-to-Site client. The timeout is
thrown from org.apache.spark.util.Utils$.doFetchFile, called from
Executor.updateDependencies, which is the executor downloading a jar or
file dependency over HTTP, so I suspect it is something else related to
Spark.
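
That said, Joe's telnet suggestion below is still the quickest way to
rule the NiFi side in or out. If you'd rather check from code, the same
test is a plain socket connect from Scala. A minimal sketch, assuming
the localhost:9099 values from the nifi.properties further down
(substitute whatever host and port the Spark job actually uses):

import java.net.{InetSocketAddress, Socket}

// Programmatic equivalent of "telnet localhost 9099": open a raw TCP
// connection to the Site-to-Site socket port with a 5 second timeout.
val host = "localhost" // assumption: same host the Spark job uses
val port = 9099        // assumption: nifi.remote.input.socket.port

val socket = new Socket()
try {
  socket.connect(new InetSocketAddress(host, port), 5000)
  println(s"Connected to $host:$port, so the listener is reachable")
} catch {
  case e: java.io.IOException =>
    println(s"Could not connect to $host:$port: ${e.getMessage}")
} finally {
  socket.close()
}

If that connects, the listener is fine and the timeout is coming from
somewhere else, per the stack trace.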

Seems similar to this error, but not sure:
https://stackoverflow.com/questions/27013795/failed-to-run-the-spark-example-locally-on-a-macbook-with-error-lost-task-1-0-i
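
For reference, the two articles Kyle links below wire up the Spark side
roughly like this (a minimal sketch based on those write-ups, not a
drop-in: the port name, app name, master, and batch interval are
placeholders). One thing worth double-checking: the url handed to the
Site-to-Site client is NiFi's web address (nifi.web.http.port, 8080 by
default), not the nifi.remote.input.socket.port; the client discovers
the socket port through NiFi's REST API.

import org.apache.nifi.remote.client.SiteToSiteClient
import org.apache.nifi.spark.NiFiReceiver
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Site-to-Site config: url is the NiFi web UI address, portName is the
// name of the Input Port on the NiFi canvas ("Data For Spark" is a
// placeholder -- use the actual port name).
val s2sConfig = new SiteToSiteClient.Builder()
  .url("http://localhost:8080/nifi")
  .portName("Data For Spark")
  .buildConfig()

val sparkConf = new SparkConf().setAppName("NiFi-Spark").setMaster("local[2]")
val ssc = new StreamingContext(sparkConf, Seconds(10))

// Each NiFiDataPacket carries one FlowFile's content and attributes
val packets = ssc.receiverStream(new NiFiReceiver(s2sConfig, StorageLevel.MEMORY_ONLY))
packets.map(p => new String(p.getContent)).print()

ssc.start()
ssc.awaitTermination()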

On Sat, Feb 20, 2016 at 5:16 PM, Joe Witt <joe.w...@gmail.com> wrote:

> Kyle
>
> Can you try connecting to that NiFi port using telnet and see if you are
> able to connect?
>
> Use the same host and port as you are in your spark job.
>
> Thanks
> Joe
> On Feb 20, 2016 4:55 PM, "Kyle Burke" <kyle.bu...@ignitionone.com> wrote:
>
>> All,
>>    I’m attempting to connect Spark to NiFi but I’m getting a “connect
>> timed out” error when Spark tries to pull records from the input port. I
>> don’t understand why I’m getting the issue because NiFi and Spark are
>> both running on my local laptop. Any suggestions about how to get around
>> the issue?
>>
>> *It appears that NiFi is listening on the port because I see the
>> following when running the lsof command:*
>>
>> java    31455 kyle.burke 1054u  IPv4 0x1024ddd67a640091      0t0  TCP *:9099 (LISTEN)
>>
>>
>> *I’ve been following the instructions given in these two articles:*
>> https://blogs.apache.org/nifi/entry/stream_processing_nifi_and_spark
>>
>> https://community.hortonworks.com/articles/12708/nifi-feeding-data-to-spark-streaming.html
>>
>> *Here is how I have my nifi.properties settings:*
>>
>> # Site to Site properties
>> nifi.remote.input.socket.host=
>> nifi.remote.input.socket.port=9099
>> nifi.remote.input.secure=false
>>
>>
>> *Below is the full error stack:*
>>
>> 16/02/20 16:34:45 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
>> java.net.SocketTimeoutException: connect timed out
>> at java.net.PlainSocketImpl.socketConnect(Native Method)
>> at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
>> at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
>> at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
>> at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
>> at java.net.Socket.connect(Socket.java:589)
>> at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
>> at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
>> at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
>> at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
>> at sun.net.www.http.HttpClient.New(HttpClient.java:308)
>> at sun.net.www.http.HttpClient.New(HttpClient.java:326)
>> at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1168)
>> at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1104)
>> at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:998)
>> at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:932)
>> at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:555)
>> at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
>> at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
>> at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
>> at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
>> at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>> at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>> at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>> at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>> at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
>> at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
>> at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
>> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> at java.lang.Thread.run(Thread.java:745)
>>
>>
>> Respectfully,
>>
>> *Kyle Burke* | Data Science Engineer
>> *IgnitionOne* - Marketing Technology. Simplified.
>>
>
