It's probably not your code.

What's the full command line you use to submit the job?

Are you sure the job on the cluster has access to the network interface?
Can you test the receiver by itself without Spark? For example, does this
line work as expected:

List<PcapNetworkInterface> nifs = Pcaps.findAllDevs();
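To run that check outside Spark on a worker node, a minimal standalone sketch might look like the following (the class name and structure are my own; it assumes pcap4j and a native libpcap are available on the node):

```java
import java.util.List;

import org.pcap4j.core.PcapNativeException;
import org.pcap4j.core.PcapNetworkInterface;
import org.pcap4j.core.Pcaps;

// Hypothetical standalone check: list the network interfaces pcap4j can see.
// Run it on a cluster node as the SAME user the Spark worker runs as --
// libpcap typically needs root or CAP_NET_RAW on Linux, so an unprivileged
// worker user can get an empty list even when local[2] on your machine works.
public class ListNifs {
  public static void main(String[] args) throws PcapNativeException {
    List<PcapNetworkInterface> nifs = Pcaps.findAllDevs();
    if (nifs == null || nifs.isEmpty()) {
      System.out.println("No NIFs found");
      return;
    }
    for (PcapNetworkInterface nif : nifs) {
      System.out.println(nif.getName() + ": " + nif.getDescription());
    }
  }
}
```

If this prints interfaces when you run it by hand but nothing when run as the worker user, the difference is likely capture permissions or a missing libpcap on the node, not Spark itself.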

dean

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Mon, Apr 27, 2015 at 4:03 AM, Hai Shan Wu <wuh...@cn.ibm.com> wrote:

> Hi Everyone
>
> We use pcap4j to capture network packets and then use Spark Streaming to
> analyze the captured packets. However, we have run into a strange problem.
>
> If we run our application on Spark locally (for example, spark-submit
> --master local[2]), the program runs successfully.
>
> If we run it on a Spark standalone cluster, the program
> reports that NO NIFs are found.
>
> I also attach two test files for clarification.
>
> Can anyone help with this? Thanks in advance!
>
>
> *(See attached file: PcapReceiver.java)**(See attached file:
> TestPcapSpark.java)*
>
> Best regards,
>
> - Haishan
>
> Haishan Wu (吴海珊)
>
> IBM Research - China
> Tel: 86-10-58748508
> Fax: 86-10-58748330
> Email: wuh...@cn.ibm.com
> Lotus Notes: Hai Shan Wu/China/IBM
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
