Hi Paolo,

The application hangs because the Ignite client node used by the Spark worker 
can't connect to the cluster:

3797 [tcp-client-disco-msg-worker-#4%null%] WARN  
org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi  - IP finder returned empty 
addresses list. Please check IP finder configuration and make sure multicast 
works on your network. Will retry every 2 secs.

To fix the issue, use one of the IP finder implementations [1] so that the 
cluster nodes can discover each other.
One of the most common solutions is TcpDiscoveryVmIpFinder [2]: list the IPs 
of all the cluster nodes and set this IP finder on the IgniteConfiguration at 
node startup.
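
As a minimal sketch (the addresses and port range below are assumptions for a 
local two-node setup; replace them with the actual hosts of your master and 
worker nodes):

    // Sketch: static IP discovery so the nodes can find each other.
    // "127.0.0.1:47500..47509" is an assumption for a local setup with
    // the default discovery port range.
    import java.util.Arrays;

    import org.apache.ignite.configuration.IgniteConfiguration;
    import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi;
    import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder;

    public class DiscoveryConfig {
        public static IgniteConfiguration createConfiguration() {
            TcpDiscoveryVmIpFinder ipFinder = new TcpDiscoveryVmIpFinder();
            // List every host (optionally with a port range) running an Ignite node.
            ipFinder.setAddresses(Arrays.asList("127.0.0.1:47500..47509"));

            TcpDiscoverySpi discoverySpi = new TcpDiscoverySpi();
            discoverySpi.setIpFinder(ipFinder);

            IgniteConfiguration cfg = new IgniteConfiguration();
            cfg.setDiscoverySpi(discoverySpi);
            return cfg;
        }
    }

In your case you would return this configuration from the IgniteOutClosure you 
pass to JavaIgniteContext, instead of the empty new IgniteConfiguration().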

You may also want to refer to the following discussion [3], where the user 
initially had a similar issue with IP finders.

[1] https://apacheignite.readme.io/docs/cluster-config
[2] https://apacheignite.readme.io/v1.6/docs/cluster-config#static-ip-based-discovery
[3] http://apache-ignite-users.70518.x6.nabble.com/Ignite-for-Spark-on-YARN-Deployment-tp5465.html

—
Denis

> On Jun 12, 2016, at 4:24 PM, Paolo Di Tommaso <[email protected]> 
> wrote:
> 
> Hi, 
> 
> I'm giving a try to the Spark integration provided by Ignite by using the 
> embedded deployment mode described here 
> <https://apacheignite-fs.readme.io/docs/installation-deployment>. 
> 
> I've set up a local cluster made up of a master and a worker node. 
> 
> This is my basic Ignite-Spark application: 
> 
> import org.apache.ignite.Ignite;
> import org.apache.ignite.configuration.IgniteConfiguration;
> import org.apache.ignite.lang.IgniteOutClosure;
> import org.apache.ignite.lang.IgniteRunnable;
> import org.apache.ignite.spark.JavaIgniteContext;
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> 
> public class JavaLaunchIgnite {
> 
>     static public void main(String... args) {
>         // -- spark context
>         SparkConf sparkConf = new SparkConf().setAppName("Spark-Ignite");
>         JavaSparkContext sc = new JavaSparkContext(sparkConf);
> 
>         // -- ignite configuration
>         IgniteOutClosure<IgniteConfiguration> cfg = new IgniteOutClosure<IgniteConfiguration>() {
>             @Override public IgniteConfiguration apply() {
>                 return new IgniteConfiguration();
>             }};
> 
>         // -- ignite context
>         JavaIgniteContext<Integer, Integer> ic = new JavaIgniteContext<Integer, Integer>(sc, cfg);
>         final Ignite ignite = ic.ignite();
>         ic.ignite().compute().broadcast(new IgniteRunnable() {
>             @Override public void run() {
>                 System.out.println(">>> Hello Node: " + ignite.cluster().localNode().id());
>             }});
> 
>         ic.close(true);
>         System.out.println(">>> DONE");
>     }
> }
> 
> However, when I submit it, it simply hangs. In the Spark web console I can 
> see that the application is correctly deployed and running, but it never 
> stops. 
> 
> In the Spark worker node I can't find any log produced by Ignite (which is 
> supposed to deploy an Ignite worker). See here 
> <http://pastebin.com/KdEA0KUq>. 
> 
> Instead, I can see the Ignite output in the log of the spark-submit process. 
> See here <http://pastebin.com/Ff6fxYBF>. 
> 
> 
> Does anybody have any clue why this app just hangs? 
> 
> 
> Cheers,
> Paolo
> 
