Your understanding is correct.

How many nodes do you have?

Please provide full logs from the started Ignite instances.
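
[Editor's note: for readers following the thread, here is a minimal sketch of the lifecycle being discussed, assembled from the calls quoted below (`JavaIgniteContext` and `close(boolean)` from the ignite-spark module). The no-op configuration closure is a placeholder for an application-defined provider such as the `IgniteConfigProvider` mentioned later in the thread; this is an illustrative sketch, not verified against a specific Ignite release.]

```java
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.lang.IgniteOutClosure;
import org.apache.ignite.spark.JavaIgniteContext;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class IgniteLifecycleSketch {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setAppName("ignite-lifecycle-sketch"));

        // Third constructor argument is the standalone flag:
        //   false = start embedded Ignite nodes on the Spark workers,
        //   true  = expect Ignite nodes to be started externally.
        JavaIgniteContext<String, String> ic = new JavaIgniteContext<>(
                sc,
                (IgniteOutClosure<IgniteConfiguration>) IgniteConfiguration::new,
                false);

        // ... work with Ignite RDDs here, e.g. ic.fromCache(...) ...

        // close(true) would also stop the Ignite instances running on the
        // workers; close(false) only closes this local Ignite context.
        ic.close(false);
        sc.stop();
    }
}
```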



2016-06-15 18:34 GMT+03:00 Paolo Di Tommaso <[email protected]>:

> OK, using `ic.close(false)` instead of `ic.close(true)`, the exception is
> no longer reported.
>
> However, I'm a bit confused. The close argument is named
> `shutdownIgniteOnWorkers`, so I assumed it had to be set to true in order
> to shut down the Ignite daemons when the app terminates.
>
> How is that flag supposed to be used?
>
>
> Cheers,
> Paolo
>
>
> On Wed, Jun 15, 2016 at 5:06 PM, Paolo Di Tommaso <
> [email protected]> wrote:
>
>> The version is 1.6.0#20160518-sha1:0b22c45b and the following is the
>> script I'm using.
>>
>>
>> https://github.com/pditommaso/gspark/blob/master/src/main/groovy/org/apache/ignite/examples/JavaIgniteSimpleApp.java
>>
>>
>>
>> Cheers, p
>>
>>
>> On Wed, Jun 15, 2016 at 5:00 PM, Alexei Scherbakov <
>> [email protected]> wrote:
>>
>>> I don't think it's OK.
>>>
>>> Which Ignite version are you using?
>>>
>>> 2016-06-15 15:35 GMT+03:00 Paolo Di Tommaso <[email protected]>:
>>>
>>>> Great, now it works! Thanks a lot.
>>>>
>>>>
>>>> I only get an NPE during application shutdown (you can find the
>>>> stack trace at this link <http://pastebin.com/y0EM7qXU>). Is this
>>>> normal? And in any case, is there a way to avoid it?
>>>>
>>>>
>>>> Cheers,
>>>> Paolo
>>>>
>>>>
>>>>
>>>> On Wed, Jun 15, 2016 at 1:25 PM, Alexei Scherbakov <
>>>> [email protected]> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> To automatically start Ignite nodes, you must pass false as the third
>>>>> IgniteContext constructor argument, like this:
>>>>>
>>>>> // java
>>>>> JavaSparkContext sc = ...
>>>>> new JavaIgniteContext<>(sc, new IgniteConfigProvider(), false);
>>>>>
>>>>> or
>>>>>
>>>>> // scala
>>>>> val sc: SparkContext = ...
>>>>> new IgniteContext[String, String](sc, () ⇒ configurationClo(), false)
>>>>>
>>>>> 2016-06-15 13:31 GMT+03:00 Paolo Di Tommaso <[email protected]
>>>>> >:
>>>>>
>>>>>> Hi all,
>>>>>>
>>>>>> I'm struggling to deploy an Ignite application in a (local) Spark
>>>>>> cluster using the embedded deployment described at this link
>>>>>> <https://apacheignite-fs.readme.io/docs/installation-deployment#embedded-deployment>.
>>>>>>
>>>>>>
>>>>>> The documentation seems to suggest that Ignite workers are
>>>>>> instantiated automatically at runtime when the Ignite app is submitted.
>>>>>>
>>>>>> Could you please confirm that this is the expected behaviour?
>>>>>>
>>>>>>
>>>>>> In my tests, the application simply hangs at startup,
>>>>>> reporting this warning message:
>>>>>>
>>>>>> WARN  org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi  - Failed
>>>>>> to connect to any address from IP finder (will retry to join topology
>>>>>> every 2 secs): [/192.168.1.36:47500, /192.168.99.1:47500]
>>>>>>
>>>>>> It looks like there are no Ignite daemons to connect to. Also,
>>>>>> inspecting the Spark worker log, I'm unable to find any messages
>>>>>> produced by Ignite. I would instead expect to find the log messages
>>>>>> produced by the Ignite daemon at startup.
>>>>>>
>>>>>>
>>>>>> Any idea what's wrong?
>>>>>>
>>>>>>
>>>>>> Cheers,
>>>>>> Paolo
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>>
>>>>> Best regards,
>>>>> Alexei Scherbakov
>>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>>
>>> Best regards,
>>> Alexei Scherbakov
>>>
>>
>>
>


-- 

Best regards,
Alexei Scherbakov
