Hi,
In Spark you can use the following options:
* spark.driver.extraJavaOptions
* spark.executor.extraJavaOptions
You can pass your Ignite JVM options there, e.g. -DIGNITE_QUIET=false.
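For example, a minimal sketch (note that Spark does not allow setting the maximum heap size with -Xmx inside extraJavaOptions; heap is sized through spark.driver.memory / spark.executor.memory instead):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Sketch only: heap size goes through spark.{driver,executor}.memory,
// since Spark rejects -Xmx inside extraJavaOptions. Ignite system
// properties can be passed as -D flags.
val conf = new SparkConf()
  .set("spark.executor.memory", "4g") // executor heap instead of -Xmx
  .set("spark.executor.extraJavaOptions", "-DIGNITE_QUIET=false")
  .set("spark.driver.extraJavaOptions", "-DIGNITE_QUIET=false")

// spark.driver.memory must be set before the driver JVM starts,
// e.g. via spark-submit --driver-memory 2g, not programmatically.
val spark = SparkSession.builder().config(conf).getOrCreate()
```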
Generally, clients will be started on the executors during data loading or
data reading, but you can also start them on the driver.
I stopped the client node on the machine where the Spark worker node was
running and started the Spark shell. It started an Ignite client node from within.
The only problem I have is how to specify Ignite JVM options from Spark: it
is taking the default -Xms and -Xmx values, which are very low.
I integrated Spark and Ignite using the thrift client; will that do?
https://github.com/kali786516/ApacheIgnitePoc/blob/master/src/main/scala/com/ignite/examples/spark/SparkClientConnectionTest.scala#L73
On Monday, December 9, 2019, Denis Magda wrote:
Hi, just ensure that "clientMode" is set in the IgniteConfiguration that you
pass to Spark's IgniteContext object. The Spark worker will spin up a client
node automatically for you, and that one will reach out to the server
(assuming you properly configured the Ignite discovery SPI in the same network).
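A sketch of that setup (the host name is a placeholder for your Ignite server's address; the discovery port range 47500..47509 is Ignite's default):

```scala
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.IgniteContext
import org.apache.ignite.spi.discovery.tcp.TcpDiscoverySpi
import org.apache.ignite.spi.discovery.tcp.ipfinder.vm.TcpDiscoveryVmIpFinder

// "ignite-server-host" is a placeholder for the server machine's address.
def cfg(): IgniteConfiguration = {
  val ipFinder = new TcpDiscoveryVmIpFinder()
  ipFinder.setAddresses(java.util.Arrays.asList("ignite-server-host:47500..47509"))
  new IgniteConfiguration()
    .setClientMode(true) // the node Spark spins up joins as a client, not a server
    .setDiscoverySpi(new TcpDiscoverySpi().setIpFinder(ipFinder))
}

// sc is the live SparkContext; the configuration closure is shipped to
// the executors, each of which starts (or reuses) its own client node.
val ic = new IgniteContext(sc, () => cfg())
```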
Hi,
Then what I have currently implemented is hopefully not embedded mode,
is it?
Also, I wanted to know whether I should install client nodes on the Spark
worker nodes if Spark is going to start a client node itself.
--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/
That’s just how the Spark integration works!
I suppose you could use Spark's JDBC connection to access Ignite, but you'd
lose some of the flexibility.
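That JDBC route would look roughly like this (a sketch; the host and table name are placeholders, and 10800 is the default port of Ignite's thin JDBC endpoint):

```scala
// Plain Spark JDBC source pointed at Ignite's thin JDBC endpoint.
// Compared to the native Ignite data source you lose Ignite-specific
// optimizations, but it works with stock Spark plus the Ignite JDBC driver.
val df = spark.read
  .format("jdbc")
  .option("driver", "org.apache.ignite.IgniteJdbcThinDriver")
  .option("url", "jdbc:ignite:thin://ignite-server-host:10800")
  .option("dbtable", "PERSON") // placeholder table name
  .load()

df.printSchema()
```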
Regards,
Stephen
> On 6 Dec 2019, at 17:04, datta wrote:
Hi,
I have installed Ignite on 2 machines:
1 as a server and 1 as a client.
On the client machine I have installed Spark and copied the required
Ignite jars into Spark's jars folder.
The problem occurs when I start Spark and try to read a table in Ignite
using the Spark DataFrame API.
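For reference, a read through the native Ignite data source looks roughly like this (a sketch; the config file path and table name are placeholders):

```scala
import org.apache.ignite.spark.IgniteDataFrameSettings._

// FORMAT_IGNITE and the OPTION_* constants come from the ignite-spark
// module, which must be on the classpath (e.g. the jars copied into
// SPARK_HOME/jars). The XML file is the client-mode Ignite configuration.
val df = spark.read
  .format(FORMAT_IGNITE)
  .option(OPTION_CONFIG_FILE, "/path/to/ignite-config.xml") // placeholder path
  .option(OPTION_TABLE, "PERSON")                           // placeholder table
  .load()

df.show()
```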