That’s just how the Spark integration works! The Ignite data source starts its own client node inside the Spark application in order to talk to the cluster.
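
For reference, here is a rough sketch of the native read path, which I assume is roughly what you are doing (the config file path and table name below are placeholders):

    import org.apache.ignite.spark.IgniteDataFrameSettings._
    import org.apache.spark.sql.SparkSession

    object IgniteNativeReadSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("ignite-native-read")
          .master("local[*]")
          .getOrCreate()

        // Reading through the Ignite data source: load() starts an embedded
        // Ignite client node inside the Spark application to reach the
        // cluster, which is the extra client node you are seeing.
        val df = spark.read
          .format(FORMAT_IGNITE)                                    // "ignite"
          .option(OPTION_CONFIG_FILE, "/path/to/ignite-config.xml") // placeholder path
          .option(OPTION_TABLE, "PERSON")                           // placeholder table
          .load()

        df.show()
        spark.stop()
      }
    }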

I suppose you could use Spark’s JDBC data source with Ignite’s thin JDBC driver to access Ignite instead, but you’d lose some of the flexibility of the native integration.
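
Something like this, as a rough sketch: Spark’s generic JDBC data source only opens a JDBC connection, so no extra client node joins the topology. The host and table name are placeholders, and ignite-core needs to be on the Spark classpath so the driver class can be found:

    import org.apache.spark.sql.SparkSession

    object IgniteJdbcReadSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("ignite-jdbc-read")
          .master("local[*]")
          .getOrCreate()

        // Plain JDBC read: no Ignite client node is started, only a thin
        // JDBC connection to the server node on the default port 10800.
        val df = spark.read
          .format("jdbc")
          .option("driver", "org.apache.ignite.IgniteJdbcThinDriver")
          .option("url", "jdbc:ignite:thin://your-ignite-server:10800") // placeholder host
          .option("dbtable", "PERSON")                                  // placeholder table
          .load()

        df.show()
        spark.stop()
      }
    }

The trade-off is that reads then go through plain SQL over JDBC rather than the Ignite-aware data source.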

Regards,
Stephen

> On 6 Dec 2019, at 17:04, datta <tableau.in...@gmail.com> wrote:
> 
> Hi,
> 
> I have installed Ignite on 2 machines:
> 
> 1 as a server and 1 as a client.
> 
> On the client machine I have installed Spark and copied the required
> Ignite jars into the jars folder under SPARK_HOME.
> 
> The problem is that when I start Spark and try to read an Ignite table
> using the Spark DataFrame API, it starts another client node from Spark.
> 
> How can I prevent this from happening?
> 
> 
> Ignite version: 2.5.0
> Spark version: 2.2.0
> 
> FYI, I am using the Hortonworks Ambari platform. As I could not upgrade
> Spark to 2.3.0 without breaking other things, I had to use Ignite 2.5.0,
> which supports Spark 2.2.0.
> 
> 
> 

