Thanks a lot Stephan.
I will explore Ignite SQL more; it's just that I have to get the data from
the Ignite cache, and I don't want to share and maintain the same Spark context.
I think I will get more clarity if I go through the docs.
Thanks again.
Regards,
Bala
Write to Ignite using the Ignite-Spark integration:

input = spark.read.parquet(HDFS_ACCOUNT)
input.write.format("ignite") \
    .option("table", "sfdc_account_parquet") \
    .option("primaryKeyFields", "key1,key2") \
    .option("config", configFile) \
    .save()
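Once the table is written that way, it can be read back without sharing the Spark context at all, by querying it over SQL from the Python thin client (pyignite). This is only a sketch: the host, port, row limit, and the column names key1/key2 are assumptions, not something confirmed in the thread.

```python
def account_query(fields=('key1', 'key2'), table='sfdc_account_parquet', limit=10):
    # Build the SQL that the thin client would run; tables created through
    # the Spark integration are plain SQL tables in the PUBLIC schema.
    return 'SELECT %s FROM %s LIMIT %d' % (', '.join(fields), table, limit)

def read_accounts(host='127.0.0.1', port=10800):
    # pyignite is the Python thin client; imported lazily so the sketch
    # stays importable even without the package installed.
    from pyignite import Client
    client = Client()
    client.connect(host, port)
    try:
        return list(client.sql(account_query()))
    finally:
        client.close()
```

Since this goes through SQL rather than a shared SparkContext, any thin client (Python, Java, .NET) can read the same data.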
I’m not sure I fully understand what you’re trying to do. It looks like you’re
trying to put an entire DataFrame (a collection of records) into a single value
in Ignite? Even if there’s only a single record, you probably want to put the
row into Ignite rather than the whole DF.
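Storing row-by-row rather than DataFrame-as-a-value might look like the following sketch. The helper name put_rows, the cache object, and the key field are all hypothetical; with Spark, the rows would come from collecting the DataFrame first.

```python
# Hypothetical helper: store each record as its own Ignite cache entry,
# instead of putting the whole DataFrame in as a single value.
# `cache` is anything exposing put(key, value), e.g. a pyignite cache.
def put_rows(cache, rows, key_field):
    for row in rows:
        # One cache entry per record, keyed by the chosen field.
        cache.put(row[key_field], row)

# With Spark, plain dicts can be obtained from the DataFrame first:
#   rows = [r.asDict() for r in accountDF.collect()]
#   put_rows(cache, rows, 'key1')
```

Note that collect() pulls the whole DataFrame to the driver, so this only makes sense for modest data sizes; for large tables the Ignite-Spark DataFrame writer shown elsewhere in this thread is the better path.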
Hi Stephan,

Here is the code:

accountDF = spark.read.parquet(HDFS_ACCOUNT)
self._igCache.put('sfdc_account_parquet', accountDF, value_hint=BinaryObject)

I tried many object types in value_hint, and also tried leaving it empty; nothing worked.

*Error: without value_hint*
TypeError: object of type
Can you share some code and the actual errors you’re getting? It’s not entirely
clear to me what you’re trying to do. Are you using the new Python thin client?
Or are you using Spark's Python support along with Ignite’s support for
DataFrames? And what do you mean by “complex object type”?
Hi,

I'm trying to put a *DataFrame*, after reading it from a Parquet file, into an
Ignite cache, and it throws an error. I have tried all kinds of complex object
types, with no luck.

Is it possible to put a DataFrame using Python? Is it supported? Is there any
better way I can replace Livy with Ignite?

Thanks,
Bala