1. Suppose I read a Parquet file into a DataFrame df using Spark.
When I call df.write.format("ignite").save(), where is the table written?
Is it in an Ignite cache?
Also, can many tables be written in the same way into the same Ignite cache?
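For reference, a minimal sketch of what such a write looks like with the Ignite Spark integration. The option keys ("table", "config", "primaryKeyFields") are the documented ignite-spark writer options, but the table name, column, and config path below are made-up examples; verify the keys against your Ignite version:

```python
# Hypothetical writer options for the Ignite Spark DataFrame integration.
# "person", "id", and "ignite-config.xml" are placeholder values.
write_options = {
    "table": "person",              # SQL table to create/write in Ignite
    "config": "ignite-config.xml",  # path to the Ignite node configuration
    "primaryKeyFields": "id",       # column(s) forming the primary key
}

# The actual write requires a running Spark session and Ignite cluster:
#   df.write.format("ignite").options(**write_options).mode("overwrite").save()
```

Each table written this way is backed by its own cache on the Ignite side, so "many tables into one cache" is not how the integration models it.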
2. If the above is an Ignite cache, then:
What is CacheConfiguration in Java,
and client.get_or_create_cache() in Python?
Here we need to specify the cache schema, right?
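A sketch of what a cache-schema definition can look like for pyignite's get_or_create_cache(). Note the hedge: real pyignite expects property-code constants from pyignite.datatypes.prop_codes (PROP_NAME, PROP_QUERY_ENTITIES, ...) as the dict keys; the plain-string keys and the "person"/"parquet_cache" names below are only illustrative:

```python
# Illustrative shape of a pyignite cache configuration with a SQL schema.
# Keys are shown as strings for readability; pyignite uses prop_codes
# constants. All names here are placeholders.
cache_settings = {
    "name": "parquet_cache",
    "query_entities": [{
        "table_name": "person",
        "key_field_name": "id",
        "value_type_name": "Person",
        "query_fields": [
            {"name": "id", "type_name": "java.lang.Long"},
            {"name": "name", "type_name": "java.lang.String"},
        ],
    }],
}

# With a running cluster and `pip install pyignite`:
#   from pyignite import Client
#   client = Client()
#   client.connect("127.0.0.1", 10800)
#   cache = client.get_or_create_cache(cache_settings)
```

On the Java side, CacheConfiguration plays the same role: it carries the cache name and (via QueryEntity objects) the SQL schema, so yes, the schema is specified there.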
Regards
Arunima Barik
On Mon, 3 Jul, 2023, 1:20 pm Stephen Darlington, <
[email protected]> wrote:
> A quick search came up with this:
> https://www.arm64.ca/post/reading-parquet-files-java/
>
> Data is generally bulk loaded in the loadCache implementation. Which data
> is loaded would depend on your implementation.
>
> On 1 Jul 2023, at 13:11, Arunima Barik <[email protected]> wrote:
>
> Hello Team
>
> I wish to create a cache layer over an existing parquet database. Some
> doubts regarding the same -
>
> 1) How do I read the Parquet file into Ignite - via Spark or something else?
>
> 2) While implementing the CacheStore interface for this, how do I load a
> part of the data into the cache?
> Will it be done automatically, or does it have to be coded explicitly?
>
> Regards
> Arunima
>
>
>
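Stephen's point that loadCache bulk-loads data, with your implementation deciding which data, can be sketched as follows. CacheStore.loadCache is a Java interface; this Python stand-in (a dict plays the cache, a predicate plays the filtering logic, and the record list plays the Parquet rows) only mirrors the control flow:

```python
# Sketch of loadCache-style partial bulk loading: read records, keep only
# those matching a caller-supplied predicate, and bulk-put the survivors.
def load_cache(cache, records, predicate):
    """Bulk-load (key, value) pairs that satisfy `predicate` into `cache`."""
    batch = {key: value for key, value in records if predicate(value)}
    cache.update(batch)  # stands in for Java's cache.putAll(batch)
    return len(batch)

# Placeholder data standing in for rows read from a Parquet file.
cache = {}
records = [
    (1, {"city": "Pune"}),
    (2, {"city": "Delhi"}),
    (3, {"city": "Pune"}),
]
# Only load part of the data: nothing happens automatically, the
# predicate is where you code the selection explicitly.
loaded = load_cache(cache, records, lambda v: v["city"] == "Pune")
```

After this runs, only keys 1 and 3 are in the cache; key 2 was filtered out, which is the "which data is loaded depends on your implementation" part.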