I believe this issue was fixed in Spark 2.4.
Spark DataSource V2 is still under active development and is not yet complete.
So I think the feasible options at the moment are:
1. upgrade to a higher Spark version, or
2. disable filter push-down in your DataSource implementation.
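For option 2 on Spark 2.3.x, one way to disable filter push-down is to have the reader decline every filter it is offered: `pushFilters` returns the full array (meaning nothing was accepted for push-down) and `pushedFilters` reports nothing pushed, so Spark evaluates all filters itself. A minimal sketch against the Spark 2.3 DSV2 API (the class name, schema, and the empty factory list are illustrative assumptions, not from the original post):

```scala
import java.util.Collections

import org.apache.spark.sql.Row
import org.apache.spark.sql.sources.Filter
import org.apache.spark.sql.sources.v2.reader.{DataReaderFactory, DataSourceReader, SupportsPushDownFilters}
import org.apache.spark.sql.types.{StringType, StructType}

// Sketch of a Spark 2.3.x DSV2 reader that opts out of filter push-down.
class MyDataSourceReader extends DataSourceReader with SupportsPushDownFilters {

  // Illustrative single-column schema.
  override def readSchema(): StructType =
    new StructType().add("value", StringType)

  // Decline every filter: returning the full input array tells Spark
  // that none of the filters were pushed down.
  override def pushFilters(filters: Array[Filter]): Array[Filter] = filters

  // Consistently report that nothing was pushed.
  override def pushedFilters(): Array[Filter] = Array.empty

  // Placeholder: a real source would return its actual partitions here.
  override def createDataReaderFactories(): java.util.List[DataReaderFactory[Row]] =
    Collections.emptyList()
}
```

Alternatively, simply not mixing in `SupportsPushDownFilters` at all means Spark never attempts push-down against the reader.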
Hi,
I am using Spark v2.3.2 and have an implementation of DataSource V2 (DSV2). Here is what is happening:
1) Obtained a DataFrame using MyDataSource:
scala> val df1 = spark.read.format("com.shubham.MyDataSource").load
> MyDataSource.MyDataSource
> MyDataSource.createReader: Going to create a new MyDataSourceRead