Hello,

We have an iterator-type RDD and pass the schema as an object.
Some of the rows are inserted, but the bulk of the data is not inserted into the table.
We are using the phoenix 5.0.0-HBase-2.0 dependency in our application.
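For context, a simplified sketch of the per-partition write pattern (this is not our exact job; the table and column names are hypothetical, and Python's sqlite3 stands in for the Phoenix connection so the snippet is runnable outside the cluster):

```python
# Hypothetical, simplified stand-in for the per-partition upsert path.
# sqlite3 replaces the Phoenix connection here; in the real job each
# partition task opens its own connection rather than sharing one.
import sqlite3

def upsert_partition(rows, db_path=":memory:", batch_size=2):
    """Open a connection for the partition, upsert rows in batches,
    and always close the connection when the partition is done."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS t (id INTEGER PRIMARY KEY, val TEXT)"
        )
        buf = []
        for row in rows:
            buf.append(row)
            if len(buf) >= batch_size:
                # Flush a batch and commit before continuing.
                conn.executemany("INSERT OR REPLACE INTO t VALUES (?, ?)", buf)
                conn.commit()
                buf.clear()
        if buf:
            # Flush any remaining rows.
            conn.executemany("INSERT OR REPLACE INTO t VALUES (?, ?)", buf)
            conn.commit()
        return conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    finally:
        conn.close()
```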

A warning appears in the executor logs when we catch the exception in
the insert method. The error is below:

"Error occurred while executing the insert in table for phoenix
ERROR 1111 (XCL11) : Connection is closed".

We have upgraded CDH, Spark, and Phoenix on our cluster to the versions below:

CDH - 5.14 to 6.2.1
Spark - 1.6 to 2.4
Phoenix - 4.8.2-HBase-1.2 to 5.1.0-HBase-2.1.

P.S. We are not using phoenix-connectors for the Phoenix-Spark connection.

Kindly help with this issue.

Thanks,
Ankit Joshi

On Mon, Sep 27, 2021, 2:36 PM Istvan Toth <st...@cloudera.com> wrote:

> This is not enough information to diagnose the problem.
> However, I suggest building and using the HEAD revision from the
> phoenix-connectors repo.
>
>
> On Mon, Sep 27, 2021 at 8:51 AM Ankit Joshi <ankit.joshi00...@gmail.com>
> wrote:
>
>> Hello Team,
>>
>> I am using phoenix 5.0.0-HBase-2.0 as a dependency, and the cluster has
>> Phoenix 5.1.0 and HBase 2.1.
>> I am able to load data from phoenix into DataFrame.
>> But inserting (upsert) from an RDD fails with the error below.
>> "Error occurred while executing the insert in table for phoenix
>> ERROR 1111 (XCL11) : Connection is closed".
>>
>>
>> Please advise on this issue.
>>
>> Thanks,
>>
>>
>
> --
> *István Tóth* | Staff Software Engineer
> st...@cloudera.com <https://www.cloudera.com>
>
