Hello Team,
I am getting an error while upgrading from Phoenix 4.8.2 to 5.2.1:
ERROR 504 (42703) : Undefined column:
ColumnName=SYSTEM.CATALOG.COLUMN_QUALIFIER
Failed upgrading System tables.
Kindly help.
Thanks & Regards,
Ankit Joshi
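[Editor's note: for readers hitting the same undefined-column error, one way to check how far the system-table upgrade got is to inspect SYSTEM.CATALOG itself from sqlline and see whether the COLUMN_QUALIFIER column already exists. A sketch, run against the cluster:]

```
-- From sqlline.py, list the columns Phoenix has registered for SYSTEM.CATALOG;
-- if COLUMN_QUALIFIER is absent, the 4.x -> 5.x metadata upgrade did not complete.
SELECT COLUMN_NAME
FROM SYSTEM.CATALOG
WHERE TABLE_SCHEM = 'SYSTEM' AND TABLE_NAME = 'CATALOG';
```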
Please advise; any help would be appreciated.
Thanks & Regards,
Ankit Joshi
Hi Viraj,
Could you please share the steps, or any document I can refer to, for the
"EXECUTE UPGRADE" command?
Thanks & Regards,
Ankit Joshi
On Wed, Jun 2, 2021, 3:26 PM Viraj Jasani wrote:
> Hi Ankit,
>
> This coproc is no longer in use and should not be loaded.
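[Editor's note: for reference, Phoenix supports a manual-upgrade mode: when phoenix.autoupgrade.enabled is set to false on the client, the system tables are not upgraded automatically on first connect, and the upgrade is triggered explicitly with EXECUTE UPGRADE from sqlline. A sketch; verify the property against your Phoenix version's documentation:]

```
<!-- hbase-site.xml on the client: disable automatic system-table upgrade -->
<property>
  <name>phoenix.autoupgrade.enabled</name>
  <value>false</value>
</property>
```

Then, from sqlline, run `EXECUTE UPGRADE;` at a time of your choosing.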
The compatible Phoenix version for our HBase is 5.1.0, so I downloaded it from
the Apache Phoenix site and placed the Phoenix server jar in /usr/lib/hbase.
Please suggest if anything needs to change in hbase-site.xml, and how I can
update the System tables schema.
Thanks,
Ankit Joshi
On Thu, Jun 3, 2021, 12:56 PM Viraj Jasani wrote:
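[Editor's note: the usual deployment is to place the Phoenix server jar in HBase's lib directory on every Master and RegionServer, then restart HBase so the coprocessor classes are picked up. A sketch with hypothetical paths; the jar name shown is the one published for Phoenix 5.1.0 on HBase 2.1, so adjust to your download:]

```
# Copy the server jar onto every Master and RegionServer host
cp phoenix-server-hbase-2.1-5.1.0.jar /usr/lib/hbase/lib/

# Rolling-restart HBase so the new coprocessor classes are loaded
```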
Thanks & Regards,
Ankit Joshi
We are using phoenix-connectors for the Phoenix Spark connection.
Kindly help on this issue.
Thanks,
Ankit Joshi
On Mon, Sep 27, 2021, 2:36 PM Istvan Toth wrote:
> This is not enough information to diagnose the problem.
> However, I suggest building and using the HEAD revision from the
> phoenix-connectors
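[Editor's note: building the HEAD revision of phoenix-connectors is a standard Maven build. A sketch, to be run on a build host; the repo URL is the Apache mirror on GitHub:]

```
# Clone and build the connectors from the master branch
git clone https://github.com/apache/phoenix-connectors.git
cd phoenix-connectors
mvn clean package -DskipTests
```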
Hello Team,
I am using phoenix 5.0.0-HBase-2.0 as a dependency, while the cluster runs
Phoenix 5.1.0 and HBase 2.1.
I am able to load data from phoenix into DataFrame.
But inserting (upsert) from an RDD fails with the error below:
"Error occurred while executing the insert in table for phoenix
ERROR
Could you please help with this issue?
P.S - Phoenix version is 5.0.0-HBase-2.0; Hadoop version is 3.0.0.
Thanks & Regards,
Ankit Joshi
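[Editor's note: with the phoenix-connectors Spark integration, the upsert path is normally driven through the DataFrame writer rather than the raw RDD API. A hedged Scala sketch; the table name, ZooKeeper quorum, and session setup are placeholders, not from the original thread:]

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("phoenix-upsert").getOrCreate()

// Read a Phoenix table into a DataFrame ("MY_TABLE" and zkUrl are hypothetical)
val df = spark.read
  .format("phoenix")
  .option("table", "MY_TABLE")
  .option("zkUrl", "zkhost:2181")
  .load()

// Write it back; the connector turns this into Phoenix UPSERTs.
// SaveMode.Overwrite is required by the connector for writes.
df.write
  .format("phoenix")
  .option("table", "MY_TABLE")
  .option("zkUrl", "zkhost:2181")
  .mode(SaveMode.Overwrite)
  .save()
```

Note that mixing a 5.0.0 client dependency with a 5.1.0 server, as described above, is itself a likely source of such failures; the client and server versions should match.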
We upgraded both the Phoenix and Hadoop versions:
Phoenix from 4.14-HBase-1.2 to 5.0.0-HBase-2.0
Hadoop from 2.0 to 3.0.0
Note - Similar code works fine with the previous versions.
Please suggest.
Thanks and Regards,
Ankit Joshi
version in CDH 6.2.1.
P.S - We set phoenix.query.maxServerCacheBytes to 200 MB.
Kindly help on this issue.
Thanks,
Ankit Joshi
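[Editor's note: phoenix.query.maxServerCacheBytes is a client-side property that takes a value in bytes in hbase-site.xml, so 200 MB is 209715200 (200 x 1024 x 1024). A config sketch; verify the property spelling against your Phoenix version:]

```
<!-- hbase-site.xml (client side): raise the server cache cap to 200 MB -->
<property>
  <name>phoenix.query.maxServerCacheBytes</name>
  <value>209715200</value>
</property>
```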