You can try increasing phoenix.query.timeoutMs (and
hbase.client.scanner.timeout.period) on the client.
https://phoenix.apache.org/tuning.html
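For reference, a client-side hbase-site.xml override might look like the following sketch (the 10-minute values are illustrative, not a recommendation):

```xml
<!-- client-side hbase-site.xml; values here are placeholders -->
<configuration>
  <property>
    <name>phoenix.query.timeoutMs</name>
    <value>600000</value> <!-- 10 minutes -->
  </property>
  <property>
    <name>hbase.client.scanner.timeout.period</name>
    <value>600000</value>
  </property>
</configuration>
```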
On Fri, May 13, 2016 at 1:51 PM, 景涛 <844300...@qq.com> wrote:
> When I query a very big table,
> I get errors as follows:
>
>
When I query a very big table,
I get errors as follows:
java.lang.RuntimeException: org.apache.phoenix.exception.PhoenixIOException:
org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=36,
exceptions:
Fri May 13 16:15:29 CST 2016, null, java.net.SocketTimeoutException:
I really recommend managing hbase-site.xml outside of your uberjar. You
should already be doing this anyway with other resources, such as
log4j.properties.
If you are intent on setting them programmatically, you'll need to pass
them into the PhoenixConnection's HBaseConfiguration object.
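A minimal sketch of the programmatic route: Phoenix should apply configuration overrides passed as connection Properties to DriverManager.getConnection. The property values and the "zk-host" URL below are placeholder assumptions, not tested settings.

```java
import java.util.Properties;

public class TimeoutProps {
    // Client-side overrides, built in code instead of hbase-site.xml.
    // The 600000 ms (10 minute) values are illustrative only.
    static Properties timeoutProps() {
        Properties props = new Properties();
        props.setProperty("phoenix.query.timeoutMs", "600000");
        props.setProperty("hbase.client.scanner.timeout.period", "600000");
        props.setProperty("hbase.rpc.timeout", "600000");
        return props;
    }

    // Usage (requires the Phoenix client jar and a reachable cluster):
    // try (Connection conn = DriverManager.getConnection(
    //         "jdbc:phoenix:zk-host", TimeoutProps.timeoutProps())) { ... }
}
```

If you are on Jdbi 3, the same Properties object can be handed to Jdbi.create(url, properties), so the overrides travel with the JDBC URL rather than a classpath file.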
Can you tell me how to set these client-side properties programmatically?
I'm using JDBI, which uses JDBC; I'm building the whole application into
an executable jar. It's not clear to me where I would put an
hbase-site.xml, but I suspect that it is easier in any case to set the
Phoenix properties programmatically.
The other important timeout is Phoenix specific: phoenix.query.timeoutMs.
Set this in your hbase-site.xml on the client side to the value in
milliseconds for the amount of time you're willing to wait before the query
finishes. I might be wrong, but I believe the hbase.rpc.timeout config
parameter also needs to be increased on the client.
I am facing the same problem - it seems that my newly applied settings are
not being picked up correctly. I have set hbase.rpc.timeout as well as
phoenix.query.timeoutMs to appropriate values; in addition I changed the
client retries to 50 instead of 36 (I see the number 36 in your message
too).
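For reference, the retry count described above corresponds to hbase.client.retries.number in the client-side configuration; a sketch of that override (50 matches the value mentioned in the thread, the timeout value is illustrative):

```xml
<property>
  <name>hbase.client.retries.number</name>
  <value>50</value>
</property>
<property>
  <name>hbase.rpc.timeout</name>
  <value>600000</value>
</property>
```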
Subject: Re: timeouts for long queries