GUC: transaction_isolation    Value: read committed
GUC: transaction_read_only    Value: off
GUC: transform_null_equals    Value: off
GUC: unix_socket_directory    Value:
GUC: unix_socket_group    Value:
GUC: unix_socket_permissions    Value: 511
GUC: update_process_title    Value: on
GUC: vacuum_cost_delay    Value: 0
GUC: vacuum_cost_limit    Value: 200
GUC: vacuum_cost_page_dirty    Value: 20
GUC: vacuum_cost_page_miss    Value: 10
GUC: vacuum_freeze_min_age    Value: 1
GUC: work_mem    Value: 51200
At 2016-09-21 13:00:41, "Vineet Goel" <vvin...@apache.org> wrote:
Could you please post your SQL DDL statement? How many URLs are in your
external table? Also, for your HASH-distributed table: how many buckets are
defined, if any? Is the number of URLs greater than the number of buckets or
the default_hash_table_bucket_number value? Perhaps you can attach your
hawq-site.xml file as well.
Also see:
http://hdb.docs.pivotal.io/20/datamgmt/load/g-gpfdist-protocol.html
Thanks
Vineet
On Tue, Sep 20, 2016 at 7:07 PM 来熊 <yin@163.com> wrote:
Hi all,
I am testing HAWQ 2.0.0, and I have found a problem:
I load data from an external table (created using a "LIKE target_table"
clause).
If the target table is distributed by some column(s), it raises this error:
External scan error: There are more external files (URLs) than primary
segments that can read them (COptTasks.cpp:1756)
If the target table is distributed randomly, it works well.
I haven't set any special parameters. Does anybody know how to resolve this problem?
Thanks a lot.
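
[Editor's sketch] The pattern described above might look like the following DDL. All table names, host names, and URL counts here are hypothetical, assuming a gpfdist load into a hash-distributed table whose bucket count is smaller than the number of external file URLs:

```sql
-- Hash-distributed target table; its bucket count is fixed at creation
-- (by default_hash_table_bucket_number, or WITH (bucketnum=N) in HAWQ).
CREATE TABLE target_table (id int, payload text)
    DISTRIBUTED BY (id);

-- External table built with LIKE, as in the report. If the number of
-- gpfdist URLs exceeds the target's bucket count, the planner can raise
-- "There are more external files (URLs) than primary segments that can
-- read them".
CREATE EXTERNAL TABLE ext_target (LIKE target_table)
    LOCATION (
        'gpfdist://etl-host1:8081/part1.txt',
        'gpfdist://etl-host2:8081/part2.txt'
        -- ...more gpfdist URLs than the target has buckets...
    )
    FORMAT 'TEXT' (DELIMITER '|');

-- A randomly distributed target has no fixed bucket constraint,
-- which matches the observation that this variant loads fine.
CREATE TABLE target_table_rand (id int, payload text)
    DISTRIBUTED RANDOMLY;
```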