Could you please post your SQL DDL statements? How many URLs do you have in
your external table? Also, for your HASH-distributed table: how many buckets
are defined, if any? Is the number of URLs greater than the number of buckets
(or the default_hash_table_bucket_number value)? Perhaps you could attach your
hawq-site.xml file as well.

Also see:
http://hdb.docs.pivotal.io/20/datamgmt/load/g-gpfdist-protocol.html
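For reference, a minimal sketch of the two DDL shapes involved (table and
host names here are hypothetical; adjust to your schema). The point is that
for a hash-distributed target, the number of gpfdist URLs in the external
table cannot exceed the target's bucket count (or
default_hash_table_bucket_number when no explicit bucket count is set):

```sql
-- Hypothetical example: a hash-distributed target with an explicit
-- bucket count (BUCKETNUM):
CREATE TABLE target_table (id int, name text)
WITH (bucketnum = 6)
DISTRIBUTED BY (id);

-- A gpfdist external table created with LIKE, as in your case.
-- For a hash-distributed target, the number of URLs listed here
-- must not be greater than the bucket count above:
CREATE EXTERNAL TABLE ext_table (LIKE target_table)
LOCATION ('gpfdist://etl1:8081/data/*.csv',
          'gpfdist://etl2:8081/data/*.csv')
FORMAT 'CSV';
```

A randomly distributed target does not have this constraint, which would
explain why DISTRIBUTED RANDOMLY works for you.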

Thanks
Vineet


On Tue, Sep 20, 2016 at 7:07 PM 来熊 <yin....@163.com> wrote:

> Hi, all:
>     I am testing HAWQ 2.0.0 and have found a problem. I load data from an
> external table (created using a "LIKE target_table" statement).
> If the target table is distributed by some column(s), it raises this error:
>  External scan error: There are more external files (URLs) than primary
> segments that can read them (COptTasks.cpp:1756)
> If the target table is distributed randomly, it works well.
> I haven't set any special parameters; does anybody know how to resolve this
> problem?
> Thanks a lot.
>
