You can instead set Profile=HdfsTextSimple rather than setting the fragmenter, accessor, and resolver individually.
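For example, reusing the host and path from your DDL, the table could be declared with the profile alone (an untested sketch; adjust host/port/path to your cluster):

create external table test_pxf(
    a int,
    b char(1)
)
LOCATION ('pxf://vb2.hdp:51200/tmpdata/test.csv?PROFILE=HdfsTextSimple')
FORMAT 'CSV' (DELIMITER = E',');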
On Wed, May 18, 2016 at 1:47 AM, jin tao <[email protected]> wrote:
> Thanks.
>
> Resolved it by using the full Java class path.
>
> The correct DDL:
> create external table test_pxf(
>     a int,
>     b char(1)
> )
> LOCATION
> ('pxf://vb2.hdp:51200/tmpdata/test.csv?Fragmenter=org.apache.hawq.pxf.plugins.hdfs.HdfsDataFragmenter&ACCESSOR=org.apache.hawq.pxf.plugins.hdfs.LineBreakAccessor&Resolver=org.apache.hawq.pxf.plugins.hdfs.StringPassResolver')
> FORMAT 'CSV' (DELIMITER = E',');
>
>
> On 05/18/2016 03:02 PM, jin tao wrote:
>
>> Hi,
>>
>> I installed HAWQ 2.0 and PXF 3.0 in a cluster using Ambari successfully:
>> 1 master, 3 slaves, and 4 PXF agents (one on each machine).
>> When I use PXF to query data on HDFS, it shows:
>> java.lang.ClassNotFoundException: HdfsDataFragmenter.
>> The DDL is:
>> create external table test_pxf(
>>     a int,
>>     b char(1)
>> )
>> LOCATION
>> ('pxf://vb2.hdp:51200/tmpdata/test.csv?Fragmenter=HdfsDataFragmenter&ACCESSOR=LineBreakAccessor&Resolver=TextResolver')
>> FORMAT 'CSV' (DELIMITER = E',');
>>
>> When I run:
>> postgres=# select * from test_pxf;
>> ERROR:  remote component error (500) from '192.168.81.11:51200': type
>> Exception report message java.lang.ClassNotFoundException:
>> HdfsDataFragmenter description The server encountered an internal error
>> that prevented it from fulfilling this request. exception
>> javax.servlet.ServletException: java.lang.ClassNotFoundException:
>> HdfsDataFragmenter (libchurl.c:878)
>>
>> Please, could anyone help me fix this issue?
>> Thanks a lot.
>>
>> Best regards,
>>
>> jin tao
>

--
shivram mani
