Indeed, which was puzzling to me. However, that was not the problem, because we never even reached that point.
Cheers,
Abdullah.

On Tue, Nov 15, 2016 at 9:25 PM, Mike Carey <[email protected]> wrote:

> If that's the problem, we might at least be able to improve the error
> handling/messaging... :-)
>
> On 11/15/16 9:16 PM, abdullah alamoudi wrote:
>
>> There is also a missing parameter: ("input-format"="text-input-format").
>> In HDFS, the file containing the data can have one of many formats: Text,
>> Sequence, RC, etc. Hence, the adapter needs to know the file input format
>> so it can access it.
>>
>> Cheers,
>> ~Abdullah.
>>
>> On Tue, Nov 15, 2016 at 9:13 PM, abdullah alamoudi <[email protected]> wrote:
>>
>>> Where is the attachment?
>>>
>>> On Tue, Nov 15, 2016 at 9:11 PM, mingda li <[email protected]> wrote:
>>>
>>>> Hi,
>>>>
>>>> Has anyone loaded data from HDFS? I hit a problem when loading data with
>>>> the following query:
>>>>
>>>> use dataverse tpcds3;
>>>>
>>>> load dataset inventory
>>>> using hdfs(("hdfs"="hdfs://SCAI01.CS.UCLA.EDU:9000"),
>>>> ("path"="/clash/datasets/tpcds/10/inventory"),
>>>> ("format"="delimited-text"),("delimiter"="|"));
>>>>
>>>> The error in the web interface is:
>>>>
>>>> Internal error. Please check instance logs for further details.
>>>> [NullPointerException]
>>>>
>>>> I checked the cc.log (cluster controller) and found the following problem:
>>>>
>>>> SEVERE: Unable to create adapter
>>>> org.apache.hyracks.algebricks.common.exceptions.AlgebricksException:
>>>> Unable to create adapter
>>>> at org.apache.asterix.metadata.declared.AqlMetadataProvider.getConfiguredAdapterFactory(AqlMetadataProvider.java:990)
>>>> at org.apache.asterix.metadata.declared.LoadableDataSource.buildDatasourceScanRuntime(LoadableDataSource.java:141)
>>>>
>>>> More about the log is in the attachment.
>>>>
>>>> I think there is no problem with the syntax of the query. Does anyone
>>>> have an idea about this?
>>>>
>>>> Thanks,
>>>> Mingda
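
[Editor's note: for reference, a sketch of the load statement with the "input-format" parameter Abdullah mentions added. The HDFS URL, path, and delimiter values are copied from Mingda's original query; whether this alone resolves the NullPointerException is not confirmed in the thread, since the failure reportedly occurs before that parameter is checked.]

use dataverse tpcds3;

load dataset inventory
using hdfs(("hdfs"="hdfs://SCAI01.CS.UCLA.EDU:9000"),
           ("path"="/clash/datasets/tpcds/10/inventory"),
           ("input-format"="text-input-format"),
           ("format"="delimited-text"),
           ("delimiter"="|"));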
