ashoksri0 commented on issue #3299:
URL: https://github.com/apache/iceberg/issues/3299#issuecomment-947732293
> ```java
> table.newAppend().appendFile(dataFile).commit();
> ```
>
It only updates the table metadata when you commit; it won't move the Parquet file into the table's default data directory. For example, if the Parquet file is at /usr/data/xxx.parquet, the metadata manifest file will record the data file URI as /usr/data/xxx.parquet; if the Parquet file is at /tmp/my/xxx.parquet, the manifest will record /tmp/my/xxx.parquet. It will not move the Parquet file to
/user/hive/warehouse/claims_sys.db/SUBSCRIBER/data/xxx.parquet.
>
> So you can create a data directory and move the data file into it before committing.
>
> Or, as @fengguangyuan said, before updating the data file with `table.newAppend().appendFile(dataFile).commit();`, you can run `spark.read().parquet("/path.parquet").writeTo("SUBSCRIBER");`, which may move the data file to the target directory.
As per my understanding, I need to move the Parquet file into the data folder (programmatically or manually) and then add that file's URI to the dataFile. I will get back to you after following the steps above. Thanks to all for the help.
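The "move, then register" steps above can be sketched as follows. This is a minimal, hedged illustration: the staging and warehouse paths are temporary placeholders, the placeholder bytes are not a real Parquet file, and the Iceberg registration step (`DataFiles.builder(...)` plus `table.newAppend().appendFile(dataFile).commit()`) is shown only in comments, since it needs a live catalog and table.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Sketch of moving a data file into the table's data directory before
// registering its URI with Iceberg. Paths here are illustrative stand-ins.
public class MoveThenRegister {
    public static void main(String[] args) throws IOException {
        // Staging location where the Parquet file was originally written.
        Path staging = Files.createTempDirectory("staging");
        Path source = staging.resolve("xxx.parquet");
        Files.write(source, new byte[] {1, 2, 3}); // placeholder bytes, not real Parquet

        // Hypothetical table data directory (stands in for something like
        // /user/hive/warehouse/claims_sys.db/SUBSCRIBER/data).
        Path dataDir = Files.createTempDirectory("warehouse").resolve("data");
        Files.createDirectories(dataDir);

        // Step 1: move the file into the table's data directory.
        Path target = Files.move(source, dataDir.resolve(source.getFileName()),
                StandardCopyOption.REPLACE_EXISTING);

        // Step 2 (not runnable here): register the moved file's URI, e.g.
        //   DataFile dataFile = DataFiles.builder(table.spec())
        //       .withPath(target.toUri().toString())
        //       .withFileSizeInBytes(Files.size(target))
        //       .withRecordCount(rowCount) // plus format/metrics as needed
        //       .build();
        //   table.newAppend().appendFile(dataFile).commit();
        System.out.println(target.toUri());
    }
}
```

The key point, matching the quoted explanation: the commit records whatever URI the `DataFile` carries, so the move must happen before the file is registered.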
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]