[
https://issues.apache.org/jira/browse/AMBARI-13773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15056266#comment-15056266
]
Raf Mathijs commented on AMBARI-13773:
--------------------------------------
https://github.com/apache/ambari/blob/trunk/contrib/views/files/src/main/java/org/apache/ambari/view/filebrowser/UploadService.java
Could it be related to the 1024-byte chunk size in this code?
private void uploadFile(final String filePath, InputStream uploadedInputStream)
    throws IOException, InterruptedException {
  int read;
  byte[] chunk = new byte[1024];
  FSDataOutputStream out = null;
  try {
    out = getApi(context).create(filePath, false);
    while ((read = uploadedInputStream.read(chunk)) != -1) {
      out.write(chunk, 0, read);
    }
  } finally {
    if (out != null) {
      out.close();
    }
  }
}
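Looking more closely, the loop above writes only the 'read' bytes returned by each read() call, so the 1024-byte buffer by itself should not pad anything. But if some other layer of the upload path writes whole 1024-byte chunks instead of only the bytes actually read, the last partial chunk would be padded out to a 1024-byte boundary, which would match the numbers reported below: 60.04 KB is about 61481 bytes, and rounding that up to the next multiple of 1024 gives exactly 62464 bytes = 61 KB, i.e. roughly 983 stray bytes at the end of the file. Here is a minimal sketch of that hypothesis; the padded variant is hypothetical and is NOT the trunk code quoted above:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical illustration of the padding theory; the trunk code above
// correctly writes only 'read' bytes per iteration.
public class ChunkPaddingDemo {

  // Correct copy: write only the bytes actually returned by read().
  static void copyExact(InputStream in, OutputStream out) throws IOException {
    byte[] chunk = new byte[1024];
    int read;
    while ((read = in.read(chunk)) != -1) {
      out.write(chunk, 0, read);
    }
  }

  // Buggy variant (hypothetical): writes the full chunk every time, so the
  // final partial read gets padded with stale/zero bytes up to 1024 bytes.
  static void copyPadded(InputStream in, OutputStream out) throws IOException {
    byte[] chunk = new byte[1024];
    while (in.read(chunk) != -1) {
      out.write(chunk); // always writes all 1024 bytes
    }
  }

  public static void main(String[] args) throws IOException {
    byte[] input = new byte[61481]; // ~60.04 KB, the size Hue reports
    ByteArrayOutputStream exact = new ByteArrayOutputStream();
    ByteArrayOutputStream padded = new ByteArrayOutputStream();
    copyExact(new ByteArrayInputStream(input), exact);
    copyPadded(new ByteArrayInputStream(input), padded);
    System.out.println(exact.size());  // 61481 bytes (~60.04 KB)
    System.out.println(padded.size()); // 62464 bytes = exactly 61 KB
  }
}

If that theory holds, the culprit would not be this loop but whichever component buffers the multipart upload stream in fixed-size chunks. For what it's worth, Hadoop already ships org.apache.hadoop.io.IOUtils.copyBytes(in, out, bufferSize, close), which implements this copy loop correctly.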
> Ambari views file upload corruption
> -----------------------------------
>
> Key: AMBARI-13773
> URL: https://issues.apache.org/jira/browse/AMBARI-13773
> Project: Ambari
> Issue Type: Bug
> Components: ambari-views
> Affects Versions: 2.1.0
> Environment: HDP 2.3.0.0-2557 sandbox running in VirtualBox 5, hosted
> on Windows 7 Pro; 4 cores, 8 GB RAM.
> Reporter: Régis GARMY
>
> When uploading text files (CSV), the files are corrupted.
> This affects both the Ambari HDFS Files view and the Ambari Local Files view.
> When I upload a 100-row CSV file located on the Windows 7 NTFS filesystem,
> the file is transferred without error, but it is bigger on both HDFS and
> CentOS storage: it has 102 rows, and the end of the file has been filled
> with extra data. It looks like data-block padding.
> The same file is uploaded correctly with the Hue file manager.
> The file size reported by the HDFS web UI is always an integer when the
> upload is done through Ambari (61 KB), but fractional when done through
> Hue (60.04 KB).
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)