cates, rather
than fixed.
     [ https://issues.apache.org/jira/browse/HADOOP-308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Owen O'Malley reopened HADOOP-308:
----------------------------------

> Task Tracker does not handle the case of read only local dir case correctly
> ----------------------------------------------------------------------------
>
>                 Key: HADOOP-308
>                 URL: https://issues.apache.org/jira/browse/HADOOP-308
>             Project: Hadoop
…d by HADOOP-2227.
A node has 4 local disks, of which only one is read-only, and the TaskTracker
seemingly submits every task to the read-only disk (no job has been
successfully submitted since Nov 30), although mapred.local.dir in
hadoop-site.xml specifies local directories on all 4 disks. This node has 3
good disks and still accepts tasks, but cannot execute any of them.
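For illustration only: a TaskTracker-side workaround could probe each directory
listed in mapred.local.dir and hand out task work dirs only from the ones that
are actually writable. The sketch below is a minimal version of that idea, not
the fix that eventually went in (see HADOOP-2227); the class name and the
probe-file approach are made up.

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;

// Hypothetical helper: keep only the entries of mapred.local.dir that are
// actually writable, so task work dirs never land on a read-only disk.
public class WritableLocalDirs {

  public static List<String> usableDirs(Configuration conf) {
    List<String> usable = new ArrayList<String>();
    String[] dirs = conf.getStrings("mapred.local.dir");
    if (dirs == null) {
      return usable;
    }
    for (String dir : dirs) {
      try {
        // Probe by creating and deleting a real file; File.canWrite() alone
        // can be misleading for read-only mounts.
        File probe = File.createTempFile("probe", null, new File(dir));
        probe.delete();
        usable.add(dir);
      } catch (IOException e) {
        // Read-only or otherwise unusable directory: skip it.
      }
    }
    return usable;
  }
}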
…and all in-memory content was flushed, as well as setting the table
read-only. This would mean that the hbase mapfiles in HDFS had all updates
persisted. The files in the FS would then be as fit as possible
for feeding directly into MapReduce jobs (keys would be row/column/timestamp).
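Once flushed and marked read-only, those store files are ordinary HDFS
MapFiles, so a job (or a quick scan) could read them back directly. A rough
sketch, with Text used as a stand-in for whatever key/value classes hbase
actually writes (the excerpt only says the key would encode
row/column/timestamp):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.io.MapFile;
import org.apache.hadoop.io.Text;

// Sketch: walk one MapFile directory in HDFS in key order.
// Text is a placeholder; the real classes depend on what hbase writes.
public class ScanStoreFile {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    MapFile.Reader reader = new MapFile.Reader(fs, args[0], conf);
    try {
      Text key = new Text();
      Text value = new Text();
      while (reader.next(key, value)) {
        System.out.println(key + "\t" + value);
      }
    } finally {
      reader.close();
    }
  }
}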
HDFS files are read-only; that is, you cannot modify them after they have been
created, written to, and closed.
But you can always remove them. There is no protection against that as
of today.
File permissions are being developed as we speak.
http://issues.apache.org/jira/browse/HADOOP-1298
You can also
Any thoughts on how to make files in hdfs read-only? We would like to
protect some files.
How could that be implemented?
cheers
--
Torsten
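Once the permission support tracked in HADOOP-1298 is in place, protecting a
file could look roughly like the sketch below. It assumes a
FileSystem.setPermission(Path, FsPermission) style call and uses a made-up
path; treat it as an illustration of the idea, not a description of what will
ship.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

// Sketch: drop write permission on an existing HDFS file (mode r--r--r--).
public class MakeReadOnly {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path file = new Path("/data/important.seq");   // hypothetical path
    fs.setPermission(file, new FsPermission((short) 0444));
  }
}

Note that, as in POSIX, deleting a file is governed by write permission on its
parent directory rather than on the file itself, so guarding against removal
also means tightening the directory's mode.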
[hbase] Add a read-only attribute to columns
---------------------------------------------

                 Key: HADOOP-1958
                 URL: https://issues.apache.org/jira/browse/HADOOP-1958
             Project: Hadoop
          Issue Type: New Feature
          Components: contrib/hbase
exception like:

… (Read-only file system)
        at java.io.FileOutputStream.open(Native Method)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:179)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:723)
        at
Task Tracker does not handle the case of read only local dir case correctly
-----------------------------------------------------------------------------

                 Key: HADOOP-308
                 URL: http://issues.apache.org/jira/browse/HADOOP-308
             Project: Hadoop
                Type: Bug
            Versions