[ 
https://issues.apache.org/jira/browse/HADOOP-5974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12717373#action_12717373
 ] 

Konstantin Boudnik commented on HADOOP-5974:
--------------------------------------------

My patch is pretty much ready, and it requires a couple of libraries to be added 
to the Hadoop project. These libraries aren't associated with any Apache 
project: they are licensed under the Eclipse Public License and are distributed 
from their website.

I'm not sure what the 'rule of thumb' is for adding these libraries to the Ivy 
configuration for Hadoop. Or should they be added statically, e.g. checked into 
the SVN repository? I assume the latter is generally a bad idea, which leaves 
us with the former option.

Can any of the watchers comment on this, please?
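For the Ivy route, the entry would presumably look something like the sketch below; the org, name, rev, and resolver values are placeholders, not the actual libraries from the patch:

```xml
<!-- ivy.xml: hypothetical entry for an externally hosted, EPL-licensed
     library, pulled in only for the test configuration.
     org/name/rev are placeholders. -->
<dependency org="org.example" name="example-lib" rev="1.0.0"
            conf="test->default"/>
```

A matching resolver pointing at the library's download site would also be needed in ivysettings.xml, since the artifacts aren't in the default Maven repositories.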

> Add orthogonal fault injection mechanism/framework
> --------------------------------------------------
>
>                 Key: HADOOP-5974
>                 URL: https://issues.apache.org/jira/browse/HADOOP-5974
>             Project: Hadoop Core
>          Issue Type: Test
>          Components: test
>            Reporter: Konstantin Boudnik
>            Assignee: Konstantin Boudnik
>
> It'd be great to have a fault injection mechanism for Hadoop.
> Having such a solution in place would allow us to increase test coverage of 
> error handling and recovery mechanisms, reduce reproduction time, and 
> increase the reproduction rate of problems.
> Ideally, the system has to be orthogonal to the current code and test base. 
> E.g. faults have to be injected at build time and have to be configurable: 
> all faults could be turned off, or only some of them allowed to happen. 
> Also, fault injection has to be separated from the production build. 
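To illustrate the configurability the description asks for, here is a minimal sketch of a fault-injection decision point guarded by a per-fault probability; the class and method names are illustrative, not taken from the actual patch:

```java
// Hypothetical sketch of a configurable fault-injection point.
// Each fault has a probability in [0.0, 1.0]; 0.0 turns the fault off
// entirely, 1.0 makes it fire on every invocation, and intermediate
// values allow selective, probabilistic triggering.
public class FaultProbability {

    private final double probability;

    public FaultProbability(double probability) {
        this.probability = probability;
    }

    // Decides whether the fault should fire for a given random roll
    // in [0.0, 1.0). Passing the roll in explicitly keeps the logic
    // deterministic and testable.
    public boolean shouldFire(double roll) {
        return roll < probability;
    }
}
```

In a build-time weaving setup, injected advice would consult such a check before raising the fault, so production builds (which omit the woven classes) are unaffected.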

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
