[
https://issues.apache.org/jira/browse/HDFS-6819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14101021#comment-14101021
]
Colin Patrick McCabe commented on HDFS-6819:
--------------------------------------------
bq. My question is: what is the common practice for injecting faults? When I
need to inject some code into the DataNode, should I add a method to
DataNodeFaultInjector and call that method where I want to inject the error?
There are a few different ways to do it. One is to use the "Injector" classes.
This works well when you have a bunch of different system tests that want to
test the same class of faults. I also find it to be very clear, since it's
obvious exactly where the faults are being injected.
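A minimal sketch of that pattern, loosely modeled on DataNodeFaultInjector. The class and method names below are illustrative, not the actual HDFS APIs, and the hook throws RuntimeException rather than IOException to keep the sketch short:

```java
// Hypothetical injector: production code holds a singleton whose hook methods
// are no-ops; a test swaps in a subclass that simulates a fault.
class FaultInjector {
    private static FaultInjector instance = new FaultInjector();

    static FaultInjector get() { return instance; }
    static void set(FaultInjector injector) { instance = injector; }

    // No-op in production; a test overrides this to throw.
    void beforeBlockWrite() { }
}

class BlockWriter {
    // Production code calls the hook at the exact point where a fault may be
    // injected, which is what makes the injection site obvious to the reader.
    String writeBlock() {
        FaultInjector.get().beforeBlockWrite();
        return "block written";
    }
}
```

A system test then installs a throwing subclass via {{FaultInjector.set(...)}} before exercising the code path, and restores the no-op instance afterwards.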
Another is to use composition, creating a true unit test (not a system test)
that uses individual classes directly. This is the best method by far, when
you can do it. But of course, not all tests are true unit tests.
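A hedged sketch of the composition approach: the class under test takes its dependency through the constructor, so a true unit test can hand it a faulty implementation directly, with no framework at all. The {{Storage}} and {{BlockReader}} names here are made up for illustration, not HDFS classes:

```java
// Dependency expressed as an interface, so a test can supply any behavior.
interface Storage {
    byte[] read(String blockId);
}

class BlockReader {
    private final Storage storage;

    BlockReader(Storage storage) { this.storage = storage; }

    // Returns the block contents, or an empty array if the storage layer fails.
    byte[] readOrEmpty(String blockId) {
        try {
            return storage.read(blockId);
        } catch (RuntimeException e) {
            return new byte[0];
        }
    }
}
```

The "fault" is then just a lambda that throws, passed straight into the constructor.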
Finally, you can use Mockito, which can create "mock" classes with just one or
a few methods replaced. There are some pretty advanced uses of Mockito in the
code that you can learn from. Check out {{TestBPOfferService.java}}, for
example.
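A rough sketch of the Mockito style (assuming Mockito is on the classpath, as it is for the HDFS tests). A spy wraps a real object and leaves every method intact except the ones explicitly stubbed, which is the "just one or a few methods replaced" case:

```java
import static org.mockito.Mockito.doThrow;
import static org.mockito.Mockito.spy;

import java.util.ArrayList;
import java.util.List;

class SpyFaultSketch {
    // Illustrative only: a real test would spy an HDFS class, not a List.
    static List<String> listThatFailsOnClear() {
        List<String> spied = spy(new ArrayList<String>());
        // Replace only clear(); add(), size(), etc. keep their real behavior.
        doThrow(new RuntimeException("injected fault")).when(spied).clear();
        return spied;
    }
}
```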
Whatever method is clearest and simplest is the best!
> make HDFS fault injection framework working with maven
> ------------------------------------------------------
>
> Key: HDFS-6819
> URL: https://issues.apache.org/jira/browse/HDFS-6819
> Project: Hadoop HDFS
> Issue Type: Task
> Reporter: George Wong
> Assignee: George Wong
>
> In the current trunk code repo, the FI framework does not work, because the
> maven build process does not execute the AspectJ injection.
> Since FI is very useful for testing and for reproducing bugs, it is better to
> make the FI framework work in the trunk code.
--
This message was sent by Atlassian JIRA
(v6.2#6252)