[Gluster-Maintainers] Jenkins build is back to normal : regression-test-burn-in #1818

2016-10-03 Thread jenkins
See 

___
maintainers mailing list
maintainers@gluster.org
http://www.gluster.org/mailman/listinfo/maintainers


[Gluster-Maintainers] Build failed in Jenkins: regression-test-burn-in #1819

2016-10-03 Thread jenkins
See 

--
[...truncated 2050 lines...]
ok 37, LINENUM:76
ok 38, LINENUM:77
ok 39, LINENUM:78
ok
All tests successful.
Files=1, Tests=39, 100 wallclock secs ( 0.02 usr  0.00 sys + 16.82 cusr  4.80 csys = 21.64 CPU)
Result: PASS
End of test ./tests/basic/afr/granular-esh/add-brick.t




[03:56:59] Running tests in file ./tests/basic/afr/granular-esh/conservative-merge.t
./tests/basic/afr/granular-esh/conservative-merge.t .. 
1..85
ok 1, LINENUM:11
ok 2, LINENUM:12
ok 3, LINENUM:13
ok 4, LINENUM:14
ok 5, LINENUM:15
ok 6, LINENUM:16
ok 7, LINENUM:17
ok 8, LINENUM:18
ok 9, LINENUM:20
ok 10, LINENUM:21
ok 11, LINENUM:23
ok 12, LINENUM:29
ok 13, LINENUM:31
ok 14, LINENUM:33
ok 15, LINENUM:35
ok 16, LINENUM:40
ok 17, LINENUM:41
ok 18, LINENUM:42
ok 19, LINENUM:43
ok 20, LINENUM:44
ok 21, LINENUM:45
ok 22, LINENUM:46
ok 23, LINENUM:47
ok 24, LINENUM:48
ok 25, LINENUM:49
ok 26, LINENUM:51
ok 27, LINENUM:52
ok 28, LINENUM:53
ok 29, LINENUM:54
ok 30, LINENUM:56
ok 31, LINENUM:59
ok 32, LINENUM:63
ok 33, LINENUM:67
ok 34, LINENUM:68
ok 35, LINENUM:69
ok 36, LINENUM:70
ok 37, LINENUM:71
ok 38, LINENUM:72
ok 39, LINENUM:73
ok 40, LINENUM:74
ok 41, LINENUM:75
ok 42, LINENUM:76
ok 43, LINENUM:78
ok 44, LINENUM:79
ok 45, LINENUM:80
ok 46, LINENUM:82
ok 47, LINENUM:83
ok 48, LINENUM:84
ok 49, LINENUM:85
ok 50, LINENUM:87
ok 51, LINENUM:90
ok 52, LINENUM:94
ok 53, LINENUM:95
ok 54, LINENUM:100
ok 55, LINENUM:100
ok 56, LINENUM:100
ok 57, LINENUM:100
ok 58, LINENUM:103
ok 59, LINENUM:104
ok 60, LINENUM:106
ok 61, LINENUM:107
ok 62, LINENUM:109
ok 63, LINENUM:110
ok 64, LINENUM:112
ok 65, LINENUM:113
ok 66, LINENUM:116
ok 67, LINENUM:117
ok 68, LINENUM:118
ok 69, LINENUM:119
ok 70, LINENUM:120
ok 71, LINENUM:121
ok 72, LINENUM:122
ok 73, LINENUM:123
ok 74, LINENUM:124
ok 75, LINENUM:125
ok 76, LINENUM:126
ok 77, LINENUM:127
ok 78, LINENUM:129
ok 79, LINENUM:130
ok 80, LINENUM:131
ok 81, LINENUM:132
ok 82, LINENUM:133
ok 83, LINENUM:134
ok 84, LINENUM:135
ok 85, LINENUM:136
ok
All tests successful.
Files=1, Tests=85, 113 wallclock secs ( 0.03 usr  0.01 sys + 19.30 cusr  5.35 csys = 24.69 CPU)
Result: PASS
End of test ./tests/basic/afr/granular-esh/conservative-merge.t




[03:58:52] Running tests in file ./tests/basic/afr/granular-esh/granular-esh.t
stat: cannot stat `/d/backends/patchy1/.glusterfs/indices/entry-changes/----0001/dir': No such file or directory
./tests/basic/afr/granular-esh/granular-esh.t .. 
1..82
ok 1, LINENUM:11
ok 2, LINENUM:12
ok 3, LINENUM:13
ok 4, LINENUM:14
ok 5, LINENUM:15
ok 6, LINENUM:16
ok 7, LINENUM:17
ok 8, LINENUM:18
ok 9, LINENUM:19
ok 10, LINENUM:21
ok 11, LINENUM:30
ok 12, LINENUM:39
ok 13, LINENUM:48
ok 14, LINENUM:53
ok 15, LINENUM:53
ok 16, LINENUM:57
ok 17, LINENUM:60
ok 18, LINENUM:63
ok 19, LINENUM:66
ok 20, LINENUM:71
ok 21, LINENUM:71
ok 22, LINENUM:74
ok 23, LINENUM:75
ok 24, LINENUM:76
ok 25, LINENUM:79
ok 26, LINENUM:80
ok 27, LINENUM:81
ok 28, LINENUM:84
ok 29, LINENUM:92
ok 30, LINENUM:95
ok 31, LINENUM:97
ok 32, LINENUM:98
ok 33, LINENUM:101
not ok 34, LINENUM:103
FAILED COMMAND: stat /d/backends/patchy1/.glusterfs/indices/entry-changes/----0001/dir
ok 35, LINENUM:104
ok 36, LINENUM:105
ok 37, LINENUM:106
ok 38, LINENUM:108
ok 39, LINENUM:109
ok 40, LINENUM:110
ok 41, LINENUM:112
ok 42, LINENUM:113
ok 43, LINENUM:114
ok 44, LINENUM:115
ok 45, LINENUM:116
ok 46, LINENUM:119
ok 47, LINENUM:124
ok 48, LINENUM:124
ok 49, LINENUM:129
ok 50, LINENUM:130
ok 51, LINENUM:129
ok 52, LINENUM:130
ok 53, LINENUM:129
ok 54, LINENUM:130
ok 55, LINENUM:134
ok 56, LINENUM:135
ok 57, LINENUM:136
ok 58, LINENUM:137
ok 59, LINENUM:138
ok 60, LINENUM:139
ok 61, LINENUM:140
ok 62, LINENUM:143
ok 63, LINENUM:144
ok 64, LINENUM:145
ok 65, LINENUM:146
ok 66, LINENUM:147
ok 67, LINENUM:148
ok 68, LINENUM:149
ok 69, LINENUM:150
ok 70, LINENUM:152
ok 71, LINENUM:153
ok 72, LINENUM:154
ok 73, LINENUM:156
ok 74, LINENUM:157
ok 75, LINENUM:158
ok 76, LINENUM:159
ok 77, LINENUM:160
ok 78, LINENUM:161
ok 79, LINENUM:163
ok 80, LINENUM:164
ok 81, LINENUM:165
ok 82, LINENUM:166
Failed 1/82 subtests 

Test Summary Report
---
./tests/basic/afr/granular-esh/granular-esh.t (Wstat: 0 Tests: 82 Failed: 1)
  Failed test:  34
Files=1, Tests=82, 99 wallclock secs ( 0.03 usr  0.01 sys + 15.82 cusr  4.65 csys = 20.51 CPU)
Result: FAIL
End of test ./tests/basic/afr/granular-esh/granular-esh.t



Run complete

Re: [Gluster-Maintainers] [Gluster-devel] 'Reviewd-by' tag for commits

2016-10-03 Thread Pranith Kumar Karampuri
On Mon, Oct 3, 2016 at 12:17 PM, Joe Julian  wrote:

> If you get credit for +1, shouldn't you also get credit for -1? It seems
> to me that catching a fault is at least as valuable if not more so.
>

Yes, when I said review, it could be either +1/-1/+2.


>
> On October 3, 2016 3:58:32 AM GMT+02:00, Pranith Kumar Karampuri <pkara...@redhat.com> wrote:
>>
>>
>>
>> On Mon, Oct 3, 2016 at 7:23 AM, Ravishankar N wrote:
>>
>>> On 10/03/2016 06:58 AM, Pranith Kumar Karampuri wrote:
>>>
>>>
>>>
>>> On Mon, Oct 3, 2016 at 6:41 AM, Pranith Kumar Karampuri <pkara...@redhat.com> wrote:
>>>


On Fri, Sep 30, 2016 at 8:50 PM, Ravishankar N wrote:

> On 09/30/2016 06:38 PM, Niels de Vos wrote:
>
> On Fri, Sep 30, 2016 at 07:11:51AM +0530, Pranith Kumar Karampuri wrote:
>
> hi,
>  At the moment the 'Reviewed-by' tag is added only if a +1 is given on the
> final version of the patch. But for most patches, different people spend
> time on different versions making the patch better, and they may not get
> time to review every version of the patch. Is it possible to change the
> gerrit script to add 'Reviewed-by' for all the people who participated in
> the review?
>
> +1 to this. As for the argument that this *might* encourage me-too +1s, it
> only exposes such persons in a bad light.
>
> Or removing the 'Reviewed-by' tag completely would also help to make sure
> it doesn't give skewed counts.
>
> I'm not going to lie: for me, that takes away the incentive to do any
> reviews at all.
>

Could you elaborate why? Maybe you should also talk about your primary
motivation for doing reviews.

>>>
>>> I guess it is probably because the effort needs to be recognized? I
>>> think the tag is an option to recognize it, so it is probably not a good
>>> idea to remove it.
>>>
>>>
>>> Yes, numbers provide good motivation for me:
>>> Motivation for looking at patches and finding bugs in known components
>>> even though I am not their maintainer.
>>> Motivation to learn new components, because chasing a bug and its fix is
>>> usually when I look at the code of unfamiliar components.
>>> Motivation to level up when statistics indicate I'm behind my peers.
>>>
>>> I think even you said some time back in an ML thread that what can be
>>> measured can be improved.
>>>
>>
>> I am still not sure how to distinguish a good review from a bad one, so I
>> am not sure how it can be measured and thus improved. I guess at this point
>> getting more eyes on the patches is good enough.
>>
>>
>>>
>>> -Ravi
>>>
>>>
>>>

 I would not feel comfortable automatically adding Reviewed-by tags for
> people that did not review the last version. They may not agree with the
> last version, so adding their "approved stamp" on it may not be correct.
> See the description of Reviewed-by in the Linux kernel sources [0].
>
> While the Linux kernel model is the poster child for projects to draw
> standards from, IMO their email-based review system is certainly not one to
> emulate. It does not provide a clean way to view patch-set diffs, does not
> present a single URL-based history that tracks all review comments, relies
> on the sender to describe what changed between versions, and allows a
> variety of 'Komedians' [1] to add random tags which may or may not be picked
> up by the maintainer who takes the patches in.
>
> Maybe we can add an additional tag that mentions all the people that
> did do reviews of older versions of the patch. Not sure what the tag
> would be, maybe just CC?
>
> It depends on what tags would be processed to obtain statistics on review
> contributions.
> I agree that not all reviewers might be okay with the latest revision, but
> that percentage might be minuscule (zero, really) compared to the normal
> case where the reviewer spent considerable time and effort to provide
> feedback (and an eventual +1) on previous revisions. If converting all +1s
> into 'Reviewed-by's is not feasible in gerrit, or is not considered
> acceptable, then the maintainer could wait a reasonable time for reviewers
> to give a +1 on the final revision before he/she goes ahead with a +2 and
> merges it. While we cannot wait indefinitely for all acks, a comment like
> 'LGTM, will wait for a day for other acks before I go ahead and merge'
> would be appreciated.
>
> Enough bike-shedding from my end, I suppose. :-)
> Ravi
>
> [1] https://lwn.net/Articles/503829/
>
> Niels
>
> [0] http://git.kernel.org/cgit/linux/kernel/git/torvalds/linux.git/tree/Documentation/SubmittingPatches#n552
>
>
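
For illustration of the proposal discussed above (crediting everyone who voted
on any patch set of a change, not just the final one), below is a minimal
sketch of how such a reviewer list could be collected from Gerrit's SSH query
interface. This is not the script Gerrit currently uses to add 'Reviewed-by';
the host, port and change number are made-up examples, and the JSON field
names follow the 'gerrit query' output of Gerrit 2.x, which may differ between
versions.

#!/usr/bin/env python
# Sketch only: list everyone who cast a positive Code-Review vote on *any*
# patch set of a change, as 'Reviewed-by:' trailers.
# Feed it the output of something like (hypothetical change number):
#   ssh -p 29418 review.gluster.org gerrit query --format=JSON \
#       --patch-sets --all-approvals change:15555
import json
import sys


def reviewers_from_query(stream):
    """Map reviewer email -> name for every positive Code-Review vote."""
    seen = {}
    for line in stream:
        try:
            change = json.loads(line)
        except ValueError:
            continue  # skip the trailing stats row or any non-JSON noise
        for patch_set in change.get("patchSets", []):
            for approval in patch_set.get("approvals", []):
                if approval.get("type") != "Code-Review":
                    continue
                if int(approval.get("value", 0)) <= 0:
                    # Only +1/+2 are counted here; crediting -1s as well, as
                    # suggested earlier in the thread, would mean relaxing
                    # this check.
                    continue
                by = approval.get("by", {})
                if by.get("email"):
                    seen[by["email"]] = by.get("name", by["email"])
    return seen


if __name__ == "__main__":
    for email, name in sorted(reviewers_from_query(sys.stdin).items()):
        print("Reviewed-by: %s <%s>" % (name, email))

Whether such trailers should be added to the commit message automatically, or
only be used to derive review statistics, is exactly the policy question being
debated in this thread.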