Abhishek,

I will go through the attachments you sent last time and get back to you.
Apologies for missing them earlier.

~Atin

On 04/14/2016 07:13 AM, ABHISHEK PALIWAL wrote:
> Hi,
> 
> I am still facing this problem and am analyzing the system logs as well
> as the gluster logs.
> 
> From the system logs (/var/log/messages) I found that the timestamps of
> the temporary network-failure messages fall very close to the "volume
> status" commands and just before the remove-brick command, which is where
> our brick goes offline.
> 
> Based on the logs below, could this be a possible reason for the brick
> failing to come online? Please suggest.
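> 
> (One rough way to put both logs on the same timeline is to narrow each
> to the minute around the failure; a sketch, with the paths and the 06:52
> window taken from the Setup 1 logs below:)
> 
> # rsyncd name-resolution failures in that minute
> grep "name lookup failed for" /var/log/messages | grep "06:52:"
> # gluster commands issued in the same minute
> grep "06:52:" /var/log/glusterfs/cmd_history.log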
> 
> We have collected logs from two setups.
> 
> Setup 1:
> 
> 
> 002500> cat /var/log/messages | grep -r "name lookup failed for"
> Apr  8 06:52:08 2016 oamhost daemon.info rsyncd[2361]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  8 06:52:09 2016 oamhost daemon.info rsyncd[2639]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  8 06:52:09 2016 oamhost daemon.info rsyncd[2657]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  8 06:52:09 2016 oamhost daemon.info rsyncd[2677]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  8 06:52:10 2016 oamhost daemon.info rsyncd[2878]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
>  
> # cat /var/log/glusterfs/cmd_history.log | grep -r "SUCCESS"
> [2016-04-08 06:52:13.687779]  : volume status : SUCCESS
> [2016-04-08 06:52:13.703139]  : volume status : SUCCESS
> [2016-04-08 06:52:14.041347]  : volume remove-brick c_glusterfs replica 1 10.32.1.144:/opt/lvmdir/c2/brick force : SUCCESS
> [2016-04-08 06:52:14.135737]  : peer detach 10.32.1.144 : SUCCESS
> [2016-04-08 06:52:15.415575]  : peer probe 10.32.1.144 : SUCCESS
> [2016-04-08 06:52:17.941650]  : volume add-brick c_glusterfs replica 2 10.32.1.144:/opt/lvmdir/c2/brick force : SUCCESS
> 
> 
> Setup 2:
> 
> # cat /var/log/glusterfs/cmd_history.log | grep -r "SUCCESS"
> [2016-04-07 23:48:00.544269]  : volume status : SUCCESS
> [2016-04-07 23:48:00.564007]  : volume status : SUCCESS
> [2016-04-07 23:48:01.624280]  : volume status : SUCCESS
> [2016-04-07 23:48:01.642542]  : volume status : SUCCESS
> [2016-04-07 23:48:02.699085]  : volume status : SUCCESS
> [2016-04-07 23:48:02.716108]  : volume status : SUCCESS
> [2016-04-07 23:48:03.782118]  : volume status : SUCCESS
> [2016-04-07 23:48:03.832059]  : volume status : SUCCESS
> [2016-04-07 23:56:22.709733]  : volume set c_glusterfs nfs.disable on : SUCCESS
> [2016-04-07 23:56:24.037270]  : volume start c_glusterfs force : SUCCESS
> [2016-04-07 23:56:34.326782]  : volume status : SUCCESS
> [2016-04-07 23:56:34.352975]  : volume status : SUCCESS
> [2016-04-07 23:56:35.441763]  : volume status : SUCCESS
> [2016-04-07 23:56:35.467474]  : volume status : SUCCESS
> [2016-04-07 23:56:36.544532]  : volume status : SUCCESS
> [2016-04-07 23:56:36.563667]  : volume status : SUCCESS
> [2016-04-07 23:56:37.633660]  : volume status : SUCCESS
> [2016-04-07 23:56:37.653251]  : volume status : SUCCESS
> [2016-04-07 23:56:38.726406]  : volume status : SUCCESS
> [2016-04-07 23:56:38.746097]  : volume status : SUCCESS
> [2016-04-07 23:56:39.805968]  : volume status : SUCCESS
> [2016-04-07 23:56:39.824601]  : volume status : SUCCESS
> [2016-04-07 23:56:40.886599]  : volume status : SUCCESS
> [2016-04-07 23:56:40.905728]  : volume status : SUCCESS
> [2016-04-07 23:56:41.963659]  : volume status : SUCCESS
> [2016-04-07 23:56:41.980006]  : volume status : SUCCESS
> [2016-04-07 23:56:43.037351]  : volume status : SUCCESS
> [2016-04-07 23:56:43.053252]  : volume status : SUCCESS
> [2016-04-07 23:56:44.112666]  : volume status : SUCCESS
> [2016-04-07 23:56:44.129299]  : volume status : SUCCESS
> [2016-04-07 23:56:45.186047]  : volume status : SUCCESS
> [2016-04-07 23:56:45.201741]  : volume status : SUCCESS
> [2016-04-07 23:56:46.259575]  : volume status : SUCCESS
> [2016-04-07 23:56:46.274823]  : volume status : SUCCESS
> [2016-04-07 23:56:47.332110]  : volume status : SUCCESS
> [2016-04-07 23:56:47.347350]  : volume status : SUCCESS
> [2016-04-07 23:56:48.408752]  : volume status : SUCCESS
> [2016-04-07 23:56:48.424115]  : volume status : SUCCESS
> [2016-04-07 23:56:49.481196]  : volume status : SUCCESS
> [2016-04-07 23:56:49.496417]  : volume status : SUCCESS
> [2016-04-07 23:56:50.556596]  : volume status : SUCCESS
> [2016-04-07 23:56:50.572201]  : volume status : SUCCESS
> [2016-04-07 23:56:51.645649]  : volume status : SUCCESS
> [2016-04-07 23:56:51.660980]  : volume status : SUCCESS
> [2016-04-07 23:56:52.718777]  : volume status : SUCCESS
> [2016-04-07 23:56:52.734715]  : volume status : SUCCESS
> [2016-04-07 23:56:53.798818]  : volume status : SUCCESS
> [2016-04-07 23:56:53.814556]  : volume status : SUCCESS
> [2016-04-07 23:56:54.873792]  : volume status : SUCCESS
> [2016-04-07 23:56:54.889273]  : volume status : SUCCESS
> [2016-04-07 23:56:55.949193]  : volume status : SUCCESS
> [2016-04-07 23:56:55.964678]  : volume status : SUCCESS
> [2016-04-07 23:56:57.026010]  : volume status : SUCCESS
> [2016-04-07 23:56:57.042384]  : volume status : SUCCESS
> [2016-04-07 23:56:58.104969]  : volume status : SUCCESS
> [2016-04-07 23:56:58.121515]  : volume status : SUCCESS
> [2016-04-07 23:56:59.177334]  : volume status : SUCCESS
> [2016-04-07 23:56:59.193038]  : volume status : SUCCESS
> [2016-04-07 23:57:00.249726]  : volume status : SUCCESS
> [2016-04-07 23:57:00.264757]  : volume status : SUCCESS
> [2016-04-07 23:57:01.321589]  : volume status : SUCCESS
> [2016-04-07 23:57:01.337437]  : volume status : SUCCESS
> [2016-04-07 23:57:02.394071]  : volume status : SUCCESS
> [2016-04-07 23:57:02.409723]  : volume status : SUCCESS
> [2016-04-07 23:57:03.465013]  : volume status : SUCCESS
> [2016-04-07 23:57:03.479919]  : volume status : SUCCESS
> [2016-04-07 23:57:04.535535]  : volume status : SUCCESS
> [2016-04-07 23:57:04.550799]  : volume status : SUCCESS
> [2016-04-07 23:57:05.607067]  : volume status : SUCCESS
> [2016-04-07 23:57:05.622387]  : volume status : SUCCESS
> [2016-04-07 23:57:06.677708]  : volume status : SUCCESS
> [2016-04-07 23:57:06.692804]  : volume status : SUCCESS
> [2016-04-07 23:57:07.749353]  : volume status : SUCCESS
> [2016-04-07 23:57:07.764955]  : volume status : SUCCESS
> [2016-04-07 23:57:08.821939]  : volume status : SUCCESS
> [2016-04-07 23:57:08.838049]  : volume status : SUCCESS
> [2016-04-07 23:57:09.892807]  : volume status : SUCCESS
> [2016-04-07 23:57:09.907671]  : volume status : SUCCESS
> [2016-04-07 23:57:10.962583]  : volume status : SUCCESS
> [2016-04-07 23:57:10.977470]  : volume status : SUCCESS
> [2016-04-07 23:57:12.032784]  : volume status : SUCCESS
> [2016-04-07 23:57:12.048140]  : volume status : SUCCESS
> [2016-04-07 23:57:13.102801]  : volume status : SUCCESS
> [2016-04-07 23:57:13.117916]  : volume status : SUCCESS
> [2016-04-07 23:57:14.174420]  : volume status : SUCCESS
> [2016-04-07 23:57:14.190000]  : volume status : SUCCESS
> [2016-04-07 23:57:15.245023]  : volume status : SUCCESS
> [2016-04-07 23:57:15.260118]  : volume status : SUCCESS
> [2016-04-07 23:57:16.316116]  : volume status : SUCCESS
> [2016-04-07 23:57:16.331088]  : volume status : SUCCESS
> [2016-04-07 23:57:17.385876]  : volume status : SUCCESS
> [2016-04-07 23:57:17.401176]  : volume status : SUCCESS
> [2016-04-07 23:57:18.456077]  : volume status : SUCCESS
> [2016-04-07 23:57:18.471002]  : volume status : SUCCESS
> [2016-04-07 23:57:19.526370]  : volume status : SUCCESS
> [2016-04-07 23:57:19.541316]  : volume status : SUCCESS
> [2016-04-07 23:57:20.599767]  : volume status : SUCCESS
> [2016-04-07 23:57:20.614622]  : volume status : SUCCESS
> [2016-04-07 23:57:21.670278]  : volume status : SUCCESS
> [2016-04-07 23:57:21.686137]  : volume status : SUCCESS
> [2016-04-07 23:57:22.741620]  : volume status : SUCCESS
> [2016-04-07 23:57:22.757029]  : volume status : SUCCESS
> [2016-04-07 23:57:23.814023]  : volume status : SUCCESS
> [2016-04-07 23:57:23.828713]  : volume status : SUCCESS
> [2016-04-07 23:57:24.885976]  : volume status : SUCCESS
> [2016-04-07 23:57:24.900984]  : volume status : SUCCESS
> [2016-04-07 23:57:25.956532]  : volume status : SUCCESS
> [2016-04-07 23:57:25.971524]  : volume status : SUCCESS
> [2016-04-07 23:57:27.027527]  : volume status : SUCCESS
> [2016-04-07 23:57:27.043078]  : volume status : SUCCESS
> [2016-04-07 23:57:28.098295]  : volume status : SUCCESS
> [2016-04-07 23:57:28.113567]  : volume status : SUCCESS
> [2016-04-07 23:57:29.176173]  : volume status : SUCCESS
> [2016-04-07 23:57:29.191290]  : volume status : SUCCESS
> [2016-04-07 23:57:30.246027]  : volume status : SUCCESS
> [2016-04-07 23:57:30.261358]  : volume status : SUCCESS
> [2016-04-07 23:57:31.319007]  : volume status : SUCCESS
> [2016-04-07 23:57:31.333841]  : volume status : SUCCESS
> [2016-04-07 23:57:32.388578]  : volume status : SUCCESS
> [2016-04-07 23:57:32.405997]  : volume status : SUCCESS
> [2016-04-07 23:57:33.460740]  : volume status : SUCCESS
> [2016-04-07 23:57:33.475143]  : volume status : SUCCESS
> [2016-04-07 23:57:34.529614]  : volume status : SUCCESS
> [2016-04-07 23:57:34.544573]  : volume status : SUCCESS
> [2016-04-07 23:57:35.601543]  : volume status : SUCCESS
> [2016-04-07 23:57:35.616131]  : volume status : SUCCESS
> [2016-04-07 23:57:36.671247]  : volume status : SUCCESS
> [2016-04-07 23:57:36.686742]  : volume status : SUCCESS
> [2016-04-07 23:57:37.742293]  : volume status : SUCCESS
> [2016-04-07 23:57:37.757473]  : volume status : SUCCESS
> [2016-04-07 23:57:38.815902]  : volume status : SUCCESS
> [2016-04-07 23:57:38.832081]  : volume status : SUCCESS
> [2016-04-07 23:57:39.887027]  : volume status : SUCCESS
> [2016-04-07 23:57:39.902269]  : volume status : SUCCESS
> [2016-04-07 23:57:40.957761]  : volume status : SUCCESS
> [2016-04-07 23:57:40.973322]  : volume status : SUCCESS
> [2016-04-07 23:57:42.028674]  : volume status : SUCCESS
> [2016-04-07 23:57:42.045765]  : volume status : SUCCESS
> [2016-04-07 23:57:43.104397]  : volume status : SUCCESS
> [2016-04-07 23:57:43.121970]  : volume status : SUCCESS
> [2016-04-07 23:57:44.183029]  : volume status : SUCCESS
> [2016-04-07 23:57:44.201168]  : volume status : SUCCESS
> [2016-04-07 23:57:45.256772]  : volume status : SUCCESS
> [2016-04-07 23:57:45.274251]  : volume status : SUCCESS
> [2016-04-07 23:57:46.331280]  : volume status : SUCCESS
> [2016-04-07 23:57:46.349052]  : volume status : SUCCESS
> [2016-04-07 23:57:47.408266]  : volume status : SUCCESS
> [2016-04-07 23:57:47.426468]  : volume status : SUCCESS
> [2016-04-07 23:57:48.483301]  : volume status : SUCCESS
> [2016-04-07 23:57:48.500622]  : volume status : SUCCESS
> [2016-04-07 23:57:49.556927]  : volume status : SUCCESS
> [2016-04-07 23:57:49.573825]  : volume status : SUCCESS
> [2016-04-07 23:57:50.630662]  : volume status : SUCCESS
> [2016-04-07 23:57:50.647136]  : volume status : SUCCESS
> [2016-04-07 23:57:51.705079]  : volume status : SUCCESS
> [2016-04-07 23:57:51.722046]  : volume status : SUCCESS
> [2016-04-07 23:57:52.777994]  : volume status : SUCCESS
> [2016-04-07 23:57:52.794681]  : volume status : SUCCESS
> [2016-04-07 23:57:53.853193]  : volume status : SUCCESS
> [2016-04-07 23:57:53.870088]  : volume status : SUCCESS
> [2016-04-07 23:57:54.928807]  : volume status : SUCCESS
> [2016-04-07 23:57:54.946332]  : volume status : SUCCESS
> [2016-04-07 23:57:56.003859]  : volume status : SUCCESS
> [2016-04-07 23:57:56.020624]  : volume status : SUCCESS
> [2016-04-07 23:57:57.076639]  : volume status : SUCCESS
> [2016-04-07 23:57:57.093430]  : volume status : SUCCESS
> [2016-04-07 23:57:58.148805]  : volume status : SUCCESS
> [2016-04-07 23:57:58.167321]  : volume status : SUCCESS
> [2016-04-07 23:57:59.225112]  : volume status : SUCCESS
> [2016-04-07 23:57:59.242433]  : volume status : SUCCESS
> [2016-04-07 23:58:00.300050]  : volume status : SUCCESS
> [2016-04-07 23:58:00.317123]  : volume status : SUCCESS
> [2016-04-07 23:58:01.373615]  : volume status : SUCCESS
> [2016-04-07 23:58:01.390987]  : volume status : SUCCESS
> [2016-04-07 23:58:02.448198]  : volume status : SUCCESS
> [2016-04-07 23:58:02.464742]  : volume status : SUCCESS
> [2016-04-07 23:58:03.522714]  : volume status : SUCCESS
> [2016-04-07 23:58:03.539341]  : volume status : SUCCESS
> [2016-04-07 23:58:04.595848]  : volume status : SUCCESS
> [2016-04-07 23:58:04.613142]  : volume status : SUCCESS
> [2016-04-07 23:58:05.670772]  : volume status : SUCCESS
> [2016-04-07 23:58:05.687886]  : volume status : SUCCESS
> [2016-04-07 23:58:06.744241]  : volume status : SUCCESS
> [2016-04-07 23:58:06.761412]  : volume status : SUCCESS
> [2016-04-07 23:58:07.818249]  : volume status : SUCCESS
> [2016-04-07 23:58:07.835174]  : volume status : SUCCESS
> [2016-04-07 23:58:08.891052]  : volume status : SUCCESS
> [2016-04-07 23:58:08.908118]  : volume status : SUCCESS
> [2016-04-07 23:58:09.963741]  : volume status : SUCCESS
> [2016-04-07 23:58:09.980837]  : volume status : SUCCESS
> [2016-04-07 23:58:11.036566]  : volume status : SUCCESS
> [2016-04-07 23:58:11.053823]  : volume status : SUCCESS
> [2016-04-07 23:58:12.111426]  : volume status : SUCCESS
> [2016-04-07 23:58:12.128725]  : volume status : SUCCESS
> [2016-04-07 23:58:13.184849]  : volume status : SUCCESS
> [2016-04-07 23:58:13.201892]  : volume status : SUCCESS
> [2016-04-07 23:58:14.257657]  : volume status : SUCCESS
> [2016-04-07 23:58:14.274758]  : volume status : SUCCESS
> [2016-04-07 23:58:15.330533]  : volume status : SUCCESS
> [2016-04-07 23:58:15.347758]  : volume status : SUCCESS
> [2016-04-07 23:58:16.405524]  : volume status : SUCCESS
> [2016-04-07 23:58:16.422845]  : volume status : SUCCESS
> [2016-04-07 23:58:17.480674]  : volume status : SUCCESS
> [2016-04-07 23:58:17.498446]  : volume status : SUCCESS
> [2016-04-07 23:58:18.555364]  : volume status : SUCCESS
> [2016-04-07 23:58:18.572318]  : volume status : SUCCESS
> [2016-04-07 23:58:19.629799]  : volume status : SUCCESS
> [2016-04-07 23:58:19.648091]  : volume status : SUCCESS
> [2016-04-07 23:58:20.710409]  : volume status : SUCCESS
> [2016-04-07 23:58:20.727520]  : volume status : SUCCESS
> [2016-04-07 23:58:21.785556]  : volume status : SUCCESS
> [2016-04-07 23:58:21.802865]  : volume status : SUCCESS
> [2016-04-07 23:58:22.018119]  : volume remove-brick c_glusterfs replica 1 10.32.1.144:/opt/lvmdir/c2/brick force : SUCCESS
> [2016-04-07 23:58:22.115762]  : peer detach 10.32.1.144 : SUCCESS
> [2016-04-07 23:58:23.426499]  : peer probe 10.32.1.144 : SUCCESS
> [2016-04-07 23:58:26.023473]  : volume add-brick c_glusterfs replica 2 10.32.1.144:/opt/lvmdir/c2/brick force : SUCCESS
> [2016-04-08 08:54:24.788298]  : volume status : SUCCESS
> [2016-04-08 08:54:24.804415]  : volume status : SUCCESS
> #
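>  
> (For context, the remove-brick/detach/probe/add-brick burst just before
> the 23:58:26 add-brick above corresponds to this command sequence, as
> reconstructed from cmd_history.log:)
> 
> gluster volume remove-brick c_glusterfs replica 1 10.32.1.144:/opt/lvmdir/c2/brick force
> gluster peer detach 10.32.1.144
> gluster peer probe 10.32.1.144
> gluster volume add-brick c_glusterfs replica 2 10.32.1.144:/opt/lvmdir/c2/brick force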
>  
>  
> 002500> cat /var/log/messages | grep -r "name lookup failed for"
> Apr  7 23:47:55 2016 oamhost daemon.info rsyncd[2428]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:56 2016 oamhost daemon.info rsyncd[2501]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:56 2016 oamhost daemon.info rsyncd[2517]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:56 2016 oamhost daemon.info rsyncd[2550]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:56 2016 oamhost daemon.info rsyncd[2665]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:56 2016 oamhost daemon.info rsyncd[2715]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:57 2016 oamhost daemon.info rsyncd[2740]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:57 2016 oamhost daemon.info rsyncd[2772]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:57 2016 oamhost daemon.info rsyncd[2797]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:57 2016 oamhost daemon.info rsyncd[2822]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:47:57 2016 oamhost daemon.info rsyncd[2837]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:01 2016 oamhost daemon.info rsyncd[2844]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:01 2016 oamhost daemon.info rsyncd[2851]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:01 2016 oamhost daemon.info rsyncd[2858]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:02 2016 oamhost daemon.info rsyncd[2881]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:02 2016 oamhost daemon.info rsyncd[2901]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:02 2016 oamhost daemon.info rsyncd[2912]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:02 2016 oamhost daemon.info rsyncd[2931]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:02 2016 oamhost daemon.info rsyncd[2938]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:02 2016 oamhost daemon.info rsyncd[2951]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:48:03 2016 oamhost daemon.info rsyncd[3032]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:33 2016 oamhost daemon.info rsyncd[10587]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:34 2016 oamhost daemon.info rsyncd[10713]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:34 2016 oamhost daemon.info rsyncd[10719]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:34 2016 oamhost daemon.info rsyncd[10726]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:35 2016 oamhost daemon.info rsyncd[10733]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:35 2016 oamhost daemon.info rsyncd[10749]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:35 2016 oamhost daemon.info rsyncd[10777]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:35 2016 oamhost daemon.info rsyncd[10805]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:35 2016 oamhost daemon.info rsyncd[10812]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:35 2016 oamhost daemon.info rsyncd[10822]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:36 2016 oamhost daemon.info rsyncd[10830]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:36 2016 oamhost daemon.info rsyncd[10869]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:36 2016 oamhost daemon.info rsyncd[10875]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:36 2016 oamhost daemon.info rsyncd[10892]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:36 2016 oamhost daemon.info rsyncd[10900]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:36 2016 oamhost daemon.info rsyncd[10906]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:36 2016 oamhost daemon.info rsyncd[10914]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:37 2016 oamhost daemon.info rsyncd[10920]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:37 2016 oamhost daemon.info rsyncd[10927]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:37 2016 oamhost daemon.info rsyncd[10934]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> Apr  7 23:56:38 2016 oamhost daemon.info rsyncd[10980]: name lookup failed for 10.32.0.48: Temporary failure in name resolution
> 002500>
> 
> Regards,
> Abhishek
> 
> 
> 
> On Tue, Apr 5, 2016 at 3:35 PM, ABHISHEK PALIWAL
> <abhishpali...@gmail.com> wrote:
> 
> 
> 
>     On Tue, Apr 5, 2016 at 2:22 PM, Atin Mukherjee
>     <amukh...@redhat.com> wrote:
> 
> 
> 
>         On 04/05/2016 01:04 PM, ABHISHEK PALIWAL wrote:
>         > Hi Team,
>         >
>         > We are using Gluster 3.7.6 and facing one problem in which a
>         > brick is not coming online after restarting the board.
>         >
>         > To understand our setup, please look the following steps:
>         > 1. We have two boards, A and B, on which a Gluster volume runs
>         > in replicated mode with one brick on each board.
>         > 2. The Gluster mount point is present on Board A and is shared
>         > among a number of processes.
>         > 3. Up to this point the volume is in sync and everything works
>         > fine.
>         > 4. Now we have a test case in which we stop glusterd, reboot
>         > Board B, and, when the board comes up, start glusterd on it
>         > again (sketched just below this list).
>         > 5. We repeated Step 4 multiple times to check the reliability
>         > of the system.
>         > 6. After Step 4, sometimes the system comes back in a working
>         > state (i.e. in sync), but sometimes the brick of Board B is
>         > listed in "gluster volume status" yet does not come online even
>         > after waiting for more than a minute.
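>         >
>         > (Roughly, the Step 4 cycle on Board B, sketched; the service
>         > commands are placeholders for whatever init system our platform
>         > uses:)
>         >
>         > /etc/init.d/glusterd stop      # or: systemctl stop glusterd
>         > reboot
>         > # ...once Board B is back up:
>         > /etc/init.d/glusterd start     # or: systemctl start glusterd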
>         As I mentioned in another email thread, until and unless the log
>         shows evidence that there was a reboot, nothing can be concluded.
>         The last log you shared with us a few days back didn't give any
>         indication that the brick process wasn't running.
> 
>     How can we identify from the brick logs that the brick process is
>     running?
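> 
>     (Aside: in case the brick log alone is ambiguous, a couple of
>     process-level checks might settle it; the pid-file path below is the
>     usual glusterd layout and may differ on our build:)
> 
>     # Online column and PID of each brick
>     gluster volume status c_glusterfs
>     # glusterd's recorded brick pid, and whether that process exists
>     cat /var/lib/glusterd/vols/c_glusterfs/run/*.pid
>     ps -p $(cat /var/lib/glusterd/vols/c_glusterfs/run/*.pid)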
> 
>         > 7. While Step 4 is executing, some processes on Board A start
>         > accessing files from the Gluster mount point at the same time.
>         >
>         > As a solution to bring this brick online, we found some
>         > existing threads on the gluster mailing list suggesting the use
>         > of "gluster volume start <vol_name> force" to move the brick
>         > from 'offline' to 'online'.
>         >
>         > If the "gluster volume start <vol_name> force" command kills
>         > the existing brick process and starts a new one, what will
>         > happen to other processes that are accessing the same volume at
>         > the moment the brick process is killed internally by this
>         > command? Will it cause any failure in those processes?
>         This is not true; volume start force will start the brick
>         processes only if they are not running. Running brick processes
>         will not be interrupted.
> 
>     We have tried this and checked the PID of the brick process before
>     and after the force start; the PID changed after the force start.
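> 
>     (Roughly what we checked, with the volume name from above:)
> 
>     gluster volume status c_glusterfs      # note the brick PID
>     gluster volume start c_glusterfs force
>     gluster volume status c_glusterfs      # the PID differs afterwards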
> 
>     Please find the logs at the time of failure attached once again,
>     taken with log-level=debug.
> 
>     If you can point to the exact line in the brick log file from which
>     you can tell that the brick process is running, please give me the
>     line number in that file.
> 
>     002500 - Board B, whose brick is offline
>     00300 - Board A logs
> 
>         >
>         > *Question : What could be contributing to brick offline?*
>         >
>         >
>         > --
>         >
>         > Regards
>         > Abhishek Paliwal
>         >
>         >
> 
> 
> -- 
> Regards
> Abhishek Paliwal
_______________________________________________
Gluster-devel mailing list
Gluster-devel@gluster.org
http://www.gluster.org/mailman/listinfo/gluster-devel
