On Thu, 21 Aug 2008, Brett Monroe wrote:
>
> Hmm, that's an interesting config...kind of an expensive but fast
> JBOD, then?  I'm pretty sure MPxIO is not responsible for deciding
> which path is active and which is passive.  My understanding is that
> it's up to the 2540 config.  Is CAM reporting that all volumes are on
> their preferred paths?  I assume you have been looking at iostat with
> the -Y flag?  You should only see I/O moving down the active
> controller port

I don't see anything in CAM regarding paths or path configuration.
All of the volumes are reported as 'optimal', and the per-disk
loading appears to be quite uniform.  I was not aware of the iostat
-Y flag until now.
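
(For reference, the dump further down came from an invocation along
these lines; the -x flag is my guess based on the 'extended device
statistics' header, and -Y adds the per-target/per-path rows:)

   % iostat -xY 30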

> so I would expect that if you have 12 volumes, with 6 assigned to one
> port and 6 assigned to the other, and they are on their preferred
> paths, you should see an equal load on both.  You would see even
> better throughput if you had 4 connections to the array (the dual
> paths to the individual controllers would be load balanced,
> symmetrically)...though I'm not sure how much that would give you
> with your config (one drive/volume).

I have been meaning to ask on this list whether MPxIO will do anything
useful (from a load-sharing and reliability standpoint) if I install
another Fibre Channel card so that I have four links to the array.
Will it handle that OK?  I don't really expect to see much more
performance, but it would eliminate a total failure due to an FC card
failure.
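
If I do add the card, I assume a quick check like this would confirm
that MPxIO picked up all four paths:

   % mpathadm list lu

with each LU then reporting a Total Path Count of 4 instead of 2.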

Here is an 'iostat -Y' dump over a 30-second interval while a 'zfs
scrub' is running.  The drives serviced by fp1 show a much larger
service time than the drives serviced by fp0:

                    extended device statistics
device         r/s    w/s   kr/s   kw/s wait actv  svc_t  %w  %b
sd0            0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd1            0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd2            0.0    0.0    0.0    0.3  0.0  0.0    6.3   0   0
sd10         379.9    1.0 47909.5    4.2  0.0  2.8    7.3   0  61
sd10.t1        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd10.t1.fp1    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd10.t2      379.9    0.8 47909.5    4.2  0.0  0.0    0.0   0   0
sd10.t2.fp0  379.9    0.8 47909.5    4.2  0.0  0.0    0.0   0   0
sd11         379.0    0.3 47861.4    0.1  0.0  8.1   21.3   0  96
sd11.t1      379.0    0.1 47861.4    0.1  0.0  0.0    0.0   0   0
sd11.t1.fp1  379.0    0.1 47861.4    0.1  0.0  0.0    0.0   0   0
sd11.t2        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd11.t2.fp0    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd12         379.4    0.1 47879.2    0.0  0.0  8.0   21.0   0  95
sd12.t1      379.4    0.0 47879.2    0.0  0.0  0.0    0.0   0   0
sd12.t1.fp1  379.4    0.0 47879.2    0.0  0.0  0.0    0.0   0   0
sd12.t2        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd12.t2.fp0    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd13         380.4    0.7 47909.0    2.7  0.0  2.8    7.4   0  62
sd13.t1        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd13.t1.fp1    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd13.t2      380.4    0.6 47909.0    2.7  0.0  0.0    0.0   0   0
sd13.t2.fp0  380.4    0.6 47909.0    2.7  0.0  0.0    0.0   0   0
sd14         380.7    0.7 47863.4    4.4  0.0  8.0   20.9   0  95
sd14.t1      380.7    0.6 47863.4    4.4  0.0  0.0    0.0   0   0
sd14.t1.fp1  380.7    0.6 47863.4    4.4  0.0  0.0    0.0   0   0
sd14.t2        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd14.t2.fp0    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd15         379.7    0.1 47913.5    0.0  0.0  2.7    7.1   0  59
sd15.t1        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd15.t1.fp1    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd15.t2      379.7    0.0 47913.5    0.0  0.0  0.0    0.0   0   0
sd15.t2.fp0  379.7    0.0 47913.5    0.0  0.0  0.0    0.0   0   0
sd16         380.4    0.7 47906.8    4.2  0.0  7.9   20.9   0  96
sd16.t1      380.4    0.5 47906.8    4.2  0.0  0.0    0.0   0   0
sd16.t1.fp1  380.4    0.5 47906.8    4.2  0.0  0.0    0.0   0   0
sd16.t2        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd16.t2.fp0    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd17         379.1    0.9 47863.4    4.4  0.0  2.7    7.2   0  61
sd17.t1        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd17.t1.fp1    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd17.t2      379.1    0.8 47863.4    4.4  0.0  0.0    0.0   0   0
sd17.t2.fp0  379.1    0.8 47863.4    4.4  0.0  0.0    0.0   0   0
sd18         379.3    0.7 47885.1    2.7  0.0  7.9   20.9   0  95
sd18.t1      379.3    0.5 47885.1    2.7  0.0  0.0    0.0   0   0
sd18.t1.fp1  379.3    0.5 47885.1    2.7  0.0  0.0    0.0   0   0
sd18.t2        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd18.t2.fp0    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd19         378.9    0.1 47940.8    0.0  0.0  8.0   21.1   0  95
sd19.t1      378.9    0.0 47940.8    0.0  0.0  0.0    0.0   0   0
sd19.t1.fp1  378.9    0.0 47940.8    0.0  0.0  0.0    0.0   0   0
sd19.t2        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd19.t2.fp0    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd20         381.7    0.1 47964.5    0.0  0.0  2.9    7.5   0  62
sd20.t1        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd20.t1.fp1    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd20.t2      381.7    0.0 47964.5    0.0  0.0  0.0    0.0   0   0
sd20.t2.fp0  381.7    0.0 47964.5    0.0  0.0  0.0    0.0   0   0
sd21         380.2    0.3 47860.4    0.1  0.0  8.2   21.6   0  96
sd21.t1      380.2    0.1 47860.4    0.1  0.0  0.0    0.0   0   0
sd21.t1.fp1  380.2    0.1 47860.4    0.1  0.0  0.0    0.0   0   0
sd21.t2        0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
sd21.t2.fp0    0.0    0.0    0.0    0.0  0.0  0.0    0.0   0   0
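
As a rough cross-check (my own ad-hoc one-liner, assuming the dump
above is saved in a file named iostat.out), totaling the read
throughput per HBA port:

   % awk '$1 ~ /\.fp[01]$/ { n = split($1, a, "."); kr[a[n]] += $4 }
          END { for (p in kr) printf "%s: %.1f kr/s\n", p, kr[p] }' iostat.out

In this sample fp0 ends up serving five of the twelve drives and fp1
the other seven, so part of the svc_t gap may simply be that uneven
split.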

And here is the current load-balance setting and per-path access
state reported by MPxIO:

=== /dev/rdsk/c4t600A0B800039C9B500000AB447B4595Fd0s2 ===
         Current Load Balance:  round-robin
                 Access State:  active
                 Access State:  standby
=== /dev/rdsk/c4t600A0B80003A8A0B0000097347B457D4d0s2 ===
         Current Load Balance:  round-robin
                 Access State:  standby
                 Access State:  active
=== /dev/rdsk/c4t600A0B800039C9B500000AB047B457ADd0s2 ===
         Current Load Balance:  round-robin
                 Access State:  standby
                 Access State:  active
=== /dev/rdsk/c4t600A0B800039C9B500000AAC47B45739d0s2 ===
         Current Load Balance:  round-robin
                 Access State:  standby
                 Access State:  active
=== /dev/rdsk/c4t600A0B80003A8A0B0000096E47B456DAd0s2 ===
         Current Load Balance:  round-robin
                 Access State:  active
                 Access State:  standby
=== /dev/rdsk/c4t600A0B800039C9B500000AA847B45605d0s2 ===
         Current Load Balance:  round-robin
                 Access State:  standby
                 Access State:  active
=== /dev/rdsk/c4t600A0B80003A8A0B0000096A47B4559Ed0s2 ===
         Current Load Balance:  round-robin
                 Access State:  active
                 Access State:  standby
=== /dev/rdsk/c4t600A0B800039C9B500000AA447B4544Fd0s2 ===
         Current Load Balance:  round-robin
                 Access State:  standby
                 Access State:  active
=== /dev/rdsk/c4t600A0B80003A8A0B0000096647B453CEd0s2 ===
         Current Load Balance:  round-robin
                 Access State:  active
                 Access State:  standby
=== /dev/rdsk/c4t600A0B800039C9B500000AA047B4529Bd0s2 ===
         Current Load Balance:  round-robin
                 Access State:  standby
                 Access State:  active
=== /dev/rdsk/c4t600A0B800039C9B500000A9C47B4522Dd0s2 ===
         Current Load Balance:  round-robin
                 Access State:  standby
                 Access State:  active
=== /dev/rdsk/c4t600A0B80003A8A0B0000096147B451BEd0s2 ===
         Current Load Balance:  round-robin
                 Access State:  active
                 Access State:  standby
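
If anyone wants the full per-path detail behind one of these entries,
I believe this is the query (using the first LU from the list above):

   % mpathadm show lu /dev/rdsk/c4t600A0B800039C9B500000AB447B4595Fd0s2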

Bob
======================================
Bob Friesenhahn
[EMAIL PROTECTED], http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,    http://www.GraphicsMagick.org/
