Hi Mark,

We do this. We have synchronous replication between two sites, with an extended SAN and Ethernet fabric between them. We then use copies=2 for both metadata and data (on most filesets).
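Roughly, the knobs for that look like the following. This is a hedged sketch, not our exact commands; the filesystem name, stanza file, and failure-group assignments are illustrative. The key point is that each site's NSDs get a distinct failure group so the two copies land on opposite sites.

```shell
# Create the filesystem with default (and maximum) replication of 2
# for both metadata (-m/-M) and data (-r/-R). The stanza file assigns
# each site's NSDs a different failureGroup (e.g. 1 and 2) so GPFS
# places the two replicas in different sites.
mmcrfs gpfs1 -F nsd.stanza -m 2 -M 2 -r 2 -R 2

# Or, on an existing filesystem, raise the defaults and then
# restripe to replicate existing blocks:
mmchfs gpfs1 -m 2 -r 2
mmrestripefs gpfs1 -R
```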

We also have a VM quorum node which runs on VMware in a fault-tolerant cluster. We tested split-braining the sites before we went into production. It does work, but we found some interesting failure modes during that testing, so do the same and push it hard.
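For context, designating that third-site VM as a quorum node is a one-liner; the node name below is illustrative:

```shell
# Make the tiebreaker VM a quorum node, so either surviving site
# plus the VM still holds a majority of quorum nodes.
mmchnode --quorum -N quorumvm1

# Verify which nodes carry the quorum designation.
mmlscluster
```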

We multicluster our CES nodes (yes, I know that technically isn't supported), and again have a quorum VM which has DC affinity to the storage cluster's quorum VM, to ensure CES fails over to the same DC.

You may also want to look at readReplicaPolicy=local and InfiniBand fabric numbers, and probably subnets, to ensure your clients prefer the local site for reads. Writes, of course, need enough bandwidth between sites to stay fast.
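Those are cluster config options; something along these lines (the subnet addresses are illustrative, and assume each site's daemon network sits on its own subnet):

```shell
# Prefer the replica reachable over the local fabric for reads,
# rather than round-robining across both sites.
mmchconfig readReplicaPolicy=local

# Declare per-site daemon subnets so clients favour NSD servers
# on their own site's network.
mmchconfig subnets="10.10.1.0 10.10.2.0"
```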

Simon
________________________________________
From: [email protected] 
[[email protected]] on behalf of [email protected] 
[[email protected]]
Sent: 20 July 2016 21:47
To: gpfsug main discussion list
Subject: [gpfsug-discuss] NSD in Two Site scenario

For some reason this concept is a round peg that doesn’t fit the square hole inside my brain.  Can someone please explain the best practice for setting up two sites in the same cluster?  I get that I would likely have two NSD nodes in site 1 and two NSD nodes in site 2.  What I don’t understand are the failure scenarios and what would happen if I lose one node or, worse, a whole site goes down.  Do I solve this by setting Scale replication to 2 for all my files?  I think I get it for a single site; it’s the two-datacenter case, where I typically don’t want two clusters, that doesn’t click.



Mark R. Bush | Solutions Architect
Mobile: 210.237.8415 | [email protected]
Sirius Computer Solutions | www.siriuscom.com
10100 Reunion Place, Suite 500, San Antonio, TX 78216



_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss
