The general recommendation is to target around 100 PGs per OSD. Have you tried the https://ceph.com/pgcalc/ tool?
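For a rough idea of what pgcalc produces, the arithmetic behind that 100 PGs/OSD target can be sketched like this (assumptions: replicated pools, a per-pool data percentage you estimate yourself, and rounding up to a power of two; the function name and parameters here are illustrative, not part of any Ceph tool):

```python
def suggested_pg_count(num_osds: int, pool_data_percent: float,
                       replica_size: int = 3,
                       target_pgs_per_osd: int = 100) -> int:
    """Suggest a per-pool pg_num:
    (OSDs * target PGs/OSD * share of data) / replica size,
    rounded up to the next power of two (pgcalc uses a similar
    power-of-two rounding rule)."""
    raw = num_osds * target_pgs_per_osd * pool_data_percent / replica_size
    pg = 1
    while pg < raw:
        pg *= 2
    return pg

# Example for the cluster size in the question: 500 OSDs, one pool
# holding ~70% of the data, 3x replication.
print(suggested_pg_count(500, 0.70, 3))  # -> 16384
```

The authoritative numbers should still come from https://ceph.com/pgcalc/, which also accounts for multiple pools sharing the same OSDs.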
On Wed, 4 Apr 2018 at 21:38, Osama Hasebou <[email protected]> wrote:
> Hi Everyone,
>
> I would like to know what kind of setup the Ceph community has been using
> for their OpenStack Ceph configuration when it comes to the number of
> pools and OSDs and their PGs.
>
> The Ceph documentation briefly covers this for small cluster sizes, and I
> would like to know from your experience how many PGs you have actually
> created for your OpenStack pools, for a Ceph cluster in the 1-2 PB
> capacity range or with 400-600 OSDs, that performs well without issues.
>
> Hope to hear from you!
>
> Thanks.
>
> Regards,
> Ossi
>
> _______________________________________________
> ceph-users mailing list
> [email protected]
> http://lists.ceph.com/listinfo.cgi/ceph-users-ceph.com
