I am interested in learning about this too, so please include me when sending a direct mail. 

Thanks,
Yugi

> On Apr 26, 2018, at 10:51 AM, Oesterlin, Robert <[email protected]> 
> wrote:
> 
> Hi Lohit, Nathan
>  
> Would you be willing to share some more details about your setup? We are just 
> getting started here and I would like to hear about what your configuration 
> looks like. Direct email to me is fine, thanks.
>  
> Bob Oesterlin
> Sr Principal Storage Engineer, Nuance
>  
>  
> From: <[email protected]> on behalf of 
> "[email protected]" <[email protected]>
> Reply-To: gpfsug main discussion list <[email protected]>
> Date: Thursday, April 26, 2018 at 9:45 AM
> To: gpfsug main discussion list <[email protected]>
> Subject: [EXTERNAL] Re: [gpfsug-discuss] Singularity + GPFS
>  
> We do run Singularity + GPFS on our production HPC clusters.
> Most of the time things run without issues.
>  
> However, I do see a significant performance loss when running some 
> applications in Singularity containers with GPFS.
>  
> So far, the applications that show severe performance issues with 
> Singularity on GPFS (deep learning applications) seem to be affected 
> because of “mmap IO”.
> When I run the same applications on bare metal, I see a huge difference 
> in GPFS IO compared to running them in Singularity containers.
> I have yet to raise a PMR with IBM about this.
> I have not seen performance degradation for any other kind of IO, but I 
> am not sure.
> 
> Regards,
> Lohit
> 
> On Apr 26, 2018, 10:35 AM -0400, Nathan Harper <[email protected]> 
> wrote:
> 
> We are running on a test system at the moment, and haven't run into any 
> issues yet, but so far it's only been 'hello world' and running FIO.
>  
> I'm interested to hear about experience with MPI-IO within Singularity.
>  
> On 26 April 2018 at 15:20, Oesterlin, Robert <[email protected]> 
> wrote:
> Anyone (including IBM) doing any work in this area? I would appreciate 
> hearing from you.
>  
> Bob Oesterlin
> Sr Principal Storage Engineer, Nuance
>  
> 
> _______________________________________________
> gpfsug-discuss mailing list
> gpfsug-discuss at spectrumscale.org
> http://gpfsug.org/mailman/listinfo/gpfsug-discuss
> 
> 
> 
>  
> --
> Nathan Harper // IT Systems Lead
>  
> 
> 
> e: [email protected]   t: 0117 906 1104  m:  0787 551 0891  w: 
> www.cfms.org.uk  
> CFMS Services Ltd // Bristol & Bath Science Park // Dirac Crescent // 
> Emersons Green // Bristol // BS16 7FR 
>  
> CFMS Services Ltd is registered in England and Wales No 05742022 - a 
> subsidiary of CFMS Ltd 
> CFMS Services Ltd registered office // 43 Queens Square // Bristol // BS1 4QP
