Save the date!

You are invited to the IBM Storage Scale User Days 2026, taking place on March 
10–11, 2026, at the IBM Campus in Ehningen, Germany.
This free, in-person, expert-level workshop brings together customers, IBM 
Storage Scale developers, and IBM partner technical experts to share insights 
and experiences.

Key topics:

  *   What’s new in IBM Storage Scale 6.0.0
  *   IBM Storage Scale System 7.0.0.0
  *   Data for AI use cases and architectures

Event details:

  *   Start: March 10, 2026, at 10:00 AM (local time)
  *   Location: IBM Campus 1, 71139 Ehningen, Germany
  *   Language: Mixed German and English (no translation provided)
  *   Format: Face-to-face workshop

The event will be hosted at our new, spacious IBM facility. More details: 
https://www.schuessler-plan.de/en/projects/ibm-technologycampus-ehningen.html

Registration:
If you would like to join, please contact Markus Merz for the registration 
link: [email protected]. We look forward to welcoming you in Ehningen.

Appendix: Storage Scale 6.0.0

IBM Storage Scale 6.0.0 is a major evolution in IBM’s global data platform, 
purpose-built to meet the demands of AI-driven enterprises. At the heart of 
this release is the Storage Scale System Data Acceleration Tier (DAT), a 
high-performance storage layer based on NVMe over Fabrics (NVMe-oF), designed 
to deliver extreme IOPS and ultra-low latency for real-time AI inferencing 
workloads.

https://www.ibm.com/docs/en/announcements/storage-scale-60

Appendix: Storage Scale System 7.0.0.0

IBM Storage Scale RAID is a software implementation of storage RAID 
technologies within IBM Storage Scale. Using conventional dual-ported disks in 
a JBOD configuration, IBM Storage Scale RAID implements sophisticated data 
placement and error-correction algorithms to deliver high levels of storage 
reliability, availability, and performance. Standard GPFS file systems are 
created from the NSDs defined through IBM Storage Scale RAID.
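To illustrate that last step, here is a minimal NSD stanza file of the kind consumed by the `mmcrnsd` and `mmcrfs` commands. This is a generic (non-RAID-specific) sketch; the device path, NSD name, server names, and file system name are placeholders, not values from this announcement:

```
# example-nsd.stanza -- illustrative only; adjust to your environment
%nsd:
  device=/dev/sdb
  nsd=nsd1
  servers=nsdserver1,nsdserver2
  usage=dataAndMetadata
  failureGroup=1
  pool=system
```

The NSDs would then be defined with `mmcrnsd -F example-nsd.stanza`, and a standard GPFS file system built from them with `mmcrfs fs1 -F example-nsd.stanza`. On Storage Scale System / Storage Scale RAID deployments, vdisk-based NSDs are typically managed through the `mmvdisk` command set instead.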

https://www.ibm.com/docs/en/storage-scale-system/storage-scale-software/7.0.0

Appendix: Accelerating AI inference with IBM Storage Scale

Large language model (LLM) based inference is a dominant workload for modern AI 
applications, and it has become so taxing on all three infrastructure resources 
(compute, network, and storage) that entirely new software capabilities are 
emerging to support resource management and optimization for inference.

https://research.ibm.com/blog/accelerating-ai-inference-with-ibm-storage-scale

-frank-

Frank Kraemer
IBM Senior Technical Specialist
Tower One, Bruesseler Street 1-3, 60327 Frankfurt
[email protected]
Mobile +491713043699
IBM Germany

_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at gpfsug.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss_gpfsug.org
