On 04/14/2013 12:46 PM, James Cuff wrote:
> Finally a well-balanced, hype-free article by respected industry expert
> Jeff Layton:
>
> http://www.admin-magazine.com/HPC/articles/the_cloud_s_role_in_hpc
Just read it. This is a very good article:

"I consider cloud computing a tool or technique for solving research computing problems. Nothing more or less. It's not a panacea, nor should it be ignored. Issues that must be addressed include data movement and security, but it also can save you money and make your traditional HPC resources stretch further. If you examine your workloads and their characteristics carefully, I think you will be surprised by how many can be run easily in the cloud."

I've been making the argument for a while that what is euphemistically called "Big Data" is an applied version of high performance computing -- not in the traditional sense (e.g. MPI, ultra-low-latency interconnects, etc.), but focused on analyzing huge swaths of data and turning "data" with noise into usable, actionable intelligence.

What Jeff outlines in this piece is that the way we implement these huge research computing (RC) computations -- less "traditional HPC," but rapidly becoming the norm -- is also changing. RC and Big Data are closely interrelated, though I've been hearing people start talking about "Huge Data" and "Oh my gosh, it's coming this way -- Data," all of which is apropos to the mechanisms of how and where you perform these calculations.

Kudos to Jeff, and thanks to James for pointing this out! Well worth the read!

-- 
Joseph Landman, Ph.D
Founder and CEO
Scalable Informatics, Inc.
email: [email protected]
web  : http://scalableinformatics.com
       http://scalableinformatics.com/siflash
phone: +1 734 786 8423 x121
fax  : +1 866 888 3112
cell : +1 734 612 4615

_______________________________________________
Beowulf mailing list, [email protected] sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf
