Sunday, Full Day
Title: Sharable and Scalable I/O Solutions for High Performance Computing Applications
Presenters: Larry Schoof, Sandia National Laboratories; Mark Miller, Lawrence Livermore National Laboratory; Mike Folk and Albert Cheng, National Center for Supercomputing Applications
Level: 30% Introductory | 50% Intermediate | 20% Advanced
Two challenges facing HPC applications are the need to improve I/O performance and the need to share complex scientific data and data analysis software. Computational times for HPC applications have decreased by two to three orders of magnitude in recent years, but I/O performance has not kept pace with these impressive gains in raw compute power. Scientific data, and the tools for working with it, have also evolved: from application stovepipes (little or no data interoperability) to a recognition of the value of large-scale integration (full data interoperability), which facilitates sharing data and tools among applications and across a varied and changing landscape of computing environments. In this tutorial we discuss two complementary I/O libraries that address these issues. The first, Hierarchical Data Format Version 5 (HDF5), represents and operates on scientific data as concrete arrays. The second, the Sets and Fields (SAF) data modeling system, represents and operates on scientific data as abstract fields. Building on HDF5 as a foundation, SAF intrinsically encapsulates parallel and scientific constructs to provide greater sharability of data and interoperability of software.