Multifacets of lossy compression for scientific data in the Joint-Laboratory of Extreme Scale Computing
The Joint Laboratory on Extreme-Scale Computing (JLESC) was initiated at the same time lossy compression for scientific data became an important topic for the scientific …
Optimizing error-bounded lossy compression for scientific data by dynamic spline interpolation
Today's scientific simulations are producing vast volumes of data that cannot be stored and transferred efficiently because of limited storage capacity, parallel I/O bandwidth, and …
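The technique named in the title, predicting values by interpolation and quantizing only the residuals under a strict error bound, can be illustrated with a toy one-dimensional sketch. This is not the paper's multidimensional dynamic spline scheme: the linear predictor, function names, and quantization layout below are illustrative assumptions only.

```python
import numpy as np

def compress_1d(data, err_bound):
    """Toy interpolation-based error-bounded compression (1D sketch).

    Even-indexed samples are kept verbatim; each odd-indexed sample is
    predicted by linear interpolation of its two stored neighbors, and
    the residual is quantized to multiples of 2*err_bound, which keeps
    every reconstructed value within err_bound of the original.
    """
    recon = data.astype(np.float64).copy()
    codes = np.zeros(len(data), dtype=np.int64)
    for i in range(1, len(data) - 1, 2):
        pred = 0.5 * (recon[i - 1] + recon[i + 1])
        q = int(round((data[i] - pred) / (2 * err_bound)))
        codes[i] = q                         # small integers, entropy-codable
        recon[i] = pred + q * 2 * err_bound  # value the decompressor sees
    return codes, recon
```

Interpolation-based compressors in this family apply such prediction hierarchically across multiple dimensions and select the interpolation type adaptively, which is where the "dynamic spline" of the title comes in.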
SDRBench: Scientific data reduction benchmark for lossy compressors
Efficient error-controlled lossy compressors are becoming critical to the success of today's large-scale scientific applications because of the ever-increasing volume of data produced …
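A typical workflow with benchmark datasets like these is to read a raw binary field, run a compressor, and report rate-distortion metrics. A minimal sketch, assuming a headerless row-major float32 dump; the file name and shape in the usage comment are placeholders, not actual SDRBench entries:

```python
import numpy as np

def load_field(path, shape, dtype=np.float32):
    """Read a headerless row-major binary field into an array."""
    return np.fromfile(path, dtype=dtype).reshape(shape)

def psnr(orig, decomp):
    """Peak signal-to-noise ratio over the field's value range, the
    rate-distortion metric most commonly reported in this literature."""
    rmse = np.sqrt(np.mean((orig.astype(np.float64) - decomp) ** 2))
    return 20 * np.log10((orig.max() - orig.min()) / rmse)

# e.g. field = load_field("field.f32", (1800, 3600))  # placeholder name/shape
```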
Dynamic quality metric oriented error bounded lossy compression for scientific datasets
With the ever-increasing execution scale of high performance computing (HPC) applications, vast amounts of data are being produced by scientific research every day. Error …
Image quality assessment for magnetic resonance imaging
Image quality assessment (IQA) algorithms aim to reproduce human perception of image quality. The growing popularity of image enhancement, generation, and recovery …
FRaZ: A generic high-fidelity fixed-ratio lossy compression framework for scientific floating-point data
With ever-increasing volumes of scientific floating-point data being produced by high-performance computing applications, significantly reducing scientific floating-point data size …
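Fixed-ratio operation inverts the usual compressor interface: instead of fixing an error bound, search for the bound that yields a requested compression ratio. A minimal sketch under the assumption (usually true in practice) that the ratio grows monotonically with the bound; the quantize-then-zlib stand-in compressor and the geometric bisection are illustrative, not FRaZ's actual optimizer:

```python
import zlib
import numpy as np

def toy_ratio(data, err_bound):
    """Compression ratio of a stand-in error-bounded compressor:
    uniform quantization followed by zlib (real use: SZ, ZFP, ...)."""
    q = np.round(data / (2 * err_bound)).astype(np.int32)
    return data.nbytes / len(zlib.compress(q.tobytes()))

def bound_for_ratio(data, target, lo=1e-9, hi=1e-1, iters=40):
    """Bisect the error bound (geometrically, since useful bounds span
    many decades) until the achieved ratio reaches the target."""
    for _ in range(iters):
        mid = np.sqrt(lo * hi)
        if toy_ratio(data, mid) < target:
            lo = mid   # ratio too small: loosen the bound
        else:
            hi = mid   # ratio met: try a tighter bound
    return hi          # a production tool would validate the result
```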
AMRIC: A novel in situ lossy compression framework for efficient I/O in adaptive mesh refinement applications
As supercomputers advance towards exascale capabilities, computational intensity increases significantly, and the volume of data requiring storage and transmission …
Optzconfig: Efficient parallel optimization of lossy compression configuration
Lossless compressors have very low compression ratios that do not meet the needs of today's large-scale scientific applications, which produce vast volumes of data. Error-bounded …
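Configuration tuning is embarrassingly parallel across candidates, which is the property a parallel optimizer like this can exploit. A minimal thread-pool sketch; the candidate space here is just an error-bound grid and the quantize-plus-zlib scorer is a stand-in for invoking a real compressor, not OptZConfig's interface:

```python
from concurrent.futures import ThreadPoolExecutor
import zlib
import numpy as np

def score(data, err_bound):
    """Evaluate one candidate configuration: here, the compression
    ratio of quantize-then-zlib at the given error bound."""
    q = np.round(data / (2 * err_bound)).astype(np.int32)
    return data.nbytes / len(zlib.compress(q.tobytes()))

def sweep(data, bounds):
    """Score all candidate bounds concurrently; zlib releases the GIL
    during compression, so threads give real parallelism here."""
    with ThreadPoolExecutor() as pool:
        ratios = list(pool.map(lambda b: score(data, b), bounds))
    return dict(zip(bounds, ratios))
```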
Foresight: analysis that matters for data reduction
As the computational power of supercomputers increases, so does simulation size, which in turn produces orders-of-magnitude more data. Because generated data often exceed the …
Compressing atmospheric data into its real information content
Hundreds of petabytes are produced annually at weather and climate forecast centers worldwide. Compression is essential to reduce storage and to facilitate data sharing. Current …
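This method keeps only the mantissa bits that carry real information and rounds the rest away with binary round-to-nearest, after which lossless compression becomes far more effective. A minimal float32 bit-rounding sketch (round-to-nearest, ties to even) assuming finite inputs and 0 < keep_bits < 23; the per-variable information analysis that chooses keep_bits is the paper's actual contribution and is not shown:

```python
import numpy as np

def round_mantissa(x, keep_bits):
    """Zero all but the leading `keep_bits` mantissa bits of float32
    values using integer round-to-nearest, ties to even. The rounded
    array is bitwise far more repetitive, so lossless codecs shrink it."""
    assert 0 < keep_bits < 23          # float32 has 23 mantissa bits
    ui = np.asarray(x, dtype=np.float32).view(np.uint32)
    drop = np.uint32(23 - keep_bits)   # number of trailing bits to round away
    half = np.uint32(1) << (drop - np.uint32(1))
    mask = ~((np.uint32(1) << drop) - np.uint32(1))
    tie = (ui >> drop) & np.uint32(1)  # breaks exact ties toward even
    return ((ui + half - np.uint32(1) + tie) & mask).view(np.float32)
```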