Multifacets of lossy compression for scientific data in the Joint-Laboratory of Extreme Scale Computing
The Joint Laboratory on Extreme-Scale Computing (JLESC) was initiated at the
same time lossy compression for scientific data became an important topic for the scientific …
SZ3: A modular framework for composing prediction-based error-bounded lossy compressors
Today's scientific simulations require a significant reduction of data volume because of the
extremely large amounts of data they produce and the limited I/O bandwidth and storage …
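The core loop of a prediction-based error-bounded compressor is easy to sketch. Below is a minimal, illustrative 1D version in Python (not the SZ3 API; the function name is invented): each value is predicted from the previously decompressed value, and the prediction error is quantized to an integer code that respects a user-set absolute error bound.

```python
import numpy as np

def compress_1d(data, error_bound):
    """Toy prediction-based error-bounded compression (illustrative only):
    predict each value from the previous *decompressed* value, then
    quantize the prediction error to an integer multiple of 2*error_bound."""
    codes = np.empty(len(data), dtype=np.int64)   # small ints entropy-code well
    recon = np.empty_like(data)
    prev = 0.0                                    # predictor state
    for i, x in enumerate(data):
        code = int(round((x - prev) / (2 * error_bound)))
        codes[i] = code
        recon[i] = prev + code * 2 * error_bound  # what the decoder reproduces
        prev = recon[i]                           # prevents error accumulation
    return codes, recon

data = np.cumsum(np.random.randn(1000)) * 0.1     # smooth-ish test signal
codes, recon = compress_1d(data, error_bound=1e-2)
print("max abs error:", np.max(np.abs(data - recon)))  # stays within 1e-2
```

Predicting from the reconstructed value rather than the original is what keeps the decoder in lockstep with the encoder, so the error bound holds pointwise rather than drifting.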
Priority-based parameter propagation for distributed DNN training
Data parallel training is widely used for scaling distributed deep neural network (DNN)
training. However, the performance benefits are often limited by the communication-heavy …
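The idea can be illustrated with a toy scheduler (a hypothetical sketch, not the paper's implementation): backpropagation produces gradients from the last layer to the first, but the first layer's parameters are needed earliest in the next forward pass, so its gradient slices should be transmitted with the highest priority.

```python
import heapq

def transmission_order(num_layers, slices_per_layer):
    """Toy priority scheduler: gradients become ready back-to-front,
    but lower layer index means needed sooner, hence higher priority."""
    queue = []
    for layer in reversed(range(num_layers)):     # ready order: last -> first
        for s in range(slices_per_layer):         # tensors split into slices
            heapq.heappush(queue, (layer, s))     # priority = layer index
    return [heapq.heappop(queue) for _ in range(len(queue))]

print(transmission_order(num_layers=4, slices_per_layer=2))
# [(0, 0), (0, 1), (1, 0), (1, 1), ...]: layer 0's slices go out first,
# overlapping communication with the start of the next forward pass
```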
Full-state quantum circuit simulation by using data compression
Quantum circuit simulations are critical for evaluating quantum algorithms and machines.
However, the number of state amplitudes required for full simulation increases exponentially …
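A short sketch shows why compressing the amplitude array is attractive: full-state simulation stores 2^n complex amplitudes, so memory doubles with every added qubit. The snippet below is illustrative only; it applies a gate to a small state vector and prints the raw storage cost at larger qubit counts.

```python
import numpy as np

n = 3                                    # full-state simulation: 2**n amplitudes
state = np.zeros(2**n, dtype=np.complex128)
state[0] = 1.0                           # |000>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def apply_1q(state, gate, target, n):
    """Apply a single-qubit gate by viewing the state as an n-way tensor."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

state = apply_1q(state, H, target=0, n=n)
print(state)                             # (|000> + |100>) / sqrt(2)

# The storage wall that motivates compressing the amplitude array:
for q in (30, 40, 45):
    print(f"{q} qubits -> {16 * 2**q / 2**40:g} TiB of amplitudes")
```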
High-ratio lossy compression: Exploring the autoencoder to compress scientific data
Scientific simulations on high-performance computing (HPC) systems can generate large
amounts of floating-point data per run. To mitigate the data storage bottleneck and lower the …
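As a hedged illustration of the approach (the layer sizes and training setup below are invented, not the paper's architecture), a small PyTorch autoencoder can compress fixed-size blocks of floats into a much smaller latent vector and be trained to minimize reconstruction error:

```python
import torch
import torch.nn as nn

class BlockAE(nn.Module):
    """Compress 64-float blocks to an 8-float latent (~8x reduction
    before any entropy coding). Sizes are illustrative."""
    def __init__(self, block=64, latent=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(block, 32), nn.ReLU(),
                                 nn.Linear(32, latent))
        self.dec = nn.Sequential(nn.Linear(latent, 32), nn.ReLU(),
                                 nn.Linear(32, block))
    def forward(self, x):
        return self.dec(self.enc(x))

model = BlockAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.sin(torch.linspace(0, 50, 64 * 256)).reshape(256, 64)  # smooth field
for _ in range(500):                      # fit the AE to the data it compresses
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), x)
    loss.backward()
    opt.step()
print("reconstruction MSE:", loss.item())
```

Unlike the error-bounded compressors above, a plain autoencoder bounds error only on average, which is why such designs are usually paired with a residual or error-control stage.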
cuSZ: An efficient GPU-based error-bounded lossy compression framework for scientific data
Error-bounded lossy compression is a state-of-the-art data reduction technique for HPC
applications because it not only significantly reduces storage overhead but also can retain …
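A key obstacle on GPUs is that classic SZ-style prediction depends on the previously reconstructed value, which serializes the loop. cuSZ's dual-quantization removes that dependency; the NumPy sketch below (illustrative, not cuSZ's actual kernels) mimics the idea: quantize every value independently first, then take differences on the integer grid, so both passes are data-parallel.

```python
import numpy as np

def dual_quant(data, eb):
    """Sketch of dual-quantization in 1D: pre-quantize pointwise, then
    take Lorenzo-style deltas on integers. No step reads a previously
    *reconstructed* value, the property a GPU implementation needs."""
    q = np.round(data / (2 * eb)).astype(np.int64)   # pass 1: pointwise
    delta = np.diff(q, prepend=0)                    # pass 2: pointwise
    return delta

def reconstruct(delta, eb):
    return np.cumsum(delta) * 2 * eb                 # prefix sum on decode

data = np.cumsum(np.random.randn(10**6)) * 0.1
eb = 1e-3
recon = reconstruct(dual_quant(data, eb), eb)
print("max abs error:", np.max(np.abs(data - recon)))  # stays within eb
```

Because the deltas are exact integer differences, decoding reduces to a prefix sum, another operation that parallelizes well.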
Significantly improving lossy compression for HPC datasets with second-order prediction and parameter optimization
Today's extreme-scale high-performance computing (HPC) applications are producing
volumes of data too large to save or transfer because of limited storage space and I/O …
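The benefit of higher-order prediction is easy to see in one dimension (a toy illustration; the paper's predictors are multi-dimensional): on smooth data, second-order linear extrapolation leaves much smaller residuals than simply repeating the previous value, and smaller residuals quantize into more compressible codes.

```python
import numpy as np

x = np.sin(np.linspace(0, 4 * np.pi, 1000))      # smooth 1D field

res1 = x[1:] - x[:-1]                   # 1st order: predict previous value
res2 = x[2:] - (2 * x[1:-1] - x[:-2])   # 2nd order: linear extrapolation
print("mean |residual|, 1st-order predictor:", np.mean(np.abs(res1)))
print("mean |residual|, 2nd-order predictor:", np.mean(np.abs(res2)))
```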
Understanding and modeling lossy compression schemes on HPC scientific data
Scientific simulations generate large amounts of floating-point data, which are often not very
compressible using the traditional reduction schemes, such as deduplication or lossless …
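Such models typically relate a chosen error bound to the two quantities of interest, compression ratio and distortion. The sketch below uses a stand-in uniform scalar quantizer (not the modeled compressors themselves): it estimates the achievable ratio from the empirical entropy of the quantization codes and reports PSNR for the reconstruction.

```python
import numpy as np

def psnr(orig, recon):
    """Peak signal-to-noise ratio over the data's value range, in dB."""
    rng = orig.max() - orig.min()
    mse = np.mean((orig - recon) ** 2)
    return 20 * np.log10(rng) - 10 * np.log10(mse)

data = np.cumsum(np.random.randn(10**5)) * 0.01
eb = 1e-3
codes = np.round(data / (2 * eb)).astype(np.int64)   # stand-in quantizer
recon = codes * 2 * eb

vals, counts = np.unique(codes, return_counts=True)
p = counts / counts.sum()
est_bits = -np.sum(p * np.log2(p)) * len(codes)      # entropy-coded size
print("est. compression ratio:", data.nbytes * 8 / est_bits)
print("PSNR (dB):", psnr(data, recon))
```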
A structured and scalable mechanism for test access to embedded reusable cores
EJ Marinissen, R Arendsen, G Bos… - … 1998 (IEEE Cat. No …, 1998 - ieeexplore.ieee.org
The main objective of core-based IC design is improvement of design efficiency and time-to-
market. In order to prevent test development from becoming the bottleneck in the entire …
Data reduction techniques for simulation, visualization and data analysis
Data reduction is increasingly being applied to scientific data for numerical simulations,
scientific visualizations and data analyses. It is most often used to lower I/O and storage …