An improved deep Q-learning algorithm for a trade-off between energy consumption and productivity in batch scheduling
X Zheng, Z Chen - Computers & Industrial Engineering, 2024 - Elsevier
The single-batch machine, commonly found in industrial manufacturing, can concurrently
process a group of jobs in variable-speed batches, leading to fluctuating levels of both …
A high-quality workflow for multi-resolution scientific data reduction and visualization
Multi-resolution methods such as Adaptive Mesh Refinement (AMR) can enhance storage
efficiency for HPC applications generating vast volumes of data. However, their applicability …
A Prediction‐Traversal Approach for Compressing Scientific Data on Unstructured Meshes with Bounded Error
We explore an error‐bounded lossy compression approach for reducing scientific data
associated with 2D/3D unstructured meshes. While existing lossy compressors offer a high …
TAC+: Optimizing Error-Bounded Lossy Compression for 3D AMR Simulations
Today's scientific simulations require significant data volume reduction because of the
enormous amounts of data produced and the limited I/O bandwidth and storage space. Error …
cuSZ-i: High-Ratio Scientific Lossy Compression on GPUs with Optimized Multi-Level Interpolation
Error-bounded lossy compression is a critical technique for significantly reducing scientific
data volumes. Compared to CPU-based compressors, GPU-based compressors exhibit …
AdapCK: Optimizing I/O for Checkpointing on Large-Scale High Performance Computing Systems
J Jia, Y Liu, Y Liu, Y Chen, F Lin - European Conference on Parallel …, 2024 - Springer
With the scaling-up of high-performance computing (HPC) systems, resilience has
become an important challenge. As a widely used resilience technique for HPC systems …
Lossy Data Compression By Adaptive Mesh Coarsening
Today's scientific simulations, for example in the high-performance exascale sector, produce
huge amounts of data. Due to limited I/O bandwidth and available storage space, there is the …
SZOps: Scalar Operations for Error-bounded Lossy Compressor for Scientific Data
Error-bounded lossy compression has been a critical technique to significantly reduce the
sheer amounts of simulation datasets for high-performance computing (HPC) scientific …
Accelerating Viz Pipelines Using Near-Data Computing: An Early Experience
Q Zheng, B Atkinson, D Wang, J Lee… - SC24-W: Workshops …, 2024 - ieeexplore.ieee.org
Traditional scientific visualization pipelines transfer entire data arrays from storage to client
nodes for processing into displayable graphics objects. However, this full data transfer is …
Enhancing Lossy Compression Through Cross-Field Information for Scientific Applications
Lossy compression is one of the most effective methods for reducing the size of scientific
data containing multiple data fields. It reduces information density through prediction or …