Towards securing data transfers against silent data corruption
Scientific applications generate large volumes of data that often need to be moved between
geographically distributed sites for collaboration or backup, which has led to a significant …
Globus service enhancements for exascale applications and facilities
Many extreme-scale applications require the movement of large quantities of data to, from,
and among leadership computing facilities, as well as other scientific facilities and the home …
RIVA: Robust integrity verification algorithm for high-speed file transfers
End-to-end integrity verification is designed to protect file transfers against silent data
corruption by comparing checksums of files at the source and destination endpoints using …
A low-overhead integrity verification for big data transfers
The amount of data generated by scientific and commercial applications is growing at an
ever-increasing pace. This data is often moved between geographically distributed sites for …
Integrity protection for scientific workflow data: Motivation and initial experiences
With the continued rise of scientific computing and the enormous increases in the size of
data being processed, scientists must consider whether the processes for transmitting and …
Accelerating I/O performance of ZFS-based Lustre file system in HPC environment
To meet the increasing data-access performance demands of applications run on
high-performance computing (HPC) systems, an efficient design of the HPC storage file system is …
TPBF: Two-Phase Bloom-Filter-Based End-to-End Data Integrity Verification Framework for Object-Based Big Data Transfer Systems
Computational science simulations produce huge volumes of data for scientific research
organizations. Often, this data is shared among geographically distributed data centers for …
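The TPBF and CRBF entries above use Bloom filters to track transferred objects compactly. As a rough illustration only (not either paper's framework; the bit-array size m, hash count k, and the checksum strings are arbitrary choices here), a receiver could record the checksums of verified objects in a Bloom filter and treat a membership miss as a definite mismatch, while accepting that hits carry a small false-positive probability:

```python
import hashlib

class BloomFilter:
    """A minimal Bloom filter: k positions in an m-bit array per item.
    Misses are definitive; hits may be false positives."""

    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m // 8 + 1)

    def _positions(self, item):
        # Derive k bit positions by salting a SHA-256 hash with the index.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))
```

Because hits can be false positives, a filter check alone cannot confirm integrity; this is why schemes such as TPBF add a second phase rather than relying on a single membership test.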
CRBF: Cross-Referencing Bloom-Filter-Based Data Integrity Verification Framework for Object-Based Big Data Transfer Systems
Various components are involved in the end-to-end path of data transfer. Protecting data
integrity from failures in these intermediate components is a key feature of big data transfer …
Concurrent and robust end-to-end data integrity verification scheme for flash-based storage devices
The amount of data generated by scientific applications on high-performance computing
systems is growing at an ever-increasing pace. Most of the generated data are transferred to …
Towards generalizable network anomaly detection models
Finding the root causes of network performance anomalies is critical to satisfying
quality-of-service requirements. In this paper, we introduce machine learning (ML) models to process …