Bridging data center AI systems with edge computing for actionable information retrieval
Extremely high data rates at modern synchrotron and X-ray free-electron laser light source
beamlines motivate the use of machine learning methods for data reduction, feature …
Optimizing scientific data transfer on globus with error-bounded lossy compression
The increasing volume and velocity of science data necessitate the frequent movement of
enormous data volumes as part of routine research activities. As a result, limited wide-area …
Characterization and identification of HPC applications at leadership computing facility
High Performance Computing (HPC) is an important method for scientific discovery via large-
scale simulation, data analysis, or artificial intelligence. Leadership-class supercomputers …
Resilient error-bounded lossy compressor for data transfer
Today's exascale scientific applications and advanced instruments produce vast
volumes of data, which must be shared or transferred across networks and devices with …
SciStream: Architecture and toolkit for data streaming between federated science instruments
Modern scientific instruments, such as detectors at synchrotron light sources, generate data
at such high rates that online processing is needed for data reduction, feature detection …
Online optimization of file transfers in high-speed networks
File transfers in high-speed networks require network and I/O parallelism to reach high
speeds; however, creating arbitrarily large numbers of I/O and network threads overwhelms …
Globus service enhancements for exascale applications and facilities
Many extreme-scale applications require the movement of large quantities of data to, from,
and among leadership computing facilities, as well as other scientific facilities and the home …
Use only what you need: Judicious parallelism for file transfers in high performance networks
Parallelism is key to efficiently utilizing high-speed research networks when transferring
large volumes of data. However, the monolithic design of existing transfer applications …
Design and evaluation of a simple data interface for efficient data transfer across diverse storage
Modern science and engineering computing environments often feature storage systems of
different types, from parallel file systems in high-performance computing centers to object …
An exabyte a day: throughput-oriented, large scale, managed data transfers with Effingo
WAN bandwidth is never too broad---and the speed of light stubbornly constant. These two
fundamental constraints force globally-distributed systems to carefully replicate data close to …