Efficient in-situ workflow planning for geographically distributed heterogeneous environments
In-situ workflows are a particular class of scientific workflows where different components
(such as simulation, visualization, machine learning, and data analysis) run concurrently. In …
CAPIO: a middleware for transparent I/O streaming in data-intensive workflows
With the increasing amount of digital data available for analysis and simulation, the class of
I/O-intensive HPC workflows is fated to quickly expand, further exacerbating the …
BeeSwarm: enabling parallel scaling performance measurement in continuous integration for HPC applications
Testing is one of the most important steps in software development: it ensures the quality of
software. Continuous Integration (CI) is a widely used testing standard that can report …
Towards Highly Compatible I/O-Aware Workflow Scheduling on HPC Systems
Y Dai, R Wang, Y Dong, K Lu - SC24: International Conference …, 2024 - ieeexplore.ieee.org
Scientific workflows on High-Performance Computing (HPC) consist of multiple data
processing and computing tasks with dependencies. Efficiently scheduling computing …
INSTANT: A Runtime Framework to Orchestrate In-Situ Workflows
An in-situ workflow is a type of workflow in which multiple components execute concurrently with
data flowing continuously. The adoption of in-situ workflows not only accelerates mission …
[Book] Mars: Multi-scalable actor-critic reinforcement learning scheduler
B Baheri - 2020 - search.proquest.com
In this thesis we introduce MARS, a new scheduling algorithm based on a cost-aware multi-
scalable reinforcement learning approach, which serves as an intermediate layer between …
An HPC-Container Based Continuous Integration Tool for Detecting Scaling and Performance Issues in HPC Applications
Testing is one of the most important steps in software development: it ensures the quality of
software. Continuous Integration (CI) is a widely used testing standard that can report …
Shared-memory communication for containerized workflows
Scientific computation increasingly consists of a workflow of interrelated tasks.
Containerization can make workflow systems more manageable, reproducible, and portable …
BEE orchestrator: Running complex scientific workflows on multiple systems
In this paper, we propose a workflow orchestration system that can run workflows on
both HPC systems and in the cloud using HPC containers. Most existing workflow …
Hydra: Brokering Cloud and HPC Resources to Support the Execution of Heterogeneous Workloads at Scale
Scientific discovery increasingly depends on middleware that enables the execution of
heterogeneous workflows on heterogeneous platforms. One of the main challenges is to …