Harnessing the computing continuum for programming our world
This chapter outlines a vision for how best to harness the computing continuum of
interconnected sensors, actuators, instruments, and computing systems, from small numbers …
Metaheuristic task scheduling algorithms for cloud computing environments
Cloud computing has the advantages of flexibility, high performance, pay-as-you-use pricing,
and on-demand service. One of the important research issues in cloud computing is …
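The snippet above only names the topic; as a rough illustration of what a metaheuristic task scheduler does (not the paper's own algorithm), here is a minimal simulated-annealing sketch that assigns tasks to VMs to reduce makespan, assuming hypothetical task lengths and VM speeds.

```python
import math
import random

def makespan(assignment, task_len, vm_speed):
    """Finish time of the busiest VM under a task->VM assignment."""
    load = [0.0] * len(vm_speed)
    for task, vm in enumerate(assignment):
        load[vm] += task_len[task] / vm_speed[vm]
    return max(load)

def anneal_schedule(task_len, vm_speed, steps=20000, t0=1.0, cooling=0.9995):
    """Simulated-annealing search over task->VM assignments (illustrative only)."""
    n_tasks, n_vms = len(task_len), len(vm_speed)
    current = [random.randrange(n_vms) for _ in range(n_tasks)]
    best, cur_cost = current[:], makespan(current, task_len, vm_speed)
    best_cost, temp = cur_cost, t0
    for _ in range(steps):
        cand = current[:]
        cand[random.randrange(n_tasks)] = random.randrange(n_vms)  # move one task
        cost = makespan(cand, task_len, vm_speed)
        # accept improvements always, worse moves with Boltzmann probability
        if cost < cur_cost or random.random() < math.exp((cur_cost - cost) / temp):
            current, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand[:], cost
        temp *= cooling
    return best, best_cost

if __name__ == "__main__":
    random.seed(0)
    tasks = [random.uniform(1, 10) for _ in range(50)]   # task lengths (arbitrary units)
    vms = [1.0, 1.5, 2.0, 2.5]                           # relative VM speeds
    plan, cost = anneal_schedule(tasks, vms)
    print(f"approx. makespan: {cost:.2f}")
```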
Twister2: Design of a big data toolkit
Data‐driven applications are essential to handle the ever‐increasing volume, velocity, and
veracity of data generated by sources such as the Web and Internet of Things (IoT) devices …
Twister2: Tset high-performance iterative dataflow
The dataflow model is gradually becoming the de facto standard for big data applications.
While many popular frameworks are built around this model, very little research has been …
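As a loose illustration of the iterative dataflow idea (not the Twister2 TSet API), the sketch below expresses an iterative computation as repeated map and reduce stages over partitioned data, with the reduced state fed back into the next iteration; the gradient-descent example and function names are hypothetical.

```python
from functools import reduce

def iterate(partitions, map_fn, reduce_fn, update_fn, state, iterations):
    """One map stage, one reduce stage, one state update per iteration."""
    for _ in range(iterations):
        partials = [map_fn(part, state) for part in partitions]  # map over partitions
        merged = reduce(reduce_fn, partials)                     # global reduction
        state = update_fn(state, merged)                         # feed back into next round
    return state

# Illustration: fit y = w*x by gradient descent expressed as an iterative dataflow.
partitions = [[(1.0, 2.1), (2.0, 3.9)], [(3.0, 6.2), (4.0, 8.1)]]

def grad(part, w):
    return sum(2 * (w * x - y) * x for x, y in part)

w = iterate(partitions, grad, lambda a, b: a + b,
            lambda w, g: w - 0.01 * g, state=0.0, iterations=200)
print(round(w, 3))   # converges near 2.0
```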
MTCL: a multi-transport communication library
To pave the way toward adopting the Compute Continuum paradigm, there is a need to
support highly distributed heterogeneous application workflows that require the …
Task-parallel analysis of molecular dynamics trajectories
Different parallel frameworks for implementing data analysis applications have been
proposed by the HPC and Big Data communities. In this paper, we investigate three task …
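As a hedged illustration of task-parallel trajectory analysis (not one of the frameworks compared in the paper), the sketch below treats each frame as an independent task and computes a per-frame radius of gyration with Python's concurrent.futures; the synthetic trajectory is a stand-in for real MD data.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def radius_of_gyration(frame):
    """Per-frame observable: RMS distance of atoms from their centroid."""
    centered = frame - frame.mean(axis=0)
    return float(np.sqrt((centered ** 2).sum(axis=1).mean()))

def analyze(frames, workers=4):
    # Each frame is an independent task; results return in frame order.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(radius_of_gyration, frames))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trajectory = [rng.normal(size=(1000, 3)) for _ in range(64)]  # synthetic frames
    print(analyze(trajectory)[:3])
```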
Streaming machine learning algorithms with big data systems
Designing low-latency applications that can process large volumes of data with high
efficiency is a challenging problem. With the limited time available to process data, usage of online …
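To illustrate what an online (streaming) learner looks like in this setting, here is a minimal single-pass SGD sketch, not tied to any particular big data system named in the paper: each record is processed once, in constant memory, as it arrives; the linear model and data stream are hypothetical.

```python
import random

class OnlineLinearModel:
    """One-pass SGD for y ~ w*x + b; each record is seen once and discarded."""
    def __init__(self, lr=0.01):
        self.w, self.b, self.lr = 0.0, 0.0, lr

    def update(self, x, y):
        err = (self.w * x + self.b) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

def stream(n):
    random.seed(1)
    for _ in range(n):
        x = random.uniform(-1, 1)
        yield x, 3.0 * x + 0.5 + random.gauss(0, 0.1)

model = OnlineLinearModel()
for x, y in stream(5000):
    model.update(x, y)          # constant memory, one update per arriving record
print(round(model.w, 2), round(model.b, 2))   # near 3.0 and 0.5
```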
HPTMT: Operator-based architecture for scalable high-performance data-intensive frameworks
Data-intensive applications impact many domains, and their steadily increasing size and
complexity demand high-performance, highly usable environments. We integrate a set of …
Contributions to high-performance big data computing
Our project is at the interface of Big Data and HPC (High-Performance Big Data computing),
and this paper describes a collaboration between 7 universities at Arizona …
Learning Everywhere: Pervasive machine learning for effective High-Performance computation: Application background
This paper describes opportunities at the interface between large-scale simulations,
experiment design and control, machine learning (ML, including deep learning, DL), and High …