Testing MapReduce programs: A systematic mapping study
Context: MapReduce is a processing model used in Big Data to facilitate the analysis of large data under a distributed architecture. Objective: The aim of this study is to identify and …
Avoiding slow running nodes in distributed systems
In distributed systems like Hadoop, work is segmented into tasks that are then executed in parallel on nodes in the cluster. Stragglers, the nodes which are …
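The snippet above describes straggler mitigation, which Hadoop addresses through speculative execution: tasks whose progress lags far behind their peers get a backup copy launched on another node. A minimal sketch of the detection step, with all names and the threshold value being illustrative assumptions rather than Hadoop's actual API:

```python
def find_stragglers(progress, threshold=0.2):
    """Return indices of tasks whose progress trails the mean by more
    than `threshold` (a fraction of total work). Hypothetical helper;
    the real scheduler uses progress *rates* and more guards."""
    if not progress:
        return []
    mean = sum(progress) / len(progress)
    return [i for i, p in enumerate(progress) if mean - p > threshold]

# A lagging task stands out against otherwise uniform progress:
print(find_stragglers([0.9, 0.85, 0.88, 0.3]))  # → [3]
```

Tasks flagged this way would be candidates for a speculative re-launch, with whichever copy finishes first winning.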
Prediction models using machine learning
A Kish - Data in Brief, 2016