Toward storytelling from visual lifelogging: An overview

M Bolanos, M Dimiccoli… - IEEE Transactions on …, 2016 - ieeexplore.ieee.org
Visual lifelogging consists of acquiring images that capture the daily experiences of the user
by wearing a camera over a long period of time. The pictures taken offer considerable …

Analysis of the hands in egocentric vision: A survey

A Bandini, J Zariffa - IEEE Transactions on Pattern Analysis and …, 2020 - ieeexplore.ieee.org
Egocentric vision (aka first-person vision–FPV) applications have thrived over the past few
years, thanks to the availability of affordable wearable cameras and large annotated …

In the eye of beholder: Joint learning of gaze and actions in first person video

Y Li, M Liu, JM Rehg - Proceedings of the European …, 2018 - openaccess.thecvf.com
We address the task of jointly determining what a person is doing and where they are
looking based on the analysis of video captured by a headworn camera. We propose a …

A survey on activity detection and classification using wearable sensors

M Cornacchia, K Ozcan, Y Zheng… - IEEE Sensors …, 2016 - ieeexplore.ieee.org
Activity detection and classification are very important for autonomous monitoring of humans
for applications, including assistive living, rehabilitation, and surveillance. Wearable sensors …

Survey on 3D hand gesture recognition

H Cheng, L Yang, Z Liu - … on Circuits and Systems for Video …, 2015 - ieeexplore.ieee.org
Three-dimensional hand gesture recognition has attracted increasing research interests in
computer vision, pattern recognition, and human-computer interaction. The emerging depth …

Computer vision for assistive technologies

M Leo, G Medioni, M Trivedi, T Kanade… - Computer Vision and …, 2017 - Elsevier
In recent decades there has been a tremendous increase in demand for Assistive
Technologies (AT) useful to overcome functional limitations of individuals and to improve …

An outlook into the future of egocentric vision

C Plizzari, G Goletto, A Furnari, S Bansal… - International Journal of …, 2024 - Springer
What will the future be? We wonder! In this survey, we explore the gap between current
research in egocentric vision and the ever-anticipated future, where wearable computing …

Static hand gesture recognition based on convolutional neural networks

RF Pinto Jr, CDB Borges, AMA Almeida… - Journal of Electrical …, 2019 - Wiley Online Library
This paper proposes a gesture recognition method using convolutional neural networks. The
procedure involves the application of morphological filters, contour generation, polygonal …

Predicting gaze in egocentric video by learning task-dependent attention transition

Y Huang, M Cai, Z Li, Y Sato - Proceedings of the European …, 2018 - openaccess.thecvf.com
We present a new computational model for gaze prediction in egocentric videos by
exploring patterns in temporal shift of gaze fixations (attention transition) that are dependent …

In the eye of the beholder: Gaze and actions in first person video

Y Li, M Liu, JM Rehg - IEEE Transactions on Pattern Analysis …, 2021 - ieeexplore.ieee.org
We address the task of jointly determining what a person is doing and where they are
looking based on the analysis of video captured by a headworn camera. To facilitate our …