Toward storytelling from visual lifelogging: An overview
Visual lifelogging consists of acquiring images that capture the daily experiences of the user
by wearing a camera over a long period of time. The pictures taken offer considerable …
Analysis of the hands in egocentric vision: A survey
Egocentric vision (aka first-person vision–FPV) applications have thrived over the past few
years, thanks to the availability of affordable wearable cameras and large annotated …
In the eye of beholder: Joint learning of gaze and actions in first person video
We address the task of jointly determining what a person is doing and where they are
looking based on the analysis of video captured by a headworn camera. We propose a …
A survey on activity detection and classification using wearable sensors
Activity detection and classification are very important for autonomous monitoring of humans
for applications, including assistive living, rehabilitation, and surveillance. Wearable sensors …
Survey on 3D hand gesture recognition
Three-dimensional hand gesture recognition has attracted increasing research interests in
computer vision, pattern recognition, and human-computer interaction. The emerging depth …
Computer vision for assistive technologies
In recent decades there has been a tremendous increase in demand for Assistive
Technologies (AT) useful for overcoming functional limitations of individuals and improving …
An outlook into the future of egocentric vision
What will the future be? We wonder! In this survey, we explore the gap between current
research in egocentric vision and the ever-anticipated future, where wearable computing …
Static hand gesture recognition based on convolutional neural networks
RF Pinto Jr, CDB Borges, AMA Almeida… - Journal of Electrical …, 2019 - Wiley Online Library
This paper proposes a gesture recognition method using convolutional neural networks. The
procedure involves the application of morphological filters, contour generation, polygonal …
Predicting gaze in egocentric video by learning task-dependent attention transition
We present a new computational model for gaze prediction in egocentric videos by
exploring patterns in temporal shift of gaze fixations (attention transition) that are dependent …
In the eye of the beholder: Gaze and actions in first person video
We address the task of jointly determining what a person is doing and where they are
looking based on the analysis of video captured by a headworn camera. To facilitate our …