Sensor-level computer vision with pixel processor arrays for agile robots

P Dudek, T Richardson, L Bose, S Carey, J Chen… - Science Robotics, 2022 - science.org
Vision processing for control of agile autonomous robots requires low-latency computation,
within a limited power and space budget. This is challenging for conventional computing …

Neural sensors: Learning pixel exposures for HDR imaging and video compressive sensing with programmable sensors

JNP Martel, LK Mueller, SJ Carey… - IEEE transactions on …, 2020 - ieeexplore.ieee.org
Camera sensors rely on global or rolling shutter functions to expose an image. This fixed
function approach severely limits the sensors' ability to capture high-dynamic-range (HDR) …
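
The entry above concerns replacing a single global shutter with per-pixel, programmable exposures. As a rough illustration of why that helps with dynamic range, the sketch below recovers irradiance from a coded-exposure image by rescaling each pixel by its own exposure and masking out saturated readings; the array shapes, units, checkerboard exposure pattern, and the simple masking rule are assumptions for illustration, not the learned encoder/decoder described in the paper.

```python
import numpy as np

def recover_irradiance(readings, exposures, full_well=1.0):
    """Estimate scene irradiance from per-pixel coded exposures.

    readings : HxW array of sensor measurements, normalised to [0, 1]
    exposures: HxW array of per-pixel exposure times (hypothetical units)

    Saturated pixels carry no irradiance information, so they are masked
    out here; a learned decoder would instead reconstruct them from
    unsaturated neighbours.
    """
    valid = readings < full_well                      # unsaturated pixels only
    irradiance = np.zeros_like(readings)
    irradiance[valid] = readings[valid] / exposures[valid]
    return irradiance, valid

# Toy usage: a checkerboard of short/long exposures over a bright scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0.1, 10.0, size=(4, 4))           # true irradiance
exposures = np.where(np.indices((4, 4)).sum(0) % 2, 0.05, 1.0)
readings = np.clip(scene * exposures, 0.0, 1.0)        # sensor saturates at 1
est, valid = recover_irradiance(readings, exposures)
```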

In-Sensor Visual Perception and Inference

Y Liu, R Fan, J Guo, H Ni, MUM Bhutta - Intelligent Computing, 2023 - spj.science.org
Conventional machine vision systems have separate perception, memory, and processing
architectures, which may exacerbate the increasing need for ultrahigh image processing …

Navigating the landscape for real-time localization and mapping for robotics and virtual and augmented reality

S Saeedi, B Bodin, H Wagstaff, A Nisbet… - Proceedings of the …, 2018 - ieeexplore.ieee.org
Visual understanding of 3-D environments in real time, at low power, is a huge
computational challenge. Often referred to as simultaneous localization and mapping …

Visual odometry using pixel processor arrays for unmanned aerial systems in GPS denied environments

A McConville, L Bose, R Clarke… - Frontiers in Robotics …, 2020 - frontiersin.org
Environments in which Global Positioning Systems (GPS), or more generally Global
Navigation Satellite System (GNSS), signals are denied or degraded pose problems for the …
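
Pixel processor arrays can support odometry by comparing shifted versions of consecutive frames directly on the focal plane. The sketch below shows a generic frame-to-frame shift search of that kind in NumPy; the SAD cost, search window, and sequential loop are illustrative assumptions and do not reproduce the odometry pipeline of the paper.

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=4):
    """Estimate an integer (dy, dx) image shift between consecutive frames
    by minimising the mean absolute difference over a small search window.
    A pixel processor array can evaluate shifted comparisons in parallel
    on the focal plane; this is a sequential offline analogue."""
    best, best_cost = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(prev, dy, axis=0), dx, axis=1)
            cost = np.abs(shifted - curr).mean()
            if cost < best_cost:
                best, best_cost = (dy, dx), cost
    return best  # accumulate per-frame shifts for a crude odometry estimate
```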

MantissaCam: Learning snapshot high-dynamic-range imaging with perceptually-based in-pixel irradiance encoding

HM So, JNP Martel, G Wetzstein… - 2022 IEEE International …, 2022 - ieeexplore.ieee.org
The ability to image high-dynamic-range (HDR) scenes is crucial in many computer vision
applications. The dynamic range of conventional sensors, however, is fundamentally limited …

Tracking control of a UAV with a parallel visual processor

C Greatwood, L Bose, T Richardson… - 2017 IEEE/RSJ …, 2017 - ieeexplore.ieee.org
This paper presents a vision-based control strategy for tracking a ground target using a
novel vision sensor featuring a processor for each pixel element. This enables computer …
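
A processor per pixel lets the sensor report a target position at high rate, which a controller on the flight computer can then act on. The following sketch is a hypothetical proportional law mapping the target's pixel offset to a normalised velocity command; the gain, image size, and control structure are placeholders, not the controller used in the paper.

```python
def track_target(target_px, image_size=(256, 256), gain=0.8):
    """Map the tracked target's pixel position (as reported by the vision
    chip) to a normalised lateral/longitudinal velocity command that
    re-centres the target in the image. Purely illustrative values."""
    cy, cx = image_size[0] / 2, image_size[1] / 2
    ty, tx = target_px
    # Error of the target from the image centre, normalised to [-1, 1].
    err_y = (ty - cy) / cy
    err_x = (tx - cx) / cx
    # Proportional command: move toward the target to drive the error to zero.
    return gain * err_x, gain * err_y
```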

Real-time depth from focus on a programmable focal plane processor

JNP Martel, LK Müller, SJ Carey… - … on Circuits and …, 2017 - ieeexplore.ieee.org
Visual input can be used to recover the 3-D structure of a scene by estimating distances
(depth) to the observer. Depth estimation is performed in various applications, such as …
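
Depth from focus assigns each pixel the focus setting at which it appears sharpest across a focal sweep. The sketch below shows this with an offline focal stack and a squared-Laplacian focus measure; the focus measure and argmax rule are a generic textbook formulation, whereas the paper computes its measure in-pixel on the focal-plane processor during the sweep.

```python
import numpy as np
from scipy.ndimage import laplace

def depth_from_focus(stack, depths):
    """Per-pixel depth from a focal stack.

    stack : (N, H, W) array of frames captured at N focus settings
    depths: length-N array of scene depths corresponding to each setting
    """
    focus = np.stack([laplace(frame) ** 2 for frame in stack])  # sharpness per slice
    best = np.argmax(focus, axis=0)                             # sharpest slice index
    return np.asarray(depths)[best]                             # (H, W) depth map
```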

SoDaCam: Software-defined Cameras via Single-Photon Imaging

V Sundar, A Ardelean, T Swedish… - Proceedings of the …, 2023 - openaccess.thecvf.com
Reinterpretable cameras are defined by their post-processing capabilities that exceed
traditional imaging. We present "SoDaCam" that provides reinterpretable cameras at the …
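
One way a single-photon array supports software-defined imaging is that its stream of binary frames can be re-aggregated after capture into whatever exposures are wanted. The sketch below sums binary frames into a hypothetical exposure bracket; the bracket sizes and plain summation are assumptions for illustration, not SoDaCam's actual processing.

```python
import numpy as np

def emulate_exposures(binary_frames, bracket_sizes=(10, 100, 1000)):
    """Aggregate single-photon (binary) frames into software-defined
    exposures of different lengths after capture.

    binary_frames: (T, H, W) array of 0/1 photon detections per frame.
    Returns a dict mapping each bracket length to its photon-count image."""
    cumulative = np.cumsum(binary_frames, axis=0)      # photon counts over time
    return {n: cumulative[n - 1] for n in bracket_sizes if n <= len(binary_frames)}
```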

High-speed depth from focus on a programmable vision chip using a focus tunable lens

JNP Martel, LK Müller, SJ Carey… - 2017 IEEE International …, 2017 - ieeexplore.ieee.org
In this paper, we present a 3D imaging system providing a semi-dense depth map, using a
passive, low-power, compact, static, monocular camera. The demonstrated depth estimation …