NeRS: Neural reflectance surfaces for sparse-view 3D reconstruction in the wild

J Zhang, G Yang, S Tulsiani… - Advances in Neural …, 2021 - proceedings.neurips.cc
Recent history has seen a tremendous growth of work exploring implicit representations of
geometry and radiance, popularized through Neural Radiance Fields (NeRF). Such works …

OBJECT 3DIT: Language-guided 3D-aware image editing

O Michel, A Bhattad, E VanderBilt… - Advances in …, 2023 - proceedings.neurips.cc
Existing image editing tools, while powerful, typically disregard the underlying 3D geometry
from which the image is projected. As a result, edits made using these tools may become …

Semantically-aware neural radiance fields for visual scene understanding: A comprehensive review

TAQ Nguyen, A Bourki, M Macudzinski… - arXiv preprint arXiv …, 2024 - arxiv.org
This review thoroughly examines the role of semantically-aware Neural Radiance Fields
(NeRFs) in visual scene understanding, covering an analysis of over 250 scholarly papers. It …

Partial convolution for padding, inpainting, and image synthesis

G Liu, A Dundar, KJ Shih, TC Wang… - … on Pattern Analysis …, 2022 - ieeexplore.ieee.org
Partial convolution weights convolutions with binary masks and renormalizes over valid pixels.
It was originally proposed for the image inpainting task, because a corrupted image processed …
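The renormalization idea in the snippet above can be sketched in a few lines. This is a minimal single-channel, stride-1 illustration in NumPy, not the paper's implementation: at each output location, the convolution sums only over pixels marked valid by the binary mask, then rescales by the ratio of window size to valid-pixel count so the response magnitude stays comparable to a full window.

```python
import numpy as np

def partial_conv2d(x, mask, weight):
    """Single-channel partial convolution (stride 1, no padding).

    x      : (H, W) image
    mask   : (H, W) binary mask, 1 = valid pixel, 0 = hole
    weight : (kh, kw) convolution kernel
    Returns the renormalized output and the updated mask.
    """
    kh, kw = weight.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    new_mask = np.zeros_like(out)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            m = mask[i:i + kh, j:j + kw]
            valid = m.sum()
            if valid > 0:
                # Sum only over valid pixels, then renormalize by
                # (window size / number of valid pixels).
                win = x[i:i + kh, j:j + kw]
                out[i, j] = (win * m * weight).sum() * (kh * kw / valid)
                # A window with any valid pixel produces a valid output.
                new_mask[i, j] = 1.0
    return out, new_mask
```

With an all-ones mask this reduces to an ordinary convolution; with holes in the mask, the renormalization keeps outputs near hole boundaries on the same scale as fully valid windows, which is what makes the same operator usable for padding and inpainting.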

Shape, pose, and appearance from a single image via bootstrapped radiance field inversion

D Pavllo, DJ Tan, MJ Rakotosaona… - Proceedings of the …, 2023 - openaccess.thecvf.com
Neural Radiance Fields (NeRF) coupled with GANs represent a promising direction
in the area of 3D reconstruction from a single view, owing to their ability to efficiently model …

Dual contrastive loss and attention for GANs

N Yu, G Liu, A Dundar, A Tao… - Proceedings of the …, 2021 - openaccess.thecvf.com
Generative Adversarial Networks (GANs) produce impressive results on
unconditional image generation when powered with large-scale image datasets. Yet …

AUV-Net: Learning aligned UV maps for texture transfer and synthesis

Z Chen, K Yin, S Fidler - … of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
In this paper, we address the problem of texture representation for 3D shapes for the
challenging and underexplored tasks of texture transfer and synthesis. Previous works either …

Visual saliency and quality evaluation for 3D point clouds and meshes: An overview

W Lin, S Lee - APSIPA Transactions on Signal and …, 2022 - nowpublishers.com
Three-dimensional (3D) point clouds (PCs) and meshes have increasingly
become available and indispensable for diversified applications in work and life. In addition …

TexRO: Generating delicate textures of 3D models by recursive optimization

J Wu, X Liu, C Wu, X Gao, J Liu, X Liu, C Zhao… - arXiv preprint arXiv …, 2024 - arxiv.org
This paper presents TexRO, a novel method for generating delicate textures of a known 3D
mesh by optimizing its UV texture. The key contributions are two-fold. We propose an …

Progressive learning of 3D reconstruction network from 2D GAN data

A Dundar, J Gao, A Tao… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
This paper presents a method to reconstruct high-quality textured 3D models from single
images. Current methods rely on datasets with expensive annotations; multi-view images …