jp6/cu129/: nerfview-0.1.4 metadata and description

Interactive NeRF rendering web viewer

classifiers
  • Programming Language :: Python :: 3
  • Programming Language :: Python :: 3.8
  • Programming Language :: Python :: 3.9
  • Programming Language :: Python :: 3.10
  • Programming Language :: Python :: 3.11
  • Programming Language :: Python :: 3.12
  • License :: OSI Approved :: MIT License
  • Operating System :: OS Independent
description_content_type text/markdown
license MIT
project_urls
  • GitHub, https://github.com/hangg7/nerfview
requires_dist
  • viser>=0.2.1
  • black; extra == "dev"
  • isort; extra == "dev"
  • ipdb; extra == "dev"

Because this project isn't in the mirror_whitelist, no releases from root/pypi are included.

files
  • nerfview-0.1.4-py3-none-any.whl (Python Wheel, Python 3, 23 KB)

nerfview

nerfview is a minimal* web viewer for interactive NeRF rendering. It is largely inspired by nerfstudio's viewer, but comes as a standalone package with a simple API that is quick to integrate into your own research projects.

*The whole package contains two files and is less than 400 lines of code.

Installation

This package requires Python 3.8+.

For an existing project, you can install it via pip:

pip install nerfview

To run our examples, you can clone this repository and then install it locally:

git clone https://github.com/nerfstudio-project/nerfview
# Install torch first.
pip install torch
# Then install this repo and the dependencies for running the examples. Note
# that `gsplat` requires compilation, which can take a while the first time.
pip install -e .
pip install -r examples/requirements.txt

Usage

nerfview is built on viser and provides a simple API for interactive viewing.

The canonical usage is as follows:

import numpy as np

import viser
import nerfview


def render_fn(
    camera_state: nerfview.CameraState, render_tab_state: nerfview.RenderTabState
) -> np.ndarray:
    # Parse camera state for camera-to-world matrix (c2w) and intrinsic (K) as
    # float64 numpy arrays.
    if render_tab_state.preview_render:
        width = render_tab_state.render_width
        height = render_tab_state.render_height
    else:
        width = render_tab_state.viewer_width
        height = render_tab_state.viewer_height

    c2w = camera_state.c2w
    K = camera_state.get_K([width, height])
    # Do your things and get an image as a uint8 numpy array.
    img = your_rendering_logic(...)
    return img

# Initialize a viser server and our viewer.
server = viser.ViserServer(verbose=False)
viewer = nerfview.Viewer(server=server, render_fn=render_fn, mode='rendering')

This will start a viser server and render images from a camera that you can interact with in the browser.
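
The snippet above omits the imports and the event loop needed to keep the process alive. Here is a minimal self-contained sketch, using only the API shown above, that serves a dummy gradient image instead of a NeRF:

import time

import numpy as np

import viser
import nerfview


def render_fn(
    camera_state: nerfview.CameraState, render_tab_state: nerfview.RenderTabState
) -> np.ndarray:
    # Ignore the camera and return a horizontal gradient at the viewer's
    # current resolution, as a uint8 HxWx3 image.
    h, w = render_tab_state.viewer_height, render_tab_state.viewer_width
    ramp = np.linspace(0, 255, w, dtype=np.uint8)
    return np.stack([np.tile(ramp, (h, 1))] * 3, axis=-1)


server = viser.ViserServer(verbose=False)
viewer = nerfview.Viewer(server=server, render_fn=render_fn, mode="rendering")

# viser serves in a background thread; keep the main thread alive.
while True:
    time.sleep(1.0)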

Examples

We provide a few examples ranging from toy rendering to real-world NeRF training applications. You can always print the help message with the -h flag.

Rendering a dummy scene.

https://github.com/hangg7/nerfview/assets/10098306/53a41fac-bce7-4820-be75-f90483bc22a0

This example is the best starting point to understand the basic API.

python examples/00_dummy_rendering.py

Rendering a dummy training process.

https://github.com/hangg7/nerfview/assets/10098306/8b13ca4a-6aaa-46a7-a333-b889c2a4ac15

This example is the best starting point to understand the API for training-time updates.

python examples/01_dummy_training.py
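
Before diving into the example script, the sketch below illustrates the general training-loop pattern it demonstrates: render from a stand-in model while a loop keeps updating it. The `viewer.lock` context and the `viewer.update(...)` call are assumptions about the API rather than guaranteed names; refer to examples/01_dummy_training.py for the actual calls.

import time

import numpy as np

import viser
import nerfview


def render_fn(
    camera_state: nerfview.CameraState, render_tab_state: nerfview.RenderTabState
) -> np.ndarray:
    # Stand-in renderer: a flat gray image at the requested resolution.
    h, w = render_tab_state.viewer_height, render_tab_state.viewer_width
    return np.full((h, w, 3), 128, dtype=np.uint8)


server = viser.ViserServer(verbose=False)
# mode="training" exposes training-oriented controls in addition to rendering.
viewer = nerfview.Viewer(server=server, render_fn=render_fn, mode="training")

for step in range(1000):
    # Assumed pattern: hold the viewer lock while mutating model state so an
    # interactive render never sees a half-updated model ...
    with viewer.lock:
        time.sleep(0.01)  # your training step would go here
    # ... then tell the viewer a step finished so it can refresh the view.
    # Both `viewer.lock` and `viewer.update` are assumptions; see the example
    # script for the actual calls.
    viewer.update(step, num_train_rays_per_step=4096)
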
Rendering a mesh scene.

https://github.com/hangg7/nerfview/assets/10098306/84c9993f-82a3-48fb-9786-b5205bffcd6f

This example showcases how to interactively render a mesh by directly serving rendering results from nvdiffrast.

# Only needed the first time.
bash examples/assets/download_dragon_mesh.sh
CUDA_VISIBLE_DEVICES=0 python examples/02_mesh_rendering.py
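
For a flavor of the pattern (a hedged sketch, not the example's actual code), the render function below rasterizes a single hard-coded clip-space triangle with nvdiffrast and returns it as a uint8 image; the real example instead projects the dragon mesh into clip space from the camera's c2w and K. The function plugs into nerfview.Viewer exactly as in the Usage section.

import numpy as np
import torch
import nvdiffrast.torch as dr

import nerfview

device = "cuda"
glctx = dr.RasterizeCudaContext()

# A single triangle specified directly in clip space, with per-vertex colors.
pos = torch.tensor(
    [[[-0.8, -0.8, 0.0, 1.0], [0.8, -0.8, 0.0, 1.0], [0.0, 0.8, 0.0, 1.0]]],
    dtype=torch.float32, device=device)
tri = torch.tensor([[0, 1, 2]], dtype=torch.int32, device=device)
col = torch.tensor(
    [[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]],
    dtype=torch.float32, device=device)


def render_fn(
    camera_state: nerfview.CameraState, render_tab_state: nerfview.RenderTabState
) -> np.ndarray:
    # Round to multiples of 8 as a conservative precaution for the CUDA rasterizer.
    h = max(8, (render_tab_state.viewer_height // 8) * 8)
    w = max(8, (render_tab_state.viewer_width // 8) * 8)
    # The real example would first transform the mesh into clip space using
    # camera_state.c2w and camera_state.get_K([w, h]); omitted in this sketch.
    rast, _ = dr.rasterize(glctx, pos, tri, resolution=[h, w])
    color, _ = dr.interpolate(col, rast, tri)
    color = dr.antialias(color, rast, pos, tri)
    return (color[0].clamp(0.0, 1.0) * 255).to(torch.uint8).cpu().numpy()
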
Rendering a pretrained 3DGS scene.

https://github.com/hangg7/nerfview/assets/10098306/7b526105-8b6f-431c-9b49-10c821a3bd36

This example showcases how to render a pretrained 3DGS model using gsplat. The scene is cropped to keep the download small. It is essentially the simple viewer example from gsplat, which we include here to be self-contained.

# Only needed the first time.
bash examples/assets/download_gsplat_ckpt.sh
CUDA_VISIBLE_DEVICES=0 python examples/03_gsplat_rendering.py \
    --ckpt examples/assets/ckpt_6999_crop.pt
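
As a rough, hedged sketch of what the render function in such a script does, the snippet below feeds placeholder Gaussians through gsplat's rasterization API; checkpoint loading, spherical harmonics, and background handling are left out, and the tensor names are illustrative rather than taken from the example. It plugs into nerfview.Viewer as in the Usage section.

import numpy as np
import torch
from gsplat import rasterization

import nerfview

device = "cuda"

# Placeholder Gaussians; the real example loads these from the checkpoint.
n = 10_000
means = torch.randn(n, 3, device=device)
quats = torch.nn.functional.normalize(torch.randn(n, 4, device=device), dim=-1)
scales = torch.rand(n, 3, device=device) * 0.02
opacities = torch.rand(n, device=device)
colors = torch.rand(n, 3, device=device)


def render_fn(
    camera_state: nerfview.CameraState, render_tab_state: nerfview.RenderTabState
) -> np.ndarray:
    width = render_tab_state.viewer_width
    height = render_tab_state.viewer_height
    c2w = torch.as_tensor(camera_state.c2w, dtype=torch.float32, device=device)
    K = torch.as_tensor(
        camera_state.get_K([width, height]), dtype=torch.float32, device=device)
    viewmat = torch.linalg.inv(c2w)  # gsplat expects world-to-camera matrices
    render_colors, _, _ = rasterization(
        means, quats, scales, opacities, colors,
        viewmats=viewmat[None], Ks=K[None], width=width, height=height)
    img = (render_colors[0, ..., :3].clamp(0.0, 1.0) * 255).to(torch.uint8)
    return img.cpu().numpy()
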
Rendering a 3DGS training process.

https://github.com/hangg7/nerfview/assets/10098306/640d4067-e410-49aa-86b8-325140dd73a8

This example showcases how to render while training 3DGS on the mip-NeRF 360 garden scene using gsplat. It is essentially the simple trainer example from gsplat, which we include here to be self-contained.

# Only needed the first time.
bash examples/assets/download_colmap_garden.sh
CUDA_VISIBLE_DEVICES=0 python examples/04_gsplat_training.py \
    --data_dir examples/assets/colmap_garden/ \
    --data_factor 8 \
    --result_dir results/garden/

Acknowledgement

This project could not exist without the great work of nerfstudio and viser. We rely on nvdiffrast for the mesh example and gsplat for the 3DGS examples. We thank the authors for their great work and open-source spirit.