animatedrng / nerf-jax

A JAX rewrite of the NeRF reconstruction technique

License: Apache License 2.0
At the moment, we sow a few intermediate values and gradients of the integrate function using Oryx. It would be nice to sow all of these values, but only if a certain flag is provided. We could then reap all of these values and visualize them in TensorBoard.
Currently it blows up and the appearance gradients look totally wrong. Implementing #3 will help here!
We can actually do the finite difference grad checks inline with chex too. More importantly, we can check for NaNs.
These functions currently return the RNG key and have an awkward, unintuitive interface (e.g. the arguments passed into the map callback). We can probably refactor them to have a cleaner, more JAX-idiomatic interface.
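One JAX-idiomatic shape for this refactor is to split keys at the call site so samplers take a key and return only their result, instead of threading the key back out. A self-contained sketch using NumPy's `SeedSequence.spawn` as a stand-in for `jax.random.split` (the function names here are hypothetical):

```python
import numpy as np

def sample_offsets(seed_seq, n):
    # Takes a key-like object and returns only the sampled values;
    # it does not hand a key back to the caller.
    rng = np.random.default_rng(seed_seq)
    return rng.uniform(-0.5, 0.5, size=n)

root = np.random.SeedSequence(0)
# Split once at the call site instead of returning the key from
# every callback alongside its value.
per_ray_keys = root.spawn(3)
offsets = [sample_offsets(k, 4) for k in per_ray_keys]
```

In JAX proper, `root.spawn(3)` would become `jax.random.split(key, 3)` and each sampler would consume its subkey exactly once.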
There are a lot of sub-experiments that aren't well demarcated.
It's a bit unusual, and I'm not totally sure whether this is the purview of the geometry network or the appearance network. Probably the geometry network, since it isn't parameterized by the view direction. But this is one way to ensure that we can represent intrinsically fuzzy details (e.g. hair) without sacrificing surface quality on the non-fuzzy surfaces. It's unclear exactly how the geometry network represents this value, and how it interacts with the global coarse-to-fine phi schedule. Maybe it just scales the standard deviation on phi accordingly?
See geometric init
Use weight_norm?
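Both notes refer to standard tricks from the SDF literature; a rough numpy sketch, under the assumption that geometric init means initializing the final SDF layer so the untrained network approximates a sphere of radius r, and that weight_norm is the usual g * v / ||v|| reparameterization (all names here are illustrative):

```python
import numpy as np

def geometric_init_last_layer(fan_in, radius=1.0, rng=None):
    # Initialize the final SDF layer so the untrained network roughly
    # outputs ||x|| - radius (a sphere): weights near sqrt(pi)/sqrt(fan_in)
    # with tiny noise, bias at -radius.
    rng = rng or np.random.default_rng(0)
    mean = np.sqrt(np.pi) / np.sqrt(fan_in)
    w = rng.normal(mean, 1e-4, size=(fan_in, 1))
    b = np.array([-radius])
    return w, b

def weight_norm(v, g):
    # Reparameterize weights as g * v / ||v|| (norm taken per output
    # unit), decoupling the direction of each weight vector from its
    # magnitude g.
    norm = np.linalg.norm(v, axis=0, keepdims=True)
    return g * v / norm

w, b = geometric_init_last_layer(256)
```

Geometric init gives the SDF a sane zero level set at step 0; weight_norm makes the effective learning rate on each unit's scale independent of its direction.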
Currently we use a stratified sampler to pick isosurfaces, with a constant step size between each isosurface sample (i.e. we sample the -2, -1, 0, +1, +2 isosurfaces with a bit of jitter). We also have a standard deviation on phi.
These values currently don't change over the course of optimization. We probably should reduce them, and possibly even reduce the number of samples that we perform.
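A minimal numpy sketch of this sampler plus a decaying schedule (the function names and the decay rate are hypothetical): each sample targets one isosurface level at spacing `step`, jittered uniformly within its stratum, and the spacing could shrink over the course of optimization.

```python
import numpy as np

def stratified_isosurface_offsets(rng, step=1.0, levels=(-2, -1, 0, 1, 2)):
    # One jittered sample per stratum, centered on each target isosurface
    # (the -2, -1, 0, +1, +2 levels), with uniform jitter of +/- step/2
    # so adjacent strata never overlap.
    centers = np.asarray(levels, dtype=float) * step
    jitter = rng.uniform(-0.5 * step, 0.5 * step, size=len(centers))
    return centers + jitter

def annealed_step(t, step0=1.0, decay=0.999):
    # Hypothetical schedule: shrink the spacing (and, analogously, the
    # standard deviation) as optimization proceeds, rather than keeping
    # them fixed.
    return step0 * decay ** t

rng = np.random.default_rng(0)
offsets = stratified_isosurface_offsets(rng, step=annealed_step(0))
```

Reducing the number of samples over time could be handled the same way, by making `levels` a function of the step count.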
At the moment, we don't have any finite-difference tests (like torch.autograd.gradcheck in PyTorch). For cases in which the rendering integrand is actually smooth (i.e. all composited isosurfaces intersect), these tests will probably do a good job. The first ideal candidate would be the integrate function, since we can manually specify valid isosurfaces. It's also possible that these tests will help alert us to the presence of NaNs during the backward pass. See this page in the Autodiff Cookbook for details.
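In JAX the ready-made helper is `jax.test_util.check_grads`, and chex can assert finiteness; a dependency-free sketch of the underlying idea, comparing an analytic gradient against central differences and flagging NaNs (the toy integrand here stands in for integrate on valid isosurfaces):

```python
import numpy as np

def fd_gradcheck(f, grad_f, x, eps=1e-5, tol=1e-4):
    # Compare an analytic gradient against central finite differences,
    # and fail loudly on NaN/inf in either, mimicking what
    # torch.autograd.gradcheck does for a scalar-valued f.
    g = grad_f(x)
    fd = np.array([
        (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
        for e in np.eye(x.size)
    ])
    assert np.all(np.isfinite(g)) and np.all(np.isfinite(fd)), "NaN/inf in gradients"
    return np.max(np.abs(g - fd)) < tol

# Smooth toy integrand standing in for `integrate`; its analytic
# gradient is d/dx (x sin x) = sin x + x cos x.
f = lambda x: np.sum(np.sin(x) * x)
grad_f = lambda x: np.sin(x) + x * np.cos(x)
x = np.array([0.3, 1.2, -0.7])
ok = fd_gradcheck(f, grad_f, x)
```

Because the check only holds where the integrand is smooth, the test fixture should construct isosurface configurations that all intersect, as noted above.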
I think we can probably adopt a more NeRF-like, multi-headed geometry/appearance network. We should look carefully at the NeRF network to ensure an apples-to-apples comparison.
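A rough numpy sketch of the NeRF-style split (layer sizes and names are illustrative, not the paper's exact architecture): a shared trunk feeds a geometry head that emits a view-independent density plus a feature vector, and an appearance head that consumes that feature together with the view direction.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
D, F = 3, 32  # input dim and trunk width (illustrative sizes)
params = {
    "trunk":    rng.normal(0, 0.1, (D, F)),
    "geo_head": rng.normal(0, 0.1, (F, 1 + F)),   # density + feature vector
    "app_head": rng.normal(0, 0.1, (F + D, 3)),   # rgb from feature + view dir
}

def forward(params, x, view_dir):
    h = relu(x @ params["trunk"])
    geo = h @ params["geo_head"]
    density, feat = geo[..., :1], geo[..., 1:]
    # Only the appearance head sees the view direction, so density
    # stays view-independent, as in NeRF.
    rgb = np.concatenate([feat, view_dir], axis=-1) @ params["app_head"]
    return density, rgb

density, rgb = forward(params, np.ones((4, D)), np.ones((4, D)))
```

Keeping the view direction out of the geometry path is the structural property worth matching for an apples-to-apples comparison with NeRF.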