
gltf-sample-viewer's Introduction

glTF Sample Viewer Web App

This is the official Khronos glTF 2.0 Sample Viewer using WebGL.

Viewer

Link to the live glTF 2.0 Sample Viewer.

Usage

Controls

click + drag : Rotate model

scroll : Zoom camera

GUI : Use to change models and settings

Change glTF model

  • Choose one of the glTF models in the selection list
  • Drag and drop glTF files into viewer

Change the environment map

  • Drag and drop a .hdr panorama file

Setup

For local usage and debugging, please follow these instructions:

  1. Make sure Git LFS is installed.

  2. Check out the main branch

  3. Pull the required glTF-Sample-Renderer submodule: git submodule update --init --recursive

  4. Build the web app

When making changes, the project is automatically rebuilt and the ./dist directory is populated with the web app. This directory contains all files necessary for deployment to a webserver.

Debugging

  • Requirements: install the Debugger for Chrome or Debugger for Firefox extension for Visual Studio Code.
  • Open the project directory in Visual Studio Code and select Debug->Add Configuration->Chrome or Debug->Add Configuration->Firefox so that the .vscode/launch.json file is created.
  • Append /dist to ${workspaceFolder} in the launch.json file.
  • Debug->Start Debugging should now launch a Chrome or Firefox window with the sample viewer, and VS Code breakpoints should be hit.

Known Issues

npm install gives the following warnings:

npm WARN deprecated [email protected]: This module is not supported, and leaks memory. Do not use it. Check out lru-cache if you want a good and tested way to coalesce async requests by a key value, which is much more comprehensive and powerful.
npm WARN deprecated [email protected]: See https://github.com/lydell/source-map-resolve#deprecated
npm WARN deprecated [email protected]: Glob versions prior to v9 are no longer supported

These warnings come from the rollup plugins copy and sourcemaps; both are dev dependencies and are not used in the final distribution.
sourcemaps is used to make debugging of @khronosgroup/gltf-viewer possible.
copy is used to copy all required files to dist.


gltf-sample-viewer's Issues

Strange dots face the camera at all times.

There's a visual artifact in this repo that I haven't seen in BabylonJS or ThreeJS implementations. When the normal vector of a surface points directly at the camera, a small white dot appears. Is this intentional? Is there a light source attached to the camera?

For example, here's a screenshot of BoomBox.gltf. The green arrow points to an expected specular highlight, where a known light source creates a hot spot on the smooth surface of the device. But what does the red arrow point to? It's a smaller hotspot that follows the camera around.

[screenshot: mysterydotonboombox]

Here's the same effect, on the MetalRoughSpheres model. Note that even the "roughest" spheres create tight hot spots facing the camera at all times. These rough spheres are supposed to be mostly uniform in color, certainly with no tight specular highlights on them.

[screenshot: mysterydotonspheres]

Some models are not displayed in Chrome

I tried the PBR model sample with several browsers.
Windows 10 + Intel HD Graphics 520 + Chrome 58
Windows 10 + Intel HD Graphics 520 + Firefox 53
Windows 10 + Intel HD Graphics 520 + Edge 14

Models              Chrome 58       Firefox 53      Edge 14
MetalRoughSpheres   displayed       not displayed   not displayed
AppleTree           not displayed   not displayed   not displayed
Avocado             displayed       not displayed   not displayed
BarramundiFish      displayed       not displayed   not displayed
BoomBox             displayed       not displayed   not displayed
Corset              displayed       not displayed   not displayed
FarmLandDiorama     not displayed   not displayed   not displayed
NormalTangentTest   displayed       not displayed   not displayed
Telephone           displayed       not displayed   not displayed
Triangle            not displayed   not displayed   not displayed
WaterBottle         displayed       not displayed   not displayed

Most models are displayed in Chrome, but AppleTree, FarmLandDiorama, and Triangle are not.
Firefox and Edge cannot display any of the models because the EXT_sRGB WebGL extension is not supported.

Typo on main page

I can't seem to clone this repo and make a formal pull request; there's something funky with my machine. But this is a simple fix.

in README.md:
Change "Frensel" to "Fresnel".
Change "youself" to "yourself".

I'd also do "we are able to pick out" to "we can pick out", but that's optional.

BRDF LUT uses the sRGB format, by design?

The textures seem to be loaded using the following formats:

textures/brdfLUT.png  ->sRGB
models/DamagedHelmet/glTF/Default_emissive.jpg  ->sRGB
models/DamagedHelmet/glTF/Default_normal.jpg  ->RGBA
models/DamagedHelmet/glTF/Default_AO.jpg  ->RGBA
models/DamagedHelmet/glTF/Default_albedo.jpg  ->sRGB
models/DamagedHelmet/glTF/Default_metalRoughness.jpg  ->RGBA

The BRDF lookup table seems to be using the sRGB color space. Is this by design?

The scaling factors do not seem to be taken into account

The scaling factors (such as the occlusion strength) do not seem to be taken into account. At least, for example, the ao (which probably is the occlusion) seems to be read from the texture in https://github.com/KhronosGroup/glTF-WebGL-PBR/blob/master/shaders/pbr-frag.glsl#L259 and multiplied with the color, unmodified.

(I did not yet dive deeply into the details of the code, for various reasons, but from what I have seen so far, these factors are not supported. Maybe I overlooked them...?)
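
For reference, the glTF specification applies the occlusionTexture strength factor roughly as in the sketch below; u_OcclusionSampler and u_OcclusionStrength are hypothetical names, not necessarily the uniforms this shader uses:

    // Sketch: apply the occlusionTexture "strength" scaling factor from the glTF material.
    // u_OcclusionSampler / u_OcclusionStrength are hypothetical uniform names.
    float ao = texture2D(u_OcclusionSampler, v_UV).r;      // occlusion lives in the red channel
    color = mix(color, color * ao, u_OcclusionStrength);   // strength 0 = no occlusion, 1 = full effect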

Wrong normal computation in vertex shader?

In the vertex shader, the normal is multiplied with the model matrix:

vec3 normalW = normalize(vec3(u_ModelMatrix * vec4(a_Normal.xyz, 0.0)));

However, someone once explained to me that normal vectors should be transformed using the inverse transpose of that matrix (well, only if the matrix has non-uniform scaling). See for example this GDC lecture, or this page.

Now, since glTF does not allow skewing, the inverse transpose matrix might be faster to compute.

Or, instead of using normal vectors, one could use bivectors, these don't have the problems of normal vectors, but that is off topic and experimental I guess :)
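
A minimal sketch of the suggested fix in the vertex shader, assuming a hypothetical u_NormalMatrix uniform that the application computes as transpose(inverse(u_ModelMatrix)) on the CPU (GLSL ES 1.00 has no inverse() built-in):

    // Sketch: transform the normal with the inverse transpose of the model matrix so
    // non-uniform scaling does not distort it. u_NormalMatrix is a hypothetical uniform
    // computed by the application as transpose(inverse(u_ModelMatrix)).
    uniform mat4 u_NormalMatrix;

    vec3 normalW = normalize(vec3(u_NormalMatrix * vec4(a_Normal.xyz, 0.0)));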

How to properly load and render .gltf file exported with glTF-Blender-Exporter?

Hi, I tried exporting a simple model with Blender using the glTF-Blender-Exporter add-on. I exported it to the .gltf format (I also tried exporting it to .glb, which gets rendered correctly when loaded in https://gltf-viewer.donmccurdy.com for example). However, when I try to load the OfficeChair/glTF/OfficeChair.gltf model within main.js, it doesn't get rendered. I think the model is being parsed, because I don't get any error loading it, but it doesn't show up in the viewport. Is this the correct way to load a .gltf file, or maybe the format is not correct? I attached the exported file in case you can try to reproduce it.

OfficeChair.zip

Thanks a lot for the great work!

Wrong normal computation in fragment shader?

Hi,

It looks like the normals are not correctly calculated. So far I've only checked out the damaged helmet model, I've included some images below to showcase the issue.

On the left side the triangle outline is 'pushed in' to the mesh:
[screenshots]

On the right side the triangle outline is 'pushed out' of the mesh:
[screenshots]

Thanks!

NPoT textures

The implementation should rescale non-power-of-two textures before uploading them to the GPU.
The BoxTextured sample model contains a 211x211 texture.

Rough metal looks too dark, may be losing energy

I've often thought that the full-roughness, full-metal spheres in the MetalRoughSpheres demo look much darker than the other 3 extremes. At a SIGGRAPH 2017 course on Physically Based Shading, the last presenters in the group were Christopher Kulla and Alejandro Conty, presenting Revisiting Physically Based Shading at Imageworks [slides] [supplemental].

Pretty early in that slide deck, the authors dig into "Microfacet Energy Compensation" and talk about the problem of rough surfaces losing energy and ways to compensate.

Perhaps we can add some form of energy compensation to this reference shader?
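
As a hedged sketch only (this is one published approximation, not a change proposed in the issue): a Fdez-Agüera-style multiple-scattering compensation reuses the existing split-sum BRDF LUT, so it could be bolted on roughly as below. The names u_brdfLUT, NdotV, roughness, and specularColor follow snippets quoted elsewhere on this page; specContrib is a hypothetical name for the specular contribution being scaled.

    // Sketch of a multiple-scattering energy compensation term (in the spirit of
    // Kulla/Conty 2017, using the split-sum-LUT approximation from Fdez-Aguera 2019).
    vec2 dfg = texture2D(u_brdfLUT, vec2(NdotV, 1.0 - roughness)).rg;
    float Ess = dfg.x + dfg.y;                                         // single-scattering directional albedo at F0 = 1
    vec3 energyCompensation = 1.0 + specularColor * (1.0 / Ess - 1.0);
    specContrib *= energyCompensation;                                 // returns energy lost to multiple bounces on rough metals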

How is the environment map in your project created?

I may need to create similar ones with some new images. I notice that the IBL baker can only export the dds format with HDR/MDR images. How do you convert the dds file into individual cube map images and convert the HDR/MDR data to jpg format?

Thank you.

Metallic and Roughness parameters instantly deleted

When I run the live demo from the main repository, the metallic and roughness parameters are always 0. I think this issue may have come from the most recent pull request. By moving the initialization of the u_MetallicRoughnessValues uniform up, the values are deleted right after by the following checks.

Normal map is slightly incorrectly scaled?

Currently, when converting a pixel in the normal map to a tangent-space-normal, the following formula is used:

    vec3 n = texture2D(u_NormalSampler, v_UV).rgb;
    n = normalize(tbn * ((2.0 * n - 1.0) * vec3(u_NormalScale, u_NormalScale, 1.0)));

A unit tangent-space-normal (0,0,1) maps to the RGB value (128,128,255) and vice versa.

But instead (128,128,255) now maps to (128/255*2-1, 128/255*2-1, 255/255*2-1) = (0.004, 0.004, 1).

This slight deviation could be noticeable when the reflection vector is used to look up a pixel in the environment maps at the edges of mirrored triangles (where the tangent.w sign changes).

So the advice would instead be to multiply by 255.0/128.0 instead of 2.0. See also slide #14 of Crytek's presentation.
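
A sketch of that suggestion applied to the snippet above, so that the encoded byte value 128 decodes exactly to 0.0:

    // Sketch: decode with 255.0/128.0 instead of 2.0, so 128 maps exactly to 0.0
    // (255 then maps to ~0.992, which the normalize() call absorbs).
    vec3 n = texture2D(u_NormalSampler, v_UV).rgb;
    n = normalize(tbn * ((255.0/128.0 * n - 1.0) * vec3(u_NormalScale, u_NormalScale, 1.0)));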

Note that in most examples JPEG is used to store the normal maps. Due to compression, (128,128,255) is never exactly preserved, so the above theory doesn't fix lossy normal maps...

I guess the advice would be, when using JPEGs for normal maps, don't use mirrored geometry to avoid seams at the edges?

Add better selection method for different lighting models

The fragment shader has multiple implementations of different parts of the lighting equation.

If someone wants to switch between different implementations, the only way is to edit the fragment shader and reload the page.

It would be great if there was a mechanism for choosing different implementations at runtime, or even editing the math within an implementation and seeing updates immediately.
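
As a hedged illustration (not something proposed in the issue), one lightweight step in that direction is compile-time selection via preprocessor defines that the application injects before compiling the shader, recompiling when a GUI option changes. LIGHTING_DIFFUSE_DISNEY is a hypothetical define; disneyDiffuse() is a function quoted further down this page:

    // Sketch: choose a diffuse implementation at shader-compile time via an injected #define.
    #ifdef LIGHTING_DIFFUSE_DISNEY
        vec3 diffuseContrib = disneyDiffuse(pbrInputs);
    #else
        vec3 diffuseContrib = pbrInputs.baseColor / M_PI;   // plain Lambert term
    #endif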

Implementation help

Not sure where to ask questions like this... I think it'd get shut down on StackOverflow for being off-topic, and it's a bit too niche to ask generally.

Basically, I'm trying to duplicate the example here in a proprietary but thin API... I've been over and over the settings and I can't see a difference yet in any of the logs, unless maybe it's in how I set up the mipmaps or something (I haven't done that manually before).

I thought - maybe someone with a trained eye can see the comparison pictures and can already know where I went wrong.

Here's my image with front and back:

[screenshots: front and back]

As you can see when comparing to the reference page, there's some missing specular highlight, too much reflection, and other small differences. The geometry seems fine.

Any pointers are greatly appreciated!

Consider lowering the default for c_MinRoughness

This popped up when @emackey and I were reviewing #18. As it stands, with a min roughness of 0.04, the bottom 10 pixels of the brdf_LUT are ignored.

From the shader, we lookup into the brdf_LUT using the following:
vec3 brdf = texture2D(u_brdfLUT, vec2(NdotV, 1.0 - roughness)).rgb;

With a min roughness of 0.04, the largest y value we can get is 0.96 (1.0 - 0.04). Note that in webgl the y value is inverted, meaning that a y of 1.0 will index into the bottom of the image. Since the brdf_LUT is 256x256, we're unable to access the bottom 10 pixels of the image.

TypeError: gl.hasSRGBExt is null

This appears in the live demo, on Firefox 52.1.2, Windows (32-bit), GeForce 8800 GT.

Most likely an issue of outdated drivers or so (this is not my "real" PC), but maybe it can or should somehow be handled, I'm not sure how common this extension is.

Stack trace:

TypeError: gl.hasSRGBExt is null[Learn More]  scene.js:230:92
    initTextures http://github.khronos.org/glTF-WebGL-PBR/scene.js:230:92
    Mesh http://github.khronos.org/glTF-WebGL-PBR/scene.js:49:30
    Scene http://github.khronos.org/glTF-WebGL-PBR/scene.js:323:30
    updateModel/<.success http://github.khronos.org/glTF-WebGL-PBR/main.js:86:25
    jQuery.Callbacks/fire http://github.khronos.org/glTF-WebGL-PBR/jquery-3.1.1.js:3305:11
    jQuery.Callbacks/self.fireWith http://github.khronos.org/glTF-WebGL-PBR/jquery-3.1.1.js:3435:7

Why gamma correction is not used in fragment shader?

I'm sorry, this might be off topic. I guess all shaders that render to the screen should use gamma correction, since most designers author textures in the sRGB color space.
However, the shader used in this repository is not using gamma correction like gl_FragColor = pow(color, vec3(1.0/2.2)).

Is that correct?
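
For context, the usual sRGB round-trip (with the common 2.2 approximation) looks roughly like the sketch below; u_BaseColorSampler is a hypothetical sampler name, not necessarily what this shader uses:

    // Sketch: decode sRGB-authored textures to linear space before lighting,
    // then re-encode the final linear color for display.
    vec3 albedo = pow(texture2D(u_BaseColorSampler, v_UV).rgb, vec3(2.2)); // sRGB -> linear (approx.)
    // ... compute lighting in linear space into `color` ...
    gl_FragColor = vec4(pow(color, vec3(1.0 / 2.2)), 1.0);                 // linear -> sRGB (approx.)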

Add support for KHR_materials_pbrSpecularGlossiness

The concept of PBR's spec/gloss is already here, and the conversion to it from metal/rough is hard-coded:

    vec3 f0 = vec3(0.04);
    vec3 diffuseColor = baseColor.rgb * (vec3(1.0) - f0);
    diffuseColor *= 1.0 - metallic;
    vec3 specularColor = mix(f0, baseColor.rgb, metallic);

To support the KHR_materials_pbrSpecularGlossiness, we just need a conditional around this, and the ability to parse that extension, loading its maps directly here without this transformation.
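
As a hedged sketch (the define, sampler, and factor names below are hypothetical, not the shader's actual API), the conditional could look roughly like this:

    #ifdef USE_SPEC_GLOSS   // hypothetical define, set when KHR_materials_pbrSpecularGlossiness is present
        vec4 diffuse       = texture2D(u_DiffuseSampler, v_UV) * u_DiffuseFactor;
        vec4 sg            = texture2D(u_SpecularGlossinessSampler, v_UV);
        vec3 diffuseColor  = diffuse.rgb;                        // used directly, no derivation needed
        vec3 specularColor = sg.rgb * u_SpecularFactor;          // specular color straight from the map
        float roughness    = 1.0 - sg.a * u_GlossinessFactor;    // perceptual roughness = 1 - glossiness
    #else
        vec3 f0 = vec3(0.04);
        vec3 diffuseColor = baseColor.rgb * (vec3(1.0) - f0);
        diffuseColor *= 1.0 - metallic;
        vec3 specularColor = mix(f0, baseColor.rgb, metallic);
    #endif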

/cc CesiumGS/gltf-pipeline#331

Damaged Helmet normalmap issue

Load the Damaged Helmet model and rotate the light source around past the front of the helmet. The front visor on the helmet has a sharp edge in the normal map, where we lose the specular highlight for a bit as the light source rotates.

[screenshot: helmet]

Add comments and references

I don't want to offend anyone, and I admit that I may have a particularly picky and controversial attitude towards some coding practices. So I don't want to argue about the "style" of something like this....

// diffuse
vec3 disneyDiffuse(PBRInfo pbrInputs)
{
  float f90 = 2.*pbrInputs.LdotH*pbrInputs.LdotH*pbrInputs.roughness - 0.5;

  return (pbrInputs.baseColor/M_PI)*(1.0+f90*pow((1.0-pbrInputs.NdotL),5.0))*(1.0+f90*pow((1.0-pbrInputs.NdotV),5.0));
}
..

// G
float microfacetCookTorrance(PBRInfo pbrInputs)
{
  return min(min(2.*pbrInputs.NdotV*pbrInputs.NdotH/pbrInputs.VdotH, 2.*pbrInputs.NdotL*pbrInputs.NdotH/pbrInputs.VdotH),1.0);
}

But I, personally, think that for a "reference implementation", this should in any case at least include comments about what this code is doing, what its purpose is, and last but not least, information about where this code was copied from. (I'm pretty sure that this was not implemented from scratch based on a given set of formulas, but even then, information about the publications that contain the formulas should not be omitted)
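
For illustration only (not part of the issue), the kind of annotation being asked for might look like the comment headers below; the cited sources are the usual references for these terms and would need to be checked against what the code actually implements:

    // Diffuse term in the style of Burley's "Disney diffuse"
    // (B. Burley, "Physically Based Shading at Disney", SIGGRAPH 2012 course notes).
    // f90 models the roughness-driven retro-reflection response at grazing angles.
    vec3 disneyDiffuse(PBRInfo pbrInputs) { /* ... */ }

    // Geometric attenuation (shadowing/masking) term of the Cook-Torrance microfacet model
    // (R. Cook and K. Torrance, "A Reflectance Model for Computer Graphics", 1982).
    float microfacetCookTorrance(PBRInfo pbrInputs) { /* ... */ }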

Bug in PBR shader - light direction inverted

If you run the live demo, and you put the lightRotation to 0 and the lightPitch to 90, the calculated light direction is set to [0, 1, 0] because it is calculated as [Math.sin(rot) * Math.cos(pitch), Math.sin(pitch), Math.cos(rot) * Math.cos(pitch)].

This means a light shining upwards. However, if you look at the render, the light is actually shining downwards. This is because, in the calculation of the specular reflection, you forgot to invert the light direction (line 252 in pbr-frag.glsl):

    vec3 n = getNormal();                             // normal at surface point
    vec3 v = normalize(u_Camera - v_Position);        // Vector from surface point to camera
    vec3 l = normalize(u_LightDirection);             // Vector from surface point to light
    vec3 h = normalize(l+v);                          // Half vector between both l and v
    vec3 reflection = -normalize(reflect(v, n));

    float NdotL = clamp(dot(n, l), 0.001, 1.0);
    float NdotV = abs(dot(n, v)) + 0.001;
    float NdotH = clamp(dot(n, h), 0.0, 1.0);
    float LdotH = clamp(dot(l, h), 0.0, 1.0);
    float VdotH = clamp(dot(v, h), 0.0, 1.0);

The actual line should be:

vec3 l = normalize(-u_LightDirection); // Vector from surface point to light

Edges of brdfLUT.png are discontinuous

Looking at the brdfLUT.png map (the original one, not the modified one in #18), @abwood and I experimented with contrast-enhancing the image to look for irregularities. All four borders of the image have 1-pixel-thick discontinuous irregularities. In a few cases, such as the lower-right and lower-left corners, these are visible to a trained naked eye. But with contrast adjustments, a 1-pixel border around the entire image becomes visible.

I will try to post some screenshots of these findings later today or tomorrow. In the meantime, I'm curious how this image was created, if it can be regenerated, or if the repo here could actually generate this image with a pre-render pass at runtime?

/cc @moneimne

glTF features beyond PBR

Are there any plans or intentions to extend the support of other glTF features?

This mainly refers to skinning, which will influence the shaders - although fortunately, it should be possible to clearly separate this, meaning that it should only influence the vertex shader and not the fragment shader. (The fact that the vertex shader will depend on the number of joints would make it a bit tricky, though...)

(To a lesser extent, it could refer to other features, like animations or morph targets etc.)

If there are no such plans, this issue can probably be closed immediately, but maybe the goals and scope of the implementation (regarding these features) could then be pointed out in the README.

Reverse back-face normals for double sided lighting

The spec says about double sided lighting:

The doubleSided property specifies whether the material is double sided. When this value is false, back-face culling is enabled. When this value is true, back-face culling is disabled and double sided lighting is enabled. The back-face must have its normals reversed before the lighting equation is evaluated.

I believe the last part is missing and might be solved by something like this in the fragment shader:

v_Normal *= (2.0 * float(gl_FrontFacing) - 1.0);

(avoiding a branch because it wouldn't be cheap in this case I think).

Cleanup / simplify reference shaders

Does anyone have the bandwidth to clean up some of the core PBR components - and perhaps add some documentation to navigate the code - to help make this repo more friendly for devs?

Some feedback from SIGGRAPH:

It's a good sample, but it's very dense and offers little guidance for those who aren't well versed in shader lighting parlance.

Lots of #ifdefs make for a confusing read and function names like “SmithG1_var2” are completely opaque to the average dev.

For developers who don't want Image Based Lighting, it's hard to separate out just the analytical lighting bits.
