
gltf's Introduction

gltf

crates.io docs.rs


This crate is intended to load glTF 2.0, a file format designed for the efficient transmission of 3D assets.

rustc version 1.61 or above is required.

Reference infographic

infographic

From javagl/gltfOverview

PDF version

Usage

See the crate documentation for example usage.

Features

Extras and names

By default, gltf ignores all extras and names included with glTF assets. You can opt into them by enabling the extras and names features, respectively.

[dependencies.gltf]
version = "1.2"
features = ["extras", "names"]

glTF extensions

The following glTF extensions are supported by the crate:

  • KHR_lights_punctual
  • KHR_materials_pbrSpecularGlossiness
  • KHR_materials_unlit
  • KHR_texture_transform
  • KHR_materials_variants
  • KHR_materials_volume
  • KHR_materials_specular
  • KHR_materials_transmission
  • KHR_materials_ior
  • KHR_materials_emissive_strength

To use an extension, list its name in the features section.

[dependencies.gltf]
version = "1.2"
features = ["KHR_materials_unlit"]

Examples

gltf-display

Demonstrates how the glTF JSON is deserialized.

cargo run --example gltf-display path/to/asset.gltf

gltf-export

Demonstrates how glTF JSON can be built and exported using the gltf-json crate.

cargo run --example gltf-export

gltf-roundtrip

Deserializes and serializes the JSON part of a glTF asset.

cargo run --example gltf-roundtrip path/to/asset.gltf

gltf-tree

Visualises the scene hierarchy of a glTF asset, which is a strict tree of nodes.

cargo run --example gltf-tree path/to/asset.gltf

Tests

Running tests locally requires cloning the glTF-Sample-Models repository first.

git clone https://github.com/KhronosGroup/glTF-Sample-Models.git


gltf's Issues

Finalise Validate trait

The Validate trait is used for checking the correctness of glTF JSON. The wrapper library assumes validation has succeeded as an invariant. This ensures classic bugs such as out-of-range indices are checked upfront on behalf of the user. The final design of the Validate trait needs to be agreed upon prior to the 1.0 release.

Final features:

  • Index range checks for all top-level glTF objects.

  • Index range checks for animation samplers.

  • Semantic name correctness.

  • Restrictions on parameter values.

  • Ability to opt-out of validation.

  • Ability to ignore 'less severe' validation errors - implemented with minimal / complete flavours.

  • Ability to terminate validation early - more important (see https://github.com/alteous/gltf/pull/42#issuecomment-312511952).
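
As a sketch of the index-bounds portion of this design: the following self-contained Rust shows how a `Validate` trait could propagate through containers and report out-of-range indices. The `Root`, `Index`, and `Error` types here are simplified stand-ins for illustration, not the crate's actual definitions.

```rust
// Hypothetical sketch of index-bounds validation; `Root`, `Index`, and
// `Error` are illustrative stand-ins, not the crate's real types.
struct Index(usize);

struct Root {
    accessors: Vec<String>, // stand-in for real accessor objects
}

#[derive(Debug, PartialEq)]
enum Error {
    IndexOutOfBounds,
}

trait Validate {
    fn validate(&self, root: &Root, errors: &mut Vec<Error>);
}

impl Validate for Index {
    fn validate(&self, root: &Root, errors: &mut Vec<Error>) {
        // An index is valid only if it points inside the accessor array.
        if self.0 >= root.accessors.len() {
            errors.push(Error::IndexOutOfBounds);
        }
    }
}

// Containers propagate validation to their contents.
impl<T: Validate> Validate for Option<T> {
    fn validate(&self, root: &Root, errors: &mut Vec<Error>) {
        if let Some(inner) = self {
            inner.validate(root, errors);
        }
    }
}

fn main() {
    let root = Root { accessors: vec!["a".into(), "b".into()] };
    let mut errors = Vec::new();
    Index(1).validate(&root, &mut errors); // in range: no error
    Index(5).validate(&root, &mut errors); // out of range: records an error
    println!("{:?}", errors);
}
```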

Reason for Source trait to use &mut self instead of &self?

https://github.com/alteous/gltf/blob/wrapper/src/import.rs#L79

Is there a reason to have &mut self instead of &self? Logically, we are not changing the Source when reading from it. If an implementer needed it to be mutable, they could use RefCell, Mutex, or similar inside their implementation. This would make user code that works with functions taking Source cleaner, since locks in non-concurrent Sources would be hidden inside the implementation instead of being propagated to user code.

Fix loading glTF 2.0 sample models

@bwasty quite rightly reported that tests/import_v2.rs is failing on the latest 2.0 sample models.

On a side note, it would be worth rewriting the test to read the directory entries.

Add Gltf::default_scene function

An easy task for a new contributor / Rust beginner.

fn default_scene(&self) -> Option<Scene> {
    // If `json.scene` is present, return the indexed scene, otherwise return `None`.
}

Lifetime of scene::Children bound to parent

Currently, one can't own the children of nodes due to incorrect lifetimes.

fn process(node: gltf::Node) {
    struct Item {
        node: gltf::Node,
    }

    let mut stack = vec![Item { node }];

    while let Some(item) = stack.pop() {
        for child in item.node.children() {
            // Error: lifetime of `child` bound to `item.node`.
            stack.push(Item { node: child });
        }
    }
}

Implement image decoding

This should happen during the import process. The only valid MIME types in core glTF are image/jpeg and image/png, however we should try to abstract over image decoding to account for new formats, user specific formats, and image extensions.
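
One way to abstract over decoders, assuming nothing about the crate's real API, is trait-object dispatch keyed on MIME type; `ImageDecoder`, `PngDecoder`, and `DecoderRegistry` below are hypothetical names for illustration:

```rust
// Hypothetical sketch: dispatch image decoding on MIME type so new formats
// (or user-specific ones) can be registered without changing core code.
trait ImageDecoder {
    fn supports(&self, mime_type: &str) -> bool;
    fn decode(&self, data: &[u8]) -> Result<Vec<u8>, String>;
}

struct PngDecoder;

impl ImageDecoder for PngDecoder {
    fn supports(&self, mime_type: &str) -> bool {
        mime_type == "image/png"
    }
    fn decode(&self, _data: &[u8]) -> Result<Vec<u8>, String> {
        // A real implementation would decode the PNG stream here.
        Ok(Vec::new())
    }
}

struct DecoderRegistry {
    decoders: Vec<Box<dyn ImageDecoder>>,
}

impl DecoderRegistry {
    fn decode(&self, mime_type: &str, data: &[u8]) -> Result<Vec<u8>, String> {
        // Find the first decoder claiming support for this MIME type.
        self.decoders
            .iter()
            .find(|d| d.supports(mime_type))
            .ok_or_else(|| format!("unsupported MIME type: {}", mime_type))?
            .decode(data)
    }
}

fn main() {
    let registry = DecoderRegistry { decoders: vec![Box::new(PngDecoder)] };
    assert!(registry.decode("image/png", &[]).is_ok());
    assert!(registry.decode("image/webp", &[]).is_err());
    println!("dispatch works");
}
```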

Fix Texture::sampler default case

Analogous to #48, sampler should not return an Option, but a Sampler with default values.
From the spec:

The index of the sampler used by this texture. When undefined, a sampler with repeat wrapping and auto filtering should be used.

Validate matrices to be decomposable to TRS

From the 2.0 draft spec:

Matrices must be decomposable to TRS. This implies that transformation matrices cannot skew or shear.

(Source)

I think it would be good to check the matrices on load. Perhaps this could be optional, so it can be skipped e.g. on release builds.
Relevant discussion on the spec repo: KhronosGroup/glTF#892
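
As a sketch of such a load-time check (illustrative names only): one necessary condition for a matrix to be decomposable to rotation * scale is that the columns of its linear 3x3 part are mutually orthogonal, because shear introduces non-zero dot products between columns.

```rust
// Sketch: check one necessary condition for TRS decomposability of a
// column-major 4x4 matrix (glTF uses column-major order): the three basis
// columns of the linear part must be mutually orthogonal (no skew/shear).
fn is_decomposable_to_trs(m: &[f32; 16]) -> bool {
    let cols = [
        [m[0], m[1], m[2]],
        [m[4], m[5], m[6]],
        [m[8], m[9], m[10]],
    ];
    let dot = |a: &[f32; 3], b: &[f32; 3]| a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    const EPS: f32 = 1e-5;
    dot(&cols[0], &cols[1]).abs() < EPS
        && dot(&cols[0], &cols[2]).abs() < EPS
        && dot(&cols[1], &cols[2]).abs() < EPS
}

fn main() {
    let identity: [f32; 16] = [
        1.0, 0.0, 0.0, 0.0,
        0.0, 1.0, 0.0, 0.0,
        0.0, 0.0, 1.0, 0.0,
        0.0, 0.0, 0.0, 1.0,
    ];
    println!("{}", is_decomposable_to_trs(&identity)); // true
}
```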

Make image loading opt-out

Buffer data is required to be loaded so that accessors can iterate over it. Image data, however, has no real reason to be preloaded other than convenience. Some rendering APIs, notably three-rs, only allow loading images from a path, so preloading this data is a complete waste of resources.

Should glTF 1.0 support be discontinued?

I am tempted to deprecate the v1 module and remove it in the planned 1.0.0 release. The contents of the v2 module would then be moved to the crate root.

Rationale

  • The v1 module feels like legacy code that I am not personally interested in maintaining.
  • I'm skeptical whether anybody is using the v1 module since it's far more restrictive than 2.0.
  • The crate could be better organised without having two separate modules.
  • More attention can be focused on glTF 2.0, and especially on the wrapper interface.

Until now there has been a lot of hard work to get the 1.0 version up to scratch. It would be a shame to abandon it entirely. Perhaps it could be migrated to another crate, say gltf-legacy?

Personally, I would like to see glTF 2.0 become a widely adopted 3D interchange format. I think having a solid idiomatic Rust library for glTF 2.0 will encourage game / graphics development in Rust and help support glTF 2.0 adoption, which is one of the main reasons I began working on gltf in the first place.

If you are actively using the v1 module, please speak out now!

gltf-utils

In the upcoming version (0.9) some of the existing functionality, notably accessor iteration, will be moved to a new crate, namely gltf-utils. This issue exists to discuss the design and implementation of this new crate.

Rationale

  • The main gltf crate may stabilise more quickly.
  • Accessor iteration et al. may remain unstable post gltf 1.0.
  • gltf-importer may stabilise with gltf.

Currently planned features

  • A Source trait for sourcing pre-existing buffer data. This must be implemented by the user.
  • A generic Iterator that visits the components of an Accessor.
  • Hence, iterators for visiting the positions, indices, colours, etc. of a Primitive.
  • Flat normal generation.
  • Tangent generation using the mikktspace algorithm.
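
The flat normal generation item could look roughly like this for non-indexed triangle data; this is a sketch, not the planned gltf-utils API:

```rust
// Sketch of flat normal generation: every vertex of a triangle receives the
// face normal, computed from the normalized cross product of two edges.
fn flat_normals(positions: &[[f32; 3]]) -> Vec<[f32; 3]> {
    let mut normals = Vec::with_capacity(positions.len());
    for tri in positions.chunks_exact(3) {
        let (a, b, c) = (tri[0], tri[1], tri[2]);
        let u = [b[0] - a[0], b[1] - a[1], b[2] - a[2]];
        let v = [c[0] - a[0], c[1] - a[1], c[2] - a[2]];
        let n = [
            u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0],
        ];
        let len = (n[0] * n[0] + n[1] * n[1] + n[2] * n[2]).sqrt();
        let n = [n[0] / len, n[1] / len, n[2] / len];
        // One copy of the face normal per triangle vertex.
        normals.extend(std::iter::repeat(n).take(3));
    }
    normals
}

fn main() {
    // Counter-clockwise triangle in the XY plane: normal points along +Z.
    let tri = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]];
    println!("{:?}", flat_normals(&tri)); // three copies of [0.0, 0.0, 1.0]
}
```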

Note

The gltf-json crate will continue to exist in order to reduce compile times. Its semver post gltf 1.0 will match that of gltf although the crate is not intended to stabilise. This should be mentioned in the documentation.

gltf v2

I have seen that you have started to support v2 in the incoming branch. Would you say that it is already in a usable state?

Improve ergonomics of Indices, TexCoords, Colors, etc.

This pattern arises when working with data with multiple representations:

let indices: Vec<u32> = match primitive.indices().unwrap() {
    Indices::U8(iter) => iter.map(|x| x as u32).collect(),
    Indices::U16(iter) => iter.map(|x| x as u32).collect(),
    Indices::U32(iter) => iter.collect(),
};

I would like to relieve the user of this burden with the help of some extra iterator adaptors. The following comes to mind:

let indices: Vec<u32> = primitive.indices().unwrap().map_u32().collect();
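
A sketch of how such a map_u32 adaptor could work, using a simplified stand-in for the Indices enum (slice iterators in place of the crate's accessor iterators):

```rust
// Sketch of the proposed `map_u32` adaptor: one wrapper iterator that widens
// every component type to u32, so the caller no longer needs to match.
enum Indices<'a> {
    U8(std::slice::Iter<'a, u8>),
    U16(std::slice::Iter<'a, u16>),
    U32(std::slice::Iter<'a, u32>),
}

struct MapU32<'a>(Indices<'a>);

impl<'a> Indices<'a> {
    fn map_u32(self) -> MapU32<'a> {
        MapU32(self)
    }
}

impl<'a> Iterator for MapU32<'a> {
    type Item = u32;
    fn next(&mut self) -> Option<u32> {
        // The match lives here, once, instead of in every caller.
        match &mut self.0 {
            Indices::U8(iter) => iter.next().map(|&x| x as u32),
            Indices::U16(iter) => iter.next().map(|&x| x as u32),
            Indices::U32(iter) => iter.next().copied(),
        }
    }
}

fn main() {
    let raw: [u16; 3] = [0, 1, 2];
    let indices: Vec<u32> = Indices::U16(raw.iter()).map_u32().collect();
    println!("{:?}", indices); // [0, 1, 2]
}
```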

Include image dimensions

Many consumers require width and height of images to be specified. This should be provided by the crate importer.

Make iterators ExactSizeIterator

I think that, considering glTF is a non-streaming format, it's fair to implement ExactSizeIterator for the iterators that yield glTF objects.

Fix Primitive::material default case

Another case where there's an Option, but the spec defines a default value:

If material is not specified, then a default material is used.

The default material, used when a mesh does not specify a material, is defined to be a material with no properties specified. All the default values of material apply. Note that this material does not emit light and will be black unless some lighting is present in the scene.

(src)

Fix PbrMetallicRoughness default case

Currently the Material type has Option<PbrMetallicRoughness> instead of PbrMetallicRoughness with default values. This is an oversight which needs to be fixed.

Doesn't build on MacOS

Build error:

Running `rustc --crate-name gltf src/lib.rs --crate-type lib -g -C metadata=b55367fa1a6bd86a -C extra-filename=-b55367fa1a6bd86a --out-dir /Users/user/Repos/daeTo3DTiles/gltf/target/debug/deps --emit=dep-info,link -L dependency=/Users/user/Repos/daeTo3DTiles/gltf/target/debug/deps --extern gl=/Users/user/Repos/daeTo3DTiles/gltf/target/debug/deps/libgl-24b429b1b60cf506.rlib --extern serde_json=/Users/user/Repos/daeTo3DTiles/gltf/target/debug/deps/libserde_json-3d35c676bbe42cf7.rlib --extern serde=/Users/user/Repos/daeTo3DTiles/gltf/target/debug/deps/libserde-256e72629d0f4e21.rlib --extern serde_derive=/Users/user/Repos/daeTo3DTiles/gltf/target/debug/deps/libserde_derive-64a60f35c65de494.dylib`
error: custom derive attribute panicked
  --> src/lib.in.rs:53:17
   |
53 | #[derive(Debug, Deserialize, Serialize)]
   |                 ^^^^^^^^^^^
   |
   = help: message: assertion failed: !p.is_null()

error: Could not compile `gltf`.

Generate normals/tangents/bitangents

I found these implementation notes in the 2.0 spec draft (under Meshes):

Implementation note: When normals are not specified, implementations should calculate flat normals.

Implementation note: When tangents are not specified, implementations should calculate tangents using default MikkTSpace algorithms. For best results, the mesh triangles should also be processed using default MikkTSpace algorithms.

Implementation note: When normals and tangents are specified, implementations should compute the bitangent by taking the cross product of the normal and tangent xyz vectors and multiplying against the w component of the tangent: bitangent = cross(normal, tangent.xyz) * tangent.w
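
The third note translates directly into code; this is just the quoted formula, not crate API:

```rust
// bitangent = cross(normal, tangent.xyz) * tangent.w
fn bitangent(normal: [f32; 3], tangent: [f32; 4]) -> [f32; 3] {
    let cross = [
        normal[1] * tangent[2] - normal[2] * tangent[1],
        normal[2] * tangent[0] - normal[0] * tangent[2],
        normal[0] * tangent[1] - normal[1] * tangent[0],
    ];
    // The w component (+1 or -1) encodes the handedness of the basis.
    [cross[0] * tangent[3], cross[1] * tangent[3], cross[2] * tangent[3]]
}

fn main() {
    // +Z normal, +X tangent, w = 1 gives a +Y bitangent.
    println!("{:?}", bitangent([0.0, 0.0, 1.0], [1.0, 0.0, 0.0, 1.0]));
}
```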

References to resources

@warmwaffles brought up the topic of references to resources in #4. Whilst it's probably infeasible to return such references directly from import(), it is certainly possible to write a wrapper around Root to 'walk the tree' using iterators.

Here is an experimental wrapper interface and an example of its usage.

The question is, should the library provide this functionality and, if so, what design considerations need to be accounted for?

Reduce build times

The crate build time is getting out of hand (again), and it affects the iteration time of other projects too. People may be dissuaded from using gltf if this continues for too long, so it's important we do something about this sooner rather than later. The trouble is, I have no idea what to do about it. So, if you have any ideas, please discuss!

Finalise Source trait

The final design of the Source trait needs to be agreed upon prior to the planned 1.0 release.

The final design should handle at least:

  • User extensions (assumed to be handled due to the generality of Source::source_external_data)
  • Async I/O (achieved with futures)

Edit: https://github.com/alteous/gltf/issues/31 discusses whether a Source implementation should be mutable, which is an interesting consideration.

Extensions and Extras

The library would be much more useful if it included the extensions and extras fields that are found on most objects in the specification. Until version 0.4.0 this was achieved by having an untyped Map<String, Value> structure that was taken directly from serde_json. This was a very lazy implementation and was not the best option in terms of user-friendliness and efficiency.

I propose we add support for extensions and extras through Extensions* and Extras traits respectively. Each trait will contain user-defined types to be (de)serialised by the library. These types could then be passed through the glTF tree structure as follows:

*Edit: See updates posted below.

// User-defined data is declared with the `Extras` trait
trait Extras {
    type Accessor: Clone + Debug + Default + Deserialize + Serialize;
    ...
}

// Data targeting official extensions is declared with the `Extensions` trait
trait Extensions {
    type Accessor: Clone + Debug + Default + Deserialize + Serialize;
    ...
}

// This is the 'entry point' for the `Extras` and `Extensions` definitions
pub fn import<P, E, X>(path: P) -> Result<Root<E, X>, ImportError>
    where P: AsRef<Path>, E: Extensions, X: Extras
{
    ...
}

// The `Extensions` and `Extras` types are passed down through the glTF tree structure
struct Root<E: Extensions, X: Extras> {
    accessors: Vec<Accessor<E, X>>,
    ...
}

// Here we resolve the user-provided extensions and extras
struct Accessor<E: Extensions, X: Extras> {
    ...
    extensions: <E as Extensions>::Accessor,
    extras: <X as Extras>::Accessor,
}

The implementation might be a bit bonkers, but it does allow for user-data (de)serialisation to be a zero-cost abstraction.

As far as extensions are concerned, it will be the responsibility of the library to provide the structures necessary to represent official extensions. These structures will be defined in an extensions module. The user should not be able to implement their own Extensions; only those offered by the library will be supported. If an extension is required but not supported, the library will return an appropriate error during importing / exporting.

Redesign Image type

The current Image abstraction is too basic. It ought to be redesigned with at least the following features:

  • Ability to obtain dimensions in pixels (N.B. all glTF images are 2D.)
  • Ability to obtain the pixel format, e.g. Rgb.

A nice-to-have feature would be to be able to choose the pixel type, i.e. u8 or f32 etc.

I considered briefly re-exporting DynamicImage from the image crate but gut feeling tells me that's not a good idea. We could design the image abstraction closely following this type, however - perhaps even providing From / Into conversions.

Here's a draft of my ideas:

#[derive(Clone, Debug)]
pub enum Image<T: Copy> {
    Gray(Vec<[T; 1]>),
    GrayAlpha(Vec<[T; 2]>),
    Rgb(Vec<[T; 3]>),
    Rgba(Vec<[T; 4]>),
}

impl From<image::DynamicImage> for Image<u8> { ... }
impl Into<image::DynamicImage> for Image<u8> { ... }

On a side note there are lots of image types now, for example image::Image, json::image::Image, extensions::image::Image, json::extensions::image::Image, import::data::AsyncImage, import::data::EncodedImage, etc. Is this getting out of hand?!

Generate code from json schema?

Hi,
Have you considered generating the Rust code from the official json schema?
I did a quick experiment with schemafy, but immediately ran into Marwes/schemafy#3 (panic on external references).

I got the idea initially from https://github.com/slack-rs/slack-rs-api/tree/master/codegen. They use a custom generator to generate bindings to the Slack API. It works pretty well there (I've worked with the generated code).

Side question: How far along is the 2.0 branch?

Generate tangents and normals

Generating tangents and normals has been an elephant-in-the-room for a while. The specification says we should use the mikktspace algorithm when tangents are not provided and compute flat normals when normals are not provided. There is an important caveat, namely the mikktspace algorithm reorders the existing geometry, and flattens any original index list - that is, the output data is suitable for glDrawArrays. We have to recreate a new index list ourselves if wanted.

Read ahead for new ideas and updates

I propose the following ideas to deal with these concerns. (N.B. this is pseudo-code and probably won't compile.)

/// Geometry to be rendered with the given material.
#[derive(Clone, Debug)]
pub struct Primitive<'a>  {
    /// The parent `Mesh` struct.
    mesh: &'a Mesh<'a>,

    /// The corresponding JSON index.
    index: usize,

    /// The corresponding JSON struct.
    json: &'a json::mesh::Primitive,

    /// New: Computed vertex normals.
    computed_normals: Option<Vec<[f32; 3]>>,
}

impl<'a> Primitive<'a> {
    /// Same as before, returning `None` if there are no tangents available.
    pub fn tangents(&self) -> Option<Tangents<'a>> {
        self.find_accessor_with_semantic(Semantic::Tangents)
    }

    /// New function: Generates flat normals, returning `None` if there are no positions available.
    pub fn normals(&mut self) -> Option<Normals<'a>> {
        if let Some(iter) = self.find_accessor_with_semantic(Semantic::Normals) {
            Some(Normals(Either::Left(iter)))
        } else {
            let normals = self.computed_normals
                .get_or_insert_with(|| self.compute_normals());
            Some(Normals(Either::Right(normals.iter())))
        }
    }

    /// New function: Generates tangents, returning `None` if the primitive lacks positions
    /// or texture co-ordinates - we can generate the normals if necessary.
    pub fn mikk_tspace(self) -> MikkTSpace<'a> {
         // *Magic*
    }
}

/// We might have to duplicate all the other iterators.
/// Alternatively, iterators could take an ordering argument instead. (`Incremental` or `Explicit`)
pub mod mikk_tspace {
    /// New: Geometry guaranteed to have positions, normals, and tangents, but not in original order.
    #[derive(Clone, Debug)]
    pub struct MikkTSpace<'a>  {
        /// The original `Primitive` struct.
        original: &'a Primitive<'a>,

        /// New ordering of vertex data.
        ordering: Vec<u32>,

        /// Newly computed tangents.
        tangents: Vec<[f32; 4]>,

        /// Newly computed index list.
        indices: Vec<u32>,
    }

    /// New: Example color iterator.
    pub struct Colors<'a> {
        original: &'a super::Colors<'a>,
        order: &'a [u32],
    }

    impl<'a> Iterator for Colors<'a> {
        type Item = [f32; 4];
        fn next(&mut self) -> Option<Self::Item> {
            // Take the next index from the new ordering, then look up the
            // corresponding value in the original iterator.
            let (&index, rest) = self.order.split_first()?;
            self.order = rest;
            (*self.original).clone().nth(index as usize)
        }
    }
}

Implement base64 decoding

This should happen when importing images and buffers with data URIs. Use the base64 crate or otherwise.
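
For illustration, here is a minimal standard-alphabet base64 decoder; a real importer would use the base64 crate as suggested above.

```rust
// Minimal base64 decoder sketch (standard alphabet, '=' padding). This is a
// from-scratch illustration, not what the crate ships.
fn decode_base64(input: &str) -> Option<Vec<u8>> {
    const ALPHABET: &[u8] =
        b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    let mut out = Vec::new();
    let mut buffer = 0u32; // bit accumulator
    let mut bits = 0u32;   // number of valid bits in the accumulator
    for byte in input.bytes() {
        if byte == b'=' {
            break; // padding: remaining bits are discarded
        }
        let value = ALPHABET.iter().position(|&c| c == byte)? as u32;
        buffer = (buffer << 6) | value;
        bits += 6;
        if bits >= 8 {
            bits -= 8;
            out.push((buffer >> bits) as u8);
            buffer &= (1 << bits) - 1; // keep only the leftover bits
        }
    }
    Some(out)
}

fn main() {
    let decoded = decode_base64("Z2xURg==").unwrap();
    println!("{}", String::from_utf8(decoded).unwrap()); // glTF
}
```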

Fix data URI decoding

gltf-viewer /home/alteous/gltf/glTF-Sample-Models/2.0/VC/glTF-Embedded/VC.gltf fails with error Source(Io(Error { repr: Os { code: 36, message: "File name too long" } })). This is because the URI is a data URI with the prefix data:image/jpeg;base64, which isn't explicitly handled by FromPath.

Suggested fix: Use the mime crate or otherwise to parse the data URI and handle loading in a more robust way.
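
A sketch of the suggested parsing step using only the standard library; parse_data_uri is a hypothetical helper, not crate API:

```rust
// Sketch: split a data URI into its MIME type, encoding flag, and payload,
// instead of treating the whole URI as a file path.
fn parse_data_uri(uri: &str) -> Option<(&str, bool, &str)> {
    let rest = uri.strip_prefix("data:")?;
    let (header, payload) = rest.split_once(',')?;
    // The header is "<mime>" or "<mime>;base64".
    let (mime, base64) = match header.strip_suffix(";base64") {
        Some(mime) => (mime, true),
        None => (header, false),
    };
    Some((mime, base64, payload))
}

fn main() {
    let (mime, base64, payload) =
        parse_data_uri("data:image/jpeg;base64,AAAA").unwrap();
    println!("{} {} {}", mime, base64, payload); // image/jpeg true AAAA
}
```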

New Validation Strategy

As the title suggests, I'm experimenting with a new strategy for validating the glTF JSON data. This new strategy uses the proc_macro feature to generate implementations of the Validate trait. You can browse the code in the validation branch.

The Validate trait ignores 'untestable' values and propagates through containers such as Vec, Option, and HashMap. The real power of the trait comes with non-trivial Validate implementations such as Index. This new strategy uses the type system to guarantee that all Index types are tested.

So far only index bounds checking has been implemented but the trait should be flexible enough to handle any future use case.

Supporting glTF version 2.0

The current draft of the 2.0 specification is mostly a refinement on the current 1.0 specification. One of the major changes is that shaders are removed from materials and replaced with Physically-Based Rendering (PBR) values. Once the 2.0 specification is released, the question is how should the library continue to support 1.0, if at all? I'm considering one of three options:

  • Use the version field of the asset object to return either a Gltf::V1 or Gltf::V2 and let the user deal with it
  • Convert 1.0 assets to 2.0 assets using gltf-pipeline or otherwise
  • Not support 2.0 (or at least until 1.0 conformance has been achieved)

Plans for 1.0.0

Now that the implementation of the glTF 1.0 specification is complete, the next steps are to consolidate the library to a production standard. 1.0.0 will be the version where this condition is met.

Below is a non-exhaustive list of things I would expect from a 1.0.0 release:

  • Guaranteed backward compatibility for all 1.x.x releases
  • Comprehensive documentation
  • glTF 2.0 support
  • KHR_binary_glTF support
  • Validate implementations for all 2.0 data structures
  • Basic wrapper library for the 2.0 implementation
  • Conversion to and from 1.0 / 2.0 assets (see #14)

Idea: Add gltf-utils crate

The new Loaded type was designed to add extra functionality to the basic 'tree-traversal' library (gltf). This functionality may be better suited in its own crate.

Example contents:

gltf_utils::iter_accessor<T: Copy>(accessor: &gltf::Accessor, source: &gltf_utils::Source) -> Iter<T>

extern crate gltf;
extern crate gltf_utils;

use gltf_utils::iter_accessor;

for position in iter_accessor::<cgmath::Point3<f32>>(&position_accessor, &buffer_source) {
   ...
}

gltf_utils::colors(primitive: &Primitive) -> gltf_utils::Colors
gltf_utils::generate_flat_normals(primitive: &Primitive) -> Vec<(usize, [f32; 3])>

Source would be moved to gltf_utils.

Derive ExactSizeIterator trait for Positions and others

Right now something like Indices is defined like this:

pub enum Indices<'a> {
    U8(Iter<'a, u8>),
    U16(Iter<'a, u16>),
    U32(Iter<'a, u32>),
}

And because it exposes Iter, we can use .len() easily, but that is not possible for Positions, which hides its Iter field.
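
A sketch of how a wrapper such as Positions could still implement ExactSizeIterator without exposing its inner Iter, by delegating size_hint (simplified stand-in types, not the crate's definitions):

```rust
// Sketch: because glTF accessors declare their element count up front, a
// wrapper can implement ExactSizeIterator by forwarding `size_hint` through
// the enum; the inner `Iter` field stays private.
enum Positions<'a> {
    F32(std::slice::Iter<'a, [f32; 3]>),
}

impl<'a> Iterator for Positions<'a> {
    type Item = [f32; 3];
    fn next(&mut self) -> Option<Self::Item> {
        match self {
            Positions::F32(iter) => iter.next().copied(),
        }
    }
    fn size_hint(&self) -> (usize, Option<usize>) {
        // Exact bounds from the inner slice iterator.
        match self {
            Positions::F32(iter) => iter.size_hint(),
        }
    }
}

// The default `len()` works because size_hint is exact.
impl<'a> ExactSizeIterator for Positions<'a> {}

fn main() {
    let data = [[0.0f32, 0.0, 0.0], [1.0, 1.0, 1.0]];
    let positions = Positions::F32(data.iter());
    println!("{}", positions.len()); // 2
}
```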

Possible bug: Avocado example has negative texture co-ordinates

Reproduce with:

fn main() {
    let gltf = gltf::Import::from_path("glTF-Sample-Models/2.0/Avocado/glTF/Avocado.gltf").sync().unwrap();
    let mesh = gltf.meshes().nth(0).unwrap();
    let primitive = mesh.primitives().nth(0).unwrap();
    let mut tex_coords = vec![];
    match primitive.tex_coords(0).unwrap() {
        gltf::mesh::TexCoords::U8(iter) => {
            for x in iter {
                tex_coords.push([x[0] as f32 / 255.0, x[1] as f32 / 255.0]);
            }
        },
        gltf::mesh::TexCoords::U16(iter) => {
            for x in iter {
                tex_coords.push([x[0] as f32 / 65535.0, x[1] as f32 / 65535.0]);
            }
        },
        gltf::mesh::TexCoords::F32(iter) => {
            for x in iter {
                tex_coords.push([x[0], x[1]]);
            }
        },
    }
    println!("{:?}", tex_coords);
}

Note that all the x[1] values are negative, but otherwise valid within the unit range.

Library structure

This thread exists as a place to discuss the modules, functions, data structures, and their names included in the library. In version 1.0.0 names must be agreed upon and remain immutable for backward compatibility. Until then, we should experiment as much as possible and find out what works best.

Current structure

gltf

  • The main crate, intended for navigating glTF JSON.
  • Intentionally limited in scope.

gltf-json

  • Child crate for gltf.
  • Exists primarily to reduce compile times for gltf.
  • semver matches that of gltf for convenience but is otherwise unstable.

gltf-importer

  • Reference importer with a simple API.
  • May be used by run times to read from the file system.
  • Handles base 64 decoding and distinguishing binary glTF on behalf of the user.
  • Optionally loads buffers and images (to do).

gltf-utils

  • Utility functions to complement the gltf crate.
  • Addresses cases where access to pre-existing buffer / image data is required.
  • May remain unstable post gltf 1.0.

Rewrite import tests (again)

Issue https://github.com/alteous/gltf/issues/32 arose due to the naïve way the basic import test (tests/import_examples.rs) searches for glTF to load. The tests currently assume .gltf / .glb files of the form x/glTF/x.gltf or x/glTF-Binary/x.glb where x is the name of the sample model. https://github.com/alteous/gltf/issues/32 failed because one model had the path BoxInterleaved/glTF/Box_Interleaved.gltf, which led to a file not found error.

The suggested solution is to rewrite the tests to recursively scan the sample models directory (using walkdir or otherwise) for .gltf / .glb files to load. This should future-proof the basic import tests.

It is worth noting that these tests serve more as sanity checks rather than actual rigorous testing of the crate. Testing the importer and wrapper is a difficult task which I'm not sure how to approach yet.
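
A sketch of the recursive scan using only std::fs; collect_sample_models is a hypothetical helper, and the walkdir crate suggested above would make this simpler still:

```rust
// Sketch: recursively collect every .gltf / .glb file under a directory,
// so tests no longer assume a fixed x/glTF/x.gltf layout.
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

fn collect_sample_models(dir: &Path, found: &mut Vec<PathBuf>) -> io::Result<()> {
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_dir() {
            collect_sample_models(&path, found)?;
        } else if matches!(
            path.extension().and_then(|ext| ext.to_str()),
            Some("gltf") | Some("glb")
        ) {
            found.push(path);
        }
    }
    Ok(())
}

fn main() -> io::Result<()> {
    let mut found = Vec::new();
    collect_sample_models(Path::new("."), &mut found)?;
    println!("found {} assets", found.len());
    Ok(())
}
```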

Remove Deref target for texture::Info

There are multiple methods from Texture that conflict with Info which is confusing. Suggested action would be to remove in favour of a fn texture(&self) -> Texture<'a> method instead.

Handling different glTF versions

Currently, implementations exist for loading both the 1.0 and the draft 2.0 versions of the glTF specification in the incoming branch. In #8 I mention the possibility of implementing conversion functions between different glTF versions. However, after some experimentation, I don't believe a simple 1.0 -> 2.0 conversion is feasible, since we can't choose the PBR values that determine the look of the models without extra input from the user. Therefore I propose the library does not attempt such conversions and instead forces the user to choose a version upfront.

The implementation will be very simple. The 2.0 glTF tree is implemented in a v2 module alongside the existing v1 module. We provide functions v1::import() and v2::import() that load only glTF version 1.0 and version 2.0 respectively. An error is returned if the user calls import() on glTF that doesn't match the required version.

What is incomplete and still left to do?

I'm interested in working to get this lib up to glTF 1.0 compliance.

We definitely need to add more tests to ensure that it is 1.0 compliant.

How are buffer binary files going to be handled?
Should this library load them up into memory? Lazy load?
Buffers can be base64-encoded URIs, and I don't know whether this library decodes them.

Side notes

There is a spec for 2.0 that is currently under way, and it introduces a .glb format which, from the looks of it, is just the binary glTF extension.

In 2.0 it looks like they are going to pull the shader code stuff out of the spec and make it an extension instead. https://github.com/KhronosGroup/glTF/tree/2.0/extensions/Khronos/KHR_technique_webgl
