Comments (19)
I'll do a PR with the change.
from three.js.
Ok, I'll do the wipe thing at the end of `parse()` in a few days.
If I am not wrong, without using @gkjohnson's parts repository the editor will be limited to loading only pre-packed mpd files. That is quick and efficient, since loading individual parts from the repository produces lots of traffic.
I only created this issue so it could be looked at. You guys can handle it and close this at any time.
A model material related to a mesh has the material for the edge lines of the same object associated with it. That one, in turn, can also have an associated conditional-edges material (edges with a sort of back-face auto-culling).
This is only needed for building the model while parsing, so perhaps the second solution I proposed is the best one (I can give it a try).
I did put them in `userData` because at the time the use of this field was more common. In `SVGLoader`, for example, the path style is also in the path's `userData`.
If the class `ConditionalLineSegments` is not serializable itself, then I think the conditional segments can't be saved, but I'm not sure about this.
Okay, thanks for the explanation!
Storing things in `userData` is of course fine as long as it does not break serialization/deserialization. But if the contents of `userData` created by `LDrawLoader` have no relevance for the app, it's indeed best to clean things up.
It was easy. I've done it by cloning the material (before cloning, I temporarily detach the non-serializable fields; there are only 2 of them). This has the benefit of not touching the materials in the loader library, so the loader is not affected and can be used to parse more files.
In `LDrawLoader.applyMaterialsToMesh()`, inside the local `getMaterial()` function, at the end, I've changed `return material;` to:
```js
let result = material;

if ( finalMaterialPass ) {

	// Clone the material without the non-serializable fields
	const edgeMaterial = material.userData.edgeMaterial;
	const conditionalEdgeMaterial = material.userData.conditionalEdgeMaterial;
	delete material.userData.edgeMaterial;
	delete material.userData.conditionalEdgeMaterial;

	result = material.clone();

	material.userData.edgeMaterial = edgeMaterial;
	material.userData.conditionalEdgeMaterial = conditionalEdgeMaterial;

}

return result;
```
cc @gkjohnson Does this look okay to you?
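The detach/clone/reattach pattern can be sanity-checked in isolation with a plain-object mock (hypothetical; `makeMaterial()` is not the real three.js `Material`, it only mimics a `clone()` that copies `userData`):

```js
// Hypothetical plain-object mock of a material; NOT the real three.js
// Material class. Its clone() shallow-copies userData, which is enough
// to exercise the detach/clone/reattach pattern from the snippet above.
function makeMaterial() {

	return {
		userData: {},
		clone() {

			return { userData: { ...this.userData }, clone: this.clone };

		}
	};

}

const material = makeMaterial();
material.userData.edgeMaterial = makeMaterial();            // non-serializable companion
material.userData.conditionalEdgeMaterial = makeMaterial(); // non-serializable companion

// Detach the two fields, clone, then reattach them to the original.
const edgeMaterial = material.userData.edgeMaterial;
const conditionalEdgeMaterial = material.userData.conditionalEdgeMaterial;
delete material.userData.edgeMaterial;
delete material.userData.conditionalEdgeMaterial;

const result = material.clone();

material.userData.edgeMaterial = edgeMaterial;
material.userData.conditionalEdgeMaterial = conditionalEdgeMaterial;

// The clone is clean; the original is restored and still usable by the loader.
console.log( 'edgeMaterial' in result.userData ); // false
console.log( material.userData.edgeMaterial === edgeMaterial ); // true
```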
I was unsure whether `delete` would let the two members be garbage-collected. They won't be, since they are still referenced by the two local variables, right?
Yeah, they'll be kept around because you've assigned them to other local variables.
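A minimal illustration of why `delete` alone does not free the values:

```js
const userData = { edgeMaterial: { name: 'edge' } };

// Keep a local reference before deleting the property.
const edgeMaterial = userData.edgeMaterial;
delete userData.edgeMaterial;

// delete only removes the property from the object; the value itself
// stays reachable (and thus not garbage-collectable) via the local variable.
console.log( 'edgeMaterial' in userData ); // false
console.log( edgeMaterial.name );          // 'edge'
```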
To me, the code required for loading PDB files is too extensive. And since the loader is quite special, I would leave it out until more users request it.
Regarding the LDraw issue, it seems the support in the editor broke with r137. #23157 changed the signature of `parse()` without updating the editor.
Okay, the import should work again. However, there is a runtime error when the editor tries to save the scene after the import:

```
Uncaught DOMException: Failed to execute 'put' on 'IDBObjectStore': function onMaterialDispose( event ) {
```

This only happens with LDraw models. Other loaders work fine. To be clear, the runtime error prevents a save, so when you reload the editor the model is gone.
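The failure is consistent with IndexedDB's structured clone algorithm rejecting function-valued fields. A minimal stand-alone reproduction in plain JS (hypothetical data shapes, not the editor code; `structuredClone` needs Node 17+ or a browser):

```js
// IndexedDB stores values via the structured clone algorithm, which
// cannot clone functions: a function-valued field anywhere in the object
// graph makes 'put' throw a DataCloneError, matching the editor failure.
const serializable = { userData: { colorCode: '16' } };
const notSerializable = { userData: { onMaterialDispose: function ( event ) {} } };

console.log( structuredClone( serializable ).userData.colorCode ); // '16'

let failed = false;

try {

	structuredClone( notSerializable );

} catch ( e ) {

	failed = true; // DataCloneError

}

console.log( failed ); // true
```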
The serialization issue also happens with versions earlier than r137, so it does not look like a new bug. Potentially it is in `LDrawLoader`.
/cc @gkjohnson
For testing: https://rawcdn.githack.com/mrdoob/three.js/dev/editor/index.html
Yeah, loading is possible again and that is the most important thing. However, it would be nice if the loader worked like any other inside the editor. So let's keep the issue open until it is clear where the error message in #26279 (comment) comes from.
@gkjohnson When I delete the `userData` fields of the materials produced by `LDrawLoader`, serialization works as expected. It seems the loader assigns objects to `userData` which are not compatible with serialization. This is something that should be fixed in `LDrawLoader`.
I have deleted the user data fields with the code below. Just put it in `Storage.set()` (editor) for testing:
```js
if ( data.scene.materials ) {

	for ( const material of data.scene.materials ) {

		delete material.userData;

	}

}
```
It looks like the companion edge and line materials are stored in the `userData` field on the materials. I think this approach has been around since before I started working on the loader - but it probably makes sense to remove that and index the materials in a different way when building the model. I don't think I'll have the time to make the change. cc @yomboprime as well, in case they have any thoughts.
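One way the "different indexing" could look (hypothetical sketch, not actual `LDrawLoader` code; `registerMaterial` and the `Map` are invented for illustration):

```js
// Hypothetical alternative indexing: keep the companion materials in a
// Map keyed by the main material, so they never have to live in
// material.userData at all.
const companions = new Map();

function registerMaterial( material, edgeMaterial, conditionalEdgeMaterial ) {

	companions.set( material, { edgeMaterial, conditionalEdgeMaterial } );

}

const mainMaterial = { name: 'Main' };
registerMaterial( mainMaterial, { name: 'Edge' }, { name: 'CondEdge' } );

// The main material stays clean and serializable, while the companions
// remain reachable for the duration of the model build.
console.log( companions.get( mainMaterial ).edgeMaterial.name ); // 'Edge'
console.log( 'edgeMaterial' in mainMaterial ); // false
```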
I think the easiest solution is to make the classes `LDrawConditionalLineMaterial` and `ConditionalLineSegments` serializable. Is this possible?
If not, then another solution is to put the `userData` objects in other maps (as many as needed) inside the `LDrawLoader`, hashed by material code. But then reloaded models would not have the edge materials.
What was the reason for putting the materials into `userData` in the first place?
Another, simpler solution is to wipe the `userData` at the end of parsing.
It might make sense to index the materials differently in the future, but this looks good. It might also make sense to `delete` the edge and conditionalEdge material members so that those now-empty fields aren't left on the material provided to the end user.
@Mugen87, this is just an FYI related to PDB support.
All of the previously mentioned code can be moved to the PDBLoader.js file itself, which would leave the Loader.js related code as:
```js
case 'pdb':

{

	reader.addEventListener( 'load', async function ( event ) {

		const contents = event.target.result;

		const { PDBLoader } = await import( '../../examples/jsm/loaders/PDBLoader.js' );

		const pdb = new PDBLoader().parse( contents );
		pdb.name = filename;

		editor.execute( new AddObjectCommand( editor, pdb ) );

	}, false );

	reader.readAsText( file );

	break;

}
```
Below is my version of the modified PDBLoader.js file just so you can see how I went about changing it (this code can probably be further simplified / improved):
import {
BufferGeometry,
IcosahedronGeometry,
BoxGeometry,
FileLoader,
Float32BufferAttribute,
Mesh,
MeshStandardMaterial,
Vector3,
Loader,
Group,
Color
} from 'three';
class PDBLoader extends Loader {
constructor( manager ) {
super( manager );
}
load( url, onLoad, onProgress, onError ) {
const scope = this;
const loader = new FileLoader( scope.manager );
loader.setPath( scope.path );
loader.setRequestHeader( scope.requestHeader );
loader.setWithCredentials( scope.withCredentials );
loader.load( url, function ( text ) {
try {
onLoad( scope.parse( text ) );
} catch ( e ) {
if ( onError ) {
onError( e );
} else {
console.error( e );
}
scope.manager.itemError( url );
}
}, onProgress, onError );
}
// Based on CanvasMol PDB parser
parse( text ) {
function trim( text ) {
return text.replace( /^\s\s*/, '' ).replace( /\s\s*$/, '' );
}
function capitalize( text ) {
return text.charAt( 0 ).toUpperCase() + text.slice( 1 ).toLowerCase();
}
function hash( s, e ) {
return 's' + Math.min( s, e ) + 'e' + Math.max( s, e );
}
function parseBond( start, length, satom, i ) {
const eatom = parseInt( lines[ i ].slice( start, start + length ) );
if ( eatom ) {
const h = hash( satom, eatom );
if ( _bhash[ h ] === undefined ) {
_bonds.push( [ satom - 1, eatom - 1, 1 ] );
_bhash[ h ] = _bonds.length - 1;
} else {
// doesn't really work as almost all PDBs
// have just normal bonds appearing multiple
// times instead of being double/triple bonds
// bonds[bhash[h]][2] += 1;
}
}
}
function buildGeometry() {
const build = {
geometryAtoms: new BufferGeometry(),
geometryBonds: new BufferGeometry(),
json: {
atoms: atoms
}
};
const geometryAtoms = build.geometryAtoms;
const geometryBonds = build.geometryBonds;
const json = build.json;
const verticesAtoms = [];
const colorsAtoms = [];
const verticesBonds = [];
// atoms
const c = new Color();
for ( let i = 0, l = atoms.length; i < l; i ++ ) {
const atom = atoms[ i ];
const x = atom[ 0 ];
const y = atom[ 1 ];
const z = atom[ 2 ];
verticesAtoms.push( x, y, z );
const r = atom[ 3 ][ 0 ] / 255;
const g = atom[ 3 ][ 1 ] / 255;
const b = atom[ 3 ][ 2 ] / 255;
c.set( r, g, b ).convertSRGBToLinear();
colorsAtoms.push( c.r, c.g, c.b );
}
// bonds
for ( let i = 0, l = _bonds.length; i < l; i ++ ) {
const bond = _bonds[ i ];
const start = bond[ 0 ];
const end = bond[ 1 ];
const startAtom = _atomMap[ start ];
const endAtom = _atomMap[ end ];
let x = startAtom[ 0 ];
let y = startAtom[ 1 ];
let z = startAtom[ 2 ];
verticesBonds.push( x, y, z );
x = endAtom[ 0 ];
y = endAtom[ 1 ];
z = endAtom[ 2 ];
verticesBonds.push( x, y, z );
}
const atomGeometry = new IcosahedronGeometry( 1, 2 );
const bondGeometry = new BoxGeometry( 1, 1, 1 );
const position = new Vector3();
const offset = new Vector3();
const start = new Vector3();
const end = new Vector3();
const color = new Color();
const root = new Group();
const mesh_atoms = [];
const mesh_bonds = [];
// build geometry
const positions = new Float32BufferAttribute( verticesAtoms, 3 );
const colors = new Float32BufferAttribute( colorsAtoms, 3 );
geometryAtoms.setAttribute( 'position', positions );
geometryAtoms.setAttribute( 'color', colors );
const bond_positions = new Float32BufferAttribute( verticesBonds, 3 );
geometryBonds.setAttribute( 'position', bond_positions );
// Center the model
geometryAtoms.computeBoundingBox();
geometryAtoms.boundingBox.getCenter( offset ).negate();
geometryAtoms.translate( offset.x, offset.y, offset.z );
geometryBonds.translate( offset.x, offset.y, offset.z );
// Add atoms to the model group
for ( let i = 0; i < positions.count; i ++ ) {
position.x = positions.getX( i );
position.y = positions.getY( i );
position.z = positions.getZ( i );
color.r = colors.getX( i );
color.g = colors.getY( i );
color.b = colors.getZ( i );
let object = new Mesh( atomGeometry, new MeshStandardMaterial( { color: color } ) );
object[ 'name' ] = 'atom_' + i;
object.position.copy( position );
object.position.multiplyScalar( 75 );
object.scale.multiplyScalar( 25 );
// Add atomic text to the atom's userData
object.userData[ 'Element' ] = json.atoms[ i ][ 4 ];
mesh_atoms.push( object );
root.add( object );
}
// Add bonds between atoms to the model group
for ( let i = 0; i < bond_positions.count; i += 2 ) {
start.x = bond_positions.getX( i );
start.y = bond_positions.getY( i );
start.z = bond_positions.getZ( i );
end.x = bond_positions.getX( i + 1 );
end.y = bond_positions.getY( i + 1 );
end.z = bond_positions.getZ( i + 1 );
start.multiplyScalar( 75 );
end.multiplyScalar( 75 );
let object = new Mesh( bondGeometry, new MeshStandardMaterial( { color: 0xFFFFFF } ) );
object[ 'name' ] = 'bond_' + parseInt( i / 2 );
object.position.copy( start );
object.position.lerp( end, 0.5 );
object.scale.set( 5, 5, start.distanceTo( end ) );
object.lookAt( end );
mesh_bonds.push( object );
root.add( object );
}
return root;
}
const CPK = { h: [ 255, 255, 255 ], he: [ 217, 255, 255 ], li: [ 204, 128, 255 ], be: [ 194, 255, 0 ], b: [ 255, 181, 181 ], c: [ 144, 144, 144 ], n: [ 48, 80, 248 ], o: [ 255, 13, 13 ], f: [ 144, 224, 80 ], ne: [ 179, 227, 245 ], na: [ 171, 92, 242 ], mg: [ 138, 255, 0 ], al: [ 191, 166, 166 ], si: [ 240, 200, 160 ], p: [ 255, 128, 0 ], s: [ 255, 255, 48 ], cl: [ 31, 240, 31 ], ar: [ 128, 209, 227 ], k: [ 143, 64, 212 ], ca: [ 61, 255, 0 ], sc: [ 230, 230, 230 ], ti: [ 191, 194, 199 ], v: [ 166, 166, 171 ], cr: [ 138, 153, 199 ], mn: [ 156, 122, 199 ], fe: [ 224, 102, 51 ], co: [ 240, 144, 160 ], ni: [ 80, 208, 80 ], cu: [ 200, 128, 51 ], zn: [ 125, 128, 176 ], ga: [ 194, 143, 143 ], ge: [ 102, 143, 143 ], as: [ 189, 128, 227 ], se: [ 255, 161, 0 ], br: [ 166, 41, 41 ], kr: [ 92, 184, 209 ], rb: [ 112, 46, 176 ], sr: [ 0, 255, 0 ], y: [ 148, 255, 255 ], zr: [ 148, 224, 224 ], nb: [ 115, 194, 201 ], mo: [ 84, 181, 181 ], tc: [ 59, 158, 158 ], ru: [ 36, 143, 143 ], rh: [ 10, 125, 140 ], pd: [ 0, 105, 133 ], ag: [ 192, 192, 192 ], cd: [ 255, 217, 143 ], in: [ 166, 117, 115 ], sn: [ 102, 128, 128 ], sb: [ 158, 99, 181 ], te: [ 212, 122, 0 ], i: [ 148, 0, 148 ], xe: [ 66, 158, 176 ], cs: [ 87, 23, 143 ], ba: [ 0, 201, 0 ], la: [ 112, 212, 255 ], ce: [ 255, 255, 199 ], pr: [ 217, 255, 199 ], nd: [ 199, 255, 199 ], pm: [ 163, 255, 199 ], sm: [ 143, 255, 199 ], eu: [ 97, 255, 199 ], gd: [ 69, 255, 199 ], tb: [ 48, 255, 199 ], dy: [ 31, 255, 199 ], ho: [ 0, 255, 156 ], er: [ 0, 230, 117 ], tm: [ 0, 212, 82 ], yb: [ 0, 191, 56 ], lu: [ 0, 171, 36 ], hf: [ 77, 194, 255 ], ta: [ 77, 166, 255 ], w: [ 33, 148, 214 ], re: [ 38, 125, 171 ], os: [ 38, 102, 150 ], ir: [ 23, 84, 135 ], pt: [ 208, 208, 224 ], au: [ 255, 209, 35 ], hg: [ 184, 184, 208 ], tl: [ 166, 84, 77 ], pb: [ 87, 89, 97 ], bi: [ 158, 79, 181 ], po: [ 171, 92, 0 ], at: [ 117, 79, 69 ], rn: [ 66, 130, 150 ], fr: [ 66, 0, 102 ], ra: [ 0, 125, 0 ], ac: [ 112, 171, 250 ], th: [ 0, 186, 255 ], pa: [ 0, 161, 255 ], u: [ 
0, 143, 255 ], np: [ 0, 128, 255 ], pu: [ 0, 107, 255 ], am: [ 84, 92, 242 ], cm: [ 120, 92, 227 ], bk: [ 138, 79, 227 ], cf: [ 161, 54, 212 ], es: [ 179, 31, 212 ], fm: [ 179, 31, 186 ], md: [ 179, 13, 166 ], no: [ 189, 13, 135 ], lr: [ 199, 0, 102 ], rf: [ 204, 0, 89 ], db: [ 209, 0, 79 ], sg: [ 217, 0, 69 ], bh: [ 224, 0, 56 ], hs: [ 230, 0, 46 ], mt: [ 235, 0, 38 ], ds: [ 235, 0, 38 ], rg: [ 235, 0, 38 ], cn: [ 235, 0, 38 ], uut: [ 235, 0, 38 ], uuq: [ 235, 0, 38 ], uup: [ 235, 0, 38 ], uuh: [ 235, 0, 38 ], uus: [ 235, 0, 38 ], uuo: [ 235, 0, 38 ] };
const atoms = [];
const _bonds = [];
const _bhash = {};
const _atomMap = {};
// parse
const lines = text.split( '\n' );
for ( let i = 0, l = lines.length; i < l; i ++ ) {
if ( lines[ i ].slice( 0, 4 ) === 'ATOM' || lines[ i ].slice( 0, 6 ) === 'HETATM' ) {
const x = parseFloat( lines[ i ].slice( 30, 37 ) );
const y = parseFloat( lines[ i ].slice( 38, 45 ) );
const z = parseFloat( lines[ i ].slice( 46, 53 ) );
const index = parseInt( lines[ i ].slice( 6, 11 ) ) - 1;
let e = trim( lines[ i ].slice( 76, 78 ) ).toLowerCase();
if ( e === '' ) {
e = trim( lines[ i ].slice( 12, 14 ) ).toLowerCase();
}
const atomData = [ x, y, z, CPK[ e ], capitalize( e ) ];
atoms.push( atomData );
_atomMap[ index ] = atomData;
} else if ( lines[ i ].slice( 0, 6 ) === 'CONECT' ) {
const satom = parseInt( lines[ i ].slice( 6, 11 ) );
parseBond( 11, 5, satom, i );
parseBond( 16, 5, satom, i );
parseBond( 21, 5, satom, i );
parseBond( 26, 5, satom, i );
}
}
// build and return geometry
return buildGeometry();
}
}
export { PDBLoader };
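The bond de-duplication in `parseBond()` above relies on the symmetric `hash()`: CONECT records that list the same bond from both atoms collapse to a single entry. Extracted here as a stand-alone check:

```js
// The symmetric hash makes hash( 1, 2 ) and hash( 2, 1 ) identical, so a
// bond recorded from either endpoint is only stored once.
function hash( s, e ) {

	return 's' + Math.min( s, e ) + 'e' + Math.max( s, e );

}

const _bonds = [];
const _bhash = {};

function addBond( satom, eatom ) {

	const h = hash( satom, eatom );

	if ( _bhash[ h ] === undefined ) {

		_bonds.push( [ satom - 1, eatom - 1, 1 ] );
		_bhash[ h ] = _bonds.length - 1;

	}

}

addBond( 1, 2 ); // CONECT 1 2
addBond( 2, 1 ); // CONECT 2 1 - same bond, opposite direction

console.log( _bonds.length ); // 1
```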