leapmotion / vrcollage
Experiment w/ Leap Motion and Virtual Reality
Home Page: http://leapmotion.github.io/VRCollage/
License: Apache License 2.0
e.g., cloth deformation
A demonstration hand should appear and act out the motion, showing the user how to perform an action:
put one finger behind the frame and one in front, applying a torque.
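The one-finger-in-front, one-behind interaction boils down to two opposing contact forces producing a net torque on the frame. A minimal sketch of that idea (all names are illustrative; the real app would read contact points from Leap Motion frame data):

```javascript
// Sketch: net torque from finger contacts on a frame, in 2D.
// Illustrative only; not taken from the VRCollage codebase.

// z component of the 3D cross product of two 2D vectors.
function cross2(r, f) {
  return r.x * f.y - r.y * f.x;
}

// Net torque about the frame's pivot from a list of contacts,
// each contact being { point: {x, y}, force: {x, y} }.
function netTorque(pivot, contacts) {
  return contacts.reduce((sum, c) => {
    const r = { x: c.point.x - pivot.x, y: c.point.y - pivot.y };
    return sum + cross2(r, c.force);
  }, 0);
}

// One finger pushes the top edge one way, the other pushes the bottom
// edge the opposite way: the torques add instead of cancelling.
const pivot = { x: 0, y: 0 };
const contacts = [
  { point: { x: 0, y: 1 },  force: { x: 1, y: 0 } },   // front finger
  { point: { x: 0, y: -1 }, force: { x: -1, y: 0 } },  // back finger
];
```

With both fingers pushing in the same rotational direction, the magnitudes sum, which is what makes the two-sided grip spin the frame rather than translate it.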
Probably need to pass in the correct viewing angles. This should be a good example:
https://github.com/dmarcos/vrHelloWorld/tree/master/examples/webgl
http://swimminglessonsformodernlife.com/vrHelloWorld/examples/webgl/
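The "correct viewing angles" in the linked example come from per-eye field-of-view data. A hedged sketch of turning such angles into an off-axis projection matrix, in the spirit of the WebVR examples (the angle names and column-major layout here are assumptions, not VRCollage code):

```javascript
// Sketch: build an off-axis projection matrix from per-eye
// field-of-view angles (degrees), WebVR-style. Assumed helper,
// not from the VRCollage codebase.
function fieldOfViewToProjection(fov, near, far) {
  const d2r = Math.PI / 180;
  const up = Math.tan(fov.upDegrees * d2r);
  const down = Math.tan(fov.downDegrees * d2r);
  const left = Math.tan(fov.leftDegrees * d2r);
  const right = Math.tan(fov.rightDegrees * d2r);

  const xScale = 2 / (left + right);
  const yScale = 2 / (up + down);

  // Column-major 4x4, matching WebGL's uniformMatrix4fv convention.
  const m = new Float32Array(16);
  m[0] = xScale;
  m[5] = yScale;
  m[8] = -(left - right) * xScale * 0.5; // horizontal asymmetry
  m[9] = (up - down) * yScale * 0.5;     // vertical asymmetry
  m[10] = far / (near - far);
  m[11] = -1;
  m[14] = (far * near) / (near - far);
  return m;
}

// Symmetric 45-degree frustum as a quick sanity check.
const proj = fieldOfViewToProjection(
  { upDegrees: 45, downDegrees: 45, leftDegrees: 45, rightDegrees: 45 },
  0.1, 1000);
```

For a symmetric frustum the off-axis terms (m[8], m[9]) vanish and this reduces to an ordinary perspective projection; HMD eyes are asymmetric, which is why the per-angle form matters.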
On Firefox Nightly 39a01 on OS X, using a DK2 with the Leap Motion sensor mounted on the HMD. Head and hand tracking both work. I can "highlight" the Oculus-mounted option but have no way to confirm the selection. Am I missing a key, button, or other control mechanism? If I stick my head past the dialog I can sometimes see what I believe to be the photo window, so I'm guessing the dialog just isn't dismissing properly.
Planes should be able to be fit to a specific curvature, e.g., a sphere around the user
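A minimal sketch of the sphere-fitting idea, assuming a hypothetical `fitToSphere` helper (names and conventions are illustrative): each plane keeps angular coordinates and is placed at a fixed radius around the user, rotated to face inward.

```javascript
// Sketch: place a plane on a sphere of the given radius around the
// user (at the origin), facing back toward the center. `yaw`/`pitch`
// are spherical angles in radians; -z is "straight ahead", as in
// typical WebGL scenes. Illustrative helper, not VRCollage code.
function fitToSphere(radius, yaw, pitch) {
  const x = radius * Math.cos(pitch) * Math.sin(yaw);
  const y = radius * Math.sin(pitch);
  const z = -radius * Math.cos(pitch) * Math.cos(yaw);
  return {
    position: { x, y, z },
    // Euler angles that turn the plane back toward the origin.
    rotation: { x: -pitch, y: yaw, z: 0 },
  };
}

// A plane straight ahead at 2 units sits at (0, 0, -2), unrotated.
const ahead = fitToSphere(2, 0, 0);
```

Keeping planes parameterized by (radius, yaw, pitch) also makes curvature a single knob: shrinking the radius pulls the whole collage into a tighter shell around the user.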
1: Do an audit.
2: Notice several large methods that resist optimization; make them smaller.
3: Profit.
It could be really cool to hear a low-bass note when touching planes. Especially with something like this:
https://www.kickstarter.com/projects/1382889335/woojer-feel-the-sound
Two key methods should be stress-tested: getPosition and getZReposition (the latter is currently known to be slow).
Travis doesn't help out with this at all: travis-ci/travis-ci#352
Something like this would be ideal: https://www.dartlang.org/performance/
For starters we can just write a test and manually log the results when it runs. These could potentially even be run in Travis.
Something like this could get the test results and add them to a graph: https://github.com/travis-ci-examples/webhook
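As a starting point for the "write a test and log the results" idea, a minimal timing harness. The workload below is a stand-in; the real tests would call getPosition and getZReposition on representative scene data:

```javascript
// Minimal timing harness: run a function many times and log the mean
// per-call time in milliseconds. Suitable for manual runs or for
// CI logs that a webhook could later scrape into a graph.
function timeIt(name, fn, iterations = 100000) {
  const start = Date.now();
  for (let i = 0; i < iterations; i++) fn();
  const elapsed = Date.now() - start;
  const perCall = elapsed / iterations;
  console.log(name + ': ' + perCall.toFixed(6) + ' ms/call');
  return perCall;
}

// Stand-in workload; swap in the methods under test.
const perCall = timeIt('stand-in', () => Math.sqrt(Math.random()));
```

Logging a stable `name: value` line per method keeps the output trivial for a webhook consumer to parse.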
The viewport freezes and nothing happens when I select the desk-mount option on startup.
Oculus mounted version works.
It would be nifty to have a "hold card" gesture. E.g., hold the hand flat, palm facing you, fingers horizontal, and slightly behind an image. Then bring the thumb down in front of the image to cause a pinch/grab, which holds the image to the palm.
Currently tracking here is a bit iffy, but it still may be worth exploring.
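The pose described above could be detected from two signals: a palm normal pointing back at the user and a small thumb-to-index distance. A sketch under those assumptions (the field names below are simplified stand-ins for Leap-style hand data, and the thresholds are guesses):

```javascript
// Sketch: detect a "hold card" pose from simplified hand data.
// `palmNormal` is a unit vector with +z toward the user; `thumbTip`
// and `indexTip` are [x, y, z] positions in millimeters. These names
// and thresholds are illustrative, not from the Leap API or VRCollage.
function isHoldCardPose(hand, pinchThreshold = 30 /* mm */) {
  const palmFacingUser = hand.palmNormal[2] > 0.8;
  const dx = hand.thumbTip[0] - hand.indexTip[0];
  const dy = hand.thumbTip[1] - hand.indexTip[1];
  const dz = hand.thumbTip[2] - hand.indexTip[2];
  const pinchDistance = Math.sqrt(dx * dx + dy * dy + dz * dz);
  return palmFacingUser && pinchDistance < pinchThreshold;
}

// Palm facing the user, thumb nearly touching the index finger.
const holding = isHoldCardPose({
  palmNormal: [0, 0, 1],
  thumbTip: [0, 0, 0],
  indexTip: [10, 0, 0],
});
```

Requiring both conditions at once should help with the iffy tracking: a momentary pinch misread won't trigger a hold unless the palm orientation agrees.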
The live demo on the gh-pages site has both hands inverted: move your right hand and the displayed left hand moves.