provencher / mrtk-quest
This project forked from hololabinc/mrtkextensionforoculusquest
MRTK-Quest: Mixed Reality Toolkit (MRTK) extension bridge for Oculus Quest + Rift / S
License: Other
When you grab an object and move it toward or away from your face, it drifts out of your hand. We have posted this issue on the MRTK GitHub page, but they have not responded. We've tried many Unity and MRTK versions, including the newest ones. It makes grabbing objects very inaccurate, but it doesn't appear to be a limitation of the accuracy of the hand tracking itself...
While the plugin seems to be pretty cool, development on it has stalled for 2 years now, and issues concerning Unity 2019 and 2020 are appearing.
The reference should be removed.
If I uncheck "renderAvatarHandsInsteadOfControllers" in MRTK-OculusConfig, the controller model is shown instantly when the scene starts.
So why is there a 30-second delay before the avatar hands are shown? Is it an Oculus problem?
Hi Everyone.
I am building rehabilitation treatments via VR games (mainly for the Oculus Quest, for its mobility). I am very interested in adding the hand-tracking teleport feature of this MRTK-Quest project to my own project (it would be great for patients who have little mobility in their hands due to their health problems, letting them move around the game using their hands). I guess this feature is in the MRTK-Quest_2.4_Extra scene. Would that be possible? Could anybody advise me on how to do it?
Thanks for your time
Regards
Alejandro
Ps. Maybe it is a little crazy idea but it's worth a try :)
Debug.LogError:
"AssertionException: Assertion failure. Value was Null
Expected: Value was not Null
When a dockable is docked, its dockedPosition must be valid.
UnityEngine.Assertions.Assert.Fail (System.String message, System.String userMessage) (at <23a7799da2e941b88c6db790c607d655>:0)
UnityEngine.Assertions.Assert.IsNotNull (UnityEngine.Object value, System.String message) (at <23a7799da2e941b88c6db790c607d655>:0)
UnityEngine.Assertions.Assert.IsNotNull[T] (T value, System.String message) (at <23a7799da2e941b88c6db790c607d655>:0)
Microsoft.MixedReality.Toolkit.Experimental.UI.Dockable.Update () (at Assets/MRTK/SDK/Experimental/Dock/Dockable.cs:133)"
Steps to reproduce the issue:
Add two buttons to the MRTK-Quest_2.4_Extra scene.
Button one: hide the DockExample object group in the Hierarchy panel via GameObject.SetActive(false).
Button two: show the DockExample object group in the Hierarchy panel via GameObject.SetActive(true).
Click button one to hide the DockExample object group, then click button two to show the group again; the Debug.LogError will keep coming.
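A possible defensive workaround, sketched against MRTK's experimental Dock feature (the `dockedPosition` field name comes from the assertion above; the `dockingState`/`DockingState` names are assumptions to verify against your MRTK version): instead of asserting, detect that the docked position was destroyed while the hierarchy was inactive and leave the docked state.

```csharp
// Hypothetical guard near the top of Dockable.Update() (around Dockable.cs:133).
// If the DockPosition was destroyed while the DockExample hierarchy was toggled
// off and back on, bail out of the docked state instead of failing the assertion.
if (dockingState == DockingState.Docked && dockedPosition == null)
{
    dockingState = DockingState.Undocked; // assumed enum/field names; verify in your MRTK version
    return;
}
```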
The camera always starts from the (0,0,0) position. In the XR Interaction Toolkit it is based on the local position of the camera's parent GameObject, but here it is overridden by the main position, so I can't make the user start from a specific point in the scene.
I couldn't find a way to do it. I also want to create snap teleportation points, so I need some control over it.
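One approach worth trying, sketched under the assumption of a standard MRTK rig: move the MixedRealityPlayspace (the camera's parent transform that MRTK manages) rather than the camera itself, since MRTK re-drives the camera's local pose from head tracking every frame. `spawnPoint` is a hypothetical Transform placed in the scene.

```csharp
using UnityEngine;
using Microsoft.MixedReality.Toolkit;

// Minimal sketch: offset the MRTK playspace so the user starts at a chosen
// spawn point. The camera's tracked local pose is applied on top of this.
public class SpawnAtPoint : MonoBehaviour
{
    [SerializeField] private Transform spawnPoint; // hypothetical scene marker

    private void Start()
    {
        MixedRealityPlayspace.Position = spawnPoint.position;
        MixedRealityPlayspace.Rotation = spawnPoint.rotation;
    }
}
```

The same playspace offset can be reused for snap teleportation points by setting it whenever a teleport target is selected.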
Objects that have rigidbody physics (the coffee cup, the cubes etc.) stutter badly when they're grabbed and moved. The stuttering disappears when I let go of the object and it just falls to the floor smoothly.
The objects that don't have any physics are fine when grabbed. The floaty coffee cup that doesn't have gravity moves smoothly when manipulated.
Interestingly enough, it isn't very noticeable in recordings, which makes me think this is some sort of framerate issue.
I just downloaded MRTK-Quest_v120.apk from the releases and ran it on my quest 2 (v23) so nothing was changed on my end.
See #60 (comment)
and a little more detail in Slack
https://holodevelopers.slack.com/archives/CTW7K59U4/p1591902858020400?thread_ts=1591456024.474400&cid=CTW7K59U4
Hi,
Thanks for this awesome project :]
It would be great if you could provide this as a package that can be added through the Package Manager as a Git URL.
When interacting with the far pointer on UI buttons or MRTK buttons (in hand mode), a single pinch action results in multiple OnClick events being triggered. Is there anything I am doing wrong? Near interactions work fine, as expected.
I am using Unity 2021.03.01f and the following submodules
Oculus Integration 28.0
Mixed Reality Toolkit v2.6
Oculus XR Plugin 1.8.1
Oculus Link uses a different rendering path than builds for Quest, and the hand shader doesn't seem to be single pass instanced compatible.
MRTK-Audio is currently a submodule, which makes it more complicated to package up MRTK-Quest Core easily, as Core depends on two clips from MRTK-Audio for the teleport.
TODO:
First of all, thanks for this great extension of MRTK. I love how easy it makes using MRTK on the Oculus Quest. It brings together the best of both worlds.
I am fairly new to all this. I'm currently doing some experiments with the Quest and testing out different input methods. MRTK also supports speech input, but I could not make it run on the Oculus Quest. Since the Readme says "Full support for any interaction in the MRTK designed to work for HoloLens 2", I was not sure whether I did something wrong or whether this is still a work in progress.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/Input/Speech.html
I tested this on the Oculus by deploying the test scene, but also by using Oculus Link and an external microphone. I saw that there are voice commands on some buttons, but none of them seem to work. So the question is: how can I make speech input work?
Actually, CheckCapability in OculusQuestInputManager always returns true when we're trying to find out whether hand tracking is enabled:
public bool CheckCapability(MixedRealityCapability capability)
{
return (capability == MixedRealityCapability.ArticulatedHand);
}
So, if we want a specific behavior without hand tracking, this is not possible (via the MRTK profile).
Maybe an "Enable Hand Tracking" option should be exposed in OculusQuestInputManager?
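A possible sketch of the fix: gate the capability on the runtime's actual hand-tracking state instead of returning true unconditionally. `OVRPlugin.GetHandTrackingEnabled()` is part of the Oculus Integration; verify it is available in the version you use.

```csharp
// Sketch: report the ArticulatedHand capability only when the Oculus runtime
// actually has hand tracking enabled, so MRTK profiles can branch on it.
public bool CheckCapability(MixedRealityCapability capability)
{
    return capability == MixedRealityCapability.ArticulatedHand
        && OVRPlugin.GetHandTrackingEnabled();
}
```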
Oculus v18 adds voice command support to the platform. It would be great to route the voice command provider through the MRTK speech recognition system.
Hi Eric,
I know this project is "discontinued", but I'm asking for advice rather than pointing out issues.
Your project is pretty much the only one I could find where the hand's PokePointer also follows the index finger if controllers are being used, and it's the only one where the hand menu opens when used with controllers (HandConstraintPalmUp).
Both are things that are pretty neat, but I can't get them to work... I looked through your v1.2.0 project and compared it with mine, and it seems that the relevant files have mostly not changed; yet my poke pointer does not follow my hand if I use my controller, and the menu only appears with my workaround, in a somewhat unstable manner.
Any chance you can point me in the right direction? Did you use the articulated hand for the controller poses? That would probably allow for the above. Your input profile (Input Data Providers) also differs quite a bit, yet your Input > PokePointer is only active for "articulated hand"; or are profiles still the relevant change?
In any case, thanks for the great work of basically porting mrtk to quest! And for projects like mrtk+normcore!
Cheers
Why can't the keyboard be brought up after pressing the key?
"Here are the overlay docs. I use a second camera (rendering only the UI) outputting to a render texture and assign the RT to the overlay's texture. For the main camera, I set the culling mask to include the overlay layer but not the UI layer. This produces the sharpest text for me."
https://developer.oculus.com/documentation/unity/unity-ovroverlay/?locale=en_US
https://twitter.com/robertncoomber/status/1278671117746659328?s=20
Quest supports various performance options. It would be good to add performance options to the quest configuration window.
Hello, thank you so much for a great toolkit. I am trying to play a video as a tutorial for my app, but in the build the player renders only a black screen with the video's audio. Does it need any additional settings to play the video? Thank you in advance.
When I lower my hand, the hand ray does not follow my hand's direction and disappears instead.
Is this by design? It is very inconvenient for manipulating far objects below my hand.
There should be haptic feedback for the following actions.
Hi, I am very excited about using this project in my VR Health project.
Since URP is very important for reducing draw calls and keeping up the FPS in the game....
For that reason, is the MRTK-Quest project compatible with Unity URP?
If not, could I use it in my own URP project and add articulated hand tracking, plus simulated hand tracking using controllers with avatar hands?
Thanks for your time
Best Regards
Alejandro
Ps. Sorry if this is not the correct place to add this post, but I did not know where to ask... :)
Add support for avatar hands when using controllers to emulate the use of hands.
When detecting and losing hands, I receive about 3 different errors completely at random.
In the MRTK-Quest_BasicSetup scene, I randomly get spammed with this because the Controller is null:
NullReferenceException: Object reference not set to an instance of an object
Microsoft.MixedReality.Toolkit.Input.LinePointer.get_IsInteractionEnabled () (at Assets/MixedRealityToolkit/MRTK/SDK/Features/UX/Scripts/Pointers/LinePointer.cs:56)
Microsoft.MixedReality.Toolkit.Input.MixedRealityInputModule.Process () (at Assets/MixedRealityToolkit/MRTK/Services/InputSystem/MixedRealityInputModule.cs:123)
UnityEngine.EventSystems.EventSystem.Update () (at C:/Program Files/Unity/Hub/Editor/2019.3.14f1/Editor/Data/Resources/PackageManager/BuiltInPackages/com.unity.ugui/Runtime/EventSystem/EventSystem.cs:377)
I am using the official teleport pointer rather than the shipped one, and it also causes two errors every now and then:
TeleportCursor.cs: 96
Debug.LogError($"{pointer.GetType().Name} has not been registered! " + pointer.PointerId);
This is because the FocusProvider does not have the requested pointer. And finally:
TeleportPointer.cs: 355
if (eventData.SourceId == InputSourceParent.SourceId &&
eventData.Handedness == Handedness &&
eventData.MixedRealityInputAction == teleportAction)
This is because InputSourceParent is null by the time this is called.
In sum, the errors lead to the pointers still being active for at least one frame (the first error can get spammed quite a lot) after a source has been lost.
It was beyond me to find out why this is. Most of the missing references happen for objects that shouldn't be in the dictionary they were iterated over, because the dictionary is altered in OnSourceDetected and OnSourceLost.
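The usual fix for this class of bug, sketched generically (the names `pointersBySource` and `sourceIsLost` are illustrative, not MRTK's actual fields): defer dictionary removals until after iteration instead of mutating the collection inside the loop, which invalidates the enumerator in C#.

```csharp
using System;
using System.Collections.Generic;

static class SourcePruning
{
    // Illustrative sketch: remove entries whose input source was lost without
    // mutating the dictionary while enumerating it.
    public static void PruneLostSources(Dictionary<uint, string> pointersBySource,
                                        Func<uint, bool> sourceIsLost)
    {
        var toRemove = new List<uint>();

        foreach (var entry in pointersBySource)
        {
            if (sourceIsLost(entry.Key))
            {
                toRemove.Add(entry.Key); // defer: Remove() here would invalidate the enumerator
            }
        }

        foreach (var key in toRemove)
        {
            pointersBySource.Remove(key);
        }
    }
}
```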
Hi, it seems like the movement of the pan is inverted in the Hand Interaction Pan Zoom component (the UV moves up and down when moving the hand left and right).
I was able to fix this by changing line 373 of the script from:
Vector2 uvDelta = new Vector2(totalUVOffset.x, -totalUVOffset.y);
to
Vector2 uvDelta = new Vector2(-totalUVOffset.y, -totalUVOffset.x);
I will do some further testing to see if it's related to MRTK or just MRTK-Quest.
Hi, I followed your instructions and the FAQ and imported the scene, but the hands are still not able to interact with anything. What am I possibly missing?
" I see both controllers and OVR hands. When I put the controllers down, switch to hand-tracking and then back to using controllers, the issue goes away."
Temp workaround:
Issue found with:
After the hand is no longer tracked and then returns, the pointers never return either.
Hi All,
I am trying the MRTK-Quest_Development scene through Unity with the Quest connected via the Oculus Link cable, and I saw the hand-tracking feature working this way. After that, I built it to run on my Oculus Quest (without any Link cable). It built "successfully", but when I opened the app in my headset I could not see hand tracking (the feature is enabled on my Quest headset). Then I tried again and got a message saying that to use this app I have to switch to the controllers feature. Could anybody help me fix this?
Thanks for your time
Regards
Alejandro
Currently hands extend the Controller abstract class in MRTK, not BaseHand.
Controller expects the implementation to update Velocity and AngularVelocity, which it does not do.
BaseHand does take care of setting these with a custom algorithm that remains to be verified.
This data is necessary for users looking to implement throwing objects using hand tracking.
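For context, velocity estimation can be sketched by finite differences of the palm pose between frames. This is a minimal illustration, not MRTK's actual BaseHand algorithm; in practice the samples are usually smoothed over several frames.

```csharp
using UnityEngine;

// Minimal sketch: estimate linear and angular velocity from successive poses.
public class HandVelocityEstimator
{
    private Vector3 lastPosition;
    private Quaternion lastRotation = Quaternion.identity;

    public Vector3 Velocity { get; private set; }
    public Vector3 AngularVelocity { get; private set; } // radians per second

    public void UpdatePose(Vector3 position, Quaternion rotation, float deltaTime)
    {
        if (deltaTime <= 0f) return;

        Velocity = (position - lastPosition) / deltaTime;

        // Delta rotation for this frame, converted to an axis-angle rate.
        (rotation * Quaternion.Inverse(lastRotation)).ToAngleAxis(out float angle, out Vector3 axis);
        AngularVelocity = axis * (angle * Mathf.Deg2Rad / deltaTime);

        lastPosition = position;
        lastRotation = rotation;
    }
}
```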
The MRTK port to Oculus hand tracking is excellent!
Now, I would like to grab a weapon and shoot with the index finger.
Since the grabbing works with the index finger, the shooting is not possible.
I tried to modify the OculusQuestHands.cs file, focusing on the section containing
isIndexGrabbing = HandPoseUtils.IsIndexGrabbing(ControllerHandedness);
and
float indexFingerCurl = HandPoseUtils.IndexFingerCurl(ControllerHandedness);
However, it did not work as expected. Have you got any other ideas?
Given that controllers emulate hands, the controller implementation currently does not produce bounds that make sense.
Because of this, hand menus do not currently work as expected.
Oculus just released the V14 integration. Will have to test to make sure hand tracking joint rotations haven't changed.
The build process aborts with several errors. One of them is the following:
`Shader error in 'Mixed Reality Toolkit/Standard': Program 'frag', error X8000: D3D11 Internal Compiler Error: Invalid Bytecode: Incompatible min precision type for operand #1 of opcode #39 (counts are 1-based). Expected int or uint. at line 134 (on gles)
Compiling Vertex program with _DISABLE_ALBEDO_MAP _CLIPPING_BOX _ROUND_CORNERS _IGNORE_Z_SCALE
Platform defines: UNITY_NO_DXT5nm UNITY_NO_RGBM UNITY_ENABLE_REFLECTION_BUFFERS UNITY_FRAMEBUFFER_FETCH_AVAILABLE UNITY_NO_CUBEMAP_ARRAY UNITY_NO_SCREENSPACE_SHADOWS UNITY_PBS_USE_BRDF3 UNITY_NO_FULL_STANDARD_SHADER SHADER_API_MOBILE UNITY_HARDWARE_TIER1 UNITY_COLORSPACE_GAMMA UNITY_HALF_PRECISION_FRAGMENT_SHADER_REGISTERS UNITY_LIGHTMAP_DLDR_ENCODING UNITY_PASS_FORWARDBASE
Disabled keywords: LIGHTMAP_ON _CLIPPING_PLANE _CLIPPING_SPHERE _ALPHATEST_ON _ALPHABLEND_ON _METALLIC_TEXTURE_ALBEDO_CHANNEL_A _SMOOTHNESS_TEXTURE_ALBEDO_CHANNEL_A _CHANNEL_MAP _NORMAL_MAP _EMISSION _TRIPLANAR_MAPPING _LOCAL_SPACE_TRIPLANAR_MAPPING _DIRECTIONAL_LIGHT _SPECULAR_HIGHLIGHTS _SPHERICAL_HARMONICS _REFLECTIONS _REFRACTION _RIM_LIGHT _VERTEX_COLORS _VERTEX_EXTRUSION _VERTEX_EXTRUSION_SMOOTH_NORMALS _CLIPPING_BORDER _NEAR_PLANE_FADE _NEAR_LIGHT_FADE _HOVER_LIGHT _HOVER_COLOR_OVERRIDE _PROXIMITY_LIGHT _PROXIMITY_LIGHT_COLOR_OVERRIDE _PROXIMITY_LIGHT_SUBTRACTIVE _PROXIMITY_LIGHT_TWO_SIDED _INDEPENDENT_CORNERS _BORDER_LIGHT _BORDER_LIGHT_USES_HOVER_COLOR _BORDER_LIGHT_REPLACES_ALBEDO _BORDER_LIGHT_OPAQUE _INNER_GLOW _IRIDESCENCE _ENVIRONMENT_COLORING _INSTANCED_COLOR INSTANCING_ON UNITY_ENABLE_NATIVE_SHADOW_LOOKUPS UNITY_METAL_SHADOWS_USE_POINT_FILTERING UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS UNITY_PBS_USE_BRDF1 UNITY_PBS_USE_BRDF2 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING UNITY_ENABLE_DETAIL_NORMALMAP UNITY_HARDWARE_TIER2 UNITY_HARDWARE_TIER3 UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_LIGHTMAP_RGBM_ENCODING UNITY_LIGHTMAP_FULL_HDR UNITY_VIRTUAL_TEXTURING
`
All errors are of the same kind and affect the same shader.
Is this repo abandoned already? Are there any plans to update to the latest MRTK release?
Just thinking out loud here, but it would be so cool to have this running with PUN2 + voice chat.
That would make for a super functional VR multiplayer framework.
Would you please consider adding this feature in the future?
Hi! Great repository, I have been tinkering with it for a while now.
However, I have not yet managed to make the PokePointer element work with every finger.
What am I missing? It should be easy to implement, as each finger has its own skeleton with iterable names.
I have also tried using the HandsManager prefab from the Oculus framework along with the InteractableToolsSDKDriver. While my attempt was to no avail, perhaps these controllers could be used to define the positions of new PokePointer elements.
Sorry for the inconvenience and thanks in advance.
Couple of minor missing bits:
Hi,
It seems that it isn't compatible with the latest MRTK.
Sometimes when hot-swapping controllers for hands and vice versa, or simply when losing tracking on controllers, pointer rays may persist, leaving visual pollution in the scene.
The current mappings for the motion controllers are insufficient:
public override MixedRealityInteractionMapping[] DefaultInteractions => new[]
{
new MixedRealityInteractionMapping(0, "Spatial Pointer", AxisType.SixDof, DeviceInputType.SpatialPointer, new MixedRealityInputAction(4, "Pointer Pose", AxisType.SixDof)),
new MixedRealityInteractionMapping(1, "Spatial Grip", AxisType.SixDof, DeviceInputType.SpatialGrip, new MixedRealityInputAction(3, "Grip Pose", AxisType.SixDof)),
new MixedRealityInteractionMapping(2, "Select", AxisType.Digital, DeviceInputType.Select, new MixedRealityInputAction(1, "Select", AxisType.Digital)),
new MixedRealityInteractionMapping(3, "Grab", AxisType.SingleAxis, DeviceInputType.TriggerPress, new MixedRealityInputAction(7, "Grip Press", AxisType.SingleAxis)),
new MixedRealityInteractionMapping(4, "Index Finger Pose", AxisType.SixDof, DeviceInputType.IndexFinger, new MixedRealityInputAction(13, "Index Finger Pose", AxisType.SixDof)),
};
should be more like:
/// <inheritdoc />
public override MixedRealityInteractionMapping[] DefaultLeftHandedInteractions => new[]
{
new MixedRealityInteractionMapping(0, "Spatial Pointer", AxisType.SixDof, DeviceInputType.SpatialPointer, new MixedRealityInputAction(4, "Pointer Pose", AxisType.SixDof)),
new MixedRealityInteractionMapping(1, "Spatial Grip", AxisType.SixDof, DeviceInputType.SpatialGrip, new MixedRealityInputAction(3, "Grip Pose", AxisType.SixDof)),
new MixedRealityInteractionMapping(2, "Axis1D.PrimaryIndexTrigger", AxisType.SingleAxis, DeviceInputType.Trigger, new MixedRealityInputAction(6, "Trigger", AxisType.SingleAxis), axisCodeX: ControllerMappingLibrary.AXIS_9),
new MixedRealityInteractionMapping(3, "Axis1D.PrimaryIndexTrigger Touch", AxisType.Digital, DeviceInputType.TriggerTouch, KeyCode.JoystickButton14),
new MixedRealityInteractionMapping(4, "Axis1D.PrimaryIndexTrigger Near Touch", AxisType.SingleAxis, DeviceInputType.TriggerNearTouch, ControllerMappingLibrary.AXIS_13),
new MixedRealityInteractionMapping(5, "Axis1D.PrimaryIndexTrigger Press", AxisType.Digital, DeviceInputType.TriggerPress, new MixedRealityInputAction(1, "Select", AxisType.Digital), KeyCode.None, axisCodeX: ControllerMappingLibrary.AXIS_9),
new MixedRealityInteractionMapping(6, "Axis1D.PrimaryHandTrigger Press", AxisType.SingleAxis, DeviceInputType.TriggerPress, new MixedRealityInputAction(7, "Grip Press", AxisType.SingleAxis), axisCodeX: ControllerMappingLibrary.AXIS_11),
new MixedRealityInteractionMapping(7, "Axis2D.PrimaryThumbstick", AxisType.DualAxis, DeviceInputType.ThumbStick, new MixedRealityInputAction(5, "Teleport Direction", AxisType.DualAxis), axisCodeX: ControllerMappingLibrary.AXIS_1, axisCodeY: ControllerMappingLibrary.AXIS_2, invertYAxis: true),
new MixedRealityInteractionMapping(8, "Button.PrimaryThumbstick Touch", AxisType.Digital, DeviceInputType.ThumbStickTouch, KeyCode.JoystickButton16),
new MixedRealityInteractionMapping(9, "Button.PrimaryThumbstick Near Touch", AxisType.Digital, DeviceInputType.ThumbNearTouch, ControllerMappingLibrary.AXIS_15),
new MixedRealityInteractionMapping(10, "Button.PrimaryThumbstick Press", AxisType.Digital, DeviceInputType.ThumbStickPress, KeyCode.JoystickButton8),
new MixedRealityInteractionMapping(11, "Button.Three Press", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton2),
new MixedRealityInteractionMapping(12, "Button.Four Press", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton3),
new MixedRealityInteractionMapping(13, "Button.Start Press", AxisType.Digital, DeviceInputType.ButtonPress, new MixedRealityInputAction(2, "Menu", AxisType.Digital), KeyCode.JoystickButton6),
new MixedRealityInteractionMapping(14, "Button.Three Touch", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton12),
new MixedRealityInteractionMapping(15, "Button.Four Touch", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton13)
};
/// <inheritdoc />
public override MixedRealityInteractionMapping[] DefaultRightHandedInteractions => new[]
{
new MixedRealityInteractionMapping(0, "Spatial Pointer", AxisType.SixDof, DeviceInputType.SpatialPointer, new MixedRealityInputAction(4, "Pointer Pose", AxisType.SixDof)),
new MixedRealityInteractionMapping(1, "Spatial Grip", AxisType.SixDof, DeviceInputType.SpatialGrip, new MixedRealityInputAction(3, "Grip Pose", AxisType.SixDof)),
new MixedRealityInteractionMapping(2, "Axis1D.SecondaryIndexTrigger", AxisType.SingleAxis, DeviceInputType.Trigger, new MixedRealityInputAction(6, "Trigger", AxisType.SingleAxis), axisCodeX: ControllerMappingLibrary.AXIS_10),
new MixedRealityInteractionMapping(3, "Axis1D.SecondaryIndexTrigger Touch", AxisType.Digital, DeviceInputType.TriggerTouch, KeyCode.JoystickButton15),
new MixedRealityInteractionMapping(4, "Axis1D.SecondaryIndexTrigger Near Touch", AxisType.SingleAxis, DeviceInputType.TriggerNearTouch, ControllerMappingLibrary.AXIS_14),
new MixedRealityInteractionMapping(5, "Axis1D.SecondaryIndexTrigger Press", AxisType.Digital, DeviceInputType.TriggerPress, new MixedRealityInputAction(1, "Select", AxisType.Digital), KeyCode.None, axisCodeX: ControllerMappingLibrary.AXIS_10),
new MixedRealityInteractionMapping(6, "Axis1D.SecondaryHandTrigger Press", AxisType.SingleAxis, DeviceInputType.TriggerPress, new MixedRealityInputAction(7, "Grip Press", AxisType.SingleAxis), axisCodeX: ControllerMappingLibrary.AXIS_12),
new MixedRealityInteractionMapping(7, "Axis2D.SecondaryThumbstick", AxisType.DualAxis, DeviceInputType.ThumbStick, new MixedRealityInputAction(5, "Teleport Direction", AxisType.DualAxis), axisCodeX: ControllerMappingLibrary.AXIS_4, axisCodeY: ControllerMappingLibrary.AXIS_5, invertYAxis: true),
new MixedRealityInteractionMapping(8, "Button.SecondaryThumbstick Touch", AxisType.Digital, DeviceInputType.ThumbStickTouch, KeyCode.JoystickButton17),
new MixedRealityInteractionMapping(9, "Button.SecondaryThumbstick Near Touch", AxisType.Digital, DeviceInputType.ThumbNearTouch, ControllerMappingLibrary.AXIS_16),
new MixedRealityInteractionMapping(10, "Button.SecondaryThumbstick Press", AxisType.Digital, DeviceInputType.ThumbStickPress, KeyCode.JoystickButton9),
new MixedRealityInteractionMapping(11, "Button.One Press", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton0),
new MixedRealityInteractionMapping(12, "Button.Two Press", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton1),
new MixedRealityInteractionMapping(13, "Button.One Touch", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton10),
new MixedRealityInteractionMapping(14, "Button.Two Touch", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton11)
};
The controller could additionally be derived from GenericJoystickController, which already handles most of the axis types.
Currently there is no functional example for using the teleport system in MRTK.
Hello again!
I might have found an issue, which could be unexpected behaviour. The general approach for adding Oculus avatars and a camera to the scene is to add an OVR Camera Rig and add a LocalAvatar under "TrackingSpace" in the camera rig. In my specific case I needed to have it this way, because I don't want to track the positions of the LocalAvatar and the CameraRig separately (when, e.g., implementing some custom locomotion approach).
After starting the application, another LocalAvatar is added to the scene. This behaviour is unexpected, since in most cases we only want one LocalAvatar. The reason is easy to find. In MRTKOculusConfig.cs we find this:
/// <summary>
/// Prefab reference for LocalAvatar to load, if none are found in scene.
/// </summary>
public GameObject LocalAvatarPrefab => localAvatarPrefab;
The comment and the actual behaviour differ from each other. This object is used and instantiated in OculusQuestInputManager.cs without checking whether such an object already exists:
if (useAvatarHands)
{
    // Initialize the local avatar controller
    GameObject.Instantiate(MRTKOculusConfig.Instance.LocalAvatarPrefab, cameraRig.trackingSpace);
}
In my project I just commented out the instantiation of the game object. This is of course not a permanent solution. One good approach might be to leave handling of the LocalAvatar fully to the developer and remove the instantiation from the InputManager.
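An alternative to removing the instantiation entirely would be to guard it. This sketch assumes the LocalAvatar prefab carries the Avatar SDK's OvrAvatar component; if it uses a different marker component, substitute that type.

```csharp
// Possible guard in OculusQuestInputManager: only spawn the configured
// LocalAvatar prefab when no avatar already exists under the tracking space,
// so a manually placed LocalAvatar is respected.
if (useAvatarHands &&
    cameraRig.trackingSpace.GetComponentInChildren<OvrAvatar>(true) == null)
{
    GameObject.Instantiate(MRTKOculusConfig.Instance.LocalAvatarPrefab, cameraRig.trackingSpace);
}
```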
Kind regards,
Enes
MRTKOculusConfig is currently a project-wide singleton. Having the ability to update the config's parameters would enable control over input behaviour as well as input visuals. For example, the hand material could be updated per scene based on requirements.