provencher / mrtk-quest

This project forked from hololabinc/mrtkextensionforoculusquest


MRTK-Quest: Mixed Reality Toolkit (MRTK) extension bridge for Oculus Quest + Rift / S

License: Other

Batchfile 0.01% Shell 0.01% C# 95.58% HLSL 0.45% ShaderLab 3.95%

mrtk-quest's People

Contributors

andreibosco, chetu3319, dheinmw, machenmusik, marek-stoj, provencher, rogpodge, tarukosu


mrtk-quest's Issues

Objects Leave Hand When Head Moves

When you grab an object and move it toward or away from your face, it drifts out of your hand. We have posted this issue on the MRTK GitHub page, but they have not responded. We've tried many Unity and MRTK versions, including the newest ones. It makes grabbing objects very inaccurate, but it doesn't appear to be a limitation of the accuracy of the hand tracking itself...

How to export the Hand Tracking Teleporting feature from this MRTK-Quest to my own VR project?

Hi everyone,
I am building rehabilitation treatments as VR games (mainly for Oculus Quest, for its mobility). I am very interested in adding the hand-tracking teleport feature of this MRTK-Quest project to my own project. It would be great for patients who have little mobility: they could move around the game using only their hands. I guess this feature is in the MRTK-Quest_2.4_Extra scene. Would it be possible? Could anybody advise me on how to do it?
Thanks for your time.
Regards,
Alejandro
P.S. Maybe it is a little crazy idea, but it's worth a try :)

Keep getting Debug.LogError when hiding/reloading the DockExample object group

Debug.LogError:
"AssertionException: Assertion failure. Value was Null
Expected: Value was not Null
When a dockable is docked, its dockedPosition must be valid.
UnityEngine.Assertions.Assert.Fail (System.String message, System.String userMessage) (at <23a7799da2e941b88c6db790c607d655>:0)
UnityEngine.Assertions.Assert.IsNotNull (UnityEngine.Object value, System.String message) (at <23a7799da2e941b88c6db790c607d655>:0)
UnityEngine.Assertions.Assert.IsNotNull[T] (T value, System.String message) (at <23a7799da2e941b88c6db790c607d655>:0)
Microsoft.MixedReality.Toolkit.Experimental.UI.Dockable.Update () (at Assets/MRTK/SDK/Experimental/Dock/Dockable.cs:133)"

Steps to reproduce:

  1. Add two buttons to the MRTK-Quest_2.4_Extra scene.
    Button one: hide the DockExample object group in the Hierarchy panel with GameObject.SetActive(false).
    Button two: show the DockExample object group in the Hierarchy panel with GameObject.SetActive(true).

  2. Click button one to hide the DockExample object group, then click button two to show it again; the error above will then be logged repeatedly.

Camera always starts from the (0,0,0) position.

The camera always starts from the (0,0,0) position. In the XR Interaction Toolkit the start position is based on the local position of the camera's parent GameObject, but here it is overridden by the main position, so I can't make the user start from a specific point in the scene.
I couldn't find a way to do it. I also want to create snap teleportation points, so I need some control over this.
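
A common workaround (generic rig math, not an API of this project; the function and parameter names below are illustrative) is to shift the rig/playspace root by the difference between the desired spawn point and the camera's current world position; in MRTK that offset would be applied to the playspace transform:

```csharp
using System;
using System.Numerics;

static class RigRecenter
{
    // World position for the rig/playspace root so that the HMD camera ends up
    // exactly at spawnPoint. All arguments are world-space positions.
    public static Vector3 RigPositionForSpawn(
        Vector3 spawnPoint, Vector3 cameraWorldPos, Vector3 rigWorldPos)
    {
        // Shift the rig by however far the camera is from where we want it.
        return rigWorldPos + (spawnPoint - cameraWorldPos);
    }
}
```

For example, with the rig at the origin and the camera at head height (0, 1.7, 0), spawning at (5, 1.7, 3) means moving the rig root to (5, 0, 3).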

Grabbed objects with physics stutter badly in the demo

Objects that have rigidbody physics (the coffee cup, the cubes etc.) stutter badly when they're grabbed and moved. The stuttering disappears when I let go of the object and it just falls to the floor smoothly.

The objects that don't have any physics are fine when grabbed. The floaty coffee cup that doesn't have gravity moves smoothly when manipulated.

Interestingly enough, it isn't very noticeable in recordings, which makes me think this is some sort of framerate issue.

I just downloaded MRTK-Quest_v120.apk from the releases and ran it on my Quest 2 (v23), so nothing was changed on my end.

BaseControllerPointer.cs error: MRTK-Quest or MRTK?

Null Reference Exception

  • I think this is a joint issue with general MRTK, but I realized that it's triggered by OculusQuestInputManager.cs, which is MRTK-Quest related.

Steps to Reproduce

  • While tethered to the PC, running Oculus Link in Unity's editor.
  • This only seems to happen during the transition between pulling the headset up on my head and pulling it back down while my hands are simultaneously being tracked.

What I think is going on & Quick Fix

  • Line 475 of BaseControllerPointer.cs throws a NullReferenceException.
  • InputSourceParent hasn't been established yet, thus throwing the exception.
  • If a null check is added in BaseControllerPointer.cs before InputSourceParent is referenced, the error goes away.
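
The suggested guard can be sketched in isolation (PointerSketch and TryProcess are illustrative stand-ins, not MRTK types; in the actual fix the check would wrap the InputSourceParent access near line 475 of BaseControllerPointer.cs):

```csharp
using System;

// Illustrative stand-in for the pointer: skip processing until the input
// system has assigned InputSourceParent, instead of dereferencing null.
class PointerSketch
{
    public object InputSourceParent;   // assigned later by the input system

    public bool TryProcess()
    {
        if (InputSourceParent == null) return false;  // the proposed null check
        // ... the original logic that reads InputSourceParent would run here ...
        return true;
    }
}
```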


Provide git Package

Hi,

Thanks for this awesome project :]
It would be great if this could be provided as a package that can be added through the Package Manager as a git URL.

Multiple trigger events from a single pinch gesture

When interacting with the far pointer on UI buttons or MRTK buttons (in hand mode), a single pinch action results in multiple OnClick events being triggered. Is there anything I am doing wrong? Near interactions work fine, as expected.

I am using Unity 2021.03.01f and the following submodules:
Oculus Integration 28.0
Mixed Reality Toolkit v2.6
Oculus XR Plugin 1.8.1

Clean up MRTK-Audio dependency

MRTK-Audio is currently a submodule, which makes it more complicated to easily package up MRTK-Quest Core, as Core depends on two clips from MRTK-Audio for the teleport.

TODO:

  • Move the teleport and background music audio clips into MRTK-Quest Core, and add proper attribution to @joncohenproducer.
  • Remove the MRTK-Audio dependency from MRTK-Quest.

Speech Input in Demo Scene

First of all, thanks for this great extension of MRTK. I love how easy it makes it to use MRTK on Oculus Quest. It brings together the best of two worlds.

I am fairly new to all this. I'm currently doing some experiments with the Quest, testing out different input methods. MRTK also supports speech input, but on Oculus Quest I could not make it run. Since the Readme says "Full support for any interaction in the MRTK designed to work for HoloLens 2", I was not sure whether I did something wrong or whether this is currently still a work in progress.
https://microsoft.github.io/MixedRealityToolkit-Unity/Documentation/Input/Speech.html

I tested this on Oculus by deploying the test scene, and also by using Oculus Link and an external microphone. I saw that there are voice commands on some buttons, but none of them seem to work. So the question is: how can I make speech input work?

CheckCapability always returns true

Actually, CheckCapability of OculusQuestInputManager always returns true when we try to find out whether hand tracking is enabled:

public bool CheckCapability(MixedRealityCapability capability)
{
  return (capability == MixedRealityCapability.ArticulatedHand);
}

So, if we want a specific behavior without hand tracking, this is not possible (via the MRTK profile).
Maybe an "enable hand tracking" option should be exposed in OculusQuestInputManager?
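
A sketch of that option (the enum here is a reduced stand-in for MRTK's MixedRealityCapability so the snippet is self-contained, and the HandTrackingEnabled flag plus its wiring are hypothetical):

```csharp
// Reduced stand-in for MRTK's MixedRealityCapability enum.
enum MixedRealityCapability { ArticulatedHand, MotionController }

class InputManagerSketch
{
    // Hypothetical toggle that OculusQuestInputManager (or MRTKOculusConfig)
    // could expose so a profile can opt out of hand tracking.
    public bool HandTrackingEnabled = true;

    public bool CheckCapability(MixedRealityCapability capability)
    {
        // Only report articulated-hand support while the toggle is on.
        return HandTrackingEnabled
            && capability == MixedRealityCapability.ArticulatedHand;
    }
}
```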

NearInteraction with PokePointer on Quest-Controller

Hi Eric,
I know this project is "discontinued", but I'm rather asking for advice, than pointing out issues.
Your project is pretty much the only one I could find where the hand's PokePointer also follows the index finger when controllers are being used, and the only one where the hand menu opens when used with controllers (HandConstraintPalmUp).
Both are things that are pretty neat, but I can't get them to work... I looked through your v1.2.0 project and compared it with mine; the relevant files have mostly not changed, and yet my poke pointer does not follow my hand when I use my controller, and the menu only appears with my workaround, in a somewhat unstable manner.

Any chance you can point me in the right direction? Did you use the articulated hand for the controller poses? That would probably allow for the above. Your input profile (Input Data Providers) also differs quite a bit, and yet your Input > PokePointer is only active for "articulated hand" - or are the profiles still the relevant change?

In any case, thanks for the great work of basically porting mrtk to quest! And for projects like mrtk+normcore!
Cheers

Video player renders only a black texture

Hello, thank you so much for a great toolkit. I am trying to play a video as a tutorial for my app, but in the build the player renders only a black screen with the video's audio. Does it need any additional settings to play the video? Thank you in advance.

The hand ray cannot point downwards?

When I lower my hand, the hand ray does not follow my hand's direction and disappears instead.

Is this by design? It is very inconvenient to manipulate far objects below my hand.

Support for haptics

There should be haptic feedback for the following actions.

  • Pointer Enter
  • Pointer Exit
  • Pointer Down
  • Pointer Up
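
As a starting point, these events could map to per-event vibration pulses. The sketch below is illustrative: the enum mirrors the list above, the frequency/amplitude values are made-up defaults, and on Quest the actual pulse would be fired with OVRInput.SetControllerVibration:

```csharp
// Pointer events from the list above, as a stand-in enum.
enum PointerHapticEvent { Enter, Exit, Down, Up }

static class HapticSketch
{
    // Illustrative (frequency, amplitude) pairs in [0, 1]; strongest on press,
    // subtle ticks on hover enter/exit.
    public static (float Frequency, float Amplitude) PulseFor(PointerHapticEvent e)
    {
        switch (e)
        {
            case PointerHapticEvent.Down:  return (0.5f, 0.8f);
            case PointerHapticEvent.Up:    return (0.5f, 0.4f);
            case PointerHapticEvent.Enter: return (0.25f, 0.2f);
            default:                       return (0.25f, 0.1f); // Exit
        }
    }
}
```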

Is this Project URP compatible?

Hi, I am very excited about using this project in my VR Health project.
Since URP is very important to reduce draw calls and improve FPS in the game...
For that reason: is the MRTK-Quest project compatible with Unity URP?
Or, if it is not, could I use it in my own URP project and add the articulated hand tracking, and simulate hand tracking using controllers with avatar hands?
Thanks for your time
Best Regards
Alejandro
P.S. Sorry if this is not the correct place for this post, but I did not know where to ask... :)

Hand detection errors in different places

When detecting and losing hands, I receive about three different errors, completely at random.
In the MRTK-Quest_BasicSetup scene I randomly get spammed with this because the Controller is null:

NullReferenceException: Object reference not set to an instance of an object
Microsoft.MixedReality.Toolkit.Input.LinePointer.get_IsInteractionEnabled () (at Assets/MixedRealityToolkit/MRTK/SDK/Features/UX/Scripts/Pointers/LinePointer.cs:56)
Microsoft.MixedReality.Toolkit.Input.MixedRealityInputModule.Process () (at Assets/MixedRealityToolkit/MRTK/Services/InputSystem/MixedRealityInputModule.cs:123)
UnityEngine.EventSystems.EventSystem.Update () (at C:/Program Files/Unity/Hub/Editor/2019.3.14f1/Editor/Data/Resources/PackageManager/BuiltInPackages/com.unity.ugui/Runtime/EventSystem/EventSystem.cs:377)

I am not using the shipped teleport pointer but the official one, and that one also causes two errors every now and then:

TeleportCursor.cs: 96
Debug.LogError($"{pointer.GetType().Name} has not been registered! " + pointer.PointerId);

This is because the FocusProvider does not have the requested pointer. And finally:

TeleportPointer.cs: 355
                    if (eventData.SourceId == InputSourceParent.SourceId &&
                    eventData.Handedness == Handedness &&
                    eventData.MixedRealityInputAction == teleportAction)

This is because InputSourceParent is null by the time this is called.

In sum, the errors lead to the pointers still being active for at least one frame after a source has been lost (the first error can get spammed quite a lot).
It was beyond me to find out why this happens. Most of the missing references occur for objects that shouldn't be in the dictionary they were iterated over, because the dictionary is altered in OnSourceDetected and OnSourceLost.

Hand Interaction Pan Zoom moves in the wrong direction

Hi, it seems the pan movement is wrong in the Hand Interaction Pan Zoom component (the UV moves up and down when the hand moves left and right).
I was able to fix this by changing line 373 of the script from:

Vector2 uvDelta = new Vector2(totalUVOffset.x, -totalUVOffset.y);

to

Vector2 uvDelta = new Vector2(-totalUVOffset.y, -totalUVOffset.x);

I will do some further testing to see whether it's related to MRTK or just MRTK-Quest.

Hands visible but not working

Hi, I followed your instructions and FAQ and imported the scene, but the hands are still not able to interact with anything. What am I possibly missing?

When starting a scene with controllers, dummy hands appear on controllers


" I see both controllers and OVR hands. When I put the controllers down, switch to hand-tracking and then back to using controllers, the issue goes away."

Temp workaround:

  • Set the confidence behavior on the OVRHand prefab to ToggleRenderer. This will bypass the MRTK-Quest confidence behavior.

Issue found with:

  • MRTK-Quest 0.6.1
  • Oculus Integration V14

Hand tracking is not working in the Oculus Quest headset

Hi all,
I tried the MRTK-Quest_Development scene through Unity and the Quest over an Oculus Link cable, and I saw the hand-tracking feature working that way. After that, I built the app to run it on my Oculus Quest (without a Link cable). The build completed "successfully", but when I opened the app in my headset I could not see hand tracking (the feature is enabled in my Quest headset). When I tried again, I got a message saying that to use this app I have to switch to controllers. Could anybody help me fix this?
Thanks for your time
Regards
Alejandro

Proper support of velocity and angular velocity for hand tracking

Currently, hands extend the Controller abstract class in MRTK, not BaseHand.
Controller expects the implementation to update velocity and angular velocity, which it does not do.
BaseHand does take care of setting these with a custom algorithm, which remains to be verified.

This data is necessary for users looking to implement throwing of objects using hand tracking.
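
A minimal finite-difference sketch of what such an implementation could compute each frame. This is generic pose math under stated assumptions, not BaseHand's algorithm, and System.Numerics types stand in for Unity's:

```csharp
using System;
using System.Numerics;

static class HandVelocity
{
    // Linear velocity from two successive palm positions (finite difference).
    public static Vector3 EstimateVelocity(Vector3 prev, Vector3 curr, float dt)
        => (curr - prev) / dt;

    // Angular velocity (axis * radians/second) from two successive rotations.
    public static Vector3 EstimateAngularVelocity(Quaternion prev, Quaternion curr, float dt)
    {
        Quaternion dq = curr * Quaternion.Inverse(prev); // relative rotation prev -> curr
        if (dq.W < 0f) dq = Quaternion.Negate(dq);       // take the shorter arc
        float angle = 2f * MathF.Acos(Math.Clamp(dq.W, -1f, 1f));
        float s = MathF.Sqrt(MathF.Max(0f, 1f - dq.W * dq.W));
        if (s < 1e-6f) return Vector3.Zero;              // negligible rotation
        Vector3 axis = new Vector3(dq.X, dq.Y, dq.Z) / s;
        return axis * (angle / dt);
    }
}
```

In practice the raw estimates would likely need smoothing over a few frames before being useful for throwing.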

Grabbing only with the middle finger

The MRTK port to Oculus hand tracking is excellent!
Now I would like to grab a weapon and shoot with the index finger.
Since grabbing works with the index finger, shooting is not possible.
I tried to modify the OculusQuestHands.cs file, focusing on the section containing

isIndexGrabbing = HandPoseUtils.IsIndexGrabbing(ControllerHandedness);
and
float indexFingerCurl = HandPoseUtils.IndexFingerCurl(ControllerHandedness);

However, it did not work as expected. Do you have any other ideas?
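
For what it's worth, once a middle-finger curl value is available (MRTK's HandPoseUtils exposes per-finger curl helpers alongside IndexFingerCurl), the gating itself is simple. The threshold below is an illustrative guess, not a tuned value:

```csharp
static class GrabSketch
{
    // Illustrative threshold; curl values are assumed normalized to [0, 1],
    // as with MRTK's HandPoseUtils per-finger curl helpers.
    const float CurlThreshold = 0.6f;

    // Grab with the middle finger, leaving the index finger free to shoot.
    public static bool IsMiddleFingerGrabbing(float middleCurl)
        => middleCurl > CurlThreshold;

    public static bool IsIndexTriggerPulled(float indexCurl)
        => indexCurl > CurlThreshold;
}
```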

Error when building project

The build process aborts with several errors. One of them is the following:

`Shader error in 'Mixed Reality Toolkit/Standard': Program 'frag', error X8000: D3D11 Internal Compiler Error: Invalid Bytecode: Incompatible min precision type for operand #1 of opcode #39 (counts are 1-based). Expected int or uint. at line 134 (on gles)

Compiling Vertex program with _DISABLE_ALBEDO_MAP _CLIPPING_BOX _ROUND_CORNERS _IGNORE_Z_SCALE
Platform defines: UNITY_NO_DXT5nm UNITY_NO_RGBM UNITY_ENABLE_REFLECTION_BUFFERS UNITY_FRAMEBUFFER_FETCH_AVAILABLE UNITY_NO_CUBEMAP_ARRAY UNITY_NO_SCREENSPACE_SHADOWS UNITY_PBS_USE_BRDF3 UNITY_NO_FULL_STANDARD_SHADER SHADER_API_MOBILE UNITY_HARDWARE_TIER1 UNITY_COLORSPACE_GAMMA UNITY_HALF_PRECISION_FRAGMENT_SHADER_REGISTERS UNITY_LIGHTMAP_DLDR_ENCODING UNITY_PASS_FORWARDBASE
Disabled keywords: LIGHTMAP_ON _CLIPPING_PLANE _CLIPPING_SPHERE _ALPHATEST_ON _ALPHABLEND_ON _METALLIC_TEXTURE_ALBEDO_CHANNEL_A _SMOOTHNESS_TEXTURE_ALBEDO_CHANNEL_A _CHANNEL_MAP _NORMAL_MAP _EMISSION _TRIPLANAR_MAPPING _LOCAL_SPACE_TRIPLANAR_MAPPING _DIRECTIONAL_LIGHT _SPECULAR_HIGHLIGHTS _SPHERICAL_HARMONICS _REFLECTIONS _REFRACTION _RIM_LIGHT _VERTEX_COLORS _VERTEX_EXTRUSION _VERTEX_EXTRUSION_SMOOTH_NORMALS _CLIPPING_BORDER _NEAR_PLANE_FADE _NEAR_LIGHT_FADE _HOVER_LIGHT _HOVER_COLOR_OVERRIDE _PROXIMITY_LIGHT _PROXIMITY_LIGHT_COLOR_OVERRIDE _PROXIMITY_LIGHT_SUBTRACTIVE _PROXIMITY_LIGHT_TWO_SIDED _INDEPENDENT_CORNERS _BORDER_LIGHT _BORDER_LIGHT_USES_HOVER_COLOR _BORDER_LIGHT_REPLACES_ALBEDO _BORDER_LIGHT_OPAQUE _INNER_GLOW _IRIDESCENCE _ENVIRONMENT_COLORING _INSTANCED_COLOR INSTANCING_ON UNITY_ENABLE_NATIVE_SHADOW_LOOKUPS UNITY_METAL_SHADOWS_USE_POINT_FILTERING UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS UNITY_PBS_USE_BRDF1 UNITY_PBS_USE_BRDF2 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING UNITY_ENABLE_DETAIL_NORMALMAP UNITY_HARDWARE_TIER2 UNITY_HARDWARE_TIER3 UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_LIGHTMAP_RGBM_ENCODING UNITY_LIGHTMAP_FULL_HDR UNITY_VIRTUAL_TEXTURING
`

All errors are of the same kind and affect the same shader.

Deprecated?

Is this repo abandoned already? Are there any plans to update to the latest MRTK release?

Connect with PUN2

Just thinking out loud here, but it would be so cool to have this running with PUN2 + voice chat.
That would make for a super-functional VR multiplayer framework.
Would you please consider adding this feature in the future?

PokePointer for every finger

Hi! Great repository, I have been tinkering with it for a while now.
However, I have not yet managed to make the PokePointer element work with every finger.
What am I missing? It should be easy to implement, as each finger has its own skeleton with iterable names.

I have also tried to use the HandsManager prefab from the Oculus framework along with the InteractableToolsSDKDriver. While my attempt was to no avail, perhaps you could use these controllers to define the position of new PokePointer elements.

Sorry for the inconvenience and thanks in advance.

Missing prefabs and unlinked tooltips

A couple of minor missing bits:

  • All "PressableButtonHoloLens2ToggleRadio" prefabs are missing under "RadioGroup96x32" in "MRTK-Quest_Development" scene
  • "EarthCore" object is missing tooltip links in On Manipulation events in both sample scenes.

Insufficient Controller mapping

The current mappings for the motion controllers are insufficient:

        public override MixedRealityInteractionMapping[] DefaultInteractions => new[]
        {
            new MixedRealityInteractionMapping(0, "Spatial Pointer", AxisType.SixDof, DeviceInputType.SpatialPointer, new MixedRealityInputAction(4, "Pointer Pose", AxisType.SixDof)),
            new MixedRealityInteractionMapping(1, "Spatial Grip", AxisType.SixDof, DeviceInputType.SpatialGrip, new MixedRealityInputAction(3, "Grip Pose", AxisType.SixDof)),
            new MixedRealityInteractionMapping(2, "Select", AxisType.Digital, DeviceInputType.Select, new MixedRealityInputAction(1, "Select", AxisType.Digital)),
            new MixedRealityInteractionMapping(3, "Grab", AxisType.SingleAxis, DeviceInputType.TriggerPress, new MixedRealityInputAction(7, "Grip Press", AxisType.SingleAxis)),
            new MixedRealityInteractionMapping(4, "Index Finger Pose", AxisType.SixDof, DeviceInputType.IndexFinger,  new MixedRealityInputAction(13, "Index Finger Pose", AxisType.SixDof)),
        };

should be more like:

        /// <inheritdoc />
        public override MixedRealityInteractionMapping[] DefaultLeftHandedInteractions => new[]
        {
            new MixedRealityInteractionMapping(0, "Spatial Pointer", AxisType.SixDof, DeviceInputType.SpatialPointer, new MixedRealityInputAction(4, "Pointer Pose", AxisType.SixDof)),
            new MixedRealityInteractionMapping(1, "Spatial Grip", AxisType.SixDof, DeviceInputType.SpatialGrip, new MixedRealityInputAction(3, "Grip Pose", AxisType.SixDof)),
            new MixedRealityInteractionMapping(2, "Axis1D.PrimaryIndexTrigger", AxisType.SingleAxis, DeviceInputType.Trigger, new MixedRealityInputAction(6, "Trigger", AxisType.SingleAxis), axisCodeX: ControllerMappingLibrary.AXIS_9),
            new MixedRealityInteractionMapping(3, "Axis1D.PrimaryIndexTrigger Touch", AxisType.Digital, DeviceInputType.TriggerTouch, KeyCode.JoystickButton14),
            new MixedRealityInteractionMapping(4, "Axis1D.PrimaryIndexTrigger Near Touch", AxisType.SingleAxis, DeviceInputType.TriggerNearTouch, ControllerMappingLibrary.AXIS_13),
            new MixedRealityInteractionMapping(5, "Axis1D.PrimaryIndexTrigger Press", AxisType.Digital, DeviceInputType.TriggerPress, new MixedRealityInputAction(1, "Select", AxisType.Digital), KeyCode.None, axisCodeX: ControllerMappingLibrary.AXIS_9),
            new MixedRealityInteractionMapping(6, "Axis1D.PrimaryHandTrigger Press", AxisType.SingleAxis, DeviceInputType.TriggerPress, new MixedRealityInputAction(7, "Grip Press", AxisType.SingleAxis), axisCodeX: ControllerMappingLibrary.AXIS_11),
            new MixedRealityInteractionMapping(7, "Axis2D.PrimaryThumbstick", AxisType.DualAxis, DeviceInputType.ThumbStick, new MixedRealityInputAction(5, "Teleport Direction", AxisType.DualAxis), axisCodeX: ControllerMappingLibrary.AXIS_1, axisCodeY: ControllerMappingLibrary.AXIS_2, invertYAxis: true),
            new MixedRealityInteractionMapping(8, "Button.PrimaryThumbstick Touch", AxisType.Digital, DeviceInputType.ThumbStickTouch, KeyCode.JoystickButton16),
            new MixedRealityInteractionMapping(9, "Button.PrimaryThumbstick Near Touch", AxisType.Digital, DeviceInputType.ThumbNearTouch, ControllerMappingLibrary.AXIS_15),
            new MixedRealityInteractionMapping(10, "Button.PrimaryThumbstick Press", AxisType.Digital, DeviceInputType.ThumbStickPress, KeyCode.JoystickButton8),
            new MixedRealityInteractionMapping(11, "Button.Three Press", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton2),
            new MixedRealityInteractionMapping(12, "Button.Four Press", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton3),
            new MixedRealityInteractionMapping(13, "Button.Start Press", AxisType.Digital, DeviceInputType.ButtonPress, new MixedRealityInputAction(2, "Menu", AxisType.Digital), KeyCode.JoystickButton6),
            new MixedRealityInteractionMapping(14, "Button.Three Touch", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton12),
            new MixedRealityInteractionMapping(15, "Button.Four Touch", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton13)
        };

        /// <inheritdoc />
        public override MixedRealityInteractionMapping[] DefaultRightHandedInteractions => new[]
        {
            new MixedRealityInteractionMapping(0, "Spatial Pointer", AxisType.SixDof, DeviceInputType.SpatialPointer, new MixedRealityInputAction(4, "Pointer Pose", AxisType.SixDof)),
            new MixedRealityInteractionMapping(1, "Spatial Grip", AxisType.SixDof, DeviceInputType.SpatialGrip, new MixedRealityInputAction(3, "Grip Pose", AxisType.SixDof)),
            new MixedRealityInteractionMapping(2, "Axis1D.SecondaryIndexTrigger", AxisType.SingleAxis, DeviceInputType.Trigger, new MixedRealityInputAction(6, "Trigger", AxisType.SingleAxis), axisCodeX:  ControllerMappingLibrary.AXIS_10),
            new MixedRealityInteractionMapping(3, "Axis1D.SecondaryIndexTrigger Touch", AxisType.Digital, DeviceInputType.TriggerTouch, KeyCode.JoystickButton15),
            new MixedRealityInteractionMapping(4, "Axis1D.SecondaryIndexTrigger Near Touch", AxisType.SingleAxis, DeviceInputType.TriggerNearTouch, ControllerMappingLibrary.AXIS_14),
            new MixedRealityInteractionMapping(5, "Axis1D.SecondaryIndexTrigger Press", AxisType.Digital, DeviceInputType.TriggerPress, new MixedRealityInputAction(1, "Select", AxisType.Digital), KeyCode.None, axisCodeX: ControllerMappingLibrary.AXIS_10),
            new MixedRealityInteractionMapping(6, "Axis1D.SecondaryHandTrigger Press", AxisType.SingleAxis, DeviceInputType.TriggerPress, new MixedRealityInputAction(7, "Grip Press", AxisType.SingleAxis), axisCodeX: ControllerMappingLibrary.AXIS_12),
            new MixedRealityInteractionMapping(7, "Axis2D.SecondaryThumbstick", AxisType.DualAxis, DeviceInputType.ThumbStick, new MixedRealityInputAction(5, "Teleport Direction", AxisType.DualAxis), axisCodeX: ControllerMappingLibrary.AXIS_4, axisCodeY: ControllerMappingLibrary.AXIS_5, invertYAxis: true),
            new MixedRealityInteractionMapping(8, "Button.SecondaryThumbstick Touch", AxisType.Digital, DeviceInputType.ThumbStickTouch, KeyCode.JoystickButton17),
            new MixedRealityInteractionMapping(9, "Button.SecondaryThumbstick Near Touch", AxisType.Digital, DeviceInputType.ThumbNearTouch, ControllerMappingLibrary.AXIS_16),
            new MixedRealityInteractionMapping(10, "Button.SecondaryThumbstick Press", AxisType.Digital, DeviceInputType.ThumbStickPress, KeyCode.JoystickButton9),
            new MixedRealityInteractionMapping(11, "Button.One Press", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton0),
            new MixedRealityInteractionMapping(12, "Button.Two Press", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton1),
            new MixedRealityInteractionMapping(13, "Button.One Touch", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton10),
            new MixedRealityInteractionMapping(14, "Button.Two Touch", AxisType.Digital, DeviceInputType.ButtonPress, KeyCode.JoystickButton11)
        };

The controller could additionally be derived from GenericJoystickController, which handles most of the axisTypes already.

Duplicate LocalAvatar in Scene

Hello again!
I might have found an issue, which could be unexpected behaviour. The general approach for adding Oculus avatars and a camera to the scene is to add an OVR Camera Rig and to put a LocalAvatar under "TrackingSpace" in the camera rig. In my specific case I needed it this way because I don't want to track the positions of the LocalAvatar and the CameraRig separately (when, e.g., implementing a custom locomotion approach).

After starting the application, another LocalAvatar is added to the scene. This behaviour is unexpected, since in most cases we only want one LocalAvatar. The reason is easy to find; in MRTKOculusConfig.cs we see this:

    /// <summary>
    /// Prefab reference for LocalAvatar to load, if none are found in scene.
    /// </summary>
    public GameObject LocalAvatarPrefab => localAvatarPrefab;

The comment and the actual behaviour differ. The object is used and instantiated in OculusQuestInputManager.cs without checking whether such an object already exists:

    if (useAvatarHands)
    {
        // Initialize the local avatar controller
        GameObject.Instantiate(MRTKOculusConfig.Instance.LocalAvatarPrefab, cameraRig.trackingSpace);
    }

In my project I just commented out the instantiation of the game object. This is of course not a permanent solution. One good approach might be to leave handling of the LocalAvatar fully to the developer and remove its instantiation from the InputManager.
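
Alternatively, the instantiation could be guarded so the behaviour matches the comment. The scene check itself would be something like a FindObjectOfType call in Unity; the decision logic, with booleans standing in for those checks, is just:

```csharp
static class AvatarSpawnSketch
{
    // Spawn the LocalAvatar prefab only when avatar hands are wanted and no
    // LocalAvatar is already present in the scene. The boolean parameters
    // stand in for the corresponding Unity-side checks.
    public static bool ShouldInstantiateAvatar(bool useAvatarHands, bool avatarAlreadyInScene)
        => useAvatarHands && !avatarAlreadyInScene;
}
```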

Kind regards,
Enes

Updating MRTKOculusConfig's custom hand material at runtime

The current MRTKOculusConfig is a project-wide singleton. The ability to update config parameters at runtime would enable control over input behavior as well as input visuals. For example, the hand material could be updated per scene based on requirements.
