ju1ce / april-tag-vr-fullbody-tracker
Full-body tracking in VR using AprilTag markers.
License: MIT License
I have two cameras installed: a FaceRig virtual cam and OBS's virtual cam. When I try to use the OBS virtual cam, the program instantly crashes with no error message. If I try a higher camera number, it throws an error that there aren't that many cameras.
Hi, I'm not used to GitHub markdown so I don't know if this is the right way to submit issues. Since I downloaded the software, using it lagged my VR a lot; then it started lagging other things such as Discord or even my web browser, and even after deleting it the lag continues.
I don't know if it's a known issue or just me?
The VideoCapture::set function returns a bool indicating whether the option is supported by the camera's backend, although returning true does not guarantee the option was actually set.
Any options that are not supported by the backend should be reported to the user.
I am very interested in this project and believe it has potential. I feel like implementing more trackers shouldn't be too much of a hassle.
So in other places talking about this project I've seen it mentioned that it can be set up to do 360 degree tracking instead of just 180. But your setup guide doesn't offer a lot of hints about how to do it...
Can I just generate more tags and make the trackers into octagons or similar? I would assume that's the reason for having such a large range assigned to each tracker... Or does it require a second camera? I can likely fiddle out the details if I can get an idea of where to start...
Assuming it just takes additional tags, then personally I can probably handle chasing down through the subrepos to find more tags (the pre-generated tags from the AprilTag project are 9x9px and need a bit of scaling and fitting onto standard pages...) Would you like the results of that work posted as something for other users?
Just a bit of QoL to prevent users from unintentionally losing changes made in the Params window.
Yes, I'm using the correct tracking images. Not sure what else I should do; I've recalibrated everything, as well as going into the tracker management for SteamVR and setting them to left foot and right foot.
Would it be possible to use a phone strapped to my headset to detect finger positions?
I could have a glove with AprilTags on the fingertips and wrist to perfectly track my hands.
Perhaps the hand drivers could be implemented with this? https://github.com/LucidVR/opengloves-driver
Currently:
- tracker::VideoCapture - matches OpenCV naming; a class which is used to get frames of video from cameras.
- cfg::Camera - parameters used by VideoCapture to open and read a camera.
- cfg::VideoStream - contains a cfg::Camera, plus any parameters required to process the frames from cameras.
- cfg::CameraCalib - calibration for a camera.
- tracker::CapturedFrame - a single frame and the timestamp it was captured at.

Definitions, for some real pointless discussion:
- Video - visual media product featuring moving images; a sequence of pictures.
- Capture - take possession of.
- Video Capture - google: mostly capture cards.
- Camera - a device for capturing a photographic image or recording a video.
- Stream - to send forth or discharge in a stream.
- Video Stream - google: TV streaming services.
- Image - array of pixels.
- to image - instead of capture?
- Frame - a single image in a sequence of pictures.

I'm looking for alternatives, or thoughts on the current names.
Everything was working fine; then one day SteamVR crashed, and after that the driver stopped connecting to SteamVR, with no error code. I tried to troubleshoot it myself, but everything I tried didn't work, for example: turning the driver off and on in the settings, reinstalling the driver, switching from beta to none, starting AprilTagTrackers as administrator. None of that helped. Does anyone have a solution?
I was thinking that this software works amazingly well for something that requires line of sight. But what if we were to try to integrate the motion stream with slime trackers?
The goal of this added feature would be to correct the yaw values of the VMC protocol using a stream of information coming from the SlimeVR Server app. Slime trackers work extremely well, except for the gyroscopic drift that accumulates over time. Mixing these two systems would let us place the tag markers on top of the slime trackers themselves, giving basically seamless, positionally tracked anchors without the cost of laser tracking.
I would love to hear back on this!
From my testing, the obs-virtualcam DLL that gets loaded at runtime can throw SEH exceptions which cannot be caught; considering how frequently people experience this, it's likely more backends can as well.
The only way to catch SEH exceptions in C++ is to compile with the /EHa option, while /EHsc is the default for most C++ apps, as /EHa can introduce performance and binary-size issues.
This leaves the alternative of spawning a separate process to test the camera address before the main process does.
A small exe distributed in the utilities folder will be included in Windows builds; it only needs to link with OpenCV, and will attempt to open the hardware index. Using CreateProcess, the main process will then read the exit code and notify the user if the camera can't be opened at this address.
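The pattern above can be sketched in a runnable way. This is only an illustration of the out-of-process probe idea, not the real utility: the child script below is a stand-in for the small OpenCV-linked exe (which would call something like VideoCapture(index).isOpened()), so the parent/exit-code plumbing can run anywhere.

```python
# Sketch of the out-of-process camera probe described above. Assumption: the
# real probe links OpenCV and tries to open the camera index; here a stand-in
# child script simulates success/failure so the pattern is runnable.
import subprocess
import sys

# Stand-in for the small probe exe: exits 0 if it could "open" the camera,
# nonzero otherwise. If it crashed (e.g. an uncatchable SEH exception), the
# parent would simply see a nonzero exit code instead of crashing itself.
PROBE_SRC = r"""
import sys
index = int(sys.argv[1])
ok = index == 0  # pretend only index 0 is a working camera
sys.exit(0 if ok else 1)
"""

def camera_openable(index: int) -> bool:
    """Run the probe in a separate process and interpret its exit code."""
    result = subprocess.run([sys.executable, "-c", PROBE_SRC, str(index)])
    return result.returncode == 0

print(camera_openable(0))  # True  - probe exited 0
print(camera_openable(3))  # False - probe exited nonzero
```

The key design point is that any crash inside the probe, catchable or not, is reduced to an exit code the main process can inspect safely.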
Add custom title or text in the app to show what camera is being used for multi camera setups
It's kinda confusing to find which window belongs to which camera, so it would be nice to have either a title or text in the window showing which camera the app is using.
I kinda like it custom because I use 2 PS Eye cameras and could name them differently.
In the readme file I was instructed to run
cmake --build build --target install
This did not work for me. This worked instead:
cmake --build build --target AprilTagTrackers-install
I have been trying to get this set up; I have calibrated the camera and trackers, but when I launch SteamVR I cannot see the white camera outline that you are supposed to match to the camera. It connects to SteamVR, and the tracking window opens, but I do not see the camera. Any ideas how to get it working? I am on Quest 2.
edit: this has been solved
The camera calibration should store the resolution it was calibrated at, and if changed in the UI, should invalidate the calibration.
Calibrations could potentially be saved as well, allowing for an array per camera, identified by its resolution.
Are there other values that should invalidate the calibration?
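The per-resolution storage idea could look something like the sketch below. This is a hypothetical Python illustration, not the real cfg::CameraCalib: a store keyed by (width, height), where a missing entry signals that the current resolution has no valid calibration and the UI should prompt a recalibration.

```python
# Sketch of per-camera, per-resolution calibration storage. Assumption:
# CameraCalib is a minimal stand-in for the real calibration data.
from dataclasses import dataclass, field

@dataclass
class CameraCalib:
    fx: float
    fy: float
    cx: float
    cy: float

@dataclass
class CameraCalibStore:
    # Keyed by (width, height); changing resolution selects a different
    # entry instead of silently reusing a stale calibration.
    by_resolution: dict = field(default_factory=dict)

    def save(self, resolution, calib):
        self.by_resolution[resolution] = calib

    def get(self, resolution):
        # None means "no valid calibration for this resolution":
        # the UI should treat the calibration as invalidated.
        return self.by_resolution.get(resolution)

store = CameraCalibStore()
store.save((1280, 720), CameraCalib(fx=900.0, fy=900.0, cx=640.0, cy=360.0))
print(store.get((1280, 720)) is not None)  # True  - calibration available
print(store.get((1920, 1080)) is None)     # True  - resolution changed, invalid
```

Other values that change the intrinsics (e.g. zoom or a different lens) would also need to key or invalidate the entry.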
Hey! I recently made waist and leg trackers for AprilTag.
Once I calibrated the cam (a good calibration as far as I can see) and the trackers, I hopped into SteamVR.
The trackers seem to be recognized correctly (left leg is indeed a left leg, etc.), but all of them are way above me, like 1.5 to 2 meters.
There should be a way to calibrate XYZ offsets for the trackers, because I honestly have no idea what to do about it. I tried repositioning the camera, restarting the software, and recalibrating everything (even the room in SteamVR). No luck.
Does anyone know a fix/workaround for that?
OS: Arch Linux
DE: Gnome 43 with gtk4 and gtk3 libs installed on Xorg
April-Tag-VR-FullBody-Tracker version: at least from v0.6 to v0.7.1
Steps: Click on params
Console output:
./AprilTagTrackers
(AprilTagTrackers:79940): Gtk-CRITICAL **: 01:48:13.044: gtk_box_gadget_distribute: assertion 'size >= 0' failed in GtkNotebook
(AprilTagTrackers:79940): Gtk-CRITICAL **: 01:48:13.044: gtk_box_gadget_distribute: assertion 'size >= 0' failed in GtkNotebook
(AprilTagTrackers:79940): Gtk-WARNING **: 01:48:13.044: Negative content width -2 (allocation 0, extents 1x1) while allocating gadget (node header, owner GtkNotebook)
(AprilTagTrackers:79940): Gtk-CRITICAL **: 01:48:13.044: gtk_box_gadget_distribute: assertion 'size >= 0' failed in GtkNotebook
(AprilTagTrackers:79940): Gtk-WARNING **: 01:48:13.044: Negative content width -1 (allocation 1, extents 1x1) while allocating gadget (node scrolledwindow, owner GtkScrolledWindow)
(AprilTagTrackers:79940): Gtk-CRITICAL **: 01:48:13.044: gtk_box_gadget_distribute: assertion 'size >= 0' failed in GtkScrollbar
(AprilTagTrackers:79940): Gtk-WARNING **: 01:48:13.044: Negative content width -1 (allocation 1, extents 1x1) while allocating gadget (node scrolledwindow, owner GtkScrolledWindow)
(AprilTagTrackers:79940): Gtk-CRITICAL **: 01:48:13.044: gtk_box_gadget_distribute: assertion 'size >= 0' failed in GtkScrollbar
(AprilTagTrackers:79940): Gtk-WARNING **: 01:48:13.044: Negative content width -1 (allocation 1, extents 1x1) while allocating gadget (node scrolledwindow, owner GtkScrolledWindow)
(AprilTagTrackers:79940): Gtk-CRITICAL **: 01:48:13.044: gtk_box_gadget_distribute: assertion 'size >= 0' failed in GtkScrollbar
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
01:48:14: Debug: window wxTextCtrl@0x55f69ba1c800 ("text") lost focus even though it didn't have it
[1] 79940 segmentation fault (core dumped) ./AprilTagTrackers
I can use my Kinect colour camera as a webcam in most applications, it's a nice fullHD res sensor and should be well suited for tracking.
My other camera worked but nothing would make the Kinect sensor work.
To clarify: I got an image, but when doing "calibrate camera" using the full sheet, it did not detect or draw any dots. "Calibrate tags" also never detected the tags.
I tried using the install driver file, but it can't find anything, and whatever I type in is treated as a command rather than the file path it was asking for.
I am completely lost!
More detailed explanation:
When I try to connect to SteamVR in AprilTagTrackers.exe, it says my ATT version does not match.
I then tried to use the install_driver.bat in driver_files and this is what happens:
The system cannot find the file C:\Users\Cassie.
The system cannot find the file C:\Users\Cassie.
The network path was not found.
vrpathreg.exe not found: ""=\bin\win64\vrpathreg.exe"
This usualy means an error with your SteamVR installation, or if you have multiple installations of SteamVR.
You can also try to locate the vrpathreg.exe file yourself and input it below. The file is inside SteamVR\bin\win64.
Enter full path to vrpathreg.exe:
When I put in anything, the command prompt just treats it as a command entirely unrelated to what it asked me.
I have tried my best, but have no clue what is going on.
Hi! I was testing a wiki in another repository but accidentally had this one open. It actually allowed me to add a footer despite not being a contributor. Is that unintended?
Steps to reproduce:
1• Load up a PS3 Eye
2• Enter Calibration
3• Disconnect USB
4• App becomes unresponsive and crashes
I'm on windows btw
The build type will get set to Release; however, it's not obvious, so a message should be printed as well.
Hello there, I am trying to get a camera to actually appear using the preview camera checkbox and start/stop button, however every number between 1 and 10 throws a "camera error". I have attempted using a PS3 Eye camera and an AKASO V50 Pro. I don't know if I am doing something wrong.
When the camera image is mirrored during camera calibration, there are 3 specific markers that still get detected (because they look the same in a mirror image); this could be used to detect a mirrored camera image and tell the user to unmirror it.
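A heuristic like that could be sketched as follows. Everything here is hypothetical: the tag IDs and the detector output are stand-ins, since the real check would run on the IDs reported by the AprilTag detector each frame.

```python
# Hedged sketch of the mirror check described above. Assumption: SYMMETRIC_IDS
# is a hypothetical set of the 3 calibration-board tags that look identical
# when mirrored; detected_ids is whatever the tag detector reported.
SYMMETRIC_IDS = {12, 24, 36}  # hypothetical IDs of the mirror-symmetric tags

def image_looks_mirrored(detected_ids, total_board_tags):
    """Heuristic: only the symmetric tags were found, the rest of the board
    is missing, which suggests the image is mirrored."""
    return (
        len(detected_ids) > 0
        and detected_ids <= SYMMETRIC_IDS
        and len(detected_ids) < total_board_tags
    )

print(image_looks_mirrored({12, 24, 36}, total_board_tags=20))    # True
print(image_looks_mirrored({12, 24, 5, 7}, total_board_tags=20))  # False
```

If the check triggers for several consecutive frames, the app could show a "your camera image appears mirrored" warning instead of silently failing calibration.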
When using the additional smoothing parameter, some users report that one of the trackers is not seen in SteamVR, despite being tracked perfectly in the output window.
Other notes:
Possible explanation:
When using additional smoothing, the smoother combines the previous pose sent to SteamVR with the pose predicted for the current frame. This causes a problem: if the pose is NaN at any point in time, all future poses will be NaN as well, causing the tracker to disappear completely instead of disappearing for just one frame.
Workaround:
Solution has not been found yet. If you encounter this issue, set additional smoothing to 0 and restart SteamVR/ATT.
I do not know if this is just me, but the trackers cannot be positioned in VR. The same thing happens with the camera. I am in SteamVR with SteamVR Home disabled. Any help would be appreciated!
I have been trying to set this up for a few days now and have uninstalled and reinstalled everything once, but when I get to the step to preview the camera and set up the trackers, the program stops responding and forces me to close it. I don't know if this is a problem on my end, and have been working to figure out if it is, but would like to hear opinions and/or likely solutions. Thank you.
So I've gotten it to work, but the big issue is it's not capable of tracking movement. Slight movement makes it lose tracking, or straight up makes the trackers undetectable. If I move my foot from point A to point B, the transition isn't linear; it's more like a teleport, and then the tracker loses tracking entirely. I'm using a 720p 30 fps webcam.
When trying to use an action cam, the trackers are at the wrong distance from the camera. The scaling of distance to the camera is broken: if you are close to the camera, the trackers are near you, but if you are a bit away, the trackers are way behind you.
My PC blue-screened as I saved, and now the app only says "unhandled unknown exception, terminating the application".
So look, it works completely fine, calibration etc., but when I try to go into VR it won't work. It does see that I have trackers, and it shows the trackers in VR, but they're completely off-center and they won't move for me. Please help (thanks for reading).
Hi, I found that after I add a tracker, like in the mediapipe project, the OpenVR API GetPoseActionData() returns a zero pose.
I keep getting an error that the program can't find the bindings file att_actions.json when I try to connect to SteamVR. I've tried reinstalling and installing manually, but I keep getting the same problem.
If there is any Chinese character in steamvr.vrsettings, it raises an exception like this.
exception:
Traceback (most recent call last):
File "installer.py", line 60, in <module>
File "json\__init__.py", line 293, in load
UnicodeDecodeError: 'gbk' codec can't decode byte 0xac in position 1031: illegal multibyte sequence
[13936] Failed to execute script installer
The exception occurs on line 59 in installer.py:

with open(config) as f:
    config_data = load(f)

We need to change these lines as below to fix it:

with open(config, encoding="utf-8") as f:
    config_data = load(f)
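The failure can be reproduced deterministically. The sketch below assumes the settings file is UTF-8 while the installer read it with the locale default encoding (GBK on a Chinese Windows install); here GBK is forced explicitly, and the file is a minimal stand-in for steamvr.vrsettings.

```python
# Reproduces the UnicodeDecodeError above. Assumption: steamvr.vrsettings is
# UTF-8, but open() without an encoding uses the locale default (GBK on
# Chinese Windows); GBK is forced here so the failure is deterministic.
import json
import os
import tempfile

# Minimal stand-in for steamvr.vrsettings containing a Chinese character.
fd, path = tempfile.mkstemp(suffix=".vrsettings")
with os.fdopen(fd, "w", encoding="utf-8") as f:
    json.dump({"comment": "中"}, f, ensure_ascii=False)

# Reading UTF-8 bytes as GBK misaligns the multibyte sequences and raises,
# matching the installer traceback above.
try:
    with open(path, encoding="gbk") as f:
        json.load(f)
    failed = False
except UnicodeDecodeError:
    failed = True
print(failed)  # True

# The fix: always read the settings file as UTF-8.
with open(path, encoding="utf-8") as f:
    config_data = json.load(f)
print(config_data["comment"])  # 中
os.remove(path)
```

Writing the file back should use encoding="utf-8" as well, since SteamVR stores the file as UTF-8.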
This isn't really an issue with the software; rather, the Discord server link you provided is broken. Try creating a permanent Discord server invite link.
When in VRChat, it's either 0.010 or 12.