Publish tracking data through VMC Protocol #2461
Conversation
Overall LGTM. Run `cargo xtask format`.
I haven't tested it myself, but at least it doesn't seem to break anything else.
https://gist.github.com/grillo-delmal/b7712e4935f017067ced3d0428536a01 here is a short Python script if you want to see the data being communicated :) (requires pyliblo3)
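For inspecting the traffic without the gist, here is a hedged sketch of what one VMC bone message looks like as raw bytes. The address and argument layout follow the OSC/VMC conventions (NUL-terminated strings padded to 4 bytes, big-endian float32 arguments); the encoder itself is illustrative, not ALVR's actual implementation.

```rust
// Minimal OSC message encoder, just enough to show what a VMC
// bone-transform packet looks like on the wire. Illustrative only.

/// Pad a byte buffer with NULs to the next 4-byte boundary (OSC rule).
fn pad4(buf: &mut Vec<u8>) {
    while buf.len() % 4 != 0 {
        buf.push(0);
    }
}

/// Encode one OSC message: address, then ",s" plus one 'f' tag per
/// float, then the bone-name string, then big-endian float32 values
/// (position x/y/z followed by quaternion x/y/z/w).
fn encode_bone_pos(bone: &str, values: &[f32; 7]) -> Vec<u8> {
    let mut buf = Vec::new();
    buf.extend_from_slice(b"/VMC/Ext/Bone/Pos");
    buf.push(0); // OSC strings are NUL-terminated...
    pad4(&mut buf); // ...and padded to a multiple of 4 bytes
    buf.extend_from_slice(b",sfffffff"); // type tags: one string, seven floats
    buf.push(0);
    pad4(&mut buf);
    buf.extend_from_slice(bone.as_bytes());
    buf.push(0);
    pad4(&mut buf);
    for v in values {
        buf.extend_from_slice(&v.to_be_bytes()); // OSC numbers are big-endian
    }
    buf
}
```

A receiver like the gist's script just does the inverse: split on the padded NULs, read the type tags, then decode the big-endian floats.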
Ok, got orientations fixed. I had to build some tooling to record the VMC communication and transform the output on the go. One thing I learned is that VMC expects the data to be sent in Unity orientation to work correctly. As for the actual math to make this work, I had to apply a different rotation to each bone, but all of them were multiples of pi/2, so it wasn't that bad. It didn't help that the model I have available is a free one generated with VRoid, which is not intended for this purpose XD. But at least I'm happy with the end result ^^. Screencast.From.2024-10-21.02-40-01.mp4
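A sketch of the kind of per-bone correction described above: each fix is a quaternion rotation by some multiple of pi/2 around a basis axis, composed onto the bone's orientation. The `Quat` type and helper names here are illustrative stand-ins, not ALVR's or VMC's actual API.

```rust
// Quaternion stored as (x, y, z, w). Layout and names are illustrative.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Quat { x: f32, y: f32, z: f32, w: f32 }

impl Quat {
    /// Hamilton product: `self.mul(r)` applies `r` first, then `self`.
    fn mul(self, r: Quat) -> Quat {
        Quat {
            x: self.w * r.x + self.x * r.w + self.y * r.z - self.z * r.y,
            y: self.w * r.y - self.x * r.z + self.y * r.w + self.z * r.x,
            z: self.w * r.z + self.x * r.y - self.y * r.x + self.z * r.w,
            w: self.w * r.w - self.x * r.x - self.y * r.y - self.z * r.z,
        }
    }

    /// Rotation of `n * (pi/2)` around the Y axis, i.e. the sort of
    /// quarter-turn correction each bone needed.
    fn y_quarter_turns(n: i32) -> Quat {
        let half = n as f32 * std::f32::consts::FRAC_PI_2 / 2.0;
        Quat { x: 0.0, y: half.sin(), z: 0.0, w: half.cos() }
    }
}
```

Composing two quarter turns gives a half turn, which is a quick sanity check that the correction quaternions behave as expected.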
}))
.unwrap(),
)
.ok();
Might want to log errors that occur here
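Since `.ok()` silently discards the send result, logging would look roughly like the sketch below. `send_packet` and its error type are hypothetical stand-ins for whatever the OSC sink actually uses; the point is only the `match`-and-log shape.

```rust
// Hypothetical stand-in for the sink's send routine; the real one
// would perform a UDP send and return its own error type.
fn send_packet(payload: &[u8]) -> Result<(), String> {
    if payload.is_empty() {
        Err("empty packet".to_string())
    } else {
        Ok(()) // pretend the send succeeded
    }
}

/// Send and log on failure instead of silently dropping the error
/// the way `.ok()` does. Returns whether the send succeeded.
fn send_logged(payload: &[u8]) -> bool {
    match send_packet(payload) {
        Ok(()) => true,
        Err(e) => {
            eprintln!("VMC sink: failed to send packet: {e}");
            false
        }
    }
}
```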
I based this on the one implemented here: https://github.com/alvr-org/ALVR/blob/master/alvr/server_core/src/tracking/body.rs#L48-L60
So if it's really needed, adding error logging there could also be helpful.
Tbh it's probably fine without error logging; it was just a hunch.
Some((
    *id,
    tracking_manager_lock
        .get_device_motion(*id, timestamp)
I don't really have the time to look properly right now, but how does this interact with hand tracking / the multi-input protocol (I forgot what it's called, sorry)? I.e. does it miss the hand poses? And is this before or after the removal of the hand poses in the case of protocol compat?
You can see the multimodal compat at line 352; that already handles everything we need. After that point you don't care whether the client supports the multimodal protocol, it will always be used.
Great. Well then I think hands won't get exported properly if you're currently using hand tracking. Or maybe I'm missing something.
This is only sending hand position/rotation information. Even if the protocol supports hand poses (through bone position/rotation), I haven't checked how to integrate that. So if there is any data about that, the sink should just ignore it for now.
The issue is that if you're using hand tracking then the palm/wrist pose is carried as part of the hand tracking info, meaning your code will miss it
I'll look at how it behaves with hand tracking when I get home and report back then.
@The-personified-devil Ah, I got it now, you're right. If hand tracking is active, we don't send hand DeviceMotions; you should at least get the palm pose (the first element of the hand skeleton array).
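The fallback described above can be sketched as: prefer the controller DeviceMotion pose, and when hand tracking is active (so no DeviceMotion is sent for that hand) take the first element of the hand skeleton array as the palm pose. The `Pose` type and function names here are simplified stand-ins, not ALVR's actual types.

```rust
// Simplified stand-in for ALVR's pose type; field names are illustrative.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Pose {
    position: [f32; 3],
    orientation: [f32; 4],
}

/// Prefer the DeviceMotion pose for the hand; if hand tracking is
/// active and no DeviceMotion was sent, fall back to the first joint
/// of the hand skeleton array, which is the palm pose.
fn hand_pose(device_motion: Option<Pose>, skeleton: Option<&[Pose]>) -> Option<Pose> {
    device_motion.or_else(|| skeleton.and_then(|joints| joints.first().copied()))
}
```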
Now I see what you mean! I started toying with hand tracking and integrating it wasn't hard, but I noticed that the hands require their own rotation corrections :p
That's odd. Controllers do need a pose offset fix, which we apply from settings, but since you are getting the data from the tracking_manager the pose is already fixed and recentered. The same goes for hands, so I'm not sure why a separate pose fix for hands would be needed.
In the end it was the calibration of the Right Elbow that was off. It's kinda hard to calibrate without shoulder/upper arm data, but now it should be fine.
(Branch updated from 2d4d3bd to 6e23ce2.)
@grillo-delmal are you sure what you are doing is right? Shouldn't you just follow the clippy suggestion?
@grillo-delmal your link is unrelated lol. In any case the PR looks good to me.
Thx for the catch :) (deleted the old message because of the wrong link; here is what it actually said xD)
The VMC Protocol is a protocol for sending motion tracking data over OSC to avatar motion applications. It's implemented in software like VSeeFace and others.
Here is a test of it being used through the inochi-session app
Screencast.From.2024-10-16.00-05-31.mp4