Hi, I'm new to Normcore and Unity and trying to develop a simple multi-user social app demo for the Quest 2. I've successfully created a scene using an XR Interaction Manager and an XR Origin object as described in this tutorial: Unity Multiplayer Virtual Reality with Normcore | Jeff Rafter, but using a RealtimeAvatar as shown here: XR Avatars and Voice Chat | Normcore.
However, I would really like to use the Oculus system avatar instead, and I can't figure out how to use the Oculus Avatar as shown in this tutorial (https://developer.oculus.com/documentation/unity/as-avatars-gsg-unity/) with Normcore. Are there any tutorials or examples that use the Oculus Avatar with Normcore?
Thanks!
This is a good question! We don't have any built-in integration for the Oculus Avatars, but we've seen other Normcore apps use them. I believe some folks pull the head/hand offsets from the Normcore avatars and apply them directly to the Oculus ones.
I could be mistaken, but I believe there's also an API where the Oculus avatars will give you a byte[] blob of the data they need to synchronize. You can set that on a Normcore model to sync it to all clients and then apply it to the avatar on each one.
Max
Hello @jmillerid!
I was in charge of adding the v2 Oculus avatars to GOLF+ when they were first released in early access a few months ago. @maxweisel is right that you can just network the byte[] blob generated by the snapshot method, which gives you the skinned avatar data.
There’s a loopback example that comes with the v2 Avatars that is the best place to start. They have example code that shows how to get a snapshot and then play it back.
Make sure you look at the avatar entity "Features" selections, because they matter depending on whether an avatar will be used to run IK or to play back the skinned data. For avatars that will just play back the data, use Preset_Remote. Again, just follow the loopback example and you should be OK.
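In code, the snapshot round trip boils down to two calls (a minimal sketch based on the loopback example; localEntity and remoteEntity stand in for a Local-configured and a Remote-configured OvrAvatarEntity):

// Record a snapshot of the local avatar's current skinned pose at medium LOD
byte[] data = localEntity.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium);

// Ship "data" to the other clients however you like, then play it back
// on each client's Remote-configured avatar entity
remoteEntity.ApplyStreamData(data);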
The only other thing to consider is lip sync. I haven't looked at what the public v2 avatar package includes, but when I integrated it, the lip sync script they shipped would break the Normcore microphone; I think the microphone bindings conflicted somehow. To solve this, we modified the Normcore code to copy the microphone data being sent and passed that into the lip sync solver for the local avatar.
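Conceptually, that workaround looks something like this (a rough sketch only: ForwardMicSamples is a hypothetical hook added where Normcore reads the microphone buffer, and the lip sync entry point may be named differently in your SDK version):

// Hypothetical tap in the code path where Normcore pulls microphone samples
// to send over the network. Copy the samples and hand them to the local
// avatar's lip sync solver so both systems can share one microphone.
private void ForwardMicSamples(float[] samples, int channels) {
    float[] copy = new float[samples.Length];
    System.Array.Copy(samples, copy, samples.Length);

    // Assumed entry point -- check the lip sync script that ships with
    // your version of the v2 avatar package
    _lipSyncContext.ProcessAudioSamples(copy, channels);
}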
Hope this helps! If you have any other questions feel free to ask
Hey @mattsich, I'm currently also setting up v2 Avatars with Normcore, and I was wondering: at some point would we have to tie a RealtimeTransform/RealtimeView into the v2 avatars?
Unfortunately, I don't believe Oculus gives you enough access to the avatars to be able to do that. I think they're fairly black box, but @mattsich will know more.
It's not the preferred way to set up v2 avatar networking, BUT you can do it. If you network the controller and head positions, you could pass those to the avatar input controller and have it run through IK, but this won't scale, since IK is expensive.
To recap, the preferred way to network the avatars is to run IK only for your own avatar and then network the resulting skinned data to the other clients. This means every headset only ever needs to run IK for a single avatar.
Based on some conversations I've had, I believe Eleven Table Tennis is not using the preferred approach and is instead running IK for both players. That's OK for them because they only ever have two players in the scene, and it lets them display avatars for non-Oculus users whose version of the game may not support avatars at all.
Hi Matt, I've been following along with these posts and they have been quite helpful so far. I am trying to get a simple implementation of Meta Avatars working, and I am running into an issue with my implementation.
The issue is a NullReferenceException in Normal.Realtime.Serialization.WriteStream.WriteRawBytes (System.Byte[] value), followed by the error "Failed to serialize datastore message. Disconnecting." for my [RealtimeProperty] byte[].
I have been able to get and set tracking data via the following, NOT over the network. The local entity is my personal Meta Avatar, and the remote entity is another Meta Avatar in the scene being passed a byte[] of my local avatar's pose data.
var data = localEntity.RecordStreamData(OvrAvatarEntity.StreamLOD.Medium);
// set byte[] position data to remote entity
remoteEntity.ApplyStreamData(data);
After digging through the loopback example, I have determined these are the two main functions for retrieving and setting this byte[] data. So now, in theory, I can take the byte[] returned from RecordStreamData() and have a RealtimeProperty for the avatar which gets updated each frame.
Below is my RealtimeModel:
[RealtimeModel]
public partial class NetworkAvatarDataModel
{
    [RealtimeProperty(10, true, true)]
    private byte[] _packetData;

    // I have tried the following with no luck; 620 is the byte length of the medium LOD data
    //private byte[] _packetData = new byte[620];
}
Here is my RealtimeView with the NetworkedAvatarData as a child view. I have used Normcore's Realtime to sync other components before, but never a byte array. Is there anything I am overlooking / missing that could be causing this serialization issue? Also, is this the right direction for getting this implemented?
Thanks!
EDIT:
The specific issue was that the byte array was being set to null in the RealtimeComponent<NetworkAvatarDataModel>-derived class. I am not getting those errors any more. The byte arrays in the model and in the derived class both need to be initialized to new byte[0].
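In model form, the fix looks something like this (a minimal sketch, reusing the property ID from the model posted above):

[RealtimeModel]
public partial class NetworkAvatarDataModel
{
    // Initialize to an empty array so Normcore never serializes a null byte[],
    // which is what triggered the NullReferenceException in WriteStream.WriteRawBytes
    [RealtimeProperty(10, true, true)]
    private byte[] _packetData = new byte[0];
}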
However, I am still having some issues getting this implemented properly. The new remote avatar isn't updating after being passed the new data via remoteEntity.ApplyStreamData(data). On the client side, I am logging the byte[] being passed over the network, and I can see that it updates with data frequently when a client avatar is added.
I will update with any significant progress.
I've solved the issue. It had nothing to do with Normcore, but it's still worth sharing.
When a Realtime remote avatar was connecting, the Realtime avatar prefab had a standard SampleAvatarEntity component attached to it, and it was using the same SampleAvatarEntity creation info for both local and remote avatars. The creation info for the local and remote OvrAvatarEntity should match the creation info for the Local and Remote avatars in the NetworkLoopbackExample that Oculus provides.
This info cannot be changed after the object has been created. Because of this, I have two different prefabs, one with all Local settings and one with all Remote settings, and I instantiate one or the other depending on whether the Realtime avatar is remote or local.
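Here's a minimal sketch of that split (the component and prefab field names are hypothetical; it assumes the networked avatar root has a RealtimeView, and that each prefab's SampleAvatarEntity is preconfigured with the matching Local or Remote creation info):

using Normal.Realtime;
using UnityEngine;

public class AvatarEntitySpawner : MonoBehaviour {
    // Prefab whose SampleAvatarEntity uses the "Local" creation info from the loopback example
    [SerializeField] private GameObject _localEntityPrefab;
    // Prefab whose SampleAvatarEntity uses the "Remote" creation info
    [SerializeField] private GameObject _remoteEntityPrefab;

    private void Start() {
        RealtimeView view = GetComponent<RealtimeView>();

        // The creation info can't be changed after the entity is created,
        // so pick the correctly configured prefab up front based on ownership
        GameObject prefab = view.isOwnedLocallySelf ? _localEntityPrefab : _remoteEntityPrefab;
        Instantiate(prefab, transform);
    }
}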
If those settings are NOT changed, the remote avatar will just stay in a T-pose even though you are correctly passing data via ApplyStreamData(data).
It now works great!
Hi @Robertnc , I’m having a ton of trouble getting this set up myself and I’m not quite sure how to ask for help. Is there any way you could zip up a package of your functioning meta avatar scene and post it here? Thank you for your help!
Sorry, this is vague, so I'll add as an addendum the message I just sent to @mattsich, who so graciously helped earlier:
Hi Matt,
Just reaching out in response to your post in this thread. Since my project only ever needs two of the v2 avatars in any given space, I'm happy to skip the skinned-data route and just rig two IK players in one scene. But I feel like I'm missing something very basic about the v2 avatars: unlike an OVRCameraRig, I can't figure out where the actual head and hand targets are located on the avatars, or what prefab to instantiate with the Realtime Normcore manager in order to get two avatars in the scene. Can you help me out? Thanks!
Any advice y’all?
Hello, this was very informative, but I’m still struggling to understand some of the core concepts here.
Like:
- Why did your RealtimeView need to have the head and hands mapped as views? Aren't they brought over in the NetworkAvatarData?
- You mention keeping the settings the same as the network loopback example, but the remote avatar in that scene doesn't get the right skin?
- How do the head transforms link up? My remote avatar is rotated about 90 degrees and sits far from the local body.
- What does NetworkAvatarData.cs look like? Is it something like this:
using UnityEngine;
using Normal.Realtime;
using Oculus.Avatar2;
using StreamLOD = Oculus.Avatar2.OvrAvatarEntity.StreamLOD;

public class NetworkAvatarData : RealtimeComponent<NetworkAvatarDataModel> {
    public OvrAvatarEntity remoteEntity;
    public OvrAvatarEntity localEntity;

    private Realtime _realtime;

    [SerializeField] private bool avatarFound = false;
    [SerializeField] private bool connectedToRoom = false;

    private void Awake() {
        // Assumes the Realtime component lives on this same GameObject
        _realtime = GetComponent<Realtime>();

        // Notify us when Realtime successfully connects to the room
        _realtime.didConnectToRoom += DidConnectToRoom;
    }

    public byte[] packetData {
        get {
            // Return the latest blob stored on the model, never null
            if (model == null || model.packetData == null) return new byte[0];
            return model.packetData;
        }
    }

    protected override void OnRealtimeModelReplaced(NetworkAvatarDataModel previousModel, NetworkAvatarDataModel currentModel) {
        if (previousModel != null) {
            // Unregister from events
            previousModel.packetDataDidChange -= PacketDataDidChange;
        }

        if (currentModel != null) {
            // If this is a model that has no data set on it, populate it with the current snapshot
            if (currentModel.isFreshModel)
                currentModel.packetData = localEntity.RecordStreamData(StreamLOD.Medium);

            // Apply the current data to the remote avatar
            ReceivePacketData();

            // Register for events so we'll know when new packet data arrives
            currentModel.packetDataDidChange += PacketDataDidChange;
        }
    }

    private void PacketDataDidChange(NetworkAvatarDataModel model, byte[] value) {
        // Apply the new data to the remote avatar
        ReceivePacketData();
    }

    private void ReceivePacketData() {
        // Wait until the remote avatar entity has been located
        if (!avatarFound) return;
        Debug.Log("Applying Stream Data");
        remoteEntity.ApplyStreamData(model.packetData);
    }

    private void Update() {
        if (!connectedToRoom) return;

        // Record a fresh snapshot of the local avatar every frame and store it on the model
        model.packetData = localEntity.RecordStreamData(StreamLOD.Medium);

        // Locate the remote avatar entity once it has been spawned
        if (!avatarFound) {
            GameObject playerGameObject = GameObject.Find("RemoteLoopbackAvatarPrefab(Clone)");
            if (playerGameObject != null) {
                playerGameObject.name = "RemoteMadeInNetwork";
                remoteEntity = playerGameObject.GetComponent<OvrAvatarEntity>();
                avatarFound = true;
            }
        }
    }

    private void DidConnectToRoom(Realtime realtime) {
        connectedToRoom = true;
    }
}
Hello,
I've been trying to get this working, and this thread has been very informative. I think I'm close to getting it right, but not 100% there yet. Does anyone have any example "glue" code, and an example of how to set up the prefabs for local and remote avatars, that they would be willing to post here? (Or any public repos to link to?)
Thanks in advance for anything you can share.
How is your NetworkAvatarData script set up? Can I get a copy of it? And where do you attach the script?
I published my setup here: