
Mobile synchronisation with VR environment

Hi ARM development team,

I hope this is the correct place to ask. I was at the VRTGO developer meet-up in Gateshead a few weeks ago, where I briefly met @Roberto Lopez Mendez. Although his talk was excellent, what piqued my interest was the piece of code he had created in Unity to mirror what a user in the VR environment sees on a mobile device. Instead of streaming the graphics buffer, he created a second user that was tied to the initial user. Is there a way this piece of code could be posted? Creating a window into the virtual world in this way would be very useful, not only for testing but also for using mobile devices as aids to users outside of the virtual environment.

Kind regards,

Stephen

  • Roberto is currently taking a well-deserved break after GDC, so he can't answer this himself, but to my understanding it wasn't particularly complex.

    The on-screen demo was running on a device connected to the same Wi-Fi hub as the VR headset. The headset was on a known IP address, to which the on-screen device connected as a client. The only data it had to send was the camera transformation for each frame. No other animations were synchronized over the connection, as they are all time-based and non-interactive.

    -Stacy
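    The scheme above can be sketched in a few lines. This is a minimal illustration, not Roberto's actual code: the port number and the wire format (seven little-endian floats per frame: position x, y, z plus a rotation quaternion x, y, z, w) are assumptions; the real demo was written in Unity/C#, but the protocol shape is the same.

    ```python
    import socket
    import struct

    # Hypothetical port; the demo's actual address/port are not known.
    PORT = 9050
    # Wire format: 7 little-endian 32-bit floats per frame
    # (position x, y, z + rotation quaternion x, y, z, w).
    FMT = "<7f"
    FRAME_SIZE = struct.calcsize(FMT)  # 28 bytes

    def send_camera_transform(sock, position, rotation):
        """Headset (server) side: push the camera pose for one frame."""
        sock.sendall(struct.pack(FMT, *position, *rotation))

    def recv_camera_transform(sock):
        """Mobile (client) side: read one frame's pose.

        Returns (position, rotation) as two tuples of floats, to be
        applied to the mirroring camera in the local scene.
        """
        buf = b""
        while len(buf) < FRAME_SIZE:
            chunk = sock.recv(FRAME_SIZE - len(buf))
            if not chunk:
                raise ConnectionError("headset closed the connection")
            buf += chunk
        values = struct.unpack(FMT, buf)
        return values[:3], values[3:]
    ```

    The mobile device would open a TCP connection to the headset's known IP, then call `recv_camera_transform` once per frame and copy the result onto its own camera. Because everything else in the scene is time-based and non-interactive, 28 bytes per frame is all the synchronization required.
    
    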

