At Game Developers Conference (GDC) 2018, one of the main themes was the important role Arm has played in shaping the last 25 years of the mobile gaming and handheld entertainment industry. One of the ideas our team came up with during our initial brainstorming was to create something visual. GDC is of course a very colourful, stimulating event and we wanted to create something in line with its spirit. Our brainstorming led us to a Virtual Reality (VR) demo called the 'Arm Museum of Gaming', which would be a central feature of our booth. Alongside the VR demo, there would be a display full of the original devices featured in the demo, ranging from the original GSM Nokia 6110 and some iconic Nintendo devices to the latest Samsung Galaxy devices.
The demo needed to showcase the devices and enable the user to select and inspect the components inside them. It was built in Unreal Engine 4, targeting a Samsung Galaxy S8 with a Gear VR headset.
Firstly, we needed to establish requirements so we wouldn't lose sight of what we were trying to achieve, which can easily happen when creating a demo. The demo tries to meet these requirements succinctly, and I will expand on them in this blog.
We then established the major requirements for this demo:
We then identified potential problems and pitfalls with creating this demo:
Figure 1: Device carousel
With GDC right around the corner, time was a valuable commodity. We had to be efficient with our experimentation and stick with what we knew already worked. The decision to reuse elements from a previous demo was key to meeting the timescale. The repurposed demo was one our team had created around mid-2017 using similar gameplay mechanics, albeit slightly more complex and with many more handheld mobile models from well-known companies like Nintendo, Nokia and Samsung.
So did we just copy the demo over? Not quite. We took the core of the demo and stripped away all unnecessary features, reducing it to a minimum viable product. This gave us a functional prototype within days, which allowed us to test the simple demo in VR very early in the build process. When developing for VR it is incredibly important to test at the very early stages of development, as what works well in the editor view might not render well when viewed in the VR headset.
Figure 2: Demo with device teardown
One of the most important aspects of any VR demo or game is the gameplay mechanic; if it is intuitive, it can make for a very enjoyable experience. In this demo the mechanic is very simple: the user stands in one place and all interaction is done with the Gear VR controller through a point-and-click interface. Due to the space restrictions that come with an event booth, we decided early on to keep the user stationary, so the user stands in one spot and doesn't have to move much. For people who may never have used a VR demo before, usually a high percentage at events, this also reduces the anxiety of putting the headset on.
The Gear VR controller was another challenging aspect of the game mechanic. You have to assume this may be the user's first experience with one. The controller is simple and easy to get used to, but initially handling a device with four buttons can be challenging, especially for the person guiding the user through the demo. To avoid the need for awkward instructions, we designed the interaction to be as simple as possible: the touch pad swipes left and right through the carousel of devices, and the back trigger confirms selection of the device the user wishes to view. The only other button used in the demo is the home button, which resets the controller calibration if the point-and-click arrow drifts out of alignment.
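The carousel navigation described above can be sketched in plain C++. This is a minimal, engine-free illustration with hypothetical names (the actual demo implements this logic in Unreal Blueprints): swipes move a wrapping index over the list of devices, and the trigger confirms the highlighted one.

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Illustrative carousel state: a ring of devices navigated by touch-pad
// swipes. Names are hypothetical; the real demo uses Unreal Blueprints.
class DeviceCarousel {
public:
    explicit DeviceCarousel(std::vector<std::string> devices)
        : devices_(std::move(devices)) {}

    // Swipe right advances to the next device, wrapping past the end.
    void SwipeRight() { index_ = (index_ + 1) % devices_.size(); }

    // Swipe left goes back one device, wrapping to the last entry.
    void SwipeLeft() { index_ = (index_ + devices_.size() - 1) % devices_.size(); }

    // The back trigger confirms the currently highlighted device.
    const std::string& ConfirmSelection() const { return devices_[index_]; }

private:
    std::vector<std::string> devices_;
    std::size_t index_ = 0;
};
```

The wrapping index keeps the carousel endless, so the user can keep swiping in either direction without hitting a dead end.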
Even with this basic setup, it is still necessary to explain to each user how to control the demo, as not everybody has used a Gear VR controller, or even the headset, before!
The overall idea for the demo was to showcase the iconic devices containing Arm IP that shaped mobile gaming over the past few decades, so choosing the environment was straightforward: combine a museum setting with technological props such as a projector, a computer, and a stage in the center of the room.
Once the decision was made to go with a museum environment, the first step was researching references. I collected many images of great-looking museums from around the world. We ended up hugely inspired by London's Natural History Museum; its Romanesque interior is iconic and creates a fantastic contrast with the technological props that surround the user of the demo.
Once we had decided on the styling and direction of the environment, we started building it by placing placeholder objects (usually just simple boxes). This process is called whiteboxing, and its main purpose is to get the scale right. We don't care about detail at this stage, but it is very important to place the user spawn location and test the look in the headset as soon as possible, to ensure the user position and all placeholder props are as expected.
Figure 3: The user's view alongside the reference
Once we were happy with the proportions, scale and user position in the virtual world, we began building the proper meshes to replace the placeholder boxes from the previous step. We utilized the Unreal Marketplace to quickly build the museum environment, repurposing a dungeon environment and turning it into a museum. It wasn't as straightforward a process as we would have liked, but it was a good starting point, as we only needed to build the meshes that weren't in the dungeon package.
An important step at this point was cleaning up the textures and creating new, simpler materials so they run well in mobile VR. We must always look back at the who, what and how of the demo as a sanity check: if we add anything that makes the demo unstable on mobile VR, we have failed to meet the demo requirements.
Objects near the user were created entirely from scratch, taking inspiration from some old-school technologies such as a tube monitor, projector and scanner. We tried to keep the silhouettes simple and the textures coherent with the environment. The placement of these objects is also important: in mobile VR, mipmapping is very aggressive, and objects need to be close to the user to get the best texture quality. Since the virtual environment is massive, placement is also crucial to keep the room from feeling empty; placing objects close to the user means fewer props are needed to fill the room.
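The distance-versus-sharpness trade-off can be illustrated with a deliberately simplified mip-selection model (the function name and reference distance are hypothetical, and real GPUs compute mip levels from screen-space texel derivatives): roughly, each doubling of an object's distance from the camera pushes its texture one mip level down, halving the visible resolution.

```cpp
#include <algorithm>
#include <cmath>

// Simplified illustration of mip selection: each doubling of distance
// drops roughly one mip level. referenceDistance is the (hypothetical)
// distance at which the full-resolution mip 0 is shown; maxMip caps the
// level at the smallest mip in the chain.
int ApproxMipLevel(float distance, float referenceDistance, int maxMip) {
    float level = std::log2(std::max(distance / referenceDistance, 1.0f));
    return std::min(static_cast<int>(level), maxMip);
}
```

Under this model a prop at the reference distance shows mip 0 (full resolution), while one eight times further away is already three mips down, which is why distant props look soft in mobile VR and why the hero objects sit close to the user.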
Figure 4: Lighting example
With the whole environment built, the next step was to tackle lighting. We decided to go with a night setting, as it let us play more with lighting and combine natural light (moonlight) with artificial lighting (lamps and light bulbs). All of the lighting in the environment is baked; no dynamic lights are used, due to performance constraints.
Figure 5: Finished lighting setup
Figure 6: Devices compartment
These are the stars of the show: the whole objective of the demo is to showcase these iconic gaming devices containing Arm IP. We came up with a long list of devices that could be used, but narrowed it down to a handful due to time constraints and usage rights.
To showcase the technology inside, we let the user open these devices and see a deconstructed view of them. It's not a one-to-one representation of the real device, as we limit the inner parts to six or seven components, due to triangle-count restrictions as well as time. Because of the extremely detailed nature of the devices, we had an external company create the models.
Once we had the device models, the next step was to create the opening/closing animation for each device in 3DS Max and import them into the engine.
Here are a few renders of the 3D devices.
Figure 7: N-Gage device model
Figure 8: Nintendo Game Boy Advance Model
We then hooked these devices and animations up in-game using Unreal Blueprints, connecting our projector meshes and the Gear VR controller so the user could select them. The touch pad swipes drive the projector tile rotation, and the trigger selects the device the user is interested in. A further trigger press brings up more information about the device and shows it in an exploded view.
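The exploded view itself can be sketched as a simple interpolation: each inner component stores a rest position plus an outward offset, and a single alpha value in [0, 1] drives the open/close transition. This is an illustrative, engine-free sketch with hypothetical names; in the actual demo the open/close animations were authored in 3DS Max and played back through Blueprints.

```cpp
// Minimal 3D vector for the sketch.
struct Vec3 {
    float x, y, z;
};

// Hypothetical teardown helper: alpha = 0 gives the assembled device,
// alpha = 1 the fully exploded view. Each of the ~6-7 inner components
// would call this with its own rest position and explosion offset.
Vec3 ExplodedPosition(const Vec3& rest, const Vec3& offset, float alpha) {
    return { rest.x + offset.x * alpha,
             rest.y + offset.y * alpha,
             rest.z + offset.z * alpha };
}
```

Driving every component from one shared alpha keeps the teardown animation cheap and perfectly synchronised, which matters when the same transition has to run smoothly on a mobile GPU.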
This is how the final scene looked after we combined the environment, animated device meshes and UI elements.
This demo was a collaboration between artists, engineers, project managers, marketing and an external vendor. It is always great to see an idea come to life, and being an essential part of that process is always enjoyable. Of course, this demo will be used a lot to showcase our influence on the gaming industry, but it was made for a specific event, GDC 2018. I manned the booth at GDC for three hectic days and I can say it was a great experience. The demo was well received, and I found others in the team constantly chatting with attendees about the influence that the companies we all work, or have worked, for had on the industry. History is a common subject for us all and makes conversation with attendees at events easy. I guess that is what a good demo should do: promote and influence conversations around technology.