From the humble beginnings of a Kickstarter campaign to becoming the world’s leading Virtual Reality (VR) company, Oculus earned its success by focusing on a rich ecosystem that connected its community to its developers. It was a monumental experience working at the forefront of an emerging technology, contributing to building the first VR platform from the ground up to where we are today.
As the second designer on the team, I began by exploring what it means to design for VR. While some design principles remained constant, VR is a spatial experience, and it required rethinking every touch point for how we interact with objects and our surroundings. It introduced new constraints to consider, such as FOV (field of view), ergonomics, frame rates, and rendering capabilities. While the idea of VR felt futuristic, our tools for creating in VR in 2013 were still quite primitive, and the hardware was still in its infancy.
Input methods are the main drivers of the types of interactions that are possible. The keyboard and gamepad were our primary forms of input, which led us to choose raycasting as our selection method (a cursor fixed at the center of the FOV). Since it was crucial for selections to be made within view, raycasting proved to be the most natural way to navigate in VR. While it brought more precision and freedom than a joystick or D-pad, it was also a sure way to fatigue the user's neck. We continued to learn how large a role ergonomics play in VR.
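The gaze-based raycast described above reduces to simple vector math: cast a ray from the head position along the head's forward vector and intersect it with the plane a UI panel sits on. Here is a minimal sketch of that idea, with made-up positions; it is not Oculus SDK code.

```python
# Hypothetical sketch of gaze-based raycast selection. The cursor is a ray
# fixed at the center of the FOV: it starts at the head position and points
# along the head's forward vector, landing wherever it meets the UI plane.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def raycast_to_plane(origin, forward, plane_point, plane_normal):
    """Return the intersection of the gaze ray with a plane, or None."""
    denom = dot(forward, plane_normal)
    if abs(denom) < 1e-6:          # ray is parallel to the UI plane
        return None
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:                      # plane is behind the user
        return None
    return [o + t * f for o, f in zip(origin, forward)]

# Head at the origin, looking straight down -Z at a UI panel 2m away:
hit = raycast_to_plane(origin=[0, 0, 0], forward=[0, 0, -1],
                       plane_point=[0, 0, -2], plane_normal=[0, 0, 1])
print(hit)  # [0.0, 0.0, -2.0] — the cursor lands at the panel's center
```

Because the cursor is locked to head orientation, every selection requires a neck movement, which is exactly why fatigue became a concern.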
Besides the narrow FOV, lens distortion and the screen-door effect significantly hindered legibility. The distortion at the outer rim of the lenses not only made content illegible but also brought significant discomfort. These challenges shaped our initial UI layout. Because of 6DOF (six degrees of freedom) head tracking, thin lines on screen vibrated with harsh aliasing. To compensate, we used larger fonts with heavier weights and avoided thin strokes in our iconography and UI. We continued to simplify our UI for the most comfortable viewing experience.
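The legibility math behind "larger fonts at a comfortable distance" comes down to angular size: what matters is the angle text subtends at the eye, not its raw world-space height. A back-of-the-envelope sketch, where the one-degree target is an invented illustrative threshold rather than an Oculus guideline:

```python
import math

# Perceived text size in VR is its angular size, a function of both its
# world height and its distance from the viewer. The 1-degree target below
# is a made-up example value, not an actual design spec.

def angular_size_deg(world_height_m, distance_m):
    """Angle (degrees) a panel of the given height subtends at the eye."""
    return math.degrees(2 * math.atan(world_height_m / (2 * distance_m)))

def min_height_for(target_deg, distance_m):
    """World height needed to subtend a target angle at a given distance."""
    return 2 * distance_m * math.tan(math.radians(target_deg) / 2)

# Text placed 2m away needs to be roughly 3.5cm tall to subtend 1 degree:
print(round(min_height_for(1.0, 2.0), 4))  # 0.0349 (meters)
```

Doubling the distance doubles the required height, which is why UI pushed far into the scene forced heavier, larger type.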
Breaking old habits was also something we had to get used to. Design reviews on a monitor were no longer enough. Each interaction and motion design had to be experienced in the headset, a constant reminder to check in VR early to avoid wasting time and effort. Some designs that clearly seemed to fail in 2D surprisingly worked well in VR. This led to a culture of truly embracing VR as our source of truth, moving away from the comfort of the framed screen.
In 2014, a new partnership formed with Samsung, riding on the launch of their new Note 4. Oculus decided to ship our software on their hardware, and Gear VR was born. While we scrambled to apply our learnings in VR, there was still a mountain of infrastructure and foundational features required to offer a consumer-ready, end-to-end product.
Building a platform from scratch was not something we were familiar with. Fundamental elements that make up the foundation of a platform were easily overlooked; as simple as they were, we all tend to take them for granted. We swept each feature for every possible edge case and error state, ensuring we left no user at a dead end.
Due to the ambitious timeline and small team, our challenge was relentless prioritization. Over-scoping and features that were not within reach for launch had to be identified and corrected quickly. We kept our scope tightly focused on core features in order to ship a minimum viable product.
In 2016, we released our official flagship product, the Rift. With the power of the PC and six degrees of freedom, we took our existing infrastructure and continued to scale, pushing the limits of our platform for the PC division. The Rift's integrated spatial audio and haptic controller feedback opened the door to new forms of feedback to support our interface in VR. Along with hardware improvements, the platform also gained new features, including user identities/avatars, voice commands, and multiplayer capabilities.
With new hardware came new design challenges. Input degradation had to be carefully considered, as there were now multiple input methods available on the Rift platform. While the new Fresnel lenses brought clarity by reducing the screen-door effect, high-contrast colors became an issue (e.g. white text on a black background), which limited our color palettes.
Our FOV plays a large role in how we digest information in VR. Because of its narrow view, UI moving across the screen could feel as if the world itself was moving. Motion sickness was drastically reduced by keeping objects in motion clearly within the FOV, or by staggering the UI animation flow to make clear that it was the UI in motion.
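The "clearly within FOV" rule can be expressed as a cone test: compare the angle between the view direction and the direction to a moving object against half the FOV. A small sketch with an assumed 90-degree FOV (an illustrative number, not a hardware spec):

```python
import math

# Illustrative check for whether a moving UI element stays inside the view
# cone. The 90-degree FOV default is an assumption for the example.

def within_fov(forward, to_object, fov_deg=90.0):
    """True if the direction to the object falls inside the view cone.
    `forward` is assumed to be a unit vector."""
    dot = sum(f * t for f, t in zip(forward, to_object))
    mag = math.sqrt(sum(t * t for t in to_object))
    angle = math.degrees(math.acos(dot / mag))
    return angle <= fov_deg / 2

forward = [0, 0, -1]                       # looking down -Z
print(within_fov(forward, [0.3, 0, -1]))   # True: well inside the cone
print(within_fov(forward, [2.0, 0, -1]))   # False: off in the periphery
```

An animation system could run this check per frame and clamp or restage any motion that would drift past the edge of the cone.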
In 2017, we redesigned our platform and introduced Dash (Rift Core 2.0). While our existing platform served its purpose, traversing between applications was still a very disruptive experience. We introduced an overlaid multitasking capability to access both the platform and your PC, rendered right on top of the game. This reduced friction not only within VR, but also when interacting with your PC outside of the headset. It was also our first attempt at solving native input using your hands in VR. Imagine a world where users can check their chat while listening to Spotify, without ever interrupting gameplay.
The UI for Dash was brought much closer, within arm's reach. Moving the UI into the user's personal space helped clearly differentiate platform UI from game UI, avoiding the object clipping that caused discomfort. Objects within reach naturally hint at interactivity, which was key to introducing touch-based interactions, while a laser cursor activated once reaching distance was exceeded. Considering the long-term use of gesture-based interactions was also important: certain positions become exhausting over time, so a lazy mode using the laser cursor was always readily available for those who needed it.
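The touch-versus-laser handoff above is essentially a distance threshold with an override for lazy mode. A minimal sketch, where the 0.6m reach limit is a hypothetical number chosen for illustration:

```python
# Sketch of the reach-based input handoff: touch when the target is within
# arm's reach, laser cursor otherwise. The 0.6m threshold is an assumption.

REACH_LIMIT_M = 0.6

def pick_input_mode(hand_to_target_m, lazy_mode=False):
    """Return the active input mode for a UI target.
    Lazy mode forces the laser so users can avoid tiring arm poses."""
    if lazy_mode:
        return "laser"
    return "touch" if hand_to_target_m <= REACH_LIMIT_M else "laser"

print(pick_input_mode(0.4))                  # touch
print(pick_input_mode(1.2))                  # laser
print(pick_input_mode(0.4, lazy_mode=True))  # laser
```

Keeping the laser available even within reach means the system degrades gracefully for tired users rather than forcing the "correct" gesture.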
Due to the close proximity of the interface, ergonomics were carefully considered in placement, size, tilt angle, and button sizes. Adding hover/touch feedback for hands was especially difficult, since there was nothing stopping your hand from passing through the objects. We avoided stacking objects behind one another and ensured enough space in between to reduce overreaching, which caused accidental selections. With the help of haptic feedback from the controllers, we were able to bring a tangible feel to the interface.
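One common way to handle "nothing stops your hand" is to track the fingertip's signed depth relative to the panel surface and treat crossing it as a press, with a hysteresis band so the button doesn't flicker right at the surface. This is a generic sketch of that technique, not Dash's actual implementation; all thresholds are invented:

```python
# Press detection when hands pass straight through the UI: depth > 0 means
# the fingertip is behind the panel surface. Hysteresis keeps the state
# stable near the surface. Thresholds below are illustrative assumptions.

PRESS_DEPTH_M = 0.005    # fingertip must sink 5mm past the surface to press
RELEASE_DEPTH_M = 0.015  # and come back 15mm in front to release

class TouchButton:
    def __init__(self):
        self.pressed = False

    def update(self, fingertip_depth_m):
        if not self.pressed and fingertip_depth_m > PRESS_DEPTH_M:
            self.pressed = True    # a haptic pulse would fire here
        elif self.pressed and fingertip_depth_m < -RELEASE_DEPTH_M:
            self.pressed = False
        return self.pressed

btn = TouchButton()
print(btn.update(-0.05))  # False: hovering in front of the panel
print(btn.update(0.01))   # True: pushed past the surface
print(btn.update(0.10))   # True: still pressed, even deep through the panel
print(btn.update(-0.02))  # False: pulled back out past the release band
```

Pairing the state change with a haptic pulse is what substitutes for the physical resistance a real button would provide.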
My typical workflow starts by ideating through mood boards, comps, or style frames, then quickly validating scale and interactions through low-fidelity prototypes (grey-boxing) in Unity. Some traditional interactions were validated faster in Framer, then prototyped in Unity to be experienced spatially. As ideas became clearer through the translation to 3D, more detail was applied. Once a prototype became more complex and higher fidelity was required, I partnered with a design engineer to continue the iteration process.
Most times, our ideal version of a prototype was met with harsh realities. Performance and rendering capabilities challenged us again and again to deliver a build that fit within the technical constraints without compromising the experience. It was a true testament to the teamwork that led our cross-functional teams to succeed as one. We ran ongoing user research studies to gather feedback and observe how users interacted with our prototypes. Each session clearly identified problem areas, which was a crucial part of our process.
Dash was designed as a framework with scalable components, allowing us to take a step toward an OS model. We applied this framework to our first-party applications (Oculus Desktop, Library, Explore, etc.), potentially opening up 2D “Panel” apps for our third-party developers. A style guide served as the source of truth, keeping our designers and engineers aligned on a cohesive design language. The component-based design system not only prevented fragmentation of styles but also sped up the workflow for engineers on cross-functional teams.
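The core mechanic of a style-guide-as-source-of-truth is that components reference shared tokens instead of hard-coding values, so a change in one place propagates everywhere. A toy sketch of that idea; the token names and values are invented, not Dash's actual system:

```python
# Toy design-token sketch: components pull styling from one shared table,
# so updating a token restyles every component built from it. All names
# and values here are hypothetical.

TOKENS = {
    "font.size.body": 24,
    "font.weight.body": "semibold",
    "color.text.primary": "#E6E6E6",
    "spacing.button.gap": 0.04,   # meters between buttons, against overreach
}

def make_label(text):
    """Every label pulls its styling from the shared tokens."""
    return {
        "text": text,
        "size": TOKENS["font.size.body"],
        "weight": TOKENS["font.weight.body"],
        "color": TOKENS["color.text.primary"],
    }

label = make_label("Library")
print(label["size"], label["weight"])  # 24 semibold
```

The same table can feed both design tools and engine code, which is what keeps designers and engineers from drifting apart.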
Although we have made great strides since our humble beginnings, VR still feels like the early days of the web. With each year, we take one step closer to bringing this computing platform to the masses, and it has been a true honor to contribute to getting it there.