XR Design / Prototype
UI / UX Design
Sep – Dec '22
Encompass is an exploratory mixed reality smart home system that allows multiple users to interact with their smart homes more naturally and intuitively.
As lightweight MR devices mature, Encompass could let users navigate a smart TV, control the lights, or set up automation scenes around the house simply by using household objects within reach, or through their daily habits and gestures.
We began developing our project as an Immersion Lab experiment, focusing on the concept of objects existing in both virtual and real worlds. As our ideas evolved, we focused on a central theme concerning the future of extended reality (XR) technology. We observed that current market offerings, particularly virtual reality (VR) headsets, impede user interaction with the physical world, limiting potential use cases.
We assert that XR technology should enhance and extend human interaction in the real world, seamlessly integrating with our lives. Mixed reality (MR) offers the greatest potential in this regard, as users can engage with the real world without interruption while benefiting from virtual overlays and enhancements. This realization prompted us to explore how MR could improve aspects of our daily lives, leading us back to a familiar space: our homes and the current smart home experience.
We began by investigating the current market to verify our technology assumptions. Through trend research, we analyzed existing use cases and the direction in which MR technology is developing.
Google has brought back its lightweight augmented reality (AR) glasses for a second attempt, this time designed to be worn in public settings. Features like real-time translation and directions inside the discreet lenses make them potentially even more useful.
Meta's new Quest Pro also represents a trend in MR technology. With the introduction of features like color pass-through, hand and eye-tracking, and physical object mapping in virtual space, it allows for more natural interactions with both real and virtual objects.
We are also seeing an increase in investment in new form factors, technology barriers, and novel use cases for lightweight MR devices from both startups and existing tech companies, indicating the potential for a breakthrough in this concept in the future.
We also looked back at our own homes and smart home use cases, as well as the existing smart home market and products, to identify their problems and how they might evolve when combined with MR technology.
To better address the current solutions' weak points, we looked further into existing XR devices (which may share similarities with future MR devices) and some of the already established smart home standards. Here are two areas we believe we can leverage when designing our vision.
The existing smart home system includes two main parts: personal devices (phones) for basic automation and triggering, and hubs (smart speakers) for control. Lightweight MR devices, however, are more intuitive and accurate because they can track everything from the environment to the user's movement and display information as overlays. This opens up new possibilities for controlling devices, and for more accurate and detailed automations, even down to the posture level.
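To make the idea of posture-level automation concrete, here is a minimal sketch of how tracking data from an MR headset could drive smart home rules. All device names, fields, and actions here are hypothetical illustrations, not part of Encompass itself:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Context:
    """A snapshot of what an MR headset could observe (hypothetical fields)."""
    posture: str  # e.g. "sitting", "lying_down", "standing"
    room: str     # e.g. "living_room", "bedroom"
    hour: int     # local hour of day, 0-23

@dataclass
class Rule:
    """Pairs a matching condition with the action to fire."""
    condition: Callable[[Context], bool]
    action: str

RULES = [
    # Lying down in the bedroom at night -> dim the lights
    Rule(lambda c: c.posture == "lying_down" and c.room == "bedroom" and c.hour >= 21,
         "bedroom_lights.dim(10)"),
    # Sitting in the living room in the evening -> turn on the TV
    Rule(lambda c: c.posture == "sitting" and c.room == "living_room" and c.hour >= 18,
         "tv.power_on()"),
]

def evaluate(context: Context) -> list[str]:
    """Return the actions whose conditions match the current context."""
    return [rule.action for rule in RULES if rule.condition(context)]
```

For example, `evaluate(Context("lying_down", "bedroom", 22))` would return the light-dimming action. A phone-based system can only approximate such rules with time and geofence triggers; headset-level posture and room tracking is what makes this granularity plausible.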
There are also new smart home standards and technologies on the horizon. The upcoming adoption of the new Matter standards will enable cross-ecosystem compatibility for more smart home devices and appliances, allowing them to be controlled with the service the user prefers. Newer wireless standards like WiFi 6E and UWB connections can also enable a hub-less, interconnected IoT network that is distributed and always tracked.
Experimenting with emerging technology on a still-developing platform was an interesting journey for me. Getting hands-on experience with real development tools is invaluable for designers. Here are some things I learned while developing Encompass:
Having hands-on experience creating a scene, experimenting with Unity engine settings, and exploring existing APIs and codebase gave me a better understanding of XR design and development, and why the platform hasn't been popularized yet, even with major pushes from companies like Meta. The constraints of existing APIs and hardware limitations not only challenged us to create novel experiences while considering all the requirements, but also sparked more ideas to improve the workflow in the future.
The original plan for Encompass was to show a complete scenario: a daily routine of how users interact with the smart home system. However, due to time and effort constraints, we had to make trade-offs to finish the design. I learned that the best way to make trade-offs is to evaluate the ultimate goal of the design (in this case, showcasing our vision) against the scope of the entire project. This resulted in our final prototypes, one of which is fully interactive and still demonstrates our complete vision.
There is still much to do regarding both the Encompass vision and the prototypes. Here are some of the points we can further develop based on our existing design:
Haoran and I are also setting up a stage for a live demo of the prototypes at our graduation show. Stay tuned for more updates!