Fall 2022 • XR Designer & Prototyper

Encompass is an MR smart home system that can be controlled through natural interactions like universal gestures and habit-driven automations.

Check Out the Demo

A Little Summary

Team

Tianhao He

Haoran Xu

Weide Zhang

Skills

XR Design / Prototype
Future Casting
UI / UX Design
User Testing

Timeline

Sep – Dec '22
14 Weeks

Tools

Unity
C#
Visual Studio
Figma

Brief

Encompass is an exploratory mixed reality smart home system that allows multiple users to interact with their smart homes more naturally and intuitively.

With the development of lightweight MR devices, Encompass could in the future enable users to navigate a smart TV, control the lights, or set up automation scenes around the house simply by using different household objects within reach, or through their daily habits / gestures.

Background

We began developing our project as an Immersion Lab experiment, focusing on the concept of objects existing in both virtual and real worlds. As our ideas evolved, we focused on a central theme concerning the future of extended reality (XR) technology. We observed that current market offerings, particularly virtual reality (VR) headsets, impede user interaction with the physical world, limiting potential use cases.

We assert that XR technology should enhance and extend human interaction in the real world, seamlessly integrating with our lives. Mixed reality (MR) offers the greatest potential in this regard, as users can engage with the real world without interruption while benefiting from virtual overlays and enhancements. This realization prompted us to explore how MR could improve aspects of our daily lives, leading us back to a familiar space: our homes and the current smart home experience.

The MR Trend

We began by investigating the current market and verifying whether our assumptions about the technology were correct. Through our trend research, we were able to analyze existing use cases and the direction in which MR technology is developing.

Google's Upcoming AR Glasses

Google has brought back its lightweight augmented reality (AR) glasses for a second attempt, allowing users to utilize them in public settings. Features like real-time translation and directions inside the discreet lenses make them potentially even more useful.

Source: CNBC, Google

Meta Quest Pro

Meta's new Quest Pro also represents a trend in MR technology. With the introduction of features like color pass-through, hand and eye-tracking, and physical object mapping in virtual space, it allows for more natural interactions with both real and virtual objects.

Source: Meta

Other Startups and Projects

We are also seeing increased investment from both startups and established tech companies in new form factors, in overcoming technology barriers, and in novel use cases for lightweight MR devices, indicating the potential for a breakthrough in this space in the future.

Source: Nreal, Trusted Reviews

Current Solutions

We also looked back at our own homes, everyday smart home use cases, and the existing market of smart home products, trying to identify their problems and their potential when combined with MR technology.

Voice Assistants

Currently, users can control their homes through smart speakers using natural voice interactions. However, due to the limits of voice user interfaces (VUI), there is much to improve in terms of intuitiveness, precision, and accuracy.

Home Apps and Ecosystems

Users can also control their homes with smart home apps. However, as there are differences in manufacturers and ecosystems, it typically takes more than one app to control devices from different systems.

Dedicated Controllers

Dedicated controllers work only with specific appliances, which limits their usefulness, yet the lack of automation and communication with the rest of the system keeps them necessary, adding further hassle whenever they are lost.

Opportunities

To identify how we can better address the weak points of current solutions, we looked further into existing XR devices (which may share similarities with future MR devices) and some already-established smart home standards. Here are two areas we believe we can leverage when designing our vision.

Opportunity 01

The Nature of Tracking & Displaying

Existing smart home systems include two main parts: personal devices (phones) for basic automation and triggering, and hubs (smart speakers) for control. Lightweight MR devices, however, can be more intuitive and accurate thanks to their ability to track everything from the environment to the user's movement and to display information as overlays. This opens up new possibilities for controlling devices and for more accurate, detailed automations, even down to the posture level.

Opportunity 02

New Smart Home Standards & Technology

There are also new smart home standards and technologies on the horizon. The upcoming adoption of the Matter standard will enable cross-ecosystem compatibility for more smart home devices and appliances, allowing them to be controlled with whichever service the user prefers. Newer wireless standards like Wi-Fi 6E and UWB can also enable a hub-less, interconnected IoT network that is distributed and always tracked.

Our Mission

How might we revolutionize home interaction with mixed-reality-enabled, lightweight devices, allowing users to control their smart home with everyday objects and habits?

ENCOMPASS

Experience

Introducing Encompass, a future mixed reality smart home system that redefines user interaction. With multi-affordance, posture-based automation, multi-user support, and customization through MR and mobile devices, Encompass streamlines smart home control. Embrace intuitive, immersive experiences as you navigate smart TVs, adjust lighting and scenes, and much more using everyday objects and gestures.

Under Construction... 🚧

Hey, looks like you found this page! Encompass is still a work in progress. More content is coming this way, so stay tuned!

Behind-the-Scenes Process

01

Open Source!

The whole Encompass demo is intended to showcase the possibilities of a future where lightweight MR devices are available to the general public. However, given the limitations of current technology and platforms, we could only simulate the experience using Unity and Meta's Oculus Integration package. All the demo source code is open source on GitHub, so feel free to try it out if you have the hardware!

02

The Building & Prototyping Process

Developing with Unity and XR platforms is a relatively new experience for our team. Experimenting with the capabilities and constraints of the existing toolkit has been a great learning experience. We have gained a clearer understanding of the potential and constraints of the current technology, such as gesture tracking accuracy, existing MR passthrough technology, and general prototyping intuitiveness, by diving deep into Meta's Oculus integration and exploring object-oriented programming. It's always fun to view a design possibility from the perspective of a developer!

03

Iterations & User Testing

During the exploration and development of our main prototype, the Multi-Affordance scene, we also conducted several rounds of user testing to gauge reactions to the main features we proposed and to learn what we could improve. The results were surprisingly useful and shaped much of the final prototype, including how users interact with the multi-controller and how gesture recognition can be significantly influenced by the position from which the user performs the gesture.

04

...And With a Little Help from AI

To present the overall concept more persuasively, we also created a small brand guide for Encompass. Utilizing emerging AIGC tools, we were able to quickly iterate through potential names for our concept and goal. We also chose GitHub's new open-source font, Mona, as both the display and paragraph font, thanks to its flexibility as a variable font.

Learnings & Next Steps

Experimenting with emerging technology on a still-developing platform was quite an interesting journey for me. Getting hands-on experience with real development tools is invaluable for designers. Here are some things I learned during the development of Encompass:

Point 01

The State of XR Design & Development

Having hands-on experience creating a scene, experimenting with Unity engine settings, and exploring existing APIs and codebase gave me a better understanding of XR design and development, and why the platform hasn't been popularized yet, even with major pushes from companies like Meta. The constraints of existing APIs and hardware limitations not only challenged us to create novel experiences while considering all the requirements, but also sparked more ideas to improve the workflow in the future.

Point 02

Making Trade-Offs

The original plan for Encompass was to show a complete scenario of a daily routine with the smart home system. However, due to time and effort constraints, we had to make trade-offs to finish the design. I learned that the best way to make trade-offs is to evaluate the ultimate goal of the design (in this case, showcasing our vision) and consider the scope of the entire project. This resulted in our final prototypes, one of which is fully interactive and still demonstrates our vision in full.

What's Next?

There is still much to do regarding both the Encompass vision and the prototypes. Here are some of the points we can further develop based on our existing design:

  • Further expand the prototype to cover more interactions and postures.
  • Revamp the existing UI design to better align with the established brand identity.
  • Further experiment with Oculus's new hardware and APIs, especially MR passthrough and object / scene mapping.

Haoran and I are also preparing a stage for a live demo of the prototypes at our graduation show. Stay tuned for more updates!

You've reached the end!

Thank you for reading! You can check out more cool stuff by smashing either one of the following buttons...