
VR Interior

VR Interaction Design for an Interior Design Application

Problem Statement

Architects and interior designers need a clear understanding of the spatial qualities of their designs. However, imagination and 3D visualization are often their only prototyping tools, and both are insufficient. Wrong design decisions can cost a great deal of money, material, and other resources.

Design Goal

Designing a VR prototyping application that empowers architects and interior designers in their design decision-making process

Overview

Project Type

Individual, independent work

Timeline

April 2020 - September 2020

Tools
  • Unity

  • C# 

  • XR Interaction Toolkit

 

Objective

Designing an immersive experience that helps interior designers review and modify their design work interactively

Approach

After some initial research on the subject, the design phase started with identifying the basic interactions an interior designer needs in a VR app to review and modify their design work. Once those essential interactions were defined, I prototyped them and tested the results with different use cases and users throughout the process.

Demo

The following demo shows the result of this project as a lo-fi VR design and development effort.


Background

3d-printing-scale-architecture-models_ed

I used to work as an architect. Interior designers and architects always try to imagine the spatial quality of their designs and check it with different tools and techniques such as sketching, 3D modelling, and building physical models. VR technology can provide a spatial experience that is very close to the real one, which can help designers decide on the details of their work.

As someone with a background in architecture who now practices UX design for digital products, I decided to build an MVP (Minimum Viable Product) that helps architects and designers make better decisions about their designs.

Basics

To start the project, I interviewed five interior designers about their basic needs when reviewing an interior design scheme. Through these interviews I identified the following basic interactions that they expect from such an app:

  • Moving Objects in the space

  • Rotating Objects around their axis

  • Duplicating objects 

  • Removing Objects

  • Measuring distances

  • Changing lighting settings

  • Changing materials and colors

I decided to design these interactions for the Oculus Quest VR headset, which offers 6 degrees of freedom and two Touch controllers.

xOculus-Quest-Button.png

Although the output of this project is platform agnostic because it was built with the XR Interaction Toolkit, I refer to controller buttons using the Oculus Quest headset that I used to design and develop the project.

Prototyping

I started prototyping the tool by making a minimum viable program in Unity that allowed users to manipulate some basic objects in the space. The basic manipulations, sketched in code after the list below, were:

  • Displacing objects

  • Rotating objects
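
As a rough illustration of this early stage, here is a minimal sketch of how the prototype's thumbstick-driven displacement could be wired up with Unity's low-level XR input API. The class and field names (PrototypeObjectMover, selected) are hypothetical; the actual prototype is built on the XR Interaction Toolkit.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical early-prototype manipulator: reads the right-hand thumbstick
// and displaces the currently selected object along fixed world axes.
public class PrototypeObjectMover : MonoBehaviour
{
    public Transform selected;        // object picked with the selection laser
    public float moveSpeed = 1.0f;    // meters per second

    void Update()
    {
        if (selected == null) return;

        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (rightHand.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
        {
            // World-axis movement on the ground plane. Note that it ignores where
            // the user is facing, which the user testing later flagged as a problem.
            selected.Translate(new Vector3(stick.x, 0f, stick.y) * moveSpeed * Time.deltaTime, Space.World);
        }
    }
}
```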

At this stage I also tested some 2D and 3D interfaces for further design and development. A demo of the work at this stage is shown below:

User Testing

I tested the product at this stage to see whether users found the defined interactions intuitive and whether the UI elements made sense to them. The major issues I found were:

  • Thumbstick movements do not adjust to the user's orientation, which makes the manipulation process very hard.

  • The text that shows the interaction mode moves with the head and is easily missed when the user is looking down.

  • The red sphere that marks the active object and can deactivate it is too small to interact with comfortably.

  • Moving objects on the wall with hand movement makes precise placement very hard; it is easy to do, but not precise enough.

UserTesting_edited.jpg

User Interaction Design

After selecting the platform and gathering basic user input, I defined the interactions considered vital to the application and mapped them to the available inputs on the selected device. The following interactions were the result of this process.

Basic Movements

Locomotion

For simple movement in the space, the user can use the Thumbstick on the right controller.

Teleportation

For instant movements and long-distance displacements, teleportation is used in VR. The user can activate the teleportation arc with the X button and, after selecting the destination, press the Grip button to complete the movement.

Snap Rotation

When the user wants to turn around in the space and look at things that are out of their field of view, they can use the Thumbstick on the left controller to do this interaction.
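
Below is a minimal sketch of how locomotion and snap rotation could be implemented, assuming a rig transform and a head (camera) transform; the class name BasicMovement and the field names are illustrative, and the project itself wires these behaviors up through the XR Interaction Toolkit's locomotion system.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative rig mover: continuous locomotion on the right thumbstick
// and snap rotation on the left thumbstick.
public class BasicMovement : MonoBehaviour
{
    public Transform rig;              // the XR rig root
    public Transform head;             // the main camera / HMD transform
    public float moveSpeed = 1.5f;     // meters per second
    public float snapAngle = 30f;      // degrees per snap turn

    private bool snapArmed = true;     // prevents repeated turns while the stick is held

    void Update()
    {
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        InputDevice left  = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);

        // Locomotion: move the rig in the direction the user is facing.
        if (right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 move))
        {
            Vector3 forward  = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
            Vector3 rightDir = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;
            rig.position += (forward * move.y + rightDir * move.x) * moveSpeed * Time.deltaTime;
        }

        // Snap rotation: one discrete turn per flick of the left thumbstick.
        if (left.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 turn))
        {
            if (snapArmed && Mathf.Abs(turn.x) > 0.7f)
            {
                rig.Rotate(0f, Mathf.Sign(turn.x) * snapAngle, 0f);
                snapArmed = false;
            }
            else if (Mathf.Abs(turn.x) < 0.2f)
            {
                snapArmed = true;
            }
        }
    }
}
```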

User Interfaces

The user interfaces in this project are of several types: two-dimensional interfaces that overlay the view, spatial interfaces that communicate a specific condition, and meta UI that simply represents a state to inform the user.

3D UI Elements

Object Active Box

When an object is selected in the scene, a transparent yellow box appears around it, and if any interaction is active on that object, an interaction indicator shows the type of interaction.

ObjectIndicator_edited_edited.jpg
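
A minimal sketch of this indicator behavior is shown below, assuming the yellow box and the label are pre-authored child objects of each placeable item; the component and field names (ObjectActiveIndicator, highlightBox, interactionLabel) are hypothetical.

```csharp
using UnityEngine;

// Hypothetical selection indicator: toggles a pre-authored transparent yellow
// box and a small label that names the active interaction.
public class ObjectActiveIndicator : MonoBehaviour
{
    public GameObject highlightBox;    // semi-transparent yellow cube around the object
    public TextMesh interactionLabel;  // label showing the active interaction

    public void SetSelected(bool selected)
    {
        highlightBox.SetActive(selected);
        if (!selected) interactionLabel.text = "";
    }

    public void SetInteraction(string interactionName)
    {
        interactionLabel.text = interactionName;  // e.g. "Move", "Rotate"
    }
}
```
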
Selection Laser

When the user wants to select an object, they can press the Trigger button on either controller to show the laser line. If the laser is red, the object is not selectable; if it is white, the object can be selected.

invalidSelection_edited_edited.jpg

Invalid object

validSelection_edited_edited_edited.jpg

Valid object
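
Below is a minimal sketch of the laser validity feedback described above: a LineRenderer drawn from the controller, colored white when the ray hits a selectable object and red otherwise. The "Selectable" layer name is an assumption made for illustration.

```csharp
using UnityEngine;

// Sketch of the selection laser: colors the line based on whether the ray
// hits an object on the (assumed) "Selectable" layer.
[RequireComponent(typeof(LineRenderer))]
public class SelectionLaser : MonoBehaviour
{
    public float maxDistance = 10f;
    private LineRenderer line;

    void Awake()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2;
    }

    void Update()
    {
        Vector3 origin = transform.position;
        Vector3 end = origin + transform.forward * maxDistance;

        bool valid = false;
        if (Physics.Raycast(origin, transform.forward, out RaycastHit hit, maxDistance))
        {
            end = hit.point;
            valid = hit.collider.gameObject.layer == LayerMask.NameToLayer("Selectable");
        }

        line.SetPosition(0, origin);
        line.SetPosition(1, end);
        line.startColor = line.endColor = valid ? Color.white : Color.red;
    }
}
```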

2D UI Elements

Interactions UI

When a user selects an object, an interaction interface appears showing the interactions that can be performed on that object. If an interaction is not available for the selected object, it is grayed out in the UI.

ObjectInteractions_edited.jpg
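
A minimal sketch of how unavailable interactions could be grayed out is shown below, assuming the menu holds a Unity UI Button per interaction; the class name InteractionsMenu and the per-object capability check are illustrative assumptions.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the interactions UI: disables buttons that do not apply
// to the currently selected object.
public class InteractionsMenu : MonoBehaviour
{
    public Button moveButton;
    public Button rotateButton;
    public Button duplicateButton;
    public Button removeButton;
    public Button lightButton;

    public void ShowFor(GameObject selected)
    {
        // Interactions that every placeable object supports.
        moveButton.interactable = true;
        rotateButton.interactable = true;
        duplicateButton.interactable = true;
        removeButton.interactable = true;

        // The Light toggle is only available if the object holds light sources.
        lightButton.interactable = selected.GetComponentInChildren<Light>() != null;
    }
}
```
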
Object Removal UI

Because the Remove interaction is destructive, the app asks for confirmation before executing it to prevent accidental deletion.

RemoveUi_edited_edited.jpg
Daytime Slider

Users can hold the Menu button on the left controller for one second to activate the menu and hold it again to deactivate it.

MainMenu_edited_edited.jpg

The Main Menu is intended for changing global settings in the app, importing and exporting objects, and other use cases. In this implementation, however, only changing the time of day is an active interaction in the menu.

DaytimeUi_edited_edited.jpg
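
Below is a minimal sketch of the hold-to-toggle behavior described above, assuming the menu is a pre-built panel that is simply shown and hidden; the class name MainMenuToggle and its fields are illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch of the hold-to-toggle behavior: holding the left controller's Menu
// button for one second shows or hides the main menu panel.
public class MainMenuToggle : MonoBehaviour
{
    public GameObject menuPanel;       // the 2D main menu overlay
    public float holdDuration = 1f;    // seconds the button must be held

    private float heldFor;
    private bool toggledThisHold;

    void Update()
    {
        InputDevice left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        left.TryGetFeatureValue(CommonUsages.menuButton, out bool pressed);

        if (pressed)
        {
            heldFor += Time.deltaTime;
            if (heldFor >= holdDuration && !toggledThisHold)
            {
                menuPanel.SetActive(!menuPanel.activeSelf);
                toggledThisHold = true;   // only toggle once per hold
            }
        }
        else
        {
            heldFor = 0f;
            toggledThisHold = false;
        }
    }
}
```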

Object Manipulations

Move

After the Move interaction is selected from the UI for the selected object, the Thumbstick on the right controller no longer controls locomotion; it only moves the object in the space.

If the object is a floor object, it moves along the floor it is attached to; the same applies to objects attached to walls and the ceiling.

The app also adjusts the Thumbstick movements according to the direction the user is facing; a minimal sketch of this mapping follows.
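
This sketch assumes the plane the object is attached to is described by a normal vector (up for floor objects, the wall normal for wall objects); the class name MoveMode and its fields are illustrative, not the project's actual code.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch of the Move mode: the right thumbstick displaces the selected object
// on the plane it is attached to, with the input re-mapped to the direction
// the user is currently facing.
public class MoveMode : MonoBehaviour
{
    public Transform selected;                 // object being moved
    public Transform head;                     // HMD transform
    public Vector3 planeNormal = Vector3.up;   // up for floor objects, wall normal for wall objects
    public float moveSpeed = 0.5f;             // meters per second

    void Update()
    {
        if (selected == null) return;

        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick)) return;

        // Build movement axes from the user's view direction, projected onto
        // the object's plane so the object never leaves its surface.
        Vector3 forward  = Vector3.ProjectOnPlane(head.forward, planeNormal).normalized;
        Vector3 rightDir = Vector3.ProjectOnPlane(head.right, planeNormal).normalized;

        selected.position += (forward * stick.y + rightDir * stick.x) * moveSpeed * Time.deltaTime;
    }
}
```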

Rotate

After the Rotate interaction is selected from the UI for the selected object, the Thumbstick on the right controller no longer controls locomotion; it only rotates the object around its own axis, perpendicular to the plane the object sits on.
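
A minimal sketch of this mode, under the same plane-normal assumption as the Move sketch above (RotateMode and its fields are illustrative names):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch of the Rotate mode: the right thumbstick's X axis spins the selected
// object around the normal of the plane it sits on.
public class RotateMode : MonoBehaviour
{
    public Transform selected;
    public Vector3 planeNormal = Vector3.up;  // up for floor objects, wall normal for wall objects
    public float rotateSpeed = 90f;           // degrees per second

    void Update()
    {
        if (selected == null) return;

        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
        {
            selected.Rotate(planeNormal, stick.x * rotateSpeed * Time.deltaTime, Space.World);
        }
    }
}
```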

Duplicate

After the Duplicate interaction is selected from the UI, the object is duplicated, and the user can select the copy to move, rotate, or perform other interactions on it.

Remove

After the Remove interaction is selected from the UI, an overlay 2D UI appears asking the user to confirm the removal. Selecting Yes removes the object from the scene.
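
A minimal sketch of the Duplicate and confirmed Remove flows is shown below; it assumes the confirmation dialog's Yes button is wired to ConfirmRemove, and the class and method names are illustrative.

```csharp
using UnityEngine;

// Sketch of the Duplicate and Remove interactions.
public class ObjectLifecycle : MonoBehaviour
{
    public GameObject confirmationDialog;   // overlay 2D UI asking for confirmation
    private GameObject pendingRemoval;

    public GameObject Duplicate(GameObject selected)
    {
        // Spawn a copy slightly offset so the user can see and select it.
        return Instantiate(selected,
                           selected.transform.position + Vector3.right * 0.3f,
                           selected.transform.rotation);
    }

    public void RequestRemove(GameObject selected)
    {
        pendingRemoval = selected;
        confirmationDialog.SetActive(true);
    }

    public void ConfirmRemove()   // wired to the Yes button
    {
        if (pendingRemoval != null) Destroy(pendingRemoval);
        confirmationDialog.SetActive(false);
    }
}
```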

Light On/Off

If the selected object contains light sources, the Light button is enabled in the 2D UI and the user can turn that object's lights on or off.
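
A minimal sketch of this toggle, assuming light fixtures are prefabs with Light components as children (LightToggle is an illustrative name):

```csharp
using UnityEngine;

// Sketch of the Light On/Off interaction: toggles every Light component
// found under the selected object (e.g. a floor lamp prefab).
public class LightToggle : MonoBehaviour
{
    public void Toggle(GameObject selected)
    {
        foreach (Light light in selected.GetComponentsInChildren<Light>())
        {
            light.enabled = !light.enabled;
        }
    }
}
```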

Change Daylight/Daytime

By activating the Main Menu and selecting Day Time, the user sees a slider UI that lets them change the time of day and see the result as a change in the ambient light of the space.
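
Below is a minimal sketch of how the slider could drive the lighting, assuming a 0-24 hour slider range and a single directional light acting as the sun; the sun-angle mapping and the class name DaytimeSlider are illustrative assumptions.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the Day Time slider: the slider value (0-24 hours) rotates the
// sun (a directional light) and scales the ambient intensity accordingly.
public class DaytimeSlider : MonoBehaviour
{
    public Slider slider;          // range set to 0..24 in the Inspector
    public Light sun;              // the scene's directional light

    void Start()
    {
        slider.onValueChanged.AddListener(SetTimeOfDay);
    }

    void SetTimeOfDay(float hour)
    {
        // 6:00 -> sunrise at the horizon, 12:00 -> sun overhead, 18:00 -> sunset.
        float sunAngle = (hour - 6f) / 12f * 180f;
        sun.transform.rotation = Quaternion.Euler(sunAngle, 0f, 0f);

        // Dim the ambient light outside daytime hours.
        RenderSettings.ambientIntensity = Mathf.Clamp01(Mathf.Sin(sunAngle * Mathf.Deg2Rad));
    }
}
```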

Expert Reviews

After making the project demo at this stage, I sent it to several well-known experts in industry and academia and asked them for feedback. Here is some of the feedback I received:

0.jpg

UX Design Lead for YouTube VR

" Interior design is definitely one of the most valuable use cases for headsets, as well as architecture... Regarding the interface, I'd recommend moving the buttons you've got from the face to the non-dominant hand. Then you can hold up the menu with one hand and point at it with the other. Tilt-brush is an example of one of many apps that do this. "

1516337615920_edited.jpg

XR Senior Designer at Microsoft

" My only feedback will be trying to make the tool less modal. Right now it seems like the user needs to switch between modes (translate, rotate, scale, etc..) very often. That'd probably make a slow workflow for him. Rather than that, I think your tool is super neat! Congrats! "

1566019602605.jpg

Senior Designer at Microsoft

" What I recommend to increase the affordance/reduce friction are: 1) provide a grid system to know where your object is placed (especially a symmetry is required). 2) pre-configured rotation to make it simpler (custom rotation option can be provided as well). I would also consider more UI feedback on object selection and what editing mode the user is in. It was difficult for me to tell what object is being copied, etc."

Documentation

VR projects can become very complex in both the prototyping and development stages. Constant documentation, and keeping that documentation up to date, is a necessary part of the job because there are many tiny details that can cause the program to break at some point.

I used the Notion.io platform to document this project, recording my learning experience, decisions, and lessons learned as much as possible.

doc1.PNG

Reflection

In this project I learned a great deal about designing for immersive experiences and specifically for VR platforms. My understanding of design and prototyping for VR changed completely during this project, and I found that engineering ideas into a testable prototype is the main challenge of the process. Looking back, if I were to do this project again, I would address the following points:

  • I would look for more ready-made libraries and toolkits that let me reach the user testing stage sooner. One of the best decisions in this project was using the Unity XR Interaction Toolkit; it saved me a lot of time by handling basic object interaction and letting me customize my interactions.

  • I would think of ways to change interaction modes without bringing up the 2D menu. The expert reviews suggested avoiding modes, since they make the user workflow cumbersome and slow.

  • I would use a handheld menu instead of a head-locked menu. The menu I used in this project is a large display that appears and disappears based on user input and follows the user's head movements. I would begin with a handheld menu, since there is no need to hide it and the interaction is much easier.
