VR Interior

VR Interaction Design for an Interior Design Application

Problem Statement

Architects and interior designers need a clear understanding of the spatial qualities of their designs. However, their main prototyping tools, imagination and 3D visualization on flat screens, are often insufficient. Wrong design decisions can cost significant money, material, and other resources.

Design Goal

Designing a VR prototyping application for architects and interior designers that empowers them in their design decision-making process.

 

Overview

Project Type

Individual, independent work

Timeline

April 2020 - September 2020

Tools
  • Unity

  • C# 

  • XR Interaction Toolkit

 

Objective

Designing an immersive experience that helps interior designers review and modify their designs interactively.

Approach

After doing some research on the subject, the design phase started with identifying the basic interactions an interior designer needs in a VR app to review and modify their design work. After identifying the interactions, I mapped them to the available user inputs on the selected device.

Demo

The following demo shows the result of this project: a lo-fi VR design and development prototype.

 


Background


I used to work as an architect. Interior designers and architects constantly try to imagine the spatial quality of their designs and check it with tools and techniques like sketching, 3D modelling, and building physical models. VR technology can provide a spatial experience very close to the real one, and that can help designers decide on the details of their work.

As someone with a background in architecture who now practices UX design for digital products, I decided to work on an MVP (Minimum Viable Product) that helps architects and designers make better decisions about their designs.

Basics

To start this project, I interviewed 5 interior designers about their basic needs when reviewing an interior design scheme. Through these interviews, I identified the following as the basic interactions they expect from such an app:

  • Moving Objects in the space

  • Rotating Objects around their axis

  • Duplicating objects 

  • Removing Objects

  • Measuring distances

  • Changing lighting settings

  • Changing materials and colors

I decided to design these interactions for the Oculus Quest VR headset, which offers 6 degrees of freedom and two Touch controllers.


Though the output of this project is platform-agnostic thanks to the XR Interaction Toolkit used in its development, I refer to controller buttons using the Oculus Quest specifications, since that is the headset I used to design and develop the project.

Prototyping

I started prototyping the tool by making a minimum viable program in Unity that allowed users to manipulate some basic objects in the space. The basic manipulations were:

  • Displacing objects

  • Rotating objects

At this stage, I also tested some 2D and 3D interfaces for further design and development. A demo of the work at this stage is shown below:

User Testing

I tested the product at this stage to see whether users found the defined interactions intuitive and the UI elements understandable. The major issues I found were:

  • Thumbstick directions do not adjust to the user's orientation, which makes the manipulation process very hard.

  • The text that shows the interaction mode moves with the head, so it is easily missed when the user is looking down.

  • The red sphere that marks the active object (and deactivates it) is too small to interact with comfortably.

  • Moving wall objects with direct hand movement feels easy, but it makes precise placement very hard.


User Interaction Design

After selecting the platform and gathering basic user input, I started defining the interactions considered vital to the application and mapping them to the available user inputs on the selected device. The following interactions were the result of this work.

Basic Movements

Locomotion

For simple movement in the space, the user can use the Thumbstick on the right controller.
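The locomotion behavior described above can be sketched roughly as follows. This is a hypothetical Unity C# sketch, not the project's actual code; the rig and camera references and the movement speed are assumptions:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: smooth locomotion driven by the right thumbstick,
// moving the rig relative to where the headset is facing.
public class SmoothLocomotion : MonoBehaviour
{
    [SerializeField] private Transform xrRig;      // root of the player rig
    [SerializeField] private Transform headCamera; // the headset camera
    [SerializeField] private float speed = 1.5f;   // meters per second

    void Update()
    {
        var right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
        {
            // Build a movement direction on the horizontal plane,
            // relative to the head's forward direction.
            Vector3 forward = Vector3.ProjectOnPlane(headCamera.forward, Vector3.up).normalized;
            Vector3 rightDir = Vector3.ProjectOnPlane(headCamera.right, Vector3.up).normalized;
            Vector3 move = forward * stick.y + rightDir * stick.x;
            xrRig.position += move * speed * Time.deltaTime;
        }
    }
}
```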

Teleportation

For instant movements and long-distance displacement, teleportation is used in VR. Users can activate the teleportation arc with the X button and, after selecting a destination, press the Grip button to complete the movement.
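The X-to-aim, Grip-to-teleport flow could be sketched like this. It is a hypothetical Unity C# sketch (not the project's actual code); the ray interactor reference and the instant position change are simplifying assumptions, since the XR Interaction Toolkit also offers a dedicated teleportation provider:

```csharp
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: shows the teleport ray while X is held and, if the
// ray hits a valid destination, moves the rig there when Grip is pressed.
public class TeleportOnGrip : MonoBehaviour
{
    [SerializeField] private XRRayInteractor teleportRay; // arc-style ray interactor
    [SerializeField] private Transform xrRig;

    void Update()
    {
        var left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        left.TryGetFeatureValue(CommonUsages.primaryButton, out bool xPressed); // X button
        left.TryGetFeatureValue(CommonUsages.gripButton, out bool gripPressed);

        // Show the arc only while X is held.
        teleportRay.gameObject.SetActive(xPressed);

        if (xPressed && gripPressed &&
            teleportRay.TryGetCurrent3DRaycastHit(out RaycastHit hit))
        {
            xrRig.position = hit.point; // instant displacement to the aimed point
        }
    }
}
```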

Snap Rotation

When the user wants to turn around in the space and look at things that are out of their field of view, they can use the Thumbstick on the left controller to do this interaction.
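Snap rotation like this typically fires once per stick flick and waits for the stick to recenter. A hypothetical Unity C# sketch (the rig reference, snap angle, and thresholds are assumptions, not the project's actual values):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: snap-rotates the XR rig by a fixed angle when the
// left thumbstick is pushed past a threshold, then waits for it to recenter.
public class SnapRotation : MonoBehaviour
{
    [SerializeField] private Transform xrRig;       // root of the player rig
    [SerializeField] private float snapAngle = 45f; // degrees per snap
    private bool readyToSnap = true;

    void Update()
    {
        var left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        if (left.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
        {
            if (readyToSnap && Mathf.Abs(stick.x) > 0.8f)
            {
                xrRig.Rotate(0f, Mathf.Sign(stick.x) * snapAngle, 0f);
                readyToSnap = false; // one snap per flick
            }
            else if (Mathf.Abs(stick.x) < 0.2f)
            {
                readyToSnap = true;  // stick returned to center
            }
        }
    }
}
```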

User Interfaces

User interfaces in this project were of various types: two-dimensional interfaces overlaid on the view, spatial interfaces that indicate a specific condition, and meta UI that simply represents a state to inform the user.

3D UI Elements

Object Active Box

When an object is selected in the scene, a transparent yellow box appears around it, and if any interaction is active on that object, an interaction indicator shows the type of interaction.

Selection Laser

When users want to select an object, they can press the Trigger button on either controller to show the laser line. A red laser line means the object is not selectable; a white one means the object can be selected.


Invalid object


Valid object
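The red/white laser feedback could be implemented roughly like this. A hypothetical Unity C# sketch, not the project's actual code; the "Selectable" layer name and the raycast distance are assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch: colors the selection laser white when it points at a
// selectable object and red otherwise.
public class SelectionLaser : MonoBehaviour
{
    [SerializeField] private LineRenderer laser;     // the rendered laser line
    [SerializeField] private float maxDistance = 10f;

    void Update()
    {
        // Cast from the controller; valid only if the first hit is on the
        // assumed "Selectable" layer.
        bool valid = Physics.Raycast(transform.position, transform.forward,
                                     out RaycastHit hit, maxDistance)
                     && hit.collider.gameObject.layer == LayerMask.NameToLayer("Selectable");

        Color c = valid ? Color.white : Color.red;
        laser.startColor = c;
        laser.endColor = c;
    }
}
```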

2D UI Elements

Interactions UI

When a user selects an object, an interaction interface appears showing the interactions that can be applied to that object. Interactions that are not possible for the selected object are grayed out in the UI.

Object Removal UI

If the user selects the Remove interaction, which is destructive, the app asks for confirmation to prevent accidental removal.

Daytime Slider

Users can hold the Menu button on the left controller for 1 second to activate the main menu, and hold it again to deactivate it.

The main menu is intended for changing global settings, importing and exporting objects, and other use cases. In this implementation, however, only changing the time of day is an active interaction in this menu.
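The hold-for-one-second toggle could be sketched as below. This is a hypothetical Unity C# sketch, not the project's actual code; the menu object reference and hold duration are assumptions:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: toggles the main menu after the Menu button on the
// left controller has been held for one second.
public class MenuHoldToggle : MonoBehaviour
{
    [SerializeField] private GameObject mainMenu;
    [SerializeField] private float holdTime = 1f;
    private float heldFor;
    private bool toggledThisHold;

    void Update()
    {
        var left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        left.TryGetFeatureValue(CommonUsages.menuButton, out bool pressed);

        if (pressed)
        {
            heldFor += Time.deltaTime;
            if (heldFor >= holdTime && !toggledThisHold)
            {
                mainMenu.SetActive(!mainMenu.activeSelf);
                toggledThisHold = true; // only one toggle per continuous hold
            }
        }
        else
        {
            heldFor = 0f;
            toggledThisHold = false;
        }
    }
}
```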

Object Manipulations

Move

After the Move interaction is selected from the UI, the Thumbstick on the right controller no longer controls locomotion; instead, it moves the selected object in the space.

If the object is a floor object, it moves along the floor it is attached to; the same applies to wall and ceiling objects.

Also the app is smart enough to adjust the Thumbstick movements according to the direction that the user is facing.
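Constraining movement to the object's plane while remapping the stick to the user's facing direction could look like this. A hypothetical Unity C# sketch, not the project's actual code; the method name, references, and speed are assumptions:

```csharp
using UnityEngine;

// Hypothetical sketch: moves the selected object on its attachment plane
// using the right thumbstick, with stick directions remapped to the
// direction the user is facing.
public class MoveSelectedObject : MonoBehaviour
{
    [SerializeField] private Transform selectedObject;
    [SerializeField] private Transform headCamera;
    [SerializeField] private float moveSpeed = 0.5f; // meters per second

    // planeNormal is Vector3.up for floor objects and the wall's normal
    // for wall objects, so movement stays on the attachment plane.
    public void MoveOnPlane(Vector2 stick, Vector3 planeNormal)
    {
        // Project the head's forward/right vectors onto the object's plane
        // so "up" on the stick always means "away from the user".
        Vector3 forward = Vector3.ProjectOnPlane(headCamera.forward, planeNormal).normalized;
        Vector3 right = Vector3.ProjectOnPlane(headCamera.right, planeNormal).normalized;
        Vector3 delta = (forward * stick.y + right * stick.x) * moveSpeed * Time.deltaTime;
        selectedObject.position += delta;
    }
}
```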

Rotate

After the Rotate interaction is selected from the UI, the Thumbstick on the right controller no longer controls locomotion; instead, it rotates the selected object around its own axis perpendicular to the plane the object sits on.

Duplicate

After the Duplicate interaction is selected from the UI, the object is duplicated, and the user can select the copy to move, rotate, or apply other interactions to it.

Remove

After the Remove interaction is selected from the UI, a 2D overlay UI appears asking the user to confirm the removal. Selecting Yes removes the object from the scene.

Light On/Off

If the selected object contains light sources, the Light button is enabled in the 2D UI, and the user can turn that object's light on or off.

Change Daylight/Daytime

By activating the main menu and selecting Day Time, users see a UI with a slider that lets them change the time of day and see the result as a change in the ambient light of the space.
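A common way to implement such a slider is to map the chosen hour to the elevation of the scene's directional (sun) light. A hypothetical Unity C# sketch, not the project's actual code; the slider range and angle mapping are assumptions:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: maps a 0-24 slider value to the sun's elevation by
// rotating the scene's directional light, which changes the ambient light.
public class DaytimeSlider : MonoBehaviour
{
    [SerializeField] private Slider timeSlider; // assumed range: 0..24 hours
    [SerializeField] private Light sunLight;    // the scene's directional light

    void Start()
    {
        timeSlider.onValueChanged.AddListener(SetTimeOfDay);
    }

    void SetTimeOfDay(float hour)
    {
        // 6:00 -> sunrise (0 deg), 12:00 -> noon (90 deg), 18:00 -> sunset (180 deg)
        float sunAngle = (hour - 6f) * 15f; // 15 degrees of elevation per hour
        sunLight.transform.rotation = Quaternion.Euler(sunAngle, 30f, 0f);
    }
}
```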

Expert Reviews

After making the project demo at this stage, I sent it to some well-known experts working in industry and academia and asked them for feedback. Here is some of the feedback I received:


UX Design Lead for YouTube VR

" Interior design is definitely one of the most valuable use cases for headsets, as well as architecture... Regarding the interface, I'd recommend moving the buttons you've got from the face to the non-dominant hand. Then you can hold up the menu with one hand and point at it with the other. Tilt-brush is an example of one of many apps that do this. "


XR Senior Designer at Microsoft

" My only feedback will be trying to make the tool less modal. Right now it seems like the user needs to switch between modes (translate, rotate, scale, etc..) very often. That'd probably make a slow workflow for him. Rather than that, I think your tool is super neat! Congrats! "


Senior Designer at Microsoft

" What I recommend to increase the affordance/reduce friction are: 1) provide a grid system to know where your object is placed (especially a symmetry is required). 2) pre-configured rotation to make it simpler (custom rotation option can be provided as well). I would also consider more UI fe