
VR Interior

VR Interaction Design for an Interior Design Application

Problem Statement

Architects and interior designers need a clear understanding of the spatial qualities of their designs. However, imagination and 3D visualizations of spaces are their only prototyping tools, and these are insufficient. Wrong design decisions can cost a lot of money, material, and other resources.

Design Goal

Designing a VR prototyping application for architects and interior designers to empower them in their design decision-making process

Overview

 

Objective

Designing an immersive experience that helps interior designers review and modify their design work interactively


Approach

After initial research on the subject, the design phase started with identifying the basic interactions an interior designer needs in a VR app to review and modify their design work. Once the interactions necessary for the experience were identified, I started prototyping them, testing with different use cases and users throughout the process.


Demo

The following demo shows the result of this project as a lo-fi VR design and development project.

Project Type

Individual, independent work

Tools
  • Unity

  • C# 

  • XR Interaction Toolkit

Timeline

1st iteration May - Sep 2020

2nd iteration Mar - May 2022

I updated this experience based on the experts' review feedback and my own findings from testing with users. The demo was recorded with a VR headset, and the video was edited in Adobe After Effects to highlight critical moments in the experience.

Experts' reviews of the first iteration and the major changes are explained on this page.

Second iteration demo

Updated Spring 2022

Prototyping

Overall, the prototyping for this project had 4 stages:

To start this project, I interviewed 5 interior designers about their basic needs when reviewing an interior design scheme. Through these interviews, I identified the basic interactions that they expect from such an app.

I started prototyping the tool by building a minimum viable program in Unity that allowed users to manipulate some basic objects in the space (a rough code sketch follows the list below). The basic manipulations were:

  • Displacing objects

  • Rotating objects
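
As a rough illustration only, not the project's actual code, the sketch below shows one way these basic manipulations could be wired up with the XR Interaction Toolkit; the FurnitureSetup component and the "Furniture" tag are hypothetical names.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical helper: makes every object tagged "Furniture" grabbable so it can be
// displaced and rotated with the controllers (assumes each object has a Collider).
public class FurnitureSetup : MonoBehaviour
{
    void Start()
    {
        foreach (var go in GameObject.FindGameObjectsWithTag("Furniture"))
        {
            if (!go.TryGetComponent<Rigidbody>(out var rb))
                rb = go.AddComponent<Rigidbody>();
            rb.isKinematic = true;   // keep furniture from falling while it is handled
            rb.useGravity = false;

            if (!go.TryGetComponent<XRGrabInteractable>(out var grab))
                grab = go.AddComponent<XRGrabInteractable>();
            grab.movementType = XRBaseInteractable.MovementType.Kinematic;
        }
    }
}
```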

Also in this stage, I tested some 2D and 3D interfaces for further design and development. A demo of the work at this stage is shown below:

  • Moving objects in the space

  • Rotating objects around their axis

  • Duplicating objects

  • Removing objects

  • Measuring distances

  • Changing lighting settings

  • Changing materials and colors

Interaction Design

After selecting the platform and gathering basic user input, I defined the interactions that were vital to the application and mapped them to the available user inputs on the selected device. I then designed and developed those interactions into a prototype. The result was two iterations; parts of the first iteration and all of the second iteration designs are presented here.
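
To make that mapping concrete, here is a minimal sketch of how interaction modes could be tied to controller input with Unity's XR input API; the mode names and dispatch structure are assumptions, not the project's actual implementation.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch (assumed structure): each vital interaction is an explicit mode, and the
// right-controller thumbstick is read each frame and handed to whichever mode is active.
public class InteractionModeController : MonoBehaviour
{
    public enum Mode { None, Move, Rotate, Duplicate, Remove }
    public Mode current = Mode.None;

    void Update()
    {
        var right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
            return;

        switch (current)
        {
            case Mode.Move:   // stick.x / stick.y translate the selected object (see Move Object)
                break;
            case Mode.Rotate: // stick.x spins the selected object (see Rotate Object)
                break;
        }
    }
}
```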

First Iteration (Summer 2020)

In the first iteration, active objects were enclosed in a transparent box to show their active status, and an interaction UI showed the current interaction in text format.

In the first iteration, the 2D UI was a head-up display (HUD) that appeared when an interactable object was selected and disappeared when it was deselected.

In the first iteration, some UI elements appeared when buttons on the controller were pressed; these were removed from the experience in the 2nd iteration.
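
For context, this is roughly how such a selection-driven HUD can be toggled with the XR Interaction Toolkit's select events; a minimal sketch assuming a recent toolkit version, with hudPanel as a placeholder reference.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: show a HUD panel while an interactable is selected, hide it on deselect.
[RequireComponent(typeof(XRBaseInteractable))]
public class SelectionHud : MonoBehaviour
{
    [SerializeField] GameObject hudPanel;   // placeholder reference to the 2D HUD

    XRBaseInteractable interactable;

    void OnEnable()
    {
        interactable = GetComponent<XRBaseInteractable>();
        interactable.selectEntered.AddListener(OnSelected);
        interactable.selectExited.AddListener(OnDeselected);
    }

    void OnDisable()
    {
        interactable.selectEntered.RemoveListener(OnSelected);
        interactable.selectExited.RemoveListener(OnDeselected);
    }

    void OnSelected(SelectEnterEventArgs args)  => hudPanel.SetActive(true);
    void OnDeselected(SelectExitEventArgs args) => hudPanel.SetActive(false);
}
```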

After making the first iteration demo in summer 2020, I sent it to several well-known experts in industry and academia and asked them for feedback. Here is some of the feedback I received:


UX Design Lead for YouTube VR

" Interior design is definitely one of the most valuable use cases for headsets, as well as architecture... Regarding the interface, I'd recommend moving the buttons you've got from the face to the non-dominant hand. Then you can hold up the menu with one hand and point at it with the other. Tilt-brush is an example of one of many apps that do this. "


XR Senior Designer at Microsoft

" My only feedback will be trying to make the tool less modal. Right now it seems like the user needs to switch between modes (translate, rotate, scale, etc..) very often. That'd probably make a slow workflow for him. Rather than that, I think your tool is super neat! Congrats! "


Senior Designer at Microsoft

" What I recommend to increase the affordance/reduce friction are: 1) provide a grid system to know where your object is placed (especially a symmetry is required). 2) pre-configured rotation to make it simpler (custom rotation option can be provided as well). I would also consider more UI feedback on object selection and what editing mode the user is in. It was difficult for me to tell what object is being copied, etc."

Second Iteration (Spring 2022)

In the 2nd iteration, I moved the work forward based on the feedback I received from the experts. In this iteration, all the possible interactions are reflected in a hand-held menu on the user's non-dominant hand, which the user can hide/unhide and swap between hands if they prefer.

There is also a spatial guiding grid that the user can hide/unhide in the space and use to align objects when needed.

Hand-held menu UI (screenshot): buttons to turn the guiding grid on/off, hide/unhide the menu, and swap hands for holding the menu.
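
A minimal sketch of how such a hand-held menu could be attached and swapped between controllers; the anchor transforms and menuRoot are assumed names, not the project's actual objects.

```csharp
using UnityEngine;

// Minimal sketch: the menu canvas is parented to the current "menu hand" and can be
// hidden/unhidden or swapped to the other controller.
public class HandMenu : MonoBehaviour
{
    [SerializeField] Transform leftHandAnchor;    // assumed: controller anchor transforms
    [SerializeField] Transform rightHandAnchor;
    [SerializeField] GameObject menuRoot;         // assumed: root of the menu canvas
    bool onLeftHand = true;                       // non-dominant hand by default

    public void ToggleVisible() => menuRoot.SetActive(!menuRoot.activeSelf);

    public void SwapHands()
    {
        onLeftHand = !onLeftHand;
        var anchor = onLeftHand ? leftHandAnchor : rightHandAnchor;
        menuRoot.transform.SetParent(anchor, false);   // keep the menu's local offset relative to the hand
    }
}
```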

Move Object

When the Move interaction is selected from the UI for the selected object, the thumbstick on the right controller no longer drives locomotion; it works only for moving the object in the space.
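
A minimal sketch of this behaviour, assuming a locomotion provider on the rig and a selectedObject reference set elsewhere:

```csharp
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: while Move mode is active, locomotion is disabled and the right
// thumbstick slides the selected object across the floor plane instead.
public class MoveMode : MonoBehaviour
{
    [SerializeField] LocomotionProvider locomotion;   // assumed: the rig's move provider
    [SerializeField] float speed = 1.5f;              // metres per second
    public Transform selectedObject;                  // set when the user selects an object

    public void SetActive(bool active)
    {
        locomotion.enabled = !active;   // reclaim the thumbstick from locomotion
        enabled = active;
    }

    void Update()
    {
        if (selectedObject == null) return;
        var right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
        {
            var delta = new Vector3(stick.x, 0f, stick.y) * speed * Time.deltaTime;
            selectedObject.Translate(delta, Space.World);
        }
    }
}
```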


 

Rotate Object

When the Rotate interaction is selected from the UI for the selected object, the thumbstick on the right controller rotates the object around its own axis, perpendicular to the plane the object sits on.
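
A minimal sketch, assuming the rotation axis is the object's own up axis:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch: in Rotate mode the thumbstick's horizontal axis spins the selected
// object around its own up axis (perpendicular to the plane it sits on).
public class RotateMode : MonoBehaviour
{
    [SerializeField] float degreesPerSecond = 90f;
    public Transform selectedObject;

    void Update()
    {
        if (selectedObject == null) return;
        var right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (right.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 stick))
            selectedObject.Rotate(Vector3.up, stick.x * degreesPerSecond * Time.deltaTime, Space.Self);
    }
}
```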

 

Duplicate Object

When the Duplicate interaction is selected from the UI for the selected object, the object is duplicated, and the user can select the copy to move, rotate, or apply other interactions to it.
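
Duplication itself can be as simple as instantiating a copy next to the original; a minimal sketch, with the offset value assumed:

```csharp
using UnityEngine;

// Minimal sketch: duplicate the selected object with a small sideways offset so the
// copy is visible and immediately selectable.
public class DuplicateAction : MonoBehaviour
{
    [SerializeField] float offset = 0.5f;   // metres; assumed value

    public GameObject Duplicate(GameObject selected)
    {
        var position = selected.transform.position + selected.transform.right * offset;
        return Instantiate(selected, position, selected.transform.rotation);
    }
}
```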


Remove Object

 

When the Remove interaction is selected from the UI for the selected object, an overlay 2D UI appears asking the user whether they are sure about removing the object. Selecting Yes removes the object from the scene.
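
A minimal sketch of the confirmation flow, with the panel and buttons as assumed UI references:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: Remove first shows a confirmation panel; only Yes destroys the object.
public class RemoveAction : MonoBehaviour
{
    [SerializeField] GameObject confirmPanel;   // assumed: the overlay 2D UI panel
    [SerializeField] Button yesButton;
    [SerializeField] Button noButton;
    GameObject pending;

    void Awake()
    {
        yesButton.onClick.AddListener(() => { Destroy(pending); confirmPanel.SetActive(false); });
        noButton.onClick.AddListener(() => confirmPanel.SetActive(false));
        confirmPanel.SetActive(false);
    }

    public void RequestRemove(GameObject selected)
    {
        pending = selected;
        confirmPanel.SetActive(true);
    }
}
```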


Turn on/off Light

 

If the selected object contains light sources, the Light button becomes active on the 2D UI, and the user can turn the light on that object on or off.
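
A minimal sketch of how the button's availability and the toggle could work, assuming the light sources are Light components under the selected object:

```csharp
using UnityEngine;

// Minimal sketch: the Light button is only enabled when the selected object contains
// Light components, and toggling flips all of them at once.
public class LightToggle : MonoBehaviour
{
    public bool HasLights(GameObject selected) =>
        selected.GetComponentsInChildren<Light>(true).Length > 0;

    public void Toggle(GameObject selected)
    {
        foreach (var light in selected.GetComponentsInChildren<Light>(true))
            light.enabled = !light.enabled;
    }
}
```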

Change Daytime

 

By selecting the daylight button, the user sees a UI with a slider that lets them change the time of day and see the result as a change in the ambient light of the space.
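
A minimal sketch of one way to drive this, assuming the slider simply rotates the scene's directional light (the "sun") across the sky:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: the daylight slider sweeps the sun (a directional light) from one
// horizon to the other, which changes the ambient light of the space.
public class DaytimeSlider : MonoBehaviour
{
    [SerializeField] Slider slider;   // assumed: 0 = sunrise, 1 = sunset
    [SerializeField] Light sun;       // assumed: the scene's directional light

    void OnEnable()  => slider.onValueChanged.AddListener(SetTimeOfDay);
    void OnDisable() => slider.onValueChanged.RemoveListener(SetTimeOfDay);

    void SetTimeOfDay(float t)
    {
        // Sweep the sun's elevation from 0° (one horizon) to 180° (the opposite horizon).
        sun.transform.rotation = Quaternion.Euler(Mathf.Lerp(0f, 180f, t), -30f, 0f);
    }
}
```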

Reflection

In the first iteration of this project, I learned a great deal about designing for immersive experiences, specifically for VR platforms. My understanding of design and prototyping for VR changed completely during this project, and I found that engineering ideas into a testable prototype is the main challenge of the process.

In the second iteration, I found out how important clear documentation and implementation are. Fortunately, I had documented this project well in Notion and kept a well-organized Unity project, which helped me return to the work more than a year after the first iteration and fix the areas I wanted to improve.

  • In the 2nd iteration, I found ways to change interaction modes without having to bring up the 2D menu.


  • In the 2nd iteration, I replaced the head-locked menu with a handheld menu, so there is no need to hide the menu and interaction is much easier.
