VR Interior

VR Interaction Design for an Interior Design Application

Problem Statement

Architects and interior designers need a clear understanding of the spatial qualities of their designs. However, imagination and on-screen 3D visualization are often their only prototyping tools, and these are insufficient for judging how a space will actually feel. Wrong design decisions can cost a great deal of money, material, and other resources.

Design Goal

Designing a VR prototyping application for architects and interior designers to empower them in their design decision-making process


Project Type

Individual, independent work


April 2020 - September 2020

  • Unity

  • C# 

  • XR Interaction Toolkit



Designing an immersive experience that helps interior designers review and modify their design work interactively


After some initial research on the subject, the design phase started with identifying the basic interactions an interior designer needs in a VR app to review and modify their design work. After identifying these interactions, I mapped them to the inputs available on the target device.


The following demo shows the result of this project as a lo-fi VR design and development prototype.

Expert Reviews

After making the project demo at this stage, I sent it to some well-known experts working in industry and academia and asked them for feedback. Here is some of the feedback I received:


UX Design Lead for YouTube VR

" Interior design is definitely one of the most valuable use cases for headsets, as well as architecture... Regarding the interface, I'd recommend moving the buttons you've got from the face to the non-dominant hand. Then you can hold up the menu with one hand and point at it with the other. Tilt-brush is an example of one of many apps that do this. "


XR Senior Designer at Microsoft

" My only feedback will be trying to make the tool less modal. Right now it seems like the user needs to switch between modes (translate, rotate, scale, etc..) very often. That'd probably make a slow workflow for him. Rather than that, I think your tool is super neat! Congrats! "


Senior Designer at Microsoft

" What I recommend to increase the affordance/reduce friction are: 1) provide a grid system to know where your object is placed (especially a symmetry is required). 2) pre-configured rotation to make it simpler (custom rotation option can be provided as well). I would also consider more UI feedback on object selection and what editing mode the user is in. It was difficult for me to tell what object is being copied, etc."


To start this project, I interviewed 5 interior designers about their basic needs when reviewing an interior design scheme. Through these interviews, I identified the following basic interactions that they expect from such an app:

  • Moving objects in the space

  • Rotating objects around their axis

  • Duplicating objects

  • Removing objects

  • Measuring distances

  • Changing lighting settings

  • Changing materials and colors

I started prototyping the tool by building a minimum viable prototype in Unity that allowed users to manipulate some basic objects in the space. The basic manipulations were:

  • Displacing objects

  • Rotating objects

Also at this stage, I tested several 2D and 3D interfaces for further design and development. A demo of the work at this stage is shown below:

User Interaction Design

After selecting the platform and gathering basic user input, I started defining the interactions considered vital to the application and mapping them to the possible user inputs on the selected device. The following interactions were the result of this process.

Interactions UI

When a user selects an object, an interaction interface appears showing the interactions that can be performed on that object. If an interaction is not possible for the selected object, it is shown grayed out in the UI.
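As a sketch of how this graying-out could work in Unity, the menu can enable each button only when the selected object supports the corresponding interaction. The `Interactable` component and its capability flags below are hypothetical names for illustration, not taken from the project:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical capability flags attached to each selectable object.
public class Interactable : MonoBehaviour
{
    public bool canMove = true, canRotate = true,
                canDuplicate = true, hasLight = false;
}

// Sketch: when an object is selected, enable only the interaction
// buttons it supports. Unity tints a non-interactable Button with
// its "disabled" color, which reads as gray in the UI.
public class InteractionMenu : MonoBehaviour
{
    public Button moveButton, rotateButton, duplicateButton, lightButton;

    public void OnObjectSelected(Interactable selected)
    {
        moveButton.interactable      = selected.canMove;
        rotateButton.interactable    = selected.canRotate;
        duplicateButton.interactable = selected.canDuplicate;
        lightButton.interactable     = selected.hasLight;
    }
}
```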


After the user selects the Move interaction for the selected object, the thumbstick on the right controller no longer drives locomotion; instead, it only moves the selected object in the space.


After the user selects the Rotate interaction for the selected object, the thumbstick on the right controller no longer drives locomotion; instead, it only rotates the object around its own axis, perpendicular to the plane the object rests on.
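The mode-dependent behavior of the thumbstick described above can be sketched as a simple input router. The class, mode names, and input-axis names below are illustrative assumptions; in the actual project, the XR Interaction Toolkit handles much of this input plumbing:

```csharp
using UnityEngine;

public enum InteractionMode { Locomotion, Move, Rotate }

// Sketch: route the right-hand thumbstick either to locomotion or
// to the currently selected interaction mode.
public class ThumbstickRouter : MonoBehaviour
{
    public InteractionMode mode = InteractionMode.Locomotion;
    public Transform selectedObject;
    public float moveSpeed = 1.5f;   // meters per second
    public float rotateSpeed = 90f;  // degrees per second

    void Update()
    {
        // Axis names depend on the project's input settings.
        Vector2 stick = new Vector2(Input.GetAxis("RightStickX"),
                                    Input.GetAxis("RightStickY"));
        switch (mode)
        {
            case InteractionMode.Move:
                // Translate the selected object on the floor plane.
                selectedObject.position +=
                    new Vector3(stick.x, 0f, stick.y) * moveSpeed * Time.deltaTime;
                break;
            case InteractionMode.Rotate:
                // Rotate around the axis perpendicular to the floor.
                selectedObject.Rotate(Vector3.up,
                    stick.x * rotateSpeed * Time.deltaTime, Space.World);
                break;
            default:
                // Locomotion is handled elsewhere, e.g. by the
                // toolkit's continuous-move provider.
                break;
        }
    }
}
```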


After the user selects the Duplicate interaction for the selected object, the object is duplicated, and the user can select the copy to move, rotate, or perform other interactions on it.


After the user selects the Remove interaction for the selected object, a 2D overlay UI appears asking the user to confirm the removal. Selecting Yes removes the object from the scene.

Light On/Off

If the selected object contains light sources, the Light button is activated on the 2D UI, and the user can turn that object's light on and off.

Change Daylight/Daytime

By activating the Main Menu and selecting Day Time, the user sees a UI with a slider that changes the time of day; the result is visible as a change in the ambient light of the space.
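A minimal sketch of how such a slider could drive the scene's lighting in Unity, assuming a directional light acts as the sun (the component and field names are illustrative, not from the project):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: map a 0–24h slider to the sun's angle and ambient light.
public class DayTimeSlider : MonoBehaviour
{
    public Slider slider;  // value range set to 0..24 in the Inspector
    public Light sun;      // the scene's directional light

    void Start()
    {
        slider.onValueChanged.AddListener(SetTimeOfDay);
    }

    void SetTimeOfDay(float hour)
    {
        // 6:00 -> sunrise (0°), 12:00 -> noon (90°), 18:00 -> sunset (180°).
        float angle = (hour - 6f) / 12f * 180f;
        sun.transform.rotation = Quaternion.Euler(angle, -30f, 0f);

        // Dim the ambient light as the sun drops below the horizon.
        float daylight = Mathf.Clamp01(Mathf.Sin(angle * Mathf.Deg2Rad));
        RenderSettings.ambientIntensity = Mathf.Lerp(0.1f, 1f, daylight);
    }
}
```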


In this project, I learned a ton about designing for immersive experiences, and specifically for VR platforms. My understanding of design and prototyping for VR changed completely during this project, and I found that engineering ideas into a testable prototype is the main challenge of the process. Looking back, if I were to do this project again, I would do the following differently:

  • I would look for more ready-made libraries and toolkits that would get me to the user-testing stage sooner. One of the best decisions of this project was using the Unity XR Interaction Toolkit, which saved me a lot of time in making objects interactive and customizing interactions.

  • I would think of ways to change interaction modes without summoning the 2D menu. The expert reviews suggested avoiding modes, since frequent mode switching makes the user's workflow cumbersome and slow.

  • I would use a handheld menu instead of the head-locked menu. The menu I used in this project is a large panel that appears and disappears based on user input and follows the movements of the user's head. I would begin with a handheld menu instead, since there would be no need to hide it and the interaction would be much easier.