[Eng] VR UX 14. How to Build Virtual Worlds Through User Engagement
2024. 5. 13. 12:38

VR interaction design focuses on creating intuitive interfaces that support dynamic user experiences, which is crucial for engaging users effectively in virtual environments. The approach is grounded in human-centered design: interaction systems are tailored to users' needs and behaviors, enhancing their ability to navigate and engage with the VR environment.

 

Interaction in VR is a two-way communication process: users input actions into the VR system and receive outputs from it. Inputs are user-generated actions communicated to the VR environment through various devices. These can include gestures made with the hands or body, movements tracked by motion sensors, and commands given by voice. Common input devices in VR include hand controllers, which allow precise manipulation of virtual objects; gloves that capture detailed hand movements; motion sensors that track broader body actions; and eye-tracking technology that enhances interaction by detecting where a user is looking.
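
As a rough illustration of how such inputs reach an application, here is a minimal sketch assuming the WebXR Device API (exact type definitions, e.g. from @types/webxr, may differ slightly); the button index follows the "xr-standard" gamepad mapping, and the variable names are illustrative:

```typescript
// Sketch: polling VR input sources each frame via a WebXR-style session.
// Assumes WebXR Device API shapes; details vary across runtimes.
function onFrame(time: number, frame: XRFrame, refSpace: XRReferenceSpace): void {
  const session = frame.session;

  for (const source of session.inputSources) {
    // Controller buttons and sticks arrive through the Gamepad API;
    // index 0 is the trigger in the "xr-standard" mapping.
    if (source.gamepad?.buttons[0]?.pressed) {
      console.log(`${source.handedness} trigger pressed`);
    }

    // The tracked grip pose is what you would use to grab virtual objects.
    if (source.gripSpace) {
      const pose = frame.getPose(source.gripSpace, refSpace);
      if (pose) {
        const { x, y, z } = pose.transform.position;
        // ...test x/y/z against nearby interactable objects here.
      }
    }
  }

  session.requestAnimationFrame((t, f) => onFrame(t, f, refSpace));
}
```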

 

Output devices in VR are the means by which the system responds to user inputs, deepening immersion in the virtual world. These devices deliver sensory feedback, which can be visual, auditory, or tactile. For example, a VR headset renders visuals directly in front of the eyes and plays audio at the ears, creating a deeply immersive sight-and-sound environment. Haptic feedback devices like gloves or vests convey touch sensations, enhancing the realism of virtual interactions. Additionally, external speakers or headphones deliver audio outputs that further enrich the user's sensory experience in VR.
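
Haptic output in particular is often just a short pulse triggered by an interaction event. A hedged sketch, assuming the Gamepad haptics extension exposed through WebXR (hapticActuators and pulse() exist but support varies by browser and device; the function name and default values here are made up):

```typescript
// Sketch: fire a short haptic pulse on the controller that touched an object.
// hapticActuators/pulse() come from Gamepad API extensions; support varies.
function buzzOnTouch(source: XRInputSource, intensity = 0.8, durationMs = 50): void {
  const actuator = source.gamepad?.hapticActuators?.[0];
  // pulse(value, duration): value in [0, 1], duration in milliseconds.
  actuator?.pulse(intensity, durationMs);
}
```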

 

By integrating these input and output devices, VR interaction design aims to create a seamless and responsive environment where users can feel truly engaged and connected to the virtual world, making their experiences both intuitive and enjoyable.

 

Norman's Principles of Interaction Design

[Image: Norman's principles of interaction design. © Interaction Design Foundation, CC BY-SA 4.0]

 

Don Norman's Principles of Interaction Design are pivotal for creating user-friendly and effective virtual reality (VR) environments. These principles help ensure that VR experiences are accessible, intuitive, and engaging. Here’s how each principle is applied in VR:

  1. Discoverability: This principle emphasizes the importance of making system functionalities obvious to the user. In VR, this could mean using clear visual cues or sounds that guide users on how to interact with the virtual environment. For instance, in a VR cooking game, a spatula icon might appear when the user looks at a pan, accompanied by the sound of sizzling, indicating the pan is interactable.
  2. Affordances: Affordances in VR should suggest how objects can be interacted with, based on their appearance or location within the environment. This helps users understand what actions are possible without explicit instruction. In a VR driving simulator, the steering wheel would act as an affordance; users know they can "grab" it with their controllers and steer accordingly.
  3. Signifiers: These are indicators that signal what actions are possible. Clear signifiers in VR help users navigate the environment and understand how to interact with different elements. For example, glowing arrows could direct users to press buttons in a VR puzzle game, clearly indicating interaction points.
  4. Constraints: By limiting the ways in which an object can be interacted with, constraints prevent user error and streamline interactions within the VR environment. An example would be a virtual bow in an archery game that doesn’t allow the string to be pulled back beyond a certain point, mimicking the physical limitations of a real bow (a code sketch of this constraint follows the list).
  5. Feedback: It is crucial to provide immediate and clear responses to user actions in VR. This feedback can be visual, auditory, or haptic (touch), enhancing the immersive experience and guiding the user. In a music creation app, adding a new instrument might trigger a visible musical note and a sound, confirming the action to the user.
  6. Mappings: This involves designing controls that logically correspond to their effects in the virtual world. For instance, in a flight simulation, moving the controller forward might cause the plane to dive, creating a natural and intuitive control scheme.
  7. Non-Spatial Mappings: These translate spatial inputs into non-spatial outputs, or vice versa, broadening the scope of interaction beyond direct manipulation. An example is using head movements to scroll through a virtual library in a language learning program, which transforms a physical action into a navigation command.
  8. Compliance: This principle aligns the design of VR interactions with user expectations and natural behaviors, making the experience more intuitive. For example, in a VR meditation app, the user’s breathing and heart rate could control visualizations on the screen, creating a seamless connection between real-world actions and virtual responses.
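
To make the constraints idea concrete, here is a small hypothetical sketch of the archery example from item 4. Every name in it (MAX_DRAW_DISTANCE, constrainedDraw, and so on) is invented for illustration; the technique is simply clamping the hand-to-anchor distance:

```typescript
// Hypothetical constraint: the bowstring follows the hand, but can never be
// drawn farther than the bow physically allows.
interface Vec3 { x: number; y: number; z: number }

const MAX_DRAW_DISTANCE = 0.6; // metres; assumed limit of the virtual bow

function constrainedDraw(anchor: Vec3, hand: Vec3): Vec3 {
  const dx = hand.x - anchor.x;
  const dy = hand.y - anchor.y;
  const dz = hand.z - anchor.z;
  const dist = Math.hypot(dx, dy, dz);

  // Within the limit, the string tracks the hand exactly.
  if (dist <= MAX_DRAW_DISTANCE) return hand;

  // Beyond it, clamp along the draw direction, like a real bowstring.
  const s = MAX_DRAW_DISTANCE / dist;
  return { x: anchor.x + dx * s, y: anchor.y + dy * s, z: anchor.z + dz * s };
}
```

Pairing such a clamp with feedback, say a creaking sound or a haptic pulse at full draw, also serves principle 5.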

 

The VR Interaction Cycle

The VR Interaction Cycle is a comprehensive framework used to design and evaluate virtual reality experiences. It breaks down the user's journey into a series of steps that facilitate goal-driven actions and assess their outcomes, ensuring a dynamic and engaging user experience.

[Image: The VR Interaction Cycle. © Interaction Design Foundation, CC BY-SA 4.0]

 

Here’s a detailed look at each phase and step in the cycle:

  1. Forming the Goal: This initial phase is where the user identifies what they want to achieve within the VR environment. For instance, the user decides to clear an obstacle like a fallen tree that blocks a path. This step sets the direction for subsequent actions.
  2. Planning the Action: Once the goal is set, the user strategizes how to achieve it. This could involve deciding whether to physically move the tree or find a tool within the virtual environment, such as a saw, to help remove it. Planning involves weighing different approaches and choosing the one that best fits the user’s needs and the constraints of the VR environment.
  3. Specifying the Action Sequence: After planning, the user needs to determine the specific actions required to execute the plan. This step involves choosing the exact sequence of interactions needed, such as grabbing a saw, positioning it against the tree, and deciding whether to cut or push.
  4. Performing the Action Sequence: This is the execution phase where the user actively engages with the VR environment to carry out the planned actions. Actions could involve cutting the tree, pushing it aside, or using other tools and methods available within the virtual world.
  5. Perceiving the State Change: After the actions have been performed, the user observes the changes in the environment. This could be seeing the tree fall or move, which alters the landscape and path accessibility.
  6. Interpreting the Perception: The user processes what the change means in the context of their goal. For example, if the tree has been successfully removed, the user assesses whether the path is clear and navigable.
  7. Comparing the Outcome with the Goal: Finally, the user evaluates if their original goal has been achieved based on the outcomes observed. This comparison determines whether the actions were successful and if further adjustments are necessary.

These steps create a loop of interaction in which the user continually assesses and reacts to the virtual environment, adjusting their actions based on feedback and changes within the world. This cycle helps maintain continuous engagement with the VR experience, making interactions feel more intuitive and aligned with the user’s intentions. By iterating through these steps, VR designers can refine the user experience to be more immersive and satisfying.
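
The cycle is a design framework rather than an API, but it can help to see its shape as a loop. Here is a schematic sketch in which every name (World, Goal, plan, and the fallen-tree setup) is invented for illustration:

```typescript
// Schematic sketch of the seven-step interaction cycle as a loop.
interface World { pathBlocked: boolean }
type Action = (world: World) => void;

interface Goal {                          // step 1: the goal the user formed
  description: string;
  isSatisfied: (world: World) => boolean;
}

function interactionCycle(
  goal: Goal,
  world: World,
  plan: (g: Goal, w: World) => Action[],  // steps 2-3: plan + specify actions
): void {
  while (!goal.isSatisfied(world)) {      // step 7: compare outcome with goal
    for (const act of plan(goal, world)) {
      act(world);                         // step 4: perform the action sequence
    }
    // Steps 5-6 (perceiving and interpreting the state change) correspond to
    // isSatisfied re-reading the now-changed world on the next iteration.
  }
}

// Usage with the fallen-tree example from the text:
const world: World = { pathBlocked: true };
const clearPath: Goal = {
  description: "clear the fallen tree",
  isSatisfied: (w) => !w.pathBlocked,
};
interactionCycle(clearPath, world, () => [(w) => { w.pathBlocked = false; }]);
```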

 

Understanding Human Hands in VR

In VR, human hands serve as direct interfaces for interacting with the virtual environment. This interaction can feel more natural and precise than mediated input devices because it mirrors how we interact with the real world. Effective hand-interaction design lets users perform tasks intuitively within the virtual space.

 

Bimanual Interaction: This refers to the use of both hands simultaneously to perform tasks. Bimanual interactions can be:

  • Symmetric: Both hands perform identical actions, which can simplify the learning curve and enhance coordination (a sketch of a symmetric two-hand gesture follows the list).
  • Asymmetric: Each hand performs different actions, which can increase the complexity but also the potential for sophisticated and realistic interactions.
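
As a concrete illustration of a symmetric technique, here is a hypothetical sketch of the familiar two-hand scale gesture, where the ratio of current to initial hand separation scales a held object. All names are invented for illustration:

```typescript
// Hypothetical symmetric bimanual gesture: both hands do the same thing
// (grip, then move apart or together), and their separation drives scale.
interface Vec3 { x: number; y: number; z: number }

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

function twoHandScale(
  startLeft: Vec3, startRight: Vec3,  // hand positions when the grab began
  left: Vec3, right: Vec3,            // hand positions this frame
  startScale: number,                 // object scale when the grab began
): number {
  const ratio = distance(left, right) / distance(startLeft, startRight);
  return startScale * ratio;          // spreading the hands enlarges the object
}
```

An asymmetric counterpart would assign the hands different roles, for instance the non-dominant hand holding an object steady while the dominant hand operates a tool on it.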

Iterative Design and User Feedback: To optimize these interactions, it’s crucial to implement iterative design processes that involve continuous user feedback. This approach helps refine the interfaces by adjusting spatial relationships and enhancing usability based on real user interactions and preferences.

 

Key Considerations

  • Input and Output Tools: Effective VR experiences rely on various tools for input (like hand controllers, gloves, voice commands) and output (like haptic feedback and visual displays). These tools must be integrated thoughtfully to create a coherent and functional system that supports hand interactions.
  • Intuitiveness and User Comfort: The success of VR interaction design hinges on how intuitive and comfortable the interactions are. Designers must consider the natural movements and limitations of human hands to ensure that interactions feel realistic and do not strain the user.
  • Norman’s Principles of Interaction Design in VR: These principles are crucial for guiding VR interaction design. They emphasize:
    • Discoverability: Making it easy for users to understand what actions they can perform.
    • Affordances: Designing objects that intuitively communicate how they can be used.
    • Signifiers: Providing cues that guide users on how to interact with objects.
    • Constraints: Limiting interactions that can lead to errors or confusion.
    • Feedback: Offering immediate and clear responses to user actions to confirm their effects.
    • Mappings: Ensuring that the movements and actions in the virtual world correspond logically to user inputs.
