VR Interactions: Hurricane VR


Modern PCs and tablets have a standardized set of inputs (keyboard, mouse, touch screen). They also implement a standardized set of interactions (e.g. Ctrl+C is universally recognizable as the copy command). In contrast, VR inputs and interactions are not standardized. As creators of a virtual world, we have to make critical decisions about how accessible our world will be to its users. These decisions depend on the choice of hardware and on the interactions the VR experience requires. Years of research have produced a myriad of interaction techniques for VR, and each technique supports one of three main types of action: selection, manipulation, or travel.

Selection

  • Controller input (laser pointer or ray-cast)
  • Gestures
  • Gaze

Manipulation

  • S/R/T (scale, rotation, and translation)
  • Affordance-driven manipulation

Travel

  • On rails
  • Gaze-directed steering
  • Teleport
  • Real movement
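At their core, the laser-pointer and teleport techniques above both reduce to casting a ray from the controller into the scene and finding where it lands. A minimal, engine-agnostic sketch of that idea (in Python, with all names hypothetical, intersecting the ray with a flat floor rather than a real physics scene):

```python
# Minimal ray-cast teleport target sketch (hypothetical, engine-agnostic).
# A ray starts at the controller position and points along its forward
# vector; we intersect it with the floor plane y = 0 to get a teleport point.

def teleport_target(origin, direction, max_range=10.0):
    """Return the (x, 0, z) point where the controller ray hits the floor,
    or None if the ray points away from the floor or is out of range."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:          # ray parallel to or pointing away from the floor
        return None
    t = -oy / dy         # distance along the ray to the y = 0 plane
    if t > max_range:
        return None      # outside the allowed teleport radius
    return (ox + t * dx, 0.0, oz + t * dz)

# Controller held at head height, angled forward and 45 degrees down:
print(teleport_target((0.0, 1.5, 0.0), (0.0, -1.0, 1.0)))  # (0.0, 0.0, 1.5)
```

In a real engine you would replace the plane intersection with a physics ray-cast against the scene geometry, but the control flow is the same.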

Tech Stack

  • Game Engine – Unity
  • Programming Patterns – Event-driven
  • VR – Oculus
  • Other Assets – Asset Store

Progress

  • Demo – 100%
  • Architecture – 100%
  • Level Design – 65%
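The event-driven pattern listed in the tech stack keeps interactables decoupled from the systems that react to them: a grabbable object raises an event, and any number of listeners (haptics, audio, scoring) subscribe. A minimal sketch of the pattern (in Python, with all names hypothetical — this is not the Unity or Hurricane VR API):

```python
# Minimal event-driven pattern sketch (hypothetical names, not a real API).
# Interactables raise events; gameplay systems subscribe to them, so the
# grabbing logic never needs to know who is listening.

class Event:
    """A tiny publish/subscribe event: handlers register, then get invoked."""
    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def fire(self, *args):
        for handler in self._handlers:
            handler(*args)

class Grabbable:
    """An object that announces when a hand grabs or releases it."""
    def __init__(self, name):
        self.name = name
        self.on_grabbed = Event()
        self.on_released = Event()

    def grab(self, hand):
        self.on_grabbed.fire(self, hand)

    def release(self, hand):
        self.on_released.fire(self, hand)

# A listener (say, a haptics system) reacts without the Grabbable knowing it:
log = []
mug = Grabbable("mug")
mug.on_grabbed.subscribe(lambda obj, hand: log.append(f"{hand} grabbed {obj.name}"))
mug.grab("left hand")
print(log)  # ['left hand grabbed mug']
```

In Unity the same shape is typically expressed with C# events or `UnityEvent` fields, which is how frameworks like Hurricane VR expose grab and release callbacks.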

Hurricane VR

Hurricane VR is a VR interaction framework to kick-start your project.

It contains:

1. Smooth, responsive one- and two-handed physics grabbing
2. Support for custom rigged hand models
3. Advanced pose editor with per-finger animations and bone mirroring
4. Dynamic auto-pose solving using physics
5. Gravity Gloves / force-grab style remote grabbing
6. Line grab – grab anywhere along the line, with optional grip adjustment
7. Live dynamic auto-poser updates in the editor view for quick pose creation
8. Configurable one- and two-hand strength per grabbable
9. Powerful socket system with easy filtering and auto-scaling by mesh size