VR Prototyping

A collection / feed of ongoing explorations in interaction design, tech art, and programming.

Holographic Map, Gesture-Activated

An idea for a holographic mini-map in VR that is activated by hand gestures instead of items or menu buttons, so the user can always access it intuitively. The map appears when the user grips with the palm turned up.
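A minimal sketch of the activation check, assuming a tracked palm normal and a grip state from the hand-tracking API. The function name, the angle threshold, and the `Vec3` helper are all illustrative, not from the project:

```cpp
#include <cmath>

// Hypothetical palm-up test: the map shows only while the hand is gripped
// and the palm normal points roughly at the sky.
struct Vec3 { float x, y, z; };

inline float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// PalmNormal is assumed to be a unit vector. The 0.866 threshold accepts
// palms within about 30 degrees of world up (cos 30 deg ~= 0.866).
bool ShouldShowMap(bool bIsGripping, const Vec3& PalmNormal) {
    const Vec3 WorldUp{0.f, 0.f, 1.f};
    return bIsGripping && Dot(PalmNormal, WorldUp) > 0.866f;
}
```

Gating on both conditions means a casual palm flip or an idle grip alone will not pop the map open.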

Artificial Distance Fog on Android

Using simple distance + height math to create an artificial distance fog effect across large landscapes. The Quest's mobile hardware struggles with post-processing effects, as well as with Unreal's default exponential height fog. This calculation runs inside the unlit material shader itself, which improves performance.
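A sketch of the kind of distance + height math involved, written as plain C++ rather than material nodes. The parameter names and the exact falloff shape are assumptions; the idea is a linear distance ramp attenuated by an exponential height falloff, cheap enough to evaluate per-pixel in an unlit material:

```cpp
#include <algorithm>
#include <cmath>

// Clamp to [0, 1], mirroring the shader saturate() intrinsic.
float Saturate(float X) { return std::min(std::max(X, 0.f), 1.f); }

// Fog amount for one pixel. Distances/heights are in world units.
// All parameter names are illustrative, not from the project.
float FogFactor(float PixelDistance, float PixelHeight,
                float FogStart, float FogEnd,
                float FogBaseHeight, float HeightFalloff) {
    // Linear ramp: no fog before FogStart, full fog past FogEnd.
    float DistanceFog = Saturate((PixelDistance - FogStart) / (FogEnd - FogStart));
    // Fog thins exponentially above the fog base height.
    float HeightFog = std::exp(-std::max(PixelHeight - FogBaseHeight, 0.f) * HeightFalloff);
    return DistanceFog * HeightFog;
}
```

The final pixel color would then be a lerp between the scene color and the fog color by this factor, all inside the unlit material.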

Object/UI Manipulation via Gestures

An idea for using 'waving' gestures to manipulate objects at runtime. This is a simple one- and two-handed implementation which allows swiping and 'pinch-to-zoom' scaling, similar to gestures we are already familiar with on tablets and smartphones. Since there is no physical surface feedback with in-game objects, I opted for Johnny Mnemonic-style hand-waving gestures.
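The two-handed scaling can be sketched as the ratio of the current distance between the hands to the distance when the gesture began, just like pinch-to-zoom on a touchscreen. The function name and clamp range below are assumptions for illustration:

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical two-hand 'pinch-to-zoom': scale the object by how far the
// hands have spread (or closed) since the gesture started.
float PinchScale(float BaseScale, float StartHandDistance, float CurrentHandDistance) {
    // Guard against a degenerate start distance.
    float Ratio = CurrentHandDistance / std::max(StartHandDistance, 1e-4f);
    // Clamp so a stray gesture can't scale the object to nothing or to infinity.
    return std::clamp(BaseScale * Ratio, 0.1f, 10.f);
}
```

Spreading the hands to twice their starting separation doubles the object's scale; closing them to half shrinks it by half.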

Interior Parallax Shader

An implementation of Unreal's interior cubemap shader, tested on large-scale cityscapes for VR. Modeled interiors at this scale would require far too many polygons and draw calls, which would easily tank the framerate on a standalone device like Quest. This scene uses only 3 different materials, with each building under 1,500 polygons. I also created a single-channel bitmap texture which the shader uses to choose the appropriate cubemap, allowing me to add variation to the windows.
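The core of interior mapping is a small ray-box intersection done in tangent space: a ray from the eye through the window plane is intersected with a unit "room" box, and the hit direction selects the cubemap texel. Here is a plain-C++ restatement of that math under assumed conventions (room spanning [-1, 1] on each axis, origin on the window plane); it is not the shader itself:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

// Point on the room's far walls hit by a view ray. Origin lies on the
// window plane; Dir points into the room and need not be normalized.
Vec3 RoomIntersection(Vec3 Origin, Vec3 Dir) {
    // Per axis: ray parameter t at which we reach the wall we're heading toward.
    auto Axis = [](float o, float d) {
        if (std::fabs(d) < 1e-6f) return 1e9f;  // parallel to this wall pair
        return d > 0.f ? (1.f - o) / d : (-1.f - o) / d;
    };
    // The nearest wall hit wins.
    float t = std::min({Axis(Origin.x, Dir.x),
                        Axis(Origin.y, Dir.y),
                        Axis(Origin.z, Dir.z)});
    return {Origin.x + t * Dir.x, Origin.y + t * Dir.y, Origin.z + t * Dir.z};
}
```

In the material, the resulting vector is used directly as the cubemap lookup direction, which is why a single flat quad can read as a full 3D interior.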

Full-Body Character Spine Solver Calculation

When implementing my full-body VR character, I could not find a suitable spine solver that correctly bends the neck and spine chain as the player rotates their head. VR characters use the HMD as the origin point, but traditional animation workflows use the pelvis as the origin. This results in a mismatch between the player’s head location and the character’s head location. Most other spine solvers I tried place 100% of the corrective rotation at the neck joint, making the character look like their neck is broken.

My implementation starts from the HMD’s position and rotation and ‘un-rotates’ a portion of this total at each bone along the spine, resulting in a more faithful and realistic character posture. The final pelvis offset is calculated as a sum of the bone chain’s transforms. Each bone’s transform is then passed into the animation blueprint.
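The distribution idea can be sketched in miniature with a single yaw angle: rather than dumping the whole corrective rotation into the neck, each bone along the chain absorbs a weighted share of the HMD's total rotation. A real solver works on full per-bone quaternions; the weights and names here are illustrative assumptions:

```cpp
#include <vector>

// Split the HMD's total yaw across the spine chain by per-bone weights,
// so no single joint (e.g. the neck) takes 100% of the correction.
std::vector<float> DistributeSpineYaw(float TotalHeadYawDeg,
                                      const std::vector<float>& BoneWeights) {
    float WeightSum = 0.f;
    for (float W : BoneWeights) WeightSum += W;
    std::vector<float> BoneYaws;
    BoneYaws.reserve(BoneWeights.size());
    for (float W : BoneWeights)
        BoneYaws.push_back(TotalHeadYawDeg * (W / WeightSum));
    return BoneYaws;
}
```

Weighting the lower spine more heavily than the neck is what keeps the posture looking natural instead of "broken-necked".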

Gun Handling Physics Impulses

In pursuit of making VR guns feel immersive, I implemented several small physics impulses that are triggered when the player performs an interaction on the gun. These small impulses provide a subtle sense of feedback to the player, confirming their movements and interactions and enhancing the sense that they are wielding a powerful, mechanically complex tool.
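One way to sketch this is a simple mapping from interaction events to impulse magnitudes, with the felt kick being impulse divided by the gun's mass. The event names and numbers below are invented for illustration; they are not the project's actual tuning values:

```cpp
// Hypothetical per-interaction impulse table. In practice these would be
// applied to the gun's physics body at the interaction point.
enum class EGunEvent { BoltRelease, MagInsert, SafetyToggle };

float ImpulseMagnitude(EGunEvent Event) {
    switch (Event) {
        case EGunEvent::BoltRelease:  return 2.0f;   // strongest kick
        case EGunEvent::MagInsert:    return 1.0f;   // solid clunk
        case EGunEvent::SafetyToggle: return 0.25f;  // barely perceptible
    }
    return 0.f;
}

// Velocity change (impulse / mass) the event imparts to the gun body.
float VelocityDelta(EGunEvent Event, float MassKg) {
    return ImpulseMagnitude(Event) / MassKg;
}
```

Scaling by mass keeps the same interaction feeling proportionally heavier on a rifle than on a pistol.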

Gun Recoil Patterns

Gun recoil is challenging to design for VR interactions since it touches not just the domain of real-world simulation, but also the gun's aesthetic feel and gameplay balancing. I studied slow-motion footage of guns and found that the gun's location and rotation follow quite different trajectories over the firing timeline. This implementation uses two separate curves to drive the gun's location and rotation, allowing me to shape the recoil in a way that feels realistic while still leaving control over the gameplay implications.
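The two-curve idea can be sketched with a tiny keyframed curve standing in for an Unreal float curve asset. The key times and values below are placeholders, but they show the decoupling: location kicks back and recovers quickly while rotation pitches up and settles later:

```cpp
#include <cstddef>
#include <vector>

// Minimal linearly-interpolated keyframe curve (stand-in for an engine
// curve asset). Keys must be sorted by time.
struct Key { float Time, Value; };

float SampleCurve(const std::vector<Key>& Keys, float Time) {
    if (Time <= Keys.front().Time) return Keys.front().Value;
    if (Time >= Keys.back().Time) return Keys.back().Value;
    for (std::size_t i = 1; i < Keys.size(); ++i) {
        if (Time <= Keys[i].Time) {
            float Alpha = (Time - Keys[i - 1].Time) / (Keys[i].Time - Keys[i - 1].Time);
            return Keys[i - 1].Value + Alpha * (Keys[i].Value - Keys[i - 1].Value);
        }
    }
    return Keys.back().Value;
}

// Recoil offsets at a given time since firing (placeholder shapes).
float RecoilLocation(float T) {  // cm along the barrel axis
    static const std::vector<Key> Curve{{0.f, 0.f}, {0.05f, -3.f}, {0.3f, 0.f}};
    return SampleCurve(Curve, T);
}
float RecoilPitch(float T) {     // degrees of muzzle rise
    static const std::vector<Key> Curve{{0.f, 0.f}, {0.08f, 5.f}, {0.5f, 0.f}};
    return SampleCurve(Curve, T);
}
```

Because the two channels are independent, the muzzle rise can be tuned for gameplay balance without the rearward kick losing its physical snap.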

Gun Magazine Ejection Physics

Upon release, the magazine is constrained to a linear spline along the ejection axis and pulled down it by gravity. Once it clears the magazine well, the physics constraint is removed and the magazine behaves like a normal simulating physics object.
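A simplified sketch of that two-phase behavior, assuming a 1D state along the ejection axis (names and the well length are illustrative): while constrained, only gravity's component along the axis moves the magazine; once it travels past the well length, the constraint is dropped and the physics engine takes over:

```cpp
// Per-frame update for the constrained phase of magazine ejection.
struct MagState {
    float DistanceAlongAxis = 0.f;  // travel from the seated position (m)
    float Speed = 0.f;              // speed along the ejection axis (m/s)
    bool bConstrained = true;       // still riding the linear constraint?
};

// AxisDotGravity: gravity projected onto the ejection axis (m/s^2).
void TickMagazine(MagState& Mag, float AxisDotGravity, float WellLength, float Dt) {
    if (!Mag.bConstrained) return;  // free-simulating; the physics engine owns it
    // Accelerate only along the ejection axis while inside the well.
    Mag.Speed += AxisDotGravity * Dt;
    Mag.DistanceAlongAxis += Mag.Speed * Dt;
    if (Mag.DistanceAlongAxis >= WellLength) {
        Mag.bConstrained = false;   // clear of the well: hand off to full physics
    }
}
```

Projecting gravity onto the axis is what makes the magazine slide out faster when the gun is held muzzle-up and barely move when held sideways.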