
WebXR Ray Input

Click the VR/AR button if you have a VR/AR compatible device/headset.

This is a WebXR experience that works with any valid XR input source, such as a laser pointer, gaze or touch screen. It supports desktop, mobile, Google Cardboard™, Google Daydream™, Samsung Gear VR™ and other VR/AR headsets.

Let's have a look at the source of the tutorial project.

Entering VR/AR

Every WebXR experience on PlayCanvas will always have these two elements in some form, a check that XR is available and a request to start an XR session:

button.element.on('click', function() {
    // check support for VR
    if (app.xr.isAvailable(pc.XRTYPE_VR)) {
        // start VR session
        cameraEntity.camera.startXr(pc.XRTYPE_VR, pc.XRSPACE_LOCAL);
    }
});

In this project, we have xr.js, which is added to the Root entity. It manages the VR and AR UI buttons and reacts to changes in XR availability and XR session state.
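
As a sketch of how such a script can react to those changes (vrButton is an illustrative name for a UI entity), the XrManager fires events for availability and session state:

app.xr.on('available:' + pc.XRTYPE_VR, function (available) {
    // show the VR button only when an immersive VR session can be started
    vrButton.enabled = available;
});

app.xr.on('start', function () {
    // XR session has started
});

app.xr.on('end', function () {
    // XR session has ended
});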

To read more about the direct PlayCanvas API for WebXR, please refer to the User Manual.

XR Input Types

The level of fidelity for input devices can be broken into the following groups (DOF == Degrees of Freedom):

- Screen: touch-screen pointing, e.g. mobile AR
- Gaze: pointing by head rotation, e.g. Google Cardboard™
- 3 DOF controller: tracked rotation only, e.g. Google Daydream™ and Samsung Gear VR™ controllers
- 6 DOF controller: tracked rotation and position, e.g. the controllers of desktop VR headsets

Every input source has a ray with an origin, where it starts, and a direction in which it is pointing. The WebXR input source implementation in PlayCanvas supports all input source types without any extra work from the developer. If an input source is grippable, then we can render its model based on the provided position and rotation.
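
As an illustration, an input source reports its type through targetRayMode, and grippable sources can be detected through grip; a minimal sketch:

switch (inputSource.targetRayMode) {
    case pc.XRTARGETRAY_GAZE:
        // pointing is based on head rotation, e.g. Google Cardboard™
        break;
    case pc.XRTARGETRAY_SCREEN:
        // pointing is based on touch-screen input
        break;
    case pc.XRTARGETRAY_POINTER:
        // pointing is based on a tracked controller
        break;
}

if (inputSource.grip) {
    // the input source is grippable, so a controller model can be rendered
}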

XR Tracked Input Devices

The system for the tracked input sources consists of two files:

xr-input-manager.js

This tracks added and removed input sources and creates a controller entity instance for each one. For example:

app.xr.input.on('add', function (inputSource) {
    // new input source is added
});
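
The 'remove' event is the counterpart, so the manager's bookkeeping might look like this sketch (controllerTemplate and controllers are illustrative names):

var controllers = new Map();

app.xr.input.on('add', function (inputSource) {
    // create a controller entity for the new input source
    var entity = controllerTemplate.clone();
    app.root.addChild(entity);
    controllers.set(inputSource, entity);
});

app.xr.input.on('remove', function (inputSource) {
    // destroy the controller entity when its input source is gone
    var entity = controllers.get(inputSource);
    if (entity) {
        entity.destroy();
        controllers.delete(inputSource);
    }
});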

controller.js

This is attached to each entity that represents an input source. When an input source can be gripped, it will enable the rendering of a model for a controller.

On each update, it positions and rotates the entity based on the input source's position and rotation:

if (inputSource.grip) {
    // the input source is tracked, so show the controller model
    entity.model.enabled = true;
    entity.setLocalPosition(inputSource.getLocalPosition());
    entity.setLocalRotation(inputSource.getLocalRotation());
}

Additionally, it tracks the primary action of an input source, which triggers the select event, and uses the input source's ray to interact with virtual objects. Here is a basic example that checks whether a mesh's AABB intersects the controller's ray when the user performs the primary action:

inputSource.on('select', function () {
    if (mesh.aabb.intersectsRay(inputSource.ray)) {
        // the primary action was triggered
        // on a virtual object
    }
});

Interacting with the World

Ray Picking

A ray is the way of pointing in XR environments. Whether gaze, screen or laser pointer-style, all input sources have a ray with an origin and a direction.

In this tutorial, we track each input source and constantly check whether it intersects the bounding shapes of pickable objects in the scene. The ray, position and rotation of an input source are in XR session space, so if the camera is transformed by its ancestors, that transformation needs to be applied to the ray, position and rotation as well.
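
For example, a camera ancestor's transform can be applied to a session-space ray like this (a sketch, where cameraParent is an assumed name for the entity that offsets the XR camera):

var parentTransform = cameraParent.getWorldTransform();
var worldRay = new pc.Ray();
parentTransform.transformPoint(inputSource.ray.origin, worldRay.origin);
parentTransform.transformVector(inputSource.ray.direction, worldRay.direction);
worldRay.direction.normalize();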

The controller fires the following events at the entities that it interacts with:

- controller:hover:on is fired when the controller's ray starts hovering over the entity
- controller:hover:off is fired when the controller's ray stops hovering over the entity
- controller:button:click is fired when the primary action is used while hovering over the entity

In this tutorial, button-type-toggle.js listens for the controller:hover:off, controller:hover:on and controller:button:click events like so:

entity.on('controller:button:click', function () {
    // entity was clicked with a controller
});

For more information on using events, please refer to the API reference.

As this is a scalable experience, it caters for the lowest common denominator among input sources and therefore assumes there is only one primary action for interaction.

However, it is simple to modify or expand on top of this if you only want to support a particular controller like the Oculus Rift Touch™.
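
For instance, extra buttons on a specific controller can be read through the input source's standard WebXR gamepad object; a sketch (the button index is illustrative and depends on the controller profile):

if (inputSource.gamepad) {
    // poll an additional button each update
    var button = inputSource.gamepad.buttons[1];
    if (button && button.pressed) {
        // handle a secondary action
    }
}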

Shapes

We use the PlayCanvas Shapes to approximate the physical volume of each object, and they are added to a global collection that can be tested against.

This is all packaged in shape.js, which is attached to the interactive entities; each shape is automatically added to the global collection in shape-world.js, which can be queried by the rest of the application.

shape.js supports Spheres, Axis-Aligned Boxes and Oriented Boxes, using the world position, world orientation and local scale to construct the Shape.
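
As a sketch of how each shape type can be built from an entity's transform (illustrative, not the exact shape.js code):

var position = entity.getPosition();
var scale = entity.getLocalScale();

// sphere: centered on the entity, radius derived from the scale
var sphere = new pc.BoundingSphere(position.clone(), scale.x * 0.5);

// axis-aligned box: half-extents taken from the local scale
var aabb = new pc.BoundingBox(position.clone(), scale.clone().mulScalar(0.5));

// oriented box: the world transform carries position, rotation and scale
var obb = new pc.OrientedBox(entity.getWorldTransform().clone(), new pc.Vec3(0.5, 0.5, 0.5));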

Once the shape.js script has been added to an entity, that entity becomes an object that controller.js can interact with, and it can listen for the events listed above.

Taking the PlayCanvas Cube entity as an example:

PlayCanvas Cube

The Shape is set to be an OBB (Oriented Box) and the local scale of the entity is 1, 1, 1, so it will construct an Oriented Box that is 1 by 1 by 1 units in size.

In the case where the Shape and the visual representation are of different scales, such as the Rotate Left entity, the entity should have the following hierarchy:

Rotate Left

The core logic of the entity (in this case, rotating the cube left) is on the parent entity (1), the Shape is a child with its local scale set to the physical volume (2), and the visual representation is another child (3).

'Use Parent' in the shape.js component is ticked so that controller.js events are fired at the parent entity, where the core logic for the object lives, rather than at the entity that the shape.js component is attached to.

Shape Use Parent

shape.js can also be used to create a compound of shapes to represent one entity simply by adding more Shape entities to the parent entity.

Shape World

shape-world.js contains the collection of Shapes in the world and makes it globally accessible. Through this script component, we can raycast into the world and find the intersected entity closest to the ray's origin.

For example:

var ray = inputSource.ray;
var hitEntity = app.shapeWorld.raycast(ray);
if (hitEntity) {
    // Ray has intersected with a Shape
    // and hitEntity is the associated entity for that Shape
}
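
Under the hood, such a raycast can be implemented by testing the ray against every registered shape and keeping the nearest hit; a minimal sketch (the shape/entity pairing is illustrative):

function raycastShapes(shapes, ray) {
    var closestEntity = null;
    var closestDist = Infinity;
    var point = new pc.Vec3();

    for (var i = 0; i < shapes.length; i++) {
        // each item pairs a PlayCanvas shape with the entity it represents
        if (shapes[i].shape.intersectsRay(ray, point)) {
            var dist = ray.origin.distance(point);
            if (dist < closestDist) {
                closestDist = dist;
                closestEntity = shapes[i].entity;
            }
        }
    }

    return closestEntity;
}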