
WebVR Ray Input

Use the mouse to move the camera around and interact with the 3D world. If you have a VR-compatible device and headset, click the VR button in the bottom right to enter VR.

This is a WebVR experience that lets you interact with the world using a laser pointer style control. It supports desktop, mobile, Google Cardboard™, Google Daydream™, Samsung Gear VR™ and other VR headsets.

Let's have a look at the tutorial project.

Enter VR UI and VR Camera

Every WebVR experience on PlayCanvas has these two elements in some form: a UI to enter/exit VR and a VR-enabled camera.

In this project, we have web-vr-ui.js, which is added to the Root entity. It shows an HTML button in the bottom right corner to enter/exit VR.
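As a rough illustration of the idea (a hedged sketch, not the actual web-vr-ui.js; the script name and the vr:toggle event are placeholder assumptions), such a script might create a DOM button and notify the rest of the application when it is clicked:

var WebVrUi = pc.createScript('webVrUi');

WebVrUi.prototype.initialize = function () {
    // Create a simple HTML button in the bottom right corner of the page
    var button = document.createElement('button');
    button.textContent = 'Enter VR';
    button.style.position = 'absolute';
    button.style.right = '16px';
    button.style.bottom = '16px';
    document.body.appendChild(button);

    // Let the rest of the application (e.g. the camera script) decide how
    // to enter/exit VR when the button is pressed
    var app = this.app;
    button.addEventListener('click', function () {
        app.fire('vr:toggle');
    });
};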

look-camera.js adds VR support to the camera and also listens for events from input devices such as mouse, touch and gamepad to rotate the view.

Example from input-mouse.js:

InputMouse.prototype._onMouseMove = function (event) {
    if (this.app.mouse.isPressed(pc.MOUSEBUTTON_LEFT)) {
        this.app.fire('camera:pitch:rotate', event.dy * this.sensitivity);
        this.app.fire('camera:yaw:rotate', event.dx * this.sensitivity);
    }
};
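On the other side, look-camera.js listens for these camera:pitch:rotate and camera:yaw:rotate events. The following is a minimal sketch of that idea, assuming the pitch/yaw bookkeeping shown here rather than the tutorial's actual implementation:

LookCamera.prototype.initialize = function () {
    this._pitch = 0;
    this._yaw = 0;

    // Accumulate rotation deltas fired by the input scripts
    this.app.on('camera:pitch:rotate', function (delta) {
        // Clamp the pitch so the camera cannot flip upside down
        this._pitch = pc.math.clamp(this._pitch - delta, -90, 90);
    }, this);

    this.app.on('camera:yaw:rotate', function (delta) {
        this._yaw -= delta;
    }, this);
};

LookCamera.prototype.update = function (dt) {
    this.entity.setLocalEulerAngles(this._pitch, this._yaw, 0);
};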

For the simplest of WebVR experiences, where the user can look around a scene, these two files are all that is needed and can be used as-is.

To read more about the direct PlayCanvas API for WebVR, please refer to the User Manual.

VR Input Types

The level of fidelity for input devices can be broken into the following groups (DOF == Degrees of Freedom):

0 DOF - no tracked controller; the user looks and points with the view direction (mouse, touch, gamepad, Google Cardboard™)
3 DOF - a controller with rotation tracking only (Google Daydream™, Samsung Gear VR™)
6 DOF - a controller with both position and rotation tracking (e.g. Oculus Rift Touch™)

This tutorial project supports all three via the input-*.js files. Mouse, touch and gamepad are handled as 0 DOF input, while input-vr.js handles both 3 DOF (with a simulated arm model) and 6 DOF input.
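As a rough, hedged illustration (not code from this project), the WebVR-era Gamepad API exposes a pose object on tracked controllers that can be used to tell 3 DOF and 6 DOF devices apart:

var gamepads = navigator.getGamepads();
for (var i = 0; i < gamepads.length; i++) {
    var gamepad = gamepads[i];
    if (gamepad && gamepad.pose) {
        if (gamepad.pose.hasPosition) {
            // 6 DOF: both position and orientation are tracked
        } else if (gamepad.pose.hasOrientation) {
            // 3 DOF: orientation only; position is simulated with an arm model
        }
    }
}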

VR Tracked Input Devices

The system for the tracked input devices consists of two files:

input-vr.js can be configured to represent the left hand, the right hand, either hand, or 'neither' (for when the controller doesn't identify itself as left or right). The priority value is used to map the script to the nth connected device that represents that hand.

Input VR Script
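A hedged sketch of how this kind of configuration could be exposed as script attributes (the attribute names and enum values here are assumptions, not necessarily those used by the tutorial):

var InputVr = pc.createScript('inputVr');

// Which hand this script instance should represent
InputVr.attributes.add('hand', {
    type: 'number',
    enum: [
        { 'Left': 0 },
        { 'Right': 1 },
        { 'Either': 2 },
        { 'Neither': 3 }
    ],
    default: 2
});

// Map this script to the nth connected device that matches the hand
InputVr.attributes.add('priority', { type: 'number', default: 0 });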

Interacting with the World

Ray Controller

The ray input logic is in controller-ray.js and works like a laser pointer to interact with the world. The script can be attached to any entity and uses the entity's forward property for the direction of the ray.

In this tutorial, we have it attached to the camera entity for 0 DOF input and also to the Tracked Controller 1 and 2 entities for 3 DOF and 6 DOF input.

The controller fires the following events at the entities that it interacts with:

controller:hover:on - the ray has started hovering over the entity
controller:hover:off - the ray has stopped hovering over the entity
controller:button:click - the interaction button was clicked while hovering over the entity
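As a rough sketch of how this could work (hedged; the real controller-ray.js may be structured differently), the controller can raycast into the world every frame and fire hover events whenever the entity under the ray changes:

ControllerRay.prototype.update = function (dt) {
    // Build a ray from the entity's position along its forward direction
    var ray = new pc.Ray(this.entity.getPosition(), this.entity.forward);
    var hitEntity = this.app.shapeWorld.raycast(ray);

    if (hitEntity !== this._hoveredEntity) {
        if (this._hoveredEntity) {
            this._hoveredEntity.fire('controller:hover:off');
        }
        if (hitEntity) {
            hitEntity.fire('controller:hover:on');
        }
        this._hoveredEntity = hitEntity;
    }
};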

In this tutorial, button-type-toggle.js listens to the controller:hover:off, controller:hover:on and controller:button:click events like so:

this.entity.on('controller:button:click', this._onButtonClick, this);
this.entity.on('controller:hover:on', this._onHoverOn, this);
this.entity.on('controller:hover:off', this._onHoverOff, this);

For more information on using events, please refer to the API reference.

As this is a scalable experience, it caters for the lowest common denominator between the three input types and therefore assumes there is only one button for interaction.

However, it is simple to modify or build on top of if you only want to support a particular controller such as the Oculus Rift Touch™.

Shapes

We use PlayCanvas Shapes to approximate the physical volume of each interactive entity, and they are added to a global collection that can be tested against.

This is all packaged in shape.js, which is attached to the interactive entities; the Shapes are automatically added to the global collection in shape-world.js, which can be queried by the rest of the application.

shape.js supports Spheres, Axis-Aligned Boxes and Oriented Boxes, using the entity's world position, world orientation and local scale to construct the Shape.
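A hedged sketch of how an Oriented Box could be constructed from an entity's transform using the engine's shape classes (this is illustrative, not the actual shape.js code):

Shape.prototype._createOrientedBox = function () {
    // Half extents come from the entity's local scale
    var halfExtents = this.entity.getLocalScale().clone().scale(0.5);

    // Build a world transform from the entity's position and rotation
    // (the scale is already baked into the half extents)
    var worldTransform = new pc.Mat4();
    worldTransform.setTRS(
        this.entity.getPosition(),
        this.entity.getRotation(),
        pc.Vec3.ONE
    );

    return new pc.OrientedBox(worldTransform, halfExtents);
};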

Once the shape.js component has been added to an entity, that entity becomes an object that can be interacted with by controller-ray.js and can listen for the events listed above.

Taking the PlayCanvas Cube entity as an example:

PlayCanvas Cube

The Shape is set to be an OBB (Oriented Box) and the local scale of the entity is 1, 1, 1, so it will construct an Oriented Box that is 1 by 1 by 1 units in size.

In the case where the Shape and the visual representation are of different scales, such as the Rotate Left entity, it should have the following hierarchy:

Rotate Left

Here the core logic of the entity (in this case, rotating the cube left) is on the parent entity (1), the Shape is a child with its local scale set to the physical volume (2), and the visual representation is another child (3).

'Use Parent' in the shape.js component is ticked so that controller-ray.js events are fired at the parent entity, where the core logic for the object is, rather than at the entity that the shape.js component is attached to.

Shape Use Parent
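A hedged one-liner of how such a flag might be resolved inside shape.js (the attribute and function names are assumptions):

// Fire controller events at the parent entity when 'Use Parent' is ticked,
// otherwise at the entity the shape.js component is attached to
Shape.prototype.getEventTarget = function () {
    return this.useParent ? this.entity.parent : this.entity;
};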

shape.js can also be used to create a compound of shapes to represent one entity simply by adding more Shape entities to the parent entity.

Shape World

shape-world.js contains the collection of Shapes in the world and makes it globally accessible. Through this script component, we can raycast into the world and find the intersected entity closest to the ray's origin.

E.g.

var ray = new pc.Ray(this.entity.getPosition(), this.entity.forward);
var hitEntity = this.app.shapeWorld.raycast(ray);
if (hitEntity) {
    // Ray has intersected with a Shape and hitEntity is the associated entity for that Shape
}
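For reference, here is a hedged sketch of what such a raycast over the global Shape collection could look like internally (the _shapes, shape and entity names are assumptions, not necessarily the actual shape-world.js implementation):

ShapeWorld.prototype.raycast = function (ray) {
    var closestEntity = null;
    var closestDistance = Infinity;
    var hitPoint = new pc.Vec3();

    for (var i = 0; i < this._shapes.length; i++) {
        var item = this._shapes[i];

        // item.shape is a pc.BoundingSphere, pc.BoundingBox or pc.OrientedBox
        if (item.shape.intersectsRay(ray, hitPoint)) {
            var distance = hitPoint.distance(ray.origin);
            if (distance < closestDistance) {
                closestDistance = distance;
                closestEntity = item.entity;
            }
        }
    }

    return closestEntity;
};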