33 docs tagged with "input"

360 lookaround camera

Use mouse and touch to rotate the camera and look around a scene for a panoramic or first-person style viewer.

Basic Keyboard Input

Use the Keyboard API to detect key presses and held keys for gameplay actions and simple interactions.
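The core of this pattern is distinguishing "held this frame" from "went down this frame". A minimal, engine-free sketch of that state tracking (class and method names here are illustrative, not the PlayCanvas Keyboard API itself):

```javascript
// Minimal key-state tracker mirroring the isPressed / wasPressed pattern
// common to engine keyboard APIs.
class KeyState {
  constructor() {
    this.down = new Set();    // keys currently held
    this.pressed = new Set(); // keys that went down since the last update()
  }

  onKeyDown(code) {
    if (!this.down.has(code)) this.pressed.add(code);
    this.down.add(code);
  }

  onKeyUp(code) {
    this.down.delete(code);
  }

  isPressed(code) {  // true for as long as the key is held
    return this.down.has(code);
  }

  wasPressed(code) { // true only on the frame the key first went down
    return this.pressed.has(code);
  }

  update() {         // call once per frame, after game logic has read input
    this.pressed.clear();
  }
}
```

Wiring `onKeyDown`/`onKeyUp` to DOM `keydown`/`keyup` events (and guarding against key repeat, as the `if` above does) gives per-frame polling like the engine provides.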

Basic Mouse Input

Use the Mouse API to read movement, button presses, and consistent screen coordinates across browsers for interactive controls.

Basic touch input

Handle touch events to drag and move on-screen objects for a minimal mobile-first interaction sample.

Detecting a double click

Detect double-clicks with timing logic and use them to move the camera or trigger actions on desktop browsers.
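The timing logic amounts to comparing the gap between successive clicks against a threshold. A small framework-agnostic sketch (timestamps are injected for testability; in a browser you would pass `event.timeStamp`):

```javascript
// Two clicks within thresholdMs count as a double click.
class DoubleClickDetector {
  constructor(thresholdMs = 300) {
    this.thresholdMs = thresholdMs;
    this.lastClick = -Infinity;
  }

  // Returns true when this click completes a double click.
  click(nowMs) {
    const isDouble = nowMs - this.lastClick <= this.thresholdMs;
    // Reset after a double click so a rapid triple click doesn't fire twice.
    this.lastClick = isDouble ? -Infinity : nowMs;
    return isDouble;
  }
}
```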

Detecting a double tap

Recognize double-tap gestures on touchscreens separately from single taps for mobile-friendly interactions.

Detecting a long press

Measure press duration on touch or mouse to fire actions after the user holds on an element long enough.
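Measuring press duration reduces to recording the down time and comparing it on release. A minimal sketch with injected timestamps (the names are illustrative; a real handler would also cancel on pointer movement):

```javascript
// Fires a long press when a pointer is held for at least holdMs.
class LongPressDetector {
  constructor(holdMs = 500) {
    this.holdMs = holdMs;
    this.downAt = null; // null means no press in progress
  }

  pressDown(nowMs) {
    this.downAt = nowMs;
  }

  // Call on release; returns true if the press was held long enough.
  pressUp(nowMs) {
    const isLong = this.downAt !== null && nowMs - this.downAt >= this.holdMs;
    this.downAt = null;
    return isLong;
  }
}
```

The same class works for mouse and touch, since both reduce to down/up events with timestamps.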

First Person Movement

Implement first-person movement with rigidbody forces, mouse look, and an optional child camera rig.

Flaming fireball

Drive a particle fireball that follows the pointer for a simple magical projectile and trail effect.

Information hotspots

Place clickable hotspots in the 3D view that open info panels using raycasts and screen overlays.

Locking the mouse

Request pointer lock on click to capture the mouse for first-person look, hiding the cursor while movement drives the camera.

Multitouch input

Track every active touch point and visualize multitouch gestures by drawing lines between fingers on the screen.
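Tracking every active touch is typically done with a map keyed by touch identifier; the finger-to-finger lines are then just segments between every pair of tracked points. A plain-object sketch of that bookkeeping (touches here are `{id, x, y}` stand-ins, not real TouchEvent objects):

```javascript
// Active touch points, keyed by touch identifier.
const touches = new Map();

function touchStartOrMove(t) { touches.set(t.id, { x: t.x, y: t.y }); }
function touchEnd(t) { touches.delete(t.id); }

// One line segment between every pair of active fingers.
function fingerSegments() {
  const pts = [...touches.values()];
  const segments = [];
  for (let i = 0; i < pts.length; i++) {
    for (let j = i + 1; j < pts.length; j++) {
      segments.push([pts[i], pts[j]]);
    }
  }
  return segments;
}
```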

Orbit camera

Implement an orbit camera with mouse and touch, including scroll-wheel zoom and pinch-to-zoom around a focal entity.
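At its core an orbit camera places the eye on a sphere around the focal entity, driven by yaw/pitch angles from drag input and a distance from the zoom input. A sketch of just that math (plain vectors, not PlayCanvas types):

```javascript
// Camera position on a sphere around `focus`, from yaw/pitch (radians)
// and distance. Drag input changes yaw/pitch; wheel or pinch changes distance.
function orbitPosition(focus, yaw, pitch, distance) {
  return {
    x: focus.x + distance * Math.cos(pitch) * Math.sin(yaw),
    y: focus.y + distance * Math.sin(pitch),
    z: focus.z + distance * Math.cos(pitch) * Math.cos(yaw),
  };
}
```

In practice pitch is clamped (e.g. to just under ±90°) so the camera never flips over the poles, and the camera is then oriented to look at `focus`.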

Pan Camera to Target

Smoothly move and aim the camera toward a world-space target so clicks or events frame important locations in the scene.

Physics raycasting by tag

Filter physics raycast hits by entity tags so picking ignores unwanted colliders and selects only the intended objects.

Place an entity with physics

Spawn physics-enabled props at clicked world positions using raycasts against the ground so new bodies collide correctly.

Place entity without physics

Instantiate decorative entities at clicked ground positions using math and raycasts without adding rigidbody simulation.

Point and click movement

Move a character or object toward clicked ground positions using screen-to-world raycasts for simple navigation.
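Once the raycast has produced a target point on the ground, the per-frame movement is a constant-speed step toward it, clamped so the character never overshoots. A minimal 2D (ground-plane) sketch:

```javascript
// Step `pos` toward `target` at `speed` units/second over `dt` seconds,
// snapping to the target when it is within one step.
function moveToward(pos, target, speed, dt) {
  const dx = target.x - pos.x;
  const dz = target.z - pos.z;
  const dist = Math.hypot(dx, dz);
  const step = speed * dt;
  if (dist <= step) return { x: target.x, z: target.z }; // arrived
  return { x: pos.x + (dx / dist) * step, z: pos.z + (dz / dist) * step };
}
```

Calling this each frame with the frame's delta time walks the object along a straight line to the clicked position.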

Raycast with Camera Viewports

Convert pointer coordinates into per-camera rays when using split viewports so picking hits the correct 3D region in each pane.

Rotating Objects with Mouse

Orbit or spin an object by mapping mouse drag deltas in screen space to rotation quaternions or euler angles each frame.

Simple shape raycasting

Cast a physics ray from the camera through the pointer to detect which entity or shape sits under the cursor.
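A physics engine performs this test against its colliders, but the underlying math for a simple shape is compact. As an illustration, a ray-sphere intersection test (assumes `dir` is normalized; returns the hit distance or `null`):

```javascript
// Intersect the ray origin + t * dir with a sphere; return the nearest
// non-negative t (distance along the ray), or null on a miss.
function raySphereHit(origin, dir, center, radius) {
  const ox = origin.x - center.x, oy = origin.y - center.y, oz = origin.z - center.z;
  const b = ox * dir.x + oy * dir.y + oz * dir.z;        // half-b of the quadratic
  const c = ox * ox + oy * oy + oz * oz - radius * radius;
  const disc = b * b - c;
  if (disc < 0) return null;                             // ray misses the sphere
  const t = -b - Math.sqrt(disc);                        // nearer root
  return t >= 0 ? t : null;
}
```

Casting such a ray from the camera through the pointer, then testing each shape, reports what sits under the cursor.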

Third Person Controller

Follow a controllable character with a chase camera while handling input and physics for a basic third-person movement rig.

Touchscreen Joypad Controls

Add a customizable on-screen twin-stick style joypad with PlayCanvas UI elements for mobile character control.

WebXR AR Raycasting Shapes

Raycast against PlayCanvas meshes during an AR session so taps on virtual shapes work alongside the passthrough camera view.

WebXR Controller/Hand Models

Load WebXR input profile meshes for controllers and hands so tracked hardware matches each vendor silhouette in VR.

WebXR Hands

Read WebXR hand-tracking joint poses and mirror finger motion in the scene for natural direct manipulation without controllers.

WebXR Realistic Hands

Skin tracked hand joints with detailed meshes and materials so articulated fingers read clearly in immersive VR sessions.

WebXR Tracked Controllers

Follow six-DoF controllers in VR with starter code for poses, buttons, and haptics inside a PlayCanvas WebXR template.

WebXR UI Interaction

Point WebXR laser pointers or gaze at screen-space UI and handle clicks across desktop, mobile, and headset browsers.

WebXR VR Lab

Explore a maintained VR sandbox that demonstrates scalable interaction patterns and responsive layouts across XR devices.