360 lookaround camera
Use mouse and touch to rotate the camera and look around a scene for a panoramic or first-person style viewer.
Use the Keyboard API to detect key presses and held keys for gameplay actions and simple interactions.
Use the Mouse API to read movement, button presses, and consistent screen coordinates across browsers for interactive controls.
Handle touch events to drag and move on-screen objects for a minimal mobile-first interaction sample.
Detect double-clicks with timing logic and use them to move the camera or trigger actions on desktop browsers.
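The timing logic can be sketched framework-agnostically; the 300 ms threshold and the `DoubleClickDetector` name are illustrative choices, not part of the PlayCanvas API.

```javascript
// Minimal double-click timing sketch: two clicks within a threshold
// count as one double-click. Feed it timestamps in milliseconds
// (e.g. event.timeStamp from a mousedown handler).
class DoubleClickDetector {
    constructor(thresholdMs = 300) {
        this.thresholdMs = thresholdMs;
        this.lastClickTime = -Infinity;
    }

    // Returns true when this click completes a double-click.
    onClick(timeMs) {
        const isDouble = (timeMs - this.lastClickTime) <= this.thresholdMs;
        // Reset after a double-click so a triple-click is not counted twice.
        this.lastClickTime = isDouble ? -Infinity : timeMs;
        return isDouble;
    }
}
```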
Recognize double-tap gestures on touchscreens separately from single taps for mobile-friendly interactions.
Measure press duration on touch or mouse to fire actions after the user holds on an element long enough.
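A minimal sketch of that press-duration measurement, assuming you wire `onDown`/`onUp` to mousedown/touchstart and mouseup/touchend handlers; the 500 ms hold time and class name are illustrative.

```javascript
// Long-press detection: record the press timestamp on down, and on
// release check whether the hold lasted long enough to fire the action.
class LongPressDetector {
    constructor(holdMs = 500) {
        this.holdMs = holdMs;
        this.downTime = null;
    }

    onDown(timeMs) {
        this.downTime = timeMs;
    }

    // Returns true when this release ends a sufficiently long press.
    onUp(timeMs) {
        const held = this.downTime !== null &&
                     (timeMs - this.downTime) >= this.holdMs;
        this.downTime = null;
        return held;
    }
}
```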
Cast physics raycasts from the pointer to hit colliders and select entities under the cursor in 3D.
Pick meshes using math ray tests instead of the physics engine to avoid extra runtime weight in published builds.
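The core of a physics-free pick is a plain ray test against a bounding volume. A minimal ray-vs-sphere version, using bare `{x, y, z}` objects so the math stands alone (PlayCanvas ships `pc.BoundingSphere` for the same job):

```javascript
// Ray vs bounding-sphere test. `dir` must be normalized.
function raySphereIntersects(origin, dir, center, radius) {
    // Vector from the ray origin to the sphere center.
    const lx = center.x - origin.x;
    const ly = center.y - origin.y;
    const lz = center.z - origin.z;
    // Project that vector onto the ray direction.
    const t = lx * dir.x + ly * dir.y + lz * dir.z;
    if (t < 0) return false; // sphere center lies behind the ray
    // Squared distance from the sphere center to the ray.
    const d2 = (lx * lx + ly * ly + lz * lz) - t * t;
    return d2 <= radius * radius;
}
```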
Implement first-person movement with rigidbody forces, mouse look, and an optional child camera rig.
Extend first-person controls with jumping and navigation as a starter for FPS-style movement prototypes.
Drive a particle fireball that follows the pointer for a simple magical projectile and trail effect.
Place clickable hotspots in the 3D view that open info panels using raycasts and screen overlays.
Request pointer lock on click to capture the mouse for first-person look, hiding the cursor while movement drives the camera.
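Under pointer lock the browser reports relative deltas rather than absolute positions, so mouse look reduces to accumulating yaw and pitch and clamping pitch. A sketch of that math, with an illustrative sensitivity value:

```javascript
// Accumulate pointer-lock deltas into yaw/pitch angles in degrees.
// Clamping pitch keeps the camera from flipping past straight up/down.
class MouseLook {
    constructor(sensitivity = 0.2) {
        this.sensitivity = sensitivity; // degrees per pixel of movement
        this.yaw = 0;
        this.pitch = 0;
    }

    onMouseMove(dx, dy) {
        this.yaw -= dx * this.sensitivity;
        this.pitch -= dy * this.sensitivity;
        this.pitch = Math.max(-89, Math.min(89, this.pitch));
        // In a PlayCanvas script you would then apply:
        // this.entity.setEulerAngles(this.pitch, this.yaw, 0);
    }
}
```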
Track every active touch point and visualize multitouch gestures by drawing lines between fingers on the screen.
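With two tracked touch points, a pinch gesture boils down to comparing finger distances between frames. A sketch with plain `{x, y}` points (PlayCanvas touch events expose per-touch coordinates you would feed in):

```javascript
// Screen-space distance between two touch points.
function pinchDistance(t0, t1) {
    const dx = t1.x - t0.x;
    const dy = t1.y - t0.y;
    return Math.sqrt(dx * dx + dy * dy);
}

// Ratio of successive distances: > 1 means the fingers spread apart
// (zoom in), < 1 means they pinched together (zoom out).
function pinchZoomFactor(prevDist, currDist) {
    return prevDist > 0 ? currDist / prevDist : 1;
}
```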
Implement an orbit camera with mouse and touch, including scroll-wheel zoom and pinch-to-zoom around a focal entity.
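The orbit itself is spherical-coordinate math: yaw, pitch, and distance around a focal point map to a camera position. A dependency-free sketch using plain `{x, y, z}` objects; in PlayCanvas you would write the result into a `pc.Vec3`, position the camera, and aim it with `lookAt`:

```javascript
// Convert orbit parameters (degrees, world units) into a camera
// position on a sphere around `focus`.
function orbitPosition(focus, yawDeg, pitchDeg, distance) {
    const yaw = yawDeg * Math.PI / 180;
    const pitch = pitchDeg * Math.PI / 180;
    return {
        x: focus.x + distance * Math.cos(pitch) * Math.sin(yaw),
        y: focus.y + distance * Math.sin(pitch),
        z: focus.z + distance * Math.cos(pitch) * Math.cos(yaw)
    };
}
```

Scroll-wheel or pinch zoom then only needs to scale `distance`, and drag deltas feed `yawDeg`/`pitchDeg`.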
Smoothly move and aim the camera toward a world-space target so clicks or events frame important locations in the scene.
Filter physics raycast hits by entity tags so picking ignores the wrong colliders and only selects intended objects.
Spawn physics-enabled props at clicked world positions using raycasts against the ground so new bodies collide correctly.
Instantiate decorative entities at clicked ground positions using math and raycasts without adding rigidbody simulation.
Move a character or object toward clicked ground positions using screen-to-world raycasts for simple navigation.
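For flat ground, the screen-to-world step reduces to intersecting the pick ray with a horizontal plane. A sketch under that assumption, again with plain `{x, y, z}` vectors (in PlayCanvas the ray would come from the camera's `screenToWorld`):

```javascript
// Intersect a ray with the horizontal plane y = groundY.
// Returns the hit point, or null if the ray misses the plane.
function rayGroundHit(origin, dir, groundY = 0) {
    if (dir.y === 0) return null;             // ray parallel to the ground
    const t = (groundY - origin.y) / dir.y;
    if (t < 0) return null;                   // plane lies behind the ray
    return {
        x: origin.x + dir.x * t,
        y: groundY,
        z: origin.z + dir.z * t
    };
}
```

The returned point becomes the navigation target the character moves toward each frame.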
Convert pointer coordinates into per-camera rays when using split viewports so picking hits the correct 3D region in each pane.
Orbit or spin an object by mapping mouse drag deltas in screen space to rotation quaternions or Euler angles each frame.
Cast a physics ray from the camera through the pointer to detect which entity or shape sits under the cursor.
Follow a controllable character with a chase camera while handling input and physics for a basic third-person movement rig.
Add a customizable on-screen twin-stick style joypad with PlayCanvas UI elements for mobile character control.
Raycast against PlayCanvas meshes during an AR session so taps on virtual shapes work alongside the passthrough camera view.
Load WebXR input profile meshes for controllers and hands so tracked hardware matches each vendor silhouette in VR.
Read WebXR hand-tracking joint poses and mirror finger motion in the scene for natural direct manipulation without controllers.
Skin tracked hand joints with detailed meshes and materials so articulated fingers read clearly in immersive VR sessions.
Follow six-DoF controllers in VR with starter code for poses, buttons, and haptics inside a PlayCanvas WebXR template.
Point WebXR laser pointers or gaze at screen-space UI and handle clicks across desktop, mobile, and headset browsers.
Explore a maintained VR sandbox that demonstrates scalable interaction patterns and responsive layouts across XR devices.