360 lookaround camera
Use mouse and touch to rotate the camera and look around a scene for a panoramic or first-person style viewer.
Build a Gaussian splat product viewer with clickable hotspots and UI annotations over the captured scene.
Combine SOG splats with shadows, dynamic relighting, and physics for walkable real-world scanned spaces.
Present a gallery of Gaussian splat statues with custom shaders and post effects between exhibit transitions.
Stream huge Gaussian splat scenes with spatial LOD and SOG ordering so large environments load progressively.
Customize the loading screen with project image assets so branding and splash art show while the app initializes.
Drive entity motion with animation curves for simple scripted movement without building a full Anim State Graph.
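The core of curve-driven motion is sampling a keyframe curve each frame. A minimal, engine-agnostic sketch of piecewise-linear curve evaluation (similar in spirit to the engine's Curve type; `evaluateCurve` and the `bounce` keys are illustrative, not the sample's actual code):

```javascript
// Minimal piecewise-linear keyframe curve. Keys are [time, value]
// pairs sorted by time; values outside the key range are clamped.
function evaluateCurve(keys, t) {
    if (keys.length === 0) return 0;
    if (t <= keys[0][0]) return keys[0][1];
    const last = keys[keys.length - 1];
    if (t >= last[0]) return last[1];
    for (let i = 0; i < keys.length - 1; i++) {
        const [t0, v0] = keys[i];
        const [t1, v1] = keys[i + 1];
        if (t >= t0 && t <= t1) {
            const a = (t - t0) / (t1 - t0); // normalized position in segment
            return v0 + (v1 - v0) * a;      // linear interpolation
        }
    }
    return last[1];
}

// A simple up-and-down ramp; in a script you would sample this with
// accumulated time and feed the result to setPosition each frame.
const bounce = [[0, 0], [0.5, 2], [1, 0]];
```

A script's update loop would accumulate `dt` into a running time and apply `evaluateCurve` to one axis of the entity's position.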
Play and control animation clips from scripts without an Anim State Graph for straightforward playback and tests.
Handle touch events to drag and move on-screen objects for a minimal mobile-first interaction sample.
Keep planes facing the camera for trees, flares, and sprites that always show their front toward the viewer.
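For upright billboards such as trees, the usual trick is a yaw-only rotation toward the viewer. A hedged sketch of the math (the function name is illustrative; in an engine script you would apply the angle with something like `setEulerAngles` each frame):

```javascript
// Yaw-only billboard: rotate a plane around Y so its front (+Z)
// faces the camera while the plane stays upright.
function billboardYawDegrees(objPos, camPos) {
    const dx = camPos.x - objPos.x;
    const dz = camPos.z - objPos.z;
    // atan2(dx, dz) is the heading from the object toward the camera.
    return Math.atan2(dx, dz) * 180 / Math.PI;
}
```

Full (non-upright) billboards instead copy the camera's orientation outright so flares face the view axis exactly.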
Animate a camera along a spline path through waypoints for flythroughs and guided cinematic motion.
Use camera and light mask groups so specific lights and views only affect chosen models in layered scenes.
Render from a camera to a texture and export pixels as a PNG for captures and sharing from your app.
Swap a material texture map from script to retexture models on the fly without duplicating mesh assets.
Generate QR codes in the browser at runtime with a small library and show them in UI or on textures.
Spawn entities with collision and rigidbody components entirely from scripts for procedural physics objects.
Detect double-clicks with timing logic and use them to move the camera or trigger actions on desktop browsers.
Recognize double-tap gestures on touchscreens separately from single taps for mobile-friendly interactions.
Measure press duration on touch or mouse to fire actions after the user holds on an element long enough.
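The three samples above share one idea: classify presses by comparing timestamps against thresholds. A minimal, input-source-agnostic sketch (class name and default thresholds are illustrative; timestamps are injected so the logic works with mouse or touch events alike):

```javascript
// Classifies presses into 'double' (two downs within doubleMs) and
// 'hold' (a press lasting at least holdMs). Times are passed in
// explicitly, so the logic is clock- and engine-agnostic.
class PressTimer {
    constructor(doubleMs = 300, holdMs = 600) {
        this.doubleMs = doubleMs;
        this.holdMs = holdMs;
        this.lastDown = -Infinity;
        this.downAt = null;
    }
    down(nowMs) {
        const isDouble = (nowMs - this.lastDown) <= this.doubleMs;
        this.lastDown = nowMs;
        this.downAt = nowMs;
        return isDouble ? 'double' : 'single';
    }
    up(nowMs) {
        const held = this.downAt !== null && (nowMs - this.downAt) >= this.holdMs;
        this.downAt = null;
        return held ? 'hold' : 'tap';
    }
}
```

Wire `down`/`up` to mouse or touch handlers and pass `event.timeStamp` (or a performance clock) as `nowMs`.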
Add and remove scroll view children at runtime to build lists and feeds with PlayCanvas screen-space UI.
Cast physics raycasts from the pointer to hit colliders and select entities under the cursor in 3D.
Pick meshes using math ray tests instead of the physics engine to avoid extra runtime weight in published builds.
Layer particle emitters with camera shake to sell an explosion impact and debris burst in real time.
Animate material opacity so individual models fade in and out without duplicating geometry or materials.
Extend first-person controls with jumping and navigation as a starter for FPS-style movement prototypes.
Drive a particle fireball that follows the pointer for a simple magical projectile and trail effect.
Recreate Flappy Bird-style tap flight with pipes and scoring as a compact PlayCanvas gameplay sample.
Add a lightweight FPS overlay script to monitor frame rate during development and performance tuning.
Edit HTML and CSS assets with live reload so HUD changes appear instantly while the scene keeps running.
Build layered HUDs and menus with standard HTML and CSS composited over the WebGL canvas.
Place clickable hotspots in the 3D view that open info panels using raycasts and screen overlays.
Load multiple runtime assets while updating a UI progress bar so users see download and parse progress during loads.
Load several assets after startup using PlayCanvas asset APIs instead of preloading every file before the scene begins.
Load a single asset on demand so unused resources stay off the initial preload list until your script requests them.
Implement a radial loading spinner UI with materials and animation for clear feedback while content streams in.
Load Draco-compressed GLB models in PlayCanvas with the right pipeline so meshes stay small without breaking rendering.
Load glTF GLB files as binary assets at runtime and add the resulting model instance to your PlayCanvas scene hierarchy.
Request pointer lock on click to capture the mouse for first-person look, hiding the cursor while movement drives the camera.
Lay out screen UI inside notched and rounded device safe areas so controls stay clear of system bars and hardware cutouts.
Combine multiple cameras and render layers to mix screen-space UI with world-space elements in one PlayCanvas scene.
Split the canvas into separate camera viewports by adjusting each camera rectangle for picture-in-picture or split-screen views.
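Split-screen comes down to mapping each camera's normalized rect onto pixel viewport bounds. A small sketch of that mapping (the helper name is illustrative; the engine applies the equivalent internally when you set a camera's rect):

```javascript
// Convert a normalized camera rect [x, y, w, h] (0..1, origin at the
// bottom-left) into pixel viewport bounds for a canvas of a given size.
function rectToViewport(rect, canvasWidth, canvasHeight) {
    return {
        x: Math.round(rect[0] * canvasWidth),
        y: Math.round(rect[1] * canvasHeight),
        width: Math.round(rect[2] * canvasWidth),
        height: Math.round(rect[3] * canvasHeight)
    };
}

// Left half of a 1280x720 canvas for one camera of a split-screen pair:
const left = rectToViewport([0, 0, 0.5, 1], 1280, 720);
```

Remember to also adjust each camera's aspect ratio when its rect is not full-width, or the halves will look stretched.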
Track every active touch point and visualize multitouch gestures by drawing lines between fingers on the screen.
Explore an interior archviz sample with dynamic reflections and baked HDR lightmaps for realistic lighting and materials.
Implement an orbit camera with mouse and touch, including scroll-wheel zoom and pinch-to-zoom around a focal entity.
Smoothly move and aim the camera toward a world-space target so clicks or events frame important locations in the scene.
Pause and resume gameplay by toggling Application timeScale so physics, animations, and time-based logic freeze together.
Filter physics raycast hits by entity tags so picking ignores the wrong colliders and only selects intended objects.
Enable continuous collision detection on rigidbodies so fast projectiles do not tunnel through thin colliders between steps.
Spawn physics-enabled props at clicked world positions using raycasts against the ground so new bodies collide correctly.
Instantiate decorative entities at clicked ground positions using math and raycasts without adding rigidbody simulation.
Render planar reflections for mirrors and water by projecting the scene onto a plane with a reflective shader pass.
Shade a globe with Earth textures, materials, and shaders to present a stylized planetary body in real time.
Move a character or object toward clicked ground positions using screen-to-world raycasts for simple navigation.
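Click-to-move usually resolves a pointer ray against a flat ground plane rather than physics. A hedged sketch of the intersection step, assuming the ground is the plane y = 0 (the function name is illustrative; the ray itself would come from a screen-to-world unprojection):

```javascript
// Intersect a pointer ray with the ground plane y = 0 to get a
// world-space move target. origin and dir are {x, y, z} objects;
// dir does not need to be normalized.
function rayGroundHit(origin, dir) {
    if (Math.abs(dir.y) < 1e-8) return null; // ray parallel to ground
    const t = -origin.y / dir.y;
    if (t < 0) return null;                   // ground is behind the ray
    return {
        x: origin.x + dir.x * t,
        y: 0,
        z: origin.z + dir.z * t
    };
}
```

The returned point becomes the navigation target; each frame the character steps toward it until within a small arrival radius.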
Generate a gradient texture in code and apply it to materials instead of importing a bitmap for smooth color ramps.
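Generating the ramp is just filling a pixel buffer with interpolated colors before uploading it to a texture. A minimal sketch (function name and the two-color horizontal layout are illustrative assumptions):

```javascript
// Fill an RGBA pixel buffer with a horizontal two-color gradient.
// The buffer can then be uploaded to a texture's pixel data instead
// of importing a bitmap.
function gradientPixels(width, height, from, to) {
    const data = new Uint8Array(width * height * 4);
    for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
            const a = width > 1 ? x / (width - 1) : 0; // 0 left, 1 right
            const i = (y * width + x) * 4;
            data[i]     = Math.round(from[0] + (to[0] - from[0]) * a);
            data[i + 1] = Math.round(from[1] + (to[1] - from[1]) * a);
            data[i + 2] = Math.round(from[2] + (to[2] - from[2]) * a);
            data[i + 3] = 255; // opaque
        }
    }
    return data;
}

// A 4x1 black-to-white ramp:
const ramp = gradientPixels(4, 1, [0, 0, 0], [255, 255, 255]);
```

A 256x1 ramp is usually enough; bilinear sampling on the texture smooths it across a surface.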
Build a rainbow ribbon trail by updating mesh vertices and colors with the Mesh API for dynamic GPU-driven geometry.
Convert pointer coordinates into per-camera rays when using split viewports so picking hits the correct 3D region in each pane.
Render a 3D preview with a dedicated camera into UI elements such as inventory icons or character selection panels.
Scale internal 3D resolution for performance while keeping crisp screen-space UI by separating world and overlay render paths.
Adapt Element component text and layout scripts for right-to-left languages such as Arabic with correct alignment flow.
Orbit or spin an object by mapping mouse drag deltas in screen space to rotation quaternions or euler angles each frame.
Add a radial shockwave distortion to the framebuffer with a post-effect shader for impacts and cinematic camera pulses.
Cast a physics ray from the camera through the pointer to detect which entity or shape sits under the cursor.
Approximate a stylized water plane with scrolling normals and fresnel-tinted materials for simple reflective surfaces.
Ease the camera between positions and orientations each frame using vector lerp and quaternion slerp for cinematic motion.
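For reference, the two interpolants reduced to plain arrays (quaternions as `[x, y, z, w]`); this is a generic sketch of the standard lerp/slerp math, not the engine's own implementation:

```javascript
// Linear interpolation between two 3D positions.
function lerp3(a, b, t) {
    return [a[0] + (b[0] - a[0]) * t,
            a[1] + (b[1] - a[1]) * t,
            a[2] + (b[2] - a[2]) * t];
}

// Spherical linear interpolation between two unit quaternions.
function slerp(qa, qb, t) {
    let dot = qa[0]*qb[0] + qa[1]*qb[1] + qa[2]*qb[2] + qa[3]*qb[3];
    let b = qb.slice();
    if (dot < 0) { dot = -dot; b = b.map(v => -v); } // take the shorter arc
    if (dot > 0.9995) {
        // Nearly identical orientations: normalized lerp is stable here.
        const out = qa.map((v, i) => v + (b[i] - v) * t);
        const len = Math.hypot(out[0], out[1], out[2], out[3]);
        return out.map(v => v / len);
    }
    const theta = Math.acos(dot);
    const sA = Math.sin((1 - t) * theta) / Math.sin(theta);
    const sB = Math.sin(t * theta) / Math.sin(theta);
    return qa.map((v, i) => v * sA + b[i] * sB);
}
```

Each frame, interpolate from the current pose toward the target with a small `t` derived from `dt` for a smooth ease-out feel.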
Drive a sound slot volume from sampled curve data so loudness follows authored ramps instead of linear code tweaks.
Fork an Asteroids-style shooter template with mouse aiming, shooting, and waves of rocks to learn arcade game flow.
Merge static meshes into fewer draw calls with batching to cut CPU overhead while keeping non-moving geometry efficient.
Use stencil tests to mask a playing-card shape and reveal a separate 3D scene through the window like a magic portal.
Swap material assets on render components in code to change colors, shaders, or skins without duplicating mesh entities.
Follow a controllable character with a chase camera while handling input and physics for a basic third-person movement rig.
Implement a playable 3D tic-tac-toe board with turn handling, win detection, and UI feedback in a small sample game.
Extend the engine with one-shot timers that fire callbacks after delays while respecting global pause via time scale.
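The pause-respecting part is the interesting bit: accumulate scaled delta time rather than wall-clock time. A minimal sketch (class name is illustrative; in practice `update` would be driven from a script's per-frame hook with the application's time scale):

```javascript
// One-shot timer driven by per-frame deltas. The callback fires once
// after `delay` seconds of *scaled* time, so a timeScale of 0 pauses
// the countdown along with everything else.
class OneShotTimer {
    constructor(delay, callback) {
        this.delay = delay;
        this.callback = callback;
        this.elapsed = 0;
        this.done = false;
    }
    update(dt, timeScale = 1) {
        if (this.done) return;
        this.elapsed += dt * timeScale;
        if (this.elapsed >= this.delay) {
            this.done = true;
            this.callback();
        }
    }
}
```

This is why engine-driven timers are preferred over `setTimeout` for gameplay: browser timers keep running while the game is paused.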
Arrange buttons and panels with Layout Group and child layout rules to build responsive PlayCanvas UI flows.
Generate normal and parallax maps for dynamic text so lettering catches light and reads embossed on surfaces.
Port a Shadertoy-style plasma effect into a material shader chunk so animated GLSL drives surface color in engine lighting.
Compose a shop screen with layout groups, buttons, sprites, and atlases to list items and purchase affordances cleanly.
Decouple scripts by firing and listening for custom events on entities or the application for modular gameplay messaging.
Drive a wheeled vehicle with Ammo raycast wheels, suspension, and steering that works on desktop, mobile, and WebXR.
Stack scanlines, roll, and color bleed in a fullscreen pass to mimic noisy VHS and curved CRT monitor presentation.
Darken image edges with a vignette while splitting color channels for subtle chromatic aberration in a post-process pass.
Distort sprite UVs in a custom GLSL shader to create wavy or bulging effects without rebuilding mesh geometry each frame.
View an equirectangular 360 photo inside VR with WebXR so users can look around an immersive spherical backdrop.
Play omnidirectional video in a WebXR session so headset users can rotate freely while the clip streams around them.
Raycast against PlayCanvas meshes during an AR session so taps on virtual shapes work alongside the passthrough camera view.
Combine WebXR AR with the DOM overlay API to place styled HTML/CSS UI above the camera feed without custom WebGL text.
Cast rays into real-world geometry with the WebXR hit-test API to anchor virtual objects on detected floors and tables.
Load WebXR input profile meshes for controllers and hands so tracked hardware matches each vendor silhouette in VR.
Read WebXR hand-tracking joint poses and mirror finger motion in the scene for natural direct manipulation without controllers.
Enter immersive VR from a minimal scene with a WebXR-ready camera rig and session lifecycle hooks you can extend.
Skin tracked hand joints with detailed meshes and materials so articulated fingers read clearly in immersive VR sessions.
Follow six-DoF controllers in VR with starter code for poses, buttons, and haptics inside a PlayCanvas WebXR template.
Explore a maintained VR sandbox that demonstrates scalable interaction patterns and responsive layouts across XR devices.
Track printed reference images in AR and align 3D content to their pose using the WebXR image-tracking extension.
Subscribe to WebXR plane detection updates to visualize floor and table estimates for anchoring content to real surfaces.
Layer world-space Element panels so UI quads draw after opaque geometry and stay readable over busy 3D scenes.
Project 3D world positions into screen-space anchors so labels and markers track moving objects on the 2D overlay.
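Under the hood this is a perspective projection from camera space to pixel coordinates. A simplified sketch assuming the point is already in camera space (camera looking down -Z, +Y up; engines expose an equivalent as a world-to-screen helper on the camera):

```javascript
// Project a camera-space point to screen pixels. fovYDeg is the
// vertical field of view in degrees; returns null for points at or
// behind the camera plane.
function worldToScreen(point, fovYDeg, width, height) {
    if (point.z >= 0) return null; // behind (or at) the camera
    const f = 1 / Math.tan((fovYDeg * Math.PI / 180) / 2);
    const aspect = width / height;
    const ndcX = (point.x * f / aspect) / -point.z; // normalized device coords
    const ndcY = (point.y * f) / -point.z;
    return {
        x: (ndcX * 0.5 + 0.5) * width,
        y: (1 - (ndcY * 0.5 + 0.5)) * height // screen y grows downward
    };
}
```

Feeding the result into an overlay element's position each frame keeps a label pinned to a moving 3D object; the `null` case is the cue to hide the label.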