
Delirium 88 is a solo-built cyberpunk adventure game developed in PlayCanvas for the VIVERSE XR platform. The concept: a neon-soaked future city gone wrong, where you're a rogue investigator navigating corrupt megacorps, hostile AI drones, and the flickering underbelly of a society in collapse. Equal parts noir detective game and action-adventure, it punches far above its five-to-six-week development timeline.
The feature set is genuinely ambitious for a solo build. Enemy AI operates via full finite state machines — patrol, alert, search, chase, and combat states, each with transitions tuned so the behaviour feels organic rather than mechanical. The inventory system handles item pickup, stacking, consumption, and equipment slots with full persistence. A branching dialogue system drives NPC interactions and gates story progression based on player choices.
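The stacking behaviour described above is easy to sketch in isolation. A minimal, engine-agnostic version (item names and the `maxStack` parameter here are illustrative, not the game's actual schema):

```javascript
// Minimal inventory with stacking: adding an item tops up existing
// stacks before opening a new slot. Illustrative only -- the real
// system also handles consumption, equipment slots, and persistence.
class Inventory {
  constructor(slotCount) {
    this.slots = new Array(slotCount).fill(null);
  }

  // Returns the quantity that could not be stored (0 on full success).
  add(itemId, qty, maxStack) {
    // Pass 1: top up existing stacks of the same item.
    for (const slot of this.slots) {
      if (slot && slot.itemId === itemId && slot.qty < maxStack) {
        const moved = Math.min(qty, maxStack - slot.qty);
        slot.qty += moved;
        qty -= moved;
        if (qty === 0) return 0;
      }
    }
    // Pass 2: open new stacks in empty slots.
    for (let i = 0; i < this.slots.length; i++) {
      if (!this.slots[i]) {
        const moved = Math.min(qty, maxStack);
        this.slots[i] = { itemId, qty: moved };
        qty -= moved;
        if (qty === 0) return 0;
      }
    }
    return qty; // overflow the caller must drop or reject
  }
}

const inv = new Inventory(2);
inv.add('medkit', 3, 5); // new stack of 3
inv.add('medkit', 4, 5); // tops up to 5, opens a second stack of 2
```

Keeping the overflow as a return value (rather than silently discarding it) lets the pickup code decide whether to leave the remainder in the world.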
An objective tracker surfaces current, completed, and failed mission states with animated UI transitions. Animated cutscene sequences play at key story beats, built entirely within PlayCanvas using timeline-driven camera and character animation. The whole experience was designed to work across both desktop browsers and mobile VR headsets — touch controls were implemented for mobile with virtual joysticks and gesture-based interaction mapped to the WebXR input model.
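The tracker's state model can be sketched as a small transition table; the identifiers below are hypothetical, and the real tracker would fire events for the animated UI layer:

```javascript
// Objective lifecycle as a guarded transition table -- a sketch of
// the idea, not the game's actual tracker.
const TRANSITIONS = {
  hidden:    ['current'],                // revealed by story progress
  current:   ['completed', 'failed'],
  completed: [],                         // terminal
  failed:    []                          // terminal
};

class ObjectiveTracker {
  constructor() { this.objectives = new Map(); }
  add(id) { this.objectives.set(id, 'hidden'); }
  set(id, next) {
    const cur = this.objectives.get(id);
    if (!TRANSITIONS[cur].includes(next)) {
      throw new Error(`illegal transition ${cur} -> ${next} for ${id}`);
    }
    this.objectives.set(id, next);
    // a real implementation would emit an event here for the UI
  }
}

const tracker = new ObjectiveTracker();
tracker.add('findFixer');
tracker.set('findFixer', 'current');
tracker.set('findFixer', 'completed');
```

Rejecting illegal transitions at this layer keeps the UI animations from ever being asked to render a nonsensical state change.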
VIVERSE platform integration includes spatial audio zoning, avatar synchronisation, and social presence features that allow the world to be shared. The game reached the finals of the VIVERSE Creator Competition.
Claude was not an occasional helper on this project — it was the primary coding partner throughout the entire five-to-six-week build. Given the scale of systems required and the compressed timeline, relying on Claude to write, debug, and architect code was a deliberate strategic choice, not a shortcut. The result was a more complex game than could have been built in the same timeframe working alone.
The enemy AI architecture was designed in conversation with Claude over several sessions. Describing the intended behaviour in natural language — patrol routes, alert triggers, search patterns, combat engagement logic — and having Claude generate the corresponding JavaScript state machine code meant the underlying logic could be prototyped, tested, and iterated fast. When behaviour felt off, describing what was happening to Claude produced targeted fixes rather than hours of debugging.
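The shape of that state machine can be illustrated with a table-driven sketch using the five states named above; the sensor fields and transition conditions here are placeholders, not the game's tuned logic:

```javascript
// Table-driven enemy FSM: each state is a function that inspects the
// enemy's (stubbed) senses and returns the next state, or null to
// stay put. A sketch of the pattern, not the shipped behaviour.
const enemyStates = {
  patrol: (e) => e.seesPlayer ? 'alert' : null,
  alert:  (e) => e.seesPlayer ? 'chase' : 'search',
  search: (e) => e.seesPlayer ? 'alert'
                              : (e.searchTimer <= 0 ? 'patrol' : null),
  chase:  (e) => e.inAttackRange ? 'combat'
                                 : (e.seesPlayer ? null : 'search'),
  combat: (e) => e.inAttackRange ? null : 'chase'
};

function stepEnemy(enemy) {
  const next = enemyStates[enemy.state](enemy);
  if (next) enemy.state = next;
  return enemy.state;
}

const drone = { state: 'patrol', seesPlayer: false,
                inAttackRange: false, searchTimer: 2 };
stepEnemy(drone);          // stays in patrol
drone.seesPlayer = true;
stepEnemy(drone);          // -> alert
stepEnemy(drone);          // -> chase
```

Because each state is a small pure function of the enemy's senses, "the behaviour felt off" translates directly into "change one line of one transition", which is what made the describe-and-iterate loop with Claude fast.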
The dialogue system required a data structure that could represent branching conversation graphs with condition gates and outcome triggers. Claude designed the JSON schema, wrote the parser, and built the UI rendering layer. The resulting system is clean enough to extend: adding new conversations means authoring JSON, not touching engine code.
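The actual schema isn't reproduced here, but a graph of this general shape shows the pattern: nodes keyed by id, choices gated on flags, outcomes set as flags when a choice is taken (all field names hypothetical):

```javascript
// Branching dialogue as plain data. Each node has text and choices;
// a choice may require a flag to appear and may set one when taken.
const conversation = {
  start: {
    text: 'The fixer eyes you. "You the investigator?"',
    choices: [
      { label: 'Show badge', next: 'trust',
        requires: 'hasBadge', sets: 'fixerTrusts' },
      { label: 'Say nothing', next: 'doubt' }
    ]
  },
  trust: { text: '"Alright. Here\'s what I know."', choices: [] },
  doubt: { text: '"Then we\'re done talking."',     choices: [] }
};

// Only choices whose condition gate passes are shown to the player.
function availableChoices(node, flags) {
  return node.choices.filter(c => !c.requires || flags.has(c.requires));
}

// Taking a choice fires its outcome trigger and advances the graph.
function choose(graph, nodeId, choiceIdx, flags) {
  const choice = availableChoices(graph[nodeId], flags)[choiceIdx];
  if (choice.sets) flags.add(choice.sets);
  return choice.next;
}

const flags = new Set(['hasBadge']);
const next = choose(conversation, 'start', 0, flags);
// next === 'trust', and flags now include 'fixerTrusts'
```

This is what makes "authoring JSON, not touching engine code" work: the parser and renderer only ever see this data shape, so new conversations are pure content.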
Custom GLSL shaders for the neon glow effects, scanline overlays, and holographic display materials were written with Claude. Describing the visual target — the look of a corrupted CRT monitor, the refraction of light through rain-wet streets — and asking for shader implementations produced working code that required only minor tuning.
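For a sense of what those shaders involve, a minimal scanline-overlay fragment shader in the CRT vein might look like the following. The project's actual shaders are not reproduced here; uniform names, frequencies, and intensities are illustrative.

```glsl
// Minimal scanline overlay: darkens alternating rows and adds a
// slight flicker -- a sketch of the technique, not the game's shader.
precision mediump float;

uniform sampler2D uSceneTex;  // the rendered scene
uniform float uTime;          // seconds, drives the flicker
varying vec2 vUv;

void main() {
    vec3 colour = texture2D(uSceneTex, vUv).rgb;
    float scanline = 0.85 + 0.15 * sin(vUv.y * 800.0); // row darkening
    float flicker  = 0.97 + 0.03 * sin(uTime * 60.0);  // CRT instability
    gl_FragColor = vec4(colour * scanline * flicker, 1.0);
}
```

The "corrupted CRT" look is mostly a stack of cheap effects like this one — scanlines, flicker, chromatic offset — composited over the scene, which is why describing the target look was enough for Claude to produce near-final code.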
WebXR compatibility across Quest, Pico, and desktop browser targets introduced a long tail of device-specific issues. Claude was invaluable for diagnosing problems from error logs and stack traces — particularly issues with controller mapping, spatial anchor behaviour, and render scaling across different headset resolutions.
AI-generated concept art produced in Midjourney guided the visual language — the colour palette, the level of surface grime and neon saturation, the character silhouettes. Having reference imagery generated in hours rather than commissioned over weeks gave a concrete visual target to build towards from day one.
Built on PlayCanvas with custom WebXR extensions, GLSL shaders, and deep VIVERSE SDK integration.