
Vurt is an original cinematic sci-fi short film, deeply inspired by Jeff Noon's cult 1993 novel of the same name. Set in a fractured near-future Manchester, the film follows a young drifter who becomes addicted to Vurt feathers: hallucinogenic objects that blur the boundary between consensus reality and collective dreamspace. It's a story about escape, grief, and the cost of living only inside your own head.
The project was a solo production in every meaningful sense — from initial concept development and location scouting through to the final sound mix. Shooting took place across three weekends in and around Manchester city centre, using available light wherever possible to maintain a raw, documentary-adjacent texture that contrasts with the otherworldly VFX sequences.
The editing approach was deliberately non-linear, mirroring the fractured consciousness of the protagonist. Colour grading in DaVinci Resolve was central to the storytelling: the real world sits in a desaturated, cool palette, while the dream sequences bleed into high-contrast amber and crimson — a visual grammar the audience learns to read over the film's runtime.
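The grade itself was built in Resolve's node graph, but the two-palette grammar is simple enough to sketch outside it. The Python/numpy fragment below (the frame filename is hypothetical, and the curves are rough stand-ins, not the film's actual grade) approximates the two looks: a desaturated, blue-biased pass for the real world and a high-contrast amber/crimson pass for the Vurt.

```python
# Illustrative only: the actual grade was built in DaVinci Resolve.
# This sketch approximates the film's two-palette grammar on a single
# frame -- a cool, desaturated "real world" pass and a high-contrast
# amber/crimson "Vurt" pass.
import numpy as np
from PIL import Image

def grade_real_world(frame: np.ndarray) -> np.ndarray:
    """Desaturate heavily, then bias toward blue for the cool look."""
    f = frame.astype(np.float32) / 255.0
    luma = f @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
    desat = 0.35 * f + 0.65 * luma[..., None]            # heavy desaturation
    desat[..., 2] = np.clip(desat[..., 2] * 1.12, 0, 1)  # cool blue bias
    return (desat * 255).astype(np.uint8)

def grade_vurt(frame: np.ndarray) -> np.ndarray:
    """High contrast with an amber/crimson bleed for the dream world."""
    f = frame.astype(np.float32) / 255.0
    f = np.clip((f - 0.5) * 1.6 + 0.5, 0, 1)    # crude contrast expansion
    f[..., 0] = np.clip(f[..., 0] * 1.25, 0, 1) # push reds toward crimson
    f[..., 1] = np.clip(f[..., 1] * 1.05, 0, 1) # slight amber warmth
    f[..., 2] *= 0.7                            # crush blues
    return (f * 255).astype(np.uint8)

frame = np.asarray(Image.open("frame_0421.png").convert("RGB"))  # hypothetical still
Image.fromarray(grade_real_world(frame)).save("frame_0421_real.png")
Image.fromarray(grade_vurt(frame)).save("frame_0421_vurt.png")
```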
The VFX compositing work covered particle systems, screen replacements, and the signature Vurt feather glow effects. Sound design mixed recorded location audio with synthesised drones and sub-bass elements to build a score that sits uncomfortably between ambient and industrial.
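As a rough illustration of that drone/sub-bass layering (not the film's actual patch, and the output filename is just a placeholder), the basic technique reduces to a stack of slightly detuned oscillators over a slow-moving sub tone:

```python
# Minimal sketch of a drone/sub-bass bed: detuned sine oscillators give
# the beating drone body, a low sub tone swells underneath via a slow
# LFO, and the mix is written out as a mono 16-bit WAV.
import numpy as np
import wave

SR = 48_000
DUR = 20.0
t = np.linspace(0, DUR, int(SR * DUR), endpoint=False)

# Detuned oscillator stack around 110 Hz -- the small offsets beat
# against each other and give the drone its movement.
drone = sum(np.sin(2 * np.pi * (110.0 + d) * t) for d in (-0.7, 0.0, 0.9, 1.6))
drone *= 0.15

# Sub-bass roughly an octave and a half below, faded in and out by a
# very slow LFO (one cycle every 20 seconds).
sub = np.sin(2 * np.pi * 41.2 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 0.05 * t)) * 0.3

mix = drone + sub
mix /= np.max(np.abs(mix))  # normalise to avoid clipping

with wave.open("drone_bed.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)   # 16-bit samples
    f.setframerate(SR)
    f.writeframes((mix * 32767).astype(np.int16).tobytes())
```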
Several key sequences required clean rotoscoping, separating subjects from their backgrounds for composite work. AI-assisted rotoscoping tools, specifically Adobe's Sensei-powered tracking built into After Effects, turned what would have been days of frame-by-frame masking into a supervised cleanup process: the AI handled the heavy lifting on motion tracking, while manual intervention cleaned edges on the most complex frames.
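The cleanup itself happened inside After Effects, but the kind of edge work involved can be approximated in a few lines of OpenCV. This sketch (matte filenames hypothetical) shows the standard moves applied to an exported matte: morphological cleanup followed by edge feathering.

```python
# Illustrative matte cleanup pass -- OpenCV stands in here for work
# that was actually done by hand inside After Effects.
import cv2
import numpy as np

def clean_matte(matte: np.ndarray, feather_px: int = 3) -> np.ndarray:
    """Close small holes, remove stray specks, and feather the edge."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    # Morphological close fills pinholes inside the subject; open
    # removes isolated specks left behind in the background.
    m = cv2.morphologyEx(matte, cv2.MORPH_CLOSE, kernel)
    m = cv2.morphologyEx(m, cv2.MORPH_OPEN, kernel)
    # A Gaussian blur softens the hard binary edge into a feathered
    # falloff so the composite doesn't cut out.
    k = feather_px * 2 + 1
    return cv2.GaussianBlur(m, (k, k), 0)

matte = cv2.imread("matte_0421.png", cv2.IMREAD_GRAYSCALE)  # hypothetical AI matte export
cv2.imwrite("matte_0421_clean.png", clean_matte(matte))
```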
Topaz Video AI was used extensively across the film's post-production pipeline. Some footage, particularly handheld interior shots, needed aggressive noise reduction without sacrificing the filmic grain structure, and Topaz's Proteus model made it possible to dial in exactly how much grain to preserve. The same toolset handled upscaling for select sequences shot at lower resolution when storage became a limiting factor mid-production.
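Proteus is a black box, so this is not Topaz's algorithm, but the grain-preservation dial behaves conceptually like the sketch below: denoise the frame, then blend a controlled fraction of the removed high-frequency residual back in. Filenames and parameter values are illustrative.

```python
# Conceptual sketch of grain-preserving denoise: a single `grain`
# parameter controls how much of the removed residual (noise plus
# film grain) is blended back into the cleaned frame.
import cv2
import numpy as np

def denoise_keep_grain(frame: np.ndarray, strength: float = 10.0,
                       grain: float = 0.35) -> np.ndarray:
    """grain=0 -> fully denoised; grain=1 -> original frame untouched."""
    clean = cv2.fastNlMeansDenoisingColored(frame, None, strength, strength, 7, 21)
    residual = frame.astype(np.float32) - clean.astype(np.float32)
    out = clean.astype(np.float32) + grain * residual
    return np.clip(out, 0, 255).astype(np.uint8)

frame = cv2.imread("interior_handheld_0088.png")  # hypothetical noisy frame
cv2.imwrite("interior_clean.png", denoise_keep_grain(frame, grain=0.35))
```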
The dream sequence audio beds were generated using AI ambient sound tools — feeding textual scene descriptions into generative audio models to produce raw tonal material, which was then layered, pitched, and distorted in Adobe Audition. This method dramatically sped up the sound design process and introduced textural elements that would have been difficult to synthesise manually.
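The layering and pitching happened in Audition, but the same three moves are easy to show programmatically. This sketch (input filename hypothetical) pitches a generated bed down tape-style by naive resampling, layers it against the original, and soft-clips the result:

```python
# The actual processing was done in Adobe Audition; this just shows
# the equivalent moves on a mono 16-bit WAV: pitch down by resampling,
# layer at matched length, soft-clip for distortion.
import numpy as np
import wave

def load_wav(path):
    with wave.open(path, "rb") as f:
        sr = f.getframerate()
        x = np.frombuffer(f.readframes(f.getnframes()), dtype=np.int16)
    return x.astype(np.float32) / 32768.0, sr

def pitch_down(x, semitones):
    """Naive resample: stretches duration along with pitch, like tape."""
    ratio = 2 ** (semitones / 12.0)
    idx = np.arange(0, len(x), ratio)
    return np.interp(idx, np.arange(len(x)), x)

raw, sr = load_wav("ai_bed_raw.wav")     # hypothetical generated bed
low = pitch_down(raw, -7)                # a fifth down: darker, longer
n = min(len(raw), len(low))
layered = raw[:n] * 0.6 + low[:n] * 0.6
distorted = np.tanh(layered * 3.0)       # soft-clip saturation

with wave.open("ai_bed_processed.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(sr)
    f.writeframes((distorted * 32767).astype(np.int16).tobytes())
```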
Multiple passes of the screenplay went through Claude — not for content generation, but for structural critique. Asking Claude to analyse scene rhythm, flag exposition-heavy dialogue, and suggest where subtext was being over-explained produced genuinely useful notes that tightened the script's third act significantly.
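The passes could equally have been run through a chat interface; one minimal way to script them is the Anthropic Python SDK, sketched below. The screenplay filename is hypothetical and the model name is a placeholder to pin to whatever is current, but the system prompt mirrors the structural-critique framing described above.

```python
# Minimal sketch of a scripted critique pass with the Anthropic SDK.
# The prompt asks for structure-only notes, never rewrites.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("vurt_draft3.fountain") as f:  # hypothetical screenplay file
    screenplay = f.read()

response = client.messages.create(
    model="claude-sonnet-4-5",  # placeholder: substitute the current model
    max_tokens=2000,
    system=("You are a script editor. Do not rewrite anything. "
            "Critique structure only: scene rhythm, exposition-heavy "
            "dialogue, and places where subtext is over-explained."),
    messages=[{"role": "user", "content": screenplay}],
)
print(response.content[0].text)
```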
The result is a full end-to-end production and post-production pipeline, built around industry-standard tools and supplemented by AI-powered enhancement.