[LAB] K-Pop Demon Slasher

The Brief

My daughter had watched me build something for her brother’s birthday, and naturally expected something equally over-the-top for hers. She explicitly asked for a K-Pop Demon Hunters game.

But underneath that was the real brief I gave myself:

How fast can I build something fun if I let my AI workflow do as much of the work as possible?

I wanted to see how far I could push the idea of generating interactions, visuals, and logic without actually writing feature code. Not to replace the craft — but to see how much overhead I could strip away.

The Challenge

Anyone in creative tech knows the real tax isn't the idea; it's the implementation.

Setting up scenes. Coding systems. Wiring up particles. Debugging.

I wanted to see if I could skip the glue work entirely and focus on “shaping the idea” instead of wrestling with setup. Could I prompt my way into a prototype? Could the agent layer understand and extend my code in real time? Could Cursor generate the right logic without me writing all of it by hand?

The challenge wasn’t the game itself.

It was testing whether the whole pipeline could collapse days of work into an afternoon.

The Idea

The aesthetic direction was set by the birthday party itself: energetic, neon, playful K-Pop Demon Hunters imagery everywhere. So the idea was simple:

A tiny slash-'em-up where the characters run toward you and you cut through them with a real sword.

The twist: the sword is a 3D-printed prop with a Switch Joy-Con inside, streaming gyroscope data into the game.

The Build

Everything started in my AI workflow:

  • Foundry generated base assets.
  • The agent layer refined prompts and interpreted code.
  • World Labs turned references into 3D splats and generated panoramas for global illumination.
  • Cursor handled code generation for the interactions and logic.

From there, it became a two-hour sprint of prompting and iteration inside a WebGPU sandbox (sketches of the main systems follow the list):

  • Characters and movement logic
  • Particle systems driven by curl noise
  • Mirror-reflection rendering
  • Global illumination using the AI-generated panorama
  • Gaussian-splat environments
  • All built without writing traditional feature code
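
The curl-noise particles are the easiest of these to show. Here's a minimal hand-written sketch of the velocity field (not the generated code) using the simplex-noise npm package; the channel offsets and epsilon are arbitrary:

```ts
import { createNoise3D } from "simplex-noise";

const noise3D = createNoise3D();
const EPS = 0.0001;

// Three decorrelated noise channels form a vector potential field.
// The offsets are arbitrary; they just keep the channels independent.
function potential(x: number, y: number, z: number): [number, number, number] {
  return [
    noise3D(x, y, z),
    noise3D(x + 31.4, y + 15.9, z + 26.5),
    noise3D(x - 12.3, y + 45.6, z - 78.9),
  ];
}

// Curl of the potential via central differences. The curl of any
// field is divergence-free, so particles advected by it swirl and
// fold instead of clumping or draining away.
export function curlNoise(x: number, y: number, z: number): [number, number, number] {
  const px0 = potential(x - EPS, y, z), px1 = potential(x + EPS, y, z);
  const py0 = potential(x, y - EPS, z), py1 = potential(x, y + EPS, z);
  const pz0 = potential(x, y, z - EPS), pz1 = potential(x, y, z + EPS);
  const inv = 1 / (2 * EPS);
  return [
    ((py1[2] - py0[2]) - (pz1[1] - pz0[1])) * inv, // dFz/dy - dFy/dz
    ((pz1[0] - pz0[0]) - (px1[2] - px0[2])) * inv, // dFx/dz - dFz/dx
    ((px1[1] - px0[1]) - (py1[0] - py0[0])) * inv, // dFy/dx - dFx/dy
  ];
}

// Per frame, each particle just does: p += curlNoise(p.x, p.y, p.z) * dt.
```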
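
The mirror-reflection rendering in my sandbox is its own implementation, but the conventional way to get a planar mirror in three.js is the Reflector addon, which renders the scene from a camera mirrored across the plane:

```ts
import * as THREE from "three";
import { Reflector } from "three/addons/objects/Reflector.js";

const scene = new THREE.Scene();

// Reflector draws the scene from a mirrored camera into a render
// target and samples it as this plane's texture each frame.
const mirror = new Reflector(new THREE.PlaneGeometry(4, 2), {
  textureWidth: 1024,
  textureHeight: 1024,
  clipBias: 0.003, // nudges the clip plane to avoid artifacts at the mirror surface
  color: 0x889999,
});
mirror.position.set(0, 1, -3);
scene.add(mirror);
```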
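
The "global illumination" is really image-based lighting: treat the AI-generated equirectangular panorama as the scene's environment map and every PBR material picks it up. A three.js sketch (my sandbox's renderer differs, but the idea is the same; the file name is a placeholder):

```ts
import * as THREE from "three";

const scene = new THREE.Scene();

// Use the AI-generated panorama as the environment map. PBR
// materials sample it for diffuse and specular lighting, which is
// a cheap stand-in for true global illumination.
new THREE.TextureLoader().load("panorama.png", (tex) => {
  tex.mapping = THREE.EquirectangularReflectionMapping;
  tex.colorSpace = THREE.SRGBColorSpace;
  scene.environment = tex; // lights every physically-based material
  scene.background = tex;  // optional: also show it behind the scene
});
```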
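
And for the splat environments: my sandbox renders them itself, but as a point of reference, loading a World Labs-style splat file with the open-source @mkkellogg/gaussian-splats-3d viewer looks roughly like this (file name and camera values are placeholders; check the library's docs for current options):

```ts
import * as GaussianSplats3D from "@mkkellogg/gaussian-splats-3d";

// Stand up a self-contained splat viewer and stream in one scene.
// All values here are placeholders, not the prototype's.
const viewer = new GaussianSplats3D.Viewer({
  cameraUp: [0, 1, 0],
  initialCameraPosition: [0, 1.5, 4],
});
viewer
  .addSplatScene("environment.ksplat", { splatAlphaRemovalThreshold: 5 })
  .then(() => viewer.start());
```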

The biggest “aha” moment came when the particle system materialized entirely from a prompt. No boilerplate. No manual wiring. Just: describe the effect, and the framework builds the scaffolding.

The Joy-Con Sword

The icing on the cake was the physical interaction.

I embedded a Nintendo Switch Joy-Con into a 3D-printed sword and read the gyroscope data over Bluetooth. The rotation drove a virtual sword in the scene, with a tracked point nested at the blade's tip. That tip's world-space position became the hit detector — wherever the player slashed, particles erupted.
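
If you want to try this in a browser, WebHID is one route to the Joy-Con. The sketch below follows community reverse-engineering of the protocol (dekuNukem's Nintendo_Switch_Reverse_Engineering notes); treat the subcommands and byte offsets as assumptions to verify, and note this isn't necessarily the exact path my build took:

```ts
// Requires WebHID type definitions (e.g. the @types/w3c-web-hid package),
// and requestDevice must be called from a user gesture.
const NINTENDO_VENDOR_ID = 0x057e;
let packet = 0;

async function connectJoyCon(onGyro: (x: number, y: number, z: number) => void) {
  const [device] = await navigator.hid.requestDevice({
    filters: [{ vendorId: NINTENDO_VENDOR_ID }],
  });
  if (!device) return;
  await device.open();

  // Subcommands ride on output report 0x01:
  // [packet counter, 8 bytes of neutral rumble, subcommand id, args...].
  const rumbleNeutral = [0x00, 0x01, 0x40, 0x40, 0x00, 0x01, 0x40, 0x40];
  const subcommand = (id: number, ...args: number[]) =>
    device.sendReport(0x01, new Uint8Array([packet++ & 0x0f, ...rumbleNeutral, id, ...args]));

  await subcommand(0x40, 0x01); // enable the IMU
  await subcommand(0x03, 0x30); // switch to the standard full input report (~60 Hz)

  device.addEventListener("inputreport", (event) => {
    if (event.reportId !== 0x30) return;
    // First IMU frame: accel at bytes 12..17, gyro at 18..23,
    // all int16 little-endian. Offsets per the community docs.
    const gx = event.data.getInt16(18, true);
    const gy = event.data.getInt16(20, true);
    const gz = event.data.getInt16(22, true);
    onGyro(gx, gy, gz); // raw units; calibrate and scale before integrating
  });
}
```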
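
From there, the hit detector is just a point pushed out along the rotated blade. A sketch using three.js math types, with made-up blade length, anchor, and hit radius:

```ts
import * as THREE from "three";

const BLADE_LENGTH = 1.2; // world units; made-up value
const HIT_RADIUS = 0.35;  // made-up value

const swordRotation = new THREE.Quaternion();
const handle = new THREE.Vector3(0, 1.2, 0); // fixed "hand" anchor
const tip = new THREE.Vector3();

// Integrate calibrated gyro rates (rad/s) into the orientation,
// then rotate a local +Z blade vector into world space.
function updateSword(gx: number, gy: number, gz: number, dt: number) {
  const delta = new THREE.Quaternion().setFromEuler(
    new THREE.Euler(gx * dt, gy * dt, gz * dt)
  );
  swordRotation.multiply(delta).normalize(); // post-multiply: rates are in the sword's own frame
  tip.set(0, 0, BLADE_LENGTH).applyQuaternion(swordRotation).add(handle);
}

// Anything within reach of the tip gets sliced and spawns particles.
function checkHits(enemies: { position: THREE.Vector3; alive: boolean }[]) {
  for (const enemy of enemies) {
    if (enemy.alive && enemy.position.distanceTo(tip) < HIT_RADIUS) {
      enemy.alive = false; // trigger the particle burst here
    }
  }
}
```

A swept segment between the previous and current tip positions would catch fast slashes more reliably than a point test, at the cost of a slightly fiddlier distance check.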

Simple. Chaotic. Fun.

The Outcome

Visually, it’s still prototype quality — and that’s the entire point.

Because this wasn’t a two-month production.

It wasn’t even a two-day sprint.

It was two hours.

And in those two hours, everything that normally slows down creative tech — setup, glue code, assets, systems — disappeared. What used to devour a weekend now fits into an afternoon. The time saved becomes time you can redirect into polishing, storytelling, or pushing the idea further.

For Forged, this prototype is a small example of the bigger shift underway:

Timelines are collapsing, iteration is frictionless, and we can build pitch-ready prototypes that would’ve been impossible on traditional budgets.