Hall of Flowers: The Collective Bloom

When Hall of Flowers came to us, they weren’t looking for another trade show activation.

They wanted to elevate what’s already the defining event in their industry.

Hall of Flowers is to cannabis what SEMA is to automotive or Basel is to art — a high-end, invite-only showcase where the best of the industry meets. It’s not casual. It’s curated. So when they asked us to create something worthy of that environment, we set out to build something that wasn’t about spectacle, but about connection.

The Brief

The ask was simple: make attendees feel part of something bigger than themselves.

The event spans multiple halls filled with massive LED walls, screens, and installations. We saw that as an opportunity — to use those screens not just for advertising, but as a canvas for a living, collective artwork that every attendee could contribute to.

The Idea

We reimagined the Statue of Liberty — not as a monument of stone, but as a flowing symbol of community.

Our concept: a real-time, 4.5-million-particle recreation of the statue that would evolve as attendees interacted with it. By scanning a QR code, anyone could open an AR experience, place a digital flower, and see it instantly appear on the large-scale projection at the venue.

The flowers themselves were carefully chosen — each carrying a distinct story tied to both New York and Hall of Flowers.

  • Daffodil (New York City’s official flower) – Chosen after 9/11 as part of The Daffodil Project, it symbolizes remembrance and resilience, a tribute to the city where the original statue stands.
  • Freesia (the seventh-year flower) – Traditionally associated with the seventh anniversary, it represents trust, loyalty, and lasting connection, a quiet nod to Hall of Flowers entering its seventh year.
  • ‘Lady Liberty’ Hybrid-Tea Rose – A soft, coral-hued rose bred for elegance and strength, named in honor of the statue herself.
  • ‘Torch of Liberty’ Miniature Rose – A smaller, flame-colored bloom echoing the statue’s torch, a symbol of light and creativity passed from hand to hand.

The Process

We began with the same Statue of Liberty model used in the AR experience to ensure perfect continuity between handheld and large-scale visuals.

Inside Houdini, we transformed that static model into a volumetric sculpture made of motion and light. We sculpted dynamic particle fields that would flow across the statue’s surface — a blend of physical realism and artistic abstraction — and exported them as volumetric particle data.

Particles, unlike meshes, have no inherent normals — they don’t know which way they’re facing. So we built a custom raytracing system that understood each particle’s relationship to the statue’s geometry.

Treating the torch as the light source, every point on the statue was shaded by how it faced the flame, letting millions of independent particles behave as if they were part of one unified surface.
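To make that concrete, here is a simplified, CPU-side sketch of the shading idea. It is not our production WebGPU code; the per-particle normals (approximated from the statue surface) and the single torch light are stand-in assumptions used for illustration.

```typescript
// Simplified sketch: shade millions of independent particles as if they were
// one surface, using the torch as the light source.
// Assumption: each particle carries an approximate normal sampled from the
// statue geometry it was emitted from.

type Vec3 = { x: number; y: number; z: number };

interface Particle {
  position: Vec3;
  normal: Vec3;       // approximated from the nearest statue surface point
  brightness: number; // shaded output in [0, 1]
}

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const normalize = (v: Vec3): Vec3 => {
  const len = Math.sqrt(dot(v, v)) || 1;
  return { x: v.x / len, y: v.y / len, z: v.z / len };
};

// Particles facing the flame glow; particles facing away fall into shadow,
// so the whole cloud reads as a single lit form.
function shadeParticles(particles: Particle[], torchPosition: Vec3, ambient = 0.1): void {
  for (const p of particles) {
    const toTorch = normalize(sub(torchPosition, p.position));
    const facing = Math.max(0, dot(p.normal, toTorch)); // Lambert-style term
    p.brightness = Math.min(1, ambient + facing);
  }
}
```

In the real system this relationship is evaluated on the GPU across all 4.5 million particles every frame; the sketch only shows the shading logic itself.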

The result: a fluid, luminous form that looked like it was sculpted from energy itself.

The difference between the particle texture alone (left) and with raytracing enabled (right).

Capturing the Shots

Once the digital statue existed, we needed to find ways to film it — to make something entirely virtual feel as if it were shot in the real world.

So we built an AR camera-capture tool. Using the phone’s motion sensors and camera feed, we could move around a blank physical space while seeing our virtual Statue of Liberty anchored in place. Every step we took in the real world translated into camera movement in the digital scene.

This approach let us compose shots as if we were physically on set — finding angles, reveals, and parallax moments that felt cinematic and grounded. Once a movement felt right, we’d lock the camera’s position and orientation, then fine-tune the shot inside our GUI to perfect framing, focal length, and depth of field.

It gave us the best of both worlds: the spontaneity of handheld cinematography and the precision of digital control.
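Conceptually, the tool boiled down to reading the phone’s tracked pose every frame and recording it as a camera path. The sketch below shows that loop in WebXR terms; the session setup and the recorded-path structure are simplified assumptions (and WebXR type definitions such as @types/webxr are assumed), not the actual capture tool.

```typescript
// Simplified sketch: record the phone's tracked pose as a virtual camera path.
// Assumes an AR-capable browser and WebXR type definitions (e.g. @types/webxr).

interface CameraSample {
  position: DOMPointReadOnly;
  orientation: DOMPointReadOnly;
}

const recordedPath: CameraSample[] = [];

async function startCameraCapture(): Promise<void> {
  if (!navigator.xr) return; // no WebXR on this device/browser

  const session = await navigator.xr.requestSession("immersive-ar");
  const refSpace = await session.requestReferenceSpace("local");

  const onFrame = (_time: number, frame: XRFrame): void => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // Every physical step becomes a camera move in the digital scene.
      const { position, orientation } = pose.transform;
      recordedPath.push({ position, orientation });
    }
    session.requestAnimationFrame(onFrame);
  };

  session.requestAnimationFrame(onFrame);
}
```

Once a take felt right, a path recorded this way could be frozen and the framing, focal length, and depth of field refined afterwards, as described above.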

The AR Experience – No App, No Friction

The magic only works if everyone can participate, so we made it completely browser-based.

On Android, users launched directly into WebXR — scan the QR code, and the statue appeared instantly in AR.

On iOS, we used Apple’s App Clip system to bridge the WebXR gap. It ran a native ARKit experience off the same shared codebase, giving iPhone users the same fidelity and responsiveness as Android users — without ever downloading an app.
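The routing itself can be pictured as a single decision: if the browser supports WebXR, start AR in the page; if not (as on iOS Safari), hand off to the App Clip. The snippet below is an illustrative sketch, not our production code; the App Clip URL and the startWebXrExperience entry point are placeholders.

```typescript
// Illustrative sketch of the launch routing. The URL and entry point below
// are placeholders, not production names.

declare function startWebXrExperience(): Promise<void>; // hypothetical WebXR entry point

async function routeAttendee(): Promise<void> {
  const arSupported = navigator.xr
    ? await navigator.xr.isSessionSupported("immersive-ar")
    : false;

  if (arSupported) {
    // Android and other WebXR-capable browsers: AR starts right in the page.
    await startWebXrExperience();
  } else {
    // iOS Safari has no WebXR, so hand the user off to the native App Clip.
    window.location.href = "https://example.com/appclip/bloom"; // placeholder URL
  }
}
```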

Every flower placed through the AR experience synced in real time to the main installation. Within seconds, what appeared in someone’s hand also appeared on the towering, particle-based statue across the venue.
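Under the hood, that sync can be pictured as a small publish/subscribe loop: phones publish each placement, the venue renderer subscribes and spawns the flower. The sketch below assumes a WebSocket relay and an invented message shape; the endpoint and the renderer hook are placeholders rather than details of the production stack.

```typescript
// Sketch of the real-time sync, assuming a WebSocket relay (an assumption,
// not a confirmed detail of the production system).

interface FlowerPlacement {
  species: "daffodil" | "freesia" | "lady-liberty-rose" | "torch-of-liberty-rose";
  u: number;        // normalized anchor position on the statue (assumed encoding)
  v: number;
  placedAt: number; // epoch milliseconds
}

declare function spawnFlowerOnStatue(placement: FlowerPlacement): void; // hypothetical renderer hook

const socket = new WebSocket("wss://example.com/bloom"); // placeholder endpoint

// Phone side: publish a placement the moment a flower is anchored in AR.
function sendPlacement(placement: FlowerPlacement): void {
  if (socket.readyState === WebSocket.OPEN) {
    socket.send(JSON.stringify(placement));
  }
}

// Venue side: every incoming placement spawns a flower on the particle statue
// within seconds of it appearing in the attendee's hand.
socket.addEventListener("message", (event: MessageEvent<string>) => {
  const placement = JSON.parse(event.data) as FlowerPlacement;
  spawnFlowerOnStatue(placement);
});
```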

The Outcome

Over two days, attendees placed more than 14,000 flowers, each one appearing live on the collective sculpture.

The installation became a gathering point — people stopped to find their contribution, point it out to friends, take photos. The once-static screens around the venue pulsed with movement and color, reflecting the combined presence of thousands of attendees.

For Hall of Flowers, it became more than an activation — it became a shared ritual that embodied the spirit of the event. Attendee feedback showed a measurable lift in brand sentiment and intent to return for the next show — proof that creative technology can strengthen not just engagement, but emotional connection.

Impact Highlights

  • 4.5M real-time ray-traced particles rendered in WebGPU
  • Cross-platform AR built with WebXR and App Clips (no app download)
  • Over 14,000 digital flowers placed in two days
  • Live venue-wide visual synchronization across all screens
  • Measurable lift in attendee sentiment and intent to return next year