
The Wood Beneath the World - The Chime Hours...

'The Chime Hours' is a collaboration with fellow Leeds-based studio and Connoisseurs of Make-Believe, Lord Whitney, with funding from XR Stories and supported by The University of York.

It's an R&D project to create a location-based VR prototype building on Lord Whitney's existing site-specific experience, The Wood Beneath the World. It's intended as a taster: one chapter of a potentially much larger piece.

Step this way to read more...

Enter The Chime Hours

Set in a mysterious world of the forgotten, the surreal and the subconscious, the project provides valuable insights into the ways that creatives can incorporate immersive, multiuser technology into their installations and experiences in large spaces.

We created a virtual world, melded with the real-life location of Testbed: "a newly renovated 10,000 sq foot event venue in Leeds that offers endless possibilities for creating unique and inspiring experiences".

Participants freely explore the Testbed space in VR

Participants could walk freely around the space in a wireless mixed reality headset, interacting with the mysterious woods, its inhabitants and each other via a virtual torch and their very presence. The further they stepped into the space and shone a light on the world, the more the real world faded away and the leafy depths of the woods emerged...

Participants enter the experience

Bookending the VR portion of the experience, Lord Whitney worked their magic on a physical installation, with actors/facilitators introducing participants to the wonders and mystery of the space, greeting them as they took off their headsets and guiding them back into the physical set.

There was a desire to engage people on a multi-sensory level, combining VR visuals with an intimate performance, free-roaming exploration, the smell of woodland, wood chips underfoot and the feeling of wind on their faces.

The challenge

The unique technical challenge of this project was to find a way for up to four people to share a space together, to be able to see a representation of each other, and to each be able to interact with the VR elements to activate the trees, plants and inhabitants of the woods. All of this in a much larger space than we'd ever worked with before for a location-based, multiuser VR experience. In short, a fantastic challenge!

Early on in the project we decided to work with HTC's XR Elite headsets as they offer colour "passthrough" (seeing the real world around you and meshing it with virtual objects). Aside from that, much like the Meta Quest headsets, these VR devices are designed to be used as standalone headsets: they can either connect to a PC via Wi-Fi or run mobile apps built directly for them.

HTC XR Elite and controller as torch

We created a particle effect that allows the trees to morph and grow in shape when they are activated, and also to swirl into a vortex of particles and disappear (with a gust of wind and the sound of rustling leaves) if you try walking through them. Since there's no way of stopping a person walking through a virtual object in such a space, we made a feature of it!
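As a rough illustration of that "make a feature of it" idea, here's a minimal sketch in Python rather than the project's actual engine or particle-system code; the radius, state names and sound cue are assumptions for illustration only. The idea is simply that when a participant steps inside a tree's footprint, the tree switches into a "vortex" state and a wind/rustle cue is queued, instead of the participant clipping through it.

```python
# Illustrative sketch (not the project's code): switch a virtual tree into a
# "vortex" state when a participant walks into its footprint.
TREE_RADIUS = 1.0  # metres; assumed value

def update_tree_state(tree, participant_positions):
    """tree is a dict with x/z position, current state and an activated flag."""
    for px, pz in participant_positions:
        dx, dz = px - tree["x"], pz - tree["z"]
        if dx * dx + dz * dz < TREE_RADIUS * TREE_RADIUS:
            if tree["state"] != "vortex":
                tree["state"] = "vortex"              # scatter the particles into a swirl
                tree["sound_cue"] = "wind_gust_rustle"  # hypothetical one-shot audio cue
            return
    # Nobody inside the tree: show it grown if activated, dormant otherwise.
    tree["state"] = "grown" if tree["activated"] else "dormant"

tree = {"x": 2.0, "z": 5.0, "state": "dormant", "activated": True, "sound_cue": None}
update_tree_state(tree, [(2.3, 5.1)])
print(tree["state"])  # "vortex"
```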

Spatial audio played a key part in the experience. Dynamic, intimate elements, such as the voice of the "Will-o'-the-Wisp" character you encounter in the space, were conveyed through the headset's onboard earphones, an effect personal to each participant, while XR Stories lent us their amazing speaker system (and surround-sound know-how) to bring external atmospherics and music into the mix as a shared audio layer for everyone in the space.
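To make the two-layer design concrete, here's a tiny Python sketch of how a sound cue might be routed to either the personal (headset) layer or the shared (room speaker) layer; the cue names and categories are assumptions, not the project's actual audio pipeline.

```python
# Illustrative sketch (not the project's audio code): route each cue into one
# of the two audio layers described above.
PERSONAL = "headset"  # spatialised per participant via the onboard earphones
SHARED = "room"       # played through the external surround speaker system

def route_audio(cue):
    # Intimate, character-driven sounds stay personal; atmos and music are shared.
    if cue["category"] in ("character_voice", "interaction"):
        return PERSONAL
    return SHARED

print(route_audio({"name": "wisp_whisper", "category": "character_voice"}))  # headset
print(route_audio({"name": "forest_atmos", "category": "ambience"}))         # room
```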

We integrated the real Testbed space into the virtual world and looked at ways of gradually morphing from the real world to the virtual and back again at the end of the experience, even tying it to the position of the participants: with every step forward they took, more of the real world would disintegrate away.
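A minimal sketch of how that position-driven transition could work, in Python rather than the engine code actually used on the project; the entrance point and fade distance are assumed values for illustration.

```python
# Illustrative sketch (not the project's code): blend from real-world
# passthrough to the virtual woods based on how far a participant has walked.
def world_blend(participant_pos, entrance_pos, fade_distance=8.0):
    """Return 0.0 (full passthrough) to 1.0 (fully virtual)."""
    dx = participant_pos[0] - entrance_pos[0]
    dz = participant_pos[1] - entrance_pos[1]
    distance = (dx * dx + dz * dz) ** 0.5
    # Clamp so the real world has fully "disintegrated" by fade_distance metres.
    return min(max(distance / fade_distance, 0.0), 1.0)

# Example: a participant 4 m into an 8 m fade zone sees a 50/50 mix.
print(world_blend((4.0, 0.0), (0.0, 0.0)))  # 0.5
```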

Following some initial tests and discussion, we decided to use a single hand-held controller as the primary means of interacting with the world, either by shining the torch beam on objects to "activate" them or quite simply by proximity.
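To illustrate the two interaction routes, here's a small Python sketch of a combined beam-and-proximity check; the cone angle and proximity radius are made-up thresholds rather than values from the project, and the real implementation (likely engine-level raycasts or triggers) isn't shown in this post.

```python
import math

# Illustrative sketch (not the project's code): an object "activates" if the
# torch beam is pointing at it (cone test) or the participant is close enough.
def normalise(v):
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)

def is_activated(torch_pos, torch_dir, object_pos,
                 beam_half_angle_deg=15.0, proximity_radius=1.5):
    to_object = tuple(o - t for o, t in zip(object_pos, torch_pos))
    distance = math.sqrt(sum(c * c for c in to_object))
    if distance <= proximity_radius:      # close enough: presence alone activates it
        return True
    cos_angle = sum(a * b for a, b in zip(normalise(torch_dir), normalise(to_object)))
    return cos_angle >= math.cos(math.radians(beam_half_angle_deg))  # inside the beam cone

# Example: a tree 3 m ahead, torch pointing straight at it.
print(is_activated((0, 1.2, 0), (0, 0, 1), (0, 1.5, 3)))  # True
```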

Whilst this is just a limited taster of the possibilities, there's so much more we would have liked to try: from more tactile elements you could touch, and more interactive elements and "clues" linking to the overall story; to tweaking how the multiuser elements work; to really exploring the limits of mixed reality and passthrough techniques.

We'd also like to investigate ways of making it as accessible as possible to as many people as possible, everything from how to allow for alternative ways of moving about, to methods of avoiding nausea, to options for people who are hard of hearing or visually impaired. Lots to explore next time! 😊

Credits

By Lord Whitney & Reflex Arc

Funded by XR Stories
Supported by The University of York

  • Amy Lord: Artistic Director (Lord Whitney)
  • Rebecca Whitney: Creative Director (Lord Whitney)
  • Richard England: Creative Technology Director
  • Elliot Mann: VR Design Intern (University of York)
  • Sammy Gooch: Producer
  • David Gochfeld: Research Fellow (XR Stories)
  • Melodie Ash: Creative Producer (XR Stories)
  • Joe Rees-Jones: Creative Technologist / Audio (XR Stories)
  • Buffalo: Sound Design
  • Joe Kent-Walters: Performer
  • Drewit Studios: Set Build Crew
  • Florence Simms: Feedback Facilitator
  • Jennie Gilman: Event Runner
  • Jemma Micklebrough: Photos
  • Matt Brown: Audio for storm and wind effects
  • Thanks to Joe P for helping with boundary management, testing and tracking.
  • With thanks to Testbed & Slung Low