The Cube at Virginia Tech
Virginia Tech’s Institute for Creativity, Arts, and Technology (ICAT) is home to a collaborative research environment called the Cube, a completely reconfigurable immersive environment found almost nowhere else. It hosts research ranging from embodied data analytics to drone flight to cutting-edge creative work.
To accomplish this, the Cube houses a motion capture system that can capture human performance (and some non-human performance) across the entire space, one of the largest immersive audio systems in the world, and a new projection system that enables projection mapping on all walls.
One of the very few facilities of its kind worldwide, the space is 42 feet high (five stories), 50 feet long, and 40 feet wide. We usually refer to it as a black box theater: its black surfaces absorb stray light, making it a neutral canvas where almost anything technological or artistic can happen, from musical performances to robot ensembles to XR simulations to art exhibitions.
This five-story, state-of-the-art theater and high-tech laboratory serves multiple platforms of creative practice by faculty, students, and national and international guest artists and researchers.
The Cube is a highly adaptable space for research and experimentation in big data exploration, immersive environments, intimate performances, audio and visual installations, and experiential investigations of all types. This facility is shared between ICAT and the Moss Arts Center at Virginia Tech.
Currently, the Cube is equipped with three modular 4K projectors (two rated at 10,000 lumens each, the third at up to 30,000 lumens), one disguise vx4 (a multimedia presentation server and projection mapping system with four 4K UHD outputs), 134.6 channels of 3D spatial audio, nine holosonic directional loudspeakers, sixteen channels of wireless headphone transmitters, a 24-camera Qualisys Optical Tracking System, a triptych immersive projection screen, four channels of Pozyx Ultra-Wide Band (UWB) radio frequency tracking, and nineteen miles of patchable analog and digital audio, video, and data connections. This patching system connects the entire Moss Arts Center and has a private 10G fiber link to the Andrews Information Systems Building, a secure data center housing additional computing and storage.
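A multichannel spatial audio system like the Cube's places a virtual sound source by assigning a gain to each loudspeaker. The actual Cube renderer is far more sophisticated; the sketch below is a minimal, hypothetical illustration of the underlying idea, using constant-power amplitude panning between adjacent speakers on a horizontal ring (the function name, speaker count, and layout are illustrative assumptions, not the Cube's configuration):

```python
import math

def ring_pan(azimuth_deg, num_speakers):
    """Constant-power amplitude panning of a mono source across a
    horizontal ring of equally spaced loudspeakers. Only the two
    speakers adjacent to the source direction receive signal."""
    gains = [0.0] * num_speakers
    spacing = 360.0 / num_speakers
    pos = (azimuth_deg % 360.0) / spacing      # fractional speaker index
    lo = int(pos) % num_speakers               # nearer speaker
    hi = (lo + 1) % num_speakers               # next speaker around the ring
    frac = pos - int(pos)                      # position between the pair
    gains[lo] = math.cos(frac * math.pi / 2)   # constant-power crossfade:
    gains[hi] = math.sin(frac * math.pi / 2)   # cos^2 + sin^2 = 1 always
    return gains

# A source at 22.5 degrees on an 8-speaker ring lands midway between
# speakers 0 and 1, so both receive equal gain.
gains = ring_pan(22.5, 8)
```

The constant-power crossfade keeps the summed acoustic energy steady as a source sweeps past a speaker pair, which is why it is preferred over a simple linear fade in most amplitude-panning renderers.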
These technical capabilities allow innovative projects to be created and experienced in the Cube; a few are worth highlighting:
Embodied Virtual Reality for Training and Performance
A feasibility study of a fully immersive virtual environment for training American football quarterbacks. The study seeks to create a flexible platform that integrates proprioceptive, kinesthetic, physiological, visual, and auditory elements to enhance athletic training and measure psychomotor responses.
Transforming Highway Construction Training through Multi-User Immersive Augmented Virtual Reality
A project that addresses gaps in traditional highway construction worker training by developing innovative content-based Augmented VR training, enabling a seamless training experience for workers.
An Educational Tool to Explore the Dynamics of Subatomic Physics Interactions
A project seeking to develop a new immersive educational tool for experimental subatomic physics using a virtual reality (visuals and sound) world in the ICAT Cube.
With initial funding from the Commonwealth Cyber Initiative (CCI), ICAT has also developed a "mobile Cube" known as the Tesseract. The Tesseract has been deployed at the Smithsonian National Museum of American History, the Taubman Museum of Art, the Torpedo Factory Art Center, and all over the Virginia Tech campus. It is built with off-the-shelf theatrical truss components for rapid deployment as a social, immersive audio experience. About eight people can simultaneously stand inside the structure and share an objective spatialized audio environment without headphones. Up to sixty-four individually addressable loudspeakers are driven by Power over Ethernet (PoE) network switches, delivering digital audio entirely over a network. Each loudspeaker receives a unique stream via the Audio Video Bridging (AVB) AVoIP protocol. The miniDSP loudspeakers come in stereo pairs, pulling up to 15W of power per channel. The entire system can reach levels of up to 100 dB LCSmax (a C-weighted, slow-time-weighted maximum sound pressure level).
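Because the Tesseract's speakers draw all their power over Ethernet, the rig's worst-case draw determines how much PoE switching it needs. A back-of-envelope sketch, using the 64 speakers and 15 W per channel from the description above (the 370 W per-switch PoE budget is an illustrative assumption, not a documented figure):

```python
import math

def tesseract_power_budget(num_speakers=64, watts_per_channel=15.0,
                           switch_budget_w=370.0):
    """Worst-case PoE power draw for a Tesseract-like networked speaker rig.
    Speaker count and per-channel wattage come from the text; the per-switch
    PoE budget is a hypothetical, typical mid-range figure."""
    pairs = num_speakers // 2                 # speakers come in stereo pairs
    draw_per_pair = 2 * watts_per_channel     # both channels driven at max
    total_draw = pairs * draw_per_pair        # whole rig at full level
    switches_needed = math.ceil(total_draw / switch_budget_w)
    return pairs, draw_per_pair, total_draw, switches_needed

pairs, per_pair, total, switches = tesseract_power_budget()
```

With these assumptions, 32 stereo pairs at 30 W each give a 960 W worst case, which is why the system is described as using PoE switches in the plural: no single mid-range switch budget covers the full rig at peak level.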
Combining bone conduction headphones, the Tesseract, and the Cube produced the first-ever layered immersive audio system, demonstrated in Liminal Spaces, a composition by Guggenheim Fellow composer Eric Lyon and ICAT executive director Ben Knapp. The layered system has since been used by human factors students to explore and innovate with a system no other students have had the chance to experience.
At ICAT, we strive to push the boundaries of what's possible, and we plan to upgrade the current instrumentation of the Cube to future-proof it and make it ready for even more cutting-edge ideas from students and faculty. Upgrades are both technological and thematic, allowing more flexibility to present Extended Reality (XR) environments that blend real-world physical objects with a large-scale virtual environment.
The Cube infrastructure upgrades include doubling the modular projector heads and light source engines, a second disguise video server, five more projection surfaces, and an image-generating workstation connected via Network Device Interface (NDI). The 3D spatial audio system will also be upgraded to 198.8 channels, and bone conduction headphone wearables with a central audio renderer using audio over Internet Protocol (AoIP) will make the layered audio infrastructure permanent. Additional motion capture cameras and UWB hardware will be added for high-resolution tracking.
A modular, acoustically transparent floor will be added, allowing sensors and media presentation systems to be placed underneath researchers. Rigging systems will be added to deploy the presentation systems and modular floor. 10G and 1G network infrastructure and computing will be added or upgraded to accommodate the additional instrumentation, using existing Cat6a and single/multimode fiber patchable cable systems. This networking will connect to the University of California, Santa Barbara's AlloSphere, a facility with similar immersive environment capabilities, allowing both sites to share simultaneous, large-scale XR experiences. These additions will afford a relatively rapid turnover to accommodate a variety of simulation, training, and research experiences.
- First of its kind, full-scale (50’w x 40’l x 32’h), $15M data exploration facility
- Response to the U.S. initiative on “big data”
- Unlike traditional virtual environments, this collaborative research environment for augmented team exploration (CREATE) enables multi-person (social) collaboration with data
- Augmented reality (head-mounted display and tablet interaction interface)
- Wave field synthesis and holosonic sound display interaction
- Synchronized data capture, including infrared (IR) motion capture, audio/video, physiological, and interaction signals
- Real-time audio/visual rendering system
- High-performance computing
- Real-time interaction research
- Virtual vs. real world investigations
- Human performance modeling and studies
- Distributed gaming and social environments
- Multi-person “walk-through” of virtual buildings and environments
- Education and training in full-scale virtual environments
- Artistic installations/performances using multi-screen display and a 128-speaker system