Futuristic fashion with an augmented edge

Augmented reality tech made its way onto the catwalk for fashion designer Steven Tai’s ground-breaking show. Zoe Mutter discovered how London College of Fashion and Lucasfilm transformed the presentation into a digitally augmented experience using motion capture techniques and AR innovation.

It wasn’t only the high fashion garments that grabbed the audience’s attention at designer Steven Tai’s show; one model in particular was turning heads. The newcomer to the world of modelling stood out from her catwalk peers because she was entirely digitally augmented.

Vita the avatar’s movement was controlled by the improvised performance of a motion capture actor backstage. Meanwhile, realtime digital effects were applied to the CG model appearing on the screen, enabling her to change outfits during the performance.

Designer Tai teamed up with London College of Fashion and ILMxLAB to create the tech-enhanced LiveCGX fashion presentation at Durbar Court within London’s Foreign and Commonwealth Office. Set against the backdrop of an augmented street scene in China, the show saw ILMxLAB venture into new territory working on their first live event utilising realtime rendering.

Catwalk collaboration
ILMxLAB and London College of Fashion had been in contact for two years leading up to the partnership. The organisations share a common passion for exploring the opportunities presented by new technologies such as AR and VR, and how they can open up new creative possibilities in the arts.

“We realised there was a natural fit between a new technology we were exploring at ILMxLAB involving augmented live performances and the London College of Fashion’s desire to explore new ways for designers to present their work,” says Mohen Leo, director of content and platform strategy, ILMxLAB.

Matthew Drinkwater, head of the Fashion Innovation Agency at London College of Fashion, began the process by searching for a designer who could help turn the ideas into practice. He found the perfect partner in Steven Tai, a fashion designer keen to experiment with new technology, having previously taken part in an interactive, video game-style fashion presentation.

The Fashion Innovation Agency has also worked on multiple projects promoting the incorporation of technology in the fashion industry. They were involved with developing the first interactive skirt made up of Nokia smartphones and played a role in the design and creation of a fully functional Star Wars-inspired bionic arm which was showcased at the Star Wars: Fashion Finds the Force presentation.

Realistic digital garments
While the two-hour presentation strove to display garments from Steven Tai’s Autumn Winter 2018 collection, it was also important that the technology was incorporated in an ambitious way. ILMxLAB wanted to highlight the improvisational capabilities of the LiveCGX technology stack and continue to set the bar high for content.

The complex tech set-up allowed them to showcase digital garments that looked, moved and draped realistically, and were indistinguishable from the real garments. The audience could compare the actual garments with realtime costume changes on the CG avatar as the two were displayed side by side on stage.

“Steven Tai was also keen to tell his story by transforming the classical setting of Durbar Court using elements from his homeland, Macau. The digital elements transitioned from traditional elements such as jungle flora and fauna through to modern-day Macau imagery such as casino designs, neon lights and signage,” explains Maggie Oh, technical project manager at ILMxLAB’s advanced development group. “Ultimately, we wanted to show a fantastical, never-before-seen environmental mash-up of Macau’s jungle, neon signs, and Durbar Court’s classical architecture.”

Realtime rendered special effects were used for two key elements of the fashion presentation. The first was the digital garments morphing from the Origin garment (an orange pantsuit) to the Destination garment (a navy shirtdress). The effects also enhanced the environment: at 30-minute intervals, Durbar Court’s classical architecture transformed between traditional Macau, complete with jungle flora and fauna, and modern Macau’s casino aesthetic of neon signage and bright awnings.
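As a sketch of how such a triggered, fixed-duration morph can be driven, the snippet below advances a blend value every frame; that value would feed whatever material parameter crossfades between the two garment looks. All names, the three-second duration and the trigger mechanism are illustrative assumptions, not ILMxLAB’s code.

```cpp
// Hypothetical trigger-driven garment morph: blends an "Origin" look into a
// "Destination" look over a fixed duration once fired.
#include <algorithm>

struct GarmentMorph {
    float blend = 0.0f;      // 0 = Origin (orange pantsuit), 1 = Destination (navy shirtdress)
    bool  active = false;
    float duration = 3.0f;   // seconds for the full transformation (a guess)

    void trigger() { active = true; }     // e.g. fired by a gesture detector

    // Called once per rendered frame with that frame's delta time.
    void tick(float dt) {
        if (!active) return;
        blend = std::min(1.0f, blend + dt / duration);
        if (blend >= 1.0f) active = false;
        // 'blend' would then be written to the material parameter that
        // dissolves between the two simulated garments.
    }
};
```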

Mo-cap magic
ILMxLAB aspires to create content and technology that build towards a world where augmented content will be all around us. The fashion show project was a major milestone for the company because it marked the first public use of LiveCGX. Motion capture and depth sensing techniques were invaluable to the team when creating the immersive environment. The set-up comprised a series of Microsoft Kinect cameras which captured depth information from the audience and the models on stage.
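As a rough illustration of that capture step, the sketch below pulls depth frames from a Kinect v2 sensor using Microsoft’s Kinect for Windows SDK 2.0 (the hardware generation current during development). It is a bare-bones, Windows-only sketch of the acquisition loop, not ILMxLAB’s production pipeline; error handling and shutdown are trimmed.

```cpp
// Minimal Kinect v2 depth acquisition (Kinect for Windows SDK 2.0).
// Illustrative sketch only; cleanup and error handling are trimmed.
#include <windows.h>
#include <Kinect.h>

int main() {
    IKinectSensor* sensor = nullptr;
    IDepthFrameSource* source = nullptr;
    IDepthFrameReader* reader = nullptr;

    if (FAILED(GetDefaultKinectSensor(&sensor)) || FAILED(sensor->Open()))
        return 1;
    sensor->get_DepthFrameSource(&source);
    source->OpenReader(&reader);

    for (;;) {  // poll; AcquireLatestFrame fails harmlessly until a new frame lands
        IDepthFrame* frame = nullptr;
        if (SUCCEEDED(reader->AcquireLatestFrame(&frame))) {
            UINT capacity = 0;
            UINT16* depth = nullptr;        // per-pixel depth in millimetres
            frame->AccessUnderlyingBuffer(&capacity, &depth);
            // ... hand the 512x424 depth buffer to the realtime compositor ...
            frame->Release();
        }
    }
}
```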

Microsoft Kinect cameras were used right from the beginning of LiveCGX development so the team could capture human movement with accurate depth information using a commercially proven SDK. However, during the London Fashion Week LiveCGX development Microsoft stopped manufacturing Kinect cameras, so in future the team will need to investigate other camera hardware options.

As live events can be unpredictable, ILMxLAB paid close attention to detail when setting up the kit: “Some of the details we figured out on site included where to mount the Kinects so they were out of the audience’s immediate footpath, how to dial in the correct lighting for digital assets that are composited with live footage on an LED display, and where in London we could buy extra camera mounts at short notice,” says Oh.

In addition to a combination of off-the-shelf hardware and software and its own proprietary tools, ILMxLAB relied upon Vicon Vero cameras to create the LiveCGX experience at London Fashion Week. The team has extensive prior experience with these industry-standard motion capture cameras, which it used to build an easy-to-deploy, easy-to-calibrate motion capture system.

Whilst the venue itself is enclosed, there is a large skylight several storeys directly above where the presentation would take place. ILMxLAB therefore rented Black Onyx displays from VER: bright, modular, low-glare LED displays capable of showing the augmented environments and garments in dramatically different and changing lighting conditions.

Realtime effects
LiveCGX is a sandbox toolset that in the future will enable performers of all types – from sports through to music – to augment their performances in an improvised manner. “The events don’t need to be choreographed or timed, as the augmentation can be triggered by the performer when they want it. The performer has full control of their image and enhancements, without having to wait for a post-processing step to happen,” says Oh.

“As was shown in Steven Tai’s London Fashion Week presentation, the models’ choreographer and backstage coordinator could call out to our mocap performer, Vita Oldershaw, when to hit a specific mark. The backstage coordinator interacted with Vita in exactly the same way as with the real-life models. Vita was also able to trigger the garment transformation herself via a gesture detection system.”

What sets LiveCGX apart from other augmented live event experiences is that no projection techniques such as Pepper’s Ghost were used. Instead, the Microsoft Kinect cameras acquired a depth buffer of the live audience and models, enabling a realtime composite featuring the CG avatar.

“LiveCGX also stands out because we used off-the-shelf hardware and combined the power of those available resources to produce a creatively-driven, improvisational experience,” says Oh. “We hacked SDKs, built our own server, and enhanced Unreal Engine’s current capabilities. Not only were we able to display the high quality realistic art and content that people expect of our company, but it was also delivered in a realtime rendered live event.”

The depth buffer in Unreal’s Composure compositing module meant the CG elements were not simply rendered as an overlay on top of the live video footage; it allowed the real models to hug and interact with the CG avatar, and to hold the correct eye-line when engaging with her.
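In production this occlusion test runs inside the engine’s compositing pipeline, but the core idea reduces to a per-pixel, nearest-wins depth comparison. The sketch below illustrates that principle in plain C++; the function name and flat-buffer layout are assumptions made for clarity, not ILMxLAB’s implementation.

```cpp
// Depth-aware compositing sketch: for each pixel, the live camera plate wins
// when its Kinect-derived depth is nearer than the CG render's depth, so a
// real model standing in front of the avatar correctly occludes her.
#include <cstddef>
#include <cstdint>

struct Pixel { uint8_t r, g, b, a; };

void compositeDepthAware(const Pixel* livePlate, const float* liveDepth,
                         const Pixel* cgPlate,   const float* cgDepth,
                         Pixel* out, std::size_t pixelCount) {
    for (std::size_t i = 0; i < pixelCount; ++i) {
        // Smaller depth = closer to camera. Nearest-wins replaces the
        // "CG always on top" behaviour of a naive overlay composite.
        out[i] = (liveDepth[i] <= cgDepth[i]) ? livePlate[i] : cgPlate[i];
    }
}
```

A real implementation would perform this test on the GPU and soften the transition at depth edges, but the occlusion logic is the same.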

ILMxLAB’s LiveCGX technology also makes it possible to trigger auto-locomotion with spatial awareness from a gestural motion. In Steven Tai’s presentation, when Vita extended her hand as if throwing dice onto a table, her avatar on screen would walk amongst the group of models towards centre stage. A deep bow would then trigger a wardrobe change, which acted as the perfect focal point for the show finale.
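ILMxLAB has not published how its gesture detector works, but rule-based detectors typically compare tracked joint positions against simple thresholds. The sketch below shows how the dice-throw and deep-bow triggers might be expressed over a captured skeleton; the joint source, names and thresholds are all illustrative assumptions.

```cpp
// Hypothetical rule-based gesture detection for the two show triggers.
// Joint data could come from any mocap source; thresholds are guesses.
#include <cmath>

struct Vec3 { float x, y, z; };                      // metres, performer space
struct Pose { Vec3 head, spineBase, rightShoulder, rightHand; };

enum class ShowEvent { None, WalkToCentreStage, WardrobeChange };

ShowEvent detectGesture(const Pose& p) {
    // "Dice throw": the hand reaches well out in front of the shoulder,
    // as if casting dice onto a table.
    const float reach = std::hypot(p.rightHand.x - p.rightShoulder.x,
                                   p.rightHand.z - p.rightShoulder.z);
    if (reach > 0.45f)
        return ShowEvent::WalkToCentreStage;

    // "Deep bow": the head dips down towards the height of the spine base.
    if (p.head.y - p.spineBase.y < 0.30f)
        return ShowEvent::WardrobeChange;

    return ShowEvent::None;
}
```

In practice a detector like this would also debounce its triggers over several frames, so that an incidental arm movement could not fire the finale early.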

ILMxLAB upgraded its internal build of Unreal to be based on Unreal Engine 4.18. The upgrade happened mid-development so the team could use Unreal Engine 4’s realtime compositing module, Composure.

Design software Marvelous Designer was well suited to the London Fashion Week LiveCGX event since it can set up realistic fabric behaviour, including stitching, draping and pleating, based on real fashion patterns. This made it possible to take the patterns from Steven Tai’s actual garments, build the digital garments in Marvelous Designer and simulate them in Unreal.

Is this fashion’s future?
The show stood out as one of the most unique fashion presentations at London Fashion Week, says Oh: “The fashion industry audience was curious about how the technology would be incorporated as a lot of fashion presentations don’t usually use much of it in their shows. It really put FIA, ILMxLAB, steventai, and the GREAT British Campaign together in a spectacular ‘you just had to be there’ live event. The presentation itself was a stretch for ILMxLAB, both technically and from a content production standpoint, as it was our first live event utilising realtime rendering.”

ILMxLAB expects many professional performers and artists will embrace the opportunity to add realtime improvisational digital content to their toolbox and use it to express themselves in ways that are not possible in the physical world.

“We are already seeing people spend more time as digital representations of themselves, from live streaming to game avatars to camera filters,” says Oh. “As AR becomes more common, all industries will have a desire to digitally alter and enhance how they present themselves across different mediums in realtime. Virtual fashion and fashion design will be an important part of that as well.”

Watch ILMxLAB’s video for a behind-the-scenes look at the realtime, performance-driven live digital augmentation…

www.ilm.com/ilmxlab

www.fialondon.com

steventai.co.uk
