Physical+Virtual Events

The 10th annual League of Legends World Championship has just finished (Korea's Damwon Gaming team won). The final is famed for its opening ceremony; this year saw physical music stars and dancers perform with the virtual group K/DA, in an augmented reality experience created with live in-camera digital effects and broadcast on the big screens of Pudong Football Stadium to 6,000 fans.

The quarter-finals of the competition used virtual studios: the room the contestants played in had walls and floor made of LED screens, which ran animations to provide optical illusions, enhanced by in-camera AR. This meant that the dancers in the opening sections could interact with the digital effects in real time.

This behind-the-scenes video explains the technology (also used in the Disney+ show, The Mandalorian) and shows more effects that it enabled.

That same weekend, Fortnite's Party Royale Island hosted a 30+ minute set by musician J Balvin, using a virtual studio and post-production effects (as well as some cute ghost costumes).

The New York Times has an in-depth piece on how the J Balvin set was recorded—including its virtual guest stars, recorded separately in front of a green screen, then added later.

The LoL World Championship was an IRL event augmented with digital; Fortnite's Afterlife Party was a digital event enhanced by IRL!

Lockdowns around the world make it hard to produce live (or as-live) entertainment events, but the desire to be entertained hasn’t gone away. Entertainment (and fashion) brands moving into games is one of the most interesting shifts happening in digital at the moment; another is the move in the opposite direction, where the graphics engines which power those games (like Epic’s Unreal Engine) are starting to be used for real-time digital effects in visual media. A great merge is underway.

Digital Fashion: Avatars and Virtual Identity

Inspired by two stories last week—Ralph Lauren thinks people want to shop their Bitmoji, and Helsinki Fashion Week Explores New Frontiers With Purely Digital Format—I made this short film about digital fashion:

Not long after I made it, I read Is Direct To Avatar The Next Direct To Consumer?, an excellent article by Cathy Hackl with Ryan Gill explaining digital fashion and the D2A model:

Direct-to-avatar (D2A) refers to an emerging business model selling products directly to avatars – or digital identities – bypassing any supply chain management like dropshipping or the logistics of getting a physical product to a consumer's door.

Ryan Gill, co-founder and CEO of Crucible

And then a further article, From Animal Crossing To Digital-Only Dresses, Is Fashion Becoming Our New Virtual Reality?, by Hannah Banks-Walker, on digital fashion in gaming and social:

The pandemic has accelerated our acceptance of blending the real world with more and more digital experiences.

Matthew Drinkwater, Head of the Fashion Innovation Agency at London College of Fashion

There’s an interesting point where three accelerating trends—the use of avatars in virtual spaces, the digital intermediation of our identities, and fashion brands exploring digital tools—are meeting.

COVID-19 and the QR Code Comeback

I read a story about Coca-Cola updating some of its touchscreen vending machines across the US: an over-the-air software update converted them to touchless operation using QR codes. And it's funny, isn't it? So many changes have been accelerated by COVID-19, and I didn't necessarily think QR codes would be one of them; but perhaps they will be.

Because as well as the Coca-Cola vending machines, there's also the UK's Test and Trace system. You probably know the story of the app that never was; instead there's a web-based system that many places (like pubs) are using, where you scan a QR code that takes you to a website to check in your details. And it's kind of training people to use QR codes.
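Part of why that check-in flow works is that the QR code is just a URL. As a rough sketch of the venue side, here is what generating one of these codes looks like in Swift using CoreImage's built-in CIQRCodeGenerator filter; the example.com check-in address is a made-up placeholder, not a real Test and Trace or Coca-Cola endpoint.

import CoreImage
import UIKit

// Hypothetical check-in URL (placeholder only)
let checkInURL = "https://example.com/venue/check-in"

// CoreImage ships a QR generator filter; it takes the payload as UTF-8 data
let filter = CIFilter(name: "CIQRCodeGenerator")!
filter.setValue(Data(checkInURL.utf8), forKey: "inputMessage")
filter.setValue("M", forKey: "inputCorrectionLevel") // medium error correction

if let qr = filter.outputImage {
    // The raw output is one point per module; scale it up so phone cameras can read it
    let scaled = qr.transformed(by: CGAffineTransform(scaleX: 12, y: 12))
    let poster = UIImage(ciImage: scaled)
    // poster can now be printed for a pub wall or shown on a vending machine screen
    _ = poster
}

On the customer's side there's nothing to install: the phone's camera recognises the code and opens the URL in the browser, which is why the web-based check-in needs no dedicated app.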

Believability and Persistence in AR

There are two important concepts arriving in smartphone AR technology at the moment: believability, and persistence.

Believability comes from digital and physical objects appearing to naturally occupy a space together: for example, if you move a physical object in front of a digital object, the digital one should appear to be partly obscured. In AR parlance this is occlusion, or blending.

You can see the advantage of blending in the two photos at the top of this post: in the first photo I've disabled blending, so the image of the goat appears in front of the table, flattening the depth in the picture; in the second, blending is enabled, so the goat appears occluded by the table, as you'd naturally expect it to be; it's believable.
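Turning blending on is now largely a platform feature rather than custom rendering work. Here's a minimal sketch of what that looks like in Swift with ARKit and RealityKit; it's an illustrative configuration, not the exact setup behind the goat photos, and the capability checks matter because person occlusion and scene-mesh occlusion only work on recent hardware.

import ARKit
import RealityKit

let arView = ARView(frame: .zero)
let config = ARWorldTrackingConfiguration()

// Let people in the camera feed hide virtual objects standing behind them
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}

// On LiDAR devices, reconstruct a mesh of the room so tables and walls occlude virtual content
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}

arView.session.run(config)

With occlusion enabled, the goat in the second photo is clipped by the table automatically; the app doesn't need to know anything about the table itself.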