Epic Games’ Unreal Engine User Group presentation brought many companies under one roof.

The annual ACM SIGGRAPH conference is the preeminent gathering of computer graphics pioneers in the world. Full stop. As such, it is something of a time machine: a visit to SIGGRAPH affords a glimpse of where the industries that rely on computer graphics will be over the next decade, depending on how closely you’re willing to look.

For our immersive family this is the place to scope out what’s on the horizon in virtual and augmented reality, the broad collection of disciplines that is rapidly coming to be known as “immersive tech.”

This series represents my notes on the things I’ve seen and experienced at SIGGRAPH 2019.


Day One: The User Group

At the end of day one Epic Games held their Unreal Engine Users Group meeting, which served as Epic’s off-site keynote for SIGGRAPH.

Here’s the whole video of the event. There’s a lot. A LOT going on here.

Now when you think Epic you probably think Fortnite — currently the most popular video game in the world — and you’d be right. What you might not know is that the Unreal Engine, the game engine that powers Fortnite and a whole lot of other video games, has become a big deal in everything from commercials to film production.

(Note: when you see a link below, it will jump you to a bookmark in the video above in this window. Right click to open in a separate tab.)

Epic brought out executives from Digital Domain, ILMxLab, and Walt Disney Imagineering, among others, to talk about how the game engine is being used to enable real-time photo-realistic graphics. Which means that, for instance, special effects can be created in real time while on set, so that instead of waiting to see how a finished FX shot will look, directors and their art teams can see it through the viewfinder.

Or on the LED volume that’s been set up in lieu of a green screen.

This is where things get crazy, and the company brought out director Jon Favreau to talk not only about how the game engine enabled virtual production on his remake of The Lion King (currently the biggest movie at the box office for two weeks in a row) but how the virtual sets on the upcoming Star Wars show The Mandalorian were sometimes good enough to make it into the final shot.

It’s difficult to overstate how big a shift real-time imaging represents in how film & television are made. For one, it changes how filmmakers work, putting the nexus of the creative process back onto the set. Yet for our purposes here in the immersive world this kind of real-time image processing is table stakes.


Experiences like The VOID and Smuggler’s Run at Galaxy’s Edge are made possible by advancements in game engines. WDI’s Bei Yang even showed off the video for Smuggler’s Run, the Millennium Falcon ride, running on an eight-GPU machine tucked away offstage. He drove the simulation with a game controller for about a minute on stage, cutting away once the jump to hyperspace was made.

Let’s cut to Smuggler’s Run, shall we?

The whole stack of technology is collapsing into one set of tools, which means that artists and technologists are going to have skillsets that stretch from one industry to another, with toolsets that are increasingly flexible and interchangeable. (More on that when I write about Glassbox.)

Performance capture is becoming less onerous, as was apparent from the number of setups on the SIGGRAPH expo floor where an iPhone alone was driving facial capture. Couple that with the increasingly photo-realistic digital characters that engines like Unreal can pump out in real time, and suddenly we have what we need to put actors in interactive virtual spaces that look real.

This isn’t tomorrow. It’s today. It’s still pricey as hell, but it’s here.


Day Two: APOLLO 11 on HoloLens 2

I’ve been struggling with the WHY of AR headsets for a while now.

While the idea of a data layer superimposed on physical reality is interesting, and one that opens up a lot of possibilities, I hadn’t yet seen a compelling use case for AR. Almost everything I’d tried so far felt like it would have been better off as something else.

That melted away with the Apollo 11 demo that Unreal had set up off-site.

Using Microsoft’s HoloLens 2 — which features a big step up in field of view from both the original HoloLens and the current Magic Leap One — a team from Epic created this educational short using film-quality assets streamed wirelessly to the headset from a nearby desktop.

While there are still field-of-view limitations, they fell by the wayside thanks to the graphical fidelity of the models of the Apollo 11 rocket, lander, and astronauts. The whole thing felt like it belonged in a museum — in a good way. It’s readily apparent that an interactive program that allows one to examine the pieces of a vast engineering project, or some other scientific endeavor, can be a powerful pedagogical tool.

This is an “on stage” version of the demo that was made for Microsoft’s Build conference. So it takes some liberties.

The demo is short — the team only had a few weeks to put the final version together — but there are enough development trailheads here that I could see how valuable it would be to get down into the details of how something works, either alone or with someone else. The current demo is for one person, but the team told me that they had ways of making it a multi-user experience if that’s what they wanted to do.

It seems to me that the key to making the augmented data layer compelling isn’t so much a matter of adding elements to the physical world as it is of providing elements that reveal hidden dimensions — of either virtual or real objects — elements that allow the user to go deeper into a topic, maybe even a fictional one, rather than turning the world into a weightless canvas.

In any case, between the User Group meeting and the Apollo 11 demo I came away from SIGGRAPH with the sense that Epic Games has already positioned itself to be a powerhouse in entertainment well beyond its success in gaming. The more this company turns its attention toward immersive experiences, the better off our field will be.


NoPro is a labor of love made possible by our generous Patreon backers. Join them today!

In addition to the No Proscenium web site, our podcast, and our newsletters, you can find NoPro on Twitter, Facebook, YouTube, Instagram, in the Facebook community Everything Immersive, and on our Slack forum.

Office facilities provided by Thymele Arts, in Los Angeles, CA.