
The Engine is Unreal

If you're not a gaming enthusiast, or you've been living under a rock this week, you may have missed the technology demonstration of Unreal Engine 5 (UE5) running on a PS5 preview system. It's an incredible piece of work packaged in a fully playable demo that showcases the advancements the Unreal Engine has made over the years.

A bit of history: back when Quake was the epitome of first-person shooter (FPS) games, a company called Epic MegaGames (now just Epic Games, headquartered in Cary, NC) developed an original, futuristic FPS called Unreal that ran on a brand-new graphics engine simply dubbed the Unreal Engine. I was a huge Quake fan at the time, and when Unreal was released, everyone at the office who was also a gamer rushed to see who could finish the game first. Along the way, we were all awed at how amazing the scenery looked in spite of our relatively low-end graphics cards.

(I won the contest, by the way.  It went down via a phone call to my boss on a Saturday:

Me [leaving message]:  Steve, I have good news and bad news...

...the good news is that you're not that far from the end of the game.

...the bad news is...well, you figure it out.

He called me an asshole on Monday when we returned to work.  Heh.)

Opening scene from the UE5 playable demo
Fast forward to this week.  The Unreal Engine, which powers games like Fortnite, Gears of War and others, is now on its fifth iteration.  To put that in perspective, work on the original engine began back in 1995; it has taken 25 years to evolve five times.  The video is simply stunning, but, not entirely unexpectedly, there are plenty of people complaining that it is "only" running at 1080p and "only" running at 30 frames per second (the other definition of FPS; context matters, of course).

These people are missing the point - let me explain.

It Isn't the Resolution, Stupid...

When NVIDIA was about to release its RTX series of GPU cards, everyone drooled over the August 2018 demo that used the then-unreleased Battlefield V to showcase how ray tracing renders scenes with striking realism.  The A/B comparisons highlighted how much better it looks than the traditional technique of reflecting only the lighting that's already on screen (called screen-space reflection), which can't show reflections of anything outside the player's viewport.  Ray tracing, however, is an incredibly computationally intensive process, so it was impossible to expect that level of quality from a software-only package.
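To make that screen-space limitation concrete, here's a deliberately tiny, hypothetical Python sketch (none of this is NVIDIA's, DICE's or Epic's actual code; the scene, the frustum test and the function names are all made up for illustration).  It shows why the reflection of something behind the camera simply goes missing when you can only sample what's already on screen:

```python
from dataclasses import dataclass

@dataclass
class Sphere:
    center: tuple   # world-space (x, y, z); camera sits at the origin looking down +z
    color: str

def in_viewport(point, near=0.1, fov_limit=1.0):
    """Crude frustum test: is the point in front of the camera and inside the view cone?"""
    x, y, z = point
    return z > near and abs(x / z) < fov_limit and abs(y / z) < fov_limit

def reflected_color_screen_space(target):
    # Screen-space reflections can only sample what was already rendered into the
    # frame buffer, so anything outside the viewport contributes nothing.
    return target.color if in_viewport(target.center) else None

def reflected_color_ray_traced(target):
    # A ray tracer intersects rays against world-space geometry, so it can "see"
    # objects that never made it onto the screen at all.
    return target.color

on_screen  = Sphere(center=(0.0, 0.0, 5.0),  color="red")   # in front of the camera
off_screen = Sphere(center=(0.0, 0.0, -5.0), color="blue")  # behind the camera

print(reflected_color_screen_space(on_screen))   # 'red': on-screen geometry reflects fine either way
print(reflected_color_screen_space(off_screen))  # None: the reflection simply goes missing
print(reflected_color_ray_traced(off_screen))    # 'blue': ray tracing still finds it
```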

How's that for dynamic lighting and shading?
Now, UE5 storms into the room screaming "hold my beer!", as you can see in the video captures (a.k.a. vidcaps) I've included here.  If you can see the realism in these scenes, you're probably also thinking "it must be hardware-based ray tracing!"  But you'd be wrong: this is rendered completely in software.

The point I'm trying to make is that, less than two years after the RTX cards were released in October 2018, UE5 has demonstrated that dedicated hardware is no longer necessary to get ray-tracing-quality rendering.

...But Nothing Happens in a Vacuum

That doesn't mean that specialized hardware isn't needed at all.  If you watch the UE5 demonstration video (embedded at the bottom), you'll hear Epic Games' Brian Karis (Technical Director of Graphics) explain how the engine imports the cinematic-quality versions of Quixel Megascans assets, complete with 8K textures and geometric detail down to an individual pixel, which in turn requires pixel-accurate dynamic shading.  The result is over 1 billion (yes, with a "b") source triangles in each frame of the demo, crunched in real time and losslessly down to the roughly 20 million triangles that are actually rendered.

This is what 20 million triangles looks like
That's a lot of triangles, Karen.
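How much of a lot?  Some quick back-of-the-envelope math on the figures quoted in the demo narration (rough numbers, not official UE5 internals):

```python
# Rough figures quoted in the demo narration; not official UE5 internals.
source_triangles   = 1_000_000_000   # "over 1 billion" source triangles per frame
rendered_triangles = 20_000_000      # ~20 million triangles actually rendered per frame
frames_per_second  = 30              # the frame rate people are complaining about

reduction_ratio = source_triangles / rendered_triangles
print(f"Geometry reduction: {reduction_ratio:.0f}:1 per frame")                   # 50:1
print(f"{rendered_triangles * frames_per_second:,} triangles drawn per second")   # 600,000,000
```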

Obviously, not every single asset, whether a Quixel Megascans model, a Niagara effect, or anything else, is kept in memory for the entirety of the game / demo; instead, assets are streamed in at runtime.  The collective size of these data structures means that neither SATA drives nor even current-generation NVMe drives have the juice to supply data to UE5 at the rate at which it is consumed.

So Sony developed a brand-new storage architecture for the PS5 to make this all a reality.  I won't go into details (mainly because I don't know them, heh) but, to put this into perspective: whereas the PS4 took 20 seconds to load 1 GB of data, the new storage architecture allows the PS5 to load 5 GB of data in one second, a massive 100x increase in throughput.
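The same back-of-the-envelope treatment, using the load times above plus some approximate interface ceilings I'm quoting from memory (not anything Sony published), shows why existing drives can't keep up:

```python
# Throughput implied by the load times quoted above, plus approximate interface
# ceilings for comparison (rough, real-world numbers; not Sony's official specs).
ps4_gb_per_s = 1 / 20    # 1 GB in 20 seconds -> 0.05 GB/s
ps5_gb_per_s = 5 / 1     # 5 GB in 1 second   -> 5 GB/s
print(f"PS5 vs PS4: {ps5_gb_per_s / ps4_gb_per_s:.0f}x faster")   # 100x

sata3_ssd_gb_per_s      = 0.55   # practical ceiling of the SATA III interface
pcie3_nvme_ssd_gb_per_s = 3.5    # typical fast PCIe 3.0 x4 NVMe drive
print(ps5_gb_per_s > pcie3_nvme_ssd_gb_per_s)   # True: even fast NVMe drives fall short
```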

This statue alone is composed of 34 million triangles
The reason I bring this up is that it's impossible to imagine a world where Sony doesn't license this architecture to drive manufacturers, meaning that it'll eventually end up in your gaming laptop a few years down the road.  Considering that storage is one of the two biggest bottlenecks in general-purpose computing (the other being network speed), the implications of having this in my laptop are enormous.

The Impact on Film-making

This is something many people probably don't realize is already happening.  UE4 (the previous generation) has been used by top-tier effects companies such as Digital Domain, ILM, Lucasfilm and Weta Digital to render complex scenes on large projection screens, allowing the actors to be filmed directly in the original shot rather than requiring green screens.

Over 500 of those statues are in this scene for a total of
over 16 billion triangles just for the statues
However, the production pipeline for this process wasn't for the faint of heart, nor was it cheap, due to the limitations of the platform.  Now that UE5 can render cinematic-quality scenes in software alone, it isn't too far-fetched to imagine that the frequency of CGI use in smaller-budget films will increase, and that the complexity of the scenery will increase as well.  This is great news for indie production houses that want to dip their toes into film genres that were previously out of reach due to budgeting constraints.

Summary

All in all, the pending release of UE5, coupled with the architectural advancements of the PS5 storage system, is great news for everyone, whether we benefit from it directly (via gaming) or indirectly (via new storage devices or better-quality cinematography).  You simply need to look beyond what the demonstration showed to see the bigger picture, pun most definitely intended.

Check it out in the demonstration video, below.
