Tips for Building Games with Unreal Engine 4

Posted October 3rd, 2020, last updated October 5th, 2020 in game-development

I have released Estranged: The Departure on three platforms: Windows, macOS and Linux. I will be bringing it to the Nintendo Switch and other platforms in the coming months.

This blog post covers my experiences as a solo developer, working with Unreal Engine since 2014 (without access to the UDN), and what to look out for if you are new to the engine.

I intend to keep this post updated (a living document).


  1. Never Use the Latest .0 Release
  2. Value Time over Correctness
  3. Steer Clear of Limbo Engine Features
  4. Submit Bug Reports to Epic
  5. Carefully Consider Third-Party Plugins
  6. Use Blueprints Correctly
  7. Be Careful with Level Streaming
  8. Plan Your Save System
  9. Use Hard References Sparingly
  10. Localisation
  11. UMG
  12. BSP
  13. Wrap-up

Never Use the Latest .0 Release

It is never worth upgrading to a new engine version as soon as it comes out. For example, if 4.25.0 has just been released, always wait for at least 4.25.1 before moving to it. Brand-new engine releases typically ship with bugs that are fixed in the subsequent patch versions.

Depending on the state and scope of your project, you will also want to ask yourself the following questions before you upgrade the engine:

  1. What am I upgrading for? Is there a new feature that I need to use in the latest version?
  2. Is the new engine version stable on all platforms? Do all my features work on consoles for example, and am I going to verify that up-front?
  3. Have any of the features I have been using been deprecated in the latest version?

There are many permutations of engine features, and unfortunately some things do break. Ultimately, it is your responsibility to verify all the features in the engine - if you do not, it will cost you time. Time that you could better spend creating content :-)

That said, it is worth testing the new engine releases (even the previews), because it means you can see what is being deprecated and what's new ahead of the release, giving you more context to plan an upgrade.

Value Time over Correctness

I am not a rendering engineer, and while I have a high level of understanding of 3D graphics pipelines, the idea of debugging Unreal Engine's rendering pipeline is daunting to me.

Throughout Estranged's development I encountered graphical glitches that were specific to certain platforms (RHIs). When I hit something like this, I usually ran through the following checklist:

  1. Is it game-breaking? Would a player notice this issue, and does it detract from the gameplay?
  2. Is this glitch related to a feature that Epic may no longer be focussing on? If so, can you use a different, better supported feature?
  3. Can you work around the glitch within your material or avoid it in your assets?

Your time as a game developer is best spent developing your game, which means focussing on gameplay and content. Debugging graphics pipelines is a distraction from what really matters.

Steer Clear of Limbo Engine Features

Some features in Unreal Engine are explicitly marked as "experimental", where you are very clearly on your own if you use them. However, some non-experimental features that are not used by a large portion of the developer base also fall by the wayside, since regressions in them go unnoticed by Epic and the wider community alike.

To identify such features, you can typically use the following checklist:

  1. Is the feature actively being developed by Epic - has it been updated in the last five engine versions or so?
  2. Do Epic's own games use the feature?
  3. Does it have problems on lesser-used platforms such as Linux?

On the other hand, there are features marked as experimental in the engine that may be stable enough to use in shipping products. Unfortunately, there is no indication of how stable or unstable these features are; it is on you as the developer to test them.

Submit Bug Reports to Epic

This one might seem obvious, but if you do hit a problem that blocks you, it is worth submitting a bug report to Epic. The downside is that you will need to provide an isolated project that reproduces the problem - which I found very difficult. If you can do that, though, you will typically get a UE-XXXXX bug report raised, and once that is triaged you will be able to find out when, or if, the issue will be fixed.

I should temper the above with a caution that if the bug is not considered severe by Epic, it will be pushed out for a few engine versions. If the bug is critical to your product and you cannot wait for it to be fixed, you may need to fix it or work around it yourself.

Carefully Consider Third-Party Plugins

Third-party plugins can help make certain things easier; however, you should always consider their impact on your project:

  1. Does the plugin support all platforms you want to release on? (Consoles for example)
  2. Has the developer committed to supporting the plugin?
  3. Has the developer committed to updating the plugin with new engine versions?

It is worth gaining confidence in the above and weighing it against whether you need the plugin at all. If it is something you could replicate yourself with relative ease, that might be the better option, since you then do not need to rely on a third-party developer.

Estranged was built without third-party plugins, which meant avoiding plugin-related issues when deploying to all target platforms.

Use Blueprints Correctly

Blueprints are extremely good for bolting pieces of logic together, and quick iteration. There is no question that they are one of the killer features in Unreal Engine 4. However, there are some drawbacks with Blueprints:


You can only diff Blueprints within Unreal Engine; you cannot do this with external source control tools (P4V, Plastic SCM, Git). Even on a very small project like Estranged, it is essential to be able to look back over the history of a file to see what has changed, and Blueprints make this process high cost.

I do not have any experience of working on larger Unreal Engine projects (where code review is involved), but I imagine the problem scales with the number of people contributing. The added issue with multiple people working on the same project is conflicts, which some SCM solutions solve with locking.


Blueprints are not good for large or complex pieces of logic. Because of the size of the nodes on screen and the sprawling nature of the connecting wires, they become very difficult to follow very quickly. From experience, it is much better to define complex methods in code and use Blueprints to connect pieces of logic together.

There is no easy way to describe the point at which to break logic out into code, though if you find yourself wanting to split a Blueprint into multiple methods, or scrolling vertically and horizontally for a while to trace the logic through, that may be a sign.

From a maintenance standpoint, as stated above, Blueprints are not only hard to diff but also hard to refactor. The search tooling around Blueprints in the editor is not very powerful, and when methods change or break due to engine version bumps, it is hard to notice until you execute the Blueprint or cook the project.


Performance is one of the most commonly listed gotchas with Blueprints, because they are often (unfairly) compared to C++, but I found that as long as you do not run lots of logic in tight loops, Blueprints are very fast.

For example, there is no problem with running logic in many Blueprints every single frame; as long as you keep performance in mind, as you would when writing code, you will not run into problems with the execution speed of Blueprints themselves.

After profiling the game thread, I never found Blueprint logic to be the source of performance issues in Estranged, despite many game elements relying heavily on it.

Be Careful with Level Streaming

Level Streaming is an interesting feature of Unreal Engine, and it is built on the asynchronous asset loading feature that is at the core of the engine.

When starting Estranged, I decided to try and build the entire game around level streaming with the premise that there would be no loading screens, and levels would load in and out transparently.

Level streaming works using a parent "persistent" level and child streaming levels. The persistent level is always loaded, and the child levels are loaded and unloaded on demand. In the single-player campaign I ended up with two persistent levels, each of which streamed around 10 levels in and out.

The decision to use level streaming ultimately became the single biggest pain point and time sink on the project, and it is the one thing I wish I had never embarked upon, for the following reasons.

Incompatible Features

There are many features which will not work if you use level streaming - a big one is Volumetric Lightmaps. That also knocks out features which rely on them, so Volumetric Fog does not work either.

From the above, it seems that level streaming has taken a back seat in feature development, and the happy path is not to use streaming at all. There are other examples of this: only one main directional light is supported, and fog transitions are hard when streaming levels in and out.

Static Lighting is Difficult

I found it extremely hard to handle lighting builds between levels, and limitations with directional lights meant that I could only have one main sky light per persistent level. The two persistent levels I ended up with represented night and day.

Building lighting presented a challenge, and I ended up with a hacky approach using a few "lighting" levels (which were never shipped with the game). These lighting levels would pull in a group of streaming child levels plus the common sky level, and bake lighting for those levels alone. This worked, but the engine always moaned about the lighting being out of date - so it obviously was not a supported workflow.

Limits with Large Levels

For example, I ran into problems with too many reflection captures in the persistent level, which led to visual artefacts when streaming a specific sequence of levels in and out. I never did get to the bottom of that, but from memory it affected the forward renderer more than the deferred renderer, and it was one of the reasons I moved from forward back to deferred shading.

Lightmass (the static lightmap calculator) also seemed to break when presented with large levels, and by the end of the project I could not build lighting for either persistent level in one go, because Lightmass generated visual artefacts in random places in the maps (see my workaround above).

Custom Tech Required

Fog is one area that is not supported with level streaming, in the sense that there is no way to transition it between streaming levels. If one streaming level is a hospital you might want thin fog, and if another is a cave you might want thicker fog. I ended up having to write all of that logic myself: triggers which changed the fog as the player moved through the game.
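
As an illustration of the kind of logic required, here is a minimal engine-agnostic sketch (all names are hypothetical, not Unreal's API) where a trigger sets a new target density and the fog blends towards it each frame:

```cpp
#include <cmath>

// Sketch: a trigger volume sets a target fog density on entry, and the
// current density blends towards it every frame so the change is gradual.
struct FogBlender {
    float Current = 0.02f;   // current fog density
    float Target = 0.02f;    // target set by the last trigger entered
    float BlendSpeed = 0.5f; // density units per second

    // Called by a trigger volume when the player enters a new area.
    void SetTarget(float NewTarget) { Target = NewTarget; }

    // Called every frame with the frame's delta time.
    void Tick(float DeltaSeconds) {
        float Step = BlendSpeed * DeltaSeconds;
        float Delta = Target - Current;
        if (std::fabs(Delta) <= Step) {
            Current = Target; // close enough: snap to the target
        } else {
            Current += (Delta > 0.0f ? Step : -Step);
        }
    }
};
```

In the real game, `Current` would be pushed into the fog component each tick; the important part is that the blend is frame-rate independent because it is scaled by delta time.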

The streaming volumes supplied with the engine were not fit for use in Estranged either, since I needed more fine-grained control over when levels loaded in and out than being driven by large boxes allows. For that I had to create a trigger-based system, which cost time.

When streaming levels in, you must think of low-end devices: streaming needs not only fast disk I/O but also CPU cycles. On low-end machines it can result in stuttering, and if the machine is slow enough, the player might make so much progress that they witness the level being streamed in right in front of them.

I ended up adding code to handle the slow-machine case, throwing up a loading banner and holding the player in place. This solves the issue, but it defeats the purpose of level streaming in the first place and presents a testing challenge.

Level Streaming Performance

As mentioned above, level streaming requires good disk I/O and CPU cycles, but even on fast systems it can cause stuttering when large levels are loaded in at once. I tried to separate levels into small units and load them as needed, but managing streaming levels with static lighting is difficult, and it is simply not manageable to have hundreds of small streaming levels.

I ended up with specific points in Estranged where streaming occurs, and compensated for the small frame-rate drop with a loading spinner. Prior to this I would see complaints, but after this feature went in it was no longer an issue, since players understood that the game was loading content.

Plan Your Save System

With Estranged I was learning the engine at the same time as developing content. I had always planned to build a save system which saved the state of everything in the world on demand, rather than a trigger-based approach with known state.

This system eventually required each actor to implement a common ISaveRestore interface, and the save process would be:

  • Loop each actor → call OnSave → serialise to byte array

Similarly, to restore:

  • Loop each actor → restore from byte array → call OnRestore

The engine already has all the necessary technology to handle the object serialisation, but I added the save/restore system at the mid-point of the project. This ended up being very painful, because every single actor and component needed to become save-aware, and some of the more short-sighted decisions I had made early in development around state came back to bite me.
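
The save/restore loop described above can be sketched in engine-agnostic C++; the interface and type names here are illustrative, not Unreal's actual API:

```cpp
#include <cstdint>
#include <cstring>
#include <map>
#include <string>
#include <vector>

// Each saveable actor implements this interface (names are illustrative).
struct ISaveRestore {
    virtual ~ISaveRestore() = default;
    virtual std::vector<std::uint8_t> OnSave() const = 0;
    virtual void OnRestore(const std::vector<std::uint8_t>& Data) = 0;
};

// Example actor: a door that only needs to persist its open angle.
struct Door : ISaveRestore {
    float OpenAngle = 0.0f;

    std::vector<std::uint8_t> OnSave() const override {
        std::vector<std::uint8_t> Bytes(sizeof(OpenAngle));
        std::memcpy(Bytes.data(), &OpenAngle, sizeof(OpenAngle));
        return Bytes;
    }
    void OnRestore(const std::vector<std::uint8_t>& Data) override {
        std::memcpy(&OpenAngle, Data.data(), sizeof(OpenAngle));
    }
};

// Save pass: loop each actor, call OnSave, keep the bytes by actor name.
typedef std::map<std::string, std::vector<std::uint8_t> > SaveGame;

SaveGame SaveAll(const std::map<std::string, ISaveRestore*>& Actors) {
    SaveGame Out;
    for (const auto& Pair : Actors) Out[Pair.first] = Pair.second->OnSave();
    return Out;
}

// Restore pass: loop each actor and hand back its saved bytes.
void RestoreAll(const SaveGame& Save,
                const std::map<std::string, ISaveRestore*>& Actors) {
    for (const auto& Pair : Actors) {
        auto It = Save.find(Pair.first);
        if (It != Save.end()) Pair.second->OnRestore(It->second);
    }
}
```

The pain point described in the text maps directly onto this sketch: every actor type in the game needs its own `OnSave`/`OnRestore` pair, which is why retrofitting the system mid-project was so expensive.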

As a result, many of the bugs reported during development were around actors not saving or restoring correctly, and it was a big effort to track them all down. Fixing them was not hard, but making changes to already-tested game logic does open the door for regressions to creep in - so I was tracking down bugs around the save system for many months.

One great feature in the engine which helps with managing state is a Level Sequence. This system allows you to animate actors in a level, change properties on them and call functions all against a timeline. The best part with regards to save/restore is that you can tell a level sequence to jump directly to a time. For example, you can use a level sequence to animate a door opening, a lift falling, or a cinematic, and support save/restore just by storing and restoring the current timestamp of the sequence.
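
The reason this works so well is that a sequence's derived state is a pure function of its playback time, so the timestamp is the only thing that needs saving. A minimal engine-agnostic illustration (all names hypothetical):

```cpp
#include <algorithm>

// Sketch: a timeline whose entire state is one float, so save/restore
// reduces to storing and re-applying the current timestamp.
struct DoorOpenSequence {
    float Duration = 2.0f;   // seconds for the door to swing open
    float Time = 0.0f;       // current playback position

    void Tick(float Dt) { Time = std::min(Time + Dt, Duration); }

    // Restoring from a save is just a jump to the stored time.
    void JumpTo(float SavedTime) { Time = std::min(SavedTime, Duration); }

    // Derived state, e.g. the door's angle, is computed from Time alone.
    float DoorAngle() const { return 90.0f * (Time / Duration); }
};
```

Because `DoorAngle` depends only on `Time`, restoring the timestamp reproduces the door's exact pose without serialising the door itself.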

Use Hard References Sparingly

There are two types of asset references in Unreal:

  • Hard references: load all referenced assets when this asset is loaded
  • Soft references: allow referenced assets to be loaded on demand

As mentioned previously, at the core of Unreal is a powerful asset streaming system, and this is another place where it is evident. With soft references, an asset can be lazily loaded on demand, speeding up the initial load of the referencing asset.

Why is this important? Well, in Estranged I ran into issues where the start-up process for the game took an extra 10 seconds or more. When I debugged this, I saw load requests for things I would not expect to see: enemies, weapons and their materials, textures and models. This stemmed from the use of a cheats menu; players could enter cheats to spawn enemies, weapons and other in-game objects, but I had naïvely used hard references to the assets.

Once I replaced my cheats menu's hard references with soft references and used Blueprints to lazily load the assets when needed, the game started up 10 seconds faster and no longer requested assets from around the game content until they were needed.
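
The hard-versus-soft distinction can be illustrated outside the engine. In this sketch (all names hypothetical; Unreal's actual types are TSoftObjectPtr and friends), a hard reference would be a direct pointer forcing the asset to load with its owner, while the soft handle holds only a path and loads on first access:

```cpp
#include <functional>
#include <memory>
#include <string>

// Stand-in for a loaded game asset.
struct Asset {
    std::string Path;
};

// A soft reference: cheap to hold, loads nothing until Get() is called.
struct SoftAssetRef {
    std::string Path;              // just a path; nothing loaded yet
    std::shared_ptr<Asset> Loaded; // populated on demand

    // LoadFn stands in for the engine's streaming loader.
    Asset& Get(const std::function<
                   std::shared_ptr<Asset>(const std::string&)>& LoadFn) {
        if (!Loaded) Loaded = LoadFn(Path); // lazy load on first access
        return *Loaded;
    }
    bool IsLoaded() const { return Loaded != nullptr; }
};
```

A cheats menu holding a hundred `SoftAssetRef`s costs only a hundred path strings at start-up; the heavy asset loads happen only if and when a cheat is actually used.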

Unreal has a powerful Reference Viewer which can be opened on any asset by right-clicking it and selecting "Reference Viewer". This visual editor shows all the connections between assets and lets you see whether each reference is hard or soft.


Localisation

Unreal has built-in support for localisation, and it is something you should consider in your project from the start (this applies to any project, using any engine).

String tables are invaluable in Unreal, since they let you define a map of Identifier → English text, which can then be used from Text instances in Blueprints and code. You can then set up Unreal to gather all text from all string tables, and export to and import from .po files (the gettext format).

The nice thing about Unreal using a standard localisation format is that you can use standard tooling - so tools like Poedit can be used by translators to edit these files.
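
The Identifier → Text idea can be sketched as a simple lookup with a visible fallback for missing keys (illustrative only; the engine's string table assets and FText are much richer than this):

```cpp
#include <map>
#include <string>

// Sketch of a string table: identifiers map to translated text.
struct StringTable {
    std::map<std::string, std::string> Entries;

    // Fall back to the key itself so missing translations show up
    // visibly in-game rather than as blank text.
    std::string Get(const std::string& Key) const {
        auto It = Entries.find(Key);
        return It != Entries.end() ? It->second : Key;
    }
};
```

Swapping the `Entries` map per language is then all that is needed to localise every piece of UI text that goes through the table.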


UMG

I have mentioned that Blueprints are one of Unreal Engine's major features, and I believe UMG, "Unreal Motion Graphics" (its UI framework), is another.

UMG provides a WYSIWYG editor for UI, backed by Blueprints with support for all the basic UI features you would expect. The closest technology I would compare UMG to is WinForms with a C# codebehind, but it also borrows concepts from WPF and other UI frameworks.

The only issue I had with UMG was its lack of support for controller navigation - it supports basic up/down focus with the controller and things like pressing A to click, but it does not have good support for things like scrolling lists, or making sure focus starts in the right place. That requires custom logic, but it is often very light touch - the most difficult part is figuring out how to build it in a robust fashion.

A very powerful feature of UMG is the ability to use it both for the player's HUD and for in-game elements like computers and other media. Unreal does come with a HUD implementation based on 2D drawing methods, but you should consider it obsolete and just use UMG for your HUD. For in-game media like computers, UMG is an incredible tool - it allows you to build in-world UI, fully localisable, without the need to manually draw to render targets.


BSP

BSP in Unreal Engine is a tool that allows you to design level geometry in the editor. It supports simple shapes, but also allows you to edit at the vertex level.

My experience before Unreal Engine 4 was the Hammer editor, part of the Source Engine. The workflow there was brush based, with rich tooling around creating and manipulating brush geometry in the editor.

Transitioning to Unreal, I naïvely thought I could use the same workflows. The first level I created was based around BSP, and as the level grew, the editor began to grind to a halt. I found an editor setting to defer the processing of BSP, but that just shifted the delay to when I hit the "build geometry" option in the editor.

It became evident pretty fast that the BSP tooling is good for the odd primitive or for blocking out a level, but not for the majority of geometry. Most meshes should be static meshes created in a 3D package - that is the workflow Epic would like you to use.

That said, there are beta mesh editing tools in Unreal Engine 4.26 - but these are mesh editing tools, not related to BSP.


Wrap-up

The above thoughts are all from my limited experience of using the engine alone since 2014 - I imagine you would hear different advice from developers working with the engine on AAA titles, or even from other indie developers. I wanted to document the above for others and for myself; it is no use to anyone if I forget the lessons learned.

Looking forward to future projects, I would take the same path with the same engine, though perhaps I would not depend so heavily on level streaming, and would instead use it as an optimisation.

Estranged: The Departure is available on Steam for Windows, macOS and Linux, and is coming soon to Nintendo Switch.