Balancing C++ and Blueprints


Hey all,

I've been spending the past few months diving into Unreal Engine, and I'm really enjoying my time learning something new. I've spent time with OOP prior to this and feel pretty comfortable using C++, but seeing the speed and ease of use of Blueprints has made me wonder where people tend to draw the line.

It seems to me that the best approach is to develop most abilities etc. in Blueprint and expand on them in C++ - basically using Blueprint for the front end and C++ for the back end/infrastructure.

Is this the general workflow that others have found? Would love to hear how others typically approach their workflow, and any tips you might have for a hobbyist!

Thanks!


I personally use mostly Blueprints for my Unreal projects, unless it's something that can't be done in Blueprint (non-exposed features, plugins without full Blueprint support), or can't be done in Blueprint elegantly (big algorithms, math problems, …).

I know lots of people will disagree, but I have a few reasons:

First, C++ in Unreal is so-so. While the boilerplate their preprocessor/compiler generates for Blueprint bindings etc. is pretty neat, their C++ at its core is very old-school. No exceptions, no references being used, etc. means that, above all, error handling in C++ is very tedious. For pretty much any object parameter you need to add a null check to avoid a crash, and since there are no exceptions, you have to sprinkle log statements all over the place. That is just one example, but it doesn't feel very good working in Unreal C++ (while I usually like C++, especially the more modern standards).
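To illustrate what I mean, here is a minimal sketch of that defensive style (the character class and health component are made-up names for the example):

// Sketch of the defensive style Unreal's no-exceptions C++ pushes you towards.
// AMyCharacter and UMyHealthComponent are made-up names for this example.
void AMyCharacter::ApplyDamageFrom(AActor* DamageInstigator, float Amount)
{
    // No exceptions: every object parameter gets a null check...
    if (!IsValid(DamageInstigator))
    {
        // ...and every failure path gets its own log statement.
        UE_LOG(LogTemp, Warning, TEXT("ApplyDamageFrom: DamageInstigator is invalid"));
        return;
    }

    UMyHealthComponent* Health = FindComponentByClass<UMyHealthComponent>();
    if (!Health)
    {
        UE_LOG(LogTemp, Warning, TEXT("ApplyDamageFrom: %s has no health component"), *GetName());
        return;
    }

    Health->ApplyDamage(Amount);
}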

Secondly, I like many aspects of Blueprints. Being able to directly tie assets into the code without having to go through an “inspector” is a huge plus. Being able to “delay” or “task” certain things naturally, without having to go through a delegate, is also nice. Personally, I also see many places where Blueprints could be improved upon - I made a personalized version in my own engine - but since not everybody has that luxury, using them in Unreal will be fine.

Though in the end, it's really a matter of personal preference, and also of who you are working with. One drawback of Blueprint in a multi-person team is that if not everybody is experienced with it, they will find it hard to read your “code” - though I 100% think that it's only a matter of getting used to, not, as some people claim, that visual code is objectively harder to read.

I hope this gives you a few points to decide upon.

EDIT: For clarity - I use Unreal professionally at work, and my own engine in my free time.

@Juliean Thanks for the response!

I spent a little time in Unity and noticed that its C# integration seems a little more intuitive than C++ in UE, and having studied C++ in isolation (college classes etc.), I've definitely noticed the lack of current standards. I'm working on a demo for a brawler right now, and Blueprints seem to be the more intuitive system for juggling animation states and Niagara effects, but I may use C++ to make parent classes for commonly used objects in the game, so they can hold custom parameters/structs to make my life easier.
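Something like this minimal sketch is what I have in mind (all the names here are placeholders I made up):

// Sketch: shared parameters live in a C++ struct/base class, while Blueprint
// subclasses handle animation states and Niagara effects. Placeholder names.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "BrawlerAbilityBase.generated.h"

USTRUCT(BlueprintType)
struct FAbilityParams
{
    GENERATED_BODY()

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float Damage = 10.f;

    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float CooldownSeconds = 1.f;
};

UCLASS(Abstract, Blueprintable)
class ABrawlerAbilityBase : public AActor
{
    GENERATED_BODY()

public:
    // Each Blueprint subclass tunes these in its class defaults.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Ability")
    FAbilityParams Params;
};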

I appreciate your input on this! I'll definitely be seeing what works as the best workflow for me as time goes on, but it is nice to know what someone who uses it professionally thinks!

In a real game dev setting, I only created Blueprint functions for things designers asked for - for example quests, where they could design what logic completes a quest step, what logic ends the quest, what you earn from the quest, etc.
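Roughly this shape, as a sketch - the class and function names are invented for the example:

// Sketch: C++ owns the quest flow; designers supply the actual logic in
// Blueprint. All names here are invented for the example.
#pragma once

#include "CoreMinimal.h"
#include "UObject/Object.h"
#include "QuestStep.generated.h"

UCLASS(Abstract, Blueprintable)
class UQuestStep : public UObject
{
    GENERATED_BODY()

public:
    // Designers implement this in Blueprint to decide when the step completes.
    UFUNCTION(BlueprintImplementableEvent, Category = "Quest")
    bool IsStepComplete() const;

    // Native default that designers can override in Blueprint (rewards etc.).
    UFUNCTION(BlueprintNativeEvent, Category = "Quest")
    void GrantRewards();
    virtual void GrantRewards_Implementation() {}
};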


Blueprint can be bad. I can tell you that when we originally picked up Chivalry 2 at Tripwire from a co-developing studio, the modder guys over there had gone absolutely nuts with Blueprint, to the point that the game was massively bogged down. BP code is slow.

NBA2K, Madden, Maneater, Killing Floor, Sims http://www.pawlowskipinball.com/pinballeternal

dpadam450 said:

Blueprint can be bad. I can tell you that when we originally picked up Chivalry 2 at Tripwire from a co-developing studio, the modder guys over there had gone absolutely nuts with Blueprint, to the point that the game was massively bogged down. BP code is slow.

Performance can be an issue, but shouldn't be unless drastically overdone. Can you share a bit more about the scope of what you had in BPs that turned out to be a performance issue?

Personally, I use a JIT compiler to speed up my own “blueprint” variant. Even without aggressive optimization, that already lowers the cost to something comparable to native languages. I'm not sure why Unreal doesn't do anything like that. They do seem to have nativization, which turns BPs into C++ and should also be way faster, but I haven't used it so far.

I'm the reverse of Juliean: I do almost everything in C++ unless there's a good reason for me to do it in Blueprint. Personally I'm very adept at C++, so I feel that, in general, I can write better systems natively than in Blueprint. There are a lot of things, like const correctness, polymorphic structures, or templates, that you just can't do in Blueprint or that have no equivalent there.
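As a trivial sketch of the kind of thing I mean (a made-up helper, nothing from an actual project):

// Sketch: a const-correct, templated helper. Blueprint functions can't be
// templated, and Blueprint has no notion of const access.
#include "GameFramework/Actor.h"

template <typename T>
const T* FindFirstOfType(const TArray<const AActor*>& Actors)
{
    for (const AActor* Actor : Actors)
    {
        if (const T* Typed = Cast<T>(Actor))
        {
            return Typed;
        }
    }
    return nullptr;
}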

UI is a big one that I do primarily in Blueprint. Using a WYSIWYG editor for those layouts is tremendous. I usually have some sort of native bit involved, but only to declare purely abstract types; the Blueprint is the thing used directly by the game.
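For example, something along these lines (the widget and member names are made up):

// Sketch: the native base declares the contract, while the Blueprint widget
// built in the WYSIWYG editor provides the layout. Names are made up.
#pragma once

#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/TextBlock.h"
#include "HUDWidgetBase.generated.h"

UCLASS(Abstract)
class UHUDWidgetBase : public UUserWidget
{
    GENERATED_BODY()

public:
    void SetHealth(float Health)
    {
        if (HealthText)
        {
            HealthText->SetText(FText::AsNumber(Health));
        }
    }

protected:
    // The Blueprint subclass must contain a TextBlock with exactly this name.
    UPROPERTY(meta = (BindWidget))
    UTextBlock* HealthText = nullptr;
};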

Mission scripts are another case where I stick to Blueprints. They generally require lots of iteration, so they should be editable in the editor.

I also keep all my content configuration in Blueprints, .ini files, or other assets. There shouldn't be a single hard-coded asset path anywhere.
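In other words, something like this sketch instead of a path string in code (the settings class and property are invented):

// Sketch: assets referenced through config/editable properties rather than
// hard-coded path strings. Class and property names are invented.
#pragma once

#include "CoreMinimal.h"
#include "UObject/Object.h"
#include "MyGameSettings.generated.h"

UCLASS(Config = Game, DefaultConfig)
class UMyGameSettings : public UObject
{
    GENERATED_BODY()

public:
    // Filled in via DefaultGame.ini or the editor - never a literal in code.
    UPROPERTY(Config, EditAnywhere, Category = "Content")
    FSoftObjectPath DefaultPickupMesh;
};

The actual value then lives in DefaultGame.ini (under a [/Script/ModuleName.MyGameSettings] section) rather than in source.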

As you mention, there are specific assets like animation or VFX that are easier to work with in Blueprint/their custom editors, and that's usually where I would expect those to be created and edited as well.

Some of this depends on the members of the team you're working with. I will push a bit more out to Blueprint at work than at home, because of individuals who need to work with and adjust those systems.

--Russell Aasland
--Lead Gameplay Engineer
--Firaxis Games

In my experience it varies tremendously based on the game. There is no “one size fits all” set of problems or solutions.

Blueprints are great for non-programmers to implement and modify functionality. They serve a great purpose and simplify a lot of development. They can act as a multiplier during game development, getting a lot of work done with very little effort and enabling less-experienced workers to generate amazing results.

You can't get away from the fact that blueprints are interpreted, and therefore slower than code that is run directly. Attempting to nativize blueprints has always been troublesome and I've never seen it work on non-trivial projects.

Various logic errors in blueprints can cause slowdowns that are almost impossible for non-programmers to diagnose, and are tricky for experienced developers to hunt down, since they only show up as time spent running a blueprint. Outright bugs in blueprints can do similar. Designers and others require training and experience to learn what is dreadfully slow, what is somewhat slow, and what is not a performance issue. Programmers learn about this in school, but it's common enough for designers to implement exponential- or factorial-complexity algorithms out of ignorance.
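The classic shape of the problem, written out as C++ for clarity - it's just as easy to wire up in Blueprint without noticing, and nesting one more loop makes it far worse (the enemy class is made up):

// Sketch: accidental O(n^2) work per frame. Every enemy scans every actor,
// every Tick. AEnemy is a made-up AActor subclass for the example.
#include "Kismet/GameplayStatics.h"

void AEnemy::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    TArray<AActor*> Found;
    UGameplayStatics::GetAllActorsOfClass(GetWorld(), AEnemy::StaticClass(), Found);
    for (AActor* Other : Found)
    {
        // ... per-pair distance checks, targeting, etc.
    }
}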

Experienced teams tend to balance it out live as the project grows. Team leads and senior folks watch out for issues they know about, and try to educate people as issues are discovered. Creative designers will find new and exciting ways to break things. Ideas that are good but simply a bad fit can be implemented in C++.

frob said:
You can't get away from the fact that blueprints are interpreted, and therefore slower than code that is run directly. Attempting to nativize blueprints has always been troublesome and I've never seen it work on non-trivial projects.

First, is “interpreted” the right word? Blueprints are at least compiled to bytecode, and from what I've seen, “interpreted” only describes languages whose source files are executed line-by-line at runtime, no?

Even so, I'm still going to repeat the question of why nothing like a JIT compiler has been attempted by Epic. A lot of languages like Python, Lua, JavaScript, Ruby, etc. are slower than native code, but there exist solutions to make them faster, and thus more viable for game development, right? Even C# would theoretically be a VM-based language without the JIT in the background (yeah, I know a JIT is effectively always used).

I'm just harping on that point because, of all the different pros/cons that can be made about blueprints, I feel this is one that could theoretically be solved easily in the background. In the end, the blueprints you write are a frontend for whatever interpreter/compiler is running them. There is some obscure functionality where you write C++ code that interacts directly with the BP-VM (from user code), but other than that, adding something like a JIT could be done without affecting the majority of users. I know this at least somewhat, because I did the same thing for my own “blueprints”: first adding a VM (I had something even worse before), and later a JIT, for a huge improvement in performance, without affecting the actual game code/assets at all. I'm just going to be a bit naive here and surmise that if I, a single programmer, can do something like that, Epic could too. And there seems to be a general interest in that direction, otherwise nativization wouldn't be on the table. Sad to hear, BTW, that this doesn't work very well either. Maybe there is something about Blueprints' implementation that makes those optimizations more difficult. Who knows (maybe someone here, though?)

Juliean said:
from what I've seen, “interpreted” only describes languages whose source files are executed line-by-line at runtime, no?

You may have a different definition.

Typically the options are “compiled” and “interpreted”. Compiled means it's turned into machine code which the computer can execute directly, for example: mov ax, 13h; int 21h; etc. Interpreted code can go through intermediate steps, but must still be run through a program that processes it. In Unreal's case there's a bunch of interpreting work that goes on as blueprints are run. Yes, they are validated and turned into a condensed form of tokens, but those tokens are interpreted.

Juliean said:
this is one that could theoretically be solved easily in the background. In the end, the blueprints you write are a frontend for whatever interpreter/compiler is running them.

Not easily, no. It's quite difficult, in fact. There have been attempts at nativization tools in the past, and there are tools and scripting languages like AngelScript that can help with some of the concerns, but overall there is a very poor mapping from the visual elements to the actual executable code. There are many cases, especially error cases and visual script pieces that are just left floating out there, which cause severe problems for the system that runs them.

So you're either faced with what they're doing today (interpreting it with a heavy runtime library that validates and corrects as it goes), or you're facing a more fragile model that requires designers, artists, and others to write code like programmers do, following the same rules that programmers dedicate years of their lives to studying, followed by tremendous work time spent validating and resolving issues.

Juliean said:
I'm just going to be a bit naive here and surmise that if I, a single programmer, can do something like that, Epic could too.

Epic has released and used nativization tools in the past. The problem is what was described above. If the blueprints are implemented perfectly, and if everything follows the ideal path through the system, then yes, it works great. However, in any non-trivial project that isn't the case. I've worked on several major projects that used Unreal, including Fortnite, and none have had blueprints that were clean enough for the nativization tools to work. There are just too many troublesome real-world cases that block it from happening. Either you live with the costs at runtime, or you limit development to programmers only.

