Creating a VFX Engine in a Week

Updated: May 10

She should probably get that checked out

When I started this week, the custom C++ engine Kinematic is built in didn't have a VFX system. I decided to spend the week remedying this situation, with the goal of having a fully fleshed out particle editing, simulation, and rendering system by the end of the week.

For context, let me share some of my background. My first job in the industry was at the fledgling 343 Industries, working on Halo 4. I was the dedicated VFX programmer for that project. I was extremely fortunate to be able to learn not just from the amazingly talented graphics team on that project, but also from the work of Bungie's VFX team in whose footsteps we were following. Thus, a ton of how I think about VFX is shaped by the work of Steve Scott and the team he led on Halo 3 and Reach. (See "Blowing S#!t Up the Bungie Way" for more)

In particular, I'm obsessed with using monochrome control textures, then passing those textures through a palette texture to color them.

Basic Framework

Particles, kind of

On Monday, I stood up a lot of the basics of Kinematic's particle system. I created a ParticleSystemComp that held a fixed-size array of 16 Particles. Those particles had dummy position, velocity, and size values. And they drew in beautiful lime green.


The first thing I stood up to make this actually useful was a ParticleCurve data structure. A ParticleCurve takes in a ParticleSystem (and usually a Particle), and from that, returns a float. The float is specified by a curve that is defined as f(x) over the domain 0 <= x <= 1. Importantly, every curve gets to choose an input value from that System/Particle. That input is a value from 0 to 1 that is then passed as the x coordinate into the curve function.

Currently, I have implemented:

  • Constant: Always inputs x=0

  • [Particle/System]Life: x goes from 0 to 1 over the course of each Particle's/System's life

  • [Particle/System]Random: x is a random value from 0 to 1 that is chosen once, the first time the curve is sampled for a given Particle/System

For Constant and Life values, no additional storage space is required. For Random values that are only sampled at the birth of the system or particle (e.g. Emission Speed), that is also true. But for Random values that are sampled every frame (e.g. Alpha), I need to store the chosen random value. Rather than create storage on every particle for every curve, I store the value in a map. The key is the offset of the curve within the ParticleSystemDef structure (which uniquely identifies it), and the value is the randomly selected float. (Since my engine doesn't use dynamic memory, I use my ArrayMap class, which has an interface based on std::unordered_map, but stores std::pair<Key, Value> in an array contained within the ArrayMap class)
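To make the curve-input idea concrete, here's a minimal C++ sketch of how a ParticleCurve might pick its x value and map it through f(x) on [0, 1]. All names here (CurveInput, Sample, Evaluate, the three-key piecewise-linear curve) are illustrative, not Kinematic's actual identifiers, and the per-particle random is passed in as a parameter rather than fetched from the engine's ArrayMap.

```cpp
#include <cassert>

// Illustrative sketch: a curve chooses its input source, then evaluates
// f(x) over the domain 0 <= x <= 1.
enum class CurveInput { Constant, ParticleLife, ParticleRandom };

struct ParticleCurve {
    CurveInput input = CurveInput::Constant;
    // A tiny fixed-size piecewise-linear curve: values at x = 0, 0.5, 1.
    float keys[3] = {0.f, 0.f, 0.f};

    // f(x) for x in [0, 1], linearly interpolating between keys.
    float Sample(float x) const {
        if (x <= 0.f) return keys[0];
        if (x >= 1.f) return keys[2];
        float t = x * 2.f;                 // map [0,1] onto two segments
        int seg = (t < 1.f) ? 0 : 1;
        float frac = t - seg;
        return keys[seg] + (keys[seg + 1] - keys[seg]) * frac;
    }

    // Pick x from the particle's state, then sample the curve. In the
    // real system, storedRandom would come from the ArrayMap keyed by
    // the curve's offset within ParticleSystemDef.
    float Evaluate(float particleLifeFrac, float storedRandom) const {
        switch (input) {
            case CurveInput::Constant:       return Sample(0.f);
            case CurveInput::ParticleLife:   return Sample(particleLifeFrac);
            case CurveInput::ParticleRandom: return Sample(storedRandom);
        }
        return 0.f;
    }
};
```

A curve keyed as 0 → 1 → 0 with a ParticleLife input, for example, fades a value in over the first half of the particle's life and back out over the second half.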

Now, at the end of the week, the set of curves I have implemented is:

  • SystemLifetime, ParticleLifetime

  • BurstCount, SpawnRate

  • Alpha, PaletteU, BlackPoint, WhitePoint

  • EmissionSize, SizeMultiplier, SizeMultiplierX, SizeMultiplierY

  • EmissionAngle, EmissionOffset, EmissionSpeed

  • Rotation, RotationRate

  • AccelerationAngle, AccelerationMagnitude


ParticleSystemDef is by far the most complicated data structure in Kinematic's data at this point. Thus, the raw json editing that I've been happy with for simpler data wasn't going to cut it. I leveraged a python/Qt data editor I built for LED art a couple years ago. (The editor is open-source, though I make no guarantees about it being easy to use for humans that don't have my brain inside their head)

Editing the first few curves

At this point, I don't have any fancy curve editing, previews, or other features that my previous jobs' VFX editors have supported. But it's very easy for me to add fields to the editor, and it formats my json correctly. So that's enough for me.


As with other elements of Kinematic, I'm interested in finding ways to make good looking content quickly, using my not-artistically-trained hands and brain. Thus, I'm not doing any hand-drawn VFX. Instead, I want to build 2D pixelated VFX using the same procedural/noise texture/shader methods I'm used to from working on Halo. Every Particle in Kinematic renders by sampling 1 or more textures to acquire a "palette UV". Then it samples into the single palette texture, which returns a 2D LUT coordinate. Finally, it samples the game's current color LUT to get an RGB value. Optionally, there can also be an alpha texture that is applied to the result.

The palette UV comes from two separate sources. The U coordinate comes from the PaletteU curve on the particle itself. This can be used to make different particles in the same system have different color palettes (using ParticleRandom). Or it can be used to have the palette change over time (using ParticleLifetime). Later, you'll see examples of each of these techniques. The V coordinate is a value between 0 and 1 obtained by sampling one or more textures, and combining them via simple mathematical operations. I currently support add, multiply, max, and diff.
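The per-pixel color path above really lives in a shader, but here's a CPU-side C++ sketch of the V-combining step to make the math explicit. CombineOp and CombineV are illustrative names, not the engine's real identifiers, and the result is clamped to [0, 1] so it stays a valid texture coordinate (an assumption on my part).

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Illustrative: combine two control-texture samples into the palette V
// coordinate using one of the supported operations.
enum class CombineOp { Add, Multiply, Max, Diff };

float CombineV(float a, float b, CombineOp op) {
    float v = 0.f;
    switch (op) {
        case CombineOp::Add:      v = a + b;           break;
        case CombineOp::Multiply: v = a * b;           break;
        case CombineOp::Max:      v = std::max(a, b);  break;
        case CombineOp::Diff:     v = std::fabs(a - b); break;
    }
    // Keep the result a valid texture coordinate.
    return std::clamp(v, 0.f, 1.f);
}
```

The resulting V, paired with the PaletteU curve's output as U, indexes the palette texture, whose texel then indexes the game's current color LUT.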

Scrolling palette U
The palette being scrolled

Probably my favorite trick from Bungie's toolkit is the use of the palette's U coordinate to dramatically change a particle's look over time. It's particularly effective with relatively smooth "smoke" gradients to make electricity, as you can see in this early prototype of yellow ion VFX.

Eroding fire spaghetti-os

Another great trick of theirs is "black point", which can be used to give nice erosion effects. Basically, instead of multiplying the whole range of a texture by a float, you clamp values below that float to 0, and linearly remap values above it back onto the 0-to-1 range.
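The black point remap described above boils down to a one-liner. This is a sketch under my reading of the description (the function name is mine): driving blackPoint from 0 toward 1 over a particle's life makes progressively more of the texture fall below the threshold and vanish, which is what produces the erosion look.

```cpp
#include <algorithm>
#include <cassert>

// Remap a texture sample against a black point: values at or below the
// black point clamp to 0; values above it are rescaled linearly so the
// surviving range still spans 0 to 1.
float ApplyBlackPoint(float value, float blackPoint) {
    if (blackPoint >= 1.f) return 0.f;  // fully eroded
    return std::clamp((value - blackPoint) / (1.f - blackPoint), 0.f, 1.f);
}
```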

Attaching to Objects

Whip it good

It was very easy to attach a particle system to a game object. I simply store the game object's ID, and then ask for its PosComp every frame to update the system's position. Since my Blender exporter saves out bone positions, I could also easily attach particle systems to the bones of any object that was created in Blender (as opposed to drawn in Aseprite). Finally, I added the option for particles to be emitted in world or local space, and optionally to inherit the velocity of whatever the system is attached to when they spawn.
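Here's a rough C++ sketch of that attachment update, store an ID rather than a pointer and re-resolve it every frame. Vec2, PosLookup, AttachedSystem, and the field names are all stand-ins I invented for illustration; the lookup is injected as a callable so the sketch stays self-contained.

```cpp
#include <cassert>
#include <functional>

struct Vec2 { float x = 0.f, y = 0.f; };

// Stand-in for the engine's component lookup; returns false if the
// object no longer exists.
using PosLookup = std::function<bool(int id, Vec2* outPos, Vec2* outVel)>;

struct AttachedSystem {
    int attachedObjectId = -1;
    bool inheritVelocity = false;
    Vec2 position;
    Vec2 spawnVelocity;  // added to newly emitted particles
};

// Re-resolve the attached object's position each frame. Storing the ID
// instead of a pointer means a despawned object simply fails the
// lookup rather than leaving a dangling reference.
void UpdateAttachment(AttachedSystem& sys, const PosLookup& find) {
    if (sys.attachedObjectId < 0) return;
    Vec2 pos, vel;
    if (find(sys.attachedObjectId, &pos, &vel)) {
        sys.position = pos;
        sys.spawnVelocity = sys.inheritVelocity ? vel : Vec2{};
    }
}
```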

Multiple Systems per Effect

In the middle of the week, I was working on a yellow spark effect for a resource in your factory that follows along the contours of terrain. I realized that I couldn't put off any longer the ability to have multiple particle systems in a single effect. This required me to build some new data structures to support dynamically sized effects within my static memory allocation. Thus, I created BlockAllocator, BlockLinkedList, and ArrayLinkedList.

BlockAllocator is pretty simple. It's just a template class that takes the type that should be stored and the quantity of them. It has a statically allocated array of that size, and a parallel std::bitset of occupancy. It then allows you to allocate and deallocate from that array. When you allocate, you don't get a pointer; you get an index that you can then ask the allocator to turn into a reference. Thus, we can confirm occupancy at every access via an assert, and it still works when memcpy'd to a new location.
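A minimal sketch of that idea, with my own illustrative names (Allocate, Deallocate, Get, kInvalid), not Kinematic's actual interface. The key properties from the text are all here: static storage, a parallel std::bitset of occupancy, indices instead of pointers, and an occupancy assert on every access.

```cpp
#include <bitset>
#include <cassert>
#include <cstddef>

template <typename T, std::size_t N>
class BlockAllocator {
public:
    static constexpr std::size_t kInvalid = N;

    // Linear scan for a free slot; returns an index, or kInvalid if full.
    std::size_t Allocate() {
        for (std::size_t i = 0; i < N; ++i) {
            if (!occupied_[i]) {
                occupied_[i] = true;
                items_[i] = T{};
                return i;
            }
        }
        return kInvalid;
    }

    void Deallocate(std::size_t index) {
        assert(index < N && occupied_[index]);
        occupied_[index] = false;
    }

    // Turn an index back into a reference. Because handles are indices,
    // they survive the whole allocator being memcpy'd elsewhere, and
    // every access verifies the slot is actually live.
    T& Get(std::size_t index) {
        assert(index < N && occupied_[index]);
        return items_[index];
    }

private:
    T items_[N] = {};
    std::bitset<N> occupied_;
};
```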

BlockLinkedList is a class designed to work with a BlockAllocator. BlockLinkedList::Allocator is a type alias that defines the allocator backing an arbitrary number of BlockLinkedList instances. A BlockLinkedList allows you to add, remove, and iterate elements that are stored inside that BlockAllocator.

ArrayLinkedList is a specialization of BlockLinkedList that's specifically designed for things like Particle, which are expected to come in significant quantities. Rather than have each particle exist in its own LinkedList node, they're stored in arrays of size 16. Those arrays are then stored in a linked list. But the ParticleSystem doesn't care about all that. It simply interacts with an interface very similar to std::vector<Particle>.

I'm not sure if the extra work for ArrayLinkedList was worth it, vs. just doing single Particles in a BlockLinkedList. But it made me happy to write it, and took less than an hour. So it's in there.
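The chunked layout described above can be sketched like this. To keep the sketch self-contained, this toy version heap-allocates its chunks; in the engine, the chunks would come out of a BlockAllocator instead, and the class/method names are mine rather than Kinematic's.

```cpp
#include <cassert>
#include <cstddef>

// Elements live in fixed arrays of 16, the arrays form a singly linked
// list, and callers see a flat, vector-like interface.
template <typename T, std::size_t ChunkSize = 16>
class ArrayLinkedList {
    struct Chunk {
        T items[ChunkSize];
        std::size_t count = 0;
        Chunk* next = nullptr;
    };

public:
    ~ArrayLinkedList() {
        for (Chunk* c = head_; c != nullptr;) {
            Chunk* dead = c;
            c = c->next;
            delete dead;
        }
    }

    void push_back(const T& value) {
        if (tail_ == nullptr || tail_->count == ChunkSize) {
            Chunk* c = new Chunk;          // engine: BlockAllocator slot
            if (tail_) tail_->next = c; else head_ = c;
            tail_ = c;
        }
        tail_->items[tail_->count++] = value;
        ++size_;
    }

    std::size_t size() const { return size_; }

    // Walk the chunks until the index falls inside one of them.
    T& at(std::size_t i) {
        assert(i < size_);
        Chunk* c = head_;
        while (i >= c->count) { i -= c->count; c = c->next; }
        return c->items[i];
    }

private:
    Chunk* head_ = nullptr;
    Chunk* tail_ = nullptr;
    std::size_t size_ = 0;
};
```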

As a result of all of this work, an EffectComp can spawn any number of ParticleSystems, and each of them can spawn any number of Particles.


Here are some examples of VFX I built towards the end of the week. None of them are final, but they helped me figure out what features I need to build.

Pink Goo

Pink goo splatter palette

This is a pretty simple effect, but it helps sell the splattering of the Pink Goo resource when it hits terrain. Probably the most interesting thing going on is the use of ParticleLifetime -> PaletteU on the little circle that spawns at the center of the effect, in order to give the sense of a bubble popping.

Green Leaves

Falling leaves

This effect will be used to show columns of space where you can build Green Leaf resource harvesters. These leaves use ParticleRandom -> PaletteU in order to have each leaf choose among several color palettes I created. Also, I repurposed some code I wrote for leaf floating motion, by converting it into a static function in my GliderComp class. Thus the same code can be shared by the physics system and particles. The algorithm is based on (1) from a paper by Chengyang Li et al. from Tsinghua University Press.

Yellow Ion

Yellow ion races

This effect is for the Yellow Ion resource I mentioned above. I'm pretty happy with how this one turned out. There are 3 particle systems going on. One is a flickering, circular yellow glow. The second is the little sparks that shoot off and die when they collide with terrain. The third, "hero", system is the spinning yellow and blue lightning bolt particle. That particle uses a bunch of the techniques listed above. The alpha texture is the lightning bolts, which spin once per "particle lifetime". But the particle's lifetime is set to loop, thus allowing it to keep spinning over and over as long as the resource lives. The palette U scrolls from 0 to 1 (looping back around) 3 times every particle lifetime. The palette V is supplied by a cloudy noise texture generated by the amazing EffectTextureMaker tool. Thus you get a flickering, noisy yellow and blue lightning ball that hopefully evokes the "plasma ball" toy style of arcing energy.

Spark Textures

(Sadly, it definitely looks much better at 60fps in game than in the gif, but that's life)


I had a blast working on VFX this week. I can't wait to use this system to liven up the world Jem explores in Kinematic. Is there anything else you'd like to hear more about? Come join the discussion on Reddit, and join Kinematic's Discord.


©2020 by Lt Randolph Games.