Simulating Particle Effects using OpenGL

Particle Effect

In this article I will demonstrate one possible way to implement a particle effect in C++ using OpenGL to render the effect. This demo uses the fixed-function pipeline and the host processor (CPU) to perform the simulation. I will use OpenGL and GLUT to render graphics to the application window. If you do not know how to set up an application using OpenGL and GLUT, you can refer to my previous article titled [Introduction to OpenGL for Game Programmers] available [here].


Particle systems have been used extensively in games for many years. They are used to add visual flair and can simulate phenomena that occur in nature such as dust, fire, water, or clouds. They are also used to represent unnatural phenomena such as magical effects, or waves of geometrically shaped enemies attacking the player character in a rectangular space.

Geometry Wars - Particle Example

There are several different types of particle effects that can be created.

  1. Billboard Particles: Billboard particles are flat textured quads that are rotated to always face the camera. They are called “billboard” because like a billboard you see on the side of the road, they are meant to display an image and they work best when they are facing you.
  2. Trail Particles: Trail particles are similar to billboard particles in the sense that they use an image (usually just a gradient image) but instead of just a quad, the trails are generated using a triangle strip that follows a spline path and usually tapers-off or fades-out at the end.
  3. Beam Particles: A beam particle effect is usually used to visualize energy beams, laser beams, or light beams. A beam emitter is a set of stretched quads along an axis where each quad is rotated about that axis. The result is a full energy beam when viewed from any angle except along the axis the quads are aligned to, so the beam ends are usually capped by some geometry.
  4. Mesh Particles: It is also possible to generate a particle effect that produces geometric meshes. This is not commonly done, however, since using less geometry to represent a particle is generally preferred.

In this article, I will demonstrate the billboard particle effect.


A few dependencies are used by this project to ease the coding process and to make the material more readable.

  • OpenGL Utility Toolkit (GLUT): Used for platform-independent window management and window event handling, which simplifies the implementation of an OpenGL windowed application.
  • OpenGL Mathematics Library (GLM): Used for vector classes, quaternion classes, matrix classes, and math operations that are suited for OpenGL applications.
  • Simple OpenGL Image Library (SOIL): A library that provides functions to load textures from disk as easily as possible. This library is ideal for loading images to be used by OpenGL.

The Particle Class

The Particle class (actually, it’s a struct) defines the properties of a single particle that is used to simulate the particle effect.

The meaning of the particle members is given below:

  • glm::vec3 m_Position: The center point of the simulated particle local to the particle effect.
  • glm::vec3 m_Velocity: The velocity of the particle in 3D space.
  • glm::vec4 m_Color: The 4-component color (red, green, blue, alpha) of the particle. The texture of the particle will be blended with this color to determine the final representation.
  • float m_fRotate: Determines the amount of rotation to apply to the particle’s local z-axis.
  • float m_fSize: Determines how large the particle will be, measured in world coordinates, not screen coordinates.
  • float m_fAge: The duration in seconds since the particle was emitted.
  • float m_fLifeTime: How long the particle will live for. When the particle’s age has exceeded its lifetime, it is considered “dead” and will not be rendered until it is re-emitted.
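The members above can be sketched as a struct. This is a simplified stand-in, not the demo’s actual code: minimal vector structs replace glm::vec3/glm::vec4 so the sketch is self-contained.

```cpp
#include <cassert>

// Minimal stand-in vector types; the actual implementation uses glm::vec3/glm::vec4.
struct vec3 { float x = 0, y = 0, z = 0; };
struct vec4 { float r = 0, g = 0, b = 0, a = 0; };

// One simulated particle, mirroring the members described above.
struct Particle
{
    vec3  m_Position;                    // Center point, local to the particle effect.
    vec3  m_Velocity;                    // Velocity in 3D space.
    vec4  m_Color     { 1, 1, 1, 1 };    // Modulated with the particle texture.
    float m_fRotate   = 0.0f;            // Rotation about the particle's local z-axis.
    float m_fSize     = 1.0f;            // Size in world units.
    float m_fAge      = 0.0f;            // Seconds since the particle was emitted.
    float m_fLifeTime = 0.0f;            // Seconds until the particle is considered dead.
};

// A particle is dead once its age exceeds its lifetime.
bool IsDead(const Particle& p) { return p.m_fAge >= p.m_fLifeTime; }
```

Keeping the struct this small matters because the effect stores the particles by value in one contiguous buffer.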

You may be able to think of a few other properties that are unique to the particle but I don’t recommend you add properties to the particle definition that will be constant for the entire effect. For example, it wouldn’t be wise to add the constant force vector that will be applied to every particle in the effect to the individual particle definition.

The ParticleEffect Class

The ParticleEffect class will be used to emit, update, and render particles that share common properties such as texture, interpolated colors, interpolated scale, applied forces, blending modes, and sorting criteria. Since all the particles in a particle effect share the same texture, several particle effects will be required to create a complex effect that needs several different textures to look correct (for example, a fire-and-smoke particle system would require at least two different particle effects – one for the fire and one for the smoke).

The ParticleEffect Header File

The particle effect class will declare the vertex definition that will be used to render the particle effect to the GPU. The particle effect class will also store the buffers for the particle array and the vertex array as well as the texture that is used to render each particle sprite. The ParticleEffect class will also store a force vector that will be applied to each particle in the simulation each frame to give the particles some motion.

The particle effect is initialized with the maximum number of particles that will be used to render this effect. The number of particles can be adjusted using the ParticleEffect::Resize method.

The ParticleEffect::SetCamera method is used to store the instance of the camera class that is used to orient the particle facing the camera and the ParticleEffect::SetParticleEmitter method stores the instance to an emitter class that determines the position and velocity of newly emitted particles.

The ParticleEffect::SetColorInterpolator method is used to specify a reference to an Interpolator type which is used to interpolate the color of the particle over its lifetime. Other properties of the particle could also be interpolated using this class (such as the scale or the rotation of the particle) but for the sake of simplicity I only used it to interpolate the color, because I wanted to specify multiple different colors to give the particles a “rainbow” effect, whereas the scale and the rotation are linearly scaled over the lifetime of the particle.
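The Interpolator idea can be sketched as follows. This is a simplified, self-contained version using float values; the real class is templated on the value type so it can interpolate glm::vec4 colors, and the method names here are assumptions.

```cpp
#include <cassert>
#include <cmath>
#include <iterator>
#include <map>

// A sketch of the Interpolator described above: values are registered at
// normalized times in [0,1], and GetValue() linearly blends between the two
// nearest entries. Times outside the registered range clamp to the ends.
class Interpolator
{
public:
    void AddValue(float time, float value) { m_Values[time] = value; }

    float GetValue(float time) const
    {
        if (m_Values.empty()) return 0.0f;
        auto hi = m_Values.lower_bound(time);          // first entry at or after 'time'
        if (hi == m_Values.begin()) return hi->second; // before the first key
        if (hi == m_Values.end()) return std::prev(hi)->second; // past the last key
        auto lo = std::prev(hi);
        float t = (time - lo->first) / (hi->first - lo->first);
        return lo->second + t * (hi->second - lo->second);
    }

private:
    std::map<float, float> m_Values; // time -> value, kept sorted by time
};
```

Registering seven rainbow colors at evenly spaced times and calling GetValue with the particle’s age divided by its lifetime gives the effect described below in the Update method.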

The ParticleEffect::RandomizeParticles method is used to randomly distribute all of the particles in the particle effect, giving them a (pseudo) random position and velocity on a unit sphere centered at the origin of the effect. This algorithm will be demonstrated when I show the code for this method. This method is used to distribute the particles if no ParticleEmitter instance is defined for the m_pParticleEmitter member variable.

The ParticleEffect::EmitParticles method is used to distribute the particles if a ParticleEmitter instance is defined for the m_pParticleEmitter member variable.

The ParticleEffect::Update method is used to update the positions, velocities, colors, rotations, sizes, and age of the particle effect for a single step of the simulation while the ParticleEffect::Render method is used to render the particle effect in OpenGL.

The ParticleEffect::LoadTexture method is used to load the texture that will be applied to every particle in the effect.

The ParticleEffect::Resize method can be used to shrink, or grow the number of particles in the effect after the effect instance has been created.

The two protected members ParticleEffect::RandomizeParticle and ParticleEffect::EmitParticle are self-explanatory – they are simply used to randomize, or emit a single particle.

The ParticleEffect::BuildVertexBuffer method will loop through the particle buffer and build the vertex buffer that contains the vertices for the camera-oriented quads that are used to render the particles.

And of course, there are the private member variables of the class that are used to perform the simulation and render the effect.

The ParticleEffect Source File

I will first show the constructor, destructor and some of the setters for the class before I show some of the more complex functions.

First we include the pre-compiled header file that is required if you enable pre-compiled header generation (this is done in the project settings that are included at the end of this post), followed by a few of the dependencies this class uses. The “Random.h” header declares some functions that are used to produce (pseudo) random values for the particle effect to emit the particles.

The effect’s m_Force member is initialized to the force due to gravity (if we assume that 1-world unit is equivalent to 1 meter) which will cause all of the particles to fall downwards after being spawned.

The class constructor will use the ParticleEffect::Resize method to ensure the particle buffers and vertex buffers contain enough particles and vertices. The destructor will simply delete the OpenGL texture if one was loaded.

Following the destructor, we define the setters for the camera, particle emitter, and color interpolator that are used by this effect.

The ParticleEffect::LoadTexture Method

The ParticleEffect::LoadTexture method is very simple thanks to the “Simple OpenGL Image Library” (SOIL). It simply asks the SOIL library to load the texture and we get a texture ID back from the load texture method.

The ParticleEffect::RandomizeParticle Method

If the particle effect does not have a reference to a ParticleEmitter class, I wanted to provide a fall-back method to emit the particles. The ParticleEffect::RandomizeParticle method will randomly distribute a single particle on a unit sphere centered at the origin of the particle effect. It will also give it a random velocity that pushes the particle away from the center of the sphere.

The particle’s age is reset so it is no longer “dead” and a (pseudo) random number between 3 seconds and 5 seconds is used to determine how long this particle will stay alive.

The unitVec variable stores a random vector distributed onto a unit sphere.

The velocity of the particle is set to point away from the unit sphere with a random rate between 10 and 20 world-units per second.
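A sketch of this fall-back emission is shown below. The RandRange helper and the rejection-sampling unit-vector routine are assumptions standing in for the helpers declared in “Random.h”; the member names are simplified from the Particle struct.

```cpp
#include <cassert>
#include <cmath>
#include <cstdlib>

struct vec3 { float x, y, z; };

// Random float in [fMin, fMax]; the project's "Random.h" provides similar helpers.
float RandRange(float fMin, float fMax)
{
    return fMin + (fMax - fMin) * (std::rand() / (float)RAND_MAX);
}

// A (pseudo) random point on the unit sphere, found by rejection sampling:
// pick points in the unit cube and reject those outside the unit ball.
vec3 RandUnitVec()
{
    float x, y, z, len2;
    do {
        x = RandRange(-1, 1); y = RandRange(-1, 1); z = RandRange(-1, 1);
        len2 = x * x + y * y + z * z;
    } while (len2 > 1.0f || len2 < 1e-6f);
    float len = std::sqrt(len2);
    return { x / len, y / len, z / len };
}

struct Particle { vec3 pos, vel; float age, lifeTime; };

// Randomize one particle as described: placed on the unit sphere, pushed
// outwards at 10-20 world units per second, alive for 3-5 seconds.
void RandomizeParticle(Particle& p)
{
    vec3 unitVec = RandUnitVec();
    float speed = RandRange(10.0f, 20.0f);
    p.pos = unitVec;
    p.vel = { unitVec.x * speed, unitVec.y * speed, unitVec.z * speed };
    p.age = 0.0f;
    p.lifeTime = RandRange(3.0f, 5.0f);
}
```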

The ParticleEffect::RandomizeParticles Method

The ParticleEffect::RandomizeParticles method simply iterates through the particle buffer and randomly distributes the particle using the ParticleEffect::RandomizeParticle method shown earlier.

The ParticleEffect::EmitParticle Method

If the particle effect has a reference to a valid ParticleEmitter class instance then we can use the ParticleEffect::EmitParticle method to emit each particle.

I have defined two different emitter types in this project – a sphere emitter, and a cube emitter.

The SphereEmitter class will randomly emit a particle somewhere about a sphere within some range. The sphere emitter uses spherical coordinates to determine the range in which particles will be emitted. The spherical coordinates are converted to Cartesian coordinates to determine the position and velocity at which the particle is emitted in 3D space.

The sphere emitter uses a range of values for the radius, inclination, and azimuth of the sphere (or semi-sphere) in which to emit the particle. A random lifetime is chosen for the particle and its age is reset by the emitter. To emit the particles at a single point in space, simply set the MinRadius and MaxRadius to 0, or define a new emitter called PointEmitter that defines a single point to emit the particles from and a pair of vectors that define the min and max velocity for the particle.
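For reference, here is a sketch of the spherical-to-Cartesian conversion using one common y-up convention. The exact axis mapping in the demo may differ, so treat this as an illustration rather than the SphereEmitter’s actual code.

```cpp
#include <cassert>
#include <cmath>

struct vec3 { float x, y, z; };

// Spherical -> Cartesian with a y-up convention: inclination is measured from
// the +y axis, azimuth rotates around it. radius = 1, inclination = 0 gives
// the "north pole" (0, 1, 0).
vec3 SphericalToCartesian(float radius, float inclination, float azimuth)
{
    return {
        radius * std::sin(inclination) * std::cos(azimuth),
        radius * std::cos(inclination),
        radius * std::sin(inclination) * std::sin(azimuth)
    };
}
```

The emitter picks a random radius, inclination, and azimuth within the configured min/max ranges and runs them through a conversion like this to get the emit position; the velocity direction falls out of the same math.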

The cube emitter uses a bounding volume to determine where particles will be emitted.

The CubeEmitter defines a Min and Max property for the width, height, and depth of the cube. If you want the particles to be emitted on a line segment, you could simply set 2 of the 3 dimensions to 0.
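A sketch of the cube emitter’s position logic follows. The member names are assumptions based on the description above, and RandRange stands in for the helpers in “Random.h”.

```cpp
#include <cassert>
#include <cstdlib>

struct vec3 { float x, y, z; };

// Random float in [fMin, fMax]; stands in for the helpers in "Random.h".
float RandRange(float fMin, float fMax)
{
    return fMin + (fMax - fMin) * (std::rand() / (float)RAND_MAX);
}

// Sketch of the CubeEmitter: a particle is placed uniformly within the min/max
// bounds on each axis. Setting min == max on an axis collapses that dimension,
// so zeroing two dimensions emits along a line segment, as noted above.
struct CubeEmitter
{
    float MinWidth  = -1, MaxWidth  = 1;
    float MinHeight = -1, MaxHeight = 1;
    float MinDepth  = -1, MaxDepth  = 1;

    vec3 EmitPosition() const
    {
        return { RandRange(MinWidth,  MaxWidth),
                 RandRange(MinHeight, MaxHeight),
                 RandRange(MinDepth,  MaxDepth) };
    }
};
```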

When running in debug mode, these emitters will render some debug output that shows the boundaries on which the particles will be emitted.

You could also create a mesh emitter that will cause particles to be emitted at random vertex positions of a mesh and the velocity could be determined by the normal of the vertex multiplied by some scalar to determine the initial speed of the particle. I haven’t implemented this type of emitter in this project. I will leave it to the reader to implement.

The ParticleEffect::EmitParticles Method

Not surprisingly, the ParticleEffect::EmitParticles method will iterate through the list of particles and invoke the ParticleEmitter::EmitParticle method if a particle emitter has been defined for this particle effect. If no emitter has been assigned to the effect, the default ParticleEffect::RandomizeParticles method will be used instead.

The ParticleEffect::Update Method

The ParticleEffect::Update method will update the velocity, position, color, rotation, and size of each particle in the effect.

This method will simply loop through the particle buffer and update each particle. First, the particle’s age is incremented based on the elapsed time this frame. If the age of the particle exceeds its lifetime, it is considered dead and will be immediately re-emitted as a new particle.

The velocity of the particle is incremented based on the effect’s global force parameter, and the position of the particle is incremented based on its velocity.
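This update step amounts to explicit Euler integration. A minimal sketch, with plain structs in place of the glm types:

```cpp
#include <cassert>
#include <cmath>

struct vec3 { float x, y, z; };

// One simulation step for a particle: the effect's global force accelerates
// the particle, and the (updated) velocity then moves it. With m_Force set to
// gravity (0, -9.81, 0), this makes every particle arc downwards after spawn.
void UpdateParticle(vec3& pos, vec3& vel, const vec3& force, float fDeltaTime)
{
    vel.x += force.x * fDeltaTime;
    vel.y += force.y * fDeltaTime;
    vel.z += force.z * fDeltaTime;

    pos.x += vel.x * fDeltaTime;
    pos.y += vel.y * fDeltaTime;
    pos.z += vel.z * fDeltaTime;
}
```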

The color value is determined by the m_ColorInterpolator values. If the color interpolator container is empty, the default color of the particle is white (specified in the ParticleEffect’s constructor). I wanted to have a rainbow-like effect during the life of the particle, so in the main method I created a ColorInterpolator with seven different colors over the particle’s life.

The result will be particles are spawned red and will interpolate through all the colors of the rainbow before they fade-out to red again.

On line 162 the particle is rotated (it will rotate twice during its lifetime) and the size of the particle is interpolated between 5 and 0 during its lifetime. The size parameter is also something that could benefit from the Interpolator type. Using the Interpolator to determine the size of the particle, we could have the particle born at a scale of 0, grow to its largest size very quickly, then slowly shrink back down to zero during its lifetime. This kind of effect would not be possible with a simple linear interpolation between two values during the particle’s life.
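The linear rotation and size behavior described here can be sketched as follows (the constants mirror the description: two full turns over the lifetime, size shrinking from 5 down to 0):

```cpp
#include <cassert>
#include <cmath>

// Linear rotation and size over the particle's life. lifeRatio is 0 at birth
// and 1 at death; the particle completes two full 360-degree rotations and
// shrinks linearly from 5 world units to 0 over its lifetime.
void UpdateRotationAndSize(float age, float lifeTime, float& rotate, float& size)
{
    float lifeRatio = age / lifeTime;
    rotate = lifeRatio * 720.0f;       // two rotations, in degrees
    size   = (1.0f - lifeRatio) * 5.0f; // 5 -> 0 world units
}
```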

After all of the particles have been processed, the vertex buffer is generated by invoking the ParticleEffect::BuildVertexBuffer method.

The ParticleEffect::BuildVertexBuffer Method

The ParticleEffect::BuildVertexBuffer method will modify the vertex buffer in order to render the particle effect with screen-aligned quads. We want to make sure that the quad is rotated using the rotation of the particle, but we also want to make sure that the particle is counter-rotated by (rotated by the inverse of) the camera matrix. Doing this will ensure the quad is always facing the camera. The texture coordinates for each vertex must also be specified, as well as the vertex color that will be used to modulate the color of the texture. Let’s look at each part of the implementation separately:

First, a few constant vectors are defined that will be used to align the width and height of the particle. You will notice I am using “0.5” for the X and Y constants. This is because these values represent the half-height and half-width of the particle so that the unscaled size of the particle will be 1×1. The Z vector constant is used to create the rotation quaternion which will be used to rotate the particle around that axis.

We also need the camera’s matrix as a quaternion in order to counter-rotate the view matrix so that the quad is always facing the camera.

We also want to make sure that, if any particles were added to the particle buffer since the last time we built the vertex buffer, the vertex buffer still contains the right number of vertices. If the size doesn’t change, the call to m_VertexBuffer.resize will have no effect.

After we’ve setup some initial variables, we need to build the vertex buffer by looping through each particle and setting the vertex position, texture, and diffuse color.

After we have a reference to the particle in the particle buffer, we can generate a quaternion from the particle’s rotation parameter and the unit Z axis to rotate the particle around the look-at vector in the particle’s space.

Since we need four vertices for each quad, we need to get a reference to the next four vertices in the vertex buffer. The order of the vertices is important if you enable back-face culling. The default winding order of front-facing polygons is counter-clockwise so we have to pass the vertices of the quad in a counter-clockwise order to OpenGL to prevent the quads from being back-face culled. It doesn’t really matter which one of the four corners you specify first as long as they are in a counter-clockwise order. The image below shows the positions of the vertices that are used in my implementation.

Vertex Winding Order - Counter Clockwise

The vertex position is equal to the position of the particle plus some offset depending on which vertex we are computing. The rotation of the particle is applied to the offset vector and a scale is applied to adjust the size of the particle.

For each vertex position, the camera matrix is applied on the right. This is equivalent to multiplying the offset vector by the inverse of the camera’s rotation thus undoing the camera’s rotation and forcing the particle to be always facing the camera.

The texture coordinate for each vertex is specified depending on the corner of the quad and the diffuse color for the vertex is set to the color of the particle.
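Putting the corner construction together: assuming an identity view matrix so the camera counter-rotation can be ignored, a sketch of building the four counter-clockwise corners looks like this. The demo does the rotation with a glm quaternion and then applies the camera rotation on top; this stand-in uses a plain z-axis rotation.

```cpp
#include <cassert>
#include <cmath>

struct vec3 { float x, y, z; };

// Rotate an offset about the z-axis by 'angle' radians (the particle's local
// rotation). The demo builds a quaternion from the unit Z axis instead.
vec3 RotateZ(const vec3& v, float angle)
{
    float c = std::cos(angle), s = std::sin(angle);
    return { v.x * c - v.y * s, v.x * s + v.y * c, v.z };
}

// Build the four corners of a particle quad, counter-clockwise, from the
// half-width/half-height offsets (0.5 each, so the unscaled quad is 1x1)
// scaled by the particle's size and offset by its position.
void BuildQuad(const vec3& pos, float rotate, float size, vec3 corners[4])
{
    const vec3 offsets[4] = { { -0.5f, -0.5f, 0 },   // bottom-left
                              {  0.5f, -0.5f, 0 },   // bottom-right
                              {  0.5f,  0.5f, 0 },   // top-right
                              { -0.5f,  0.5f, 0 } }; // top-left
    for (int i = 0; i < 4; ++i)
    {
        vec3 o = RotateZ(offsets[i], rotate);
        corners[i] = { pos.x + o.x * size, pos.y + o.y * size, pos.z + o.z * size };
    }
}
```

In the real method, each rotated offset would additionally be multiplied by the camera’s rotation (the inverse of the view rotation) before being added to the particle position, which is what keeps the quads facing the viewer.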

The ParticleEffect::Render Method

The ParticleEffect::Render method will pass the vertex information to the GPU to be rendered on screen.

On lines 173 and 174, we disable depth writes and enable blending. Disabling depth writes will prevent the particles from writing anything to the depth buffer. The effect is that anything that is drawn after the particle effect will always be drawn on top of the particles. Usually we want to disable depth writing for transparent materials (like these particles), otherwise particles that appear behind other particles but are drawn after the front particles will not be rendered, even where the front particle is completely transparent. If we don’t disable depth writes, round particles that are drawn in the foreground will appear square when background particles are drawn after the foreground particles. Another solution to this problem is simply to depth-sort every particle and render all transparent objects from back-to-front (relative to the viewer). Depth sorting particles according to the current viewer is a subject all on its own and won’t appear in this article.

In order to render the particle effect, we have to pass the vertex information to the GPU and tell OpenGL to render the vertex buffer as a set of quads.

We first enable the client states that our particle effect is using (GL_VERTEX_ARRAY, GL_TEXTURE_COORD_ARRAY, and GL_COLOR_ARRAY for the vertex position, texture coordinate, and diffuse color respectively).

Then we also have to bind the buffers to the display list. We do this using the glVertexPointer method to bind the first element of the vertex positions for each quad. Since we are using an interleaved buffer (the vertex properties are mixed in the array), we need to specify the stride of the buffer. The stride determines how much memory we have to skip to get to the next vertex element in the buffer. This is always going to be the size of a vertex object regardless of the offset of the member in the struct.
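A sketch of such an interleaved layout makes the stride-versus-offset distinction concrete. The member names here are assumptions; the demo’s actual Vertex struct may order its members differently, but the principle is the same: the stride is sizeof(Vertex) for all three pointers, while each attribute starts at its own offset.

```cpp
#include <cassert>
#include <cstddef>

// An interleaved vertex: position, texture coordinate, and diffuse color
// packed together per-vertex. With this layout, the calls would look like
//   glVertexPointer(3, GL_FLOAT, sizeof(Vertex), &buf[0].pos);
//   glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex), &buf[0].tex);
//   glColorPointer(4, GL_FLOAT, sizeof(Vertex), &buf[0].diffuse);
// i.e. the stride is always sizeof(Vertex); only the start offset differs.
struct Vertex
{
    float pos[3];     // vertex position, offset 0
    float tex[2];     // texture coordinate, offset 3 * sizeof(float)
    float diffuse[4]; // diffuse color, offset 5 * sizeof(float)
};
```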

To bind the texture coordinate data, we use the glTexCoordPointer method and to bind the color data we use the glColorPointer method.

To actually tell OpenGL to pass that data we just bound to the GPU for rendering, we use the glDrawArrays method which will render a non-indexed (see glDrawElements documentation to render indexed data) primitive list. In this case we want to render GL_QUADS starting at the first index in the buffer up to the total number of vertices in our vertex buffer.

Then we have to restore the OpenGL state so we disable the client states we’ve enabled before and disable the texture used for the particle.

The glPopAttrib function is used to restore the attribute bits that we pushed at the beginning of this method.

In debug mode, we may want to visualize the particle emitter that is used for this effect, for that I created debug render functions that will draw the dimensions of the emitter.


There are always various things you could do to improve the performance and functionality of your applications. In this section, I would like to discuss a few things that can be done to improve these characteristics.

Using STL Container Classes

You may have noticed a few things about my implementation that may not be ideal. For example, you may cringe at the fact that I’m using the STL vector class to store my particle and vertex buffers. There are several reasons why you might not want to use this container class to store buffers that will be accessed very often. The main reason is that the array index operator (operator[](size_type _Pos)) can be slow to do the element look-up because of the bounds checking that the STL container classes perform. In debug mode, this might be desirable to ensure you don’t try to access an element in the container that is out-of-range. However, for a production release version of your software you may want to disable this bounds checking to improve the performance of your software. The _SECURE_SCL macro is used to check if these buffer-overrun errors could occur in the STL container. By default this value is “1”, which enables the bounds checks on STL containers. By setting this macro to “0” (in your project’s preprocessor definitions, for example) the bounds checking will be disabled and, in release mode, your program will perform no slower than if you were simply performing static-array accesses.

The other reason you may not want to use STL containers is that if you try to add more elements than the container has capacity for, the container will be resized and a reallocation will occur, copying every element in the list. If you are only storing pointers to objects, this may not be too expensive, but if you are storing elements by value (which is the case in my program), you may not want this reallocation to occur (ever). To eliminate the need for a reallocation, I used the resize method in the constructor to ensure that the buffers will always have the right number of particles and I never use push_back to add new particles to the buffers. If the container already has the number of elements that are specified in the resize method, the container size will not be modified in any way.
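This behavior is easy to verify with a small sketch (unrelated to the demo’s actual buffers): resizing a vector to its current size is a no-op, so pre-sizing in the constructor and never calling push_back means the storage is allocated exactly once.

```cpp
#include <cassert>
#include <vector>

// Demonstrates that resize() to the current size does not reallocate:
// the data pointer is unchanged, so no elements are copied or moved.
void DemoNoRealloc()
{
    std::vector<int> buffer;
    buffer.resize(1000);           // one allocation up front
    int* data = buffer.data();
    buffer.resize(1000);           // same size: no reallocation occurs
    assert(buffer.data() == data); // storage was not moved
    assert(buffer.size() == 1000);
}
```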

With these considerations in mind, I would argue to any C-programming enthusiast who only believes in pure C-style static arrays that the same performance can be achieved with the STL vector container types as long as you know how to use it correctly.

Taking full advantage of the power of the GPU

Another point of discussion when trying to build particle systems is performance (I suppose this is always a point of discussion in all programming circles). There are several areas where the performance of the particle system can be improved. The performance of the update loop can be improved by multi-threading the update loop. The performance of the renderer can be improved by eliminating the need to transfer the vertex buffer to the GPU each frame. But how can we prevent this if we need to update the position and orientation of each quad every frame? Surely we have to transfer the updated vertex information to the GPU in order to render the quads… right? Well, no actually.

If we could create a vertex program and a fragment program that is actually used to update the velocity and position of the particle, then we could store the particle buffer data in textures, where each pixel of a texture stores either the position, the velocity, or the color of each particle in the simulation. We set the appropriate texture as a render target for the fragment program that updates the particle properties, update the texture’s values, then use the textures in the vertex program to build the final vertices to render the particle effect. The only limitation is that the vertex program must support texture fetching, which has been available in shader programs since shader model 3.0. Shader model 3.0 is available on all retail graphics adapters released since 2004, so I think we can assume that a large population of users have graphics adapters that support vertex texture fetching.

A more recent approach to this performance issue would be to use a general purpose GPU programming language (GPGPU) like nVidia’s CUDA or the open standard OpenCL to update the particle buffer and vertex buffer directly on the GPU so that the memory never has to be moved off the GPU. Since this method requires a more recent GPU to work, it isn’t guaranteed to work on everyone’s computer and you would still need to provide a fall-back method for a user’s computer that doesn’t support these technologies.

However, both of these methods would provide a significant performance improvement because of the massively parallelized architecture of the GPU. Instead of simply processing one particle at a time in the update loop, we could process every particle at the same time in just as much time as it would cost to update one or two particles (depending on the number of particles in your simulation).

Unfortunately vertex programs, fragment programs and programming directly on the GPU are beyond the scope of this article so they won’t be discussed here, but I’d love to hear your opinion about it…

Let me know what you think

If you have any comments about these topics, or any other discussion about the implementation of this particle effect, please feel free to leave a comment at the end of this post. I will try to reply to your comments as quickly as I possibly can.


We have now seen one possible implementation for a particle effect implemented in the fixed-function pipeline of OpenGL. There are many improvements that can be made to this implementation. On standard hardware with the release build settings, this particle effect implementation should be able to simulate particle effects with 100,000 particles with real-time frame rates. I would challenge the reader to try to create a particle effect that could render over one million particles at real-time frame-rates (at least 30 fps).


To get the inspiration for this demo, I did refer to the NeHe tutorial 19 “Particle Engine Using Triangle Strips”. The author there decided to use triangle strips and his argument for doing so was that he could reduce the number of vertices that are passed to the GPU (he needed 4 vertices to represent each particle, but using GL_QUADS also only requires 4 vertices, so I’m not sure what he was using before he decided to use GL_TRIANGLE_STRIP). He also assumes that the camera is never moved, so he could avoid doing the inverse camera rotation on the quads, but in my demo the camera can move so I came up with my own implementation.

Download the Source

You can download the source code for this article here:


31 thoughts on “Simulating Particle Effects using OpenGL”

    • Alex, It seems that Google Docs have changed the interface for zip files. It now shows the content of the zip file… This is a bit annoying as the user might think that they have to download each file individually.
      There is an option to save the original document (using the short-cut key Ctrl-S may also work).

      I tried this in both Chrome and Internet Explorer 9 and I didn’t have any problems viewing/downloading the zip file. The only issue you may experience is that you have to be logged in with a google account to access the files even though the files are set to “Public on the web”.

  1. “To actually tell OpenGL to pass that data we just bound to the GPU for rendering, we use the glDrawArrays method which will render a non-indexed (see glDrawElements documentation to render indexed data) primitive list. In this case we want to render GL_QUADS starting at the first index in the buffer up to m_Particles.size() primitives – notice this is the number of primitives to draw, not the number of vertices!”

    Are you sure this last part is correct? I think you are supposed to specify the number of vertices instead of primitives.

  2. >> the array index operator (operator[](size_type _Pos)) method is slow to do the element look-up because of the bounds checking that the STL container classes perform.

    I don’t believe this is correct. STL vectors perform bounds checking when using vector::at(size_type n), which throws an out_of_range exception for bad indices, while vector::operator[](size_type n) behaves like a typical array access and returns whatever values are currently in memory for indices which are out of bounds.

    While C purists may disagree, I don’t believe that there are really any significant performance concerns that should discourage people from using STL vectors in place of arrays, so long as due thought is taken for how to use them (such as using vector::reserve() as you suggest to prevent resizing the underlying array).

    • Brian,

      The operator[] function in my implementation looks like this:

      reference operator[](size_type _Pos)
      {	// subscript mutable sequence
      #if _HAS_ITERATOR_DEBUGGING
          if (size() <= _Pos)
          {
              _DEBUG_ERROR("vector subscript out of range");
              _SCL_SECURE_OUT_OF_RANGE;
          }
      #endif /* _HAS_ITERATOR_DEBUGGING */
          _SCL_SECURE_VALIDATE_RANGE(_Pos < size());
          return (*(_Myfirst + _Pos));
      }

      Which has several checks for out-of-bounds errors. _HAS_ITERATOR_DEBUGGING is enabled in Debug mode by default, but the _SCL_SECURE_VALIDATE_RANGE macro will always be performed in Debug and Release builds unless _SECURE_SCL is explicitly set to 0 in the project's preprocessor definitions.

  3. Hi. I’m studying with your source code, but your reshape code doesn’t work.
    If I call the ApplyProjectionTransform function to modify the window’s size or to manipulate the fovy argument, everything disappears and only a black screen appears. Could you tell me what the problem is?

    • Aiden,

      This is fixed by modifying the ApplyProjectionTransform function in the Camera.cpp file:

      I’ve updated the source code to include this fix so if you download it again, you’ll get this change.

      Thanks for pointing this out!

    • I just tested the link and it seems to be working fine. Can you be more specific about the problem you are experiencing? Some people have reported that the link can be downloaded fine using the Google Chrome web browser. If IE or FireFox are not working for you, please try Chrome.

  4. Hi, let me first say that this particle system tutorials is one of the best I’ve seen so far. I was just wondering, how would you go about making a vertex and fragment shader for this? I have tried on my own, but can’t seem to get it working properly

  5. Hi Jeremiah.

    I am combining this project with one of your other projects “Multi-textured Terrain in OpenGL”. The goal is to implement some particle system on the terrain, like a fire or waterfall. I managed to merge the projects successfully, and the bounding volumes of both CubeEmitter and SphereEmitter are drawn correctly in debug mode. Though the particles themselves are not visible at all. The standalone ParticleEffect project worked like a charm so my machine should be all right.

    I’m new to OpenGL and I’ve had this problem for weeks now, but can’t seem to find the cause. The main difference between both projects is the use of TerrainCamera instead of PivotCamera, so I still think the problem is there.

    I linked the merged project. I would very much appreciate it if you would take a moment and think about what could be the cause, or even take a look at the code. Thanks in advance.

    • Stan,

      I vaguely remember some issue with the render states set by the terrain class that broke the rendering of the particles but I can’t remember the details.

      Do the particles render if you don’t render the terrain?

      • Jeremiah,

        Yes, the skydome and particle effects are rendered when the terrain is not rendered.

        Note that commenting out g_Terrain.Render() is not enough. Also g_Terrain.LoadHeightmap must not be executed.

        Thanks in advance

        • Jeremiah,

          I fixed the problem by implementing the Render function of the ParticleEffect in a similar way as Terrain, namely by using glGenBuffersARB instead of just a std::vector VertexBuffer.

          Thanks for sharing your projects!

  6. Hello,
    Really helpful particle system that you have put up.
    I am developing a project for fireworks simulation and wanted to use this.
    Can you please help me with what changes I’ll need to make in order to add trails and add the gravity property (to the update function, I believe). It would be really helpful if you give your insights on how I should go about this.
    My idea is something like this :
    Thank You.

  7. hi,
    how can I emit multiple bursts in one window? Is there any method to do so by modifying your source code?

    • Abdul,

      It’s a simple matter of creating multiple instances of the ParticleEffect class. Each ParticleEffect instance has its own m_LocalToWorldMatrix matrix which determines its position and orientation in the world. I realize that my example does not provide accessor methods for the m_LocalToWorldMatrix but it is a trivial matter to add these methods.

  8. Hi, I can’t find the place where the next “emits” take place …
    The first one is shot out and a second later the next one comes, one after another. Where is the code responsible for that?
    I want to trigger an emit on a key-press action.

    • Szymon,

      Particles are emitted in the ParticleEffect::Update method:
