Particle Systems

In this chapter, we will see how to implement a particle system with all the update and animation code running entirely on the GPU, inside pixel shaders.

The starting code can either be a new project, or the project at the end of the previous chapter. The bonus section will combine the particle systems with the terrain rendering, but until then, we have no need of any of the previous code. If you start with a new project, please add a camera component, and create two folders, Textures and Shaders.

The technique uses two render targets of the same size. The textures associated with these render targets hold the data for the particle system: each pixel in the texture stores the information of exactly one particle. One of the render targets holds velocities, while the other holds positions. Using Render To Texture, these textures are updated, and then, when the particles are drawn, the position information is extracted using vertex texture fetch. The position render target also stores, in its fourth component (w), the life of the particle. When the particle's life passes a certain value, it dies, and can be reborn again. The updating of the position and velocity textures is done in pixel shaders, similar to how morphing was done in the previous chapter.

To begin, please add flare.dds to the Textures folder, using the Texture(mipmapped) processor. Next, we open Game1.cs, and add the following members.

Texture2D randomTexture;                  // stores a series of random values, read from various shaders
Texture2D particleTexture;                // texture used to draw the particles
RenderTarget2D positionRT;                // render target that holds the positions of the particles
RenderTarget2D velocityRT;                // render target that holds the velocities of the particles
RenderTarget2D temporaryRT;               // temporary render target, needed when updating the other render targets
DepthStencilBuffer simulationDepthBuffer; // depth buffer of the same size as the render targets, used when updating the particle system

VertexBuffer particlesVB;                 // vertex buffer that holds the particle system's vertices
Effect renderParticleEffect;              // effect file used to render the particles
Effect physicsEffect;                     // effect file used to update the physics (positions and velocities)
SpriteBatch spriteBatch;                  // sprite batch used for 2D drawing
bool isPhysicsReset;                      // false until the velocity and position textures have been reset to their initial values
int particleCount = 512;                  // dimension of the render targets; we will have particleCount * particleCount particles

Oh my, that’s a lot of members! As the comments say, randomTexture will be filled with random values used in the simulation. The temporaryRT render target is needed when updating the positions and velocities, because XNA cannot read from and write to the same texture at the same time. For example, when the new positions are computed from the old positions, they are written into this temporary render target, which is then copied back into the position render target. The vertex buffer holds one vertex per particle, and each vertex carries the data the shaders need. If we wanted, we could use several vertex buffers, each representing an individual particle system, all mapping to different portions of the same position and velocity render targets: all would be updated at the same time, but we could skip rendering some of the systems using frustum culling. You can try to implement this as an exercise. The two Effects are obvious: one is used to draw the particles, and only reads the positions inside the vertex shader, while the other is used to reset and update the positions and velocities. Finally, the particleCount variable is the size of the render targets, so it should be a power of 2. In the end we will have particleCount * particleCount particles.

Now, we need to go into the LoadGraphicsContent function, and initialize them. Some are straightforward.

particleTexture = content.Load<Texture2D>("Textures\\flare");
spriteBatch = new SpriteBatch(graphics.GraphicsDevice);

temporaryRT = new RenderTarget2D(graphics.GraphicsDevice, particleCount, particleCount, 1, SurfaceFormat.Vector4, MultiSampleType.None, 0);
positionRT = new RenderTarget2D(graphics.GraphicsDevice, particleCount, particleCount, 1, SurfaceFormat.Vector4, MultiSampleType.None, 0);
velocityRT = new RenderTarget2D(graphics.GraphicsDevice, particleCount, particleCount, 1, SurfaceFormat.Vector4, MultiSampleType.None, 0);
simulationDepthBuffer = new DepthStencilBuffer(graphics.GraphicsDevice, particleCount, particleCount, graphics.GraphicsDevice.DepthStencilBuffer.Format);
isPhysicsReset = false;

The SurfaceFormat for the render targets is Vector4. On my video card (GeForce 7600), Vector4 was the only supported format for using vertex texture fetch (HalfVector4 is not supported, and Single only holds one number, and we need 3 for position). If you write this for the Xbox360, you’ll need to use HalfVector4, since Vector4 generates a run-time error, saying that it is not supported. The physics will need to be reset when rebuilding the render targets, so isPhysicsReset is set to false.

The code for initializing the vertex buffer for the particles is this:

VertexPositionColor[] vertices = new VertexPositionColor[particleCount * particleCount];
Random rand = new Random();
for (int i = 0; i < particleCount; i++)
{
    for (int j = 0; j < particleCount; j++)
    {
        VertexPositionColor vert = new VertexPositionColor();

        vert.Color = new Color(150, 150, (byte)(200 + rand.Next(50)));
        vert.Position = new Vector3();
        vert.Position.X = (float)i / (float)particleCount;
        vert.Position.Y = (float)j / (float)particleCount;
        vertices[i * particleCount + j] = vert;
    }
}
particlesVB = new VertexBuffer(graphics.GraphicsDevice, typeof(VertexPositionColor), particleCount * particleCount, ResourceUsage.Points);
particlesVB.SetData<VertexPositionColor>(vertices);

We will use Point Sprites, so we need only one vertex per particle. For each particle, we set a random color; in this code, I chose a random “white-blue-ish” color. Instead of the position, we store the texture coordinates that will be used to fetch the real position from the vertex texture. Notice that we didn’t set Position.Z, since we already have all the information we need. It could be used to pass some other per-particle value, like a choice between different textures, or some sort of blinking rate; there are many ways it could be used, but for this tutorial, we will just ignore it. The vertex buffer is created using ResourceUsage.Points, and the data is copied into it.
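Since the vertex’s Position.X/Y is really a texture coordinate, the mapping between a particle’s grid index and the texel it owns can be sketched like this (Python, illustrative only — not part of the project code):

```python
# Sketch: how a particle's (i, j) grid index maps to the UV stored in the
# vertex, and which texel that UV selects when the position texture is
# sampled with point filtering.
def particle_uv(i, j, particle_count):
    """UV stored in Position.X/Y for particle (i, j), as in the C# loop above."""
    return (i / particle_count, j / particle_count)

def texel_from_uv(u, v, particle_count):
    """Texel selected by point-sampling a particle_count x particle_count texture."""
    return (int(u * particle_count), int(v * particle_count))

particle_count = 512
u, v = particle_uv(37, 481, particle_count)
# The roundtrip is exact because particle_count is a power of 2
assert texel_from_uv(u, v, particle_count) == (37, 481)
```

This is why particleCount should be a power of 2: the UVs then land exactly on texel centers when sampled with point filtering, so each vertex reads back its own particle’s data.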

Next, we will create the randomTexture, and fill it with random values between -0.5 and 0.5. This way, X, Y and Z cover a cube with edge length 1, centered on the origin. The size of this texture can be chosen at your discretion.

randomTexture = new Texture2D(graphics.GraphicsDevice, 128, 128, 1, ResourceUsage.None, SurfaceFormat.Vector4);
Vector4[] pointsarray = new Vector4[128 * 128];
for (int i = 0; i < 128 * 128; i++)
{
    pointsarray[i] = new Vector4();
    pointsarray[i].X = (float)rand.NextDouble() - 0.5f;
    pointsarray[i].Y = (float)rand.NextDouble() - 0.5f;
    pointsarray[i].Z = (float)rand.NextDouble() - 0.5f;
    pointsarray[i].W = (float)rand.NextDouble() - 0.5f;
}
randomTexture.SetData<Vector4>(pointsarray);

Let’s write some shader code. Create a new file, in the Shaders folder and name it Particle.fx. We need the standard matrices. Then, we add the textures, one for drawing the particle, and one from which we will read the positions of the particles.

float4x4 view;
float4x4 proj;
float4x4 world;

float sizeModifier : PARTICLE_SIZE = 3.5f;

texture textureMap : DiffuseMap;  // texture for scene rendering
sampler textureSampler= sampler_state
{
    Texture = <textureMap>;
    AddressU  = CLAMP;
    AddressV  = CLAMP;
    MIPFILTER = LINEAR;
    MINFILTER = LINEAR;
    MAGFILTER = LINEAR;
};

texture positionMap;                                // texture for positions, used by VTF
sampler positionSampler = sampler_state
{
    Texture   = <positionMap>;
    MipFilter = None;
    MinFilter = Point;
    MagFilter = Point;
    AddressU  = Clamp;
    AddressV  = Clamp;
};

The sizeModifier variable scales the size of all the particles. After writing all the code, you can play with this value until you are satisfied with how the particles look. Next, the input and output structures:

struct VS_INPUT
{
    float4 vertexData : POSITION;
    float4 color      : COLOR0;
};

struct VS_OUTPUT
{
    float4 position : POSITION;
    float4 color    : COLOR0;
    float  Size     : PSIZE0;
};

struct PS_INPUT
{
#ifdef XBOX
    float4 textureCoordinate : SPRITETEXCOORD;
#else
    float2 textureCoordinate : TEXCOORD0;
#endif
    float4 Color : COLOR0;
};

The #ifdef block is needed because Point Sprites behave differently on the Xbox 360 than they do on Windows. See this for more information. Next, in the vertex shader, we compute the worldViewProjection matrix, we extract the particle’s position from the position texture, and we transform it. We also need to calculate the size of the particle in relation to the projection matrix and the screen height, so the particles are resolution independent.

float screenHeight = 600;

VS_OUTPUT Transform(VS_INPUT In)
{
    VS_OUTPUT Out = (VS_OUTPUT)0;
    float4x4 worldView = mul(world, view);
    float4x4 worldViewProj = mul(worldView, proj);
    // Fetch the particle's position from the position texture
    float4 realPosition = tex2Dlod(positionSampler, float4(In.vertexData.x, In.vertexData.y, 0, 0));
    Out.color = In.color;
    // The value read into w is the life of the particle, which doesn't interest us here
    realPosition.w = 1;
    // Transform the position from object space to homogeneous projection space
    Out.position = mul(realPosition, worldViewProj);
    Out.Size = sizeModifier * proj._m11 / Out.position.w * screenHeight / 2;
    return Out;
}
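The size formula deserves a closer look. A small Python sketch (assuming the projection comes from Matrix.CreatePerspectiveFieldOfView, so proj._m11 = 1 / tan(fovY / 2)) shows that the computed point size scales with the screen height and falls off with depth, which is exactly what makes the particles resolution independent:

```python
import math

# Sketch of the point-size formula from the vertex shader:
#   Size = sizeModifier * proj._m11 / position.w * screenHeight / 2
# For a perspective projection, proj._m11 = 1 / tan(fovY / 2), and position.w
# after the transform is the view-space depth of the particle.
def point_size_pixels(size_modifier, fov_y, depth, screen_height):
    m11 = 1.0 / math.tan(fov_y / 2.0)
    return size_modifier * m11 / depth * screen_height / 2.0

fov = math.pi / 4
# Twice the screen height -> twice the pixel size (same fraction of the screen)
assert point_size_pixels(3.5, fov, 100.0, 1200) == 2 * point_size_pixels(3.5, fov, 100.0, 600)
# Twice the depth -> half the pixel size
assert point_size_pixels(3.5, fov, 200.0, 600) == point_size_pixels(3.5, fov, 100.0, 600) / 2
```

In other words, each particle behaves like a quad of fixed world-space size (sizeModifier), projected into pixels.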

Finally, the pixel shader draws the particle with the specified color. One thing you could do here is use the life of the particle in the coloring process, either by fading out the alpha channel, or by using a 1D look-up texture to colorize the particle. The pixel shader for drawing the particle:

float4 ApplyTexture(PS_INPUT input) : COLOR
{
    float2 textureCoordinate;
#ifdef XBOX
    textureCoordinate = abs(input.textureCoordinate.zw);
#else
    textureCoordinate = input.textureCoordinate.xy;
#endif
    float4 col = tex2D(textureSampler, textureCoordinate) * input.Color;
    return col;
}

technique TransformAndTexture
{
    pass P0
    {
        vertexShader = compile vs_3_0 Transform();
        pixelShader  = compile ps_3_0 ApplyTexture();
    }
}

Now, let’s create a new effect file, called ParticlePhysics.fx, in the Shaders folder. In this shader, we need four textures. The temporary texture is used when copying from the temporary render target to one of the others. The position and velocity textures contain the positions and velocities of the particles, and the random texture is the same texture we generated earlier.

texture temporaryMap;
sampler temporarySampler : register(s0)  = sampler_state
{
    Texture   = <temporaryMap>;
    MipFilter = None;
    MinFilter = Point;
    MagFilter = Point;
    AddressU  = Clamp;
    AddressV  = Clamp;
};

texture positionMap;
sampler positionSampler  = sampler_state
{
    Texture   = <positionMap>;
    MipFilter = None;
    MinFilter = Point;
    MagFilter = Point;
    AddressU  = Clamp;
    AddressV  = Clamp;
};

texture velocityMap;
sampler velocitySampler = sampler_state
{
    Texture   = <velocityMap>;
    MipFilter = None;
    MinFilter = Point;
    MagFilter = Point;
    AddressU  = Clamp;
    AddressV  = Clamp;
};

texture randomMap;
sampler randomSampler : register(s0)  = sampler_state
{
    Texture   = <randomMap>;
    MipFilter = None;
    MinFilter = Point;
    MagFilter = Point;
    AddressU  = Wrap;
    AddressV  = Wrap;
};

We need a variable for the maximum life a particle can have. Remember that the position texture stores four floating point numbers for each particle: three of them hold the position, and the last one holds the life. Resetting is done in two pixel shaders. They set the position to a new initial position, the life to a random value between 0 and maxLife, and the initial velocity to 0. The strange number in ResetPositionsPS (10.2484) is just an arbitrary value; multiplying the look-up coordinate by it seems to scramble the particles a little more.

float maxLife = 5.0f;

float3 generateNewPosition(float2 uv)
{
    float4 rand = tex2D(randomSampler, uv);
    return float3(rand.x * 1024, 250, rand.y * 1024);
}

float4 ResetPositionsPS(in float2 uv : TEXCOORD0) : COLOR
{
    return float4(generateNewPosition(uv), maxLife * frac(tex2D(randomSampler, 10.2484 * uv).w));
}

float4 ResetVelocitiesPS(in float2 uv : TEXCOORD0) : COLOR
{
    return float4(0, 0, 0, 0);
}

technique ResetPositions
{
    pass P0
    {
        pixelShader = compile ps_3_0 ResetPositionsPS();
    }
}

technique ResetVelocities
{
    pass P0
    {
        pixelShader = compile ps_3_0 ResetVelocitiesPS();
    }
}

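It is worth seeing why ResetPositionsPS seeds the life with maxLife * frac(...) rather than 0. A quick Python sketch (illustrative only) of the effect:

```python
import random

# Sketch: seeding each particle's life with a random value in [0, maxLife)
# staggers the respawns. If every particle started at age 0, they would all
# die (and be reborn) on the same frame, producing a visible pulse.
random.seed(1)
max_life = 5.0
# random.random() is already a fractional part in [0, 1), like frac() in HLSL
lives = [max_life * random.random() for _ in range(1000)]

assert all(0.0 <= life < max_life for life in lives)
# Many distinct remaining lifetimes -> deaths are spread over time
assert len({round(max_life - life, 3) for life in lives}) > 100
```
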
In the generateNewPosition function, we set the height to 250, because the effect we will try to achieve is that of “falling particles”. This part of the code should be written however you wish, in order to achieve a certain effect. For example, if we use the following code, the particles would appear on the boundary of a sphere:

float3 generateNewPosition(float2 uv)
{
    // take only xyz; tex2D returns a float4
    float3 rand = tex2D(randomSampler, uv).xyz;
    rand = normalize(rand);
    return rand * 128;
}
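A quick Python sketch (illustrative, using the same [-0.5, 0.5] random range as our random texture) confirms that this variant places every particle on the surface of a radius-128 sphere; the directions are slightly cube-biased, which is usually fine for a visual effect:

```python
import math
import random

# Sketch of the sphere variant of generateNewPosition: normalize a random
# vector with components in [-0.5, 0.5] and scale by the sphere radius.
random.seed(7)

def new_sphere_position(radius=128.0):
    v = [random.random() - 0.5 for _ in range(3)]
    length = math.sqrt(sum(c * c for c in v))
    return [c / length * radius for c in v]

for _ in range(100):
    p = new_sphere_position()
    # Every generated point lies on the sphere's surface
    assert abs(math.sqrt(sum(c * c for c in p)) - 128.0) < 1e-6
```
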

For this tutorial, we’ll just stick to creating them with the first version, on a plane.

We saw earlier that we will need a pixel shader that simply copies one texture to the output. This will be used when copying from the temporary render target to one of the other render targets.

float4 CopyTexturePS(in float2 uv : TEXCOORD0) : COLOR
{
    return tex2D(temporarySampler, uv);
}

technique CopyTexture
{
    pass P0
    {
        pixelShader = compile ps_3_0 CopyTexturePS();
    }
}

Now, for the fun part. The update functions will need to know the elapsedTime, so we can have framerate independent animations. The code to update the positions is as follows:

float elapsedTime = 0.0f;

float4 UpdatePositionsPS(in float2 uv : TEXCOORD0) : COLOR
{
    float4 pos = tex2D(positionSampler, uv);
    // check for life
    if (pos.w >= maxLife)
    {
        // the particle is dead, so we need to revive it
        // Restart time
        pos.w = 0;
        // Compute new position
        pos.xyz = generateNewPosition(uv);
    }
    else
    {
        // Update particle position and life
        float4 velocity = tex2D(velocitySampler, uv);
        pos.xyz += elapsedTime * velocity.xyz;
        pos.w += elapsedTime;
    }
    return pos;
}
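The same update rule, written out for a single particle in plain Python (illustrative only; respawn_position stands in for generateNewPosition):

```python
# Sketch of the per-texel rule from UpdatePositionsPS, for one particle.
# pos is (x, y, z, life); velocity is (vx, vy, vz).
def update_position(pos, velocity, elapsed, max_life, respawn_position):
    x, y, z, life = pos
    if life >= max_life:
        # Particle is dead: restart its clock and respawn it
        return (*respawn_position, 0.0)
    # Alive: forward Euler step, then age the particle
    vx, vy, vz = velocity
    return (x + elapsed * vx, y + elapsed * vy, z + elapsed * vz, life + elapsed)

# Alive particle falls and ages
p = update_position((0.0, 10.0, 0.0, 1.0), (0.0, -2.0, 0.0), 0.5, 5.0, (0, 250, 0))
assert p == (0.0, 9.0, 0.0, 1.5)
# Dead particle respawns at the new position with its clock restarted
assert update_position((0.0, 0.0, 0.0, 6.0), (1.0, 1.0, 1.0), 0.5, 5.0, (0, 250, 0)) == (0, 250, 0, 0.0)
```

On the GPU this exact logic runs once per texel, i.e. once per particle, every frame.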

When updating the position, we first check whether the particle’s age is greater than the maximum life. If it is, we reset the life of the particle to zero and generate a new position for it. When updating velocities, we use the same reasoning: if the life is greater than the maximum life, we set a new random velocity, otherwise we update the velocity as we wish. This is the place to modify the code to make the system behave as you want, and achieve various effects. It is also where you would handle collisions and the influence of other forces (attractors and generators), passed as shader parameters. We will see an example of this later.

float4 UpdateVelocitiesPS(in float2 uv : TEXCOORD0) : COLOR
{
    float4 velocity = tex2D(velocitySampler, uv);
    float4 pos = tex2D(positionSampler, uv);

    if (pos.w >= maxLife)
    {
        // reset velocity with a pseudo-random value
        float4 rand = tex2D(randomSampler, uv);
        float4 rand2 = tex2D(randomSampler, uv + float2(rand.y, rand.w));
        velocity.xyz = rand2.xyz * 8 + 10.0 * rand.xyz;
    }
    else
    {
        // gravitational acceleration. Tweak it until you like the effect :)
        velocity.y -= 20.0 * elapsedTime;
    }
    return velocity;
}

All we need to do is add the techniques, and we’re done with writing HLSL code, for now.

technique UpdatePositions
 {
     pass P0
     {
         pixelShader  = compile ps_3_0 UpdatePositionsPS();
     }
 }

 technique UpdateVelocities
 {
     pass P0
     {
         pixelShader  = compile ps_3_0 UpdateVelocitiesPS();
     }
 }

Now let’s get back to writing C# code. Let’s load the effect files in the corresponding variables.

protected override void LoadGraphicsContent(bool loadAllContent)
        {
            [...]
            physicsEffect = content.Load<Effect>("Shaders\\ParticlePhysics");
            renderParticleEffect = content.Load<Effect>("Shaders\\Particle");
        }

We will use the techniques from the ParticlePhysics.fx file a lot, so we need a helper function that applies them and writes the result wherever we need it. As parameters, we pass the technique to use, and the render target the result should be written to. We apply the technique, writing to the temporary render target, and then use the “CopyTexture” technique to copy the result to the desired render target. If the physics still needs resetting (isPhysicsReset is false), then we can’t call positionRT.GetTexture(), because it would generate an error if the device has just been reset and the render targets were just created, but never resolved.

private void DoPhysicsPass(string technique, RenderTarget2D resultTarget)
        {
            RenderTarget2D oldRT = graphics.GraphicsDevice.GetRenderTarget(0) as RenderTarget2D;
            DepthStencilBuffer oldDS = graphics.GraphicsDevice.DepthStencilBuffer;

            graphics.GraphicsDevice.DepthStencilBuffer = simulationDepthBuffer;
            graphics.GraphicsDevice.SetRenderTarget(0, temporaryRT);

            graphics.GraphicsDevice.Clear(ClearOptions.Target | ClearOptions.DepthBuffer, Color.White, 1, 0);

            spriteBatch.Begin(SpriteBlendMode.None,
                              SpriteSortMode.Immediate,
                              SaveStateMode.None);

            physicsEffect.CurrentTechnique = physicsEffect.Techniques[technique];
            physicsEffect.Begin();

            if (isPhysicsReset)
            {
                physicsEffect.Parameters["positionMap"].SetValue(positionRT.GetTexture());
                physicsEffect.Parameters["velocityMap"].SetValue(velocityRT.GetTexture());
            }

            physicsEffect.CurrentTechnique.Passes[0].Begin();
            // the positionMap and velocityMap are passed through parameters
            // We need to pass a texture to the spriteBatch.Draw() function, even if we won't be using it some times, so we pass the randomTexture
            spriteBatch.Draw(randomTexture, new Rectangle(0, 0, particleCount, particleCount), Color.White);
            physicsEffect.CurrentTechnique.Passes[0].End();
            physicsEffect.End();

            spriteBatch.End();
            graphics.GraphicsDevice.ResolveRenderTarget(0);

            graphics.GraphicsDevice.SetRenderTarget(0, resultTarget);
            spriteBatch.Begin(SpriteBlendMode.None,
                              SpriteSortMode.Immediate,
                              SaveStateMode.None);

            physicsEffect.CurrentTechnique = physicsEffect.Techniques["CopyTexture"];
            physicsEffect.Begin();
            physicsEffect.CurrentTechnique.Passes[0].Begin();
            spriteBatch.Draw(temporaryRT.GetTexture(), new Rectangle(0, 0, particleCount, particleCount), Color.White);
            physicsEffect.CurrentTechnique.Passes[0].End();
            physicsEffect.End();
            spriteBatch.End();
            graphics.GraphicsDevice.ResolveRenderTarget(0);
            graphics.GraphicsDevice.SetRenderTarget(0, oldRT);
            graphics.GraphicsDevice.DepthStencilBuffer = oldDS;
        }

For example, if we would want to use the UpdateVelocities technique, and write the result to the velocity render target, the call would be DoPhysicsPass(“UpdateVelocities”, velocityRT).
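The helper is a classic ping-pong: since a texture cannot be read and written in the same pass, the result goes through temporaryRT first. The idea, sketched in Python with lists standing in for render targets (illustrative only):

```python
# Sketch of the ping-pong pattern behind DoPhysicsPass. A function stands in
# for the pixel shader, lists stand in for render targets.
def physics_pass(shader, source, temporary, destination):
    # Pass 1: read from the source, write into the temporary target
    for i, value in enumerate(source):
        temporary[i] = shader(value)
    # Pass 2 ("CopyTexture"): copy the temporary target into the destination,
    # which may alias the source
    destination[:] = temporary

position_rt = [1.0, 2.0, 3.0]
temporary_rt = [0.0] * 3
# Destination aliases the source -- exactly why the temporary target is needed
physics_pass(lambda v: v * 2, position_rt, temporary_rt, position_rt)
assert position_rt == [2.0, 4.0, 6.0]
```

Without the intermediate copy, reading and writing positionRT in the same pass would be undefined, which is the restriction the C# helper works around.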

In the function that simulates the particle system, we set the elapsedTime parameter. If the physics needs resetting, we apply the ResetPositions and ResetVelocities techniques. On every frame, we update the velocities, and then the positions.
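The pass ordering described above can be sketched like this (Python pseudo-flow; DoPhysicsPass is stubbed out to record calls, and the names mirror the C# helper rather than being runnable XNA code):

```python
# Sketch of the pass sequence SimulateParticles must issue each frame.
passes = []

def do_physics_pass(technique, target):
    passes.append((technique, target))

def simulate_particles(physics_reset):
    if not physics_reset:
        # One-time reset of both textures to their initial values
        do_physics_pass("ResetPositions", "positionRT")
        do_physics_pass("ResetVelocities", "velocityRT")
        physics_reset = True
    # Velocities first, so the position update integrates this frame's velocity
    do_physics_pass("UpdateVelocities", "velocityRT")
    do_physics_pass("UpdatePositions", "positionRT")
    return physics_reset

assert simulate_particles(False) is True
assert [t for t, _ in passes] == ["ResetPositions", "ResetVelocities",
                                  "UpdateVelocities", "UpdatePositions"]
```
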

Now, we need to call this function and draw the particle system. This is done inside the Draw() function. We use additive blending, and disable writing to the depth buffer, so the particles blend correctly. For this reason, any other geometry needs to be drawn BEFORE the particles, so the depth information is already in the depth buffer and particles behind other geometry are rejected by the depth test.

protected override void Draw(GameTime gameTime)
        {

            graphics.GraphicsDevice.Clear(Color.Black);

            SimulateParticles(gameTime);
            // Clear again after the render-target passes: on the Xbox, the
            // back buffer contents are lost while other render targets are active
            graphics.GraphicsDevice.Clear(Color.Black);
           [...] //other drawing code we might have
           // set the parameters of the shader
            renderParticleEffect.Parameters["world"].SetValue(Matrix.Identity);
            renderParticleEffect.Parameters["view"].SetValue(camera.View);
            renderParticleEffect.Parameters["proj"].SetValue(camera.Projection);
            renderParticleEffect.Parameters["textureMap"].SetValue(particleTexture);
            renderParticleEffect.Parameters["positionMap"].SetValue(positionRT.GetTexture());
            renderParticleEffect.CommitChanges();
            graphics.GraphicsDevice.RenderState.AlphaBlendEnable = true;
            graphics.GraphicsDevice.RenderState.AlphaBlendOperation = BlendFunction.Add;
            graphics.GraphicsDevice.RenderState.DepthBufferWriteEnable = false;
            graphics.GraphicsDevice.RenderState.PointSpriteEnable = true;
            graphics.GraphicsDevice.RenderState.SourceBlend = Blend.SourceAlpha;
            graphics.GraphicsDevice.RenderState.DestinationBlend = Blend.One;
            using (VertexDeclaration decl = new VertexDeclaration(
                   graphics.GraphicsDevice, VertexPositionColor.VertexElements))
            {
                graphics.GraphicsDevice.VertexDeclaration = decl;

                renderParticleEffect.Begin();
                renderParticleEffect.CurrentTechnique.Passes[0].Begin();

                graphics.GraphicsDevice.Vertices[0].SetSource(particlesVB, 0, VertexPositionColor.SizeInBytes);
                graphics.GraphicsDevice.DrawPrimitives(PrimitiveType.PointList, 0, particleCount * particleCount);

                renderParticleEffect.CurrentTechnique.Passes[0].End();

                renderParticleEffect.End();

            }
            graphics.GraphicsDevice.RenderState.PointSpriteEnable = false;
            graphics.GraphicsDevice.RenderState.AlphaBlendEnable = false;
            graphics.GraphicsDevice.RenderState.DepthBufferWriteEnable = true;

}

If you run the code now, you should see something like this:

[image: particles]

Let’s see what would it take to add some wind. Inside ParticlePhysics.fx, we need to add some parameters that define the wind, and then use them inside UpdateVelocitiesPS to modify the velocity of the particles.

float3 windDirection;
float windStrength;

float4 UpdateVelocitiesPS(in float2 uv : TEXCOORD0) : COLOR
{
    [...]
    if (pos.w >= maxLife)
    {
        [...]
    }
    else
    {
        // gravity
        velocity.y -= 20.0 * elapsedTime;
        velocity.xyz += windDirection * windStrength * elapsedTime;
    }
    return velocity;
}

In Game1.cs, go inside SimulateParticles, and add the following lines at the beginning of the function.

Vector2 leftStick = GamePad.GetState(PlayerIndex.One, GamePadDeadZone.Circular).ThumbSticks.Left;
if (leftStick.Length() > 0.2f)
{
    physicsEffect.Parameters["windStrength"].SetValue(leftStick.Length() * 50);
    leftStick.Normalize();
    // windDirection is declared as float3 in the shader, so we pass a Vector3
    physicsEffect.Parameters["windDirection"].SetValue(new Vector3(-leftStick.X, 0, leftStick.Y));
}
else
{
    physicsEffect.Parameters["windStrength"].SetValue(0.0f);
}

Now, start the application, and as you move the left stick, the wind will begin to blow. For a more complex scenario, you could simulate winds with a starting point, an attenuation function, and a strength, to add special effects to spells or fast objects; many of these could be passed through arrays. For collisions with objects, assuming you use bounding spheres, you could pass the spheres to the shader through arrays (center point and radius), check for collision with them, and modify the velocity accordingly. See the conclusions chapter for some discussion on further development.
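As an example of the attenuation idea, here is one possible point-source wind model (Python sketch; the 1/(1+d) falloff and every name here are illustrative choices, not part of the chapter’s code):

```python
import math

# Sketch: wind from a point source, whose strength fades with the particle's
# distance from the source. The falloff 1 / (1 + d) is an arbitrary choice.
def wind_at(pos, source, direction, strength):
    d = math.dist(pos, source)
    atten = 1.0 / (1.0 + d)
    return [c * strength * atten for c in direction]

near = wind_at((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 50.0)
far = wind_at((9.0, 0.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 50.0)
assert near[0] == 25.0   # strength 50 halved at distance 1
assert far[0] == 5.0     # and down to a tenth at distance 9
assert near[1] == near[2] == 0.0
```

In the shader version, source, direction and strength would be effect parameters, and the distance would be computed from the position read in UpdateVelocitiesPS.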

        Bonus: Terrain Collision

Let’s see how to detect collision with the terrain, and stop the particles in place. It’s easy to combine the two projects, and have the terrain rendered alongside the particle system. Unfortunately, the particles pass right through the terrain. To fix this, we will read from the displacement texture for the terrain, inside the pixel shader used to update the velocities and positions. For static terrain, it’s very simple: when detecting collision, set the velocity to 0. For dynamic terrain, however, we need to also update the position as the terrain morphs.

Since our last chapter had morphing terrain in it, we will write code to accommodate that. We begin by opening ParticlePhysics.fx and adding a sampler for the displacement map.

texture displacementMap;
sampler displacementSampler = sampler_state
{
        Texture   = <displacementMap>;
        MIPFILTER = LINEAR;
        MINFILTER = LINEAR;
        MAGFILTER = LINEAR;
        AddressU  = Clamp;
        AddressV  = Clamp;
};

Next, you’ll have to add the following code to UpdateVelocitiesPS and UpdatePositionsPS.

float4 UpdatePositionsPS(in float2 uv : TEXCOORD0) : COLOR
{
    float4 pos = tex2D(positionSampler, uv);
    if (pos.w >= maxLife)
    { [...] }
    else
    {
        float2 displacementUV = float2((pos.x / 4 + 128) / 256, (pos.z / 4 + 128) / 256);
        float height = 3 + tex2D(displacementSampler, displacementUV).x * 128;
        if (pos.y < height)
        {
            pos.y = height;
        }
        else
        {
            // Update particle position
            float4 velocity = tex2D(velocitySampler, uv);
            pos.xyz += elapsedTime * velocity.xyz;
        }
        pos.w += elapsedTime;
    }
    return pos;
}

float4 UpdateVelocitiesPS(in float2 uv : TEXCOORD0) : COLOR
{
    [...]
    if (pos.w >= maxLife)
    { [...] }
    else
    {
        float2 displacementUV = float2((pos.x / 4 + 128) / 256, (pos.z / 4 + 128) / 256);
        float height = 3 + tex2D(displacementSampler, displacementUV).x * 128;
        if (pos.y <= height)
        {
            velocity = 0;
        }
        else
        {
            // gravity
            velocity.y -= 20.0 * elapsedTime;
            velocity.xyz += windDirection * windStrength * elapsedTime;
        }
    }
    return velocity;
}

There are some very hard-coded values here. The 3 added when calculating the height is just a bias, to keep the particles a little above the ground. The 128 that the height is multiplied by is the maxHeight value from the VTFDisplacement.fx shader; it is trivial to make it a variable and set it from inside the application. As for the strange operation (pos.x / 4 + 128) / 256: it transforms a coordinate in world space into that point’s texture coordinate in the displacement texture. 4 is the equivalent of Grid.CellSize, and 256 is the equivalent of Grid.Dimension. This is the inverse of the formula used to compute the position in the GenerateStructures function in Grid.cs. I chose to hard-code it here for convenience, but in a real game these values should be passed as shader parameters.

vert.Position = new Vector3((i - dimension / 2.0f) * cellSize, 0, (j - dimension / 2.0f) * cellSize);
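A quick roundtrip check (Python, using the chapter’s values cellSize = 4 and dimension = 256) confirms that (pos.x / 4 + 128) / 256 inverts the formula above:

```python
# Sketch: verify that the shader's UV formula inverts the Grid.cs formula
#   position.x = (i - dimension / 2) * cellSize
cell_size, dimension = 4.0, 256.0

def grid_to_world(i):
    return (i - dimension / 2.0) * cell_size

def world_to_uv(x):
    # Same as (pos.x / 4 + 128) / 256 in the shader
    return (x / cell_size + dimension / 2.0) / dimension

for i in [0, 1, 100, 255]:
    x = grid_to_world(i)
    # The UV, scaled back to grid cells, recovers the original index
    assert world_to_uv(x) * dimension == i

# The grid corners map to the ends of the [0, 1] texture range
assert world_to_uv(grid_to_world(0)) == 0.0
assert world_to_uv(grid_to_world(256)) == 1.0
```
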

In Game1.cs we need to add one more line of code, in the SimulateParticles function.

physicsEffect.Parameters["displacementMap"].SetValue(morphRenderTarget.GetTexture());

After changing the clearing color to Color.SkyBlue, the final result looks like this:

[image: particlesCollision]

The collision response can, of course, be improved. We could also pass the normal map to the shaders in ParticlePhysics.fx, and use the normals for more physically correct responses, like bouncing, flowing down the slopes of the terrain, accumulating in puddles, and others. But here I only wanted to show the principle behind handling collisions inside the update shaders, so I decided not to go too deep into physics.

This chapter was a lengthy one, and I hope it wasn’t too boring. We saw how to hold information about particles in two textures, how to update those textures, and how to draw the particles using vertex texture fetch. We also added some very basic wind to the simulation, and basic collision with the terrain. There are more things that could be added: better physics, different system dynamics (a generalized system with emitters, attractors, etc.), several particle systems residing in the same textures. Feel free to experiment, and if you come up with something interesting, please drop me a message. Again, see the conclusions chapter for some discussion on further development.

The code for this chapter (including terrain collision) can be found in Chapter3.zip.

  • Bo

    Excellent :)

    Took the liberty to upgrade to XNA 2.0, hope its okay :)

    http://www.freeplay.dk/Chapter3_XNA2.zip

  • http://www.catalinzima.com Catalin Zima

    It’s ok, no problem :) Thanks for letting me know

  • http://nkari.uw.hu Carlos

    Nice work, pal!

  • http://www.catalinzima.com Catalin Zima

    Thanks, glad you like it

  • Taylor

    This is great code! I’m a huge fan of particles, and this code finally let me push into the 200k–1 million particle range.

    One quick question: I took the example as-is and changed the SurfaceFormats of the particle-related render targets from Vector4 to HalfVector4 so the code can run on the Xbox.

    When I do this, everything compiles and launches fine, but the screen and clear color all have a blue hue. I can change the clear color to black and I still get the blue hue on the Xbox. With the exact same code, it looks as expected on Windows.

    Any idea why this is? Forgive me if it’s something simple that I am overlooking.

  • http://www.catalinzima.com Catalin Zima

    In the Draw function, please add a clearing instruction right after the call to SimulateParticles(). I overlooked this by mistake. In XNA GS 1.0 (the article was written for that version), Windows and the Xbox had different behaviours regarding the contents of render targets: on Windows the contents are preserved while working with another RT, while on the Xbox they are lost. This is why you got a different result. Clearing the device after finishing work with the other render targets would yield a correct result:

    protected override void Draw(GameTime gameTime)
    {
        graphics.GraphicsDevice.Clear(Color.SkyBlue);

        Render2TextureMorph((float)Math.Sin(gameTime.TotalGameTime.TotalSeconds) * 0.5f + 0.5f);
        Render2TextureNormalCompute();

        SimulateParticles(gameTime);
        graphics.GraphicsDevice.Clear(Color.Black);
        [...]
    }

  • Taylor

    Thanks a lot! This worked like a charm.

  • Uberfly

    Hey, nice article… As always!

    Are these vertex-texture-fetch positions faster than manually updating a DynamicVertexBuffer?
    Would it still be fast to update the position texture’s colours from the CPU using Texture2D.SetData, based on positions from my physics engine? That way I could make much more dynamic particles using the physics engine instead of the GPU.

  • winipcfg

    Thanks for your great tutorial

    One problem for me: when the window is minimized and then reopened, the positions of all the particles change to (0,0,0).
    This problem does not happen when I switch to another task with alt-tab.

  • http://www.catalinzima.com Catalin Zima

    Uberfly: Yes, they are faster, since everything is done on the GPU and no communication between the CPU and GPU is needed.
    Using SetData might be ok. Try it out and see how well it goes.
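    For illustration only, a CPU-driven variant might look roughly like this. This is a sketch, not tested code: it replaces the positionRT render target with a plain Texture2D that is refilled every frame, and physicsPositions stands in for data coming from a hypothetical physics engine. The constructor arguments follow XNA 1.x conventions.

    ```csharp
    // Sketch (assumptions noted above): keep a CPU-side array and a plain
    // Texture2D instead of the positionRT render target.
    Vector4[] positions = new Vector4[particleCount * particleCount];
    Texture2D cpuPositionTexture = new Texture2D(
        graphics.GraphicsDevice, particleCount, particleCount, 1,
        ResourceUsage.None, SurfaceFormat.Vector4);

    // Each frame, after the physics engine has moved the particles:
    for (int i = 0; i < positions.Length; i++)
        positions[i] = new Vector4(physicsPositions[i], 1.0f); // w = particle life

    cpuPositionTexture.SetData(positions);
    // Then bind cpuPositionTexture wherever positionRT's texture was
    // sampled in the drawing shader.
    ```

    The trade-off is the per-frame CPU-to-GPU upload, which is exactly the traffic the render-to-texture approach avoids.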

    winipcfg: minimizing causes a device rebuild, rather than a device reset. I’m not really sure why the positions don’t reset to a plane, but that may be because the tutorial was written in XNA 1.0.

  • winipcfg

    Shawn Hargreaves said data in a render target is lost when the device is rebuilt
    http://forums.xna.com/forums/t/13070.aspx

    I think we could store the total elapsed time for the particle system each frame. Then, when the device is rebuilt, the total elapsed time could help us recreate the particles approximately.

  • http://www.catalinzima.com Catalin Zima

    Yes, you could do that, but it would probably be better to just quickly run a hidden simulation (of about 100 frames) after the device is reset
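    As a sketch of that idea, reusing DoPhysicsPass and physicsEffect from the tutorial (the 100-frame count and the fixed 1/60 s time step are arbitrary choices, not part of the original code):

    ```csharp
    // Sketch: after a device reset/rebuild, re-run the reset passes and
    // then fast-forward the simulation off-screen so the particles end up
    // roughly where they would have been.
    private void ReprimeSimulation()
    {
        DoPhysicsPass("ResetPositions", positionRT);
        DoPhysicsPass("ResetVelocities", velocityRT);

        for (int frame = 0; frame < 100; frame++)
        {
            // Use a fixed time step for the hidden frames.
            physicsEffect.Parameters["elapsedTime"].SetValue(1.0f / 60.0f);
            DoPhysicsPass("UpdateVelocities", velocityRT);
            DoPhysicsPass("UpdatePositions", positionRT);
        }
    }
    ```

    Nothing is presented to the screen during these passes, so the fast-forward is invisible to the player.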

  • Zeroconf

    Seems to be really great, however I can’t use it: I only have XNA 3.0. I tried to convert this tutorial (both this version and the XNA 2.0 version); everything compiles, but there is a run-time error at this line:

    graphics.GraphicsDevice.DrawPrimitives(PrimitiveType.PointList, 0, particleCount * particleCount);

    Error is:
    Both a valid vertex shader and pixel shader (or valid effect) must be set on the device before any draw operations may be performed.

    Indeed, GraphicsDevice.VertexShader is null (GraphicsDevice.PixelShader is not), but the effect is correctly compiled and loaded, and the VS and PS are compiled as version 3.0. If I replace the line:

    float4 realPosition = tex2Dlod ( positionSampler, float4(In.vertexData.x, In.vertexData.y, 0, 0));

    by this one:
    float4 realPosition = float4(0,0,0,0);

    there is no more error… Please help me, it will be so great…

  • http://www.catalinzima.com Catalin Zima

    What video card do you have? If your video card does not support Vertex Texture Fetch, then an error of that kind is likely to occur.

  • Zeroconf

    Oops, I forgot that: an ATI X1950 XT, which does normally support PS and VS 3.0… Thanks for the quick reply.

  • http://www.catalinzima.com Catalin Zima

    Well, I know for sure that VTF doesn’t work on DirectX 9 ATI cards, but the X1950 is a DirectX10 generation card, so the hardware should theoretically allow VTF for DirectX 9 applications…

  • Zeroconf

    Thanks again for such a quick answer. I’ve spent too many hours searching for why tex2Dlod doesn’t work, so I’d rather use another method to get approximately the same effect, even if this one seems more efficient. Thanks for the help anyway.

  • winipcfg

    How can we improve the particle system to handle collision detection with 3D models?
    It looks weird if some snow passes through a human body

  • http://www.catalinzima.com Catalin Zima

    That’s a pretty interesting problem. One way would be to add a pass that renders the whole scene from above, so it becomes a sort of heightmap of the scene at that moment. Then use that “heightmap” for collision, instead of just the terrain heightmap.
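    A rough sketch of that idea (assumptions: sceneHeightRT is a new render target, DrawSceneDepth is a hypothetical helper that renders the scene with a depth/height-output shader, and terrainWidth/terrainLength name the scene extents; none of these exist in the tutorial code):

    ```csharp
    // Sketch: render the scene from directly above with an orthographic
    // camera, so the resulting texture acts as a per-frame "heightmap"
    // of everything below, not just the static terrain.
    Matrix topView = Matrix.CreateLookAt(
        new Vector3(0, 1000, 0),   // camera high above the scene
        Vector3.Zero,              // looking straight down
        Vector3.Forward);          // "up" must not be parallel to the view direction
    Matrix topProjection = Matrix.CreateOrthographic(
        terrainWidth, terrainLength, 1.0f, 2000.0f);

    graphics.GraphicsDevice.SetRenderTarget(0, sceneHeightRT);
    DrawSceneDepth(topView, topProjection); // hypothetical height-output pass
    graphics.GraphicsDevice.SetRenderTarget(0, null);

    // The particle physics shader would then sample sceneHeightRT instead
    // of the static terrain heightmap when checking for collisions.
    ```

    As discussed below, this only catches collisions from above; particles moving sideways or underneath objects would still pass through them.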

  • winipcfg

    What is “a sort of heightmap”?
    Assume we know the positions of all entities; we can calculate their heights. But the snow does not just fall from top to bottom, it moves in all directions.

    Therefore, the snow can pass through a body from any direction. It may pass under an aeroplane and not collide with it.

    Would you enlighten us how to solve it? Thank you very much

  • http://www.catalinzima.com Catalin Zima

    Indeed, my idea only works in the case you describe: when snow falls from the top and doesn’t go under objects.

    Currently, simulating snow on the GPU and accurate physics modeling are very hard, if not impossible, to do together, so I don’t think I can offer you a satisfying solution.

    But you should first consider whether this sort of realistic collision simulation is really what you want to spend CPU/GPU cycles on

  • zander

    Hi! Great tutorial, thanks!

    My problem is that I get an InvalidOperationException at spriteBatch.End() inside the DoPhysicsPass() method. As far as I can tell, my graphics card supports DirectX 10 and PS 3.0. The exception gives no more detail than ‘An unknown error occurred’.

    What am I doing wrong?

  • http://www.catalinzima.com Catalin Zima

    Looks like a bit of code is missing from the tutorial. Here’s the SimulateParticles function:

    private void SimulateParticles(GameTime gameTime)
    {
        physicsEffect.Parameters["elapsedTime"].SetValue(
            (float)gameTime.ElapsedGameTime.TotalSeconds);

        if (!isPhysicsReset)
        {
            DoPhysicsPass("ResetPositions", positionRT);
            DoPhysicsPass("ResetVelocities", velocityRT);

            isPhysicsReset = true;
        }

        DoPhysicsPass("UpdateVelocities", velocityRT);
        DoPhysicsPass("UpdatePositions", positionRT);
    }

  • 62316e

    Hi! How can I make the point sprites rotate?

  • zander

    Take a look at the Particle System sample on the XNA Creators Website; there is some HLSL code that can rotate point sprites.

    Found the problem… it seems that Shader Model 3.0 doesn’t like the DirectX debug runtime.

  • http://dpsf.danskingdom.com Dan

    Nice article. If anyone is looking for a particle system framework they can use in their XNA applications, instead of creating their own, check out DPSF at http://dpsf.danskingdom.com. It allows you to create your own custom particle systems in XNA.

  • http://Website Che`

    Hey, has anyone updated this to use XNA 4? Awesome article btw.

  • http://Website Adam

    I’ve been trying to convert this to XNA 4.0 but have run into a couple of problems.

    1. Point lists have been removed, so you have to use triangles (not an issue by itself, just not compatible with this tutorial as written)

    2. The framework doesn’t allow writing to alpha/color channels on any floating point texture (Vector4, HalfVector4, etc…)

    Problem 2 is by far the most annoying and I’m having a lot of trouble getting around it. I’ve tried setting the render targets to use HDRBLENDABLE and the physics update shader seems to write to them fine, but I’m having trouble drawing the particles with the rendering shader.

    Any insight would be appreciated Catalin :)

  • http://Website Ludo

    Hello Catalin Zima !
    Thank you so much for your tutorial ! It is really helpful to me !

    Just a question,
    I am programming in .NET. Do you know where I can find a similar source/tutorial in C# or .NET? I am kind of lost at several points.

    Thank you again !
    Best