Color per particle based on texture

I basically want to do something similar to the example shown here, but in Maya. I have been searching very hard but have not found a way to do it in Maya, though I have seen a couple of tutorials for 3ds Max.

As you are probably aware, the color was projected onto the particles using camera mapping. This is now totally doable using Magma in Krakatoa for Maya v2.3, as shown in this Webinar video:

Obviously, you would need the correct Magma setup to do this. I have uploaded the demo scene (Maya 2014 format) to our Box account:

I have not included the projection map, but you can easily render the bunny in the Software renderer and use the resulting image to project via the Apply Texture Krakatoa modifier. Look at the Magma setup (you can save it to disk and load it in another scene to reuse it) and the general setup of the particle system. Hopefully you will be able to recreate the same effect using other objects / particles…

Things to do:
*Download and open the scene in Maya 2014+
*Switch the renderer to Maya Software and render the bunny mesh to an image, then switch back to Krakatoa.
*Select the particle1 system and check Display > Visibility, which is currently off.
*Click the MOD icon in the Krakatoa shelf to open the Modifiers Editor
*Select the top Magma and press EDIT Modifier to open the Editor and look at the camera mapping flow
*Double-click the second modifier (PRTApplyTexture1) to enable it.
*Go to the file1 texture used by it and pick the image of the bunny rendered in the beginning.
*If you render the animation in Krakatoa, the particles will move and carry the color of the camera projection.

Note that the camera’s position can be animated, and you could have an animated sequence rendered in Maya Software, mental ray, V-Ray, etc. projected onto the particles, like we did with the Mini - it should work exactly the same…

Hey Bobo

We are currently testing out Krakatoa MY; pretty cool stuff. I was trying to get the texture-to-particles mapping technique working after watching the Webinar 10 video. However, in the video there are a couple of Multiply nodes, and I am not sure what values to type in.
I’ve marked which ones in the following image.

I also downloaded the file you posted, but the Magma flow is different from the one in the video.
Any help appreciated. Thanks :)


Hi Erik,

  • The Multiply node after the InputValue 54.43f has 0.0175 as its right input. 54.43 was the Camera FOV Angle in Degrees, but we want Radians, so we multiply by Pi/180. The actual value would be 0.01745329251994329576923690768489, but 0.0175 is close enough to perform the DegToRad conversion.
  • The 1.778 is the Aspect Ratio of the projected image. My image was 960x540, or Image Aspect of 1.777(77)…
  • The last Multiply before the Output is 0.5 (could have been Divide by 2.0 to be more obvious).
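For reference, the three constants above can be checked with a few lines of Python (a sketch; the variable names are my own):

```python
import math

fov_degrees = 54.43           # Camera FOV Angle in Degrees, from the InputValue node
deg_to_rad = math.pi / 180.0  # ~0.0174533; the flow rounds this to 0.0175
image_aspect = 960.0 / 540.0  # ~1.778, the aspect ratio of the projected 960x540 image

fov_radians = fov_degrees * deg_to_rad
print(round(deg_to_rad, 4))    # 0.0175
print(round(image_aspect, 3))  # 1.778
```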

The flow does the following:

  • Convert the birth position (stored in the Color channel by an expression) from object to World Space, then convert that to Camera Space.
  • Split the X, Y and Z components of the particle position.
  • Divide the X by the Absolute value of Z. This is equivalent to the Tan of the angle at the Camera position between the Z axis and the X component of the vector connecting the camera and the particle.
  • Divide the Y multiplied by the Image Aspect by the Absolute value of Z. Same as above, but for the Y component.
  • Build a new Vector with X and Y equal to those (X/absZ) and (Y*ImageAspect/absZ) values. Keep the Z component as 0.0.
  • Divide the resulting Vector by the Tan of half the FOV angle in Radians.
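The camera-space steps above can be sketched in Python (a minimal sketch; the function name is my own, and the position is assumed to already be in camera space):

```python
import math

def camera_to_plane(pos, fov_degrees, image_aspect):
    """Project a camera-space position onto the normalized image plane.

    pos: (x, y, z) particle position in camera space.
    Returns (u, v, 0.0) with u and v roughly in [-1, 1] inside the frustum.
    """
    x, y, z = pos
    abs_z = abs(z)
    # Tan of half the FOV angle, with the FOV converted to radians
    tan_half_fov = math.tan(math.radians(fov_degrees) * 0.5)
    # X/abs(Z) and Y*aspect/abs(Z), each divided by Tan(FOV/2)
    u = (x / abs_z) / tan_half_fov
    v = (y * image_aspect / abs_z) / tan_half_fov
    return (u, v, 0.0)
```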

WHY do we do this? We want to produce a normalized value of the particle position in the image plane space. For example, if a particle is at the left edge of the camera plane, and the FOV is, say, 50 degrees, the angle at the camera will be -25 degrees. Dividing Tan(-25) by Tan(half of 50) will produce a value of -1.0. If the particle was on the far right, its (X/absZ) would correspond to Tan of 25 degrees. Tan(25) divided by Tan(half the Camera FOV) would produce 1.0. The same applies to the Y, but we have to take into account the aspect of the projected image.
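The worked example in the paragraph above checks out numerically:

```python
import math

# FOV of 50 degrees: the half-angle at the camera is 25 degrees.
half_fov = math.radians(25.0)
left_edge = math.tan(math.radians(-25.0)) / math.tan(half_fov)
right_edge = math.tan(math.radians(25.0)) / math.tan(half_fov)
print(left_edge, right_edge)  # approximately -1.0 and 1.0
```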
Now we have the particle’s position turned into the Tan of an angle, normalized along X and Y in the range from -1.0 to 1.0, but we want it to be from 0.0 to 1.0.

  • Add 1,1,0 to the result to shift the values to the right and make them positive in the range from 0.0 to 2.0.
  • Multiply by 0.5 to scale down the range from (0.0 - 2.0) to (0.0 - 1.0).
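These last two steps amount to a simple remap from the (-1.0 - 1.0) range to (0.0 - 1.0); as a sketch (`to_uv` is a hypothetical name):

```python
def to_uv(plane_pos):
    """Remap normalized image-plane coords from [-1, 1] to [0, 1] UV space.

    Equivalent to the Magma steps: Add (1, 1, 0), then Multiply by 0.5.
    """
    x, y, z = plane_pos
    return ((x + 1.0) * 0.5, (y + 1.0) * 0.5, z * 0.5)

print(to_uv((-1.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0): left edge, top of frame
```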

As a result, the Output TextureCoord contains the remapped birth position of the particle, carried from object space through world space, camera space, and image plane space into UV space.

Sounds complicated, but it is quite simple trigonometry.
