AWS Thinkbox Discussion Forums

PerParticle color/Texture on PRT Volume/Surface

Hi. Thanks, guys, for creating an awesome particle system!

I have a modest project coming up with an organic sand creature in a walk cycle, which then partially breaks apart in the wind.

I’ve seen the post “color per-particle based on texture”. It’s very good, but may not be what I need: my character is moving, and I don’t want a camera-view-based projection.

So… is it possible to apply per-particle colour to animated PRT Volume or PRT Surface objects without noticeable drift in the texture coordinates? Maybe a cheat, a compromise, or a sensible alternative?

IMHO the PRT Volume/Surface procedures are just too amazing not to use…

Hi Jeff,

Both PRT Volume and PRT Surface will acquire the UV coordinates from the source geometry. Points deep in the volume will still get the UVs from the closest point on the surface, so if the particles are a bit transparent, you might see the texture streak into the volume. But in general the point cloud should reproduce the mapping of the mesh exactly, deforming or not.
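Outside of Krakatoa, the idea above (each point takes the UVs of the closest point on the source surface, interpolated barycentrically) can be sketched as a brute-force nearest-point lookup. This is an illustrative sketch, not Krakatoa code; all function names here are made up, and the closest-point test is the standard triangle region test.

```python
import numpy as np

def closest_point_bary(p, a, b, c):
    """Barycentric weights of the closest point on triangle abc to p
    (standard region test over vertices, edges, and face interior)."""
    ab, ac, ap = b - a, c - a, p - a
    d1, d2 = np.dot(ab, ap), np.dot(ac, ap)
    if d1 <= 0 and d2 <= 0:                       # vertex a
        return np.array([1.0, 0.0, 0.0])
    bp = p - b
    d3, d4 = np.dot(ab, bp), np.dot(ac, bp)
    if d3 >= 0 and d4 <= d3:                      # vertex b
        return np.array([0.0, 1.0, 0.0])
    vc = d1 * d4 - d3 * d2
    if vc <= 0 and d1 >= 0 and d3 <= 0:           # edge ab
        v = d1 / (d1 - d3)
        return np.array([1.0 - v, v, 0.0])
    cp = p - c
    d5, d6 = np.dot(ab, cp), np.dot(ac, cp)
    if d6 >= 0 and d5 <= d6:                      # vertex c
        return np.array([0.0, 0.0, 1.0])
    vb = d5 * d2 - d1 * d6
    if vb <= 0 and d2 >= 0 and d6 <= 0:           # edge ac
        w = d2 / (d2 - d6)
        return np.array([1.0 - w, 0.0, w])
    va = d3 * d6 - d5 * d4
    if va <= 0 and (d4 - d3) >= 0 and (d5 - d6) >= 0:  # edge bc
        w = (d4 - d3) / ((d4 - d3) + (d5 - d6))
        return np.array([0.0, 1.0 - w, w])
    denom = va + vb + vc                          # face interior
    v, w = vb / denom, vc / denom
    return np.array([1.0 - v - w, v, w])

def sample_uv(p, verts, uvs, tris):
    """UV at the nearest surface point to p, brute-forced over all triangles."""
    best_d, best_uv = None, None
    for i0, i1, i2 in tris:
        w = closest_point_bary(p, verts[i0], verts[i1], verts[i2])
        q = w[0] * verts[i0] + w[1] * verts[i1] + w[2] * verts[i2]
        d = np.dot(p - q, p - q)
        if best_d is None or d < best_d:
            best_d = d
            best_uv = w[0] * uvs[i0] + w[1] * uvs[i1] + w[2] * uvs[i2]
    return best_uv
```

A real implementation would use a spatial acceleration structure (e.g. a kd-tree) instead of the brute-force loop, but the interpolation logic is the same.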

That being said, the particles are regenerated on every frame, with NO correspondence to the particles on previous frames. So I am not entirely sure PRT Volume or PRT Surface is a good approach for something that will crumble over time; you might get a lot of flickering.

We just had a discussion about DEFORMING existing particle clouds using a mesh as a skin wrap (a Magma-based setup).
That approach might be better: you could create an initial particle system in a base T-pose, then deform those particles as the character moves without changing the particle distribution. The base data could still be animated in the T-pose, for example particles disappearing from the crumbling areas. You would then also need a dynamic simulation of the crumbling particles using regular Maya particles/nParticles, which would be independent of the PRT/Magma setup.
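The skin-wrap idea above can be sketched as a two-step bind/evaluate process: each particle is bound once to the rest-pose mesh (a triangle, barycentric weights, and a signed offset along the triangle normal), and then re-evaluated on the deformed mesh each frame. This is only an illustration of the concept under those assumptions, not the actual Magma setup; all names are hypothetical.

```python
import numpy as np

def tri_normal(a, b, c):
    """Unit normal of triangle abc."""
    n = np.cross(b - a, c - a)
    return n / np.linalg.norm(n)

def bind_to_surface(p, tri, verts, bary):
    """Record how particle p sits relative to one rest-pose triangle:
    barycentric weights plus signed height along the rest normal."""
    a, b, c = (verts[i] for i in tri)
    base = bary[0] * a + bary[1] * b + bary[2] * c
    h = np.dot(p - base, tri_normal(a, b, c))
    return {"tri": tri, "bary": bary, "h": h}

def evaluate(binding, verts):
    """Re-evaluate a bound particle on a deformed copy of the mesh.
    The distribution never changes, so there is no frame-to-frame flicker."""
    a, b, c = (verts[i] for i in binding["tri"])
    w = binding["bary"]
    return w[0] * a + w[1] * b + w[2] * c + binding["h"] * tri_normal(a, b, c)
```

Because the binding is computed once, per-particle data animated in the T-pose (colour, deletion for the crumbling areas) stays attached to the same particle across frames.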

I was made aware by our Krakatoa MY lead developer that the PRT Volume in KMY does not propagate the UV channel automatically like the other implementations do. Still, it would be trivial to acquire the UVs from the closest point on a mesh using Magma and NearestPoint in the same manner; it would just be a bit slower.

Hi Jeff, we have now implemented UV coordinate acquisition from meshes, so what Bobo is describing will work in the upcoming release of KMY.
