The attached zip file contains two sample scenes - one using a Plane and one using a Sphere.
In fact, each scene contains two Planes or two Spheres to avoid cyclic reference problems: one is used by Ember to acquire the original mapping, the other carries the Genome that reads the new mapping back from Ember and shows the texture.
Since Genome does not have an InputField operator yet, I used a PRT Ember to sample the SIM Ember and get the advected Mapping channel, then sampled that using ParticleSumRadius in the Genome modifier.
In the future, Genome will be able to read the data directly from the SIM Ember.
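The ParticleSumRadius step blends the channel values of all particles within a search radius around the sampled point. As a rough illustration of that idea (this is not the actual Ember/Genome API; the function name and the linear distance falloff are assumptions for the sketch):

```python
import numpy as np

def particle_sum_radius(positions, values, query, radius):
    """Blend the 'values' of all particles within 'radius' of 'query',
    weighted by an assumed linear distance falloff (closer counts more)."""
    d = np.linalg.norm(positions - query, axis=1)
    mask = d < radius
    if not mask.any():
        return np.zeros(values.shape[1])
    w = 1.0 - d[mask] / radius          # falloff toward the radius edge
    return (values[mask] * w[:, None]).sum(axis=0) / w.sum()

# Two particles carrying UV data, sampled halfway between them:
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
uvs = np.array([[0.2, 0.2], [0.4, 0.4]])
print(particle_sum_radius(pos, uvs, np.array([0.5, 0.0, 0.0]), 1.0))  # ≈ [0.3, 0.3]
```

Genome's real operator works on the PRT Ember's particle cloud in the same spirit: the radius trades sharpness of the mapping for smoothness of the lookup.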
So the setup is like this:
*Clone the object
*Create a SIM Ember around it and get its mapping using NearestPoint into a TextureCoord channel in the Initial flow.
*In the Simulation flow, get the particle velocities and output them as Velocity to advect the TextureCoords.
*Create a PRT Ember from the SIM Ember and acquire the TextureCoord channel of the SIM Ember.
*On the second object, add a Genome and sample the TextureCoord channel of the PRT Ember to get the advected coordinates. Output as TextureCoord and as Color (for visualization).
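Conceptually, the advection step above (the Velocity output pushing the TextureCoord field through the grid) is a semi-Lagrangian transport: each cell looks backward along its velocity and takes the UV value found there. A minimal 1D sketch under assumed simplifications (uniform grid, nearest-cell lookback instead of interpolation, and of course Ember does this in 3D):

```python
import numpy as np

def advect_uv(uv_grid, vel_grid, dt, cell_size):
    """Semi-Lagrangian advection: backtrace each cell against its
    velocity and copy the UV value from the source cell."""
    n = uv_grid.shape[0]
    out = np.empty_like(uv_grid)
    for i in range(n):
        src = i - vel_grid[i] * dt / cell_size   # backtraced position
        j = int(np.clip(round(src), 0, n - 1))   # clamp to the grid
        out[i] = uv_grid[j]
    return out

# A strip of U coordinates drifting under a uniform rightward velocity;
# after one step the values shift one cell downstream: [0, 0, 0.25, 0.5, 0.75]
uv = np.linspace(0.0, 1.0, 5)
vel = np.full(5, 1.0)
print(advect_uv(uv, vel, dt=1.0, cell_size=1.0))
```

The important property, which the setup relies on, is that the mapping values themselves are never changed, only moved, so the texture appears to flow with the particles.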
Note that in the case of UV seams this does not look pretty, because U values of 0 and 1 get mixed in the field. There isn’t much that can be done about that… But it works well with open objects like a plane.
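A quick way to see why seams misbehave: any sampling that averages raw U values across the wrap mixes 0 and 1. A tiny illustration (the circular average at the end is just one conceivable seam-aware alternative, not something the setup above does):

```python
import math

# Two samples on either side of the UV wrap: U = 0.95 and U = 0.05.
# A plain average lands in the middle of the texture, not near the seam.
u_left, u_right = 0.95, 0.05
naive = (u_left + u_right) / 2
print(naive)  # 0.5 -- visually wrong

# Averaging on the unit circle instead keeps the result at the seam.
angles = [2 * math.pi * u for u in (u_left, u_right)]
mean = math.atan2(sum(map(math.sin, angles)) / 2,
                  sum(map(math.cos, angles)) / 2)
seam_aware = (mean / (2 * math.pi)) % 1.0
print(seam_aware)  # ≈ 0.0 or ≈ 1.0, i.e. on the seam
```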
EMBER_PFlow_AffectingMapping_MAX2010_v005.zip (432 KB)