So I’ve managed to confuse myself. I’m trying to use the particle distance to an object and color the particles with a Gradient Ramp. For instance, a particle at distance 0 would get the color at X-position 0 on the gradient, the farthest particles would get the color at X-position 100, and the ones in between would get the interpolated colors. I’m basically having trouble mapping the distance to the X-position of the gradient texture. I could’ve sworn I’ve seen this somewhere already, but I can’t seem to find it.
Oh, and just to add: I’m using a PRT Volume with a Gradient Ramp applied to it. I’m going to drill away and see if I can figure it out, but any help is much appreciated.
There are two approaches: You can set the TextureCoord’s first component (U, which is X in Magma) to a value between 0.0 and 1.0, or you can apply the GradientRamp directly inside the Magma and output the result to Color.
Since you already have the GradientRamp applied to the PRT Volume, you should use the former approach.
So, take a NearestPoint operator and feed in the mesh(es) and the WorldSpace Position inputs. Take the Distance output and divide it by the Max. Distance you want to allow (you always need to normalize by a known maximum value to produce values between 0.0 and 1.0). Let’s say you assume that the right color of the GradientRamp will be reached at a distance of 500.0 units. So you take the Distance output, divide by 500.0, then pass the result through a Function>Clamp to limit it between 0.0 and 0.99, because otherwise at a distance of 600.0 you would get a value > 1.0, and the U would repeat again. If you want the gradient to tile after 500.0 units, don’t add the Clamp. Now pass the resulting Float into a Convert>ToVector operator’s X socket and output the result as the TextureCoord channel. If the GradientRamp is set to use Mapping Channel 1, you should now see the GradientRamp on the PRT Volume based on the normalized distance to the mesh’s surface…
I am clamping between 0.0 and 0.99 instead of 1.0 because otherwise the GradientRamp might still evaluate the 0.0 color when U is exactly 1.0, due to the way the UV tiling wraps…
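To illustrate the math the flow performs, here is a minimal MAXScript sketch of the same normalization (just to show the numbers; the 500.0 Max. Distance is the example value from above, and distanceToU is not part of any actual flow):

--Map a distance to a U coordinate on the GradientRamp.
--maxDist is the distance at which the right end of the ramp is reached.
fn distanceToU dist maxDist:500.0 =
(
	local u = dist / maxDist --normalize: 0.0 at the surface, 1.0 at maxDist
	amin (amax u 0.0) 0.99 --clamp to [0.0, 0.99] so U never wraps back to 0.0
)
distanceToU 0.0 --returns 0.0 (left color of the ramp)
distanceToU 250.0 --returns 0.5 (middle of the ramp)
distanceToU 600.0 --returns 0.99 (clamped; without the Clamp, U would tile)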
Thanks Bobo! Where I went wrong was that I was multiplying the distance by the position and then clamping it. I got a really strange result and it started repeating. I think my logic was that scaling the result along the X-axis would produce a gradient, but doing it that way just gave me a single value instead of a spectrum… I think. Oh well, it didn’t work. Again, thanks.
Hi, I have a question regarding this topic. Is it possible to set up a Magma flow to measure the max or min value of a channel? I need to get the max distance not as a typed-in value, but measured throughout the animation. In Debugger mode this value is nicely shown, but the distance changes during the animation, so I must somehow read it from the channel. I cannot find the right node or solution to do this.
This is a very good question.
The Debugger processes the complete stream and figures out the Min, Max and Mean in the process.
Magma does not do that (yet) because it would require a pre-pass over all data, which would make it twice as slow.
It has been requested previously though, so it is already logged as a Wishlist item.
As an intermediate hack, you could use the Debugger MAXScript Interface to calculate the Min. and Max. values of the stream in a separate pass and keyframe the value into an animation track that you could reference from inside your Magma flow. It is not fully automatic, but it would beat doing it by hand…
To get the Debug data of a Krakatoa Magma node, you can use a MAXScript call along these lines (the exact accessor name in the sketch below is an assumption; check the Krakatoa MAXScript reference for the actual call):
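debugData = $TheObject.modifiers[1].GetDebugInterface 1 --sketch: the method name GetDebugInterface is assumed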
where in both places 1 is the index of the Magma modifier counted from the top of the stack. In other words, if there is only one Magma modifier on the stack, the above would work. $TheObject should be replaced with the actual object signature, e.g. $PRTVolume_GeoSphere002_001 or whatever the name of the object is that has the Magma.
The result of the above call should be an IObject:DebugInformation interface.
You can now call a function to get the MinMaxMean data from it for a specific node and output socket. Say you have a flow that has an InputChannel:Position connected to an Output:Color and you want to know the Min/Max/Mean info from the Position input. Assuming the InputPosition was created first and has an ID of 0, and the Output was created second and has an ID of 1, you would say
theVal = debugData.GetNodeMinMaxMeanValue 0 1
where the 0 is the ID of the node, and 1 is the Index of its Output Socket. If a node has multiple output sockets, you can ask each one of them for the value.
The returned array will contain 3 elements: the Min. value, the Max. value, and the Mean value.
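For example, assuming the call above, the Max. value would simply be the second element:

theMaxDistance = theVal[2] --MAXScript arrays are 1-based: [1] is the Min., [2] the Max., [3] the Mean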
You would have to run a for loop to advance the time slider and evaluate the flow on each frame, get the Interface, call the function to get the Min/Max/Mean array, and then keyframe some property in the scene to the result you want to store…
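As a rough sketch of such a loop (the GetDebugInterface accessor is the same assumption as above, and the Point helper used to store the keys is just one arbitrary place to keyframe the value):

thePoint = Point name:"MaxDistanceHolder" --scene helper whose X position will hold the keyframed value
animate on
(
	for t = animationRange.start to animationRange.end do at time t
	(
		local debugData = $TheObject.modifiers[1].GetDebugInterface 1 --assumed accessor, see above
		local theVal = debugData.GetNodeMinMaxMeanValue 0 1 --returns #(Min, Max, Mean) of node 0, socket 1
		thePoint.pos.x = theVal[2] --keyframe the Max. value on the current frame
	)
)

You could then reference the Point’s X position from inside the Magma flow (for example via an InputScript operator) to get the measured Max. distance without typing it in.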
This is very hacky, but it is a temp. workaround until we add native Min/Max/Mean evaluation to Magma.