
How to pass object or diffuse colors to Frost?

I’d like to pass colors from different meshes to a Frost object. My goal is to produce a mesh for 3d printing (this works fine) but also to keep the original colors for a colored print (no need for textures, only simple flat colors).

From my tests it seems I can pass only vertex colors from the original objects to the Frost object. This means I would have to bake the diffuse color to vertex colors for each object first, which means a lot of work since the model consists of many separate objects.
Is there a way to pass either the diffuse color of the object’s material OR the object color to Frost without the need to touch vertex colors for each object?

You can, but I will need more details about the source meshes and how they are used by Frost.

A Mesh can be used in two ways in Frost:

  • as a “point source”, where the vertices are used as points to be meshed
  • as a custom shape in Geometry mode where each source point is replaced by a copy of the custom mesh

Which case are we discussing here?

In general, Frost supports multi-sub materials, so in most cases it is possible to colorize faces with sub-materials based on the MtlIndex (Material ID) channel.
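
For example, a Multi/Sub-Object material assigned to the Frost mesh can map each Material ID to one flat print color. A minimal MAXScript sketch (the material names, the colors and the $Frost001 node name are placeholders, not something that exists in your scene):

    -- Minimal sketch: a Multi/Sub-Object material where each Material ID
    -- maps to one flat print color. Names, colors and node names are placeholders.
    printMtl = multimaterial numsubs:3 name:"FrostPrintColors"
    printMtl.materialList[1] = standardmaterial name:"Color01" diffuse:(color 200 30 30)
    printMtl.materialList[2] = standardmaterial name:"Color02" diffuse:(color 30 200 30)
    printMtl.materialList[3] = standardmaterial name:"Color03" diffuse:(color 30 30 200)
    -- assign it to the Frost mesh, e.g.:
    -- $Frost001.material = printMtl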

Some more details:

  • When mesh vertices are used as a source in Frost, they cannot provide a MtlIndex channel, because 3ds Max simply does not know about that data. Not even Genome can add a MtlIndex to a vertex, because there is no place to store it (we could start using user vertex data channels, but we haven’t done that yet). So there is no way to pass MtlIndex to Frost directly, only indirectly:
    • Use Krakatoa to save the vertices to a PRT file, then add MtlIndex or Color to the PRT Loader via Magma
    • Use multiple Krakatoa PRT Surface objects to turn the surfaces of the meshes into point clouds and mesh those with Frost; add MtlIndex to each PRT Surface via Magma.

Alternatively, as you already mentioned, you could use a mapping channel to pass the data to Frost while retaining the Vertex Color channel. You could use MAXScript to set a custom mapping channel in each mesh to the same value (wirecolor, or material diffuse color).
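
Something along these lines might work as a starting point. This is a rough, untested sketch: it floods an arbitrary map channel (2 here) of each selected object with a single flat color taken from the material’s diffuse (if a Standard material is assigned) or from the wirecolor, and it assumes the objects are, or can be collapsed to, Editable Mesh:

    -- Rough, untested sketch: store one flat color per object in map channel 2
    -- (arbitrary choice), leaving the Vertex Color channel (0) untouched.
    mapChannel = 2
    for obj in (selection as array) do
    (
        convertToMesh obj   -- collapse to Editable Mesh so meshop can write map data
        local col = obj.wirecolor
        if obj.material != undefined and classof obj.material == Standardmaterial do
            col = obj.material.diffuse
        meshop.setMapSupport obj mapChannel true
        meshop.setNumMapVerts obj mapChannel 1
        -- map verts store 0..1 values, so normalize the 0..255 color
        meshop.setMapVert obj mapChannel 1 (point3 (col.r/255.0) (col.g/255.0) (col.b/255.0))
        -- point every map face at that single map vertex
        for f = 1 to obj.numfaces do meshop.setMapFace obj mapChannel f [1,1,1]
        update obj
    )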

Sorry for replying late; I missed your answers because I was sure I’d get a notification about any activity here, which wasn’t the case.

My workflow is this: since it’s a massive model consisting of thousands of objects, I’m collapsing meshes to make the model easier to manage. I’m using a single PRT Surface to generate points from these meshes and a Frost object to mesh the PRT Surface.

Since the final model will have around 10 different colors, my plan is to collapse all the meshes that I want to have the same color into one mesh, then assign the desired vertex color, and use these in a PRT Surface. Actually, using more than one PRT Surface gives me a bit more control, so that’s a good idea.

I was hoping there was a way to assign vertex colors based on the object color (or each source object material’s diffuse color) directly, without any manual steps, as this would make the process a bit faster. I’ll have to stick to my workflow for now, but it would be great if this feature could be added at some point; especially with 3d printing in mind it’s going to be useful.

Thanks for your ideas!

Have you looked at the latest additions to Magma in Krakatoa MX 2.4? That was EXACTLY the reason for adding the ability to detect which object a NearestPoint operator is operating on, and get any property via PropertyQuery!
thinkboxsoftware.com/krakato … ase-notes/
(See “Magma Improvements” sub-topic.)

I recorded a quick video to show you the idea - you don’t even have to combine your objects by color, you can have any number of them with the same or different colors, and let Magma do the legwork for you…

youtu.be/WKzMae0v9t4

This is fantastic, thanks so much for the video. I wouldn’t have been able to solve this without your help.

This opens up a lot of possibilities for other things, now that I think of it. Being able to read material properties means I could generate diffuse, reflection amount and glossiness colors that I could bake to maps. With 2 or 3 different Frost meshes I could generate different LOD versions of some very heavy models that would render correctly when seen from a distance, all without too much work or the need to remodel in low-res. Amazing :smiley: Thanks again!

It really helped me a lot. Thanks for sharing.

Bobo, I only now had the time to finally give it a go. Thanks to your video I was able to set this up in a minute.

For some reason though, the PRT Surface will not pick up the colors from the meshes where they are in the scene, but kind of re-projects them in a different place on the point cloud (world space is checked in the PRT Surface object and in the Magma flow where applicable). The only difference from the test case shown in the video that I can think of is that I have some hierarchy set up in the scene, but otherwise everything is kept pretty simple.
I’ll follow up by e-mail with image examples as the geometry shows some parts of our client’s dataset.

So I tried again and, as a first step, unlinked all objects that were part of a hierarchy (before adding them to the PRT Surface and the flow’s InputGeometry operator), and now it works as expected.
I’m not sure but it’s possible that the local space of a parent object may have undesired effects on how the flow interprets a mesh and its position. I’ll see if I can send you a repro case.

Another update, sorry for the flood:

I tried to create a repro case and it worked as expected. The only difference now was that I placed the PRT Surface object at the world center before adding any objects.
If I place the PRT Surface at any coordinate other than the world center prior to adding the source objects, it will ‘project’ the vertex colors in the wrong place on the point cloud.

I guess there’s no need to create a scene since it’s easy to reproduce; in case you still need something, please let me know.

This might be a misunderstanding, or a user error, depending on how you look at it.

Here is how things work under the hood:

  • In most cases, you have a PRT object that loads some particles from disk or generates them on the fly, and passes them up the stack.
  • When a Magma gets those particles, they are in OBJECT space, because the Transforms (if enabled) are applied to the particles on TOP of the stack, before SpaceWarps.
  • So if you process any data in a Magma on the stack, you have to remember that the particle channels like Position, Normal, Velocity etc. are still in Object space.
  • However, Object operators like NearestPoint and IntersectRay operate in World space.
  • So when asking for the Position to sample at, the Position channel of particles that are in Object space needs to be transformed into World space via a ToWorld operator. This operator takes the Transform matrix defined by the PRT object’s Transform controller and transforms the data into World space by multiplying the value by that matrix (see the sketch after this list).
  • Once a value like Position is computed and has to be written out to a particle channel, it needs to be converted back into Object space so it flows up the stack as it should.
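
To make this concrete, here is a tiny MAXScript illustration (the node name and the values below are hypothetical) of what ToWorld and the conversion back amount to for a Position value:

    -- Conceptual illustration only; node name and values are hypothetical.
    prtNode = $'PRT Loader001'                       -- any PRT object with a transform
    pObject = [10, 20, 0]                            -- Position as Magma sees it on the stack (object space)
    pWorld  = pObject * prtNode.transform            -- ToWorld: multiply by the node's transform
    pBack   = pWorld * (inverse prtNode.transform)   -- back to object space before writing out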

HOWEVER

  • When a PRT Surface or PRT Volume object is explicitly or implicitly in World Space (a PRT Surface of two or more objects is ALWAYS in World Space, even if the option is not checked), the data flowing up the stack is ALREADY in World space.
  • If the InputChannel Position is passed through a ToWorld operator, then these world positions will be multiplied ONCE MORE by the PRT Surface’s transform matrix.
  • If that matrix is identity (same as the world origin - zero offset, zero rotation, a scale of one), then there will be no side effect, because any value multiplied by the identity matrix remains unchanged. This is why placing the PRT Surface at the origin (something the Krakatoa menu does automatically when you select the creation option with two or more objects already selected in the scene) produces the expected Magma output - in this case World space and Object space are identical.
  • But if the PRT Surface object was moved, rotated or scaled, then the ToWorld operator inside the Magma flow will basically perform an additional transform that is not needed, and the Position being sampled will be shifted accordingly, while the particles themselves will stay where they are, locked on the World space positions on the meshes. The solution is obviously to either disable (Select, then Ctrl+P) the ToWorld operator, or hit Backspace to delete it completely. You only need to do this when the incoming data is known to be already in World space.
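
A tiny numeric illustration of that double transform (the values are made up):

    -- Made-up numbers: when the Position is already in world space, multiplying
    -- it by the PRT Surface's transform only leaves it unchanged if that
    -- transform is the identity.
    pWorld     = [10, 20, 0]                -- particle Position, already in world space
    tmAtOrigin = matrix3 1                  -- PRT Surface left at the world origin (identity)
    tmMoved    = transMatrix [50, 0, 0]     -- PRT Surface moved 50 units along X

    pWorld * tmAtOrigin   -- [10,20,0]  -> sampling happens in the right place
    pWorld * tmMoved      -- [60,20,0]  -> sampling is shifted 50 units off the meshes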

Hope this helps!

Thanks for the explanation, it makes perfect sense - I think it’s safe to file this under ‘user error’ :wink:

So, coming back to this after a few years: I wonder if it’s possible to achieve this in an easier way now, with the additional features that have been added to Krakatoa/Frost since then?

Basically, I’d like to pass different MtlIDs from multiple objects in a PRT Surface to Frost and have the meshed result carry different MtlIDs based on the input…
I have to admit I only vaguely remember how the setup described above worked and need to test it first, but I’d like to know whether things have been simplified in the meantime.
