Wishlist - write bitmap

I would love a way to write to a bitmap.
This would be great for doing things like a homebrew weathering tool or similar. All the tools for raytracing are there so there are lots of cool things you could do.
You could also quickly make something like foam maps etc.

Rhys.

Couldn’t you already do that with texture baking and a vertex color map?

No. That only works with an incredibly dense mesh.
I might want to compute more detail.

Oh, well you’d need to compute it per pixel anyway, which is the same as a dense mesh. 1 megapixel would be the same as a 1000x1000 plane, right? There isn’t a Genome map/material yet though. I suspect the reason why is that it could be horribly slow for an actual render. Imagine a raytracer generating billions and billions of evaluations for each hit. But for a bitmap it wouldn’t be bad at all. With Ember, we could cache/discretize the map, so that would be fine, too. I don’t know if Thinkbox intends to make a Genome Map or just make Ember do that.

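The cost equivalence above can be sketched outside of Max. This is a toy stand-in (plain Python, with a fake `proc()` function standing in for any procedural 3D map, not a real Genome or Ember evaluation): baking one sample per pixel of a 1-megapixel bitmap takes exactly as many map evaluations as baking vertex colors on a plane with 1000x1000 vertices.

```python
import math

def proc(x, y, z):
    # toy stand-in for a procedural 3D map: cheap sine-based pseudo-noise
    return 0.5 + 0.5 * math.sin(12.9898 * x + 78.233 * y + 37.719 * z)

def bake_to_bitmap(width, height):
    # one evaluation per pixel, sampled across a unit square at z = 0
    return [proc(i / width, j / height, 0.0)
            for j in range(height) for i in range(width)]

def bake_to_vertex_colors(verts_x, verts_y):
    # one evaluation per vertex of a densely subdivided plane
    return [proc(i / verts_x, j / verts_y, 0.0)
            for j in range(verts_y) for i in range(verts_x)]

bitmap = bake_to_bitmap(1000, 1000)
vcolors = bake_to_vertex_colors(1000, 1000)
# same number of evaluations either way: 1,000,000
assert len(bitmap) == len(vcolors) == 1_000_000
```

Either way you pay one evaluation per sample point; the only difference is whether the samples live in pixels or in vertices.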

I think a Genome shader/map is an interesting idea. While there is a bit of overlap with Ember, I think Genome’s infrastructure for exposing mesh information about the underlying object being shaded (via the ShadeContext) could be really useful. Ember’s texmap, on the other hand, is going to stay fairly “dumb”. It’s just a simple 3D map that doesn’t care about the evaluation point’s normal, view direction, mesh information, etc.

If you’re actually looking for a bitmap where the results are cached at a specific resolution, I think Chad is correct when he says it’s simply a matter of making an equivalent plane and caching the vertex colors. Did you foresee anything fancier than that?

Well, it isn’t just planes that I want to use it on.

It is full 3D models.

Having to subdivide models to a stupidly high amount, bake the vert colors, then render those to a bitmap is a clumsy and overcomplicated workflow. Being able to render straight out to a bitmap would be much easier.
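For a full 3D model, the direct-to-bitmap workflow amounts to walking the texels of the output image, recovering the 3D surface point each texel maps to via the mesh’s UVs, and evaluating the map there, with no subdivision or vertex-color pass. A minimal sketch in plain Python (the single hypothetical triangle and the `proc()` map are illustration only; a real baker would loop over every face of the mesh):

```python
import math

def proc(x, y, z):
    # stand-in for a 3D map (an Ember field, a Genome-style shader, etc.)
    return 0.5 + 0.5 * math.sin(10.0 * x) * math.cos(10.0 * y + z)

# one triangle of a hypothetical mesh: 3D positions and matching UVs
positions = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.3), (0.0, 1.0, 0.7)]
uvs       = [(0.0, 0.0),      (1.0, 0.0),      (0.0, 1.0)]

def bake(size):
    bitmap = [[0.0] * size for _ in range(size)]
    for j in range(size):
        for i in range(size):
            u, v = (i + 0.5) / size, (j + 0.5) / size
            # barycentric coords of (u, v) within this triangle's UV layout
            w0, w1, w2 = 1.0 - u - v, u, v
            if w0 < 0.0:
                continue  # texel falls outside the triangle's UV island
            # interpolate the 3D surface point this texel belongs to
            x = w0 * positions[0][0] + w1 * positions[1][0] + w2 * positions[2][0]
            y = w0 * positions[0][1] + w1 * positions[1][1] + w2 * positions[2][1]
            z = w0 * positions[0][2] + w1 * positions[1][2] + w2 * positions[2][2]
            # evaluate the 3D map at that surface point, straight into the bitmap
            bitmap[j][i] = proc(x, y, z)
    return bitmap

img = bake(256)
```

The mesh density never enters the picture; resolution is set entirely by the bitmap size, which is exactly what per-pixel baking buys over per-vertex baking.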

R

Agreed. The suggestion for Ember was just that it supports 3D textures already and it supports discretizing to a grid. Genome might have the shade context, but it lacks the other two, so something using both would be needed. If you didn’t care that it got slow, or you needed something like Darktree, or you had to have a very high resolution, then I suppose you could skip straight to a Genome map, but speed-wise it would be the same as massively tessellating your mesh.

Now the question comes up, why doesn’t 3ds max allow you to do 3D texture baking?