This is a general question, and I’m not sure if it’s a bug: I don’t know what it’s supposed to do, just that it doesn’t do what I expect it to, and I haven’t read the documentation.
Ok, I have pflow particles with a Material operator that has a standard material with a Cellular map in both the color and opacity slots. In the Krakatoa UI, I have set color to material self-illumination, use lighting on, volumetric density, and scale density using material opacity.
The color works fine, but where the opacity nears 0, the particles, instead of being less dense, turn white. If I turn off the opacity map in the material, they render cyan, the color defined by the material’s color map. Other than the color changing, nothing else is different.
Am I doing something wrong?
Also, the UI says “Material Self-Illumination” but it seems to require both self-illumination AND diffuse color. Am I setting it up incorrectly?
There is a problem with getting the opacity from the standard material. For whatever reason, it returns junk values when it’s animated or anything other than the simplest case.
I’ll probably have to get around it by making a ‘Krakatoa Material’ that returns it in a better way.
To get the self-illumination, Krakatoa queries the material without giving it any lights. That means that just the part of the material that is self-illuminated will show up.
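As a rough mental model (a sketch of mine, not the actual shader code): a standard material’s output is a self-illuminated term plus a diffuse response to each light it is given, and with the self-illum spinner the emitted term is a fraction of the diffuse color, which would explain why both self-illumination and diffuse seem to matter. In Python terms:

```python
def shade(diffuse, self_illum_amount, lights, self_illum_color=None):
    """Toy model of a standard material, NOT Krakatoa's actual shader.
    With the self-illum *spinner*, the emitted term is a fraction of the
    diffuse color (so both matter); with a self-illum *color*, that color
    is emitted directly. Diffuse lighting is added per light."""
    if self_illum_color is not None:
        emitted = self_illum_color
    else:
        emitted = tuple(self_illum_amount * c for c in diffuse)
    color = list(emitted)
    for light_color, n_dot_l in lights:  # (RGB, cosine term) per light
        for i in range(3):
            color[i] += diffuse[i] * light_color[i] * max(n_dot_l, 0.0)
    return tuple(color)

# A query with no lights leaves only the emitted term:
print(shade((0.0, 1.0, 1.0), 0.5, lights=[]))  # -> (0.0, 0.5, 0.5)
```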
The opacity mapping works better in beta 2. I was able to at least get a cellular map to work. But some other maps are not working, such as SimbiontMax. If I apply the material to a box and render with scanline, it looks good, but applied to a box of particles rendered in Krakatoa beta 2, the density is perfectly uniform.
Uh oh.
I think something in the UI is messed up. Density from mapped opacity doesn’t work, but No Scaling DOES read (I think correctly) the opacity map.
For that SimbiontMax material, have you made sure that the UVWs it needs are transferred to the particles? I don’t think particle flow does that by default. The cellular map is using world space coordinates by default, so it would work without any UVWs in the particles.
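Roughly the difference, as a sketch (map3d/map2d here are hypothetical callables, not a real API): a 3D procedural map samples straight from the particle’s world position, while a 2D map needs a UVW channel carried on each particle.

```python
def sample_color(particle, map3d=None, map2d=None):
    """Illustrative sketch only: a 3D procedural map like Cellular in
    World XYZ mode is sampled at the particle's world position, so no
    UVWs are needed; a 2D map needs a per-particle UVW channel, which
    pflow doesn't set up by default."""
    if map3d is not None:
        return map3d(particle["position"])  # world-space lookup, no UVWs
    return map2d(particle["uvw"])           # requires UVWs on the particle

# Usage with a toy world-space map:
cellular = lambda p: (p[0] % 1.0, p[1] % 1.0, p[2] % 1.0)
print(sample_color({"position": (1.25, 2.5, 3.75)}, map3d=cellular))
```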
Yeah, I’m using World XYZ coordinates.
My comment about the UI being messed up isn’t perfectly accurate. I’m getting unpredictable behavior by simply turning UI elements on and off. Trying to find a pattern now.
I can’t tell if it’s Simbiont or Krakatoa. Dang it if Simbiont doesn’t work just fine on geometry in scanline, and Krakatoa works just fine on 3ds Max standard 3D maps.
I’m still testing this out.
I’m still testing this out. <<
Thanks. Keep us posted.
(Do you have Orbaz Particle Flow Tools, Box 1? It provides a great Mapping operator that transfers UVs from any geometry onto your particles. I’ve been using that for tests of particle colors and density scaling by material opacity.)
We have Box#1, and Box#3 should be ordered by next week (our CFO is on vacation and took the credit cards with him).
The mapping thing is surface-only, though, right? I mean, if I emit a particle from a teapot’s surface, I can get the teapot’s UVWs transferred from the point on the surface where the particle was emitted, but it won’t help me if I’m emitting particles from the volume of the teapot, right?
I made a script that assigns UVW coordinates to the particles when they are born, based on the XYZ position of the particle at birth. Works well, but let’s just say maxscript running per-particle on 10 million particles is a good way to emulate a lockup.
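The core of it is presumably just normalizing the birth position into UVW space; here is one way that per-particle step could look (the [0,1] convention and bb_min/bb_max as the emitter’s bounds are my assumptions, and the real thing runs inside a pflow Script operator in MAXScript, not Python):

```python
def xyz_to_uvw(pos, bb_min, bb_max):
    """Map a birth position inside a bounding box to [0,1] UVW.
    pos, bb_min, bb_max are (x, y, z) tuples; axes with zero extent
    collapse to 0. Illustrative sketch of the per-particle math only."""
    uvw = []
    for p, lo, hi in zip(pos, bb_min, bb_max):
        extent = hi - lo
        uvw.append((p - lo) / extent if extent > 0 else 0.0)
    return tuple(uvw)

# A particle born at the center of a 10x10x10 box gets (0.5, 0.5, 0.5):
print(xyz_to_uvw((5, 5, 5), (0, 0, 0), (10, 10, 10)))
```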
I really need a way (box#3?) to adjust real, in-space particle density to match the density read in from the map. So that if the density = 0, delete the particle, and if the density > .8 and another particle exists within a certain radius, delete the particle.
I only really need particles in areas of low to medium density. In high density areas, you’ll get good coverage with few particles, and where there is no density, why have particles at all?
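To make those rules concrete, a rough Python sketch (density_at, the radius, and the hash grid are stand-ins of mine, not an existing Krakatoa or box#3 feature):

```python
import math
from collections import defaultdict

def cull(particles, density_at, radius=1.0):
    """Sketch of the two proposed culling rules:
      1. density == 0  -> delete the particle outright
      2. density > 0.8 -> delete it if a kept particle is within radius
    Uses a hash grid so the neighbor test isn't O(n^2)."""
    cell = radius                     # grid cell size = search radius
    grid = defaultdict(list)          # cell coords -> kept positions
    kept = []
    for pos in particles:
        d = density_at(pos)
        if d == 0.0:
            continue                  # rule 1: no density, no particle
        cx, cy, cz = (int(math.floor(c / cell)) for c in pos)
        if d > 0.8:
            crowded = any(
                math.dist(pos, q) < radius
                for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
                for q in grid.get((cx + dx, cy + dy, cz + dz), [])
            )
            if crowded:
                continue              # rule 2: dense area already covered
        grid[(cx, cy, cz)].append(pos)
        kept.append(pos)
    return kept

# Toy usage: uniform density 0.9 thins a tight cluster down to one point:
print(len(cull([(0, 0, 0), (0.1, 0, 0), (5, 0, 0)], lambda p: 0.9)))  # -> 2
```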
The mapping thing is surface-only, though, right? <<
True.
I made a script that assigns UVW coordinates to the particles when they are born, based on the XYZ position of the particle at birth. Works well, but let’s just say maxscript running per-particle on 10 million particles is a good way to emulate a lockup. <<
Agreed. But without writing a new operator, it may be the only solution at the moment.
…So that if the density = 0, delete the particle, and if the density > .8 and another particle exists within a certain radius, delete the particle. I only really need particles in areas of low to medium density. In high density areas, you’ll get good coverage with few particles, and where there is no density, why have particles at all? <<
I think you are undermining the advantages of the volumetric particle renderer and creating a lot of extra work by second-guessing density measurements and forcing the particle system to predetermine when and where to deposit particles.
“Density” applies at both the micro and macro level… With an individual particle, you can assign a “density” to essentially weight its contribution against other particles influencing a rendered pixel. In the bigger picture, however, the camera view ultimately determines the way particles stack up in any given pixel, also changing the “density”. To help my own understanding of what’s happening, I tend to think of the two as “particle influence” and “render density”, respectively.
As we kicked around in another thread, even fully influential particles are still affected by their distance from the camera, and the number of neighbors they have within the space covered by a pixel. (Likewise, if you have enough particles, even at near zero influence, they can still add up to create some visible rendered density.) Instead of second-guessing the renderer, you should define behavior rules for the particle system and let the resulting location of the particles determine the final rendered look.
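One toy way to separate the two notions (illustrative only, not Krakatoa’s actual compositing math): treat each particle’s influence as a per-particle density, and let the rendered coverage of a pixel saturate with the sum over everything the camera stacks into it.

```python
import math

def pixel_alpha(influences, k=1.0):
    """Toy volumetric model of particles stacking in one pixel: each
    particle contributes its individual influence d_i, and rendered
    coverage saturates as 1 - exp(-k * sum(d_i)). Illustrative only."""
    return 1.0 - math.exp(-k * sum(influences))

# A few strong particles vs. many near-zero ones give similar coverage:
print(pixel_alpha([0.9, 0.9, 0.9]))  # ~0.93
print(pixel_alpha([0.01] * 300))     # ~0.95
```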
I think that density scaling by material opacity would be most useful for decreasing influence of a particle based on age, or by conditions dictated by the birth location of a particle. For example, smoke particles may dissipate over time such that they are too small to be visible, or smoke particles may vary in density if coming from an area of burning plastic versus burning wood. However, just because I know my densest particles are in one location, I’m not sure that forcing the particle system to emit fewer particles there would provide any significant optimization (and in fact might actually disrupt the ability to get a true representation of the particle system behavior).
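A toy sketch of the age-driven case (my own function; in practice the material’s opacity map would drive this rather than explicit code):

```python
def density_by_age(age, lifespan, base_density=1.0):
    """Sketch of age-driven density scaling: full influence at birth,
    fading linearly to zero at end of life, like dissipating smoke."""
    if lifespan <= 0:
        return 0.0
    t = min(max(age / lifespan, 0.0), 1.0)
    return base_density * (1.0 - t)

print(density_by_age(15, 30))  # halfway through life -> 0.5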
I made a script that assigns UVW coordinates to the particles when they are born, based on the XYZ position of the particle at birth. Works well, but let’s just say maxscript running per-particle on 10 million particles is a good way to emulate a lockup. <<
Agreed. But without writing a new operator, it may be the only solution at the moment. <<
We were on the beta for box#3, and I think it could do it much faster. We’ll know soon.
As we kicked around in another thread, even fully influential particles are still affected by their distance from the camera, and the number of neighbors they have within the space covered by a pixel. (Likewise, if you have enough particles, even at near zero influence, they can still add up to create some visible rendered density.) Instead of second-guessing the renderer, you should define behavior rules for the particle system and let the resulting location of the particles determine the final rendered look. <<
The render is one thing. But I still have to worry about how many particles are processed, transferred, and stored. Even if Krakatoa is smart enough to render them efficiently (it is, I did some tests), I still want pflow and the Krakatoa caching to be efficient. When density = 0, I’m still going to want to cull them. The other conditions are more complex, and would be done as needed.
I think that density scaling by material opacity would be most useful for decreasing influence of a particle based on age, or by conditions dictated by the birth location of a particle. For example, smoke particles may dissipate over time such that they are too small to be visible, or smoke particles may vary in density if coming from an area of burning plastic versus burning wood. However, just because I know my densest particles are in one location, I’m not sure that forcing the particle system to emit fewer particles there would provide any significant optimization (and in fact might actually disrupt the ability to get a true representation of the particle system behavior). <<
Right now, I’m having a very, very hard time hitting 20 million, so I have to cull something. When’s the Max 9 version expected?
I hear what you are saying about the material vs spatial density, though. Perhaps I could, via script or box#3, birth particles based on a map so that in areas where they are dense, we get more particles? Problem is just the numbers… If all the particles are the same density, in order to get a “density gamut” of 30 to 1, I’d need HEAPS more particles than I have now. I need ~3 particles per pixel at a bare minimum now, and that would mean for the more dense areas I’d need 90 particles per pixel minimum. I don’t think that’s going to be possible while still keeping it under 20 million total. Especially if I need to cover a large portion of the image, or worse, have to place the camera inside the particle system!
- Chad
Did we figure out what the depth maps were? <<
I had Mark explain it to me in layman’s terms, and I still didn’t understand the specifics of the benefit it adds.
In my current crude level of understanding, it is somewhat analogous to a shadow map, and provides information about the z-depth at each point, used for improving performance of matte object rendering. I’m not sure yet why you would ever want to turn it off.
I’ll need Mark to jump in with more info.
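To illustrate my crude understanding (a sketch of the idea as I read it, not Krakatoa’s implementation): like a shadow map, the depth map would store the nearest matte-object z per pixel, letting the renderer cheaply skip any particle behind it.

```python
import math

def particle_visible(particle_z, matte_depth_map, px, py):
    """Sketch of the depth-map idea as described above: matte_depth_map
    is a hypothetical per-pixel store of the nearest matte-object z
    (math.inf where no matte object covers the pixel); a particle
    behind that depth is occluded."""
    nearest_matte_z = matte_depth_map.get((px, py), math.inf)
    return particle_z < nearest_matte_z

# Usage with a toy dict-based depth map:
depth_map = {(10, 10): 250.0}  # a matte surface 250 units deep at pixel (10,10)
print(particle_visible(300.0, depth_map, 10, 10))  # False: behind the matte
print(particle_visible(100.0, depth_map, 10, 10))  # True: in front of it
```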
10 million particles isn’t the limit for Pflow, it’s the limit per source. <<
Thank you. That’s an important distinction I completely overlooked.