Scaling of particles?

Why do far away particles become less dense? If they are being scaled by distance to the camera, can I control that? Or better, have it work on near particles, so they become larger than normal?



I have a camera, and I drew a spline from the camera out to a point really far away, made it renderable, and had that be the position object for some pflow particles. The near particles are very dense, but have lots of empty space between them. The middle particles blend together to form a solid mass. The far particles, however, are LESS dense. The space between the particles in screen space is much smaller for them, so the only reason I can think of that they would be less dense is that they are being scaled or weighted so that far away particles show up less.



Anyway, I’m not sure if it’s normal or not.


  • Chad

I'll leave the technical specifics for Mark to explain.  I took a visual approach to this, and I'm certainly not familiar with the math behind the rendering.

I ran a simple test consisting of a uniform grid of particles receding into the horizon.  This illustrates the worst and best cases for Krakatoa: there are rows of solitary particles in places, converging into dense spots toward the horizon.

In the case of "Additive Density" I found that all particles were in fact uniformly dense, from close proximity out to the distance.

In the case of "Volumetric Density", single particles did grow less dense in the distance.  Where multiple particles converged, density visually appeared to behave as expected. 

I've been told that single particles break internal assumptions for volumetric shading. You need to start stacking them up to start seeing natural-looking results. Krakatoa seems to work best with many millions of particles.  Cases that run thin on particles may begin to exhibit some of the results you describe.  I'll need Mark to jump in here with the technical specifics.

What you’re observing is the alpha of the particles changing based on how far away they are from the camera. Krakatoa is drawing the particles based on a model of volumetric density, which means that particles closer to the camera represent much more area than the distant particles, hence the alpha difference.



We could add another ‘constant alpha blend’ mode which would render all the particles with the same alpha, hence no longer following a volumetric model. This would prevent lighting and shading from converging to a consistent result, however. In particular, the occlusion due to lighting wouldn’t match the occlusion visible from the camera, and it would change depending on how big you make the shadow buffer.
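As a rough illustration of Mark's point (a toy model I'm sketching for intuition, not Krakatoa's actual shading code): if each particle carries a fixed physical cross-section, the physical area a pixel sees grows with the square of the distance, so each particle covers a smaller fraction of the pixel the farther away it is, and the composited alpha drops accordingly.

```python
def pixel_alpha(num_particles, particle_area, distance, pixel_solid_angle=1e-6):
    """Toy model: alpha contributed by particles of fixed physical
    cross-section seen through one pixel at a given distance.
    The pixel's physical footprint grows as distance**2, so each
    particle covers a smaller fraction of it at greater range."""
    footprint = pixel_solid_angle * distance ** 2   # physical area the pixel sees
    coverage = min(particle_area / footprint, 1.0)  # fraction covered per particle
    # Composite n independent partial coverages: 1 - (1 - coverage)^n
    return 1.0 - (1.0 - coverage) ** num_particles

# A few near particles can fully opaque a pixel while ten times as
# many distant ones barely register:
near = pixel_alpha(num_particles=30, particle_area=1e-4, distance=10)
far = pixel_alpha(num_particles=300, particle_area=1e-4, distance=1000)
```

With these made-up numbers, the near pixel saturates to alpha 1.0 while the far pixel stays under 0.05 despite holding ten times as many particles, which matches the "far particles are LESS dense" observation above.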



Cheers,

Mark

Yeah, more density would change the front end, but how would it change the far parts?



The furthest away areas would have “smaller” particles, but they would be much denser in particles per pixel, so shouldn’t that balance out?



I mean, I have HUNDREDS of particles per pixel. Seems like that should be enough, especially when some of the middle ground areas with just a few dozen particles per pixel are fully opaquing their pixels.

>>The furthest away areas would have “smaller” particles, but they would be much denser in particles per pixel, so shouldn’t that balance out? I mean, I have HUNDREDS of particles per pixel. Seems like that should be enough, especially when some of the middle ground areas with just a few dozen particles per pixel are fully opaquing their pixels.<<



Mark mentioned that Krakatoa considers the physical area covered by the particles when determining the density.

(As a real world analogy, it helps me to think of photographing a bunch of grapes up close, versus the same bunch hanging on a vine 100 feet away. In the distant case, they share the same physical density, but other objects in the same space would influence the light reaching my camera lens.)

If you have a specific case that doesn’t do what you are expecting, certainly send it to us for review. We want Krakatoa to be as versatile as possible, so we would like to get our hands on as many examples as possible.

In the case of the grapes, you would NOT expect the grapes to become transparent as they moved away from the camera. There’s no lighting in my scene, and no “fog” or other depth cue.



Grapes close to the camera and grapes far away should look identical in every respect except for the surface area they occupy on your film back.



Could the distance fading curve or multiplier or whatever be exposed to the UI?


  • Chad

Again, coming back to my crude grape example:

Close to the camera, they could possibly represent 100% coverage of the space shown in a given pixel, thus creating an alpha of 1.0.

But at a distance, the pixel is showing a larger physical area, so they might only be covering 25% of it, and there isn't adequate coverage for a fully opaque alpha.  Other portions of the space are empty, thus contributing 0 alpha to the pixel.
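The coverage arithmetic above can be sketched like this (again a purely illustrative compositing model with my own numbers, not Krakatoa internals): a grape covering 100% of a pixel is opaque outright, while a grape covering 25% needs several independent layers stacked behind it before the pixel approaches opacity.

```python
def stacked_alpha(coverage, layers):
    """Alpha after compositing `layers` independent partial coverages,
    each hiding the same fraction of whatever light remains."""
    return 1.0 - (1.0 - coverage) ** layers

full = stacked_alpha(1.0, 1)      # 100% coverage: fully opaque immediately
quarter = stacked_alpha(0.25, 1)  # 25% coverage: alpha of only 0.25
stacked = stacked_alpha(0.25, 8)  # eight such layers: roughly 0.9
```

So even at 25% coverage per particle, stacking enough of them behind one another does drive the pixel toward opacity, just more slowly than a single full-coverage particle would.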

>>Could the distance fading curve or multiplier or whatever be exposed to the UI?<<

I'll log this as a wish in the bug database.