By the numbers

Admittedly, I’m new to Krakatoa and was wondering what kind of particle counts people are using in production. I’ve heard about maximums and theoretical capabilities, but I’d like to get a sense of what people are finding to be good practice. At what point does it make sense to switch to particle partitioning?



Thanks,



Khye Kading

Khye,



I would like to mention some technical limitations.



*Max 8 running on Win32 can address only 2GB of RAM, so Krakatoa will surely have less than that. Once the available memory is used up (or if it is too fragmented, even earlier), Krakatoa will exit with a Bad Allocation error. It usually does not crash to desktop but once you see the error pop up, you have to restart Max.



*Max 8 and 9 32 bit on Win64 can address up to 4GB of RAM, so Krakatoa gets a bit more headroom, but still can exit with Bad Allocation.



For the above reasons, a stand-alone version of Krakatoa was used in the production of Superman Returns, for example. It did not have to share memory with Max and could render about 50M particles in one pass, so the shots with a billion particles really consisted of a lot of particle layers.



With Max 9 64-bit on WinXP 64, the memory limit was effectively removed. Right now we mostly have 4GB machines, which can handle about 60M particles without swapping in Krakatoa 1.0.x, but once the physical memory limit is reached, Krakatoa 64 does not exit with a Bad Allocation error - it just continues, slowly. (Chad Capeland will tell you stories about the billions of particles they have been crunching lately.)



In all these cases, the internal memory demand was 38 bytes per particle.



With Krakatoa 1.1.0, we started to reduce the memory footprint by allocating memory only if a channel is actually used. We also added a calculator that shows how many particles you can fit in a given amount of memory, or how much memory a given number of particles requires. With most features disabled (for example no motion blur, no normals and lighting, and a single custom color), one particle needs only 14 bytes instead of 38, more than doubling the number of particles you can fit in memory.



Of course, most of the time you will want to use these features, so the memory use will be closer to 38 bytes/particle. We are therefore looking at ways to compress the color and normal data in the coming builds (see the related threads) to allow even more particles to fit in memory.
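As a rough back-of-the-envelope version of that calculator (a minimal sketch, assuming only the 38 and 14 bytes/particle figures quoted above and an arbitrary 1 GiB reserved for Max and Windows), the arithmetic looks like this:

```python
# Rough particle/memory estimator. Illustrative only -- this is NOT the
# calculator built into Krakatoa 1.1.0, just arithmetic based on the
# per-particle sizes quoted in this thread.

BYTES_FULL = 38      # all channels: motion blur, normals, lighting, color
BYTES_MINIMAL = 14   # most features disabled, single custom color

GIB = 1024 ** 3

def memory_needed_gib(particle_count, bytes_per_particle=BYTES_FULL):
    """GiB required to hold the given number of particles in memory."""
    return particle_count * bytes_per_particle / GIB

def max_particles(ram_gib, bytes_per_particle=BYTES_FULL, overhead_gib=1.0):
    """Particles that fit, assuming `overhead_gib` is reserved for Max,
    Windows and everything else (an assumption -- adjust as needed)."""
    usable_bytes = (ram_gib - overhead_gib) * GIB
    return int(usable_bytes // bytes_per_particle)

# Sanity checks against the numbers in this thread:
print(memory_needed_gib(60_000_000))     # ~2.12 GiB at 38 bytes/particle
print(max_particles(4, BYTES_FULL))      # ~84M particles in 4 GB minus overhead
print(max_particles(4, BYTES_MINIMAL))   # ~230M particles with minimal channels
```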



Ok, so these were the technical aspects, let’s hear from users what numbers they have processed lately… :o)

With the new beta I was able to render 226 MegaPoints while I still had another copy of max and a copy of Fusion running. That’s with density, color, and normals. I know I could have done more if I had a larger dataset.



As Bobo mentioned, if you are willing to swap to disk, you can get over 1 GigaPoints to render if you are very patient.


  • Chad

I’m starting to get up to speed on the Krakatoa workflow. First, I want to say that there is good information in the Krakatoa product documentation - some of it I should have read before posting in the first place. I do have some more production-related questions.



1) Since the latest beta, I’m finding my 32-bit version (no 3GB switch) runs out of memory around 21M particles. This is the most basic scene, without lights or other geometry.

2) Once you run into your limit, do you A) render every Nth particle in different passes, or B) go to disk swapping?

3) I read on the product page about, and have experienced, the difference in rendering at different resolutions. In people’s experience, do you just test at final resolution, or is there a neat correlation between resolution and density (i.e. resolution/3 == density/3)? Am I correct in thinking that the density is related more to resolution than to world space? If that is the case, can you lock density to the rendering resolution (in the same way there is a padlock to lock the aspect ratio)?



Khye

>I'm starting to get up to speed on the workflow of Krakatoa.
>Firstly, I want to say what good information is on the
>Krakatoa product documentation. There was some information
>there that I should have read before posting in the first
>place. I do have some more production-related questions.

We have not updated the docs for a while because our internal version already covers 1.1.0, but we will soon. If you find anything that should be covered in the documentation but is not, please let us know!

>
>1)Since the latest beta I'm finding my 32bit version (no 3gb
>switch) runs out of memory around 21m. This is the most
>basic scene without lights or other geometry.

32-bit has the drawback of a fragmented memory space - sometimes it can run out of memory even earlier than expected. 21M particles with no lighting should require "only" about 400MB of memory. What is Max's memory usage in Task Manager before you start rendering? Also, are you using geometry vertices as particle sources? We found some really nasty memory behavior when rendering vertices and will try to fix it ASAP.

Krakatoa really loves 64-bit and never runs out of memory (read: crashes); it just gets very slow when Windows has to swap. Thus, we highly recommend using 64-bit Max and Windows for Krakatoa production work.

>2)Once you run into your limit do you A)render every nth
>particle in different passes or B) Go to disk swapping.

Neither is really a good option - rendering every Nth particle will not give you the same results after compositing if you are doing lighting. What we used to do for such shots was split the particles into layers by depth, creating independent elements, each with its own full density.

You cannot swap to disk in 32-bit once you have reached the application's memory limit of 2GB - when your copy of Max is using more than 2GB of memory, that is the end of the line. The only solution is switching to 64-bit.
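The depth-layering workflow described above could be sketched roughly like this; it is only an illustration of the idea (slice the cloud into slabs along the camera axis, render each slab as its own full-density element, composite back to front), not a Krakatoa feature, and the function and parameter names are made up:

```python
import numpy as np

def split_by_depth(positions, cam_pos, cam_dir, num_layers):
    """Split a particle cloud into depth slabs along the camera view axis.

    Returns a list of index arrays, one per layer, ordered near to far.
    Each layer would then be rendered as its own full-density element and
    composited back to front.  Illustration only, not Krakatoa code."""
    cam_dir = np.asarray(cam_dir, dtype=float)
    cam_dir /= np.linalg.norm(cam_dir)
    depth = (np.asarray(positions, dtype=float) - cam_pos) @ cam_dir
    edges = np.linspace(depth.min(), depth.max(), num_layers + 1)
    edges[-1] += 1e-6  # make the last slab include the farthest particle
    return [np.nonzero((depth >= near) & (depth < far))[0]
            for near, far in zip(edges[:-1], edges[1:])]
```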

>3)I read on the product page and have experienced the
>difference in rendering at different resolutions. In peoples
>experience do you just test at final resolution or is there
>a neat correlation between resolution to density (ie
>resolution/3 == density/3). Am I correct in thinkign that
>the density is more related to resolution than world space?
>If this is the case can you lock density to rendering
>resolution (in the same way there is a padlock to lock
>aspect ratio)?

I suggest testing at final resolution with fewer particles - Krakatoa's render speed depends mainly on the number of particles, not on the screen resolution. Since it fills pixels with points, the more pixels there are, the finer the points and the lower the perceived density.

I will let the programmers comment on whether it would be possible to lock the density to the screen resolution. I assume that if it was easily possible, it would have been implemented already, but I could be wrong.

>In peoples experience do you just test at final resolution or is
>there a neat correlation between resolution to density (ie
>resolution/3 == density/3). Am I correct in thinkign that the
>density is more related to resolution than world space? If this
>is the case can you lock density to rendering resolution (in the
>same way there is a padlock to lock aspect ratio)?



The densities represent quantities in world space, and when the particles get drawn, the alpha gets adjusted to produce the same overall effect at different resolutions. For instance, if you double the resolution, the densities will in effect be divided by four to compensate. The exception is ‘Constant Alpha’ mode, which isn’t recommended for volumetric-type rendering.
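In code terms, that compensation amounts to scaling each particle's contribution by the inverse of the pixel area, something along these lines (a hedged sketch of the idea, not Krakatoa's actual shading code; `reference_width` is an arbitrary baseline chosen for illustration):

```python
def resolution_compensated_density(world_density, render_width, reference_width=640):
    """Scale a world-space density so the rendered result looks the same
    at different resolutions.  Doubling the resolution quarters the
    per-pixel contribution, as described above.  Sketch only."""
    return world_density * (reference_width / render_width) ** 2

# e.g. resolution_compensated_density(d, 1280) == d / 4 relative to a 640-wide render
```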



The one drawback you will find is that if there are not enough particles to give overlap, the equivalence won’t be as clear, because the averaging effects aren’t kicking in. In that case, the usual solution is to use more particles…



-Mark

As soon as you have fewer than 2 points per pixel, you will get odd density problems. Having more than 5 is better; it depends on a lot of factors. We noticed this in a shot where we were zooming in on something: as we zoomed, the number of points per pixel dropped, and at a certain point they started to break up and you could see the spaces between them. By adding more points, we were able to get a predictable relationship between coverage and density. When in doubt, more points are better. :)
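To sanity-check a shot against that 2-5 points-per-pixel rule of thumb, a crude estimate like the following can help; the `coverage_fraction` parameter (what fraction of the frame the cloud actually fills) is an assumption you have to supply yourself:

```python
def points_per_pixel(particle_count, render_width, render_height,
                     coverage_fraction=0.25):
    """Very rough average points-per-covered-pixel estimate.
    Assumes the cloud fills `coverage_fraction` of the frame and that the
    points are spread evenly -- neither is true in practice, so treat the
    result as a ballpark figure only."""
    covered_pixels = render_width * render_height * coverage_fraction
    return particle_count / covered_pixels

# 20M points filling a quarter of a 1920x1080 frame:
# points_per_pixel(20_000_000, 1920, 1080)  ->  ~38.6 points/pixel, comfortably above 5
```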



In the future it would be nice if we could enlarge particles based on the screen space they covered. So as the camera got closer, or the FOV got narrower, the size would change (but keep the same shadowing (or not)).



Likewise, it would be nice if the size of the points could be encoded into a PRT file - either with easy-to-compress values of 1/3/5/7/9 for 1x1, 3x3, 5x5, 7x7, 9x9, etc., or with a more accurate (but almost certainly slower to render) float size, where the number would be the radius.



Another idea is to do something like the DOF effect, where you create a disk of instanced points with a specified radius for each point. This is closest to what’s already available; if we could override the DOF and just set our own radius, either globally or (better) per point, that would make for some interesting (if fuzzy) rendering with only a small overhead (similar to what we get with DOF now).



Or have a rendering system where you specify a number of neighbors (N) and a max search radius (R), and Krakatoa, for each point, finds the nearest N points within distance R. Then it draws not a point, but an N-gon with vertices at those neighbor points and blends the colors, densities, and normals across the N-gon. Or it could generate an N-gon-shaped cloud of single-pixel points, like the DOF effect, but with interpolated values. This effect CAN be done with Box#3, but doing it at render time would be much more awesome, as it would require less memory, just as the DOF effect uses less memory than pre-calculating the circles of confusion. As soon as I get the new PRT Pflow ops, I’ll try this in Box#3, but I’m pretty sure it’s going to be painful.



There are, of course, other options. The system in Krakatoa now (fixed size with a global filter) is very fast, even with lots of points. Any other setup will likely be slower, but it could provide a net gain if it reduces the number of points needed, and thus the time needed to generate and load those points.


  • Chad