Krakatoa PRT and Cache Question

Hey Krakatoa forum,
I have been using the demo version of Krakatoa for about a month now. I just wanted to make a nice little wind animation. I saw the “Fight the grainy look” post, so I wanted to have a lot of particles. I went for 100 million particles, which spawn until frame 300; at frame 270 it begins to start some particles… so I had 70 million at frame 217. I wanted to cache the particles with the “Save Particles” tab. I started, but it took really long; in the later frames, a single frame took about 15 minutes to cache. Am I doing anything wrong? I saw in the “Grainy Look” post that movies like Superman use 2 billion particles, and in the example they had 100 partitions with 100 million particles!!! I have a good PC (Intel i7 4770K, GTX 770, 16 GB RAM).
After 7 hours of rendering I stopped at frame 217 (at 2:15 am). My RAM was at 100%, the CPU at about 30%.

My question is: did I do anything wrong, or is that normal? What is the difference between one partition with 100 million particles and 100 partitions with 1 million particles each? Is there any difference in render time? And the last question: can I run those renders overnight, or will I break my RAM? The PC is brand new, so I don’t want to risk anything.

Thanks for the great support I have had in this forum so far.
Matthias

PS: I’m German and only 16 years old, so sorry for my bad English!

*Sorry, I meant: at frame 270 it starts to delete some particles…

There is a (little-known) limitation of Particle Flow, so it is not recommended to go very high with the particle count.

If you are creating millions of particles at once in a single session and delete particles to give birth to new ones, at some point no new particles will be born, because there is an internal limit on the highest Born ID value that can be given to a particle. Unlike the old legacy particle systems, which had a limit of 64K particles but were able to recycle indices (if a particle died, another one could take its ID), PFlow assigns a unique ID to every particle EVER BORN in the system. According to the developer of PFlow, the limit on 64-bit systems is 1 billion particles.

So if you emit 100 million particles per second at, say, 25 fps, around frame 250 you will have produced a billion unique particles and no new particles can be born in PFlow.

If you emit 10 million per second in each of 10 partitions instead, every partition can go up to frame 2500 before this happens, because each partition runs as its own PFlow session with its own Born ID budget.
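Just to show the arithmetic behind those two numbers, here is a tiny sketch (plain Python, nothing Krakatoa- or PFlow-specific; the 1 billion cap and the 25 fps frame rate are the assumptions from above):

```python
# Rough estimate of the frame at which a single PFlow flow stops giving
# birth to new particles, assuming a cap of ~1 billion unique Born IDs
# per flow (the figure quoted above for 64-bit systems).
BORN_ID_LIMIT = 1_000_000_000

def last_birth_frame(particles_per_second, fps=25):
    """Frame by which one flow's Born ID budget is used up."""
    seconds = BORN_ID_LIMIT / particles_per_second
    return int(seconds * fps)

# One flow emitting 100 million particles per second:
print(last_birth_frame(100_000_000))   # -> 250

# Ten partitions, each emitting 10 million per second
# (each partition is its own Max session, so each gets its own budget):
print(last_birth_frame(10_000_000))    # -> 2500
```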

Then there is the memory. PFlow has to store all the particles it generates for the current frame. Krakatoa does NOT use memory when saving (only when rendering), but PFlow is not as memory-efficient as Krakatoa, so with too many particles it could run out of memory, while with fewer particles it would not. In fact, one could run multiple simulations in multiple copies of Max on the same machine and save partitions in parallel. It would probably be faster, because all CPUs would be loaded (PFlow is mostly single-threaded), and the memory would be split between the Max instances.
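To get a feel for why splitting the work helps with memory, here is a very rough estimator (again plain Python; the bytes-per-particle value is a made-up placeholder, since PFlow’s real per-particle cost depends on which channels and operators the flow uses):

```python
# Very rough memory estimate for running several partitions in parallel,
# each in its own copy of Max. BYTES_PER_PARTICLE is a placeholder, not a
# measured PFlow figure; the actual footprint depends on the flow.
BYTES_PER_PARTICLE = 200  # assumed for illustration only

def per_instance_memory_gb(total_particles, num_instances):
    """Approximate RAM each Max instance needs for its share of the particles."""
    per_instance = total_particles / num_instances
    return per_instance * BYTES_PER_PARTICLE / (1024 ** 3)

# 100 million particles in one session vs. split across 4 parallel sessions:
print(round(per_instance_memory_gb(100_000_000, 1), 1))   # all in one Max
print(round(per_instance_memory_gb(100_000_000, 4), 1))   # split across 4
```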

Also be sure to disable any PFlow operators that don’t affect Krakatoa. For example, Shape would be a very bad operator to have. Rotation is usually not needed, and so on.

The other thing is that on Superman Returns we had a network with over 200 machines controlled by Deadline, so we could run each partition on a different computer. Deadline also allows us to launch multiple copies of Max on the same machine with a single Max license (up to 16 instances of Max, in fact). You could install Deadline on a single machine for free (or, if you have two computers like a desktop and a laptop, you could run both for free). If you have 8 cores in your computer, you could launch up to 8 copies of Max without any additional licensing. Obviously, this should take a bit less time than a single CPU making ALL the particles at once…
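As a rough illustration of that last point, here is a small sketch of the wall-clock math (plain Python; the 90 minutes per partition is an assumed number, and it ignores the fact that parallel instances compete for RAM and disk):

```python
# Rough wall-clock estimate for saving partitions in parallel.
# Assumes every partition takes about the same time; in reality several
# Max instances on one machine will compete for RAM and disk bandwidth.
import math

def wall_clock_hours(num_partitions, minutes_per_partition, concurrent_instances):
    """Hours to finish all partitions with N Max instances running at once."""
    waves = math.ceil(num_partitions / concurrent_instances)
    return waves * minutes_per_partition / 60

# e.g. 10 partitions, 90 minutes each (assumed), one vs. eight instances:
print(wall_clock_hours(10, 90, 1))   # one Max at a time   -> 15.0 hours
print(wall_clock_hours(10, 90, 8))   # eight in parallel   -> 3.0 hours
```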