AWS Thinkbox Discussion Forums

Error opening Krakatoa render

Hello,

I haven’t played with Krakatoa for some time (very busy at work), but after a couple of betas I can’t render even a simple scene.

I’m getting a “Bad allocation” error. No particle partitioning, caching, or other new Krakatoa features are being used.





Max File attached… (3dsmax8 SP3, PF ToolBox #1)



Best Regards,

DeKo

I opened the file in Max 9 32bit, hit render and it rendered.



Will try it in Max 8 now.



Cheers,



Borislav “Bobo” Petrov

Technical Director 3D VFX

Frantic Films Winnipeg

Just did the same in Max 8 - works fine.

Okay, I tried some of the later frames in the sequence and had a problem - I reckon it was a memory allocation problem, which might have more to do with Particle Flow itself than with Krakatoa.

I tried running the Analyzer - the scene has a max. of 15M particles and would require 562MB to load the last frame. Any 32 bit machine with 2GB of RAM should be able to handle that. If memory is less than 2GB, it could be an issue…
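For a rough sanity check, the Analyzer’s figures above imply around 39 bytes per particle. Here is a small Python sketch of that estimate; the per-particle cost is derived backwards from those numbers, not from Krakatoa’s actual channel layout, so treat it as a ballpark only:

```python
# Derived from the Analyzer's figures above: 15M particles -> ~562 MB.
# That implies roughly 39 bytes per particle; the real cost depends on
# which channels Krakatoa loads, so this is a ballpark estimate only.
BYTES_PER_PARTICLE = 562 * 1024**2 / 15_000_000  # ~39.3 bytes

def estimate_mb(particle_count):
    """Approximate memory (MB) needed to load a frame's particles."""
    return particle_count * BYTES_PER_PARTICLE / 1024**2

print(round(estimate_mb(15_000_000)))  # 562
print(round(estimate_mb(27_000_000)))  # 1012 -- about the 1 GB mentioned below
```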



(I also found a problem with the Analyzer with large numbers - the average value was wrong because I exceeded the largest possible integer while collecting particles. Will have to fix it)
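The averaging bug described above is a classic narrow-accumulator overflow. A hypothetical Python illustration (the per-frame counts are invented, and Krakatoa’s actual code is C++, where the 32-bit sum would silently wrap):

```python
INT32_MAX = 2**31 - 1

def avg_32bit(counts):
    """Average with a 32-bit signed accumulator, as in the bug above."""
    s = 0
    for c in counts:
        s = (s + c) & 0xFFFFFFFF  # wrap like a 32-bit register
    if s > INT32_MAX:
        s -= 2**32  # reinterpret the wrapped bits as signed
    return s // len(counts)

def avg_64bit(counts):
    """The fix: a wide accumulator cannot overflow on these sums."""
    return sum(counts) // len(counts)

counts = [15_000_000] * 200  # invented per-frame particle counts
print(avg_32bit(counts))  # negative garbage: the sum passed 2**31 - 1
print(avg_64bit(counts))  # 15000000
```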

Another thing I discovered - The Analyzer tries to extrapolate the render count from the viewport count. When running with 1% viewport, I got different results compared to 10% viewport. With 1% viewport, it actually claims that the render count would be 27M or 1GB of RAM. The only way to know would be to run the analyzer in full 100% viewport mode which would take forever.



In other words, using random-seed-based operators in a PFlow can make the extrapolation incorrect.
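A sketch of the extrapolation in question; the viewport counts are invented here to reproduce the mismatch, since the actual Analyzer logic isn’t public:

```python
def extrapolated_render_count(viewport_count, viewport_percent):
    """Naive linear scale-up from a partial viewport count."""
    return viewport_count * 100 // viewport_percent

# Invented counts: a seed-based operator kills a different share of
# particles at different viewport quantities, so the two settings
# extrapolate to very different render totals for the same scene.
print(extrapolated_render_count(1_500_000, 10))  # 10% viewport -> 15000000
print(extrapolated_render_count(270_000, 1))     # 1% viewport  -> 27000000
```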



It is better to save all particles to PRTs, then load them for rendering. This way, PFlow will have all the memory available during the saving, then Krakatoa will have all the memory during the rendering. Otherwise, they have to share, and PFlow is more memory hungry than Krakatoa…



Cheers,



Borislav “Bobo” Petrov

Technical Director 3D VFX

Frantic Films Winnipeg

Thanks for the help, guys,



With saving particles to PRTs, I get the same error.

Now I’m trying the Cache to Disk option from Box #3; maybe that helps.



Btw,

I tried using the 3GB switch:

http://www.microsoft.com/whdc/system/platform/server/PAE/PAEmem.mspx

http://www.vfxpedia.com/index.php?title=FAQ/3GB_Switch



and now I get fewer errors (at least in that frame range).
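For reference, the 3GB switch on 32-bit Windows XP is enabled by adding /3GB to an entry in C:\boot.ini, as described in the articles linked above. A typical example (the ARC path depends on the machine, so copy your own existing entry rather than this one):

```ini
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP Professional" /fastdetect
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP Professional (3GB)" /fastdetect /3GB
```

Note that an application only sees the extra address space if its executable is flagged large-address-aware; otherwise it stays capped at 2 GB regardless of the switch.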



My PC Specs:

Intel Core 2 6300 1.86GHz

2GB RAM

Radeon X1600 Pro

Win XP Pro SP2 (32b)







Best Regards,

DeKo

Try saving smaller PRTs. Start with 10,000 per partition and see if that works. If that fails, then there’s definitely something amiss.


  • Chad

With a lower particle count everything is OK.

But with a higher one I get the “Bad allocation” error pretty often.

The only workaround is to restart 3ds Max entirely and try again… :)

…hoping that next time a few more frames will render :)





Best Regards,

DeKo

Please define “low” and “high”, actual numbers would be very useful.



This is going to happen in 32-bit Max no matter what we do - 32-bit memory gets fragmented and is limited to 2 GB per application (3 GB with the switch, 4 GB on a 64-bit OS).

The only (expensive) solution for running Krakatoa without Bad Allocation errors is Max 9 64-bit on WinXP 64. It is also twice as fast there, simply because of the better memory management.


Why is it more expensive? Other than upgrading the OS.


  • Chad

Why is it more expensive?

Other than upgrading the OS.


  • Chad



    New 64 bit hardware + more RAM + new OS.

    Some people still run 4 year old machines…

    I am not suggesting everybody should switch to 64 bit to use Krakatoa, but it surely makes things easier…



    Cheers,



    Borislav “Bobo” Petrov

    Technical Director 3D VFX

    Frantic Films Winnipeg

Yeah, I definitely must switch to x64



Thanks for the tips ;)



Best Regards,

DeKo

He has the right kind of CPU, and as long as he doesn’t mind swapping, he has enough memory.



With x64, you can swap terabytes of data to disk, meaning you only run out of memory when your hard disk is full.



Slow, but at least it doesn’t get a bad allocation.



I did a little render here with over 200 million particles BEFORE depth of field. It was swapping to disk, but render times only went up 20x or so. It didn’t crash once.


  • Chad