Krakatoa 1.5 Alpha Teaser Movie

Hi guys,

We are preparing a first Alpha drop for you to play with soon.
In the meantime, we wanted to give you something to look forward to.

Both sides of the movie show the same 2 million particles rendered in Krakatoa 1.5 with the same lighting.
The only difference is the render mode. Simply flip a switch and…

Enjoy!

Sweet… Voxels! Do tease :smiley: cleans up drool on keyboard

wow… wow wow… I just fell off my desk… Amazing, sweet stuff! I think I will wait and try this version on Tsunami/wave’s foam cloud particles… cool stuff! :sunglasses:

I think I just tinkled in my pants. Any chance of seeing how well it works with the particles moving around?

looks awesome! can’t wait…

Sure.

Here are some movies showing the default 1M particles teapot we used in our first tutorial.
One spotlight, Volumetric shading with Density of 5/-4.

The main control parameters for voxel rendering are the Voxel Size (in generic units) and the Voxel Filter Radius (the number of neighbor voxels to filter over).
The tests posted here use a Filter Radius of 2 (1 looks crisp; higher values blur more and are slower).
I used voxel sizes of 0.5, 2.5, and 0.25 to show you the difference.
I also rendered the same particles as points for comparison.
Krakatoa_Voxels_Teapot_Points_v001.MOV (2.03 MB)
Krakatoa_Voxels_Teapot_V0_25_VFR2_v001.MOV (2.06 MB)
Krakatoa_Voxels_Teapot_V2_5_VFR2_v001.MOV (1.19 MB)
Krakatoa_Voxels_Teapot_V0_5_VFR2_v001.mov (1.69 MB)
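To make the two controls above concrete, here is a minimal sketch (my own illustration, not Krakatoa's actual implementation) of the general idea: particles are binned into a grid of cells of a given voxel size, and each voxel is then averaged with its neighbors out to the filter radius. All function names and constants here are made up for the example.

```python
import numpy as np

def splat_particles(positions, densities, voxel_size, grid_shape):
    """Accumulate per-particle density into voxels of the given size."""
    grid = np.zeros(grid_shape)
    idx = np.floor(positions / voxel_size).astype(int)
    for (i, j, k), d in zip(idx, densities):
        # drop particles that land outside the grid
        if 0 <= i < grid_shape[0] and 0 <= j < grid_shape[1] and 0 <= k < grid_shape[2]:
            grid[i, j, k] += d
    return grid

def box_filter(grid, radius):
    """Average each voxel with neighbors up to `radius` voxels away.
    A radius of 1 keeps things crisp; larger radii blur more and cost more,
    since the kernel covers (2*radius + 1)**3 voxels."""
    out = np.zeros_like(grid)
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            for dk in range(-radius, radius + 1):
                out += np.roll(grid, (di, dj, dk), axis=(0, 1, 2))
    return out / (2 * radius + 1) ** 3

# Example: 1000 unit-density particles in a 4x4x4 region, voxel size 0.5
rng = np.random.default_rng(0)
pos = rng.uniform(0, 4, size=(1000, 3))
dens = np.ones(1000)
grid = splat_particles(pos, dens, voxel_size=0.5, grid_shape=(8, 8, 8))
smooth = box_filter(grid, radius=2)
```

Note that the filtering pass conserves total density; it only spreads it out, which is why larger filter radii read as softer, blurrier volumes.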

And here is a comparison of a camera flying through the cloud in Voxel and in Point mode!

:open_mouth: The fly-through is, well… I’m kinda at a loss for an ultimately cool descriptive word for it!

How are render times, just for the sake of curiosity?

It is a bit early to talk about what the render times will be. We had to make it work first before looking into making it fast.
Right now, still single-threaded and unoptimized, it is approx. 10 times slower than point rendering.

The good news: it is great for multi-threading, unlike point rendering, which is rather tricky to run in parallel.

And then there is the fact that voxel rendering scales differently than point rendering. Point rendering scales more or less linearly: 10 times more particles means 10 times longer rendering. Voxels, on the other hand, scale with the number of voxels to be processed, since encoding the particles into voxels is very fast, so 10 times more particles at the same voxel size does not necessarily mean 10 times longer render times.

The one million particles in these tests took 3 seconds in point mode and around 30 in voxel mode. These times could go down depending on the voxel size and the voxel filter radius, affecting quality a bit.
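The scaling argument above can be sketched as a back-of-envelope cost model. This is purely illustrative, not a description of Krakatoa's internals; the constants are made up, loosely anchored to the 3 s / 30 s figures quoted for 1M particles.

```python
def point_render_cost(n_particles, per_particle=3e-6):
    """Point mode: cost grows linearly with particle count.
    1M particles * 3 microseconds each ~= the 3 s figure above."""
    return n_particles * per_particle

def voxel_render_cost(n_particles, n_voxels, filter_radius,
                      splat=0.2e-6, per_voxel=1e-6):
    """Voxel mode: a cheap per-particle splat, plus per-voxel work
    that grows with the filter kernel size, not the particle count.
    All constants here are hypothetical."""
    kernel = (2 * filter_radius + 1) ** 3
    return n_particles * splat + n_voxels * per_voxel * kernel

# 10x more particles: point cost grows exactly 10x, but voxel cost grows
# far less, because the voxel grid (and thus most of the work) is unchanged.
p1, p10 = point_render_cost(1e6), point_render_cost(1e7)
v1 = voxel_render_cost(1e6, n_voxels=200_000, filter_radius=2)
v10 = voxel_render_cost(1e7, n_voxels=200_000, filter_radius=2)
```

Under this toy model, the voxel-mode cost is dominated by the grid term, which is also why shrinking the voxel size or raising the filter radius costs more while the particle count barely matters.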

I will have to run some more benchmarks to see what happens when rendering 10 or 100 million particles using both methods. We hope to post an Alpha build this month so you can play with it yourself…

Completely understood, my curiosity always gets the best of me, lol.

Sorry for stating the obvious but the “niche” renderer will no longer be a “niche” renderer, this just opened the doors to a whole new realm of possibilities :wink:

Excited to get a Krak at it :smiley:

I know the whole point is to just flip a switch to populate the voxel array with density and color from the points, but will there be methods for intercepting the voxel data so we can apply maps to it? Or to completely skip the point-to-voxel conversion and just provide the renderer with our own pre-cached voxel array?

  • Chad