AWS Thinkbox Discussion Forums

Guess what?

That’s what :smiley:

Haha cool, a few improvements in that one, at least 3 that I can see :wink:

I don’t think you realize what this really is… :mrgreen:

Mental ray?

From the file name I gather it’s Mental Ray or some other render thingy. I see shadows being received and cast, as well as what appear to be proper reflections. If it is none of those three, you have me completely stumped!

The image file name was selected to be deliberately misleading. :smiling_imp:

It’s not atmospheric, but raytraced reflections in the point renderer? All those objects are PRT Volumes.

Nope, try again! :smiley:

geometry in krakatoa?

Some day maybe, but not yet.

Converted a raytraced image into particles, then rendered that in Krak.

Or you are just tormenting us.

I AM tormenting you :smiling_imp:

Here is a hint:
youtube.com/watch?v=5-dpIpIhfmg

The cat in the box is dead?

Ok, here is the deal. The “M” stands for “Magma”.
This is not a new Krakatoa feature, but a showcase of Magma 2’s power:

Very cool. ICE cool. :wink:

How are you integrating this in the viewport?

I created a Plane in front of the Camera, converted it to a PRT Volume, and parented it to the camera so it moves with it. That’s the Image Plane. I can switch between grid and jitter distribution; the noise you see in the image is obviously caused by the Jitter. The image was originally rendered at 1920 resolution and then resized down 50% to get rid of some of the noise.
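For anyone curious about the grid vs. jitter distinction, here is a minimal Python sketch of the idea: one sample per pixel on the image plane, either at the pixel center or randomly offset within the pixel. The function name and parameters are made up for illustration; this is not Krakatoa code.

```python
import random

def image_plane_samples(width, height, jitter=False, seed=42):
    """Generate one sample point per pixel on a unit image plane.

    Grid mode places each sample at the pixel center; jitter mode
    offsets it randomly within the pixel, which is what produces the
    grain-like noise mentioned above.
    """
    rng = random.Random(seed)
    samples = []
    for y in range(height):
        for x in range(width):
            ox, oy = (rng.random(), rng.random()) if jitter else (0.5, 0.5)
            samples.append(((x + ox) / width, (y + oy) / height))
    return samples
```

Downsampling the render 50% then averages neighboring jittered samples, which is why it hides some of that grain.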


The KCM shoots a ray from the camera through each particle, intersects geometry in the scene, traces the reflections, and sets the Color and Emission of the particle to the resulting color. In the current test, one primary ray plus two reflection bounces are calculated, each with raytraced shadows, so six ray intersections are performed per particle. The same code also calculates the surface shading.
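Roughly, here is what that graph computes per particle, sketched in plain Python. The `intersect` callback, the vector helpers, and the reflection falloff are all made-up stand-ins, not Krakatoa API; the point is the intersection count: three view/reflection rays plus three shadow rays makes the six tests per particle.

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)

def normalize(a):
    n = dot(a, a) ** 0.5
    return scale(a, 1.0 / n)

def reflect(d, n):
    # Mirror direction d about surface normal n.
    return sub(d, scale(n, 2.0 * dot(d, n)))

def shade_particle(intersect, camera_pos, particle_pos, light_pos, bounces=2):
    """Trace one primary ray plus up to `bounces` reflection rays.

    Each hit also fires one shadow ray toward the light, so with two
    bounces up to six intersection tests run per particle.
    `intersect(origin, direction)` is assumed to return
    (hit_point, normal, surface_color) or None on a miss.
    """
    color, weight, tests = (0.0, 0.0, 0.0), 1.0, 0
    origin = camera_pos
    direction = normalize(sub(particle_pos, camera_pos))
    for _ in range(1 + bounces):
        hit = intersect(origin, direction)
        tests += 1
        if hit is None:
            break
        point, normal, surface = hit
        # Shadow ray: is the light visible from the hit point?
        to_light = normalize(sub(light_pos, point))
        lit = intersect(add(point, scale(to_light, 1e-4)), to_light) is None
        tests += 1
        diffuse = max(dot(normal, to_light), 0.0) if lit else 0.0
        color = add(color, scale(surface, weight * diffuse))
        weight *= 0.5  # crude falloff per reflection bounce
        origin = add(point, scale(normal, 1e-4))
        direction = reflect(direction, normal)
    return color, tests
```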

The first iteration of this experiment produced a faceted look because I had to use the Face Normal output of IntersectRay. So we added an option to the FaceQuery operator to get the Smoothed (Render-time) Normals.
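The difference between the two normals can be sketched like this: the flat face normal is constant across a triangle, while the smoothed render-time normal interpolates the per-vertex normals with the hit's barycentric weights. A minimal illustrative sketch, not Krakatoa code:

```python
def smoothed_normal(vertex_normals, bary):
    """Interpolate three per-vertex normals with barycentric weights
    and renormalize, which is what removes the faceted look that a
    constant face normal produces."""
    n = tuple(sum(w * vn[i] for w, vn in zip(bary, vertex_normals))
              for i in range(3))
    length = sum(c * c for c in n) ** 0.5
    return tuple(c / length for c in n)
```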

There is one operator that is still missing, so I had to “fake it” in this setup. Getting the wireframe color of the object that was hit was not possible, because the Geometry input and the Object input are currently handled separately. We would need an operator that returns the node of the geometry that was hit, and then a PropertyQuery to get the object color. In the current demo, the colors are hard-coded to match the order of the objects on the InputGeometry list.
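In other words, the workaround is just an indexed lookup: since the missing hit-object-to-node operator can't supply the wireframe color, a table hard-coded in the same order as the InputGeometry list stands in for it. A hypothetical sketch (the names and colors are invented for illustration):

```python
# Hard-coded stand-in for the missing "hit object -> node" operator:
# one color per object, in the same order as the InputGeometry list.
WIRE_COLORS = [
    (1.0, 0.0, 0.0),  # object 0 on the InputGeometry list
    (0.0, 1.0, 0.0),  # object 1
    (0.0, 0.0, 1.0),  # object 2
]

def color_for_hit(geometry_index):
    """Select the hard-coded color by the hit's geometry index."""
    return WIRE_COLORS[geometry_index]
```

With the missing operator, this table would be replaced by a PropertyQuery on the returned node.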

Haha, you get pie!

(Attachment: DulcedeLech ApplePie.jpg)

That is pretty cool!
