
KCM Noise Debug input vectors >1 (1.5.2.40961)

The Noise node gives bad debug results when all components of the input vector are greater than 1. The real output is fine; only the debug values are wrong.

  • Chad

Screenshot? Example? Or at least step by step repro case?

*I created a Teapot and a PRT Volume out of it
*Added a KCM and set it to output to Color, passing the Vector input through a DNoise with 3 Octaves and 4.0 Persistence
*Changed the Input vector value between 0,0,0 and 3,3,3 and could not see any bad Debug info.
*Also, in our current in-house version, we have the ability to draw a graph of the Output node based on changes to one or more inputs. When I graph the Output with the Vector input changing gradually from 0,0,0 to 3,3,3, the results are consistent and there is no visible discrepancy in the graph when the vector goes above 1,1,1. (see image; a rough scripted version of the same check is sketched below)
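As a stand-in for that graphing check outside of Krakatoa, the Python sketch below samples a noise function along the diagonal from (0,0,0) to (3,3,3) and reports the largest step-to-step change; a debug discontinuity past 1,1,1 would show up as a spike. The value_noise3 function here is a toy placeholder, not Krakatoa's actual DNoise.

    import math, random

    def corner(ix, iy, iz):
        # Hypothetical hashed lattice value in [-1, 1], deterministic per corner.
        return random.Random(f"{ix},{iy},{iz}").uniform(-1.0, 1.0)

    def lerp(a, b, t):
        return a + (b - a) * t

    def value_noise3(x, y, z):
        # Toy value noise: trilinear interpolation of the eight corner values.
        ix, iy, iz = math.floor(x), math.floor(y), math.floor(z)
        fx, fy, fz = x - ix, y - iy, z - iz
        def c(dx, dy, dz):
            return corner(ix + dx, iy + dy, iz + dz)
        x0 = lerp(lerp(c(0,0,0), c(0,0,1), fz), lerp(c(0,1,0), c(0,1,1), fz), fy)
        x1 = lerp(lerp(c(1,0,0), c(1,0,1), fz), lerp(c(1,1,0), c(1,1,1), fz), fy)
        return lerp(x0, x1, fx)

    # Walk the diagonal in 0.01 steps and compare consecutive samples.
    samples = [value_noise3(t, t, t) for t in (s / 100.0 for s in range(301))]
    worst = max(abs(b - a) for a, b in zip(samples, samples[1:]))
    print(f"largest step-to-step change: {worst:.4f}")  # small => no jump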

Not DNoise, just the 1D Noise. DNoise seems to be fine for me.

A KCM similar to yours, but with Noise instead of DNoise, returns 0.0 for vectors > [1,1,1].
As soon as any of the components falls below 1, you get a non-zero value.

Hmm, that is strange.

What do you get for [1.1,1.1,1.1]?
I am getting -5.76983 for both the Noise and the Output it is connected to.

Also, any vector with all-integer components seems to return 0, so [1,0,0], [1,1,1], [2,2,2], [3,3,3], [2,3,4], etc. all return 0, but that seems to be the nature of the beast: Perlin-style gradient noise is zero by construction at every integer lattice point.
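For what it's worth, the lattice behavior falls straight out of how gradient noise is built. Here is a minimal 1D sketch in Python; the grad hash is a stand-in, not Krakatoa's implementation, but the key property is the same: at an integer input the fractional offset is zero, so the left corner's contribution and the interpolation weight both vanish and the result is exactly 0. And with the usual power-of-two octave frequencies, integer inputs stay integers in every octave, so a 3-Octave sum is 0 there too.

    import math

    def fade(t):
        # Perlin's quintic smoothstep: 6t^5 - 15t^4 + 10t^3.
        return t * t * t * (t * (t * 6.0 - 15.0) + 10.0)

    def grad(i):
        # Stand-in hash: a pseudo-random gradient of +1 or -1 per lattice point.
        return 1.0 if (i * 2654435761) & 0x10000 else -1.0

    def noise1d(x):
        i = math.floor(x)
        f = x - i                      # fractional position inside the cell
        a = grad(i) * f                # left corner: 0 whenever f == 0
        b = grad(i + 1) * (f - 1.0)    # right corner contribution
        return a + (b - a) * fade(f)   # fade(0) == 0, so integer x gives 0

    def fractal1d(x, octaves=3, persistence=4.0):
        # Frequencies are powers of two, so an integer x stays an integer in
        # every octave and each octave contributes 0 at lattice points.
        total, amp, freq = 0.0, 1.0, 1.0
        for _ in range(octaves):
            total += amp * noise1d(x * freq)
            amp *= persistence
            freq *= 2.0
        return total

    for x in (0.0, 1.0, 2.0, 3.0):
        print(x, fractal1d(x))         # prints 0.0 for every integer input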

Oh, I see. Yeah, I was testing really small offsets, like [1.001, 1, 1], and the result was rounding off to 0.0 in the debug display. OK, just a dumb choice of debug numbers.
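That rounding makes sense: gradient noise grows roughly linearly with distance from a lattice point, so just off the lattice the value is on the order of the offset itself. Reusing noise1d from the sketch above (with a hypothetical two-decimal debug readout):

    v = noise1d(1.001)      # about +/-0.001, proportional to the 0.001 offset
    print(f"{v:.6f}")       # six decimals: clearly non-zero
    print(f"{v:.2f}")       # two decimals: displays as 0.00, like the debug did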
