Ok, I’ll have a look. Thank you for sending me the info!
Ok, I think I have a better understanding of what’s up. It’s definitely an issue with the versions of the VC90 common runtimes I’m using. I made a few changes to the build; can you try running this version of KrakatoaSR.dll? I’m not entirely sure it solves the problem, though, as both versions work on my computer.
If it does not work, I will attempt to run it on a clean copy of Windows XP 64 without any common runtimes installed and see what’s up.
KrakatoaSR.zip (3.19 MB)
Ahhhh… that’s much better! It works here now. Thanks a lot.
Unfortunately, my jubilation was too enthusiastic. The DLL works with a simple .exe file, but it does not work when I use it in a plugin.
MSVCR90.dll is still missing when I check with Dependency Walker. It is missing for the simple .exe file as well, but that one seems to work anyway.
I just discovered that MSVCR90.dll is missing from the old, working KrakatoaSR.dll too, so that does not seem to be the cause. In Dependency Walker I could see that other libraries are now included. I will attach a Dependency Walker output as soon as possible.
Yes, I just tested the new version. I’m able to use some of the APIs that were not loading earlier (like create_from_particle_stream_interface), but the renderer doesn’t load.
Additionally, my teammate noticed that the create_from_particle_stream_interface API crashes if the particle_stream_interface* supplied as its argument has zero particles (its particle_count method returns zero).
The call is inside a try-catchall block, in case the API threw an exception, but that didn’t help either. Can this be fixed as well?
If needed, I can open a separate thread in “bug reports”, for the second issue, for tracking.
thanks
arun
Thanks for pointing out the zero-particle bug. I will make sure that gets fixed.
I’m not sure what you mean when you say the renderer doesn’t load. Can you elaborate?
Here is the output of Dependency Walker; text file B is from a working version, file A is from the current, non-loading version.
mayaToKrakatoa.zip (20.1 KB)
I extended my simple program using your sample code from one of the first posts. When I try to run it, I get an “Entry point not found” error (?render@krakatora_renderer@krakatoasr@@QEAAXXZ).
Here’s the code I used. I removed all comments and the framebuffer call.
#include <stdio.h>
#include <iostream>
#include <string>
#include <cstdlib>
#include <vector>
#include "..\..\include\KrakatoaSR.hpp"

class my_frame_buffer_interface : public krakatoasr::frame_buffer_interface {
public:
    std::vector<krakatoasr::frame_buffer_pixel_data> m_pixels;

    virtual void set_frame_buffer( int width, int height, const krakatoasr::frame_buffer_pixel_data* data ) {
        m_pixels.resize( width * height );
        for( size_t i = 0; i < m_pixels.size(); ++i ) {
            m_pixels[i] = data[i];
        }
    }
};

class my_stream : public krakatoasr::particle_stream_interface {
    krakatoasr::channel_data m_position;
    krakatoasr::channel_data m_velocity;
    krakatoasr::channel_data m_density;
    krakatoasr::channel_data m_color;
    krakatoasr::INT64 m_currentParticle;
    krakatoasr::INT64 m_particleCount;

public:
    my_stream( krakatoasr::INT64 particleCount = 100 ) {
        m_position = append_channel( "Position", krakatoasr::DATA_TYPE_FLOAT32, 3 );
        m_velocity = append_channel( "Velocity", krakatoasr::DATA_TYPE_FLOAT32, 3 );
        m_density = append_channel( "Density", krakatoasr::DATA_TYPE_FLOAT32, 1 );
        m_color = append_channel( "Color", krakatoasr::DATA_TYPE_FLOAT32, 3 );
        m_currentParticle = 0;
        m_particleCount = particleCount;
    }

    krakatoasr::INT64 particle_count() const {
        return m_particleCount;
    }

    bool get_next_particle( void* particleData ) {
        if( m_currentParticle < m_particleCount ) {
            float myPosition[3] = { rand() * 10.0f / RAND_MAX, rand() * 10.0f / RAND_MAX, rand() * 10.0f / RAND_MAX };
            float myVelocity[3] = { 0.0f, 0.0f, 0.0f };
            float myColor[3] = { float(rand()) / float(RAND_MAX), float(rand()) / float(RAND_MAX), float(rand()) / float(RAND_MAX) };
            float myDensity = 1.0f;
            set_channel_value( m_position, particleData, myPosition );
            set_channel_value( m_velocity, particleData, myVelocity );
            set_channel_value( m_color, particleData, myColor );
            set_channel_value( m_density, particleData, &myDensity );
            ++m_currentParticle;
            return true;
        }
        return false;
    }

    void close() {}
};

int main( const int argc, const char** argv ) {
    if( argc != 2 ) {
        std::cout << "Usage: " << argv[0] << " <outputFileName.exr>" << std::endl;
        return EXIT_FAILURE;
    }
    const std::string outputFilename = argv[1];

    krakatoasr::krakatoa_renderer krakRenderer;
    krakRenderer.set_error_on_missing_license( false );
    krakRenderer.set_density_per_particle( 6.0f );
    krakRenderer.set_density_exponent( -1.0f );

    krakatoasr::shader_isotropic isotropicShader;
    krakRenderer.set_shader( &isotropicShader );
    krakRenderer.set_background_color( 0.47805f, 0.690593f, 0.903227f );

    krakatoasr::animated_transform cameraXForm;
    float cameraMatrix[16] = {
        0.707107f, 0.0f, -0.707107f, 0.0f,
        -0.408248f, 0.816497f, -0.408248f, 0.0f,
        0.57735f, 0.57735f, 0.57735f, 0.0f,
        10.0f, 12.0f, 10.0f, 1.0f,
    };
    cameraXForm.add_transform( cameraMatrix, 0.0f );
    krakRenderer.set_camera_tm( cameraXForm );
    krakRenderer.set_camera_type( krakatoasr::CAMERA_PERSPECTIVE );
    krakRenderer.set_camera_clipping( 0.01f, 10000.0f );
    krakRenderer.set_camera_perspective_fov( 60.0f );
    krakRenderer.enable_motion_blur( false );

    krakatoasr::point_light pointLight;
    pointLight.set_name( "the_light_that_I_created" );
    krakatoasr::animated_transform lightXForm;
    float lightMatrix[16] = {
        1.0f, 0.0f, 0.0f, 0.0f,
        0.0f, 1.0f, 0.0f, 0.0f,
        0.0f, 0.0f, 1.0f, 0.0f,
        50.0f, 0.0f, 0.0f, 1.0f,
    };
    lightXForm.add_transform( lightMatrix, 0.0f );
    krakRenderer.add_light( &pointLight, lightXForm );

    my_stream streamInterfaceObject( 100 );
    krakatoasr::particle_stream particleStream = krakatoasr::particle_stream::create_from_particle_stream_interface( &streamInterfaceObject );
    krakatoasr::animated_transform particleXForm;
    float particleMatrix[16] = {
        1.0f, 0.0f, 0.0f, 0.0f,
        0.0f, 1.0f, 0.0f, 0.0f,
        0.0f, 0.0f, 1.0f, 0.0f,
        0.0f, 0.0f, 0.0f, 1.0f,
    };
    particleXForm.add_transform( particleMatrix, 0.0f );
    particleStream.set_transform( particleXForm );
    krakRenderer.add_particle_stream( particleStream );

    my_frame_buffer_interface myFBI;
    krakRenderer.set_frame_buffer_update( &myFBI );
    krakRenderer.render();

    return EXIT_SUCCESS;
}
Thanks for the code, I’ll give it a try here and see what’s up.
Ok, I believe I found the dependency error. It was attempting to load an older version of the Visual Studio common runtime libraries (CRT).
I now have it working so that it only requires one version of the CRT. The installer for the required CRT is now included in the download. I believe this should fix your issue. If you encounter linking errors, make sure the included CRT is installed.
I tried out your code with the latest build, and it worked.
Here is the latest API build that I just posted:
viewtopic.php?f=116&t=8137
That worked for me. Thanks.
Works fine here as well. Thanks. Finally I can read Naiad’s particles.
Just a quick question: can I create arbitrary channels in Krakatoa, or am I limited to the ones already available (“rgb”, “velocity”, “normal”, “occluded”)?
Saving of particles (to .prt or whatnot) isn’t available yet, so the only channels that would matter to Krakatoa are the ones it needs to render. So if you don’t have motion blur enabled, you don’t need velocity; if you aren’t rendering Phong shading, you don’t need normals, etc… You could create arbitrary channels, of course, but they just won’t be used.
Won’t be used, and won’t be written into the final EXR as a separate layer?
E.g. we have a “droplet” float channel that could be useful for creating whitewater shading in compositing.
Chad is correct in that Krakatoa currently ignores “non-standard” channels. That said, what you have suggested here is an interesting idea. I’m interested to see what exactly would be useful in this regard, because it’s something I could definitely look at adding if people found it really useful.
Let’s say your float “droplet” channel were actually an RGB color. We can currently achieve this effect using the API by doing a separate render:
a) By copying the “droplet” channel into the “Emission” channel, and turning on emission, and removing the lights. This would create a self-illuminated render of whatever is in the “droplet” channel.
b) By copying the “droplet” channel into the “Color” channel (and using lights). That would produce an image of the “droplet” channel shaded by the lights.
I do see how it would be useful to produce multiple output images from various channels in a single render. At least that would save the time for re-loading the particles.
And Chad, just a note: Writing PRT files will be available in the next API release. It’s fully implemented already, I’m just working out a few other new features before I release it.
But you have to change more than that…
You probably need to change density and filter, too. And how do you handle depth of field, or motion blur?
Check out Krakatoa MX’s render elements… This has the benefit of reusing some processing done in the render, unlike doing a global KCM (which would just trigger another complete render pass where you do the swizzling and modifying of the density/filtering/lighting settings).
But the reuse, I guess, is the real benefit. Any SR implementation should be able to handle triggering a new render with new particle data in emission/color/density, right?
Do you mean to keep the previous renderer’s particles in-memory between renders?
Right. If you have the particles collected and sorted, you could force the data channel to be stored for the rasterization. That’s what you do anyway for the other outputs like Z depth, right?
If you didn’t want to use up the memory needed to allocate the channels for each pass, you could do them independently using the existing interfaces in SR. You’d have to re-sort, trading render time for memory savings.
EDIT: Actually, maybe I misunderstood. You won’t be keeping the particles “in between renders”; you’d just be triggering a new render output image. This might be done in one render pass, or by writing to different buffers during the render. Not sure which is closer to what you do now.
Right, it does it all in one render currently.
To implement something like what was suggested, it would involve implementing a new render element that takes a particle channel name.
I’m assuming the element would appear as a self-illuminated render using the same density/filtering settings as the main render (but that could be overridden).