writing .prt files, compression question

hey there



thanks to frantic’s open development ideas i am writing out the .prt format from xsi. i haven’t completed it yet because i am still learning about compression and zlib.



my question is how does krakatoa write the compressed part of the file?



is it compressed inline, immediately while you are querying the data from the particle system?



is the particle data compressed after all particle data is written?



or is the file closed then reopened and the particle data block compressed?



if some of these questions aren’t well informed, i hope you will excuse me; most of my programming experience is with interpreted languages.



thanks

steven

another question: i am debugging my writing of the format and i keep getting these two messages…



prt_file_header.read_header: The data type specified in channel ‘[binary block]’ in the input stream [file] is not valid



channel_map::append_channel() - Called on a non-finalized channel_map



i think the first is saying i didn’t declare the channel’s type correctly in the ‘channel definition section’



and the second one is saying i didn’t finalize the channel



but i am not sure why either would be the case; maybe someone who knows the code on the prt loader side could point me in the right direction?

prt_file_header.read_header: The data type specified in channel ‘[binary block]’ in the input stream [file] is not valid



This probably means that the layout of the data that has been written to the file differs somehow from the layout Krakatoa is expecting. I would recommend opening the file you produced in a HEX editor (I usually do this by renaming its extension to .bin and opening it in Visual Studio, which automatically chooses a HEX editing mode for .bin files) and verifying the data layout by hand. Compare with the annotated HEX dump that is included in the PRT documentation to try to find any discrepancies. You can also attach the prt files you've produced to this forum or email them to krakatoa support, and we can figure out the specifics for you.
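
If you'd rather script the comparison, a few lines of C will produce a similar dump (just a quick sketch, nothing Krakatoa-specific):

#include <stdio.h>

/* Dump a file 16 bytes per line with hex offsets, for comparing layouts by hand. */
int main(int argc, char **argv)
{
    unsigned char buf[16];
    size_t n, i;
    long offset = 0;
    FILE *f = (argc > 1) ? fopen(argv[1], "rb") : NULL;
    if (!f) return 1;
    while ((n = fread(buf, 1, sizeof(buf), f)) > 0) {
        printf("%06lx", offset);
        for (i = 0; i < n; ++i)
            printf(" %02x", (unsigned)buf[i]);
        printf("\n");
        offset += (long)n;
    }
    fclose(f);
    return 0;
}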





channel_map::append_channel() - Called on a non-finalized channel_map



This is probably a cascading error in our code, triggered by that first error.



i think the first is saying i didn't declare the channel's type correctly in the ‘channel definition section’


and the second one is saying i didn't finalize the channel


but i am not sure why either would be the case; maybe someone who knows the code on the prt loader side could point me in the right direction?



I don't think the code would help you; HEX editing your test file and comparing it with the reference example would be much more effective. Feel free to send us a sample file, and I'll post some sample code on this thread once we've got it finished.



-Mark

i have the prt spec page up constantly and it has been a great help.



i was able to fix the channel_map() error by using the documentation on the page, but the original problem was caused by an inconsistency in the documentation.



in the channel definition section…

(4 bytes) A 32 bit int indicating channel arity (number of data values that each particle has for this channel).

(4 bytes) A 32 bit int indicating channel type. Supported channel types are indicated in the table below.

(4 bytes) A 32 bit int indicating channel offset, relative to the start of the particle.



“arity” is listed before type, but in the example .prt “data type” is before arity. so i corrected my code to match the example file, and now i only have the first error left, the ‘data type specified is not valid’ error.
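
for my own sanity, this is how i am laying out each 44-byte channel definition entry now, in the order the example file shows (i am assuming the name field comes first and is 32 bytes, since 44 - 4 - 4 - 4 = 32; correct me if that is wrong)…

#include <stdint.h>

/* one entry in the channel definition section, 44 bytes total */
#pragma pack(push, 1)
typedef struct {
    char    name[32];   /* null-terminated channel name, padded with zeros */
    int32_t data_type;  /* 32-bit int: channel data type (see type table)  */
    int32_t arity;      /* 32-bit int: values per particle, e.g. 3         */
    int32_t offset;     /* 32-bit int: byte offset within the particle     */
} channel_entry;        /* sizeof(channel_entry) == 44 */
#pragma pack(pop)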



edit: i misread your suggestion to send the .prt and not my code. i will send a .prt file.



if your example doesn't help me… i will look over my code, clean it up, document it, and send it your way.



thanks again

steven

Good catch on the bug in the documentation; I've fixed it in our internal copy, so it will be correct the next time we deploy.



If you'd like to send us your code too, we're happy to check it along with the .prt file.



-Mark

hey mark… any news on a .prt writing example?



my current approach uses someone else's zlib example that compresses a whole file. so i am just seeking back to the particle data block and then reading, compressing, and writing to the end of the file.



or maybe you could point me to another example of zlib being used to compress data immediately for writing?



thanks

steven

Hey Steven,



Sorry for the delay, I thought I would be able to slip some time in to do this right away, but that hasn’t worked out yet. I’ll see what I can do next week.



This might be the one you're already looking at, but this example looks like a good reference (we just went straight off the reference docs in our implementation). In the deflate part of the example, you want to replace the file-reading parts, like this:


strm.avail_in = fread(in, 1, CHUNK, source);
if (ferror(source)) {
    (void)deflateEnd(&strm);
    return Z_ERRNO;
}
flush = feof(source) ? Z_FINISH : Z_NO_FLUSH;
strm.next_in = in;

into code which reads the particle instead:

// Fill in with the particle (may be more complex than a memcpy)
memcpy( in, get_next_particle(), PARTICLESIZE );
strm.avail_in = PARTICLESIZE;
flush = was_last_particle() ? Z_FINISH : Z_NO_FLUSH;
strm.next_in = in;

Basically, this replaces the routine's “FILE* source” parameter with whatever custom particle data source you've got.
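
To make that concrete, here is a rough sketch of how the whole routine might look once those substitutions are made (this isn't our finished sample code, just the zpipe 'def' routine with the file reads swapped out; get_next_particle(), was_last_particle() and PARTICLESIZE are placeholders for whatever your XSI plugin provides):

#include <stdio.h>
#include <string.h>
#include <zlib.h>

#define CHUNK 16384

/* Placeholders supplied by the exporter: */
extern const void *get_next_particle(void); /* next particle's raw bytes          */
extern int was_last_particle(void);         /* non-zero once the last one is fed  */
#define PARTICLESIZE 12                     /* e.g. one float32[3] Position channel */

/* Compress particles straight into the already-open .prt file,
   starting at the position of the particle data block. */
int compress_particles(FILE *dest)
{
    int ret, flush;
    unsigned have;
    z_stream strm;
    unsigned char in[CHUNK];
    unsigned char out[CHUNK];

    strm.zalloc = Z_NULL;
    strm.zfree  = Z_NULL;
    strm.opaque = Z_NULL;
    ret = deflateInit(&strm, Z_DEFAULT_COMPRESSION);
    if (ret != Z_OK)
        return ret;

    do {
        /* Feed one particle per pass instead of a CHUNK-sized fread(). */
        memcpy(in, get_next_particle(), PARTICLESIZE);
        strm.avail_in = PARTICLESIZE;
        flush = was_last_particle() ? Z_FINISH : Z_NO_FLUSH;
        strm.next_in = in;

        /* Drain everything deflate produces for this input. */
        do {
            strm.avail_out = CHUNK;
            strm.next_out = out;
            ret = deflate(&strm, flush);
            have = CHUNK - strm.avail_out;
            if (fwrite(out, 1, have, dest) != have || ferror(dest)) {
                (void)deflateEnd(&strm);
                return Z_ERRNO;
            }
        } while (strm.avail_out == 0);
    } while (flush != Z_FINISH);

    (void)deflateEnd(&strm);
    return Z_OK;
}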

Hope that helps you in the meantime!

-Mark

it does help, i will give it a go.



i should have got that from the get-go. it makes total sense to replace the reading of the file with the particle data that xsi is giving me.



s

well, it looks like i wasn't as far along as i thought i was. i found that i am still writing the header incorrectly.



comparing my file to the example on the site, i am finding that when i write the “Extensible Particle Format” string i am getting some junk at the end.



example on site…

; Identification null-terminated string: “Extensible Particle Format”

00000c 45 78 74 65 6e 73 69 62 6c 65 20 50 61 72 74 69

00001c 63 6c 65 20 46 6f 72 6d 61 74 00 00 00 00 00 00



mine…

; Identification null-terminated string: “Extensible Particle Format”

00000c 45 78 74 65 6e 73 69 62 6c 65 20 50 61 72 74 69

00001c 63 6c 65 20 46 6f 72 6d 61 74 00 00 77 62 00 00



this is a snip of the code…

fwrite( "Extensible Particle Format\0", 32, 1, file );



any suggestions?



there are some other problems afterwards but one step at a time.



thanks again mark

steven

ok fixed it…



i replaced this…

fwrite( "Extensible Particle Format\0", 32, 1, file );



with this…

char epf[32] = "Extensible Particle Format\0";

fwrite( epf, 32, 1, file );



i think it is because “Extensible Particle Format\0” is not 32 bytes, so fwrite was reading past the end of the string literal instead of writing null characters into the rest of the 32-byte block.
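
(for anyone else who hits this, an equivalent way that makes the zero padding explicit, as i understand it:)

char magic[32] = { 0 };  /* zero-fill the whole 32-byte field first */
strncpy( magic, "Extensible Particle Format", sizeof(magic) - 1 );
fwrite( magic, sizeof(magic), 1, file );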



moving along

if i am writing just pointpositions with the float32 datatype, PARTICLESIZE should equal 12 bytes, right?
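
(the arithmetic as i understand it: bytes per particle = sum over channels of arity * bytes per value, so…)

/* just Position: arity 3 * 4 bytes (float32) = 12 bytes per particle */
/* Position + Velocity would be 3*4 + 3*4 = 24 bytes per particle     */
#define PARTICLESIZE 12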



my current file is about 270 bytes longer than the one i exported from krakatoa.



thanks mark

hey mark, do you think you could look at this .prt file of mine and suggest what i am doing wrong?



not sure if i can attach two files, but i am going to try.



cube_max_0001.prt is from krakatoa plugin

cube_1.prt is from my plugin



these are just float32 pointpositions of a cube, nothing else.

I’ve taken a look, and here’s what I’ve noticed:


  • cube_1.prt says it has 2 channels instead of 1 like it should:


; Number of channels: 2
00003c 02 00 00 00
; Channel definition entry length: 44
000040 2c 00 00 00

instead of:

; Number of channels: 1
00003c 01 00 00 00
; Channel definition entry length: 44
000040 2c 00 00 00

  • Something is suspicious about the compressed data as well. Assuming you've used the default zlib compression settings, I would expect it to start with the bytes 78 9c, but it instead contains:

0000160 00 00 80 c0 00 00 80 c0 00 00 80 c0 00 16 da 12
0000200 a4 5a 96 10 64 b6 a9 04 78 b8 a9 04 00 00 00 00
0000220 a4 aa 27 14
0000224

Cheers,
Mark

ah thanks… i left in a 2 because i was writing velocity, but i wanted to simplify the process a bit and forgot to set the channel count back to 1. i am hardcoding some of the values right now until i can get the compression correct.



i am using -1 for the compression level, which is the zlib default if i am correct. i believe my problem is how i am writing the compressed data back… for a vector type like position (float32), should i be writing it a certain way? i.e. 3 elements at 4 bytes each?



thanks

steven

One thing you might check is that the uncompressed data is the correct size: it should be 8 particles * 12 bytes/particle = 96 bytes. Once the data is compressed, it should look like a standard zlib ‘deflate’ stream. Maybe you could upload the uncompressed and the compressed particle data you've got as separate files, excluding the header, so we can examine that more closely?
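
If you want a quick programmatic check too, something along these lines should work (a rough sketch, assuming you've already read the compressed block into a buffer): it should come back Z_OK and inflate to exactly 96 bytes.

#include <stdio.h>
#include <zlib.h>

/* comp / comp_len: the compressed particle block with the file header stripped off */
int check_particle_block(const unsigned char *comp, unsigned long comp_len)
{
    unsigned char raw[96];                 /* 8 particles * 12 bytes/particle */
    unsigned long raw_len = sizeof(raw);
    int ret = uncompress(raw, &raw_len, comp, comp_len);
    if (ret != Z_OK) {
        printf("not a valid zlib stream (uncompress returned %d)\n", ret);
        return ret;
    }
    printf("ok, inflated to %lu bytes\n", raw_len);
    return Z_OK;
}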



-Mark

ok, i removed the headers from these files. the header should be 112 bytes, so i removed 112 bytes from the beginning of each file.



file names are self-explanatory…

grrr…i almost have it.



my file…

00000070h 63 60 68 38 c0 80 c0 0e 0c d8 f9 0e 38 e4 1c 70


00000080h a8 83 63 00 c8 9c 18 01




prt from max…

00000070h 78 9c 63 60 68 38 c0 80 c0 0e 0c d8 f9 0e 38 e4


00000080h 1c 70 a8 83 63 00 c8 9c 18 01




so there is something wrong with my first particle, right?

ok so i have successfully exported 2 million particles and they render in krakatoa!



most of my errors with zlib were with the avail_in and avail_out fields of the structure. i had a hard time making sure they held the proper values, mostly because i was doing my own thing. as soon as i went back and referenced their example, i made a ton of progress.



anyway, i need to support velocity now. there is a bug in the way xsi gives me color (working with support on that). then i will do the other useful channels, i.e. id, age, density, normal, etc. support for arbitrary channels should be easy enough too. i am also considering how i could replicate the “partitioning” feature; it would be a combination of a script job in deadline along with my prt plugin.



thanks for your help mark!

steven



ps for your viewing pleasure

so i have been doing some tests with krakatoa for 3dsmax, exporting different particle counts to see if i am getting the same file size on disk. but i am actually getting smaller files, which leads me to think i am using a better compression ratio.



krakatoa 4 max

10k - 83kb

100k - 820kb

1mil - 8,195kb



my plugin for xsi

10k - 42kb

100k - 504kb

1mil - 4,676kb



surely there must be something you guys are doing that i am not. any ideas?
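
(for reference, the only compression knob i am touching is the level passed to deflateInit; i am just using the default:)

/* -1 is Z_DEFAULT_COMPRESSION; zlib maps it to level 6 out of 0 (none) .. 9 (best) */
ret = deflateInit( &strm, Z_DEFAULT_COMPRESSION );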



steven

hey mark



i am making good progress, and i have another question. this time it pertains to the density channel. how do you guys compute density? is this something that pflow/tp/legacy particle systems are giving you, or is it something actually computed by the krakatoa plugin?



i just want to make sure i am representing the data properly from xsi to krakatoa; any hints will help…



thanks

steven