Kinect recorder

Are there any plans to make a standalone recorder? Not only could it record sound at the same time, which would be nice, but having to create particles in realtime seems like a big task. All it's doing is converting depth images into particles, right?
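For reference, the conversion itself is basically a pinhole back-projection of every depth pixel. A minimal sketch, assuming a Kinect v1-style 640x480 depth map in millimetres and nominal, made-up intrinsics (real values would come from calibration, and none of this is Fathom's actual code):

```python
import numpy as np

# Nominal Kinect v1 depth-camera intrinsics (assumed values; use calibration data in practice)
FX, FY = 580.0, 580.0      # focal lengths in pixels
CX, CY = 320.0, 240.0      # principal point
WIDTH, HEIGHT = 640, 480

def depth_to_particles(depth_mm):
    """Back-project a (480, 640) uint16 depth map in millimetres to an (N, 3) array of XYZ points in metres."""
    v, u = np.indices((HEIGHT, WIDTH))          # pixel row/column index grids
    z = depth_mm.astype(np.float32) / 1000.0    # depth in metres
    valid = z > 0                               # the Kinect reports 0 where depth is unknown
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.column_stack((x[valid], y[valid], z[valid]))

# Example: one recorded depth frame loaded from disk becomes a point cloud
# depth = np.load("depth_0001.npy")
# points = depth_to_particles(depth)
```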

It would be great to be able to JUST record the depth images (with sound). That way I could leave my house and record lots of other scenarios with a laptop: no bottleneck, pure “video” recording.

I would not mind creating the particles as a post-process.

The realtime thing is cool and great to show off, but it would also be great to be able to simply record.

Even cooler would be a “PRT camera”: a camera that records depth images and sound onto an SD memory card, letting us capture 3D point cloud “photos” and videos wherever we go. That would be awesome. The fact that the particles are generated from the images is a huge bonus and wouldn't require a high-bandwidth DSLR chip to process lots of data; the depth images seem quite small compared to the particles they generate.

There are open-source depthmap recorders out there already; I'm not sure it's in our best interest to compete in that space. The goal was to create an easy-to-use tool inside Max.
You should test some of the outside tools, and perhaps we can look at how we can take a depthmap + image and convert it to PRTs sometime in the future.

My goals for Fathom would be to expand the toolkit inside Max, but for now we are focused on other things [we have 4 betas right now!]

cb

I've thought about this and concluded the best solution is to just use something like this or this with a USB 3.0 SSD drive duct-taped to the bottom.

Chris, there are, and I've used them (like the iPi mocap recorder), but I'm not sure the images are identical to what Fathom needs in order to convert to PRT.

But it's not really about competing with anyone; it's about being able to capture and convert without relying on fast hardware. I could take the Kinect mobile and capture Fathom-compatible videos with a laptop.

Also, I've always had issues recording; there are some errors here and there. I want the capture to be flawless and to convert to .prts without problems.

The realtime functionality is a nice gimmick, but it's not necessary for production.

Creating this little app would also allow people who don't use Max but use Krakatoa (STANDALONE VERSION!) to use Fathom…

If there are free Kinect capture apps whose output is Fathom-compatible and can be converted to PRTs, all we need is the converter as a standalone app (and a link to the free recording app).
I think Paul almost provided this once a few months ago when I noticed a non-realtime capture…
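To illustrate what I mean by a standalone converter: it could be little more than a batch loop over whatever the recorder dumps to disk. A rough sketch that reuses the depth_to_particles back-projection above; the file naming is hypothetical, and PLY is only a stand-in output here, since the real target would be PRT:

```python
import glob
import numpy as np

def write_ply(path, points):
    """Write an (N, 3) float array as an ASCII PLY point cloud (a stand-in for a real PRT writer)."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\nend_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

# Offline batch conversion: every recorded depth frame becomes one particle file,
# done after the shoot instead of in realtime.
for frame_path in sorted(glob.glob("capture/depth_*.npy")):   # hypothetical recorder output
    depth = np.load(frame_path)
    points = depth_to_particles(depth)                         # back-projection sketch from above
    write_ply(frame_path.replace(".npy", ".ply"), points)
```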

I think Fathom is more about real-world point cloud capture combined with Krakatoa than just a little 3ds Max plugin :) but of course it's great that it already works in there!

Not sure I agree with that. The realtime aspect of it is what makes it immediately useful. If you wanted to do a capture and then process it later, why would you be using the Kinect? Why not just do camera tracking like we’ve been doing for years?

Camera tracking is a different genre from particle cloud capture; they are two different things…

You always capture in realtime; the difference is just whether you NEED to have it in your viewport or not.

I'm saying there are benefits to NOT having to capture in the viewport: you can do it on any laptop, it's mobile, and there are no crazy hardware requirements.

Does that make sense?

Until the Kinect gets smaller, I don't see a problem with having a small computer attached to operate the Kinect and do the recording.

Yes, but a small computer will not record high-quality particles (over 300k) at 30 frames per second in realtime!

You need a beefy machine, SSDs preferred.
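Rough numbers to back that up (the Kinect v1 depth stream spec is standard; the per-particle size is an assumption on my part):

```python
# Raw Kinect v1 depth stream: 640 x 480 pixels, 16-bit samples, 30 fps
depth_mb_per_sec = 640 * 480 * 2 * 30 / 1e6              # ~18 MB/s

# Realtime particle stream: 300k particles per frame at 30 fps,
# assuming roughly 28 bytes per particle uncompressed (Position + Color + Density)
particle_mb_per_sec = 300_000 * 28 * 30 / 1e6             # ~252 MB/s

print(f"raw depth: {depth_mb_per_sec:.0f} MB/s, particles: {particle_mb_per_sec:.0f} MB/s")
```

So just dumping the depth stream is roughly an order of magnitude less data to move than writing 300k particles per frame live, before compression and before counting the CPU cost of the conversion itself.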

I would just like to see it less dependent on hardware, so it's possible to record anywhere with a laptop.

If I can convert .prts from depth image video, I can sit in the but and do a Kinect recording, no problem.

Right now, the location is limited to where a powerful machine is sitting. Big difference! :)

I will do some research on free Kinect depth recorders this week.

Realtime particle writing is not necessary (or effective) for creating 3D particle clouds as long as we are using Kinect depth images (unless we simply want to show off the realtime particle cloud in the 3ds Max viewport to someone).

With a laptop I am mobile and can walk anywhere with a helper following. That would be true freedom, and it would generate a lot more interesting 3D point clouds.

Our first iteration of the tool is outside of Max. I'm sure we could do both; however, I'm not sure the overhead was any different. I'll speak to Paul about it next week.

cb

That was meant to be BUS, not butt! :)