
Grid based point reduction

Curious if there are any Magma tricks available to do a decimation of a point cloud to reduce the overall point count in an even or distance-based fashion. The source is a tripod-based LiDAR scan, so the points are really dense close to the scanner and more widely spaced farther away. If I just do a ‘blind’ every-Nth reduction, I lose points in both areas. I’d like to even out the point distribution to reduce ‘overlapping’: remove points where they are dense and keep the ones farther away.

If I convert the points and load them into Autodesk Recap I can do a grid-based decimation and specify the size of the grid, to get just one point per X units. That’s certainly an option in our pipeline, but I wanted to take the opportunity to learn more about Krakatoa and offer the same functionality within Max, so I’m wondering if there’s a way to do that with Krakatoa. I started to look at some of the PRT Cloner options, but it sounded like that actually creates new points at random positions within the sample grid, so it would likely blur our source data unless we used really small sample values.

One naive approach would be to measure the distance from the scanner origin to the Position of the point, normalize it, clamp between 0 and 1, subtract from 1, add a Power operator for bias, and output the result to the Selection channel, which defines the probability of deletion for a Krakatoa Delete set to Soft-Selection. This does not take the neighbors into account, just how far the point is from the origin: the closer to the origin, the more likely each point is to be deleted, with the farthest points not being deleted at all.
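
For reference, here is the per-point math that flow implements, sketched in plain Python rather than as an actual Magma graph; scanner_origin, max_distance and bias are just placeholder names for this illustration:

    import math

    # Per-point deletion probability based purely on distance from the scanner.
    def distance_based_selection(position, scanner_origin, max_distance, bias=2.0):
        dist = math.dist(position, scanner_origin)             # distance to the scanner
        normalized = min(max(dist / max_distance, 0.0), 1.0)   # normalize and clamp to 0..1
        return (1.0 - normalized) ** bias                      # invert, then bias with a Power

    # Selection of 1.0 = always deleted by a Soft-Selection Krakatoa Delete,
    # 0.0 = always kept. A point right at the scanner gets 1.0:
    print(distance_based_selection((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), 100.0))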

I tested it, but it may produce some larger holes if multiple points in the same vicinity all decide they should be gone.

A better approach would be to search for the number of neighbors within a given distance using ParticleSumRadius, and again calculate a probability for deletion based on that: the more neighbors a point has, the higher the probability for each of them to be deleted. It has two issues: you need a proxy copy of the original point cloud (so two PRT Loaders instead of one), and you cannot easily control the number of points per unit volume you will be left with, because each point decides for itself.
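
A rough Python sketch of that per-point logic (again just an illustration, not a Magma flow; target_count is a made-up parameter for how many points you would like to keep within the search radius):

    # Per-point deletion probability from a neighbor count, the way a
    # ParticleSumRadius lookup would drive it.
    def neighbor_based_selection(neighbor_count, target_count):
        if neighbor_count <= target_count:
            return 0.0                                        # sparse area, keep everything
        return 1.0 - target_count / float(neighbor_count)     # delete the surplus fraction

    print(neighbor_based_selection(20, 5))   # dense area  -> 0.75
    print(neighbor_based_selection(3, 5))    # sparse area -> 0.0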

A more advanced method, closer to the Recap approach, would be to use a Stoke Field Magma (if you have Stoke) to create a grid around the point cloud and register the actual point count in each voxel. Again, make a copy of the PRT Loader (the first one is used as the source for the grid), add a Magma to the copy which looks up the value in the Field Magma grid, and set a probability for every point inside that voxel based on the voxel’s volume and the number of points in it.

If you don’t have Stoke, you can set up something similar by creating a Box primitive around the point cloud and generating a PRT Volume with Jitter off to represent the grid. Use a Magma with ParticleSumRadius to collect the point counts from the first PRT Loader into a custom channel. This won’t be exactly correct because we are searching with a sphere radius instead of a cubic voxel, but it would be close enough. Then use the second PRT Loader to read the counts from the nearest PRT Volume point (representing the voxel grid), and set the Selection using the same logic as in the previous example.

For example, if you want 10 particles per cubic unit, your voxel has a volume of one cubic unit, but you got a point count of 20 in it, you need to delete 50% of the points. Setting the Selection channel value to 0.5 will tell the Krakatoa Delete operator to delete each point with a probability of 50%. You might not get exactly 10 points after the Delete modifier, but it will be close. If the voxel has a volume other than one cubic unit, you can easily scale the requested density to the actual volume of the voxel, compare it with the actual count, and figure out the necessary deletion probability for the Selection channel.
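
The same arithmetic as a tiny Python sketch (voxel_volume and target_density stand in for the values you would feed into the Magma, and the names are mine):

    # 20 points in a 1x1x1 voxel with a target of 10 points per cubic unit
    # gives a Selection (deletion probability) of 0.5.
    def voxel_selection(point_count, voxel_volume, target_density):
        wanted = target_density * voxel_volume                    # points we want to keep
        keep_fraction = min(max(wanted / point_count, 0.0), 1.0)  # fraction to keep, clamped
        return 1.0 - keep_fraction                                # probability of deletion

    print(voxel_selection(20, 1.0, 10.0))   # -> 0.5
    print(voxel_selection(5, 1.0, 10.0))    # -> 0.0, a sparse voxel is left alone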

I will see if I can set up and post some tests…

I tested both the Stoke approach, and the PRT Volume approximation, and both worked great.
Here is what I did:

  • I created a Cylinder with a lot of sides and cap segments.
  • I collapsed it to EPoly and deleted all but the top cap.
  • I turned this mesh into a PRT Source, which gave me one point per vertex, with very high density around the center and less density at a distance, which emulates a LiDAR scan fairly well.
  • Then I created a copy of the PRT Source for reference.
  • I added a Magma to it and set the Density channel to 1.0, but I could name that channel anything, e.g. Counter float16[1].
  • In the first test, I made a Stoke Field Magma with a grid spacing of 1.0 and used a ParticleSplat operator to collect the Density (or Counter) channel of the second PRT Source into the grid’s Density channel. Since each point has a fixed value of 1.0, the result is the sum of all values in the voxel, which is the same as the count of points in the voxel!
  • In the second test, I simply made a Box, added a PRT Volume with a spacing of 1, and put a Magma on it with ParticleSumRadius to collect the sum of the Density channel from the second PRT Source. I used a Radius of 0.62, since that produces a sphere with a volume of 1 cubic unit, roughly equivalent to a 1x1x1 voxel, just differently shaped (see the quick check after this list).
  • In both test cases, I added a Magma to the first PRT Source which reads the value from the grid (either using InputField for the Stoke case, or a NearestParticle for the PRT Volume case), divides the target Density (particles per unit volume) by the grid value, clamps it between 0.0 and 1.0, and subtracts it from 1.0 before outputting to Selection.
  • On top of that Magma, I added a Krakatoa Delete with the Soft-Selection option enabled.
    KMX_DeleteByVoxelDensity_Animation.gif
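
For the record, the 0.62 radius mentioned in the list above simply comes from solving the sphere volume formula for a volume of 1 cubic unit; a quick Python check:

    import math

    # V = 4/3 * pi * r^3  ->  r = (3 * V / (4 * pi)) ** (1/3), with V = 1
    r = (3.0 / (4.0 * math.pi)) ** (1.0 / 3.0)
    print(r)   # ~0.6204, so a 0.62 search radius covers about one cubic unit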

I even animated the target Density value over time from 10.0 to 0.0 to see how the particles are gradually removed from the cloud…
KMX_DeleteByVoxelDensity_Max2015_v002.zip (518 KB)

I have attached the PRT Volume version saved in Max 2015 format to this post.

Ha, love that you animated it already as that’s what I was thinking of doing too. I’ll have to digest this further tomorrow, but any expectations on your side of what the performance would be like with 1 billion points? That’s what I have to start with on this latest project, hence the desire to decimate. I suppose if necessary I could run it on each of the 38 individual scans instead of all of them in a single loader like I have now.

Always appreciate the support, great stuff Bobo!
