Single Sided Meshing/Point Normal Generation


I'm trying to get a single-sided mesh. My scan doesn't have normal information, so when I try to export my mesh with Single Sided - Cull By Normal enabled, I get this error: cull_faces_by_normal_similarity: Input mesh is missing point normal channel (Mapping2). For help on how to fix this, please see "Single Sided Meshing" in the Online Help.
I'm guessing it's because of my lack of normal information. Reading some documentation, I see that Sequoia can generate normal information…? via the Point Normal Generation.

Where is the Point Normal Generation operator located, and how do I attach it to my Point Loader?



Hi Mike,

There are several ways to generate point Normal information; some work better than others, and some are very much a work in progress.

The Point Normal Generation operator is supposed to generate normals purely from neighboring points. However, it is not quite ready for prime time, and it is incredibly slow. It works best with aerial scans of terrain where most of the data is pointing mostly up anyway. If you try to use it on arbitrary clouds like a car or a building, it will have a hard time figuring out "inside" and "outside", so while the normals might be mostly correct, they might be pointing in exactly the opposite direction… Improving this operator is on our To Do list. You can see a tutorial describing the steps of using it here: … rrain.html
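To illustrate why the inside/outside ambiguity exists, here is a minimal sketch (not Sequoia's actual algorithm, which fits a plane to many neighbors) of estimating a normal from neighboring points by crossing two edge vectors. The function name and the two-neighbor simplification are hypothetical:

```python
import math

def estimate_normal(p, n1, n2):
    """Estimate a surface normal at point p from two neighboring points
    n1 and n2 by crossing the edge vectors (a crude stand-in for the
    plane fit a real operator would perform). The sign of the result
    depends on the order of the neighbors, not on the scanned surface --
    this is exactly the inside/outside ambiguity described above."""
    ax, ay, az = (n1[0] - p[0], n1[1] - p[1], n1[2] - p[2])
    bx, by, bz = (n2[0] - p[0], n2[1] - p[1], n2[2] - p[2])
    nx, ny, nz = (ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx)
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

# Flat patch of terrain: the normal comes out along the Z axis,
# but nothing in the neighborhood itself says which way is "up".
n = estimate_normal((0, 0, 0), (1, 0, 0), (0, 1, 0))
flipped = estimate_normal((0, 0, 0), (0, 1, 0), (1, 0, 0))  # same points, opposite sign
```

Swapping the two neighbors flips the sign, which is why a neighbors-only method needs extra information (such as a scanner position) to orient the normals consistently.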

The Normal From Scanner Position operator uses the Scanner Position data (if any) to determine the "visibility vector" of each point. This is not the normal of the surface, but the vector connecting the point sample to the scanner's position; it is good enough for determining "front" and "back". If a point cloud combines multiple Scanner Positions, each particle will use the value based on the ScannerIndex channel, which describes which scanner took that sample. … erpos.html
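The visibility vector idea can be sketched in a few lines. This is an assumption-laden illustration, not Sequoia's API: the function and channel names below are made up, but the math (normalize the vector from each sample to the scanner that captured it, looked up via a per-point scanner index) matches the description above:

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def visibility_vectors(points, scanner_positions, scanner_index):
    """For each point, build the unit vector pointing from the sample
    toward the scanner that captured it. scanner_index plays the role
    of the ScannerIndex channel: it selects which scanner position
    applies to each point when multiple scans are merged."""
    out = []
    for p, idx in zip(points, scanner_index):
        s = scanner_positions[idx]
        out.append(normalize((s[0] - p[0], s[1] - p[1], s[2] - p[2])))
    return out

# Two points, each captured by a different tripod position directly above it.
points = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
scanners = [(0.0, 0.0, 5.0), (10.0, 0.0, 5.0)]
vecs = visibility_vectors(points, scanners, [0, 1])
```

Note that these vectors point at the scanner, not along the true surface normal, which is why they are only "good enough" for front/back decisions rather than for shading.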

If you don’t have a Scanner Position, you could still make it work by creating a reference point approximately where the scanner was placed, and picking it as the reference point in the Set Scanner Position operator (new in v1.1). Then you can add a Normal From Scanner Position below it in the Operators list, and it will use the manually specified position as the target of the visibility vectors. This will only work in a limited number of cases where a single scanner was used to create the point cloud, and its position is obvious from the "empty circle" below the tripod usually seen in the point cloud…

Hope this helps!


Hi there, I’ve been working on the normal workflow. I have added the Set Scanner Position and Normal From Scanner Position operators, but I am still not seeing an update in the Scanner Data. Also, the Normals on all the meshes are still aligned to the world rather than to the scanner location. Perhaps I’m missing a step; I’m happy to try and troubleshoot this with you.




The Normals of the meshes should be perpendicular to the surface, that’s what they are for.
The Mapping2 channel of the Mesher will contain the Normals data coming from the Point Loader if it was set up correctly as you described.

On the Point Loader, you should add a Set Scanner Position operator, create and pick a Marker as the reference point, then add a Normal From Scanner Position operator to take that data and generate a Normals channel.

You can display the Normals as Color in the viewport via the Display rollout of the Point Loader. It should look like a gradient centered at the Marker (negative values show as black):

Meshing a region of the dataset with normal culling and 10% reduction should produce a mesh that has faces only on the side of the reference point:

Looking from the opposite side, the backfaces are culled as requested, because the Normal in the mesh points in the opposite direction to what is stored in the Mapping2 channel coming from the Point Loader’s Normals.
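The culling decision boils down to the sign of a dot product between each face's geometric normal and the reference normal stored with the points. Here is a minimal sketch of that idea; the function name echoes the error message from the original question, but the signature and the one-normal-per-face simplification are hypothetical, not the actual Sequoia internals:

```python
def dot(a, b):
    """Dot product of two 3D vectors."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cull_faces_by_normal_similarity(face_normals, point_normals, threshold=0.0):
    """Keep only the faces whose geometric normal agrees with the
    reference normal stored for that face (one normal per face here,
    for simplicity). A dot product at or below the threshold means the
    face points away from the reference direction and gets culled."""
    kept = []
    for i, (fn, pn) in enumerate(zip(face_normals, point_normals)):
        if dot(fn, pn) > threshold:
            kept.append(i)
    return kept

# Face 0 agrees with the stored normal; face 1 points the opposite way.
faces = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)]
refs = [(0.0, 0.0, 1.0), (0.0, 0.0, 1.0)]
kept = cull_faces_by_normal_similarity(faces, refs)  # keeps only face 0
```

This is also why the export fails without a Mapping2 channel: with no stored reference normals, there is nothing to compare the face normals against.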

Displaying backfaces tinted in red shows you the back side of the front-facing remaining polygons…