# Points CLI

A simple command line interface for point cloud stores.
- Clone the https://github.com/aardvark-platform/aardvark.algodat repository.
- The `points.cmd` tool is located in the repository's root folder. On first use, everything will be built automatically.
The following command will import the `scan.e57` file into a store at `C:\Data\teststore`:

```
points import C:\Data\scan.e57 -o C:\Data\teststore
```
After a successful import, some important info will be printed:

- `pointCloudId` is the key you can use to access the imported point cloud
```json
{
  "boundingBoxExactGlobal": {
    "Min": { "X": -7627.937, "Y": 256956.165, "Z": 693.477 },
    "Max": { "X": -7053.295, "Y": 257120.670, "Z": 794.467 }
  },
  "nodeCount": 313,
  "outPath": "C:\\Data\\teststore",
  "outType": "Store",
  "pointCloudId": "b94e184f-3524-4a2d-b443-402d0c0157cb",
  "pointCountTree": 620715,
  "rootNodeId": "8aaef45f-9952-4ed8-87d4-4845ecc78953"
}
```
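Since the summary is plain JSON, it can easily be consumed by scripts. A minimal Python sketch, using the example output printed above, that extracts the id and point count:

```python
import json

# Import summary as printed by `points import` (copied from the example above).
summary = r"""
{
  "boundingBoxExactGlobal": {
    "Min": { "X": -7627.937, "Y": 256956.165, "Z": 693.477 },
    "Max": { "X": -7053.295, "Y": 257120.670, "Z": 794.467 }
  },
  "nodeCount": 313,
  "outPath": "C:\\Data\\teststore",
  "outType": "Store",
  "pointCloudId": "b94e184f-3524-4a2d-b443-402d0c0157cb",
  "pointCountTree": 620715,
  "rootNodeId": "8aaef45f-9952-4ed8-87d4-4845ecc78953"
}
"""

info = json.loads(summary)
print(info["pointCloudId"])    # key for accessing the imported point cloud
print(info["pointCountTree"])  # total number of points in the octree
```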
arg | description
---|---
`-okey <string>` | a custom `pointCloudId`
`-minDist <float>` | point density will be normalized to this distance; default is `0.0`, which means that all points are kept
`-splitLimit <int>` | custom octree split limit; default is `8192`
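Putting the options together, an import can also be scripted. A sketch (the paths, the `0.01` density value, and the `myscan` key are made-up examples; only the flags documented above are used, and assembling the arguments as a list avoids shell-quoting issues):

```python
import subprocess

# Hypothetical example values; -minDist and -okey are the options listed above.
args = [
    "points", "import", r"C:\Data\scan.e57",
    "-o", r"C:\Data\teststore",
    "-minDist", "0.01",  # normalize point density to this distance
    "-okey", "myscan",   # custom pointCloudId instead of a generated GUID
]

# Uncomment to actually run (requires points.cmd to be on the PATH):
# subprocess.run(args, check=True)
print(" ".join(args))
```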
The following command will export the point cloud with key `mykey` from store `C:\Data\teststore` to store `C:\Data\teststore2`:

```
points export -ikey mykey -i C:\Data\teststore -o C:\Data\teststore2
```
This can be used to compact a store, by extracting all the relevant data for a specific point cloud and leaving all irrelevant data behind.
It can also be used to convert old node formats to the newest one: old formats are converted on read, but only the newest format is written.
An alternative syntax, `-i <type> <path>`, can be used to specify stores for `-i` and `-o`, where `<type>` is one of:
type | description
---|---
`store` | default, an Uncodium.SimpleStore; the same as if no type is specified
`folder` | all entries will be stored as single files inside the given path/folder, for debugging purposes
The `export` command also takes a number of experimental parameters.

**Important!** When using the following parameters, the results are most probably incompatible with most other code.
experimental arg | description
---|---
`-inline` | can be used with `export`; inlines all references to per-point property arrays (e.g. positions, colors, ...), which means that all node data is stored in a single blob (less fragmentation) and loaded all together (unused properties now consume memory)
`-z` | can be used with `export` and `-inline`; stores compressed (gzipped) inlined octree nodes
`-meta <key>` | adds metadata of an imported or exported point cloud to the store as UTF-8-encoded JSON using the given key
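Since `-z` stores inlined octree nodes as plain gzip data, standard gzip tooling can decompress them. A hedged sketch of the round trip (the payload bytes here are made up; in practice you would read a compressed blob out of the store first):

```python
import gzip

# Stand-in for a compressed inlined-node blob as written by `export -inline -z`.
blob = gzip.compress(b"example inlined octree node payload")

# Any gzip implementation can recover the raw inlined node bytes.
raw = gzip.decompress(blob)
print(raw)
```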