To identify the number of Inferentia devices in a given instance, use the neuron-ls
command.
$ neuron-ls
+--------------+---------+--------+-----------+-----------+------+------+
| PCI BDF      | LOGICAL | NEURON | MEMORY    | MEMORY    | EAST | WEST |
|              | ID      | CORES  | CHANNEL 0 | CHANNEL 1 |      |      |
+--------------+---------+--------+-----------+-----------+------+------+
| 0000:00:1f.0 | 0       | 4      | 4096 MB   | 4096 MB   | 0    | 0    |
+--------------+---------+--------+-----------+-----------+------+------+
The above output is taken from an inf1.xlarge instance. The first column shows the PCI Bus Device Function (BDF) ID. The second column shows the logical ID assigned to the device; this logical ID is used during Neuron-rtd configuration. The third column shows the number of NeuronCores in the Inferentia device. The last two columns show the connections to any other Inferentia devices; since this instance has a single Inferentia device, they show 0.
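For scripting, the number of Inferentia devices can be derived from this output by counting the data rows that begin with a PCI BDF (a rough sketch based on the output format shown above, which may vary between Neuron releases):
$ neuron-ls | grep -cE '^\| [0-9a-f]{4}:'
1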
Multiple NeuronCores (NC) can be combined to form a NeuronCore Group (NCG). The Neuron framework layer will automatically create a default NeuronCore Group. To view the list of available NCGs, the following command can be used.
$ neuron-cli list-ncg
Device 1 NC count 4
+--------+----------+--------------------+----------------+
| NCG ID | NC COUNT | DEVICE START INDEX | NC START INDEX |
+--------+----------+--------------------+----------------+
| 1      | 1        | 0                  | 0              |
| 2      | 1        | 0                  | 1              |
| 3      | 2        | 0                  | 2              |
+--------+----------+--------------------+----------------+
If there is a need to delete the framework-created NCGs, the neuron-cli destroy-ncg
command can be used.
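As a sketch (whether destroy-ncg removes all NCGs by default or requires a specific NCG ID as an argument depends on the installed neuron-cli version, so confirm with the tool's built-in help):
$ neuron-cli destroy-ncg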
Multiple models can be loaded into a single NCG, but only one can be in the STARTED state at any given moment. Inference can only be performed on models in the STARTED state.
The neuron-cli list-model
command can be used to view all the models.
$ neuron-cli list-model
Found 3 models
10003 MODEL_STATUS_LOADED 1
10001 MODEL_STATUS_STARTED 1
10002 MODEL_STATUS_STARTED 1
In the above output, 10001, 10002, and 10003 are the unique identifiers for the models loaded in the Neuron device.
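For scripting, the listing can be filtered to show only the models that are currently in the STARTED state (a simple sketch based on the output format shown above):
$ neuron-cli list-model | grep MODEL_STATUS_STARTED
10001 MODEL_STATUS_STARTED 1
10002 MODEL_STATUS_STARTED 1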
The neuron-cli start/stop/unload
commands can be used to start, stop, or unload a model.
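For example, to start the model with ID 10003 from the listing above and then stop it again (a sketch only; the exact argument syntax, such as whether the model ID is passed positionally or via a flag, may differ between neuron-cli releases and should be confirmed with the tool's built-in help):
$ neuron-cli start 10003
$ neuron-cli stop 10003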
Each loaded model consumes a different amount of memory (host and device), NeuronCores, and CPU.
The neuron-top
command can be used to view this usage.
$ neuron-top
neuron-top - 2020-02-12 23:03:15
NN Models: 2 total, 2 running
Number of VNCs tracked: 16
0000:00:1c.0 Utilizations: Neuron core0 0.00%, Neuron core1 0.00%, Neuron core2 0.00%, Neuron core3 0.00%,
0000:00:1e.0 Utilizations: Neuron core0 0.00%, Neuron core1 0.00%, Neuron core2 0.00%, Neuron core3 0.00%,
Model ID Model Name UUID Node ID Subgraph Exec. Unit Host Mem Device Mem Neuron core %
10018 1.0.6801.0-/home/ubuntu/benchmarking/compiler_workdir/rn50 d12cf238420d11ea8e270afe835c0a32 3 0 0000:00:1e.0:0 33554816 135290880 0.00
10017 1.0.6801.0-/home/ubuntu/benchmarking/compiler_workdir/rn50 d12cf238420d11ea8e270afe835c0a32 3 0 0000:00:1c.0:0 33554816 135290880 0.00
In the above output:
- Model ID -> Unique identifier for the model loaded in the Neuron device
- Model Name -> Neuron Compiler version-compiler work directory/user-defined model name
- UUID -> Unique ID assigned by the Neuron Compiler for a model
- Node ID -> For internal use only
- Exec. Unit -> BDF of the Neuron device followed by the NeuronCore ID, b:d:f.NC
- Host Mem -> Host memory consumed by the model, in bytes
- Device Mem -> Neuron device memory consumed by the model, in bytes
- Neuron Core % -> Utilization % of the NeuronCore (if there are no active inferences this value should be 0)
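For readability, the Host Mem and Device Mem values (reported in bytes) can be converted to megabytes with simple shell arithmetic; using the figures from the sample output above:
$ echo "$((33554816 / 1024 / 1024)) MB host, $((135290880 / 1024 / 1024)) MB device"
32 MB host, 129 MB device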
Please refer to the Neuron Gatherinfo documentation for more information.