diff --git a/docs/data-ai/ai/deploy.md b/docs/data-ai/ai/deploy.md index 632caf0898..b7c7ae66c3 100644 --- a/docs/data-ai/ai/deploy.md +++ b/docs/data-ai/ai/deploy.md @@ -14,7 +14,7 @@ aliases: The Machine Learning (ML) model service allows you to deploy [machine learning models](/data-ai/ai/deploy/#deploy-your-ml-model) to your machine. The service works with models trained inside and outside the Viam app: -- You can [train TFlite](/data-ai/ai/train-tflite/) or [other models](data-ai/ai/train/) on data from your machines. +- You can [train TFlite](/data-ai/ai/train-tflite/) or [other models](/data-ai/ai/train/) on data from your machines. - You can upload externally trained models on the [**MODELS** tab](https://app.viam.com/data/models) in the **DATA** section of the Viam app. - You can use [ML models](https://app.viam.com/registry?type=ML+Model) from the [Viam Registry](https://app.viam.com/registry). - You can use a [model](/data-ai/ai/deploy/#deploy-your-ml-model) trained outside the Viam platform whose files are on your machine. diff --git a/docs/data-ai/capture-data/advanced/how-sync-works.md b/docs/data-ai/capture-data/advanced/how-sync-works.md index b5b877b76e..73ef9a281e 100644 --- a/docs/data-ai/capture-data/advanced/how-sync-works.md +++ b/docs/data-ai/capture-data/advanced/how-sync-works.md @@ -61,7 +61,7 @@ When the connection is restored and sync resumes, the service continues sync whe If the interruption happens mid-file, sync resumes from the beginning of that file. To avoid syncing files that are still being written to, the data management service only syncs arbitrary files that haven't been modified in the previous 10 seconds. -This default can be changed with the [`file_last_modified_millis` config attribute](/data-ai/capture-sync/#configure-the-data-management-service). +This default can be changed with the [`file_last_modified_millis` config attribute](/data-ai/capture-data/capture-sync/). 
## Storage diff --git a/docs/data-ai/capture-data/conditional-sync.md b/docs/data-ai/capture-data/conditional-sync.md index b5a7600ec8..3319f6047e 100644 --- a/docs/data-ai/capture-data/conditional-sync.md +++ b/docs/data-ai/capture-data/conditional-sync.md @@ -49,7 +49,7 @@ Also leave both **Capturing** and **Syncing** toggles in the "on" position. {{% expand "Create a sensor module. Click to see instructions." %}} -Start by [creating a sensor module](/how-tos/sensor-module/). +Start by [creating a sensor module](/operate/get-started/other-hardware/). Your sensor should have access to the information you need to determine if your machine should sync or not. Based on that data, make the sensor return true when the machine should sync and false when it should not. For example, if you want your machine to return data only during a specific time interval, your sensor needs to be able to access the time as well as be configured with the time interval during which you would like to sync data. diff --git a/docs/data-ai/data/export.md b/docs/data-ai/data/export.md index c131311bd8..e49cb2c00d 100644 --- a/docs/data-ai/data/export.md +++ b/docs/data-ai/data/export.md @@ -57,7 +57,7 @@ Click **Copy export command**. This copies the command, including your org ID and the filters you selected, to your clipboard. {{% /tablestep %}} -{{% tablestep link="/cli/#data" %}} +{{% tablestep link="/dev/tools/cli/#data" %}} **3. Run the command** Run the copied command in a terminal: diff --git a/docs/dev/_index.md b/docs/dev/_index.md index 502f12ff26..f42669a0b3 100644 --- a/docs/dev/_index.md +++ b/docs/dev/_index.md @@ -48,7 +48,7 @@ aliases:
-Once you've set up your machine you can control your device and any attached physical hardware with [Viam APIs](/dev/reference/APIs), for example: +Once you've set up your machine you can control your device and any attached physical hardware with [Viam APIs](/dev/reference/apis/), for example: {{< tabs class="horizontalheaders program" navheader="Examples">}} {{% tab name="Drive a base" %}} diff --git a/docs/dev/reference/apis/fleet.md b/docs/dev/reference/apis/fleet.md index 3fc9af1b49..5f56855080 100644 --- a/docs/dev/reference/apis/fleet.md +++ b/docs/dev/reference/apis/fleet.md @@ -45,7 +45,7 @@ The fleet management API supports the following methods: To use the Viam fleet management API, you first need to instantiate a [`ViamClient`](https://python.viam.dev/autoapi/viam/app/viam_client/index.html#viam.app.viam_client.ViamClient) and then instantiate an [`AppClient`](https://python.viam.dev/autoapi/viam/app/app_client/index.html#viam.app.app_client.AppClient). See the following example for reference. -You can create an [API key](/cloud/rbac/#api-keys) on your settings page. +You can create an [API key](/manage/manage/access/) on your settings page. ```python {class="line-numbers linkable-line-numbers"} import asyncio diff --git a/docs/dev/reference/apis/ml-training-client.md b/docs/dev/reference/apis/ml-training-client.md index 1214132237..1036fe3aa9 100644 --- a/docs/dev/reference/apis/ml-training-client.md +++ b/docs/dev/reference/apis/ml-training-client.md @@ -30,7 +30,7 @@ The ML training client API supports the following methods: To use the Viam ML training client API, you first need to instantiate a [`ViamClient`](https://python.viam.dev/autoapi/viam/app/viam_client/index.html#viam.app.viam_client.ViamClient) and then instantiate an [`MLTrainingClient`](https://python.viam.dev/autoapi/viam/app/viam_client/index.html#viam.app.viam_client.ViamClient.ml_training_client). See the following example for reference. 
-You can create an [API key](/cloud/rbac/#api-keys) on your settings page. +You can create an [API key](/manage/manage/access/) on your settings page. ```python {class="line-numbers linkable-line-numbers"} import asyncio diff --git a/docs/dev/reference/apis/services/vision.md b/docs/dev/reference/apis/services/vision.md index 692c7e6609..0294c2a988 100644 --- a/docs/dev/reference/apis/services/vision.md +++ b/docs/dev/reference/apis/services/vision.md @@ -102,7 +102,7 @@ To get started using Viam's SDKs to connect to and control your machine, go to y When executed, this sample code creates a connection to your machine as a client. -The following examples assume that you have a machine configured with a [camera](/operate/reference/components/camera/) and a vision service [detector](/operate/reference/services/vision/#detections), [classifier](/operate/reference/services/vision/#classifications) or [segmenter](/operate/reference/services/vision/#segmentations). +The following examples assume that you have a machine configured with a [camera](/operate/reference/components/camera/) and a vision service [detector](/dev/reference/apis/services/vision/#detections), [classifier](/dev/reference/apis/services/vision/#classifications) or [segmenter](/dev/reference/apis/services/vision/#segmentations). {{< tabs >}} {{% tab name="Python" %}} diff --git a/docs/dev/reference/changelog.md b/docs/dev/reference/changelog.md index a322e48039..beeb59fb13 100644 --- a/docs/dev/reference/changelog.md +++ b/docs/dev/reference/changelog.md @@ -349,7 +349,7 @@ Users can now have [access to different fleet management capabilities](/manage/m {{% changelog date="2023-11-30" color="added" title="Authenticate with location API key" %}} -You can now use [API keys for authentication](/sdks/#authentication). +You can now use [API keys for authentication](/dev/tools/cli/#authenticate). API keys allow you to assign the minimum required permissions for usage. 
Location secrets, the previous method of authentication, are deprecated and will be removed in a future release. @@ -382,7 +382,7 @@ After you upload and train a machine learning model, you can test its results in This allows you to refine models by iteratively tagging more images for training based on observed performance. -For more information, see [Test classification models with existing images in the cloud](/services/vision/mlmodel/#existing-images-in-the-cloud). +For more information, see [Test classification models with existing images in the cloud](/operate/reference/services/vision/mlmodel/#existing-images-in-the-cloud). To use this update, the classifier must have been trained or uploaded after September 19, 2023. The current version of this feature exclusively supports classification models. @@ -968,7 +968,7 @@ You can replace existing Radius Clustering 3D segmenters by [configuring new one #### Add and remove models using the machine config -You must add and remove models using the [machine config](/configure/). +You must add and remove models using the [machine config](/operate/get-started/supported-hardware/#configure-hardware-on-your-machine). You will no longer be able to add or remove models using the SDKs. #### Add machine learning vision models to a vision service @@ -982,13 +982,13 @@ You will need to first register the machine learning model file with the [ML mod You can now [train](/data-ai/ai/train-tflite/) and [deploy](/data-ai/ai/deploy/) image classification models with the [data management service](/data-ai/capture-data/capture-sync/) and use your machine's image data directly within Viam. Additionally, you can upload and use existing [machine learning models](/data-ai/ai/deploy/#deploy-your-ml-model) with your machines. -For more information on using data synced to the cloud to train machine learning models, read [train a TFlite](/data-ai/ai/train-tflite/) or [another model](data-ai/ai/train/).
+For more information on using data synced to the cloud to train machine learning models, read [train a TFlite](/data-ai/ai/train-tflite/) or [another model](/data-ai/ai/train/). {{% /changelog %}} {{% changelog date="2023-03-31" color="added" title="Motion planning with new `constraint` parameter" %}} -A new parameter, [`constraint`](/services/motion/constraints/), has been added to the [Motion service API](/dev/reference/apis/services/motion/#api), allowing you to define restrictions on the machine's movement. +A new parameter, [`constraint`](/operate/reference/services/motion/constraints/), has been added to the [Motion service API](/dev/reference/apis/services/motion/#api), allowing you to define restrictions on the machine's movement. The constraint system also provides flexibility to specify that obstacles should only impact specific frames of a machine. {{% /changelog %}} @@ -999,7 +999,7 @@ You can now access {{< glossary_tooltip term_id="fragment" text="fragments" >}} The configurations you added will now show up automatically in the **Builder** view on your machine's **CONFIGURE** tab. This makes it easier to monitor what fragments you've added to your machine and how they're configured. -For more information, see [Fragments](/configure/#fragments). +For more information, see [Fragments](/manage/fleet/reuse-configuration/). {{% /changelog %}} diff --git a/docs/dev/reference/glossary/type.md b/docs/dev/reference/glossary/type.md index 28b83ff180..67fa628b77 100644 --- a/docs/dev/reference/glossary/type.md +++ b/docs/dev/reference/glossary/type.md @@ -9,4 +9,4 @@ In the {{< glossary_tooltip term_id="rdk" text="RDK" >}} architecture's {{< glos However, the meaning of "type" can be context dependent across the Viam platform. 
-For example, when [configuring a machine](/configure/) in the [Viam app](https://app.viam.com), `"type"` is used in the JSON to indicate a particular implementation of a component or service, which is formally designated as the {{< glossary_tooltip term_id="subtype" text="subtype" >}}. +For example, when [configuring a machine](/operate/get-started/supported-hardware/#configure-hardware-on-your-machine) in the [Viam app](https://app.viam.com), `"type"` is used in the JSON to indicate a particular implementation of a component or service, which is formally designated as the {{< glossary_tooltip term_id="subtype" text="subtype" >}}. diff --git a/docs/dev/reference/try-viam/_index.md b/docs/dev/reference/try-viam/_index.md index 9730ed417f..f9a6f05379 100644 --- a/docs/dev/reference/try-viam/_index.md +++ b/docs/dev/reference/try-viam/_index.md @@ -31,7 +31,7 @@ The easiest way to try Viam is to [rent and remotely configure and control a Via {{}} 1. Click on TRY in Viam -

-Log into the Viam app and go to the TRY tab. Don’t have a Viam account? Follow the instructions to sign up for an account.
+Log into the Viam app and go to the TRY tab. Don’t have a Viam account? Follow the prompts to sign up for an account.
@@ -39,13 +39,13 @@ The easiest way to try Viam is to [rent and remotely configure and control a Via 2. Reserve your slot

If no one’s using a Viam Rover, you’ll take over immediately. Otherwise, you’ll see an estimated time for the next slot, and we’ll send you an email when it’s your turn. -See detailed instructions.

+See detailed instructions.

{{}} 3. Get started with Viam -

Try a Viam Rover in our robotics lab. Drive or program the rover to see how you can build a machine with Viam.
diff --git a/docs/dev/reference/try-viam/reserve-a-rover.md b/docs/dev/reference/try-viam/reserve-a-rover.md index 18aa993ca7..4b7677489c 100644 --- a/docs/dev/reference/try-viam/reserve-a-rover.md +++ b/docs/dev/reference/try-viam/reserve-a-rover.md @@ -10,6 +10,7 @@ tags: ["try viam", "app"] aliases: - "/try-viam/reserve-a-rover/" - "/get-started/try-viam/reserve-a-rover/" + - /appendix/try-viam/reserve-a-rover toc_hide: true date: "2022-01-01" # updated: "" # When the content was last entirely checked @@ -18,7 +19,7 @@ date: "2022-01-01" _Try Viam_ is a way to try out the Viam platform without setting up any hardware yourself. You can take over a Viam Rover in our robotics lab to play around! -Watch this tutorial video for a walkthrough of Try Viam, including [how to reserve a Viam Rover](#using-the-reservation-system), [navigate the Viam platform](/fleet/), and [drive the rover](/components/base/wheeled/#test-the-base): +Watch this tutorial video for a walkthrough of Try Viam, including [how to reserve a Viam Rover](#using-the-reservation-system), [navigate the Viam platform](/operate/), and [drive the rover](/operate/reference/components/base/wheeled/#test-the-base): {{}} @@ -38,7 +39,7 @@ Once your reservation starts and the system has configured your rover, click **T ### Limitations -When using a rented Viam rover, adding [modules](/registry/) is disabled for security purposes. +When using a rented Viam rover, adding {{< glossary_tooltip term_id="module" text="modules" >}} is disabled for security purposes. ### Extend your reservation @@ -67,7 +68,7 @@ You can take over and play around with a Viam Rover in our robotics lab from any 1. Please notify Viam support on [our Community Discord](https://discord.gg/viam). 2. Use the **Add Viam Support** button on your machine's Location page to give Viam Support access to your _location_. - Refer to [Managing Locations and sub-locations](/cloud/locations/). 
+ Refer to [Grant access](/manage/manage/access/#grant-access). ### Can I extend my time? @@ -111,11 +112,11 @@ If you change the location, you must refresh the page. ### Which organization does this machine belong to? -Your machine belongs to the [organization](/cloud/organizations/) you were in when you made the request. +Your machine belongs to the [organization](/manage/reference/organize/) you were in when you made the request. ### Can I share this Location with a friend to work on the machine together? -Sure, you can [invite other users to your organization](/cloud/locations/) to collaborate on your machine. +Sure, you can [invite other users to your organization](/manage/manage/access/#grant-access) to collaborate on your machine. As members of your organization, those users have full control of your machine. Another collaboration option is to use screen sharing in a Zoom or Webex session. @@ -123,7 +124,7 @@ Another collaboration option is to use screen sharing in a Zoom or Webex session You can only borrow one rover at a time. You cannot join the queue for another reservation while you have an active rental session. -If you would like to, you can [extend your reservation](/appendix/try-viam/reserve-a-rover/#can-i-extend-my-time). +If you would like to, you can [extend your reservation](/dev/reference/try-viam/reserve-a-rover/#extend-your-reservation). ### I loved my experience - can I play around more? diff --git a/docs/dev/reference/try-viam/rover-resources/_index.md b/docs/dev/reference/try-viam/rover-resources/_index.md index 5b7c6b1404..ca97ff0839 100644 --- a/docs/dev/reference/try-viam/rover-resources/_index.md +++ b/docs/dev/reference/try-viam/rover-resources/_index.md @@ -33,7 +33,7 @@ If you want a convenient mobile {{% glossary_tooltip term_id="base" text="base"% The Viam Rover 2 arrives preassembled with two encoded motors with suspension, a webcam with a microphone unit, a 6 axis IMU, power management and more.
It is primarily designed for use with a Raspberry Pi 4. Featuring an anodized aluminum chassis with expandable mounting features, the rover can comfortably navigate indoor environments with a 20 lb payload. - You can customize your rover by mounting sensors, LiDAR, and arms. + You can customize your rover by mounting sensors, LiDAR, and arms.

diff --git a/docs/dev/reference/try-viam/rover-resources/rover-tutorial-1.md b/docs/dev/reference/try-viam/rover-resources/rover-tutorial-1.md index 2d4a96f2db..4a538c5f3c 100644 --- a/docs/dev/reference/try-viam/rover-resources/rover-tutorial-1.md +++ b/docs/dev/reference/try-viam/rover-resources/rover-tutorial-1.md @@ -71,8 +71,8 @@ All together, your kit looks like this: {{}} The motors come with integrated encoders. -For information on encoders, see [Encoder Component](/components/encoder/). -For more information on encoded DC motors, see [Encoded Motors](/components/motor/encoded-motor/). +For information on encoders, see [Encoder Component](/operate/reference/components/encoder/). +For more information on encoded DC motors, see [Encoded Motors](/operate/reference/components/motor/encoded-motor/). The kit also includes stiffer suspension springs that you can substitute for the ones on the rover. Generally, a stiff suspension helps with precise steering control. @@ -90,7 +90,7 @@ L298 is a high voltage and high current motor drive chip, and H-Bridge is typica {{}} The webcam that comes with the kit is a standard USB camera device and the rover has a custom camera mount for it. -For more information, see [Camera Component](/components/camera/). +For more information, see [Camera Component](/operate/reference/components/camera/). ### 3D accelerometer @@ -99,7 +99,7 @@ For more information, see [Camera Component](/components/camera/). The [ADXL345](https://github.com/viam-modules/analog-devices/) sensor manufactured by Analog Devices is a digital 3-axis accelerometer that can read acceleration up to ±16g for high-resolution (13-bit) measurements. You can access it with a SPI (3-wire or 4-wire) or I2C digital interface. -In Viam, you can configure it as a [movement sensor component](/components/movement-sensor/). +In Viam, you can configure it as a [movement sensor component](/operate/reference/components/movement-sensor/). 
### Buck converter @@ -167,7 +167,8 @@ This is the recommended order to assemble your rover: ### Install Raspberry Pi OS -Install a 64-bit Raspberry Pi OS onto your Pi following our [Raspberry Pi installation guide](/installation/prepare/rpi-setup/). Follow all steps as listed, including the final step, [Enable communication protocols](/installation/prepare/rpi-setup/#enable-communication-protocols), which is required to enable the accelerometer on your rover. +Install a 64-bit Raspberry Pi OS onto your Pi following our [Raspberry Pi installation guide](/operate/reference/prepare/rpi-setup/). +Follow all steps as listed, including the final step, [Enable communication protocols](/operate/reference/prepare/rpi-setup/#enable-communication-protocols), which is required to enable the accelerometer on your rover. ### Attach the Raspberry Pi to the Rover @@ -280,8 +281,8 @@ The following are just a few ideas, but you can expand or modify the rover kit w - For GPS navigation, we support NMEA (using serial and I2C) and RTK. Make and model don't make a difference as long as you use these protocols. - See [Movement Sensor Component](/components/movement-sensor/) for more information. -- For [LiDAR laser range scanning](/services/slam/cartographer/), we recommend RPlidar (including A1, which is a sub-$100 LIDAR). + See [Movement Sensor Component](/operate/reference/components/movement-sensor/) for more information. +- For [LiDAR laser range scanning](/operate/reference/services/slam/cartographer/), we recommend RPlidar (including A1, which is a sub-$100 LIDAR). - For robot arms, we tried the [Yahboom DOFBOT robotics arm](https://category.yahboom.net/products/dofbot-jetson_nano) with success. 
### Mount an RPlidar to the rover diff --git a/docs/dev/reference/try-viam/rover-resources/rover-tutorial-fragments.md b/docs/dev/reference/try-viam/rover-resources/rover-tutorial-fragments.md index 88ba86d5cb..235ceb3749 100644 --- a/docs/dev/reference/try-viam/rover-resources/rover-tutorial-fragments.md +++ b/docs/dev/reference/try-viam/rover-resources/rover-tutorial-fragments.md @@ -43,15 +43,15 @@ Click **Save** in the upper right corner of the page to save your new configurat The fragment adds the following components to your machine's JSON configuration: -- A [board component](/components/board/) named `local` representing the Raspberry Pi. -- Two [motors](/components/motor/gpio/) (`right` and `left`) +- A [board component](/operate/reference/components/board/) named `local` representing the Raspberry Pi. +- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`) - The configured pin numbers correspond to where the motor drivers are connected to the board. -- Two [encoders](/components/encoder/single/), one for each motor -- A wheeled [base](/components/base/), an abstraction that coordinates the movement of the right and left motors +- Two [encoders](/operate/reference/components/encoder/single/), one for each motor +- A wheeled [base](/operate/reference/components/base/), an abstraction that coordinates the movement of the right and left motors - Width between the wheel centers: 356 mm - Wheel circumference: 381 mm - Spin slip factor: 1 -- A webcam [camera](/components/camera/webcam/) +- A webcam [camera](/operate/reference/components/camera/webcam/) - An [accelerometer](https://github.com/viam-modules/tdk-invensense/) - A [power sensor](https://github.com/viam-modules/texas-instruments/) @@ -74,15 +74,15 @@ Click **Save** in the upper right corner of the page to save your new configurat The fragment adds the following components to your machine's JSON configuration: -- A [board component](/components/board/) named `local` representing 
the Raspberry Pi. -- Two [motors](/components/motor/gpio/) (`right` and `left`) +- A [board component](/operate/reference/components/board/) named `local` representing the Raspberry Pi. +- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`) - The configured pin numbers correspond to where the motor drivers are connected to the board. -- Two [encoders](/components/encoder/single/), one for each motor -- A wheeled [base](/components/base/), an abstraction that coordinates the movement of the right and left motors +- Two [encoders](/operate/reference/components/encoder/single/), one for each motor +- A wheeled [base](/operate/reference/components/base/), an abstraction that coordinates the movement of the right and left motors - Width between the wheel centers: 356 mm - Wheel circumference: 381 mm - Spin slip factor: 1 -- A webcam [camera](/components/camera/webcam/) +- A webcam [camera](/operate/reference/components/camera/webcam/) - An [accelerometer](https://github.com/viam-modules/tdk-invensense/) - A [power sensor](https://github.com/viam-modules/texas-instruments/) @@ -105,16 +105,16 @@ Click **Save** in the upper right corner of the page to save your configuration. The fragment adds the following components to your machine's JSON configuration: -- A [board component](/components/board/) named `local` representing the Raspberry Pi +- A [board component](/operate/reference/components/board/) named `local` representing the Raspberry Pi - An I2C bus for connection to the accelerometer. -- Two [motors](/components/motor/gpio/) (`right` and `left`) +- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`) - The configured pin numbers correspond to where the motor drivers are connected to the board. 
-- Two [encoders](/components/encoder/single/), one for each motor -- A wheeled [base](/components/base/), an abstraction that coordinates the movement of the right and left motors +- Two [encoders](/operate/reference/components/encoder/single/), one for each motor +- A wheeled [base](/operate/reference/components/base/), an abstraction that coordinates the movement of the right and left motors - Width between the wheel centers: 260 mm - Wheel circumference: 217 mm - Spin slip factor: 1 -- A webcam [camera](/components/camera/webcam/) +- A webcam [camera](/operate/reference/components/camera/webcam/) - An [accelerometer](https://github.com/viam-modules/analog-devices/) {{% alert title="Info" color="info" %}} @@ -143,15 +143,15 @@ Click **Save** in the upper right corner of the page to save your new configurat The fragment adds the following components to your machine's JSON configuration: -- A [board component](/components/board/) named `local` representing the Jetson. -- Two [motors](/components/motor/gpio/) (`right` and `left`) +- A [board component](/operate/reference/components/board/) named `local` representing the Jetson. +- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`) - The configured pin numbers correspond to where the motor drivers are connected to the board. 
-- Two [encoders](/components/encoder/single/), one for each motor -- A wheeled [base](/components/base/), an abstraction that coordinates the movement of the right and left motors +- Two [encoders](/operate/reference/components/encoder/single/), one for each motor +- A wheeled [base](/operate/reference/components/base/), an abstraction that coordinates the movement of the right and left motors - Width between the wheel centers: 356 mm - Wheel circumference: 381 mm - Spin slip factor: 1 -- A webcam [camera](/components/camera/webcam/) +- A webcam [camera](/operate/reference/components/camera/webcam/) - An [accelerometer](https://github.com/viam-modules/tdk-invensense/) - A [power sensor](https://github.com/viam-modules/texas-instruments/) @@ -174,15 +174,15 @@ Click **Save** in the upper right corner of the page to save your new configurat The fragment adds the following components to your machine's JSON configuration: -- A [board component](/components/board/) named `local` representing the Jetson. -- Two [motors](/components/motor/gpio/) (`right` and `left`) +- A [board component](/operate/reference/components/board/) named `local` representing the Jetson. +- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`) - The configured pin numbers correspond to where the motor drivers are connected to the board. 
-- Two [encoders](/components/encoder/single/), one for each motor -- A wheeled [base](/components/base/), an abstraction that coordinates the movement of the right and left motors +- Two [encoders](/operate/reference/components/encoder/single/), one for each motor +- A wheeled [base](/operate/reference/components/base/), an abstraction that coordinates the movement of the right and left motors - Width between the wheel centers: 356 mm - Wheel circumference: 381 mm - Spin slip factor: 1 -- A webcam [camera](/components/camera/webcam/) +- A webcam [camera](/operate/reference/components/camera/webcam/) - An [accelerometer](https://github.com/viam-modules/tdk-invensense/) - A [power sensor](https://github.com/viam-modules/texas-instruments/) @@ -201,7 +201,7 @@ The components and services included in the fragment will now appear as cards on ## Modify the config -The fragment you added is read-only, but if you need to modify your rover's config you can [overwrite sections of the fragment](/how-tos/one-to-many/#modify-a-fragment). +The fragment you added is read-only, but if you need to modify your rover's config you can [overwrite sections of the fragment](/manage/fleet/reuse-configuration/#modify-fragment-settings-on-a-machine). ## Next steps diff --git a/docs/dev/reference/try-viam/rover-resources/rover-tutorial/_index.md b/docs/dev/reference/try-viam/rover-resources/rover-tutorial/_index.md index 40d2470935..5c8525b99f 100644 --- a/docs/dev/reference/try-viam/rover-resources/rover-tutorial/_index.md +++ b/docs/dev/reference/try-viam/rover-resources/rover-tutorial/_index.md @@ -77,8 +77,8 @@ All together, your kit looks like this: {{}} The motors come with integrated encoders. -For information on encoders, see [Encoder Component](/components/encoder/). -For more information on encoded DC motors, see [Encoded Motors](/components/motor/encoded-motor/). +For information on encoders, see [Encoder Component](/operate/reference/components/encoder/). 
+For more information on encoded DC motors, see [Encoded Motors](/operate/reference/components/motor/encoded-motor/). The kit also includes stiffer suspension springs that you can substitute for the ones on the rover. Generally, a stiff suspension helps with precise steering control. @@ -96,7 +96,7 @@ L298 is a high voltage and high current motor drive chip, and H-Bridge is typica {{}} The webcam that comes with the kit is a standard USB camera device and the rover has a custom camera mount for it. -For more information, see [Camera Component](/components/camera/). +For more information, see [Camera Component](/operate/reference/components/camera/). ### Motherboard @@ -218,8 +218,8 @@ If you wish to use a Jetson Nano or Jetson Orin Nano, follow [this guide](./jets If you are using another board, you can skip this step. {{% /alert %}} -Install a 64-bit Raspberry Pi OS onto your Pi following our [Raspberry Pi installation guide](/installation/prepare/rpi-setup/). -Follow all steps as listed, including the final step, [Enable communication protocols](/installation/prepare/rpi-setup/#enable-communication-protocols), which is required to enable [the accelerometer](#6dof-imu) on your rover. +Install a 64-bit Raspberry Pi OS onto your Pi following our [Raspberry Pi installation guide](/operate/reference/prepare/rpi-setup/). +Follow all steps as listed, including the final step, [Enable communication protocols](/operate/reference/prepare/rpi-setup/#enable-communication-protocols), which is required to enable [the accelerometer](#6dof-imu) on your rover. Once you have installed Raspberry Pi OS and `viam-server`, put your SD card in the slot on your Pi. 
### Add the power supply @@ -390,7 +390,7 @@ Enable the I2C protocol on your Pi to get readings from the power sen ### Control your rover on the Viam app -If you followed the instructions in the [Pi installation guide](/installation/prepare/rpi-setup/), you should have already made an account on the [Viam app](https://app.viam.com), installed `viam-server` on the board, and added a new machine. +If you followed the instructions in the [Pi installation guide](/operate/reference/prepare/rpi-setup/), you should have already made an account on the [Viam app](https://app.viam.com), installed `viam-server` on the board, and added a new machine. If not, add a new machine in the [Viam app](https://app.viam.com) and follow the {{< glossary_tooltip term_id="setup" text="setup instructions" >}} until your machine is connected. @@ -419,8 +419,8 @@ The following are just a few ideas, but you can expand or modify the rover kit w - For GPS navigation, we support NMEA (using serial and I2C) and RTK. Make and model don't make a difference as long as you use these protocols. - See [Movement Sensor Component](/components/movement-sensor/) for more information. -- For [LiDAR laser range scanning](/services/slam/cartographer/), we recommend RPlidar (including A1, which is a sub-$100 LIDAR). + See [Movement Sensor Component](/operate/reference/components/movement-sensor/) for more information. +- For [LiDAR laser range scanning](/operate/reference/services/slam/cartographer/), we recommend RPlidar (including A1, which is a sub-$100 LIDAR). - For robot arms, we tried the [Yahboom DOFBOT robotics arm](https://category.yahboom.net/products/dofbot-jetson_nano) with success. 
### Mount an RPlidar to the rover diff --git a/docs/dev/reference/try-viam/rover-resources/rover-tutorial/jetson-rover-setup.md b/docs/dev/reference/try-viam/rover-resources/rover-tutorial/jetson-rover-setup.md index 89330e5d3d..fa55da7d8d 100644 --- a/docs/dev/reference/try-viam/rover-resources/rover-tutorial/jetson-rover-setup.md +++ b/docs/dev/reference/try-viam/rover-resources/rover-tutorial/jetson-rover-setup.md @@ -69,7 +69,7 @@ Some states do not allow the exclusion or disclaimer of implied warranties, so t ## Setup 1. Install the WiFi board/device on the Nano. Follow the manufacturer's instructions to do so. -2. Power the Jetson Nano with a power supply and [prepare the device and install `viam-server`](/installation/prepare/jetson-nano-setup/). +2. Power the Jetson Nano with a power supply and [prepare the device and install `viam-server`](/operate/reference/prepare/jetson-nano-setup/). 3. Switch back to the main guide and complete these two steps: [Add the power supply](/dev/reference/try-viam/rover-resources/rover-tutorial/#add-the-power-supply) and [Configure the low-voltage cutoff circuit](/dev/reference/try-viam/rover-resources/rover-tutorial/#configure-the-low-voltage-cutoff-circuit). 4. Unscrew the top of the rover with the biggest Allen key. @@ -132,7 +132,7 @@ Some states do not allow the exclusion or disclaimer of implied warranties, so t ## Setup -1. Power the Jetson Orin Nano with a power supply and [prepare the device and install `viam-server`](/installation/prepare/jetson-nano-setup/). +1. Power the Jetson Orin Nano with a power supply and [prepare the device and install `viam-server`](/operate/reference/prepare/jetson-nano-setup/). 2. Switch back to the main guide and complete these two steps: [Add the power supply](/dev/reference/try-viam/rover-resources/rover-tutorial/#add-the-power-supply) and [Configure the low-voltage cutoff circuit](/dev/reference/try-viam/rover-resources/rover-tutorial/#configure-the-low-voltage-cutoff-circuit). 3. 
Unscrew the top of the rover with the biggest Allen key. @@ -155,7 +155,7 @@ Some states do not allow the exclusion or disclaimer of implied warranties, so t ### Control your rover on the Viam app -If you followed the instructions in the [Jetson installation guide](/installation/prepare/jetson-nano-setup/), you should have already made an account on the [Viam app](https://app.viam.com), installed `viam-server` on the board, and added a new machine. +If you followed the instructions in the [Jetson installation guide](/operate/reference/prepare/jetson-nano-setup/), you should have already made an account on the [Viam app](https://app.viam.com), installed `viam-server` on the board, and added a new machine. To configure your rover so you can start driving it, [add a Viam Rover 2 Fragment to your machine](/dev/reference/try-viam/rover-resources/rover-tutorial-fragments/). diff --git a/docs/dev/reference/try-viam/try-viam-tutorial.md b/docs/dev/reference/try-viam/try-viam-tutorial.md index 5aad38c6d8..9321327649 100644 --- a/docs/dev/reference/try-viam/try-viam-tutorial.md +++ b/docs/dev/reference/try-viam/try-viam-tutorial.md @@ -22,7 +22,7 @@ You can take over a Viam Rover in our robotics lab to play around! The rental rover is made up of a chassis with a Raspberry Pi 4B single-board computer, two motors, encoders, and a camera. The Try Viam area also has an overhead camera to provide a view of the rental rover, allowing you to view its movements in real time. 
-Watch this tutorial video for a walkthrough of Try Viam, including [how to reserve a Viam Rover](/appendix/try-viam/reserve-a-rover/#using-the-reservation-system), [navigate the Viam platform](/fleet/), and [drive the rover](#control-tab): +Watch this tutorial video for a walkthrough of Try Viam, including [how to reserve a Viam Rover](/dev/reference/try-viam/reserve-a-rover/#using-the-reservation-system), [navigate the Viam platform](/operate/), and [drive the rover](#control-tab): {{}} @@ -48,7 +48,7 @@ The order of these components may vary. ### Base control -The [base component](/components/base/) is the platform that the other parts of a mobile machine attach to. +The [base component](/operate/reference/components/base/) is the platform that the other parts of a mobile machine attach to. Click the `viam_base` component to expand the base control pane to reveal the camera feed and driving interfaces. @@ -61,7 +61,7 @@ We recommend enabling both cameras so you can have a better sense of what's happ ![The viam_base component panel showing both the 'cam' and 'overheadcam' camera feeds enabled.](appendix/try-viam/try-viam/enable-both-cameras.png) -You can also view and control the camera streams from the individual camera components on the [**CONTROL** page](/cloud/machines/#control). +You can also view and control the camera streams from the individual camera components on the [**CONTROL** page](/manage/troubleshoot/teleoperate/default-interface/#viam-app). #### Movement control @@ -95,7 +95,7 @@ If you go from the from **Keyboard** to the **Discrete** tab, you can choose bet ### Camera control -While you can view the camera streams [from the base component panel](#camera-views), you can access more features on each individual [camera component](/components/camera/) panel. 
In these panels, you can: +While you can view the camera streams [from the base component panel](#camera-views), you can access more features on each individual [camera component](/operate/reference/components/camera/) panel. In these panels, you can: - Set the refresh frequency - Export screenshots @@ -111,7 +111,7 @@ While you can view the camera streams [from the base component panel](#camera-vi ### Motor control -The [motor components](/components/motor/) enable you to move the base. +The [motor components](/operate/reference/components/motor/) enable you to move the base. The motors are named `left` and `right`, corresponding to their location on the rover base. Their initial state is **Idle**. You can click on each motor panel and make the motor **RUN** or **STOP**. @@ -127,7 +127,7 @@ You can also see their current positions (based on encoder readings) in real tim #### Board control -The [board component](/components/board/) is the signal wire hub of a machine which allows you to control the states of individual GPIO pins on the board. +The [board component](/operate/reference/components/board/) is the signal wire hub of a machine which allows you to control the states of individual GPIO pins on the board. For the Viam Rover, the board component is named `local` and controls a Raspberry Pi on the Viam Rover. With it, you can control the states of individual GPIO pins on the board. @@ -147,13 +147,13 @@ There you can view the configuration for each component in the machine: attribut ### Board configuration -The [board component](/components/board/) is the signal wire hub of a machine. +The [board component](/operate/reference/components/board/) is the signal wire hub of a machine. Configuring a board component allows you to control the states of individual GPIO pins to command the electrical signals sent through and received by the board. 
For the Viam Rover, the board component is a Raspberry Pi with **Name** `local`, **Type** `board`, and **Model** `viam:raspberry-pi:rpi`. ### Encoder configuration -An [encoder](/components/encoder/) is a device that is used to sense angular position, direction and/or speed of rotation. +An [encoder](/operate/reference/components/encoder/) is a device that is used to sense angular position, direction and/or speed of rotation. In this case, the encoders on the left and right motors are `Lenc` and `Renc` and configure the pins to `le` and `re`. {{< alert title="Important" color="note" >}} @@ -164,7 +164,7 @@ When configuring encoded motors for your own robot, you must configure the encod ### Motor configuration -Both [motors](/components/motor/) on this rover use the model `gpio` which is the model for basic DC motors that are connected to and controlled by the configured board. +Both [motors](/operate/reference/components/motor/) on this rover use the model `gpio` which is the model for basic DC motors that are connected to and controlled by the configured board. The attributes section lists the board the motor is wired to, and since the rover's motors are encoded the user interface also shows the encoded motor attributes: the encoder name, motor ramp rate limit, encoder ticks per rotation, and max RPM limit. @@ -174,7 +174,7 @@ Click **Switch to Builder** to return to the default graphical user interface. ### Base configuration -The [base component](/components/base/) is the platform that the other parts of a mobile robot attach to. +The [base component](/operate/reference/components/base/) is the platform that the other parts of a mobile robot attach to. By configuring a base component, the individual components are organized to produce coordinated movement and you gain an interface to control the movement of the whole physical base of the robot without needing to send separate commands to individual motors. 
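To illustrate the kind of per-motor math the base abstraction performs on your behalf (a simplified sketch, not Viam's actual implementation), a differential-drive base turns by commanding its left and right motors at different speeds; the base width below is a made-up value:

```python
def wheel_speeds(linear: float, angular: float,
                 width_mm: float) -> tuple[float, float]:
    """Convert a desired base velocity (linear mm/sec, angular rad/sec)
    into (left, right) wheel surface speeds for a differential drive."""
    # Each wheel sits width/2 from the turning center, so rotation
    # adds/subtracts an offset from the straight-line speed.
    offset = angular * width_mm / 2
    return linear - offset, linear + offset


# Driving straight at 100 mm/sec: both wheels match.
print(wheel_speeds(100.0, 0.0, 260))  # (100.0, 100.0)
# Spinning in place: the wheels run in opposite directions.
print(wheel_speeds(0.0, 1.0, 260))   # (-130.0, 130.0)
```

This is why the base config needs the width and wheel circumference attributes: they let the base translate body-frame velocities into individual motor commands.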
The base's type is `base` and its model is `wheeled` which configures a robot with wheels on its base, like the Viam Rover. The **left** and **right** attributes configure the motors on the left and right side of the rover, which are named `left` and `right`, respectively. @@ -187,10 +187,10 @@ The **Spin Slip Factor** of 1.76 is used in steering calculations to account for ### Camera configuration -The [camera component](/components/camera/) configures the webcam that is plugged into the Raspberry Pi of the rover. +The [camera component](/operate/reference/components/camera/) configures the webcam that is plugged into the Raspberry Pi of the rover. The camera component has the **Type** `camera`, the **Model** `webcam`, and the **Video Path** is `video0`. -For more information on choosing the correct video path, refer to our [webcam documentation](/components/camera/webcam/). +For more information on choosing the correct video path, refer to our [webcam documentation](/operate/reference/components/camera/webcam/). ![The video path in the webcam configuration panel is set to 'video0'.](appendix/try-viam/try-viam/camera-config.png) @@ -222,11 +222,11 @@ You can view the complete JSON for your rover by clicking on **Raw JSON** at the ![The CONFIG tab with the mode toggled to Raw JSON. A section of the full raw JSON config is displayed but one would have to scroll to see all of it.](appendix/try-viam/try-viam/raw-json.png) -You can [copy this `JSON` config between rental rovers](/appendix/try-viam/reserve-a-rover/#how-can-i-reuse-my-borrowed-rover). +You can [copy this `JSON` config between rental rovers](/dev/reference/try-viam/reserve-a-rover/#how-can-i-reuse-my-borrowed-rover). ## Next steps -If you have questions, check out our [FAQ](/appendix/try-viam/reserve-a-rover/) or join our [Discord Community](https://discord.gg/viam), where you can ask questions and meet other people working on robots. 
+If you have questions, check out our [FAQ](/dev/reference/try-viam/reserve-a-rover/#faq) or join our [Discord Community](https://discord.gg/viam), where you can ask questions and meet other people working on robots.

{{< cards >}}
{{% card link="/tutorials/control/drive-rover/" %}}
diff --git a/docs/dev/tools/cli.md b/docs/dev/tools/cli.md
index 8357d13602..76a972cee7 100644
--- a/docs/dev/tools/cli.md
+++ b/docs/dev/tools/cli.md
@@ -156,7 +156,7 @@ By default, new organization API keys are created with **Owner** permissions, gi
You can change an API key's permissions from the Viam app on the [organizations page](/manage/reference/organize/) by clicking the **Show details** link next to your API key.
{{% /alert %}}

-Once created, you can use the organization API key to authenticate future CLI sessions or to [use the SDKs](/sdks/#authentication).
+Once created, you can use the organization API key to authenticate future CLI sessions or to [use the SDKs](/dev/reference/sdks/).
To switch to using an organization API key for authentication right away, [logout](#logout) then log back in using `viam login api-key`.

An organization can have multiple API keys.
@@ -195,7 +195,7 @@ By default, new location API keys are created with **Owner** permissions, giving
You can change an API key's permissions from the Viam app on the [organizations page](/manage/reference/organize/) by clicking the **Show details** link next to your API key.
{{% /alert %}}

-Once created, you can use the location API key to authenticate future CLI sessions or to [connect to machines with the SDK](/sdks/#authentication).
+Once created, you can use the location API key to authenticate future CLI sessions or to [connect to machines with the SDK](/dev/reference/sdks/).
To switch to using a location API key for authentication right away, [logout](#logout) then log back in using `viam login api-key`.

A location can have multiple API keys.
@@ -233,7 +233,7 @@ Keep these key values safe.
Authenticating using a machine part API key gives the authenticated CLI session full read and write access to your machine. {{% /alert %}} -Once created, you can use the machine part API key to authenticate future CLI sessions or to [connect to your machine with the SDK](/sdks/#authentication). +Once created, you can use the machine part API key to authenticate future CLI sessions or to [connect to your machine with the SDK](/dev/reference/sdks/). To switch to using a machine part API key for authentication right away, [logout](#logout) then log back in using `viam login api-key`. A location can have multiple API keys. diff --git a/docs/manage/_index.md b/docs/manage/_index.md index 47bdd69712..d92aa91910 100644 --- a/docs/manage/_index.md +++ b/docs/manage/_index.md @@ -10,6 +10,7 @@ overview: true description: "Remotely deploy and manage software on any fleet of devices. You can monitor all connected devices and troubleshoot any issues - from anywhere." aliases: - /cloud/ + - /fleet/ --- Viam's fleet management tooling allows you to remotely deploy and manage software on any fleet of devices. You can monitor all connected devices and troubleshoot any issues - from anywhere. diff --git a/docs/operate/_index.md b/docs/operate/_index.md index 09bd7ef684..0b0cd4727f 100644 --- a/docs/operate/_index.md +++ b/docs/operate/_index.md @@ -9,7 +9,7 @@ open_on_desktop: true overview: true description: "To get started, install Viam on any device and integrate your hardware. Then you can control your device and any attached physical hardware securely from anywhere in the world." aliases: - - /configure/ + - /build/ --- To get started, install Viam on any device and integrate your hardware. Then you can control your device and any attached physical hardware securely from anywhere in the world. 
diff --git a/docs/operate/control/headless-app.md b/docs/operate/control/headless-app.md index 54f23bc4ab..d14375c8cf 100644 --- a/docs/operate/control/headless-app.md +++ b/docs/operate/control/headless-app.md @@ -297,6 +297,7 @@ robot_address = os.getenv('ROBOT_ADDRESS') or '' sensor_name = os.getenv("SENSOR_NAME", "") plug_name = os.getenv("PLUG_NAME", "") + async def connect(): opts = RobotClient.Options.with_api_key( api_key=robot_api_key, @@ -304,6 +305,7 @@ async def connect(): ) return await RobotClient.at_address(robot_address, opts) + async def main(): machine = await connect() @@ -319,7 +321,8 @@ async def main(): while True: readings = await pms_7003.get_readings() # Check if any of the PM values exceed the unhealthy thresholds - if any(readings.get(pm_type, 0) > threshold for pm_type, threshold in unhealthy_thresholds.items()): + if any(readings.get(pm_type, 0) > threshold for pm_type, + threshold in unhealthy_thresholds.items()): LOGGER.info('UNHEALTHY.') await kasa_plug.do_command({"toggle_on": []}) else: diff --git a/docs/operate/control/mobile-app.md b/docs/operate/control/mobile-app.md index 6d78c0ec01..ffdc8d1e8f 100644 --- a/docs/operate/control/mobile-app.md +++ b/docs/operate/control/mobile-app.md @@ -93,7 +93,7 @@ The connection code will establish communication with your machine over LAN or W ## Set up user authentication -Viam uses [FusionAuth](FusionAuth) for authentication and authorization. +Viam uses [FusionAuth](https://fusionauth.io/) for authentication and authorization. Use the [Viam CLI `auth-app` command](/dev/tools/cli/#auth-app) to register your application with FusionAuth so that you or your users can log into your app with the same credentials they use to log into the [Viam app](https://app.viam.com). 
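Returning to the reflowed `any(...)` threshold check in the headless-app hunk above: that logic is easy to verify in isolation. A standalone sketch with made-up thresholds and readings (in the real script, readings come from the PMS7003 sensor and the thresholds from its configuration):

```python
# Hypothetical unhealthy thresholds, keyed by PM reading name.
unhealthy_thresholds = {"pm2_5": 35.4, "pm10": 154.0}


def is_unhealthy(readings: dict) -> bool:
    """True if any reading exceeds its threshold; missing readings count as 0."""
    return any(readings.get(pm_type, 0) > threshold
               for pm_type, threshold in unhealthy_thresholds.items())


print(is_unhealthy({"pm2_5": 40.0, "pm10": 100.0}))  # True (PM2.5 over 35.4)
print(is_unhealthy({"pm2_5": 10.0}))                 # False (nothing exceeds)
```

Using `readings.get(pm_type, 0)` means a sensor that omits a reading is treated as healthy for that pollutant rather than raising a `KeyError`.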
diff --git a/docs/operate/control/web-app.md b/docs/operate/control/web-app.md index 57e27ad496..dcab1ac7c9 100644 --- a/docs/operate/control/web-app.md +++ b/docs/operate/control/web-app.md @@ -53,7 +53,7 @@ You can also host your app on a server or hosting service of your choice. ## Set up user authentication -Viam uses [FusionAuth](FusionAuth) for authentication and authorization. +Viam uses [FusionAuth](https://fusionauth.io/) for authentication and authorization. Use the [Viam CLI `auth-app` command](/dev/tools/cli/#auth-app) to register your application with FusionAuth so that you or your users can log into your app with the same credentials they use to log into the [Viam app](https://app.viam.com). diff --git a/docs/operate/get-started/other-hardware/_index.md b/docs/operate/get-started/other-hardware/_index.md index e1d945bbde..e848ae0bc7 100644 --- a/docs/operate/get-started/other-hardware/_index.md +++ b/docs/operate/get-started/other-hardware/_index.md @@ -11,6 +11,12 @@ aliases: - /how-tos/create-module/ - /how-tos/sensor-module/ - /registry/advanced/iterative-development/ + - /build/program/extend/modular-resources/ + - /extend/modular-resources/ + - /extend/ + - /build/program/extend/modular-resources/key-concepts/ + - /modular-resources/key-concepts/ + - /modular-resources/ prev: "/operate/get-started/supported-hardware/" next: "/operate/get-started/other-hardware/hello-world-module/" --- @@ -616,7 +622,8 @@ Click **Create**. Click the **+** button again, this time selecting **Local module** and then **Local component**. -Select or enter the {{< glossary_tooltip term_id="model-namespace-triplet" text="model namespace triplet" >}} you specified in the [Name your model step](/how-tos/sensor-module/#generate-template-module-code), for example `jessamy:weather:meteo-PM`. +Select or enter the {{< glossary_tooltip term_id="model-namespace-triplet" text="model namespace triplet" >}}, for example `jessamy:weather:meteo-PM`. 
+You can find the triplet in the `model` field of your meta.json file. Select the **Type** corresponding to the API you implemented. @@ -762,7 +769,7 @@ Do not change the module_id.

visibility string Required -Whether the module is accessible only to members of your organization (private), or visible to all Viam users (public). You can later make a private module public using the viam module update command. Once you make a module public, you can change it back to private if it is not configured on any machines outside of your organization. +Whether the module is accessible only to members of your organization (private), or visible to all Viam users (public). You can later make a private module public using the viam module update command. Once you make a module public, you can change it back to private if it is not configured on any machines outside of your organization. url @@ -792,7 +799,7 @@ Do not change the module_id.

build object Optional -An object containing the command to run to build your module, as well as optional fields for the path to your dependency setup script, the target architectures to build for, and the path to your built module. Use this with the Viam CLI's build subcommand. +An object containing the command to run to build your module, as well as optional fields for the path to your dependency setup script, the target architectures to build for, and the path to your built module. Use this with the Viam CLI's build subcommand. $schema diff --git a/docs/operate/get-started/other-hardware/cpp-module.md b/docs/operate/get-started/other-hardware/cpp-module.md index 4d51d64869..a9dbab0724 100644 --- a/docs/operate/get-started/other-hardware/cpp-module.md +++ b/docs/operate/get-started/other-hardware/cpp-module.md @@ -289,7 +289,8 @@ class MyBase(Base, Reconfigurable): # Here is where we define our new model's colon-delimited-triplet: # acme:my-custom-base-module:mybase - # acme = namespace, my-custom-base-module = module-name, mybase = model name. + # acme = namespace, my-custom-base-module = module-name, + # mybase = model name MODEL: ClassVar[Model] = Model( ModelFamily("acme", "my-custom-base-module"), "mybase") @@ -1490,7 +1491,7 @@ _Add instructions here for any requirements._ ## Configure your -Navigate to the [**CONFIGURE** tab](https://docs.viam.com/configure/) of your [machine](https://docs.viam.com/fleet/machines/) in the [Viam app](https://app.viam.com/). +Navigate to the **CONFIGURE** tab of your [machine](https://docs.viam.com/fleet/machines/) in the [Viam app](https://app.viam.com/). [Add to your machine](/operate/get-started/supported-hardware/#configure-hardware-on-your-machine). 
On the new component panel, copy and paste the following attribute template into your ’s attributes field: @@ -1558,7 +1559,7 @@ On the new component panel, copy and paste the following attribute template into ``` > [!NOTE] -> For more information, see [Configure a Machine](https://docs.viam.com/configure/). +> For more information, see [Configure hardware on your machine](/operate/get-started/supported-hardware/#configure-hardware-on-your-machine). ### Attributes diff --git a/docs/operate/get-started/other-hardware/hello-world-module.md b/docs/operate/get-started/other-hardware/hello-world-module.md index c95d6f956b..6549e797ce 100644 --- a/docs/operate/get-started/other-hardware/hello-world-module.md +++ b/docs/operate/get-started/other-hardware/hello-world-module.md @@ -802,8 +802,6 @@ For more information about uploading modules, see [Update and manage modules you ## Next steps -For a guide that walks you through creating different sensor models, for example to get weather data from an online source, see [Create a sensor module with Python](/how-tos/sensor-module/). - -For more module creation information with more programming language options, see the [Create a module](/operate/get-started/other-hardware/) guide. +For more module creation information, see the [Integrate other hardware](/operate/get-started/other-hardware/) guide. To update or delete a module, see [Update and manage modules](/operate/get-started/other-hardware/manage-modules/). diff --git a/docs/operate/get-started/setup.md b/docs/operate/get-started/setup.md index 9b9b4158ae..f2628609e4 100644 --- a/docs/operate/get-started/setup.md +++ b/docs/operate/get-started/setup.md @@ -82,7 +82,7 @@ Since the configuration is cached locally, your machine does not need to stay co If it is online, the machine checks for new configurations every 15 seconds and changes its config automatically when a new config is available. 
All communication happens securely over HTTPS using secret tokens that are in the machine's config.

-If your machine will never connect to the internet, you can also create a [local configuration file](/operate/reference/local-configuration-file/) on the machine itself.
+If your machine will never connect to the internet, you can also create a [local configuration file](/operate/reference/viam-server/local-configuration-file/) on the machine itself.

### Manage your installation

@@ -90,7 +90,7 @@ On Linux installs, by default `viam-server` or `viam-agent` and `viam-server` wi
On macOS installs, `viam-server` does not start automatically on boot.
You can change this behavior if desired.

-To learn how to run, update, or uninstall `viam-agent`, see [Manage `viam-agent`](manage/reference/viam-agent/manage-viam-agent/).
+To learn how to run, update, or uninstall `viam-agent`, see [Manage `viam-agent`](/manage/reference/viam-agent/manage-viam-agent/).

For manual installs of only `viam-server`, see [Manage `viam-server`](/operate/reference/viam-server/manage-viam-server/).
diff --git a/docs/operate/get-started/supported-hardware/_index.md b/docs/operate/get-started/supported-hardware/_index.md
index 7db5a9f2a9..a5969e2faa 100644
--- a/docs/operate/get-started/supported-hardware/_index.md
+++ b/docs/operate/get-started/supported-hardware/_index.md
@@ -12,6 +12,10 @@ aliases:
  - /modular-resources/configure/
  - /registry/configure/
  - /registry/modular-resources/
+  - /configure/
+  - /manage/configuration/
+  - /build/configure/
+  - /registry/
prev: "/operate/get-started/setup/"
next: "/operate/get-started/other-hardware/"
---
@@ -69,7 +73,7 @@ You can browse the [Viam Registry in the Viam app](https://app.viam.com/registry
The following is a selection of components (some built-ins and some modules) written for use with `viam-micro-server`.
To use any of the built-in components, configure them according to their readmes.
-To use a module with `viam-micro-server`, you need to [build firmware that combines `viam-micro-server` with one or more modules](/operate/get-started/other-hardware/micro-module). +To use a module with `viam-micro-server`, you need to [build firmware that combines `viam-micro-server` with one or more modules](/operate/get-started/other-hardware/micro-module/). | Model | Description | Built-in | @@ -127,4 +132,4 @@ To add a service to your machine: Modules run alongside [`viam-server`](/operate/reference/viam-server/) as separate processes, communicating with `viam-server` over UNIX sockets. When a module initializes, it registers its {{< glossary_tooltip term_id="model" text="model or models" >}} and associated [APIs](/dev/reference/apis/) with `viam-server`, making the new model available for use. -`viam-server` manages the [dependencies](/operate/reference/viam-server/#dependency-management), [start-up](/operate/reference/viam-server/#start-up), [reconfiguration](/operate/reference/viam-server/#reconfiguration), [data management](/services/data/#configuration), and [shutdown](/operate/reference/viam-server/#shutdown) behavior of your modular resource. +`viam-server` manages the [dependencies](/operate/reference/viam-server/#dependency-management), [start-up](/operate/reference/viam-server/#start-up), [reconfiguration](/operate/reference/viam-server/#reconfiguration), [data management](/data-ai/capture-data/capture-sync/), and [shutdown](/operate/reference/viam-server/#shutdown) behavior of your modular resource. 
diff --git a/docs/operate/mobility/define-obstacles.md b/docs/operate/mobility/define-obstacles.md index a3ec60a337..8640077e1e 100644 --- a/docs/operate/mobility/define-obstacles.md +++ b/docs/operate/mobility/define-obstacles.md @@ -13,7 +13,7 @@ The motion service will take into account the obstacles as well as the geometry Start by [defining your machine's geometry](/operate/mobility/define-geometry/) so that you can define the obstacles with respect to the machine's reference frame. Next, define one or more obstacles. -Here is a Python example from the [Add constraints and transforms to a motion plan guide](/operate/mobility/move-arm/constrain-motion/#modify-your-robots-working-environment): +Here is a Python example from the [Add constraints and transforms to a motion plan guide](/tutorials/services/constrain-motion/#modify-your-robots-working-environment): ```python {class="line-numbers linkable-line-numbers"} box_origin = Pose(x=400, y=0, z=50+z_offset) diff --git a/docs/operate/mobility/move-arm.md b/docs/operate/mobility/move-arm.md index 20fdbc2ecd..6e85c40693 100644 --- a/docs/operate/mobility/move-arm.md +++ b/docs/operate/mobility/move-arm.md @@ -25,7 +25,7 @@ You have two options for moving a robotic [arm](/operate/reference/components/ar ## Configure and connect to your arm {{< table >}} -{{% tablestep link="/get-started/supported-hardware/" %}} +{{% tablestep link="/operate/get-started/supported-hardware/" %}} **1. Configure an arm component** First, physically connect the arm to your machine. 
diff --git a/docs/operate/mobility/move-gantry.md b/docs/operate/mobility/move-gantry.md index 01995b1db9..d7937ddc25 100644 --- a/docs/operate/mobility/move-gantry.md +++ b/docs/operate/mobility/move-gantry.md @@ -68,15 +68,19 @@ from viam.robot.client import RobotClient from viam.rpc.dial import Credentials, DialOptions from viam.components.gantry import Gantry + async def connect(): opts = RobotClient.Options.with_api_key( - # Replace "" (including brackets) with your machine's api key + # Replace "" (including brackets) with + # your machine's API key api_key='', - # Replace "" (including brackets) with your machine's api key id + # Replace "" (including brackets) with + # your machine's API key ID api_key_id='' ) return await RobotClient.at_address('', opts) + async def main(): machine = await connect() @@ -91,16 +95,16 @@ async def main(): # Home the gantry await gantry_1.home() - # Move this three-axis gantry to a position 5mm in the positive Y direction from (0,0,0) + # Move this three-axis gantry to a position 5mm in the + # positive Y direction from (0,0,0) # and set the speed of each axis to 8 mm/sec - await gantry_1.move_to_position([0,5,0], [8,8,8]) + await gantry_1.move_to_position([0, 5, 0], [8, 8, 8]) # Don't forget to close the machine when you're done! 
await machine.close() if __name__ == '__main__': asyncio.run(main()) - ``` ## Use automated complex motion planning @@ -123,15 +127,19 @@ from viam.components.gantry import Gantry from viam.services.motion import MotionClient from viam.proto.common import Pose, PoseInFrame + async def connect(): opts = RobotClient.Options.with_api_key( - # Replace "" (including brackets) with your machine's api key + # Replace "" (including brackets) with + # your machine's API key api_key='', - # Replace "" (including brackets) with your machine's api key id + # Replace "" (including brackets) with + # your machine's API key ID api_key_id='' ) return await RobotClient.at_address('', opts) + async def main(): machine = await connect() @@ -147,13 +155,13 @@ async def main(): goal_pose = Pose(x=0, y=0, z=300, o_x=0, o_y=0, o_z=1, theta=0) # Move the gantry - await motion.move(component_name=gantry_1, - destination=PoseInFrame(reference_frame="myFrame", pose=goal_pose)) + await motion.move( + component_name=gantry_1, + destination=PoseInFrame(reference_frame="myFrame", pose=goal_pose)) # Don't forget to close the machine when you're done! await machine.close() if __name__ == '__main__': asyncio.run(main()) - ``` diff --git a/docs/operate/reference/architecture/_index.md b/docs/operate/reference/architecture/_index.md index 3d03d613f5..d70ac35819 100644 --- a/docs/operate/reference/architecture/_index.md +++ b/docs/operate/reference/architecture/_index.md @@ -109,7 +109,7 @@ For more details, see [Machine-to-Machine Communication](/operate/reference/arch TLS certificates automatically provided by the Viam app ensure that all communication is authenticated and encrypted. -Viam uses API keys with [role-based access control (RBAC)](/cloud/rbac/) to control access to machines from client code. +Viam uses API keys with [role-based access control (RBAC)](/manage/manage/rbac/) to control access to machines from client code. 
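For intuition about the reformatted gantry call `move_to_position([0, 5, 0], [8, 8, 8])` above: each axis receives its own target and speed, so a back-of-the-envelope per-axis travel-time estimate (ignoring acceleration, which the real driver handles) is simply distance divided by speed:

```python
def travel_times_sec(deltas_mm: list, speeds_mm_per_sec: list) -> list:
    """Rough per-axis travel-time estimate: |distance| / speed."""
    return [abs(d) / s for d, s in zip(deltas_mm, speeds_mm_per_sec)]


# Moving 5 mm on the Y axis at 8 mm/sec takes 0.625 seconds;
# the X and Z axes have nothing to do.
print(travel_times_sec([0, 5, 0], [8, 8, 8]))  # [0.0, 0.625, 0.0]
```

This is only a sketch for reasoning about expected motion, not how the gantry driver computes timing.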
## Data management flow

@@ -153,7 +153,7 @@ Now imagine you want to run code to turn on a fan when the temperature sensor re
- Configure the fan motor as a motor component and wire the fan motor relay to the same board as the sensor.
- Write your script using one of the Viam [SDKs](/dev/reference/sdks/), for example the Viam Python SDK, using the sensor API and motor API.
- You then run this code either locally on the SBC, or on a separate server.
-  See [Run code](/sdks/#run-code) for more options.
+  See [Create a headless app](/operate/control/headless-app/) for more information.
  Your code connects to the machine, authenticating with API keys, and uses the [sensor API](/operate/reference/components/sensor/#api) to get readings and the [motor API](/operate/reference/components/motor/#api) to turn the motor on and off.

![A desktop computer (client in this case) sends commands to robot 1 (server) with gRPC over wifi.](/build/program/sdks/robot-client.png)
diff --git a/docs/operate/reference/components/arm/_index.md b/docs/operate/reference/components/arm/_index.md
index e356bacaf6..34762a6ca7 100644
--- a/docs/operate/reference/components/arm/_index.md
+++ b/docs/operate/reference/components/arm/_index.md
@@ -95,5 +95,5 @@ For general configuration, development, and usage info, see:
You can also use the arm component with the following services:

-- [Motion service](/operate/reference/services/slam//services/slam/): To move machines or components of machines
+- [Motion service](/operate/reference/services/motion/): To move machines or components of machines
- [Frame system service](/operate/reference/services/navigation/): To configure the positions of your components
diff --git a/docs/operate/reference/components/base/_index.md b/docs/operate/reference/components/base/_index.md
index 35832cd386..6986107811 100644
--- a/docs/operate/reference/components/base/_index.md
+++ b/docs/operate/reference/components/base/_index.md
@@ -23,7 +23,7 @@ The base component provides an
API for moving all configured components attached If you have a mobile robot, use a base component to coordinate the motion of its motor components.

-A robot comprised of a wheeled base (motors, wheels and chassis) as well as some other components. The wheels are highlighted to indicate that they are part of the concept of a 'base', while the non-base components are not highlighted. The width and circumference are required attributes when configuring a base component. +A robot comprised of a wheeled base (motors, wheels and chassis) as well as some other components. The wheels are highlighted to indicate that they are part of the concept of a 'base', while the non-base components are not highlighted. The width and circumference are required attributes when configuring a base component.

## Configuration diff --git a/docs/operate/reference/components/component/board1.md b/docs/operate/reference/components/component/board1.md index 1b322757f6..0e037f9d11 100644 --- a/docs/operate/reference/components/component/board1.md +++ b/docs/operate/reference/components/component/board1.md @@ -14,7 +14,7 @@ draft: true {{% alert title="REQUIREMENTS" color="note" %}} -Follow the [setup guide](/installation/prepare/board1-setup) to prepare your to run `viam-server` before you configure your board. +Follow the [setup guide](/operate/reference/prepare/board1-setup) to prepare your board to run `viam-server` before you configure your board. {{% /alert %}} diff --git a/docs/operate/reference/components/gantry/_index.md b/docs/operate/reference/components/gantry/_index.md index 5e36e1cc19..3d926d4713 100644 --- a/docs/operate/reference/components/gantry/_index.md +++ b/docs/operate/reference/components/gantry/_index.md @@ -18,7 +18,7 @@ date: "2024-10-21" ---
-Example of what a multi-axis robot gantry looks like as a black and white illustration of an XX YY mechanical gantry. +Example of what a multi-axis robot gantry looks like as a black and white illustration of an XX YY mechanical gantry.
The gantry component provides an API for coordinated control of one or more linear actuators. diff --git a/docs/operate/reference/components/input-controller/_index.md b/docs/operate/reference/components/input-controller/_index.md index 9f7a506dfd..9c9a65b865 100644 --- a/docs/operate/reference/components/input-controller/_index.md +++ b/docs/operate/reference/components/input-controller/_index.md @@ -26,7 +26,7 @@ To use the controller's inputs, you must [register callback functions](/dev/refe The callback functions can then handle the [Events](/dev/reference/apis/components/input-controller/#getevents) that are sent when the `Control` is activated or moved. For example, when a specific button is pushed, the callback function registered to it can move another component, or print a specific output. -The [base remote control service](/services/base-rc/) implements an input controller as a remote control for a base. +The [base remote control service](/operate/reference/services/base-rc/) implements an input controller as a remote control for a base. ## Configuration diff --git a/docs/operate/reference/components/sensor/_index.md b/docs/operate/reference/components/sensor/_index.md index 253c7a35a5..ece93ee95c 100644 --- a/docs/operate/reference/components/sensor/_index.md +++ b/docs/operate/reference/components/sensor/_index.md @@ -42,7 +42,7 @@ For additional configuration information, click on the model name: {{}} {{< alert title="Add support for other models" color="tip" >}} -If none of the existing models fit your use case, you can [create a modular resource](/how-tos/sensor-module/) to add support for it. +If none of the existing models fit your use case, you can [create a modular resource](/operate/get-started/other-hardware/) to add support for it. 
{{< /alert >}} {{% /tab %}} diff --git a/docs/operate/reference/services/navigation/_index.md b/docs/operate/reference/services/navigation/_index.md index a6d24c755a..b826f5632e 100644 --- a/docs/operate/reference/services/navigation/_index.md +++ b/docs/operate/reference/services/navigation/_index.md @@ -152,7 +152,7 @@ The following attributes are available for `Navigation` services: | `base` | string | **Required** | The `name` you have configured for the [base](/operate/reference/components/base/) you are operating with this service. | | `movement_sensor` | string | **Required** | The `name` of the [movement sensor](/operate/reference/components/movement-sensor/) you have configured for the base you are operating with this service. | | `motion_service` | string | Optional | The `name` of the [motion service](/operate/reference/services/motion/) you have configured for the base you are operating with this service. If you have not added a motion service to your machine, the default motion service will be used. Reference this default service in your code with the name `"builtin"`. | -| `obstacle_detectors` | array | Optional | An array containing objects with the `name` of each [`"camera"`](/operate/reference/components/camera/) you have configured for the base you are navigating, along with the `name` of the [`"vision_service"`](/services/motion/) you are using to detect obstacles. Note that any vision services on remote parts will only be able to access cameras on the same remote part. | +| `obstacle_detectors` | array | Optional | An array containing objects with the `name` of each [`"camera"`](/operate/reference/components/camera/) you have configured for the base you are navigating, along with the `name` of the [`"vision_service"`](/operate/reference/services/vision/) you are using to detect obstacles. Note that any vision services on remote parts will only be able to access cameras on the same remote part. 
| | `position_polling_frequency_hz` | float | Optional | The frequency in Hz to poll for the position of the machine.
Default: `1` | | `obstacle_polling_frequency_hz` | float | Optional | The frequency in Hz to poll each vision service for new obstacles.
Default: `1` | | `plan_deviation_m` | float | Optional | The distance in meters that a machine is allowed to deviate from the motion plan.
Default: `2.6`| diff --git a/docs/operate/reference/services/slam/cloudslam/_index.md b/docs/operate/reference/services/slam/cloudslam/_index.md index ddb51b62e6..4af4920abf 100644 --- a/docs/operate/reference/services/slam/cloudslam/_index.md +++ b/docs/operate/reference/services/slam/cloudslam/_index.md @@ -64,15 +64,15 @@ To use CloudSLAM on a live machine, you must meet the following requirements: 1. A cloudslam supported algorithm must be configured on the machine. Currently only the [cartographer module](../cartographer/) is supported. Please configure a supported algorithm on the machine before continuing. -2. A location owner [API key](/cloud/rbac/#api-keys) or higher. +2. A location owner [API key](/manage/manage/access/) or higher. ### Configuration To use CloudSLAM you must enable data capture and configure your `cloudslam-wrapper` SLAM service: {{< alert title="Tip: Managing Data Capture" color="tip" >}} -Note that when the [data management service](/services/data/) is enabled, it continuously monitors and syncs your machine’s sensor data while the machine is running. -To avoid incurring charges while not in use, [turn off data capture for your sensors](/data-ai/capture-data/capture-sync/#configuration) once you have finished your SLAM session. +Note that when the [data management service](/data-ai/capture-data/capture-sync/) is enabled, it continuously monitors and syncs your machine’s sensor data while the machine is running. +To avoid incurring charges while not in use, [turn off data capture for your sensors](/data-ai/capture-data/capture-sync/#stop-data-capture) once you have finished your SLAM session. {{< /alert >}} {{< tabs name="Create new map">}} @@ -379,7 +379,7 @@ This feature can also be used with SLAM algorithms that CloudSLAM does not curre - A SLAM algorithm must be configured on the machine. This algorithm does **not** need to be supported by cloudslam to work. -- A location owner API Key or higher. 
See [Add an API key](/cloud/rbac/#api-keys) to learn how to create a key! +- A location owner API Key or higher. See [Add an API key](/manage/manage/access/) to learn how to create a key! ### Configuration @@ -428,10 +428,10 @@ The following attributes are available for `viam:cloudslam-wrapper:cloudslam` | Name | Type | Required? | Description | | ------- | ------ | ------------ | ----------- | | `slam_service` | string | **Required** | The name of the SLAM Service on the machine to use with cloudslam. | -| `api_key` | string | **Required** | An [API key](/cloud/rbac/#api-keys) with location owner or higher permission. | +| `api_key` | string | **Required** | An [API key](/manage/manage/access/) with location owner or higher permission. | | `api_key_id` | string | **Required** | The associated API key ID with the API key. | | `organization_id` | string | **Required** | The organization ID of your [organization](/dev/reference/glossary/#organization). | -| `location_id` | string | **Required** | The location ID of your [location](/dev/reference/glossary/location/). | +| `location_id` | string | **Required** | The location ID of your [location](/dev/reference/glossary/#location). | | `machine_id` | string | **Required** | The machine ID of your [machine](/dev/reference/apis/fleet/#find-machine-id). | | `machine_part_id` | string | Optional | The machine part ID of your [machine part](/dev/reference/apis/fleet/#find-machine-id). Used for local package creation and updating mode. | | `viam_version` | string | Optional | The version of viam-server to use with CloudSLAM. Defaults to `stable`. 
| diff --git a/docs/operate/reference/services/vision/detector_3d_segmenter.md b/docs/operate/reference/services/vision/detector_3d_segmenter.md index b72d9fb514..2b4c55d600 100644 --- a/docs/operate/reference/services/vision/detector_3d_segmenter.md +++ b/docs/operate/reference/services/vision/detector_3d_segmenter.md @@ -14,9 +14,9 @@ aliases: # SMEs: Bijan, Khari --- -_Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_ +_Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_ -The `detector_3d_segmenter` vision service model takes 2D bounding boxes from an [object detector](../#detections), and, using the intrinsic parameters of the chosen camera, projects the pixels in the bounding box to points in 3D space. +The `detector_3d_segmenter` vision service model takes 2D bounding boxes from an [object detector](/dev/reference/apis/services/vision/#detections), and, using the intrinsic parameters of the chosen camera, projects the pixels in the bounding box to points in 3D space. If the chosen camera is not equipped to do projections from 2D to 3D, then this vision model will fail. The label and the pixels associated with the 2D detections become the label and point cloud associated with the 3D segmenter. 
diff --git a/docs/operate/reference/services/vision/mlmodel.md b/docs/operate/reference/services/vision/mlmodel.md index 469fabcaed..e4c5a6c93a 100644 --- a/docs/operate/reference/services/vision/mlmodel.md +++ b/docs/operate/reference/services/vision/mlmodel.md @@ -16,7 +16,7 @@ aliases: # SMEs: Bijan, Khari --- -_Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_ +_Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_ The `mlmodel` {{< glossary_tooltip term_id="model" text="model" >}} of the Viam vision service supports machine learning detectors and classifiers that draw bounding boxes or return class labels based on a deployed TensorFlow Lite, TensorFlow, PyTorch, or ONNX ML model. @@ -29,7 +29,7 @@ Before configuring your `mlmodel` detector or classifier, you need to:

1. Train or upload an ML model

-You can add an [existing model](/data-ai/ai/deploy/#deploy-your-ml-model) or [train a TFlite](/data-ai/ai/train-tflite/) or [another model](data-ai/ai/train/) for object detection and classification using your data in the [Viam Cloud](/data-ai/capture-data/capture-sync/). +You can add an [existing model](/data-ai/ai/deploy/#deploy-your-ml-model) or [train a TFlite](/data-ai/ai/train-tflite/) or [another model](/data-ai/ai/train/) for object detection and classification using your data in the [Viam Cloud](/data-ai/capture-data/capture-sync/). {{% /manualcard %}} {{% manualcard %}} diff --git a/docs/operate/reference/services/vision/obstacles_depth.md b/docs/operate/reference/services/vision/obstacles_depth.md index 73d60c610e..98c5e77918 100644 --- a/docs/operate/reference/services/vision/obstacles_depth.md +++ b/docs/operate/reference/services/vision/obstacles_depth.md @@ -13,7 +13,7 @@ aliases: # SMEs: Bijan, Khari --- -_Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_ +_Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_ The `obstacles_depth` vision service model is for depth cameras, and is best for motion planning with transient obstacles. Use this segmenter to identify well separated objects above a flat plane. diff --git a/docs/operate/reference/services/vision/obstacles_distance.md b/docs/operate/reference/services/vision/obstacles_distance.md index c1532b8a77..3aea94a133 100644 --- a/docs/operate/reference/services/vision/obstacles_distance.md +++ b/docs/operate/reference/services/vision/obstacles_distance.md @@ -13,7 +13,7 @@ aliases: # SMEs: Bijan, Khari --- -_Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_ +_Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_ `obstacles_distance` is a segmenter that takes point clouds from a camera input and returns the average single closest point to the camera as a perceived obstacle. 
It is best for transient obstacle avoidance. diff --git a/docs/operate/reference/services/vision/obstacles_pointcloud.md b/docs/operate/reference/services/vision/obstacles_pointcloud.md index 9823a8d07f..c8314402b8 100644 --- a/docs/operate/reference/services/vision/obstacles_pointcloud.md +++ b/docs/operate/reference/services/vision/obstacles_pointcloud.md @@ -13,7 +13,7 @@ aliases: # SMEs: Bijan, Khari --- -_Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_ +_Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_ `obstacles_pointcloud` is a segmenter that identifies well separated objects above a flat plane. It first identifies the biggest plane in the scene, eliminates that plane, and clusters the remaining points into objects. diff --git a/docs/operate/reference/viam-server/local-configuration-file.md b/docs/operate/reference/viam-server/local-configuration-file.md index 05b5b5062f..ea5268d7e1 100644 --- a/docs/operate/reference/viam-server/local-configuration-file.md +++ b/docs/operate/reference/viam-server/local-configuration-file.md @@ -22,8 +22,6 @@ However, if your machine will never connect to the internet, you will need to cr - [Build a local configuration file in the Viam app](#build-a-local-configuration-file-in-the-viam-app) - Use the Viam app to build the configuration file and copy it to your machine, without connecting your machine to the Viam app. - [Build a local configuration file manually](#build-a-local-configuration-file-manually) - Build your own local configuration file based on the example file. -For information on the individual configuration options available, see [Configuration](/configure/). - ## Build a local configuration file in the Viam app If your machine will never connect to the internet, and you want to create a local configuration file manually, you can still use the Viam app to build the configuration file even without connecting your machine to it. 
@@ -227,5 +225,3 @@ The following file contains some example {{< glossary_tooltip term_id="component ] } ``` - -For more information on the individual configuration options available, see [Configuration](/configure/). diff --git a/docs/operate/reference/viam-server/manage-viam-server.md b/docs/operate/reference/viam-server/manage-viam-server.md index 61d1fb97d3..35e4cc00d8 100644 --- a/docs/operate/reference/viam-server/manage-viam-server.md +++ b/docs/operate/reference/viam-server/manage-viam-server.md @@ -21,7 +21,7 @@ Running as a system service enables you to configure `viam-server` to start auto Running on the command line is suitable for local development. {{< alert title="Note" color="note" >}} -If you have installed `viam-agent`, see [Manage `viam-agent`](manage/reference/viam-agent/manage-viam-agent/) instead. +If you have installed `viam-agent`, see [Manage `viam-agent`](/manage/reference/viam-agent/manage-viam-agent/) instead. {{< /alert >}} ## Run `viam-server` @@ -88,7 +88,7 @@ sudo viam-server -config /path/to/my/config.json If you followed the [Installation Guide](/operate/get-started/setup/), your machine's configuration file is available at /etc/viam.json. You can provide this path in the above command, or move the configuration file to a desired location and change the path in this command accordingly. -If you don't yet have a configuration file, you can [build a new configuration file](/internals/local-configuration-file/). +If you don't yet have a configuration file, you can [build a new configuration file](/operate/reference/viam-server/local-configuration-file/). Note that on a Raspberry Pi, `viam-server` must always run as `root` (using `sudo`) in order to access the DMA subsystem for GPIO. When running `viam-server` from your home directory on a Linux computer, you do not need to use `sudo`. 
@@ -119,7 +119,7 @@ viam-server -config /path/to/my/config.json If you followed the [Installation Guide](/operate/get-started/setup/), your machine's configuration file is available in your ~/Downloads/ directory, named similarly to viam-machinename-main.json. You can provide this path in the above command, or move the configuration file to a desired location and change the path in this command accordingly. -If you don't yet have a configuration file, you can use the example configuration file provided at /opt/homebrew/etc/viam.json or you can [build a new configuration file](/internals/local-configuration-file/). +If you don't yet have a configuration file, you can use the example configuration file provided at /opt/homebrew/etc/viam.json or you can [build a new configuration file](/operate/reference/viam-server/local-configuration-file/). #### Stop diff --git a/docs/tutorials/configure/build-a-mock-robot.md b/docs/tutorials/configure/build-a-mock-robot.md index 31f38e9c5f..e0d54ba7df 100644 --- a/docs/tutorials/configure/build-a-mock-robot.md +++ b/docs/tutorials/configure/build-a-mock-robot.md @@ -53,7 +53,7 @@ If you don't already have a Viam account, sign up for one on the [Viam app](http ### Configure your mock robot -[Configure your mock robot](/configure/) to represent a physical machine with robotic board, arm, and motor hardware. +[Configure your mock robot](/operate/get-started/supported-hardware/#configure-hardware-on-your-machine) to represent a physical machine with robotic board, arm, and motor hardware. If you were using physical hardware, this process would provide `viam-server` with information about what hardware is attached to it and how to communicate with it. For this robot, you configure `viam-server` to use `fake` components that emulate physical hardware. 
diff --git a/docs/tutorials/configure/pet-photographer.md b/docs/tutorials/configure/pet-photographer.md index a9e8720814..81f1d5f86a 100644 --- a/docs/tutorials/configure/pet-photographer.md +++ b/docs/tutorials/configure/pet-photographer.md @@ -828,7 +828,7 @@ Whether you've downloaded the `colorfilter` module, or written your own color fi Next, add the following services to your smart machine to support the color filter module: - The [data management service](/data-ai/capture-data/capture-sync/) enables your smart machine to capture data and sync it to the cloud. -- The [vision service](/operate/reference/services/vision/#detections) enables your smart machine to perform color detection on objects in a camera stream. +- The [vision service](/dev/reference/apis/services/vision/#detections) enables your smart machine to perform color detection on objects in a camera stream. ### Add the data management service @@ -849,7 +849,7 @@ To enable data capture on your machine, add and configure the [data management s ![An instance of the data management service named "dm". The cloud sync and capturing options are toggled on and the directory is empty. The interval is set to 0.1](/tutorials/pet-photographer/data-management-services.png) - For more detailed information, see [Add the data management service](/data-ai/capture-data/capture-sync/#configuration). + For more detailed information, see [Add the data management service](/data-ai/capture-data/capture-sync/). 
{{% /tab %}} {{% tab name="JSON Template" %}} Add the data management service to the services array in your rover’s raw JSON configuration: diff --git a/docs/tutorials/control/air-quality-fleet.md b/docs/tutorials/control/air-quality-fleet.md index e35962f3cc..f2128dce6c 100644 --- a/docs/tutorials/control/air-quality-fleet.md +++ b/docs/tutorials/control/air-quality-fleet.md @@ -173,7 +173,7 @@ For each sensing machine: ## Configure your air quality sensors -You need to [configure](/configure/) your hardware so that each of your machines can communicate with its attached air quality [sensor](/operate/reference/components/sensor/). +You need to [configure](/operate/get-started/supported-hardware/#configure-hardware-on-your-machine) your hardware so that each of your machines can communicate with its attached air quality [sensor](/operate/reference/components/sensor/). No matter how many sensing machines you use, you can configure them efficiently by using a reusable configuration block called a _{{< glossary_tooltip term_id="fragment" text="fragment" >}}_. Fragments are a way to share and manage identical machine configurations across multiple machines. diff --git a/docs/tutorials/control/gamepad.md b/docs/tutorials/control/gamepad.md index bc548ea23d..88388fe74e 100644 --- a/docs/tutorials/control/gamepad.md +++ b/docs/tutorials/control/gamepad.md @@ -125,7 +125,7 @@ To link the controller input to the base functionality, you need to add the base ## Add the base remote control service Services are software packages that provide robots with higher level functionality. 
-To link the controller's input to the base functionality, you need to configure the [base remote control service](/services/base-rc/): +To link the controller's input to the base functionality, you need to configure the [base remote control service](/operate/reference/services/base-rc/): {{< tabs >}} {{% tab name="Config Builder" %}} diff --git a/docs/tutorials/custom/controlling-an-intermode-rover-canbus.md b/docs/tutorials/custom/controlling-an-intermode-rover-canbus.md index 1d745c42d5..c51fc98f23 100644 --- a/docs/tutorials/custom/controlling-an-intermode-rover-canbus.md +++ b/docs/tutorials/custom/controlling-an-intermode-rover-canbus.md @@ -187,7 +187,7 @@ When registering it, the code also provides the API that the new model supports. That means in this case that the base should support the default [base API](/dev/reference/apis/components/base/#api) with methods such as `MoveStraight` and `Spin`. The **API** of any Viam resource is also represented as colon-separated triplets where the first element is a namespace. -Since you are using the default Viam API for a [base](/operate/reference/components/base/), the [API](/how-tos/create-module/#valid-api-identifiers) you are using is: +Since you are using the default Viam API for a [base](/operate/reference/components/base/), the {{< glossary_tooltip term_id="api-namespace-triplet" text="API namespace triplet" >}} is: `rdk:component:base`. In the code this is specified on line 30 as `base.Subtype`. @@ -292,7 +292,7 @@ To make your module accessible to `viam-server`, you must add it as a local modu 1. Navigate to the **CONFIGURE** tab of your machine's page in the [Viam app](https://app.viam.com). 1. Click the **+** (Create) icon next to your machine part in the left-hand menu and select **Local module**, then **Local module**. 1. Enter a **Name** for this instance of your modular resource, for example `my-custom-base-module`. -1. 
Enter the [module's executable path](/how-tos/create-module/#compile-or-package-your-module). +1. Enter the module's [executable path](/operate/get-started/other-hardware/#test-your-module-locally). This path must be the absolute path to the executable on your machine's filesystem. Add the path to where you downloaded the [compiled binary](https://github.com/viam-labs/tutorial-intermode/blob/main/intermode-base/intermode-model). 1. Then, click the **Create** button, and click **Save** in the upper right corner to save your config. diff --git a/docs/tutorials/projects/claw-game.md b/docs/tutorials/projects/claw-game.md index 7173778260..a2ab9968b2 100644 --- a/docs/tutorials/projects/claw-game.md +++ b/docs/tutorials/projects/claw-game.md @@ -140,7 +140,7 @@ Machines are organized into {{< glossary_tooltip term_id="part" text="parts" >}} Every machine has a main part which is automatically created when you create the machine. Since you just created a new machine, your machine's main part is already defined. Multi-part machines also have one or more sub-parts representing additional computers running `viam-server`. -If you have two computers within the _same machine_, you can use one as the main part and [connect the other to it as a sub-part](/operate/reference/architecture/parts/rchitecture/parts/#configure-a-sub-part). +If you have two computers within the _same machine_, you can use one as the main part and [connect the other to it as a sub-part](/operate/reference/architecture/parts/#configure-a-sub-part). This is the approach this tutorial follows: you'll run the [motion planning service](/operate/reference/services/motion/) on a laptop and connect that laptop as a sub-part to your machine. 
{{< alert title="Tip" color="tip" >}} diff --git a/docs/tutorials/projects/send-security-photo.md b/docs/tutorials/projects/send-security-photo.md index f99841fea4..4df6b59e71 100644 --- a/docs/tutorials/projects/send-security-photo.md +++ b/docs/tutorials/projects/send-security-photo.md @@ -87,7 +87,7 @@ This tutorial uses a pre-trained Machine Learning model from the Viam Registry c The model can detect a variety of things, including `Persons`. You can see a full list of what the model can detect in [labels.txt](https://github.com/viam-labs/devrel-demos/raw/main/Light%20up%20bot/labels.txt) file. -If you want to train your own model instead, follow the instructions to [train a TFlite](/data-ai/ai/train-tflite/) or [another model](data-ai/ai/train/). +If you want to train your own model instead, follow the instructions to [train a TFlite](/data-ai/ai/train-tflite/) or [another model](/data-ai/ai/train/). 1. **Configure the ML model service** diff --git a/docs/tutorials/projects/verification-system.md b/docs/tutorials/projects/verification-system.md index 8c280b3de6..cef69e31fb 100644 --- a/docs/tutorials/projects/verification-system.md +++ b/docs/tutorials/projects/verification-system.md @@ -64,7 +64,7 @@ Make sure to connect your camera to your machine's computer (if it isn't built-i Navigate to the **CONFIGURE** tab of your machine's page on the [Viam app](https://app.viam.com). Configure the camera you want to use for your security system. We configured ours as a `webcam`, but you can use whatever model of camera you'd like. -Reference [these available models](/operate/reference/operate/reference/components/camera/#configuration). +Reference [these available models](/operate/reference/components/camera/#configuration). To configure a `webcam`: @@ -139,7 +139,7 @@ To add the [data management service](/data-ai/capture-data/capture-sync/) and co Here you can view the images captured so far from the camera on your machine. 
You should see new images appearing steadily as cloud sync uploads them from your machine. -For more information, see [configure data capture for individual components](/services/data/#configuration). +For more information, see [configure data capture for individual components](/data-ai/capture-data/capture-sync/). {{% alert title="Tip" color="tip" %}} If you are using a different model of camera, you may need to use a different method **Type** in your data capture configuration. @@ -319,7 +319,7 @@ To add a transform camera to your machine: The various states do not cause anything to happen on their own besides appearing as overlays on the transform cam. To trigger an audio alarm or otherwise have your machine take an action based on the reported state, you can write your own logic using one of the [Viam SDKs](/dev/reference/sdks/) to [poll the classifications](/dev/reference/apis/services/vision/#getclassificationsfromcamera). -See [2D Image Classification](/operate/reference/services/vision/#classifications) for information about working with classifiers in Viam, and [Vision API](/dev/reference/apis/services/vision/#api) for usage of the Computer Vision API this module implements. +See [2D Image Classification](/dev/reference/apis/services/vision/#classifications) for information about working with classifiers in Viam, and [Vision API](/dev/reference/apis/services/vision/#api) for usage of the Computer Vision API this module implements. {{% /alert %}} With everything configured, you are now ready to see your facial recognition machine in action by watching the transform camera as a person passes in front of the camera. @@ -340,7 +340,7 @@ For example: - Write a program using one of the [Viam SDK](/dev/reference/sdks/) to poll the `facial-verification` module for its current state, and take action when a particular state is reached. 
For example, you could use [`GetClassificationsFromCamera()`](/dev/reference/apis/services/vision/#getclassificationsfromcamera) to capture when a transition into the `ALARM` state occurs, and then send you an email with the captured image of the trespasser! -- Try changing the type of [detectors](/operate/reference/services/vision/#detections), using different detectors for the `TRIGGER_1` and `TRIGGER_2` states. +- Try changing the type of [detectors](/dev/reference/apis/services/vision/#detections), using different detectors for the `TRIGGER_1` and `TRIGGER_2` states. - Add the [filtered camera module](/data-ai/capture-data/filter-before-sync/) to your machine, and use it as the source camera in your verification system in order to save images to the Viam Cloud only when the system enters into specific states. This way, you could limit the images captured and synced to only those you are interested in reviewing later, for example. - If you don't want the `ALARM` capabilities, and would like to just use it as a notification system when a detector gets triggered, set `disable_alarm: true` in the config, which prevents `TRIGGER_2` from entering into the `COUNTDOWN` state, meaning the system will only cycle between the states of `TRIGGER_1` and `TRIGGER_2`. diff --git a/docs/tutorials/services/color-detection-scuttle.md b/docs/tutorials/services/color-detection-scuttle.md index cd2ed08990..b4a7421966 100644 --- a/docs/tutorials/services/color-detection-scuttle.md +++ b/docs/tutorials/services/color-detection-scuttle.md @@ -68,7 +68,7 @@ Turn on the power to the rover. This tutorial uses the color `#a13b4c` or `rgb(161,59,76)` (a reddish color). 
-To create a [color detector vision service](/operate/reference/services/vision/#detections): +To create a [color detector vision service](/dev/reference/apis/services/vision/#detections): {{< tabs >}} {{% tab name="Builder" %}} diff --git a/docs/tutorials/services/constrain-motion.md b/docs/tutorials/services/constrain-motion.md index 58c586456d..960b35e38c 100644 --- a/docs/tutorials/services/constrain-motion.md +++ b/docs/tutorials/services/constrain-motion.md @@ -60,7 +60,7 @@ Before starting this tutorial, you must: - If you are connecting to a real robotic arm during this tutorial, make sure your computer can communicate with the arm controller before continuing. Code examples in this tutorial use a [UFACTORY xArm 6](https://www.ufactory.us/product/ufactory-xarm-6), but you can use any [arm model](/operate/reference/components/arm/) including a [fake arm model](/operate/reference/components/arm/fake/). - Complete the previous tutorial, [Plan Motion with an Arm and a Gripper](../plan-motion-with-arm-gripper/), which configures the robot, client and service access, and other infrastructure we'll need for this tutorial. - For reference, see the [full code sample from the prior tutorial](../plan-motion-with-arm-gripper/#full-tutorial-code). + For reference, see the [full code sample from the prior tutorial](../plan-motion-with-arm-gripper/#full-code). ## Configure your robot @@ -309,10 +309,10 @@ If we changed it to `theta=90` or `theta=270`, the gripper jaws would open verti ## Add a motion constraint -To keep the cup upright as the arm moves it from one place on the table to another, create a [linear constraint](/services/motion/constraints/#linear-constraint). +To keep the cup upright as the arm moves it from one place on the table to another, create a [linear constraint](/operate/reference/services/motion/constraints/#linear-constraint). 
When you tell the robot to move the cup from one upright position to another, the linear constraint forces the gripper to move linearly and to maintain the upright orientation of the cup throughout the planned path. -You could try using an [orientation constraint](/services/motion/constraints/#orientation-constraint) instead, which would also constrain the orientation. +You could try using an [orientation constraint](/operate/reference/services/motion/constraints/#orientation-constraint) instead, which would also constrain the orientation. However, since this opens up many more options for potential paths, it is much more computationally intensive than the linear constraint. The code below creates a linear constraint and then uses that constraint to keep the cup upright and move it in a series of linear paths along the predetermined route while avoiding the obstacles we've defined: diff --git a/docs/tutorials/services/navigate-with-rover-base.md b/docs/tutorials/services/navigate-with-rover-base.md index 9b49d292c6..7d5c053368 100644 --- a/docs/tutorials/services/navigate-with-rover-base.md +++ b/docs/tutorials/services/navigate-with-rover-base.md @@ -620,7 +620,7 @@ You can alternatively use [`viam:ultrasonic:camera`](https://github.com/viam-mod {{< /alert >}} -If you want the robot to be able to automatically detect obstacles in front of it, [configure a Vision service segmenter](/operate/reference/services/vision/#segmentations). +If you want the robot to be able to automatically detect obstacles in front of it, [configure a Vision service segmenter](/dev/reference/apis/services/vision/#segmentations). For example, [configure](/operate/reference/services/vision/obstacles_depth/) the Vision service model [`obstacles_depth`](/operate/reference/services/vision/obstacles_depth/) to detect obstacles in front of the robot. 
Then, use one of [Viam's client SDKs](/dev/reference/sdks/) to automate obstacle avoidance with the navigation service like in the following Python program:
diff --git a/docs/tutorials/services/visualize-data-grafana.md b/docs/tutorials/services/visualize-data-grafana.md
index fb96f204b4..7ddc4d8a18 100644
--- a/docs/tutorials/services/visualize-data-grafana.md
+++ b/docs/tutorials/services/visualize-data-grafana.md
@@ -73,7 +73,7 @@ First, add the data management service to your machine to be able to capture and sync data.
{{< imgproc src="/tutorials/data-management/data-management-conf.png" alt="The data management service configuration pane with default settings shown for both capturing and syncing" resize="900x" >}}

-For more information, see [data management service configuration](/services/data/#configuration).
+For more information, see [data management service configuration](/data-ai/capture-data/capture-sync/).

### Configure data capture for a component

@@ -98,7 +98,7 @@ To enable data capture for a sensor component:
After a short while, your sensor will begin capturing live readings, and syncing those readings to the Viam app.
You can check that data is being captured and synced by clicking on the menu icon on the sensor configuration pane and selecting **View captured data**.

-For more information see [data management service configuration](/services/data/#configuration).
+For more information, see [data management service configuration](/data-ai/capture-data/capture-sync/).

### Configure data query

diff --git a/docs/tutorials/services/webcam-line-follower-robot.md b/docs/tutorials/services/webcam-line-follower-robot.md
index d06cab2947..5ce9cb2fa6 100644
--- a/docs/tutorials/services/webcam-line-follower-robot.md
+++ b/docs/tutorials/services/webcam-line-follower-robot.md
@@ -33,7 +33,7 @@ toc_hide: true
Many line-following robots rely on a dedicated array of infrared sensors to follow a dark line on a light background or a light line on a dark background.
This tutorial uses a standard webcam in place of these sensors, and allows a robot to follow a line of any color that is at least somewhat different from the background. -**Goal**: To make a wheeled robot follow a colored line along the floor using a webcam and the Viam vision service color detector. +**Goal**: To make a wheeled robot follow a colored line along the floor using a webcam and the Viam vision service color detector. **What you will learn**: @@ -225,7 +225,7 @@ Next, navigate to the **CONFIGURE** tab of your machine's page in the [Viam app] 1. **Add a vision service.** -Next, add a vision service [detector](/operate/reference/services/vision/#detections): +Next, add a vision service [detector](/dev/reference/apis/services/vision/#detections): Click the **+** (Create) icon next to your machine part in the left-hand menu and select **Service**. Select type `vision` and model `color detector`. diff --git a/static/include/app/apis/overrides/protos/app.CreateKey.md b/static/include/app/apis/overrides/protos/app.CreateKey.md index 33056e0d50..542a5d15eb 100644 --- a/static/include/app/apis/overrides/protos/app.CreateKey.md +++ b/static/include/app/apis/overrides/protos/app.CreateKey.md @@ -1 +1 @@ -Create a new [API key](/cloud/rbac/#api-keys). +Create a new [API key](/manage/manage/access/). diff --git a/static/include/app/apis/overrides/protos/app.DeleteKey.md b/static/include/app/apis/overrides/protos/app.DeleteKey.md index 3ec747a2e8..589e9bbea3 100644 --- a/static/include/app/apis/overrides/protos/app.DeleteKey.md +++ b/static/include/app/apis/overrides/protos/app.DeleteKey.md @@ -1 +1 @@ -Delete an [API key](/cloud/rbac/#api-keys). +Delete an [API key](/manage/manage/access/). 
diff --git a/static/include/app/apis/overrides/protos/app.GetRobotApiKeys.md b/static/include/app/apis/overrides/protos/app.GetRobotApiKeys.md
index d337e30720..ccde79ee3c 100644
--- a/static/include/app/apis/overrides/protos/app.GetRobotApiKeys.md
+++ b/static/include/app/apis/overrides/protos/app.GetRobotApiKeys.md
@@ -1 +1 @@
-Gets the [API keys](/cloud/rbac/#api-keys) for the machine.
+Get the [API keys](/manage/manage/access/) for the machine.
diff --git a/static/include/app/apis/overrides/protos/app.RotateKey.md b/static/include/app/apis/overrides/protos/app.RotateKey.md
index ec7a763903..08daca68ff 100644
--- a/static/include/app/apis/overrides/protos/app.RotateKey.md
+++ b/static/include/app/apis/overrides/protos/app.RotateKey.md
@@ -1 +1 @@
-Rotate an [API key](/cloud/rbac/#api-keys).
+Rotate an [API key](/manage/manage/access/#rotate-an-api-key).
diff --git a/static/include/how-to/query-data.md b/static/include/how-to/query-data.md
index 0f0d6beba3..3da38c4e93 100644
--- a/static/include/how-to/query-data.md
+++ b/static/include/how-to/query-data.md
@@ -9,7 +9,7 @@ viam login
```
{{% /tablestep %}}
-{{% tablestep link="/cli/#organizations"%}}
+{{% tablestep link="/dev/tools/cli/#organizations" %}}
**2. Find your organization ID**

To create a database user allowing you to access your data, find your organization ID:
@@ -40,7 +40,7 @@ This command configures a database user for your organization for use with data
If you have run this command before, this command instead **updates** the password to the new value you set.
{{% /tablestep %}}
-{{% tablestep link="/dev/tools/cli/#data" %}}
+{{% tablestep link="/dev/tools/cli/#data" %}}
**4. 
Determine the connection URI** Determine the connection URI (also known as a connection string) for your organization's MongoDB Atlas Data Federation instance by running the following command with the organization's `org-id` from step 2: diff --git a/static/include/services/apis/generated/motion.md b/static/include/services/apis/generated/motion.md index 799f658474..4a220be5c3 100644 --- a/static/include/services/apis/generated/motion.md +++ b/static/include/services/apis/generated/motion.md @@ -37,7 +37,8 @@ The motion service takes the volumes associated with all configured machine comp Transforms can be used to account for geometries that are attached to the robot but not configured as robot components. For example, you could use a transform to represent the volume of a marker held in your machine's gripper. Transforms are not added to the config or carried into later processes. -- `constraints` ([viam.proto.service.motion.Constraints](https://python.viam.dev/autoapi/viam/proto/service/motion/index.html#viam.proto.service.motion.Constraints)) (optional): Pass in [motion constraints](/services/motion/constraints/). By default, motion is unconstrained with the exception of obstacle avoidance. + +- `constraints` ([viam.proto.service.motion.Constraints](https://python.viam.dev/autoapi/viam/proto/service/motion/index.html#viam.proto.service.motion.Constraints)) (optional): Pass in [motion constraints](/operate/reference/services/motion/constraints/). By default, motion is unconstrained with the exception of obstacle avoidance. - `extra` (Mapping[[str](https://docs.python.org/3/library/stdtypes.html#text-sequence-type-str), Any]) (optional): Extra options to pass to the underlying RPC call. - `timeout` ([float](https://docs.python.org/3/library/stdtypes.html#numeric-types-int-float-complex)) (optional): An option to set how long to wait (in seconds) before calling a time-out and closing the underlying RPC call. 
@@ -145,7 +146,7 @@ Make sure the [SLAM service](/operate/reference/services/slam/) you use alongsid - `destination` ([viam.proto.common.Pose](https://python.viam.dev/autoapi/viam/components/arm/index.html#viam.components.arm.Pose)) (required): The destination, which can be any [Pose](https://python.viam.dev/autoapi/viam/proto/common/index.html#viam.proto.common.Pose) with respect to the SLAM map's origin. - `slam_service_name` ([viam.proto.common.ResourceName](https://python.viam.dev/autoapi/viam/gen/common/v1/common_pb2/index.html#viam.gen.common.v1.common_pb2.ResourceName)) (required): The `ResourceName` of the [SLAM service](/operate/reference/services/slam/) from which the SLAM map is requested. - `configuration` ([viam.proto.service.motion.MotionConfiguration](https://python.viam.dev/autoapi/viam/gen/service/motion/v1/motion_pb2/index.html#viam.gen.service.motion.v1.motion_pb2.MotionConfiguration)) (optional): -The configuration you want to set across this machine for this motion service. This parameter and each of its fields are optional. + The configuration you want to set across this machine for this motion service. This parameter and each of its fields are optional. - `obstacle_detectors` [(Iterable[ObstacleDetector])](https://python.viam.dev/autoapi/viam/proto/service/motion/index.html#viam.proto.service.motion.ObstacleDetector): The names of each [vision service](/operate/reference/services/vision/) and [camera](/operate/reference/components/camera/) resource pair you want to use for transient obstacle avoidance. - `position_polling_frequency_hz` [(float)](https://docs.python.org/3/library/functions.html#float): The frequency in hz to poll the position of the machine. @@ -189,7 +190,7 @@ For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/ - `ctx` [(Context)](https://pkg.go.dev/context#Context): A Context carries a deadline, a cancellation signal, and other values across API boundaries. 
- `req` [(MoveOnMapReq)](https://pkg.go.dev/go.viam.com/rdk/services/motion#MoveOnMapReq): -A `MoveOnMapReq` which contains the following values: + A `MoveOnMapReq` which contains the following values: - `ComponentName` [(resource.Name)](https://pkg.go.dev/go.viam.com/rdk/resource#Name): The `resource.Name` of the base to move. - `Destination` [(spatialmath.Pose)](https://pkg.go.dev/go.viam.com/rdk/spatialmath#Pose): The destination, which can be any [Pose](https://python.viam.dev/autoapi/viam/proto/common/index.html#viam.proto.common.Pose) with respect to the SLAM map's origin. @@ -291,7 +292,7 @@ Translation in obstacles is not supported by the [navigation service](/operate/r - `obstacles` ([Sequence[viam.proto.common.GeoGeometry]](https://python.viam.dev/autoapi/viam/gen/common/v1/common_pb2/index.html#viam.gen.common.v1.common_pb2.GeoGeometry)) (optional): Obstacles to consider when planning the motion of the component, with each represented as a `GeoGeometry`.
  • Default: `None`
- `heading` ([float](https://docs.python.org/3/library/stdtypes.html#numeric-types-int-float-complex)) (optional): The compass heading, in degrees, that the machine's movement sensor should report at the `destination` point.
  • Range: `[0-360)` `0`: North, `90`: East, `180`: South, `270`: West
  • Default: `None`
- `configuration` ([viam.proto.service.motion.MotionConfiguration](https://python.viam.dev/autoapi/viam/gen/service/motion/v1/motion_pb2/index.html#viam.gen.service.motion.v1.motion_pb2.MotionConfiguration)) (optional): -The configuration you want to set across this machine for this motion service. This parameter and each of its fields are optional. + The configuration you want to set across this machine for this motion service. This parameter and each of its fields are optional. - `obstacle_detectors` [(Iterable[ObstacleDetector])](https://python.viam.dev/autoapi/viam/proto/service/motion/index.html#viam.proto.service.motion.ObstacleDetector): The names of each [vision service](/operate/reference/services/vision/) and [camera](/operate/reference/components/camera/) resource pair you want to use for transient obstacle avoidance. - `position_polling_frequency_hz` [(float)](https://docs.python.org/3/library/functions.html#float): The frequency in hz to poll the position of the machine. @@ -335,7 +336,7 @@ For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/ - `ctx` [(Context)](https://pkg.go.dev/context#Context): A Context carries a deadline, a cancellation signal, and other values across API boundaries. - `req` [(MoveOnGlobeReq)](https://pkg.go.dev/go.viam.com/rdk/services/motion#MoveOnGlobeReq): -A `MoveOnGlobeReq` which contains the following values: + A `MoveOnGlobeReq` which contains the following values: - `componentName` [(resource.Name)](https://pkg.go.dev/go.viam.com/rdk/resource#Name): The `resource.Name` of the base to move. - `destination` [(\*geo.Point)](https://pkg.go.dev/github.com/kellydunn/golang-geo#Point): The location of the component's destination, represented in geographic notation as a [Point](https://pkg.go.dev/github.com/kellydunn/golang-geo#Point) _(lat, lng)_. 
@@ -415,6 +416,7 @@ You can use the `supplemental_transforms` argument to augment the machine's exis When `supplemental_transforms` are provided, a frame system is created within the context of the `GetPose` function. This new frame system builds off the machine's frame system and incorporates the `Transform`s provided. If the result of adding the `Transform`s results in a disconnected frame system, an error is thrown. + - `extra` (Mapping[[str](https://docs.python.org/3/library/stdtypes.html#text-sequence-type-str), Any]) (optional): Extra options to pass to the underlying RPC call. - `timeout` ([float](https://docs.python.org/3/library/stdtypes.html#numeric-types-int-float-complex)) (optional): An option to set how long to wait (in seconds) before calling a time-out and closing the underlying RPC call. @@ -491,7 +493,7 @@ For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/ - `ctx` [(Context)](https://pkg.go.dev/context#Context): A Context carries a deadline, a cancellation signal, and other values across API boundaries. - `componentName` [(resource.Name)](https://pkg.go.dev/go.viam.com/rdk/resource#Name): The `resource.Name` of the piece of the machine whose pose is returned. - `destinationFrame` [(string)](https://pkg.go.dev/builtin#string): The name of the frame with respect to which the component's pose is reported. -- `supplementalTransforms` [([]*referenceframe.LinkInFrame)](https://pkg.go.dev/go.viam.com/rdk/referenceframe#LinkInFrame): An optional list of `LinkInFrame`s. +- `supplementalTransforms` [([]\*referenceframe.LinkInFrame)](https://pkg.go.dev/go.viam.com/rdk/referenceframe#LinkInFrame): An optional list of `LinkInFrame`s. A `LinkInFrame` represents an additional frame which is added to the machine's frame system. 
It consists of: @@ -500,11 +502,12 @@ For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/ When `supplementalTransforms` are provided, a frame system is created within the context of the `GetPose` function. This new frame system builds off the machine's frame system and incorporates the `LinkInFrame`s provided. If the result of adding the `LinkInFrame`s results in a disconnected frame system, an error is thrown. + - `extra` [(map[string]interface{})](https://go.dev/blog/maps): Extra options to pass to the underlying RPC call. **Returns:** -- [(*referenceframe.PoseInFrame)](https://pkg.go.dev/go.viam.com/rdk/referenceframe#PoseInFrame): The pose of the component. +- [(\*referenceframe.PoseInFrame)](https://pkg.go.dev/go.viam.com/rdk/referenceframe#PoseInFrame): The pose of the component. - [(error)](https://pkg.go.dev/builtin#error): An error, if one occurred. **Example:** diff --git a/static/include/services/apis/generated/vision.md b/static/include/services/apis/generated/vision.md index 70e3bff1fb..bd3ef056e3 100644 --- a/static/include/services/apis/generated/vision.md +++ b/static/include/services/apis/generated/vision.md @@ -1,6 +1,6 @@ ### GetDetectionsFromCamera -Get a list of detections from the next image from a specified camera using a configured [detector](/operate/reference/services/vision/#detections). +Get a list of detections from the next image from a specified camera using a configured [detector](/dev/reference/apis/services/vision/#detections). {{< tabs >}} {{% tab name="Python" %}} @@ -91,7 +91,7 @@ For more information, see the [Flutter SDK Docs](https://flutter.viam.dev/viam_s ### GetDetections -Get a list of detections from a given image using a configured [detector](/operate/reference/services/vision/#detections). +Get a list of detections from a given image using a configured [detector](/dev/reference/apis/services/vision/#detections). 
{{< tabs >}} {{% tab name="Python" %}} @@ -197,7 +197,7 @@ For more information, see the [Flutter SDK Docs](https://flutter.viam.dev/viam_s ### GetClassificationsFromCamera -Get a list of classifications from the next image from a specified camera using a configured [classifier](/operate/reference/services/vision/#classifications). +Get a list of classifications from the next image from a specified camera using a configured [classifier](/dev/reference/apis/services/vision/#classifications). {{< tabs >}} {{% tab name="Python" %}} @@ -287,7 +287,7 @@ For more information, see the [Flutter SDK Docs](https://flutter.viam.dev/viam_s ### GetClassifications -Get a list of classifications from a given image using a configured [classifier](/operate/reference/services/vision/#classifications). +Get a list of classifications from a given image using a configured [classifier](/dev/reference/apis/services/vision/#classifications). {{< tabs >}} {{% tab name="Python" %}} @@ -396,7 +396,7 @@ For more information, see the [Flutter SDK Docs](https://flutter.viam.dev/viam_s ### GetObjectPointClouds -Get a list of 3D point cloud objects and associated metadata in the latest picture from a 3D camera (using a specified [segmenter](/operate/reference/services/vision/#segmentations)). +Get a list of 3D point cloud objects and associated metadata in the latest picture from a 3D camera (using a specified [segmenter](/dev/reference/apis/services/vision/#segmentations)). {{< tabs >}} {{% tab name="Python" %}} @@ -440,7 +440,7 @@ For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/ **Returns:** -- [([]*viz.Object)](https://pkg.go.dev/go.viam.com/rdk/vision#Object): A list of point clouds and associated metadata like the center coordinates of each point cloud. +- [([]\*viz.Object)](https://pkg.go.dev/go.viam.com/rdk/vision#Object): A list of point clouds and associated metadata like the center coordinates of each point cloud. 
- [(error)](https://pkg.go.dev/builtin#error): An error, if one occurred. **Example:** @@ -507,7 +507,7 @@ Used for visualization. **Returns:** -- ([viam.services.vision.vision.CaptureAllResult](https://python.viam.dev/autoapi/viam/services/vision/vision/index.html#viam.services.vision.vision.CaptureAllResult)): A class that stores all potential returns from the vision service. It can return the image from the camera along with its associated detections, classifications, and objects, as well as any extra info the model may provide. +- ([viam.services.vision.vision.CaptureAllResult](https://python.viam.dev/autoapi/viam/services/vision/vision/index.html#viam.services.vision.vision.CaptureAllResult)): A class that stores all potential returns from the vision service. It can return the image from the camera along with its associated detections, classifications, and objects, as well as any extra info the model may provide. **Example:** @@ -745,7 +745,7 @@ For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/ **Returns:** -- [(*Properties)](https://pkg.go.dev/go.viam.com/rdk/services/vision#Properties) +- [(\*Properties)](https://pkg.go.dev/go.viam.com/rdk/services/vision#Properties) - [(error)](https://pkg.go.dev/builtin#error): An error, if one occurred. For more information, see the [Go SDK Docs](https://pkg.go.dev/go.viam.com/rdk/services/vision#Service). diff --git a/static/include/services/apis/overrides/methods/go.motion.Move.constraints.md b/static/include/services/apis/overrides/methods/go.motion.Move.constraints.md index d8910dd197..3ab0e93692 100644 --- a/static/include/services/apis/overrides/methods/go.motion.Move.constraints.md +++ b/static/include/services/apis/overrides/methods/go.motion.Move.constraints.md @@ -1,2 +1,2 @@ -Pass in optional [motion constraints](/services/motion/constraints/). +Pass in optional [motion constraints](/operate/reference/services/motion/constraints/). 
By default, motion is unconstrained with the exception of obstacle avoidance. diff --git a/static/include/services/apis/overrides/methods/python.motion.move.constraints.md b/static/include/services/apis/overrides/methods/python.motion.move.constraints.md index 105fc8dd45..6dc85ebe2c 100644 --- a/static/include/services/apis/overrides/methods/python.motion.move.constraints.md +++ b/static/include/services/apis/overrides/methods/python.motion.move.constraints.md @@ -1,2 +1,2 @@ -Pass in [motion constraints](/services/motion/constraints/). +Pass in [motion constraints](/operate/reference/services/motion/constraints/). By default, motion is unconstrained with the exception of obstacle avoidance. diff --git a/static/include/services/apis/overrides/protos/vision.GetClassifications.md b/static/include/services/apis/overrides/protos/vision.GetClassifications.md index b5880f8349..46f8a1a328 100644 --- a/static/include/services/apis/overrides/protos/vision.GetClassifications.md +++ b/static/include/services/apis/overrides/protos/vision.GetClassifications.md @@ -1 +1 @@ -Get a list of classifications from a given image using a configured [classifier](/operate/reference/services/vision/#classifications). +Get a list of classifications from a given image using a configured [classifier](/dev/reference/apis/services/vision/#classifications). diff --git a/static/include/services/apis/overrides/protos/vision.GetClassificationsFromCamera.md b/static/include/services/apis/overrides/protos/vision.GetClassificationsFromCamera.md index 48213bfd86..663b37a125 100644 --- a/static/include/services/apis/overrides/protos/vision.GetClassificationsFromCamera.md +++ b/static/include/services/apis/overrides/protos/vision.GetClassificationsFromCamera.md @@ -1 +1 @@ -Get a list of classifications from the next image from a specified camera using a configured [classifier](/operate/reference/services/vision/#classifications). 
+Get a list of classifications from the next image from a specified camera using a configured [classifier](/dev/reference/apis/services/vision/#classifications). diff --git a/static/include/services/apis/overrides/protos/vision.GetDetections.md b/static/include/services/apis/overrides/protos/vision.GetDetections.md index f230c686ee..63d3e29b42 100644 --- a/static/include/services/apis/overrides/protos/vision.GetDetections.md +++ b/static/include/services/apis/overrides/protos/vision.GetDetections.md @@ -1 +1 @@ -Get a list of detections from a given image using a configured [detector](/operate/reference/services/vision/#detections). +Get a list of detections from a given image using a configured [detector](/dev/reference/apis/services/vision/#detections). diff --git a/static/include/services/apis/overrides/protos/vision.GetDetectionsFromCamera.md b/static/include/services/apis/overrides/protos/vision.GetDetectionsFromCamera.md index a17aff9b8e..4fcbb7aa05 100644 --- a/static/include/services/apis/overrides/protos/vision.GetDetectionsFromCamera.md +++ b/static/include/services/apis/overrides/protos/vision.GetDetectionsFromCamera.md @@ -1 +1 @@ -Get a list of detections from the next image from a specified camera using a configured [detector](/operate/reference/services/vision/#detections). +Get a list of detections from the next image from a specified camera using a configured [detector](/dev/reference/apis/services/vision/#detections). 
diff --git a/static/include/services/apis/overrides/protos/vision.GetObjectPointClouds.md b/static/include/services/apis/overrides/protos/vision.GetObjectPointClouds.md index bc87f7cef2..baba74720c 100644 --- a/static/include/services/apis/overrides/protos/vision.GetObjectPointClouds.md +++ b/static/include/services/apis/overrides/protos/vision.GetObjectPointClouds.md @@ -1 +1 @@ -Get a list of 3D point cloud objects and associated metadata in the latest picture from a 3D camera (using a specified [segmenter](/operate/reference/services/vision/#segmentations)). +Get a list of 3D point cloud objects and associated metadata in the latest picture from a 3D camera (using a specified [segmenter](/dev/reference/apis/services/vision/#segmentations)).