
More fixes

npentrel committed Jan 7, 2025
1 parent 0218f04 commit c466f8a
Showing 131 changed files with 292 additions and 302 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/update_sdk_methods.py
Original file line number Diff line number Diff line change
@@ -435,7 +435,7 @@
"SLAM service": "/services/slam/",
"frame": "/services/frame-system/",
"Viam app": "https://app.viam.com/",
"organization settings page": "/cloud/organizations/",
"organization settings page": "/manage/reference/organize/",
"image tags": "/fleet/dataset/#image-tags",
"API key": "/fleet/cli/#authenticate",
"board model": "/dev/reference/apis/components/board/"
2 changes: 1 addition & 1 deletion docs/data-ai/ai/train-tflite.md
@@ -85,7 +85,7 @@ Now that you have seen that the cameras on your Try Viam rover work, begin by [C
You can drive the rover around as you capture data to get a variety of images from different angles.

{{< alert title="Tip" color="tip" >}}
Be aware that if you are running out of time during your rental, you can [extend your rover rental](/appendix/try-viam/reserve-a-rover/#extend-your-reservation) as long as there are no other reservations.
Be aware that if you are running out of time during your rental, you can extend your rover rental as long as there are no other reservations.
{{< /alert >}}

{{% /expand%}}
4 changes: 2 additions & 2 deletions docs/data-ai/capture-data/filter-before-sync.md
@@ -15,7 +15,7 @@ Contributors have written several filtering {{< glossary_tooltip term_id="modul
The following steps use the [`filtered_camera`](https://github.com/erh/filtered_camera) module:

{{< table >}}
{{% tablestep link="/services/ml/"%}}
{{% tablestep link="/data-ai/ai/deploy/"%}}
{{<imgproc src="/services/ml/train.svg" class="fill alignleft" style="width: 150px" declaredimensions=true alt="Train models">}}
**1. Add an ML model service to your machine**

@@ -95,7 +95,7 @@ Images that pass your filter will be captured and will sync at the specified syn
Your images will begin to appear under the **DATA** tab.

If no data appears after the sync interval, check the [**Logs**](/manage/troubleshoot/troubleshoot/#check-logs) and ensure that the condition for filtering is met.
You can test the vision service from the [**CONTROL** tab](/cloud/machines/#control) to see its classifications and detections live.
You can test the vision service from the [**CONTROL** tab](/manage/troubleshoot/teleoperate/default-interface/) to see its classifications and detections live.

{{% /tablestep %}}
{{% tablestep %}}
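The filtering flow this file describes (capture an image only when a vision service's result satisfies a condition) can be sketched in plain Python. The `Classification` shape, label, and threshold below are illustrative assumptions, not the actual API of the `filtered_camera` module:

```python
from dataclasses import dataclass


@dataclass
class Classification:
    label: str
    confidence: float  # between 0.0 and 1.0


def should_sync(classifications, wanted_label, min_confidence=0.6):
    """Keep an image only if the vision service saw the wanted label with
    at least min_confidence -- the kind of condition that must be met
    before any data appears under the DATA tab."""
    return any(
        c.label == wanted_label and c.confidence >= min_confidence
        for c in classifications
    )


# A frame with a confident "dog" classification passes the filter:
frame = [Classification("dog", 0.82), Classification("cat", 0.11)]
print(should_sync(frame, "dog"))  # True
print(should_sync(frame, "cat"))  # False: "cat" is below the threshold
```

If no data syncs, checking a predicate like this against live classifications from the **CONTROL** tab is a quick way to see whether the condition ever fires.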
2 changes: 1 addition & 1 deletion docs/data-ai/data/query.md
@@ -40,7 +40,7 @@ Follow the guide to [capture sensor data](/data-ai/capture-data/capture-sync/).

Once your data has synced, you can query your data from within the Viam app using {{< glossary_tooltip term_id="sql" text="SQL" >}} or {{< glossary_tooltip term_id="mql" text="MQL" >}}.

You must have the [owner role](/cloud/rbac/) in order to query data in the Viam app.
You must have the [owner role](/manage/manage/rbac/) in order to query data in the Viam app.

{{< table >}}
{{% tablestep %}}
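As a sketch of the MQL side of this page, here is a hypothetical aggregation pipeline expressed as Python dicts; the field names `component_name` and `time_received` are assumptions about the synced-data schema, not guaranteed by the docs:

```python
# Hypothetical MQL pipeline: the five most recent readings from one
# sensor. Stage syntax follows MongoDB aggregation conventions.
pipeline = [
    {"$match": {"component_name": "my-sensor"}},   # filter to one component
    {"$sort": {"time_received": -1}},              # newest first
    {"$limit": 5},                                 # cap the result size
]

for stage in pipeline:
    print(stage)
```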
8 changes: 4 additions & 4 deletions docs/data-ai/reference/vision/mlmodel.md
@@ -27,14 +27,14 @@ Before configuring your `mlmodel` detector or classifier, you need to:

<h4>1. Train or upload an ML model</h4>

You can add an [existing model](/data-ai/ai/deploy/#deploy-your-ml-model) or [train a TFlite](/data-ai/ai/train-tflite/) or [another model](data-ai/ai/train/) for object detection and classification using your data in the [Viam Cloud](/fleet/data-management/).
You can add an [existing model](/data-ai/ai/deploy/#deploy-your-ml-model), [train a TFLite model](/data-ai/ai/train-tflite/), or [train another model](/data-ai/ai/train/) for object detection and classification using your data in the [Viam Cloud](/data-ai/capture-data/capture-sync/).

{{% /manualcard %}}
{{% manualcard %}}

<h4>2. Deploy your ML model</h4>

To use ML models with your machine, use a suitable [ML model service](/services/ml/) to deploy and run the model.
To use ML models with your machine, use a suitable [ML model service](/data-ai/ai/deploy/) to deploy and run the model.

{{% /manualcard %}}
{{< /cards >}}
@@ -123,7 +123,7 @@ The following attributes are available for an `mlmodel` detector or classifier:
<!-- prettier-ignore -->
| Parameter | Type | Required? | Description |
| --------- | ---- | --------- | ----------- |
| `mlmodel_name` | string | **Required** | The name of the [ML model service](/services/ml/) you want to use the model from. |
| `mlmodel_name` | string | **Required** | The name of the [ML model service](/data-ai/ai/deploy/) you want to use the model from. |
| `remap_output_names` | object | Optional | The names of your output tensors, mapped to the service requirements. See [Tensor names](#tensor-names) for more information. |
| `remap_input_names` | object | Optional | The name of your input tensor, mapped to the service requirements. See [Tensor names](#tensor-names) for more information. |
| `input_image_bgr` | bool | Optional | Set this to `true` if the ML model service expects the input image to have BGR pixels, rather than RGB pixels. <br> Default: `false` |
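The `remap_*` attributes and `input_image_bgr` from the table above can be illustrated with a small stdlib sketch. The tensor names and pixel tuple below are made up for the example; the real service operates on model tensors, not Python dicts:

```python
def remap_tensor_names(tensors, remap):
    """Rename tensor keys per a remap_{input,output}_names-style mapping;
    names absent from the mapping pass through unchanged."""
    return {remap.get(name, name): value for name, value in tensors.items()}


def rgb_to_bgr(pixel):
    """What input_image_bgr=true implies per pixel: reversed channel order."""
    r, g, b = pixel
    return (b, g, r)


# A made-up TFLite output tensor name, remapped to what a detector expects:
outputs = {"StatefulPartitionedCall:0": [0.1, 0.9]}
print(remap_tensor_names(outputs, {"StatefulPartitionedCall:0": "score"}))
# {'score': [0.1, 0.9]}
print(rgb_to_bgr((255, 0, 10)))  # (10, 0, 255)
```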
@@ -217,7 +217,7 @@ The feature is only available for classifiers that were uploaded after September

{{<gif webm_src="/services/vision/mug-classifier.webm" mp4_src="/services/vision/mug-classifier.mp4" alt="A classification model run against an image containing a mug." max-width="250px" class="alignright">}}

If you have images stored in the [Viam Cloud](/fleet/data-management/), you can run your classifier against your images in the [Viam app](https://app.viam.com/).
If you have images stored in the [Viam Cloud](/data-ai/capture-data/capture-sync/), you can run your classifier against your images in the [Viam app](https://app.viam.com/).

1. Navigate to the [Data tab](https://app.viam.com/data/view) and click on the **Images** subtab.
2. Click on an image to open the side menu, and select the **Actions** tab under the **Data** tab.
2 changes: 1 addition & 1 deletion docs/data-ai/reference/vision/obstacles_depth.md
@@ -124,7 +124,7 @@ The following parameters are available for an `"obstacles_depth"` segmenter:

If you want to identify multiple boxes over the flat plane with your segmenter:

- First, [configure your frame system](/services/frame-system/#configuration) to configure the relative spatial orientation of the components of your machine, including your [camera](/operate/reference/components/camera/), within Viam's [frame system service](/services/frame-system/).
- First, [configure your frame system](/operate/mobility/define-geometry/#configure-a-reference-frame) to configure the relative spatial orientation of the components of your machine, including your [camera](/operate/reference/components/camera/), within Viam's [frame system service](/operate/mobility/define-geometry/).
- After configuring your frame system, your camera will populate its own `Properties` with these spatial intrinsic parameters from the frame system.
- You can get those parameters from your camera through the [camera API](/dev/reference/apis/components/camera/#getproperties).
- The segmenter now returns multiple boxes within the `GeometryInFrame` object it captures.
2 changes: 1 addition & 1 deletion docs/dev/_index.md
@@ -789,7 +789,7 @@ api_key, api_key_id = await cloud.create_key(

Viam allows you to organize and manage any number of machines. When collaborating with others, you can assign permissions using Role-Based Access Control (RBAC). Programmatically you can do this with the fleet management API.

[Learn about access control →](/cloud/rbac/)
[Learn about access control →](/manage/manage/rbac/)

</div>
</div>
4 changes: 2 additions & 2 deletions docs/dev/contributing.md
@@ -62,7 +62,7 @@ The docs use the [Diátaxis Framework](https://diataxis.fr/) as the basis of t

- **Explanation (conceptual)**: An understanding-oriented piece of content.
This content provides background knowledge on a topic and tends to be referenced in how-to guides and tutorials.
For example the [`viam-server` page](/architecture/viam-server/) or the [Registry page](/registry/).
For example the [`viam-server` page](/architecture/viam-server/).
It’s useful to have a real or imagined "Why?" question to serve as a prompt.

{{< expand "Click to view template" >}}
@@ -94,7 +94,7 @@ The docs use the [Diátaxis Framework](https://diataxis.fr/) as the basis of t

- **How-to Guide (procedural)**: A task-oriented piece of content that directs a reader to perform actions step by step to complete a task, like instructions to sauté onions.
Generally starts with a description of the task and things to consider, and then provides a set of numbered steps to follow.
For example, the [Installation page](/operate/get-started/setup/) or the [Find module page](/registry/modular-resources/).
For example, the [Move a base](/operate/mobility/move-base/) page.

{{< expand "Click to view template" >}}

2 changes: 1 addition & 1 deletion docs/dev/reference/apis/robot.md
@@ -24,7 +24,7 @@ The machine API supports the following methods:

To interact with the machine API with Viam's SDKs, instantiate a `RobotClient` ([gRPC](https://grpc.io/) client) and use that class for all interactions.

To find the API key, API key ID, and machine address, go to [Viam app](https://app.viam.com/), select the machine you wish to connect to, and go to the [**Code sample**](/cloud/machines/#code-sample) tab.
To find the API key, API key ID, and machine address, go to the [Viam app](https://app.viam.com/), select the machine you wish to connect to, and go to the **CONNECT** tab.
Toggle **Include API key**, and then copy and paste the API key ID and the API key into your environment variables or directly into the code:

{{< tabs >}}
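Before pasting credentials into code, a common pattern the page hints at is keeping them in environment variables. This stdlib-only sketch (the variable names are illustrative, not mandated by Viam) checks that all three values from the **CONNECT** tab are present:

```python
import os

# Illustrative environment variable names, not an official convention:
REQUIRED = ("VIAM_API_KEY", "VIAM_API_KEY_ID", "VIAM_MACHINE_ADDRESS")


def load_credentials(env=None):
    """Collect the API key, API key ID, and machine address from the
    environment, failing loudly if any of them are missing."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED if not env.get(name)]
    if missing:
        raise KeyError(f"missing environment variables: {', '.join(missing)}")
    return {name: env[name] for name in REQUIRED}


creds = load_credentials({
    "VIAM_API_KEY": "abc",
    "VIAM_API_KEY_ID": "def",
    "VIAM_MACHINE_ADDRESS": "my-machine.example.viam.cloud",
})
print(creds["VIAM_MACHINE_ADDRESS"])
```

The resulting values would then be handed to whichever `RobotClient` connection code the tabs below show.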
2 changes: 1 addition & 1 deletion docs/dev/reference/apis/services/SLAM.md
@@ -13,7 +13,7 @@ date: "2022-01-01"

The SLAM service API allows you to get a machine's position within a map.

The [SLAM service](/services/slam/) supports the following methods:
The [SLAM service](/operate/reference/services/slam/) supports the following methods:

{{< readfile "/static/include/services/apis/generated/slam-table.md" >}}

2 changes: 1 addition & 1 deletion docs/dev/reference/apis/services/base-rc.md
@@ -13,7 +13,7 @@ date: "2022-01-01"

The base remote control service API allows you to get a list of inputs from the controller that are being monitored for that control mode.

The [SLAM service](/services/slam/) supports the following methods:
The base remote control service supports the following methods:

{{< readfile "/static/include/services/apis/generated/base_remote_control-table.md" >}}

2 changes: 1 addition & 1 deletion docs/dev/reference/apis/services/generic.md
@@ -11,7 +11,7 @@ date: "2022-01-01"
# updated: "" # When the content was last entirely checked
---

The generic service API allows you to give commands to your [generic services](/services/generic/) for running model-specific commands using [`DoCommand`](/dev/reference/apis/services/generic/#docommand).
The generic service API allows you to give commands to your [generic services](/operate/reference/components/generic/) for running model-specific commands using [`DoCommand`](/dev/reference/apis/services/generic/#docommand).

The generic service supports the following methods:

2 changes: 1 addition & 1 deletion docs/dev/reference/apis/services/ml.md
@@ -13,7 +13,7 @@ date: "2022-01-01"

The ML model service API allows you to make inferences based on a provided ML model.

The [ML Model service](/services/ml/) supports the following methods:
The [ML Model service](/data-ai/ai/deploy/) supports the following methods:

{{< readfile "/static/include/services/apis/generated/mlmodel-table.md" >}}

2 changes: 1 addition & 1 deletion docs/dev/reference/apis/services/motion.md
@@ -10,7 +10,7 @@ date: "2022-01-01"
# updated: "" # When the content was last entirely checked
---

The motion service API allows you to give commands to your [motion service](/services/motion/) for moving a mobile robot based on a SLAM map or GPS coordinates or for moving a machine's components from one pose to another.
The motion service API allows you to give commands to your [motion service](/operate/reference/services/motion/) for moving a mobile robot based on a SLAM map or GPS coordinates or for moving a machine's components from one pose to another.

The motion service supports the following methods:

2 changes: 1 addition & 1 deletion docs/dev/reference/apis/services/navigation.md
@@ -13,7 +13,7 @@ date: "2022-01-01"

The navigation service API allows you to define waypoints and move your machine along those waypoints while avoiding obstacles.

The [navigation service](/services/navigation/) supports the following methods:
The [navigation service](/operate/reference/services/navigation/) supports the following methods:

{{< readfile "/static/include/services/apis/generated/navigation-table.md" >}}

2 changes: 1 addition & 1 deletion docs/dev/reference/apis/sessions.md
@@ -32,7 +32,7 @@ This is especially important for machines that physically move.
For example, imagine a wheeled rover gets a [`SetPower()`](/dev/reference/apis/components/base/#setpower) command as the last input from a client before the connection to the machine is interrupted.
Without session management, the API request from the client would cause the rover's motors to move, causing the machine to continue driving forever and potentially colliding with objects and people.

For more information, see [Client Sessions and Machine Network Connectivity](/sdks/connectivity/).
For more information, see [Client Sessions and Machine Network Connectivity](/dev/reference/sdks/connectivity/).

If you want to manage operations differently, you can manage your machine's client sessions yourself.
The Session Management API provides functionality for:
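The safety rationale in this hunk (stop actuation when the client vanishes) can be sketched with a stdlib-only session monitor. The timeout value and class shape are illustrative, not the actual session management implementation in `viam-server`:

```python
import time


class SessionMonitor:
    """Track client heartbeats and report whether actuation is still safe.
    A real session manager would stop the base's motors itself when the
    window elapses; here we only expose the decision."""

    def __init__(self, timeout_s=2.0, clock=time.monotonic):
        self._timeout = timeout_s
        self._clock = clock
        self._last_heartbeat = clock()

    def heartbeat(self):
        self._last_heartbeat = self._clock()

    def may_actuate(self):
        return (self._clock() - self._last_heartbeat) <= self._timeout


# A simulated clock makes the behavior deterministic:
now = [0.0]
mon = SessionMonitor(timeout_s=2.0, clock=lambda: now[0])
now[0] = 1.5
print(mon.may_actuate())  # True: last heartbeat 1.5 s ago
now[0] = 3.0
print(mon.may_actuate())  # False: session considered lost, SetPower halts
mon.heartbeat()
print(mon.may_actuate())  # True again after a fresh heartbeat
```

This is exactly the behavior that prevents the runaway-rover scenario above: without the heartbeat check, the last `SetPower()` command would keep the motors running indefinitely.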