Commit: batch 4

JessamyT committed Jan 8, 2025
1 parent d65c225 commit ee271f4

Showing 18 changed files with 32 additions and 31 deletions.
2 changes: 1 addition & 1 deletion docs/dev/reference/apis/services/vision.md
@@ -102,7 +102,7 @@ To get started using Viam's SDKs to connect to and control your machine, go to y

When executed, this sample code creates a connection to your machine as a client.

- The following examples assume that you have a machine configured with a [camera](/operate/reference/components/camera/) and a vision service [detector](/operate/reference/services/vision/#detections), [classifier](/operate/reference/services/vision/#classifications) or [segmenter](/operate/reference/services/vision/#segmentations).
+ The following examples assume that you have a machine configured with a [camera](/operate/reference/components/camera/) and a vision service [detector](/dev/reference/apis/services/vision/#detections), [classifier](/operate/reference/services/vision/#classifications) or [segmenter](/operate/reference/services/vision/#segmentations).

{{< tabs >}}
{{% tab name="Python" %}}
8 changes: 4 additions & 4 deletions docs/dev/reference/try-viam/reserve-a-rover.md
@@ -18,7 +18,7 @@ date: "2022-01-01"
_Try Viam_ is a way to try out the Viam platform without setting up any hardware yourself.
You can take over a Viam Rover in our robotics lab to play around!

- Watch this tutorial video for a walkthrough of Try Viam, including [how to reserve a Viam Rover](#using-the-reservation-system), [navigate the Viam platform](/fleet/), and [drive the rover](/operate/reference/components/base/wheeled/#test-the-base):
+ Watch this tutorial video for a walkthrough of Try Viam, including [how to reserve a Viam Rover](#using-the-reservation-system), [navigate the Viam platform](/operate/), and [drive the rover](/operate/reference/components/base/wheeled/#test-the-base):

{{<youtube embed_url="https://www.youtube-nocookie.com/embed/YYpZ9CVDwMU" max-width="600px">}}

@@ -67,7 +67,7 @@ You can take over and play around with a Viam Rover in our robotics lab from any

1. Please notify Viam support on [our Community Discord](https://discord.gg/viam).
2. Use the **Add Viam Support** button on your machine's Location page to give Viam Support access to your _location_.
-    Refer to [Managing Locations and sub-locations](/cloud/locations/).
+    Refer to [Grant access](/manage/manage/access/#grant-access).

### Can I extend my time?

@@ -115,15 +115,15 @@ Your machine belongs to the [organization](/cloud/organizations/) you were in wh

### Can I share this Location with a friend to work on the machine together?

- Sure, you can [invite other users to your organization](/cloud/locations/) to collaborate on your machine.
+ Sure, you can [invite other users to your organization](/manage/manage/access/#grant-access) to collaborate on your machine.
As members of your organization, those users have full control of your machine.
Another collaboration option is to use screen sharing in a Zoom or Webex session.

### How many active rentals can I have?

You can only borrow one rover at a time.
You cannot join the queue for another reservation while you have an active rental session.
- If you would like to, you can [extend your reservation](/appendix/try-viam/reserve-a-rover/#can-i-extend-my-time).
+ If there is no one waiting after you, you can extend your reservation with the **Extend** button.

### I loved my experience - can I play around more?

@@ -43,7 +43,7 @@ Click **Save** in the upper right corner of the page to save your new configurat

The fragment adds the following components to your machine's JSON configuration:

- - A [board component](/components/board/) named `local` representing the Raspberry Pi.
+ - A [board component](/operate/reference/components/board/) named `local` representing the Raspberry Pi.
- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`)
- The configured pin numbers correspond to where the motor drivers are connected to the board.
- Two [encoders](/operate/reference/components/encoder/single/), one for each motor
@@ -74,7 +74,7 @@ Click **Save** in the upper right corner of the page to save your new configurat

The fragment adds the following components to your machine's JSON configuration:

- - A [board component](/components/board/) named `local` representing the Raspberry Pi.
+ - A [board component](/operate/reference/components/board/) named `local` representing the Raspberry Pi.
- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`)
- The configured pin numbers correspond to where the motor drivers are connected to the board.
- Two [encoders](/operate/reference/components/encoder/single/), one for each motor
@@ -105,7 +105,7 @@ Click **Save** in the upper right corner of the page to save your configuration.

The fragment adds the following components to your machine's JSON configuration:

- - A [board component](/components/board/) named `local` representing the Raspberry Pi
+ - A [board component](/operate/reference/components/board/) named `local` representing the Raspberry Pi
- An I<sup>2</sup>C bus for connection to the accelerometer.
- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`)
- The configured pin numbers correspond to where the motor drivers are connected to the board.
@@ -143,7 +143,7 @@ Click **Save** in the upper right corner of the page to save your new configurat

The fragment adds the following components to your machine's JSON configuration:

- - A [board component](/components/board/) named `local` representing the Jetson.
+ - A [board component](/operate/reference/components/board/) named `local` representing the Jetson.
- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`)
- The configured pin numbers correspond to where the motor drivers are connected to the board.
- Two [encoders](/operate/reference/components/encoder/single/), one for each motor
@@ -174,7 +174,7 @@ Click **Save** in the upper right corner of the page to save your new configurat

The fragment adds the following components to your machine's JSON configuration:

- - A [board component](/components/board/) named `local` representing the Jetson.
+ - A [board component](/operate/reference/components/board/) named `local` representing the Jetson.
- Two [motors](/operate/reference/components/motor/gpio/) (`right` and `left`)
- The configured pin numbers correspond to where the motor drivers are connected to the board.
- Two [encoders](/operate/reference/components/encoder/single/), one for each motor
6 changes: 3 additions & 3 deletions docs/dev/reference/try-viam/try-viam-tutorial.md
@@ -22,7 +22,7 @@ You can take over a Viam Rover in our robotics lab to play around!
The rental rover is made up of a chassis with a Raspberry Pi 4B single-board computer, two motors, encoders, and a camera.
The Try Viam area also has an overhead camera to provide a view of the rental rover, allowing you to view its movements in real time.

- Watch this tutorial video for a walkthrough of Try Viam, including [how to reserve a Viam Rover](/appendix/try-viam/reserve-a-rover/#using-the-reservation-system), [navigate the Viam platform](/fleet/), and [drive the rover](#control-tab):
+ Watch this tutorial video for a walkthrough of Try Viam, including [how to reserve a Viam Rover](/appendix/try-viam/reserve-a-rover/#using-the-reservation-system), [navigate the Viam platform](/operate/), and [drive the rover](#control-tab):

{{<youtube embed_url="https://www.youtube-nocookie.com/embed/YYpZ9CVDwMU">}}

@@ -127,7 +127,7 @@ You can also see their current positions (based on encoder readings) in real tim

#### Board control

- The [board component](/components/board/) is the signal wire hub of a machine which allows you to control the states of individual GPIO pins on the board.
+ The [board component](/operate/reference/components/board/) is the signal wire hub of a machine which allows you to control the states of individual GPIO pins on the board.

For the Viam Rover, the board component is named `local` and controls a Raspberry Pi on the Viam Rover.
With it, you can control the states of individual GPIO pins on the board.
@@ -147,7 +147,7 @@ There you can view the configuration for each component in the machine: attribut

### Board configuration

- The [board component](/components/board/) is the signal wire hub of a machine.
+ The [board component](/operate/reference/components/board/) is the signal wire hub of a machine.
Configuring a board component allows you to control the states of individual GPIO pins to command the electrical signals sent through and received by the board.
For the Viam Rover, the board component is a Raspberry Pi with **Name** `local`, **Type** `board`, and **Model** `viam:raspberry-pi:rpi`.

1 change: 1 addition & 0 deletions docs/manage/_index.md
@@ -10,6 +10,7 @@ overview: true
description: "Remotely deploy and manage software on any fleet of devices. You can monitor all connected devices and troubleshoot any issues - from anywhere."
aliases:
- /cloud/
+  - /fleet/
---

Viam's fleet management tooling allows you to remotely deploy and manage software on any fleet of devices. You can monitor all connected devices and troubleshoot any issues - from anywhere.
2 changes: 1 addition & 1 deletion docs/operate/reference/services/slam/cloudslam/_index.md
@@ -431,7 +431,7 @@ The following attributes are available for `viam:cloudslam-wrapper:cloudslam`
| `api_key` | string | **Required** | An [API key](/manage/manage/access/) with location owner or higher permission. |
| `api_key_id` | string | **Required** | The associated API key ID with the API key. |
| `organization_id` | string | **Required** | The organization ID of your [organization](/dev/reference/glossary/#organization). |
- | `location_id` | string | **Required** | The location ID of your [location](/dev/reference/glossary/location/). |
+ | `location_id` | string | **Required** | The location ID of your [location](/dev/reference/glossary/#term-location). |
| `machine_id` | string | **Required** | The machine ID of your [machine](/dev/reference/apis/fleet/#find-machine-id). |
| `machine_part_id` | string | Optional | The machine part ID of your [machine part](/dev/reference/apis/fleet/#find-machine-id). Used for local package creation and updating mode. |
| `viam_version` | string | Optional | The version of viam-server to use with CloudSLAM. Defaults to `stable`. |
@@ -14,7 +14,7 @@ aliases:
# SMEs: Bijan, Khari
---

- _Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_
+ _Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_

The `detector_3d_segmenter` vision service model takes 2D bounding boxes from an [object detector](../#detections), and, using the intrinsic parameters of the chosen camera, projects the pixels in the bounding box to points in 3D space.
If the chosen camera is not equipped to do projections from 2D to 3D, then this vision model will fail.
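The 2D-to-3D step this model performs can be sketched with the standard pinhole deprojection. This is a minimal illustration, not Viam's implementation; the intrinsics names `fx`, `fy`, `cx`, `cy` are conventional placeholders rather than names from these docs:

```python
# Sketch of pinhole deprojection: a pixel (u, v) with known depth z maps to
# a 3D point in the camera frame using the focal lengths (fx, fy) and the
# principal point (cx, cy) from the camera's intrinsic parameters.

def deproject(u: float, v: float, z: float,
              fx: float, fy: float, cx: float, cy: float) -> tuple[float, float, float]:
    """Map an image pixel and its depth to camera-frame coordinates."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# A pixel at the principal point deprojects straight down the optical axis:
print(deproject(320, 240, 1.0, fx=600, fy=600, cx=320, cy=240))  # (0.0, 0.0, 1.0)
```

Applying this to every pixel inside a 2D bounding box yields the 3D point set that the segmenter returns; without valid intrinsics (and depth), the projection is undefined, which is why the model fails on cameras that cannot provide them.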
2 changes: 1 addition & 1 deletion docs/operate/reference/services/vision/mlmodel.md
@@ -16,7 +16,7 @@ aliases:
# SMEs: Bijan, Khari
---

- _Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_
+ _Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_

The `mlmodel` {{< glossary_tooltip term_id="model" text="model" >}} of the Viam vision service supports machine learning detectors and classifiers that draw bounding boxes or return class labels based on a deployed TensorFlow Lite, TensorFlow, PyTorch, or ONNX ML model.

2 changes: 1 addition & 1 deletion docs/operate/reference/services/vision/obstacles_depth.md
@@ -13,7 +13,7 @@ aliases:
# SMEs: Bijan, Khari
---

- _Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_
+ _Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_

The `obstacles_depth` vision service model is for depth cameras, and is best for motion planning with transient obstacles.
Use this segmenter to identify well separated objects above a flat plane.
@@ -13,7 +13,7 @@ aliases:
# SMEs: Bijan, Khari
---

- _Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_
+ _Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_

`obstacles_distance` is a segmenter that takes point clouds from a camera input and returns the average single closest point to the camera as a perceived obstacle.
It is best for transient obstacle avoidance.
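The underlying idea can be sketched in a few lines. This is an illustrative simplification, not the module's actual code; averaging the per-frame closest point is assumed here as the smoothing strategy:

```python
# Sketch of the obstacles_distance idea: per point-cloud frame, take the
# point nearest the camera origin, then average those closest points across
# frames to damp sensor noise before reporting one perceived obstacle.
import math

Point = tuple[float, float, float]

def closest_point(cloud: list[Point]) -> Point:
    """Return the point with the smallest Euclidean distance to the origin."""
    return min(cloud, key=lambda p: math.hypot(*p))

def averaged_obstacle(frames: list[list[Point]]) -> Point:
    """Average the per-frame closest points into one perceived obstacle."""
    closest = [closest_point(f) for f in frames]
    n = len(closest)
    return tuple(sum(c[i] for c in closest) / n for i in range(3))

frames = [
    [(0.0, 0.0, 1.0), (2.0, 2.0, 2.0)],
    [(0.0, 0.2, 1.0), (3.0, 0.0, 4.0)],
]
print(averaged_obstacle(frames))
```

Because only the single nearest point survives each frame, the output tracks whatever obstacle is currently closest, which is what makes this segmenter suited to transient obstacle avoidance rather than mapping.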
@@ -13,7 +13,7 @@ aliases:
# SMEs: Bijan, Khari
---

- _Changed in [RDK v0.2.36 and API v0.1.118](/appendix/changelog/#vision-service)_
+ _Changed in [RDK v0.2.36 and API v0.1.118](/dev/reference/changelog/#vision-service)_

`obstacles_pointcloud` is a segmenter that identifies well separated objects above a flat plane.
It first identifies the biggest plane in the scene, eliminates that plane, and clusters the remaining points into objects.
2 changes: 1 addition & 1 deletion docs/tutorials/configure/pet-photographer.md
@@ -828,7 +828,7 @@ Whether you've downloaded the `colorfilter` module, or written your own color fi
Next, add the following services to your smart machine to support the color filter module:

- The [data management service](/data-ai/capture-data/capture-sync/) enables your smart machine to capture data and sync it to the cloud.
- - The [vision service](/operate/reference/services/vision/#detections) enables your smart machine to perform color detection on objects in a camera stream.
+ - The [vision service](/dev/reference/apis/services/vision/#detections) enables your smart machine to perform color detection on objects in a camera stream.

### Add the data management service

2 changes: 1 addition & 1 deletion docs/tutorials/projects/verification-system.md
@@ -340,7 +340,7 @@ For example:

- Write a program using one of the [Viam SDK](/dev/reference/sdks/) to poll the `facial-verification` module for its current state, and take action when a particular state is reached.
For example, you could use [`GetClassificationsFromCamera()`](/dev/reference/apis/services/vision/#getclassificationsfromcamera) to capture when a transition into the `ALARM` state occurs, and then send you an email with the captured image of the trespasser!
- - Try changing the type of [detectors](/operate/reference/services/vision/#detections), using different detectors for the `TRIGGER_1` and `TRIGGER_2` states.
+ - Try changing the type of [detectors](/dev/reference/apis/services/vision/#detections), using different detectors for the `TRIGGER_1` and `TRIGGER_2` states.
- Add the [filtered camera module](/data-ai/capture-data/filter-before-sync/) to your machine, and use it as the source camera in your verification system in order to save images to the Viam Cloud only when the system enters into specific states.
This way, you could limit the images captured and synced to only those you are interested in reviewing later, for example.
- If you don't want the `ALARM` capabilities, and would like to just use it as a notification system when a detector gets triggered, set `disable_alarm: true` in the config, which prevents `TRIGGER_2` from entering into the `COUNTDOWN` state, meaning the system will only cycle between the states of `TRIGGER_1` and `TRIGGER_2`.
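The state cycle described above can be sketched as a small transition function. The state names come from the tutorial, but the transition logic here is a simplified, hypothetical rendering, not the module's implementation:

```python
# Hypothetical sketch of the verification state machine: a detection in
# TRIGGER_1 escalates to TRIGGER_2; a sustained detection escalates further
# to COUNTDOWN and then ALARM, unless disable_alarm keeps the system cycling
# between TRIGGER_1 and TRIGGER_2 only.

def next_state(state: str, detection: bool, disable_alarm: bool = False) -> str:
    """Advance the state machine by one camera frame."""
    if state == "TRIGGER_1":
        return "TRIGGER_2" if detection else "TRIGGER_1"
    if state == "TRIGGER_2":
        if not detection:
            return "TRIGGER_1"
        # With disable_alarm set, the system never escalates past TRIGGER_2.
        return "TRIGGER_2" if disable_alarm else "COUNTDOWN"
    if state == "COUNTDOWN":
        return "ALARM"  # simplified: verification failed / timer expired
    return state

state = "TRIGGER_1"
for seen in [True, True, True]:
    state = next_state(state, seen)
print(state)  # ALARM
```

Polling the real module's state with `GetClassificationsFromCamera()` and reacting on specific transitions, as suggested above, amounts to driving actions off exactly this kind of state sequence.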
2 changes: 1 addition & 1 deletion docs/tutorials/services/color-detection-scuttle.md
@@ -68,7 +68,7 @@ Turn on the power to the rover.

This tutorial uses the color `#a13b4c` or `rgb(161,59,76)` (a reddish color).
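The hex notation and RGB triple above are two spellings of the same color; a per-channel tolerance test of the kind a color detector applies can be sketched like this (illustrative only, not the tutorial's configuration):

```python
# Convert a '#rrggbb' string to an RGB triple and test whether a pixel
# falls within a per-channel tolerance of the target color.

def hex_to_rgb(hex_color: str) -> tuple[int, int, int]:
    """Convert '#a13b4c' to (161, 59, 76)."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def matches(pixel: tuple[int, int, int], target: tuple[int, int, int],
            tolerance: int) -> bool:
    """True if every channel of pixel is within tolerance of target."""
    return all(abs(p - t) <= tolerance for p, t in zip(pixel, target))

target = hex_to_rgb("#a13b4c")
print(target)                                        # (161, 59, 76)
print(matches((170, 60, 70), target, tolerance=10))  # True
```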

- To create a [color detector vision service](/operate/reference/services/vision/#detections):
+ To create a [color detector vision service](/dev/reference/apis/services/vision/#detections):

{{< tabs >}}
{{% tab name="Builder" %}}
4 changes: 2 additions & 2 deletions docs/tutorials/services/webcam-line-follower-robot.md
@@ -33,7 +33,7 @@ toc_hide: true
Many line-following robots rely on a dedicated array of infrared sensors to follow a dark line on a light background or a light line on a dark background.
This tutorial uses a standard webcam in place of these sensors, and allows a robot to follow a line of any color that is at least somewhat different from the background.

- **Goal**: To make a wheeled robot follow a colored line along the floor using a webcam and the Viam <a href="/operate/reference/services/vision/#detections">vision service color detector</a>.
+ **Goal**: To make a wheeled robot follow a colored line along the floor using a webcam and the Viam <a href="/dev/reference/apis/services/vision/#detections">vision service color detector</a>.
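The core control idea, steering toward the detected line's horizontal center, can be sketched as follows. The function and parameter names are hypothetical, not the tutorial's code:

```python
# Map the detected line's position in the camera frame to a steering
# command: go straight when the line is near the middle, otherwise turn
# toward it. The deadband keeps the robot from oscillating on tiny offsets.

def steer(box_center_x: float, frame_width: float, deadband: float = 0.1) -> str:
    """Return 'straight', 'left', or 'right' for the line's position."""
    # Normalized offset in [-0.5, 0.5]: negative means line is to the left.
    offset = box_center_x / frame_width - 0.5
    if abs(offset) <= deadband:
        return "straight"
    return "left" if offset < 0 else "right"

print(steer(320, 640))  # straight
print(steer(100, 640))  # left
print(steer(560, 640))  # right
```

In the tutorial this decision is driven by the bounding boxes the color detector returns for each frame.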

**What you will learn**:

@@ -225,7 +225,7 @@ Next, navigate to the **CONFIGURE** tab of your machine's page in the [Viam app]

1. **Add a vision service.**

-    Next, add a vision service [detector](/operate/reference/services/vision/#detections):
+    Next, add a vision service [detector](/dev/reference/apis/services/vision/#detections):

Click the **+** (Create) icon next to your machine part in the left-hand menu and select **Service**.
Select type `vision` and model `color detector`.
10 changes: 5 additions & 5 deletions static/include/services/apis/generated/vision.md
@@ -1,6 +1,6 @@
### GetDetectionsFromCamera

- Get a list of detections from the next image from a specified camera using a configured [detector](/operate/reference/services/vision/#detections).
+ Get a list of detections from the next image from a specified camera using a configured [detector](/dev/reference/apis/services/vision/#detections).

{{< tabs >}}
{{% tab name="Python" %}}
@@ -91,7 +91,7 @@ For more information, see the [Flutter SDK Docs](https://flutter.viam.dev/viam_s

### GetDetections

- Get a list of detections from a given image using a configured [detector](/operate/reference/services/vision/#detections).
+ Get a list of detections from a given image using a configured [detector](/dev/reference/apis/services/vision/#detections).

{{< tabs >}}
{{% tab name="Python" %}}
@@ -440,7 +440,7 @@ For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/

**Returns:**

- - [([]*viz.Object)](https://pkg.go.dev/go.viam.com/rdk/vision#Object): A list of point clouds and associated metadata like the center coordinates of each point cloud.
+ - [([]\*viz.Object)](https://pkg.go.dev/go.viam.com/rdk/vision#Object): A list of point clouds and associated metadata like the center coordinates of each point cloud.
- [(error)](https://pkg.go.dev/builtin#error): An error, if one occurred.

**Example:**
@@ -507,7 +507,7 @@ Used for visualization.

**Returns:**

- - ([viam.services.vision.vision.CaptureAllResult](https://python.viam.dev/autoapi/viam/services/vision/vision/index.html#viam.services.vision.vision.CaptureAllResult)): A class that stores all potential returns from the vision service. It can return the image from the camera along with its associated detections, classifications, and objects, as well as any extra info the model may provide.
+ - ([viam.services.vision.vision.CaptureAllResult](https://python.viam.dev/autoapi/viam/services/vision/vision/index.html#viam.services.vision.vision.CaptureAllResult)): A class that stores all potential returns from the vision service. It can return the image from the camera along with its associated detections, classifications, and objects, as well as any extra info the model may provide.

**Example:**

@@ -745,7 +745,7 @@ For more information, see the [Python SDK Docs](https://python.viam.dev/autoapi/

**Returns:**

- - [(*Properties)](https://pkg.go.dev/go.viam.com/rdk/services/vision#Properties)
+ - [(\*Properties)](https://pkg.go.dev/go.viam.com/rdk/services/vision#Properties)
- [(error)](https://pkg.go.dev/builtin#error): An error, if one occurred.

For more information, see the [Go SDK Docs](https://pkg.go.dev/go.viam.com/rdk/services/vision#Service).
@@ -1 +1 @@
- Get a list of detections from a given image using a configured [detector](/operate/reference/services/vision/#detections).
+ Get a list of detections from a given image using a configured [detector](/dev/reference/apis/services/vision/#detections).
@@ -1 +1 @@
- Get a list of detections from the next image from a specified camera using a configured [detector](/operate/reference/services/vision/#detections).
+ Get a list of detections from the next image from a specified camera using a configured [detector](/dev/reference/apis/services/vision/#detections).
