
Commit

Update links
npentrel committed Jan 7, 2025
1 parent 8aae414 commit 5572d92
Showing 100 changed files with 258 additions and 463 deletions.
12 changes: 6 additions & 6 deletions docs/data-ai/ai/advanced/upload-external-data.md
@@ -96,7 +96,7 @@ You can also turn off **Syncing** unless you have other directories you'd like t

## Upload data with Python

- You can use the Python data client API [`file_upload_from_path`](/appendix/apis/data-client/#fileuploadfrompath) method to upload one or more files from your computer to the Viam Cloud.
+ You can use the Python data client API [`file_upload_from_path`](/dev/reference/apis/data-client/#fileuploadfrompath) method to upload one or more files from your computer to the Viam Cloud.

{{% alert title="Note" color="note" %}}

@@ -122,21 +122,21 @@ pip install viam-sdk
### Instructions

{{< table >}}
{{% tablestep link="/appendix/apis/data-client/#establish-a-connection" %}}
{{% tablestep link="/dev/reference/apis/data-client/#establish-a-connection" %}}
**1. Get API key**

Go to your organization's setting page and create an API key for your individual {{< glossary_tooltip term_id="part" text="machine part" >}}, {{< glossary_tooltip term_id="part" text="machine" >}}, {{< glossary_tooltip term_id="location" text="location" >}}, or {{< glossary_tooltip term_id="organization" text="organization" >}}.

{{% /tablestep %}}
{{% tablestep link="/appendix/apis/data-client/" %}}
{{% tablestep link="/dev/reference/apis/data-client/" %}}
**2. Add a `file_upload_from_path` API call**

Create a Python script and use the `file_upload_from_path` method to upload your data, depending on whether you are uploading one or multiple files:

{{< tabs >}}
{{< tab name="Upload a single file" >}}

- To upload just one file, make a call to [`file_upload_from_path`](/appendix/apis/data-client/#fileuploadfrompath).
+ To upload just one file, make a call to [`file_upload_from_path`](/dev/reference/apis/data-client/#fileuploadfrompath).
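The file's own example code is collapsed in this view (inside the expander below). As a rough sketch only — the part ID, file path, and `VIAM_API_KEY`/`VIAM_API_KEY_ID` environment variable names are placeholder assumptions, not taken from this commit — a single-file upload looks roughly like this:

```python
import asyncio
import os

from viam.app.viam_client import ViamClient
from viam.rpc.dial import Credentials, DialOptions


async def main() -> None:
    # Assumption: API key credentials are read from environment variables.
    dial_options = DialOptions(
        credentials=Credentials(type="api-key", payload=os.environ["VIAM_API_KEY"]),
        auth_entity=os.environ["VIAM_API_KEY_ID"],
    )
    viam_client = await ViamClient.create_from_dial_options(dial_options)
    try:
        # Placeholder part ID and file path.
        file_id = await viam_client.data_client.file_upload_from_path(
            part_id="<your-machine-part-id>",
            filepath="/path/to/your-file.txt",
        )
        print(f"Uploaded file: {file_id}")
    finally:
        viam_client.close()


if __name__ == "__main__":
    asyncio.run(main())
```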

{{< expand "Click this to see example code" >}}

@@ -186,7 +186,7 @@ if __name__ == "__main__":
{{% /tab %}}
{{< tab name="Upload all files in a directory" >}}

- To upload all the files in a directory, you can use the [`file_upload_from_path`](/appendix/apis/data-client/#fileuploadfrompath) method inside a `for` loop.
+ To upload all the files in a directory, you can use the [`file_upload_from_path`](/dev/reference/apis/data-client/#fileuploadfrompath) method inside a `for` loop.
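The directory example is also collapsed here. A minimal sketch of just the loop, assuming `data_client` is the `data_client` of an already-connected `ViamClient` (as in the earlier sketch) and that the directory path and part ID are placeholders:

```python
import os

# Assumptions: this runs inside an async function, `data_client` is an
# authenticated DataClient, and the directory contains only regular files.
my_data_directory = "/path/to/your-data-directory"

for file_name in os.listdir(my_data_directory):
    file_path = os.path.join(my_data_directory, file_name)
    if os.path.isfile(file_path):
        await data_client.file_upload_from_path(
            part_id="<your-machine-part-id>",
            filepath=file_path,
        )
```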

{{< expand "Click this to see example code" >}}

@@ -252,7 +252,7 @@ View your uploaded data in your [**DATA** page in the Viam app](https://app.viam

## Upload images with the Viam mobile app

- Upload images as machine data straight from your phone, skipping the normal data capture and cloud synchronization process, through the [Viam mobile app](/fleet/control/#control-interface-in-the-viam-mobile-app).
+ Upload images as machine data straight from your phone, skipping the normal data capture and cloud synchronization process, through the [Viam mobile app](/manage/troubleshoot/teleoperate/default-interface/#viam-mobile-app).
This is useful if you want to capture images for training machine learning models on the go.

### Prerequisites
2 changes: 1 addition & 1 deletion docs/data-ai/ai/create-dataset.md
@@ -239,7 +239,7 @@ If you have 25 images in your dataset, at least 20 of those must be labelled.

{{< expand "Want to add images to a dataset programmatically? Click here." >}}

- You can also add all images with a certain label to a dataset using the [`viam dataset data add` command](/cli/#dataset) or the [Data Client API](/appendix/apis/data-client/#addtagstobinarydatabyfilter):
+ You can also add all images with a certain label to a dataset using the [`viam dataset data add` command](/dev/cli/#dataset) or the [Data Client API](/dev/reference/apis/data-client/#addtagstobinarydatabyfilter):
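The tab contents are collapsed in this diff. For the CLI route, an invocation might look like the following sketch — the `filter` subcommand and flag names are assumptions based on the linked CLI reference, so confirm them with `viam dataset data add --help`:

```sh
viam dataset data add filter --dataset-id=<dataset-id> --tags=my_label
```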

{{< tabs >}}
{{% tab name="CLI" %}}
2 changes: 1 addition & 1 deletion docs/data-ai/ai/deploy.md
@@ -59,7 +59,7 @@ After deploying your model, you need to configure an additional service to use t
For example, you can configure an [`mlmodel` vision service](/services/vision/) to visualize the inferences your model makes.
Follow our docs to [run inference](/data-ai/ai/run-inference/) to add an `mlmodel` vision service and see inferences.

- For other use cases, consider [creating custom functionality with a module](/how-tos/create-module/).
+ For other use cases, consider [creating custom functionality with a module](/operate/get-started/other-hardware/).

{{< alert title="Add support for other models" color="tip" >}}
ML models must be designed in particular shapes to work with the `mlmodel` [classification](/services/vision/mlmodel/) or [detection](/services/vision/mlmodel/) model of Viam's [vision service](/services/vision/).
2 changes: 1 addition & 1 deletion docs/data-ai/ai/run-inference.md
@@ -33,7 +33,7 @@ Then, from the **Select model** dropdown, select the name of the ML model servic

### Test your changes

- You can test a deployed vision service by clicking on the **Test** area of its configuration panel or from the [**CONTROL** tab](/fleet/control/).
+ You can test a deployed vision service by clicking on the **Test** area of its configuration panel or from the [**CONTROL** tab](/manage/troubleshoot/teleoperate/default-interface/#viam-app).

The camera stream shows when the vision service identifies something.
Try pointing the camera at a scene similar to your training data.
2 changes: 1 addition & 1 deletion docs/data-ai/ai/train-tflite.md
@@ -149,7 +149,7 @@ Your training script may output logs at the error level but still succeed.

{{< /alert >}}

- You can also view your training jobs' logs with the [`viam train logs`](/cli/#train) command.
+ You can also view your training jobs' logs with the [`viam train logs`](/dev/cli/#train) command.
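A hedged example invocation — the `--job-id` flag is an assumption rather than something shown in this commit, so check `viam train logs --help`:

```sh
viam train logs --job-id=<training-job-id>
```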

{{% /tablestep %}}
{{< /table >}}
16 changes: 8 additions & 8 deletions docs/data-ai/ai/train.md
@@ -27,7 +27,7 @@ If you wish to do this, skip to [Submit a training job](#submit-a-training-job).

For image data, you can follow the instructions to [Create a dataset](/data-ai/ai/create-dataset/) to create a dataset and label data.

- For other data you can use the [Data Client API](/appendix/apis/data-client/) from within the training script to get data stored in the Viam Cloud.
+ For other data you can use the [Data Client API](/dev/reference/apis/data-client/) from within the training script to get data stored in the Viam Cloud.

{{% /expand%}}

@@ -414,7 +414,7 @@ Update the main to call the functions you have just created.
{{% tablestep %}}
**9. Using Viam APIs in a training script**

- If you need to access any of the [Viam APIs](/appendix/apis/) within a custom training script, you can use the environment variables `API_KEY` and `API_KEY_ID` to establish a connection.
+ If you need to access any of the [Viam APIs](/dev/reference/apis/) within a custom training script, you can use the environment variables `API_KEY` and `API_KEY_ID` to establish a connection.
These environment variables will be available to training scripts.
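The snippet below is collapsed in this diff. A sketch of how a training script might consume those variables — the `DialOptions`/`Credentials` pattern is the SDK's standard connection recipe, not code copied from this file:

```python
import os

from viam.app.viam_client import ViamClient
from viam.rpc.dial import Credentials, DialOptions


async def connect() -> ViamClient:
    """Connect with the API_KEY / API_KEY_ID injected into the training job."""
    dial_options = DialOptions(
        credentials=Credentials(type="api-key", payload=os.environ["API_KEY"]),
        auth_entity=os.environ["API_KEY_ID"],
    )
    return await ViamClient.create_from_dial_options(dial_options)
```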

```python
@@ -442,7 +442,7 @@ You can export one of your Viam datasets to test your training script locally.
{{% tablestep %}}
**1. Export your dataset**

- You can get the dataset ID from the dataset page or using the [`viam dataset list`](/cli/#dataset) command:
+ You can get the dataset ID from the dataset page or using the [`viam dataset list`](/dev/cli/#dataset) command:

```sh {class="command-line" data-prompt="$"}
viam dataset export --destination=<destination> --dataset-id=<dataset-id> --include-jsonl=true
@@ -514,7 +514,7 @@ viam training-script upload --path=my-training.tar.gz \
{{% /tab %}}
{{< /tabs >}}

- You can also [specify the version, framework, type, visibility, and description](/cli/#training-script) when uploading a custom training script.
+ You can also [specify the version, framework, type, visibility, and description](/dev/cli/#training-script) when uploading a custom training script.

To find your organization's ID, run the following command:

@@ -530,7 +530,7 @@ You can view uploaded training scripts by navigating to the [registry's **Traini

## Submit a training job

- After uploading the training script, you can run it by submitting a training job through the Viam app or using the Viam CLI or [ML Training client API](/appendix/apis/ml-training-client/#submittrainingjob).
+ After uploading the training script, you can run it by submitting a training job through the Viam app or using the Viam CLI or [ML Training client API](/dev/reference/apis/ml-training-client/#submittrainingjob).

{{< table >}}
{{% tablestep %}}
@@ -548,7 +548,7 @@ Click **Train model** and select **Train on a custom training script**, then fol
{{% /tab %}}
{{% tab name="CLI" %}}

- You can use [`viam train submit custom from-registry`](/cli/#positional-arguments-submit) to submit a training job.
+ You can use [`viam train submit custom from-registry`](/dev/cli/#positional-arguments-submit) to submit a training job.

For example:

@@ -562,7 +562,7 @@ viam train submit custom from-registry --dataset-id=<INSERT DATASET ID> \

This command submits a training job to the previously uploaded `MyCustomTrainingScript` with another input dataset, which trains `MyRegistryModel` and publishes that to the registry.

- You can get the dataset id from the dataset page or using the [`viam dataset list`](/cli/#dataset) command.
+ You can get the dataset id from the dataset page or using the [`viam dataset list`](/dev/cli/#dataset) command.

{{% /tab %}}
{{< /tabs >}}
@@ -595,7 +595,7 @@ Your training script may output logs at the error level but still succeed.

{{< /alert >}}

- You can also view your training jobs' logs with the [`viam train logs`](/cli/#train) command.
+ You can also view your training jobs' logs with the [`viam train logs`](/dev/cli/#train) command.

{{% /tablestep %}}
{{< /table >}}
2 changes: 1 addition & 1 deletion docs/data-ai/capture-data/conditional-sync.md
@@ -289,7 +289,7 @@ To test your setup, [configure a webcam](/components/camera/webcam/) or another
Make sure to physically connect any hardware parts to the computer controlling your machine.
For a camera component, use the `ReadImage` method.
The data manager will now capture data.
- Go to the [**CONTROL** tab](/fleet/control/).
+ Go to the [**CONTROL** tab](/manage/troubleshoot/teleoperate/default-interface/#viam-app).
You should see the sensor.
Click on `GetReadings`.

2 changes: 1 addition & 1 deletion docs/data-ai/data/visualize.md
@@ -267,7 +267,7 @@ See the [guide on querying sensor data](/how-tos/sensor-data-query-with-third-pa

For more detailed instructions on using Grafana, including a full step-by-step configuration walkthrough, see [visualizing data with Grafana](/tutorials/services/visualize-data-grafana/).

- On top of visualizing sensor data with third-party tools, you can also [query it with the Python SDK](/appendix/apis/data-client/) or [query it with the Viam app](/data-ai/data/query/).
+ On top of visualizing sensor data with third-party tools, you can also [query it with the Python SDK](/dev/reference/apis/data-client/) or [query it with the Viam app](/data-ai/data/query/).

To see full projects using visualization, check out these resources:

2 changes: 1 addition & 1 deletion docs/data-ai/reference/vision/mlmodel.md
@@ -228,7 +228,7 @@ If the classifier's results exceed the confidence threshold, the **Run model** s

### Live camera footage

- You can test your detector or classifier from the [**Control tab**](/fleet/control/) or with code using a camera that is part of your machine.
+ You can test your detector or classifier from the [**Control tab**](/manage/troubleshoot/teleoperate/default-interface/#viam-app) or with code using a camera that is part of your machine.

#### Test your vision service

4 changes: 2 additions & 2 deletions docs/dev/_index.md
@@ -430,7 +430,7 @@ result, err := myTwilioSvc.DoCommand(context.Background(), command)

Using the Viam Registry you can turn services and your own custom business logic into _{{< glossary_tooltip term_id="module" text="modules" >}}_. You can then deploy your modules to your machines.

- [Create a module →](/how-tos/create-module/)
+ [Create a module →](/operate/get-started/other-hardware/)

</div>
</div>
@@ -711,7 +711,7 @@ for m in machines:

Get status information and logs from all your deployed machines using the fleet management API.

- [Learn about Platform APIs →](/appendix/apis/#platform-apis)
+ [Learn about Platform APIs →](/dev/reference/apis/#platform-apis)

</div>
</div>
2 changes: 1 addition & 1 deletion docs/dev/contributing.md
@@ -159,7 +159,7 @@ The docs use the [Diátaxis Framework](https://diataxis.fr/) as the basis of t
{{< /expand >}}

- **Reference**: A concise, information-oriented piece of content that generally starts with an overview/introduction and then a list of some kind (configuration options, API methods, etc.).
- Examples include the [API pages](/appendix/apis/) as well as [component and service pages](/operate/reference/components/arm/).
+ Examples include the [API pages](/dev/reference/apis/) as well as [component and service pages](/operate/reference/components/arm/).

Example template: [Component template](https://github.com/viamrobotics/docs/blob/main/docs/components/component/_index.md).

2 changes: 1 addition & 1 deletion docs/dev/reference/apis/_index.md
@@ -17,7 +17,7 @@ date: "2024-10-01"

Every Viam {{< glossary_tooltip term_id="resource" text="resource" >}} exposes an [application programming interface (API)](https://en.wikipedia.org/wiki/API) described through [protocol buffers](https://developers.google.com/protocol-buffers).

- The API methods provided by the SDKs for each of these resource APIs wrap gRPC client requests to the machine when you execute your program, providing you a convenient interface for accessing information about and controlling the {{< glossary_tooltip term_id="resource" text="resources" >}} you have [configured](/configure/) on your machine.
+ The API methods provided by the SDKs for each of these resource APIs wrap gRPC client requests to the machine when you execute your program, providing you a convenient interface for accessing information about and controlling the {{< glossary_tooltip term_id="resource" text="resources" >}} you have [configured](/operate/get-started/supported-hardware/) on your machine.
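As a concrete illustration (a sketch, not text from this file): a resource API call in the Python SDK reads like an ordinary method call while issuing the corresponding gRPC request under the hood — the machine connection and camera name here are assumed:

```python
from viam.components.camera import Camera

# Assumption: `machine` is an already-connected RobotClient and
# "my-camera" is a camera configured on that machine.
camera = Camera.from_robot(machine, "my-camera")
image = await camera.get_image()  # wraps the camera API's GetImage gRPC call
```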

## Platform APIs

