Commit 1bc398d (parent 19475eb)
WIP: Updated the CLI docs. Added the images in the folder structure, removed some of the pages and placed the content in 1
Ana Loznianu committed Oct 3, 2023
Showing 21 changed files with 158 additions and 123 deletions.

# Consume a Dataset 📥

The process of consuming an asset is straightforward. To achieve this, you only need to execute a single command:

```bash
npm run cli download 'assetDID' 'download-location-path'
```

In this command, replace 'assetDID' with the DID of the asset you want to consume, and 'download-location-path' with the path where you want to store the downloaded asset content.
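
For example, a call with a hypothetical asset DID and a local download folder might look like this (both values are placeholders, not real identifiers):

```bash
# Placeholder values for illustration only; substitute your own asset DID and download path
npm run cli download 'did:op:ce9e87c6d8aef5548e0ba11ab5a128d1e8e77e0183b04420a4fae67cf90906a0' './downloads'
```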

Once executed, this command orchestrates both the **ordering** of a [datatoken](../contracts/datatokens.md) and the subsequent download operation. The asset's content will be automatically retrieved and saved at the specified location.

<figure><img src="../../.gitbook/assets/cli/download.png" alt=""><figcaption>Consume</figcaption></figure>

# Edit a Dataset ✏️

To make changes to a dataset, you'll need to start by retrieving the asset's [Decentralized Data Object](../ddo-specification.md) (DDO).

## Retrieve DDO

Obtaining the DDO of an asset is a straightforward process. You can accomplish this by executing the following command:

```bash
npm run cli getDDO 'assetDID'
```

<figure><img src="../../.gitbook/assets/cli/getAsset.png" alt=""><figcaption>Get DDO</figcaption></figure>

After retrieving the asset's DDO and saving it as a JSON file, you can edit the metadata as needed. Once you've made the necessary changes, use the following command to apply the updated metadata:

```bash
npm run cli editAsset 'DATASET_DID' 'PATH_TO_UPDATED_FILE'
```
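
As a rough end-to-end sketch of this workflow (placeholder names throughout, and assuming the DDO printed by `getDDO` can be redirected into a file):

```bash
# 'DATASET_DID' is a placeholder for your asset's DID
npm run cli getDDO 'DATASET_DID' > ddo.json      # save the DDO locally
# edit ddo.json in your editor of choice, e.g. update the metadata fields
npm run cli editAsset 'DATASET_DID' 'ddo.json'   # apply the updated metadata
```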

# Installation and Configuration 🛠️

To get started with the Ocean CLI, follow these steps for a seamless setup:

## Clone the Repository

Begin by cloning the repository. You can achieve this by executing the following command in your terminal:

```bash
$ git clone https://github.com/oceanprotocol/ocean.js-cli.git
```

Cloning the repository will create a local copy on your machine, allowing you to access and work with its contents.

## Install NPM Dependencies

After successfully cloning the repository, install the necessary npm dependencies to ensure that the project functions correctly. This can be done with the following command:

```bash
npm install
```

## Build the TypeScript code

To compile the TypeScript code and prepare the CLI for use, execute the following command:

```bash
npm run build
```
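
If you prefer a single copy-paste setup, the three steps above can be chained as a sketch (this assumes the repository is cloned into the default `ocean.js-cli` directory):

```bash
# Clone, install dependencies, and build in one go
git clone https://github.com/oceanprotocol/ocean.js-cli.git \
  && cd ocean.js-cli \
  && npm install \
  && npm run build
```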

Now, let's configure the environment variables required for the CLI to function effectively. 🚀

## Setting Environment Variables 🌐

To successfully configure the CLI tool, two essential steps must be undertaken: setting the account's private key and specifying the desired RPC endpoint. Both are pivotal in enabling the CLI tool to function effectively.

### Private Key Configuration

The CLI tool requires the account's private key to be configured. This private key is the means by which the CLI tool establishes a connection to the associated wallet, and it plays a crucial role in authenticating and authorizing the operations performed by the tool.

```bash
export MNEMONIC="XXXX"
```

### RPC Endpoint Specification

Additionally, you must specify the RPC endpoint that corresponds to the network on which you want to execute operations. The CLI tool relies on this user-provided RPC endpoint to connect to that network, which is what enables it to interact with the blockchain and execute operations seamlessly.

```bash
export RPC='XXXX'
```

Furthermore, additional environment variables can be configured to make the setup more flexible and customizable. These include the metadata cache URL and the Provider URL, which you can specify if you prefer to use a custom deployment of Aquarius or Provider instead of the default settings. You can also provide a custom address file path if you wish to use customized smart contracts or deployments for your specific use case. Remember, setting the following environment variables is optional.

```bash
export AQUARIUS_URL='XXXX'
export PROVIDER_URL='XXXX'
export ADDRESS_FILE='../path/to/your/address-file'
```
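
Taken together, a minimal shell setup might look like the sketch below. Every value is a placeholder, and the optional variables are only needed for custom deployments:

```bash
# Placeholder values; never commit a real mnemonic to version control
export MNEMONIC="your twelve word seed phrase goes here"
export RPC='https://your-rpc-endpoint.example'        # RPC endpoint for the target network
# Optional overrides for custom Aquarius / Provider deployments
# export AQUARIUS_URL='https://your-aquarius.example'
# export PROVIDER_URL='https://your-provider.example'
```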

## Usage

To explore the commands and option flags available in the Ocean CLI, simply run the following command:

```bash
npm run cli h
```

<figure><img src="../../.gitbook/assets/cli/usage.png" alt=""><figcaption>Available CLI commands & options</figcaption></figure>

With the Ocean CLI successfully installed and configured, you're ready to dive into its capabilities and unlock the full potential of Ocean Protocol. If you encounter any issues during the setup process or have questions, feel free to seek assistance from our [support](https://discord.com/invite/TnXjkR5) team. 🌊

# Run C2D Jobs 🚀

## Start a Compute Job 🎯

Initiating a compute job can be accomplished through two primary methods:

1. The first approach involves publishing both the dataset and the algorithm, as explained in the previous section, [Publish a Dataset](./publish.md). Once that's completed, you can proceed to initiate the compute job.
2. Alternatively, you can explore available datasets and algorithms and kickstart a compute-to-data job by combining your preferred choices.

To illustrate the latter option, you can use the following command:

```bash
npm run cli startCompute 'DATASET_DID' 'ALGO_DID'
```

In this command, replace `DATASET_DID` with the DID of the dataset you intend to use and `ALGO_DID` with the DID of the algorithm you want to apply. Executing this command triggers a compute-to-data job that harnesses the selected dataset and algorithm for processing.

<figure><img src="../../.gitbook/assets/cli/c2dstart.png" alt=""><figcaption>Start a compute job</figcaption></figure>
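
As a small convenience, you can keep the two DIDs in shell variables (placeholder values below) so they can be reused when checking the job status later:

```bash
# Placeholder DIDs; replace with the dataset and algorithm you actually want to use
export DATASET_DID='did:op:REPLACE_WITH_DATASET_DID'
export ALGO_DID='did:op:REPLACE_WITH_ALGORITHM_DID'
npm run cli startCompute "$DATASET_DID" "$ALGO_DID"
```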

## Download Compute Results 🧮

To obtain the compute results, we'll follow a two-step process. First, we'll use the `getJobStatus` command, patiently monitoring the job's status until it signals completion. Afterward, we'll download the actual results.

### Monitor Job Status

To track the status of a job, you'll need both the dataset DID and the compute job ID. You can initiate this process by executing the following command:

```bash
npm run cli getJobStatus 'DATASET_DID' 'JOB_ID'
```

Executing this command will allow you to observe the job's status and verify its successful completion.

<figure><img src="../../.gitbook/assets/cli/jobstatus.png" alt=""><figcaption>Get Job Status</figcaption></figure>
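
If you don't want to re-type the command while waiting, a simple shell loop can repeat the status check periodically. This sketch does not parse the output; it just re-runs the call every 30 seconds with placeholder IDs:

```bash
# Re-run the status check every 30 seconds; stop with Ctrl+C once the job reports completion
while true; do
  npm run cli getJobStatus 'DATASET_DID' 'JOB_ID'
  sleep 30
done
```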

### Download the Results

For the second step, the dataset DID is no longer required. Instead, you'll need to specify the job ID, the index of the result you wish to download from the available results for that job, and the destination folder where you want to save the downloaded content. The corresponding command is as follows:

```bash
npm run cli downloadJobResults 'JOB_ID' 'RESULT_INDEX' 'DESTINATION_FOLDER'
```

<figure><img src="../../.gitbook/assets/cli/jobResults.png" alt=""><figcaption>Download C2D Job Results</figcaption></figure>
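
For instance, to save one of the results of a finished job into a local folder (the job ID is a placeholder, and the index `0` is only an assumed first-result index):

```bash
# Placeholder values; substitute the real job ID and the result index you want
npm run cli downloadJobResults 'JOB_ID' 0 './results'
```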