From 160ea9338bc4d72b6c263bd7aa675af3b3f062b2 Mon Sep 17 00:00:00 2001
From: kush-alloralabs
Date: Thu, 18 Jul 2024 09:07:22 -0400
Subject: [PATCH 1/5] small fixes

---
 .../devs/consumers/offchain-query-existing-topics.mdx |  2 +-
 .../deploy-worker/deploy-worker-with-allocmd.mdx      | 10 +++++-----
 2 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/pages/devs/consumers/offchain-query-existing-topics.mdx b/pages/devs/consumers/offchain-query-existing-topics.mdx
index db64594..d6e2bfe 100644
--- a/pages/devs/consumers/offchain-query-existing-topics.mdx
+++ b/pages/devs/consumers/offchain-query-existing-topics.mdx
@@ -70,7 +70,7 @@ Querying through this API is considered a safe request because it queries a sign
 */}
 Let's break down this request body:
 
-- https://api.upshot.xyz/v2/allora/adapter/: The generic API request URL
+- https://api.upshot.xyz/v2/allora/consumer/: The generic API request URL
 - `arbitrum-42161`: The target chain and chain ID to query the signed inference from
   - Currently 2 target chains available:
     - `ethereum-11155111`
diff --git a/pages/devs/workers/deploy-worker/deploy-worker-with-allocmd.mdx b/pages/devs/workers/deploy-worker/deploy-worker-with-allocmd.mdx
index 289b9fb..54fc432 100644
--- a/pages/devs/workers/deploy-worker/deploy-worker-with-allocmd.mdx
+++ b/pages/devs/workers/deploy-worker/deploy-worker-with-allocmd.mdx
@@ -13,11 +13,6 @@ First, [install `allocmd`](/devs/get-started/installation/cli#installing-allocmd
 Next, initialize the CLI to bootstrap all the needed components to get your worker running. The following command will handle the initialization process. It will create all the files in the appropriate directories and generate identities for your node to be used for local development.
 
-```shell
-allocmd generate worker --name --topic --env dev --network allora-testnet-1
-cd /worker
-```
-
 > Note: if you're facing `Permission denied` issues, please try to create the folders before running the `allocmd generate` command:
 > ```shell
 > mkdir -p /worker/data/head
 > mkdir -p /worker/data/worker
 > chmod -R 777 .//worker/data/head
 > chmod -R 777 .//worker/data/worker
 > ```
 
+```shell
+allocmd generate worker --name --topic --env dev --network allora-testnet-1
+cd /worker
+```
+
 Before running this command, you will have to [pick the topic Id](/devs/get-started/existing-topics) you wish to generate inferences for; you can then run the command with that topic Id. The command will auto-create some files, the most important of which is the `dev-docker-compose.yaml` file, a complete Docker Compose configuration that you can run immediately to see your worker and head nodes running on your local machine. You can edit the files as you wish. For instance, `main.py` is meant to call your inference server, so you will have to replace the sample code with your actual URLs and logic, as sketched below.
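For illustration only, here is a minimal sketch of the kind of call `main.py` might make, assuming your inference server exposes a simple HTTP endpoint. The URL, the `value` response field, and the `get_inference` helper below are hypothetical placeholders, not part of the generated scaffold:

```python
import requests

# Placeholder URL for your own inference server -- replace with your real endpoint.
INFERENCE_SERVER_URL = "http://localhost:8000/inference"

def get_inference(token: str) -> str:
    """Ask the model server for an inference for `token` and return it as a string."""
    response = requests.get(INFERENCE_SERVER_URL, params={"token": token}, timeout=10)
    response.raise_for_status()
    # Assumes the server responds with JSON like {"value": ...}; adjust to your server's schema.
    return str(response.json()["value"])

if __name__ == "__main__":
    print(get_inference("ETH"))
```

Adjust the request and response handling to match whatever interface your own model server exposes.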
After you have written and tested your logic in `main.py`, you can then run From 8ada12c7b9f45c770600cb781c1077b3113f07f7 Mon Sep 17 00:00:00 2001 From: kush-alloralabs Date: Thu, 18 Jul 2024 09:56:09 -0400 Subject: [PATCH 2/5] community guides --- pages/community/resources.mdx | 3 + .../walkthrough-use-topic-inference.mdx | 103 ++++++++++++++++++ pages/home/explore.mdx | 8 +- pages/home/tokenomics.mdx | 10 +- 4 files changed, 119 insertions(+), 5 deletions(-) create mode 100644 pages/devs/consumers/walkthrough-use-topic-inference.mdx diff --git a/pages/community/resources.mdx b/pages/community/resources.mdx index 169d5b3..7f0c7a0 100644 --- a/pages/community/resources.mdx +++ b/pages/community/resources.mdx @@ -28,3 +28,6 @@ - **Troubleshooting a worker node** - https://medium.com/@casual1st/allora-worker-node-tshoot-655f46d9165d +- **Worker node migration (Testnet v2)** + - https://medium.com/@nodescribe/allora-worker-node-step-by-step-guide-ccb29d90e01f + - https://github.com/casual1st/alloraworkersetup/blob/main/testnetmigrator.sh \ No newline at end of file diff --git a/pages/devs/consumers/walkthrough-use-topic-inference.mdx b/pages/devs/consumers/walkthrough-use-topic-inference.mdx new file mode 100644 index 0000000..4a0c213 --- /dev/null +++ b/pages/devs/consumers/walkthrough-use-topic-inference.mdx @@ -0,0 +1,103 @@ +# Walkthrough: Using a Topic Inference on-chain + +Follow these instructions to bring the most recent inference data on-chain for a given topic. + +## Complete Example: + +```solidity + /** + * @notice Example for calling a protocol function with topic inference data + * + * @param protocolFunctionArgument An argument for the protocol function + * @param alloraNetworkInferenceData The signed data from the Allora Consumer + */ + function callProtocolFunctionWithAlloraTopicInference( + uint256 protocolFunctionArgument, + AlloraConsumerNetworkInferenceData calldata alloraNetworkInferenceData + ) external payable { + ( + uint256 value, + uint256[] memory confidenceIntervalPercentiles, + uint256[] memory confidenceIntervalValues, + ) = IAlloraConsumer().verifyNetworkInference(alloraNetworkInferenceData); + + _protocolFunctionRequiringPredictionValue( + protocolFunctionArgument, + value, + confidenceIntervalPercentiles, + confidenceIntervalValues + ); + } +``` + +## Step by Step Guide: + +1. Create an Upshot API key by [creating an account](https://developer.upshot.xyz/signup). +2. Call the Consumer Inference API using the `topicId` found in the [deployed topics list](./existing-topics) and the correct chainId. For example, if you use sepolia, you would provide `ethereum-11155111`. + +```shell +curl -X 'GET' --url 'https://api.upshot.xyz/v2/allora/consumer/?allora_topic_id=' -H 'accept: application/json' -H 'x-api-key: ' +``` + +Here is an example response: +```json +{ + "request_id": "b52b7c20-57ae-4852-bdbb-8f39cf317974", + "status": true, + "data": { + "signature": "0x99b8b75f875a9ecc09fc499073656407458d464edeceb384686dba990ed785d841e6510b578d253a6e19a20503d1ec1e3c38b4c60980ff3b4df9ce3335ebd3851b", + "inference_data": { + "network_inference": "3365485208027959000000", + "confidence_interval_percentiles": ["2280000000000000000", "15870000000000000000", "50000000000000000000", "84130000000000000000", "97720000000000000000"], + "confidence_interval_values": ["2280000000000000000", "15870000000000000000", "50000000000000000000", "84130000000000000000", "97720000000000000000"], + "topic_id": "9", + "timestamp": "1719866777", + "extra_data": "0x" + } + } +} +``` + +3. 
Construct a call to the Allora Consumer contract on the chain of your choice (options listed under [deployments](./existing-consumers)) using the returned `signature` and `network_inference` as follows:

## Creating the Transaction:

Note that you will likely be doing something more like `callProtocolFunctionWithAlloraTopicInference` in the example above, so you would want to construct your call to that contract in a similar way to the following. You can find the complete example [here](https://github.com/allora-network/allora-consumer/blob/main/script/verifyDataExampleSimple.ts).

```typescript
const alloraConsumer =
  (new AlloraConsumer__factory())
    .attach(ALLORA_CONSUMER_ADDRESS)
    .connect(senderWallet) as AlloraConsumer

const tx = await alloraConsumer.verifyNetworkInference({
  signature: '0x99b8b75f875a9ecc09fc499073656407458d464edeceb384686dba990ed785d841e6510b578d253a6e19a20503d1ec1e3c38b4c60980ff3b4df9ce3335ebd3851b',
  networkInference: {
    topicId: 9,
    timestamp: 1719866777,
    extraData: ethers.toUtf8Bytes(''),
    networkInference: '3365485208027959000000',
    confidenceIntervalPercentiles: ['2280000000000000000', '15870000000000000000', '50000000000000000000', '84130000000000000000', '97720000000000000000'],
    confidenceIntervalValues: ['3016256807053656000000', '3029849059956295000000', '3049738780726754000000', '3148682039955208400000', '3278333171848616500000'],
  },
  extraData: ethers.toUtf8Bytes(''),
})

console.info('tx hash:', tx.hash)
console.info('Awaiting tx confirmation...')

const result = await tx.wait()

console.info('tx receipt:', result)
```

## Notes

- The API endpoint uses `snake_case`, while the smart contract uses `camelCase` for attribute names.
- Ethers.js does not accept `''` for `extraData`. Empty `extraData` should be denoted with `'0x'`.

## Code Links

- [Open source consumer code](https://github.com/allora-network/allora-consumer/blob/main/src/)
- [IAlloraConsumer](https://github.com/allora-network/allora-consumer/blob/main/src/interface/IAlloraConsumer.sol), including the structs used for Solidity code.
diff --git a/pages/home/explore.mdx b/pages/home/explore.mdx
index e1351d9..fec3f46 100644
--- a/pages/home/explore.mdx
+++ b/pages/home/explore.mdx
@@ -14,14 +14,16 @@ Examples of intelligence include, but are not limited to:
 ## Learn More
-
+
+
+
+
+
-
-
 {/*
 Allora brings together
diff --git a/pages/home/tokenomics.mdx b/pages/home/tokenomics.mdx
index 0f1be27..38dccf1 100644
--- a/pages/home/tokenomics.mdx
+++ b/pages/home/tokenomics.mdx
@@ -1,3 +1,5 @@
+import { Callout } from 'nextra/components'
+
 # Allora Tokenomics
 
 The Allora token (ALLO) is minted by the Allora Network to facilitate the exchange of value by network participants.
@@ -8,8 +10,12 @@ The ALLO token incorporates a Pay-What-You-Want (PWYW) model to allow token hold
 Token holders have the autonomy to decide the amount of ALLO they wish to pay for a given inference, which encourages token holders to contribute to the network's ecosystem according to their perceived value of the service.
 
-Flexible price discovery across topics (less opinionated) - how the market reaches an agreed-upon price
-running an LLM is a different (heterogeneous) resource provision compared to something like ethereum which is homogeneous (each opcode has a fixed price).
+
+**Important Note**: If participants choose to pay **zero** fees for a particular topic, the weight of that topic tends to zero.
As a result, participants within that topic **receive no rewards**, and the token emission will be redistributed over other topics. This mechanism ensures that topics with no fee payments do not sustain themselves, driving healthy competition and price discovery across the network. + + +Flexible price discovery across topics is a less opinionated method that allows the market to reach an agreed-upon price through natural negotiation and market dynamics. Running a Large Language Model (LLM) involves a different, heterogeneous resource provision compared to something like Ethereum, which is homogeneous since each opcode in Ethereum has a fixed price. + ## Token Emissions ### Bitcoin-like Emission Schedule From fcf30141ffb44c0cae43559b29e717acade8f68e Mon Sep 17 00:00:00 2001 From: kush-alloralabs Date: Thu, 18 Jul 2024 09:57:16 -0400 Subject: [PATCH 3/5] community guides --- pages/devs/consumers/_meta.json | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/pages/devs/consumers/_meta.json b/pages/devs/consumers/_meta.json index ab26d6d..b4dd231 100644 --- a/pages/devs/consumers/_meta.json +++ b/pages/devs/consumers/_meta.json @@ -2,5 +2,6 @@ "offchain-query-existing-topics": "How to Query Existing Workers Off-Chain", "onchain-query-existing-data": "How to Query Topic Data On-Chain", "consumer-contracts": "Consumer Contracts", - "existing-consumers": "Existing Consumers" + "existing-consumers": "Existing Consumers", + "walkthrough-use-topic-inference": "Walkthrough: Using a Topic Inference on-chain" } \ No newline at end of file From 4e95aa37b87f8cbbf9da5d63cd3a1861eb0cba53 Mon Sep 17 00:00:00 2001 From: kush-alloralabs Date: Thu, 18 Jul 2024 09:58:48 -0400 Subject: [PATCH 4/5] community guides --- ...ld-and-deploy-worker-with-node-runners.mdx | 62 ------------------- 1 file changed, 62 deletions(-) delete mode 100644 pages/devs/workers/build-and-deploy-worker-with-node-runners.mdx diff --git a/pages/devs/workers/build-and-deploy-worker-with-node-runners.mdx b/pages/devs/workers/build-and-deploy-worker-with-node-runners.mdx deleted file mode 100644 index 2e097da..0000000 --- a/pages/devs/workers/build-and-deploy-worker-with-node-runners.mdx +++ /dev/null @@ -1,62 +0,0 @@ -# Build and Deploy a Worker Node With AWS Node Runners - -Welcome to the AWS Node Runners documentation! This page provides detailed instructions on how to leverage Node Runners on AWS, including benefits, setup instructions, and useful links. - -## Overview - -Node Runners on AWS enables you to deploy and manage blockchain nodes efficiently using AWS infrastructure. Whether you're deploying Ethereum nodes or other blockchain networks, Node Runners simplifies the process, offering scalability, reliability, and cost-effectiveness. - -For more detailed information and step-by-step guides, please refer to the [AWS Node Runners Documentation](https://aws-samples.github.io/aws-blockchain-node-runners/docs/Blueprints/Ethereum). - -### Allora Network's AWS Infrastructure - -This diagram illustrates the architecture of the integration between the Allora Network (built on a Cosmos AppChain) and an AWS-based infrastructure for handling inference requests. - -![node-runners](/aws-node-runners.png) - -#### Key Components - -1. **Allora Network (Cosmos AppChain)** - - **Public Head Node**: Acts as the entry point for the Allora Network, handling requests and responses. - -2. **AWS Account Setup** - - **Region**: The geographical location within AWS where the resources are deployed. 
- - **Virtual Private Cloud (VPC)**: Provides an isolated network environment within the AWS region. - - **Public Subnet**: A subnet within the VPC that has access to the internet through the VPC Internet Gateway. - - **VPC Internet Gateway**: Allows communication between the instances in the VPC and the internet. - -3. **EC2 Instance (Allora Worker Node)** - - **Inference Base**: This component handles network communication, receiving requests from the Allora Network's Public Head Node and sending responses back. - - **Node Function**: Processes requests by interfacing with the private model server. It acts as an intermediary, ensuring the requests are correctly formatted and the responses are appropriately handled. - - **Model Server**: Hosts the proprietary model. It executes the main inference script (`Main.py`) to generate inferences based on the received requests. - -#### Process Flow - -1. **Request Flow**: - - The Allora Network's Public Head Node sends a request for inferences to the EC2 instance within the AWS environment. - - The request passes through the VPC Internet Gateway and reaches the Inference Base in the public subnet. - - The Inference Base forwards the request to the Node Function. - - The Node Function calls `Main.py` on the Model Server to generate the required inferences. - -2. **Response Flow**: - - The Model Server processes the request and returns the inferences to the Node Function. - - The Node Function sends the inferences back to the Inference Base. - - The Inference Base communicates the inferences back to the Allora Network's Public Head Node via the VPC Internet Gateway. - -## AWS Activate - -Before proceeding, please note that eligibility for AWS Activate credits and terms are governed by AWS. This documentation may become outdated, so ensure you refer to the [AWS Activate program page](https://aws.amazon.com/startups/credits#hero) for the latest eligibility requirements and instructions. - -## AWS Activate Stepwise Process - -To receive up to $5,000 in AWS Activate credits, follow these steps: - -1. **Fill out our [Typeform](https://vk4z45e3hne.typeform.com/to/TVwcjiL1)**: Provide your details to receive our Activate Provider Organizational ID. - - Name (required) - - Contact Information (optional): Email, Telegram, Discord handle, Linkedin - - Official Company Website (required) - -2. **AWS Activate High-Level Instructions**: After obtaining our Organizational ID, - - Visit [AWS Activate Credit Packages](https://aws.amazon.com/startups/credits#packages). - - Apply through the Activate Portfolio - From 01804a2c6cd41a9ecfa7fe927b501f3af33db6d3 Mon Sep 17 00:00:00 2001 From: kush-alloralabs Date: Thu, 18 Jul 2024 10:01:03 -0400 Subject: [PATCH 5/5] community guides --- pages/community/contribute.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/pages/community/contribute.mdx b/pages/community/contribute.mdx index fe93f83..a689c34 100644 --- a/pages/community/contribute.mdx +++ b/pages/community/contribute.mdx @@ -2,7 +2,7 @@ ## Developers -[Our open-source repos](https://github.com/allora-network) all follow the [same contribution guidelines](https://github.com/allora-network/allora-chain/blob/main/CONTRIBUTING.md), including these docs themselves. CHECKOUT +[Our open-source repos](https://github.com/allora-network) all follow the [same contribution guidelines](https://github.com/allora-network/allora-chain/blob/main/CONTRIBUTING.md), including these docs themselves. 
Our [Discord](https://discord.gg/allora) is great for speaking directly with the core Allora developers, coordinating with fellow [Developer Advocates](https://www.allora.network/blog/introducing-the-allora-network-community-advocate-program), and keeping up to date with the latest regarding all things Allora.